ML Coding · Medium · 74% accepted

Implement Logistic Regression from Scratch

Classification · Gradient Descent · Loss Functions
Problem Statement

Implement binary logistic regression from scratch using NumPy. Include sigmoid function, binary cross-entropy loss, gradient computation, and training loop with mini-batch gradient descent.
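
One possible shape for a solution is sketched below: a numerically stable sigmoid, a clipped binary cross-entropy, and a mini-batch loop that uses the closed-form gradient of BCE composed with sigmoid (`p - y`). Function names, defaults, and hyperparameters are illustrative assumptions, not the official template.

```python
import numpy as np

def sigmoid(z):
    # Stable form: exp(-|z|) never overflows; for huge |z| it underflows to 0,
    # giving exactly 1.0 (z >= 0) or 0.0 (z < 0).
    e = np.exp(-np.abs(z))
    return np.where(z >= 0, 1.0 / (1.0 + e), e / (1.0 + e))

def bce_loss(y_true, y_prob, eps=1e-15):
    # Clip probabilities away from 0 and 1 so log never produces -inf.
    p = np.clip(y_prob, eps, 1 - eps)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

def train(X, y, lr=0.1, n_iters=100, batch_size=32, seed=0):
    # Mini-batch gradient descent; lr/n_iters/batch_size are assumed defaults.
    rng = np.random.default_rng(seed)
    m, n = X.shape
    w = np.zeros(n)
    b = 0.0
    for _ in range(n_iters):
        idx = rng.permutation(m)          # reshuffle each epoch
        for start in range(0, m, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            p = sigmoid(Xb @ w + b)
            err = p - yb                  # gradient of BCE w.r.t. the logits
            w -= lr * (Xb.T @ err) / len(batch)
            b -= lr * err.mean()
    return w, b
```

The `p - y` gradient falls out of differentiating BCE through the sigmoid, so no separate sigmoid derivative is needed in the update.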

Examples (public test cases)
  • sigmoid(0) should return exactly 0.5

    Input: sigmoid(0)

    Output: 0.5

  • sigmoid(large positive) clips to 1.0

    Input: sigmoid(10000)

    Output: 1.0

  • sigmoid(large negative) clips to 0.0

    Input: sigmoid(-10000)

    Output: 0.0

  • sigmoid(log(3)) = 0.75 (exact)

    Input: sigmoid(1.0986122886681098)

    Output: 0.75
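
The naive `1 / (1 + np.exp(-z))` overflows for large negative `z` (`np.exp(10000)` is `inf`, with a RuntimeWarning). One way to satisfy all four public cases above is to only ever exponentiate a non-positive argument; this formulation is a sketch, not the official template:

```python
import numpy as np

def sigmoid(z):
    # exp(-|z|) is always <= 1, so it cannot overflow; for very large |z|
    # it underflows cleanly to 0.0, which yields exactly 1.0 or 0.0.
    e = np.exp(-np.abs(z))
    # z >= 0: sigma(z) = 1 / (1 + exp(-z)); z < 0: sigma(z) = exp(z) / (1 + exp(z))
    return np.where(z >= 0, 1.0 / (1.0 + e), e / (1.0 + e))
```

Simple clipping of `z` (e.g. to [-500, 500]) also avoids the warning, but then sigmoid(-10000) returns a tiny positive number rather than exactly 0.0.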

Complexity
Time: O(n_iters * m * n), where m is the number of samples and n the number of features
Space: O(n)
Asked By: Google, Meta, Amazon
