Implement binary logistic regression from scratch using NumPy. Include sigmoid function, binary cross-entropy loss, gradient computation, and training loop with mini-batch gradient descent.
Test cases:
- sigmoid(0) returns exactly 0.5
  Input: sigmoid(0) → Output: 0.5
- sigmoid of a large positive argument saturates to 1.0
  Input: sigmoid(10000) → Output: 1.0
- sigmoid of a large negative argument saturates to 0.0
  Input: sigmoid(-10000) → Output: 0.0
- sigmoid(log(3)) equals 0.75, since 1 / (1 + e^(-log 3)) = 1 / (1 + 1/3) = 3/4
  Input: sigmoid(1.0986122886681098) → Output: 0.75
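One way to pass the saturation cases above is a piecewise-stable sigmoid that never exponentiates a positive argument, so neither branch can overflow. A minimal sketch (the function name and branching strategy are one common choice, not a prescribed template):

```python
import numpy as np

def sigmoid(z):
    """Numerically stable logistic function, elementwise over an array."""
    z = np.asarray(z, dtype=np.float64)
    out = np.empty_like(z)
    pos = z >= 0
    # For z >= 0, exp(-z) <= 1, so this branch cannot overflow.
    out[pos] = 1.0 / (1.0 + np.exp(-z[pos]))
    # For z < 0, exp(z) <= 1; it underflows to 0.0 for very negative z,
    # which yields exactly 0.0 as the tests expect.
    expz = np.exp(z[~pos])
    out[~pos] = expz / (1.0 + expz)
    return out
```

For z = 10000 the first branch computes 1 / (1 + exp(-10000)); exp(-10000) underflows to 0.0, giving exactly 1.0. For z = -10000 the second branch's exp(-10000) underflows to 0.0, giving exactly 0.0.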
Expected complexity: O(n_iters * m * n) time and O(n) extra space, where m is the number of samples and n the number of features.
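The full pipeline named in the problem statement (sigmoid, binary cross-entropy, gradients, mini-batch loop) can be sketched as below. The helper names `bce_loss` and `fit` and the default hyperparameters are illustrative choices, not part of any official template; this version uses clipping for numerical stability:

```python
import numpy as np

def sigmoid(z):
    # Clip the logit so exp never overflows float64.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -500.0, 500.0)))

def bce_loss(y, p, eps=1e-12):
    """Binary cross-entropy; eps keeps log away from 0."""
    p = np.clip(p, eps, 1.0 - eps)
    return -np.mean(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))

def fit(X, y, lr=0.1, n_iters=200, batch_size=32, seed=0):
    """Mini-batch gradient descent on logistic-regression weights."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    w = np.zeros(n)
    b = 0.0
    for _ in range(n_iters):
        # Reshuffle each epoch, then walk through contiguous mini-batches.
        idx = rng.permutation(m)
        for start in range(0, m, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            p = sigmoid(Xb @ w + b)
            err = p - yb                      # dL/dz for BCE + sigmoid
            w -= lr * (Xb.T @ err) / len(batch)
            b -= lr * err.mean()
    return w, b
```

Each epoch touches all m samples across n features, and the parameter vector has n + 1 entries, matching the O(n_iters * m * n) time and O(n) space bounds above. The gradient uses the standard simplification that d(BCE)/d(logit) reduces to (p - y) when the sigmoid and cross-entropy are composed.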