Question:

Write your code step by step for the propagation. Implement the cost function and its gradient for the propagation explained above.

From the assignment docstring:
Y - true "label" vector (containing 0 if non-cat, 1 if cat) of size (1, number of examples)
cost - negative log-likelihood cost for logistic regression
dw - gradient of the loss with respect to w, thus same shape as w
db - gradient of the loss with respect to b, thus same shape as b

My attempt:

# START CODE HERE # (≈ 2 lines of code)
cost = (-Y * np.log(A) - (1 - Y) * np.log(1 - A)).mean() / m

My output: dw = …
Expected output: dw = …

Could anybody tell me why my dw dimensions do not match the expected output, and help me fix the cost function?

Answer:

There are a couple of small mistakes. For the cost, use cost = -np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A)) / m instead: .mean() already averages over the m examples, so the trailing / m divides by m a second time. The next mistake is np.subtract(A - Y); replace it with simply A - Y, because np.subtract takes two arguments (np.subtract(A, Y)), and A - Y already computes the difference.
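Putting those corrections together, here is a minimal sketch of the full propagate step for logistic regression. The function name, argument shapes, and sigmoid helper follow the course assignment's conventions; the sample inputs in the comments are illustrative, not from the grader:

```python
import numpy as np

def sigmoid(z):
    """Element-wise logistic sigmoid."""
    return 1 / (1 + np.exp(-z))

def propagate(w, b, X, Y):
    """Forward and backward pass for logistic regression.

    w -- weights, shape (num_features, 1)
    b -- bias, a scalar
    X -- data, shape (num_features, number of examples)
    Y -- true "label" vector, shape (1, number of examples)
    """
    m = X.shape[1]

    # Forward propagation: activations, shape (1, m)
    A = sigmoid(np.dot(w.T, X) + b)

    # np.sum already aggregates over all m examples, so divide by m exactly once
    cost = -np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A)) / m

    # Backward propagation: dw has the same shape as w, db is a scalar
    dw = np.dot(X, (A - Y).T) / m
    db = np.sum(A - Y) / m

    return {"dw": dw, "db": db}, cost

# Illustrative usage with made-up inputs:
w = np.array([[1.0], [2.0]])
b = 2.0
X = np.array([[1.0, 2.0, -1.0], [3.0, 4.0, -3.2]])
Y = np.array([[1, 0, 1]])
grads, cost = propagate(w, b, X, Y)
# grads["dw"].shape == (2, 1), i.e. the same shape as w
```

Note that dw comes out with the same shape as w precisely because X has shape (n, m) and (A - Y).T has shape (m, 1); if your dw has the wrong dimensions, check that you are using A - Y (shape (1, m)) and not a transposed or broadcast variant.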