Decision Boundaries with Logistic Regression

So now that we have defined our hypothesis for logistic regression, let's talk about what is called a decision boundary. We can translate the output of the hypothesis into a prediction: if h(x) >= 0.5 then y = 1, and if h(x) < 0.5 then y = 0. This is a fair rule because it splits the output range exactly in half. A quick sketch of the threshold rule follows below. …
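The excerpt cuts off here, but a minimal sketch makes the threshold rule concrete. Everything in it is illustrative rather than taken from the article: the sigmoid and predict helpers and the example theta values are assumptions chosen so the boundary is easy to see.

import numpy as np

def sigmoid(z):
    # Logistic function: squashes any real z into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def predict(theta, x):
    # Hypothesis h(x) = sigmoid(theta . x), thresholded at 0.5
    h = sigmoid(np.dot(theta, x))
    return 1 if h >= 0.5 else 0

# Hypothetical parameters: the decision boundary is theta . x = 0,
# i.e. the line x1 + x2 = 3 (first feature is the bias term x0 = 1)
theta = np.array([-3.0, 1.0, 1.0])
print(predict(theta, np.array([1.0, 2.0, 2.0])))  # h(x) ~ 0.73 >= 0.5 -> 1
print(predict(theta, np.array([1.0, 1.0, 1.0])))  # h(x) ~ 0.27 <  0.5 -> 0

Note that the boundary itself lives where theta . x = 0, since sigmoid(0) = 0.5; the prediction only flips when the input crosses that line.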

Conceptual + Mathematical View on Gradient Descent

Continuing from the last article, to narrow down theta0 and theta1 we use a learning technique called gradient descent. We have our cost function J(theta0, theta1), and we want its output to be as small as possible. What gradient descent does is start with a random theta and … A rough sketch of the update loop follows below.
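As a rough illustration of the idea (not the article's own code), here is a minimal gradient descent sketch for a one-variable hypothesis h(x) = theta0 + theta1 * x. The learning rate, iteration count, and sample data are all assumptions made up for the example.

import numpy as np

def gradient_descent(x, y, alpha=0.05, iters=5000):
    # Minimize J(theta0, theta1) = (1/2m) * sum((h(x) - y)^2)
    theta0, theta1 = 0.0, 0.0  # starting point; a random start also works
    m = len(x)
    for _ in range(iters):
        h = theta0 + theta1 * x          # current predictions
        grad0 = (h - y).sum() / m        # partial dJ/dtheta0
        grad1 = ((h - y) * x).sum() / m  # partial dJ/dtheta1
        theta0 -= alpha * grad0          # simultaneous update of both thetas
        theta1 -= alpha * grad1
    return theta0, theta1

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.0, 5.0, 7.0, 9.0])   # data drawn from y = 2x + 1
print(gradient_descent(x, y))        # approaches (1.0, 2.0)

The key design point is the simultaneous update: both gradients are computed from the same current thetas before either parameter moves.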

A Mathematical View on the Cost Function: 2-D, 3-D, and Contour Graphs

Though the equation does look daunting, don't worry, we'll go over it. The inside term is ŷ^(i) − y^(i), where ŷ^(i) is simply the hypothesis function h(x^(i)). We take the difference between h(x) and y (our actual point) for all points (that's why we use summation). We square the expression to eliminate any …
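To make the squared-error term concrete, here is a small sketch of the cost J(theta0, theta1) for a linear hypothesis. The function name and the toy data are assumptions for illustration, not the article's code.

import numpy as np

def cost(theta0, theta1, x, y):
    # J(theta0, theta1) = (1/2m) * sum((h(x^(i)) - y^(i))^2)
    # Squaring removes the sign, so over- and under-predictions
    # both add positive cost instead of cancelling out.
    h = theta0 + theta1 * x
    return ((h - y) ** 2).sum() / (2 * len(x))

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])        # points on the line y = 2x
print(cost(0.0, 2.0, x, y))          # 0.0: the hypothesis fits perfectly
print(cost(0.0, 1.0, x, y))          # ~2.33: every miss contributes cost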

Plans for 2020 and Update on Uploading Status

Sorry for not posting these last few months; I have just been trying to get adjusted to an exhausting sophomore life. I have quite a few drafts almost ready to publish, so more articles about machine learning and algorithm design will be coming out soon! For 2020 my goal is to upload an article once …
