To date we’ve been solving regression problems, where the values being predicted (like the price of a house) are entirely unconstrained. Classification problems are those where we want a machine learning algorithm to predict a specific, discrete result from a predefined set of possible values.
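
As a purely hypothetical illustration of the difference (the numbers and labels below are made up), a regression prediction is an unconstrained number, while a classification prediction must be one of a fixed set of labels:

```python
# Regression: the predicted value is an unconstrained number.
predicted_price = 251_764.37            # e.g. a house price in dollars

# Classification: the prediction must be one of a predefined set of labels.
possible_labels = ["cheap", "average", "expensive"]
predicted_label = possible_labels[2]    # the model has to pick one of these

print(predicted_price, predicted_label)
```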

A straight line won’t always be the best fit for our data. In this article, you’ll learn how to generate predictive polynomial functions that leverage the machinery of our linear regression algorithms. This technique will allow you to generate sophisticated predictive functions that bend and curve to fit your data.
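
As a minimal sketch of the idea, assuming NumPy and some invented sample data (this is not the article’s own code), you can treat x and x² as two separate features and let the ordinary linear least-squares machinery fit the curve:

```python
import numpy as np

# Invented sample data: a single feature x and targets that follow a curve.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 9.2, 16.8, 26.0])     # roughly y = x^2

# Build a design matrix whose "features" are 1, x and x^2, so the
# familiar linear regression machinery can fit a quadratic curve.
X = np.column_stack([np.ones_like(x), x, x ** 2])

# Least-squares fit (here via the pseudo-inverse rather than gradient descent)
theta = np.linalg.pinv(X) @ y

print(theta)          # [intercept, linear, quadratic] coefficients
print(X @ theta)      # predictions that bend to follow the data
```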

In the univariate linear regression series you learned how to implement a simple machine learning algorithm that can predict housing prices based on a single feature, the size of the house. In this article, we’ll start building a more sophisticated algorithm that can make housing price predictions based on multiple features....
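As a hedged sketch of where this is heading (the feature matrix and parameters below are invented for illustration), a multi-feature hypothesis is just a weighted sum of the features, which NumPy can evaluate for every house at once:

```python
import numpy as np

# Invented feature matrix: one row per house, with a leading 1 for the
# intercept, then size in square feet and number of bedrooms.
X = np.array([
    [1.0, 2104.0, 3.0],
    [1.0, 1600.0, 3.0],
    [1.0, 2400.0, 4.0],
])

# Illustrative parameters: an intercept plus one weight per feature.
theta = np.array([80.0, 0.15, 5.0])

# Multivariate hypothesis: h(x) = theta_0 + theta_1 * size + theta_2 * bedrooms,
# evaluated for every house at once as a matrix-vector product.
predicted_prices = X @ theta
print(predicted_prices)
```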

Initialising and Managing Alpha

Alpha is the only tunable parameter in the Gradient Descent algorithm, and using the right value is important: if alpha is too big, the algorithm may not converge on a solution; if alpha is too small, the algorithm can take a long time to converge.
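
To see the effect concretely, here is a small, assumed example (not taken from the article) that minimises f(x) = x² with a fixed learning rate: a sensible alpha converges quickly, an overly large one overshoots and diverges, and a tiny one barely moves.

```python
def gradient_descent(alpha, steps=50, start=10.0):
    """Minimise f(x) = x^2 (gradient 2x) with a fixed learning rate alpha."""
    x = start
    for _ in range(steps):
        x -= alpha * 2 * x          # x := x - alpha * f'(x)
    return x

# Reasonable alpha: converges quickly towards the minimum at x = 0
print(gradient_descent(alpha=0.1))

# Alpha too large: every step overshoots and the iterates blow up
print(gradient_descent(alpha=1.1))

# Alpha too small: heading the right way, but barely moved after 50 steps
print(gradient_descent(alpha=0.0001))
```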

The Gradient Descent algorithm is a general-purpose algorithm that has a number of practical applications in machine learning. By the end of this article you should have a good understanding of how the Gradient Descent algorithm works, and an appreciation of how it helps to solve minimisation problems in general.
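
As an illustrative sketch only (the training data and learning rate here are assumed, not taken from the article), gradient descent can minimise the mean squared error of a one-feature linear model by repeatedly stepping both parameters against the cost gradient:

```python
import numpy as np

# Assumed training data that follows y = 2x + 1 exactly.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.0, 5.0, 7.0, 9.0])
m = len(x)

theta0, theta1, alpha = 0.0, 0.0, 0.05
for _ in range(2000):
    error = (theta0 + theta1 * x) - y      # prediction error per example
    # Partial derivatives of the cost J = (1 / 2m) * sum(error^2)
    grad0 = error.sum() / m
    grad1 = (error * x).sum() / m
    theta0 -= alpha * grad0                # simultaneous update of both parameters
    theta1 -= alpha * grad1

print(theta0, theta1)                      # approaches the true values 1 and 2
```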