https://github.com/gtraines/perceptron_classification

*Background*

One of the fundamental concepts in artificial intelligence and machine learning is the perceptron learning algorithm, which gives life to the abstract data structure known as the perceptron. The perceptron is a data structure created to resemble the functioning of a neuron in the brain. It has a set of inputs (variable values), each with an excitatory (positive) or inhibitory (negative) weight associated with it. During the training phase, the perceptron receives a set of values corresponding to its inputs along with an expected target outcome. If the sum of the weights multiplied by their corresponding input values is greater than a threshold value, the perceptron emits a positive response; if the sum is lower than the threshold, it emits a negative response.

Continue reading Introducing the Perceptron →
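The threshold rule and training update described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the repo's actual implementation; the function names (`predict`, `train_step`) and the learning-rate parameter are my own assumptions.

```python
import numpy as np

def predict(weights, bias, x):
    """Emit +1 if the weighted sum of inputs exceeds the threshold (0), else -1."""
    return 1 if np.dot(weights, x) + bias > 0 else -1

def train_step(weights, bias, x, target, lr=1.0):
    """Perceptron learning rule: on a misclassification, nudge the weights
    toward the target; correct predictions leave the weights unchanged."""
    if predict(weights, bias, x) != target:
        weights = weights + lr * target * x
        bias = bias + lr * target
    return weights, bias
```

For linearly separable data (e.g. the OR function with ±1 labels), repeatedly applying `train_step` over the training set converges to weights that classify every sample correctly.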

https://github.com/gtraines/linear-regression

Linear regression is an approach to machine/statistical learning generally applied to value prediction problems. It is a form of supervised learning, wherein the training data provides the “correct” answer in addition to the data points generated by an unknown function *f*. Although in this case we were provided a 2-dimensional data set, linear regression can be used on higher-dimensional data sets. The linear regression method assumes that the unknown function *f* can be approximated using a linear equation of *d* terms (the number of features being measured plus a constant value for bias). Among machine learning algorithms, it is fairly simple, and in his Caltech lectures Dr. Abu-Mostafa calls linear regression “one-step learning.”

Continue reading Linear Regression →
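The “one-step learning” remark refers to the fact that the least-squares weights can be computed in closed form rather than by iterative optimization. A minimal sketch, assuming NumPy and a hypothetical `fit_linear` helper (the bias constant is handled by prepending a column of ones):

```python
import numpy as np

def fit_linear(X, y):
    """One-step linear regression: augment the features with a constant
    bias column and solve the least-squares problem in closed form."""
    Xb = np.column_stack([np.ones(len(X)), X])  # d terms: bias + features
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w  # w[0] is the bias term, w[1:] are the feature weights
```

On noiseless data drawn from a line, the recovered weights match the generating function exactly; with noisy data, `lstsq` returns the weights minimizing the squared error.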

https://github.com/gtraines/logistic-regression-classification

Logistic regression is a machine learning approach to classifying noisy data. Whereas linear classification requires the data to be linearly separable in order to find a separating decision hyperplane, logistic regression allows for the expression of uncertainty by providing the probability that a given sample belongs to one class or the other.

Logistic regression calculates the probability by running the vector of inputs and weights through a logistic or “sigmoid” function which …

Continue reading Logistic Regression with Gradient Descent – Some Thoughts and Lessons →
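The sigmoid mapping and a single gradient-descent step can be sketched as follows. This is an illustrative outline under my own naming assumptions (`predict_proba`, `gradient_step`), not the repository's code; it uses the standard cross-entropy gradient for batch gradient descent:

```python
import numpy as np

def sigmoid(z):
    """Squash a weighted sum into a probability in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def predict_proba(weights, X):
    """Probability that each sample belongs to the positive class."""
    return sigmoid(X @ weights)

def gradient_step(weights, X, y, lr=0.5):
    """One batch gradient-descent step on the cross-entropy loss:
    the gradient is X^T (p - y) averaged over the samples."""
    grad = X.T @ (predict_proba(weights, X) - y) / len(y)
    return weights - lr * grad
```

Iterating `gradient_step` drives the predicted probabilities toward 0 for negative samples and toward 1 for positive ones, while never committing to a hard 0/1 answer the way a linear classifier does.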
