Learn

Hyperparameter Tuning and Cross Validation

When training a neural network, there are several design choices that can be made about the neural network architecture (these …

Gradient Descent and Backpropagation Walkthrough

Let’s train the following neural network architecture with initial edge weights all set to 1:

While the weights are all …

Convolutional Padding

Convolutional layers have the power to detect and learn various features in input images such as shapes, textures, and horizontal and vertical …

Softmax Function

The softmax function is a generalization of the logistic function, applied to multi-class problems. It …
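
For a quick illustration (a minimal NumPy sketch, not taken from the full post), the softmax of a vector of class scores can be computed as follows:

import numpy as np

def softmax(scores):
    # subtract the max score for numerical stability before exponentiating
    exps = np.exp(scores - np.max(scores))
    # normalize so the outputs sum to 1 and can be read as class probabilities
    return exps / np.sum(exps)

print(softmax(np.array([2.0, 1.0, 0.1])))  # roughly [0.66 0.24 0.10]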

Sobel Edge Detection

Edge Detection includes a variety of mathematical methods that help us identify points in an image where the image …

Root Mean Squared Error

The Root Mean Squared Error (RMSE) is one of the most commonly used measures for evaluating the quality of a …
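
As a rough reference (a minimal sketch with made-up numbers, not code from the post), RMSE is just the square root of the mean squared error:

import numpy as np

y_true = np.array([3.0, 5.0, 2.5, 7.0])   # made-up target values
y_pred = np.array([2.5, 5.0, 4.0, 8.0])   # made-up predictions

# RMSE: square root of the average squared prediction error
rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
print(rmse)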

Measuring your Regression Model's Performance

When you develop any machine learning model, it is crucial to measure the performance of this model. Several different methods are …

ROC Curve

The Receiver Operating Characteristic (ROC) curve is a performance metric for classification tasks at various classification thresholds.

The ROC curve …
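
In practice such a curve is often produced with scikit-learn's roc_curve; the sketch below (with invented labels and scores) is only meant to show the general shape of that workflow:

from sklearn.metrics import roc_curve, auc

y_true = [0, 0, 1, 1]             # invented ground-truth binary labels
y_score = [0.1, 0.4, 0.35, 0.8]   # invented scores for the positive class

# false positive rate and true positive rate at every threshold implied by the scores
fpr, tpr, thresholds = roc_curve(y_true, y_score)
print(auc(fpr, tpr))              # area under the ROC curve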

ReLU Activation

Without any activation, a neural network is just a weighted linear combination of inputs. An activation function introduces non-linearity in the …
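
As a minimal sketch (not from the post), ReLU itself is only a few characters of NumPy:

import numpy as np

def relu(x):
    # pass positive values through unchanged and clamp negative values to zero
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]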

Inclusion-Exclusion Principle

The Inclusion-Exclusion principle states that for two events A and B, \(P(A\cup B) = P(A) + P(B) - P(A\cap B)\)

This can …
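
For example, when rolling a fair die, let A be the event "the roll is even" and B the event "the roll is at least 4". Then \(P(A) = 3/6\), \(P(B) = 3/6\), and \(P(A\cap B) = 2/6\) (the rolls 4 and 6), so \(P(A\cup B) = 3/6 + 3/6 - 2/6 = 4/6\), which matches counting the outcomes 2, 4, 5, and 6 directly.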

Precision and Recall

The performance of classification models is often measured by the accuracy score: the number of correct predictions over the total output …
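
As a rough sketch with invented counts, accuracy, precision, and recall can all be read off the same four confusion-matrix entries:

# invented counts for illustration
tp, fp, fn, tn = 40, 10, 5, 45

accuracy = (tp + tn) / (tp + fp + fn + tn)   # correct predictions over all predictions
precision = tp / (tp + fp)                   # of the predicted positives, how many were right
recall = tp / (tp + fn)                      # of the actual positives, how many were found
print(accuracy, precision, recall)           # 0.85 0.8 0.888...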

Negative Transformation

Image Enhancement is the process of accentuating the details of an image. We can use image enhancement techniques to improve …

Multiple Linear Regression

Machine learning models are not required to only have one input variable. For example, let’s say we want to predict …

Neural Network Predictions

Let's say we want to compute the following neural network's output prediction when it has been trained with these weights:

We …

What is a Line of Best Fit?

How does linear regression find this line of best fit given the data points? First, let’s get more formal about …

Measures of Central Tendency

Introduction: 

A measure of central tendency is a single number that attempts to describe a dataset by identifying the central …

Max Pooling

Convolutional layers are excellent for recording the precise location of features in the input image. A big problem of convolutional neural networks is …
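
A minimal NumPy sketch of 2x2 max pooling with stride 2 (illustrative only, not the post's implementation):

import numpy as np

def max_pool_2x2(feature_map):
    h, w = feature_map.shape
    # view the map as non-overlapping 2x2 blocks and keep the maximum of each block
    blocks = feature_map[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2)
    return blocks.max(axis=(1, 3))

x = np.array([[1, 3, 2, 4],
              [5, 6, 1, 2],
              [7, 2, 9, 0],
              [1, 8, 3, 4]])
print(max_pool_2x2(x))  # [[6 4]
                        #  [8 9]]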

Image Filtering

Filtering is one of the conventional image preprocessing steps. Various kinds of filtering can be done on an image during …

Classification Intuition

The basic structure for classification is this:

We take input data, feed it into one of many possible classification methods, …

Linear Regression Intuition

Imagine that you have recorded the hours spent studying for an exam and the score on the exam for several …

Non-Linear Regression

What if the x and y variables don’t have a linear relationship? For example, let’s say we want to build …

Overfitting Intuition 1

Let’s say we have the following data, and we learn the following piecewise line as a regression predictor:

Now, let’s …

Preventing Overfitting with Regression: Regularization

The previous example was for regression. A popular method to prevent overfitting in regression is called regularization. Recall that a …

Knowing When a Model is Overfitting

It is critical to understand the situations in which a model is overfitting. A common way of doing this for …

K Nearest Neighbors Algorithm

KNN is one of the simplest supervised learning algorithms for making predictions for a new data point. The KNN algorithm …

k-Means Clustering

Clustering data points together is one of the most common ways to analyze and understand unlabeled data. It identifies subgroups …

Independent Probabilities

A fundamental probability rule is:

\(P(A \cap B) = P(A \mid B)P(B) = P(B \mid A)P(A)\) …
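
When A and B are independent, \(P(A \mid B) = P(A)\), so the rule reduces to \(P(A\cap B) = P(A)P(B)\). For example, the probability that two fair coin flips both land heads is \(\frac{1}{2} \times \frac{1}{2} = \frac{1}{4}\).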

Independent Events in probability

Two events are said to be independent if the occurrence of one event does not affect the occurrence of the other event. …

Image Translation

The translation of an image is the process of moving or relocating an image or object from one location …

Translational Invariance

In the real world, images are not 2 pixels by 2 pixels. You might not even be able to see …

Image Thresholding

Thresholding is a basic image operation. During segmentation, we first separate pixels into two or more categories. Basically, we can classify pixels …
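
A minimal NumPy sketch of binary thresholding (the pixel values and the threshold of 128 are arbitrary):

import numpy as np

image = np.array([[ 12, 200,  90],
                  [180,  60, 255],
                  [ 30, 140,  10]], dtype=np.uint8)

threshold = 128
# pixels at or above the threshold become white (255), the rest become black (0)
binary = np.where(image >= threshold, 255, 0).astype(np.uint8)
print(binary)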

Image Subtraction

Image subtraction is the process in which the pixel values of one image are subtracted from those of another image. This results in …

Sharpening an Image

Image Sharpening is an image enhancement technique which increases the contrast between bright and dark regions to bring out the …

Classification on Images

Across all of computing, and not just in machine learning, images are represented as a matrix of pixels. A matrix …

Hyperparameter Tuning

Hyperparameters control the learning process of a model by determining the network structure and training. Hyperparameter tuning is the process …

Gradient Descent Clearly Explained: How Machine Learning Works

How does linear regression find the values of m and b that will minimize the mean square error? The process …
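
A minimal sketch of that process for a line y = m*x + b (made-up data, arbitrary learning rate and iteration count):

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.0, 5.0, 7.0, 9.0])   # roughly y = 2x + 1

m, b = 0.0, 0.0
learning_rate = 0.05
for _ in range(2000):
    y_pred = m * x + b
    # gradients of the mean squared error with respect to m and b
    grad_m = -2 * np.mean(x * (y - y_pred))
    grad_b = -2 * np.mean(y - y_pred)
    # step both parameters a little way down the gradient
    m -= learning_rate * grad_m
    b -= learning_rate * grad_b

print(m, b)   # approaches m ≈ 2, b ≈ 1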

Gaussian Distribution

The Gaussian distribution is a bell-shaped curve in which the values follow a normal distribution with a corresponding …

F1 Score

Several metrics can be used to evaluate the performance of a binary classifier. Accuracy is the simplest of all and …

Image Dithering

Image Dithering is a process of adding some noise to an image. This noise can be used to randomize the quantization error. …

Preventing Overfitting with Neural Networks: Dropout

The most popular regularization method for neural networks is called dropout. The idea is simple: at each iteration of training, …
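
A rough sketch of what a single training-time dropout step could look like in NumPy (the activations and keep probability are invented):

import numpy as np

activations = np.array([0.2, 1.5, 0.7, 0.9, 1.1])
keep_prob = 0.8

# randomly zero out each unit with probability 1 - keep_prob at this iteration
mask = (np.random.rand(activations.size) < keep_prob).astype(float)
# inverted dropout: rescale the survivors so the expected activation is unchanged
dropped = activations * mask / keep_prob
print(dropped)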

Image Data Augmentation

A convolutional neural network (CNN) is trained with images from a certain dataset, and the number of images in a …

The Convolution Operator

Let's say that you want to detect the features of a certain group of objects, or a class in other words. The features …

Using a CNN

Convolutional neural networks are perhaps the most impressive image classifier techniques being used today. They handle image and video data where the …

Average Pooling

Convolutional layers are great at recording the exact location of features in the image. A major issue for convolutional neural …

Convolutional Neural Networks (CNNs)

The central operation in the convolutional layer of a CNN is a convolution. In this section, we will describe convolutions.  …
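
As a hedged sketch (deep-learning libraries actually implement cross-correlation, which is what is shown here, and the image and kernel below are made up), a "valid" 2-D convolution can be written as:

import numpy as np

def convolve2d_valid(image, kernel):
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    # slide the kernel over every valid position and sum the elementwise products
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(16, dtype=float).reshape(4, 4)
kernel = np.array([[1.0, 0.0],
                   [0.0, -1.0]])
print(convolve2d_valid(image, kernel))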

Bayes Theorem

A central probability formula is the following:

\(P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}\)

Let’s think about why this equation is the case. If we are given …

Activation Functions

Without activation functions, a neural network would essentially be a weighted linear combination of inputs that can capture linear, simpler patterns in the …

K Nearest Neighbors

The k-nearest neighbors (k-NN) algorithm is a non-parametric, supervised learning method that can be used to solve both regression and classification …
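
As a hedged usage sketch for the classification case, scikit-learn's KNeighborsClassifier is one common implementation (the toy points below are invented):

from sklearn.neighbors import KNeighborsClassifier

X = [[1, 1], [1, 2], [6, 5], [7, 6]]   # toy 2-D training points
y = [0, 0, 1, 1]                       # their class labels

# classify a new point by a majority vote among its 3 nearest neighbors
model = KNeighborsClassifier(n_neighbors=3)
model.fit(X, y)
print(model.predict([[2, 2]]))         # likely class 0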

Decision Trees

Another popular type of machine learning model is called a “decision tree”. This method is quite intuitive. If we wanted …

Evaluating Classifiers

It is important to properly measure how well our machine learning classifiers do. While we may think that we can …

Confusion Matrices

Another useful visual tool for evaluating classifiers is called a confusion matrix. Confusion matrices show, for each category that …

Balanced Datasets

It is crucial that data sets are balanced across classes. A balanced data set contains the same number of data …

Addressing Bias and Fairness via Separate Classifiers for Each Group

Even with oversampling of the data from underrepresented groups, the model may still not learn as well for this group …

Explainable and Interpretable AI

In order to verify that a model is not using biased assumptions to make its predictions, it is crucial to …

Reproducibility

Reproducibility of machine learning models is crucial for a variety of reasons. In terms of research ethics, it is critical …

Clustering Intuition

The intuition for clustering is simple. Let’s look at a few examples. Imagine you are a professor, and you have …

K Means

The simplest method for clustering is called k-means clustering. In k-means clustering, you input an argument k which represents the …
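
A hedged usage sketch with scikit-learn (toy points, k = 2):

from sklearn.cluster import KMeans
import numpy as np

points = np.array([[1.0, 1.0], [1.5, 2.0], [8.0, 8.0], [9.0, 9.5]])

# k is the number of clusters to look for in the data
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0)
labels = kmeans.fit_predict(points)
print(labels)                   # e.g. [0 0 1 1] (cluster ids are arbitrary)
print(kmeans.cluster_centers_)  # the two learned cluster centers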

Dimensionality Reduction Intuition

Sometimes, there are many input variables for a classifier. In most of the examples we have shown so far, we …

Principal Component Analysis (PCA) Intuition

The simplest method of dimensionality reduction is called Principal Component Analysis, or PCA. The intuition behind how it works …

Predicting with a Neural Network

Neural networks are made up of nodes and edges. Nodes are analogous to neurons in the brain, and edges …

Network Activation Functions

We first encountered activation functions when we covered logistic regression. Recall that binary logistic regression involves simply applying the sigmoid …

Backpropagation

Training a neural network with gradient descent involves backpropagation, which is an efficient method for calculating the gradient …

Understanding Derivatives

The fundamental building block of calculus is the derivative. The derivative is a way of measuring the rate of …

Deriving Derivative Rules

We can use the general formula for the derivative to derive derivative rules which can be used to quickly calculate …

Calculating Integrals

The integral is the inverse of the derivative. To calculate an indefinite integral, we simply find the antiderivative, …

Basics of Probability

Logistic Regression

Logistic regression is one of many methods for doing classification and is usually the first method that you learn about. …

K Nearest Neighbors (KNN) Classification

A very different, but potentially easier to understand, classification method is k-nearest neighbors. This method is best described with …