Building a Convolutional Neural Network: Male vs Female
In this blog, we are going to classify images using a Convolutional Neural Network (CNN), and for deployment, you can use Colab, Kaggle, or even your local machine, since the dataset size is not very large. At the end of this, you will be able to build your own...
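A minimal sketch of the kind of model such a post builds, assuming PyTorch, 64x64 RGB inputs, and a single male/female logit; the layer sizes are illustrative assumptions, not the article's exact architecture:

```python
# A small CNN for binary (male vs. female) image classification.
# Assumes 64x64 RGB inputs; every size here is a placeholder.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),  # 3 input channels (RGB)
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64),
            nn.ReLU(),
            nn.Linear(64, 1),                            # single logit: male vs. female
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = SmallCNN()
dummy = torch.randn(8, 3, 64, 64)        # a fake batch stands in for the dataset
logits = model(dummy)
loss = nn.BCEWithLogitsLoss()(logits, torch.randint(0, 2, (8, 1)).float())
print(logits.shape, loss.item())
```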
Adversarial Attacks on Deep Neural Networks
Our deep neural networks are powerful machines, but what we don’t understand can hurt us. As sophisticated as they are, they’re highly vulnerable to small attacks that can radically change their outputs. As we go deeper into the capabilities of our networks, we must examine how these networks really...
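One concrete example of such an attack is the fast gradient sign method (FGSM); the sketch below uses a toy PyTorch model and a random stand-in input, and only illustrates the general recipe of nudging the input along the sign of the loss gradient:

```python
# FGSM sketch: a perturbation bounded by epsilon in the direction of the
# input gradient's sign can flip a model's prediction. The tiny model and
# random input are placeholders, not a real trained network.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(1, 10, requires_grad=True)   # stand-in for a real input
y = torch.tensor([0])                        # its true label

# Forward/backward pass to get the gradient of the loss w.r.t. the input.
loss = loss_fn(model(x), y)
loss.backward()

epsilon = 0.1                                # attack budget: the "small" perturbation
x_adv = x + epsilon * x.grad.sign()          # FGSM step

print("clean prediction:      ", model(x).argmax(dim=1).item())
print("adversarial prediction:", model(x_adv).argmax(dim=1).item())
```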
Building Neural Networks with Perceptron, One Year Later — Part III
Inside Perceptron: Each neuron in a neural network like Perceptron will, at some point, have a value. Each weight (the links between neurons) will also have a value, all of which the user initially sets to random decimals within a specified range. This is the third part in a three-part...
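As a rough illustration of that setup (not Perceptron's actual API), here is a NumPy sketch in which every weight starts as a random decimal drawn from a user-chosen range and each neuron's value is computed from the weighted sum of its inputs:

```python
# Every connecting weight starts out as a random decimal from a specified
# range; each neuron's value is then derived from its weighted inputs.
import numpy as np

rng = np.random.default_rng(0)
low, high = -0.5, 0.5                  # the "specified range" for initial weights

n_inputs, n_hidden = 4, 3
weights = rng.uniform(low, high, size=(n_inputs, n_hidden))  # one value per link
bias = rng.uniform(low, high, size=n_hidden)

inputs = np.array([0.2, 0.7, 0.1, 0.9])        # the input neurons' values

# Each hidden neuron's value: weighted sum of inputs, squashed by a sigmoid.
hidden_values = 1.0 / (1.0 + np.exp(-(inputs @ weights + bias)))
print(hidden_values)
```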
Classic Regularization Techniques in Neural Networks
Neural networks are notoriously tricky to optimize. There isn’t a way to compute a global optimum for weight parameters, so we’re left fishing around in the dark for acceptable solutions while trying to ensure we don’t overfit the data. This is a quick overview of the most popular model regularization...
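For reference, two of the most common techniques, L2 weight decay and dropout, can be sketched in a few lines of PyTorch; the layer sizes and hyperparameters below are placeholder assumptions:

```python
# Two classic regularizers: L2 weight decay (added to the objective via the
# optimizer) and dropout (randomly zeroing activations during training).
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),        # dropout: drops half the units at train time
    nn.Linear(64, 2),
)

# weight_decay adds an L2 penalty on the weights to the loss.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)

x, y = torch.randn(16, 20), torch.randint(0, 2, (16,))
loss = nn.CrossEntropyLoss()(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```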
5 Essential Neural Network Algorithms
Data scientists use many different algorithms to train neural networks, and there are many variations of each. In this article, I will outline five algorithms that will give you a rounded understanding of how neural networks operate. I will start with an overview of how a neural network works...
Choosing the Right Activation Function in a Neural Network
Activation functions are one of the many parameters you must choose to gain optimal success and performance with your neural network. In this article, I’m going to assume you understand the basics of how a neural network works, and will...
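As a quick reference, here is a small NumPy sketch of three common activation functions and the ranges they squash a neuron's pre-activation into; the input values are arbitrary and purely illustrative:

```python
# Three common activation choices and how they shape a neuron's output.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))   # squashes to (0, 1)

def tanh(z):
    return np.tanh(z)                 # squashes to (-1, 1)

def relu(z):
    return np.maximum(0.0, z)         # zero for negatives, identity otherwise

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print("sigmoid:", sigmoid(z))
print("tanh:   ", tanh(z))
print("relu:   ", relu(z))
```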
An Overview of Multi-Task Learning in Deep Neural Networks
Note: If you are looking for a review paper, this blog post is also available as an article on arXiv. Table of contents: Introduction; Motivation; Two MTL methods for Deep Learning (Hard parameter sharing, Soft parameter sharing); Why does MTL work? (Implicit data augmentation, Attention focusing, Eavesdropping, Representation bias, Regularization)...
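To make the first of those two methods concrete, here is a minimal PyTorch sketch of hard parameter sharing, with a shared trunk and per-task heads; the two tasks and all layer sizes are illustrative assumptions, not taken from the post:

```python
# Hard parameter sharing: the hidden layers are shared across tasks,
# and each task keeps its own output head.
import torch
import torch.nn as nn

class HardSharingMTL(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared trunk: identical parameters serve every task.
        self.shared = nn.Sequential(nn.Linear(32, 64), nn.ReLU())
        # Task-specific heads: separate parameters per task.
        self.head_a = nn.Linear(64, 3)   # e.g. a 3-class classification task
        self.head_b = nn.Linear(64, 1)   # e.g. a regression task

    def forward(self, x):
        h = self.shared(x)
        return self.head_a(h), self.head_b(h)

model = HardSharingMTL()
x = torch.randn(8, 32)
out_a, out_b = model(x)
# The joint loss is typically a (weighted) sum of the per-task losses.
loss = nn.CrossEntropyLoss()(out_a, torch.randint(0, 3, (8,))) \
     + nn.MSELoss()(out_b, torch.randn(8, 1))
loss.backward()
```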