Deep Learning

Santosh Thapa
4 min read · Nov 23, 2020

--

Deep learning is a subset of machine learning that is used for large datasets on which even traditional Machine Learning algorithms fail to provide a good level of accuracy. The impact of Deep Learning is now increasing exponentially because many complex problems that were once considered impossible are now being solved with its help. Before diving in further, let’s look back at its history.

The perceptron, a simple neural network in Deep Learning terms, was invented by Frank Rosenblatt in 1957. The concept was simple: just train a model on some inputs with some weights. However, this concept failed because it was unable to handle certain important problems, such as the XOR problem, and there were not enough computing resources at the time. The chapter was closed when researchers criticized it in 1969. In the 1970s, however, the concept of backpropagation was introduced, which was able to solve the problems that the perceptron could not. The main idea behind backpropagation is that errors are propagated backwards through the neurons to update the mistaken weights and improve the model. This made neural networks popular again in the Data Science and Artificial Intelligence world, and various developments have been made in the field of Deep Learning since then.

In the 1980s, the Convolutional Neural Network (CNN) arrived, which was a boon for image classification and object detection problems. Similarly, the development of the Recurrent Neural Network (RNN) pushed Deep Learning to the next level. And the main thing is that it is still not stopping; in my view, this is just the beginning era of Deep Learning. I am sure its popularity will keep growing exponentially in the coming future, as it can solve problems that are more analogous to human abilities, such as image recognition, speech synthesis, and self-driving cars. New research in Deep Learning is being done day by day, and it is thanks to Deep Learning that we can now use advanced features like interacting with Apple’s Siri, Amazon’s Alexa, and Microsoft’s Cortana. This is just one simple example; there are many other benefits we are getting from the growth of Deep Learning, and we will get even more in the coming future.

Broadly, Deep Learning is divided into different types (a short code sketch follows this list):

  1. Artificial Neural Network (ANN): This is the basic neural network that we learn from scratch. This network is analogous to the human brain.
  2. Convolutional Neural Network (CNN): This type of neural network is used especially for images, videos, and computer vision.
  3. Recurrent Neural Network (RNN): This type of neural network is the time-series version of the Artificial Neural Network. It is the most common neural network used in Natural Language Processing (NLP).
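To make the distinction more concrete, here is a minimal sketch of how each of these three network types might be defined with Keras (tensorflow.keras); the layer sizes and input shapes are illustrative assumptions only, not fixed rules:

```python
# Minimal sketches of the three network types using tensorflow.keras.
# Layer sizes and input shapes below are illustrative assumptions.
from tensorflow.keras import layers, models

# 1. ANN: a plain fully connected (dense) network.
ann = models.Sequential([
    layers.Dense(32, activation="relu", input_shape=(10,)),
    layers.Dense(1, activation="sigmoid"),
])

# 2. CNN: convolution + pooling layers, typically for images.
cnn = models.Sequential([
    layers.Conv2D(16, (3, 3), activation="relu", input_shape=(28, 28, 1)),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])

# 3. RNN: a recurrent layer over a sequence, typical for time series and NLP.
rnn = models.Sequential([
    layers.SimpleRNN(32, input_shape=(None, 8)),
    layers.Dense(1),
])
```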

Some basic terms used in Deep Learning (a worked single-neuron sketch follows this list):

  1. Inputs: Inputs are the features or values that will be passed to the neural network. Inputs can be text, videos, audio, and images.
  2. Neuron: It is the fundamental or the basic unit of a neural network.
  3. Activation Function: A function that provides a non-linear output. It basically decides whether a neuron should be activated or not; its main purpose is to introduce non-linearity into the network.
  4. Weights: Weights are attached to every input that is connected to a neuron. They are what we optimize to reduce the model's errors: we cannot change the inputs, but the weights can be adjusted until the model reaches the desired accuracy.
  5. Loss function: A measure of the difference between the actual and predicted values. The main aim is to reduce this loss using an optimization technique like Gradient Descent.
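Putting these terms together, the short NumPy sketch below shows a single neuron: the inputs are combined with the weights, passed through an activation function, and the prediction is compared with the actual value by a loss function. All numbers here are made up purely for illustration:

```python
import numpy as np

# Inputs (features) and weights; the weights are what training adjusts.
inputs = np.array([0.5, 0.1, 0.4])
weights = np.array([0.2, 0.8, -0.5])
bias = 0.1

def sigmoid(z):
    # Activation function: squashes the weighted sum into (0, 1),
    # introducing non-linearity into the network.
    return 1.0 / (1.0 + np.exp(-z))

# Neuron: weighted sum of the inputs, then the activation function.
weighted_sum = np.dot(inputs, weights) + bias
predicted = sigmoid(weighted_sum)

# Loss function: squared error between the actual and predicted values.
actual = 1.0
loss = (actual - predicted) ** 2
print("prediction:", predicted, "loss:", loss)
```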

Another reason for the rise of Deep Learning is the availability of powerful hardware and software, which makes it possible to run Deep Learning workloads. As I mentioned earlier, Deep Learning can solve the non-linear and complex problems that even normal machine learning algorithms fail to solve.

Simple steps to develop an Artificial Neural Network (a minimal NumPy sketch of these steps follows the list):

  1. Randomly initialize the weights to small numbers close to zero, but not exactly zero, otherwise all the inputs fed to the neurons will produce zeros.
  2. Feed the input data to the input layer. The input can be in any form: text, images, or videos.
  3. Perform forward propagation from left to right so that the neurons are activated.
  4. Compare the predicted output with the actual output and observe the loss function.
  5. If the error is large or we have not achieved the desired accuracy, perform backpropagation from right to left, starting from the last node, to update the weights.
  6. Repeat steps 3 to 5 until the desired level of accuracy is achieved.
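The following NumPy sketch walks through these steps for a tiny single-layer network. The toy data, learning rate, and number of iterations are assumptions chosen only for illustration:

```python
import numpy as np

# Toy data: inputs X and actual outputs y (illustrative only).
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([[0.0], [1.0], [1.0], [1.0]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Step 1: randomly initialize the weights to small numbers close to zero.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.01, size=(2, 1))
b = np.zeros((1,))
lr = 0.5  # learning rate (an assumed value)

for epoch in range(1000):
    # Steps 2-3: feed the inputs and forward propagate (left to right).
    pred = sigmoid(X @ W + b)

    # Step 4: compare predicted and actual output via the loss (mean squared error).
    loss = np.mean((y - pred) ** 2)

    # Step 5: backpropagate (right to left) and update the weights.
    grad_pred = 2 * (pred - y) / len(X)      # d(loss)/d(pred)
    grad_z = grad_pred * pred * (1 - pred)   # chain rule through the sigmoid
    W -= lr * (X.T @ grad_z)
    b -= lr * grad_z.sum(axis=0)

# Step 6: repeat until the desired level of accuracy is achieved.
print("final loss:", loss)
```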

Some important Deep Learning Libraries (a short code comparison follows the list):

  1. TensorFlow: a Deep Learning library from Google.
  2. Keras: a Deep Learning library that runs on top of TensorFlow and Theano.
  3. PyTorch: a Deep Learning library from Facebook.
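As a quick taste of how these libraries look in code, here is a small sketch that defines the same fully connected layer in Keras (running on top of TensorFlow) and in PyTorch; the layer sizes are arbitrary illustrations:

```python
# The same single fully connected layer expressed with two of the libraries above.
# The sizes (8 inputs, 4 outputs) are arbitrary.

# Keras (running on top of TensorFlow):
from tensorflow.keras import layers, models
keras_model = models.Sequential([layers.Dense(4, activation="relu", input_shape=(8,))])

# PyTorch (from Facebook):
import torch.nn as nn
torch_model = nn.Sequential(nn.Linear(8, 4), nn.ReLU())
```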

Some applications of Deep Learning:

  1. Image Recognition / Object Detection
  2. Speech Recognition
  3. Natural Language Processing
  4. Self-Driving Cars
