
Tuesday, July 14, 2020

What is the difference between KNN and K means clustering?

SMART SUBU


 


We have already discussed supervised and unsupervised machine learning algorithms in our previous sections. Let us understand K-means clustering first. It is an unsupervised machine learning algorithm used for clustering: it is trained to divide the whole data set into a number of clusters decided by the user. K is the number of predefined clusters the algorithm is supposed to form.

KNN, by contrast, is the K Nearest Neighbours algorithm, a supervised learning technique used for regression or classification. Given a new, unlabelled data point, the algorithm locates its nearest neighbours in the labelled training data based on similarity of features and predicts from them.

Here, the number ‘K’ represents the number of nearest neighbours that the algorithm is supposed to consult.
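A minimal sketch contrasting the two, assuming scikit-learn is available; the data points and the values of K are made up for illustration:

```python
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

# Two obvious groups of 2-D points.
X = [[1, 1], [1, 2], [2, 1], [8, 8], [8, 9], [9, 8]]

# K-means: unsupervised -- no labels supplied, K = 2 clusters requested.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(kmeans.labels_)  # each point is assigned to one of the 2 clusters

# KNN: supervised -- labels must be supplied, K = 3 neighbours consulted.
y = [0, 0, 0, 1, 1, 1]
knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(knn.predict([[8, 8]]))  # classifies a new point from its neighbours
```

Note that K means something different in each case: a count of clusters for K-means, a count of neighbours for KNN.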

If you are interested to know more about Machine Learning Algorithms, you can mail smartsubu2020@gmail.com.

Monday, July 13, 2020

What is the process of deployment of models?

SMART SUBU



In simple terms, deployment of any machine learning model starts with identifying the application that we want to build.

Second, we should understand the data requirements of the model and then gather the data. After that, we choose our model, train it on the data, and test the accuracy of its predictions.

Next, we create a web application using an appropriate framework; for example, the most common Python-based web framework is Flask. Then we complete the code and put it in a repository such as GitHub.

We then create an account with a service provider specialising in platform as a service, the most common being Heroku, and link that Heroku account with our GitHub repository.

Heroku offers a free tier that can be used to create the API for the machine learning model we have built. We need to understand that the libraries the code depends on must be declared so that they are available in the Heroku platform and environment.
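As a sketch of the Flask step above, a minimal API for a model might look like the following; the `predict` function here is a hypothetical stand-in for a real trained model, which in practice you would load from a saved file:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def predict(features):
    # Placeholder scoring logic standing in for a trained model's
    # predict method; replace with a loaded model in a real deployment.
    return sum(features)

@app.route("/predict", methods=["POST"])
def predict_endpoint():
    # Accept a JSON body like {"features": [1, 2, 3]} and return a prediction.
    data = request.get_json()
    result = predict(data["features"])
    return jsonify({"prediction": result})
```

On Heroku, the dependencies for this app (such as `flask`) would be listed in a `requirements.txt` so the platform installs them before running the code.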

Sunday, July 12, 2020

What is deep learning?

SMART SUBU

In very simplistic terms, deep learning is a subset of the artificial intelligence design paradigm based on the philosophy of mimicking the human way of learning.

Why it is called Deep Learning is mainly attributed to Geoffrey Hinton's usage of the term. But another, more philosophical, interpretation may be that it helps the machine go deep into the dimensions of learning rather than staying at a superficial level.

In the brain, neurons fire based on the extent of a trigger and send signals for appropriate decisions.

In the same way, machines build experiential learning with the help of neural networks designed to facilitate the process of learning.

This leads to the question: what is a neural network and how does it work? We will discuss that in Neural Networks.

Friday, July 10, 2020

What are Neural Networks?

SMART SUBU




We all know about neurons from the biology lessons we had in school. Exactly. Your neurons were triggered, and your brain successfully processed the information, or input (the question I asked at the beginning).

Now, just think: if machines are also to be made artificially intelligent, all we need is a framework of neurons and an architecture of learning. Does that sound cool? Oh yes.

That is what a neural network is: a framework with an input layer (made of nodes), hidden layers (made of neurons, where the processing of the data happens) and finally an output layer (the result the machine produces).

Take a look at the image for clarification. Now you know what a neural network is. But how does it function? Well, we will take up that question in the Functioning of Neural Networks.
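The three kinds of layers described above can be pictured as a toy structure in code; the layer sizes and random weights here are assumptions purely for illustration:

```python
import random

# A toy network: 3 input nodes, 2 hidden neurons, 1 output node.
layer_sizes = [3, 2, 1]

# One weight matrix connects each layer to the next; values are
# random placeholders until the network is trained.
weights = [
    [[random.random() for _ in range(n_in)] for _ in range(n_out)]
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:])
]
print(len(weights))  # 2 weight matrices connect the 3 layers
```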




Thursday, July 9, 2020

Functioning of Neural Networks

SMART SUBU

What do you do when you touch something hot? We remove our hand, of course. That's obvious, isn't it? But I have not specifically told you how hot it is (the temperature), nor have I tried to assess your skin's tolerance of heat (the sensory inputs).

So your answer may not be a straight "yes, I remove my hand when I touch something hot". It depends on the extent of hotness (remember this as a degree) and your sensitivity (the tolerance of your skin layers towards heat; remember this as a trigger).

Now take the case of a machine.

We give a stimulus to the input layer, assign a degree (call it a weight) to the input, provide some predetermined function in the hidden layer (call it an activation function) and then analyse the result produced in the output layer.

For multiple inputs, a weighted sum is passed to the hidden layers. That is fine, but how does the machine decide which decision to take?

Well, that depends on which neurons are triggered, based on the activation function and the weights. We will take up this discussion in the Activation Function.
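The forward pass described above can be sketched in a few lines; the sigmoid activation and the specific input and weight values are assumptions for illustration:

```python
import math

def sigmoid(x):
    # Squashes the weighted sum into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs, then the activation function.
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return sigmoid(z)

# A stimulus with two inputs, each assigned a degree (weight):
output = neuron([0.5, 0.8], [0.4, -0.2], bias=0.1)
print(output)  # a value between 0 and 1
```

Whether this neuron "fires" strongly or weakly depends entirely on the weights and the activation function, which is exactly the point made above.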


Tuesday, July 7, 2020

What is an Activation Function?

SMART SUBU

Just as your hand reacts to the stimulus of heat to decide whether to stay or withdraw, the same logic applies to the activation function.

The hidden layers in the neural network receive the inputs and, based on the activation function, decide the output. As with any other mathematical function, the nature of the output depends on the functional form (whether sigmoid, ReLU, leaky ReLU or any other).

Imagine you have trained yourself in heat tolerance; in that case you will only be activated at a higher degree of heat than others. Similarly, the nature of the reaction of the hidden layers is decided by the triggering agents called activation functions.

Different activation functions produce different outputs and thus decide the nature of the output (decisions) generated by the machine.

Thus, the activation function and the weights of the inputs play a critical role in determining the output. Well, this is fine, but how do we train the neural network? We will talk about that in Training the Neural Network.
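The functional forms mentioned above can be written out directly; the test values and the leaky-ReLU slope of 0.01 are assumptions for illustration:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))  # smooth, output always in (0, 1)

def relu(x):
    return max(0.0, x)                 # exactly zero for negative inputs

def leaky_relu(x, alpha=0.01):
    return x if x > 0 else alpha * x   # small slope for negative inputs

# The same stimulus triggers each function differently:
for f in (sigmoid, relu, leaky_relu):
    print(f.__name__, f(-2.0), f(2.0))
```

Notice how the same negative input is suppressed to zero by ReLU, barely survives under leaky ReLU, and is softly squashed by the sigmoid; that choice shapes the decisions the network can express.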

Monday, July 6, 2020

How can we train the Neural Network?

SMART SUBU

How do we humans train ourselves? Well, the simple answer: by experience. But how do we experience? The way we react to inputs decides our experience.

We reinforce our reactions to inputs towards the desired levels (of society or parents). Simple. In just the same way, training a neural network begins by giving it certain inputs, with random weights assigned to those inputs.

The hidden layers react according to the activation function, and the output produced is compared with the desired outcome.

The gap between the predicted outcome (by the machine) and the desired outcome is measured by what is technically known as the Loss Function, which needs to be minimised.

This particular process is known as a forward pass, or feed-forward propagation. But the predicted outcome (by the machine) seldom conforms to the desired outcome at first. By now you are aware that the weights of the inputs play a critical role in deciding the machine's predicted output.

To complete the training and calibrate the weights of the inputs, the method of backward propagation is followed until the difference between the predicted outcome (by the machine) and the desired outcome is minimised. This completes the training of the neural network.

Well, we still need to take up the calibration of the weights. That's Backpropagation. We will talk about that later. Until then, goodbye.
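The forward-pass and weight-calibration cycle above can be sketched with a single weight; the squared loss, the learning rate of 0.1 and the target value are all assumptions for illustration:

```python
def train(x, y, lr=0.1, steps=100):
    """Learn a weight w so that the prediction w * x approaches the target y."""
    w = 0.0  # start from an arbitrary initial weight
    for _ in range(steps):
        pred = w * x                     # forward pass: predicted outcome
        loss_grad = 2 * (pred - y) * x   # derivative of squared loss w.r.t. w
        w -= lr * loss_grad              # calibrate the weight towards the target
    return w

w = train(x=1.0, y=2.0)
print(w)  # converges towards 2.0, where the loss is minimised
```

Each pass shrinks the gap between the predicted and desired outcome, which is exactly the loss-minimisation described above.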



Sunday, July 5, 2020

What is Back Propagation?

SMART SUBU

When we were children, we ate a red food item. Oh my God, that was sweet! You thought I would write "hot". Well, in your experience it may have been hot. Either way, we associated the red colour of food items with either sweetness or hotness. That is calibration at work in the human system.

How quickly we learn things depends on that calibration. In exactly the same fashion, the weights of the inputs are calibrated by assigning new weights in a neural network.

The new weights are arrived at by subtracting from the old weights the rate of change of the loss function with respect to the old weights, scaled by the learning rate. Confusing? See the equation in the picture.

The learning rate decides the pace of calibration and is a crucial component. For multiple inputs with multiple weights, backpropagation follows the chain rule of derivatives, which results in a scary-looking formula.

Well, let's forget the formula for now.

Thus, backpropagation ensures the calibration of weights until the loss function is minimised. But how is the loss function, or error, minimised? Well, with the help of an Optimizer. We will take up Gradient Descent (one of the optimizers) in our next discussion.
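Since the picture with the equation is not reproduced here, the update rule described in the text can be written out; the symbols are assumed, with η the learning rate, L the loss function, and w a weight:

```latex
w_{\text{new}} = w_{\text{old}} - \eta \, \frac{\partial L}{\partial w_{\text{old}}}
```

For multiple layers, the chain rule mentioned above expands the gradient through the prediction ŷ and the weighted sum z, one factor per step of the network:

```latex
\frac{\partial L}{\partial w}
= \frac{\partial L}{\partial \hat{y}}
\cdot \frac{\partial \hat{y}}{\partial z}
\cdot \frac{\partial z}{\partial w}
```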
