Just as your hand reacts to the stimulus of heat and decides whether to stay or withdraw, an activation function decides how a neuron responds to its inputs. The hidden layers of the neural network receive the inputs and, based on the activation function, decide the output. As with any other mathematical function, the nature of the output depends on its functional form (whether sigmoid, ReLU, leaky ReLU, or any other). Imagine you have trained yourself in heat tolerance; in that case, it takes a higher degree of heat to make you react than it does for others. Similarly, how the hidden layers react is decided by these triggering agents, called activation functions. Different activation functions produce different outputs and thus shape the nature of the decisions the machine generates.
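To make this concrete, here is a minimal sketch in plain NumPy (not code from the text) of the three activation functions named above, each applied to the same inputs. The leaky-ReLU slope of 0.01 is an assumed, commonly used default.

```python
import numpy as np

def sigmoid(z):
    # Squashes any real number into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Passes positive values through, zeroes out negatives
    return np.maximum(0.0, z)

def leaky_relu(z, alpha=0.01):
    # Like ReLU, but keeps a small slope for negative inputs
    # (alpha=0.01 is an assumed default, not fixed by the text)
    return np.where(z > 0, z, alpha * z)

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print("sigmoid:   ", sigmoid(z))
print("relu:      ", relu(z))
print("leaky relu:", leaky_relu(z))
```

Running this shows the same inputs emerging as quite different outputs depending on which function is chosen, which is exactly why the choice of activation function shapes the network's decisions.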
Thus, the activation function and the weights of the inputs play a critical role in determining the output, as the sketch below illustrates.
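As a quick illustration of that interplay (a sketch with made-up numbers, not values from the text), a single neuron's output is just the activation function applied to the weighted sum of its inputs:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical inputs, weights, and bias for one neuron
x = np.array([0.5, -1.2, 3.0])   # inputs
w = np.array([0.8, 0.1, -0.4])   # weights
b = 0.25                         # bias

z = np.dot(w, x) + b             # weighted sum of the inputs
output = sigmoid(z)              # the activation decides the output
print(output)                    # ~0.34 for these made-up values

# Change the weights, or swap sigmoid for ReLU, and the same
# inputs yield a different output: both choices shape the result.
```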
Well, this is fine. But how do we train the neural network? We will talk about that in Training the Neural Network.