Neural Networks and TensorFlow - Deep Learning Series [Part 4]


In this fourth tutorial on deep learning with Python and TensorFlow, we're going to look at Gradient Descent.

Gradient Descent is a very popular optimization algorithm that has been used extensively across deep learning projects.

The math behind gradient descent can be a bit intimidating, so we're going to set it aside for now and focus on building an intuitive understanding of how it works.

As an optimization algorithm, the purpose of gradient descent is to update the model's parameters, the weights and the biases, so as to shrink the gap between the predicted output and the target output. In other words, it minimizes the loss and thereby increases the accuracy of your model.
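To make this concrete, here is a minimal sketch of one gradient descent loop in TensorFlow, fitting a single weight and bias to some toy data with a mean squared error loss. The toy data, the learning rate, and the use of the TensorFlow 2 `GradientTape` API are illustrative assumptions on my part, not the exact code from the video.

```python
import tensorflow as tf

# Toy data roughly following y = 3x + 2 (made up for illustration)
x = tf.constant([[1.0], [2.0], [3.0], [4.0]])
y = tf.constant([[5.1], [7.9], [11.2], [13.8]])

# Parameters (weight and bias) start at arbitrary values
w = tf.Variable([[0.0]])
b = tf.Variable([0.0])

learning_rate = 0.01

for step in range(200):
    with tf.GradientTape() as tape:
        y_pred = tf.matmul(x, w) + b                   # model's predicted output
        loss = tf.reduce_mean(tf.square(y_pred - y))   # gap between prediction and target

    # Gradients of the loss with respect to the parameters
    dw, db = tape.gradient(loss, [w, b])

    # Gradient descent update: step a little bit against the gradient
    w.assign_sub(learning_rate * dw)
    b.assign_sub(learning_rate * db)

print("w:", w.numpy(), "b:", b.numpy(), "loss:", loss.numpy())
```

Each pass computes the loss, asks TensorFlow for the gradients of that loss with respect to `w` and `b`, and then nudges both parameters a small step against their gradients. Repeating this over and over is what drives the loss down.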

Please see the video for the complete walk-through.

To stay in touch with me, follow @cristi


Cristi Vlad, Self-Experimenter and Author
