In this fourth tutorial on deep learning with Python and TensorFlow, we're going to look at Gradient Descent.
Gradient Descent is a very popular optimization algorithm that has been used extensively across deep learning projects.
The math behind gradient descent might be a bit intimidating, which is why we're going to leave it somewhat aside for now. Here we're trying to get an intuitive understanding of it.
As an optimization algorithm, the purpose of gradient descent is to update the parameters, the weights and the bias, so as to minimize the gap between the predicted output and the target output. In other words, it minimizes the loss and, in doing so, improves the accuracy of your model.
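To make that idea concrete, here's a minimal sketch of gradient descent fitting a tiny linear model. This is not the exact code from the video; it assumes TensorFlow 2.x and uses made-up toy data, but it shows the core loop: compute the loss, compute the gradients, then nudge the weight and bias a small step against their gradients.

```python
import tensorflow as tf

# Toy data following y = 3x + 2 (illustrative values, not from the video)
xs = tf.constant([0.0, 1.0, 2.0, 3.0, 4.0])
ys = tf.constant([2.0, 5.0, 8.0, 11.0, 14.0])

# The parameters: one weight and one bias, initialized arbitrarily
w = tf.Variable(0.0)
b = tf.Variable(0.0)

learning_rate = 0.01

for step in range(500):
    with tf.GradientTape() as tape:
        predictions = w * xs + b
        # Mean squared error: the "gap" between predictions and targets
        loss = tf.reduce_mean(tf.square(predictions - ys))
    # Gradients of the loss with respect to each parameter
    dw, db = tape.gradient(loss, [w, b])
    # Step each parameter a little bit against its gradient
    w.assign_sub(learning_rate * dw)
    b.assign_sub(learning_rate * db)

print(w.numpy(), b.numpy())  # should approach 3 and 2
```

The learning rate controls how big each step is: too small and training is slow, too large and the updates can overshoot the minimum.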
Please see the video for the complete walk-through.
To stay in touch with me, follow @cristi
Cristi Vlad, Self-Experimenter and Author