An overview of gradient descent optimization algorithms

@tachyeonz: Gradient descent is one of the most popular algorithms to perform optimization and by far the most common way to optimize neural networks. At the same time, every state-of-the-art Deep Learning library contains implementations of various algorithms to optimize gradient descent (e.g. lasagne's, caffe's, and keras' documentation).
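
For a concrete picture of what the article surveys, below is a minimal sketch of the vanilla update rule theta <- theta - lr * grad(theta) together with its momentum variant. The toy quadratic objective, the function names (gradient_descent, momentum), and the hyperparameter values are illustrative assumptions for this demo, not code from the article.

    import numpy as np

    # Vanilla (batch) gradient descent: step against the full gradient.
    def gradient_descent(grad_fn, theta, lr=0.1, n_steps=100):
        for _ in range(n_steps):
            theta = theta - lr * grad_fn(theta)
        return theta

    # Momentum: accumulate a decaying velocity to damp oscillations.
    def momentum(grad_fn, theta, lr=0.1, gamma=0.9, n_steps=100):
        v = np.zeros_like(theta)
        for _ in range(n_steps):
            v = gamma * v + lr * grad_fn(theta)
            theta = theta - v
        return theta

    # Toy objective f(theta) = theta . theta has gradient 2 * theta,
    # so both optimizers should drive theta toward the origin.
    grad_fn = lambda theta: 2.0 * theta
    theta0 = np.array([3.0, -4.0])
    print(gradient_descent(grad_fn, theta0.copy()))  # approaches [0, 0]
    print(momentum(grad_fn, theta0.copy()))          # approaches [0, 0]

The variants the article covers (Adagrad, RMSprop, Adam, and the rest) differ mainly in how they shape this step from the raw gradient, for example by adapting the learning rate per parameter.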

Tags: #adagrad, #adam, #algorithms, #batch, #gradientdescent, #machinelearning, #mini-batch, #momentum, #newton, #optimization, #rmsprop, #stochastic

Published On: December 04, 2016 at 12:35 AM

Connect On:
Facebook: /tachyeonz
Twitter: @tachyeonz
