Exploring LSTMs

@tachyeonz : The first time I learned about LSTMs, my eyes glazed over. Not in a good, jelly donut kind of way.

Rohan & Lenny #3: Recurrent Neural Networks & LSTMs

@tachyeonz : It seems like most of our posts on this blog start with “We’re back!”, so… you know the drill. It’s been a while since our last post — just over 5 months — but it certainly doesn’t feel that way.

Phase-Functioned Neural Networks for Character Control

@tachyeonz : This year at SIGGRAPH I am presenting Phase-Functioned Neural Networks for Character Control. This paper uses a new kind of neural network called a “Phase-Functioned Neural Network” to create a character controller suitable for games.
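The central idea, roughly, is that the controller's network weights are not fixed: they are produced by a cyclic "phase function" (in the paper, a Catmull-Rom spline over a small set of control weights) indexed by where the character is in its gait cycle. Below is a minimal NumPy sketch of that weight-blending idea only; the shapes and function names are my own, not code from the paper.

```python
import numpy as np

def catmull_rom(w0, w1, w2, w3, mu):
    """Cubic Catmull-Rom interpolation between control points w1 and w2."""
    return ((-0.5 * w0 + 1.5 * w1 - 1.5 * w2 + 0.5 * w3) * mu ** 3
            + (w0 - 2.5 * w1 + 2.0 * w2 - 0.5 * w3) * mu ** 2
            + (-0.5 * w0 + 0.5 * w2) * mu
            + w1)

def blend_weights(control, phase):
    """Blend 4 control weight tensors by the gait phase in [0, 2*pi)."""
    p = 4.0 * phase / (2.0 * np.pi)             # position along the cyclic spline
    k = int(p)                                  # active spline segment
    mu = p - k                                  # fraction within the segment
    pts = [control[(k + i - 1) % 4] for i in range(4)]
    return catmull_rom(*pts, mu)

# Example: one layer whose weights depend on the current phase.
control_W = [np.random.randn(32, 16) * 0.1 for _ in range(4)]
W = blend_weights(control_W, phase=1.3)         # weights for this frame
y = np.maximum(W @ np.random.randn(16), 0.0)    # ReLU layer using the blended weights
```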

Overcoming Catastrophic Forgetting in Neural Networks

@tachyeonz : I’ve found that the overwhelming majority of online information on artificial intelligence research falls into one of two categories: the first is aimed at explaining advances to lay audiences, and the second is aimed at explaining advances to other researchers.

Personalization and Scalable Deep Learning with MXNET

@tachyeonz : The presentation by Alex Smola, “Personalization and Scalable Deep Learning with MXNET”, is from MLconf San Francisco 2016. User return times and movie preferences are inherently time dependent.
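One way to capture that time dependence is a recurrent network over each user's event history. As a rough illustration only (not Smola's code, and written against MXNet's later Gluon interface, which postdates the talk; the layer sizes are assumptions), an LSTM over an event sequence could look like:

```python
from mxnet import nd, gluon

# Hypothetical shapes: 20 time steps of 8-dimensional user-event features, batch of 4.
seq_len, batch, feat = 20, 4, 8

net = gluon.nn.Sequential()
net.add(gluon.rnn.LSTM(hidden_size=32, layout='TNC'))  # encode the event history
net.add(gluon.nn.Dense(1, flatten=False))              # per-step score, e.g. a return-time signal
net.initialize()

x = nd.random.uniform(shape=(seq_len, batch, feat))
scores = net(x)
print(scores.shape)   # (20, 4, 1)
```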

How Neural Network Interactions Change In Working Memory

@tachyeonz : How does cross-talk in brain networks alter when working memory, the mental assembly of information needed to carry out a particular task, is engaged? Scientists at Massachusetts General Hospital have found that dopamine signaling in the cerebral cortex can predict changes in the extent of communication…

Unsupervised learning of 3D structure from images

@tachyeonz : “Unsupervised learning of 3D structure from images,” Rezende et al. (Google DeepMind), NIPS 2016. Earlier this week we looked at how deep nets can learn intuitive physics given an input of objects and the relations between them. If only there was some way to look at a 2D scene (e.g. …

Published On: January 10, 2017 at 09:21 AM

These Were The Best Machine Learning Breakthroughs Of 2016

@tachyeonz : “What were the main advances in machine learning/artificial intelligence in 2016?” originally appeared on Quora: the knowledge-sharing network where compelling questions are answered by people with unique insights.

Tags: 2016, breakthroughs, cntk, deep learning, dntk, gans, generative adversarial, innovations, lstm, machine learning, mxnet, neural networks, nips2016, probabilistic models, probabilistic programming, statistics, wavenet

Published On: January 04, 2017 at 04:25 PM

NIPS 2016 Review, Day 1

@tachyeonz : Ever the scientists, the two organizers justified their choices on the program committee by maintaining that they want to grow the number of submissions while decreasing bias and variance. They treated the problem with unknown ground truth of what the “best papers” were…

Tags: artificial intelligence, cnn, conference, convolution neural net, deep learning, gans, lstm, machine learning, meta learning, meta models, nips2016, phased lstm, recurrent neuralnet, reinforcement learning, rnn, time series data, unsupervised learning, yann le cun

Published On: December 25, 2016 at 07:26 PM

Understanding LSTM and its diagrams

@tachyeonz : I’m not better at explaining LSTMs; I just want to write this down as a way to remember it myself. I think the blog post written by Christopher Olah is the best LSTM material you will find. Please visit the original link if you want to learn about LSTMs. (But I did create some nice diagrams.)
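For reference, the cell that those diagrams depict can be written out as a single update step. This is the generic textbook LSTM, not code from the post; the parameter containers W, U, b are my own naming.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step. W, U, b are dicts of per-gate parameters."""
    i = sigmoid(W["i"] @ x + U["i"] @ h_prev + b["i"])  # input gate
    f = sigmoid(W["f"] @ x + U["f"] @ h_prev + b["f"])  # forget gate
    o = sigmoid(W["o"] @ x + U["o"] @ h_prev + b["o"])  # output gate
    g = np.tanh(W["g"] @ x + U["g"] @ h_prev + b["g"])  # candidate values
    c = f * c_prev + i * g        # new cell state: keep some old, add some new
    h = o * np.tanh(c)            # new hidden state / output
    return h, c
```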

Tags: #ai, #deeplearning, #flowchart, #lstm, #machinelearning, #schematics

Published On: December 19, 2016 at 07:00 AM

Sequence Classification with LSTM Recurrent Neural Networks in Python with Keras

@tachyeonz : Sequence classification is a predictive modeling problem where you have some sequence of inputs over space or time and the task is to predict a category for the sequence.
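A minimal Keras model for that setup, an embedding layer feeding an LSTM and a sigmoid output for binary classification, might look like the sketch below; the vocabulary size, layer widths, and variable names are illustrative assumptions, not taken from the tutorial.

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

vocab_size = 5000   # assumed size of the token vocabulary

model = Sequential([
    Embedding(vocab_size, 32),        # map token ids to 32-d vectors
    LSTM(100),                        # summarise the whole sequence
    Dense(1, activation="sigmoid"),   # probability of the positive class
])
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])

# model.fit(X_train, y_train, epochs=3, batch_size=64)  # X_train: padded integer sequences
```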

Tags: #analytics, #artificialintelligence, #classification, #datascience, #lstm, #machinelearning, #python, #recurrentneuralnetworks, #rnn

Published On: December 05, 2016 at 06:35 AM
