Transfer Learning – Machine Learning’s Next Frontier

@tachyeonz : In recent years, we have become increasingly good at training deep neural networks to learn very accurate mappings from inputs to outputs, whether those are images, sentences, or label predictions, from large amounts of labeled data.

Read More

Connect On:
Twitter: @tachyeonz

These Were The Best Machine Learning Breakthroughs Of 2016

@tachyeonz : The question "What were the main advances in machine learning/artificial intelligence in 2016?" originally appeared on Quora, the knowledge-sharing network where compelling questions are answered by people with unique insights.

More

Tags : 2016, breakthroughs, cntk, deep learning, dntk, gans, generative adversarial, innovations, lstm, machine learning, mxnet, neural networks, nips2016, probabilistic models, probabilistic programming, statistics, wavenet, z

Published On: January 04, 2017 at 04:25 PM

Connect On:
Medium: http://www.medium.com/iiot
Twitter: @tachyeonz

NIPS 2016 Review, Day 3

@tachyeonz : There were two invited speakers to kick off the day. The first talk was on the brain and AI from IBM Watson's Irina Rish, and the second was Stanford's Susan Holmes's take on the reproducibility of research and on modeling bacteria in the body.

More

Tags : artificial intelligence, autoencoder, brain, deep learning, forward prediction nets, infogan, m, machine learning, memory networks, neural networks, neuroscience, nips2016, recurrent entity networks, rnn, wavenet

Published On: December 26, 2016 at 05:13 PM

Connect On:
Facebook: /tachyeonz
Twitter: @tachyeonz

NIPS 2016 Review, Day 2

@tachyeonz : Why good morning again, fellow machine learners. It’s another day at NIPS, and what a grueling experience. The sessions ran from 9am to 9pm last night, and I was there for most of it! (Check out my NIPS 2016 Review, Day 1 for the low-down on yesterday’s action.) Ok, let’s get crackin’.

More

Tags : gans, generative adversarial, gradient descent, kyle cranmer, m, models, nips2016, pgm, probabilistic graphical, sgd, stochastic

Published On: December 26, 2016 at 05:13 PM

NIPS 2016 Review, Day 1

@tachyeonz : Ever the scientists, the two organizers justified their choices on the program committee by maintaining that they want to grow the number of submissions while decreasing bias and variance. They treated the problem as one with unknown ground truth of what the "best papers" were…

More

Tags : artificial intelligence, cnn, conference, convolution neural net, deep learning, gans, lstm, m, machine learning, meta learning, meta models, nips2016, phased lstm, recurrent neuralnet, reinforcement learning, rnn, time series data, unsupervised learning, yann le cun

Published On: December 25, 2016 at 07:26 PM

Nuts and Bolts of Building Deep Learning Applications: Ng @ NIPS2016

@tachyeonz : In addition to these four accuracies, you might want to report the human-level accuracy, for a total of 5 quantities to report. The difference between human-level and training set performance is the Bias. The difference between the training set and the training-dev set is the Variance.
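The gap arithmetic above can be sketched in a few lines. The error rates below are hypothetical placeholders for illustration, not figures from the talk:

```python
# Hypothetical error rates on each split (illustrative numbers only).
human_level_error  = 0.01  # proxy for the best achievable (Bayes) error
training_error     = 0.05
training_dev_error = 0.09  # held-out data drawn from the training distribution
dev_error          = 0.10

# Gap between human-level and training performance: the (avoidable) bias.
avoidable_bias = training_error - human_level_error

# Gap between training and training-dev performance: the variance.
variance = training_dev_error - training_error

# Gap between training-dev and dev performance: data mismatch.
data_mismatch = dev_error - training_dev_error

print(f"bias={avoidable_bias:.2f} variance={variance:.2f} mismatch={data_mismatch:.2f}")
```

A large bias gap suggests fitting the training set harder (bigger model, longer training); a large variance gap suggests regularization or more data.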

Click here to read more

Tags : #andrewng, #deviation, #error, #machinelearning, #nips2016, #variance, m

Published On: December 19, 2016 at 11:03 PM
