Geoffrey Hinton, Yann LeCun, and Yoshua Bengio, the “Godfathers of Deep Learning”, are revered pioneers who revolutionized artificial neural networks and shaped the course of AI. They received the 2018 Turing Award for their contributions to Deep Learning. In this article, we explore their groundbreaking contributions, collaborative efforts, and enduring impact, shedding light on the remarkable journeys that propelled deep learning to unprecedented heights!
Geoffrey Hinton
When it comes to Deep Learning, I think nobody symbolizes the field better than Geoffrey Hinton, the father of Deep Learning. He even helped popularize the term! Here are his biggest contributions to the field:
1984 - He invents the Boltzmann machine: Boltzmann machines: Constraint satisfaction networks that learn.
1985 - He proposes a new learning algorithm for Boltzmann machines: A learning algorithm for Boltzmann machines.
1986 - He is credited as one of the inventors of the Back-propagation algorithm: Learning representations by back-propagating errors.
1991 - He invents the Mixture of Experts: Adaptive mixtures of local experts.
2006 - He proposes an algorithm to train Deep Belief Nets. This is the article that led to the term "Deep Learning": A Fast Learning Algorithm for Deep Belief Nets.
2006 - He shows how to train deep Autoencoders with Neural Networks: Reducing the Dimensionality of Data with Neural Networks.
2008 - He co-invents t-SNE, a new technique for dimensionality reduction: Visualizing Data using t-SNE.
2009 - He presents an algorithm to train Deep Boltzmann machines: Deep Boltzmann Machines.
2009 - With his student Alex Krizhevsky, he introduces the CIFAR-10 dataset and trains Restricted Boltzmann Machines and Deep Belief Networks on it: Learning Multiple Layers of Features from Tiny Images.
2010 - He shows the improved performance of Restricted Boltzmann Machines with ReLU: Rectified Linear Units Improve Restricted Boltzmann Machines.
2011 - He shows how to build a generative text model with Recurrent Neural Networks: Generating Text with Recurrent Neural Networks.
2012 - He invents RMSprop in a course lecture (!!!) (see the sketch after this list): Neural Networks for Machine Learning - Lecture 6a Overview of mini-batch gradient descent.
2012 - He proposes dropping out feature detectors to improve generalization: Improving neural networks by preventing co-adaptation of feature detectors.
2012 - He covers mini-batch gradient descent in the same course lecture: Neural Networks for Machine Learning - Lecture 6a Overview of mini-batch gradient descent.
2012 - He applies Deep Learning to speech recognition: Deep neural networks for acoustic modeling in speech recognition: The shared views of four research groups.
2012 - He revolutionizes computer vision with AlexNet, the most cited paper of his career: ImageNet Classification with Deep Convolutional Neural Networks.
2014 - He proposes the Dropout technique to reduce overfitting (see the sketch after this list): Dropout: A Simple Way to Prevent Neural Networks from Overfitting.
2015 - He introduces Knowledge Distillation to compress large models into smaller ones (the soft-target loss is sketched after this list): Distilling the Knowledge in a Neural Network.
2016 - He co-invents the Layer Normalization technique, used in virtually every Transformer architecture (see the sketch after this list): Layer Normalization.
2017 - He proposes CapsNets (Capsule Networks), which aim to overcome some limitations of CNNs, particularly in modeling the hierarchical relationships between objects and their parts within an image: Dynamic Routing Between Capsules.
2022 - He presents the Forward-Forward algorithm, a new alternative to Back-propagation: The Forward-Forward Algorithm: Some Preliminary Investigations.
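Since the RMSprop entry above only names the trick, here is a minimal NumPy sketch of the update rule from that lecture: divide each gradient by a running average of its recent magnitude. The function name `rmsprop_update` and the toy objective are my own, not from the lecture.

```python
import numpy as np

def rmsprop_update(w, grad, cache, lr=0.001, decay=0.9, eps=1e-8):
    """One RMSprop step: keep a moving average of squared gradients
    and divide each gradient by the square root of that average."""
    cache = decay * cache + (1 - decay) * grad ** 2
    w = w - lr * grad / (np.sqrt(cache) + eps)
    return w, cache

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([5.0, -3.0])
cache = np.zeros_like(w)
for _ in range(500):
    w, cache = rmsprop_update(w, 2 * w, cache, lr=0.05)
print(w)  # ends up close to [0, 0]
```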
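The dropout idea from the 2012 and 2014 entries is simple enough to show in a few lines. This is a sketch of the commonly used "inverted" variant, which rescales the surviving activations during training; the original paper instead scales the weights at test time. The helper name `dropout` is mine.

```python
import numpy as np

def dropout(x, p_drop=0.5, training=True, rng=np.random.default_rng(0)):
    """Randomly zero each unit with probability p_drop during training,
    and rescale the survivors so the expected activation is unchanged."""
    if not training or p_drop == 0.0:
        return x                              # identity at inference time
    mask = rng.random(x.shape) >= p_drop      # one Bernoulli draw per unit
    return x * mask / (1.0 - p_drop)

h = np.ones((2, 6))                # a toy hidden-layer activation
print(dropout(h, p_drop=0.5))      # roughly half the entries are 0, the rest are 2.0
```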
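For the 2015 distillation entry, here is a rough sketch of the soft-target part of the loss as I read the paper: the teacher's logits are softened with a temperature T and the student is trained to match that distribution (the full recipe also adds the usual hard-label cross-entropy, with the soft term scaled by T^2). The function names here are hypothetical.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)   # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def soft_target_loss(student_logits, teacher_logits, T=2.0):
    """Cross-entropy between the teacher's and the student's softened distributions."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    return -np.mean(np.sum(p_teacher * np.log(p_student + 1e-12), axis=-1))

teacher = np.array([[10.0, 2.0, 1.0]])   # a confident, large teacher's logits
student = np.array([[3.0, 1.5, 1.0]])    # a small student's logits
print(soft_target_loss(student, teacher, T=2.0))
```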
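Finally, the Layer Normalization entry is easy to make concrete. A minimal sketch: each example is normalized across its feature dimension, then scaled and shifted by learned parameters gamma and beta (the names and shapes here are my own choices).

```python
import numpy as np

def layer_norm(x, gamma, beta, eps=1e-5):
    """Normalize each example over its last (feature) axis,
    then apply a learned scale (gamma) and shift (beta)."""
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.random.default_rng(0).normal(size=(2, 8))   # batch of 2 examples, 8 features
out = layer_norm(x, np.ones(8), np.zeros(8))
print(out.mean(axis=-1), out.std(axis=-1))          # ~0 and ~1 for each example
```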
Now, the guy has 327 publications, so I couldn't capture everything here, but I believe this encapsulates his most impactful work. Considering the trend, it seems a lot more is coming from him in the years ahead!
Yann LeCun
Nobody has done more for Convolutional Neural Networks than Yann LeCun! Here are his biggest contributions to the field: