Anabel Forte Deltell
Professor of the Department of Statistics and Operations Research at the University of Valencia
Hinton's work on what is known as backpropagation, and its application to neural networks, was fundamental in giving deep learning its "depth".
In statistics, what we do is learn from data in order to estimate parameters that tell us how the variables are related, typically by minimizing the error made in the predictions. But when you add layers of depth to a neural network (which is what allows it to do such impressive things as understand language or generate an image), the direct relationship between the prediction error and the input data is lost. To solve this problem, the mechanism Hinton championed propagates the error backwards, from the layer where the result is predicted down to the input layer, so that the best parameter values can be set at every layer.
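The idea above can be sketched in a few lines of code. This is a minimal illustration, not Hinton's original formulation: a tiny two-layer network (sizes, data, and learning rate are all illustrative choices) where the prediction error is passed backwards through the layers via the chain rule so that every layer's parameters are adjusted.

```python
import numpy as np

# Toy setup (illustrative sizes): 2 inputs -> 3 hidden units -> 1 output.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 2))          # 4 training examples
y = (X[:, :1] + X[:, 1:]) * 0.5      # target the network should learn

W1 = rng.normal(size=(2, 3)) * 0.5   # parameters of the first layer
W2 = rng.normal(size=(3, 1)) * 0.5   # parameters of the second layer
lr = 0.1                             # learning rate (illustrative value)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for step in range(500):
    # Forward pass: input layer -> hidden layer -> predicted output.
    h = sigmoid(X @ W1)
    y_hat = h @ W2
    err = y_hat - y
    losses.append(float(np.mean(err ** 2)))

    # Backward pass: the prediction error is distributed backwards,
    # from the output layer toward the input layer (chain rule).
    dL_dyhat = 2 * err / len(X)
    grad_W2 = h.T @ dL_dyhat             # error reaching the last layer
    grad_h = dL_dyhat @ W2.T             # error passed back to the hidden layer
    grad_W1 = X.T @ (grad_h * h * (1 - h))

    # Each layer updates its parameters using its share of the error.
    W1 -= lr * grad_W1
    W2 -= lr * grad_W2
```

After training, the prediction error has shrunk: every layer, not just the last one, has found better parameter values, which is exactly what backpropagation makes possible.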
In short, without Hinton's work we would not have ChatGPT, AI-generated videos, or any of the other things that amaze us today.