This section simply quotes some relevant viewpoints. First, Nature recently interviewed Hopfield, and a particularly interesting exchange runs as follows:
There was discussion that your prizewinning work was not really physics, but computer science. What do you think?
My definition of physics is that physics is not what you’re working on, but how you’re working on it. If you have the attitude of someone who comes from physics, it’s a physics problem. Having a father and mother who were both physicists warped my view of what physics was. Everything that, to you, was interesting in the world was because you understood the physics of putting such a thing together. I grew up with puzzles and I wanted to figure them out.
In 1981, I gave a talk at a meeting and Terry Sejnowski, who had been my research student in physics, was sitting next to Geoff Hinton. [Sejnowski now runs a computational neurobiology group at the Salk Institute in La Jolla, California.] It was clear that Geoff knew how to get a system of that sort — the mechanics that I do — to express computer science. They got talking and eventually wrote their first paper together. Terry recalled this one day, and it was the story of how things came from physics into computer science.
Full interview: https://www.nature.com/articles/d41586-024-03520-0
In addition, the editorial viewpoint of Zdeborova and Krzakala in JSTAT:
The scientific field at the intersection between machine learning and statistical physics is not new. In fact, these communities were always very interconnected and indeed, many concepts and algorithms of machine learning have roots in physics. The very words ‘Boltzmann machine’, ‘Free energy’, and ‘Gibbs sampling’ speak for themselves. Perhaps starting with the pioneering work of John Hopfield in 1982 on memory retrieval, many scientific papers were published at the intersections of the two fields, as testified by the pioneers of modern machine learning. Turing prize winner Yann LeCun says in his book ‘Quand la Machine Apprend’: ‘Ma vie professionnelle bascule réellement en 1985 lors d’un symposium aux Houches’. In his lecture on neural nets, his colleague Geoffrey Hinton recalls how active and influential physicists were in the early days of neural networks and mentions in particular the amazing contribution of ‘one really smart physicist, Elizabeth Gardner’, a name very familiar to those versed in spin glass theory. Isabelle Guyon recalls in ‘the story of the invention of support vector machines’ how two physicists working on the same topics inspired her creation: ‘Marc Mezard and Werner Krauth published a paper on an optimal margin algorithm called ‘minover’, which attracted my attention [...] it was not until I joined Bell Labs that I put things together’. In the wake of the deep learning revolution, just after the pioneer of statistical physics of complex systems Giorgio Parisi was awarded the Nobel Prize for the development of the replica method (and its use in machine learning, as mentioned by the Nobel committee), it was time to revive this connection once again in Les Houches. At the time of writing this introduction in spring 2024, the Abel Prize was awarded to Michel Talagrand, in part for his mathematical contribution to establishing results stemming from the replica method.
Finally, the viewpoint given by Haim Sompolinsky [Les Houches 2022]:
Goals of deep learning theory: what should it seek to explain?
1. Memory capacity: how much random data can a network store? (see the sketch after this list)
2. Expressivity: what classes of functions can a given architecture express or approximate?
3. Generalization, regularization, inductive biases and overparameterization.
4. Learning dynamics.
5. Representations: understanding latent or hidden representations that emerge during or after learning.
6. Relation to the brain.
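To make the first item above concrete, here is a minimal sketch of the memory-capacity question in the spirit of Hopfield's 1982 model, which the JSTAT editorial cites as a starting point. The use of numpy, the particular pattern count, and the corruption level of the cue are illustrative assumptions, not taken from any of the quoted sources; the classical result is that Hebbian storage retrieves random ±1 patterns reliably only up to roughly P ≈ 0.14 N.

```python
import numpy as np

# Illustrative sketch, not from the quoted sources: a Hopfield network storing
# P random +/-1 patterns in N spins with the Hebbian rule. The classical
# capacity result is that retrieval from a corrupted cue works up to P ~ 0.14 N.
rng = np.random.default_rng(0)
N, P = 200, 20                      # here P/N = 0.1, below the capacity limit
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian couplings J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, with no self-coupling.
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def retrieve(state, sweeps=20):
    """Zero-temperature asynchronous dynamics: align each spin with its local field."""
    state = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            state[i] = 1 if J[i] @ state >= 0 else -1
    return state

# Cue: pattern 0 with about 15% of its spins flipped.
cue = patterns[0] * np.where(rng.random(N) < 0.15, -1, 1)
fixed_point = retrieve(cue)
overlap = fixed_point @ patterns[0] / N   # overlap m = 1 means perfect retrieval
print(f"overlap with the stored pattern: {overlap:.3f}")
```

Pushing P past roughly 0.14 N in this sketch makes the retrieval overlap collapse, and locating that transition is exactly the kind of question the statistical-physics tools mentioned in the editorial (Gardner's capacity calculation, the replica method) were developed to answer.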