Why did the Nobel prize in physics go to the "godfathers of AI"?



NOBEL SCIENCE prizes are awarded in three areas: physics, chemistry and physiology or medicine. But occasionally some noteworthy discovery comes along that does not really fit into any of them. In 1973, for example, the prize for physiology or medicine went to three ethologists, students of animal behaviour, whose work sat only awkwardly in that category.


Similar flexibility, though in an area with far more profound consequences than ethology, has been demonstrated with regard to this year's physics prize.


Showing a sense of timeliness not always apparent in its deliberations, the Royal Swedish Academy of Sciences has stretched the definition of physics to include computer science, and given its imprimatur to two of the progenitors of the artificial-intelligence (AI) revolution that is currently sweeping all before it.


John Hopfield of Princeton University and Geoffrey Hinton of the University of Toronto both did their crucial work in the early 1980s, at a time when computer hardware was unable to take full advantage of it.


Dr Hopfield was responsible for what has become known as the Hopfield network — a type of artificial neural network that behaves like a physical structure called a spin glass, which gave the academy a tenuous reason to call the field "physics". Dr Hinton's contribution was to use an algorithm known as backpropagation to train neural networks.
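The spin-glass connection is easiest to see in the network's energy function. The equations below are the standard textbook formulation rather than anything taken from the prize citation: the Hopfield energy has the same mathematical form as a spin-glass Hamiltonian, and updating one node at a time never increases it.

```latex
% Energy of a Hopfield network with binary node states s_i \in \{-1,+1\},
% symmetric weights w_{ij} = w_{ji} and no self-connections (w_{ii} = 0):
\[
  E(\mathbf{s}) = -\tfrac{1}{2} \sum_{i} \sum_{j} w_{ij}\, s_i\, s_j .
\]
% This has the same form as the Hamiltonian of a spin glass, in which
% magnetic spins \sigma_i interact through fixed couplings J_{ij}:
\[
  H = -\sum_{i<j} J_{ij}\, \sigma_i\, \sigma_j .
\]
% Updating one node at a time, s_i \leftarrow \mathrm{sign}(\sum_j w_{ij} s_j),
% never increases E, so the network relaxes into a nearby energy minimum,
% which is where the stored patterns sit.
```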


Artificial neural networks are computer programs based loosely on the way in which real, biological networks of nerve cells are believed to work. In particular, the strengths of the connections between "nodes" in such networks are plastic. Hopfield networks, in which each node is connected to every other except itself, are particularly good at learning to extract patterns from sparse or noisy data.
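That pattern-completion behaviour is easy to demonstrate. The toy sketch below uses plain NumPy; the helper names train_hopfield and recall, and the choice of a single 25-bit pattern, are invented for this illustration. It stores the pattern with the Hebbian rule and then recovers it from a copy in which roughly a quarter of the bits have been flipped.

```python
import numpy as np

# Toy Hopfield network: every node is connected to every other node but
# not to itself; patterns are stored with the Hebbian outer-product rule.

def train_hopfield(patterns):
    """Build a symmetric weight matrix from {-1, +1} pattern vectors."""
    n = patterns[0].size
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)        # Hebbian learning
    np.fill_diagonal(w, 0)         # no self-connections
    return w / len(patterns)

def recall(w, state, rng, sweeps=5):
    """Update nodes one at a time until the network settles on a memory."""
    state = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(state.size):
            state[i] = 1 if w[i] @ state >= 0 else -1
    return state

rng = np.random.default_rng(0)
pattern = rng.choice([-1, 1], size=25)                # the memory to store
w = train_hopfield([pattern])

noisy = pattern.copy()
noisy[rng.choice(25, size=6, replace=False)] *= -1    # corrupt about a quarter of it

restored = recall(w, noisy, rng)
print(int((restored == pattern).sum()), "of 25 bits recovered")
```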


Dr Hinton's algorithm turbocharged neural networks' learning ability by letting them work, in effect, in three dimensions. Hopfield networks and their ilk are, in essence, two-dimensional. Though they actually exist only as simulations in software, they can be thought of as physical layers of nodes.


Stack such layers on top of one another, though, and train them by tweaking the weights as signals move both backward and forward between the layers, and you have a much more sophisticated learning system.
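The sketch below is a minimal illustration of that idea in NumPy: a two-layer sigmoid network learns the XOR function by passing signals forward through the layers and errors backward. The layer sizes, learning rate and squared-error loss are arbitrary choices for this example, not details from the laureates' papers.

```python
import numpy as np

# Minimal backpropagation sketch: a two-layer sigmoid network learns XOR.

rng = np.random.default_rng(42)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros((1, 8))   # input  -> hidden
W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros((1, 1))   # hidden -> output
lr = 1.0
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for step in range(5000):
    # Forward pass: signals flow from the input layer towards the output.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: the output error is sent back through the layers,
    # and each weight is nudged in proportion to its share of the blame.
    d_out = (out - y) * out * (1 - out)
    d_hid = (d_out @ W2.T) * h * (1 - h)

    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_hid
    b1 -= lr * d_hid.sum(axis=0, keepdims=True)

print(np.round(out, 2).ravel())   # should end up close to [0, 1, 1, 0]
```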


Dr Hinton also, for good measure, tweaked Dr Hopfield's networks using a branch of maths called statistical mechanics to create what are known as Boltzmann machines. Boltzmann machines can be used to create systems that learn in an unsupervised manner, spotting patterns in data without having to be explicitly taught.
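Full Boltzmann machines are slow to train, so the sketch below uses the restricted variant (an RBM) together with one step of contrastive divergence, a later simplification also due to Dr Hinton; the data, parameter values and variable names are invented for this illustration. Given only unlabelled, noisy copies of two binary patterns, it picks up their structure without ever being told what they are.

```python
import numpy as np

# Restricted Boltzmann machine (a tractable cousin of the full Boltzmann
# machine) trained with one step of contrastive divergence.

rng = np.random.default_rng(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Unlabelled data: noisy copies of two 6-bit prototypes.
prototypes = np.array([[1, 1, 1, 0, 0, 0],
                       [0, 0, 0, 1, 1, 1]], dtype=float)
data = np.repeat(prototypes, 50, axis=0)
data = np.abs(data - (rng.random(data.shape) < 0.05))   # flip 5% of the bits

n_visible, n_hidden, lr = 6, 2, 0.1
W = rng.normal(0, 0.1, (n_visible, n_hidden))
a = np.zeros(n_visible)     # visible biases
b = np.zeros(n_hidden)      # hidden biases

for epoch in range(2000):
    v0 = data
    # Positive phase: hidden activity driven by the data.
    p_h0 = sigmoid(v0 @ W + b)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    # Negative phase: reconstruct the data, then look at the hiddens again.
    p_v1 = sigmoid(h0 @ W.T + a)
    p_h1 = sigmoid(p_v1 @ W + b)
    # Update: push data statistics up, model statistics down.
    W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / len(data)
    a += lr * (v0 - p_v1).mean(axis=0)
    b += lr * (p_h0 - p_h1).mean(axis=0)

# Reconstruct a corrupted version of the first prototype.
test = np.array([1.0, 1.0, 0.0, 0.0, 0.0, 0.0])
recon = sigmoid(sigmoid(test @ W + b) @ W.T + a)
print(np.round(recon, 2))   # should lean towards [1, 1, 1, 0, 0, 0]
```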


It is, then, the activities of these two researchers which have made machine learning really sing.AI models can now not only learn, but create.Such tools have thus gone from being able to perform highly specific tasks, such as recognising cancerous cells in pictures of tissue samples or streamlining particle-physics data, to anything from writing essays for lazy undergraduates to running robots.

