When we talk about artificial intelligence, we often mean machine learning using artificial neural networks. This technology was originally inspired by the structure of the brain. In an artificial neural network, the brain’s neurons are represented by nodes that have different values. These nodes influence each other through connections that can be likened to synapses and which can be made stronger or weaker. The network is trained, for example by developing stronger connections between nodes with simultaneously high values. This year’s laureates have conducted important work with artificial neural networks from the 1980s onward.
John Hopfield invented a network that uses a method for saving and recreating patterns. We can imagine the nodes as pixels. The Hopfield network utilises physics that describes a material’s characteristics due to its atomic spin – a property that makes each atom a tiny magnet. The network as a whole is described in a manner equivalent to the energy in the spin system found in physics, and is trained by finding values for the connections between the nodes so that the saved images have low energy. When the Hopfield network is fed a distorted or incomplete image, it methodically works through the nodes and updates their values so the network’s energy falls. The network thus works stepwise to find the saved image that is most like the imperfect one it was fed with.
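To make this concrete, here is a minimal NumPy sketch of the idea described above, not the laureates' original code: a few +1/−1 patterns ("pixels") are stored with a Hebbian rule, and a distorted pattern is restored by node-by-node updates that never raise the energy E = −½ sᵀWs. The function names, pattern sizes, and the tie-breaking choice are illustrative assumptions.

```python
import numpy as np

def train_hopfield(patterns):
    """Store +1/-1 patterns with a Hebbian rule: strengthen links between co-active nodes."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)                 # no self-connections
    return W / len(patterns)

def energy(W, s):
    """Network energy E = -1/2 s^T W s; stored patterns sit in low-energy valleys."""
    return -0.5 * s @ W @ s

def recall(W, s, sweeps=5):
    """Update one node at a time so the energy never increases."""
    s = s.copy()
    for _ in range(sweeps):
        for i in range(len(s)):
            h = W[i] @ s
            if h != 0:                     # keep the current value on a tie
                s[i] = 1 if h > 0 else -1
    return s

# Store two tiny 'images' (flattened pixel patterns), then feed in a distorted copy.
patterns = np.array([[1, -1,  1, -1, 1, -1],
                     [1,  1, -1, -1, 1,  1]])
W = train_hopfield(patterns)
noisy = np.array([-1, -1, 1, -1, 1, -1])   # first pattern with one pixel flipped
restored = recall(W, noisy)
print(restored, energy(W, noisy), energy(W, restored))
```

In this toy run the distorted state starts at a higher energy than the stored pattern, and the stepwise updates pull it back into the nearest low-energy "memory", which is exactly the recall behaviour described in the text.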
Geoffrey Hinton used the Hopfield network as the foundation for a new network that uses a different method: the Boltzmann machine. This can learn to recognise characteristic elements in a given type of data. Hinton used tools from statistical physics, the science of systems built from many similar components. The machine is trained by feeding it examples that are very likely to arise when the machine is run. The Boltzmann machine can be used to classify images or create new examples of the type of pattern on which it was trained. Hinton has built upon this work, helping initiate the current explosive development of machine learning.
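The press release gives no equations, so the sketch below illustrates the same idea with a restricted Boltzmann machine trained by one step of contrastive divergence, a later simplification also due to Hinton rather than the original Boltzmann machine itself. All function names, layer sizes, and hyperparameters here are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample(p):
    """Draw binary (0/1) states with the given probabilities."""
    return (rng.random(p.shape) < p).astype(float)

def train_rbm(data, n_hidden=4, lr=0.1, epochs=500):
    """Fit a small restricted Boltzmann machine with one step of contrastive divergence (CD-1)."""
    n_visible = data.shape[1]
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    b_v, b_h = np.zeros(n_visible), np.zeros(n_hidden)
    for _ in range(epochs):
        v0 = data
        p_h0 = sigmoid(v0 @ W + b_h)       # hidden units respond to characteristic elements
        h0 = sample(p_h0)
        p_v1 = sigmoid(h0 @ W.T + b_v)     # one reconstruction of the visible layer
        v1 = sample(p_v1)
        p_h1 = sigmoid(v1 @ W + b_h)
        # nudge the parameters so samples from the model look more like the training data
        W   += lr * (v0.T @ p_h0 - v1.T @ p_h1) / len(data)
        b_v += lr * (v0 - v1).mean(axis=0)
        b_h += lr * (p_h0 - p_h1).mean(axis=0)
    return W, b_v, b_h

def generate(W, b_v, b_h, steps=100):
    """Create a new example by alternating sampling between the visible and hidden layers."""
    v = sample(np.full(W.shape[0], 0.5))
    for _ in range(steps):
        h = sample(sigmoid(v @ W + b_h))
        v = sample(sigmoid(h @ W.T + b_v))
    return v

# Train on a few binary patterns, then sample a new pattern of the same type.
data = np.array([[1, 1, 0, 0], [1, 1, 0, 1], [0, 0, 1, 1], [1, 0, 1, 1]], dtype=float)
W, b_v, b_h = train_rbm(data)
print(generate(W, b_v, b_h))
```

The training loop captures the principle stated in the text: the weights are adjusted so that the patterns the machine produces when it runs become statistically similar to the examples it was trained on, after which it can be used to generate new patterns of the same type.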
“The laureates’ work has already been of the greatest benefit. In physics we use artificial neural networks in a vast range of areas, such as developing new materials with specific properties,” says Ellen Moons, Chair of the Nobel Committee for Physics.
John J. Hopfield, born 1933 in Chicago, Illinois, USA. PhD 1958 from Cornell University, Ithaca, NY, USA. Professor at Princeton University, NJ, USA.
Geoffrey E. Hinton, born 1947 in London, UK. PhD 1978 from the University of Edinburgh, UK. Professor at the University of Toronto, Canada.