The annual Nobel Prize announcements have begun once again. When the Physics Prize was announced on October 8, 2024, it left many observers confused and shocked: it went to AI scientists John J. Hopfield of Princeton University and Geoffrey E. Hinton of the University of Toronto. Both have been well known in the field of artificial intelligence for years, with the prestige and credentials to merit a Nobel Prize. Their contributions since 2012, together with the GPU advances driven by Nvidia under CEO Jensen Huang, spurred the AI revolution that has dramatically reshaped the computer industry. Yet the Nobel Prizes have no category for AI or mathematics, and in most people's minds their work on artificial neural networks and machine learning seems unrelated to the traditional Nobel fields.
The official Nobel Prize website explained the reasoning behind the award: the principles of artificial neural networks rest on fundamental concepts and methods from physics, in particular the physics used to describe the properties of materials, such as how an atom's spin makes it behave like a tiny magnet. The structure of a neural network is described by an energy analogous to that of a spin system, and training amounts to finding the connection strengths between nodes. If the Nobel committee explains it this way, some may joke that, by the same logic, the Chemistry Prize could go to AlphaFold, the software that revolutionized protein structure prediction.
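To make the spin analogy concrete, here is a minimal, illustrative sketch of a Hopfield-style network in Python. It is not the laureates' actual code, only an assumed textbook formulation: neurons take values of +1 or -1 like spins, the network has an energy that falls as it settles, and "training" means setting the connection weights (here with a simple Hebbian rule).

```python
import numpy as np

# Minimal Hopfield-style associative memory (illustrative sketch only).
# Neurons take values +1/-1, analogous to atomic spins; the "energy"
# E = -1/2 * sum_ij w_ij * s_i * s_j decreases as the network settles.

def train(patterns):
    """Hebbian rule: connection weights are built from the stored patterns."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0)           # no self-connections
    return w / patterns.shape[0]

def energy(w, s):
    return -0.5 * s @ w @ s

def recall(w, s, steps=10):
    """Flip each 'spin' toward lower energy, sweep by sweep."""
    s = s.copy()
    for _ in range(steps):
        for i in range(len(s)):
            s[i] = 1 if w[i] @ s >= 0 else -1
    return s

# Store one pattern, then recover it from a corrupted version.
stored = np.array([[1, -1, 1, -1, 1, -1, 1, -1]])
w = train(stored)
noisy = stored[0].copy()
noisy[:2] *= -1                      # flip two "spins"
print(recall(w, noisy))              # settles back to the stored pattern
print(energy(w, stored[0]))          # the stored pattern sits at low energy
```

Recovering the stored pattern from a corrupted one is the network settling into a low-energy state, which is the sense in which training "finds the connection values between nodes."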
In fact, on October 9, the Chemistry Prize was awarded to David Baker of the University of Washington, along with Demis Hassabis and John Jumper of Google DeepMind, for their revolutionary contributions to protein design and structure prediction, sparking even more controversy. Professor Baker created entirely new kinds of proteins, while Hassabis and Jumper developed AlphaFold.
AI technology today relies heavily on training with vast amounts of data, and the quality, completeness, and orderliness of the input data directly affect the results: the integrity and regularity implicit in the data set the upper limit on what the predictions can achieve. Simply put, AI can be understood as a specialized way of searching and processing information. The rapid development of computer storage provides the data AI needs, while GPUs supply large-scale parallel computing. These foundational technologies set the stage for artificial neural networks.
When data is collected and refined from across the world, the output naturally will not be too far off. With the introduction and widespread use of ChatGPT, many people have experienced firsthand the comprehensiveness of these AI tools, which often give answers more complete than those most people could provide. Completeness here means thoroughness, considering every aspect of an issue. If every step in a process is carefully considered, both in the whole and in the details, success is almost guaranteed. In modern industrial terms, this aligns with quality control standards.
Current AI technology relies on brute-force computation, but it’s a structured brute force, not random punches. Another traditional method that also uses brute-force computation is the Monte Carlo method, which similarly applies statistical principles but generates massive random data and then seeks order within chaos.
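As a concrete illustration of that kind of statistical brute force, here is a small, assumed example (not drawn from the article itself): a Monte Carlo estimate of pi in Python, which throws random points at a square and lets the law of large numbers pull an orderly answer out of the noise.

```python
import random

# Monte Carlo estimate of pi: generate massive amounts of random data,
# then let statistics extract order from the chaos. Accuracy improves
# only slowly with more samples -- brute force, paid for in computation.

def estimate_pi(samples=1_000_000):
    inside = 0
    for _ in range(samples):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:     # point lands inside the quarter circle
            inside += 1
    return 4 * inside / samples      # area ratio approaches pi/4

print(estimate_pi())                 # roughly 3.14, closer as samples grow
```

The estimate gets better only as the number of random samples grows, which is why both Monte Carlo methods and modern AI ultimately run up against limits of computing power and energy.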
AI and Monte Carlo methods both depend on vast computational power and, ultimately, are limited by energy resources. Plug them in, and they come to life; cut the power, and they are rendered useless. At its core, energy sustains everything.
Human survival also consumes a lot of energy, and learning is a process of absorbing more and more information. Although people eat similar amounts of food, the ability to absorb and refine knowledge varies. Beyond the limits of the body, it’s the mind that serves as the filter and processing unit. How a person perceives and solves problems reflects the quality, completeness, and regularity of their thinking.
In traditional Chinese culture, a key principle is to emphasize quality, completeness, and regularity, with an insistence on the integration of multiple dimensions of time and space. Dong Zhongshu's Three Bonds and Five Constants, proposed in the early Western Han Dynasty, was based on this framework. However, due to Dong's incomplete thinking structure, his ideas led to increasingly noticeable systemic deviations over time. Historical disturbances of this nature, resulting from cyclical interactions between different time-space dimensions, produced ever more "garbage data," consuming human life and energy in later generations. This eventually caused the system of Chinese civilization to bloat and degrade, leading to its collapse under the violent impact of the foreign "garbage data" brought by communism.
Now is the time to gather particles from the ashes of the void, to reconstruct all things from scattered particles, and to restore traditional Chinese culture. Isn’t this the time to begin anew?
(First published by People News)