From The National Institute of Standards and Technology: “Brain-Inspired Computing Can Help Us Create Faster and More Energy-Efficient Devices — If We Win the Race”
3.15.23
Advait Madhavan
One way race logic strives to save energy is by addressing the shortest-path problem. In one scenario, cars set off in multiple directions trying to find the fastest route. When the first car arrives, all the other cars stop, saving energy.
Credit: NIST.
“The human brain is an amazingly energy-efficient device. In computing terms, it can perform the equivalent of an exaflop — a billion-billion (1 followed by 18 zeros) mathematical operations per second — with just 20 watts of power.
In comparison, one of the most powerful supercomputers in the world, the Oak Ridge Frontier, has recently demonstrated exaflop computing. But it needs a million times more power — 20 megawatts — to pull off this feat.
My colleagues and I are looking to the brain as a guide in developing a powerful yet energy-efficient computer circuit design. You see, energy efficiency has emerged as the predominant factor keeping us from creating even more powerful computer chips. While ever-smaller electronic components have exponentially increased the computing power of our devices, those gains are slowing down.
Interestingly, our view of how the brain works has been a source of constant inspiration to the computing world. To understand how we arrived at our approach, we need to take a short tour of computing history.
How a 19th Century Mathematician Launched the Computing Revolution
Mathematician George Boole’s impact on the modern age is incalculable. His 1847 invention, now known as Boolean algebra, assigns 1 and 0 values to logical propositions (true or false, respectively). Boolean algebra describes a way to perform precise mathematical calculations with them.
Boole’s inspiration came from how he understood the brain to work. He imagined the laws of thought as logical propositions that could take on true (1) or false (0) values, and he expressed these laws mathematically so that calculations with them could be carried out in a precise, systematic fashion. Many years later, through the research of information scientist Claude Shannon, Boolean algebra was implemented with electrical switches that we now know as transistors, the building blocks of today’s computers.
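To see the idea in modern terms, here is a minimal sketch, in Python and purely illustrative, of Boolean logic expressed as arithmetic on 0s and 1s:

```python
# Boole's insight: logical propositions become arithmetic on 0 (false) and 1 (true).
# Illustrative sketch only; the function names are our own.

def AND(x, y):
    return x * y          # true only if both inputs are true

def OR(x, y):
    return x + y - x * y  # true if at least one input is true

def NOT(x):
    return 1 - x          # flips true to false and vice versa

# "It is raining AND I have an umbrella", evaluated for every combination:
for raining in (0, 1):
    for umbrella in (0, 1):
        print(raining, umbrella, "->", AND(raining, umbrella))
```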
Today, tiny, nanometer-sized transistors operating as switches inside microchips are the most manufactured device ever. More than 13 sextillion (one followed by 21 zeros) devices have been fabricated as of 2018. Millions of such transistors are arranged into centralized, flexible architectures for processing data.
These micro-scale processing machines, or microprocessors, are used in nearly anything you use that has a computer — a cellphone, a washing machine, a car or a smartwatch. Their versatility, scalability and robustness have been responsible for their “Swiss Army knife” reputation. They are able to do many things that humans are not well suited to do, such as calculating trajectories of rockets or rapidly crunching numbers in large financial spreadsheets.
Since the early days of computing, transistors have become a thousand times smaller, a hundred thousand times faster, and a billion times less energy-hungry. But these incredible improvements are not enough anymore. The amount of data produced by human activity has increased exponentially. The centralized Swiss Army-knife approach cannot keep up with the data deluge of the modern age.
On the other hand, our biological evolution over billions of years solved the problem of handling lots of data by using lots of processing elements.
While neurons, or nerve cells, were discovered in the late 1800s, their impact on computing would occur only 100 years later. Scientists studying the computational behavior of brains began building decentralized processing models that relied on large amounts of data. This allowed computer engineers to revisit the organization of the brain as a guiding light for rearranging the billions of transistors at their disposal.
The Brain as a Sea of Low-Precision Neurons
The neuron doctrine, as it is known today, envisions the brain as made up of a vast sea of interconnected nerve cells — known as neurons — that communicate with each other through electrical and chemical interactions. They communicate across tiny gaps called synapses. This view, popularized by early neuroscientists, is very different from Boole’s more abstract logical view of brain function.
But what are they doing computationally? Enter Walter Pitts and Warren McCulloch.
In 1943, they proposed the first mathematical model of a neuron. Their model showed that nerve cells in the brain have enormous computational power. It described how a nerve cell accumulates electrical activity from its neighboring nerve cells based on their “importance” and outputs electrical activity based on its aggregated input. This electrical activity, in the form of spikes, enables the brain to do everything from transmitting information to storing memory to responding to visual stimuli, such as a family picture or a beautiful sunset. As neurons can have thousands of neighbors, such an approach was well suited to dealing with applications with a lot of input data.
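To make this concrete, here is a minimal sketch, in Python, of a McCulloch-Pitts-style neuron; the weights and threshold are made-up values for illustration, not taken from their paper.

```python
# A minimal McCulloch-Pitts-style neuron: weighted inputs, a threshold, a binary output.
# The weights and threshold below are made-up values for illustration.

def neuron(inputs, weights, threshold):
    """Return 1 (spike) if the weighted sum of inputs reaches the threshold, else 0."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Three neighboring neurons, with the second one twice as "important" as the others.
spikes  = [1, 1, 0]
weights = [0.5, 1.0, 0.5]
print(neuron(spikes, weights, threshold=1.0))  # -> 1, the neuron fires
```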
The human brain is an amazingly energy-efficient device. NIST researchers are using the brain as inspiration to develop more energy-efficient computer circuit designs. Credit: MattLphotography/Shutterstock.
Today, 80 years later, the McCulloch and Pitts model is widely regarded as the parent of modern neural network models and the basis of the recent explosion in artificial intelligence. Especially useful in situations where the precision of data is limited, modern AI and machine learning algorithms have performed miraculous feats in a variety of areas such as search, analytics, forecasting, optimization, gaming and natural language processing. They perform with human-level accuracy on image and speech recognition tasks, predict weather, outclass chess grandmasters, and as shown recently with ChatGPT, parse and respond to natural language.
Historically, the power of our computing systems was rooted in being able to do very precise calculations. Small errors in the initial conditions of rocket trajectories can lead to huge errors later in flight. Though many applications still have such requirements, the computational power of modern deep-learning networks arises from their large size and interconnectivity.
A single neuron in a modern network can have up to a couple of thousand other neurons connected to it. Though each neuron may not be very precise, its behavior is determined by the aggregated information of many of its neighbors. When the network is trained on an input dataset, the interconnection strengths between each pair of neurons are adjusted over time so that the overall network makes correct decisions. Essentially, the neurons are all working together as a team. The network as a whole can make up for the reduced precision in each of its atomic elements. Quantity has a quality all its own.
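As a toy illustration of that teamwork (our own example, not a real network), imagine that each element can only report a crude one-bit guess about some quantity; aggregate enough of them and the answer becomes precise:

```python
import random

# Toy illustration of low-precision elements aggregating into a precise answer.
# The true signal is 0.7; each "neuron" reports only a 1-bit estimate of it.
random.seed(0)
true_value = 0.7

def noisy_low_precision_estimate():
    # Each element outputs 1 with probability equal to the true value.
    return 1 if random.random() < true_value else 0

for n in (1, 10, 100, 10_000):
    aggregate = sum(noisy_low_precision_estimate() for _ in range(n)) / n
    print(f"{n:>6} elements -> aggregate estimate {aggregate:.3f}")
# With more elements, the aggregate converges toward 0.7 despite 1-bit precision each.
```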
To make such models more practical, the underlying hardware must reflect the network. Running large, low-precision networks on a small number of high-precision computing elements ends up being tremendously inefficient.
Today there is a race to build computing systems that look like regular grids of low-precision processing elements, each filled with neural network functionality. From smartphones to data centers, low-precision AI chips are becoming more and more common.
As the application space for AI and machine learning algorithms continues to grow, this trend is only going to increase. Soon conventional architectures will be incapable of keeping pace with the demands of future data processing.
Even though modern AI hardware systems can perform tremendous feats of cognitive intelligence, such as beating the best human player of Go, a complex strategy game, they take tens of thousands of watts of power to run. The human Go grandmaster’s brain, on the other hand, consumes only 20 watts. We must revisit the brain to understand its tremendous energy efficiency and use that understanding to build computing systems inspired by it.
One clue comes from recent neuroscience research in which the timing of electrical spikes in the brain has been found to be important. We are beginning to believe that this timing may be the key to making computers more energy efficient.
The Brain as a Time Computer
In the early 1990s, French neuroscientists performed a series of experiments to test the speed of the human visual system. They were surprised to find that the visual system was much faster than they had previously thought, responding to a visual stimulus in as little as 100 milliseconds. This went against the prevailing notion of how spikes encode information.
The conventional picture suggests that a neuron must receive a long train of spikes from its neighbors, aggregate all of them, and respond. But the time it would take to aggregate a long spike train and respond would be much longer than the response times observed in the experiments. This meant that some neurons were not aggregating long spike trains from their neighbors; instead, they were working with just a couple of spikes when producing an output.
This radically changes the way we think information is encoded. If the brain is using only a few spikes to make decisions, then it must be making decisions based on the timing differences between spikes. A similar process occurs in the auditory systems of humans and other animals. For example, researchers have verified that barn owls use the difference in the arrival times of a sound at each ear to locate their prey.
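As an illustration of how timing alone can carry information, here is a small back-of-the-envelope calculation in Python (with made-up numbers, not data from the owl studies) that recovers the direction of a sound from the difference in its arrival time at two ears:

```python
import math

# Illustrative interaural-time-difference calculation with made-up numbers.
SPEED_OF_SOUND = 343.0   # meters per second, in air
EAR_SEPARATION = 0.05    # meters; a rough, assumed head width

def angle_from_time_difference(delta_t):
    """Estimate the azimuth (radians) of a sound source from the arrival-time difference."""
    # Far-field approximation: delta_t = EAR_SEPARATION * sin(angle) / SPEED_OF_SOUND.
    return math.asin(delta_t * SPEED_OF_SOUND / EAR_SEPARATION)

# A 100-microsecond difference between the two ears:
print(math.degrees(angle_from_time_difference(100e-6)))  # roughly 43 degrees off-center
```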
These observations suggest that we might make more efficient computers by mimicking this aspect of the brain and using the timing of signals to represent information.
Inspired by this idea, my NIST colleagues and I are aiming to develop a new type of computer circuit that uses something we call “race logic” to solve problems. In race logic, signals race against each other, and the timing between them matters. The winner of the race tells us something about the solution of the problem.
To understand race logic, let’s first go back briefly to conventional computing. Conventional digital computers solve problems by sending Boolean bits of information — 0s and 1s — on wires through a circuit. During circuit operation, bits regularly flip their values, from 0 to 1 and vice versa. Each bit flip consumes energy, and circuits with lots of bit flipping are said to have high activity and consume a lot of energy. Reducing energy consumption therefore means reducing activity, which means performing a given task with fewer bit flips.
Race logic reduces activity by encoding information in the timing of those bit flips on a wire. This approach allows a single bit flip on a wire to encode values larger than 0 or 1, making it an efficient encoding.
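A minimal sketch of this temporal encoding, in Python and purely illustrative, treats a value as the moment a wire flips from 0 to 1:

```python
# Sketch of temporal encoding: a value is represented not by a pattern of bits
# but by *when* a single 0-to-1 transition (a rising edge) happens on a wire.
# Time is in arbitrary clock ticks; this is an illustration, not real hardware.

def temporal_encode(value, total_ticks=10):
    """Return the waveform of one wire: 0s until `value` ticks have passed, then 1s."""
    return [0 if t < value else 1 for t in range(total_ticks)]

for v in (2, 5, 7):
    wave = temporal_encode(v)
    print(v, wave, "-> one bit flip, at tick", wave.index(1))
```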
The circuit could be configured to look like a map between your home and workplace and solve problems such as finding the most efficient route. Electrical signals travel through various pathways in the circuit. The first one to reach the end of the circuit wins the race, revealing the most efficient route in the process. Since only a single bit flip passes through many elements of the circuit, it has low activity and high efficiency.
An additional advantage of race logic is that signals that lose the race by moving through slower routes are stopped, further saving energy. Imagine a marathon in which, instead of all running the same course, each runner tries to find the most efficient route to the finish line. Once the winner crosses that finish line, all the other runners stop, saving their own energy. If you apply this to a computer, lots of energy is saved over time.
One important application that race logic is particularly good at solving is the shortest-path problem in networks, such as finding the quickest route from one place to another or determining the lowest number of connections required to link two people on social media. It also forms the basis for complicated network analytics that answer more complex questions, such as identifying the highest-traffic nodes in a network, planning paths through the streets of a busy city, tracking the spread of a disease through a population, or finding the fastest way to route information on the internet.
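As a software analogy of the race (an illustrative sketch, not the actual circuit my colleagues and I are building), signals can be modeled as spreading through a graph of delays; the first one to reach the destination reveals the shortest travel time, and everything else can stop:

```python
import heapq

# Software analogy of a race-logic shortest-path search: signals spread out from the
# start node, each edge adds a delay, and the first signal to reach the goal "wins".
# The map and delays below are made up for illustration.

city = {
    "home":   [("cafe", 2), ("park", 4)],
    "cafe":   [("office", 5)],
    "park":   [("office", 1)],
    "office": [],
}

def race(graph, start, goal):
    """Return the earliest arrival time at `goal`, i.e. the shortest total delay."""
    frontier = [(0, start)]            # (arrival_time, node) pairs, earliest first
    arrived = set()
    while frontier:
        time, node = heapq.heappop(frontier)
        if node == goal:
            return time                # the winning signal; every other path can stop
        if node in arrived:
            continue
        arrived.add(node)
        for neighbor, delay in graph[node]:
            heapq.heappush(frontier, (time + delay, neighbor))
    return None

print(race(city, "home", "office"))    # -> 5 (home -> park -> office)
```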
How did I think to connect brain science to a new type of computer circuit?
The concept of race logic was originally conceived and first practically demonstrated during my Ph.D. thesis work at the University of California-Santa Barbara, guided by the electrical engineering expertise of Professor Dmitri Strukov and the computer science expertise of Professor Tim Sherwood. I am currently exploring mathematical techniques and practical technologies to make this concept even more efficient.
Using Race Logic to Make the Next Generation of Energy-Efficient Computers
Next-generation computers are going to look very different from the computers of yesterday. As the quantity and nature of our data gathering changes, the demands from our computing systems must change as well. Hardware that powers tomorrow’s computing applications must keep energy impacts minimal and be good for the planet.
By being in touch with the latest developments in brain science, next-generation computers can benefit from the recently uncovered secrets of biology and meet the ever-increasing demand for energy-efficient computing hardware.
Our team at NIST is running a race of our own — to help computing power reach its full potential and protect our planet at the same time.”
See the full article here.
Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.
Please help promote STEM in your local schools.
The National Institute of Standards and Technology’s Mission, Vision, Core Competencies, and Core Values
Mission
To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.
NIST’s vision
NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.
NIST’s core competencies
Measurement science
Rigorous traceability
Development and use of standards
NIST’s core values
NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.
Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
Integrity: We are ethical, honest, independent, and provide an objective perspective.
Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.
Background
The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.
In January 1790, President George Washington, in his first annual message to Congress stated that, “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838, that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.
From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.
Bureau of Standards
In 1901, in response to a bill proposed by Congressman James H. Southard (R-Ohio), the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)
President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington DC (US) and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures the Bureau developed instruments for electrical units and for measurement of light. In 1905 a meeting was called that would be the first National Conference on Weights and Measures.
Initially conceived as purely a metrology agency, the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use, but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production, even operating its own facility to produce optical glass when European supplies were cut off. Between the wars, Harry Diamond of the Bureau developed a blind-approach radio aircraft landing system. During World War II, military research and development was carried out, including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and, shortly afterwards, the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.
In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC: the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time, the Standards Western Automatic Computer was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954.
Due to a changing mission, the “National Bureau of Standards” became the “National Institute of Standards and Technology” in 1988.
Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.
Organization
NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:
Communications Technology Laboratory (CTL)
Engineering Laboratory (EL)
Information Technology Laboratory (ITL)
Center for Neutron Research (NCNR)
Material Measurement Laboratory (MML)
Physical Measurement Laboratory (PML)
Extramural programs include:
Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.
NIST’s Boulder laboratories are best known for NIST-F1, a cesium fountain atomic clock.
NIST‑F1 serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium—which defines the second—NIST broadcasts time signals via longwave radio station WWVB near Fort Collins in Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha in Hawai’i, respectively.
NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR).
The NCNR provides scientists access to a variety of neutron scattering instruments which they use in many research fields (materials science; fuel cells; biotechnology etc.).
The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation that has been in continuous operation since 1961.
SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.
The Center for Nanoscale Science and Technology performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility.
This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).
Committees
NIST has seven standing committees:
Technical Guidelines Development Committee (TGDC)
Advisory Committee on Earthquake Hazards Reduction (ACEHR)
National Construction Safety Team Advisory Committee (NCST Advisory Committee)
Information Security and Privacy Advisory Board (ISPAB)
Visiting Committee on Advanced Technology (VCAT)
Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
Manufacturing Extension Partnership National Advisory Board (MEPNAB)
Measurements and standards
As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content, used as calibration standards for measuring equipment and procedures, quality control benchmarks for industrial processes, and experimental control samples.
Handbook 44
NIST publishes the Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through cooperation of the Committee on Specifications and Tolerances of the NCWM and the Weights and Measures Division (WMD) of the NIST. The purpose of the book is a partial fulfillment of the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.
NIST has been publishing various forms of what is now the Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the concept of the primary use of the SI (metric) measurements recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.