## From The National Institute of Standards and Technology: “Brain-Inspired Computing Can Help Us Create Faster and More Energy-Efficient Devices — If We Win the Race”

From The National Institute of Standards and Technology

3.15.23

One way race logic strives to save energy is by addressing the shortest-path problem. In one scenario, cars set off in multiple directions trying to find the fastest route. When the first car arrives, all the other cars stop, saving energy.
Credit: NIST.

“The human brain is an amazingly energy-efficient device. In computing terms, it can perform the equivalent of an exaflop — a billion-billion (1 followed by 18 zeros) mathematical operations per second — with just 20 watts of power.

In comparison, one of the most powerful supercomputers in the world, the Oak Ridge Frontier, has recently demonstrated exaflop computing. But it needs a million times more power — 20 megawatts — to pull off this feat.

My colleagues and I are looking to the brain as a guide in developing a powerful yet energy-efficient computer circuit design. You see, energy efficiency has emerged as the predominant factor keeping us from creating even more powerful computer chips. While ever-smaller electronic components have exponentially increased the computing power of our devices, those gains are slowing down.

Interestingly, our view of how the brain works has been a source of constant inspiration to the computing world. To understand how we arrived at our approach, we need to take a short tour of computing history.

How a 19th Century Mathematician Launched the Computing Revolution

Mathematician George Boole’s impact on the modern age is incalculable. His 1847 invention, now known as Boolean algebra, assigns the values 1 and 0 to logical propositions (true and false, respectively) and describes a way to perform precise mathematical calculations with them.

Boole’s inspiration came from how he understood the brain to work. He imagined the laws of thought as being logical propositions that could take on true (1) or false (0) values. He expressed these laws mathematically, so they could perform arithmetic in a precise, systematic fashion. Many years later, with the research of information scientist Claude Shannon, Boolean algebra was used with electrical switches that we now know as transistors, the building blocks of today’s computers.
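Boole’s scheme is simple enough to sketch in a few lines of code: truth values become 1 and 0, and logical operations become arithmetic. A minimal illustration (the function names and the example proposition are my own, not Boole’s notation):

```python
# Boolean algebra: truth values as 1/0, logic as arithmetic.
def AND(a, b):
    return a * b          # product: 1 only when both inputs are 1

def OR(a, b):
    return a + b - a * b  # 1 when at least one input is 1

def NOT(a):
    return 1 - a          # complement

# Any logical proposition can now be evaluated systematically.
# Example: "(p AND q) OR (NOT p)" for every truth assignment.
for p in (0, 1):
    for q in (0, 1):
        print(p, q, OR(AND(p, q), NOT(p)))
```

This is exactly the bridge Shannon later exploited: the same arithmetic can be carried out by networks of electrical switches.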

Today, tiny, nanometer-sized transistors operating as switches inside microchips are the most manufactured device ever: more than 13 sextillion (a sextillion is 1 followed by 21 zeros) had been fabricated as of 2018. Millions of such transistors are arranged into centralized, flexible architectures for processing data.

These micro-scale processing machines, or microprocessors, are used in nearly anything you use that has a computer — a cellphone, a washing machine, a car or a smartwatch. Their versatility, scalability and robustness have been responsible for their “Swiss Army knife” reputation. They are able to do many things that humans are not well suited to do, such as calculating trajectories of rockets or rapidly crunching numbers in large financial spreadsheets.

In recent years, transistors have gotten a thousand times smaller and a hundred thousand times faster, and they now use a billion times less energy. But these incredible improvements over the early days of computing are no longer enough. The amount of data produced by human activity has increased exponentially, and the centralized Swiss Army-knife approach cannot keep up with the data deluge of the modern age.

On the other hand, our biological evolution over billions of years solved the problem of handling lots of data by using lots of processing elements.

While neurons, or nerve cells, were discovered in the late 1800s, their impact on computing would occur only 100 years later. Scientists studying the computational behavior of brains began building decentralized processing models that relied on large amounts of data. This allowed computer engineers to revisit the organization of the brain as a guiding light for rearranging the billions of transistors at their disposal.

The Brain as a Sea of Low-Precision Neurons

The neuron doctrine, as it is known today, envisions the brain as made up of a vast sea of interconnected nerve cells — known as neurons — that communicate with each other through electrical and chemical interactions. They communicate across tiny junctions called synapses. This view, popularized by early neuroscientists, is very different from Boole’s more abstract, logical view of brain function.

But what are they doing computationally? Enter Walter Pitts and Warren McCulloch.

In 1943, they proposed the first mathematical model of a neuron. Their model showed that nerve cells in the brain have enormous computational power. It described how a nerve cell accumulates electrical activity from its neighboring nerve cells based on their “importance” and outputs electrical activity based on its aggregated input. This electrical activity, in the form of spikes, enables the brain to do everything from transmitting information to storing memory to responding to visual stimuli, such as a family picture or a beautiful sunset. As neurons can have thousands of neighbors, such an approach was well suited to dealing with applications with a lot of input data.
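The 1943 model is simple enough to state in a few lines. The sketch below is my own minimal rendering of the idea, not the authors’ original notation: a unit weighs the activity of its neighbors by their “importance” and fires only when the aggregate crosses a threshold.

```python
def neuron(inputs, weights, threshold):
    """McCulloch-Pitts-style unit: aggregate weighted input activity
    from neighboring cells; emit a spike (1) if the total reaches the
    threshold, otherwise stay silent (0)."""
    activity = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activity >= threshold else 0

# Even simple logic emerges from thresholding:
# an AND gate (fires only when both neighbors spike) ...
print(neuron([1, 1], [1, 1], threshold=2))  # 1
print(neuron([1, 0], [1, 1], threshold=2))  # 0
# ... and an OR gate (fires when either neighbor spikes).
print(neuron([0, 1], [1, 1], threshold=1))  # 1
```

Because the input list can be arbitrarily long, the same unit naturally scales to the thousands of neighbors a biological neuron can have.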

The human brain is an amazingly energy-efficient device. NIST researchers are using the brain as inspiration to develop more energy-efficient computer circuit designs. Credit: MattLphotography/Shutterstock.

Today, 80 years later, the McCulloch and Pitts model is widely regarded as the parent of modern neural network models and the basis of the recent explosion in artificial intelligence. Especially useful in situations where the precision of data is limited, modern AI and machine learning algorithms have performed miraculous feats in a variety of areas such as search, analytics, forecasting, optimization, gaming and natural language processing. They perform with human-level accuracy on image and speech recognition tasks, predict weather, outclass chess grandmasters, and as shown recently with ChatGPT, parse and respond to natural language.

Historically, the power of our computing systems was rooted in their ability to do very precise calculations. Small errors in the initial conditions of a rocket trajectory can lead to huge errors later in flight. Though many applications still have such requirements, the computational power of modern deep-learning networks arises from their large size and interconnectivity.

A single neuron in a modern network can have up to a couple of thousand other neurons connected to it. Though each neuron may not be very precise, its behavior is determined by the aggregated information of many of its neighbors. When trained on an input dataset, the interconnection strengths between each pair of neurons are adjusted over time so that the overall network makes correct decisions. Essentially, the neurons are all working together as a team, and the network as a whole can make up for the reduced precision of each of its atomic elements. Quantity has a quality all its own.
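The “team of imprecise units” idea can be illustrated numerically: a single noisy unit gives a poor estimate, but averaging many of them recovers precision. This is a toy statistical simulation of the principle, not the article’s actual network model:

```python
import random

random.seed(0)  # deterministic toy example

TRUE_VALUE = 0.7

def noisy_unit():
    # Each low-precision "neuron" reports the value with large random error.
    return TRUE_VALUE + random.uniform(-0.5, 0.5)

one = noisy_unit()                                   # a single unit
many = sum(noisy_unit() for _ in range(10000)) / 10000  # the ensemble

print(round(abs(one - TRUE_VALUE), 3))   # single unit: error can be large
print(round(abs(many - TRUE_VALUE), 4))  # ensemble average: tiny error
```

The error of the average shrinks roughly with the square root of the number of units, which is why a large network tolerates sloppy individual elements.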

To make such models more practical, the underlying hardware must reflect the network. Running large, low-precision networks on a small number of high-precision computing elements ends up being tremendously inefficient.

Today there is a race to build computing systems that look like regular grids of low-precision processing elements, each filled with neural network functionality. From smartphones to data centers, low-precision AI chips are becoming more and more common.

As the application space for AI and machine learning algorithms continues to grow, this trend is only going to increase. Soon conventional architectures will be incapable of keeping pace with the demands of future data processing.

Even though modern AI hardware systems can perform tremendous feats of cognitive intelligence, such as beating the best human player of Go, a complex strategy game, such systems take tens of thousands of watts of power to run. On the other hand, the human Go grandmaster’s brain is only consuming 20 watts of power. We must revisit the brain to understand its tremendous energy efficiency and use that understanding to build computing systems inspired by it.

One clue comes from recent neuroscience research where the timing of energy spikes in the brain is found to be important. We are beginning to believe that this timing may be the key to making computers more energy efficient.

The Brain as a Time Computer

In the early 1990s, French neuroscientists performed a series of experiments to test the speed of the human visual system. They were surprised to find that the visual system was much faster than they had previously thought, responding to a visual stimulus in as little as 100 milliseconds. This went against the prevailing notion of how spikes encode information.

The conventional picture suggests that a neuron must receive a long train of spikes from its neighbors, aggregate all of them, and respond. But the time it would take to aggregate a long spike train and respond would be much larger than what was experimentally observed. This meant that some neurons were not aggregating long spike trains from neighbors; instead, they were working with just a couple of spikes from their neighbors when producing an output.

This radically changes the way we think information is encoded. If the brain is using only a few spikes to make decisions, then it must be making decisions based on the timing differences between spikes. A similar process occurs in the auditory systems of humans and other animals. For example, researchers have verified that barn owls use the difference in the arrival times of a sound to each ear to locate their prey.

These observations suggest that we might make more efficient computers by mimicking this aspect of the brain and using the timing of signals to represent information.

Inspired by this idea, my NIST colleagues and I are aiming to develop a new type of computer circuit that uses something we call “race logic” to solve problems. In race logic, signals race against each other, and the timing between them matters. The winner of the race tells us something about the solution of the problem.

To understand race logic, let’s first go back briefly to conventional computing. Conventional digital computers solve problems by sending Boolean bits of information — 0s and 1s — on wires through a circuit. During circuit operation, bits regularly flip their values, from 0 to 1 and vice versa. Each bit flip consumes energy, and circuits with lots of bit flipping are said to have high activity and consume a lot of energy. Trying to reduce energy consumption suggests reducing activity, hence reducing the number of bit flips to perform a given task.

Race logic reduces activity by encoding information in the timing of those bit flips on a wire. This approach allows a single bit flip on a wire to encode values larger than 0 or 1, making it an efficient encoding.

The circuit could be configured to look like a map between your home and workplace and solve problems such as finding the most efficient route. Electrical signals travel through various pathways in the circuit. The first one to reach the end of the circuit wins the race, revealing the most efficient route in the process. Since only a single bit flip passes through many elements of the circuit, it has low activity and high efficiency.

An additional advantage of race logic is that signals that lose the race by moving through slower routes are stopped, further saving energy. Imagine a marathon where you asked the runners not to run the same route, but for each runner to find the most efficient route to the finish line. Once the winner crosses that finish line, all the other runners stop, saving their own energy. If you apply this to a computer, lots of energy is saved over time.

One important class of problems that race logic is particularly good at solving is the shortest-path problem in networks, such as finding the quickest route from one place to another or determining the lowest number of connections required to link two people on social media. It also forms the basis for complicated network analytics that answer more complex questions, such as identifying the highest-traffic nodes in a network, planning paths through the streets of a busy city, tracking the spread of a disease through a population, or finding the fastest way to route information on the internet.
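The race itself can be simulated in software: inject one rising edge at the start node, let it propagate along delays proportional to edge weights, and record when it first reaches the destination. Later arrivals at a node that has already fired are simply dropped, mirroring how losing signals are stopped. This is a toy event-driven sketch with a made-up road map, not NIST’s hardware; in a real circuit the “events” would be physical signal edges:

```python
import heapq

def race_shortest_path(graph, source, dest):
    """Simulate race logic: the time at which a single injected edge
    first reaches `dest` equals the shortest-path distance."""
    arrivals = [(0, source)]  # event queue of (time, node)
    fired = set()             # nodes whose one bit flip already happened
    while arrivals:
        t, node = heapq.heappop(arrivals)
        if node in fired:
            continue          # a losing signal: stopped, no energy spent
        fired.add(node)
        if node == dest:
            return t          # first arrival wins the race
        for neighbor, travel_time in graph.get(node, []):
            if neighbor not in fired:
                heapq.heappush(arrivals, (t + travel_time, neighbor))
    return None               # destination unreachable

# Toy road map: edge weights are travel times.
roads = {
    "home": [("a", 4), ("b", 2)],
    "a":    [("work", 5)],
    "b":    [("a", 1), ("work", 7)],
}
print(race_shortest_path(roads, "home", "work"))  # 8  (home -> b -> a -> work)
```

In effect this is Dijkstra’s algorithm performed by physics: the priority queue plays the role that the flow of time plays in the actual circuit.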

How did I think to connect brain science to a new type of computer circuit?

The concept of race logic was originally conceived and first practically demonstrated during my Ph.D. thesis work at the University of California-Santa Barbara, guided by the electrical engineering expertise of Professor Dmitri Strukov and the computer science expertise of Professor Tim Sherwood. I am currently working toward exploring mathematical techniques and practical technologies to make this concept even more efficient.

Using Race Logic to Make the Next Generation of Energy-Efficient Computers

Next-generation computers are going to look very different from the computers of yesterday. As the quantity and nature of our data gathering changes, the demands from our computing systems must change as well. Hardware that powers tomorrow’s computing applications must keep energy impacts minimal and be good for the planet.

By being in touch with the latest developments in brain science, next-generation computers can benefit from the recently uncovered secrets of biology and meet the ever-increasing demand for energy-efficient computing hardware.

Our team at NIST is running a race of our own — to help computing power reach its full potential and protect our planet at the same time.”

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


Stem Education Coalition

NIST Campus, Gaithersburg, MD.

The National Institute of Standards and Technology‘s Mission, Vision, Core Competencies, and Core Values

Mission

To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

NIST’s vision

NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

NIST’s core competencies

Measurement science
Rigorous traceability
Development and use of standards

NIST’s core values

NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
Integrity: We are ethical, honest, independent, and provide an objective perspective.
Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

Background

The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

In January 1790, President George Washington, in his first annual message to Congress, stated that, “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838 that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared, “Weights and measures may be ranked among the necessities of life to every individual of human society”.

From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

Bureau of Standards

In 1901, in response to a bill proposed by Congressman James H. Southard (R-Ohio), the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for U.S. measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington, D.C., and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures, the Bureau developed instruments for electrical units and for the measurement of light. In 1905, a meeting was called that would be the first National Conference on Weights and Measures.

Initially conceived as a purely metrology agency, the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use, but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production, even operating its own facility to produce optical glass when European supplies were cut off. Between the wars, Harry Diamond of the Bureau developed a blind-approach radio aircraft landing system. During World War II, military research and development was carried out, including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and shortly afterwards the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC: the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time, the Standards Western Automatic Computer (SWAC) was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954.

Due to a changing mission, the “National Bureau of Standards” became the “National Institute of Standards and Technology” in 1988.

Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.

Organization

NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

Communications Technology Laboratory (CTL)
Engineering Laboratory (EL)
Information Technology Laboratory (ITL)
Center for Neutron Research (NCNR)
Material Measurement Laboratory (MML)
Physical Measurement Laboratory (PML)

Extramural programs include:

Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

NIST’s Boulder laboratories are best known for housing NIST‑F1, a cesium fountain atomic clock.

NIST‑F1 serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium—which defines the second—NIST broadcasts time signals via longwave radio station WWVB near Fort Collins in Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha in Hawai’i, respectively.

NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR).

The NCNR provides scientists access to a variety of neutron scattering instruments which they use in many research fields (materials science; fuel cells; biotechnology etc.).

The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation in continuous operation since 1961.

SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

The Center for Nanoscale Science and Technology performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility.

This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).
Committees

NIST has six standing committees:

Technical Guidelines Development Committee (TGDC)
Advisory Committee on Earthquake Hazards Reduction (ACEHR)
Information Security and Privacy Advisory Board (ISPAB)
Visiting Committee on Advanced Technology (VCAT)
Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
Manufacturing Extension Partnership National Advisory Board (MEPNAB)

Measurements and standards

As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content, used as calibration standards for measuring equipment and procedures, quality control benchmarks for industrial processes, and experimental control samples.

Handbook 44

NIST publishes Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through the cooperation of the Committee on Specifications and Tolerances of the NCWM and the Weights and Measures Division (WMD) of NIST. The book is published in partial fulfillment of the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

NIST has been publishing various forms of what is now the Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the concept of the primary use of the SI (metric) measurements recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.

## From The University of New South Wales (AU) : “Quantum engineers have designed a new tool to probe nature with extreme sensitivity”

From The University of New South Wales (AU)

3.13.23
Lachlan Gilbert

This new spin measuring device could help scientists – particularly in chemistry and biology – understand the structure and purpose of materials better.

An artist’s impression of the ultra-sensitive spin detection device. Image: Tony Melov.

In a paper published over the weekend in the journal Science Advances [below], Associate Professor Jarryd Pla and his team from UNSW School of Electrical Engineering and Telecommunications, together with colleague Scientia Professor Andrea Morello, described a new device that can measure the spins in materials with high precision.

“The spin of an electron – whether it points up or down – is a fundamental property of nature,” says A/Prof. Pla. “It is used in magnetic hard disks to store information, MRI machines use the spins of water molecules to create images of the inside of our bodies, and spins are even being used to build quantum computers.

“Being able to detect the spins inside materials is therefore important for a whole range of applications, particularly in chemistry and biology where it can be used to understand the structure and purpose of materials, allowing us to design better chemicals, drugs and so on.”

In fields of research such as chemistry, biology, physics and medicine, the tool used to measure spins is called a spin resonance spectrometer. Normally, commercially produced spectrometers require billions to trillions of spins to get an accurate reading, but A/Prof. Pla and his colleagues were able to measure on the order of thousands of electron spins, meaning the new tool is about a million times more sensitive.

This is quite a feat, as there are a whole range of systems that cannot be measured with commercial tools, such as microscopic samples, two-dimensional materials and high-quality solar cells, which simply have too few spins to create a measurable signal.

The breakthrough came about almost by chance, as the team were developing a quantum memory element for a superconducting quantum computer. The objective of the memory element was to transfer quantum information from a superconducting electrical circuit to an ensemble of spins placed beneath the circuit.

“We noticed that while the device didn’t quite work as planned as a memory element, it was extremely good at measuring the spin ensemble,” says Wyatt Vine, a lead author on the study. “We found that by sending microwave power into the device as the spins emitted their signals, we could amplify the signals before they left the device. What’s more, this amplification could be performed with very little added noise, almost reaching the limit set by quantum mechanics.”

While other highly sensitive spectrometers using superconducting circuits had been developed in the past, they required multiple components, were incompatible with magnetic fields and had to be operated in very cold environments using expensive “dilution refrigerators”, which reach temperatures down to 0.01 Kelvin.

In this new development, A/Prof. Pla says he and the team managed to integrate the components on a single chip.

“Our new technology integrates several important parts of the spectrometer into one device and is compatible with relatively large magnetic fields. This is important, since to measure the spins they need to be placed in a field of about 0.5 Tesla, which is ten thousand times stronger than the Earth’s magnetic field.

“Further, our device operated at a temperature more than 10 times higher than previous demonstrations, meaning we don’t need to use a dilution refrigerator.”

A/Prof. Pla says the UNSW team has patented the technology with a view to potentially commercializing it, but stresses that there is still work to be done.

“There is potential to package this thing up and commercialize it which will allow other researchers to plug it into their existing commercial systems to give them a sensitivity gain.

“If this new technology was properly developed, it could help chemists, biologists and medical researchers, who currently rely on tools made by these large tech companies that work, but which could do something orders of magnitude better.”

Fig. 1. Device design and resonator characterization.

(A) Schematic of the device. The resonator is depicted as a lumped-element resonator with a geometric inductance (Lg) that couples to an ensemble of 209Bi spins and a nonlinear inductance Lk, which we exploit for amplification. The resonant mode is confined using a SIF, which we depict as a series of waveguides with alternating impedances Z_hi and Z_lo. (B) Frequency-dependent transmission of the SIF calculated from ABCD matrices. Note that port 2 is used here for illustration and is not physical. The frequencies of the resonator and pump tones are indicated. (C) Frequency-dependent magnitude (top) and phase (bottom) response of the device when measured in reflection (S11) with a vector network analyzer. The solid red lines correspond to a fit of the data in the complex plane, from which we extract the resonator’s parameters. (D) Shift in resonance frequency (top) and variation of the internal and coupling quality factors (bottom) of the device extracted from measurements of S11 made as a function of IDC. The solid line in the top panel is a fit to the equation in the inset.

Fig. 2. ESR measurements of 209Bi donors in Si using a KIPA.

(A) Allowed ESR transition frequencies for 209Bi in Si as a function of B0. We can measure ESR with the KIPA when ωESR = ω0, with the crossing points marked by vertical dashed lines. (B) CPMG sequence applied to detect the spins. We use τ = 75 μs and N = 200, averaging the echo produced by each of the N refocusing pulses to increase the SNR. (C) Homodyne-demodulated signal in the time domain as a function of B0, measured with the CPMG sequence shown in (B). The bright features correspond to a spin echo signal. (D) Integrated spin echo signal from (C). We label the five peaks according to the ESR transitions we expect from calculations of the spin Hamiltonian. a.u., arbitrary units. (E) Measurements of the Sx transitions between 0 and 370 mT. Each measurement is independently normalized.

For further illustrations see the science paper.

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.

Stem Education Coalition


The University of New South Wales is an Australian public university with its largest campus in the Sydney suburb of Kensington.

Established in 1949, UNSW is a research university, ranked 44th in the world in the 2021 QS World University Rankings and 67th in the world in the 2021 Times Higher Education World University Rankings. UNSW is one of the founding members of the Group of Eight, a coalition of Australian research-intensive universities, and of Universitas 21, a global network of research universities. It has international exchange and research partnerships with over 200 universities around the world.

According to the 2021 QS World University Rankings by Subject, UNSW is ranked top 20 in the world for Law, Accounting and Finance, and 1st in Australia for Mathematics, Engineering and Technology. UNSW also leads Australia in Medicine, where the median ATAR (Australian university entrance examination results) of its Medical School students is higher than any other Australian medical school. UNSW enrolls the highest number of Australia’s top 500 high school students academically, and produces more millionaire graduates than any other Australian university.

The university comprises seven faculties, through which it offers bachelor’s, master’s and doctoral degrees. The main campus is in the Sydney suburb of Kensington, 7 kilometres (4.3 mi) from the Sydney CBD. The creative arts faculty, UNSW Art & Design, is located in Paddington, and subcampuses are located in the Sydney CBD as well as several other suburbs, including Randwick and Coogee. Research stations are located throughout the state of New South Wales.

The university’s second largest campus, known as UNSW Canberra at ADFA (formerly known as UNSW at ADFA), is situated in Canberra, in the Australian Capital Territory (ACT). ADFA is the military academy of the Australian Defence Force, and UNSW Canberra is the only national academic institution with a defence focus.

Research centres

The university has a number of purpose-built research facilities, including:

UNSW Lowy Cancer Research Centre is Australia’s first facility bringing together researchers in childhood and adult cancers, as well as one of the country’s largest cancer-research facilities, housing up to 400 researchers.
The Mark Wainwright Analytical Centre is a centre for the faculties of science, medicine, and engineering. It is used to study the structure and composition of biological, chemical, and physical materials.
UNSW Canberra Cyber is a cyber-security research and teaching centre.
The Sino-Australian Research Centre for Coastal Management (SARCCM) has a multidisciplinary focus, and works collaboratively with the Ocean University of China (CN) in coastal management research.

## From Engineering At The University of Michigan: “New kind of transistor could shrink communications devices on smartphones”

From Engineering

At

The University of Michigan

3.8.23
By Catharine June

Contact:
Kate McAlpine
kmca@umich.edu

Integrating a new ferroelectric semiconductor paves the way for single amplifiers that can do the work of multiple conventional amplifiers, among other possibilities.

Electrical & Computer Engineering research scientist Ding Wang and graduate student Minming He from Prof. Zetian Mi’s group, University of Michigan, are working on the epitaxy and fabrication of high electron mobility transistors (HEMTs) based on a new nitride material, ScAlN, which has recently been demonstrated as a promising high-k and ferroelectric gate dielectric that can foster new functionalities and boost device performance. Image credit: Marcin Szczepanski/Lead Multimedia Storyteller, Michigan Engineering.

One month after announcing a ferroelectric semiconductor at the nanoscale thinness required for modern computing components, a team at the University of Michigan has demonstrated a reconfigurable transistor using that material.

The study is a featured article in Applied Physics Letters [below].

“By realizing this new type of transistor, it opens up the possibility for integrating multifunctional devices, such as reconfigurable transistors, filters and resonators, on the same platform—all while operating at very high frequency and high power,” said Zetian Mi, U-M professor of electrical and computer engineering who led the research. “That’s a game changer for many applications.”


At its most basic level, a transistor is a kind of switch, letting an electric current through or preventing it from passing. The one demonstrated at Michigan is known as a ferroelectric high electron mobility transistor (FeHEMT)—a twist on the HEMT, a transistor that can increase the strength of a signal (a property known as gain) while offering high switching speed and low noise. This makes HEMTs well suited as amplifiers for sending out signals to cell towers and Wi-Fi routers at high speeds.

Ferroelectric semiconductors stand out from others because they can sustain an electrical polarization, like the electric version of magnetism. But unlike a fridge magnet, they can switch which end is positive and which is negative. In the context of a transistor, this capability adds flexibility—the transistor can change how it behaves.

“We can make our ferroelectric HEMT reconfigurable,” said Ding Wang, a research scientist in electrical and computer engineering and first author of the study. “That means it can function as several devices, such as one amplifier working as several amplifiers that we can dynamically control. This allows us to reduce the circuit area and lower the cost as well as the energy consumption.”


Areas of particular interest for this device are reconfigurable radio frequency and microwave communication as well as memory devices in next-generation electronics and computing systems.

“By adding ferroelectricity to an HEMT, we can make the switching sharper. This could enable much lower power consumption in addition to high gain, making for much more efficient devices,” said Ping Wang, a research scientist in electrical and computer engineering and a co-corresponding author of the study.

The ferroelectric semiconductor is made of aluminum nitride spiked with scandium, a metal sometimes used to fortify aluminum in performance bicycles and fighter jets. It is the first nitride-based ferroelectric semiconductor, enabling it to be integrated with the next-gen semiconductor gallium nitride. Offering speeds up to 100 times that of silicon, as well as high efficiency and low cost, gallium nitride semiconductors are contenders to displace silicon as the preferred material for electronic devices.

“This is a pivotal step toward integrating nitride ferroelectrics with mainstream electronics,” Mi said.

The new transistor was grown using molecular beam epitaxy, the same approach used to make semiconductor crystals that drive the lasers in CD and DVD players.

The University of Michigan has applied for patent protection. Early work leading to this study was funded by the Office of Naval Research and the Blue Sky Initiative at the U-M College of Engineering.

The device was built in the Lurie Nanofabrication Facility and studied at the Michigan Center for Materials Characterization.

Applied Physics Letters
See the science paper for instructive material with images.

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.

Stem Education Coalition

Michigan Engineering provides scientific and technological leadership to the people of the world. Through our people-first engineering approach, we’re committed to fostering a community of engineers who will close critical gaps and elevate all people. We aspire to be the world’s preeminent college of engineering serving the common good.

Values

Creativity, innovation and daring
Diversity, equity and social impact
Collegiality and collaboration
Transparency and trustworthiness

The University of Michigan is a public research university located in Ann Arbor, Michigan, United States. Originally founded in 1817 in Detroit as the Catholepistemiad, or University of Michigania, 20 years before the Michigan Territory officially became a state, the University of Michigan is the state’s oldest university. The university moved to Ann Arbor in 1837 onto 40 acres (16 ha) of what is now known as Central Campus. Since its establishment in Ann Arbor, the university campus has expanded to include more than 584 major buildings with a combined area of more than 34 million gross square feet (781 acres or 3.16 km²), and has two satellite campuses located in Flint and Dearborn. The University was one of the founding members of the Association of American Universities.

Considered one of the foremost research universities in the United States, the university has very high research activity and its comprehensive graduate program offers doctoral degrees in the humanities, social sciences, and STEM fields (Science, Technology, Engineering and Mathematics) as well as professional degrees in business, medicine, law, pharmacy, nursing, social work and dentistry. Michigan’s body of living alumni (as of 2012) comprises more than 500,000. Besides academic life, Michigan’s athletic teams compete in Division I of the NCAA and are collectively known as the Wolverines. They are members of the Big Ten Conference.

At over \$12.4 billion in 2019, Michigan’s endowment is among the largest of any university. As of October 2019, 53 MacArthur “genius award” winners (29 alumni winners and 24 faculty winners), 26 Nobel Prize winners, six Turing Award winners, one Fields Medalist and one Mitchell Scholar have been affiliated with the university. Its alumni include eight heads of state or government, including President of the United States Gerald Ford; 38 cabinet-level officials; and 26 living billionaires. It also has many alumni who are Fulbright Scholars and MacArthur Fellows.

Research

Michigan is one of the founding members (in the year 1900) of the Association of American Universities. With over 6,200 faculty members, 73 of whom are members of the National Academy and 471 of whom hold an endowed chair in their discipline, the university manages one of the largest annual collegiate research budgets of any university in the United States. According to the National Science Foundation, Michigan spent \$1.6 billion on research and development in 2018, ranking it 2nd in the nation. This figure totaled over \$1 billion in 2009. The Medical School spent the most at over \$445 million, while the College of Engineering was second at more than \$160 million. U-M also has a technology transfer office, which is the university conduit between laboratory research and corporate commercialization interests.

In 2009, the university signed an agreement to purchase a facility formerly owned by Pfizer. The acquisition includes over 170 acres (0.69 km^2) of property, and 30 major buildings comprising roughly 1,600,000 square feet (150,000 m^2) of wet laboratory space, and 400,000 square feet (37,000 m^2) of administrative space. At the time of the agreement, the university’s intentions for the space were not set, but the expectation was that the new space would allow the university to ramp up its research and ultimately employ in excess of 2,000 people.

The university has also made major contributions to the medical field, including work on the EKG and the gastroscope. The university’s 13,000-acre (53 km^2) biological station in the Northern Lower Peninsula of Michigan is one of only 47 Biosphere Reserves in the United States.

In the mid-1960s U-M researchers worked with IBM to develop a new virtual memory architectural model that became part of IBM’s Model 360/67 mainframe computer (the 360/67 was initially dubbed the 360/65M where the “M” stood for Michigan). The Michigan Terminal System (MTS), an early time-sharing computer operating system developed at U-M, was the first system outside of IBM to use the 360/67’s virtual memory features.

U-M is home to the National Election Studies and the University of Michigan Consumer Sentiment Index. The Correlates of War project, also located at U-M, is an accumulation of scientific knowledge about war. The university is also home to major research centers in optics, reconfigurable manufacturing systems, wireless integrated microsystems, and social sciences. The University of Michigan Transportation Research Institute and the Life Sciences Institute are located at the university. The Institute for Social Research (ISR), the nation’s longest-standing laboratory for interdisciplinary research in the social sciences, is home to the Survey Research Center, Research Center for Group Dynamics, Center for Political Studies, Population Studies Center, and Inter-university Consortium for Political and Social Research. Undergraduate students are able to participate in various research projects through the Undergraduate Research Opportunity Program (UROP) as well as the UROP/Creative-Programs.

The U-M library system comprises nineteen individual libraries with twenty-four separate collections—roughly 13.3 million volumes. U-M was the original home of the JSTOR database, which contains about 750,000 digitized pages from the entire pre-1990 backfile of ten journals of history and economics, and has initiated a book digitization program in collaboration with Google. The University of Michigan Press is also a part of the U-M library system.

In the late 1960s U-M, together with Michigan State University and Wayne State University, founded the Merit Network, one of the first university computer networks. The Merit Network was then and remains today administratively hosted by U-M. Another major contribution took place in 1987 when a proposal submitted by the Merit Network together with its partners IBM, MCI, and the State of Michigan won a national competition to upgrade and expand the National Science Foundation Network (NSFNET) backbone from 56,000 to 1.5 million, and later to 45 million bits per second. In 2006, U-M joined with Michigan State University and Wayne State University to create the University Research Corridor. This effort was undertaken to highlight the capabilities of the state’s three leading research institutions and drive the transformation of Michigan’s economy. The three universities are electronically interconnected via the Michigan LambdaRail (MiLR, pronounced ‘MY-lar’), a high-speed data network providing 10 Gbit/s connections between the three university campuses and other national and international network connection points in Chicago.

## From The Henry Samueli School of Engineering and Applied Science At The University of California-Los Angeles: “ARTEMIS – UCLA’s most advanced humanoid robot – gets ready for action”

From The Henry Samueli School of Engineering and Applied Science

At

The University of California-Los Angeles

3.10.23
By Matthew Chin

Media Contact
Christine Wei-li Lee
310-206-0540
clee@seas.ucla.edu

ARTEMIS: Advanced Robotic Technology for Enhanced Mobility and Improved Stability.

ARTEMIS’ major innovation is that its actuators — devices that generate motion from energy — were custom-designed to behave like biological muscles. Credit: RoMeLa at UCLA.

Mechanical engineers at the UCLA Samueli School of Engineering have developed a full-sized humanoid robot with first-of-its-kind technology.

Named ARTEMIS, for Advanced Robotic Technology for Enhanced Mobility and Improved Stability, the robot is scheduled to travel in July to Bordeaux, France, where it will take part in the soccer competition of the 2023 RoboCup, an international scientific meeting where robots demonstrate capabilities across a range of categories.

The robot was designed by researchers at the Robotics and Mechanisms Laboratory at UCLA, or RoMeLa, as a general-purpose humanoid robot, with a particular focus on bipedal locomotion over uneven terrain. Standing 4 feet, 8 inches tall and weighing 85 pounds, it’s capable of walking on rough and unstable surfaces, as well as running and jumping. ARTEMIS is able to remain steady even when strongly shoved or otherwise disturbed.

During tests in the lab, ARTEMIS has been clocked walking at 2.1 meters per second, which would make it the world’s fastest walking humanoid robot, according to the UCLA researchers. It is also believed to be the first humanoid robot designed in an academic setting that is capable of running, and only the third overall.

The robot’s major innovation is that its actuators — devices that generate motion from energy — were custom-designed to behave like biological muscles. They’re springy and force-controlled, as opposed to the rigid, position-controlled actuators that most robots have.
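The difference between rigid position control and muscle-like, springy force control can be sketched generically. This is illustrative control-theory boilerplate, not RoMeLa’s actual controller; the gain values are made-up numbers chosen only to show the contrast.

```python
def rigid_position_command(x_des, x, kp=2000.0):
    """Stiff position control: a large gain fights any position error.

    A small disturbance produces a large corrective force, so the joint
    feels rigid -- the behavior of most conventional robot actuators.
    """
    return kp * (x_des - x)

def compliant_force_command(x_des, x, v, k=50.0, d=5.0):
    """Spring-damper (impedance-style) control: the joint behaves like a
    muscle, yielding under load instead of resisting at full stiffness."""
    return k * (x_des - x) - d * v

# Same 1 cm position error, joint momentarily at rest:
print(rigid_position_command(0.01, 0.0))        # ~20 N: hard push back
print(compliant_force_command(0.01, 0.0, 0.0))  # ~0.5 N: gentle, springy response
```

The compliant law trades tracking stiffness for the ability to absorb shoves and uneven footfalls, which is the property the article attributes to ARTEMIS’ actuators.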

“That is the key behind its excellent balance while walking on uneven terrain and its ability to run — getting both feet off the ground while in motion,” said Dennis Hong, a UCLA professor of mechanical and aerospace engineering and the director of RoMeLa. “This is a first-of-its-kind robot.”

Another major advance is that ARTEMIS’ actuators are electrically driven, rather than controlled by hydraulics, which uses differences in fluid pressure to drive movement. As a result, it makes less noise and operates more efficiently than robots with hydraulic actuators — and it’s cleaner, because hydraulic systems are notorious for leaking fluids.

ARTEMIS’ ability to respond and adapt to what it senses comes from its system of sensors and actuators. It has custom-designed force sensors on each foot, which help the machine keep its balance as it moves. It also has an orientation unit and cameras in its head to help it perceive its surroundings.

To prepare ARTEMIS for RoboCup, student researchers have been testing the robot on regular walks around the UCLA campus. In the coming weeks, they will fully test the robot’s running and soccer-playing skills at the UCLA Intramural Field. Researchers also will evaluate how well it can traverse uneven terrain and stairs, its capacity for falling and getting back up, and its ability to carry objects. RoMeLa’s Twitter account is regularly sharing information about the robot’s testing results and posting the routes for its campus walks, giving Bruins the chance to catch ARTEMIS in action and chat with researchers.

“We’re very excited to take ARTEMIS out for field testing here at UCLA and we see this as an opportunity to promote science, technology, engineering and mathematics to a much wider audience,” Hong said.

Taoyuanmin Zhu and Min Sung Ahn, both of whom recently earned doctorates in mechanical engineering at UCLA, developed ARTEMIS’ hardware and software systems, respectively.

RoMeLa, which has been making humanoid robots for more than two decades, has had earlier robots win the RoboCup competition five times already; the engineers are hoping ARTEMIS brings home trophy number six.

ARTEMIS’ development was funded in part by 232 donors who contributed more than \$118,000 through a UCLA Spark crowdfunding campaign. Additional support came from an Office of Naval Research grant.

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


Stem Education Coalition

The UCLA Henry Samueli School of Engineering and Applied Science is the school of engineering at the University of California-Los Angeles. It opened as the College of Engineering in 1945, and was renamed the School of Engineering in 1969. Since its initial enrollment of 379 students, the school has grown to approximately 6,100 students. The school is ranked 16th among all engineering schools in the United States. The school offers 28 degree programs and is home to eight externally funded interdisciplinary research centers, including those in space exploration, wireless sensor systems, and nanotechnology.

The University of California-Los Angeles

For nearly 100 years, The University of California-Los Angeles has been a pioneer, persevering through impossibility, turning the futile into the attainable.

We doubt the critics, reject the status quo and see opportunity in dissatisfaction. Our campus, faculty and students are driven by optimism. It is not naïve; it is essential. And it has fueled every accomplishment, allowing us to redefine what’s possible, time after time.

This can-do perspective has brought us 12 Nobel Prizes, 12 Rhodes Scholarships, more NCAA titles than any university and more Olympic medals than most nations. Our faculty and alumni helped create the Internet and pioneered reverse osmosis. And more than 100 companies have been created based on technology developed at UCLA.

The University of California-Los Angeles is a public land-grant research university in Los Angeles, California. The University of California-Los Angeles traces its early origins back to 1882 as the southern branch of the California State Normal School (now San Jose State University). It became the Southern Branch of The University of California in 1919, making it the second-oldest (after University of California-Berkeley) of the 10-campus University of California system.

The University of California-Los Angeles offers 337 undergraduate and graduate degree programs in a wide range of disciplines, enrolling about 31,500 undergraduate and 12,800 graduate students. The University of California-Los Angeles had 168,000 applicants for Fall 2021, including transfer applicants, making the school the most applied-to of any American university.

The university is organized into six undergraduate colleges; seven professional schools; and four professional health science schools. The undergraduate colleges are the College of Letters and Science; Samueli School of Engineering; School of the Arts and Architecture; Herb Alpert School of Music; School of Theater, Film and Television; and School of Nursing.

The University of California-Los Angeles is called a “Public Ivy”, and is ranked among the best public universities in the United States by major college and university rankings. This includes one ranking that has The University of California-Los Angeles as the top public university in the United States in 2021. As of October 2020, 25 Nobel laureates; three Fields Medalists; five Turing Award winners; and two Chief Scientists of the U.S. Air Force have been affiliated with The University of California-Los Angeles as faculty, researchers or alumni. Among the current faculty members, 55 have been elected to the National Academy of Sciences; 28 to the National Academy of Engineering; 39 to the Institute of Medicine; and 124 to the American Academy of Arts and Sciences. The university was elected to the Association of American Universities in 1974.

The University of California-Los Angeles student-athletes compete as the Bruins in the Pac-12 Conference. The Bruins have won 129 national championships, including 118 NCAA team championships, more than any other university except Stanford University, whose athletes have won 126. The University of California-Los Angeles students, coaches, and staff have won 251 Olympic medals: 126 gold; 65 silver; and 60 bronze. The University of California-Los Angeles student-athletes have competed in every Olympics since 1920 with one exception (1924) and have won a gold medal in every Olympics in which the U.S. has participated since 1932.

History

In March 1881, at the request of state senator Reginaldo Francisco del Valle, the California State Legislature authorized the creation of a southern branch of the California State Normal School (now San José State University) in downtown Los Angeles to train teachers for the growing population of Southern California. The Los Angeles branch of the California State Normal School opened on August 29, 1882, on what is now the site of the Central Library of the Los Angeles Public Library system. The facility included an elementary school where teachers-in-training could practice their technique with children. That elementary school is related to the present-day University of California-Los Angeles Lab School. In 1887, the branch campus became independent and changed its name to Los Angeles State Normal School.

In 1914, the school moved to a new campus on Vermont Avenue (now the site of Los Angeles City College) in East Hollywood. In 1917, UC Regent Edward Augustus Dickson, the only regent representing the Southland at the time, and Ernest Carroll Moore, director of the Normal School, began to lobby the State Legislature to enable the school to become the second University of California campus, after University of California-Berkeley. They met resistance from University of California-Berkeley alumni, Northern California members of the state legislature, and Benjamin Ide Wheeler, president of the University of California from 1899 to 1919, all of whom were vigorously opposed to the idea of a southern campus. However, David Prescott Barrows, the new president of the University of California, did not share Wheeler’s objections.

On May 23, 1919, the Southern Californians’ efforts were rewarded when Governor William D. Stephens signed Assembly Bill 626 into law, which acquired the land and buildings and transformed the Los Angeles Normal School into the Southern Branch of the University of California. The same legislation added its general undergraduate program, the Junior College. The Southern Branch campus opened on September 15 of that year, offering two-year undergraduate programs to 250 Junior College students and 1,250 students in the Teachers College under Moore’s continued direction. Southern Californians were furious that their so-called “branch” provided only an inferior junior college program (mocked at the time by The University of Southern California students as “the twig”) and continued to fight Northern Californians (specifically, Berkeley) for the right to three and then four years of instruction culminating in bachelor’s degrees. On December 11, 1923, the Board of Regents authorized a fourth year of instruction and transformed the Junior College into the College of Letters and Science, which awarded its first bachelor’s degrees on June 12, 1925.

Under University of California President William Wallace Campbell, enrollment at the Southern Branch expanded so rapidly that by the mid-1920s the institution was outgrowing the 25-acre Vermont Avenue location. The Regents searched for a new location and announced their selection of the so-called “Beverly Site”—just west of Beverly Hills—on March 21, 1925, edging out the panoramic hills of the still-empty Palos Verdes Peninsula. After the athletic teams entered the Pacific Coast Conference in 1926, the Southern Branch student council adopted the nickname “Bruins”, a name offered by the student council at The University of California-Berkeley. In 1927, the Regents renamed the Southern Branch the University of California at Los Angeles (the word “at” was officially replaced by a comma in 1958 in line with other UC campuses). In the same year the state broke ground in Westwood on land sold for \$1 million (less than one-third its value) by real estate developers Edwin and Harold Janss, for whom the Janss Steps are named. The campus in Westwood opened to students in 1929.

The original four buildings were the College Library (now Powell Library), Royce Hall, the Physics-Biology Building (which became the Humanities Building and is now the Renee and David Kaplan Hall), and the Chemistry Building (now Haines Hall), arrayed around a quadrangular courtyard on the 400-acre (1.6 km^2) campus. The first undergraduate classes on the new campus were held in 1929 with 5,500 students. After lobbying by alumni, faculty, administration and community leaders, University of California-Los Angeles was permitted to award the master’s degree in 1933 and the doctorate in 1936, against continued resistance from The University of California-Berkeley.

Maturity as a university

During its first 32 years University of California-Los Angeles was treated as an off-site department of The University of California. As such its presiding officer was called a “provost” and reported to the main campus in Berkeley. In 1951 University of California-Los Angeles was formally elevated to co-equal status with The University of California-Berkeley, and its presiding officer Raymond B. Allen was the first chief executive to be granted the title of chancellor. The appointment of Franklin David Murphy to the position of Chancellor in 1960 helped spark an era of tremendous growth of facilities and faculty honors. By the end of the decade University of California-Los Angeles had achieved distinction in a wide range of subjects. This era also secured University of California-Los Angeles’s position as a proper university and not simply a branch of the University of California system. This change is exemplified by an incident involving Chancellor Murphy, which was described by him:

“I picked up the telephone and called in from somewhere and the phone operator said, “University of California.” And I said, “Is this Berkeley?” She said, “No.” I said, “Well, who have I gotten to?” “University of California-Los Angeles.” I said, “Why didn’t you say University of California-Los Angeles?” “Oh,” she said, “we’re instructed to say University of California.” So, the next morning I went to the office and wrote a memo; I said, “Will you please instruct the operators, as of noon today, when they answer the phone to say, ‘University of California-Los Angeles.’” And they said, “You know they won’t like it at Berkeley.” And I said, “Well, let’s just see. There are a few things maybe we can do around here without getting their permission.”

Recent history

On June 1, 2016, two men were killed in a murder-suicide at an engineering building on campus. School officials put the campus on lockdown as Los Angeles Police Department officers, including SWAT teams, cleared the campus.

In 2018, a student-led community coalition known as “Westwood Forward” successfully led an effort to break University of California-Los Angeles and Westwood Village away from the existing Westwood Neighborhood Council and form a new North Westwood Neighborhood Council with over 2,000 out of 3,521 stakeholders voting in favor of the split. Westwood Forward’s campaign focused on making housing more affordable and encouraging nightlife in Westwood by opposing many of the restrictions on housing developments and restaurants the Westwood Neighborhood Council had promoted.

Divisions

College of Letters and Science
Social Sciences Division
Humanities Division
Physical Sciences Division
Life Sciences Division
School of the Arts and Architecture
Henry Samueli School of Engineering and Applied Science (HSSEAS)
Herb Alpert School of Music
School of Theater, Film and Television
School of Nursing
Luskin School of Public Affairs

Graduate School of Education & Information Studies (GSEIS)
School of Law
Anderson School of Management
Luskin School of Public Affairs
David Geffen School of Medicine
School of Dentistry
Jonathan and Karin Fielding School of Public Health
Semel Institute for Neuroscience and Human Behavior
School of Nursing

Research

University of California-Los Angeles is classified among “R1: Doctoral Universities – Very high research activity” and had \$1.32 billion in research expenditures in FY 2018.

## From The University of California-Santa Cruz: “SpikeGPT: researcher releases code for largest-ever spiking neural network for language generation”

From The University of California-Santa Cruz

3.7.23
Emily Cerf
ecerf@ucsc.edu

Language generators such as ChatGPT are gaining attention for their ability to reshape how we use search engines and change the way we interact with artificial intelligence. But these algorithms are computationally expensive to run, and users depend on just a few companies to maintain them and avoid outages.

Jason Eshraghian

UC Santa Cruz Assistant Professor of Electrical and Computer Engineering Jason Eshraghian has created a new model for language generation that can address both of these issues. Language models typically use modern deep learning methods called neural networks, but Eshraghian is powering a language model with an alternative algorithm called a spiking neural network (SNN). He and two students recently released the open-source code for the largest language-generating SNN ever, named SpikeGPT, which uses 22 times less energy than a similar model using typical deep learning. Using SNNs for language generation could have huge implications for accessibility, data security, and green computing and energy efficiency within this field.

“Brains are way more efficient than AI algorithms,” Eshraghian said. “Large scale language models rely on ridiculous amounts of compute power, and that’s pretty damn expensive. We’re taking an informed approach to borrowing principles from the brain, copying this idea that neurons are usually quiet and not transmitting anything. Using spikes is a much more efficient way to represent information.”

Neural networks in general are based on emulating how the brain processes information, and spiking neural networks are a variation that tries to make the networks more efficient. Instead of constantly transmitting information throughout the network, as non-spiking networks do, the neurons in SNNs stay in a quiet state until they are activated, at which point they spike. This introduces a temporal dimension into the equation, because the functions are concerned with how the neurons behave over time.
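The quiet-until-activated behavior described above can be sketched with a toy leaky integrate-and-fire neuron. This is a minimal illustration of the idea, not the neuron model SpikeGPT actually uses, and the decay and threshold values are arbitrary:

```python
# Toy leaky integrate-and-fire (LIF) neuron: the membrane potential decays
# each timestep, accumulates input current, and emits a spike (1) only when
# it crosses a threshold -- otherwise the output stays quiet (0).
def lif_neuron(inputs, beta=0.9, threshold=1.0):
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = beta * potential + current   # leaky integration
        if potential >= threshold:
            spikes.append(1)                     # fire ...
            potential = 0.0                      # ... and reset
        else:
            spikes.append(0)                     # stay silent
    return spikes

# A weak constant input produces only occasional spikes: most timesteps
# transmit nothing, which is where the energy savings come from.
print(lif_neuron([0.3] * 10))   # → [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

Only the timesteps that actually spike carry information downstream, which is the sparsity Eshraghian refers to when he says neurons are "usually quiet."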

Spiking neural networks, however, face their own challenges in the training of the models. Many of the optimization strategies that have been developed for regular neural networks and modern deep learning, such as backpropagation and gradient descent, cannot be easily applied to the training of SNNs, because the spiking signals passing through the system are not compatible with those techniques. But Eshraghian has pioneered methods to circumvent these problems and apply the optimization techniques developed for traditional deep learning to the training of SNNs.
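One widely used workaround in this area is the surrogate gradient method: keep the hard, all-or-nothing spike in the forward pass, but substitute a smooth derivative in the backward pass so gradients can flow. The sketch below is schematic; the sigmoid surrogate and its slope are illustrative choices, not the exact functions used in SpikeGPT's training:

```python
import math

def spike_forward(v, threshold=1.0):
    """Forward pass: a hard, non-differentiable step -- spike or no spike."""
    return 1.0 if v >= threshold else 0.0

def spike_backward(v, threshold=1.0, slope=5.0):
    """Backward pass: pretend the step was a steep sigmoid and use its
    smooth derivative instead, so backpropagation has a gradient to use."""
    s = 1.0 / (1.0 + math.exp(-slope * (v - threshold)))
    return slope * s * (1.0 - s)

# The true derivative of the step is zero everywhere except exactly at the
# threshold, which would stop gradient descent dead; the surrogate is
# nonzero near the threshold, letting training adjust the weights.
print(spike_forward(0.9), spike_backward(0.9))
```

The forward behavior of the network is unchanged; only the learning signal is smoothed.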

Large language models, such as ChatGPT, use a technique called self-attention: taking a sequence of data, such as a string of words, and applying a function to determine how closely each data point is related to every other. The mathematics behind this requires matrix-matrix multiplication, an operation that is computationally expensive.

Combining self-attention with SNNs posed a similar complexity problem, until Eshraghian and his incoming graduate student Ruijie Zhu developed a technique to feed each data point in the sequence into the SNN model one by one, eliminating the need for matrix-matrix multiplication.

“By coming up with a way to break down that backbone of language models into sequences, we completely squashed down that computational complexity without compromising on the ability of the model to generate language,” Eshraghian said. “It was taking the best of both worlds – the low complexity of sequential models and the performance of self-attention.”
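The complexity trade-off Eshraghian describes can be sketched abstractly: full self-attention compares every token with every other token, so its cost grows with the square of the sequence length, while a sequential formulation touches each token once and keeps a fixed-size running state. The toy mixer below only illustrates that constant-state idea; it is not SpikeGPT's actual update rule:

```python
def recurrent_mix(tokens, decay=0.5):
    """An exponentially weighted running average: each step folds one new
    token into O(1) state, so a length-n sequence costs O(n) work --
    versus the O(n^2) pairwise comparisons of full self-attention."""
    num, den = 0.0, 0.0
    outputs = []
    for x in tokens:
        num = decay * num + x       # decayed sum of past token values
        den = decay * den + 1.0     # decayed count (normalizer)
        outputs.append(num / den)   # context summary available so far
    return outputs

# Each output position sees a summary of everything before it, but no
# pairwise token-to-token comparison was ever computed.
print(recurrent_mix([1.0, 2.0, 3.0]))
```

The design question is whether such a running summary preserves enough context to match attention's quality, which is the "best of both worlds" claim the preprint tests.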

In a preprint paper [below], Eshraghian describes three versions of SpikeGPT. The first is the smallest scale, at 45 million parameters, close in size to the largest-ever SNN that had been developed up to this point. Right now Eshraghian has only released the code for this smallest model, and he is still training the two larger ones.

The medium- and large-size models, at 125 million and 260 million parameters respectively, will likely become the second-largest and largest models when their training is complete and their code is released.

The preprint shows examples of language generation that these two models were able to produce, even in their not-yet fully trained states. Eshraghian found that his small-scale version is significantly more energy efficient than typical deep-learning models, and he expects similar results for the larger models.

Using SNNs to power language generation in a more energy-efficient way could mean a decreased dependency on the large companies that currently dominate the field. Making the technology more accessible would mitigate issues such as those that occur when the gigantic servers running ChatGPT go down and render the technology useless for a time.

“If we manage to get this low-power enough to function on a scale comparable with the brain, then that could be something that everyone has locally on their devices, with less reliance on some monopolized entity,” Eshraghian said.

SpikeGPT also offers huge benefits for data security and privacy. With the language generator on a local device, data input into the system is much more secure, protected from potential data-harvesting enterprises.

Eshraghian hopes that his models will show the language generation industry the vast potential of SNNs.

“This work shows that we can actually train models at the same scale with very similar performance, with far, far better energy consumption than what’s currently out there. Showing that in this paper could nudge industry in a direction to be more open to adopting SNNs as a full-fledged technique to address their power-consumption problems.”

However, this transition will require the development of brain-inspired hardware, which is a significant investment. Eshraghian hopes to work with a hardware company such as Intel to host these models, which would allow him to further demonstrate the energy-saving benefits of his SNN.

Since releasing the preprint paper and the code for the SNN, Eshraghian has seen a positive reaction from the research community. Hugging Face, a major company that hosts open-source models that are too large to live on GitHub, offered to host his model. He has also started a Discord server for people to experiment, build chatbots, and share results.

“What’s most appreciated by the community is the fact that we’ve shown it’s actually possible to do language generation with spikes.”

preprint paper

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.

Stem Education Coalition

The University of California-Santa Cruz opened in 1965 and grew, one college at a time, to its current (2008-09) enrollment of more than 16,000 students. Undergraduates pursue more than 60 majors supervised by divisional deans of humanities, physical & biological sciences, social sciences, and arts. Graduate students work toward graduate certificates, master’s degrees, or doctoral degrees in more than 30 academic fields under the supervision of the divisional and graduate deans. The dean of the Jack Baskin School of Engineering oversees the campus’s undergraduate and graduate engineering programs.

The University of California-Santa Cruz is a public land-grant research university in Santa Cruz, California. It is one of the ten campuses in the University of California system. Located on Monterey Bay, on the edge of the coastal community of Santa Cruz, the campus lies on 2,001 acres (810 ha) of rolling, forested hills overlooking the Pacific Ocean.

Founded in 1965, The University of California-Santa Cruz began with the intention to showcase progressive, cross-disciplinary undergraduate education, innovative teaching methods and contemporary architecture. The residential college system consists of ten small colleges that were established as a variation of the Oxbridge collegiate university system.

Among the Faculty is 1 Nobel Prize Laureate, 1 Breakthrough Prize in Life Sciences recipient, 12 members from the National Academy of Sciences, 28 members of the American Academy of Arts and Sciences, and 40 members of the American Association for the Advancement of Science. Eight University of California-Santa Cruz alumni are winners of 10 Pulitzer Prizes. The University of California-Santa Cruz is classified among “R1: Doctoral Universities – Very high research activity”. It is a member of the Association of American Universities, an alliance of elite research universities in the United States and Canada.

The university has five academic divisions: Arts, Engineering, Humanities, Physical & Biological Sciences, and Social Sciences. Together, they offer 65 graduate programs, 64 undergraduate majors, and 41 minors.

Popular undergraduate majors include Art, Business Management Economics, Chemistry, Molecular and Cell Biology, Physics, and Psychology. Interdisciplinary programs, such as Computational Media, Feminist Studies, Environmental Studies, Visual Studies, Digital Arts and New Media, Critical Race & Ethnic Studies, and the History of Consciousness Department are also hosted alongside UCSC’s more traditional academic departments.

A joint program with The University of California-Hastings enables University of California-Santa Cruz students to earn a bachelor’s degree and Juris Doctor degree in six years instead of the usual seven. The “3+3 BA/JD” Program between University of California-Santa Cruz and The University of California-Hastings College of the Law in San Francisco accepted its first applicants in fall 2014. University of California-Santa Cruz students who declare their intent in their freshman or early sophomore year will complete three years at The University of California-Santa Cruz and then move on to The University of California-Hastings to begin the three-year law curriculum. Credits from the first year of law school will count toward a student’s bachelor’s degree. Students who successfully complete the first-year law course work will receive their bachelor’s degree and be able to graduate with their University of California-Santa Cruz class, then continue at The University of California-Hastings afterwards for two years.

According to the National Science Foundation, The University of California-Santa Cruz spent \$127.5 million on research and development in 2018, ranking it 144th in the nation.

Although designed as a liberal arts-oriented university, The University of California-Santa Cruz quickly acquired a graduate-level natural science research component with the appointment of plant physiologist Kenneth V. Thimann as the first provost of Crown College. Thimann developed The University of California-Santa Cruz’s early Division of Natural Sciences and recruited other well-known science faculty and graduate students to the fledgling campus. Immediately upon its founding, The University of California-Santa Cruz was also granted administrative responsibility for the Lick Observatory, which established the campus as a major center for Astronomy research. Founding members of the Social Science and Humanities faculty created the unique History of Consciousness graduate program in The University of California-Santa Cruz’s first year of operation.

Famous former University of California-Santa Cruz faculty members include Judith Butler and Angela Davis.

The University of California-Santa Cruz’s organic farm and garden program is the oldest in the country, and pioneered organic horticulture techniques internationally.

As of 2015, The University of California-Santa Cruz’s faculty include 13 members of the National Academy of Sciences, 24 fellows of the American Academy of Arts and Sciences, and 33 fellows of the American Association for the Advancement of Science. The Baskin School of Engineering, founded in 1997, is The University of California-Santa Cruz’s first and only professional school. Baskin Engineering is home to several research centers, including the Center for Biomolecular Science and Engineering and Cyberphysical Systems Research Center, which are gaining recognition, as has the work that UCSC researchers David Haussler and Jim Kent have done on the Human Genome Project, including the widely used University of California-Santa Cruz Genome Browser. The University of California-Santa Cruz administers the National Science Foundation’s Center for Adaptive Optics.

Off-campus research facilities maintained by The University of California-Santa Cruz include the Lick Observatory, the W. M. Keck Observatory on Mauna Kea, Hawai’i, and the Long Marine Laboratory. From September 2003 to July 2016, The University of California-Santa Cruz managed a University Affiliated Research Center (UARC) for the NASA Ames Research Center under a task order contract valued at more than \$330 million.

The University of California-Santa Cruz was tied for 58th in the list of Best Global Universities and tied for 97th in the list of Best National Universities in the United States by U.S. News & World Report’s 2021 rankings. In 2017 Kiplinger ranked The University of California-Santa Cruz 50th out of the top 100 best-value public colleges and universities in the nation, and 3rd in California. Money Magazine ranked The University of California-Santa Cruz 41st in the country out of the nearly 1500 schools it evaluated for its 2016 Best Colleges ranking. In 2016–2017, The University of California-Santa Cruz was rated 146th in the world by Times Higher Education World University Rankings. In 2016 it was ranked 83rd in the world by the Academic Ranking of World Universities and 296th worldwide by the QS World University Rankings.

In 2009, RePEc, an online database of research economics articles, ranked The University of California-Santa Cruz Economics Department sixth in the world in the field of international finance. In 2007, High Times magazine placed The University of California-Santa Cruz first among US universities as a “counterculture college.” In 2009, The Princeton Review (with Gamepro magazine) ranked The University of California-Santa Cruz’s Game Design major among the top 50 in the country. In 2011, The Princeton Review and Gamepro Media ranked The University of California-Santa Cruz’s graduate programs in Game Design as seventh in the nation. In 2012, The University of California-Santa Cruz was ranked No. 3 in the Most Beautiful Campus list of Princeton Review.

The University of California-Santa Cruz is the home base for the Lick Observatory.

Search for extraterrestrial intelligence expands at Lick Observatory

New instrument scans the sky for pulses of infrared light

March 23, 2015
By Hilary Lebow
Astronomers are expanding the search for extraterrestrial intelligence into a new realm with detectors tuned to infrared light at The University of California-Santa Cruz’s Lick Observatory. A new instrument, called NIROSETI, will soon scour the sky for messages from other worlds.

“Infrared light would be an excellent means of interstellar communication,” said Shelley Wright, an assistant professor of physics at The University of California-San Diego who led the development of the new instrument while at the University of Toronto’s Dunlap Institute for Astronomy and Astrophysics.

Infrared light would be a good way for extraterrestrials to get our attention here on Earth, since pulses from a powerful infrared laser could outshine a star, if only for a billionth of a second. Interstellar gas and dust is almost transparent to near infrared, so these signals can be seen from great distances. It also takes less energy to send information using infrared signals than with visible light.

Wright worked on an earlier SETI project at Lick Observatory as a University of California-Santa Cruz undergraduate, when she built an optical instrument designed by University of California-Berkeley researchers. The infrared project takes advantage of new technology not available for that first optical search.

Frank Drake, professor emeritus of astronomy and astrophysics at The University of California-Santa Cruz and director emeritus of the SETI Institute, said there are several additional advantages to a search in the infrared realm.

Frank Drake with his Drake Equation. Credit Frank Drake.

“The signals are so strong that we only need a small telescope to receive them. Smaller telescopes can offer more observational time, and that is good because we need to search many stars for a chance of success,” said Drake.

The only downside is that extraterrestrials would need to be transmitting their signals in our direction, Drake said, though he sees this as a positive side to that limitation. “If we get a signal from someone who’s aiming for us, it could mean there’s altruism in the universe. I like that idea. If they want to be friendly, that’s who we will find.”

Scientists have searched the skies for radio signals for more than 50 years and expanded their search into the optical realm more than a decade ago. The idea of searching in the infrared is not a new one, but instruments capable of capturing pulses of infrared light only recently became available.

“We had to wait,” Wright said. “I spent eight years waiting and watching as new technology emerged.”

Now that technology has caught up, the search will extend to stars thousands of light years away, rather than just hundreds. NIROSETI, or Near-Infrared Optical Search for Extraterrestrial Intelligence, could also uncover new information about the physical universe.

## From “Penn Today” At The University of Pennsylvania: “The hidden costs of AI – Impending energy and resource strain”

From “Penn Today” at The University of Pennsylvania

3.8.23
Nathi Magubane

In recent years, artificial intelligence (AI) models like ChatGPT have seen notable improvements, prompting concern about the societal impacts these new technologies may bring, including looming increases in energy and raw-material demands. (Image: iStock)

Deep Jariwala and Benjamin C. Lee on the energy and resource problems AI computing could bring.

New technologies like rapidly advancing deep learning models have led to increasingly sophisticated artificial intelligence (AI). With promises ranging from autonomous vehicles—land, air, and seafaring—to highly specialized information retrieval and creation like ChatGPT, the possibilities seem boundless. Yet potential pitfalls exist, from job displacement and privacy issues to materials and energy concerns.

Every operation a computer performs corresponds to electrical signals that travel through its hardware and consume power. The School of Engineering and Applied Science’s Deep Jariwala, assistant professor of electrical and systems engineering, and Benjamin C. Lee, professor of electrical and systems engineering and computer and information science, spoke with Penn Today about the impact an increasing AI computation reliance will have as infrastructure develops to facilitate its ever-growing needs.

What sets AI and its current applications apart from other iterations of computing?

Jariwala: It’s a totally new paradigm in terms of function. Think back to the very first computer, the Electrical Numerical Integrator and Computer (ENIAC) we have here at Penn. It was built to do math that would take too long for humans to calculate by hand and was mostly used for calculating ballistics trajectories, so it had an underlying logic that was straightforward: addition, subtraction, multiplication, and division of, say, 10-digit numbers that were manually input.

Lee: Computing for AI has three main pieces. One is data pre-processing, which means organizing a large dataset before you can do anything with it. This may involve labeling the data or cleaning it up, but basically you’re just trying to create some structure in it.

Once preprocessed, you can start to ‘train’ the AI; this is like teaching it how to interpret the data. Next, we can do what we call AI inference, which is running the model in response to user queries.

Jariwala: With AI it’s less about crunching raw numbers and more about using complex algorithms and machine learning to train and adapt it to new information or situations. It goes beyond manually entering a value, as it can draw information from larger datasets, like the internet.

This ability to gather data from different places, use probabilistic models to weigh relevance to the task at hand, integrate that information, and then provide an output that uncannily resembles that of a human in many instances is what sets it apart from traditional computing. Large language models, like ChatGPT, showcase this new set of operations when you ask it a question and it cobbles together a specific answer. It takes the basic premise of a search engine but kicks it up a gear.

What concerns do you have about these changes to the nature of computation?

Lee: As AI products like ChatGPT and Bing become more popular, the nature of computing is becoming more inference based. This is a slight departure from the machine-learning models that were popular a few years ago, like DeepMind’s AlphaGo—the machine trained to be the best Go player—where the herculean effort was training the model and eventually demonstrating a novel capability. Now massive AI models are being embedded into day-to-day operations like running a search, and that comes with trade-offs.

What are the material and resource costs associated with AI?

Jariwala: We take it for granted, but all the tasks our machines perform are transactions between memory and processors and each of these transactions requires energy. As these tasks become more elaborate and data-intensive, two things begin to scale up exponentially: the need for more memory storage and the need for more energy.

Regarding memory, an estimate from the Semiconductor Research Corporation, a consortium of all the major semiconductor companies, posits that if we continue to scale data at this rate, which is stored on memory made from silicon, we will outpace the global amount of silicon produced every year. So, pretty soon we will hit a wall where our silicon supply chains won’t be able to keep up with the amount of data being generated.

Couple this with the fact that our computers currently consume roughly 20-25% of the global energy supply, and we see another cause for concern. If we continue at this rate, by 2040 all the power we produce will be needed just for computing, further exacerbating the current energy crisis.

Lee: There is also concern about the operational carbon emissions from computation. So even before products like ChatGPT started getting a lot of attention, the rise of AI led to significant growth in data centers, facilities dedicated to housing IT infrastructure for data processing, management, and storage.

And companies like Amazon and Google and Meta have been building more and more of these massive facilities all over the country. In fact, data center power draw and the carbon emissions associated with data centers doubled between 2017 and 2020. Each facility consumes on the order of 20 to 40 megawatts of power, and most of the time data centers are running at 100% utilization, meaning all the processors are kept busy with some work. So a 20-megawatt facility probably draws 20 megawatts fairly consistently—enough to power roughly 16,000 households—computing as much as it can to amortize the costs of the data center, its servers, and power delivery systems.
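The households comparison checks out arithmetically if one assumes an average household draw of about 1.25 kilowatts (roughly 11,000 kWh per year); that per-household figure is an illustrative assumption, not one stated in the interview:

```python
# Sanity-check the data-center-to-households comparison above.
facility_watts = 20_000_000      # a 20 MW data center, drawn continuously
household_watts = 1_250          # assumed average household draw (1.25 kW)

households_powered = facility_watts // household_watts
print(households_powered)        # → 16000
```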

And then there’s the embodied carbon footprint, which is associated with construction and manufacturing. This hearkens back to building new semiconductor foundries and packaging all the chips we’ll need to produce to keep up with increasing compute demand. These processes in and of themselves are extremely energy-intensive, expensive and have a carbon impact at each step.

A data center in Silicon Valley. (Image: iStock)

What role do these data centers play, and why are more of them needed?

Lee: Data centers offer economies of scale. In the past, a lot of businesses would build their own facilities, which meant they’d have to pay for construction, IT equipment, server room management, etc. So nowadays, it’s much easier to just ‘rent’ space from Amazon Web Services. It’s why cloud computing has taken off in the last decade.

And in recent years, the general-purpose processors that have been prevalent in data centers since the early ’90s started being supplanted by specialized processors to meet the demands of modern computing.

Why is that, and how have computer architects responded to this constraint?

Lee: Tying back to scaling, two observations have had profound effects on computer processor architecture: Moore’s law and Dennard scaling.

Moore’s law states that the number of transistors on a chip—the parts that control the flow of electrons on a semiconductor material—doubles every two or so years, and it has historically set the cadence for developing smaller, faster chips. And Dennard scaling suggests that as transistors shrink, their power density stays roughly constant, so smaller chips were also more energy-efficient chips.
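The doubling cadence Lee describes compounds quickly; a two-line sketch of the arithmetic implied by the statement (purely the compounding, not a forecast):

```python
# Moore's law as stated: transistor counts double roughly every two years.
def moore_multiplier(years, doubling_period=2.0):
    return 2 ** (years / doubling_period)

print(moore_multiplier(10))   # 5 doublings over a decade -> 32x
print(moore_multiplier(20))   # 10 doublings over two decades -> 1024x
```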

In the last decade, these effects have started to slow down for several reasons related to the physical limits of the materials we use. This waning effect put the onus on architects to develop new ways to stay at the bleeding edge.

General-purpose processors just weren’t fast enough at running several complex calculations at the same time, so computer architects started looking at alternative designs, which is why graphics processing units (GPUs) got a second look.

GPUs are particularly good at doing the sort of complex calculations essential for machine learning algorithms. These tend to be linear algebra centric, like multiplying large matrices and adding complex vectors, so GPUs have significantly changed the landscape of computer architecture, leading to the creation of what we call domain-specific accelerators, pieces of hardware tailored to a particular application.

Accelerators are much more energy efficient because they’re custom-made for a specific type of computation, and they also provide much better performance. So modern data centers are far more diverse than what you would have had 10 to 15 years ago. However, with that diversity come new costs, because we need new engineers to build and design these custom pieces of hardware.

What other hardware changes are we likely to see to accommodate new systems?

Jariwala: As I mentioned, each computational task is a transaction between memory and processing that requires some energy, so our lab, in conjunction with Troy Olsson’s lab, is trying to figure out ways to make each operation use fewer watts of power. One way to reduce this metric is to tightly integrate memory and processing units: these currently sit in two separate locations, millimeters to centimeters apart, so electricity must travel great distances to facilitate computation, which makes it energy- and time-inefficient.

It’s a bit like making a high-rise mall, where you save space and energy and reduce travel time by allowing people to use the elevators instead of having them walk to different locations like they would in a single-story strip mall. We call it vertically heterogenous-integrated architecture, and developing this is key to reducing energy consumption.

But effectively integrating memory and processing comes with its own challenges because they do inherently different things that you wouldn’t want interfering with one another. So, these are the problems people like my colleagues and me aim to work around. We’re trying to look for new types of materials that can facilitate designs for making energy-efficient memory devices that we can stack onto processors.

Do you have any closing thoughts?

Jariwala: By now, it should be clear that we have an 800-pound gorilla in the room; our computers and other devices are becoming insatiable energy beasts that we continue to feed. That’s not to say AI and advancing it needs to stop because it’s incredibly useful for important applications like accelerating the discovery of therapeutics. We just need to remain cognizant of the effects and keep pushing for more sustainable approaches to design, manufacturing, and consumption.



Stem Education Coalition

Academic life at University of Pennsylvania is unparalleled, with 100 countries and every U.S. state represented in one of the Ivy League’s most diverse student bodies. Consistently ranked among the top 10 universities in the country, Penn enrolls 10,000 undergraduate students and welcomes an additional 10,000 students to our world-renowned graduate and professional schools.

Penn’s award-winning educators and scholars encourage students to pursue inquiry and discovery, follow their passions, and address the world’s most challenging problems through an interdisciplinary approach.

The University of Pennsylvania is a private Ivy League research university in Philadelphia, Pennsylvania. The university claims a founding date of 1740 and is one of the nine colonial colleges chartered prior to the U.S. Declaration of Independence. Benjamin Franklin, Penn’s founder and first president, advocated an educational program that trained leaders in commerce, government, and public service, similar to a modern liberal arts curriculum.

Penn has four undergraduate schools as well as twelve graduate and professional schools. Schools enrolling undergraduates include the College of Arts and Sciences; the School of Engineering and Applied Science; the Wharton School; and the School of Nursing. Penn’s “One University Policy” allows students to enroll in classes in any of Penn’s twelve schools. Among its highly ranked graduate and professional schools are a law school whose first professor wrote the first draft of the United States Constitution, the first school of medicine in North America (Perelman School of Medicine, 1765), and the first collegiate business school (Wharton School, 1881).

Penn is also home to the first “student union” building and organization (Houston Hall, 1896), the first Catholic student club in North America (Newman Center, 1893), the first double-decker college football stadium (Franklin Field, 1924 when second deck was constructed), and Morris Arboretum, the official arboretum of the Commonwealth of Pennsylvania. The first general-purpose electronic computer (ENIAC) was developed at Penn and formally dedicated in 1946. In 2019, the university had an endowment of \$14.65 billion, the sixth-largest endowment of all universities in the United States, as well as a research budget of \$1.02 billion. The university’s athletics program, the Quakers, fields varsity teams in 33 sports as a member of the NCAA Division I Ivy League conference.

As of 2018, distinguished alumni and/or Trustees include three U.S. Supreme Court justices; 32 U.S. senators; 46 U.S. governors; 163 members of the U.S. House of Representatives; eight signers of the Declaration of Independence and seven signers of the U.S. Constitution (four of whom signed both representing two-thirds of the six people who signed both); 24 members of the Continental Congress; 14 foreign heads of state and two presidents of the United States, including Donald Trump. As of October 2019, 36 Nobel laureates; 80 members of the American Academy of Arts and Sciences; 64 billionaires; 29 Rhodes Scholars; 15 Marshall Scholars and 16 Pulitzer Prize winners have been affiliated with the university.

History

The University of Pennsylvania considers itself the fourth-oldest institution of higher education in the United States, though this is contested by Princeton University and Columbia University. The university also considers itself the first university in the United States with both undergraduate and graduate studies.

In 1740, a group of Philadelphians joined together to erect a great preaching hall for the traveling evangelist George Whitefield, who toured the American colonies delivering open-air sermons. The building was designed and built by Edmund Woolley and was the largest building in the city at the time, drawing thousands of people the first time Whitefield preached in it. It was initially planned to serve as a charity school as well, but a lack of funds forced plans for the chapel and school to be suspended. According to Franklin’s autobiography, it was in 1743 that he first had the idea to establish an academy, “thinking the Rev. Richard Peters a fit person to superintend such an institution”. However, Peters declined a casual inquiry from Franklin and nothing further was done for another six years. In the fall of 1749, now more eager to create a school to educate future generations, Benjamin Franklin circulated a pamphlet titled Proposals Relating to the Education of Youth in Pensilvania, his vision for what he called a “Public Academy of Philadelphia”. Unlike the other colonial colleges that existed in 1749—Harvard University, William & Mary, Yale University, and the College of New Jersey—Franklin’s new school would not focus merely on education for the clergy. He advocated an innovative concept of higher education, one which would teach both the ornamental knowledge of the arts and the practical skills necessary for making a living and doing public service. The proposed program of study could have become the nation’s first modern liberal arts curriculum, although it was never implemented because Anglican priest William Smith (1727–1803), who became the first provost, and other trustees strongly preferred the traditional curriculum.

Franklin assembled a board of trustees from among the leading citizens of Philadelphia, the first such non-sectarian board in America. At the first meeting of the 24 members of the board of trustees on November 13, 1749, the issue of where to locate the school was a prime concern. Although a lot across Sixth Street from the old Pennsylvania State House (later renamed and famously known since 1776 as “Independence Hall”), was offered without cost by James Logan, its owner, the trustees realized that the building erected in 1740, which was still vacant, would be an even better site. The original sponsors of the dormant building still owed considerable construction debts and asked Franklin’s group to assume their debts and, accordingly, their inactive trusts. On February 1, 1750, the new board took over the building and trusts of the old board. On August 13, 1751, the “Academy of Philadelphia”, using the great hall at 4th and Arch Streets, took in its first secondary students. A charity school also was chartered on July 13, 1753 by the intentions of the original “New Building” donors, although it lasted only a few years. On June 16, 1755, the “College of Philadelphia” was chartered, paving the way for the addition of undergraduate instruction. All three schools shared the same board of trustees and were considered to be part of the same institution. The first commencement exercises were held on May 17, 1757.

The institution of higher learning was known as the College of Philadelphia from 1755 to 1779. In 1779, not trusting then-provost the Reverend William Smith’s “Loyalist” tendencies, the revolutionary State Legislature created a University of the State of Pennsylvania. The result was a schism, with Smith continuing to operate an attenuated version of the College of Philadelphia. In 1791, the legislature issued a new charter, merging the two institutions into a new University of Pennsylvania with twelve men from each institution on the new board of trustees.

Penn has three claims to being the first university in the United States, according to university archives director Mark Frazier Lloyd: the 1765 founding of the first medical school in America made Penn the first institution to offer both “undergraduate” and professional education; the 1779 charter made it the first American institution of higher learning to take the name of “University”; and existing colleges were established as seminaries (although, as detailed earlier, Penn adopted a traditional seminary curriculum as well).

After being located in downtown Philadelphia for more than a century, the campus was moved across the Schuylkill River to property purchased from the Blockley Almshouse in West Philadelphia in 1872, where it has since remained in an area now known as University City. Although Penn began operating as an academy or secondary school in 1751 and obtained its collegiate charter in 1755, it initially designated 1750 as its founding date; this is the year that appears on the first iteration of the university seal. Sometime later in its early history, Penn began to consider 1749 as its founding date and this year was referenced for over a century, including at the centennial celebration in 1849. In 1899, the board of trustees voted to adjust the founding date earlier again, this time to 1740, the date of “the creation of the earliest of the many educational trusts the University has taken upon itself”. The board of trustees voted in response to a three-year campaign by Penn’s General Alumni Society to retroactively revise the university’s founding date to appear older than Princeton University, which had been chartered in 1746.

Research, innovations and discoveries

Penn is classified as an “R1” doctoral university: “Highest research activity.” Its economic impact on the Commonwealth of Pennsylvania for 2015 amounted to \$14.3 billion. Penn’s research expenditures in the 2018 fiscal year were \$1.442 billion, the fourth largest in the U.S. In fiscal year 2019 Penn received \$582.3 million in funding from the National Institutes of Health.

In line with its well-known interdisciplinary tradition, Penn’s research centers often span two or more disciplines. In the 2010–2011 academic year alone, five interdisciplinary research centers were created or substantially expanded; these include the Center for Healthcare Financing; the Center for Global Women’s Health at the Nursing School; the \$13 million Morris Arboretum’s Horticulture Center; the \$15 million Jay H. Baker Retailing Center at Wharton; and the \$13 million Translational Research Center at Penn Medicine. With these additions, Penn now counts 165 research centers hosting a research community of over 4,300 faculty and over 1,100 postdoctoral fellows, 5,500 academic support staff and graduate student trainees. To further assist the advancement of interdisciplinary research President Amy Gutmann established the “Penn Integrates Knowledge” title awarded to selected Penn professors “whose research and teaching exemplify the integration of knowledge”. These professors hold endowed professorships and joint appointments between Penn’s schools.

Penn is also among the most prolific producers of doctoral students. With 487 PhDs awarded in 2009, Penn ranks third in the Ivy League, only behind Columbia University and Cornell University (Harvard University did not report data). It also has one of the highest numbers of post-doctoral appointees (933 in number for 2004–2007), ranking third in the Ivy League (behind Harvard and Yale University) and tenth nationally.

In most disciplines Penn professors’ productivity is among the highest in the nation and first in the fields of epidemiology, business, communication studies, comparative literature, languages, information science, criminal justice and criminology, social sciences and sociology. According to the National Research Council nearly three-quarters of Penn’s 41 assessed programs were placed in ranges including the top 10 rankings in their fields, with more than half of these in ranges including the top five rankings in these fields.

Penn’s research tradition has historically been complemented by innovations that shaped higher education. In addition to establishing the first medical school, the first university teaching hospital, the first business school and the first student union, Penn was also the cradle of other significant developments. In 1852, Penn Law became the first law school in the nation to publish a law journal that is still in existence (then called The American Law Register, now the Penn Law Review, one of the most cited law journals in the world). Under the deanship of William Draper Lewis, the law school was also one of the first to emphasize legal teaching by full-time professors instead of practitioners, a system that is still followed today. The Wharton School was home to several pioneering developments in business education. It established the first research center in a business school in 1921 and the first center for entrepreneurship in 1973, and it regularly introduced novel curricula, for which BusinessWeek wrote, “Wharton is on the crest of a wave of reinvention and change in management education”.

Several major scientific discoveries have also taken place at Penn. The university is probably best known as the place where the first general-purpose electronic computer (ENIAC) was born in 1946 at the Moore School of Electrical Engineering.


It was here also where the world’s first spelling and grammar checkers were created, as well as the popular COBOL programming language. Penn can also boast some of the most important discoveries in the field of medicine. The dialysis machine used as an artificial replacement for lost kidney function was conceived and devised out of a pressure cooker by William Inouye while he was still a student at Penn Med; the rubella and hepatitis B vaccines were developed at Penn; and the discovery of cancer’s link with genes, cognitive therapy, Retin-A (the cream used to treat acne), resistin, the Philadelphia chromosome (linked to chronic myelogenous leukemia) and the technology behind PET scans are all credited to Penn Med researchers. More recent gene research has led to the discovery of the genes for fragile X syndrome, the most common form of inherited mental retardation; spinal and bulbar muscular atrophy, a disorder marked by progressive muscle wasting; and Charcot–Marie–Tooth disease, a progressive neurodegenerative disease that affects the hands, feet and limbs.

Conductive polymers were also developed at Penn by Alan J. Heeger, Alan MacDiarmid and Hideki Shirakawa, an invention that earned them the Nobel Prize in Chemistry. On faculty since 1965, Ralph L. Brinster developed the scientific basis for in vitro fertilization and the transgenic mouse at Penn and was awarded the National Medal of Science in 2010. The theory of superconductivity was also partly developed at Penn, by then-faculty member John Robert Schrieffer (along with John Bardeen and Leon Cooper). The university has also contributed major advancements in the fields of economics and management. Among the many discoveries are conjoint analysis, widely used as a predictive tool especially in market research; Simon Kuznets’s method of measuring Gross National Product; the Penn effect (the observation that consumer price levels in richer countries are systematically higher than in poorer ones) and the “Wharton Model” developed by Nobel laureate Lawrence Klein to measure and forecast economic activity. The idea behind Health Maintenance Organizations also belonged to Penn professor Robert Eilers, who put it into practice during then-President Nixon’s health reform in the 1970s.

International partnerships

Students can study abroad for a semester or a year at partner institutions such as the London School of Economics (UK), the University of Barcelona [Universitat de Barcelona] (ES), the Paris Institute of Political Studies [Institut d’études politiques de Paris] (FR), the University of Queensland (AU), University College London (UK), King’s College London (UK), the Hebrew University of Jerusalem (IL) and the University of Warwick (UK).

## From The College of Engineering At The University of Washington: “Q&A – University of Washington researcher discusses future of quantum research”

University of Washington Professor Kai-Mei Fu

From The College of Engineering at The University of Washington

2.8.23
Sarah McQuate
James Urton

In a world abuzz with smartphones, tablets, 5G and Siri, there are whispers of something new over the horizon — and it isn’t artificial intelligence!

A growing field of research seeks to develop technologies built directly on the seemingly strange and contradictory rules of quantum mechanics. These principles underlie the behavior of atoms and everything composed of atoms, including people. But these rules are only apparent at very small scales. Researchers across the globe are constructing rudimentary quantum computers, which could perform computational tasks that the “classical” computers in our pockets and on our desks simply could not.

To help transform these quantum whispers into a chorus, scientists at the University of Washington are pursuing multiple quantum research projects spanning from creating materials with never-before-seen physical properties to studying the “quantum bits” — or qubits (pronounced “kyu-bits”) — that make quantum computing possible.

With their research group in the Department of Physics and the Department of Electrical & Computer Engineering, UW Professor Kai-Mei Fu studies the quantum-level properties of crystalline materials for potential applications in electrical and optical quantum technologies. In addition, Fu, who is also a faculty member in the Molecular & Engineering Sciences Institute and the Institute for Nano-engineered Systems, has led efforts to develop a comprehensive graduate curriculum and provide internship opportunities in quantum sciences for students in fields ranging from computer science to chemistry — all toward the goal of forging a quantum-competent workforce.

UW News sat down with Fu to talk about the potential of quantum research, and why it’s so important.

To start with the basics: What does “quantum” actually mean?

Kai-Mei Fu: Originally, “quantum” just meant “discrete.” It referred to the observation that, at really small scales, something can exist only in discrete states. This is different from our everyday experiences. For example, if you start a car and then accelerate, the car “accesses” every speed. It can occupy any position. But when you get down to these really small systems, you start to see that every “position” may not be accessible. And similarly, every speed or energy state may not be accessible. Things are “quantized” at this level.

And that’s not the only weird thing that’s going on: At this small scale, not only do things exist in discrete states, but it is possible for things to exist in a combination of two or more different states at once. This is called “superposition,” and that is when the interesting physical phenomena occur.

How is superposition useful in developing quantum technology?

KMF: Well, let’s take quantum computing for example. In the information age of today, a computational “bit” can only exist in one of two possible states: 0 and 1. But with superposition, you could have a qubit that can exist in two different states at the same time. It’s not just that you don’t know which state it’s in. It really is coexisting in two different states. Thus it is possible to compute with many states, in fact exponentially many states, at the same time. With quantum computing and quantum information, the power is in being able to control that superposition.
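
The point about superposition and “exponentially many states” can be made concrete with a little linear algebra. The following is a minimal NumPy sketch (illustrative only, not tied to any particular quantum hardware or library; all variable names are invented here): a qubit’s state is a unit vector of two complex amplitudes, measurement probabilities are squared magnitudes, and describing n qubits classically takes 2**n amplitudes.

```python
import numpy as np

# A qubit's state is a unit vector of two complex amplitudes:
# |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Equal superposition: the qubit genuinely occupies both states at once.
plus = (ket0 + ket1) / np.sqrt(2)

# Measurement outcome probabilities are the squared amplitude magnitudes.
probs = np.abs(plus) ** 2  # approximately [0.5, 0.5]

# Tracking n qubits classically requires 2**n amplitudes -- the
# exponential state space the interview refers to.
n = 20
amplitudes_needed = 2 ** n  # over a million complex numbers for 20 qubits
```

Even this toy picture shows why classical simulation breaks down quickly: each added qubit doubles the amplitude count, which is exactly the resource a quantum computer would exploit natively.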

What are some exciting advancements or applications that could stem from controlling superposition?

KMF: There are four main areas of excitement. My favorite is probably quantum computation. It’s the one that’s furthest out technologically — right now, computation involving just a handful of qubits has been realized — but it’s kind of the big one.

We know that the power of quantum computation will be immense because superposition is scalable. This means that you would have so much more computational space to utilize, and you could perform computations that our classical computers would need the age of the universe to perform. So, we know that there’s a lot of power in quantum computing. But there’s also a lot of speculation in this field, and questions about how you can harness that power.

Does the University of Washington have a quantum computer?

KMF: It currently does not. We are gathering materials now to construct a quantum processor — the basis of a quantum computer — as part of our educational curriculum in this field.

Besides quantum computing, what other applications are there?

KMF: Another area is sensing for more precise measurements. One example: single-atom defects in crystals that can act as sensors. For my research, I work with atoms arranged into a perfect crystal, and then I create “defects” by adding in different types of atoms or taking out one atom in the lattice. The defect acts like an artificial atom, and it will react to tiny changes nearby, such as a change in a magnetic field. These changes are normally so small that they would be hard to measure at room temperature, but the artificial atom amplifies the changes into something I can see — sometimes even by eye. For example, some crystals will radiate light when I shine a laser on them. By measuring the light they emit, I can detect a change.

This is so special. I get super excited because we know that all these things are possible in theory, but we’ve just hit the timescale where we’re starting to see real technological applications right now.

That sounds really exciting!

KMF: Another area I’ll mention is quantum simulation. There are a lot of potential applications in this field, such as studying new energy storage systems or figuring out how to make an enzyme better at nitrogen fixation. Essentially these problems require making new materials, but these are complex quantum systems that are hard for classical computers to simulate or predict. But quantum simulation could, and this could be done using a type of quantum computer. The field is expecting a lot of advancement in materials and other areas from quantum simulation.

The final area is quantum communication. When you’re transmitting sensitive information, you can create a key to encrypt it. With quantum encryption you can distribute a key that’s so fundamentally secure that if you have an eavesdropper, they leave a “mark” behind that you can detect.
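
The eavesdropper’s telltale “mark” can be illustrated with a toy simulation in the spirit of the BB84 key-distribution protocol (a simplified sketch, not a real implementation; the function name and parameters are invented for illustration). An interceptor who guesses the measurement basis wrong disturbs the photons, so roughly a quarter of the bits the two parties later compare will disagree:

```python
import random

def sifted_error_rate(n_photons=2000, eavesdrop=False, seed=7):
    """Toy BB84-style sketch. The sender encodes random bits in random
    bases; the receiver measures in random bases, and both keep ("sift")
    only the bits where their bases matched. An eavesdropper measuring
    in a random basis re-sends disturbed photons, corrupting about 25%
    of the sifted bits."""
    rng = random.Random(seed)
    errors = sifted = 0
    for _ in range(n_photons):
        bit = rng.randint(0, 1)
        send_basis = rng.randint(0, 1)   # 0 = rectilinear, 1 = diagonal
        value, basis_in_flight = bit, send_basis
        if eavesdrop:
            eve_basis = rng.randint(0, 1)
            if eve_basis != send_basis:  # wrong basis: outcome is random
                value = rng.randint(0, 1)
            basis_in_flight = eve_basis  # photon is re-sent in Eve's basis
        recv_basis = rng.randint(0, 1)
        if recv_basis != basis_in_flight:
            value = rng.randint(0, 1)    # mismatched measurement: random
        if recv_basis == send_basis:     # bases compared over public channel
            sifted += 1
            if value != bit:
                errors += 1
    return errors / sifted
```

With no eavesdropper the sifted bits always agree (error rate 0.0); with one, the rate jumps to roughly 0.25. In real quantum key distribution the two parties compare a random sample of sifted bits over a public classical channel, and an elevated error rate is the detectable “mark” that tells them to discard the key.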

How big is the field of quantum communication? Is it happening now?

KMF: Well, in the past few years, quantum communication became a prominent topic in government when China demonstrated secure ground-to-satellite communication.

Let’s shift gears a little to talk about quantum in terms of workforce development. You have companies, national labs and universities all pursuing quantum research. Are there any specific challenges for quantum education?

KMF: What we are doing is crafting a common framework — a common language — for education in quantum. Quantum involves many fields, including chemistry, computer science, materials science, chemical engineering and theoretical physics. Historically these fields have all had their own approach, their own vocabulary, their own history. At the University of Washington, we’ve launched a core curriculum in quantum for graduate students who want to pursue careers in this field. Through the Northwest Quantum Nexus, we also have partners for internships: Microsoft, the DOE’s Pacific Northwest National Laboratory and The University of Washington.

We need more scientists in quantum because this is an exciting time. A lot is changing. There are many questions to answer, too many. Every field in quantum is growing in its own way. In the coming years, this is going to change a lot about how we approach problems — in communication, in software, in medicine and in materials. It will be beyond what we can think about even today.



Mission, Facts, and Stats
Our mission is to develop outstanding engineers and ideas that change the world.

Faculty:
275 faculty (25.2% women)
Achievements:

128 NSF Young Investigator/Early Career Awards since 1984
32 Sloan Foundation Research Awards
2 MacArthur Foundation Fellows (2007 and 2011)

A national leader in educating engineers, each year the College turns out new discoveries, inventions and top-flight graduates, all contributing to the strength of our economy and the vitality of our community.

Engineering innovation

Engineers drive the innovation economy and are vital to solving society’s most challenging problems. The College of Engineering is a key part of a world-class research university in a thriving hub of aerospace, biotechnology, global health and information technology innovation. Over 50% of The University of Washington startups in FY18 came from the College of Engineering.

Commitment to diversity and access

The College of Engineering is committed to developing and supporting a diverse student body and faculty that reflect and elevate the populations we serve. We are a national leader in women in engineering; 25.5% of our faculty are women compared to 17.4% nationally. We offer a robust set of diversity programs for students and faculty.
Research and commercialization

The University of Washington is an engine of economic growth, today ranked third in the nation for the number of startups launched each year, with 65 companies having been started in the last five years alone by UW students and faculty, or with technology developed here. The College of Engineering is a key contributor to these innovations, and engineering faculty, students or technology are behind half of all UW startups. In FY19, UW received \$1.58 billion in total research awards from federal and nonfederal sources.

The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in the Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and, in the process, transform lives and our world.

So, what defines us —the students, faculty and community members at The University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

The University of Washington is a public research university in Seattle, Washington, United States. Founded in 1861, The University of Washington is one of the oldest universities on the West Coast; it was established in downtown Seattle approximately a decade after the city’s founding to aid its economic development. Today, The University of Washington’s 703-acre main Seattle campus is in the University District above the Montlake Cut, within the urban Puget Sound region of the Pacific Northwest. The university has additional campuses in Tacoma and Bothell. Overall, The University of Washington encompasses over 500 buildings and over 20 million gross square feet of space, including one of the largest library systems in the world with more than 26 university libraries, as well as the UW Tower, lecture halls, art centers, museums, laboratories, stadiums, and conference centers. The University of Washington offers bachelor’s, master’s, and doctoral degrees through 140 departments in various colleges and schools, sees a total student enrollment of roughly 46,000 annually, and functions on a quarter system.

The University of Washington is a member of the Association of American Universities and is classified among “R1: Doctoral Universities – Very high research activity”. According to the National Science Foundation, UW spent \$1.41 billion on research and development in 2018, ranking it 5th in the nation. As the flagship institution of the six public universities in Washington state, it is known for its medical, engineering and scientific research as well as its highly competitive computer science and engineering programs. Additionally, The University of Washington continues to benefit from its deep historic ties and major collaborations with numerous technology giants in the region, such as Amazon, Boeing, Nintendo, and particularly Microsoft. Paul G. Allen, Bill Gates and others spent significant time at Washington computer labs for a startup venture before founding Microsoft and other ventures. The University of Washington’s 22 varsity sports teams are also highly competitive, competing as the Huskies in the Pac-12 Conference of the NCAA Division I, representing the United States at the Olympic Games, and other major competitions.

The University of Washington has been affiliated with many notable alumni and faculty, including 21 Nobel Prize laureates and numerous Pulitzer Prize winners, Fulbright Scholars, Rhodes Scholars and Marshall Scholars.

In 1854, territorial governor Isaac Stevens recommended the establishment of a university in the Washington Territory. Prominent Seattle-area residents, including Methodist preacher Daniel Bagley, saw this as a chance to add to the city’s potential and prestige. Bagley learned of a law that allowed United States territories to sell land to raise money in support of public schools. At the time, Arthur A. Denny, one of the founders of Seattle and a member of the territorial legislature, aimed to increase the city’s importance by moving the territory’s capital from Olympia to Seattle. However, Bagley eventually convinced Denny that the establishment of a university would assist more in the development of Seattle’s economy. Two universities were initially chartered, but later the decision was repealed in favor of a single university in Lewis County provided that locally donated land was available. When no site emerged, Denny successfully petitioned the legislature to reconsider Seattle as a location in 1858.

In 1861, scouting began for an appropriate 10-acre (4 ha) site in Seattle to serve as a new university campus. Arthur and Mary Denny donated eight acres, while fellow pioneers Edward Lander, and Charlie and Mary Terry, donated two acres on Denny’s Knoll in downtown Seattle. More specifically, this tract was bounded by 4th Avenue to the west, 6th Avenue to the east, Union Street to the north, and Seneca Street to the south.

John Pike, for whom Pike Street is named, was the university’s architect and builder. It was opened on November 4, 1861, as the Territorial University of Washington. The legislature passed articles incorporating the university and establishing its Board of Regents in 1862. The school initially struggled, closing three times: in 1863 for low enrollment, and again in 1867 and 1876 due to funding shortages. The University of Washington graduated its first student, Clara Antoinette McCarty Wilt, in 1876 with a bachelor’s degree in science.

19th century relocation

By the time Washington state entered the Union in 1889, both Seattle and The University of Washington had grown substantially. The University of Washington’s total undergraduate enrollment increased from 30 to nearly 300 students, and the campus’s relative isolation in downtown Seattle faced encroaching development. A special legislative committee, headed by The University of Washington graduate Edmond Meany, was created to find a new campus to better serve the growing student population and faculty. The committee eventually selected a site on the northeast of downtown Seattle called Union Bay, which was the land of the Duwamish, and the legislature appropriated funds for its purchase and construction. In 1895, The University of Washington relocated to the new campus by moving into the newly built Denny Hall. The University of Washington Regents tried and failed to sell the old campus, eventually settling on leasing the area. This would later become one of the university’s most valuable pieces of real estate in modern-day Seattle, generating millions in annual revenue with what is now called the Metropolitan Tract. The original Territorial University building was torn down in 1908, and its former site now houses the Fairmont Olympic Hotel.

The sole surviving remnants of The University of Washington’s first building are four 24-foot (7.3 m), white, hand-fluted cedar, Ionic columns. They were salvaged by Edmond S. Meany, one of The University of Washington’s first graduates and former head of its history department. Meany and his colleague, Dean Herbert T. Condon, dubbed the columns “Loyalty,” “Industry,” “Faith” and “Efficiency” — or “LIFE.” The columns now stand in the Sylvan Grove Theater.

20th century expansion

Organizers of the 1909 Alaska-Yukon-Pacific Exposition eyed the still largely undeveloped campus as a prime setting for their world’s fair. They came to an agreement with The University of Washington ‘s Board of Regents that allowed them to use the campus grounds for the exposition, surrounding today’s Drumheller Fountain facing towards Mount Rainier. In exchange, organizers agreed Washington would take over the campus and its development after the fair’s conclusion. This arrangement led to a detailed site plan and several new buildings, prepared in part by John Charles Olmsted. The plan was later incorporated into the overall University of Washington campus master plan, permanently affecting the campus layout.

Both World Wars brought the military to campus, with certain facilities temporarily lent to the federal government. In spite of this, subsequent post-war periods were times of dramatic growth for The University of Washington. The period between the wars saw a significant expansion of the upper campus. Construction of the Liberal Arts Quadrangle, known to students as “The Quad,” began in 1916 and continued to 1939. The University’s architectural centerpiece, Suzzallo Library, was built in 1926 and expanded in 1935.

After World War II, further growth came with the G.I. Bill. Among the most important developments of this period was the opening of the School of Medicine in 1946, which is now consistently ranked among the top medical schools in the United States. It would eventually lead to The University of Washington Medical Center, ranked by U.S. News and World Report as one of the top ten hospitals in the nation.

In 1942, all persons of Japanese ancestry in the Seattle area were forced into inland internment camps as part of Executive Order 9066 following the attack on Pearl Harbor. During this difficult time, university president Lee Paul Sieg took an active and sympathetic leadership role in advocating for and facilitating the transfer of Japanese American students to universities and colleges away from the Pacific Coast to help them avoid the mass incarceration. Nevertheless, many Japanese American students and “soon-to-be” graduates were unable to transfer successfully in the short time window or receive diplomas before being incarcerated. It was only many years later that they would be recognized for their accomplishments during The University of Washington’s Long Journey Home ceremonial event that was held in May 2008.

From 1958 to 1973, The University of Washington saw a tremendous growth in student enrollment, its faculties and operating budget, and also its prestige under the leadership of Charles Odegaard. The University of Washington student enrollment had more than doubled to 34,000 as the baby boom generation came of age. However, this era was also marked by high levels of student activism, as was the case at many American universities. Much of the unrest focused around civil rights and opposition to the Vietnam War. In response to anti-Vietnam War protests by the late 1960s, the University Safety and Security Division became The University of Washington Police Department.

Odegaard instituted a vision of building a “community of scholars”, convincing the Washington State Legislature to increase investment in The University of Washington. Washington’s U.S. senators, such as Henry M. Jackson and Warren G. Magnuson, also used their political clout to gather research funds for the University of Washington. The results included an increase in the operating budget from \$37 million in 1958 to over \$400 million in 1973, solidifying The University of Washington as a top recipient of federal research funds in the United States. The growth of technology giants such as Microsoft, Boeing and Amazon in the local area also proved highly influential in the University of Washington’s fortunes, not only improving graduate prospects but also helping the university, through its distinguished faculty and extensive alumni network, attract millions of dollars in university and research funding.

21st century

In 1990, The University of Washington opened additional campuses in Bothell and Tacoma. Although originally intended for students who had already completed two years of higher education, both schools have since become four-year universities with the authority to grant degrees. The first freshman classes at these campuses started in fall 2006. Today both Bothell and Tacoma also offer a selection of master’s degree programs.

In 2012, The University of Washington began exploring plans and governmental approval to expand the main Seattle campus, including significant increases in student housing, teaching facilities for the growing student body and faculty, as well as expanded public transit options. The University of Washington light rail station was completed in March 2015, connecting Seattle’s Capitol Hill neighborhood to The University of Washington Husky Stadium in about five minutes of rail travel. It offers a previously unavailable transportation option into and out of the campus, designed specifically to reduce dependence on private vehicles, bicycles and local King County buses.

The University of Washington has been listed as a “Public Ivy” in Greene’s Guides since 2001, and is an elected member of the Association of American Universities. Among the faculty by 2012, there had been 151 members of the American Association for the Advancement of Science, 68 members of the National Academy of Sciences (US), 67 members of the American Academy of Arts and Sciences, 53 members of the National Academy of Medicine, 29 winners of the Presidential Early Career Award for Scientists and Engineers, 21 members of the National Academy of Engineering, 15 Howard Hughes Medical Institute Investigators, 15 MacArthur Fellows, 9 winners of the Gairdner Foundation International Award, 5 winners of the National Medal of Science, 7 Nobel Prize laureates, 5 winners of the Albert Lasker Award for Clinical Medical Research, 4 members of the American Philosophical Society, 2 winners of the National Book Award, 2 winners of the National Medal of Arts, 2 Pulitzer Prize winners, 1 winner of the Fields Medal, and 1 member of the National Academy of Public Administration. Among The University of Washington students by 2012, there were 136 Fulbright Scholars, 35 Rhodes Scholars, 7 Marshall Scholars and 4 Gates Cambridge Scholars. UW is recognized as a top producer of Fulbright Scholars, ranking 2nd in the US in 2017.

The Academic Ranking of World Universities has consistently ranked The University of Washington as one of the top 20 universities worldwide every year since its first release. In 2019, The University of Washington ranked 14th worldwide out of 500 by the ARWU, 26th worldwide out of 981 in the Times Higher Education World University Rankings, and 28th worldwide out of 101 in the Times World Reputation Rankings. Meanwhile, QS World University Rankings ranked it 68th worldwide, out of over 900.

U.S. News & World Report ranked The University of Washington 8th out of nearly 1,500 universities worldwide for 2021, with The University of Washington’s undergraduate program tied for 58th among 389 national universities in the U.S. and tied for 19th among 209 public universities.

In 2019, it ranked 10th among the universities around the world by SCImago Institutions Rankings. In 2017, the Leiden Ranking, which focuses on science and the impact of scientific publications among the world’s 500 major universities, ranked The University of Washington 12th globally and 5th in the U.S.

In 2019, Kiplinger Magazine’s review of “top college values” named University of Washington 5th for in-state students and 10th for out-of-state students among U.S. public colleges, and 84th overall out of 500 schools. In the Washington Monthly National University Rankings The University of Washington was ranked 15th domestically in 2018, based on its contribution to the public good as measured by social mobility, research, and promoting public service.

## From The Massachusetts Institute of Technology: “Scientists boost quantum signals while reducing noise”

From The Massachusetts Institute of Technology

2.9.23

This superconducting parametric amplifier can achieve quantum squeezing over much broader bandwidths than other designs, which could lead to faster and more accurate quantum measurements. Courtesy of the researchers.

A certain amount of noise is inherent in any quantum system. For instance, when researchers want to read information from a quantum computer, which harnesses quantum mechanical phenomena to solve certain problems too complex for classical computers, the same quantum mechanics also imparts a minimum level of unavoidable error that limits the accuracy of the measurements.

Scientists can effectively get around this limitation by using “parametric” amplification to “squeeze” the noise, a quantum phenomenon that decreases the noise affecting one variable while increasing the noise that affects its conjugate partner. While the total amount of noise remains the same, it is effectively redistributed. Researchers can then make more accurate measurements by looking only at the lower-noise variable.

A team of researchers from MIT and elsewhere has now developed a new superconducting parametric amplifier that operates with the gain of previous narrowband squeezers while achieving quantum squeezing over much larger bandwidths. Their work is the first to demonstrate squeezing over a broad frequency bandwidth of up to 1.75 gigahertz while maintaining a high degree of squeezing (selective noise reduction). In comparison, previous microwave parametric amplifiers generally achieved bandwidths of only 100 megahertz or less.

This new broadband device may enable scientists to read out quantum information much more efficiently, leading to faster and more accurate quantum systems. By reducing the error in measurements, this architecture could be utilized in multiqubit systems or other metrological applications that demand extreme precision.

“As the field of quantum computing grows, and the number of qubits in these systems increases to thousands or more, we will need broadband amplification. With our architecture, with just one amplifier you could theoretically read out thousands of qubits at the same time,” says electrical engineering and computer science graduate student Jack Qiu, who is a member of the Engineering Quantum Systems Group and lead author of the paper detailing this advance.

The senior authors are William D. Oliver, the Henry Ellis Warren professor of electrical engineering and computer science and of physics, director of the Center for Quantum Engineering, and associate director of the Research Laboratory of Electronics; and Kevin P. O’Brien, the Emanuel E. Landsman Career Development professor of electrical engineering and computer science. The paper appears today in Nature Physics [below].

Squeezing noise below the standard quantum limit

Superconducting quantum circuits, like quantum bits or “qubits,” process and transfer information in quantum systems. This information is carried by microwave electromagnetic signals comprising photons. But these signals can be extremely weak, so researchers use amplifiers to boost the signal level such that clean measurements can be made.

However, a quantum property known as the Heisenberg Uncertainty Principle requires that a minimum amount of noise be added during the amplification process, leading to the “standard quantum limit” of background noise. Fortunately, a special device, called a Josephson parametric amplifier, can reduce the added noise by “squeezing” it below this fundamental limit, effectively redistributing it elsewhere.

Quantum information is represented in the conjugate variables, for example, the amplitude and phase of electromagnetic waves. However, in many instances, researchers need only measure one of these variables — the amplitude or the phase — to determine the quantum state of the system. In these instances, they can “squeeze the noise,” lowering it for one variable, say amplitude, while raising it for the other, in this case phase. The total amount of noise stays the same due to Heisenberg’s Uncertainty Principle, but its distribution can be shaped in such a way that less noisy measurements are possible on one of the variables.
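This trade-off can be written compactly. In one standard quantum-optics convention (an assumption here, not notation used in the article), the field quadratures $X$ and $P$ each have vacuum variance $1/4$, and a squeezing parameter $r$ reshapes the noise while leaving the uncertainty product at its minimum:

```latex
\Delta X \,\Delta P \;\ge\; \tfrac{1}{4},
\qquad
(\Delta X)^2 = \tfrac{1}{4}\,e^{-2r},
\qquad
(\Delta P)^2 = \tfrac{1}{4}\,e^{2r},
\qquad
(\Delta X)(\Delta P) = \tfrac{1}{4}.
```

Measuring only the squeezed quadrature $X$ then yields noise below the vacuum (standard quantum) level by a factor of $e^{-2r}$, at the cost of extra noise in $P$.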

A conventional Josephson parametric amplifier is resonator-based: It’s like an echo chamber with a superconducting nonlinear element called a Josephson junction in the middle. Photons enter the echo chamber and bounce around to interact with the same Josephson junction multiple times. In this environment, the system nonlinearity — realized by the Josephson junction — is enhanced and leads to parametric amplification and squeezing. But, since the photons traverse the same Josephson junction many times before exiting, the junction is stressed. As a result, both the bandwidth and the maximum signal the resonator-based amplifier can accommodate are limited.

The MIT researchers took a different approach. Instead of embedding a single or a few Josephson junctions inside a resonator, they chained more than 3,000 junctions together, creating what is known as a Josephson traveling-wave parametric amplifier. Photons interact with each other as they travel from junction to junction, resulting in noise squeezing without stressing any single junction.

Their traveling-wave system can tolerate much higher-power signals than resonator-based Josephson amplifiers without the bandwidth constraint of the resonator, leading to broadband amplification and high levels of squeezing, Qiu says.

“You can think of this system as a really long optical fiber, another type of distributed nonlinear parametric amplifier. And, we can push to 10,000 junctions or more. This is an extensible system, as opposed to the resonant architecture,” he says.

Nearly noiseless amplification

A pair of pump photons enters the device, serving as the energy source. Researchers can tune the frequency of photons coming from each pump to generate squeezing at the desired signal frequency. For instance, if they want to squeeze a 6-gigahertz signal, they would adjust the pumps to send photons at 5 and 7 gigahertz, respectively. When the pump photons interact inside the device, they combine to produce an amplified signal with a frequency right in the middle of the two pumps. This is a special process of a more generic phenomenon called nonlinear wave mixing.
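The frequency relation described above follows from energy conservation in the two-pump mixing process. A minimal sketch (the function name and units are illustrative, not code from the paper):

```python
def squeezed_signal_frequency(f_pump1_ghz: float, f_pump2_ghz: float) -> float:
    """Center frequency for two-pump parametric squeezing, in GHz.

    Energy conservation in the nonlinear wave-mixing process gives
    f_signal + f_idler = f_pump1 + f_pump2; phase-sensitive squeezing
    occurs at the midpoint, where the signal and idler frequencies coincide.
    """
    return (f_pump1_ghz + f_pump2_ghz) / 2


# The article's example: pumps at 5 GHz and 7 GHz squeeze a 6 GHz signal.
print(squeezed_signal_frequency(5.0, 7.0))  # → 6.0
```

Detuning the two pumps symmetrically about the same midpoint leaves the squeezed frequency unchanged, which is what lets researchers place the squeezing band wherever the signal of interest lies.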

“Squeezing of the noise results from a two-photon quantum interference effect that arises during the parametric process,” he explains.

This architecture enabled them to reduce the noise power by a factor of 10 below the fundamental quantum limit while operating with 3.5 gigahertz of amplification bandwidth, a frequency range almost two orders of magnitude larger than that of previous devices.
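Squeezing is conventionally quoted in decibels, so a factor-of-10 reduction in noise power corresponds to 10 dB of squeezing. A small conversion helper (hypothetical, not code from the paper):

```python
import math


def squeezing_db(noise_power_reduction_factor: float) -> float:
    """Convert a noise-power reduction factor below the standard
    quantum limit into a squeezing level in decibels."""
    return 10 * math.log10(noise_power_reduction_factor)


# The reported factor-of-10 noise reduction equals 10 dB of squeezing.
print(squeezing_db(10))  # → 10.0
```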

Their device also demonstrates broadband generation of entangled photon pairs, which could enable researchers to read out quantum information more efficiently with a much higher signal-to-noise ratio, Qiu says.

While Qiu and his collaborators are excited by these results, he says there is still room for improvement. The materials they used to fabricate the amplifier introduce some microwave loss, which can reduce performance. Moving forward, they are exploring different fabrication methods that could improve the insertion loss.

“This work is not meant to be a standalone project. It has tremendous potential if you apply it to other quantum systems — to interface with a qubit system to enhance the readout, or to entangle qubits, or extend the device operating frequency range to be utilized in dark matter detection and improve its detection efficiency. This is essentially like a blueprint for future work,” he says.

Additional co-authors include Arne Grimsmo, senior lecturer at the University of Sydney; Kaidong Peng, an EECS graduate student in the Quantum Coherent Electronics Group at MIT; Bharath Kannan, PhD ’22, CEO of Atlantic Quantum; Benjamin Lienhard PhD ’21, a postdoc at Princeton University; Youngkyu Sung, an EECS grad student at MIT; Philip Krantz, an MIT postdoc; Vladimir Bolkhovsky, Greg Calusine, David Kim, Alex Melville, Bethany Niedzielski, Jonilyn Yoder, and Mollie Schwartz, members of the technical staff at MIT Lincoln Laboratory; Terry Orlando, professor of electrical engineering at MIT and a member of RLE; Irfan Siddiqi, a professor of physics at the University of California-Berkeley; and Simon Gustavsson, a principal research scientist in the Engineering Quantum Systems group at MIT.

This work was funded, in part, by the NTT Physics and Informatics Laboratories and the Office of the Director of National Intelligence IARPA program.

Nature Physics




The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and the Whitehead Institute.


Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities.

Foundation and vision

In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but rather an institution combining elements of both professional and liberal education, proposing that:

“The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

Early developments

Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded under the Morrill Land-Grant Colleges Act, which funded institutions “to promote the liberal and practical education of the industrial classes”, making MIT a land-grant school. In 1863, under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated \$20 million (\$236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

Curricular reforms

In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

The Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at the Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, The Massachusetts Institute of Technology had become the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4,000 in the Radiation Laboratory alone and receiving in excess of \$100 million (\$1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after the war. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and The Massachusetts Institute of Technology’s defense research. In this period, Massachusetts Institute of Technology’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam, as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969, during a meeting of faculty members and students seeking to shift the emphasis from military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time, and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

Recent history

The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend \$240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be \$1 billion upon completion.

The Caltech/MIT Advanced LIGO (aLIGO) was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and funded by the National Science Foundation.

It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in Physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique that served as the essential blueprint for LIGO.

The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

## From The School of Engineering At The Massachusetts Institute of Technology: “New polymers could enable better wearable devices”

From The School of Engineering at The Massachusetts Institute of Technology

2.6.23
Becky Ham

MIT researchers developed a chemistry-based strategy to create organic iono-electronic polymers that “learn” and could improve electronic devices that interface directly with the human body. This illustration shows the proposed morphology of the polymer. Courtesy of the researchers.

MIT engineers developed organic polymers that can efficiently convert signals from biological tissue into the electronic signals used in transistors.

Certain electronics that integrate with the human body — a smartwatch that samples your sweat, for instance — work by converting the ion-based signals of biological tissue into the electron-based signals used in transistors. But the materials in these devices are often designed to maximize ion uptake while sacrificing electronic performance.

To remedy this, MIT researchers developed a strategy to design these materials, called organic mixed ionic-electronic conductors (OMIECs), that brings their ionic and electronic capabilities into balance.

Figure 1

Hybridizing ionic and electronic conduction in DPP-based polymers. A) Illustration of a synaptic transistor and the use of organic mixed ionic–electronic conductors for electrochemical conductance modulation. B) General structure of polymer mixed conductors. C) Molecular structure of hybrid DPP copolymers for attaining mixed ionic–electronic conduction. D,E) Optimized geometries of the parent DPP and the hybridized copolymer, as well as the corresponding side view. F) Frontier molecular orbitals of the glycol functionalized copolymer using density functional theory (DFT) calculations (B3LYP/6-31+G(d)).

These optimized OMIECs can even learn and retain these signals in a way that mimics biological neurons, according to Aristide Gumyusenge, the Merton C. Flemings Assistant Professor of Materials Science and Engineering.

“This behavior is key to next-generation biology-inspired electronics and body-machine interfaces, where our artificial components must speak the same language as the natural ones for a seamless integration,” he says.

Gumyusenge and his colleagues published their results Friday in the “Rising Stars” series of the journal Small [below]. His co-authors include Sanket Samal, an MIT postdoc; Heejung Roh and Camille E. Cunin, both MIT PhD students; and Geon Gug Yang, a visiting PhD student from the Korea Advanced Institute of Science and Technology.

Building a better OMIEC

Electronics that interface directly with the human body need to be made from lightweight, flexible, and biologically compatible materials. Organic polymer materials like OMIECs, which can transport both ions and electrons, make excellent building blocks for the transistors in these devices.

“However, ionic and electronic conductivities have opposite trends,” Gumyusenge explains. “That is, improving ion uptake usually implies sacrificing electronic mobility.”

Gumyusenge and his colleagues wondered if they could build a better OMIEC by designing new copolymers from the ground up, using a highly conductive pigment called DPP and engineering the copolymer’s chemical backbone and sidechains. By selectively controlling the density of specific sidechains, the researchers were able to maximize both ion permeability and electron charge transport.

The technique could be used “to establish a broad library of OMIECs … thus unlocking the current single-material-fits-all bottleneck” that now exists in ionic-electronic devices, Gumyusenge says.

The newly designed OMIECs also retain their electrochemical properties after undergoing a baking step at 300 degrees Celsius (572 degrees Fahrenheit), making them compatible with commercial manufacturing conditions used to make traditional integrated circuits.

Given that the OMIEC design process involved adding softer and more “ion-friendly” building blocks, the polymers’ thermal stability under heat treatment “was impressive and a pleasant surprise,” Gumyusenge says.

OMIECs in artificial neurons

The MIT researchers’ design strategy makes it possible to tune the ability of an OMIEC to receive and hold on to an ion-based electrochemical charge. The process resembles what happens with biological neurons, which use ions to communicate during learning and memory.

This made Gumyusenge’s team wonder: Could their OMIECs be used in devices that mimic the synaptic connections between neurons in the brain?

The MIT study showed that the artificial synapses could conduct signals in a way that resembles the synaptic plasticity underlying learning, as well as a persistent strengthening of the synapse’s signal transmission that resembles the biological process of memory formation.
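As a caricature of that behavior, one can model the device as a single “weight” (its channel conductance): each voltage pulse injects ions and nudges the weight up (potentiation), while between pulses some charge leaks back out (decay), so repeated stimulation leaves a lasting but bounded trace. This toy model is purely illustrative and is not the device physics reported in the paper:

```python
# Toy artificial synapse: voltage pulses potentiate a conductance ("weight"),
# which partially decays between pulses; whatever survives the decay is the
# device's "memory" of its stimulation history. Illustrative numbers only.

def stimulate(weight: float, n_pulses: int,
              step: float = 0.1, retention: float = 0.95) -> float:
    """Apply n pulses; each adds `step`, then the weight decays by `retention`."""
    for _ in range(n_pulses):
        weight = (weight + step) * retention
    return weight

# Repeated stimulation strengthens the synapse, but the decay bounds the
# weight below step * retention / (1 - retention) = 1.9, so it saturates.
print(round(stimulate(0.0, 20), 3))
```

The strengthened weight that persists after stimulation ends is the toy analog of the signal retention the researchers liken to memory formation.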

Someday these types of artificial synapses might form the basis of artificial neural networks that could make the integration of electronics and biology even more powerful, the researchers say.

For instance, Gumyusenge says, “materials such as the polymer we report are promising candidates toward the development of closed-loop feedback systems,” which could do things like monitor a person’s insulin levels and automatically deliver the correct dose of insulin based on these data.

The study was supported, in part, by the K. Lisa Yang Brain-Body Center at MIT and the Korea Advanced Institute of Science and Technology.

Small

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


Stem Education Coalition

The MIT School of Engineering is one of the five schools of the Massachusetts Institute of Technology, located in Cambridge, Massachusetts. The School of Engineering has eight academic departments and two interdisciplinary institutes. The School grants SB, MEng, SM, engineer’s degrees, and PhD or ScD degrees. The school is the largest at MIT as measured by undergraduate and graduate enrollments and faculty members.

Departments and initiatives:

Departments:

Aeronautics and Astronautics (Course 16)
Biological Engineering (Course 20)
Chemical Engineering (Course 10)
Civil and Environmental Engineering (Course 1)
Electrical Engineering and Computer Science (Course 6, joint department with MIT Schwarzman College of Computing)
Materials Science and Engineering (Course 3)
Mechanical Engineering (Course 2)
Nuclear Science and Engineering (Course 22)

Institutes:

Institute for Medical Engineering and Science
Health Sciences and Technology program (joint MIT-Harvard, “HST” in the course catalog)

(Departments and degree programs are commonly referred to by course catalog numbers on campus.)

Laboratories and research centers

Abdul Latif Jameel Water and Food Systems Lab
Center for Advanced Nuclear Energy Systems
Center for Computational Engineering
Center for Materials Science and Engineering
Center for Ocean Engineering
Center for Transportation and Logistics
Industrial Performance Center
Institute for Soldier Nanotechnologies
Koch Institute for Integrative Cancer Research
Laboratory for Information and Decision Systems
Laboratory for Manufacturing and Productivity
Materials Processing Center
Microsystems Technology Laboratories
MIT Lincoln Laboratory Beaver Works Center
Novartis-MIT Center for Continuous Manufacturing
Ocean Engineering Design Laboratory
Research Laboratory of Electronics
SMART Center
Sociotechnical Systems Research Center
Tata Center for Technology and Design

USPS “Forever” postage stamps celebrating Innovation at MIT.

The Computer Science and Artificial Intelligence Laboratory (CSAIL)

The Kavli Institute For Astrophysics and Space Research

MIT’s Institute for Medical Engineering and Science is a research institute at the Massachusetts Institute of Technology

The MIT Laboratory for Nuclear Science

The MIT Media Lab

The MIT Sloan School of Management


The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and the Whitehead Institute.

Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities.

Foundation and vision

In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but an institution combining elements of both professional and liberal education, proposing that:

“The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

Early developments

Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863, under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty member) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated \$20 million (\$236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

Curricular reforms

In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

The Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at the Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, The Massachusetts Institute of Technology became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of \$100 million (\$1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and The Massachusetts Institute of Technology’s defense research. In this period, Massachusetts Institute of Technology departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam, as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969, during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

Recent history

The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.


## From Clemson University: “Five ways that lasers shine a light on research and leadership in engineering and science”

From Clemson University

2.3.23

Fig. 1: Image of the 124-m-high telecommunication tower of Säntis (Switzerland).

Also shown is the path of the laser recorded with its second harmonic at 515 nm.

The news that lasers are capable of rerouting lightning [Nature Photonics (below)] and could someday be used to protect airports, launchpads and other infrastructure raised a question that has electrified some observers with curiosity:

Just what else can these marvels of focused light do?

We took that question to Clemson University’s John Ballato, one of the world’s leading optical scientists, and his answers might be—you guessed it—shocking.

John Ballato.

John Ballato, right, and Wade Hawkins work in their lab at the Center for Optical Materials Science and Engineering Technologies (COMSET).

Some lasers shine more intensely than the sun, while others can make things cold, he said. Lasers can drill the tiniest of holes, defend against missile attacks and help self-driving cars “see” where they are going, Ballato said. Those are just a few examples—and all have been the subject of research at Clemson.

If anyone knows about how light and lasers are used, it’s Ballato, who holds the J.E. Sirrine Endowed Chair of Optical Fiber in the Department of Materials Science and Engineering at Clemson, with joint appointments in electrical engineering and in physics.

He has authored more than 500 technical papers, holds 35 U.S. and international patents and is a fellow in seven professional organizations, including the American Association for the Advancement of Science.

Ballato recently returned from San Francisco, where he served as a symposium chair at SPIE’s Photonics West LASE, “the most important laser technologies conference in the field,” according to its website.

“We’ve got a great opportunity to shine a light—pun intended—on Clemson’s leadership in laser technology,” said Ballato, who was not involved in the lightning-related research. “Clemson has some of the world’s top talent in laser technology, unique facilities that include industry-scale capabilities for making some of the world’s most advanced optical fibers and opportunities for hands-on learning. If you want to be a leader, innovator or entrepreneur in lasers, Clemson is the place for you.”

Liang Dong, right, creates powerful lasers as part of his research at Clemson University.

Ballato is among numerous researchers at Clemson who are doing seemingly miraculous things with laser light. Here are five things lasers can do (other than deflect lightning) that Clemson researchers are working with today.

Ballato was part of an international team that developed the first laser self-cooling optical fiber made of silica glass and then turned that innovation into a laser amplifier. Researchers said it is a step toward self-cooling lasers. Such a laser would not need to be cooled externally because it would not heat up in the first place, they said, and it would produce exceptionally pure and stable frequencies. The work was led by researchers at Stanford University and originally reported in the journal Optics Letters [below two papers].

The light from lasers can be made to twist or spin as it travels from one point to another. This can be done by engineering the light’s “orbital angular momentum” and is central to research led by Eric Johnson, the PalmettoNet Endowed Chair in Optoelectronics, with help from several other researchers, including Joe Watkins, director of General Engineering. The technology could make it possible to channel through fog, murky water and thermal turbulence, potentially leading to new ways of communicating and gathering data.
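The “orbital angular momentum” being engineered here is a standard property of structured light: a beam whose phase winds around its axis like a corkscrew carries a quantized amount of angular momentum per photon, and the integer winding number offers extra channels for encoding data. As a brief aside (this is textbook optics, not a result from the Clemson work):

```latex
% A vortex (OAM) beam carries an azimuthal phase winding of integer charge \ell:
E(r,\phi,z) \;\propto\; A(r,z)\, e^{i \ell \phi},
\qquad L_z = \ell \hbar \ \text{per photon}, \quad \ell \in \mathbb{Z}.
```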

Some lasers are orders of magnitude more intense than the surface of the sun, thanks to specially designed optical fiber that confines that light to a fraction of the width of a human hair. These powerful laser devices can be used to shoot missiles out of the sky or to cut, drill, weld and mark a variety of materials in ways that conventional tools cannot. Lasers, for example, are used to cut Gorilla Glass on smartphones. Clemson researchers helping advance laser technology in this direction include: Ballato; Liang Dong, a professor of electrical and computer engineering; and Wade Hawkins, a research assistant professor of materials science and engineering.
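The comparison to the sun can be sanity-checked with back-of-the-envelope numbers: the solar surface radiates on the order of 6 × 10⁷ W/m², while even a modest kilowatt of laser power confined to a fiber core roughly ten microns across is spread over only ~10⁻¹⁰ m². (The 1 kW power and 10 µm core below are illustrative assumptions, not specifications of the Clemson devices.)

```python
import math

# Back-of-the-envelope comparison of fiber-laser intensity to the Sun's surface.
SOLAR_SURFACE_W_PER_M2 = 6.3e7  # approximate radiant exitance of the solar surface

def core_intensity(power_w: float, core_diameter_m: float) -> float:
    """Intensity (W/m^2) of laser power confined to a circular fiber core."""
    core_area = math.pi * (core_diameter_m / 2.0) ** 2
    return power_w / core_area

fiber = core_intensity(1_000.0, 10e-6)  # 1 kW in a 10-micron core (illustrative)
print(f"{fiber:.1e} W/m^2, ~{fiber / SOLAR_SURFACE_W_PER_M2:.0e}x the solar surface")
```

Even with these modest assumptions the confined beam comes out several orders of magnitude brighter than the solar surface, which is what makes fiber lasers useful for cutting and drilling.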

Lidar, which stands for Light Detection and Ranging, is a technology that uses pulsed laser beams to measure the distance to objects or surfaces. For self-driving cars, lidar serves as the “eyes” that help vehicles navigate the streets. Lidar can also be used for mapping, surveying, and measuring the density, temperature and other properties of the atmosphere. The technology has been employed in numerous projects at Clemson, including Deep Orange 12, an autonomous race car designed by automotive engineering graduate students.
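The core of lidar ranging is a single piece of arithmetic: a pulse travels to the target and back at the speed of light, so the distance is half the round-trip time multiplied by c. A minimal sketch of that calculation (illustrative only; real lidar units must also handle noise, pulse shape and beam scanning):

```python
# Minimal lidar time-of-flight sketch: distance from a pulse's round trip.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_round_trip(round_trip_s: float) -> float:
    """Distance to a target given the pulse's out-and-back travel time.

    The beam covers the distance twice, hence the division by 2.
    """
    return SPEED_OF_LIGHT_M_PER_S * round_trip_s / 2.0

# A return detected 200 nanoseconds after the pulse left:
d = distance_from_round_trip(200e-9)
print(f"Target is about {d:.1f} m away")  # ~30 m
```

The nanosecond scale of these timings is why lidar hardware centers on very fast pulsed lasers and detectors.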

Lasers are also playing a role in helping develop clean energy sources. One of the major challenges in creating hydrogen-powered turbines is protecting the blades against heat and high-velocity steam so extreme it would vaporize many materials. A possible solution under study at Clemson would be to cover turbine blades with a special slurry and use a laser to sinter it one point at a time, creating a protective coating. The research is led by Fei Peng, an associate professor of materials science and engineering.

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.



Ranked as the 27th best national public university by U.S. News & World Report, Clemson University is dedicated to teaching, research and service. Founded in 1889, we remain committed both to world-class research and a high quality of life. In fact, 92 percent of our seniors say they’d pick Clemson again if they had it to do over.

Clemson’s retention and graduation rates rank among the highest in the country for public universities. We’ve been named among the “Best Public College Values” by Kiplinger Magazine in 2019, and The Princeton Review named us among the “Best Value Colleges” for 2020.

Our beautiful college campus sits on 20,000 acres in the foothills of the Blue Ridge Mountains, along the shores of Lake Hartwell. And we also have research facilities and economic development hubs throughout the state of South Carolina — in Anderson, Blackville, Charleston, Columbia, Darlington, Georgetown, Greenville, Greenwood, and Pendleton.

The research, outreach and entrepreneurial projects led by our faculty and students are driving economic development and improving quality of life in South Carolina and beyond. In fact, a recent study determined that Clemson has an annual $1.9 billion economic impact on the state.

Just as founder Thomas Green Clemson intertwined his life with the state’s economic and educational development, the Clemson Family impacts lives daily with their teaching, research and service.
How Clemson got its start

University founders Thomas Green and Anna Calhoun Clemson had a lifelong interest in education, agricultural affairs and science.

In the post-Civil War days of 1865, Thomas Clemson looked upon a South that lay in economic ruin, once remarking, “This country is in wretched condition, no money and nothing to sell. Everyone is ruined, and those that can are leaving.”

Thomas Clemson’s death on April 6, 1888, set in motion a series of events that marked the start of a new era in higher education in South Carolina. In his will, he bequeathed the Fort Hill plantation and a considerable sum from his personal assets for the establishment of an educational institution that would teach scientific agriculture and the mechanical arts to South Carolina’s young people.

Clemson Agricultural College formally opened as an all-male military school in July 1893 with an enrollment of 446. It remained this way until 1955, when students shifted to “civilian” status and Clemson became a coeducational institution. In 1964, the college was renamed Clemson University as the state legislature formally recognized the school’s expanded academic offerings and research pursuits.

More than a century after its opening, the University provides diverse learning, research facilities and educational opportunities not only for the people of the state — as Thomas Clemson dreamed — but for thousands of young men and women throughout the country and the world.
