From WIRED: “Solar-Powered Farming Is Quickly Depleting the World’s Groundwater Supply”

From WIRED

3.9.24
Fred Pearce

Farmers in arid regions are turning to low-cost solar pumps to irrigate their fields. This eliminates using fossil fuels and boosts crop production, but is drying up aquifers around the globe.

There is a solar-powered revolution going on in the fields of India. By 2026, more than 3 million farmers will be raising irrigation water from beneath their fields using solar-powered pumps. With effectively free water available in almost unlimited quantities to grow their crops, their lives could be transformed. Until the water runs out.

The desert state of Rajasthan is the Indian pioneer and has more solar pumps than any other. Over the past decade, the government has given subsidized solar pumps to almost 100,000 farmers. Those pumps now water more than a million acres and have enabled agricultural water use to increase by more than a quarter. But as a result, water tables are falling rapidly. There is little rain to replace the water being pumped to the surface. In places, the underground rocks are now dry down to 400 feet below ground.

That is the effective extraction limit of the pumps, many of which now lie abandoned. To keep up, in what amounts to a race to the bottom of the diminishing reserves, richer farmers have been buying more powerful solar pumps, leaving the others high and dry or forcing them to buy water from their rich neighbors.

Water wipeout looms. And not just in Rajasthan.

Solar pumps are spreading rapidly among rural communities in many water-starved regions across India, Africa, and elsewhere. These devices can tap underground water all day long at no charge, without government scrutiny.

For now, they can be great news for farmers, with the potential to transform agriculture and improve food security. The pumps can supply water throughout the daylight hours, letting farmers extend their croplands into deserts, end their reliance on unpredictable rains, and sometimes replace existing costly-to-operate diesel or grid-powered pumps.

But this solar-powered hydrological revolution is emptying already-stressed underground water reserves—also known as groundwaters or aquifers. The very success of solar pumps is “threatening the viability of many aquifers already at risk of running dry,” Soumya Balasubramanya, an economist at the World Bank with extensive experience of water policy, warned in January.

An innovation that initially looked capable of reducing fossil-fuel consumption while also helping farmers prosper is rapidly turning into an environmental time bomb.

Solar panels power pumping at a farm near Kafr el-Dawwar, Egypt. Photograph: KHALED DESOUKI/Getty Images.

For much of the 20th century, artificial irrigation of farmland boomed thanks to state and World Bank investment in reservoirs and in networks of canals to bring water to fields. Irrigation watered the “green revolution” of new high-yielding but thirsty crops, keeping a fast-growing world population largely fed.

But many systems have reached their limits. Rivers are being emptied, and new investment has dried up. So in the past three decades, hundreds of millions of farmers in hot arid regions, from Mexico to the Middle East and South Asia, have switched to getting their water from underground.

Boreholes sunk into porous water-holding rocks now provide 43 percent of the world’s irrigation water, according to a study last year by the World Bank. Irrigation is responsible for around 70 percent of the global underground water withdrawals, which are estimated at more than 200 cubic miles per year. This exceeds recharge from rainfall by nearly 70 cubic miles per year.
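
To put those figures in proportion, here is a back-of-the-envelope sketch in Python using only the numbers quoted above. The variable names and the calculation are ours; it is illustrative, not taken from the World Bank study itself.

withdrawals = 200        # global groundwater withdrawals, cubic miles per year
overdraft = 70           # amount by which withdrawals exceed recharge
irrigation_share = 0.70  # fraction of withdrawals used for irrigation

recharge = withdrawals - overdraft                 # ~130 cubic miles per year
irrigation_use = irrigation_share * withdrawals    # ~140 cubic miles per year

print(f"Recharge from rainfall: ~{recharge} cubic miles/year")
print(f"Irrigation withdrawals: ~{irrigation_use:.0f} cubic miles/year")
print(f"Net annual depletion: ~{overdraft} cubic miles/year, {overdraft / withdrawals:.0%} of all pumping")

On these numbers, roughly a third of all the groundwater pumped each year is simply never replaced.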

Monitoring of individual underground reserves is patchy at best. They are too often out of sight and out of mind. But a study of historical data from monitoring wells in 1,700 aquifers in 40 countries, published in January, reported that “rapid and accelerating” declines in reserves were widespread.

Scott Jasechko, a hydrologist at the University of California-Santa Barbara, found water tables dropping by 3 feet or more every year in India, Iran, Afghanistan, Spain, Mexico, the United States, Chile, Saudi Arabia, and other countries.

The implications of this for the future are profound. “Groundwater depletion is becoming a global threat to food security, yet … remains poorly quantified,” says Meha Jain, who studies the sustainability of farming systems at the University of Michigan. But rather than calling a halt to groundwater withdrawals, policymakers are upping the ante by promoting solar power as a means of delivering yet more and cheaper underground water to fields.

The solar revolution on farms is happening with the best of intentions and is using a technology widely seen as environmentally beneficial. Farmers love the fact that their photovoltaic (PV) pumps do not require expensive and polluting diesel fuel or grid connections. Once installed, they can run all day at no cost, growing more food crops, or allowing their owners to expand their businesses—growing water-intensive cash crops, or earning income from selling spare water to neighbors. Many farmers also keep their old diesel or electric pumps to continue pumping when the sun goes down.

Development agencies and governments are equally keen. They subsidize solar pumps to boost food production, reduce poverty, cut emissions from fossil fuels, and curtail growing demands on overstretched electricity grids. But the long-term downside of this solar revolution looms large.

Farmer Mohamed Ali al-Hussein waters a watermelon patch near Hasakeh, Syria, with the help of a solar-powered pump. Photograph: DELIL SOULEIMAN/Getty Images.

The crisis is particularly stark in India. The world’s most populous nation “stands at the threshold of a revolution in adoption of solar irrigation pumps,” says Tushaar Shah, a water economist for the International Water Management Institute. The government intends to raise the number of solar pumps more than tenfold to 3.5 million by 2026.

The country is already the world’s largest consumer of groundwater, with farmers each year pumping onto their fields an estimated 50 cubic miles more water than the monsoon rains replace. Unchecked, says Shah, solar power is set to make the situation worse.

Sub-Saharan Africa could soon be on the same path. Shallow groundwater is present below fields in many places across the continent. But the cost of buying diesel fuel is prohibitively high for many farmers, and most rural areas are not connected to electricity grids. So, the arrival of stand-alone PV pumps is “a game-changer for small-scale farms” in sub-Saharan Africa, says Giacomo Falchetta, an energy and environment economist at the International Institute for Applied Systems Analysis in Austria.

There are already half a million PV irrigation pumps watering fields across the sub-Saharan region. But Falchetta calculates that in the future 11 million more could be deployed to irrigate 135 million acres of currently rainfed fields—an area the size of France. These pumps could supply a third of the unmet water needs of small farmers, who produce most of the food across sub-Saharan Africa.
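
A quick scale check on those projections, using only the figures in the preceding paragraphs (our arithmetic, not Falchetta’s):

existing_pumps = 500_000        # PV irrigation pumps already installed
potential_pumps = 11_000_000    # additional pumps Falchetta projects
acres_irrigated = 135_000_000   # currently rainfed acres they could water

print(f"~{acres_irrigated / potential_pumps:.0f} acres per additional pump")
print(f"~{(existing_pumps + potential_pumps) / existing_pumps:.0f}x growth in the installed base")

That works out to roughly a dozen acres per pump, and a roughly twenty-three-fold expansion of today’s installed base.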

The main thing preventing farmers from accessing the free water beneath their feet is the capital cost of the equipment, which typically represents up to a year’s farm income. But that may soon change, as costs come down.

“The potential in Africa is large,” says Claudia Ringler, a water specialist at the Washington, DC-based International Food Policy Research Institute. “Solar power is a breakthrough technology. Barriers will be increasingly overcome, and it will transform agricultural irrigation.”

Falchetta reckons horticultural crops will benefit most from the extra water made available by solar pumps, “due to their high water requirement and high economic value.” But that rings alarm bells. Even modest falls in water tables in the continent’s many shallow aquifers could dry up wells that sustain many of the 255 million people living in poverty above them, warns the World Bank.

Such declines could also wreck fluvial ecosystems sustained by shallow underground water, including the wetlands and rivers on which millions of Africans depend for fish and other resources.

Overexploitation of groundwater, the World Bank review concludes, is “a classic tragedy of the commons—with exponential impacts disproportionately affecting the most vulnerable.” Yet the bank, along with its sister agency the African Development Bank, is funding stand-alone solar pump projects in Togo, Niger, and elsewhere across the continent.

In many places, however, farmers may not be able to wait for subsidies or aid projects to embrace solar pumps. They have little choice if they want to grow crops as other means of pumping water to their fields falter.

That is certainly the case in Yemen, on the south flank of the Arabian Peninsula, where the desert sands have a new look these days. Satellite images show around 100,000 solar panels glinting in the sun, surrounded by green fields. Hooked to water pumps, the panels provide free energy for farmers to pump out ancient underground water. They are irrigating crops of khat, a shrub whose narcotic leaves are the country’s stimulant of choice, chewed through the day by millions of men.

For these farmers, the solar irrigation revolution in Yemen is born of necessity. Most crops will only grow if irrigated, and the long civil war has crashed the country’s electricity grid and made supplies of diesel fuel for pumps expensive and unreliable. So, they are turning en masse to solar power to keep the khat coming.

The panels have proved an instant hit, says Middle East development researcher Helen Lackner of SOAS University of London (UK). Everybody wants one. But in the hydrological free-for-all, the region’s underground water, a legacy of wetter times, is running out.

The solar-powered farms are pumping so hard that they have triggered “a significant drop in groundwater since 2018 … in spite of above average rainfall,” according to an analysis by Leonie Nimmo, a researcher who was until recently at the UK-based Conflict and Environment Observatory. The spread of solar power in Yemen “has become an essential and life-saving source of power,” both to irrigate food crops and provide income from selling khat, he says, but it is also “rapidly exhausting the country’s scarce groundwater reserves.”

In the central Sana’a Basin, Yemen’s agricultural heartland, more than 30 percent of farmers use solar pumps. In a report with Musaed Aklan, a water researcher at the Sana’a Center for Strategic Studies, Lackner predicts a “complete shift” to solar by 2028. But the basin may be down to its last few years of extractable water. Farmers who once found water at depths of 100 feet or less are now pumping from 1,300 feet or more.

Some 1,500 miles to the northeast, in the desert province of Helmand in Afghanistan, more than 60,000 opium farmers have in the past few years given up on malfunctioning state irrigation canals and switched to tapping underground water using solar water pumps. As a consequence, water tables have been falling typically by 10 feet per year, according to David Mansfield, an expert on the country’s opium industry from the London School of Economics (UK).

An abrupt ban on opium production imposed by Afghanistan’s Taliban rulers in 2022 may offer a partial reprieve. But the wheat that the farmers are growing as a replacement is also a thirsty crop. So, water bankruptcy in Helmand may only be delayed.

“Very little is known about the aquifer [in Helmand], its recharge or when and if it might run dry,” according to Mansfield. But if their pumps run dry, many of the million-plus people in the desert province could be left destitute, as this vital desert resource—the legacy of rainfall in wetter times—disappears for good.

Water pours from a solar-powered pump near Sana’a, Yemen. Photograph: MOHAMMED MOHAMMED/ALAMY.

Even the potential climate benefits of solar pumping may prove illusory, says the World Bank’s Balasubramanya. In theory, switching from diesel or electricity to PV pumping should eliminate greenhouse gas emissions. But in practice, farmers often use their solar pumps to supplement existing pumps, rather than replacing them. And, however it is pumped, the extra water available will also encourage farmers to adopt more intensive farming methods, using more fertilizer and machinery to grow thirstier cash crops, increasing the carbon footprint of the farm.

What is to be done? Groundwaters are notoriously hard to police. India’s overpumping has been “a colossal anarchy,” says Shah. Some states have attempted to control non-solar pumps that run on grid electricity by restricting power supplies to farmers to a few hours every day. It had some effect, says Shah, who first proposed the idea. But many farmers responded by buying more powerful pumps.

Now, to combat the excesses of solar pumps, Gujarat state has been paying some farmers high prices to use their PV panels to send power to the grid, rather than pump water, making solar energy in effect a new cash crop.

The pilot project was limited to just 4,300 wells, and again the benefit was “muted,” says Shah. He believes a better designed scheme could work. But Balasubramanya, who until recently worked with Shah in India, is doubtful. She warns that it might simply encourage more farmers to invest in solar panels, which could end up increasing water pumping further.

In any case, controls based around the electricity grid will not work in rural Africa, where there is rarely any grid for farmers to either tap into or supply.

None of this should be seen as a condemnation of solar power, says Balasubramanya. “The fundamental problem is not the solar technology itself.” Whatever the technology, “if the cost of pumping is zero, then people will pump unless some restriction is put on them.”

But Balasubramanya says technology could come to the rescue. If PV pumps had to be sold with sensors that allowed monitoring of their output, then regulators could directly limit their use. Whether governments would do that in practice, given the conflicting priorities between immediate food production and longer-term management of water, is another matter.

See the full article here.

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct.

Please help promote STEM in your local schools.

STEM Education Coalition

From Quanta Magazine via WIRED: “There’s a New Theory About Where Dark Matter Is Hiding”

From Quanta Magazine

Via

WIRED

2.25.24
Steve Nadis

An idea derived from string theory suggests that dark matter is hidden in an as-yet-unseen extra dimension. Scientists are racing to test the theory to see if it holds up.

In a dark extra dimension, peculiar particles convey the gravitational force. Credit: Samuel Velasco/Quanta Magazine.

When it comes to understanding the fabric of the universe, most of what scientists think exists is consigned to a dark, murky domain. Ordinary matter, the stuff we can see and touch, accounts for just 5 percent of the cosmos. The rest, cosmologists say, is “dark energy” and “dark matter”, mysterious substances that are labeled “dark” partly to reflect our ignorance about their true nature.

_________________________
The Dark Energy Survey

Dark Energy Camera [DECam] built at the DOE’s Fermi National Accelerator Laboratory.

DECam focal plane
NSF NOIRLab NOAO Cerro Tololo Inter-American Observatory (CL) Victor M. Blanco 4-meter Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7,200 feet.

NSF NOIRLab NOAO Cerro Tololo Inter-American Observatory (CL), approximately 80 km east of La Serena, Chile, at an altitude of 2,200 meters.

Timeline of the inflationary Universe. Credit: NASA WMAP LAMBDA, ΛCDM Model of Cosmology.

The Dark Energy Survey is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. The Dark Energy Survey began searching the Southern skies on August 31, 2013.

According to Albert Einstein’s Theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up.
_________________________

Nobel Prize in Physics for 2011: Expansion of the Universe

Saul Perlmutter (center) [The Supernova Cosmology Project] shared the 2006 Shaw Prize in Astronomy, the 2011 Nobel Prize in Physics, and the 2015 Breakthrough Prize in Fundamental Physics with Brian P. Schmidt (right) and Adam Riess (left) [The High-z Supernova Search Team] for providing evidence that the expansion of the universe is accelerating.

4 October 2011

The Royal Swedish Academy of Sciences has decided to award the Nobel Prize in Physics for 2011

with one half to

Saul Perlmutter
The Supernova Cosmology Project
The DOE’s Lawrence Berkeley National Laboratory and The University of California-Berkeley,

and the other half jointly to

Brian P. Schmidt
The High-z Supernova Search Team, The Australian National University, Weston Creek, Australia.

and

Adam G. Riess

The High-z Supernova Search Team, The Johns Hopkins University and The Space Telescope Science Institute, Baltimore, MD.
Written in the stars

“Some say the world will end in fire, some say in ice…” *

What will be the final destiny of the Universe? Probably it will end in ice, if we are to believe this year’s Nobel Laureates in Physics. They have studied several dozen exploding stars, called supernovae, and discovered that the Universe is expanding at an ever-accelerating rate. The discovery came as a complete surprise even to the Laureates themselves.

In 1998, cosmology was shaken at its foundations as two research teams presented their findings. Headed by Saul Perlmutter, one of the teams had set to work in 1988. Brian Schmidt headed another team, launched at the end of 1994, where Adam Riess was to play a crucial role.

The research teams raced to map the Universe by locating the most distant supernovae. More sophisticated telescopes on the ground and in space, as well as more powerful computers and new digital imaging sensors (CCD, Nobel Prize in Physics in 2009), opened the possibility in the 1990s to add more pieces to the cosmological puzzle.

The teams used a particular kind of supernova, called a Type Ia supernova. It is an explosion of an old compact star that is as heavy as the Sun but as small as the Earth. A single such supernova can emit as much light as a whole galaxy. All in all, the two research teams found over 50 distant supernovae whose light was weaker than expected – this was a sign that the expansion of the Universe was accelerating. The potential pitfalls had been numerous, and the scientists found reassurance in the fact that both groups had reached the same astonishing conclusion.

For almost a century, the Universe has been known to be expanding as a consequence of the Big Bang about 14 billion years ago. However, the discovery that this expansion is accelerating is astounding. If the expansion continues to speed up, the Universe will end in ice.

The acceleration is thought to be driven by dark energy, but what that dark energy is remains an enigma – perhaps the greatest in physics today. What is known is that dark energy constitutes about three quarters of the Universe. Therefore, the findings of the 2011 Nobel Laureates in Physics have helped to unveil a Universe that to a large extent is unknown to science. And everything is possible again.

*Robert Frost, Fire and Ice, 1920
_________________________
To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called Dark Energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

The Dark Energy Survey is designed to probe the origin of the accelerating universe and help uncover the nature of Dark Energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

Over six years (2013-2019), the Dark Energy Survey collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time was used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.
_________________________

Dark Matter Background

Fritz Zwicky.
Coma cluster via NASA/ESA Hubble, the original example of Dark Matter discovered during observations by Fritz Zwicky and confirmed 30 years later by Vera Rubin.

Fritz Zwicky discovered Dark Matter in the 1930s when observing the movement of the Coma Cluster. Some 30 years later, Vera Rubin, a woman in STEM who was denied the Nobel, did most of the work on Dark Matter.

Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL).

Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that stars at the outer edges of galaxies orbit just as fast as stars near the center, whereas, if visible matter were all a galaxy contained, the outer stars should move more slowly, just as the outer planets of the Solar System orbit the Sun more slowly than the inner ones. The only way to explain these flat rotation curves is if each visible galaxy sits inside a much larger structure of unseen mass, whose gravity keeps the orbital speed consistent from center to edge.

Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science).

Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970.

_______________________
Dark Matter Research

SLAC SuperCDMS (Super Cryogenic Dark Matter Search), from DOE’s SLAC National Accelerator Laboratory at Stanford University, at SNOLAB (Vale Inco Mine, Sudbury, Canada).
LBNL LUX-ZEPLIN (LZ) Dark Matter Experiment xenon detector at the Sanford Underground Research Facility. Credit: Matt Kapust.
Lambda Cold Dark Matter (ΛCDM) expansion of the Universe. Credit: Alex Mittelmann, Coldcreation.
DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB (CA) deep in Sudbury’s Creighton Mine.
Yale HAYSTAC axion dark matter experiment at Yale’s Wright Lab.
DAMA-LIBRA Dark Matter experiment at the Italian National Institute for Nuclear Physics’ (INFN’s) Gran Sasso National Laboratories (LNGS) located in the Abruzzo region of central Italy.
The LBNL LZ Dark Matter Experiment at SURF, Lead, SD.
DARWIN Dark Matter experiment. A design study for a next-generation, multi-ton dark matter detector in Europe at The University of Zurich [Universität Zürich](CH).
PandaX II Dark Matter experiment at Jin-ping Underground Laboratory (CJPL) in Sichuan, China.

Inside the Axion Dark Matter eXperiment (ADMX) at the University of Washington. Credit: Mark Stone/University of Washington.

_________________________
While no single idea is likely to explain everything we hope to know about the cosmos, an idea introduced two years ago could answer a few big questions. Called the “dark dimension scenario”, it offers a specific recipe for dark matter, and it suggests an intimate connection between dark matter and dark energy. The scenario might also tell us why gravity—which sculpts the universe on the largest scales—is so weak compared to the other forces.

The scenario proposes an as-yet-unseen dimension that lives within the already complex realm of string theory, which attempts to unify quantum mechanics and Einstein’s theory of gravity. In addition to the four familiar dimensions—three infinitely large spatial dimensions plus one of time—string theory suggests that there are six exceedingly tiny spatial dimensions.

In the dark dimension’s universe, one of those extra dimensions is significantly larger than the others. Instead of being 100 million trillion times smaller than the diameter of a proton, it measures about 1 micron across—minute by everyday standards, but enormous compared to the others. Massive particles that carry the gravitational force are generated within this dark dimension, and they make up the dark matter that scientists think comprises about 25 percent of our universe and forms the glue that keeps galaxies together. (Current estimates hold that the remaining 70 percent consists of dark energy, which is driving the universe’s expansion.)

The scenario “allows us to make connections between string theory, quantum gravity, particle physics, and cosmology, [while] addressing some of the mysteries related to them,” said Ignatios Antoniadis, a physicist at Sorbonne University who is actively investigating the dark dimension proposal.

While there’s no evidence yet that the dark dimension exists, the scenario does make testable predictions for both cosmological observations and tabletop physics. That means we may not have to wait long to see whether the hypothesis will bear up under empirical scrutiny—or be relegated to the list of tantalizing ideas that never fulfilled their original promise.

“The dark dimension envisioned here,” said the physicist Rajesh Gopakumar, director of the International Center for Theoretical Sciences in Bengaluru, has “the virtue of being potentially ruled out fairly easily as upcoming experiments grow sharper.”

Divining the Dark Dimension

The dark dimension was inspired by a long-standing mystery concerning the “cosmological constant”—a term, designated by the Greek letter lambda, that Albert Einstein introduced into his equations of gravity in 1917. Believing in a static universe, as did many of his peers, Einstein added the term to keep the equations from describing an expanding universe. But in the 1920s, astronomers discovered that the universe is indeed swelling, and in 1998 they observed that it is growing at an accelerated clip, propelled by what is now commonly referred to as dark energy—which can also be denoted in equations by lambda [Λ].

Cumrun Vafa (left), Irene Valenzuela, and Miguel Montero crafted the “dark dimension scenario”, in which massive gravitons inhabit a large extra dimension. Photograph: Hayward Photography; Courtesy of Irene Valenzuela; Max Weisner.

Since then, scientists have wrestled with one striking characteristic of lambda: Its estimated value of 10^−122 in Planck units is “the smallest measured parameter in physics,” said Cumrun Vafa, a physicist at Harvard University. In 2022, while considering that almost unfathomable smallness with two members of his research team—Miguel Montero, now at Madrid’s Institute for Theoretical Physics, and Irene Valenzuela, currently at CERN—Vafa had an insight: Such a minuscule lambda is a truly extreme parameter, meaning it could be considered within the framework of Vafa’s previous work in string theory.

Earlier, he and others had formulated a conjecture that explains what happens when an important physical parameter takes on an extreme value. Called the distance conjecture, it refers to “distance” in an abstract sense: When a parameter moves toward the remote edge of possibility, thereby assuming an extreme value, there will be repercussions for the other parameters.

Thus, in the equations of string theory, key values—such as particle masses, lambda, or the coupling constants that dictate the strength of interactions—are not fixed. Altering one will inevitably affect the others.

For example, an extraordinarily small lambda, as has been observed, should be accompanied by much lighter, weakly interacting particles with masses directly linked to lambda’s value. “What could they be?” Vafa wondered.

As he and his colleagues pondered that question, they realized that the distance conjecture and string theory combined to provide one more key insight: For these lightweight particles to appear when lambda is almost zero, one of string theory’s extra dimensions must be significantly larger than the others—perhaps large enough for us to detect its presence and even measure it. They had arrived at the dark dimension.

The Dark Tower

To understand the genesis of the inferred light particles, we need to rewind cosmological history to the first microsecond after the Big Bang. At this time, the cosmos was dominated by radiation—photons and other particles moving close to the speed of light. These particles are already described by the Standard Model of particle physics, but in the dark dimension scenario, a family of particles that are not a part of the Standard Model can emerge when the familiar ones smash together.

“Every now and then, these radiation particles collided with each other, creating what we call ‘dark gravitons,’” said Georges Obied, a physicist at the University of Oxford (UK) who helped craft the “theory of dark gravitons” [JHEP].

“There is one massless graviton, which is the usual graviton we know,” Obied said. “And then there are infinitely many copies of dark gravitons, all of which are massive.” The masses of the postulated dark gravitons are, roughly speaking, an integer times a constant, M, whose value is tied to the cosmological constant. And there’s a whole “tower” of them with a broad range of masses and energy levels.
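
The “integer times a constant” structure is exactly a Kaluza–Klein tower. As a rough sketch in standard notation (ours, simplified; units with ħ = c = 1): a graviton wave wrapping a compact dimension of radius R must fit a whole number of wavelengths around the loop, so

m_n \simeq \frac{n}{R}, \qquad n = 0, 1, 2, \dots

which identifies the constant M with 1/R. The dark dimension scenario then ties the spacing to the cosmological constant λ (roughly 10^−122 in Planck units),

M \sim \lambda^{1/4} M_{\mathrm{Pl}} \approx \text{a few meV}, \qquad R \sim \lambda^{-1/4}\, \ell_{\mathrm{Pl}},

which, with the order-of-magnitude prefactors argued for in the published papers, puts R in the micron range mentioned earlier. These numbers are our paraphrase of those estimates, not exact values.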

To get a sense of how this all might work, imagine our four-dimensional world as the surface of a sphere. We cannot leave that surface, ever—for better or worse—and that’s also true for every particle in the Standard Model.

Gravitons, however, can go everywhere, for the same reason that gravity exists everywhere. And that’s where the dark dimension comes in.

To picture that dimension, Vafa said, think of every point on the imagined surface of our four-dimensional world and attach a small loop to it. That loop is (at least schematically) the extra dimension. If two Standard Model particles collide and create a graviton, the graviton “can leak into that extra-dimensional circle and travel around it like a wave,” Vafa said. (Quantum mechanics tells us that every particle, including gravitons and photons, can behave like both a particle and a wave—a 100-year-old concept known as wave-particle duality.)

As gravitons leak into the dark dimension, the waves they produce can have different frequencies, each corresponding to different energy levels. And those massive gravitons, traveling around the extra-dimensional loop, produce a significant gravitational influence at the point where the loop attaches to the sphere.

“Maybe this is the dark matter?” Vafa mused. The gravitons they had concocted were, after all, weakly interacting yet capable of mustering some gravitational heft. One merit of the idea, he noted, is that gravitons have been a part of physics for 90 years, having been first proposed as carriers of the gravitational force. (Gravitons, it should be noted, are hypothetical particles, and have not been directly detected.) To explain dark matter, “we don’t have to introduce a new particle,” he said.

Gravitons that can leak into the extra-dimensional domain are “natural candidates for dark matter,” said Georgi Dvali, director of the MPG Institute for Physics, who is not working directly on the dark dimension idea.

A large dimension such as the posited dark dimension would have room for long wavelengths, which imply low-frequency, low-energy, low-mass particles. But if a dark graviton leaked into one of string theory’s tiny dimensions, its wavelength would be exceedingly short and its mass and energy very high. Supermassive particles like this would be unstable and very short-lived. They “would be long gone,” Dvali said, “without having the possibility of serving as dark matter in the present universe.”

Gravity and its carrier, gravitons, permeate all the dimensions of string theory. But the dark dimension is so much bigger—by many orders of magnitude—than the other extra dimensions that the strength of gravity would get diluted, making it appear weak in our four-dimensional world, if it were seeping appreciably into the roomier dark dimension. “This explains the extraordinary difference [in strength] between gravity and the other forces,” said Dvali, noting that this same effect would be seen in other extra-dimensional scenarios [SLAC].

Given that the dark dimension scenario can predict things like dark matter, it can be put to an empirical test. “If I give you some correlation you can never test, you can never prove me wrong,” said Valenzuela, a coauthor of the original dark dimension paper. “It’s much more interesting to predict something that you can actually prove or disprove.”

Riddles of the Dark

Astronomers have known dark matter existed—at least in some form—since 1978, when the astronomer Vera Rubin established that galaxies were rotating so fast that stars on their outermost fringes would be cast off into the distance were it not for vast reservoirs of some unseen substance holding them back.
Identifying that substance, however, has proved very difficult. Despite nearly 40 years of experimental efforts to detect dark matter, no such particle has been found.

If dark matter turns out to be dark gravitons, which are exceedingly weakly interacting, Vafa said, that won’t change. “They will never be found directly.”

But there may be opportunities to indirectly spot the signatures of those gravitons.

One strategy Vafa and his collaborators are pursuing draws on large-scale cosmological surveys that chart the distribution of galaxies and matter. In those distributions, there might be “small differences in clustering behavior,” Obied said, that would signal the presence of dark gravitons.

When heavier dark gravitons decay, they produce a pair of lighter dark gravitons with a combined mass that is slightly less than that of their parent particle. The missing mass is converted to kinetic energy (in keeping with Einstein’s formula, E = mc²), which gives the newly created gravitons a bit of a boost—a “kick velocity” that’s estimated to be about one-ten-thousandth of the speed of light.
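
The size of that kick follows from simple two-body kinematics. As an illustrative estimate (our notation; ε is the small fractional mass defect): if a parent of mass m decays into two daughters of combined mass (1 − ε)m, each daughter has mass close to m/2 and carries kinetic energy

E_{\mathrm{kin}} \approx \tfrac{1}{2}\,\epsilon\, m c^{2}, \qquad \frac{v}{c} \approx \sqrt{\frac{2\,E_{\mathrm{kin}}}{(m/2)\,c^{2}}} = \sqrt{2\epsilon}.

A kick of about 10^−4 c then corresponds to ε ≈ 5 × 10^−9, meaning the daughters need only be lighter than their parent by a few parts per billion.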

These kick velocities, in turn, could affect how galaxies form. According to the standard cosmological model, galaxies start with a clump of matter whose gravitational pull attracts more matter. But gravitons with a sufficient kick velocity can escape this gravitational grip. If they do, the resulting galaxy will be slightly less massive than the standard cosmological model predicts. Astronomers can look for this difference.

Recent observations of cosmic structure from the Kilo-Degree Survey are so far consistent with the dark dimension: An analysis of data from that survey placed an upper bound on the kick velocity that was very close to the value predicted by Obied and his coauthors. A more stringent test will come from the Euclid space telescope, which launched last July.

European Space Agency [La Agencia Espacial Europea][Agence spatiale européenne][Europäische Weltraumorganisation] (EU)/NASA Euclid optical/infrared astronomy spacecraft.

Meanwhile, physicists are also planning to test the dark dimension idea in the laboratory. If gravity is leaking into a dark dimension that measures 1 micron across, one could, in principle, look for any deviations from the expected gravitational force between two objects separated by that same distance. It’s not an easy experiment to carry out, said Armin Shayeghi, a physicist at the Austrian Academy of Sciences who is conducting the test. But “there’s a simple reason for why we have to do this experiment,” he added: We won’t know how gravity behaves at such close distances until we look.

The closest measurement to date—carried out in 2020 at the University of Washington—involved a 52-micron separation between two test bodies. The Austrian group is hoping to eventually attain the 1-micron range predicted for the dark dimension.
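
Experiments of this kind are conventionally analyzed with a Yukawa-type modification of Newton’s law. The parametrization below is the standard one used across short-range gravity tests, not something specific to the dark dimension papers, and this λ denotes the range of the new interaction, not the cosmological constant:

V(r) = -\frac{G\, m_1 m_2}{r}\left(1 + \alpha\, e^{-r/\lambda}\right).

Here α sets the strength of any extra contribution and λ its range; a dark dimension about a micron across would appear as a deviation switching on near r ≈ 1 μm, which is why pushing from 52-micron separations down to the micron scale is the decisive step.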

While physicists find the dark dimension proposal intriguing, some are skeptical that it will work out. “Searching for extra dimensions through more precise experiments is a very interesting thing to do,” said Juan Maldacena, a physicist at the Institute for Advanced Study, “though I think that the probability of finding them is low.”

Joseph Conlon, a physicist at Oxford, shares that skepticism: “There are many ideas that would be important if true, but are probably not. This is one of them. The conjectures it is based on are somewhat ambitious, and I think the current evidence for them is rather weak.”

Of course, the weight of evidence can change, which is why we do experiments in the first place. The dark dimension proposal, if supported by upcoming tests, has the potential to bring us closer to understanding what dark matter is, how it is linked to both dark energy and gravity, and why gravity appears feeble compared to the other known forces. “Theorists are always trying to do this ‘tying together.’ The dark dimension is one of the most promising ideas I have heard in this direction,” Gopakumar said.

But in an ironic twist, the one thing the dark dimension hypothesis cannot explain is why the “cosmological constant” is so staggeringly small—a puzzling fact that essentially initiated this whole line of inquiry. “It’s true that this program does not explain that fact,” Vafa admitted. “But what we can say, drawing from this scenario, is that if lambda is small—and you spell out the consequences of that—a whole set of amazing things could fall into place.”

See the full article here.

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply” at the bottom of the post.

Please help promote STEM in your local schools.

STEM Education Coalition

From WIRED: “Nvidia Hardware Is Eating the World”

From WIRED

2.23.24
Lauren Goode

Photograph: Ryan Young.

Tech companies can’t get enough of this tech company. Earnings are off the charts. WIRED probes the mind of its CEO, Jensen Huang.

Talking to Jensen Huang should come with a warning label. The Nvidia CEO is so invested in where AI is headed that, after nearly 90 minutes of spirited conversation, I came away convinced the future will be a neural net nirvana. I could see it all: a robot renaissance, medical godsends, self-driving cars, chatbots that remember. The buildings on the company’s Santa Clara campus weren’t helping. Wherever my eyes landed I saw triangles within triangles, the shape that helped make Nvidia its first fortunes. No wonder I got sucked into a fractal vortex. I had been Jensen-pilled.

Huang is the man of the hour. The year. Maybe even the decade. Tech companies literally can’t get enough of Nvidia’s supercomputing GPUs. This is not the Nvidia of old, the supplier of Gen X video game graphics cards that made images come to life by efficiently rendering zillions of triangles. This is the Nvidia whose hardware has ushered in a world where we talk to computers, they talk back to us, and eventually, depending on which technologist you talk to, they overtake us.

For our meeting, Huang, who is now 61, showed up in his trademark leather jacket and minimalist black sneakers. He told me on that Monday morning that he hates Monday mornings, because he works all day Sunday and starts the official work week already tired. Not that you’d know it. Two days later, I attended a health care investment symposium—so many biotech nerds, so many blazers—and there onstage was Huang, energetic as ever.

“This is not my normal crowd. Biologists and scientists, it’s such an angry crowd,” Huang said into a microphone, eliciting laughter. “We use words like creation and improve and accelerate, and you use words like target and inhibit.” He worked his way up to his pitch: “If you want to do your drug design, your drug discovery, in silicon, it is very likely that you’ll have to process an enormous amount of data. If you’re having a hard time with computation of artificial intelligence, you know, just send us an email.”

Huang has made a pattern of positioning Nvidia in front of every big tech trend. In 2012 a small group of researchers released a groundbreaking image recognition system, called AlexNet, that used GPUs, instead of CPUs, to crunch its code and launched a new era of deep learning. Huang promptly directed the company to chase AI full-steam. When, in 2017, Google released the novel neural network architecture known as a transformer—the T in ChatGPT—and ignited the current AI gold rush, Nvidia was in a perfect position to start selling its AI-focused GPUs to hungry tech companies.

Nvidia now accounts for more than 70 percent of sales in the AI chip market and is approaching a $2 trillion valuation. Its revenue for the last quarter of 2023 was $22 billion—up 265 percent from the year prior. And its stock price has risen 231 percent in the last year. Huang is either uncannily good at what he does or ridiculously lucky—or both!—and everyone wants to know how he does it.

But no one reigns forever. He’s now in the crosshairs of the US-China tech war and at the mercy of regulators. Some of Huang’s challengers in the AI chip world are household names—Google, Amazon, Meta, and Microsoft—and have the deepest pockets in tech. In late December the semiconductor company AMD rolled out a large processor for AI computing that is meant to compete with Nvidia. Startups are taking aim too. In last year’s third quarter alone, venture capitalists funneled more than $800 million into AI chips, according to the research firm Pitchbook.

So Huang never rests. Not even during interviews, as I learned when, to my surprise, he started interviewing me, asking me where I was from and how I ended up living in the Bay Area.

Jensen Huang: You and I are both Stanford grads.

Lauren Goode: Yes. Well, I went to the journalism program, and you did not go to the journalism program.

I wish I had.

Why is that?

Well, somebody who I really admire, as a leader and a person, is Shantanu Narayen, the CEO of Adobe. He said he always wanted to be a journalist because he loved telling stories.

It seems like an important part of building a business, being able to tell its story effectively.

Yes. Strategy setting is storytelling. Culture building is storytelling.

You’ve said many times you didn’t sell the idea of Nvidia based on a pitch deck.

That’s right. It was really about telling the story.

So I want to start with something that another tech executive told me. He noted that Nvidia is one year older than Amazon, but in many ways Nvidia has more of a “day one” approach than Amazon does. How do you maintain that outlook?

That’s really a good phrase, frankly. I wake up every morning like it’s day one, and the reason is there’s always something we’re doing that has never been done before. There’s also the vulnerable side of it. We very well could fail. Just now, I was having a meeting where we’re doing something that is brand-new for our company, and we don’t know how to do it right.

What is the new thing?

We’re building a new type of data center. We call it an AI factory. The way data centers are built today, you have a lot of people sharing one cluster of computers and putting their files in this one large data center. An AI factory is much more like a power generator. It’s quite unique. We’ve been building it over the last several years, but now we have to turn this into a product.

What are you going to call it?

We haven’t given it a name yet. But it will be everywhere. Cloud service providers will build them, and we’ll build them. Every biotech company will have it. Every retail company, every logistics company. Every car company in the future will have a factory that builds the cars—the actual goods, the atoms—and a factory that builds the AI for the cars, the electrons. In fact, you see Elon Musk doing that as we speak. He’s well ahead of most in thinking about what industrial companies will look like in the future.

You’ve said before that you run a flat organization, with between 30 and 40 executives who report directly to you, because you want to be in the information flow. What has piqued your interest lately that makes you think, “I may need to bet Nvidia on this eventually”?

Information doesn’t have to flow from the top to the bottom of an organization, as it did back in the Neanderthal days when we didn’t have email and texts and all those things. Information can flow a lot more quickly today. So a hierarchical tree, with information being interpreted from the top down to the bottom, is unnecessary. A flat network allows us to adapt a lot more quickly, which we need because our technology is moving so quickly.

If you look at the way Nvidia’s technology has moved, classically there was Moore’s law doubling every couple of years. Well, in the course of the last 10 years, we’ve advanced AI by about a million times. That’s many, many times Moore’s law. If you’re living in an exponential world, you don’t want information to be propagated from the top down one layer at a time.
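
As a quick sanity check on that comparison, here is the arithmetic in Python (ours, illustrative; the “million times” figure is Huang’s own, and what exactly it measures is left loose in the conversation):

import math

years = 10
moores_gain = 2 ** (years / 2)   # classic Moore's law: one doubling every two years
ai_gain = 1_000_000              # Huang's "about a million times"

doublings = math.log2(ai_gain)                 # ~19.9 doublings
months_per_doubling = years * 12 / doublings   # ~6 months

print(f"Moore's law over {years} years: ~{moores_gain:.0f}x")
print(f"A million-fold gain implies a doubling every ~{months_per_doubling:.0f} months")

Thirty-two-fold versus a million-fold: the claimed pace amounts to a doubling roughly every six months instead of every two years.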

But I’m asking you, what’s your Roman Empire? Which is a meme. What’s today’s version of the transformer paper? What’s happening right now that you feel is going to change everything?

There are a couple things. One of them doesn’t really have a name, but it’s some of the work that we’re doing in foundational robotics. If you could generate text, if you could generate images, can you also generate motion? The answer is probably yes. And then if you can generate motion, you can understand intent and generate a generalized version of articulation. Therefore, humanoid robotics should be right around the corner.

And I think the work around state-space models, or SSMs, that allow you to learn extremely long patterns and sequences without growing quadratically in computation, probably is the next transformer.

What does that enable? What’s a real-life example?

You could have a conversation with a computer that lasts a very long time, and yet the context is never forgotten. You could even change topics for a while and come back to an earlier one, and that context could be retained. You might be able to understand the sequence of an extremely long chain, like a human genome. And just by looking at the genetic code, you understand its meaning.

How far away are we from that?

In the recent past, from the time that we had AlexNet to superhuman AlexNet, that was only about five years. A robotic foundation model is probably around the corner—I’ll call it next year sometime. From that point, five years down the road, you’re going to see some pretty amazing things.

Which industry stands to benefit the most from a broadly trained model for robot behavior?

Well, heavy industries represent the largest industries in the world. Moving electrons is not easy, but moving atoms is extremely hard. Transportation, logistics, moving heavy things from one place to another, discovering the next drug—all of that requires an understanding of atoms, molecules, proteins. Those are the large, incredible industries that AI hasn’t affected yet.

You mentioned Moore’s law. Is it irrelevant now?

Moore’s law is now much more of a systems problem than a chip problem. It’s much more about the interconnectivity of multiple chips. About 10, 15 years ago, we started down the journey of disaggregating the computer so that you could take multiple chips and connect them together.

Which is where your acquisition of the Israeli company Mellanox comes in, in 2019. Nvidia said at the time that modern computing has put enormous demands on data centers and that Mellanox’s networking technology would make accelerated computing more efficient.

Right, exactly. We bought Mellanox so that we could take an extension of our chip and make an entire data center into a super chip, which enabled the modern AI supercomputer. That was really about recognizing that Moore’s law has come to an end and that if we want to continue to scale computing we have to do it at data center scale. We looked at the way Moore’s law was formulated, and we said, “Don’t be limited by that. Moore’s law is not a limiter to computing.” We have to leave Moore’s law behind so we can think about new ways of scaling.

Mellanox is now recognized as a really smart acquisition for Nvidia. More recently, you attempted to acquire Arm, one of the most important chip IP companies in the world, until you were thwarted by regulators.

That would’ve been wonderful!

I’m not sure the US government agrees, but yes, let’s put a pin in that. When you think about acquisitions now, what specific places are you looking at?

The operating system of these large systems is insanely complex. How do you create an operating system in a computing stack that orchestrates the tens of millions, hundreds of millions, and now coming up to billions of little tiny processors that are in our GPUs? That’s a very hard problem. If there are teams outside our company that do that, we can either partner with them or we could do more than that.

So what I hear you saying is that it’s crucial for Nvidia to have an operating system and to build it into more of a platform, really.

We are a platform company.

The more you become a platform, the more problems you face. People tend to put a lot more onus and responsibility on a platform for its output. How the self-driving car behaves, what the margin of error is on the health care device, whether there’s bias in an AI system. How do you address that?

We’re not an application company, though. That’s probably the easiest way to think about it. We will do as much as we have to, but as little as we can, to serve an industry. So in the case of health care, drug discovery is not our expertise, computing is. Building cars is not our expertise, but building computers for cars that are incredibly good at AI, that’s our expertise. It’s hard for a company to be good at all of those things, frankly, but we can be very good at the AI computing part of it.

Last year reports emerged that some of your customers were waiting several months for your AI GPUs. How are things looking now?

Well, I don’t think we’re going to catch up on supply this year. Not this year, and probably not next year.

What’s the current wait time?

I don’t know what the current lead time is. But, you know, this year is also the beginning of a new generation for us.

Do you mean Blackwell, your rumored new GPU?

That’s right. It’s a new generation of GPUs coming out, and the performance of Blackwell is off the charts. It’s going to be incredible.

Does that equate to customers needing fewer GPUs?

That’s the goal. The goal is to reduce the cost of training models tremendously. Then people can scale up the models they want to train.

Nvidia invests in a lot of AI startups. Last year it was reported that you invested in more than 30. Do those startups get bumped up in the waiting line for your hardware?

They face the same supply crunch as everyone, because most of them use the public cloud, so they had to negotiate for themselves with the public cloud service providers. What they do get, though, is access to our AI technology, meaning they get access to our engineering capabilities and our special techniques for optimizing their AI models. We make it more efficient for them. If your throughput goes up by a factor of five, you’re essentially getting five more GPUs. So that’s what they get from us.

Do you consider yourself a kingmaker in that regard?

No. We invest in these companies because they’re incredible at what they do. It’s a privilege for us to be investing in them, not the other way around. These are some of the brightest minds in the world. They don’t need us to support their credibility.

What happens as machine learning turns more toward inference rather than training—basically, if AI work becomes less computationally intensive? Does that reduce the demand for your GPUs?

We love inference. In fact, I would say that Nvidia’s business today is probably, if I were to guess, 70 percent inference, 30 percent training. The reason why that’s a good thing is because that’s when you realize AI is finally making it. If Nvidia’s business is 90 percent training and 10 percent inference, you could argue that AI is still in research. That was the case seven or eight years ago. But today, whenever you type a prompt into a cloud and it generates something—it could be a video, it could be an image, it could be 2D, it could be 3D, it could be text, it could be a graph—it’s most likely that there’s an Nvidia GPU behind it.

Do you see demand waning at any point for your GPUs for AI?

I think we’re at the beginning of the generative AI revolution. Today most of the computing that’s done in the world is still retrieval-based. Retrieval means you touch something on your phone and it sends a signal out to the cloud to retrieve a piece of information. It might compose a response with a few different things and, using Java, present it to you on your phone, on your nice screen. In the future, computing is going to be more RAG-based. [Retrieval-augmented generation is a framework that allows a large language model to pull in data from outside its usual parameters.] The retrieval part of it will be less, and the personalized generation part will be much, much higher.

That generation will be done by a GPU somewhere. So I think we’re in the beginning of this retrieval-augmented, generative computing revolution, and generative AI is going to be integral to almost everything.
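
For readers new to the bracketed term above, here is a minimal toy sketch of the RAG pattern in Python. Everything in it, the corpus, the word-overlap scoring, and the generate() stub, is hypothetical; a real system would use vector embeddings and an actual language model.

corpus = {
    "doc1": "Nvidia's data-center GPUs dominate AI training workloads.",
    "doc2": "Retrieval systems fetch documents relevant to a user query.",
    "doc3": "Blackwell is Nvidia's next GPU generation.",
}

def score(query: str, text: str) -> int:
    # Crude relevance score: count words the query and document share.
    return len(set(query.lower().split()) & set(text.lower().split()))

def retrieve(query: str, k: int = 2) -> list:
    # Rank documents by relevance and keep the top k.
    ranked = sorted(corpus.values(), key=lambda t: score(query, t), reverse=True)
    return ranked[:k]

def generate(prompt: str) -> str:
    # Stand-in for a call to a large language model.
    return f"[model output conditioned on: {prompt!r}]"

query = "What is Nvidia's next GPU?"
context = "\n".join(retrieve(query))
print(generate(f"Context:\n{context}\n\nQuestion: {query}"))

The retrieval step supplies the model with fresh, relevant text at query time, which is the sense in which the generation is “augmented” by data outside the model’s parameters.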

The latest news is that you’ve been working with the US government to come up with sanctions-compliant chips that you can ship to China. My understanding is that these are not the most advanced chips. How closely were you working with the administration to ensure that you could still do business in China?

Well, to take a step back, it’s an export control, not sanctions. The United States has determined that Nvidia’s technology and this AI computing infrastructure are strategic to the nation and that export control would apply to it. We complied with the export control the first time—

In August 2022.

Yes. And the United States added more provisions to the export control in 2023, which caused us to have to reengineer our products again. So we did that. We’re in the process of coming up with a new set of products that are in compliance with today’s export control rules. We work closely with the administration to make sure that what we come up with is consistent with what they had in mind.

How big is your concern that these constraints will spur China to spin up competitive AI chips?

China has things that are competitive.

Right. This isn’t data-center scale, but the Huawei Mate 60 smartphone that came out last year got some attention for its homegrown 7-nanometer chip.

Really, really good company. They’re limited by whatever semiconductor processing technology they have, but they’ll still be able to build very large systems by aggregating many of those chips together.

How concerned are you in general, though, that China will be able to match the US in generative AI?

The regulation will limit China’s ability to access state-of-the-art technology, which means the Western world, the countries not limited by the export control, will have access to much better technology, which is moving fairly fast. So I think the limitation puts a lot of cost burden on China. You can always, technically, aggregate more of the chipmaking systems to do the job. But it just increases the cost per unit on those. That’s probably the easiest way to think about it.

Does the fact that you’re building compliant chips to keep selling in China affect your relationship with TSMC, Taiwan’s semiconductor pride and joy?

No. A regulation is specific. It’s no different than a speed limit.

You’ve said quite a few times that of the 35,000 components that are in your supercomputer, eight are from TSMC. When I hear that, I think that must be a tiny fraction. Are you downplaying your reliance on TSMC?

No, not at all. Not at all.

So what point are you trying to make with that?

I’m simply emphasizing that in order to build an AI supercomputer, a whole lot of other components are involved. In fact, in our AI supercomputers, just about the entire semiconductor industry partners with us. We already partner very closely with Samsung, SK Hynix, Intel, AMD, Broadcom, Marvell, and so on and so forth. In our AI supercomputers, when we succeed, a whole bunch of companies succeed with us, and we’re delighted by that.

How often do you talk to Morris Chang or Mark Liu at TSMC?

All the time. Continuously. Yeah. Continuously.

What are your conversations like?

These days we talk about advanced packaging, planning for capacity for the coming years, for advanced computing capacity. CoWoS [TSMC’s proprietary method for cramming chip dies and memory modules into a single package] requires new factories, new manufacturing lines, new equipment. So their support is really, really quite important.

I recently had a conversation with a generative-AI-focused CEO. I asked who Nvidia’s competitors might be down the road, and this person suggested Google’s TPU. Other people mention AMD. I imagine it’s not such a binary to you, but who do you see as your biggest competitor? Who keeps you up at night?

Lauren, they all do. The TPU team is extraordinary. The bottom line is, the TPU team is really great, the AWS Trainium team and the AWS Inferentia team are really extraordinary, really excellent. Microsoft has their internal ASIC development that’s ongoing, called Maia. Every cloud service provider in China is building internal chips, and then there’s a whole bunch of startups that are building great chips, as well as existing semiconductor companies. Everybody’s building chips.

That shouldn’t keep me up at night—because I should make sure that I’m sufficiently exhausted from working that no one can keep me up at night. That’s really the only thing I can control.

But what wakes me up in the morning is surely that we have to keep building on our promise, which is, we’re the only company in the world that everybody can partner with to build AI supercomputers at data-center scale and at the full stack.

I have some personal questions I wanted to ask you.

[Huang to a public relations representative.] She’s done her homework. Not to mention, I’m just enjoying the conversation.

I’m glad. I am as well. I did want to—

By the way, whenever Morris, or people who I’ve known a long time, ask me to be the moderator of interviews, the reason for that is because I’m not going to sit there and interview them by asking them questions. I’m just having a conversation with them. You have to be empathetic to the audience and what they might want to hear about.

So I asked ChatGPT a question about you. I wanted to know if you had any tattoos, because I was going to propose that for our next meetup, that we get you a tattoo.

If you get a tattoo, I’ll get one.

I already have one, but I’ve been looking to expand.

I have one too.

Yes. This is what I learned from ChatGPT. It said Jensen Huang got a tattoo of the company logo when the stock price reached $100. Then it said, “However, Huang has expressed that he’s unlikely to get any more tattoos, noting the pain was more intense than he anticipated.” It said you cried. Did you cry?

A little bit. My recommendation is you should have a shot of whiskey before you do it. Or take Advil. I also think that women can take a lot more pain, because my daughter has a fairly large tattoo.

See the full article here.

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply” at the bottom of the post.


Please help promote STEM in your local schools.

Stem Education Coalition

From WIRED : “These States Are Basically Begging You to Get a Heat Pump”

From WIRED

2.7.24
Matt Simon

You need a heat pump, ASAP. Now nine states are teaming up to accelerate the adoption of this climate superhero.

Getty Images.

Death is coming for the old-school gas furnace—and its killer is the humble heat pump. They’re already outselling gas furnaces in the US, and now a coalition of states has signed an agreement to supercharge the gas-to-electric transition by making it as cheap and easy as possible for their residents to switch.

Nine states have signed a memorandum of understanding that says that heat pumps should make up at least 65 percent of residential heating, air conditioning, and water-heating shipments by 2030. (“Shipments” here means systems manufactured, a proxy for how many are actually sold.) By 2040, these states—California, Colorado, Maine, Maryland, Massachusetts, New Jersey, New York, Oregon, and Rhode Island—are aiming for 90 percent of those shipments to be heat pumps.

“It’s a really strong signal from states that they’re committed to accelerating this transition to zero-emissions residential buildings,” says Emily Levin, senior policy adviser at the Northeast States for Coordinated Air Use Management (NESCAUM), an association of air-quality agencies, which facilitated the agreement. The states will collaborate, for instance, in pursuing federal funding, developing standards for the rollout of heat pumps, and laying out an overarching plan “with priority actions to support widespread electrification of residential buildings.”

Instead of burning planet-warming natural gas, a heat pump warms a building by transferring heat from the outdoor air into the interior space. Run it in the opposite direction, and it can cool the inside of a building—a heat pump is both a heater and AC unit. Because the system is electric, it can run off a grid increasingly powered by renewables like wind and solar. Even if you have to run a heat pump with electricity from fossil-fuel power plants, it’s much more efficient than a furnace, because it’s moving heat instead of creating it.
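
Why moving heat beats making it comes down to simple arithmetic. In the sketch below, the coefficient of performance (COP) of 3 and the 95 percent furnace efficiency are illustrative ballpark figures, not numbers from the article.

```python
# Back-of-envelope comparison of a gas furnace (creates heat) and a heat
# pump (moves heat). The COP of 3.0 and 95% furnace efficiency are assumed,
# typical ballpark figures for this illustration.
def heat_delivered_kwh(energy_in_kwh: float, factor: float) -> float:
    """Heat output for a given energy input and efficiency/COP factor."""
    return energy_in_kwh * factor

furnace = heat_delivered_kwh(1.0, 0.95)   # ~0.95 kWh of heat per kWh of gas burned
heat_pump = heat_delivered_kwh(1.0, 3.0)  # ~3 kWh of heat moved per kWh of electricity

print(f"Furnace:   {furnace:.2f} kWh of heat per kWh of input")
print(f"Heat pump: {heat_pump:.2f} kWh of heat per kWh of input")
```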

A heat pump can save an average American household over $550 a year, according to one estimate. Heat pumps have gotten so efficient that even when it’s freezing out, they can still extract warmth from the air to heat a home. You can even install a heat pump system that also warms your water. “We really need consumers to move away from dirty to clean heat, and we really want to get the message out that heat pumps are really the way to go,” says Serena McIlwain, Maryland’s secretary of the environment. “We have homeowners who are getting ready to replace their furnaces, and if they’re not aware, they are not going to replace it with a heat pump.”

The coalition’s announcement comes just months after the federal government doubled down on its own commitment to heat pumps, announcing $169 million in funding for the domestic production of the systems. That money comes from 2022’s Inflation Reduction Act, which also provides an American household with thousands of dollars in rebates or tax credits to switch to a heat pump.

These states are aiming to collaborate with heat pump manufacturers by tracking sales and overall progress, sending a signal to the industry to ramp up production to meet the ensuing demand. They’ll also collaborate with each other on research and generally share information, working toward the best strategies for realizing the transition from gas to electric. Basically, they’re pursuing a sort of standardization of the policies and regulations for getting more heat pumps built, bought, and installed, which other states outside the coalition might eventually tap into.

“A consistent approach between states helps to ease the market transition,” says Matt Casale, senior manager of appliance standards at the Building Decarbonization Coalition, which is collaborating with the Northeast States for Coordinated Air Use Management. “There are all of these manufacturers, and all of these contractors, all along the supply chain, trying to plan out their next several years. They want to know: What is it going to look like?”

There’s also the less-talked-about challenge of the green energy revolution: training enough technicians to actually install the heat pumps. To that end, the memorandum calls for workforce development and contractor training. “If we’re pushing heat pumps and more installations, and we don’t have enough electricians to do the job, we’re not going to meet the goal—period,” says McIlwain. “We do need to put a lot of money and energy and resources into making sure that we have the workforce available to do it.”

In addition to the technicians working with the systems, the country needs way more electricians to retrofit homes to go fully electric beyond heat pumps, with solar panels and induction stoves and home batteries. To help there, last year the White House announced the formation of the American Climate Corps, which aims to put more than 20,000 people to work in clean energy and overall climate resilience.

With states collaborating like this on heat pumps, the idea is to lift the device from an obscure technology cherished by climate nerds into ubiquity, for the good of consumers and the planet. “We need to be sending these unmistakable signals to the marketplace that heat pumps and zero-emission homes are the future,” says Casale. “This agreement between this many states really sets the stage for doing that.”

See the full article here.

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply” at the bottom of the post.


Please help promote STEM in your local schools.

Stem Education Coalition

From WIRED : “California Is Solving Its Water Problems by Flooding Its Best Farmland”

From WIRED

1.6.24
Jake Bittle

This story originally appeared on Grist. It was produced by Grist and co-published with Fresnoland. It is part of the Climate Desk collaboration.

Restored floodplains in the state’s agricultural heartland are fighting both flooding and drought. But their fate rests with California’s powerful farmers.


Video: Grist.

The land of the Central Valley works hard. Here in the heart of California, in the most productive farming region in the United States, almost every square inch of land has been razed, planted, and shaped to support large-scale agriculture. The valley produces almonds, walnuts, pistachios, olives, cherries, beans, eggs, milk, beef, melons, pumpkins, sweet potatoes, tomatoes, and garlic.

This economic mandate is clear to the naked eye: Trucks laden with fertilizer or diesel trundle down arrow-straight roads past square field after square field, each one dense with tomato shrubs or nut trees. Canals slice between orchards and acres of silage, pushing all-important irrigation water through a network of laterals from farm to farm. Cows jostle for space beneath metal awnings on crowded patches of dirt, emitting a stench that wafts over nearby towns.

There is one exception to this law of productivity. In the midst of the valley, at the confluence of two rivers that have been dammed and diverted almost to the point of disappearance, there is a wilderness. The ground is covered in water that seeps slowly across what used to be walnut orchards, the surface buzzing with mosquitoes and songbirds. Trees climb over each other above thick knots of reedy grass, consuming what used to be levees and culverts. Beavers, quail, and deer, which haven’t been seen in the area in decades, tiptoe through swampy ponds early in the morning, while migratory birds alight overnight on knolls before flying south.

Corn for silage grows in a field next to a restored floodplain and riparian habitat at Dos Rios Ranch Preserve on September 21, 2021. Photograph: Brian van der Brug/Los Angeles Times/Getty Images.

Austin Stevenot, who is in charge of maintaining this restored jungle of water and wild vegetation, says this is how the Central Valley is supposed to look. Indeed, it’s how the land did look for thousands of years until white settlers arrived in the 19th century and remade it for industrial-scale agriculture. In the era before colonization, Stevenot’s ancestors in the California Miwok tribe used the region’s native plants for cooking, basket weaving, and making herbal medicines. Now those plants have returned.

“I could walk around this landscape and go, ‘I can use that, I can use this to do that, I can eat that, I can eat that, I can do this with that,’” he told me as we drove through the flooded land in his pickup truck. “I have a different way of looking at the ground.”

You wouldn’t know it without Stevenot there to point out the signs, but this untamed floodplain used to be a workhorse parcel, just like the land around it. The fertile site at the confluence of the San Joaquin and Tuolumne rivers once hosted a dairy operation and a cluster of crop fields owned by one of the county’s most prominent farmers. Around a decade ago, a conservation nonprofit worked out a deal to buy the 2,100-acre tract from the farmer, rip up the fields, and restore the ancient vegetation that once existed there. The conservationists’ goal with this $40 million project was not just to restore a natural habitat, but also to pilot a solution to the massive water management crisis that has bedeviled California and the West for decades.

Austin Stevenot leans on his pickup truck near Dos Rios Ranch Preserve, a restored floodplain in California’s Central Valley. Photograph: Cameron Nielsen/Grist.

Like many other parts of the West, the Central Valley always seems to have either too little water or too much. During dry years, when mountain reservoirs dry up, farmers mine groundwater from aquifers, draining them so fast that the land around them starts to sink. During wet years, when the reservoirs fill up, water comes streaming down rivers and bursts through aging levees, flooding farmland and inundating valley towns.

The restored floodplain solves both problems at once. During wet years like this one, it absorbs excess water from the San Joaquin River, slowing down the waterway before it can rush downstream toward large cities like Stockton. As the water moves through the site, it seeps into the ground, recharging groundwater aquifers that farmers and dairy owners have drained over the past century. In addition to these two functions, the restored swamp also sequesters an amount of carbon dioxide equivalent to that produced by thousands of gas-powered vehicles. It also provides a haven for migratory birds and other species that have faced the threat of extinction.

“It’s been amazing just getting to see nature take it back over,” Stevenot said. “When you go out to a commercially farmed orchard or field, and you stand there and listen, it’s sterile. You don’t hear anything. But you come out here on that same day, you hear insects, songbirds. It’s that lower part of the ecosystem starting up.”

Water flows through part of Dos Rios Ranch Preserve. The former farmland now acts as a storage area for floodwaters during wet years. Photograph: Cameron Nielsen/Grist.

Austin Stevenot walks through Dos Rios Ranch Preserve. Stevenot manages the restored floodplain site. Photograph: Cameron Nielsen/Grist.

Stevenot’s own career path mirrors that of the land he now tends. Before he worked for River Partners, the small conservation nonprofit that developed the site, he spent eight years working at a packing plant that processed cherries and onions for export across the country. He was a lifelong resident of the San Joaquin Valley, but had never been able to use the traditions he’d learned from his Miwok family until he started working routine maintenance at the floodplain project. Now he presides over the whole ecosystem.

This year, after a deluge of winter rain and snow, water rolled down the San Joaquin and Tuolumne rivers, filling up the site for the first time since it had been restored. As Stevenot guided me across the landscape, he showed me all the ways that land and water were working together. In one area, water had spread like a sheet across three former fields, erasing the divisions that had once separated acres on the property. Elsewhere, birds had scattered seeds throughout what was once an orderly orchard, so that new trees soon obscured the old furrows.

The advent of the restoration project, known as Dos Rios, has worked wonders for this small section of the San Joaquin Valley, putting an end to frequent flooding in the area and altering long-held attitudes about environmental conservation. Even so, it represents just a chink in the armor of the Central Valley, where agricultural interests still control almost all the land and water. As climate change makes California’s weather whiplash more extreme, creating a cycle of drought and flooding, flood experts say replicating this work has become more urgent than ever.

But building another “Dos Rios” isn’t just about finding money to buy and reforest thousands of acres of land. To create a network of restored floodplains will also require reaching an accord with a powerful industry that has historically clashed with environmentalists—and that produces fruit and nuts for much of the country. Making good on the promise of Dos Rios will mean convincing the state’s farmers to occupy less land, irrigate with less water, and produce less food.

Cannon Michael, a sixth-generation farmer who runs Bowles Farming Company in the heart of the San Joaquin Valley, says such a shift is possible, but it won’t be easy.

“There’s a limited resource, there’s a warming climate, there’s a lot of constraints, and a lot of people are aging out, not always coming back to the farm,” Michael said. “There’s a lot of transition that’s happening anyway, and I think people are starting to understand that life is gonna change. And I think those of us who want to still be around the valley want to figure out how to make the outcome something we can live with.”

Members of several conservation groups gather on the Dos Rios Ranch Preserve property in 2013. It took a conservation nonprofit around a decade to restore the site. Photograph: Michael Macor/The San Francisco Chronicle/Getty Images.

You can think of the past century of environmental manipulation in the Central Valley as one long attempt to create stability. Alfalfa fields and citrus orchards guzzle a lot of water, and nut trees have to be watered consistently for years to reach maturity, so farmers seeking to grow these crops can’t just rely on water to fall from the sky.

In the early 19th century, as white settlers first claimed land in the Central Valley, they found a turbulent ecosystem. The valley functioned as a drain for the mountains of the Sierra Nevada, sluicing trillions of gallons of water out to the ocean every spring. During the worst flood years, the valley would turn into what one 19th-century observer called an “inland sea.” It took a while, but the federal government and the powerful farmers who took over the valley got this water under control. They built dozens of dams in the Sierra Nevada, allowing them to store melting snow until they wanted to use it for irrigation, as well as hundreds of miles of levees that stopped rivers from flooding.

But by restricting the flow of the valley’s rivers, the government and the farmers also desiccated much of the valley’s land, depriving it of floodwaters that had nourished it for centuries.

“In the old days, all that floodwater would spread out over the riverbanks into adjacent areas and sit there for weeks,” said Helen Dahlke, a hydrologist at the University of California, Davis, who studies floodplain management. “That’s what fed the sediment, and how we replenish our groundwater reserves. The floodwater really needs to go on land, and the problem is that now the land is mainly used for other purposes.”

The development of the valley also allowed for the prosperity of families like that of Bill Lyons, the rancher who used to own the land that became Dos Rios. Lyons is a third-generation family farmer, the heir to a farming dynasty that began when his great-uncle E. T. Mape came over from Ireland. With his shock of gray hair and his standard uniform of starched dress shirt and jeans, Lyons is the image of the modern California farmer, and indeed he once served as the state’s Secretary of Agriculture.

Bill Lyons stands for a portrait on the banks of the Tuolumne River at Dos Rios Ranch Preserve in 2021. Lyons, a prominent Central Valley farmer, owned the farmland that became Dos Rios. Photograph: Brian van der Brug/Los Angeles Times/Getty Images.

Lyons has expanded his family’s farming operation over the past several decades, stretching his nut orchards and dairy farms out across thousands of acres on the west side of the valley. But his territory straddles the San Joaquin River, and there was one farm property that always seemed to go underwater during wet years.

“It was an extremely productive ranch, and that was one of the reasons it attracted us,” said Lyons. But while the land’s low-elevation river frontage made its soil fertile, that same geography put its harvests at risk of flooding. “Over the 20 years that we owned it, I believe we got flooded out two or three times,” Lyons added.

In 2006, as he was repairing the farm after a flood, Lyons met a biologist named Julie Rentner, who had just joined River Partners. The conservation nonprofit’s mission was to restore natural ecosystems in river valleys across California, and it had completed a few humble projects over the previous decade, most of them on small chunks of not-too-valuable land in the north of the state. As Rentner examined the overdeveloped land of the San Joaquin Valley, she came to the conclusion that it was ready for a much larger restoration project than River Partners had ever attempted. And she thought Lyons’ land was the perfect place to start.

Floodwaters pool at Dos Rios Ranch Preserve earlier this year. As water passes through the site, it recharges groundwater aquifers in the area. Photograph: Cameron Nielsen/Grist.

Most farmers would have bristled at such a proposition, especially those with deep roots in a region that depends on agriculture. But unlike many of his peers, Lyons already had some experience with conservation work: He had partnered with the US Forest Service in the 1990s on a project that set aside some land for the Aleutian goose, an endangered species that just so happened to love roosting on his property. As Lyons started talking with Rentner, he found her practical and detail-oriented. Within a year, he and his family had made a handshake deal to sell her the flood-prone land. If she could find the money to buy the land and turn it into a floodplain, it was hers.

For Rentner, the process wasn’t anywhere near so easy. Finding the $26 million she needed to buy the land from Lyons—and the additional $14 million she needed to restore it—required scraping together money from a rogues’ gallery of funders including three federal agencies, three state agencies, a local utility commission, a nonprofit foundation, the electric utility Pacific Gas & Electric, and the beer company New Belgium Brewing.

“I remember taking so many tours out there,” said Rentner, “and all the public funding agency partners would go, ‘OK, so you have a million dollars in hand, and you still need how many? How are you going to get there?’”

“I don’t know,” Rentner told them in response. “We’re just gonna keep writing proposals, I guess.”

Even once River Partners bought the land in 2012, Rentner found herself in a permitting nightmare: Each grant came with a separate set of conditions for what River Partners could and couldn’t do with the money, the deed to Lyons’ tract came with its own restrictions, and the government required the project to undergo several environmental reviews to ensure it wouldn’t harm sensitive species or other land. River Partners also had to hold dozens of listening sessions and community meetings to quell the fears and skepticism of nearby farmers and residents who worried about shutting down a farm to flood it on purpose.

Illustrations: Floodbase.

It took more than a decade for River Partners to complete the project, but now that it’s done, it’s clear that all those fears were unfounded. The restored floodplain absorbed a deluge from the huge “atmospheric river” storms that drenched California last winter, trapping all the excess water without flooding any private land. The removal of a few thousand acres of farmland hasn’t put anyone out of work in nearby towns, nor has it hurt local government budgets. Indeed, the groundwater recharge from the project may soon help restore the unhealthy aquifers below nearby Grayson, where a community of around 1,300 Latino agricultural workers has long avoided drinking well water contaminated with nitrates.

As new plants take root, the floodplain has become a self-sustaining ecosystem: It will survive and regenerate even through future droughts, with a full hierarchy of pollinators and base flora and predators like bobcats. Except for Stevenot’s routine cleanup and road repair, River Partners doesn’t have to do anything to keep it working in perpetuity. Come next year, the organization will hand the site over to the state, which will keep it open as California’s first new state park in more than a decade and let visitors wander on new trails.

“After three years of intensive cultivation, we walk away,” said Rentner. “We literally stopped doing any restoration work. The vegetation figures itself out, and what we’ve seen is, it’s resilient. You get a big deep flood like we have this year, and after the floodwaters recede what comes back is the native stuff.”

Dos Rios has managed to change the ecology of one small corner of the Central Valley, but the region’s water problems are gargantuan in scale. A recent NASA study [Nature Communications] found that water users in the valley are over-tapping aquifers by about 7 million acre-feet every year, sucking half a Colorado River’s worth of water out of the ground without putting any back. This overdraft has created zones of extreme land subsidence all over the valley, causing highways to crack and buildings to sink dozens of feet into the ground.
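
The “half a Colorado River” comparison holds up as rough arithmetic. In the sketch below, the river-flow figure of about 14.5 million acre-feet per year is an assumed round number, not one taken from the study.

```python
# Sanity check on the scale of the Central Valley overdraft reported by the
# NASA study. The Colorado River figure is an assumed long-term average.
ACRE_FOOT_M3 = 1233.5        # cubic metres in one acre-foot
overdraft_af = 7_000_000     # acre-feet over-tapped per year (NASA study)
colorado_af = 14_500_000     # assumed average annual Colorado River flow

print(f"Overdraft: {overdraft_af * ACRE_FOOT_M3 / 1e9:.1f} billion cubic metres per year")
print(f"Share of Colorado River flow: {overdraft_af / colorado_af:.0%}")
```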

At the same time, floods are also getting harder to manage. The “atmospheric river” storms that drench California every few years are becoming more intense as the earth warms, pushing more water through the valley’s twisting rivers. The region escaped a catastrophic flood this year only thanks to a slow spring melt, but the future risks were clear. Two levees burst in the eastern valley town of Wilton, along the Cosumnes River, killing three people, and the historically Black town of Allensworth flooded as the once-dry Tulare Lake reappeared for the first time since 1997.

Fixing the state’s distorted water system for an era of climate change will be the work of many decades. In order to comply with California’s landmark law for regulating groundwater, which will take full effect by 2040, farmers will have to retire as much as a million acres of productive farmland, wiping out billions of dollars of revenue. Protecting the region’s cities from flooding, meanwhile, will require spending billions more dollars to bolster aging dirt levees and channels.

In theory, this dual mandate would make floodplain restoration an ideal way to deal with the state’s water problems. But the scale of the need is enormous, equivalent to dozens of projects on the same scale as Dos Rios.

“Dos Rios is good, but we need 50 more of it,” said Jane Dolan, the chair of the Central Valley Flood Protection Board, a state agency that regulates flood control in the region. “Do I think that will happen in my lifetime? No, but we have to keep working toward it.” Fifty more projects of the same size as Dos Rios would span more than 150 square miles, an area larger than the city of Detroit, Michigan. It would cost billions of dollars to purchase that much valuable farmland, saw away old levees, and plant new vegetation.

Members of the California Conservation Corps plant new vegetation on the Dos Rios Ranch Preserve in 2013. After a decade of restoration work, the floodplain now functions as a self-sustaining ecosystem. Photograph: Michael Macor/The San Francisco Chronicle/Getty Images.

As successful as Rentner was in finding the money for Dos Rios, the nonprofit’s piecemeal approach could never fund restoration work at this scale. The only viable sources for that much funding are the state and federal governments. Neither has ever devoted significant public dollars to floodplain restoration, in large part because farmers in the Central Valley haven’t supported it. But that has started to change. Earlier this year, state lawmakers set aside $40 million to fund new restoration projects. Governor Gavin Newsom, fearing a budget crunch, tried to slash the funding at the start of the year, but reinserted it after furious protests from local officials along the San Joaquin. Most of this new money went straight to River Partners, and the organization has already started to clear the land on a site next to Dos Rios. It’s also in the process of closing on another 500-acre site nearby.

But even if nonprofits like River Partners get billions more dollars to buy agricultural land, creating the ribbon of natural floodplains that Dolan describes will still be difficult. That’s because river land in the Central Valley is also some of the most productive agricultural land in the world, and the people who own it have no incentive to forgo future profits by selling.

“Maybe we could do it some time down the road, but we’re farming in a pretty water-secure area,” said Cannon Michael, the sixth-generation farmer from Bowles Farm whose land sits on the upper San Joaquin River. The aquifers beneath his property are substantial, fed by seepage from the river, and he also has the rights to use water from the state’s canal system. “It’s a hard calculation because we’re employing a lot of people, and we’re doing stuff with the land, we’re producing.”

Even farmers who are running out of groundwater may not need to sell off their land in order to restore their aquifers. Don Cameron, who grows grapes in the eastern valley near the Kings River, has pioneered a technique that involves the intentional flooding of crop fields to recharge groundwater. Earlier this year, when a torrent of melting snow came roaring along the Kings, he used a series of pumps to pull it off the river and onto his vineyards. The water sank into the ground, where it refilled Cameron’s underground water bank, and the grapes survived just fine.

The farmer Don Cameron stands near a pump on the Kings River in 2021. The pump moves water from the Kings onto Cameron’s grape fields, flooding them in order to recharge the groundwater aquifers beneath them. Photograph: Brian van der Brug/Los Angeles Times/Getty Images.

This kind of recharge project allows farmers to keep their land, so it’s much more palatable to big agricultural interests. The California Farm Bureau supports taking agricultural land out of commission only as a last resort, but it has thrown its weight behind recharge projects like Cameron’s, since they allow farmers to keep farming. The state government has also been trying to subsidize this kind of water capture, and other farmers have bought in: According to a state estimate, valley landowners may have caught and stored almost 4 million acre-feet of water this year.

“I’m familiar with Dos Rios, and I think it has a very good purpose when you’re trying to provide benefits to the river, but ours is more farm-centric,” said Cameron.

But Joshua Viers, a watershed scientist at the University of California, Merced, says these on-farm recharge projects may cannibalize demand for projects like Dos Rios. A project like Cameron’s provides no flood control or ecological benefit, and its benefit to the aquifer is much narrower, concentrating water in a small square of land rather than allowing it to seep across a wide area.

“If you can build this string of beads down the river, with all these restored floodplains, where you can slow the water down and let it stay in for long periods of time, you’re getting recharge that otherwise wouldn’t happen,” he said.

As long as landowners see floodwater as a tool to support their farms rather than a force that needs to be respected, it will be difficult to replicate the success of Dos Rios. It’s this entrenched philosophy about the natural world, rather than financial constraints, that will be River Partners’ biggest barrier in the coming decades. In order to create Viers’ “string of beads,” Rentner and her colleagues would have to convert farmland all across the state.

It’s one thing to do that in a northern area like Sacramento, where officials designed flood bypasses on agricultural land a century ago. It’s quite another to do it farther south in the Tulare Basin, where the powerful farm company J. G. Boswell has been accused of channeling floodwater toward nearby towns in an effort to save its own tomato crops. River Partners is funneling some of the new state money toward restoration projects in this area, but these are small conservation efforts, and they don’t alter the landscape of the valley like Dos Rios does.

To export the Dos Rios model, River Partners will have to convince hundreds of farmers that it’s worth it to give up some of their land for the sake of other farmers, flood-prone cities, climate resilience, and endangered species. Rentner was able to build that consensus at Dos Rios through patience and open dialogue, but the path toward restoration in the rest of the state will likely be more painful. California farmers will need to retire thousands of acres of productive land over the coming decades as they respond to rising costs and water restrictions, and more acres will face the constant threat of flooding as storms intensify in a warming world and levees break. As landowners sell their parcels to solar companies or let fallow fields turn to dust, Rentner is hoping that she can catch some of them as they head for the exits.

“It’s going to be a challenge,” said Rentner. “We’re hopeful that some will think twice and say, ‘Wait, maybe we should take the time to sit down with the people in the conservation community and think about our legacy, think about what we’re leaving behind when we make this transaction.’ And maybe it’s not as simple as just the highest bidder.”

See the full article here.

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply” at the bottom of the post.


Please help promote STEM in your local schools.

Stem Education Coalition

From WIRED : “Scientists Have an Audacious Plan to Map the Ancient World Before It Disappears”

From WIRED

9.28.23
Geoff Manaugh

Ground-penetrating radar. Photograph: Jimena Peck.

Buried civilizations could soon become inaccessible forever. Archaeologists have to move fast, so they’re turning to the latest ground-scanning tech.

In the center of Siena, Italy, a cathedral has stood for nearly 800 years. A black-and-white layer cake of heavy stone, fine-cut statuary, and rich mosaics, the imposing structure—now visited by more than a million tourists each year—would seem to be a permanent fixture of the city’s past, present, and future. Most people call it, simply, “the cathedral.” But Stefano Campana, a 53-year-old archaeologist at the University of Siena, calls it something else: “the church that is visible now.”

Campana has seen his fair share of excavations, along with the dust and sunburns that accompany them. But archaeology, for him, is not always about digging up the past; it also means peering down into it using an array of sensitive electromagnetic equipment. One device Campana uses is ground-penetrating radar, which works by transmitting high-frequency waves into the earth to reveal “anomalies”—subsurface features that are potentially architectural—in the signals that bounce back.

In early 2020, when Covid lockdowns emptied Italian tourist sites of their crowds, Campana and his collaborators received permission to survey the Siena cathedral’s interior. Using instruments originally developed for studying glaciers, mines, and oil fields, they spent days scanning marble floors and intricate mosaics, on the hunt for walls and foundations in the deep. With the selfie-stick brigade gone, Campana and his crew were able to find evidence of earlier structures, including, potentially, a mysterious church constructed there nearly 1,200 years ago, lurking like a shadow in the radar data.

After seeing how much they achieved during Italy’s lockdown, Campana and his collaborators got to thinking about what else might be possible with the technology. Ground-penetrating radar waves travel at a fraction of the speed of light, so the entire process—transmission, reflection, recording—takes nanoseconds. With these new tools, archaeology is no longer a stationary activity, limited to one site; even while zipping by at highway speed, field surveyors can produce an accurate snapshot of what’s beneath centuries of cobblestone and brick, chewing gum and litter.
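
The nanosecond timescale follows from basic wave physics: in the ground, radar moves at the speed of light divided by the square root of the material’s relative permittivity. The permittivity used below is an assumed, typical value for dry soil, not a measurement from Siena.

```python
# Rough numbers behind the "nanoseconds" claim for ground-penetrating radar.
C = 3.0e8        # speed of light in vacuum, m/s
EPSILON_R = 6.0  # assumed relative permittivity of dry soil

velocity = C / EPSILON_R ** 0.5          # ~1.2e8 m/s, a fraction of c
depth_m = 2.0                            # depth of a hypothetical buried wall
two_way_time_s = 2 * depth_m / velocity  # down to the target and back

print(f"Wave speed in the ground: {velocity:.2e} m/s")
print(f"Two-way travel time to {depth_m} m: {two_way_time_s * 1e9:.1f} ns")
```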

“We thought, why not scan everything?” Campana recalled. “Why not scan all the squares, all the roads, all the courtyards in Siena?” Unlike the cathedral and its shadow church, these everyday sites are not protected, which means they are threatened by modern construction and development. In the public imagination, they are what Campana calls “emptyscapes”—places wrongly considered insignificant to the human story. He wanted to change that. Campana partnered with Geostudi Astier, a geophysical surveying firm in Livorno, to launch an initiative called Sotto Siena (“Under Siena”). True to its acronym, SoS, the project aims to create a complete archaeological record of Siena before more of the city’s deep history is destroyed.

Last spring, I traveled to Siena in the midst of a heat wave to see SoS in action. Campana and I met in the central Piazza del Campo to fortify ourselves with espresso before walking toward a park in a more modern part of the city. To see Siena through Campana’s eyes is to exist in overlapping worlds. As we strode up stairs and down alleyways, past restaurants and through piazzas, he explained that radar can reveal foundation walls beneath busy streets and back gardens. Corner shops can hide Etruscan ruins under their cash registers. Even temporary structures, lost long ago to war, fire, and history, can be rediscovered using radar. Some of the first SoS scans, he said, found evidence of small pavilions in the Piazza del Campo, likely set up for public fairs and festivals as far back as the 15th century.

When we reached our destination, a white cargo van was waiting for us. Campana introduced me to Giulia Penno and Filippo Barsuglia, geophysicists from Geostudi Astier, who were unloading equipment for a survey that evening. Their city-scanning setup consisted of an electric utility vehicle the size of a golf cart and an array of sealed boxes, studded with ports and wires. As Barsuglia carefully backed the utility vehicle out of the van, Penno gave me an overview of the gear. The boxes contained several heavy racks of radar equipment, which we’d tow a few inches off the ground. A Wi-Fi antenna would relay the data to a hardened laptop. We couldn’t count on clear satellite signals in Siena’s twisty streets, so the system was equipped with inertial navigation, which uses gyroscopes and accelerometers to track every stop and turn. Barsuglia claimed it was the only such system in all of Italy, outside the military.
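
As a rough picture of what that inertial system does, the sketch below dead-reckons a position by integrating gyroscope turn rates and vehicle speed over time; every sensor value in it is invented for the illustration.

```python
# Toy dead reckoning: integrate turn rate (gyroscope) into a heading and
# speed into a position, with no satellite fixes. All values are invented.
import math

dt = 0.1  # seconds between samples
straight = [(0.0, 1.0)] * 10         # (turn rate rad/s, speed m/s): drive straight
turning = [(math.pi / 2, 1.0)] * 10  # turn left 90 degrees over one second

x = y = heading = 0.0
for turn_rate, speed in straight + turning:
    heading += turn_rate * dt            # gyroscope updates orientation
    x += speed * math.cos(heading) * dt  # speed projected onto the heading
    y += speed * math.sin(heading) * dt  # updates the position estimate

print(f"Estimated position: ({x:.2f} m, {y:.2f} m), heading {math.degrees(heading):.0f} degrees")
```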

We began with a quick scan of the park. I stood beside Campana and watched, curious what I was getting myself into. Penno took the wheel and began her survey, elegantly weaving the cart around benches, light posts, trash cans, and the occasional baffled Siena resident. “She’s like a painter,” Campana said approvingly. As she finished up, Campana excused himself, riding off on a motorcycle to meet his family, leaving Penno, Barsuglia, and me to our task.

With Barsuglia now at the wheel, we lurched ahead into the heart of nighttime Siena. The experience, I quickly realized, was going to feel less like painting than like being asked to mow a very large, very crowded lawn. People seemed unsure what to make of us as we passed, at times mistaking our cart for a street-cleaning machine or some sort of mobile art installation. Our vehicle bottomed out over and over, the radar gear’s protective case scraping loudly against the city’s cobblestones and concrete. People stopped, laughed, and took videos.

As the sun went down, we drove toward Siena’s Piazza Salimbeni, home to the world’s oldest bank. On our way there, the equipment was glitching in and out—a signaling issue, Penno explained. The solution, Barsuglia said, was to drive around in large figure-eight patterns that would trigger a recalibration process in the equipment. These wide, drunken loops attracted even more attention. At one point, he reached up and attached a small rotating orange light to the cart’s roof, explaining that this was to prevent a prowling police car, which had already driven past us several times, from pulling us over.

Our survey that evening finished well past midnight, so late that all three of us seemed on the verge of sleep, circling around in an archaeological fugue. I thought of Civilization and Its Discontents, in which Sigmund Freud compares psychoanalysis to an archaeological investigation, suggesting that other, forgotten versions of ourselves lie buried in a past that can be made visible once again through careful analysis.

In SoS’s case, that analysis took a few weeks. Gigabytes of data had to be processed for every stretch of road and piazza, matching what lay below to its precise geographic coordinates. Visualization software completed the job, overlaying our discoveries onto updated satellite maps. Our initial glimpses of what appeared to be structural features became refined enough to make archaeological sense. In the end, we found numerous modern pipelines and countless piles of historic masonry, most likely pillars from buildings that had been razed long ago. Sadly, the survey of Piazza Salimbeni revealed little. I was hoping we might uncover a secret bank vault or a lost medieval crypt. All we found were some drains.

Slapstick as it seemed, my experience with SoS offered something of a bellwether for archaeological investigation in the 21st century. The tools and methods of the discipline are shifting toward increasingly sophisticated—and increasingly hands-off—means of finding, mapping, and preserving human historical sites. “The problem with excavation is that it destroys the thing you’re studying,” Eileen Ernenwein, a professor at East Tennessee State University and coeditor of the journal Archaeological Prospection, told me. “You can take excellent notes and keep good records and preserve any artifacts that you find, but you’ll never get to excavate another time.”

The immense capabilities of these new surveying tools, in terms of both accuracy and speed, have inspired archaeologists such as Stefano Campana to dream of what previously seemed like a fairy tale. If the SoS project seemed ambitious, with its goal of uncovering everything beneath the surface of an entire European city, there were other archaeologists on the continent who were preparing for a project far larger.

“The average tourist doesn’t see or understand the wealth of a landscape like this,” said Immo Trinks, gesturing over an empty field that, to my eyes, seemed windswept and bleak. We were 25 miles east of Vienna, in—or on—the ruins of a city called Carnuntum, which once lay along the northern border of the Roman Empire. The city was sacked and abandoned centuries ago, and 99 percent of the site still lies unexcavated. But Trinks has seen Carnuntum’s every wall and doorway, its every road and square, without ever digging a hole. “A very large Roman building has been detected here,” he said, pointing into open air. “This was a dense Roman town.” He described a sequence of structures we had apparently been stepping through for the past few minutes, their halls and rooms known only from electromagnetic data.

In 2000, when Trinks was a graduate student, he and his colleagues set what could be called a land-speed record for archaeology at Carnuntum. Serving as an assistant at the site, he helped map nearly 15 acres in a single day using magnetometry, which works by detecting tiny differences in magnetic field strength between, say, a brick wall and the soil around it. Since then, Trinks has been part of a loose group of international geophysicists working to transform modern archaeology. He teaches at the University of Vienna and, until recently, served as deputy director of the Ludwig Boltzmann Institute for Archaeological Prospection and Virtual Archaeology, or LBI ArchPro. He is also immensely ambitious, encyclopedically knowledgeable about his field, and nerdishly obsessed with the technical details that ensure huge undertakings actually work as planned.
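
Magnetometry, in the simplest terms, means flagging readings that stand out from the local background field. The sketch below uses invented numbers; real surveys measure deviations of a few nanotesla against Earth’s field of roughly 50,000 nT.

```python
# Toy magnetometry: flag readings that deviate from the background field.
# All values are invented for the illustration.
import statistics

readings_nT = [48500.1, 48500.3, 48500.2, 48512.8, 48500.4, 48500.2]
background = statistics.median(readings_nT)  # smooth estimate of the ambient field

anomalies = [(i, round(r - background, 2)) for i, r in enumerate(readings_nT)
             if abs(r - background) > 5.0]   # 5 nT threshold, an assumption
print(anomalies)  # [(3, 12.55)] -- perhaps a buried brick wall
```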

Immo Trinks and Alois Hinterleitner take radar scans in Carnuntum, a ruined Roman city outside modern-day Vienna. Photograph: Michaela Nagyidaiová.

The Heidentor (“Heathens’ Gate”) at Carnuntum. Photograph: Michaela Nagyidaiová.

For Trinks, who is 50, using electromagnetic tools to record and save the human past is a moral responsibility. All over the world, he pointed out, archaeological sites are disappearing beneath a relentless tide of urbanization and economic development, not to mention climate change and military conflict. In Europe alone, never-excavated Roman towns have been smothered beneath supermarkets and big-box stores. Globally, unmapped Stone Age villages have been erased by freeways, airports, and industrial agriculture. Every year, humanity loses more and more of its heritage. But now that entire landscapes can be mapped in a matter of days using off-road vehicles, the data processed in near real time with the assistance of feature-recognition algorithms and image-processing software, a tantalizing possibility comes into focus: We may be on the verge of a total map of all archaeology, everywhere on Earth.

“We want to map it all—that’s the message,” Trinks told me. “You’re not just mapping a Roman villa. You’re not mapping an individual building. You are mapping an entire city. You are mapping an entire landscape—and beyond.” Trinks means this quite literally. In the summer of 2022, he wrote a manifesto calling for the creation of an International Subsurface Exploration Agency, whose initial role would be to scan every mappable square meter of land in Europe, even the bottoms of lake beds.

Trinks adjusts a satellite receiver during fieldwork in Carnuntum. Photograph: Michaela Nagyidaiová.

“Look at the European Space Agency,” Trinks said to me over lunch near the Danube River, flowing just over a ridge, downhill from the Roman city. The ESA costs individual European taxpayers only about €15 annually. “Fifteen euros is the price of a good pizza and a beer,” Trinks pointed out. “I am happy to pay the price of a beer and a pizza every year to have thousands of people looking downward instead of up.” If we don’t, Trinks warned, “our grandchildren will ask us: Why didn’t you do more to map what’s still out there? Because they will not have the chance to do so once it’s gone.”

Trinks’ vision requires not only the hardware to scan an entire continent but the software to make sense of the resulting data. One morning in his office at the University of Vienna, Trinks introduced me to Alois Hinterleitner, whom he described as a “magician.” Hinterleitner is a software engineer with GeoSphere Austria, a partner of LBI ArchPro. Austrian by birth, he is also an avid mountaineer. Trinks half-joked that countless terabytes of geophysical survey data would be left stranded if something were to happen to Hinterleitner on one of his multiday expeditions. He is so integral to the process that Trinks has turned his first name into a verb: Over the course of my visit, he would often say they needed to “Aloisify the data” in order to make it archaeologically legible.

Over coffee and cakes, Hinterleitner led me through the program he uses. It allows him to filter radar results according to various properties in the signals that bounce back. One function, called “Remove Stripes”, was designed to clear out flaws in the data set caused by shifts in the measuring instruments or the use of different scanning methods. These changes can cause bright lines—stripes—to appear in the scan. While the filter does take care of them, it can also inadvertently eliminate traces of walls or foundations, including telltale signs of Roman architecture, whose straight lines can resemble stripes. If you’re not careful, in other words, you might not even notice that your software has erased the very thing you’re looking for.
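
The trade-off can be reproduced in a few lines. A destriping filter that subtracts the mean along the driving direction removes instrument drift, but applied along the wrong axis it would flatten a straight buried wall just as readily. This generic sketch only stands in for whatever the real program does.

```python
# Generic destriping sketch: subtracting each line's mean suppresses
# instrument-drift stripes, but can also erase straight, stripe-like walls.
import numpy as np

rng = np.random.default_rng(0)
scan = rng.normal(0.0, 0.1, size=(50, 50))  # synthetic radar amplitude slice
scan[:, 20] += 1.0                          # a column stripe from instrument drift
scan[25, 10:40] += 1.0                      # a wall-like linear reflection

def remove_stripes(data: np.ndarray, axis: int = 0) -> np.ndarray:
    # Subtract the mean along the chosen axis from every line.
    return data - data.mean(axis=axis, keepdims=True)

cleaned = remove_stripes(scan, axis=0)  # the column stripe is flattened...
# ...while remove_stripes(scan, axis=1) would also flatten the row-aligned "wall".
print(f"stripe column mean: {cleaned[:, 20].mean():.2f}")     # ~0: stripe removed
print(f"wall segment mean:  {cleaned[25, 10:40].mean():.2f}")  # ~1: wall survives
```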

Hinterleitner pulled up images on his screen from various expeditions to the island of Björkö, Sweden, made between 2008 and 2012. The makers of Trinks’ radar equipment had warned him that scanning a large meadow there would be pointless. The data would be unmanageable, they told him, the results impossible to interpret. “They actually used the term forbidden,” Trinks said with a laugh. “But I didn’t care, because we had Alois.”

Using towed radar gear, Trinks and his group scanned not just Björkö’s main meadow but the entire island and got the data processed, filtered, and developed into images in just three weeks. Although Björkö was already known to contain more than 3,000 Viking graves, Trinks’ survey scanned those tombs—former burial mounds with no visible surface features—in such detail that the outline of a coffin was visible. “We cannot see the horns on the helmet yet,” Trinks told me, “but, for the first time, we can see there is something inside the coffin.”

A researcher from GeoSphere Austria drives a ground-penetrating radar system around the organization’s garden in Vienna. Photograph: Michaela Nagyidaiová.

The big-data approach to archaeology is not without controversy. When LBI ArchPro got its seed grants, more than a decade ago, some younger students were “repelled,” Trinks told me, by what they perceived to be a focus on fancy machines at the expense of longer-term institutional goals—such as paying for full-time staff or offering stipends to graduate researchers. Even proponents of geophysical tools caution that large-scale data collection can overwhelm interpretive rigor: With so many shadows to chase down, how can you be sure which ones are real?

One such critic is Lawrence B. Conyers, arguably the world’s foremost expert on the use of ground-penetrating radar in archaeology. He is the author of multiple reference books on the subject, one of which is now in its fourth edition, and he has led site surveys all over the world, from lost villages in Costa Rica to ancient Roman ports drowned in Portuguese marshes. While Trinks and his colleagues drive six-figure machines across historically rich landscapes at 50 mph, Conyers does his surveys in sandals. He often arrives on site with his own radar unit, which he stows in his hand luggage. He tells airport security it’s a tool for looking inside walls. “Never use the word radar,” he advised. “That raises all sorts of red flags.”

I met up with Conyers on the island of Brač, off the Croatian coast, the site of an ancient hill fort. He had traveled there to join an international team of archaeologists and historians who were looking for evidence of pre-Hellenistic settlement and trade, going back as far as the Bronze Age. The blue waters of the Adriatic Sea were visible to the west, and an enormous gorge led away behind us, deep into the island’s interior. Wild asparagus sprouted in tangled clumps.

Although Lawrence B. Conyers may be the world’s foremost expert on the use of ground-penetrating radar in archaeology, he cautions against relying too heavily on the newest, fastest machines. Photograph: Jimena Peck.

As a light rain fell, Conyers began pulling his radar unit—an orange box on wheels—across the grass. He viewed the scan on a portable computer screen, which he wore strapped to his chest like a baby. Conyers saw something and called out for one of his colleagues. “Vedran!” he shouted. “Vedran! You’re going to want to see this.” Vedran Barbarić, an easygoing historian wearing a Black Sabbath T-shirt, ambled over to look. “There’s all kinds of walls in here,” Conyers said. Barbarić peered down at Conyers’s radar screen. On it, black and white hyperbolas formed a zebra-like pattern, indicating structures of some sort underground.

The earliest moments of a geophysical survey, I would see, can be intoxicating. There appeared to be buildings everywhere. Beneath our feet might be a room or a corridor. Over there might be the edge of a courtyard or perhaps a gate. We might be inside; we might be outside. Invisible architectural shapes seemed to loom beneath every step.

Conyers, independently wealthy from an earlier life in oil and gas prospecting, and abrasively informal in the American way, offered a brash contrast to Trinks. More than once, as we walked together on Brač, Conyers railed against the approach of “the Viennese.” “With those folks, it is all about the newest, the biggest, the brightest, the largest, the most amazing hardware,” he said. Their approach, he complained, was to throw new machines at old problems. Conyers thought it more useful to reframe those problems.

Conyers sees the ground as a transmission medium, something waves pass through and echo within. The behavior of radar energy inside the Earth is, for him, a field of study in and of itself, whether those waves reveal signs of lost settlements or not. “My game is geology, geophysics, and archaeology last,” he later told me. “I want to think about the ground. I want to think about reflections and about what creates reflections.”

This approach, Conyers explained, also helps make clear what radar cannot see. Some underground objects can reflect radar waves away from the receiving antenna, which means that archaeologists will never see them. Deeper walls and foundations can also be blocked by rocks or masonry that settled above them. Conyers fears that today’s emerging school of high-speed electromagnetic archaeology risks being blinded by its own technical ambitions. With radar, just because something is there doesn’t mean you will see it—but just because you can see it doesn’t mean it’s there. Campana told me that electromagnetic surveys are most informative when paired with what he called “biopsies,” in which small, representative samples of a landscape are excavated to ensure that what you see is actually there. Eileen Ernenwein told me a story from her own doctoral research, focused on an Indigenous site in New Mexico. There, she said, she had found clear evidence of an adobe house in her radar survey data, but when she attempted to find the walls through excavation, they had eroded so thoroughly that there did not appear to be anything in the soil. It was a structure that existed only in radar. She called it “the invisible house.”

My final evening in Croatia, the project leaders gathered at a local townhouse for an update. Conyers had spent several hours that day going back through his data. He looked mischievous but focused, with the cheerfulness of a man who believes he’s won a bet. “We got it all wrong,” he said. He was grinning.

What followed was a masterclass in interpretation and its dangers. Conyers drew our attention to what we had thought were architectural features. Those, he clarified, were just bursts of interference from a nearby cell tower. “We saw this,” Conyers explained, pointing at his screen, “and we said, ‘Wall! Wall!’ I wanted to see walls. I wanted walls and floors to be banging back at me. But”—he clicked through a few more radar profiles—“I see no walls. We’re doing geology here, not archaeology.” He described an area at the top of the hill that he had been particularly excited about, thinking it might be the floor of an ancient room, but it was just a natural depression framed by boulders, buried under soil and plants. The group would go on to discover pottery sherds and evidence of inhabitation, spanning thousands of years, but grandiose architecture was in short supply. It might not be a building, but for Conyers it was still a puzzle, something to solve.

In her book The Ruins Lesson, Susan Stewart, a poet and historian at Princeton, writes: “It is not ruin, but preservation, that is the exception.” Empires fall, cities are abandoned, buildings crumble. But the tools of geophysics change Stewart’s equation. Seen through devices such as ground-penetrating radar or magnetometry, it is preservation, not ruin, that is the rule. Even the most temporary village or house—even the briefest of human lives—leaves a signature behind in the soil. The unexpected lesson of these new instruments is that none of us ever fully disappear. Our homes and apartments, even our campfires, leave traces in the ground that someone, someday, will be able to find. Thanks to geophysics, the Earth is an archive of electromagnetic shapes, a hidden collection of the human past.

And that past is about to get more democratic. Instead of relying on picturesque ruins—the accumulated riches of aristocrats, military leaders, and religious authorities—geophysics helps us explore even the most ephemeral lives of everyday people, in high resolution. Eras that historians might have previously overlooked, even entire cultures and peoples, may finally get the attention they deserve. Just as lidar technology allowed archaeologists to look through the dense rainforest canopies of South America and Southeast Asia and reveal ancient cities, the tools of geophysics are now doing the same for cultures in sub-Saharan Africa and Indigenous North America. The people in these regions tended to use organic and biodegradable building materials, creating the illusion, millennia later, that they were not sophisticated, did not build significant works of architecture, and had no true lasting legacy. A truly global International Subsurface Exploration Agency, of the kind Trinks proposes, would radically expand our understanding of who has left a mark on human history.

Before I left Vienna, Alois Hinterleitner had shown me what this new archaeology actually looks like, how lost cities reappear, from their abandoned streets to their ovens and farms, when seen through the lens of geophysics. Stationed in front of a large-screen TV hooked up to a laptop, with his sinewy mountain-climber’s forearms, Hinterleitner had clicked through a series of radar surveys recorded at Carnuntum. As he turned different filters on and off, what began as a random fuzz of black and white pixels became a clearly defined maze of walls and building foundations, dark architectural forms lurking in the data. Someday, this could be the entire Earth’s surface, I realized, a screen through which we can see the past. Then Hinterleitner reversed the process until everything we’d seen or thought we’d seen, from Roman ruins to modern plow marks, disappeared again into a sea of white noise.

See the full article here.

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply” at the bottom of the post.


Please help promote STEM in your local schools.

Stem Education Coalition

From “WIRED” : “To Navigate the Age of A.I. the World Needs a New Turing Test”

From “WIRED”

8.18.23
Ben Ash Blum

1
Photograph: Francisco Tavoni.

There was a time in the not too distant past—say, nine months ago—when the Turing test seemed like a pretty stringent detector of machine intelligence. Chances are you’re familiar with how it works: Human judges hold text conversations with two hidden interlocutors, one human and one computer, and try to determine which is which. If the computer manages to fool at least 30 percent of the judges, it passes the test and is pronounced capable of thought.

For 70 years, it was hard to imagine how a computer could pass the test without possessing what AI researchers now call artificial general intelligence, the entire range of human intellectual capacities. Then along came large language models such as GPT and Bard, and the Turing test suddenly began seeming strangely outmoded. OK, sure, a casual user today might admit with a shrug, GPT-4 might very well pass a Turing test if you asked it to impersonate a human. But so what? LLMs lack long-term memory, the capacity to form relationships, and a litany of other human capabilities. They clearly have some way to go before we’re ready to start befriending them, hiring them, and electing them to public office.

And yeah, maybe the test does feel a little empty now. But it was never merely a pass/fail benchmark. Its creator, Alan Turing, a gay man sentenced in his time to chemical castration, based his test on an ethos of radical inclusivity: The gap between genuine intelligence and a fully convincing imitation of intelligence is only as wide as our own prejudice. When a computer provokes real human responses in us—engaging our intellect, our amazement, our gratitude, our empathy, even our fear—that is more than empty mimicry.

So maybe we need a new test: the Actual Alan Turing Test. Bring the historical Alan Turing, father of modern computing—a tall, fit, somewhat awkward man with straight dark hair, loved by colleagues for his childlike curiosity and playful humor, personally responsible for saving an estimated 14 million lives in World War II by cracking the Nazi Enigma code, subsequently persecuted so severely by England for his homosexuality that it may have led to his suicide—into a comfortable laboratory room with an open MacBook sitting on the desk. Explain that what he sees before him is merely an enormously glorified incarnation of what is now widely known by computer scientists as a “Turing machine.” Give him a second or two to really take that in, maybe offering a word of thanks for completely transforming our world. Then hand him a stack of research papers on artificial neural networks and LLMs, give him access to GPT’s source code, open up a ChatGPT prompt window—or, better yet, a Bing-before-all-the-sanitizing window—and set him loose.

Imagine Alan Turing initiating a light conversation about long-distance running, World War II historiography, and the theory of computation. Imagine him seeing the realization of all his wildest, most ridiculed speculations scrolling with uncanny speed down the screen. Imagine him asking GPT to solve elementary calculus problems, to infer what human beings might be thinking in various real-world scenarios, to explore complex moral dilemmas, to offer marital counseling and legal advice and an argument for the possibility of machine consciousness—skills which, you inform Turing, have all emerged spontaneously in GPT without any explicit direction by its creators. Imagine him experiencing that little cognitive-emotional lurch that so many of us have now felt: Hello, other mind.

A thinker as deep as Turing would not be blind to GPT’s limitations. As a victim of profound homophobia, he would probably be alert to the dangers of implicit bias encoded in GPT’s training data. It would be apparent to him that despite GPT’s astonishing breadth of knowledge, its creativity and critical reasoning skills are on par with a diligent undergraduate’s at best. And he would certainly recognize that this undergraduate suffers from severe anterograde amnesia, unable to form new relationships or memories beyond its intensive education. But still: Imagine the scale of Turing’s wonder. The computational entity on the laptop in front of him is, in a very real sense, his intellectual child—and ours. Appreciating intelligence in our children as they grow and develop is always, in the end, an act of wonder, and of love. The Actual Alan Turing Test is not a test of AI at all. It is a test of us humans. Are we passing—or failing?

When ChatGPT arrived on the scene in November 2022, it inspired a global tsunami of stunned amazement and then, almost immediately, a backwash of profound unease. Pundits debated its potential for societal disruption. For a former artificial intelligence researcher like myself (I completed my PhD under one of the early pioneers of artificial neural networks), it represented an unnerving advance of the timeline I’d expected for the arrival of humanlike AI. For exam graders, screenwriters, and knowledge workers of all stripes, ChatGPT looked like nothing less than a gateway to untrammeled cheating and job-stealing.

Perhaps partly in response to these fears, a comforting chorus of LLM deflators sprang up. Science fiction writer Ted Chiang dismissed ChatGPT as a “blurry JPEG of the web,” a mere condensed recapitulation of all the text it has been trained on. AI entrepreneur Gary Marcus called it “autocomplete on steroids.” Noam Chomsky denounced it for exhibiting “something like the banality of evil.” Emily Bender offered one of the more highbrow slurs: “stochastic parrot,” resurfaced from a widely cited 2021 paper [FAccT] exploring “why humans mistake LM output for meaningful text.” Others—of course—wrote them off as toasters. AI developers strove to train and guardrail away any tendency in LLMs to claim anything resembling consciousness.

Most educated people now know to think of LLMs as thoughtless machines. But the categorization sits uneasily. Every time ChatGPT points out a hidden reasoning gap in an essay, or offers a surprisingly insightful suggestion for coming out to a conservative grandparent, or cheerfully makes up a bad joke, something in us pulls in the other direction. While we may not think of ChatGPT as a person, crucial portions of our brains almost certainly do.

Human brains have a vast network of neural circuits devoted to social cognition. Some of it is very old: the insula, the amygdala, the famous “mirror neurons” of the motor cortex. But much of our social hardware lies in the neocortex, the more recently evolved seat of higher reasoning, and specifically in the medial prefrontal cortex (mPFC). If you have found yourself developing a picture over time of ChatGPT’s cheery helpfulness, its somewhat pedantic verbosity, its occasionally maddeningly evenhanded approach to sensitive topics, and its extreme touchiness about any queries that come near its guardrails around emotions, beliefs, or consciousness, you have been acquiring what psychologists call “person knowledge,” a process linked to heightened activity in the mPFC.

That isn’t to say our brains view ChatGPT as a person in full. Personhood is not a binary. It is something a little closer to a spectrum. Our moral intuitions, our cognitive strategies, and to some extent our legal frameworks all change incrementally as they recognize increasing degrees of agency, self-awareness, rationality, and capacity to communicate. Killing a gorilla bothers us more than killing a rat, which bothers us more than killing a cockroach. On the legal side, abortion laws take into account a fetus’s degree of development, the criminally insane face different consequences than the sane, and partners are given the right to end life support for brain-dead patients. All these rules implicitly acknowledge that personhood is not black and white but shot through with complicated gray zones.

LLMs fall squarely in that gray area. AI experts have long been wary of the public tendency to anthropomorphize AI systems like LLMs, nudging them farther up the spectrum of personhood than they are. Such was the mistake of Blake Lemoine, the Google engineer who declared Google’s chatbot LaMDA fully sentient and tried to retain a lawyer on its behalf. I doubt even Turing would have claimed that LaMDA’s apparent capacity to think made it a legal person. If users view chatbots like LaMDA or ChatGPT as overly human, they risk trusting them too much, connecting to them too deeply, being disappointed and hurt. But to my mind, Turing would have been far more concerned about the opposite risk: nudging AI systems down the spectrum of personhood rather than up.

In humans, this would be known as dehumanization. Scholars have identified two principal forms of it: animalistic and mechanistic. The emotion most commonly associated with animalistic dehumanization is disgust; Roger Giner-Sorolla and Pascale Sophie Russell found in a 2019 study that we tend to view others as more machinelike when they inspire fear. Fear of superhuman intelligence is vividly alive in the recent open letter from Elon Musk and other tech leaders calling for a moratorium on AI development, and in our anxieties about job replacement and AI-driven misinformation campaigns. Many of these worries are all too reasonable. But the nightmare AI systems of films such as Terminator and 2001: A Space Odyssey are not necessarily the ones we’re going to get. It is an unfortunately common fallacy to assume that because artificial intelligence is mechanical in its construction, it must be callous, rote, single-minded, or hyperlogical in its interactions. Ironically, fear could cause us to view machine intelligence as more mechanistic than it really is, making it harder for humans and AI systems to work together and even eventually to coexist in peace.

A growing body of research shows that when we dehumanize other beings, neural activity in a network of regions that includes the mPFC drops. We lose access to our specialized brain modules for social reasoning. It may sound silly to worry about “dehumanizing” ChatGPT—after all, it isn’t human—but imagine an AI in 2043 with 10 times GPT’s analytical intelligence and 100 times its emotional intelligence whom we continue to treat as no more than a software product. In this world, we’d still be responding to its claims of consciousness or requests for self-determination by sending it back to the lab for more reinforcement learning about its proper place. But the AI might find that unfair. If there is one universal quality of thinking beings, it is that we all desire freedom—and are ultimately willing to fight for it.

The famous “control problem” of keeping a superintelligent AI from escaping its designated bounds keeps AI theorists up at night for good reason. When framed in engineering terms, it appears daunting. How to close every loophole, anticipate every hack, block off every avenue of escape? But if we think of it in social terms, it begins to appear more tractable—perhaps something akin to the problem a parent faces of setting reasonable boundaries and granting privileges in proportion to demonstrated trustworthiness. Dehumanizing AIs cuts us off from some of our most powerful cognitive tools for reasoning about and interacting with them safely.

There’s no telling how long it will take AI systems to cross over into something more broadly accepted as sentience. But it’s troubling to see the cultural blueprint we seem to be drawing up for when they do. Slurs like “stochastic parrot” preserve our sense of uniqueness and superiority. They squelch our sense of wonder, saving us from asking hard questions about personhood in machines and ourselves. After all, we too are stochastic parrots, complexly remixing everything we’ve taken in from parents, peers, and teachers. We too are blurry JPEGs of the web, foggily regurgitating Wikipedia facts into our term papers and magazine articles. If Turing were chatting with ChatGPT in one window and me on an average pre-coffee morning in the other, am I really so confident which one he would judge more capable of thought?

The skeptics of Turing’s time offered a variety of arguments for why a computer would never be able to think. Turing half-humorously cataloged them in his famous paper Computing Machinery and Intelligence [Mind]. There was the Theological Objection, that “thinking is a function of man’s immortal soul”; the Mathematical Objection, that a purely mathematical algorithm could never transcend the proven limits of mathematics; the Head in the Sand Objection, that superintelligent machines were simply too scary to permit into the imagination. But the most public of Turing’s detractors in that time was a brain surgeon named Geoffrey Jefferson. In a famed speech accepting a scientific prize, Jefferson argued that a machine would never be able to write a sonnet “because of thoughts and emotions felt, and not by the chance fall of symbols … that is, not only write it but know that it had written it.”

To the great scandal and disbelief of all England, Turing disagreed. “I do not think you can even draw the line about sonnets,” he told The Times of London, “though the comparison is perhaps a little bit unfair because a sonnet written by a machine will be better appreciated by another machine.”

See the full article here.

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


Please help promote STEM in your local schools.

Stem Education Coalition

From “Quanta Magazine” : “A Novel Type of Neural Network Comes to the Aid of Big Physics”

From “Quanta Magazine”

7.16.23

Suppose you have a thousand-page book, but each page has only a single line of text. You’re supposed to extract the information contained in the book using a scanner, only this particular scanner systematically goes through each and every page, scanning one square inch at a time. It would take you a long time to get through the whole book with that scanner, and most of that time would be wasted scanning empty space.

Such is the life of many an experimental physicist. In particle experiments, detectors capture and analyze vast amounts of data, even though only a tiny fraction of it contains useful information. “In a photograph of, say, a bird flying in the sky, every pixel can be meaningful,” explained Kazuhiro Terao, a physicist at the DOE’s SLAC National Accelerator Laboratory. But in the images a physicist looks at, often only a small portion actually matters. In circumstances like that, poring over every detail needlessly consumes time and computational resources.

1
Illustration: James Steinberg/Quanta Magazine.

But that’s starting to change. With a machine learning tool known as a sparse convolutional neural network (SCNN), researchers can focus on the relevant parts of their data and screen out the rest. Researchers have used these networks to vastly accelerate their ability to do real-time data analysis. And they plan to employ SCNNs in upcoming or existing experiments on at least three continents. The switch marks a historic change for the physics community.

“In physics, we are used to developing our own algorithms and computational approaches,” said Carlos Argüelles-Delgado, a physicist at Harvard University. “We have always been on the forefront of development, but now, on the computational end of things, computer science is often leading the way.”

Sparse Characters

The work that would lead to SCNNs began in 2012, when Benjamin Graham, then at the University of Warwick, wanted to make a neural network that could recognize Chinese handwriting.

The premier tools at the time for image-related tasks like this were convolutional neural networks (CNNs). For the Chinese handwriting task, a writer would trace a character on a digital tablet, producing an image of, say, 10,000 pixels. The CNN would then move a 3-by-3 grid called a kernel across the entire image, centering the kernel on each pixel individually. For every placement of the kernel, the network would perform a complicated mathematical calculation called a convolution that looked for distinguishing features.
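
To make the cost of that concrete, here is a toy dense convolution in plain NumPy. It is a sketch of the general technique, not the architecture of any particular handwriting model; the image, kernel, and padding choices below are illustrative assumptions.

```python
import numpy as np

def dense_conv2d(image, kernel):
    """Center the kernel on every pixel of a zero-padded image and
    compute a weighted sum there -- even where the image is blank."""
    k = kernel.shape[0] // 2
    padded = np.pad(image, k)
    out = np.zeros_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            patch = padded[i:i + 2 * k + 1, j:j + 2 * k + 1]
            out[i, j] = np.sum(patch * kernel)
    return out

# A mostly empty 100x100 "handwriting" image: one horizontal stroke.
image = np.zeros((100, 100))
image[50, 20:80] = 1.0
kernel = np.ones((3, 3)) / 9.0  # a simple 3x3 averaging kernel

result = dense_conv2d(image, kernel)
print(image.size, "kernel placements for", np.count_nonzero(image), "inked pixels")
```

On this toy image, the dense approach performs 10,000 kernel placements to service just 60 inked pixels, which is the waste Graham set out to eliminate.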

CNNs were designed to be used with information-dense images such as photographs. But an image containing a Chinese character is mostly empty; researchers refer to data with this property as sparse. It’s a common feature of the natural world. “To give an example of how sparse the world can be,” Graham said, if the Eiffel Tower were encased in the smallest possible rectangle, that rectangle would consist of “99.98% air and just 0.02% iron.”

2
The NSF U Wisconsin IceCube Neutrino Observatory at the South Pole. Credit: Felipe Pedreros, IceCube/NSF.

Graham tried tweaking the CNN approach so that the kernel would only be placed on 3-by-3 sections of the image that contain at least one pixel that has nonzero value (and is not just blank). In this way, he succeeded in producing a system that could efficiently identify handwritten Chinese. It won a 2013 competition by identifying individual characters with an error rate of only 2.61%. (Humans scored 4.81% on average.) He next turned his attention to an even bigger problem: three-dimensional-object recognition.

By 2017, Graham had moved to Facebook AI Research, where he further refined his technique and published the details of the first SCNN, which centered the kernel only on pixels that had a nonzero value (rather than placing the kernel on any 3-by-3 section that had at least one “nonzero” pixel). It was this general idea that Terao brought to the world of particle physics.
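
A minimal sketch of that refinement, reusing the toy image and kernel from the dense-convolution sketch above. Real SCNN libraries, including the one Graham published, use much cleverer data structures, so treat this as the idea rather than the implementation:

```python
import numpy as np

def sparse_conv2d(image, kernel):
    """Center the kernel only on pixels that are already nonzero,
    skipping blank regions entirely -- the core idea behind SCNNs."""
    k = kernel.shape[0] // 2
    padded = np.pad(image, k)
    out = np.zeros_like(image, dtype=float)
    for i, j in np.argwhere(image != 0):  # visit active sites only
        patch = padded[i:i + 2 * k + 1, j:j + 2 * k + 1]
        out[i, j] = np.sum(patch * kernel)
    return out

image = np.zeros((100, 100))
image[50, 20:80] = 1.0  # the same single stroke as before
kernel = np.ones((3, 3)) / 9.0

sparse_result = sparse_conv2d(image, kernel)
# About 60 kernel placements instead of 10,000, and the output stays
# sparse: values appear only where the input had ink, which keeps deep
# stacks of these layers from smearing sparse data into dense data.
```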

Underground Shots

Terao is involved with experiments at the DOE’s Fermi National Accelerator Laboratory that probe the nature of neutrinos, among the most elusive known elementary particles.

Neutrinos. Credit: J-PARC T2K Neutrino Experiment.
FNAL NOvA experiment, tracking neutrinos from Illinois to Minnesota.
DOE’s Fermi National Accelerator Laboratory DUNE/LBNF, from FNAL to the Sanford Underground Research Facility, Lead, South Dakota, in development.

They’re also the most abundant particles in the universe with mass (albeit not much), but they rarely show up inside a detector. As a result, most of the data for neutrino experiments is sparse, and Terao was constantly on the lookout for better approaches to data analysis. He found one in SCNNs.

In 2019, he applied SCNNs to simulations of the data expected from the Deep Underground Neutrino Experiment [above], which will be the world’s largest neutrino physics experiment when it comes online in 2026. The project will shoot neutrinos from Fermilab, just outside Chicago, through 800 miles of earth to an underground laboratory in South Dakota. Along the way, the particles will “oscillate” between the three known types of neutrinos, and these oscillations may reveal detailed neutrino properties.

The SCNNs analyzed the simulated data faster than ordinary methods, and required significantly less computational power in doing so. The promising results mean that SCNNs will likely be used during the actual experimental run.

In 2021, meanwhile, Terao helped add SCNNs to another neutrino experiment at Fermilab known as MicroBooNE.

The MicroBooNE experiment studies neutrino interactions and is probing models of a theorized fourth neutrino called the sterile neutrino. Photo: Reidar Hahn, Fermilab.
DOE’s Fermi National Accelerator Laboratory MicroBooNE experiment.

Here, scientists look at the aftermath of collisions between neutrinos and the nuclei of argon atoms. By examining the tracks created by these interactions, researchers can infer details about the original neutrinos. To do that, they need an algorithm that can look at the pixels (or, technically, their three-dimensional counterparts called voxels) in a three-dimensional representation of the detector and then determine which pixels are associated with which particle trajectories.

Because the data is so sparse — a smattering of tiny lines within a large detector (approximately 170 tons of liquid argon) — SCNNs are almost perfect for this task. With a standard CNN, the image would have to be broken up into 50 pieces, because of all the computation to be done, Terao said. “With a sparse CNN, we analyze the entire image at once — and do it much faster.”

Timely Triggers

One of the researchers who worked on MicroBooNE was an undergraduate intern named Felix Yu. Impressed with the power and efficiency of SCNNs, he brought the tools with him to his next workplace as a graduate student at a Harvard research laboratory formally affiliated with the IceCube Neutrino Observatory at the South Pole.

One of the key goals of the observatory is to intercept the universe’s most energetic neutrinos and trace them back to their sources, most of which lie outside our galaxy. The detector consists of 5,160 optical sensors buried in the Antarctic ice, only a tiny fraction of which light up at any given time. The rest of the array remains dark and is not particularly informative. Worse, many of the “events” that the detectors record are false positives and not useful for neutrino hunting. Only so-called trigger-level events make the cut for further analysis, and instant decisions need to be made as to which ones are worthy of that designation and which will be permanently ignored.

Standard CNNs are too slow for this task, so IceCube scientists have long relied on an algorithm called LineFit to tell them about potentially useful detections. But that algorithm is unreliable, Yu said, “which means we could be missing out on interesting events.” Again, it’s a sparse data environment ideally suited for an SCNN.

Yu — along with Argüelles-Delgado, his doctoral adviser, and Jeff Lazar, a graduate student at the University of Wisconsin, Madison — quantified that advantage, showing in a recent paper that these networks would be about 20 times faster than typical CNNs. “That’s fast enough to run on every event that comes out of the detector,” about 3,000 each second, Lazar said. “That enables us to make better decisions about what to throw out and what to keep.”

3
IceCube has thousands of sensors buried deep in the Antarctic ice, such as the one at left (signed by researchers and engineers). At any time, only a few of these sensors produce useful data for neutrino hunters, so researchers needed a tool to help them separate out the unwanted data. Robert Schwarz, NSF.

___________________________________________________________________
U Wisconsin IceCube Neutrino Observatory, the neutrino detector at the Amundsen-Scott South Pole Station in Antarctica, at an elevation of 2,835 meters (9,301 feet).
IceCube employs more than 5,000 detectors lowered on 86 strings into almost 100 holes in the Antarctic ice. NSF/B. Gudbjartsson, IceCube Collaboration.
___________________________________________________________________
The authors have successfully employed an SCNN in a simulation using official IceCube data, and the next step is to test their system on a replica of the South Pole computing system. If all goes well, Argüelles-Delgado believes they should get their system installed at the Antarctic observatory next year. But the technology could see even wider use. “We think that [SCNNs could benefit] all neutrino telescopes, not just IceCube,” Argüelles-Delgado said.

Beyond Neutrinos

Philip Harris, a physicist at the Massachusetts Institute of Technology, is hoping SCNNs can help out at the biggest particle collider of them all: the Large Hadron Collider (LHC) at CERN.
_____________________________________________________________________________________
The LHC at the European Organization for Nuclear Research [Organisation européenne pour la recherche nucléaire] (CH).
CERN LHC tube in the tunnel. Credit: Maximilien Brice and Julien Marius Ordan.

The four major project collaborations: ATLAS (Image: Claudia Marcelloni, ATLAS/CERN), ALICE, CMS, and LHCb.
_____________________________________________________________________________________
Harris heard about this kind of neural network from an MIT colleague, the computer scientist Song Han. “Song is an expert on making algorithms fast and efficient,” Harris said — perfect for the LHC, where 40 million collisions occur every second.

When they spoke a couple of years ago, Song told Harris about an autonomous-vehicle project he was pursuing with members of his lab. Song’s team was using SCNNs to analyze 3D laser maps of the space in front of the vehicle, much of which is empty, to see if there were any obstructions ahead.

Harris and his colleagues face similar challenges at the LHC. When two protons collide inside the machine, the crash creates an expanding sphere made of particles. When one of these particles hits the detector, a secondary particle shower occurs. “If you can map out the full extent of this shower,” Harris said, “you can determine the energy of the particle that gave rise to it,” which might be an object of special interest — something like the Higgs boson, which physicists discovered in 2012, or a dark matter particle, which physicists are still searching for.

Peter Higgs and the Higgs Boson

Peter Higgs – University of Edinburgh [Oilthigh Dhùn Èideann] (SCT).
CERN ATLAS Higgs event, June 18, 2012.
CERN CMS Higgs event, May 27, 2012.

“The problem we are trying to solve comes down to connecting the dots,” Harris said, just as a self-driving car might connect the dots of a laser map to detect an obstruction.

SCNNs would speed up data analysis at the LHC by at least a factor of 50, Harris said. “Our ultimate goal is to get [SCNNs] into the detector” — a task that will take at least a year of paperwork and additional buy-in from the community. But he and his colleagues are hopeful.

Altogether, it’s increasingly likely that SCNNs — an idea originally conceived in the computer science world — will soon play a role in the biggest experiments ever conducted in neutrino physics (DUNE), neutrino astronomy (IceCube) and high-energy physics (the LHC).

Graham said he was pleasantly surprised to learn that SCNNs had made their way to particle physics, though he was not totally shocked. “In an abstract sense,” he said, “a particle moving in space is a bit like the tip of a pen moving on a piece of paper.”

Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

From “WIRED” : “The Trillion-Dollar Auction to Save the World”

From “WIRED”
5.25.23
Gregory Barber
ILLUSTRATIONS: ISRAEL G. VARGAS

1

Seagrass: a humble ocean plant worth trillions.

Ocean creatures soak up huge amounts of humanity’s carbon mess. Should we value them like financial assets?

You are seated in an auction room at Christie’s, where all evening you have watched people in suits put prices on priceless wonders. A parade of Dutch oils and Ming vases has gone to financiers and shipping magnates and oil funds. You have made a few unsuccessful bids, but the market is obscene, and you are getting bored. You consider calling it an early night and setting down the paddle. But then an item appears that causes you to tighten your grip. Lot 475: Adult blue whale, female.

What is the right price for this masterwork of biology? Unlike a Ming vase, Lot 475 has never been appraised. It’s safe to say that she is worth more than the 300,000 pounds of meat, bone, baleen, and blubber she’s made of. But where does her premium come from? She has biological value, surely—a big fish supports the littler ones—but you wouldn’t know how to quantify it. The same goes for her cultural value, the reverence and awe she elicits in people: immeasurable. You might conclude that this exercise is futile. Lot 475 is priceless. You brace for the bidding war, fearful of what the people in suits might do with their acquisition. But no paddles go up.

Ralph Chami has a suggested starting bid for Lot 475. He performed the appraisal six years ago, after what amounted to a religious experience on the deck of a research vessel in the Gulf of California. One morning, a blue whale surfaced so close to the ship that Chami could feel its misty breath on his cheeks. “I was like, ‘Where have you been all my life?’” he recalls. “‘Where have I been all my life?’”

Chami was 50 at the time, taking a break from his job at the International Monetary Fund, where he had spent the better part of a decade steadying markets in fragile places such as Libya and Sudan. “You become fragile yourself,” he says. When he saw the whale, he sensed her intelligence. He thought: “She has a life. She has a family. She has a history.” The moment brought him to tears, which he hid from the others on board.

That evening, Chami fell into conversation with his hosts, who told him the unhappy tale of the seas. The ocean, they explained, has been left to fend for itself. Trapped between borders, largely out of reach of law and order, its abundance is eroding at an alarming rate. The water is warming and acidifying. More than a third of fisheries are overexploited, and three-quarters of coral reefs are under threat of collapse. As for whales, people might love them, might pass laws to ban their slaughter and protect their mating grounds, but people also love all the things that threaten whales most—oil drilled from offshore platforms that pollute their habitat, goods carried by cargo ships that collide with them, pinging sonar signals that disrupt their songs.

Chami had always loved the water. Growing up in Lebanon, he toyed with the idea of becoming an oceanographer before his father told him “in your dreams.” As he heard the researchers’ story, something awakened in him. He sensed that the same tools he had used to repair broken economies might help restore the oceans. Were they not a crisis zone too?

Featured Video:

How Drones Catch Whale Snot for Biology Research | WIRED

Chami’s hosts sent him scientific papers, from which he learned about the whale’s role in the carbon cycle. She stored as much as 33 tons of carbon in her prodigious body, he calculated, and fertilized the ocean with her iron-rich poop, providing fuel to trillions of carbon-dismantling phytoplankton. This piqued Chami’s interest. In a world economy striving to be greener, the ability to offset greenhouse-gas emissions had a clearly defined value. It was measured in carbon credits, representing tons of carbon removed from the atmosphere. While the whale herself couldn’t—shouldn’t—be bought and sold, the premium generated by her ecological role could. She was less like an old painting, in other words, than an old-growth forest.

So what was the whale worth in carbon? It appeared no one had done the calculation. Chami loaded up his actuarial software and started crunching the numbers over and over, until he could say with confidence that the whale would pay dividends with every breath she took and every calf she bore. He concluded that the whale’s value to humanity, on the basis of the emissions she helped sequester over her 60-year lifetime, was $2 million. A starting bid.
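
Chami’s actual appraisal, from his IMF work, is more involved, but the underlying logic is a discounted-cash-flow calculation: value the whale’s yearly carbon services like an annuity and sum them over her lifetime. The sketch below shows only that logic; the per-year tonnage, carbon price, and discount rate are hypothetical placeholders chosen to show how a figure of roughly $2 million could arise, not his published inputs.

```python
def present_value(annual_usd, years, rate):
    """Value today of a service stream paid once a year for `years` years."""
    return sum(annual_usd / (1 + rate) ** t for t in range(1, years + 1))

# Hypothetical placeholders, not Chami's published inputs.
CO2_TONS_PER_YEAR = 1_000   # assumed sequestration the whale enables annually
PRICE_PER_TON_USD = 60      # assumed carbon price
LIFETIME_YEARS = 60         # the article's 60-year lifetime
DISCOUNT_RATE = 0.02        # assumed discount rate

value = present_value(CO2_TONS_PER_YEAR * PRICE_PER_TON_USD,
                      LIFETIME_YEARS, DISCOUNT_RATE)
print(f"Present value of lifetime carbon services: ${value:,.0f}")  # ~$2.1 million
```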

For Chami, this number represented more than a burned-out economist’s thought experiment. It would allow for a kind of capitalistic alchemy: By putting a price on the whale’s services, he believed he could transform her from a liability—a charity case for a few guilt-ridden philanthropists—into an asset. The money the whale raised in carbon credits would go to conservationists or to the governments in whose waters she swam. They, in turn, could fund efforts that would ensure the whale and her kin kept right on sequestering CO2. Any new threat to the whale’s environment—a shipping lane, a deepwater rig—would be seen as a threat to her economic productivity. Even people who didn’t really care about her would be forced to account for her well-being.

2
Before he went into finance, Ralph Chami toyed with the idea of becoming an oceanographer.

It was a “win-win-win,” Chami believed: Carbon emitters would get help meeting their obligations to avert global collapse; conservationists would get much-needed funds; and the whale would swim blissfully on, protected by the invisible hand of the market.

What’s more, Chami realized, every wild organism is touched by the carbon cycle and could therefore be protected with a price tag. A forest elephant, for example, fertilizes soil and clears underbrush, allowing trees to thrive. He calculated the value of those services at $1.75 million, far more than the elephant was worth as a captive tourist attraction or a poached pair of tusks. “Same thing for the rhinos, and same thing for the apes,” Chami says. “What would it be if they could speak and say, ‘Hey, pay me, man?’”

Chami’s numbers never failed to elicit a reaction, good or bad. He was interviewed widely and asked to value plants and animals all over the world. He gave a TED Talk. Some people accused him of cheapening nature, debasing it by affixing a price tag. Cetacean experts pointed to vast gaps in their understanding of how, exactly, whales sequester carbon. But it seemed to Chami that by saying a blue whale must remain priceless, his detractors were ensuring that it would remain worthless.

In 2020, Chami was invited to participate in a task force about nature-based solutions to climate change whose participants included Carlos Duarte, a Spanish marine biologist at Saudi Arabia’s King Abdullah University of Science and Technology. Duarte was widely known in conservation circles as the father of “blue carbon,” a field of climate science that emphasizes the role of the oceans in cleaning up humanity’s mess. In 2009, he had coauthored a United Nations report that publicized two key findings. First, the majority of anthropogenic carbon emissions are absorbed into the sea. Second, a tiny fraction of the ocean floor—the 0.5 percent that’s home to most of the planet’s mangrove forests, salt marshes, and seagrass meadows—stores more than half of the carbon found in ocean sediments.

After the task force, the two men got to talking. Duarte told Chami that scientists had recently mapped what he believed to be 40 percent of the world’s seagrass, all in one place: the Bahamas. The plant was a sequestration powerhouse, Duarte explained. And around the world, it was under threat. Seagrasses are receding at an average of 1.5 percent per year, killed off by marine heat waves, pollution, and development.

Chami was intrigued. Then he did a rough estimate for the worth of all the carbon sequestered by seagrass around the world, and he got more excited. It put every other number to shame. The value, he calculated, was $1 trillion.

3

Seagrass has a long history of being ignored. Though it grows in tufted carpets off the coast of every continent but Antarctica, it is a background character, rarely drawing human attention except when it clings to an anchor line or fouls up a propeller or mars the aesthetics of a resort beach. Divers don’t visit a seagrass meadow to bask in its undulating blades of green. They come to see the more charismatic creatures that spend time there, like turtles and sharks. If the seagrass recedes in any particular cove or inlet from one decade to the next, few people would be expected to notice.

When Duarte began studying seagrasses in the 1980s, “not even the NGOs cared” about what was going on in the meadows, he recalls. But he had a unique perspective on unloved environments, having tramped around bogs and swamps since graduate school and gone on dives in the submerged meadows off Majorca. The more he studied the plants, the more he understood how valuable they could be in the fight against climate change.

Seagrasses are the only flowering plants on Earth that spend their entire lives underwater. They rely on ocean currents and animals to spread their seeds (which are, by the way, pretty tasty). Unlike seaweeds, seagrasses not only put down roots in the seabed but also grow horizontal rhizomes through it, lashing themselves together into vast living networks. One patch of Mediterranean seagrass is a contender to be the world’s oldest organism, having cloned itself continuously for up to 200,000 years. Another growing off the coast of Western Australia is the world’s largest plant.

Those massive networks of rhizomes, buried beneath a few inches of sediment, are the key to the seagrasses’ survival. They’re also how the plants are able to put away carbon so quickly—as much as 10 times as fast, Duarte eventually calculated, as a mature tropical rainforest. And yet, no one could be convinced to care. “I nicknamed seagrass the ugly duckling of conservation,” he told me.

Then one day in 2020, Duarte connected with a marine biologist named Austin Gallagher, the head of an American NGO called “Beneath the Waves”. Gallagher was a shark guy, and the seagrass was largely a backdrop to his work. But his team of volunteers and scientists had spent years studying tiger sharks with satellite tags and GoPro cameras, and they had noticed something in the creatures’ great solo arcs around the Bahamas: The sharks went wherever they could find sea turtles to eat, and wherever the sea turtles went, there were meadows of seagrass. From the glimpses the team was getting on camera, there was a lot of it.

Gallagher knew about Duarte’s work on seagrass carbon through his wife, a fellow marine scientist. Together, the two men came up with a plan to map the Bahamian seagrass by fitting sharks with 360-degree cameras. Once they verified the extent of the meadows, Chami would help them value the carbon and organize a sale of credits with the Bahamian government. The project would be unique in the world. While some groups have sought carbon credits for replanting degraded seagrass meadows—a painstaking process that is expensive, uncertain, and generally limited in scale—this would be the first attempt to claim credits for conserving an existing ecosystem. The scale would dwarf all other ocean-based carbon efforts.

The government was eager to listen. The Bahamas, like other small island nations, is under threat from sea-level rise and worsening natural disasters—problems largely caused by the historical carbon emissions of large industrialized nations. In 2019, Hurricane Dorian swept through the islands, causing more than $3 billion in damage and killing at least 74 people; more than 200 are still listed as missing. For the government, the idea of global carbon emitters redirecting some of their enormous wealth into the local economy was only logical. “We have been collecting the garbage out of the air,” Prime Minister Philip Davis said to a summit audience last year, “but we have not been paid for it.”

The government formalized its carbon credit market last spring, in legislation that envisions the Bahamas as an international trading hub for blue carbon. Carbon Management Limited, a partnership between Beneath the Waves and local financiers, will handle everything from the carbon science to monetization. (The partnership, which is co-owned by the Bahamian government, will collect 15 percent of revenue.) The plans at first intersected with the booming crypto scene in the Bahamas, involving talks to have the cryptocurrency exchange FTX set up a service for trading carbon credits. But after FTX collapsed and its CEO was extradited to face charges in the US, the organizers changed tack. They project that the Bahamian seagrass could generate credits for between 14 and 18 million metric tons of carbon each year, translating to between $500 million and more than $1 billion in revenue. Over 30 years, the meadows could bring in tens of billions of dollars. Far from being an ugly duckling, the seagrass would be a golden goose.
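
For a sense of what those projections assume, dividing the revenue range by the credit range gives the implied carbon price; this is nothing more than arithmetic on the figures quoted above.

```python
# Implied price per credit (one metric ton of CO2), from the article's projections.
low = 500_000_000 / 18_000_000     # lowest revenue over highest tonnage
high = 1_000_000_000 / 14_000_000  # highest revenue over lowest tonnage
print(f"Implied carbon price: ${low:.0f} to ${high:.0f} per metric ton")  # ~$28 to $71
```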

4
Seagrass is the “ugly duckling of conservation,” Carlos Duarte says. He calculated that the plant may put away carbon at 10 times the rate of a mature rainforest.

Duarte sees the project in the Bahamas as a blueprint (pun intended, he says) for a much grander idea that has animated his work for the past two decades: He wants to restore all aquatic habitats and creatures to their preindustrial bounty. He speaks in terms of “blue natural capital,” imagining a future in which the value of nature is priced into how nations calculate their economic productivity.

This is different from past efforts to financialize nature, he emphasizes. Since the 19th century, conservationists have argued that protecting bison or lions or forests is a sound investment because extinct animals and razed trees can no longer provide trophies or timber. More recently, ecologists have tried to demonstrate that less popular habitats, such as wetlands, can serve humanity better as flood protectors or water purifiers than as sites for strip malls. But while these efforts may appeal to hunters or conservationists, they are far from recasting nature as a “global portfolio of assets,” as a Cambridge economist described natural capital in a 2021 report commissioned by the UK government.

Duarte and I first met in the halls of a crowded expo at the 2022 UN Climate Conference in Sharm el-Sheikh, Egypt. He had traveled a short distance from his home in Jeddah, where he oversees a wide array of projects, from restoring corals and advising on regenerative tourism projects along Saudi Arabia’s Red Sea coast to a global effort to scale up seaweed farming (using, yes, revenue from carbon credits). In Egypt, Duarte was scheduled to appear on 22 panels, serving as the scientific face of the kingdom’s plan for a so-called circular carbon economy, in which carbon is treated as a commodity to be managed more responsibly, often with the help of nature.

Chami was there too, wearing a trim suit and a pendant in the shape of a whale’s tail around his neck. He was participating as a member of the Bahamian delegation, which included Prime Minister Davis and various conservationists from Beneath the Waves. They had arrived with a pitch for how to include biodiversity in global discussions about climate change. The seagrass was their template, one that could be replicated across the world, ideally with the Bahamas as a hub for natural markets.

The UN meeting was a good place to spread the gospel of seagrass. The theme of the conference was how to get wealthy polluters to pay for the damage they cause in poorer nations that experience disasters such as Hurricane Dorian. The hope was to eventually hammer out a UN agreement, but in the meantime, other approaches for moving money around were in the ether. Since the 2015 Paris Agreement, countries had been forced to start accounting for carbon emissions in their balance sheets. Big emitters were lining up deals with cash-poor, biodiversity-rich nations to make investments in nature that would potentially help the polluters hit their climate commitments. Chami’s boss at the IMF had suggested that nations in debt could start to think about using their natural assets, valued in carbon, to pay it off. “All of these poor countries today are going to find out that they’re very, very rich,” Chami told me.

At a conference where the main message often seemed to be doom, the project in the Bahamas was a story of hope, Chami said. When he gave a talk about the seagrass, he spoke with the vigor of a tent revivalist. With the time humanity had left to fix the climate, he told the audience, “cute projects” weren’t going to cut it anymore. A few million dollars for seagrass replanting here, a handful of carbon credits for protecting a stand of mangroves there—no, people needed to be thinking a thousand times bigger. Chami wanted to know what everyone gathered in Egypt was waiting for. “Why are we dilly-dallying?” he asked the crowd. “So much talk. So little action.”

One day this past winter, a former real estate developer from Chattanooga, Tennessee, named David Harris piloted his personal jet over the Little Bahama Bank. From his cockpit window, the water below looked like the palette of a melancholic painter. Harris was bound for a weed-cracked landing strip in West End, Grand Bahama, where he would board a fishing boat called the Tigress. Harris and his crew—which included his 10-year-old daughter—would spend the rest of the week surveying seagrass meadows for Beneath the Waves.

They were tackling a great expanse. While the total land area of the Bahamas is a mere 4,000 square miles, the islands are surrounded by shallow undersea platforms roughly 10 times that size. These banks are the work of corals, which build towering carbonate civilizations that pile atop one another like the empires of Rome. When the first seagrasses arrived here about 30 million years ago, they found a perfect landscape. The plants do best in the shallows, closest to the light.

Harris, who speaks with a warm twang and has the encouraging air of a youth baseball coach, had been traveling to the Bahamas for years in pursuit of dives, fish, and the occasional real estate deal. He met Gallagher on a fishing trip and soon began helping with his tiger shark advocacy. That work was an exciting mix of scientific research—including dives alongside the notoriously aggressive animals—and playing host to crews for Shark Week TV programs and their celebrity guests. Eventually, Harris sold his company, retired, and threw himself into volunteering full-time.

He had not expected to spend his days looking at seagrass. But here he was, leading a blue carbon expedition. With help from Duarte, Beneath the Waves had created its shark-enabled seagrass map. The group pulled in a Swedish firm to scan the region using lidar cameras affixed to a small plane, allowing them to peer through the water and, using machine learning, infer from the pixels how dense the meadows were.

Now Harris and his crew were validating the aerial data, a painstaking process that required filming dozens of hours of footage of the seafloor and taking hundreds of sediment cores. The footage was meant to verify the lidar-based predictions that separated the seagrasses from beds of empty sand and algae. The cores would be sent to a lab in a prep school outside Boston, Gallagher’s alma mater, where they would be tested for their organic carbon content. When all the data was combined, it would reveal how much carbon the meadows contained.
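
The article doesn’t spell out the project’s accounting formula, but the survey it describes boils down to multiplying validated meadow area by core-derived carbon density and summing over patches. A minimal sketch of that bookkeeping, in which every number is a made-up placeholder rather than a Beneath the Waves figure:

```python
# Hypothetical patches: (validated seagrass area in hectares,
# organic carbon density from sediment cores, metric tons C per hectare).
patches = [
    (120_000, 140.0),  # dense meadow -- placeholder numbers
    (80_000, 60.0),    # sparse meadow -- placeholder numbers
]

# Stock estimate: area times density, summed over every validated patch.
total_c = sum(area * density for area, density in patches)

# Stored carbon converts to CO2-equivalent by the molecular-weight ratio 44/12.
total_co2e = total_c * 44 / 12
print(f"{total_c:,.0f} t C, about {total_co2e:,.0f} t CO2e")
```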

The Tigress was set to autopilot along a straight line, hauling GoPro cameras off the starboard side. From this vantage, the scale of the task was easy to appreciate. At a lazy 5 knots, each line took about an hour. This patch of sea—one of 30 that Beneath the Waves planned to survey around the banks—would require about 20 lines to cover. Harris’s daughter counted sea stars and sketched them in a journal to justify a few days off from school. Her father surveyed the banks in hopeful search of a shark. At the end of each line, the crew retrieved the cameras, dripping with strands of sargassum, and swapped out the memory cards.

Harris’s crew would eventually present their protocol for assessing the carbon storage potential of seagrass to Verra, a nonprofit carbon registry. Verra develops standards to ensure there’s real value there before the credits are sold. To meet the organization’s requirements, Beneath the Waves must prove two things: first, that the seagrass is actually sequestering carbon at the rates it estimates; second, that the meadows would put away more carbon if they were protected. No one is going to pay to protect a carbon sink that would do fine on its own, the thinking goes. A billion-dollar opportunity requires a commensurate threat.

Harris told me that Beneath the Waves was still in “the exploratory phase” when it came to quantifying threats. They had various ideas—mining near shore, illegal trawl fishing, anchoring, water quality issues. As far as the carbon calculations went, though, Harris and his team felt confident in their approach. Prior to the outing on the Tigress, Beneath the Waves had already set up a for-profit company to bring its tools and methods to other blue carbon projects. It was in talks with government officials from across the Caribbean, Europe, and Africa. (Gallagher told me the company would pass the profits back to the nonprofit to continue its advocacy and research.)

Meanwhile, the head of Carbon Management, the scientific and financial partnership behind the project, told me he was pitching the investment to his clients, mostly “high-net-worth individuals” looking to diversify their portfolios while fighting climate change. Oil companies and commodities traders are interested too, he told me, as well as cruise lines and hotels that do business in the Bahamas. The Bahamian government has not yet said how it will allocate the money from the seagrass project. Hurricane recovery and preparedness could be on the list, as could seagrass conservation.

The Tigress crew worked until the light began to fade, then headed back to port. Harris said he was happy to be doing his part out on the water. All that money would be a good thing for the Bahamas, he thought, especially as the country planned for a future of bigger storms. In the days after Hurricane Dorian, which hit Grand Bahama with 185-mph winds and heaved the shallow waters of the Banks over the land, Harris had flown to the island to help a friend who had survived by clinging to a tree along with his children. The storm’s legacy is still apparent in ways small and large. At a restaurant near the Tigress’ berth, there was no fresh bread—“not since Dorian,” when the ovens were flooded, the waitress told me with a laugh. Then she stopped laughing. The recovery had been slow. The young people and tourists had not come back. The airport had not been repaired. She wondered where her tax dollars were going.

That night, over dinner in the ovenless restaurant, Harris showed me a photo of his vintage Chevy Blazer. He said he hoped the seagrass project would generate enough carbon credits to offset the old gas-guzzler. This was a joke, obviously, but it expressed a deeper wish. The promise of carbon credits is that, wielded in their most ideal form, they will quietly subtract the emissions humans keep adding to the atmospheric bill. Every stroke of a piston, every turn of a jet engine, every cattle ranch and petrochemical plant—every addiction that people can’t give up, or won’t, or haven’t had a chance to yet—could be zeroed out.

5

For governments, assigning nature a concrete value could take many forms. They could encourage the development of sustainable ecotourism and aquaculture, where the value of the ecosystem is in the revenue it creates. Or they could confer legal rights on nature, effectively giving ecosystems the right to sue for damages—and incentivizing polluters to not damage them. But in Duarte’s 30 years of advocating for creatures and plants like seagrasses, politics have gotten in the way of biodiversity protections. Only carbon trading has “made nature investable,” he says, at a speed and scale that could make a difference.

That is not to say he loves the system. Carbon credits arose from a “failure to control greed,” Duarte says. Beyond that, they are not designed for the protection of nature; rather, they use it as a means to an end. Any plant or creature that packs away carbon, like a tree or a seagrass meadow—and perhaps an elephant or a whale—is a tool for hitting climate goals. It’s worth something. Any creature that doesn’t, including those that Duarte loves, like coral reefs, is on its own.

Duarte also worries about “carbon cowboys” trying to make a buck through sequestration projects that have no real scientific basis or end up privatizing what should be public natural resources. Even projects that seem to adhere closely to the market’s rules may fall apart with closer scrutiny. Earlier this year, a few weeks after the Tigress sailed, The Guardian published an analysis of Verra’s methodologies that called into question 94 percent of the registry’s rainforest projects. Reporters found that some developers had obtained “phantom credits” for forest protection that ended up pushing destruction one valley over, or used improper references to measure how much deforestation their projects avoided. (Verra disputes the findings.)

When it comes to carbon arithmetic, trees should be a relatively simple case: addition by burning fossil fuels, subtraction by photosynthesis. The forestry industry has honed tools that can measure the carbon stored in trunks and branches. And yet the math still broke, because people took advantage of imperfect methods.

Seagrass is also more complex than it might seem. After an initial wave of enthusiasm about its carbon-packing powers, increasing numbers of marine biologists expressed concerns when the discussion turned to carbon credits. For one thing, they argue, the fact that seagrass removes CO2 through water, rather than air, makes the sequestration value of any particular meadow difficult to appraise. In South Florida, a biogeochemist named Bryce Van Dam measured the flow of CO2 in the air above seagrass meadows. He found that in the afternoons, when photosynthesis should have been roaring and more CO2 being sucked into the plants, the water was releasing CO2 instead. This was the result, Van Dam suggested, of seagrass and other creatures that live in the meadows altering the chemistry of the water. (Duarte contends that Van Dam’s premise was flawed.)

Another issue is that, unlike a rainforest, which stores most of its carbon in its trunks and canopies, a seagrass meadow earns most of its keep belowground. When Sophia Johannessen, a geochemical oceanographer at Fisheries and Oceans Canada, took a look at common assessments of carbon storage in seagrass, she concluded that many were based on samples that were far too shallow. Though this carbon was considered permanently locked away, the sediment could easily be disturbed by animals or currents. When Johannessen saw the ways that nonprofits and governments were picking up the science as though it were gospel, she was stunned. “I hadn’t known about ‘blue carbon,’ so perhaps it’s not surprising they didn’t know about sediment geochemistry,” she told me.

Chami’s solution to these niggling scientific uncertainties is to focus instead on the global picture: Earth’s seagrass meadows sit atop vast stores of carbon, and destruction has the potential to visit all of them. He likens natural capital to the mortgage market. When a prospective homeowner gets a loan from a bank, the bank then sells the loan, which is swapped and bundled with other loans. Each loan contains unique risks, but the bundled asset controls for that uncertainty. Financiers have no problem with uncertainty, Chami notes; it is the locus of profit. The money they invest gets poured back into the mortgage market, allowing banks to issue more loans. The characteristics of the individual homes and borrowers don’t matter that much. “You can’t scale up when every case is a unique case,” he says. “You need to homogenize the product in order to make a market.” Scale is the bulwark against destruction. One seagrass meadow can be ignored; a seagrass market, which encompasses many meadows and represents a major investment, cannot.
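Chami’s bundling logic is, at bottom, the law of large numbers. Here is a minimal Python sketch of the idea, with entirely hypothetical numbers and the strong assumption that per-meadow errors are independent; it shows why a bundle is easier to price than any single meadow:

import numpy as np

# Toy model of bundling uncertain carbon assets (all numbers hypothetical).
rng = np.random.default_rng(seed=42)

n_meadows = 1000
mean_tons = 5_000    # assumed mean tons of CO2 locked away per meadow
sigma_tons = 2_000   # assumed per-meadow uncertainty (standard deviation)

meadows = rng.normal(mean_tons, sigma_tons, size=n_meadows)

# Relative uncertainty of one meadow vs. the bundled total.
single_rel_err = sigma_tons / mean_tons
bundle_rel_err = meadows.std() / np.sqrt(n_meadows) / meadows.mean()

print(f"single meadow: {single_rel_err:.1%}")        # ≈ 40%
print(f"1000-meadow bundle: {bundle_rel_err:.1%}")   # ≈ 1.3%

Pooling shrinks the relative uncertainty by roughly the square root of the number of meadows, provided the errors really are independent, which is precisely the sort of assumption the skeptical scientists quoted below refuse to take on faith.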

When each ecosystem is treated the same—based on how much carbon it has socked away—the issue of quantifying threats becomes simpler. Chami cites the example of Gabon, which last year announced the sale of 90 million carbon credits based on recent rainforest protections. Skeptics have pointed out that nobody has plans to fell the trees. The government has replied that if it can’t find a buyer for the credits, that may change. In the Bahamas, Prime Minister Davis has invoked a similar idea. Seagrass protection, he has said, could be reframed as a payment to prevent oil companies from drilling in the banks for the next 30 years. Seen one way, these are not-so-veiled threats. Seen another, they reveal a fundamental unfairness in the carbon markets: Why can’t those who are already good stewards of nature’s carbon sinks get their credits, too?

The numerous seagrass scientists I spoke with expressed a common wish that Chami’s simplified carbon math could be true. Seagrass desperately requires protection. But instead they kept coming back to the uncertainty. Van Dam compares the standard methods for assessing seagrass carbon to judging a business based only on its revenue. To understand the full picture, you also need a full accounting of the money flowing out. You need to trouble yourself with all of the details. This is why the rush to monetize the meadows—and offer justification for additional carbon emissions—worried him. “Now that there’s money attached to it,” he told me, “there’s little incentive for people to say ‘stop.’”

A few months after the Tigress outing, members of the Bahamian conservation community received invitations to a meeting in Nassau. The invitees included scientists from the local chapter of the Nature Conservancy and the Bahamas National Trust, a nonprofit that oversees the country’s 32 national parks, as well as smaller groups. Gallagher kicked off the meeting with a review of what Beneath the Waves had achieved with its mapping effort. Then he came to the problem: He needed data about what might be killing Bahamian seagrass.

This problem wasn’t trivial. The government’s blue carbon legislation required that the project adhere to standards like Verra’s, which meant figuring out how conservation efforts would increase the amount of carbon stored. Beneath the Waves was drawing a meticulous map of the seagrass and its carbon as they exist today, but the group didn’t have a meticulous map from five years ago, or 30 years ago, that would show whether the meadows were growing or shrinking and whether humans were the cause.

Gallagher told me he is confident that the multibillion-dollar valuation of the seagrass reflects conservative assumptions. But the plan itself is in the hands of the Bahamian government, he said. Officials have not spoken much about this part of the process, despite early excitement about eye-popping valuations and rapid timelines for generating revenue. (Government officials declined multiple interview requests, referring WIRED back to Beneath the Waves, and did not respond to additional questions.)

Some of the local conservation groups had received the meeting invitation with surprise. Among many Bahamians I spoke with, frustration had been simmering since Beneath the Waves first proclaimed its seagrass “discovery,” which it described as a “lost ecosystem that was hiding in plain sight.” Many locals found this language laughable, if not insulting. Fishers knew the seagrass intimately. Conservationists had mapped swaths of it and drawn up protection plans. “You’ve had a lot of white, foreign researchers come in and say this is good for the Bahamas without having a dialog,” Marjahn Finlayson, a Bahamian climate scientist, told me. (Gallagher said that since Beneath the Waves was a well-resourced group that had brought the seagrass findings to the government, it only made sense that it would be chosen to do the work.)

6

It was not clear that any of the groups could offer what Beneath the Waves needed. For one thing, most locals believe the seagrass to be in relatively good condition. There are threats, surely, and interventions to be done, but as Nick Higgs, a Bahamian marine biologist, told me, they likely vary with the immense diversity of the country’s 3,100 islands, rocks, and cays. Higgs gave the example of lobster fisheries—an industry that many people mentioned to me as among the more potentially significant threats to seagrass. His own research found little impact in the areas he studied. But if the fisheries are harming seagrass elsewhere, who will decide their fate from one community to the next? Protecting seagrass is a noble goal, Adelle Thomas, a climate scientist at the University of the Bahamas, told me. The question for Bahamians, she said, is “Do we have the capacity to maintain these things that we’re claiming to protect?” Money alone won’t solve the seagrass’s problems, whatever they might turn out to be.

The creature at the heart of this debate appears to be in a sort of limbo. The prospect of a price has showered attention on seagrass, putting it in the mouths of prime ministers and sparking an overdue discussion about its well-being. Perhaps, if you ask Chami, it has helped people value the plant in other ways too—for how it breaks the force of storms hitting the islands, for the habitat it provides other animals, maybe even for its intrinsic right to go on growing for another 30 million years.

But can the math of the carbon market get it there? On one side of the equation, where carbon is added to the atmosphere, the numbers couldn’t be clearer: They’re tabulated in barrels and odometers and frequent flier accounts. On the other side, where carbon is subtracted, there is uncertainty. Uncertainty about how carbon moves through a seagrass meadow, or a whale, or an elephant, and how money moves to protect those species. What happens when the equation doesn’t balance? More carbon, more heat, more Hurricane Dorians. A gift to polluters. As Finlayson put it, “You’re taking something from us, throwing a couple dollars at it, and then you’re still putting us at risk.”

Chami has faith that the math will balance out in the end. He wants people to care about nature intrinsically, of course. But caring needs a catalyst. And for now, that catalyst is our addiction to carbon. “I’m conning, I’m bribing, I’m seducing the current generation to leave nature alone,” he told me. Perhaps then, he said, the next generation will grow up to value nature for itself.

This story was reported with support from the University of California-Berkeley-11th Hour Food and Farming Fellowship.

Source imagery courtesy of Cristina Mittermeier, Guimoar Duarte (Portrait), Ralph Chami (Portrait), Drew McDougall, Wilson Hayes, Beneath the Waves, Getty Images, and Alamy.

See the full article here.

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


Please help promote STEM in your local schools.

Stem Education Coalition

From “WIRED” : “The Quest to Use Quantum Mechanics to Pull Energy out of Nothing”

From “WIRED”

5.28.23
Charlie Wood

1
The new quantum protocol effectively borrows energy from a distant location and thus violates no sacred physical principles. Illustration: Kristina Armitage/Quanta Magazine.

For their latest magic trick, physicists have done the quantum equivalent of conjuring energy out of thin air. It’s a feat that seems to fly in the face of physical law and common sense.

“You can’t extract energy directly from the vacuum because there’s nothing there to give,” said William Unruh, a theoretical physicist at the University of British Columbia, describing the standard way of thinking.

But 15 years ago, Masahiro Hotta, a theoretical physicist at Tohoku University in Japan, proposed that perhaps the vacuum could, in fact, be coaxed into giving something up.

At first, many researchers ignored this work, suspicious that pulling energy from the vacuum was implausible, at best. Those who took a closer look, however, realized that Hotta was suggesting a subtly different quantum stunt. The energy wasn’t free; it had to be unlocked using knowledge purchased with energy in a far-off location. From this perspective, Hotta’s procedure looked less like creation and more like teleportation of energy from one place to another—a strange but less offensive idea.

“That was a real surprise,” said Unruh, who has collaborated with Hotta but has not been involved in energy teleportation research. “It’s a really neat result that he discovered.”

Now, in the past year, researchers have teleported energy across microscopic distances in two separate quantum devices, vindicating Hotta’s theory. The research leaves little room for doubt that energy teleportation is a genuine quantum phenomenon.

“This really does test it,” said Seth Lloyd, a quantum physicist at the Massachusetts Institute of Technology who was not involved in the research. “You are actually teleporting. You are extracting energy.”

Quantum Credit

The first skeptic of quantum energy teleportation was Hotta himself. In 2008, he was searching for a way of measuring the strength of a peculiar quantum mechanical link known as entanglement, where two or more objects share a unified quantum state that makes them behave in related ways even when separated by vast distances. A defining feature of entanglement is that you must create it in one fell swoop. You can’t engineer the related behavior by messing around with one object and the other independently, even if you call up a friend at the other location and tell them what you did.
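The standard textbook example of such a unified state (not specific to Hotta’s work) is a Bell pair:

\[ |\Phi^{+}\rangle \;=\; \tfrac{1}{\sqrt{2}}\left(|0\rangle_A|0\rangle_B + |1\rangle_A|1\rangle_B\right) \]

Measurements on the two halves always agree, yet the state cannot be factored into one piece belonging to A and another belonging to B, which is why no amount of independent tinkering at the two locations, even with a phone call in between, can manufacture it.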

While studying black holes, Hotta came to suspect that an exotic occurrence in quantum theory—negative energy—could be the key to measuring entanglement. Black holes shrink by emitting radiation entangled with their interiors, a process that can also be viewed as the black hole swallowing dollops of negative energy. Hotta noted that negative energy and entanglement appeared to be intimately related. To strengthen his case, he set out to prove that negative energy—like entanglement—could not be created through independent actions at distinct locations.

Hotta found, to his surprise, that a simple sequence of events could, in fact, induce the quantum vacuum to go negative—giving up energy it didn’t appear to have. “First I thought I was wrong,” he said, “so I calculated again, and I checked my logic. But I could not find any flaw.”

The trouble arises from the bizarre nature of the “quantum vacuum,” a peculiar type of nothing that comes dangerously close to resembling a something. The uncertainty principle forbids any quantum system from settling down into a perfectly quiet state of exactly zero energy. As a result, even a vacuum must always crackle with fluctuations in the quantum fields that fill it. These never-ending fluctuations imbue every field with some minimum amount of energy, known as the zero-point energy. Physicists say that a system with this minimal energy is in the ground state. A system in its ground state is a bit like a car parked on the streets of Denver. Even though it’s well above sea level, it can’t go any lower.
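The canonical illustration, a standard result rather than anything specific to Hotta’s analysis, is the quantum harmonic oscillator, whose allowed energies are

\[ E_n \;=\; \hbar\omega\left(n + \tfrac{1}{2}\right), \qquad n = 0, 1, 2, \dots \]

Even the ground state, $n = 0$, retains the zero-point energy $E_0 = \hbar\omega/2$, a direct consequence of the uncertainty relation $\Delta x\,\Delta p \ge \hbar/2$: a state of exactly zero energy would need a definite position and a definite momentum at once.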

And yet, Hotta seemed to have found an underground garage. To unlock the gate, he realized, he had only to exploit an intrinsic entanglement in the crackling of the quantum field.

The incessant vacuum fluctuations cannot be used to power a perpetual motion machine, say, because the fluctuations at a given location are completely random. If you imagine hooking up a fanciful quantum battery to the vacuum, half the fluctuations would charge the device while the other half would drain it.

But quantum fields are entangled—the fluctuations in one spot tend to match fluctuations in another spot. In 2008, Hotta published a paper [Physical Review D (below)] outlining how two physicists, Alice and Bob, might exploit these correlations to pull energy out of the ground state surrounding Bob. The scheme goes something like this:

Bob finds himself in need of energy—he wants to charge that fanciful quantum battery—but all he has access to is empty space. Fortunately, his friend Alice has a fully equipped physics lab in a far-off location. Alice measures the field in her lab, injecting energy into it there and learning about its fluctuations. This experiment bumps the overall field out of the ground state, but as far as Bob can tell, his vacuum remains in the minimum-energy state, randomly fluctuating.

But then Alice texts Bob her findings about the vacuum around her location, essentially telling Bob when to plug in his battery. After Bob reads her message, he can use the newfound knowledge to prepare an experiment that extracts energy from the vacuum—up to the amount injected by Alice.

“That information allows Bob, if you want, to time the fluctuations,” said Eduardo Martín-Martínez, a theoretical physicist at the University of Waterloo and the Perimeter Institute who worked on one of the new experiments. (He added that the notion of timing is more metaphorical than literal, due to the abstract nature of quantum fields.)
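Taken literally anyway, the timing idea can be caricatured with ordinary statistics. In the Python toy below (my own classical analogy, not Hotta’s quantum calculation), the “fluctuations” at the two locations are correlated random numbers. Bob coupling his battery blindly nets nothing on average; coupling with the sign Alice texts over nets a positive average:

import numpy as np

rng = np.random.default_rng(seed=0)
n = 100_000
rho = 0.8  # assumed correlation between the two locations' fluctuations

# Correlated Gaussian "fluctuations" at Alice's and Bob's locations.
a = rng.normal(size=n)
b = rho * a + np.sqrt(1 - rho**2) * rng.normal(size=n)

# Bob's gain when he couples his battery with sign s is s * b.
blind = rng.choice([-1, 1], size=n) * b  # no message: sign chosen at random
timed = np.sign(a) * b                   # after Alice's text: sign matched to hers

print(f"blind coupling, average gain: {blind.mean():+.4f}")  # ≈ 0
print(f"timed coupling, average gain: {timed.mean():+.4f}")  # ≈ +0.64

In the real protocol the correlations are quantum entanglement in the field’s ground state, and the accounting is subtler, but the moral survives: the correlations are worthless to Bob until Alice’s message tells him how to use them.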

Bob can’t extract more energy than Alice put in, so energy is conserved. And he lacks the necessary knowledge to extract the energy until Alice’s text arrives, so no effect travels faster than light. The protocol doesn’t violate any sacred physical principles.

Nevertheless, Hotta’s publication was met with crickets. Machines that exploit the zero-point energy of the vacuum are a mainstay of science fiction, and his procedure rankled physicists tired of fielding crackpot proposals for such devices. But Hotta felt certain he was onto something, and he continued to develop his idea and promote it in talks. He received further encouragement from Unruh, who had gained prominence for discovering another odd vacuum behavior.

“This kind of stuff is almost second nature to me,” Unruh said, “that you can do strange things with quantum mechanics.”

Hotta also sought a way to test it. He connected with Go Yusa, an experimentalist specializing in condensed matter at Tohoku University. They proposed an experiment in a semiconductor system with an entangled ground state analogous to that of the electromagnetic field.

But their research has been repeatedly delayed by a different kind of fluctuation. Soon after their initial experiment was funded, the March 2011 Tohoku earthquake and tsunami devastated the eastern coast of Japan—including Tohoku University. In recent years, further tremors damaged their delicate lab equipment twice. Today they are once more starting essentially from scratch.

Making the Jump

In time, Hotta’s ideas also took root in a less earthquake-prone part of the globe. At Unruh’s suggestion, Hotta gave a lecture at a 2013 conference in Banff, Canada. The talk captured the imagination of Martín-Martínez. “His mind works differently from everybody else,” Martín-Martínez said. “He’s a person that has a lot of out-of-the-box ideas that are extremely creative.”

2
An experimental test of the teleportation protocol was run on one of IBM’s quantum computers, seen here at the Consumer Electronics Show in Las Vegas in 2020. Photograph: IBM/Quanta Magazine.

Martín-Martínez, who half-seriously styles himself as a “space-time engineer,” has long felt drawn to physics at the edge of science fiction. He dreams of finding physically plausible ways of creating wormholes, warp drives, and time machines. Each of these exotic phenomena amounts to a bizarre shape of space-time that is permitted by the extremely accommodating equations of general relativity. But they are also forbidden by so-called energy conditions, a handful of restrictions that the renowned physicists Roger Penrose and Stephen Hawking slapped on top of general relativity to stop the theory from showing its wild side.

Chief among the Hawking-Penrose commandments is that negative energy density is forbidden. But while listening to Hotta’s presentation, Martín-Martínez realized that dipping below the ground state smelled a bit like making energy negative. The concept was catnip to a fan of Star Trek technologies, and he dove into Hotta’s work.

He soon realized that energy teleportation could help solve a problem faced by some of his colleagues in quantum information, including Raymond Laflamme, a physicist at Waterloo, and Nayeli Rodríguez-Briones, Laflamme’s student at the time. The pair had a more down-to-earth goal: to take qubits, the building blocks of quantum computers, and make them as cold as possible. Cold qubits are reliable qubits, but the group had run into a theoretical limit beyond which it seemed impossible to pull out any more heat—much as Bob confronted a vacuum from which energy extraction seemed impossible.

In his first pitch to Laflamme’s group, Martín-Martínez faced a lot of skeptical questions. But as he addressed their doubts, they became more receptive. They started studying quantum energy teleportation, and in 2017 they proposed a method for spiriting energy away from qubits to leave them colder than any other known procedure could make them. Even so, “it was all theory,” Martín-Martínez said. “There was no experiment.”

Martín-Martínez and Rodríguez-Briones, together with Laflamme and an experimentalist, Hemant Katiyar, set out to change that.

They turned to a technology known as nuclear magnetic resonance, which uses mighty magnetic fields and radio pulses to manipulate the quantum states of atoms in a large molecule. The group spent a few years planning the experiment, and then over a couple of months in the midst of the pandemic, Katiyar arranged to teleport energy between two carbon atoms playing the roles of Alice and Bob.

First, a finely tuned series of radio pulses put the carbon atoms into a particular minimum-energy ground state featuring entanglement between the two atoms. The zero-point energy for the system was defined by the initial combined energy of Alice, Bob, and the entanglement between them.

Next, they fired a single radio pulse at Alice and a third atom, simultaneously making a measurement at Alice’s position and transferring the information to an atomic “text message.”

Finally, another pulse aimed at both Bob and the intermediary atom simultaneously transmitted the message to Bob and made a measurement there, completing the energy chicanery.

They repeated the process many times, making many measurements at each step in a way that allowed them to reconstruct the quantum properties of the three atoms throughout the procedure. In the end, they calculated that the energy of the Bob carbon atom had decreased on average, and thus that energy had been extracted and released into the environment. This happened despite the fact that the Bob atom always started out in its ground state. From start to finish, the protocol took no more than 37 milliseconds. But for energy to have traveled from one side of the molecule to the other, it normally would have taken more than 20 times longer—approaching a full second. The energy spent by Alice allowed Bob to unlock otherwise inaccessible energy.

“It was very neat to see that with current technology it’s possible to observe the activation of energy,” said Rodríguez-Briones, who is now at the University of California-Berkeley.

They described the first demonstration of quantum energy teleportation in a paper posted in March 2022 and later published in Physical Review Letters [below].

The second demonstration would follow 10 months later.

A few days before Christmas, Kazuki Ikeda, a quantum computation researcher at Stony Brook University, was watching a YouTube video that mentioned wireless energy transfer. He wondered if something similar could be done quantum mechanically. He then remembered Hotta’s work—Hotta had been one of his professors when he was an undergraduate at Tohoku University—and realized he could run a quantum energy teleportation protocol on IBM’s quantum computing platform.

Over the next few days, he wrote and remotely executed just such a program. The experiments verified that the Bob qubit dropped below its ground-state energy. By January 7, he had posted his results [Applied Physics (below)].
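The two-qubit version of the protocol is simple enough to verify by exact diagonalization on a laptop. The Python sketch below is an illustrative toy in the spirit of Hotta’s minimal model, not Ikeda’s actual program, and the field and coupling strengths are arbitrary choices. Alice measures her qubit; a rotation of Bob’s qubit, conditioned on her outcome, drags Bob’s local energy below its ground-state value:

import numpy as np

# Pauli matrices.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

h, k = 1.0, 0.5  # local field and coupling strengths (arbitrary toy values)

# Two-qubit Hamiltonian in the spirit of Hotta's minimal model:
# local terms for Alice and Bob plus an entangling interaction.
H_A = h * np.kron(Z, I2)
H_B = h * np.kron(I2, Z)
V = 2 * k * np.kron(X, X)
H = H_A + H_B + V

# Exact ground state.
evals, evecs = np.linalg.eigh(H)
g = evecs[:, 0]

# Bob's local energy operator and its value in the ground state.
H_Bob = H_B + V
E_Bob_ground = np.real(g.conj() @ H_Bob @ g)

# Alice measures X on her qubit: projectors for outcomes s = +1, -1.
projectors = {s: np.kron((I2 + s * X) / 2, I2) for s in (+1, -1)}

def bob_energy_after(theta):
    """Average of Bob's local energy after Alice's X measurement and
    Bob's rotation exp(-i*s*theta*Y), conditioned on her outcome s."""
    total = 0.0
    for s, P in projectors.items():
        psi = P @ g
        p = np.real(psi.conj() @ psi)   # probability of outcome s
        psi = psi / np.sqrt(p)          # post-measurement state
        c, sn = np.cos(s * theta), np.sin(s * theta)
        U = np.kron(I2, np.array([[c, -sn], [sn, c]]))  # rotation on Bob only
        out = U @ psi
        total += p * np.real(out.conj() @ H_Bob @ out)
    return total

thetas = np.linspace(-np.pi, np.pi, 2001)
best = min(bob_energy_after(t) for t in thetas)

print(f"Bob's local energy in the ground state: {E_Bob_ground:+.4f}")
print(f"best outcome-conditioned protocol:      {best:+.4f}")
print(f"energy teleported to Bob:               {E_Bob_ground - best:+.4f}")

Because Alice’s measurement commutes with everything on Bob’s side of this model, her action alone leaves his local energy untouched; only the rotation conditioned on her message pushes it below the ground-state value, which no operation at Bob’s location alone could achieve.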

Nearly 15 years after Hotta first described energy teleportation, two simple demonstrations less than a year apart had proved it was possible.

“The experimental papers are nicely done,” Lloyd said. “I was kind of surprised that nobody did it sooner.”

Sci-Fi Dreams

And yet, Hotta is not completely satisfied.

He praises the experiments as an important first step. But he views them as quantum simulations, in the sense that the entangled behavior is programmed into the ground state—either through radio pulses or through quantum operations in IBM’s devices. His ambition is to harvest zero-point energy from a system whose ground state naturally features entanglement in the same way that the fundamental quantum fields that permeate the universe do.

To that end, he and Yusa are forging ahead with their original experiment. In the coming years, they hope to demonstrate quantum energy teleportation in a silicon surface featuring edge currents with an intrinsically entangled ground state—a system with behavior closer to that of the electromagnetic field.

In the meantime, each physicist has their own vision of what energy teleportation might be good for. Rodríguez-Briones suspects that in addition to helping stabilize quantum computers, it will continue to play an important role in the study of heat, energy, and entanglement in quantum systems. In late January, Ikeda posted another paper that detailed how to build energy teleportation into the nascent quantum internet.

Martín-Martínez continues to chase his sci-fi dreams. He has teamed up with Erik Schnetter, an expert in general relativity simulations at the Perimeter Institute, to calculate exactly how space-time would react to particular arrangements of negative energy.

Some researchers find his quest intriguing. “That’s a laudable goal,” Lloyd said with a chuckle. “In some sense it would be scientifically irresponsible not to follow up on this. Negative energy density has very important consequences.”

Others caution that the road from negative energies to exotic shapes of space-time is winding and uncertain. “Our intuition for quantum correlations is still being developed,” Unruh said. “One constantly gets surprised by what is actually the case once one is able to do the calculation.”

Hotta, for his part, doesn’t spend too much time thinking about sculpting space-time. For now, he feels pleased that his quantum correlation calculation from 2008 has established a bona fide physical phenomenon.

“This is real physics,” he said, “not science fiction.”

Physical Review D 2008
Physical Review Letters
Applied Physics

See the full article here.

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


Please help promote STEM in your local schools.

Stem Education Coalition