Tagged: MIT Technology Review

  • richardmitnick 12:08 pm on April 21, 2021 Permalink | Reply
    Tags: "Protecting the World’s Vanishing Coral Reefs", MIT Technology Review

    From MIT Technology Review : “Protecting the World’s Vanishing Coral Reefs” 

    From MIT Technology Review

    Apr 16, 2021
    Ari Daniel

    In 2012, Goreau and fellow diver Komang Astari examine a new Biorock installation in Pemuteran, a fishing village in Bali, Indonesia. Credit: Matthew Oldfield.

    As soon as he could walk, Tom Goreau ’70 was swimming in the warm waters off Jamaica, where he grew up. He recalls water so consistently clear and blue he could see all the way down to the corals and marine life blanketing the bottom. His dad would dive below, releasing streams of bubbles that Goreau would follow. This was the 1950s, before scuba gear was commercially available. So Goreau’s father—Thomas Fritz Goreau, considered the first diving marine scientist—built equipment from scratch that allowed him to dive as deep as a few hundred feet. “He probably held the world’s record for depth diving on compressed air at the time,” says his son. Goreau’s grandfather, Fritz Goro, was the inventor of macrophotography—featuring extreme close-ups of small objects—and the first to use it underwater. Together, Goreau’s grandfather and father took some of the earliest photographs of corals. His mother, Nora Goreau, also had a notable link to the sea: she was the first Panamanian marine biologist.

    Goreau—whose family’s story is told in the new documentary Coral Ghosts—has borne witness for seven decades to the steady global decline of coral reefs, which have degraded into fields of rubble and algae. “My expertise is knowing how the reefs used to be,” he says. In a word—magnificent. “And now they’re essentially gone, like Hiroshima looked the day after the atom bomb.”

    In the 1980s, building on his undergraduate degree in planetary physics from MIT (and graduate degrees from Caltech and Harvard), Goreau pioneered the use of sea surface temperatures collected by satellites to predict at what point corals would bleach. But we’ve far surpassed that threshold. Climate change has cooked and bleached the corals. Ocean acidification has dissolved them. And local pollution has sealed their fate.

    As president of the nonprofit Global Coral Reef Alliance (GCRA), Goreau helps local and indigenous people identify which stressors are killing their local reefs and how to reduce that negative impact. He targets his message to the oldest fishermen “because they’re the only ones who remember how it was,” he says. Young audiences are less receptive—their elders’ stories of teeming ocean life are like myths to a generation that knows coral reefs as feeble places barely capable of supporting a few small fish.

    But Goreau has found a way to help: a system he adapted, called Biorock. He and his small GCRA team weld together webs of steel rebar, plunge them underwater where the reefs once stood, and run a current through them. Over time, a thickening crust of limestone grows to blanket and strengthen the web. They graft coral fragments onto it, which continue to grow and sometimes overtake the original structure. The result attracts numerous marine creatures and protects eroded beaches from waves (as the reefs once did). Biorock can also be used to restore other marine habitats such as seagrass and salt marshes, Goreau notes. It’s a means, he explains, of “regenerating the ecosystem and working with people who are trying to save the last bit of what they have.” He’s constructed about 700 of these artificial reefs and is hopeful they might help, in some small way, to turn things around.

    One place he’s set up shop is the Marshall Islands, in the central Pacific. In the 1940s, the inhabitants of Bikini Atoll were forcibly evacuated to the other islands so the United States could test its atomic bombs. Today, Goreau hopes his electrified reefs can protect these islands from flooding and sea-level rise. Bikini Atoll was also the place where, decades ago, his father and grandfather began their photography work. Some 25 years later, while Goreau was studying at MIT, his dad—like many of the displaced people of Bikini Atoll—died from accumulated exposure to the radiation.

    The underwater world Goreau knew as a boy, and all it was filled with, is long gone. This leaves him feeling “very much like someone who is the last member of a dying culture,” he says—a man who knew an ocean that now exists only in his family’s albums of fading photographs.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

  • richardmitnick 1:05 pm on February 4, 2021 Permalink | Reply
    Tags: "My satellite would fit in a small suitcase.", "The Smallest Lights in the Universe", MIT Technology Review, MIT/NASA JPL/Caltech ASTERIA cubesat developed by Sara Seager of MIT.

    From MIT Technology Review: “My satellite would fit in a small suitcase.” 

    From MIT Technology Review

    December 18, 2020
    by Sara Seager

    Sara Seager with a telescope in her yard, awaiting the darkness of the night sky. Credit: Webb Chappell.

    But it could help us find other worlds.

    Sara Seager has thought long and hard about the math: the odds that Earth harbors the only life in the universe are vanishingly small. “The greatest discovery astronomers could possibly make is that we’re not alone,” writes the MIT astrophysicist in her new memoir The Smallest Lights in the Universe.


    “Humanity has searched the heavens for a reflection of ourselves for centuries; to see someone or something else, inhabiting another Earth—that’s the dream.”

    A pioneer in the search for exoplanets, or planets orbiting other stars, she came up with the now-standard practice of studying the atmospheres of planets by analyzing the light that filters through them. Seager, who won a MacArthur Foundation “genius” grant, is the Class of 1941 Professor of Planetary Science and has appointments in the Departments of Physics and Aeronautics and Astronautics as well. She was also the deputy science director of the MIT-led NASA Explorer mission TESS (transiting exoplanet survey satellite) from 2016 to 2020, and a lead for Starshade Rendezvous, a feasibility study for a space-based mission to find and characterize Earth-like exoplanets. In her memoir, she shares her personal story of finding herself widowed at 40, a suddenly single mother of two young sons, while she explains the science of her search for other worlds.

    NASA/MIT TESS

    NASA/MIT TESS in the building.

    NASA/MIT TESS replaced Kepler in the search for exoplanets.

    TESS is a NASA Astrophysics Explorer mission led and operated by MIT in Cambridge, Massachusetts, and managed by NASA’s Goddard Space Flight Center.

    Additional partners include Northrop Grumman, based in Falls Church, Virginia; NASA’s Ames Research Center in California’s Silicon Valley; the Center for Astrophysics – Harvard and Smithsonian in Cambridge; MIT Lincoln Laboratory; and the Space Telescope Science Institute in Baltimore.

    NASA JPL Starshade

    This excerpt, drawn from different sections of her book, chronicles her work to develop the cubesat ASTERIA.

    MIT/NASA JPL/Caltech ASTERIA cubesat, developed by Sara Seager of MIT.

    A satellite the size of a small suitcase, ASTERIA was designed to demonstrate the technology needed for a tiny telescope to search for exoplanets by detecting the minuscule dip in a star’s light when an orbiting planet passes in front of it.

    Planet transit. NASA/Ames.

    Seager initiated and developed ASTERIA at MIT, and later served as principal investigator while it was built and operated by the Jet Propulsion Laboratory from November 2017 until December 2019.

    Searching for shadows to find other worlds

    At its essence, astrophysics is the study of light. We know that there are stars other than the sun because we can see them shining. But light doesn’t just illuminate. Light pollutes. Light blinds. Little lights—exoplanets—have forever been washed out by the bigger lights of their stars, the way those stars are washed out by our sun. To find another Earth, we’d have to find the smallest lights in the universe.

    If, for the moment at least, astronomers couldn’t fight the brightness of stars, maybe we could use their power to our advantage. Bodies in transit sometimes align. If we were lucky, a planet might pass between us and its star, creating something like a miniature eclipse. The moon looks giant when it blocks out the sun. The Transit Technique, as it would come to be called, applied the same principle to exoplanets. We would find them not by the light they emitted, but by the light they spoiled. Nothing stands out like a black spot.
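
    The Transit Technique described above boils down to simple geometry: the fractional dip in a star’s brightness equals the ratio of the planet’s disk area to the star’s. A minimal sketch, using standard solar-system radii purely for illustration:

```python
# Transit depth: the fractional dip in stellar flux when a planet
# crosses the star's disk is (R_planet / R_star)^2.
# Radii below are standard solar-system figures, used for illustration.

R_SUN_KM = 695_700      # solar radius
R_EARTH_KM = 6_371      # Earth radius
R_JUPITER_KM = 69_911   # Jupiter radius

def transit_depth(r_planet_km: float, r_star_km: float = R_SUN_KM) -> float:
    """Fractional dip in stellar flux during a central transit."""
    return (r_planet_km / r_star_km) ** 2

earth_dip = transit_depth(R_EARTH_KM)      # ~8.4e-5, i.e. ~0.008%
jupiter_dip = transit_depth(R_JUPITER_KM)  # ~1e-2, i.e. ~1%
print(f"Earth-like dip: {earth_dip:.2e}, Jupiter-like dip: {jupiter_dip:.2e}")
```

    The hundredfold difference between the two dips is why hot Jupiters were found first: a Jupiter-size shadow is a far easier black spot to see.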

    In the fall of 1999, while I was a postdoctoral fellow at the Institute for Advanced Study in Princeton, the first transit of a known planet—HD 209458b, a “hot Jupiter”—was announced. It was absolutely fantastic news, in part because the discovery erased the last shred of doubt that exoplanets exist.

    Studying starlight for signs of life

    I had been turning over an idea—a genuinely original one—and the successful use of the Transit Technique gave it a greater urgency. A lot of science, especially pioneering science, relies on intuition. I didn’t have any evidence that my idea would work. But I was doubtless. I had realized that the technique might help reveal something more than the black silhouette of a planet. Immediately around that tiny partial eclipse, the same starlight that was being blocked by an exoplanet would pass through its atmosphere. The starlight would reach us, but not the way regular starlight reaches us. It would be filtered, like water running through a screen, or a flashlight’s beam struggling through a fog.

    If you look at a rainbow from a distance, its many colors form a perfect union. But if you look at a rainbow more closely, using an instrument called a spectrograph, you can see gaps in the light, minuscule breaks in each wavelength like missing teeth. Gases in the solar atmosphere and Earth’s own thin envelope interrupt the transmission of sunlight, the way power lines cause static in a radio signal. Certain gases interfere in telltale ways. One gas might take a bite out of indigo, while another gas might have an appetite for yellow or blue.

    Why couldn’t we use a spectrograph to look at the starlight passing through a transiting exoplanet’s atmosphere? That way we could determine what sorts of gases surround that exoplanet. We already knew that large amounts of certain gases are likely to exist only in the presence of life. We call them biosignature gases. Oxygen is one; methane is another. We could start with hot Jupiters, the planets we already know, and their more easily detectable atmospheres. Like a skunk’s spray, their traces of sodium and potassium would stand out amid the company of less potent atoms.

    I kept my idea to myself, because I knew it was great—I was the first to see the potential of the Transit Technique for studying atmospheres—and I knew, too, that great ideas get stolen. Dimitar Sasselov, my former PhD supervisor, was the only person I told about my theory, and he offered to help me bring it closer to practice. When we had worked out the details, I published a paper [The Astrophysical Journal Supplement Series] extolling what Dimitar and I called “transit transmission spectra”—reading the gaps in rainbows.
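
    The filtering effect Seager describes can be sketched numerically. This is an illustrative calculation, not from the memoir: at wavelengths where an atmospheric gas absorbs, the planet blocks starlight out to a few extra scale heights above its surface, so the transit appears slightly deeper there. The radii and scale height below are standard Earth/sun figures, assumed for illustration.

```python
# Transit transmission spectroscopy, in miniature: compare the transit
# depth in an absorption band (planet appears a few scale heights larger)
# with the depth outside it.

R_STAR = 695_700.0   # km, sun-like star
R_PLANET = 6_371.0   # km, Earth-size planet
SCALE_HEIGHT = 8.5   # km, roughly Earth's atmospheric scale height

def depth(r_eff_km: float) -> float:
    """Transit depth for an effective blocking radius."""
    return (r_eff_km / R_STAR) ** 2

continuum = depth(R_PLANET)                   # no atmospheric absorption
in_band = depth(R_PLANET + 5 * SCALE_HEIGHT)  # strong absorption band
extra_ppm = (in_band - continuum) * 1e6

# For an Earth twin the extra depth is about one part per million --
# one reason hot Jupiters, with their puffy atmospheres, came first.
print(f"extra depth in absorption band: {extra_ppm:.2f} ppm")
```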

    My paper received considerable attention. NASA was accepting proposals to use the Hubble Space Telescope; within a few months of publication, one team cited my work and won the rights to study the light that passed through the atmosphere of a transiting hot Jupiter. I was furious not to be included on that team, which chose an older male scientist over me.

    Within two years, their work revealed the first exoplanet atmosphere. It didn’t surround another Earth, but my premise had worked. We had seen our first alien sky.

    Spying on stars with tiny satellites

    One of the great hurdles in looking for exoplanets is the time it takes to find them. The nearest and brightest sun-like stars are scattered all over the sky, which means that no telescope can take in more than a few at a time. But it’s prohibitively expensive, as well as nonsensical, to use something like Hubble or Spitzer to stare at a single star system waiting, hoping, to see the shadows of planets we’re not sure exist. Properly mapping a star system might take years.

    I had been trying to make a long-term plan to find another Earth when I learned about what the community had taken to calling cubesats—tiny satellites designed to a standard form, which supposedly made them cheaper and easier to build and deliver into space. What if I made a constellation of cubesats, each assigned to look at only one star? I dreamed of space telescopes the size of a loaf of bread—not one, but an army, fanning out into orbit like so many advance scouts. Each could settle in and monitor its assigned sun-like star for however long I needed it to; each could be dedicated to learning everything possible about one single light. Hubble, Spitzer, Kepler—they each saw hugely. Maybe now we needed dozens or hundreds of narrower gazes, using the Transit Technique [above] as the principal method of discovery. Cubesats wouldn’t see what larger space telescopes could see, but they would never need to blink.

    This panorama of the northern sky captured by TESS (transiting exoplanet survey satellite) [above] includes an edgewise view of the Milky Way. Sara Seager served as deputy science director of the MIT-led TESS, a NASA Explorer mission, from 2016 to 2020. Credit: NASA/MIT/TESS AND ETHAN KRUSE (USRA).

    I talked to David Miller, a colleague and engineering professor who was in charge of what would become one of my favorite classes: a design-and-build class for fourth-year undergraduates. It was revolutionary when it started, because it was so project-based; after a few introductory lectures, the students dived into the challenges of making an actual satellite. I asked David whether I could use his class to incubate my cubesat idea.

    He was enthusiastic from the start. Maybe the best thing about MIT is that no matter how crazy your idea, nobody says it’s not going to work until it’s proved unworkable. And squeezing a space telescope inside something as small as a cubesat was a pretty crazy idea. The main challenge would be in making something small that was still stable enough to gather clear data—a tall order because smaller satellites, like smaller anything, get pushed around in space more easily than larger objects. To take precise brightness measurements of a star, we would need to be able to keep its center of brightness fixed to the same tiny fraction of a pixel, far finer than the width of a human hair. We would have to make something that was a hundred times better than anything that currently existed in the cubesat’s mass class. Imagine making a car engine that runs a hundred times better than today’s best car engine.

    “Let’s do it,” David said.

    Statistics and space hardware

    Cubesats are much cheaper than regular satellites, because they’re smaller and easier to launch; they take up a lot less room in the hold of a rocket, and it costs $10,000 to send a pound of anything into space. Unfortunately, their cheap manufacture makes them prone to failure. Many of them never work. We use the same hopeless term for them that doctors use for patients they never got the chance to save: “DOA.”

    One of our first hurdles, then, was a problem of statistics. (Every problem is a problem of statistics.) To make the cloud of cubesats that would come to be called ASTERIA, we had to figure out how many satellites we would need to give us a reasonable chance of finding another Earth-size planet. Thousands of bright, sun-like stars were worth monitoring, but we wouldn’t be able to build and manage thousands of satellites. We also knew that given the ephemeral nature of transits, the odds of an Earth-size planet transiting a sun-like star were only about 1 in 200. Some of our satellites would also no doubt fail or be lost. If we sent up only a few, we would have to be either very strategic or very lucky to find what we were looking for. There was some optimal number of satellites that, combined with a smart list of target stars, would keep our budget reasonable but still give us a good chance of success.
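
    The fleet-sizing problem above can be sketched as a simple probability calculation. The 1-in-200 transit odds come from the text; the satellite survival rate below is an assumed figure for illustration, as is the simplification of one satellite per star with every star hosting an Earth-size planet:

```python
# How many cubesats for a reasonable chance of catching one transit?
# Treat each working satellite as an independent 1-in-200 trial.

p_transit = 1 / 200   # geometric odds an Earth-size planet transits (from the text)
p_survive = 0.7       # assumed fraction of cubesats that work after launch

def p_at_least_one(n_satellites: int) -> float:
    """Chance at least one working satellite observes a transit."""
    p_per_sat = p_transit * p_survive
    return 1 - (1 - p_per_sat) ** n_satellites

for n in (50, 200, 500):
    print(f"{n:4d} satellites -> {p_at_least_one(n):.0%} chance")
```

    Even under these generous assumptions, a 50-satellite fleet gives only about a one-in-six chance, which is why the choice of target stars mattered so much.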

    I was lucky to have a great group of graduate students and postdocs who I leaned on when my husband, Mike, got sick. I set one to work on ASTERIA’s optics, another on precision pointing, a third on communications. With their help, I’d made progress toward a prototype for my tiny satellites, inventing and testing precision-pointing hardware and software, and perfecting the design of the onboard telescope and its protective baffle. I worked hard to clear the rest of the path for ASTERIA to become real. After we’d laid the groundwork in the design-and-build class, my students and I were joined in our efforts by Draper Laboratory in Cambridge, where researchers work on things like missile guidance systems and submarine navigation. They also do a lot of work on space hardware. We had meetings every week, trying to solve the problems of small telescopes. We could build small enough components, and we could deploy the satellite and tell it what to do, but we still couldn’t figure out how to keep it as stable as we needed it to be. While we tried to solve that issue, I used my ongoing research on biosignature gases to determine what types of exoplanets deserved our focus. I thought we might be able to explore a hundred star systems or so in my lifetime; they had to be the right ones.

    A test in the desert

    Night fell, desert-hard and blacker than black as we huddled together on a big patch of concrete at an old missile site in the middle of New Mexico to test out a new component for ASTERIA. I was more and more certain of its value. It wasn’t Hubble or Spitzer or Kepler, and it might never be something so magnificent.

    NASA/ESA Hubble Telescope.

    NASA/Spitzer infrared telescope, no longer in service; launched in 2003 and retired on January 30, 2020. Credit: NASA.


    But not every painting should or could be Starry Night. There is room in the universe for smaller work, a different kind of art. Kepler might find thousands of new worlds, but it wouldn’t reveal enough of any single one of them for us to know whether it was somebody’s home. It was sweeping its eye across star fields that were too far away for astronomers to make anything more than assumptions about places like Kepler-22b.

    Kepler-22b. Credit: NASA Exoplanets https://exoplanets.nasa.gov/exoplanet-catalog/1599/kepler-22b/

    But if I could just make ASTERIA work, and then find a way to send up a fleet of satellites, it would combine the best outcomes of NASA’s Kepler space telescope, capable of finding smaller planets around sun-like stars, and the nascent TESS, with its more proximate search and sensitivity to red dwarf stars.

    Engineers test ASTERIA before its 2017 launch. Credit: NASA/JPL-CALTECH.

    My team built a prototype for a possible camera, one that was promisingly stable and could operate at a warmer temperature than the detectors used in most satellites. (Most have to be cooled, which taxes the machine.) I just wasn’t sure that it would see what we needed it to see. I had a particularly bright and enthusiastic grad student at the time, named Mary Knapp; she had been an undergraduate in the first design-and-build class I taught. She encouraged us to test the camera outside, using it to look at real stars. Mary proposed the deserts of New Mexico as our proving ground. That April, there would be a new moon, leaving the already clear desert sky an even deeper pitch black. That new moon also coincided with school break for my sons, Max and Alex, which meant that I could take them along. As much as I wanted to see the stars, I wanted to see them, too.

    I had asked a local club of amateur astronomers where the best place to test our camera might be. That night they invited us to their star-viewing party, a celebration of the new moon. We arrived at dusk at the old missile site. I looked up at the stars and felt my childlike wonder return. I think the boys felt it too.

    We set up the camera. We would have to wait until we were back at MIT to analyze our data, but our new type of detector, one not yet used for astronomy, seemed to do the trick. We knew at least that our experiment wasn’t a total failure.

    A long-awaited launch

    In August 2017, after years of work and hope and effort, SpaceX prepared to launch a Falcon 9 rocket into space. The rocket didn’t have a crew, but ASTERIA was on board.

    It had been a difficult journey. The camera had made its way from my imagination to our design-and-build class, through drawings and prototypes and an old missile site in New Mexico. Then we’d run out of money at MIT, and Draper Laboratory had liked the technology better for other things. The Jet Propulsion Laboratory, which had always been interested in the possibilities of cubesats and ASTERIA in particular, picked up where MIT and Draper left off. Three MIT graduates there would play leading roles on the project; they took their work seriously, having seen firsthand how much it mattered. Their passion and expertise made sure that ASTERIA would become everything it could be, that it was built right and lovingly placed, at last, into the hold of a rocket, groaning on the launchpad on a beautiful late-summer day. The rocket would slice into the sky and rendezvous with the International Space Station. The astronauts there would set our little satellite free later in the fall. From a whisper in my dreams to space: I couldn’t believe that we were nearing the end of such a long reckoning.

    I had planned on going to the ASTERIA launch, but it was delayed just long enough for travel and child-care plans to fall through. On the day of the launch, I took the train into Cambridge instead, walked to the Green Building, and took the elevator to my floor. I walked past the travel posters for distant worlds into my office, shut the door, and called up the online video stream. The launch was a big deal; all over the world, eyes were trained on that rocket, still waiting on the pad.

    Every now and then I looked up from the cloudless Florida footage on my screen and out my windows, at my crystalline view of downtown Boston. There were clear skies everywhere I looked. I spent maybe 30 minutes in the quiet, writing thank-you emails to other members of the ASTERIA team. At the last second I decided not to send them. I know that superstition is unscientific. I understand that it doesn’t matter to the universe if a baseball player is wearing his lucky underwear—whether he gets a hit is mostly up to the pitcher and to him. But rockets are delicate, ill-tempered machines. Before the Russians launch rockets from the steppes of Kazakhstan into orbit, they summon an Orthodox priest to throw holy water at the boosters, his beard and cloak and the holy water carried sideways by the wind. I wasn’t going that far, but I wasn’t going to send a couple of emails until we were safely weightless. I was surprised by how nervous I was, watching the countdown clock tick down to launch.

    The engines ignited with a great big ball of pure fire. The launch tower fell away, and the rocket eased its way off the pad, gained speed, and pushed its shining shoulders toward its future orbit. The onboard cameras recorded its arching flight as the sky around it went from blue to purple to black. The rocket had broken through into space. The boosters were jettisoned, and the remainder of the rocket continued its climb into the deepest possible night, the Earth blue and alight behind it, an impossible blackness ahead. It would take a little while for it to catch up with the space station, which was racing its own way through orbit at 17,000 miles an hour, about five miles every second. But the rocket, and our satellite, were well on their way.

    Everything brave has to start somewhere, I thought.

    Do I believe in other life in the universe?

    Yes, I believe.

    The better question: What does our search for it say about us? It says we’re curious. It says we’re hopeful. It says we’re capable of wonder and of wonderful things.

    See the full article here.



  • richardmitnick 3:20 pm on November 6, 2020 Permalink | Reply
    Tags: "Half the Milky Way’s sun-like stars could be home to Earth-like planets", MIT Technology Review

    From MIT Technology Review: “Half the Milky Way’s sun-like stars could be home to Earth-like planets” 

    From MIT Technology Review

    November 6, 2020
    Neel V. Patel

    Wikimedia Commons

    A new study of exoplanet data suggests there are at least 300 million potentially habitable planets orbiting stars like the sun, and likely way more.

    Astronomers have discovered nearly 4,300 exoplanets, and it is now clear that our galaxy is filled with them. But the point of looking for these new worlds is more than an exercise in stamp collecting: it’s to find one that could be home to life, be it future humans who have found a way to travel those distances or extraterrestrial life that has already made a home for itself. The best opportunity to find something like that is to find a planet that resembles Earth.

    And what better way to look for Earth 2.0 than to search around stars similar to the sun? A new analysis of exoplanet data collected by NASA’s Kepler space telescope, which operated from 2009 to 2018, offers new predictions for how many of the Milky Way’s stars comparable to the sun in temperature and age are likely to be orbited by a rocky, potentially habitable planet like Earth. Applied to current estimates of 4.1 billion sun-like stars in the galaxy, the model suggests there are at minimum 300 million with at least one habitable planet.

    The model’s average, however, posits that one in two sun-like stars could have a habitable planet, causing that figure to swell to over 2 billion. Even less conservative predictions suggest it could be over 3.6 billion.

    The new study [The Occurrence of Rocky Habitable Zone Planets Around Solar-Like Stars from Kepler Data] has not yet been peer-reviewed, but it will be soon, and it is due to be published in The Astronomical Journal.

    “This appears to be a very careful study and deals with really thorny issues about extrapolating from the Kepler catalogue,” says Adam Frank, a physicist and astronomer at the University of Rochester, who was not involved with the study. “The goal is to get a complete, reliable, and accurate estimate for the average number of potentially habitable planets around stars. They seem to have made a good run at that.”

    Scientists have made several attempts in the past to use Kepler data to work out how many sun-like stars in the galaxy have potentially habitable exoplanets in their orbit. But these studies have provided answers that ranged from less than 1% to more than 100% (i.e., multiple planets around these stars). It’s a reflection of how hard it’s been to work with this data, says Steve Bryson of NASA Ames Research Center in California, who led the new work.

    Two major issues have created this large window: incomplete data, and the need to cull false detections from the Kepler data set.

    The new study addresses both of these problems. It’s the first of its kind to use the full Kepler exoplanet data set (more than 4,000 detections from 150,000 stars), but it’s also using stellar data from Gaia, the European Space Agency’s mission to map every star in the Milky Way.

    ESA (EU)/GAIA satellite.

    All that helped make the final estimates more accurate, with smaller uncertainties. And this is after scientists have spent years analyzing the Kepler catalogue to strip away obscuring elements and ensure that only real exoplanets are left. Armed with both Kepler and Gaia data, Bryson and his team were able to determine the rate of formation for sun-like stars in the galaxy, the number of stars likely to have rocky planets (with radii 0.5 to 1.5 times Earth’s), and the likelihood those planets would be habitable.

    On average, Bryson and his team predict, 37 to 60% of sun-like stars in the Milky Way should be home to at least one potentially habitable planet. Optimistically, the figure could be as high as 88%. The conservative calculations pull this figure down to 7% of sun-like stars in the galaxy (hence 300 million)—and on the basis of that number, the team predicts there are four sun-like stars with habitable planets within 30 light-years of Earth.
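
    The headline figures follow from straightforward arithmetic: multiply each occurrence rate by the estimated 4.1 billion sun-like stars. A quick check of the numbers quoted above:

```python
# Occurrence rate x number of sun-like stars = habitable-planet hosts.
# Rates are the conservative, average, and optimistic figures from the study.

SUN_LIKE_STARS = 4.1e9  # estimated sun-like stars in the Milky Way

for label, rate in [("conservative", 0.07), ("average", 0.5), ("optimistic", 0.88)]:
    hosts = rate * SUN_LIKE_STARS
    print(f"{label:>12}: ~{hosts / 1e9:.1f} billion stars with a habitable planet")
```

    The conservative 7% rate yields roughly 0.3 billion, matching the 300 million floor; the one-in-two average gives just over 2 billion.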

    “One of the original goals of the Kepler mission was to compute exactly this number,” says Bryson. “We have always intended to do this.”

    Habitability has to do with the chances a planet has temperatures moderate enough for liquid water to exist on the surface (since water is essential for life as we know it). Most studies figure this out by gauging the distance of an exoplanet from its host star and whether its orbit is not too close and not too far—the so-called Goldilocks zone.

    According to Bryson, orbital distance is a useful metric when you’re examining one specific star. But when you’re looking at many stars, they’ll all exhibit different brightnesses that deliver different amounts of heat to surrounding objects, which means their habitable zones will vary. The team instead chose to think about habitability in terms of the amount of light hitting the surface of an exoplanet, which the paper calls the “instellation flux.”
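
    The instellation-flux idea can be sketched with the inverse-square law. This is a minimal illustration of the concept, not the paper’s actual model: the flux a planet receives scales with its star’s luminosity and falls off with the square of its orbital distance.

```python
# Instellation flux in Earth units: F = L / d^2, with luminosity in
# solar units and distance in astronomical units (AU). Earth's value
# is 1 by construction.

def instellation(luminosity_solar: float, distance_au: float) -> float:
    """Flux at the planet, in units of Earth's instellation."""
    return luminosity_solar / distance_au ** 2

print(instellation(1.0, 1.0))    # Earth: 1.0
print(instellation(0.5, 1.0))    # 1 AU from a half-luminosity star: 0.5
print(instellation(0.5, 0.707))  # moved in to ~0.71 AU: ~1.0 again
```

    Two planets at very different orbital distances can therefore receive identical Earth-like instellation, which is why flux travels across a mixed sample of stars better than distance does.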

    Through stellar brightness data, “we are measuring the true temperature of the planet—whether or not it is truly in the habitable zone—for all the planets around all the stars in our sample,” says Bryson. You don’t get the same sort of reliable temperature figures working with distances, he says.

    Though Bryson claims this study’s uncertainties are smaller than those in previous efforts, they are still quite large. This is mainly because the team is working with such a small sample of discovered rocky exoplanets. Kepler has identified over 2,800 exoplanets, only some of which orbit sun-like stars. It’s not an ideal number to use to predict the existence of hundreds of millions of others in the galaxy. “By having so few observations, it limits what you can say about what the truth is,” says Bryson.

    Lastly, the new study assumes a simple model for these exoplanets that could depart dramatically from conditions in the real world (some of these stars may form binary star systems with other stars, for example). Plugging more variables into the model would help paint a more accurate picture, but that requires more precise data that we don’t really have yet.

    But it’s studies like these that could help us acquire that data. The whole point of Kepler was to help scientists figure out what kinds of interstellar objects they ought to devote more resources to studying to find extraterrestrial life, especially with space-based telescopes whose observation time is limited. These are the instruments (such as NASA’s James Webb Space Telescope and the ESA’s PLATO telescope) that could determine whether a potentially habitable exoplanet has an atmosphere or is home to any potential biosignatures, and studies like this latest one can help engineers design telescopes more suited to these tasks.

    “Almost every sun-like star in the galaxy has a planet where life could form,” says Frank. “Humanity has been asking this question for more than 2,500 years, and now we not only know the answer, we are refining our knowledge of that answer. This paper tells us there are a lot of planets out there in the right place for life to form.”

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

  • richardmitnick 1:07 pm on October 14, 2020 Permalink | Reply
    Tags: "Room-temperature superconductivity has been achieved for the first time", MIT Technology Review

    From MIT Technology Review: “Room-temperature superconductivity has been achieved for the first time” 

    From MIT Technology Review

    October 14, 2020
    Konstantin Kakaes

    Equipment used to create a room-temperature superconductor, including a diamond anvil cell (blue box) and laser arrays, is pictured in the University of Rochester lab of Ranga Dias. Credit: Adam Fenster.

    It was in a tiny sample under extremely high pressure, so don’t start dismantling the world’s energy infrastructure quite yet.

    Room-temperature superconductors—materials that conduct electricity with zero resistance without needing special cooling—are the sort of technological miracle that would upend daily life. They could revolutionize the electric grid and enable levitating trains, among many other potential applications. But until now, superconductors have had to be cooled to extremely low temperatures, which has restricted them to use as a niche technology (albeit an important one). For decades it seemed that room-temperature superconductivity might be forever out of reach, but in the last five years a few research groups around the world have been engaged in a race to attain it in the lab.

    One of them just won.

    In a paper published today in Nature, researchers report achieving room-temperature superconductivity in a compound containing hydrogen, sulfur, and carbon at temperatures as high as 58 °F (14.5 °C, or 287.7 K). The previous highest temperature had been 260 K, or 8 °F, achieved by a rival group at George Washington University and the Carnegie Institution in Washington, DC, in 2018. (Another group at the Max Planck Institute for Chemistry in Mainz, Germany, achieved 250 K, or -9.7 °F, at around this same time.) Like the previous records, the new record was attained under extremely high pressures—roughly two and a half million times greater than that of the air we breathe.
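    A quick arithmetic check of the figures quoted above (what's being verified here are the unit conversions, nothing about the experiment itself):

```python
# Sanity-check the quoted temperature and pressure records with
# standard unit conversions.

def kelvin_to_celsius(k: float) -> float:
    return k - 273.15

def kelvin_to_fahrenheit(k: float) -> float:
    return (k - 273.15) * 9 / 5 + 32

# New record: 287.7 K is about 14.5 degrees C, or 58 degrees F.
assert abs(kelvin_to_celsius(287.7) - 14.55) < 0.01
assert round(kelvin_to_fahrenheit(287.7)) == 58

# Previous records: 260 K is about 8 F; 250 K is about -9.7 F.
assert round(kelvin_to_fahrenheit(260.0)) == 8
assert round(kelvin_to_fahrenheit(250.0), 1) == -9.7

# "Two and a half million atmospheres" expressed in gigapascals:
PASCALS_PER_ATM = 101_325
assert round(2.5e6 * PASCALS_PER_ATM / 1e9) == 253  # ~253 GPa
```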

    “It’s a landmark,” says José Flores-Livas, a computational physicist at the Sapienza University of Rome, who creates models that explain high-temperature superconductivity and was not directly involved in the work. “In a couple of years,” he says, “we went from 200 [K] to 250 and now 290. I’m pretty sure we will reach 300.”

    Electric currents are flowing electric charges, most commonly made up of electrons. Conductors like copper wires have lots of loosely bound electrons. When an electric field is applied, those electrons flow relatively freely. But even good conductors like copper have resistance: they heat up when carrying electricity.

    Superconductivity—in which electrons flow through a material without resistance—sounds impossible at first blush. It’s as though one could drive at high speed through a congested city center, never hitting a traffic light. But in 1911, Dutch physicist Heike Kamerlingh Onnes found that mercury becomes a superconductor when cooled to a few degrees above absolute zero (about -460 °F, or -273 °C). He soon observed the phenomenon in other metals like tin and lead.

    For many decades afterwards, superconductivity was created only at extremely low temperatures. Then, in late 1986 and early 1987, a group of researchers at IBM’s Zurich laboratory found that certain ceramic oxides can be superconductors at temperatures as high as 92 K—crucially, over the boiling temperature of liquid nitrogen, which is 77 K. This transformed the study of superconductivity, and its applications in things like hospital MRIs, because liquid nitrogen is cheap and easy to handle. (Liquid helium, though colder, is much more finicky and expensive.) The huge leap in the 1980s led to feverish speculation that room-temperature superconductivity might be possible. But that dream had proved elusive until the research being reported today.

    Under pressure

    Superconductivity can arise when the electrons flowing through a material are “coupled” to phonons—vibrations in the material’s atomic lattice. Because the two move in sync, theorists believe, electrons can flow without resistance. Low temperatures can create the circumstances for such pairs to form in a wide variety of materials. In 1968, Neil Ashcroft, of Cornell University, posited that under high pressures, hydrogen would also be a superconductor. By forcing atoms to pack closely together, high pressures change the way electrons behave and, in some circumstances, enable electron-phonon pairs to form.

    Scientists have for decades sought to understand just what those circumstances are, and to figure out what other elements might be mixed in with hydrogen to achieve superconductivity at progressively higher temperatures and lower pressures.

    In the work reported in today’s paper, researchers from the University of Rochester and colleagues first mixed carbon and sulfur in a one-to-one ratio, milled the mixture down to tiny balls, and then squeezed those balls between two diamonds while injecting hydrogen gas. A laser was shined at the compound for several hours to break down bonds between the sulfur atoms, thus changing the chemistry of the system and the behavior of electrons in the sample. The resulting crystal is not stable at low pressures—but it is superconducting. It is also very small—under the high pressures at which it superconducts, it is about 30 millionths of a meter in diameter.

    The exact details of why this compound works are not fully understood—the researchers aren’t even sure exactly what compound they made. But they are developing new tools to figure out what it is and are optimistic that once they are able to do so, they will be able to tweak the composition so that the compound might remain superconducting even at lower pressures.

    Getting down to 100 gigapascals—about half the pressure used in today’s Nature paper—would make it possible to begin industrializing “super tiny sensors with very high resolution,” Flores-Livas speculates. Precise magnetic sensors are used in mineral prospecting and to detect the firing of neurons in the human brain, as well as in fabricating new materials for data storage. A low-cost, precise magnetic sensor is the type of technology that doesn’t sound sexy on its own but makes many others possible.

    And if these materials can be scaled up from tiny pressurized crystals into larger sizes that work not only at room temperature but also at ambient pressure, that would be the beginning of an even more profound technological shift. Ralph Scheicher, a computational modeler at Uppsala University in Sweden, says that he would not be surprised if this happened “within the next decade.”

    Resistance is futile

    The ways in which electricity is generated, transmitted, and distributed would be fundamentally transformed by cheap and effective room-temperature superconductors bigger than a few millionths of a meter. About 5% of the electricity generated in the United States is lost in transmission and distribution, according to the Energy Information Administration. Eliminating this loss would, for starters, save billions of dollars and have a significant climate impact. But room-temperature superconductors wouldn’t just change the system we have—they’d enable a whole new system. Transformers, which are crucial to the electric grid, could be made smaller, cheaper, and more efficient. So too could electric motors and generators. Superconducting energy storage is currently used to smooth out short-term fluctuations in the electric grid, but it remains relatively niche because it takes a lot of energy to keep superconductors cold. Room-temperature superconductors, especially if they could be engineered to withstand strong magnetic fields, might serve as a very efficient way to store larger amounts of energy for longer periods, making renewable but intermittent energy sources like wind turbines and solar cells more effective.

    And because flowing electricity creates magnetic fields, superconductors can also be used to create powerful magnets for applications as diverse as MRI machines and levitating trains. Superconductors are of great potential importance in the nascent field of quantum computing, too. Superconducting qubits are already the basis of some of the world’s most powerful quantum computers. Being able to make such qubits without having to cool them down would not only make quantum computers simpler, smaller, and cheaper, but could lead to more rapid progress in creating systems of many qubits, depending on the exact properties of the superconductors that are created.

    All these applications are in principle attainable with superconductors that need to be cooled to low temperatures in order to work. But if you have to cool them so radically, you lose many—in some cases all—of the benefits you get from the lack of electrical resistance. It also makes them more complicated, expensive, and prone to failure.

    It remains to be seen whether scientists can devise stable compounds that are superconducting not only at ambient temperature, but also at ambient pressure. But the researchers are optimistic. They conclude their paper with this tantalizing claim: “A robust room-temperature superconducting material that will transform the energy economy, quantum information processing and sensing may be achievable.”

    See the full article here.



  • richardmitnick 11:09 am on October 13, 2020 Permalink | Reply
    Tags: "Inside Singapore’s (SG) huge bet on vertical farming", MIT Technology Review   

    From MIT Technology Review: “Inside Singapore’s (SG) huge bet on vertical farming” 

    From MIT Technology Review

    Megan Tatum

    From the outside, VertiVegies looked like a handful of grubby shipping containers put side by side and drilled together. A couple of meters in height, they were propped up on a patch of concrete in one of Singapore’s nondescript suburbs. But once he was inside, Ankesh Shahra saw potential. Huge potential.

    Shahra, who wears his dark hair floppy and his expensive-looking shirts with their top button casually undone, had a lot of experience in the food industry. His grandfather had founded the Ruchi Group, a corporate powerhouse in India with offshoots in steel, real estate, and agriculture; his father had started Ruchi Soya, a $3 billion oilseed processor that had been Shahra’s training ground.

    By the time Shahra was introduced to VertiVegies founder Veera Sekaran at a friend’s party in 2017, he was hungry to make his own entrepreneurial mark. A previous attempt had involved sourcing organic food from around Asia: “an eye-opening experience, one with a lot of pressure,” he says. It helped him spot a problem that needed solving.

    “I’d seen how much dependency farmers have globally on weather,” he says. “Yields were hugely erratic: there are so many inconsistencies and dependencies that it’s a hugely difficult profession for the bulk of farmers. The perishable supply chain was so broken.”

    And what Shahra saw when he stepped into Sekaran’s repurposed shipping containers was a solution.

    Inside, mismatched plastic trays sat carefully stacked on industrial metal shelves, stretching all the way from the concrete floor to the corrugated-steel ceiling. In each tray were small green plants of different species and sizes, all with their roots bathed in the same watery solution, their leaves curling up toward the same pink glow of faintly humming LED bar lights above.

    A controlled environment means VertiVegies’s food, such as edible flowers, can be grown without pesticides. Credit: COURTESY OF VERTIVEGIES.

    With VertiVegies, Sekaran was farming vertically: growing vegetables indoors, with towers of crops stacked one on the other instead of in wide, sprawling fields, and in hydroponic solution instead of soil. He was growing food without exposure to weather or seasons, using techniques pioneered by others, in a country that was badly in need of a new way to meet its food needs.

    Singapore is the third most densely populated country in the world, known for its tightly packed high-rises. But to cram all those gleaming towers and nearly 6 million people into a land mass half the size of Los Angeles, it has sacrificed many things, including food production. Farms make up no more than 1% of its total land (in the United States it’s 40%), forcing the small city-state to shell out around $10 billion each year importing 90% of its food.

    Here was an example of technology that could change all that.

    Sekaran came from a world very different from Shahra’s. The fifth of nine children, he had lost his father at five years old and grew up poor. So little money did the family have that Sekaran would show up to school in an oversized uniform, clutching his textbooks in a paper bag. But he climbed out of poverty, paying his own way through university and never losing his irrepressible passion for living things. By the time the pair met, Sekaran had qualified as a botanist and worked in the Seychelles, Pakistan, and Morocco before returning home. In almost every media interview or biography he is referred to, almost reverently, as a “plant whisperer.”

    “We were two different personalities for sure,” says Shahra with a chuckle. But in VertiVegies, Sekaran had created the prototype for a vision both men shared.

    “It was intriguing,” Shahra says. “On paper, indoor farming solves all sorts of problems. But for me it was about: How do we make a sustainable business model out of it? You’re not going to solve food security with five or 10 containers.”

    He spent six months in discussion with Sekaran, and months more visiting urban-farm specialists across the region, learning every single thing he could. “All of 2017 was spent going through the systems, the technology, and just not being able to wrap my head around how to scale it,” he says.

    The solution, when it came, felt surprisingly serendipitous.

    Trouble at home

    It’s taken decades for Singapore to wake up and realize that—as far as food goes—it is one of the most vulnerable countries in the world.

    This risk simply hadn’t occurred to authorities back in the 1970s, when they ripped up the crops of tapioca, sweet potatoes, and vegetables flourishing across more than 15,000 hectares of the country’s land and replaced them with high-rise office buildings and condos. The focus back then was finance, telecoms, and electronics, not food.

    But while this strategy successfully swelled Singapore’s economy (it’s now the fourth richest country in the world, per capita), it left the country with only 600 hectares of farmland. Food manufacturing is now worth just S$4.3 billion, or 1% of GDP, compared with just over 5% in the US.

    The precariousness of this situation hit home in 2008, when—a few months before the global financial crisis took hold—the world suffered a spike in food prices. Bad weather, rising fuel costs, and population growth had converged to send the cost of food commodities soaring. There were riots and widespread political unrest.

    Once a layer of plants is grown, the tall stacks of plants can be harvested. Credit: ZAKARIA ZAINAL.

    Without production of its own, Singapore saw its food supplies take a big hit. Imported raw food rose 55% in price in 12 months, and commodities such as rice, grain, and maize as much as 31%. The state was forced to absorb hikes in the costs of basics like cooking oil, bread, and milk—something made even tougher by the fact that China, from which Singapore imports around $600 million worth of food each year, had experienced its worst winter weather in 50 years, destroying crops and further pushing up regional food prices from late 2007 to mid-2008.

    Delivering the bad news to parliament in February 2008, the finance minister, Tharman Shanmugaratnam, warned that “the factors … which have led to these food price increases are not expected to go away soon.” Singapore needed to act.

    Since then, food security has raced up the agenda. The government’s stated policy is now to produce enough food to supply 30% of its own nutritional needs by 2030, up from just 10% now. To get there, it says, Singapore will need to grow 50% of all fruits and vegetables consumed domestically, 25% of all proteins, and 25% of all staples, such as brown rice. The commitment effectively aims to triple production by volume in the next 10 years. And since the country is short of land, it has pinned its hopes on technology. This year alone Singapore’s government has set aside S$55 million (US$40 million) to fund agritech projects. Scouting teams have been bundled off on food security fact-finding missions, and sprawling agritech parks have been built.

    For Shahra and Sekaran, the turning point came in August 2017, when authorities started making plots of farmland available to any company using tech or innovation to boost food security.

    The 10 government-owned plots, each around two hectares in size, are all in Lim Chu Kang—a patch of green north of the city, where fruit trees, dairy farms, and organic vegetable operations provide a small supply of local produce. Startups that could convince the authorities their plan had legs would be sold the land at a fraction of its market value.

    Finally, Shahra had a way to scale up VertiVegies. “It would take away our biggest hurdle,” he says of the announcement. “It would unlock the ability to expand.”

    They hurriedly pulled together a proposal using all the information they’d gathered in the previous months. By February 2018 they were successful, and by June they’d taken possession of a S$300,000 plot and laid out their vision.

    Leafy greens and herbs like arugula are packaged and sold locally. Credit: COURTESY OF VERTIVEGIES.

    Once completed, the new farm will be Singapore’s biggest: the warehouse will stretch 20,000 square meters (roughly the size of three soccer fields) and, once at full capacity, produce six metric tons of leafy greens, microgreens, and herbs each day, to supply restaurants, retailers, and hotels. Not only will the plants grow up to 25% faster than those in a conventional outdoor field if all goes to plan, but with no soil and with a farming stack up to two meters high, they will require around a fifth as much room to grow as conventional crops. If it can meet its production targets, it will singlehandedly boost Singapore’s vegetable production by 10%.
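    A back-of-envelope check on those figures—purely illustrative, using only the numbers from the paragraph above:

```python
# Back-of-envelope arithmetic on the planned farm's stated figures.
warehouse_m2 = 20_000          # roughly three soccer fields
output_kg_per_day = 6_000      # six metric tons of greens and herbs

# Daily yield per square meter of warehouse footprint.
assert output_kg_per_day / warehouse_m2 == 0.3  # kg per m2 per day

# If stacked hydroponic crops need about a fifth of the room of
# conventional crops, matching this output outdoors would take
# roughly five times the land.
assert warehouse_m2 * 5 == 100_000  # m2, i.e. about 10 hectares
```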

    But it isn’t scale alone that separates VertiVegies from the competition. Only six months after securing the plot of land, Shahra also signed a deal with SananBio. The Chinese company is arguably the world’s biggest provider of vertical farming technology, operating vast indoor farms of its own in China; in 2017 it committed to investing $1 billion in scaling the technology. “The amount of R&D SananBio has invested in indoor farming solutions, we could never do. They were several years ahead of all the other companies I visited,” says Shahra. But thanks to the joint venture signed in August 2018, his team has access not only to SananBio’s physical growing systems, but also to its years of data on how to grow better and faster.

    The covid-19 pandemic has put plans for the main growing operation on hold, with focus temporarily switching to a smaller alternative that will be faster to build and easier to set up: it aims to produce 700 to 800 kilograms of vegetables per day. And in doing so, it will demonstrate a future for high-tech indoor farms in which the technology can finally be used to make a meaningful contribution to mainstream production.

    A global problem

    Food security is a pressing issue in Singapore, but it’s a growing concern almost everywhere else too.

    The world’s population is set to swell by a quarter by 2050, to 9.7 billion, creating an urgent need for more food. Estimates of exactly how much more vary from 25% to 70%, but nobody disputes that we’ll need more of everything: more grains, more meat, and far more fresh vegetables. Already the high cost of producing and distributing food is worsening global malnutrition: 690 million people were left without enough to eat in 2019, up 10 million from 2018. Failure to increase production will tip millions more into chronic hunger.

    Conventional outdoor food production is unlikely to meet this demand, especially with outdoor crops already feeling the impact of climate change. In 2019 alone, weather problems exacerbated by global warming hit the food system with a string of disasters: a heat wave hit farms in the US Midwest, severe cyclones destroyed corn output in sub-Saharan Africa, India battled relentless drought, and farmers on the banks of Asia’s Mekong River watched helplessly as rising water washed away livestock.

    Urbanization only makes this harder, cutting the amount of farmland available and putting more people in closer proximity to each other. The United Nations says that by 2050, 68% of the world will live in densely populated urban areas—up from 55% today. That will make them more reliant on imports and vulnerable to even small shocks to the market, or disruptions to supply.

    The pandemic has already provided a bitter first look at what that could mean. In Kenya’s urban slums, people were literally fighting each other for food as covid-19 spread and the disruption cut off regular supply routes into Nairobi, says Esther Ngumbi, an assistant professor of entomology at the University of Illinois and the founder of Oyeska Greens, an agricultural startup in Kenya that aims to empower local farms. It’s “extremely urgent” that we find alternatives for bringing production closer to demand, she told me.

    Of all the available options, high-output urban farms are our best bet, argues Dickson Despommier, an emeritus professor of microbiology and public health at Columbia University, and one of the founding fathers of vertical farming. “When the climate changes to disallow farming as we know it, we will have to look to other agricultural strategies for obtaining our food,” he says. “Indoor agriculture is an excellent option, and vertical farming is the most efficient indoor method for producing lots of food in a small architectural footprint.”

    Unlike the startups growing shrimp from stem cells or harvesting protein from black soldier flies, these indoor farms are already up and running almost everywhere. In the US and Europe, a growing number of high-tech farm operators champion themselves as a green alternative to conventional farms, selling bags of microgreens or kale to affluent consumers for up to 200% more than standard greens. The premium price is justified with the promise of pesticide-free, nutrient-packed produce.

    In developing countries, meanwhile, systems have been tweaked to accommodate unreliable electricity supplies and small budgets. According to the Swedish International Agriculture Network Initiative, around 35% of food in the Ugandan capital, Kampala, now comes from small urban farms, including vertical installations where vegetables are stacked in low-cost bags that protect plants from harmful UV rays. Advocates say they increase production by up to six times per square meter over conventional farming.

    But no region has taken this technology and run with it quite the way Asia has.

    From Shanghai to Seoul, Tokyo to Singapore, Asia’s muggy, rapidly rising metropolises have been among the first in the world to embrace indoor farms at scale. By 2010, Japan already had more indoor plant factories than the US would have by 2016, and there are now around 450 commercial indoor farms up and running across Asia.

    There are good reasons for this, according to Per Pinstrup-Andersen, a Danish economist and professor emeritus at Cornell University. As in Africa, many countries in Asia need to feed a growing urban middle class.

    But unlike their African counterparts, many Asian countries also have the money to invest in technology as a solution—and nowhere more so than in Singapore.

    Full Stack

    Darren Tan has had a front-row seat from which to watch as high-tech farms have become a central piece of the plan to boost food production in Singapore. He works as outreach coordinator at ComCrop, one of Singapore’s best-known urban-farm operators, which moved into a new 8,000-square-foot (740-square-meter) greenhouse in 2018. In an industrial glass shed on the rooftop of a former parking garage, Singapore’s relentless sun streams through the windows onto a sea of leafy greens, lettuce, and Italian basil.


    Though ComCrop doesn’t grow “up,” it has still spent the last 10 years honing many of the same techniques on which traditional vertical farms rely. Tan, who is tall and slim, talks at length about the use of hydroponics—replacing soil with a water-based solution in which sensors test electrical conductivity and painstakingly gauge the ratios between specific nutrients.

    Even a simple hydroponics system can double the yield of conventional farming, he says—“and if we were to fully optimize everything and scale up, making use of every single piece of land, then we could add more multipliers to that.” It’s this productivity in a small space that makes urban farms so appealing. “The only constraint we have is the availability of light,” he says.

    The situation is different for vertical farms, which use LED lamps because each row of plants blocks sunlight to the one below. But indoor operations turn this into an advantage: protected from the elements, they are designed to accelerate photosynthesis with endless artificial light.

    In fact, Paul Teng, a professor at Singapore’s Nanyang Technological University, estimates that indoor plant factories alone—the type that VertiVegies is building—could take the country from producing 13% of its leafy vegetables domestically to 30% in 10 years, churning out an additional 18,700 metric tons per year.

    The aim of all this isn’t for Singapore to lose its outward-looking ethos, says Tan—“But it’s important that on top of being able to import food from overseas, there is at least some local buffer that we could turn to in a crisis, or in the rare event that there are supply-chain disruptions.”

    Even though VertiVegies is among those making vertical farming a reality, there are plenty of skeptics. Most of them focus on the astronomical costs involved.

    Urban farms may use less land than those outdoors, but that land is far more expensive. One 2017 study in Australia estimated that a square meter of arable land in central Melbourne would cost US$3,491, compared with US$0.40 in rural areas. The price difference can mean that even at its most compressed, vertical farming does not save much on one of farming’s major capital expenses.

    Singapore aims to produce 30% of its food supply locally by 2030. Credit: ZAKARIA ZAINAL.

    Another ongoing problem is the cost of photosynthesis. While traditional farms benefit from free energy in the form of sunlight, one of the biggest expenditures for indoor farms is the 24/7 stream of artificial light. VertiVegies’s new farm will need 720 LED light tubes per 100 square meters of growing space, for example. The energy required can be prohibitive: one notorious analysis in 2014 estimated that a loaf of bread produced using standard indoor techniques would cost $23.

    But, though oft cited, that analysis is also dated. In the six years since those calculations were made, not only has the cost of an average 60-watt-equivalent LED bulb fallen (it’s about 80% cheaper than it was 10 years ago), but the energy efficiency of LEDs has improved dramatically. From 2005 to 2017, efficiency increased from 25 lumens per watt to 160. An LED streetlight now lasts about 60,000 hours.
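    A sketch of what those LED numbers imply. The efficiency figures (25 to 160 lumens per watt) and the 720 tubes per 100 square meters come from the text; the 20 W per tube is a hypothetical assumption, since the article doesn’t give a tube wattage.

```python
# Sketch of the LED improvements cited above.
lumens_per_watt_2005 = 25
lumens_per_watt_2017 = 160
assert lumens_per_watt_2017 / lumens_per_watt_2005 == 6.4
# i.e. the same light output now needs roughly a sixth of the power.

# Hypothetical daily lighting energy for 100 m2 of growing space:
tubes = 720                  # per 100 m2, from the text
watts_per_tube = 20          # assumption, not from the article
hours = 24                   # "24/7 stream of artificial light"
kwh_per_day = tubes * watts_per_tube * hours / 1000
assert kwh_per_day == 345.6  # kWh/day per 100 m2, under these assumptions
```

    Under these assumed figures, lighting a 100-square-meter stack costs a few hundred kilowatt-hours a day—which is why the sixfold efficiency gain matters so much to the economics.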

    Which isn’t to say indoor vertical farms don’t come with high startup and running costs. “If you look at the capital expenditure involved in starting an indoor vertical farm, it’s very high,” says Teng. “And to recover the investment costs and the direct running costs, operators need to charge 10% to 15% higher than, say, vegetables that come from Malaysia and China.” Many charge far more.

    Shahra feels that tension. While he and his small team wait for their new farm, they produce up to 250 kilograms of vegetables per week from a 140-square-meter pilot site in the city. Shahra spends days meeting with local retailers and restaurants to convince them it’s worth shelling out more on indoor-grown greens. He’s the first to admit this is both expensive and experimental.

    “At the end of the day, farming is still farming,” he says. “It might be in an air-conditioned room, but it’s repetitive; it’s hard work; it’s iterative. You can put all the bells and whistles on it, but at the end of the day you’re still growing a plant.”

    Getting the price down requires scale. Achieving scale requires mainstream appeal. That’s the chicken-and-egg situation that has left indoor farms in a bind the world over until now, Teng points out. But in 2020 we’ve reached a tipping point, Pinstrup-Andersen believes.


    “Ten years ago, indoor farming was a pipe dream,” he says. “But right now, because of the efficiency in LED lighting and better management practices, it is very close to being economically competitive with greenhouses and open-field production of vegetables … It just needs a kick in the rear.”

    Covid Crisis

    In April, the pandemic delivered that kick. Just as Shahra was preparing to build the farm—Sekaran left the company earlier this year—Singaporean officials discovered a cluster of covid-19 cases in one of the country’s cramped worker dormitories.

    The scenes that unfolded echoed much of what happened around the rest of the world: instructions to stay home were followed by long supermarket queues, fearful stockpiling, and scattered food shortages. At conventional farms there were reports of people turning up and pulling produce out of the ground. Almost overnight, Singapore’s perilous food supply became one of the most visible consequences of an otherwise invisible crisis.

    Now Shahra had everyone’s undivided attention. “Food security has suddenly become very personal to everyone,” he says. “Last year if I’d gone out and talked about it, [the reaction] was completely different. Now it’s real; it’s here.”

    Teng agrees. “Covid-19 has done a lot more good to create awareness of food security than all the papers I and my colleagues have written in the last few years,” he says ruefully. “It has created so much awareness among Singaporeans that hey, we’re one of the most vulnerable countries in the world.”


    It lit a fire under officials, too. Only two days after introducing a partial lockdown, the government committed to an express grant of S$30 million for projects designed to boost local supplies of eggs, vegetables, and fish. This has helped fund the new VertiVegies facility.

    “There’s conversations going on every day now,” says Shahra. “In the blink of an eye, there’s all this innovation—from 2017, when I first took a look at this and couldn’t have imagined how it was possible, to now, where there’s this huge positive movement.

    “And when so many people are working toward a common agenda, then something good generally happens.”

    See the full article here, subscription required.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

  • richardmitnick 12:33 pm on August 28, 2020 Permalink | Reply
    Tags: "How special relativity can help AI predict the future", Albert Einstein’s Theory of Special Relativity, An AI could be used to simulate how a patient might respond to a certain treatment—for example spooling out how that treatment might run its course step by step., By creating all possible outcomes one can see how a drug will affect a disease., Machine-learning models excel at spotting correlations but are hard pressed to explain why one event should follow another., MIT Technology Review, Nobody knows what will happen in the future but some guesses are a lot better than others., This is a particular concern with AI-powered medical diagnosis.

    From MIT Technology Review: “How special relativity can help AI predict the future” 

    From MIT Technology Review

    August 28, 2020
    Will Douglas Heaven

    Credit: Étienne-Jules Marey.

    Nobody knows what will happen in the future, but some guesses are a lot better than others. A kicked football will not reverse in midair and return to the kicker’s foot. A half-eaten cheeseburger will not become whole again. A broken arm will not heal overnight.

    By drawing on a fundamental description of cause and effect found in Einstein’s theory of special relativity, researchers from Imperial College London have come up with a way to help AIs make better guesses too.

    The world progresses step by step, every instant emerging from those that precede it. We can make good guesses about what happens next because we have strong intuitions about cause and effect, honed by observing how the world works from the moment we are born and processing those observations with brains hardwired by millions of years of evolution.

    Computers, however, find causal reasoning hard. Machine-learning models excel at spotting correlations but are hard pressed to explain why one event should follow another. That’s a problem, because without a sense of cause and effect, predictions can be wildly off. Why shouldn’t a football reverse in flight?

    This is a particular concern with AI-powered diagnosis. Diseases are often correlated with multiple symptoms. For example, people with type 2 diabetes are often overweight and have shortness of breath. But the shortness of breath is not caused by the diabetes, and treating a patient with insulin will not help with that symptom.

    The AI community is realizing how important causal reasoning could be for machine learning and is scrambling to find ways to bolt it on.

    Researchers have tried various ways to help computers predict what might happen next. Existing approaches train a machine-learning model frame by frame to spot patterns in sequences of actions. Show the AI a few frames of a train pulling out of a station and then ask it to generate the next few frames in the sequence, for example.

    AIs can do a good job of predicting a few frames into the future, but the accuracy falls off sharply after five or 10 frames, says Athanasios Vlontzos at Imperial College London. Because the AI uses preceding frames to generate the next one in the sequence, small mistakes made early on—a few glitchy pixels, say—get compounded into larger errors as the sequence progresses.

    Vlontzos and his colleagues wanted to try a different approach. Instead of getting an AI to learn to predict a specific sequence of future frames by watching millions of video clips, they allowed it to generate a whole range of frames that were roughly similar to the preceding ones and then pick those that were most likely to come next. The AI can make guesses about the future without having to learn anything about the progression of time, says Vlontzos.

    To do this, the team developed an algorithm inspired by light cones, a mathematical description of the boundaries of cause and effect in spacetime, which was first proposed in Einstein’s theory of special relativity and later refined by his former professor Hermann Minkowski. Light cones emerge in physics because the speed of light is constant. They show the expanding limits of a ray of light—and everything else—as it emanates from an initial event, such as an explosion.

    Take a sheet of paper and mark an event on it with a dot. Now draw a circle with that event at the center. The distance between the dot and the edge of the circle is the distance light has traveled in a period of time—say, one second. Because nothing, not even information, can travel faster than light, the edge of this circle is a hard boundary on the causal influence of the original event. In principle, anything inside the circle could have been affected by the event; anything outside could not.

    After two seconds, light has traveled twice the distance and the circle’s size has doubled: there are now many more possible futures for that original event. Picture these ever larger circles rising second by second out of the sheet of paper, and you have an upside-down cone with the original event at its tip. This is a light cone. A mirror image of the cone can also extend backwards, behind the sheet of paper; it will contain all possible pasts that could have led to the original event.
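The rule the circles illustrate can be written as a one-line check: an event can influence another only if their spatial separation is no greater than the distance light covers in the intervening time. A minimal sketch in Python (the function name and event format are illustrative, not from the research):

```python
# Speed of light in meters per second.
C = 299_792_458.0

def in_future_light_cone(a, b):
    """True if event b (t, x, y) could be causally influenced by event a."""
    ta, xa, ya = a
    tb, xb, yb = b
    dt = tb - ta
    if dt < 0:  # b happens before a, so it is not in a's future
        return False
    dist = ((xb - xa) ** 2 + (yb - ya) ** 2) ** 0.5
    return dist <= C * dt  # inside the cone iff light could cover the gap

# One second after an event, the cone's radius is about 3e8 meters.
print(in_future_light_cone((0, 0, 0), (1, 1e8, 0)))  # True: inside the cone
print(in_future_light_cone((0, 0, 0), (1, 4e8, 0)))  # False: too far away
```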

    Vlontzos and his colleagues used this concept to constrain the future frames an AI could pick. They tested the idea on two data sets: Moving MNIST, which consists of short video clips of handwritten digits moving around on a screen, and the KTH human action series, which contains clips of people walking or waving their arms. In both cases, they trained the AI to generate frames that looked similar to those in the data set. But importantly the frames in the training data set were not shown in sequence, and the algorithm was not learning how to complete a series.

    They then asked the AI to pick which of the new frames were more likely to follow another. To do this, the AI grouped generated frames by similarity and then used the light-cone algorithm to draw a boundary around those that could be causally related to the given frame. Despite not being trained to continue a sequence, the AI could still make good guesses about which frames came next. If you give the AI a frame in which a short-haired person wearing a shirt is walking, then the AI will reject frames that show a person with long hair or no shirt, says Vlontzos. The work is in the final stages of review at NeurIPS, a major machine-learning conference.
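In this analogy, distance between frame embeddings stands in for spatial separation, and the cone's radius grows with each time step. A rough sketch of that filtering step, using hypothetical embeddings and a hand-set cone speed (none of these names or values come from the paper):

```python
import numpy as np

def plausible_successors(current, candidates, speed, dt=1.0):
    """Keep candidate frames whose embeddings lie within a 'light cone'
    of radius speed * dt around the current frame's embedding."""
    radius = speed * dt
    dists = np.linalg.norm(candidates - current, axis=1)
    return candidates[dists <= radius]

current = np.zeros(2)  # embedding of the given frame (toy 2-D example)
candidates = np.array([[1.0, 0.0],   # distance 1.0 -> kept
                       [0.0, 2.5],   # distance 2.5 -> kept
                       [4.0, 0.0]])  # distance 4.0 -> rejected
kept = plausible_successors(current, candidates, speed=3.0)
print(len(kept))  # 2
```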

    An advantage of the approach is that it should work with different types of machine learning, as long as the model can generate new frames that are similar to those in the training set. It could also be used to improve the accuracy of existing AIs trained on video sequences.

    To test the approach, the team had the cones expand at a fixed rate. But in practice, this rate will vary. A ball on a football field will have more possible future positions than a ball traveling along rails, for example. This means you would need a cone that expanded at a faster rate for the football.

    Working out these speeds involves getting deep into thermodynamics, which isn’t practical. For now, the team plans to set the diameter of the cones by hand. But by watching video of a football game, say, the AI could learn how much and how fast objects moved around, which would enable it to set the diameter of the cone itself. An AI could also learn on the fly, observing how fast a real system changed and adjusting cone size to match it.

    Predicting the future is important for many applications. Autonomous vehicles need to be able to predict whether a child is about to run into the road or whether a wobbling cyclist presents a hazard. Robots that need to interact with physical objects need to be able to predict how those objects will behave when moved around. Predictive systems in general will be more accurate if they can reason about cause and effect rather than just correlation.

    But Vlontzos and his colleagues are particularly interested in medicine. An AI could be used to simulate how a patient might respond to a certain treatment—for example, spooling out how that treatment might run its course, step by step. “By creating all these possible outcomes, you can see how a drug will affect a disease,” says Vlontzos. The approach could also be used with medical images. Given an MRI scan of a brain, an AI could identify the likely ways a disease could progress.

    “It’s very cool to see ideas from fundamental physics being borrowed to do this,” says Ciaran Lee, a researcher at University College London who works on causal inference at Babylon Health, a UK-based digital health-care provider, but wasn’t involved in this research. “A grasp of causality is really important if you want to take actions or decisions in the real world,” he says. It goes to the heart of how things come to be the way they are: “If you ever want to ask the question ‘Why?’ then you need to understand cause and effect.”

    See the full article here .



  • richardmitnick 3:58 pm on July 22, 2020 Permalink | Reply
    Tags: "Why Japan is emerging as NASA’s most important space partner", Artemis, JAXA's proposed lunar rover with Toyota, MIT Technology Review

    From MIT Technology Review: “Why Japan is emerging as NASA’s most important space partner” 

    From MIT Technology Review

    July 22, 2020
    Neel V. Patel

    Japan provides a few major advantages in helping the US get back to the moon. In return, it will get its own chance to set foot on the lunar surface.

    Conceptual art of JAXA’s proposed lunar rover with Toyota. Credit: Toyota/JAXA

    The first time the US went to the moon, it put down an estimated $283 billion to do it alone. That’s not the case with Artemis, the new NASA program to send humans back. Although it’s a US-led initiative, Artemis is meant to be a much more collaborative effort than Apollo. Japan is quickly emerging as one of the most important partners for this program—perhaps the most important.

    Although NASA has teased for quite some time the idea of a pretty ambitious role for Japan in Artemis, that talk finally became real on July 9, when the two countries signed a formal agreement regarding further collaboration in human exploration. It gives NASA a much-needed partner for Artemis—without which the agency would find it much more difficult to meet the long-term goals of establishing a sustainable permanent presence on the moon.

    The US-Japanese space relationship goes back a long time, says John Logsdon, a space policy expert at George Washington University: “Japan has been basically our best international partner over the last 40-plus years.” It may have declined to work on the space shuttle program in the 1970s, but it reversed course in the early 1980s and signed on with the International Space Station program.

    Since then, Japan’s space capabilities have progressed rapidly. The country found a reliable launch vehicle in the H-IIA rocket, built by Mitsubishi, and JAXA, its space agency, has found success in a number of high-profile science missions, like HALCA (the first space-based mission for very long baseline interferometry, in which multiple telescopes are used simultaneously to study astronomical objects), Hayabusa (the first asteroid sample return mission), the lunar probe SELENE, IKAROS (the first successful demonstration of solar-sail technology in interplanetary space), and Hayabusa2 (expected to return to Earth with samples from the asteroid Ryugu in December). Since 1990, 12 Japanese astronauts have been in space.


    JAXA Selene Kaguya lunar probe

    JAXA IKAROS spacecraft

    JAXA/Hayabusa 2 Credit: JAXA/Akihiro Ikeshita

    So the country has a spaceflight pedigree superior to that of most other American allies, and is more than capable of building and deploying the types of spaceflight technologies that could push a lunar exploration program forward (NASA, after all, is working on an Artemis budget that is much slimmer than Apollo’s). In return, Japan gets to participate in a major human exploration program and likely send its own astronauts to the moon via NASA missions, without having to pay for and develop a lunar mission of its own.

    What exactly will Japan do for Artemis? Specific details about the new agreement were not released, but we already know the country is sending a couple of science payloads on Artemis 1 (an uncrewed mission around the moon) and Artemis 2 (crewed, but only a flyby). Back in January, Yoshikazu Shoji, the director of international relations and research at JAXA, told the public that JAXA wanted to help in the development of Gateway, NASA’s upcoming lunar space station that will facilitate deep space exploration. JAXA could contribute to the Habitation and Logistics Outpost (HALO) module, developing life support and power elements, said Shoji. It can also help in delivering cargo, supplies, and parts to Gateway as it’s being built, through its upcoming HTV-X spaceflight vehicle (the successor to the current HTV that supports the ISS).

    For the moon itself, JAXA can provide more data that helps future Artemis missions land more safely. JAXA’s Smart Lander for Investigating Moon (SLIM) mission, slated for 2022, will demonstrate brand-new precision lunar landing technology that could prove very useful later on for both crewed and robotic landers. Japan is also working with Canada and the European Space Agency on Heracles, a robotic transport system that could deliver cargo to the moon or help bring back valuable resources mined there. Heracles is still under development, but it’s aimed at supporting the Artemis program and Gateway in the long run.

    The biggest thing Japan might contribute, however, is a pressurized lunar rover that astronauts could use to cruise around the moon. Last week, Mark Kirasich, acting director of NASA’s Advanced Exploration Systems, unveiled some of NASA’s plans for Artemis, outlining specific proposals for the agency to work with JAXA and its commercial partner, Toyota, to build out this RV-like vehicle for astronauts to use in some of the later lunar missions. Japan’s strong auto industry means the country already has expertise in developing technologies like this, Kirasich said. JAXA and Toyota would like to have this platform ready for launch by 2029.

    Besides helping offset technology costs, having a partner like Japan “is good for the stability of Artemis,” says Logsdon. “International cooperation is popular in Congress, and I think that’s true for most of the public as well.” These agreements mean that funding is more secure, and for a space program that has long-term goals, this is pretty important.

    It also gives the US a trusted ally that can act as a bulwark against another burgeoning space power in the region: China.

    According to Kaitlyn Johnson, an aerospace security expert at the Center for Strategic & International Studies, Japan can provide more regional stability that offsets China’s influence, both in space and in related technology sectors like defense. While the civilian and defense sides of the US space program are almost completely split from one another, that’s not so much the case in countries like Japan. “There’s a lot of technological sharing between agencies within other countries,” she says. It’s likely that work on Artemis will fill some basic knowledge gaps in space defense for Japan too, such as how to identify a stalking satellite.

    The relationship between the two countries in space, says Johnson, is similar to what we see for intelligence sharing among the Five Eyes nations (the US, Australia, Canada, New Zealand, and the UK). “That relationship has extended beyond intelligence into a lot of areas in national security, including space,” she says. “We’re seeing Japan get the similar trusted-ally treatment.”

    Defense benefits aside, space exploration is simply more achievable with partners, and Japan is just a natural fit. “Japan has been at the forefront of technological change for a long time,” says Johnson. “If the world is really serious about exploring space and establishing a presence on other bodies like the moon, I do believe we have to go at those goals together, and share the burdens and resources together.”

    See the full article here .



  • richardmitnick 2:59 pm on January 29, 2020 Permalink | Reply
    Tags: "This is the highest-resolution photo of the sun ever taken", DKIST-Daniel K. Inouye Solar Telescope, MIT Technology Review

    From MIT Technology Review: “This is the highest-resolution photo of the sun ever taken” 

    From MIT Technology Review

    One of the first images of the surface of the sun, taken by the Daniel K. Inouye Solar Telescope

    Daniel K. Inouye Solar Telescope, DKIST, atop the Haleakala volcano on the Pacific island of Maui, Hawaii, USA, at an altitude of 3,084 m (10,118 ft).

    Astronomers turned on the Daniel K. Inouye Solar Telescope in Maui and managed to snap this incredible view of the sun. And there’s more to come.

    Jan 29, 2020
    Neel V. Patel

    Astronomers have just released the highest-resolution image of the sun. Taken by the Daniel K. Inouye Solar Telescope in Maui, it gives us an unprecedented view of our nearest star and brings us closer to solving several long-standing mysteries.

    The new image demonstrates the telescope’s potential power. It shows off a surface that’s divided up into discrete, Texas-size cells, like cracked sections in the desert soil. You can see plasma oozing off the surface, rising into the air before sinking back into darker lanes.

    “We have now seen the smallest details on the largest object in our solar system,” says Thomas Rimmele, the director of DKIST. The new image was taken December 10, when the telescope achieved first light.

    When formal observations begin in July, DKIST, with its 13-foot mirror, will be the most powerful solar telescope in the world. Located on Haleakalā (the tallest summit on Maui), the telescope will be able to observe structures on the surface of the sun as small as 18.5 miles (30 kilometers). This resolution is over five times better than that of DKIST’s predecessor, the Richard B. Dunn Solar Telescope in New Mexico.
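That 30-kilometer figure is roughly what diffraction permits for a 4-meter mirror. A back-of-the-envelope check, assuming visible light at 500 nm (the actual limit depends on the observing wavelength):

```python
wavelength = 500e-9      # m: visible light (assumed)
mirror_diameter = 4.0    # m: DKIST's primary mirror
sun_distance = 1.496e11  # m: one astronomical unit

theta = 1.22 * wavelength / mirror_diameter  # Rayleigh criterion, radians
resolution_km = theta * sun_distance / 1000  # smallest resolvable feature
print(round(resolution_km))  # about 23 km, the same order as the quoted 30 km
```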

    DKIST was specifically designed to make precise measurements of the sun’s magnetic field throughout the corona (the outermost region of its atmosphere) and answer questions like why the corona is millions of degrees hotter than the sun’s surface.

    Each of the “cells” on the surface of the sun is roughly the size of Texas. The resolution of DKIST is about 18.5 miles.

    Other instruments coming online in the next six months will also collect data on temperature, velocity, and solar structures. A new solar cycle is about to begin, which means there will be a wealth of solar activity to spot.

    To observe the sun, you can’t just build a telescope the old-fashioned way. DKIST boasts one of the world’s most complex solar-adaptive-optics systems, which uses deformable mirrors to offset distortions caused by Earth’s atmosphere; the shape of the mirror adjusts 2,000 times per second. Staring at the sun also makes the telescope hot enough to melt metal. To cool it down, the DKIST team uses a swimming pool’s worth of ice and 7.5 miles of piping to distribute coolant.

    There’s a good reason why we need to take a closer look at the sun. When the solar atmosphere releases its magnetic energy, it results in explosive phenomena like solar flares that hurl ultra-energized particles through the solar system in all directions, including ours. This “space weather” can wreak havoc on things like GPS and electrical grids.


    Learning more about solar activity could give us more notice of when hazardous space weather is due to hit.

    The telescope’s history is not without controversy. Haleakalā is important to the culture of Native Hawaiians, who protested the construction of DKIST in the summer of 2015. The DKIST team addressed those concerns in various ways, such as launching a $20 million program at Maui College to teach science in conjunction with Hawaiian culture, and reserving 2% of telescope time for Native Hawaiians.

    The plan is to keep DKIST operational for at least four solar cycles, or around 44 years. “We’re now in the final sprint of what has been a very long marathon,” says Rimmele. “These first images are really just the very beginning.”

    See the full article here .



  • richardmitnick 5:42 pm on January 8, 2020 Permalink | Reply
    Tags: "NASA’s new exoplanet hunter found its first potentially habitable world", MIT Technology Review, Planet TOI 700 d

    From MIT Technology Review: “NASA’s new exoplanet hunter found its first potentially habitable world” 

    MIT Technology Review
    From MIT Technology Review

    Neel V. Patel

    TOI 700 d

    NASA’s Transiting Exoplanet Survey Satellite (TESS) has just found a new potentially habitable exoplanet the size of Earth, located about 100 light-years away. It’s the first potentially habitable exoplanet the telescope has found since it was launched in April 2018.

    NASA/MIT TESS replaced Kepler in search for exoplanets

    It’s called TOI 700 d [science paper: https://arxiv.org/abs/2001.00952]. It orbits a red dwarf star with about 40% of the sun’s mass and size and roughly half its surface temperature. The planet itself is about 1.2 times the size of Earth and orbits its host star every 37 days, receiving close to 86% of the amount of sunlight Earth does.

    Most notably, TOI 700 d is in what’s thought to be its star’s habitable zone, meaning it’s at a distance where temperatures ought to be moderate enough to support liquid water on the surface. This raises hopes TOI 700 d could be amenable to life—even though no one can agree on what it means for a planet to be habitable.
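That 86% insolation figure can be turned into a rough temperature estimate. Assuming an Earth-like reflectivity of 0.3 (an assumption; the planet's actual albedo is unknown), the standard zero-greenhouse equilibrium-temperature formula gives a value not far below Earth's:

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S_EARTH = 1361.0   # solar constant at Earth, W/m^2
albedo = 0.3       # assumed Earth-like Bond albedo

S = 0.86 * S_EARTH                               # 86% of Earth's insolation
t_eq = (S * (1 - albedo) / (4 * SIGMA)) ** 0.25  # zero-greenhouse estimate
print(round(t_eq))  # about 245 K; the same formula gives ~255 K for Earth
```

Real surface temperatures depend on whether the planet has an atmosphere at all, which is exactly what the follow-up observations are meant to determine.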

    A set of 20 different simulations meant to model TOI 700 d suggest the planet is rocky and has an atmosphere that helps it retain water, but there’s a chance it might simply be a gaseous mini-Neptune. We won’t know for sure until follow-up observations are made with some sharper instruments, such as the upcoming James Webb Space Telescope, which is planned for launch in March 2021.

    NASA/ESA/CSA Webb Telescope annotated

    TESS finds exoplanets using the tried-and-true technique of looking for objects as they’re transiting in front of their host stars.

    Planet transit. NASA/Ames
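The dip a transit produces is simply the square of the planet-to-star radius ratio, which is why small planets are easier to spot around small stars. A quick illustration with round numbers (the 0.4-solar-radius star here is an assumed illustrative value, not a measured property of TOI 700):

```python
R_EARTH = 6.371e6  # m
R_SUN = 6.957e8    # m

def transit_depth(r_planet, r_star):
    """Fractional dip in starlight while the planet crosses the star's disk."""
    return (r_planet / r_star) ** 2

# A 1.2-Earth-radius planet crossing an assumed 0.4-solar-radius red dwarf.
depth = transit_depth(1.2 * R_EARTH, 0.4 * R_SUN)
print(f"{depth:.2%}")  # a dip of under a tenth of a percent
```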

    Data from NASA’s Spitzer Space Telescope was also used to get some closer measurements of the planet’s size and orbit.

    NASA/Spitzer Infrared Telescope

    TESS is NASA’s newest exoplanet-hunting space telescope, the successor to the renowned Kepler Space Telescope, which was used to find some 2,600 exoplanets.

    NASA/Kepler Telescope, and K2 March 7, 2009 until November 15, 2018

    TESS, able to survey 85% of the night sky (400 times more than Kepler could monitor), is about to finish its primary two-year mission but has fallen woefully short of expectations. NASA initially thought TESS would find more than 20,000 transiting exoplanets, but with only months left it has identified just 1,588 candidates. Even so, the telescope’s mission will almost surely be extended.

    See the full article here .



  • richardmitnick 12:31 pm on December 20, 2019 Permalink | Reply
    Tags: "These are the stars the Pioneer and Voyager spacecraft will encounter", MIT Technology Review, NASA Pioneer 10 and 11

    From MIT Technology Review: “These are the stars the Pioneer and Voyager spacecraft will encounter” 

    MIT Technology Review
    From MIT Technology Review

    Dec 20, 2019
    Emerging Technology from the arXiv

    As four NASA spacecraft exit our solar system, a 3D map [below] of the Milky Way reveals which others they’re likely to visit tens of thousands of years on.

    Laniakea supercluster of galaxies. From Nature: “The Laniakea supercluster of galaxies,” R. Brent Tully, Hélène Courtois, Yehuda Hoffman & Daniel Pomarède, http://www.nature.com/nature/journal/v513/n7516/full/nature13674.html. The Milky Way is the red dot.

    Milky Way. Credit: NASA/JPL-Caltech/ESO/R. Hurt. The bar is visible in this image.

    NASA Pioneer 10

    NASA Pioneer 11

    NASA/Voyager 1

    NASA/Voyager 2

    During the 1970s, NASA launched four of the most important spacecraft ever built. When Pioneer 10 began its journey to Jupiter, astronomers did not even know whether it was possible to pass through the asteroid belt unharmed.

    The inner solar system, from the sun to Jupiter, including the asteroid belt (the white donut-shaped cloud), the Hildas (the orange “triangle” just inside the orbit of Jupiter), the Jupiter trojans (green), and the near-Earth asteroids. The group that leads Jupiter is called the “Greeks” and the trailing group the “Trojans” (Murray and Dermott, Solar System Dynamics, p. 107). Credit: Mdf at English Wikipedia.

    Only after it emerged safe was Pioneer 11 sent on its way.

    Both sent back the first close-up pictures of Jupiter, with Pioneer 11 continuing to Saturn. Voyager 1 and 2 later took even more detailed measurements, and extended the exploration of the solar system to Uranus and Neptune.

    All four of these spacecraft are now on their way out of the solar system, heading into interstellar space at a rate of about 10 kilometers per second. They will travel about a parsec (3.26 light-years) every 100,000 years, and that raises an important question: What stars will they encounter next?

    This is harder to answer than it seems. Stars are not stationary but moving rapidly through interstellar space. Without knowing their precise velocity, it’s impossible to say which ones our interstellar travelers are on course to meet.

    Enter Coryn Bailer-Jones at the Max Planck Institute for Astronomy in Germany and Davide Farnocchia at the Jet Propulsion Laboratory in Pasadena, California. These guys have performed this calculation using a new 3D map of star positions and velocities throughout the Milky Way.

    Max Planck Institute for Astronomy

    Max Planck Institute for Astronomy campus, Heidelberg, Baden-Württemberg, Germany


    NASA JPL-Caltech Campus

    This has allowed them to work out for the first time which stars the spacecraft will rendezvous with in the coming millennia. “The closest encounters for all spacecraft take place at separations between 0.2 and 0.5 parsecs within the next million years,” they say.

    Their results were made possible by the observations of a space telescope called Gaia.

    ESA/GAIA satellite

    Since 2014, Gaia has sat some 1.5 million kilometers from Earth, recording the positions of 1 billion stars, planets, comets, asteroids, quasars, and so on. At the same time, it has been measuring the velocities of the brightest 150 million of these objects.

    The result is a three-dimensional map of the Milky Way and the way astronomical objects within it are moving. It is the latest incarnation of this map, Gaia Data Release 2 or GDR2, that Bailer-Jones and Farnocchia have used for their calculations.


    The map makes it possible to project the future positions of stars in our neighborhood and to compare them with the future positions of the Pioneer and Voyager spacecraft, calculated using their last known positions and velocities.

    This information yields a list of stars that the spacecraft will encounter in the coming millennia. Bailer-Jones and Farnocchia define a close encounter as flying within 0.2 or 0.3 parsecs.
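If both spacecraft and star are treated as moving in straight lines (a simplification; the published calculation also accounts for the galaxy's gravity), the closest approach follows from elementary kinematics: minimize |Δr + Δv·t| over time t. A sketch with made-up numbers:

```python
import numpy as np

def closest_approach(r1, v1, r2, v2):
    """Time and distance of closest approach for two objects moving in
    straight lines (positions in parsecs, velocities in pc per Myr)."""
    dr = np.asarray(r2, float) - np.asarray(r1, float)
    dv = np.asarray(v2, float) - np.asarray(v1, float)
    t = -np.dot(dr, dv) / np.dot(dv, dv)  # minimizes |dr + dv * t|
    return t, float(np.linalg.norm(dr + dv * t))

# A spacecraft drifting along x, a star sweeping toward it (made-up numbers).
t, d = closest_approach([0, 0, 0], [0.01, 0, 0],
                        [1.0, 0.5, 0], [-0.01, 0, 0])
print(t, d)  # closest approach of 0.5 pc, 50 Myr from now
```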

    The first spacecraft to encounter another star will be Pioneer 10 in 90,000 years. It will approach the orange-red star HIP 117795 in the constellation of Cassiopeia at a distance of 0.231 parsecs. Then, in 303,000 years, Voyager 1 will pass a star called TYC 3135-52-1 at a distance of 0.3 parsecs. And in 900,000 years, Pioneer 11 will pass a star called TYC 992-192-1 at a distance of 0.245 parsecs.

    These fly-bys are all at a distance of less than one light-year and in some cases might even graze the orbits of the stars’ most distant comets.

    Voyager 2 is destined for a more lonely future. According to the team’s calculations, it will never come within 0.3 parsecs of another star in the next 5 million years, although it is predicted to come within 0.6 parsecs of a star called Ross 248 in the constellation Andromeda in 42,000 years.

    Andromeda Galaxy, Messier 31, with its satellite galaxy Messier 32. Copyright: Terry Hancock.

    “Milkdromeda”: Andromeda (on the left) in Earth’s night sky 3.75 billion years from now. Credit: NASA.

    These interstellar explorers will eventually collide with or be captured by other stars. It’s not possible yet to say which ones these will be, but Bailer-Jones and Farnocchia have an idea of the time involved. “The timescale for the collision of a spacecraft with a star is of order 10^20 years, so the spacecraft have a long future ahead of them,” they conclude.

    The Pioneer and Voyager spacecraft will soon be joined by another interstellar traveler. The New Horizons spacecraft that flew past Pluto in 2015 is heading out of the solar system but may yet execute a maneuver so that it intercepts a Kuiper Belt object on its way.

    NASA/New Horizons spacecraft

    Kuiper Belt. Minor Planet Center

    After that last course correction takes place, Bailer-Jones and Farnocchia will be able to work out its final destination.

    Ref: arxiv.org/abs/1912.03503 : Future stellar flybys of the Voyager and Pioneer spacecraft

    See the full article here .


