Tagged: Scientific American

• richardmitnick 9:44 pm on September 13, 2022
    Tags: "Physicists Struggle to Unite around Future Plans", , , , , Scientific American   

    From “Scientific American” : “Physicists Struggle to Unite around Future Plans” 

    From “Scientific American”

    9.8.22
    Daniel Garisto

    After a year of seemingly endless Zoom meetings, Slack chats and e-mails, nearly 800 particle physicists descended on the University of Washington to share their scientific dreams and nightmares in person. For 10 days at the end of July, whether masked inside conference rooms or sipping coffee beneath unusually sunny Seattle skies, they attempted to build a unified vision of their field’s future.

    The story of 20th century particle physics is chronicled in the pantheon of elementary particles dubbed the Standard Model: quarks bound tight by gluons to make atomic nuclei; negatively charged electrons and their heavier counterparts, muons and taus; photons, the particles of light; heavy W and Z bosons, with their subtle influence; and evasive, lightweight neutrinos.

Particles in the Standard Model are divided into fermions, the building blocks of matter, and bosons, the carriers of the forces that organize that matter. Perhaps ironically, searching at the smallest of scales has required experiments of increasing complexity and size. To find new particles, physicists have sifted for needles in haystacks of data produced by slamming known types of particles together at higher and higher energies. In 2012 the discovery of the Higgs boson at the Large Hadron Collider (LHC) at CERN near Geneva was accomplished by more than 5,000 scientists analyzing petabytes of data from detectors weighing thousands of tons at the biggest machine in the world.

    Yet the triumph of the Higgs discovery—arguably the crowning achievement of the Standard Model—has been shadowed by worries that particle physicists are now stuck in a “nightmare scenario” with no clear path forward. Physicists have long believed the Standard Model’s pantheon should be bigger to account for phenomena such as dark matter and gravity. Many theories proposed these new particles would be within the LHC’s reach, but so far searches have come up empty—a nightmare for particle physicists.

    While the phrase “nightmare scenario” often causes physicists to bristle and scoff, at the conference, a panel discussion simply entitled “Where Are We Going?” faced the question head-on. Tao Han, a theorist at the University of Pittsburgh, argued that the lack of new particles was actually a success of falsification—gaining knowledge by proving what isn’t rather than what is. “The nightmare scenario is not a failure,” he declared. Other panelists were less keen on that reframing, insisting particle physics was not in a nightmare scenario at all or that the nightmare was here but short-lived.

    Some of this dissonance and discord is intentional. Roughly once a decade, hundreds of particle physicists participate in the Snowmass process (named for Snowmass, Colo., where initial meetings took place) to decide what to ask nature and which tools they need for answers.

    The preceding Snowmass in 2013 led to the identification of a few high-priority tasks, including characterizing properties of the newly discovered Higgs boson, measuring neutrino masses and determining the true nature of dark matter and dark energy.

    The puzzles remain unsolved—a disconcerting lack of progress consistent with a nightmare scenario—but much of the field’s terrain has shifted for the better. New computational methods are allowing experiments to cut through noise and find signals previously assumed to be inaccessible. Possibilities for next-generation facilities such as a muon collider have invigorated the community.

    The search for dark matter, once constrained to a small number of candidate particles and types of detectors, has blossomed to encompass a wider range of possibilities.

    A popular fantasy of science is that these puzzles will be solved by a lone thinker isolated in a lab, or scribbling on a chalkboard. Today, science is a communal endeavor, and the work of a career scientist is not always so different from the work of a politician or businessperson. At a plenary to kick off the conference, Hitoshi Murayama, a widely admired theorist at the University of California-Berkeley, gave a talk where he emphasized that particle physicists needed to do more than argue for their own projects. “We need to make a case for the entire field,” he said.

    Getting particle physicists to agree on a unified vision is, in their jargon, “nontrivial.” Each subfield believes in its preeminence: neutrino researchers place their work first, while dark matter experts maintain that their search is more important. The debates are an essential part of a process that particle physicists know must end with common ground. On the first day of the latest Snowmass, Department of Energy representative Harriet Kung delivered a familiar warning: “Bickering scientists,” she intoned with a pause, “get nothing.”

    Getting with the Program

    U.S. particle physics subsists on a little more than $1 billion per year, primarily from the National Science Foundation and the DOE’s Office of Science.

    Two projects draw the lion’s share of funds and attention: the LHC and the Deep Underground Neutrino Experiment (DUNE), which is under construction.

    Although the LHC is a pan-European project at CERN, roughly 30 percent of researchers working on LHC experiments are at U.S. institutions [which is why this blog was started – to bring attention to the still important contribution of U.S. scientists and laboratories to High Energy Physics]. DUNE’s 1,400 collaborators are also geographically diverse. Neither project is viable without international support, as Fermilab director Lia Merminga forcefully reminded the community at Snowmass: “Particle physics is global!”

But perhaps no project looms as large as the one that was canceled. In 1993, after $2 billion had been spent and miles of tunnel dug under Waxahachie, Tex., near Dallas, the Superconducting Supercollider (SSC) was scrapped by Congress.

    __________________________________________________________
[Earlier than the LHC at CERN, the DOE's Fermi National Accelerator Laboratory had sought the Higgs with the Tevatron accelerator.

But the Tevatron could barely muster 2 TeV [teraelectronvolts], not enough energy to find the Higgs. CERN's LHC is capable of 13 TeV.

Another possible attempt in the U.S. would have been the Superconducting Supercollider.

Fermilab has gone on to become a world powerhouse in neutrino research with the LBNF/DUNE project [above], which will send neutrinos 800 miles to SURF, the Sanford Underground Research Facility in Lead, South Dakota.]
    __________________________________________________________

    Had it been completed, the SSC would have been the most powerful particle accelerator in the world. Its demise was a heavy blow for particle physics around the world, but the impacts on American physics verged on catastrophic. Suddenly, U.S.-based researchers found themselves without a collider to call home. Some migrated to other projects, while many simply left the field entirely.

Concerns at Snowmass swirled around DUNE, which some worried could go the way of the SSC because its price has swelled from $1.8 billion to $3.1 billion. One of DUNE's main goals is to measure charge-parity (CP) violation—essentially how much nature prefers producing neutrinos over their antimatter twins, antineutrinos.

    Hyper-Kamiokande (Hyper-K), a Japanese neutrino detector scheduled to begin operations in 2028, could also make such measurements.

    In postsession questions, critics prodded: Would DUNE be redundant? Supporters rebutted them, noting that DUNE has far better sensitivity to CP violation than Hyper-K—if, as some critics noted, it receives an upgrade costing an additional $900 million.

    Nevertheless, when Merminga also announced that DUNE had cleared the latest round of DOE reviews, she received relieved applause. Mindful of the SSC, even scientists uneasy about DUNE’s scientific goals want it to succeed because its failure would negatively impact the whole community. As more than one researcher put it at Snowmass, “If DUNE’s dead, we’re dead.”

    Concerns aren’t just limited to DUNE—research funding has dwindled over the past decade.* Now, with U.S. industrial policy on the rise, some particle physicists are hopeful they will see a slice of the pie. The $280-billion CHIPS and Science Act recently signed into law by President Joe Biden includes provisions seeking to boost quantum technology, which is key to some new dark matter experiments, as well as more funding for machine learning (ML), which is driving novel searches for particles at the LHC.

    “ML is powerful because our discoveries about fundamental particles must be statistical,” said Daniel Whiteson, an experimental particle physicist at the University of California-Irvine, at Snowmass. There, he and others raised the idea of a “data physicist”—a new breed of researcher using data in novel ways. One radical example came from David Shih, a theorist at Rutgers University. “Here’s a crazy idea,” Shih said cheerfully during his remote presentation via Zoom. “We could replace the LHC with a generative model.” Just as powerful models have demonstrated an ability to produce compelling images or write prose, one could produce collisions explorable by physicists. More provocative than serious, the idea set off both laughter and concerned murmurs.
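To make Shih's quip concrete, here is a minimal toy sketch of what "replacing the collider with a generative model" could mean. It is purely illustrative (nothing like this was proposed at Snowmass, and all numbers are invented): fit a density model to simulated collision observables, here fake invariant masses with a bump near 125 GeV, then sample synthetic "events" from the learned model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "training data" standing in for real collisions: a smooth
# falling background plus a narrow resonance near 125 GeV.
background = rng.exponential(scale=50.0, size=9000) + 20.0
signal = rng.normal(loc=125.0, scale=2.0, size=1000)
training_masses = np.concatenate([background, signal])

# A deliberately crude "generative model": a histogram estimate
# of the probability density of the training masses.
counts, edges = np.histogram(training_masses, bins=200, density=True)
probs = counts * np.diff(edges)
probs /= probs.sum()

def sample_events(n):
    """Sample synthetic 'collision' masses from the learned density."""
    bins = rng.choice(len(probs), size=n, p=probs)
    return rng.uniform(edges[bins], edges[bins + 1])

synthetic = sample_events(5000)
print(f"Mean synthetic mass: {synthetic.mean():.1f} GeV")
```

A serious attempt would use a far more expressive model, such as a normalizing flow or diffusion model trained on full detector readouts; the catch, and the source of the laughter in the room, is that a generative model can only echo physics already present in its training data.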

    While DUNE’s failure—or a more general lack of new funds—would be bad news for the field, it’s already the case that about two thirds of particle physicists, who spend years working toward a Ph.D. or in a postdoctoral position, are forced to leave research because there simply aren’t enough jobs. The “pipeline” is rich with aspiring researchers—and impoverished of suitable positions for them to hold.

    To reduce the stigma of leaving particle physics, Snowmass conveners held a mixer where more than a dozen former physicists now at local companies, from small tech studios to Microsoft and Google, advertised paths away from academia. But for early-career researchers in search of a job to actually do physics, the prospects were few and far between.

Like most sciences in the U.S., physics suffers from a lack of diversity: among those whose race and gender were known, nearly 70 percent of physics Ph.D.s awarded between 2014 and 2019 went to white men. But not everyone can agree on community efforts to address diversity, equity and inclusion. “The younger generation isn’t really interested in that discussion about whether there’s a trade-off between excellence and diversity,” Fermilab research scientist Bryan Ramson tells Scientific American. “I think physics as a whole would be much better off if you assume that everybody’s good enough.”

    Dreams and Nightmares

    One of the great shared dreams of particle physicists is to double their particulate pantheon so that each boson has a fermion counterpart and each fermion has a boson twin. This is the core concept of supersymmetry (SUSY), a set of theories that have profoundly shaped successive generations of today’s researchers. For example, under SUSY’s rules, photons would be mirrored by “photinos” and electrons mirrored by “selectrons.”

Appealingly, a symmetry between force-carrying bosons and fermionic particles of matter could tame the uncontrolled Higgs mass (which the Standard Model otherwise predicts should be astronomically larger), and the lightest superpartner could even serve as dark matter.
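Schematically (a textbook sketch, not from the article), the stabilization works like this: quantum corrections drag the squared Higgs mass up toward the cutoff scale $\Lambda$ at which new physics enters, but with supersymmetry every fermion loop is offset by a superpartner boson loop of opposite sign,

$$ \delta m_H^2 \;\sim\; -\frac{\lambda_f^2}{8\pi^2}\,\Lambda^2 \;+\; \frac{\lambda_f^2}{8\pi^2}\,\Lambda^2 \;=\; 0, $$

a cancellation that protects the observed 125-GeV Higgs only if the superpartners are not much heavier than the particles they mirror. That is why their continued absence at the LHC is so uncomfortable.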

    Not only is there as yet no evidence for supersymmetry, but the LHC experiments ATLAS and CMS have successfully ruled out the most likely places its particles could have been hiding.

Despite this, SUSY holds pride of place among theories. And at Snowmass, many particle physicists—particularly those of an older vintage—still spoke of it in the present tense as an old friend.

    Theories are hard to kill, and SUSY is not dead, but many younger researchers are beginning to move on. Inspired by a new vista of possibilities, they are looking for dark matter anywhere they can find it, not just for the weakly interacting massive particles (WIMPs) predicted by SUSY.

    __________________________________
    Dark Matter Background
Fritz Zwicky discovered Dark Matter in the 1930s while observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel, did most of the work on Dark Matter some 30 years later.

    Fritz Zwicky.
    Coma cluster via NASA/ESA Hubble, the original example of Dark Matter discovered during observations by Fritz Zwicky and confirmed 30 years later by Vera Rubin.

In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate that it had a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.

Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that the outskirts of galaxies rotate just as fast as their centers, whereas, if the visible stars and gas were all there is, the outer regions should orbit more slowly, much as the outer planets of the solar system move more slowly than the inner ones. That is what logic dictates we should see in galaxies too. But we do not. Instead, a galaxy rotates more like a vinyl LP on a record deck, every region keeping pace with the rest. The only way to explain this is if the whole visible galaxy is merely the center of some much larger structure, as if it were only the label on the LP, so to speak, causing the galaxy to have a consistent rotation speed from center to edge.
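The underlying argument can be sketched in one line (standard textbook reasoning, added here for clarity). A star on a circular orbit of radius $r$ around enclosed mass $M(<r)$ obeys

$$ \frac{v^2}{r} = \frac{G\,M(<r)}{r^2} \quad\Longrightarrow\quad v(r) = \sqrt{\frac{G\,M(<r)}{r}}. $$

If essentially all of a galaxy's mass sat in its luminous center, $M(<r)$ would be constant at large $r$ and $v$ would fall off as $1/\sqrt{r}$. Rubin's flat rotation curves, $v(r) \approx \text{constant}$, instead require $M(<r) \propto r$: mass that keeps growing with radius even where almost no light is seen.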

    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
Astronomer Vera Rubin, who worked on Dark Matter, at the Lowell Observatory in 1965. Credit: The Carnegie Institution for Science.

    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970.

Vera Rubin, who worked on Dark Matter, measuring spectra. Credit: Emilio Segrè Visual Archives, AIP, SPL.
    Dark Matter Research

    Super Cryogenic Dark Matter Search from DOE’s SLAC National Accelerator Laboratory at Stanford University at SNOLAB (Vale Inco Mine, Sudbury, Canada).

    LBNL LZ Dark Matter Experiment xenon detector at Sanford Underground Research Facility Credit: Matt Kapust.

DAMA at Gran Sasso uses sodium iodide housed in copper to hunt for dark matter. LNGS-INFN.

    Yale HAYSTAC axion dark matter experiment at Yale’s Wright Lab.

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB (CA) deep in Sudbury’s Creighton Mine.

The LBNL LZ Dark Matter Experiment at SURF, Lead, SD.

    DAMA-LIBRA Dark Matter experiment at the Italian National Institute for Nuclear Physics’ (INFN’s) Gran Sasso National Laboratories (LNGS) located in the Abruzzo region of central Italy.

    DARWIN Dark Matter experiment. A design study for a next-generation, multi-ton dark matter detector in Europe at The University of Zurich [Universität Zürich](CH).

    PandaX II Dark Matter experiment at Jin-ping Underground Laboratory (CJPL) in Sichuan, China.

Inside the Axion Dark Matter eXperiment (ADMX) at the University of Washington. Credit: Mark Stone, University of Washington.


    __________________________________

    They’re also trying not to throw the baby out with the bathwater. At an early-morning session, Nathaniel Craig, a theorist at the University of California-Santa Barbara, made the case that, irrespective of SUSY, the principle of naturalness should be salvaged.

    Reductively put, naturalness is the idea that the universe should not be absurdly lucky. Over coffee, Craig gave an analogy: Suppose every pencil could be easily balanced on its tip. Should we expect the universe to be this lucky, or should we look for some hidden phenomenon that is secretly stabilizing the pencils?

    While critics have sometimes derided naturalness as a mere aesthetic preference, Craig pointed to its historical success—naturalness stems from the theoretical physicist Victor Weisskopf’s 1939 work showing how the positron stabilizes the electron, and in 1974, led theorists Ben Lee and Mary Gaillard to predict the charm quark’s mass. “Naturalness is not a theory but a strategy to help us focus in the infinite places we could look,” Craig explained. Instead of abandoning naturalness because of SUSY’s dim prospects, he argued, physicists should consider nearly two dozen other theories inspired by naturalness.

    Theorists aren’t the only ones moving on from SUSY. XENONnT [above] and LUX-ZEPLIN (LZ) [above]—two experiments using giant containers of liquid xenon to spot dark matter—recently reported results that, while null, still set impressive new limits on the plausible properties of WIMPs. Yet those results occupied only a small portion of the conversation at Snowmass. Freed from the need to fulfill SUSY, which predicts dark matter in a relatively narrow mass range, researchers are now looking for various candidate dark matter particles with masses ranging across some 30 orders of magnitude—about the difference between the mass of an ant and that of our sun. They are also figuring out ways to penetrate the once foreboding “neutrino floor,” the level at which noise from cosmic neutrinos would drown out any dark matter signal. The new approach is embodied by a motto workshopped at the conference: “Delve deep, search wide.”

    Physicists working with colliders are also trying new methods. During the LHC’s third run, which is now underway, both ATLAS and CMS will be looking for long-lived particles. At Snowmass, researchers discussed how best to search for such particles, which can putter around before decaying, leading to unusual-looking events that might have been overlooked in the past.

    Physicists are also reassessing particle flavor, a quantum property that defines the species of fermion: up quark, down quark, electron, muon, and so on. Flavor has often been taken for granted, but anomalies that indicate flavor-based behavioral differences between electrons and muons have reawakened interest in the subject. “Flavor is something that no one knows the answer to,” said Patrick Meade, a theorist at Stony Brook University-SUNY, at Snowmass. If “any theorist tells you they know what the right model of flavor is, they’re lying to you.” As in so many other cases, physicists may simply have to wait for more data. If experiments such as Belle II confirm the flavor anomalies seen in the LHCb and Muon g-2 [above] experiments, flavor could become a top-priority unknown.

    Visions from the Frontier

    If you wanted to conduct a quick, crude version of Snowmass, you might ask, “Which particle is the best to study?” Physicists disagree emphatically—some would choose mysterious neutrinos; others might point to whatever unknown particles constitute dark matter or even to better-known particles such as muons or bottom quarks, for their rare decays.

    Among these choices, it is the drive to study the Higgs that may most shape the field. Though ATLAS and CMS have precisely measured its mass to one part per 1,000, much is still unknown about the Higgs. How it couples to lighter particles—if at all—remains unclear. Through an upgrade later this decade, the LHC will accumulate over 20 times more data than it has collected so far, allowing it to make more precise measurements of the Higgs. The particle is also fertile ground for new physics, and models with multiple types of Higgs—or where the Higgs interacts with dark matter particles—are easy to create. But regardless of what researchers learn about the particle, the effort to study it will shape the field.

    Particle physicists are hungry for a new collider. They are, by and large, tired of smashing protons—essentially messy bundles of quarks—and would much prefer the more tidy collisions of electrons and positrons. With cleaner collisions, they could create a factory churning out Higgs bosons to subject to further, more intense scrutiny. The nearest-term possibility for such a Higgs factory is the International Linear Collider, which would be built in Japan.

    Though it is shovel-ready, the project has been delayed for years, and in February it was dealt another, possibly fatal blow when the Japanese government refused to allow it to go forward.

Then there is the Future Circular Collider (FCC), a proposed ring roughly 90 kilometers in circumference that would lie beneath a wide swath of French and Swiss countryside.

    According to CERN director general Fabiola Gianotti, the FCC would probably begin operations circa 2050. Meanwhile, accelerator scientists in the U.S. are eager to host the next collider. In a white paper released late last October, a team of researchers introduced a new “cold copper” technology that could accelerate particles more rapidly without liquid helium cryogenics, allowing for a smaller, cheaper and more feasible collider.

    But many researchers are unhappy with the idea of waiting 20 years or more for a mere Higgs factory. They want to explore high energies far out of the LHC’s reach and with unprecedented precision. Over the past two years, the idea of a muon collider has spread throughout the particle physics community. In the past, the Muon Accelerator Program drew little attention from theoretical physicists, few of whom mourned its demise. Experimentally, little has changed about a muon collider, which faces daunting technical obstacles. Socially, the community is invigorated—especially younger researchers, many of whom sported stylish muon collider T-shirts at Snowmass (a propaganda feat that was later mimicked by cold-copper-collider proponents who handed out chic buttons).

Ironically, where physicists’ ambitions are greatest is where Snowmass struggles the most as a format. In theory, it is Snowmass’s goal to outline a scientific vision without setting priorities, which is the job of the Particle Physics Project Prioritization Panel (P5). But a scientific vision cannot exist in a priority-free vacuum unless it impractically ignores all resources and constraints. This tortured logic meant that at the latest Snowmass, particle physicists could point to the promise of investigating Higgs parameters with a muon collider but not actually endorse a muon collider over any alternative.

As Snowmass ended, a coherent vision was not immediately clear. The task of distilling 10 days of discussion and 500 white papers now falls to P5 and its newly announced chair, Hitoshi Murayama.

During Snowmass, discussing the role of theorists in words that might equally apply to his new role, Murayama said, “I hope we can provide guidance,” and then added puckishly, “although it’s sometimes misguidance.”

    *Editor’s Note (9/9/22): This sentence was edited after posting to correct the description of the funding that has dwindled over the past decade.

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
• richardmitnick 8:04 pm on August 24, 2022
    Tags: "See Iceland Aglow in Volcanic Eruptions", A swarm of earthquakes in late July and early August rocked the area., A vivid look at Iceland’s recent resurgence of volcanic eruptions—and why the country could be in for 300 years of renewed volcanic activity., , As the mid-ocean ridge spreads Reykjanes cycles through quiet periods typically lasting 800 to 1000 years followed by two or three centuries of spectacular eruptions., , , , , Iceland straddles the boundary between two of the earth’s tectonic plates: The North American and Eurasian plates are pulling away from each other at a rate of one to two inches per year., , Iceland-like Hawaii-is perched above a “hotspot” a column of hot rock that rises through the mantle driven by its own buoyancy., Now it seems the peninsula is truly waking up., Scientific American, Such striking volcanic displays are relatively common in Iceland., The area is located along a kink in the mid-ocean ridge and the cracks form as a result of the two plates moving apart at an odd angle., The Fagradalsfjall volcano into the valley is just 20 miles from Iceland’s capital of Reykjavk., The kind of volcanic eruptions that take place in this area [Reykjanes] are not originating from the typical cone-shaped mountain but more through openings in the crust., The strings of small craters and fissures now forming in Reykjanes’s volcanic systems are where the plate boundary comes onshore., The towering cone of Hekla in the south is closer to the mantle hotspot.,   

    From “Scientific American” : “See Iceland Aglow in Volcanic Eruptions” Photo Essay 

    From “Scientific American”

    8.24.22
    Sasha Warren

    A vivid look at Iceland’s recent resurgence of volcanic eruptions—and why the country could be in for 300 years of renewed volcanic activity.

    1
    The 2021 eruption of Iceland’s Fagradalsfjall volcano. Credit: Jeroen Van Nieuwenhove.

Breaking more than seven months of calm, the peninsula of Reykjanes in western Iceland has once again burst into volcanic flames. After a swarm of earthquakes in late July and early August rocked the area, lava burst forth from the Fagradalsfjall volcano into the valley of Meradalir—not far from the barely cooled lava from the same volcano’s 2021 eruption—treating tourists and researchers to the vibrant red-orange glow of fresh molten rock just 20 miles from Iceland’s capital of Reykjavík.

    Such striking volcanic displays are relatively common in Iceland. The entire country, which is one of the geologically youngest landmasses in the world, is the product of millions of years of eruptions and is perfectly placed for ongoing volcanic activity.

    2
    Fagradalsfjall volcanic eruption in 2022. Credit: Jeroen Van Nieuwenhove.

    Iceland straddles the boundary between two of the earth’s tectonic plates: enormous fragments of crust that fit together like puzzle pieces to form our planet’s rocky outer shell. The North American and Eurasian plates are pulling away from each other at a rate of one to two inches per year, gradually unzipping the floor of the Atlantic Ocean to form a mid-ocean ridge. This divergence leaves a gap that draws up material from the earth’s mantle, a hot layer of rock sandwiched between the crust (the layer we live on) and our planet’s metal core.

    3
    Fagradalsfjall 2022 eruption. Credit: Jeroen Van Nieuwenhove.

As it rises, this material partially melts, supplying Icelandic volcanoes with magma, but this isn’t the only source of molten rock in the region. Iceland, like Hawaii, is perched above a “hotspot,” a column of hot rock that rises through the mantle, driven by its own buoyancy, which adds yet more fuel to the island’s volcanic fires.

In Iceland, this combination of magma sources expresses itself as several different kinds of volcanoes. The towering cone of Hekla in the south is closer to the mantle hotspot, whereas the strings of small craters and fissures now forming in Reykjanes’s volcanic systems are where the plate boundary comes onshore.

    4
    Fagradalsfjall 2021 eruption. Credit: Jeroen Van Nieuwenhove.

    “The kind of volcanic eruptions that take place in this area [Reykjanes] are not originating from the typical cone-shaped mountain but more through openings in the crust,” says Sara Barsotti, coordinator for volcanic hazards at the Icelandic Meteorological Office (IMO). These openings occur because the area is located along a kink in the mid-ocean ridge and the cracks form as a result of the two plates moving apart at an odd angle. Some of these cracks fill with magma, which can eventually erupt, whereas others allow chunks of crust to slide past one another, leading to earthquakes. Magma moving through the crust can also cause seismic activity as new cracks form or widen to accommodate the molten rock.

    5
    Fagradalsfjall 2021 eruption. Credit: Jeroen Van Nieuwenhove.

As the mid-ocean ridge spreads, Reykjanes cycles through quiet periods, typically lasting 800 to 1,000 years, followed by two or three centuries of spectacular eruptions, which scientists studying Iceland suspect could be starting now. During the 1990s, well before the Fagradalsfjall eruption began in 2021, geophysicist Sigrun Hreinsdóttir, now at the New Zealand geoscience research and consulting company GNS Science, Te Pū Ao, set up GPS stations throughout the peninsula to monitor the area’s slow shifting, bending and buckling, accompanied by small earthquakes. At the time, there were no active eruptions.

    Looking back, though, Hreinsdóttir says, these measurements may have captured the first signs of new volcanic action in the region. “There was a lot of activity in [the mountain] Hengill, at the edge of Reykjanes Peninsula—lots of earthquakes,” she explains. All the action led scientists to suspect a magma chamber was filling up deep below the surface, and “we were wondering if that was kind of the first sign that Reykjanes might be close to coming alive.”

    6
    Fagradalsfjall 2021 eruption. Credit: Jeroen Van Nieuwenhove.

    Now it seems the peninsula is truly waking up. Since the late 2000s, magma injected beneath the surface has caused the area to periodically inflate and deflate, bulging to accommodate the movements of molten rock underground. Barsotti and her colleagues at IMO track the locations of these reservoirs using earthquakes, GPS and satellite imagery to try to anticipate which parts of Reykjanes are most primed for future eruptions. The final warning sign was a cluster of large earthquakes that shook western Iceland before the first fissures opened in 2021.

    7
    2021 eruption. Credit: Jeroen Van Nieuwenhove.

After longing to see an eruption on every day of her fieldwork on the peninsula around 30 years ago, Hreinsdóttir could only watch her dream come true from afar, as COVID kept her home in New Zealand in 2021. This August, however, she went on a pilgrimage to lay her hands on the cooled lava from last year, and her six-year-old son was knocked off his feet by a magnitude 4.5 earthquake. This August 2 quake turned out to be a warning of an eruption the very next day, one that would prove even bigger and more spectacular than the eruption she had missed. “It was quite a nice feeling for me,” she says. “It felt like Fagradalsfjall was just saying, ‘Hello!’”

    8
    Fagradalsfjall 2021 eruption. Credit: Jeroen Van Nieuwenhove.

On August 3, Hreinsdóttir hiked out to Meradalir with her colleagues from the University of Iceland, where she was previously affiliated, and some 1,800 other visitors to see the fluorescent orange glow of lava fountaining up from between the rocks of her former study area. As in the 2021 Fagradalsfjall eruption, volcanologists expect new lava to keep emerging here for several months.

    The eruption is already a hotspot for hikers and photographers. So far it is “pretty safe,” says Barsotti, who is monitoring the volcanic activity closely for potential hazards. “But I think we also need to know there is always uncertainty in what we can anticipate to be next.” The ongoing eruption is just an hour’s drive from Reykjavík, so IMO’s volcanologists are using data and models to assess current and future risks to infrastructure, water quality and human health caused by the lava and gases emanating from the new fissure.

    9
    Fagradalsfjall 2021 eruption. Credit: Jeroen Van Nieuwenhove.

    Although the eruption itself presents some dangers to tourists, including noxious fumes and unimaginably hot molten rock, perhaps the greatest challenge facing those who want to see it is the two-hour hike to get there. “It is important to check on the IMO website for the conditions expected because we are going toward autumn—it might be very cold; it might be very windy,” Barsotti says. As a result, children age 12 and under and pets are prohibited from entering the eruption area.

Those who make it, though, are in for an enviable sight. “I’m jealous of myself, to be honest,” Hreinsdóttir says, although eruptions may occur as often as every few years now that Reykjanes has awakened from its roughly 800-year slumber. “How lucky was it that I was alive when this was happening?”

    10
    Fagradalsfjall 2022 eruption. Credit: Jeroen Van Nieuwenhove

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
• richardmitnick 10:52 am on August 21, 2022
    Tags: "See How Scientists Put Together the Complete Human Genome?", , , , , For the first time researchers have sequenced all 3117275501 bases of our genetic code., , Scientific American   

    From “Scientific American” : “See How Scientists Put Together the Complete Human Genome” 

    From “Scientific American”

    8.1.22
    Clara Moskowitz
    Martin Krzywinski

For the first time researchers have sequenced all 3,117,275,501 bases of our genetic code.

    1
    Credit: Martin Krzywinski.

    The human genome is at last complete. Researchers have been working for decades toward this goal, and the Human Genome Project claimed victory in 2001, when it had read almost all of a person’s DNA. But the stubborn remaining 8 percent of the genome took another two decades to decipher. These final sections were highly repetitive and highly variable among individuals, making them the hardest parts to sequence. Yet they revealed hundreds of new genes, including genes involved in immune responses and those responsible for humans developing larger brains than our primate ancestors. “Now that we have one complete reference, we can understand human variation and how we changed with respect to our closest related species on the planet,” says geneticist Evan Eichler of the University of Washington, one of the co-chairs of the Telomere-to-Telomere consortium at The National Human Genome Research Institute that finished the genome.


    Credit: Martin Krzywinski; Sources: The University of California-Santa Cruz Genome Browser; “The Complete Sequence of a Human Genome,” by Sergey Nurk et al., in Science April 2022.

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
• richardmitnick 11:53 am on July 30, 2022
    Tags: "See the Strange Underground Detector Probing Neutrino Mysteries", , , , , , Scientific American, The LEGEND-200 detector could help explain why matter dominates the known universe.   

    From “Scientific American” : “See the Strange Underground Detector Probing Neutrino Mysteries” 

    From “Scientific American”

    July 1, 2022
    Joanna Thompson

    The LEGEND-200 detector could help explain why matter dominates the known universe.

Sheltered underneath nearly a mile of rock in Abruzzo, Italy, scientists are hard at work unraveling the secrets of the universe’s smallest bits of matter. When a radioactive process called beta decay occurs, it typically emits two particles: a negatively charged electron and a tiny, neutrally charged antineutrino, the antimatter version of the neutrino. The Large Enriched Germanium Experiment for Neutrinoless Double Beta Decay (LEGEND-200) at the Gran Sasso National Laboratory is designed to figure out whether this process can occur without any neutrino at the end. The answer could shape our understanding of how matter came to be.

    The process of “neutrinoless double beta decay,” if it does occur, happens very rarely. Noticing when decay results in electrons but not neutrinos can be difficult, especially because neutrinos are plentiful everywhere—billions pass through your body every second—and are often produced when background radiation reacts with machine components.

    That’s why scientists focus on “choosing really low-radioactivity materials to start with and then also coming up with lots of clever ways to reject background [particles],” says Drexel University particle physicist Michelle Dolinski, who is not involved in the project.

    LEGEND-200 is equipped with slightly radioactive germanium crystals, which act as both a source of beta decay and a neutrino detector. To screen out ambient particles, the entire setup is immersed in a cryogenic tank shielded by water and liquid argon. That core is surrounded with green optical fibers and a reflective film that bounces away stray particles.

If LEGEND-200 observes neutrinoless double beta decay, it will mean that unlike electrons, protons and other familiar particles—which each have an “antiparticle” that destroys them on contact—neutrinos are their own antiparticles and can destroy one another. If this is the case, then when double beta decay occurs, two neutrinos would be produced and immediately annihilated, leaving none behind. “This is an important ingredient in trying to understand why matter dominated over antimatter in the early universe and why the universe exists as it does today,” Dolinski says.
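For clarity, the two channels can be written out explicitly (a standard textbook summary, not from the article; for LEGEND’s germanium-76 the parent nucleus is $^{76}$Ge and the daughter $^{76}$Se):

$$ \text{ordinary double beta decay:}\quad (A,\,Z) \rightarrow (A,\,Z+2) + 2e^- + 2\bar{\nu}_e $$

$$ \text{neutrinoless double beta decay:}\quad (A,\,Z) \rightarrow (A,\,Z+2) + 2e^- $$

In the neutrinoless case the two electrons carry the full decay energy, so the telltale signature is a sharp peak at the end point of the summed electron-energy spectrum instead of a broad distribution.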

    3
Inside the LEGEND-200 water tank, mirror film surrounds the liquid-argon cryostat. Credit: Enrico Sacchetti.

    LEGEND collaborator Laura Baudis, who is an experimental physicist at the University of Zurich, is excited to see what this experiment uncovers when it begins collecting data later this year. “There are so many things we don’t know about neutrinos,” she says. “They’re really still full of surprises.”

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
• richardmitnick 3:50 pm on July 23, 2022
    Tags: "Ocean Discoveries Are Revising Long-Held Truths about Life", , , , For more than 50 years deep-sea exploration has been a continuous fount of discoveries that change how we think about life., In 1977 scientists diving in the restored ”Alvin” made another historic discovery—the first in-person observations of life around hot hydrothermal vents rising from the seafloor., New findings show that the ocean is much more intertwined with our lives than we ever imagined., Scientific American, The greatest paradigm that ocean exploration may tear down is that Earth represents the sole example of life in the universe., We are seeing that the ocean's biogeographic boundaries are neither immutable nor beyond the imprint of humans.   

    From “Scientific American” and The Woods Hole Oceanographic Institution: “Ocean Discoveries Are Revising Long-Held Truths about Life” 

    From “Scientific American”

    and

    The Woods Hole Oceanographic Institution

    August 1, 2022
    Timothy Shank

    New findings show that the ocean is much more intertwined with our lives than we ever imagined.

    1
    The Orpheus from the Woods Hole Oceanographic Institution is designed to maneuver autonomously and in swarms of vehicles at the deepest depths and to land to collect samples on the fly. Credit: Evan Kovacs/© Woods Hole Oceanographic Institution.

    For more than 50 years deep-sea exploration has been a continuous fount of discoveries that change how we think about life in the ocean, on dry land and even beyond our planet. Consider the following three events.

    On October 16, 1968, a cable tethering the submersible Alvin to a research ship located 100 miles off Nantucket broke. The sub sank to the seafloor more than 5,000 feet below; the crew of three escaped safely. Nearly a year later, when a team brought Alvin back to the surface, the biggest surprise was that the crew’s lunch—bologna sandwiches and apples in a plastic box—was strikingly well preserved. Bacteriological and biochemical assays proved it. Someone even took a bite. Subsequent experiments in the Woods Hole Oceanographic Institution laboratory where I’m writing this article found that rates of microbial degradation in the retrieved samples were 10 to 100 times slower than expected. This discovery, and others, led to the conclusion that metabolic and growth rates among deep-sea organisms were much slower than those of comparable species at the ocean’s surface.

In 1977 scientists diving in the restored Alvin made another historic discovery—the first in-person observations of life around hot hydrothermal vents rising from the seafloor. This sighting overturned the long-held view that our entire planetary food web was built on photosynthesis—using sunlight’s energy to convert carbon dioxide and water into complex carbohydrates and oxygen. The hydrothermal organisms, and the entire ecosystem, thrived in pure darkness, converting chemicals in the vent fluid into life-sustaining compounds through a process we now call chemosynthesis.

    If that revelation wasn’t surprising enough, an expedition I was part of in 1993 exposed an earlier mistaken belief. We had discovered a significant hydrothermal vent ecosystem on the East Pacific Rise. The system had been destroyed by a seafloor eruption just a few years earlier, yet it had already been bountifully recolonized. A bologna sandwich might decay so slowly in the deep that you could eat it a year later, but it turned out that biological processes in the deep sea could be extremely fast as well.

    Each new ocean discovery that disrupts old dogma reinforces a much larger truth: the ocean is far more complex—and much more intertwined with our own lives—than we ever imagined. For much of the 20th century, for example, scientists maintained that the deep ocean was a harsh, monotonous place of perpetual darkness, frigid temperatures, limited food and extreme pressure—conditions that should make complex forms of life impossible. But new tools for observing, sensing and sampling the deep ocean, such as increasingly sophisticated underwater vehicles with high-definition camera systems, have demonstrated that biodiversity in the darkest depths may rival that of rain forests and tropical coral reefs. These missions have further revealed that the depths are far from uniform; like kangaroo habitat in Australia and tiger lands in Asia, they are home to evolutionarily distinct biogeographic regions.

We are beginning to appreciate how connected these realms are to our own. Conditions such as temperature, salinity and oxygen concentration in the deep ocean, along with the currents and eddies that establish the boundaries of these provinces, are expected to change fundamentally as the effects of human activity reach ever farther below the surface. Already lobsters are moving to deeper, colder waters and molting at different times of the year. Commercially important ground fish such as cod and haddock are migrating poleward in search of more suitable habitat.

    We are seeing that the ocean’s biogeographic boundaries are neither immutable nor beyond the imprint of humans. In studies, more than half of sampled hadal organisms—those living in the deepest parts of the ocean, beyond 20,000 feet—had plastics in their gut. PCBs, which were banned in the U.S. in 1979 and phased out internationally as part of the Stockholm Convention beginning in 2001, are also common in tissues of animals from the extreme bottoms of the sea.

    We are also starting to learn that life in the deep might have things to teach us. Deep-sea fish produce biomolecules called osmolytes that permit cellular functions, such as the precise folding and unfolding of proteins, to proceed unimpeded by crushing water-column pressures exceeding 15,000 pounds per square inch. Medical researchers have determined that some of these molecules could help treat Alzheimer’s disease, which is characterized by misfolded proteins. In addition, decoding the genes that govern traits we see in deep-sea animals, such as those that stave off errors in DNA replication, transcription and translation, might be used in therapies for cancer and other afflictions.

    The greatest paradigm that ocean exploration may tear down is that Earth represents the sole example of life in the universe. Life might have existed on Mars when it hosted liquid water, and the fact that Earth and Mars have shared ejected material in the past means we could have exchanged the building blocks of life. But the discovery of chemosynthetic life on Earth and the more recent finding of perhaps 13 liquid-water oceans underneath the icy shells of moons such as Jupiter’s Europa and Saturn’s Enceladus—places that may have been too distant to have shared life-bearing material with Earth in the past—raise the possibility of a second, independent genesis of life. And if life can form twice in one solar system, then it could be anywhere we look in the heavens.

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Mission Statement

    The Woods Hole Oceanographic Institution is dedicated to advancing knowledge of the ocean and its connection with the Earth system through a sustained commitment to excellence in science, engineering, and education, and to the application of this knowledge to problems facing society.

    Vision & Mission

    The ocean is a defining feature of our planet and crucial to life on Earth, yet it remains one of the planet’s last unexplored frontiers. For this reason, WHOI scientists and engineers are committed to understanding all facets of the ocean as well as its complex connections with Earth’s atmosphere, land, ice, seafloor, and life—including humanity. This is essential not only to advance knowledge about our planet, but also to ensure society’s long-term welfare and to help guide human stewardship of the environment. WHOI researchers are also dedicated to training future generations of ocean science leaders, to providing unbiased information that informs public policy and decision-making, and to expanding public awareness about the importance of the global ocean and its resources.

    The Institution is organized into six departments, the Cooperative Institute for Climate and Ocean Research, and a marine policy center. Its shore-based facilities are located in the village of Woods Hole, Massachusetts and a mile and a half away on the Quissett Campus. The bulk of the Institution’s funding comes from grants and contracts from the National Science Foundation and other government agencies, augmented by foundations and private donations.

    WHOI scientists, engineers, and students collaborate to develop theories, test ideas, build seagoing instruments, and collect data in diverse marine environments. Ships operated by WHOI carry research scientists throughout the world’s oceans. The WHOI fleet includes two large research vessels (R/V Atlantis and R/V Neil Armstrong); the coastal craft Tioga; small research craft such as the dive-operation work boat Echo; the deep-diving human-occupied submersible Alvin; the tethered, remotely operated vehicle Jason/Medea; and autonomous underwater vehicles such as the REMUS and SeaBED.
WHOI offers graduate and post-doctoral studies in marine science. There are several fellowship and training programs, and graduate degrees are awarded through a joint program with the Massachusetts Institute of Technology. WHOI is accredited by the New England Association of Schools and Colleges. WHOI also offers public outreach programs and informal education through its Exhibit Center and summer tours. The Institution has a volunteer program and a membership program, WHOI Associate.

    On October 1, 2020, Peter B. de Menocal became the institution’s eleventh president and director.

    History

    In 1927, a National Academy of Sciences committee concluded that it was time to “consider the share of the United States of America in a worldwide program of oceanographic research.” The committee’s recommendation for establishing a permanent independent research laboratory on the East Coast to “prosecute oceanography in all its branches” led to the founding in 1930 of the Woods Hole Oceanographic Institution.

    A $2.5 million grant from the Rockefeller Foundation supported the summer work of a dozen scientists, construction of a laboratory building and commissioning of a research vessel, the 142-foot (43 m) ketch R/V Atlantis, whose profile still forms the Institution’s logo.

    WHOI grew substantially to support significant defense-related research during World War II, and later began a steady growth in staff, research fleet, and scientific stature. From 1950 to 1956, the director was Dr. Edward “Iceberg” Smith, an Arctic explorer, oceanographer and retired Coast Guard rear admiral.

    In 1977 the institution appointed the influential oceanographer John Steele as director, and he served until his retirement in 1989.

On 1 September 1985, a joint French-American expedition led by Jean-Louis Michel of IFREMER and Robert Ballard of the Woods Hole Oceanographic Institution identified the location of the wreck of the RMS Titanic, which sank off the coast of Newfoundland on 15 April 1912.

On 3 April 2011, within a week of resuming the search operation for Air France Flight 447, a team led by WHOI, operating full-ocean-depth autonomous underwater vehicles (AUVs) owned by the Waitt Institute, discovered, by means of side-scan sonar, a large portion of the debris field from flight AF447.

In March 2017 the institution adopted an open-access policy to make its research publicly accessible online.

    The Institution has maintained a long and controversial business collaboration with the treasure hunter company Odyssey Marine. Likewise, WHOI has participated in the location of the San José galleon in Colombia for the commercial exploitation of the shipwreck by the Government of President Santos and a private company.

    In 2019, iDefense reported that China’s hackers had launched cyberattacks on dozens of academic institutions in an attempt to gain information on technology being developed for the United States Navy. Some of the targets included the Woods Hole Oceanographic Institution. The attacks have been underway since at least April 2017.

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
• richardmitnick 3:50 pm on May 11, 2022
    Tags: "God; Dark Matter and Falling Cats-A Conversation with 2022 Templeton Prize Winner Frank Wilczek", 'Nature 'talks back' and sometimes surprises you and sometimes confirms what you imagined.", , Axions are super exciting. It was totally unexpected to me at the beginning that the theory was perfectly designed to explain the dark matter but that possibility has been gaining ground., “Complementarity” says that you can’t use a single picture to answer all meaningful questions., , , , Scientific American, So-called WIMPs (weakly interacting massive particles) have turned up empty., There’s an abstract mathematical idea called “gauge symmetry” that underpins particle physics. It’s a powerful tool but it’s a mystery as to why it is there.,   

    From “Scientific American”: “God; Dark Matter and Falling Cats-A Conversation with 2022 Templeton Prize Winner Frank Wilczek” 

From “Scientific American”

    May 11, 2022
    Zeeya Merali

    1
    Theoretical physicist and Nobel Prize laureate Dr. Frank Wilczek has won the 2022 Templeton Prize. Credit: Michael Clark.

Frank Wilczek, a Nobel Prize–winning theoretical physicist and author, has been announced as the recipient of the 2022 Templeton Prize, which is valued at more than $1.3 million. The annual award honors those “who harness the power of the sciences to explore the deepest questions of the universe and humankind’s place and purpose within it,” according to a press release from the John Templeton Foundation. Previous recipients include scientists such as Jane Goodall, Marcelo Gleiser and Martin Rees, as well as religious or political leaders such as Mother Teresa and Desmond Tutu.

    Wilczek’s Nobel-winning work traces back to the early 1970s, when he and two colleagues devised a theory describing the behavior of fundamental particles called quarks—a feat that proved crucial for establishing the Standard Model of particle physics.

    He has also proposed the existence of multiple new particles and entities. Some, such as “time crystals” and “anyons,” have since been discovered and appear promising for developing better quantum computers. Another Wilczek prediction—the “axion”—remains unconfirmed but is a leading candidate for dark matter, the invisible substance thought to comprise the majority of mass in the universe. He is also a prolific author, and in his recent books links his work as a physicist with his contemplations on the inherent beauty of reality, arguing that our universe embodies the most mathematically elegant structures.

    Scientific American spoke with Wilczek about the interplay between science and spirituality, recent reports that the Standard Model may be “broken” and his latest research involving the hunt for hypothetical particles and the physics of falling cats.

    [An edited transcript of the interview]

    Congratulations on receiving the Templeton Prize. What does this award represent for you?

    My exploratory, science-based efforts to address questions that are often thought to be philosophical or religious are resonating. I’m very grateful for that, and I’ve started to think about what it all means.

    One kind of “spiritual” awakening for me has been experiencing how a dialogue with nature is possible—in which nature “talks back” and sometimes surprises you and sometimes confirms what you imagined. Vague hopes and concepts that were originally scribbles on paper become experimental proposals and sometimes successful descriptions of the world.

    You don’t now identify with any particular religious tradition, but in your 2021 book Fundamentals: Ten Keys to Reality, you wrote, “In studying how the world works, we are studying how God works, and thereby learning what God is.” What did you mean by that?

    The use of the word “God” in common culture is very loose. People can mean entirely different things by it. For me, the unifying thread is thinking big: thinking about how the world works, what it is, how it came to be and what all that means for what we should do.

    I chose to study this partly to fill the void that was left when I realized I could no longer accept the dogmas of the Catholic Church that had meant a lot to me as a teenager. Those dogmas include claims about how things happen that are particularly difficult to reconcile with science. But more importantly, the world is a bigger, older and more alien place than the tribalistic account in the Bible. There are some claims about ethics and attitudes about community that I do find valuable, but they cannot be taken as pronouncements from “on high.” I think I have now gathered enough wisdom and life experience that I can revisit all this with real insight.

    Can you give me some specific examples of how the wisdom you have now but didn’t have earlier in your scientific career has influenced your outlook?

    “Complementarity” says that you can’t use a single picture to answer all meaningful questions. You may need very different descriptions, even descriptions that are mutually incomprehensible or superficially contradictory. This concept is absolutely necessary in understanding quantum mechanics, where, for instance, you can’t make predictions about the position and the momentum of an electron simultaneously. When I first encountered Bohr’s ideas about taking complementarity beyond quantum mechanics, I was not impressed. I thought it was borderline bullshit. But I’ve come to realize that it is a much more general piece of wisdom that promotes tolerance and mind expansion.

    There’s also the scientific attitude that openness and honesty allow people to flourish. It enhances the effectiveness of scientists to have a sort of loving relationship with what they are doing because the work can be frustrating and involves investing in learning some rather dry material.

    And then there is the lesson of beauty: when you allow yourself to use your imagination, the world repays with wonderful gifts.

    You won a share of the Nobel Prize in Physics in 2004 for your work on understanding the strong force, which binds subatomic particles within the atomic nucleus. This work forms part of the backbone of the Standard Model. But the Standard Model is of course incomplete because it doesn’t account for gravity or dark matter or the “dark energy” that seems to be powering the accelerating expansion of the universe. Many physicists, including yourself, consequently believe we will eventually find evidence that allows us to craft a successor to or extension of the Standard Model. In April physicists at the Fermi National Accelerator Laboratory in Batavia, Ill., announced that they had measured the mass of an elementary particle called the W boson to be significantly heavier than predicted by the Standard Model. Is this an exciting sign that the Standard Model’s reign is approaching its end?

    I am skeptical. This is an impressive piece of work, but it’s an attempt to do a high-precision measurement of the mass of an unstable particle that decays very fast in exotic ways. And because the W boson has a finite lifetime, according to quantum mechanics, it has an uncertainty in mass. Just the fact that the measurement is so complicated raises an eyebrow. And then, even more serious, is that the result is not only discrepant with theoretical calculations but also with previous experimental measurements. If there were a compelling theoretical hypothesis suggesting that there should be this discrepancy with the W boson mass but no other discrepancy with all the other tests, that would be fantastic. But that’s not the case. So, to me, the jury is still out.

    One of your most recent successes was predicting the existence of a novel quantum state of matter that you dubbed a “time crystal” because its particles exhibit repetitive behavior—like a swinging pendulum—but without consuming energy. How did you come up with the idea?

    Almost 10 years ago I was preparing to teach a course on symmetry, and I thought, “Let’s think about crystal symmetry in more than just 3-D; let’s think about crystals that are periodic in time.” Basically, time crystals are self-organized clocks, ones that are not constructed but arise spontaneously because they want to be clocks. Now, if you have systems that spontaneously want to move, this sounds dangerously like a perpetual-motion machine, and that had scared physicists away. But I have been given several injections of confidence over my career, so I wasn’t afraid and jumped in where angels fear to tread. I originally wanted to call it “spontaneous breaking of time-translation symmetry,” but my wife Betsy Devine said, “What the heck?!” So they became time crystals.

    Time crystals have now been created in the lab and in a quantum computer. How might they be useful?

    The most promising application is to make new and better clocks that are more portable and robust. Making accurate clocks is an important frontier in physics; [they are] used in GPS, for example. It’s also important to make clocks that are friendly to quantum mechanics because quantum computers will need compatible clocks.

    You have a habit of coming up with catchy names. Back in the 1970s, you proposed a hypothetical new particle that you called the “axion”—inspired by a laundry detergent—because its existence would clean up a messy technical problem in the workings of particle physics. Since then, other physicists have suggested that axions, if they exist, have just the right properties to make up dark matter. How is the search for axions progressing?

    Axions are super exciting. It was totally unexpected to me at the beginning that the theory was perfectly designed to explain the dark matter, but that possibility has been gaining ground. That’s partly because searches for the other leading dark matter candidates, so-called WIMPs (weakly interacting massive particles), have turned up empty, so axions look better by comparison. And in the last few years, there have been some truly promising ideas for detecting dark matter axions. I came up with one with Stockholm University researchers Alex Millar and Matt Lawson that uses a “metamaterial”—a material that has been engineered to process light in particular ways—as a sort of “antenna” for axions. The ALPHA collaboration has tested prototypes, and I’m optimistic, bordering on confident, that within five to 10 years, we will have definitive results.

    And “axion” is now in the Oxford English Dictionary. When you’re in the OED, you know you’ve arrived.

    You also coined the name of another new particle, the “anyon.” The Standard Model allows for two types of elementary particles: “fermions” (which include electrons) and “bosons” (such as photons of light). The anyon is a third category of “quasiparticle” that emerges through the collective behavior of groups of electrons in certain quantum systems. You predicted this back in 1984, but it’s only been confirmed in recent years. What’s the latest news on anyons?

    I thought it would take a few months to verify that you could have anyons, but it took almost 40 years. During that time, there have been literally thousands of papers about anyons, but very few were experimental. People also realized that anyons could be useful as ways of storing information—and that this could potentially be produced on an industrial scale—giving rise to the field of “topological quantum computing.” There have now been prototype experiments in China and serious investment by Microsoft. Last month Microsoft announced that they have made the kind of anyon we need to get the quantum-computing applications off the ground in a serious way. So all those thousands of theory papers are finally making contact with practical reality and even technology.

    You clearly have a knack for coming up with groundbreaking concepts in physics. Do you have any other revolutionary ideas brewing?

    Yes, but I don’t want to jinx them by casually mentioning them here! I’ll tell you something amusing I am working on, though: there’s an abstract mathematical idea called “gauge symmetry” that underpins particle physics. It’s a powerful tool, but it’s a mystery as to why it is there. An interesting observation is that gauge symmetry also arises in the description of the mechanics of bodies that are squishy and can propel themselves. Amazingly, gauge symmetry appears when you try and work out how a cat that falls out of a tree can manage to land on its feet or how divers avoid belly flops. I realized this with [physicist] Al Shapere 30 years ago, but in recent work I have been generalizing it in several directions. It’s a lot of fun—and it might turn out to be profound.

    And finally, what are your long-term hopes for the future of society?

    Looking at big history reinforces cosmic optimism. I like to say that God is a “work in progress.” Day-to-day, you can have backsliding—pandemics, wars—but if you look at the overall trends, they are extraordinarily positive. Things could go wrong, with nuclear war or ecological catastrophe, but if we are careful as a species, we can have a really glorious future. I view it as part of my mission in the remainder of my life to try and point people toward futures that are worthy of our opportunities and not to get derailed.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 11:45 am on April 18, 2022 Permalink | Reply
    Tags: "Astronomers Gear Up to Grapple with the High-Tension Cosmos", , Astronomers gear up for a host of new space and terrestrial telescopes to gain clearer cosmic views., , , , , ESA’s 1.2-meter Euclid space telescope., How fast is the universe expanding?, How much does matter clump up in our cosmic neighborhood?, Ironically-by its very success-the model highlights what we do not know: the exact nature of 95 percent of the universe., NASA’s Nancy Grace Roman Space Telescope, New levels of precision via radio telescope arrays such as the Simons Observatory in the Atacama Desert and the nascent CMB-S4., Pursuing these tensions is a great way to learn about the universe., , Scientific American, SH0ES project, The ground-based Vera C. Rubin Observatory, The Hubble tension which arises from differing estimates of the value of the Hubble constant-H0-the rate at which the universe is expanding., The second generation of precision cosmology supported the standard model but also brought to light the tensions., The simplest explanation for these discrepancies is merely that our measurements are somehow erroneous., The Sunyaev-Zeldovich effect, The third generation has been waiting in the wings for years and is only now starting to take center stage with the successful launch of the James Webb Space Telescope., These new telescopes are about to usher in the third generation of precision cosmology., These twin tensions may reflect some deep flaw in the standard model of cosmology.   

    From Scientific American: “Astronomers Gear Up to Grapple with the High-Tension Cosmos” 

    From Scientific American

    April 18, 2022
    Anil Ananthaswamy

    A debate over conflicting measurements of key cosmological properties is set to shape the next decade of astronomy and astrophysics.

    Atacama Cosmology Telescope at Cerro Toco in the Atacama Desert of northern Chile. Credit: Giulio Ercolani/Alamy Stock Photo.

    How fast is the universe expanding? How much does matter clump up in our cosmic neighborhood? Different methods of answering these two questions—either by observing the early cosmos and extrapolating to present times, or by making direct observations of the nearby universe—are yielding consistently different answers. The simplest explanation for these discrepancies is merely that our measurements are somehow erroneous, but researchers are increasingly entertaining another, more breathtaking possibility: These twin tensions—between expectation and observation, between the early and late universe—may reflect some deep flaw in the standard model of cosmology, which encapsulates our knowledge and assumptions about the universe. Finding and fixing that flaw, then, could profoundly transform our understanding of the cosmos.

    One way or another, an answer seems certain to emerge from the fog over the coming decade, as eager astronomers gear up for a host of new space and terrestrial telescopes to gain clearer cosmic views. “Pursuing these tensions is a great way to learn about the universe,” says astrophysicist and Nobel laureate Adam Riess of Johns Hopkins University. “They give us the ability to focus our experiments on very specific tests, rather than just making it a general fishing expedition.”

    These new telescopes, Riess anticipates, are about to usher in the third generation of precision cosmology. The first generation came of age in the late 1990s and early 2000s with the Hubble Space Telescope (HST) and with NASA’s WMAP satellite that sharpened our measurements of the universe’s oldest light, the cosmic microwave background (CMB).

    It was also shaped by a number of eight-meter-class telescopes in Chile and the twin 10-meter Keck behemoths in Hawaii.

    Collectively, these observatories helped cosmologists formulate the standard model of cosmology: a cocktail of 5 percent ordinary matter, 27 percent dark matter and 68 percent dark energy that accounts with uncanny accuracy for most of what we observe about galaxies, galaxy clusters and other large-scale structures and their evolution over cosmic time.

    Ironically, by its very success, the model highlights what we do not know: the exact nature of 95 percent of the universe.

    Driven by even more precise measurements of the CMB from ESA’s Planck satellite [above] and various ground-based telescopes, the second generation of precision cosmology supported the standard model but also brought to light the tensions. The focus shifted to reducing so-called systematics: repeatable errors that creep in because of faults in the design of experiments or equipment.

    The third generation has been waiting in the wings for years and is only now starting to take center stage with the successful launch and deep-space deployment of Hubble’s successor, the James Webb Space Telescope (JWST).

    On Earth, CMB measurements are poised to reach new Planck-surpassing levels of precision via radio telescope arrays such as the Simons Observatory in the Atacama Desert and the nascent CMB-S4, a future assemblage of 21 dishes and a half million cryogenically cooled detectors that will be divided between sites in the Atacama and at the South Pole.

    But the jewels in the third generation’s crown will be telescopes that stare at wide swathes of the sky. The first of these is likely to be ESA’s 1.2-meter Euclid space telescope, due for launch in 2023 to study the shapes and distributions of billions of galaxies with a gaze that spans about a third of the sky.

    Euclid’s studies will dovetail with those of NASA’s Nancy Grace Roman Space Telescope, a 2.4-meter telescope with a field of view about 100 times bigger than Hubble’s that is slated for launch in 2025.

    Finally, when it begins operations in the mid-2020s, the ground-based Vera C. Rubin Observatory will map the entire overhead sky every few nights with its 8.4-meter mirror and a three-billion-pixel camera, the largest ever built for astronomy.

    “We’re not going to be limited by noise and by systematics, because these are independent observatories,” says astrophysicist Priyamvada Natarajan of Yale University. “Even if we have a systematic in our framework, we should [be able to] figure it out.”

    Scaling the Distance Ladder

    Riess, for one, would like to see a resolution of the Hubble tension, which arises from differing estimates of the value of the Hubble constant, H0, the rate at which the universe is expanding. Riess leads the Supernovae, H0, for the Equation of State of Dark Energy (SH0ES) project to measure H0. The SH0ES process starts with astronomers climbing onto the first rung of the so-called cosmic distance ladder, a hierarchy of methods to gauge ever-greater celestial expanses.

    The first rung—that is, the one concerning the nearest cosmic objects—relies on geometric parallax to determine the distance to special stars called Cepheid variables, whose pulsation rates track their intrinsic luminosities. Pegging the distance to a Cepheid via parallax allows astronomers to calibrate the relationship between its brightness and variability, making it a workhorse “standard candle” for estimating greater cosmic distances.

    This forms the basis of the second rung, which uses telescopes like the HST to find Cepheids in more remote galaxies, measure their variability to determine their distance and then use that distance to calibrate another, more powerful set of standard candles called type Ia (pronounced “one-A”) supernovae, or SNe Ia, in those very same galaxies. Ascending further, astronomers locate SNe Ia in even more far-flung galaxies, using them to establish a relationship between distance and a galaxy’s redshift, a measure of how fast it is moving away from us. The end result is an estimate of H0.
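    The arithmetic of that final rung is simple enough to sketch in a few lines of code. What follows is a minimal illustration with invented numbers (not the SH0ES data or pipeline), assuming a single Cepheid-calibrated absolute magnitude for type Ia supernovae:

```python
# Minimal sketch of the distance ladder's final rung, with invented numbers.
# A Cepheid-calibrated SN Ia absolute magnitude turns apparent magnitudes
# into distances; redshifts give velocities; the slope of velocity versus
# distance is H0.
import numpy as np

M_SN = -19.25  # assumed SN Ia absolute magnitude from the Cepheid rungs

m_obs = np.array([13.8, 15.3, 16.8])  # hypothetical apparent magnitudes
z_obs = np.array([0.01, 0.02, 0.04])  # hypothetical redshifts
c = 299_792.458                       # speed of light, km/s

mu = m_obs - M_SN                   # distance modulus, mu = m - M
d_mpc = 10 ** ((mu + 5) / 5) / 1e6  # distance in megaparsecs
v_kms = c * z_obs                   # recession velocity, v ~ cz at low z

# Least-squares slope through the origin gives H0 in km/s/Mpc
H0 = np.sum(v_kms * d_mpc) / np.sum(d_mpc ** 2)
print(f"H0 ~ {H0:.0f} km/s/Mpc")
```

    The real analyses fit thousands of objects with correlated uncertainties, but the logic is exactly this: calibrated candles turn magnitudes into distances, and the distance-velocity slope is H0.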

    Others, besides SH0ES, have also been on the case, including the Pantheon+ team, which has compiled a large dataset of type Ia supernovae.

    In December, Riess says, “after a couple of years of taking a deep dive on the subject,” the SH0ES team and the Pantheon+ team announced the results of nearly 70 different analyses of their combined data. The data included observations of Cepheid variables in 37 host galaxies that contained 42 type Ia supernovae, more than double the number of supernovae studied by SH0ES in 2016. Riess and his co-authors suspect this latest study represents the HST’s last stand, the outer limits of that hallowed telescope’s ability to help them climb higher up the cosmic distance ladder. The set of supernovae now includes “all suitable SNe Ia (of which we are aware) observed between 1980 and 2021” in the nearby universe. In their analysis, H0 comes out to be 73.04 ± 1.04 kilometers per second per megaparsec.

    That is way off the value obtained by an entirely different method that looks at the other end of cosmic history—the so-called epoch of recombination, when the universe became transparent to light, about 380,000 years after the Big Bang.

    The light from this epoch, now stretched to microwave wavelengths because of the universe’s subsequent expansion, is detectable as the all-pervading cosmic microwave background. Tiny fluctuations in temperature and polarization of the CMB capture an all-important signal: the distance a sound wave travels from almost the beginning of the universe to the epoch of recombination. This length is a useful metric for precision cosmology and can be used to estimate the value of H0 by extrapolating to the present-day universe using the standard LCDM model (where L stands for lambda or dark energy, and CDM for cold dark matter; cold refers to the assumption that dark matter particles are relatively slow-moving).
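    For concreteness, the expansion history that anchors this extrapolation can be written down directly. Here is a minimal sketch of the flat-LCDM expansion rate, with roughly Planck-like parameter values chosen purely for illustration:

```python
# Flat-LCDM expansion rate H(z): the model that links the recombination era
# to the present day. Parameter values are illustrative, roughly Planck-like.
import numpy as np

H0 = 67.5             # present-day expansion rate, km/s/Mpc
Om = 0.315            # matter (cold dark matter plus ordinary matter)
Orad = 9.2e-5         # radiation (photons plus neutrinos); matters near recombination
OL = 1.0 - Om - Orad  # dark energy (lambda), assuming a flat universe

def hubble(z):
    """Expansion rate at redshift z, in km/s/Mpc."""
    return H0 * np.sqrt(Om * (1 + z) ** 3 + Orad * (1 + z) ** 4 + OL)

# One parameter set describes both recombination (z ~ 1100) and today (z = 0),
# which is what lets a sound-horizon measurement predict H0.
for z in (0, 1, 1100):
    print(f"z = {z:>4}: H(z) ~ {hubble(z):,.0f} km/s/Mpc")
```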

    Lambda Cold Dark Matter (LCDM) and the accelerated expansion of the universe. Credit: Alex Mittelmann.

    Published a year ago, the latest analysis combined data from the Planck satellite and two ground-based instruments, the Atacama Cosmology Telescope (ACT) and the South Pole Telescope (SPT), to arrive at an H0 of 67.49 ± 0.53 kilometers per second per megaparsec.

    The discrepancy between the two estimates has a statistical significance of five sigma, meaning there is only about a one-in-a-million chance of it being a statistical fluke. “It’s certainly at the level that people should take seriously—and they have,” Riess says.
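    The quoted significance follows from a textbook combination of the two error bars, treating them as Gaussian and independent (a simplification of the real analyses, though it lands near the quoted five-sigma, one-in-a-million figures):

```python
# Back-of-the-envelope significance of the Hubble tension, treating both
# error bars as Gaussian and independent.
from math import erf, sqrt

h0_late, err_late = 73.04, 1.04    # SH0ES/Pantheon+ distance ladder
h0_early, err_early = 67.49, 0.53  # Planck + ACT + SPT, extrapolated via LCDM

nsigma = abs(h0_late - h0_early) / sqrt(err_late**2 + err_early**2)
p = 0.5 * (1 - erf(nsigma / sqrt(2)))  # one-sided Gaussian tail probability
print(f"tension ~ {nsigma:.1f} sigma; fluke odds ~ 1 in {1/p:,.0f}")

# The same conversion puts a three-sigma discrepancy (the S8 tension,
# discussed below) at roughly 1 in 740.
```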

    How Clumpy Is the Cosmos?

    The other tension that researchers are starting to take seriously concerns a cosmic parameter called S8, which depends on the density of matter in the universe and the extent to which it is clumped up rather than evenly distributed. Estimates of S8 also involve, on one end, measurements of the CMB, with measurements of the local universe on the other. The CMB-derived value of S8 in the early universe, extrapolated using LCDM, generates a present-day value of about 0.834.

    The local universe measurements of S8 involve a host of different methods. Among the most stringent of these are so-called weak gravitational lensing observations, which measure how the average shape of millions of galaxies across large patches of the sky is distorted by the gravitational influence of intervening concentrations of dark and normal matter.

    Astronomers used the latest data from the Kilo-Degree Survey (KiDS), which more than doubled its sky coverage, from 350 to 777 square degrees (the full moon, by comparison, spans a mere half a degree), and estimated S8 to be about 0.759. The tension between the early- and late-universe estimates of S8 has grown from 2.5 sigma in 2019 to three sigma now (or about a one-in-740 chance of being a fluke). “This tension isn’t going away,” says astronomer Hendrik Hildebrandt of The Ruhr-University Bochum [Ruhr-Universität Bochum] (DE). “It has hardened.”

    There is yet another way to arrive at the value of S8: by counting the number of the most massive galaxy clusters in some volume of space. Astronomers can do that either directly (for example, by using gravitational lensing) or by studying the imprint of these clusters in the cosmic microwave background, thanks to something called the Sunyaev-Zeldovich effect (which causes CMB photons to scatter off the hot electrons in clusters of galaxies, creating shadows in the CMB that are proportional to the mass of the cluster). A detailed 2019 study using data from the South Pole Telescope estimated S8 to be 0.749—again, way off from the CMB+LCDM–based estimates. These numbers could be reconciled if the estimates of the masses of these clusters were wrong by about 40–50 percent, Natarajan says. However, she thinks such substantial revisions are unlikely. “We are not that badly off in the measurement game,” she says. “So that’s another kind of internal inconsistency, another anomaly pointing to something else.”

    Breaking the Tensions

    Given these tensions, it is no surprise cosmologists are anxiously awaiting fresh data from the new generation of observatories. For instance, David Spergel of Princeton University is eager for astronomers to use the JWST to study the brightest of the so-called red-giant-branch stars. These stars have a well-known luminosity and can be used as standard candles to measure galactic distances—an independent rung on the cosmic ladder, if you will. In 2019, Wendy Freedman of The University of Chicago and colleagues used this technique to estimate H0, finding that their value sits smack in the middle of the early- and late-universe estimates. “The error bars on the current tip of the red-giant-branch data are such that they’re consistent with both possibilities,” Spergel says. Astronomers are also planning to use JWST to recalibrate the Cepheids surveyed by Hubble, and separately the telescope will help create another new rung for the distance ladder by targeting Mira stars (which, like Cepheids, have a luminosity-periodicity relation useful for cosmic cartography).

    Whereas JWST might resolve or strengthen the H0 tension, the wide-field survey data from the Euclid, Roman and Rubin observatories could do the same for the S8 tension by studying the clustering and clumping of matter. The sheer amount of data expected from this trio of telescopes will reduce S8 error bars enormously. “The statistics are going to go through the roof,” Natarajan says.

    Meanwhile, theoreticians are already having a field day with the twin tensions. “This is a playground for theorists,” Riess says. “You throw in some actual observed tensions, and they are having more fun than we are.”

    The most recent theoretical idea to garner a great deal of interest is something called early dark energy (EDE). In the canonical LCDM model, dark energy only started dominating the universe relatively late in cosmic history, about five billion years ago. But, Spergel says, “we don’t know why dark energy is the dominant component of the universe today; since we don’t know why it’s important today, it could have also been important early on.” That is partly the rationale for invoking dark energy’s effects much earlier, before the epoch of recombination. Even if dark energy was just 10 percent of the universe’s energy budget during those times, that would be enough to accelerate the early phases of cosmic expansion, causing recombination to occur sooner and shrinking the distance traversed by primordial sound waves. The net effect would be to ease the H0 tension.

    “What I find most interesting about these models is that they can be wrong,” Spergel says. Cosmologists’ EDE models make predictions about the resulting EDE-modulated patterns in the photons of the CMB. In February 2022, Silvia Galli, a member of the Planck collaboration at The University of Paris-Sorbonne [Université de Paris-Sorbonne](FR), and colleagues published an analysis of observations from Planck and ground-based CMB telescopes, suggesting that they collectively favor EDE over LCDM, by a statistical smidgen. Confirming or refuting this rather tentative result, however, will require more and better data—which could come soon from upcoming observations by the same ground-based CMB telescopes. But even if EDE models prove to be better fits and fix the H0 tension, they do little to alleviate the tension from S8.

    Potential fixes for S8 exhibit a similarly vexing lack of overlap with H0. In March, Guillermo Franco Abellán of The University of Montpellier [Université de Montpellier](FR) and colleagues published a study in Physical Review D showing that the S8 tension could be eased by the hypothetical decay of cold dark matter particles (into one massive particle and one “warm” massless particle). This mechanism would lower the value of S8 arising from CMB-based extrapolations, bringing it more in line with the late universe measurements. Unfortunately, it does little to solve the H0 tension.

    “It seems like a robust pattern: whatever model you come up with that solves the H0 tension makes the S8 tension worse, and the other way around,” Hildebrandt says. “There are a few models that at least don’t make the other tension worse, but also don’t improve it a lot.”

    “We Are Missing Something”

    Once fresh data arrive, Spergel foresees a few possible scenarios unfolding. First, the new CMB data could turn out to be consistent with early dark energy, resolving the H0 tension, and the upcoming survey telescope observations could separately ease the S8 tension. That would be a win for early dark energy models—and would constitute a major shift in our understanding of the opening chapters of cosmic history.

    Or, it is possible that both H0 and S8 tensions resolve in favor of LCDM. This would be a win for the standard model, and a possibly bittersweet victory for cosmologists hoping for paradigm-shifting breakthroughs rather than “business as usual.”

    “Outcome three would be both tensions become increasingly significant as the data improves—and early dark energy isn’t the answer,” Spergel says. Then, LCDM would presumably have to be reworked differently, but absent further specifics the impact of such an outcome is difficult to foresee.

    Natarajan thinks that the tensions and discrepancies are probably telling us that LCDM is merely an “effective theory,” a technical term meaning that it accurately explains a certain subset of the current compendium of cosmic observations. “Perhaps what’s really happening is that there is an underlying, more complex theory,” she says. “And that LCDM is this [effective] theory, which seems to have most of the key ingredients. For the level of observational probes that we had previously, that effective theory was sufficient.” But times change, and the data deluge from precision cosmology’s third generation of powerful observatories may demand more creative and elaborate theories.

    Theorists, of course, are more than happy to oblige. For instance, Spergel speculates that if early dark energy could interact with dark matter (in LCDM, dark energy and dark matter do not interact), this could suppress the fluctuations of matter in the early universe in ways that would resolve the S8 tension, while simultaneously taking care of the H0 tension. “It makes the models more baroque, but maybe that’s what nature will demand,” Spergel says.

    As an observational astronomer, Hildebrandt is circumspect. “If there was a convincing model that beautifully solves these two tensions, we’d already have the next standard model. That we’re instead still talking about these tensions and scratching our heads is just reflecting the fact that we don’t have such a model yet.”

    Riess agrees. “After all, this is a problem of using a model based on an understanding of physics and the universe that is about 95 percent incomplete, in terms of the nature of dark matter and dark energy,” he says. “It wouldn’t be crazy to think that we are missing something.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 10:57 am on April 17, 2022 Permalink | Reply
    Tags: "SpaceX’s Starship and NASA’s SLS Could Supercharge Space Science", NASA’s Space Launch System (SLS), Scientific American, Scientists are beginning to dream of how a new generation of super-heavy-lift rockets might enable revolutionary space telescopes and bigger and bolder interplanetary missions., SpaceX Starship   

    From Scientific American: “SpaceX’s Starship and NASA’s SLS Could Supercharge Space Science” 

    From Scientific American

    April 12, 2022
    Jonathan O’Callaghan

    Scientists are beginning to dream of how a new generation of super-heavy-lift rockets might enable revolutionary space telescopes and bigger and bolder interplanetary missions.

    NASA’s Space Launch System (SLS) moon megarocket topped by the Orion spacecraft (left); SpaceX’s Starship with Super Heavy booster (right). Credit: Paul Hennessy/Anadolu Agency via Getty Images (left); SpaceX/Flickr (CC BY-NC 2.0) (right).

    Astronomers breathed a collective sigh of relief as the James Webb Space Telescope (JWST) sprang to life. Getting the $10-billion telescope up and running following its launch on Christmas Day 2021 had been a nerve-racking affair. JWST would not fit into any modern rocket without being folded, and it had to rely on hundreds of moving parts to unfurl to full size once in space. Ultimately those efforts were successful, and the telescope has started returning some of its first calibration images to thrilled audiences back on Earth. Yet the experience left many astronomers wondering if there was a simpler way to build and launch telescopes of this size. “We were worried about the unfolding,” says John Blevins of NASA’s Marshall Space Flight Center. But with a larger rocket, “you don’t have to unfold in space. You can do it on the ground.”

    As chance would have it, two such rockets are currently sitting on launchpads. Each should ultimately exceed the power of the mighty Saturn V, which sent the Apollo astronauts to the moon. The first, NASA’s Space Launch System (SLS), is ready and waiting at The Kennedy Space Center in Florida for its inaugural uncrewed voyage around the moon this summer as part of the Artemis I mission—the opening shot in NASA’s plan to return humans to the lunar surface in the 2020s.

    The rocket is meant to be as reliable as possible and is therefore based, in large part, on legacy hardware from NASA’s Space Shuttle program. But a reliance on tried-and-true technology could be its Achilles’ heel: some estimates currently peg the SLS’s cost at an eye-watering $4.1 billion per launch. Presuming it is not scuttled by congressional appropriators feeling buyer’s remorse, its massive size could ultimately be a boon for scientists seeking to send larger, more ambitious spacecraft and telescopes throughout the solar system—and even beyond.

    Over in Texas, Starship, a similarly capable but wildly different rocket being developed by SpaceX, is also in preparation to launch on its first orbital test flight as early as May, pending regulatory approval from the Federal Aviation Administration. The cost of the SLS seems so egregious because each multibillion-dollar rocket will be discarded after a single use, its components relegated to junk on the seafloor or adrift in space. Such was the standard for most of the space age, but times have changed. Starship and its giant Super Heavy booster are instead built for endurance, landing back on the ground for rapid reuse similar to SpaceX’s current fleet of Falcon rockets, which has already dramatically lowered the cost of reaching space. As big and bold as the SLS may be, experts say that it pales in comparison with what Starship could achieve. “Starship holds the promise of transforming the solar system in a way we can’t really appreciate,” says Alan Stern of the Southwest Research Institute in Texas, who helms NASA’s New Horizons mission, which flew by the dwarf planet Pluto in 2015. “It completely changes the game.”

    During the Artemis I mission, NASA’s SLS rocket will send an uncrewed Orion spacecraft (illustrated) soaring away from Earth to the moon. Credit: NASA.

    Either rocket’s shroudlike payload fairing is spacious enough to fit cargo as big or even bigger than JWST, all without the need for folding components into the world’s most expensive origami. And both launchers will possess such immense thrust that they can reach remote corners of the solar system on shorter time scales with larger spacecraft than smaller rockets. Starship alone, however, is designed to be refueled in space, meaning that it could transport mind-bogglingly huge payloads to hard-to-reach locales such as Jupiter and Saturn—or pretty much anywhere else around the sun, for that matter.

    As this hopeful new era of the super rocket dawns, eager scientists are vying to be along for the ride. “These rockets can enable whole new classes of missions—to all the giant planets and the Kuiper belt objects, to the ocean world satellites and the dwarf planets of the solar system,” Stern says. “They’re across-the-board useful.” Now many are busy drawing up ideas for what might be possible, at the moment focusing more on the SLS because of its greater maturity but keeping a beady eye on Starship and its potentially revolutionary capabilities.

    The Science Launch System?

    After its initial moonshot, NASA officials say, the SLS will primarily be used to launch the agency’s Orion spacecraft with crew onboard. Those launches will work in tandem with NASA-contracted Starship launches, which will serve to land an Artemis crew on the moon as early as 2025—and perhaps one day send astronauts to the surface of Mars. “We expect approximately one human landing per year over a decade or so,” NASA’s administrator Bill Nelson said in a press conference on March 23. As such, no SLS rocket is likely to be available to solely launch any sort of telescope or scientific probe into the solar system until the 2030s. “Given the demands of the Artemis program between now and the late 2020s, it’s going to be very difficult to squeeze a science mission in that time frame,” said Robert Stough, payload utilization manager of SLS at NASA’s Marshall Space Flight Center, in a briefing last year.

    Consequently, in 2021 NASA switched the planned 2024 launch of its Jupiter-bound flagship mission, Europa Clipper, from the SLS to a SpaceX Falcon Heavy.

    Even so, agency officials are bullish that the SLS’s exorbitant costs and sluggish launch rate can be improved, creating more opportunities for science missions. In his briefing, Stough estimated that $800 million or lower was an achievable target by the 2030s. According to a paper presented at a November 2020 American Institute of Aeronautics and Astronautics (AIAA) meeting, SLS’s final, most powerful planned configuration could be supercharged with the addition of a new “kick stage” that would add propulsion to the top of the rocket. Such an upgrade would allow the SLS to send some 16 metric tons to Jupiter, about six metric tons to Neptune and one metric ton to interstellar space. The New Horizons mission to Pluto, by comparison, had a mass of half a metric ton. “There’s no rocket right now that can carry anywhere near this payload,” says Blevins, who is chief engineer of the SLS at Marshall.

    On April 19 the National Academies of Sciences, Engineering, and Medicine will release its much-awaited Planetary Science and Astrobiology Decadal Survey, which will recommend NASA’s otherworldly science priorities well into the 2030s. As part of the survey, NASA solicited studies from scientists on mission concepts that the agency might consider for targets in the outer solar system. Three of those suggested using the SLS to allow faster, bulkier missions: a Pluto orbiter, an orbiter and lander to Saturn’s moon Enceladus, and an orbiter and atmospheric probe to Neptune. “We wanted to use existing or very near-term technology,” says Kirby Runyon of The Johns Hopkins University Applied Physics Laboratory (JHUAPL), who is part of the proposed Neptune mission. “The SLS is the furthest along in its design and maturation of any of the very large vehicles.”

    Runyon’s group’s proposal, Neptune Odyssey, would launch as soon as 2031 on an SLS rocket to enter orbit around Neptune in the 2040s. The mission would provide unprecedented insight into a planet that has only been visited once, a fleeting flyby from the Voyager 2 spacecraft in 1989 on its journey out of the solar system. Odyssey would study Neptune and its largest moon Triton for four years while also deploying a probe into the planet’s stormy atmosphere. Slightly smaller rockets such as the Falcon Heavy could also get Odyssey to Neptune but only via various add-ons that would raise the mission’s cost and complexity while reducing its tolerance for error. That approach “is definitely more risky,” Runyon says.

    The Enceladus Orbilander, meanwhile, would be a mission to seek out signs of life within the Saturnian moon’s ocean, which is ejecting plumes of water vapor and organic molecules through cracks in its overlying icy crust. The spacecraft could fly through and sample the plumes before landing on the moon’s surface to perform in situ studies. The SLS, again, makes a mission like this easier than it would be with a smaller rocket, which would require gravitational boosts from planetary flybys in the inner solar system. “This means we don’t have to design the spacecraft to survive both the warm conditions of the inner solar system and the frigid conditions out at Saturn,” says Shannon MacKenzie, the concept’s lead at JHUAPL.

    Even the SLS has its limitations, though. Assuming a launch in 2031, the giant rocket would still take nearly three decades to propel a proposed orbiter, called Persephone, to Pluto. And despite its immense size, the SLS is still limited by its inability for on-orbit refueling to boost its carrying capacity once in space. In their more audacious dreams of cosmic exploration, scientists have eyes for only one rocket: Starship. “Starship is not just an incremental change,” says Jennifer Heldmann of NASA’s Ames Research Center. “This is a significant paradigm shift.”

    Into the Unknown

    Starship, by its design, can be refueled by other Starship vehicles in Earth orbit. This means it could, hypothetically, carry a huge amount of mass around the solar system. “You could get a 100-ton object to the surface of Europa,” SpaceX’s CEO Elon Musk said in a public meeting of the National Academies in November 2021. That is five times the performance of the very best the SLS can offer, even in its final configuration with a kick stage. Starship is also forecast to be significantly cheaper, although whether it can hit Musk’s optimistic projection of less than $10 million per launch remains to be seen. “If they get anywhere near that cost, it’s kind of an analogue to a 747 and a shipping container all in one,” says Robin Hague, former head of launch at the U.K. launch company Skyrora. “That’s going to be used throughout the solar system.”

    With 1,000 cubic meters of usable volume, Starship is also big enough to fit the entire Eiffel Tower, disassembled (although not powerful enough to lift it into orbit). This gargantuan capability led Heldmann and her colleagues to publish a paper on what sort of equipment Starship could carry to the lunar or Martian surface. “Refilling Starship in orbit effectively resets the rocket equation, allowing for large payloads to be transported to the Moon and Mars,” they wrote, a reference to the fact that, traditionally, the more payload you want to deliver, the more propellant you must launch, a requirement that grows exponentially with a mission’s total change in velocity. Starship is not limited to these destinations, though. “It is not fine-tuned to either the moon or Mars,” says Margarita Marinova, a former senior Mars development engineer at SpaceX. “The goal for Starship is to create this more generic, larger-scale exploration capability.”
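    The rocket equation Heldmann's team invokes is Tsiolkovsky's, and a few lines of code make the exponential penalty concrete. The specific impulse below is an assumed round number, not a SpaceX specification:

```python
# Tsiolkovsky rocket equation: delta_v = Isp * g0 * ln(m0 / mf), so the
# required mass ratio m0/mf grows exponentially with delta-v.
from math import exp

def mass_ratio(delta_v_ms, isp_s, g0=9.81):
    """Initial-to-final mass ratio needed for a given delta-v (m/s)."""
    return exp(delta_v_ms / (isp_s * g0))

isp = 350.0  # seconds, assumed vacuum specific impulse

for dv_kms in (4, 8, 12):
    print(f"delta-v {dv_kms:>2} km/s -> m0/mf ~ {mass_ratio(dv_kms * 1000, isp):.1f}")

# Refueling in orbit splits one large delta-v budget into smaller legs, and
# each leg's far smaller mass ratio is paid separately.
```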

    Ideas include launching full-size drills rather than pint-size versions. “You can put a 100-foot [30-meter] drill on the vehicle and then just deploy it,” Heldmann says. “You don’t have to try and fold it up. That’s exciting because you can drill down into ice on Mars, which is very important for sustaining human exploration and also the search for life.” Starship could conceivably also offer a two-way delivery service, returning vast quantities of material to Earth from these and other worlds. “We’ve always been very cautious about the samples we return because we’ve been limited by the amount of mass,” Heldmann says. “With Starship, you can just load up that vehicle with rocks and ice and whatever else you find.”

    Meanwhile Martin Elvis of the Harvard-Smithsonian Center for Astrophysics and his colleagues have written a white paper on how Starship’s unique capabilities could be used to launch a wide variety of next-generation space telescopes to revolutionize astrophysics. One idea is an extension of the Event Horizon Telescope, a “virtual” observatory on Earth used in 2019 to capture the first-ever image of a supermassive black hole. In a single launch, Starship could send a stack of six-meter telescopes into space, allowing for the creation of a much larger virtual telescope. That could provide views of “thousands of supermassive black holes” found at the centers of galaxies like our own, Elvis says.

    Starship—and the SLS—could also launch a large telescope custom-built to image Earth-like exoplanets around other stars, as recommended to NASA by the National Academies’ Astronomy and Astrophysics Decadal Survey in November 2021. “The diameter of mirror the Decadal report suggested was six meters, which is about the same as the JWST,” Elvis says. But with a superrocket’s large payload fairing, such a mirror could be monolithic, without any need to unfold and deploy in space, likely resulting in major cost savings and a speedier path to the launchpad. “That would simplify the design dramatically,” Elvis says.

    A Cavalcade of Rockets

    The SLS and Starship are not the only options for future heavy exploration of the solar system. The Washington State–based company Blue Origin, founded by Jeff Bezos, is working on a reusable rocket called New Glenn that it says could loft 45 metric tons into Earth orbit. And New Glenn’s successor New Armstrong is expected to be even more powerful. Both Blue Origin rockets could play a role in the scientific exploration of the solar system, although their true capabilities are unknown. China, meanwhile, is working on its own superheavy rocket called the Long March 9 to transport humans and machinery to the moon and Mars as early as the 2030s. It is touted as being able to lift as much as 140 metric tons to Earth orbit, says Andrew Jones, a space journalist who closely follows the Chinese space program.

    “They’re set on a super rocket,” Jones says. “We’re seeing China become more and more interested in planetary exploration—and even looking beyond the boundaries of the solar system.” That latter notion is also something the U.S. is considering with a proposed mission called Interstellar Probe, which may need to rely on the SLS or a similarly sized rocket in order to reach its full scientific potential if it is selected by the upcoming Heliophysics Decadal Survey from the National Academies. “Without SLS or larger launch vehicles, you could not do the Interstellar Probe as intended,” says Runyon, who is planetary science lead for the proposal.

    Some have wondered if this new generation of super-heavy-lift vehicles is needed at all and whether multiple smaller launchers could send spacecraft components into orbit for subsequent assembly by astronauts or robots. That same modular approach could also be used to launch rocket fuel to fill orbital depots, potentially offering similar enhancements to in-space capabilities without the need for a giant rocket. This fuel-depot idea is rumored to have been much maligned by NASA in the early days of the SLS’s development because it undercut the rationale for the program in the first place. George Sowers, former chief scientist at the United Launch Alliance (ULA) and now at the Colorado School of Mines, says he had worked on such ideas at ULA a decade ago but was asked to stop. “It got really political,” he says. “We were basically told to sit down and shut up.” NASA would later change its tune, and the agency has since selected ULA and others to demonstrate in-space refueling and depot technology.

    Daniel Dumbacher, now executive director of The American Institute of Aeronautics and Astronautics and previously part of the leadership at NASA that selected the SLS for development in 2010, says other options were considered. The agency looked at a variant of the SLS that burned kerosene rather than the liquid hydrogen and oxygen of the version that was ultimately chosen. The agency also considered an approach using smaller rockets, such as ULA’s Atlas V or SpaceX’s Falcon Heavy, launched in tandem. Ultimately, however, such an option was deemed too complex and expensive. “We did look at an option of what it would take if we utilized Atlas V- and Falcon Heavy-class vehicles,” Dumbacher says. “It was down selected out because it had negative effects on mission reliability, and it was more costly because of the number of launches required to execute the mission.” More than 10 launches would have been needed to replicate a single SLS launch, he says.

    There is no denying that the SLS is an expensive machine. Yet given its technological maturity, if costs can be brought down, it remains a promising option for future scientific missions. Starship, meanwhile, represents something entirely new in space exploration. There is much that has yet to be proved, including the launch and landing of the giant rocket and its ability to refuel in space. But if those hurdles can be overcome, future exploration of the solar system and the cosmos may no longer be limited mostly by rockets but rather by human imagination. “There’s a ton of excitement about what really high-performance rockets will enable,” Runyon says. “The solar system really opens up in a way that’s never been done before.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 1:52 pm on April 12, 2022 Permalink | Reply
    Tags: "Spy Satellites Confirmed Our Discovery of the First Meteor from the Solar System", Scientific American   

    From Scientific American: “Spy Satellites Confirmed Our Discovery of the First Meteor from the Solar System” 

    From Scientific American

    April 12, 2022
    Amir Siraj

    A high-speed fireball that struck Earth in 2014 looked to be interstellar in origin, but verifying this extraordinary claim required extraordinary cooperation from secretive defense programs.

    Credit: janiecbros/Getty Images.

    “On January 8, 2014, at 17:05:34 UT, an approximately meter-sized rock from space streaked through the sky off the coast of Manus Island, Papua New Guinea, burning up with an energy equivalent to about 110 metric tons of TNT and raining debris into the depths of the Pacific Ocean. Similar-sized fireballs are not uncommon occurrences in Earth’s skies; in fact, a few dozen of them occur each year. But what was unusual about this particular meteor was the very high speed and unusual direction at which it encountered our planet, which collectively suggested it came from interstellar space.
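    Those reported figures hang together, as a rough kinetic-energy cross-check shows. The density used below is an assumption about stony material, not a reported value:

```python
# Rough cross-check: does ~110 tons of TNT at ~45 km/s imply a roughly
# meter-sized rock? The density is an assumed value for stony material.
from math import pi

E = 110 * 4.184e9  # 110 metric tons of TNT, in joules
v = 45e3           # reported atmospheric-impact speed, m/s
rho = 3000.0       # assumed density, kg/m^3

m = 2 * E / v**2                              # from E = (1/2) m v^2
d = 2 * (3 * m / (4 * pi * rho)) ** (1 / 3)   # diameter of an equivalent sphere
print(f"mass ~ {m:.0f} kg, diameter ~ {d:.1f} m")  # roughly 450 kg, 0.7 m
```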

    Sensors on a classified U.S. government satellite designed to detect foreign missile launches were the sole known witnesses to the fireball. Thanks to a partnership between the Department of Defense and The National Aeronautics and Space Administration, the data describing the event eventually were shared on a public database hosted by the Center for Near Earth Object Studies (CNEOS) at the space agency’s Jet Propulsion Laboratory, along with data for more than 900 other fireballs recorded by U.S. government sensors between 1988 and the present day. The data for these events include dates, times, latitudes, longitudes, altitudes, speeds, three-dimensional velocity components and energies. Notably omitted from the database are the uncertainties for most of these measurements—presumably to ensure the precision thresholds for U.S. global sensing capabilities are not divulged, as this information could potentially be exploited by adversaries.

    My involvement with this meteor traces back to April 2019, when my academic adviser at Harvard University, astrophysicist Avi Loeb, brought the CNEOS fireballs catalogue to my attention. At the time, he and I were about eight months into our studies of data related to ‘Oumuamua, the object identified in October 2017 as the first known interstellar visitor to the solar system. Since ‘Oumuamua originated from outside of the solar system, each of its properties, including its very detection, conveyed previously inaccessible information about our cosmic neighborhood. With the wealth of knowledge carried by interstellar visitors foremost in our minds, Loeb and I had been pondering the possibility of finding others to study, and the CNEOS data seemed promising. Within days, I had identified the 2014 Manus Island fireball as a potential interstellar meteor candidate. Loeb then suggested that I use the speed of impact combined with knowledge of the kinematics of small-body populations in the solar system to estimate the probability that it originated from elsewhere, beyond our solar system. Contemplating this approach, I then proposed a more precise method to derive the object’s trajectory that accounted for the gravitational influences of our sun and its planets. Loeb agreed with my proposal, and I swiftly got to work.

    At Earth’s distance from the sun, any object moving faster than about 42 kilometers per second is in an unbounded, hyperbolic orbit relative to our star, meaning that it is too speedy to be captured by the sun’s gravity. Anything traveling over this local celestial speed limit, then, may come from (and if unimpeded should return to) interstellar space. The CNEOS entry for the 2014 Manus Island fireball indicated the meteor hit the Earth’s atmosphere at about 45 kilometers per second—very promising. However, some of this speed came from the object’s motion relative to the Earth and the Earth’s motion around the sun. Teasing apart these effects with the help of computer programs that I wrote, I found that the object had overtaken the Earth from behind before striking our atmosphere, and likely had a sun-relative speed closer to 60 kilometers per second. The corresponding orbit that I calculated was clearly unbound from the sun—even allowing for sizable measurement uncertainties. If the data were correct, this event would be the first interstellar meteor ever discovered. And it was hiding in plain sight.
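    The unbound-orbit test described here amounts to a sign check on the object's specific orbital energy. A minimal sketch, using the standard solar gravitational parameter and the speeds quoted above:

```python
# Sign check on specific orbital energy: a positive value at 1 AU means a
# hyperbolic orbit, unbound from the sun (i.e., interstellar).
from math import sqrt

GM_SUN = 1.327e20  # m^3/s^2, gravitational parameter of the sun
AU = 1.496e11      # m, Earth's distance from the sun

v_esc = sqrt(2 * GM_SUN / AU)
print(f"solar escape speed at 1 AU ~ {v_esc / 1e3:.1f} km/s")  # ~42 km/s

v_helio = 60e3  # m/s, the meteor's inferred sun-relative speed
energy = 0.5 * v_helio**2 - GM_SUN / AU  # specific orbital energy, J/kg
print("unbound (interstellar)" if energy > 0 else "bound to the sun")
```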

    Extraordinary claims, of course, require extraordinary evidence. So Loeb and I reverse-engineered estimates of the classified satellites’ measurement errors, using independently verified data on other fireballs in the CNEOS database and elsewhere in the scientific literature. After this arduous reality check, we were left with the same astonishing conclusion: the 2014 fireball had clearly originated from interstellar space. In short order, we drafted a paper reporting our discovery for peer-reviewed publication.

    Journal referees balked at the unknown nature of the error bars, so we enlisted the help of Alan Hurd and Matt Heavner, scientists at The DOE’s Los Alamos National Laboratory with high-level security clearances as well as an interest in promoting collaboration with the public sector to enable blue-sky science. In short order, Heavner made contact with the anonymous analyst who had derived the meteor’s velocity components from the classified satellite observations, and who confirmed that the relevant uncertainties for each value were no higher than 10 percent. Plugged into our error analysis, this implied an interstellar origin with 99.999 percent certainty, but the paper was again turned down by referees, who raised objections about the fact that the statement about uncertainties was a private communication with an anonymous U.S. government employee, and not an official statement from the U.S. government, which Heavner had difficulty in procuring. After several further failed attempts to pierce the veil of secrecy to the satisfaction of journal reviewers, we regretfully moved on to other research, leaving the true nature of the 2014 meteor unconfirmed.

    A year later, however, we were approached by Pete Worden, the chair of the Breakthrough Prize Foundation, with an introduction to Matt Daniels, who at the time was working for the Office of the Secretary of Defense. Daniels had read our preprint about the 2014 meteor and wished to help confirm its origin from within the U.S. government. After a year of laboriously navigating multiple layers of government bureaucracy, in March/April 2022 Daniels was able to procure official confirmation from Lt. Gen. John Shaw, deputy commander of the U.S. Space Force, and Joel Mozer, chief scientist of the branch’s Space Operations Command, of the relevant uncertainties—and thus effective confirmation that the meteor was of true interstellar origin.

    Three years after our original discovery, the first object originating from outside of the solar system observed to strike the Earth—the first known interstellar meteor—has officially been recognized. The 2014 meteor is also the first recorded interstellar object to be detected in the solar system, predating ‘Oumuamua by over three years, and is one of three interstellar objects confirmed to date, alongside ‘Oumuamua and the interstellar comet Borisov.

    The 2014 object’s interstellar nature carries fascinating consequences. Its size implies that each star needs to contribute a significant mass of similar objects over its lifetime to make the 2014 detection likely—suggesting there are many more interstellar meteors to be found. And its high speed relative to the average speeds of our neighboring stars suggests that it could have been ejected from deep within another planetary system, relatively close to its star. This is surprising, as one would naively expect most interstellar objects to instead originate from far more distant circumstellar regions where escape velocities are lower, namely the clouds of comets that exist at the outskirts of many star systems.

    This new field, the study of interstellar meteors, certainly has much to tell us about our place in the cosmos. Further investigations of the observed properties of the 2014 meteor could reveal new insights about our local interstellar environment, especially when compared with the characteristics of its successors, ‘Oumuamua and Borisov. Meteor databases are ripe for follow-on searches, and fresh motivations exist for building new sensing networks, with a focus on detecting future interstellar meteors. Observing an interstellar meteor burn up in real time would allow for the study of its composition, yielding novel insights into the chemistry of other planetary systems.

    The holy grail of interstellar object studies would be to obtain a physical sample of an object that originated outside the solar system—a goal as audacious as it would be scientifically groundbreaking. We are currently investigating whether a mission to the bottom of the Pacific Ocean off the coast of Manus Island, in hopes of recovering fragments of the 2014 meteor, would be feasible, let alone fruitful. Any sufficiently large interstellar meteor discovered in the future should also produce a shower of debris, which we could potentially track down and analyze. There is, of course, another approach to getting samples, one that, as director of interstellar object studies for the Galileo Project, I am excited to be pursuing: a spacecraft rendezvous. In collaboration with Alan Stern, the principal investigator of NASA’s New Horizons mission, we have now received funding to develop a concept for a space mission to some future interstellar object.

    Like exotic seashells, these messengers from the stars have been washing ashore on our planetary beach for billions of years, each carrying secrets of their—and our—cosmic origins. Now, at last, we are starting to comb the shoreline.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 1:15 pm on April 5, 2022 Permalink | Reply
    Tags: "Swarms of Black Holes at the Milky Way’s Heart? Maybe Not", , , , , , Scientific American   

    From “Scientific American” : “Swarms of Black Holes at the Milky Way’s Heart? Maybe Not” 

    From “Scientific American”

    April 5, 2022
    Lyndie Chiou

    1
    Hundreds of thousands of stars crowd the Milky Way’s heart in this infrared view from the Hubble Space Telescope. Credit: V. Bajaj (STScI), T. Do, NASA, ESA and the Hubble Heritage Team (STScI/AURA); acknowledgment: Andrea Ghez, University of California, Los Angeles.

    What lurks at the Milky Way’s heart? Astronomers have known most of the answer for decades. Just as in most large galaxies, a supermassive black hole sits at the core of our own island in the universe, enveloped in a swirling maelstrom of molecular clouds and stars.

    But something seems to be missing from this picture: the stellar-mass black holes that can be produced when the heaviest stars die. Theorists have long predicted that such black holes should exist in abundance in our galaxy’s star-packed center, but evidence for their presence there and in the so-called nuclei of other large nearby galaxies has proved scarce. Astronomers chalked this up to black holes simply being too faint to pick out against the noisy x-ray background.

    Four years ago Chuck Hailey, an astrophysicist at Columbia University, published a paper in Nature in which he and his co-authors argued they had at last spied the Milky Way’s hidden swarm of black holes—or rather the swarm’s most flamboyant members. If some of those black holes were binaries—meaning they orbited alongside another object, usually a star—they should siphon off material from such companions that, in the process, would heat up and emit detectable x-rays. Using more than a decade of archival data from NASA’s Chandra X-Ray Observatory, Hailey and his colleagues found a dozen previously unknown sources of x-ray emission in the vicinity of the Milky Way’s core—each of which, they say, likely represents a black hole feasting on a star. Collectively, the dozen candidates would thus betray the presence of a population of thousands upon thousands of unseen, quieter kin.

    A Matter of Timing

    Tom Maccarone, an astrophysicist at Texas Tech University and lead author of a new study challenging that claim, which appeared in Monthly Notices of the Royal Astronomical Society last month, notes that no one is disputing the existence of the tantalizing x-ray sources reported by Hailey and company, and some of those sources may indeed be linked to black holes. Rather, Maccarone and his co-authors take issue with the previous paper’s analysis and interpretation in a way that, they say, undermines its core conclusions.

    To prove his point, he and his co-authors examined neutron star data from NASA’s Rossi X-ray Timing Explorer (RXTE), which launched in 1995 on a 16-year mission to survey the entire sky for time-dependent x-ray emissions.

    Their analysis showed that RXTE found numerous flaring neutron stars with recurrence times of more than a decade, and it found many more that flared once but have yet to repeat. The breadth, depth and duration of RXTE’s survey, and its resulting harvest of x-rays from flaring neutron star binaries, imply that tens of thousands of these systems could exist undiscovered throughout the Milky Way, Maccarone says. Simply put, their prevalence has been dramatically underestimated because their typical recurrence times exceed the durations of astronomers’ longest, most keen-eyed surveys.

    “Hailey’s team found objects that are either neutron stars or black holes in binaries,” Maccarone says. “Everyone agrees about that. The data is good. It shows the objects don’t have [recurring short-period] outbursts. Everyone agrees about that, too.” The heart of the debate, Maccarone says, is that “Hailey claimed this means they must be binary black holes because binary neutron stars [supposedly] outburst every five years or so. But he never checked if that is actually true. I did, and it’s not true.”

    Neutron stars are the city-sized remnants of expired massive stars. They are not black holes, but their gravitational pull is so strong that they can look and act like them, making neutron stars convincing black hole imposters, and they, too, are thought to be prevalent in the Milky Way’s core. Like black holes, they can capture and devour a companion, occasionally burping out huge blasts of x-rays as they feast. From x-ray emissions alone, it may be impossible to tell whether an x-ray binary system contains a black hole or an imposter neutron star.

    Some astronomers, Hailey and his co-authors among them, postulate that the distinction can be made by carefully monitoring the timing of an x-ray binary’s outbursts. X-ray binary systems containing a black hole, they argue, would flare less frequently than others harboring a neutron star. A paucity of data makes theorists uncertain, but their work suggests this may be true. Observers have yet to conclusively show that any timing difference exists at all, however. In search of answers to these questions, astronomers are—for now—fumbling in the dark at the frontiers of their field.

    Equating neutron star binaries with speedy recurring flares, Maccarone maintains, is nothing more than astronomical lore; it is a product of statistical artifacts in biased data sets. Specifically, he says, neutron stars that repeatedly flare every five to 10 years are overrepresented in astronomers’ catalogs because such systems simply offer more useful data than more quiescent neutron stars, whose outbursts are less frequent.

    Questionable Comparisons

    According to Harvard-Smithsonian Center for Astrophysics astronomer Kareem El-Badry, who was not a part of either team, this underscores the deeper, more general problem of nature’s indifference to our all-too-human limitations. Cosmic events can and do occur on timescales that dwarf the life spans of individuals and even entire civilizations, which perversely means that relying solely on observations is a surefire way for astronomers to get a biased view of what is out there. “You can imagine there might be a large population of neutron star binaries that have an outburst every million years, but you would never see them if you only observe for a few decades,” El-Badry says. “Recurrence time is not necessarily a good indicator for whether something is a black hole or a neutron star binary.”
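
    El-Badry’s point is easy to quantify with a toy model. If, purely for illustration, each binary’s outbursts are treated as a Poisson process with some mean recurrence time, the fraction of systems a fixed-length survey ever catches flaring collapses once recurrence times exceed the survey’s duration:

```python
import math

def detected_fraction(recurrence_yr: float, survey_yr: float = 16.0) -> float:
    """Probability a binary flares at least once during the survey,
    for outbursts modeled as a Poisson process (an illustrative
    simplification, not a claim about real outburst statistics)."""
    return 1.0 - math.exp(-survey_yr / recurrence_yr)

# A 16-year survey (roughly RXTE's lifetime) catches nearly every
# 5-year recurrer but essentially no million-year recurrers.
for recurrence in (5.0, 10.0, 100.0, 1e4, 1e6):
    print(f"recurrence {recurrence:>11,.0f} yr -> "
          f"detected fraction {detected_fraction(recurrence):.6f}")
```

    Catalogs built from such surveys are thus dominated by fast recurrers, which is exactly the selection bias Maccarone says created the lore that neutron star binaries must flare every five to 10 years.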

    In his defense, Hailey notes that he never claimed all of the dozen point sources were black hole binaries. Instead he showed that, given the sources’ characteristics, they were likely black hole candidates, and this statistically tips the scale toward a black hole swarm at the Milky Way’s heart. “Even a handful of black hole binaries in the galactic center implies the existence of hundreds to thousands of isolated black holes,” he says.

    Furthermore, he has criticisms for his critics, who, he says, have allowed biases to corrupt their own conclusions. The analysis by Maccarone and his colleagues, Hailey says, relies too heavily on neutron star data from globular clusters, which are ancient aggregations of old stars scattered across our galaxy. The myriad differences between the somewhat sedate globular clusters and the action-packed environs of the galactic center, he says, add up to invalidate comparisons between their respective populations of neutron stars.

    “An analogy would be that what Maccarone has done is try to extrapolate the number of skyscrapers in New York City by studying how many he observes in suburban Texas,” Hailey says. “These are different environments, and one will obtain misleading conclusions.”

    Additionally, Hailey and his colleagues say, they are puzzled as to why Maccarone would use the somewhat outdated RXTE data when more sensitive surveys of x-ray sources now exist.

    Maccarone responds that “using the higher-quality x-ray monitoring data on the galactic center region doesn’t help us because we don’t know the nature of the x-ray sources. It would require circular logic. Are they black hole binaries or neutron star binaries? That’s the very debate.”

    Answers Await

    For now, swarms of black holes remain the dominant explanation for the mysterious x-ray sources at the Milky Way’s heart, in keeping with that hypothesis’s more extensive theoretical legacy. The inconvenient truth may be that this puzzle has no one-size-fits-all solution and that a mix of black holes and neutron stars accounts for the observations. “The reality is: the universe would have to be pretty pathological for there to be no black holes in the galactic center and similarly for there to be no neutron stars,” Ford says.

    Further clarity may come relatively soon. An upgraded “next generation” version of the Karl G. Jansky Very Large Array of radio telescopes in New Mexico could begin operations toward the end of the 2020s and settle the debate. The array will be sensitive enough to measure the sources’ faint radio waves, and a simple comparison between those radio waves and the sources’ x-ray emissions should reveal whether any given source is a neutron star or a black hole.
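
    As a sketch of how such a comparison might work: accreting black holes tend to be more radio-loud at a given x-ray luminosity than neutron stars, so one crude discriminator is the ratio of radio to x-ray luminosity. The cutoff and the sample sources below are hypothetical placeholders, not published calibrations or real measurements:

```python
from dataclasses import dataclass

@dataclass
class XraySource:
    name: str
    l_radio: float  # radio luminosity, erg/s (hypothetical value)
    l_xray: float   # x-ray luminosity, erg/s (hypothetical value)

# Hypothetical cutoff: at a given x-ray luminosity, black hole binaries
# tend to sit at higher radio-to-x-ray ratios than neutron star binaries.
RATIO_CUTOFF = 1e-5  # placeholder, not a published calibration

def classify(src: XraySource) -> str:
    """Label a source by its radio-to-x-ray luminosity ratio."""
    ratio = src.l_radio / src.l_xray
    return "black hole candidate" if ratio > RATIO_CUTOFF else "neutron star candidate"

for src in (XraySource("GC source A", 3e28, 1e32),
            XraySource("GC source B", 2e26, 1e32)):
    print(f"{src.name} -> {classify(src)}")
```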

    “This controversy underscores the need for multiwavelength astronomical data sets,” Maccarone says. “The x-ray data can identify interesting sources, but now we need radio or infrared data to solve the problem.”

    For Hailey, the debate itself is its own sort of vindication—a testament to the impact of his 2018 paper and its role, however small, in bringing scientists closer to the truth. “The paper has accomplished more than I hoped for,” he says. “It has generated enormous theoretical interest and, I guess, more than a little controversy along the way. It all makes for good fun.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     