Tagged: NATURE

  • richardmitnick 12:15 pm on October 17, 2022
    Tags: "Renowned Arecibo telescope won’t be rebuilt — and astronomers are heartbroken", NATURE, The National Science Foundation has decided to open an educational centre on the site.

    From “Nature” : “Renowned Arecibo telescope won’t be rebuilt — and astronomers are heartbroken” 

    From “Nature”

    Alexandra Witze

    The National Science Foundation has decided to open an educational centre on the site.

    The Arecibo Observatory’s 305-metre-wide telescope dish was destroyed in late 2020, after supporting cables snapped. Credit: Ricardo Arduengo/AFP via Getty.

    After a world-famous radio telescope at Arecibo Observatory in Puerto Rico collapsed two years ago, many scientists hoped that the US National Science Foundation (NSF), which runs the facility, would eventually build a new one to replace it. Instead, the agency has announced that it will establish an educational centre for science, technology, engineering and maths (STEM) at the site. The revised plan might ramp down or dramatically alter the remaining research being done at Arecibo.

    “It’s heartbreaking,” says Héctor Arce, an astronomer at Yale University in New Haven, Connecticut, who is from Puerto Rico and has worked on Arecibo advocacy efforts. “To many it seems like yet another unjust way of treating the colonial territory of Puerto Rico.”

    The NSF says that it is following community recommendations in not planning to rebuild the large telescope and instead establishing the new educational centre. “We are not closing Arecibo,” says Sean Jones, head of the NSF’s directorate of mathematical and physical sciences. “We think this new approach and new centre will be catalytic in many areas.”

    The agency announced its plans in a call for proposals on 13 October. It requests ideas for setting up and running an educational centre at Arecibo, at a cost of US$1 million to $3 million per year over five years starting in 2023. That money might or might not include funds to operate research facilities at Arecibo still in use, such as a 12-metre radio antenna and a lidar system that uses lasers to study Earth’s atmosphere.

    The situation “could be worse,” says Abel Méndez, a planetary astronomer at the University of Puerto Rico at Arecibo who uses the 12-metre antenna for research and teaching. But “it could be much, much better”.

    “It is devastating to know that that’s their ultimate decision,” says Desirée Cotto-Figueroa, an astronomer at the University of Puerto Rico at Humacao. “Especially despite all the efforts made by the staff and scientists of the Arecibo Observatory and by the general scientific community to keep it working as the research centre of excellence that it has always been with the observing facilities that are left.”

    A powerhouse of education

    One major question is how the Arecibo site will draw students and teachers if there is little active research to participate in. “Yet the NSF calls for proposals for a world-class educational institution,” says Anne Virkki, a planetary scientist at the University of Helsinki in Finland. “How does anyone do that without the world-class scientists, engineers, and instruments?”

    The NSF says that it is asking for precisely those sorts of ideas. The new centre could support ongoing work in astronomy and planetary science, or it could focus on other areas such as the biological sciences, says James L. Moore III, the head of the NSF’s education and human resources directorate. “Here’s an opportunity to reimagine what the possibilities could be,” he says.

    Arecibo Observatory has long been a powerhouse of STEM education in Puerto Rico because of its renowned telescope and place in astronomical history. Students trained there have gone on to become professional astronomers and planetary scientists in many countries.

    The 305-metre-wide radio telescope that collapsed in 2020 played a key role in many scientific fields for more than half a century, including the search for extraterrestrial life, the discovery of the first extrasolar planets, the first indirect evidence of gravitational waves, and the study of near-Earth asteroids and fast radio bursts.

    The NSF has run the observatory since the 1970s, working with a series of contractors. It has been trying to wind down investment at Arecibo since 2006, to shift funding to newer astronomical facilities. Advocates rallied and research continued, but the observatory faced fresh challenges in 2017, when Hurricane Maria damaged much of the facility, and in early 2020 when a series of earthquakes caused more damage.

    Then came the collapse of the 305-metre dish. One of its crucial supporting cables failed in August 2020, then another in November of that year, and the NSF decided it was too structurally unsound to repair. An engineering investigation revealed five factors that contributed to the collapse, including the design of the cable system, deferred maintenance, and damage from hurricanes and earthquakes.

    An observatory no more

    Research has continued at the smaller facilities at Arecibo Observatory. Currently funded projects using those facilities will be able to finish up, Jones says, and scientists can propose to continue their use under the scope of the new educational centre.

    The lidar facilities include a potassium laser that studies the temperature of layers in Earth’s atmosphere, and a planned new instrument to probe aerosols such as atmospheric dust. The 12-metre antenna serves as a node in a long-distance astronomical network operated by European astronomers. Other research projects that use it include Méndez’s studies of red dwarf stars and the habitability of planets around them.

    Many who work with Arecibo instruments are now scrambling to figure out how to ramp down their research projects. Under the new plan, the site will no longer be called Arecibo Observatory — becoming instead the Arecibo Center for STEM Education and Research.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Nature” is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

  • richardmitnick 10:09 am on September 30, 2022
    Tags: "‘Bit of panic’: Astronomers forced to rethink early Webb telescope findings", NATURE, Revised instrument calibrations are bedevilling work on the distant Universe.

    From “Nature” : “‘Bit of panic’: Astronomers forced to rethink early Webb telescope findings” 

    From “Nature”

    Alexandra Witze

    Revised instrument calibrations are bedevilling work on the distant Universe.

    Astronomers have been poring over early data from the James Webb Space Telescope. Credit: Nolan Zunk/University of Texas at Austin.

    Astronomers have been so keen to use the new James Webb Space Telescope that some have got a little ahead of themselves. Many started analysing Webb data right after the first batch was released, on 14 July, and quickly posted their results on preprint servers — but are now having to revise them. The telescope’s detectors had not been calibrated thoroughly when the first data were made available, and that fact slipped past some astronomers in their excitement.

    The revisions don’t so far appear to substantially change many of the exciting early results, such as the discovery of a number of candidates for the most distant galaxy ever spotted. But the ongoing calibration process is forcing astronomers to reckon with the limitations of early data from Webb.

    Figuring out how to redo the work is “thorny and annoying”, says Marco Castellano, an astronomer at the Italian National Institute of Astrophysics in Rome. “There’s been a lot of frustration,” says Garth Illingworth, an astronomer at the University of California, Santa Cruz. “I don’t think anybody really expected this to be as big of an issue as it’s becoming,” adds Guido Roberts-Borsani, an astronomer at the University of California, Los Angeles.

    Calibration is particularly challenging for projects that require precise measurements of the brightness of astronomical objects, such as faint, faraway galaxies. For several weeks, some astronomers have been cobbling together workarounds so that they can continue their analyses [1]. The next official round of updates to Webb’s calibrations is expected in the coming weeks from the Space Telescope Science Institute (STScI) in Baltimore, Maryland, which operates the telescope. Those updates should shrink the error bars on the telescope’s calibrations from the tens of percentage points that have been bedevilling astronomers in some areas, down to just a few percentage points. And data accuracy will continue to improve as calibration efforts proceed over the coming months.

    This is the first scientific image from the Webb telescope to be released publicly, on 11 July, showing a deep-field view of the sky that includes a number of distant galaxies. Credit: NASA, ESA, CSA, STScI.

    The STScI made it clear that the initial calibrations to the telescope were rough, says Jane Rigby, operations project scientist for Webb at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. Much of the issue stems from the fact that Webb, which launched in December 2021, is a new telescope whose details are still being worked out. “It’s been a long time since the community has had a brand-new telescope in space — a big one with these amazingly transformative powers,” Rigby says.

    “We knew it wasn’t going to be perfect right out of the box,” says Martha Boyer, an astronomer at the STScI who is helping to lead the calibration efforts [2].

    Calibration controversy

    All telescopes need to be calibrated. This is usually done by observing a well-understood star such as Vega, a prominent star in the night sky. Astronomers look at the data being collected by the telescope’s various instruments — such as the brightness of the star in different wavelengths of light — and compare them with measurements of the same star from other telescopes and of laboratory standards.
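    The comparison step described above can be sketched in a few lines. The bands, reference fluxes and counts below are hypothetical illustrations, not real Webb filters or measurements; the point is only how a per-band scale factor is derived from a well-understood star and then applied to science targets.

```python
# Sketch of flux calibration against a reference star (hypothetical
# numbers, not real Webb data). A scale factor mapping raw counts to
# physical flux is derived per wavelength band from a star whose true
# brightness is already known, then applied to science targets.

# Known reference fluxes for a standard star (hypothetical values, mJy)
reference_flux = {"F115W": 620.0, "F200W": 410.0}

# Raw instrument counts measured for the same star in each band
measured_counts = {"F115W": 1.24e6, "F200W": 8.20e5}

# Per-band calibration factor: physical flux per raw count
calibration = {
    band: reference_flux[band] / measured_counts[band]
    for band in reference_flux
}

def calibrate(band: str, counts: float) -> float:
    """Convert raw counts for a science target into flux (mJy)."""
    return counts * calibration[band]

# A faint galaxy measured at 2,480 raw counts in the first band
print(round(calibrate("F115W", 2480.0), 3))
```

    A miscalibrated scale factor biases every brightness built on it in the same direction, which is why the early estimates rippled through so many distant-galaxy results at once.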

    Working with Webb data involves several types of calibration, but the current controversy is around one of the telescope’s main instruments, its Near Infrared Camera (NIRCam) [below]. In the six months after Webb launched, STScI researchers worked to calibrate NIRCam. But given the demands on Webb, they had only enough time to point it at one or two calibration stars, and to take data using just one of NIRCam’s ten detectors. They then estimated the calibrations for the other nine detectors. “That’s where there was a problem,” Boyer says. “Each detector will be a little bit different.”

    Within days of the first Webb data release, non-peer-reviewed papers began appearing on the arXiv preprint server, reporting multiple candidates for the most distant galaxy ever recorded. These studies relied on the brightness of distant objects, measured with Webb at various wavelengths. Then, on 29 July, the STScI released an updated set of calibrations that were substantially different from what astronomers had been working with.

    “This caused a little bit of panic,” says Nathan Adams, an astronomer at the University of Manchester, UK, who, along with his colleagues, pointed out the problem in a 9 August update to a preprint they had posted in late July [3]. “For those including myself who had written a paper within the first two weeks, it was a bit of — ‘Oh no, is everything that we’ve done wrong, does it all need to go in the bin?’”

    A young observatory

    To try to standardize all the measurements, the STScI is working through a detailed plan to point Webb at several types of well-understood star, and observe them with every detector in every mode for every instrument on the telescope [4]. “It just takes a while,” says Karl Gordon, an astronomer at the STScI who helps lead the effort.

    In the meantime, astronomers have been reworking manuscripts that describe distant galaxies on the basis of Webb data. “Everyone’s gone back over and had a second look, and it’s not as bad as we thought,” Adams says. Many of the most exciting distant-galaxy candidates still seem to be at or near the distance originally estimated. But other preliminary studies, such as those that draw conclusions about the early Universe by comparing large numbers of faint galaxies, might not stand the test of time. Other fields of research, such as planetary studies, are not affected as much because they depend less on these preliminary brightness measurements.

    “We’ve come to realize how much this data processing is an ongoing and developing situation, just because the observatory is so new and so young,” says Gabriel Brammer, an astronomer at the University of Copenhagen who has been developing Webb calibrations independent of the STScI.

    In the long run, astronomers are sure to sort out the calibration and become more confident in their conclusions. But for now, Boyer says, “I would tell people to proceed with caution — whatever results they might be getting today might not be quite right in six months, when we have more information. It’s just sort of, ‘Proceed at your own risk.’”


    1. Nardiello, D. et al. Preprint at https://arxiv.org/abs/2209.06547 (2022).

    2. Boyer, M. L. et al. Preprint at https://arxiv.org/abs/2209.03348 (2022).

    3. Adams, N. J. et al. Preprint at https://arxiv.org/abs/2207.11217 (2022).

    4. Gordon, K. D. et al. Preprint at https://arxiv.org/abs/2204.06500 (2022).

    National Aeronautics Space Agency/European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne][Europäische Weltraumorganization](EU)/ Canadian Space Agency [Agence Spatiale Canadienne](CA) James Webb Infrared Space Telescope annotated, finally launched December 25, 2021, ten years late.

    There are four science instruments on Webb: The Near InfraRed Camera (NIRCam), The Near InfraRed Spectrograph (NIRSpec), The Mid-InfraRed Instrument (MIRI), and The Fine Guidance Sensor/Near InfraRed Imager and Slitless Spectrograph (FGS-NIRISS). Webb’s instruments are designed to work primarily in the infrared range of the electromagnetic spectrum, with some capability in the visible range. It is sensitive to light from 0.6 to 28 micrometers in wavelength.
    National Aeronautics Space Agency Webb NIRCam.

    The European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne][Europäische Weltraumorganization](EU) Webb MIRI schematic.

    Webb Fine Guidance Sensor-Near InfraRed Imager and Slitless Spectrograph FGS/NIRISS.

    See the full article here.



  • richardmitnick 9:47 am on September 6, 2022
    Tags: "Scientists Warn the World Is Not Ready For The Next Super-Eruption", NATURE

    From The University of Birmingham (UK) And The University of Oxford (UK) Via “Science Alert (AU)” And “Nature” : “Scientists Warn the World Is Not Ready For The Next Super-Eruption” 

    From The University of Birmingham (UK)



    The University of Oxford (UK)



    “Science Alert (AU)”



    Russell McLendon

    The 2021 eruption of Iceland’s Fagradalsfjall volcano. Credit: Jeroen Van Nieuwenhove.

    Even if humanity manages not to self-destruct with war or climate change, there are still other existential threats we must be ready for.

    Earth came pre-loaded with plenty of dangers long before we began piling on, some of which our species has still barely experienced.

    One of the flashier dangers comes from asteroids, like the one suspected of devastating the dinosaurs about 66 million years ago. As we try to anticipate our own doomsday, the cautionary tale of the dinosaurs seems to suggest we direct our vigilance upward.

    That makes sense, and humans are wisely preparing in ways the dinosaurs couldn’t, with investments in asteroid monitoring and even deflection.

    But as two researchers point out in a new commentary in the journal Nature, we shouldn’t let asteroid anxiety overshadow another colossal danger lurking under our noses: volcanoes.

    “Over the next century, large-scale volcanic eruptions are hundreds of times more likely to occur than are asteroid and comet impacts, put together,” write Michael Cassidy, a professor of volcanology at the University of Birmingham, and Lara Mani, a research associate at the Centre for the Study of Existential Risk at the University of Cambridge.

    While preparing for asteroids is prudent, we’re doing too little about the likelier event of a volcanic “super-eruption”, Cassidy and Mani argue.

    Governments and global agencies spend hundreds of millions of dollars annually on planetary defense, they write, including a new US experiment to fend off space rocks.

    NASA’s Double Asteroid Redirection Test (DART) mission will soon test the feasibility of asteroid deflection by trying to move an asteroid off course. The DART mission will cost about $330 million, and while that’s a bargain if it saves us from an asteroid, Cassidy and Mani note there is no comparable investment to prep for a super-eruption.

    “This needs to change,” they write.

    Volcanoes may be less exotic than fireballs from space, but that’s all the more reason to respect them: Volcanoes, unlike asteroids, are already here on Earth. They’re scattered all over the planet, often blanketed with picturesque scenery that belies their destructive potential.

    And while humans have seen lots of terrible eruptions in modern times, most pale in comparison to the supervolcanoes that erupt every 15,000 years or so.

    The last super-eruption of this kind happened about 22,000 years ago, according to the US Geological Survey. (A “super-eruption” is one with a magnitude of 8, the highest rating on the Volcanic Explosivity Index, or VEI.)

    The most recent magnitude-7 eruption occurred in 1815 at Mount Tambora, Indonesia, killing an estimated 100,000 people.

    The ash and smoke reduced global temperatures by about 1 degree Celsius on average, causing the “Year Without a Summer” in 1816. There were widespread crop failures, leading to famine, disease outbreaks, and violence.

    Volcano monitoring has improved since 1815, as has our ability to rally global support for disaster relief, but not necessarily enough to offset all the risks we now face.

    Earth’s human population has octupled since the early 1800s, Cassidy and Mani note, and some big urban areas have blossomed near dangerous volcanoes. We’re more reliant on global trade, too, so upheaval in one place can spur food shortages and other crises elsewhere.

    The peril posed by volcanoes may also be greater than we think. In a 2021 study based on data from ancient ice cores, researchers found the intervals between catastrophic eruptions are hundreds or even thousands of years shorter than previously believed.

    The history of many volcanoes remains murky, making it hard to anticipate future eruptions and focus resources where risks are highest. We need more research on ice cores as well as historical and geological records, Cassidy and Mani write, including marine and lake cores, especially in high-risk but data-poor regions like Southeast Asia.

    We also need more interdisciplinary research to help us predict how a super-eruption might cripple civilization, they add, by identifying risks to trade, agriculture, energy, and infrastructure, plus geographic “pinch points” where volcanic risks overlap with critical trade networks.

    More comprehensive volcano monitoring is vital, too, including ground-based monitoring as well as aerial and satellite observation. The researchers note volcanologists have long pined for a specialized volcano-observing satellite, which could boost preparedness beyond the current system of sharing existing satellites with other scientists.

    Community awareness and education is another key to resilience. People need to know if they live in volcanic danger zones, how to prepare for an eruption, and what to do when it happens.

    Beyond preparatory outreach, authorities also need ways to broadcast public alerts when volcanoes erupt, Cassidy and Mani write, like text messages with details about evacuations, tips for surviving an eruption, or directions to shelters and health-care facilities.

    The commentary was published in the journal Nature.

    See the full article here.



    The University of Birmingham (UK) has been challenging and developing great minds for more than a century. Characterized by a tradition of innovation, research at the University has broken new ground, pushed forward the boundaries of knowledge and made an impact on people’s lives. We continue this tradition today and have ambitions for a future that will embed our work and recognition of the Birmingham name on the international stage.

    The University of Birmingham is a public research university located in Edgbaston, Birmingham, United Kingdom. It received its royal charter in 1900 as a successor to Queen’s College, Birmingham (founded in 1825 as the Birmingham School of Medicine and Surgery), and Mason Science College (established in 1875 by Sir Josiah Mason), making it the first English civic or ‘red brick’ university to receive its own royal charter. It is a founding member of both the Russell Group (UK) of British research universities and the international network of research universities, Universitas 21.

    The student population includes 23,155 undergraduate and 12,605 postgraduate students, which is the 7th largest in the UK (out of 169). The annual income of the institution for 2019–20 was £737.3 million of which £140.4 million was from research grants and contracts, with an expenditure of £667.4 million.

    The university is home to the Barber Institute of Fine Arts, housing works by Van Gogh, Picasso and Monet; the Shakespeare Institute; the Cadbury Research Library, home to the Mingana Collection of Middle Eastern manuscripts; the Lapworth Museum of Geology; and the 100-metre Joseph Chamberlain Memorial Clock Tower, which is a prominent landmark visible from many parts of the city. Academics and alumni of the university include former British Prime Ministers Neville Chamberlain and Stanley Baldwin, the British composer Sir Edward Elgar and eleven Nobel laureates.

    Scientific discoveries and inventions

    The university has been involved in many scientific breakthroughs and inventions. From 1925 until 1948, Sir Norman Haworth was Professor and Director of the Department of Chemistry. He was appointed Dean of the Faculty of Science and acted as Vice-Principal from 1947 until 1948. His research focused predominantly on carbohydrate chemistry in which he confirmed a number of structures of optically active sugars. By 1928, he had deduced and confirmed the structures of maltose, cellobiose, lactose, gentiobiose, melibiose, gentianose, raffinose, as well as the glucoside ring tautomeric structure of aldose sugars. His research helped to define the basic features of the starch, cellulose, glycogen, inulin and xylan molecules. He also contributed towards solving the problems with bacterial polysaccharides. He was a recipient of the Nobel Prize in Chemistry in 1937.

    The cavity magnetron was developed in the Department of Physics by Sir John Randall, Harry Boot and James Sayers. This was vital to the Allied victory in World War II. In 1940, the Frisch–Peierls memorandum, a document which demonstrated that the atomic bomb was more than simply theoretically possible, was written in the Physics Department by Sir Rudolf Peierls and Otto Frisch. The university also hosted early work on gaseous diffusion in the Chemistry department when it was located in the Hills building.

    Physicist Sir Mark Oliphant proposed the construction of a proton-synchrotron in 1943; however, he made no assertion that the machine would work. In 1945, phase stability was discovered; consequently, the proposal was revived, and construction of a machine that could surpass proton energies of 1 GeV began at the university. Because of a lack of funds, however, the machine did not start up until 1953. The DOE’s Brookhaven National Laboratory (US) beat them to it: its Cosmotron started up in 1952 and was fully operational in 1953, before the Birmingham machine.

    In 1947, Sir Peter Medawar was appointed Mason Professor of Zoology at the university. His work involved investigating the phenomenon of tolerance and transplantation immunity. He collaborated with Rupert E. Billingham and they did research on problems of pigmentation and skin grafting in cattle. They used skin grafting to differentiate between monozygotic and dizygotic twins in cattle. Taking the earlier research of R. D. Owen into consideration, they concluded that actively acquired tolerance of homografts could be artificially reproduced. For this research, Medawar was elected a Fellow of the Royal Society. He left Birmingham in 1951 and joined the faculty at University College London (UK), where he continued his research on transplantation immunity. He was a recipient of the Nobel Prize in Physiology or Medicine in 1960.

  • richardmitnick 9:41 pm on August 10, 2022
    Tags: "How the revamped Large Hadron Collider will hunt for new physics", NATURE, The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][Organisation européenne pour la recherche nucléaire] [Europäische Organization für Kernfors, The particle-smashing machine has fired up again — sparking fresh hope it can find unusual results.

    From The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][Organisation européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH) [CERN] Via “Nature” : “How the revamped Large Hadron Collider will hunt for new physics” 



    25 May 2022 [In social media 8.10.22]
    Elizabeth Gibney

    Detectors at the ALICE experiment were revamped during the Large Hadron Collider’s 2018–22 shutdown. Credit: Maximilien Brice, Julien Marius Ordan/CERN.

    The hunt for new physics is back on. The world’s most powerful machine for smashing high-energy particles together, the Large Hadron Collider (LHC), has fired up after a shutdown of more than three years. Beams of protons are once again whizzing around its 27-kilometre loop at CERN, Europe’s particle-physics laboratory near Geneva. This July, physicists switched on their experiments and watched bunches of particles collide.


    The four main experiments at CERN’s Large Hadron Collider: ATLAS, ALICE, CMS and LHCb.

    In its first two stints, in 2009–13 and 2015–18, the LHC explored the known physics world. All of that work — including the triumphant 2012 discovery of the Higgs boson — reaffirmed physicists’ current best description of the particles and forces that make up the Universe: the standard model.

    But scientists sifting through the detritus of quadrillions of high-energy collisions have yet to find proof of any surprising new particles or anything else completely unknown.

    This time could be different. The LHC has so far cost US$9.2 billion to build, including the latest upgrades: version three comes with more data, better detectors and innovative ways to search for new physics. What’s more, scientists start with a tantalizing shopping list of anomalous results — many more than at the start of the last run — that hint at where to look for particles outside the standard model.

    “We’re really starting with adrenaline up,” says Isabel Pedraza, a particle physicist at the Meritorious Autonomous University of Puebla (BUAP) in Mexico. “I’m sure we will see something in run 3.”

    Higher energy and more data

    After renovations to its particle accelerators, the third version of the LHC will collide protons at 13.6 trillion electron volts (TeV) — slightly higher than in run 2, which reached 13 TeV. The more-energetic smashes should increase the chances that collisions will create particles in high-energy regions where some theories suggest new physics could lie, says Rende Steerenberg, who leads beam operations at CERN. The machine’s beams will also deliver more-compact bunches of particles, increasing the probability of collisions. This will allow the LHC to maintain its peak rate of collisions for longer, ultimately allowing experiments to record as many data as in the first two runs combined.

    To deal with the flood, the machine’s detectors — layers of sensors that capture particles that spray from collisions and measure their energy, momentum and other properties — have been upgraded to make them more efficient and precise.

    Credit: Nik Spencer/Nature; Source: CERN.

    A major challenge for LHC researchers has always been that so little of the collision data can be stored. The machine collides bunches 40 million times per second, and each proton–proton collision, or ‘event’, can spew out hundreds of particles. ‘Trigger’ systems must weed out the most interesting of these events and throw the bulk of the data away. For example, at CMS — one of the LHC’s four main experiments — a trigger built into the hardware makes a rough cut of around 100,000 events per second on the basis of assessments of properties such as the particles’ energies, before software picks out around 1,000 to reconstruct in full for analysis.
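    The two-stage triage described above can be sketched as a toy filter chain. The thresholds, event fields and rates below are invented for illustration; real trigger logic reconstructs far richer detector information at each stage.

```python
import random

random.seed(42)  # deterministic toy data

# Toy two-stage trigger (hypothetical thresholds, not real CMS cuts).
# Stage 1 mimics the hardware trigger: a cheap, rough cut on energy.
# Stage 2 mimics the software trigger: a stricter, more detailed pass.

def hardware_trigger(event):
    # Keep only events whose summed energy passes a rough threshold
    return event["energy"] > 200.0

def software_trigger(event):
    # Full reconstruction would happen here; we approximate it with
    # tighter requirements on energy and particle multiplicity
    return event["energy"] > 500.0 and event["n_particles"] >= 3

# Simulated collision events with an energy and a particle count
events = [
    {"energy": random.uniform(0, 1000), "n_particles": random.randint(1, 8)}
    for _ in range(100_000)
]

after_stage1 = [e for e in events if hardware_trigger(e)]
after_stage2 = [e for e in after_stage1 if software_trigger(e)]

print(len(events), len(after_stage1), len(after_stage2))
```

    The design point is that each stage must be cheap enough to keep up with the incoming rate while discarding the bulk of the data, so that only a tiny, interesting fraction is reconstructed in full.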

    With more data, the trigger systems must triage even more events. One improvement comes from a trial of chips originally designed for video games, called GPUs (graphics processing units). These can reconstruct particle histories more quickly than conventional processors can, so the software will be able to scan faster and across more criteria each second. That will allow it to potentially spot strange collisions that might previously have been missed.

    In particular, the LHCb experiment has revamped its detector electronics so that it will use only software to scan events for interesting physics. Improvements across the experiment mean that it should collect four times more data in run 3 than it did in run 2. It is “almost like a brand new detector”, says Yasmine Amhis, a physicist at the Irène Joliot-Curie Laboratory of the Physics of the Two Infinities in Orsay, France, and a member of the LHCb collaboration.

    The LHCb’s ‘vertex locator’, placed close to the LHC’s beamline to see short-lived particles. Credit: Maximilien Brice, Julien Marius Ordan/CERN.

    Spotting anomalies

    Run 3 will also give physicists more precision in their measurements of known particles, such as the Higgs boson, says Ludovico Pontecorvo, a physicist with the ATLAS experiment. This alone could produce results that conflict with known physics — for instance, if measuring a property more precisely shrinks the error bars enough to put its value outside the standard model’s predictions.

    But physicists also want to know whether a host of odd recent results are genuine anomalies, which might help to fill some gaps in understanding about the Universe. The standard model is incomplete: it cannot account for phenomena such as dark matter, for instance. And findings that jar with the model — but are not firm enough to claim as a definite discrepancy — have popped up many times in the past two years.

    Credit: Nik Spencer/Nature; Source: CERN.

    The most recent is from the Tevatron collider at the Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois, which shut down in 2011.


    Researchers have spent the past decade poring through data from the Tevatron’s CDF experiment. In April, they reported [1] that the mass of the W boson, a fundamental particle that carries the weak nuclear force involved in radioactive decay, is significantly higher than the standard model predicts.

    That doesn’t chime with LHC data: measurements at ATLAS and LHCb disagree with the CDF data, although they are less precise. Physicists at CMS are now working on their own measurement, using data from the machine’s second run. Data from run 3 could provide a definitive answer, although not immediately, because the mass of the W boson is notoriously difficult to measure.

    B-meson confusion

    The LHC’s data have hinted at other anomalies. In particular, evidence has been building for almost a decade of odd behaviour in particles called B mesons. These transient particles, which quickly decay into others, are so named because they contain pairs of fundamental particles that include a ‘bottom’ or ‘beauty’ quark. LHCb analyses suggest that B-meson decays tend to produce electrons more often than they produce their heavier cousins, muons [2]. The standard model predicts that nature should not prefer one over the other, says Tara Shears, a particle physicist at the University of Liverpool, UK, and a member of the LHCb collaboration. “Muons are being produced about 15% less often than electrons, and it’s utterly bizarre,” she says.

    The result differs from the predictions of the standard model with a significance of around 3 sigma, or 3 standard deviations from what’s expected — which translates to a 3 in 1,000 chance that random noise could have produced the apparent bias. Only more data can confirm whether the effect is real or a statistical fluke. Experimentalists might have misunderstood something in their data or machine, but now that many of the relevant LHCb detectors have been replaced, the next phase of data-gathering should provide a cross-check, Shears says. “We will be crushed if [the anomaly] goes away. But that’s life as a scientist, that can happen.”
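    The quoted ‘3 in 1,000’ figure is just the two-sided tail probability of a normal distribution beyond 3 standard deviations, which can be checked with the standard library’s complementary error function:

```python
import math

def p_value(sigma: float) -> float:
    # Two-sided tail probability of a normal distribution beyond
    # +/- sigma standard deviations; no external libraries needed.
    return math.erfc(sigma / math.sqrt(2))

print(f"{p_value(3.0):.4f}")  # roughly 0.0027, i.e. ~3 in 1,000
print(f"{p_value(5.0):.1e}")  # ~6e-7, the conventional discovery threshold
```

    This is why 3-sigma hints are treated as intriguing rather than conclusive: particle physics reserves the word ‘discovery’ for 5 sigma, where the chance of a random fluke drops below about one in a million.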

    The anomaly is backed up by similar subtle discrepancies that LHCb has seen in other decays involving bottom quarks; experiments at colliders in Japan and the United States have also seen hints of this odd result. This kind of work is LHCb’s métier: its detectors were designed to study in detail the decays of particles that contain heavy quarks, allowing the experiment to gather indirect hints of phenomena that might influence these particles’ behaviour. CMS and ATLAS are more general-purpose experiments, but experimenters there are now checking to see whether they can spot more of the events that are sensitive to the anomalies, says Florencia Canelli, an experimental particle physicist at the University of Zurich in Switzerland and member of the CMS collaboration.

    Hunt for the leptoquark

    CMS and ATLAS will also do what LHCb cannot: comb collision data to look directly for the exotic particles that theorists suggest could be causing the still-unconfirmed anomalies. One such hypothetical particle has been dubbed the leptoquark, because it would, at high energies, take on properties of two otherwise distinct families of particles — leptons, such as electrons and muons, and quarks (see ‘Decoding decays’). This hybrid particle comes from theories that seek to unite the electromagnetic, weak and strong fundamental forces as aspects of the same force, and could explain the LHCb results. The leptoquark — or a complex version of it — also fits with another tantalizing anomaly: a measurement last year [3], from the Muon g − 2 experiment at Fermilab, that muons are more magnetic than expected.

    Credit: Nik Spencer/Nature.

    At the Moriond particle-physics conference in La Thuile, Italy, in March, CMS researchers presented results of a search that found intriguing hints of a beyond-standard-model lepton. This particle would interact with leptoquarks and is predicted by some leptoquark theories. Physicists saw a slight excess of the particles that the proposed lepton could decay into, bottom quarks and taus (heavier cousins of the muon), but the finding’s significance is only 2.8σ. “Those are very exciting results, as LHCb is also seeing something similar,” says Pedraza. CMS physicists presented hints of other new phenomena at the conference: two possible particles that might decay into two taus, and a potential high-energy particle that, through a theorized but unproven decay route, would turn into distinctive particle cascades termed jets.

    Another intriguing result comes from ATLAS, where Ismet Siral at the University of Oregon in Eugene and his colleagues looked for hypothetical heavy, long-lived charged particles. In trillions of collisions from three years of data, they found seven candidates at around 1.4 TeV, around eight times the mass of the heaviest known particle [4]. Those results have a significance of 3.3 sigma, and the identity of the candidate particles remains a mystery. “We don’t know if this is real, we need more data. That’s where run 3 comes in,” says Siral.

    CERN’s 86-metre long Linac4 accelerator, which produces proton beams for the Large Hadron Collider. Credit: Robert Hradil, Monika Majer/ProStudio22.ch/CERN.

    Another LHC experiment, ALICE, will explore its own surprising finding: that the extreme conditions created in collisions between lead ions (which the LHC smashes together when not working with protons) might crop up elsewhere. ALICE is designed to study quark–gluon plasma, a hot, dense soup of fundamental particles created in collisions of heavy ions that is thought to have existed just after the Big Bang. Analyses of the first two runs found that particles in proton–proton and proton–lead ion collisions show some traits of this state of matter, such as paths that are correlated rather than random. “It’s an extremely interesting, unexpected phenomenon,” says Barbara Erazmus, deputy spokesperson for ALICE at CERN.

    Like LHCb, ALICE has had a major upgrade, including updated electronics to provide it with a faster software-only trigger system. The experiment, which will probe the temperature of the plasma as well as precisely measuring particles that contain charm and beauty quarks, will be able to collect 100 times more events this time than in its previous two runs, thanks to improvements across its detectors.

    Machine learning aids the search

    Run 3 will also see entirely new experiments. FASER, half a kilometre from ATLAS, will hunt for light, weakly interacting particles, including neutrinos and new phenomena that could explain dark matter. (These particles can’t be spotted by ATLAS, because they fly out of collisions on a trajectory that hugs the LHC’s beamline and evades the detectors.)

    Meanwhile, the ATLAS and CMS experiments now have improved detectors but will not receive major hardware upgrades until the next long shutdown, in 2026. At that point, the LHC will be overhauled to create more focused ‘high-luminosity’ beams, which will start up in 2029 (see ‘LHC timeline’). This will allow scientists in the following runs to collect 10 times more collision data than in runs 1 to 3 combined. For now, CMS and ATLAS have prototype technology to help them prepare.

    Credit: Nik Spencer/Nature; Source: CERN.

    As well as collecting more events, physicists such as Siral are keen to change the way in which LHC experiments hunt for particles. So far, much of the LHC’s research has involved testing specific predictions (such as searching for the Higgs where physicists expected to see it) or hunting for particular hypotheses of new physics.

    Scientists thought this would be a fruitful strategy, because they had a good steer on where to look. Many expected to find new heavy particles, such as those predicted by a group of theories known as supersymmetry, soon after the LHC started. That they have seen none rules out all but the most convoluted versions of supersymmetry. Today, few theoretical extensions of the standard model seem any more likely to be true than others.

    Experimentalists are now shifting to search strategies that are less constrained by expectations. Both ATLAS and CMS are going to search for long-lived particles that could linger across two collisions, for instance. New search strategies often mean writing analysis software that rejects the usual assumptions, says Siral.

    Machine learning is likely to help, too. Many LHC experiments already use this technique to distinguish particular sought-for collisions from the background noise. This is ‘supervised’ learning: the algorithm is given a pattern to hunt for. But researchers are increasingly using ‘unsupervised’ machine-learning algorithms that can scan widely for anomalies, without expectations. For example, a neural network can compare events against a learned simulation of the standard model. If the simulation can’t recreate the event, that’s an anomaly. Although this kind of approach is not yet used systematically, “I do think this is the direction people will go in,” says Sascha Caron of Radboud University Nijmegen in the Netherlands, who works on applying these techniques to ATLAS data.
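    A minimal stand-in for this idea, with invented data: score each event by how far it sits from anything a ‘standard-model’ reference sample can produce. A real analysis would use a neural network’s reconstruction error rather than a nearest-neighbour distance, and real events have far more than two features, but the logic is the same: flag what the simulation can’t mimic.

```python
import math
import random

random.seed(42)

# Toy "standard-model simulation": events described by two invented
# summary features, drawn from a known distribution.
sm_sample = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(500)]

def anomaly_score(event, reference):
    # Distance to the nearest simulated event: events that no simulated
    # event resembles get a high score and are flagged as anomalous.
    return min(math.dist(event, ref) for ref in reference)

typical = (0.1, -0.2)   # looks like something the simulation produces
weird = (6.0, 6.0)      # far from anything in the reference sample

print(anomaly_score(typical, sm_sample) < anomaly_score(weird, sm_sample))
```

    The appeal of the unsupervised approach is precisely that `weird` needs no predefined signature: anything sufficiently unlike the reference sample gets surfaced for human inspection.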

    In making searches less biased, the triggers that decide which events are interesting to look at are crucial, so it helps that the new GPUs will be able to scour candidate events with wider criteria. CMS will also use an approach called ‘scouting’: analysing rough reconstructions of all the 100,000 or so events initially selected but not saved in full detail. “It’s the equivalent of 10 years more of running your detector, but in one year,” says Andrea Massironi, a physicist with the CMS experiment.

    The triggers themselves could also soon rely on machine learning to make their choices. Katya Govorkova, a particle physicist at CERN, and her colleagues have come up with a high-speed proof-of-principle algorithm that uses machine learning to select which of the collider’s 40 million events per second to save, according to their fit with the standard model [5]. In run 3, researchers plan to train and test the algorithm on CMS collisions, alongside the experiment’s conventional trigger. A challenge will be knowing how to analyse events that the algorithm labels as anomalous, because it cannot yet point to exactly why an event is anomalous, says Govorkova.

    Physicists must keep an open mind about where they might find the thread that will lead them to a theory beyond the standard model, says Amhis. Although the current crop of anomalies is exciting, even previous oddities seen by multiple experiments turned out to be statistical flukes that faded away when more data were gathered. “It’s important that we continue to push all of the physics programme,” she says. “It’s a matter of not putting all your eggs in one basket.”

    Science articles:


    1. CDF Collaboration. Science (2022).
    2. LHCb collaboration. Nature Phys. (2022).
    3. Abi, B. et al. Phys. Rev. Lett. (2021).
    4. ATLAS Collaboration (2022).
    5. Govorkova, E. et al. Nature Mach. Intell. (2022).

    See the full article here .

  • richardmitnick 2:04 pm on August 10, 2022 Permalink | Reply
    Tags: "Particle physicists want to build the world’s first muon collider", , NATURE, , , , The accelerator would smash together this heavier version of the electron and researchers hope discover new particles.   

    From “Nature” : “Particle physicists want to build the world’s first muon collider” 

    From “Nature”

    Elizabeth Gibney

    The accelerator would smash together these heavier versions of the electron, and researchers hope it will discover new particles.

    Momentum is growing to build a particle collider in the United States that smashes muons — heavier cousins of electrons. The collider would follow the world’s next major accelerator, which is yet to be built, and physicists hope it would discover new elementary particles.

    Future colliders

    The muon machine would follow construction of a ‘Higgs factory’, a major collider that collaborations in Europe, China and Japan are already vying to build to study the elementary particle known as the Higgs boson in precise detail (see ‘Future colliders’). The Large Hadron Collider (LHC) at CERN, Europe’s particle-physics laboratory near Geneva, Switzerland, discovered the Higgs — which is associated with the field that gives particles mass — in 2012. But it did not discover the other new particles that many physicists expected, and some now believe that they might be beyond the machine’s reach.

    A Higgs factory would bring together electrons with their antimatter counterparts, positrons, in collisions that are cleaner than the proton–proton smashes in the LHC, allowing for precision studies. In contrast, a muon collider would be a ‘discovery’ machine, trying to find new particles through collisions of unprecedented energy and elucidating the cause of discrepancies found in results from previous experiments.

    Although muons’ short-lived nature makes such a collider technically difficult to build, its major advantage is that it would be smaller and potentially cheaper than competing collider designs. The vision remains distant, into the 2040s at the earliest, but research and development need to begin now, say its advocates.

    It’s a “bold and promising vision”, says Karri Di Petrillo, a particle physicist at The DOE’s Fermi National Accelerator Laboratory. Physicists around the world are mulling the feasibility of such a collider, but hosting it in the United States “would be a game changer for my generation of physicists”, she says.

    Support among physicists for a muon collider emerged during Snowmass, a major planning exercise by the US particle-physics community that sets out its scientific vision around once a decade.


    The exercise culminated in a ten-day workshop held in Seattle, Washington, from 17 to 26 July. Organizers will now distill the views of thousands of scientists into a report that describes the field’s major questions and what’s needed to solve them, which will ultimately influence US federal funding. Almost one-third of white papers that physicists contributed to the ‘energy frontier’ section of the exercise were about muon colliders, and excited supporters at the meeting sold T-shirts backing the plans.

    Higgs factory

    Calendar for future colliders

    Muons can be accelerated to higher energies than electrons because they lose less energy as synchrotron radiation. And they have a big advantage over proton collisions. These involve smashes between the particles’ constituent quarks, each of which carries just a fraction of the overall collision energy. Because muons are fundamental particles, every collision involves the particle’s entire energy. This means that a 10-trillion electronvolt (TeV) muon collider, at around 10 kilometres long, could produce particles that have as much energy as those produced by the 100-TeV, 90-kilometre proton machine that CERN is looking to build in the second half of the century.
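    The arithmetic behind that comparison is simple. Assuming each colliding parton (quark or gluon) carries roughly 10% of its proton’s energy — an illustrative round number, not a measured value — the effective parton–parton energy of a 100-TeV proton machine lands near a muon collider’s full 10 TeV:

```python
import math

# Rough momentum fraction carried by a single parton inside a proton.
# This 10% figure is an illustrative assumption for the sketch.
x1 = x2 = 0.1

proton_machine = 100.0  # TeV, the proposed future proton collider
muon_machine = 10.0     # TeV, a muon collider's full beam energy

# Effective energy of the underlying parton-parton collision:
# E_eff ~ sqrt(x1 * x2) * E_machine.
effective = math.sqrt(x1 * x2) * proton_machine

print(effective, "TeV vs", muon_machine, "TeV")
```

    Because muons are fundamental, no such dilution applies: every unit of beam energy is available to the collision itself, which is why a much smaller ring can reach comparable physics.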

    The muon-collider concept has been around since the 1960s. But only in recent years have viable technologies been developed that might be able to deal with the muon’s quirks, which include the fact that it decays readily, producing bothersome background noise, and is difficult to cajole into forming an intense beam. The excitement among US physicists now is because there is enough time to develop and build the machine to succeed a Higgs factory, and plenty of people to work on it, says Priscilla Cushman, a physicist at the University of Minnesota in Minneapolis.

    Whether it would be built in the United States depends on funding and politics, as well as technical feasibility, says Joel Butler, a particle physicist at Fermilab and chair of the Snowmass steering group. CERN is also organizing an international collaboration to study the feasibility of a muon collider. For all the collider options on the table, US physicists must do enough research and development “that when the choices have to be made, they can be made in a good way”, he says.

    Enthusiasm for the muon collider chimes with a growing focus on cost and sustainability, says Caterina Vernieri, a particle physicist at Stanford University in California, who is part of a group that pitched a cheaper Higgs-factory design, known as the “Cool Copper Collider”, as part of the Snowmass process.

    Dark matter

    Distant colliders were just a small part of the Snowmass agenda. Among their nearer-term plans, physicists highlighted their commitment to a high-intensity upgrade to the LHC from 2026 that will produce more than ten times the data created until that point. They also reiterated their desire to push ahead with a phased construction of a 1,300-kilometre, US-based experiment called DUNE, which is designed to investigate the nature of elusive particles called neutrinos. Some argued for the go-ahead on CMB-S4, a next-generation survey of the cosmic microwave background.

    A call that cut across disciplines was to ensure that a wide range of facilities exists to hunt for dark matter. The failure to find a theoretically predicted kind of dark matter known as weakly interacting massive particles (WIMPs) in the past ten years, either at the massive detectors designed to search for them or at the LHC, means that dark matter must be even more exotic than had been thought.

    Physicists want to look for much lighter candidates for dark matter, and to reframe their search to consider that it could exist as a whole family of particles, rather than just one, says Suchita Kulkarni, a dark-matter physicist at the University of Graz in Austria, who attended the Snowmass meeting. Finding it will take a few large and sensitive experiments — such as those already looking for WIMPS — and many more small, experimental ones, says Micah Buuck, a physicist at Stanford University.

    Funding recommendations

    The two-year Snowmass process, to which physicists from around the world submitted 521 papers, was “exhausting, but thrilling”, says Cushman, who is a member of the steering group.

    Crunch time will come next year, when the US federal Particle Physics Project Prioritization Panel, known as P5, will use Snowmass’s conclusions — and budget considerations — to make investment recommendations to funders at the Department of Energy and the National Science Foundation for the next ten years.

    Physicists are now working on how best to communicate with funders and the public, says Kulkarni. In the past decade, they haven’t found what many expected — a deviation from the standard model, their best description of particle physics, which they know to be incomplete. “The community is making an effort to set a consistent and honest narrative,” Kulkarni says. “We are doing the best we can, and will learn something from it. But discoveries are fickle mistresses, and you never know when you are going to get them.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ”Nature” is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

  • richardmitnick 10:47 am on July 6, 2022 Permalink | Reply
    Tags: "Happy birthday Higgs boson! What we do and don’t know about the particle", , , , , NATURE, , , ,   

    From “Nature” : “Happy birthday Higgs boson! What we do and don’t know about the particle” 

    From “Nature”

    04 July 2022
    Elizabeth Gibney

    Physicists are celebrating ten years since the Higgs boson’s discovery. But many of its properties remain mysterious.

    On 4 July 2012, physicists at CERN, Europe’s particle-physics laboratory, declared victory in their long search for the Higgs boson.



    The elusive particle’s discovery filled in the last gap in the standard model — physicists’ best description of particles and forces — and opened a new window on physics by providing a way to learn about the Higgs field, which involves a previously unstudied kind of interaction that gives particles their masses.

    Since then, researchers at CERN’s Large Hadron Collider (LHC) near Geneva, Switzerland, have been busy, publishing almost 350 scientific articles about the Higgs boson. Nevertheless, many of the particle’s properties remain a mystery.


    On the ten-year anniversary of the Higgs boson’s discovery, Nature looks at what it has taught us about the Universe, as well as the big questions that remain.

    5 things scientists have learned.

    The Higgs boson’s mass is 125 billion electronvolts.

    Physicists expected to find the Higgs boson eventually, but they didn’t know when.

    [Before the LHC at CERN, The DOE’s Fermi National Accelerator Laboratory had sought the Higgs with the Tevatron accelerator.

    But the Tevatron could barely muster 2 TeV [teraelectronvolts], not enough energy to find the Higgs. CERN’s LHC is capable of 13 TeV.

    Another possible attempt in the U.S. would have been the Superconducting Super Collider, cancelled in 1993.

    Fermilab has gone on to become a world powerhouse in neutrino research with the LBNF/DUNE project, which will send neutrinos 800 miles to SURF, the Sanford Underground Research Facility in Lead, South Dakota.]

    In the 1960s, physicist Peter Higgs and others theorized that what’s now called a Higgs field could explain why the photon has no mass and the W and Z bosons, which carry the weak nuclear force that is behind radioactivity, are heavy (for subatomic particles). The special properties of the Higgs field allowed the same mathematics to account for the masses of all particles, and it became an essential part of the standard model. But the theory made no predictions about the boson’s mass and therefore when the LHC might produce it.

    In the end, the particle emerged much earlier than expected. The LHC started gathering data in its search for the Higgs in 2009, and both ATLAS and CMS, the accelerator’s general-purpose detectors, saw it in 2012. The detectors observed the decay of just a few dozen Higgs bosons into photons, Ws and Zs, which revealed a bump in the data at 125 billion electronvolts (125 GeV), roughly 133 times the mass of the proton.

    The Higgs’ mass of 125 GeV puts it in a sweet spot that means the boson decays into a wide range of particles at a frequency high enough for LHC experiments to observe, says Matthew McCullough, a theoretical physicist at CERN. “It’s very bizarre and probably happenstance, but it just so happens that [at this mass] you can measure loads of different things about the Higgs.”

    The Higgs boson is a spin-zero particle.

    Spin is an intrinsic quantum-mechanical property of a particle, often pictured as an internal bar magnet. All other known fundamental particles have a spin of 1/2 or 1, but theories predicted that the Higgs should be unique in having a spin of zero (it was also correctly predicted to have zero charge).

    In 2013, CERN experiments studied the angle at which photons produced in Higgs boson decays flew out into the detectors, and used this to show with high probability that the particle had zero spin. Until this had been demonstrated, few physicists were comfortable calling the particle they had found the Higgs, says Ramona Gröber, a theoretical physicist at the University of Padua in Italy.

    The Higgs’ properties rule out some theories that extend the standard model.

    Physicists know that the standard model is not complete. It breaks down at high energies and can’t explain key observations, such as the existence of dark matter or why there is so little antimatter in the Universe. So physicists have come up with extensions to the model that account for these. Discovering the Higgs boson’s 125-GeV mass has made some of these theories less attractive, says Gröber. But the mass is in a grey zone, which means it rules out very little categorically, says Freya Blekman, a particle physicist at the German Electron Synchrotron (DESY) in Hamburg. “What we have is a particle that’s consistent with more or less anything,” she says.

    The Higgs boson interacts with other particles as the standard model predicts.

    According to the standard model, a particle’s mass depends on how strongly it interacts with the Higgs field. Although the boson — which is like a ripple in the Higgs field — doesn’t have a role in that process, the rate at which Higgs bosons decay into or are produced by any other given particle provides a measure of how strongly that particle interacts with the field. LHC experiments have confirmed that — at least for the heaviest particles, produced most frequently in Higgs decays — mass is proportional to interaction with the field, a remarkable win for a 60-year-old theory.
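    The mass–coupling relation can be made concrete. In the standard model, a fermion’s coupling to the Higgs field is y = √2·m/v, where v ≈ 246 GeV is the Higgs field’s vacuum expectation value. The sketch below uses rough published masses; the point is simply that coupling strength scales linearly with mass, which is what the LHC has been checking particle by particle.

```python
import math

HIGGS_VEV = 246.0  # GeV, the Higgs field's vacuum expectation value

def yukawa_coupling(mass_gev: float) -> float:
    # Standard-model relation y = sqrt(2) * m / v: the strength of a
    # fermion's interaction with the Higgs field grows with its mass.
    return math.sqrt(2) * mass_gev / HIGGS_VEV

# Approximate masses in GeV (rounded published values).
for name, mass in [("top quark", 172.8), ("bottom quark", 4.18), ("muon", 0.1057)]:
    print(f"{name}: y ~ {yukawa_coupling(mass):.4f}")
```

    The top quark’s coupling comes out close to 1, which is one reason the top dominates many Higgs-related processes, while the muon’s tiny coupling is why Higgs-to-muon decays took until 2020 to glimpse.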

    The Universe is stable — but only just.

    Calculations using the mass of the Higgs boson suggest that the Universe might be only temporarily stable, and there’s a vanishingly small chance that it could shift into a lower energy state — with catastrophic consequences.

    Unlike other known fields, the Higgs field has a lowest energy state above zero even in a vacuum, and it pervades the entire Universe. According to the standard model, this ‘ground state’ depends on how particles interact with the field. Soon after physicists discovered the Higgs boson’s mass, theorists used the value (alongside other measurements) to predict that there also exists a lower, more preferable energy state.

    Shifting to this other state would require the field to overcome an enormous energy barrier, says McCullough, and the probability of this happening is so small that it is unlikely to occur on the timescale of the lifetime of the Universe. “Our doomsday will be much sooner, for other reasons,” says McCullough.

    5 things scientists still want to know.

    Can we make Higgs measurements more precise?

    So far, the Higgs boson’s properties — such as its interaction strength — match those predicted by the standard model, but with an uncertainty of around 10%. This is not good enough to show the subtle differences predicted by new physics theories, which are only slightly different from the standard model, says Blekman.

    More data will increase the precision of these measurements, and the LHC has collected just one-twentieth of the total amount of information it is expected to gather. Seeing hints of new phenomena in precision studies is more likely than directly observing a new particle, says Daniel de Florian, a theoretical physicist at the National University of San Martín in Argentina. “For the next decade or more, the name of the game is precision.”

    Does the Higgs interact with lighter particles?

    Until now, the Higgs boson’s interactions have seemed to fit with the standard model, but physicists have seen it decay into only the heaviest matter particles, such as the bottom quark. Physicists now want to check whether it interacts in the same way with particles from lighter families, known as generations. In 2020, CMS and ATLAS saw one such interaction — the rare decay of a Higgs to a second-generation cousin of the electron called the muon [1]. Although this is evidence that the relationship between mass and interaction strength holds for lighter particles, physicists need more data to confirm it.

    Does the Higgs interact with itself?

    The Higgs boson has mass, so it should interact with itself. But such interactions — for example, the decay of an energetic Higgs boson to two less energetic ones — are extremely rare, because all the particles involved are so heavy. ATLAS and CMS hope to find hints of the interactions after a planned upgrade to the LHC from 2026, but conclusive evidence will probably take a more powerful collider.

    The rate of this self-interaction is crucial to understanding the Universe, says McCullough. The probability of self-interaction is determined by how the Higgs field’s potential energy changes near its minimum, which describes conditions just after the Big Bang. So knowing about the Higgs self-interaction could help scientists to understand the dynamics of the early Universe, says McCullough. Gröber notes that many theories that try to explain how matter somehow became more abundant than antimatter require Higgs self-interactions that diverge from the standard model’s prediction by as much as 30%. “I can’t emphasize enough how important” this measurement is, says McCullough.

    What is the Higgs boson’s lifetime?

    Physicists want to know the lifetime of the Higgs — how long, on average, it sticks around before decaying to other particles — because any deviation from predictions could point to interactions with unknown particles, such as those that make up dark matter. But its lifetime is too small to measure directly.

    To measure it indirectly, physicists look at the spread, or ‘width’, of the particle’s energy over multiple measurements (quantum physics says that uncertainty in the particle’s energy should be inversely related to its lifetime). Last year, CMS physicists produced their first rough measurement of the Higgs’ lifetime: 2.1 × 10^−22 seconds [2]. The results suggest that the lifetime is consistent with the standard model.
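    That indirect route rests on the uncertainty-principle relation Γ = ħ/τ between a particle’s decay width Γ (its energy spread) and its lifetime τ. Plugging in the lifetime quoted above recovers a width of roughly 3 MeV — a back-of-the-envelope consistency check, not the collaboration’s analysis:

```python
HBAR_MEV_S = 6.582e-22  # reduced Planck constant, in MeV * seconds

def width_from_lifetime(tau_s: float) -> float:
    # Quantum mechanics ties a particle's decay width (the spread in
    # its measured energy) to its lifetime: Gamma = hbar / tau.
    return HBAR_MEV_S / tau_s

tau = 2.1e-22  # seconds, the CMS lifetime measurement quoted above
print(f"Higgs width ~ {width_from_lifetime(tau):.1f} MeV")
```

    A width of a few MeV on a 125-GeV particle is a relative spread of about one part in 40,000, far below what detectors can resolve directly, which is why the width has to be inferred rather than read off an energy plot.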

    Are any exotic predictions true?

    Some theories that extend the standard model predict that the Higgs boson is not fundamental, but — like the proton — is made up of other particles. Others predict that there are multiple Higgs bosons, which behave similarly but differ, for example, in charge or spin. As well as checking whether the Higgs is truly a standard-model particle, LHC experiments will look for properties predicted by other theories, including decays into forbidden particle combinations.

    Physicists are just at the beginning of their efforts to understand the Higgs field, whose unique nature makes it “behave like a portal to new physics”, says de Florian. “There is a lot of room for excitement here.”


    1. The CMS Collaboration et al. J. High Energ. Phys. 2021, 148 (2021).

    2. The CMS Collaboration. Nature Physics.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Nature” is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

  • richardmitnick 5:07 pm on February 28, 2022 Permalink | Reply
    Tags: "Climate change is hitting the planet faster than scientists originally thought", NATURE

    From Nature: “Climate change is hitting the planet faster than scientists originally thought” 

    From Nature

    28 February 2022
    Jeff Tollefson

    Latest IPCC climate report warns that rising greenhouse-gas emissions could soon outstrip the ability of many communities to adapt.


    The climate crisis has already negatively affected places like Bangladesh, where river erosion has cost people their homes. Credit: Zakir Hossain Chowdhury/Barcroft Media/Getty.

    The negative impacts of climate change are mounting far faster than scientists predicted less than a decade ago, according to the latest report from a United Nations climate panel. Many impacts are unavoidable and will hit the world’s most vulnerable populations hardest, it warns — but collective action from governments to both curb greenhouse-gas emissions and to prepare communities to live with global warming could yet avert the worst outcomes.

    “The cumulative scientific evidence is unequivocal,” says Maarten van Aalst, a climate scientist who heads the Red Cross Red Crescent’s Climate Centre in Enschede, the Netherlands, and an author on the report. “Any further delay in global action on adaptation and mitigation will miss a brief and rapidly closing window of opportunity to secure a livable and sustainable future for all.”

    The report, released on 28 February, is the second installment of the latest climate assessment from the UN Intergovernmental Panel on Climate Change (IPCC). The first installment, issued last August, focused on the physical science of climate change; this one focuses on the impacts of climate change on people and ecosystems. It will be followed by a third installment in early April that evaluates humanity’s options for reducing greenhouse-gas emissions. This is the sixth such assessment from the IPCC in three decades, and the warnings have only become more dire. Advocates hope this latest assessment will finally spur governments to tackle the climate crisis decisively.

    “I’ve seen many scientific reports in my time, but nothing like this,” UN secretary-general António Guterres said during a press conference unveiling the report. It is a “damning indictment of failed climate leadership”, he added.

    Key points from the report:

    • About 3.3–3.6 billion people — more than 40% of the world’s population — live in places and situations that are “highly vulnerable to climate change”, the report estimates. Some are already experiencing the effects of climate change, which vary by region and are driven by factors such as geography, how that region is governed and its socioeconomic status. The report also references for the first time “historical and ongoing patterns of inequity such as colonialism” that contribute to many regions’ vulnerability to climate change.

    • Although additional finance and planning could help many communities better prepare for climate change, “hard limits” to humanity’s ability to adapt to climate change are approaching if temperatures continue to rise, the report says. For instance, coastal communities can temporarily buffer themselves from extreme storms by restoring coral reefs, mangroves and wetlands, but rising seas will eventually overwhelm such efforts, resulting in coastal erosion, flooding and loss of freshwater resources.

    • Climate change has already caused death and suffering across the world, and it will continue to do so. In addition to contributing to deaths by helping to trigger disasters such as fires and heat waves, it has affected public health in various ways. Smoke inhalation from fires has contributed to cardiovascular and respiratory problems, for instance, while increased rainfall and flooding have led to the spread of diseases such as dengue and cholera. Mental-health issues, tied to the trauma of living through extreme events and to people losing their livelihoods and culture, are also on the rise.

    • If global temperatures rise more than 1.5 °C above preindustrial levels, some environmental changes could become irreversible, depending on the magnitude and duration of the ‘overshoot’ beyond this threshold. In forests and Arctic permafrost zones that act as carbon dioxide reservoirs, for instance, extreme global warming could release the stored carbon, which would in turn drive further warming: a self-perpetuating cycle.

    • Sustainable economic development must include protection for biodiversity and natural ecosystems, which secure resources such as freshwater and coastlines that are protective against storms, the report says. Multiple lines of evidence suggest that maintaining the resilience of biodiversity and ecosystems as the climate warms will depend on “effective and equitable conservation of approximately 30% to 50% of Earth’s land, freshwater and ocean areas”.

    More than 270 researchers from 67 countries authored the latest IPCC report. Here’s what some are saying about its importance:

    Adelle Thomas, a geographer at the University of the Bahamas in Nassau: The most important message coming from the report, from my perspective, is that losses and damages are widespread and being felt now. Unfortunately, these negative impacts of climate change are disproportionately affecting the most vulnerable and marginalized communities around the world. Also critical is evidence showing that people and ecosystems are already reaching limits to adaptation, where they have surpassed their capacities to prevent negative impacts of climate change.

    As a scientist from The Bahamas, one of the low-lying coastal countries at high risk from climate change, I hope that this report provides an impetus for policymakers to limit warming to 1.5 °C, urgently ramp up adaptation and address loss and damage.

    Edwin Castellanos, director of the Sustainable Economic Observatory at the University of the Valley of Guatemala in Guatemala City: This report combines two messages, one of urgency and one of hope: urgency to act, not only to drastically reduce emissions in the near term … but to increase our actions to adapt to the impacts already observed and to come. And there is hope from knowing that we are still in time to take these actions.

    My hope is that this report will highlight the need for developed countries to support developing countries, particularly with financial resources to reduce the vulnerability of those at higher risk: the poor, the marginalized and Indigenous peoples.

    Sarah Cooley, director of climate science at the Ocean Conservancy, a conservation group based in Washington DC: This report assesses how local communities are rising to the challenge [of climate change] and have become leaders on climate adaptation and climate planning. It evaluates the climate adaptations that communities have already tried, and it identifies the features of successful, equitable activities, as well as opportunities for even bigger changes.

    It also confirms that any more delay in climate action is going to close off opportunities to head off the worst impacts of climate change. But the good news is, there are more details than ever about how the global community can meet the challenge effectively, despite our slow start.

    Ibidun Adelekan, a geographer at the University of Ibadan in Nigeria: The report underscores the fact that the capacity of individuals and local communities to cope and adapt to the risks from climate change is very limited without adaptation planning efforts supported by governments. There is need for collaboration among citizens, scientists, the private sector and policymakers to develop feasible adaptation plans, through the integration of different knowledge systems — including local and Indigenous knowledge.

    Rawshan Ara Begum, an economist from Bangladesh who studies sustainable development at Macquarie University in Sydney, Australia: This report provides a range of climate adaptation options for reducing vulnerability and enhancing resilience. As a citizen of a vulnerable country, I have hopes that global leaders [will take] urgent, accelerated action to adapt to climate change, while making rapid, deep cuts in greenhouse-gas emissions.

    Bangladesh is one of the most vulnerable countries in the world due to climate change and sea level rise. This will further worsen the country’s current challenges, including extreme poverty, income inequality, economic and non-economic losses and damages and low adaptive capacity. Urgent and accelerated action is required.

    See the full article here.



  • richardmitnick 8:39 pm on February 21, 2022 Permalink | Reply
    Tags: "A supernova could light up the Milky Way at any time. Astronomers will be watching", NATURE

    From Nature: “A supernova could light up the Milky Way at any time. Astronomers will be watching” 

    From Nature

    21 February 2022
    Davide Castelvecchi

    Supernova 1987A appears as a bright spot near the centre of this image of the Tarantula nebula, taken by the ESO Schmidt Danish Telescope. Credit: The European Southern Observatory [La Observatorio Europeo Austral] [Observatoire européen austral][Europäische Südsternwarte](EU)(CL)

    Masayuki Nakahata has been waiting 35 years for a nearby star to explode.

    He was just starting out in science the last time it happened, in February 1987, when a dot of light suddenly appeared in the southern sky. It was the closest supernova seen in modern times, and the event, known as SN 1987A, gained worldwide media attention and led to dramatic advances in astrophysics.

    SN 1987A remnant, imaged by ALMA. The inner region is contrasted with the outer shell, lacy white and blue circles, where the blast wave from the supernova is colliding with the envelope of gas ejected from the star prior to its powerful detonation. Image credit: ALMA / European Southern Observatory [La Observatorio Europeo Austral][Observatoire européen austral][Europäische Südsternwarte] (EU) / National Astronomical Observatory of Japan [国立天文台](JP) / National Radio Astronomy Observatory (US) / Alexandra Angelich, NRAO / Associated Universities Inc (US) / National Science Foundation (US).

    Nakahata was a graduate student at the time, working on what was then one of the world’s foremost neutrino catchers, the Kamiokande-II detector at the Kamioka Underground Observatory near Hida, Japan.

    He and a fellow student, Keiko Hirata, spotted evidence of neutrinos pouring out of the supernova — the first time anyone had seen these fundamental particles originating from anywhere outside the Solar System.

    Now, Nakahata, a physicist at the University of Tokyo, is ready for when a supernova goes off. He is head of the world’s largest neutrino experiment of its kind, Super-Kamiokande, where upgrades to its supernova alert system were completed late last year.

    The improvements will enable the observatory’s computers to recognize when it is detecting neutrinos from a supernova, almost in real time, and to send out an automated alert to conventional telescopes worldwide.

    Astronomers will be waiting. “It’s gonna give everybody the willies,” says Alec Habig, an astrophysicist at the University of Minnesota-Duluth. Early warning from Super-Kamiokande and other neutrino observatories will trigger robotic telescopes — in many cases responding with no human intervention — to swivel in the direction of the dying star to catch the first light from the supernova, which will come after the neutrino storm.

    But when the light arrives, it could be too much of a good thing, says Patrice Bouchet, an astrophysicist at The Paris-Saclay University [Université Paris-Saclay] (FR), who made crucial observations of SN 1987A from the La Silla Observatory in Chile.

    The brightest events, which would shine brighter than a full Moon and be visible during the day, would overwhelm the ultra-sensitive but delicate sensors in the telescopes used by professional astronomers.

    And some of the instruments Bouchet used back then no longer exist. “If η Carinae or Betelgeuse explode,” says Bouchet, referring to two well-known stars, “we are not ready to observe it as we did with ’87A.”

    Researchers will scramble to adapt their instruments on the fly, but the lion’s share of the observations could fall on amateur astronomers, who have smaller telescopes and are in many cases very proficient at using them.

    The scientific pay-off will nevertheless be immense. Supernovae have rarely been observed up close, but they are crucial for understanding how the chemical elements that were forged inside stars by nuclear fusion disperse across galaxies. And the stellar explosions themselves synthesize elements that would not exist otherwise. The neutrinos that Nakahata and others hope to capture will provide a unique window into the extreme physics that goes on inside an exploding star, and could lead to important discoveries about the fundamental forces and particles of nature.

    New light

    It was early in the morning of 24 February 1987, when Ian Shelton, the staff telescope operator at a Canadian observatory in Las Campanas, Chile, spotted an unexpected dot of light. It appeared on some routine exposures he had just taken of the Large Magellanic Cloud, a small galaxy that orbits the Milky Way and is visible in the southern sky.

    Shelton immediately realized that this could be a significant event. He stepped outside to look with his own eyes and, sure enough, noticed a bright star that had not been there before. It was the first such stellar object to be visible with the naked eye since the German astronomer Johannes Kepler recorded one in 1604.

    Supernovae are among the most energetic cataclysms in the cosmos, shining for a period of weeks or months, and in some rare cases emitting more light than an entire galaxy. Supernovae come in several types, but the most common occurs at the end of the life of a very large star, one with somewhere between 8 and 140 times the mass of the Sun.

    The star runs out of fuel for the nuclear fusion that had been powering it, leaving behind an inert core of iron and nickel in a state of plasma. The outer layers of the star begin to fall inwards, and the core starts to collapse. In a span of milliseconds, most of the matter in the core gets so compressed that protons and electrons combine to form neutrons. The core’s density suddenly rises by several orders of magnitude, because neutrons take up much less space than plasma. The neutrons pack into a denser ball — as dense as the laws of physics permit, forming what Habig calls a proto-neutron star inside the core.

    The formation of each neutron releases a neutrino, and so the core’s collapse releases a brief initial burst of neutrinos. But the cataclysm has only just begun. “The rest of the star is raining down on that proto-neutron star,” says Habig. After falling for thousands of kilometres in an intense gravitational field, the material hits the hard surface of the neutron core, bouncing back with a shock wave that propagates outwards. The shock wave is so violent that the rest of the star disintegrates, leaving only the neutron star as a remnant, which weighs around twice as much as the Sun.


    The Hubble Space Telescope captured SN 1987A in 2011 surrounded by a set of glowing rings. Credit: The National Aeronautics and Space Agency (US)/The European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne][Europäische Weltraumorganisation](EU) Hubble Telescope.

    During the collapse itself, the energy released by the falling matter smashes elementary particles together as happens in a high-energy collider, continuously turning energy into new particles of all kinds. “It’s so incredibly hot and dense, everything is happening,” says Kate Scholberg, an astrophysicist at Duke University in Durham, North Carolina.

    Most of those particles have nowhere to go and keep bumping into each other — with one exception. When a collision produces a neutrino, that particle will have a high chance of escaping into outer space without hitting anything else. As a result, many neutrinos are produced over a period of ten seconds or more. Researchers estimate that SN 1987A ejected 10^58 of these particles.

    On these time scales, neutrinos are by far the dominant way in which the supernova dissipates energy. Although the shock wave can take many hours to make it through the outer layers of the star and to become visible, neutrinos come out right away, practically at the speed of light. More than 99% of the energy from a core-collapse supernova escapes not as light, but as neutrinos.

    Eventually, most of the star’s original mass disperses into interstellar space. Over the following eons, it will trigger the formation of new stars and planets; our Solar System might have formed that way, some 5 billion years ago.

    The centre cannot hold

    On average, one or two Milky Way stars per century undergo core collapse, according to the most recent estimates [1]. Yet throughout history, only five supernovae have been recorded as being visible with the naked eye, with two thought to be of the core-collapse type [2]. There are various reasons for this discrepancy. If enough mass concentrates in the collapsing core, it forms a black hole without producing much of a light show. In perhaps the majority of cases, an explosion does happen, but remains hidden from view by thick interstellar dust in the plane of the Milky Way, where massive stars reside.

    Fortunately, the same physics that lets neutrinos escape a star’s core will also let them cross the dusty Galactic Centre unimpeded. This means that neutrino detectors on Earth will pick up a shower of neutrinos no matter what, and so will record collapsing stars that would not have been detected by any other means.

    And what a shower it will be. In 1987, Kamiokande-II was one of the world’s largest neutrino detectors. Its 3,000 tonnes of water picked up 11 neutrinos; experiments in Ohio and Russia captured a handful, too. If a similar event were to occur today, Super-Kamiokande, which opened in 1996 and holds 50,000 tonnes of water, would spot at least 300 of the particles — and many more if the supernova occurs in our Galaxy, as opposed to in the Large Magellanic Cloud.
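The jump from 11 events to hundreds follows from simple scaling: to first order, the expected count grows linearly with detector mass and falls with the square of the distance. A rough sketch, using the figures in the paragraph above (the linear-in-mass, inverse-square assumptions, and the ~8 kpc distance to the Galactic Centre, are mine):

```python
# First-order scaling for supernova-neutrino counts: events grow
# linearly with detector mass and fall with distance squared.
# (An assumption of this sketch; real yields also depend on
# detection efficiency and the supernova itself.)

def expected_events(ref_events: float, ref_mass_t: float, ref_dist_kpc: float,
                    mass_t: float, dist_kpc: float) -> float:
    """Scale a reference detection to a new detector mass and distance."""
    return ref_events * (mass_t / ref_mass_t) * (ref_dist_kpc / dist_kpc) ** 2

# Reference point: Kamiokande-II's 11 events from SN 1987A
# (3,000 t of water; Large Magellanic Cloud at roughly 50 kpc).
print(round(expected_events(11, 3000, 50, 50000, 50)))  # Super-K, same distance: ~180
print(round(expected_events(11, 3000, 50, 50000, 8)))   # Galactic Centre, ~8 kpc: thousands
```

Mass scaling alone gives roughly 180 events for another LMC-distance burst; Super-K’s improved sensitivity pushes the real expectation past the 300 quoted above, and the inverse-square factor is why a burst inside our own Galaxy would yield thousands.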

    Beginning in 2018, Super-K, as the observatory is known, had an upgrade that has vastly improved its ability to study supernovae.

    In particular, the Super-K collaboration, which includes Japanese and US physicists, added the rare-earth metal gadolinium to the detector’s water. Its presence will enable the detector to clearly distinguish two types of supernova neutrino. One type produces flashes inside the detector that propagate in a random direction. But the flashes from the other type point straight back at the direction in which the neutrino was travelling.

    Being able to tell the two apart in real time means that Super-K’s software will rapidly calculate where in the sky astronomers should point their telescopes, within an angle of less than 3 degrees. “Using this information, Super-K is the world’s best detector for determining the direction to a supernova,” says Nakahata.

    The supernova alert system, called SNWatch, is programmed to notify senior collaboration members about a possible sighting. At the same time, it sounds an alarm in the detector’s cavernous underground hall and control room. Sara Sussman, a physicist now at Princeton University in New Jersey, spent time working at Super-K in 2017 during her undergraduate studies, and experienced the alarm in person. It went off during her first stint as the shift operator in the Super-K control room, and Sussman didn’t know it was a drill. “I’m never gonna forget that moment for the rest of my life,” she says.

    Until recently, the Super-K procedures in case of a supernova prescribed that a senior team would hold an emergency meeting to decide whether the signal was genuine, and whether to send the news out. Starting last December, the collaboration removed any need for human intervention. In case of a neutrino shower, SNWatch will send an automated alert — including the event’s coordinates in the sky — to astronomers within 5 minutes, Nakahata says. Future improvements in the software should bring that down to 1 minute, he adds.

    This will be a far cry from how information spread following the discovery of SN 1987A. The Chilean mountaintop of Las Campanas where Shelton worked did not even have a telephone line, and its radio telephone rarely worked. To alert other researchers to the scientific treasure that had just appeared, observatory staff had to drive to the nearest town, two hours away, and send a telegram.

    On alert

    Neutrino alert systems are not new: one has existed for nearly two decades. The Supernova Early Warning System (SNEWS) is a network involving Super-K and several other neutrino observatories. It includes IceCube, an array of light sensors embedded in a cubic kilometre of Antarctica’s ice, and KM3NeT, a similar array submerged in the Mediterranean Sea.

    Large neutrino facilities now under construction in the United States and China are expected to join in the next few years, and Japan is building Hyper-Kamiokande, which will be five times larger than Super-K. “We expect 54,000–90,000 neutrinos if a supernova explodes in the centre of the Galaxy,” says Francesca Di Lodovico, co-spokesperson for the Hyper-Kamiokande detector.

    The main idea of SNEWS is to combine signals to improve the confidence in a detection, even if the individual ones look marginal at best. Each detector runs software that notifies a central SNEWS server of any unusual activity. SNEWS sends an alert to astronomers only if neutrino detectors in two separate geographical areas see a spike in activity within 10 seconds of each other. “If two see something and are not in the same lab, then it would be really hard for something random to happen in Japan and Italy, say,” says Habig.
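The two-site rule Habig describes amounts to a simple coincidence test. A hypothetical sketch (the site names and this interface are illustrative, not SNEWS’s actual code):

```python
from itertools import combinations

# Sketch of the coincidence rule described above: alert only when two
# detectors at different sites spike within 10 seconds of each other.
COINCIDENCE_WINDOW_S = 10.0

def should_alert(spikes: list[tuple[str, float]]) -> bool:
    """spikes: (site, arrival_time_in_seconds) pairs reported by detectors."""
    return any(
        site_a != site_b and abs(t_a - t_b) <= COINCIDENCE_WINDOW_S
        for (site_a, t_a), (site_b, t_b) in combinations(spikes, 2)
    )

print(should_alert([("Super-K", 100.0), ("IceCube", 104.5)]))  # True: two sites, 4.5 s apart
print(should_alert([("Super-K", 100.0), ("Super-K", 101.0)]))  # False: same site
```

Requiring geographically separate sites is what makes the test robust: uncorrelated noise would have to fluctuate upwards in two independent detectors within the same 10-second window.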

    Since SNEWS went live in 2005, it has not had the opportunity to send out a single alert. “You’ve got to admire the tenacity and the endurance,” says Robert Kirshner, an astronomer at Harvard University in Cambridge, Massachusetts. “They know they’re right, they know it’s important — but they’re not getting rewarded a lot.”

    Now, SNEWS is about to roll out its first major upgrade, called SNEWS 2.0 [3]. One goal is to produce alerts on the basis of lower-confidence sightings of possible supernova neutrinos. Observatories used to be conservative about sending out alerts, wanting to avoid any risk of false alarms. But in recent years, the culture has changed, and researchers are more comfortable exchanging lower-confidence alerts, just in case.

    “The attitude has flipped 180 degrees,” Habig says. This change was brought in part by the advent of gravitational-wave astronomy, which yields weekly or even daily signals that many astronomers follow up using ordinary telescopes. That way, the same event can be studied using different astronomical phenomena, a trend called multi-messenger astronomy.

    Another innovation of SNEWS 2.0 is that when multiple observatories record a neutrino shower, it will compare the exact timings of the particles’ arrival, and use those to triangulate back to the source. The pointing will be vastly less precise than that provided by Super-K alone, but the triangulation might end up being even faster, Habig says.
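Triangulation works because the burst sweeps across Earth as a plane wave, reaching each detector at a slightly different time: the delay between two sites is their separation projected onto the propagation direction, divided by c. A toy sketch (the detector positions, exact timing and brute-force grid search are all simplifications of mine, not the SNEWS 2.0 implementation):

```python
import math

C_KM_S = 299_792.458  # speed of light, km/s

# Toy triangulation: for a plane wave with propagation direction d,
# the arrival time at detector position r is t0 + dot(r, d) / c.
# Scanning candidate directions for the one that best matches the
# observed inter-detector delays recovers d.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def direction(theta: float, phi: float):
    """Unit vector from polar angle theta and azimuth phi."""
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))

def best_direction(detectors, times, steps=90):
    """Grid search over the sphere for the direction whose predicted
    delays (relative to detector 0) best fit the observed ones."""
    best, best_err = None, float("inf")
    for i in range(steps):
        for j in range(2 * steps):
            d = direction(math.pi * i / steps, math.pi * j / steps)
            err = sum(
                (times[k] - times[0]
                 - (dot(detectors[k], d) - dot(detectors[0], d)) / C_KM_S) ** 2
                for k in range(1, len(detectors))
            )
            if err < best_err:
                best, best_err = d, err
    return best

# Three widely separated detectors (positions in km) and a burst
# travelling along +x; arrival times are generated from the model.
dets = [(6371.0, 0.0, 0.0), (0.0, 6371.0, 0.0), (0.0, 0.0, 6371.0)]
times = [dot(r, (1.0, 0.0, 0.0)) / C_KM_S for r in dets]
print(tuple(round(x, 2) for x in best_direction(dets, times)))
```

One caveat built into the geometry: with only two independent baselines, a second direction also fits the delays exactly, so real networks need more stations, or Super-K’s own pointing, to break the ambiguity — which is why the triangulated direction is coarser but potentially faster.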

    Too much light

    When Shelton spotted SN 1987A, Bouchet was in the right place at the right time. He had been working at the European Southern Observatory in La Silla, where he used a special device that could make infrared measurements of stars during the daytime. This meant that Bouchet could continue to measure the supernova’s brightness even when daylight in the sky drowned out the visible light from stars. But the telescope Bouchet used has been decommissioned, and no modern observatory has the right equipment to make daytime infrared measurements.

    What’s worse, Bouchet adds, is that most large observatories have decommissioned their smaller visible-light telescopes, focusing on the largest, most sensitive instruments, which could be useless for observing a bright event. But Danny Steeghs, an astronomer at the University of Warwick, UK, is more optimistic. There has been a renaissance in ‘small astronomy’, he says, spurred in part by multi-messenger astronomy. “Now we have a new generation of more-bespoke, smaller telescopes,” Steeghs says. When a supernova happens, he says, “we might miss the very first stages, but I am sure everyone will be creative.” Steeghs runs the Gravitational wave Optical Transient Observer, a system that can rapidly cover a large part of the sky to chase after possible light associated with gravitational waves.

    “Even in the case of a really bright one, astronomers are clever and will find a way,” says Andy Howell, senior scientist at the Las Cumbres Observatory, an organization based near Santa Barbara, California, that runs a network of robotic telescopes giving global coverage of the sky. “We could observe the supernova around the clock, since we always have telescopes in the dark.”

    To observe extra bright objects, astronomers might use tricks such as taking short exposures, or partially blacking out the telescope’s mirror so that it reflects less light. But one of the most crucial observations — measuring the supernova’s brightness and how it evolves over time — will be difficult to do precisely. Astronomers usually measure a star’s brightness by calibration, by comparing it to that of another, well-known object in the same field of view. But calibration is difficult when the object of study is so bright that no other star can be seen in the same shot.

    If professional astronomers stumble, an army of serious hobbyists might come to the rescue, Bouchet says. The American Association of Variable Star Observers (AAVSO), headquartered in Cambridge, Massachusetts, will help to coordinate the efforts of amateur astronomers, many of whom will be eager to jump in. “They would be on it — some of them within minutes,” says Elizabeth Waagen, an astronomer who has been on the AAVSO staff for 40 years and helps to coordinate observer campaigns.

    “We are everywhere,” says Arto Oksanen, an IT professional based in Jyväskylä, Finland, who is a celebrity in the world of amateur astronomy. “At any given time, there is someone that can observe under clear skies.” Oksanen is the chair of a club of observers that built and runs its own remotely operated observatory, with a 40-centimetre reflector telescope and an automatic dome, some 300 kilometres north of Helsinki.

    To take measurements of a very bright supernova, even smaller telescopes will do. Oksanen says that if the object is extremely bright — and assuming it is visible in the Finnish sky — the first thing he would probably do is take pictures with his Nikon digital SLR camera. With a supernova, time is of the essence, and even this rough method would record invaluable information about how the explosion varies in brightness.

    But Tom Calderwood, an amateur astronomer in Bend, Oregon, says that few serious hobbyists have made such contingency plans to prepare for a possible supernova. “It’s definitely worth it for the amateur community to sit down and think what they would do,” he says.

    The supernova of 1987 changed many lives overnight. Shelton decided to pursue a PhD in astronomy. Bouchet spent much of the next year on the Chilean mountaintop and has been studying the supernova and its remnants ever since, as has Kirshner, who has been involved in the search for SN 1987A’s neutron-star remnant. That’s something he could soon help to nail down using NASA’s recently launched James Webb Space Telescope, which might be able to detect infrared radiation from the remnant that makes it through the surrounding shroud of dust. Nakahata’s boss at the time, the late Masatoshi Koshiba, shared the Nobel Prize in Physics in 2002 for his work using Kamiokande-II, in large part for detecting the 11 supernova neutrinos.

    Waagen says that many young people can trace the time when they became interested in astronomy — or science in general — to a specific day, when “some spectacular event caught their imagination and changed the course of their lives”. The next supernova will change a lot of lives, too, she says. “It will connect them to the sky in a new way.”

    “It will be wild,” says Ed Kearns, a particle physicist at Boston University in Massachusetts. “I don’t know exactly what’s going to happen, because there’s so much human nature involved.” No supernova neutrinos have been detected since 1987, but it could happen any time, he adds. “Every year is a fresh year, every day is a fresh day for a chance at a supernova.”

    See the full article here.



  • richardmitnick 2:51 pm on February 9, 2022 Permalink | Reply
    Tags: "Nuclear-fusion reactor smashes energy record", NATURE, The Joint European Torus tokamak generator   

    From Nature: “Nuclear-fusion reactor smashes energy record” 

    From Nature

    09 February 2022
    Elizabeth Gibney

    The experimental Joint European Torus has doubled the record for the amount of energy made from fusing atoms — the process that powers the Sun.

    The Joint European Torus tokamak at the Culham Centre for Fusion Energy, Culham Science Centre, near Culham, Oxfordshire, England.

    A 24-year-old nuclear-fusion record has crumbled. Scientists at the Joint European Torus (JET) near Oxford, UK, announced on 9 February that they had generated the highest-ever sustained energy from fusing together atoms, more than doubling their own record from experiments performed in 1997.

    “These landmark results have taken us a huge step closer to conquering one of the biggest scientific and engineering challenges of them all,” said Ian Chapman, who leads the Culham Centre for Fusion Energy (CCFE), where JET is based, in a statement. JET is owned by the UK Atomic Energy Authority, but its scientific operations are run by a European collaboration called the EUROfusion consortium.

    If researchers can harness nuclear fusion — the process that powers the Sun — it promises to provide a near-limitless source of clean energy. But so far, no experiment has generated more energy than was put in. JET’s results do not change that, but they suggest that a follow-up fusion reactor project using the same technology and fuel mix — the ambitious US$22-billion ITER, scheduled to begin fusion experiments in 2025 — should eventually be able to achieve this goal.

    “JET really achieved what was predicted. The same modelling now says ITER will work,” says fusion physicist Josefine Proll at Eindhoven University of Technology in the Netherlands, who was not involved in JET’s research. “It’s a really, really good sign and I’m excited.”

    The ITER tokamak under construction in Saint-Paul-lès-Durance, southern France.

    Two decades’ work

    The experiments — the culmination of almost two decades’ work — are important for helping scientists to predict how ITER will behave, and will guide its operating settings, says Anne White, a plasma physicist at the Massachusetts Institute of Technology who works on tokamaks, the doughnut-shaped class of reactor that includes JET. “I am sure I am not alone in the fusion community in wanting to extend very hearty congratulations to the JET Team.”

    JET and ITER use magnetic fields to confine plasma, a superheated gas of hydrogen isotopes, in the tokamak. Under heat and pressure, the hydrogen isotopes fuse into helium, releasing energy as neutrons.

    To break the energy record, JET used a tritium fuel mix, the same one that will power ITER, which is being built in southern France. Tritium is a rare and radioactive isotope of hydrogen that, when fusing with deuterium, produces many more neutrons than do deuterium reactions alone. That ramps up the energy output, but using this fuel required JET to undergo more than two years of renovation to prepare the machine for the onslaught. Tritium was last used by a tokamak fusion experiment when JET set the previous fusion energy record in 1997.


    JET contained two types of heavy hydrogen, deuterium and tritium, in fusion experiments performed last year. Credit: EUROfusion consortium

    In an experiment on 21 December 2021, JET’s tokamak produced 59 megajoules of energy over a fusion ‘pulse’ of five seconds, more than double the 21.7 megajoules released in 1997 over around four seconds. Although the 1997 experiment still holds the record for ‘peak power’, that peak lasted only a fraction of a second, and the 1997 run’s average power was less than half that of the latest one, says Fernanda Rimini, a plasma scientist at the CCFE who oversaw the latest experimental campaign. The improvement took 20 years of experimental optimization, as well as hardware upgrades that included replacing the tokamak’s inner wall to waste less fuel, she says.
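As a back-of-the-envelope check on those figures, average power is just energy divided by pulse length (a simplification, since real pulses are not perfectly flat):

```python
# Average fusion power for the two record pulses: energy / pulse length.
E_2021, t_2021 = 59.0e6, 5.0   # joules, seconds (December 2021 pulse)
E_1997, t_1997 = 21.7e6, 4.0   # joules, seconds (1997 record, ~4 s)

P_2021 = E_2021 / t_2021       # average power of the 2021 pulse, in watts
P_1997 = E_1997 / t_1997       # average power of the 1997 pulse, in watts

print(f"2021: {P_2021 / 1e6:.1f} MW average")   # 11.8 MW
print(f"1997: {P_1997 / 1e6:.1f} MW average")   # 5.4 MW
print(f"ratio: {P_2021 / P_1997:.2f}")          # just over double
```

Consistent with the statement above that the 1997 run’s average power was less than half that of the latest one.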

    Power ratio

    Producing the energy over a number of seconds is essential for understanding the heating, cooling and movement happening inside the plasma that will be crucial to run ITER, says Rimini.

    Five seconds “is a big deal”, adds Proll, who works on an alternative fusion-reactor design called a stellarator. “It is really, really impressive.”

    Last year, the US Department of Energy’s National Ignition Facility (NIF) set a different fusion record — it used laser technology to produce the highest fusion power output relative to power in, a value called Q.

    The National Ignition Facility at the DOE’s Lawrence Livermore National Laboratory in Livermore, California.

    The facility produced a Q of 0.7, where 1 would be breakeven — a landmark for laser fusion that beat JET’s 1997 record. But the event was short-lived, producing just 1.9 megajoules over less than 4 billionths of a second.

    JET’s latest experiment sustained a Q value of 0.33 for five seconds, says Rimini. At one-tenth of the volume, JET is a scaled-down version of ITER — a bathtub compared to a swimming pool, says Proll, and because it loses heat more easily it was never expected to hit breakeven. If engineers applied the same conditions and physics approach to ITER, she says, it would probably reach its goal of a Q of 10, producing ten times the energy put in.
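The Q values above can be turned around to estimate the heating power involved; a minimal sketch, noting that the ~36 MW input figure is inferred here, not stated in the article:

```python
# Fusion gain Q = fusion power out / heating power in.
Q = 0.33                   # JET's sustained value over the 5-second pulse
P_fusion = 59.0e6 / 5.0    # ~11.8 MW average fusion power (59 MJ over 5 s)
P_heating = P_fusion / Q   # implied heating power, ~36 MW (inferred here)

print(f"implied heating power: {P_heating / 1e6:.0f} MW")
# Breakeven would be Q = 1; ITER's goal of Q = 10 means ten times
# more fusion power out than heating power in.
```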

    Fusion researchers are far from having all the answers. A remaining challenge, for example, is dealing with the heat created in the exhaust region of the ITER reactor, which will be larger in area than JET’s, but not in proportion to the surge in power it will have to handle. Research is under way to work out which design would best withstand the heat, but researchers are not there yet, says Proll.

    The record-breaking run happened on the last day of a five-month campaign from which, Rimini says, scientists gleaned a wealth of information that they will analyse over the coming years. The final experiment pushed the device to its “absolute maximum”, adds Rimini, who witnessed the record-breaking test in real time. “We didn’t jump up and down and hug each other — we were at 2 metres distance — but it was very exciting.”

    See the full article here.



  • richardmitnick 5:11 pm on February 8, 2022 Permalink | Reply
    Tags: "Scientists raise alarm over ‘dangerously fast’ growth in atmospheric methane", Anthropogenic sources such as livestock; agricultural waste; landfill and fossil-fuel extraction accounted for about 62% of total methane emissions from 2007 to 2016., Facilities could easily halt emissions by preventing methane from leaking out., Global methane concentrations soar over 1900 parts per billion., Many researchers worry that global warming is creating a feedback mechanism that will cause ever more methane to be released., NATURE, Some researchers fear that global warming itself is behind the rapid rise., The majority of carbon is carbon-12 but methane molecules sometimes also contain the heavier isotope carbon-13., There is plenty that can be done to reduce emissions.   

    From Nature: “Scientists raise alarm over ‘dangerously fast’ growth in atmospheric methane” 

    From Nature

    08 February 2022
    Jeff Tollefson

    As global methane concentrations soar over 1900 parts per billion, some researchers fear that global warming itself is behind the rapid rise.

    Tropical wetlands, such as the Pantanal in Brazil, are a major source of methane emissions. Credit: Carl De Souza/AFP via Getty.

    Methane concentrations in the atmosphere raced past 1,900 parts per billion last year, nearly triple pre-industrial levels, according to data released in January by the US National Oceanic and Atmospheric Administration (NOAA). Scientists say the grim milestone underscores the importance of a pledge made at last year’s COP26 climate summit to curb emissions of methane, a greenhouse gas at least 28 times as potent as CO2.

    The growth of methane emissions slowed around the turn of the millennium, but began a rapid and mysterious uptick around 2007. The spike has caused many researchers to worry that global warming is creating a feedback mechanism that will cause ever more methane to be released, making it even harder to rein in rising temperatures.

    “Methane levels are growing dangerously fast,” says Euan Nisbet, an Earth scientist at Royal Holloway, University of London. The emissions, which seem to have accelerated in the past few years, are a major threat to the world’s goal of limiting global warming to 1.5–2 °C above pre-industrial temperatures, he says.

    Enigmatic patterns

    For more than a decade, researchers have deployed aircraft, taken satellite measurements and run models in an effort to understand the drivers of the increase (see ‘A worrying trend’)[1],[2]. Potential explanations range from the expanding exploitation of oil and natural gas and rising emissions from landfill to growing livestock herds and increasing activity by microbes in wetlands [3].

    “The causes of the methane trends have indeed proved rather enigmatic,” says Alex Turner, an atmospheric chemist at the University of Washington. And despite a flurry of research, Turner says he has yet to see any conclusive answers emerge.

    One clue is in the isotopic signature of methane molecules. Most carbon is carbon-12, but methane molecules sometimes contain the heavier isotope carbon-13. Methane generated by microbes — after they consume carbon in the mud of a wetland or in the gut of a cow, for instance — contains less 13C than does methane generated by heat and pressure inside Earth, which is released during fossil-fuel extraction.

    Scientists have sought to understand the source of the mystery methane by comparing this knowledge about the production of the gas with what is observed in the atmosphere.

    By studying methane trapped decades or centuries ago in ice cores and accumulated snow, as well as gas in the atmosphere, they have been able to show that for two centuries after the start of the Industrial Revolution the proportion of methane containing 13C increased [4]. But since 2007, when methane levels began to rise more rapidly again, the proportion of methane containing 13C began to fall. Some researchers believe that this suggests that much of the increase in the past 15 years might be due to microbial sources, rather than the extraction of fossil fuels.
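Isotopic composition is conventionally reported in ‘delta’ notation, δ13C: the per-mil deviation of a sample’s 13C/12C ratio from a reference standard. A minimal sketch (the VPDB reference ratio below is the standard literature value; the sample is illustrative):

```python
# delta-13C: per-mil deviation of a sample's 13C/12C ratio from the
# VPDB reference standard used for carbon isotopes.
R_VPDB = 0.0112372   # conventional 13C/12C ratio of the VPDB standard

def delta_13c(r_sample: float) -> float:
    return (r_sample / R_VPDB - 1.0) * 1000.0   # per mil

# A sample 6% depleted in 13C relative to the standard sits at -60 per mil,
# typical of microbially produced methane.
print(f"{delta_13c(0.94 * R_VPDB):.1f} per mil")   # -60.0
```

In this notation, the post-2007 shift described above is a fall in atmospheric δ13C, pointing towards 13C-depleted (microbial) sources.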

    Back to the source

    “It’s a powerful signal,” says Xin Lan, an atmospheric scientist at NOAA’s Global Monitoring Laboratory in Boulder, Colorado, and it suggests that human activities alone are not responsible for the increase. Lan’s team has used the atmospheric 13C data to estimate that microbes are responsible for around 85% of the growth in emissions since 2007, with fossil-fuel extraction accounting for the remainder [5].
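The attribution idea behind such estimates can be illustrated with a two-source mass balance, in which the mixture’s δ13C is a weighted average of its sources. The endmember values below are typical literature figures, not from the article, and the actual NOAA analysis involves full atmospheric modelling; this is only the arithmetic skeleton:

```python
# Two-source isotope mass balance:
#   d_mix = f * d_microbial + (1 - f) * d_fossil
# so the microbial fraction f can be solved for directly.
d_microbial = -60.0   # per mil, typical of microbial methane (assumed)
d_fossil    = -45.0   # per mil, typical of thermogenic methane (assumed)
d_mix       = -57.75  # hypothetical signature of the emissions growth

f_microbial = (d_mix - d_fossil) / (d_microbial - d_fossil)
print(f"microbial fraction: {f_microbial:.0%}")   # 85% with these inputs
```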

    The next — and most challenging — step is to try to pin down the relative contributions of microbes from various systems, such as natural wetlands or human-raised livestock and landfills. This may help determine whether warming itself is contributing to the increase, potentially via mechanisms such as increasing the productivity of tropical wetlands. To provide answers, Lan and her team are running atmospheric models to trace methane back to its source.

    “Is warming feeding the warming? It’s an incredibly important question,” says Nisbet. “As yet, no answer, but it very much looks that way.”

    Regardless of how this mystery plays out, humans are not off the hook. Based on their latest analysis of the isotopic trends, Lan’s team estimates that anthropogenic sources such as livestock, agricultural waste, landfill and fossil-fuel extraction accounted for about 62% of total methane emissions from 2007 to 2016.

    Global Methane Pledge

    This means there is plenty that can be done to reduce emissions. Despite NOAA’s worrying numbers for 2021, scientists already have the knowledge to help governments take action, says Riley Duren, who leads Carbon Mapper, a non-profit consortium in Pasadena, California, that uses satellites to pinpoint the source of methane emissions.

    Last month, for instance, Carbon Mapper and the Environmental Defense Fund, an advocacy group in New York City, released data revealing that 30 oil and gas facilities in the southwestern United States have collectively emitted about 100,000 tonnes of methane for at least the past three years, equivalent to the annual warming impact of half a million cars. These facilities could easily halt those emissions by preventing methane from leaking out, the groups argue.
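The cars comparison can be roughly reproduced using the potency factor of 28 quoted earlier in the article; the per-car emissions figure below is an outside assumption (roughly a typical passenger car’s annual CO2 output), so treat this as an order-of-magnitude check:

```python
# Convert the reported methane emissions to CO2-equivalent and compare
# with an assumed typical per-car annual CO2 output.
methane_tonnes = 100_000    # emissions from the 30 facilities (per article)
gwp_methane = 28            # CH4 potency vs CO2 (factor quoted in article)
co2e_tonnes = methane_tonnes * gwp_methane   # 2.8 million tonnes CO2e

car_tonnes_per_year = 4.6   # assumed typical passenger-car CO2 per year
cars = co2e_tonnes / car_tonnes_per_year
print(f"{co2e_tonnes / 1e6:.1f} Mt CO2e, roughly {cars:,.0f} cars")
# Order of magnitude matches the article's 'half a million cars'.
```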

    At COP26 in Glasgow, UK, more than 100 countries signed the Global Methane Pledge to cut emissions by 30% from 2020 levels by 2030, and Duren says the emphasis must now be on action, including in low- and middle-income countries across the global south. “Tackling methane is probably the best opportunity we have to buy some time”, he says, to solve the much bigger challenge of reducing the world’s CO2 emissions.

    References

    1. Nisbet, E. et al. Phil. Trans. R. Soc. A. https://doi.org/10.1098/rsta.2021.0112 (2021).

    2. Palmer, P. I. et al. Phil. Trans. R. Soc. A. https://doi.org/10.1098/rsta.2021.0106 (2021).

    3. Turner, A. J., Frankenberg, C. & Kort, E. A. Proc. Natl Acad. Sci. USA 116, 2805–2813 (2019).

    4. Ferretti, D. F. et al. Science 309, 1714–1717 (2005).

    5. Lan, X. et al. Global Biogeochem. Cycles https://doi.org/10.1029/2021GB007000 (2021).



