Tagged: Science Magazine

  • richardmitnick 9:11 am on July 6, 2020
    Tags: "U.K. buys stake in satellite company that could spoil astronomy", , , “It’s the stuff at 1000 kilometers that is the real killer for astronomy” says Mark McCaughrean of the European Space Agency., “Megaconstellations” of satellites, , , OneWeb filed for bankruptcy protection in March 2020, Science Magazine, The U.K. government said in a statement today that its acquisition of OneWeb will “contribute to the government’s plan to join the first rank of space nations.”, U.K. government and the Indian cellphone operator Bharti Global have successfully bid to rescue OneWeb with a $1 billion investment.   

    From Science Magazine: “U.K. buys stake in satellite company that could spoil astronomy” 

    From Science Magazine

    Jul. 3, 2020
    Daniel Clery

    OneWeb plans to launch as many as 42,000 satellites to an orbit that could harm astronomy. Credit: NASA/Kim Shiflett

    When OneWeb filed for bankruptcy protection in March, astronomers breathed a sigh of relief. The company planned to launch thousands of internet-providing satellites into low-Earth orbit, where their reflections could disrupt the observations of ground-based telescopes. But now, the company has risen from the grave with the announcement today that the U.K. government and the Indian cellphone operator Bharti Global have successfully bid to rescue OneWeb with a $1 billion investment.

    The revived company now plans an even larger constellation of up to 42,000 satellites, at an altitude of 1200 kilometers—the worst possible outcome for astronomers. At that altitude, satellites will leave bright trails across telescope images all through the night, effectively ruining the observations of survey telescopes such as the 8-meter Vera C. Rubin Observatory, under construction in Chile. “It’s the stuff at 1000 kilometers that is the real killer for astronomy,” says Mark McCaughrean of the European Space Agency, speaking at a briefing organized by the European Astronomical Society (EAS). “Engagement [with astronomers] has to happen and it has to happen now.”
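    The altitude problem is simple geometry: treating Earth's shadow as a cylinder of one Earth radius, a satellite at the observer's zenith stays sunlit until the Sun is more than arccos(R/(R+h)) below the horizon. A sketch with illustrative numbers (the 550-km comparison shell is an assumption of mine, not a figure from the article):

```python
import math

R_EARTH_KM = 6371.0  # mean Earth radius

def sun_depression_for_shadow(alt_km: float) -> float:
    """Solar depression angle (degrees below the horizon) beyond which a
    satellite at the observer's zenith enters Earth's shadow.
    Simplified geometry: shadow treated as a cylinder of Earth's radius."""
    return math.degrees(math.acos(R_EARTH_KM / (R_EARTH_KM + alt_km)))

# a low (~550 km) shell vs. the ~1200 km orbit planned by OneWeb
for h in (550, 1200):
    print(f"{h:>5} km: sunlit until Sun is "
          f"{sun_depression_for_shadow(h):.1f} deg below horizon")
```

    Astronomical twilight ends when the Sun reaches 18 degrees below the horizon, so a satellite overhead at 1200 km (sunlit down to roughly 33 degrees of solar depression) stays illuminated deep into full darkness, which is why the higher orbit is the worse outcome for telescopes.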

    Astronomers first became concerned about such “megaconstellations” last year, when the launch company SpaceX lofted the first batch of its Starlink satellites. The aim of the project is to provide internet access in areas hard to reach with fiber-optic cables. The satellites, launched 60 at a time in a single rocket, proved to be highly visible in the sky, to the alarm of astronomers. The company has now launched 540 Starlink satellites—part of an initial goal of 1584—and aims to provide a service in the United States and Canada before the end of the year.

    Early on, astronomers began working with SpaceX to mitigate the impact of its satellites. In a January launch, one satellite was covered with an antireflective coating (dubbed Darksat), and in June, one satellite carried a sunshade to stop reflections (Visorsat). Although Darksat partially reduced the satellite’s visibility, it wasn’t enough to satisfy astronomers. Visorsat has yet to reach its operational altitude, so, Olivier Hainaut of the European Southern Observatory told the EAS briefing, “we don’t know yet” how bright it will appear. But McCaughrean says Starlink’s next launch will be populated entirely with Visorsats.

    OneWeb is one of several companies chasing Starlink with similar goals. Astronomers had only limited interactions with the company before it filed for Chapter 11 protection in March with 74 satellites launched toward an initial goal of 650. While new owners were being sought, OneWeb applied for permission to expand its constellation to 42,000.

    The U.K. government said in a statement today that its acquisition of OneWeb will “contribute to the government’s plan to join the first rank of space nations.” Initial reports suggested the government wanted to transform the constellation into a navigation system akin to GPS, because with Brexit, the United Kingdom will no longer be a governing member of Europe’s Galileo navigation system. But there is no mention of navigation plans in today’s statement.

    The rescue of OneWeb still has political and legal hurdles to overcome, but Robert Massey of the Royal Astronomical Society told the EAS briefing: “I would hope the government uses its leverage to ensure OneWeb are a good partner and engages with the scientific community.” He adds, “It’s hard to believe they didn’t know.”

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

  • richardmitnick 1:25 pm on July 3, 2020
    Tags: "First asteroid found within Venus’s orbit could be a clue to missing ‘mantle’ asteroids", , , , , Science Magazine   

    From Science Magazine: “First asteroid found within Venus’s orbit could be a clue to missing ‘mantle’ asteroids” 

    From Science Magazine

    Jul. 1, 2020
    Nola Redd

    The first Vatira, 2020 AV2, may point to asteroids resembling Earth’s mantle. Equinox Graphics/Science Source.

    Earlier this year, astronomers discovered an oddball asteroid inside the orbit of Venus—the first member of a predicted flock near the Sun. No bigger than a small mountain, the asteroid has now gained another distinction: It appears to be rich in the mineral olivine, which makes up much of Earth’s deep rock. Some astronomers think that is a clue to a larger set of asteroids, never properly accounted for, that was forged early in the formation of the Solar System.

    “It’s improbable that we look at this new population and an olivine-dominated object is the first type we see,” says Francesca DeMeo, an asteroid hunter at the Massachusetts Institute of Technology who was not part of the discovery team. “That’s what makes this a cool result.”

    Most of the nearly 1 million known asteroids lie in a belt beyond Mars, shepherded by Jupiter’s gravity. Just 23 Atira asteroids—named after a Native American goddess—have been found within Earth’s orbit, because interactions with the inner planets upset their orbits and eventually send them crashing into a planet or the Sun. But astronomers have long suspected the existence of an even smaller population of short-lived objects within the orbit of Venus, informally called Vatiras.

    They’re hard to spot. Like Venus, these objects would appear low on the horizon at dawn and dusk, barely visible against the glare of the Sun. Yet on 4 January, astronomers using a small survey telescope at the Palomar Observatory in California found one: 2020 AV2, a 1.5-kilometer-wide asteroid in a 151-day orbit around the Sun.
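    A quick consistency check (mine, not the article's): Kepler's third law turns the reported 151-day period into a semimajor axis, which indeed lands inside Venus's orbit.

```python
# Kepler's third law, a^3 = P^2, with a in AU and P in years,
# gives 2020 AV2's semimajor axis from its 151-day period.
P_years = 151 / 365.25          # reported orbital period in years
a_au = P_years ** (2 / 3)       # semimajor axis in astronomical units

VENUS_A_AU = 0.723              # Venus's semimajor axis, for comparison
print(f"2020 AV2 semimajor axis ≈ {a_au:.3f} AU (Venus: {VENUS_A_AU} AU)")
```

    The result, roughly 0.56 AU, is comfortably interior to Venus's 0.72 AU, consistent with the Vatira classification.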

    To find out what 2020 AV2 is made of, Marcel Popescu, a researcher at the Astronomical Institute of the Romanian Academy, and his colleagues used telescopes on the Canary Islands to prise apart the asteroid’s reflected light, revealing absorption lines that are clues to chemical composition. They identified the fingerprint of olivine, a major mineral in the mantle of Earth and other planets, Popescu and his colleagues reported on 18 June in the Monthly Notices of the Royal Astronomical Society. “We’re not able to say definitely that it is an olivine-dominated asteroid, but olivine is abundant at its surface,” Popescu says.

    Separate studies of the object’s trajectory by Carlos and Raul de la Fuente Marcos, brothers who are researchers at the Complutense University of Madrid and were also co-authors on the discovery paper, revealed that 2020 AV2 likely originated in the main asteroid belt. Gravitational interactions with Jupiter would have flung it, and potentially some neighbors, toward Earth. There, a gravitational dance with the terrestrial planets probably nudged its orbit inside Venus’s orbit over millions of years. That path, along with its small size, suggests to Popescu a way to solve a decades-old “missing mantle” puzzle for asteroids.

    The same separation into core, mantle, and crust that took place in rocky planets soon after they formed is also thought to have occurred in small planetary embryos 4.56 billion years ago. Heat from the decay of short-lived radioactive aluminum-26 caused iron and nickel-rich rocks in these embryos to sink into their cores while olivine-rich rocks rose into a mantle and the lightest minerals formed a thin crust. Subsequent collisions shattered these embryos into asteroids.

    Yet although plenty of metal-rich asteroids have been identified, the olivine-rich mantle asteroids are few and far between. “When you fragment differentiated bodies, you should get a lot of mantle out,” says Marco Delbo, of the Côte d’Azur Observatory in Nice, France. “But we don’t see many of these asteroids in the main belt.” In 2019, DeMeo reported finding 21 new olivine-rich asteroids in the main belt, bringing the total to 36. But that’s still not enough to account for all the missing mantle material.

    One idea is that astronomers just can’t see small enough. The olivine-rich asteroids are more easily pulverized than their harder iron cousins, suggesting most of the missing mantle sits in small pieces—a “battered-to-bits” model first proposed in the 1990s. 2020 AV2 could be a far-flung representative of a hidden population of even smaller, olivine-rich objects in the main belt that are hard to see because they’re farther from Earth, Popescu says. “As soon as we are able to observe smaller objects, it is expected that we will find these objects,” he says.

    Other researchers are skeptical. In her 2019 study, DeMeo searched for olivine-rich objects nearly as small as 2020 AV2 and found only a handful—not enough to hint at a hidden smaller population. Moreover, she says, the Vatiras are likely to hail from the inner part of the asteroid belt, where olivine-rich bodies are slightly more common. That makes 2020 AV2’s composition a little less surprising, she says. The discovery “definitely adds to our body of knowledge,” she says. “I just don’t think it clinches any final conclusions.”

    Meanwhile, Popescu wants to observe the asteroid again and look for signs of another mineral, pyroxene, which would firm up its identity as a “mantle” asteroid. And he hopes ongoing surveys will spot more close-in asteroids. “It’s a very interesting object, a peculiar one—the first of its kind,” Popescu says. “I want to see if there will be others.”

    *Correction, 2 July, 4 p.m.: A previous version of the story misstated the number of Atira asteroids and the affiliation for Carlos and Raul de la Fuente Marcos.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

  • richardmitnick 3:09 pm on June 19, 2020
    Tags: "European physicists boldly take small step toward 100-kilometer-long atom smasher", , , , , , ILC-being planned for the Kitakami highland in the Iwate prefecture of northern Japan, , , Physicists in China have similar plans to build big circular colliders, , Science Magazine   

    From Science Magazine: “European physicists boldly take small step toward 100-kilometer-long atom smasher” 

    From Science Magazine

    CERN FCC Future Circular Collider map

    Jun. 19, 2020
    Adrian Cho

    It is a truth universally acknowledged that a physics laboratory with a world-leading scientific facility must have a plan for an even better machine to succeed it. So it is with the European particle physics laboratory, CERN, near Geneva, which is home to the world’s biggest atom smasher, the 27-kilometer-long Large Hadron Collider (LHC). Today, CERN’s governing council announced it will launch a technical and financial feasibility study to build an even bigger collider 80 to 100 kilometers long (actually two of them in succession) that could ultimately reach an energy seven times higher than the LHC. The first machine wouldn’t be built before 2040.

    There is “some pride of the member states of CERN [that it is] the leading particle physics laboratory, and I think there is interest in CERN staying there,” says Ursula Bassler, a physicist and president of the CERN council, the panel of representatives from the 23 nations that support the lab. However, CERN Director-General Fabiola Gianotti emphasizes that no commitment has been made to build a new mammoth collider, which could cost $20 billion. “There is no recommendation for the implementation of any project,” she says. “This is coming in a few years.”

    Physicists have been debating what collider to build next since well before the LHC started to take data in 2010. In the early 2000s, discussions centered on a 30-kilometer-long, straight-shot, linear collider that would smash electrons into positrons. Such a machine would complement the circular LHC, which smashes countercirculating beams of protons. The two types of machines have different strengths. A proton collider can generally reach higher energies and discover heavier new particles. But protons are made of other particles called quarks, so they make messy collisions. In contrast, electrons and positrons are indivisible fundamental particles, so they make cleaner collisions. Historically, physicists often have found new particles at proton colliders and studied them in detail at electron-positron colliders.

    That’s the game particle physicists around the world are trying to play today. In 2012, the proton-smashing LHC blasted out the Higgs boson, the last particle predicted by physicists’ standard model and the linchpin of their explanation of how all other fundamental particles get their mass. Many would now like to build an electron-positron collider and run it as a Higgs factory, to make the particle in large numbers and see whether it has exactly the predicted properties. Any deviation from the predictions would be a sign of new physics beyond the 40-year-old standard model, something particle physicists are desperate to find. Physicists in Japan would like to host such a linear collider.

    ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan

    A few years ago, however, some physicists proposed another approach, building an 80- to 100-kilometer-long circular electron-positron collider to study the Higgs. That machine would have a major drawback: As lightweight electrons go around in circles, they radiate copious x-rays and lose energy, so such a machine is inefficient and limited in its energy reach. But it has a big practical upside: The tunnel it needs could also later be used to house a higher-energy proton collider. This is exactly what CERN did with the LHC, which was built in an existing tunnel dug for the Large Electron-Positron Collider, which ran from 1989 to 2000. (It studied in detail particles called the W and Z bosons that had been discovered previously with a proton-antiproton collider at CERN.)
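    The x-ray penalty can be made concrete with the standard synchrotron-radiation formula for electrons, U0 [keV] ≈ 88.5 E^4 [GeV^4] / ρ [m]. The 10,400-meter bending radius assumed below for a 100-kilometer ring is illustrative, not a figure from the article:

```python
# Synchrotron energy loss per turn for an electron beam (standard
# accelerator-physics scaling; beam energies and bending radii below
# are illustrative round numbers).
def loss_per_turn_gev(e_gev: float, rho_m: float) -> float:
    return 88.5e-6 * e_gev**4 / rho_m   # 88.5 keV * E^4 / rho, in GeV

# LEP's final runs (~104.5 GeV beams, ~3100 m bending radius)
print(f"LEP:    {loss_per_turn_gev(104.5, 3100):.2f} GeV lost per turn")
# A 100-km Higgs factory (~120 GeV beams, ~10,400 m assumed radius)
print(f"100 km: {loss_per_turn_gev(120.0, 10400):.2f} GeV lost per turn")
```

    Even in the bigger tunnel, each electron sheds on the order of a GeV per turn, which must be constantly replaced by the accelerating cavities. Because the loss scales as 1/m^4, protons (1836 times heavier) radiate negligibly, which is why the proton machine can later push the same tunnel to far higher energy.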

    Now, CERN physicists envision a future in which, around 2040, they build a huge circular electron-positron collider to study the Higgs. Then, they would follow up with a more powerful proton collider to reach a new high-energy frontier. Today, the CERN council took a step in that direction, announcing an update to its long-range strategy, the first since 2013.

    Just how much CERN’s plans have changed remains murky, however. Some physicists there have long been working on CERN’s own design for a linear collider. And it appears the new long-range strategy does not completely sideline that idea. “We also recommend continued accelerator R&D to ensure that we do not miss an opportunity to improve our accelerator technology,” said Halina Abramowicz, a physicist at Tel Aviv University who led the planning exercise, during an online question-and-answer session. “I think it’s important to convey this message very clearly.”

    The feasibility study for the big new machine should be done by 2026 or 2027, when CERN will next update its long-term strategy. CERN may also have competition in the presumed collider arms race, as physicists in China have similar plans to build big circular colliders.

    China Circular Electron Positron Collider (CEPC) map

    Of course, all may depend on whether the LHC, which is now undergoing an upgrade and should run until the mid-2030s, finds anything beyond the Higgs boson to study. If it doesn’t, convincing the governments of Europe to spend $20 billion to study just the Higgs may prove a daunting political challenge.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

  • richardmitnick 1:03 pm on June 4, 2020
    Tags: Science Magazine, "The galaxy’s brightest explosions go nuclear with an unexpected trigger: pairs of dead stars"

    From Science Magazine: “The galaxy’s brightest explosions go nuclear with an unexpected trigger: pairs of dead stars” 

    From Science Magazine

    Jun. 4, 2020
    Daniel Clery

    Kepler’s supernova, a type Ia explosion from 1604, lingers today with the x-ray glow of hot, leftover debris. (X-RAY) NASA/CXC/SAO/D. PATNAUDE; (OPTICAL) DSS

    The white dwarf stars were zipping across the Milky Way at more than 1000 kilometers per second—about a thousand times faster than a speeding bullet, so fast that they would eventually escape the gravitational clutches of the Galaxy. “They were not like anything we had seen before,” says Boris Gaensicke, an astronomer at the University of Warwick.

    Gaensicke and his colleagues suspected these burnt-out embers were fleeing scenes of violence: supernova explosions in which another white dwarf had detonated like an Earth-size hydrogen bomb. In the standard picture of these explosions, known as type Ia supernovae, a nearby giant star lights the fuse. But the extreme speed of the white dwarfs suggested a different scenario, in which the fleeing dwarfs had delivered the sparks, from close orbits around the doomed stars. When they blew up, these partners were flung away like shots from a biblical sling.

    The speeds of the three white dwarfs, discovered in a 2018 data set from Europe’s Gaia satellite, were just one clue to this picture. Subsequent ground-based observations found traces of iron and other metals in the stars’ light—elements that might have been implanted by a supernova blast. From the color and brightness of the light, the astronomers could deduce that the stars were hotter and larger than typical white dwarfs, as if they had been puffed up by a sudden energy boost. Most telling, the researchers rewound the stars’ trajectories and found that one hailed from a known site: the remnant of a 90,000-year-old supernova. They are “the very best evidence” for the twin white dwarf scenario, says team leader Ken Shen of the University of California (UC), Berkeley.

    The evidence that twin white dwarfs drive most, if not all, type Ia supernovae, which account for about 20% of the supernova blasts in the Milky Way, “is more and more overwhelming,” says Dan Maoz, director of Tel Aviv University’s Wise Observatory, which tracks fast-changing phenomena such as supernovae. He says the classic scenario of a white dwarf paired with a large star such as a red giant “doesn’t happen in nature, or quite rarely.”

    Which picture prevails has impacts across astronomy: Type Ia supernovae play a vital role in cosmic chemical manufacturing, forging in their fireballs most of the iron and other metals that pervade the universe. The explosions also serve as “standard candles,” assumed to shine with a predictable brightness. Their brightness as seen from Earth provides a cosmic yardstick, used among other things to discover “dark energy,” the unknown force that is accelerating the expansion of the universe. If type Ia supernovae originate as paired white dwarfs, their brightness might not be as consistent as was thought—and they might be less reliable as standard candles.
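    The standard-candle logic rests on textbook distance-modulus arithmetic. The peak absolute magnitude of -19.3 used below is a commonly quoted round number for type Ia supernovae, not a value from the article:

```python
# Standard-candle distance from a type Ia's apparent peak magnitude,
# via the distance modulus m - M = 5 * log10(d / 10 pc).
M_TYPE_IA = -19.3   # assumed canonical peak absolute magnitude

def distance_mpc(m_apparent: float) -> float:
    d_pc = 10 ** ((m_apparent - M_TYPE_IA + 5) / 5)
    return d_pc / 1e6   # parsecs -> megaparsecs

print(f"m = 24 -> {distance_mpc(24.0):.0f} Mpc")
```

    The worry about paired white dwarfs follows directly from this arithmetic: if one unrecognized subpopulation peaks just 0.1 magnitude fainter than assumed, every inferred distance to it is off by a factor of 10^(0.1/5), roughly 5 percent.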

    Robert Kirshner, a longtime supernova watcher at the Gordon and Betty Moore Foundation, isn’t ready to give up on the classic scenario, but he acknowledges the misgivings about it. “It’s plausible that there is more than one path to a type Ia supernova, but the uniformity of the light output is a tiny bit of a paradox,” he says. Nevertheless, “There’s a nagging doubt: Do we fully understand the nature of these explosions?”

    Supernovae come in two basic flavors. The most common are called core-collapse. They occur when a massive star, far larger than the Sun, runs out of fuel. The pressure of the core’s heat can no longer counteract gravity; the outer parts of the star collapse into the core with such force that a rebounding shock wave, or sometimes a renewed fusion burn, blows out the star’s outer layers, leaving a neutron star or black hole behind.

    The rest are all type Ia—white dwarf stars somehow reignited into a runaway fusion reaction. The burst of new energy happens too fast for the star to absorb, blowing the entire thing to smithereens in a blast that is brighter and longer lasting than a core-collapse supernova.

    White dwarfs might seem unlikely candidates for fireworks. They are the cinders of Sun-like stars that have burned up their hydrogen and helium fuels, leaving—in most cases—carbon and oxygen, heavier elements that can’t fuse in such a low-mass star. White dwarfs shrink to the size of Earth, and only glow from leftover heat. Theoretically, they should cool to black over billions of years.

    But if the white dwarf orbits in a binary pair with another star, a more spectacular fate may await it. The classic type Ia scenario was proposed in 1973 by John Whelan and Icko Iben. They mapped out the fading light of supernova SN 1972e for one full year, and realized its brightness could be explained by the decay of about one solar mass worth of radioactive metals. These, they proposed, were forged in a white dwarf that had grown to a size at which the pressure and temperature in its core would be high enough to fuse carbon, causing a thermonuclear blast. Whelan and Iben suggested the white dwarf might grow to that mass threshold if its gravity stole hydrogen gas from a companion star such as a puffed-up red giant, which doesn’t clutch its outer layers too tightly.

    Later modeling showed achieving this growth was a tricky balancing act. If a white dwarf gobbles up hydrogen too fast, the hydrogen layer that forms on its surface can get hot enough to blow up prematurely, in a more modest thermonuclear explosion called a classical nova or, if it happens repeatedly, a recurrent nova. If it accumulates too slowly, the white dwarf can tiptoe up to a mass 1.44 times that of the Sun. At that threshold, known as the Chandrasekhar limit, theorists predict the pressure inside will cause electrons and protons to fuse into neutrons, and the white dwarf will quietly collapse into a neutron star.

    But if the hydrogen is added at just the right rate, a white dwarf, especially one rich in carbon and oxygen, can respond more dramatically. Just short of the Chandrasekhar limit, at about 1.4 solar masses, the density and temperature of the core shoot up. Flares of carbon fusion break out. After smoldering for 100 or more years, a runaway reaction detonates the star and blows it apart in a matter of seconds. The resulting fireball, 5 billion times as bright as the Sun, forges a suite of metals from chromium to nickel in the periodic table. The radioactive decay of nickel to cobalt and then iron powers a brilliant afterglow that peaks in a couple of weeks and fades over years. And because, in this picture, every type Ia explodes with the same mass, they should all have the same peak brightness.
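    The nickel-cobalt-iron afterglow described above follows the textbook two-step decay chain. A minimal sketch using the Bateman solution (the half-lives are standard nuclear data; the normalization is arbitrary, and the per-decay energies that set the actual luminosity are omitted for simplicity):

```python
import math

# Ni-56 -> Co-56 -> Fe-56 decay chain that powers a type Ia's afterglow.
T_HALF_NI = 6.08   # days, Ni-56 -> Co-56
T_HALF_CO = 77.2   # days, Co-56 -> Fe-56
L1 = math.log(2) / T_HALF_NI   # decay constants, per day
L2 = math.log(2) / T_HALF_CO

def decays_per_day(t: float, n0: float = 1.0) -> tuple:
    """Instantaneous Ni-56 and Co-56 decay rates at time t (days),
    from the Bateman equations for a two-step chain."""
    n_ni = n0 * math.exp(-L1 * t)
    n_co = n0 * L1 / (L2 - L1) * (math.exp(-L1 * t) - math.exp(-L2 * t))
    return L1 * n_ni, L2 * n_co

for t in (10, 50, 100):
    ni, co = decays_per_day(t)
    print(f"day {t:>3}: Ni-56 rate {ni:.4f}, Co-56 rate {co:.4f}")
```

    The short-lived nickel dominates for the first couple of weeks and then hands off to the slower cobalt decay, reproducing the pattern in the text: a brilliant glow that peaks quickly and fades over months to years.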

    That scenario satisfied astronomers for decades. But they have yet to find definitive evidence for it. After a supernova disperses enough to be transparent, astronomers routinely search for a surviving red giant companion but have never found one.

    Models also suggest a flash of blue light should appear, hours after the supernova begins, as the expanding fireball slams into the tenuous hydrogen atmosphere of the red giant and heats it enough to glow in the ultraviolet. But when astronomers spotted a supernova in the nearby Pinwheel galaxy within hours of its ignition in 2011, they saw no blue flash. “SN 2011fe was a paradigm changer,” says Stan Woosley, a supernova expert at UC Santa Cruz.

    A second predicted signal—weaker but more persistent—has also been elusive. Astronomers would expect some hydrogen from the red giant to be swept along with the other explosion debris, creating an absorption line in the spectrum of the supernova’s light as the remnant cools. But a 2019 study of 227 type Ia supernova remnants found no hint of this hydrogen.

    Theorists have struggled to simulate classic type Ia explosions. This is one of their successful models.

    The numbers are against the classic scenario, Maoz says. In a galaxy like the Milky Way, a type Ia supernova occurs once every few hundred years. If they all originate from a white dwarf and a red giant, the galaxy would have to host some 10,000 of these pairs—and there’s little evidence of them. Because it’s hard to see binary pairs as separate stars at such great distances, astronomers have looked for indirect evidence, such as the recurrent novae triggered when hydrogen from a red giant spills quickly onto a white dwarf companion, or the soft x-ray glow that can result from a steady stream of hydrogen. If the Milky Way harbored 10,000 white dwarf-red giant pairs, we should see many novae and soft x-ray sources, but only a handful have been found, Maoz says. “We’re missing 99.9% of the systems.”

    The theoretical underpinnings of the classic scenario have problems, too. A key assumption is that the explosion is so violent that the entire star is blown apart. But theorists have had a hard time modeling how the smoldering fusion flares would burgeon into an all-consuming detonation. Modeling that process requires simulating nuclear reactions with centimeter resolution across an object the size of Earth for 100 years or more. Until recently, computers couldn’t handle that challenge—and now that they can, Shen says, only some models predict the transition. “The state of the field is not conclusive yet.”

    Enter another way to blow up a white dwarf. Astronomers know that two white dwarfs orbiting each other—a so-called double degenerate—can gradually spiral inward as their rapid orbits throw out gravitational waves, which carry away energy. When they merge, there would be easily enough heat and pressure to kick-start carbon fusion in the combined stars.
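    How fast gravitational waves actually shrink such an orbit can be estimated with the Peters circular-orbit formula, t = 5 c^5 a^4 / (256 G^3 m1 m2 (m1 + m2)). The masses and the roughly 0.01-AU separation below are illustrative choices, not values from the article:

```python
# Gravitational-wave inspiral time for a circular white dwarf binary
# (Peters 1964 formula; input masses and separation are illustrative).
G = 6.674e-11        # m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # kg
YEAR = 3.156e7       # s

def merger_time_yr(m1_msun: float, m2_msun: float, a_m: float) -> float:
    m1, m2 = m1_msun * M_SUN, m2_msun * M_SUN
    return 5 * C**5 * a_m**4 / (256 * G**3 * m1 * m2 * (m1 + m2)) / YEAR

# two 0.7-solar-mass white dwarfs separated by ~0.01 AU (1.5e9 m)
print(f"merger in ~{merger_time_yr(0.7, 0.7, 1.5e9):.1e} years")
```

    A pair this tight merges within a few billion years, well inside the Galaxy's lifetime, which is why inspiraling white dwarf binaries are a plausible steady supply of explosions. The steep a^4 dependence means wider pairs effectively never merge.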

    Models suggest a problem with this mechanism: The burning would not start in the core. It might kick off in the hot zone where the two are actively colliding, leading to a lopsided and incomplete explosion that would leave much of the stars’ mass unburned. Or it could ignite close to the surface, only blowing off the outer layers of the merged star. “Getting a detonation in a double degenerate is by no means trivial,” Woosley says.

    Yet Maoz says the odds are irresistible: White dwarf binaries are thought to be fairly common. In a 2018 study, Maoz and his colleagues looked for the wobble of white dwarfs being tugged by a partner, combining survey results from the Sloan Digital Sky Survey and Europe’s Very Large Telescope. They concluded that about half a billion white dwarf binaries have merged in the Milky Way since its formation. If just one-sixth of those mergers led to a type Ia supernova, that would be one supernova every 200 years—roughly what is observed in the Milky Way. Another team, using Gaia data, came to a similar conclusion.
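    Maoz's arithmetic can be checked on the back of an envelope; the 13-billion-year galaxy age below is my assumption, not a figure from the study:

```python
# Rough rate check: half a billion white dwarf mergers over the
# Milky Way's history, one-sixth assumed to yield a type Ia.
mergers_total = 0.5e9          # WD binary mergers since the galaxy formed
fraction_sn = 1 / 6            # fraction assumed to explode as type Ia
galaxy_age_yr = 13e9           # assumed age of the Milky Way

rate_per_yr = mergers_total * fraction_sn / galaxy_age_yr
print(f"one type Ia every ~{1 / rate_per_yr:.0f} years")
```

    The result, one explosion every century or two, lands in the same ballpark as the quoted one-per-200-years figure and the observed Galactic rate.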

    Maoz is undeterred by the problem of getting a complete, symmetrical blast. “Just because we don’t know how it happens doesn’t mean nature hasn’t found a way.” In fact, many believe nature has found a way to blow up one member of a white dwarf pair—but without a merger. White dwarfs can have some leftover helium in their atmospheres after the core stops burning. When an orbiting pair is on the cusp of merging, the larger of the two stars can rapidly steal helium from the smaller one to form a dense helium layer on its surface. The helium layer can act as a kind of blasting cap, exploding in a small thermonuclear blast and sending a shock wave into the star that can ignite the core.

    This scenario is called D6, for dynamically driven double-degenerate double detonation. The idea was first developed in 2010 by James Guillochon, a researcher at the Harvard-Smithsonian Center for Astrophysics, and his colleagues. It leaves the smaller white dwarf battered but thrown free, like those Shen’s group found in the Gaia data. D6 was originally thought to require a hefty amount of helium, making it a rare event, but more recent modeling suggests just a few percent of a solar mass could be enough, Woosley says.

    One key feature of the D6 scenario is that the exploding white dwarf can be well below the critical mass, because the spark comes from a shock wave and not from gravitational pressure. A less massive exploding star will produce less nickel and be less bright.

    Recent studies of the metallic elements supernovae forge suggest the low-mass type Ia may be the norm. According to models, the production of manganese in type Ia supernovae is particularly sensitive to density in the white dwarf’s core: If the star is close to the 1.4–solar mass threshold, its high-density core produces lots of manganese; if the star is lighter—as is likely in the D6 mechanism—it produces one-tenth as much.

    As a result, manganese abundances derived from the light of stars today can hint at the masses of the ancient supernovae that seeded them with heavier elements. “Manganese provides an indirect way to probe previous generations of type Ia supernovae that went off in that galaxy,” says Ashley Ruiter of the University of New South Wales, Canberra.

    In a pioneering study from 2013, researchers led by Ivo Seitenzahl, then at the Julius Maximilian University of Würzburg, compared manganese abundance in the Sun with models of how much manganese would be produced by supernovae of different masses. They found that only half of the supernovae that exploded in the solar neighborhood in the past needed to be high mass to explain the Sun’s manganese content. “This was the first of a new wave of results,” says Maria Bergemann of the Max Planck Institute for Astronomy. This year, she and her colleagues reported looking at manganese in 42 stars across the Milky Way and concluded that the abundances suggest 75% of the galaxy’s type Ia supernovae were low mass.

    The implications of undersize type Ia supernovae extend far beyond the elements in the present-day universe. They also raise questions about the explosions’ long-standing role as “standard candles” for probing cosmic history.

    In 1998, researchers compared a few dozen distant type Ia supernovae with closer ones and found that they were dimmer than they should have been. They concluded that the universe’s expansion is accelerating, driven by some unknown dark energy—a discovery for which they were awarded the 2011 Nobel Prize in Physics. Supernova distances are also at the heart of a dispute over the value of the expansion rate itself, known as the Hubble constant. In the nearby universe the expansion rate is measured using standard candles such as type Ia supernovae; in the distant, early universe, it is derived from clues such as the cosmic microwave background, the echo of the big bang. When the effect of dark energy is taken out, the two values should agree—but they don’t.

    Could a not-so-standard candle jeopardize those discoveries? “It means something, but not that dark energy goes away,” Woosley says. Dark energy has been confirmed using other methods, so he’s not worried about that. But he thinks cosmologists will run into trouble as they put their theories to more rigorous tests that require more precise standard candles. “Supernovae could be less useful for precision cosmology,” he says.

    Astronomers already knew the peak brightness of type Ia supernovae isn’t perfectly consistent. To cope, they have worked out an empirical formula, known as the Phillips relation, that links peak brightness to the rate at which the light fades: Flashes that decay slowly are overall brighter than those that fade quickly. But more than 30% of type Ia supernovae stray far from the Phillips relation. Perhaps low-mass D6 explosions can explain these oddballs, Shen says. For now, those who wield the cosmic yardstick will need to “throw away anything that looks weird,” Gaensicke says, and hope for the best.
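The Phillips relation itself is simple enough to sketch in code. The coefficients below are approximate values from the published calibration literature, not from this article; the point is the shape of the relation, not the exact numbers.

```python
def phillips_peak_magnitude(delta_m15, m0=-19.3, slope=0.78):
    """Approximate Phillips relation for type Ia supernovae.

    delta_m15: how many magnitudes the B-band light curve fades in the
    15 days after peak. The relation is linear: fast faders (large
    delta_m15) peak dimmer, slow faders peak brighter.
    """
    return m0 + slope * (delta_m15 - 1.1)

slow_fader = phillips_peak_magnitude(0.9)   # brighter (more negative magnitude)
fast_fader = phillips_peak_magnitude(1.7)   # dimmer
assert slow_fader < fast_fader
```

A supernova that sits far from this line (more than 30% of them, by the article's count) cannot be standardized this way, which is why the oddballs get thrown out.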

    Andy Howell, a supernova watcher at Las Cumbres Observatory, thinks type Ia supernovae could still be reliable tools for cosmology if astronomers could separate the different varieties of type Ia that are now lumped together. “If we knew there were two populations, we could make the measurements even better,” he says.

    So far, astronomers can’t say how many of their favorite cosmic explosions are sparked by white dwarf pairs rather than a giant and a dwarf. “It’s too early to say with certainty what that fraction is,” Ruiter says. But the coming years could bring more clarity.

    Survey telescopes that scan the skies nightly or even hourly are catching more and more supernovae. The current frontrunner, the Zwicky Transient Facility in California, spots about 30 supernovae per night. Its output will be dwarfed in 2022 with the opening of the Vera C. Rubin Observatory, an 8.4-meter survey telescope in Chile that is expected to find thousands of supernovae nightly. Other telescopes able to obtain spectra from thousands of objects simultaneously will enable astronomers to study the explosions for the features—the blue flash, the hydrogen absorption lines—that could betray the involvement of a giant star.

    Shen and Gaensicke hope the next data release from Gaia will contain more high-speed white dwarfs fleeing from D6 explosions. And the Laser Interferometer Space Antenna, an orbiting gravitational wave detector due for launch in 2034, will be able to sense white dwarf pairs as they spiral in toward merger, giving astronomers a better idea of how common they really are. “It’s a real golden age for supernovae because we’re finding so many,” Howell says. “We’ve now finally got the tools to see them in new ways.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 9:47 am on February 9, 2020
    Tags: (SARAO)-South African Radio Astronomy Observatory, A massive $54 million expansion, Germany’s Max Planck Society, Science Magazine

    From Science Magazine: “This powerful observatory studying the formation of galaxies is getting a massive, $54 million expansion” 

    From Science Magazine

    Feb. 7, 2020
    Sarah Wild

    South Africa’s 64-dish MeerKAT telescope is set to grow by almost one-third, significantly increasing its sensitivity and ability to image the far reaches of the universe. The 20 new dishes come with a $54 million price tag, to be split evenly between the South African government and Germany’s Max Planck Society.

    MeerKAT, which will get 20 new dishes by 2022, will eventually become part of the Square Kilometre Array, which will be the largest radio telescope in the world. South African Radio Astronomy Observatory

    MeerKAT, a midfrequency dish array, is already the most sensitive telescope of its kind in the world [Nature]. Since its inauguration in 2018, it has captured the most detailed radio image of the center of the Milky Way and discovered giant radiation bubbles [Nature] within it.

    “The extended MeerKAT will be an even more powerful telescope to study the formation and evolution of galaxies throughout the history of the universe,” says Fernando Camilo, chief scientist at the South African Radio Astronomy Observatory (SARAO). Francisco Colomer, director of the Joint Institute for Very Long Baseline Interferometry European Research Infrastructure Consortium, says the expansion will “enhance an already impressive instrument.” The new dishes will have a slightly different design from the existing ones and a diameter of 15 meters instead of 13.5 meters.

    MeerKAT will eventually be folded into the Square Kilometre Array (SKA), which will be the largest radio telescope in the world; the new dishes, scheduled to come online in 2022, are designed to be part of SKA, says Rob Adam, SARAO’s managing director. SKA will comprise thousands of dishes across Africa and 1 million antennas in Australia and have a collecting area of 1 square kilometer, allowing scientists to look at the universe in unprecedented detail and investigate what happened immediately after the big bang, how galaxies form, and the nature of dark matter.
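As a rough consistency check (my arithmetic, not a figure from the article), one square kilometer of collecting area really does imply dishes in the thousands, if 15-meter dishes like MeerKAT's new ones had to supply it alone:

```python
import math

dish_diameter_m = 15.0
dish_area_m2 = math.pi * (dish_diameter_m / 2) ** 2   # about 177 m^2 per dish
dishes_for_1_km2 = 1_000_000 / dish_area_m2           # about 5,700 dishes

assert 1_000 < dishes_for_1_km2 < 10_000   # consistent with "thousands of dishes"
```

In practice SKA's collecting area will be split between the African dishes and the million Australian antennas, so the dish count alone need not reach that figure.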

    SKA is now trying to attract funding and new partners for the project, whose initial phase is set to cost about $1 billion. Construction is scheduled to begin in 2021 [Nature]. SKA data may not be available to astronomers until the end of the decade; the expansion of MeerKAT will allow the astronomical community to stay busy in the meantime, Colomer says.

    South Africa’s contribution to MeerKAT will be counted toward the country’s pledge for the first phase of SKA, Adam says. Germany’s relationship with SKA is complicated. The country was a member of the SKA Organisation, tasked with overseeing the design phase of the telescope, but pulled out in 2014. The Max Planck Society rejoined the organization last year, but Germany isn’t among the seven member countries that signed a treaty to actually establish the SKA Observatory in August 2019. If it decides to join that group, the German funding for MeerKAT will also count toward the country’s contribution, Adam says.

    The additional dishes will increase MeerKAT’s computing requirements by an order of magnitude, but Adam says the extension coincides with a planned update to the telescope’s hardware that capitalizes on advances in computer technology.


  • richardmitnick 4:47 pm on January 9, 2020
    Tags: "Department of Energy picks New York over Virginia for site of new particle collider", Science Magazine

    From BNL via Science Magazine: “Department of Energy picks New York over Virginia for site of new particle collider” 

    From Brookhaven National Lab


    Science Magazine

    Jan. 9, 2020
    Adrian Cho

    Nuclear physicists’ next dream machine will be built at Brookhaven National Laboratory in Upton, New York, officials with the Department of Energy (DOE) announced today. The Electron-Ion Collider (EIC) will smash a high-energy beam of electrons into one of protons to probe the mysterious innards of the proton. The machine will cost between $1.6 billion and $2.6 billion and should be up and running by 2030, said Paul Dabbar, DOE’s undersecretary for science, in a telephone press briefing.

    This schematic shows how the EIC will fit within the tunnel of the Relativistic Heavy Ion Collider (RHIC, background photo), reusing essential infrastructure and key components of RHIC.

    Electrons will collide with protons or larger atomic nuclei at the Electron-Ion Collider to produce dynamic 3-D snapshots of the building blocks of all visible matter.

    The EIC will allow nuclear physicists to track the arrangement of the quarks and gluons that make up the protons and neutrons of atomic nuclei.

    “It will be the first brand-new greenfield collider built in the country in decades,” Dabbar said. “The U.S. has been at the front end in nuclear physics since the end of the Second World War and this machine will enable the U.S. to stay at the front end for decades to come.”

    The site decision brings to a close the competition to host the machine. Physicists at DOE’s Thomas Jefferson National Accelerator Facility in Newport News, Virginia, had also hoped to build the EIC.

    Protons and neutrons make up the atomic nucleus, so the sort of work the EIC would do falls under the rubric of nuclear physics. Although they’re more common than dust, protons remain somewhat mysterious. Since the early 1970s, physicists have known that each proton consists of a trio of less massive particles called quarks. These bind to one another by exchanging other quantum particles called gluons.

    However, the detailed structure of the proton is far more complex. Thanks to the uncertainties inherent in quantum mechanics, its interior roils with countless gluons and quark-antiquark pairs that flit in and out of existence too quickly to be directly observed. And many of the proton’s properties—including its mass and spin—emerge from that sea of “virtual” particles. To determine how that happens, the EIC will use its electrons to probe the protons, colliding the two types of particles at unprecedented energies and in unparalleled numbers.

    Researchers at Jefferson lab already do similar work by firing their electron beam at targets rich with protons and neutrons. In 2017, researchers completed a $338 million upgrade to double the energy of the lab’s workhorse, the Continuous Electron Beam Accelerator Facility.

    With that electron accelerator in hand, Jefferson lab researchers had hoped to build the EIC by adding a new proton accelerator.

    Brookhaven researchers have studied a very different type of nuclear physics. Their Relativistic Heavy Ion Collider (RHIC) collides nuclei such as gold and copper to produce fleeting puffs of an ultrahot plasma of free-flying quarks and gluons like the one that filled the universe in the split second after the big bang. The RHIC is a 3.8-kilometer-long ring consisting of two concentric and counter-circulating accelerators. Brookhaven researchers plan to make the EIC by using one of the RHIC’s rings to accelerate the protons and to add an electron accelerator to the complex.

    To decide which option to take, DOE officials convened an independent EIC site selection committee, Dabbar says. The committee weighed numerous factors, including the relative costs of the rival plans, he says. Proton accelerators are generally larger and more expensive than electron accelerators.

    The Jefferson lab won’t be left out in the cold, Dabbar says. Researchers there have critical expertise in, among other things, making the superconducting accelerating cavities that will be needed for the new collider. So, scientists there will participate in designing, building, and operating the new collider. “We certainly look forward to [the Jefferson lab] taking the lead in these areas,” Dabbar says.

    The site decision does not commit DOE to building the EIC. The project must still pass several milestones before researchers can begin construction—including the approval of a detailed design, cost estimate, and construction schedule. That process can take a few years. However, the announcement does signal the end for the RHIC, which has run since 1999. To make way for the new collider, the RHIC will shut down for good in 2024, Dabbar said at the briefing.

    The decision on a machine still 10 years away reflects the relative good times for DOE science funding, Dabbar says. “We’ve been able to start on every major project that’s been on the books for years.” DOE’s science budget is up 31% since 2016—in spite of the fact that under President Donald Trump, the White House has tried to slash it every year.


    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

  • richardmitnick 8:04 am on August 29, 2019
    Tags: "Australia plans to tackle foreign influence at nation’s universities", Science Magazine

    From Science Magazine: “Australia plans to tackle foreign influence at nation’s universities” 

    From Science Magazine

    Aug. 28, 2019
    Dennis Normile

    A supporter of the Hong Kong, China, pro-democracy protests stands next to a “Lennon Wall” at the University of Queensland in Brisbane, Australia. Several such walls have been vandalized recently.

    In response to growing concerns in Australia about foreign influence at universities, cyberspying, and a perceived erosion of freedom of speech on campuses, the country’s education minister today announced that a new task force will develop “best-practice guidelines for dealing with foreign interference.”

    The decision grew out of recent meetings between university and government representatives, Minister for Education Dan Tehan said in a speech at the National Press Club of Australia in Canberra this afternoon. “Everybody wants a considered, methodical approach to deal with this issue,” he said, “one that strikes a balance between our national interest and giving universities the freedom to pursue research and collaboration. We must get the balance right.”

    Tehan did not mention China, according to a ministry transcript of the news conference. But it is clear the country is the primary concern. “There’ve been a series of mini-scandals throughout the tertiary education sector that show there is a big problem of foreign interference in universities coming from China, and the government has now realized that the universities themselves are not going to act,” says Clive Hamilton, an ethicist at Charles Sturt University in Canberra who has been outspoken in warning about threats to Australia’s universities.


    Chinese influence is a sensitive issue, however. On Monday, before the guideline plans had been announced, Michael Spence, vice-chancellor of the University of Sydney in Australia, said on a radio program that the debate over Chinese connections has become “slightly hysterical.”


    The University Foreign Interference Taskforce will draw half of its participants from the nation’s universities; Department of Education officials and government security experts will make up the other half. The task force will have four working groups focusing on cybersecurity, intellectual property, foreign collaborations, and communications to raise awareness of security issues. It will aim to produce guidelines by November.

    The recent incidents include a massive breach of Australian National University’s computer systems, revealed in June, that netted the hackers—suspected of being based in China—personal details on up to 200,000 students and staff dating back 19 years.

    There have also been allegations of universities unwittingly working with entities connected to China’s military. On Monday, the Australian Strategic Policy Institute published a report claiming that artificial intelligence software being used to surveil the minority Uyghur population in China’s Xinjiang region “may have benefited from connections with Australian universities and Australian government funding.”

    A 20 August report by sociologist Salvatore Babones of the University of Sydney also warned that Australia’s universities have become overly reliant on international—and particularly Chinese—student fees. At the seven top universities, tuition fees paid by Chinese students account for 13% to 23% of total revenues, which puts the institutions in a precarious financial position, Babones writes in a report published by the Center for Independent Studies, a Sydney-based think tank. The report notes that 11% of all university students in Australia hail from China, versus 2% in the United States and 6% in the United Kingdom. The more than 150,000 Chinese students in Australian higher educational institutions account for 38% of international enrollees.

    That large Chinese presence raises other concerns as well. In recent months, so-called Lennon Walls, where people could post notes of encouragement to pro-democracy protesters in Hong Kong, China, have been repeatedly vandalized. On several occasions, scuffles broke out between pro–Hong Kong demonstrators and those supporting mainland China. Punches were thrown during a confrontation on the campus of University of Queensland (UQ) in Brisbane, Australia.

    “What the recent demonstrations on university campuses over Hong Kong show is that universities remain firmly committed to freedom of expression,” UQ Chancellor Peter Varghese wrote in a statement responding to the public clamor over such incidents. “Restricting that freedom through intimidation and disruption is unacceptable, as is threatening the families of those who participated.”

    Against that background, Hamilton says, “The government gives every impression that this is going to be a thorough-going review leading to major changes.”


  • richardmitnick 9:19 am on August 7, 2019
    Tags: Science Magazine, Terrascope - a 1-meter space telescope positioned beyond the moon

    From Science Magazine: “Space telescope would turn Earth into a giant magnifying lens” 

    From Science Magazine

    Aug. 6, 2019
    Daniel Clery

    A space telescope beyond the moon could use Earth’s atmosphere as a lens to magnify the light of distant objects by 22,500 times.
    James Tuttle Keane/California Institute of Technology

    When it is finished sometime next decade, Europe’s Extremely Large Telescope will be the largest in the world, with a mirror nearly 40 meters across.

    ESO/E-ELT, to be located at the summit of Cerro Armazones in the Atacama Desert of northern Chile, at an altitude of 3060 meters (10,040 feet).

    But one astronomer has proposed an even more powerful space telescope—one with the equivalent of a 150-meter mirror—that would use Earth’s atmosphere itself as a natural lens to gather and focus light. Astronomer David Kipping of Columbia University has worked out that a 1-meter space telescope, positioned beyond the moon, could use the focusing power of the ring of atmosphere seen around the edge of the planet to amplify the brightness of dim objects by tens of thousands of times.

    The atmosphere is too variable for a Terrascope, as Kipping calls it, to produce beautiful images to rival those from the Hubble Space Telescope. But it could discover much fainter objects than is now possible, including small exoplanets or Earth-threatening asteroids. Kipping acknowledges that more work is needed to prove the idea, but the necessary technology already exists. “None of this is reinventing the wheel, it just needs to be pushed a bit harder,” he says.

    Astronomers who read the paper Kipping posted last week on arXiv were both delighted and cautious. Matt Kenworthy, of Leiden University in the Netherlands, says he was “blown away by how much work and thought he had put into it” but wants more evidence that it will work. “I’d want to sit down and do a more realistic model,” he says. Bruce Macintosh of Stanford University in Palo Alto, California, adds: “It’s an interesting thought experiment, but there are a lot of details to think through.”

    Kipping is well known for leading searches for moons in other planetary systems and he revealed a strong contender for the first exomoon last year. He says the germ of the Terrascope idea came 13 years ago when he was studying a rare atmospheric phenomenon called the green flash, which appears just as the sun sets below the horizon, when refraction and scattering in the atmosphere work together to momentarily select green from the sun’s light. He realized that, from the right vantage in space, you might see an entire green ring, when the sun passed behind Earth and its light was refracted by the ring of air around the planet’s circumference.

    Kipping was also inspired by the idea that the sun itself could be used as a lens, with its gravity focusing light toward a space-based detector. Such a solar lens would magnify light 1 million billion times, potentially bringing the surfaces of exoplanets into view. The idea led to the Fast Outgoing Cyclopean Astronomical Lens mission, proposed to the European Space Agency in 1993. But it never gained traction because the detector would have to be positioned 550 times the Earth-sun distance out in space, nearly 20 times farther away than Neptune—a distance that would require a century for a spacecraft to reach.

    But Terrascope could be much closer to home, Kipping says. He calculated that surface-skimming light from an object directly behind Earth is deflected to a focus 85% of the distance out to the moon. Light reaching that focal point is likely to encounter clouds and a lot of turbulence as it passes through the lower atmosphere. But move the detector 1.5 million kilometers away, to a focus four times farther than the moon, and it would sample light that has passed through the much calmer and cloud-free stratosphere at an altitude of 13.7 kilometers.
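The 85% figure can be roughly reproduced with elementary optics, though the numbers below are my back-of-envelope assumptions rather than Kipping's detailed calculation. A grazing ray crossing the whole atmosphere is bent by roughly twice the ~34 arcminutes of refraction seen at the horizon, and a ring lens that deflects light by angle θ focuses it at distance f ≈ R_Earth / θ:

```python
import math

R_EARTH_KM = 6371.0
LUNAR_DISTANCE_KM = 384_400.0

# Assumed total deflection of a surface-grazing ray: about twice the
# standard ~34 arcminutes of horizon refraction (an approximation).
deflection_rad = math.radians(2 * 34 / 60)

focus_km = R_EARTH_KM / deflection_rad    # small-angle lens formula
fraction = focus_km / LUNAR_DISTANCE_KM   # roughly 0.8 of the way to the moon

print(f"focus ~{focus_km:,.0f} km, ~{fraction:.0%} of the lunar distance")
```

Rays refracted higher in the atmosphere bend less and focus farther out, which is why moving the detector to four lunar distances selects the calmer stratospheric light.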

    A 1-meter telescope at that distance, observing for a whole night, would see an object boosted to 22,500 times its original brightness, he calculates—the equivalent of using a 150-meter telescope. The Terrascope’s powerful amplification means it would excel in detecting very faint objects or discerning very slight changes in brightness, Kipping says, enabling it to scan the sky for very small and dim asteroids or measure the tiny dips in brightness as small exoplanets pass in front of bright stars.
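The equivalence between a brightness boost and a bigger mirror follows from collecting area alone; a quick sanity check of the article's numbers:

```python
import math

# Light grasp scales with collecting area, i.e. with diameter squared,
# so a 1 m telescope whose input is amplified B times gathers as much
# light as a telescope of diameter sqrt(B) meters.
boost = 22_500
aperture_m = 1.0
equivalent_aperture_m = aperture_m * math.sqrt(boost)

assert equivalent_aperture_m == 150.0   # matches the "150-meter" claim
```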

    To avoid being dazzled by the bright disk of Earth, the telescope would need a mask, known as a coronagraph, to block it out. Kipping also says he has yet to consider the impact of “airglow,” a dim light emitted in the upper atmosphere by luminescence and other processes. But he notes that the glow could conceivably be removed with filters or digitally, taking advantage of the fact that it is steady while objects of interest are constantly changing. The Terrascope concept could be tested, he says, with a cheap toaster-size CubeSat mission.

    Kenworthy says the atmosphere’s variability might seriously degrade the Terrascope’s image quality. “The next step would be ray tracing with a realistic Earth atmosphere model,” he says. Ideally, the giant lens should focus light to a spot. “In reality, you’ll probably get a pattern of blobs.”

    Macintosh agrees. “Earth’s atmosphere is a pretty nonideal lens so it produces very blurry images,” he says. But the Terrascope may still find a role studying brightness changes in very faint objects, serving as a huge magnifying “light bucket.”

    If nothing else, Kipping has got astronomers talking about the idea. “I wouldn’t launch a satellite on this paper alone,” Kenworthy says. “But it’s an excellent first step.”


  • richardmitnick 7:23 am on July 9, 2019
    Tags: Deleting mentions of ‘climate change’ from U.S. Geological Survey press releases, Science Magazine

    From Science Magazine: “Trump officials deleting mentions of ‘climate change’ from U.S. Geological Survey press releases” 

    From Science Magazine

    Jul. 8, 2019
    Scott Waldman

    Under Director James Reilly, the U.S. Geological Survey has drawn criticism for deemphasizing concerns about climate change. NASA

    A March news release from the U.S. Geological Survey (USGS) touted a new study that could be useful for infrastructure planning along the California coastline.

    At least that’s how President Donald Trump’s administration conveyed it.

    The news release hardly stood out. It focused on the methodology of the study rather than its major findings, which showed that climate change could have a withering effect on California’s economy by inundating real estate over the next few decades.

    An earlier draft of the news release, written by researchers, was sanitized by Trump administration officials, who removed references to the dire effects of climate change after delaying its release for several months, according to three federal officials who saw it. The study, published in the journal Scientific Reports, showed that California, the world’s fifth-largest economy, would face more than $100 billion in damages related to climate change and sea-level rise by the end of the century. It found that three to seven times more people and businesses than previously believed would be exposed to severe flooding.

    “We show that for California, USA, the world’s fifth largest economy, over $150 billion of property equating to more than 6% of the state’s GDP and 600,000 people could be impacted by dynamic flooding by 2100,” the researchers wrote in the study.

    The release fits a pattern of downplaying climate research at USGS and in other agencies within the administration. While USGS does not appear to be halting the pursuit of science, it has publicly communicated an incomplete account of the peer-reviewed research or omitted it under President Trump.

    “It’s been made clear to us that we’re not supposed to use climate change in press releases anymore. They will not be authorized,” one federal researcher said, speaking anonymously for fear of reprisal.

    In the Obama administration, press releases related to climate change were typically approved within days, researchers said. Now, they can take more than six months and go through the offices of political appointees, where they are often altered, several researchers told E&E News.

    In the case of the California coastline study, the press release went through the office of James Reilly, the director of USGS, a former astronaut who is attempting to minimize the consideration of climate change in agency decisions. Reilly is preparing a directive for agency scientists to use climate models that predict changes through 2040, when the effect of emissions is expected to be less severe. The New York Times first reported on the directive.

    At his 2018 confirmation hearing, Reilly promised to protect the agency’s scientific integrity.

    “If someone were to come to me and say, ‘I want you to change this because it’s the politically right thing to do,’ I would politely decline,” Reilly told lawmakers. “I’m fully committed to scientific integrity.”

    A spokeswoman for USGS said the agency has no formal policy to avoid references to climate change.

    “There is no policy nor directive in place that directs us to avoid mentioning climate change in our communication materials,” said Karen Armstrong, the spokeswoman.

    “Scientists at USGS regularly develop new methods and tools to supply timely, relevant and useful information about our planet and its processes, and we are committed to promoting the science they develop and making it broadly available,” she added.

    The agency’s press release about the California coastline study was significantly altered to mask the potential impact of rising temperatures on the state’s economy. Instead, it described the methodology of the study and how it relied on “state-of-the-art computer models” and various sea-level rise predictions.

    “USGS scientists and collaborators used state-of-the-art computer models to determine the coastal flooding and erosion that could result from a range of peer-reviewed, published 21st-century sea level rise and storm scenarios,” the final press release said. “The authors then translated those hazards into a range of projected economic and social exposure data to show the lives and dollars that could be at risk from climate change in California during the 21st century.”

    The USGS release didn’t include the dollar figures outlined in the study.

    An earlier draft of the press release, which was put online by the environmental group Point Blue Conservation Science, a participant in the study, compared the possible effect on Californians to the devastation of Hurricane Katrina. The release had stark recommendations for coastal planners and emphasized that by the end of the century, a typical winter storm could threaten $100 billion in coastal real estate annually.

    “According to the study, even modest sea level rise projections of ten inches (25 centimeters) by 2040 could flood more than 150,000 residents and affect more than $30 billion in property value when combined with an extreme 100-year storm along California’s coast,” the draft stated. “Societal exposure that included storms was up to seven times greater than with sea level rise alone.”

    The agency has omitted climate change from other press releases.

    A release in 2017 that publicized a study on how polar bears were expending more energy due to a loss of sea ice did not mention climate change. It noted that a “moving treadmill of sea ice” in the warming Arctic forced polar bears to hunt for more seals and placed pressure on their population in the Beaufort and Chukchi seas, without stating that climate change is a key driver of sea ice conditions.

    Another USGS release, on shifting farming regions due to climate change, mentioned “future high-temperature extremes” and “future climate conditions” but not climate change. The first sentence of the study that it was intended to promote mentions climate change. It was published in Scientific Reports.

    Some of the USGS studies point to national security repercussions. One study released last year found that a military installation in the Pacific Ocean that would play a role in a possible nuclear strike by North Korea could become uninhabitable in less than two decades due to climate change. The study, which was ordered by the Department of Defense, was released by USGS without a press release.

    USGS conducts important climate research and manages the Landsat satellite system that has tracked human-caused global changes for almost 50 years. Government researchers study sea-level rise and glacial melt and manage regional climate adaptation centers housed at universities from Hawaii to Massachusetts.

    Allowing valuable information to fall through the cracks is a waste of taxpayer dollars and could prevent science from being included in policy decisions, said Joel Clement, a former climate staffer for the Department of the Interior, USGS’s parent agency. Clement, who is now a senior fellow at the Harvard Kennedy School’s Belfer Center for Science and International Affairs, said the promotion of studies is an important way to get information into the hands of planners, homeowners, and policymakers. He said Interior appears to be suppressing climate science.

    “It’s an insult to the science, of course, but it’s also an insult to the people who need this information and whose livelihoods and in some cases their lives depend on this,” Clement said. “What’s shocking about it is that this has been taken to a new level, where information that is essential to economic and health and safety—essentially American well-being—is essentially being shelved and being hidden.”

    In the last year of the Obama administration, USGS distributed at least 13 press releases that focused on climate change and highlighted it in the headline, according to an E&E News review. Since then — from 2017 through the first six months of 2019 — none has mentioned climate change in the headline of the press release, according to the list of state and national releases posted on the USGS website. Some briefly mentioned climate change in the body of the release, while others did not refer to it at all.

    Other studies have been quietly buried on the agency’s webpages.

    That subtle form of suppression fits a pattern elsewhere in the federal government.

    Politico recently reported that officials at the Department of Agriculture buried dozens of studies related to climate change. In one case, agency officials tried to prevent outside groups from disseminating a climate-related study. The research looked at how rice provides less nutrition in a carbon-rich environment. That could have global consequences because hundreds of millions of people have rice-based diets around the world.

    The Interior Department has been accused of deleting climate change references from previous press releases. In 2017, The Washington Post reported that the agency deleted a line mentioning climate change in a press release about a study on flood risks to coastal communities. That line was: “Global climate change drives sea-level rise, increasing the frequency of coastal flooding.”

    Interior Secretary David Bernhardt, a former energy lobbyist, is under investigation for his ties to the energy industry while serving in government. A separate investigation is exploring whether he sought to block an Interior Department study on the dangers that a pesticide posed to endangered species.

    There is no evidence that Trump political appointees at the agency have blocked climate studies from taking place, but the censoring of press releases has affected the work of researchers worried about their jobs, according to another federal researcher.

    “We are pretty cognizant of political pressures, and with these press releases people are definitely biting their nails over ‘how should we word this’ and if there are proposals within USGS, should we use climate change or not,” the researcher said. “It’s a lot of stuff that definitely filters down, and it affects the reality of people on the ground doing the work when you’re not sure of how I should present this. It’s definitely a huge waste of time.”

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

  • richardmitnick 11:34 am on June 24, 2019 Permalink | Reply
    Tags: "As countries battle for control of North Pole, science is the ultimate winner", Science Magazine

    From Science Magazine: “As countries battle for control of North Pole, science is the ultimate winner” 

    From Science Magazine

    Jun. 20, 2019
    Richard Kemeny

    Canadian and U.S. Coast Guard ships worked together to map the Arctic sea floor for continental shelf claims. DVIDSHUB/FLICKR/CC BY 2.0

    A competition for the North Pole heated up last month, as Canada became the third country to claim—based on extensive scientific data—that it should have sovereignty over a large swath of the Arctic Ocean, including the pole. Canada’s bid, submitted to the United Nations’ Commission on the Limits of the Continental Shelf (CLCS) on 23 May, joins competing claims from Russia and Denmark. Like theirs, it is motivated by the prospect of mineral riches: the large oil reserves believed to lie under the Arctic Ocean, which will become more accessible as the polar ice retreats. And all three claims, along with dozens of similar claims in other oceans, rest on extensive seafloor mapping, which has proved to be a boon to science, whatever the outcome for individual countries. The race to obtain control over parts of the sea floor has “dramatically changed our understanding of the oceans,” says marine geophysicist Larry Mayer of the University of New Hampshire in Durham.

    Coastal nations have sovereign rights over an exclusive economic zone (EEZ), extending by definition 200 nautical miles (370 kilometers) out from their coastline. But the 1982 United Nations Convention on the Law of the Sea opened up the possibility of expanding that zone if a country can convince CLCS that its continental shelf extends beyond the EEZ’s limits.
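    The convention’s distance limits are defined in nautical miles, with one international nautical mile equal to exactly 1.852 kilometers. As a quick sanity check on the figures above, the conversion can be sketched in a few lines (the 350-nautical-mile outer constraint comes from Article 76 of the convention, not from this article):

    ```python
    # One international nautical mile is defined as exactly 1.852 km.
    NM_TO_KM = 1.852

    def nm_to_km(nautical_miles: float) -> float:
        """Convert nautical miles to kilometers."""
        return nautical_miles * NM_TO_KM

    # The 200-nautical-mile exclusive economic zone quoted in the text:
    print(f"EEZ limit: {nm_to_km(200):.1f} km")          # 370.4 km, rounded to 370 above

    # Article 76 caps most extended-shelf claims at 350 nautical miles from shore:
    print(f"Extended-shelf cap: {nm_to_km(350):.1f} km") # 648.2 km
    ```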

    Most of the 84 submissions so far were driven by the prospect of oil and gas, although advances in deep-sea mining technology have added new reasons to apply. Brazil, for example, filed an application in December 2018 that included the Rio Grande Rise, a deep-ocean mountain range 1500 kilometers southeast of Rio de Janeiro that’s covered in cobalt-rich ferromanganese crusts.

    To make a claim, a country has to submit detailed data on the shape of the sea floor and on its sediment, which is thicker on the shelf than in the deep ocean. The data come from sonar, which reveals seafloor topography, and from seismic profiling, which uses low-frequency booms to probe the sediment. Canada’s bid also enlisted ships to conduct high-resolution gravimetry—measurements of gravity anomalies that reveal seafloor structure. Elevated gravity readings occur over the higher-density mantle rocks that underlie oceanic crust, and lower readings over lighter continental structures. And the bid drew on analyses of 800 kilograms of rock samples dredged up from the sea floor, whose composition can distinguish continental from oceanic crust.

    The studies don’t come cheap; Canada’s 17 Arctic expeditions alone cost more than CA$117 million. But the work by the three countries vying for the Arctic—and that of dozens of others elsewhere in the world—has been a bonanza for oceanography. In the Arctic alone, the mapping has revealed several sunken mountains, previously missed or undetected by older sonar methods. Hundreds of pockmarks found on the Chukchi Cap, a submarine plateau extending out from Alaska, suggest that bursts of previously frozen methane have erupted from the seabed, a phenomenon that could accelerate climate change. And gaps discovered across submarine ridges allow currents to flow from basin to basin, with “important ramifications on the distribution of heat in the Arctic and on overall modeling of climate and ice melting,” Mayer says.

    Who owns the North Pole? Countries can claim the sea floor beyond the 200-nautical-mile (370-kilometer) exclusive economic zone (EEZ) if data show it to be an extension of the continental shelf. Russia, Denmark, and Canada have submitted overlapping claims in the Arctic Ocean.

    CLCS, composed of 21 scientists in fields such as geology and hydrography who are elected by member states, has accepted 24 of the 28 claims it has finished evaluating, some partially or with caveats; in several cases, it has asked for follow-up submissions with more data. Australia was the first country to succeed, adding 2.5 million square kilometers to its territory in 2008. New Zealand gained undersea territory six times larger than its terrestrial area. But CLCS only judges the merit of each individual scientific claim; it has no authority to decide boundaries when claims overlap. To do that, countries have to turn to diplomatic channels once the science is settled.

    The three claims on the North Pole revolve around the Lomonosov Ridge, an underwater mountain system that runs from Ellesmere Island in Canada’s Qikiqtaaluk region to the New Siberian Islands of Russia, passing the North Pole. Both countries claim the ridge is geologically connected to their continent, whereas Denmark says it is also tied to Greenland, a Danish territory. As the ridge is thought to be continental crust, the territorial extensions could be extensive. (U.S. scientists should finish mapping in the Arctic in about 2 years, says Mayer, who is involved in that effort, but as one of the few countries that hasn’t ratified the Law of the Sea convention, the United States can’t file an official submission.)

    Tensions flared when Russia planted a titanium flag on the sea floor beneath the North Pole in 2007, after CLCS rejected its first claim, saying more data were needed. The Canadian foreign minister at the time likened the move to the land grabs of early European colonizers. Not that the North Pole has any material value: “The oil potential there is zip,” says geologist Henry Dick of the Woods Hole Oceanographic Institution in Massachusetts. “The real fight is over the Amerasian Basin,” Dick says (see map, above) where large amounts of oil are thought to be locked up.

    It will take years, perhaps decades, for CLCS to rule on the overlapping Arctic claims. Whoever wins the scientific contest still faces a diplomatic struggle.

    Denmark, Russia, and Canada have expressed their desire to settle the situation peacefully. “Russia actually has played nice on this and stopped at the North Pole,” rather than extending its claim along the length of the ridge, says Philip Steinberg, a political geographer at Durham University in the United Kingdom. Denmark had no such qualms and put in a claim up to the edge of Russia’s EEZ, “even though there’s no way in hell they’ll get that,” when it comes to the diplomatic discussions, Steinberg says.

    One solution would be to use the equidistance principle, by drawing a median line between the coastlines, as has been done when proposed marine territories overlapped in the past; doing so would mean the North Pole falls to Denmark. There’s also a proposal to make the pole international, like Antarctica, as a sign of peace, says Oran Young, a political scientist at the University of California, Santa Barbara. “It seems a very sensible idea.”
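    The equidistance principle described above can be illustrated with a toy calculation: a point on the sea floor is assigned to whichever country’s nearest coastline is closest. The basepoint coordinates below are rough, illustrative approximations of each country’s northernmost land (near Cape Columbia, Cape Morris Jesup, and Franz Josef Land), not surveyed baseline points, and a spherical Earth is assumed rather than the ellipsoidal geodesics used in real delimitation:

    ```python
    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance in km between two points (spherical Earth)."""
        R = 6371.0  # mean Earth radius, km
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
        return 2 * R * math.asin(math.sqrt(a))

    def equidistance_winner(point, basepoints):
        """Assign a point to the country whose basepoint is nearest to it."""
        best = min(basepoints, key=lambda bp: haversine_km(*point, bp[1], bp[2]))
        return best[0]

    # Illustrative (not surveyed) basepoints near each claimant's northernmost coast:
    basepoints = [
        ("Canada", 83.1, -74.0),   # near Cape Columbia, Ellesmere Island
        ("Denmark", 83.6, -34.0),  # near Cape Morris Jesup, Greenland
        ("Russia", 81.8, 59.0),    # near Franz Josef Land
    ]
    north_pole = (90.0, 0.0)
    print(equidistance_winner(north_pole, basepoints))
    ```

    With these approximate basepoints, the nearest coastline to the pole is Greenland’s, consistent with the article’s observation that a median-line solution would give the North Pole to Denmark.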

    See the full article here.


