
  • richardmitnick 8:37 am on January 17, 2020
    Tags: "The blob", Common Murre die-off, Food chain

    From University of Washington: “‘The blob,’ food supply squeeze to blame for largest seabird die-off” 

    From University of Washington

    January 15, 2020
    Michelle Ma

    A recently dead common murre found by a citizen scientist on a routine monthly survey in January 2016. An intact, fresh bird indicates scavengers have not yet arrived. This carcass has probably only been on the beach a few hours. Credit: COASST

    The common murre is a self-sufficient, resilient bird.

    Though the seabird must eat about half of its body weight in prey each day, common murres are experts at catching the small “forage fish” they need to survive. Herring, sardines, anchovies and even juvenile salmon are no match for a hungry murre.

    So when nearly one million common murres died at sea and washed ashore from California to Alaska in 2015 and 2016, it was unprecedented — both for murres, and across all bird species worldwide. Scientists from the University of Washington, the U.S. Geological Survey and others blame an unexpected squeeze on the ecosystem’s food supply, brought on by a severe and long-lasting marine heat wave known as “the blob.”

    Their findings were published Jan. 15 in the journal PLOS ONE.

    “Think of it as a run on the grocery stores at the same time that the delivery trucks to the stores stopped coming so often,” explained second author Julia Parrish, a UW professor in the School of Aquatic and Fishery Sciences. “We believe that the smoking gun for common murres — beyond the marine heat wave itself — was an ecosystem squeeze: fewer forage fish and smaller prey in general, at the same time that competition from big fish predators like walleye pollock and Pacific cod greatly increased.”

    Common murre eggs show consistent patterns year after year. Photo by Christiana Carvalho. Hakai Magazine

    Common murres nest in colonies along cliffs and rocky ledges overlooking the ocean. The adult birds, about one foot in length, are mostly black with white bellies, and can dive more than two football fields below the ocean’s surface in search of prey.

    Warmer surface water temperatures off the Pacific coast — a phenomenon known as “the blob” [above] — first occurred in the fall and winter of 2013, and persisted through 2014 and 2015. Warming increased with the arrival of a powerful El Niño in 2015-2016. A number of other species experienced mass die-offs during this period, including tufted puffins, Cassin’s auklets, sea lions and baleen whales. But the common murre die-off was by far the largest any way you measure it.

    From May 2015 to April 2016, about 62,000 murre carcasses were found on beaches from central California north through Alaska. Citizen scientists in Alaska monitoring long-term sites counted numbers that reached 1,000 times more than normal for their beaches. Scientists estimate that the actual number of deaths was likely close to one million, since only a fraction of birds that die will wash to shore, and only a fraction of those will be in places that people can access.
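    A rough back-of-the-envelope version of that extrapolation is sketched below. The recovery fractions are purely illustrative assumptions, not the values from the study's carcass-deposition modeling; they only show how a count of 62,000 carcasses can imply roughly a million deaths.

```python
# Illustrative sketch only: the two fractions below are hypothetical placeholders,
# not the values used in the PLOS ONE study.
counted_carcasses = 62_000        # carcasses found on surveyed beaches (from the article)
frac_washed_ashore = 0.15         # assumed fraction of birds dying at sea that reach a beach
frac_beaches_surveyed = 0.40      # assumed fraction of shoreline that people actually survey

estimated_total_deaths = counted_carcasses / (frac_washed_ashore * frac_beaches_surveyed)
print(f"Estimated total deaths: ~{estimated_total_deaths:,.0f}")
# With these illustrative fractions the estimate lands near one million,
# matching the order of magnitude quoted by the scientists.
```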

    Many of the birds that died were breeding-age adults. With massive shifts in food availability, murre breeding colonies across the entire region failed to produce chicks for the years during and after the marine heat wave event, the authors found.

    “The magnitude and scale of this failure has no precedent,” said lead author John Piatt, a research biologist at the U.S. Geological Survey’s Alaska Science Center and an affiliate professor in the UW School of Aquatic and Fishery Sciences. “It was astonishing and alarming, and a red-flag warning about the tremendous impact sustained ocean warming can have on the marine ecosystem.”

    From a review of fisheries studies conducted during the heat wave period, the research team concluded that persistent warm ocean temperatures associated with “the blob” increased the metabolism of cold-blooded organisms from zooplankton and small forage fish up through larger predatory fish like salmon and pollock. With predatory fish eating more than usual, the demand for food at the top of the food chain was unsustainable. As a result, the once-plentiful schools of forage fish that murres rely on became harder to find.

    “Food demands of large commercial groundfish like cod, pollock, halibut and hake were predicted to increase dramatically with the level of warming observed with the blob, and since they eat many of the same prey as murres, this competition likely compounded the food supply problem for murres, leading to mass mortality events from starvation,” Piatt said.

    As the largest mass die-off of seabirds in recorded history, the common murre event may help explain the other die-offs that occurred during the northeast Pacific marine heat wave, and also serve as a warning for what could happen during future marine heat waves, the authors said.

    UW scientists recently identified another marine heatwave forming off the Washington coast and up into the Gulf of Alaska.

    “All of this — as with the Cassin’s auklet mass mortality and the tufted puffin mass mortality — demonstrates that a warmer ocean world is a very different environment and a very different coastal ecosystem for many marine species,” said Parrish, who is also the executive director of the Coastal Observation and Seabird Survey Team, known as COASST. “Seabirds, as highly visible members of that system, are bellwethers of that change.”

    Additional UW co-authors are Timothy Jones, Hillary Burgess and Jackie Lindsey. Other study co-authors are from U.S. Geological Survey, U.S. Fish and Wildlife Service, Farallon Institute, International Bird Rescue, Humboldt State University, National Park Service, NOAA Fisheries, Moss Landing Marine Laboratories, NOAA Greater Farallones National Marine Sanctuary and Point Blue Conservation Science.

    This research was funded by the USGS Ecosystems Mission Area, the North Pacific Research Board, The National Science Foundation and the Washington Department of Fish and Wildlife.

    For more information, contact Parrish at jparrish@uw.edu and Piatt at piattjf@gmail.com.

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and in the process transform lives and our world. For more about our impact on the world, every day.
    So what defines us —the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

     
  • richardmitnick 8:12 am on January 17, 2020

    From Commonwealth Scientific and Industrial Research Organisation -CSIRO: “Leading Australian telescopes to get technology upgrades” 


    From Commonwealth Scientific and Industrial Research Organisation -CSIRO

    17 Jan 2020
    Gabby Russell
    +61 2 9490 8002

    CSIRO/Parkes Observatory, located 20 kilometres north of the town of Parkes, New South Wales, Australia, 414.80m above sea level

    CSIRO’s iconic Parkes radio telescope – fondly known as ‘The Dish’ – will get a new receiver that will significantly increase the amount of sky it can see at any one time, enabling new science and supporting local innovation in the space sector.

    The receiver is one of two projects announced today that will deliver technology enhancements for Australia’s leading radio telescopes.

    Australian Research Council Linkage Infrastructure, Equipment and Facilities (LIEF) grants have been awarded for the development of a new receiver for the Parkes radio telescope, and a major upgrade for the Australia Telescope Compact Array near Narrabri in NSW.

    CSIRO’s Australia Telescope Compact Array at the Paul Wild Observatory is an array of six 22-m antennas located about twenty-five kilometres (16 mi) west of the town of Narrabri in Australia.

    Both telescopes are owned and operated by Australia’s national science agency, CSIRO, for use by astronomers in Australia and around the world.

    A $1.15M LIEF grant will support a $3M project to build a sensitive receiver called a ‘cryoPAF’ for the Parkes radio telescope.

    Once complete, the new cryoPAF will sit high above the Parkes telescope’s dish surface and receive radio signals reflected up from the dish.

    Its detectors will convert radio signals into electrical ones, which can be combined in different ways so that the telescope ‘looks’ in several different directions at once.
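    That "looking in several directions at once" comes from beamforming: the same set of detector outputs can be summed with different sets of phase weights, each set steering a separate beam. Below is a minimal narrowband sketch of the idea; the element spacing, frequency and angles are illustrative assumptions, not CSIRO's actual cryoPAF design.

```python
import numpy as np

# Toy 1-D array of detector elements; spacing and frequency are illustrative only.
c = 3e8                               # speed of light, m/s
freq = 1.4e9                          # a frequency within the 700 MHz - 1.9 GHz band
lam = c / freq
positions = np.arange(8) * lam / 2    # 8 elements at half-wavelength spacing

def steering_weights(theta_deg):
    """Phase weights that point a beam toward angle theta (degrees from zenith)."""
    theta = np.radians(theta_deg)
    return np.exp(-2j * np.pi * positions * np.sin(theta) / lam)

# One block of complex voltage samples per element (random noise standing in for sky signal).
rng = np.random.default_rng(0)
samples = rng.standard_normal((8, 4096)) + 1j * rng.standard_normal((8, 4096))

# The same samples are combined with several different weight sets,
# forming multiple simultaneous beams on the sky.
beams = {angle: steering_weights(angle) @ samples for angle in (-10, 0, 10)}
for angle, voltage in beams.items():
    print(angle, "deg beam power:", np.mean(np.abs(voltage) ** 2))
```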

    The cryoPAF will be cooled to -253°C to reduce ‘noise’ in its electrical circuits, enhancing the ability to detect weak radio signals from the cosmos at frequencies from 700 MHz to 1.9 GHz.

    The grant was led by The University of Western Australia, which will coordinate construction and commissioning of the cryoPAF. CSIRO will design, build and install the instrument.

    There are five further research organisations involved in the project.

    Professor Lister Staveley-Smith from The University of Western Australia node of ICRAR, who led the grant application, said the cryoPAF has three times the field of view of the previous instrument, allowing quicker and more complete surveys of the sky.

    “The new receiver will help astronomers to study fast radio bursts and pulsar stars, and observe hydrogen gas throughout the Universe,” Professor Staveley-Smith said.

    A phased-array feed or PAF is a close-packed array of radio detectors.

    CSIRO has previously designed and built innovative phased-array feeds for its ASKAP telescope in Western Australia, and a test version of the cryoPAF was used successfully on the Parkes telescope in 2016.

    Director of CSIRO Astronomy and Space Science, Dr Douglas Bock, said that in addition to boosting the capabilities of the Parkes telescope, the cryoPAF receiver technology had the potential to create spin-off opportunities.

    “Phased arrays have found extensive use in defence radar, medical imaging and even optical laser beam steering, with emerging applications in satellite communications and telecommunications,” Dr Bock said.

    “Their further development at radio wavelengths has technology applications beyond radio astronomy with the potential to fuel the growth of space-related industries here in Australia.”

    A second LIEF grant, worth $530,000, will support a $2.6M upgrade of the Australia Telescope Compact Array.

    The existing digital signal processor will be replaced with a GPU-powered processor to double the bandwidth of the telescope’s signal electronics.
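    Doubling the digitized bandwidth doubles the data rate the signal processor must channelize and correlate in real time, which is where GPUs help. The toy FFT channelizer below illustrates the kind of operation involved; the bandwidth figures and block sizes are illustrative assumptions, not the Compact Array's actual correlator design.

```python
import numpy as np

def channelize(voltages: np.ndarray, n_channels: int) -> np.ndarray:
    """Split a stream of complex voltage samples into frequency channels via blocked FFTs."""
    n_blocks = voltages.size // n_channels
    blocks = voltages[: n_blocks * n_channels].reshape(n_blocks, n_channels)
    return np.fft.fft(blocks, axis=1)      # one spectrum per time block

# Illustrative numbers: doubling the sampled bandwidth doubles the sample rate,
# and therefore doubles the number of samples to be processed each second.
old_bandwidth_hz = 2e9
new_bandwidth_hz = 4e9
print("Sample-rate increase factor:", new_bandwidth_hz / old_bandwidth_hz)

rng = np.random.default_rng(1)
stream = rng.standard_normal(1 << 16) + 1j * rng.standard_normal(1 << 16)
spectra = channelize(stream, n_channels=1024)
print("Spectra shape (time blocks x channels):", spectra.shape)
```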

    The project is being led by Professor Ray Norris from Western Sydney University, working closely with CSIRO and seven other university partners.

    Professor Norris said the upgrade will enable Australian researchers to address major challenges in our understanding of the Universe, and make more ground-breaking discoveries, across broad areas of astrophysics.

    “The upgrade will enable the telescope to study radio counterparts to gravitational wave sources, and it will enable it to make detailed observations of initial discoveries made with the Australian Square Kilometre Array Pathfinder and other Australian telescopes,” Professor Norris said.

    CSIRO is a leader in radio astronomy technology development, working in close partnership with astronomers who use its telescopes as well as international observatory customers.

    “We’ve been developing specialised instrumentation for radio telescopes since the 1940s, when the field of radio astronomy first emerged, for our own and international telescopes,” Dr Bock said.

    “Through our close collaborations with research partners and our expertise in technology development, we’ll keep the telescopes at the cutting edge of science.”

    CSIRO owns and operates a wide range of science-ready national research facilities and infrastructure that is used by thousands of Australian and international researchers each year. The Parkes radio telescope and Australia Telescope Compact Array are part of the Australia Telescope National Facility, which is funded by the Australian Government.

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition


    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

     
  • richardmitnick 7:26 am on January 17, 2020

    From European Space Agency – United space in Europe: “XMM-Newton discovers scorching gas in Milky Way’s halo” 


    From European Space Agency – United space in Europe


    16/01/2020

    Sanskriti Das
    The Ohio State University, USA
    das.244@buckeyemail.osu.edu

    Smita Mathur
    The Ohio State University, USA
    smita@astronomy.ohio-state.edu

    Fabrizio Nicastro
    Osservatorio Astronomico di Roma—INAF, Italy
    Harvard-Smithsonian Center for Astrophysics, USA
    fabrizio.nicastro@inaf.it

    Norbert Schartel
    XMM-Newton project scientist
    European Space Agency
    norbert.schartel@esa.int


    ESA’s XMM-Newton has discovered that gas lurking within the Milky Way’s halo reaches far hotter temperatures than previously thought and has a different chemical make-up than predicted, challenging our understanding of our galactic home.

    ESA/XMM Newton

    A halo is a vast region of gas, stars and invisible dark matter surrounding a galaxy. It is a key component of a galaxy, connecting it to wider intergalactic space, and is thus thought to play an important role in galactic evolution.

    Until now, a galaxy’s halo was thought to contain hot gas at a single temperature, with the exact temperature of this gas dependent on the mass of the galaxy.

    However, a new study using ESA’s XMM-Newton X-ray space observatory now shows that the Milky Way’s halo contains not one but three different components of hot gas, with the hottest of these being a factor of ten hotter than previously thought. This is the first time multiple gas components structured in this way have been discovered in not only the Milky Way, but in any galaxy.

    “We thought that gas temperatures in galactic haloes ranged from around 10,000 to one million degrees – but it turns out that some of the gas in the Milky Way’s halo can hit a scorching 10 million degrees,” says Sanskriti Das, a graduate student at The Ohio State University, USA, and lead author of the new study.

    “While we think that gas gets heated to around one million degrees as a galaxy initially forms, we’re not sure how this component got so hot. It may be due to winds emanating from the disc of stars within the Milky Way.”

    The study used a combination of two instruments aboard XMM-Newton: the Reflection Grating Spectrometer (RGS) and European Photon Imaging Camera (EPIC). EPIC was used to study the light emitted by the halo, and RGS to study how the halo affects and absorbs light that passes through it.

    To probe the Milky Way’s halo in absorption, Sanskriti and colleagues observed an object known as a blazar: the very active, energetic core of a distant galaxy that is emitting intense beams of light.

    By-now-iconic artist’s impression of a blazar. NASA Fermi Gamma-ray Space Telescope. Credit: M. Weiss/CfA

    NASA/Fermi LAT


    NASA/Fermi Gamma Ray Space Telescope

    Having travelled almost five billion light-years across the cosmos, the X-ray light from this blazar also passed through our galaxy’s halo before reaching XMM-Newton’s detectors, and thus holds clues about the properties of this gaseous region.

    Caterpillar Project: a Milky-Way-size dark-matter halo and its subhalos (circled), from an enormous suite of simulations. Griffen et al. 2016

    Unlike previous X-ray studies of the Milky Way’s halo, which normally last a day or two, the team performed observations over a period of three weeks, enabling them to detect signals that are usually too faint to see.

    “We analysed the blazar’s light and zeroed in on its individual spectral signatures: the characteristics of the light that can tell us about the material it’s passed through on its way to us,” says co-author Smita Mathur, also of The Ohio State University, and Sanskriti’s advisor.

    “There are specific signatures that only exist at specific temperatures, so we were able to determine how hot the halo gas must have been to affect the blazar light as it did.”
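    Conceptually, that temperature determination works by comparing the observed depths of absorption features against the depths predicted for gas at different temperatures and picking the temperature that best reproduces them. The toy comparison below uses made-up line depths and made-up temperature-dependent predictions; the real analysis fits physical ionization models to the XMM-Newton spectra.

```python
import numpy as np

# Hypothetical observed fractional absorption depths of three X-ray lines.
observed_depths = np.array([0.030, 0.012, 0.004])

# Hypothetical predicted depths of the same lines for gas at three temperatures (kelvin).
# In reality these come from ionization/absorption models, not a small lookup table.
predictions = {
    1e6: np.array([0.035, 0.001, 0.000]),
    3e6: np.array([0.031, 0.010, 0.001]),
    1e7: np.array([0.028, 0.013, 0.005]),
}

# Pick the temperature whose predicted line depths best match the observations (least squares).
best_T = min(predictions, key=lambda T: np.sum((predictions[T] - observed_depths) ** 2))
print(f"Best-fitting toy temperature: {best_T:.0e} K")
```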

    The Milky Way’s hot halo is also significantly enhanced with elements heavier than helium, which are usually produced in the later stages of a star’s life. This indicates that the halo has received material created by certain stars during their lifetimes and final stages, and flung out into space as they die.

    Elements found in the Milky Way halo – artist’s impression

    “Until now, scientists have primarily looked for oxygen, as it’s abundant and thus easier to find than other elements,” explains Sanskriti.

    “Our study was more detailed: we looked at not only oxygen but also nitrogen, neon and iron, and found some hugely interesting results.”

    Scientists expect the halo to contain elements in similar ratios to those seen in the Sun. However, Das and colleagues noticed less iron in the halo than expected, indicating that the halo has been enriched by massive dying stars, and also less oxygen, likely due to this element being taken up by dusty particles in the halo.

    “This is really exciting – it was completely unexpected, and tells us that we have much to learn about how the Milky Way has evolved into the galaxy it is today,” adds Sanskriti.

    The cosmic budget of ‘ordinary’ matter

    While the mysterious dark matter and dark energy make up about 25 and 70 percent of our cosmos respectively, the ordinary matter that makes up everything we see – from stars and galaxies to planets and people – amounts to only about five percent.

    ______________________________________________________________________

    Fritz Zwicky discovered Dark Matter when observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel Prize, did most of the work on Dark Matter.

    Fritz Zwicky from http://palomarskies.blogspot.com

    Coma cluster via NASA/ESA Hubble

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science)


    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL)


    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970. https://home.dtm.ciw.edu

    The LSST, or Large Synoptic Survey Telescope, is to be named the Vera C. Rubin Observatory by an act of the U.S. Congress.

    LSST telescope, the Vera C. Rubin Observatory survey telescope, currently under construction on the El Peñón peak at Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    LSST Data Journey, Illustration by Sandbox Studio, Chicago with Ana Kova

    Dark Matter Research

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Scientists studying the cosmic microwave background [CMB] hope to learn about more than just how the universe grew—it could also offer insight into dark matter, dark energy and the mass of the neutrino.

    CMB per ESA/Planck

    Dark matter cosmic web and the large-scale structure it forms. The Millennium Simulation, V. Springel et al.

    Dark Matter Particle Explorer China

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB deep in Sudbury’s Creighton Mine

    LBNL LZ Dark Matter project at SURF, Lead, SD, USA


    Inside the ADMX experiment hall at the University of Washington Credit Mark Stone U. of Washington. Axion Dark Matter Experiment

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M. Blanco 4-m Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7,200 feet

    Timeline of the Inflationary Universe WMAP

    The Dark Energy Survey (DES) is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. DES began searching the Southern skies on August 31, 2013.

    According to Einstein’s theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up. To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called dark energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    DES is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the DES collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time is used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.

    ______________________________________________________________________

    However, stars in galaxies across the Universe only make up about seven percent of all ordinary matter. The cold interstellar gas that permeates galaxies – the raw material to create stars – amounts to about 1.8 percent of the total, while the hot, diffuse gas in the haloes that encompass galaxies makes up roughly five percent, and the even hotter gas that fills galaxy clusters – the largest cosmic structures held together by gravity – accounts for four percent.

    This is not surprising: stars, galaxies and galaxy clusters form in the densest knots of the cosmic web, the filamentary distribution of both dark and ordinary matter that extends throughout the Universe. While these sites are dense, they are also rare, so not the best spots to look for the majority of cosmic matter.

    Most of the Universe’s ordinary matter, or baryons, must be lurking in the ubiquitous filaments of this cosmic web, where matter is less dense and therefore more challenging to observe. Using different techniques over the years, astronomers were able to locate a good chunk of this intergalactic material – mainly its cool component (also known as the Lyman-alpha forest, which makes up about 28 percent of all baryons) and its warm component (about 15 percent).

    After two decades of observations, astronomers using ESA’s XMM-Newton space observatory have detected the hot component of this intergalactic material along the line of sight to a distant quasar. The amount of hot intergalactic gas detected in these observations accounts for up to 40 percent of all baryons in the Universe, closing the gap in the overall budget of ordinary matter in the cosmos.
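    Putting the percentages quoted in the last few paragraphs side by side shows how the budget roughly closes; the quick sum below simply restates the article's numbers (the remaining per-component uncertainties are substantial).

```python
# The component percentages quoted in the article (percent of all baryons).
baryon_budget = {
    "stars in galaxies": 7.0,
    "cold interstellar gas": 1.8,
    "hot gas in galaxy haloes": 5.0,
    "hot gas in galaxy clusters": 4.0,
    "cool intergalactic gas (Lyman-alpha forest)": 28.0,
    "warm intergalactic gas": 15.0,
    "hot intergalactic gas (this detection, up to)": 40.0,
}

total = sum(baryon_budget.values())
print(f"Sum of quoted components: {total:.1f}% of all baryons")
# ~100%, which is why the authors describe the detection as closing the gap
# in the overall budget of ordinary matter.
```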

    The newly discovered hot gas component also has wider implications that affect our overall understanding of the cosmos. Our galaxy contains far less mass than we expect: this is known as the ‘missing matter problem’, in that what we observe does not match up with theoretical predictions.

    From its long-term mapping of the cosmos, ESA’s Planck spacecraft predicted that just under 5% of the mass in the Universe should exist in the form of ‘normal’ matter – the kind making up stars, galaxies, planets, and so on.

    ESA/Planck 2009 to 2013

    “However, when we add up everything we see, our figure is nowhere near this prediction,” adds co-author Fabrizio Nicastro of Osservatorio Astronomico di Roma—INAF, Italy, and the Harvard-Smithsonian Center for Astrophysics, USA.

    “So where’s the rest? Some suggest that it may be hiding in the extended and massive halos surrounding galaxies, making our finding really exciting.”

    As this hot component of the Milky Way’s halo has never been seen before, it may have been overlooked in previous analyses – and may thus contain a large amount of this ‘missing’ matter.

    “These observations provide new insights into the thermal and chemical history of the Milky Way and its halo, and challenge our knowledge of how galaxies form and evolve,” concludes ESA XMM project scientist Norbert Schartel.

    “The study looked at the halo along one sightline – that towards the blazar – so it will be hugely exciting to see future research expand on this.”

    Science papers:
    https://iopscience.iop.org/article/10.3847/2041-8213/ab3b09 , by S. Das, S. Mathur, F. Nicastro, and Y. Krongold

    https://iopscience.iop.org/article/10.3847/1538-4357/ab5846 , by S. Das, S. Mathur, A. Gupta, F. Nicastro, and Y. Krongold

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The European Space Agency (ESA), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 19 member states. Headquartered in Paris, ESA has a staff of more than 2,000. ESA’s space flight program includes human spaceflight, mainly through the participation in the International Space Station program, the launch and operations of unmanned exploration missions to other planets and the Moon, Earth observation, science, telecommunication as well as maintaining a major spaceport, the Guiana Space Centre at Kourou, French Guiana, and designing launch vehicles. ESA science missions are based at ESTEC in Noordwijk, Netherlands, Earth Observation missions at ESRIN in Frascati, Italy, ESA Mission Control (ESOC) is in Darmstadt, Germany, the European Astronaut Centre (EAC) that trains astronauts for future missions is situated in Cologne, Germany, and the European Space Astronomy Centre is located in Villanueva de la Cañada, Spain.


     
  • richardmitnick 1:08 am on January 17, 2020
    Tags: https://pubs.geoscienceworld.org/, San Diego CA

    From temblor: “Past meets present to help future seismic hazard forecasts in San Diego, CA” 


    From temblor

    January 13, 2020
    Alka Tripathy-Lang
    @DrAlkaTrip

    Urbanization obscures a complex fault zone on which downtown San Diego sits, but decades-old geotechnical studies reveal the faults.

    Urbanization in downtown San Diego. Credit: Tony Webster CC-BY-2.0.

    Fault studies often rely on surface expressions of the ground’s movement. In densely populated urban areas, such as San Diego, this evidence is concealed beneath the cityscape. Now, though, a team has used historical reports to trace faults through downtown San Diego in unprecedented detail, establishing a template that other fault-prone cities can follow to illuminate otherwise hidden hazards.

    Urbanization obscures geology

    Downtown San Diego, popular for its beaches and parks, also hosts the active Rose Canyon Fault Zone, a complex hazard that underlies the city from northwest of La Jolla through downtown, before curving into San Diego Bay.

    Rose Canyon Fault. https://www.nbcsandiego.com/

    Like the nearby San Andreas, the Rose Canyon Fault is right-lateral, meaning if you were to stand on one side, the opposite side would appear to move to your right. But it plods along at a rate of 1-2 millimeters per year, unlike its speedy neighbor — a slip rate that indicates a comparatively lower seismic risk.

    “We haven’t had a major rupture” on the Rose Canyon Fault since people have been living atop it, says Jillian Maloney, a geophysicist at San Diego State University and co-author of the new study. So it’s hard to say what kind of damage would be caused, she says. “But, a magnitude-6.9 [of which this fault is capable] is big.”

    Because of urbanization, though, “there haven’t been any comprehensive geologic investigations” of the faults underlying downtown San Diego, Maloney says. This presents a problem because detailed knowledge of active and inactive fault locations, especially in a complicated area where the fault zone bends, is key for successful seismic hazard assessments, she says. The state and federal government maintain fault maps and databases, but their accuracy at the small scale was unknown.

    Map of the Rose Canyon Fault near San Diego, California, USA. USGS

    Faded pages

    A solution to the lack of detailed fault mapping in downtown San Diego resided in decades of old geotechnical reports. These individual studies, each covering an area the size of a city block or smaller, are required by the city for any proposed development near active faults, as mapped by the state. Although the data are public once the reports are filed with the city, the reports had not been integrated into a comprehensive or digital resource, and the city does not maintain a list of such reports.

    This bird’s-eye view of downtown San Diego was drawn by Eli Glover in 1876. Prior to the development of downtown San Diego, the Rose Canyon Fault Zone was expressed on the surface and could be seen laterally offsetting topographic features. Credit: Library of Congress, Geography and Map Division.

    According to Luke Weidman, lead author of this study, which was his master’s project, the first challenge was determining how many reports were even available. Weidman, currently a geologist at geotech firm Geocon, went straight to the source: He asked several of San Diego’s large geotechnical firms for their old publicly available reports in exchange for digitizing them. 


    Weidman scrutinized more than 400 reports he received, dating from 1979 to 2016. Many were uninterpretable because of faded or illegible pages. He assembled the 268 most legible ones into a fault map and database of downtown San Diego. Because reports lacked geographical coordinates, Weidman resorted to property boundaries, building locations, park benches and even trees to locate the reports on a modern map, says Maloney, one of his master’s advisors. Weidman, Maloney and geologist Tom Rockwell, also of San Diego State, published the findings from their comprehensive interactive digital map last month in Geosphere, along with an analysis of the Rose Canyon Fault Zone in downtown San Diego.
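    The paper's product is essentially a georeferenced database of block-scale fault observations. A much-simplified sketch of what one digitized record and a query over it might look like is below; the field names and coordinates are hypothetical illustrations, not the schema published in Geosphere.

```python
from dataclasses import dataclass

@dataclass
class GeotechReport:
    """One digitized geotechnical report (hypothetical fields for illustration)."""
    report_id: str
    year: int
    lat: float          # georeferenced from property boundaries, buildings, etc.
    lon: float
    finding: str        # "active fault", "potential fault hazard", or "no hazard"

reports = [
    GeotechReport("R-0001", 1983, 32.715, -117.163, "no hazard"),
    GeotechReport("R-0142", 2004, 32.711, -117.160, "active fault"),
    GeotechReport("R-0267", 2015, 32.708, -117.158, "potential fault hazard"),
]

# A hazard analyst might pull every block-scale study that documented an active trace.
active = [r for r in reports if r.finding == "active fault"]
print(f"{len(active)} of {len(reports)} digitized reports document an active fault trace")
```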

    Below from https://pubs.geoscienceworld.org/

    Map of the Rose Canyon fault zone (RCFZ) through San Diego (SD), California (USA) and across the San Diego Bay pull-apart basin. Black box shows the extent of Figure 3. Grid shows population count per grid cell (∼1 km2) (source: LandScan 2017, Oak Ridge National Laboratory, UT-Battelle, LLC, https://landscan.ornl.gov/). DF—Descanso fault; SBF—Spanish Bight fault; CF—Coronado fault; SSF—Silver Strand fault; LNFZ—La Nacion fault zone.

    Street map of greater downtown San Diego region showing Alquist-Priolo (AP) zones and faults from the U.S. Geological Survey (USGS) fault database (USGS-CGS, 2006). Black box shows the extent of Figures 6, 7, and 8. Background imagery: ESRI, HERE (https://www.here.com/strategic-partners/esri), Garmin, OpenStreetMap contributors, and the GIS community.

    Fault findings

    The team found that downtown San Diego’s active faults—defined in their paper as having ruptured within the past 11,500 years—largely track the state’s active fault maps. However, at the scale of the one-block investigations, they found several faults mapped in the wrong location, and cases of no fault where one was expected. Further, the team uncovered three active faults that were not included in the state or federal maps. At the scale at which geotechnical firms, government, owners and developers need to know active fault locations, the use of this type of data is important, says Diane Murbach, an engineering geologist at Murbach Geotech who was not involved in this study.

    This map of downtown San Diego, Calif., shows fault locations as mapped by the U.S. Geological Survey (USGS), and faults as located by the individual geotechnical reports compiled in the new study. Green, light orange, dark orange and red boxes indicate whether individual geotechnical studies found no hazard (green), active faults (red) or potential fault hazards (dark or light orange). Note that the Rose Canyon Fault Zone as mapped by USGS occasionally intersects green boxes, indicating the fault may be mislocated. Where the fault is active, mismatches exist as well. Note the arrow pointing to the ‘USGS-Geotech fault difference,’ highlighting a significant discrepancy in where the fault was previously mapped, versus where it lies. Credit: Weidman et al., [2019].

    Maloney says they also found other faults that haven’t ruptured in the last 11,500 years. This is important, she says, because “you could have a scenario where an active zone ruptures and propagates to [one] that was previously considered inactive.”

    This research “is the first of its kind that I know of that takes all these different reports from different scales with no set format, and fits them into one [usable] database,” says Nicolas Barth, a geologist at the University of California, Riverside who was not part of this study. Many cities have been built on active faults, obscuring hints of past seismicity, he notes. “This is a nice template for others to use,” he says, “not just in California, but globally.”

    References
    Weidman, L., Maloney, J.M., and Rockwell, T.K. (2019). Geotechnical data synthesis for GIS-based analysis of fault zone geometry and hazard in an urban environment. Geosphere, v.15, 1999-2017. doi:10.1130/GES02098.1

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Earthquake Alert

    Earthquake Network project

    Earthquake Network is a research project which aims at developing and maintaining a crowdsourced smartphone-based earthquake warning system at a global level. Smartphones made available by the population are used to detect the earthquake waves using the on-board accelerometers. When an earthquake is detected, an earthquake warning is issued in order to alert the population not yet reached by the damaging waves of the earthquake.

    The project started on January 1, 2013 with the release of the homonymous Android application Earthquake Network. The author of the research project and developer of the smartphone application is Francesco Finazzi of the University of Bergamo, Italy.

    Get the app in the Google Play store.

    Smartphone network spatial distribution (green and red dots) on December 4, 2015

    Meet The Quake-Catcher Network


    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative for developing the world’s largest, low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide better understanding of earthquakes, give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford, and a year at CalTech, the QCN project is moving to the University of Southern California Dept. of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer hosted computers into a real-time motion sensing network. QCN is one of many scientific computing projects that runs on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).

    The volunteer computers monitor vibrational sensors called MEMS accelerometers, and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. QCN’s servers sift through these signals, and determine which ones represent earthquakes, and which ones represent cultural noise (like doors slamming, or trucks driving by).
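    A common way to separate real earthquakes from cultural noise in a network like this is coincidence detection: a slammed door shakes one computer, whereas an earthquake triggers many sensors in the same area within a short time window. The stripped-down sketch below illustrates that logic with illustrative thresholds; it is not QCN's actual server code.

```python
from dataclasses import dataclass

@dataclass
class Trigger:
    sensor_id: str
    time_s: float       # trigger time in seconds
    lat: float
    lon: float

def looks_like_earthquake(triggers, window_s=10.0, min_sensors=5):
    """Very rough coincidence test: several distinct sensors triggering within one time window."""
    triggers = sorted(triggers, key=lambda t: t.time_s)
    for i, first in enumerate(triggers):
        sensors = {t.sensor_id for t in triggers[i:] if t.time_s - first.time_s <= window_s}
        if len(sensors) >= min_sensors:
            return True
    return False

# A door slam produces one trigger; a quake produces a burst of them.
door_slam = [Trigger("A", 100.0, 34.0, -118.2)]
quake = [Trigger(s, 200.0 + 0.5 * i, 34.0, -118.2) for i, s in enumerate("ABCDEF")]
print(looks_like_earthquake(door_slam))   # False
print(looks_like_earthquake(quake))       # True
```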

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors. 1) By mounting them to the floor, they measure more reliable shaking than mobile devices. 2) These sensors typically have lower noise and better resolution of 3D motion. 3) Desktops are often left on and do not move. 4) The USB sensor is physically removed from the game, phone, or laptop, so human interaction with the device doesn’t reduce the sensors’ performance. 5) USB sensors can be aligned to North, so we know what direction the horizontal “X” and “Y” axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC, the Berkeley Open Infrastructure for Network Computing, was developed at UC Berkeley and is a leader in the fields of distributed computing, grid computing and citizen cyberscience.

    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. The Quake-Catcher Network links existing networked laptops and desktops in hopes of forming the world’s largest strong-motion seismic network.

    Below, the QCN Quake Catcher Network map

    ShakeAlert: An Earthquake Early Warning System for the West Coast of the United States

    The U. S. Geological Survey (USGS), along with a coalition of State and university partners, is developing and testing an earthquake early warning (EEW) system called ShakeAlert for the west coast of the United States. Long-term funding must be secured before the system can begin sending general public notifications; however, some limited pilot projects are active and more are being developed. The USGS has set the goal of beginning limited public notifications in 2018.

    Watch a video describing how ShakeAlert works in English or Spanish.

    The primary project partners include:

    United States Geological Survey
    California Governor’s Office of Emergency Services (CalOES)
    California Geological Survey
    California Institute of Technology
    University of California Berkeley
    University of Washington
    University of Oregon
    Gordon and Betty Moore Foundation

    The Earthquake Threat

    Earthquakes pose a national challenge because more than 143 million Americans live in areas of significant seismic risk across 39 states. Most of our Nation’s earthquake risk is concentrated on the West Coast of the United States. The Federal Emergency Management Agency (FEMA) has estimated the average annualized loss from earthquakes, nationwide, to be $5.3 billion, with 77 percent of that figure ($4.1 billion) coming from California, Washington, and Oregon, and 66 percent ($3.5 billion) from California alone. In the next 30 years, California has a 99.7 percent chance of a magnitude 6.7 or larger earthquake and the Pacific Northwest has a 10 percent chance of a magnitude 8 to 9 megathrust earthquake on the Cascadia subduction zone.

    Part of the Solution

    Today, the technology exists to detect earthquakes so quickly that an alert can reach some areas before strong shaking arrives. The purpose of the ShakeAlert system is to identify and characterize an earthquake a few seconds after it begins, calculate the likely intensity of ground shaking that will result, and deliver warnings to people and infrastructure in harm’s way. This can be done by detecting the first energy to radiate from an earthquake, the P-wave energy, which rarely causes damage. Using P-wave information, we first estimate the location and the magnitude of the earthquake. Then, the anticipated ground shaking across the region to be affected is estimated and a warning is provided to local populations. The method can provide warning before the S-wave arrives, bringing the strong shaking that usually causes most of the damage.
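    The achievable warning time is basically the gap between the S-wave and P-wave travel times to a given site, minus the time needed to detect the P-wave and issue the alert. The quick estimate below uses typical crustal wave speeds and an assumed processing delay; the exact values vary with local geology, and ShakeAlert's real calculations are far more detailed. It shows why warnings range from essentially nothing near the epicenter to tens of seconds farther away.

```python
# Typical crustal seismic wave speeds (illustrative round numbers).
v_p = 6.0   # P-wave speed, km/s
v_s = 3.5   # S-wave speed, km/s
processing_delay = 5.0   # seconds to detect the P-wave and issue the alert (assumed)

for distance_km in (20, 60, 150):
    warning = distance_km / v_s - distance_km / v_p - processing_delay
    print(f"{distance_km:4d} km from the epicenter: ~{max(warning, 0):.0f} s of warning")
```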

    Studies of earthquake early warning methods in California have shown that the warning time would range from a few seconds to a few tens of seconds. ShakeAlert can give enough time to slow trains and taxiing planes, to prevent cars from entering bridges and tunnels, to move away from dangerous machines or chemicals in work environments and to take cover under a desk, or to automatically shut down and isolate industrial systems. Taking such actions before shaking starts can reduce damage and casualties during an earthquake. It can also prevent cascading failures in the aftermath of an event. For example, isolating utilities before shaking starts can reduce the number of fire initiations.

    System Goal

    The USGS will issue public warnings of potentially damaging earthquakes and provide warning parameter data to government agencies and private users on a region-by-region basis, as soon as the ShakeAlert system, its products, and its parametric data meet minimum quality and reliability standards in those geographic regions. The USGS has set the goal of beginning limited public notifications in 2018. Product availability will expand geographically via ANSS regional seismic networks, such that ShakeAlert products and warnings become available for all regions with dense seismic instrumentation.

    Current Status

    The West Coast ShakeAlert system is being developed by expanding and upgrading the infrastructure of regional seismic networks that are part of the Advanced National Seismic System (ANSS): the California Integrated Seismic Network (CISN), which is made up of the Southern California Seismic Network (SCSN) and the Northern California Seismic System (NCSS), and the Pacific Northwest Seismic Network (PNSN). This enables the USGS and ANSS to leverage their substantial investment in sensor networks, data telemetry systems, data processing centers, and software for earthquake monitoring activities residing in these network centers. The ShakeAlert system has been sending live alerts to “beta” users in California since January of 2012 and in the Pacific Northwest since February of 2015.

    In February of 2016 the USGS, along with its partners, rolled-out the next-generation ShakeAlert early warning test system in California joined by Oregon and Washington in April 2017. This West Coast-wide “production prototype” has been designed for redundant, reliable operations. The system includes geographically distributed servers, and allows for automatic fail-over if connection is lost.

    This next-generation system will not yet support public warnings but does allow selected early adopters to develop and deploy pilot implementations that take protective actions triggered by the ShakeAlert notifications in areas with sufficient sensor coverage.

    Authorities

    The USGS will develop and operate the ShakeAlert system, and issue public notifications under collaborative authorities with FEMA, as part of the National Earthquake Hazard Reduction Program, as enacted by the Earthquake Hazards Reduction Act of 1977, 42 U.S.C. §§ 7704 SEC. 2.

    For More Information

    Robert de Groot, ShakeAlert National Coordinator for Communication, Education, and Outreach
    rdegroot@usgs.gov
    626-583-7225

    Learn more about EEW Research

    ShakeAlert Fact Sheet

    ShakeAlert Implementation Plan

     
  • richardmitnick 7:42 pm on January 16, 2020
    Tags: "Study finds billions of quantum entangled electrons in 'strange metal'", Quantum entanglement is the basis for storage and processing of quantum information, Terahertz spectroscopy, With strange metals there is an unusual connection between electrical resistance and temperature

    From Rice University: “Study finds billions of quantum entangled electrons in ‘strange metal’”


    From Rice University

    January 16, 2020
    Jade Boyd

    Physicists provide direct evidence of entanglement’s role in quantum criticality.

    In a new study, U.S. and Austrian physicists have observed quantum entanglement among “billions of billions” of flowing electrons in a quantum critical material.

    Junichiro Kono (left) and Qimiao Si in Kono’s Rice University laboratory in December 2019. (Photo by Jeff Fitlow/Rice University)

    The research, which appears this week in Science, examined the electronic and magnetic behavior of a “strange metal” compound of ytterbium, rhodium and silicon as it both neared and passed through a critical transition at the boundary between two well-studied quantum phases.

    The study at Rice University and Vienna University of Technology (TU Wien) provides the strongest direct evidence to date of entanglement’s role in bringing about quantum criticality, said study co-author Qimiao Si of Rice.

    “When we think about quantum entanglement, we think about small things,” Si said. “We don’t associate it with macroscopic objects. But at a quantum critical point, things are so collective that we have this chance to see the effects of entanglement, even in a metallic film that contains billions of billions of quantum mechanical objects.”

    Si, a theoretical physicist and director of the Rice Center for Quantum Materials (RCQM), has spent more than two decades studying what happens when materials like strange metals and high-temperature superconductors change quantum phases. Better understanding such materials could open the door to new technologies in computing, communications and more.

    The international team overcame several challenges to get the result. TU Wien researchers developed a highly complex materials synthesis technique to produce ultrapure films containing one part ytterbium for every two parts rhodium and silicon (YbRh2Si2). At absolute zero temperature, the material undergoes a transition from one quantum phase that forms a magnetic order to another that does not.

    Physicist Silke Bühler-Paschen of the Vienna University of Technology (Photo by Luisa Puiu/TU Wien)

    At Rice, study co-lead author Xinwei Li, then a graduate student in the lab of co-author and RCQM member Junichiro Kono, performed terahertz spectroscopy experiments on the films at temperatures as low as 1.4 Kelvin. The terahertz measurements revealed the optical conductivity of the YbRh2Si2 films as they were cooled to a quantum critical point that marked the transition from one quantum phase to another.

    “With strange metals, there is an unusual connection between electrical resistance and temperature,” said corresponding author Silke Bühler-Paschen of TU Wien’s Institute for Solid State Physics. “In contrast to simple metals such as copper or gold, this does not seem to be due to the thermal movement of the atoms, but to quantum fluctuations at the absolute zero temperature.”

    To measure optical conductivity, Li shined coherent electromagnetic radiation in the terahertz frequency range on top of the films and analyzed the amount of terahertz rays that passed through as a function of frequency and temperature. The experiments revealed “frequency over temperature scaling,” a telltale sign of quantum criticality, the authors said.
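    The "frequency over temperature scaling" test asks whether the measured optical conductivity at different temperatures collapses onto a single curve when plotted against ħω/k_BT rather than against frequency alone. The toy demonstration below uses a synthetic conductivity built to have that property; it is only a sketch of the collapse check, not the actual analysis of the Science data.

```python
import numpy as np

hbar = 1.054571817e-34   # J s
k_B = 1.380649e-23       # J/K

def toy_conductivity(freq_hz, temp_k):
    """Synthetic conductivity obeying sigma ~ (1/T) * f(hbar*omega / (k_B*T)), for illustration."""
    x = hbar * 2 * np.pi * freq_hz / (k_B * temp_k)
    return (1.0 / temp_k) / (1.0 + x**2)

x_grid = np.linspace(0.5, 5.0, 5)                        # values of hbar*omega/(k_B*T)
for temp in (1.4, 3.0, 6.0):                             # kelvin, as in the low-temperature runs
    freqs = x_grid * k_B * temp / (hbar * 2 * np.pi)     # frequencies giving those x values at this T
    collapsed = toy_conductivity(freqs, temp) * temp     # T * sigma, viewed as a function of x
    print(f"T = {temp} K:", np.round(collapsed, 3))
# Identical rows: T*sigma is a universal function of hbar*omega/(k_B*T),
# the kind of collapse that signals quantum criticality in the real measurements.
```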

    Kono, an engineer and physicist in Rice’s Brown School of Engineering, said the measurements were painstaking for Li, who’s now a postdoctoral researcher at the California Institute of Technology. For example, only a fraction of the terahertz radiation shined onto the sample passed through to the detector, and the important measurement was how much that fraction rose or fell at different temperatures.

    Former Rice University graduate student Xinwei Li in 2016 with the terahertz spectrometer he later used to measure entanglement in the conduction electrons flowing through a “strange metal” compound of ytterbium, rhodium and silicon. (Photo by Jeff Fitlow/Rice University)

    “Less than 0.1% of the total terahertz radiation was transmitted, and the signal, which was the variation of conductivity as a function of frequency, was a further few percent of that,” Kono said. “It took many hours to take reliable data at each temperature to average over many, many measurements, and it was necessary to take data at many, many temperatures to prove the existence of scaling.

    “Xinwei was very, very patient and persistent,” Kono said. “In addition, he carefully processed the huge amounts of data he collected to unfold the scaling law, which was really fascinating to me.”

    Making the films was even more challenging. To grow them thin enough to pass terahertz rays, the TU Wien team developed a unique molecular beam epitaxy system and an elaborate growth procedure. Ytterbium, rhodium and silicon were simultaneously evaporated from separate sources in the exact 1-2-2 ratio. Because of the high energy needed to evaporate rhodium and silicon, the system required a custom-made ultrahigh vacuum chamber with two electron-beam evaporators.

    “Our wild card was finding the perfect substrate: germanium,” said TU Wien graduate student Lukas Prochaska, a study co-lead author. The germanium was transparent to terahertz, and had “certain atomic distances (that were) practically identical to those between the ytterbium atoms in YbRh2Si2, which explains the excellent quality of the films,” he said.

    Si recalled discussing the experiment with Bühler-Paschen more than 15 years ago when they were exploring the means to test a new class of quantum critical point. The hallmark of the quantum critical point that they were advancing with co-workers is that the quantum entanglement between spins and charges is critical.

    Former Rice University graduate student Xinwei Li (left) and Professor Junichiro Kono in 2016 with the terahertz spectrometer Li used to measure quantum entanglement in YbRh2Si2. (Photo by Jeff Fitlow/Rice University)

    “At a magnetic quantum critical point, conventional wisdom dictates that only the spin sector will be critical,” he said. “But if the charge and spin sectors are quantum-entangled, the charge sector will end up being critical as well.”

    At the time, the technology was not available to test the hypothesis, but by 2016, the situation had changed. TU Wien could grow the films, Rice had recently installed a powerful microscope that could scan them for defects, and Kono had the terahertz spectrometer to measure optical conductivity. During Bühler-Paschen’s sabbatical visit to Rice that year, she, Si, Kono and Rice microscopy expert Emilie Ringe received support to pursue the project via an Interdisciplinary Excellence Award from Rice’s newly established Creative Ventures program.

    “Conceptually, it was really a dream experiment,” Si said. “Probe the charge sector at the magnetic quantum critical point to see whether it’s critical, whether it has dynamical scaling. If you don’t see anything that’s collective, that’s scaling, the critical point has to belong to some textbook type of description. But, if you see something singular, which in fact we did, then it is very direct and new evidence for the quantum entanglement nature of quantum criticality.”

    Si said all the efforts that went into the study were well worth it, because the findings have far-reaching implications.

    “Quantum entanglement is the basis for storage and processing of quantum information,” Si said. “At the same time, quantum criticality is believed to drive high-temperature superconductivity. So our findings suggest that the same underlying physics — quantum criticality — can lead to a platform for both quantum information and high-temperature superconductivity. When one contemplates that possibility, one cannot help but marvel at the wonder of nature.”

    Si is the Harry C. and Olga K. Wiess Professor in Rice’s Department of Physics and Astronomy. Kono is a professor in Rice’s departments of Electrical and Computer Engineering, Physics and Astronomy, and Materials Science and NanoEngineering and the director of Rice’s Applied Physics Graduate Program. Ringe is now at the University of Cambridge.

    Additional co-authors include Maxwell Andrews, Maximilian Bonta, Werner Schrenk, Andreas Limbeck and Gottfried Strasser, all of the TU Wien; Hermann Detz, formerly of TU Wien and currently at Brno University; Elisabeth Bianco, formerly of Rice and currently at Cornell University; Sadegh Yazdi, formerly of Rice and currently at the University of Colorado Boulder; and co-lead author Donald MacFarland, formerly of TU Wien and currently at the University at Buffalo.

    The research was supported by the European Research Council, the Army Research Office, the Austrian Science Fund, the European Union’s Horizon 2020 program, the National Science Foundation, the Robert A. Welch Foundation, Los Alamos National Laboratory and Rice University.

    RCQM leverages global partnerships and the strengths of more than 20 Rice research groups to address questions related to quantum materials. RCQM is supported by Rice’s offices of the provost and the vice provost for research, the Wiess School of Natural Sciences, the Brown School of Engineering, the Smalley-Curl Institute and the departments of Physics and Astronomy, Electrical and Computer Engineering, and Materials Science and NanoEngineering.

    See the full article here .




    Stem Education Coalition

    Rice U campus

    In his 1912 inaugural address, Rice University president Edgar Odell Lovett set forth an ambitious vision for a great research university in Houston, Texas; one dedicated to excellence across the range of human endeavor. With this bold beginning in mind, and with Rice’s centennial approaching, it is time to ask again what we aspire to in a dynamic and shrinking world in which education and the production of knowledge will play an even greater role. What shall our vision be for Rice as we prepare for its second century, and how ought we to advance over the next decade?

    This was the fundamental question posed in the Call to Conversation, a document released to the Rice community in summer 2005. The Call to Conversation asked us to reexamine many aspects of our enterprise, from our fundamental mission and aspirations to the manner in which we define and achieve excellence. It identified the pressures of a constantly changing and increasingly competitive landscape; it asked us to assess honestly Rice’s comparative strengths and weaknesses; and it called on us to define strategic priorities for the future, an effort that will be a focus of the next phase of this process.

     
  • richardmitnick 2:24 pm on January 16, 2020 Permalink | Reply
    Tags: , , , , , , , The LSST Vera C. Rubin Observatory,   

    From The Kavli Foundation: “Behold the Whole Sky” The LSST Vera C. Rubin Observatory 

    KavliFoundation

    From The Kavli Foundation

    01/02/2020
    Adam Hadhazy

    Fritz Zwicky discovered Dark Matter when observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel, did most of the work on Dark Matter.

    Fritz Zwicky from http://palomarskies.blogspot.com

    Coma cluster via NASA/ESA Hubble

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science)

    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL)

    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970.

    The LSST Vera C. Rubin Observatory

    LSST Camera, built at SLAC



    LSST telescope, Vera C. Rubin Observatory, currently under construction on the El Peñón peak of Cerro Pachón, a 2,682-meter-high mountain in the Coquimbo Region of northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.


    LSST Data Journey, Illustration by Sandbox Studio, Chicago with Ana Kova

    When construction is complete, the LSST, Vera C. Rubin Observatory, will be “the widest, fastest, deepest eye of the new digital age.”

    There’s about to be a new telescope in town—in the figurative sense, that is, unless you happen to literally live more than a mile-and-a-half up on the summit of a mountain named Cerro Pachón in the foothills of the Chilean Andes.

    There, construction is humming along for the Large Synoptic Survey Telescope, or LSST. Slated to start science operations early next decade, LSST in all likelihood will be a gamechanger for astronomy and astrophysics.

    What makes LSST so special is how big and fast it will be compared to other telescopes. “Big” in this case refers to the telescope’s field of view, which captures a chunk of sky 40 times the size of the full Moon. “Big” also refers to LSST’s mirror size, a very respectable 8.4 meters in diameter, which means it can collect ample amounts of cosmic light. Thirdly, “big” applies to LSST’s 3.2 billion-pixel camera, the biggest digital camera ever built. Put all those bits together, and LSST will be able to record images of significantly fainter and farther-away objects than other ground-based optical telescopes.

    And finally, as for “fast,” LSST will soak up more than 800 panoramas each night, cumulatively scanning the entire sky twice per week. That means the telescope will catch sight of fleeting astrophysical events, known as transients, that are often missed because telescopes—even today’s state-of-the-art, automated networks of ‘scopes—are not gobbling up so much of the sky so quickly. Transients that last days, weeks, and months—for instance, cataclysmic stellar explosions called supernovae—are routinely spotted. But the shortest events, lasting mere hours or even minutes, are another, untold story.
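
    As a rough sanity check on the "twice per week" figure, the short calculation below combines the article's 800 panoramas per night with commonly quoted LSST numbers: a field of view of about 9.6 square degrees and a main-survey footprint of roughly 18,000 square degrees. Those two values are assumptions added here for illustration, not figures from this article.

    import math

    # Back-of-envelope check of LSST's sky-coverage rate. The field-of-view area and
    # survey footprint are assumed values commonly quoted for LSST, not from the article.
    field_of_view_deg2 = 9.6                  # ~3.5-degree-wide field (assumption)
    exposures_per_night = 800                 # panoramas per night, per the article
    survey_footprint_deg2 = 18_000            # approximate main-survey area (assumption)
    whole_sky_deg2 = 4 * math.pi * (180 / math.pi) ** 2   # ~41,253 square degrees

    area_per_night = field_of_view_deg2 * exposures_per_night   # ~7,700 deg^2 per night
    nights_per_pass = survey_footprint_deg2 / area_per_night

    print(f"~{area_per_night:,.0f} square degrees imaged per night")
    print(f"~{nights_per_pass:.1f} nights to sweep the survey footprint once")
    # Roughly 2-3 nights per pass is consistent with covering the sky about twice a week.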

    “Unfortunately, we still know relatively little about the transient optical sky because we have never before had a survey that can make observations of a very large fraction of the sky repeatedly every few nights,” says Steven Kahn, Director of the LSST project. “LSST will meet this need.”

    Kahn, the Cassius Lamb Kirk Professor in the Natural Sciences and Professor of Particle Physics and Astrophysics at Stanford University, is also a member of the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC). He stepped into the director role back in 2013 when LSST was on the drawing board. Now the huge instrument is nearing the completion of its construction. Kahn and his colleagues are dearly looking forward to all that LSST will bring to the table, building on the pioneering work into gauging the transient sky underway with other, precursor projects worldwide.

    “LSST will go significantly deeper and cover the sky more rapidly,” says Kahn. “By covering more sky per unit time, we are more sensitive to very rare events, which are often the most interesting.”

    In this way, LSST is going to open up a major discovery space, for phenomena both (poorly) known and (entirely) unknown.

    “The Universe is far from static,” says Kahn. “There are stellar explosions of many different kinds that allow stars to brighten dramatically and then fade away on different timescales.” Some of these transient flashes of light are expected from the vicinities of neutron stars and black holes as they interact with matter that strays too close. Researchers hope to gain new insights into these dense objects’ properties, whose extreme physics challenge our best-supported theories.

    Another primary goal for LSST is to advance our understanding of the “dark universe” of dark matter and dark energy. Together, these entities compose 95 percent of the cosmos, with the “normal” matter that makes up stars, planets, and people registering as the remaining rounding error. Yet scientists have only stabs in the dark, as it were, on what exactly dark matter and dark energy really are. LSST will help by acquiring images of billions of galaxies, stretching back to some of the earliest epochs in the universe. Analyzing the shapes and distributions of these galaxies in space as well as time (recall that the farther away you see something in the universe, the farther you’re seeing back in time) will better show dark matter’s role in building up cosmic structure. The signature of dark energy, a force that is seemingly accelerating the universe’s expansion, will also be writ across the observed eons of galactic loci.

    Closer to home, LSST will vastly expand our knowledge of our own Solar System. It will take a census of small bodies, such as asteroids and comets, that fly by overhead, too faint for us humans to notice but there all the same—and in rare instances, potentially dangerously so; just ask the dinosaurs.

    “LSST will measure everything that moves in the sky,” says Kahn. “Of particular interest, we will provide the most complete catalogue of potentially hazardous asteroids, those objects whose orbits might allow them to impact the Earth.”

    Not done yet, LSST will also extend our catalogue of stars, aiding in charting the history and evolution of our own Milky Way galaxy. Furthermore, LSST will be a premier instrument for discovering the sources of gravitational waves, the ripples in spacetime first predicted by Albert Einstein in 1916 and finally directly detected in 2015 by the LIGO experiment. It can be a tough business today, even with the rich array of telescopes in operation, to rapidly pinpoint the visible light that gravitational wave-spawning neutron star collisions give off. LSST should aid in that regard admirably.

    The wait is nearly over. The LSST building is nearly complete, the large mirrors are on site, and the camera is being integrated at the SLAC National Accelerator Laboratory in California, which co-hosts KIPAC along with Stanford.

    “Basically, everything that needed to be fabricated for the LSST telescope and camera has been fabricated,” says Kahn. “The remaining work largely involves putting the system together and getting it working.”

    Kahn has been to the telescope site recently, in both September and October. He likes what he sees.

    “Visiting the site in Chile is a remarkable experience,” Kahn says. “It is a beautiful site, and the LSST facility sits prominently atop the edge of a cliff on Cerro Pachón. The sheer size of the building and its complexity is striking.”

    Before long, the impressiveness of the building will recede into the background as the profundity of the science LSST generates takes center stage.

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Kavli Foundation, based in Oxnard, California, is dedicated to the goals of advancing science for the benefit of humanity and promoting increased public understanding and support for scientists and their work.

    The Foundation’s mission is implemented through an international program of research institutes, professorships, and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics as well as prizes in the fields of astrophysics, nanoscience, and neuroscience.

     
  • richardmitnick 1:55 pm on January 16, 2020 Permalink | Reply
    Tags: "What companies can learn from a Cambridge physics laboratory", , Cavendish physics laboratory at Cambridge university, Ernest Rutherford, Financial Times   

    From Financial Times: “What companies can learn from a Cambridge physics laboratory” 

    From Financial Times

    Ernest Rutherford’s pioneering science lab offers great lessons in modern teamwork.

    1
    Ernest Rutherford, centre, with colleagues at the Cavendish laboratory in Cambridge © Hulton-Deutsch Collection/Corbis/Getty

    1.15.20
    David Bodanis

    It’s hard to guide teams. Too hands-off and the result is chaos; too hands-on and no one has any space for initiative. But how to get it right? I have found the history of science an excellent way to begin.

    Back in the early 20th century, when Ernest Rutherford ran the Cavendish physics laboratory at Cambridge university, he created teams that surpassed almost anything else in the world.

    The Cavendish was at that time a small operation, but in a single 20-year period, researchers under his direction won at least eight Nobel Prizes in physics: more than all of France, all of Italy, and all of Japan combined. This was unprecedented.

    How did he do it? Many labs had bright students and experienced professors. The twist was a particular style of guidance Rutherford brought in.

    Directives at the right level

    If Rutherford had given everyone at his lab very abstract guidelines, such as “be the best”, that wouldn’t have helped. Who doesn’t want to be the best? Like vapid mission statements, it doesn’t tell you anything about how to get to the desired goal.

    If, on the other hand, he had specified exactly what experiments people were to do — if he had told new hires to “take this specific electrical apparatus and do precisely this operation with it, and don’t argue about it” — he would have crushed their initiative. There would be no space for creativity. They would be mere technicians.

    Instead, Rutherford created a guide that was in-between: not so abstract as to be useless, but not so detailed as to be unduly limiting.

    Naming the guidance: mid-level abstraction (MLA)

    In the case of the Cavendish lab, Rutherford’s guide came out as: “See what’s inside the atom.” Newcomers knew what to do. Established researchers, support staff and funders did too.

    Hans Geiger, for example, created his eponymous counter following this guidance, with little supervision. Out of the range of all possible actions, he knew which subset to aim for: anything that helped the teams hear what was happening inside an atom — which led to his counter with its famous “clicks”.

    Rutherford’s work might be considered just a curio from the history of science. But successful businesses end up working in accord with skilfully cast MLAs almost all the time.

    Consider Pixar. Its founders wouldn’t have achieved much if their only guidance had been: “Let’s revolutionise Hollywood.” That’s too abstract. If on the other hand they had said, “let’s make movies about living toys, with a Tom Hanks-voiced cowboy as hero”, they might have succeeded with one series, but that would have been it.

    2
    Pixar’s Toy Story. The company’s mid-level guide was: ‘Tell good stories using new animation tools’ © Pixar Animation Studios/Walt Disney Pictures/Snap/Shutterstock

    The Pixar founders, however, knew that a fresh field was opening up with advances in computer graphics and saw a fulfilling and potentially lucrative way to use it. Everyone at Pixar in its early days, even those with no experience, instantly picked up on their distinctive mid-level guide: “Tell good stories using new animation tools.”

    MLA sounds easy, but many companies get it wrong. Their guides are too amorphous, or they pull in too many directions, or they fall woefully out of date (as with Yahoo). Individual charisma isn’t enough, but has to be channelled.

    The inner structure of MLAs

    To create guidance tools pitched at the right level — getting teams to pull together, yet also keeping individual initiative — the first step is to understand how the best MLAs work.

    First of all they are built around verbs. In the Pixar example, the company would be telling stories. At the Cavendish lab, the focus was on seeing what’s inside an atom.

    Then, along with having such verbs, good MLAs need to exclude things. Woolly compromise is out. (It’s a basic point from information theory: if anything goes, there is no information, and no useful advice.) It’s the same reason a mentor who constantly says “Yes, go for it!” is useless. The mentee is not being given any guidance.

    Pixar’s guide did this very well. Conventional films were out, hand-drawn animation was out, purely technical non-story-driven exercises were out, and so on.

    Rutherford’s MLA at the Cavendish was similarly bold at excluding things. If researchers had been left to their own devices they might have turned to any topic of interest at the time: matters such as the improvement of undersea telegraph cables, or the precise measurement of electrical circuitry. But then they would all be pulling in different directions. There would be little of the knowledge spillovers or agglomeration effects which we know are so important in successful cities and teams today.

    Nor would they have gained from the wisdom which the astute Rutherford had accumulated in his career to that point.

    Sometimes game-changing MLAs arrive fully formed — think of Jeff Bezos’s early efforts to “sell all books online fast”. No one had to force teams to think about one-click purchasing: it was a natural consequence, like Geiger’s work.

    More often there is a series of iterations before getting it right, as with Google’s “monetise search through targeted ads”. It is also often wise to adjust for different levels within a company: corporate, departmental and so on.

    There’s still no magic, and one needs skill at execution. The England rugby coach Eddie Jones could be brought in to run a secondary school rugby team, and it would still have zero chance in the World Cup. But given a minimum competency, astute MLAs do wonders.

    Updating the guidance

    In time, though, even the best MLAs go out of date. It happened at the Cavendish, which faded in Rutherford’s later years. Only a new series of leaders, shifting focus to “investigate crystallised proteins with X-rays”, reinvigorated Cambridge science: jump-starting the DNA revolution, and triggering yet another cascade of Nobels.

    Companies need to refresh their MLAs as well: not too often — that’s just nerves — but usually within a decade. It’s hard, but not impossible. All it takes is the aptly chosen verb and confidence in deciding what to exclude.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Financial Times (FT) is an English-language international daily newspaper owned by Japanese company Nikkei, Inc., headquartered in London, with a special emphasis on business and economic news.

    The paper was founded in 1888 by James Sheridan and Horatio Bottomley, and merged in 1945 with its closest rival, the Financial News (which had been founded in 1884).

    The Financial Times has a record paying readership of one million, three-quarters of which are digital subscriptions (as of April 2019).

    On 23 July 2015, Nikkei Inc. agreed to buy the Financial Times from Pearson for £844m ($1.32 billion) and the acquisition was completed on 30 November 2015.

     
  • richardmitnick 12:47 pm on January 16, 2020 Permalink | Reply
    Tags: "Behind howls of solar wind quiet chirps reveal its origins", , , , , , ,   

    From JHU HUB: “Behind howls of solar wind, quiet chirps reveal its origins” 

    From JHU HUB

    1.15.20
    Jeremy Rehm

    1
    Image credit: NASA/Naval Research Laboratory/Parker Solar Probe

    NASA Parker Solar Probe Plus named to honor Pioneering Physicist Eugene Parker

    Scientists have studied the solar wind (pictured) for more than 60 years, but they’re still puzzled over some of its behaviors. The small chirps, squeaks, and rustles recorded by the Parker Solar Probe hint at the origin of this mysterious and ever-present wind.

    There’s a wind that emanates from the sun, and it blows not like a soft whistle but like a hurricane’s scream.

    Made of electrons, protons, and heavier ions, the solar wind courses through the solar system at roughly 1 million miles per hour, barreling over everything in its path. Yet through the wind’s roar, NASA’s Parker Solar Probe can hear small chirps, squeaks, and rustles that hint at the origins of this mysterious and ever-present wind. Now, the team at the Johns Hopkins Applied Physics Laboratory, which designed, built, and manages the Parker Solar Probe for NASA, is getting its first chance to hear those sounds, too.

    “We are looking at the young solar wind being born around the sun,” says Nour Raouafi, mission project scientist for the Parker Solar Probe. “And it’s completely different from what we see here near Earth.”


    Sounds of the Solar Wind from NASA’s Parker Solar Probe

    Scientists have studied the solar wind for more than 60 years, but they’re still puzzled over many of its behaviors. For example, while they know it comes from the sun’s million-degree outer atmosphere called the corona, the solar wind doesn’t slow down as it leaves the sun—it speeds up, and it has a sort of internal heater that keeps it from cooling as it zips through space. With growing concern about the solar wind’s ability to interfere with GPS satellites and disrupt power grids on Earth, it’s imperative to better understand it.

    Just 17 months since the probe’s launch and after three orbits around the sun, Parker Solar Probe has not disappointed in its mission.

    “We expected to make big discoveries because we’re going into uncharted territory,” Raouafi says. “What we’re actually seeing is beyond anything anybody imagined.”

    Researchers suspected that plasma waves within the solar wind could be responsible for some of the wind’s odd characteristics. Just as fluctuations in air pressure cause winds that force rolling waves on the ocean, fluctuations in electric and magnetic fields can cause waves that roll through clouds of electrons, protons, and other charged particles that make up the plasma racing away from the sun. Particles can ride these plasma waves much like the way a surfer rides an ocean wave, propelling them to higher speeds.

    “Plasma waves certainly play a part in heating and accelerating the particles,” Raouafi says. Scientists just don’t know how much of a part. That’s where Parker Solar Probe comes in.

    The spacecraft’s FIELDS instrument can eavesdrop on the electric and magnetic fluctuations caused by plasma waves. It can also “hear” when the waves and particles interact with one another, recording frequency and amplitude information about these plasma waves that scientists can then play as sound waves. And it results in some striking sounds.

    2
    Parker Solar Probe Diagram instrument FIELDS. NASA

    Take, for example, whistler-mode waves. These are caused by energetic electrons bursting out of the sun’s corona. These electrons follow magnetic field lines that stretch away from the sun out into the solar system’s farthest edge, spinning around them like they’re riding a carousel. When a plasma wave’s frequency matches how frequently those electrons spin, the wave and the electrons amplify one another. And it sounds like a scene out of Star Wars.
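
    For a sense of the frequencies involved, the sketch below computes the electron cyclotron (gyration) frequency, f = eB / (2π m_e), for two assumed magnetic field strengths; whistler-mode waves occur below this frequency, which is also why the recorded signals land in the audible range. The field values are illustrative assumptions, not Parker Solar Probe measurements.

    import math

    # Electron cyclotron (gyration) frequency: f_ce = e * B / (2 * pi * m_e).
    # Whistler-mode waves resonate with electrons gyrating around field lines,
    # so f_ce sets the relevant frequency scale. B values below are assumptions.

    e_charge = 1.602e-19     # electron charge, coulombs
    m_e = 9.109e-31          # electron mass, kg

    for label, B_tesla in [("typical solar wind near Earth (~5 nT)", 5e-9),
                           ("assumed field near Parker's perihelion (~100 nT)", 100e-9)]:
        f_ce = e_charge * B_tesla / (2 * math.pi * m_e)
        print(f"{label}: f_ce ~ {f_ce:,.0f} Hz")
    # Output lands in the hundreds to thousands of hertz, i.e. audible frequencies.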

    “Some theories suggest that part of the solar wind’s acceleration is due to these escaping electrons,” says David Malaspina, a member of the FIELDS team and an assistant professor at the University of Colorado, Boulder, and the Laboratory for Atmospheric and Space Physics. He adds that the electrons could also be a critical clue to understanding one process that heats the solar wind.

    “We can use observations of these waves to work our way backward and probe the source of these electrons in the corona,” Malaspina says.

    Another example is dispersive waves, which quickly shift from one frequency to another as they move through the solar wind. These shifts create a sort of “chirp” that sounds like wind rushing over a microphone. They’re rare near the Earth, so researchers believed they were unimportant. But closer to the sun, scientists discovered, these waves are everywhere.

    “These waves haven’t been detected in the solar wind before, at least not in any large numbers,” Malaspina explains. “Nobody knows what causes these chirping waves or what they do to heat and accelerate the solar wind. That’s what we’re going to be determining. I think it’s incredibly exciting.”

    Raouafi commented that seeing all of this wave activity very close to the sun is why this mission is so critical. “We are seeing new, early behaviors of solar plasma we couldn’t observe here at Earth, and we’re seeing that the energy carried by the waves is being dissipated somewhere along the way, to heat and accelerate the plasma.”

    But it wasn’t just plasma waves that Parker Solar Probe heard. While barreling through a cloud of microscopic dust, the spacecraft’s instruments also captured a sound resembling old TV static. That static-like sound is actually hundreds of microscopic impacts happening every day: dust from asteroids torn apart by the sun’s gravity and heat, and particles stripped away from comets, strike the spacecraft at speeds close to a quarter of a million miles per hour. As Parker Solar Probe cruises through this dust cloud, the spacecraft doesn’t just crash into these particles—it obliterates them. Each grain’s atoms burst apart into electrons, protons, and other ions in a mini puff of plasma that the FIELDS instrument can “hear.”
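
    A back-of-envelope estimate shows why even microscopic grains vaporize into plasma on impact. The grain size and density below are assumptions chosen only to illustrate the energies involved; only the impact speed comes from the article.

    import math

    # Kinetic energy of a single microscopic dust grain at Parker Solar Probe speeds.
    # Grain diameter and density are assumptions; the speed is the article's figure.
    grain_diameter_m = 2e-6               # ~2 micrometre grain (assumption)
    grain_density_kg_m3 = 2500.0          # rocky dust (assumption)
    speed_m_s = 250_000 * 0.44704         # quarter of a million mph ~ 112 km/s

    volume = (4.0 / 3.0) * math.pi * (grain_diameter_m / 2) ** 3
    mass = grain_density_kg_m3 * volume                   # ~1e-14 kg
    energy_j = 0.5 * mass * speed_m_s ** 2

    tnt_j_per_kg = 4.2e6                  # energy density of TNT, for comparison
    print(f"grain mass ~ {mass:.1e} kg, impact energy ~ {energy_j:.1e} J")
    print(f"energy per kilogram ~ {0.5 * speed_m_s**2 / tnt_j_per_kg:,.0f} x TNT")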

    Each collision, however, also chips away a tiny bit of the spacecraft.

    “It was well understood that this would happen,” Malaspina says. “What was not understood was how much dust was going to be there.”

    APL engineers used models and remote observations to estimate how bad the dust situation might be well before the spacecraft launched. But in this uncharted territory, the number was bound to have some margin of error.

    James Kinnison, the Parker Solar Probe mission system engineer at APL, says this discrepancy in dust density is just one more reason why the probe’s proximity to the sun is so useful.

    “We protected almost everything from the dust,” Kinnison says. And although the dust is denser than expected, nothing right now points to dust impacts being a concern for the mission, he adds.

    Parker Solar Probe is scheduled to make another 21 orbits around the sun, using five flybys of Venus to propel itself increasingly closer to the star. Researchers will have the opportunity to better understand how these plasma waves change their behavior and to build a more complete evolutionary picture of the solar wind.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About the Hub
    We’ve been doing some thinking — quite a bit, actually — about all the things that go on at Johns Hopkins. Discovering the glue that holds the universe together, for example. Or unraveling the mysteries of Alzheimer’s disease. Or studying butterflies in flight to fine-tune the construction of aerial surveillance robots. Heady stuff, and a lot of it.

    In fact, Johns Hopkins does so much, in so many places, that it’s hard to wrap your brain around it all. It’s too big, too disparate, too far-flung.

    We created the Hub to be the news center for all this diverse, decentralized activity, a place where you can see what’s new, what’s important, what Johns Hopkins is up to that’s worth sharing. It’s where smart people (like you) can learn about all the smart stuff going on here.

    At the Hub, you might read about cutting-edge cancer research or deep-trench diving vehicles or bionic arms. About the psychology of hoarders or the delicate work of restoring ancient manuscripts or the mad motor-skills brilliance of a guy who can solve a Rubik’s Cube in under eight seconds.

    There’s no telling what you’ll find here because there’s no way of knowing what Johns Hopkins will do next. But when it happens, this is where you’ll find it.

    The Johns Hopkins University opened in 1876, with the inauguration of its first president, Daniel Coit Gilman. “What are we aiming at?” Gilman asked in his installation address. “The encouragement of research … and the advancement of individual scholars, who by their excellence will advance the sciences they pursue, and the society where they dwell.”

    The mission laid out by Gilman remains the university’s mission today, summed up in a simple but powerful restatement of Gilman’s own words: “Knowledge for the world.”

    What Gilman created was a research university, dedicated to advancing both students’ knowledge and the state of human knowledge through research and scholarship. Gilman believed that teaching and research are interdependent, that success in one depends on success in the other. A modern university, he believed, must do both well. The realization of Gilman’s philosophy at Johns Hopkins, and at other institutions that later attracted Johns Hopkins-trained scholars, revolutionized higher education in America, leading to the research university system as it exists today.

     
  • richardmitnick 11:20 am on January 16, 2020 Permalink | Reply
    Tags: "Why is Puerto Rico Being Struck by Earthquakes?", , , ,   

    From Discover Magazine: “Why is Puerto Rico Being Struck by Earthquakes?” 

    DiscoverMag

    From Discover Magazine

    January 7, 2020
    Erik Klemetti

    Multiple large earthquakes have hit Puerto Rico over the past week, all thanks to the geologically-active Caribbean Plate.

    The tectonic plates of the world were mapped in 1996, USGS.

    1
    Map of recent earthquakes from late December into early January 2020 near Puerto Rico. Credit: USGS.

    Since Monday, Puerto Rico has been struck by multiple magnitude 5 and 6 earthquakes. These earthquakes caused significant damage on an island still recovering from the devastation of Hurricane Maria in 2017.

    Most people don’t think of the Caribbean as an area rife with geologic activity, but earthquakes and eruptions are common. The major earthquakes in Puerto Rico and Haiti, as well as the eruptions on Montserrat, are all reminders that complex interactions between tectonic plates lie along the Caribbean Sea’s margins.

    The Caribbean plate lies beneath much of the sea of the same name (see below). It is bounded to the north and east by the North American plate, to the south by the South American plate and to the west by the Cocos plate. There isn’t much land mass above sea level on the plate beyond the islands that stretch from southern Cuba to the Lesser Antilles, along with parts of Central America like Costa Rica and Panama. A few small platelets have been identified along the margins of the plate as well.

    2
    Tectonic plates in the eastern Caribbean with historical earthquakes from 1900-2016 marked. Source: USGS.

    The northern edge of the plate is a transform boundary, where the two plates are sliding by each other. This causes stress that leads to earthquakes, much the same as the earthquakes generated along the San Andreas fault in California. This is why we’ve seen large earthquakes in places like Haiti, the Dominican Republic and now Puerto Rico.

    Head to the east and you reach the curving arc of islands that form the Lesser Antilles. Many of these islands are home to potentially active volcanoes, such as Soufrière Hills on Montserrat, Pelée on Martinique, La Soufrière on St. Vincent and more. Other islands are home to relict volcanoes as well. All these volcanoes have been formed by the North American plate sliding underneath the Caribbean, similar to the Cascade Range in the western United States and Canada.

    So, Puerto Rico doesn’t have active volcanoes, but it can experience large earthquakes. One of the most famous is the 1918 San Fermín earthquake, a magnitude 7.1. Unlike the current temblors, the San Fermín earthquake occurred north of the island under the sea, generating a tsunami. More than 100 people likely died in that event.

    The current spate of earthquakes struck near the southern coast of the island. Both of the largest earthquakes — Monday’s M5.8 and Tuesday’s M6.4 — occurred during the early morning hours, when most people are at home. This heightens the risk of injuries and fatalities if homes collapse, but luckily so far the number of deaths is low. However, there has been significant damage to homes and infrastructure already made precarious by the devastation of Hurricane Maria. This means longer-term hazards for the people of Puerto Rico.
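
    For scale, the seismic energy released grows roughly as 10^(1.5 × magnitude), so each whole unit of magnitude corresponds to about 32 times more energy. A quick calculation comparing the recent shocks (and the 1918 event mentioned above):

    # Approximate ratio of seismic energy released between two earthquake magnitudes,
    # using the standard relation E ~ 10**(1.5 * M).

    def energy_ratio(m_larger: float, m_smaller: float) -> float:
        return 10 ** (1.5 * (m_larger - m_smaller))

    print(f"M6.4 vs M5.8: ~{energy_ratio(6.4, 5.8):.0f}x more energy")          # ~8x
    print(f"M7.1 (1918) vs M6.4: ~{energy_ratio(7.1, 6.4):.0f}x more energy")   # ~11x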

    On top of this, the earthquakes have triggered landslides and rockfalls, increasing the threat to the island’s residents. The shaking also destroyed a picturesque natural bridge on the coast of the island. With dozens of aftershocks so far, it may be quite some time before people feel secure again.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 11:00 am on January 16, 2020 Permalink | Reply
    Tags: "How far is Betelgeuse?", , , , , ,   

    From ALMA via EarthSky: “How far is Betelgeuse?” 

    ESO/NRAO/NAOJ ALMA Array in Chile in the Atacama at Chajnantor plateau, at 5,000 metres

    From ALMA

    via

    1

    EarthSky

    January 16, 2020

    Recent speculation that Betelgeuse might be on the verge of going supernova prompted many to ask: how far away is it? But getting a distance measurement for this star has been no easy task.

    1
    An image of Betelgeuse taken at sub-millimeter wavelengths by the Atacama Large Millimeter/submillimeter Array (ALMA). It shows a section of hot gas slightly protruding from the red giant star’s extended atmosphere. Some of the data used to compute the latest parallax for Betelgeuse came from observations by ALMA. Image via ALMA.

    Betelgeuse, the bright red star in the constellation of Orion the Hunter, is in the end stage of its stellar life. Astronomers have long thought it will someday explode to become a supernova. In late 2019 and early 2020, Betelgeuse generated a lot of chatter on social media among astronomers. They wondered, somewhat jokingly, if an explosion were imminent because the star has dimmed, unprecedentedly, by a noticeable amount since late October 2019. As the news went mainstream, many people wondered how far Betelgeuse was from us and if an explosion could hurt life on Earth. The good news is that if Betelgeuse explodes, it is close enough to put on a spectacular light show, but far enough to not cause us on Earth any harm. To answer the distance question first, Betelgeuse is approximately 724 light-years away. But getting that answer, even for a relatively nearby star, is surprisingly difficult.

    It’s only in the last 30 years, with the use of new technologies, that astronomers have obtained more accurate measurements for the distance to Betelgeuse and other nearby stars. This advance began in 1989, when the European Space Agency (ESA) launched a space telescope called Hipparcos, named after the famous Greek astronomer Hipparchus.

    ESA/Hipparcos satellite

    Over several years of observations, the Hipparcos space telescope provided parallax and distance data for more than 100,000 relatively nearby stars.

    Those measurements became the basis for most of the estimated distances to stars that you see today.

    3
    When viewed from two locations, there is a slight shift in the position of a nearby star with respect to distant background stars. For observations on Earth, taken six months apart, the separation between those two locations is the diameter of Earth’s orbit. The angle alpha is the parallax angle. Image via P.wormer / Wikimedia Commons.

    The original Hipparcos data gave a parallax of 7.63 milliarcseconds for Betelgeuse; that’s about one-millionth the width of the full moon. Computations based on that parallax yielded a distance of about 430 light-years.

    However, Betelgeuse is what’s known as a variable star because its brightness fluctuates with time (that said, the recent excitement over Betelgeuse’s dimming is because it’s the biggest dip in brightness ever observed). And therein began the difficulty in estimating Betelgeuse’s distance.

    That’s because subsequent studies found an error in the methods used for reducing the Hipparcos data for variable stars. An effort to correct those errors gave a parallax of 5.07 milliarcseconds, changing Betelgeuse’s estimated distance from 430 light-years to about 643 light-years, plus or minus 46 light-years.

    But wait, there’s more. In 2017, astronomers published new calculations that further refined Betelgeuse’s parallax to 4.51 milliarcseconds. This new analysis of data from Hipparcos also included observations from several ground-based radio telescopes. That placed Betelgeuse at a distance of about 724 light-years, or, more accurately, between 613 and 881 light-years when data uncertainties are included.
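
    The conversion behind those numbers is straightforward: the distance in parsecs is the reciprocal of the parallax in arcseconds, and one parsec is about 3.26 light-years. The short sketch below reproduces the article's figures from the published parallaxes:

    # Distance from parallax: d [parsec] = 1 / p [arcsecond]; 1 parsec ~ 3.2616 light-years.

    PARSEC_TO_LIGHT_YEARS = 3.2616

    def parallax_mas_to_light_years(parallax_mas: float) -> float:
        parallax_arcsec = parallax_mas / 1000.0
        return (1.0 / parallax_arcsec) * PARSEC_TO_LIGHT_YEARS

    for label, p_mas in [("original Hipparcos", 7.63),
                         ("corrected Hipparcos", 5.07),
                         ("2017 radio + Hipparcos reanalysis", 4.51)]:
        print(f"{label}: {p_mas} mas -> ~{parallax_mas_to_light_years(p_mas):.0f} light-years")
    # Prints approximately 428, 643 and 723 light-years, in line with the distances quoted above.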

    You might know that the European Space Agency’s Gaia astrometry mission has the goal of making a three-dimensional map of our Milky Way galaxy.

    ESA/GAIA satellite

    At the time of its second data release in April 2018, ESA said Gaia’s data had already made possible:

    “… the richest star catalog to date, including high-precision measurements of nearly 1.7 billion stars….”

    Yet Betelgeuse is not one of those stars, and Gaia won’t be used to find a more precise distance for Betelgeuse. The reason is that Betelgeuse is too bright for the spacecraft’s sensors.

     