Recent Updates

  • richardmitnick 5:13 pm on April 22, 2021
    Tags: "ALMA Discovers Rotating Infant Galaxy with Help of Natural Cosmic Telescope", From ALMA(CL)

    From ALMA(CL) : “ALMA Discovers Rotating Infant Galaxy with Help of Natural Cosmic Telescope” 

    From ALMA(CL)

    22 April, 2021

    Nicolás Lira
    Education and Public Outreach Coordinator
    Joint ALMA Observatory, Santiago – Chile
    Phone: +56 2 2467 6519
    Cell phone: +56 9 9445 7726
    Email: nicolas.lira@alma.cl

    Masaaki Hiramatsu
    Education and Public Outreach Officer, NAOJ Chile Observatory, Tokyo – Japan
    Phone: +81 422 34 3630
    Email: hiramatsu.masaaki@nao.ac.jp

    Bárbara Ferreira
    ESO Public Information Officer
    Garching bei München, Germany
    Phone: +49 89 3200 6670
    Email: pio@eso.org

    Amy C. Oliver
    Public Information & News Manager
    National Radio Astronomy Observatory (NRAO), USA
    Phone: +1 434 242 9584
    Email: aoliver@nrao.edu

    All general references:
    ALMA Observatory (CL)
    European Southern Observatory(EU)
    National Astronomical Observatory of Japan(JP)
    National Radio Astronomy Observatory(US)

    Image of the galaxy cluster RXCJ0600-2007 taken by the NASA/ESA Hubble Space Telescope (US), combined with gravitational lensing images of the distant galaxy RXCJ0600-z6, 12.9 billion light-years away, observed by ALMA (shown in red).

    Due to the gravitational lensing effect by the galaxy cluster, the image of RXCJ0600-z6 was intensified and magnified, and appeared to be divided into three or more parts. Credit: ALMA (ESO/NAOJ/NRAO), Fujimoto et al., NASA/ESA Hubble Space Telescope.

    Reconstructed image of the distant galaxy RXCJ0600-z6 by compensating for the gravitational lensing effect caused by the galaxy cluster. The red contours show the distribution of radio waves emitted by carbon ions captured by ALMA, and the blue contours show the spread of light captured by the Hubble Space Telescope. The critical line, where the light intensity from gravitational lensing is at its maximum, runs along the left side of the galaxy, so this part of the galaxy was further magnified (inset image). Credit: ALMA (ESO/NAOJ/NRAO), Fujimoto et al., NASA/ESA Hubble Space Telescope.

    Using the Atacama Large Millimeter/submillimeter Array (ALMA), astronomers found a rotating baby galaxy 1/100th the size of the Milky Way at a time when the Universe was only seven percent of its present age. Assisted by the gravitational lens effect, the team was able to explore for the first time the nature of small and dark “normal galaxies” in the early Universe, representative of the main population of the first galaxies, which greatly advances our understanding of the initial phase of galaxy evolution.

    “Many of the galaxies that existed in the early Universe were so small that their brightness is well below the limit of the current largest telescopes on Earth and in space, making it difficult to study their properties and internal structure,” says Nicolas Laporte, a Kavli Senior Fellow at the University of Cambridge (UK). “However, the light coming from the galaxy RXCJ0600-z6 was highly magnified by gravitational lensing, making it an ideal target for studying the properties and structure of a typical baby galaxy.”

    Gravitational lensing is a natural phenomenon in which light emitted from a distant object is bent by the gravity of a massive body such as a galaxy or a galaxy cluster located in the foreground. The name “gravitational lensing” is derived from the fact that the gravity of the massive object acts like a lens. When we look through a gravitational lens, the light of distant objects is magnified and their shapes are stretched. In other words, it is a “natural telescope” floating in space.

    The ALMA Lensing Cluster Survey (ALCS) team used ALMA to search for a large number of galaxies in the early Universe that are magnified by gravitational lensing. By combining the power of ALMA with that of these natural telescopes, the researchers are able to uncover and study fainter galaxies.

    Why is it crucial to explore the faintest galaxies in the early Universe? Theory and simulations predict that the majority of galaxies formed a few hundred million years after the Big Bang are small, and thus faint. Although several galaxies in the early Universe have been observed before, those studies were limited to the most massive, and therefore less representative, objects because of the capabilities of existing telescopes. The only way to understand the standard formation of the first galaxies, and obtain a complete picture of galaxy formation, is to focus on the fainter and more numerous galaxies.

    The ALCS team performed a large-scale observation program that took 95 hours, which is a very long time for ALMA observations, to observe the central regions of 33 galaxy clusters that could cause gravitational lensing. One of these clusters, called RXCJ0600-2007, is located in the direction of the constellation of Lepus, and has a mass 1000 trillion times that of the Sun. The team discovered a single distant galaxy that is being affected by the gravitational lens created by this natural telescope. ALMA detected the light from carbon ions and stardust in the galaxy and determined that the galaxy is seen as it was about 900 million years after the Big Bang (12.9 billion years ago) [1]. Further analysis of the ALMA and Gemini data suggested that a part of this source is seen 160 times brighter than it is intrinsically.

    By precisely measuring the mass distribution of the cluster of galaxies, it is possible to “undo” the gravitational lensing effect and restore the original appearance of the magnified object. By combining data from the Hubble Space Telescope and the European Southern Observatory’s Very Large Telescope with a theoretical model, the team succeeded in reconstructing the actual shape of the distant galaxy RXCJ0600-z6.

    The total mass of this galaxy is about 2 to 3 billion times that of the Sun, about 1/100th that of our own Milky Way Galaxy.

    What astonished the team is that RXCJ0600-z6 is rotating. Traditionally, gas in young galaxies was thought to have random, chaotic motion. Only recently has ALMA discovered several rotating young galaxies that challenge the traditional theoretical framework [2], but these were several orders of magnitude brighter (larger) than RXCJ0600-z6.

    “Our study demonstrates, for the first time, that we can directly measure the internal motion of such faint (less massive) galaxies in the early Universe and compare it with the theoretical predictions”, says Kotaro Kohno, a professor at the University of Tokyo[東京大学](JP) and the leader of the ALCS team.

    “The fact that RXCJ0600-z6 has a very high magnification factor also raises expectations for future research,” explains Seiji Fujimoto, a DAWN fellow at the Niels Bohr Institute [Niels Bohr Institutet] (DK). “This galaxy has been selected, among hundreds, to be observed by the James Webb Space Telescope (JWST), the next generation space telescope to be launched this autumn.

    Through joint observations using ALMA and JWST, we will unveil the properties of gas and stars in a baby galaxy and its internal motions. When the Thirty Meter Telescope and the Extremely Large Telescope are completed, they may be able to detect clusters of stars in the galaxy, and possibly even resolve individual stars.

    Gravitational lensing has already been used to observe a single star 9.5 billion light-years away, and this research has the potential to extend that approach to less than a billion years after the birth of the Universe.”

    Notes

    [1] The light emitted from carbon ions was originally infrared light with a wavelength of 156 micrometers, but as the Universe expanded, the wavelength was stretched into radio waves with a wavelength of 1.1 millimeters, which were detected with ALMA. The redshift of this object is z = 6.07. Using the cosmological parameters measured with Planck (H0 = 67.3 km/s/Mpc, Ωm = 0.315, ΩΛ = 0.685: Planck 2013 Results), we can calculate the distance to the object to be 12.9 billion light-years. (Please refer to “Expressing the distance to remote objects” for the details.)
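As a rough illustration (not part of the original release), the sketch below uses astropy to reproduce the two numbers quoted above under a flat Lambda-CDM model with the stated Planck parameters: the observed wavelength follows from stretching the 156-micrometer line by a factor of (1 + z), and the light-travel time at z = 6.07 comes out near 12.9 billion years.

```python
# Illustrative check of the numbers in note [1]; assumes astropy is installed.
from astropy.cosmology import FlatLambdaCDM
import astropy.units as u

z = 6.07                                 # redshift of RXCJ0600-z6
rest_wavelength = 156 * u.micron         # rest-frame carbon-ion line from the note

# Cosmological expansion stretches wavelengths by a factor (1 + z)
observed_wavelength = rest_wavelength * (1 + z)
print(observed_wavelength.to(u.mm))      # ~1.10 mm, within ALMA's observing bands

# Flat Lambda-CDM with the quoted parameters (Omega_Lambda follows as 1 - Om0)
cosmo = FlatLambdaCDM(H0=67.3, Om0=0.315)
print(cosmo.lookback_time(z))            # light-travel time, ~12.9 Gyr
print(cosmo.age(z))                      # age of the Universe at z = 6.07, ~0.9 Gyr
```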

    [2] Using gravitational lensing, ALMA discovered a rotating galaxy similar in size to the Milky Way as it was about 12.4 billion years ago. (Please refer to the news article “ALMA sees most distant Milky Way look-alike” issued on August 13, 2020.) ALMA has also discovered a rotating galaxy from 12.4 billion years ago without using gravitational lensing. (Please refer to the news article “ALMA Discovers Massive Rotating Disk in Early Universe.”)

    Additional Information

    These observational results were presented in two papers published on April 22, 2021, one in The Astrophysical Journal and one in MNRAS.

    This research was supported by the Japan Society for the Promotion of Science KAKENHI (Grant Numbers JP17H06130, JP18K03693, 17H01114, 19H00697, and 20H00180), the NAOJ ALMA Joint Scientific Research Program (2017-06B), the European Research Council (ERC) Consolidator Grant funding scheme (project ConTExt, grant No. 648179; 681627-BUILDUP), the ERC under the European Union’s Horizon 2020 research and innovation program (grant agreement No. 669253), Independent Research Fund Denmark grant DFF-7014-00017, the Danish National Research Foundation (No. 140), the Kavli Foundation, ANID grants CATA-Basal AFB-170002, FONDECYT Regular (1190818 and 1200495), Millennium Science Initiative ICN12 009, STFC (ST/T000244/1), NSFC grant 11933011, the Swedish Research Council, and the Knut and Alice Wallenberg Foundation. This work was partially supported by the joint research program of the Institute for Cosmic Ray Research (ICRR), University of Tokyo.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Atacama Large Millimeter/submillimeter Array (ALMA)(CL) , an international astronomy facility, is a partnership of Europe, North America and East Asia in cooperation with the Republic of Chile. ALMA is funded in Europe by the European Organization for Astronomical Research in the Southern Hemisphere (ESO), in North America by the U.S. National Science Foundation (NSF) in cooperation with the National Research Council of Canada (NRC) and the National Science Council of Taiwan (NSC) and in East Asia by the National Institutes of Natural Sciences (NINS) of Japan in cooperation with the Academia Sinica (AS) in Taiwan.

    ALMA construction and operations are led on behalf of Europe by European Southern Observatory(EU), on behalf of North America by the National Radio Astronomy Observatory (NRAO), which is managed by Associated Universities, Inc. (US) and on behalf of East Asia by the National Astronomical Observatory of Japan (NAOJ). The Joint ALMA Observatory (JAO) provides the unified leadership and management of the construction, commissioning and operation of ALMA.

    ALMA is a time machine!

    ALMA-In Search of our Cosmic Origins

     
  • richardmitnick 3:56 pm on April 22, 2021
    Tags: "Fast radio bursts could help solve the mystery of the universe’s expansion"

    From Science News : “Fast radio bursts could help solve the mystery of the universe’s expansion” 

    From Science News

    This work recognizes research by:

    1. The Oskar Klein Centre for Cosmoparticle Physics, Department of Physics, Stockholm University [Stockholms universitet] (SE)
    2. Ruhr-University Bochum [Ruhr-Universität Bochum] (DE), Faculty of Physics and Astronomy, Astronomical Institute (AIRUB), German Centre for Cosmological Lensing
    3. Department of Physics, Technion – Israel Institute of Technology [הטכניון – מכון טכנולוגי לישראל] (IL)


    Short-lived bursts of radio waves from deep space, possibly from eruptions on magnetic stars (one illustrated), are now being used to measure the expansion of the universe. Credit: European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU)

    Astronomers have been arguing about the rate of the universe’s expansion for nearly a century. A new independent method to measure that rate could help cast the deciding vote.

    For the first time, astronomers calculated the Hubble constant — the rate at which the universe is expanding — from observations of cosmic flashes called fast radio bursts, or FRBs. While the results are preliminary and the uncertainties are large, the technique could mature into a powerful tool for nailing down the elusive Hubble constant, researchers report April 12 in MNRAS.

    Ultimately, if the uncertainties in the new method can be reduced, it could help settle the long-standing debate that holds our understanding of the universe’s physics in the balance (SN: 7/30/19).

    “I see great promises in this measurement in the future, especially with the growing number of detected repeated FRBs,” says Stanford University (US) astronomer Simon Birrer, who was not involved with the new work.

    Astronomers typically measure the Hubble constant in two ways. One uses the cosmic microwave background [CMB], the light released shortly after the Big Bang, in the distant universe.

    The other uses supernovas and other stars in the nearby universe. These approaches currently disagree by a few percent. The new value from FRBs comes in at an expansion rate of about 62.3 kilometers per second for every megaparsec (about 3.3 million light-years). While lower than the values from the other methods, it’s tentatively closer to the value from the CMB.

    “Our data agrees a little bit more with the CMB side of things compared to the supernova side, but the error bar is really big, so you can’t really say anything,” says Steffen Hagstotz, an astronomer at Stockholm University. Nonetheless, he says, “I think fast radio bursts have the potential to be as accurate as the other methods.”

    No one knows exactly what causes FRBs, though eruptions from highly magnetic neutron stars are one possible explanation (SN: 6/4/20). During the few milliseconds when FRBs blast out radio waves, their extreme brightness makes them visible across large cosmic distances, giving astronomers a way to probe the space between galaxies (SN: 5/27/20).

    As an FRB signal travels through the dust and gas separating galaxies, it becomes dispersed in a predictable way that causes some frequencies to arrive slightly later than others. The farther away the FRB, the more dispersed the signal. Using measurements of this dispersion, Hagstotz and colleagues estimated the distances to nine FRBs. Comparing those distances to the speeds at which the FRBs’ host galaxies are receding from Earth, the team calculated the Hubble constant.
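As a toy sketch of that logic (an illustration only, not the team's actual analysis), one can predict the mean intergalactic dispersion measure for an assumed cosmology using the so-called Macquart relation, note that it scales linearly with the Hubble constant, and rescale a fiducial value by the ratio of observed to predicted dispersion. Every number below — the redshifts, dispersion measures, and baryon/IGM parameters — is an illustrative assumption.

```python
# Toy Hubble-constant estimate from FRB dispersion measures (illustrative only).
# Assumes the host-galaxy and Milky Way contributions have already been subtracted.
import numpy as np
from scipy.integrate import quad

c, G, m_p = 2.998e8, 6.674e-11, 1.673e-27   # SI constants
Mpc = 3.086e22                              # meters per megaparsec
PC_CM3 = 3.086e22                           # 1 pc cm^-3 expressed in electrons per m^2

Om, Ob, f_igm, chi_e = 0.315, 0.049, 0.84, 0.875   # assumed cosmology and IGM values

def dm_cosmic(z, H0_kms_Mpc):
    """Mean cosmological dispersion measure (pc cm^-3) out to redshift z."""
    H0 = H0_kms_Mpc * 1e3 / Mpc                           # convert to s^-1
    E = lambda zp: np.sqrt(Om * (1 + zp)**3 + (1 - Om))   # flat Lambda-CDM
    integral, _ = quad(lambda zp: (1 + zp) / E(zp), 0.0, z)
    prefactor = 3 * c * Ob * H0 * f_igm * chi_e / (8 * np.pi * G * m_p)  # m^-2
    return prefactor * integral / PC_CM3

# Toy data: host redshifts and extragalactic dispersion measures (pc cm^-3)
z_obs  = np.array([0.12, 0.30, 0.48])
dm_obs = np.array([110.0, 280.0, 420.0])

# The predicted DM is proportional to H0, so the observed/predicted ratio rescales it
H0_fiducial = 70.0
ratio = dm_obs / np.array([dm_cosmic(z, H0_fiducial) for z in z_obs])
print(f"Estimated H0 ~ {H0_fiducial * ratio.mean():.1f} km/s/Mpc")
```

In the real analysis, the scatter about this mean relation and the poorly known dispersion inside each burst's host galaxy dominate the error budget, which is exactly the limitation described in the next paragraph.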

    The largest error in the new method comes from not knowing precisely how the FRB signal disperses as it exits its home galaxy before entering intergalactic space, where the gas and dust content is better understood. With a few hundred FRBs, the team estimates that it could reduce the uncertainties and match the accuracy of other methods such as supernovas.

    “It’s a first measurement, so not too surprising that the current results are not as constraining as other more matured probes,” says Birrer.

    New FRB data might be coming soon. Many new radio observatories are coming online, and larger surveys, such as ones proposed for the Square Kilometre Array, could discover tens to thousands of FRBs every night. Hagstotz expects there will be sufficient FRBs with distance estimates in the next year or two to accurately determine the Hubble constant. Such FRB data could also help astronomers understand what’s causing the bright outbursts.

    “I am very excited about the new possibilities that we will have soon,” Hagstotz says. “It’s really just beginning.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 3:13 pm on April 22, 2021
    Tags: "WHOI and ADI Launch Ocean and Climate Innovation Accelerator"

    From Woods Hole Oceanographic Institution: “WHOI and ADI Launch Ocean and Climate Innovation Accelerator” 

    From Woods Hole Oceanographic Institution

    April 20, 2021

    Today, WHOI and Analog Devices, Inc. (ADI) launched an Ocean and Climate Innovation Accelerator (OCIA) consortium, focused on the critical role of oceans in combatting climate change, and developing new solutions at the intersection of oceans and climate.

    First-of-its-kind consortium focused on the critical role of oceans in combatting climate change.

    Woods Hole Oceanographic Institution (WHOI) and Analog Devices, Inc. (Nasdaq: ADI) today launched the Ocean and Climate Innovation Accelerator (OCIA) consortium. ADI has committed $3 million over three years towards the consortium which will focus on advancing knowledge of the ocean’s critical role in combatting climate change as well as developing new solutions at the intersection of oceans and climate.

    “Carbon emissions feature as a centerpiece in global efforts to mitigate climate change. Oceans are among our most important defense mechanisms against a warming planet – yet their ability to continue to play this critically important role is being threatened by the effects of climate change,” said Vincent Roche, CEO of Analog Devices. “Through the Ocean and Climate Innovation Accelerator, we are committed to engaging ADI’s engineers and technologies to advance knowledge of the oceans, in order to gain a better understanding of how oceans are impacted by climate change and to develop solutions to restore ocean health. By doing so, we hope to drive meaningful impact on the global fight against climate change.”

    The OCIA consortium is designed to be a highly scalable collaboration leveraging the unique resources and capabilities of its partner organizations. Among its goals, the consortium will focus on the development of the “networked ocean” – placing sensors across oceanographic environments that will continuously monitor critical metrics related to ocean conditions with the aim of informing business and policy decision makers, enabling evidence-based stewardship of ocean health and driving more accurate climate and weather predictions with real-time data.

    “On behalf of WHOI’s entire community of ocean scientists and engineers, we are incredibly excited about this collaboration,” said Dr. Peter de Menocal, president and director of WHOI. “The formation of the OCIA consortium comes at a time when support for science and ocean research is at a critical juncture. We are building a research innovation ecosystem that will drive new understanding to tackle global challenges facing society. It provides a new, scalable model showing how corporations can engage deeply on the climate crisis.”

    The consortium will be jointly led by WHOI, a world leader in oceanographic research, technology, and education dedicated to understanding the ocean for the benefit of humanity, and ADI, a world leader in the design, manufacturing, and marketing of a broad portfolio of high-performance semiconductor solutions used in virtually all types of electronic equipment. Designed to act as an engine for continuous innovation and powered by some of the world’s leading minds and businesses, the OCIA consortium is open to participation by a wide range of leading organizations across business, academia and non-profits that recognize the inextricable links between ocean and climate and wish to have a positive impact on the global climate crisis.

    The OCIA consortium will also establish a robust, multi-stage innovation ecosystem, building on WHOI’s existing strengths in education and research to drive solutions-thinking and allow scientists and engineers to focus on high-impact problems. This will include the launch of a new Climate Challenge Grant Program which will award seed-funding for smaller, competitively selected projects.

    Initially, the OCIA will provide two types of awards:

    Incubation Awards: comprised of seed-funding awarded to dynamic individuals and teams. Incubation Awards will support design, exploration, and early execution of new, cutting-edge scientific initiatives that foster new avenues of research and engineering and encourage and incentivize collaborative engagement.
    Acceleration Awards: awarded to successful recipients of prior support for novel ideas and technologies, as well as other more mature projects, for the purpose of expanding these programs, increasing public engagement, and positioning and preparing projects to achieve lasting impact and receive durable outside support.

    As the consortium grows over time, OCIA programs may expand to invest in people through the establishment of fellowships and other awards, along with a portfolio of other activities such as support for collaboration hubs to drive innovations in data processing, machine learning, and transdisciplinary science and engineering.

    “Now more than ever, it is essential for people to understand that the ocean and climate are not two separate systems, but rather part of a single system that spans our entire ocean planet and affects the lives of people everywhere, even if they live far from the coast,” said de Menocal. “Recognizing this, it is critical for organizations like ADI and WHOI to find common cause and work in shared-mission partnerships to help mitigate the rapidly advancing threats brought on by a warming planet.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Woods Hole Oceanographic Institution

    Vision & Mission

    The ocean is a defining feature of our planet and crucial to life on Earth, yet it remains one of the planet’s last unexplored frontiers. For this reason, WHOI scientists and engineers are committed to understanding all facets of the ocean as well as its complex connections with Earth’s atmosphere, land, ice, seafloor, and life—including humanity. This is essential not only to advance knowledge about our planet, but also to ensure society’s long-term welfare and to help guide human stewardship of the environment. WHOI researchers are also dedicated to training future generations of ocean science leaders, to providing unbiased information that informs public policy and decision-making, and to expanding public awareness about the importance of the global ocean and its resources.

    Mission Statement

    The Woods Hole Oceanographic Institution is dedicated to advancing knowledge of the ocean and its connection with the Earth system through a sustained commitment to excellence in science, engineering, and education, and to the application of this knowledge to problems facing society.

    The Institution is organized into six departments, the Cooperative Institute for Climate and Ocean Research, and a marine policy center. Its shore-based facilities are located in the village of Woods Hole, Massachusetts(US) and a mile and a half away on the Quissett Campus. The bulk of the Institution’s funding comes from grants and contracts from the National Science Foundation(US) and other government agencies, augmented by foundations and private donations.

    WHOI scientists, engineers, and students collaborate to develop theories, test ideas, build seagoing instruments, and collect data in diverse marine environments. Ships operated by WHOI carry research scientists throughout the world’s oceans. The WHOI fleet includes two large research vessels (R/V Atlantis and R/V Neil Armstrong); the coastal craft Tioga; small research craft such as the dive-operation work boat Echo; the deep-diving human-occupied submersible Alvin; the tethered, remotely operated vehicle Jason/Medea; and autonomous underwater vehicles such as the REMUS and SeaBED.

    WHOI offers graduate and post-doctoral studies in marine science. There are several fellowship and training programs, and graduate degrees are awarded through a joint program with the Massachusetts Institute of Technology(US). WHOI is accredited by the New England Association of Schools and Colleges. WHOI also offers public outreach programs and informal education through its Exhibit Center and summer tours. The Institution has a volunteer program and a membership program, WHOI Associate.

    On October 1, 2020, Peter B. de Menocal became the institution’s eleventh president and director.

    History

    In 1927, a National Academy of Sciences(US) committee concluded that it was time to “consider the share of the United States of America in a worldwide program of oceanographic research.” The committee’s recommendation for establishing a permanent independent research laboratory on the East Coast to “prosecute oceanography in all its branches” led to the founding in 1930 of the Woods Hole Oceanographic Institution(US).

    A $2.5 million grant from the Rockefeller Foundation supported the summer work of a dozen scientists, construction of a laboratory building and commissioning of a research vessel, the 142-foot (43 m) ketch R/V Atlantis, whose profile still forms the Institution’s logo.

    WHOI grew substantially to support significant defense-related research during World War II, and later began a steady growth in staff, research fleet, and scientific stature. From 1950 to 1956, the director was Dr. Edward “Iceberg” Smith, an Arctic explorer, oceanographer and retired Coast Guard rear admiral.

    In 1977 the institution appointed the influential oceanographer John Steele as director, and he served until his retirement in 1989.

    On 1 September 1985, a joint French-American expedition led by Jean-Louis Michel of IFREMER and Robert Ballard of the Woods Hole Oceanographic Institution identified the location of the wreck of the RMS Titanic, which sank off the coast of Newfoundland on 15 April 1912.

    On 3 April 2011, within a week of resuming the search operation for Air France Flight 447, a team led by WHOI, operating full-ocean-depth autonomous underwater vehicles (AUVs) owned by the Waitt Institute, discovered, by means of sidescan sonar, a large portion of the debris field from flight AF447.

    In March 2017 the institution adopted an open-access policy to make its research publicly accessible online.

    The Institution has maintained a long and controversial business collaboration with the treasure-hunting company Odyssey Marine. Likewise, WHOI has participated in locating the San José galleon in Colombia for the commercial exploitation of the shipwreck by the government of President Santos and a private company.

    In 2019, iDefense reported that China’s hackers had launched cyberattacks on dozens of academic institutions in an attempt to gain information on technology being developed for the United States Navy. Some of the targets included the Woods Hole Oceanographic Institution. The attacks have been underway since at least April 2017.

     
  • richardmitnick 2:11 pm on April 22, 2021
    Tags: "Volcanic Blasts Deep Under The Ocean Are Shockingly Powerful a New Study Reveals"

    From Science Alert (AU) : “Volcanic Blasts Deep Under The Ocean Are Shockingly Powerful a New Study Reveals” 


    From Science Alert (AU)

    22 APRIL 2021

    DAVID FERGUSON
    SAM PEGLER

    Credit: National Oceanic and Atmospheric Administration (US)/National Science Foundation (US)/Woods Hole Oceanographic Institution (US).

    The ocean floor is famously unexplored and is imaged in much less detail than the surfaces of Mars, the Moon and Venus.

    Draining the water from the oceans would reveal a vast and mostly unknown volcanic landscape. In fact, the majority of Earth’s volcanic activity occurs underwater and at depths of several kilometers in the deep ocean.

    But in contrast to terrestrial volcanoes, even detecting that an eruption has occurred on the seafloor is extremely challenging.

    Consequently, there remains much for scientists to learn about submarine volcanism and its role in the marine environment.

    Now our new study on deep-sea eruptions, published in Nature Communications, gives important insights.

    Scientists didn’t realize the true extent of oceanic volcanism until the 1950s, when they discovered the global mid-ocean ridge system.

    Mid-ocean ridge system. Credit: WHOI.

    This finding was pivotal to the theory of plate tectonics. The network of volcanic ridges runs more than 60,000 kilometers around the globe.

    Subsequent exploration led to the detection of “black smoker” vents, where mineral-rich “hydrothermal” fluids (heated water in Earth’s crust) are ejected into the deep ocean.

    Black smoker. Credit: Geology.

    Driven by heat from the underlying magma, these systems influence the chemistry of the entire oceans. The vents also host “extremophiles” – organisms that survive in extreme environments that were once thought to be unable to sustain life.

    But many questions remain. It has long been thought that deep-sea eruptions themselves are rather uninteresting compared to the variety of eruptive styles observed on land.

    Terrestrial volcanoes that produce similar types of magma to those on the seafloor, such as in Hawaii or Iceland, often produce spectacular explosive eruptions, dispersing volcanic ash (called tephra). This type of eruption was thought to be highly improbable in the deep ocean due to the pressure from the overlying water.

    But data collected via remotely operated submarine vehicles has shown that tephra deposits are surprisingly common on the seafloor. Some marine micro-organisms (foraminifera) even use this volcanic ash to construct their shells.

    These eruptions are probably driven by expanding bubbles of carbon dioxide. Steam, which is largely responsible for explosive eruptions on land, cannot form at high pressures.

    Scientists have also sporadically detected massive regions of hydrothermal fluid in the ocean above volcanic ridges. These enigmatic regions of heated, chemical-rich water are known as megaplumes.

    Their size is truly immense, with volumes that can exceed 100 cubic kilometers – equivalent to over 40 million Olympic swimming pools.
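A quick back-of-the-envelope check of that comparison, assuming the nominal 50 m × 25 m × 2 m Olympic pool:

```python
# 100 km^3 expressed in Olympic swimming pools (assumed pool: 50 m x 25 m x 2 m)
plume_volume_m3 = 100 * 1000.0**3        # 100 cubic kilometers in cubic meters
pool_volume_m3 = 50 * 25 * 2             # 2,500 cubic meters per pool
print(plume_volume_m3 / pool_volume_m3)  # 40,000,000 pools
```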

    But while they seem to be linked to seafloor eruptions, their origin has remained a mystery.

    Megaplumes mystery

    In our study, we used a mathematical model to explain the dispersal of submarine tephra through the ocean. Thanks to detailed mapping of a volcanic ash deposit in the north-east Pacific, we know that this tephra can spread up to several kilometers from the site of an eruption.

    This cannot be explained easily by tides or other oceanic currents. Our results instead suggest that the plumes must be highly energetic. Like the atmospheric plumes seen at terrestrial volcanoes, these initially rise upwards through the water before spreading out horizontally.

    The heat transfer required to drive this flow, and carry the tephra with it, is surprisingly large at around one terawatt (double that required to power the entire USA at once). We calculated that this should create plumes of a similar size to those that have indeed been measured.

    Our work provides strong evidence that megaplumes are linked to active seafloor eruptions and that they form very rapidly, probably in a matter of hours.

    So, what is the specific source of this intense input of heat and chemicals that ultimately creates a megaplume? The most obvious candidate is of course the freshly erupted molten lava. At first glance, our results seemed to support such a hypothesis.

    They show that megaplume formation occurs concurrently with the eruption of lava and tephra. But when we calculated the amount of lava required for this, it was unrealistically high, around ten times greater than most submarine lava flows.

    Our best guess for now is that, while megaplume creation is closely linked to seafloor eruptions, they primarily owe their origin to the emptying of reservoirs of hydrothermal fluids that are already present within the ocean crust. As magma forces its way upwards to feed seafloor eruptions, it may drive this hot (>300°C) fluid with it.

    Extreme life

    We now know that diverse microorganisms live in rocks below the surface. As startling as the discovery of extremophile lifeforms around hydrothermal vents was, this discovery pushed our ideas of what life is, and where it might exist, even further.

    The fact that our research suggests that megaplumes come from the crust is consistent with the detection of such bacteria within some megaplumes.

    The rapid outpouring of fluids associated with megaplume formation may actually be the primary mechanism that disperses these microorganisms from their subterranean origin. If so, then deep-sea volcanic activity is an important factor influencing the geography of these extremophile communities.

    Some scientists believe that the unusual physical and chemical conditions associated with seafloor hydrothermal systems may have provided a suitable environment for the origin of life on Earth. Megaplumes may therefore have been involved in spreading this life across the ocean.

    If life is to be found elsewhere in our solar system then hydrothermal vents, such as those thought to exist on Saturn’s moon Enceladus, would be a good place to look.

    In the absence of other sources of nutrients and light, these types of organisms – possibly the first to exist on our planet – owe their existence to the heat and chemicals supplied by the magma that rises upwards to feed seafloor volcanoes.

    Since megaplume-transported volcanic ash deposits seem to be common at deep sea volcanoes, the results of our research suggest that the proliferation of life through megaplume emissions may be widespread.

    While being able to observe a deep-sea eruption in person remains unlikely for now, efforts are being made to collect data on submarine volcanic events.

    The most notable of these is the observatory at Axial Volcano in the Pacific. This array of seafloor instruments can stream data in real time, capturing events as they happen.

    Exaggerated swath bathymetry of Axial Seamount. Credit: NOAA.

    Through efforts like these, in concert with continued mapping and sampling of the ocean floor, the volcanic character of the oceans is slowly being revealed.

    See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 1:22 pm on April 22, 2021
    Tags: "10 NSF funded studies that show the challenges and complexities of climate change"

    From National Science Foundation (US) : “10 NSF funded studies that show the challenges and complexities of climate change” 

    From National Science Foundation (US)

    April 20, 2021

    In a complex dance, Earth’s climate affects, and is affected by, the sky, land, ice, sea — and by life, including people. To understand climate change, which scientists believe may be one of the most important challenges humankind has ever faced, we need to comprehend Earth’s natural and human systems and how they interact. The answers may determine the future of life on our planet. For Earth Day, we look at 10 recent discoveries from U.S. National Science Foundation-funded climate change research and what they tell us about a warming planet.

    Many protected areas do not take into account the potential long-term effects of climate change. Photo Credit: Mandy Choi via Unsplash.

    1. Climate change forcing a rethinking of conservation biology planning.

    Creating and managing protected areas is key for biodiversity conservation. With changes in climate, species will need to migrate to maintain their habitat needs. Those that lived in protected areas 10 years ago may move outside those zones to find new areas that provide the climate and food they need to survive. Researchers looked at the amount of new protected areas in several regions, including areas where climate change is projected to be slower; areas where the terrain can shelter a high number of species; and areas that increase connectivity between protected zones, which allow species to move between them to escape adverse climate conditions. The study suggests that countries have not fully taken advantage of the potential of protected areas.

    In the Mojave Desert, burrowing mammals are weathering hotter, drier conditions. Photo Credit: Wikimedia Commons/Murray Foubister.

    2. In a desert seared by climate change, burrowers fare better than birds.

    In the arid Mojave Desert, small burrowing mammals such as the cactus mouse, the kangaroo rat and the white-tailed antelope squirrel are weathering the hotter, drier conditions triggered by climate change better than their winged counterparts. Over the past century, climate change has pushed the Mojave’s searing summer temperatures ever higher; the blazing heat has taken its toll on the desert’s birds. However, the research team that documented the birds’ decline also found that small mammal populations have remained relatively stable since the beginning of the 20th century. Using computer models to simulate response to heat, the researchers showed that small mammals’ resilience is likely due to their ability to escape the sun in underground burrows and their tendency to be more active at night.

    IODP researchers work aboard the ocean drillship JOIDES Resolution. Photo Credit: International Ocean Discovery Program.

    3. Scientists solve climate change mystery.

    Scientists have resolved a key climate change mystery, showing that the annual global temperature today is the warmest in the past 10,000 years. The findings challenge long-held views on the temperature history of the Holocene era, which began about 12,000 years ago and continues to the present. Using fossils of single-celled organisms from the ocean surface to reconstruct the temperature histories of the two most recent warm intervals on Earth, the researchers found that the first half of the Holocene was colder than in industrial times due to the cooling effects of remnant ice sheets from the previous glacial period. The warming was caused by an increase in greenhouse gases, as predicted by climate models.

    Meltwater lakes on Antarctica’s George VI Ice Shelf in January 2020. Photo Credit: Thomas Simons.

    4. Extreme melt on Antarctica’s George VI Ice Shelf.

    Antarctica’s George VI Ice Shelf experienced record melting during the summer season of 2019-2020 compared with 31 previous summers. The extreme melt coincided with record-setting stretches when local air temperatures were at or above the freezing point. The scientists studied the 2019-2020 melt season using satellite observations that can detect meltwater on top of the ice and in the near-surface snow. They observed the most widespread melt and greatest total number of melt days of any season for the northern George VI Ice Shelf. Understanding the impact of surface melt on ice shelf vulnerability can help researchers more accurately project the future influence of climate on sea level rise.

    The central fissure of the Laki volcano in Iceland. Photo Credit: Wikimedia Commons.

    5. Tree rings and Iceland’s Laki volcano eruption: A closer look at climate.

    By reading between the lines of tree rings, researchers reconstructed what happened in Alaska when the Laki Volcano erupted in 1783 — half a world away in Iceland. Laki spewed more sulfur into the atmosphere than any other Northern Hemisphere eruption in the last 1,000 years. The Inuit in North America tell stories about the year summer never arrived. Benjamin Franklin, who was in France at the time, noted the “fog” that descended over much of Europe and reasoned that it led to an unusually cold winter on the continent. What happened to climate from the eruption reflects a combination of the volcano’s effects and natural variability. The research is helping fine-tune future climate predictions.

    By the late 21st century, the number of people suffering extreme droughts will double. Photo Credit: Wikimedia Commons.

    6. By the late 21st century, the number of people suffering extreme droughts will double.

    Scientists are undertaking a global effort to offer the first worldwide view of how climate change could affect water availability and drought severity in the decades to come. By the late 21st century, the global land area and population facing extreme droughts could more than double — increasing from 3% during 1976-2005 to 7%-8%. More people will suffer from extreme droughts if a medium-to-high level of global warming continues and water management is maintained in its present state. Areas of the Southern Hemisphere, where water scarcity is already a problem, will be disproportionately affected. The researchers predict this increase in water scarcity will affect food security and escalate human migration and conflict.

    Paleoecologist Sora Kim studies ancient shark teeth to learn about Earth’s history. Photo Credit: University of California – Merced (US).

    7. Shark teeth offer clues to ancient climate change.

    A character in the movie “Jaws” said that all sharks do is “swim and eat and make little sharks.” It turns out they do much more than that. Sharks have roamed Earth’s oceans for more than 400 million years, quietly recording the planet’s history. If a researcher like paleoecologist Sora Kim of the University of California, Merced, wants to “read” those records to learn about major global changes that took place 50 million years ago, she must decode the information stored in what remains of ancient sharks: their teeth. Teeth from the long-extinct sand tiger shark are providing new information about global climate change and the movement of Earth’s tectonic plates.

    Researchers stand at the entrance to a cave in Mallorca. Photo Credit: University of South Florida (US).

    8. Scientists reconstruct 6.5 million years of sea level in the Western Mediterranean.

    The pressing concern posed by rising sea levels has created a need for scientists to predict how quickly the oceans will rise in coming centuries. To gain insight into future ice sheet stability and sea level rise, new findings draw on evidence from past periods when Earth’s climate was warmer than today. To reconstruct past sea levels, researchers used deposits found in caves on the Mediterranean island of Mallorca. The scientists determined that the extent of these unique deposits corresponds with the fluctuating water table, providing a way to precisely measure past sea levels.

    The great purple emperor butterfly is one of countless insect species needing human assistance. Photo Credit: Wikimedia Commons/Peeliden.

    9. Unsure how to help insect declines? Researchers suggest some ways.

    Florida Museum of Natural History entomologist Akito Kawahara’s message is straightforward: We can’t live without insects; they’re in trouble; and there’s something all of us can do to help. Kawahara’s research has focused on answering questions about moth and butterfly evolution, but he’s increasingly haunted by studies that sound the alarm about plummeting insect numbers and diversity. One of the culprits? Climate change. In response, Kawahara has turned his attention to boosting appreciation for some of the world’s most misunderstood animals. Now, Kawahara and his colleagues outline easy ways to contribute to insect conservation, including mowing less, dimming the lights, using insect-friendly soaps and sealants, and becoming insect ambassadors.

    A satellite image of a dust plume crossing the Korean Peninsula. Photo Credit: SeaWiFS Project, NASA/Goddard Space Flight Center, and ORBIMAGE.

    10. Will warming bring a change in the winds? Dust from the deep sea provides a clue.

    The westerlies — or westerly winds — play an important role in weather and climate locally and on a global scale by influencing precipitation patterns, affecting ocean circulation, and steering tropical cyclones. Assessing how they will change as climate warms is crucial. The westerlies usually blow from west to east across the planet’s mid-latitudes, but scientists have noticed that over the last several decades, these winds are moving toward the poles. Research suggests this shift is due to climate change. Scientists developed a new way to apply paleoclimatology, the study of past climate, to the behavior of the westerly winds and found evidence that atmospheric circulation patterns will change with climate warming. This breakthrough in understanding how the winds changed in the past may show us how they will continue to change in the future.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The National Science Foundation (NSF) (US) is an independent federal agency created by Congress in 1950 “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense…” We are the funding source for approximately 24 percent of all federally supported basic research conducted by America’s colleges and universities. In many fields such as mathematics, computer science and the social sciences, NSF is the major source of federal backing.

    We fulfill our mission chiefly by issuing limited-term grants — currently about 12,000 new awards per year, with an average duration of three years — to fund specific research proposals that have been judged the most promising by a rigorous and objective merit-review system. Most of these awards go to individuals or small groups of investigators. Others provide funding for research centers, instruments and facilities that allow scientists, engineers and students to work at the outermost frontiers of knowledge.

    NSF’s goals — discovery, learning, research infrastructure and stewardship — provide an integrated strategy to advance the frontiers of knowledge, cultivate a world-class, broadly inclusive science and engineering workforce and expand the scientific literacy of all citizens, build the nation’s research capability through investments in advanced instrumentation and facilities, and support excellence in science and engineering research and education through a capable and responsive organization. We like to say that NSF is “where discoveries begin.”

    Many of the discoveries and technological advances have been truly revolutionary. In the past few decades, NSF-funded researchers have won some 236 Nobel Prizes as well as other honors too numerous to list. These pioneers have included the scientists or teams that discovered many of the fundamental particles of matter, analyzed the cosmic microwaves left over from the earliest epoch of the universe, developed carbon-14 dating of ancient artifacts, decoded the genetics of viruses, and created an entirely new state of matter called a Bose-Einstein condensate.

    NSF also funds equipment that is needed by scientists and engineers but is often too expensive for any one group or researcher to afford. Examples of such major research equipment include giant optical and radio telescopes, Antarctic research sites, high-end computer facilities and ultra-high-speed connections, ships for ocean research, sensitive detectors of very subtle physical phenomena and gravitational wave observatories.

    Another essential element in NSF’s mission is support for science and engineering education, from pre-K through graduate school and beyond. The research we fund is thoroughly integrated with education to help ensure that there will always be plenty of skilled people available to work in new and emerging scientific, engineering and technological fields, and plenty of capable teachers to educate the next generation.

    No single factor is more important to the intellectual and economic progress of society, and to the enhanced well-being of its citizens, than the continuous acquisition of new knowledge. NSF is proud to be a major part of that process.

    Specifically, the Foundation’s organic legislation authorizes us to engage in the following activities:

    Initiate and support, through grants and contracts, scientific and engineering research and programs to strengthen scientific and engineering research potential, and education programs at all levels, and appraise the impact of research upon industrial development and the general welfare.
    Award graduate fellowships in the sciences and in engineering.
    Foster the interchange of scientific information among scientists and engineers in the United States and foreign countries.
    Foster and support the development and use of computers and other scientific methods and technologies, primarily for research and education in the sciences.
    Evaluate the status and needs of the various sciences and engineering and take into consideration the results of this evaluation in correlating our research and educational programs with other federal and non-federal programs.
    Provide a central clearinghouse for the collection, interpretation and analysis of data on scientific and technical resources in the United States, and provide a source of information for policy formulation by other federal agencies.
    Determine the total amount of federal money received by universities and appropriate organizations for the conduct of scientific and engineering research, including both basic and applied, and construction of facilities where such research is conducted, but excluding development, and report annually thereon to the President and the Congress.
    Initiate and support specific scientific and engineering activities in connection with matters relating to international cooperation, national security and the effects of scientific and technological applications upon society.
    Initiate and support scientific and engineering research, including applied research, at academic and other nonprofit institutions and, at the direction of the President, support applied research at other organizations.
    Recommend and encourage the pursuit of national policies for the promotion of basic research and education in the sciences and engineering. Strengthen research and education innovation in the sciences and engineering, including independent research by individuals, throughout the United States.
    Support activities designed to increase the participation of women and minorities and others underrepresented in science and technology.

    At present, NSF has a total workforce of about 2,100 at its Alexandria, VA, headquarters, including approximately 1,400 career employees, 200 scientists from research institutions on temporary duty, 450 contract workers and the staff of the NSB office and the Office of the Inspector General.

    NSF is divided into the following seven directorates that support science and engineering research and education: Biological Sciences, Computer and Information Science and Engineering, Engineering, Geosciences, Mathematical and Physical Sciences, Social, Behavioral and Economic Sciences, and Education and Human Resources. Each is headed by an assistant director and each is further subdivided into divisions like materials research, ocean sciences and behavioral and cognitive sciences.

    Within NSF’s Office of the Director, the Office of Integrative Activities also supports research and researchers. Other sections of NSF are devoted to financial management, award processing and monitoring, legal affairs, outreach and other functions. The Office of the Inspector General examines the foundation’s work and reports to the NSB and Congress.

    Each year, NSF supports an average of about 200,000 scientists, engineers, educators and students at universities, laboratories and field sites all over the United States and throughout the world, from Alaska to Alabama to Africa to Antarctica. You could say that NSF support goes “to the ends of the earth” to learn more about the planet and its inhabitants, and to produce fundamental discoveries that further the progress of research and lead to products and services that boost the economy and improve general health and well-being.

    As described in our strategic plan, NSF is the only federal agency whose mission includes support for all fields of fundamental science and engineering, except for medical sciences. NSF is tasked with keeping the United States at the leading edge of discovery in a wide range of scientific areas, from astronomy to geology to zoology. So, in addition to funding research in the traditional academic areas, the agency also supports “high risk, high pay off” ideas, novel collaborations and numerous projects that may seem like science fiction today, but which the public will take for granted tomorrow. And in every case, we ensure that research is fully integrated with education so that today’s revolutionary work will also be training tomorrow’s top scientists and engineers.

    Unlike many other federal agencies, NSF does not hire researchers or directly operate our own laboratories or similar facilities. Instead, we support scientists, engineers and educators directly through their own home institutions (typically universities and colleges). Similarly, we fund facilities and equipment such as telescopes, through cooperative agreements with research consortia that have competed successfully for limited-term management contracts.

    NSF’s job is to determine where the frontiers are, identify the leading U.S. pioneers in these fields and provide money and equipment to help them continue. The results can be transformative. For example, years before most people had heard of “nanotechnology,” NSF was supporting scientists and engineers who were learning how to detect, record and manipulate activity at the scale of individual atoms — the nanoscale. Today, scientists are adept at moving atoms around to create devices and materials with properties that are often more useful than those found in nature.

    Dozens of companies are gearing up to produce nanoscale products. NSF is funding the research projects, state-of-the-art facilities and educational opportunities that will teach new skills to the science and engineering students who will make up the nanotechnology workforce of tomorrow.

    At the same time, we are looking for the next frontier.

    NSF’s task of identifying and funding work at the frontiers of science and engineering is not a “top-down” process. NSF operates from the “bottom up,” keeping close track of research around the United States and the world, maintaining constant contact with the research community to identify ever-moving horizons of inquiry, monitoring which areas are most likely to result in spectacular progress and choosing the most promising people to conduct the research.

    NSF funds research and education in most fields of science and engineering. We do this through grants and cooperative agreements to more than 2,000 colleges, universities, K-12 school systems, businesses, informal science organizations and other research organizations throughout the U.S. The Foundation considers proposals submitted by organizations on behalf of individuals or groups for support in most fields of research. Interdisciplinary proposals also are eligible for consideration. Awardees are chosen from those who send us proposals asking for a specific amount of support for a specific project.

    Proposals may be submitted in response to the various funding opportunities that are announced on the NSF website. These funding opportunities fall into three categories — program descriptions, program announcements and program solicitations — and are the mechanisms NSF uses to generate funding requests. At any time, scientists and engineers are also welcome to send in unsolicited proposals for research and education projects, in any existing or emerging field. The Proposal and Award Policies and Procedures Guide (PAPPG) provides guidance on proposal preparation and submission and award management. At present, NSF receives more than 42,000 proposals per year.

    To ensure that proposals are evaluated in a fair, competitive, transparent and in-depth manner, we use a rigorous system of merit review. Nearly every proposal is evaluated by a minimum of three independent reviewers consisting of scientists, engineers and educators who do not work at NSF or for the institution that employs the proposing researchers. NSF selects the reviewers from among the national pool of experts in each field and their evaluations are confidential. On average, approximately 40,000 experts, knowledgeable about the current state of their field, give their time to serve as reviewers each year.

    The reviewer’s job is to decide which projects are of the very highest caliber. NSF’s merit review process, considered by some to be the “gold standard” of scientific review, ensures that many voices are heard and that only the best projects make it to the funding stage. An enormous amount of research, deliberation, thought and discussion goes into award decisions.

    The NSF program officer reviews the proposal and analyzes the input received from the external reviewers. After scientific, technical and programmatic review and consideration of appropriate factors, the program officer makes an “award” or “decline” recommendation to the division director. Final programmatic approval for a proposal is generally completed at NSF’s division level. A principal investigator (PI) whose proposal for NSF support has been declined will receive information and an explanation of the reason(s) for declination, along with copies of the reviews considered in making the decision. If that explanation does not satisfy the PI, he/she may request additional information from the cognizant NSF program officer or division director.

    If the program officer makes an award recommendation and the division director concurs, the recommendation is submitted to NSF’s Division of Grants and Agreements (DGA) for award processing. A DGA officer reviews the recommendation from the program division/office for business, financial and policy implications, and the processing and issuance of a grant or cooperative agreement. DGA generally makes awards to academic institutions within 30 days after the program division/office makes its recommendation.

     
  • richardmitnick 12:25 pm on April 22, 2021 Permalink | Reply
    Tags: "Monitoring the Oceans’ Color for Clues to Climate Change", , , MOBY project, ,   

    From National Institute of Standards and Technology (US) : “Monitoring the Oceans’ Color for Clues to Climate Change” 

    From National Institute of Standards and Technology (US)

    April 22, 2021
    B. Carol Johnson

    1

    Ocean chlorophyll concentrations, MODIS-Aqua full mission July 2002 to January 2021.
    Credit: Ocean Biology Processing Group, National Aeronautics Space Agency (US)/Goddard Space Flight Center(US)

    “It is February 1994 and I am on the research vessel R/V Moana Wave off the coast of Lanai, Hawaii, with the team of the Marine Optical BuoY (MOBY) project. The water is incredibly blue, and I can’t help but be awestruck by the enormous energy, momentum, power and depth of the ocean as I watch the currents and the wind create what appear to be rising and falling pyramids of solid substance, no longer a liquid but a mighty living thing. It is against this backdrop that we work to deploy devices designed to determine the optical properties of the Pacific Ocean. As the instrument at hand, bright yellow and wing-shaped, is lowered over the port side into the water, its yellow wings appear green. This change, and indeed the blue color of the water itself, was indicative of what was in the water, which in the case of the open Pacific, was “not much.” “Much” meaning small particles in the water that scatter or absorb sunlight, changing the overall reflectance of the sunlit layer, and thereby the observed color.

    2
    Phytoplankton. Credit: National Oceanic and Atmospheric Administration (US) MESA Project.

    One class of these small particles is phytoplankton, a form of algae. They contain chlorophyll and practice photosynthesis as does any other plant, using solar radiation to convert carbon dioxide dissolved in the water into plant sugars, releasing oxygen and respiring a portion of the carbon dioxide. A portion of the carbon dioxide is eventually converted to sediment thanks to the grazing activities of marine life (and death), such as zooplankton, young fish or crustaceans and those that feed upon them.

    Phytoplankton are the basis of marine life, produce about half of the oxygen in the Earth’s atmosphere, and currently absorb much of the carbon dioxide we humans produce. Ocean temperature, currents, acidification, surface winds and nutrients can affect phytoplankton populations, life cycles and the amount of carbon dioxide they remove from the atmosphere, and so measuring them is critical to understanding climate change. A reasonable question to ask is “Will the oceans continue to remove significant amounts of human-produced carbon dioxide in the future?”

    We need to continually observe the color of the world’s oceans to address these questions, and ocean color satellites have been in continuous operation since the launch of NASA’s SeaWiFS mission in 1997. The idea is simple: Just as you can tell a desert from a forest by the color, so it goes with the oceans. But, while the idea is simple, detecting the color of the oceans through Earth’s atmosphere is not. The main problem is that the atmosphere also scatters sunlight, and both sources of scattered light, from the ocean and the atmosphere, are detected by the satellite sensor. Because the portion from the atmosphere dominates, it is not possible, with the current status of preflight calibration and atmospheric modeling, to use ocean color satellites to derive the light scattered out of the oceans with the accuracy we need to determine the chlorophyll concentration and other quantities of interest.
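
    To make the scale of this problem concrete, here is a small back-of-the-envelope sketch in Python. The numbers in it (an oceanic share of roughly 10% of the top-of-atmosphere signal and a 1% sensor error) are illustrative assumptions rather than MOBY or mission specifications; the point is only that a small error in the total measured signal becomes a much larger error in the small oceanic part.

```python
# Illustrative error-propagation sketch (not MOBY or NASA processing code).
# Assumed: over clear ocean at blue wavelengths, only ~10% of the light
# reaching the satellite comes from below the sea surface.
ocean_fraction = 0.10  # assumed oceanic share of the top-of-atmosphere (TOA) signal
toa_error = 0.01       # assumed 1% radiometric error in the satellite's TOA measurement

# If the atmospheric part were modeled perfectly, the entire TOA error would
# land on the small oceanic part of the signal.
water_leaving_error = toa_error / ocean_fraction

print(f"A {toa_error:.0%} TOA error becomes roughly a {water_leaving_error:.0%} "
      "error in the retrieved water-leaving radiance.")
```

    This is why a well-characterized in-water reference such as MOBY is used to calibrate the satellite-plus-atmosphere system as a whole, rather than relying on preflight calibration and atmospheric modeling alone.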

    2
    Divers inspecting MOBY. The buoy extends 12 meters (39.4 feet) underwater and has sensors at depths of 1 m (3.3 ft), 5 m (16.4 ft) and 9 m (29.5 ft). Credit: Moss Landing Marine Laboratories (US)(MLML).

    The solution lies in a procedure called system vicarious calibration (SVC), which was pioneered from ships by Dennis Clark at NOAA and colleagues during the Coastal Zone Color Scanner satellite mission (1978 to 1986). Based on this experience, Dennis implemented the MOBY project with support from NOAA and NASA’s SeaWiFS and MODIS projects; MOBY has produced data from July 1997 to the present. MOBY is an optical system that measures light at different colors (wavelengths). This type of system is called a spectrograph. It is mounted on a tethered wave-rider buoy and measures the light incident on the surface and at three depths, and the backscattered light at four depths. MOBY is located about 20 kilometers from Lanai, where the water is representative of the world’s oceans. Data are acquired daily as ocean color satellites fly over the location. The MOBY instrument, in collaboration with NIST, is extremely well characterized and extensively calibrated, and the results are traceable to the International System of Units (SI), the modern metric system. By providing accurate values for the oceanic portion of the light measured by the satellite sensor, MOBY provides a calibrated source for any ocean color sensor that observes this region of the ocean.

    I was on the Wave in 1994 because Dennis had come to NIST a couple of years earlier for help with establishing traceability to the SI, which means ensuring the MOBY results are rigorously connected to the SI measurement system so that researchers around the world have the best possible ocean color reference. He had designed a robust measurement plan, with cross-checks and validation at every turn. Over the years, NIST has supplied radiometric sensors for the MOBY team to track its calibration sources in between the NIST calibrations, and we have deployed additional NIST radiometers and sources on occasion to validate the radiometric scales at the MOBY facility in Honolulu.

    NIST has also played a role in characterizing the MOBY optical system. A good example is a problem Dennis presented early on: Independent, simultaneous measurements at the same wavelength and depth did not agree. Now, this is a problem, but really it is a good thing to find issues. In metrology, in order to assure ourselves we are getting the best (and hopefully correct) answer, it is good practice to measure the same thing with different approaches, in this case the backscattered light at the same wavelength with two different spectrographs. It took a while to figure out, but thanks to some laser characterizations and subsequent discussions, we identified stray light as the issue and developed and implemented an algorithm to correct the problem.
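
    For readers curious what such a correction can look like, below is a minimal sketch of one common matrix-based approach to spectrograph stray-light correction, written in Python. The pixel count, the Gaussian “halo” kernel and the function name are invented for illustration; this is not the MOBY team’s characterization data or correction code.

```python
# Minimal matrix-based stray-light correction sketch (hypothetical numbers).
import numpy as np

n_pixels = 256
pixels = np.arange(n_pixels)

# D[i, j]: assumed fraction of light from spectrograph pixel j that leaks into
# pixel i. A weak, broad Gaussian halo stands in for a measured stray-light
# distribution; the diagonal (the in-band signal itself) is excluded.
D = 1e-4 * np.exp(-((pixels[:, None] - pixels[None, :]) / 40.0) ** 2)
np.fill_diagonal(D, 0.0)

A = np.eye(n_pixels) + D  # model: measured = A @ true
A_inv = np.linalg.inv(A)  # characterize once, then reuse for every spectrum

def correct_stray_light(measured_spectrum):
    """Remove the modeled stray-light contribution from a raw spectrum."""
    return A_inv @ measured_spectrum

# Example: a narrow, laser-like line picks up a faint halo when "measured",
# and the correction recovers the original line to numerical precision.
true_spectrum = np.zeros(n_pixels)
true_spectrum[128] = 1.0
measured = A @ true_spectrum
recovered = correct_stray_light(measured)
print(f"Max residual after correction: {np.max(np.abs(recovered - true_spectrum)):.2e}")
```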

    As you may have gathered, MOBY has been around for quite some time. It is an example of how collaborations really work, leading to a world-class product. We’re currently on our 67th buoy (a buoy has a deployment cycle of three to six months). We rotate two systems, calibrating and refurbishing one while the other is in the water. NOAA fully supports the MOBY project for calibration of its Visible Infrared Imaging Radiometer Suite (VIIRS). Presently, under the leadership of Kenneth Voss (University of Miami; Dennis retired in 2005 and died in 2014), with execution by Moss Landing Marine Laboratories and with NOAA support, we are implementing a new system design. The new optical system collects data from all depths simultaneously in order to reduce environmental sources of measurement uncertainty. A new carbon-fiber buoy structure and new control, communication and data analysis systems complete the system, which we call “Refresh.”

    The MOBY team, with NASA funding, is developing a portable version termed MarONet. This system is identical to Refresh and enables deployment at a new location with recalibrations in Honolulu at the MOBY facility. The first MarONet will be deployed off Rottnest Island, Australia, in 2022. NIST’s role in MarONet is to supply a stable source and spectroradiometer system to validate any changes with shipment. In 2022, NASA will decide whether to continue the MarONet project as the primary SVC site for the upcoming Plankton, Aerosol, Cloud and ocean Ecosystem (PACE) mission.

    I would like to close with a few words about the MOBY team and how this work has been a core part of my career at NIST. Yes, I have been involved with other satellite sensors, performing on-site validation activities at the manufacturers’ facilities with my NIST colleagues for the NASA Earth Observing System program, NOAA geostationary satellites, ESA’s Sentinel-2, the Orbiting Carbon Observatory and others.

    I have had the opportunity to participate in validation of ground-based measurements of the Moon’s irradiance. But the field of ocean color has led to long-standing relationships with exceptional scientists, and I am so grateful for this experience.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD, USA

    National Institute of Standards and Technology (US)‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress, stated that “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838 that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901, in response to a bill proposed by Congressman James H. Southard (R-Ohio), the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington DC (US) and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures the Bureau developed instruments for electrical units and for measurement of light. In 1905 a meeting was called that would be the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency, the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use, but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production, even operating its own facility to produce optical glass when European supplies were cut off. Between the wars, Harry Diamond of the Bureau developed a blind-approach radio aircraft landing system. During World War II, military research and development was carried out, including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and shortly afterwards the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC: the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time, the Standards Western Automatic Computer was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954.

    Due to a changing mission, the “National Bureau of Standards” became the “National Institute of Standards and Technology (US)” in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.

    Organization

    NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

    Communications Technology Laboratory (CTL)
    Engineering Laboratory (EL)
    Information Technology Laboratory (ITL)
    Center for Neutron Research (NCNR)
    Material Measurement Laboratory (MML)
    Physical Measurement Laboratory (PML)

    Extramural programs include:

    Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
    Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
    Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

    NIST’s Boulder laboratories are best known for NIST-F1, the atomic clock that serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium—which defines the second—NIST broadcasts time signals via longwave radio station WWVB near Fort Collins in Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha in Hawai’i, respectively.

    NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR). The NCNR provides scientists access to a variety of neutron scattering instruments which they use in many research fields (materials science; fuel cells; biotechnology etc.).

    The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation in continuous operation since 1961. SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

    The Center for Nanoscale Science and Technology (CNST) performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility. This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).

    Committees

    NIST has seven standing committees:

    Technical Guidelines Development Committee (TGDC)
    Advisory Committee on Earthquake Hazards Reduction (ACEHR)
    National Construction Safety Team Advisory Committee (NCST Advisory Committee)
    Information Security and Privacy Advisory Board (ISPAB)
    Visiting Committee on Advanced Technology (VCAT)
    Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
    Manufacturing Extension Partnership National Advisory Board (MEPNAB)

    Measurements and standards

    As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content, used as calibration standards for measuring equipment and procedures, quality control benchmarks for industrial processes, and experimental control samples.

    Handbook 44

    NIST publishes Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through cooperation of the Committee on Specifications and Tolerances of the NCWM and the Weights and Measures Division (WMD) of NIST. The purpose of the book is a partial fulfillment of the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

    NIST has been publishing various forms of what is now the Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the concept of the primary use of the SI (metric) measurements recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.

     
  • richardmitnick 11:52 am on April 22, 2021 Permalink | Reply
    Tags: "The Uncertainty of Climate Change is Hurting Us", , , Environmental exposures affect pregnancies., High temperatures bring high risks., Painting a new picture of the future., Studying climate effects in our backyard., Tarik Benmarhnia, UC San Diego (US), We know sea levels; temperatures; and atmospheric CO2 levels are climbing but what’s less clear is what that’s doing to our health.   

    From UC San Diego (US) : “The Uncertainty of Climate Change is Hurting Us” 

    From UC San Diego (US)

    We know sea levels, temperatures and atmospheric CO2 levels are climbing, but what’s less clear is what that’s doing to our health.

    1
    Tarik Benmarhnia didn’t plan on ending up here, in an office overlooking the pier at UC San Diego’s Scripps Institution of Oceanography.

    As a young student in France, Tarik Benmarhnia started out studying environmental engineering, with an interest in soil decontamination. During his schooling, he developed an interest in environmental justice. That eventually drove him to pursue a Ph.D. in epidemiology.

    Most stories about climate change focus on the environmental effects, such as flooding in Venice’s Saint Mark’s Square and extensive droughts along the West Coast. But Benmarhnia and other researchers are now learning that the changing climate is having immediate and direct effects on our health—and will continue to affect us for years to come.

    “The real challenge with studying the health impact of climate change is that there’s so much uncertainty,” said Benmarhnia, now associate professor at the Herbert Wertheim School of Public Health and Human Longevity Science and at Scripps Institution of Oceanography at UC San Diego. “That makes it difficult to predict what, exactly, the health effects will be—and makes it hard to convince people that this is an important issue and that actions need to be undertaken right now.”

    High temperatures bring high risks.

    Uncertainty is a large part of what propels Benmarhnia’s research. He is interested in understanding complex situations—which kinds of events will lead to heat waves and wildfires, what those heat waves and wildfires will mean for human health, and who ultimately is sickened or dies.

    “It’s really important that we understand the small details,” said Benmarhnia. “We can’t just say ‘Heat is killing people.’ We have to figure out why and how. We need to pick apart the complexity of the relationship and understand how different groups will be affected, and what policies we need to create to address the problems.”

    One example: The relationship between local humidity and heat waves. Changing weather patterns have led to increased humidity along the California coast. Humid air holds heat better than dry air, so more humidity means high temperatures along the coast overnight. High temperatures can cause a host of health problems, including dehydration and heatstroke.

    Normally, our bodies can avoid heatstroke by sweating, which brings heat to the surface of the skin and lets it dissipate as the sweat dries. But when it’s humid out, the sweat can’t evaporate, and the heat can’t dissipate. And under high heat conditions—above our body’s normal temperature of 98.6˚F—sweating also becomes moot.

    Heatstroke can cause confusion and seizures, and if left untreated, the body will eventually begin to shut down as essential enzymes and organs stop functioning. Everyone is at risk of heatstroke, but the risk is especially high for children, whose bodies aren’t fully adept at temperature control, and older adults who may take medications that affect heat regulation.

    According to the National Climate Assessment, the number of “extreme heat” days in the U.S. will continue to rise. And the combination of extreme heat days and increased coastal humidity means that Southern California—including San Diego—will be hit especially hard.

    But how do we go about coping with this problem? “One way we can help is by implementing action plans, such as early heat warning systems, to protect and care for vulnerable populations on very hot days,” said Benmarhnia. “The risks of extreme heat are especially burdensome on the elderly and in low-income communities, where people may not be able to afford air conditioning.

    “If communities are educated about these risks and can respond accordingly to check on their vulnerable members, as well as implementing urban greening strategies to create more parks and gardens to help keep neighborhoods cool, that can go a long way toward preventing heat stroke deaths.”

    Environmental exposures affect pregnancies.

    The health effects of climate change aren’t always as obvious as heatstroke. Other scientists are studying the myriad ways a shifting environment and rising pollution levels are affecting human development.

    Like Benmarhnia, Christina Chambers is not necessarily the person you’d expect to be at the forefront of this field. In fact, research is her second career—after volunteering in the neonatal intensive care unit at UC San Diego Health on her days off, she made the jump from the business world into epidemiology.

    She studies teratology, working to better understand the causes of congenital abnormalities in human development. As a professor of pediatrics at UC San Diego School of Medicine, her particular interest is in understanding how exposure to environmental compounds, such as pesticides, medications and infections, can affect embryonic development during pregnancy and childhood development via breast milk. Often, her research is longitudinal, following groups of parents and children over many years to understand the short- and long-term effects of certain exposures.

    Chambers finds that her work is increasingly influenced by climate concerns. “I work with a counseling service to answer questions about possible exposures for people who are pregnant or breastfeeding,” she said. “The most difficult questions come after a natural disaster, like a hurricane or flood. Now you’re not only impacted by the devastation of losing your home, but also the downstream effects of mold exposure, infections, things you would not have had otherwise. It’s really important that people understand the risks of these kinds of exposures.”

    According to Chambers, there are many ways a changing climate can lead to increased hazardous exposures during pregnancy and breastfeeding. For example, it could alter the geographic distribution of disease-bearing insects, such as ticks transmitting Lyme disease or mosquitoes carrying Zika virus. If natural disasters and altered weather patterns affect agriculture, it may be difficult to access critical foods for preventing birth defects, such as folic acid-rich spinach.

    One area where climate change is already directly affecting pregnancy outcomes is closely related to the work being done by Benmarhnia—specifically, a link between adverse pregnancy outcomes and increased body temperature.

    “During pregnancy, you don’t want your body temperature to increase more than a couple of degrees,” Chambers said. “We know that during certain gestational windows, experiencing a high temperature, whether from a fever or even from sitting in a hot tub for too long, can lead to serious birth defects. So there’s a possibility that increasingly high air temperatures could jeopardize healthy pregnancies.”

    New research by Benmarhnia and his colleagues at UC San Diego and San Diego State University demonstrates another risk of extreme heat exposure during pregnancy: increased risk of preterm birth.

    In a study published in February 2020, the researchers examined data from nearly 2 million births from 2005 through 2013. The risk of preterm birth was consistently higher for people exposed to a high heat episode during their last week of pregnancy. The higher the temperature and the longer the heat wave, the greater the risk of preterm birth.

    Preterm birth, defined as birth before 37 weeks of gestation, is associated with a variety of health issues. Short-term complications can include respiratory and cardiac problems, risk of brain hemorrhage, and difficulty controlling body temperature. Long term, children born preterm are at an increased risk of cerebral palsy, learning and behavioral disorders, and vision and hearing problems. These challenges can affect people throughout their lives.

    Benmarhnia’s research suggests that implementing warning and alert systems targeted toward pregnant people, as well as expanded cooling zones and more exposure to green spaces, could improve birth outcomes and protect against these risks.

    But even outside of preterm birth, there may be other risks associated with climate change effects. According to Benmarhnia, exposure to high levels of air pollutants during pregnancy—such as those generated by the wildfires that blaze across southern California each summer and fall—is associated with increased risk of heart defects and even prenatal respiratory complications, indicating that climate change could be affecting pregnancies at all stages.

    Studying climate effects in our backyard.

    While it’s likely that climate change could be contributing to prenatal risks, it can be difficult to prove a direct association without long-term environmental and health data.

    To that end, Chambers and her team are undertaking an enormous study in San Diego County, compiling anonymized data from every baby born in San Diego over a 20-year period.

    “This study captures all of the hospital discharge data for all babies born in the county, including locations and dates,” said Chambers. “Now we are comparing that information with relevant environmental data, like water and air quality measures, traffic patterns, and police data, to see if there are relationships or patterns between health outcomes and environmental events.

    “We can also access decades of banked blood spots from newborns and serum samples from expectant parents, to look for genetic markers associated with increased risk of certain problems, such as Sudden Infant Death Syndrome, or SIDS.”

    These data can also be used to examine the effects of specific kinds of environmental exposures.

    “Are there changes in preterm delivery rates in regions affected by wildfires?” said Chambers. “Or in areas where it’s getting hotter over time? This longitudinal data will help us pick apart the impacts of these trends.”

    Chambers’ hope is that this project will continue to expand and include more members of the community, to collect additional health data in the region.

    “When it comes to having children, climate change is a big consideration for some people,” she said. “There’s been a slight drop in birth rates recently, and it could be caused by a lot of things, but I’m definitely hearing concerns about what the world will look like in the future.”

    Painting a new picture of the future.

    Even with data from thousands of community members, it will be difficult to pinpoint the exact causes of adverse health outcomes in pregnancy and infants, which is why Benmarhnia and others are working to make the connections between climate change and child health.

    A study out of Benmarhnia’s lab, published in December 2019 in the Annals of the American Thoracic Society, found that exposure to wildfire smoke during the Lilac Fire in December 2017 was connected to an uptick in the number of children seeking respiratory care at emergency rooms and urgent care clinics. More recently, the team found that airborne particles in wildfire smoke are approximately 10 times more harmful to children’s respiratory health than similarly sized particles from other sources, particularly for children under age five.

    And he’s studying other angles, too. “Big, obvious problems are easy,” he said. “More pollution in the air leads to more respiratory problems. But it’s harder to see that a heatwave can exacerbate other health conditions and lead to more complications, like a stroke, or renal failure in people with diabetes.

    “Extreme precipitation events can cause sewers to flood and release pathogens into the air and water. And we know that high levels of air pollution are linked to increased risk of dementia. But all of these things are very sneaky, and very hard to quantify.”

    2
    San Diego skyline in front of smoke from wildfires, October 23, 2007. Credit: Kat Miner/Wikimedia.

    Benmarhnia isn’t just trying to connect the dots between climate change and health; he’s also working with local policy makers and government officials to start developing plans to protect our health in the years to come.

    “We’re developing an adaptation plan specifically focused on the health impacts of climate change and how to mitigate them,” he said. “This is important because everyone will be impacted by this. Everyone knows someone dealing with some of these conditions. And everyone is exposed.”

    While that might seem dire, Benmarhnia actually finds this perspective motivating, and thinks it will help spur climate action.

    “Even without the climate change focus, epidemiology is depressing,” he laughed. “But ultimately, I think that if we were able to so dramatically change the planet in only a few decades, with enough effort, we can try to do just the opposite.”

    It will take time and effort to make the necessary changes, but Benmarhnia is optimistic—and despite the risks of climate change, he thinks that children are an essential part of the future.

    “We need to share what we’re doing with the next generation, and make sure that they are ready to act,” he said. “Having kids is part of the solution.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of California, San Diego (US), is a public research university located in the La Jolla area of San Diego, California, in the United States. The university occupies 2,141 acres (866 ha) near the coast of the Pacific Ocean with the main campus resting on approximately 1,152 acres (466 ha). Established in 1960 near the pre-existing Scripps Institution of Oceanography, UC San Diego is the seventh oldest of the 10 University of California campuses and offers over 200 undergraduate and graduate degree programs, enrolling about 22,700 undergraduate and 6,300 graduate students. UC San Diego is one of America’s Public Ivy universities, which recognizes top public research universities in the United States. UC San Diego was ranked 8th among public universities and 37th among all universities in the United States, and rated the 18th Top World University by U.S. News & World Report’s 2015 rankings.

    UC San Diego is organized into seven undergraduate residential colleges (Revelle; John Muir; Thurgood Marshall; Earl Warren; Eleanor Roosevelt; Sixth; and Seventh), four academic divisions (Arts and Humanities; Biological Sciences; Physical Sciences; and Social Sciences), and seven graduate and professional schools (Jacobs School of Engineering; Rady School of Management; Scripps Institution of Oceanography; School of Global Policy and Strategy; School of Medicine; Skaggs School of Pharmacy and Pharmaceutical Sciences; and the newly established Wertheim School of Public Health and Human Longevity Science). UC San Diego Health, the region’s only academic health system, provides patient care; conducts medical research; and educates future health care professionals at the UC San Diego Medical Center, Hillcrest; Jacobs Medical Center; Moores Cancer Center; Sulpizio Cardiovascular Center; Shiley Eye Institute; Institute for Genomic Medicine; Koman Family Outpatient Pavilion and various express care and urgent care clinics throughout San Diego.

    The university operates 19 organized research units (ORUs), including the Center for Energy Research; Qualcomm Institute (a branch of the California Institute for Telecommunications and Information Technology); San Diego Supercomputer Center; and the Kavli Institute for Brain and Mind, as well as eight School of Medicine research units, six research centers at Scripps Institution of Oceanography and two multi-campus initiatives, including the Institute on Global Conflict and Cooperation. UC San Diego is also closely affiliated with several regional research centers, such as the Salk Institute; the Sanford Burnham Prebys Medical Discovery Institute; the Sanford Consortium for Regenerative Medicine; and the Scripps Research Institute. It is classified among “R1: Doctoral Universities – Very high research activity”. According to the National Science Foundation(US), UC San Diego spent $1.265 billion on research and development in fiscal year 2018, ranking it 7th in the nation.

    UC San Diego is considered one of the country’s Public Ivies. As of February 2021, UC San Diego faculty, researchers and alumni have won 27 Nobel Prizes and three Fields Medals, eight National Medals of Science, eight MacArthur Fellowships, and three Pulitzer Prizes. Additionally, of the current faculty, 29 have been elected to the National Academy of Engineering, 70 to the National Academy of Sciences(US), 45 to the National Academy of Medicine(US) and 110 to the American Academy of Arts and Sciences.

    History

    When the Regents of the University of California originally authorized the San Diego campus in 1956, it was planned to be a graduate and research institution, providing instruction in the sciences, mathematics, and engineering. Local citizens supported the idea, voting the same year to transfer to the university 59 acres (24 ha) of mesa land on the coast near the preexisting Scripps Institution of Oceanography(US). The Regents requested an additional gift of 550 acres (220 ha) of undeveloped mesa land northeast of Scripps, as well as 500 acres (200 ha) on the former site of Camp Matthews from the federal government, but Roger Revelle, then director of Scripps Institution and main advocate for establishing the new campus, jeopardized the site selection by exposing the La Jolla community’s exclusive real estate business practices, which were antagonistic to minority racial and religious groups. This outraged local conservatives, as well as Regent Edwin W. Pauley.

    UC President Clark Kerr satisfied San Diego city donors by changing the proposed name from University of California, La Jolla, to University of California, San Diego. The city voted in agreement to its part in 1958, and the UC approved construction of the new campus in 1960. Because of the clash with Pauley, Revelle was not made chancellor. Herbert York, first director of Lawrence Livermore National Laboratory, was designated instead. York planned the main campus according to the “Oxbridge” model, relying on many of Revelle’s ideas.

    According to Kerr, “San Diego always asked for the best,” though this created much friction throughout the UC system, including with Kerr himself, because UC San Diego often seemed to be “asking for too much and too fast.” Kerr attributed UC San Diego’s “special personality” to Scripps, which for over five decades had been the most isolated UC unit in every sense: geographically, financially, and institutionally. It was a great shock to the Scripps community to learn that Scripps was now expected to become the nucleus of a new UC campus and would now be the object of far more attention from both the university administration in Berkeley and the state government in Sacramento.

    UC San Diego was the first general campus of the University of California to be designed “from the top down” in terms of research emphasis. Local leaders disagreed on whether the new school should be a technical research institute or a more broadly based school that included undergraduates as well. John Jay Hopkins of General Dynamics Corporation pledged one million dollars for the former while the City Council offered free land for the latter. The original authorization for the San Diego campus given by the UC Regents in 1956 approved a “graduate program in science and technology” that included undergraduate programs, a compromise that won both the support of General Dynamics and the city voters’ approval.

    Nobel laureate Harold Urey, a physical chemist from the University of Chicago(US), and Hans Suess, who had published the first paper on the greenhouse effect with Revelle in the previous year, were early recruits to the faculty in 1958. Maria Goeppert-Mayer, later the second female Nobel laureate in physics, was appointed professor of physics in 1960. The graduate division of the school opened in 1960 with 20 faculty in residence, with instruction offered in the fields of physics, biology, chemistry, and earth science. Before the main campus completed construction, classes were held in the Scripps Institution of Oceanography.

    By 1963, new facilities on the mesa had been finished for the School of Science and Engineering, and new buildings were under construction for Social Sciences and Humanities. Ten additional faculty in those disciplines were hired, and the whole site was designated the First College, later renamed after Roger Revelle, of the new campus. York resigned as chancellor that year and was replaced by John Semple Galbraith. The undergraduate program accepted its first class of 181 freshmen at Revelle College in 1964. Second College was founded in 1964, on the land deeded by the federal government, and named after environmentalist John Muir two years later. The School of Medicine also accepted its first students in 1966.

    Political theorist Herbert Marcuse joined the faculty in 1965. A champion of the New Left, he reportedly was the first protester to occupy the administration building in a demonstration organized by his student, political activist Angela Davis. The American Legion offered to buy out the remainder of Marcuse’s contract for $20,000; the Regents censured Chancellor William J. McGill for defending Marcuse on the basis of academic freedom, but further action was averted after local leaders expressed support for Marcuse. Further student unrest was felt at the university, as the United States increased its involvement in the Vietnam War during the mid-1960s, when a student raised a Viet Minh flag over the campus. Protests escalated as the war continued and were only exacerbated after the National Guard fired on student protesters at Kent State University in 1970. Over 200 students occupied Urey Hall, with one student setting himself on fire in protest of the war.

    Early research activity and faculty quality, notably in the sciences, was integral to shaping the focus and culture of the university. Even before UC San Diego had its own campus, faculty recruits had already made significant research breakthroughs, such as the Keeling Curve, a graph that plots rapidly increasing carbon dioxide levels in the atmosphere and was the first significant evidence for global climate change; the Kohn–Sham equations, used to investigate particular atoms and molecules in quantum chemistry; and the Miller–Urey experiment, which gave birth to the field of prebiotic chemistry.

    Engineering, particularly computer science, became an important part of the university’s academics as it matured. University researchers helped develop UCSD Pascal, an early machine-independent programming language that later heavily influenced Java; the National Science Foundation Network, a precursor to the Internet; and the Network News Transfer Protocol during the late 1970s to 1980s. In economics, the methods for analyzing economic time series with time-varying volatility (ARCH) and with common trends (cointegration) were developed. UC San Diego maintained its research-intense character after its founding, racking up 25 Nobel laureates affiliated within 50 years of history, a rate of five per decade.

    Under Richard C. Atkinson’s leadership as chancellor from 1980 to 1995, the university strengthened its ties with the city of San Diego by encouraging technology transfer with developing companies, transforming San Diego into a world leader in technology-based industries. He oversaw a rapid expansion of the School of Engineering, later renamed after Qualcomm founder Irwin M. Jacobs, with the construction of the San Diego Supercomputer Center(US) and establishment of the computer science, electrical engineering, and bioengineering departments. Private donations increased from $15 million to nearly $50 million annually, faculty expanded by nearly 50%, and enrollment doubled to about 18,000 students during his administration. By the end of his chancellorship, the quality of UC San Diego graduate programs was ranked 10th in the nation by the National Research Council.

    The university continued to undergo further expansion during the first decade of the new millennium with the establishment and construction of two new professional schools — the Skaggs School of Pharmacy and Rady School of Management—and the California Institute for Telecommunications and Information Technology, a research institute run jointly with University of California Irvine(US). UC San Diego also reached two financial milestones during this time, becoming the first university in the western region to raise over $1 billion in its eight-year fundraising campaign in 2007 and also obtaining an additional $1 billion through research contracts and grants in a single fiscal year for the first time in 2010. Despite this, due to the California budget crisis, the university loaned $40 million against its own assets in 2009 to offset a significant reduction in state educational appropriations. The salary of Pradeep Khosla, who became chancellor in 2012, has been the subject of controversy amidst continued budget cuts and tuition increases.

    On November 27, 2017, the university announced it would leave its longtime athletic home of the California Collegiate Athletic Association, an NCAA Division II league, to begin a transition to Division I in 2020 and join the Big West Conference, already home to four other UC campuses (Davis, Irvine, Riverside, Santa Barbara). The university began NCAA Division I competition on July 1, 2020, and the transition period runs through the 2023–24 school year.

    Research

    Applied Physics and Mathematics

    The Nature Index lists UC San Diego as 6th in the United States for research output by article count in 2019. In 2017, UC San Diego spent $1.13 billion on research, the 7th highest expenditure among academic institutions in the U.S. The university operates several organized research units, including the Center for Astrophysics and Space Sciences (CASS), the Center for Drug Discovery Innovation, and the Institute for Neural Computation. UC San Diego also maintains close ties to the nearby Scripps Research Institute(US) and Salk Institute for Biological Studies(US). In 1977, UC San Diego developed and released the UCSD Pascal programming language. The university was designated as one of the original national Alzheimer’s disease research centers in 1984 by the National Institute on Aging. In 2018, UC San Diego received $10.5 million from the DOE National Nuclear Security Administration(US) to establish the Center for Matter under Extreme Conditions (CMEC).

    The university founded the San Diego Supercomputer Center (SDSC) in 1985, which provides high performance computing for research in various scientific disciplines. In 2000, UC San Diego partnered with UC Irvine to create the Qualcomm Institute – UC San Diego(US), which integrates research in photonics, nanotechnology, and wireless telecommunication to develop solutions to problems in energy, health, and the environment.

    UC San Diego also operates the Scripps Institution of Oceanography (SIO)(US), one of the largest centers of research in earth science in the world, which predates the university itself. Together, SDSC and SIO, along with funding partner universities California Institute of Technology(US), San Diego State University(US), and UC Santa Barbara, manage the High Performance Wireless Research and Education Network.

     
  • richardmitnick 11:17 am on April 22, 2021 Permalink | Reply
    Tags: "Paris to Berlin in an hour by train? Here’s how it could happen", , , Maglev   

    From Horizon-The EU Research and Innovation Magazine : “Paris to Berlin in an hour by train? Here’s how it could happen” 

    1

    From Horizon-The EU Research and Innovation Magazine

    22 April 2021
    Tom Cassauwers

    1
    The hyperloop is ready for a breakthrough, and Zeleros is one of the concepts in the running. The Spanish start-up has created a unique technology thanks to its approach to higher-pressure tubes. Artist’s impression – Zeleros hyperloop.

    The hyperloop is what you get when you take a magnetic levitation train and put it into an airless tube. The lack of resistance allows the train, in theory, to achieve unseen speeds, a concept that is edging closer and closer to reality – and could provide a greener alternative to short-haul air travel.

    In November 2020, two people shot through an airless tube at 160 km/h in the desert outside of Las Vegas. This wasn’t a ride invented by a casino or theme park; it was the first crewed ride of a hyperloop by the company Virgin Hyperloop. The ride only lasted 15 seconds, and the speeds they achieved were a far cry from the 1200 km/h they promise they will one day reach, but it represented a step forward.

    The hyperloop might be the future of transportation for medium-length journeys. It could out-compete high-speed rail, and at the same time operate at speeds comparable to aviation, but at a fraction of its environmental and energy costs. It’s a concept which start-ups and researchers have eagerly adopted, including several teams across Europe.

    Open design

    The idea originated with the US entrepreneur Elon Musk, associated with companies like SpaceX and Tesla. After he mentioned it several times in public, a team of SpaceX and Tesla engineers released an open concept in 2013. This initial idea then spawned a range of companies and even student teams, trying to design their own versions. Among them were several students in the Spanish city of Valencia.

    ‘We started in 2015 after Elon Musk’s announcement, when we were still students’, said Juan Vicén Balaguer, co-founder and chief marketing officer of the hyperloop start-up Zeleros, which today employs more than 50 people and raised around €10 million in funding. ‘We’ve been working on this technology for five years, and it can be a real alternative mode of transportation.’

    Yet the idea behind the hyperloop is older than Elon Musk, and it’s similar to an earlier idea called a vactrain or vacuum tube train. A comparable concept was already proposed by 19th century author Michel Verne, son of Jules, and has since then been periodically brought up by science-fiction writers and technologists. Now, however, the hyperloop seems to be getting ready for a breakthrough, and Zeleros is one of the concepts in the running.

    2
    ‘You need to remove the air from the front of the vehicle. If not, the craft would stop. Which is why we use a compressor system at the front of the vehicle’, explained Juan Vicén Balaguer, co-founder and chief marketing officer of Zeleros. Artist’s impression – Zeleros hyperloop.

    Higher-pressure tube

    What makes their technology unique is their approach to the tube. ‘Each company uses a different level of pressure,’ said Vicén. ‘Some are going for space pressure levels. Which means that the atmosphere in the tube is similar to space. It contains almost zero air.’

    This state would allow for very fast speeds, since the train would face almost no friction. Yet it comes with a range of practical issues. It’s very difficult and expensive to achieve and maintain this level of pressure for long stretches of tube. Safety would also be an issue: if something happens to the hull of the train, passengers would be exposed to dangerous vacuum conditions.

    That’s why Zeleros is aiming for higher-pressure tubes. ‘It would be similar to the pressure seen in aviation,’ said Vicén. The pressure in the tubes proposed by Zeleros would extend to around 100 millibars. This, in turn, allows them to copy safety systems from aircraft, such as the oxygen masks that drop from overhead cabins. This design choice also makes their tubes cheaper to build, thereby reducing infrastructure costs. Yet it also means their trains face more air friction when they glide through the tube, which they have to compensate for in other ways.

    ‘You need to remove the air from the front of the vehicle,’ said Vicén. ‘If not, the craft would stop. Which is why we use a compressor system at the front of the vehicle. If there was zero pressure, we wouldn’t need this. But it’s a balance between economics and efficiency.’

    At the front of the train is a compressor, which looks like the front of an airliner engine and which sucks in air and lets it out at the rear, providing propulsion for the craft. A so-called linear motor is also located at key parts of the track, like the start, to give the train its initial propulsion. From there it self-propels along the track, with magnets at the top of the vehicle attracting it to the top of the tube and making it levitate. This proposed craft would carry between 50 and 200 passengers, and would reach up to 1000 km/h. By comparison, the cruising speed of a short-haul passenger aircraft is about 800 km/h.
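
    As a rough illustration of what those figures imply, here is a back-of-the-envelope Python sketch. The Paris-to-Berlin distance and the simple density-scaling drag estimate are assumptions made for illustration; they are not Zeleros engineering data.

```python
# Back-of-the-envelope sketch using the speeds quoted in the article.
distance_km = 880.0     # assumed Paris-to-Berlin great-circle distance
hyperloop_kmh = 1000.0  # hyperloop speed quoted above
aircraft_kmh = 800.0    # short-haul aircraft cruising speed quoted above

print(f"Hyperloop, cruise only: {distance_km / hyperloop_kmh * 60:.0f} minutes")
print(f"Aircraft, cruise only:  {distance_km / aircraft_kmh * 60:.0f} minutes")

# Aerodynamic drag at a given speed scales roughly with air density
# (F = 0.5 * rho * Cd * A * v**2), and density scales with pressure at a
# fixed temperature, so a 100 millibar tube leaves only about a tenth of
# sea-level drag for the compressor and motors to overcome.
rho_ratio = 100.0 / 1013.25  # 100 mbar tube pressure vs. standard sea-level pressure
print(f"Relative drag in the tube: ~{rho_ratio:.0%} of sea-level drag at the same speed")
```

    At the quoted cruise speed, the city pair in the headline works out to just under an hour of cruise time, before allowing for acceleration, braking and boarding.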

    Outcompete air

    But why do we need this in the first place? Shouldn’t we just invest more in our regular, high-speed trains? It’s more complicated than that, says Professor María Luisa Martínez Muneta from the Technical University of Madrid [Universidad Politécnica de Madrid] (ES), where she coordinates the HYPERNEX research project. HYPERNEX connects hyperloop start-ups, like Zeleros, with universities, railway companies and regulators, in order to accelerate the technology’s development in Europe.

    ‘Hyperloops face today’s greatest transportation demands: reduction of travel time and of environmental impact,’ said Prof. Martínez Muneta.

    Because of its limited speed – generally around 300-350 km/h – high-speed rail quickly becomes a bad choice for longer range travel if you want to get somewhere in a hurry. This gap is filled by short and medium-distance air travel, but aircraft emit a high volume of emissions compared to trains and are not always convenient, as airports can be located away from city centres.

    A hyperloop could solve the problem. ‘This mode of transport is focused on covering routes between 400 and 1500 kilometres,’ said Prof. Martínez Muneta. In this way a hyperloop would replace most shorter aeroplane travel, with much less of an environmental impact. ‘The hyperloop produces zero direct emissions as it is 100% electrical, while achieving higher speeds and therefore shorter travel times,’ she said.
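    To make the 400-1500 km niche concrete, here is a toy comparison using only the cruise speeds quoted in this article; it deliberately ignores acceleration, stops, check-in and airport transfer times, so real door-to-door times would differ:

```python
# Toy cruise-time comparison for the 400-1500 km range discussed above.
# Illustrative speeds only; acceleration, boarding and airport access
# times are not modelled.
SPEEDS_KMH = {"high-speed rail": 325, "short-haul aircraft": 800, "hyperloop": 1000}

for distance_km in (400, 800, 1500):
    summary = ", ".join(
        f"{mode}: {distance_km / speed * 60:.0f} min"
        for mode, speed in SPEEDS_KMH.items()
    )
    print(f"{distance_km} km -> {summary}")
```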

    3
    With a speed of 1000 km/h, the hyperloop could be a greener and faster alternative to air travel. Image credit – Horizon.

    Labs and regulation

    Bringing this vision into reality will likely take a decade. Vicén from Zeleros predicts that the first commercial passenger routes will come online around 2030, with hyperloops focused on cargo arriving a few years earlier, around 2025-2027.

    One key issue in this timeframe is regulation. ‘The European Union is the first region that has a committee that promotes regulation and standardisation of hyperloops,’ said Vicén, referring to the 2020 founding of a joint technical committee on hyperloops by the European Committee for Standardization and the European Committee for Electrotechnical Standardization.

    According to Zeleros, this is an important step if hyperloops are to become commercially viable. These craft would operate at hitherto unseen speeds, with new safety characteristics such as low-pressure tubes. That would in turn require new regulations and standards, for example on what to do if a capsule depressurised.

    4
    The pressure in the tubes proposed by Zeleros would be around 100 millibars, allowing the company to copy safety systems from aircraft, such as the oxygen masks that drop from overhead panels. Artist’s impression – Zeleros hyperloop.

    The technology also remains somewhat untested, although real-world experiments are becoming more frequent. Vicén says the company has already tested its technology in computer simulations, where it can model aerodynamic conditions and electromagnetic behaviour. It also uses physical demonstrators – laboratory prototypes used, for example, to test how its magnetic systems behave at high speeds.

    Nevertheless, they are eager to move from the lab to the field. They are planning to build a 3-km test track at a still-to-be-determined location in Spain, where they hope to demonstrate their technology by 2023, and they are working with the Port of Valencia to study the use of hyperloops for transporting freight.

    Hyperloops might still be a few years out, but we’ll likely see more of them in the future.

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 10:43 am on April 22, 2021 Permalink | Reply
    Tags: "Martin Rees and Frederick Lamb on humanity’s fate",   

    From EarthSky News: “Martin Rees and Frederick Lamb on humanity’s fate” 

    From EarthSky News

    April 22, 2021
    Kelly Kizer Whitt

    From nuclear weapons to biowarfare to cyberattacks, humanity has much to overcome. Martin Rees and Frederick Lamb discuss the obstacles we must face as we look forward to humanity’s future on Earth.

    1
    What is the fate of humanity – and other living creatures – on planet Earth? In the 21st century, we humans hold the key. Image via National Aeronautics and Space Administration (US).

    The American Physical Society (US) met virtually this week, and – on April 19, 2021, at a press gathering – the U.K.’s Astronomer Royal Martin Rees and astrophysicist Frederick Lamb discussed some of the serious obstacles humans must overcome, in order to move forward in this century. Rees commented:

    “Our Earth is 45 million centuries old. But this century is the first when one species – ours – can determine the biosphere’s fate.”

    Rees has been the Astronomer Royal since 1995 and has written a book titled On the Future: Prospects for Humanity. He is also a co-founder of the Centre for the Study of Existential Risk at the University of Cambridge (UK). Lamb is an astrophysicist at the University of Illinois (US) and a core faculty member in its Program on Arms Control, Disarmament and International Security.

    Rees said that, through technology, we could improve the state of the Earth, or – through inaction on issues such as climate change and biodiversity loss, among other things – we could end this century with Earth in a much-degraded state for habitation by living things. Two of the great threats we face, according to Rees, are biological and cyber threats. Natural pandemics now spread globally in a way they never did before, as we have seen with Covid-19. But human-engineered pandemics and misuses of cybertechnology would be even worse nightmares, Rees said. He commented:

    “What really scares me is … it’s possible to make a bioweapon or cause a cyberattack with minimal equipment, standard equipment available to many people. I see the biggest challenge in the next 20 years is to ensure that doesn’t happen.”

    Rees also describes himself as “an optimist.” He believes humans can avoid the risks and achieve a sustainable future, better than the world we live in today. He said:

    “If all of us passengers on ‘spaceship Earth’ want to ensure that we leave it in better shape for future generations we need to promote wise deployment of new technologies, while minimizing the risk of pandemics, cyberthreats, and other global catastrophes.”

    Meanwhile, Lamb focuses on a different set of obstacles facing humanity, in part because he is an expert in nuclear weapons proliferation and mitigation. He called it “crazy and bizarre” that we allow ourselves to live under a nuclear threat:

    “Why we would want to live in a world due to mishap, misunderstanding, mistake or madness, we could have the whole world destroyed, makes no sense. And we can do something about it.”

    Lamb said that while the United States and Russia have reduced their nuclear arsenals, some 14,000 weapons remain in the world, many of them ready to be launched at a moment’s notice. A few hundred could destroy a country, and a few thousand could destroy civilization on the planet. Missile defense has been the goal of many nations, but if just a single weapon gets through, a million people can die; to be effective, a missile defense would have to work perfectly, which is not feasible. Intercept has to occur quickly – in less than 170 seconds for solid-propellant missiles and less than 280 seconds for liquid-propellant ones – and the targeting systems cannot even reliably distinguish the warhead from the missile’s exhaust plume. In fact, the existence of missile defense systems has driven countries to produce more nuclear weapons in an effort to counteract those defenses. As Lamb put it:

    “Missile defense has never been able to protect against a nuclear attack, and there is no prospect that it could in the foreseeable future. Spending on missile defense has only made us less safe, by causing our potential adversaries to increase their nuclear arsenals.”

    What works? According to Lamb:

    “When we have become more safe, it’s when we’ve made agreements that reduce the number of nuclear weapons that threaten us. This has only been possible when we agreed to limit defenses.”

    But, Lamb asserted, there is one way in which nuclear technology is still important for humanity’s future, and that is energy production. Lamb – a proponent of nuclear energy, at least in the short term – believes we need to work toward making the technological advances that will allow us to stop using carbon-based fuels:

    “It’s a question of time. We have a limited amount of time to be able to stop the global warming before it becomes really serious … We’re going to, at least in the interim, find a way to use nuclear power to get us through this transition period.”

    Rees agrees with the need for sustainable energy production, and he also sees food as an important issue. With 9 billion people expected to be on the planet by mid-century, we need more efficient food production. Rees hopes we find ways of sustainably and intensively growing plants and vegetables so we don’t encroach on forest and land, and he believes development of artificial meat will be crucial to feeding future populations.

    Lamb and Rees discussed their backgrounds in astrophysics and their concern with the future of the planet. Lamb was interested in physics as a child, and his parents turned to his expertise when the Cold War heated up and everyone in the United States was told to build a bomb shelter as protection against a possible nuclear war. After studying the subject at the library, he said to his parents:

    “Mom and Dad, we have to talk. The only solution here is to prevent this from ever happening. If we have any money or time or energy that we would have spent digging a bomb shelter, we should spend it on trying to prevent it from happening.”

    Rees explained how his astronomy background meshes with his concern for humanity’s fate:

    “People often ask does being an astronomer have any effect on one’s attitude toward these things. I think it does in a way, because it makes us aware of the long-range future. We’re aware that it’s taken about 4 billion years for life to evolve from simple beginnings to our biosphere of which we are a part, but we also know that the sun is less than halfway through its life and the universe may go on forever. So we are not the culmination of evolution. Post-humans are going to have far longer to evolve. We can’t conceive what they’d be like, but if life is a rarity in the universe, then, of course, the stakes are very high if we snuff things out this century.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Deborah Byrd created the EarthSky radio series in 1991 and founded EarthSky.org in 1994. Today, she serves as Editor-in-Chief of this website. She has won a galaxy of awards from the broadcasting and science communities, including having the asteroid 3505 Byrd named in her honor. A science communicator and educator since 1976, Byrd believes in science as a force for good in the world and a vital tool for the 21st century. “Being an EarthSky editor is like hosting a big global party for cool nature-lovers,” she says.

     
  • richardmitnick 9:57 am on April 22, 2021 Permalink | Reply
    Tags: , NIST National Cybersecurity Center of Excellence (US)   

    From National Institute of Standards and Technology (US) and From NIST National Cybersecurity Center of Excellence (US) : “Securing the Industrial Internet of Things” 

    From National Institute of Standards and Technology (US)

    and

    From NIST National Cybersecurity Center of Excellence (US)

    1

    Current Status

    The NCCoE released for public comment a preliminary draft of Volumes A and B of NIST SP 1800-32, Securing the Industrial Internet of Things: Cybersecurity for Distributed Energy Resources. Implementation of the example solution at the NCCoE is ongoing. We are providing this preliminary draft to gather valuable feedback and inform stakeholders of the progress of the project. Organizations are encouraged to review the preliminary draft and provide feedback online or via email to energy_nccoe@nist.gov by May 24, 2021.

    SP 1800-32A: Executive Summary (PDF)
    SP 1800-32B: Approach, Architecture, and Security Characteristics (PDF)
    SP 1800-32C: How-To Guides (under development)

    Read the Securing Distributed Energy Resources one-page flyer to learn about the cybersecurity capabilities demonstrated in the project.

    Read the two-page fact sheet for a brief overview of this project.

    If you have questions or would like to join our Community of Interest, please email the project team at energy_nccoe@nist.gov.

    Summary

    The Industrial Internet of Things, or IIoT, refers to the application of instrumentation and connected sensors and other devices to machinery and vehicles in the transport, energy, and industrial sectors. In the energy sector, distributed energy resources (DERs), such as solar photovoltaics and wind turbines, introduce information exchanges between a utility’s distribution control system and the DERs to manage the flow of energy in the distribution grid. These information exchanges often employ IIoT technologies that may lack communications security. Additionally, the operating characteristics of DERs are dynamic and significantly different from those of traditional power generation capabilities. Timely management of DER capabilities often requires a higher degree of automation. Introduction of additional automation into DER management and control systems can also introduce cybersecurity risks. Managing the automation, the increased need for information exchanges, and the cybersecurity associated with these presents significant challenges.

    The National Cybersecurity Center of Excellence (NCCoE) is proposing a project that will focus on helping energy companies secure IIoT information exchanges of DERs in their operating environments. As an increasing number of DERs are connected to the grid, there is a need to examine the potential cybersecurity concerns that may arise from these interconnections.

    Our goal in this project is to document an approach for improving the overall security of IIoT in a DER environment that will address the following areas of interest:

    The information exchanges between and among DER systems and distribution facilities/entities, and the cybersecurity considerations involved in these interactions.
    The processes and cybersecurity technologies needed for trusted device identification and communication with other devices.
    The ability to provide malware prevention, detection, and mitigation in operating environments where information exchanges are occurring.
    The mechanisms that can be used for ensuring the integrity of command and operational data and the components that produce and receive this data (a minimal illustrative sketch follows this list).
    Data-driven cybersecurity analytics to help owners and operators securely perform necessary tasks.
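    As a concrete illustration of the command-integrity item flagged above, the sketch below is a minimal, hypothetical example – not the SP 1800-32 example solution, which is built from commercial products – of authenticating a DER command message with a shared-key HMAC so that the receiving device can detect tampering:

```python
# Minimal, hypothetical sketch of command-data integrity for a DER message.
# Not the SP 1800-32 example solution; it only illustrates the general idea
# of authenticating commands with a shared-key HMAC.
import hashlib
import hmac
import json

SHARED_KEY = b"provisioned-per-device-secret"  # placeholder; real deployments use managed keys

def sign_command(command: dict, key: bytes = SHARED_KEY) -> dict:
    payload = json.dumps(command, sort_keys=True).encode()
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"command": command, "hmac": tag}

def verify_command(message: dict, key: bytes = SHARED_KEY) -> bool:
    payload = json.dumps(message["command"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    # constant-time comparison avoids leaking information via timing
    return hmac.compare_digest(expected, message["hmac"])

msg = sign_command({"device": "inverter-07", "setpoint_kw": 42.5, "ts": 1619100000})
assert verify_command(msg)                     # untampered message verifies
msg["command"]["setpoint_kw"] = 999.0
assert not verify_command(msg)                 # tampering is detected
```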

    Collaborating Vendors

    Organizations participating in this project submitted their capabilities in response to an open call in the Federal Register for all sources of relevant security capabilities from academia and industry (vendors and integrators). The following respondents with relevant capabilities or product components (identified as “Technology Partners/Collaborators” herein) signed a Cooperative Research and Development Agreement to collaborate with NIST in a consortium to build this example solution.


    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The National Cybersecurity Center of Excellence (NCCoE) is a US government organization that builds and publicly shares solutions to cybersecurity problems faced by U.S. businesses. The center, located in Rockville, Maryland, was established in 2012 through a partnership with the National Institute of Standards and Technology (US), the State of Maryland, and Montgomery County. The center is partnered with nearly 20 market-leading IT companies, which contribute hardware, software and expertise.

    The NCCoE asks industry sector members about their cybersecurity problems, then selects issues that affect an entire sector or reach across sectors. The center forms a team of people from cybersecurity technology companies, other federal agencies and academia to address each problem. The teams work in the center’s labs to build example solutions using commercially available, off-the-shelf products. For each example solution, the NCCoE publishes a practice guide, a collection of the materials and information needed to deploy the example solution, and makes it available to the general public. The center’s goal is to “accelerate the deployment and use of secure technologies” that can help businesses improve their defenses against cyber attack.

    In September 2014, the National Institute of Standards and Technology (NIST) awarded a contract to the MITRE Corporation to operate the Department of Commerce’s first Federally Funded Research and Development Center (FFRDC), the National Cybersecurity FFRDC, which supports the NCCoE. According to the press release on the NIST website, “this FFRDC is the first solely dedicated to enhancing the security of the nation’s information systems.” The press release states that the FFRDC will help the NCCoE “expand and accelerate its public-private collaborations” and focus on “boosting the security of U.S. information systems.” “FFRDCs operate in the public interest and are required to be free from organizational conflicts of interest as well as bias toward any particular company, technology or product—key attributes given the NCCoE’s collaborative nature…The first three task orders under the contract will allow the NCCoE to expand its efforts in developing use cases and building blocks and provide operations management and facilities planning.”

    National Cybersecurity Excellence Partners (NCEPs) offer technology companies the opportunity to develop long-term relationships with the NCCoE and NIST. As core partners, NCEPs can provide hardware, software, or personnel who collaborate with the NCCoE on current projects.

    NIST Campus, Gaithersburg, MD, USA

    National Institute of Standards and Technology (US)‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress stated that, “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838, that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901, in response to a bill proposed by Congressman James H. Southard (R-Ohio), the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington DC (US) and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures the Bureau developed instruments for electrical units and for measurement of light. In 1905 a meeting was called that would be the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use; but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production even operating its own facility to produce optical glass when European supplies were cut off. Between the wars Harry Diamond of the Bureau developed a blind approach radio aircraft landing system. During World War II military research and development was carried out including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and shortly afterwards the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC: the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time, the Standards Western Automatic Computer was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954.

    Due to a changing mission, the “National Bureau of Standards” became the “National Institute of Standards and Technology (US)” in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.

    Organization

    NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

    Communications Technology Laboratory (CTL)
    Engineering Laboratory (EL)
    Information Technology Laboratory (ITL)
    Center for Neutron Research (NCNR)
    Material Measurement Laboratory (MML)
    Physical Measurement Laboratory (PML)

    Extramural programs include:

    Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
    Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
    Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

    NIST’s Boulder laboratories are best known for NIST‑F1, the cesium fountain atomic clock that serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium – which defines the second – NIST broadcasts time signals via longwave radio station WWVB near Fort Collins in Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha in Hawai’i, respectively.
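    NIST also disseminates its official time over the internet through the NIST Internet Time Service. As a minimal sketch (assuming the third-party ntplib package is installed), the following reads the current time from the public server time.nist.gov:

```python
# Minimal sketch: read the current official US time from NIST's public NTP
# service at time.nist.gov. Requires the third-party "ntplib" package
# (pip install ntplib); the radio broadcasts described above are a separate
# dissemination channel.
from datetime import datetime, timezone

import ntplib

client = ntplib.NTPClient()
response = client.request("time.nist.gov", version=3)
utc_time = datetime.fromtimestamp(response.tx_time, tz=timezone.utc)
print(f"NIST time (UTC): {utc_time.isoformat()}")
print(f"Estimated offset from the local clock: {response.offset:+.3f} s")
```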

    NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR). The NCNR provides scientists access to a variety of neutron scattering instruments which they use in many research fields (materials science; fuel cells; biotechnology etc.).

    The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation in continuous operation since 1961. SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

    The Center for Nanoscale Science and Technology (CNST) performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility. This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).

    Committees

    NIST has seven standing committees:

    Technical Guidelines Development Committee (TGDC)
    Advisory Committee on Earthquake Hazards Reduction (ACEHR)
    National Construction Safety Team Advisory Committee (NCST Advisory Committee)
    Information Security and Privacy Advisory Board (ISPAB)
    Visiting Committee on Advanced Technology (VCAT)
    Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
    Manufacturing Extension Partnership National Advisory Board (MEPNAB)

    Measurements and standards

    As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content, used as calibration standards for measuring equipment and procedures, quality control benchmarks for industrial processes, and experimental control samples.

    Handbook 44

    NIST publishes the Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through cooperation of the Committee on Specifications and Tolerances of the NCWM and the Weights and Measures Division (WMD) of the NIST. The purpose of the book is a partial fulfillment of the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

    NIST has been publishing various forms of what is now the Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the concept of the primary use of the SI (metric) measurements recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.

     