Tagged: NOAA

  • richardmitnick 11:46 am on December 19, 2019 Permalink | Reply
    Tags: Mississippi State University, NOAA, Orion Dell-EMC supercomputer

    From insideHPC: “Orion Supercomputer comes to Mississippi State University” 

    From insideHPC

    December 18, 2019
    Rich Brueckner

    Orion Dell EMC supercomputer

    Today Mississippi State University and the NOAA celebrated one of the country’s most powerful supercomputers with a ribbon-cutting ceremony for the Orion supercomputer, the fourth-fastest computer system in U.S. academia. Funded by NOAA and managed by MSU’s High Performance Computing Collaboratory, the Orion system is powering research and development advancements in weather and climate modeling, autonomous systems, materials, cybersecurity, computational modeling and more.


    With 3.66 petaflops of performance on the Linpack benchmark, Orion is the 60th most powerful supercomputer in the world according to Top500.org, which ranks the world’s most powerful non-distributed computer systems. It is housed in the Malcolm A. Portera High Performance Computing Center, located in MSU’s Thad Cochran Research, Technology and Economic Development Park.

    “Mississippi State has a long history of using advanced computing power to drive innovative research, making an impact in Mississippi and around the world,” said MSU President Mark E. Keenum. “We also have had many successful collaborations with NOAA in support of the agency’s vital work. I am grateful that NOAA has partnered with us to help meet its computing needs, and I look forward to seeing the many scientific advancements that will take place because of this world-class supercomputer.”

    NOAA has provided MSU with $22 million in grants to purchase, install and run Orion. The Dell-EMC system comprises 28 computer cabinets, each approximately the size of an industrial refrigerator, housing 72,000 processing cores and 350 terabytes of random access memory.
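As a rough sanity check (mine, not from the article, and assuming decimal units throughout), the quoted figures imply roughly 51 gigaflops and about 4.9 GB of memory per core:

```python
# Back-of-the-envelope figures for Orion, derived from the numbers quoted above.
# Assumes decimal (SI) units; the actual per-node configuration is not stated.
linpack_flops = 3.66e15      # 3.66 petaflops (Linpack performance)
cores = 72_000               # total processing cores
ram_bytes = 350e12           # 350 terabytes of RAM
cabinets = 28

print(f"~{linpack_flops / cores / 1e9:.1f} GFLOPS per core")   # ~50.8
print(f"~{ram_bytes / cores / 1e9:.2f} GB of RAM per core")    # ~4.86
print(f"~{cores / cabinets:.0f} cores per cabinet")            # ~2571
```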

    “We’re excited to support this powerhouse of computing capacity at Mississippi State,” said Craig McLean, NOAA assistant administrator for Oceanic and Atmospheric Research. “Orion joins NOAA’s network of computer centers around the country, and boosts NOAA’s ability to conduct innovative research to advance weather, climate and ocean forecasting products vital to protecting American lives and property.”

    MSU’s partnerships with NOAA include the university’s leadership of the Northern Gulf Institute, a consortium of six academic institutions that works with NOAA to address national strategic research and education goals in the Gulf of Mexico region. Additionally, MSU’s High Performance Computing Collaboratory provides the computing infrastructure for NOAA’s Exploration Command Center at the NASA Stennis Space Center. The state-of-the-art communications hub enables research scientists at sea and colleagues on shore to communicate in real time and view live video streams of undersea life.

    “NOAA has been an incredible partner in research with MSU, and this is the latest clear demonstration of the benefits of this partnership for both the university and the agency,” said MSU Provost and Executive Vice President David Shaw.

    Orion supports research operations for several MSU centers and institutes, such as the Center for Computational Sciences, Center for Cyber Innovation, Geosystems Research Institute, Center for Advanced Vehicular Systems, Institute for Genomics, Biocomputing and Biogeotechnology, the Northern Gulf Institute and the FAA Alliance for System Safety of UAS through Research Excellence (ASSURE). These centers use high-performance computing to model and simulate real-world phenomena, generating insights that would be impossible or prohibitively expensive to obtain otherwise.

    “With our faculty expertise and our computing capabilities, MSU is able to remain at the forefront of cutting-edge research areas,” said MSU Interim Vice President for Research and Economic Development Julie Jordan. “The Orion supercomputer is a great asset for the state of Mississippi as we work with state, federal and industry partners to solve complex problems and spur new innovations.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Founded on December 28, 2006, insideHPC is a blog that distills news and events in the world of HPC and presents them in bite-sized nuggets of helpfulness as a resource for supercomputing professionals. As one reader said, we’re sifting through all the news so you don’t have to!

    If you would like to contact me with suggestions, comments, corrections, errors or new company announcements, please send me an email at rich@insidehpc.com. Or you can send me mail at:

    2825 NW Upshur
    Suite G
    Portland, OR 97239

    Phone: (503) 877-5048

  • richardmitnick 8:00 am on December 18, 2019 Permalink | Reply
    Tags: "Earth's Magnetic North Pole Keeps Moving Towards Siberia at a Mysteriously Fast Pace", NOAA

    From NOAA via Science Alert: “Earth’s Magnetic North Pole Keeps Moving Towards Siberia at a Mysteriously Fast Pace” 

    From NOAA



    Science Alert

    18 DEC 2019


    Our planet is restless, and its poles are wandering. Of course, the geographic north pole is in the same place it always was, but its magnetic counterpart – indicated by the N on any compass – is roaming towards Siberia at record-breaking speeds that scientists don’t fully comprehend.

    It’s worth stating that while the pace is remarkable, the movement itself isn’t. The magnetic north pole is never truly stationary, owing to fluctuations in the flow of molten iron within the core of our planet, which affect how Earth’s magnetic field behaves.

    “Since its first formal discovery in 1831, the north magnetic pole has travelled around 1,400 miles (2,250 km),” NOAA’s National Centers for Environmental Information (NCEI) explains on its website.

    The latest version of the World Magnetic Model (WMM), one of the key tools developed to model the change in Earth’s magnetic field, has been released. Developed by NCEI and the British Geological Survey, with support from the Cooperative Institute for Research in Environmental Sciences (CIRES), the WMM is a representation of the planet’s magnetic field that gives compasses dependable accuracy.

    The WMM now includes “Blackout Zones” around the magnetic poles, defined by the strength of the horizontal field. Where the horizontal field strength is between 2000 and 6000 nanoteslas (nT), compasses enter the “Caution Zone” and may start to become prone to errors. The area around the pole where the horizontal field falls below 2000 nT is the “Unreliable Zone”, where compasses may become inaccurate.
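The zone thresholds above can be sketched as a tiny classifier. The zone names and nanotesla cutoffs come from the article; the function and its name are illustrative, not part of the actual WMM software:

```python
# Illustrative classifier for the WMM "Blackout Zone" thresholds described above.
# Zone names and cutoffs are from the article; the function itself is a sketch.
def compass_zone(horizontal_field_nt: float) -> str:
    """Classify compass reliability by horizontal field strength in nanoteslas."""
    if horizontal_field_nt < 0:
        raise ValueError("field strength cannot be negative")
    if horizontal_field_nt < 2000:
        return "Unreliable Zone"   # compasses may become inaccurate
    if horizontal_field_nt < 6000:
        return "Caution Zone"      # compasses prone to errors
    return "OK"

print(compass_zone(1500))   # near the pole -> Unreliable Zone
print(compass_zone(4000))   # -> Caution Zone
print(compass_zone(30000))  # mid-latitudes -> OK
```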

    Smartphone and consumer electronics companies rely on the WMM to provide consumers with accurate compass apps, maps, and GPS services. The WMM is also the standard navigation tool for the Federal Aviation Administration, U.S. Department of Defense, North Atlantic Treaty Organization (NATO), and more.

    A new and updated version of the WMM is released every five years. The latest WMM2020 model will extend to 2025.

    “This wandering has been generally quite slow, allowing scientists to keep track of its position fairly easily.”

    That slow wander has quickened of late. In recent decades, the magnetic north pole accelerated to an average speed of 55 kilometres (34 miles) per year.

    The most recent data suggest its movement towards Russia may have slowed down to about 40 kilometres (25 miles) annually, but even so, compared to theoretical measurements going back hundreds of years, this is a phenomenon scientists have never witnessed before.

    “The movement since the 1990s is much faster than at any time for at least four centuries,” geomagnetic specialist Ciaran Beggan from the British Geological Survey (BGS) told the Financial Times.

    “We really don’t know much about the changes in the core that’s driving it.”

    While researchers can’t fully explain the core fluctuations affecting the north pole’s extreme restlessness, they can map Earth’s magnetic field and calculate its rate of change over time, which helps us to predict how it may be distributed in the future.

    That system is the World Magnetic Model described above: a representation of the field that powers everything from navigational tools like GPS to mapping services and consumer compass apps, not to mention systems used by NASA, the FAA, and the military, among other institutions.

    Despite its importance, the WMM’s powers of foresight – like the magnetic north pole itself – are not set in stone, and the model needs to be updated every five years to stay accurate.

    “Provided that suitable satellite magnetic observations are available, the prediction of the WMM is highly accurate on its release date and then subsequently deteriorates towards the end of the five-year epoch, when it has to be updated with revised values of the model coefficients,” the NCEI explains.

    That’s the point we’re up to now, with the bodies that maintain the WMM – the NCEI and the BGS – having finally updated the model last week.

    The refresh actually comes a whole year ahead of schedule due to the unusual speed with which the magnetic north pole has been drifting, meaning that the WMM’s predictions have deteriorated faster than usual this cycle, despite the recent slowdown.

    While the speed fluctuations may seem dramatic, this is actually a more moderate range of pole movement than has occurred in Earth’s history: when the magnetic poles move far enough out of position, they can actually flip, something that happens every few hundred thousand years.

    In the meantime, the new WMM data is good until 2025, and rest assured, no imminent flipping is predicted for now.

    See the full article here.



    NOAA is an agency that enriches life through science. Our reach goes from the surface of the sun to the depths of the ocean floor as we work to keep the public informed of the changing environment around them.

    From daily weather forecasts, severe storm warnings, and climate monitoring to fisheries management, coastal restoration and supporting marine commerce, NOAA’s products and services support economic vitality and affect more than one-third of America’s gross domestic product. NOAA’s dedicated scientists use cutting-edge research and high-tech instrumentation to provide citizens, planners, emergency managers and other decision makers with reliable information they need when they need it.

  • richardmitnick 11:17 am on September 28, 2019 Permalink | Reply
    Tags: NOAA

    From Rutgers University: “Rutgers-led projects awarded $16 million in NOAA Sea Grants” 


    From Rutgers University

    September 26, 2019

    Rutgers scientists are serving as principal investigators of three projects.

    (Photo Credit: Aileen Devlin, Virginia Sea Grant)

    NOAA Sea Grant announced $16 million in federal funding awards to support 42 research projects and collaborative programs aimed at advancing sustainable aquaculture in the United States. Rutgers scientists are among those serving as principal investigators of three of the 42 projects.

    Rutgers scientists David Bushek, professor and director of Rutgers Haskin Shellfish Research Laboratory (HSRL), and Michael De Luca, director of the Rutgers Aquaculture Innovation Center, are the lead principal investigators on two projects, while Ximing Guo, distinguished professor and shellfish geneticist at HSRL, is a co-principal investigator on a third project.

    The economic benefit of NOAA Sea Grant’s investment in aquaculture in 2018 was $65 million, including sustaining or creating 841 jobs and 345 businesses. In 2019, Sea Grant employed or partially funded 111 professionals working on aquaculture around the country to study, communicate, identify needs or transfer research to industry members and the public.

    Director of the National Sea Grant College Program, Jonathan Pennock, said, “With our 2019 investments, we are building on investments by Sea Grant and NOAA over the last few years to fill critical gaps in information and strengthen connectivity of science to industry. These investments will help advance U.S. aquaculture in sustainable, thoughtful ways using the best science and talent across the country.”

    The GIS Based Tool for Spatial Planning and Management of Shellfish Aquaculture in New Jersey, funded at $249,989, is led by Michael De Luca, with co-principal investigators Lisa Calvo, shellfish aquaculture program coordinator at HSRL, Jeanne Herb (CC ’81), executive director, Environmental Analysis and Communications Group at the Bloustein School of Planning and Public Policy, and Lucas Marxen, associate director of the Office of Research Analytics, NJAES. Their project aims to “develop an interactive shellfish aquaculture siting tool for New Jersey and use it to model different scenarios for the expansion of aquaculture in the state and to gather stakeholder input for valuation of critical areas. The continued vitality and potential for growth of shellfish aquaculture in New Jersey will be dependent on a comprehensive spatial planning effort that examines the existing footprint and seeks smart growth that is compatible with other valued uses of state coastal waters.”

    Atlantic and Gulf Shellfish Seed Biosecurity Collaborative, funded at $1,172,732, is led by David Bushek, with co-principal investigators Ryan Carnegie, William & Mary’s Virginia Institute of Marine Science, Peter M. Rowe, New Jersey Sea Grant Consortium, and William Walton, Auburn University. “Aquaculture continues to expand rapidly along the East Coast and is beginning to grow along the Gulf Coast with increasing requests for transfers of shellfish that can potentially spread disease, harming the industry and natural resources. Regulators need access to an understanding of disease dynamics and distributions in order to appropriately evaluate risk and gain confidence in justifying transfer decisions one way or the other. Similarly, industry members, including hatcheries, nurseries and farms, must understand the basics of disease risks to protect themselves, neighboring farms, and wild stocks as they seek to source seed for their needs. Developing this understanding and transferring the knowledge to employ these tools is a primary role of extension and represents an important element of this project.”

    QPX distribution and persistence in the environment and East Coast Hard Clam Selective Breeding Collaborative, funded at $1,200,000, is led by Bassem Allam of New York Sea Grant, with co-principal investigators Emmanuelle Pales Espinosa, Stony Brook University; Ximing Guo, Rutgers University; Kimberly Reece, Virginia Institute of Marine Science; and Antoinette Clemetson, New York Sea Grant. “This collaborative proposal builds on ongoing cooperation and new partnerships among Sea Grant programs, scientists and extension teams in five Atlantic states to develop a hard clam selective breeding program for the benefit of clam farmers throughout the region. […] The overarching aim of this proposal is to accelerate and enhance the selective breeding of the hard clam and develop superior clam stocks for aquaculture operations for all growing regions along the Atlantic seaboard.”

    See the full article here.




    Rutgers, The State University of New Jersey, is a leading national research university and the state’s preeminent, comprehensive public institution of higher education. Rutgers is dedicated to teaching that meets the highest standards of excellence; to conducting research that breaks new ground; and to providing services, solutions, and clinical care that help individuals and the local, national, and global communities where they live.

    Founded in 1766, Rutgers teaches across the full educational spectrum: preschool to precollege; undergraduate to graduate; postdoctoral fellowships to residencies; and continuing education for professional and personal advancement.

    As a ’67 graduate of University College, second in my class, I am proud to be a member of Alpha Sigma Lambda, the National Honor Society for non-traditional students.

  • richardmitnick 10:09 am on August 19, 2019 Permalink | Reply
    Tags: "Ocean warming has fisheries on the move helping some but hurting more", NOAA

    From The Conversation: “Ocean warming has fisheries on the move, helping some but hurting more” 

    From The Conversation

    August 19, 2019
    Chris Free, UCSB

    Atlantic Cod on Ice. Alamy. Cod fisheries in the North Sea and Irish Sea are declining due to overfishing and climate change.

    Climate change has been steadily warming the ocean, which absorbs most of the heat trapped by greenhouse gases in the atmosphere, for 100 years. This warming is altering marine ecosystems and having a direct impact on fish populations. About half of the world’s population relies on fish as a vital source of protein, and the fishing industry employs more than 56 million people worldwide.

    My recent study [Science] with colleagues from Rutgers University and the U.S. National Oceanic and Atmospheric Administration found that ocean warming has already impacted global fish populations. We found that some populations benefited from warming, but more of them suffered.


    Overall, ocean warming reduced catch potential – the greatest amount of fish that can be caught year after year – by a net 4% over the past 80 years. In some regions, the effects of warming have been much larger. The North Sea, which has large commercial fisheries, and the seas of East Asia, which support some of the fastest-growing human populations, experienced losses of 15% to 35%.

    The reddish and brown circles represent fish populations whose maximum sustainable yields have dropped as the ocean has warmed. The darkest tones represent extremes of 35 percent. Blueish colors represent fish yields that increased in warmer waters. Chris Free, CC BY-ND

    Although ocean warming has already challenged the ability of ocean fisheries to provide food and income, swift reductions in greenhouse gas emissions and reforms to fisheries management could lessen many of the negative impacts of continued warming.

    How and why does ocean warming affect fish?

    My collaborators and I like to say that fish are like Goldilocks: They don’t want their water too hot or too cold, but just right.

    Put another way, most fish species have evolved narrow temperature tolerances. Supporting the cellular machinery necessary to tolerate wider temperatures demands a lot of energy. This evolutionary strategy saves energy when temperatures are “just right,” but it becomes a problem when fish find themselves in warming water. As their bodies begin to fail, they must divert energy from searching for food or avoiding predators to maintaining basic bodily functions and searching for cooler waters.

    Thus, as the oceans warm, fish move to track their preferred temperatures. Most fish are moving poleward or into deeper waters. For some species, warming expands their ranges. In other cases it contracts their ranges by reducing the amount of ocean they can thermally tolerate. These shifts change where fish go, their abundance and their catch potential.

    Warming can also modify the availability of key prey species. For example, if warming causes zooplankton – small invertebrates at the bottom of the ocean food web – to bloom early, they may not be available when juvenile fish need them most. Alternatively, warming can sometimes enhance the strength of zooplankton blooms, thereby increasing the productivity of juvenile fish.

    Understanding how the complex impacts of warming on fish populations balance out is crucial for projecting how climate change could affect the ocean’s potential to provide food and income for people.


    Impacts of historical warming on marine fisheries

    Sustainable fisheries are like healthy bank accounts. If people live off the interest and don’t overly deplete the principal, both people and the bank thrive. If a fish population is overfished, the population’s “principal” shrinks too much to generate high long-term yields.

    Similarly, stresses on fish populations from environmental change can reduce population growth rates, much as an interest rate reduction reduces the growth rate of savings in a bank account.

    In our study we combined maps of historical ocean temperatures with estimates of historical fish abundance and exploitation. This allowed us to assess how warming has affected those interest rates and returns from the global fisheries bank account.
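The bank-account analogy can be made concrete with a toy logistic (surplus-production) model, for which the maximum sustainable yield is r·K/4. The numbers and the `msy` helper below are purely illustrative and are not taken from the study:

```python
# Toy logistic surplus-production model illustrating the analogy above.
# For logistic growth with intrinsic rate r and carrying capacity K,
# the maximum sustainable yield (the "interest") is MSY = r * K / 4.
def msy(r: float, K: float) -> float:
    """Maximum sustainable yield for logistic population growth."""
    return r * K / 4

K = 1_000_000        # carrying capacity in tonnes (hypothetical)
r_baseline = 0.40    # intrinsic growth rate under favorable conditions
r_warmed = 0.36      # a 10% drop in growth rate under warming stress

loss = 1 - msy(r_warmed, K) / msy(r_baseline, K)
print(f"MSY falls by {loss:.0%}")  # a 10% cut in r cuts the sustainable catch by 10%
```

The point of the sketch: any stress that trims the population growth rate trims the long-term sustainable catch by the same proportion, just as a lower interest rate shrinks what you can withdraw from a bank account without touching the principal.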

    Losers outweigh winners

    We found that warming has damaged some fisheries and benefited others. The losers outweighed the winners, resulting in a net 4% decline in sustainable catch potential over the last 80 years. This represents a cumulative loss of 1.4 million metric tons previously available for food and income.

    Some regions have been hit especially hard. The North Sea, with large commercial fisheries for species like Atlantic cod, haddock and herring, has experienced a 35% loss in sustainable catch potential since 1930. The waters of East Asia, neighbored by some of the fastest-growing human populations in the world, saw losses of 8% to 35% across three seas.

    Other species and regions benefited from warming. Black sea bass, a popular species among recreational anglers on the U.S. East Coast, expanded its range and catch potential as waters previously too cool for it warmed. In the Baltic Sea, juvenile herring and sprat – another small herring-like fish – have more food available to them in warm years than in cool years, and have also benefited from warming. However, these climate winners can tolerate only so much warming, and may see declines as temperatures continue to rise.

    Shucking scallops in Maine, where fishery management has kept scallop numbers sustainable. Robert F. Bukaty/AP

    Management boosts fishes’ resilience

    Our work suggests three encouraging pieces of news for fish populations.

    First, well-managed fisheries, such as Atlantic scallops on the U.S. East Coast, were among the most resilient to warming. Others with a history of overfishing, such as Atlantic cod in the Irish and North seas, were among the most vulnerable. These findings suggest that preventing overfishing and rebuilding overfished populations will enhance resilience and maximize long-term food and income potential.

    Second, new research suggests that swift climate-adaptive management reforms can make it possible for fish to feed humans and generate income into the future. This will require scientific agencies to work with the fishing industry on new methods for assessing fish populations’ health, to set catch limits that account for the effects of climate change, and to establish new international institutions to ensure that management remains strong as fish migrate poleward from one nation’s waters into another’s. These institutions would be similar to the multinational organizations that manage tuna, swordfish and marlin today.

    Finally, nations will have to aggressively curb greenhouse gas emissions. Even the best fishery management reforms will be unable to compensate for the 4 degree Celsius ocean temperature increase that scientists project will occur by the end of this century if greenhouse gas emissions are not reduced.

    See the full article here.



    The Conversation launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors work with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues. And hopefully allow for a better quality of public discourse and conversation.

  • richardmitnick 8:35 am on November 18, 2017 Permalink | Reply
    Tags: JPSS-1 will be renamed NOAA-20 when it reaches its final orbit, NOAA, Observations of atmospheric temperature and moisture clouds sea-surface temperature ocean color sea ice cover volcanic ash and fire detection, The data will improve weather forecasting such as predicting a hurricane’s track

    From NASA: “NASA Launches NOAA Weather Satellite Aboard United Launch Alliance Rocket to Improve Forecasts” 


    Nov. 18, 2017

    Steve Cole
    Headquarters, Washington

    John Leslie
    NOAA, Silver Spring, Md.

    NOAA Joint Polar Satellite System (JPSS)

    NASA has successfully launched, on behalf of the National Oceanic and Atmospheric Administration (NOAA), the first in a series of four highly advanced polar-orbiting satellites, equipped with next-generation technology and designed to improve the accuracy of U.S. weather forecasts out to seven days.

    The Joint Polar Satellite System-1 (JPSS-1) lifted off on a United Launch Alliance Delta II rocket from Vandenberg Air Force Base, California, at 1:47 a.m. PST Saturday.

    Approximately 63 minutes after launch the solar arrays on JPSS-1 deployed and the spacecraft was operating on its own power. JPSS-1 will be renamed NOAA-20 when it reaches its final orbit. Following a three-month checkout and validation of its five advanced instruments, the satellite will become operational.

    “Launching JPSS-1 underscores NOAA’s commitment to putting the best possible satellites into orbit, giving our forecasters — and the public — greater confidence in weather forecasts up to seven days in advance, including the potential for severe, or impactful weather,” said Stephen Volz, director of NOAA’s Satellite and Information Service.

    JPSS-1 will join the joint NOAA/NASA Suomi National Polar-orbiting Partnership satellite in the same orbit and provide meteorologists with observations of atmospheric temperature and moisture, clouds, sea-surface temperature, ocean color, sea ice cover, volcanic ash, and fire detection. The data will improve weather forecasting, such as predicting a hurricane’s track, and will help agencies involved with post-storm recovery by visualizing storm damage and the geographic extent of power outages.

    “Emergency managers increasingly rely on our forecasts to make critical decisions and take appropriate action before a storm hits,” said Louis W. Uccellini, director of NOAA’s National Weather Service. “Polar satellite observations not only help us monitor and collect information about current weather systems, but they provide data to feed into our weather forecast models.”

    JPSS-1 has five instruments, each of which is significantly upgraded from the instruments on NOAA’s previous polar-orbiting satellites. The more-detailed observations from JPSS will allow forecasters to make more accurate predictions. JPSS-1 data will also improve recognition of climate patterns that influence the weather, such as El Nino and La Nina.

    The JPSS program is a partnership between NOAA and NASA through which they will oversee the development, launch, testing and operation of all the satellites in the series. NOAA funds and manages the program, operations and data products. NASA develops and builds the instruments, spacecraft and ground system and launches the satellites for NOAA. JPSS-1 launch management was provided by NASA’s Launch Services Program based at the agency’s Kennedy Space Center in Florida.

    “Today’s launch is the latest example of the strong relationship between NASA and NOAA, contributing to the advancement of scientific discovery and the improvement of the U.S. weather forecasting capability by leveraging the unique vantage point of space to benefit and protect humankind,” said Sandra Smalley, director of NASA’s Joint Agency Satellite Division.

    Ball Aerospace designed and built the JPSS-1 satellite bus and Ozone Mapping and Profiler Suite instrument, integrated all five of the spacecraft’s instruments and performed satellite-level testing and launch support. Raytheon Corporation built the Visible Infrared Imaging Radiometer Suite and the Common Ground System. Harris Corporation built the Cross-track Infrared Sounder. Northrop Grumman Aerospace Systems built the Advanced Technology Microwave Sounder and the Clouds and the Earth’s Radiant Energy System instrument.

    To learn more about the JPSS-1 mission, visit:




    See the full article here.


    The National Aeronautics and Space Administration (NASA) is the agency of the United States government that is responsible for the nation’s civilian space program and for aeronautics and aerospace research.

    President Dwight D. Eisenhower established the National Aeronautics and Space Administration (NASA) in 1958 with a distinctly civilian (rather than military) orientation encouraging peaceful applications in space science. The National Aeronautics and Space Act was passed on July 29, 1958, disestablishing NASA’s predecessor, the National Advisory Committee for Aeronautics (NACA). The new agency became operational on October 1, 1958.

    Since that time, most U.S. space exploration efforts have been led by NASA, including the Apollo moon-landing missions, the Skylab space station, and later the Space Shuttle. Currently, NASA is supporting the International Space Station and is overseeing the development of the Orion Multi-Purpose Crew Vehicle and Commercial Crew vehicles. The agency is also responsible for the Launch Services Program (LSP) which provides oversight of launch operations and countdown management for unmanned NASA launches. Most recently, NASA announced a new Space Launch System that it said would take the agency’s astronauts farther into space than ever before and lay the cornerstone for future human space exploration efforts by the U.S.

    NASA science is focused on better understanding Earth through the Earth Observing System, advancing heliophysics through the efforts of the Science Mission Directorate’s Heliophysics Research Program, exploring bodies throughout the Solar System with advanced robotic missions such as New Horizons, and researching astrophysics topics, such as the Big Bang, through the Great Observatories (Hubble, Chandra, Spitzer) and associated programs. NASA shares data with various national and international organizations, such as JAXA’s Greenhouse Gases Observing Satellite.

  • richardmitnick 5:20 am on August 17, 2017 Permalink | Reply
    Tags: NASA's Global Hawk autonomous aircraft, NASA-led Mission Studies Storm Intensification, NOAA

    From JPL: “NASA-led Mission Studies Storm Intensification” 



    August 16, 2017
    Alan Buis
    Jet Propulsion Laboratory, Pasadena, California

    Kate Squires
    NASA Armstrong Flight Research Center


    NASA’s Global Hawk being prepared at Armstrong to monitor and take scientific measurements of Hurricane Matthew in 2016. Credits: NASA Photo/Lauren Hughes.

    A group of NASA and National Oceanic and Atmospheric Administration (NOAA) scientists, including scientists from NASA’s Jet Propulsion Laboratory, Pasadena, California, is teaming up this month for an airborne mission focused on studying severe storm processes and intensification. The Hands-On Project Experience (HOPE) Eastern Pacific Origins and Characteristics of Hurricanes (EPOCH) field campaign will use NASA’s Global Hawk autonomous aircraft to study storms in the Northern Hemisphere to learn more about how storms intensify as they brew out over the ocean.

    The scope of the mission initially focused only on the East Pacific region, but was expanded to both the Gulf and Atlantic regions to give the science team broader opportunities for data collection.

    “Our key point of interest is still the Eastern Pacific, but if the team saw something developing off the East Coast that may have high impact to coastal communities, we would definitely recalibrate to send the aircraft to that area,” said Amber Emory, NASA’s principal investigator.

    Having a better understanding of storm intensification is an important goal of HOPE EPOCH. The data will help improve models that predict storm impact to coastal regions, where property damage and threat to human life can be high.

    NASA has led the campaign through integration of the HOPE EPOCH science payload onto the Global Hawk platform and maintained operational oversight for the six planned mission flights. NOAA’s role will be to incorporate data from dropsondes — devices dropped from aircraft to measure storm conditions — into NOAA National Weather Service operational models to improve storm track and intensity forecasts that will be provided to the public. NOAA first used the Global Hawk to study Hurricane Gaston in 2016.

    With the Global Hawk flying at altitudes of 60,000 feet (18,300 meters), the team will conduct six 24-hour-long flights, three of which are being supported and funded through a partnership with NOAA’s Unmanned Aircraft Systems program.

    NASA’s autonomous Global Hawk is operated from NASA’s Armstrong Flight Research Center at Edwards Air Force Base in California and was developed for the U.S. Air Force by Northrop Grumman. It is ideally suited for high-altitude, long-duration Earth science flights.

    The ability of the Global Hawk to autonomously fly long distances, remain aloft for extended periods of time and carry large payloads brings a new capability to the science community for measuring, monitoring and observing remote locations of Earth not feasible or practical with piloted aircraft or space satellites.

    The science payload consists of a variety of instruments that will measure different aspects of storm systems, including wind velocity, pressure, temperature, humidity, cloud moisture content and the overall structure of the storm system.

    Many of the science instruments have flown previously on the Global Hawk, including the High-Altitude MMIC Sounding Radiometer (HAMSR), a microwave sounder instrument that takes vertical profiles of temperature and humidity; and the Airborne Vertical Atmospheric Profiling System (AVAPS) dropsondes, which are released from the aircraft to profile temperature, humidity, pressure, wind speed and direction.

    New to the science payload is the ER-2 X-band Doppler Radar (EXRAD) instrument, which observes the vertical velocity of a storm system. EXRAD has one conically scanning beam as well as one nadir beam, which looks straight down beneath the aircraft, allowing researchers to retrieve vertical velocities directly below the flight path.
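    The nadir-beam retrieval rests on the standard two-way Doppler relation, v = f_d · λ / 2 (the factor of two accounts for the round trip of the pulse). A minimal sketch, assuming an illustrative X-band operating frequency of 9.6 GHz rather than a published EXRAD specification:

```python
# Standard two-way Doppler relation: radial velocity v = f_d * wavelength / 2.
# The 9.6 GHz (X-band) operating frequency is an illustrative assumption,
# not a published EXRAD instrument spec.
C = 2.998e8  # speed of light, m/s

def radial_velocity(doppler_shift_hz, radar_freq_hz=9.6e9):
    """Radial (e.g. vertical, for a nadir beam) velocity from a Doppler shift."""
    wavelength = C / radar_freq_hz
    return doppler_shift_hz * wavelength / 2.0

# A 640 Hz shift at X band corresponds to roughly 10 m/s toward the radar.
print(round(radial_velocity(640.0), 1))
```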

    The EXRAD instrument is managed and operated by NASA’s Goddard Space Flight Center in Greenbelt, Maryland; and the HAMSR instrument is managed by JPL. The National Center for Atmospheric Research developed the AVAPS dropsonde system, and the NOAA team will manage and operate the system for the HOPE EPOCH mission.

    Besides the scientific value that the HOPE EPOCH mission brings, the campaign also provides a unique opportunity for early-career scientists and project managers to gain professional development.

    HOPE is a cooperative workforce development program sponsored by the Academy of Program/Project & Engineering Leadership (APPEL) program and NASA’s Science Mission Directorate. The HOPE Training Program provides an opportunity for a team of early-entry NASA employees to propose, design, develop, build and launch a suborbital flight project over the course of 18 months. This opportunity enables participants to gain the knowledge and skills necessary to manage NASA’s future flight projects.

    Emory started as a NASA Pathways Intern in 2009. The HOPE EPOCH mission is particularly exciting for her, as some of her first science projects at NASA began with the Global Hawk program.

    The NASA Global Hawk had its first flights during the 2010 Genesis and Rapid Intensification Processes (GRIP) campaign. Incidentally, the first EPOCH science flight targeted Tropical Storm Franklin as it emerged from the Yucatan peninsula into the Gulf of Campeche along a track almost identical to that of Hurricane Karl in 2010, which was targeted during GRIP and where Emory played an important role.

    “It’s exciting to work with people who are so committed to making the mission successful,” Emory said. “Every mission has its own set of challenges, but when people come to the table with new ideas on how to solve those challenges, it makes for a very rewarding experience and we end up learning a lot from one another.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA JPL Campus

    Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge [1], on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.

    Caltech Logo

    NASA image

  • richardmitnick 7:38 am on March 30, 2017 Permalink | Reply
    Tags: , , NOAA,   

    From U Washington: “Tackling resilience: Finding order in chaos to help buffer against climate change” 

    U Washington

    University of Washington

    March 29, 2017
    Michelle Ma

    Lotus flowers on a delta island on the outer reaches of the Mississippi delta, which is in danger of drastically shrinking or disappearing. The islands are actually quite resilient, as seen in part by the vegetation growth. Britta Timpane-Padgham/NWFSC

    “Resilience” is a buzzword often used in scientific literature to describe how animals, plants and landscapes can persist under climate change. It’s typically considered a good quality, suggesting that those with resilience can withstand or adapt as the climate continues to change.

    But when it comes to actually figuring out what makes a species or an entire ecosystem resilient ― and how to promote that through restoration or management ― there is a lack of consensus in the scientific community.

    A new paper by the University of Washington and NOAA’s Northwest Fisheries Science Center aims to provide clarity among scientists, resource managers and planners on what ecological resilience means and how it can be achieved. The study, published this month in the journal PLOS ONE, is the first to examine the topic in the context of ecological restoration and identify ways that resilience can be measured and achieved at different scales.

    “I was really interested in translating a broad concept like resilience into management or restoration actions,” said lead author Britta Timpane-Padgham, a fisheries biologist at Northwest Fisheries Science Center who completed the study as part of her graduate degree in marine and environmental affairs at the UW.

    “I wanted to do something that addressed impacts of climate change and connected the science with management and restoration efforts.”

    Timpane-Padgham scoured the scientific literature for all mentions of ecological resilience, then pared down the list of relevant articles to 170 examined for this study. She then identified in each paper the common attributes, or metrics, that contribute to resilience among species, populations or ecosystems. For example, genetic diversity and population density were commonly mentioned in the literature as attributes that help populations either recover from or resist disturbance.

    Timpane-Padgham along with co-authors Terrie Klinger, professor and director of the UW’s School of Marine and Environmental Affairs, and Tim Beechie, research biologist at Northwest Fisheries Science Center, grouped the various resilience attributes into five large categories, based on whether they affected individual plants or animals; whole populations; entire communities of plants and animals; ecosystems; or ecological processes. They then listed how many times each attribute was cited, which is one indicator of how well-suited a particular attribute is for measuring resilience.
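    The tallying step described above can be sketched as a simple count-and-group pass. The attribute names below follow the study’s examples, but the records and counts are hypothetical, not the actual 170-article dataset:

```python
from collections import Counter, defaultdict

# Hypothetical (attribute, category) mentions pulled from reviewed papers;
# illustrative only.
mentions = [
    ("genetic diversity", "population"),
    ("population density", "population"),
    ("genetic diversity", "population"),
    ("species richness", "community"),
    ("connectivity", "ecosystem"),
]

# How often each attribute is cited -- one indicator of how well-suited it
# is for measuring resilience.
counts = Counter(attr for attr, _ in mentions)

# Group attributes into the five-scale categories used by the study.
by_category = defaultdict(set)
for attr, cat in mentions:
    by_category[cat].add(attr)

ranked = counts.most_common()
print(ranked[0])  # the most frequently cited attribute
```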

    The Kissimmee River in central Florida. This ecosystem-scale restoration project began two decades ago and is used as an example in the study. South Florida Water Management District

    “It’s a very nice way of organizing what was sort of a confused body of literature,” Beechie said. “It will at least allow people to get their heads around resilience and understand what it really is and what things you can actually measure.”

    The researchers say this work could be useful for people who manage ecosystem restoration projects and want to improve the chances of success under climate change. They could pick from the ordered list of attributes that relate specifically to their project and begin incorporating tactics that promote resilience from the start.

    “Specifying resilience attributes that are appropriate for the system and that can be measured repeatably will help move resilience from concept to practice,” Klinger said.

    For example, with Puget Sound salmon recovery, managers are asking how climate change will alter various rivers’ temperatures, flow levels and nutrient content. Because salmon recovery includes individual species, entire populations and the surrounding ecosystem, many resilience attributes are being used to monitor the status of the fish and recovery of the river ecosystems that support them.

    The list of attributes that track resilience can be downloaded and sorted by managers to find the most relevant measures for the type of restoration project they are tackling. It is increasingly common to account for climate change in project plans, the researchers said, but more foresight and planning at the start of a project is crucial.

    “The threat of climate change and its impacts is a considerable issue that should be looked at from the beginning of a restoration project. It needs to be its own planning objective,” Timpane-Padgham said. “With this paper, I don’t want to have something that will be published and collect dust. It’s about providing something that will be useful for people.”

    No external funding was used for this study.

    Download the spreadsheet to find the best resilience measures for your project (click on the second file in the carousel titled Interactive decision support table)

    See the full article here.

    The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and, in the process, transform lives and our world.

    So what defines us — the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

  • richardmitnick 4:16 pm on February 28, 2017 Permalink | Reply
    Tags: GOES-16 SUVI instrument, , NOAA, NOAA’s GOES-16 satellite,   

    From NOAA and Goddard: “First Solar Images from NOAA’s GOES-16 Satellite” 

    NASA Goddard Banner
    NASA Goddard Space Flight Center



    Feb. 27, 2017

    Michelle Smith
    National Oceanic and Atmospheric Administration, Silver Spring, Md.

    Rob Gutro
    NASA’s Goddard Space Flight Center, Greenbelt, Md.

    The first images from the Solar Ultraviolet Imager or SUVI instrument aboard NOAA’s GOES-16 satellite have been successful, capturing a large coronal hole on Jan. 29, 2017.

    NOAA GOES-16

    The sun’s 11-year activity cycle is currently approaching solar minimum, a period when powerful solar flares become scarce and coronal holes become the primary space weather phenomena; this one in particular triggered aurorae throughout the polar regions. Coronal holes are areas where the sun’s corona appears darker because open magnetic field lines allow plasma to stream out into interplanetary space at high speed, leaving a region that is cooler and less dense than its surroundings.

    Access mp4 video here.
    This animation from January 29, 2017, shows a large coronal hole in the sun’s southern hemisphere from the Solar Ultraviolet Imager (SUVI) on board NOAA’s new GOES-16 satellite. SUVI observations of solar flares and solar eruptions will provide an early warning of possible impacts to Earth’s space environment and enable better forecasting of potentially disruptive events on the ground. This animation captures the sun in the 304 Å wavelength, which observes plasma in the sun’s atmosphere up to a temperature of about 50,000 degrees. When combined with the five other wavelengths from SUVI, observations such as these give solar physicists and space weather forecasters a complete picture of the conditions on the sun that drive space weather. Credits: NOAA/NASA

    SUVI is a telescope that monitors the sun in the extreme ultraviolet wavelength range. SUVI will capture full-disk solar images around the clock and will be able to see more of the environment around the sun than earlier NOAA geostationary satellites.

    The sun’s upper atmosphere, or solar corona, consists of extremely hot plasma, an ionized gas. This plasma interacts with the sun’s powerful magnetic field, generating bright loops of material that can be heated to millions of degrees. Outside hot coronal loops are cool, dark regions called filaments, which can erupt and become a key source of space weather when the sun is active. Other dark regions, called coronal holes, occur where the sun’s magnetic field allows plasma to stream away from the sun at high speed. The effects linked to coronal holes are generally milder than those of coronal mass ejections, but when the outflow of solar particles is intense, they can still pose risks to satellites in Earth orbit.

    The solar corona is so hot that it is best observed with X-ray and extreme-ultraviolet (EUV) cameras. Various elements emit light at specific EUV and X-ray wavelengths depending on their temperature, so by observing in several different wavelengths, a picture of the complete temperature structure of the corona can be made. The GOES-16 SUVI observes the sun in six EUV channels.
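    As a rough illustration of this multi-wavelength temperature mapping, the sketch below pairs six EUV channels with approximate formation temperatures drawn from standard solar line identifications (e.g. He II at 304 Å, Fe XII at 195 Å). The values are order-of-magnitude and illustrative, not SUVI calibration data:

```python
# Approximate characteristic formation temperatures (kelvin) for six EUV
# channels, based on standard line identifications. Illustrative values only.
CHANNEL_TEMPS_K = {
    94: 6.0e6,   # Fe XVIII, flaring plasma
    131: 1.0e7,  # Fe XXI, hot flare plasma (also a cooler Fe VIII component)
    171: 7.0e5,  # Fe IX, quiet corona
    195: 1.2e6,  # Fe XII, corona and coronal holes
    284: 2.0e6,  # Fe XV, active regions
    304: 5.0e4,  # He II, chromosphere / transition region (~50,000 degrees)
}

def hottest_channel():
    """Channel (angstroms) probing the hottest plasma in this lookup."""
    return max(CHANNEL_TEMPS_K, key=CHANNEL_TEMPS_K.get)

# Combining all channels spans ~5e4 K to ~1e7 K, which is why a multi-channel
# imager can reconstruct the corona's temperature structure.
print(hottest_channel(), CHANNEL_TEMPS_K[304])
```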

    Data from SUVI will provide an estimation of coronal plasma temperatures and emission measurements which are important to space weather forecasting. SUVI is essential to understanding active areas on the sun, solar flares and eruptions that may lead to coronal mass ejections which may impact Earth. Depending on the magnitude of a particular eruption, a geomagnetic storm can result that is powerful enough to disturb Earth’s magnetic field. Such an event may impact power grids by tripping circuit breakers, disrupt communication and satellite data collection by causing short-wave radio interference and damage orbiting satellites and their electronics. SUVI will allow the NOAA Space Weather Prediction Center to provide early space weather warnings to electric power companies, telecommunication providers and satellite operators.

    These images of the sun were captured at the same time on January 29, 2017 by the six channels on the SUVI instrument on board GOES-16 and show a large coronal hole in the sun’s southern hemisphere. Each channel observes the sun at a different wavelength, allowing scientists to detect a wide range of solar phenomena important for space weather forecasting.
    Credits: NOAA

    SUVI replaces the GOES Solar X-ray Imager (SXI) instrument in previous GOES satellites and represents a change in both spectral coverage and spatial resolution over SXI.

    NASA successfully launched GOES-R at 6:42 p.m. EST on Nov. 19, 2016, from Cape Canaveral Air Force Station in Florida and it was renamed GOES-16 when it achieved orbit. GOES-16 is now observing the planet from an equatorial view approximately 22,300 miles above the surface of Earth.
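    That 22,300-mile figure follows directly from Kepler’s third law for an orbit whose period matches one sidereal day; a quick check:

```python
import math

# Kepler's third law gives the geostationary orbital radius:
#   r = (mu * T^2 / (4 * pi^2)) ** (1/3)
# with Earth's gravitational parameter mu and one sidereal day T.
MU_EARTH = 3.986004418e14     # m^3 / s^2
SIDEREAL_DAY = 86164.1        # s
EARTH_RADIUS_EQ = 6378.137e3  # m, equatorial

r = (MU_EARTH * SIDEREAL_DAY**2 / (4 * math.pi**2)) ** (1.0 / 3.0)
altitude_km = (r - EARTH_RADIUS_EQ) / 1000.0
altitude_mi = altitude_km / 1.609344

print(round(altitude_km), round(altitude_mi))  # ~35786 km, ~22236 mi
```

The result, about 22,236 miles above the equator, matches the article’s rounded “approximately 22,300 miles.”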

    NOAA’s satellites are the backbone of its life-saving weather forecasts. GOES-16 will build upon and extend the more than 40-year legacy of satellite observations from NOAA that the American public has come to rely upon.

    For more information about GOES-16, visit: http://www.goes-r.gov/ or http://www.nasa.gov/goes

    To learn more about the GOES-16 SUVI instrument, visit:


    See the full article here.


    NASA’s Goddard Space Flight Center is home to the nation’s largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.

    Named for American rocketry pioneer Dr. Robert H. Goddard, the center was established in 1959 as NASA’s first space flight complex. Goddard and its several facilities are critical in carrying out NASA’s missions of space exploration and scientific discovery.

    NASA Goddard Campus

  • richardmitnick 9:40 am on January 25, 2017 Permalink | Reply
    Tags: , , GOES-16, New Earth images from GOES-16, NOAA   

    From EarthSky: “New Earth images from GOES-16” 



    January 23, 2017
    Deborah Byrd

    NOAA GOES-16

    GOES-16 captured this view of the moon as it looked across the surface of the Earth on January 15, 2017. Like earlier GOES satellites, GOES-16 will use the moon for calibration. Image via NOAA/NASA.

    NOAA sounded thrilled on January 23, 2017 about the release of the first images from orbit by the GOES-16 satellite. This new satellite lifted off from Cape Canaveral on November 19, 2016, and, according to NOAA:

    … scientists, meteorologists and ordinary weather enthusiasts have anxiously waited for the first photos from NOAA’s newest weather satellite, GOES-16, formerly GOES-R.

    The release of the first images today is the latest step in a new age of weather satellites. It will be like high-definition from the heavens.

    Stephen Volz, Ph.D., director of NOAA’s Satellite and Information Service, said:

    This is such an exciting day for NOAA! One of our GOES-16 scientists compared this to seeing a newborn baby’s first pictures — it’s that exciting for us. These images come from the most sophisticated technology ever flown in space to predict severe weather on Earth. The fantastically rich images provide us with our first glimpse of the impact GOES-16 will have on developing life-saving forecasts.

    This GOES image shows the significant storm system that crossed North America on January 15, 2017, bringing freezing rain and ice that caused dangerous conditions across the United States and resulted in loss of life. GOES-16 scientists say the satellite offers three times more spectral channels at four times greater resolution, five times faster than ever before, leading to more accurate weather forecasting. Image via GOES Image Gallery.

    This 16-panel image shows the continental United States in the 2 visible, 4 near-infrared and 10 infrared channels on GOES-16’s Advanced Baseline Imager (ABI). These channels help forecasters distinguish between differences in the atmosphere, for example, between clouds, water vapor, smoke, ice and volcanic ash. Image via NOAA/NASA.

    The GOES-16 Advanced Baseline Imager also acquired the images to make this composite color image of Earth. It’s from 1:07 p.m. EDT on January 15, 2017 and was created using several of the instrument’s 16 spectral channels. The image shows North and South America and the surrounding oceans. GOES-16 observes Earth from an equatorial view approximately 22,300 miles high (35,888 km), which is why, NOAA said, it’s able to create “full disk images like these, extending from the coast of West Africa, to Guam, and everything in between.” Image via GOES Image Gallery.

    Bottom line: On January 23, 2017 NOAA released the first images from its GOES-16 weather forecasting satellite.

    Via NOAA

    See the full article here.

  • richardmitnick 3:29 pm on January 4, 2017 Permalink | Reply
    Tags: , , NOAA, Wind studies   

    From ALCF: “Supercomputer simulations helping to improve wind predictions” 

    Argonne Lab
    News from Argonne National Laboratory

    ANL Cray Aurora supercomputer
    Cray Aurora supercomputer at the Argonne Leadership Computing Facility

    MIRA IBM Blue Gene Q supercomputer at the Argonne Leadership Computing Facility


    January 3, 2017
    Katie Jones

    Station locations and lists of instruments deployed within the Columbia River Gorge, Columbia River Basin, and surrounding region. Credit:
    James Wilczak, NOAA

    A research team led by the National Oceanic and Atmospheric Administration (NOAA) is performing simulations at the Argonne Leadership Computing Facility (ALCF), a U.S. Department of Energy (DOE) Office of Science User Facility, to develop numerical weather prediction models that can provide more accurate wind forecasts in regions with complex terrain. The team, funded by DOE in support of its Wind Forecast Improvement Project II (WFIP 2), is testing and validating the computational models with data being collected from a network of environmental sensors in the Columbia River Gorge region.

    Wind turbines dotting the Columbia River Gorge in Washington and Oregon can collectively generate about 4,500 megawatts (MW) of power, more than five 800-MW nuclear power plants combined. However, the gorge region and its dramatic topography create highly variable wind conditions, posing a challenge for utility operators who use weather forecast models to predict when wind power will be available on the grid.

    If predictions are unreliable, operators must depend on steady power sources like coal and nuclear plants to meet demand. Because they take a long time to fuel and heat, conventional power plants operate on less flexible timetables and can generate power that is then wasted if wind energy unexpectedly floods the grid.

    To produce accurate wind predictions over complex terrain, researchers are using Mira, the ALCF’s 10-petaflops IBM Blue Gene/Q supercomputer, to increase resolution and improve physical representations to better simulate wind features in national forecast models. In a unique intersection of field observation and computer simulation, the research team has installed and is collecting data from a network of environmental instruments in the Columbia River Gorge region that is being used to test and validate model improvements.

    This research is part of the Wind Forecast Improvement Project II (WFIP 2), an effort sponsored by DOE in collaboration with NOAA, Vaisala—a manufacturer of environmental and meteorological equipment—and a number of national laboratories and universities. DOE aims to increase U.S. wind energy from five to 20 percent of total energy use by 2030, which means optimizing how wind is used on the grid.

    “Our goal is to give utility operators better forecasts, which could ultimately help make the cost of wind energy a little cheaper,” said lead model developer Joe Olson of NOAA. “For example, if the forecast calls for a windy day but operators don’t trust the forecast, they won’t be able to turn off coal plants, which are releasing carbon dioxide when maybe there was renewable wind energy available.”

    The complicated physics of wind

    For computational efficiency, existing forecast models assume the Earth’s surface is relatively flat—which works well for predicting wind over flat terrain, where states like Texas and Iowa generate many thousands of megawatts of wind power. Yet, as the Columbia River Gorge region demonstrates, some of the ripest locations for harnessing wind energy could be along mountains and coastlines where conditions are difficult to predict.

    “There are a lot of complications predicting wind conditions for terrain with a high degree of complexity at a variety of spatial scales,” Olson said.

    Two major challenges include overcoming a model resolution that is too low for resolving wind features in sharp valleys and mountain gaps and a lack of observational data.

    At the NOAA National Center for Environmental Prediction, two atmospheric models run around the clock to provide national weather forecasts: the 13-km Rapid Refresh (RAP) and the 3-km High-Resolution Rapid Refresh (HRRR). Only a couple of years old, the HRRR model has improved storm and winter weather predictions by resolving atmospheric features at 9 km²—or about 2.5 times the size of Central Park in New York City.

    At a resolution of a few kilometers, HRRR can capture processes at the mesoscale—about the size of storms—but cannot resolve features at the microscale, which is a few hundred feet. Some phenomena important to wind prediction that cannot be modeled in RAP or HRRR include mountain wakes (the splitting of airflow obstructed by the side of a mountain); mountain waves (the oscillation of air flow on the side of the mountain that affects cloud formation and turbulence); and gap flow (small-scale winds that can blow strongly through gaps in mountains and gorge ridges).

    The 750-meter leap

    To make wind predictions that are sufficiently accurate for utility operators, Olson said they need to model physical parameters at a 750-m resolution—about one-sixth the size of Central Park, or an average wind farm. This 16-times increase in resolution will require a lot of real-world data for model testing and validation, which is why the WFIP 2 team outfitted the Columbia River Gorge region with more than 20 environmental sensor stations.
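    The jump from 3 km to 750 m is costlier than the “16 times” figure alone suggests once time stepping is included. A back-of-envelope sketch, assuming an explicit scheme with a CFL-limited time step and a fixed number of vertical levels (my simplifying assumptions, not WFIP 2 numbers):

```python
# Rough cost scaling for refining a model grid from 3 km to 750 m, assuming
# an explicit, CFL-limited time step and fixed vertical levels.
# Back-of-envelope only.
coarse_dx_km, fine_dx_km = 3.0, 0.75
refinement = coarse_dx_km / fine_dx_km       # 4x in each horizontal direction

cell_factor = refinement ** 2                # 16x more grid columns
timestep_factor = refinement                 # 4x more (shorter) time steps
cost_factor = cell_factor * timestep_factor  # ~64x more work overall

print(int(cell_factor), int(cost_factor))  # 16 64
```

The 16x figure in the text is the horizontal cell count; under these assumptions the compute cost grows closer to 64x.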

    “We haven’t been able to identify all the strengths and weaknesses for wind predictions in the model because we haven’t had a complete, detailed dataset,” Olson said. “Now we have an expansive network of wind profilers and other weather instruments. Some are sampling wind in mountain gaps and valleys, others are on ridges. It’s a multiscale network that can capture the high-resolution aspects of the flow, as well as the broader mesoscale flows.”

    Many of the sensors send data every 10 minutes. Considering data will be collected for an 18-month period that began in October 2015 and ends March 2017, this steady stream will ultimately amount to about half a petabyte. The observational data is initially sent to Pacific Northwest National Laboratory where it is stored until it is used to test model parameters on Mira at Argonne.

    The WFIP 2 research team needed Mira’s highly parallel architecture to simulate an ensemble of about 20 models with varied parameterizations. ALCF researchers Ray Loy and Ramesh Balakrishnan worked with the team to optimize the HRRR architectural configuration and craft a strategy that allowed them to run the necessary ensemble jobs.

    “We wanted to run on Mira because ALCF has experience using HRRR for climate simulations and running ensembles jobs that would allow us to compare the models’ physical parameters,” said Rao Kotamarthi, chief scientist and department head of Argonne’s Climate and Atmospheric Science Department. “The ALCF team helped to scale the model to Mira and instructed us on how to bundle jobs so they avoid interrupting workflow, which is important for a project that often has new data coming in.”

    The ensemble approach allowed the team to create case studies that are used to evaluate how each simulation compared to observational data.

    “We pick certain case studies where the model performs very poorly, and we go back and change the physics in the model until it improves, and we keep doing that for each case study so that we have significant improvement across many scenarios,” Olson said.
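    The pick-the-worst-cases loop described above can be sketched as scoring each candidate configuration against observations, for example by root-mean-square error. The data and configuration names below are made up for illustration:

```python
import math

# Observed winds (m/s) for one case study, and two hypothetical model
# configurations to score against them. Illustrative data only.
observations = [4.0, 6.5, 9.0, 12.0]

model_runs = {
    "control_hrrr": [5.5, 5.0, 11.0, 9.0],
    "modified_pbl": [4.5, 6.0, 9.5, 11.0],
}

def rmse(pred, obs):
    """Root-mean-square error between predicted and observed values."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

scores = {name: rmse(run, observations) for name, run in model_runs.items()}
worst = max(scores, key=scores.get)  # configuration whose physics to revisit
best = min(scores, key=scores.get)
print(best, worst)
```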

    At the end of the field data collection, the team will simulate an entire year of weather conditions with an emphasis on wind in the Columbia River Gorge region using the control model—the 3-km HRRR model before any modifications were made—and a modified model with the improved physical parameterizations.

    “That way, we’ll be able to get a good measure of how much has improved overall,” Olson said.

    Computing time on Mira was awarded through the ASCR Leadership Computing Challenge (ALCC). Collaborating institutions include DOE’s Wind Energy Technologies Office, NOAA, Argonne, Pacific Northwest National Laboratory, Lawrence Livermore National Laboratory, the National Renewable Energy Laboratory, the University of Colorado, the University of Notre Dame, Texas Tech University, and Vaisala.

    See the full article here.

    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    About ALCF

    The Argonne Leadership Computing Facility’s (ALCF) mission is to accelerate major scientific discoveries and engineering breakthroughs for humanity by designing and providing world-leading computing facilities in partnership with the computational science community.

    We help researchers solve some of the world’s largest and most complex problems with our unique combination of supercomputing resources and expertise.

    ALCF projects cover many scientific disciplines, ranging from chemistry and biology to physics and materials science. Examples include modeling and simulation efforts to:

    Discover new materials for batteries
    Predict the impacts of global climate change
    Unravel the origins of the universe
    Develop renewable energy technologies

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

    Argonne Lab Campus
