Tagged: NSF

  • richardmitnick 9:27 am on June 23, 2016 Permalink | Reply
    Tags: 2016 Week of Making, NSF

    From Rutgers: 2016 Week of Making 

     


    In honor of the 2016 Week of Making, the NSF has awarded five different grants for early-concept programs. Each of these grants is $300,000 over the course of two years. Researchers at Rutgers Newark were awarded one of these grants.

    “Researchers at Rutgers University Newark will investigate the developmental origins of making in children’s play through the development of a Mobile Maker Center that can be brought to local science museums, parks, play centers, zoos or libraries to study children’s interactions with specially designed physical objects and computer-designed simulations.”

    Read more here.

    Source: 2016 Week of Making

    Rutgersensis

     
  • richardmitnick 7:50 am on June 14, 2016 Permalink | Reply
    Tags: NSF

    From NSF: “Biophysics fights cancer” 

    nsf
    National Science Foundation

    June 13, 2016
    Media Contacts
    Ivy F. Kupec, NSF, (703) 292-8796, ikupec@nsf.gov
    Jessica Arriens, NSF, (703) 292-2243, jarriens@nsf.gov

    Program Contacts
    Krastan B. Blagoev, NSF, (703) 292-4666, kblagoev@nsf.gov

    “Theoretical physics brings an important perspective to studying biological issues,” said Krastan Blagoev, program director of NSF’s Physics of Living System Program, who worked on building the unique public-private partnership. “Using an interdisciplinary approach to living systems helps researchers solve some basic science problems that stop us from making further progress in understanding and treating cancer.” Here, an SW620 colon cancer cell line with RNA in situ hybridization is stained for LINE-1 non-coding RNA (Green) and GAPDH housekeeping gene (Red). DAPI nuclear stain (Blue). Images at 200X magnification. Credit: David T. Ting MD, Massachusetts General Hospital and Harvard Medical School

    Whether it focuses on determining why certain cancers develop drug resistance, finding a way to improve individuals’ immune systems or better understanding cancer cell evolution, fundamental scientific research will “stand up to cancer” with three new awards from the National Science Foundation (NSF). The awards arose through an innovative public-private partnership between NSF, Stand Up To Cancer (SU2C), the V Foundation for Cancer Research, The Lustgarten Foundation, Breast Cancer Research Foundation and Bristol-Myers Squibb.

    “These research projects are quite dissimilar, but they have two things in common,” said Fleming Crim, assistant director for NSF’s Mathematical and Physical Sciences Directorate. “They challenge what we know and don’t know about cancer at the most fundamental level, and they are attempting to tackle some of the most pressing issues in cancer treatment today.”

    Announced in September 2014, the partnership committed $5 million towards transformational, theoretical, biophysical approaches to cancer, with the potential for significant impact on basic science research and potentially on treatment.

    “This is an example of how biology, physical sciences and mathematics can work together to address complex problems in biology,” said James Olds, assistant director for NSF’s Biological Sciences Directorate.

    “Theoretical physics brings an important perspective to studying biological issues,” said Krastan Blagoev, program director of NSF’s Physics of Living System Program, who worked on building the unique public-private partnership. “Using an interdisciplinary approach to living systems helps researchers solve some basic science problems that stop us from making further progress in understanding and treating cancer.”

    “Stand Up To Cancer has demonstrated we can accelerate new effective cancer treatments through collaborations across institutions and research disciplines, getting researchers out of their silos,” said SU2C Scientific Advisory Chairperson Phillip A. Sharp, professor at the Koch Institute for Integrative Cancer Research at the Massachusetts Institute of Technology. “With these convergence teams, SU2C will advance translational cancer research beyond the long-held view that scientists ‘discover,’ engineers ‘invent,’ and entrepreneurs ‘innovate.'”

    In February 2015, SU2C held a workshop to follow up on the cancer partnership announcement. The workshop brought together leading clinicians and theoretical physicists to home in on transformative approaches with the potential to catapult cancer research forward.

    Rational design of anticancer drug combinations with dynamic multidimensional input

    Réka Albert, Penn State University; Eric Siggia, Rockefeller University; José Baselga, Memorial Sloan Kettering Cancer Center; Levi Garraway, Dana-Farber/Harvard Cancer Center; and Raul Rabadan, Columbia University

    Why cancer treatments become ineffective has long confounded clinicians and is the main reason behind cancer’s deadliness. This project will characterize the molecular network that causes this drug resistance. The researchers will focus on two cancers: estrogen-positive breast cancer and melanoma, the most lethal skin cancer. Using quantitative network analysis, they will work to identify networks responsible for drug resistance and explore ways to bypass that resistance at a molecular level. If successful, the theory could likely be used to improve treatments for other cancers that face drug resistance issues.
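    To make “quantitative network analysis” a little more concrete, here is a minimal, purely hypothetical sketch (not the team’s actual model; every node name and rule below is invented for illustration). Molecular players are treated as ON/OFF switches updated from their regulators, and the analysis asks which stable states the network can settle into, for example a drug-resistant survival state.

```python
# Hypothetical Boolean-network sketch of drug resistance; node names and rules
# are invented for illustration and are not taken from the research described above.

rules = {
    "GrowthSignal": lambda s: s["GrowthSignal"],                  # external input, held fixed
    "Drug":         lambda s: s["Drug"],                          # external input, held fixed
    "BypassPath":   lambda s: s["GrowthSignal"] and s["Drug"],    # resistance route switched on under treatment
    "Survival":     lambda s: (s["GrowthSignal"] and not s["Drug"]) or s["BypassPath"],
}

def step(state):
    """Synchronously update every node from the current state."""
    return {node: bool(rule(state)) for node, rule in rules.items()}

# Start treatment (Drug=True) and iterate until the network reaches a fixed point.
state = {"GrowthSignal": True, "Drug": True, "BypassPath": False, "Survival": False}
for _ in range(10):
    new_state = step(state)
    if new_state == state:
        break
    state = new_state

print(state)  # Survival ends up True via BypassPath: a toy "resistant" attractor
```

    In this toy network, treatment alone cannot switch Survival off because the hypothetical BypassPath keeps it on; finding and breaking such routes is, in spirit, what identifying and bypassing resistance networks means.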

    Liberating T-cell mediated immunity to pancreatic cancer

    Jeffrey Drebin, University of Pennsylvania; Curtis Callan, Princeton University; David Ting, Massachusetts General Hospital and Harvard Medical School

    A promising recent approach to cancer treatment is immunotherapy, which works by stimulating patients’ own immune systems to fight tumor cells. Unfortunately, this approach has had little success in pancreatic cancer. Mechanisms that prevent an effective immune response include the release of immune-suppressing molecules by the tumor environment, as well as the physical barrier in the tissue preventing immune cells from reaching their target. The heart of this proposal is to use theoretical modeling and a statistical understanding of T-cell repertoires to design immunotherapy treatment strategies that can be tested in clinical settings.

    The genetic, epigenetic and immunological underpinnings of cancer evolution through treatment

    Ross Levine, Memorial Sloan Kettering Cancer Center; Daniel Fisher, Stanford University; Harlan Robins, Fred Hutchinson Cancer Research Center; Jeffrey Engelman, Massachusetts General Hospital; Steven Altschuler, University of California, San Francisco; and Chang Chan, Rutgers University

    A single cancerous tumor can contain many different kinds of mutations in different cells, which affect patient prognosis: a higher degree of heterogeneity (cell variation) in a tumor corresponds to a lower survival rate. In this project, oncologists and physicists will study evolutionary dynamics leading to heterogeneity in cancer, focusing on acute myeloid leukemia and a specific type of non-small cell lung cancer. They hope to develop a quantitative framework from science-based data mining to produce more accurate survival forecasts that facilitate better treatment decisions.

    Related Websites
    Stand Up to Cancer: http://www.standup2cancer.org/

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    The National Science Foundation (NSF) is an independent federal agency created by Congress in 1950 “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense…” We are the funding source for approximately 24 percent of all federally supported basic research conducted by America’s colleges and universities. In many fields such as mathematics, computer science and the social sciences, NSF is the major source of federal backing.

    seal

     
  • richardmitnick 3:33 pm on June 10, 2016 Permalink | Reply
    Tags: Diane Souvaine-Vice Chair, Maria Zuber-Chair, National Science Board, NSF

    From NSF: Women in Science “The National Science Board taps Maria Zuber as its chairperson and Diane Souvaine for vice chairperson” 

    nsf
    National Science Foundation

    National Science Board

    May 24, 2016 [Just appeared in social media.]

    Left, Maria Zuber, Chair; right, Diane Souvaine, Vice Chair

    For the first time in National Science Foundation (NSF) history, women hold the positions of NSF director, National Science Board (NSB) chair and NSB vice chair. During its May meeting, the board, which serves as the governing body for NSF, elected Maria Zuber, vice president for research at the Massachusetts Institute of Technology, as chair and Diane Souvaine, vice provost for research at Tufts University, as vice chair. They replace Dan Arvizu and Kelvin Droegemeier, who both rotated off the board after serving 12 years, the last four as chair and vice chair, respectively.

    Zuber’s research bridges planetary geophysics and the technology of space-based laser and radio systems, and she has published over 200 papers. She has held leadership roles associated with scientific experiments or instrumentation on nine NASA missions and remains involved with six of these missions. She is a member of the National Academy of Sciences and American Philosophical Society and is a fellow for the American Academy of Arts and Sciences, the American Association for the Advancement of Science, the Geological Society and the American Geophysical Union. In 2002, Discover magazine named her one of the 50 most important women in science. Zuber served on the Presidential Commission on the Implementation of United States Space Exploration Policy in 2004.

    NSF Director and NSB member ex officio France Córdova said, “I am delighted to say, on behalf of NSF, that we are thrilled with Dr. Zuber’s election as chair and Dr. Souvaine’s election as vice chair of the National Science Board. As Dr. Zuber is a superb scientist and recognized university leader, she has the skills needed to help guide the agency’s policies and programs. Coupled with Dr. Souvaine’s background in computer science, exemplary leadership skills, and expertise in budget oversight and strategy, NSB is well-positioned for the coming years. I look forward to working with both leaders as NSF launches new big ideas in science and engineering.”

    Zuber is in her fourth year on the board and has served on its Committee on Strategy and Budget, which advises on NSF’s strategic direction and reviews the agency’s budget submissions.

    “It is a privilege to lead the National Science Board and to promote NSF’s bold vision for research and education in science and engineering,” Zuber said. “The outcomes of discovery science inspire the next generation and yield the knowledge that drives innovation and national competitiveness, and contribute to our quality of life. NSB is committed to working with director Córdova and her talented staff to assure that the very best ideas based on merit review are supported and that exciting, emerging opportunities — many at the intersection of disciplines — are pursued.”

    Souvaine is in her second term on the NSB and has served as chair of its Committee on Strategy and Budget, chair of its Committee on Programs and Plans, and as a member of its Committee on Audit and Oversight, all of which provide strategic direction, and oversight and guidance on NSF projects and programs. In addition, she co-chaired NSB’s Task Force on Mid-Scale Research and served three years on the Executive Committee.

    A theoretical computer scientist, Souvaine conducts research in computational geometry with commercial applications in materials engineering, microchip design, robotics and computer graphics. She was elected a fellow of the Association for Computing Machinery for her research and for her service on behalf of the computing community. A founding member, Souvaine served for over two years in the directorate of the NSF Science and Technology Center on Discrete Mathematics and Theoretical Computer Science that originally spanned Princeton University, Rutgers University, Bell Labs and Bell Communications Research. She also works to enhance pre-college mathematics and foundations of computing education and to advance opportunities for women and minorities in mathematics, science and engineering.

    “I am truly honored and humbled by this vote of confidence from such esteemed colleagues. I do not take this responsibility lightly,” Souvaine said. “The board is proud of NSF’s accomplishments over its 66 years, from the discovery of gravitational waves at LIGO to our biennial Science and Engineering Indicators report on the state of our nation’s science and engineering enterprise. I look forward to working with Congress, the Administration, the science and education communities, and NSF staff to continue the agency’s legacy in advancing the progress of science.”

    Jointly, the 24-member board and the director pursue the goals and function of the foundation. NSB establishes NSF policies within the framework of applicable national policies set forth by the President and Congress. NSB identifies issues critical to NSF’s future and approves the agency’s strategic budget directions, the annual budget submission to the Office of Management and Budget, and new major programs and awards. The board also serves as an independent body of advisers to both the President and Congress on policy matters related to science and engineering and to education in science and engineering. In addition to major reports, NSB publishes policy papers and statements on issues of importance to U.S. science and engineering.

    The President appoints board members, who are selected for their eminence in research, education or public service and their records of distinguished service, and who represent a variety of science and engineering disciplines and geographic areas. Board members serve six-year terms, and the President may reappoint members for a second term. NSF’s director is an ex officio 25th member of the board.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    The National Science Foundation (NSF) is an independent federal agency created by Congress in 1950 “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense…” We are the funding source for approximately 24 percent of all federally supported basic research conducted by America’s colleges and universities. In many fields such as mathematics, computer science and the social sciences, NSF is the major source of federal backing.

    seal

     
  • richardmitnick 7:29 am on June 3, 2016 Permalink | Reply
    Tags: NSF, Stampede supercomputer

    From NSF: “Stampede 2 drives frontiers of science and engineering forward” 

    nsf
    National Science Foundation

    Media Contacts
    Ivy F. Kupec, NSF
    (703) 292-8796
    ikupec@nsf.gov

    Gera Jochum, NSF
    (703) 292-8794
    gjochum@nsf.gov

    Faith Singer-Villalobos
    University of Texas at Austin
    (512) 232-5771
    faith@tacc.utexas.edu

    Program Contacts
    Bob Chadduck, NSF
    (703) 292-2247
    rchadduc@nsf.gov

    Irene M. Qualters, NSF
    (703) 292-2339
    iqualter@nsf.gov

    June 2, 2016

    Today, the National Science Foundation (NSF) announced a $30 million award to the Texas Advanced Computing Center (TACC) at The University of Texas at Austin (UT Austin) to acquire and deploy a new large-scale supercomputing system, Stampede 2, as a strategic national resource to provide high-performance computing (HPC) capabilities for thousands of researchers across the U.S.

    Dell PowerEdge Stampede supercomputer (9.6 PF) at the Texas Advanced Computing Center, UT Austin

    This award builds on technology and expertise from the Stampede system first funded by NSF in 2011 and will deliver a peak performance of up to 18 Petaflops, over twice the overall system performance of the current Stampede system. Stampede 2 will be among the first systems to employ cutting-edge processor and memory technology to continue to bridge users to future cyberinfrastructure.

    Stampede 2 will be deployed by TACC in conjunction with vendor partners Dell Inc., Intel Corporation, and Seagate Technology, and operated by a team of cyberinfrastructure experts at TACC, UT Austin, Clemson University, Cornell University, the University of Colorado at Boulder, Indiana University, and Ohio State University.

    “NSF is proud to join with the University of Texas at Austin in supporting the nation’s academic researchers in science and engineering with the latest in advanced computing technology and expertise,” said Irene Qualters, NSF Division Director for Advanced Cyberinfrastructure. “Stampede 2’s capabilities will complement and significantly expand the diverse portfolio of computing resources increasingly essential to exploration at the frontiers of science and engineering.”

    The announcement of Stampede 2 comes at a time when the use of NSF-supported research cyberinfrastructure resources is at an all-time high and continuing to increase across all science and engineering disciplines. Since 2005, the number of active institutions using this research cyberinfrastructure has doubled, the number of principal investigators has tripled, and the number of active users has quintupled. Building on the Stampede system will help give a growing number of scientists access to computation at scale.

    “The original Stampede system has run more than 7 million simulation and data analysis jobs for tens of thousands of users around the country and around the world,” noted Dan Stanzione, executive director of TACC and principal investigator of the Stampede and Stampede 2 projects. “The kind of large-scale computing and data capabilities systems like Stampede and Stampede 2 provide are crucial for innovation in almost every area of research and development, from providing insights to fundamental theory to applied work that has real near-term impacts on society. Stampede has been used for everything from determining earthquake risks to help set building codes for homes and commercial buildings, to computing the largest mathematical proof ever constructed. We thank the NSF for trusting us again with the tremendous responsibility of supporting our nation’s researchers as they push the boundaries of discovery.”

    Researchers across the nation can gain access to Stampede and other advanced computing resources, including other HPC machines, high throughput computing machines, visualizations, data storage, testbeds, and services through the NSF-funded Extreme Science and Engineering Discovery Environment (XSEDE).

    The award for Stampede 2 will deploy a new system that will surpass the performance of the current Stampede system, doubling peak performance, memory, storage capacity and bandwidth. The new system will be deployed in phases, using a variety of new and upcoming technologies. The processors in the system will include a mix of upcoming Intel® Xeon Phi™ processors, codenamed “Knights Landing,” and future-generation Intel® Xeon® processors, connected by Intel® Omni-Path Architecture. The last phase of the system will include integration of the upcoming 3D XPoint non-volatile memory technology.

    “The first Stampede system has been the workhorse of XSEDE, supporting the advanced modeling, simulation, and analysis needs of many thousands of researchers across the country,” said Omar Ghattas, a computational geoscientist/engineer at UT Austin and recent winner of the Gordon Bell prize for the most outstanding achievement in high-performance computing.

    “Stampede has also given us a window into a future in which simulation is but an inner iteration of a ‘what-if?’ outer loop. Stampede 2’s massive performance increase will make routine the principled exploration of parameter space entailed in this outer loop. This will usher in a new era of HPC-based inference, data assimilation, design, control, uncertainty quantification, and decision-making for large-scale complex models in the natural and social sciences, engineering, technology, medicine, and beyond.”

    The announcement was made today during an event at TACC recognizing the center’s 15th anniversary and dedicating a new building for advanced computing on the UT Austin J.J. Pickle Research Campus. Speakers included Qualters; Stanzione; Bill McRaven, chancellor of the University of Texas System; Jim Ganthier, vice president and general manager, Engineered Solutions, HPC & Cloud, Dell, Inc.; and Charlie Wuischpard, vice president and general manager, HPC Platform Group, Intel Corporation.

    “We are both excited for and proud to power TACC’s multiple Stampede Systems. TACC has been a great Dell customer and partner over the years, helping us to evolve our own portfolio as we continue to push the HPC industry forward,” said Ganthier. “Our Dell technologies at the core of the Stampede 2 supercomputing cluster will continue powering leading-edge research to both enable and advance science and society.”

    “The NSF and TACC continue to recognize the need for advanced HPC solutions as a fundamental tool to accelerate academic and scientific discovery,” Wuischpard said. “Stampede 2 will be a leadership-class system based on the Intel® Scalable System Framework, delivering a common platform for modeling, simulation, and data-driven science, and fueling scientific research and discovery for the next generation of researchers.”

    The event also included a symposium on advanced computing featuring users of the system: Ghattas; Ellen Rathje, of UT Austin, who leads the NSF-funded DesignSafe infrastructure; Peter Couvares of Syracuse University from the Advanced LIGO project, which recently confirmed the first observation of gravitational waves; and Nirav Merchant from the University of Arizona, who is co-principal investigator of the NSF-funded CyVerse life sciences cyberinfrastructure.

    Intel, Xeon and Xeon Phi are trademarks or registered trademarks of Intel Corporation in the United States and other countries.

    Related Websites
    Texas Advanced Computing Center (TACC): http://www.tacc.utexas.edu/
    NSF announcement of Stampede dedication: http://www.nsf.gov/news/news_summ.jsp?cntn_id=127194
    NSF announcement of 2011 Stampede award: https://www.nsf.gov/news/news_summ.jsp?cntn_id=121763

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    The National Science Foundation (NSF) is an independent federal agency created by Congress in 1950 “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense…” We are the funding source for approximately 24 percent of all federally supported basic research conducted by America’s colleges and universities. In many fields such as mathematics, computer science and the social sciences, NSF is the major source of federal backing.

    seal

     
  • richardmitnick 4:35 pm on April 13, 2016 Permalink | Reply
    Tags: NSF, South Pole Ice Core project

    From NSF: “Getting to the Bottom of SPICECORE” 

    nsf
    National Science Foundation

    The Antarctic Sun

    April 12, 2016
    Michael Lucibella

    Researchers drill deep into the ice beneath the South Pole to sample Earth’s ancient atmosphere.

    As the winch extracted a two-meter-long cylinder of ancient ice in late December, Murat Aydin looked on.

    Shawntel Stapleton operates the drill that’s boring into the ice under the South Pole. Photo Credit: Mike Lucibella

    “If we can keep this pace up we should be able to hit 1,600 meters,” he said. “This is going to be the deepest ice core drilled at the South Pole by quite a margin.”

    By the end of the project a month later, researchers with the South Pole Ice Core project, known more succinctly as SPICECORE, had exceeded even their most ambitious goals.

    Aydin is an atmospheric chemist at the University of California, Irvine and the lead scientist on SPICECORE. The project wrapped up its two-year drilling effort at the South Pole in late January, having collected ice samples from 1,751 meters (5,744 feet) below the surface, more than 200 meters (656 feet) deeper than the original goal.

    The SPICECORE drill hauls up an ice core from hundreds of feet below the surface. Photo Credit: Mike Lucibella

    “There is no better feeling for a field scientist than coming back home, having surpassed all expectations,” Aydin said.

    SPICECORE is supported by the National Science Foundation, which manages the U.S. Antarctic Program.

    These ancient ice samples are important to climate scientists because sealed inside each ice core are numerous tiny air bubbles, which are essentially samples of atmosphere from before the ice was completely buried. Scientists analyze these trapped bubbles to learn what the Earth’s atmosphere was like thousands of years ago. These measurements have become a key tool for climate researchers investigating ancient carbon dioxide levels to better understand the planet’s present warming trends.

    Though much ice core trace gas research focuses on gleaning insight into ancient carbon dioxide and methane levels, the SPICECORE research will also focus on a different class of gases found at much lower levels in the atmosphere. Measurements of rare gases like carbonyl sulfide, methyl chloride and methyl bromide offer further details about how the trace gas composition of the atmosphere changed over thousands of years and what impacts the changes in global biogeochemical cycles may have had on it.

    The ice started out as accumulated snow at the surface. Each year a new layer of snow builds up on top of the layer laid down the previous year. The weight of all this extra snow compresses the frozen snowflakes together until they fuse and become solid ice. The farther down into the ice sheet the team drills, the older the ice gets. The samples nearest to the bottom could be as many as 50,000 years old.

    Murat Aydin cleans cutting fluid off of a recent ice core as he prepares it for storage. Photo Credit: Mike Lucibella

    Setting up shop next to the South Pole station is ideal for the team in part because the station can provide logistical support that would otherwise have to be established for an isolated field camp. Drilling at the South Pole also helps fill in some of the gaps in the scientific record around the continent.

    “If you look at all the other ice cores drilled in East Antarctica, they’re kind of far away, so there isn’t a lot from this region of East Antarctica, climate record wise,” Aydin said.

    In addition, the station’s location atop the frigid polar plateau is a boon for collecting atmospheric samples.

    “We wanted an East Antarctic ice core because of the temperatures,” Aydin said. “The ice is a lot colder here, because the annual mean temperature is about minus 50 [Celsius]. For some of the gas measurements we make, colder ice is better.”

    Shawntel Stapleton cleans and prepares the ice drill for another run. The team was drilling 24 hours a day. Photo Credit: Mike Lucibella

    But on the flip side, the temperatures that help with the science can wreak havoc on the equipment.

    “The ice here, because it’s colder, it’s also harder,” said Jay Johnson, a mechanical engineer at the University of Wisconsin-Madison’s Ice Drilling Design and Operations (IDDO) group. “Last season we had a lot of problems with [drill] bits chipping and cracking corners off and things like that, so we had to go to a different process of heat treating the metal this year and that’s working much better.”

    Johnson is one of the chief designers of the drill boring through the ice. The Intermediate Depth Drill (IDD) the IDDO developed is based on an existing Danish drill design called the Hans-Tausen drill but with some modifications. The idea was to develop a design with standardized parts that could be shared between both programs.

    “This might be one of the first times that has happened internationally,” Johnson said. “That was one of the goals of that too, so we could have some interchangeability between the two countries’ systems.”

    One of the main changes the IDDO made to the original Danish design is that they used fiberglass to make their outer core barrel, the part of the drill bit that’s lowered into the hole and sheathes the ice sample while it’s brought back to the surface.

    “We were able to make these tubes for about ten times cheaper than you can make them out of metal,” Johnson said. “They’re straighter, more round and run more true. They’re probably a little more disposable, probably a little shorter life span maybe than stainless [steel], but the benefit is that they’re extremely cost effective.”

    This South Pole ice coring operation is the first run of the new IDD. It was designed to fill a gap in drilling capabilities between small ice coring drills that can only penetrate about 300 meters (about 980 feet) into the ice, and much larger operations like WAIS Divide that can bore down several thousand meters.

    “This is one that had been in the works for a number of years,” Johnson said. “The U.S. science community wanted this intermediate depth drill for lighter logistics and easier transportability.”

    By reducing the amount of support that the drill needs, the system can be deployed more quickly and cheaply than other, larger drills.

    “We designed this project to be quick, efficient and get the biggest impact for the buck,” Aydin said. “Let’s get the most valuable science out of it with the smallest footprint possible.”

    SPICECORE’s Intermediate Depth Drill was designed to have a smaller footprint than larger ice drilling projects like WAIS Divide. Photo Credit: Mike Lucibella

    Because of its smaller size, the IDD can’t reach all the way to bedrock nearly three kilometers (1.8 miles) below the surface of the ice. However, the lowest layers under the Pole are not ideal ice samples, because the ice that the South Pole rests on is constantly flowing at a rate of about 10 meters (30 feet) a year.

    “The ice that we find at South Pole didn’t originate here… the deeper you go, the farther away it’s coming from,” Aydin said. “It complicates the interpretation of the isotope record, especially the deeper sections.”

    Aydin will return to the South Pole next season to close out the project. The drilling is finished, but the camp still needs to be disassembled. There are also samples that remained behind that need to be packed up and shipped to the National Ice Core Laboratory (NICL) in Denver where cores from previous years are already stored.

    “It’s a great feeling but I don’t feel like it is all over,” Aydin said. “It is just starting in a way because we have all this beautiful ice to analyze now. Making good measurements and publishing interesting papers are the next targets now.”

    In the lab, Aydin works to identify and characterize the rare trace gases trapped in the ice.

    In addition, scientists from around the world will be able to process and analyze the core samples stored at the lab for their own projects.

    “There are many measurements that are going to be made and there are different institutions that are going to be involved,” Aydin said. “When all is said and done, there’s going to be plenty of South Pole ice there for years to come at NICL.”

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    The National Science Foundation (NSF) is an independent federal agency created by Congress in 1950 “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense…” We are the funding source for approximately 24 percent of all federally supported basic research conducted by America’s colleges and universities. In many fields such as mathematics, computer science and the social sciences, NSF is the major source of federal backing.

    seal

     
  • richardmitnick 5:56 am on March 18, 2016 Permalink | Reply
    Tags: NSF

    From NSF: “Envisioning supercomputers of the future” 

    nsf
    National Science Foundation

    March 17, 2016
    Aaron Dubrow,
    NSF
    703-292-4489
    adubrow@nsf.gov

    Makeda Easter, Texas Advanced Computing Center
    512-471-8217
    makeda@tacc.utexas.edu

    Project to test operating systems for future exascale computers

    Last year, President Obama announced the National Strategic Computing Initiative (NSCI), an executive order to increase research, development and deployment of high performance computing (HPC) in the United States, with the National Science Foundation, the Department of Energy and the Department of Defense as the lead agencies.

    One of NSCI’s objectives is to accelerate research and development that can lead to future exascale computing systems — computers capable of performing one billion billion calculations per second (also known as an exaflop). Exascale computers will advance research, enhance national security and give the U.S. a competitive economic advantage.
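    For scale, a quick back-of-the-envelope check of the quoted figure (illustrative arithmetic only; the only inputs are the definitions in the paragraph above):

```python
# Illustrative arithmetic only: an exaflop is one billion billion
# floating-point operations per second, i.e. 10**18 flop/s.
one_billion = 10**9
exaflop = one_billion * one_billion      # 1e18 flop/s
petaflop = 10**15                        # today's large systems are measured in these

print(exaflop)                # 1000000000000000000
print(exaflop / petaflop)     # 1000.0 -> an exaflop is a thousand petaflops
```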

    Experts believe simply improving existing technologies and architectures will not get us to exascale levels. Instead, researchers will need to rethink the entire computing paradigm — from power, to memory, to system software — to make exascale systems a reality.

    The Argo Project is a three-year collaborative effort, funded by the Department of Energy, to develop a new approach for extreme-scale system software. The project involves the efforts of 40 researchers from three national laboratories and four universities working to design and prototype an exascale operating system and the software to make it useful.

    To test their new ideas, the research team is using Chameleon, an experimental environment for large-scale cloud computing research supported by the National Science Foundation and hosted by the University of Chicago and the Texas Advanced Computing Center (TACC).

    Chameleon — funded by a $10 million award from the NSFFutureCloud program — is a re-configurable testbed that lets the research community experiment with novel cloud computing architectures and pursue new, architecturally-enabled applications of cloud computing.

    “Cloud computing has become a dominant method of providing computing infrastructure for Internet services,” said Jack Brassil, a program officer in NSF’s division of Computer and Network Systems. “But to design new and innovative compute clouds and the applications they will run, academic researchers need much greater control, diversity and visibility into the hardware and software infrastructure than is available with commercial cloud systems today.”

    The NSFFutureCloud testbeds provide the types of capabilities Brassil described.

    Using Chameleon, the team is testing four key aspects of the future system:

    The Global Operating System, which handles machine configuration, resource allocation and launching applications.

    The Node Operating System, which is based on Linux and provides interfaces for better control of future exascale architectures.

    The concurrency runtime Argobots, a novel infrastructure that efficiently distributes work among computing resources.

    BEACON (the Backplane for Event and Control Notification), a framework that gathers data on system performance and sends it to various controllers to take appropriate action.

    Chameleon’s unique, reconfigurable infrastructure lets researchers bypass some issues that would have come up if the team was running the project on a typical high-performance computing system.

    For instance, developing the Node Operating System requires researchers to change the operating system kernel — the computer program that controls all the hardware components of a system and allocates them to applications.

    “There are not a lot of places where we can do that,” said Swann Perarnau, a postdoctoral researcher at Argonne National Laboratory and collaborator on the Argo Project. “HPC machines in production are strictly controlled, and nobody will let us modify such a critical component.”

    However, Chameleon lets scientists modify and control the system from top to bottom, allowing it to support a wide variety of cloud research methods and architectures not available elsewhere.

    “The Argo project didn’t have the right hardware nor the manpower to maintain the infrastructure needed for proper integration and testing of the entire software stack,” Perarnau added. “While we had full access to a small cluster, I think we saved weeks of additional system setup time, and many hours of maintenance work, switching to Chameleon.”

    One of the major challenges in reaching exascale is energy usage and cost. During last year’s Supercomputing Conference, the researchers demonstrated the ability to dynamically control the power usage of 20 nodes during a live demonstration running on Chameleon.

    They released a paper this week describing their approach to power management for future exascale systems and will present the results at the Twelfth Workshop on High-Performance, Power-Aware Computing (HPPAC’16) in May.
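    As a rough illustration of what dynamic power control of a group of nodes involves, here is a minimal, hypothetical sketch of a power-capping feedback loop. It is not the Argo or BEACON implementation; the Node fields and the rebalance policy are assumptions made up for this example, standing in for whatever measurement and capping interfaces a real system exposes.

```python
# Generic, hypothetical sketch of dynamic power capping across a set of nodes.
# This is NOT the Argo/BEACON implementation; the Node interface below is invented
# for illustration. The idea: periodically read each node's power draw and lower or
# raise its cap so the whole group stays under a global power budget.

from dataclasses import dataclass

@dataclass
class Node:
    name: str
    power_watts: float     # most recent measured power draw (assumed available)
    cap_watts: float       # current enforced power cap (assumed controllable)

def rebalance(nodes, budget_watts, min_cap=80.0, step=10.0):
    """One control-loop iteration: tighten caps if over budget, relax if under."""
    total = sum(n.power_watts for n in nodes)
    for n in nodes:
        if total > budget_watts:
            n.cap_watts = max(min_cap, n.cap_watts - step)   # shed power evenly
        else:
            n.cap_watts = n.cap_watts + step                 # give headroom back
    return total

# Toy run over 20 nodes with a 4 kW budget.
cluster = [Node(f"node{i:02d}", power_watts=230.0, cap_watts=250.0) for i in range(20)]
measured = rebalance(cluster, budget_watts=4000.0)
print(measured, cluster[0].cap_watts)   # 4600.0 W measured -> caps drop to 240 W
```

    A real controller would be driven by monitoring events and would apply smarter policies (for example, capping the least performance-sensitive jobs first), but the measure, compare to budget, adjust caps loop is the core idea.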

    The Argo team is working with industry partners, including Cray, Intel and IBM, to explore which techniques and features would be best suited for the Department of Energy’s next supercomputer.

    “Argo was founded to design and prototype exascale operating systems and runtime software,” Perarnau said. “We believe some of the new techniques and tools we have developed can be tested on petascale systems and refined for exascale platforms.”

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    The National Science Foundation (NSF) is an independent federal agency created by Congress in 1950 “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense…” We are the funding source for approximately 24 percent of all federally supported basic research conducted by America’s colleges and universities. In many fields such as mathematics, computer science and the social sciences, NSF is the major source of federal backing.

    seal

     
  • richardmitnick 4:53 pm on March 9, 2016 Permalink | Reply
    Tags: NSF

    From NSF: “All we are is dust in the interstellar wind” 

    nsf
    National Science Foundation

    March 9, 2016
    Sara Dwyer
    (703) 292-4934
    sdwyer@nsf.gov

    Cosmic dust is not simply something to sweep under the rug and forget about.

    Cosmic dust. EAS

    Instead, National Science Foundation (NSF)-funded astronomers are studying and even mapping it to learn more about what it might be hiding from us, where it comes from and what it’s turning into.

    Fly through the cosmic dust of the Milky Way

    Some researchers are delving deep down to see how dust comes together at the atomic level, while others are looking at the big picture to see where stars and planets might be forming in dusty stellar nurseries. Recent discoveries, such as that of a very young galaxy containing much more dust than expected, have shown us that we still have much to learn about where exactly all this dust comes from.

    A little bit of dust makes a very large problem

    Although dust only makes up about 1 percent of the interstellar medium (the stuff between the stars), it can have big effects on astronomical observations. Dust has a bad reputation because it gets in the way by absorbing and scattering the visible light from objects such as far-off galaxies and stars, making them difficult or impossible to observe with optical telescopes.

    Dust’s scattering effect is known as “reddening”: dust scatters the blue light coming from an object, making it appear redder. This occurs because dust has a greater effect on light with short wavelengths, such as blue. A similar effect is what causes sunsets to appear red.

    Astronomers can tell a lot about a star simply by its color, so this reddening effect can trick us into thinking a star is cooler and dimmer than it actually is. However, thanks to NSF-funded astronomers like Doug Finkbeiner of the Harvard-Smithsonian Center for Astrophysics, we can now correct for dust reddening and recover a star’s intrinsic color.
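    As a concrete illustration of that correction, here is a minimal sketch using the standard extinction relations; it is not Finkbeiner’s actual pipeline, and the observed color and E(B−V) values are made up for the example.

```python
# Minimal illustration of dereddening using the standard relations
# (B-V)_intrinsic = (B-V)_observed - E(B-V)  and  A_V = R_V * E(B-V).
# The observed color and E(B-V) values below are invented for the example.

R_V = 3.1   # typical Milky Way ratio of total-to-selective extinction

def deredden_color(observed_b_minus_v, ebv):
    """Intrinsic B-V color given the observed color and the reddening E(B-V)."""
    return observed_b_minus_v - ebv

def visual_extinction(ebv, r_v=R_V):
    """A_V: how many magnitudes of V-band light the dust removes along this sight line."""
    return r_v * ebv

observed = 0.95   # observed B-V (magnitudes), hypothetical star
ebv = 0.30        # reddening toward it, e.g. read off a dust map

print(deredden_color(observed, ebv))   # 0.65 -> the star is intrinsically bluer
print(visual_extinction(ebv))          # ~0.93 mag dimmer in V than it would appear dust-free
```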

    Finkbeiner first began studying cosmic dust as a graduate student at the University of California, Berkeley in the late 1990s. Dust may seem like an odd thing to dedicate an astronomical career to but “dust is not as obscure as it sounds,” Finkbeiner said. “Objects like the Orion Nebula, the Horsehead Nebula, and the Pillars of Creation are dense, dusty clouds intermingled with bright stars, making a beautiful scene. But every part of the sky has at least some dust, and even a tiny amount of dust can interfere with astronomical measurements, so we need a way to correct for it.”

    Orion Nebula. Credit: M. Robberto, NASA/ESA Hubble, Space Telescope Science Institute

    Horsehead Emission nebula

    Pillars of Creation in the Eagle Nebula

    A necessary nuisance

    Knowing where dust is, and where it isn’t, gives us a better understanding of what’s happening in our galaxy. For example, an area saturated with dust may indicate a hotbed of star formation activity, while holes in an otherwise dusty area tell us that a supernova may have occurred and blown a pocket of dust away.

    “Dust is not a very glamorous name for something this important,” said Glen Langston, an NSF astronomy program director. “It represents both sides of star life — star birth and star death.”

    These dusty areas are also factories of cosmic chemistry — chemistry that creates substances such as graphite (otherwise known as the stuff inside your pencil).

    When dying stars explode, they expel dust out into space that can be recycled to make something new. In fact, everything in the universe — stars, comets, asteroids, planets, even humans, started out as grains of dust floating around in space. As the late astronomer Carl Sagan famously said, “The nitrogen in our DNA, the calcium in our teeth, the iron in our blood, the carbon in our apple pies were made in the interiors of collapsing stars. We are made of starstuff.”

    Astronomers can peer into the galaxy and tell that some stars are making dust right now, but other dust might be billions of years old with a long, complicated history of growing, shrinking, freezing and burning as it traveled through space.

    “It’s not a bad analogy to think of dust like grains of sand on the beach,” Finkbeiner said. “You might have sand that looks the same because it’s coming from a coral reef 100 meters away, but in other places you might have sand that came from very far away which has been through a lot over thousands or millions of years.”

    Set your course by the stars…or dust

    Using data from almost one billion stars, Finkbeiner, along with student Gregory Green and former student Edward Schlafly, created a 3-D map of interstellar dust reddening across three quarters of the visible sky. This map allows astronomers to know when the targets of their observations may be suffering a reddening effect, and how much reddening they can expect. (You can further explore our dusty galaxy through several videos on a website Green created.)

    Dust distribution reveals our galaxy’s structure and we can see that most of the dust is contained in the disk, which is the plane in which the spiral arms of our galaxy lie. It also provides a snapshot of our galaxy’s history, showing that the Milky Way has had its fair share of galactic fender benders with other galaxies. In fact, we are due to collide and merge with our neighbor, the Andromeda galaxy, in about 4 billion years.

    Andromeda Galaxy. NASA/ESA Hubble

    Like dents in a bumper, we can see the damage by looking for ghostly trails of dust extending outward from the disk, showing that another galaxy might have passed through, dragging dust from our galaxy along for the ride.

    The map already combines data from 2MASS (the Two Micron All Sky Survey) and Pan-STARRS 1 (the Panoramic Survey Telescope & Rapid Response System), but there’s still a long way to go. Using multiple telescopes, 2MASS surveyed the entire sky in three infrared wavelengths between 1997 and 2001, while Pan-STARRS observes the entire visible sky several times per month. Pan-STARRS has provided a lot of data, but it is a drop in the bucket compared to what’s on the horizon.

    2MASS telescope, Caltech

    Pan-STARRS1 telescope

    A few years from now, DECam (the Dark Energy Camera), a sensitive wide-field camera attached to the 4-meter Victor M. Blanco Telescope, will have looked at the entire southern hemisphere, allowing Finkbeiner to update his map to include the full sky in detail.

    DECam, built at FNAL

    CTIO Victor M. Blanco 4-meter telescope in Chile, which houses DECam

    In the 2020s, LSST (the Large Synoptic Survey Telescope) — a wide-field telescope with an 8.4-meter primary mirror and the largest digital camera ever constructed — will provide data for 10 times more stars than currently available, recording the entire visible sky twice every week.

    LSST camera, built by SLAC, and the building that will house it in Chile

    LSST will gather more than 30 terabytes of data every night, providing more data than ever before. Astronomers like Finkbeiner are excited to face the new challenges this data overload will bring, hoping to solve some of the greatest cosmic mysteries, including the origin of some of the universe’s oldest dust. LSST, DECam, and several other surveys combined will help create a new map of much higher detail.

    In the future, Finkbeiner hopes his map will be incorporated with WorldWide Telescope, a free community-driven computer program that gathers the best images from ground and space-based telescopes and combines them with 3D navigation.

    “I can imagine the final product as something very beautiful,” Finkbeiner said. “So beautiful that every Hollywood movie will want to use it for their flying-through-the-galaxy scenes.”

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    The National Science Foundation (NSF) is an independent federal agency created by Congress in 1950 “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense…” We are the funding source for approximately 24 percent of all federally supported basic research conducted by America’s colleges and universities. In many fields such as mathematics, computer science and the social sciences, NSF is the major source of federal backing.

    seal

     
  • richardmitnick 6:18 pm on January 8, 2016 Permalink | Reply
    Tags: Blue and Green Antibacterial clays, NSF

    From NSF: “Scientists discover how blue and green clays kill bacteria” 

    nsf
    National Science Foundation

    January 8, 2016
    Cheryl Dybas, NSF (703) 292-7734
    cdybas@nsf.gov

    Robert Burnham, ASU
    (480) 458-8207
    robert.burnham@asu.edu

    Researchers unearth a natural clay deposit with antibacterial activity. Credit: ASU

    Scientist Keith Morrison works in a mineral deposit to understand how antibacterial clays form. Credit: ASU

    Outcrop of antibacterial blue clay and elemental sulfur (yellow) in a volcanic sulfide deposit. Credit: ASU

    E.coli bacteria cluster, showing attack of the bacterial membrane (yellow). Credit: ASU

    Researchers Lynda Williams, Rajeev Misra and Maitrayee Bose worked to uncover the mechanism. Credit: ASU

    Since prehistoric times, humans have used clays for medicinal purposes.

    Whether through ingestion, mud baths, or as a way to stop bleeding from wounds, clay has long helped keep humans healthy. Scientists have found that certain clays possess germ-killing abilities, but how these work has remained unclear.

    A new discovery by Arizona State University (ASU) scientists shows that two specific metallic elements in the right kinds of clay can kill disease-causing bacteria that infect humans and animals.

    “The novelty of this research is two-fold: identifying the natural environment of the formation of clays toxic to bacteria, and how the chemistry of these clays attacks and destroys the bacteria,” says Enriqueta Barrera, a program director in the National Science Foundation (NSF) Division of Earth Sciences, which funded the research. “This geochemical mechanism can be used to develop products that act on bacteria resistant to antibiotic treatment.”

    An antibacterial Trojan horse

    “We think of this mechanism like the Trojan horse attack in ancient Greece,” says Lynda Williams, a clay-mineral scientist at ASU. “Two elements in the clay work in tandem to kill bacteria.”

    She explains that “one metallic element — chemically reduced iron, which in small amounts is required by a bacterial cell for nutrition — tricks the cell into opening its wall. Then another element, aluminum, props the cell wall open, allowing a flood of iron to enter the cell. This overabundance of iron then poisons the cell, killing it as the reduced iron becomes oxidized.”

    Adds scientist Keith Morrison of the Lawrence Livermore National Laboratory, “It’s like putting a nail in the coffin of the dead bacteria.”

    Morrison is the lead author of a paper reporting the discovery, published today in the Nature journal Scientific Reports. Williams and Rajeev Misra, a microbiologist at ASU, are co-authors.

    From French green clay to Oregon blue clay

    A chance discovery of a medicinal clay from Europe caught Williams’ attention and put her on track for the recent discovery. Line Brunet de Courssou, a philanthropist with clinical medicine experience in Africa, passed along information about a particular green-hued clay found near her childhood home in France.

    Brunet de Courssou had taken samples of the clay to Africa, where she documented its ability to cure Buruli ulcer, a flesh-eating skin disease, for patients in Ivory Coast.

    Williams attempted to locate the site of the green clay deposit in the French Massif Central region. When the search proved unsuccessful, she began systematically testing clays sold online as “healing clays.”

    After analyzing dozens of samples, Williams and her team identified a blue-colored clay from the Oregon Cascades that proved to be highly antibacterial.

    The research shows that it works against a broad spectrum of human pathogens, including antibiotic-resistant strains such as methicillin-resistant Staphylococcus aureus (MRSA).

    Scanning electron micrograph of a human neutrophil ingesting MRSA. Credit: National Institute of Allergy and Infectious Diseases (NIAID), National Institutes of Health (NIH)

    The colors of the clays reflect their origins, Williams says.

    Greens and blues are antibacterial clues

    The greens and blues of antibacterial clays come from having a high content of chemically reduced iron, as opposed to oxidized iron, which provides the familiar rust color associated with many clays.

    Such “reduced” clays are common in many parts of the world, typically forming in volcanic ash layers as rocks become altered by water that is oxygen-deprived and hydrogen-rich.

    Because blue and green clays abound in nature, Williams says, the discovery of how their antibacterial action works should lead to alternative ways of treating persistent infections and diseases that are difficult to treat with antibiotics.

    “Finding out how natural clays kill human pathogens,” she says, “may lead to new economic uses of such clays and to new drug designs.”

    Part of the investigation involved the use of the NSF-supported Secondary Ion Mass Spectrometry Facility.

    WiscSIMS, the Wisconsin Secondary Ion Mass Spectrometer Laboratory, explores new applications of in situ analysis to stable isotope geochemistry. Research in many disciplines can benefit by using the IMS-1280 ion microprobe, a CAMECA large radius magnetic sector SIMS. The dramatic reductions of sample size and analysis spot sizes from 1 to 10 micrometers offer many exciting, potentially revolutionary, research opportunities.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    The National Science Foundation (NSF) is an independent federal agency created by Congress in 1950 “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense…” We are the funding source for approximately 24 percent of all federally supported basic research conducted by America’s colleges and universities. In many fields such as mathematics, computer science and the social sciences, NSF is the major source of federal backing.

    seal

     
  • richardmitnick 2:25 pm on January 6, 2016 Permalink | Reply
    Tags: NSF, Peering into the secret world of life beneath winter snows, Snow

    From NSF: “Peering into the secret world of life beneath winter snows” 

    nsf
    National Science Foundation

    January 5, 2016
    Cheryl Dybas, NSF
    (703) 292-7734
    cdybas@nsf.gov

    A secret world, unseen by most humans, is alive just beneath the winter snow. Credit: Kristin Link, https://www.flickr.com/photos/kristinillustration/

    Snow covers some 40 percent of Earth’s land masses year in and year out. And, as scientists are discovering, snow is critical to animals and plants that live in northern latitudes, as well as those in far southern latitudes like Patagonia at the tip of South America. It ensures their — and our — survival.

    “Without snow, plant and animal life would be completely different,” says biologist Jonathan Pauli of the University of Wisconsin-Madison.

    Pauli and scientists such as Ben Zuckerberg, also of the University of Wisconsin-Madison, are members of a new breed of researchers called winter ecologists. The field, which focuses on relationships among animals, plants and their snow-covered environments, is relatively new.

    “Compared to other habitats, snow ecosystems have been barely explored,” Pauli says. “That’s a major oversight, considering how important snow is in the lives of so many species.”

    That list of species includes humans. Our spring and summer water resources depend heavily on meltwater from winter snows.

    Nature’s igloo

    With a grant from the National Science Foundation (NSF)’s MacroSystems Biology Program, Pauli and Zuckerberg conduct research on what’s called the subnivium, a seasonal and sensitive refuge beneath the snow’s surface that’s insulated and maintains a constant temperature. It’s nature’s igloo.

    “We know very little about plants and animals that survive winter beneath the snow,” says Liz Blood, a program director in NSF’s Directorate for Biological Sciences. “This research is taking an innovative approach to studying the consequences of climate change on overwintering success of plants and animals in the subnivium.”

    What will happen if snow disappears in a warmer world?

    Pauli and colleagues believe warming winters caused by climate change reduce the subnivium’s duration, depth and insulation.

    As a result of these altered conditions, they report in the journals Frontiers in Ecology and the Environment and PLOS One, the subnivium is showing more temperature variability and decreased — not increased — temperatures. Without enough snow, temperatures fall due to loss of insulation.

    “In a warmer world with less snow, winter soils would be colder because the insulating snow layer on top is reduced,” says Henry Gholz, a program director in NSF’s Division of Environmental Biology. “That has implications for farmers planting crops in spring, as well as for the many burrowing mammals, microbes and insects that overwinter in snow.”

    The changes will have important implications for species that need the subnivium to survive, Zuckerberg says, “and will lead to large-scale shifts in their ranges.”

    Everything depends, adds Pauli, on having enough snow.

    No two snowflakes, nor snows, alike

    As children learn, no two snowflakes form in quite the same way. And, as northern peoples like the Inuit of Alaska know, no two snows, or snowfalls, are exactly alike either.

    Annui, api, pukak. Qali, siqoq, kimoagruk. Upsik, qamaniq, siqoqtoaq — these are the snow-words of the Inuit of northwestern Alaska. They’ve also become terms for snow used by many winter ecologists.

    Perhaps you’re reading this article nestled by a roaring fire, protected from the elements in a warm den or living room. Outside your icicle-laden window, you notice falling snow: annui.
    Should you awaken one morning to snow covering the ground, you’re seeing api.
    Pukak is the layer at the bottom of a snow bank, “which is critical to the small mammals that live there in winter,” Pauli says. “There’s an entire world going on beneath the snow that we can’t see.”
    Qali snow, nature’s paintbrush, frosts the limbs of trees, and is no less ecologically important, providing shelter to birds and other animals seeking escape from the cold. The golden-crowned kinglet, for example, a tiny songbird of northern forests, survives winter’s frigid nights because it can huddle under qali on conifer branches.
    Trees whose branches are iced-over are said to be covered with kanik, and snow that swirls in whorls is siqoq. When those snows form drifts, they’re known as kimoagruk.
    Winter winds may eventually compact snow into a hard surface, called upsik, which offers a highway to animals like deer and moose that navigate best on hard-packed surfaces.
    Qamaniq leaves hollows around the bases of trees; it offers shelter to birds such as ruffed and spruce grouse, and to snowshoe hares.
    As spring arrives, it brings siqoqtoaq, “the sun crust” in the surface layer of snow that melts by day and re-freezes by night. In siqoqtoaq, microbes called snow algae, dormant during early winter months, bloom bright red and green.

    Species need snow

    Gazelles in the Gobi Desert in northern China and southern Mongolia rely on snow mines, oases of water from melting snows buried beneath the sand. Other species use snow in ways that are just as resourceful.

    Tiny mammals like shrews don’t migrate but spend their winters in the subnivium. River otters slide down snowbanks and through openings in ice-covered waters to find fish. Gangly moose use snow as a footstool to better reach tender shoots at the ends of branches. And some of the tiniest forms of life — fungi and other microorganisms that live in and under the snow — remain active throughout the winter.

    Without fungi, wildflowers’ summer bloom in mountain environments wouldn’t happen. Fungi increase their metabolism as winter progresses, releasing nutrients from their by-products as snow melts in spring.

    Snow research in a greenhouse

    Since plants and animals depend on snow cover in winter, what lies ahead if a warming global climate reduces snow levels?

    “We should be very concerned about these changes,” Pauli says.

    He, Zuckerberg, Warren Porter of the University of Wisconsin-Madison and Brian McMahon of Operation Fresh Start are working to assess climate change effects on the sensitive subnivian habitat.

    Using micro-greenhouses placed at sites in Wisconsin, Minnesota and Michigan, the ecologists mimic climate conditions predicted for the Great Lakes region by 2050. The micro-greenhouses automatically open when it snows, allowing a subnivium to form inside.

    The research reveals how future subnivium conditions will affect the physiology, survival and distribution of species dependent on this realm-beneath-the-snow.

    To date, micro-greenhouse experiments show that although the ambient temperature within a micro-greenhouse is set 5 degrees Celsius warmer than the temperature outside, “the natural daily minimum subnivium temperature drops significantly lower inside,” Pauli says.

    Translation: warmer conditions mean less snow cover, and less snow cover means less insulation.

    Without snow cover and its insulation, the greenhouse interior and its subnivian inhabitants, such as hibernating amphibians, are exposed to the winter cold.
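    A minimal sketch of how such a comparison of daily minimum temperatures could be made from paired temperature-logger records follows; the data, column names and warming pattern are hypothetical, and this is not the research team’s actual analysis code.

    import pandas as pd

    # Hypothetical hourly subnivium temperatures (deg C) from paired loggers,
    # one inside a micro-greenhouse and one outside; illustrative values only.
    records = pd.DataFrame({
        "timestamp": pd.date_range("2015-01-01", periods=72, freq="H"),
        "inside_c":  [-2.0 - (i % 24) * 0.3 for i in range(72)],
        "outside_c": [-1.0 - (i % 24) * 0.1 for i in range(72)],
    })

    # Daily minimum subnivium temperature for each treatment.
    daily_min = (records.set_index("timestamp")
                        .resample("D")[["inside_c", "outside_c"]]
                        .min())

    # How much colder the warmed, less-insulated treatment gets each day.
    daily_min["difference_c"] = daily_min["inside_c"] - daily_min["outside_c"]
    print(daily_min)

    The design choice is simply to compare like with like: daily minima inside versus outside the warmed treatment, which is the quantity the quoted finding describes.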

    “Our findings suggest climate change could have considerable effects on the refuge quality of the subnivium,” Pauli says.

    The future for plants and animals, including us, say Pauli and Zuckerberg, looks brightest under white winters, not green ones.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    The National Science Foundation (NSF) is an independent federal agency created by Congress in 1950 “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense…” NSF is the funding source for approximately 24 percent of all federally supported basic research conducted by America’s colleges and universities. In many fields such as mathematics, computer science and the social sciences, NSF is the major source of federal backing.


     
  • richardmitnick 12:03 pm on December 19, 2015 Permalink | Reply
    Tags: , , , NSF   

    From Eos: “NSF Director Cautions Against Politicizing Science” 


    Eos

    France Córdova says that elected officials are generally supportive of science and technology but that the political environment can be challenging.

    National Science Foundation Director France Córdova, who spoke at the American Geophysical Union’s Fall Meeting on 15 December, called for scientists to continue setting their own priorities for goals and challenges to meet in the sciences. Credit: Gary Wagner

    17 December 2015
    Randy Showstack

    “It has been a tough time for geosciences on [Capitol] Hill,” National Science Foundation (NSF) Director France Córdova said in a 15 December speech at the American Geophysical Union’s (AGU) Fall Meeting in San Francisco, Calif.

    Earlier in 2015, for instance, some members of Congress had proposed restricting funding for geosciences within NSF and questioned climate change.

    However, a bipartisan omnibus spending bill for fiscal year (FY) 2016 introduced by the House Appropriations Committee in the early morning hours of 16 December—less than a day after Córdova’s lecture—would help to provide additional support for the geosciences at NSF. The bill would increase NSF’s funding by 1.6% above the FY 2015 enacted level.

    During a media availability following her AGU speech, Córdova said that the appropriations package that was then in the works includes a number of NSF priorities, “and just about all of them have to do with the geosciences.”

    Challenges to the Geosciences

    “Some would challenge [geosciences’] goal to understand our planet as not of the highest priority for the science agencies,” Córdova said during the Union Agency Lecture. “Some would find a hypothetical ocean on a distant moon of more interest than our own ocean, whose mysteries have barely been tapped.”

    “It is your challenge as scientists to ride this questioning tide with your best tools: your quest for truth; your application of the scientific method to increase our knowledge about our planet; and your ability to communicate in all directions the beauty, the value, the importance of the geosciences,” she said. Her speech also touched on NSF’s priorities, funding, investments, and instruments.

    Setting Priorities for Discovery

    During the media availability following her speech, Córdova elaborated on her concerns. Elected officials are “very enamored of science and technology, innovation in particular, and very supportive in general,” she said, adding, “They do have individual predilections about what they think is important to do.”

    The NSF director said it is “helpful to have the perspective that our major discoveries over history, over time, have been multidisciplinary discoveries. So, leaving out one branch of science in favor of others is just not a good thing to do for the progress of science. We have to appreciate that it’s interdisciplinary.”

    “Science and scientists should be setting the priorities for what are the big goals, the big challenges in the discovery space,” she said. “If that decision about what to prioritize becomes political, then who knows where it will all end. It will be unstable and certainly not flexible for the disciplines. And we can’t raise the next generation of scientists and engineers unless we do have support and flexibility.”

    Citation: Showstack, R. (2015), NSF director cautions against politicizing science, Eos, 96, doi:10.1029/2015EO042063. Published on 17 December 2015.

    See the full article here.

    Please help promote STEM in your local schools.


    STEM Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     