Tagged: Applied Research & Technology

  • richardmitnick 12:55 pm on May 27, 2015 Permalink | Reply
    Tags: Applied Research & Technology

    From LBL: “Berkeley Lab Scientist Invents New Technique to Understand Cloud Behavior” 

    Berkeley Lab

    May 27, 2015
    Julie Chao (510) 486-6491

    Berkeley Lab researchers David Romps and Rusen Oktem collected these images in Florida, showing 14 minutes of cloud movements.

    With two off-the-shelf digital cameras situated about 1 kilometer apart facing Miami’s Biscayne Bay, Lawrence Berkeley National Laboratory scientists David Romps and Rusen Oktem are collecting three-dimensional data on cloud behavior that it has never before been possible to collect.

    The photos allow Romps, a climate scientist who specializes in clouds, to measure how fast the clouds rise, which in turn can shed light on questions ranging from lightning rates to extreme precipitation to the ozone hole. Perhaps most importantly, a better understanding of basic cloud behavior will allow scientists to improve global climate models.

    “We want to answer a very basic question: with what speeds do clouds rise through the atmosphere? This is very difficult to answer by any technology other than stereophotogrammetry,” he said. “Knowing their speeds is important for several reasons; the important one is that we lack a really basic understanding of what processes control these clouds, the levels they peter out at, and how buoyant they are.”

    While stereophotogrammetry, which uses photos to make 3D measurements of cloud boundaries, has been used before to study cloud behavior, Romps’ innovation was a technique that does not require a reference point, such as a mountain or other land-based feature. This allows scientists to study clouds over the open ocean.

    “We have a lot of measurements of clouds over land, but far fewer over the ocean,” Romps said. “The behavior can be quite different. For example, looking at satellite data, you see continental areas light up with a lot of lightning and oceans less so.”

    Berkeley Lab researchers Rusen Oktem (left) and David Romps

    The technique was detailed in a paper published last year in the Journal of Atmospheric and Oceanic Technology, titled Stereophotogrammetry of Oceanic Clouds. Co-authors include Berkeley Lab computing experts Oktem, James Lee, Aaron Thomas, and Prabhat, and Paquita Zuidema of the University of Miami.

    The paper describes how to set up and calibrate the two cameras; Romps and his team also devised algorithms to automate the 3D reconstruction, quickly finding feature points and matching them. The accuracy of the technique was validated with lidar and radiosondes.
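
    The geometric core of the method is ordinary stereo triangulation: the same cloud feature, matched in both cameras, defines two rays whose near-intersection fixes its position in space. Here is a minimal sketch of that step in Python; the camera placement, calibration numbers, and pixel coordinates are invented for illustration, and the published pipeline also handles calibration and automated feature matching.

```python
# Minimal stereo triangulation sketch (illustrative parameters only).
import numpy as np

def pixel_to_ray(cam_pos, R, focal_px, cx, cy, u, v):
    """Return (origin, unit direction) of the ray through pixel (u, v).

    R rotates camera-frame vectors into world coordinates; the camera
    frame has +z pointing out of the lens.
    """
    d_cam = np.array([(u - cx) / focal_px, (v - cy) / focal_px, 1.0])
    d_world = R @ d_cam
    return cam_pos, d_world / np.linalg.norm(d_world)

def triangulate(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two skew rays."""
    A = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(o2 - o1) @ d1, (o2 - o1) @ d2])
    t1, t2 = np.linalg.solve(A, b)
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))

# Two cameras about 1 km apart (hypothetical placement and calibration).
cam1, cam2 = np.array([0.0, 0.0, 0.0]), np.array([1000.0, 0.0, 0.0])
R = np.eye(3)                       # both cameras face +z, for simplicity
focal_px, cx, cy = 2000.0, 1296.0, 972.0

# A matched cloud feature lands on slightly different pixels in each camera.
o1, d1 = pixel_to_ray(cam1, R, focal_px, cx, cy, 1500.0, 700.0)
o2, d2 = pixel_to_ray(cam2, R, focal_px, cx, cy, 1100.0, 700.0)
print("feature position (m):", triangulate(o1, d1, o2, d2))
```

    Repeating the reconstruction on frames taken seconds apart turns feature positions into trajectories, which is how the rise speeds discussed below are obtained.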

    With images taken every 10 to 30 seconds, “we can really start to look at the full lifecycle of clouds,” said Romps, who has a joint appointment in UC Berkeley’s Department of Earth and Planetary Science. His technique also offers far higher spatial and temporal resolution than other technologies.

    The Department of Energy’s Atmospheric Radiation Measurement (ARM) program has funded a second set of cameras at its Southern Great Plains site in Oklahoma, the largest and most extensive climate research site in the world. Across about 55,000 square miles, clusters of lidar, radar, and other sophisticated monitoring equipment gather massive amounts of data to study the effects of aerosols, precipitation, surface fluxes, and clouds on the global climate.

    One of the cameras looking over Miami’s Biscayne Bay.

    Using stereophotogrammetry, Romps has measured the speeds of shallow clouds rising through the atmosphere at 1 to 3 meters per second and of deeper clouds rising at speeds in excess of 10 meters per second. “The updraft speeds play an important role in the microphysics, general precipitation, and aerosol processing, which all impact climate simulations,” he said.

    Updraft speeds can also impact lightning rates, as faster-rising clouds tend to produce more lightning. And if clouds rise fast enough, they can penetrate the stratosphere. “They can throw out water vapor and ice, which sets the humidity of the stratosphere, and that has an impact both because water vapor is a greenhouse gas and also because water vapor, through a sequence of events, has an effect on the ozone hole,” Romps said.

    The largest source of uncertainty in today’s climate models is clouds. “We are still seeking a fundamental theory for moist convection, or what we call convection with phase changes. Without that theory, it is difficult to construct more accurate parameterizations [or models] of clouds that go into global climate models,” Romps said. “Stereophotogrammetry can provide very useful information in this quest.”

    The next steps are to combine the stereophotogrammetric data with other observations at the ARM site to answer basic questions about cloud life cycles. In particular, Romps and colleagues want to understand what environmental conditions can be used to forecast the sizes, speeds, depths, and lifetimes of convective clouds.

    Romps and Oktem are also developing new techniques to automate the reconstruction of three-dimensional cloud features. Until now, stereophotogrammetry has been a labor-intensive process, but their new algorithms have been used on supercomputers to rapidly reconstruct 35 million cloud features from a three-month period. “The development of these new algorithms makes stereophotogrammetry a tool that can now be used on a regular basis,” Romps said.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California


     
  • richardmitnick 7:59 am on May 27, 2015 Permalink | Reply
    Tags: Applied Research & Technology

    From Caltech: “Using Radar Satellites to Study Icelandic Volcanoes and Glaciers” 

    Caltech

    05/26/2015
    Kimm Fesenmaier

    This Landsat 8 image, acquired on September 6, 2014, is a false-color view of the Holuhraun lava field north of Vatnajökull glacier in Iceland. The image combines shortwave infrared, near infrared, and green light to distinguish between cooler ice and steam and hot extruded lava. The Bárðarbunga caldera, visible in the lower left of the image under the ice cap, experienced a large-scale collapse starting in mid-August. Credit: USGS

    Landsat 8

    On August 16 of last year, Mark Simons, a professor of geophysics at Caltech, landed in Reykjavik with 15 students and two other faculty members to begin leading a tour of the volcanic, tectonic, and glaciological highlights of Iceland. That same day, a swarm of earthquakes began shaking the island nation—seismicity that was related to one of Iceland’s many volcanoes, Bárðarbunga caldera, which lies beneath Vatnajökull ice cap.

    Bárðarbunga

    As the trip proceeded, it became clear to scientists studying the event that magma beneath the caldera was feeding a dyke, a vertical sheet of magma slicing through the crust in a northeasterly direction. On August 29, as the Caltech group departed Iceland, the dyke triggered an eruption in a lava field called Holuhraun, about 40 kilometers (roughly 25 miles) from the caldera, just beyond the northern limit of the ice cap.

    Although the timing of the volcanic activity necessitated some shuffling of the trip’s activities, such as canceling planned overnight visits near what was soon to become the eruption zone, it was also scientifically fortuitous. Simons is one of the leaders of a Caltech/JPL project known as the Advanced Rapid Imaging and Analysis (ARIA) program, which aims to use a growing constellation of international imaging radar satellites to improve situational awareness, and thus response, following natural disasters. Under the ARIA umbrella, Caltech and JPL/NASA had already formed a collaboration with the Italian Space Agency (ASI) to use its COSMO-SkyMed (CSK) constellation (consisting of four orbiting X-Band radar satellites) following such events.

    Through the ASI/ARIA collaboration, the managers of CSK agreed to target the activity at Bárðarbunga for imaging using a technique called interferometric synthetic aperture radar (InSAR). As two CSK satellites flew over, separated by just one day, they bounced signals off the ground to create images of the surface of the glacier above the caldera. By comparing those two images in what is called an interferogram, the scientists could see how the glacier surface had moved during that intervening day. By the evening of August 28, Simons was able to pull up that first interferogram on his cell phone. It showed that the ice above the caldera was subsiding at a rate of 50 centimeters (more than a foot and a half) a day—a clear indication that the magma chamber below Bárðarbunga caldera was deflating.
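
    To get a feel for how an interferogram encodes motion: because the radar signal travels to the ground and back, each full cycle of interferometric phase (one fringe) corresponds to half a radar wavelength of line-of-sight displacement. A back-of-the-envelope sketch, using the approximate 3.1-centimeter X-band wavelength of the CSK satellites and an invented fringe count:

```python
# Phase-to-displacement conversion for repeat-pass InSAR (illustrative).
import math

WAVELENGTH = 0.031   # m, approximate X-band wavelength of COSMO-SkyMed

def los_displacement(phase_rad):
    """Line-of-sight displacement implied by an unwrapped phase change."""
    return phase_rad * WAVELENGTH / (4.0 * math.pi)

# One fringe = half a wavelength of motion:
print(los_displacement(2.0 * math.pi))          # ~0.0155 m
# ~32 fringes accumulated over one day would imply ~0.5 m of motion:
print(los_displacement(32 * 2.0 * math.pi))     # ~0.50 m
```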

    The next morning, before his return flight to the United States, Simons took the data to researchers at the University of Iceland who were tracking Bárðarbunga’s activity.

    “At that point, there had been no recognition that the caldera was collapsing. Naturally, they were focused on the dyke and all the earthquakes to the north,” says Simons. “Our goal was just to let them know about the activity at the caldera because we were really worried about the possibility of triggering a subglacial melt event that would generate a catastrophic flood.”

    Luckily, that flood never happened, but the researchers at the University of Iceland did ramp up observations of the caldera with radar altimetry flights and installed a continuous GPS station on the ice overlying the center of the caldera.

    Last December, Icelandic researchers published a paper in Nature about the Bárðarbunga event, largely focusing on the dyke and eruption. Now, completing the picture, Simons and his colleagues have developed a model to describe the collapsing caldera and the earthquakes produced by that action. The new findings appear in Geophysical Journal International.

    “Over a span of two months, there were more than 50 magnitude-5 earthquakes in this area. But they didn’t look like regular faulting—like shearing a crack,” says Simons. “Instead, the earthquakes looked like they resulted from movement inward along a vertical axis and horizontally outward in a radial direction—like an aluminum can when it’s being crushed.”

    To try to determine what was actually generating the unusual earthquakes, Bryan Riel, a graduate student in Simons’s group and lead author on the paper, used the original one-day interferogram of the Bárðarbunga area along with four others collected by CSK in September and October. Most of those one-day pairs spanned at least one of the earthquakes, but in a couple of cases, they did not. That allowed Riel to isolate the effect of the earthquakes and determine that most of the subsidence of the ice was due to what is called aseismic activity—the kind that does not produce big earthquakes. Thus, Riel was able to show that the earthquakes were not the primary cause of the surface deformation inferred from the satellite radar data.
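
    The logic of that isolation can be caricatured in a few lines: one-day pairs that do not span a magnitude-5 event measure the aseismic rate alone, while pairs that do span one measure that rate plus the coseismic step. The numbers below are invented for illustration and are not values from the study.

```python
# Toy separation of coseismic vs. aseismic subsidence (invented numbers).
subsidence_spanning_quake = 0.55   # m/day from a pair containing an M5 event
subsidence_quiet = 0.48            # m/day from a pair with no large event

coseismic_step = subsidence_spanning_quake - subsidence_quiet
aseismic_fraction = subsidence_quiet / subsidence_spanning_quake
print(f"coseismic step ~{coseismic_step:.2f} m")
print(f"~{aseismic_fraction:.0%} of the subsidence is aseismic")
```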

    “What we know for sure is that the magma chamber was deflating as the magma was feeding the dyke going northward,” says Riel. “We have come up with two different models to explain what was actually generating the earthquakes.”

    In the first scenario, because the magma chamber deflated, pressure from the overlying rock and ice caused the caldera to collapse, producing the unusual earthquakes. This mechanism has been observed in cases of collapsing mines (e.g., the Crandall Canyon Mine in Utah).

    The second model hypothesizes that there is a ring fault arcing around a significant portion of the caldera. As the magma chamber deflated, the large block of rock above it dropped but periodically got stuck on portions of the ring fault. As the block became unstuck, it caused rapid slip on the curved fault, producing the unusual earthquakes.

    “Because we had access to these satellite images as well as GPS data, we have been able to produce two potential interpretations for the collapse of a caldera—a rare event that occurs maybe once every 50 to 100 years,” says Simons. “To be able to see this documented as it’s happening is truly phenomenal.”

    Additional authors on the paper, The collapse of Bárðarbunga caldera, Iceland, are Hiroo Kanamori, John E. and Hazel S. Smits Professor of Geophysics, Emeritus, at Caltech; Pietro Milillo of the University of Basilicata in Potenza, Italy; Paul Lundgren of JPL; and Sergey Samsonov of the Canada Centre for Mapping and Earth Observation. The work was supported by a NASA Earth and Space Science Fellowship and by the Caltech/JPL President’s and Director’s Fund.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”

     
  • richardmitnick 4:08 pm on May 26, 2015 Permalink | Reply
    Tags: Applied Research & Technology

    From Symmetry: “A goldmine of scientific research” 

    Symmetry

    May 26, 2015
    Amelia Williamson Smith

    Photo by Anna Davis

    The underground home of the LUX dark matter experiment has a rich scientific history.

    There’s more than gold in the Black Hills of South Dakota. For more than five decades, the Homestake mine has hosted scientists searching for particles impossible to detect on Earth’s surface.

    It all began with the Davis Cavern.

    In the early 1960s, Ray Davis, a nuclear chemist at Brookhaven National Laboratory, designed an experiment to detect particles produced in fusion reactions in the sun. The experiment would earn him a share of the Nobel Prize in Physics in 2002.

    Davis was searching for neutrinos, fundamental particles that had been discovered only a few years before. Neutrinos are very difficult to detect; they can pass through the entire Earth without bumping into another particle. But they are constantly streaming through us. So, with a big enough detector, Davis knew he could catch at least a few.

    Davis’ experiment had to be done deep underground; without the shielding of layers of rock and earth, it would be flooded by the shower of cosmic rays constantly raining down from space.

    Davis put his first small prototype detector in a limestone mine near Akron, Ohio. But it was only about half a mile underground, not deep enough.

    “The only reason for mining deep into the earth was for something valuable like gold,” says Kenneth Lande, professor of physics at the University of Pennsylvania, who worked on the experiment with Davis. “And so a gold mine became the obvious place to look.”

    But there was no precedent for hosting a particle physics experiment in such a place. “There was no case where a physics group would appear at a working mine and say, ‘Can we move in please?’”

    Davis approached the Homestake Mining Company anyway, and the company agreed to excavate a cavern for the experiment.

    BNL funded the experiment. In 1965, it was installed in a cavern 4850 feet below the surface.

    The detector consisted of a 100,000-gallon tank of perchloroethylene, a dry-cleaning fluid rich in chlorine atoms. Davis had predicted that as solar neutrinos passed through the tank, one would occasionally collide with a chlorine atom, changing it to an argon atom. After letting the detector run for a couple of months at a time, Davis’ team would flush out the tank and count the argon atoms to determine how many neutrino interactions had occurred.

    “The detector had approximately 10³¹ atoms in it. One argon atom was produced every two days,” Lande says. “To design something that could do that kind of extraction was mind-boggling.”
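
    The extraction arithmetic rewards a closer look. Argon-37 is itself radioactive, with a half-life of about 35 days, so the number of atoms in the tank saturates rather than growing without limit; a simple sketch using the production rate Lande quotes shows why runs of a couple of months were a natural cadence. (A back-of-the-envelope model, not the experiment’s actual analysis.)

```python
# Argon-37 accumulation: production balanced against radioactive decay.
import math

HALF_LIFE = 35.0                  # days, argon-37
LAM = math.log(2) / HALF_LIFE     # decay constant, 1/day
PRODUCTION = 0.5                  # atoms/day ("one every two days")

def atoms_in_tank(t_days):
    """Atoms present after t days of continuous running."""
    return (PRODUCTION / LAM) * (1.0 - math.exp(-LAM * t_days))

for t in (14, 30, 60, 120):
    print(f"after {t:3d} days: ~{atoms_in_tank(t):4.1f} argon atoms")
# Saturates near PRODUCTION / LAM ~ 25 atoms -- a vanishingly small
# harvest to pull out of ~10^31 atoms of fluid.
```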

    Ray Davis. Courtesy of Brookhaven National Laboratory

    A different kind of laboratory

    During the early years of the Davis experiment, around 2000 miners worked at the mine, along with engineers and geologists. The small group of scientists working on the Davis experiment would travel down into the mine with them.

    To go down the shaft to the 4850-foot level, they would get into what was called the “cage,” a 4.4-foot by 12.5-foot metal conveyance that held 36 people. The ride down, lit only by the glow of a couple of headlamps, took about five minutes, says Tom Regan, former operations safety manager and now safety consultant, who worked as a student laborer in the mine during the early years of the Davis experiment.

    Once they reached the 4850-foot level, the scientists walked across a rock dump. “It was guarded so a person couldn’t fall down the hole,” Regan says. “But you had to sometimes wait for a production train of rock or even loads of supplies or men or materials.”

    The Davis Cavern was 24 feet long, 24 feet wide, and 30 feet high. A small room off to the side held the group’s control system. “We were basically out of touch with the rest of the world when we were underground,” Lande says. “There was no difference between day and night, heat and cold, and snow and sunshine.”

    The miners and locals from Lead, South Dakota—the community surrounding the mine—were welcoming of the scientists and interested in their work, Lande says. “We’d go out to dinner at the local restaurant and we’d hear this hot conversation in the next booth, and they would be discussing black holes and neutron stars. So science became the talk of the small town.”

    Davis Cavern, during the solar neutrino experiment. Photo by Anna Davis

    The solar neutrino problem

    As the experiment began taking data, Davis’ group found they were detecting only about one-third the number of neutrinos predicted—a discrepancy that became known as the “solar neutrino problem.”

    Davis described the situation in his Nobel Prize biographical sketch: “My opinion in the early years was that something was wrong with the standard solar model; many physicists thought there was something wrong with my experiment.”

    However, every test of the experiment confirmed the results, and no problems were found with the model of the sun. Davis’ group began to suspect it was instead a problem with the neutrinos.

    This suspicion was confirmed in 2001, when the Sudbury Neutrino Observatory experiment [SNO] in Canada determined that as solar neutrinos travel through space, they oscillate, or change, between three flavors—electron, muon and tau. By the time neutrinos from the sun reach the Earth, they are an equal mixture of the three types.

    Sudbury Neutrino Observatory (SNO)

    The Davis experiment was sensitive only to electron neutrinos, so it was able to detect only one-third of the neutrinos from the sun. The solar neutrino problem was solved.

    Davis Cavern, during a more recent expansion. Photo by Matthew Kapust, Sanford Underground Research Facility

    A different kind of gold

    The Davis experiment ran for almost 40 years, until the mine closed in 2003.

    But the days of science in the Davis Cavern weren’t over. In 2006, the mining company donated Homestake to the state of South Dakota. It was renamed the Sanford Underground Research Facility.

    In 2009, many former Homestake miners became technicians on a $15.2 million project to renovate the experimental area. They completed the new 30,000-square-foot Davis Campus in 2012.

    Although scientists still ride in the cage to get down to the 4850-foot level of the mine, once they arrive it looks completely different.

    “It’s a very interesting contrast,” says Stanford University professor Thomas Shutt of SLAC National Accelerator Laboratory. “Going into the mine, it’s all mining carts, rust and rock, and then you get down to the Davis Campus, and it’s a really state-of-the-art facility.”

    The campus now contains block buildings with doors and windows. It has its own heating and air conditioning system, ventilation system, humidifiers and dust filters.

    The original Davis Cavern has been expanded and now houses the Large Underground Xenon (LUX) experiment, the most sensitive detector yet built to search for what many consider the most promising dark matter candidate: weakly interacting massive particles, or WIMPs.

    LUX

    Shielded from distracting background particles this far underground, scientists hope LUX will detect the rare interactions of dark matter particles with the nuclei of xenon atoms in its 368-kilogram tank.

    Another cavern nearby was excavated as part of the Davis Campus renovation project and now holds the Majorana Demonstrator experiment, which will soon start to examine whether neutrinos are their own antimatter partners.

    Majorana Demonstrator Experiment

    LUX began taking data in 2013. It is currently on its second run and will continue through spring 2016.

    After its current run, LUX will be replaced by the LUX-ZEPLIN, or LZ, experiment, which will have 50 times the usable mass of LUX and several hundred times its sensitivity.

    LZ

    Science in the mine is still the talk of the town in Lead, says Carmen Carmona, an assistant project scientist at the University of California, Santa Barbara, who works on LUX. “When you go out on the streets and talk to people—especially the families of the miners from the gold mine days—they want to know how it is working underground now and how the experiment is going.”

    The spirit of cooperation between the mining community, the science community and the public community lives on, Regan says.

    “It’s been kind of a legacy to provide the beneficial space and be good neighbors and good hosts,” Regan says. “Our goal is for them to succeed, so we do everything we can to help and provide the best and safest place for them to do their good science.”

    In 2010, Sanford Lab enlarged the Davis Cavern to support the Large Underground Xenon experiment. Matthew Kapust, Sanford Underground Research Facility

    This cavern is being outfitted for the Compact Accelerator System Performing Astrophysical Research. CASPAR will use a low-powered accelerator to study what happens when stars die. Matthew Kapust, Sanford Underground Research Facility

    Davis Cavern undergoes outfitting for the LUX experiment. Matthew Kapust, Sanford Underground Research Facility

    Each day scientists working at the Davis Campus pass this area, known as the Big X. The entrance to the Davis Campus is to the left; the Yates Shaft is to the right. Matthew Kapust, Sanford Underground Research Facility

    LUX researchers install the detector at the 4850 level. Matthew Kapust, Sanford Underground Research Facility

    The Majorana Demonstrator experiment requires a very strict level of cleanliness. Researchers work in full clean room garb and assemble their detectors inside nitrogen-filled glove boxes. Matthew Kapust, Sanford Underground Research Facility

    The LUX detector was built in a clean room on the surface and then brought underground. Matthew Kapust, Sanford Underground Research Facility

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 9:02 pm on May 22, 2015 Permalink | Reply
    Tags: Applied Research & Technology

    From PPPL: “A little drop will do it: Tiny grains of lithium can dramatically improve the performance of fusion plasmas” 


    PPPL

    May 22, 2015
    Raphael Rosen

    Left: DIII-D tokamak. Right: Cross-section of plasma in which lithium has turned the emitted light green. (Credits: Left, General Atomics; Right, Steve Allen, Lawrence Livermore National Laboratory)

    Scientists from General Atomics and the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) have discovered a phenomenon that can improve the performance of fusion plasmas, a finding that may quicken the development of fusion energy. Together with a team of researchers from across the United States, the scientists found that when they injected tiny grains of lithium into a plasma undergoing a particular kind of turbulence, the temperature and pressure rose dramatically under the right conditions. High heat and pressure are crucial to fusion, a process in which atomic nuclei – or ions – smash together and release energy — making even a brief rise in pressure of great importance for the development of fusion energy.

    “These findings might be a step towards creating our ultimate goal of steady-state fusion, which would last not just for milliseconds, but indefinitely,” said Tom Osborne, a physicist at General Atomics and lead author of the paper. This work was supported by the DOE Office of Science.

    The scientists used a device developed at PPPL to inject grains of lithium measuring some 45 millionths of a meter in diameter into a plasma in the DIII-D National Fusion Facility – or tokamak – that General Atomics operates for DOE in San Diego.

    DIII-D National Fusion Facility

    When the lithium was injected while the plasma was relatively calm, the plasma remained basically unaltered. Yet as reported this month in a paper in Nuclear Fusion, when the plasma was undergoing a kind of turbulence known as a “bursty chirping mode,” the injection of lithium doubled the pressure at the outer edge of the plasma. In addition, the length of time that the plasma remained at high pressure rose by more than a factor of 10.

    Experiments have sustained this enhanced state for up to one-third of a second. A key scientific objective will be to extend this enhanced performance for the full duration of a plasma discharge.

    Physicists have long known that adding lithium to a fusion plasma increases its performance. The new findings surprised researchers, however, since the small amount of lithium raised the plasma’s temperature and pressure more than had been expected.

    These results “could represent the birth of a new tool for influencing or perhaps controlling tokamak edge physics,” said Dennis Mansfield, a physicist at PPPL and a coauthor of the paper who helped develop the injection device called a “lithium dropper.” Also working on the experiments were researchers from Lawrence Livermore National Laboratory, Oak Ridge National Laboratory, the University of Wisconsin-Madison and the University of California-San Diego.

    Conditions at the edge of the plasma have a profound effect on the superhot core of the plasma where fusion reactions take place. Increasing pressure at the edge region raises the pressure of the plasma as a whole. And the greater the plasma pressure, the more suitable conditions are for fusion reactions. “Making small changes at the plasma’s edge lets us increase the pressure further within the plasma,” said Rajesh Maingi, manager of edge physics and plasma-facing components at PPPL and a coauthor of the paper.

    Further experiments will test whether the lithium’s interaction with the bursty chirping modes — so-called because the turbulence occurs in pulses and involves sudden changes in pitch — caused the unexpectedly strong overall effect.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University. PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

     
  • richardmitnick 9:43 am on May 22, 2015 Permalink | Reply
    Tags: Applied Research & Technology

    From Arizona: “New Insights Into Drivers of Earth’s Ecosystems” 

    University of Arizona

    May 21, 2015
    Raymond Sanchez

    Seawater samples are prepared for extraction of marine viruses aboard the Tara Oceans vessel. (Photo: Anna Deniaud/Tara Oceans)

    A UA-led international team has uncovered new information about the ways marine viruses and microbes interact on a global scale.

    Scientists aboard the Tara Oceans vessel prepare to lower a CTD device into the blue depths. A suite of sampling containers and instruments allows them to collect specimens and data at the same time. (Photo: Anna Deniaud/Tara Oceans)

    Hidden among Earth’s vast oceans are some of the most vital organisms to the health of delicate ecosystems. Tiny ocean microbes produce half of the oxygen we breathe, and they are important drivers in chemical reactions and energy transfers that fuel critical ecological processes.

    Much like other organisms, marine microbes are susceptible to viral infections that can alter their metabolic output, or even kill them. For example, certain ocean viruses invade algae and take control over the photosynthetic process, which replenishes the oxygen we breathe. Others simply kill off vast amounts of organisms, putting a cap on the biomass that can support food webs in the world’s oceans.

    In the ship’s lab, former Sullivan lab member Melissa Duhaime filters viruses from ocean water samples. (Photo: Anna Deniaud/Tara Oceans)

    A new study from an international team led by UA scientists Matthew Sullivan, Jennifer Brum, Simon Roux and Julio Cesar Ignacio Espinoza draws on viral genome data to explain how oceanic viral communities maintain high regional diversity on par with global diversity. The findings may help researchers to predictively model the virus-microbe interactions driving Earth’s ecosystems.

    The paper is one of five landmark studies born of the Tara Oceans expedition featured this week in a special issue of the journal Science. Sullivan is scheduled to discuss the work this week on the National Public Radio program “Science Friday.”

    “We established a means to study viral populations within more complex communities and found that surface ocean viruses were passively transported on currents and that population abundances were structured by local environmental conditions,” said Sullivan, associate professor in the Department of Ecology and Evolutionary Biology and a member of the BIO5 Institute. The work was completed with the assistance of a Gordon and Betty Moore Foundation Investigator grant, a highly prestigious award given to researchers focused on environmental science and conservation.

    Sullivan’s work is part of the Tara Oceans Expedition, a global effort to understand complex interactions among ocean ecosystems, climate and biodiversity. For the past 10 years, the Tara Oceans research vessel has traversed more than 180,000 miles across all of the world’s oceans, collecting biological samples and information about the oceans’ physical parameters such as depth, temperature and salinity.

    “The Tara Oceans expedition provided a platform for systematically sampling ocean biota from viruses to fish larvae, and in a comprehensive environmental context,” Sullivan said. “Until now, a global picture of ocean viral community patterns and ecological drivers was something we could only dream of achieving.”

    To assess geographical diversity in marine viral communities, Sullivan and his team looked at double-stranded DNA viral genomic sequence data, or viromes, and whole viral community morphological data across 43 Tara Oceans expedition samples. The samples were distributed globally throughout the surface oceans (with only one deep-sea sample) and represented diverse environmental conditions.

    Specifically, Sullivan and his team were interested in the previous observation that the diversity of ocean viruses at any given site was as high as that observed globally. Such high local and low global diversity had been observed a decade ago, and scientists proposed a seed-bank hypothesis to explain it. This hypothesis suggests that high local genetic diversity can exist by drawing variation from a common and relatively limited global gene pool. Local-dominant communities consist of viruses that are influenced by environmental conditions, which affect their microbial hosts and indirectly alter the structure of the viral community. These communities serve as the low-abundance “bank” for neighboring locations, as they are passively transported by way of ocean currents.

    Since viruses lack universal genes that could be used to identify global community patterns, Sullivan had to employ different techniques to study viral communities. The first approach involved looking at viral particles themselves, and comparing morphological characteristics such as capsid size and tail length.

    “This is the low resolution way to do things — viruses that appear identical may have completely different genomes,” Sullivan explained. “The fact that all viruses don’t share a single common gene calls for some clever approaches to investigating viral diversity.”

    Next, the researchers cataloged viral populations in terms of the proteins they shared in common, in a process called protein clustering. This allowed them to establish core genes that were shared across the viral communities studied. Finally, Sullivan and his team looked at the distribution of viral populations, the majority of which had not been previously characterized, across all of the Tara Oceans sample sites.
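
    As a toy illustration of the clustering idea (not the team’s actual pipeline, which ran dedicated sequence-clustering tools over millions of predicted proteins), one can group sequences whose k-mer content overlaps strongly and treat each connected group as one protein cluster:

```python
# Single-linkage protein clustering by shared k-mer content (toy data).
def kmers(seq, k=3):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def similar(a, b, k=3, threshold=0.5):
    """True when the Jaccard similarity of two k-mer sets passes a cutoff."""
    ka, kb = kmers(a, k), kmers(b, k)
    return len(ka & kb) / len(ka | kb) >= threshold

proteins = {
    "p1": "MKTAYIAKQR",
    "p2": "MKTAYIAKQK",   # near-duplicate of p1
    "p3": "GSSGSSGSSG",   # unrelated sequence
}

# Tiny union-find to merge similar sequences into clusters.
parent = {p: p for p in proteins}
def find(x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]   # path compression
        x = parent[x]
    return x

names = list(proteins)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        if similar(proteins[a], proteins[b]):
            parent[find(a)] = find(b)

clusters = {}
for p in proteins:
    clusters.setdefault(find(p), []).append(p)
print(list(clusters.values()))   # [['p1', 'p2'], ['p3']]
```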

    When they investigated the distribution patterns, they found that the directionality of viral population flow closely corresponded to that of ocean currents, affirming the seed-bank hypothesis.

    “Ocean virus-microbe interactions have a huge impact on global biogeochemistry,” Sullivan said. “As they destroy microbial cells, they change the forms of nutrients available to other, larger organisms in ocean ecosystems. This recycling of nutrients through viral lysis is an important pathway that regulates how the oceanic ecosystem functions. Viral infections simultaneously reduce the amount of nutrients and materials available to larger organisms by killing microbial cells, but also stimulate microbial activity through the release of organic matter and nutrients, which provides increased biomass available for larger organisms including fish.”

    Sullivan’s findings stem from key advances in methodology, including the ability to systematically collect biological samples on a global scale and to push the analysis of marine viral characteristics into quantitative territory.

    “Up until recently, the methods used to study virus-microbe interactions were often qualitative,” Sullivan said. “With this study, we have made great quantitative advances. The goal now is to determine how our quantitative estimates can be used to build predictive models.”

    Sullivan emphasized the uniqueness and importance of working with the Tara Oceans team.

    “This is an incredible new way of doing science,” he said. “At Tara Oceans, we are united by a common goal rather than a common funding source. These first papers show the world that we’re capable of doing science at this scale, and yet they represent just the tip of the iceberg of what is hidden in these vast data sets. We’ve got years of work ahead of us.”

    Sullivan and his lab also contributed to three other papers in the special issue. Those three papers explore the global ocean microbiome and plankton interaction networks, as well as how plankton communities change across a key ocean circulation choke point off South Africa.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition


    The University of Arizona (UA) is a place without limits, where teaching, research, service and innovation merge to improve lives in Arizona and beyond. We aren’t afraid to ask big questions, and find even better answers.

    In 1885, establishing Arizona’s first university in the middle of the Sonoran Desert was a bold move. But our founders were fearless, and we have never lost that spirit. To this day, we’re revolutionizing the fields of space sciences, optics, biosciences, medicine, arts and humanities, business, technology transfer and many others. Since it was founded, the UA has grown to cover more than 380 acres in central Tucson, a rich breeding ground for discovery.

    Where else in the world can you find an astronomical observatory mirror lab under a football stadium? An entire ecosystem under a glass dome? Visit our campus, just once, and you’ll quickly understand why the UA is a university unlike any other.

     
  • richardmitnick 8:05 am on May 22, 2015 Permalink | Reply
    Tags: Applied Research & Technology

    From NBC via U Washington: “T. Rex’s Cousin? Scientists Find Washington State’s First Dinosaur Fossil” 

    NBC News

    U Washington

    May 20, 2015
    Laura Geggel, Live Science

    The University of Washington’s Brandon Peecook holds up a cast of a Daspletosaurus femur, while Christian Sidor holds the newly described fossil fragment for a size comparison. (Burke Museum / UW)

    A fragmented femur bone hidden underwater for millions of years has provided the first evidence that a dinosaur once roamed Washington state.

    And not just any dinosaur: Researchers say this beast was a theropod, a member of a group of two-legged, mostly meat-eating dinosaurs linked to modern-day birds. Other theropods include Tyrannosaurus rex and Velociraptor.

    Scientists found the 80 million-year-old fossil of the dinosaur when they were searching for ammonites — extinct marine invertebrates with spiral shells — and other fossilized animals. They had focused their fieldwork in the San Juan Islands, an archipelago located a short ferry ride away from Seattle.

    In April 2012, when the tide was out, they noticed a fossilized bone embedded in the marine rock. The researchers immediately contacted paleontologists at the University of Washington, who sent out a team in May of that year to excavate the fossil with a rock saw.

    “The rock there is tremendously hard, so it took them a full day to excavate it,” said Christian Sidor, a co-author of the study and a curator of vertebrate paleontology at the Burke Museum at the University of Washington.

    Sidor and his colleagues spent about a year and a half preparing the fossil, and “for the longest time, I was unconvinced that we were going to be able to say anything else besides ‘It’s a large bone,'” he told LiveScience. “What was exposed on the surface really had no anatomy. I couldn’t tell if it was a dinosaur, couldn’t tell if it was a marine reptile, couldn’t tell anything about it.”

    Once they removed the fossil from the rock and flipped it over, the researchers saw several signs that the fossil was half of the left femur (thighbone) of a theropod dinosaur. It measures 16.7 inches long by 8.7 inches wide (42 by 22 centimeters) but would have been almost 4 feet (1.2 meters) long — or slightly smaller than a T. rex thighbone — before it broke, the researchers said.

    Several clues suggest the fossil belonged to a theropod, Sidor said. For instance, the fossil once had a hollow middle cavity, which was unique to theropods during the late Cretaceous period. The bone also had a feature positioned close to the hip, called a fourth trochanter, that is commonly associated with theropods. The researchers said it “seems likely” that the creature was a tyrannosauroid, an older cousin of T. rex.

    The specimen was uncovered near fossils of the clam species Crassatellites conradiana, which lived in shallow water. This suggests that the dinosaur died near the sea, was tossed around by the waves and found its resting place among the clams, the researchers said.

    The find makes Washington the 37th U.S. state known to have dinosaur fossils.

    Active plate tectonics and a vast amount of urban development have made it difficult for scientists to find dinosaur fossils in Washington, the researchers said. However, isolated dinosaur skeletons and bones have been found in nearby regions such as Oregon, California and south central Alaska.

    The study was published online Wednesday in the journal PLOS ONE. The fossil is due to go on display at the Burke Museum on May 21.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NBC News building (30 Rock)

     
  • richardmitnick 7:53 am on May 22, 2015 Permalink | Reply
    Tags: Applied Research & Technology

    From AAAS: “The new shape of fusion” 

    AAAS

    21 May 2015
    Daniel Clery

    A plasma glows inside MAST, a spherical tokamak.

    ITER, the international fusion reactor being built in France, will stand 10 stories tall, weigh three times as much as the Eiffel Tower, and cost its seven international partners $18 billion or more. The result of decades of planning, ITER will not produce fusion energy until 2027 at the earliest. And it will be decades before an ITER-like plant pumps electricity into the grid. Surely there is a quicker and cheaper route to fusion energy.

    Fusion enthusiasts have a slew of schemes for achieving the starlike temperatures or crushing pressures needed to get hydrogen nuclei to come together in an energy-spawning union. Some, such as lasers, are mainstream; others are unorthodox. Yet the doughnut-shaped vessels called tokamaks, designed to cage a superheated plasma using magnetic fields, remain the leading fusion strategy and are the basis of ITER. Even among tokamaks, however, a nimbler alternative has emerged: the spherical tokamak.

    Imagine the doughnut shape of a conventional tokamak plumped up into a shape more like a cored apple. That simple change, say the idea’s advocates, could open the way to a fusion power plant that would match ITER’s promise, without the massive scale. “The aim is to make tokamaks smaller, cheaper, and faster—to reduce the eventual cost of electricity,” says Ian Chapman, head of tokamak science at the Culham Centre for Fusion Energy in Abingdon, U.K.



    Culham is one of two labs about to give these portly tokamaks a major test. The world’s two front-rank machines—the National Spherical Torus Experiment (NSTX) at the Princeton Plasma Physics Laboratory in New Jersey and the Mega Amp Spherical Tokamak (MAST) in Culham—are both being upgraded with stronger magnets and more powerful heating systems. Soon they will switch on and heat hydrogen to temperatures much closer to those needed for generating fusion energy. If they perform well, then the next major tokamak to be built—a machine that would run in parallel with ITER and test technology for commercial reactors—will likely be a spherical tokamak.

    NSTX

    MAST

    A small company spun off from Culham is even making a long-shot bet that it can have a spherical tokamak reactor capable of generating more energy than it consumes—one of ITER’s goals—up and running within the decade. If it succeeds, spherical tokamaks could change the shape of fusion’s future. “It’s going to be exciting,” says Howard Wilson, director of the York Plasma Institute at the University of York in the United Kingdom. “Spherical tokamaks are the new kids on the block. But there are still important questions we’re trying to get to the bottom of.”

    TOKAMAKS ARE AN INGENIOUS WAY to cage one of the most unruly substances humans have ever grappled with: plasma hot enough to sustain fusion. To get nuclei to slam together and fuse, fusion reactors must reach temperatures 10 times hotter than the core of the sun, about 150 million degrees Celsius. The result is a tenuous ionized gas that would vaporize any material it touches—and yet must be held in place long enough for fusion to generate useful amounts of energy.

    Tokamaks attempt this seemingly impossible task using magnets, which can hold and manipulate plasma because it is made of charged particles. A complex set of electromagnets, some horizontal and some vertical, encircles the doughnut-shaped vessel, while one tightly wound coil of wire, called a solenoid, runs down the doughnut hole. Their combined magnetic field squeezes the plasma toward the center of the tube and drives it around the ring while also twisting it in a slow corkscrew motion.

    But plasma is not easy to master. Confining it is like trying to squeeze a balloon with your hands: It likes to bulge out between your fingers. The hotter a plasma gets, the more the magnetically confined gas bulges and wriggles and tries to escape. Much of the past 60 years of fusion research has focused on how to control plasma.

    Generating and maintaining enough heat for fusion has been another challenge. Friction generated as the plasma surges around the tokamak supplies some of the heat, but modern tokamaks also beam in microwaves and high-energy particles. As fast as the heat is supplied, it bleeds away, as the hottest, fastest moving particles in the turbulent plasma swirl away from the hot core toward the cooler edge. “Any confinement system is going to be slightly leaky and will lose particles,” Wilson says.

    Studies of tokamaks of different sizes and configurations have always pointed to the same message: To contain a plasma and keep it hot, bigger is better. In a bigger volume, hot particles have to travel farther to escape. Today’s biggest tokamak, the 8-meter-wide Joint European Torus (JET) at Culham, set a record for fusion energy in 1997, generating 16 megawatts for a few seconds.

    JET

    (That was still slightly less than the heating power pumped into the plasma.) For most of the fusion community, ITER is the logical next step. It is expected to be the first machine to achieve energy gain—more fusion energy out than heating power in.

    In the 1980s, a team at Oak Ridge National Laboratory in Tennessee explored how a simple shape change could affect tokamak performance. They focused on the aspect ratio—the radius of the whole tokamak compared to the radius of the vacuum tube. (A Hula-Hoop has a very high aspect ratio, a bagel a lower one.) Their calculations suggested that making the aspect ratio very low, so that the tokamak was essentially a sphere with a narrow hole through the middle, could have many advantages.

    Near a spherical tokamak’s central hole, the Oak Ridge researchers predicted, particles would enjoy unusual stability. Instead of corkscrewing lazily around the tube as in a conventional tokamak, the magnetic field lines wind tightly around the central column, holding particles there for extended periods before they return to the outside surface. The D-shaped cross section of the plasma would also help suppress turbulence, improving energy confinement. And they reckoned that the new shape would use magnetic fields more efficiently—achieving more plasma pressure for a given magnetic pressure, a ratio known as beta. Higher beta means more bang for your magnetic buck. “The general idea of spherical tokamaks was to produce electricity on a smaller scale, and more cheaply,” Culham’s Chapman says.
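
    For concreteness, the two numbers in play here can be computed in a few lines. The machine dimensions and field strengths below are illustrative stand-ins, not the specifications of any particular tokamak:

```python
# Aspect ratio A = R/a and plasma beta = pressure / magnetic pressure.
import math

MU0 = 4e-7 * math.pi   # vacuum permeability, T*m/A

def aspect_ratio(R_major, a_minor):
    return R_major / a_minor

def beta(pressure_pa, b_tesla):
    """Plasma pressure divided by magnetic pressure B^2 / (2*mu0)."""
    return pressure_pa / (b_tesla**2 / (2.0 * MU0))

print(aspect_ratio(3.0, 1.0))     # ~3.0: conventional "doughnut"
print(aspect_ratio(0.85, 0.65))   # ~1.3: spherical "cored apple"

# The same plasma pressure yields far more beta in a weaker field:
print(f"{beta(1.0e5, 5.0):.0%}")  # ~1% at 5 T
print(f"{beta(1.0e5, 0.8):.0%}")  # ~39% at 0.8 T
```

    The second pair of printouts is the point in miniature: a geometry that confines a given plasma pressure with a weaker field buys a much higher beta.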

    But such a design posed a practical problem. The narrow central hole in a spherical tokamak didn’t leave enough room for the equipment that needs to fit there: part of each vertical magnet plus the central solenoid. In 1984, Martin Peng of Oak Ridge came up with an elegant, space-saving solution: replace the multitude of vertical ring magnets with C-shaped rings that share a single conductor down the center of the reactor (see graphic, below).

    Graphic: James Provost

    U.S. fusion funding was in short supply at that time, so Oak Ridge could not build a spherical machine to test Peng’s design. A few labs overseas converted some small devices designed for other purposes into spherical tokamaks, but the first true example was built at the Culham lab in 1990. “It was put together on a shoestring with parts from other machines,” Chapman says. Known as the Small Tight Aspect Ratio Tokamak (START), the device soon achieved a beta of 40%, more than three times that of any conventional tokamak.

    It also bested traditional machines in terms of stability. “It smashed the world record at the time,” Chapman says. “People got more interested.” Other labs rushed to build small spherical tokamaks, some in countries not known for their fusion research, including Australia, Brazil, Egypt, Kazakhstan, Pakistan, and Turkey.

    The next question, Chapman says, was “can we build a bigger machine and get similar performance?” Princeton and Culham’s machines were meant to answer that question. Completed in 1999, NSTX and MAST both hold plasmas about 3 meters across, roughly three times bigger than START’s but a third the size of JET’s. The performance of the pair showed that START wasn’t a one-off: again they achieved a beta of about 40%, reduced instabilities, and good confinement.

    Now, both machines are moving to the next stage: more heating power to make a hotter plasma and stronger magnets to hold it in place. MAST is now in pieces, the empty vacuum vessel looking like a giant tin can adorned with portholes, while its €30 million worth of new magnets, pumps, power supplies, and heating systems are prepared. At Princeton, technicians are putting the finishing touches to a similar $94 million upgrade of NSTX’s magnets and neutral beam heating. Like most experimental tokamaks, the two machines are not aiming to produce lots of energy, just learning how to control and confine plasma under fusionlike conditions. “It’s a big step,” Chapman says. “NSTX-U will have really high injected power in a small plasma volume. Can you control that plasma? This is a necessary step before you could make a spherical tokamak power plant.”

    Engineers lift out MAST’s vacuum vessel for modifications during the €30 million upgrade. © CCFE

    The upgraded machines will each have a different emphasis. NSTX-U, with the greater heating power, will focus on controlling instabilities and improving confinement when it restarts this summer.

    NSTX-U

    “If we can get reasonable beta values, [NSTX-U] will reach plasma [properties] similar to conventional tokamaks,” says NSTX chief Masayuki Ono. MAST-Upgrade, due to fire up in 2017, will address a different problem: capturing the fusion energy that would build up in a full-scale plant.

    Fusion reactions generate most of their energy in the form of high-energy neutrons, which, being neutral, are immune to magnetic fields and can shoot straight out of the reactor. In a future power plant, a neutron-absorbing material will capture them, converting their energy to heat that will drive a steam turbine and generate electricity. But 20% of the reaction energy heats the plasma directly and must somehow be tapped. Modern tokamaks remove heat by shaping the magnetic field into a kind of exhaust pipe, called a divertor, which siphons off some of the outermost layer of plasma and pipes it away. But fusion heat will build up even faster in a spherical tokamak because of its compact size. MAST-Upgrade has a flexible magnet system so that researchers can try out various divertor designs, looking for one that can cope with the heat.

    Researchers know from experience that when a tokamak steps up in size or power, plasma can start misbehaving in new ways. “We need MAST and NSTX to make sure there are no surprises at low aspect ratio,” says Dennis Whyte, director of the Plasma Science and Fusion Center at the Massachusetts Institute of Technology in Cambridge. Once NSTX and MAST have shown what they are capable of, Wilson says, “we can pin down what a [power-producing] spherical tokamak will look like. If confinement is good, we can make a very compact machine, around MAST size.”

    BUT GENERATING ELECTRICITY isn’t the only potential goal. The fusion community will soon have to build a reactor to test how components for a future power plant would hold up under years of bombardment by high-energy neutrons. That’s the goal of a proposed machine known in Europe as the Component Test Facility (CTF), which could run stably around the clock, generating as much heat from fusion as it consumes. A CTF is “absolutely necessary,” Chapman says. “It’s very important to test materials to make reactors out of.” The design of CTF hasn’t been settled, but spherical tokamak proponents argue their design offers an efficient route to such a testbed—one that “would be relatively compact and cheap to build and run,” Ono says.

    With ITER construction consuming much of the world’s fusion budget, that promise won’t be tested anytime soon. But one company hopes to go from a standing start to a small power-producing spherical tokamak in a decade. In 2009, a couple of researchers from Culham created a spinoff company—Tokamak Solutions—to build small spherical tokamaks as neutron sources for research. Later, one of the company’s suppliers showed them a new multilayered conducting tape, made with the high-temperature superconductor yttrium-barium-copper-oxide, that promised a major performance boost.

    Lacking electrical resistance, superconductors can be wound into electromagnets that produce much stronger fields than conventional copper magnets. ITER will use low-temperature superconductors for its magnets, but they require massive and expensive cooling. High-temperature materials are cheaper to use but were thought to be unable to withstand the strong magnetic fields around a tokamak—until the new superconducting tape came along. The company changed direction, was renamed Tokamak Energy, and is now testing a first-generation superconducting spherical tokamak no taller than a person.

    Superconductors allow a tokamak to confine a plasma for longer. Whereas NSTX and MAST can run for only a few seconds, the team at Tokamak Energy this year ran their machine—albeit at low temperature and pressure—for more than 15 minutes. In the coming months, they will attempt a 24-hour pulse—smashing the tokamak record of slightly over 5 hours.

    Next year, the company will put together a slightly larger machine able to produce twice the magnetic field of NSTX-U. The next step—investors permitting—will be a machine slightly smaller than Princeton’s but with three times the magnetic field. Company CEO David Kingham thinks that will be enough to beat ITER to the prize: a net gain of energy. “We want to get fusion gain in 5 years. That’s the challenge,” he says.

    “It’s a high-risk approach,” Wilson says. “They’re buying their lottery ticket. If they win, it’ll be great. If they don’t, they’ll likely disappear. Even if it doesn’t work, we’ll learn from it; it will accelerate the fusion program.”

    It’s a spirit familiar to everyone trying to reshape the future of fusion.

    See the full article here.

    The American Association for the Advancement of Science is an international non-profit organization dedicated to advancing science for the benefit of all people.

    Please help promote STEM in your local schools.
    Stem Education Coalition

     
  • richardmitnick 8:26 am on May 15, 2015 Permalink | Reply
    Tags: Applied Research & Technology

    From BNL: “Intense Lasers Cook Up Complex, Self-Assembled Nanomaterials” 

    Brookhaven Lab

    May 13, 2015
    Justin Eure

    New technique developed at Brookhaven Lab makes self-assembly 1,000 times faster and could be used for industrial-scale solar panels and electronics

    Brookhaven Lab scientist Kevin Yager (left) and postdoctoral researcher Pawel Majewski with the new Laser Zone Annealing instrument at the Center for Functional Nanomaterials.

    Nanoscale materials feature extraordinary, billionth-of-a-meter qualities that transform everything from energy generation to data storage. But while a nanostructured solar cell may be fantastically efficient, that precision is notoriously difficult to achieve on industrial scales. The solution may be self-assembly, or training molecules to stitch themselves together into high-performing configurations.

    Now, scientists at the U.S. Department of Energy’s Brookhaven National Laboratory have developed a laser-based technique to execute nanoscale self-assembly with unprecedented ease and efficiency.

    “We design materials that build themselves,” said Kevin Yager, a scientist at Brookhaven’s Center for Functional Nanomaterials (CFN). “Under the right conditions, molecules will naturally snap into a perfect configuration. The challenge is giving these nanomaterials the kick they need: the hotter they are, the faster they move around and settle into the desired formation. We used lasers to crank up the heat.”
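Yager's point about temperature is the familiar thermally activated picture: the rate at which molecules hop into place grows exponentially, roughly as exp(-Ea/kT). A minimal sketch under assumed numbers (the activation energy and the two temperatures below are round illustrative values, not measurements from this work):

```python
# Illustrative only: thermally activated ("Arrhenius") kinetics,
# rate ~ exp(-Ea / (kB * T)). The 1.0 eV activation energy and the two
# temperatures are assumed round numbers, not measured values.
import math

K_B_EV = 8.617e-5  # Boltzmann constant, eV/K
E_A_EV = 1.0       # assumed activation energy for a rearrangement step

def relative_rate(t_kelvin):
    return math.exp(-E_A_EV / (K_B_EV * t_kelvin))

t_oven, t_spike = 473.0, 773.0  # ~200 C oven bake vs ~500 C laser spike
print(f"rate speedup ~ {relative_rate(t_spike) / relative_rate(t_oven):,.0f}x")
```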

    Yager and Brookhaven Lab postdoctoral researcher Pawel Majewski built a one-of-a-kind machine that sweeps a focused laser-line across a sample to generate intense and instantaneous spikes in temperature. This new technique, called Laser Zone Annealing (LZA), drives self-assembly at rates more than 1,000 times faster than traditional industrial ovens. The results are described in the journal ACS Nano.

    “We created extremely uniform self-assembled structures in less than a second,” Majewski said. “Beyond the extraordinary speed, our laser also reduced the defects and degradations present in oven-heated materials. That combination makes LZA perfect for carrying small-scale laboratory breakthroughs into industry.”

    The scientists prepared the materials and built the LZA instrument at the CFN. They then analyzed samples using advanced electron microscopy at CFN and x-ray scattering at Brookhaven’s now-retired National Synchrotron Light Source (NSLS)—both DOE Office of Science User Facilities.

    “It was enormously gratifying to see that our predictions were accurate—the enormous thermal gradients led to a correspondingly enormous acceleration!” Yager said.

    Illustration of the Laser Zone Annealing instrument showing the precise laser (green) striking the unassembled polymer (purple). The extreme thermal gradients produced by the laser sweeping across the sample cause rapid and pristine self-assembly.

    Ovens versus lasers

    Imagine preparing a complex cake, but instead of baking it in the oven, a barrage of lasers heats it to perfection in an instant. Beyond that, the right cooking conditions will make the ingredients mix themselves into a picture-perfect dish. This nanoscale recipe achieves something equally extraordinary and much more impactful.

    The researchers focused on so-called block copolymers, molecules containing two linked blocks with different chemical structures and properties. These blocks tend to repel each other, which can drive the spontaneous formation of complex and rigid nanoscale structures.
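Whether that repulsion actually drives ordering depends on its strength and on the chain length; in the textbook mean-field picture, a symmetric diblock microphase-separates once the product χN of the interaction parameter and the degree of polymerization exceeds roughly 10.5. A hedged sketch with illustrative numbers, not parameters from this study:

```python
# A sketch of the textbook mean-field criterion (Leibler, 1980): a
# symmetric diblock copolymer is predicted to microphase-separate when
# chi * N exceeds ~10.5. The chi values and chain lengths below are
# illustrative assumptions, not parameters from this study.
CHI_N_CRITICAL = 10.5

def orders(chi, n):
    """True if the segregation strength chi*N crosses the ordering threshold."""
    return chi * n > CHI_N_CRITICAL

for chi, n in [(0.04, 200), (0.04, 1000), (0.10, 50)]:
    state = "microphase-separates" if orders(chi, n) else "stays disordered"
    print(f"chi = {chi}, N = {n}: chi*N = {chi * n:.0f} -> {state}")
```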

    “The price of their excellent mechanical properties is the slow kinetics of their self-assembly,” Majewski said. “They need energy and time to explore possibilities until they find the right configuration.”

    In traditional block copolymer self-assembly, materials are heated in a vacuum-sealed oven. The sample is typically “baked” for a period of 24 hours or longer to provide enough kinetic energy for the molecules to snap into place—much too long for commercial viability. The long exposure to high heat also causes inevitable thermal degradation, leaving cracks and imperfections throughout the sample.

    The LZA process, however, offers sharp spikes of heat to rapidly excite the polymers without the sustained energy that damages the material.

    “Within milliseconds, the entire sample is beautifully aligned,” Yager said. “As the laser sweeps across the material, the localized thermal spikes actually remove defects in the nanostructured film. LZA isn’t just faster, it produces superior results.”

    LZA generates temperatures greater than 500 degrees Celsius, but the thermal gradients—temperature variations tied to direction and location in a material—can reach more than 4,000 degrees per millimeter. While scientists know that higher temperatures can accelerate self-assembly, this is the first proof of dramatic enhancement by extreme gradients.
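Those two figures hang together if the heated stripe is narrow, as a quick back-of-envelope check shows (the 0.1-millimeter falloff distance below is an assumption for illustration, not a measured beam profile):

```python
# Sanity check on the quoted numbers: a ~500 C stripe that falls to
# near room temperature over ~0.1 mm gives a gradient of several
# thousand degrees per millimeter. The 0.1 mm falloff distance is an
# assumption for illustration, not a measured beam profile.
peak_c, ambient_c = 500.0, 25.0
falloff_mm = 0.1

gradient_c_per_mm = (peak_c - ambient_c) / falloff_mm
print(f"~{gradient_c_per_mm:,.0f} degrees C per millimeter")  # ~4,750
```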

    Built from scratch

    “Years ago, we observed a subtle hint that thermal gradients could improve self-assembly,” Yager said. “I became obsessed with the idea of creating more and more extreme gradients, which ultimately led to building this laser setup, and pioneering a new technique.”

    The researchers needed a high concentration of technical expertise and world-class facilities to move the LZA from proposal to execution.

    “Only at the CFN could we develop this technique so quickly,” Majewski said. “We could do rapid instrument prototyping and sample preparation with the on-site clean room, machine shop, and polymer processing lab. We then combined CFN electron microscopy with x-ray studies at NSLS for an unbeatable evaluation of the LZA in action.”

    Added Yager, “The ability to make new samples at the CFN and then walk across the street to characterize them in seconds at NSLS was key to this discovery. The synergy between these two facilities is what allowed us to rapidly iterate to an optimized design.”

    The scientists also developed a new microscale surface thermometry technique called melt-mark analysis to track the exact heat generated by the laser pulses and tune the instrument accordingly.

    “We burned a few films initially before we learned the right operating conditions,” Majewski said. “It was really exciting to see the first samples being rastered by the laser and then using NSLS to discover exactly what happened.”

    Future of the technique

    The LZA is the first machine of its kind in the world, but it signals a dramatic step forward in scaling up meticulously designed nanotechnology. The laser can even be used to “draw” structures across the surface, meaning the nanostructures can assemble in well-defined patterns. This unparalleled synthesis control opens the door to complex applications, including electronics.

    “There’s really no limit to the size of a sample this technique could handle,” Yager said. “In fact, you could run it in a roll-to-roll mode—one of the leading manufacturing technologies.”

    The scientists plan to further develop the new technique to create multi-layer structures that could have immediate impacts on anti-reflective coatings, improved solar cells, and advanced electronics.

    This research and operations at CFN and NSLS were funded by the DOE Office of Science.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 7:01 pm on May 14, 2015 Permalink | Reply
    Tags: Applied Research & Technology, Outsmart Ebola Together

    From WCG: “A milestone and a roadmap: progress in the fight against Ebola” 

    New WCG Logo

    5 May 2015
    The Outsmart Ebola Together research team

    Summary
    Thanks to the huge level of support from World Community Grid, the team at the Scripps Research Institute has already received most of the matching data for the first target protein of the Ebola virus. While this data is being analyzed, the search now moves to another related protein with potential to help the fight against hemorrhagic fevers.

    Outsmart Ebola Together, a long-term scientific project whose goal is to find new drugs for curing Ebola and related life-threatening viral hemorrhagic fevers, is still in its early stages, but we’ve already reached a major milestone. Our first target was the newly revealed receptor-binding site of the Ebola surface protein, GP. GP is the molecule Ebola virus uses to fuse with a human cell and force its way inside. Armed with a new model of the binding site, and with the vast resources of World Community Grid, we set out to test this site against drugs that could potentially bind to it and prevent Ebola infection. This stage of work is now close to complete: we have received back from World Community Grid most of the data for the planned matchings of the Ebola surface protein against 5.4 million candidate chemical compounds.

    We are now analyzing this data. Drugs that simulations predict will bind well with the Ebola surface protein will go on to a next round of experiments, conducted in the lab with actual proteins and actual drug molecules. Our analysis may also yield general insights about how classes of drugs interact with viral proteins.
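At its core, that triage step is a ranking: sort compounds by their predicted binding energy and advance only the strongest predicted binders to the bench. A minimal sketch of the idea, with hypothetical compound names and an assumed cutoff rather than the project's actual criteria:

```python
# A minimal sketch of the triage idea, not the project's actual
# pipeline: rank compounds by predicted binding energy (more negative =
# stronger predicted binding) and keep those past a cutoff for wet-lab
# follow-up. Compound names and the -8.0 kcal/mol cutoff are hypothetical.
docking_scores = {
    "compound_001": -9.4,  # kcal/mol
    "compound_002": -5.1,
    "compound_003": -8.7,
    "compound_004": -6.3,
}

CUTOFF = -8.0  # assumed threshold
hits = sorted((name for name, e in docking_scores.items() if e <= CUTOFF),
              key=docking_scores.get)

print("advance to lab testing:", hits)  # ['compound_001', 'compound_003']
```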

    Moreover, we are excited to announce that we are beginning work on a second target protein, the Lassa virus nucleoprotein.

    Like Ebola, Lassa is a “Group V” virus: in other words, both carry at their core a genome composed of “negative-sense”, single-stranded RNA. Both viruses produce a deadly hemorrhagic fever. While Lassa has received less publicity than Ebola, it is a more consistent killer. There are hundreds of thousands of cases of Lassa fever every year in West Africa, with tens of thousands of deaths. It is also the viral hemorrhagic fever most frequently transported out of Africa to the United States and Europe. No treatments are approved for use against Lassa virus infection, so identifying a potent inhibitor of the virus is an urgent public health priority.

    The Lassa virus’s nucleoprotein (NP) is so named because the first function discovered for it was to bind with, and so enclose and protect, the virus’s central strand of RNA. However, Lassa NP is a complex beast that has other functions as well. In particular, our lab discovered that the NP (almost paradoxically) is also responsible for digesting double-stranded RNA (dsRNA) created by the virus itself. Having gained entry to a human cell, the Lassa virus must copy its single-stranded RNA in order to produce viral proteins and replicate itself. This requires creating double-stranded RNA. However, the virus must keep this work secret. The presence of double-stranded RNA in the cytoplasm is a clear sign of a viral infection, and human cells are smart enough to detect this, triggering an effective immune response. Hence the importance of the Lassa NP, which rips apart the virus’s own dsRNA byproducts in order to keep its activities secret.

    We approach Lassa NP armed with our lab’s crystallographic structures, which clearly identify the shape of the NP and the site where the NP carries out its function of destroying double-stranded RNA. This site is a large cavity in the side of the protein; it is negatively charged, but is also bordered by a positively charged, protruding protein “arm”. These distinctive features are key to the site’s binding with dsRNA, and, we believe, should make it a good candidate for screenings against possible drugs.

    Figure: Our lab’s structure for the Lassa NP protein. Portions important to the protein’s function of digesting double-stranded RNA include the “cavity” (glowing, particularly a manganese atom that helps bind RNA) and the adjacent “arm” (yellow).

    We will now prepare this target protein for matchings against millions of drugs using the resources of the World Community Grid. As with our previous matchings against the Ebola surface protein, drugs that do well in this “virtual screening” will go on to further tests with actual proteins in the lab. While this work is difficult and carries no guarantees, we hope that it will lead to the discovery of a drug that can prevent the Lassa NP from hiding the virus’s double-stranded RNA. We have already determined that doing this would allow human cells to detect and act against the Lassa virus more promptly and effectively, potentially saving lives.

    It’s amazing to us that we’ve been able to receive so many results so quickly, and we want to say thank you to everyone in the World Community Grid family who helped make this possible. There is much work ahead, but it’s immensely encouraging to know that we have the resources available to carry it out.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    World Community Grid (WCG) brings people together from across the globe to create the largest non-profit computing grid benefiting humanity. It does this by pooling surplus computer processing power. We believe that innovation combined with visionary scientific research and large-scale volunteerism can help make the planet smarter. Our success depends on like-minded individuals – like you.

    WCG projects run on BOINC software from UC Berkeley.

    BOINC (the Berkeley Open Infrastructure for Network Computing) is a leader in the fields of distributed computing, grid computing, and citizen cyberscience.

    CAN ONE PERSON MAKE A DIFFERENCE? YOU BET!!

    “Download and install secure, free software that captures your computer’s spare power when it is on, but idle. You will then be a World Community Grid volunteer. It’s that simple!” You can download the software at either WCG or BOINC.

    Please visit the project pages:

    Outsmart Ebola Together

    Mapping Cancer Markers

    Uncovering Genome Mysteries

    Say No to Schistosoma

    GO Fight Against Malaria

    Drug Search for Leishmaniasis

    Computing for Clean Water

    The Clean Energy Project

    Discovering Dengue Drugs – Together

    Help Cure Muscular Dystrophy

    Help Fight Childhood Cancer

    Help Conquer Cancer

    Human Proteome Folding

    FightAIDS@Home

    Computing for Sustainable Water

    World Community Grid is a social initiative of IBM Corporation.

    IBM – Smarter Planet

     
  • richardmitnick 2:23 pm on May 14, 2015 Permalink | Reply
    Tags: Applied Research & Technology

    From LBL: “CLAIRE Brings Electron Microscopy to Soft Materials” 

    Berkeley Logo

    Berkeley Lab

    May 14, 2015
    Lynn Yarris (510) 486-5375

    Berkeley Researchers Develop Breakthrough Technique for Non-invasive Nano-scale Imaging

    CLAIRE image of Al nanostructures with an inset that shows a cluster of six Al nanostructures.

    Soft matter encompasses a broad swath of materials, including liquids, polymers, gels, foam and – most importantly – biomolecules. At the heart of soft materials, governing their overall properties and capabilities, are the interactions of nano-sized components. Observing the dynamics behind these interactions is critical to understanding key biological processes, such as protein crystallization and metabolism, and could help accelerate the development of important new technologies, such as artificial photosynthesis or high-efficiency photovoltaic cells. Observing these dynamics at sufficient resolution has been a major challenge, but this challenge is now being met with a new non-invasive nanoscale imaging technique that goes by the acronym of CLAIRE.

    CLAIRE stands for “cathodoluminescence activated imaging by resonant energy transfer.” Invented by researchers with the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) and the University of California (UC) Berkeley, CLAIRE extends the incredible resolution of electron microscopy to the dynamic imaging of soft matter.

    “Traditional electron microscopy damages soft materials and has therefore mainly been used to provide topographical or compositional information about robust inorganic solids or fixed sections of biological specimens,” says chemist Naomi Ginsberg, who leads CLAIRE’s development. “CLAIRE allows us to convert electron microscopy into a new non-invasive imaging modality for studying soft materials and providing spectrally specific information about them on the nanoscale.”

    Naomi Ginsberg

    Ginsberg holds appointments with Berkeley Lab’s Physical Biosciences Division and its Materials Sciences Division, as well as UC Berkeley’s departments of chemistry and physics. She is also a member of the Kavli Energy NanoScience Institute (Kavli-ENSI) at Berkeley. She and her research group recently demonstrated CLAIRE’s imaging capabilities by applying the technique to aluminum nanostructures and polymer films that could not have been directly imaged with electron microscopy.

    “What microscopic defects in molecular solids give rise to their functional optical and electronic properties? By what potentially controllable process do such solids form from their individual microscopic components, initially in the solution phase? The answers require observing the dynamics of electronic excitations or of molecules themselves as they explore spatially heterogeneous landscapes in condensed phase systems,” Ginsberg says. “In our demonstration, we obtained optical images of aluminum nanostructures with 46 nanometer resolution, then validated the non-invasiveness of CLAIRE by imaging a conjugated polymer film. The high resolution, speed and non-invasiveness we demonstrated with CLAIRE positions us to transform our current understanding of key biomolecular interactions.”

    CLAIRE imaging chip consists of a YAlO3:Ce scintillator film supported by LaAlO3 and SrTiO3 buffer layers and a Si frame. Al nanostructures embedded in SiO2 are positioned below and directly against the scintillator film. ProTEK B3 serves as a protective layer for etching.

    CLAIRE works by essentially combining the best attributes of optical and scanning electron microscopy into a single imaging platform. Scanning electron microscopes use beams of electrons rather than light for illumination and magnification. With much shorter wavelengths than photons of visible light, electron beams can be used to observe objects hundreds of times smaller than those that can be resolved with an optical microscope. However, these electron beams destroy most forms of soft matter and are incapable of spectrally specific molecular excitation.
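The resolution advantage follows directly from the electron's de Broglie wavelength, λ = h/√(2mₑeV), where V is the accelerating voltage; a short worked example (the non-relativistic formula is adequate at these low beam energies):

```python
# Worked example: electron de Broglie wavelength, lambda = h / sqrt(2*me*e*V).
# The non-relativistic formula is adequate at these low beam energies.
import math

H = 6.626e-34    # Planck constant, J*s
M_E = 9.109e-31  # electron mass, kg
Q_E = 1.602e-19  # elementary charge, C

def electron_wavelength_nm(accel_volts):
    return H / math.sqrt(2 * M_E * Q_E * accel_volts) * 1e9

print(f"1 keV electron: ~{electron_wavelength_nm(1000.0):.3f} nm")  # ~0.039 nm
print("visible light:  ~400-700 nm, about four orders of magnitude longer")
```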

    Ginsberg and her colleagues get around these problems by employing a process called “cathodoluminescence,” in which an ultrathin scintillating film, about 20 nanometers thick, composed of cerium-doped yttrium aluminum perovskite, is inserted between the electron beam and the sample. When the scintillating film is excited by a low-energy electron beam (about 1 keV), it emits energy that is transferred to the sample, causing the sample to radiate. This luminescence is recorded and correlated to the electron beam position to form an image that is not restricted by the optical diffraction limit.
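In outline, the image forms the way any scanned-beam image does: park the beam at each grid position, record one luminescence reading, and assemble the readings into a 2D array. A minimal sketch, with a hypothetical detector function standing in for the real light-collection optics:

```python
# A minimal sketch (not the authors' acquisition code) of scanned-beam
# image formation: one luminescence reading per beam position, assembled
# into a 2D array. detect_luminescence() is a hypothetical stand-in for
# the real light-collection hardware.
import numpy as np

def detect_luminescence(x, y):
    # Placeholder signal: a single bright feature near the field center.
    return float(np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 50.0))

SIZE = 64
image = np.zeros((SIZE, SIZE))
for row in range(SIZE):        # raster scan, row by row
    for col in range(SIZE):    # one pixel per beam position
        image[row, col] = detect_luminescence(col, row)

print(image.shape, round(image.max(), 2))  # (64, 64) 1.0
```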

    Developing the scintillating film and integrating it into a microchip imaging device was an enormous undertaking, Ginsberg says, and she credits the “talent and dedication” of her research group for the success. She also gives much credit to the staff and capabilities of the Molecular Foundry, a DOE Office of Science User Facility, where the CLAIRE imaging demonstration was carried out.

    “The Molecular Foundry truly enabled CLAIRE imaging to come to life,” she says. “We collaborated with staff scientists there to design and install a high efficiency light collection apparatus in one of the Foundry’s scanning electron microscopes and their advice and input were fantastic. That we can work with Foundry scientists to modify the instrumentation and enhance its capabilities not only for our own experiments but also for other users is unique.”

    While there is still more work to do to make CLAIRE widely accessible, Ginsberg and her group are moving forward with further refinements for several specific applications.

    “We’re interested in non-invasively imaging soft functional materials like the active layers in solar cells and light-emitting devices,” she says. “It is especially true in organics and organic/inorganic hybrids that the morphology of these materials is complex and requires nanoscale resolution to correlate morphological features to functions.”

    Ginsberg and her group are also working on the creation of liquid cells for observing biomolecular interactions under physiological conditions. Since electron microscopes can only operate in a high vacuum, as molecules in the air disrupt the electron beam, and since liquids evaporate in high vacuum, aqueous samples must either be freeze-dried or hermetically sealed in special cells.

    “We need liquid cells for CLAIRE to study the dynamic organization of light-harvesting proteins in photosynthetic membranes,” Ginsberg says. “We should also be able to perform other studies in membrane biophysics to see how molecules diffuse in complex environments, and we’d like to be able to study molecular recognition at the single molecule level.”

    In addition, Ginsberg and her group will be using CLAIRE to study the dynamics of nanoscale systems for soft materials in general.

    “We would love to be able to observe crystallization processes or to watch a material made of nanoscale components anneal or undergo a phase transition,” she says. “We would also love to be able to watch the electric double layer at a charged surface as it evolves, as this phenomenon is crucial to battery science.”

    A paper describing the most recent work on CLAIRE has been published in the journal Nano Letters. The paper is titled Cathodoluminescence-Activated Nanoimaging: Noninvasive Near-Field Optical Microscopy in an Electron Microscope. Ginsberg is the corresponding author. Other authors are Connor Bischak, Craig Hetherington, Zhe Wang, Jake Precht, David Kaz and Darrell Schlom.

    This research was primarily supported by the DOE Office of Science and by the National Science Foundation.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California


     