Tagged: Applied Research & Technology

  • richardmitnick 10:59 am on June 19, 2018 Permalink | Reply
Tags: Applied Research & Technology, Searching Science Data

    From Lawrence Berkeley National Lab: “Berkeley Lab Researchers Use Machine Learning to Search Science Data” 


    From Lawrence Berkeley National Lab

A screenshot of image-based results in the Science Search interface. In this case, the user performed an image search for nanoparticles. (Credit: Gonzalo Rodrigo/Berkeley Lab)

As scientific datasets increase in both size and complexity, the ability to label, filter and search this deluge of information has become a laborious, time-consuming and sometimes impossible task without the help of automated tools.

With this in mind, a team of researchers from the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and UC Berkeley is developing innovative machine-learning tools to pull contextual information from scientific datasets and automatically generate metadata tags for each file. Scientists can then search these files via Science Search, a web-based search engine for scientific data that the Berkeley team is building.

As a proof of concept, the team is working with staff at Berkeley Lab’s Molecular Foundry to demonstrate the concepts of Science Search on the images captured by the facility’s instruments. A beta version of the platform has been made available to Foundry researchers.


    “A tool like Science Search has the potential to revolutionize our research,” says Colin Ophus, a Molecular Foundry research scientist within the National Center for Electron Microscopy (NCEM) and Science Search Collaborator. “We are a taxpayer-funded National User Facility, and we would like to make all of the data widely available, rather than the small number of images chosen for publication. However, today, most of the data that is collected here only really gets looked at by a handful of people—the data producers, including the PI (principal investigator), their postdocs or graduate students—because there is currently no easy way to sift through and share the data. By making this raw data easily searchable and shareable, via the Internet, Science Search could open this reservoir of ‘dark data’ to all scientists and maximize our facility’s scientific impact.”

    The Challenges of Searching Science Data

This screen capture of the Science Search interface shows how users can easily validate metadata tags that have been generated via machine learning, or add information that hasn’t already been captured. (Credit: Gonzalo Rodrigo/Berkeley Lab)

Today, search engines are ubiquitously used to find information on the Internet, but searching science data presents a different set of challenges. For example, Google’s algorithm relies on more than 200 clues to deliver an effective search. These clues can come in the form of keywords on a webpage, metadata in images or audience feedback from billions of people clicking on the information they are looking for. In contrast, scientific data comes in many forms that are radically different from an average web page, requires context that is specific to the science and often lacks the metadata needed for effective searches.

    At National User Facilities like the Molecular Foundry, researchers from all over the world apply for time and then travel to Berkeley to use extremely specialized instruments free of charge. Ophus notes that the current cameras on microscopes at the Foundry can collect up to a terabyte of data in under 10 minutes. Users then need to manually sift through this data to find quality images with “good resolution” and save that information on a secure shared file system, like Dropbox, or on an external hard drive that they eventually take home with them to analyze.

Oftentimes, the researchers who come to the Molecular Foundry only have a couple of days to collect their data. Because it is very tedious and time-consuming to manually add notes to terabytes of scientific data, and because there is no standard for doing it, most researchers just type shorthand descriptions in the filename. This might make sense to the person saving the file, but often doesn’t make much sense to anyone else.

    “The lack of real metadata labels eventually causes problems when the scientist tries to find the data later or attempts to share it with others,” says Lavanya Ramakrishnan, a staff scientist in Berkeley Lab’s Computational Research Division (CRD) and co-principal investigator of the Science Search project. “But with machine-learning techniques, we can have computers help with what is laborious for the users, including adding tags to the data. Then we can use those tags to effectively search the data.”

In addition to images, Science Search can also be used to look for proposals and papers. This is a screenshot of the paper search results. (Credit: Gonzalo Rodrigo/Berkeley Lab)

To address the metadata issue, the Berkeley Lab team uses machine-learning techniques to mine the “science ecosystem”—including instrument timestamps, facility user logs, scientific proposals, publications and file system structures—for contextual information. The collective information from these sources (the timestamp of the experiment, notes about the resolution and filter used, the user’s request for time) provides critical context. The team has put together an innovative software stack that uses machine-learning techniques, including natural language processing, to pull contextual keywords about the scientific experiment and automatically create metadata tags for the data.
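
To make the idea concrete, here is a minimal sketch of how contextual keywords might be pulled from free-text sources and turned into candidate tags using TF-IDF weighting. The example documents and the tagging heuristic are illustrative assumptions; the article does not detail the actual Science Search software stack.

```python
# Minimal sketch: mine free-text context (proposal abstracts, user logs)
# for candidate metadata tags via TF-IDF. The documents below are invented
# examples; the real Science Search pipeline is more elaborate.
from sklearn.feature_extraction.text import TfidfVectorizer

documents = [
    "TEAM I electron microscope, high-resolution imaging of gold nanoparticles",
    "Proposal: structural characterization of nanoparticle catalysts at NCEM",
    "User log: 300 kV, annular dark-field mode, sample grid 4",
]

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(documents)

# Rank the first document's terms by TF-IDF weight; top terms become tags.
terms = vectorizer.get_feature_names_out()
weights = tfidf[0].toarray().ravel()
top = weights.argsort()[::-1][:5]
print([terms[i] for i in top if weights[i] > 0])
```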

For the proof of concept, Ophus shared with the Science Search team data from the Molecular Foundry’s TEAM I electron microscope at NCEM that was recently collected by the facility staff.


He also volunteered to label a few thousand images to give the machine-learning tools some labels from which to start learning. While this is a good start, Science Search co-principal investigator Gunther Weber notes that most successful machine-learning applications require significantly more data and feedback to deliver good results. For example, in the case of search engines like Google, training datasets are created and machine-learning techniques are validated when billions of people around the world verify their identity by clicking on all the images with street signs or storefronts after typing in their passwords, or on Facebook when they tag their friends in an image.

    “In the case of science data only a handful of domain experts can create training sets and validate machine-learning techniques, so one of the big ongoing problems we face is an extremely small number of training sets,” says Weber, who is also a staff scientist in Berkeley Lab’s CRD.

    To overcome this challenge, the Berkeley Lab researchers used transfer learning to limit the degrees of freedom, or parameter counts, on their convolutional neural networks (CNNs). Transfer learning is a machine learning method in which a model developed for a task is reused as the starting point for a model on a second task, which allows the user to get more accurate results from a smaller training set. In the case of the TEAM I microscope, the data produced contains information about which operation mode the instrument was in at the time of collection. With that information, Weber was able to train the neural network on that classification so it could generate that mode of operation label automatically. He then froze that convolutional layer of the network, which meant he’d only have to retrain the densely connected layers. This approach effectively reduces the number of parameters on the CNN, allowing the team to get some meaningful results from their limited training data.
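
A minimal sketch of that freeze-and-retrain recipe, written with a stock pretrained network. The ResNet-18 backbone and the four-class mode count are stand-ins of my own; the article does not specify the team's actual architecture or labels.

```python
# Sketch of transfer learning as described above: freeze the convolutional
# layers of a pretrained CNN and retrain only the densely connected head
# on a small labeled set. Backbone and class count are assumptions.
import torch
import torch.nn as nn
from torchvision import models

NUM_MODES = 4  # hypothetical count of microscope operation modes

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the convolutional backbone: these parameters keep their
# pretrained values and are excluded from gradient updates.
for param in model.parameters():
    param.requires_grad = False

# Replace the densely connected head; only its parameters will be trained,
# which sharply reduces the effective parameter count of the problem.
model.fc = nn.Linear(model.fc.in_features, NUM_MODES)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train_step(images, mode_labels):
    """One optimization step on a small batch of labeled images."""
    optimizer.zero_grad()
    loss = loss_fn(model(images), mode_labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```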

    Machine Learning to Mine the Scientific Ecosystem

    In addition to generating metadata tags through training datasets, the Berkeley Lab team also developed tools that use machine-learning techniques for mining the science ecosystem for data context. For example, the data ingest module can look at a multitude of information sources from the scientific ecosystem—including instrument timestamps, user logs, proposals and publications—and identify commonalities. Tools developed at Berkeley Lab that use natural language-processing methods can then identify and rank words that give context to the data and facilitate meaningful results for users later on. The user will see something similar to the results page of an Internet search, where content with the most text matching the user’s search words will appear higher on the page. The system also learns from user queries and the search results they click on.
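
A toy version of that ranking behavior, assuming a simple term-match score; Science Search's real scoring model is not described in the article.

```python
# Toy ranking: items whose text and tags match more of the query's terms
# appear higher, loosely mirroring the behavior described above.
def rank(query, items):
    """items: list of (title, text) pairs; returns best matches first."""
    terms = query.lower().split()
    def score(item):
        text = " ".join(item).lower()
        return sum(text.count(t) for t in terms)
    return sorted(items, key=score, reverse=True)

results = rank("nanoparticle imaging", [
    ("Proposal 42", "gold nanoparticle imaging at NCEM"),
    ("Beamtime log", "alignment and calibration notes"),
])
print([title for title, _ in results])  # 'Proposal 42' ranks first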

    Because scientific instruments are generating an ever-growing body of data, all aspects of the Berkeley team’s science search engine needed to be scalable to keep pace with the rate and scale of the data volumes being produced. The team achieved this by setting up their system in a Spin instance on the Cori supercomputer at the National Energy Research Scientific Computing Center (NERSC).

NERSC Cray Cori II supercomputer at NERSC at LBNL, named after Gerty Cori, the first American woman to win a Nobel Prize in science

    Spin is a Docker-based edge-services technology developed at NERSC that can access the facility’s high performance computing systems and storage on the back end.

    “One of the reasons it is possible for us to build a tool like Science Search is our access to resources at NERSC,” says Gonzalo Rodrigo, a Berkeley Lab postdoctoral researcher who is working on the natural language processing and infrastructure challenges in Science Search. “We have to store, analyze and retrieve really large datasets, and it is useful to have access to a supercomputing facility to do the heavy lifting for these tasks. NERSC’s Spin is a great platform to run our search engine that is a user-facing application that requires access to large datasets and analytical data that can only be stored on large supercomputing storage systems.”

    An Interface for Validating and Searching Data

When the Berkeley Lab team developed the interface for users to interact with their system, they knew it would have to accomplish a couple of objectives, including effective search and support for human input to the machine-learning models. Because the system relies on domain experts to help generate the training data and validate the machine-learning model output, the interface needed to facilitate that.

“The tagging interface that we developed displays the original data and metadata available, as well as any machine-generated tags we have so far. Expert users can then browse the data, create new tags and review any machine-generated tags for accuracy,” says Matt Henderson, a computer systems engineer in CRD who leads the user interface development effort.

    To facilitate an effective search for users based on available information, the team’s search interface provides a query mechanism for available files, proposals and papers that the Berkeley-developed machine-learning tools have parsed and extracted tags from. Each listed search result item represents a summary of that data, with a more detailed secondary view available, including information on tags that matched this item. The team is currently exploring how to best incorporate user feedback to improve the models and tags.

    “Having the ability to explore datasets is important for scientific breakthroughs, and this is the first time that anything like Science Search has been attempted,” says Ramakrishnan. “Our ultimate vision is to build the foundation that will eventually support a ‘Google’ for scientific data, where researchers can even search distributed datasets. Our current work provides the foundation needed to get to that ambitious vision.”

    “Berkeley Lab is really an ideal place to build a tool like Science Search because we have a number of user facilities, like the Molecular Foundry, that have decades worth of data that would provide even more value to the scientific community if the data could be searched and shared,” adds Katie Antypas, who is the principal investigator of Science Search and head of NERSC’s Data Department. “Plus we have great access to machine-learning expertise in the Berkeley Lab Computing Sciences Area as well as HPC resources at NERSC in order to build these capabilities.”

    In addition to Antypas, Ramakrishnan and Weber, UC Berkeley Computer Science Professor Joseph Hellerstein is also a principal investigator.

    This work was supported by the DOE Office of Advanced Scientific Computing Research (ASCR). Both the Molecular Foundry and NERSC are DOE Office of Science User Facilities located at Berkeley Lab.

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California


     
  • richardmitnick 2:14 pm on June 17, 2018 Permalink | Reply
Tags: Applied Research & Technology

    From EarthSky: “Kilauea volcano lava river flows to sea” 


    From EarthSky

    June 17, 2018
    Adam Voiland/NASA Earth Observatory

    Kilauea volcano’s Fissure 8 has produced a large, channelized lava flow that’s acted like a river, eating through the landscape, finally producing clouds of steamy, hazardous “laze” as hot lava meets the cold ocean.

Steaming fissures in the Kilauea volcano first began to crack open and spread lava across Hawaii’s Leilani Estates neighborhood on May 3, 2018. Since then, more than 20 fissures have opened on Kilauea’s Lower East Rift Zone, though most of the lava flows have been small and short-lived.

    Not so for Fissure 8. That crack in the Earth has been regularly generating large fountains of lava that soar tens to hundreds of feet into the air. It has produced a large, channelized lava flow that has acted like a river, eating through the landscape as it flows toward the sea.

Photo shows Fissure 8 of Kilauea volcano in Hawaii. Fissure 8 fountains reached heights up to 160 feet overnight on Friday. The USGS Hawaiian Volcano Observatory reports that fragments falling from the fountains are building a cinder-and-spatter cone around the vent. USGS image taken June 12, 2018, around 6:10 a.m. HST. View the latest images and videos via USGS.

    While the Fissure 8 lava flow initially remained in relatively narrow channels, it began to widen significantly as it neared the coastline and passed over flatter land. It evaporated Hawaii’s largest lake in a matter of hours, and devastated the communities of Vacationland and Kapoho, destroying hundreds of homes [which probably should never have been built there. Lessons unlearned also in the New Jersey shore communities].

May 14, 2018. Image via NASA.

June 7, 2018. Image via NASA.

    On June 3, 2018, lava from Fissure 8 reached the ocean at Kapoho Bay on Hawaii’s southeast coast. When the Multi-Spectral Instrument (MSI) on the European Space Agency’s Sentinel-2 satellite captured a natural-color image on June 7 (top image, above), the lava had completely filled in the bay and formed a new lava delta.


    For comparison, the Landsat 8 image shows the coastline on May 14 (lower image, above).

June 15, 2018, photo of Fissure 8. This fissure has produced a lava fountain pulsing to heights of 185 to 200 feet (55 to 60 meters). Spattering has built a cinder cone that partially encircles Fissure 8, now 170 feet (51 meters) tall at its highest point. The steam in the foreground is the result of heavy morning rain falling on warm (not hot) tephra (lava fragments).

    Since May 3, 2018, Kilauea has erupted more than 110 million cubic meters of lava. That is enough to fill 45,000 Olympic-sized swimming pools, cover Manhattan Island to a depth of 7 feet (2 meters), or fill 11 million dump trucks, according to estimates from the U.S. Geological Survey (USGS). However, that is only about half of the volume erupted at nearby Mauna Loa in a major eruption in 1984.
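
A quick back-of-the-envelope check of those comparisons; the reference figures for pool volume and Manhattan's area are my assumptions, not from the article.

```python
# Rough check of the USGS comparisons. Reference figures (Olympic pool
# ~2,500 m^3, Manhattan ~59 km^2) are assumed approximations.
lava_m3 = 110e6                # erupted volume since May 3, 2018

pool_m3 = 2500.0               # approximate Olympic swimming pool volume
manhattan_m2 = 59e6            # approximate land area of Manhattan

print(lava_m3 / pool_m3)       # ~44,000 pools, close to the 45,000 quoted
print(lava_m3 / manhattan_m2)  # ~1.9 m deep, matching "7 feet (2 meters)"
```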

    The new land at Kapoho Bay is quite dynamic, fragile, and dangerous. USGS warns:

    “Venturing too close to an ocean entry on land or the ocean exposes you to flying debris from sudden explosive interaction between lava and water.”

    Since lava deltas are built on unconsolidated fragments and sand, the loose material can abruptly collapse or quickly erode in the surf.

This thermal map shows the fissure system and lava flows as of 5:30 p.m. on Saturday, June 9, 2018. The flow from fissure 8 remains active, with the flow entering the ocean at Kapoho. The black and white area is the extent of the thermal map. Temperature in the thermal image is displayed as gray-scale values, with the brightest pixels indicating the hottest areas. Image via USGS.

    The plumes that form where lava meets seawater are also hazardous. Sometimes called laze, these white plumes of hydrochloric acid gas, steam, and tiny shards of volcanic glass can cause skin and eye irritation and breathing difficulties.

The ocean entry remains fairly broad with a white steam/laze plume blowing onshore. USGS image taken June 15, 2018.

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

Deborah Byrd created the EarthSky radio series in 1991 and founded EarthSky.org in 1994. Today, she serves as Editor-in-Chief of this website. She has won a galaxy of awards from the broadcasting and science communities, including having asteroid 3505 Byrd named in her honor. A science communicator and educator since 1976, Byrd believes in science as a force for good in the world and a vital tool for the 21st century. “Being an EarthSky editor is like hosting a big global party for cool nature-lovers,” she says.

     
  • richardmitnick 4:44 pm on June 16, 2018 Permalink | Reply
Tags: Applied Research & Technology, New type of photosynthesis discovered

    From Imperial College London: “New type of photosynthesis discovered” 

    From Imperial College London

    15 June 2018
    Hayley Dunning

Colony of cells where colours represent chlorophyll-a and -f driven photosynthesis. Dennis Nuernberg

    The discovery changes our understanding of the basic mechanism of photosynthesis and should rewrite the textbooks.

    It will also tailor the way we hunt for alien life and provide insights into how we could engineer more efficient crops that take advantage of longer wavelengths of light.

    The discovery, published today in Science, was led by Imperial College London, supported by the BBSRC, and involved groups from the ANU in Canberra, the CNRS in Paris and Saclay and the CNR in Milan.

    The vast majority of life on Earth uses visible red light in the process of photosynthesis, but the new type uses near-infrared light instead. It was detected in a wide range of cyanobacteria (blue-green algae) when they grow in near-infrared light, found in shaded conditions like bacterial mats in Yellowstone and in beach rock in Australia.

As scientists have now discovered, it also occurs in a cupboard fitted with infrared LEDs at Imperial College London.

    Photosynthesis beyond the red limit

The standard, near-universal type of photosynthesis uses the green pigment chlorophyll-a both to collect light and to use its energy to make useful biochemicals and oxygen. The way chlorophyll-a absorbs light means only the energy from red light can be used for photosynthesis.

    Since chlorophyll-a is present in all plants, algae and cyanobacteria that we know of, it was considered that the energy of red light set the ‘red limit’ for photosynthesis; that is, the minimum amount of energy needed to do the demanding chemistry that produces oxygen. The red limit is used in astrobiology to judge whether complex life could have evolved on planets in other solar systems.
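
The red limit is ultimately a statement about photon energy, E = hc/λ: longer wavelengths carry less energy per photon. A quick comparison, using representative absorption wavelengths (the 680 nm and 720 nm values are typical literature figures, not from this article):

```python
# Photon energy E = hc / wavelength, comparing the red light used by
# chlorophyll-a with the near-infrared light used by chlorophyll-f.
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electronvolt

def photon_energy_ev(wavelength_nm):
    return H * C / (wavelength_nm * 1e-9) / EV

print(photon_energy_ev(680))  # ~1.82 eV, red light (chlorophyll-a)
print(photon_energy_ev(720))  # ~1.72 eV, near-infrared (chlorophyll-f)
```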

However, when some cyanobacteria are grown under near-infrared light, the standard chlorophyll-a-containing systems shut down and different systems containing a different kind of chlorophyll, chlorophyll-f, take over.

Cross-section of beach rock (Heron Island, Australia) showing chlorophyll-f containing cyanobacteria (green band) growing deep into the rock, several millimetres below the surface. Dennis Nuernberg

    Until now, it was thought that chlorophyll-f just harvested the light. The new research shows that instead chlorophyll-f plays the key role in photosynthesis under shaded conditions, using lower-energy infrared light to do the complex chemistry. This is photosynthesis ‘beyond the red limit’.

    Lead researcher Professor Bill Rutherford, from the Department of Life Sciences at Imperial, said: “The new form of photosynthesis made us rethink what we thought was possible. It also changes how we understand the key events at the heart of standard photosynthesis. This is textbook changing stuff.”

    Preventing damage by light

    Another cyanobacterium, Acaryochloris, is already known to do photosynthesis beyond the red limit. However, because it occurs in just this one species, with a very specific habitat, it had been considered a ‘one-off’. Acaryochloris lives underneath a green sea-squirt that shades out most of the visible light leaving just the near-infrared.

    The chlorophyll-f based photosynthesis reported today represents a third type of photosynthesis that is widespread. However, it is only used in special infrared-rich shaded conditions; in normal light conditions, the standard red form of photosynthesis is used.

    It was thought that light damage would be more severe beyond the red limit, but the new study shows that it is not a problem in stable, shaded environments.

    Co-author Dr Andrea Fantuzzi, from the Department of Life Sciences at Imperial, said: “Finding a type of photosynthesis that works beyond the red limit changes our understanding of the energy requirements of photosynthesis. This provides insights into light energy use and into mechanisms that protect the systems against damage by light.”

    These insights could be useful for researchers trying to engineer crops to perform more efficient photosynthesis by using a wider range of light. How these cyanobacteria protect themselves from damage caused by variations in the brightness of light could help researchers discover what is feasible to engineer into crop plants.

    Textbook-changing insights

    More detail could be seen in the new systems than has ever been seen before in the standard chlorophyll-a systems. The chlorophylls often termed ‘accessory’ chlorophylls were actually performing the crucial chemical step, rather than the textbook ‘special pair’ of chlorophylls in the centre of the complex.

    This indicates that this pattern holds for the other types of photosynthesis, which would change the textbook view of how the dominant form of photosynthesis works.

    Dr Dennis Nürnberg, the first author and initiator of the study, said: “I did not expect that my interest in cyanobacteria and their diverse lifestyles would snowball into a major change in how we understand photosynthesis. It is amazing what is still out there in nature waiting to be discovered.”

    Peter Burlinson, lead for frontier bioscience at BBSRC – UKRI says, “This is an important discovery in photosynthesis, a process that plays a crucial role in the biology of the crops that feed the world. Discoveries like this push the boundaries of our understanding of life and Professor Bill Rutherford and the team at Imperial should be congratulated for revealing a new perspective on such a fundamental process.”

See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Imperial College London

    Imperial College London is a science-based university with an international reputation for excellence in teaching and research. Consistently rated amongst the world’s best universities, Imperial is committed to developing the next generation of researchers, scientists and academics through collaboration across disciplines. Located in the heart of London, Imperial is a multidisciplinary space for education, research, translation and commercialisation, harnessing science and innovation to tackle global challenges.

     
  • richardmitnick 12:21 pm on June 15, 2018 Permalink | Reply
Tags: Applied Research & Technology, Retinal

    From SLAC Lab: “Scientists Make the First Molecular Movie of One of Nature’s Most Widely Used Light Sensors” 


    From SLAC Lab

    June 14, 2018
    Glennda Chui


    A molecular movie based on experimental data shows the retinal molecule, in green, changing shape along with parts of its surrounding protein pocket, in pink, when hit by light. The changing numbers are distances in angstroms. One angstrom is one ten-billionth of a meter. That’s roughly the diameter of the smallest atoms. (Paul Scherrer Institute, Andy Freeberg/SLAC)

    The X-ray laser movie shows what happens when light hits retinal, a key part of vision in animals and photosynthesis in microbes. The action takes place in a trillionth of an eye blink.

    Scientists have made the first molecular movie of the instant when light hits a sensor that’s widely used in nature for probing the environment and harvesting energy from light. The sensor, a form of vitamin A known as retinal, is central to a number of important light-driven processes in people, animals, microbes and algae, including human vision and some forms of photosynthesis, and the movie shows it changing shape in a trillionth of an eye blink.

    “To my knowledge, nobody has measured changes in a retinal biosensor so quickly and so accurately,” said Jörg Standfuss, a biologist at the Paul Scherrer Institute (PSI) in Switzerland who led the research at the Department of Energy’s SLAC National Accelerator Laboratory. “And the fact that we saw just the opposite of what we intuitively expected was spectacular and surprising to us.”

    The team carried out their experiments at the lab’s Linac Coherent Light Source (LCLS) X-ray laser and reported the results today in Science.


Coming soon (a really bad attempt at lab humor; in fact, it will be a while).

    SLAC/LCLS II projected view

    In the past, scientists had to fill the gaps in their knowledge about retinal’s behavior by making inferences based on theory and computer simulations, said Mark Hunter, a staff scientist at LCLS and paper co-author. But in this study, “LCLS’s super-short pulses allowed us to collect data on where the atoms actually were in space and how that changed over time,” he said, “so it gave us a much more direct visualization of molecules in motion.”

    Colorful Lakes and Arching Cats

    Retinal is so central to human vision – it’s named for the retina at the back of the eye – that scientists have been studying it for nearly a century, steadily building a more detailed picture of how it works. It’s also used in the burgeoning field of optogenetics to turn groups of nerve cells on and off, revealing how the brain works and how things go wrong in conditions like depression, stroke and addiction.

    The retinal studied in this experiment came from salt-loving microbes that use it to harvest energy from the sun. (Fun fact: Purple and orange-red pigments in these microbes give the briny waters they live in, from San Francisco Bay salt ponds to Senegal’s Lake Retba, their incredibly vivid colors.)

    Retinal does its job while snuggled deep into a pocket of specialized proteins in the membrane of the cell. When hit by light, the retinal changes shape – in this case it curves, like a cat arching its back. This creates a signal that’s transmitted by the protein into the cell’s interior, initiating photosynthesis or vision.

    Scientists thought retinal set off the signal by pushing on the protein pocket as it changed shape. But the LCLS experiments found just the opposite: The pocket actually changed shape first, creating space for the retinal to perform its arching-cat maneuver. Nearby water molecules also moved aside and made room, Standfuss said. It all took place within 200 to 500 femtoseconds, or millionths of a billionth of a second. That’s about a trillionth of the blink of an eye, making this one of the fastest chemical reactions known in living things.
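
A one-line sanity check of that comparison, assuming a typical blink of roughly 0.3 seconds (my assumption; the article gives no figure):

```python
# "A trillionth of an eye blink": 300 fs relative to an assumed ~0.3 s blink.
blink_s = 0.3
event_s = 300e-15          # midpoint of the 200-500 femtosecond range
print(event_s / blink_s)   # ~1e-12, i.e. about a trillionth of a blink
```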

    “In retrospect, this makes a lot of sense,” Standfuss said. “We always say seeing is believing in structural biology, and in this case it’s very true. The molecular movie we made makes it so obvious what’s going on that you can immediately grasp it. This solves a very important piece of the puzzle of how retinal works that people have been wondering about.”

    The protein pocket’s initial movements are triggered by small changes in electrical charge that rearrange certain chemical bonds, he said. These movements guide the retinal’s response and make it much more efficient, which is why it requires only a few photons of light and why nature can use that light so effectively.


    In this pair of molecular movies we see the retinal molecule (in the middle of each frame) and parts of its surrounding protein pocket with their shapes defined by their electron clouds (blue lines). The top frame shows the retinal molecule from the side, and the bottom one shows it from the top as it curves in response to light. (Paul Scherrer Institute)

    Catching Molecules in Action

    How can you watch something so small that happens so fast? The X-ray laser was key, Standfuss said. LCLS produces brilliant pulses of X-ray laser light that scatter off the electrons in a sample and reveal how its atoms are arranged. Like a camera with an extreme zoom lens and ultrafast shutter speed, the X-ray laser can also make snapshots of molecules moving, breaking apart and interacting with each other.

    In this case, the researchers looked at samples of retinal snuggled into pockets of bacteriorhodopsin, a purple protein found in simple microbes like those in the salt ponds.

After years of effort, PSI postdoctoral researcher Przemyslaw Nogly, the lead author of the report, found ways to pack these retinal-protein pairs into thousands and thousands of tiny but well-ordered crystals. One after another, crystals were hit with light from an optical laser – a stand-in for sunlight – followed by X-ray laser pulses to record the response. Then Nogly and the team boiled the data down into 20 snapshots and assembled them into stop-action movies that show the retinal moving in sync with its protein pocket.

    Proteins like bacteriorhodopsin that sit in cell membranes are notoriously difficult to study because it’s so hard to form them into crystals for X-ray experiments, Hunter said. But scientists have learned that they crystallize more readily when embedded in a fatty, toothpaste-like sludge that mimics their natural environment, and that’s how these crystals were formed and delivered into the X-ray beam.

    The researchers were also able to detect “protein quakes,” vibrations that release some of the energy deposited by the light flashes. These had been predicted by theory and came off as expected.

    Standfuss said he has spent most of his career studying retinal and its role in vision, which involves slightly different shape changes in the protein-embedded molecule. “I really hope that we can now study the same reaction in many different systems,” he said. “Now that we see for the first time how it works in one particular bacterial protein, I want to understand how it works in the human eye as well.”

    LCLS researchers Sergio Carbajo, Jason Koglin, Matthew Seaberg and Thomas Lane were co-authors of this study. Other contributors came from PSI, the University of Gothenburg in Sweden, the Fritz Haber Center for Molecular Dynamics at the Hebrew University of Jerusalem, the RIKEN SPring-8 Center and Kyoto University in Japan, the Center for Free-Electron Laser Science at DESY in Germany and Arizona State University. Major funding came from the European Horizon 2020 Program, the Swedish Research Council and the Swiss National Science Foundation.

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     
  • richardmitnick 4:44 pm on June 14, 2018 Permalink | Reply
Tags: Applied Research & Technology, Entanglement on Demand, TU Delft

    From The Kavli Foundation and TU Delft: “Delft Scientists Make First ‘On Demand’ Entanglement Link” 


    From The Kavli Foundation

    and

    TU Delft

    June 13, 2018
    Contact details:
    Prof. dr. ir. Ronald Hanson
    QuTech, Delft University of Technology
    Lorentzweg 1, 2628 CJ Delft, Netherlands
    R.Hanson@tudelft.nl
    +31 15 27 86133

    Researchers at QuTech in Delft have succeeded in generating quantum entanglement between two quantum chips faster than the entanglement is lost. Entanglement – once referred to by Einstein as “spooky action” – forms the link that will provide a future quantum internet its power and fundamental security. Via a novel smart entanglement protocol and careful protection of the entanglement, the scientists led by Prof. Ronald Hanson are the first in the world to deliver such a quantum link ‘on demand’. This opens the door to connect multiple quantum nodes and create the very first quantum network in the world. They publish their results on 14 June in Nature.

    Quantum Internet

    By exploiting the power of quantum entanglement it is theoretically possible to build a quantum internet that cannot be eavesdropped on. However, the realization of such a quantum network is a real challenge: you have to be able to create entanglement reliably, ‘on demand’, and maintain it long enough to pass the entangled information to the next node. So far, this has been beyond the capabilities of quantum experiments.

Researchers from QuTech in Delft working on the ‘entanglement on demand’ experiment. The pictures show Prof. Ronald Hanson, Dr. Peter Humphreys and Dr. Norbert Kalb, all from the group of Prof. Ronald Hanson of Delft University.

Scientists at QuTech in Delft have now been the first to experimentally generate entanglement over a distance of two metres in a fraction of a second, ‘on demand’, and subsequently maintain this entanglement long enough to enable, in theory, further entanglement to a third node. ‘The challenge is now to be the first to create a network of multiple entangled nodes: the first version of a quantum internet’, professor Hanson states.

    Higher performance

In 2015, Ronald Hanson’s research group already made world news: they were the first to generate long-lived quantum entanglement over a distance (1.3 kilometres), allowing them to provide full experimental proof of quantum entanglement for the first time. This experiment is the basis of their current approach to developing a quantum internet: distant single electrons on diamond chips are entangled using photons as mediators.

    However, so far this experiment has not had the necessary performance to create a real quantum network. Hanson: ‘In 2015 we managed to establish a connection once an hour, while the connection only remained active for a fraction of a second. It was impossible to add a third node, let alone multiple nodes, to the network.’

    Entanglement on demand

    The scientists have now made multiple innovative improvements to the experiment. First of all, they demonstrated a new entanglement method. This allows for the generation of entanglement forty times a second between electrons at a distance of two metres. Peter Humphreys, an author of the paper, emphasises: ‘This is a thousand times faster than with the old method.’ In combination with a smart way of protecting the quantum link from external noise, the experiment has now surpassed a crucial threshold: for the first time, entanglement can be created faster than it is lost.
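
The threshold can be stated as a simple rate comparison: a new entangled pair must arrive, on average, before the previous link decays. The sketch below uses the article's 40-per-second generation rate together with a purely hypothetical link lifetime (the article does not quote the decoherence time):

```python
# Rate comparison behind "entanglement faster than it is lost". The 40 Hz
# success rate is from the article; the link lifetime is a placeholder
# assumption, not a measured QuTech figure.
generation_rate_hz = 40.0     # successful entanglement events per second
link_lifetime_s = 0.1         # assumed decoherence time of the stored link

mean_wait_s = 1.0 / generation_rate_hz   # ~0.025 s between successes
print(mean_wait_s < link_lifetime_s)     # True: the link can be sustained
```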

    Through technical improvements, the experimental setup is now always ready for ‘entanglement-on-demand’. Hanson: ‘Just like in the current internet, we always want to be online, the system has to entangle on each request.’ The scientists have achieved this by adding smart quality checks. Humphreys: ‘These checks only take a fraction of the total experimental time, while allowing us to ensure that our system is ready for entanglement, without any manual action’.

    Networks

The researchers already demonstrated last year that they were able to protect a quantum entangled link while a new connection was generated (see https://qutech.nl/one-step-closer-to-the-quantum-internet-by-distillation/). By combining this and their new results, they are ready to create quantum networks with more than two nodes. The Delft scientists now plan to realize such a network between several quantum nodes. Hanson: ‘In 2020, we want to connect four cities in the Netherlands via quantum entanglement. This will be the very first quantum internet in the world.’

    This work was supported by the Netherlands Organisation for Scientific Research (NWO) through a VICI grant and by the European Research Council through a Starting Grant and a Synergy Grant.

See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Kavli Foundation, based in Oxnard, California, is dedicated to the goals of advancing science for the benefit of humanity and promoting increased public understanding and support for scientists and their work.

    The Foundation’s mission is implemented through an international program of research institutes, professorships, and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics as well as prizes in the fields of astrophysics, nanoscience, and neuroscience.

Delft University of Technology (Dutch: Technische Universiteit Delft), also known as TU Delft, is the largest and oldest Dutch public technological university, located in Delft, Netherlands. It counts as one of the best universities for engineering and technology worldwide, typically placed within the top 20, and is repeatedly considered the best university of technology in the Netherlands.

With eight faculties and numerous research institutes, it hosts over 19,000 students (undergraduate and postgraduate), more than 2,900 scientists, and more than 2,100 support and management staff.

The university was established on 8 January 1842 by William II of the Netherlands as a Royal Academy, with the main purpose of training civil servants for the Dutch East Indies. The school rapidly expanded its research and education curriculum, becoming first a Polytechnic School in 1864 and then an Institute of Technology in 1905, gaining full university rights, and finally changing its name to Delft University of Technology in 1986.

    Dutch Nobel laureates Jacobus Henricus van ‘t Hoff, Heike Kamerlingh Onnes, and Simon van der Meer have been associated with TU Delft. TU Delft is a member of several university federations including the IDEA League, CESAER, UNITECH International, and 3TU.

     
  • richardmitnick 3:22 pm on June 14, 2018 Permalink | Reply
Tags: Applied Research & Technology, NIF achieves record double fusion yield

    From NIF at LLNL: “NIF achieves record double fusion yield” 

    From National Ignition Facility at Lawrence Livermore National Laboratory


    June 13, 2018
    Breanna Bishop
    bishop33@llnl.gov
    925-423-9802

This rendering of the inside of NIF’s target chamber shows the target positioner moving into place. Pulses from NIF’s high-powered lasers race through the facility at the speed of light and arrive at the center of the target chamber within a few trillionths of a second of each other, aligned to the accuracy of the diameter of a human hair. No image credit.

An experimental campaign conducted at Lawrence Livermore National Laboratory’s (LLNL) National Ignition Facility (NIF) has achieved a total fusion neutron yield of 1.9×10^16 neutrons and 54 kJ of fusion energy output — double the previous record. Researchers in LLNL’s Inertial Confinement Fusion (ICF) Program detail the results in a paper that will be published this week in Physical Review Letters.
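
The two headline numbers are mutually consistent: each D-T fusion releases about 17.6 MeV and produces one neutron, so the neutron yield fixes the energy output. A quick check:

```python
# Consistency check: D-T fusion releases ~17.6 MeV per reaction, with one
# neutron per reaction, so the neutron yield implies the energy output.
MEV_IN_JOULES = 1.602176634e-13

neutron_yield = 1.9e16
energy_kj = neutron_yield * 17.6 * MEV_IN_JOULES / 1e3
print(round(energy_kj, 1))  # ~53.6 kJ, matching the reported 54 kJ
```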

    NIF is the world’s largest and most energetic laser, designed to perform experimental studies of fusion ignition and thermonuclear burn, the phenomenon that powers the sun, stars and modern nuclear weapons. As a key component of the National Nuclear Security Administration’s Stockpile Stewardship Program, experiments fielded on NIF enable researchers to gain fundamental understanding of extreme temperatures, pressures and densities — knowledge that helps ensure the current and future nuclear stockpile is safe and reliable.

    The record-breaking experiments utilized a diamond capsule — a layer of ultra-thin high-density carbon containing the deuterium-tritium (DT) fusion fuel — seated inside a depleted uranium hohlraum. This approach allowed the researchers to greatly improve their control over the symmetry of the X-rays that drive the capsule, producing “rounder” and more symmetric implosions.

    “These results represent significant progress,” said Sebastien Le Pape, lead author of the paper and lead experimenter for the campaign. “By controlling the uniformity of the implosion, we’ve improved the compression of the hot spot leading to unprecedented hot spot pressure and areal density.”

    In addition to increased yield, the experiments produced other critical results. For the first time, the hot spot pressure topped out at approximately 360 Gbar (360 billion atmospheres) — exceeding the pressure at the center of the sun. Further, these record yields mean there was a record addition of energy to the hot spot due to fusion alpha particles. By depositing their energy rather than escaping, the alpha particles further heat the fuel, increasing the rate of fusion reactions and thus producing more alpha particles. This leads to yield amplification, which in these experiments was almost a factor of 3. As the implosions are further improved, this yield amplification could eventually lead to fusion ignition.

    “Because of the extreme levels of compression that these implosions have achieved, we are now at the threshold of achieving a ‘burning plasma’ state, where alpha-particle deposition in the fusing plasma is the dominant source of heating in that plasma,” said Omar Hurricane, chief scientist for the ICF Program.

    “Each experiment we do unlocks important data that informs how we design and field future experiments,” added NIF Director Mark Herrmann. “These results represent a significant advancement in our knowledge and will enable our next steps in tackling the difficult scientific challenge of ignition.”

    In addition, the experiments achieved conditions that now enable access to a range of nuclear and astrophysical regimes. The density, temperature and pressure of the hot spot are the closest to conditions in the sun, and the neutron density is now applicable for nucleosynthesis studies, which have traditionally needed an intense, laboratory-based neutron source. The conditions also are relevant for studying fundamental nuclear weapons physics.

    Additional experiments have shown similar levels of performance, confirming the importance of this approach. Looking ahead, LLNL plans to advance its experiments by exploring increased capsule size, energy delivery on NIF and improvements to features such as the capsule fill tube.

    “Every time we make progress, we can better understand what challenges lie ahead,” said Laura Berzak Hopkins, lead designer for the experiments. “Now, we’re in an exciting place where we understand our system a lot better than before, and we’ve been able to take that understanding and translate it into increased performance. I’m very excited about the progress we’ve been able to make, and where we can go next.”

In addition to Le Pape, Hurricane and Berzak Hopkins, co-authors include Laurent Divol, Arthur Pak, Eduard Dewald, Suhas Bhandarkar, Laura Benedetti, Thomas Bunn, Juergen Biener, Daniel Casey, David Fittinghoff, Clement Goyon, Steven Haan, Robert Hatarik, Darwin Ho, Nobuhiko Izumi, Shahab Khan, Tammy Ma, Andrew Mackinnon, Andrew MacPhee, Brian MacGowan, Nathan Meezan, Jose Milovich, Marius Millot, Pierre Michel, Sabrina Nagel, Abbas Nikroo, Prav Patel, Joseph Ralph, James Ross, David Strozzi, Michael Stadermann, Charles Yeamans, Christopher Weber and Deborah Callahan of LLNL; Jay Crippen, Martin Havre, Javier Jaquez and Neal Rice of General Atomics; Dana Edgell of the University of Rochester’s Laboratory for Laser Energetics; Maria Gatu-Johnson of the Massachusetts Institute of Technology’s Plasma Science and Fusion Center; George Kyrala and Petr Volegov of Los Alamos National Laboratory; and Christoph Wild of Diamond Materials GmbH.

See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The National Ignition Facility, or NIF, is a large laser-based inertial confinement fusion (ICF) research device, located at the Lawrence Livermore National Laboratory in Livermore, California. NIF uses lasers to heat and compress a small amount of hydrogen fuel with the goal of inducing nuclear fusion reactions. NIF’s mission is to achieve fusion ignition with high energy gain, and to support nuclear weapon maintenance and design by studying the behavior of matter under the conditions found within nuclear weapons. NIF is the largest and most energetic ICF device built to date, and the largest laser in the world.

    Construction on the NIF began in 1997 but management problems and technical delays slowed progress into the early 2000s. Progress after 2000 was smoother, but compared to initial estimates, NIF was completed five years behind schedule and was almost four times more expensive than originally budgeted. Construction was certified complete on 31 March 2009 by the U.S. Department of Energy, and a dedication ceremony took place on 29 May 2009. The first large-scale laser target experiments were performed in June 2009 and the first “integrated ignition experiments” (which tested the laser’s power) were declared completed in October 2010.

    Bringing the system to its full potential was a lengthy process that was carried out from 2009 to 2012. During this period a number of experiments were worked into the process under the National Ignition Campaign, with the goal of reaching ignition just after the laser reached full power, some time in the second half of 2012. The Campaign officially ended in September 2012, at about 1⁄10 the conditions needed for ignition. Experiments since then have pushed this closer to 1⁄3, but considerable theoretical and practical work is required if the system is ever to reach ignition. Since 2012, NIF has been used primarily for materials science and weapons research.

National Ignition Facility – NIF at LLNL

The preamplifiers of the National Ignition Facility are the first step in increasing the energy of laser beams as they make their way toward the target chamber.

    Lawrence Livermore National Laboratory (LLNL) is an American federal research facility in Livermore, California, United States, founded by the University of California, Berkeley in 1952. A Federally Funded Research and Development Center (FFRDC), it is primarily funded by the U.S. Department of Energy (DOE) and managed and operated by Lawrence Livermore National Security, LLC (LLNS), a partnership of the University of California, Bechtel, BWX Technologies, AECOM, and Battelle Memorial Institute in affiliation with the Texas A&M University System. In 2012, the laboratory had the synthetic chemical element livermorium named after it.

LLNL is self-described as “a premier research and development institution for science and technology applied to national security.” Its principal responsibility is ensuring the safety, security and reliability of the nation’s nuclear weapons through the application of advanced science, engineering and technology. The Laboratory also applies its special expertise and multidisciplinary capabilities to preventing the proliferation and use of weapons of mass destruction, bolstering homeland security and solving other nationally important problems, including energy and environmental security, basic science and economic competitiveness.

    Operated by Lawrence Livermore National Security, LLC, for the Department of Energy’s National Nuclear Security Administration


     
  • richardmitnick 8:24 pm on June 13, 2018 Permalink | Reply
Tags: Applied Research & Technology, Exascale supercomputing still to come, NQI - Congress’s National Quantum Initiative

    From Science Magazine: “Quantum physics gets attention—and brighter funding prospects—in Congress” 

    From Science Magazine

    Jun. 13, 2018
    Gabriel Popkin

Ions trapped between gold blades serve as information-carrying qubits in a prototype quantum computer. (Credit: E. Edwards/Joint Quantum Institute)

    Many members of Congress admit they find quantum physics mind-boggling, with its counterintuitive account of the subatomic world. But that isn’t stopping U.S. lawmakers, as well as policymakers in President Donald Trump’s administration, from backing an emerging effort to better organize and boost funding for quantum research, which could reshape computing, sensors, and communications.

    ORNL IBM AC922 SUMMIT supercomputer just launched by OLCF at ORNL, and there is more to come as we approach exascale supercomputing

    In the coming weeks, the science committee of the House of Representatives is expected to introduce legislation calling for a new, 10-year-long National Quantum Initiative (NQI). The White House, for its part, is scheduled to formally launch a new panel that will guide the federal government’s role in quantum science. Key science agencies are calling on Congress to accelerate spending on quantum research. And the Senate supports a boost for the field: Last week, it approved a mammoth defense policy bill that includes a provision directing the Pentagon to create a new $20 million quantum science program.

    A yearlong push by a coalition of academic researchers and technology firms helped trigger this flurry of activity. Proponents argue the United States needs a better plan for harvesting the potential fruits of quantum research—and for keeping up with global competitors.

    LLNL IBM Sierra ATS2 supercomputer still to come

    Depiction of ANL ALCF Cray Shasta Aurora supercomputer still to come

    The European Union has launched a decadelong quantum research initiative, and China is said to be investing heavily in the field. The United States is “kind of the only major country that’s not doing something [?],” says Chris Monroe, a physicist at the University of Maryland in College Park and co-founder of a startup developing quantum computers, which could outstrip conventional computers on certain problems. [I guess what is depicted below is someone’s idea of nothing.]

    Quantum computing – IBM I

    IBM Quantum Computing

    Last June, a small group of academics, executives, and lobbyists that includes Monroe released a white paper calling for an NQI; they issued a blueprint for the effort in April. Meanwhile, the House science committee held a hearing on the topic last October and plans to release a bill later this month that draws extensively from the blueprint.

    “We must ensure that the United States does not fall behind other nations that are advancing quantum programs,” Science committee chair Lamar Smith (R–TX) said yesterday in a statement about the bill.

    The legislation will authorize the Department of Energy (DOE) and the National Science Foundation (NSF) to create new research centers at universities, federal laboratories, and nonprofit research institutes, according to a committee spokesperson. These research hubs would aim to build alliances between physicists doing fundamental research, engineers who can build devices, and computer scientists developing quantum algorithms. The centers could give academics seeking to develop commercial technologies access to expertise and expensive research tools, says physicist David Awschalom of the University of Chicago in Illinois, one of the blueprint’s authors. “The research needs rapidly outpace any individual lab,” he says.

    The proposal “sounds really promising,” says Danna Freedman, a chemist at Northwestern University in Evanston, Illinois, who did not contribute to the proposal. But Freedman, who synthesizes materials that could be used to build new kinds of quantum computer components, says her enthusiasm “depends to what extent the government decides to prescribe the research.”

    The blueprint recommends that the hubs focus on three areas: developing ultraprecise quantum sensors for biomedicine, navigation, and other applications; hack-proof quantum communication; and quantum computers. The bill will likely leave it up to federal agencies, the new White House quantum panel, and an outside advisory group to determine the initiative’s focus. Backers also say the effort could help advance the development of software for quantum computers—a major hurdle. Right now, just “tens or hundreds of people” can program quantum computers, says William Zeng of Rigetti Computing, a startup in Berkeley, California, seeking to build a quantum computer and offer quantum computing services. “That’s not going to be able to support building the full potential of the tech.”
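
For a sense of what programming a quantum computer involves today, here is a two-qubit entangling circuit written with Qiskit, an open-source toolkit chosen here as a representative example; the article mentions Rigetti, whose own software stack differs.

```python
# A two-qubit Bell-pair circuit in Qiskit; a representative (not
# article-endorsed) example of present-day quantum programming.
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)
qc.h(0)                      # put qubit 0 into an equal superposition
qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])   # read out both qubits
print(qc.draw())             # ASCII sketch of the circuit
```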

    It’s not yet known how much funding the House bill, which Republicans on the science panel are crafting, will recommend. The blueprint envisions channeling $800 million over 5 years to the NQI, but even if the bill endorses that figure, congressional appropriators will have the final say. Also uncertain is whether Democrats will sign on and help ensure passage through the full House, and whether the Senate will support the idea.

    In the meantime, lawmakers and the Trump administration are moving to shore up federal spending on quantum science, which analysts in 2016 estimated at about $200 million a year. Adding to the $20 million boost approved by the Senate (but not yet by the entire Congress), Trump’s 2019 budget request would create a new $30 million “Quantum Leap” initiative at NSF and boost DOE’s quantum research programs to $105 million.

The United States, long seen as a leader, is facing growing global competition in the quantum field, says Walter Copan, director of the National Institute of Standards and Technology in Gaithersburg, Maryland, which has long played a role in quantum research. “It is the equivalent of a space race now,” says Copan, who met last week with Smith. Focusing federal resources on the field, Copan adds, “has phenomenal promise for the country—if it’s done right.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 11:32 am on June 12, 2018 Permalink | Reply
    Tags: Applied Research & Technology, , , ,   

    From UC Santa Barbara: “Under the Sea” 

    UC Santa Barbara Name bloc
    From UC Santa Barbara

    June 5, 2018
    Jeff Mitchell

    Earth scientist Zach Eilon plumbs the depths of the Pacific Ocean to learn more about plate tectonics.

    The Pacific ORCA science party on board the research vessel Kilo Moana; UCSB’s Zach Eilon is seventh from left. Photo Credit: Courtesy Zach Eilon

    Watchstanders processing data in the vessel’s computer lab spot an underwater volcano that has never before been imaged. Photo Credit: Courtesy Zach Eilon

    Preparing to deploy an Ocean Bottom Seismometer (OBS) at sunset. Photo Credit: Courtesy Zach Eilon

    Preparing to test all the OBS communication devices, temporarily housed in the “rosette”, sitting beneath the A-frame; the yellow packages on deck are the OBS instruments, awaiting deployment. Photo Credit: Courtesy Zach Eilon

    Voyaging across a vast swath of the Pacific Ocean to learn more about how the Earth’s tectonic plates work, scientist Zach Eilon was assisted along the way by friendly deep-sea denizen SpongeBob SquarePants.

    No, the beloved animated character wasn’t really there, but SpongeBob was the nickname Eilon, a UC Santa Barbara assistant professor of earth sciences, gave the sophisticated instrument that played a key role in his research.

    Otherwise known as ocean bottom seismometers, or OBSs, these instruments are sensitive enough to detect earthquakes on the other side of the world.

    While the seismometers themselves sit on the seafloor, they are attached to a bright yellow flotation package — hence, the SpongeBob comparison — and are about a meter in width. The packages are affixed to a plastic base containing complex electronics.

    Eilon and collaborators carefully placed 30 of them on the ocean floor about 2,000 miles southeast of Hawaii during their recent Pacific ORCA (Pacific OBS Research into Convecting Asthenosphere) expedition aboard the U.S. Navy research vessel Kilo Moana.

    U.S. Navy research vessel Kilo Moana

    The trip and the experiment were part of an ongoing, high-profile international effort to seismically instrument the Pacific Ocean, with UCSB one of three lead institutions in the U.S.

    Oceanic plates make up 70 percent of the Earth’s surface and offer important windows into the Earth’s mantle, Eilon said, yet they are largely unexplored due to the obvious challenge of putting sensitive electronics three miles beneath the sea surface. The earth science community has identified several unanswered questions regarding the thermal structure of oceanic plates, the significance of volcanism in the middle of oceanic plates, and how the convecting mantle beneath the plates controls their movements.

    Undulations in the gravity field and unexplained shallowing of the ocean floors hint that small-scale convection may be occurring beneath the oceanic plates, but this remains unconfirmed, according to Eilon. The new experiment could help prove it.

    “Our little instruments will sit on the ocean floor for approximately 15 months, recording earthquakes around the world,” he said. “When we return to retrieve them next year they’ll hold seismic data in their memory banks that could change the way in which we understand the oceanic plates. That understanding is pretty significant, considering that these plates make up about 70 percent of our planet’s surface.”

    When they are recovered in July 2019, the OBS units are expected to provide data that allows Eilon and his collaborators to make 3-D images of the oceanic tectonic plates – a bit like taking a CAT-scan of the Earth. Of particular interest is the mysterious asthenosphere, the zone of Earth’s mantle lying beneath the lithosphere (the tectonic plate) and believed to be much hotter and more fluid than rocks closer to the surface. The asthenosphere extends from about 60 miles to about 250 miles below Earth’s surface.

    Once ready for deployment, the weighted instrument packages are designed to carefully sink upright to the seafloor. When the science party returns to the site, the ship will send an acoustic signal down to the individual science packages, commanding them to release the weight holding them down, allowing the buoyant yellow “SpongeBob” portion of the device to slowly float them to the surface, he explained.

    Once on the surface, the ship’s crew will home in on the package (which has a light, flag, and radio so the scientists can locate it) and lift it from the sea. From there, the science team will download the seismic data: detailed records of ocean-floor vibrations. Turning these wiggles into 3-D images requires highly complex computer processing and mathematics.
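
    A first look at such records, before any 3-D imaging, is routine with standard seismology software. Below is a minimal sketch using the open-source ObsPy library; the filename and the filter band are illustrative assumptions, not details from the Pacific ORCA project.

# Minimal sketch: inspect an ocean-bottom seismometer record with ObsPy.
# The filename and bandpass corners are illustrative assumptions.
from obspy import read

stream = read("obs_station_01.mseed")  # hypothetical miniSEED file

# Remove the mean and slow drift, then keep the frequency band where
# body waves from distant earthquakes are typically strongest.
stream.detrend("demean")
stream.filter("bandpass", freqmin=0.02, freqmax=0.1)

for trace in stream:
    print(trace.id, trace.stats.starttime, trace.stats.sampling_rate)

stream.plot()  # a quick visual check of the recorded "wiggles"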

    Eilon said that in addition to giving researchers a better idea of how the Earth’s tectonic plates work, the data is expected to provide important information about geologic hazards.

    “By improving our understanding of interactions between plates, the data we collect should improve our ability to forecast earthquakes and volcanic eruptions,” he said, “which I hope will help authorities save lives when these events occur.”

    Eilon, along with co-principal investigator Jim Gaherty of Columbia University, led the expedition’s diverse 14-member science team (drawn from 11 institutions across three continents). The $4-million research project is supported by the National Science Foundation.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    UC Santa Barbara Seal
    The University of California, Santa Barbara (commonly referred to as UC Santa Barbara or UCSB) is a public research university and one of the 10 general campuses of the University of California system. Founded in 1891 as an independent teachers’ college, UCSB joined the University of California system in 1944 and is the third-oldest general-education campus in the system. The university is a comprehensive doctoral university and is organized into five colleges offering 87 undergraduate degrees and 55 graduate degrees. In 2012, UCSB was ranked 41st among “National Universities” and 10th among public universities by U.S. News & World Report. UCSB houses twelve national research centers, including the renowned Kavli Institute for Theoretical Physics.

     
  • richardmitnick 11:06 am on June 12, 2018 Permalink | Reply
    Tags: Applied Research & Technology, , Korean Superconducting Tokamak Advanced Research (KSTAR), Magnetic islands- bubble-like structures form in fusion plasmas, ,   

    From PPPL: “New model sheds light on key physics of magnetic islands that halt fusion reactions” 


    From PPPL

    June 6, 2018
    John Greenwald

    The Korean Superconducting Tokamak Advanced Research facility. (Photo courtesy of the Korean National Fusion Research Institute.)

    Magnetic islands, bubble-like structures that form in fusion plasmas, can grow and disrupt the plasmas and damage the doughnut-shaped tokamak facilities that house fusion reactions. Recent research at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) has used large-scale computer simulations to produce a new model that could be key to understanding how the islands interact with the surrounding plasma as they grow and lead to disruptions.

    The findings, which overturn long-held assumptions of the structure and impact of magnetic islands, are from simulations led by visiting physicist Jae-Min Kwon. Kwon, on a year-long sabbatical from the Korean Superconducting Tokamak Advanced Research (KSTAR) facility, worked with physicists at PPPL to model the detailed and surprising experimental observations recently made on KSTAR.

    Researchers intrigued

    “The experiments intrigued many KSTAR researchers including me,” said Kwon, first author of the new theoretical paper selected as an Editor’s Pick in the journal Physics of Plasmas. “I wanted to understand the physics behind the sustained plasma confinement that we observed,” he said. “Previous theoretical models assumed that the magnetic islands simply degraded the confinement instead of sustaining it. However, at KSTAR, we didn’t have the proper numerical codes needed to perform such studies, or enough computer resources to run them.”

    The situation turned Kwon’s thoughts to PPPL, where he has interacted over the years with physicists who work on the powerful XGC numerical code that the Laboratory developed. “Since I knew that the code had the capabilities that I needed to study the problem, I decided to spend my sabbatical at PPPL,” he said.

    Kwon arrived in 2017 and worked closely with C.S. Chang, a principal research physicist at PPPL and leader of the XGC team, and PPPL physicists Seung-Ho Ku and Robert Hager. The researchers modeled magnetic islands using plasma conditions from the KSTAR experiments. The structure of the islands proved markedly different from standard assumptions, as did their impact on plasma flow, turbulence, and plasma confinement during fusion experiments.

    Fusion, the power that drives the sun and stars, is the fusing of light atomic elements in the form of plasma — the hot, charged state of matter composed of free electrons and atomic nuclei — that generates massive amounts of energy. Scientists are seeking to replicate fusion on Earth for a virtually inexhaustible supply of power to generate electricity.

    Long-absent understanding

    “Understanding how islands interact with plasma flow and turbulence has been absent until now,” Chang said. “Because of the lack of detailed calculations on the interaction of islands with complicated particle motions and plasma turbulence, the estimate of the confinement of plasma around the islands and their growth has been based on simple models and not well understood.”

    The simulations found that the plasma profile inside the islands is not constant, as previously thought, but instead has a radial structure. They also showed that turbulence can penetrate the islands and that the plasma flow across them can be strongly sheared, moving in opposite directions on either side. As a result, plasma confinement can be maintained even as the islands grow.
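
    For readers new to island geometry, a standard textbook slab model captures the basic picture; it is an illustration only, not the XGC calculation. A flux function psi(x, y) = x^2/2 - A*cos(k*y) describes a chain of islands whose separatrix (the surface psi = A) encloses a region of full radial width 4*sqrt(A). The sketch below, with assumed values of A and k, plots those flux surfaces.

import numpy as np
import matplotlib.pyplot as plt

# Textbook slab model of a magnetic island chain (illustration only,
# not the XGC simulation): psi = x**2 / 2 - A * cos(k * y).
# The separatrix psi = A encloses islands of full width 4 * sqrt(A).
A, k = 0.25, 1.0  # assumed, illustrative values
x = np.linspace(-2.5, 2.5, 400)
y = np.linspace(0.0, 4.0 * np.pi, 400)
X, Y = np.meshgrid(x, y, indexing="ij")
psi = X**2 / 2.0 - A * np.cos(k * Y)

plt.contour(Y, X, psi, levels=20)                 # flux surfaces
plt.contour(Y, X, psi, levels=[A], colors="red")  # the separatrix
plt.xlabel("distance along the rational surface")
plt.ylabel("radial distance from the rational surface")
plt.title(f"Slab-model island chain, full width = {4 * np.sqrt(A):.2f}")
plt.show()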

    These surprising findings contradicted past models and agreed with the experimental observations made on KSTAR. “The study exhibits the power of supercomputing on problems that could not be studied otherwise,” Chang said. “These findings could lay new groundwork for understanding the physics of plasma disruption, which is one of the most dangerous events a tokamak reactor could encounter.”

    Millions of processor hours

    Computing the new model required 6.2 million processor-core hours on the Cori supercomputer at the National Energy Research Scientific Computing Center (NERSC), a DOE Office of Science user facility at Lawrence Berkeley National Laboratory. That is roughly 700 years of continuous computing on a single processor core. “What I wanted was quantitatively accurate results that could be directly compared with the KSTAR data,” Kwon said. “Fortunately, I could access enough resources on NERSC to achieve that goal through the allocation given to the XGC program. I am grateful for this opportunity.”

    Going forward, a larger-scale computer could allow the XGC code to start from the spontaneous formation of the magnetic islands and show how they grow in self-consistent interaction with the sheared plasma flow and plasma turbulence. The results could lead to a way to prevent disastrous disruptions in fusion reactors.

    Coauthors of the Physics of Plasmas paper, in addition to the PPPL researchers, were Minjun Choi, Hyungho Lee, and Hyunseok Kim of the Korean National Fusion Research Institute (NFRI), and Eisung Yoon of Rensselaer Polytechnic Institute. Support for this work comes from the DOE Office of Science and NFRI.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    PPPL campus

    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University. PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

     
  • richardmitnick 5:23 pm on June 11, 2018 Permalink | Reply
    Tags: Applied Research & Technology, , Graphene yet again, , Monitoring electromagnetic radiation, , The graphene is coupled to a device called a photonic nanocavity   

    From MIT News: “A better device for measuring electromagnetic radiation” 

    MIT News
    MIT Widget

    From MIT News

    June 11, 2018
    David Chandler

    Schematic illustration of the experimental setup. Image courtesy of the researchers.

    New bolometer is faster, simpler, and covers more wavelengths.

    Bolometers, devices that monitor electromagnetic radiation through heating of an absorbing material, are used by astronomers and homeowners alike. But most such devices have limited bandwidth and must be operated at ultralow temperatures. Now, researchers say they’ve found an ultrafast yet highly sensitive alternative that can work at room temperature — and may be much less expensive.

    The findings, published today in the journal Nature Nanotechnology, could help pave the way toward new kinds of astronomical observatories for long-wavelength emissions, new heat sensors for buildings, and even new kinds of quantum sensing and information processing devices, the multidisciplinary research team says. The group includes recent MIT postdoc Dmitri Efetov, Professor Dirk Englund of MIT’s Department of Electrical Engineering and Computer Science, Kin Chung Fong of Raytheon BBN Technologies, and colleagues from MIT and Columbia University.

    “We believe that our work opens the door to new types of efficient bolometers based on low-dimensional materials,” says Englund, the paper’s senior author. He says the new system, based on the heating of electrons in a small piece of a two-dimensional form of carbon called graphene, for the first time combines both high sensitivity and high bandwidth — orders of magnitude greater than that of conventional bolometers — in a single device.

    “The new device is very sensitive, and at the same time ultrafast,” having the potential to take readings in just picoseconds (trillionths of a second), says Efetov, now a professor at ICFO, the Institute of Photonic Sciences in Barcelona, Spain, who is the paper’s lead author. “This combination of properties is unique,” he says.

    The new system also can operate at any temperature, he says, unlike current devices that have to be cooled to extremely low temperatures. Although most actual applications of the device would still be done under these ultracold conditions, for some applications, such as thermal sensors for building efficiency, the ability to operate without specialized cooling systems could be a real plus. “This is the first device of this kind that has no limit on temperature,” Efetov says.

    The new bolometer they built, and demonstrated under laboratory conditions, can measure the total energy carried by the photons of incoming electromagnetic radiation, whether that radiation is in the form of visible light, radio waves, microwaves, or other parts of the spectrum. That radiation may be coming from distant galaxies, or from the infrared waves of heat escaping from a poorly insulated house.

    The device is entirely different from traditional bolometers, which typically use a metal to absorb the radiation and measure the resulting temperature rise. Instead, this team developed a new type of bolometer that relies on heating electrons moving in a small piece of graphene, rather than heating a solid metal. The graphene is coupled to a device called a photonic nanocavity, which serves to amplify the absorption of the radiation, Englund explains.

    “Most bolometers rely on the vibrations of atoms in a piece of material, which tends to make their response slow,” he says. In this case, though, “unlike a traditional bolometer, the heated body here is simply the electron gas, which has a very low heat capacity, meaning that even a small energy input due to absorbed photons causes a large temperature swing,” making it easier to make precise measurements of that energy. Although graphene bolometers had previously been demonstrated, this work solves some of the important outstanding challenges, including efficient absorption into the graphene using a nanocavity, and the impedance-matched temperature readout.

    The new technology, Englund says, “opens a new window for bolometers with entirely new functionalities that could radically improve thermal imaging, observational astronomy, quantum information, and quantum sensing, among other applications.”

    For astronomical observations, the new system could help by filling in some of the remaining wavelength bands that have not yet had practical detectors to make observations, such as the “terahertz gap” of frequencies that are very difficult to pick up with existing systems. “There, our detector could be a state-of-the-art system” for observing these elusive rays, Efetov says. It could be useful for observing the very long-wavelength cosmic background radiation, he says.

    Daniel Prober, a professor of applied physics at Yale University who was not involved in this research, says, “This work is a very good project to utilize the many benefits of the ultrathin metal layer, graphene, while cleverly working around the limitations that would otherwise be imposed by its conducting nature.” He adds, “The resulting detector is extremely sensitive for power detection in a challenging region of the spectrum, and is now ready for some exciting applications.”

    And Robert Hadfield, a professor of photonics at the University of Glasgow, who also was not involved in this work, says, “There is huge demand for new high-sensitivity infrared detection technologies. This work by Efetov and co-workers reporting an innovative graphene bolometer integrated in a photonic crystal cavity to achieve high absorption is timely and exciting.”

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

     