Tagged: DOE’s Los Alamos National Laboratory (US)

  • richardmitnick 8:57 pm on October 18, 2021 Permalink | Reply
    Tags: "Breakthrough proof clears path for quantum AI", , , DOE’s Los Alamos National Laboratory (US), Novel theorem demonstrates convolutional neural networks can always be trained on quantum computers., ,   

    From DOE’s Los Alamos National Laboratory (US) : “Breakthrough proof clears path for quantum AI” 


    From DOE’s Los Alamos National Laboratory (US)

    October 15, 2021
    Charles Poling
    (505) 257-8006
    cpoling@lanl.gov

    Novel theorem demonstrates convolutional neural networks can always be trained on quantum computers, overcoming threat of ‘barren plateaus’ in optimization problems.

    A novel proof that certain quantum convolutional networks can be guaranteed to be trained clears the way for quantum artificial intelligence to aid in materials discovery and many other applications.

    Convolutional neural networks running on quantum computers have generated significant buzz for their potential to analyze quantum data better than classical computers can. While a fundamental solvability problem known as “barren plateaus” has limited the application of these neural networks for large data sets, new research overcomes that Achilles heel with a rigorous proof that guarantees scalability.

    “The way you construct a quantum neural network can lead to a barren plateau—or not,” said Marco Cerezo, coauthor of the paper in Physical Review X. Cerezo is a physicist specializing in quantum computing, quantum machine learning, and quantum information at Los Alamos. “We proved the absence of barren plateaus for a special type of quantum neural network. Our work provides trainability guarantees for this architecture, meaning that one can generically train its parameters.”

    As an artificial intelligence (AI) methodology, quantum convolutional neural networks are inspired by the visual cortex. As such, they involve a series of convolutional layers, or filters, interleaved with pooling layers that reduce the dimension of the data while keeping important features of a data set.
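
As a rough illustration of that architecture (a sketch only, not the paper's construction), the register shrinks geometrically: each pooling layer measures out half the qubits, so only a logarithmic number of layers separates input from readout.

```python
# Sketch of the layer structure described above: convolutional layers act on
# neighboring qubit pairs, and each pooling layer measures out half the qubits.
# Illustrative only -- not the construction from the Physical Review X paper.

def qcnn_layer_plan(n_qubits: int):
    """Return the sequence of (layer_type, qubits_remaining) steps."""
    plan = []
    while n_qubits > 1:
        plan.append(("convolution", n_qubits))  # two-qubit unitaries on pairs
        n_qubits //= 2                          # pooling halves the register
        plan.append(("pooling", n_qubits))
    plan.append(("measurement", n_qubits))      # read out the final qubit
    return plan

for layer, width in qcnn_layer_plan(16):
    print(f"{layer:12s} -> {width:2d} qubits")
```

Because the qubit count halves at every pooling step, circuit depth grows only logarithmically with input size, which is part of what makes this architecture amenable to a trainability analysis.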

    These neural networks can be used to solve a range of problems, from image recognition to materials discovery. Overcoming barren plateaus is key to extracting the full potential of quantum computers in AI applications and demonstrating their superiority over classical computers.

    Until now, Cerezo said, researchers in quantum machine learning analyzed how to mitigate the effects of barren plateaus, but they lacked a theoretical basis for avoiding them altogether. The Los Alamos work shows how some quantum neural networks are, in fact, immune to barren plateaus.

    “With this guarantee in hand, researchers will now be able to sift through quantum-computer data about quantum systems and use that information for studying material properties or discovering new materials, among other applications,” said Patrick Coles, a quantum physicist at Los Alamos and a coauthor of the paper.

    Many more applications for quantum AI algorithms will emerge, Coles thinks, as researchers use near-term quantum computers more frequently and generate more and more data—all machine learning programs are data-hungry.

    Avoiding the vanishing gradient

    “All hope of quantum speedup or advantage is lost if you have a barren plateau,” Cerezo said.

    The crux of the problem is a “vanishing gradient” in the optimization landscape. The landscape is composed of hills and valleys, and the goal is to train the model’s parameters to find the solution by exploring the geography of the landscape. The solution usually lies at the bottom of the lowest valley, so to speak. But in a flat landscape one cannot train the parameters because it’s difficult to determine which direction to take.

    That problem becomes particularly relevant as the number of data features increases. In fact, the landscape becomes exponentially flatter as the feature size grows. Hence, in the presence of a barren plateau, the quantum neural network cannot be scaled up.
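
To make "exponentially flat" concrete, here is a toy numerical sketch. It assumes the gradient variance decays as 2^(-n) with qubit number n, which is the generic barren-plateau scaling; the paper's point is that this QCNN architecture avoids it.

```python
import numpy as np

# Toy barren-plateau illustration: assume Var[dC/dtheta] ~ 2**(-n) for an
# n-qubit circuit (the generic flat-landscape scaling, NOT the QCNN's).
rng = np.random.default_rng(0)
for n_qubits in (4, 8, 16, 24):
    sigma = np.sqrt(2.0 ** (-n_qubits))          # assumed gradient std dev
    grads = rng.normal(0.0, sigma, size=10_000)  # gradients at random initializations
    print(f"n = {n_qubits:2d}: typical |gradient| ~ {np.abs(grads).mean():.1e}")
# Once typical gradients fall below hardware and shot-noise precision, the
# optimizer cannot tell which direction is downhill and training stalls.
```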

    The Los Alamos team developed a novel graphical approach for analyzing the scaling within a quantum neural network and proving its trainability.

    For more than 40 years, physicists have thought quantum computers would prove useful in simulating and understanding quantum systems of particles, which choke conventional classical computers. The type of quantum convolutional neural network that the Los Alamos research has proved robust is expected to have useful applications in analyzing data from quantum simulations.

    “The field of quantum machine learning is still young,” Coles said. “There’s a famous quote about lasers, when they were first discovered, that said they were a solution in search of a problem. Now lasers are used everywhere. Similarly, a number of us suspect that quantum data will become highly available, and then quantum machine learning will take off.”

    For instance, research is focusing on ceramic materials as high-temperature superconductors, Coles said, which could enable frictionless transportation such as magnetic levitation trains. But these materials have a large number of phases, influenced by temperature, pressure, and impurities, and analyzing and classifying all of those phases is a huge task that goes beyond the capabilities of classical computers.

    Using a scalable quantum neural network, a quantum computer could sift through a vast data set about the various states of a given material and correlate those states with phases to identify the optimal state for high-temperature superconducting.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Los Alamos National Laboratory (US) mission is to solve national security challenges through scientific excellence.

    LANL campus
    DOE’s Los Alamos National Laboratory, a multidisciplinary research institution engaged in strategic science on behalf of national security, is managed by Triad, a public-service-oriented national security science organization equally owned by its three founding members: The University of California, Texas A&M University (US), and Battelle Memorial Institute (Battelle), for the Department of Energy’s National Nuclear Security Administration. Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.

     
  • richardmitnick 12:35 pm on October 13, 2021 Permalink | Reply
    Tags: "Levitation yields better neutron-lifetime measurement", , , DOE’s Los Alamos National Laboratory (US), , ,   

    From DOE’s Los Alamos National Laboratory (US) via Science Alert (US) : “Levitation yields better neutron-lifetime measurement” 


    From DOE’s Los Alamos National Laboratory (US)

    via

    Science Alert (US)

    13 OCTOBER 2021
    MICHELLE STARR

    TanyaLovus/iStock/Getty Images Plus.

    We now know, to within a tenth of a percent, how long a neutron can survive outside the atomic nucleus before decaying into a proton.

    This is the most precise measurement yet of the lifespan of these fundamental particles, representing a more than two-fold improvement over previous measurements. This has implications for our understanding of how the first matter in the Universe was created from a soup of protons and neutrons in the minutes after the Big Bang.

    “The process by which a neutron ‘decays’ into a proton – with an emission of a light electron and an almost massless neutrino – is one of the most fascinating processes known to physicists,” said nuclear physicist Daniel Salvat of Indiana University Bloomington (US).

    “The effort to measure this value very precisely is significant because understanding the precise lifetime of the neutron can shed light on how the universe developed – as well as allow physicists to discover flaws in our model of the subatomic universe that we know exist but nobody has yet been able to find.”

    The research was conducted at the Los Alamos Neutron Science Center, where a special experiment is set up just for trying to measure neutron lifespans. It’s called the UCNtau project, and it involves ultra-cold neutrons (UCNs) stored in a magneto-gravitational trap.

    The neutrons are cooled almost to absolute zero and placed in the trap: a bowl-shaped chamber, enclosed in a vacuum jacket, lined with thousands of permanent magnets that levitate the neutrons.

    The magnetic field prevents the neutrons from depolarizing and, combined with gravity, keeps the neutrons from escaping. This design allows neutrons to be stored for up to 11 days.

    The researchers stored their neutrons in the UCNtau trap for 30 to 90 minutes, then counted the remaining particles after the allotted time. Over the course of repeated experiments, conducted between 2017 and 2019, they counted over 40 million neutrons, obtaining enough statistical data to determine the particles’ lifespan with the greatest precision yet.
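
At its core, extracting a lifetime from such counts is an exponential-decay fit, N(t) = N0·exp(-t/τ). Here is a minimal sketch with invented counts, using only the roughly 877.75-second lifetime quoted just below; this is not the UCNtau analysis itself, which treats systematics far more carefully.

```python
import numpy as np
from scipy.optimize import curve_fit

TAU = 877.75  # seconds -- the lifetime quoted in the article

def survivors(t, n0, tau):
    """Expected surviving neutrons after holding time t (exponential decay)."""
    return n0 * np.exp(-t / tau)

rng = np.random.default_rng(1)
t = np.array([1800.0, 2700.0, 3600.0, 4500.0, 5400.0])  # 30-90 minute holds
counts = rng.poisson(survivors(t, 1.0e6, TAU))           # synthetic Poisson counts

(n0_fit, tau_fit), cov = curve_fit(survivors, t, counts, p0=(1e6, 900.0))
print(f"fitted lifetime: {tau_fit:.2f} +/- {np.sqrt(cov[1, 1]):.2f} s")
# 877.75 s = 14 min 37.75 s, i.e. roughly 14 minutes and 38 seconds.
```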

    This lifespan is around 877.75 ± 0.28 seconds (14 minutes and 38 seconds), according to the researchers’ analysis. The refined measurement can help place important physical constraints on the Universe, including the formation of matter and dark matter.

    After the Big Bang, things happened relatively quickly. In the very first moments, the hot, ultra-dense matter that filled the Universe cooled into quarks and electrons; just millionths of a second later, the quarks coalesced into protons and neutrons.

    Knowing the lifespan of the neutron can help physicists understand what role, if any, decaying neutrons play in the formation of the mysterious mass in the Universe known as dark matter. This information can also help test the validity of something called the Cabibbo-Kobayashi-Maskawa matrix, which helps explain the behavior of quarks under the Standard Model of physics, the researchers said.

    “The underlying model explaining neutron decay involves the quarks changing their identities, but recently improved calculations suggest this process may not occur as previously predicted,” Salvat said.

    “Our new measurement of the neutron lifetime will provide an independent assessment to settle this issue, or provide much-searched-for evidence for the discovery of new physics.”

    The research has been accepted for publication in Physical Review Letters.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Los Alamos National Laboratory (US) mission is to solve national security challenges through scientific excellence.

    LANL campus
    DOE’s Los Alamos National Laboratory, a multidisciplinary research institution engaged in strategic science on behalf of national security, is managed by Triad, a public-service-oriented national security science organization equally owned by its three founding members: The University of California, Texas A&M University (US), and Battelle Memorial Institute (Battelle), for the Department of Energy’s National Nuclear Security Administration. Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.


     
  • richardmitnick 3:06 pm on October 4, 2021 Permalink | Reply
    Tags: "Supercomputers reveal how X chromosomes fold; deactivate", DOE’s Los Alamos National Laboratory (US),   

    From DOE’s Los Alamos National Laboratory (US) via phys.org : “Supercomputers reveal how X chromosomes fold; deactivate” 


    From DOE’s Los Alamos National Laboratory (US)

    via

    phys.org

    October 4, 2021

    RNA particles swarm an X chromosome from a mouse in a new visualization of X chromosome inactivation. Credit: DOE’s Los Alamos National Laboratory.

    Using supercomputer-driven dynamic modeling based on experimental data, researchers can now probe the process that turns off one X chromosome in female mammal embryos. This new capability is helping biologists understand the role of RNA and the chromosome’s structure in the X inactivation process, leading to a deeper understanding of gene expression and opening new pathways to drug treatments for gene-based disorders and diseases.

    “This is the first time we’ve been able to model all the RNA spreading around the chromosome and shutting it down,” said Anna Lappala, a visiting scientist at Los Alamos National Laboratory and a polymer physicist at Massachusetts General Hospital and the Harvard University (US) Department of Molecular Biology. Lappala is the first author of the paper posted Oct. 4 on bioRxiv. “From experimental data alone, which is 2D and static, you don’t have the resolution to see a whole chromosome at this level of detail. With this modeling, we can see the processes regulating gene expression, and the modeling is grounded in 2D experimental data from our collaborators at Massachusetts General Hospital and Harvard.”

    The model—considered 4D because it shows motion, including time as the fourth dimension—runs on Los Alamos supercomputers. The model also incorporates experimental data from mouse genomes obtained through a molecular method called 4DHiC. The combined molecular and computational methodology is a first.

    In the visualization, RNA particles swarm over the X chromosome. The tangled-spaghetti-like strands writhe, changing shape, then the particles engulf and penetrate the depths of the chromosome, turning it off. See the visualization:


    3D models reveal hidden process in X chromosome inactivation

    “The method allows us to develop an interactive model of this epigenetic process,” said Jeannie T. Lee, professor of Genetics at Harvard Medical School and vice chair in molecular biology at Massachusetts General Hospital, whose lab contributed the experimental data underpinning the model.

    Epigenetics is the study of changes in gene expression and heritable traits that don’t involve mutations in the genome.

    “What’s been missing in the field is some way for a user who’s not computationally savvy to go interactively into a chromosome,” Lee said. She compared using the Los Alamos model to using Google Earth, where “you can zoom into any location on an X chromosome, pick your favorite gene, see the other genes around it, and see how they interact.” That capability could lend insight into how diseases spread, for instance, she said.

    Based on the work in this paper, Los Alamos is currently developing a Google Earth-style browser where any scientist can upload their genomic data and view it dynamically in 3D at various magnifications, said Karissa Sanbonmatsu, a structural biologist at Los Alamos National Laboratory, corresponding author of the paper, and a project leader in developing the computational method.

    In mammals, a female embryo is conceived with two X chromosomes, one inherited from each parent. X inactivation shuts off the chromosome, a crucial step for the embryo to survive, and variations in X inactivation can trigger a variety of developmental disorders.

    The new Los Alamos model will facilitate a deeper understanding of gene expression and related problems, which could lead to pharmacological treatments for various gene-based diseases and disorders, Lee said.

    “Our main goal was to see the chromosome change its shape and to see gene-expression levels over time,” said Sanbonmatsu.

    To understand how genes are turned on and off, Sanbonmatsu said, “it really helps to know the structure of the chromosome. The hypothesis is that a compacted, tightly structured chromosome tends to turn off genes, but there are not a lot of smoking guns about this. By modeling 3D structures in motion, we can get closer to the relationship between structural compaction and turning off genes.”
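
One common way to quantify that compaction is the radius of gyration of the modeled bead chain. This is a generic polymer-physics measure, not necessarily the metric used in the paper, and the coordinates below are random stand-ins rather than the study's data.

```python
import numpy as np

def radius_of_gyration(coords: np.ndarray) -> float:
    """RMS distance of beads from their centroid -- a standard compaction measure."""
    centered = coords - coords.mean(axis=0)
    return float(np.sqrt((centered ** 2).sum(axis=1).mean()))

rng = np.random.default_rng(2)
open_chain = rng.normal(size=(5000, 3)).cumsum(axis=0)  # random-walk "active X" stand-in
compact_chain = open_chain * 0.3                        # crudely compacted "inactive X"

print(f"open chain    Rg = {radius_of_gyration(open_chain):8.1f}")
print(f"compact chain Rg = {radius_of_gyration(compact_chain):8.1f}")
# Tracking Rg (and local bead densities) over simulation time is one way to
# watch the gradual, piecemeal structural transition described above.
```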

    Lee likened the chromosome’s structure to origami. A complicated shape akin to an origami crane offers lots of surface for gene expression and might be biologically preferred to remain active.

    The model shows a variety of substructures in the chromosome. When it is shut down, “it’s a piecemeal process in which some substructures are kept but some are dissolved,” Sanbonmatsu said. “We see beginning, intermediate, and end stages, through a gradual transition. That’s important for epigenetics because it’s the first time we have been able to analyze the detailed structural transition in an epigenetic change.”

    The modeling also shows genes on the surface of the chromosome that escape X chromosome inactivation, confirming early experimental work. In the model, they cluster and apparently interact or work together on the surface of the chromosome.

    In another insight from the modeling, “As the chromosome goes from an active X, when it’s still fairly large, to a compact inactive X, that’s smaller, we notice there’s a core of the chromosome that’s extremely dense, but the surface is much less dense. We see a lot more motion on the surface too,” Lappala said. “Then there’s an intermediate region that’s not too fast or slow, where the chromosome can rearrange.”

    An inactive X can activate later in a process called age-related activation of inactive X. “It’s associated with problems in blood cells in particular that are known to cause autoimmunity,” Lee said. “Some research is trying pharmacologically to activate the inactive X to treat neurological disorders in children by giving them something back that’s missing on their active X chromosome. For instance, a child could have a mutation that can cause disease. We think if we can reactivate the normal copy on the inactive X, then we would have an epigenetic treatment for that mutation.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Los Alamos National Laboratory (US) mission is to solve national security challenges through scientific excellence.

    LANL campus
    DOE’s Los Alamos National Laboratory, a multidisciplinary research institution engaged in strategic science on behalf of national security, is managed by Triad, a public-service-oriented national security science organization equally owned by its three founding members: The University of California, Texas A&M University (US), and Battelle Memorial Institute (Battelle), for the Department of Energy’s National Nuclear Security Administration. Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.

     
  • richardmitnick 1:27 pm on September 22, 2021 Permalink | Reply
    Tags: "Tracking the big melt", Arctic permafrost – frozen ground – is rapidly thawing due to a warming climate., , , , DOE’s Los Alamos National Laboratory (US), E3SM: Energy Exascale Earth System Model, Earth’s rapidly changing Arctic coastal regions have an outsized climatic effect that echoes around the globe., , Researchers have shown that September Arctic sea ice extent is declining by about 13 percent each decade.   

    From DOE’s ASCR Discovery (US) : “Tracking the big melt” 

    From DOE’s ASCR Discovery (US)

    September 2021

    DOE’s Los Alamos National Laboratory (US) and DOE’s Oak Ridge National Laboratory (US) scientists lead a DOE supercomputing effort to model the complex interactions affecting climate change in Arctic coastal regions.

    Beaufort Sea ice, April 2007. Photo courtesy of Andrew Roberts, Los Alamos National Laboratory.

    Earth’s rapidly changing Arctic coastal regions have an outsized climatic effect that echoes around the globe. Tracking processes behind this evolution is a daunting task even for the best scientists.

    Coastlines are some of the planet’s most dynamic areas – places where marine, terrestrial, atmospheric and human actions meet. But Arctic coastal regions face the most troubling consequences of human-caused climate change, driven by increasing greenhouse gas emissions, says Los Alamos National Laboratory (LANL) scientist Andrew Roberts.

    “Arctic coastal systems are very fragile,” says Roberts, who leads the high-performance computing systems element of a broader Department of Energy (DOE) Office of Science effort, led by its Biological and Environmental Research (BER) office, to simulate changing Arctic coastal conditions. “Until the last several decades, thick, perennial Arctic sea ice appears to have been generally stable. Now, warming temperatures are causing it to melt.”

    In the 1980s, multiyear ice at least four years old accounted for more than 30 percent of Arctic coverage; that has shrunk to not much more than 1 percent today. Whereas that perennial pack ice circulates around the Arctic, another type known as land-fast ice – anchored to a shoreline or the ocean bottom, acting as a floating land extension – is receding toward the coast due to rising temperatures.

    This exposes coastal regions to damaging waves that can disperse ice and erode coastal permafrost, Roberts says.

    Researchers have shown that September Arctic sea ice extent is declining by about 13 percent each decade, as the Arctic warms more than twice as fast as the rest of the planet – what scientists call “Arctic amplification.”
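
Compounding that rate shows how quickly the decline accumulates. The quick check below is a purely illustrative extrapolation of the 13-percent figure, not a forecast:

```python
# Compound a 13%-per-decade decline in September Arctic sea-ice extent.
decline_per_decade = 0.13
for decades in (1, 2, 3, 5):
    remaining = (1.0 - decline_per_decade) ** decades
    print(f"after {decades} decade(s): {remaining:.0%} of today's extent")
# After five decades only about half remains -- assuming, unrealistically,
# that the percentage trend neither slows nor accelerates.
```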

    Changes in Arctic sea-ice and land-ice melting can disrupt the so-called global ocean conveyor belt that circulates water around the planet and helps stabilize the climate, Roberts reports. The conveyor moves cold, dense, salty water from the poles to the tropical oceans, which send warm water in return.

    The Arctic is now stuck in a crippling feedback loop: sea ice can reflect 80 percent or more of sunlight back into space, but its relentless decline leaves ever-larger areas of dark, open ocean in its place each summer, which absorb more than 90 percent of noon sunlight, leading to more warming.

    Roberts and his colleagues tease out how reductions in Arctic ice and increases in Arctic temperatures affect flooding, marine biogeochemistry, shipping, natural resource extraction and wildlife habitat loss. The team also assesses the effects of climate change on traditional communities, where anthropogenic warming affects weather patterns and damages hunting grounds and infrastructure such as buildings and roads.

    Arctic permafrost – frozen ground – is rapidly thawing due to a warming climate. Some scientists predict that roughly 2.5 million square miles of this soil – about 40 percent of the world’s total – could disappear by the century’s end and release mammoth amounts of potent greenhouse gases, including methane, carbon dioxide and water vapor.

    The overall research project, the BER-sponsored Interdisciplinary Research for Arctic Coastal Environments (InteRFACE), is led by Joel Rowland, also from LANL, and is a multi-institutional collaboration that includes other national laboratories and universities. Roberts has overseen the computational aspects of the DOE project, which benefited from 650,000 node-hours of supercomputing time in 2020 at the DOE’s National Energy Research Scientific Computing Center (US) at DOE’s Lawrence Berkeley National Laboratory (US).

    The Arctic coastal calculations used NERSC’s Cori, a Cray XC40 system with about 700,000 processing cores that can perform 30 thousand trillion floating-point operations per second (30 petaflops).

    The LANL researchers, with colleagues from many other national laboratories, have relied on and contributed to development of a sophisticated DOE-supported research tool called the Energy Exascale Earth System Model (E3SM), letting them use supercomputer simulation and data-management to better understand changes in Arctic coastal systems. InteRFACE activities contribute to the development of E3SM and benefit from its broader development.

    E3SM portrays the atmosphere, ocean, land and sea ice – including the mass and energy changes between them – in high-resolution, three-dimensional models, focusing Cori’s computing power on small regions of big interest. The scientists have created grid-like meshes of triangular cells in E3SM’s sea-ice and ocean components to reproduce the region’s coastlines with high fidelity.
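
The principle behind such variable-resolution meshes can be sketched as a mesh-density rule: target cell size tightens near the coast and relaxes offshore, and the mesh generator is then driven by that rule. The function form and every number below are invented for illustration; they are not E3SM's actual configuration.

```python
import numpy as np

def target_cell_size_km(dist_to_coast_km, fine=10.0, coarse=100.0,
                        transition_km=400.0):
    """Illustrative mesh-density rule: fine cells at the coast, smoothly
    coarsening offshore. All numbers are invented for this sketch."""
    w = np.clip(np.asarray(dist_to_coast_km, dtype=float) / transition_km, 0.0, 1.0)
    return fine + (coarse - fine) * w

for d in (0, 50, 200, 400, 1000):
    print(f"{d:5d} km from coast -> {float(target_cell_size_km(d)):6.1f} km cells")
```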

    “One of the big questions is when melting sea ice will make the Arctic Ocean navigable year-round,” Roberts says. Although government and commercial ships – even cruise ships – have been able to maneuver through the Northwest Passage in the Canadian Archipelago in recent summers, by 2030 the region could be routinely navigable for many months of the year if sea-ice melting continues apace, he says.

    Continued E3SM development will help researchers better gauge how navigable the Northwest Passage is, with far greater fidelity than the traditional rectangular meshes used in many lower-resolution climate models allow, Roberts notes.

    E3SM features weather-scale resolution – that is, detailed enough to capture fronts, storms, and hurricanes – and uses advanced computers to simulate aspects of the Earth’s variability. The code helps researchers anticipate decadal-scale changes that could influence the U.S. energy sector in years to come.

    “If we had the computing power, we would like to have high-resolution simulations everywhere in the world,” he says. “But that is incredibly expensive to undertake.”

    Ethan Coon, an Oak Ridge National Laboratory scientist and a co-investigator of a related project supported by the DOE Advanced Scientific Computing Research (ASCR) program’s Leadership Computing Challenge (ALCC), says far-northern land warming “is transforming the Arctic hydrological cycle, and we are seeing significant changes in river and stream discharge.” The ALCC program allocates supercomputer time for DOE projects that emphasize high-risk, high-payoff simulations and that broaden the research community.

    Coon, an alumnus of the DOE Computational Science Graduate Fellowship, says warming is altering the pathways of rivers and streams. As thawing permafrost sinks lower below the surface, groundwater courses deeper underground and stays colder as it flows into streams – potentially affecting fish and other wildlife.

    What happens on land has a big ocean impact, Roberts agrees. At long last, he says, “we finally have the ability to really refine coastal regions and simulate their physical processes.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    ASCR Discovery is a publication of the U.S. Department of Energy.

    The United States Department of Energy (DOE) (US) is a cabinet-level department of the United States Government concerned with the United States’ policies regarding energy and safety in handling nuclear material. Its responsibilities include the nation’s nuclear weapons program; nuclear reactor production for the United States Navy; energy conservation; energy-related research; radioactive waste disposal; and domestic energy production. It also directs research in genomics; the Human Genome Project originated in a DOE initiative. DOE sponsors more research in the physical sciences than any other U.S. federal agency, the majority of which is conducted through its system of National Laboratories. The agency is led by the United States Secretary of Energy, and its headquarters are located in Southwest Washington, D.C., on Independence Avenue in the James V. Forrestal Building, named for James Forrestal, as well as in Germantown, Maryland.

    Formation and consolidation

    In 1942, during World War II, the United States started the Manhattan Project, a project to develop the atomic bomb, under the eye of the U.S. Army Corps of Engineers. After the war, in 1946, the Atomic Energy Commission (AEC) was created to control the future of the project. The Atomic Energy Act of 1946 also created the framework for the first National Laboratories. Among other nuclear projects, the AEC produced fabricated uranium fuel cores at locations such as the Fernald Feed Materials Production Center in Cincinnati, Ohio. In 1974, the AEC gave way to the Nuclear Regulatory Commission, which was tasked with regulating the nuclear power industry, and the Energy Research and Development Administration, which was tasked with managing the nuclear weapons, naval reactor, and energy development programs.

    The 1973 oil crisis called attention to the need to consolidate energy policy. On August 4, 1977, President Jimmy Carter signed into law The Department of Energy Organization Act of 1977 (Pub.L. 95–91, 91 Stat. 565, enacted August 4, 1977), which created the Department of Energy(US). The new agency, which began operations on October 1, 1977, consolidated the Federal Energy Administration; the Energy Research and Development Administration; the Federal Power Commission; and programs of various other agencies. Former Secretary of Defense James Schlesinger, who served under Presidents Nixon and Ford during the Vietnam War, was appointed as the first secretary.

    President Carter created the Department of Energy with the goal of promoting energy conservation and developing alternative sources of energy. He wanted to reduce the country’s dependence on foreign oil and its use of fossil fuels. With America’s energy future uncertain, Carter acted quickly to have the department up and running within the first year of his presidency, an urgent matter at a time when the oil crisis was causing shortages and inflation. The department also gave Carter a means to intervene after the Three Mile Island accident: he made changes within the Nuclear Regulatory Commission to fix its management and procedures, which was possible because nuclear energy and weapons are the responsibility of the Department of Energy.

    Recent

    On March 28, 2017, a supervisor in the Office of International Climate and Clean Energy asked staff to avoid the phrases “climate change,” “emissions reduction,” or “Paris Agreement” in written memos, briefings or other written communication. A DOE spokesperson denied that phrases had been banned.

    In a May 2019 press release concerning natural gas exports from a Texas facility, the DOE used the term ‘freedom gas’ to refer to natural gas. The phrase originated from a speech made by Secretary Rick Perry in Brussels earlier that month. Washington Governor Jay Inslee decried the term as “a joke”.

    Facilities

    The Department of Energy operates a system of national laboratories and technical facilities for research and development, as follows:

    Ames Laboratory
    Argonne National Laboratory
    Brookhaven National Laboratory
    Fermi National Accelerator Laboratory
    Idaho National Laboratory
    Lawrence Berkeley National Laboratory
    Lawrence Livermore National Laboratory
    Los Alamos National Laboratory
    National Energy Technology Laboratory
    National Renewable Energy Laboratory
    Oak Ridge National Laboratory
    Pacific Northwest National Laboratory
    Princeton Plasma Physics Laboratory
    Sandia National Laboratories
    Savannah River National Laboratory
    SLAC National Accelerator Laboratory
    Thomas Jefferson National Accelerator Facility

    Other major DOE facilities include:
    Albany Research Center
    Bannister Federal Complex
    Bettis Atomic Power Laboratory – focuses on the design and development of nuclear power for the U.S. Navy
    Kansas City Plant
    Knolls Atomic Power Laboratory – operates for Naval Reactors Program Research under the DOE (not a National Laboratory)
    National Petroleum Technology Office
    Nevada Test Site
    New Brunswick Laboratory
    Office of Fossil Energy
    Office of River Protection
    Pantex
    Radiological and Environmental Sciences Laboratory
    Y-12 National Security Complex
    Yucca Mountain nuclear waste repository
    Other:

    Pahute Mesa Airstrip – Nye County, Nevada, in supporting Nevada National Security Site

     
  • richardmitnick 12:22 pm on August 24, 2021 Permalink | Reply
    Tags: "Megadrought", "Mountains of Data-An Unprecedented Climate Observatory to Understand the Future of Water", , , , DOE’s Los Alamos National Laboratory (US), , , Mountain watersheds provide 60 to 90% of water resources worldwide., SAIL is a research campaign managed by DOE’s "Atmospheric Radiation Measurement (ARM)" project., The Colorado River system, The Upper Colorado River powers more than $1 trillion in economic activity and provides an immense amount of hydroelectric power but it’s very understudied compared to how important it is.   

    From DOE’s Lawrence Berkeley National Laboratory (US) and DOE’s Los Alamos National Laboratory (US) : “Mountains of Data-An Unprecedented Climate Observatory to Understand the Future of Water” 

    From DOE’s Lawrence Berkeley National Laboratory (US)

    and


    DOE’s Los Alamos National Laboratory (US)

    August 24th, 2021
    Julie Chao

    First-ever “bedrock-to-atmosphere” observation system could allow scientists to predict the future of water availability in the West.

    The “megadrought” impacting the Colorado River system this year has been devastating to the 40 million people who rely on it for water. But could this drought have been predicted? Will we be able to predict the next one?

    Mountain watersheds provide 60 to 90% of water resources worldwide, but there is still much that scientists don’t know about the physical processes and interactions that affect hydrology in these ecosystems. And thus, the best Earth system computer models struggle to predict the timing and availability of water resources emanating from mountains.

    Now a team of Department of Energy (US) scientists led by Lawrence Berkeley National Laboratory (Berkeley Lab) aims to plug that gap, with an ambitious campaign to collect a vast array of measurements that will allow scientists to better understand the future of water in the West. The Surface Atmosphere Integrated Field Laboratory (SAIL) campaign will start on September 1, when scientists flip the switch on a slew of machinery that has been amassed in the Upper Colorado River Basin.

    During the SAIL campaign instruments on the tower will measure core variables related to surface meteorology and collect radiation data. Credit: John Bilberry/DOE’s Los Alamos National Laboratory(US).

    Over the course of two falls, two winters, two springs, and a summer, more than three dozen scientific instruments – including a variety of radars, lidars, cameras, balloons, and other state-of-the-art equipment – will collect a treasure trove of data on precipitation, wind, clouds, aerosols, solar and thermal energy, temperature, humidity, ozone, and more. That data can then be used to turbocharge the capabilities of Earth system models and answer many scientific questions about how, why, where, and when rain and snow will fall. In close collaboration with researchers specializing in Earth’s surface and subsurface, the SAIL campaign will help the scientific community understand how mountains extract moisture from the atmosphere and then process the water all the way down to the bedrock beneath Earth’s surface. Ultimately, this will provide the tools for scientists to better predict the future availability of water.

    “The Upper Colorado River powers more than $1 trillion in economic activity and provides an immense amount of hydroelectric power but it’s very understudied compared to how important it is,” said Berkeley Lab scientist Daniel Feldman, the lead SAIL investigator. “We’re starting to see really dramatic consequences from the changing water resources, but the details of what is actually going on in these places where the water’s coming from – those details matter, and that’s what SAIL is focused on.”

    From the Arctic to the Rockies

    SAIL is a research campaign managed by DOE’s Atmospheric Radiation Measurement (ARM) user facility, a key contributor to climate research with its stationary and mobile climate observatories located throughout the United States and around the world. Much of the equipment being used in SAIL has just returned from a one-year Arctic expedition.

    “SAIL is a timely campaign because of the ongoing drought in the Western United States,” said Sally McFarlane, DOE Program Manager for the ARM user facility. “The Colorado River is of particular concern because it supplies water to 40 million people. SAIL is bringing together data from ARM and other research programs from within DOE to ultimately help provide insights into the atmospheric processes and land-atmosphere interactions that impact rain and snow in the upper Colorado River watershed.”


    The instruments are mostly housed in large containers sited in Gothic, Colorado, a picturesque old mining town near Crested Butte. The facility is hosted by the Rocky Mountain Biological Laboratory, which is dedicated to research on high-altitude ecosystems. A staff of three technicians will monitor the instruments around the clock.

    “This is a profound and incredibly unique opportunity and represents a first-of-its-kind experiment in mountainous systems worldwide, bridging the processes from the atmosphere all the way down to bedrock,” said Berkeley Lab scientist Ken Williams, the lead on-site researcher for SAIL.


    SAIL instruments include (from top) radiometers, a rain gauge, and Doppler lidar to measure wind velocities. Credit: John Bilberry, Los Alamos National Laboratory.

    SAIL science: better models to answer tough questions

    Having this volume of data at a wide range of spatial and temporal scales will allow scientists to begin to understand the physical processes that may affect mountain hydrology and answer questions such as how dust, wildfire, hot drought, tree mortality, and other phenomena might affect the watershed. Ultimately, the data will be fed into Earth system models so they can “get the water balance right.”

    “Our models that predict what future water is going to be – their resolution is now about 100 kilometers [62 miles], but there’s a lot of activity that happens in 100 kilometers, a lot of terrain variability, a lot of differences in precipitation, and surface and subsurface processes,” Feldman said. “So really the question is, what are all the details that need to go into those big models, so that we can get them to get the water balance right? And that’s why this is really exciting – we’ll be measuring the inputs and the outputs at a fundamental level to develop a benchmark dataset for the scientific community to evaluate and improve their models.”

    DOE’s Atmospheric System Research (ASR) program works closely with ARM to improve understanding of the key processes that affect the Earth’s radiative balance and hydrological cycle.

    Colorado River. Credit: Roy Kaltschmidt/ DOE’s Lawrence Berkeley National Laboratory (US).

    “ASR research projects during the SAIL campaign will help us learn more about the cloud, aerosol, precipitation, and radiation processes that affect the water cycle in the upper Colorado River watershed,” said Jeff Stehr, a DOE Program Manager for ASR. “Ultimately, this work will help us improve climate models so that they can be used to better understand, predict, and plan for threats to water resources in the arid West and globally.”

    SAIL leverages the substantial efforts that Berkeley Lab has already undertaken in this area: it has been leading field studies at the East River watershed of Colorado’s Upper Gunnison Basin since 2014, as part of the DOE-funded Watershed Function Scientific Focus Area project. SAIL will build on that research effort, bringing together a wide range of scientific disciplines to create the world’s first bedrock-to-atmosphere mountain integrated field laboratory.

    The East River watershed: a living laboratory. Credit: Roy Kaltschmidt/ DOE’s Lawrence Berkeley National Laboratory (US).

    Some of the practical questions the SAIL campaign could help answer include:

    ● How do we plan for a future of low snow or snowfall changing to rainfall? “Our planning for the Colorado River is largely based on historical weather patterns that might be changing, from snow to rain,” Feldman said.

    ● How do activities and disturbances in the forest affect water quality and water availability? “It’s not just about the total volume of water exiting these systems,” Williams said. “We’ll also be looking at how land activities – such as wildfire and forest management – affect the concentrations of constituents in the water and overall water quality.”

    ● Will dams overflow? The U.S. Bureau of Reclamation, the federal agency charged with managing dams in the western U.S., will be using the new data coming in from the radar system to help with controlled dam and reservoir operations. Feldman noted: “There have been some pretty scary situations that have arisen when rain falls on snow. The Oroville Dam disaster [in California in 2017] is just one of many such examples.”

    Glen Canyon Dam. Credit: Julie Chao.

    Additionally, one of the weather radars will be located at a ski area owned by Vail Resorts, a major Colorado ski resort, which could benefit outdoor enthusiasts as well as scientists. And the research will also be useful to organizations such as water utilities and the Bureau of Reclamation that are experimenting with weather modification technologies, such as cloud-seeding.

    Other federal agencies join the bandwagon

    All the data collected by SAIL will be freely available to researchers. What’s more, a bevy of researchers from other federal agencies are undertaking field campaigns in the area with complementary research efforts.

    The National Oceanic and Atmospheric Administration (NOAA)(US), a Department of Commerce agency, has launched a project called SPLASH, or the Study of Precipitation, the Lower Atmosphere and Surface for Hydrometeorology, to improve weather and water prediction in the Colorado mountains and beyond. It will also be making detailed atmospheric co-observations in the SAIL study area.

    The Geological Survey (US), a Department of Interior agency, has developed an Upper Colorado Next Generation Water Observing System (NGWOS) to provide real-time data on water quantity and quality in more affordable and rapid ways than previously possible, and in more locations.

    “It’s quite rare for a single research question, the future of water in the West, to integrate the research activities of investigators across multiple federal agencies,” Williams noted.

    But the scale of the challenge, and the prospect of a low- to no-snow future, calls for nothing less than an all-hands-on-deck response by scientists. “We need to understand the range of risks that we’re facing moving forward,” Feldman said. “The term ‘no-analog future’ is a really big one for us.”

    Staff from DOE’s Los Alamos National Laboratory(US) and Hamelmann Communications. Credit: LANL

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Los Alamos National Laboratory (US) mission is to solve national security challenges through scientific excellence.

    LANL campus
    DOE’s Los Alamos National Laboratory, a multidisciplinary research institution engaged in strategic science on behalf of national security, is managed by Triad, a public-service-oriented national security science organization equally owned by its three founding members: The University of California, Texas A&M University (US), and Battelle Memorial Institute (Battelle), for the Department of Energy’s National Nuclear Security Administration. Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.


    Bringing Science Solutions to the World

    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) (US) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (US) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a University of California-Berkeley (US) physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    History

    1931–1941

    The laboratory was founded on August 26, 1931, by Ernest Lawrence, as the Radiation Laboratory of the University of California, Berkeley, associated with the Physics Department. It centered physics research around his new instrument, the cyclotron, a type of particle accelerator for which he was awarded the Nobel Prize in Physics in 1939.

    LBNL 88 inch cyclotron.


    Throughout the 1930s, Lawrence pushed to create larger and larger machines for physics research, courting private philanthropists for funding. He was the first to develop a large team to build big projects to make discoveries in basic research. Eventually these machines grew too large to be held on the university grounds, and in 1940 the lab moved to its current site atop the hill above campus. Part of the team put together during this period includes two other young scientists who went on to establish large laboratories; J. Robert Oppenheimer founded DOE’s Los Alamos Laboratory (US), and Robert Wilson founded Fermi National Accelerator Laboratory(US).

    1942–1950

    Leslie Groves visited Lawrence’s Radiation Laboratory in late 1942 as he was organizing the Manhattan Project, meeting J. Robert Oppenheimer for the first time. Oppenheimer was tasked with organizing the nuclear bomb development effort and founded today’s Los Alamos National Laboratory to help keep the work secret. At the RadLab, Lawrence and his colleagues developed the technique of electromagnetic enrichment of uranium using their experience with cyclotrons. The “calutrons” (named after the University) became the basic unit of the massive Y-12 facility in Oak Ridge, Tennessee. Lawrence’s lab helped contribute to what have been judged to be the three most valuable technology developments of the war (the atomic bomb, proximity fuse, and radar). The cyclotron, whose construction was stalled during the war, was finished in November 1946. The Manhattan Project shut down two months later.

    1951–2018

    After the war, the Radiation Laboratory became one of the first laboratories to be incorporated into the Atomic Energy Commission (AEC), now the Department of Energy (US). The most highly classified work remained at Los Alamos, but the RadLab remained involved. Edward Teller suggested setting up a second lab similar to Los Alamos to compete with their designs. This led to the creation of an offshoot of the RadLab (now the Lawrence Livermore National Laboratory (US)) in 1952. Some of the RadLab’s work was transferred to the new lab, but some classified research continued at Berkeley Lab until the 1970s, when it became a laboratory dedicated only to unclassified scientific research.

    Shortly after the death of Lawrence in August 1958, the UC Radiation Laboratory (both branches) was renamed the Lawrence Radiation Laboratory. The Berkeley location became the Lawrence Berkeley Laboratory in 1971, although many continued to call it the RadLab. Gradually, another shortened form came into common usage, LBNL. Its formal name was amended to Ernest Orlando Lawrence Berkeley National Laboratory in 1995, when “National” was added to the names of all DOE labs. “Ernest Orlando” was later dropped to shorten the name. Today, the lab is commonly referred to as “Berkeley Lab”.

    The Alvarez Physics Memos are a set of informal working papers of the large group of physicists, engineers, computer programmers, and technicians led by Luis W. Alvarez from the early 1950s until his death in 1988. Over 1700 memos are available on-line, hosted by the Laboratory.

    The lab remains owned by the Department of Energy (US), with management from the University of California (US). Companies such as Intel were funding the lab’s research into computing chips.

    Science mission

    From the 1950s through the present, Berkeley Lab has maintained its status as a major international center for physics research, and has also diversified its research program into almost every realm of scientific investigation. Its mission is to solve the most pressing and profound scientific problems facing humanity, conduct basic research for a secure energy future, understand living systems to improve the environment, health, and energy supply, understand matter and energy in the universe, build and safely operate leading scientific facilities for the nation, and train the next generation of scientists and engineers.

    The Laboratory’s 20 scientific divisions are organized within six areas of research: Computing Sciences; Physical Sciences; Earth and Environmental Sciences; Biosciences; Energy Sciences; and Energy Technologies. Berkeley Lab has six main science thrusts: advancing integrated fundamental energy science; integrative biological and environmental system science; advanced computing for science impact; discovering the fundamental properties of matter and energy; accelerators for the future; and developing energy technology innovations for a sustainable future.

    Berkeley Lab operates five major National User Facilities for the DOE Office of Science (US):

    The Advanced Light Source (ALS) is a synchrotron light source with 41 beam lines providing ultraviolet, soft x-ray, and hard x-ray light to scientific experiments.

    LBNL/ALS


    The ALS is one of the world’s brightest sources of soft x-rays, which are used to characterize the electronic structure of matter and to reveal microscopic structures with elemental and chemical specificity. About 2,500 scientist-users carry out research at ALS every year. Berkeley Lab is proposing an upgrade of ALS which would increase the coherent flux of soft x-rays by two to three orders of magnitude.

    The DOE Joint Genome Institute (US) supports genomic research in support of the DOE missions in alternative energy, global carbon cycling, and environmental management. The JGI’s partner laboratories are Berkeley Lab, DOE’s Lawrence Livermore National Laboratory (US), DOE’s Oak Ridge National Laboratory (US)(ORNL), DOE’s Pacific Northwest National Laboratory (US) (PNNL), and the HudsonAlpha Institute for Biotechnology (US). The JGI’s central role is the development of a diversity of large-scale experimental and computational capabilities to link sequence to biological insights relevant to energy and environmental research. Approximately 1,200 scientist-users take advantage of JGI’s capabilities for their research every year.

    The LBNL Molecular Foundry (US) is a multidisciplinary nanoscience research facility. Its seven research facilities focus on Imaging and Manipulation of Nanostructures; Nanofabrication; Theory of Nanostructured Materials; Inorganic Nanostructures; Biological Nanostructures; Organic and Macromolecular Synthesis; and Electron Microscopy. Approximately 700 scientist-users make use of these facilities in their research every year.

    The DOE’s NERSC National Energy Research Scientific Computing Center (US) is the scientific computing facility that provides large-scale computing for the DOE’s unclassified research programs. Its current systems provide over 3 billion computational hours annually. NERSC supports 6,000 scientific users from universities, national laboratories, and industry.

    DOE’s NERSC National Energy Research Scientific Computing Center(US) at Lawrence Berkeley National Laboratory

    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    NERSC is a DOE Office of Science User Facility.

    The DOE’s Energy Sciences Network (US) (ESnet) is a high-speed network infrastructure optimized for very large scientific data flows. ESnet provides connectivity for all major DOE sites and facilities, and the network transports roughly 35 petabytes of traffic each month.

    Berkeley Lab is the lead partner in the DOE’s Joint Bioenergy Institute (US) (JBEI), located in Emeryville, California. Other partners are the DOE’s Sandia National Laboratory (US), the University of California (UC) campuses of Berkeley and Davis, the Carnegie Institution for Science (US), and DOE’s Lawrence Livermore National Laboratory (US) (LLNL). JBEI’s primary scientific mission is to advance the development of the next generation of biofuels – liquid fuels derived from the solar energy stored in plant biomass. JBEI is one of three new U.S. Department of Energy (DOE) Bioenergy Research Centers (BRCs).

    Berkeley Lab has a major role in two DOE Energy Innovation Hubs. The mission of the Joint Center for Artificial Photosynthesis (JCAP) is to find a cost-effective method to produce fuels using only sunlight, water, and carbon dioxide. The lead institution for JCAP is the California Institute of Technology (US) and Berkeley Lab is the second institutional center. The mission of the Joint Center for Energy Storage Research (JCESR) is to create next-generation battery technologies that will transform transportation and the electricity grid. DOE’s Argonne National Laboratory (US) leads JCESR and Berkeley Lab is a major partner.

     
  • richardmitnick 4:49 pm on August 7, 2021 Permalink | Reply
    Tags: "Translation software enables efficient DNA data storage", ADS Codex adds additional information called error detection codes that can be used to validate the data., DNA offers a compact way to store huge amounts of data cost-effectively., DNA’s storage density is staggering., DOE’s Los Alamos National Laboratory (US), Long-term storage with cheaper media is important for the national security mission of Los Alamos and others., Los Alamos National Laboratory has developed ADS Codex to translate the 0s and 1s of digital computer files into the four-letter code of DNA., Unfortunately DNA synthesis sometimes makes mistakes in the coding so ADS Codex addresses two big obstacles to creating DNA data files.   

    From DOE’s Los Alamos National Laboratory (US) : “Translation software enables efficient DNA data storage” 

    LANL bloc

    From DOE’s Los Alamos National Laboratory (US)

    April 1, 2021 [This just turned up in RSS]

    Charles Poling
    (505) 257-8006
    cpoling@lanl.gov

    DNA offers a compact way to store huge amounts of data cost-effectively. Los Alamos National Laboratory has developed ADS Codex to translate the 0s and 1s of digital computer files into the four-letter code of DNA.

    In support of a major collaborative project to store massive amounts of data in DNA molecules, a Los Alamos National Laboratory–led team has developed a key enabling technology that translates digital binary files into the four-letter genetic alphabet needed for molecular storage.

    “Our software, the Adaptive DNA Storage Codec (ADS Codex), translates data files from what a computer understands into what biology understands,” said Latchesar Ionkov, a computer scientist at Los Alamos and principal investigator on the project. “It’s like translating from English to Chinese, only harder.”

    The work is a key part of the Intelligence Advanced Research Projects Activity (IARPA) Molecular Information Storage (MIST) program to bring cheaper, bigger, longer-lasting storage to big-data operations in government and the private sector. The short-term goal of MIST is to write 1 terabyte—a trillion bytes—and read 10 terabytes within 24 hours for $1,000. Other teams are refining the writing (DNA synthesis) and retrieval (DNA sequencing) components of the initiative, while Los Alamos is working on coding and decoding.

    “DNA offers a promising solution compared to tape, the prevailing method of cold storage, which is a technology dating to 1951,” said Bradley Settlemyer, a storage systems researcher and systems programmer specializing in high-performance computing at Los Alamos. “DNA storage could disrupt the way we think about archival storage, because the data retention is so long and the data density so high. You could store all of YouTube in your refrigerator, instead of in acres and acres of data centers. But researchers first have to clear a few daunting technological hurdles related to integrating different technologies.”

    Not lost in translation

    Compared to the traditional long-term storage method that uses pizza-sized reels of magnetic tape, DNA storage is potentially less expensive, far more physically compact, more energy efficient, and longer lasting—DNA survives for hundreds of years and doesn’t require maintenance. Files stored in DNA also can be very easily copied for negligible cost.

    DNA’s storage density is staggering. Consider this: humanity will generate an estimated 33 zettabytes by 2025—that’s 3.3 followed by 22 zeroes. All that information would fit into a ping pong ball, with room to spare. The Library of Congress has about 74 terabytes, or 74 million million bytes, of information—6,000 such libraries would fit in a DNA archive the size of a poppy seed. Facebook’s 300 petabytes (300,000 terabytes) could be stored in a half poppy seed.

    Encoding a binary file into a molecule is done by DNA synthesis. A fairly well understood technology, synthesis organizes the building blocks of DNA into various arrangements, which are indicated by sequences of the letters A, C, G, and T. They are the basis of all DNA code, providing the instructions for building every living thing on earth.

    The Los Alamos team’s ADS Codex specifies exactly how to translate the binary data (all 0s and 1s) into sequences of the four letters A, C, G, and T. The Codex also handles the decoding back into binary. DNA can be synthesized by several methods, and ADS Codex can accommodate them all. The Los Alamos team has completed version 1.0 of ADS Codex and in November 2021 plans to use it to evaluate the storage and retrieval systems developed by the other MIST teams.
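    To make the translation step concrete, here is a deliberately naive Python sketch that maps two bits to each nucleotide. This is not ADS Codex’s actual scheme (the real codec is adaptive and must, for example, avoid error-prone base patterns); the mapping and helper names are invented for illustration.

        # Illustrative only: a naive fixed mapping of 2 bits per nucleotide.
        BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
        BASE_TO_BITS = {b: bits for bits, b in BITS_TO_BASE.items()}

        def encode(data: bytes) -> str:
            """Translate bytes into a DNA sequence, two bits per base."""
            bits = "".join(f"{byte:08b}" for byte in data)
            return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

        def decode(dna: str) -> bytes:
            """Translate a DNA sequence back into bytes."""
            bits = "".join(BASE_TO_BITS[base] for base in dna)
            return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

        print(encode(b"Hi"))                               # CAGACGGC
        assert decode(encode(b"Los Alamos")) == b"Los Alamos"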

    Unfortunately, DNA synthesis sometimes makes mistakes in the coding, so ADS Codex addresses two big obstacles to creating DNA data files.

    First, compared to traditional digital systems, the error rates while writing to molecular storage are very high, so the team had to figure out new strategies for error correction. Second, errors in DNA storage arise from a different source than they do in the digital world, making the errors trickier to correct.

    “On a digital hard disk, binary errors occur when a 0 flips to a 1, or vice versa, but with DNA, you have more problems that come from insertion and deletion errors,” Ionkov said. “You’re writing A, C, G, and T, but sometimes you try to write A, and nothing appears, so the sequence of letters shifts to the left, or it types AAA. Normal error correction codes don’t work well with that.”

    ADS Codex adds extra information, called error detection codes, that can be used to validate the data. When the software converts the data back to binary, it tests whether the codes match. If they don’t, ADS Codex tries removing or adding nucleotides until the verification succeeds.
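    The verify-and-repair loop can likewise be sketched in a few lines. The checksum choice (a 16-bit CRC folded into eight bases) and the single-edit search below are stand-ins invented for illustration; the codec’s actual codes and search strategy are more sophisticated.

        import zlib

        BASES = "ACGT"

        def tag(dna: str) -> str:
            # Fold a 16-bit CRC of the payload into 8 bases (2 bits per base)
            crc = zlib.crc32(dna.encode()) & 0xFFFF
            return "".join(BASES[(crc >> s) & 0b11] for s in range(0, 16, 2))

        def with_checksum(dna: str) -> str:
            return dna + tag(dna)

        def verify(block: str) -> bool:
            payload, check = block[:-8], block[-8:]
            return check == tag(payload)

        def repair(block: str):
            """Try the block as-is, then every single-base deletion and insertion."""
            if verify(block):
                return block
            for i in range(len(block)):                  # undo a spurious insertion
                cand = block[:i] + block[i + 1:]
                if verify(cand):
                    return cand
            for i in range(len(block) + 1):              # restore a dropped base
                for b in BASES:
                    cand = block[:i] + b + block[i:]
                    if verify(cand):
                        return cand
            return None  # not recoverable with a single edit

        block = with_checksum("ACGTACGTACGTACGTACGT")
        corrupted = block[:5] + block[6:]                # simulate one dropped base
        print(repair(corrupted) == block)                # should print True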

    Smart scale-up

    Large warehouses contain today’s largest data centers, with storage at the exabyte scale—that’s a trillion million bytes or more. Costing billions to build, power, and run, these digitally based data centers may not be the best option as the need for data storage continues to grow exponentially.

    Long-term storage with cheaper media is important for the national security mission of Los Alamos and others. “At Los Alamos, we have some of the oldest digital-only data and largest stores of data, starting from the 1940s,” Settlemyer said. “It still has tremendous value. Because we keep data forever, we’ve been at the tip of the spear for a long time when it comes to finding a cold-storage solution.”

    Settlemyer said DNA storage has the potential to be a disruptive technology because it crosses between fields ripe with innovation. The MIST project is stimulating a new coalition among legacy storage vendors who make tape, DNA synthesis companies, DNA sequencing companies, and high-performance computing organizations like Los Alamos, which are driving computers into ever-larger-scale regimes of science-based simulations that yield mind-boggling amounts of data to be analyzed.

    Deeper dive into DNA

    When most people think of DNA, they think of life, not computers. But DNA is itself a four-letter code for passing along information about an organism. DNA molecules are made from four types of bases, or nucleotides, each identified by a letter: adenine (A), thymine (T), guanine (G), and cytosine (C).

    These bases wrap in a twisted chain around each other—the familiar double helix—to form the molecule. The arrangement of these letters into sequences creates a code that tells an organism how to form. The complete set of DNA molecules makes up the genome—the blueprint of your body.

    By synthesizing DNA molecules—making them from scratch—researchers have found they can specify, or write, long strings of the letters A, C, G, and T and then read those sequences back. The process is analogous to how a computer stores information using 0s and 1s. The method has been proven to work, but reading and writing the DNA-encoded files currently takes a long time, Ionkov said.

    “Appending a single nucleotide to DNA is very slow. It takes a minute,” Ionkov said. “Imagine writing a file to a hard drive taking more than a decade. So that problem is solved by going massively parallel. You write tens of millions of molecules simultaneously to speed it up.”
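    The arithmetic behind that speed-up is easy to check. A rough calculation, using the one-minute-per-nucleotide figure quoted above and assuming a hypothetical 1 GB file stored at two bits per base:

        # Rough throughput arithmetic behind the "massively parallel" point.
        FILE_BYTES = 10**9                  # a 1 GB file, assumed for illustration
        BASES_NEEDED = FILE_BYTES * 8 / 2   # 2 bits stored per nucleotide
        MIN_PER_BASE = 1.0                  # the per-base time quoted in the article

        sequential_years = BASES_NEEDED * MIN_PER_BASE / 60 / 24 / 365
        parallel_hours = BASES_NEEDED / 10_000_000 * MIN_PER_BASE / 60

        print(f"one strand at a time: ~{sequential_years:,.0f} years")
        print(f"10 million strands in parallel: ~{parallel_hours:.1f} hours")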

    While various companies are working on different ways of synthesizing to address this problem, ADS Codex can be adapted to every approach.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Los Alamos National Laboratory (US) mission is to solve national security challenges through scientific excellence.

    LANL campus

    DOE’s Los Alamos National Laboratory, a multidisciplinary research institution engaged in strategic science on behalf of national security, is managed by Triad, a public service-oriented national security science organization equally owned by its three founding members: the University of California, Texas A&M University (US), and Battelle Memorial Institute (Battelle). Triad operates the Laboratory for the Department of Energy’s National Nuclear Security Administration. Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.

     
  • richardmitnick 4:19 pm on August 6, 2021 Permalink | Reply
    Tags: "Decades of research brings quantum dots to brink of widespread use", Advances include the discovery of carrier multiplication; pioneering research into quantum dot light emitting diodes (LEDs); luminescent solar concentrators., Another highly promising area is quantum dot luminescent solar concentrators or LSCs., , , DOE’s Los Alamos National Laboratory (US), Due to their narrowband spectrally tunable emission quantum dots allow for improved color purity and more complete coverage of the entire color space., In the case of photovoltaics (PV) the quantum dot approach could be used to realize a new generation of inexpensive thin-film PV devices., , Many advances in this science originated at Los Alamos including the first demonstration of colloidal quantum dot lasing., Over the years quantum dots have become industrial-grade materials exploited in a range of traditional and emerging technologies., Quantum dots are also of great potential utility in solar harvesting and light sensing technologies., The next frontier is creating technologically viable LEDs powered by electrically driven quantum dots., Thirty years ago quantum dots were just a subject of scientific curiosity studied by a small group of enthusiasts., Using modern colloidal chemistry the dimensions and internal structure of quantum dots can be manipulated with near-atomic precision.   

    From DOE’s Los Alamos National Laboratory (US) : “Decades of research brings quantum dots to brink of widespread use” 

    LANL bloc

    From DOE’s Los Alamos National Laboratory (US)

    August 5, 2021

    Review article in Science covers a wide range of breakthroughs including Los Alamos’ role in key advances.

    The tiny specks of matter called quantum dots can be tuned to emit light in specific wavelengths. That’s just one quality that makes them valuable in a range of technology applications.

    A new article in Science magazine gives an overview of almost three decades of research into colloidal quantum dots, assesses the technological progress for these nanometer-sized specks of semiconductor matter, and weighs the remaining challenges on the path to widespread commercialization for this promising technology, with applications in everything from TVs to highly efficient sunlight collectors.

    “Thirty years ago these structures were just a subject of scientific curiosity studied by a small group of enthusiasts. Over the years quantum dots have become industrial-grade materials exploited in a range of traditional and emerging technologies, some of which have already found their way into commercial markets,” said Victor I. Klimov, a coauthor of the paper and leader of the team conducting quantum dot research at Los Alamos National Laboratory.

    Many advances described in the Science article originated at Los Alamos, including the first demonstration of colloidal quantum dot lasing; the discovery of carrier multiplication; pioneering research into quantum dot light-emitting diodes (LEDs) and luminescent solar concentrators; and recent studies of single-dot quantum emitters.

    Using modern colloidal chemistry, the dimensions and internal structure of quantum dots can be manipulated with near-atomic precision, which allows for highly accurate control of their physical properties and, thereby, their behavior in practical devices.

    A number of ongoing efforts on practical applications of colloidal quantum dots have exploited the size-controlled tunability of their emission color and high emission quantum yields near the ideal 100 percent limit. These properties are attractive for screen displays and lighting, technologies in which quantum dots are used as color-converting phosphors. Due to their narrowband, spectrally tunable emission, quantum dots allow for improved color purity and more complete coverage of the entire color space compared to existing phosphor materials. Some of these devices, such as quantum dot TVs, have already reached technological maturity and are available in commercial markets.
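    The size-color relationship described above can be estimated with the textbook Brus effective-mass model. The sketch below uses approximate literature parameters for CdSe; it is a back-of-envelope illustration, not the quantitative models used in the field, and it becomes rough for very small dots.

        # Hedged back-of-envelope sketch of size-tunable emission (Brus model).
        import math

        HBAR = 1.054571817e-34   # J*s
        M0   = 9.1093837015e-31  # electron mass, kg
        E_CH = 1.602176634e-19   # elementary charge, C
        EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

        def brus_gap_eV(radius_m, eg_eV=1.74, me=0.13, mh=0.45, eps_r=10.6):
            """Effective band gap of a spherical dot; CdSe-like parameters assumed."""
            confinement = (HBAR**2 * math.pi**2 / (2 * radius_m**2)) * (1/(me*M0) + 1/(mh*M0))
            coulomb = 1.786 * E_CH**2 / (4 * math.pi * EPS0 * eps_r * radius_m)
            return eg_eV + confinement / E_CH - coulomb / E_CH

        for r_nm in (2.0, 3.0, 4.0):
            e = brus_gap_eV(r_nm * 1e-9)
            print(f"R = {r_nm} nm -> {e:.2f} eV, ~{1240/e:.0f} nm emission")
        # Smaller dots emit bluer light; larger dots emit redder light.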

    The next frontier is creating technologically viable LEDs powered by electrically driven quantum dots. The Science review describes various approaches to implement these devices and discusses the existing challenges. Quantum LEDs have already reached impressive brightness and almost ideal efficiencies near the theoretically defined limits. Much of this progress has been driven by continuing advances in understanding the performance-limiting factors such as nonradiative Auger recombination.

    The article also discusses the status and challenges of solution-processable quantum dot lasers.

    “Making these lasers available would benefit a range of technologies, including integrated photonic circuits, optical communication, lab-on-a-chip platforms, wearable devices, and medical diagnostics,” Klimov said.

    Los Alamos researchers have contributed key advances in this area including the elucidation of mechanisms for light amplification in colloidal nanostructures and the first demonstration of a lasing effect using these materials.

    “The primary current challenge is demonstrating lasing with electrical pumping,” Klimov said. “Los Alamos has been responsible for several important milestones on the path to this objective including the realization of optical gain with electrical excitation and the development of dual-function devices that operate as an optically pumped laser and a standard electrically driven LED.”

    Quantum dots are also of great potential utility in solar harvesting and light sensing technologies. Due to their tunable bandgap, they can be engineered to target a particular range of wavelengths, which is especially attractive for realizing inexpensive photodetectors for the infrared spectral range. In the realm of solar energy technologies, colloidal quantum dots have been exploited as active elements of both solar cells and luminescent sunlight collectors.

    In the case of photovoltaics (PV), the quantum dot approach could be used to realize a new generation of inexpensive thin-film PV devices prepared by scalable solution-based techniques such as roll-to-roll processing. In addition, they could enable conceptually new photoconversion schemes derived from physical processes unique to ultrasmall “quantum-confined” colloidal particles. One such process, carrier multiplication, generates multiple electron-hole pairs from a single absorbed photon. This process, first reported by Los Alamos researchers in 2004, has been the subject of intense research in the context of its applications in both PVs and solar photochemistry.

    “Another highly promising area is quantum dot luminescent solar concentrators, or LSCs,” Klimov said. “Using the LSC approach, one can, in principle, convert standard windows or wall sidings into power-generating devices. Along with roof-top solar modules, this could help supply an entire building with clean energy. While the LSC concept was introduced back in the 1970s, it truly flourished only recently due to the introduction of specially engineered quantum dots.”

    Los Alamos researchers have contributed many important advances to the LSC field including the development of practical approaches for tackling the problem of light self-absorption and developing high-efficiency bi-layer (tandem) devices. Several start-ups, including a Laboratory spin-off, UbiQD Inc., have been actively pursuing commercialization of a quantum dot LSC technology.

    Funding: Laboratory Directed Research and Development (LDRD) at Los Alamos National Laboratory and the U.S. Department of Energy Office of Science.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Los Alamos National Laboratory (US) mission is to solve national security challenges through scientific excellence.

    LANL campus
    DOE’s Los Alamos National Laboratory (US), a multidisciplinary research institution engaged in strategic science on behalf of national security, is operated by Los Alamos National Security, LLC, a team composed of Bechtel National, the University of California, The Babcock & Wilcox Company, and URS, for the Department of Energy’s National Nuclear Security Administration.

    Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.

    Operated by Los Alamos National Security, LLC for the DOE National Nuclear Security Administration (US)

     
  • richardmitnick 4:06 pm on August 2, 2021 Permalink | Reply
    Tags: "What does the edge of the solar system look like?", A 3D map of the solar system's edge took 13 years to create., , , At the edge of the solar system is a violent frontier where two cosmic powers clash., , , DOE’s Los Alamos National Laboratory (US), , , , Solar wind and interstellar particles meet and form a boundary at the far reaches of the solar system., We've sent out various spacecraft over the years so do we have any idea what the edge of the solar system looks like?   

    From DOE’s Los Alamos National Laboratory (US) via Live Science : “What does the edge of the solar system look like?” 

    LANL bloc

    From DOE’s Los Alamos National Laboratory (US)

    via

    Live Science

    8.2.21
    Randyn Bartholomew

    It’s weirder than you may have imagined.

    Credit: MARK GARLICK/SCIENCE PHOTO LIBRARY via Getty Images.

    Earth is the sixth planet from the edge of the solar system, meaning we’re none too near this cold and inhospitable frontier. But we’ve sent out various spacecraft over the years, so do we have any idea what the edge of the solar system looks like?

    The answer is yes, but it’s a work in progress. One of the latest developments, a 3D map of the solar system’s edge that took 13 years to create, revealed a few more secrets about this mysterious boundary, called the outer heliosphere.

    The outer heliosphere marks the region of space where the solar wind, or the stream of charged particles emitted from the sun, is “deflected and draped back” by the interstellar radiation that permeates the empty space beyond the solar system, said Dan Reisenfeld, a space science researcher at Los Alamos National Laboratory in New Mexico and head of the team that conducted the research [The Astrophysical Journal Supplement Series] on the 3D map. In other words, solar wind and interstellar particles meet and form a boundary at the far reaches of the solar system.

    The team’s three-dimensional map of the heliosphere shows the bubble is far thinner on the side facing the interstellar wind than on the opposite side. (Image credit: Reisenfeld et al.)

    National Aeronautics and Space Administration (US) Heliosphere-heliopause diagram showing positions of the two Voyager spacecraft. Credit: NASA/JPL-Caltech.

    A diagram of our heliosphere. For the first time, scientists have mapped the heliopause, which is the boundary between the heliosphere (brown) and interstellar space (dark blue). Credit: NASA/IBEX/Adler Planetarium (US)

    Earthlings first got a glimpse of the solar system’s outer edge in 2012, when Voyager I, a NASA spacecraft that launched in 1977, crossed into interstellar space, according to NASA. Voyager 2 was not far behind, repeating the feat in 2018.

    National Aeronautics and Space Administration (US) Voyager 1.

    National Aeronautics and Space Administration (US) Voyager 2.

    Equipped with golden records full of Bach, Louis Armstrong and humpback whale songs, in addition to their scientific instruments, Voyagers 1 and 2 reported a sudden dropoff in solar particles and a substantial increase in galactic radiation when they left the solar system, according to NASA’s Jet Propulsion Laboratory (US) at the California Institute of Technology (US).

    At the edge of the solar system is a violent frontier where two cosmic powers clash. On one side is the solar wind, the constant flood of hot, charged particles flowing out of the sun at hundreds of miles per second. On the other side are the winds of space, blowing with the radiation of billions upon billions of nearby stars.

    Despite causing occasional blackouts here on Earth, the solar wind actually does a pretty good job of defending our planet (and the solar system) from the harshest interstellar radiation. As the wind gusts out of the sun in every direction at once, it forms an enormous protective bubble around the solar system that repels about 70% of incoming radiation, Live Science previously reported (Earth’s magnetic shield protects us from much of the rest).

    This bubble is known as the heliosphere, and its edge (called the heliopause) marks a physical border where the solar system ends and interstellar space begins — but, unlike most borders on Earth, scientists have no idea how big it is or what it looks like. A new study, published June 10 in The Astrophysical Journal [above], tackles these mysteries with the first 3D map of the heliosphere ever created.

    Using 10 years of data captured by NASA’s Interstellar Boundary Explorer satellite, the study authors tracked solar-wind particles as they traveled from the sun to the edge of the solar system and back again.

    NASA IBEX – Interstellar Boundary Explorer (US).

    From this travel time, the team calculated how far the wind had blown in a given direction before being repelled by interstellar radiation, allowing the researchers to map the invisible edges of the solar system similarly to the way bats use echolocation, the researchers said.

    The map also revealed that the bubble is lopsided. That lack of symmetry comes from the sun’s movement through the Milky Way, as it experiences friction with the galactic radiation in front of it and clears out a space in its wake. “There’s a lot of plasma [charged particles] in the interstellar medium, and… the inner heliosphere, which is pretty round, is an obstacle in this stream of plasma which is flowing past it,” Reisenfeld told Live Science. “It has the same effect as water going around a rock in a stream,” with a rush of water crashing into the rock in front and a sheltered calm behind it.

    Measurements for the 3D map were gathered using the Interstellar Boundary Explorer (IBEX) [above], which was launched in 2008 and is “the size of a bus tire,” according to NASA. It’s pronounced “like the animal,” Reisenfeld said, referring to the ibex mountain goats known for their gravity-defying treks up alpine cliffs. But the animal that IBEX really takes after is the bat.

    Many bats hunt insects, such as mosquitoes, by emitting a pulse of sound and using the time delay of the echo to figure out the distance to their prey. Likewise, IBEX detects solar-wind particles that have bounced back from the edges of the solar system, allowing Reisenfeld and his colleagues to determine the distances involved by measuring how long their round trip took. “The sun will send out a pulse … and then we passively wait for a return signal from the outer heliosphere, and we use that time delay to determine where the outer heliosphere must be,” Reisenfeld explained.
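    In rough numbers, the echo timing works out as follows. The speeds below are assumed, round-number values for the outbound solar wind and the returning energetic neutral atoms, chosen only to illustrate the geometry:

        # Back-of-envelope "echo" timing: a pulse goes out at v_out, the return
        # signal comes back at v_back, so a round-trip delay t gives
        # d = t / (1/v_out + 1/v_back). Speeds are assumed, illustrative values.
        AU_M = 1.495978707e11      # meters per astronomical unit
        YEAR_S = 3.156e7           # seconds per year

        def heliopause_distance_au(delay_years, v_out_kms=450.0, v_back_kms=450.0):
            t = delay_years * YEAR_S
            d = t / (1.0 / (v_out_kms * 1e3) + 1.0 / (v_back_kms * 1e3))
            return d / AU_M

        # A ~2-year round-trip delay corresponds to roughly 95 AU:
        print(f"{heliopause_distance_au(2.0):.0f} AU")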

    As the sun circles the outer rim of the Milky Way, the solar wind keeps cosmic radiation at bay, forming a protective bubble. This is good for us, since “that radiation can damage spacecraft and it can be a health hazard for astronauts,” Reisenfeld said.

    However, the boundaries may not stay this way in the long term. Reisenfeld noted that there is a correlation between the strength of the solar wind and the number of spots on the sun. A sunspot is a relatively dark patch that temporarily appears on the surface of the sun as a result of intense magnetic disturbances within. From 1645 to 1715, a period known to sun watchers as the Maunder minimum, there were very few sunspots, and thus there may have been only weak solar winds.

    “The sunspots disappeared for almost a century, and if that happens, the shape of the heliosphere could have also changed significantly,” Reisenfeld said. “We do see variations in solar activity, and at any time, another Maunder minimum could happen. It’s not a pie-in-the-sky concern to be worried that the [heliosphere’s] effectiveness at shielding could change over time.”

    To learn more about the heliosphere, NASA plans to launch a new mission called the Interstellar Mapping and Acceleration Probe (IMAP) in 2025. If all goes according to plan, IMAP will reveal further details about interactions between solar winds and cosmic radiation at the solar system’s edge.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Los Alamos National Laboratory (US) mission is to solve national security challenges through scientific excellence.

    LANL campus
    DOE’s Los Alamos National Laboratory (US), a multidisciplinary research institution engaged in strategic science on behalf of national security, is operated by Los Alamos National Security, LLC, a team composed of Bechtel National, the University of California, The Babcock & Wilcox Company, and URS, for the Department of Energy’s National Nuclear Security Administration.

    Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.

    Operated by Los Alamos National Security, LLC for the DOE National Nuclear Security Administration (US)

     
  • richardmitnick 1:35 pm on July 13, 2021 Permalink | Reply
    Tags: "Neutron-clustering effect in nuclear reactors demonstrated for first time", As neutrons fission and create more neutrons some go on to form large lineages of clusters while others quickly die off resulting in so-called ‘power tilts’ or asymmetrical energy production., DOE’s Los Alamos National Laboratory (US), In nuclear reactors from generation to generation each neutron can be said to have a 50 percent chance of dying or fissioning to create more neutrons., The scientists used a low-power nuclear reactor located at the Walthousen Reactor Critical Facility at Rensselaer Polytechnic Institute (US) in New York., The scientists were able to model the life of each neutron in the nuclear reactor basically building a family tree for each.   

    From DOE’s Los Alamos National Laboratory (US) : “Neutron-clustering effect in nuclear reactors demonstrated for first time” 

    LANL bloc

    From DOE’s Los Alamos National Laboratory (US)

    July 12, 2021
    Laura Mullane
    (505) 412-7733
    mullane@lanl.gov

    Long-theorized phenomenon observed in a working reactor could improve reactor safety, according to a new study.

    Reactor Operator Nicholas Thompson of Los Alamos National Laboratory helps to set up the neutron clustering measurements at the Walthousen Reactor Critical Facility at Rensselaer Polytechnic Institute (US) in Schenectady, NY.

    For the first time, the long-theorized neutron-clustering effect in nuclear reactors has been demonstrated, which could improve reactor safety and create more accurate simulations, according to a new study recently published in the journal Nature Communications Physics.

    “The neutron-clustering phenomenon had been theorized for years, but it had never been analyzed in a working reactor,” said Nicholas Thompson, an engineer with the Los Alamos Advanced Nuclear Technology Group. “The findings indicate that, as neutrons fission and create more neutrons, some go on to form large lineages of clusters, while others quickly die off, resulting in so-called ‘power tilts,’ or asymmetrical energy production.”

    Understanding these clustering fluctuations is especially important for safety and simulation accuracy, particularly as nuclear reactors first begin to power up. The study was a collaboration with the Institute for Radiological Protection and Nuclear Safety (IRSN) and the Atomic Energy Commission (CEA), both located in France.

    “We were able to model the life of each neutron in the nuclear reactor, basically building a family tree for each,” said Thompson. “What we saw is that even if the reactor is perfectly critical, so the number of fissions from one generation to the next is even, there can be bursts of clusters that form and others that quickly die off.”

    This clustering phenomenon became important to understand because of a statistical concept known as the gambler’s ruin, believed to have been derived by Blaise Pascal. In a betting analogy, the concept says that even if each individual bet is a fair 50-50 proposition, a gambler with finite funds playing against an unlimited house will, over the course of enough bets, go bankrupt with statistical certainty.

    In nuclear reactors, from generation to generation, each neutron can be said to have a 50 percent chance of dying or fissioning to create more neutrons. According to the gambler’s ruin concept, the neutrons in a reactor might then have a statistical chance of dying off completely at some future generation, even though the system is at critical.
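    The gambler’s-ruin claim is easy to check numerically. This minimal simulation (parameters invented for illustration) pits a gambler with a finite bankroll against an unlimited house at fair odds:

        import random

        def ruined_within(bankroll=10, max_bets=100_000):
            """Fair 50/50 bets against an unlimited house; True if broke in time."""
            for _ in range(max_bets):
                bankroll += 1 if random.random() < 0.5 else -1
                if bankroll == 0:
                    return True
            return False

        ruined = sum(ruined_within() for _ in range(100))
        print(f"{ruined} of 100 gamblers went bankrupt within 100,000 fair bets")
        # Typically ~96-99 of 100; raise max_bets and the fraction creeps toward 100%.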

    This concept had been studied widely in other scientific fields, such as biology and epidemiology, where this generational clustering phenomenon is also present. By drawing on this related statistical math, the research team was able to analyze whether the gambler’s ruin concept would hold true for neutrons in nuclear reactors.

    “You would expect this theory to hold true,” says Jesson Hutchinson, who works with the Laboratory’s Advanced Nuclear Technology Group. “You should have a critical system that, while the neutron population is varying between generations, runs some chance of becoming subcritical and losing all neutrons. But that’s not what happens.”

    To understand why the gambler’s ruin concept didn’t hold true, the scientists used a low-power nuclear reactor located at the Walthousen Reactor Critical Facility at Rensselaer Polytechnic Institute (US) in New York. A low-power reactor was essential for tracking the lifespans of individual neutrons, because large-scale reactors can have trillions of interactions at any moment. The team used three different neutron detectors, including the Los Alamos-developed Neutron Multiplicity 3He Array Detector (NoMAD), to trace every interaction inside the reactor.

    The team found that while generations of neutrons would cluster in large family trees and others died out, a complete die-off was avoided in the small reactor because of spontaneous fission, or the non-induced nuclear splitting of radioactive material inside reactors, which creates more neutrons. That balance of fission and spontaneous fission prevented the neutron population from dying out completely, and it also tended to smooth out the energy bursts created by clustering neutrons.
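    That balance can be illustrated with a toy branching-process model: each neutron either dies or splits into two with equal probability, with an optional constant source standing in for spontaneous fission. The population sizes and source rate are invented for illustration:

        import random

        def simulate(generations=2000, start=20, source=0):
            """Critical branching: each neutron dies or splits in two with equal odds."""
            pop = start
            for g in range(generations):
                pop = sum(2 for _ in range(pop) if random.random() < 0.5) + source
                if pop == 0:
                    return f"extinct at generation {g}"
            return f"still alive with {pop} neutrons"

        print("no source:        ", simulate(source=0))  # almost always dies out
        print("spontaneous source:", simulate(source=3))  # never reaches zero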

    “Commercial-sized nuclear reactors don’t depend on the neutron population alone to reach criticality, because they have other interventions like temperature and control rod settings,” Hutchinson said. “But this test was interested in answering fundamental questions about neutron behavior in reactors, and the results will have an impact on the math we use to simulate reactors and could even affect future design and safety procedures.”

    Funding: This work was supported by the DOE Nuclear Criticality Safety Program, funded and managed by the National Nuclear Security Administration for the Department of Energy.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Los Alamos National Laboratory (US) mission is to solve national security challenges through scientific excellence.

    LANL campus
    DOE’s Los Alamos National Laboratory (US), a multidisciplinary research institution engaged in strategic science on behalf of national security, is operated by Los Alamos National Security, LLC, a team composed of Bechtel National, the University of California, The Babcock & Wilcox Company, and URS, for the Department of Energy’s National Nuclear Security Administration.

    Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.

    Operated by Los Alamos National Security, LLC for the DOE National Nuclear Security Administration (US)

     
  • richardmitnick 4:06 pm on July 5, 2021 Permalink | Reply
    Tags: "Software evaluates qubits- characterizes noise in quantum annealers", , DOE’s Los Alamos National Laboratory (US)   

    From DOE’s Los Alamos National Laboratory (US) via phys.org : “Software evaluates qubits- characterizes noise in quantum annealers” 

    LANL bloc

    From DOE’s Los Alamos National Laboratory (US)

    via

    phys.org

    July 5, 2021

    Credit: CC0 Public Domain

    High-performance computer users in the market for a quantum annealing machine or looking for ways to get the most out of one they already have will benefit from a new, open-source software tool for evaluating these emerging platforms at the individual qubit level.

    “We were motivated by the need for validation and verification of quantum annealers, similar to what is currently done by organizations when they purchase a new classical supercomputer,” said Carleton Coffrin, a computer scientist and expert in artificial intelligence at Los Alamos. “They conduct acceptance testing on a huge set of benchmarks. We didn’t have good analogs for that on the quantum annealing computers. For quantum annealing, our new Quantum Annealing Single-qubit Assessment, or QASA, protocol gives us one tool for acceptance testing.”

    Coffrin is principal investigator of the project “Accelerating Combinatorial Optimization with Noisy Analog Hardware,” which produced the paper “Single-Qubit Fidelity Assessment of Quantum Annealing Hardware,” published in IEEE Transactions on Quantum Engineering.

    QASA is available as open-source software at github.com/lanl-ansi/QASA. The protocol provides a detailed characterization of individual qubits through salient metrics such as their effective temperature, noise, and bias. In the key breakthrough of this work, the underlying single-qubit model can be executed in parallel for every qubit in a quantum annealing hardware device.
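    As a rough illustration of the kind of per-qubit fit involved, the sketch below generates synthetic single-qubit measurements as a function of a programmed bias and fits a simple Boltzmann response to extract an effective temperature and residual bias. This is an assumed, simplified stand-in for the QASA model described in the paper, not its actual implementation:

        import numpy as np
        from scipy.optimize import curve_fit

        def spin_up_prob(h, t_eff, h0):
            """Boltzmann response of one spin: P(up) = 1 / (1 + exp(2(h - h0)/t_eff))."""
            return 1.0 / (1.0 + np.exp(2.0 * (h - h0) / t_eff))

        rng = np.random.default_rng(0)
        h_sweep = np.linspace(-0.2, 0.2, 41)
        true_t, true_h0, shots = 0.05, 0.01, 1000           # invented ground truth
        counts = rng.binomial(shots, spin_up_prob(h_sweep, true_t, true_h0))
        freqs = counts / shots

        (t_fit, h0_fit), _ = curve_fit(spin_up_prob, h_sweep, freqs, p0=(0.1, 0.0))
        print(f"effective temperature ~ {t_fit:.3f}, residual bias ~ {h0_fit:+.3f}")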

    “The QASA protocol could eventually find a wide range of uses, such as tracking improved performance in quantum annealing computers and helping hardware developers spot inconsistencies in their own devices,” Coffrin said. With the protocol, users of quantum annealers could also calibrate their algorithms to their specific computers.

    “Characterizing the noise in the system is probably the most impactful thing because it’s the least well-recognized aspect of the hardware,” Coffrin noted. “We can measure it, and understand how it’s distributed throughout the whole hardware.”

    The protocol sheds light on the variability of qubit properties across the entire computer. With this detailed analysis of the properties of each qubit, quantum annealer users can employ QASA to quickly verify the level of consistency across the hardware’s qubits and either avoid or compensate for non-ideal qubits. Users can also apply this information to calibrate idealized quantum simulations running on specific hardware devices.

    The analysis also yields several key metrics, such as qubit noise, that support tracking technical improvements on quantum annealing hardware as it is developed.

    As both gate-based quantum computers and quantum annealing computers move from science projects to real-world tasks, measuring and tracking changes in the fidelity of quantum hardware platforms is essential to understanding the limitations of these devices and quantifying progress as these platforms continue to improve, the paper states.

    In a data-driven discovery process, Coffrin said, the Los Alamos team used machine learning and data from a D-Wave 2000Q computer at the Laboratory to develop the QASA protocol, which can run on any quantum annealer.

    “We ran a bunch of experiments on our D-Wave, putting in different values for one parameter, and watched what happened,” he said. The results yielded a surprising curve when graphed. “We had to develop a new theoretical model to correspond to what’s going on.” Then the team designed a machine-learning method that fit the theoretical model to the data.

    Quantum annealing computers operate on a different principle than gate-based quantum computers, which use gates analogous to the logic gates on a classical binary computer.

    Quantum annealers leverage a smooth quantum evolution to exploit fundamental quantum principles in finding high-quality solutions. This process is more specialized than gate-based computing but is still sufficient to solve challenging computational problems in fields such as magnetic materials and machine learning that rely on optimization, or finding the best answer among all plausible answers. For example, finding the shortest route for a delivery truck dropping packages at multiple locations is a classic optimization problem.
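    For readers unfamiliar with this class of problem, here is a classical analog: plain simulated annealing minimizing the energy of a small, randomly generated Ising model, the cost-function family quantum annealers target. The instance and cooling schedule are invented for illustration; a quantum annealer performs this minimization in hardware rather than in software:

        import math, random

        random.seed(42)
        N = 12
        # Random +/-1 couplings on all pairs: a toy frustrated Ising instance
        J = {(i, j): random.choice((-1.0, 1.0)) for i in range(N) for j in range(i + 1, N)}

        def energy(s):
            return sum(Jab * s[a] * s[b] for (a, b), Jab in J.items())

        def field(s, i):
            """Local field acting on spin i from all of its couplings."""
            return sum(Jab * (s[b] if a == i else s[a])
                       for (a, b), Jab in J.items() if i in (a, b))

        s = [random.choice((-1, 1)) for _ in range(N)]
        steps = 20000
        for step in range(steps):
            temp = max(0.01, 3.0 * (1 - step / steps))   # simple linear cooling
            i = random.randrange(N)
            delta = -2 * s[i] * field(s, i)              # energy change if spin i flips
            if delta < 0 or random.random() < math.exp(-delta / temp):
                s[i] = -s[i]

        print("final energy:", energy(s))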

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Los Alamos National Laboratory (US) mission is to solve national security challenges through scientific excellence.

    LANL campus
    DOE’s Los Alamos National Laboratory (US), a multidisciplinary research institution engaged in strategic science on behalf of national security, is operated by Los Alamos National Security, LLC, a team composed of Bechtel National, the University of California, The Babcock & Wilcox Company, and URS, for the Department of Energy’s National Nuclear Security Administration.

    Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.

    Operated by Los Alamos National Security, LLC for the DOE National Nuclear Security Administration (US)

     