Tagged: DOE’s Los Alamos National Laboratory (US)

  • richardmitnick 9:13 am on December 19, 2021 Permalink | Reply
    Tags: "Using sparse data to predict lab quakes", DOE’s Los Alamos National Laboratory (US), Transfer learning: comparisons from the lab to the field

    From DOE’s Los Alamos National Laboratory (US) : “Using sparse data to predict lab quakes” 


    From DOE’s Los Alamos National Laboratory (US)

    December 16, 2021

    Stick-slip events in the earth cause damage like this, but limited data from these relatively rare earthquakes makes them difficult to model with machine learning. Transfer learning may provide a path to understanding when such deep faults slip. Credit: Dreamstime.

    A machine-learning approach developed for sparse data reliably predicts fault slip in laboratory earthquakes and could be key to predicting fault slip and potentially earthquakes in the field. The research by a Los Alamos National Laboratory team builds on their previous success using data-driven approaches that worked for slow-slip events in earth but came up short on large-scale stick-slip faults that generate relatively little data—but big quakes.

    “The very long timescale between major earthquakes limits the data sets, since major faults may slip only once in 50 to 100 years or longer, meaning seismologists have had little opportunity to collect the vast amounts of observational data needed for machine learning,” said Paul Johnson, a geophysicist at Los Alamos and a co-author on a new paper in Nature Communications.

    To compensate for limited data, Johnson said, the team trained a convolutional neural network on the output of numerical simulations of laboratory quakes as well as on a small set of data from lab experiments. Then they were able to predict fault slips in the remaining unseen lab data.

    This research was the first application of transfer learning to numerical simulations for predicting fault slip in lab experiments, Johnson said, and the technique has not yet been applied to earth observations.

    With transfer learning, researchers can generalize from one model to another as a way of overcoming data sparsity. The approach allowed the Laboratory team to build on their earlier data-driven machine learning experiments successfully predicting slip in laboratory quakes and apply it to sparse data from the simulations. Specifically, in this case, transfer learning refers to training the neural network on one type of data—simulation output—and applying it to another—experimental data—with the additional step of training on a small subset of experimental data, as well.
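
    As a rough illustration of this pretrain-then-fine-tune idea, the sketch below stands in a simple linear least-squares model for the paper's convolutional network; all data, names, and parameters here are invented for illustration. Plentiful synthetic "simulation" data pre-train the model, and a small, slightly shifted "experimental" set fine-tunes it with a penalty that keeps the weights near the pre-trained solution.

```python
import numpy as np

# Invented toy illustrating pretrain-then-fine-tune; a linear least-squares
# model stands in for the paper's convolutional network. "Simulation" data
# are plentiful; "experimental" data are scarce and slightly shifted,
# mimicking the sim-to-lab domain gap described in the article.
rng = np.random.default_rng(0)

def make_data(n, shift, rng):
    X = rng.normal(size=(n, 5))
    w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
    y = X @ (w_true + shift) + 0.1 * rng.normal(size=n)
    return X, y

X_sim, y_sim = make_data(2000, shift=0.0, rng=rng)    # abundant simulation output
X_lab, y_lab = make_data(20, shift=0.3, rng=rng)      # small experimental subset
X_new, y_new = make_data(200, shift=0.3, rng=rng)     # unseen experimental data

# Step 1: pre-train on the simulation output.
w_pre = np.linalg.lstsq(X_sim, y_sim, rcond=None)[0]

# Step 2: fine-tune on the small lab set; the ridge term keeps the solution
# near the pre-trained weights instead of overfitting 20 points.
lam = 1.0
w_ft = np.linalg.solve(X_lab.T @ X_lab + lam * np.eye(5),
                       X_lab.T @ y_lab + lam * w_pre)

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

# Fine-tuning on a sliver of lab data improves prediction on unseen lab data.
print(mse(w_ft, X_new, y_new) < mse(w_pre, X_new, y_new))
```

    The anchored ridge penalty is one simple way to express "train mostly on simulations, then adapt to a few experiments"; a neural network would achieve the same effect by continuing gradient descent from the pre-trained weights.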

    “Our aha moment came when I realized we can take this approach to earth,” Johnson said. “We can simulate a seismogenic fault in earth, then incorporate data from the actual fault during a portion of the slip cycle through the same kind of cross training.” The aim would be to predict fault movement in a seismogenic fault such as the San Andreas, where data is limited by infrequent earthquakes.

    The team first ran numerical simulations of the lab quakes. These simulations involve building a mathematical grid and plugging in values to simulate fault behavior; some of those values are necessarily just good guesses.

    For this paper, the convolutional neural network comprised an encoder that boils down the output of the simulation to its key features, which are encoded in the model’s hidden, or latent, space between the encoder and decoder. Those features are the essence of the input data that can predict fault-slip behavior.

    The neural network decoded the simplified features to estimate the friction on the fault at any given time. In a further refinement of this method, the model’s latent space was additionally trained on a small slice of experimental data. Armed with this “cross-training,” the neural network predicted fault-slip events accurately when fed unseen data from a different experiment.
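
    A minimal stand-in for this encoder/latent-space/decoder pipeline can be sketched with PCA playing the encoder and a least-squares map playing the decoder; the paper's actual model is a convolutional network, and everything below is an invented toy.

```python
import numpy as np

# Invented toy of the encoder -> latent space -> decoder pipeline: PCA plays
# the encoder, compressing high-dimensional "simulation output" to a few
# latent features; a least-squares map plays the decoder that estimates
# friction from those features.
rng = np.random.default_rng(1)
n, d, k = 500, 50, 3                            # samples, signal dim, latent dim

Z_true = rng.normal(size=(n, k))                # hidden factors driving the signal
X = Z_true @ rng.normal(size=(k, d)) + 0.05 * rng.normal(size=(n, d))
friction = Z_true @ np.array([2.0, -1.0, 0.5])  # target depends on the factors

# Encoder: project the centered signals onto their top-k principal components.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:k].T                               # latent features

# Decoder: least-squares map from the latent space to friction.
w = np.linalg.lstsq(Z, friction, rcond=None)[0]
pred = Z @ w

r2 = 1 - np.sum((friction - pred) ** 2) / np.sum((friction - friction.mean()) ** 2)
print(r2 > 0.9)  # a few latent features capture nearly all of the friction signal
```

    The point mirrors the article: once the high-dimensional signal is compressed to a small latent representation that preserves the essential features, a simple decoder suffices to estimate friction.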

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Los Alamos National Laboratory (US) mission is to solve national security challenges through scientific excellence.

    DOE’s Los Alamos National Laboratory, a multidisciplinary research institution engaged in strategic science on behalf of national security, is managed by Triad, a public service-oriented national security science organization equally owned by its three founding members: The University of California, Texas A&M University (US), and Battelle Memorial Institute (Battelle), for the Department of Energy’s National Nuclear Security Administration. Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.

     
  • richardmitnick 9:17 am on December 9, 2021 Permalink | Reply
    Tags: "Physical features boost the efficiency of quantum simulations", The complexity of a quantum simulation algorithm depends on the relevant energy scale and not the full range of energies of the system as previously thought., Three quantum systems in which a quantum simulation algorithm can run faster-and in some cases exponentially faster-than the limits suggested by the time-energy uncertainty principle., DOE’s Los Alamos National Laboratory (US), Recent theoretical breakthroughs have settled two long-standing questions about the viability of simulating quantum systems on future quantum computers.

    From DOE’s Los Alamos National Laboratory (US) : “Physical features boost the efficiency of quantum simulations” 


    From DOE’s Los Alamos National Laboratory (US)

    December 6, 2021
    Charles Poling
    (505) 257-8006
    cpoling@lanl.gov

    New theoretical research at Los Alamos National Laboratory lays the groundwork for robust quantum algorithms when large-scale quantum computers become available. Image credit: Dreamstime.

    Recent theoretical breakthroughs have settled two long-standing questions about the viability of simulating quantum systems on future quantum computers, overcoming challenges from complexity analyses to enable more advanced algorithms. Featured in two publications, the work by a quantum team at Los Alamos National Laboratory shows that physical properties of quantum systems allow for faster simulation techniques.

    “Algorithms based on this work will be needed for the first full-scale demonstration of quantum simulations on quantum computers,” said Rolando Somma, a quantum theorist at Los Alamos and coauthor on the two papers.

    Low-energy quantum states key to faster quantum simulation

    The first paper demonstrates that the complexity of a quantum simulation algorithm depends on the relevant energy scale, not the full range of energies of the system as previously thought. Indeed, some quantum systems can have states of unbounded energy, so by the earlier reasoning their simulation would prove intractable even on large quantum computers.

    This new research found that, if a quantum system explores the low-energy states only, it could be simulated with low complexity on a quantum computer without errors crashing the simulation.

    “Our work provides a path to a systematic study of quantum simulations at low energies, which will be required to push quantum simulations closer to reality,” said Burak Şahinoğlu, a theoretical physicist at Los Alamos and lead author on the paper, published in the journal npj Quantum Information, a Nature partner journal.

    “We show that at every step of the algorithm, you never escape to the very large energies,” said Somma. “There’s a way of writing your quantum algorithm so that after each step you’re still within your low-energy subspace.”
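
    That statement has a simple linear-algebra illustration: if a state starts in the span of the k lowest eigenvectors of a Hamiltonian H, its time evolution can be computed entirely within that subspace. The sketch below is a classical numpy toy demonstrating the principle, not a quantum algorithm.

```python
import numpy as np

# Toy classical sketch of the low-energy-subspace idea: a state lying in the
# span of the k lowest eigenvectors of H evolves under exp(-iHt) without ever
# touching the high-energy part of the spectrum.
rng = np.random.default_rng(2)
d, k = 16, 4                                   # full dimension, low-energy dimension
A = rng.normal(size=(d, d))
H = (A + A.T) / 2                              # random Hermitian "Hamiltonian"
E, U = np.linalg.eigh(H)                       # eigenvalues sorted ascending

psi0 = U[:, :k] @ rng.normal(size=k)           # state in the low-energy subspace
psi0 = psi0 / np.linalg.norm(psi0)

t = 1.7
# Full evolution exp(-iHt)|psi0> via the complete eigendecomposition.
psi_full = U @ (np.exp(-1j * E * t) * (U.conj().T @ psi0))

# Restricted evolution: keep only the k lowest eigenpairs.
Uk, Ek = U[:, :k], E[:k]
psi_low = Uk @ (np.exp(-1j * Ek * t) * (Uk.conj().T @ psi0))

print(np.allclose(psi_full, psi_low))  # the dynamics never leave the subspace
```

    The quantum-algorithmic content of the paper is showing that a simulation algorithm can be arranged to exploit this structure step by step; the toy only illustrates why restricting to the low-energy subspace loses nothing for such states.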

    The authors said their research applies to a large class of quantum systems and will be useful in simulating quantum field theories, which describe physical phenomena within their low-energy states.

    Fast-forwarding of quantum systems bypasses the time-energy uncertainty principle

    The other paper, a collaboration with The California Institute of Technology (US)’s Shouzhen Gu, a former Los Alamos quantum computing summer school student, is published in Quantum. It shows three quantum systems in which a quantum simulation algorithm can run faster, in some cases exponentially faster, than the limits suggested by the time-energy uncertainty principle.

    “In quantum mechanics, the best precision that can be achieved when measuring a system’s energy scales, in general, with the inverse of the duration of the measurement,” said Somma.

    “However, this principle does not apply to all quantum systems, especially those that have certain physical features,” said Şahinoğlu.

    The authors showed that when this principle is bypassed, such quantum systems can also be simulated very efficiently, or fast-forwarded, on quantum computers.

    Funding: DOE SC HEP, ASCR, National Quantum Information Science Research Centers, Quantum Science Center.

    See the full article here.


     
  • richardmitnick 5:09 pm on November 8, 2021 Permalink | Reply
    Tags: "Upgraded code reveals a source of damaging fusion disruptions", DOE’s Los Alamos National Laboratory (US)

    From DOE’s Princeton Plasma Physics Laboratory (US) and DOE’s Los Alamos National Laboratory (US) via phys.org : “Upgraded code reveals a source of damaging fusion disruptions” 

    From DOE’s Princeton Plasma Physics Laboratory (US)

    at

    Princeton University

    and

    DOE’s Los Alamos National Laboratory (US)

    via

    phys.org

    November 8, 2021

    Destructive magnetic perturbations create a complex 3-D structure of magnetic field lines that randomly wander inside the tokamak. The red line shows the 3-D trajectory of an example field line, and each field line can have a significantly different trajectory. The colors of the cross-section represent the length of field line trajectory through each area, from short (black) to long (yellow) lengths. Credit: Min-Gu Yoo.

    Researchers at the U.S. Department of Energy’s Princeton Plasma Physics Laboratory (PPPL) and Los Alamos National Laboratory have uncovered a key process behind a major challenge called thermal quenches, the rapid heat loss in hot plasmas that can occur in doughnut-shaped tokamak fusion devices. Such quenches are sudden drops of electron heat in the plasma that fuels fusion reactions, drops that can create damaging disruptions inside the tokamak. Understanding the physics behind these quenches, caused by powerful perturbations in the magnetic fields that confine the plasma in tokamaks, could lead to methods to mitigate or prevent them.

    Researchers have now traced a comprehensive mechanism for thermal quenches to turbulent particle transport. Using the laboratory’s Gyrokinetic Tokamak Simulation (GTS) code, the physicists explored how the hot plasma, which is composed of free electrons and atomic nuclei, or ions, generates the electric field and the turbulent particle transport at the outset of quenches.

    The GTS code was originally developed at PPPL to simulate turbulence and transport physics in the hot core plasmas which are confined by magnetic fields in tokamaks. Recently, the GTS code has been extended to study more complex plasmas and magnetic fields, such as destructive magnetic perturbations that break the magnetic field cage and create chaotic 3D magnetic field lines (Figure 1). The introduction of novel numerical algorithms and the acceleration of graphics processing units made this powerful new capability possible. This upgrade enables the consistent simulation of plasma transport during thermal quenches at lower computational costs, yielding important new insights into disruption physics.
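
    Chaotic wandering of field lines under a magnetic perturbation is often modeled with the Chirikov standard map, where each iteration represents one toroidal transit. The sketch below is an invented toy, unrelated to the GTS code itself; it only shows how a larger perturbation strength K spreads a single field line across the cross-section.

```python
import numpy as np

# Hypothetical illustration (not the GTS code): field-line wandering in a
# perturbed tokamak modeled with the Chirikov standard map. theta is a
# poloidal angle, p a radial-like coordinate, K the perturbation strength.
def trace_field_line(theta0, p0, K, transits):
    theta, p = theta0, p0
    path = []
    for _ in range(transits):
        p = (p + K * np.sin(theta)) % (2 * np.pi)  # radial kick from perturbation
        theta = (theta + p) % (2 * np.pi)          # poloidal advance per transit
        path.append((theta, p))
    return np.array(path)

weak = trace_field_line(0.5, 1.0, K=0.2, transits=2000)    # ordered field line
strong = trace_field_line(0.5, 1.0, K=3.0, transits=2000)  # chaotic field line

# In the chaotic regime the line wanders over much of the cross-section;
# the spread of the radial coordinate is a crude diagnostic of that.
print(strong[:, 1].std() > weak[:, 1].std())
```

    At small K the line stays on an ordered surface, while at large K it fills the cross-section, which is the qualitative picture behind the broken "magnetic field cage" described above.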

    The GTS code traced the plasma transport mechanism to the evolution of a self-generated electric field in 3D chaotic magnetic fields, whose complexity had previously made the quenching mechanism difficult to understand. The improved code resolved that long-standing puzzle and laid bare the physics behind the mechanism.

    The self-generated field mixes up the plasma, causing high-energy electrons to escape from the core and fly toward the wall. This enhanced heat transport produces a rapid and continuous drop in electron temperature, leading to the thermal quench.

    From the simulation results and comparison to experimental observations, researchers found that this novel mechanism could be a major contributor to the abrupt quenches. The researchers have proposed an analytic model of plasma transport that provides new physical insights for understanding the complex topology of 3D magnetic field lines. These breakthrough discoveries could lead to new steps to battle damaging disruptions.

    The work is to be presented at the 63rd Annual Meeting of the APS Division of Plasma Physics, Monday–Friday, November 8–12, 2021, in Pittsburgh, PA.

    See the full article here.



    Princeton Plasma Physics Laboratory (US) is a U.S. Department of Energy national laboratory managed by Princeton University. PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit https://energy.gov/science.

    Princeton University

    About Princeton: Overview

    Princeton University is a private Ivy League research university in Princeton, New Jersey (US). Founded in 1746 in Elizabeth as the College of New Jersey, Princeton is the fourth-oldest institution of higher education in the United States and one of the nine colonial colleges chartered before the American Revolution. The institution moved to Newark in 1747, then to the current site nine years later. It was renamed Princeton University in 1896.

    Princeton provides undergraduate and graduate instruction in the humanities, social sciences, natural sciences, and engineering. It offers professional degrees through the Princeton School of Public and International Affairs, the School of Engineering and Applied Science, the School of Architecture and the Bendheim Center for Finance. The university also manages the DOE’s Princeton Plasma Physics Laboratory. Princeton has the largest endowment per student in the United States.

    As of October 2020, 69 Nobel laureates, 15 Fields Medalists and 14 Turing Award laureates have been affiliated with Princeton University as alumni, faculty members or researchers. In addition, Princeton has been associated with 21 National Medal of Science winners, 5 Abel Prize winners, 5 National Humanities Medal recipients, 215 Rhodes Scholars, 139 Gates Cambridge Scholars and 137 Marshall Scholars. Two U.S. Presidents, twelve U.S. Supreme Court Justices (three of whom currently serve on the court) and numerous living billionaires and foreign heads of state are all counted among Princeton’s alumni body. Princeton has also graduated many prominent members of the U.S. Congress and the U.S. Cabinet, including eight Secretaries of State, three Secretaries of Defense and the current Chairman of the Joint Chiefs of Staff.

    Princeton University, founded as the College of New Jersey, was considered the successor of the “Log College” founded by the Reverend William Tennent at Neshaminy, PA in about 1726. New Light Presbyterians founded the College of New Jersey in 1746 in Elizabeth, New Jersey. Its purpose was to train ministers. The college was the educational and religious capital of Scottish Presbyterian America. Unlike Harvard University (US), which was originally “intensely English” with graduates taking the side of the crown during the American Revolution, Princeton was founded to meet the religious needs of the period and many of its graduates took the American side in the war. In 1754, trustees of the College of New Jersey suggested that, in recognition of Governor Jonathan Belcher’s interest, Princeton should be named as Belcher College. Belcher replied: “What a name that would be!” In 1756, the college moved its campus to Princeton, New Jersey. Its home in Princeton was Nassau Hall, named for the royal House of Orange-Nassau of William III of England.

    Following the untimely deaths of Princeton’s first five presidents, John Witherspoon became president in 1768 and remained in that post until his death in 1794. During his presidency, Witherspoon shifted the college’s focus from training ministers to preparing a new generation for secular leadership in the new American nation. To this end, he tightened academic standards and solicited investment in the college. Witherspoon’s presidency constituted a long period of stability for the college, interrupted by the American Revolution and particularly the Battle of Princeton, during which British soldiers briefly occupied Nassau Hall; American forces, led by George Washington, fired cannon on the building to rout them from it.

    In 1812, the eighth president of the College of New Jersey, Ashbel Green (1812–23), helped establish the Princeton Theological Seminary next door. The plan to extend the theological curriculum met with “enthusiastic approval on the part of the authorities at the College of New Jersey.” Today, Princeton University and Princeton Theological Seminary maintain separate institutions with ties that include services such as cross-registration and mutual library access.

    Before the construction of Stanhope Hall in 1803, Nassau Hall was the college’s sole building. The cornerstone of the building was laid on September 17, 1754. During the summer of 1783, the Continental Congress met in Nassau Hall, making Princeton the country’s capital for four months. Over the centuries and through two redesigns following major fires (1802 and 1855), Nassau Hall’s role shifted from an all-purpose building, comprising office, dormitory, library, and classroom space; to classroom space exclusively; to its present role as the administrative center of the University. The class of 1879 donated twin lion sculptures that flanked the entrance until 1911, when that same class replaced them with tigers. Nassau Hall’s bell rang after the hall’s construction; however, the fire of 1802 melted it. The bell was then recast and melted again in the fire of 1855.

    James McCosh became the college’s president in 1868 and lifted the institution out of a low period that had been brought about by the American Civil War. During his two decades of service, he overhauled the curriculum, oversaw an expansion of inquiry into the sciences, and supervised the addition of a number of buildings in the High Victorian Gothic style to the campus. McCosh Hall is named in his honor.

    In 1879, the first thesis for a Doctor of Philosophy (Ph.D.) was submitted by James F. Williamson, Class of 1877.

    In 1896, the college officially changed its name from the College of New Jersey to Princeton University to honor the town in which it resides. During this year, the college also underwent large expansion and officially became a university. In 1900, the Graduate School was established.

    In 1902, Woodrow Wilson, graduate of the Class of 1879, was elected the 13th president of the university. Under Wilson, Princeton introduced the preceptorial system in 1905, a then-unique concept in the United States that augmented the standard lecture method of teaching with a more personal form in which small groups of students, or precepts, could interact with a single instructor, or preceptor, in their field of interest.

    In 1906, the reservoir Carnegie Lake was created by Andrew Carnegie. A collection of historical photographs of the building of the lake is housed at the Seeley G. Mudd Manuscript Library on Princeton’s campus. On October 2, 1913, the Princeton University Graduate College was dedicated. In 1919 the School of Architecture was established. In 1933, Albert Einstein became a lifetime member of the Institute for Advanced Study with an office on the Princeton campus. While always independent of the university, the Institute for Advanced Study occupied offices in Jones Hall for 6 years, from its opening in 1933, until its own campus was finished and opened in 1939.

    Coeducation

    In 1969, Princeton University first admitted women as undergraduates. In 1887, the university actually maintained and staffed a sister college, Evelyn College for Women, in the town of Princeton on Evelyn and Nassau streets. It was closed after roughly a decade of operation. After abortive discussions with Sarah Lawrence College to relocate the women’s college to Princeton and merge it with the University in 1967, the administration decided to admit women and turned to the issue of transforming the school’s operations and facilities into a female-friendly campus. The administration had barely finished these plans in April 1969 when the admissions office began mailing out its acceptance letters. Its five-year coeducation plan provided $7.8 million for the development of new facilities that would eventually house and educate 650 women students at Princeton by 1974. Ultimately, 148 women, consisting of 100 freshmen and transfer students of other years, entered Princeton on September 6, 1969 amidst much media attention. Princeton enrolled its first female graduate student, Sabra Follett Meservey, as a PhD candidate in Turkish history in 1961. A handful of undergraduate women had studied at Princeton from 1963 on, spending their junior year there to study “critical languages” in which Princeton’s offerings surpassed those of their home institutions. They were considered regular students for their year on campus, but were not candidates for a Princeton degree.

    As a result of a 1979 lawsuit by Sally Frank, Princeton’s eating clubs were required to go coeducational in 1991, after Tiger Inn’s appeal to the U.S. Supreme Court was denied. In 1987, the university changed the gendered lyrics of “Old Nassau” to reflect the school’s co-educational student body. From 2009 to 2011, Princeton professor Nannerl O. Keohane chaired a committee on undergraduate women’s leadership at the university, appointed by President Shirley M. Tilghman.

    The main campus sits on about 500 acres (2.0 km^2) in Princeton. In 2011, the main campus was named by Travel+Leisure as one of the most beautiful in the United States. The James Forrestal Campus is split between nearby Plainsboro and South Brunswick. The University also owns some property in West Windsor Township. The campuses are situated about one hour from both New York City and Philadelphia.

    The first building on campus was Nassau Hall, completed in 1756 and situated on the northern edge of campus facing Nassau Street. The campus expanded steadily around Nassau Hall during the early and middle 19th century. The McCosh presidency (1868–88) saw the construction of a number of buildings in the High Victorian Gothic and Romanesque Revival styles; many of them are now gone, leaving the remaining few to appear out of place. At the end of the 19th century much of Princeton’s architecture was designed by the Cope and Stewardson firm (the same architects who designed a large part of Washington University in St. Louis (US) and the University of Pennsylvania (US)), resulting in the Collegiate Gothic style for which it is known today. Implemented initially by William Appleton Potter and later enforced by the University’s supervising architect, Ralph Adams Cram, the Collegiate Gothic style remained the standard for all new building on the Princeton campus through 1960. A flurry of construction in the 1960s produced a number of new buildings on the south side of the main campus, many of which have been poorly received. Several prominent architects have contributed some more recent additions, including Frank Gehry (Lewis Library), I. M. Pei (Spelman Halls), Demetri Porphyrios (Whitman College, a Collegiate Gothic project), Robert Venturi and Denise Scott Brown (Frist Campus Center, among several others), and Rafael Viñoly (Carl Icahn Laboratory).

    A group of 20th-century sculptures scattered throughout the campus forms the Putnam Collection of Sculpture. It includes works by Alexander Calder (Five Disks: One Empty), Jacob Epstein (Albert Einstein), Henry Moore (Oval with Points), Isamu Noguchi (White Sun), and Pablo Picasso (Head of a Woman). Richard Serra’s The Hedgehog and The Fox is located between Peyton and Fine halls next to Princeton Stadium and the Lewis Library.

    At the southern edge of the campus is Carnegie Lake, an artificial lake named for Andrew Carnegie. Carnegie financed the lake’s construction in 1906 at the behest of a friend who was a Princeton alumnus. Carnegie hoped the opportunity to take up rowing would inspire Princeton students to forsake football, which he considered “not gentlemanly.” The Shea Rowing Center on the lake’s shore continues to serve as the headquarters for Princeton rowing.

    Cannon Green

    Buried in the ground at the center of the lawn south of Nassau Hall is the “Big Cannon,” which was left in Princeton by British troops as they fled following the Battle of Princeton. It remained in Princeton until the War of 1812, when it was taken to New Brunswick. In 1836 the cannon was returned to Princeton and placed at the eastern end of town. It was removed to the campus under cover of night by Princeton students in 1838 and buried in its current location in 1840.

    A second “Little Cannon” is buried in the lawn in front of nearby Whig Hall. This cannon, which may also have been captured in the Battle of Princeton, was stolen by students of Rutgers University in 1875. The theft ignited the Rutgers-Princeton Cannon War. A compromise between the presidents of Princeton and Rutgers ended the war and forced the return of the Little Cannon to Princeton. The protruding cannons are occasionally painted scarlet by Rutgers students who continue the traditional dispute.

    In years when the Princeton football team beats the teams of both Harvard University and Yale University in the same season, Princeton celebrates with a bonfire on Cannon Green. This occurred in 2012, ending a five-year drought. The next bonfire happened on November 24, 2013, and was broadcast live over the Internet.

    Landscape

    Princeton’s grounds were designed by Beatrix Farrand between 1912 and 1943. Her contributions were most recently recognized with the naming of a courtyard for her. Subsequent changes to the landscape were introduced by Quennell Rothschild & Partners in 2000. In 2005, Michael Van Valkenburgh was hired as the new consulting landscape architect for the campus. Lynden B. Miller was invited to work with him as Princeton’s consulting gardening architect, focusing on the 17 gardens that are distributed throughout the campus.

    Buildings

    Nassau Hall

    Nassau Hall is the oldest building on campus. Begun in 1754 and completed in 1756, it was the first seat of the New Jersey Legislature in 1776, was involved in the battle of Princeton in 1777, and was the seat of the Congress of the Confederation (and thus capitol of the United States) from June 30, 1783, to November 4, 1783. It now houses the office of the university president and other administrative offices, and remains the symbolic center of the campus. The front entrance is flanked by two bronze tigers, a gift of the Princeton Class of 1879. Commencement is held on the front lawn of Nassau Hall in good weather. In 1966, Nassau Hall was added to the National Register of Historic Places.

    Residential colleges

    Princeton has six undergraduate residential colleges, each housing approximately 500 freshmen, sophomores, some juniors and seniors, and a handful of junior and senior resident advisers. Each college consists of a set of dormitories, a dining hall, a variety of other amenities—such as study spaces, libraries, performance spaces, and darkrooms—and a collection of administrators and associated faculty. Two colleges, First College and Forbes College (formerly Woodrow Wilson College and Princeton Inn College, respectively), date to the 1970s; three others, Rockefeller, Mathey, and Butler Colleges, were created in 1983 following the Committee on Undergraduate Residential Life (CURL) report, which suggested the institution of residential colleges as a solution to an allegedly fragmented campus social life. The construction of Whitman College, the university’s sixth residential college, was completed in 2007.

    Rockefeller and Mathey are located in the northwest corner of the campus; Princeton brochures often feature their Collegiate Gothic architecture. Like most of Princeton’s Gothic buildings, they predate the residential college system and were fashioned into colleges from individual dormitories.

    First and Butler, located south of the center of the campus, were built in the 1960s. First served as an early experiment in the establishment of the residential college system. Butler, like Rockefeller and Mathey, consisted of a collection of ordinary dorms (called the “New New Quad”) before the addition of a dining hall made it a residential college. Widely disliked for their edgy modernist design, including “waffle ceilings,” the dormitories on the Butler Quad were demolished in 2007. Butler is now reopened as a four-year residential college, housing both under- and upperclassmen.

    Forbes is located on the site of the historic Princeton Inn, a gracious hotel overlooking the Princeton golf course. The Princeton Inn, originally constructed in 1924, played regular host to important symposia and gatherings of renowned scholars from both the university and the nearby Institute for Advanced Study for many years. Forbes currently houses nearly 500 undergraduates in its residential halls.

    In 2003, Princeton broke ground for a sixth college named Whitman College after its principal sponsor, Meg Whitman, who graduated from Princeton in 1977. The new dormitories were constructed in the Collegiate Gothic architectural style and were designed by architect Demetri Porphyrios. Construction finished in 2007, and Whitman College was inaugurated as Princeton’s sixth residential college that same year.

    The precursor of the present college system in America was originally proposed by university president Woodrow Wilson in the early 20th century. For over 800 years, however, the collegiate system had already existed in Britain at Cambridge and Oxford Universities. Wilson’s model was much closer to Yale University (US)’s present system, which features four-year colleges. Lacking the support of the trustees, the plan languished until 1968. That year, Wilson College was established to cap a series of alternatives to the eating clubs. Fierce debates raged before the present residential college system emerged. The plan was first attempted at Yale, but the administration was initially uninterested; an exasperated alumnus, Edward Harkness, finally paid to have the college system implemented at Harvard in the 1920s, leading to the oft-quoted aphorism that the college system is a Princeton idea that was executed at Harvard with funding from Yale.

    Princeton has one graduate residential college, known simply as the Graduate College, located beyond Forbes College at the outskirts of campus. The far-flung location of the GC was the spoil of a squabble between Woodrow Wilson and then-Graduate School Dean Andrew Fleming West. Wilson preferred a central location for the college; West wanted the graduate students as far as possible from the campus. Ultimately, West prevailed. The Graduate College is composed of a large Collegiate Gothic section crowned by Cleveland Tower, a local landmark that also houses a world-class carillon. The attached New Graduate College provides a modern contrast in architectural style.

    McCarter Theatre

    The Tony Award-winning McCarter Theatre was built by the Princeton Triangle Club, a student performance group, using club profits and a gift from Princeton University alumnus Thomas McCarter. Today, the Triangle Club performs its annual freshman revue, fall show, and Reunions performances in McCarter. McCarter is also recognized as one of the leading regional theaters in the United States.

    Art Museum

    The Princeton University Art Museum was established in 1882 to give students direct, intimate, and sustained access to original works of art that complement and enrich instruction and research at the university. This continues to be a primary function, along with serving as a community resource and a destination for national and international visitors.

    Numbering over 92,000 objects, the collections range from ancient to contemporary art and concentrate geographically on the Mediterranean regions, Western Europe, China, the United States, and Latin America. There is a collection of Greek and Roman antiquities, including ceramics, marbles, bronzes, and Roman mosaics from faculty excavations in Antioch. Medieval Europe is represented by sculpture, metalwork, and stained glass. The collection of Western European paintings includes examples from the early Renaissance through the 19th century, with masterpieces by Monet, Cézanne, and Van Gogh, and features a growing collection of 20th-century and contemporary art, including iconic paintings such as Andy Warhol’s Blue Marilyn.

    One of the best features of the museum is its collection of Chinese art, with important holdings in bronzes, tomb figurines, painting, and calligraphy. Its collection of pre-Columbian art includes examples of Mayan art, and is commonly considered to be the most important collection of pre-Columbian art outside of Latin America. The museum has collections of old master prints and drawings and a comprehensive collection of over 27,000 original photographs. African art and Northwest Coast Indian art are also represented. The Museum also oversees the outdoor Putnam Collection of Sculpture.

    University Chapel

    The Princeton University Chapel is located on the north side of campus, near Nassau Street. It was built between 1924 and 1928, at a cost of $2.3 million [approximately $34.2 million in 2020 dollars]. Ralph Adams Cram, the University’s supervising architect, designed the chapel, which he viewed as the crown jewel for the Collegiate Gothic motif he had championed for the campus. At the time of its construction, it was the second largest university chapel in the world, after King’s College Chapel, Cambridge. It underwent a two-year, $10 million restoration campaign between 2000 and 2002.

    Measured on the exterior, the chapel is 277 feet (84 m) long, 76 feet (23 m) wide at its transepts, and 121 feet (37 m) high. The exterior is Pennsylvania sandstone, with Indiana limestone used for the trim. The interior is mostly limestone and Aquia Creek sandstone. The design evokes an English church of the Middle Ages. The extensive iconography, in stained glass, stonework, and wood carvings, has the common theme of connecting religion and scholarship.

    The Chapel seats almost 2,000. It hosts weekly ecumenical Christian services, daily Roman Catholic mass, and several annual special events.

    Murray-Dodge Hall

    Murray-Dodge Hall houses the Office of Religious Life (ORL), the Murray-Dodge Theater, the Murray-Dodge Café, the Muslim Prayer Room and the Interfaith Prayer Room. The ORL houses the office of the Dean of Religious Life, Alison Boden, and a number of university chaplains, including the country’s first Hindu chaplain, Vineet Chander, and one of the country’s first Muslim chaplains, Sohaib Sultan.

    Sustainability

    Published in 2008, Princeton’s Sustainability Plan highlights three priority areas for the University’s Office of Sustainability: reduction of greenhouse gas emissions; conservation of resources; and research, education, and civic engagement. Princeton has committed to reducing its carbon dioxide emissions to 1990 levels by 2020 without the purchase of offsets. The University published its first Sustainability Progress Report in November 2009. The University has adopted a green purchasing policy and recycling program that focuses on paper products, construction materials, lightbulbs, furniture, and electronics. Its dining halls have set a goal to purchase 75% sustainable food products by 2015. The student organization “Greening Princeton” seeks to encourage the University administration to adopt environmentally friendly policies on campus.

    Organization

    The Trustees of Princeton University, a 40-member board, is responsible for the overall direction of the University. It approves the operating and capital budgets, supervises the investment of the University’s endowment and oversees campus real estate and long-range physical planning. The trustees also exercise prior review and approval concerning changes in major policies, such as those in instructional programs and admission, as well as tuition and fees and the hiring of faculty members.

    With an endowment of $26.1 billion, Princeton University is among the wealthiest universities in the world. Its endowment, ranked in 2010 as the third largest in the United States, gave the university the greatest per-student endowment in the world (over $2 million per undergraduate) in 2011. Such a significant endowment is sustained through the continued donations of its alumni and is maintained by investment advisers. Some of Princeton’s wealth is invested in its art museum, which features works by Claude Monet, Vincent van Gogh, Jackson Pollock, and Andy Warhol, among other prominent artists.

    Academics

    Undergraduates fulfill general education requirements, choose among a wide variety of elective courses, and pursue departmental concentrations and interdisciplinary certificate programs. Required independent work is a hallmark of undergraduate education at Princeton. Students graduate with either the Bachelor of Arts (A.B.) or the Bachelor of Science in Engineering (B.S.E.).

    The graduate school offers advanced degrees spanning the humanities, social sciences, natural sciences, and engineering. Doctoral education is available in most disciplines. It emphasizes original and independent scholarship, whereas master’s degree programs in architecture, engineering, finance, and public affairs and public policy prepare candidates for careers in public life and professional practice.

    The university has ties with the Institute for Advanced Study, Princeton Theological Seminary and the Westminster Choir College of Rider University (US).

    Undergraduate

    Undergraduate courses in the humanities are traditionally either seminars or lectures held two or three times a week, with an additional discussion seminar called a “precept.” To graduate, all A.B. candidates must complete a senior thesis and, in most departments, one or two extensive pieces of independent research known as “junior papers.” Juniors in some departments, including architecture and the creative arts, complete independent projects that differ from written research papers. A.B. candidates must also fulfill a three- or four-semester foreign language requirement and distribution requirements (which include, for example, classes in ethics, literature and the arts, and historical analysis), with a total of 31 classes. B.S.E. candidates follow a parallel track with an emphasis on a rigorous science and math curriculum, a computer science requirement, and at least two semesters of independent research, including an optional senior thesis. All B.S.E. students must complete at least 36 classes. A.B. candidates typically have more freedom in course selection than B.S.E. candidates because of their smaller number of required classes. Nonetheless, in the spirit of a liberal arts education, both enjoy a comparatively high degree of latitude in creating a self-structured curriculum.

    Undergraduates agree to adhere to an academic integrity policy called the Honor Code, established in 1893. Under the Honor Code, faculty do not proctor examinations; instead, the students proctor one another and must report any suspected violation to an Honor Committee made up of undergraduates. The Committee investigates reported violations and holds a hearing if it is warranted. An acquittal at such a hearing results in the destruction of all records of the hearing; a conviction results in the student’s suspension or expulsion. The signed pledge required by the Honor Code is so integral to students’ academic experience that the Princeton Triangle Club performs a song about it each fall. Out-of-class exercises fall under the jurisdiction of the Faculty-Student Committee on Discipline. Undergraduates are expected to sign a pledge on their written work affirming that they have not plagiarized the work.

    Graduate

    The Graduate School has about 2,600 students in 42 academic departments and programs in the social sciences, engineering, natural sciences, and humanities. These departments include the Department of Psychology, the Department of History, and the Department of Economics.

    In 2017–2018, it received nearly 11,000 applications for admission and accepted around 1,000 applicants. The University also awarded 319 Ph.D. degrees and 170 final master’s degrees. Princeton has no medical school, law school, business school, or school of education. (A short-lived Princeton Law School folded in 1852.) It offers professional graduate degrees in architecture, engineering, finance, and public policy, the last through the Princeton School of Public and International Affairs, which was founded in 1930 as the School of Public and International Affairs, renamed in 1948 after university president (and U.S. president) Woodrow Wilson, and renamed again in 2020.

    Libraries

    The Princeton University Library system houses over eleven million holdings, including seven million bound volumes. The main university library, Firestone Library, which houses almost four million volumes, is one of the largest university libraries in the world. Additionally, it is among the largest “open stack” libraries in existence. Its collections include the autographed manuscript of F. Scott Fitzgerald’s The Great Gatsby and George F. Kennan’s Long Telegram. In addition to Firestone Library, specialized libraries exist for architecture, art and archaeology, East Asian studies, engineering, music, public and international affairs, public policy and university archives, and the sciences. In an effort to expand access, these libraries also subscribe to thousands of electronic resources.

    Institutes

    High Meadows Environmental Institute

    The High Meadows Environmental Institute is an “interdisciplinary center of environmental research, education, and outreach” at the university. The institute was started in 1994. About 90 faculty members at Princeton University are affiliated with it.

    The High Meadows Environmental Institute has the following research centers:

    Carbon Mitigation Initiative (CMI): This is a 15-year partnership between the institute (formerly the Princeton Environmental Institute, PEI) and BP, with the goal of finding solutions to problems related to climate change. The Stabilization Wedge Game was created as part of this initiative.
    Center for BioComplexity (CBC)
    Cooperative Institute for Climate Science (CICS): This is a collaboration with the National Oceanic and Atmospheric Administration’s Geophysical Fluid Dynamics Laboratory.
    Energy Systems Analysis Group
    Grand Challenges

     
  • richardmitnick 7:37 pm on November 2, 2021 Permalink | Reply
    Tags: "3D simulations improve understanding of energetic-particle radiation and help protect space assets", , DOE’s Los Alamos National Laboratory (US),   

    From DOE’s Los Alamos National Laboratory (US) via phys.org : “3D simulations improve understanding of energetic-particle radiation and help protect space assets” 

    LANL bloc

    From DOE’s Los Alamos National Laboratory (US)

    via

    phys.org

    3D simulations based on fundamental physics principles model the production of energetic ions and electrons. Credit: Los Alamos National Laboratory.

    A team of researchers used 3D particle simulations to model the acceleration of ions and electrons in a physical process called magnetic reconnection.

    Lockheed Martin Solar & Astrophysical Laboratory (US) magnetic reconnection image. Credit: National Aeronautics and Space Administration (US).

    The results could contribute to the understanding and forecasting of energetic particles released during magnetic reconnection, which could help protect space assets and advance space exploration.

    “For the first time ever, we can use 3D simulations from fundamental physics principles to model the production of energetic ions and electrons in those magnetic explosions in space,” said paper author Qile Zhang, of the Nuclear and Particle Physics, Astrophysics and Cosmology group at Los Alamos National Laboratory.

    The research was published in Physical Review Letters.

    Magnetic reconnection can cause magnetic explosions, which result in events such as solar flares and geomagnetic storms near Earth; these explosions produce energetic-particle radiation that is harmful to spacecraft and humans. The research team discovered the underlying mechanisms controlling particle acceleration enabled by the 3D kink motions of plasmas—the collection of charged particles—and magnetic fields.

    They also revealed the processes governing the key properties of the energetic-particle energy distributions. The team’s predicted distributions agreed with observations from solar flares and Earth’s magnetic fields.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Los Alamos National Laboratory (US) mission is to solve national security challenges through scientific excellence.

    LANL campus
    DOE’s Los Alamos National Laboratory, a multidisciplinary research institution engaged in strategic science on behalf of national security, is managed by Triad, a public-service-oriented national security science organization equally owned by its three founding members: The University of California, Texas A&M University (US), and Battelle Memorial Institute (Battelle), for the Department of Energy’s National Nuclear Security Administration. Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.

     
  • richardmitnick 8:57 pm on October 18, 2021 Permalink | Reply
    Tags: "Breakthrough proof clears path for quantum AI", , , DOE’s Los Alamos National Laboratory (US), Novel theorem demonstrates convolutional neural networks can always be trained on quantum computers., ,   

    From DOE’s Los Alamos National Laboratory (US) : “Breakthrough proof clears path for quantum AI” 

    LANL bloc

    From DOE’s Los Alamos National Laboratory (US)

    October 15, 2021
    Charles Poling
    (505)257-8006
    cpoling@lanl.gov

    Novel theorem demonstrates convolutional neural networks can always be trained on quantum computers, overcoming threat of ‘barren plateaus’ in optimization problems.

    A novel proof that certain quantum convolutional networks can be guaranteed to be trained clears the way for quantum artificial intelligence to aid in materials discovery and many other applications.

    Convolutional neural networks running on quantum computers have generated significant buzz for their potential to analyze quantum data better than classical computers can. While a fundamental solvability problem known as “barren plateaus” has limited the application of these neural networks for large data sets, new research overcomes that Achilles heel with a rigorous proof that guarantees scalability.

    “The way you construct a quantum neural network can lead to a barren plateau—or not,” said Marco Cerezo, coauthor of the paper in Physical Review X. Cerezo is a physicist specializing in quantum computing, quantum machine learning, and quantum information at Los Alamos. “We proved the absence of barren plateaus for a special type of quantum neural network. Our work provides trainability guarantees for this architecture, meaning that one can generically train its parameters.”

    As an artificial intelligence (AI) methodology, quantum convolutional neural networks are inspired by the visual cortex. As such, they involve a series of convolutional layers, or filters, interleaved with pooling layers that reduce the dimension of the data while keeping important features of a data set.
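    This layer pattern has a simple classical analogue. Below is a minimal NumPy sketch (my illustration, not the quantum circuit from the paper): a 1D convolution filter followed by non-overlapping max pooling, which shrinks the data while keeping its strongest features. The `conv1d` and `max_pool` helpers, the signal, and the kernel values are all invented for illustration.

```python
import numpy as np

def conv1d(signal, kernel):
    """Valid-mode 1D convolution: slide the filter across the signal."""
    k = len(kernel)
    return np.array([signal[i:i + k] @ kernel for i in range(len(signal) - k + 1)])

def max_pool(x, width=2):
    """Non-overlapping max pooling: keep the strongest response in each window."""
    trimmed = x[: len(x) // width * width]
    return trimmed.reshape(-1, width).max(axis=1)

signal = np.array([0.0, 1.0, 0.0, 0.0, 3.0, 0.0, 0.0, 2.0, 0.0])
kernel = np.array([0.5, 1.0, 0.5])  # a simple smoothing filter

features = conv1d(signal, kernel)   # length 9 - 3 + 1 = 7
pooled = max_pool(features)         # length 3; values 1.0, 3.0, 1.5
print(pooled)
```

    Stacking several such convolution-plus-pooling stages repeatedly halves the data size while retaining the salient features, which is the behavior the article attributes to the pooling layers.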

    These neural networks can be used to solve a range of problems, from image recognition to materials discovery. Overcoming barren plateaus is key to extracting the full potential of quantum computers in AI applications and demonstrating their superiority over classical computers.

    Until now, Cerezo said, researchers in quantum machine learning analyzed how to mitigate the effects of barren plateaus, but they lacked a theoretical basis for avoiding them altogether. The Los Alamos work shows how some quantum neural networks are, in fact, immune to barren plateaus.

    “With this guarantee in hand, researchers will now be able to sift through quantum-computer data about quantum systems and use that information for studying material properties or discovering new materials, among other applications,” said Patrick Coles, a quantum physicist at Los Alamos and a coauthor of the paper.

    Many more applications for quantum AI algorithms will emerge, Coles thinks, as researchers use near-term quantum computers more frequently and generate more and more data—all machine learning programs are data-hungry.

    Avoiding the vanishing gradient

    “All hope of quantum speedup or advantage is lost if you have a barren plateau,” Cerezo said.

    The crux of the problem is a “vanishing gradient” in the optimization landscape. The landscape is composed of hills and valleys, and the goal is to train the model’s parameters to find the solution by exploring the geography of the landscape. The solution usually lies at the bottom of the lowest valley, so to speak. But in a flat landscape one cannot train the parameters because it’s difficult to determine which direction to take.

    That problem becomes particularly relevant when the number of data features increases. In fact, the landscape becomes exponentially flat with the feature size. Hence, in the presence of a barren plateau, the quantum neural network cannot be scaled up.
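    A toy cost function makes that exponential flattening concrete. The example below is a classical illustration of my own, not the paper’s quantum model: for C(θ) = ∏ᵢ cos θᵢ over n parameters, the variance of a partial derivative at a random point is (1/2)ⁿ, so the typical gradient, estimated empirically here, decays exponentially as n grows.

```python
import numpy as np

def grad_std(n, samples=200_000):
    """Empirical std of dC/dtheta_1 for C(theta) = prod_i cos(theta_i),
    with theta drawn uniformly from [0, 2*pi)."""
    rng = np.random.default_rng(0)
    theta = rng.uniform(0.0, 2.0 * np.pi, size=(samples, n))
    # Analytic partial derivative: dC/dtheta_1 = -sin(theta_1) * prod_{i>1} cos(theta_i)
    grad = -np.sin(theta[:, 0]) * np.cos(theta[:, 1:]).prod(axis=1)
    return grad.std()

for n in (2, 6, 10):
    # Theory predicts std = (1/2)**(n/2): 0.5, 0.125, 0.03125
    print(n, grad_std(n))
```

    With the typical gradient this small, a limited number of cost evaluations cannot distinguish downhill from flat, which is the practical sense in which a barren plateau blocks training.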

    The Los Alamos team developed a novel graphical approach for analyzing the scaling within a quantum neural network and proving its trainability.

    For more than 40 years, physicists have thought quantum computers would prove useful in simulating and understanding quantum systems of particles, which choke conventional classical computers. The type of quantum convolutional neural network that the Los Alamos research has proved robust is expected to have useful applications in analyzing data from quantum simulations.

    “The field of quantum machine learning is still young,” Coles said. “There’s a famous quote about lasers, when they were first discovered, that said they were a solution in search of a problem. Now lasers are used everywhere. Similarly, a number of us suspect that quantum data will become highly available, and then quantum machine learning will take off.”

    For instance, research is focusing on ceramic materials as high-temperature superconductors, Coles said, which could improve frictionless transportation, such as magnetic levitation trains. But analyzing data about the material’s large number of phases, which are influenced by temperature, pressure, and impurities in these materials, and classifying the phases is a huge task that goes beyond the capabilities of classical computers.

    Using a scalable quantum neural network, a quantum computer could sift through a vast data set about the various states of a given material and correlate those states with phases to identify the optimal state for high-temperature superconducting.

    See the full article here.


     
  • richardmitnick 12:35 pm on October 13, 2021 Permalink | Reply
    Tags: "Levitation yields better neutron-lifetime measurement", , , DOE’s Los Alamos National Laboratory (US), , ,   

    From DOE’s Los Alamos National Laboratory (US) via Science Alert (US) : “Levitation yields better neutron-lifetime measurement” 

    LANL bloc

    From DOE’s Los Alamos National Laboratory (US)

    via

    ScienceAlert

    Science Alert (US)

    13 OCTOBER 2021
    MICHELLE STARR

    Credit: TanyaLovus/iStock/Getty Images Plus.

    We now know, to within a tenth of a percent, how long a neutron can survive outside the atomic nucleus before decaying into a proton.

    This is the most precise measurement yet of the lifespan of these fundamental particles, representing a more than two-fold improvement over previous measurements. This has implications for our understanding of how the first matter in the Universe was created from a soup of protons and neutrons in the minutes after the Big Bang.

    “The process by which a neutron ‘decays’ into a proton – with an emission of a light electron and an almost massless neutrino – is one of the most fascinating processes known to physicists,” said nuclear physicist Daniel Salvat of Indiana University Bloomington (US).

    “The effort to measure this value very precisely is significant because understanding the precise lifetime of the neutron can shed light on how the universe developed – as well as allow physicists to discover flaws in our model of the subatomic universe that we know exist but nobody has yet been able to find.”

    The research was conducted at the Los Alamos Neutron Science Center, where a dedicated experiment measures neutron lifetimes. It’s called the UCNtau project, and it involves ultra-cold neutrons (UCNs) stored in a magneto-gravitational trap.

    The neutrons are cooled almost to absolute zero, and placed in the trap, a bowl-shaped chamber lined with thousands of permanent magnets, which levitate the neutrons, inside a vacuum jacket.

    The magnetic field prevents the neutrons from depolarizing and, combined with gravity, keeps the neutrons from escaping. This design allows neutrons to be stored for up to 11 days.

    The researchers stored their neutrons in the UCNtau trap for 30 to 90 minutes, then counted the remaining particles after the allotted time. Over the course of repeated experiments, conducted between 2017 and 2019, they counted over 40 million neutrons, obtaining enough statistical data to determine the particles’ lifespan with the greatest precision yet.

    This lifespan is around 877.75 ± 0.28 seconds (14 minutes and 38 seconds), according to the researchers’ analysis. The refined measurement can help place important physical constraints on the Universe, including the formation of matter and dark matter.
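    The arithmetic behind such a measurement follows from exponential decay: N(t) = N₀ exp(−t/τ), so survivor counts at two storage times give τ = (t₂ − t₁)/ln(N₁/N₂). The sketch below uses invented round numbers, not UCNtau data, purely to show the calculation; `lifetime_from_counts` is a hypothetical helper.

```python
import math

def lifetime_from_counts(t1, n1, t2, n2):
    """Infer the decay lifetime tau from surviving counts at two storage times,
    assuming pure exponential decay N(t) = N0 * exp(-t / tau)."""
    return (t2 - t1) / math.log(n1 / n2)

# Hypothetical run: 1,000,000 neutrons decay with tau = 877.75 s;
# count the survivors after short (30 min) and long (90 min) storage.
tau_true = 877.75
t1, t2 = 30 * 60, 90 * 60
n1 = 1_000_000 * math.exp(-t1 / tau_true)
n2 = 1_000_000 * math.exp(-t2 / tau_true)

tau_est = lifetime_from_counts(t1, n1, t2, n2)
print(round(tau_est, 2))  # recovers 877.75
```

    In the real experiment the dominant uncertainty comes from counting statistics, which is why accumulating tens of millions of stored neutrons was needed to push the error down to a fraction of a second.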

    After the Big Bang, things happened relatively quickly. In the very first moments, the hot, ultra-dense matter that filled the Universe cooled into quarks and electrons; just millionths of a second later, the quarks coalesced into protons and neutrons.

    Knowing the lifespan of the neutron can help physicists understand what role, if any, decaying neutrons play in the formation of the mysterious mass in the Universe known as dark matter. This information can also help test the validity of something called the Cabibbo-Kobayashi-Maskawa matrix, which helps explain the behavior of quarks under the Standard Model of physics, the researchers said.

    “The underlying model explaining neutron decay involves the quarks changing their identities, but recently improved calculations suggest this process may not occur as previously predicted,” Salvat said.

    “Our new measurement of the neutron lifetime will provide an independent assessment to settle this issue, or provide much-searched-for evidence for the discovery of new physics.”

    The research has been accepted for publication in Physical Review Letters.

    See the full article here.


     
  • richardmitnick 3:06 pm on October 4, 2021 Permalink | Reply
    Tags: , "Supercomputers reveal how X chromosomes fold; deactivate", DOE’s Los Alamos National Laboratory (US)   

    From DOE’s Los Alamos National Laboratory (US) via phys.org : “Supercomputers reveal how X chromosomes fold; deactivate” 

    LANL bloc

    From DOE’s Los Alamos National Laboratory (US)

    via

    phys.org

    October 4, 2021

    RNA particles swarm an X chromosome from a mouse in a new visualization of X chromosome inactivation. Credit: DOE’s Los Alamos National Laboratory.

    Using supercomputer-driven dynamic modeling based on experimental data, researchers can now probe the process that turns off one X chromosome in female mammal embryos. This new capability is helping biologists understand the role of RNA and the chromosome’s structure in the X inactivation process, leading to a deeper understanding of gene expression and opening new pathways to drug treatments for gene-based disorders and diseases.

    “This is the first time we’ve been able to model all the RNA spreading around the chromosome and shutting it down,” said Anna Lappala, a visiting scientist at Los Alamos National Laboratory and a polymer physicist at Massachusetts General Hospital and the Harvard University (US) Department of Molecular Biology. Lappala is first author of the paper posted Oct. 4 on the preprint server bioRxiv. “From experimental data alone, which are 2D and static, you don’t have the resolution to see a whole chromosome at this level of detail. With this modeling, we can see the processes regulating gene expression, and the modeling is grounded in 2D experimental data from our collaborators at Massachusetts General Hospital and Harvard.”

    The model—considered 4D because it shows motion, including time as the fourth dimension—runs on Los Alamos supercomputers. The model also incorporates experimental data from mice genomes obtained through a molecular method called 4DHiC. The combined molecular and computational methodology is a first.

    In the visualization, RNA particles swarm over the X chromosome. The tangled-spaghetti-like strands writhe, changing shape, then the particles engulf and penetrate the depths of the chromosome, turning it off. See the visualization: “3D models reveal hidden process in X chromosome inactivation.”

    “The method allows us to develop an interactive model of this epigenetic process,” said Jeannie T. Lee, professor of Genetics at Harvard Medical School and vice chair in molecular biology at Massachusetts General Hospital, whose lab contributed the experimental data underpinning the model.

    Epigenetics is the study of changes in gene expression and heritable traits that don’t involve mutations in the genome.

    “What’s been missing in the field is some way for a user who’s not computationally savvy to go interactively into a chromosome,” Lee said. She compared using the Los Alamos model to using Google Earth, where “you can zoom into any location on an X chromosome, pick your favorite gene, see the other genes around it, and see how they interact.” That capability could lend insight into how diseases spread, for instance, she said.

    Based on the work in this paper, Los Alamos is currently developing a Google Earth-style browser where any scientist can upload their genomic data and view it dynamically in 3D at various magnifications, said Karissa Sanbonmatsu, a structural biologist at Los Alamos National Laboratory, corresponding author of the paper, and a project leader in developing the computational method.

    In mammals, a female embryo is conceived with two X chromosomes, one inherited from each parent. X inactivation shuts off one of the two chromosomes, a crucial step for the embryo to survive, and variations in X inactivation can trigger a variety of developmental disorders.

    The new Los Alamos model will facilitate a deeper understanding of gene expression and related problems, which could lead to pharmacological treatments for various gene-based diseases and disorders, Lee said.

    “Our main goal was to see the chromosome change its shape and to see gene-expression levels over time,” said Sanbonmatsu.

    To understand how genes are turned on and off, Sanbonmatsu said, “it really helps to know the structure of the chromosome. The hypothesis is that a compacted, tightly structured chromosome tends to turn off genes, but there are not a lot of smoking guns about this. By modeling 3D structures in motion, we can get closer to the relationship between structural compaction and turning off genes.”

    Lee likened the chromosome’s structure to origami. A complicated shape akin to an origami crane offers lots of surface area for gene expression and so might be biologically favored to remain active.

    The model shows a variety of substructures in the chromosome. When it is shut down, “it’s a piecemeal process in which some substructures are kept but some are dissolved,” Sanbonmatsu said. “We see beginning, intermediate, and end stages, through a gradual transition. That’s important for epigenetics because it’s the first time we have been able to analyze the detailed structural transition in an epigenetic change.”

    The modeling also shows genes on the surface of the chromosome that escape X chromosome inactivation, confirming early experimental work. In the model, they cluster and apparently interact or work together on the surface of the chromosome.

    In another insight from the modeling, “As the chromosome goes from an active X, when it’s still fairly large, to a compact inactive X, that’s smaller, we notice there’s a core of the chromosome that’s extremely dense, but the surface is much less dense. We see a lot more motion on the surface too,” Lappala said. “Then there’s an intermediate region that’s not too fast or slow, where the chromosome can rearrange.”

    An inactive X can activate later in a process called age-related activation of inactive X. “It’s associated with problems in blood cells in particular that are known to cause autoimmunity,” Lee said. “Some research is trying pharmacologically to activate the inactive X to treat neurological disorders in children by giving them something back that’s missing on their active X chromosome. For instance, a child could have a mutation that can cause disease. We think if we can reactivate the normal copy on the inactive X, then we would have an epigenetic treatment for that mutation.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Los Alamos National Laboratory (US) mission is to solve national security challenges through scientific excellence.

    LANL campus
    DOE’s Los Alamos National Laboratory, a multidisciplinary research institution engaged in strategic science on behalf of national security, is managed by Triad, a public-service-oriented national security science organization equally owned by its three founding members: the University of California, Texas A&M University (US), and Battelle Memorial Institute (Battelle), for the Department of Energy’s National Nuclear Security Administration. Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.

     
  • richardmitnick 1:27 pm on September 22, 2021 Permalink | Reply
    Tags: "Tracking the big melt", Arctic permafrost – frozen ground – is rapidly thawing due to a warming climate., , , , DOE’s Los Alamos National Laboratory (US), E3SM: Energy Exascale Earth System Model, Earth’s rapidly changing Arctic coastal regions have an outsized climatic effect that echoes around the globe., , Researchers have shown that September Arctic sea ice extent is declining by about 13 percent each decade.   

    From DOE’s ASCR Discovery (US) : “Tracking the big melt” 

    From DOE’s ASCR Discovery (US)

    September 2021

    DOE’s Los Alamos National Laboratory (US) and DOE’s Oak Ridge National Laboratory (US) scientists lead a DOE supercomputing effort to model the complex interactions affecting climate change in Arctic coastal regions.

    1
    Beaufort Sea ice, April 2007. Photo courtesy of Andrew Roberts, Los Alamos National Laboratory.

    Earth’s rapidly changing Arctic coastal regions have an outsized climatic effect that echoes around the globe. Tracking processes behind this evolution is a daunting task even for the best scientists.

    Coastlines are some of the planet’s most dynamic areas – places where marine, terrestrial, atmospheric and human actions meet. But Arctic coastal regions face some of the most troubling consequences of human-caused climate change driven by increasing greenhouse gas emissions, says Los Alamos National Laboratory (LANL) scientist Andrew Roberts.

    “Arctic coastal systems are very fragile,” says Roberts, who leads the high-performance computing systems element of a broader Department of Energy (DOE) Office of Science effort, led by its Biological and Environmental Research (BER) office, to simulate changing Arctic coastal conditions. “Until the last several decades, thick, perennial Arctic sea ice appears to have been generally stable. Now, warming temperatures are causing it to melt.”

    In the 1980s, multiyear ice at least four years old accounted for more than 30 percent of Arctic coverage; that has shrunk to not much more than 1 percent today. Whereas that perennial pack ice circulates around the Arctic, another type known as land-fast ice – anchored to a shoreline or the ocean bottom, acting as a floating land extension – is receding toward the coast due to rising temperatures.

    This exposes coastal regions to damaging waves that can disperse ice and erode coastal permafrost, Roberts says.

    Researchers have shown that September Arctic sea ice extent is declining by about 13 percent each decade, as the Arctic warms more than twice as fast as the rest of the planet – what scientists call “Arctic amplification.”
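    To see what a steady 13-percent-per-decade decline implies, a back-of-envelope compounding calculation (a hypothetical illustration based only on the rate quoted above, not a climate projection) can be sketched like this:

    ```python
    # Illustrative compounding of a ~13%-per-decade decline in September
    # Arctic sea ice extent, relative to an arbitrary baseline of 1.0.
    # The rate is the figure quoted in the article; extrapolating it
    # forward is a naive assumption, not a forecast.

    def remaining_extent(decades: int, decline_per_decade: float = 0.13) -> float:
        """Fraction of baseline extent left after `decades` of steady decline."""
        return (1.0 - decline_per_decade) ** decades

    for d in range(6):
        print(f"after {d} decades: {remaining_extent(d):.2f} of baseline")
    ```

    Under this naive assumption, less than half the baseline September extent would remain after five decades.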

    Changes in Arctic sea-ice and land-ice melting can disrupt the so-called global ocean conveyor belt that circulates water around the planet and helps stabilize the climate, Roberts reports. The stream moves cold, dense, salty water from the poles to the tropical oceans, which send warm water in return.

    The Arctic is now stuck in a crippling feedback loop: Sea ice can reflect 80 percent or more of sunlight into space, but its relentless decline causes larger and larger areas of dark, open ocean to take its place in summer and absorb more than 90 percent of noon sunlight, leading to more warming.
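    The strength of that feedback can be made concrete with a toy energy-balance sketch using only the reflectivity figures quoted above (the insolation value is an assumed midday number for illustration):

    ```python
    # Toy ice-albedo contrast: sea ice reflects ~80% of incoming sunlight
    # (so absorbs ~20%), while dark open ocean absorbs ~90%. For a given
    # insolation, replacing ice with open water multiplies the absorbed
    # solar energy several-fold.

    INSOLATION = 500.0  # W/m^2, an assumed midday value for illustration

    absorbed_over_ice = INSOLATION * (1.0 - 0.80)  # ice reflects 80%
    absorbed_over_ocean = INSOLATION * 0.90        # ocean absorbs 90%

    print(f"absorbed over ice:   {absorbed_over_ice:.0f} W/m^2")
    print(f"absorbed over ocean: {absorbed_over_ocean:.0f} W/m^2")
    print(f"ratio: {absorbed_over_ocean / absorbed_over_ice:.1f}x")
    ```

    The open ocean absorbs roughly 4.5 times more solar energy than the ice it replaces, which is why shrinking ice cover accelerates warming.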

    Roberts and his colleagues tease out how reductions in Arctic ice and increases in Arctic temperatures affect flooding, marine biogeochemistry, shipping, natural resource extraction and wildlife habitat loss. The team also assesses the effects of climate change on traditional communities, where anthropogenic warming affects weather patterns and damages hunting grounds and infrastructure such as buildings and roads.

    Arctic permafrost – frozen ground – is rapidly thawing due to a warming climate. Some scientists predict that roughly 2.5 million square miles of this soil – about 40 percent of the world’s total – could disappear by the century’s end and release mammoth amounts of potent greenhouse gases, including methane, carbon dioxide and water vapor.

    The overall research project, the BER-sponsored Interdisciplinary Research for Arctic Coastal Environments (InteRFACE), is led by Joel Rowland, also from LANL, and is a multi-institutional collaboration that includes other national laboratories and universities. Roberts has overseen the computational aspects of the DOE project, which have benefitted from 650,000 node-hours of supercomputing time in 2020 at the DOE’s National Energy Research Scientific Computing Center (US) at DOE’s Lawrence Berkeley National Laboratory (US).

    The Arctic coastal calculations used NERSC’s Cori, a Cray XC40 system with 700,000 processing cores that can perform 30 quadrillion floating-point operations per second (30 petaflops).

    The LANL researchers, with colleagues from many other national laboratories, have relied on and contributed to the development of a sophisticated DOE-supported research tool called the Energy Exascale Earth System Model (E3SM), letting them use supercomputer simulation and data management to better understand changes in Arctic coastal systems. InteRFACE activities contribute to E3SM and benefit from its broader development.

    E3SM portrays the atmosphere, ocean, land and sea ice – including the mass and energy changes between them – in high-resolution, three-dimensional models, focusing Cori’s computing power on small regions of big interest. The scientists have created grid-like meshes of triangular cells in E3SM’s sea-ice and ocean components to reproduce the region’s coastlines with high fidelity.

    “One of the big questions is when melting sea ice will make the Arctic Ocean navigable year-round,” Roberts says. Although government and commercial ships – even cruise ships – have been able to maneuver through the Northwest Passage in the Canadian Archipelago in recent summers, by 2030 the region could be routinely navigable for many months of the year if sea-ice melting continues apace, he says.

    E3SM development will help researchers gauge how navigable the Northwest Passage is more accurately than the traditional rectangular meshes used in many lower-resolution climate models allow, Roberts notes.

    E3SM features weather-scale resolution – that is, detailed enough to capture fronts, storms, and hurricanes – and uses advanced computers to simulate aspects of the Earth’s variability. The code helps researchers anticipate decadal-scale changes that could influence the U.S. energy sector in years to come.

    “If we had the computing power, we would like to have high-resolution simulations everywhere in the world,” he says. “But that is incredibly expensive to undertake.”

    Ethan Coon, an Oak Ridge National Laboratory scientist and a co-investigator of a related project supported by the DOE Advanced Scientific Computing Research (ASCR) program’s Leadership Computing Challenge (ALCC), says far-northern land warming “is transforming the Arctic hydrological cycle, and we are seeing significant changes in river and stream discharge.” The ALCC program allocates supercomputer time for DOE projects that emphasize high-risk, high-payoff simulations and that broaden the research community.

    Coon, an alumnus of the DOE Computational Science Graduate Fellowship, says warming is altering the pathways of rivers and streams. As thawing permafrost sinks lower below the surface, groundwater courses deeper underground and stays colder as it flows into streams – potentially affecting fish and other wildlife.

    What happens on land has a big ocean impact, Roberts agrees. At long last, he says, “we finally have the ability to really refine coastal regions and simulate their physical processes.”

    See the full article here.



    ASCRDiscovery is a publication of The U.S. Department of Energy

    The United States Department of Energy (DOE)(US) is a cabinet-level department of the United States Government concerned with the United States’ policies regarding energy and safety in handling nuclear material. Its responsibilities include the nation’s nuclear weapons program; nuclear reactor production for the United States Navy; energy conservation; energy-related research; radioactive waste disposal; and domestic energy production. It also directs research in genomics; the Human Genome Project originated in a DOE initiative. DOE sponsors more research in the physical sciences than any other U.S. federal agency, the majority of which is conducted through its system of National Laboratories. The agency is led by the United States Secretary of Energy, and its headquarters are located in Southwest Washington, D.C., on Independence Avenue in the James V. Forrestal Building, named for James Forrestal, as well as in Germantown, Maryland.

    Formation and consolidation

    In 1942, during World War II, the United States started the Manhattan Project, a project to develop the atomic bomb, under the eye of the U.S. Army Corps of Engineers. After the war, in 1946, the Atomic Energy Commission (AEC) was created to control the future of the project. The Atomic Energy Act of 1946 also created the framework for the first National Laboratories. Among other nuclear projects, the AEC produced fabricated uranium fuel cores at locations such as the Fernald Feed Materials Production Center in Cincinnati, Ohio. In 1974, the AEC gave way to the Nuclear Regulatory Commission, which was tasked with regulating the nuclear power industry, and the Energy Research and Development Administration, which was tasked with managing the nuclear weapons, naval reactor, and energy development programs.

    The 1973 oil crisis called attention to the need to consolidate energy policy. On August 4, 1977, President Jimmy Carter signed into law The Department of Energy Organization Act of 1977 (Pub.L. 95–91, 91 Stat. 565, enacted August 4, 1977), which created the Department of Energy(US). The new agency, which began operations on October 1, 1977, consolidated the Federal Energy Administration; the Energy Research and Development Administration; the Federal Power Commission; and programs of various other agencies. Former Secretary of Defense James Schlesinger, who served under Presidents Nixon and Ford during the Vietnam War, was appointed as the first secretary.

    President Carter created the Department of Energy to promote energy conservation and develop alternative sources of energy, aiming to reduce U.S. dependence on foreign oil and on fossil fuels. With America’s energy future uncertain, Carter acted quickly to have the department operating within the first year of his presidency, an urgent matter at the time because the oil crisis was causing shortages and inflation. After the Three Mile Island accident, Carter was able to intervene with the department’s help, making changes within the Nuclear Regulatory Commission to fix its management and procedures; this was possible because nuclear energy and weapons are the responsibility of the Department of Energy.

    Recent

    On March 28, 2017, a supervisor in the Office of International Climate and Clean Energy asked staff to avoid the phrases “climate change,” “emissions reduction,” or “Paris Agreement” in written memos, briefings or other written communication. A DOE spokesperson denied that phrases had been banned.

    In a May 2019 press release concerning natural gas exports from a Texas facility, the DOE used the term ‘freedom gas’ to refer to natural gas. The phrase originated from a speech made by Secretary Rick Perry in Brussels earlier that month. Washington Governor Jay Inslee decried the term as “a joke.”

    Facilities

    The Department of Energy operates a system of national laboratories and technical facilities for research and development, as follows:

    Ames Laboratory
    Argonne National Laboratory
    Brookhaven National Laboratory
    Fermi National Accelerator Laboratory
    Idaho National Laboratory
    Lawrence Berkeley National Laboratory
    Lawrence Livermore National Laboratory
    Los Alamos National Laboratory
    National Energy Technology Laboratory
    National Renewable Energy Laboratory
    Oak Ridge National Laboratory
    Pacific Northwest National Laboratory
    Princeton Plasma Physics Laboratory
    Sandia National Laboratories
    Savannah River National Laboratory
    SLAC National Accelerator Laboratory
    Thomas Jefferson National Accelerator Facility

    Other major DOE facilities include:
    Albany Research Center
    Bannister Federal Complex
    Bettis Atomic Power Laboratory – focuses on the design and development of nuclear power for the U.S. Navy
    Kansas City Plant
    Knolls Atomic Power Laboratory – operates for Naval Reactors Program Research under the DOE (not a National Laboratory)
    National Petroleum Technology Office
    Nevada Test Site
    New Brunswick Laboratory
    Office of Fossil Energy
    Office of River Protection
    Pantex
    Radiological and Environmental Sciences Laboratory
    Y-12 National Security Complex
    Yucca Mountain nuclear waste repository
    Other:

    Pahute Mesa Airstrip – Nye County, Nevada, in supporting Nevada National Security Site

     
  • richardmitnick 12:22 pm on August 24, 2021 Permalink | Reply
    Tags: , "Mountains of Data-An Unprecedented Climate Observatory to Understand the Future of Water", , , , DOE’s Los Alamos National Laboratory (US), , , Mountain watersheds provide 60 to 90% of water resources worldwide., SAIL is a research campaign managed by DOE’s "Atmospheric Radiation Measurement (ARM)" project., The Colorado River system, The Upper Colorado River powers more than $1 trillion in economic activity and provides an immense amount of hydroelectric power but it’s very understudied compared to how important it is.   

    From DOE’s Lawrence Berkeley National Laboratory (US) and DOE’s Los Alamos National Laboratory (US) : “Mountains of Data-An Unprecedented Climate Observatory to Understand the Future of Water” 

    From DOE’s Lawrence Berkeley National Laboratory (US)

    and

    LANL bloc

    DOE’s Los Alamos National Laboratory (US)

    August 24th, 2021
    Julie Chao

    First-ever “bedrock-to-atmosphere” observation system could allow scientists to predict the future of water availability in the West.

    The “megadrought” impacting the Colorado River system this year has been devastating to the 40 million people who rely on it for water. But could this drought have been predicted? Will we be able to predict the next one?

    Mountain watersheds provide 60 to 90% of water resources worldwide, but there is still much that scientists don’t know about the physical processes and interactions that affect hydrology in these ecosystems. As a result, even the best Earth system computer models struggle to predict the timing and availability of water resources emanating from mountains.

    Now a team of Department of Energy (US) scientists led by Lawrence Berkeley National Laboratory (Berkeley Lab) aims to plug that gap, with an ambitious campaign to collect a vast array of measurements that will allow scientists to better understand the future of water in the West. The Surface Atmosphere Integrated Field Laboratory (SAIL) campaign will start on September 1, when scientists flip the switch on a slew of machinery that has been amassed in the Upper Colorado River Basin.

    2
    During the SAIL campaign instruments on the tower will measure core variables related to surface meteorology and collect radiation data. Credit: John Bilberry/DOE’s Los Alamos National Laboratory(US).

    Over the course of two falls, two winters, two springs, and a summer, more than three dozen scientific instruments – including a variety of radars, lidars, cameras, balloons, and other state-of-the-art equipment – will collect a treasure trove of data on precipitation, wind, clouds, aerosols, solar and thermal energy, temperature, humidity, ozone, and more. That data can then be used to turbocharge the capabilities of Earth system models and answer many scientific questions about how, why, where, and when rain and snow will fall. In close collaboration with researchers specializing in Earth’s surface and subsurface, the SAIL campaign will help the scientific community understand how mountains extract moisture from the atmosphere and then process the water all the way down to the bedrock beneath Earth’s surface. Ultimately, this will provide the tools for scientists to better predict the future availability of water.

    “The Upper Colorado River powers more than $1 trillion in economic activity and provides an immense amount of hydroelectric power but it’s very understudied compared to how important it is,” said Berkeley Lab scientist Daniel Feldman, the lead SAIL investigator. “We’re starting to see really dramatic consequences from the changing water resources, but the details of what is actually going on in these places where the water’s coming from – those details matter, and that’s what SAIL is focused on.”

    From the Arctic to the Rockies

    SAIL is a research campaign managed by DOE’s Atmospheric Radiation Measurement (ARM) user facility, a key contributor to climate research with its stationary and mobile climate observatories located throughout the United States and around the world. Much of the equipment being used in SAIL has just returned from a one-year Arctic expedition.

    “SAIL is a timely campaign because of the ongoing drought in the Western United States,” said Sally McFarlane, DOE Program Manager for the ARM user facility. “The Colorado River is of particular concern because it supplies water to 40 million people. SAIL is bringing together data from ARM and other research programs from within DOE to ultimately help provide insights into the atmospheric processes and land-atmosphere interactions that impact rain and snow in the upper Colorado River watershed.”

    3

    The instruments are mostly housed in large containers sited in Gothic, Colorado, a picturesque old mining town near Crested Butte. The facility is hosted by the Rocky Mountain Biological Laboratory, which is dedicated to research on high-altitude ecosystems. A staff of three technicians will monitor the instruments around the clock.

    “This is a profound and incredibly unique opportunity and represents a first-of-its-kind experiment in mountainous systems worldwide, bridging the processes from the atmosphere all the way down to bedrock,” said Berkeley Lab scientist Ken Williams, the lead on-site researcher for SAIL.

    5

    3

    4

    SAIL instruments include (from top) radiometers, a rain gauge, and a Doppler lidar to measure wind velocities. Credit: John Bilberry, Los Alamos National Laboratory.

    SAIL science: better models to answer tough questions.

    Having this volume of data at a wide range of spatial and temporal scales will allow scientists to begin to understand the physical processes that may affect mountain hydrology and answer questions such as how dust, wildfire, hot drought, tree mortality, and other phenomena might affect the watershed. Ultimately, the data will be fed into Earth system models so they can “get the water balance right.”

    “Our models that predict what future water is going to be – their resolution is now about 100 kilometers [62 miles], but there’s a lot of activity that happens in 100 kilometers, a lot of terrain variability, a lot of differences in precipitation, and surface and subsurface processes,” Feldman said. “So really the question is, what are all the details that need to go into those big models, so that we can get them to get the water balance right? And that’s why this is really exciting – we’ll be measuring the inputs and the outputs at a fundamental level to develop a benchmark dataset for the scientific community to evaluate and improve their models.”

    DOE’s Atmospheric System Research (ASR) program works closely with ARM to improve understanding of the key processes that affect the Earth’s radiative balance and hydrological cycle.

    6
    Colorado River. Credit: Roy Kaltschmidt/ DOE’s Lawrence Berkeley National Laboratory (US).

    “ASR research projects during the SAIL campaign will help us learn more about the cloud, aerosol, precipitation, and radiation processes that affect the water cycle in the upper Colorado River watershed,” said Jeff Stehr, a DOE Program Manager for ASR. “Ultimately, this work will help us improve climate models so that they can be used to better understand, predict, and plan for threats to water resources in the arid West and globally.”

    SAIL leverages the substantial efforts that Berkeley Lab has already undertaken in this area: it has been leading field studies at the East River watershed of the Colorado Upper Gunnison Basin since 2014, as part of the DOE-funded Watershed Function Scientific Focus Area project. SAIL will build on that research effort, bringing together a wide range of scientific disciplines to create the world’s first bedrock-to-atmosphere mountain integrated field laboratory.

    7
    The East River watershed-a living laboratory. Credit: Roy Kaltschmidt/ DOE’s Lawrence Berkeley National Laboratory (US).

    Some of the practical questions the SAIL campaign could help answer include:

    ● How do we plan for a future of low snow or snowfall changing to rainfall? “Our planning for the Colorado River is largely based on historical weather patterns that might be changing, from snow to rain,” Feldman said.

    ● How do activities and disturbances in the forest affect water quality and water availability? “It’s not just about the total volume of water exiting these systems,” Williams said. “We’ll also be looking at how land activities – such as wildfire and forest management – affect the concentrations of constituents in the water and overall water quality.”

    ● Will dams overflow? The U.S. Bureau of Reclamation, the federal agency charged with managing dams in the western U.S., will be using the new data coming in from the radar system to help with controlled dam and reservoir operations. Feldman noted: “There have been some pretty scary situations that have arisen when rain falls on snow. The Oroville Dam disaster [in California in 2017] is just one of many such examples.”

    8
    Glen Canyon Dam. Credit: Julie Chao.

    Additionally, one of the weather radars will be located at a ski area owned by Vail Resorts, a major Colorado ski resort, which could benefit outdoor enthusiasts as well as scientists. And the research will also be useful to organizations such as water utilities and the Bureau of Reclamation that are experimenting with weather modification technologies, such as cloud-seeding.

    Other federal agencies join the bandwagon

    All the data collected by SAIL will be freely available to researchers. What’s more, a bevy of researchers from other federal agencies are undertaking field campaigns in the area with complementary research efforts.

    The National Oceanic and Atmospheric Administration (NOAA)(US), a Department of Commerce agency, has launched a project called SPLASH, or the Study of Precipitation, the Lower Atmosphere and Surface for Hydrometeorology, to improve weather and water prediction in the Colorado mountains and beyond. It will also be making detailed atmospheric co-observations in the SAIL study area.

    The Geological Survey (US), a Department of Interior agency, has developed an Upper Colorado Next Generation Water Observing System (NGWOS) to provide real-time data on water quantity and quality in more affordable and rapid ways than previously possible, and in more locations.

    “It’s quite rare for a single research question, the future of water in the West, to integrate the research activities of investigators across multiple federal agencies,” Williams noted.

    But the scale of the challenge, and the prospect of a low- to no-snow future, calls for nothing less than an all-hands-on-deck response by scientists. “We need to understand the range of risks that we’re facing moving forward,” Feldman said. “The term ‘no-analog future’ is a really big one for us.”

    9
    Staff from DOE’s Los Alamos National Laboratory(US) and Hamelmann Communications. Credit: LANL

    See the full article here.



    Bringing Science Solutions to the World

    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) (US) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (US) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a University of California-Berkeley (US) physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    History

    1931–1941

    The laboratory was founded on August 26, 1931, by Ernest Lawrence, as the Radiation Laboratory of the University of California, Berkeley, associated with the Physics Department. It centered physics research around his new instrument, the cyclotron, a type of particle accelerator for which he was awarded the Nobel Prize in Physics in 1939.

    LBNL 88 inch cyclotron.


    Throughout the 1930s, Lawrence pushed to create larger and larger machines for physics research, courting private philanthropists for funding. He was the first to develop a large team to build big projects to make discoveries in basic research. Eventually these machines grew too large to be held on the university grounds, and in 1940 the lab moved to its current site atop the hill above campus. The team he assembled during this period included two other young scientists who went on to establish large laboratories of their own: J. Robert Oppenheimer founded DOE’s Los Alamos National Laboratory (US), and Robert Wilson founded Fermi National Accelerator Laboratory (US).

    1942–1950

    Leslie Groves visited Lawrence’s Radiation Laboratory in late 1942 as he was organizing the Manhattan Project, meeting J. Robert Oppenheimer for the first time. Oppenheimer was tasked with organizing the nuclear bomb development effort and founded today’s Los Alamos National Laboratory to help keep the work secret. At the RadLab, Lawrence and his colleagues developed the technique of electromagnetic enrichment of uranium using their experience with cyclotrons. The “calutrons” (named after the University) became the basic unit of the massive Y-12 facility in Oak Ridge, Tennessee. Lawrence’s lab helped contribute to what have been judged to be the three most valuable technology developments of the war (the atomic bomb, proximity fuse, and radar). The cyclotron, whose construction was stalled during the war, was finished in November 1946. The Manhattan Project shut down two months later.

    1951–2018

    After the war, the Radiation Laboratory became one of the first laboratories to be incorporated into the Atomic Energy Commission (AEC), now the Department of Energy (US). The most highly classified work remained at Los Alamos, but the RadLab remained involved. Edward Teller suggested setting up a second lab similar to Los Alamos to compete with their designs. This led to the creation of an offshoot of the RadLab (now the Lawrence Livermore National Laboratory (US)) in 1952. Some of the RadLab’s work was transferred to the new lab, but some classified research continued at Berkeley Lab until the 1970s, when it became a laboratory dedicated only to unclassified scientific research.

    Shortly after the death of Lawrence in August 1958, the UC Radiation Laboratory (both branches) was renamed the Lawrence Radiation Laboratory. The Berkeley location became the Lawrence Berkeley Laboratory in 1971, although many continued to call it the RadLab. Gradually, another shortened form came into common usage, LBNL. Its formal name was amended to Ernest Orlando Lawrence Berkeley National Laboratory in 1995, when “National” was added to the names of all DOE labs. “Ernest Orlando” was later dropped to shorten the name. Today, the lab is commonly referred to as “Berkeley Lab”.

    The Alvarez Physics Memos are a set of informal working papers of the large group of physicists, engineers, computer programmers, and technicians led by Luis W. Alvarez from the early 1950s until his death in 1988. Over 1700 memos are available on-line, hosted by the Laboratory.

    The lab remains owned by the Department of Energy (US), with management from the University of California (US). Companies such as Intel have funded the lab’s research into computing chips.

    Science mission

    From the 1950s through the present, Berkeley Lab has maintained its status as a major international center for physics research, and has also diversified its research program into almost every realm of scientific investigation. Its mission is to solve the most pressing and profound scientific problems facing humanity, conduct basic research for a secure energy future, understand living systems to improve the environment, health, and energy supply, understand matter and energy in the universe, build and safely operate leading scientific facilities for the nation, and train the next generation of scientists and engineers.

    The Laboratory’s 20 scientific divisions are organized within six areas of research: Computing Sciences; Physical Sciences; Earth and Environmental Sciences; Biosciences; Energy Sciences; and Energy Technologies. Berkeley Lab has six main science thrusts: advancing integrated fundamental energy science; integrative biological and environmental system science; advanced computing for science impact; discovering the fundamental properties of matter and energy; accelerators for the future; and developing energy technology innovations for a sustainable future.

    Berkeley Lab operates five major National User Facilities for the DOE Office of Science (US):

    The Advanced Light Source (ALS) is a synchrotron light source with 41 beam lines providing ultraviolet, soft x-ray, and hard x-ray light to scientific experiments.

    LBNL/ALS


    The ALS is one of the world’s brightest sources of soft x-rays, which are used to characterize the electronic structure of matter and to reveal microscopic structures with elemental and chemical specificity. About 2,500 scientist-users carry out research at the ALS every year. Berkeley Lab is proposing an upgrade of the ALS which would increase the coherent flux of soft x-rays by two to three orders of magnitude.

    The DOE Joint Genome Institute (US) supports genomic research in support of the DOE missions in alternative energy, global carbon cycling, and environmental management. The JGI’s partner laboratories are Berkeley Lab, DOE’s Lawrence Livermore National Laboratory (US), DOE’s Oak Ridge National Laboratory (US)(ORNL), DOE’s Pacific Northwest National Laboratory (US) (PNNL), and the HudsonAlpha Institute for Biotechnology (US). The JGI’s central role is the development of a diversity of large-scale experimental and computational capabilities to link sequence to biological insights relevant to energy and environmental research. Approximately 1,200 scientist-users take advantage of JGI’s capabilities for their research every year.

    The LBNL Molecular Foundry (US) [above] is a multidisciplinary nanoscience research facility. Its seven research facilities focus on Imaging and Manipulation of Nanostructures; Nanofabrication; Theory of Nanostructured Materials; Inorganic Nanostructures; Biological Nanostructures; Organic and Macromolecular Synthesis; and Electron Microscopy. Approximately 700 scientist-users make use of these facilities in their research every year.

    The DOE’s NERSC National Energy Research Scientific Computing Center (US) is the scientific computing facility that provides large-scale computing for the DOE’s unclassified research programs. Its current systems provide over 3 billion computational hours annually. NERSC supports 6,000 scientific users from universities, national laboratories, and industry.

    DOE’s NERSC National Energy Research Scientific Computing Center(US) at Lawrence Berkeley National Laboratory

    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    NERSC is a DOE Office of Science User Facility.

    The DOE’s Energy Science Network (US) (ESnet) is a high-speed network infrastructure optimized for very large scientific data flows. ESnet provides connectivity for all major DOE sites and facilities, and the network transports roughly 35 petabytes of traffic each month.

    Berkeley Lab is the lead partner in the DOE’s Joint Bioenergy Institute (US) (JBEI), located in Emeryville, California. Other partners are the DOE’s Sandia National Laboratory (US), the University of California (UC) campuses of Berkeley and Davis, the Carnegie Institution for Science (US), and DOE’s Lawrence Livermore National Laboratory (US) (LLNL). JBEI’s primary scientific mission is to advance the development of the next generation of biofuels – liquid fuels derived from the solar energy stored in plant biomass. JBEI is one of three new U.S. Department of Energy (DOE) Bioenergy Research Centers (BRCs).

    Berkeley Lab has a major role in two DOE Energy Innovation Hubs. The mission of the Joint Center for Artificial Photosynthesis (JCAP) is to find a cost-effective method to produce fuels using only sunlight, water, and carbon dioxide. The lead institution for JCAP is the California Institute of Technology (US) and Berkeley Lab is the second institutional center. The mission of the Joint Center for Energy Storage Research (JCESR) is to create next-generation battery technologies that will transform transportation and the electricity grid. DOE’s Argonne National Laboratory (US) leads JCESR and Berkeley Lab is a major partner.

     
  • richardmitnick 4:49 pm on August 7, 2021 Permalink | Reply
    Tags: "Translation software enables efficient DNA data storage", ADS Codex adds additional information called error detection codes that can be used to validate the data., DNA offers a compact way to store huge amounts of data cost-effectively., DNA’s storage density is staggering., DOE’s Los Alamos National Laboratory (US), Long-term storage with cheaper media is important for the national security mission of Los Alamos and others., Los Alamos National Laboratory has developed ADS Codex to translate the 0s and 1s of digital computer files into the four-letter code of DNA., Unfortunately DNA synthesis sometimes makes mistakes in the coding so ADS Codex addresses two big obstacles to creating DNA data files.   

    From DOE’s Los Alamos National Laboratory (US) : “Translation software enables efficient DNA data storage” 

    LANL bloc

    From DOE’s Los Alamos National Laboratory (US)

    April 1, 2021 [This just turned up in RSS]

    Charles Poling
    (505) 257-8006
    cpoling@lanl.gov

    DNA offers a compact way to store huge amounts of data cost-effectively. Los Alamos National Laboratory has developed ADS Codex to translate the 0s and 1s of digital computer files into the four-letter code of DNA.

    In support of a major collaborative project to store massive amounts of data in DNA molecules, a Los Alamos National Laboratory–led team has developed a key enabling technology that translates digital binary files into the four-letter genetic alphabet needed for molecular storage.

    “Our software, the Adaptive DNA Storage Codec (ADS Codex), translates data files from what a computer understands into what biology understands,” said Latchesar Ionkov, a computer scientist at Los Alamos and principal investigator on the project. “It’s like translating from English to Chinese, only harder.”

    The work is a key part of the Intelligence Advanced Research Projects Activity (IARPA) Molecular Information Storage (MIST) program to bring cheaper, bigger, longer-lasting storage to big-data operations in government and the private sector. The short-term goal of MIST is to write 1 terabyte—a trillion bytes—and read 10 terabytes within 24 hours for $1,000. Other teams are refining the writing (DNA synthesis) and retrieval (DNA sequencing) components of the initiative, while Los Alamos is working on coding and decoding.

    “DNA offers a promising solution compared to tape, the prevailing method of cold storage, which is a technology dating to 1951,” said Bradley Settlemyer, a storage systems researcher and systems programmer specializing in high-performance computing at Los Alamos. “DNA storage could disrupt the way we think about archival storage, because the data retention is so long and the data density so high. You could store all of YouTube in your refrigerator, instead of in acres and acres of data centers. But researchers first have to clear a few daunting technological hurdles related to integrating different technologies.”

    Not lost in translation

    Compared to the traditional long-term storage method that uses pizza-sized reels of magnetic tape, DNA storage is potentially less expensive, far more physically compact, more energy efficient, and longer lasting—DNA survives for hundreds of years and doesn’t require maintenance. Files stored in DNA also can be very easily copied for negligible cost.

    DNA’s storage density is staggering. Consider this: humanity will generate an estimated 33 zettabytes by 2025—that’s 3.3 followed by 22 zeroes. All that information would fit into a ping pong ball, with room to spare. The Library of Congress has about 74 terabytes, or 74 million million bytes, of information—6,000 such libraries would fit in a DNA archive the size of a poppy seed. Facebook’s 300 petabytes (300,000 terabytes) could be stored in a half poppy seed.

    Encoding a binary file into a molecule is done by DNA synthesis. A fairly well understood technology, synthesis organizes the building blocks of DNA into various arrangements, which are indicated by sequences of the letters A, C, G, and T. They are the basis of all DNA code, providing the instructions for building every living thing on earth.

    The Los Alamos team’s ADS Codex tells exactly how to translate the binary data—all 0s and 1s—into sequences of the four letters A, C, G, and T. The Codex also handles the decoding back into binary. DNA can be synthesized by several methods, and ADS Codex can accommodate them all. The Los Alamos team has completed version 1.0 of ADS Codex and in November 2021 plans to use it to evaluate the storage and retrieval systems developed by the other MIST teams.
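    The article does not spell out ADS Codex’s actual scheme, but the core idea of a binary-to-DNA translation can be sketched in a few lines. The mapping and function names below are illustrative only; a real codec must also avoid sequences that are hard to synthesize, such as long runs of the same base.

    ```python
    # Hypothetical 2-bits-per-nucleotide mapping; the real ADS Codex
    # encoding is more sophisticated than this toy sketch.
    BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
    BASE_TO_BITS = {bits: base for base, bits in ((v, k) for k, v in BITS_TO_BASE.items())}

    def encode(data: bytes) -> str:
        """Translate binary data into a strand of A/C/G/T."""
        bits = "".join(f"{byte:08b}" for byte in data)
        return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

    def decode(strand: str) -> bytes:
        """Translate a strand of A/C/G/T back into binary data."""
        bits = "".join(BASE_TO_BITS[base] for base in strand)
        return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

    strand = encode(b"LANL")   # each byte becomes four bases
    assert decode(strand) == b"LANL"
    ```

    With two bits per base, every byte of a file costs four nucleotides, which is where DNA’s density advantage over magnetic media comes from.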

    Unfortunately, DNA synthesis sometimes makes mistakes in the coding, so ADS Codex addresses two big obstacles to creating DNA data files.

    First, compared to traditional digital systems, the error rates while writing to molecular storage are very high, so the team had to figure out new strategies for error correction. Second, errors in DNA storage arise from a different source than they do in the digital world, making the errors trickier to correct.

    “On a digital hard disk, binary errors occur when a 0 flips to a 1, or vice versa, but with DNA, you have more problems that come from insertion and deletion errors,” Ionkov said. “You’re writing A, C, G, and T, but sometimes you try to write A, and nothing appears, so the sequence of letters shifts to the left, or it types AAA. Normal error correction codes don’t work well with that.”
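    Why indels defeat ordinary error correction can be seen in a tiny, self-contained illustration (this is not ADS Codex itself): a substitution corrupts one position, but a deletion shifts every later symbol, so a position-by-position comparison sees errors everywhere after the slip.

    ```python
    original = "ACGTACGT"
    substituted = "ACGAACGT"   # one base flipped (T -> A): a bit-flip-style error
    deleted = "ACGACGT"        # one base dropped: everything after it misaligns

    def mismatches(a: str, b: str) -> int:
        """Count position-by-position disagreements (a Hamming-style view)."""
        return sum(x != y for x, y in zip(a, b))

    print(mismatches(original, substituted))  # 1
    print(mismatches(original, deleted))      # 4: the single deletion looks like many errors
    ```

    Codes designed for substitution errors budget for a few isolated flips, which is why a single insertion or deletion can overwhelm them.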

    ADS Codex adds additional information called error detection codes that can be used to validate the data. When the software converts the data back to binary, it tests whether the codes match. If they don’t, ADS Codex tries removing or adding nucleotides until the verification succeeds.
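    The article does not specify which error detection codes ADS Codex uses, but the validate-and-retry idea can be sketched as follows, with CRC32 standing in for the real code and the function names purely hypothetical:

    ```python
    import zlib

    def checksum(strand: str) -> int:
        """Tag a strand with a verification code (CRC32 as a stand-in)."""
        return zlib.crc32(strand.encode())

    def repair_single_insertion(strand, expected, length):
        """If exactly one spurious base was inserted during synthesis,
        find and remove it by trial deletion until the checksum verifies."""
        if len(strand) != length + 1:
            return None
        for i in range(len(strand)):
            candidate = strand[:i] + strand[i + 1:]
            if checksum(candidate) == expected:
                return candidate
        return None

    payload = "ACGTTGCA"
    corrupted = payload[:3] + "A" + payload[3:]   # synthesis inserted a stray A
    print(repair_single_insertion(corrupted, checksum(payload), len(payload)))
    # prints ACGTTGCA
    ```

    The same trial-edit loop runs in the other direction for deletions, inserting candidate bases until the code verifies.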

    Smart scale-up

    Large warehouses contain today’s largest data centers, with storage at the exabyte scale—that’s a trillion million bytes or more. Costing billions to build, power, and run, these digital data centers may not be the best option as the need for data storage continues to grow exponentially.

    Long-term storage with cheaper media is important for the national security mission of Los Alamos and others. “At Los Alamos, we have some of the oldest digital-only data and largest stores of data, starting from the 1940s,” Settlemyer said. “It still has tremendous value. Because we keep data forever, we’ve been at the tip of the spear for a long time when it comes to finding a cold-storage solution.”

    Settlemyer said DNA storage has the potential to be a disruptive technology because it crosses between fields ripe with innovation. The MIST project is stimulating a new coalition among legacy storage vendors who make tape, DNA synthesis companies, DNA sequencing companies, and high-performance computing organizations like Los Alamos that are driving computers into ever-larger-scale regimes of science-based simulations that yield mind-boggling amounts of data that must be analyzed.

    Deeper dive into DNA

    When most people think of DNA, they think of life, not computers. But DNA is itself a four-letter code for passing along information about an organism. DNA molecules are made from four types of bases, or nucleotides, each identified by a letter: adenine (A), thymine (T), guanine (G), and cytosine (C).

    These bases wrap in a twisted chain around each other—the familiar double helix—to form the molecule. The arrangement of these letters into sequences creates a code that tells an organism how to form. The complete set of DNA molecules makes up the genome—the blueprint of your body.

    By synthesizing DNA molecules—making them from scratch—researchers have found they can specify, or write, long strings of the letters A, C, G, and T and then read those sequences back. The process is analogous to how a computer stores information using 0s and 1s. The method has been proven to work, but reading and writing the DNA-encoded files currently takes a long time, Ionkov said.

    “Appending a single nucleotide to DNA is very slow. It takes a minute,” Ionkov said. “Imagine writing a file to a hard drive taking more than a decade. So that problem is solved by going massively parallel. You write tens of millions of molecules simultaneously to speed it up.”
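    A back-of-the-envelope sketch shows why going massively parallel closes the gap. The one-nucleotide-per-minute rate comes from the quote above; the 2-bits-per-base packing and the 50-million-strand count are illustrative assumptions.

    ```python
    NT_PER_MINUTE = 1   # serial synthesis speed per strand (from the quote)
    BITS_PER_NT = 2     # four bases can carry two bits each (assumed packing)

    def minutes_to_write(n_bytes, parallel_strands):
        """Minutes to synthesize n_bytes spread across parallel_strands strands."""
        total_nt = n_bytes * 8 / BITS_PER_NT
        return total_nt / (NT_PER_MINUTE * parallel_strands)

    one_tb = 10**12
    print(minutes_to_write(one_tb, 1) / (60 * 24 * 365))   # millions of years serially
    print(minutes_to_write(one_tb, 50_000_000) / 60)       # on the order of a thousand hours
    ```

    Serially, one terabyte would take millions of years; tens of millions of simultaneous strands bring that down toward the timescales the MIST program is targeting.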

    While various companies are working on different ways of synthesizing to address this problem, ADS Codex can be adapted to every approach.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Los Alamos National Laboratory (US) mission is to solve national security challenges through scientific excellence.

    LANL campus

    DOE’s Los Alamos National Laboratory, a multidisciplinary research institution engaged in strategic science on behalf of national security, is managed by Triad, a public-service-oriented national security science organization equally owned by its three founding members: the University of California (US), Texas A&M University (US), and Battelle Memorial Institute (Battelle), for the Department of Energy’s National Nuclear Security Administration. Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.

     