Tagged: Computing

  • richardmitnick 9:38 am on August 29, 2014 Permalink | Reply
    Tags: Computing

    From BNL Lab: “DOE ‘Knowledgebase’ Links Biologists, Computer Scientists to Solve Energy, Environmental Issues” 

    Brookhaven Lab

    August 29, 2014
    Rebecca Harrington

    With new tool, biologists don’t have to be programmers to answer big computational questions

    If biologists wanted to determine the likely way a particular gene variant might increase a plant’s yield for producing biofuels, they used to have to track down several databases and cross-reference them using complex computer code. The process would take months, especially if they weren’t familiar with the computer programming necessary to analyze the data.

    Combining information about plants, microbes, and the complex biomolecular interactions that take place inside these organisms into a single, integrated “knowledgebase” will greatly enhance scientists’ ability to access and share data, and use it to improve the production of biofuels and other useful products.

    Now they can do the same analysis in a matter of hours, using the Department of Energy’s Systems Biology Knowledgebase (KBase), a new computational platform to help the biological community analyze, store, and share data. Led by scientists at DOE’s Lawrence Berkeley, Argonne, Brookhaven, and Oak Ridge national laboratories, KBase amasses the data available on plants, microbes, microbial communities, and the interactions among them with the aim of improving the environment and energy production. The computational tools, resources, and community networking available will allow researchers to propose and test new hypotheses, predict biological behavior, design new useful functions for organisms, and perform experiments never before possible.

    “Quantitative approaches to biology were significantly developed during the last decade, and for the first time, we are now in a position to construct predictive models of biological organisms,” said computational biologist Sergei Maslov, who is principal investigator (PI) for Brookhaven’s role in the effort and Associate Chief Science Officer for the overall project, which also has partners at a number of leading universities, Cold Spring Harbor Laboratory, the Joint Genome Institute, the Environmental Molecular Sciences Laboratory, and the DOE Bioenergy Centers. “KBase allows research groups to share and analyze data generated by their project, put it into context with data generated by other groups, and ultimately come to a much better quantitative understanding of their results. Biomolecular networks, which are the focus of my own scientific research, play a central role in this generation and propagation of biological knowledge.”

    Maslov said the team is transitioning from the scientific pilot phase into the production phase and will gradually expand from the limited functionality available now. By signing up for an account, scientists can access the data and tools free of charge, opening the doors to faster research and deeper collaboration.
    Easy coding

    “We implement all the standard tools to operate on this kind of key data so a single PI doesn’t need to go through the hassle by themselves.”
    — Shinjae Yoo, assistant computational scientist working on the project at Brookhaven

    As problems in energy, biology, and the environment get bigger, the data needed to solve them becomes more complex, driving researchers to use more powerful tools to parse through and analyze this big data. Biologists across the country and around the world generate massive amounts of data — on different genes, their natural and synthetic variations, proteins they encode, and their interactions within molecular networks — yet these results often don’t leave the lab where they originated.

    “By doing small-scale experiments, scientists cannot get the system-level understanding of biological organisms relevant to the DOE mission,” said Shinjae Yoo, an assistant computational scientist working on the project at Brookhaven. “But they can use KBase for the analysis of their large-scale data. KBase will also allow them to compare and contrast their data with other key datasets generated by projects funded by the DOE and other agencies. We implement all the standard tools to operate on this kind of key data so a single PI doesn’t need to go through the hassle by themselves.”

    For non-programmers, KBase offers a “Narrative Interface,” which lets them upload their data to KBase and construct a narrative of their analysis from a series of pre-coded programs, with a human in the middle interpreting and filtering the output.

    In one pre-coded narrative, researchers can filter through naturally occurring gene variants of poplar, one of the DOE’s flagship bioenergy plant species. Scientists can discover genes associated with a reduced amount of lignin—a cell wall polymer that makes conversion of poplar biomass to biofuels more difficult. In this narrative, scientists can combine datasets from KBase with their own research data to find candidate genes, then use biomolecular networks to select the genes most likely to be related to a specific trait they’re looking for—say, genes that result in reduced lignin content, which could ease the biomass-to-biofuel conversion. And if other researchers wanted to run the same program for a different plant, they could simply put different data into the same narrative.

    “Everything is already there,” Yoo said. “You simply need to upload the data in the right format and run through several easy steps within the narrative.”
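    To give a feel for what such a narrative automates, here is a minimal Python sketch of the general shape of that workflow: filter trait-associated variants, then rank candidate genes by their neighborhood in a biomolecular network. The function names, data fields, and gene identifiers are hypothetical illustrations for this article, not the actual KBase API or poplar data.

    ```python
    # Illustrative sketch only: mimics the flow of a KBase-style narrative
    # (filter variants, then rank candidate genes by network neighborhood).
    # All function and field names here are hypothetical, not the KBase API.

    def filter_variants(variants, max_p_value=1e-5):
        """Keep variants strongly associated with the trait (e.g. low lignin)."""
        return [v for v in variants if v["p_value"] <= max_p_value]

    def rank_candidates(candidate_genes, network, trait_genes):
        """Score each candidate by how many known trait-related genes it
        neighbors in a co-expression (or other biomolecular) network."""
        scores = {}
        for gene in candidate_genes:
            neighbors = network.get(gene, set())
            scores[gene] = len(neighbors & trait_genes)
        return sorted(scores, key=scores.get, reverse=True)

    # Toy data standing in for uploaded experimental results.
    variants = [
        {"gene": "PtLIG1", "p_value": 2e-7},
        {"gene": "PtXYZ9", "p_value": 0.3},
        {"gene": "PtCCR2", "p_value": 8e-6},
    ]
    network = {"PtLIG1": {"PtCCR2", "PtCAD4"}, "PtCCR2": {"PtLIG1"}}
    known_lignin_genes = {"PtCCR2", "PtCAD4"}

    candidates = {v["gene"] for v in filter_variants(variants)}
    print(rank_candidates(candidates, network, known_lignin_genes))
    ```

    In KBase itself, steps like these are pre-coded apps strung together in the Narrative Interface, so the user supplies only the data and a few parameters.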

    For those who know how to code, KBase has the IRIS Interface, a web-based command line terminal where researchers can run and control the programs on their own, allowing scientists to analyze large volumes of data. If researchers want to learn how to do the coding themselves, KBase also has tutorials and resources to help interested scientists learn it.
    A social network

    But KBase’s most powerful resource is the community itself. Researchers are encouraged to upload their data and programs so that other users can benefit from them. This type of cooperative environment encourages sharing and feedback among researchers, so the programs, tools, and annotation of datasets can improve with other users’ input.

    Brookhaven is leading the plant team on the project, while the microbe and microbial community teams are based at other partner institutions. A computer scientist by training, Yoo said his favorite part of working on KBase has been how much biology he’s learned. Acting as a go-between for the biologists at Brookhaven, who describe what they’d like KBase to be able to do, and the computer scientists, who code the programs to make it happen, Yoo has had to understand both languages of science.

    “I’m learning plant biology. That’s pretty cool to me,” he said. “In the beginning, it was quite tough. Three years later I’ve caught up, but I still have a lot to learn.”

    Ultimately, KBase aims to interweave huge amounts of data with the right tools and user interface to enable bench scientists without programming backgrounds to answer the kinds of complex questions needed to solve the energy and environmental issues of our time.

    “We can gain systematic understanding of a biological process much faster, and also have a much deeper understanding,” Yoo said, “so we can engineer plant organisms or bacteria to improve productivity, biomass yield—and then use that information for biodesign.”

    KBase is funded by the DOE’s Office of Science. The Office of Science (SC) is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

    See the full article here.

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

    ScienceSprings relies on technology from MAINGEAR computers, Lenovo, and Dell.

     
  • richardmitnick 10:49 am on August 13, 2014 Permalink | Reply
    Tags: Computing

    From Fermilab: “Fermilab hosts first C++ school for next generation of particle physicists” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Wednesday, Aug. 13, 2014

    Leah Hesla

    Colliding particle beams without the software know-how to interpret the collisions would be like conducting an archaeological dig without the tools to sift through the artifacts. Without a way to get to the data, you wouldn’t know what you were looking at.

    Eager to keep future particle physicists well-equipped and up to date on the field’s chosen data analysis tools, the Scientific Computing Division‘s Liz Sexton-Kennedy and Sudhir Malik, now a physics professor at the University of Puerto Rico Mayagüez, organized Fermilab’s first C++ software school, which was held last week.

    “C++ is the language of high-energy physics analysis and reconstruction,” Malik said. “There was no organized effort to teach it, so we started this school.”

    Although software skills are crucial for simulating and interpreting particle physics data, physics graduate school programs don’t formally venture into the more utilitarian skill sets. Thus scientists take it upon themselves to learn C++ outside the classroom, either on their own or through discussions with their peers. Usually this self-education is absorbed through examples, whether or not the examples are flawed, Sexton-Kennedy said.

    The school aimed to set its students straight.

    It also looked to increase the numbers of particle physicists fluent in C++, a skill that is useful beyond particle physics. Fields outside academia highly value such expertise — enough that particle physicists are being lured away to jobs in industry.

    “We would lose people who were good at both physics and C++,” Sexton-Kennedy said. “The few of us who stayed behind needed to teach the next generation.”

    The next generation appears to have been waiting for just such an opportunity: Within two weeks of the C++ school opening registration, 80 students signed up. It was so popular that the co-organizers had to start a wait list.

    The software school students include undergraduates, graduate students and postdocs, all of whom work on Fermilab experiments.

    “We get most of the ideas for how to use software for event reconstruction for the LBNE near-detector reference design from these sessions,” said Xinchun Tian, a University of South Carolina postdoc working on the Long-Baseline Neutrino Experiment. “C++ is very useful for our research.”

    Fermilab NuMI tunnel

    Fermilab LBNE

    University of Wisconsin physics professor Matt Herndon led the sessions. He was assisted by 13 people: University of Notre Dame physics professor Mike Hildreth and volunteers from the SCD Scientific Software Infrastructure Department.

    Malik and Sexton-Kennedy plan to make the school material available online.

    “People have to take these tools seriously, and in high-energy physics, the skills mushroom around C++ software,” Malik said. “Students are learning C++ while growing up in the field.”

    See the full article here.

    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.

    ScienceSprings relies on technology from MAINGEAR computers, Lenovo, and Dell.

     
  • richardmitnick 12:48 pm on June 18, 2014 Permalink | Reply
    Tags: Computing

    From Princeton: “Familiar yet strange: Water’s ‘split personality’ revealed by computer model” 

    Princeton University

    June 18, 2014
    Catherine Zandonella, Office of the Dean for Research

    Seemingly ordinary, water has quite puzzling behavior. Why, for example, does ice float when most liquids crystallize into dense solids that sink?

    Using a computer model to explore water as it freezes, a team at Princeton University has found that water’s weird behaviors may arise from a sort of split personality: at very cold temperatures and above a certain pressure, water may spontaneously split into two liquid forms.

    The team’s findings were reported in the journal Nature.

    “Our results suggest that at low enough temperatures water can coexist as two different liquid phases of different densities,” said Pablo Debenedetti, the Class of 1950 Professor in Engineering and Applied Science and Princeton’s dean for research, and a professor of chemical and biological engineering.

    The two forms coexist a bit like oil and vinegar in salad dressing, except that the water separates from itself rather than from a different liquid. “Some of the molecules want to go into one phase and some of them want to go into the other phase,” said Jeremy Palmer, a postdoctoral researcher in the Debenedetti lab.

    The finding that water has this dual nature, if it can be replicated in experiments, could lead to better understanding of how water behaves at the cold temperatures found in high-altitude clouds where liquid water can exist below the freezing point in a “supercooled” state before forming hail or snow, Debenedetti said. Understanding how water behaves in clouds could improve the predictive ability of current weather and climate models, he said.

    Pressure–temperature phase diagram, including an illustration of the liquid–liquid transition line proposed for several polyamorphous materials. This liquid–liquid phase transition would be a first-order, discontinuous transition between low- and high-density liquids (labelled 1 and 2). This is analogous to polymorphism of crystalline materials, where different stable crystalline states (solid 1, 2 in diagram) of the same substance can exist (e.g. diamond and graphite are two polymorphs of carbon). Like the ordinary liquid–gas transition, the liquid–liquid transition is expected to end in a critical point. At temperatures beyond these critical points there is a continuous range of fluid states, i.e. the distinction between liquids and gases is lost. If crystallisation is avoided, the liquid–liquid transition can be extended into the metastable supercooled liquid regime.

    The new finding serves as evidence for the “liquid-liquid transition” hypothesis, first suggested in 1992 by Eugene Stanley and co-workers at Boston University and the subject of recent debate. The hypothesis states that the existence of two forms of water could explain many of water’s odd properties — not just floating ice but also water’s high capacity to absorb heat and the fact that water becomes more compressible as it gets colder.

    Princeton University researchers conducted computer simulations to explore what happens to water as it is cooled to temperatures below freezing and found that the supercooled liquid separated into two liquids with different densities. The finding agrees with a two-decade-old hypothesis to explain water’s peculiar behaviors, such as becoming more compressible and less dense as it is cooled. The X axis above indicates the range of crystallinity (Q6) from liquid water (less than 0.1) to ice (greater than 0.5) plotted against density (ρ) on the Y axis. The figure is a two-dimensional projection of water’s calculated “free energy surface,” a measure of the relative stability of different phases, with orange indicating high free energy and blue indicating low free energy. The two large circles in the orange region reveal a high-density liquid at 1.15 g/cm3 and low-density liquid at 0.90 g/cm3. The blue area represents cubic ice, which in this model forms at a density of about 0.88 g/cm3. (Image courtesy of Jeremy Palmer)
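    The “free energy surface” in the figure above is, in essence, a visitation histogram turned into an energy map: the more often the simulation finds the system at a given combination of crystallinity Q6 and density, the lower the free energy there, via F = -kT ln P. The sketch below shows that bookkeeping on made-up sample data; it illustrates the general technique only and is not the Princeton group’s analysis code, which relied on far more sophisticated sampling methods.

    ```python
    import numpy as np

    # Minimal sketch: turn sampled (Q6, density) pairs from a simulation into a
    # free-energy surface via F = -kT ln P. Toy random data stands in for the
    # real trajectory; the actual study used advanced sampling methods.
    kT = 1.0  # report energies in units of kT

    rng = np.random.default_rng(0)
    q6 = rng.normal(0.05, 0.02, size=100_000)      # mostly liquid-like order
    rho = rng.normal(1.0, 0.1, size=100_000)       # density in g/cm^3 (toy values)

    counts, q6_edges, rho_edges = np.histogram2d(q6, rho, bins=50)
    prob = counts / counts.sum()

    with np.errstate(divide="ignore"):
        free_energy = -kT * np.log(prob)           # infinite where never visited
    free_energy -= free_energy[np.isfinite(free_energy)].min()

    # Basins (local minima) in this surface correspond to distinct phases,
    # e.g. the low- and high-density liquids reported in the study.
    print(free_energy.shape, np.nanmin(free_energy))
    ```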

    At cold temperatures, the molecules in most liquids slow to a sedate pace, eventually settling into a dense and orderly solid that sinks if placed in liquid. Ice, however, floats in water due to the unusual behavior of its molecules, which as they get colder begin to push away from each other. The result is regions of lower density — that is, regions with fewer molecules crammed into a given volume — amid other regions of higher density. As the temperature falls further, the low-density regions win out, becoming so prevalent that they take over the mixture and freeze into a solid that is less dense than the original liquid.

    The work by the Princeton team suggests that these low-density and high-density regions are remnants of the two liquid phases that can coexist in a fragile, or “metastable” state, at very low temperatures and high pressures. “The existence of these two forms could provide a unifying theory for how water behaves at temperatures ranging from those we experience in everyday life all the way to the supercooled regime,” Palmer said.

    Since the proposal of the liquid-liquid transition hypothesis, researchers have argued over whether it really describes how water behaves. Experiments would settle the debate, but capturing the short-lived, two-liquid state at such cold temperatures and under pressure has proved challenging to accomplish in the lab.

    Instead, the Princeton researchers used supercomputers to simulate the behavior of water molecules — the two hydrogens and the oxygen that make up “H2O” — as the temperature dipped below the freezing point.

    The team used computer code to represent several hundred water molecules confined to a box, surrounded by an infinite number of similar boxes. As they lowered the temperature in this virtual world, the computer tracked how the molecules behaved.
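    The “box surrounded by an infinite number of similar boxes” is the standard periodic-boundary-condition trick of molecular simulation: particles that leave one face of the box re-enter through the opposite face, and distances are measured to the nearest periodic image. A minimal sketch of that bookkeeping, assuming a cubic box (illustrative only, not the simulation code used in the study):

    ```python
    import numpy as np

    # Periodic boundary conditions for a cubic box of side L: every particle that
    # leaves one face re-enters through the opposite face, and distances are
    # measured to the nearest periodic image ("minimum-image convention").
    L = 2.0  # box length in nm (toy value for a few hundred water molecules)

    def wrap(positions):
        """Map coordinates back into the primary box [0, L)."""
        return positions % L

    def minimum_image_distance(r1, r2):
        """Distance between two particles using the nearest periodic image."""
        d = r1 - r2
        d -= L * np.round(d / L)     # shift each component by at most one box
        return np.linalg.norm(d)

    print(wrap(np.array([2.3, -0.1, 0.5])))        # -> [0.3, 1.9, 0.5]
    a = np.array([0.1, 0.1, 0.1])
    b = np.array([1.9, 1.9, 1.9])
    print(minimum_image_distance(a, b))            # ~0.35, not ~3.1: images are close
    ```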

    The team found that under certain conditions — about minus 45 degrees Celsius and about 2,400 times normal atmospheric pressure — the virtual water molecules separated into two liquids that differed in density.

    The pattern of molecules in each liquid also was different, Palmer said. Although most other liquids are a jumbled mix of molecules, water has a fair amount of order to it. The molecules link to their neighbors via hydrogen bonds, which form between the oxygen of one molecule and a hydrogen of another. These molecules can link — and later unlink — in a constantly changing network. On average, each H2O links to four other molecules in what is known as a tetrahedral arrangement.

    The researchers found that the molecules in the low-density liquid also contained tetrahedral order, but that the high-density liquid was different. “In the high-density liquid, a fifth neighbor molecule was trying to squeeze into the pattern,” Palmer said.
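    A standard way to put a number on this kind of “tetrahedral order” is the orientational order parameter q of Errington and Debenedetti, which equals 1 when a molecule’s four nearest neighbors sit at the corners of a perfect tetrahedron and decreases as the local structure becomes more distorted. A minimal sketch of that calculation for a single molecule (an illustration of the general measure, not necessarily the exact analysis used in the paper):

    ```python
    import numpy as np
    from itertools import combinations

    def tetrahedral_order(center, neighbors):
        """Errington-Debenedetti orientational order parameter q for one molecule.

        q = 1 for a perfect tetrahedral arrangement of the four nearest
        neighbors; lower values indicate a more distorted local structure.
        """
        bonds = [(n - center) / np.linalg.norm(n - center) for n in neighbors]
        total = 0.0
        for b1, b2 in combinations(bonds, 2):          # 6 neighbor pairs
            total += (np.dot(b1, b2) + 1.0 / 3.0) ** 2
        return 1.0 - 3.0 / 8.0 * total

    # Perfect tetrahedron: four alternating corners of a cube around the origin.
    center = np.zeros(3)
    neighbors = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]], float)
    print(tetrahedral_order(center, neighbors))        # ~1.0
    ```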

    Normal ice (left) contains water molecules linked into ring-like structures via hydrogen bonds (dashed blue lines) between the oxygen atoms (red beads) and hydrogen atoms (white beads) of neighboring molecules, with six water molecules per ring. Each water molecule in ice also has four neighbors that form a tetrahedron (right), with a center molecule linked via hydrogen bonds to four neighboring molecules. The green lines indicate the edges of the tetrahedron. Water molecules in liquid water form distorted tetrahedrons and ring structures that can contain more or less than six molecules per ring. (Image courtesy of Jeremy Palmer)

    The researchers also looked at another facet of the two liquids: the tendency of the water molecules to form rings via hydrogen bonds. Ice consists of six water molecules per ring. Calculations by Fausto Martelli, a postdoctoral research associate advised by Roberto Car, the Ralph W. *31 Dornte Professor in Chemistry, found that in this computer model the average number of molecules per ring decreased from about seven in the high-density liquid to just above six in the low-density liquid. It then climbed slightly before declining again to six molecules per ring in ice, suggesting that there is more to be discovered about how water molecules behave during supercooling.

    A better understanding of water’s behavior at supercooled temperatures could lead to improvements in modeling the effect of high-altitude clouds on climate, Debenedetti said. Because water droplets reflect and scatter the sunlight coming into the atmosphere, clouds play a role in whether the sun’s energy is reflected away from the planet or is able to enter the atmosphere and contribute to warming. Additionally, because water goes through a supercooled phase before forming hail or snow, such research may aid strategies for preventing ice from forming on airplane wings.

    “The research is a tour de force of computational physics and provides a splendid academic look at a very difficult problem and a scholarly controversy,” said C. Austen Angell, professor of chemistry and biochemistry at Arizona State University, who was not involved in the research. “Using a particular computer model, the Debenedetti group has provided strong support for one of the theories that can explain the outstanding properties of real water in the supercooled region.”

    In their computer simulations, the team used an updated version of a water model noted for its ability to capture many of water’s unusual behaviors. The model was first developed in 1974 by Frank Stillinger, then at Bell Laboratories in Murray Hill, N.J., and now a senior chemist at Princeton, and by Aneesur Rahman, then at Argonne National Laboratory. The same model was used to develop the liquid-liquid transition hypothesis.

    Collectively, the work took several million computer hours, which would take several human lifetimes using a typical desktop computer, Palmer said. In addition to the initial simulations, the team verified the results using six calculation methods. The computations were performed at Princeton’s High-Performance Computing Research Center’s Terascale Infrastructure for Groundbreaking Research in Science and Engineering (TIGRESS).

    The team included Yang Liu, who earned her doctorate at Princeton in 2012, and Athanassios Panagiotopoulos, the Susan Dod Brown Professor of Chemical and Biological Engineering.

    Support for the research was provided by the National Science Foundation (CHE 1213343) and the U.S. Department of Energy (DE-SC0002128 and DE-SC0008626).

    The article, Metastable liquid-liquid transition in a molecular model of water, by Jeremy C. Palmer, Fausto Martelli, Yang Liu, Roberto Car, Athanassios Z. Panagiotopoulos and Pablo G. Debenedetti, appeared in the journal Nature.

    See the full article here.

    About Princeton: Overview

    Princeton University is a vibrant community of scholarship and learning that stands in the nation’s service and in the service of all nations. Chartered in 1746, Princeton is the fourth-oldest college in the United States. Princeton is an independent, coeducational, nondenominational institution that provides undergraduate and graduate instruction in the humanities, social sciences, natural sciences and engineering.

    As a world-renowned research university, Princeton seeks to achieve the highest levels of distinction in the discovery and transmission of knowledge and understanding. At the same time, Princeton is distinctive among research universities in its commitment to undergraduate teaching.

    Today, more than 1,100 faculty members instruct approximately 5,200 undergraduate students and 2,600 graduate students. The University’s generous financial aid program ensures that talented students from all economic backgrounds can afford a Princeton education.

    Princeton Shield

    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 7:31 am on May 15, 2014 Permalink | Reply
    Tags: Computing

    From Sandia Lab: “The brain: key to a better computer”

    Sandia Lab

    May 15, 2014
    Sue Holmes, sholmes@sandia.gov, (505) 844-6362

    Your brain is incredibly well-suited to handling whatever comes along, plus it’s tough and operates on little energy. Those attributes — dealing with real-world situations, resiliency and energy efficiency — are precisely what might be possible with neuro-inspired computing.

    “Today’s computers are wonderful at bookkeeping and solving scientific problems often described by partial differential equations, but they’re horrible at just using common sense, seeing new patterns, dealing with ambiguity and making smart decisions,” said John Wagner, cognitive sciences manager at Sandia National Laboratories.

    In contrast, the brain is “proof that you can have a formidable computer that never stops learning, operates on the power of a 20-watt light bulb and can last a hundred years,” he said.

    Although brain-inspired computing is in its infancy, Sandia has included it in a long-term research project whose goal is future computer systems. Neuro-inspired computing seeks to develop algorithms that would run on computers that function more like a brain than a conventional computer.

    Sandia National Laboratories researchers are drawing inspiration from neurons in the brain, such as these green fluorescent protein-labeled neurons in a mouse neocortex, with the aim of developing neuro-inspired computing systems. Although brain-inspired computing is in its infancy, Sandia has included it in a long-term research project whose goal is future computer systems. (Photo by Frances S. Chance, courtesy of Janelia Farm Research Campus)

    “We’re evaluating what the benefits would be of a system like this and considering what types of devices and architectures would be needed to enable it,” said microsystems researcher Murat Okandan.

    Sandia’s facilities and past research make the laboratories a natural for this work: its Microsystems & Engineering Science Applications (MESA) complex, a fabrication facility that can build massively interconnected computational elements; its computer architecture group and its long history of designing and building supercomputers; strong cognitive neurosciences research, with expertise in such areas as brain-inspired algorithms; and its decades of work on nationally important problems, Wagner said.

    New technology often is spurred by a particular need. Early conventional computing grew from the need for neutron diffusion simulations and weather prediction. Today, big data problems and remote autonomous and semiautonomous systems need far more computational power and better energy efficiency.

    Neuro-inspired computers would be ideal for robots, remote sensors

    Neuro-inspired computers would be ideal for operating such systems as unmanned aerial vehicles, robots and remote sensors, and solving big data problems, such as those the cyber world faces and analyzing transactions whizzing around the world, “looking at what’s going where and for what reason,” Okandan said.

    Such computers would be able to detect patterns and anomalies, sensing what fits and what doesn’t. Perhaps the computer wouldn’t find the entire answer, but could wade through enormous amounts of data to point a human analyst in the right direction, Okandan said.

    “If you do conventional computing, you are doing exact computations and exact computations only. If you’re looking at neurocomputation, you are looking at history, or memories in your sort of innate way of looking at them, then making predictions on what’s going to happen next,” he said. “That’s a very different realm.”

    Modern computers are largely calculating machines with a central processing unit and memory that stores both a program and data. They take a command from the program and data from the memory to execute the command, one step at a time, no matter how fast they run. Parallel and multicore computers can do more than one thing at a time but still use the same basic approach and remain very far removed from the way the brain routinely handles multiple problems concurrently.

    The architecture of neuro-inspired computers would be fundamentally different, uniting processing and storage in a network architecture “so the pieces that are processing the data are the same pieces that are storing the data, and the data will be processed with all nodes functioning concurrently,” Wagner said. “It won’t be a serial step-by-step process; it’ll be this network processing everything all at the same time. So it will be very efficient and very quick.”

    Unlike today’s computers, neuro-inspired computers would inherently use the critical notion of time. “The things that you represent are not just static shots, but they are preceded by something and there’s usually something that comes after them,” creating episodic memory that links what happens when. This requires massive interconnectivity and a unique way of encoding information in the activity of the system itself, Okandan said.
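    As a loose illustration of what “uniting processing and storage” and “encoding information in the activity of the system itself” can look like, the sketch below steps a toy network of leaky integrate-and-fire neurons through time: every node keeps its own state and connection weights, and what the network does next depends on its recent history. This is a generic textbook-style model written for this article, not Sandia’s architecture.

    ```python
    import numpy as np

    # Toy leaky integrate-and-fire network: each neuron stores its own membrane
    # state and incoming weights (storage) and updates them locally every time
    # step (processing) -- a loose, generic illustration of neuro-inspired
    # computing, not any particular Sandia design.
    rng = np.random.default_rng(1)
    n = 100
    weights = rng.normal(0.0, 0.3, size=(n, n))   # who talks to whom, how strongly
    voltage = np.zeros(n)                         # per-neuron state ("memory")
    leak, threshold = 0.9, 1.0

    spike_counts = np.zeros(n)
    for t in range(200):                          # time is intrinsic to the model
        external_input = rng.random(n) * 0.2
        spikes = (voltage >= threshold).astype(float)
        voltage = leak * voltage * (1 - spikes)   # reset neurons that fired
        voltage += weights @ spikes + external_input
        spike_counts += spikes

    print("mean firing rate per step:", spike_counts.mean() / 200)
    ```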

    More neurosciences research opens more possibilities for brain-inspired computing

    Each neuron in a neural structure can have connections coming in from about 10,000 neurons, which in turn can connect to 10,000 other neurons in a dynamic way. Conventional computer transistors, on the other hand, connect on average to four other transistors in a static pattern.

    Computer design has drawn from neuroscience before, but an explosion in neuroscience research in recent years opens more possibilities. While it’s far from a complete picture, Okandan said what’s known offers “more guidance in terms of how neural systems might be representing data and processing information” and clues about replicating those tasks in a different structure to address problems impossible to solve on today’s systems.

    Brain-inspired computing isn’t the same as artificial intelligence, although a broad definition of artificial intelligence could encompass it.

    “Where I think brain-inspired computing can start differentiating itself is where it really truly tries to take inspiration from biosystems, which have evolved over generations to be incredibly good at what they do and very robust against a component failure. They are very energy efficient and very good at dealing with real-world situations. Our current computers are very energy inefficient, they are very failure-prone due to components failing and they can’t make sense of complex data sets,” Okandan said.

    Computers today do required computations without any sense of what the data is — it’s just a representation chosen by a programmer.

    “Whereas if you think about neuro-inspired computing systems, the structure itself will have an internal representation of the datastream that it’s receiving and previous history that it’s seen, so ideally it will be able to make predictions on what the future states of that datastream should be, and have a sense for what the information represents,” Okandan said.

    He estimates a project dedicated to brain-inspired computing will develop early examples of a new architecture in the first several years, but said higher levels of complexity could take decades, even with the many efforts around the world working toward the same goal.

    “The ultimate question is, ‘What are the physical things in the biological system that let you think and act, what’s the core essence of intelligence and thought?’ That might take just a bit longer,” he said.

    For more information, visit the 2014 Neuro-Inspired Computational Elements Workshop website.

    See the full article here.

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 8:31 am on April 20, 2014 Permalink | Reply
    Tags: Computing

    From Berkeley Lab: “Discovery of New Semiconductor Holds Promise for 2D Physics and Electronics” 


    Berkeley Lab

    March 20, 2014
    Lynn Yarris (510) 486-5375 lcyarris@lbl.gov

    From super-lubricants, to solar cells, to the fledgling technology of valleytronics, there is much to be excited about with the discovery of a unique new two-dimensional semiconductor, rhenium disulfide, by researchers at Berkeley Lab’s Molecular Foundry. Rhenium disulfide, unlike molybdenum disulfide and other dichalcogenides, behaves electronically as if it were a 2D monolayer even as a 3D bulk material. This not only opens the door to 2D electronic applications with a 3D material, it also makes it possible to study 2D physics with easy-to-make 3D crystals.

    Nano-beam electron diffraction pattern of rhenium disulfide with a zoom-in insert image reveals a quasi-hexagonal reflection pattern.

    “Rhenium disulfide remains a direct-bandgap semiconductor, and its photoluminescence intensity increases while its Raman spectrum remains unchanged, even with the addition of increasing numbers of layers,” says Junqiao Wu, a physicist with Berkeley Lab’s Materials Sciences Division who led this discovery. “This makes bulk crystals of rhenium disulfide an ideal platform for probing 2D excitonic and lattice physics, circumventing the challenge of preparing large-area, single-crystal monolayers.”

    Wu, who is also a professor with the University of California, Berkeley’s Department of Materials Science and Engineering, headed a large international team of collaborators who used the facilities at the Molecular Foundry, a U.S. Department of Energy (DOE) national nanoscience center, to prepare and characterize individual monolayers of rhenium disulfide. Through a variety of spectroscopy techniques, they studied these monolayers both as stacked multilayers and as bulk materials. Their study revealed that the uniqueness of rhenium disulfide stems from a disruption in its crystal lattice symmetry called a Peierls distortion.

    Sefaattin Tongay was the lead author of a Nature Communications paper announcing the discovery of rhenium disulfide. (Photo by Roy Kaltschmidt)

    “Semiconducting transition metal dichalcogenides consist of monolayers held together by weak forces,” says Sefaattin Tongay, lead author of a paper describing this research in Nature Communications for which Wu was the corresponding author. The paper was titled Monolayer behaviour in bulk ReS2 due to electronic and vibrational decoupling.

    “Typically the monolayers in a semiconducting transition metal dichalcogenide, such as molybdenum disulfide, are relatively strongly coupled, but isolated monolayers show large changes in electronic structure and lattice vibration energies,” Tongay says. “The result is that in bulk these materials are indirect-gap semiconductors and in the monolayer they are direct gap.”

    What Tongay, Wu and their collaborators found in their characterization studies was that rhenium disulfide contains seven valence electrons as opposed to the six valence electrons of molybdenum disulfide and other transition metal dichalcogenides. This extra valence electron prevents strong interlayer coupling between multiple monolayers of rhenium disulfide.

    “The extra electron is eventually shared between two rhenium atoms, which causes the atoms to move closer to one another, forming quasi-one-dimensional chains within each layer and creating the Peierls distortion in the lattice,” Tongay says. “Once the Peierls distortion takes place, interlayer registry is largely lost, resulting in weak interlayer coupling and monolayer behavior in the bulk.”

    Atomic structure of a monolayer of rhenium disulfide shows the dimerization of the rhenium atoms as a result of the Peierls distortion, forming a rhenium chain denoted by the red zigzag line.

    Rhenium disulfide’s weak interlayer coupling should make this material highly useful in tribology and other low-friction applications. Since rhenium disulfide also exhibits strong interactions between light and matter that are typical of monolayer semiconductors, and since the bulk rhenium disulfide behaves as if it were a monolayer, the new material should also be valuable for solar cell applications. It might also be a less expensive alternative to diamond for valleytronics.

    In valleytronics, the wave quantum number of the electron in a crystalline material is used to encode information. This number is derived from the spin and momentum of an electron moving through a crystal lattice as a wave with energy peaks and valleys. Encoding information when the electrons reside in these minimum energy valleys offers a highly promising potential new route to quantum computing and ultrafast data-processing.

    “Rhenium atoms have a relatively large atomic weight, which means electron spin-orbit interactions are significant,” Tongay says. “This could make rhenium disulfide an ideal material for valleytronics applications.”

    The collaboration is now looking at ways to tune the properties of rhenium disulfide in both monolayer and bulk crystals through engineered defects in the lattice and selective doping. They are also looking to alloy rhenium disulfide with other members of the dichalcogenide family.

    Other authors of the Nature Communications paper in addition to Wu and Tongay were Hasan Sahin, Changhyun Ko, Alex Luce, Wen Fan, Kai Liu, Jian Zhou, Ying-Sheng Huang, Ching-Hwa Ho, Jinyuan Yan, Frank Ogletree, Shaul Aloni, Jie Ji, Shushen Li, Jingbo Li, and F. M. Peeters.

    This research was primarily supported by the DOE Office of Science.

    See the full article here.

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 11:35 am on March 31, 2014 Permalink | Reply
    Tags: Computing

    From Brookhaven Lab: “Generations of Supercomputers Pin Down Primordial Plasma” 

    Brookhaven Lab

    March 31, 2014
    Justin Eure

    As one groundbreaking IBM system retires, a new Blue Gene supercomputer comes online at Brookhaven Lab to help precisely model subatomic interactions

    Brookhaven Lab physicists Peter Petreczky and Chulwoo Jung with technology architect Joseph DePace—who oversees operations and maintenance of the Lab’s supercomputers—in front of the Blue Gene/Q supercomputer.

    Supercomputers are constantly evolving to meet the increasing complexity of calculations ranging from global climate models to cosmic inflation. The bigger the puzzle, the more scientists and engineers push the limits of technology forward. Imagine, then, the advances driven by scientists seeking the code behind our cosmos.

    This mutual push and pull of basic science and technology plays out every day among physicists at the U.S. Department of Energy’s Brookhaven National Laboratory. The Lab’s Lattice Gauge Theory Group—led by physicist Frithjof Karsch—hunts for equations to describe the early universe and the forces binding matter together. Their search spans generations of supercomputers and parallels studies of the primordial plasma discovered and explored at Brookhaven’s Relativistic Heavy Ion Collider (RHIC).

    RHIC

    “You need more than just pen and paper to recreate the quantum-scale chemistry unfolding at the foundations of matter—you need supercomputers,” said Brookhaven Lab physicist Peter Petreczky. “The racks of IBM’s Blue Gene/L hosted here just retired after six groundbreaking years, but the cutting-edge Blue Gene/Q is now online to keep pushing nuclear physics forward.”

    Equations to Describe the Dawn of Time

    When RHIC smashes gold ions together at nearly the speed of light, the trillion-degree collisions melt the protons inside each atom. The quarks and gluons inside then break free for a fraction of a second, mirroring the ultra-hot conditions of the universe just microseconds after the Big Bang. This remarkable matter, called quark-gluon plasma, surprised physicists by exhibiting zero viscosity—it behaved like a perfect, friction-free liquid. But this raised new questions: how and why?

    Cosmic Microwave Background by ESA/Planck

    Armed with the right equations of state, scientists can begin to answer that question and model that perfect plasma at each instant. This very real quest revolves in part around the very artificial: computer simulations.

    “If our equations are accurate, the laws of physics hold up through the simulations and we gain a new and nuanced vocabulary to characterize and predict truly fundamental interactions,” Karsch said. “If we’re wrong, the simulation produces something very different from reality. We’re in the business of systematically eliminating uncertainties.”

    Building a Quantum Grid

    Quantum chromodynamics (QCD) is the theoretical framework that describes these particle interactions on the subatomic scale. But even the most sophisticated computer can’t replicate the full QCD complexity that plays out in reality.

    “To split that sea of information into discrete pieces, physicists developed a four-dimensional grid of space-time points called the lattice,” Petreczky said. “We increase the density of this lattice as technology evolves, because the closer we pack our lattice-bound particles, the closer we approximate reality.”

    Imagine a laser grid projected into a smoke-filled room, transforming that swirling air into individual squares. Each intersection in that grid represents a data point that can be used to simulate the flow of the actual smoke. In fact, scientists use this same lattice-based approximation in fields as diverse as climate science and nuclear fusion.

    As QCD scientists incorporated more and more subatomic details into an ever-denser grid—including the full range of quark and gluon types—the mathematical demands leapt exponentially.
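    To get a rough feel for why the demands “leapt exponentially,” one can simply count the numbers a lattice simulation has to carry. The sketch below tallies the storage needed for the gluon (link) and quark fields on lattices with the spatial sizes mentioned in this article, under simple textbook assumptions that are not from the article: a temporal extent of 16 points, double precision, SU(3) gauge links (18 real numbers each, four per site), and a single quark field (24 real numbers per site). Production codes store considerably more, and the cost of the calculations grows much faster than the storage.

    ```python
    # Back-of-the-envelope count of the numbers a lattice QCD simulation stores,
    # for the spatial lattice sizes mentioned in the article. Assumptions (not
    # from the article): temporal extent Nt = 16, double precision (8 bytes per
    # real number), SU(3) gauge links (18 reals each, 4 per site) and a single
    # quark field (24 reals per site).
    BYTES_PER_REAL = 8
    REALS_PER_SITE = 4 * 18 + 24   # gauge links + one quark field

    for nx in (16, 32, 64):
        nt = 16                                    # assumed temporal extent
        sites = nx**3 * nt
        gigabytes = sites * REALS_PER_SITE * BYTES_PER_REAL / 1e9
        print(f"{nx}^3 x {nt} lattice: {sites:>12,} sites, ~{gigabytes:8.2f} GB")
    ```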
    QCD on a Chip

    Physicist Norman Christ, a Columbia University professor and frequent Brookhaven Lab collaborator, partnered with supercomputing powerhouse IBM to tackle the unprecedented hardware challenge for QCD simulations. The new system would need a relatively small physical footprint, good temperature control, and a combination of low power and high processor density.

    The result was the groundbreaking QCDOC, or QuantumChromoDynamics On a Chip. QCDOC came online in 2004 with a processing power of 10 teraflops, or 10 trillion floating-point operations per second, a common measure of computing performance.

    “The specific needs of Christ and his collaborators actually revolutionized and rejuvenated supercomputing in this country,” said physicist Berndt Mueller, who leads Brookhaven Lab’s Nuclear and Particle Physics directorate. “The new architecture developed for QCD simulations was driven by these fundamental physics questions. That group laid the foundation for generations of IBM supercomputers that routinely rank among the world’s most powerful.”
    Generations of Giants

    The first QCDOC simulations featured lattices with 16 points in each spatial direction—a strong starting point and testing ground for QCD hypotheses, but a far cry from definitive. Building on QCDOC, IBM launched its Blue Gene series of supercomputers. In fact, the chief architect for all three generations of these highly scalable, general-purpose machines was physicist Alan Gara, who did experimental work at Fermilab [Tevatron] and CERN’s Large Hadron Collider before being recruited by IBM.

    “We had the equation of state for quark-gluon plasma prepared for publication in 2007 based on QCDOC calculations,” Petreczky said, “but it was not as accurate as we hoped. Additional work on the newly installed Blue Gene/L gave us confidence that we were on the right track.”

    The New York Blue system—led by Stony Brook University and Brookhaven Lab with funding from New York State—added 18 racks of Blue Gene/L and two racks of Blue Gene/P in 2007. This 100-teraflop boost doubled the QCD model density to 32 lattice points and ran simulations some 10 million times more complex. Throughout this period, lattice theorists also used Blue Gene supercomputers at DOE’s Argonne and Lawrence Livermore national labs.

    The 600-teraflop Blue Gene/Q came online at Brookhaven Lab in 2013, packing the processing power of 18 racks of Blue Gene/P into just three racks. This new system signaled the end for Blue Gene/L, which went offline in January 2014. Both QCDOC and Blue Gene/Q were developed in close partnership with RIKEN, a leading Japanese research institution.

    “Exciting as it is, moving across multiple systems is also a bit of a headache,” Petreczky said. “Before we get to the scientific simulations, there’s a long transition period and a tremendous amount of code writing. Chulwoo Jung, one of our group members, takes on a lot of that crucial coding.”

    Pinning Down Fundamental Fabric

    Current simulations of QCD matter feature 64 spatial lattice points in each direction, allowing physicists an unprecedented opportunity to map the quark-gluon plasma created at RHIC and explore the strong nuclear force. The Lattice Gauge Theory collaboration continues to run simulations and plans to extend the equations of state to cover all the energy levels achieved at both RHIC and the Large Hadron Collider at CERN.

    The equations already ironed out by Brookhaven’s theorists apply to everything from RHIC’s friction-free superfluid to physics beyond the standard model—including the surprising spin of muons in the g-2 experiment and rare meson decays at Fermilab.

    “This is the beauty of pinning down fundamental interactions: the foundations of matter are literally universal,” Petreczky said. “And only a few groups in the world are describing this particular aspect of our universe.”

    Additional Brookhaven Lab lattice theorists include Michael Creutz, Christoph Lehner, Taku Izubuchi, Swagato Mukherjee, and Amarjit Soni.

    The Brookhaven Computational Science Center (CSC) hosts the IBM Blue Gene supercomputers and Intel clusters used by scientists across the Lab. The CSC brings together researchers in biology, chemistry, physics and medicine with applied mathematicians and computer scientists to take advantage of the new opportunities for scientific discovery made possible by modern computers. The CSC is supported by DOE’s Office of Science.

    See the full article here.

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 11:33 am on March 11, 2014 Permalink | Reply
    Tags: Computing

    From Fermilab: “Network connections: universities compute for particle physics” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Tuesday, March 11, 2014
    Clementine Jones, Computing Sector Communications

    The University of Notre Dame has an active high-energy physics computing program and is one of a number of schools contributing to computing for the CMS experiment. Pictured here are professors Mike Hildreth (left) and Kevin Lannon, who is also computing liaison for the US CMS collaboration board. Photo courtesy of Kevin Lannon, University of Notre Dame

    Building on Fermilab Today’s University Profiles, the Computing Sector followed up with several 2012 participants to inquire about the roles their computing departments play in particle physics research programs. We questioned randomly selected universities for two articles, and this feature focuses on the seven responses to our second set of questions.

    The first article concentrated on the value of collaboration, demonstrated by the contribution of software development or local computing resources from different universities’ computing departments to various high-energy physics research groups. Unsurprisingly, this remains a theme here: Six of the universities are currently Open Science Grid members; six are CMS or ATLAS Tier-3 or Tier-2 centers, with the seventh almost finished installing a Tier-3 center; and all have contributed software to experiments. Effective collaboration also encompasses research to improve resources. This can be incremental, enhancing precision or efficiency, or revolutionary, with novel approaches stemming from R&D efforts that create new experiment and analysis opportunities. The following are selected examples of R&D work from the universities’ responses.

    Several are investigating future processing, infrastructure and storage requirements. Professor Markus Wobisch referred to Louisiana Tech University’s interest in “multicore particle physics computing and high-availability applications.” Research scientist Shawn McKee said that the University of Michigan is researching “next-generation infrastructures, including software-defined networking, new file systems, and tools and techniques for agilely provisioning, configuring and maintaining their infrastructure and virtualization capabilities.” The University of Notre Dame’s Professor Michael Hildreth is lead principal investigator on a project looking into “data preservation issues for the future.” He is also working with others, including Professor Kevin Lannon, to “develop techniques for opportunistic computing.”

    Others focused even further on grid infrastructure. Professor Brad Abbott emphasized the University of Oklahoma’s early involvement in grid computing R&D, having had “one of the first US ATLAS grid computing test-bed setups” and being the first site to adopt an existing high-performance computing cluster as part of ATLAS. Professors Ian Shipsey and Norbert Neumeister described Purdue University’s membership in the ExTENCI project to provide an interface between the Open Science Grid and XSEDE “to bridge the efforts of these two cyberinfrastructure projects.” Professor George Alverson said that Northeastern University personnel are currently involved as grid users and testers and that the institution hopes soon to begin grid integration work.

    Finally, Professor Sung-Won Lee and postdoctoral research fellow Chris Cowden said a group at Texas Tech University is studying further applications of their FFTJet algorithm, “which applies image processing techniques to jet finding in high-energy physics experiments,” as well as “developing an application of the Geant4 simulation toolkit to study the CMS Phase II detector upgrade designs.”

    Whether fine tuning or paradigm shifting, these projects represent advances in computing capabilities and applications, the benefits of which are felt across the field of high-energy physics.

    See the full article here.

    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 3:14 pm on January 15, 2014 Permalink | Reply
    Tags: Computing

    From Fermilab: “From the Scientific Computing Division – Intensity Frontier experiments develop insatiable appetite” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Rob Roser, head of the Scientific Computing Division, wrote this column.

    The neutrino and muon experiments at Fermilab are getting more demanding! They have reached a level of sophistication and precision at which the computing resources currently available at Fermilab are no longer sufficient. The solution: the Scientific Computing Division is now introducing grid and cloud services to satisfy those experiments’ appetite for large amounts of data and computing time.

    An insatiable appetite for computing resources is not new to Fermilab. Both Tevatron experiments and the CMS experiment require computing resources that far exceed our on-site capacity to successfully perform their science. As a result, the scientific collaborations have been working closely with us over many years to leverage computing capabilities at universities and other laboratories. Now the demand from our Intensity Frontier experiments has reached this level.

    The Scientific Computing Services quadrant under the leadership of Margaret Votava has worked very hard over the past year with various computing organizations to provide experiments with the capability to run their software at remote locations, transfer data and bring the results back to Fermilab.

    See much more in the full article here.

    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 9:05 am on October 22, 2013 Permalink | Reply
    Tags: Computing

    From D.O.E. Pulse: “A toolbox to simulate the Big Bang and beyond” 


    October 14, 2013
    Submitted by DOE’s Fermilab

    The universe is a vast and mysterious place, but thanks to high-performance computing technology, scientists around the world are beginning to understand it better. They are using supercomputers to simulate how the Big Bang generated the seeds that led to the formation of galaxies such as the Milky Way.

    Courtesy of Ralf Kaehler and Tom Abel (visualization); John Wise and Tom Abel (numeric simulation).

    A new project involving DOE’s Argonne Lab, Fermilab and Berkeley Lab will allow scientists to study this vastness in greater detail with a new cosmological simulation analysis toolbox.

    Modeling the universe with a computer is very difficult, and the output of those simulations is typically very large. By anyone’s standards, this is “big data,” as each of these data sets can require hundreds of terabytes of storage space. Efficient storage and sharing of these huge data sets among scientists is paramount. Many different scientific analyses and processing sequences are carried out with each data set, making it impractical to rerun the simulations for each new study.

    This past year Argonne Lab, Fermilab and Berkeley Lab began a unique partnership on an ambitious advanced-computing project. Together the three labs are developing a new, state-of-the-art cosmological simulation analysis toolbox that takes advantage of DOE’s investments in supercomputers and specialized high-performance computing codes. Argonne’s team is led by Salman Habib, principal investigator, and Ravi Madduri, system designer. Jim Kowalkowski and Richard Gerber are the team leaders at Fermilab and Berkeley Lab.

    See the full article here.

    DOE Pulse highlights work being done at the Department of Energy’s national laboratories. DOE’s laboratories house world-class facilities where more than 30,000 scientists and engineers perform cutting-edge research spanning DOE’s science, energy, national security and environmental quality missions. DOE Pulse is distributed twice each month.

    DOE Banner


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 11:39 am on October 11, 2013 Permalink | Reply
    Tags: Computing

    From CERN: “Cloud and Grid: more connected than you might think?”


    You may perceive the grid and the cloud to be two separate technologies: the grid as physical hardware and the cloud as virtual hardware simulated by running software. So how are the grid and the cloud being integrated at CERN?

    CERN Computer Centre.

    The LHC generates a large amount of data that needs to be stored, distributed and analysed. Grid technology is used for the mass physical data processing needed for the LHC supported by many data centres around the world as part of the Worldwide LHC Computing Grid. Beyond the technology itself, the Grid represents a collaboration of all these centres working towards a common goal.

    Cloud technology uses virtualisation techniques, which allow one physical machine to represent many virtual machines. This technology is being used today to develop and deploy a range of IT services (such as Service Now*, a cloud-hosted service), allowing for a great deal of operational flexibility. Such services are available at CERN through OpenStack*.

    “The physics community is looking at cloud solutions in order to be able to extend grid services across internal and external clouds,” says David Foster, Deputy Head of IT at CERN. “Layering grid services, for example Batch, on top of a cloud infrastructure is increasingly popular.”
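    As a rough illustration of “layering grid services on top of a cloud infrastructure,” the sketch below shows the kind of control loop an elastic batch service might run: watch the pending-job queue and grow or shrink a pool of virtual machines to match. The provision_vm, terminate_vm, and get_queue_depth functions are hypothetical placeholders written for this article, not the OpenStack or CERN batch-system APIs.

    ```python
    # Illustrative control loop for an elastic batch pool running on cloud
    # resources. provision_vm(), terminate_vm() and get_queue_depth() are
    # hypothetical placeholders, not real OpenStack or CERN batch-service calls.
    import time

    JOBS_PER_VM = 8          # assumed capacity of one virtual worker node
    MAX_VMS = 100

    def provision_vm():      # placeholder: would call a cloud API
        print("starting a new virtual worker node")

    def terminate_vm():      # placeholder: would call a cloud API
        print("shutting down an idle virtual worker node")

    def get_queue_depth():   # placeholder: would query the batch system
        return 42

    def rebalance(current_vms):
        """Grow or shrink the VM pool to match the pending job queue."""
        wanted = min(MAX_VMS, -(-get_queue_depth() // JOBS_PER_VM))  # ceiling division
        while current_vms < wanted:
            provision_vm()
            current_vms += 1
        while current_vms > wanted:
            terminate_vm()
            current_vms -= 1
        return current_vms

    vms = 0
    for _ in range(3):       # in reality this would run continuously
        vms = rebalance(vms)
        time.sleep(0.1)
    print("pool size:", vms)
    ```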

    See the full article here.

    *I have no access.

    Meet CERN in a variety of places:

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    ALICE
    CMS
    LHCb

    LHC

    Quantum Diaries


    ScienceSprings is powered by MAINGEAR computers

     