Tagged: US Department of Energy

  • richardmitnick 3:33 pm on November 21, 2014 Permalink | Reply
    Tags: US Department of Energy

    From BNL: “Women @ Energy: Meifeng Lin” 

    Brookhaven Lab

    November 14, 2014
    Joe Gettler

    Meifeng Lin is a theoretical particle physicist and a computational scientist at the Computational Science Center of Brookhaven National Laboratory.

    Meifeng Lin is a theoretical particle physicist and a computational scientist at the Computational Science Center of Brookhaven National Laboratory. Her research focuses on advancing scientific discovery through high-performance computing. One focus area is lattice gauge theory, in which large-scale Monte Carlo simulations of the strong interaction between the subatomic particles called quarks and gluons are performed to study fundamental symmetries of the universe and the internal structure of hadronic matter. She obtained her Bachelor of Science degree in physics from Peking University in Beijing, China. After earning her PhD in theoretical particle physics from Columbia University, she held postdoctoral positions at MIT, Yale University and Boston University. Prior to joining BNL in 2013, she was an assistant computational scientist at the Argonne Leadership Computing Facility.

    1) What inspired you to work in STEM?

    I have always liked to solve problems and figure out how things work. As a farm girl in a small village in China, I was very close to nature and had many opportunities to see physics at work in daily life, even though I didn’t realize it then. For example, in the starch-making process, farmers would drain the water out of the barrels using the siphon principle. Such experiences fostered my curiosity, and later on, when I learned physics and could make such connections, I was quite fascinated. I guess I also inherited the “curiosity” genes from my parents, who, although they did not have the chance to get much education, were always trying to figure out how things work and to fix everything themselves. My father, in particular, also accidentally cultivated my interest in math and logic through things like puzzles and Chinese chess when I was a little kid.

    But the realization that I wanted to work in STEM came gradually, and the fact that I do is more a happy accident than determination. There wasn’t an “aha” moment that made me decide to choose science as my career. Growing up, I always wanted to be a writer. Somewhat by chance, I was admitted to the Physics Department at Peking University. Once I started studying physics as a major, I grew to love its problem-solving aspects and was amazed by the mathematical simplicity of the laws of physics. Even more importantly, I saw intelligence, dedication and a constant hunger for new knowledge in my professors and colleagues throughout the years, and I enjoyed working and learning with them very much. I think that’s what eventually got me to work in STEM and stay with it.

    2) What excites you about your work at the Energy Department?

    Working in a field that strives to understand the most fundamental properties of our universe gives me this feeling that I am making a small contribution to the advancement of human knowledge, and that is very satisfying for me. At the Energy Department, I am surrounded by some of the smartest people and constantly exposed to new ideas and new technologies. It makes my work both challenging and exciting. Now that I am in an interdisciplinary research center, I am excited to have the opportunity to learn from my colleagues about their areas of interests and hopefully expand my research horizon.

    3) How can our country engage more women, girls, and other underrepresented groups in STEM?

    For young girls who are thinking about entering the field, some guidance and encouragement from teachers, both male and female, will certainly help a great deal. When I was in high school, I had female teachers telling me that I just needed to marry well. But I was lucky that several of my male teachers saw my potential in math and physics, offered me very generous support and guided me through difficult times. Without them I would probably have followed a more stereotypical path for girls. This may be less of an issue in the US now, but we still need to be careful not to typecast girls and minorities.

    On the other hand, we need to have a more supportive system which can retain women and underrepresented groups already working in STEM. I almost gave up working in STEM at one point, because it was so hard to find a job in my field that would allow me and my husband to stay in one place—the notorious “two-body problem”. I was fortunate enough to have some very understanding and supportive supervisors and colleagues. At both Boston University and Argonne, I was given the green light to work from home most of the time. I am immensely grateful for this arrangement, as it gave me the necessary transition to eventually get my current job which is close to where my husband works. Of course other people in STEM may have more constraints due to the nature of their work and don’t have the luxury of working remotely. But some flexibility and understanding will go a long way.

    4) Do you have tips you’d recommend for someone looking to enter your field of work?

    Take your time to find a field that interests and excites you. I always thought I wanted to be an experimental condensed matter physicist, but after a few summers in the labs, it turned out I did not like to do the experiments or be in the clean room. But I enjoyed writing computer programs to control the instruments or do simulations and data analysis. Then I found the field of lattice gauge theory where theoretical physics and supercomputers meet, which is perfect for me.

    For lattice gauge theory, and for computational sciences in general, the requirements on both mathematical and computational skills are pretty high. So it is important to have a solid mathematical foundation from early on. Some experience with scientific computing will be helpful. It probably sounds harder than it really is. Just don’t expect to know everything from the beginning. Nobody does. A lot of the skills, especially programming skills, can be picked up and improved on the job. As long as this is something you are interested in, be passionate, persevere, and don’t be afraid to ask for help.

    5) When you have free time, what are your hobbies?

    I enjoy reading, jogging, traveling and just checking out new neighborhoods with my husband. Occasionally when the mood strikes, I also like to write. I still hope someday I will be able to write a book or two. But with my first baby on the way, all this may change. Time will tell.

    See the full article here.

    BNL Campus

    Please help promote STEM in your local schools.


    STEM Education Coalition

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

    ScienceSprings relies on technology from MAINGEAR computers, Lenovo, and Dell.

     
  • richardmitnick 1:51 pm on November 18, 2014 Permalink | Reply
    Tags: US Department of Energy

    From Scientific American: “Next Wave of U.S. Supercomputers Could Break Up Race for Fastest” 

    Scientific American


    November 17, 2014
    Alexandra Witze and Nature magazine

    Once locked in an arms race with each other for the fastest supercomputers, US national laboratories are now banding together to buy their next-generation machines.

    On November 14, the Oak Ridge National Laboratory (ORNL) in Tennessee and the Lawrence Livermore National Laboratory in California announced that they will each acquire a next-generation IBM supercomputer that will run at up to 150 petaflops. That means that the machines can perform 150 million billion floating-point operations per second, at least five times as fast as the current leading US supercomputer, the Titan system at the ORNL.

    Cray Titan
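
    As a quick sanity check on those figures (my arithmetic, not the article’s): 150 petaflops is 150 × 10^15 floating-point operations per second, roughly five and a half times Titan’s 27 petaflops.

    ```python
    # Back-of-the-envelope check of the quoted figures (not from the article).
    summit_sierra = 150e15   # 150 petaflops, in floating-point operations per second
    titan = 27e15            # Titan, the current leading US supercomputer

    print(f"{summit_sierra:.2e} ops/s")           # 1.50e+17
    print(f"{summit_sierra / titan:.1f}x Titan")  # ~5.6x: "at least five times as fast"
    ```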

    The new supercomputers, which together will cost $325 million, should enable new types of science for thousands of researchers who model everything from climate change to materials science to nuclear-weapons performance.

    “There is a real importance of having the larger systems, and not just to do the same problems over and over again in greater detail,” says Julia White, manager of a grant program that awards supercomputing time at the ORNL and Argonne National Laboratory in Illinois. “You can actually take science to the next level.” For instance, climate modellers could use the faster machines to link together ocean and atmospheric-circulation patterns in a regional simulation to get a much more accurate picture of how hurricanes form.

    A learning experience

    Building the most powerful supercomputers is a never-ending race. Almost as soon as one machine is purchased and installed, lab managers begin soliciting bids for the next one. Vendors such as IBM and Cray use these competitions to develop the next generation of processor chips and architectures, which shapes the field of computing more generally.

    In the past, the US national labs pursued separate paths to these acquisitions. Hoping to streamline the process and save money, clusters of labs have now joined together to put out a shared call — even those that perform classified research, such as Livermore. “Our missions differ, but we share a lot of commonalities,” says Arthur Bland, who heads the ORNL computing facility.

    In June, after the first such coordinated bid, Cray agreed to supply one machine to a consortium from the Los Alamos and Sandia national labs in New Mexico, and another to the National Energy Research Scientific Computing (NERSC) Center at the Lawrence Berkeley National Laboratory in Berkeley, California. Similarly, the ORNL and Livermore have banded together with Argonne.

    The joint bids have been a learning experience, says Thuc Hoang, programme manager for high-performance supercomputing research and operations with the National Nuclear Security Administration in Washington DC, which manages Los Alamos, Sandia and Livermore. “We thought it was worth a try,” she says. “It requires a lot of meetings about which requirements are coming from which labs and where we can make compromises.”

    At the moment, the world’s most powerful supercomputer is the 55-petaflop Tianhe-2 machine at the National Super Computer Center in Guangzhou, China. Titan is second, at 27 petaflops. An updated ranking of the top 500 supercomputers will be announced on November 18 at the 2014 Supercomputing Conference in New Orleans, Louisiana.

    When the new ORNL and Livermore supercomputers come online in 2018, they will almost certainly vault to near the top of the list, says Barbara Helland, facilities-division director of the Advanced Scientific Computing Research program at the Department of Energy (DOE) Office of Science in Washington DC.

    But more important than rankings is whether scientists can get more performance out of the new machines, says Sudip Dosanjh, director of the NERSC. “They’re all being inundated with data,” he says. “People have a desperate need to analyse that.”

    A better metric than pure calculating speed, Dosanjh says, is how much better computing codes perform on a new machine. That is why the latest machines were selected not on total speed but on how well they will meet specific computing benchmarks.

    Dual paths

    The new supercomputers, to be called Summit and Sierra, will be structurally similar to the existing Titan supercomputer. They will combine two types of processor chip: central processing units, or CPUs, which handle the bulk of everyday calculations, and graphics processing units, or GPUs, which generally handle three-dimensional computations. Combining the two means that a supercomputer can direct the heavy work to GPUs and operate more efficiently overall. And because the ORNL and Livermore will have similar machines, computer managers should be able to share lessons learned and ways to improve performance, Helland says.
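
    As a minimal sketch of that division of labor (an illustration of the hybrid CPU-GPU idea only, not the actual Summit or Sierra software stack), here the CPU handles setup while the bulk arithmetic is dispatched to a GPU via the CuPy library, falling back to NumPy on machines without one:

    ```python
    import numpy as np

    try:
        import cupy as xp    # GPU arrays via CUDA, if a device is available
        gpu = True
    except ImportError:
        xp = np              # fall back to CPU-only NumPy
        gpu = False

    # CPU side: lightweight setup and bookkeeping.
    a = np.random.rand(2048, 2048).astype(np.float32)
    b = np.random.rand(2048, 2048).astype(np.float32)

    # "Heavy work" is directed to the accelerator, then copied back to the host.
    c_dev = xp.asarray(a) @ xp.asarray(b)
    c = xp.asnumpy(c_dev) if gpu else c_dev

    print("ran on", "GPU" if gpu else "CPU", "| result shape:", c.shape)
    ```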

    Still, the DOE wants to preserve a little variety. The third lab of the trio, Argonne, will be making its announcement in the coming months, Helland says, but it will use a different architecture from the combined CPU–GPU approach. It will almost certainly be like Argonne’s current IBM machine, which uses a lot of small but identical processors networked together. The latter approach has been popular for biological simulations, Helland says, and so “we want to keep the two different paths open”.

    Ultimately, the DOE is pushing towards supercomputers that could work at the exascale, or 1,000 times more powerful than the current petascale. Those are expected around 2023. But the more power the DOE labs acquire, the more scientists seem to want, says Katie Antypas, head of the services department at the NERSC.

    “There are entire fields that didn’t used to have a computational component to them,” such as genomics and bioimaging, she says. “And now they are coming to us asking for help.”

    See the full article here.

    Please help promote STEM in your local schools.


    STEM Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.


     
  • richardmitnick 3:18 pm on September 7, 2014 Permalink | Reply
    Tags: US Department of Energy

    “8 DOE labs collaborating on climate change project” 

    Labs to help accelerate development of state-of-the-science Earth system models.


    High performance computing will be used to develop and apply the most complete climate and Earth system model to address the most challenging and demanding climate change issues.

    Eight Department of Energy (DOE) national laboratories, including Lawrence Berkeley National Laboratory, are combining forces with the National Center for Atmospheric Research, four academic institutions and one private-sector company in the new effort. Other participating national laboratories include Argonne, Brookhaven, Lawrence Livermore, Los Alamos, Oak Ridge, Pacific Northwest, and Sandia.

    The project, called Accelerated Climate Modeling for Energy, or ACME, is designed to accelerate the development and application of fully coupled, state-of-the-science Earth system models for scientific and energy applications. The plan is to exploit advanced software and new High Performance Computing machines as they become available.
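
    To make “fully coupled” concrete, here is a toy two-box sketch (my illustration, not ACME code): separate component models advance together and exchange fluxes every timestep, so each component’s state continually feeds back on the others.

    ```python
    # Toy "coupled" system: an ocean box and an atmosphere box exchange heat
    # every timestep, so neither component can be integrated on its own.
    dt, k = 1.0, 0.05                  # timestep and coupling strength (arbitrary units)
    t_ocean, t_atmos = 10.0, -5.0      # initial temperatures

    for _ in range(200):
        flux = k * (t_ocean - t_atmos)   # heat flows down the temperature gradient
        t_ocean -= flux * dt             # the ocean loses exactly what the
        t_atmos += flux * dt             # atmosphere gains: energy is conserved

    print(f"ocean={t_ocean:.2f}, atmosphere={t_atmos:.2f}")   # both settle near 2.5
    ```

    A real Earth system model couples ocean, atmosphere, land, and ice components in just this give-and-take fashion, only with millions of grid cells per component instead of one box each.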

    The initial focus will be on three climate change science drivers and corresponding questions to be answered during the project’s initial phase:

    (Water Cycle) How do the hydrological cycle and water resources interact with the climate system on local to global scales? How will more realistic portrayals of features important to the water cycle (resolution, clouds, aerosols, snowpack, river routing, land use) affect river flow and associated freshwater supplies at the watershed scale?
    (Biogeochemistry) How do biogeochemical cycles interact with global climate change? How do carbon, nitrogen and phosphorus cycles regulate climate system feedbacks, and how sensitive are these feedbacks to model structural uncertainty?
    (Cryosphere Systems) How do rapid changes in cryospheric systems, or areas of the earth where water exists as ice or snow, interact with the climate system? Could a dynamical instability in the Antarctic Ice Sheet be triggered within the next 40 years?

    Over a planned 10-year span, the project’s aim is to conduct simulations and modeling on the most sophisticated HPC machines as they become available, i.e., 100-plus-petaflop machines and eventually exascale supercomputers. The team will initially use U.S. Department of Energy (DOE) Office of Science Leadership Computing Facilities at Oak Ridge and Argonne national laboratories.

    The model will also be optimized for deployment on the National Energy Research Scientific Computing Center (NERSC), which is located at Berkeley Lab.

    “We need a new paradigm for how to develop and apply climate models to answer critical questions regarding the implications of our past and future energy choices for society and the environment,” says Bill Collins, ACME’s Chief Scientist and head of the Earth Sciences Division’s Climate Sciences Department at Berkeley Lab.

    “To address this critical need, ACME is designed to accelerate our progress towards actionable climate projections to help the nation anticipate, adapt to, and ultimately mitigate the potential risks of global climate change,” Collins adds.

    Berkeley Lab scientist Bill Collins is the ACME Chief Scientist, with duties to lead the overall scientific direction of the project. He is working with the rest of the team to ensure that ACME can fully exploit the world-leading computers deployed by the Department of Energy.

    To address the water cycle, the Project Plan states that changes in river flow over the last 40 years have been dominated primarily by land management, water management and climate change associated with aerosol forcing. During the next 40 years, greenhouse gas (GHG) emissions in a business-as-usual scenario will produce changes to river flow.

    “A goal of ACME is to simulate the changes in the hydrological cycle, with a specific focus on precipitation and surface water in orographically complex regions such as the western United States and the headwaters of the Amazon,” the report states.

    To address biogeochemistry, ACME researchers will examine how more complete treatments of nutrient cycles affect carbon–climate system feedbacks, with a focus on tropical systems; and investigate the influence of alternative model structures for below-ground reaction networks on global-scale biogeochemistry–climate feedbacks.

    For the cryosphere, the team will examine the near-term risks of initiating the dynamic instability and onset of the collapse of the Antarctic Ice Sheet due to rapid melting by warming waters adjacent to the ice sheet grounding lines.

    The experiment would be the first fully coupled global simulation to include dynamic ice shelf–ocean interactions for addressing the potential instability associated with grounding line dynamics in marine ice sheets around Antarctica.

    Other Berkeley Lab researchers involved in the program leadership include Will Riley, an expert in the terrestrial carbon cycle and co-leader of the Biogeochemical Experiment Task Team. Hans Johansen, a computational fluid dynamicist, is co-leader of the Computational Performance Task Team.

    Initial funding for the effort has been provided by DOE’s Office of Science.

    See the full article here.


     
  • richardmitnick 9:38 am on August 29, 2014 Permalink | Reply
    Tags: US Department of Energy

    From BNL Lab: “DOE ‘Knowledgebase’ Links Biologists, Computer Scientists to Solve Energy, Environmental Issues” 

    Brookhaven Lab

    August 29, 2014
    Rebecca Harrington

    With new tool, biologists don’t have to be programmers to answer big computational questions

    If biologists wanted to determine the likely way a particular gene variant might increase a plant’s yield for producing biofuels, they used to have to track down several databases and cross-reference them using complex computer code. The process would take months, especially if they weren’t familiar with the computer programming necessary to analyze the data.

    Combining information about plants, microbes, and the complex biomolecular interactions that take place inside these organisms into a single, integrated “knowledgebase” will greatly enhance scientists’ ability to access and share data, and use it to improve the production of biofuels and other useful products.

    Now they can do the same analysis in a matter of hours, using the Department of Energy’s Systems Biology Knowledgebase (KBase), a new computational platform to help the biological community analyze, store, and share data. Led by scientists at DOE’s Lawrence Berkeley, Argonne, Brookhaven, and Oak Ridge national laboratories, KBase amasses the data available on plants, microbes, microbial communities, and the interactions among them with the aim of improving the environment and energy production. The computational tools, resources, and community networking available will allow researchers to propose and test new hypotheses, predict biological behavior, design new useful functions for organisms, and perform experiments never before possible.

    “Quantitative approaches to biology were significantly developed during the last decade, and for the first time, we are now in a position to construct predictive models of biological organisms,” said computational biologist Sergei Maslov, who is principal investigator (PI) for Brookhaven’s role in the effort and Associate Chief Science Officer for the overall project, which also has partners at a number of leading universities, Cold Spring Harbor Laboratory, the Joint Genome Institute, the Environmental Molecular Sciences Laboratory, and the DOE Bioenergy Centers. “KBase allows research groups to share and analyze data generated by their project, put it into context with data generated by other groups, and ultimately come to a much better quantitative understanding of their results. Biomolecular networks, which are the focus of my own scientific research, play a central role in this generation and propagation of biological knowledge.”

    Maslov said the team is transitioning from the scientific pilot phase into the production phase and will gradually expand from the limited functionality available now. By signing up for an account, scientists can access the data and tools free of charge, opening the doors to faster research and deeper collaboration.

    Easy coding

    “We implement all the standard tools to operate on this kind of key data so a single PI doesn’t need to go through the hassle by themselves.”
    — Shinjae Yoo, assistant computational scientist working on the project at Brookhaven

    As problems in energy, biology, and the environment get bigger, the data needed to solve them becomes more complex, driving researchers to use more powerful tools to parse through and analyze this big data. Biologists across the country and around the world generate massive amounts of data — on different genes, their natural and synthetic variations, proteins they encode, and their interactions within molecular networks — yet these results often don’t leave the lab where they originated.

    “By doing small-scale experiments, scientists cannot get the system-level understanding of biological organisms relevant to the DOE mission,” said Shinjae Yoo, an assistant computational scientist working on the project at Brookhaven. “But they can use KBase for the analysis of their large-scale data. KBase will also allow them to compare and contrast their data with other key datasets generated by projects funded by the DOE and other agencies. We implement all the standard tools to operate on this kind of key data so a single PI doesn’t need to go through the hassle by themselves.”

    For non-programmers, KBase offers a “Narrative Interface,” allowing them to upload their data to KBase and construct a narrative of their analysis from a series of pre-coded programs, with a human in the middle interpreting and filtering the output.

    In one pre-coded narrative, researchers can filter through naturally occurring gene variants of poplar, one of the DOE’s flagship bioenergy plant species, to discover genes associated with a reduced amount of lignin—a cell-wall polymer that makes conversion of poplar biomass to biofuels more difficult. In this narrative, scientists can combine datasets from KBase with their own data to find candidate genes, then use biomolecular networks to select the genes most likely to be related to a specific trait they’re looking for—say, reduced lignin content, which could ease the biomass-to-biofuel conversion. And if other researchers wanted to run the same program for a different plant, they could simply put different data into the same narrative.

    “Everything is already there,” Yoo said. “You simply need to upload the data in the right format and run through several easy steps within the narrative.”
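
    In rough pseudocode, the flow described above looks something like the sketch below. Every gene name, field, and score here is invented for illustration; this is not the actual KBase Narrative API.

    ```python
    # Hypothetical sketch of the narrative's analysis flow; the gene names,
    # fields, and scores are made up, not real KBase data or function calls.
    variants = [
        {"gene": "PtGeneA", "lignin_assoc": 0.91, "network_score": 0.80},
        {"gene": "PtGeneB", "lignin_assoc": 0.42, "network_score": 0.95},
        {"gene": "PtGeneC", "lignin_assoc": 0.88, "network_score": 0.77},
    ]

    # Step 1: keep only variants strongly associated with reduced lignin content.
    candidates = [v for v in variants if v["lignin_assoc"] > 0.8]

    # Step 2: rank the surviving candidate genes by their network association.
    candidates.sort(key=lambda v: v["network_score"], reverse=True)

    for v in candidates:
        print(v["gene"], v["network_score"])   # PtGeneA (0.80), then PtGeneC (0.77)
    ```

    Running the same narrative on a different plant would, as the article notes, just mean feeding a different table through the same steps.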

    For those who know how to code, KBase has the IRIS Interface, a web-based command line terminal where researchers can run and control the programs on their own, allowing scientists to analyze large volumes of data. If researchers want to learn how to do the coding themselves, KBase also has tutorials and resources to help interested scientists learn it.

    A social network

    But KBase’s most powerful resource is the community itself. Researchers are encouraged to upload their data and programs so that other users can benefit from them. This type of cooperative environment encourages sharing and feedback among researchers, so the programs, tools, and annotation of datasets can improve with other users’ input.

    Brookhaven is leading the plant team on the project, while the microbe and microbial community teams are based at other partner institutions. A computer scientist by training, Yoo said his favorite part of working on KBase has been how much biology he’s learned. Acting as a go-between among the biologists at Brookhaven, who are describing what they’d like to see KBase be able to do, and the computer scientists, who are coding the programs to make it happen, Yoo has had to understand both languages of science.

    “I’m learning plant biology. That’s pretty cool to me,” he said. “In the beginning, it was quite tough. Three years later I’ve caught up, but I still have a lot to learn.”

    Ultimately, KBase aims to interweave huge amounts of data with the right tools and user interface to enable bench scientists without programming backgrounds to answer the kinds of complex questions needed to solve the energy and environmental issues of our time.

    “We can gain systematic understanding of a biological process much faster, and also have a much deeper understanding,” Yoo said, “so we can engineer plant organisms or bacteria to improve productivity, biomass yield—and then use that information for biodesign.”

    KBase is funded by the DOE’s Office of Science. The Office of Science (SC) is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

    See the full article here.



     
  • richardmitnick 2:58 pm on June 25, 2014 Permalink | Reply
    Tags: US Department of Energy

    From Fermilab: “Supercomputers help answer the big questions about the universe” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Wednesday, June 25, 2014
    Jim Simone

    The proton is a complicated blob. It is composed of three particles, called quarks, which are surrounded by a roiling sea of gluons that “glue” the quarks together. In addition to interacting with its surrounding particles, each gluon can also turn itself temporarily into a quark-antiquark pair and then back into a gluon.

    Proton: two up quarks and one down quark

    Gluon, after Feynman

    This tremendously complicated subatomic dance affects measurements that are crucial to answering important questions about the universe, such as: What is the origin of mass in the universe? Why do the elementary particles we know come in three generations? Why is there so much more matter than antimatter in the universe?

    A large group of theoretical physicists at U.S. universities and DOE national laboratories, known as the USQCD collaboration, aims to help experimenters solve the mysteries of the universe by computing the effects of this tremendously complicated dance of quarks and gluons on experimental measurements. The collaboration members use powerful computers to solve the complex equations of the theory of quantum chromodynamics, or QCD, which govern the behavior of quarks and gluons.

    The USQCD computing needs are met through a combination of INCITE resources at the DOE Leadership Class Facilities at Argonne and Oak Ridge national laboratories; NSF facilities such as NCSA’s Blue Waters; a small Blue Gene/Q supercomputer at Brookhaven National Laboratory; and dedicated computer clusters housed at Fermilab and Jefferson Lab. USQCD also exploits floating-point accelerators such as Graphics Processing Units (GPUs) and Intel’s Xeon Phi architecture.

    With funding from the DOE Office of Science SciDAC program, the USQCD collaboration coordinates and oversees the development of community software that benefits all lattice QCD groups, enabling scientists to make the most efficient use of the latest supercomputer architectures and GPU clusters. Efficiency gains are achieved through new computing algorithms and techniques, such as communication avoidance, data compression and the use of mixed precision to represent numbers.
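
    As an example of one of those techniques, mixed precision can be sketched in a few lines of generic NumPy (an illustration of the idea, not USQCD’s solver code): do the expensive solve in cheap single precision, then recover full accuracy with double-precision residual corrections.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    A = rng.standard_normal((n, n)) + n * np.eye(n)   # a well-conditioned test system
    b = rng.standard_normal(n)

    # Bulk work in single precision: roughly half the memory traffic of double.
    x = np.linalg.solve(A.astype(np.float32), b.astype(np.float32)).astype(np.float64)

    # Iterative refinement: residuals in double precision, corrections in single.
    for _ in range(3):
        r = b - A @ x                                  # float64 residual
        x += np.linalg.solve(A.astype(np.float32), r.astype(np.float32))

    print("residual norm:", np.linalg.norm(b - A @ x))   # near full float64 accuracy
    ```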

    The nature of lattice QCD calculations is very conducive to cooperation among collaborations, even among groups that focus on different scientific applications of QCD effects. Why? The most time-consuming and expensive computing in lattice QCD—the generation of gauge configuration files—is the basis for all lattice QCD calculations. (Gauge configurations represent the sea of gluons and virtual quarks that make up the QCD vacuum.) They are most efficiently generated on the largest leadership-class supercomputers. The MILC collaboration, a subgroup of the larger USQCD collaboration, is well known for the calculation of state-of-the-art gauge configurations and freely shares them with researchers worldwide.
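
    As a heavily simplified analogy for how a configuration is generated (a toy two-dimensional U(1) model, nowhere near production lattice QCD), a Metropolis Monte Carlo update looks like this: propose a change to one link variable at a time and accept it with probability exp(−ΔS).

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    L, beta = 8, 2.0
    # One U(1) link angle per site and direction; this array *is* the configuration.
    theta = rng.uniform(0, 2 * np.pi, size=(L, L, 2))

    def plaquette(t, x, y):
        # Oriented sum of link angles around the unit square at (x, y), periodic BCs.
        return (t[x, y, 0] + t[(x + 1) % L, y, 1]
                - t[x, (y + 1) % L, 0] - t[x, y, 1])

    def action(t):
        return -beta * sum(np.cos(plaquette(t, x, y))
                           for x in range(L) for y in range(L))

    for _ in range(1000):                    # Metropolis updates of single links
        x, y, mu = rng.integers(L), rng.integers(L), rng.integers(2)
        old, s_old = theta[x, y, mu], action(theta)
        theta[x, y, mu] += rng.normal(0.0, 0.5)
        if rng.random() >= np.exp(min(0.0, s_old - action(theta))):
            theta[x, y, mu] = old            # reject the proposal: restore the link

    print("mean plaquette:", np.mean([np.cos(plaquette(theta, x, y))
                                      for x in range(L) for y in range(L)]))
    ```

    Production runs replace these angles with SU(3) matrices in four dimensions, add dynamical quarks, and use far more sophisticated update algorithms, which is why the largest leadership-class machines are needed.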

    Specific predictions require more specialized computations and rely on the gauge configurations as input. These calculations are usually performed on dedicated computer hardware at the labs, such as the clusters at Fermilab and Jefferson Lab and the small Blue Gene/Q at BNL, which are funded by the DOE Office of Science’s LQCD-ext Project for hardware infrastructure.

    With the powerful human and computer resources of USQCD, particle physicists working on many different experiments—from measurements at the Large Hadron Collider to neutrino experiments at Fermilab—have a chance to get to the bottom of the universe’s most pressing questions.

    See the full article here.

    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.



     
  • richardmitnick 12:41 pm on April 29, 2014 Permalink | Reply
    Tags: US Department of Energy

    From Fermilab: “Director’s Corner – Cultivating innovation” 



    Tuesday, April 29, 2014

    Fermilab Director Nigel Lockyer wrote this column.

    A new initiative has been launched at Fermilab: the DOE-sponsored Laboratory Directed Research and Development program, or LDRD.

    As a participant in LDRD, Fermilab will join other national laboratories in setting aside funds each year for small-scale, proof-of-principle or innovative research projects in science and technology. These funds are open to Fermilab employees who propose to become the principal investigator of an LDRD project. Projects are awarded through a competitive review of proposals, and they must be relevant to the mission of DOE and Fermilab, must address innovative science and technology, and must be outside the existing scope of current programmatic activities and projects.

    The LDRD program aims to foster creative scientific and technological thinking at the national laboratories and enable those with such innovative ideas to try them out.

    William Wester has been appointed as the LDRD coordinator. He will discuss the details of the program in a lunchtime meeting today from noon to 1 p.m. in One East. William is responsible for working within the DOE framework of the program and for coordinating the review panel and other activities required to make this opportunity available. Other information is available on the LDRD Web page. The first call for LDRD proposals has gone out; preliminary proposals are due May 9, and full proposals are due May 23. We expect to announce the next call for proposals early in FY15.

    The Fermilab community boasts creative thinkers and problem solvers among its members. Your ideas could lead to the next big technological breakthrough for science and society, one that will become part of the future of Fermilab. We look forward to the discoveries, and to seeing the new concepts that get tested under this program.

    See the full article here.


     
  • richardmitnick 9:05 am on October 22, 2013 Permalink | Reply
    Tags: US Department of Energy

    From D.O.E. Pulse: “A toolbox to simulate the Big Bang and beyond” 


    October 14, 2013
    Submitted by DOE’s Fermilab

    The universe is a vast and mysterious place, but thanks to high-performance computing technology scientists around the world are beginning to understand it better. They are using supercomputers to simulate how the Big Bang generated the seeds that led to the formation of galaxies such as the Milky Way.

    Courtesy of Ralf Kaehler and Tom Abel (visualization); John Wise and Tom Abel (numeric simulation).

    A new project involving DOE’s Argonne Lab, Fermilab and Berkeley Lab will allow scientists to study this vastness in greater detail with a new cosmological simulation analysis toolbox.

    Modeling the universe with a computer is very difficult, and the output of those simulations is typically very large. By anyone’s standards, this is “big data,” as each of these data sets can require hundreds of terabytes of storage space. Efficient storage and sharing of these huge data sets among scientists is paramount. Many different scientific analyses and processing sequences are carried out with each data set, making it impractical to rerun the simulations for each new study.

    This past year Argonne Lab, Fermilab and Berkeley Lab began a unique partnership on an ambitious advanced-computing project. Together the three labs are developing a new, state-of-the-art cosmological simulation analysis toolbox that takes advantage of DOE’s investments in supercomputers and specialized high-performance computing codes. Argonne’s team is led by Salman Habib, principal investigator, and Ravi Madduri, system designer. Jim Kowalkowski and Richard Gerber are the team leaders at Fermilab and Berkeley Lab.

    See the full article here.

    DOE Pulse highlights work being done at the Department of Energy’s national laboratories. DOE’s laboratories house world-class facilities where more than 30,000 scientists and engineers perform cutting-edge research spanning DOE’s science, energy, national security and environmental quality missions. DOE Pulse is distributed twice each month.




     
  • richardmitnick 2:29 pm on July 25, 2013 Permalink | Reply
    Tags: US Department of Energy

    From Energy.gov- “Photo of the Week: Faster than the Speed of Light” 


    July 24, 2013
    Sarah Gerrity

    If you’ve ever heard the thunderous sound of a sonic boom, you’ve experienced the shock waves in the air created by an object traveling faster than the speed of sound. But what happens when a particle travels faster than light does in the material it is passing through?

    Photo courtesy of Jefferson Laboratory.

    At Jefferson Laboratory, construction is underway to upgrade the Continuous Electron Beam Accelerator Facility (CEBAF) and the CEBAF Large Acceptance Spectrometer (CLAS12) at Hall B. During the experiments, the accelerator will shoot electrons at speeds faster than light travels in the detector material, creating shock waves that emit a blue light, known as Cherenkov light — the optical equivalent of a sonic boom. By recording data from Cherenkov light, scientists will be able to map a nucleon’s three-dimensional spin. The device will use 48 ellipsoidal mirrors assembled into one circular, 8-foot-diameter mirror to capture this light. Pictured here is the web-like component that will support the mirrors in the accelerator itself.
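
    For the record, the standard physics behind that blue glow (textbook relations, not taken from the article): a charged particle radiates Cherenkov light when its speed v exceeds c/n, the speed of light in a medium of refractive index n, and the light is emitted at an angle θ with cos θ = 1/(nβ), where β = v/c.

    ```python
    import math

    c = 299_792_458.0   # speed of light in vacuum, m/s
    n = 1.5             # refractive index of the radiator (an assumed example value)

    v_min = c / n       # the particle must outrun light *in the medium*, not in vacuum
    print(f"threshold speed: {v_min:.3e} m/s ({1/n:.2f} c)")

    beta = 0.99         # a relativistic electron with v = 0.99 c
    theta_c = math.degrees(math.acos(1.0 / (n * beta)))
    print(f"Cherenkov angle at beta={beta}: {theta_c:.1f} degrees")
    ```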

    See the full article here.



     
  • richardmitnick 12:48 pm on May 17, 2013 Permalink | Reply
    Tags: US Department of Energy   

    From Symmetry: "Moniz confirmed as Energy Secretary" 

    May 16, 2013
    Mike Ross

    “The US Senate has unanimously confirmed MIT physics professor Ernest Moniz as the next Secretary of Energy.

    Ernest Moniz

    Ernest Moniz, an MIT physics professor with extensive experience with particle accelerators and national energy policies, has been confirmed in a unanimous vote by the US Senate as the next Secretary of Energy.

    The Department of Energy is the single largest supporter of particle physics, and of basic research in the physical sciences, in the United States.

    Moniz succeeds Steven Chu, also a physicist, who served during the Obama administration’s first term and who announced Feb. 1 that he would be stepping down. After the transition, Chu will be joining the faculty of Stanford University.

    Coincidentally, Moniz also has a Stanford connection. He earned his PhD in theoretical nuclear physics there in January 1972.

    After postdoctoral research stints in Saclay, France, and the University of Pennsylvania, Moniz joined MIT’s physics faculty in 1973. He was director of the DOE-funded Bates Linear Accelerator Center from 1983 to 1991. In the 1990s, Moniz became more active in the national energy policy discussion. He served the Clinton Administration as associate director for science in the White House’s Office of Science and Technology Policy (1995-97) and then as DOE undersecretary (1997-2001). In 2006, he was named director of the MIT Energy Initiative and the Laboratory for Energy and the Environment.

    ‘Taken together, these roles have given me a deep appreciation of DOE’s importance to American leadership in science,’ Moniz said in his April 9 written statement to the Senate committee reviewing his nomination. ‘DOE is the lead funder of basic research in the physical sciences and provides the national research community with unique research opportunities at major facilities for nuclear and particle physics, energy science, materials research and discovery, large-scale computation and other disciplines. DOE operates an unparalleled national laboratory system and partners with both university and industry at the research frontier.

    ‘The Secretary of Energy has the responsibility for stewardship of a crucial part of the American basic research enterprise. If confirmed, I will work with the scientific community and with Congress to assure that our researchers have continuing access to cutting-edge research tools for scientific discovery and for training the next generation.’

    With a 21-1 vote, the committee approved Moniz’s nomination on April 18.

    See the full article here. Best of luck to the new Secretary and to us here in the embattled scientific community.

    Symmetry is a joint Fermilab/SLAC publication.

     
  • richardmitnick 2:40 pm on April 3, 2013 Permalink | Reply
    Tags: AMS Collaboration, US Department of Energy

    From CERN: "First result from the AMS experiment" 


    30 March 2013
    No Writer Credit

    “The Alpha Magnetic Spectrometer (AMS) Collaboration announces the publication of its first physics result in Physical Review Letters. The AMS Experiment is the most powerful and sensitive particle physics spectrometer ever deployed in space. As seen in Figure 1, AMS is located on the exterior of the International Space Station (ISS) and since its installation on 19 May 2011 it has measured over 30 billion cosmic rays at energies up to trillions of electron volts. Its permanent magnet and array of precision particle detectors collect and identify charged cosmic rays passing through AMS from the far reaches of space. Over its long duration mission on the ISS, AMS will record signals from 16 billion cosmic rays every year and transmit them to Earth for analysis by the AMS Collaboration. This is the first of many physics results to be reported.

    From its vantage point ~260 miles (~400 km) above the Earth, the Alpha Magnetic Spectrometer (AMS) collects data from primordial cosmic rays that traverse the detector.

    In the initial 18 month period of space operations, from 19 May 2011 to 10 December 2012, AMS analyzed 25 billion primary cosmic ray events. Of these, an unprecedented number, 6.8 million, were unambiguously identified as electrons and their antimatter counterpart, positrons. The 6.8 million particles observed in the energy range 0.5 to 350 GeV are the subject of the precision study reported in this first paper.

    Electrons and positrons are identified by the accurate and redundant measurements provided by the various AMS instruments against a large background of protons. Positrons are clearly distinguished from this background thanks to AMS’s robust rejection power of more than one in one million.

    Currently, the total number of positrons identified by AMS, in excess of 400,000, is the largest number of energetic antimatter particles directly measured and analyzed from space.”

    From AMS at NASA
    “The AMS-02 experiment is a state-of-the-art particle physics detector that is constructed, tested and operated by an international team composed of 56 institutes from 16 countries and organized under United States Department of Energy (DOE) sponsorship. The JSC (Johnson Space Center) AMS project office oversaw the overall payload integration activities and ensured that the payload was safe and ready for launch on the Space Shuttle, and that it continues to be safe since its deployment onto the ISS. The AMS Experiment uses the unique environment of space to advance knowledge of the universe and lead to the understanding of the universe’s origin. AMS was launched on Space Shuttle Endeavour on May 16, 2011.” Operations on the ISS began three days later, and AMS continues to operate onboard the ISS today.

    Meet CERN in a variety of places:

    CERN Courier

    The four major project collaborations: ATLAS, ALICE, CMS and LHCb

    The LHC

    Quantum Diaries



     