Tagged: DOE Pulse

  • richardmitnick 1:54 pm on September 15, 2014 Permalink | Reply
    Tags: DOE Pulse, SLAC SUNCAT, Statistical Analysis

    From DOE Pulse: “Uncertainty gives scientists new confidence in search for novel materials”

    DOE Pulse

    September 15, 2014
    Angela Anderson, 650.926.3505,
    angelaa@slac.stanford.edu

    Scientists at Stanford University and DOE’s SLAC National Accelerator Laboratory have found a way to estimate uncertainties in computer calculations that are widely used to speed the search for new materials for industry, electronics, energy, drug design and a host of other applications.

    This image shows the results of calculations aimed at determining which of six chemical elements would make the best catalyst for promoting an ammonia synthesis reaction. Researchers at SLAC and Stanford used Density Functional Theory (DFT) to calculate the strength of the bond between nitrogen atoms and the surfaces of the catalysts. The bond strength, plotted on the horizontal axis, is a key factor in determining the reaction speed, plotted on the vertical axis. Based on thousands of these calculations, which yielded a range of results (colored dots) that reveal the uncertainty involved, researchers estimated an 80 percent chance that ruthenium (Ru, in red) will be a better catalyst than iron (Fe, in orange). (Andrew Medford and Aleksandra Vojvodic/SUNCAT, Callie Cullum)

    The technique, reported in a recent issue of Science, should quickly be adopted in studies that produce some 30,000 scientific papers per year.

    “Over the past 10 years our ability to calculate the properties of materials and chemicals, such as reactivity and mechanical strength, has increased enormously. It’s totally exploded,” said Jens Nørskov, a professor at SLAC and Stanford and director of the SUNCAT Center for Interface Science and Catalysis, who led the research.

    “As more and more researchers use computer simulations to predict which materials have the interesting properties we’re looking for — part of a process called ‘materials by design’ — knowing the probability for error in these calculations is essential,” he said. “It tells us exactly how much confidence we can put in our results.”

    Nørskov and his colleagues have been at the forefront of developing this approach, using it to find better and cheaper catalysts to speed ammonia synthesis and generate hydrogen gas for fuel, among other things. But the technique they describe in the paper can be broadly applied to all kinds of scientific studies.

    Speeding the Material Design Cycle

    The set of calculations involved in this study is known as DFT, for Density Functional Theory. It predicts bond energies between atoms based on the principles of quantum mechanics. DFT calculations allow scientists to predict hundreds of chemical and materials properties, from the electronic structures of compounds to density, hardness, optical properties and reactivity.

    Because researchers use approximations to simplify the calculations — otherwise they’d take too much computer time — each of these calculated material properties could be off by a fairly wide margin.

    To estimate the size of those errors, the team applied a statistical method: They calculated each property thousands of times, each time tweaking one of the variables to produce slightly different results. That variation in results represents the possible range of error.

    “Even with the estimated uncertainties included, when we compared the calculated properties of different materials we were able to see clear trends,” said Andrew J. Medford, a graduate student with SUNCAT and first author of the study. “We could predict, for instance, that ruthenium would be a better catalyst for synthesizing ammonia than cobalt or nickel, and say what the likelihood is of our prediction being right.”
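    The resampling idea described above can be sketched in a few lines. This is a hypothetical illustration, not SUNCAT's actual code: the means, spreads, and "optimum" bond strength below are made-up stand-ins for a DFT error ensemble, chosen only to show how a probability like the "80 percent chance" figure is obtained.

```python
import random

random.seed(0)

# Hypothetical ensemble means and spreads (eV) for the nitrogen bond
# energy on two candidate catalysts -- NOT real DFT numbers.
candidates = {"Ru": (-0.6, 0.25), "Fe": (-1.1, 0.30)}
OPTIMUM = -0.5  # assumed ideal bond strength at the peak of the volcano curve

def sample(mean, spread, n=10_000):
    """Draw n ensemble results, mimicking thousands of tweaked calculations."""
    return [random.gauss(mean, spread) for _ in range(n)]

ru = sample(*candidates["Ru"])
fe = sample(*candidates["Fe"])

# A candidate "wins" a draw when its bond energy lies closer to the optimum.
wins = sum(abs(r - OPTIMUM) < abs(f - OPTIMUM) for r, f in zip(ru, fe))
p_ru_better = wins / len(ru)
print(f"P(Ru better than Fe) ~ {p_ru_better:.2f}")
```

    Because each draw mimics one tweaked calculation, the fraction of draws in which one metal lands closer to the optimum directly estimates the probability that the prediction is right.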

    An Essential New Tool for Thousands of Studies

    DFT calculations are used in the Materials Genome Initiative to search through millions of solids and compounds, and are also widely used in drug design, said Kieron Burke, a professor of chemistry and physics at the University of California, Irvine, who was not involved in the study.

    “There were roughly 30,000 papers published last year using DFT,” he said. “I believe the technique they’ve developed will become absolutely necessary for these kinds of calculations in all fields in a very short period of time.”

    Thomas Bligaard, a senior staff scientist in charge of theoretical method development at SUNCAT, said the team has a lot of work ahead in implementing these ideas, especially in calculations attempting to make predictions of new phenomena or new functional materials.

    Other researchers involved in the study were Jess Wellendorff, Aleksandra Vojvodic, Felix Studt, and Frank Abild-Pedersen of SUNCAT and Karsten W. Jacobsen of the Technical University of Denmark. Funding for the research came from the DOE Office of Science.

    See the full article here.

    DOE Pulse highlights work being done at the Department of Energy’s national laboratories. DOE’s laboratories house world-class facilities where more than 30,000 scientists and engineers perform cutting-edge research spanning DOE’s science, energy, national security and environmental quality missions. DOE Pulse is distributed twice each month.

    ScienceSprings relies on technology from MAINGEAR, Lenovo and Dell computers.

     
  • richardmitnick 3:24 pm on July 21, 2014 Permalink | Reply
    Tags: DOE Pulse

    From DOE Pulse: “Diamond plates create nanostructures through pressure, not chemistry “ 

    July 21, 2014
    Darrick Hurst, 505.844.8009,
    drhurst@sandia.gov

    You wouldn’t think that mechanical force — the simple kind used to eject unruly patrons from bars, shoe a horse or emboss the raised numerals on credit cards — could process nanoparticles more subtly than the most advanced chemistry.

    Yet, in a recent paper in Nature Communications, Sandia National Laboratories researcher Hongyou Fan and colleagues appear to have achieved a start toward that end.

    Sandia National Laboratories researcher Hongyou Fan, center, points out a nanoscience result to Sandia paper co-authors Paul Clem, left, and Binsong Li. (Photo by Randy Montoya)

    Their newly patented and original method uses simple pressure — a kind of high-tech embossing — to produce finer and cleaner results in forming silver nanostructures than do chemical methods, which are not only inflexible in their results but leave harmful byproducts to dispose of.

    Fan calls his approach “a simple stress-based fabrication method” that, when applied to nanoparticle arrays, forms new nanostructures with tunable properties.

    “There is a great potential market for this technology,” he said. “It can be readily and directly integrated into current industrial manufacturing lines without creating new expensive and specialized equipment.”

    Said Sandia co-author Paul Clem, “This is a foundational method that should enable a variety of devices, including flexible electronics such as antennas, chemical sensors and strain detectors.” It also would produce transparent electrodes for solar cells and organic light-emitting diodes, Clem said.

    The method was inspired by industrial embossing processes in which a patterned mask is applied with high external pressure to create patterns in the substrate, Fan said. “In our technology, two diamond anvils were used to sandwich nanoparticulate thin films. This external stress manually induced transitions in the film that synthesized new materials,” he said.

    The pressure, delivered by two diamond plates tightened by four screws to any controlled setting, shepherds silver nanospheres into any desired volume. Propinquity creates conditions that produce nanorods, nanowires and nanosheets at chosen thicknesses and lengths rather than the one-size-fits-all output of a chemical process, with no environmentally harmful residues.

    While experiments reported in the paper were performed with silver — the most desirable metal because it is the most conductive, stable and optically interesting, and becomes transparent at certain pressures — the method has also been shown to work with gold, platinum and other metallic nanoparticles.

    Clem said the researchers are now starting to work with semiconductors.

    Bill Hammetter, manager of Sandia’s Advanced Materials Laboratory, said, “Hongyou has discovered a way to build one structure into another structure — a capability we don’t have now at the nanolevel. Eight or nine gigapascals — the amount of pressure at which phase changes and new materials occur — is not difficult to reach. Any industry that has embossing equipment could lay a film of silver on a piece of paper, build a conductive pattern, then remove the extraneous material and be left with the pattern. A coating of nanoparticles that can build into another structure has a certain functionality we don’t have right now. It’s a discovery that hasn’t been commercialized, but could be done today with the same equipment used by anyone who makes credit cards.”
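    For a sense of scale, the pressure Hammetter cites translates into a clamping force via P = F/A. The anvil contact diameter below is an assumed, illustrative value, not a figure from the paper:

```python
import math

TARGET_PRESSURE = 9e9    # Pa (9 GPa, the upper figure quoted above)
culet_diameter = 0.5e-3  # m -- assumed anvil contact face, not from the paper

area = math.pi * (culet_diameter / 2) ** 2  # contact area of the anvil face
force = TARGET_PRESSURE * area              # force the four screws must supply

print(f"contact area: {area:.2e} m^2")
print(f"required force: {force:.0f} N (~{force / 9.81:.0f} kg-equivalent)")
```

    With these assumed numbers the screws need to supply only a couple of kilonewtons, which is why ordinary embossing equipment can plausibly reach the regime where the phase transitions occur.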

    The method can be used to configure new types of materials. For example, under pressure, the dimensions of ordered three-dimensional nanoparticle arrays shrink. By fabricating a structure in which the sandwiching walls permanently provide that pressure, the nanoparticle array will remain at a constant state, able to transmit light and electricity with specific characteristics. This pressure-regulated fine-tuning of particle separation enables controlled investigation of distance-dependent optical and electrical phenomena.

    At even higher pressures, nanoparticles are forced to sinter, or bond, forming new classes of chemically and mechanically stable nanostructures that no longer need restraining surfaces. These cannot be manufactured using current chemical methods.

    Depending on the size, composition and phase orientation of the initial nanoparticle arrays, a variety of nanostructures or nanocomposites and 3-D interconnected networks are achievable.

    The stress-induced synthesis processes are simple and clean. No thermal processing or further purification is needed to remove reaction byproducts.

    This work was funded by the Department of Energy’s Office of Science. Other authors of the paper are from Cornell University and Los Alamos National Laboratory.

    See the full article here.


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 8:23 pm on June 23, 2014 Permalink | Reply
    Tags: DOE Pulse

    From DOE Pulse: “Supercomputer exposes enzyme’s secrets” 

    June 23, 2014
    Heather Lammers, 303.275.4084,
    heather.lammers@nrel.gov

    Thanks to newer and faster supercomputers, today’s computer simulations are opening hidden vistas to researchers in all areas of science. These powerful machines are used for everything from understanding how proteins work to answering questions about how galaxies began. Sometimes the data they create manage to surprise the very researchers staring back at the computer screen—that’s what recently happened to a researcher at DOE’s National Renewable Energy Laboratory (NREL).

    “What I saw was completely unexpected,” NREL engineer Gregg Beckham said.

    NREL biochemist Michael Resch (left) and NREL engineer Gregg Beckham discuss the results of vials containing an enzymatic digestion assay of cellulose. (Photo by Dennis Schroeder, NREL)

    What startled Beckham was a computer simulation of an enzyme from the fungus Trichoderma reesei (Cel7A). The simulation showed that a part of an enzyme, the linker, may play a necessary role in breaking down biomass into the sugars used to make alternative transportation fuels.

    “A couple of years ago we decided to run a really long—well, really long being a microsecond—simulation of the entire enzyme on the surface of cellulose,” Beckham said. “We noticed the linker section of the enzyme started to bind to the cellulose—in fact, the entire linker binds to the surface of the cellulose.”

    The enzymes that the NREL researchers are examining have several different components that work together to break down biomass. The enzymes have a catalytic domain — the primary part of the enzyme, which breaks down the material into the needed sugars. There is also a binding module, the sticky part that attaches the enzyme to the cellulose. The catalytic domain and the binding module are connected to each other by a linker.

    “For decades, many people have thought these linkers are quite boring,” Beckham said. “Indeed, we predicted that linkers alone act like wet noodles—they are really flexible, and unlike the catalytic domain or the binding module, they didn’t have a known, well-defined structure. But the computer simulation suggests that the linker has some function other than connecting the binding module to the catalytic domain; namely, it may have some cellulose binding function as well.”

    Cellulose is a long linear chain of glucose that makes up the main part of plant cell walls, but the bonds between the glucose molecules make it very tough to break apart. In fact, cellulose in fossilized leaves can remain intact for millions of years, but enzymes have evolved to break down this biomass into sugars by threading a single strand of cellulose up into the enzymes’ catalytic domain and cleaving the bonds that connect glucose molecules together. Scientists are interested in the enzymes in fungi like Trichoderma reesei because they are quite effective at breaking down biomass—and fungi can make a lot of protein, which is also important for biomass conversion.

    To make an alternative fuel like cellulosic ethanol or drop-in hydrocarbon fuels, biomass is pretreated with acid, hot water, or some other chemicals and heat to open up the plant cell wall. Next, enzymes are added to the biomass to break down the cellulose into glucose, which is then fermented and converted into fuel.

    While Beckham and his colleagues were excited by what the simulation showed, there was also some trepidation.

    “At first we didn’t believe it, and we thought that it must be wrong, so a colleague, Christina Payne [formerly at NREL, now an assistant professor in chemical and materials engineering at the University of Kentucky], ran another simulation on the second most abundant enzyme in Trichoderma reesei (Cel6A),” Beckham explained. “And we found exactly the same thing.

    “Many research teams have been engineering catalytic domains and binding modules, but this result perhaps suggests that we should also consider the functions of linkers. We now know they are important for binding, and we know binding is important for activity—but many unanswered questions remain that the team is working on now.”

    The NREL research team experimentally verified the computational predictions by working with researchers at the University of Colorado Boulder (CU Boulder), the Swedish University of Agricultural Sciences, and Ghent University in Belgium. Using proteins made and characterized by the international project team, NREL’s Michael Resch measured the binding affinity of the binding module alone and compared it to that of the binding module with the linker attached, showing that the linker imparted an order-of-magnitude increase in binding affinity to cellulose. These results were published in the Proceedings of the National Academy of Sciences (PNAS). In addition to Beckham, Payne, and Resch, co-authors on the study include Liqun Chen and Zhongping Tan (CU Boulder); Michael F. Crowley, Michael E. Himmel, and Larry E. Taylor II (NREL); Mats Sandgren and Jerry Ståhlberg (Swedish University of Agricultural Sciences); and Ingeborg Stals (University College Ghent).

    “In terms of fuels, if you make even a small improvement in these enzymes, you could then lower the enzyme loadings. On a commodities scale, there is potential for dramatic savings that will help make renewable fuels competitive with fuels derived from fossil resources,” Beckham said.

    According to Beckham, improving these enzymes is very challenging but incredibly important for cost-effective biofuels production, which the Energy Department has long recognized. “We are still unraveling a lot of the basic mechanisms about how they work. For instance, our recent paper suggests that this might be another facet of how these enzymes work and another target for improving them.”

    The research work at NREL is funded by the Energy Department’s Bioenergy Technologies Office, and the computer time was provided by both the Energy Department and the National Science Foundation (NSF). The original simulation was run on a supercomputer named Athena at the National Institute for Computational Sciences, part of the NSF Extreme Science and Engineering Discovery Environment (XSEDE). The Energy Department’s Red Mesa supercomputer at Sandia National Laboratories was used for the subsequent simulations.

    See the full article here.


     
  • richardmitnick 10:15 am on March 3, 2014 Permalink | Reply
    Tags: DOE Pulse

    From PPPL via DOE Pulse: “Celebrating the 20th anniversary of the tritium shot heard around the world” 

    DOE Pulse

    March 3, 2014

    No Writer Credit

    Tensions rose in the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) as the seconds counted down. At stake was the first crucial test of a high-powered mixture of fuel for producing fusion energy. As the control-room clock reached “zero,” a flash of light on a closed-circuit television monitor marked a historic achievement: A world-record burst of more than 3 million watts of fusion energy — enough to momentarily light some 3,000 homes — fueled by the new high-powered mixture. The time was 11:08 p.m. on Thursday, Dec. 9, 1993.

    “There was a tremendous amount of cheering and clapping,” recalled PPPL physicist Rich Hawryluk, who headed the Tokamak Fusion Test Reactor (TFTR), the huge magnetic fusion facility — or tokamak — that produced the historic power. “People had been on pins and needles for a long time and finally it all came together.” It did so again the very next day when TFTR shattered the mark by creating more than six million watts of fusion energy.

    PPPL Tokamak

    The achievements generated headlines around the world and laid the foundation for the development of fusion energy in facilities such as ITER, the vast international experiment being built in France to demonstrate the feasibility of fusion power. The results delivered “important scientific confirmation of the path we are taking toward ITER,” said physicist Ed Synakowski, a PPPL diagnostics expert during the experiments and now associate director of the Office of Science for Fusion Energy Sciences at DOE. “I felt an important shift in the understanding of fusion’s likely reality with those experiments.”

    The breakthroughs proved the practicality of combining equal amounts of the hydrogen isotopes deuterium and its radioactive cousin tritium — the same combination that will be used in ITER and future fusion power plants — to form the superhot, charged plasma gas that fuels fusion reactions. The deuterium-tritium (D-T) mix produced some 150 times more power than a reaction fueled solely by deuterium, long the stand-alone ingredient in tokamak experiments, or “shots.”

    “This was the first test with equal parts D-T and it was technically quite challenging,” said Michael Zarnstorff, a task-force leader during the experiments and now deputy director for research at PPPL. “What we did marked a huge advance in integrating tritium into fusion facilities.”

    The insights gained included precise measurement of the confinement and loss of the alpha particles that fusion reactions release along with energetic neutrons. Good confinement of the alpha particles is critically important, since they are to serve as the primary means of heating the plasma in ITER and thereby producing a self-sustaining fusion reaction, or “burning plasma.”

    Exciting journey

    The historic shots capped years of intense preparation for tritium operations, which ran until TFTR was decommissioned in 1997 after setting more records and producing reams of new knowledge. “The journey to tritium was at least as exciting as the first experiments,” said former PPPL Director Ronald Davidson, who led the Laboratory during the tritium years. “It was an enormous technical undertaking and one of my greatest elements of pride in the PPPL staff was that the preparations were so good and so thorough that the tritium shots were successful early on in the D-T campaign.”

    The preparations mobilized physicists, engineers and staffers throughout the Laboratory. “The absolute top priority was to demonstrate that one could carry out the tritium experiments safely,” said former Deputy Director Dale Meade. “Everyone focused on this mission as we went through a step-by-step construction and checkout of the tritium systems with rigorous adherence to procedures and strong oversight by DOE.”

    Leaders of this effort included Jerry Levine, now head of the Environment, Safety, Health & Security Department at PPPL, and John DeLooper, who heads the Best Practices and Outreach Department. Levine’s team launched an environmental assessment under the National Environmental Policy Act in 1989 and received DOE and state approval in 1992. “The purpose was to show that there would be no significant environmental impact as a result of tritium operations,” Levine noted. DeLooper’s team double-checked everything from operator training to preparations for storing and moving the tritium gas, which subsequently arrived in stainless steel containers from the Savannah River National Laboratory in South Carolina.

    In the towering TFTR test cell, engineers readied the three-story high, 695-ton tokamak to operate with tritium. Key tasks included adding more shielding, checking all major systems against possible failures and ensuring that every diagnostic device worked. “The major challenge was to bring everything on line so that failures didn’t happen,” said Mike Williams, the head of engineering at PPPL and also deputy head of TFTR at the time.

    Yet nothing could be certain until the experiment began. “The whole world was going to show up and we had lots of opportunity to fall on our faces,” said engineer Tim Stevenson, who headed the neutral beam operations that heated the plasma to temperatures of more than 100 million degrees centigrade during the shots. “All the instruments were tuned up,” Stevenson said, “but we still had to play the symphony.”

    Keeping the local community informed was another high priority. PPPL leaders held open houses, met with local executives and government officials and conducted two public hearings before the arrival of tritium. Attendees at one hearing included a local college class that arrived at the urging of its professor.

    Scientists from around the world

    By the day of December 9, press coverage and Laboratory outreach had made PPPL a focus of attention. “Scientists from around the world flew in to witness the experiment,” recalled Rich Hawryluk. More than 100 local visitors flocked to the PPPL auditorium, where a closed-circuit TV feed displayed the control room and Ron Davidson and Dale Meade briefed the audience on unfolding developments. PPPL staffers and their families crowded around the viewing area that overlooked the control room.

    Reporters from several major newspapers covered the event. Also there was Mark Levenson, a reporter from New Jersey public TV station NJN whom the Lab hired to produce a video that subsequently received worldwide exposure.

    The source of all this excitement was surprisingly small: Just six-millionths of a gram of tritium was consumed that night in the shot that made global news. “Such tiny amounts generate huge energy because of the formula E = mc²,” explained Charles Gentile, the head of tritium systems at PPPL. The celebrated Einstein equation states that the amount of energy in a body equals the mass of that body times the speed of light squared — an enormous number, since light travels at 186,000 miles per second.
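    Gentile's arithmetic can be checked directly. Note that E = mc² gives the total rest-mass energy of the six micrograms; a fusion reaction releases only the small mass-defect fraction of that, so the figure below is an upper bound, not the energy of the shot itself:

```python
C = 299_792_458.0  # speed of light, m/s
mass = 6e-9        # kg: six-millionths of a gram of tritium

# Einstein's relation: total rest-mass energy of that much tritium
energy = mass * C ** 2
print(f"E = {energy:.2e} J, or about {energy / 3.6e6:.0f} kWh")
```

    Even this tiny mass corresponds to hundreds of megajoules, which is why microgram quantities of fuel were enough to make headline-grade power.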

    The media seemed as eager as the scientists to watch the famed formula work. “The press people were enormously excited,” said now-retired physicist Ken Young, who headed the PPPL diagnostics department and led efforts to measure the confinement and loss of alpha particles during the experiments. “These reporters were seeing science as it happens and kept waiting for the shot.”

    Also anxiously waiting were more than 100 scientists, engineers and invited guests inside the control room, which normally held about 40 people. All sported red passes that the Laboratory gave to PPPL staffers and guests from DOE and institutions that collaborated on TFTR. “Everybody who could be in there was in there,” recalled Forrest Jobes, a now-retired physicist who kept those in the rear of the L-shaped room abreast of what was happening.

    Calling the shots

    Up front, physicist Jim Strachan was too intent on his job to be caught up in the exuberance. His task was literally to call the shots — to decide how much heating power to use, for example, and when to start the countdown. “Everyone in the group was out to get the most D-T power from reproducible shots,” the now-retired Strachan recalled. “I felt a lot of responsibility and didn’t want to foul up.”

    All eyes followed a closed-circuit TV monitor that displayed a neutron-sensitive scintillator screen in the TFTR test cell that glowed when struck by the neutrons that a D-T shot produced. Artfully covering this test-cell screen was a cardboard poster — designed by PPPL graphic artist Gregory Czechowicz at the behest of Dale Meade — with holes cut into the shape of a light bulb and letters spelling “Fusion Power.” Engineer George Renda designed the scintillator itself. A flash of light from the bulb and the letters in the 3-foot-by-3-foot poster that covered the screen signaled a successful shot. “We came to really count on that image,” said Ed Synakowski. “No need to wait for the computer system to process the data.”

    But there still was plenty of waiting while a series of hardware glitches dragged out the schedule. “Many people in the audience thought we were doing this intentionally to increase the suspense,” Meade recalled.

    By 11 p.m. the problems were solved — setting the stage for the record-breaking shot at 11:08, signaled by the brightly lit light bulb and “Fusion Power” sign on the TV monitor. The control room erupted in jubilation over the shot, which produced 3.8 million watts of power. The excitement reached even the normally staid control-room log, where an operator noted the historic event with the exclamation “EEYAH!”

    On that high note the experiments ended and the control room opened for press interviews. NJN reporter Levenson returned to his studio to assemble a video news release that he uploaded to a satellite for worldwide distribution, sending the piece off at about 4 a.m. Key parts of the footage — including the control-room jubilation — were shown on nationwide newscasts the following evening.

    Looking back at these events, Hawryluk reflected on the sense of excitement, anticipation and relief that came with them. “We had worked so hard to finally get to that stage and we had done it,” he said. “That night on December 9 established a research capability that has enabled us to pursue a whole host of opportunities to advance the development of fusion energy.”

    See the full article here.

    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University.


     
  • richardmitnick 9:43 am on April 5, 2013 Permalink | Reply
    Tags: DOE Pulse

    From JLab via DOE Pulse: “Quarks’ spins dictate their location in the proton” 

    Jefferson Lab

    “A successful measurement of the distribution of quarks that make up protons conducted at DOE’s Jefferson Lab has found that a quark’s spin can predict its general location inside the proton. Quarks with spin pointed in the up direction will congregate in the left half of the proton, while down-spinning quarks hang out on the right. The research also confirms that scientists are on track to the first-ever three-dimensional inside view of the proton.

    The quark structure of the proton. (The color assignment of individual quarks is not important, only that all three colors are present.)

    The proton lies at the heart of every atom that builds our visible universe, yet scientists are still struggling to obtain a detailed picture of how it is composed of its primary building blocks: quarks and gluons. Too small to see with ordinary microscopes, protons and their quarks and gluons are instead illuminated by particle accelerators. At Jefferson Lab, the CEBAF accelerator directs a stream of electrons into protons, and huge detectors then collect information about how the particles interact.

    According to Harut Avakian, a Jefferson Lab staff scientist, these observations have so far revealed important basic information on the proton’s structure, such as the number of quarks and their momentum distribution. This information comes from scattering experiments that detect only whether a quark was hit but do not measure the particles produced from interacting quarks.

    ‘If you sum the momenta of those quarks, it can be compared to the momentum of the proton. What scientists were doing these last 40 years, they were investigating the momentum distribution of quarks along the direction in which the electron looks at it – a one-dimensional picture of the proton,’ he explains.

    Now, he and his colleagues have used a new experimental method that can potentially produce a full three-dimensional view of the proton.”

    See the full article here.


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 10:29 am on January 3, 2012 Permalink | Reply
    Tags: DOE Pulse

    From DOE Pulse: “Initiative aims to speed carbon capture technology” 

    January 2, 2012
    Submitted by DOE’s National Energy Technology Laboratory

    “The Carbon Capture Simulation Initiative (CCSI) is a partnership among five DOE national laboratories (NETL, Lawrence Berkeley, Lawrence Livermore, Los Alamos, and Pacific Northwest), industry, and various academic institutions that are working together to develop state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately, widespread deployment at hundreds of power plants. CCSI is part of DOE/NETL’s comprehensive carbon capture and sequestration (CCS) research program, part of the President’s plan to overcome barriers to the widespread, cost-effective deployment of CCS within 10 years.”

    See the full post here.





     
  • richardmitnick 9:14 am on June 27, 2011 Permalink | Reply
    Tags: DOE Pulse

    From DOE Pulse: “Sharpening the plasmon nanofocus” 

    Paul Preuss
    June 27, 2011

    Plasmonics is one of the hottest fields in technology today. Electronic surface waves called plasmons can be generated by confining electromagnetic waves shorter than half the wavelength of incident light, for example at the interface between gold nanostructures and insulating air.

    If the oscillation frequencies of the plasmons and the electromagnetic waves match, the electromagnetic field can be “nanofocused” within a few hundred cubic nanometers. Nanofocusing can be used with dark-field microscopy to detect low concentrations of biochemical agents, single-particle catalysis in nanoreactors, and other processes. Plasmonic sensing is especially promising for detecting flammable gases like hydrogen, where electrical sensors pose safety issues because of possible sparking.
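The confinement condition above (features smaller than half the wavelength of the incident light) lends itself to a quick back-of-the-envelope check. The sketch below is illustrative only; the feature sizes and wavelength are assumed values, not taken from the experiment.

```python
# Quick check of the subwavelength confinement condition described above:
# nanofocusing requires confining the electromagnetic field to length
# scales below half the wavelength of the incident light.
# The numbers below are illustrative assumptions, not experimental values.

def is_subwavelength(feature_size_nm, wavelength_nm):
    """True if the feature is smaller than half the incident wavelength."""
    return feature_size_nm < wavelength_nm / 2.0

# A 50 nm gap on a gold nanostructure under 600 nm visible light
# satisfies the condition; a 400 nm gap does not.
print(is_subwavelength(feature_size_nm=50, wavelength_nm=600))   # True
print(is_subwavelength(feature_size_nm=400, wavelength_nm=600))  # False
```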

    Researchers with DOE’s Lawrence Berkeley National Laboratory in collaboration with colleagues at the University of Stuttgart, Germany, reported the first experimental demonstration of nanofocusing to enhance gas sensing at the single-particle level in the journal Nature Materials. By placing a palladium nanoparticle on the focusing tip of a gold “nanoantenna,” they were able to clearly detect changes in the palladium’s optical properties upon exposure to hydrogen.

    See the full article here.

     
  • richardmitnick 1:46 pm on May 12, 2011 Permalink | Reply
    Tags: DOE Pulse

    From DOE Pulse: “Berkeley Lab invents a new kind of superlens for the infrared” 

    Paul Preuss
    May 2, 2011

    “Superlenses have resolution far greater than diffraction-limited conventional optics, but most are made from hard-to-fabricate metamaterials. Now researchers with DOE’s Lawrence Berkeley National Laboratory have made superlenses from much simpler layered oxides known as perovskites. Ideal for capturing light in the mid-infrared range, the new perovskite superlenses open the door to highly sensitive biomedical detection and imaging.

    Ordinary optics are limited by diffraction, the bending or spreading of light that prevents the resolution of objects smaller than about half a wavelength – the so-called far field. Another kind of light exists in the near field, a standing wave of “evanescent” light that peaks about a third of a wavelength from an illuminated surface or boundary and then precipitously decays with distance. Superlenses capture this evanescent light and add its reconstructed image to that created by conventionally focused propagating waves."
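The half-wavelength and third-of-a-wavelength rules of thumb in the passage above translate directly into numbers. The sketch below plugs in the 14.6-micrometer mid-infrared wavelength mentioned in the image caption; only the two simple ratios from the text are used.

```python
# Rough numbers behind the diffraction limit discussed above:
# conventional far-field optics cannot resolve features much smaller
# than half the wavelength, while the evanescent near field peaks
# about a third of a wavelength from the illuminated surface.

def diffraction_limit_um(wavelength_um):
    """Approximate smallest resolvable feature: half the wavelength."""
    return wavelength_um / 2.0

def evanescent_peak_um(wavelength_um):
    """Approximate distance at which the evanescent field peaks."""
    return wavelength_um / 3.0

wavelength = 14.6  # micrometers, the mid-IR wavelength used in the experiment
print(diffraction_limit_um(wavelength))  # 7.3
print(evanescent_peak_um(wavelength))    # ~4.87
```

For the subwavelength strontium ruthenate rectangles in the experiment, features below roughly 7.3 micrometers are invisible to conventional mid-IR optics, which is what makes the superlens's capture of evanescent light necessary.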

    Perovskite superlens setup and image.

    This atomic-force microscopy image shows the subwavelength strontium ruthenate rectangles that were imaged with a perovskite-based superlens using incident IR light of 14.6-micrometer wavelength. (Image from Kehr et al.)

    See the full article here.

    Also, see a fuller explanation direct from Berkeley Lab here.

     