Tagged: X-ray Technology

  • richardmitnick 11:47 am on December 2, 2019 Permalink | Reply
    Tags: X-ray laser, X-ray Technology

    From SLAC National Accelerator Lab: “SLAC scientists invent a way to see attosecond electron motions with an X-ray laser” 

    From SLAC National Accelerator Lab

    December 2, 2019
    Manuel Gnida
    (650) 926-2632

    Called XLEAP, the new method will provide sharp views of electrons in chemical processes that take place in billionths of a billionth of a second and drive crucial aspects of life.

    Researchers at the Department of Energy’s SLAC National Accelerator Laboratory have invented a way to observe the movements of electrons with powerful X-ray laser bursts just 280 attoseconds, or billionths of a billionth of a second, long.

    A SLAC-led team has invented a method, called XLEAP, that generates powerful low-energy X-ray laser pulses that are only 280 attoseconds, or billionths of a billionth of a second, long and that can reveal for the first time the fastest motions of electrons that drive chemistry. This illustration shows how the scientists use a series of magnets to transform an electron bunch (blue shape at left) at SLAC’s Linac Coherent Light Source into a narrow current spike (blue shape at right), which then produces a very intense attosecond X-ray flash (yellow). (Greg Stewart/SLAC National Accelerator Laboratory)


    The technology, called X-ray laser-enhanced attosecond pulse generation (XLEAP), is a big advance that scientists have been working toward for years, and it paves the way for breakthrough studies of how electrons speeding around molecules initiate crucial processes in biology, chemistry, materials science and more.

    The team presented their method today in an article in Nature Photonics.

    “Until now, we could precisely observe the motions of atomic nuclei, but the much faster electron motions that actually drive chemical reactions were blurred out,” said SLAC scientist James Cryan, one of the paper’s lead authors and an investigator with the Stanford PULSE Institute, a joint institute of SLAC and Stanford University. “With this advance, we’ll be able to use an X-ray laser to see how electrons move around and how that sets the stage for the chemistry that follows. It pushes the frontiers of ultrafast science.”

    Studies on these timescales could reveal, for example, how the absorption of light during photosynthesis almost instantaneously pushes electrons around and initiates a cascade of much slower events that ultimately generate oxygen.

    “With XLEAP we can create X-ray pulses with just the right energy that are more than a million times brighter than attosecond pulses of similar energy before,” said SLAC scientist Agostino Marinelli, XLEAP project lead and one of the paper’s lead authors. “It’ll let us do so many things people have always wanted to do with an X-ray laser – and now also on attosecond timescales.”

    A leap for ultrafast X-ray science

    One attosecond is an incredibly short period of time – two attoseconds is to a second as one second is to the age of the universe. In recent years, scientists have made a lot of progress in creating attosecond X-ray pulses. However, these pulses were either too weak or they didn’t have the right energy to home in on speedy electron motions.
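    The scale analogy above is easy to verify with back-of-envelope arithmetic (taking the age of the universe as roughly 13.8 billion years):

    ```python
    # How many attoseconds stand to one second as one second stands to
    # the age of the universe? Illustrative arithmetic only.
    SECONDS_PER_YEAR = 365.25 * 24 * 3600
    AGE_OF_UNIVERSE_S = 13.8e9 * SECONDS_PER_YEAR  # ~4.35e17 seconds

    # x / (1 s) == (1 s) / age_of_universe  =>  x = 1 / age_of_universe seconds
    x_seconds = 1.0 / AGE_OF_UNIVERSE_S
    x_attoseconds = x_seconds / 1e-18

    print(f"{x_attoseconds:.1f} attoseconds")  # roughly 2.3 as
    ```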

    Over the past three years, Marinelli and his colleagues have been figuring out how an X-ray laser method suggested 14 years ago [Physical Review Accelerators and Beams] could be used to generate pulses with the right properties – an effort that resulted in XLEAP.

    In experiments carried out just before crews began work on a major upgrade of SLAC’s Linac Coherent Light Source (LCLS) X-ray laser, the XLEAP team demonstrated that they can produce precisely timed pairs of attosecond X-ray pulses that can set electrons in motion and then record those movements. These snapshots can be strung together into stop-action movies.

    Linda Young, an expert in X-ray science at DOE’s Argonne National Laboratory and the University of Chicago who was not involved in the study, said, “XLEAP is a truly great advance. Its attosecond X-ray pulses of unprecedented intensity and flexibility are a breakthrough tool to observe and control electron motion at individual atomic sites in complex systems.”

    X-ray lasers like LCLS routinely generate light flashes that last a few millionths of a billionth of a second, or femtoseconds. The process starts with creating a beam of electrons, which are bundled into short bunches and sent through a linear particle accelerator, where they gain energy. Travelling at almost the speed of light, they pass through a magnet known as an undulator, where some of their energy is converted into X-ray bursts.

    The shorter and brighter the electron bunches, the shorter the X-ray bursts they create, so one approach for making attosecond X-ray pulses is to compress the electrons into smaller and smaller bunches with high peak brightness. XLEAP is a clever way to do just that.

    Making attosecond X-ray laser pulses

    At LCLS, the team inserted two sets of magnets in front of the undulator that allowed them to mold each electron bunch into the required shape: an intense, narrow spike containing electrons with a broad range of energies.

    Schematic of the XLEAP experiment at SLAC’s Linac Coherent Light Source (LCLS) X-ray laser. LCLS sends bunches of high-energy electrons (green) through an undulator magnet, where electron energy gets converted into extremely bright X-ray pulses (blue) of a few femtoseconds, or millionths of a billionth of a second. In the XLEAP configuration, electron bunches pass two additional sets of magnets (wiggler and chicane) that shape each electron bunch into an intense, narrow spike containing electrons with a broad range of energies. The spikes then produce attosecond X-ray pulses in the undulator. The XLEAP team also developed a customized pulse analyzer (right) to measure the extremely short pulse lengths. (Greg Stewart/SLAC National Accelerator Laboratory)

    “When we send these spikes, which have pulse lengths of about a femtosecond, through the undulator, they produce X-ray pulses that are much shorter than that,” said Joseph Duris, a SLAC staff scientist and paper co-first-author. The pulses are also extremely powerful, he said, with some of them reaching half a terawatt peak power.

    To measure these incredibly short X-ray pulses, the scientists designed a special device in which the X-rays shoot through a gas and strip off some of its electrons, creating an electron cloud. Circularly polarized light from an infrared laser interacts with the cloud and gives the electrons a kick. Because of the light’s particular polarization, some of the electrons end up moving faster than others.

    “The technique works similar to another idea implemented at LCLS, which maps time onto angles like the arms of a clock,” said Siqi Li, a paper co-first-author and recent Stanford PhD. “It allows us to measure the distribution of the electron speeds and directions, and from that we can calculate the X-ray pulse length.”
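    The clock analogy can be caricatured in a few lines. This is only a sketch of the mapping idea, not the team’s analysis: the IR period, pulse length, and noise-free mapping are all invented for illustration, and a real angular-streaking measurement is far more involved.

    ```python
    import numpy as np

    # A circularly polarized IR field rotates with period T, so an electron
    # released at time t gets its momentum kick at angle theta = 2*pi*t/T.
    # The angular spread of the kicks then encodes the X-ray pulse length.
    IR_PERIOD_FS = 10.0   # assumed IR cycle, femtoseconds (made up)
    PULSE_LEN_FS = 0.28   # 280 attoseconds FWHM, as in the article

    rng = np.random.default_rng(0)
    # Emission times drawn from the pulse envelope (FWHM -> sigma: /2.355)
    emission_times = rng.normal(0.0, PULSE_LEN_FS / 2.355, 10_000)

    angles = 2 * np.pi * emission_times / IR_PERIOD_FS  # time mapped to angle

    # Invert the mapping: angular spread back to a pulse-duration estimate
    sigma_t = np.std(angles) * IR_PERIOD_FS / (2 * np.pi)
    print(f"recovered FWHM ≈ {2.355 * sigma_t * 1000:.0f} attoseconds")
    ```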

    Next, the XLEAP team will further optimize their method, which could lead to even more intense and possibly shorter pulses. They are also preparing for LCLS-II, the upgrade of LCLS that will fire up to a million X-ray pulses per second – 8,000 times faster than before. This will allow researchers to do experiments they have long dreamed of, such as studies of individual molecules and their behavior on nature’s fastest timescales.

    The XLEAP team included researchers from SLAC; Stanford University; Imperial College, UK; Max Planck Institute for Quantum Optics, Ludwig-Maximilians University Munich, Kassel University, Technical University Dortmund and Technical University Munich in Germany; and DOE’s Argonne National Laboratory. Large portions of this project were funded by the DOE Office of Science and through DOE’s Laboratory Directed Research and Development (LDRD) program. LCLS is a DOE Office of Science user facility.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition



    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

  • richardmitnick 1:29 pm on November 8, 2019 Permalink | Reply
    Tags: "Machine Learning Enhances Light-Beam Performance at the Advanced Light Source", And little tweaks to enhance light-beam properties at these individual beamlines can feed back into the overall light-beam performance across the entire facility., Environmental science, Many of these synchrotron facilities deliver different types of light for dozens of simultaneous experiments., STXM-scanning transmission X-ray microscopy, X-ray Technology

    From Lawrence Berkeley National Lab: “Machine Learning Enhances Light-Beam Performance at the Advanced Light Source” 


    From Lawrence Berkeley National Lab

    November 8, 2019
    Glenn Roberts Jr.

    Successful demonstration of algorithm by Berkeley Lab-UC Berkeley team shows technique could be viable for scientific light sources around the globe.

    Some members of the team that developed the machine-learning tool for the Advanced Light Source (ALS) are pictured in the ALS control room. Top row, from left: Changchun Sun, Simon Leemann, and Alex Hexemer. Bottom row, from left: Hiroshi Nishimura, C. Nathan Melton, and Yuping Lu. (Credit: Marilyn Chung/Berkeley Lab)

    Synchrotron light sources are powerful facilities that produce light in a variety of “colors,” or wavelengths – from the infrared to X-rays – by accelerating electrons to emit light in controlled beams.

    Synchrotrons like the Advanced Light Source at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) allow scientists to explore samples in a variety of ways using this light, in fields ranging from materials science, biology, and chemistry to physics and environmental science.

    LBNL Advanced Light Source

    This image shows the profile of an electron beam at Berkeley Lab’s Advanced Light Source synchrotron, represented as pixels measured by a charged coupled device (CCD) sensor. When stabilized by a machine-learning algorithm, the beam has a horizontal size dimension of 49 microns (root mean squared) and vertical size dimension of 48 microns (root mean squared). Demanding experiments require that the corresponding light-beam size be stable on time scales ranging from less than seconds to hours to ensure reliable data. (Credit: Lawrence Berkeley National Laboratory)

    Researchers have found ways to upgrade these machines to produce more intense, focused, and consistent light beams that enable new, and more complex and detailed studies across a broad range of sample types.

    But some light-beam properties still exhibit fluctuations in performance that present challenges for certain experiments.

    Addressing a decades-old problem

    Many of these synchrotron facilities deliver different types of light for dozens of simultaneous experiments. And little tweaks to enhance light-beam properties at these individual beamlines can feed back into the overall light-beam performance across the entire facility. Synchrotron designers and operators have wrestled for decades with a variety of approaches to compensate for the most stubborn of these fluctuations.

    And now, a large team of researchers at Berkeley Lab and UC Berkeley has successfully demonstrated how machine-learning tools can improve the stability of the light beams’ size for experiments via adjustments that largely cancel out these fluctuations – reducing them from a level of a few percent down to 0.4 percent, with submicron (below 1 millionth of a meter) precision.

    The tools are detailed in a study published Nov. 6 in the journal Physical Review Letters.

    This chart shows how vertical beam-size stability greatly improves when a neural network is implemented during Advanced Light Source operations. When the so-called “feed-forward” correction is implemented, the fluctuations in the vertical beam size are stabilized down to the sub-percent level (see yellow-highlighted section) from levels that otherwise range to several percent. (Credit: Lawrence Berkeley National Laboratory)

    Machine learning is a form of artificial intelligence in which computer systems analyze a set of data to build predictive programs that solve complex problems. The machine-learning algorithms used at the ALS are referred to as a form of “neural network” because they are designed to recognize patterns in the data in a way that loosely resembles human brain functions.

    In this study, researchers fed electron-beam data from the ALS, which included the positions of the magnetic devices used to produce light from the electron beam, into the neural network. The neural network recognized patterns in this data and identified how different device parameters affected the width of the electron beam. The machine-learning algorithm also recommended adjustments to the magnets to optimize the electron beam.

    Because the size of the electron beam mirrors the resulting light beam produced by the magnets, the algorithm also optimized the light beam that is used to study material properties at the ALS.

    Solution could have global impact

    The successful demonstration at the ALS shows how the technique could also generally be applied to other light sources, and will be especially beneficial for specialized studies enabled by an upgrade of the ALS known as the ALS-U project.

    “That’s the beauty of this,” said Hiroshi Nishimura, a Berkeley Lab affiliate who retired last year and had engaged in early discussions and explorations of a machine-learning solution to the longstanding light-beam size-stability problem. “Whatever the accelerator is, and whatever the conventional solution is, this solution can be on top of that.”

    Steve Kevan, ALS director, said, “This is a very important advance for the ALS and ALS-U. For several years we’ve had trouble with artifacts in the images from our X-ray microscopes. This study presents a new feed-forward approach based on machine learning, and it has largely solved the problem.”

    The ALS-U project will increase the narrow focus of light beams from a level of around 100 microns down to below 10 microns and also create a higher demand for consistent, reliable light-beam properties.

    An exterior view of the Advanced Light Source dome that houses dozens of beamlines. (Credit: Roy Kaltschmidt/Berkeley Lab)

    The machine-learning technique builds upon conventional solutions that have been improved over the decades since the ALS started up in 1993, and which rely on constant adjustments to magnets along the ALS ring that compensate in real time for adjustments at individual beamlines.

    Nishimura, who had been a part of the team that brought the ALS online more than 25 years ago, said he began to study the potential application of machine-learning tools for accelerator applications about four or five years ago. His conversations extended to experts in computing and accelerators at Berkeley Lab and at UC Berkeley, and the concept began to gel about two years ago.

    Successful testing during ALS operations

    Researchers successfully tested the algorithm at two different sites around the ALS ring earlier this year. They alerted ALS users conducting experiments about the testing of the new algorithm, and asked them to give feedback on any unexpected performance issues.

    “We had consistent tests in user operations from April to June this year,” said C. Nathan Melton, a postdoctoral fellow at the ALS who joined the machine-learning team in 2018 and worked closely with Shuai Liu, a former UC Berkeley graduate student who contributed considerably to the effort and is a co-author of the study.

    Simon Leemann, deputy for Accelerator Operations and Development at the ALS and the principal investigator in the machine-learning effort, said, “We didn’t have any negative feedback to the testing. One of the monitoring beamlines the team used is a diagnostic beamline that constantly measures accelerator performance, and another was a beamline where experiments were actively running.” Alex Hexemer, a senior scientist at the ALS and program lead for computing, served as the co-lead in developing the new tool.

    The beamline with the active experiments uses a technique known as scanning transmission X-ray microscopy, or STXM, and scientists there reported improved light-beam performance in experiments.

    The machine-learning team noted that the enhanced light-beam performance is also well-suited for advanced X-ray techniques such as ptychography, which can resolve the structure of samples down to the level of nanometers (billionths of a meter); and X-ray photon correlation spectroscopy, or XPCS, which is useful for studying rapid changes in highly concentrated materials that don’t have a uniform structure.

    Other experiments that demand a reliable, highly focused light beam of constant intensity where it interacts with the sample can also benefit from the machine-learning enhancement, Leemann noted.

    “Experiments’ requirements are getting tougher, with smaller-area scans on samples,” he said. “We have to find new ways for correcting these imperfections.”

    He noted that the core problem that the light-source community has wrestled with – and that the machine-learning tools address – is the fluctuating vertical electron beam size at the source point of the beamline.

    The source point is the point where the electron beam at the light source emits the light that travels to a specific beamline’s experiment. While the electron beam’s width at this point is naturally stable, its height (or vertical source size) can fluctuate.

    Opening the ‘black box’ of artificial intelligence

    “This is a very nice example of team science,” Leemann said, noting that the effort overcame some initial skepticism about the viability of machine learning for enhancing accelerator performance, and opened up the “black box” of how such tools can produce real benefits.

    “This is not a tool that has traditionally been a part of the accelerator community. We managed to bring people from two different communities together to fix a really tough problem.” About 15 Berkeley Lab researchers participated in the effort.

    “Machine learning fundamentally requires two things: The problem needs to be reproducible, and you need huge amounts of data,” Leemann said. “We realized we could put all of our data to use and have an algorithm recognize patterns.”

    The data showed the little blips in electron-beam performance as adjustments were made at individual beamlines, and the algorithm found a way to tune the electron beam so that it negated this impact better than conventional methods could.

    “The problem consists of roughly 35 parameters – way too complex for us to figure out ourselves,” Leemann said. “What the neural network did once it was trained – it gave us a prediction for what would happen for the source size in the machine if it did nothing at all to correct it.

    “There is an additional parameter in this model that describes how the changes we make in a certain type of magnet affects that source size. So all we then have to do is choose the parameter that – according to this neural-network prediction – results in the beam size we want to create and apply that to the machine,” Leemann added.
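    Leemann’s two-step recipe, predict the uncorrected source size, then apply the magnet change that cancels it, can be caricatured in miniature. This is a heavily simplified sketch, not the Lab’s actual model: ordinary least squares stands in for the trained neural network, and every number (baseline, sensitivities, noise) is invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    TRUE_SENSITIVITY = 0.8   # beam-size change per unit device motion (made up)
    CORRECTOR_GAIN = 1.0     # beam-size change per unit corrector (made up)
    BASELINE = 50.0          # nominal vertical beam size, microns (made up)

    # "Historical" data: insertion-device positions and resulting beam sizes
    device = rng.uniform(-1, 1, 500)
    size = BASELINE + TRUE_SENSITIVITY * device + rng.normal(0, 0.05, 500)

    # Fit the disturbance model (least squares in place of the neural net)
    A = np.vstack([device, np.ones_like(device)]).T
    sens, base = np.linalg.lstsq(A, size, rcond=None)[0]

    # Feed-forward: predict the disturbance from a new device move, cancel it
    new_device = 0.7
    corrector = -(sens * new_device) / CORRECTOR_GAIN
    corrected_size = (BASELINE + TRUE_SENSITIVITY * new_device
                      + CORRECTOR_GAIN * corrector)
    print(f"uncorrected shift {TRUE_SENSITIVITY * new_device:.2f} um, "
          f"corrected shift {corrected_size - BASELINE:.3f} um")
    ```

    The toy model is linear, so the cancellation is nearly exact; the real problem has roughly 35 coupled parameters, which is why a neural network is used instead.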

    The algorithm-directed system can now make corrections at a rate of up to 10 times per second, though three times a second appears to be adequate for improving performance at this stage, Leemann said.

    The search for new machine-learning applications

    The machine-learning team received two years of funding from the U.S. Department of Energy in August 2018 to pursue this and other machine-learning projects in collaboration with the Stanford Synchrotron Radiation Lightsource at SLAC National Accelerator Laboratory. “We have plans to keep developing this and we also have a couple of new machine-learning ideas we’d like to try out,” Leemann said.


    Nishimura said that the buzzword “artificial intelligence” has trended in and out of the research community for many years, though, “This time it finally seems to be something real.”

    The Advanced Light Source and Stanford Synchrotron Radiation Lightsource are DOE Office of Science User Facilities. This work involved researchers in Berkeley Lab’s Computational Research Division and was supported by the Department of Energy’s Basic Energy Sciences and Advanced Scientific Computing Research programs.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    Bringing Science Solutions to the World
    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (UC) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a UC Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    A U.S. Department of Energy National Laboratory Operated by the University of California.


  • richardmitnick 7:47 am on September 2, 2019 Permalink | Reply
    Tags: "Physicists Have Finally Built a Quantum X-Ray Device", Bar-Ilan University, PDC-parametric down-conversion, Quantum enhancement, Quantum illumination, Quantum imaging, X-ray PDC, X-ray Technology

    From Bar-Ilan University and RIKEN via Science Alert: “Physicists Have Finally Built a Quantum X-Ray Device” 


    From Bar-Ilan University



    From RIKEN



    Science Alert

    2 SEP 2019

    (APS/Alan Stonebraker)

    A team of researchers has just demonstrated quantum enhancement in an actual X-ray machine, achieving the desirable goal of eliminating background noise for precision detection.

    The relationships between photon pairs on quantum scales can be exploited to create sharper, higher-resolution images than classical optics. This emerging field is called quantum imaging, and it has some really impressive potential – particularly since, using optical light, it can be used to show objects that can’t usually be seen, like bones and organs.

    Quantum correlation describes a number of different relationships between photon pairs. Entanglement is one of these, and is applied in optical quantum imaging.

    But the technical challenges of generating entangled photons in X-ray wavelengths are considerably greater than for optical light, so in the building of their quantum X-ray, the team took a different approach.

    They used a technique called quantum illumination to minimise background noise. Usually, this uses entangled photons, but weaker correlations work, too. Using a process called parametric down-conversion (PDC), the researchers split a high-energy – or “pump” – photon into two lower-energy photons, called a signal photon and an idler photon.

    “X-ray PDC has been demonstrated by several authors, and the application of the effect as a source for ghost imaging has been demonstrated recently,” the researchers write in their paper.

    “However, in all previous publications, the photon statistics have not been measured. Essentially, to date, there is no experimental evidence that photons, which are generated by X-ray PDC, exhibit statistics of quantum states of radiation. Likewise, observations of the quantum enhanced measurement sensitivity have never been reported at X-ray wavelengths.”

    The researchers achieved their X-ray PDC with a diamond crystal. The nonlinear structure of the crystal splits a beam of pump X-ray photons into signal and idler beams, each with half the energy of the pump beam.

    Normally, this process is very inefficient using X-rays, so the team scaled up the power. Using the SPring-8 synchrotron in Japan, they shot a 22 keV beam of X-rays at their crystal, which split into two beams, each carrying 11 keV.
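    The energy bookkeeping in that split is worth making explicit: the pump photon’s energy is conserved and shared between signal and idler. A minimal check, using the article’s numbers (the exactly equal split is the degenerate case; in general the two photons need not carry the same energy):

    ```python
    # Degenerate parametric down-conversion: one pump photon becomes a
    # signal/idler pair whose energies sum to the pump energy.
    PUMP_KEV = 22.0
    signal_kev = idler_kev = PUMP_KEV / 2  # degenerate case: 11 keV each

    assert signal_kev + idler_kev == PUMP_KEV  # energy conservation
    print(signal_kev, idler_kev)  # 11.0 11.0
    ```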

    SPring-8 synchrotron, located in Hyōgo Prefecture, Japan

    The signal beam is sent towards the object to be imaged – in the case of this research, a small piece of metal with three slits – with a detector on the other side. The idler beam is sent straight to a different detector. This is set up so that each beam hits its respective detector at the same place and at the same time.

    “The perfect time-energy relationship we observed could only mean that the two photons were quantum correlated,” said physicist Sason Sofer of Bar-Ilan University in Israel.

    For the next step, the researchers compared their detections. There were only around 100 correlated photons per point in the image, and around 10,000 more background photons. But the researchers could match each idler to a signal, so they could actually tell which photons in the image were from the beam, thus easily separating out the background noise.
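    The idler-matching step amounts to a coincidence filter. The toy sketch below invents the counts and timing model (real detectors resolve position and energy as well), but it shows why pairing lets ~100 correlated photons stand out against ~10,000 background photons:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # 100 correlated pairs: each signal photon shares a timestamp with an idler
    idler_times = np.sort(rng.uniform(0.0, 1.0, 100))
    signal_times = idler_times.copy()                # perfectly correlated (toy)
    background_times = rng.uniform(0.0, 1.0, 10_000)  # uncorrelated noise

    detected = np.concatenate([signal_times, background_times])

    # Keep a detected photon only if an idler arrived within the window
    WINDOW = 1e-6
    matches = np.abs(detected[:, None] - idler_times[None, :]).min(axis=1) < WINDOW
    print(f"kept {matches.sum()} of {detected.size} detected photons")
    ```

    All 100 signal photons survive the filter while almost all background photons are rejected, which is the sense in which the team could “easily separate out the background noise.”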

    They then compared these images to images taken using regular, non-correlated photons – and the correlated photons clearly produced a much sharper image.

    It’s early days yet, but it’s definitely a step in the right direction for what could be a greatly exciting tool. Quantum X-ray imaging could have a number of uses outside the range of current X-ray technology.

    One promise is that it could lower the amount of radiation required for X-ray imaging. This would mean that samples easily damaged by X-rays could be imaged, or samples that require low temperatures; less radiation would mean less heat. It could also enable physicists to X-ray atomic nuclei to see what’s inside.

    Obviously, since these quantum X-rays require a hardcore particle accelerator, medical applications are currently off the table. The team has demonstrated that it can be done, but scaling down is going to be tricky.

    Currently, determining whether the photons are entangled is the next step. That would require the photons’ arrival at the detectors to be measured within attosecond scales, which is beyond our current technology.

    Still, this is a pretty amazing achievement.

    “We have demonstrated the ability to utilise the strong time-energy correlations of photon pairs for quantum enhanced photodetection. The procedure we have presented possesses great potential for improving the performances of X-ray measurements,” the researchers write.

    “We anticipate that this work will open the way for more quantum enhanced x-ray regime detection schemes, including the area of diffraction and spectroscopy.”

    The research has been published in Physical Review X.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    RIKEN campus

    RIKEN is Japan’s largest comprehensive research institution renowned for high-quality research in a diverse range of scientific disciplines. Founded in 1917 as a private research foundation in Tokyo, RIKEN has grown rapidly in size and scope, today encompassing a network of world-class research centers and institutes across Japan.

  • richardmitnick 7:09 am on August 30, 2019 Permalink | Reply
    Tags: Small-angle x-ray scattering, X-ray Technology

    From Brookhaven National Lab: “Smarter Experiments for Faster Materials Discovery” 

    From Brookhaven National Lab

    August 28, 2019
    Cara Laasch,
    (631) 344-8458,

    Peter Genzer,
    (631) 344-3174

    Scientists created a new AI algorithm for making measurement decisions; autonomous approach could revolutionize scientific experiments.

    (From left to right) Kevin Yager, Masafumi Fukuto, and Ruipeng Li prepared the Complex Materials Scattering (CMS) beamline at NSLS-II for a measurement using the new decision-making algorithm, which was developed by Marcus Noack (not pictured).

    A team of scientists from the U.S. Department of Energy’s Brookhaven National Laboratory and Lawrence Berkeley National Laboratory designed, created, and successfully tested a new algorithm to make smarter scientific measurement decisions.

    The algorithm, a form of artificial intelligence (AI), can make autonomous decisions to define and perform the next step of an experiment. The team described the capabilities and flexibility of their new measurement tool in a paper published on August 14, 2019 in the journal Scientific Reports.

    From Galileo and Newton to the recent discovery of gravitational waves, performing scientific experiments to understand the world around us has been the driving force of our technological advancement for hundreds of years. Improving the way researchers do their experiments can have tremendous impact on how quickly those experiments yield applicable results for new technologies.

    Over the last decades, researchers have sped up their experiments through automation and an ever-growing assortment of fast measurement tools. However, some of the most interesting and important scientific challenges—such as creating improved battery materials for energy storage or new quantum materials for new types of computers—still require very demanding and time-consuming experiments.

    By creating a new decision-making algorithm as part of a fully automated experimental setup, the interdisciplinary team from two of Brookhaven’s DOE Office of Science user facilities—the Center for Functional Nanomaterials (CFN) and the National Synchrotron Light Source II (NSLS-II)—and Berkeley Lab’s Center for Advanced Mathematics for Energy Research Applications (CAMERA) offers the possibility to study these challenges in a more efficient fashion.


    The challenge of complexity

    The goal of many experiments is to gain knowledge about the material that is studied, and scientists have a well-tested way to do this: They take a sample of the material and measure how it reacts to changes in its environment.

    A standard approach for scientists at user facilities like NSLS-II and CFN is to manually scan through the measurements from a given experiment to determine the next area where they might want to run an experiment. But access to these facilities’ high-end materials-characterization tools is limited, so measurement time is precious. A research team might only have a few days to measure their materials, so they need to make the most out of each measurement.

    “The key to achieving a minimum number of measurements and maximum quality of the resulting model is to go where uncertainties are large,” said Marcus Noack, a postdoctoral scholar at CAMERA and lead author of the study. “Performing measurements there will most effectively reduce the overall model uncertainty.”

    As Kevin Yager, a co-author and CFN scientist, pointed out, “The final goal is not only to take data faster but also to improve the quality of the data we collect. I think of it as experimentalists switching from micromanaging their experiment to managing at a higher level. Instead of having to decide where to measure next on the sample, the scientists can instead think about the big picture, which is ultimately what we as scientists are trying to do.”

    “This new approach is an applied example of artificial intelligence,” said co-author Masafumi Fukuto, a scientist at NSLS-II. “The decision-making algorithm is replacing the intuition of the human experimenter and can scan through the data and make smart decisions about how the experiment should proceed.”

    This animation shows a comparison between a traditional grid measurement (left) of a sample with a measurement steered by the newly-developed decision-making algorithm (right). This comparison shows that the algorithm can identify the edges and inner part of the sample and focuses the measurement in these regions to gain more knowledge about the sample.

    More information for less?

    In practice, before starting an experiment, the scientists define a set of goals they want the measurement to achieve. With those goals set, the algorithm examines the previously measured data while the experiment is ongoing to determine the next measurement.

    In its search for the best next measurement, the algorithm builds a surrogate model of the data—an educated guess as to how the material will behave in the next possible steps—and calculates the uncertainty, basically how confident it is in its guess, for each possible next step. It then selects the most uncertain option to measure next. The trick is that by picking the most uncertain step, the algorithm maximizes the amount of knowledge it gains from each measurement. Beyond maximizing information gain, the algorithm also defines when to end the experiment: it identifies the moment at which any additional measurements would yield no more knowledge.
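    The selection rule described above (fit a surrogate model, compute its uncertainty, measure where the uncertainty is highest) can be sketched with a toy Gaussian-process surrogate. This is an illustrative stand-in, not the team's actual algorithm; the kernel choice, length scale, and noise level are all assumptions.

```python
import numpy as np

def rbf_kernel(a, b, length=0.5):
    # Squared-exponential correlation between two sets of 1-D positions
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def posterior_variance(x_meas, x_cand, noise=1e-6):
    # Predictive variance of a unit-variance GP at each candidate location
    K = rbf_kernel(x_meas, x_meas) + noise * np.eye(len(x_meas))
    Ks = rbf_kernel(x_meas, x_cand)
    return 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)

def next_measurement(x_meas, x_cand):
    # "Go where uncertainties are large": pick the most uncertain candidate
    var = posterior_variance(x_meas, x_cand)
    return x_cand[np.argmax(var)], var.max()

measured = np.array([0.0, 0.5, 1.0])      # positions already measured
candidates = np.linspace(0.0, 1.0, 101)   # possible next positions
nxt, unc = next_measurement(measured, candidates)
print(nxt)  # lands in one of the widest unmeasured gaps, near 0.25 or 0.75
```

    The same idea generalizes to 2-D sample maps: the surrogate is refit after every measurement and the beam is steered to wherever the model is least sure of itself.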

    “The basic idea is, given a bunch of experiments, how can you automatically pick the next best one?” said James Sethian, director of CAMERA and a co-author of the study. “Marcus has built a world which builds an approximate surrogate model on the basis of your previous experiments and suggests the best or most appropriate experiment to try next.”

    To use the decision-making algorithm for their measurements, the team needed to automate both the measurement and the data analysis. This image shows how all the pieces integrate to form a closed loop: the algorithm receives analyzed data from the last measurement step, adds this data to its model, calculates the best next step, and sends its decision to the beamline to execute the next measurement.
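    Schematically, that closed loop can be written as a simple driver function. Everything below is a hypothetical stand-in for the real beamline control, analysis pipeline, and surrogate model; only the measure-analyze-update-decide structure mirrors the description above.

```python
def run_autonomous_loop(measure, analyze, model, stop_threshold=0.05, max_steps=100):
    """Drive the measure -> analyze -> update -> decide cycle until uncertainty is low."""
    step = model.first_point()
    for _ in range(max_steps):
        raw = measure(step)                        # beamline executes the measurement
        model.update(step, analyze(raw))           # analyzed data refines the surrogate
        step, uncertainty = model.most_uncertain_point()
        if uncertainty < stop_threshold:           # nothing more to learn: stop
            break
    return model

class ToyModel:
    """Toy surrogate: counts unmeasured positions as its 'uncertainty'."""
    def __init__(self, n=10):
        self.values = {}
        self.n = n
    def first_point(self):
        return 0
    def update(self, point, value):
        self.values[point] = value
    def most_uncertain_point(self):
        unmeasured = [i for i in range(self.n) if i not in self.values]
        if not unmeasured:
            return None, 0.0
        return unmeasured[0], len(unmeasured) / self.n

model = run_autonomous_loop(measure=lambda p: p * 2,     # fake detector readout
                            analyze=lambda raw: raw + 1,  # fake data reduction
                            model=ToyModel())
print(len(model.values))  # the loop visits all 10 positions, then stops itself
```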

    How we got here

    To make autonomous experiments a reality, the team had to tackle three important pieces: the automation of the data collection, real-time analysis, and, of course, the decision-making algorithm.

    “This is an exciting part of this collaboration,” said Fukuto. “We all provided an essential piece for it: The CAMERA team worked on the decision-making algorithm, Kevin from CFN developed the real-time data analysis, and we at NSLS-II provided the automation for the measurements.”

    The team first implemented their decision-making algorithm at the Complex Materials Scattering (CMS) beamline at NSLS-II, which the CFN and NSLS-II operate in partnership. This instrument offers ultrabright x-rays to study the nanostructure of various materials. As the lead beamline scientist of this instrument, Fukuto had already designed the beamline with automation in mind. The beamline offers a sample-exchanging robot, automatic sample movement in various directions, and many other helpful tools to ensure fast measurements. Together with Yager’s real-time data analysis, the beamline was—by design—the perfect fit for the first “smart” experiment.

    The first “smart” experiment

    The first fully autonomous experiment the team performed was to map the perimeter of a droplet where nanoparticles segregate using a technique called small-angle x-ray scattering at the CMS beamline. During small-angle x-ray scattering, the scientists shine bright x-rays at the sample and, depending on the atomic to nanoscale structure of the sample, the x-rays bounce off in different directions. The scientists then use a large detector to capture the scattered x-rays and calculate the properties of the sample at the illuminated spot. In this first experiment, the scientists compared the standard approach of measuring the sample with measurements taken when the new decision-making algorithm was calling the shots. The algorithm was able to identify the area of the droplet and focused on its edges and inner parts instead of the background.
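    The connection between the scattering angle captured on the detector and the size of features in the sample follows the standard small-angle scattering relations, sketched below. The 0.1 nm wavelength is a typical hard-X-ray value assumed for illustration, not a parameter quoted in the article.

```python
import math

# Assumed, typical hard-X-ray wavelength (~12.4 keV); not a CMS beamline figure.
WAVELENGTH_NM = 0.1

def momentum_transfer(two_theta_deg, wavelength=WAVELENGTH_NM):
    """Momentum transfer q = (4*pi/lambda) * sin(theta) for scattering angle 2*theta."""
    theta = math.radians(two_theta_deg) / 2
    return 4 * math.pi * math.sin(theta) / wavelength  # in nm^-1

def probed_size(q):
    """Real-space length scale d = 2*pi/q associated with a scattering feature."""
    return 2 * math.pi / q  # in nm

q = momentum_transfer(1.0)        # X-rays deflected by just one degree...
print(round(probed_size(q), 1))   # ...carry information about ~5.7 nm features
```

    The small angles are the point of the technique: the smaller the deflection, the larger the nanostructure it encodes.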

    “After our own initial success, we wanted to apply the algorithm more, so we reached out to a few users and proposed to test our new algorithm on their scientific problems,” said Yager. “They said yes, and since then we have measured various samples. One of the most interesting ones was a study on a sample that was fabricated to contain a spectrum of different material types. So instead of making and measuring an enormous number of samples and maybe missing an interesting combination, the user made one single sample that included all possible combinations. Our algorithm was then able to explore this enormous diversity of combinations efficiently,” he said.

    What’s next?

    After the first successful experiments, the scientists plan to further improve the algorithm and therefore its value to the scientific community. One of their ideas is to make the algorithm “physics-aware”—taking advantage of anything already known about the material under study—so the method can be even more effective. Another development in progress is to use the algorithm during the synthesis and processing of new materials, for example to understand and optimize processes relevant to advanced manufacturing as these materials are incorporated into real-world devices. The team is also thinking about the larger picture and wants to transfer the autonomous method to other experimental setups.

    “I think users view the beamlines of NSLS-II or microscopes of CFN just as powerful characterization tools. We are trying to change these capabilities into a powerful material discovery facility,” Fukuto said.

    This work was funded by the DOE Office of Science (ASCR and BES).

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition



    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

  • richardmitnick 8:28 am on August 18, 2019
    Tags: LCLS, SSRL-Stanford Synchrotron Light Source, X-ray Technology

    From SLAC National Accelerator Lab: “Scientists report two advances in understanding the role of ‘charge stripes’ in superconducting materials” 


    Ali Sundermier
    Glennda Chui

    The studies could lead to a new understanding of how high-temperature superconductors operate.

    High-temperature superconductors, which carry electricity with zero resistance at much higher temperatures than conventional superconducting materials, have generated a lot of excitement since their discovery more than 30 years ago because of their potential for revolutionizing technologies such as maglev trains and long-distance power lines. But scientists still don’t understand how they work.

    One piece of the puzzle is the fact that charge density waves – static stripes of higher and lower electron density running through a material – have been found in one of the major families of high-temperature superconductors, the copper-based cuprates. But do these charge stripes enhance superconductivity, suppress it or play some other role?

    In independent studies, two research teams report important advances in understanding how charge stripes might interact with superconductivity. Both studies were carried out with X-rays at the Department of Energy’s SLAC National Accelerator Laboratory.

    Exquisite detail

    In a paper published today in Science Advances, researchers from the University of Illinois at Urbana-Champaign (UIUC) used SLAC’s Linac Coherent Light Source (LCLS) X-ray free-electron laser to observe fluctuations in charge density waves in a cuprate superconductor.

    This cutaway view shows stripes of higher and lower electron density – “charge stripes” – within a copper-based superconducting material. Experiments with SLAC’s X-ray laser directly observed how those stripes fluctuate when hit with a pulse of light, a step toward understanding how they interact with high-temperature superconductivity. (Greg Stewart/SLAC National Accelerator Laboratory)

    They disturbed the charge density waves with pulses from a conventional laser and then used RIXS, or resonant inelastic X-ray scattering, to watch the waves recover over a period of a few trillionths of a second. This recovery process behaved according to a universal dynamical scaling law: It was the same at all scales, much as a fractal pattern looks the same whether you zoom in or zoom out.

    With LCLS, the scientists were able to measure, for the first time and in exquisite detail, exactly how far and how fast the charge density waves fluctuated. To their surprise, the team discovered that the fluctuations were not like the ringing of a bell or the bouncing of a trampoline; instead, they were more like the slow diffusion of a syrup – a quantum analog of liquid crystal behavior, which had never been seen before in a solid.
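    The qualitative difference the team observed, ringing versus syrup-like diffusion, can be pictured with two toy relaxation curves. The functional forms and time constants below are purely illustrative, not fits to the LCLS data.

```python
import math

# Two toy recovery curves after a pump pulse at t = 0 (arbitrary units).
# An underdamped mode "rings" like a bell and overshoots equilibrium;
# an overdamped, diffusion-like mode creeps back monotonically, like syrup.
# Both time constants are invented for illustration.

def ringing(t, tau=1.0, omega=8.0):
    """Underdamped response: decaying oscillation that crosses equilibrium."""
    return math.exp(-t / tau) * math.cos(omega * t)

def diffusive(t, tau=1.0):
    """Overdamped, diffusive response: monotonic return to equilibrium."""
    return math.exp(-t / tau)

ts = [i * 0.05 for i in range(100)]
# The ringing response goes negative (overshoot); the diffusive one never does.
print(min(ringing(t) for t in ts) < 0, min(diffusive(t) for t in ts) >= 0)
```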

    “Our experiments at LCLS establish a new way to study fluctuations in charge density waves, which could lead to a new understanding of how high-temperature superconductors operate,” says Matteo Mitrano, a postdoctoral researcher in professor Peter Abbamonte’s group at UIUC.

    This team also included researchers from Stanford University, the National Institute of Standards and Technology and Brookhaven National Laboratory.

    Hidden arrangements

    Another study, reported last month in Nature Communications, used X-rays from SLAC’s Stanford Synchrotron Radiation Lightsource (SSRL) to discover two types of charge density wave arrangements, making a new link between these waves and high-temperature superconductivity.


    Led by SLAC scientist Jun-Sik Lee, the research team used RSXS, or resonant soft X-ray scattering, to watch how temperature affected the charge density waves in a cuprate superconductor.

    “This resolves a mismatch in data from previous experiments and charts a new course for fully mapping the behaviors of electrons in these exotic superconducting materials,” Lee says.

    “I believe that exploring new or hidden arrangements, as well as their intertwining phenomena, will contribute to our understanding of high-temperature superconductivity in cuprates, which will inform researchers in their quest to design and develop new superconductors that work at warmer temperatures.”

    The team also included researchers from Stanford, Pohang Accelerator Laboratory in South Korea and Tohoku University in Japan.

    SSRL and LCLS are DOE Office of Science user facilities. Both studies were supported by the Office of Science.

    See the full article here.




    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

  • richardmitnick 10:18 am on August 12, 2019
    Tags: Cryomodules and Cavities, Fermilab modified a cryomodule design from DESY in Germany, LCLS-II will provide a staggering million pulses per second., Lined up end to end 37 cryomodules will power the LCLS-II XFEL., SLAC’s linear particle accelerator, X-ray Technology

    From Fermi National Accelerator Lab: “A million pulses per second: How particle accelerators are powering X-ray lasers” 

    FNAL Art Image by Angela Gonzales

    From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    August 12, 2019
    Caitlyn Buongiorno

    About 10 years ago, the world’s most powerful X-ray laser — the Linac Coherent Light Source — made its debut at SLAC National Accelerator Laboratory. Now the next revolutionary X-ray laser in a class of its own, LCLS-II, is under construction at SLAC, with support from four other DOE national laboratories.


    Researchers in biology, chemistry and physics will use LCLS-II to probe fundamental pieces of matter, creating 3-D movies of complex molecules in action, capabilities that will make LCLS-II a powerful, versatile instrument at the forefront of discovery.

    The project is coming together thanks largely to a crucial advance in the fields of particle and nuclear physics: superconducting accelerator technology. DOE’s Fermilab and Thomas Jefferson National Accelerator Facility are building the superconducting modules necessary for the accelerator upgrade for LCLS-II.

    SLAC National Accelerator Laboratory is upgrading its Linac Coherent Light Source, an X-ray laser, to be a more powerful tool for science. Both Fermilab and Thomas Jefferson National Accelerator Facility are contributing to the machine’s superconducting accelerator, seen here in the left part of the diagram. Image: SLAC

    A powerful tool for discovery

    Inside SLAC’s linear particle accelerator today, bursts of electrons are accelerated to energies that allow LCLS to fire off 120 X-ray pulses per second. These pulses last for quadrillionths of a second – a time scale known as a femtosecond – providing scientists with a flipbook-like look at molecular processes.

    “Over time, you can build up a molecular movie of how different systems evolve,” said SLAC scientist Mike Dunne, director of LCLS. “That’s proven to be quite remarkable, but it also has a number of limitations. That’s where LCLS-II comes in.”

    Using state-of-the-art particle accelerator technology, LCLS-II will provide a staggering million pulses per second. The advance will provide a more detailed look into how chemical, material and biological systems evolve on a time scale in which chemical bonds are made and broken.

    To really understand the difference, imagine you’re an alien visiting Earth. If you take one image a day of a city, you would notice roads and the cars that drive on them, but you couldn’t tell the speed of the cars or where the cars go. But taking a snapshot every few seconds would give you a highly detailed picture of how cars flow through the roads and would reveal phenomena like traffic jams. LCLS-II will provide this type of step-change information applied to chemical, biological and material processes.
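    The pulse rates quoted above put numbers on that analogy. A rough sketch follows; the pulse rates come from the article, while the femtosecond pulse length is the order of magnitude mentioned earlier, used here only for scale.

```python
# Back-of-the-envelope comparison of LCLS (120 pulses/s) and LCLS-II
# (1,000,000 pulses/s), plus the contrast between a femtosecond-scale
# pulse and the gap between successive pulses.

lcls_rate = 120          # pulses per second, LCLS today
lcls2_rate = 1_000_000   # pulses per second, LCLS-II

speedup = lcls2_rate / lcls_rate
gap_lcls2_s = 1 / lcls2_rate      # one microsecond between LCLS-II pulses
pulse_length_s = 1e-15            # a femtosecond-scale pulse (order of magnitude)

print(round(speedup))                  # ~8333x more snapshots per second
print(gap_lcls2_s / pulse_length_s)    # each gap spans ~1e9 pulse lengths
```

    Even at a million pulses per second, the X-ray flashes themselves occupy only a vanishing fraction of the time: the machine is a strobe light with an extraordinarily fast shutter.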

    To reach this level of detail, SLAC needs to implement technology developed for particle physics – superconducting acceleration cavities – to power the LCLS-II free-electron laser, or XFEL.

    This is an illustration of the electron accelerator of SLAC’s LCLS-II X-ray laser. The first third of the copper accelerator will be replaced with a superconducting one. The red tubes represent cryomodules, which are provided by Fermilab and Jefferson Lab. Image: SLAC

    Accelerating science

    Cavities are structures that impart energy to particle beams, accelerating the particles within them. LCLS-II, like modern particle accelerators, will take advantage of superconducting radio-frequency cavity technology, also called SRF technology. When cooled to 2 Kelvin, the cavities become superconducting, allowing electricity to flow freely, without any resistance. Just as reducing friction makes a heavy object easier to move, eliminating electrical resistance saves energy, allowing accelerators to reach higher power at lower cost.

    “The SRF technology is the enabling step for LCLS-II’s million pulses per second,” Dunne said. “Jefferson Lab and Fermilab have been developing this technology for years. The core expertise to make LCLS-II possible lives at these labs.”

    Fermilab modified a cryomodule design from DESY in Germany and specially prepared the cavities to draw record-setting performance from the cavities and cryomodules that will be used for LCLS-II.

    The cylinder-shaped cryomodules, about a meter in diameter, act as specialized containers for housing the cavities. Inside, ultracold liquid helium continuously flows around the cavities to ensure they maintain the unwavering 2 Kelvin essential for superconductivity. Lined up end to end, 37 cryomodules will power the LCLS-II XFEL.

    See the full article here.




    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

  • richardmitnick 11:06 am on August 9, 2019
    Tags: "Argonne receives go-ahead for $815 million upgrade to X-ray facility", The Advanced Photon Source Upgrade transforms today's APS into a world-leading storage-ring-based hard X-ray light source., X-ray Technology

    From University of Chicago: “Argonne receives go-ahead for $815 million upgrade to X-ray facility” 



    Aug 8, 2019

    A multimillion-dollar upgrade to the Advanced Photon Source, a kilometer-long X-ray science facility at Argonne National Laboratory, will allow scientists to observe atoms moving in real time. Courtesy of Argonne National Laboratory

    Accelerator at UChicago-affiliated lab will boost discovery across scientific fields.

    For the past quarter-century, the Advanced Photon Source at Argonne National Laboratory has helped scientists and engineers make groundbreaking discoveries—providing extremely bright X-rays to investigate everything from dinosaur bones and lunar rocks to materials for new solar panels and new pharmaceutical drugs.

    By accelerating particles to nearly the speed of light, the APS creates X-rays that researchers can use to peer through dense materials and illuminate the structure and chemistry of matter at the molecular and atomic level. Now Argonne, which is operated by the University of Chicago, has been cleared to begin building a massive, $815 million upgrade to its kilometer-long X-ray facility.

    The U.S. Department of Energy recently announced that the design report for the upgrade has been finalized and that the laboratory could begin moving forward with procurement and construction. Upon completion, the upgrade will equip scientists with a vastly more powerful tool for extending their research into new realms, accelerating impactful discoveries in science and technology.

    The Advanced Photon Source Upgrade transforms today’s APS into a world-leading, storage-ring-based, hard X-ray light source.

    “This project will be a scientific game-changer,” said Argonne scientist Robert Hettel, director of the APS Upgrade project. “The APS upgrade will allow researchers to see things at a scale they have never seen before with storage-ring X-rays. We’ll be able to look deep inside real samples, such as biological organisms, and observe atoms moving in real time. Such extreme levels of detail will open new frontiers and discoveries in basic science and help solve pressing problems across a wide range of industries.”

    Among potential discoveries are revolutionary systems to convert sunlight into energy and ways to store that energy; new drugs to treat infections resistant to today’s antibiotics; a better understanding of the way the brain processes and stores information with neurons; detailed mechanisms by which pollutants move through soil; transformational understanding of the structure of the Earth’s inner core; and cleaner, more efficient biofuels.

    “Virtually every department in the sciences and engineering here at the University of Chicago has multiple faculty members whose research relies on the Advanced Photon Source, from molecular engineers, to geoscientists to astronomers,” said Juan de Pablo, vice president for national laboratories and the Liew Family Professor in Molecular Engineering at UChicago. “It’s an extraordinary resource for the nation, and we are thrilled to see it go forward and to contribute towards its development.”

    The upgrade will increase the brightness of the already super-bright X-rays another 100 to 1,000 times over the present facility, and depending on the technique used, will allow scientists to map the position, identity and dynamics of the key atoms in a sample.

    Science at an even bigger scale

    Every year, more than 5,500 researchers from every U.S. state and countries across the world conduct experiments at the APS. That research has led to two Nobel Prizes (and contributed to a third), supported the development of numerous pharmaceuticals (including one of the most successful drugs to stop the progression of HIV), and improved products including more efficient vehicles and more powerful electronics.

    Scientists at the APS have also uncovered secrets of history and archaeology by studying the composition of an ancient Egyptian mummy and the arms of SUE, the Tyrannosaurus rex specimen at The Field Museum of Chicago.

    In addition, APS research has increased our understanding of our solar system and the Earth itself through studies of meteorites, space dust and geological rocks and minerals.

    “The upgraded APS will enable science at a completely new scale, enabling discoveries across a wide range of research—from microelectronics to polymers to quantum,” said Paul Kearns, director of Argonne National Laboratory.

    The upgrade comes as Argonne also prepares to host what will be the most powerful supercomputer ever built in the U.S. Called “Aurora,” it will be capable of a quintillion—or one billion billion—calculations per second.


    “It’s an exciting time as Argonne is building two powerful facilities for the world’s scientific community,” said Kearns. “Together, the upgraded APS and our Aurora exascale computing system will provide powerful new capabilities to accelerate science and technology for U.S. prosperity and security.”

    Removal of the old storage ring and installation of the new one is planned to begin in June 2022. This installation and subsequent ring commissioning period will last for about one year, after which the APS-U X-ray beamlines will be brought online for researchers.

    See the full article here.




    An intellectual destination

    One of the world’s premier academic and research institutions, the University of Chicago has driven new ways of thinking since our 1890 founding. Today, UChicago is an intellectual destination that draws inspired scholars to our Hyde Park and international campuses, keeping UChicago at the nexus of ideas that challenge and change the world.

    The University of Chicago is an urban research university that has driven new ways of thinking since 1890. Our commitment to free and open inquiry draws inspired scholars to our global campuses, where ideas are born that challenge and change the world.

    We empower individuals to challenge conventional thinking in pursuit of original ideas. Students in the College develop critical, analytic, and writing skills in our rigorous, interdisciplinary core curriculum. Through graduate programs, students test their ideas with UChicago scholars, and become the next generation of leaders in academia, industry, nonprofits, and government.

    UChicago research has led to such breakthroughs as discovering the link between cancer and genetics, establishing revolutionary theories of economics, and developing tools to produce reliably excellent urban schooling. We generate new insights for the benefit of present and future generations with our national and affiliated laboratories: Argonne National Laboratory, Fermi National Accelerator Laboratory, and the Marine Biological Laboratory in Woods Hole, Massachusetts.

    The University of Chicago is enriched by the city we call home. In partnership with our neighbors, we invest in Chicago’s mid-South Side across such areas as health, education, economic growth, and the arts. Together with our medical center, we are the largest private employer on the South Side.

    In all we do, we are driven to dig deeper, push further, and ask bigger questions—and to leverage our knowledge to enrich all human life. Our diverse and creative students and alumni drive innovation, lead international conversations, and make masterpieces. Alumni and faculty, lecturers and postdocs go on to become Nobel laureates, CEOs, university presidents, attorneys general, literary giants, and astronauts.

  • richardmitnick 1:28 pm on July 30, 2019
    Tags: "Study reveals new structure of gold at extremes", Increase in pressure and temperature changes the crystalline structure to a new phase of gold., X-ray Technology

    From Lawrence Livermore National Laboratory: “Study reveals new structure of gold at extremes” 


    July 30, 2019
    Breanna Bishop

    Three of the images collected at Argonne National Laboratory’s Dynamic Compression Sector, highlighting diffracted signals recorded on the X-ray detector.

    Section 1 shows the starting face-centered cubic structure; Section 2 shows the new body-centered cubic structure at 220 GPa; and Section 3 shows the liquid gold at 330 GPa.

    Gold is an extremely important material for high-pressure experiments and is considered the “gold standard” for calculating pressure in static diamond anvil cell experiments. When compressed slowly at room temperature (on the order of seconds to minutes), gold prefers the face-centered cubic (fcc) structure at pressures up to three times that found at the center of the Earth.

    However, researchers from Lawrence Livermore National Laboratory (LLNL) and the Carnegie Institution for Science have found that when gold is compressed rapidly over nanoseconds (1 billionth of a second), the increase in pressure and temperature changes the crystalline structure to a new phase of gold.

    The fcc structure morphs into the well-known body-centered cubic (bcc) structure, a more open crystal structure than fcc. These results were published recently in Physical Review Letters.

    “We discovered a new structure in gold that exists at extreme states — two thirds of the pressure found at the center of Earth,” said lead author Richard Briggs, a postdoctoral researcher at LLNL. “The new structure actually has less efficient packing at higher pressures than the starting structure, which was surprising considering the vast number of theoretical predictions that pointed to more tightly packed structures that should exist.”
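    The "less efficient packing" Briggs describes can be quantified with the standard hard-sphere packing fractions of the two lattices. These are textbook crystallography values, not data from the study.

```python
import math

# Hard-sphere packing fractions for cubic lattices:
# fcc: 4 atoms per cubic cell, spheres touch along the face diagonal (a = 2*sqrt(2)*r)
# bcc: 2 atoms per cubic cell, spheres touch along the body diagonal (a = 4*r/sqrt(3))

def packing_fraction(atoms_per_cell, cell_edge_in_radii):
    """Fraction of the cubic unit-cell volume occupied by spherical atoms (r = 1)."""
    sphere_volume = atoms_per_cell * (4 / 3) * math.pi
    return sphere_volume / cell_edge_in_radii ** 3

fcc = packing_fraction(4, 2 * math.sqrt(2))   # ~0.740, the densest cubic packing
bcc = packing_fraction(2, 4 / math.sqrt(3))   # ~0.680, a more open arrangement
print(round(fcc, 3), round(bcc, 3))
```

    So under rapid compression gold trades away roughly six percent of its packing efficiency, which is exactly what made the bcc observation surprising.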

    The experiments were carried out at the Dynamic Compression Sector (DCS) at the Advanced Photon Source, Argonne National Laboratory.


    DCS is the first synchrotron X-ray facility dedicated to dynamic compression science. These user experiments were some of the first conducted on hutch-C, the dedicated high energy laser station of DCS. Gold was the ideal subject to study due to its high-Z (providing a strong X-ray scattering signal) and relatively unexplored phase diagram at high temperatures.

    The team found that the structure of gold began to change at a pressure of 220 GPa (2.2 million times Earth’s atmospheric pressure) and started to melt when compressed beyond 250 GPa.
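    The article's pressure conversion is easy to check. The 101,325 Pa value for one standard atmosphere is the usual definition, not a figure from the paper.

```python
# Verify the quoted conversion: 220 GPa expressed in standard atmospheres.
PA_PER_ATM = 101_325  # pascals in one standard atmosphere

def gpa_to_atm(gpa):
    """Convert a pressure in gigapascals to standard atmospheres."""
    return gpa * 1e9 / PA_PER_ATM

print(f"{gpa_to_atm(220):.2e}")  # ~2.17e+06, the "2.2 million times" in the text
```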

    “The observation of liquid gold at 330 GPa is astonishing,” Briggs said. “This is the pressure at the center of the Earth and is more than 300 GPa higher than previous measurements of liquid gold at high pressure.”

    The transition from fcc to bcc structure is perhaps one of the most studied phase transitions due to its importance in the manufacturing of steel, where high temperatures or stress causes a change in structure between the two fcc/bcc structures. However, it is not known what phase transition mechanism is responsible. The research team’s results show that gold undergoes the same phase transition before it melts, as a consequence of both pressure and temperature, and future experiments focusing on the mechanism of the transition can help clarify key details of this important transition for manufacturing strong steels.

    “Many of the theoretical models of gold that are used to understand the high-pressure/high-temperature behavior did not predict the formation of a body-centered structure – only two out of more than 10 published works,” Briggs said. “Our results can help theorists improve their models of elements under extreme compression and look toward using those new models to examine the effects of chemical bonding to aid the development of new materials that can be formed at extreme states.”

    Briggs was joined on the publication by co-authors Federica Coppari, Martin Gorman, Ray Smith, Amy Coleman, Amalia Fernandez-Panella, Marius Millot, Jon Eggert and Dane Fratanduono from LLNL, and Sally Tracy from the Carnegie Institution of Washington’s Geophysical Laboratory.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    LLNL Campus

    Operated by Lawrence Livermore National Security, LLC, for the Department of Energy’s National Nuclear Security Administration
    Lawrence Livermore National Laboratory (LLNL) is an American federal research facility in Livermore, California, United States, founded by the University of California, Berkeley in 1952. A Federally Funded Research and Development Center (FFRDC), it is primarily funded by the U.S. Department of Energy (DOE) and managed and operated by Lawrence Livermore National Security, LLC (LLNS), a partnership of the University of California, Bechtel, BWX Technologies, AECOM, and Battelle Memorial Institute in affiliation with the Texas A&M University System. In 2012, the laboratory had the synthetic chemical element livermorium named after it.
    LLNL is self-described as “a premier research and development institution for science and technology applied to national security.” Its principal responsibility is ensuring the safety, security and reliability of the nation’s nuclear weapons through the application of advanced science, engineering and technology. The Laboratory also applies its special expertise and multidisciplinary capabilities to preventing the proliferation and use of weapons of mass destruction, bolstering homeland security and solving other nationally important problems, including energy and environmental security, basic science and economic competitiveness.

    The Laboratory is located on a one-square-mile (2.6 km2) site at the eastern edge of Livermore. It also operates a 7,000 acres (28 km2) remote experimental test site, called Site 300, situated about 15 miles (24 km) southeast of the main lab site. LLNL has an annual budget of about $1.5 billion and a staff of roughly 5,800 employees.

    LLNL was established in 1952 as the University of California Radiation Laboratory at Livermore, an offshoot of the existing UC Radiation Laboratory at Berkeley. It was intended to spur innovation and provide competition to the nuclear weapon design laboratory at Los Alamos in New Mexico, home of the Manhattan Project that developed the first atomic weapons. Edward Teller and Ernest Lawrence,[2] director of the Radiation Laboratory at Berkeley, are regarded as the co-founders of the Livermore facility.

    The new laboratory was sited at a former naval air station of World War II. It was already home to several UC Radiation Laboratory projects that were too large for its location in the Berkeley Hills above the UC campus, including one of the first experiments in the magnetic approach to confined thermonuclear reactions (i.e. fusion). About half an hour southeast of Berkeley, the Livermore site provided much greater security for classified projects than an urban university campus.

    Lawrence tapped 32-year-old Herbert York, a former graduate student of his, to run Livermore. Under York, the Lab had four main programs: Project Sherwood (the magnetic-fusion program), Project Whitney (the weapons-design program), diagnostic weapon experiments (both for the Los Alamos and Livermore laboratories), and a basic physics program. York and the new lab embraced the Lawrence “big science” approach, tackling challenging projects with physicists, chemists, engineers, and computational scientists working together in multidisciplinary teams. Lawrence died in August 1958 and shortly after, the university’s board of regents named both laboratories for him, as the Lawrence Radiation Laboratory.

    Historically, the Berkeley and Livermore laboratories have had very close relationships on research projects, business operations, and staff. The Livermore Lab was established initially as a branch of the Berkeley laboratory. The Livermore lab was not officially severed administratively from the Berkeley lab until 1971. To this day, in official planning documents and records, Lawrence Berkeley National Laboratory is designated as Site 100, Lawrence Livermore National Lab as Site 200, and LLNL’s remote test location as Site 300.[3]

    The laboratory was renamed Lawrence Livermore Laboratory (LLL) in 1971. On October 1, 2007 LLNS assumed management of LLNL from the University of California, which had exclusively managed and operated the Laboratory since its inception 55 years before. The laboratory was honored in 2012 by having the synthetic chemical element livermorium named after it. The LLNS takeover of the laboratory has been controversial. In May 2013, an Alameda County jury awarded over $2.7 million to five former laboratory employees who were among 430 employees LLNS laid off during 2008.[4] The jury found that LLNS breached a contractual obligation to terminate the employees only for “reasonable cause.”[5] The five plaintiffs also have pending age discrimination claims against LLNS, which will be heard by a different jury in a separate trial.[6] There are 125 co-plaintiffs awaiting trial on similar claims against LLNS.[7] The May 2008 layoff was the first layoff at the laboratory in nearly 40 years.[6]

    On March 14, 2011, the City of Livermore officially expanded the city’s boundaries to annex LLNL and move it within the city limits. The unanimous vote by the Livermore city council expanded Livermore’s southeastern boundaries to cover 15 land parcels covering 1,057 acres (4.28 km2) that comprise the LLNL site. The site was formerly an unincorporated area of Alameda County. The LLNL campus continues to be owned by the federal government.


    DOE Seal

  • richardmitnick 8:50 am on July 26, 2019 Permalink | Reply
    Tags: "Imaging the Chemical Structure of Individual Molecules Atom by Atom", , GXSM-Gnome X Scanning Microscopy, nc-AFM-Noncontact atomic force microscope, Scanning probe microscopy, , X-ray Technology   

    From Brookhaven National Lab: “Imaging the Chemical Structure of Individual Molecules, Atom by Atom” 

    From Brookhaven National Lab

    July 22, 2019

    Ariana Manglaviti

    Using atomic force microscopy images, scientists at Brookhaven Lab’s Center for Functional Nanomaterials developed a guide for discriminating atoms other than hydrogen and carbon in aromatic molecules—ring-shaped molecules with special bonding properties—to help identify contaminants found in petroleum.

    Brookhaven Lab physicist Percy Zahl with the noncontact atomic force microscope he adapted and used at the Center for Functional Nanomaterials (CFN) to image nitrogen- and sulfur-containing molecules in petroleum.

    For physicist Percy Zahl, optimizing and preparing a noncontact atomic force microscope (nc-AFM) to directly visualize the chemical structure of a single molecule is a bit like playing a virtual reality video game. The process requires navigating and manipulating the tip of the instrument over the world of atoms and molecules, eventually picking some up at the right location and in the right way. If these challenges are completed successfully, you advance to the highest level, obtaining images that precisely show where individual atoms are located and how they are chemically bonded to other atoms. But take one wrong move, and it is game over. Time to start again.

    “The nc-AFM has a very sensitive single-molecule tip that scans over a carefully prepared clean single-crystal surface at a constant height and “feels” the forces between the tip molecule and single atoms and bonds of molecules placed on this clean surface,” explained Zahl, who is part of the Interface Science and Catalysis Group at the Center for Functional Nanomaterials (CFN), a U.S. Department of Energy (DOE) Office of Science User Facility at Brookhaven National Laboratory. “It can take an hour or days to get this sensor working properly. You can’t simply press a button; fine tuning is required. But all of this effort is definitely worthwhile once you see the images appearing like molecules in a chemistry textbook.”

    A history of chemical structure determination

    Since the beginning of the field of chemistry, scientists have been able to determine the elemental composition of molecules. What has been more difficult is to figure out their chemical structures, or the particular arrangement of atoms in space. Knowing the chemical structure is important because it impacts the molecule’s reactivities and other properties.

    Kekulé claimed that the idea of the ring structure of benzene came to him in a dream of a snake eating its own tail.

    For example, Michael Faraday isolated benzene in 1825 from an oil gas residue. It was soon determined that benzene is composed of six hydrogen and six carbon atoms, but its chemical structure remained controversial until 1865, when Friedrich August Kekulé proposed a cyclic structure. However, his proposal was not based on direct observation but rather on logical deduction from the number of isomers (compounds with the same chemical formula but different chemical structures) of benzene. The correct symmetric hexagonal structure of benzene was finally revealed through its diffraction pattern, obtained by Kathleen Lonsdale via x-ray crystallography in 1929. In 1931, Erich Hückel used quantum theory to explain the origin of “aromaticity” in benzene. Aromaticity is a property of flat, ring-shaped molecules in which electrons are shared between atoms. Because of this unique arrangement of electrons, aromatic compounds have a special stability (low reactivity).

    Today, x-ray crystallography continues to be a mainstream technique for determining chemical structures, along with nuclear magnetic resonance spectroscopy. However, both techniques require crystals or relatively pure samples, and chemical structure models must be deduced by analyzing the resulting diffraction patterns or spectra.

    The first-ever actual image of a chemical structure was obtained only a decade ago. In 2009, scientists at IBM Research–Zurich Lab in Switzerland used nc-AFM to resolve the atomic backbone of an individual molecule of pentacene, seeing its five fused benzene rings and even the carbon-hydrogen bonds. This breakthrough was made possible by selecting an appropriate molecule for the end of the tip—one that could come very close to the surface of pentacene without reacting with or binding to it. It also required optimized sensor readout electronics at cryogenic temperatures to measure small frequency shifts in the probe oscillation (which relates to the force) while maintaining mechanical and thermal stability through vibration damping setups, ultrahigh vacuum chambers, and low-temperature cooling systems.
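The frequency-shift readout mentioned above can be made concrete: for small oscillation amplitudes, the shift follows Δf ≈ −(f0 / 2k) · ∂F/∂z, where f0 is the sensor’s resonance frequency and k its stiffness. A sketch of that relation, using typical tuning-fork (qPlus-style) sensor parameters as assumptions rather than figures from the article:

```python
# Illustrative harmonic-oscillator estimate of the frequency shift an nc-AFM
# sensor registers. The small-amplitude relation is Δf ≈ -(f0 / 2k) * dF/dz.
# Sensor parameters below are typical qPlus values, not from the article.
F0_HZ = 30_000.0    # resonance frequency of the tuning-fork sensor (assumed)
K_N_PER_M = 1800.0  # sensor stiffness (assumed)

def freq_shift(force_gradient: float) -> float:
    """Frequency shift (Hz) for a given tip-sample force gradient (N/m)."""
    return -F0_HZ / (2.0 * K_N_PER_M) * force_gradient

# A force gradient of ~1 N/m over a molecule shifts the resonance by only a
# few hertz out of 30 kHz, which is why low-noise, cryogenic readout
# electronics and vibration damping are essential.
print(freq_shift(1.0))  # ≈ -8.3 Hz
```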

    “Low-temperature nc-AFM is the only method that can directly image the chemical structure of a single molecule,” said Zahl. “With nc-AFM, you can visualize the positions of individual atoms and the arrangement of chemical bonds, which affect the molecule’s reactivity.”

    However, there are currently still some requirements for molecules to be suitable for nc-AFM imaging. Molecules must be mostly planar (flat), since the scanning occurs across a surface; the technique is thus not suitable for large three-dimensional (3-D) structures such as proteins. In addition, because of the slow nature of scanning, only a few hundred molecules can practically be examined per experiment. Zahl notes that this limitation could be overcome in the future through artificial intelligence, which would pave the way toward automated scanning probe microscopy.

    According to Zahl, though nc-AFM has since been applied by a few groups around the world, it is not widespread, especially in the United States.

    “The technique is still relatively new and there is a long learning curve in acquiring CO tip-based molecular structures,” said Zahl. “It takes a lot of experience in scanning probe microscopy, as well as patience.”

    A unique capability and expertise

    The nc-AFM at the CFN represents one of a few in this country. Over the past several years, Zahl has upgraded and customized the instrument, most notably with the open-source software and hardware, GXSM (for Gnome X Scanning Microscopy). Zahl has been developing GXSM for more than two decades. A real-time signal processing control system and software continuously records operating conditions and automatically adjusts the tip position as necessary to avoid unwanted collisions when the instrument is operated in an AFM-specific scanning mode to record forces over molecules. Because Zahl wrote the software himself, he can program and implement new imaging or operating modes for novel measurements and add features to help operators better explore the atomic world.

    DBT (left column) is one of the sulfur-containing compounds in petroleum; CBZ and ACR (right and middle columns, respectively) are nitrogen-containing compounds. Illustrations and ball-and-stick models of their chemical structures are shown at the top of each column (black indicates carbon atoms, yellow indicates sulfur, and blue indicates nitrogen). The simulated atomic force microscopy images (a, b, d, e, g, and h) closely match those obtained experimentally (c, f, and i).

    For example, Zahl recently applied a custom “slicing” mode to determine the 3-D geometrical configuration in which a single molecule of dibenzothiophene (DBT)—a sulfur-containing aromatic molecule commonly found in petroleum—adsorbs on a gold surface. The DBT molecule is not entirely planar but rather tilted at an angle, so he combined a series of force images (slices) to create a topographic-like representation of the molecule’s entire structure.

    “In this mode, obstacles such as protruding atoms are automatically avoided,” said Zahl. “This capability is important, as the force measurements are ideally taken in one fixed plane, with the need to be very close to the atoms to feel the repulsive forces and ultimately to achieve detailed image contrast. When parts stick out of the molecule plane, they will likely negatively impact image quality.”
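The slicing idea described above can be sketched roughly as follows: record constant-height force maps at several tip heights, then for each pixel report the greatest height at which the tip already feels repulsion. The function and array names, and the zero-force threshold, are illustrative assumptions, not GXSM’s actual interface:

```python
import numpy as np

# Sketch of the "slicing" mode: given a stack of constant-height force maps
# force[z, y, x] (negative = attractive, positive = repulsive) recorded at
# ascending tip heights, build a topography-like map by finding, per pixel,
# the highest slice in which the force exceeds a repulsive threshold.
# Protruding atoms turn repulsive at larger heights, so they appear "taller".
def topography_from_slices(force: np.ndarray, heights: np.ndarray,
                           threshold: float = 0.0) -> np.ndarray:
    repulsive = force > threshold  # boolean stack, shape (nz, ny, nx)
    nz = force.shape[0]
    # Highest slice index where each pixel is repulsive; argmax on the
    # reversed stack finds the first True from the top.
    idx = np.where(repulsive.any(axis=0),
                   nz - 1 - np.argmax(repulsive[::-1], axis=0),
                   0)  # pixels that are never repulsive map to the lowest height
    return heights[idx]
```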

    This imaging of DBT was part of a collaboration with Yunlong Zhang, a physical organic chemist at ExxonMobil Research and Engineering Corporate Strategic Research in New Jersey. Zhang met Zahl at a conference two years ago and realized that the capabilities and expertise in nc-AFM at the CFN would have great potential for his research on petroleum chemistry.

    Zahl and Zhang used nc-AFM to image the chemical structure of not only DBT but also of two nitrogen-containing aromatic molecules—carbazole (CBZ) and acridine (ACR)—that are widely observed in petroleum. In analyzing the images, they developed a set of templates of common features in the ring-shaped molecules that can be used to find sulfur and nitrogen atoms and distinguish them from carbon atoms.

    Petroleum: a complex mixture

    The chemical composition of petroleum widely varies depending on where and how it formed, but in general it contains mostly carbon and hydrogen (hydrocarbons) and smaller amounts of other elements, including sulfur and nitrogen. During combustion, when the fuel is burned, these “heteroatoms” produce sulfur and nitrogen oxides, which contribute to the formation of acid rain and smog, both air pollutants that are harmful to human health and the environment. Heteroatoms can also reduce fuel stability and corrode engine components. Though refining processes exist, not all of the sulfur and nitrogen is removed. Identifying the most common structures of impure molecules containing nitrogen and sulfur atoms could lead to optimized refining processes for producing cleaner and more efficient fuels.

    “Our previous research with the IBM group at Zurich on petroleum asphaltenes and heavy oil mixtures provided the first “peek” into numerous structures in petroleum,” said Zhang. “However, more systemic studies are needed, especially on the presence of heteroatoms and their precise locations within aromatic hydrocarbon frameworks in order to broaden the application of this new technique to identify complex molecular structures in petroleum.”

    To image the atoms and bonds in DBT, CBZ, and ACR, the scientists prepared the tip of the nc-AFM with a single crystal of gold at the apex and a single molecule of carbon monoxide (CO) at the termination point (the same kind of molecule used in the original IBM experiment). The metal crystal provides an atomically clean and flat support from which the CO molecule can be picked up.

    After “functionalizing” the tip, they deposited a small amount (a “dusting”) of each of the molecules onto a gold surface inside the nc-AFM, under ultrahigh vacuum at room temperature, via sublimation. During sublimation, the molecules go directly from the solid to the gas phase.

    Though the images they obtained strikingly resemble chemical structure drawings, you cannot directly tell from these images whether a nitrogen, sulfur, or carbon atom is present at a particular site. It takes some input knowledge to deduce this information.

    “As a starting point, we imaged small well-known molecules with typical building blocks that are found in larger polycyclic aromatic hydrocarbons—in this case, in petroleum,” explained Zahl. “Our idea was to see what the basic building blocks of these chemical structures look like and use them to create a set of templates for finding them in larger unknown molecular mixtures.”

    An illustration showing how nc-AFM can distinguish sulfur- and nitrogen-containing molecules commonly found in petroleum. A tuning fork (grey arm) with a highly sensitive tip terminated by a single carbon monoxide molecule (black is carbon and red is oxygen) is brought very close to the surface (outlined in white), with the oxygen atom of the CO molecule approaching the surface without making contact. As the tip scans across the surface, it “feels” the forces from the bonds between atoms to generate an image of the molecule’s chemical structure. One image feature that can be used to discriminate between the different types of atoms is the relative “size” of the elements (indicated by the size of the boxes in the overlaid periodic table).

    For example, for sulfur- and nitrogen-containing molecules in petroleum, sulfur is only found in ring structures with five atoms (pentagon ring structure), while nitrogen can be present in rings with either five or six (hexagonal ring structure) atoms. In addition to this bonding geometry, the relative “size,” or atomic radius, of the elements can help distinguish them. Sulfur is relatively larger than nitrogen and carbon, and nitrogen is slightly smaller than carbon. It is this size, or “height,” that AFM is extremely sensitive to.

    “Simply speaking, the force that the AFM records in very close proximity to an atom relates to the distance and thus to the size of that atom; as the AFM scans over a molecule at a fixed elevation, bigger atoms protrude more out of the plane,” explained Zahl. “Therefore, the larger the atom in a molecule, the bigger the force the AFM records as it approaches the atomic shell, and the repulsion increases dramatically. That is why sulfur appears as a bright dot in the images, while nitrogen looks slightly fainter.”
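This size argument can be illustrated with a toy pairwise model (a 12-6 Lennard-Jones force, with approximate covalent radii and an arbitrary energy scale; nothing here is fitted to real CO-tip data):

```python
# Toy Lennard-Jones model of why a larger atom appears "brighter" at a fixed
# scan height: its repulsive wall sits farther out, so at the same tip height
# the recorded force is larger. Radii are approximate covalent radii in
# angstroms; epsilon is an arbitrary illustrative energy scale.
def lj_force(z: float, sigma: float, epsilon: float = 1.0) -> float:
    """Force along z (positive = repulsive) of a 12-6 Lennard-Jones pair."""
    return 24 * epsilon * (2 * (sigma / z) ** 12 - (sigma / z) ** 6) / z

RADII = {"S": 1.05, "C": 0.76, "N": 0.71}  # approximate covalent radii (Å)

z_tip = 0.85  # fixed scan height above the atomic plane (Å, illustrative)
for atom, r in RADII.items():
    print(atom, lj_force(z_tip, sigma=r))
# Sulfur yields by far the largest (most repulsive) force, and nitrogen a
# smaller one than carbon -- matching the bright/faint contrast Zahl describes.
```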

    Zahl and Zhang then compared their experimental images to computer-simulated ones they obtained using the mechanical probe particle simulation method. This method simulates the actual forces acting on the CO molecule on the tip end as it scans over molecules and bends in response. They also performed theoretical calculations to determine how the electrostatic potential (charge distribution) of the molecules affects the measured force and relates to their appearance in the nc-AFM images.

    “We used density functional theory to study how the forces felt by the CO probe molecule behave in the presence of the charge environment surrounding the molecules,” said Zahl. “We need to know how the electrons are distributed in order to understand the atomic force and bond contrast mechanism. These insights even allow us to assign single or double bonds between atoms by analyzing image details.”

    Going forward, Zahl will continue developing and enhancing nc-AFM imaging modes and related technologies to explore many kinds of interesting, unknown, or novel molecules in collaboration with various users. Top candidate molecules of interest include those with large magnetic moments and special spin properties for quantum applications and novel graphene-like (graphene is a one-atom-thick sheet of carbon atoms arranged in a hexagonal lattice) materials with extraordinary electronic properties.

    “The CFN has unique capabilities and expertise in nc-AFM that can be applied to a wide range of molecules,” said Zahl. “In the coming years, I believe that artificial intelligence will make a big impact on the field by helping us operate the microscope autonomously to perform the most time-consuming, tedious, and error-prone parts of experiments. With this special power, our chances of winning the “game” will be much improved.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    BNL Campus

    BNL Center for Functional Nanomaterials



    BNL RHIC Campus

    BNL/RHIC Star Detector


    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

  • richardmitnick 10:24 am on July 17, 2019 Permalink | Reply
    Tags: DSSC detector, MiniSDD sensors, SCS-Spectroscopy and Coherent Scattering, X-ray Technology   

    From European XFEL: “Fastest soft X-ray camera in the world installed at European XFEL” 

    XFEL bloc

    European XFEL

    From European XFEL

    DSSC detector will expand scientific capabilities of the instrument for Spectroscopy and Coherent Scattering (SCS)

    DSSC detector

    At the European XFEL near Hamburg, the world’s fastest soft X-ray camera has successfully been put through its paces. The installation, commissioning and operation of the unique detector mark the culmination of over a decade of international collaborative research and development. The so-called DSSC detector, designed specifically for the low energy regimes and long X-ray wavelengths used at the European XFEL soft X-ray instruments, will significantly expand the scientific capabilities of the instrument for Spectroscopy and Coherent Scattering (SCS), where it is installed. It will enable ultrafast studies of electronic, spin and atomic structures at the nanoscale, making use of each X-ray flash provided by the European XFEL. At the end of May, the first scientific experiments using the DSSC were successfully conducted at SCS.

    The DSSC was developed by an international consortium coordinated by European XFEL. Other partners include DESY, University of Heidelberg, Politecnico di Milano, the Istituto Nazionale di Fisica Nucleare, and University of Bergamo. It is the fourth fast X-ray detector to be installed at European XFEL, and the second detector available for experiments at the SCS instrument.

    Matteo Porro, DSSC project and consortium leader said: “This is a fantastic achievement in terms of detector development and it opens up unique possibilities for the photon science community. With the DSSC we have shown that it is possible to count single photons in the soft X-ray regime at the very high pulse repetition rate provided by the European XFEL. I would like to thank the DSSC consortium, who with their commitment and creativity, have made this possible. It was a privilege to work with people who provided such an extraordinary level of know-how in detector and electronics design.”

    During an experiment, X-ray flashes are fired at the sample being studied. The X-rays diffract off the atoms in the sample, resulting in a distinctive pattern that is recorded and stored by the detector located behind the sample. The European XFEL delivers X-ray flashes grouped together in packets known as trains. Each train contains a maximum of 2700 flashes. Within these trains, the X-ray flashes are fired in quick succession, separated by 220 nanoseconds. At full capacity, the DSSC detector can acquire images at a rate of 4.5 million images per second, matching the speed of the X-ray flashes provided by the European XFEL. For every train, the DSSC detector can store 800 one-megapixel images. This makes the DSSC the fastest soft X-ray detector in the world. It was designed and built to accommodate the low energy regimes and long wavelengths unique to the soft X-ray instruments at European XFEL. The DSSC detector is based on silicon sensors and is made up of 1024 x 1024 hexagonal pixels for a total active area of 210 x 210 mm2.
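The quoted rates follow directly from the pulse spacing. A quick back-of-envelope check, using only the numbers in the article:

```python
# Back-of-envelope check of the DSSC figures quoted above (all inputs are
# from the article itself).
PULSE_SPACING_S = 220e-9        # time between X-ray flashes within a train
FLASHES_PER_TRAIN = 2700        # maximum flashes per train
STORED_IMAGES_PER_TRAIN = 800   # on-board memory limit per train

# Flashes 220 ns apart imply a burst-mode frame rate of ~4.5 million images/s.
frame_rate = 1.0 / PULSE_SPACING_S
print(f"burst frame rate: {frame_rate / 1e6:.2f} million images/s")

# The detector keeps up with every flash, but memory limits it to storing
# 800 of the up-to-2700 images per train.
train_duration = FLASHES_PER_TRAIN * PULSE_SPACING_S
print(f"maximum train duration: {train_duration * 1e3:.3f} ms")
```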

    The DSSC detector is currently equipped with a type of sensor called MiniSDD, produced by the Semiconductor Laboratory of the Max Planck Society in Munich. PNSensor GmbH, based in Munich, recently joined the DSSC consortium to further develop another type of sensor, DePFET, for a second, improved DSSC camera. This will enable an even greater level of detail to be recorded than is currently possible.

    “After years of design and development, it was great to see the individual detector components being assembled together at European XFEL during this past year. This was an extremely exciting and intense time,” says European XFEL Detector Group leader Markus Kuster. “Having seen the results of the first scientific experiment with the DSSC, I am proud of the whole project team and pleased that our efforts are now bearing fruit. This is a fantastic start for the future development of the DSSC detector technology.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    XFEL Campus

    XFEL Tunnel

    XFEL Gun

    The Hamburg area boasts a research facility of superlatives: the European XFEL generates ultrashort X-ray flashes—27 000 times per second and with a brilliance that is a billion times higher than that of the best conventional X-ray radiation sources.

    The outstanding characteristics of the facility are unique worldwide. Having started operation in 2017, it opens up completely new research opportunities for scientists and industrial users.
