Tagged: X-ray Technology

  • richardmitnick 2:21 pm on August 29, 2014 Permalink | Reply
    Tags: Space Dust, X-ray Technology

    From ANL: “Mysteries of space dust revealed” 

    News from Argonne National Laboratory

    August 29, 2014
    This story was originally reported by Kate Greene of Berkeley National Laboratory.

    The first analysis of space dust collected by a special collector onboard NASA’s Stardust mission and sent back to Earth for study in 2006 suggests the tiny specks open a door to studying the origins of the solar system and possibly the origin of life itself.

    NASA Stardust spacecraft

    This is the first time synchrotron light sources have been used to look at microscopic particles caught in the path of a comet. The Advanced Photon Source, the Advanced Light Source, and the National Synchrotron Light Source at the U.S. Department of Energy’s Argonne, Lawrence Berkeley and Brookhaven National Laboratories, respectively, enabled analysis that showed that the dust, which likely originated from beyond our solar system, is more complex in composition and structure than previously imagined.

    “Fundamentally, the solar system and everything in it was ultimately derived from a cloud of interstellar gas and dust,” says Andrew Westphal, physicist at the University of California, Berkeley’s Space Sciences Laboratory and lead author on the paper published this week in Science, titled “Evidence for interstellar origin of seven dust particles collected by the Stardust spacecraft.” “We’re looking at material that’s very similar to what made our solar system.”

    The analysis tapped a variety of microscopy techniques including those that rely on synchrotron radiation. “Synchrotrons are extremely bright light sources that enable light to be focused down to the small size of these particles while providing unprecedented chemical identification,” said Hans Bechtel, principal scientific engineering associate at Berkeley Lab.

    The APS helped the researchers create a map of the locations and abundances of the different elements in each tiny particle, said Argonne physicist Barry Lai, who was involved with the analysis at the APS.

    “The Advanced Photon Source was unique in the capability to perform elemental imaging and analysis on such small particles — just 500 nanometers or less across,” Lai said. (That is so small that about 1,000 of them could fit in the period at the end of a sentence.) “This provided an important screening tool for differentiating the origin of each particle.”
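Lai’s size comparison is easy to sanity-check. Assuming a printed period is roughly 0.5 mm across (a nominal assumed value, not from the article), a quick sketch:

```python
# How many 500-nm particles fit across the width of a printed period?
period_diameter_m = 0.5e-3    # assumed diameter of a printed period
particle_diameter_m = 500e-9  # upper bound on particle size from the article

across = period_diameter_m / particle_diameter_m
print(f"{across:.0f} particles fit across the period")  # 1000
```

Lined up edge to edge, about a thousand such particles span the period, matching Lai’s figure.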

    Researchers used the scanning transmission x-ray and Fourier transform infrared microscopes at the ALS. The x-ray microscope ruled out dozens of interstellar dust candidates because they contained aluminum, which is not found in interstellar dust, or other substances likely knocked off the spacecraft and embedded in the aerogel. The infrared spectroscopy helped to identify sample contamination that could later be subtracted from the data.

    “Almost everything we’ve known about interstellar dust has previously come from astronomical observations — either ground-based or space-based telescopes,” says Westphal. But telescopes don’t tell you about the diversity or complexity of interstellar dust, he says. “The analysis of these particles captured by Stardust is our first glimpse into the complexity of interstellar dust, and the surprise is that each of the particles are quite different from each other.”

    Westphal, who is also affiliated with Berkeley Lab’s Advanced Light Source, and his 61 co-authors, including researchers from the University of Chicago and the Field Museum of Natural History in Chicago, found and analyzed a total of seven grains of possible interstellar dust and presented preliminary findings. All analysis was non-destructive, meaning that it preserved the structural and chemical properties of the particles. While the samples are suspected to be from beyond the solar system, he says, potential confirmation of their origin must come from subsequent tests that will ultimately destroy some of the particles.

    “Despite all the work we’ve done, we have limited the analyses on purpose,” Westphal explains. “These particles are so precious. We have to think very carefully about what we do with each particle.”

    Between 2000 and 2002, the Stardust spacecraft, on its way to meet a comet named Wild 2, exposed the special collector to the stream of dust coming from outside our solar system. The mission objectives were to catch particles from both the comet’s coma and the interstellar dust stream. When both collections were complete, Stardust launched its sample capsule back to Earth, where it landed in northwestern Utah. The analyses of Stardust’s cometary sample have been widely published in recent years, and the comet portion of the mission has been considered a success.

    This new analysis is the first time researchers have looked at the microscopic particles collected en route to the comet. Both types of dust were captured by the spacecraft’s sample-collection trays, made of an airy material called aerogel with panels separated by aluminum foil. Three of the space-dust particles (a tenth the size of comet dust) either lodged or vaporized within the aerogel, while four others produced pits in the aluminum foil, leaving a rim of residue that fit the profile of interstellar dust.

    Much of the new study relied on novel methods and techniques developed specifically for handling and analyzing the fine grains of dust, which are more than a thousand times smaller than a grain of sand. These methods are described in twelve other papers available now and next week in the journal Meteoritics & Planetary Science.

    One of the first research objectives was to simply find the particles within the aerogel. The aerogel panels were essentially photographed in tiny slices by changing the focus of the camera to different depths, which resulted in millions of images eventually stitched together into video. With the help of a distributed science project called Stardust@home [running on BOINC software from SSL], volunteer space enthusiasts from around the world combed through video, flagging tracks they believed were created by interstellar dust. More than 100 tracks have been found so far, but not all of these have been analyzed. Additionally, only 77 of the 132 aerogel panels have been scanned. Still, Westphal doesn’t expect more than a dozen particles of interstellar dust will be seen.
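The scanning figures above are consistent with the “just over half” progress mentioned later in the article:

```python
# Fraction of the 132 aerogel panels scanned so far (figures from the article).
panels_total = 132
panels_scanned = 77

fraction = panels_scanned / panels_total
print(f"{fraction:.1%} of panels scanned")  # 58.3%
```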

    The researchers found that the two larger dust particles from the aerogel have a fluffy composition, similar to that of a snowflake, says Westphal. Models of interstellar dust particles had suggested a single, dense particle, so the lighter structure was unexpected. They also contain crystalline material called olivine, a mineral made of magnesium, iron, and silicon, which suggests the particles came from disks or outflows from other stars and were modified in the interstellar medium.

    Three of the particles found in the aluminum foil were also complex and contained sulfur compounds, which some astronomers believe should not occur in interstellar dust particles. Study of further foil-embedded particles could help explain the discrepancy.

    Westphal says the team will continue to look for more tracks as well as take the next steps in dust analysis. “The highest priority is to measure relative abundance of three stable isotopes of oxygen,” he says. The isotope analysis could help confirm that the dust originated outside the solar system, but it’s a process that would destroy the precious samples. In the meantime, Westphal says, the team is honing its isotope-analysis technique on artificial dust particles called analogs. “We have to be super careful,” he says. “We’re doing a lot of work on analogs to practice, practice, practice.”

    The Advanced Photon Source is currently in the process of designing a proposed upgrade that would increase its ability to do such analyses, Lai said.

    “With the APS upgrade, we would be able to increase the spatial resolution and to image faster — effectively scanning a larger area of the aerogel in a shorter time,” he said.

    Since just over half of the aerogels have been checked for particles, there are plenty more waiting to be analyzed.

    This research was supported by NASA, the Klaus Tschira Foundation, the Tawani Foundation, the German Science Foundation, and the Funds for Scientific Research, Flanders, Belgium. In addition to ALS, the research made use of the National Synchrotron Light Source at Brookhaven National Laboratory and the Advanced Photon Source at Argonne. All three x-ray light sources are DOE Office of Science User Facilities.

    Brookhaven NSLS

    Berkeley Advanced Light Source

    See the full article here.

    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    The Advanced Photon Source at Argonne National Laboratory is one of five national synchrotron radiation light sources supported by the U.S. Department of Energy’s Office of Science to carry out applied and basic research to understand, predict, and ultimately control matter and energy at the electronic, atomic, and molecular levels, provide the foundations for new energy technologies, and support DOE missions in energy, environment, and national security. To learn more about the Office of Science X-ray user facilities, visit http://science.energy.gov/user-facilities/basic-energy-sciences/.

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

    Argonne Lab Campus

    ScienceSprings relies on technology from

    MAINGEAR computers



  • richardmitnick 3:34 pm on August 22, 2014 Permalink | Reply
    Tags: X-ray Technology

    From Brookhaven Lab: “A Single Diamond Crystal Does the Job” 

    Brookhaven Lab

    August 22, 2014
    Laura Mgrdichian

    X-ray absorption spectroscopy (XAS) is a technique used in many areas of science, from biology to materials science, that allows researchers to uncover information about a sample’s molecular structure and electronic behavior by studying how it absorbs and re-emits x-rays. Recently, a research team working at the National Synchrotron Light Source developed a way to improve certain XAS experiments by replacing a standard experimental component, an x-ray beam monitor, with a diamond-based type that performs better but has been incompatible with many XAS experiments due to technical roadblocks.

    Brookhaven NSLS
    NSLS at Brookhaven

    X-ray absorption spectroscopy (XAS) scans over energy ranges typical of XAS experiments

    The most common type of beam monitor is an ionization chamber, which consists of a gas-filled chamber between two charged electrodes. When x-rays pass through the gas, they ionize the molecules and cause a tiny but measurable current to flow between the electrodes. By calculating backward from the amount of current, researchers can determine the “flux” of the x-ray beam; that is, the total number of x-ray photons passing through a unit area as a function of the beam energy.
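The back-calculation from current to flux can be sketched as follows. This assumes each absorbed photon of energy E creates roughly E/W ion pairs, where W is the mean energy needed to create one ion pair in the chamber gas (about 34.4 eV for nitrogen); both the simplifications and the example numbers are assumptions, not details from the article.

```python
# Sketch: photon flux implied by an ionization-chamber current,
# assuming full absorption and a nominal mean energy W per ion pair.
E_CHARGE = 1.602e-19  # elementary charge in coulombs

def photon_rate(current_amps, photon_energy_ev, w_ev=34.4):
    """Photons per second needed to produce the measured current."""
    ion_pairs_per_photon = photon_energy_ev / w_ev
    charge_per_photon = ion_pairs_per_photon * E_CHARGE
    return current_amps / charge_per_photon

# Hypothetical example: 1 nA of chamber current at 10 keV photon energy
rate = photon_rate(1e-9, 10_000)
print(f"{rate:.2e} photons per second")
```

In practice the chamber absorbs only a known fraction of the beam and W depends on the gas, so real beamline software folds in calibration factors; the structure of the calculation is the same.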

    Diamond sensors, which consist of a single diamond crystal, have many advantages over ionization chambers, including faster response times, lower leakage current, and smaller size. But even though electronics-grade diamond has become more readily available, diamond sensors have not been commonly used in XAS because they respond poorly during experiments that require scanning over a large energy range. Often the range of the scan includes the Bragg diffraction energies for diamond – x-rays with wavelengths that are diffracted by the diamond rather than transmitted through it. (If the energy scan avoids these energies, diamond sensors can simply be substituted for ionization chambers.)
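The Bragg diffraction energies mentioned above follow from Bragg’s law, nλ = 2d sin θ. A sketch using the diamond (111) d-spacing of roughly 2.06 Å (a textbook value assumed here, not given in the article):

```python
import math

# Photon energies satisfying the Bragg condition n*lambda = 2*d*sin(theta)
# for planes of spacing d; such energies are diffracted by the crystal
# rather than transmitted through it.
HC_KEV_ANGSTROM = 12.398  # h*c in keV*angstrom

def bragg_energy_kev(d_angstrom, theta_deg, order=1):
    """Photon energy (keV) diffracted at angle theta by planes of spacing d."""
    wavelength = 2 * d_angstrom * math.sin(math.radians(theta_deg)) / order
    return HC_KEV_ANGSTROM / wavelength

# Diamond (111) planes, d ~ 2.06 angstroms (assumed)
for theta in (10, 30, 60):
    print(f"theta = {theta:2d} deg -> {bragg_energy_kev(2.06, theta):.2f} keV")
```

Because diffracting planes exist at many orientations and orders, a wide energy scan is hard to route around every such condition, which is why the glitches plague diamond monitors.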

    “Measuring XAS data with a diamond sensor through an energy region that produces diffraction peaks yields data that require extensive post-processing,” said the study’s corresponding scientist, Bruce Ravel of the National Institute of Standards and Technology. “We have found a way that diamond sensors can be used, at least for certain XAS experiments, without having to do so much work to the data.”

    Ravel and his colleagues, from Stony Brook University, Brookhaven National Laboratory, and Case Western Reserve University, discovered that coupling the diamond sensor to an optic component known as a “half polycapillary lens” significantly mitigates the diffraction problem. The lens consists of a bundle of tiny glass tubes encased in a steel cylinder, with one end of the tubes drawn into a taper (in a full lens, both ends are tapered). The lens “smears” the x-ray beam before it reaches the diamond sensor, causing the diffraction effect to be far less pronounced.

    “The data we gathered using both the diamond sensor and the lens are comparable in quality to data taken using an ionization chamber and no lens,” said Ravel.

    The results of their investigation have led the group to propose combination devices, with a diamond window placed onto the end of the steel cylinder that encases the glass tubes, instead of the usual beryllium window.

    X-ray data for this study were collected at NSLS beamline X23A2. The paper describing the work is published in the October 2013 issue of Review of Scientific Instruments.

    See the full article here.

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.




  • richardmitnick 4:09 pm on August 21, 2014 Permalink | Reply
    Tags: X-ray Technology

    From Berkeley Lab: “Researchers Map Quantum Vortices Inside Superfluid Helium Nanodroplets” 

    Berkeley Logo

    Berkeley Lab

    August 21, 2014
    Kate Greene

    Scientists have, for the first time, characterized so-called quantum vortices that swirl within tiny droplets of liquid helium. The research, led by scientists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), the University of Southern California, and SLAC National Accelerator Laboratory, confirms that helium nanodroplets are in fact the smallest possible superfluidic objects and opens new avenues for studying quantum rotation.

    “The observation of quantum vortices is one of the most clear and unique demonstrations of the quantum properties of these microscopic objects,” says Oliver Gessner, senior scientist in the Chemical Sciences Division at Berkeley Lab. Gessner and colleagues, Andrey Vilesov of the University of Southern California and Christoph Bostedt of SLAC National Accelerator Laboratory at Stanford, led the multi-facility and multi-university team that published the work this week in Science.

    Illustration of analysis of superfluid helium nanodroplets. Droplets are emitted via a cooled nozzle (upper right) and probed with x-rays from the free-electron laser. The multicolored pattern (upper left) represents a diffraction pattern that reveals the shape of a droplet and the presence of quantum vortices such as those represented in the turquoise circle with swirls (bottom center). Credit: Felix P. Sturm and Daniel S. Slaughter, Berkeley Lab.

    The finding could have implications for other liquid or gas systems that contain vortices, says USC’s Vilesov. “The quest for quantum vortices in superfluid droplets has stretched for decades,” he says. “But this is the first time they have been seen in superfluid droplets.”

    Superfluid helium has long captured scientists’ imaginations since its discovery in the 1930s. Unlike normal fluids, superfluids have no viscosity, a feature that leads to strange and sometimes unexpected properties, such as crawling up the walls of containers or dripping through barriers that contained the liquid before it transitioned to a superfluid.

    Helium superfluidity can be achieved when helium is cooled to near absolute zero (zero kelvin or about -460 degrees F). At this temperature, the atoms within the liquid no longer vibrate with heat energy and instead settle into a calm state in which all atoms act together in unison, as if they were a single particle.

    For decades, researchers have known that when superfluid helium is rotated–in a little spinning bucket, say–the rotation produces quantum vortices, swirls that are regularly spaced throughout the liquid. But the question remained whether anyone could see this behavior in an isolated, nanoscale droplet. If the swirls were there, it would confirm that helium nanodroplets, which can range in size from tens of nanometers to microns, are indeed superfluid throughout and that the motion of the entire liquid drop is that of a single quantum object rather than a mixture of independent particles.

    But measuring liquid flow in helium nanodroplets has proven to be a serious challenge. “The way these droplets are made is by passing helium through a tiny nozzle that is cryogenically cooled down to below 10 Kelvin,” says Gessner. “Then, the nanoscale droplets shoot through a vacuum chamber at almost 200 meters-per-second. They live for only a few milliseconds while traversing the experimental chamber and then they’re gone. How do you show that these objects, which are all different from one another, have quantum vortices inside?”
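Gessner’s timescale follows directly from the speed he quotes. Taking a chamber length of half a meter (an assumed value; the article does not give one):

```python
# Transit time of a droplet crossing the experimental chamber.
speed_m_s = 200.0       # droplet speed quoted in the article
chamber_length_m = 0.5  # assumed chamber length, not from the article

transit_s = chamber_length_m / speed_m_s
print(f"transit time: {transit_s * 1e3:.1f} ms")  # 2.5 ms
```

A few milliseconds, consistent with the quote, is all the time available to catch each droplet with an x-ray pulse.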

    Oliver Gessner, Chemical Sciences Division, Berkeley Lab. Credit: Roy Kaltschmidt

    The researchers turned to a facility at SLAC called the Linac Coherent Light Source (LCLS), a DOE Office of Science user facility that is the world’s first x-ray free-electron laser. This laser produces very short light pulses, lasting just a ten-trillionth of a second, which contain a huge number of high-energy photons. These intense x-ray pulses can effectively take snapshots of single, ultra-fast, ultra-small objects and phenomena.

    Inside the SLAC LCLS

    “With the new x-ray free electron laser, we can now image phenomena and look at processes far beyond what we could imagine just a decade ago,” says Bostedt of SLAC. “Looking at the droplets gave us a beautiful glimpse into the quantum world. It really opens the door to fascinating sciences.”

    In the experiment, the researchers blasted a stream of helium nanodroplets across the x-ray laser beam inside a vacuum chamber; a detector caught the pattern that formed when the x-ray light diffracted off the drops.

    The diffraction patterns immediately revealed that many of the droplets were not spheres, as previously assumed. Instead, they were oblate. Just as the Earth’s rotation causes it to bulge at the equator, so too do rotating nanodroplets expand around the middle and flatten at the top and bottom.

    But the vortices themselves are invisible to x-ray diffraction, so the researchers used a trick of adding xenon atoms to the droplets. The xenon atoms get pulled into the vortices and cluster together.

    “It’s similar to pulling the plug in a bathtub and watching the kids’ toys gather in the vortex,” says Gessner. The xenon atoms diffract x-ray light much more strongly than the surrounding helium, making the regular arrays of vortices inside the droplet visible. In this way, the researchers confirmed that vortices in nanodroplets behave like those found in larger amounts of rotating superfluid helium.

    Armed with this new information, the researchers were able to determine the rotational speed of the nanodroplets. They were surprised to find that the nanodroplets spin up to 100,000 times faster than any other superfluid helium sample ever studied in a laboratory.

    Moreover, while normal liquid drops will change shape as they spin faster and faster–to resemble a peanut or multi-lobed globule, for instance–the researchers saw no evidence of such shapeshifting in the helium nanodroplets. “Essentially, we’re exploring a new regime of quantum rotation with this matter,” Gessner says.

    “It’s a new kind of matter in a sense because it is a self-contained isolated superfluid,” he adds. “It’s just all by itself, held together by its own surface tension. It’s pretty perfect to study these systems if one wants to understand superfluidity and isolate it as much as possible.”

    This research was supported by the DOE Office of Science, Office of Basic Energy Sciences, Chemical Sciences, Geosciences and Biosciences Division as well as the National Science Foundation.

    See the full article here.

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal




  • richardmitnick 8:45 am on August 21, 2014 Permalink | Reply
    Tags: X-ray Technology

    From Astrobiology: “Scientists Detect Evidence of ‘Oceans Worth’ of Water in Earth’s Mantle” 

    Astrobiology Magazine

    Aug 21, 2014
    Andrew Williams

    Researchers have found evidence of a potential “ocean’s worth” of water deep beneath the United States.

    Although not present in a familiar form, the building blocks of water are bound up in rock located deep in the Earth’s mantle, and in quantities large enough to represent the largest water reservoir on the planet, according to the research.

    For many years, scientists have attempted to establish exactly how much water may be cycling between the Earth’s surface and interior reservoirs through the action of plate tectonics. Northwestern University geophysicist Steve Jacobsen and University of New Mexico seismologist Brandon Schmandt have found deep pockets of magma around 400 miles beneath North America — a strong indicator of the presence of H₂O stored in the crystal structure of high-pressure minerals at these depths.

    “The total H₂O content of the planet has long been among the most poorly constrained ‘geochemical parameters’ in Earth science. Our study has found evidence for widespread hydration of the mantle transition zone,” says Jacobsen.

    For at least 20 years geologists have known from laboratory experiments that the Earth’s transition zone — a rocky layer of the Earth’s mantle located between the lower mantle and upper mantle, at depths between 250 and 410 miles — can, in theory, hold about 1 percent of its total weight as H₂O, bound up in minerals called wadsleyite and ringwoodite. However, as Schmandt explains, up until now it has been difficult to figure out whether that potential water reservoir is empty, as many have suggested, or not.

    If there does turn out to be a substantial amount of H₂O in the transition zone, then recent laboratory experiments conducted by Jacobsen indicate there should be large quantities of what he calls “partial melt” in areas where mantle flows downward out of the zone. This water-rich silicate melt is molten rock that occurs at grain boundaries between solid mineral crystals and may account for about 1 percent of the volume of rocks.

    Brandon Schmandt (University of New Mexico, left) and Steve Jacobsen (Northwestern University, right) combined seismic observations from the US-Array with laboratory experiments to detect dehydration melting of hydrous mantle material beneath North America at depths of 700-800 km. Credit: University of New Mexico/Northwestern University

    “Melting occurs because hydrated rocks are carried from the transition zone, where the rocks can hold lots of H₂O, downward into the lower mantle, where the rocks cannot hold as much H₂O. Melting is the way to get rid of the H₂O that won’t fit in the crystal structure present in the lower mantle,” says Jacobsen.

    He adds:

    “When a rock starts to melt, whatever H₂O is bound in the rock will go into the melt right away. So the melt would have much higher H₂O concentration than the remaining solid. We’re not sure how it got there. Maybe it’s been stuck there since early in Earth’s history or maybe it’s constantly being recycled by plate tectonics.”

    Seismic Waves

    Melt strongly affects the speed of seismic waves — the acoustic-like waves of energy that travel through the Earth’s layers as a result of an earthquake or explosion. This is because stiff rocks, like the silicate-rich ones present in the mantle, propagate seismic waves very quickly. According to Schmandt, adding just a little melt — even 1 percent or less — between the crystal grains of such a rock makes it less stiff, meaning that elastic waves propagate more slowly.

    “We were able to analyse seismic waves from earthquakes to look for melt in the mantle just beneath the transition zone,” says Schmandt.

    “What we found beneath the U.S. is consistent with partial melt being present in areas of downward flow out of the transition zone. Without the presence of H₂O, it is very difficult to explain melting at these depths. This is a good hint that the transition zone H₂O reservoir is not empty, and even if it’s only partially filled that could correspond to about the same mass of H₂O as in Earth’s oceans,” he adds.

    Jacobsen and Schmandt hope that their findings, published in the June issue of the journal Science, will help other scientists to understand how the Earth formed and what its current composition and inner workings are, as well as establish how much water is trapped in mantle rock.

    “I think we are finally seeing evidence for a whole-Earth water cycle, which may help explain the vast amount of liquid water on the surface of our habitable planet. Scientists have been looking for this missing deep water for decades,” says Jacobsen.

    Mantle Rock Studies

    The study combined Schmandt’s analysis of seismic data from the USArray, a network of over 2,000 seismometers across the U.S., with Jacobsen’s laboratory experiments, in which he examined the behaviour of mantle rock under conditions designed to simulate the high pressures and temperatures present at 400 miles below the Earth’s surface.

    Schematic representation of seismometers placed in the US-Array between 2004 and 2014 and used in the study by Schmandt and Jacobsen to detect dehydration melting at the top of the lower mantle beneath North America. Image Credit: NSF-Earthscope

    The USArray is part of Earthscope, a program sponsored by the National Science Foundation. Jacobsen’s experiments were conducted at two Department of Energy user facilities, the Advanced Photon Source at Argonne National Laboratory and the National Synchrotron Light Source at Brookhaven National Laboratory.

    Argonne APS
    APS at Argonne Lab

    Brookhaven NSLS
    NSLS at Brookhaven

    Taken as a whole, their findings produced strong evidence that melting may occur about 400 miles deep in the Earth, with H₂O stored in mantle rocks, such as those containing the mineral ringwoodite, which is likely to be a dominant mineral at those depths.

    Schmandt explains that he made this discovery after carrying out seismic imaging of the boundary between the transition zone and lower mantle. He found evidence that, in areas where “sharp transitions” like melt are present, some earthquake energy had converted from a compressional, or longitudinal wave, to a shear or S-wave. The phase of the converted S-waves in areas where the mantle is flowing down and out of the transition zone indicated a significantly lower velocity than surrounding mantle. The discovery suggests that water from the Earth’s surface can be driven to such great depths by plate tectonics, eventually resulting in the partial melting of the rocks found deep in the mantle.

    “We used many seismic wave conversions to see that many areas beneath the U.S. may have some melt just beneath the transition zone. The next step was comparing these areas to the areas where mantle flow models predict downward flow out of the transition zone,” says Schmandt.


    Schmandt and Jacobsen’s findings build on a discovery reported in March in the journal Nature in which scientists discovered a piece of the blue mineral ringwoodite inside a diamond brought up from a depth of 400 miles by a volcano in Brazil. That tiny piece of ringwoodite — the only sample we have from within the Earth — contained a surprising amount of water bound in solid form in the mineral.

    “Not only was this the first terrestrial ringwoodite ever seen — all other natural ringwoodite examples came from shocked meteorites — but the tiny inclusion of ringwoodite was also full of H₂O, to about 1.5 percent of total weight,” says Jacobsen. “This is about the maximum amount of water that we are able to put into ringwoodite in laboratory experiments.”

    Although the discovery provided direct evidence of water in the deep mantle at about 700 kilometers (434 miles) deep, the diamond sampled only one point of the mantle. Jacobsen explains that the paper expands the search to question how widespread hydration might be throughout the entire transition zone. This is important because the presence of H₂O in the large volumes of rock found at depths of between 410 to 660 kilometers (255 to 410 miles) would “significantly alter our understanding of the composition of the Earth.”

    Crystals of laboratory-grown hydrous ringwoodite, a high-pressure polymorph of olivine that is stable from about 520-660 km depth in the Earth’s mantle. The ringwoodite pictured here contains around one weight percent of H₂O, similar to what was inferred in the seismic observations made by Schmandt and Jacobsen. Image Credit: Steve Jacobsen/Northwestern University

    “It would double or triple the known amount of H₂O in the bulk Earth. Just 1 to 2 percent H₂O by weight in the transition zone would be equivalent to 2 to 3 times the amount of H₂O in the oceans,” adds Jacobsen.
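Jacobsen’s comparison can be reproduced with rough numbers. Assuming a spherical shell between 410 and 660 km depth, a mean rock density near 3,800 kg/m³, and an ocean mass of about 1.4 × 10²¹ kg (all nominal assumed values, not figures from the article):

```python
import math

# Rough mass of H2O stored at 1 wt% in the mantle transition zone,
# compared with the mass of Earth's oceans. All inputs are nominal
# assumed values, not figures from the article.
R_EARTH_M = 6.371e6
rho_rock = 3800.0       # kg/m^3, assumed mean transition-zone density
ocean_mass_kg = 1.4e21  # assumed total mass of Earth's oceans

def shell_volume(depth_top_m, depth_bottom_m):
    """Volume of a spherical shell between two depths below the surface."""
    r_outer = R_EARTH_M - depth_top_m
    r_inner = R_EARTH_M - depth_bottom_m
    return 4 / 3 * math.pi * (r_outer**3 - r_inner**3)

tz_mass = shell_volume(410e3, 660e3) * rho_rock
water_mass = 0.01 * tz_mass  # 1 wt% H2O
print(f"~{water_mass / ocean_mass_kg:.1f} oceans of water")
```

With these inputs, 1 wt% of the transition zone comes out to roughly three ocean masses of water, in line with the 2-to-3-times figure quoted above.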

    Big Questions

    Looking ahead, Jacobsen admits that some big questions remain. For example, if the transition zone is full of H₂O, what does this tell us about the origin of Earth’s water? And is the presence of ringwoodite in a planet’s mantle necessary for a planet to retain enough original water to form oceans? Moreover, how is the H₂O in the transition zone connected to the surface reservoirs? Is the transition zone, if it contains a geochemical reservoir of H₂O larger than the oceans, somehow buffering the amount of liquid water on the Earth’s surface?

    “An analogy could be that of a sponge, which needs to be filled before liquid water can be supported on top. Was water in the transition zone added through plate tectonics early in Earth’s history, or did the oceans de-gas from the mantle until an equilibrium was reached between surface and interior reservoirs?” asks Jacobsen.

    Either way, the research is likely to be of strong interest to astrobiologists largely because water is often so closely linked to the formation of biological life. Remote geochemical analysis could be one way of detecting if such processes occur elsewhere in the universe, and it is likely that such analysis would involve the use of gamma-ray, neutron, and x-ray spectrometers of the type used by the NASA MESSENGER spacecraft for the remote geochemical mapping of Mercury.

    NASA Messenger satellite
    NASA Messenger

    “On other hard to reach planets it’s not practical to apply the type of seismic imaging that I used. So my guess is that geochemical analysis of volcanic rocks from other planetary bodies may be our best way to test whether volatiles are stored in the planet’s interior,” says Schmandt.

    See the full article here.


    ScienceSprings relies on technology from

    MAINGEAR computers



  • richardmitnick 2:51 pm on August 14, 2014 Permalink | Reply
    Tags: , , , X-ray Technology   

    From SLAC Lab: “SLAC Secures Role in Energy Frontier Research Center Focused on Next-generation Materials” 

    SLAC Lab

    August 14, 2014

    X-ray Studies will Explore Hybrid Materials for Solar Energy, Efficient Lighting and Other Uses

    The Department of Energy’s SLAC National Accelerator Laboratory will play a key role in a research consortium that seeks out new materials for next-generation solar panels, low-energy lighting and other uses.

    Collaborators in this effort will use SLAC’s Stanford Synchrotron Radiation Lightsource (SSRL), a DOE Office of Science User Facility, to characterize these new materials as they are being discovered.

    SLAC SSRL Accelerator Tunnel

    A researcher at SLAC’s Stanford Synchrotron Radiation Lightsource holds up a thin strip of material printed with an ink (magenta) relevant to solar-energy conversion. SSRL will play a role in a new center, led by the National Renewable Energy Laboratory in Colorado, that will explore new materials for solar panels, energy-efficient lighting and other uses. (SLAC National Accelerator Laboratory)

    The collaboration will also aid in understanding their structure and performance as they operate. The work is made possible by a four-year, $14 million DOE award for an Energy Frontier Research Center (EFRC) distributed among several national labs and universities.

    “We are pushing the idea of ‘materials by design’ to the next step,” said Mike Toney, an SSRL senior staff scientist and head of the SSRL Materials Science Division. Toney will oversee SLAC’s contributions to this Center for Next Generation of Materials by Design: Incorporating Metastability, which is led by the National Renewable Energy Laboratory (NREL) in Colorado.

    “It’s a theory-centric center that aims to tell you what materials to make and how to make them,” Toney added. SLAC’s role will be purely experimental: investigating the novel materials with a slew of X-ray techniques. Watching materials as they’re being made and while they’re operating are already SSRL specialties, Toney said.

    “SLAC is a key partner on our EFRC team, bringing unique characterization tools to probe and understand new materials, including the processes that control their formation,” Bill Tumas, EFRC director and associate lab director for materials and chemistry at NREL, said.

    The center is one of 32 EFRCs approved by the DOE in June, which follow an initial batch of DOE research centers approved five years ago. According to Toney, SLAC played an important role in one of the earlier centers, the Center for Inverse Design, which was also led by NREL and laid the groundwork for the new round of research.

    In materials science it’s common to work from known materials and modify them to achieve desired properties. The Center for Inverse Design sought to flip this approach on its head by using theory integrated with experiment to discover new materials with desired properties.

    The new center stretches this idea to a realm where the sought-after material properties are complex and theory and computation are not fully developed. It will initially focus on creating new semiconductor materials that can be incorporated into solar energy conversion systems and solid-state lighting technologies that use less power than standard light bulbs.

    It also aims to tackle “multiple-property design” – tailoring materials with several enhanced properties. And it will explore lesser-understood “metastable” materials, which can have desirable traits but are not in their most stable state – they can fall back to a lower, more stable energy level when disturbed, for example.

    “These centers are bringing together different groups of people who normally would not converse,” Toney said, which makes for lively discussions and innovative approaches to scientific challenges.

    Other participants in the new research center, which starts up this summer, are from Oregon State University, Colorado School of Mines, the Massachusetts Institute of Technology, Lawrence Berkeley National Laboratory and Harvard University.

    See the full article here.

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

    ScienceSprings relies on technology from

    MAINGEAR computers



  • richardmitnick 3:33 pm on August 13, 2014 Permalink | Reply
    Tags: , , Bone Density, Osteoporosis, X-ray Technology   

    From APS at Argonne Lab: “Revealing a Novel Mode of Action for an Osteoporosis Drug” 

    News APS at Argonne National Laboratory

    August 13, 2014

    Emma Nichols

    Raloxifene is a U.S. Food and Drug Administration (FDA)-approved treatment for decreasing fracture risk in osteoporosis. Although raloxifene reduces fracture risk as effectively as other current treatments, it does so only partly by suppressing bone loss. With the use of wide- and small-angle x-ray scattering (WAXS and SAXS, respectively), researchers carried out experiments at the U.S. Department of Energy’s (DOE’s) Advanced Photon Source (APS) at Argonne National Laboratory that revealed an additional mechanism underlying raloxifene action, providing an explanation for how this drug can achieve equivalent clinical benefit.

    Schematic of mechanical testing apparatus utilized during collection of WAXS diffraction data at the APS. Bone beams were subjected to 4-point bending, and after each displacement of 20 mm (black arrow), 20 x-ray scattering measurements were taken (red dashed line). Diffraction patterns collected by the detector (indicated) allow for quantification of strain experienced by hydroxyapatite crystals and mineralized collagen within the bone. Adapted from M.A. Gallant, Bone 61, 191 (2014).

    These data, together with complementary techniques, help define a novel mechanism by which raloxifene increases inherent bone toughness.

    In osteoporosis, decreased bone density increases the risk of fracture. All current drugs for treatment of this disease act upon living cells within the bone matrix to either decrease bone resorption, a process by which the mineral components of bone are broken down and released into the bloodstream, or to increase net bone formation during remodeling, a process in which bone is broken down and then rebuilt. In either case, treatment results in an overall increase in bone density, and therefore a reduction in fracture risk.

    While raloxifene is known to mildly suppress bone loss, “It has always been somewhat paradoxical that raloxifene suppresses bone loss less than other osteoporosis therapies, yet reduces fracture risk to about the same level,” said David B. Burr of Indiana University School of Medicine and lead author of the Bone article on this research.

    To uncover the density-independent mechanism by which raloxifene (marketed as Evista by Eli Lilly and Company) increases bone toughness, researchers in this study from the Indiana University School of Medicine; Purdue University; Indiana University–Purdue University at Indianapolis; the University of California, San Diego; Northwestern University; and Argonne National Laboratory assessed the effect of the drug on devitalized bone cleared of living cells that normally mediate resorption and remodeling.

    In these bone samples, raloxifene prolonged the loading that the bone could bear before fracturing, indicating that the drug was acting upon the physical properties of the bone itself. Using ultra-short-echo-time nuclear magnetic resonance, researchers found that raloxifene-mediated water retention within the bone matrix is associated with the observed increase in toughness.

    In order to elucidate the mechanism underlying this association, researchers collected WAXS and SAXS diffraction patterns of carbonated hydroxyapatite crystals (cAp), the mineral component of bone that had been subjected to four-point bending. These data, collected at the X-ray Science Division 1-ID x-ray beamline at the Argonne APS, a Department of Energy user facility, allowed the researchers to measure mechanical strains on cAp crystals at a resolution of 1μm and showed that raloxifene increased the amount of physical deformation, or strain, that occurred at the collagen-mineral interface before fracture.

    This increased strain between cAp and collagen reduces stresses and may be caused by water-mediated slipping between these components at their interface, increasing the amount of energy the bone can absorb prior to fracture.
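    The strain quantification behind these measurements rests on Bragg's law: mechanical load changes the lattice spacing d of the mineral crystals, which shifts the diffraction angle, and lattice strain is the fractional change relative to the unloaded spacing. A minimal sketch, with an illustrative wavelength and peak positions rather than the experiment's actual values:

```python
import math

def d_spacing(wavelength, two_theta_deg):
    """Bragg's law, first order: lambda = 2 d sin(theta)."""
    theta = math.radians(two_theta_deg / 2.0)
    return wavelength / (2.0 * math.sin(theta))

wavelength = 0.1722          # angstrom, illustrative high-energy (~72 keV) beam
two_theta_unloaded = 2.8700  # deg, a cAp reflection with no load (illustrative)
two_theta_loaded = 2.8685    # deg, same reflection under bending (illustrative)

d0 = d_spacing(wavelength, two_theta_unloaded)
d1 = d_spacing(wavelength, two_theta_loaded)
lattice_strain = (d1 - d0) / d0  # positive: lattice stretched along the probe
print(f"lattice strain = {lattice_strain:.2e}")
```

    A shift of just a few thousandths of a degree at these small scattering angles corresponds to a strain of a few parts in ten thousand, which is why the high angular precision of a synchrotron beamline is needed.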

    “The x-ray diffraction data,” Burr said, “allowed us to explain the mechanism by which increases in bound water would improve the fracture properties of bone.”

    According to Burr, this work uncovers an entirely novel mechanism of action for raloxifene and “paves the way for a new class of drugs to treat osteoporosis, therapies that do not act by altering cellular activity or bone remodeling, but act by directly changing the physical properties of the bone matrix constituents.”

    See the full article here.

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

    The Advanced Photon Source at Argonne National Laboratory is one of five national synchrotron radiation light sources supported by the U.S. Department of Energy’s Office of Science to carry out applied and basic research to understand, predict, and ultimately control matter and energy at the electronic, atomic, and molecular levels, provide the foundations for new energy technologies, and support DOE missions in energy, environment, and national security.

    Argonne Lab Campus
    Argonne APS Banner
    ScienceSprings relies on technology from

    MAINGEAR computers



  • richardmitnick 10:12 am on August 7, 2014 Permalink | Reply
    Tags: , , , X-ray Technology   

    From SLAC Lab: “Catching Chemistry in Motion” 

    SLAC Lab

    August 6, 2014
    Laser-timing Tool Works at the Speed of Electrons

    Researchers at the Department of Energy’s SLAC National Accelerator Laboratory have developed a laser-timing system that could allow scientists to take snapshots of electrons zipping around atoms and molecules. Taking timing to this new extreme of speed and accuracy at the Linac Coherent Light Source X-ray laser, a DOE Office of Science user facility, will make it possible to see the formative stages of chemical reactions.

    “Previously, we could see a chemical bond before it’s broken and after it’s broken,” said Ryan Coffee, an LCLS scientist whose team developed this system. “With this tool, we can watch the bond while it is breaking and ‘freeze-frame’ it.”

    The success of most LCLS experiments relies on precise timing of the X-ray laser with another laser, a technique known as “pump-probe.” Typically, light from an optical laser “pumps” or triggers a specific effect in a sample, and researchers vary the arrival of the X-ray laser pulses, which serve as the “probe” to capture images and other data that allow them to study the effects at different points in time.

    Pump-probe experiments at LCLS are used to study a wide range of processes at the atomic or molecular scale, including studies of biological samples and exotic materials like high-temperature superconductors.

    But LCLS X-ray pulses are tricky to control. They have inherent jitter that causes them to fluctuate in arrival time, energy, position, duration and the wavelength of their light.

    There are several tools and techniques that scientists use to understand and limit the impacts of jitter on experiments, and timing tools counter the arrival-time jitter by offering very precise measurements. These measurements can help scientists to interpret their data by pinpointing the timing of changes they see in samples after they are exposed to the first laser pulse. Some experiments would not be possible without precise timing tools because of the ultrafast scale of the changes they are trying to observe.
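    One common way such timing measurements are used is to re-sort the data after the fact: each shot is binned by its measured arrival time rather than its nominal set-point, so jitter no longer smears the time axis. A toy sketch of that bookkeeping (synthetic numbers throughout; this is not LCLS analysis code):

```python
import math
import random

random.seed(0)
TAU = 200.0  # fs, decay constant of the toy pump-probe response

shots = []
for _ in range(20_000):
    nominal = random.choice(range(0, 500, 50))   # fs, requested pump-probe delay
    actual = nominal + random.gauss(0.0, 100.0)  # fs, true delay including jitter
    measured = actual                            # per-shot timing-tool readout
    signal = math.exp(-max(actual, 0.0) / TAU)   # toy sample response at the true delay
    shots.append((measured, signal))

# Bin each shot by its *measured* arrival time instead of the nominal set-point;
# binning by `nominal` would smear the curve by the 100 fs jitter.
bin_width = 10.0
bins = {}
for measured, signal in shots:
    if 0.0 <= measured < 500.0:
        bins.setdefault(int(measured // bin_width), []).append(signal)
corrected = {b * bin_width: sum(s) / len(s) for b, s in sorted(bins.items())}
```

    Even though each shot's delay is only coarsely set, the corrected curve recovers the sample's true time dependence at the resolution of the timing measurement.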

    Achieving ‘Attosecond’ Experiments

    An illustration of the setup used to test an “attosecond” timing tool at SLAC’s Linac Coherent Light Source X-ray laser. The dashed line, produced by an algorithm that analyzes the colorized spectrograph image (bottom), represents the arrival time of the X-ray laser. (Ryan Coffee and Nick Hartmann/SLAC)

    Timing tools now in place at most LCLS experimental stations can measure the arrival time of the optical and X-ray laser pulses to an accuracy within 10 femtoseconds, or quadrillionths of a second. The new pulse-measuring system, which is highlighted in the July 27 edition of Nature Photonics, builds upon the existing tools and pushes timing to attoseconds, which are quintillionths (billion-billionths) of a second.

    This animation shows a sequence of spectrograph images used to precisely measure arrival time of X-rays relative to optical laser pulses at SLAC’s LCLS. The upper edge of the dark blue pattern represents the arrival time of the X-ray laser pulse. The scale at left measures the relative delay of X-ray and optical laser pulses, and the bottom measures the wavelength of the transmitted optical light. (Nick Hartmann/SLAC)

    Nick Hartmann, an LCLS research associate and doctoral student at the University of Bern in Switzerland who is the lead author of the study detailing the system, said, “An X-ray laser with attosecond timing resolution would open up a new class of experiments on the natural time scale of electron motion.”

    The new system uses a high-resolution spectrograph, a type of camera that records the timing and wavelength of the probe laser pulses. The colorful patterns it displays represent the different wavelengths of light that passed, at slightly different times, through a thin sample of silicon nitride.

    This material experiences a cascading reaction in its electrons when it is struck by an X-ray pulse. This effect leaves a brief imprint in the way light passes through the sample, sort of like a temporary interruption of vision following a camera’s flash.

    This X-ray-caused effect shows up in the way the light from the other laser pulse passes through the silicon nitride – it is seen as a brief dip in the amount of light recorded by the spectrograph, like the after-image of a camera flash. An image-analysis algorithm then precisely calculates, based on the recorded patterns, the relative arrival time of the X-ray pulses.
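    A minimal version of that kind of edge finding on a one-dimensional lineout is to locate where the transmission first drops through the halfway level between the pre-dip and post-dip plateaus. This is a generic sketch of the idea, not the published algorithm:

```python
import random

random.seed(1)

def find_edge(trace, baseline=50):
    """Index where a step-down trace first crosses its half level."""
    high = sum(trace[:baseline]) / baseline    # pre-X-ray transmission plateau
    low = sum(trace[-baseline:]) / baseline    # post-X-ray (dipped) plateau
    half = 0.5 * (high + low)
    for i, value in enumerate(trace):
        if value < half:
            return i
    return None

# Toy spectrograph lineout: transmission drops by 40% at pixel 300
trace = [(1.0 if i < 300 else 0.6) + random.gauss(0.0, 0.01) for i in range(1000)]
edge_pixel = find_edge(trace)
```

    Because the spectrograph spreads arrival time across pixels, converting the located pixel back to a time requires the instrument's time-per-pixel calibration.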

    The new timing system is designed to avoid distortion effects caused by some other timing tools and to work reliably with a variety of focusing and filtering tools. It can provide real-time readouts of laser arrival times and jitter to benefit experiments in progress, and can be added to existing timing setups at LCLS.

    Hartmann said additional innovations could expand the applications of the new system: “We are putting the parts together to allow attosecond experiments at LCLS and other X-ray lasers like it.”

    These three panels show different types of jitter, or fluctuations, in the X-ray laser pulses produced at SLAC’s Linac Coherent Light Source. The left panel shows how the X-ray beam fluctuates in its direction. The middle panel shows how the spectrum (wavelength or “color”) of the X-ray laser changes randomly from pulse to pulse. The right panel shows the X-ray-caused dip in the amount of light being recorded. (SLAC)

    See the full article here.

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

    ScienceSprings relies on technology from

    MAINGEAR computers



  • richardmitnick 8:49 am on August 6, 2014 Permalink | Reply
    Tags: , , , X-ray Technology   

    From Brookhaven Lab: “New Method Provides Nanoscale Details of Electrochemical Reactions in Electric Vehicle Battery Materials” 

    Brookhaven Lab

    August 4, 2014
    Karen McNulty Walsh, (631) 344-8350 or Peter Genzer, (631) 344-3174

    Using a new method to track the electrochemical reactions in a common electric vehicle battery material under operating conditions, scientists at the U.S. Department of Energy’s Brookhaven National Laboratory have revealed new insight into why fast charging inhibits this material’s performance. The study also provides the first direct experimental evidence to support a particular model of the electrochemical reaction. The results, published August 4, 2014, in Nature Communications, could provide guidance to inform battery makers’ efforts to optimize materials for faster-charging batteries with higher capacity.

    Jiajun Wang, Karen Chen and Jun Wang prepare a sample for study at NSLS beamline X8C.

    “This is the first time anyone has been able to see that delithiation was happening differently at different spatial locations on an electrode under rapid charging conditions.”
    — Brookhaven physicist Jun Wang

    “Our work was focused on developing a method to track structural and electrochemical changes at the nanoscale as the battery material was charging,” said Brookhaven physicist Jun Wang, who led the research. Her group was particularly interested in chemically mapping what happens in lithium iron phosphate—a material commonly used in the cathode, or positive electrode, of electrical vehicle batteries—as the battery charged. “We wanted to catch and monitor the phase transformation that takes place in the cathode as lithium ions move from the cathode to the anode,” she said.

    Getting as many lithium ions as possible to move from cathode to anode through this process, known as delithiation, is the key to recharging the battery to its fullest capacity so it will be able to provide power for the longest possible period of time. Understanding the subtle details of why that doesn’t always happen could ultimately lead to ways to improve battery performance, enabling electric vehicles to travel farther before needing to be recharged.

    X-ray imaging and chemical fingerprinting

    In operando 2D chemical mapping of a multi-particle lithium iron phosphate cathode during fast charging (top to bottom). The called-out close-up frame shows that as the sample charges, some regions become completely delithiated (green) while others remain completely lithiated (red). This inhomogeneity results in a lower overall battery capacity than can be attained with slower charging, where delithiation occurs more evenly throughout the electrode. No image credit

    Many previous methods used to analyze such battery materials have produced data that average out effects over the entire electrode. These methods lack the spatial resolution needed for chemical mapping or nanoscale imaging, and are likely to overlook possible small-scale effects and local differences within the sample, Wang explained.

    To improve upon those methods, the Brookhaven team used a combination of full-field, nanoscale-resolution transmission x-ray microscopy (TXM) and x-ray absorption near-edge spectroscopy (XANES) at the National Synchrotron Light Source (NSLS), a DOE Office of Science User Facility that provides beams of high-intensity x-rays for studies in many areas of science. These x-rays can penetrate the material to produce both high-resolution images and spectroscopic data—a sort of electrochemical “fingerprint” that reveals, pixel by pixel, where lithium ions remain in the material, where they’ve been removed leaving only iron phosphate, and other potentially interesting electrochemical details.
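    Pixel-by-pixel chemical mapping of this kind is often done by fitting each pixel's absorption spectrum as a linear combination of reference spectra for the two end-member phases, here lithiated LiFePO4 and delithiated FePO4. A sketch with synthetic spectra (the edge energies and shapes are illustrative, not measured references):

```python
import numpy as np

energies = np.linspace(7100.0, 7200.0, 50)  # eV, a window around the Fe K-edge

def absorption_edge(e0):
    """Toy XANES reference: a smooth absorption step centered at e0."""
    return 1.0 / (1.0 + np.exp(-(energies - e0) / 2.0))

ref_lithiated = absorption_edge(7120.0)    # LiFePO4 (Fe2+), illustrative edge
ref_delithiated = absorption_edge(7124.0)  # FePO4 (Fe3+), edge shifted up

# One pixel's spectrum, here synthesized as a 30/70 mix of the two phases
pixel_spectrum = 0.3 * ref_lithiated + 0.7 * ref_delithiated

# Least-squares fit of the pixel against the two reference spectra
A = np.column_stack([ref_lithiated, ref_delithiated])
coeffs, *_ = np.linalg.lstsq(A, pixel_spectrum, rcond=None)
fraction_delithiated = coeffs[1] / coeffs.sum()
```

    Repeating the fit for every pixel turns a stack of images taken at different energies into a map of the local state of charge.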

    The scientists used these methods to analyze samples made up of multiple nanoscale particles in a real battery electrode under operating conditions (in operando). But because there can be a lot of overlap of particles in these samples, they also conducted the same in operando study using smaller amounts of electrode material than would be found in a typical battery. This allowed them to gain further insight into how the delithiation reaction proceeds within individual particles without overlap. They studied each system (multi-particle and individual particles) under two different charging scenarios—rapid (like you’d get at an electric vehicle recharging station), and slow (used when plugging in your vehicle at home overnight).

    Insight into why charging rate matters

    The detailed images and spectroscopic information reveal unprecedented insight into why fast charging reduces battery capacity. At the fast charging rate, the pixel-by-pixel images show that the transformation from lithiated to delithiated iron phosphate proceeds inhomogeneously. That is, in some regions of the electrode, all the lithium ions are removed leaving only iron phosphate behind, while particles in other areas show no change at all, retaining their lithium ions. Even in the “fully charged” state, some particles retain lithium and the electrode’s capacity is well below the maximum level.

    “This is the first time anyone has been able to see that delithiation was happening differently at different spatial locations on an electrode under rapid charging conditions,” Jun Wang said.

    Slower charging, in contrast, results in homogeneous delithiation, where lithium iron phosphate particles throughout the electrode gradually change over to pure iron phosphate—and the electrode has a higher capacity.

    Implications for better battery design

    Scientists have known for a while that slow charging is better for this material, “but people don’t want to charge slowly,” said Jiajun Wang, the lead author of the paper. “Instead, we want to know why fast charging gives lower capacity. Our results offer clues to explain why, and could give industry guidance to help them develop a future fast-charge/high-capacity battery,” he said.

    For example, the phase transformation may happen more efficiently in some parts of the electrode than others due to inconsistencies in the physical structure or composition of the electrode—for example, its thickness or how porous it is. “So rather than focusing only on the battery materials’ individual features, manufacturers might want to look at ways to prepare the electrode so that all parts of it are the same, so all particles can be involved in the reaction instead of just some,” he said.

    The individual-particle study also detected, for the first time, the coexistence of two distinct phases—lithiated iron phosphate and delithiated, or pure, iron phosphate—within single particles. This finding confirms one model of the delithiation phase transformation—namely that it proceeds from one phase to the other without the existence of an intermediate phase.

    “These discoveries provide the fundamental basis for the development of improved battery materials,” said Jun Wang. “In addition, this work demonstrates the unique capability of applying nanoscale imaging and spectroscopic techniques in understanding battery materials with a complex mechanism in real battery operational conditions.”

    The paper notes that this in operando approach could be applied in other fields, such as studies of fuel cells and catalysts, and in environmental and biological sciences.

    Future studies using these techniques at NSLS-II—which will produce x-rays 10,000 times brighter than those at NSLS—will have even greater resolution and provide deeper insight into the physical and electrochemical characteristics of these materials, thus making it possible for scientists to further elucidate how those properties affect performance.

    Yu-chen Karen Chen-Wiegart also contributed to this research. This work was supported by a Laboratory Directed Research and Development (LDRD) project at Brookhaven National Laboratory. The use of the NSLS was supported by the U.S. Department of Energy’s Office of Science.

    See the full article here.

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

    ScienceSprings relies on technology from

    MAINGEAR computers



  • richardmitnick 7:36 pm on July 22, 2014 Permalink | Reply
    Tags: , , , , , X-ray Technology   

    From SLAC: “Bringing High-energy X-rays into Better Focus” 

    SLAC Lab

    July 22, 2014
    SLAC-invented Etching Process Builds Custom Nanostructures for X-ray Optics

    Scientists at the Department of Energy’s SLAC National Accelerator Laboratory have invented a customizable chemical etching process that can be used to manufacture high-performance focusing devices for the brightest X-ray sources on the planet, as well as to make other nanoscale structures such as biosensors and battery electrodes.

    “The tools researchers use to manipulate X-rays today are very limited,” said Anne Sakdinawat, an associate staff scientist at SLAC’s Stanford Synchrotron Radiation Lightsource (SSRL) who developed the new “V-MACE” process with Chieh Chang, an SSRL research associate.

    Scanning electron microscope image of a cleaved spiral zone plate, a type of X-ray optic, created using a chemical etching technique that was developed at SLAC. (Chieh Chang, Anne Sakdinawat)

    “Our new technique for fabricating high performance X-ray optics involves just a few chemicals in a simple, easy-to-implement, one-step technology,” Sakdinawat said. “It offers significant advantages in many far-ranging applications.” The patent-pending technique is detailed in the June 27 edition of Nature Communications.

    Focusing X-rays, particularly higher-energy or “hard” X-rays, is particularly challenging at the nanoscale, though it is key to the success of many scientific studies at two of SLAC’s DOE Office of Science user facilities, SSRL and the Linac Coherent Light Source (LCLS) X-ray laser.

    It is also of great interest for commercial applications such as X-ray microscopy, complex electronics, and biomedical devices and imaging tools.

    Existing tools for focusing hard X-rays, such as specialized mirrors and sequences of concave metal structures that form lenses, are generally limited in how they can shape the X-ray light. Focusing the highest-energy X-rays to produce crisp images remains a challenge, as the focusing tools themselves generally lack nanoscale precision and sap away much of the X-ray energy.

    “It’s been technologically very difficult to fabricate structures that offer both high resolution and high efficiency,” Sakdinawat said, and the effectiveness of the structures, which are examples of X-ray “diffractive optics,” is typically based on the height and precision of their features.

    The new fabrication technique is adapted from a process used to create hairlike silicon wires for research on advanced batteries and electronics. It can fabricate structures up to 100 times as tall as they are wide, with dimensions accurate to billionths of a meter. The technique reduces the need to stack multiple layers to create tall structures.

    The researchers used the etching technique to build tall, precise X-ray diffractive optics, called zone plates, whose thinly spaced lines, symmetric rings or spiral patterns alternately obstruct or phase-shift X-rays and allow them to pass through in a way that separates and refocuses them. This improves the focus and produces higher-quality images.
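    For context, a zone plate's ring positions follow from requiring successive zones to differ in optical path by half a wavelength, which gives ring radii of roughly r_n = √(nλf); the width of the outermost zone then limits the achievable focal spot size. A quick sketch with illustrative numbers, not the parameters of the SLAC optics:

```python
import math

wavelength = 1.24e-10   # m, a ~10 keV hard x-ray (illustrative)
focal_length = 0.05     # m, illustrative focal length
n_zones = 500

# Radius of the n-th zone under the thin-plate approximation r_n = sqrt(n*lambda*f)
radii = [math.sqrt(n * wavelength * focal_length) for n in range(1, n_zones + 1)]
outer_zone_width = radii[-1] - radii[-2]  # sets the achievable resolution

print(f"outermost radius: {radii[-1] * 1e6:.1f} um")
print(f"outermost zone width: {outer_zone_width * 1e9:.1f} nm")
```

    The outermost zones are only tens of nanometers wide, which is why fabricating them tall enough to diffract hard X-rays efficiently is the hard part that the V-MACE etching process addresses.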

    Scanning electron microscope (SEM) image of a zone plate pattern produced using a chemical etching technique invented at SLAC. (Chieh Chang, Anne Sakdinawat)

    This scanning electron microscope image shows a cross-sectional view of a zone plate produced using a patent-pending chemical etching technique called “V-MACE” developed at SLAC. (Chieh Chang, Anne Sakdinawat)

    “Basically, this is like an artificial crystal,” Sakdinawat said, diffracting the X-ray light in a predictable pattern, as a crystal would. “You can basically manipulate the light in whatever fashion you want – you can shape the light in different ways,” she said, based on the design of the optics and the needs of the experiment.

    Sakdinawat and Chang tested and imaged a sample zone plate at SSRL, and they hope to construct similar plates for use in experiments at SSRL and LCLS.

    The same technique can be used to build other types of precise silicon and metal-coated nanostructures, such as filtration devices, thermoelectric devices that can create electricity from heat and components for tiny bio-sensors that can be embedded in the body, and researchers are working to tailor the process to suit the needs of government agencies and corporate partners.

    “We’re trying to expand into other fields,” Sakdinawat said. “There are many different applications for this.”

    See the full article here.

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

    ScienceSprings is powered by MAINGEAR computers

  • richardmitnick 9:17 am on July 21, 2014 Permalink | Reply
    Tags: , , , X-ray Technology   

    From Fermilab: “Prototype CT scanner could improve targeting accuracy in proton therapy treatment” 

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Monday, July 21, 2014
    Rhianna Wisniewski

    A prototype proton CT scanner developed by Fermilab and Northern Illinois University could someday reduce the amount of radiation delivered to healthy tissue in a patient undergoing cancer treatment.

    Members of the prototype proton CT scanner collaboration move the detector into the CDH Proton Center in Warrenville. Photo: Reidar Hahn

    The proton CT scanner would better target radiation doses to the cancerous tumors during proton therapy treatment. Physicists recently started testing with beam at the CDH Proton Center in Warrenville.

    To create a custom treatment plan for each proton therapy patient, radiation oncologists currently use X-ray CT scanners to develop 3-D images of patient anatomy, including the tumor, to determine the size, shape and density of all organs and tissues in the body. To make sure all the tumor cells are irradiated to the prescribed dose, doctors often set the targeting volume to include a minimal amount of healthy tissue just outside the tumor.

    Collaborators believe that the prototype proton CT scanner, which is essentially a particle detector, will provide a more precise 3-D map of the patient anatomy. That would allow doctors to target beam delivery more accurately, reducing the radiation delivered to healthy tissue during both imaging and treatment.

    “The dose to the patient with this method would be lower than using X-ray CTs while getting better precision on the imaging,” said Fermilab’s Peter Wilson, PPD associate head for engineering and support.

    Fermilab became involved in the project in 2011 at the request of NIU’s high-energy physics team because of the laboratory’s detector building expertise.

    The project’s goal was a tall order, Wilson explained. The group wanted to build a prototype device, imaging software and computing system that could collect data from 1 billion protons in less than 10 minutes and then produce a 3-D reconstructed image of a human head, also in less than 10 minutes. To do that, they needed to create a device that could read data very quickly, since every second data from 2 million protons would be sent from the device — which detects only one proton at a time — to a computer.
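    The stated requirement checks out arithmetically: a readout rate of 2 million protons per second sustained over 10 minutes comfortably exceeds the 1-billion-proton goal. A quick back-of-the-envelope sketch (the bytes-per-event figure is a hypothetical illustration, not a number from the article):

    ```python
    # Sanity-check the stated throughput requirement: 1 billion protons in
    # under 10 minutes at a readout rate of 2 million protons per second.
    protons_per_second = 2_000_000
    scan_time_s = 10 * 60
    protons_collected = protons_per_second * scan_time_s  # 1.2 billion

    # Rough sustained data-rate estimate; ~48 bytes/event is an assumed,
    # purely illustrative event size.
    bytes_per_event = 48
    data_rate_mb_s = protons_per_second * bytes_per_event / 1e6  # ~96 MB/s
    ```

    Sustaining on the order of 100 MB/s from a detector that sees one proton at a time is what makes the fast readout electronics the hard part of the design.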

    NIU physicist Victor Rykalin recommended building a scintillating fiber tracker detector with silicon photomultipliers. A similar detector was used in the DZero experiment.

    “The new prototype CT is a good example of the technical expertise of our staff in detector technology. Their expertise goes back 35 to 45 years and is really what makes it possible for us to do this,” Wilson said.

    In the prototype CT, protons pass through two tracking stations, which track the particles’ trajectories in three dimensions. (See figure below.) The protons then pass through the patient and finally through two more tracking stations before stopping in the energy detector, which is used to calculate the total energy loss through the patient. Devices called silicon photomultipliers pick up signals from the light resulting from these interactions and subsequently transmit electronic signals to a data acquisition system.

    In the prototype proton CT scanner, protons enter from the left, passing through planes of fibers and the patient’s head. Data from the protons’ trajectories, including the energy deposited in the patient, is collected in a data acquisition system (right), which is then used to map the patient’s tissue. Image courtesy of George Coutrakon, NIU
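    The per-proton measurements described above (an entry track from the two upstream stations, an exit track from the two downstream stations, and the residual energy) can be pictured as a simple event record. This is an illustrative sketch only: the field names and the 200 MeV beam energy are assumptions, not details from the article.

    ```python
    from dataclasses import dataclass

    @dataclass
    class ProtonEvent:
        """Minimal per-proton record implied by the detector layout (illustrative)."""
        entry_pos: tuple             # (x, y, z) from the two upstream tracking stations
        entry_dir: tuple             # unit direction vector at entry
        exit_pos: tuple              # position from the two downstream stations
        exit_dir: tuple              # direction at exit
        residual_energy_mev: float   # measured by the energy detector

    def energy_loss(event, beam_energy_mev=200.0):
        """Total energy the proton lost in the patient; beam energy is assumed."""
        return beam_energy_mev - event.residual_energy_mev

    ev = ProtonEvent((0, 0, 0), (0, 0, 1), (0, 0, 200), (0, 0, 1), 150.0)
    loss = energy_loss(ev)  # 50 MeV deposited along this proton's path
    ```

    Each such record contributes one measurement of integrated stopping power along one path through the patient, which is the raw input to the reconstruction step described next.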

    Scientists use specialized software and a high-performance computer at NIU to accurately map the proton stopping powers in each cubic millimeter of the patient. From this map, visually displayed as conventional CT slices, the physician can outline the margins, dimensions and location of the tumor.
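    One standard way to turn many such per-proton measurements into a voxel map is algebraic reconstruction (Kaczmarz iteration): each proton supplies one linear equation relating the unknown voxel stopping powers to its measured water-equivalent path length. The article does not say which algorithm NIU's software uses, so the sketch below is a hedged illustration of the general technique on a two-voxel toy problem, not the project's actual code.

    ```python
    # Toy algebraic reconstruction (Kaczmarz/ART) for proton CT, pure Python.
    # Each proton i contributes one equation: sum_j L[i][j] * rsp[j] = wepl[i],
    # where L[i][j] is proton i's path length through voxel j and wepl[i] is
    # its measured water-equivalent path length.
    def art_reconstruct(L, wepl, n_iters=200):
        n_vox = len(L[0])
        rsp = [0.0] * n_vox
        for _ in range(n_iters):
            for row, w in zip(L, wepl):
                norm = sum(l * l for l in row)
                if norm == 0:
                    continue
                # Project the current estimate onto this proton's equation.
                resid = (w - sum(l * r for l, r in zip(row, rsp))) / norm
                for j in range(n_vox):
                    rsp[j] += resid * row[j]
        return rsp

    # Two voxels, two proton paths with known path lengths (mm):
    L = [[1.0, 1.0],   # proton 1 crosses both voxels
         [1.0, 0.0]]   # proton 2 crosses only the first
    wepl = [2.1, 1.0]  # water-equivalent path lengths (mm)
    rsp = art_reconstruct(L, wepl)  # converges to rsp = [1.0, 1.1]
    ```

    In practice the system has a billion equations and millions of voxels, which is why a high-performance computer and roughly ten minutes of reconstruction time are needed.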

    Elements of the prototype were developed at both NIU and Fermilab and then assembled at Fermilab. NIU developed the software and computing systems, while Fermilab teams designed and built the tracker along with the readout electronics for the tracker and the energy measurement. The scintillator plates, fibers and trackers were also prepared at Fermilab. A group of about eight NIU students, led by NIU’s Vishnu Zutshi, helped build the detector at Fermilab.

    “A project like this requires collaboration across multiple areas of expertise,” said George Coutrakon, medical physicist and co-investigator for the project at NIU. “We’ve built on others’ previous work, and in that sense, the collaboration extends beyond NIU and Fermilab.”

    See the full article here.

    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.

