Tagged: Material Sciences

  • richardmitnick 12:16 pm on July 17, 2019 Permalink | Reply
    Tags: Glass technology, Material Sciences

    From UCLA Newsroom: “UCLA researchers toughen glass using nanoparticles” 

    From UCLA Newsroom

    July 16, 2019
    Matthew Chin

    Process could be useful for applications in manufacturing and architecture.

    An electron microscope image of a new, tougher glass developed at UCLA, showing how nanoparticles (rounded, irregular shapes) deflect a crack and force it to branch out. SciFacturing Lab/UCLA

    UCLA mechanical engineers and materials scientists have developed a process that uses nanoparticles to strengthen the atomic structure of glass. The result is a product that’s at least five times tougher than any glass currently available.

    The process could yield glass that’s useful for industrial applications — in engine components and tools that can withstand high temperatures, for instance — as well as for doors, tables and other architectural and design elements.

    The study was published online in the journal Advanced Materials and will be included in a future print edition. The authors wrote that the same approach could also be used for manufacturing tougher ceramics — for example, in spacecraft components that are better able to withstand extreme heat.

    In materials science, “toughness” measures how much energy a material can absorb — and how much it can deform — without fracturing. While glass and ceramics can be reinforced using external treatments, like chemical coatings, those approaches don’t change the fact that the materials themselves are brittle.
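    As a back-of-the-envelope illustration of that definition (with invented numbers, not data from the study), toughness can be estimated as the area under a stress-strain curve up to fracture:

```python
import numpy as np

def toughness(strain, stress):
    # Trapezoidal area under the stress-strain curve.
    # With stress in MPa and strain dimensionless, the result is in MJ/m^3.
    return float(np.sum(0.5 * (stress[1:] + stress[:-1]) * np.diff(strain)))

# A brittle glass: linear-elastic (modulus ~70 GPa) up to fracture at 1% strain.
strain_glass = np.linspace(0.0, 0.01, 200)
stress_glass = 70e3 * strain_glass            # MPa

# A hypothetical tougher material: yields near 700 MPa, deforms to 5% strain.
strain_tough = np.linspace(0.0, 0.05, 200)
stress_tough = 700.0 * np.tanh(strain_tough / 0.01)

print("glass:", round(toughness(strain_glass, stress_glass), 2), "MJ/m^3")
print("tough:", round(toughness(strain_tough, stress_tough), 2), "MJ/m^3")
```

    The second curve encloses far more area: it absorbs much more energy before fracture, which is exactly what "tougher" means here.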

    To solve that issue, the UCLA researchers took a cue from the atomic structure of metals, which can take a pounding and not break.

    “The chemical bonds that hold glass and ceramics together are pretty rigid, while the bonds in metals allow some flexibility,” said Xiaochun Li, the Raytheon Professor of Manufacturing at the UCLA Samueli School of Engineering, and the study’s principal investigator. “In glass and ceramics, when the impact is strong enough, a fracture will propagate quickly through the material in a mostly straight path.

    “When something impacts a metal, its more deformable chemical bonds act as shock absorbers and its atoms move around while still holding the structure together.”

    The researchers hypothesized that by infusing glass with nanoparticles of silicon carbide, a metal-like ceramic, the resulting material would be able to absorb more energy before it would fail. They added the nanoparticles into molten glass at 3,000 degrees Fahrenheit, which helped ensure that the nanoparticles were evenly dispersed.

    Once the material solidified, the embedded nanoparticles could act as roadblocks to potential fractures. When a fracture does occur, the tiny particles force it to branch out into tiny networks, instead of allowing it to take a straight path. That branching out enables the glass to absorb significantly more energy from a fracture before it causes significant damage.

    Sintering, in which a powder is heated under pressure and then cooled, is the main method used to make glass. It was also the method used in previous experiments by other research groups to disperse nanoparticles in glass or ceramics. But in those experiments, the nanoparticles weren’t spread evenly, and the resulting material had uneven toughness.

    The glass blocks that the UCLA team developed for the experiment were somewhat milky, rather than clear, but Li said the process could be adapted to create clear glass.

    The other authors of the study are Qiang-Guo Jiang, a visiting scholar in Li’s SciFacturing Laboratory; Chezheng Cao and Ting-Chiang Lin, who received their doctorates from UCLA in 2018; and Shanghua Wu, an engineering professor at Guangdong University of Technology, China.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    UCLA Campus

    For nearly 100 years, UCLA has been a pioneer, persevering through impossibility, turning the futile into the attainable.

    We doubt the critics, reject the status quo and see opportunity in dissatisfaction. Our campus, faculty and students are driven by optimism. It is not naïve; it is essential. And it has fueled every accomplishment, allowing us to redefine what’s possible, time after time.

    This can-do perspective has brought us 12 Nobel Prizes, 12 Rhodes Scholarships, more NCAA titles than any other university and more Olympic medals than most nations. Our faculty and alumni helped create the Internet and pioneered reverse osmosis. And more than 100 companies have been created based on technology developed at UCLA.

  • richardmitnick 1:49 pm on July 10, 2019 Permalink | Reply
    Tags: Material Sciences, Computational materials science, DFT (density functional theory), Atomic force microscopy, Kelvin probe force microscopy, Coupled cluster theory

    From Argonne Leadership Computing Facility: “Predicting material properties with quantum Monte Carlo” 

    Argonne Lab
    News from Argonne National Laboratory

    From Argonne Leadership Computing Facility

    July 9, 2019
    Nils Heinonen

    For one of their efforts, the team used diffusion Monte Carlo to compute how doping affects the energetics of nickel oxide. Their simulations revealed the spin density difference between bulks of potassium-doped nickel oxide and pure nickel oxide, showing the effects of substituting a potassium atom (center atom) for a nickel atom on the spin density of the bulk. Credit: Anouar Benali, Olle Heinonen, Joseph A. Insley, and Hyeondeok Shin, Argonne National Laboratory.

    Recent advances in quantum Monte Carlo (QMC) methods have the potential to revolutionize computational materials science, a discipline traditionally driven by density functional theory (DFT). While DFT—an approach that uses quantum-mechanical modeling to examine the electronic structure of complex systems—provides convenience to its practitioners and has unquestionably yielded a great many successes throughout the decades since its formulation, it is not without shortcomings, which have placed a ceiling on the possibilities of materials discovery. QMC is poised to break this ceiling.

    The key challenge is to solve the quantum many-body problem accurately and reliably enough for a given material. QMC solves this problem via stochastic sampling — that is, by using random numbers to sample all possible solutions. The use of stochastic methods allows the full many-body problem to be treated while circumventing large approximations. Compared to traditional methods, they offer extraordinary potential accuracy, strong suitability for high-performance computing, and — with few known sources of systematic error — transparency. For example, QMC satisfies a mathematical principle that allows it to set an upper bound on a given system’s ground-state energy (the lowest-energy, most stable state).
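    A toy example of that variational bound (a minimal sketch for illustration, not QMCPACK or the team's method): a few lines of Python can estimate the ground-state energy of a 1D harmonic oscillator by stochastic sampling.

```python
import numpy as np

# Variational Monte Carlo for a 1D harmonic oscillator (hbar = m = omega = 1,
# exact ground-state energy E0 = 0.5), with trial wavefunction
# psi(x) = exp(-alpha * x^2).
rng = np.random.default_rng(0)

def vmc_energy(alpha, n_samples=200_000):
    # |psi|^2 is a Gaussian with variance 1/(4*alpha), so sample it directly
    # instead of running a Metropolis walk.
    x = rng.normal(0.0, np.sqrt(1.0 / (4.0 * alpha)), n_samples)
    # Local energy for this trial function: E_L(x) = alpha + x^2*(1/2 - 2*alpha^2).
    e_local = alpha + x**2 * (0.5 - 2.0 * alpha**2)
    return e_local.mean()

for alpha in (0.3, 0.5, 0.7):
    print(f"alpha={alpha}: E ~ {vmc_energy(alpha):.4f}")
```

    Every estimate lies at or above the exact ground-state energy of 0.5, and the bound is saturated at the optimal trial parameter alpha = 0.5 — the stochastic estimate never dips below the true answer, which is the "bound" property mentioned above.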

    QMC’s accurate treatment of quantum mechanics is very computationally demanding, necessitating the use of leadership-class computational resources and thus limiting its application. Access to the computing systems at the Argonne Leadership Computing Facility (ALCF) and the Oak Ridge Leadership Computing Facility (OLCF)—U.S. Department of Energy (DOE) Office of Science User Facilities—has enabled a team of researchers led by Paul Kent of Oak Ridge National Laboratory (ORNL) to meet the steep demands posed by QMC. Supported by DOE’s Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, the team’s goal is to simulate promising materials that elude DFT’s investigative and predictive powers.

    To conduct their work, the researchers employ QMCPACK, an open-source QMC code developed by the team. It is written specifically for high-performance computers and runs on all the DOE machines. It has been run at the ALCF since 2011.

    Functional materials

    The team’s efforts are focused on studies of materials combining transition metal elements with oxygen. Many of these transition metal oxides are functional materials that have striking and useful properties. Small perturbations in the make-up or structure of these materials can cause them to switch from metallic to insulating, and greatly change their magnetic properties and ability to host and transport other atoms. Such attributes make the materials useful for technological applications while posing fundamental scientific questions about how these properties arise.

    The computational challenge has been to simulate the materials with sufficient accuracy: the materials’ properties are sensitive to small changes due to complex quantum mechanical interactions, which make them very difficult to model.

    The computational performance and large memory of the ALCF’s Theta system have been particularly helpful to the team. Theta’s storage capacity has enabled studies of material changes caused by small perturbations such as additional elements or vacancies. Over three years the team developed a new technique to more efficiently store the quantum mechanical wavefunctions used by QMC, greatly increasing the range of materials that could be run on Theta.

    ANL ALCF Theta Cray XC40 supercomputer

    Experimental validation

    Kent noted that experimental validation is a key component of the INCITE project. “The team is leveraging facilities located at Argonne and Oak Ridge National Laboratories to grow high-quality thin films of transition-metal oxides,” he said, including vanadium oxide (VO2) and variants of nickel oxide (NiO) that have been modified with other compounds.

    For VO2, the team combined atomic force microscopy, Kelvin probe force microscopy, and time-of-flight secondary ion mass spectroscopy on VO2 grown at ORNL’s Center for Nanophase Materials Science (CNMS) to demonstrate how oxygen vacancies suppress the transition from metallic to insulating VO2. A combination of QMC, dynamical mean field theory, and DFT modeling was deployed to identify the mechanism by which this phenomenon occurs: oxygen vacancies leave positively charged holes that are localized around the vacancy site and end up distorting the structure of certain vanadium orbitals.

    For NiO, the challenge was to understand how a small quantity of dopant atoms, in this case potassium, modifies the structure and optical properties. Molecular beam epitaxy at Argonne’s Materials Science Division was used to create high quality films that were then probed via techniques such as x-ray scattering and x-ray absorption spectroscopy at Argonne’s Advanced Photon Source (APS) [below] for direct comparison with computational results. These experimental results were subsequently compared against computational models employing QMC and DFT. The APS and CNMS are DOE Office of Science User Facilities.

    So far the team has been able to compute, understand, and experimentally validate how the band gap of materials containing a single transition metal element varies with composition. Band gaps determine a material’s usefulness as a semiconductor—a substance that can alternately conduct or cease the flow of electricity (which is important for building electronic sensors or devices). The next steps of the study will be to tackle more complex materials, with additional elements and more subtle magnetic properties. While more challenging, these materials could lead to greater discoveries.

    New chemistry applications

    Many of the features that make QMC attractive for materials also make it attractive for chemistry applications. An outside colleague—quantum chemist Kieron Burke of the University of California, Irvine—provided the impetus for a paper published in Journal of Chemical Theory and Computation. Burke approached the team’s collaborators with a problem he had encountered while trying to formulate a new method for DFT. Moving forward with his attempt required benchmarks against which to test his method’s accuracy. As QMC was the only means by which sufficiently precise benchmarks could be obtained, the team produced a series of calculations for him.

    The reputed gold standard among many-body numerical techniques in quantum chemistry is known as coupled cluster theory. While it is extremely accurate for many molecules, some are so strongly correlated quantum-mechanically that they can be thought of as existing in a superposition of quantum states. The conventional coupled cluster method cannot handle something so complicated. Co-principal investigator Anouar Benali, a computational scientist at the ALCF and Argonne’s Computational Sciences Division, spent some three years collaborating on efforts to expand QMC’s capability to include low-cost, highly efficient support for these states, which will also be needed for materials problems in the future. Performing analysis on the system for which Burke needed benchmarks required this superposition support; he verified the results of his newly developed DFT approach against the calculations generated with Benali’s QMC expansion. They were in close agreement with each other, but not with the results conventional coupled cluster had generated — which, for one particular compound, contained significant errors.

    “This collaboration and its results have therefore identified a potential new area of research for the team and QMC,” Kent said. “That is, tackling challenging quantum chemical problems.”

    The research was supported by DOE’s Office of Science. ALCF and OLCF computing time and resources were allocated through the INCITE program.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    About ALCF
    The Argonne Leadership Computing Facility’s (ALCF) mission is to accelerate major scientific discoveries and engineering breakthroughs for humanity by designing and providing world-leading computing facilities in partnership with the computational science community.

    We help researchers solve some of the world’s largest and most complex problems with our unique combination of supercomputing resources and expertise.

    ALCF projects cover many scientific disciplines, ranging from chemistry and biology to physics and materials science. Examples include modeling and simulation efforts to:

    Discover new materials for batteries
    Predict the impacts of global climate change
    Unravel the origins of the universe
    Develop renewable energy technologies

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

    Argonne Lab Campus

  • richardmitnick 12:13 pm on July 4, 2019 Permalink | Reply
    Tags: "An atomic-scale erector set", "Discretization-" whereby a building is divided into different points, , Kostas Keremidis of the MIT Concrete Sustainability Hub is modeling structures as ensembles of atoms, Material Sciences, , Modeling a building as a collection of points that interact through forces like those found at the atomic scale,   

    From MIT News: “An atomic-scale erector set” 

    MIT News

    From MIT News

    July 3, 2019
    Andrew Logan

    A building modeled with the molecular dynamics-based structural modeling approach. Image courtesy of Kostas Keremidis

    To predict building damage, Kostas Keremidis of the MIT Concrete Sustainability Hub is modeling structures as ensembles of atoms.

    To design buildings that can withstand the largest of storms, Kostas Keremidis, a PhD candidate at the MIT Concrete Sustainability Hub, is using research at the smallest scale — that of the atom.

    His approach, which derives partially from materials science, models a building as a collection of points that interact through forces like those found at the atomic scale.

    “When you look at a building, it is actually a series of connections between columns, windows, doors, and so on,” says Keremidis. “Our new framework looks at how different building components connect together to form a building like atoms form a molecule — similar forces hold them together, both at the atomic and building scale.” The framework is called molecular dynamics-based structural modeling.

    Eventually, Keremidis hopes it will provide developers and builders with a new way to readily predict building damage from disasters like hurricanes and earthquakes.

    Making models

    But before he can predict building damage, Keremidis must first assemble a model.

    He begins by taking a building and dividing its respective elements into nodes, or “atoms.” This is a standard procedure called “discretization,” whereby a building is divided into different points. Then he gives each “atom” different properties according to its material. For example, the weight of each “atom” may depend on if it’s part of a floor, a door, a window, and so on. After modeling them, he defines their bonds.

    The first type of bond between points in a building model is called an axial bond. These describe how elements deform under a load in the direction of their span — in other words, they model how a column shrinks and then rebounds under a load, like a spring.

    The second type of connection is that of the angular bonds, which represent how elements like a beam bend in the lateral direction. Keremidis uses these vertical and lateral interactions to model the deformation and breaking of different building elements. Breaking occurs when these bonds deform too much, just like in real structures.

    To see how one of his buildings will fare under conditions like storms or earthquakes, Keremidis must thoroughly test these assembled atoms and their bonds under numerous simulations.

    “Once I have my model and my building, I then run around 10,000 simulations,” explains Keremidis. “I can assign 10,000 different loads to one element or building, or I can also assign that element 10,000 different properties.”

    For him to assess the results of these simulated conditions or properties, Keremidis returns to the bonds. “When they deform during a simulation, these bonds will try to bring the building back to its original position,” he notes. “But they may also get damaged, too. This is how we model damage — we count how many bonds are destroyed and where.”
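    The node-bond-damage recipe described above can be sketched in a few lines of code (an illustrative toy under invented assumptions, not Keremidis's actual model): discretize a structure into "atoms", connect them with axial bonds, count the bonds whose strain exceeds a breaking limit, and repeat over many random loads.

```python
import numpy as np

# Toy version of the approach: a 1D chain of "atoms" joined by axial bonds
# of rest length 1. Pulling the last node stretches every bond; a bond
# "breaks" when its strain exceeds a limit, and damage is the count of
# broken bonds. All numbers are invented for illustration.
def run_chain(n_nodes=10, break_strain=0.02, pull=0.5):
    rest = np.arange(n_nodes, dtype=float)       # rest positions, spacing 1
    # Static equilibrium of a uniform chain: the imposed displacement
    # distributes evenly, so each bond carries strain pull / (n_nodes - 1).
    pos = rest + pull * rest / (n_nodes - 1)
    strains = np.diff(pos) - 1.0                 # elongation per unit length
    return int(np.sum(np.abs(strains) > break_strain))

# Echoing the "10,000 simulations" workflow: sweep random load levels
# and collect damage statistics.
rng = np.random.default_rng(1)
damage = [run_chain(pull=p) for p in rng.uniform(0.0, 0.5, 10_000)]
print("mean broken bonds:", np.mean(damage))
```

    A real model would add angular bonds for bending and work in three dimensions, but the bookkeeping is the same: damage is wherever, and how often, bonds fail across the ensemble of simulated loads.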

    The damage is in the details

    The model’s innovations actually lie in its damage prediction.

    Traditionally, engineers have used a method called finite element analysis to model building damage. Like MIT’s approach, it also breaks down a building into component parts. But it is generally a time-consuming technique that is set up around the elasticity of elements. This means that it can model only small deformations in a building, rather than large-scale inelastic deformations, like fracture, that frequently occur under hurricane loads.

    An added benefit of his molecular dynamics model is that Keremidis can explore “different materials, different structural properties, and different building geometries” by playing with the layout and nature of atoms and their bonds. This means that molecular dynamics can potentially model any element of a building, and more quickly, too.

    By scaling this approach beyond individual buildings, molecular dynamics could also better inform city, state, and even federal hazard-mitigation efforts.

    For hazard mitigation, cities currently rely on a model by the Federal Emergency Management Agency (FEMA) called HAZUS. It takes historical weather data and a dozen standard building models to predict the damage that a community might experience during a hazard.

    While useful, HAZUS is not ideal. It offers only around a dozen standardized building types and provides qualitative, rather than quantitative, results.

    The MIT model, however, will allow stakeholders to go into finer detail. “With FEMA’s HAZUS, the current level of categorization is too coarse. Instead, we should have 50 or 60 building types,” says Keremidis. “Our model will allow us to collect and model this wider range of building types.”

    Since it measures damage by counting the broken bonds between atoms, a molecular dynamics approach will also more easily quantify the damage that hazards like windstorms or earthquakes can inflict on a community. Such a quantifiable understanding of hazard damage should lead to more accurate estimations of mitigation costs and recovery.

    According to the U.S. Congressional Budget Office, wind storms currently cause $28 billion in damage annually. By 2075, they will cause $38 billion, due to climate change and coastal development.

    With a molecular dynamics approach, developers and government agencies will have one more tool to predict and mitigate these damages.

    The MIT Concrete Sustainability Hub is a team of researchers from several departments across MIT working on concrete and infrastructure science, engineering, and economics. Its research is supported by the Portland Cement Association and the Ready Mixed Concrete Research and Education Foundation.

    Research Brief: Resilience Assessment of Structures Using Molecular Dynamics

    Research Brief: Validation of Molecular Dynamics-Based Structural Damage Models

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

  • richardmitnick 10:42 am on June 15, 2019 Permalink | Reply
    Tags: A tale of two liquids, Material Sciences, When stable becomes unstable

    From SLAC National Accelerator Lab: “A quick liquid flip helps explain how morphing materials store information” 

    From SLAC National Accelerator Lab

    June 14, 2019

    Experiments at SLAC’s X-ray laser reveal in atomic detail how two distinct liquid phases in these materials enable fast switching between glassy and crystalline states that represent 0s and 1s in memory devices.

    In phase-change memory devices, a material switches between glassy and crystalline phases that represent the 0s and 1s used to store information. One pulse of electricity or light heats the material to high temperature, causing it to crystallize, and another pulse melts it into a disordered, glassy state. Experiments at SLAC’s X-ray laser revealed a key part of this switch – a quick transition from one liquid-like state to another – that enables fast and reliable data storage. (Peter Zalden/European XFEL)

    Instead of flash drives, the latest generation of smartphones uses materials that change physical states, or phases, to store and retrieve data faster, in less space and with more energy efficiency. When hit with a pulse of electricity or optical light, these materials switch between glassy and crystalline states that represent the 0s and 1s of the binary code used to store information.

    Now scientists have discovered how those phase changes occur on an atomic level.

    Researchers from European XFEL and the University of Duisburg-Essen in Germany, working in collaboration with researchers at the Department of Energy’s SLAC National Accelerator Laboratory, led X-ray laser experiments at SLAC that collected more than 10,000 snapshots of phase-change materials transforming from a glassy to a crystalline state in real time.

    They discovered that just before the material crystallizes, it changes from one liquid-like state to another, a process that could not be clearly seen in prior studies because it was blurred by the rapid motions of the atoms. And they showed that this transition is responsible for the material’s unique ability to store information for long periods of time while also quickly switching between states.

    The results, published in Science today, offer a new strategy for designing improved phase-change materials for specialized memory storage.

    “Current data storage technology has reached a scaling limit, so that new concepts are required to store the amounts of data that we will produce in the future,” said Peter Zalden, a scientist at European XFEL and lead author of the study. “Our study explains how the switching process in a promising new technology can be fast and reliable at the same time.”

    When stable becomes unstable

    The experiments took place at SLAC’s Linac Coherent Light Source (LCLS), which produces X-ray laser pulses short and intense enough to capture snapshots of atomic changes occurring in femtoseconds – millionths of a billionth of a second.

    To store information with phase-change materials, they must be cooled quickly to enter a glassy state without crystallizing, and remain in this glassy state as long as the information needs to stay there. This means the crystallization process must be very slow to the point of being almost absent, such as is the case in ordinary glass. But when it comes time to erase the information, which is done by applying high temperatures, the same material has to crystallize very quickly. The fact that a material can form a stable glass but then become very unstable at elevated temperatures has puzzled researchers for decades.

    At LCLS, the scientists used an optical laser to rapidly heat amorphous films of phase-change materials, just 50 nanometers thick, atop an equally thin support. The films cooled into a crystalline state as the heat from the laser blast dissipated into the surrounding support structure over billionths of a second.

    They used X-ray laser pulses to make images of the material’s structural evolution, collecting each snapshot in the instant before a sample deteriorated.

    A tale of two liquids

    The researchers found that when the liquid cools far enough below the material’s melting temperature, it undergoes a structural change to form another, lower-temperature liquid that exists for just billionths of a second.

    The two liquids not only have very different atomic structures, but they also behave differently: The one at higher temperature has highly mobile atoms that can quickly arrange themselves into the well-ordered structure of a crystal. But in the lower-temperature liquid, some chemical bonds become stronger and more rigid and can hold the disordered atomic structure of the glass in place. It is only the rigid nature of these chemical bonds that keeps the glass from crystallizing and – in the case of phase-change memory devices – secures information in place. The results also help scientists understand how other classes of materials form a glass.

    The research team after performing experiments at SLAC’s Linac Coherent Light Source X-ray laser. (Klaus Sokolowski-Tinten/University of Duisburg-Essen)

    See the full article here.
    See the XFEL press release here.

    Please help promote STEM in your local schools.

    STEM Education Coalition


    SLAC/LCLS II projected view

    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

  • richardmitnick 9:24 am on June 12, 2019 Permalink | Reply
    Tags: “Engineering Rules: Global Standard Setting Since 1880”, But the products and tools of the time were not necessarily uniform, By the early 20th century engineers ramped up their efforts to make standards international, Industrial standards are voluntary and have the same source: engineers, JoAnne Yates, Many efforts to standardize technologies required firms and business leaders to put aside their short-term interests for a longer-term good, Material Sciences, The latter half of the 1800s was an unprecedented time of industrial expansion, The standardization of screw threads in U.S. machine shops, What goes for 19th-century hardware goes for hundreds of things used in daily life today

    From MIT News: “Engineers set the standards” 

    MIT News

    From MIT News

    June 12, 2019
    Peter Dizikes

    JoAnne Yates and her new book “Engineering Rules: Global Standard Setting Since 1880.” Image: Ed Collier

    MIT business historian’s new book chronicles the emergence of global standardization in technology.

    It might not seem consequential now, but in 1863, Scientific American weighed in on a pressing technological issue: the standardization of screw threads in U.S. machine shops. Given standard-size threads — the ridges running around screws and bolts — screws missing from machinery could be replaced with hardware from any producer. But without a standard, fixing industrial equipment would be harder or even impossible.

    Moreover, Great Britain had begun standardizing the size of screw threads, so why couldn’t the U.S.? After energetic campaigning by a mechanical engineer named William Sellers, both the U.S. Navy and the Pennsylvania Railroad got on board with the idea, greatly helping standardization take hold.

    Why did it matter? The latter half of the 1800s was an unprecedented time of industrial expansion. But the products and tools of the time were not necessarily uniform. Making them compatible served as an accelerant for industrialization. The standardization of screw threads was a signature moment in this process — along with new standards for steam boilers (which had a nasty habit of exploding) and for the steel rails used in train tracks.

    Moreover, what goes for 19th-century hardware goes for hundreds of things used in daily life today. From software languages and batteries to transmission lines, power plants, cement, and more, standardization still helps fuel economic growth.

    “Everything around us is full of standards,” says JoAnne Yates, the Sloan Distinguished Professor of Management at MIT. “None of us could function without standards.”

    But how did this all come about? One might expect government treaties to be essential for global standards to exist. But time and again, Yates notes, industrial standards are voluntary and have the same source: engineers. Or, more precisely, nongovernmental standard-setting bodies dominated by engineers, which work to make technology uniform across borders.

    “On one end of a continuum is government regulation, and on the other are market forces, and in between is an invisible infrastructure of organizations that helps us arrive at voluntary standards without which we couldn’t operate,” Yates says.

    Now Yates is the co-author of a new history that makes the role of engineers in setting standards more visible than ever. The book, Engineering Rules: Global Standard Setting since 1880, is being published this week by Johns Hopkins University Press. It is co-authored by Yates, who teaches in the MIT Sloan School of Management, and Craig N. Murphy, who is the Betty Freyhof Johnson ’44 Professor of International Relations at Wellesley College.

    Joint research project

    As it happens, Murphy is also Yates’ husband — and, for the first time, they have collaborated on a research project.

    “He’s a political scientist and I’m a business historian, but we had said throughout our careers, ‘Some day we should write a book together,’” Yates says. When it crossed their radar as a topic, the evolution of standards “immediately appealed to both of us,” she adds. “From Craig’s point of view, he studies global governance, which also includes nongovernmental institutions like this. I saw it as important because of the way firms play a role in it.”

    As Yates and Murphy see it, there have been three distinct historical “waves” of technological standardization. The first, the late 19th- and early 20th-century industrial phase, was spurred by the professionalization of engineering itself. Those engineers were trying to impose order on a world far less organized than ours: Although the U.S. Constitution gives Congress the power to set standards, a U.S. National Bureau of Standards was not created until 1901, when there were still 25 different basic units of length — such as “rods” — being used in the country.

    Much of this industrial standardization occurred country by country. But by the early 20th century, engineers ramped up their efforts to make standards international — and some, like the British engineer Charles le Maistre, a key figure in the book, were very aspirational about global standards.

    “Technology evangelists, like le Maistre, spread the word about the importance of standardizing and how technical standards should transcend politics and transcend national boundaries,” Yates says, adding that many had a “social movement-like fervor, feeling that they were contributing to the common good. They even thought it would create world peace.”

    It didn’t. Still, the momentum for standards created by Le Maistre carried into the post-World War II era, the second wave detailed in the book. This new phase, Yates notes, is exemplified by the creation of the standardized shipping container, which made worldwide commerce vastly easier in terms of logistics and efficiency.

    “This second wave was all about integrating the global market,” Yates says.

    The third and most recent wave of standardization, as Yates and Murphy see it, is centered on information technology — where engineers have once again toiled, often with a sense of greater purpose, to develop global standards.

    To some degree this is an MIT story; Tim Berners-Lee, inventor of the World Wide Web, moved to MIT to establish a global standards consortium for the web, W3C, which was founded in 1994 with the Institute’s backing. More broadly, Yates and Murphy note, the era is marked by efforts to speed up the process of standard-setting, “to respond to a more rapid pace of technological change” in the world.

    Setting a historical standard

    Intriguingly, as Yates and Murphy document, many efforts to standardize technologies required firms and business leaders to put aside their short-term interests for a longer-term good — whether for a business, an industry, or society generally.

    “You can’t explain the standards world entirely by economics,” Yates says. “And you can’t explain the standards world entirely by power.”

    Other scholars regard the book as a significant contribution to the history of business and globalization. Yates and Murphy “demonstrate the crucial impact of private and informal standard setting on our daily lives,” according to Thomas G. Weiss, a professor of international relations and global governance at the Graduate Center of the City University of New York. Weiss calls the book “essential reading for anyone wishing to understand the major changes in the global economy.”

    For her part, Yates says she hopes readers will, among other things, reflect on the idealism and energy of the engineers who regarded international standards as a higher cause.

    “It is a story about engineers thinking they could contribute something good for the world, and then putting the necessary organizations into place,” Yates notes. “Standardization didn’t create world peace, but it has been good for the world.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

  • richardmitnick 8:58 am on May 29, 2019 Permalink | Reply
    Tags: "Designing a new solution to our waste crisis", , Material Sciences,   

    From University of New South Wales: “Designing a new solution to our waste crisis” 

    U NSW bloc

    From University of New South Wales

    29 May 2019
    Veena Sahajwalla

    Creating new materials from waste products is essential if we’re to solve the global recycling, waste and emissions crisis.

    Veena Sahajwalla

    If they don’t know it already, designers of all types will soon be at the forefront of a new recycling ethos in Australia and around the world.

    For too long products of all kinds have been designed without consideration of the environmental consequences of their disposal.

    The burden of what to do with all of the unwanted items in our households has fallen to consumers and local councils in the ‘down-stream’ part of the life cycle of products via bin collections and waste sorting.

    Many of these waste materials end up in landfill, where they generate damaging greenhouse gases. If the world keeps doing this, the waste crisis Australia has experienced since China last year, and now India this year, banned the importation of international waste will become critical.

    The reality is that much of the waste that ends up in landfill is actually a renewable resource. This has been proved in our labs at the Sustainable Materials Research and Technology Centre at UNSW Sydney through our microrecycling science and with our prototype green microfactory technology.

    For instance, we are producing building panels from old clothing and textiles, as well as from coffee grounds and cups, and even from glass and sawdust. We are also extracting the valuable metal alloys contained in electronic waste such as printers, computers and mobile phones, and from the plastics we can produce high-quality filament for 3D printing.

    And we need to do this if we want to achieve a ‘circular economy’, which minimises waste by ensuring that the valuable resources contained in waste and discarded products are kept in use for as long as possible. For instance, metals can be reformed ad infinitum, while glass and even plastics can also be reformed and re-used many times depending on quality.

    But it is not just the designers of products in the so-called ‘up-stream’ part of our marketplaces; the producers and manufacturers of products and services in the ‘mid-stream’ must also play a key role in creating a true circular economy.

    A key problem is that there is little commercial appetite to ensure we divert from landfill the waste that can be reformed into new, value-added materials, products and manufacturing feedstock.

    To that end, in 2019 the NSW Government announced, via its Office of the Chief Scientist and Engineer, funding for UNSW Sydney to establish the Circular Economy Innovation Network, to which I’ve been appointed Director.

    There are so many stakeholders across all supply chains that the challenge is to work together to find opportunities for changes that not only reduce waste but ensure it can be valued and used, over and over, as a renewable resource in a circular economy.

    If designers and producers of products, packaging and applicable services accounted for, from the very beginning of the product lifecycle, how all of the materials in their products can become part of the circular economy instead of ending up in landfill, then we may have a positive impact on addressing the world’s growing waste problem.

    For example, using a modular design means that if a part of a product breaks, a replacement component could be 3D-printed from filament made from recovered quality plastic, so the whole product is not thrown in the bin. This reduces waste, the need to mine finite resources, and the associated environmental impacts and costs of transportation and processing.

    Some designers and producers are now making products from waste resources that would otherwise have gone to the tip and produced greenhouse gases; high-end furniture is one example. In our labs, we work with various industry partners, one of which is Dresden, which makes prescription glasses and has a mission to do so sustainably using recycled and recyclable plastics. Our researchers are helping them by testing the viability of using plastics from things like discarded fishing netting, plastic bags and plastic lids.

    The new Circular Economy Innovation Network will bring together key stakeholders and case studies to accelerate partnerships and opportunities to build the circular economy not just in NSW but across Australia to address the waste and recycling issue, while enhancing manufacturing and industry capability to create new jobs.

    Workshops and seminars, along with identifying market opportunities and new partnerships with researchers, industry and governments, will be some of the key activities, and I am excited to be leading this new initiative.

    It’s a big challenge to create a Network like this to bring together all of the touchpoints along business supply chains to help build a true circular economy, but we must act now for the future.

    Let me give you some stark statistics and facts.

    The clothing and textiles industry is the second most polluting sector in the world, accounting for 10% of total global carbon emissions. Clothing is now one of the biggest consumer waste streams, with an estimated 92 million tons thrown out each year, so we must urgently and seriously consider new ways to deal with unwanted clothes.

    Much of the material collected from kerbside recycling bins has been going to developing nations, and in Australia that gets ticked off as ‘recycled’, but much of it ends up in landfill or is burnt. And due to the Chinese and Indian waste importation bans, recyclable materials around Australia are being stockpiled or sent to local landfill. Other Asian countries are also getting sick of being Australia’s, and the world’s, dumping grounds.

    UNSW’s own research shows 65.4% of people believe recyclables put into council bins go to landfill (69.5% of females, 51.4% of those aged 18-34, 75.1% of those aged 65-plus); 49% of people believe green and ecofriendly efforts will not have an effect in their lifetime; 63.8% of those aged 65-plus see no benefits being realised; and 72.4% of people would recycle more if the material was reliably recycled.

    So, with the world population expected to keep growing in the coming decades, from 7.6 billion today to approximately 9.8 billion by 2050, our resources, globally and at home, need to be preserved and re-used.

    Smart design and production in a new ‘circular economy’ can make a big difference.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U NSW Campus

    Welcome to UNSW Australia (The University of New South Wales), one of Australia’s leading research and teaching universities. At UNSW, we take pride in the broad range and high quality of our teaching programs. Our teaching gains strength and currency from our research activities, strong industry links and our international nature; UNSW has a strong regional and global engagement.

    In developing new ideas and promoting lasting knowledge we are creating an academic environment where outstanding students and scholars from around the world can be inspired to excel in their programs of study and research. Partnerships with both local and global communities allow UNSW to share knowledge, debate and research outcomes. UNSW’s public events include concert performances, open days and public forums on issues such as the environment, healthcare and global politics. We encourage you to explore the UNSW website so you can find out more about what we do.

  • richardmitnick 11:06 am on May 27, 2019 Permalink | Reply
    Tags: "'Submarines' small enough to deliver medicine inside human body", , Dr Liang: each capsule of medicine could contain millions of micro-submarines and within each micro-submarine would be millions of drug molecules., Material Sciences, , Micro-submarines powered by nano-motors, , This is significant not just for medical applications but for micro-motors generally.,   

    From University of New South Wales: “‘Submarines’ small enough to deliver medicine inside human body” 

    U NSW bloc

    From University of New South Wales

    27 May 2019
    Lachlan Gilbert

    UNSW engineers have shown that micro-submarines powered by nano-motors could navigate the human body to provide targeted drug delivery to diseased organs without the need for external stimulus.

    An artist’s representation of ‘micro-submarines’ transporting their medical cargo through capillaries among red blood cells. Picture: UNSW.

    Cancers in the human body may one day be treated by tiny, self-propelled ‘micro-submarines’ delivering medicine to affected organs after UNSW Sydney chemical and biomedical engineers proved it was possible.

    In a paper published in Materials Today, the engineers explain how they developed micrometre-sized submarines that exploit biological environments to tune their buoyancy, enabling them to carry drugs to specific locations in the body.

    Corresponding author Dr Kang Liang, with both the School of Biomedical Engineering and School of Chemical Engineering at UNSW, says the knowledge can be used to design next generation ‘micro-motors’ or nano-drug delivery vehicles, by applying novel driving forces to reach specific targets in the body.

    “We already know that micro-motors use different external driving forces – such as light, heat or magnetic field – to actively navigate to a specific location,” Dr Liang says.

    “In this research, we designed micro-motors that no longer rely on external manipulation to navigate to a specific location. Instead, they take advantage of variations in biological environments to automatically navigate themselves.”

    What makes these micro-sized particles unique is that they respond to changes in biological pH environments to self-adjust their buoyancy. In the same way that submarines flood their ballast tanks with water or fill them with air to become less or more buoyant, gas bubbles released or retained by the micro-motors, depending on the pH conditions in human cells, move these nanoparticles up or down.
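    The buoyancy principle described above can be sketched with a toy model (illustrative only, not from the paper; all masses and volumes are hypothetical round numbers): a particle rises when its average density, including any attached gas bubble, drops below the density of the surrounding fluid, because the bubble adds volume while contributing negligible mass.

    ```python
    # Toy buoyancy model: a particle sinks or rises depending on whether its
    # effective density (mass over total volume, bubble included) exceeds the
    # fluid's density. Values below are illustrative, not from the study.

    FLUID_DENSITY = 1000.0   # kg/m^3, roughly water or gastrointestinal fluid

    def effective_density(particle_mass, particle_volume, bubble_volume):
        # An attached gas bubble adds volume but almost no mass,
        # lowering the particle's average density.
        return particle_mass / (particle_volume + bubble_volume)

    # Hypothetical ~1-micron particle, slightly denser than the fluid
    mass = 1.2e-15           # kg
    volume = 1.0e-18         # m^3  -> bare density 1200 kg/m^3: it sinks

    assert effective_density(mass, volume, 0.0) > FLUID_DENSITY      # sinks
    # A pH-triggered bubble of half the particle's volume makes it buoyant
    assert effective_density(mass, volume, 0.5e-18) < FLUID_DENSITY  # rises
    ```

    In this sketch, retaining or releasing the bubble plays the same role as flooding or blowing a submarine's ballast tanks.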

    This is significant not just for medical applications, but for micro-motors generally.

    “Most micro-motors travel in a 2-dimensional fashion,” Dr Liang says.

    “But in this work, we designed a vertical direction mechanism. We combined these two concepts to come up with a design of autonomous micro-motors that move in a 3D fashion. This will enable their ultimate use as smart drug delivery vehicles in the future.”

    Dr Liang illustrates a possible scenario where drugs are taken orally to treat a cancer in the stomach or intestines. To give an idea of scale, he says each capsule of medicine could contain millions of micro-submarines, and within each micro-submarine would be millions of drug molecules.

    “Imagine you swallow a capsule to target a cancer in the gastrointestinal tract,” he says.

    “Once in the gastrointestinal fluid, the micro-submarines carrying the medicine could be released. Within the fluid, they could travel to the upper or bottom region depending on the orientation of the patient.

    “The drug-loaded particles can then be internalised by the cells at the site of the cancer. Once inside the cells, they will be degraded causing the release of the drugs to fight the cancer in a very targeted and efficient way.”

    For the micro-submarines to find their target, a patient would need to be oriented in such a way that the cancer or ailment being treated is either up or down – in other words, a patient would be either upright or lying down.

    Dr Liang says the so-called micro-submarines are essentially composite metal-organic framework (MOF)-based micro-motor systems containing a bioactive enzyme (catalase, CAT) as the engine for gas bubble generation. He stresses that his and his colleagues’ research is at the proof-of-concept stage, with years of testing needed before this could become a reality.

    Dr Liang says the research team – comprised of engineers from UNSW, University of Queensland, Stanford University and University of Cambridge – will also be looking outside of medical applications for these new multi-directional nano-motors.

    “We are planning to apply this new finding to other types of nanoparticles to prove the versatility of this technique,” he says.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U NSW Campus

    Welcome to UNSW Australia (The University of New South Wales), one of Australia’s leading research and teaching universities. At UNSW, we take pride in the broad range and high quality of our teaching programs. Our teaching gains strength and currency from our research activities, strong industry links and our international nature; UNSW has a strong regional and global engagement.

    In developing new ideas and promoting lasting knowledge we are creating an academic environment where outstanding students and scholars from around the world can be inspired to excel in their programs of study and research. Partnerships with both local and global communities allow UNSW to share knowledge, debate and research outcomes. UNSW’s public events include concert performances, open days and public forums on issues such as the environment, healthcare and global politics. We encourage you to explore the UNSW website so you can find out more about what we do.

  • richardmitnick 8:59 am on May 15, 2019 Permalink | Reply
    Tags: "‘Impossible’ nano-sized protein cages made with the help of gold", , Artificial protein cages, Geometry problem: the wrong shape, Material Sciences, , , The building block of a protein cage is an 11-sided shape,   

    From University of Oxford: “‘Impossible’ nano-sized protein cages made with the help of gold” 

    U Oxford bloc

    From University of Oxford

    15 May 2019


    A collaborative effort between the University of Oxford and the Malopolska Centre of Biotechnology, Jagiellonian University in Poland, has produced a super-stable artificial protein ball that apparently defies the rules of geometry and which may have applications in materials science and medicine.

    Researchers are interested in making artificial protein cages in the hope that they can design them to have useful properties not found in nature. There are two challenges to achieving this goal. The first is the geometry problem: some proteins may have great potential utility but have the wrong shape to assemble into cages. The second problem is complexity: in nature the many proteins that form a protein cage are held together by a complex network of chemical bonds and these are very difficult to predict and simulate.

    In new work, published in Nature, researchers found a way to solve both of these problems.

    Professor Heddle, senior author of the research, said: ‘We were able to replace the complex interactions between proteins with a simple ‘staple’ consisting of a single gold atom. This simplifies the design problem and allows us to imbue the cages with new properties such as assembly and disassembly on demand.’

    The research has also found a way around the geometry problem: the building block of a protein cage is an 11-sided shape, which in theory cannot form the faces of a regular convex polyhedron. However, the researchers found that while this is mathematically true, some so-called ‘impossible’ shapes can assemble into cages that are so close to being regular that the errors are not noticeable.
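    The geometric obstacle here can be checked with a standard argument (a textbook sanity check, not taken from the paper): a convex polyhedron needs at least three faces meeting at every vertex, with their angles summing to strictly less than 360 degrees, and the interior angle of a regular 11-gon is too wide for that.

    ```python
    def interior_angle(n):
        # Interior angle of a regular n-gon, in degrees
        return 180.0 * (n - 2) / n

    # A convex polyhedron needs at least 3 faces at each vertex, with the
    # face angles there summing to strictly less than 360 degrees.
    angle = interior_angle(11)        # ~147.27 degrees
    assert 3 * angle > 360.0          # regular 11-gon faces cannot meet

    # By contrast, regular pentagons do work (the dodecahedron): 3 * 108 < 360
    assert 3 * interior_angle(5) < 360.0
    ```

    This is why the near-regular cages reported in the study are surprising: the assembly tolerates tiny deviations from regularity that the idealized geometry forbids.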

    Central to the study was the ability to characterise the different cages, as well as the ability to monitor, and thereby understand, the (dis)assembly dynamically. This work was done in the groups of Professors Justin Benesch and Philipp Kukura at Oxford, using innovative mass measurement approaches with a particular focus on biomolecular structure and assembly.

    Justin Benesch, in the Department of Chemistry, said: ‘The ability to interrogate the cages using the advanced mass measurement approaches we have developed here in Oxford, both on the level of their assembly and the constituent building block, was key to not just validating their structure, but also the mechanism by which they are formed.’

    The potential implications of the work are far-reaching. The researchers hope that the work can be expanded further to produce cages with new structures and new capabilities with potential applications particularly in drug delivery.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Oxford campus

    Oxford is a collegiate university, consisting of the central University and colleges. The central University is composed of academic departments and research centres, administrative departments, libraries and museums. The 38 colleges are self-governing and financially independent institutions, which are related to the central University in a federal system. There are also six permanent private halls, which were founded by different Christian denominations and which still retain their Christian character.

    The different roles of the colleges and the University have evolved over time.

  • richardmitnick 10:27 am on May 7, 2019 Permalink | Reply
    Tags: , , , Material Sciences, , , , , TRC- Translational Research Capability   

    From Oak Ridge National Laboratory: “New research facility will serve ORNL’s growing mission in computing, materials R&D” 


    From Oak Ridge National Laboratory

    May 7, 2019
    Bill H Cabage

    Pictured in this early conceptual drawing, the Translational Research Capability planned for Oak Ridge National Laboratory will follow the design of research facilities constructed during the laboratory’s modernization campaign.

    Energy Secretary Rick Perry, Congressman Chuck Fleischmann and lab officials today broke ground on a multipurpose research facility that will provide state-of-the-art laboratory space for expanding scientific activities at the Department of Energy’s Oak Ridge National Laboratory.

    The new Translational Research Capability, or TRC, will be purpose-built for world-leading research in computing and materials science and will serve to advance the science and engineering of quantum information.

    “Through today’s groundbreaking, we’re writing a new chapter in research at the Translational Research Capability Facility,” said U.S. Secretary of Energy Rick Perry. “This building will be the home for advances in Quantum Information Science, battery and energy storage, materials science, and many more. It will also be a place for our scientists, researchers, engineers, and innovators to take on big challenges and deliver transformative solutions.”

    With an estimated total project cost of $95 million, the TRC, located in the central ORNL campus, will accommodate sensitive equipment, multipurpose labs, heavy equipment and inert environment labs. Approximately 75 percent of the facility will contain large, modularly planned and open laboratory areas with the rest as office and support spaces.

    “This research and development space will advance and support the multidisciplinary mission needs of the nation’s advanced computing, materials research, fusion science and physics programs,” ORNL Director Thomas Zacharia said. “The new building represents a renaissance in the way we carry out research, allowing more flexible alignment of our research activities to the needs of frontier research.”

    The flexible space will support the lab’s growing fundamental materials research to advance future quantum information science and computing systems. The modern facility will provide atomic fabrication and materials characterization capabilities to accelerate the development of novel quantum computing devices. Researchers will also use the facility to pursue advances in quantum modeling and simulation, leveraging a co-design approach to develop algorithms along with prototype quantum systems.

    The new laboratories will provide noise isolation, electromagnetic shielding and low vibration environments required for multidisciplinary research in quantum information science as well as materials development and performance testing for fusion energy applications. The co-location of the flexible, modular spaces will enhance collaboration among projects.

    At approximately 100,000 square feet, the TRC will be similar in size and appearance to another modern ORNL research facility, the Chemical and Materials Sciences Building, which was completed in 2011 and is located nearby.

    The facility’s design and location will also conform to sustainable building practices with an eye toward encouraging collaboration among researchers. The TRC will be centrally located in the ORNL main campus area on a brownfield tract that was formerly occupied by one of the laboratory’s earliest, Manhattan Project-era structures.

    ORNL began a modernization campaign shortly after UT-Battelle arrived in 2000 to manage the national laboratory. The new construction has enabled the laboratory to meet growing space and infrastructure requirements for rapidly advancing fields such as scientific computing while vacating legacy spaces with inherent high operating costs, inflexible infrastructure and legacy waste issues.

    The construction is supported by the Science Laboratory Infrastructure program of the DOE Office of Science.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


  • richardmitnick 2:29 pm on April 26, 2019 Permalink | Reply
    Tags: "New Lens System for Brighter Sharper Diffraction Images", "The team used a photocathode gun that generates the electrons through a process called photoemission”, , “We made the sample by depositing the gold atoms on a several nanometer thick carbon film using a technique called thermal evaporation”, , Brookhaven’s Accelerator Test Facility, , Electron beam-related research techniques, Material Sciences, , , The researchers used two groups of four quadrupole magnets to tune the electron beam., Ultra-fast electron diffraction imaging   

    From Brookhaven National Lab: “New Lens System for Brighter, Sharper Diffraction Images” 

    From Brookhaven National Lab

    April 25, 2019

    Cara Laasch
    (631) 344-8458

    Peter Genzer
    (631) 344-3174

    Researchers from Brookhaven Lab designed, implemented, and applied a new and improved focusing system for electron diffraction measurements.

    Mikhail Fedurin, Timur Shaftan, Victor Smalyuk, Xi Yang, Junjie Li, Lewis Doom, Lihua Yu, and Yimei Zhu are the Brookhaven team of scientists that realized and demonstrated the new lens system for ultra-fast electron diffraction imaging.

    To design and improve energy storage materials, smart devices, and many more technologies, researchers need to understand their hidden structure and chemistry. Advanced research techniques, such as ultra-fast electron diffraction imaging, can reveal that information. Now, a group of researchers from the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory has developed a new and improved version of electron diffraction at Brookhaven’s Accelerator Test Facility (ATF), a DOE Office of Science User Facility that offers advanced and unique experimental instrumentation for studying particle acceleration to researchers from all around the world. The researchers published their findings in Scientific Reports, an open-access journal from Nature Research.

    Advancing a research technique such as ultra-fast electron diffraction will help future generations of materials scientists to investigate materials and chemical reactions with new precision. Many interesting changes in materials happen extremely quickly and in small spaces, so improved research techniques are necessary to study them for future applications. This new and improved version of electron diffraction offers a stepping stone for improving various electron beam-related research techniques and existing instrumentation.

    “We implemented our new focusing system for electron beams and demonstrated that we can improve the resolution significantly when compared to the conventional solenoid technique,” said Xi Yang, author of the study and an accelerator physicist at the National Synchrotron Light Source II (NSLS-II) [see below], a DOE Office of Science User Facility at Brookhaven Lab. “The resolution mainly depends on the properties of light – or in our case – of the electron beam. This is universal for all imaging techniques, including light microscopy and x-ray imaging. However, it is much more challenging to focus the charged electrons to a near-parallel pencil-like beam at the sample than it would be with light, because electrons are negatively charged and therefore repulse one another. This is called the space charge effect. By using our new setup, we were able to overcome the space charge effect and obtain diffraction data that is three times brighter and two times sharper; it’s a major leap in resolution.”

    The colorful images are four different electron diffraction measurements at ATF. The left column shows diffraction patterns of the sample using the newly developed quadrupoles, while the right column shows diffraction patterns without the new lens system. In the left column the rings of the pattern are sharper, rounder and turn red, which means that the overall resolution of the measurement is higher.

    Every electron diffraction setup uses an electron beam that is focused on the sample so that the electrons bounce off the atoms in the sample and travel further to the detector behind the sample. The electrons create a so-called diffraction pattern, which can be translated into the structural makeup of the materials at the nanoscale. The advantage of using electrons to image this inner structure of materials is that the so called diffraction limit of electrons is very low, which means scientists can resolve smaller details in the structure compared to other diffraction methods.
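    The low diffraction limit mentioned above follows from the electron's de Broglie wavelength. As a rough illustration (standard physics, not a calculation from the paper; the 100 kV accelerating voltage is just an example figure), the relativistic wavelength of an accelerated electron is a few picometres, far below typical atomic spacings of around 100 pm:

    ```python
    import math

    # Relativistic de Broglie wavelength of an electron accelerated through a
    # potential V: lambda = h / sqrt(2*m*e*V * (1 + e*V / (2*m*c^2)))
    h = 6.62607015e-34       # Planck constant, J*s
    m = 9.1093837015e-31     # electron rest mass, kg
    e = 1.602176634e-19      # elementary charge, C
    c = 2.99792458e8         # speed of light, m/s

    def electron_wavelength(volts):
        E = e * volts                        # kinetic energy in joules
        return h / math.sqrt(2 * m * E * (1 + E / (2 * m * c * c)))

    # At 100 kV (an illustrative voltage) the wavelength is ~3.7 picometres,
    # orders of magnitude below atomic spacings (~100 pm), hence the very
    # small diffraction limit of electron-based imaging.
    print(electron_wavelength(100e3))
    ```

    Shorter wavelength means finer resolvable detail, which is the core advantage of electrons over visible light for probing atomic-scale structure.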

    A diverse team of researchers was needed to improve such a complex research method. The Brookhaven Lab team consisted of electron beam experts from the NSLS-II, electron accelerator experts from ATF, and materials science experts from the condensed matter physics & materials science (CMPMS) department.

    “This advance would not have been possible without the combination of all our expertise across Brookhaven Lab. At NSLS-II, we have expertise on how to handle the electron beam. The ATF group brought the expertise and capabilities of the electron gun and laser technologies – both of which were needed to create the electron beam in the first place. And the CMPMS group has the sample expertise and, of course, drives the application needs. This is a unique synergy and, together, we were able to show how the resolution of the technique can be improved drastically,” said Li Hua Yu, NSLS-II senior accelerator physicist and co-author of the study.

    To achieve its improved resolution, the team developed a different method of focusing the electron beam. Instead of using a conventional approach that involves solenoid magnets, the researchers used two groups of four quadrupole magnets to tune the electron beam. Compared to solenoid magnets, which act as just one lens to shape the beam, the quadrupole magnets work like a specialized lens system for the electrons, and they gave the scientists far more flexibility to tune and shape the beam according to the needs of their experiment.
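The "lens system" analogy can be made concrete with thin-lens transfer matrices: a single quadrupole focuses in one transverse plane while defocusing in the other, but a pair separated by a drift yields net focusing in both planes. A minimal sketch with made-up focal length and spacing, not the actual ATF optics:

```python
import numpy as np

def thin_quad(f):
    """2x2 transfer matrix for a thin quadrupole of focal length f (meters)."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

def drift(L):
    """2x2 transfer matrix for a field-free drift of length L (meters)."""
    return np.array([[1.0, L], [0.0, 1.0]])

f, L = 0.5, 0.3   # illustrative focal length and spacing, not ATF values

# Horizontal plane: first quad focuses (+f), second defocuses (-f);
# matrices multiply right-to-left in beam order.
mx = thin_quad(-f) @ drift(L) @ thin_quad(f)
# Vertical plane: the quadrupole signs are reversed.
my = thin_quad(f) @ drift(L) @ thin_quad(-f)

# Effective focal strength of the doublet: 1/F = -m21, which works out to
# L / f^2 > 0 in both planes, i.e. net focusing everywhere.
for name, m in (("horizontal", mx), ("vertical", my)):
    print(f"{name}: 1/F = {-m[1, 0]:.4f} 1/m")
```

Because each quadrupole's strength can be set independently, such a system offers the tuning flexibility the researchers describe, unlike a solenoid, which acts as a single fixed-symmetry lens.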

    “Our lens system can provide a wide range of tunability of the beam. We can optimize the most important parameters such as beam size, or charge density, and beam divergence based on the experimental conditions, and therefore provide the best beam quality for the scientific needs,” said Yang.

    The team can even adjust the parameters on the fly with online optimization tools and correct any nonuniformities of the beam shape. To make these measurements possible, however, the team needed the excellent electron beam that ATF provides: its electron gun generates an extremely bright and ultrashort electron beam, which offers the best conditions for electron diffraction.

    “The team used a photocathode gun that generates the electrons through a process called photoemission,” said Mikhail Fedurin, an accelerator physicist at ATF. “We shoot an ultrashort laser pulse into a copper cathode, and when the pulse hits the cathode a cloud of electrons forms over the copper. We pull the electrons away using an electric field and then accelerate them. The amount of electrons in one of these pulses and our capability to accelerate them to specific energies make our system attractive for material science research – particularly for ultrafast electron diffraction.”
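Photoemission follows Einstein's photoelectric relation: an electron escapes the cathode only if the photon energy h·ν exceeds the metal's work function. A quick sketch; the 266 nm drive-laser wavelength (a common frequency-quadrupled UV choice) and the copper work function value are illustrative assumptions, not details from the article:

```python
HC_EV_NM = 1239.84198   # h*c in eV*nm
PHI_COPPER_EV = 4.65    # approximate work function of copper, eV (assumed)

def excess_energy_ev(wavelength_nm, work_function_ev=PHI_COPPER_EV):
    """Max kinetic energy (eV) of a photoemitted electron; negative means no emission."""
    return HC_EV_NM / wavelength_nm - work_function_ev

for lam in (532, 266):  # green vs. UV drive laser, illustrative wavelengths
    e = excess_energy_ev(lam)
    verdict = "emits" if e > 0 else "no photoemission"
    print(f"{lam} nm photon: excess {e:+.2f} eV -> {verdict}")
```

The margin at 266 nm is only a few hundredths of an eV, which is one reason photoinjectors with copper cathodes are driven with deep-UV pulses.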

    The focusing system, combined with the ATF electron beam, is so sensitive that the researchers can measure the influence of Earth’s magnetic field on the electron beam.

    “In general, electrons are always influenced by magnetic fields—this is how we steer them in particle accelerators in the first place; however, the effect of Earth’s magnetic field is not negligible for the low-energy beam we used in this experiment,” said Victor Smalyuk, NSLS-II accelerator physics group leader and co-author of the study. “The beam deviated from the desired trajectory, which created difficulties during the initial starting phase, so we had to correct for this effect.”
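The size of that effect is easy to estimate: an electron of momentum p bends with radius r = p/(eB), giving a transverse offset of roughly L²/(2r) over a drift of length L. With illustrative numbers (a 3 MeV beam, 3 m of drift, and a 50 µT fully transverse field, all assumptions rather than ATF parameters), the deflection comes out at the millimeter scale, which indeed must be corrected:

```python
import math

ME_C2_EV = 0.51099895e6     # electron rest energy, eV
E_CHARGE = 1.602176634e-19  # elementary charge, C
C = 2.99792458e8            # speed of light, m/s

def deflection_m(kinetic_ev, drift_m, b_tesla):
    """Small-angle transverse deflection, x ~ L^2 / (2 r) with r = p / (e B)."""
    # Relativistic momentum from pc = sqrt(T^2 + 2*T*m*c^2)
    pc_ev = math.sqrt(kinetic_ev**2 + 2.0 * kinetic_ev * ME_C2_EV)
    p = pc_ev * E_CHARGE / C          # momentum in SI units, kg*m/s
    r = p / (E_CHARGE * b_tesla)      # bending radius, m
    return drift_m**2 / (2.0 * r)

# Illustrative assumptions: 3 MeV kinetic energy, 3 m drift, 50 uT transverse field.
x = deflection_m(3e6, 3.0, 50e-6)
print(f"deflection over 3 m: {x * 1e3:.1f} mm")
```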

    Beyond the high brightness of the electron beam and the high precision of the focusing system, the team also needed the right sample to make these measurements. The CMPMS group provided the team with a polycrystalline gold film to fully explore the newly designed lens system and to put it to the test.

    “We made the sample by depositing the gold atoms on a several nanometer thick carbon film using a technique called thermal evaporation,” said Junjie Li, a physicist in the CMPMS department. “We evaporated gold particles so that they condense on the carbon film and form tiny, isolated nanoparticles that slowly merge together and form the polycrystalline film.”

    This film was essential for the measurements because its crystals are randomly oriented where they merge together. The inner structure of the sample is therefore not uniform but consists of many differently oriented regions, which means that the quality of the diffraction pattern depends mainly on the qualities of the electron beam. This gave the scientists the ideal testbed for their lens system: they could tune the beam and see the impact of that tuning directly in the quality of the diffraction measurement.
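The ring pattern from such a polycrystalline film follows directly from Bragg's law, λ = 2d·sin(θ): randomly oriented grains satisfy the condition at fixed scattering angles, producing one ring per allowed set of lattice planes. A sketch for fcc gold; the electron wavelength (roughly a 100 keV beam) and the detector distance are illustrative assumptions:

```python
import math

A_GOLD_NM = 0.40782   # fcc lattice constant of gold, nm
LAMBDA_NM = 0.0037    # electron wavelength, nm (assumed ~100 keV beam)
DETECTOR_M = 1.0      # sample-to-detector distance, m (assumed)

def ring_radius_mm(h, k, l):
    """Detector-plane radius of the Debye-Scherrer ring for reflection (hkl)."""
    d = A_GOLD_NM / math.sqrt(h*h + k*k + l*l)       # lattice plane spacing, nm
    theta = math.asin(LAMBDA_NM / (2.0 * d))         # Bragg angle, rad
    return DETECTOR_M * math.tan(2.0 * theta) * 1e3  # ring radius, mm

# First few allowed fcc reflections (Miller indices all even or all odd).
for hkl in ((1, 1, 1), (2, 0, 0), (2, 2, 0), (3, 1, 1)):
    print(f"{hkl}: ring radius {ring_radius_mm(*hkl):.1f} mm")
```

Sharper, rounder rings at these predicted radii are exactly the signature of a better-collimated, more uniform incident beam, which is what the left column of the comparison images shows.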

    “We initially set out to improve electron diffraction for scientific studies of materials, but we also found that this technique can help us characterize our electron beam. In fact, diffraction is very sensitive to the electron beam parameters, so we can use the diffraction pattern of a known sample to measure our beam parameters precisely and directly, which is usually not that easy,” said Yang.

    The team intends to pursue further improvements and already has plans to develop another setup for ultrafast electron microscopy to directly visualize biological samples.

    “We hope to achieve ultrafast single-shot electron beam imaging at some point and maybe even make molecular movies, which isn’t possible with our current electron beam imaging setup,” said Yang.

    This research was supported by Laboratory Directed Research and Development funding and by DOE’s Office of Science through its support of the ATF.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    BNL Campus

    BNL Center for Functional Nanomaterials



    BNL RHIC Campus

    BNL/RHIC Star Detector


    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.
