Tagged: Chemistry

  • richardmitnick 11:21 am on March 23, 2019
    Tags: "What Was It Like When Life Began On Earth?", Chemistry, The death of the Martian magnetic field caused its atmosphere to be stripped away rendering it solid and frozen, The planet Earth has had life on it in some form or another for nearly as long as it has existed, While Venus and Mars may have had similar chances radical changes to Venus’ atmosphere rendered it a searing hothouse world after just 200–300 million years

    From Ethan Siegel: “What Was It Like When Life Began On Earth?” 

    Mar 20, 2019

    A planet that is a candidate for being inhabited will no doubt experience catastrophes, collisions, and extinction-level events on it. If life is to survive and thrive on a world, it must possess the right intrinsic and environmental conditions to allow it to persist. Here, an illustration of early Earth’s environment may look fearsome, but life somehow still found a way. (NASA GODDARD SPACE FLIGHT CENTER)

    Earth has had life on it, in some form or another, for nearly as long as the planet has existed.

    If you came to our Solar System right after it formed, you would have seen a completely foreign-looking sight. Our Sun would have been about the same mass it is today, but only about 80% as luminous, since stars heat up as they age. The four inner, rocky worlds would still be there, but three of them would look remarkably similar: Venus, Earth, and Mars all had thin atmospheres, liquid water on their surfaces, and the organic ingredients that could give rise to life.

    While we still don’t know whether life ever took hold on Venus or Mars, we know that by the time Earth was only 100 million years old, there were organisms living on its surface. After billions of years of cosmic evolution gave rise to the elements, molecules, and conditions under which life could exist, our planet became the one where it not only arose, but thrived. To the best of our scientific knowledge, here’s what those first steps were like.

    A micron-scale view of very primitive organisms. Whether the first organisms formed on Earth or predate the formation of our planet is still an open question, but evidence favors the scenarios where life arises on our world. (ERIC ERBE, DIGITAL COLORIZATION BY CHRISTOPHER POOLEY, BOTH OF USDA, ARS, EMU)

    Life as we know it has a few properties that everyone agrees on. While life on Earth involves carbon-based chemistry (requiring carbon, oxygen, nitrogen, hydrogen, and many other elements like phosphorus, copper, iron, sulfur, and so on) and relies on liquid water, other combinations of elements and molecules may be possible. The four general properties that all life shares, however, are as follows:

    Life has a metabolism, where it harvests energy/resources from an external source for its own use.
    Life responds to external stimuli from its environment, and alters its behavior accordingly.
    Life can grow, adapt to its environment, or can otherwise evolve from its present form into a different one.
    And life can reproduce, creating viable offspring that arise from its own internal processes.

    The formation and growth of a snowflake, a particular configuration of ice crystal. Although crystals have a molecular configuration that allows them to reproduce and copy themselves, they do not utilize energy or encode genetic information. (VYACHESLAV IVANOV / VIMEO.COM/87342468)

    All four of these must be in place, simultaneously, for a population of organisms to be considered alive. Snowflakes and crystals may be able to grow and reproduce, but their lack of a metabolism prevents them from being classified as alive. Proteins may have a metabolism and be able to reproduce, but they do not respond to external stimuli or alter behavior based on what they encounter. Even viruses, the most debatable case on the line between life and non-life, can only reproduce by infecting living cells, casting doubt on whether they should be classified as living or non-living.

    Many organic materials — chemical compounds like sugars, amino acids, ethyl formate, and even complex ones like polycyclic aromatic hydrocarbons — are found in interstellar space, in asteroids, and were abundant on early Earth. But we do not have evidence that life began prior to Earth’s formation.

    The early Solar System was filled with comets, asteroids, and small clumps of matter that struck practically every world around. This period is historically known as the “late-heavy bombardment”, and is thought to have brought many of the ingredients for life, but not living organisms themselves, to Earth. (NASA)

    Instead, the leading thought is that the Earth was formed with these raw ingredients on it, and perhaps many more. Perhaps nucleotides were common; perhaps proteins and protein fragments came pre-assembled; perhaps lipid layers and bilayers could spontaneously arise in an aqueous environment. In order to go from precursors to life to actual life, however, it’s believed that we needed the right environment.

    These three favorable planets — Venus, Earth, and Mars — all likely had a reasonable level of surface gravity, thin atmospheres, liquid water on their surfaces, and these biochemical precursor molecules. The one thing Earth had that the other two planets likely didn’t, however, was a Moon. While all three worlds likely had a chance to form life for the first time, our Moon helped give us chances that the other worlds may not have had.

    The Earth and Sun, not so different from how they might have appeared 4 billion years ago. In the early stages of the Solar System, Venus and Mars may have looked quite similar. (NASA/TERRY VIRTS)

    The amount of water present on these early planets was very likely enough to create oceans, seas, lakes, and rivers, but not enough to completely cover them in liquid water. This means they all had continents and oceans, and at the interface of the two, there were tidepools: regions where water can stably exist on dry land and be subject to all sorts of energy gradients.

    Sunlight, shadow and night, cycles of evaporation and concentration, porous fluid flow in the presence of minerals, and gradients of water activity could all provide opportunities for molecules to bind together in novel and interesting ways. The effects of tides may be enhanced by the Moon, but all these worlds possess tides due to the Sun. However, there’s an additional energy source the Earth possesses that likely contributed to life’s origin, one that may not have been as spectacular on Venus or Mars.

    Tidal pools, like the ones shown here from Wisconsin, occur at the interface of land and large bodies of water, like lakes, seas, or oceans. A pool with the right conditions and precursor molecules is one candidate for where life could have possibly arisen on Earth. (GOODFREEPHOTOS_COM / PIXABAY)

    That additional energy source is thermal activity from the interior of the planet. At the bottom of the oceans, hydrothermal vents are geological hotspots that are excellent candidate locations for life to arise. Even today, they are home to organisms known as extremophiles: bacteria and other lifeforms that can withstand the temperatures that typically break the molecular bonds associated with life processes.

    These vents contain enormous energy gradients as well as chemical gradients, where extremely alkaline vent water mixes with the acidic, carbonic-acid-rich ocean water. Finally, these vents contain both sodium and potassium ions, as well as calcium carbonate structures that could serve as a template for the first cells. The fact that life exists in environments like this points to worlds like Europa or Enceladus as potential homes for life elsewhere in the Solar System today.

    Deep under the sea, around hydrothermal vents, where no sunlight reaches, life still thrives on Earth. How to create life from non-life is one of the great open questions in science today. If life can exist down here, at the bottom of Earth’s oceans, perhaps there’s a chance for life in the deep subsurface oceans of Europa or Enceladus, too. (NOAA/PMEL VENTS PROGRAM)

    But perhaps the most likely location for life to begin on Earth is the best of all worlds: hydrothermal fields. Volcanic activity doesn’t occur solely beneath the oceans; it happens on land, too. Beneath areas of fresh water, these volcanically active regions provide an additional heat and energy source that can stabilize temperatures and provide an energy gradient. All the while, these locations still allow evaporation/concentration cycles, provide a confined environment that enables the right ingredients to accumulate, and allow a sunlight/night cycle of exposure.

    On Earth, we can be confident that tidepools, hydrothermal vents, and hydrothermal fields were all common. While the precursor molecules certainly originated beyond Earth, it was likely here on our planet that the transformation of non-life into life spontaneously occurred.

    This aerial view of Grand Prismatic Spring in Yellowstone National Park is one of the most iconic hydrothermal features on land in the world. The colors are due to the various organisms living under these extreme conditions, and depend on the amount of sunlight that reaches the various parts of the springs. Hydrothermal fields like this are some of the best candidate locations for life to have arisen on Earth. (JIM PEACO, NATIONAL PARKS SERVICE)

    Over time, the Earth has changed tremendously, as have the living organisms on our planet. We do not know if life arose once, more than once, or in disparate locations. What we do know, however, is that if we reconstruct the evolutionary tree of every extant organism found on Earth today, they all share the same ancestor.

    By studying the genomes of the extant organisms found on our world today, biologists can reconstruct the timescale of what’s known as LUCA: the Last Universal Common Ancestor of life on Earth. By the time the Earth was less than 1 billion years old, life already had the ability to transcribe and translate information between DNA, RNA, and proteins, and these mechanisms exist in all organisms today. Whether life arose multiple times is unknown, but it is generally accepted that life as we know it today descended from a single population.

    Scanning electron microscope image at the sub-cellular level. While DNA is an incredibly complex, long molecule, it is made of the same building blocks (atoms) as everything else. To the best of our knowledge, the DNA structure that life is based on may even predate the fossil record. (PUBLIC DOMAIN IMAGE BY DR. ERSKINE PALMER, USCDCP)

    Despite the fact that geological processes can often obscure the fossil record beyond a few hundred million years, we have been able to trace back the origin of life extraordinarily far. Microbial fossils have been found in sandstone dating to 3.5 billion years ago. Graphite, found deposited in metamorphosed sedimentary rock, has been traced back to having biogenic origins, and dates back to 3.8 billion years ago.

    Trilobites fossilized in limestone, from the Field Museum in Chicago. All extant and fossilized organisms can have their lineage traced back to a universal common ancestor that lived an estimated 3.5 billion years ago. (JAMES ST. JOHN / FLICKR)

    At even earlier, more extreme times, the deposits of certain crystals in rocks appear to originate from biological processes, suggesting that Earth was teeming with life as early as 4.3 to 4.4 billion years ago: as soon as 100–200 million years after the Earth and Moon formed. To the best of our knowledge, life on Earth has existed almost as long as Earth itself has.

    Graphite deposits found in zircon, some of the oldest pieces of evidence for carbon-based life on Earth. These deposits, and the carbon-12 ratios they show in the inclusions, date life on Earth to more than 4 billion years ago. (E A BELL ET AL, PROC. NATL. ACAD. SCI. USA, 2015)

    At some point on our planet, in the very early stages, the abundant precursor molecules of life, under the right energy and chemical conditions, began to simultaneously metabolize energy, respond to the environment, grow, adapt, evolve, and reproduce. Even if it would be unrecognizable to us today, that marks the origin of life. In an unbroken string of biological success, our planet has been a living world ever since.

    Hadean diamonds embedded in zircon/quartz. You can find the oldest deposits in panel d, which indicate an age of 4.26 billion years, or nearly the age of Earth itself. (M. MENNEKEN, A. A. NEMCHIN, T. GEISLER, R. T. PIDGEON & S. A. WILDE, NATURE 448 7156 (2007))

    While Venus and Mars may have had similar chances, radical changes to Venus’ atmosphere rendered it a searing hothouse world after just 200–300 million years, while the death of the Martian magnetic field caused its atmosphere to be stripped away, rendering it solid and frozen. While asteroid strikes may send Earth-based life off-world, throughout the Solar System and galaxy, all the evidence suggests that we are where it started.

    By 9.4 billion years after the Big Bang, Earth was teeming with life. We’ve never looked back.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

  • richardmitnick 8:16 am on March 19, 2019
    Tags: Ames Laboratory, Chemistry, Iowa State University, Janus nanocrystal platform, MSE-Material Science and Engineering, Nanoparticle self-assembly

    From Iowa State University: “Engineered nanoparticle discovery led by MSE’s Jiang makes cover of Nano Letters” 


    March 13, 2019
    Cyclone Engineering


    Shan Jiang, assistant professor of materials science and engineering [MSE], led a research group that created a novel Janus nanocrystal platform to control nanoparticle self-assembly.

    Janus particles are fundamental new materials, and Jiang’s discovery opens opportunities in different areas including energy, drug delivery, disease diagnosis and therapy. The results appear on the cover of the March issue of Nano Letters.

    Key to the team’s discoveries were a multidisciplinary approach and the powerful high-resolution scanning transmission electron microscopy available at the U.S. Department of Energy’s Ames Laboratory Sensitive Instrument Facility.

    Ames Lab’s Matt Kramer with the Tecnai transmission electron microscope at the new Sensitive Instrument Facility

    The collaborative research effort is led by Jiang with Eric Cochran, professor of chemical and biological engineering, and Lin Zhou, scientist at Ames Laboratory. Fei Liu, a postdoctoral researcher in materials science and engineering, is the first author. Shailja Goyal and Michael Forrester, graduate students in chemical and biological engineering, contributed to the synthesis, and Tao Ma, a postdoctoral researcher at Ames Laboratory, contributed to the electron microscopy characterization. Undergraduates in materials science and engineering Yasmeen Mansoorieh and John Henjum also contributed to the work.

    Jiang’s research team’s technique is inexpensive and scalable to commercial production. The group demonstrated their synthesis approach in the form of Au-Fe3O4 nanocrystals, particularly important materials because the particles are biocompatible and have enhanced magnetic and surface plasmon resonance properties.

    “We had the right people and the right facilities to demonstrate for the first time that we can make these particles that show unique structures. The work was all completed here on the Iowa State University campus, and I’m very proud of that,” said Jiang.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Iowa State University is a public, land-grant university, where students get a great academic start in learning communities and stay active in 800-plus student organizations, undergrad research, internships and study abroad. They learn from world-class scholars who are tackling some of the world’s biggest challenges — feeding the hungry, finding alternative fuels and advancing manufacturing.

    Iowa Agricultural College and Model Farm (now Iowa State University) was officially established on March 22, 1858, by the legislature of the State of Iowa. Story County was selected as a site on June 21, 1859, and the original farm of 648 acres was purchased for a cost of $5,379. The Farm House, the first building on the Iowa State campus, was completed in 1861, and in 1862, the Iowa legislature voted to accept the provision of the Morrill Act, which was awarded to the agricultural college in 1864.

    Iowa State University Knapp-Wilson Farm House. Photo between 1911-1926

    Iowa Agricultural College (Iowa State College of Agricultural and Mechanic Arts as of 1898), as a land grant institution, focused on the ideals that higher education should be accessible to all and that the university should teach liberal and practical subjects. These ideals are integral to the land-grant university.

    The first official class entered at Ames in 1869, and the first class (24 men and 2 women) graduated in 1872. Iowa State was and is a leader in agriculture, engineering, extension, and home economics, and it created the nation’s first state veterinary medicine school in 1879.

    In 1959, the college was officially renamed Iowa State University of Science and Technology. The focus on technology has led directly to many research patents and inventions including the first binary computer (the ABC), Maytag blue cheese, the round hay baler, and many more.

    Beginning with a small number of students and Old Main, Iowa State University now has approximately 27,000 students and over 100 buildings with world class programs in agriculture, technology, science, and art.

    Iowa State University is a very special place, full of history. But what truly makes it unique is a rare combination of campus beauty, the opportunity to be a part of the land-grant experiment, and to create a progressive and inventive spirit that we call the Cyclone experience. Appreciate what we have here, for it is indeed, one of a kind.

  • richardmitnick 11:33 am on March 11, 2019
    Tags: "Electrically-heated silicate glass appears to defy Joule's first law", Chemistry, Joule heating also known as Ohmic heating and resistive heating is the process by which the passage of an electric current through a conductor produces heat.

    From Lehigh University: “Electrically-heated silicate glass appears to defy Joule’s first law” 


    February 27, 2019
    Lori Friedman

    Experiments show an electric field can modify silicate glass, causing parts of it to melt while the rest remains solid; the discovery suggests heat in glass could be produced on a very fine scale, and could point to performance challenges for devices that use glass.

    Charles T. McLaren (left), with Himanshu Jain, says applying a direct current field across glass also reduces its melting temperature and makes it possible to shape glass with greater precision than can be done using heat alone. (Courtesy of Lehigh University)

    Characterizing and predicting how electrically-heated silicate glass behaves is important because it is used in a variety of devices that drive technical innovations. Silicate glass is used in display screens. Glass fibers power the internet. Nanoscale glass devices are being deployed to provide breakthrough medical treatments such as targeted drug-delivery and re-growing tissue.

    The discovery that under certain conditions electrically-heated silicate glass defies a long-accepted law of physics known as Joule’s first law should be of interest to a broad spectrum of scientists, engineers, even the general public, according to Himanshu Jain, Diamond Distinguished Chair of the Department of Materials Science and Engineering at Lehigh University.

    The foundation of electrical heating was laid by James Prescott Joule, an English physicist and mathematician, in 1840. Joule demonstrated that heat is generated when electrical current is passed through a resistor. His conclusion, known as Joule’s first law, simply states that heat is produced in proportion to the square of an electrical current that passes through a material.

    “It has been verified over and over on homogeneous metals and semiconductors which heat up uniformly, like an incandescent light bulb does,” says Jain.

    He and his colleagues, including Nicholas J. Smith and Craig Kopatz, both of Corning Incorporated, and Charles T. McLaren, a former Ph.D. student of Jain’s who is now a researcher at Corning, have authored a paper published in Scientific Reports that details their discovery that electrically-heated common, homogeneous silicate glasses appear to defy Joule’s first law.

    Joule heating, also known as Ohmic heating and resistive heating, is the process by which the passage of an electric current through a conductor produces heat.

    Joule’s first law, also known as the Joule–Lenz law, states that the power of heating generated by an electrical conductor is proportional to the product of its resistance and the square of the current:

    P ∝ I²R

    Joule heating affects the whole electric conductor, unlike the Peltier effect which transfers heat from one electrical junction to another.
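    As a quick numerical illustration of the law itself (the toaster-element numbers below are invented for the example, not taken from the article):

```python
def joule_heat(current_a, resistance_ohm, time_s):
    """Joule's first law: power P = I^2 * R, so heat Q = I^2 * R * t."""
    power_w = current_a ** 2 * resistance_ohm  # dissipated power, in watts
    return power_w * time_s                    # energy released as heat, in joules

# e.g. 5 A flowing through a 24-ohm heating element for one minute:
print(joule_heat(5.0, 24.0, 60.0))  # -> 36000.0 J (a steady 600 W)
```

    Doubling the current quadruples the heat, which is why the law is stated in terms of the square of the current.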

    A coiled heating element from an electric toaster, showing red to yellow incandescence


    In the paper, titled Development of highly inhomogeneous temperature profile within electrically heated alkali silicate glasses, the authors write: “Unlike electronically conducting metals and semiconductors, with time the heating of ionically conducting glass becomes extremely inhomogeneous with the formation of a nanoscale alkali-depletion region, such that the glass melts near the anode, even evaporates, while remaining solid elsewhere. In situ infrared imaging shows and finite element analysis confirms localized temperatures more than thousand degrees above the remaining sample depending on whether the field is DC or AC.”

    “In our experiments, the glass became more than a thousand degrees Celsius hotter near the positive side than in the rest of the glass, which was very surprising considering that the glass was totally homogeneous to begin with,” says Jain. “The cause of this result is shown to be in the change in the structure and chemistry of glass on nanoscale by the electric field itself, which then heats up this nano-region much more strongly.”

    Jain says that the application of classical Joule’s law of physics needs to be reconsidered carefully and adapted to accommodate these findings.

    These observations unravel the origin of the recently discovered electric-field-induced softening of glass. In a previous paper, Jain and his colleagues reported the phenomenon of Electric Field Induced Softening, demonstrating that the softening temperature of glass heated in a furnace can be reduced by as much as a couple of hundred degrees Celsius simply by applying 100 volts across an inch-thick sample.

    “The calculations did not add up to explain what we were seeing as simply standard Joule heating,” says Jain. “Even under very moderate conditions, we observed fumes of glass that would require thousands of degrees higher temperature than Joule’s law could predict!”

    The team then undertook a systematic study to monitor the temperature of the glass, using high-resolution infrared pyrometers to map out the temperature profile of the whole sample. The new data, together with their previous observations, showed that the electric field modified the glass dramatically, and that they had to revise how Joule’s law is applied.

    The researchers believe this work shows it is possible to produce heat in glass on a much finer scale than existing methods allow, possibly down to the nanoscale. That would make it possible to fabricate new optical and other complex structures and devices on glass surfaces more precisely than before.

    “Besides demonstrating the need to qualify Joule’s law, the results are critical to developing new technology for the fabrication and manufacturing of glass and ceramic materials,” says Jain.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Lehigh University is an American private research university in Bethlehem, Pennsylvania. It was established in 1865 by businessman Asa Packer. Its undergraduate programs have been coeducational since the 1971–72 academic year. As of 2014, the university had 4,904 undergraduate students and 2,165 graduate students. Lehigh is considered one of the twenty-four Hidden Ivies in the Northeastern United States.

    Lehigh has four colleges: the P.C. Rossin College of Engineering and Applied Science, the College of Arts and Sciences, the College of Business and Economics, and the College of Education. The College of Arts and Sciences is the largest, comprising roughly 40% of the university’s students. The university offers a variety of degrees, including Bachelor of Arts, Bachelor of Science, Master of Arts, Master of Science, Master of Business Administration, Master of Engineering, Master of Education, and Doctor of Philosophy.

    Lehigh has produced Pulitzer Prize winners, Fulbright Fellows, members of the American Academy of Arts & Sciences and of the National Academy of Sciences, and National Medal of Science winners.

  • richardmitnick 10:28 am on March 4, 2019
    Tags: Chemistry, Directed evolution, Engineer synthetic nanoparticles as optical biosensors

    From École Polytechnique Fédérale de Lausanne: “Directed evolution builds nanoparticles” 


    Nik Papageorgiou

    Directed evolution is a powerful technique for engineering proteins. EPFL scientists now show that it can also be used to engineer synthetic nanoparticles as optical biosensors, which are used widely in biology, drug development, and even medical diagnostics such as real-time monitoring of glucose.

    The 2018 Nobel Prize in Chemistry went to three scientists who developed the method that forever changed protein engineering: directed evolution. Mimicking natural evolution, directed evolution guides the synthesis of proteins with improved or new functions.

    First, the original protein is mutated to create a collection of mutant protein variants. The protein variants that show improved or more desirable functions are selected. These selected proteins are then once more mutated to create another collection of protein variants for another round of selection. This cycle is repeated until a final, mutated protein is evolved with optimized performance compared to the original protein.
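    The mutate-screen-select cycle described above maps directly onto a simple optimization loop. The sketch below is purely illustrative and not the actual laboratory protocol; the starting sequence, the fitness function, and all parameters are invented for the example:

```python
import random

def evolve(seed, fitness, alphabet, pop_size=50, keep=5, cycles=10):
    """Toy directed-evolution loop: mutate, screen, select, repeat."""
    rng = random.Random(42)  # fixed seed so the sketch is reproducible

    def mutate(s):
        # introduce a single random point mutation
        i = rng.randrange(len(s))
        return s[:i] + rng.choice(alphabet) + s[i + 1:]

    survivors = [seed]
    for _ in range(cycles):
        # 1. diversify: generate mutant variants of the current survivors
        variants = [mutate(rng.choice(survivors)) for _ in range(pop_size)]
        # 2. screen: rank every candidate by measured performance
        # 3. select: only the best performers seed the next cycle
        survivors = sorted(variants + survivors, key=fitness, reverse=True)[:keep]
    return survivors[0]

# Hypothetical fitness stands in for a measured property:
# here, simply the number of G bases in a DNA sequence.
best = evolve("ATATATAT", fitness=lambda s: s.count("G"), alphabet="ACGT")
```

    Because each generation keeps the previous survivors in the ranking, the best observed fitness never decreases from one cycle to the next.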

    Now, scientists from the lab of Ardemis Boghossian at EPFL have been able to use directed evolution to build not proteins, but synthetic nanoparticles. These nanoparticles are used as optical biosensors – tiny devices that use light to detect biological molecules in air, water, or blood. Optical biosensors are widely used in biological research, drug development, and medical diagnostics, such as real-time monitoring of insulin and glucose in diabetics.

    “The beauty of directed evolution is that we can engineer a protein without even knowing how its structure is related to its function,” says Boghossian. “And we don’t even have this information for the vast, vast majority of proteins.”

    Her group used directed evolution to modify the optoelectronic properties of DNA-wrapped single-walled carbon nanotubes (or, DNA-SWCNTs, as they are abbreviated), which are nano-sized tubes of carbon atoms that resemble rolled up sheets of graphene covered by DNA. When they detect their target, the DNA-SWCNTs emit an optical signal that can penetrate through complex biological fluids, like blood or urine.

    General principle of the directed evolution approach applied to the nanoparticle DNA-SWCNT complexes. The starting complex is a DNA-SWCNT with a dim optical signal. This is evolved through directed evolution: (1) random mutation of the DNA sequence; (2) wrapping of the SWCNTs with the DNA and screening of the complex’s optical signal; (3) selection of the DNA-SWCNT complexes exhibiting an improved optical signal. After several cycles of evolution, we can evolve DNA-SWCNT complexes that show enhanced optical behavior. Credit: Benjamin Lambert (EPFL)

    Using a directed evolution approach, Boghossian’s team was able to engineer new DNA-SWCNTs with optical signals that are increased by up to 56% – and they did it over only two evolution cycles.

    “The majority of researchers in this field just screen large libraries of different materials in hopes of finding one with the properties they are looking for,” says Boghossian. “In optical nanosensors, we try to improve properties like selectivity, brightness, and sensitivity. By applying directed evolution, we provide researchers with a guided approach to engineering these nanosensors.”

    The study [Chemical Communications] shows that what is essentially a bioengineering technique can be used to more rationally tune the optoelectronic properties of certain nanomaterials. Boghossian explains: “Fields like materials science and physics are mostly preoccupied with defining material structure-function relationships, making materials that lack this information difficult to engineer. But this is a problem that nature solved billions of years ago – and, in recent decades, biologists have tackled it as well. I think our study shows that as materials scientists and physicists, we can still learn a few pragmatic lessons from biologists.”

    SNSF AP Energy Grant

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    EPFL is Europe’s most cosmopolitan technical university. It receives students, professors and staff from over 120 nationalities. With both a Swiss and international calling, it is therefore guided by a constant wish to open up; its missions of teaching, research and partnership impact various circles: universities and engineering schools, developing and emerging countries, secondary schools and gymnasiums, industry and economy, political circles and the general public.

  • richardmitnick 9:18 am on February 28, 2019
    Tags: An element is defined by the number of protons it contains, At the far edge of the periodic table elements decay within instants of their formation offering very little time to study their properties, Chemistry, Each element comes in a variety of types known as isotopes distinguished by the number of neutrons in the nucleus, For superheavy atoms chemistry gets weird, Scientists are hoping to stretch the periodic table even further beyond tennessine and three other recently discovered elements (113 115 and 118) that completed the table’s seventh row.

    From Science News: “Extreme elements push the boundaries of the periodic table” 

    From Science News

    February 27, 2019
    Emily Conover

    For superheavy atoms, chemistry gets weird.

    SMASH HIT To create new elements and study the chemistry of the periodic table’s heaviest atoms, researchers at the GSI Helmholtz Center for Heavy Ion Research in Darmstadt, Germany, use the apparatus above to create beams of ions that scientists then smash into other elements.

    GSI Helmholtz Centre for Heavy Ion Research GmbH, Darmstadt, Germany,

    The rare radioactive substance made its way from the United States to Russia on a commercial flight in June 2009. Customs officers balked at accepting the package, which was ensconced in lead shielding and emblazoned with bold-faced warnings and the ominous trefoil symbols for ionizing radiation. Back it went across the Atlantic.

    U.S. scientists enclosed additional paperwork and the parcel took a second trip, only to be rebuffed again. All the while, the precious cargo, 22 milligrams of an element called berkelium created in a nuclear reactor at Oak Ridge National Laboratory in Tennessee, was deteriorating. Day by day, its atoms were decaying. “We were all a little frantic on our end,” says Oak Ridge nuclear engineer Julie Ezold.

    On the third try, the shipment cleared customs. At a laboratory in Dubna, north of Moscow, scientists battered the berkelium with calcium ions to try to create an even rarer substance. After 150 days of pummeling, the researchers spotted six atoms of an element that had never been seen on Earth. In 2015, after other experiments confirmed the discovery, element 117, tennessine, earned a spot on the periodic table (SN: 2/6/16, p. 7).

    Scientists made radioactive berkelium at the High Flux Isotope Reactor at Oak Ridge National Laboratory in Tennessee (shown), and shipped it to Russia to be bombarded with a beam of calcium-48 to yield the superheavy element tennessine. ORNL/Flickr (CC BY 2.0)

    ORNL High Flux Isotope Reactor

    Scientists are hoping to stretch the periodic table even further, beyond tennessine and three other recently discovered elements (113, 115 and 118) that completed the table’s seventh row. Producing the next elements will require finessing new techniques using ultrapowerful beams of ions, electrically charged atoms. Not to mention the stress of shipping more radioactive material across borders.

    But questions circulating around the periodic table’s limits are too tantalizing not to make the effort. It’s been 150 years since Russian chemist Dmitrii Mendeleev created his periodic table. Yet “we still cannot answer the question: Which is the heaviest element that can exist?” says nuclear chemist Christoph Düllmann of the GSI Helmholtz Center for Heavy Ion Research in Darmstadt, Germany.

    At the far edge of the periodic table, elements decay within instants of their formation, offering very little time to study their properties. In fact, scientists still know little about the latest crew of newfound elements. So while some scientists are hunting for never-before-seen elements, others want to learn more about the table’s newcomers and the strange behaviors those superheavy elements may exhibit.

    For such outsized atoms, chemistry can get weird, as atomic nuclei, the hearts at the center of each atom, bulge with hundreds of protons and neutrons. Around them swirl great flocks of electrons, some moving at close to the speed of light. Such extreme conditions might have big consequences — messing with the periodic table’s tidy order, in which elements in each column are close chemical kin that behave in similar ways.

    In Russia, scientist Vladislav Shchegolev inspects a package of berkelium after its overseas flight in 2009. The material was later used to create element 117, tennessine.
    Courtesy of ORNL.

    Scientists keep pushing these superheavy elements further as part of the search for what’s poetically known as the island of stability. Atoms with certain numbers of protons and neutrons are expected to live longer than their fleeting friends, persisting perhaps for hours rather than fractions of a second. Such an island would give scientists enough time to study those elements more closely and understand their quirks. The first glimpses of that mysterious atoll have been spotted, but it’s not clear how to get a firm footing on its shores.

    Driving all this effort is a deep curiosity about how elements act at the boundaries of the periodic table. “This might sound corny, but it’s really just [about] pure scientific understanding,” says nuclear chemist Dawn Shaughnessy of Lawrence Livermore National Laboratory in California. “We have these things that are really at the extremes of matter and we don’t understand right now how they behave.”

    Assembling atoms

    An element is defined by the number of protons it contains. Create an atom with more protons than ever before, and you’ve got yourself a brand new element. Each element comes in a variety of types, known as isotopes, distinguished by the number of neutrons in the nucleus. Changing the number of neutrons in an atom’s nucleus alters the delicate balance of forces that makes a nucleus stable or that causes it to decay quickly. Different isotopes of an element might have wildly different half-lives, the period of time it takes for half of the atoms in a sample to decay.
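
    The half-life idea above can be captured in a one-line formula: after a time t, the fraction of atoms still undecayed is 0.5 raised to the power t divided by the half-life. A minimal illustrative sketch (not from the article; the function name is mine):

```python
def remaining_fraction(t: float, half_life: float) -> float:
    """Fraction of atoms still undecayed after time t (same units as half_life)."""
    return 0.5 ** (t / half_life)

# After one half-life, half the sample remains; after three, one eighth.
print(remaining_fraction(1.0, 1.0))  # 0.5
print(remaining_fraction(3.0, 1.0))  # 0.125
```

    For a berkelium-249 sample like the one shipped to Dubna (half-life roughly 330 days), the same function shows why a 150-day experiment is a race against the decay clock.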

    Mendeleev’s periodic table, presented to the Russian Chemical Society on March 6, 1869, contained only 63 elements (SN: 1/19/19, p. 14). At first, scientists added to the periodic table by isolating elements from naturally occurring materials, for example, by scrutinizing minerals and separating them into their constituent parts. But that could take scientists only so far. All the elements beyond uranium (element 92) must be created artificially; they do not exist in significant quantities in nature. Scientists discovered elements beyond uranium by bombarding atoms with neutrons or small atomic nuclei or by sifting through the debris from thermonuclear weapons tests.

    But to make the heaviest elements, researchers adopted a new brute force approach: slamming beams of heavy atoms into a target, a disk that holds atoms of another element. If scientists are lucky, the atoms in the beam and target fuse, creating a new atom with a bigger, bulkier nucleus, perhaps one holding more protons than any other known.

    Researchers are using this strategy to go after elements 119 and 120. Scientists want to create such never-before-seen atoms to test how far the periodic table goes, to satisfy curiosity about the forces that hold atoms together and to understand what bizarre chemistry might occur with these extreme atoms.


    How the periodic table went from a sketch to an enduring masterpiece
    150 years ago, Mendeleev perceived the relationships of the chemical elements
    REVOLUTIONARY Russian chemist Dmitrii Mendeleev (shown around 1880) was the first to publish a periodic table, which put the known elements into a logical order and left room for elements not yet discovered. Heritage Image Partnership Ltd/Alamy Stock Photo.

    An ordered vision

    Mendeleev’s periodic table, published in 1869, was a vertical chart that organized 63 known elements by atomic weight. This arrangement placed elements with similar properties into horizontal rows. The title, translated from Russian, reads: “Draft of system of elements: based on their atomic masses and chemical characteristics.”

    The periodic table’s lineup

    The search is gearing up for the next superheavy elements, 119 and 120 (red boxes in the table below). Meanwhile, scientists are studying the known superheavy elements (blue) to better understand how such large atoms behave.


    Coaxing nuclei to combine into a new element is done only at highly specialized facilities in a few locations across the globe, including labs in Russia and Japan. Researchers carefully choose the makeup of the beam and the target in hopes of producing a designer atom of the element desired. That’s how the four newest elements were created: nihonium (element 113), moscovium (115), tennessine (117) and oganesson (118) (SN Online: 11/30/16).

    To create tennessine, for example, scientists combined beams of calcium with a target made of berkelium — once the berkelium finally made it through customs in Russia. The union makes sense when you consider the number of protons in each nucleus. Calcium has 20 protons and berkelium has 97, making for 117 protons total, the number found in tennessine’s nucleus. Combine calcium with the next element down the table, californium, and you get element 118, oganesson.
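
    The proton arithmetic above generalizes: in a complete-fusion reaction, the new nucleus carries the combined proton count of the beam and target nuclei. A small sketch of that bookkeeping (the dictionary and function names are mine, not from the article):

```python
# Proton counts (atomic numbers) for the beams and targets mentioned in the text.
PROTONS = {"calcium": 20, "titanium": 22, "vanadium": 23,
           "curium": 96, "berkelium": 97, "californium": 98}

def fused_element(beam: str, target: str) -> int:
    """Atomic number of the element produced if beam and target nuclei fuse."""
    return PROTONS[beam] + PROTONS[target]

print(fused_element("calcium", "berkelium"))    # 117 -> tennessine
print(fused_element("calcium", "californium"))  # 118 -> oganesson
```

    The same sum explains the newer beam choices: titanium on berkelium or vanadium on curium gives 119, and titanium on californium gives 120.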

    Using calcium beams — specifically a stable calcium isotope with a combined total of 48 protons and neutrons known as calcium-48 — has been highly successful. But to create bigger nuclei would take increasingly exotic materials. The californium and berkelium used in previous efforts are so rare that the target materials had to be made at Oak Ridge, where researchers stew materials in a nuclear reactor for months and carefully process the highly radioactive product that comes out. All that work might produce just milligrams of the material.

    To discover element 119 using a calcium-48 beam, researchers would need a target made of einsteinium (element 99), which is even rarer than californium and berkelium. “We can’t make enough einsteinium,” says Oak Ridge physicist James Roberto. Scientists need a new approach. They’ve switched to relatively untested techniques relying on different beams of particles.

    Decay parade

    To discover oganesson-294 (with a combined total of 294 protons and neutrons), scientists slammed calcium ions into a californium target and observed the chain of radioactive decays initiated by the new element.



    But any new approach would have to produce new elements often enough to be worthwhile. It took almost nine years for a Japanese experiment to prove the existence of nihonium. In that time, researchers spotted the element only three times.

    To avoid such long waits, scientists are carefully choosing their tactics and revving up improved machines to quicken the search.

    A team at the RIKEN Nishina Center for Accelerator-Based Science near Tokyo uses beams of vanadium (element 23), rather than calcium, slamming them into curium (element 96) in the quest to grab elemental glory and find element 119. The group is starting with an existing accelerator and will soon switch to an accelerator upgraded to pump out ion beams that pack more punch. That revamped accelerator could be ready within a year, says RIKEN nuclear chemist Hiromitsu Haba.

    Meanwhile, a new laboratory at the Joint Institute for Nuclear Research, or JINR, in Dubna called the Superheavy Element Factory boasts an accelerator that will crank out ion beams that pummel the target at 10 times the rate of its predecessor. In an upcoming experiment, scientists plan to crash beams of titanium (element 22) into berkelium and californium targets to attempt to produce elements 119 and 120.

    Once JINR’s new experiment is up and running, 119 might be discovered after a couple of years, says JINR nuclear physicist Yuri Oganessian, for whom oganesson, one of several elements discovered there, was named.

    Scientists in Russia have built a new accelerator facility, the Superheavy Element Factory, to search for elements 119 and 120. JINR.

    Relativity rules

    Simply detecting an element, however, doesn’t mean scientists know much about it. “How would one kilogram of flerovium behave, if I had it?” Düllmann asks, referring to element 114. “It would be unlike any other material.”

    The known superheavy elements — those beyond number 103 on the table — are too short-lived to create a chunk big enough to hold in the palm of your hand. So scientists are limited to studying individual atoms, getting to know each new element by analyzing its properties, including how easily it reacts with other substances.

    One big question is whether the periodicity the table is named for applies to superheavy elements. In the table, elements are ordered according to their number of protons, arranged so that the elements in each column have similar properties. Lithium, sodium and others in the first column react violently with water, for example. Elements in the last column, known as noble gases, are famously inert (SN: 1/19/19, p. 18). But for the newest, heaviest elements at the periodic table’s outer reaches, that long-standing rule of chemistry may unravel; some superheavy elements may behave differently from neighbors sitting above them in the table.

    For nuclei crammed with 100-plus protons, a special type of physics takes center stage. Electrons zip around these giant nuclei, sometimes surpassing 80 percent of the speed of light. According to Einstein’s special theory of relativity, when particles move that fast, they seem to gain mass. That property changes how closely the electrons hug the nucleus, and as a result, how easily the atoms share electrons to produce chemical reactions. In such atoms, “relativity rules, and standard common wisdom breaks down,” says nuclear physicist Witold Nazarewicz of Michigan State University in East Lansing. “We have to write new textbooks on those atoms.”

    Getting heavy

    The nucleus of superheavy oganesson has 118 protons and many neutrons (blue and red). Its 118 electrons (green) surround the nucleus. Carbon, which is much lighter, contains just six protons and six electrons (not to scale).

    T. Tibbitts

    Some of the periodic table’s more familiar elements are already affected by special relativity. The theory explains why gold has a yellowish hue and why mercury is liquid at room temperature (SN: 2/18/17, p. 11). “Without relativity, a car would not start,” says theoretical chemist Pekka Pyykkö of the University of Helsinki. The reactions that power a car battery depend on special relativity.

    Relativity’s influence may surge as scientists progress along the periodic table. In 2018 in Physical Review Letters, Nazarewicz and colleagues reported that oganesson could be utterly bizarre (SN Online: 2/12/18). The table’s heaviest element, oganesson sits among the reclusive noble gases that shun reactions with other elements. But oganesson bucks the trend, theoretical calculations suggest, and may instead be reactive.

    Oganesson’s chemistry is a hot topic, but scientists haven’t yet been able to directly probe its properties with experiments because oganesson is too rare and fleeting. “All the theoreticians are now running around this element trying to make spectacular predictions,” says theoretical chemist Valeria Pershina of GSI. Similarly, some calculations suggest that flerovium might lean in the opposite direction, being relatively inert, even though it inhabits the same column as more reactive elements such as lead.

    Chemists are striving to test such calculations about how superheavy elements behave. But there is nothing traditional about these chemistry experiments. There are no scientists in white coats wielding flasks and Bunsen burners. “Because we make these things one atom at a time, we can’t do what most people think of as chemistry,” Lawrence Livermore’s Shaughnessy says.

    The experiments can run for months with only a few atoms to show for it. Scientists put those atoms in contact with other elements to see if the two react. At GSI, Düllmann and colleagues are looking at whether flerovium sticks to gold surfaces. Likewise, Shaughnessy and colleagues are testing whether flerovium will glom on to ring-shaped molecules, chosen so that the heavy element could fit inside the molecule’s ring. These studies will test how easily flerovium bonds with other elements, revealing whether it behaves as expected based on its place on the periodic table.

    It’s not just chemical reactions that can get wacky for superheavy elements. Atomic nuclei can be warped into various shapes when packed with protons. Oganesson may have a “bubble” in its nucleus, with fewer protons in its center than at its edges (SN: 11/26/16, p. 11). Still more extreme nuclei may be doughnut-shaped, Nazarewicz says.

    Even the most basic properties of these elements, such as their mass, need to be measured. While scientists had estimated the mass of the various isotopes of the latest new elements using indirect measurements, the arguments supporting those mass estimates weren’t airtight, says Jacklyn Gates of Lawrence Berkeley National Laboratory in California. “They hinge on physics not throwing you a curveball.”

    Jacklyn Gates and Ken Gregorich of the FIONA experiment at Lawrence Berkeley National Laboratory made the first measurements of the masses of recently discovered elements 113 and 115.
    Marilyn Chung/Berkeley Lab

    So Gates and colleagues directly measured the masses of isotopes of nihonium and moscovium using an accelerator at Lawrence Berkeley. An apparatus called FIONA helped researchers measure the masses, thanks to electromagnetic fields that steered an ion of each element onto a detector. The location where each ion hit indicated how massive it was.

    The nihonium isotope the researchers detected had a mass number of 284, meaning its nucleus had a combined total of 284 protons and neutrons. Moscovium had a mass number of 288. Those masses were as predicted, the scientists reported in November in Physical Review Letters. It took about a month just to find one atom of each element.
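
    Because a mass number is simply the combined count of protons and neutrons, the FIONA measurements also pin down each isotope’s neutron count. A small illustrative sketch (the function name is mine, not from the paper):

```python
def neutrons(mass_number: int, protons: int) -> int:
    """Neutron count implied by a measured mass number and the element's proton count."""
    return mass_number - protons

print(neutrons(284, 113))  # 171 neutrons in nihonium-284
print(neutrons(288, 115))  # 173 neutrons in moscovium-288
```
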

    Island views

    If researchers could coax these fleeting elements to live longer, studying their properties might be easier. Scientists have caught enticing visions of increasing life spans lying just out of reach — the fabled island of stability (SN: 6/5/10, p. 26). Scientists hope that the isotopes on that island, which would be packed with lots of neutrons, may live long enough that their chemistry can be studied in detail.

    When the idea of an island of stability was proposed in the 1960s, scientists had suggested that the isotopes on its shores might live millions of years. Advances in theoretical physics have since knocked that time frame down, Nazarewicz says. Instead, nuclear physicists now expect the island’s inhabitants to stick around for minutes, hours or maybe even a day — a pleasant eternity for superheavy elements.

    To reach the island of stability, scientists must create new isotopes of known elements. Researchers already know which direction they need to row: They must cram more neutrons into the nuclei of the superheavy elements that have already been discovered. Currently, scientists can’t make atoms with enough neutrons to reach the island’s center, where isotopes are expected to be most stable. But the signs of this island’s existence are already clear. The half-lives of superheavy elements tend to shoot up as scientists pack more neutrons into each nucleus, approaching the island. Flerovium’s half-life increases by almost a factor of 700 as five more neutrons are added, from three milliseconds to two seconds.
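
    A quick back-of-the-envelope check of the factor quoted above, using the two flerovium half-lives given in the text:

```python
# Flerovium's half-life climbs from about 3 milliseconds to about 2 seconds
# as five neutrons are added; the ratio is "almost a factor of 700."
factor = 2.0 / 0.003
print(round(factor))  # 667
```
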

    Long life

    Each row below is an element, and each column a different isotope. Atoms are expected to be more stable on the island of stability (predicted location shown). As isotopes of elements (gray squares) approach the island, they tend to live longer, as more neutrons fill the nucleus. Flerovium’s half-life, for example, increases from 0.003 seconds to two seconds.

    T. Tibbitts

    Sources: S. Hofmann et al/Pure and Applied Chemistry 2018; W. Nazarewicz; Y. Oganessian

    Reaching this island “is our big dream,” Haba says. “Unfortunately, we don’t have a very good method to reach the island.” That island is thought to be centered around isotopes that bulge with around 184 neutrons and something like 110 protons. Making such neutron-rich nuclei would demand new, difficult techniques, such as using beams of radioactive particles instead of stable ones. Although radioactive beams can be produced at RIKEN, Haba says, the beams aren’t intense enough to produce new elements at a reasonable rate.

    Still, superheavy element sleuths are keeping at it to learn how these weird atoms behave.

    End of the line

    To fully grasp nature’s extremes, scientists want to know where the periodic table ends.

    “Everybody knows at some point there will be an end,” Düllmann says. “There will be a heaviest element, ultimately.” The table will be finished when we’ve discovered all elements with isotopes that live at least a hundredth of a trillionth of a second. That’s the limit for what qualifies as an element, according to the standards set by the International Union of Pure and Applied Chemistry. More ephemeral nuclei wouldn’t have enough time to gather a crew of electrons. Since the give-and-take of electrons is the basis of chemical reactions, lone nuclei wouldn’t exhibit chemistry at all, and therefore don’t deserve a spot on the table.

    “Where it will exactly end is difficult to say,” Nazarewicz says. Calculations of how quickly a nucleus will decay by fission, or splitting in two, are uncertain, which makes it hard to estimate how long elements might live without actually creating them.

    The linear accelerator at RIKEN in Japan, used to discover element 113*, is being refurbished to probe for element 119. RIKEN

    *According to a statement received via email from LLNL, element 113 was first found at LLNL; but RIKEN published first on 113 and so received the credit.

    And the final table may contain holes or other odd features. That could happen if, within a row of elements, there’s one spot for which no isotope persists long enough to qualify as an element.

    Another idiosyncrasy: Elements may not be arranged in sequential order by the number of protons they contain, according to calculations in a 2011 paper by Pyykkö in Physical Chemistry Chemical Physics. Element 139, for example, might sit to the right of element 164 — if such heavy elements indeed exist. That’s because special relativity alters the normal order in which electrons slot themselves into shells, arrangements that define how the electrons swirl about the atom. That pattern of shell filling is what gives the periodic table its shape, and the unusual filling may mean scientists decide to assign elements to spots out of order.

    But additions to the table could dry up before that happens if scientists reach the limit of their ability to create heavier elements. When elements live minuscule fractions of a second, even the atom’s trip to a detector may take too long; the element would decay before it ever had a chance to be spotted.

    In reality, there’s no clear idea of how to search for elements beyond 119 and 120. But the picture has seemed bleak before.

    “We should not underestimate the next generation. They may have smart ideas. They will have new technologies,” Düllmann says. “The next element is always the hardest. But it’s probably not the last one.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 8:30 pm on February 21, 2019 Permalink | Reply
    Tags: , , Chemistry, , Molecular ensemble, , , , PtPOP, , , ,   

    From SLAC National Accelerator Lab: “Researchers watch molecules in a light-triggered catalyst ring ‘like an ensemble of bells’”

    From SLAC National Accelerator Lab

    February 21, 2019
    Ali Sundermier

    Synchronized molecules
    When photocatalyst molecules absorb light, they start vibrating in a coordinated way, like an ensemble of bells. Capturing this response is a critical step towards understanding how to design molecules for the efficient transformation of light energy to high-value chemicals. (Gregory Stewart/SLAC National Accelerator Laboratory)

    A better understanding of these systems will aid in developing next-generation energy technologies.

    Photocatalysts ­– materials that trigger chemical reactions when hit by light – are important in a number of natural and industrial processes, from producing hydrogen for fuel to enabling photosynthesis.

    Now an international team has used an X-ray laser at the Department of Energy’s SLAC National Accelerator Laboratory to get an incredibly detailed look at what happens to the structure of a model photocatalyst when it absorbs light.

    The researchers used extremely fast laser pulses to watch the structure change and see the molecules vibrating, ringing “like an ensemble of bells,” says lead author Kristoffer Haldrup, a senior scientist at Technical University of Denmark (DTU). This study paves the way for deeper investigation into these processes, which could help in the design of better catalysts for splitting water into hydrogen and oxygen for next-generation energy technologies.

    “If we can understand such processes, then we can apply that understanding to developing molecular systems that do tricks like that with very high efficiency,” Haldrup says.

    The results were published last week in Physical Review Letters.

    Molecular ensemble

    The platinum-based photocatalyst they studied, called PtPOP, is one of a class of molecules that scissors hydrogen atoms off various hydrocarbon molecules when hit by light, Haldrup says: “It’s a testbed – a playground, if you will – for studying photocatalysis as it happens.”

    At SLAC’s X-ray laser, the Linac Coherent Light Source (LCLS), the researchers used an optical laser to excite the platinum-containing molecules and then used X-rays to see how these molecules changed their structure after absorbing the visible photons.


    The extremely short X-ray laser pulses allowed them to watch the structure change, Haldrup says.

    The researchers used a trick to selectively “freeze” some of the molecules in their vibrational motion, and then used the ultrashort X-ray pulses to capture how the entire ensemble of molecules evolved in time after being hit with light. By taking these images at different times they can stitch together the individual frames like a stop-motion movie. This provided them with detailed information about molecules that were not hit by the laser light, offering insight into the ultrafast changes occurring in the molecules when they are at their lowest energy.

    Swimming in harmony

    Even before the light hits the molecules, they are all vibrating but out of sync with one another. Kelly Gaffney, co-author on this paper and director of SLAC’s Stanford Synchrotron Radiation Lightsource, likens this motion to swimmers in a pool, furiously treading water.

    SLAC SSRL Campus



    When the optical laser hits them, some of the molecules affected by the light begin moving in unison and with greater intensity, switching from that discordant tread to synchronized strokes. Although this phenomenon has been seen before, until now it was difficult to quantify.

    “This research clearly demonstrates the ability of X-rays to quantify how excitation changes the molecules,” Gaffney says. “We can not only say that it’s excited vibrationally, but we can also quantify it and say which atoms are moving and by how much.”

    Predictive chemistry

    To follow up on this study, the researchers are investigating how the structures of PtPOP molecules change when they take part in chemical reactions. They also hope to use the information they gained in this study to directly study how chemical bonds are made and broken in similar molecular systems.

    “We get to investigate the very basics of photochemistry, namely how exciting the electrons in the system leads to some very specific changes in the overall molecular structure,” says Tim Brandt van Driel, a co-author from DTU who is now a scientist at LCLS. “This allows us to study how energy is being stored and released, which is important for understanding processes that are also at the heart of photosynthesis and the visual system.”

    A better understanding of these processes could be key to designing better materials and systems with useful functions.

    “A lot of chemical understanding is rationalized after the fact. It’s not predictive at all,” Gaffney says. “You see it and then you explain why it happened. We’re trying to move the design of useful chemical materials into a more predictive space, and that requires accurate detailed knowledge of what happens in the materials that already work.”

    LCLS and SSRL are DOE Office of Science user facilities. This research was supported by DANSCATT; the Independent Research Fund Denmark; the Icelandic Research Fund; the Villum Foundation; and the AMOS program within the Chemical Sciences, Geosciences and Biosciences Division of the DOE Office of Basic Energy Sciences.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

  • richardmitnick 5:27 pm on February 20, 2019 Permalink | Reply
    Tags: , , , , Chemistry, Photo Essay- Underground Lab science in many fields, ,   

    From Sanford Underground Research Facility: “Science impact” 

    SURF logo
    Sanford Underground levels

    From Sanford Underground Research Facility

    Erin Broberg
    Matthew Kapust, photographer

    Sanford Lab’s dedication to science, to research, development and engineering, and its innovative approach to education make it a world-leading science facility.

    Dark matter science impacts

    The LZ experiment is the upgraded successor to the highly successful Large Underground Xenon (LUX) experiment. LUX held world-leading sensitivity for approximately three and a half years over most of the WIMP-mass region. The LZ experiment was one of two direct-search, next-generation dark matter experiments selected for funding by DOE’s Office of High Energy Physics (HEP).

    LZ involves a collaboration of 250 scientists, engineers and technicians from 38 institutions, including five U.S. National Labs. LZ expects to achieve a projected sensitivity level up to 100 times better than the final LUX search result for weakly interacting massive particles (WIMPs), the leading dark matter particle candidate.

    Currently in the construction and installation phase, LZ is expected to begin operations in late 2019. The collaboration will perform a direct search for dark matter using 10 tonnes of liquid xenon within an ultra-pure titanium cryostat that will be surrounded by a new liquid scintillator veto system. The entire experiment will be immersed in a 72,000-gallon tank filled with ultra-pure water.

    Neutrino science impacts

    Beginning with Dr. Ray Davis’ groundbreaking neutrino research (1965–1992), the drifts at Sanford Lab have been dedicated to refining our knowledge of neutrinos, alongside other research.

    The Majorana Demonstrator Project has been collecting physics data since 2017. Recently published results are competitive with world-leading experiments and highlight the exceptional energy resolution and low backgrounds that have been achieved through the shielding offered at Sanford Lab.

    The MJD project invested significant resources to produce the world’s purest copper. In parallel with ongoing MJD operations, specific elements—such as electronics upgrades and copper electroforming—are being pursued at Sanford Lab in the context of R&D for the next-generation neutrinoless double-beta decay experiment called the Large Enriched Germanium Experiment for Neutrinoless bb Decay (LEGEND). LEGEND-200 physics data collection is expected to begin in 2021. Extraordinary levels of material radiopurity will be required to reach the LEGEND-1000 background goal.

    The advantages Sanford Lab offers, including its depth and ultra-pure materials, have been instrumental in refining the search and preparing for the next generation of experiments.

    223 acres
    Surface footprint

    The local footprint of the facility includes 223 acres on the surface. Facilities at both the Yates and Ross surface campuses offer researchers administrative support, office space, communications and education and public outreach. The Waste Water Treatment Plant handles and processes waste materials, and a warehouse provides shipping and receiving.

    370 miles
    Underground footprint
    Of the 370 total miles of underground space, Sanford Lab maintains approximately 12 miles for science at various levels, including the 300, 800, 1700, 2000, 4100, and 4850 levels. The Davis Campus on the 4850 Level is a world-class laboratory space that houses experiments searching for neutrinoless double-beta decay and dark matter.

    The CASPAR experiment, led by SD Mines, studies stellar nuclear fusion reactions, especially neutron production for slow neutron-capture nucleosynthesis (the s-process). Accelerator components were relocated from the University of Notre Dame in 2015, and accelerator commissioning has continued since the first beam in May 2017 and the first operations event in July 2017. Advanced commissioning data have been collected since February 2018 in the energy range of interest for stellar CNO reactions.

    “Researchers at CASPAR are engaging a community of researchers. Although Notre Dame and SD Mines are at the core, the collaboration continues to reach out to other research groups to build interest. One of the biggest impacts in South Dakota is the number of grad students participating in the Physics Ph.D. program in the state.” —Jaret Heise


    Low-background counting impacts

    The BHUC houses a low-background counting facility where components for physics experiments, including current and future Sanford Lab experiments, can be assayed. There has been significant interest in the BHUC low-background counting facility from many groups, including the Sub Electron Noise Skipper-CCD Experimental Instrument (SENSEI) experiment, which aims to search for low-mass dark matter using ~100 g of silicon CCD sensors, and the Germanium Internal Charge Amplification for Dark Matter Searches (GeICA) project.

    Six high-purity germanium detectors are currently operating at the facility, with installation of an additional germanium detector expected in 2019. These low-background counters have been instrumental in characterizing materials for the LZ experiment for the past several years.

    “The campus at Sanford Lab is an ideal location for these counters. Not only does its depth create a shield for the detectors, but it’s in the thick of major physics experiments—it’s where the action is.” —Kevin Lesko, senior scientist at Lawrence Berkeley National Lab (Berkeley Lab) who manages the measurement and control of backgrounds

    Geology research impacts

    The SIGMA-V experiment, led by Lawrence Berkeley National Lab (Berkeley Lab), is a significant effort within the earth science field. SIGMA-V mobilized in October 2017, drilling a set of eight horizontal holes (each nearly 200 feet long) on the 4850L.

    Members of the SIGMA-V experiment are continuing to explore enhanced or engineered geothermal systems (EGS) by building on results obtained from a previous experiment that was hosted at SURF between 2016 and 2017. Both groups drilled new holes as field demonstration sites in support of the DOE’s flagship EGS effort, the Frontier Observatory for Research in Geothermal Energy (FORGE). SIGMA-V is validating thermal-hydrological-mechanical-chemical (THMC) modeling approaches, as well as novel monitoring tools.

    Biology opportunities

    Important questions in life science, such as the conditions of life, the extent of life and ultimately the rules of life, are also being addressed underground at SURF. Generally, these programs have a small footprint in existing spaces and require only modest support from the facility. Biology researchers take full advantage of SURF’s footprint by gathering samples from a number of underground levels and areas with different temperatures and geologic mineralogies. Various groups focus on the diversity of life, including rock-hosted microbial ecosystems, and engineering applications such as improvements to biofuel production.


    The Sanford Underground Research Facility offers a variety of environments in which engineers can test real-world applications and new technologies. And the rich history of the Homestake Mine, which includes a vast archive of core samples, allows engineers to better understand how to excavate caverns for new experiments.

    Sanford Lab’s dedication to science, research and development and engineering, as well as its innovative approach to education, make it a world-leading science facility.

    The Sanford Underground Research Facility supports world-leading research in particle and nuclear physics and other science disciplines. While still a gold mine, the facility hosted Ray Davis’s solar neutrino experiment, which shared the 2002 Nobel Prize in Physics. His work is a model for other experiments looking to understand the nature of the universe.

    The Facility’s depth, rock stability and history make it ideal for sensitive experiments that need to escape cosmic rays. The impacts on science can be seen worldwide.

    Our science as national priority

    In 2014, the Department of Energy’s High Energy Physics Advisory Panel (HEPAP) prioritized physics experiments, giving neutrino and dark matter projects high priority. Sanford Lab houses two of the five experiments named in the Particle Physics Project Prioritization Panel (P5) Report: LUX-ZEPLIN (LZ) and LBNF/DUNE.

    In 2015, a similar report by the Department of Energy’s Nuclear Science Advisory Committee (NSAC) prioritized the ton-scale neutrinoless double-beta decay experiment, which aligns with the objectives of the Majorana Demonstrator Project.

    International investment and cooperation

    Sanford Lab hosts a variety of research projects in many disciplines. Researchers from around the globe use the facility to learn more about our universe, life underground and the unique geology of the region.

    The site also allows scientists to share and foster growth within the science community and encourages cooperation between many countries and institutions.

    We now have several hundred researchers from dozens of institutions around the world.

    For example, for the first time in its history, CERN is investing in an experiment outside of the European Union with its $90 million commitment to LBNF/DUNE in the form of ProtoDUNE. The ProtoDUNE detectors have already recorded physics results. Additionally, the UK committed $88 million to the project.



    Local impact

    Building laboratory spaces deep underground at Sanford Lab created new opportunities for higher education in South Dakota. In 2012, the Board of Regents authorized a joint Ph.D. physics program at the South Dakota School of Mines and Technology in Rapid City and the University of South Dakota in Vermillion. Since then, dozens of students have participated in the program and worked on experiments at Sanford Lab. In 2017, each university saw their first students complete the program.

    To date, there are 27 ongoing research projects housed at Sanford Lab, 24 of which include students and faculty from universities across South Dakota.

    The Black Hills State University Underground Campus (BHUC) provides a space for students from across the state to perform interdisciplinary research underground. While physics students contribute to large-scale physics experiments by working in the low-background counting facility, students from other disciplines can work on research in two areas adjoining the counting cleanroom.

    “Biology students can study microbes in situ, and geology students can study the unique rock formations of the Black Hills,” said Briana Mount, director of the BHUC.

    Additionally, a National Science Foundation (NSF) program, Research Experience for Undergraduates (REU), gives students from around the country opportunities to pursue research through the underground campus.


    Global footprint

    Competition for underground laboratory space is fierce. With the completion of the Long-Baseline Neutrino Facility (LBNF) construction, Sanford Lab will host approximately 25 percent of the total volume of underground laboratory space in the world.


    The sheer amount of space (7,700 acres underground) and existing infrastructure make the site highly attractive for future experiments in a variety of disciplines.

    Global footprint depth

    Sanford Lab is the deepest underground lab in the U.S. at 1,490 meters. The average rock overburden is approximately 4,300 meters water equivalent for existing laboratories on the 4850 Level. The underground laboratory space has a strong track record of meeting experiment needs.
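    The quoted overburden can be sanity-checked with a back-of-the-envelope conversion: depth in rock times the ratio of rock density to water density gives the depth in meters water equivalent (m.w.e.). The rock density below is an assumed typical value for dense metamorphic rock, not a figure from the article.

```python
# Rough conversion of rock depth to meters water equivalent (m.w.e.).
# Assumption: mean rock density of ~2.9 g/cm^3 (typical metamorphic rock).
depth_m = 1490          # depth of the 4850 Level, in meters
rock_density = 2.9      # g/cm^3, assumed
water_density = 1.0     # g/cm^3

overburden_mwe = depth_m * rock_density / water_density
print(f"{overburden_mwe:.0f} m.w.e.")  # ~4300, consistent with the quoted figure
```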


    See the full article here .

    Please help promote STEM in your local schools.

    Stem Education Coalition

    About us.
    The Sanford Underground Research Facility in Lead, South Dakota, advances our understanding of the universe by providing laboratory space deep underground, where sensitive physics experiments can be shielded from cosmic radiation. Researchers at the Sanford Lab explore some of the most challenging questions facing 21st century physics, such as the origin of matter, the nature of dark matter and the properties of neutrinos. The facility also hosts experiments in other disciplines—including geology, biology and engineering.

    The Sanford Lab is located at the former Homestake gold mine, which was a physics landmark long before being converted into a dedicated science facility. Nuclear chemist Ray Davis earned a share of the Nobel Prize for Physics in 2002 for a solar neutrino experiment he installed 4,850 feet underground in the mine.

    Homestake closed in 2003, but the company donated the property to South Dakota in 2006 for use as an underground laboratory. That same year, philanthropist T. Denny Sanford donated $70 million to the project. The South Dakota Legislature also created the South Dakota Science and Technology Authority to operate the lab. The state Legislature has committed more than $40 million in state funds to the project, and South Dakota also obtained a $10 million Community Development Block Grant to help rehabilitate the facility.

    In 2007, after the National Science Foundation named Homestake as the preferred site for a proposed national Deep Underground Science and Engineering Laboratory (DUSEL), the South Dakota Science and Technology Authority (SDSTA) began reopening the former gold mine.

    In December 2010, the National Science Board decided not to fund further design of DUSEL. However, in 2011 the Department of Energy, through the Lawrence Berkeley National Laboratory, agreed to support ongoing science operations at Sanford Lab, while investigating how to use the underground research facility for other longer-term experiments. The SDSTA, which owns Sanford Lab, continues to operate the facility under that agreement with Berkeley Lab.

    The first two major physics experiments at the Sanford Lab are 4,850 feet underground in an area called the Davis Campus, named for the late Ray Davis. The Large Underground Xenon (LUX) experiment is housed in the same cavern excavated for Ray Davis’s experiment in the 1960s.
    LUX/Dark matter experiment at SURF

    LBNL LZ project will replace LUX at SURF [see below]

    In October 2013, after an initial run of 80 days, LUX was determined to be the most sensitive detector yet to search for dark matter—a mysterious, yet-to-be-detected substance thought to be the most prevalent matter in the universe. The Majorana Demonstrator experiment, also on the 4850 Level, is searching for a rare phenomenon called “neutrinoless double-beta decay” that could reveal whether subatomic particles called neutrinos can be their own antiparticle. Detection of neutrinoless double-beta decay could help determine why matter prevailed over antimatter. The Majorana Demonstrator experiment is adjacent to the original Davis cavern.

    LUX’s mission was to scour the universe for WIMPs, vetoing all other signatures. It would continue to do just that for another three years before it was decommissioned in 2016.

    In the midst of the excitement over first results, the LUX collaboration was already casting its gaze forward. Planning for a next-generation dark matter experiment at Sanford Lab was already under way. Named LUX-ZEPLIN (LZ), the next-generation experiment would increase the sensitivity of LUX 100 times.

    SLAC physicist Tom Shutt, a previous co-spokesperson for LUX, said one goal of the experiment was to figure out how to build an even larger detector.
    “LZ will be a thousand times more sensitive than the LUX detector,” Shutt said. “It will just begin to see an irreducible background of neutrinos that may ultimately set the limit to our ability to measure dark matter.”
    We celebrate five years of LUX, and look into the steps being taken toward the much larger and far more sensitive experiment.

    Another major experiment, the Long-Baseline Neutrino Experiment (LBNE), a collaboration between Fermi National Accelerator Laboratory (Fermilab) and Sanford Lab, is in the preliminary design stages. The project got a major boost last year when Congress approved and the president signed an Omnibus Appropriations bill that will fund LBNE operations through FY 2014. Called the “next frontier of particle physics,” LBNE will follow neutrinos as they travel 800 miles through the earth, from Fermilab in Batavia, Ill., to Sanford Lab.

    Fermilab LBNE

    U Washington Majorana Demonstrator Experiment at SURF

    The MAJORANA DEMONSTRATOR will contain 40 kg of germanium; up to 30 kg will be enriched to 86% in 76Ge. The DEMONSTRATOR will be deployed deep underground in an ultra-low-background shielded environment in the Sanford Underground Research Facility (SURF) in Lead, SD. The goal of the DEMONSTRATOR is to determine whether a future 1-tonne experiment can achieve a background goal of one count per tonne-year in a 4-keV region of interest around the 76Ge 0νββ Q-value at 2039 keV. MAJORANA plans to collaborate with GERDA for a future tonne-scale 76Ge 0νββ search.
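    The DEMONSTRATOR’s background goal can be restated as a background index, the unit conventionally used to compare 0νββ experiments. The conversion below is a simple unit exercise on the numbers quoted above, not a figure taken from the collaboration.

```python
# Express "1 count per tonne-year in a 4-keV region of interest" as a
# background index in counts/(keV * kg * yr), the conventional 0vbb unit.
counts = 1.0
exposure_kg_yr = 1000.0   # 1 tonne-year of exposure
roi_kev = 4.0             # region of interest around Q = 2039 keV

background_index = counts / (exposure_kg_yr * roi_kev)
print(f"{background_index:.1e} counts/(keV kg yr)")  # 2.5e-04
```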

    LBNL LZ project at SURF, Lead, SD, USA


    CASPAR is a low-energy particle accelerator that allows researchers to study processes that take place inside collapsing stars.

    The scientists are using space in the Sanford Underground Research Facility (SURF) in Lead, South Dakota, to work on a project called the Compact Accelerator System for Performing Astrophysical Research (CASPAR). CASPAR uses a low-energy particle accelerator that will allow researchers to mimic nuclear fusion reactions in stars. If successful, their findings could help complete our picture of how the elements in our universe are built. “Nuclear astrophysics is about what goes on inside the star, not outside of it,” said Dan Robertson, a Notre Dame assistant research professor of astrophysics working on CASPAR. “It is not observational, but experimental. The idea is to reproduce the stellar environment, to reproduce the reactions within a star.”

  • richardmitnick 11:20 am on February 15, 2019 Permalink | Reply
    Tags: "Physicists create a quantum refrigerator that cools with an absence of light", , Chemistry, , Near-field photonic cooling through control of the chemical potential of photons, , , ,   

    From U Michigan via Science Magazine: “Physicists create a quantum refrigerator that cools with an absence of light” 


    From University of Michigan


    Science Magazine

    Feb. 14, 2019
    Daniel Garisto

    This new device shows that an LED can cool other tiny objects. Joseph Xu/Michigan Engineering, Communications & Marketing

    For decades, atomic physicists have used laser light to slow atoms zinging around in a gas, cooling them to just above absolute zero to study their weird quantum properties. Now, a team of scientists has managed to similarly cool an object—but with the absence of light rather than its presence. The technique, which has never before been experimentally shown, might someday be used to chill the components in microelectronics.

    In an ordinary laser cooling experiment, physicists shine laser light from opposite directions—up, down, left, right, front, back—on a puff of gas such as rubidium. They tune the lasers precisely, so that if an atom moves toward one of them, it absorbs a photon and gets a gentle push back toward the center. Set it up just right and the light saps away the atoms’ kinetic energy, cooling the gas to a very low temperature.
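    The size of each “gentle push” can be estimated from the photon momentum. The sketch below uses textbook values for rubidium-87 and its 780-nm cooling transition; these numbers are standard laser-cooling figures, not taken from the article.

```python
# Momentum kick per absorbed photon and the resulting recoil velocity
# for rubidium-87 on its 780-nm cooling transition (textbook values).
h = 6.626e-34            # Planck constant, J*s
wavelength = 780e-9      # m, Rb D2 line
m_rb87 = 87 * 1.66e-27   # kg, approximate mass of a Rb-87 atom

photon_momentum = h / wavelength            # ~8.5e-28 kg m/s
recoil_velocity = photon_momentum / m_rb87  # ~6 mm/s per absorbed photon
print(f"recoil velocity: {recoil_velocity * 1000:.1f} mm/s")
```

    A single photon barely nudges the atom, which is why cooling relies on each atom scattering many thousands of photons per second.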

    But Pramod Reddy, an applied physicist at the University of Michigan in Ann Arbor, wanted to try cooling without the special properties of laser light. He and colleagues started with a widget made of semiconducting material commonly found in video screens—a light-emitting diode (LED). An LED exploits a quantum mechanical effect to turn electrical energy into light. Roughly speaking, the LED acts like a little ramp for electrons. Apply a voltage in the right direction and it pushes electrons up and over the ramp, like kids on skateboards. As electrons fall over the ramp to a lower energy state, they emit photons.

    Crucially for the experiment, the LED emits no light when the voltage is reversed, as the electrons cannot go over the ramp in the opposite direction. In fact, reversing the voltage also suppresses the device’s infrared radiation—the broad spectrum of light (including heat) that you see when you look at a hot object through night vision goggles.

    That effectively makes the device colder—and it means the little thing can work like a microscopic refrigerator, Reddy says. All that’s necessary is to put it close enough to another tiny object, he says. “If you take a hot object and a cold object … you can have a radiative exchange of heat,” Reddy says. To prove that they could use an LED to cool, the scientists placed one just tens of nanometers—the width of a couple hundred atoms—away from a heat-measuring device called a calorimeter. That was close enough to increase the transfer of photons between the two objects, due to a process called quantum tunneling. Essentially, the gap was so small that photons could sometimes hop over it.

    The cooler LED absorbed more photons from the calorimeter than it gave back to it, wicking heat away from the calorimeter and lowering its temperature by a ten-thousandth of a degree Celsius, Reddy and colleagues report this week in Nature. That’s a small change, but given the tiny size of the LED, it equals an energy flux of 6 watts per square meter. For comparison, the sun provides about 1000 watts per square meter. Reddy and his colleagues believe they could someday increase the cooling flux up to that strength by reducing the gap size and siphoning away the heat that builds up in the LED.
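    For scale, the classical far-field limit on radiative exchange between two nearby surfaces follows from the Stefan-Boltzmann law. The temperatures below are illustrative only (not the experiment’s actual operating point); the point of the estimate is that far-field exchange at small temperature differences is well below the reported 6 watts per square meter, which is why the nanometer gap and photon tunneling matter.

```python
# Far-field (blackbody) radiative exchange between two surfaces, for scale.
# Net flux = sigma * (T_hot^4 - T_cold^4); temperatures are illustrative.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
t_hot = 300.0     # K, calorimeter side (illustrative)
t_cold = 299.9    # K, reverse-biased LED side (illustrative)

net_flux = SIGMA * (t_hot**4 - t_cold**4)
print(f"{net_flux:.2f} W/m^2")  # well under 1 W/m^2 in the far field
```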

    The technique probably won’t replace traditional refrigeration techniques or be able to cool materials below temperatures of about 60 K. But it has the potential to someday be used for cooling microelectronics, according to Shanhui Fan, a theoretical physicist at Stanford University in Palo Alto, California, who was not involved with the work. In earlier work, Fan used computer modeling to predict that an LED could have a sizeable cooling effect if placed nanometers from another object. Now, he said, Reddy and his team have realized that idea experimentally.

    See the full article here .


    Please support STEM education in your local school system

    Stem Education Coalition


    The University of Michigan (U-M, UM, UMich, or U of M), frequently referred to simply as Michigan, is a public research university located in Ann Arbor, Michigan, United States. Originally founded in 1817 in Detroit as the Catholepistemiad, or University of Michigania, 20 years before the Michigan Territory officially became a state, the University of Michigan is the state’s oldest university. The university moved to Ann Arbor in 1837 onto 40 acres (16 ha) of what is now known as Central Campus. Since its establishment in Ann Arbor, the university campus has expanded to include more than 584 major buildings with a combined area of more than 34 million gross square feet (781 acres or 3.16 km²), and has two satellite campuses located in Flint and Dearborn. The University was one of the founding members of the Association of American Universities.

    Considered one of the foremost research universities in the United States,[7] the university has very high research activity and its comprehensive graduate program offers doctoral degrees in the humanities, social sciences, and STEM fields (Science, Technology, Engineering and Mathematics) as well as professional degrees in business, medicine, law, pharmacy, nursing, social work and dentistry. Michigan’s body of living alumni (as of 2012) comprises more than 500,000. Besides academic life, Michigan’s athletic teams compete in Division I of the NCAA and are collectively known as the Wolverines. They are members of the Big Ten Conference.

  • richardmitnick 10:28 am on February 15, 2019 Permalink | Reply
    Tags: , Ceramics, Chemistry, Researchers create ultra-lightweight ceramic material that can better withstand extreme temperatures,   

    From UCLA Newsroom: “Researchers create ultra-lightweight ceramic material that can better withstand extreme temperatures” 

    From UCLA Newsroom

    February 14, 2019
    Matthew Chin

    UCLA-led team develops highly durable aerogel that could ultimately be an upgrade for insulation on spacecraft.

    The new ceramic aerogel is so lightweight that it can rest on a flower without damaging it. Xiangfeng Duan and Xiang Xu/UCLA

    UCLA researchers and collaborators at eight other research institutions have created an extremely light, very durable ceramic aerogel. The material could be used for applications like insulating spacecraft because it can withstand the intense heat and severe temperature changes that space missions endure.

    Ceramic aerogels have been used to insulate industrial equipment since the 1990s, and they have been used to insulate scientific equipment on NASA’s Mars rover missions. But the new version is much more durable after exposure to extreme heat and repeated temperature spikes, and much lighter. Its unique atomic composition and microscopic structure also make it unusually elastic.

    When it’s heated, the material contracts rather than expanding like other ceramics do. It also contracts perpendicularly to the direction that it’s compressed — imagine pressing a tennis ball on a table and having the center of the ball move inward rather than expanding out — the opposite of how most materials react when compressed. As a result, the material is far more flexible and less brittle than current state-of-the-art ceramic aerogels: It can be compressed to 5 percent of its original volume and fully recover, while other existing aerogels can be compressed to only about 20 percent and then fully recover.

    The research, which was published today in Science, was led by Xiangfeng Duan, a UCLA professor of chemistry and biochemistry; Yu Huang, a UCLA professor of materials science and engineering; and Hui Li of Harbin Institute of Technology, China. The study’s first authors are Xiang Xu, a visiting postdoctoral fellow in chemistry at UCLA from Harbin Institute of Technology; Qiangqiang Zhang of Lanzhou University; and Menglong Hao of UC Berkeley and Southeast University.

    Other members of the research team were from UC Berkeley; Purdue University; Lawrence Berkeley National Laboratory; Hunan University, China; Lanzhou University, China; and King Saud University, Saudi Arabia.

    Despite the fact that more than 99 percent of their volume is air, aerogels are solid and structurally very strong for their weight. They can be made from many types of materials, including ceramics, carbon or metal oxides. Compared with other insulators, ceramic-based aerogels are superior in blocking extreme temperatures, and they have ultralow density and are highly resistant to fire and corrosion — all qualities that lend themselves well to reusable spacecraft.

    But current ceramic aerogels are highly brittle and tend to fracture after repeated exposure to extreme heat and dramatic temperature swings, both of which are common in space travel.

    The new material is made of thin layers of boron nitride, a ceramic, with atoms that are connected in hexagon patterns, like chicken wire.

    In the UCLA-led research, it withstood conditions that would typically fracture other aerogels. It stood up to hundreds of exposures to sudden and extreme temperature spikes when the engineers raised and lowered the temperature in a testing container between minus 198 degrees Celsius and 900 degrees above zero over just a few seconds. In another test, it lost less than 1 percent of its mechanical strength after being stored for one week at 1,400 degrees Celsius.

    “The key to the durability of our new ceramic aerogel is its unique architecture,” Duan said. “Its innate flexibility helps it take the pounding from extreme heat and temperature shocks that would cause other ceramic aerogels to fail.”

    Breath mint-sized samples of the ceramic aerogels developed by a UCLA-led research team. The material is 99 percent air by volume, making it super lightweight. Oszie Tarula/UCLA

    Ordinary ceramic materials usually expand when heated and contract when they are cooled. Over time, those repeated temperature changes can lead those materials to fracture and ultimately fail. The new aerogel was designed to be more durable by doing just the opposite — it contracts rather than expanding when heated.

    In addition, the aerogel’s ability to contract perpendicularly to the direction that it’s being compressed — like the tennis ball example — helps it survive repeated and rapid temperature changes. (That property is known as a negative Poisson’s ratio.) It also has interior “walls” that are reinforced with a double-pane structure, which cuts down the material’s weight while increasing its insulating abilities.
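    The tennis-ball analogy corresponds directly to the sign of the Poisson’s ratio: in linear elasticity, transverse strain equals minus the ratio times the axial strain, so a negative ratio means the material shrinks sideways when squeezed. The values below are illustrative, not measured properties of the aerogel.

```python
# Transverse response to axial compression for positive vs. negative
# Poisson's ratio (illustrative values, not measurements of the aerogel).
def transverse_strain(poisson_ratio: float, axial_strain: float) -> float:
    """Linear-elastic estimate: eps_transverse = -nu * eps_axial."""
    return -poisson_ratio * axial_strain

axial = -0.10  # 10% compression along the loading axis

print(transverse_strain(0.3, axial))   # positive: ordinary material bulges outward
print(transverse_strain(-0.3, axial))  # negative: auxetic material pulls inward
```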

    Duan said the process researchers developed to make the new aerogel also could be adapted to make other ultra-lightweight materials.

    “Those materials could be useful for thermal insulation in spacecraft, automobiles or other specialized equipment,” he said. “They could also be useful for thermal energy storage, catalysis or filtration.”

    The research was partly supported by grants from the National Science Foundation.

    See the full article here .

    Please help promote STEM in your local schools.

    Stem Education Coalition


    For nearly 100 years, UCLA has been a pioneer, persevering through impossibility, turning the futile into the attainable.

    We doubt the critics, reject the status quo and see opportunity in dissatisfaction. Our campus, faculty and students are driven by optimism. It is not naïve; it is essential. And it has fueled every accomplishment, allowing us to redefine what’s possible, time after time.

    This can-do perspective has brought us 12 Nobel Prizes, 12 Rhodes Scholarships, more NCAA titles than any university and more Olympic medals than most nations. Our faculty and alumni helped create the Internet and pioneered reverse osmosis. And more than 100 companies have been created based on technology developed at UCLA.

  • richardmitnick 4:15 pm on February 12, 2019 Permalink | Reply
    Tags: , , Chemist Francesco Evangelista-a winner of the Dirac Medal, Chemistry, Emory University, ,   

    From Emory University- “A new spin on computing: Chemist leads $3.9 million DOE quest for quantum software” 

    From Emory University

    February 5, 2019
    Carol Clark

    “Quantum computers are not just exponentially faster, they work in a radically different way from classical computers,” says chemist Francesco Evangelista, who is leading a project to develop quantum software. He is a winner of the Dirac Medal.

    When most people think of a chemistry lab, they picture scientists in white coats mixing chemicals in beakers. But the lab of theoretical chemist Francesco Evangelista looks more like the office of a tech start-up. Graduate students in jeans and t-shirts sit around a large, round table chatting as they work on laptops.

    “A ‘classical’ chemist is focused on getting a chemical reaction and creating new molecules,” explains Evangelista, assistant professor at Emory University. “As theoretical chemists, we want to understand how chemistry really works — how all the atoms involved interact with one another during a reaction.”

    Working at the intersection of math, physics, chemistry and computer science, the theorists develop algorithms to serve as simulation models for the molecular behaviors of atomic nuclei and electrons. They also develop software that enables them to feed these algorithms into “super” computers — nearly a million times faster than a laptop — to study chemical processes.

    The problem is, even super computers are taxed by the mind-boggling combinatorial complexity underlying reactions. That limits the pace of the research.

    “Computers have hit a barrier in terms of speed,” Evangelista says. “One way to make them more powerful is to make transistors smaller, but you can’t make them smaller than the width of a couple of atoms — the limit imposed by quantum mechanics. That’s why there is a race right now to make breakthroughs in quantum computing.”

    Evangelista and his graduate students have now joined that race.

    The Department of Energy (DOE) awarded Evangelista $3.9 million to lead research into the development of software to run the first generation of quantum computers. He is the principal investigator for the project, encompassing scientists at seven universities, to develop new methods and algorithms for calculating problems in quantum chemistry. The tools the team develops will be open access, made available to other researchers for free.


    While big-data leaders — such as IBM, Google, Intel and Rigetti — have developed prototypes of quantum computers, the field remains in its infancy. Many technological challenges remain before quantum computers can fulfill their promise of speeding up calculations to crack major mysteries of the natural world.

    The federal government will play a strong supporting role in achieving this goal. President Trump recently signed a $1.2 billion law, the National Quantum Initiative Act, to fund advances in quantum technologies over the next five years.

    “Right now, it’s a bit of a wild west, but eventually people working on this giant endeavor are going to work out some of the current technological problems,” Evangelista says. “When that happens, we need to have quantum software ready and a community trained to use it for theoretical chemistry. Our project is working on programming codes that will someday get quantum computers to do the calculations we want them to do.”

    The project will pave the way for quantum computers to simulate chemical systems critical to the mission of the DOE, such as transition metal catalysts, high-temperature superconductors and novel materials that are beyond the realm of simulation on “classical” computers. The insights gained could speed up research into how to improve everything from solar power to nuclear energy.

    Unlike objects in the “classical” world that we can touch, see and experience around us, nature behaves very differently in the ultra-small quantum world of atoms and subatomic particles.

    “One of the weird things about quantum mechanics is that you can’t say whether an electron is actually only here or there,” Evangelista says.

    He takes a coin from his pocket. “In the classical world, we know that an object like this quarter is either in my pocket or in your pocket,” Evangelista says. “But if this was an electron, it could be in both our pockets. I cannot tell you exactly where it is, but I can use a wave function to describe the likelihood of whether it is here or there.”
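The coin analogy can be made concrete with a toy state vector. The sketch below is not from the article; it simply shows how a wave function’s amplitudes encode the likelihood Evangelista describes: the squared magnitudes of the two amplitudes give the probability of finding the “coin” in each pocket.

```python
import numpy as np

# A single qubit as a 2-component state vector (illustrative values):
# amplitudes a and b give probabilities |a|^2 and |b|^2 of measuring
# the quarter in "my pocket" vs. "your pocket".
state = np.array([np.sqrt(0.7), np.sqrt(0.3)])  # 70% here, 30% there

probabilities = np.abs(state) ** 2
print(probabilities)        # ≈ [0.7, 0.3]
print(probabilities.sum())  # probabilities always sum to 1
```

Measurement forces a definite answer, but until then the state genuinely carries both possibilities at once.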

    To make things even more complicated, the behavior of electrons can be correlated, or entangled. When objects in our day-to-day lives, like strands of hair, become entangled, they can be teased apart and separated again. That rule doesn’t apply at the quantum scale, where entangled objects remain intimately connected even when they are far apart in space.

    “Three electrons moving in three separate orbitals can actually be interacting with one another,” Evangelista says. “Somehow they are talking together and their motion is correlated like ballerinas dancing and moving in a concerted way.”
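The “dancing ballerinas” correlation has a standard textbook illustration, sketched below (my example, not Evangelista’s): a two-qubit Bell state. Each qubit alone looks like a fair coin flip, yet sampled joint measurements of the pair always agree.

```python
import numpy as np

rng = np.random.default_rng(0)

# Bell state (|00> + |11>) / sqrt(2); the four amplitudes index the
# joint outcomes 00, 01, 10, 11.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
probs = np.abs(bell) ** 2  # [0.5, 0, 0, 0.5]

# Simulate 1000 joint measurements: the outcomes 01 and 10 never occur,
# so the two qubits are perfectly correlated despite each being random.
outcomes = rng.choice(["00", "01", "10", "11"], size=1000, p=probs)
print(set(outcomes))  # only "00" and "11" appear
```

No classical assignment of definite values to each qubit separately reproduces this pattern, which is what makes entanglement a genuine computational resource.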


    Much of Evangelista’s work involves trying to predict the collective behavior of strongly correlated electrons. In order to understand how a drug interacts with a protein, for example, he needs to consider how it affects the hundreds of thousands of atoms in that protein, along with the millions of electrons within those atoms.

    “The problem quickly explodes in complexity,” Evangelista says. “Computationally, it’s difficult to account for all the possible combinations of ways the electrons could be interacting. The computer soon runs out of memory.”
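A back-of-the-envelope count shows how quickly the problem explodes. This sketch (illustrative numbers, not from the article) tallies the ways to distribute n electrons among k spin-orbitals, which sets the size of a brute-force configuration space:

```python
from math import comb

# Number of ways to place n electrons in k spin-orbitals: C(k, n).
# Even modest molecules push this past what any classical memory holds.
for n, k in [(2, 8), (6, 24), (10, 40), (18, 72)]:
    print(f"{n:2d} electrons in {k:2d} orbitals -> {comb(k, n):,} configurations")
```

Going from 2 electrons in 8 orbitals (28 configurations) to 10 in 40 (over 800 million) illustrates why, as Evangelista says, the computer soon runs out of memory.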

    A classical computer stores memory in a line of “bits,” which are represented by either a “0” or a “1.” It operates on chunks of 64 bits of memory at a time, and each bit is either distinctly a 0 or a 1. If you add another bit to the line, you get just one more bit of memory.

    A quantum computer stores memory in quantum bits, or qubits. A single qubit can be either a 0 or a 1 — or mostly a 0 and part of a 1 — or any other combination of the two. Each qubit added to a quantum computer doubles the size of the state it can represent, so capacity grows exponentially rather than linearly. The fastest quantum computers now available contain around 70 qubits.
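The contrast in scaling can be tallied directly. The sketch below (my illustration) counts the 2^n complex amplitudes needed to describe n qubits, and the memory a classical simulator would need to hold them at 16 bytes per amplitude:

```python
# n classical bits hold one n-bit value; n qubits are described by
# 2**n complex amplitudes, so each added qubit doubles the state size.
for n in [1, 10, 30, 70]:
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30  # 16 bytes per complex128 amplitude
    print(f"{n:3d} qubits -> {amplitudes:.3e} amplitudes (~{gib:.1f} GiB classically)")
```

At 30 qubits a simulation already needs about 16 GiB; at the roughly 70 qubits the article cites, the amplitude count exceeds 10^21, far beyond any classical machine, which is why simulators in Evangelista’s lab can only test software at small scales.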

    “Quantum computers are not just exponentially faster, they work in a radically different way from classical computers,” Evangelista says.

    For instance, a classical computer can determine all the consequences of a chess move by working one at a time through the chain of possible next moves. A quantum computer, however, could potentially determine all these possible moves in one go, without having to work through each step.

    While quantum computers are powerful, they are also somewhat delicate.

    “They’re extremely sensitive,” Evangelista says. “They have to be kept at low temperatures to maintain their coherence. In a typical setup, you also need a second computer kept at very low temperatures to drive the quantum computer, otherwise the heat from the wires coming out will destroy entanglement.”

    The potential error rate is one of the challenges of the DOE project to develop quantum software. The researchers need to determine the range of errors that can still yield a practical solution to a calculation. They will also develop standard benchmarks for testing the accuracy and computing power of new quantum hardware and they will validate prototypes of quantum computers in collaborations with industry partners Google and Rigetti.

    Just as they develop algorithms to simulate chemical processes, Evangelista and his graduate students are now developing algorithms to simulate quantum software so they can run tests and adapt the design based on the results.

    Evangelista pulled together researchers from other universities with a range of expertise for the project, including some who are new to quantum computing and others who are already experts in the field. The team includes scientists from Rice University, Northwestern, the University of Michigan, Caltech, the University of Toronto and Dartmouth.

    The long-range goal is to spur the development of more efficient energy sources, including solar power, by providing detailed data on phenomena such as the ways electrons in a molecule are affected when that molecule absorbs light.

    “Ultimately, such theoretical insights could provide a rational path to efforts like making solar cells more efficient, saving the time and money needed to conduct trial-and-error experiments in a lab,” Evangelista says.

    Evangelista also has ongoing collaborations with Emory chemistry professor Tim Lian, studying ways to harvest and convert solar energy into chemical fuels. In 2017, Evangelista won the Dirac Medal, one of the world’s most prestigious awards for theoretical and computational chemists under 40.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    Emory University is a private research university in metropolitan Atlanta, located in the Druid Hills section of DeKalb County, Georgia, United States. The university was founded as Emory College in 1836 in Oxford, Georgia by the Methodist Episcopal Church and was named in honor of Methodist bishop John Emory. In 1915, the college relocated to metropolitan Atlanta and was rechartered as Emory University. The university is the second-oldest private institution of higher education in Georgia and among the fifty oldest private universities in the United States.

    Emory University has nine academic divisions: Emory College of Arts and Sciences, Oxford College, Goizueta Business School, Laney Graduate School, School of Law, School of Medicine, Nell Hodgson Woodruff School of Nursing, Rollins School of Public Health, and the Candler School of Theology. Emory University, the Georgia Institute of Technology, and Peking University in Beijing, China jointly administer the Wallace H. Coulter Department of Biomedical Engineering. The university operates the Confucius Institute in Atlanta in partnership with Nanjing University. Emory has a growing faculty research partnership with the Korea Advanced Institute of Science and Technology (KAIST). Emory University students come from all 50 states, 6 territories of the United States, and over 100 foreign countries.
