Updates from January, 2018

  • richardmitnick 2:00 pm on January 23, 2018 Permalink | Reply
    Tags: Neural networks for neutrinos

    From Symmetry: “Neural networks for neutrinos” 

    Symmetry

    01/23/18
    Diana Kwon

    Artwork by Sandbox Studio, Chicago

    Scientists are using cutting-edge machine-learning techniques to analyze physics data.

    Particle physics and machine learning have long been intertwined.

    One of the earliest examples of this relationship dates back to the 1960s, when physicists were using bubble chambers to search for particles invisible to the naked eye. These vessels were filled with a clear liquid that was heated to just below its boiling point so that even the slightest boost in energy—for example, from a charged particle crashing into it—would cause it to bubble, an event that would trigger a camera to take a photograph.

    Female scanners often took on the job of inspecting these photographs for particle tracks. Physicist Paul Hough handed that task over to machines when he developed the Hough transform, a pattern recognition algorithm, to identify them.

    The computer science community later developed the Hough transform further for use in applications such as computer vision, the effort to train computers to replicate the complex function of the human eye.
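    To make the line-finding idea concrete, here is a minimal sketch of the Hough transform in Python with NumPy. It is an illustration of the technique, not Hough’s original implementation; the toy image size and accumulator grid are assumptions. Each bright pixel votes for every line that could pass through it, and peaks in the vote array reveal straight tracks.

    import numpy as np

    def hough_lines(points, image_size, n_theta=180, n_rho=200):
        # Accumulate votes in (rho, theta) space using the normal form of a
        # line: rho = x*cos(theta) + y*sin(theta).
        diag = float(np.hypot(*image_size))       # largest possible |rho|
        thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
        votes = np.zeros((n_rho, n_theta), dtype=int)
        for x, y in points:
            rhos = x * np.cos(thetas) + y * np.sin(thetas)
            idx = np.round((rhos + diag) / (2 * diag) * (n_rho - 1)).astype(int)
            votes[idx, np.arange(n_theta)] += 1   # one vote per candidate line
        return votes, thetas

    # Toy "photograph": a straight track along y = x in a 100x100 image.
    track = [(i, i) for i in range(100)]
    votes, thetas = hough_lines(track, (100, 100))
    rho_i, theta_i = np.unravel_index(votes.argmax(), votes.shape)
    print(f"strongest line: theta = {np.degrees(thetas[theta_i]):.0f} deg, "
          f"{votes.max()} of 100 points vote for it")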

    “There’s always been a little bit of back and forth” between these two communities, says Mark Messier, a physicist at Indiana University.

    Since then, the field of machine learning has rapidly advanced. Deep learning, a form of artificial intelligence modeled after the human brain, has been implemented for a wide range of applications such as identifying faces, playing video games and even synthesizing life-like videos of politicians.

    Over the years, algorithms that help scientists pick interesting aberrations out of background data have been used in physics experiments such as BaBar at SLAC National Accelerator Laboratory and experiments at the Large Electron-Positron Collider at CERN and the Tevatron at Fermi National Accelerator Laboratory.

    SLAC BABAR

    CERN LEP Collider

    FNAL Tevatron

    FNAL/Tevatron map


    FNAL/Tevatron DZero detector


    FNAL/Tevatron CDF detector

    More recently, algorithms that learn to recognize patterns in large datasets have been handy for physicists studying hard-to-catch particles called neutrinos.

    This includes scientists on the NOvA experiment, who study a beam of neutrinos created at the US Department of Energy’s Fermilab near Chicago.

    FNAL NOvA Near Detector


    FNAL/NOvA experiment map

    The neutrinos stream straight through Earth to a 14,000-metric-ton detector filled with liquid scintillator sitting near the Canadian border in Minnesota.

    When a neutrino strikes the liquid scintillator, it releases a burst of particles. The detector collects information about the pattern and energy of those particles. Scientists use that information to figure out what happened in the original neutrino event.

    “Our job is almost like reconstructing a crime scene,” Messier says. “A neutrino interacts and leaves traces in the detector—we come along afterward and use what we can see to try and figure out what we can about the identity of the neutrino.”

    Over the last few years, scientists have started to use algorithms called convolutional neural networks (CNNs) to take on this task instead.

    CNNs, which are modeled after the mammalian visual cortex, are widely used in the technology industry—for example, to improve computer vision for self-driving cars. These networks are composed of multiple layers that act somewhat like filters: They contain densely interconnected nodes that possess numerical values, or weights, that are adjusted and refined as inputs pass through.

    “The ‘deep’ part comes from the fact that there are many layers to it,” explains Adam Aurisano, an assistant professor at the University of Cincinnati. “[With deep learning] you can take nearly raw data, and by pushing it through these stacks of learnable filters, you wind up extracting nearly optimal features.”

    For example, these algorithms can extract details associated with particle interactions of varying complexity from the “images” collected by recording different patterns of energy deposits in particle detectors.

    “Those stacks of filters have sort of sliced and diced the image and extracted physically meaningful bits of information that we would have tried to reconstruct before,” Aurisano says.
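    As a concrete illustration, here is a minimal sketch in PyTorch of the kind of network described above: a stack of learnable convolutional filters that turns a detector "image" of energy deposits into per-class scores for the event. The layer sizes and the three example classes are illustrative assumptions, not the architecture any particular experiment uses.

    import torch
    import torch.nn as nn

    class EventClassifier(nn.Module):
        def __init__(self, n_classes=3):      # e.g. nu-e, nu-mu, background
            super().__init__()
            self.filters = nn.Sequential(     # the "stack of learnable filters"
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),              # 64x64 -> 32x32
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),              # 32x32 -> 16x16
            )
            self.classify = nn.Linear(32 * 16 * 16, n_classes)

        def forward(self, x):                 # x: (batch, 1, 64, 64)
            features = self.filters(x)        # extract features from raw data
            return self.classify(features.flatten(1))

    # One fake 64x64 "image" of energy deposits, one score per class.
    scores = EventClassifier()(torch.randn(1, 1, 64, 64))
    print(scores.shape)                       # torch.Size([1, 3])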

    Although they can be used to classify events without recreating them, CNNs can also be used to reconstruct particle interactions using a method called semantic segmentation.

    When applied to an image of a table, for example, this method would reconstruct the object by tagging each pixel associated with it, Aurisano explains. In the same way, scientists can label each pixel associated with characteristics of neutrino interactions, then use algorithms to reconstruct the event.
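    A minimal sketch of the semantic-segmentation variant, under the same illustrative assumptions: a fully convolutional network emits a score for every pixel rather than one label per event, so each pixel of an interaction can be tagged before the event is reconstructed.

    import torch
    import torch.nn as nn

    # The final 1x1 convolution turns the feature maps into one score per
    # class for every pixel; the four class names are hypothetical.
    segmenter = nn.Sequential(
        nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 4, kernel_size=1),   # e.g. track, shower, vertex, background
    )

    pixel_scores = segmenter(torch.randn(1, 1, 64, 64))
    labels = pixel_scores.argmax(dim=1)    # one class label per pixel
    print(labels.shape)                    # torch.Size([1, 64, 64])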

    Physicists are using this method to analyze data collected from the MicroBooNE neutrino detector.

    FNAL/MicrobooNE

    “The nice thing about this process is that you might find a cluster that’s made by your network that doesn’t fit in any interpretation in your model,” says Kazuhiro Terao, a scientist at SLAC National Accelerator Laboratory. “That might be new physics. So we could use these tools to find stuff that we might not understand.”

    Scientists working on other particle physics experiments, such as those at the Large Hadron Collider at CERN, are also using deep learning for data analysis.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    “All these big physics experiments are really very similar at the machine learning level,” says Pierre Baldi, a computer scientist at the University of California, Irvine. “It’s all images associated with these complex, very expensive detectors, and deep learning is the best method for extracting signal against some background noise.”

    Although most of the information is currently flowing from computer scientists to particle physicists, other communities may also gain new tools and insights from these experimental applications.

    For example, according to Baldi, one question that’s currently being discussed is whether scientists can write software that works across all these physics experiments with a minimal amount of human tuning. If this goal were achieved, it could benefit other fields, such as biomedical imaging, that use deep learning as well. “[The algorithm] would look at the data and calibrate itself,” he says. “That’s an interesting challenge for machine learning methods.”

    Another future direction, Terao says, would be to get machines to ask questions—or, more simply, to be able to identify outliers and try to figure out how to explain them.

    “If the AI can form a question and come up with a logical sequence to solve it, then that replaces a human,” he says. “To me, the kind of AI you want to see is a physics researcher—one that can do scientific research.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 1:34 pm on January 23, 2018 Permalink | Reply
    Tags: ALMA Captured Betelgeuse

    From ALMA: “ALMA Captured Betelgeuse” 

    ESO/NRAO/NAOJ ALMA Array in Chile in the Atacama at Chajnantor plateau, at 5,000 metres

    ALMA

    2018.01.23

    ALMA Captured Betelgeuse. Credit: ALMA (ESO/NAOJ/NRAO)/E. O’Gorman/P. Kervella.

    ALMA captured this image of the bright star Betelgeuse, in the constellation Orion, at an ultra-high resolution that exceeds 80000/20 vision in terms of eyesight. Betelgeuse is a red supergiant star in the final stage of its life, and it has swelled to about 1400 times the size of the Sun. In the image taken by ALMA, the radio waves are stronger on one part of the star’s surface (the white part in the image), and that part turned out to be about 1000 degrees Celsius hotter than its surroundings. A slightly swollen structure can also be seen on the left side of the image.

    Investigating the Surface of a Star with Extremely High-resolution Observations

    The stars visible in the night sky are located very far away. Even if you look at them with a telescope, you can usually see them only as dots. However, Betelgeuse is located relatively close, at 500 light-years from Earth, and it has expanded to 1400 times the size of the Sun, which is about the size of Jupiter’s orbit in the Solar System. So it is one of the few stars whose surface patterns can be investigated with extremely high-resolution observations.

    ALMA captured radio waves radiated slightly above the photosphere, the surface of Betelgeuse that you can see with visible light. The average temperature estimated from the radio intensity is about 2500 degrees Celsius. Since Betelgeuse’s photosphere is about 3400 degrees Celsius, the temperature of the upper atmosphere is about 1000 degrees Celsius colder than the surface of the photosphere. On the other hand, as shown in the image, some regions captured by ALMA are hotter than their surroundings. Researchers think that this is due to convection, in which high-temperature matter rises from inside Betelgeuse. Observing Betelgeuse at extremely high resolution gives us a clue to understanding what is happening inside a giant star at the end of its life.

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition

    The Atacama Large Millimeter/submillimeter Array (ALMA), an international astronomy facility, is a partnership of Europe, North America and East Asia in cooperation with the Republic of Chile. ALMA is funded in Europe by the European Organization for Astronomical Research in the Southern Hemisphere (ESO), in North America by the U.S. National Science Foundation (NSF) in cooperation with the National Research Council of Canada (NRC) and the National Science Council of Taiwan (NSC) and in East Asia by the National Institutes of Natural Sciences (NINS) of Japan in cooperation with the Academia Sinica (AS) in Taiwan.

    ALMA construction and operations are led on behalf of Europe by ESO, on behalf of North America by the National Radio Astronomy Observatory (NRAO), which is managed by Associated Universities, Inc. (AUI) and on behalf of East Asia by the National Astronomical Observatory of Japan (NAOJ). The Joint ALMA Observatory (JAO) provides the unified leadership and management of the construction, commissioning and operation of ALMA.

    NRAO Small
    ESO 50 Large
    NAOJ

     
  • richardmitnick 1:14 pm on January 23, 2018 Permalink | Reply
    Tags: Forensic DNA evidence, PROVEDIt database

    From Rutgers Camden: “Rutgers-Camden Houses PROVEDIt DNA Database” 

    Rutgers Camden

    1.23.18
    Jeanne Leong
    jeanne.leong@camden.rutgers.edu


    Forensic DNA evidence is a valuable tool in criminal investigations to link a suspect to the scene of a crime, but the process to make that determination is not so simple since the genetic material found at a crime scene often comes from more than one person.

    That task may become somewhat less challenging, thanks to a new database at Rutgers University–Camden that can help to bring more reliability to the interpretation of complex DNA evidence. This innovative new resource was developed by a research team led by Rutgers University–Camden professors Catherine Grgicak and Desmond Lun, and Ken Duffy of the National University of Ireland Maynooth.

    “Right now, there’s no standardization of tests,” says Grgicak, the Henry Rutgers Chair in chemistry at Rutgers–Camden. “There’s accreditation of crime labs, but that’s different from having standards set out for labs to meet some critical threshold of a match statistic.”

    In analyzing DNA mixtures, scientists will often find partial matches, so part of the determination of whether a suspect contributed to an item of evidence depends on interpretations by forensic scientists.

    The Project Research Openness for Validation with Empirical Data (PROVEDIt) database will help reduce the risk of misinterpreting the profile.

    The team of researchers spent more than six years developing computational algorithms that sorted through possible DNA signal combinations in a piece of evidence, taking into account their prevalence in the general population to determine the likelihood that the genetic material came from one, two, three, four, or five people.
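    The flavor of that computation can be shown with a toy Monte Carlo sketch in Python. This is emphatically not the PROVEDIt algorithm, only an illustration of the idea: given hypothetical allele frequencies in the population, estimate how probable the observed set of alleles at one locus would be under different numbers of contributors.

    import random

    freqs = {"A": 0.40, "B": 0.35, "C": 0.25}   # hypothetical allele frequencies
    observed = {"A", "B", "C"}                  # alleles seen at one locus

    def likelihood(n_contributors, trials=50_000):
        # Each contributor carries two alleles, so draw 2n alleles from the
        # population and count how often they match exactly what was seen.
        alleles, weights = zip(*freqs.items())
        hits = 0
        for _ in range(trials):
            draw = set(random.choices(alleles, weights=weights, k=2 * n_contributors))
            if draw == observed:
                hits += 1
        return hits / trials

    for n in (1, 2, 3):
        print(f"P(observed alleles | {n} contributors) ~ {likelihood(n):.3f}")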

    Information from the PROVEDIt database, housed at Rutgers–Camden, could be used to test software systems and interpretation protocols, and could serve as a benchmark for future developments in DNA analysis.

    The PROVEDIt database, which consists of approximately 25,000 samples, is accessible to anyone for free.

    “We wanted to provide these data to the community so that they could test their own probabilistic systems,” says Grgicak. “Other academicians or other researchers might develop their own systems by which to interpret these very complex types of samples.”

    The website’s files contain data that can be used to develop new or compare existing interpretation or analysis strategies.

    Grgicak says forensic laboratories could use the database for validating or testing new or existing forensic DNA interpretation protocols. Researchers requiring data to test newly developed methodologies, technologies, ideas, developments, hypotheses, or prototypes can use the database to advance their own work.

    Lun, a computer science professor at Rutgers–Camden, led the way in developing the software systems, doing the number crunching to determine the likely number of contributors in a DNA sample, and calculating statistics to determine the likelihood that a person contributed to a sample or not.

    “The approach that we took to develop these methods is that we thought that it is very important that they be empirically driven,” says Lun. “That they can be used on real experimental data in order both to train or calibrate these methods and validate them.”

    Grgicak’s and Lun’s research to produce the database, titled “A Large-Scale Dataset of Single and Mixed-Source Short Tandem Repeat Profiles to Inform Human Identification Strategies: PROVEDIt,” is published in the journal Forensic Science International: Genetics.

    The database was mentioned in a 2016 report by President Barack Obama’s Council of Advisors on Science and Technology (PCAST), an advisory group of the nation’s leading scientists and engineers who directly advise the president and make policy recommendations in science, technology, and innovation.

    The research was supported by the National Institute of Justice and the Department of Defense’s Army Research Office Rapid Innovation Fund.

    Other researchers contributing to the study include Lauren Alfonse and Amanda Garrett, of the biomedical forensic sciences program at Boston University School of Medicine.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A regional focus and a global outlook. All the prestige and resources of Rutgers, all the vitality and opportunity of the metro Philadelphia region, all at Rutgers–Camden. As the Rutgers of South Jersey, we deliver the academic heft you’d expect from a powerhouse public research university. And we focus that energy—in teaching, research, and civic engagement—within the greater Delaware Valley.

    The work we do on our 40-acre campus along the bustling Camden Waterfront is felt far beyond. We educate students for successful careers and productive citizenship. We support a faculty of sharp thinkers who turn new knowledge into creative solutions. And we share our expertise with partners—local and global—to improve individual lives and build stronger communities.

     
  • richardmitnick 12:47 pm on January 23, 2018 Permalink | Reply

    From Texas A&M: “A&M professors help develop new telescope” 

    Texas A&M

    Dec 4, 2017
    Elaine Soliman

    The new telescope will help change how astronomers study space.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak of Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research telescopes.

    The LSST Facility will change how astronomers study the sky by providing a new method of examination.

    The ability to analyze and learn more about dark matter and dark energy is just around the corner, thanks to the innovative Large Synoptic Survey Telescope, or LSST. It is being developed by an international team of thousands of people, including three professors at Texas A&M.

    The LSST is a groundbreaking telescope that will build a digital picture of the entire visible sky every three nights. The project is funded by the National Science Foundation and the Department of Energy, according to Lucas Macri, institutional board representative of the LSST. The LSST is expected to become operational as soon as 2022. This project wasn’t feasible fifteen years ago, but the LSST will bring in all sorts of new data about the universe around us, according to Macri.

    “Imagine if our only knowledge of biology was one picture of a cell that you took once,” Macri said. “The nice thing about a microscope … is you can actually see a cell … that dynamic and that temporal coverage, of in this case, a cell, gives you a lot of information, and it is the same thing with the sky. We’ve been able to study small patches of the sky repeatedly: take many pictures of them, see things that change, discover new stars, exploding stars, asteroids, whatever. But we have never been able to do an unbiased complete survey of the sky.”

    The LSST is able to do this using a charge-coupled device that is sensitive to light. It also required a large mirror to be cast, with a lot of glass melted into the right shape. The LSST has two mirrors cast in a single piece of glass, which collect light with enough quality that one can eventually make these pictures of the sky, according to Macri.

    “The telescope was designed to be able to look at a large part of the sky,” said Nicholas Suntzeff, university distinguished professor and head of the TAMU Astronomy Group. “The digital detector for the LSST is the size of [a] table. Imagine covering [a] table with silicon chips and cramming them all together. And so every image you take with this telescope is the size of [a] table, and you’re taking images every twenty seconds all night long. So this is an unbelievable size of an image of a focal plane. And compare that with the camera that’s being built for the LSST: that’s what [a] table is, it’s the size of the image. As a person who’s built instruments, it just blows my mind that we’re able to do something like that.”
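    Suntzeff’s numbers imply a staggering data rate. A back-of-the-envelope sketch in Python, assuming a 10-hour observing night, a 3.2-gigapixel sensor, and roughly 2 bytes per pixel (all three numbers are illustrative assumptions, not project specifications), shows why the astrostatistics effort described later in the article is needed:

    # One exposure every 20 seconds, all night long.
    seconds_per_night = 10 * 3600
    exposures = seconds_per_night // 20            # 1800 exposures per night
    tb_per_night = exposures * 3.2e9 * 2 / 1e12    # pixels * bytes -> terabytes
    print(f"{exposures} exposures, roughly {tb_per_night:.0f} TB of raw pixels per night")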

    This flood of new information about the universe can be used to deepen understanding of dark matter and dark energy. With the LSST, astronomers will be able to learn more about both than was ever possible before, and to better understand transient objects, according to Suntzeff.

    “This telescope will be the first big telescope to devote itself to searching for what we call in astronomy the transient sky,” Suntzeff said. “Stars that vary, getting brighter and fainter. Stars that explode. Galaxies that get brighter and fainter. Black holes that rip apart stars. Gamma-ray explosions at the edge of the universe. And we’ll discover things that we can’t even imagine right now. That’s one of the beauties of astronomy.”

    Every time a telescope is built, that opens up a new way of looking at the universe, Suntzeff said.

    “We anticipate cool things to discover, but ultimately what is really exciting is to discover things that we had no idea existed,” Suntzeff said. “So, in this case we’re opening up the transient sky and we will find things beyond our imaginations.”

    The LSST will also be able to help predict whether an asteroid is projected to hit the Earth, according to Macri. Macri said that if an asteroid the size of Kyle Field hit the Earth, the impact itself wouldn’t be the problem; the dust thrown up would eventually black out the whole planet.

    The LSST is currently being developed as a worldwide project. The LSST headquarters are in Tucson, Arizona. Scientists at SLAC National Accelerator Laboratory at Stanford University are developing the camera, which will be the largest digital camera ever assembled. The telescope itself is being built in Chile.

    Suntzeff, who picked the mountain in Chile on which to build the telescope, was actually one of the first people involved with the project approximately twenty years ago. According to Suntzeff, the LSST has brought together the astronomy and statistics departments.

    “It’s unbelievable how much data is going to come from this telescope,” Suntzeff said. “And in order to sift through the data we can’t just be normal astronomers. We have to use advanced mathematical and statistical techniques. So we’ve begun a program in collaboration with the statistics department in studying something that’s called astrostatistics. And astrostatistics will allow us to have tools to allow us to search very large databases for objects of interest.”

    Currently, these TAMU professors are preparing their graduate students for what is to come in the next few years with the completion of the LSST.

    “I am preparing some software, and I am thinking about getting students to work on LSST-related problems, in particular to identify objects that may be interesting to us,” said Lifan Wang, professor of physics and astronomy at TAMU and member of the LSST dark energy science collaboration.

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition
    Located in College Station, Texas, about 90 miles northwest of Houston and within a two- to three-hour drive of Austin and Dallas.
    Home to more than 50,000 students, ranking as the sixth-largest university in the country, with more than 370,000 former students worldwide.
    Holds membership in the prestigious Association of American Universities, one of only 62 institutions with this distinction.
    Generates more than $820 million in research expenditures through its faculty-researchers.
    Has an endowment valued at more than $5 billion, which ranks fourth among U.S. public universities and 10th overall.

     
  • richardmitnick 11:49 am on January 23, 2018 Permalink | Reply
    Tags: Elena Belova, Theoretical physicist Elena Belova named to editorial board of Physics of Plasmas

    From PPPL: Women in STEM – “Theoretical physicist Elena Belova named to editorial board of Physics of Plasmas” 


    PPPL

    January 22, 2018
    John Greenwald

    Elena Belova. (Photo by Elle Starkman/Office of Communications).

    Elena Belova, a principal research physicist in the Theory Department at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL), has been named to the editorial board of Physics of Plasmas, a monthly peer-reviewed scientific journal published by the American Institute of Physics. Duties of board members, selected for their high degree of technical expertise, range from suggesting topics for special sections to adjudicating impasses between authors and referees that arise over manuscripts.

    Belova, a PPPL physicist for more than 20 years, is an expert at developing computer codes, such as simulations of wave-particle interactions and models of global stability in fusion plasmas, that are widely used in fusion research. “I like code development because it is algorithmic, and codes can really help to understand the experimental results,” she said. “But it still surprises me when theory works the way it’s supposed to. I also like that you can perform the simulation and look ‘inside’ the device – which is not always possible in a real experiment. Visualizing things through computer simulations allows one to ‘see a picture,’ which is, as they say, better than a thousand words.”

    Fusing of light elements

    Fusion, the reaction that powers the sun and most stars, is the fusing of light elements that generates massive amounts of energy. Researchers seek to replicate fusion on Earth for a virtually inexhaustible supply of energy by controlling plasma, the hot, charged state of matter composed of electrons and atomic nuclei, or ions, that fuels fusion reactions. Theorists create computer models that simulate the processes involved, which experiments then test in attempts to confirm.

    Recent experiments at PPPL validated a code of Belova’s that predicts a way to suppress a type of plasma instability that can halt fusion production. The method could prove useful to ITER, the international fusion facility under construction in France to demonstrate the ability to produce 10 times more power than it consumes.

    Second female physicist in Theory Department

    Belova, 53, joined PPPL in 1997 as the second female physicist to work in the Theory Department. Among her honors has been the Katherine E. Weimer Award for Women in Plasma Physics, a national honor named for the first woman theorist at PPPL, which Belova received in 2005.

    As a high school student in the former Soviet Union, Belova grew interested in mathematics and spent three years in an after-school program sponsored by the Moscow Institute of Physics and Technology. “In math you don’t really need to know anything,” she said. “You just solve puzzles. At least, this is what I thought in high school.”

    She earned a bachelor’s degree in applied mathematics in 1984 and a master’s degree in plasma physics in 1987, both from the Institute, though relatives had tried to persuade her not to switch subjects. “They said physics was too hard for a woman,” she recalled.

    But math had become too abstract for Belova and physics, while more difficult, was also more practical and exciting. She worked as a research engineer at the Space Research Institute in Moscow from 1987 to 1989 and as a junior research scientist from 1989 to 1992. While space physics is no longer her subject, her knowledge has served in good stead. “There are many common approaches in fusion and space plasma physics,” she said.

    Arrived in U.S. in 1992

    Belova and her husband, also a physicist, left Russia for the United States in 1992. She had been accepted in the graduate program at Dartmouth College, and became a research assistant in the Department of Physics and Astronomy. While she had learned technical English terms as an undergraduate student in Russia, her command of the broader language was still a bit shaky. “In my first year as a teaching assistant I would sometimes just write equations on the board and would point them out to students rather than trying to explain,” she said.

    After earning her doctorate in physics from Dartmouth in 1997 she worked as an associate research physicist at PPPL until 2004, a research physicist until 2008 and a principal research physicist since then. Among the scientific articles she has written at the Lab have been 15 invited papers for workshops and conferences around the world.

    Belova is the fourth PPPL staff member to be appointed to an editorial position in recent years. Richard Hawryluk, interim director of the laboratory, chairs the editorial board of the journal Nuclear Fusion; David Gates, principal research physicist and Stellarator Physics Division Head at PPPL, is editor-in-chief of the new online journal Plasma; and Igor Kaganovich, principal research physicist and deputy head of the PPPL Theory Department, serves as associate editor of Physics of Plasmas.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition


    PPPL campus

    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University. PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

     
  • richardmitnick 11:19 am on January 23, 2018 Permalink | Reply
    Tags: Rutgers Scientists Discover ‘Legos of Life’

    From Rutgers: “Rutgers Scientists Discover ‘Legos of Life'” 

    Rutgers University

    THIS POST IS DEDICATED TO J.L. FROM HP AND RUTGERS. I HOPE HE SEES IT.

    January 21, 2018
    Todd B. Bates

    Deep dive into the 3D structures of proteins reveals key building blocks.

    Rutgers researchers identified a small set of simple protein building blocks (left) that likely existed at the earliest stages of life’s history. Over billions of years, these “Legos of life” were assembled and repurposed by evolution into complex proteins (right) that are at the core of modern metabolism.
    Image: Vikas Nanda/Rutgers Robert Wood Johnson Medical School.

    Rutgers scientists have found the “Legos of life” – four core chemical structures that can be stacked together to build the myriad proteins inside every organism – after smashing and dissecting nearly 10,000 proteins to understand their component parts.

    The four building blocks make energy available for humans and all other living organisms, according to a study published online today in the Proceedings of the National Academy of Sciences.

    The study’s findings could lead to applications of these stackable, organic building blocks for biomedical engineering and therapeutic proteins and the development of safer, more efficient industrial and energy catalysts – proteins and enzymes that, like tireless robots, can repeatedly carry out chemical reactions and transfer energy to perform tasks.

    “Understanding these parts and how they are connected to each other within the existing proteins could help us understand how to design new catalysts that could potentially split water, fix nitrogen or do other things that are really important for society,” said Paul G. Falkowski, study co-author and a distinguished professor who leads the Environmental Biophysics and Molecular Ecology Laboratory at Rutgers University–New Brunswick.

    The scientists’ research was done on computers, using data on the 3D atomic structures of 9,500 proteins in the RCSB Protein Data Bank based at Rutgers, a rich source of information about how proteins work and evolve.

    “We don’t have a fossil record of what proteins looked like 4 billion years ago, so we have to take what we have today and start walking backwards, trying to imagine what these proteins looked like,” said Vikas Nanda, senior author of the study and an associate professor in the Department of Biochemistry and Molecular Biology at Rutgers’ Robert Wood Johnson Medical School, within Rutgers Biomedical and Health Sciences. “The study is the first time we’ve been able to take something with thousands of amino acids and break it down into reasonable chunks that could have had primordial origins.”

    The identification of four fundamental building blocks for all proteins is just a beginning. Nanda said future research may discover five or 10 more building blocks that serve as biological Legos.

    “Now we need to understand how to put these parts together to make more interesting functional molecules,” he said. “That’s the next grand challenge.”

    The study’s lead author is Hagai Raanana, a post-doctoral associate in the Environmental Biophysics and Molecular Ecology Program. Co-authors include Douglas H. Pike, a doctoral student at the Rutgers Institute for Quantitative Biomedicine, and Eli K. Moore, a post-doctoral associate in the Environmental Biophysics and Molecular Ecology Program.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    rutgers-campus

    Rutgers, The State University of New Jersey, is a leading national research university and the state’s preeminent, comprehensive public institution of higher education. Rutgers is dedicated to teaching that meets the highest standards of excellence; to conducting research that breaks new ground; and to providing services, solutions, and clinical care that help individuals and the local, national, and global communities where they live.

    Founded in 1766, Rutgers teaches across the full educational spectrum: preschool to precollege; undergraduate to graduate; postdoctoral fellowships to residencies; and continuing education for professional and personal advancement.

    Rutgers smaller
    Please give us back our original beautiful seal which the University stole away from us.
    As a ’67 graduate of University College, second in my class, I am proud to be a member of Alpha Sigma Lambda, the National Honor Society for non-traditional students.

     
  • richardmitnick 10:57 am on January 23, 2018 Permalink | Reply
    Tags: Gravitational wave source GW170817, NASA Missions Catch First Light from a Gravitational-Wave Event

    From Chandra: “NASA Missions Catch First Light from a Gravitational-Wave Event” 

    NASA Chandra Banner

    NASA Chandra Telescope

    NASA Chandra

    October 16, 2017 [Just appeared in social media.]

    Credit: X-ray: NASA/CXC/Northwestern U./W. Fong & R. Margutti et al. & NASA/GSFC/E. Troja et al.; Optical: NASA/STScI

    Astronomers have used Chandra to make the first X-ray detection of a gravitational wave source.

    This is the first evidence that the aftermath of gravitational wave events can also emit X-rays.

    The data indicate this event was the merger of two neutron stars that produced a jet pointing away from Earth.

    Chandra provides the missing observational link between short gamma-ray bursts (GRBs) and gravitational waves from neutron star mergers.

    Astronomers have used NASA’s Chandra X-ray Observatory to make the first X-ray detection of a gravitational wave source. Chandra was one of multiple observatories to detect the aftermath of this gravitational wave event, the first to produce an electromagnetic signal of any type. This discovery represents the beginning of a new era in astrophysics.

    The gravitational wave source, GW170817, was detected with the Advanced Laser Interferometer Gravitational-Wave Observatory, or LIGO, at 8:41 a.m. EDT on Thursday, August 17, 2017.


    VIRGO Gravitational Wave interferometer, near Pisa, Italy

    Caltech/MIT Advanced aLigo Hanford, WA, USA installation


    Caltech/MIT Advanced aLigo detector installation Livingston, LA, USA

    Cornell SXS, the Simulating eXtreme Spacetimes (SXS) project

    Gravitational waves. Credit: MPI for Gravitational Physics/W.Benger-Zib

    ESA/eLISA the future of gravitational wave research

    Skymap showing how adding Virgo to LIGO helps in reducing the size of the likely source region in the sky. (Credit: Giuseppe Greco, Virgo Urbino group)

    Two seconds later, NASA’s Fermi Gamma-ray Burst Monitor (GBM) detected a weak pulse of gamma rays.

    NASA/Fermi LAT


    NASA/Fermi Gamma Ray Space Telescope

    Later that morning, LIGO scientists announced that GW170817 had the characteristics of a merger of two neutron stars.

    During the evening of August 17, multiple teams of astronomers using ground-based telescopes reported the detection of a new source of optical and infrared light in NGC 4993, a galaxy located about 130 million light-years from Earth. The position of the new optical and infrared source agreed with the positions of the Fermi and gravitational wave sources. The latter was refined by combining information from LIGO and its European counterpart, Virgo.

    Over the following two weeks, Chandra observed NGC 4993 and the source GW170817 four separate times. In the first observation, on August 19th (Principal Investigator: Wen-fai Fong from Northwestern University in Evanston, Illinois), no X-rays were detected at the location of GW170817. This observation was obtained remarkably quickly, only 2.3 days after the gravitational wave source was detected.

    On August 26, Chandra observed GW170817 again and this time, X-rays were seen for the first time (PI: Eleonora Troja from Goddard Space Flight Center in Greenbelt, MD, and the University of Maryland, College Park). This new X-ray source was located at the exact position of the optical and infrared source.

    “This Chandra detection is very important because it is the first evidence that sources of gravitational waves are also sources of X-ray emission,” said Troja. “This detection is teaching us a great deal of information about the collision and its remnant. It helps to give us an important confirmation that gamma-ray bursts are beamed into narrow jets.”

    The accompanying graphic shows both the Chandra non-detection, or upper limit of X-rays from GW170817 on August 19th, and the subsequent detection on August 26th, in the two sides of the inset box. The main panel of the graphic is the Hubble Space Telescope image of NGC 4993, which includes data taken on August 22nd. The variable optical source corresponding to GW170817 is located in the center of the circle in the Hubble image.

    Chandra observed GW170817 again on September 1st (PI Eleonora Troja) and September 2nd (PI: Daryl Haggard from McGill University in Montreal, Canada), when the source appeared to have roughly the same level of X-ray brightness as the August 26 observation.

    The evolution of the source’s X-ray brightness with time matches that predicted by theoretical models of a short gamma-ray burst (GRB). During such an event, a burst of X-rays and gamma rays is generated by a narrow jet, or beam, of high-energy particles produced by the merger of two neutron stars. The initial non-detection by Chandra followed by the detections shows that the X-ray emission from GW170817 is consistent with the afterglow from a GRB viewed “off-axis,” that is, with the jet not pointing directly toward Earth. This is the first time astronomers have ever detected an off-axis short GRB.

    “After some thought, we realized that the initial non-detection by Chandra perfectly matches with what we expect,” said Fong. “The fact that we did not see anything at first gives us a very good handle on the orientation and geometry of the system.”

    Illustration Credit: NASA/CXC/K.DiVona

    The researchers think that initially the jet was narrow, with Chandra viewing it from the side. However, as time passed the material in the jet slowed down and widened as it slammed into surrounding material, causing the X-ray emission to rise as the jet came into direct view. The Chandra data allow researchers to estimate the angle between the jet and our line of sight. The three different Chandra observing teams each estimate angles between 20 and 60 degrees. Future observations may help refine these estimates.

    The detection of this off-axis short GRB helps explain the weakness of the gamma-ray signal detected with Fermi GBM for a burst that is so close by. Because our telescopes are not looking straight down the barrel of the jet as they have for other short GRBs, the gamma-ray signal is much fainter.

    The optical and infrared light is likely caused by the radioactive glow when heavy elements such as gold and platinum are produced in the material ejected by the neutron star merger. This glow had been predicted to occur after neutron stars merged.

    By detecting an off-axis short GRB at the location of the radioactive glow, the Chandra observations provide the missing observational link between short GRBs and gravitational waves from neutron star mergers.

    This is the first time astronomers have had all of the necessary pieces of information about merging neutron stars: the production of gravitational waves followed by signals in gamma rays, X-rays, and optical and infrared light, all agreeing with predictions for a short GRB viewed off-axis.

    “This is a big deal because it’s an entirely new level of knowledge,” said Haggard. “This discovery allows us to link this gravitational wave source up to all the rest of astrophysics, stars, galaxies, explosions, growing massive black holes, and of course neutron star mergers.”

    Papers describing these results have been accepted for publication in Nature (Troja et al.), and The Astrophysical Journal Letters (Haggard et al. and Margutti et al.). Raffaella Margutti is a collaborator of Fong’s, also from Northwestern.

    See the full article here .

    If you have the time, please visit the very best produced work, from UCSC, on this detection:
    https://sciencesprings.wordpress.com/2017/10/20/from-ucsc-neutron-stars-gravitational-waves-and-all-the-gold-in-the-universe/

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA’s Marshall Space Flight Center in Huntsville, Ala., manages the Chandra program for NASA’s Science Mission Directorate in Washington. The Smithsonian Astrophysical Observatory controls Chandra’s science and flight operations from Cambridge, Mass.

     
  • richardmitnick 10:12 am on January 23, 2018 Permalink | Reply
    Tags: Online tool calculates reproducibility scores of PubMed papers

    From Science: “Online tool calculates reproducibility scores of PubMed papers” 

    AAAS
    Science Magazine

    Jan. 22, 2018
    Dalmeet Singh Chawla

    Scientific societies are seeking new tools to measure the reproducibility of published research findings, amid concerns that many cannot be reproduced independently. National Eye Institute, National Institutes of Health/Flickr (CC BY NC 2.0).

    A new online tool unveiled 19 January measures the reproducibility of published scientific papers by analyzing data about articles that cite them.

    The software comes at a time when scientific societies and journals are alarmed by evidence that findings in many published articles are not reproducible and are struggling to find reliable methods to evaluate whether they are.

    The tool, developed by the for-profit firm Verum Analytics in New Haven, Connecticut, generates a metric called the r-factor that indicates the veracity of a journal article based on the number of other studies that confirm or refute its findings. The r-factor metric has drawn much criticism from academics who said its relatively simple approach might not be sufficient to solve the multifaceted problem that measuring reproducibility presents.

    Early reaction to the new tool suggests that Verum has not fully allayed those concerns. The Verum developers concede the tool still has limitations; they said they released it to receive feedback about how well it works and how it could be improved. Verum has developed the project as a labor of love, and Co-Founder Josh Nicholson said he hopes the release of the early version tool will attract potential funders to help improve it.

    Verum announced the methodology underlying the tool, based on the r-factor, in a preprint paper [bioRxiv] last August and refined it in the new tool. The tool relies solely on data from freely available research papers in the popular biomedical search engine PubMed.

    Nicholson and his colleagues developed the tool by first manually examining 48,000 excerpts of text in articles that cited other published papers. Verum’s workers classified each of these passages as either confirming, refuting, or mentioning the other papers. Verum then used these classifications to train an algorithm to autonomously recognize each kind of passage in papers outside this sample group.

    Based on a sample of about 10,000 excerpts, Verum’s developers claim their tool classifies passages correctly 93% of the time. But it detects mentioning citations much more precisely than confirming or refuting ones, which were much less common in their sample. The vast majority of articles mention previous studies without confirming or refuting their claims; only about 8% of all citations are confirmatory and only about 1% are refuting.

    The tool’s users can apply the algorithm by entering an article’s unique PubMed identifier code. The algorithm scours PubMed to find articles that cite the paper of interest and all passages that confirm, refute, or mention the paper. The tool then generates an r-factor score for the paper by dividing the number of confirming papers by the sum of the confirming and refuting papers.

    This formula tends to assign high scores, close to 1, to papers seldom refuted. The low number of refuting papers in Verum’s database means that many articles have r-factors of 1—which tends to limit the tool’s usefulness. (R-factors also carry a subscript indicating the total number of studies that attempted to replicate the paper—an r-factor of 1 with a subscript of 16 means the tool scanned 16 replication studies.)
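    In code, the scoring arithmetic is simple. Here is a minimal sketch; the function name and output format are illustrative, and only the formula itself (confirming divided by confirming plus refuting, with the replication count as a subscript) comes from the article.

    def r_factor(confirming, refuting):
        # r-factor = confirming / (confirming + refuting); the total count of
        # confirming and refuting studies becomes the subscript.
        replications = confirming + refuting
        if replications == 0:
            raise ValueError("no confirming or refuting citations found")
        return confirming / replications, replications

    score, n = r_factor(confirming=16, refuting=0)
    print(f"r-factor: {score:g} with subscript {n}")   # r-factor: 1 with subscript 16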

    Psychologist Christopher Chartier of Ashland University in Ohio, who developed an online platform that assists with the logistics of replication studies, tried the new tool at the request of ScienceInsider. “It appears to do what it claims to do, but I don’t find much value in the results,” he says. One reason, he says, is that r-factors may be skewed by a publication bias—where scholarly journals favorably publish positive results over negative results. “We simply can’t trust the published literature to be a reliable and valid indicator of a finding’s replicability,” Chartier said.

    “Attempting to estimate the robustness of a published research finding is notoriously difficult,” said Marcus Munafò, a biological psychologist at the University of Bristol in the United Kingdom and a key figure in tackling irreproducibility [Nature Human Behaviour]. It’s difficult, he said, to know the precision or quality of individual confirmatory or refuting studies without reading them.

    Another limitation in Verum’s tool is that because it trawls only freely available papers on PubMed, it misses paywalled scholarly literature.

    Still, the Verum team will press on. Next on their agenda is to increase the number of sample papers used to train their algorithm to improve its accuracy in recognizing confirming and refuting papers.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 9:35 am on January 23, 2018 Permalink | Reply
    Tags: Tsunami warnings issued – later canceled – after powerful Alaska quake

    From EarthSky: “Tsunami warnings issued – later canceled – after powerful Alaska quake” 


    EarthSky

    January 23, 2018
    Deborah Byrd

    A powerful earthquake struck 174 miles (280 km) southeast of Kodiak, Alaska, early this morning. Tsunami watches or warnings were issued – later canceled – for western North America and Hawaii.

    A National Weather Service map showing the red tsunami warning zone as well as the yellow tsunami watch zone of January 23, 2018. The original watches and warnings ran south from Alaska into Washington and California and also included Hawaii. At this writing (12:30 UTC, or 6:30 a.m. CST), no tsunami watch, warning, or advisory is in effect, according to the Pacific Tsunami Warning Center.

    The U.S. Geological Survey (USGS) reported a very large earthquake this morning in the Gulf of Alaska. It was originally reported at 8.2 magnitude, then 7.9 magnitude, then downgraded further to 7.0; even at the lowest number, it’s still a powerful quake (though much less powerful than originally reported). The earthquake struck on January 23, 2018 at 9:31 UTC (3:31 a.m. CST). It occurred 174 miles (280 km) southeast of Kodiak, Alaska.

    The Pacific Tsunami Warning Center (PTWC) issued tsunami watches or warnings for large portions of the Pacific, including a watch for the U.S. west coast from Washington to California as well as Hawaii, and a tsunami warning for the coast of Alaska and the Canadian province of British Columbia. Subsequently, all watches and warnings were cancelled, but not before a mass of confusion on Twitter and other news outlets.

    There were reports of some panic in Kodiak, Alaska (sirens blaring, people being woken from sleep), near the quake’s epicenter. Waters were then said to be receding in Kodiak, and waves were said to have been “small.”

    We have not yet seen reports of damages or injuries from this event.

    The PTWC – which was still in its calculation process when this advisory was issued at 10:17 UTC (4:17 a.m. CST) today – said tsunami waves were originally forecast to be less than one foot (0.3 meters) above the tide level for the coasts of Guam, Hawaii and northwestern Hawaiian Islands, Japan, Johnston Atoll, Mexico, Midway Island, Northern Marianas, Russia, and Wake Island.

    This story is still being updated.

    “Aftershocks will follow an earthquake this size. Residents of both Alaska and Canada should be prepared. Sometimes aftershocks can be even stronger than the initial earthquake.” – Daniel McFarland

    Large earthquakes are common in the Pacific-North America plate boundary region south of Alaska. USGS explained:

    The January 23, 2018 M 7.9 earthquake southeast of Kodiak Island in the Gulf of Alaska occurred as the result of strike slip faulting within the shallow lithosphere of the Pacific plate … At the location of the earthquake, the Pacific plate is converging with the North America plate at a rate of approximately 59 mm/yr towards the north-northwest. The Pacific plate subducts beneath the North America plate at the Alaska-Aleutians Trench, about 90 km to the northwest of today’s earthquake. The location and mechanism of the January 23rd earthquake are consistent with it occurring on a fault system within the Pacific plate before it subducts, rather than on the plate boundary between the Pacific and North America plates further to the northwest.

    Bottom line: A 7.9-magnitude earthquake struck on January 23, 2018 in the Gulf of Alaska. Tsunami watches and warnings were issued, then canceled. The situation is still unfolding.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Deborah Byrd created the EarthSky radio series in 1991 and founded EarthSky.org in 1994. Today, she serves as Editor-in-Chief of this website. She has won a galaxy of awards from the broadcasting and science communities, including having an asteroid named 3505 Byrd in her honor. A science communicator and educator since 1976, Byrd believes in science as a force for good in the world and a vital tool for the 21st century. “Being an EarthSky editor is like hosting a big global party for cool nature-lovers,” she says.

     
  • richardmitnick 9:06 am on January 23, 2018 Permalink | Reply

    From LBNL: “It All Starts With a ‘Spark’: Berkeley Lab Delivers Injector That Will Drive X-Ray Laser Upgrade”

    Berkeley Logo

    Berkeley Lab

    January 22, 2018
    Glenn Roberts, Jr.
    geroberts@lbl.gov
    (510) 486-5582

    Unique device will create bunches of electrons to stimulate million-per-second X-ray pulses.

    Joe Wallig, left, a mechanical engineering associate, and Brian Reynolds, a mechanical technician, work on the final assembly of the LCLS-II injector gun in a specially designed clean room at Berkeley Lab in August. (Credit: Marilyn Chung/Berkeley Lab)

    Every powerful X-ray pulse produced for experiments at a next-generation laser project, now under construction, will start with a “spark” – a burst of electrons emitted when a pulse of ultraviolet light strikes a 1-millimeter-wide spot on a specially coated surface.

    A team at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) designed and built a unique version of a device, called an injector gun, that can produce a steady stream of these electron bunches that will ultimately be used to produce brilliant X-ray laser pulses at a rapid-fire rate of up to 1 million per second.

    The injector arrived Jan. 22 at SLAC National Accelerator Laboratory (SLAC) in Menlo Park, California, the site of the Linac Coherent Light Source II (LCLS-II), an X-ray free-electron laser project.


    Stanford/SLAC Campus


    SLAC/LCLS II projected view

    An electron beam travels through a niobium cavity, a key component of a future LCLS-II X-ray laser, in this illustration. Kept at minus 456 degrees Fahrenheit, these cavities will power a highly energetic electron beam that will create up to 1 million X-ray flashes per second. (Credit: SLAC National Accelerator Laboratory)

    Getting up to speed

    The injector will be one of the first operating pieces of the new X-ray laser. Initial testing of the injector will begin shortly after its installation.

    The injector will feed electron bunches into a superconducting particle accelerator that must be supercooled to extremely low temperatures to conduct electricity with nearly zero loss. The accelerated electron bunches will then be used to produce X-ray laser pulses.

    Scientists will employ the X-ray pulses to explore the interaction of light and matter in new ways, producing sequences of snapshots that can create atomic- and molecular-scale “movies,” for example, to illuminate chemical changes, magnetic effects, and other phenomena that occur in just quadrillionths (million-billionths) of a second.

    This new laser will complement experiments at SLAC’s existing X-ray laser, which launched in 2009 and fires up to 120 X-ray pulses per second. That laser will also be upgraded as a part of the LCLS-II project.

    SLAC/LCLS

    A rendering of the completed injector gun and related beam line equipment. (Credit: Greg Stewart/SLAC National Accelerator Laboratory)

    The injector gun project teamed scientists from Berkeley Lab’s Accelerator Technology and Applied Physics Division with engineers and technologists from the Engineering Division in what Engineering Division Director Henrik von der Lippe described as “yet another success story from our longstanding partnership – (this was) a very challenging device to design and build.”

    “The completion of the LCLS-II injector project is the culmination of more than three years of effort,” added Steve Virostek, a Berkeley Lab senior engineer who led the gun construction. The Berkeley Lab team included mechanical engineers, physicists, radio-frequency engineers, mechanical designers, fabrication shop personnel, and assembly technicians.

    “Virtually everyone in the Lab’s main fabrication shop made vital contributions,” he added, in the areas of machining, welding, brazing, ultrahigh-vacuum cleaning, and precision measurements.

    The injector source is one of Berkeley Lab’s major contributions to the LCLS-II project, and builds upon its expertise in similar electron gun designs, including the completion of a prototype gun. Almost a decade ago, Berkeley Lab researchers began building a prototype for the injector system in a beam-testing area at the Lab’s Advanced Light Source.

    LBNL/ALS

    That successful effort, dubbed APEX (Advanced Photoinjector Experiment), produced a working injector that has since been repurposed for experiments that use its electron beam to study ultrafast processes at the atomic scale.

    The APEX electron gun and test beamline at the ALS Beam Test Facility. APEX team members include (from left) Daniele Filippetto, Fernando Sannibale, and John Staples of the Accelerator and Fusion Research Division and Russell Wells of the Engineering Division. (Photo by Roy Kaltschmidt, Lawrence Berkeley National Laboratory)

    4
    Daniele Filippetto, a Berkeley Lab scientist, works on the High-Repetition-rate Electron Scattering apparatus (HiRES), which will function like an ultrafast electron camera. HiRES is a new capability that builds on the Advanced Photoinjector Experiment (APEX), a prototype electron source for advanced X-ray lasers. (Roy Kaltschmidt/Berkeley Lab)

    Fernando Sannibale, Head of Accelerator Physics at the ALS, led the development of the prototype injector gun.

    5
    Krista Williams, a mechanical technician, works on the final assembly of LCLS-II injector components on Jan. 11. (Credit: Marilyn Chung/Berkeley Lab)

    “This is a ringing affirmation of the importance of basic technology R&D,” said Wim Leemans, director of Berkeley Lab’s Accelerator Technology and Applied Physics Division. “We knew that the users at next-generation light sources would need photon beams with exquisite characteristics, which led to highly demanding electron-beam requirements. As LCLS-II was being defined, we had an excellent team already working on a source that could meet those requirements.”

    The lessons learned with APEX inspired several design changes that are incorporated in the LCLS-II injector, such as an improved cooling system to prevent overheating and metal deformations, as well as innovative cleaning processes.

    “We’re looking forward to continued collaboration with Berkeley Lab during commissioning of the gun,” said SLAC’s John Galayda, LCLS-II project director. “Though I am sure we will learn a lot during its first operation at SLAC, Berkeley Lab’s operating experience with APEX has put LCLS-II miles ahead on its way to achieving its performance and reliability objectives.”

    Mike Dunne, LCLS director at SLAC, added, “The performance of the injector gun is a critical component that drives the overall operation of our X-ray laser facility, so we greatly look forward to seeing this system in operation at SLAC. The leap from 120 pulses per second to 1 million per second will be truly transformational for our science program.”

    How it works

    Like a battery, the injector has components called an anode and a cathode. These components form a vacuum-sealed central copper chamber known as a radio-frequency accelerating cavity, which sends out electron bunches in a carefully controlled way.

    The cavity is precisely tuned to operate at very high frequencies and is ringed with an array of channels that allow it to be water-cooled, preventing overheating caused by the radio-frequency currents flowing in the copper of the injector’s central cavity.
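
    The article doesn’t give the cavity’s operating frequency or dimensions, but the basic scaling between cavity size and resonant frequency can be sketched with the textbook formula for an idealized cylindrical “pillbox” cavity, whose fundamental (TM010) mode resonates at f = c·j01/(2πR). The real gun cavity is a more complex, water-cooled structure, and the radius below is a hypothetical example value:

    import math

    C = 299_792_458.0   # speed of light, m/s
    J01 = 2.404826      # first zero of the Bessel function J0

    def pillbox_tm010_freq_hz(radius_m: float) -> float:
        """TM010 resonant frequency of an ideal pillbox cavity (independent of its length)."""
        return C * J01 / (2 * math.pi * radius_m)

    radius_m = 0.62  # hypothetical example radius
    print(f"R = {radius_m} m  ->  f = {pillbox_tm010_freq_hz(radius_m) / 1e6:.1f} MHz")

    Note the inverse scaling: halving the cavity radius doubles the resonant frequency, which is why precision machining matters so much for tuning.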

    7
    A copper cone structure inside the injector gun’s central cavity. (Credit: Marilyn Chung/Berkeley Lab)

    A copper cone structure within the injector’s central cavity is tipped with a specially coated and polished slug of molybdenum known as a photocathode. Light from an infrared laser is converted to ultraviolet (UV) laser light, and this UV light is steered by mirrors onto a small spot on the cathode that is coated with cesium telluride (Cs2Te), exciting its electrons.
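
    Underlying that step is ordinary photoemission: the cathode releases an electron only when a photon’s energy h·c/λ exceeds the material’s work function, which is why the infrared light must first be converted to UV. A minimal sketch; the laser wavelengths and the Cs2Te work function below are illustrative assumptions, not values from the article:

    # Photon energy vs. work function: emission happens only above threshold.
    H_C_EV_NM = 1239.842        # h*c in eV*nm

    def photon_energy_ev(wavelength_nm: float) -> float:
        return H_C_EV_NM / wavelength_nm

    ir_nm = 1030.0              # assumed infrared drive-laser wavelength
    uv_nm = ir_nm / 4           # two frequency-doubling stages -> ~257.5 nm UV
    work_function_ev = 3.5      # approximate work function of Cs2Te

    print(f"IR photon: {photon_energy_ev(ir_nm):.2f} eV (below threshold, no emission)")
    print(f"UV photon: {photon_energy_ev(uv_nm):.2f} eV (above threshold, electrons emitted)")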

    These electrons are formed into bunches and accelerated by the cavity, which will, in turn, connect to the superconducting accelerator. After this electron beam is accelerated to nearly the speed of light, it will be wiggled within a series of powerful magnetic structures called undulator segments, stimulating the electrons to emit X-ray light that is delivered to experiments.
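
    The wavelength of that undulator light follows the standard on-axis resonance relation λ_x = (λ_u / 2γ²)(1 + K²/2), so a higher-energy beam (larger Lorentz factor γ) produces shorter, harder X-rays. A rough sketch with purely illustrative parameters, none of which come from the article:

    # Undulator resonance: lambda_x = lambda_u / (2 gamma^2) * (1 + K^2 / 2).
    E_REST_MEV = 0.511  # electron rest energy, MeV

    def undulator_wavelength_m(beam_energy_gev: float, period_m: float, k: float) -> float:
        gamma = beam_energy_gev * 1000.0 / E_REST_MEV
        return period_m / (2 * gamma**2) * (1 + k**2 / 2)

    # Illustrative numbers only: a 4 GeV beam, 26 mm undulator period, K = 1.7
    lam_m = undulator_wavelength_m(4.0, 0.026, 1.7)
    print(f"Resonant X-ray wavelength: {lam_m * 1e9:.3f} nm")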

    Precision engineering and spotless cleaning

    Besides the precision engineering that was essential for the injector, Berkeley Lab researchers also developed processes for eliminating contaminants from components through a painstaking polishing process and by blasting them with dry ice pellets.

    The final cleaning and assembly of the injector’s most critical components was performed in filtered-air clean rooms by employees wearing full-body protective clothing to further reduce contaminants – the highest-purity clean room used in the final assembly is actually housed within a larger clean room at Berkeley Lab.

    “The superconducting linear accelerator is extremely sensitive to particulates,” such as dust and other types of tiny particles, Virostek said. “Its accelerating cells can become non-usable, so we had to go through quite a few iterations of planning to clean and assemble our system with as few particulates as possible.”

    8
    Joe Wallig, a mechanical engineering associate, prepares a metal ring component of the injector gun for installation using a jet of high-purity dry ice in a clean room. (Credit: Marilyn Chung/Berkeley Lab)

    The dry ice-based cleaning processes function like sandblasting, creating tiny explosions that cleanse the surfaces of components by ejecting contaminants. In one form of this process, Berkeley Lab technicians used a specialized nozzle to jet a very thin stream of high-purity dry ice.

    After assembly, the injector was vacuum-sealed and filled with nitrogen gas to stabilize it for shipment. The injector’s cathodes degrade over time, and the injector is equipped with a “suitcase” of cathodes, also under vacuum, that allows cathodes to be swapped out without the need to open up the device.

    “Every time you open it up you risk contamination,” Virostek explained. Once all of the cathodes in a suitcase are used up, the suitcase must be replaced with a fresh set of cathodes.

    The overall operation and tuning of the injector gun will be remotely controlled, and a variety of diagnostic equipment is built into the injector to help ensure smooth running.

    Even before the new injector is installed, Berkeley Lab has proposed a design study for a next-generation injector that could generate electron bunches with more than double the output energy. This would enable higher-resolution X-ray-based images for certain types of experiments.

    Berkeley Lab Contributions to LCLS-II

    John Corlett, Berkeley Lab’s senior team leader, worked closely with the LCLS-II project managers at SLAC and with Berkeley Lab managers to bring the injector project to fruition.

    9
    Steve Virostek, a senior engineer who led the injector gun’s construction, inspects the mounted injector prior to shipment. (Credit: Marilyn Chung/Berkeley Lab)

    “In addition to the injector source, Berkeley Lab is also responsible for the undulator segments for both of the LCLS-II X-ray free-electron laser beamlines, for the accelerator physics modeling that will optimize their performance, and for technical leadership in the low-level radio-frequency controls systems that stabilize the superconducting linear accelerator fields,” Corlett noted.

    James Symons, Berkeley Lab’s associate director for physical sciences, said, “The LCLS-II project has provided a tremendous example of how multiple laboratories can bring together their complementary strengths to benefit the broader scientific community. The capabilities of LCLS-II will lead to transformational understanding of chemical reactions, and I’m proud of our ability to contribute to this important national project.”

    LCLS-II is being built at SLAC with major technical contributions from Argonne National Laboratory, Fermilab, Jefferson Lab, Berkeley Lab, and Cornell University. Construction of LCLS-II is supported by DOE’s Office of Science.

    10
    Members of the LCLS-II injector gun team at Berkeley Lab. (Credit: Marilyn Chung/Berkeley Lab)

    View more photos of the injector gun and related equipment: here and here.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     