Tagged: Applied Research & Technology

  • richardmitnick 2:00 pm on January 23, 2018 Permalink | Reply
    Tags: Applied Research & Technology, Neural networks for neutrinos

    From Symmetry: “Neural networks for neutrinos” 

    Symmetry Mag

    Symmetry

    01/23/18
    Diana Kwon

    Artwork by Sandbox Studio, Chicago

    Scientists are using cutting-edge machine-learning techniques to analyze physics data.

    Particle physics and machine learning have long been intertwined.

    One of the earliest examples of this relationship dates back to the 1960s, when physicists were using bubble chambers to search for particles invisible to the naked eye. These vessels were filled with a clear liquid that was heated to just below its boiling point so that even the slightest boost in energy—for example, from a charged particle crashing into it—would cause it to bubble, an event that would trigger a camera to take a photograph.

    Female scanners often took on the job of inspecting these photographs for particle tracks. Physicist Paul Hough handed that task over to machines when he developed the Hough transform, a pattern recognition algorithm, to identify them.

    The computer science community later developed the Hough transform for use in applications such as computer vision, the attempt to train computers to replicate the complex function of the human eye.
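
    A sketch can make the idea concrete. Below is a minimal NumPy version of the Hough transform in its now-standard rho–theta parameterization (a later refinement of Hough’s original slope–intercept form). The image size, bin counts, and the diagonal “track” are illustrative choices, not details from the bubble chamber work.

    ```python
    import numpy as np

    def hough_lines(binary_image, n_thetas=180, n_rhos=200):
        """Minimal Hough transform: vote for (rho, theta) line parameters.

        Each 'on' pixel votes for every line that could pass through it;
        a straight particle track shows up as a peak in the accumulator.
        """
        h, w = binary_image.shape
        diag = np.hypot(h, w)
        thetas = np.linspace(0.0, np.pi, n_thetas, endpoint=False)
        rhos = np.linspace(-diag, diag, n_rhos)
        accumulator = np.zeros((n_rhos, n_thetas), dtype=np.int64)

        ys, xs = np.nonzero(binary_image)
        for x, y in zip(xs, ys):
            # rho = x*cos(theta) + y*sin(theta) for every candidate angle
            r = x * np.cos(thetas) + y * np.sin(thetas)
            r_idx = np.clip(np.searchsorted(rhos, r), 0, n_rhos - 1)
            accumulator[r_idx, np.arange(n_thetas)] += 1
        return accumulator, rhos, thetas

    # Example: a diagonal "track" of hits yields one strong peak.
    img = np.eye(64, dtype=bool)
    acc, rhos, thetas = hough_lines(img)
    rho_i, theta_i = np.unravel_index(acc.argmax(), acc.shape)
    print(f"strongest line: rho={rhos[rho_i]:.1f}, theta={np.degrees(thetas[theta_i]):.1f} deg")
    ```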

    “There’s always been a little bit of back and forth” between these two communities, says Mark Messier, a physicist at Indiana University.

    Since then, the field of machine learning has rapidly advanced. Deep learning, a form of artificial intelligence modeled after the human brain, has been implemented for a wide range of applications such as identifying faces, playing video games and even synthesizing life-like videos of politicians.

    Over the years, algorithms that help scientists pick interesting aberrations out of background data have been used in physics experiments such as BaBar at SLAC National Accelerator Laboratory and experiments at the Large Electron-Positron Collider at CERN and the Tevatron at Fermi National Accelerator Laboratory.

    SLAC BABAR

    CERN LEP Collider

    FNAL Tevatron

    FNAL/Tevatron map


    FNAL/Tevatron DZero detector


    FNAL/Tevatron CDF detector

    More recently, algorithms that learn to recognize patterns in large datasets have been handy for physicists studying hard-to-catch particles called neutrinos.

    This includes scientists on the NOvA experiment, who study a beam of neutrinos created at the US Department of Energy’s Fermilab near Chicago.

    FNAL NOvA Near Detector


    FNAL/NOvA experiment map

    The neutrinos stream straight through Earth to a 14,000-metric-ton detector filled with liquid scintillator sitting near the Canadian border in Minnesota.

    When a neutrino strikes the liquid scintillator, it releases a burst of particles. The detector collects information about the pattern and energy of those particles. Scientists use that information to figure out what happened in the original neutrino event.

    “Our job is almost like reconstructing a crime scene,” Messier says. “A neutrino interacts and leaves traces in the detector—we come along afterward and use what we can see to try and figure out what we can about the identity of the neutrino.”

    Over the last few years, scientists have started to use algorithms called convolutional neural networks (CNNs) to take on this task instead.

    CNNs, which are modeled after the mammalian visual cortex, are widely used in the technology industry—for example, to improve computer vision for self-driving cars. These networks are composed of multiple layers that act somewhat like filters: They contain densely interconnected nodes that possess numerical values, or weights, that are adjusted and refined as inputs pass through.

    “The ‘deep’ part comes from the fact that there are many layers to it,” explains Adam Aurisano, an assistant professor at the University of Cincinnati. “[With deep learning] you can take nearly raw data, and by pushing it through these stacks of learnable filters, you wind up extracting nearly optimal features.”

    For example, these algorithms can extract details associated with particle interactions of varying complexity from the “images” collected by recording different patterns of energy deposits in particle detectors.

    “Those stacks of filters have sort of sliced and diced the image and extracted physically meaningful bits of information that we would have tried to reconstruct before,” Aurisano says.
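
    To make “stacks of learnable filters” concrete, here is a toy convolutional classifier in PyTorch. The single-channel 100×80 input (standing in for an image of energy deposits) and the five interaction classes are assumptions made for this sketch, not the actual NOvA network.

    ```python
    import torch
    from torch import nn

    class ToyEventCNN(nn.Module):
        """Toy CNN mapping a detector 'image' to interaction-class scores."""
        def __init__(self, n_classes: int = 5):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1),  # learnable filters
                nn.ReLU(),
                nn.MaxPool2d(2),                             # 100x80 -> 50x40
                nn.Conv2d(16, 32, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.MaxPool2d(2),                             # 50x40 -> 25x20
            )
            self.classifier = nn.Linear(32 * 25 * 20, n_classes)

        def forward(self, x):
            x = self.features(x)              # stacked filters extract features
            return self.classifier(x.flatten(1))

    batch = torch.randn(8, 1, 100, 80)        # eight fake detector images
    logits = ToyEventCNN()(batch)
    print(logits.shape)                       # torch.Size([8, 5])
    ```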

    Although they can be used to classify events without recreating them, CNNs can also be used to reconstruct particle interactions using a method called semantic segmentation.

    When applied to an image of a table, for example, this method would reconstruct the object by tagging each pixel associated with it, Aurisano explains. In the same way, scientists can label each pixel associated with characteristics of neutrino interactions, then use algorithms to reconstruct the event.
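
    A minimal sketch of that per-pixel idea, again in PyTorch: a fully convolutional stack that emits a class score for every pixel instead of a single label per image. The four classes and the 128×128 input are placeholders, not MicroBooNE’s actual configuration.

    ```python
    import torch
    from torch import nn

    # Per-pixel classifier: no pooling or flattening, so spatial
    # resolution is preserved end to end.
    segmenter = nn.Sequential(
        nn.Conv2d(1, 16, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.Conv2d(16, 16, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.Conv2d(16, 4, kernel_size=1),  # 4 scores per pixel (e.g. background/track/shower/vertex)
    )

    image = torch.randn(1, 1, 128, 128)       # one fake detector image
    per_pixel_logits = segmenter(image)       # shape: (1, 4, 128, 128)
    labels = per_pixel_logits.argmax(dim=1)   # predicted class for each pixel
    print(labels.shape)                       # torch.Size([1, 128, 128])
    ```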

    Physicists are using this method to analyze data collected from the MicroBooNE neutrino detector.

    FNAL/MicrobooNE

    “The nice thing about this process is that you might find a cluster that’s made by your network that doesn’t fit in any interpretation in your model,” says Kazuhiro Terao, a scientist at SLAC National Accelerator Laboratory. “That might be new physics. So we could use these tools to find stuff that we might not understand.”

    Scientists working on other particle physics experiments, such as those at the Large Hadron Collider at CERN, are also using deep learning for data analysis.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    “All these big physics experiments are really very similar at the machine learning level,” says Pierre Baldi, a computer scientist at the University of California, Irvine. “It’s all images associated with these complex, very expensive detectors, and deep learning is the best method for extracting signal against some background noise.”

    Although most of the information is currently flowing from computer scientists to particle physicists, other communities may gain new tools and insights from these experimental applications as well.

    For example, according to Baldi, one question that’s currently being discussed is whether scientists can write software that works across all these physics experiments with a minimal amount of human tuning. If this goal were achieved, it could benefit other fields, such as biomedical imaging, that use deep learning as well. “[The algorithm] would look at the data and calibrate itself,” he says. “That’s an interesting challenge for machine learning methods.”

    Another future direction, Terao says, would be to get machines to ask questions—or, more simply, to be able to identify outliers and try to figure out how to explain them.

    “If the AI can form a question and come up with a logical sequence to solve it, then that replaces a human,” he says. “To me, the kind of AI you want to see is a physics researcher—one that can do scientific research.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 1:14 pm on January 23, 2018 Permalink | Reply
    Tags: Applied Research & Technology, Forensic DNA evidence, PROVEDIt database

    From Rutgers Camden: “Rutgers-Camden Houses PROVEDIt DNA Database” 

    Rutgers Camden

    1.23.18
    Jeanne Leong
    jeanne.leong@camden.rutgers.edu


    Forensic DNA evidence is a valuable tool in criminal investigations to link a suspect to the scene of a crime, but the process to make that determination is not so simple since the genetic material found at a crime scene often comes from more than one person.

    That task may become somewhat less challenging, thanks to a new database at Rutgers University–Camden that can help to bring more reliability to the interpretation of complex DNA evidence. This innovative new resource was developed by a research team led by Rutgers University–Camden professors Catherine Grgicak and Desmond Lun, and Ken Duffy of the National University of Ireland Maynooth.

    “Right now, there’s no standardization of tests,” says Grgicak, the Henry Rutgers Chair in chemistry at Rutgers–Camden. “There’s accreditation of crime labs, but that’s different from having standards set out for labs to meet some critical threshold of a match statistic.”

    In analyzing DNA mixtures, scientists will often find partial matches, so part of the determination of whether a suspect contributed to an item of evidence depends on interpretations by forensic scientists.

    The Project Research Openness for Validation with Empirical Data (PROVEDIt) database will help reduce the risk of misinterpreting the profile.

    The team of researchers spent more than six years developing computational algorithms that sorted through possible DNA signal combinations in a piece of evidence, taking into account their prevalence in the general population to determine the likelihood that the genetic material came from one, two, three, four, or five people.
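
    The article does not spell out the team’s algorithm, but the flavor of the calculation can be sketched. The toy function below computes, for a single locus, the probability of observing exactly a given set of alleles under one to five unrelated contributors, assuming Hardy–Weinberg proportions and ignoring real-world complications such as allele dropout, stutter, and peak heights; the allele frequencies are invented for illustration.

    ```python
    from itertools import combinations

    def prob_exact_alleles(observed, freqs, n_contributors):
        """P(exactly the alleles in `observed` appear at one locus),
        given n unrelated contributors -- a toy model, not PROVEDIt's method.

        Inclusion-exclusion: all 2n allele draws must land inside the
        observed set, minus the outcomes that miss some observed allele.
        """
        draws = 2 * n_contributors
        obs = list(observed)
        total = 0.0
        for k in range(len(obs) + 1):
            for subset in combinations(obs, k):
                p = sum(freqs[a] for a in subset)
                total += (-1) ** (len(obs) - k) * p ** draws
        return total

    # Invented allele frequencies at a single locus.
    freqs = {"A": 0.10, "B": 0.25, "C": 0.30, "D": 0.35}
    observed = {"A", "B", "C"}
    for n in range(1, 6):
        print(n, "contributor(s):", round(prob_exact_alleles(observed, freqs, n), 6))
    ```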

    Information from the PROVEDIt database, which is housed at Rutgers–Camden, could be used to test software systems and interpretation protocols, and could serve as a benchmark for future developments in DNA analysis.

    The PROVEDIt database, which consists of approximately 25,000 samples, is accessible to anyone for free.

    “We wanted to provide these data to the community so that they could test their own probabilistic systems,” says Grgicak. “Other academicians or other researchers might develop their own systems by which to interpret these very complex types of samples.”

    The website’s files contain data that can be used to develop new or compare existing interpretation or analysis strategies.

    Grgicak says forensic laboratories could use the database for validating or testing new or existing forensic DNA interpretation protocols. Researchers requiring data to test newly developed methodologies, technologies, ideas, developments, hypotheses, or prototypes can use the database to advance their own work.

    Lun, a computer science professor at Rutgers–Camden, led the way in developing the software systems, doing the number crunching to determine the likely number of contributors in a DNA sample, and calculating statistics to determine the likelihood that a person contributed to a sample or not.

    “The approach that we took to develop these methods is that we thought that it is very important that they be empirically driven,” says Lun. “That they can be used on real experimental data in order both to train or calibrate these methods and validate them.”

    Grgicak’s and Lun’s research to produce the database, titled “A Large-Scale Dataset of Single and Mixed-Source Short Tandem Repeat Profiles to Inform Human Identification Strategies: PROVEDIt,” is published in the journal Forensic Science International: Genetics.

    The database was mentioned in 2016 in a report by President Barack Obama’s President’s Council of Advisors on Science and Technology (PCAST), an advisory group of the nation’s leading scientists and engineers who directly advise the president and make policy recommendations in science, technology, and innovation.

    The research was supported by the National Institute of Justice and by the Department of Defense’s Army Research Office Rapid Innovation Fund.

    Other researchers contributing to the study include Lauren Alfonse and Amanda Garrett, of the biomedical forensic sciences program at Boston University School of Medicine.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A regional focus and a global outlook. All the prestige and resources of Rutgers, all the vitality and opportunity of the metro Philadelphia region, all at Rutgers–Camden. As the Rutgers of South Jersey, we deliver the academic heft you’d expect from a powerhouse public research university. And we focus that energy—in teaching, research, and civic engagement—within the greater Delaware Valley.

    The work we do on our 40-acre campus along the bustling Camden Waterfront is felt far beyond. We educate students for successful careers and productive citizenship. We support a faculty of sharp thinkers who turn new knowledge into creative solutions. And we share our expertise with partners—local and global—to improve individual lives and build stronger communities.

     
  • richardmitnick 11:19 am on January 23, 2018 Permalink | Reply
    Tags: Applied Research & Technology, Rutgers Scientists Discover 'Legos of Life'

    From Rutgers: “Rutgers Scientists Discover ‘Legos of Life'” 

    Rutgers University
    Rutgers University

    THIS POST IS DEDICATED TO J.L. FROM HP AND RUTGERS. I HOPE HE SEES IT.

    January 21, 2018
    Todd B. Bates

    Deep dive into the 3D structures of proteins reveals key building blocks.

    Rutgers researchers identified a small set of simple protein building blocks (left) that likely existed at the earliest stages of life’s history. Over billions of years, these “Legos of life” were assembled and repurposed by evolution into complex proteins (right) that are at the core of modern metabolism.
    Image: Vikas Nanda/Rutgers Robert Wood Johnson Medical School.

    Rutgers scientists have found the “Legos of life” – four core chemical structures that can be stacked together to build the myriad proteins inside every organism – after smashing and dissecting nearly 10,000 proteins to understand their component parts.

    The four building blocks make energy available for humans and all other living organisms, according to a study published online today in the Proceedings of the National Academy of Sciences.

    The study’s findings could lead to applications of these stackable, organic building blocks for biomedical engineering and therapeutic proteins and the development of safer, more efficient industrial and energy catalysts – proteins and enzymes that, like tireless robots, can repeatedly carry out chemical reactions and transfer energy to perform tasks.

    “Understanding these parts and how they are connected to each other within the existing proteins could help us understand how to design new catalysts that could potentially split water, fix nitrogen or do other things that are really important for society,” said Paul G. Falkowski, study co-author and a distinguished professor who leads the Environmental Biophysics and Molecular Ecology Laboratory at Rutgers University–New Brunswick.

    The scientists’ research was done on computers, using data on the 3D atomic structures of 9,500 proteins in the RCSB Protein Data Bank based at Rutgers, a rich source of information about how proteins work and evolve.

    “We don’t have a fossil record of what proteins looked like 4 billion years ago, so we have to take what we have today and start walking backwards, trying to imagine what these proteins looked like,” said Vikas Nanda, senior author of the study and an associate professor in the Department of Biochemistry and Molecular Biology at Rutgers’ Robert Wood Johnson Medical School, within Rutgers Biomedical and Health Sciences. “The study is the first time we’ve been able to take something with thousands of amino acids and break it down into reasonable chunks that could have had primordial origins.”

    The identification of four fundamental building blocks for all proteins is just a beginning. Nanda said future research may discover five or 10 more building blocks that serve as biological Legos.

    “Now we need to understand how to put these parts together to make more interesting functional molecules,” he said. “That’s the next grand challenge.”

    The study’s lead author is Hagai Raanana, a post-doctoral associate in the Environmental Biophysics and Molecular Ecology Program. Co-authors include Douglas H. Pike, a doctoral student at the Rutgers Institute for Quantitative Biomedicine, and Eli K. Moore, a post-doctoral associate in the Environmental Biophysics and Molecular Ecology Program.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    rutgers-campus

    Rutgers, The State University of New Jersey, is a leading national research university and the state’s preeminent, comprehensive public institution of higher education. Rutgers is dedicated to teaching that meets the highest standards of excellence; to conducting research that breaks new ground; and to providing services, solutions, and clinical care that help individuals and the local, national, and global communities where they live.

    Founded in 1766, Rutgers teaches across the full educational spectrum: preschool to precollege; undergraduate to graduate; postdoctoral fellowships to residencies; and continuing education for professional and personal advancement.

    Rutgers smaller
    Please give us back our original beautiful seal which the University stole away from us.
    As a ’67 graduate of University College, second in my class, I am proud to be a member of Alpha Sigma Lambda, the national honor society for non-traditional students.

     
  • richardmitnick 10:12 am on January 23, 2018 Permalink | Reply
    Tags: Applied Research & Technology, Online tool calculates reproducibility scores of PubMed papers

    From Science: “Online tool calculates reproducibility scores of PubMed papers” 

    AAAS
    Science Magazine

    Jan. 22, 2018
    Dalmeet Singh Chawla

    Scientific societies are seeking new tools to measure the reproducibility of published research findings, amid concerns that many cannot be reproduced independently. National Eye Institute, National Institutes of Health/Flickr (CC BY NC 2.0).

    A new online tool unveiled 19 January measures the reproducibility of published scientific papers by analyzing data about articles that cite them.

    The software comes at a time when scientific societies and journals are alarmed by evidence that findings in many published articles are not reproducible and are struggling to find reliable methods to evaluate whether they are.

    The tool, developed by the for-profit firm Verum Analytics in New Haven, Connecticut, generates a metric called the r-factor that indicates the veracity of a journal article based on the number of other studies that confirm or refute its findings. The r-factor metric has drawn much criticism from academics who said its relatively simple approach might not be sufficient to solve the multifaceted problem that measuring reproducibility presents.

    Early reaction to the new tool suggests that Verum has not fully allayed those concerns. The Verum developers concede the tool still has limitations; they said they released it to receive feedback about how well it works and how it could be improved. Verum has developed the project as a labor of love, and co-founder Josh Nicholson said he hopes the release of the early version of the tool will attract potential funders to help improve it.

    Verum announced the methodology underlying the tool, based on the r-factor, in a preprint paper [bioRxiv] last August and refined it in the new tool. It relies solely on data from freely available research papers in the popular biomedical search engine PubMed.

    Nicholson and his colleagues developed the tool by first manually examining 48,000 excerpts of text in articles that cited other published papers. Verum’s workers classified each of these passages as either confirming, refuting, or mentioning the other papers. Verum then used these classifications to train an algorithm to autonomously recognize each kind of passage in papers outside this sample group.

    Based on a sample of about 10,000 excerpts, Verum’s developers claim their tool classifies passages correctly 93% of the time. But it detects mentioning citations much more precisely than confirming or refuting ones, which were much less common in their sample. The vast majority of articles mention previous studies without confirming or refuting their claims; only about 8% of all citations are confirmatory and only about 1% are refuting.
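
    The article doesn’t describe Verum’s model in detail, but a generic version of this train-a-text-classifier step might look like the scikit-learn sketch below. The six labeled excerpts are invented stand-ins for the 48,000 manually classified passages.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Tiny, made-up training excerpts with the three labels used above.
    excerpts = [
        "our results replicate the effect reported by Smith et al.",
        "we confirm the association described in the original study",
        "we failed to reproduce the reported effect",
        "our data contradict the earlier findings",
        "as discussed previously by Smith et al.",
        "see Smith et al. for a review of the method",
    ]
    labels = ["confirming", "confirming", "refuting", "refuting",
              "mentioning", "mentioning"]

    classifier = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                               LogisticRegression(max_iter=1000))
    classifier.fit(excerpts, labels)

    print(classifier.predict(["we were unable to replicate these results"]))
    ```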

    The tool’s users can apply the algorithm by entering an article’s unique PubMed identifier code. The algorithm scours PubMed to find articles that cite the paper of interest and all passages that confirm, refute, or mention the paper. The tool then generates an r-factor score for the paper by dividing the number of confirming papers by the sum of the confirming and refuting papers.

    This formula tends to assign high scores, close to 1, to papers seldom refuted. The low number of refuting papers in Verum’s database means that many articles have r-factors of 1—which tends to limit the tool’s usefulness. (R-factors also contain a subscript number indicating the total number of studies that attempted to replicate the paper—an r-factor of 1 with a subscript of 16 means the tool scanned 16 replication studies.)
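
    The scoring rule itself is simple enough to write down directly. A minimal sketch based only on the formula described above (the helper name is ours):

    ```python
    def r_factor(confirming: int, refuting: int) -> tuple[float, int]:
        """r-factor = confirming / (confirming + refuting), plus the
        subscript: the number of studies that attempted replication."""
        attempts = confirming + refuting
        if attempts == 0:
            raise ValueError("no confirming or refuting citations found")
        return confirming / attempts, attempts

    score, n_studies = r_factor(confirming=16, refuting=0)
    print(f"r-factor: {score} (subscript {n_studies})")  # r-factor: 1.0 (subscript 16)
    ```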

    Psychologist Christopher Chartier of Ashland University in Ohio, who developed an online platform that assists with the logistics of replication studies, tried the new tool at the request of ScienceInsider. “It appears to do what it claims to do, but I don’t find much value in the results,” he says. One reason, he says, is that r-factors may be skewed by publication bias, the tendency of scholarly journals to publish positive results over negative ones. “We simply can’t trust the published literature to be a reliable and valid indicator of a finding’s replicability,” Chartier said.

    “Attempting to estimate the robustness of a published research finding is notoriously difficult,” said Marcus Munafò, a biological psychologist at the University of Bristol in the United Kingdom, a key figure in tackling irreproducibility [Nature Human Behaviour]. It’s difficult, he said, to know the precision or quality of individual confirmatory or refuting studies without reading them.

    Another limitation in Verum’s tool is that because it trawls only freely available papers on PubMed, it misses paywalled scholarly literature.

    Still, the Verum team will press on. Next on their agenda is to increase the number of sample papers used to train their algorithm to improve its accuracy in recognizing confirming and refuting papers.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 9:35 am on January 23, 2018 Permalink | Reply
    Tags: Applied Research & Technology, Tsunami warnings issued – later canceled – after powerful Alaska quake

    From EarthSky: “Tsunami warnings issued – later canceled – after powerful Alaska quake” 


    EarthSky

    January 23, 2018
    Deborah Byrd

    A powerful earthquake struck 174 miles (280 km) southeast of Kodiak, Alaska, early this morning. Tsunami watches or warnings were issued – later canceled – for western North America and Hawaii.

    A National Weather Service map showing the red tsunami warning zone as well as the yellow tsunami watch zone of January 23, 2018. The original watches and warnings ran south from Alaska, into Washington and California, and also included Hawaii. At this writing (12:30 UTC, or 6:30 a.m. CST), no tsunami watch, warning or advisory is in effect, according to the Pacific Tsunami Warning Center.

    The U.S. Geological Survey (USGS) reported a very large earthquake this morning in the Gulf of Alaska. It was originally reported at 8.2 magnitude, then 7.9 magnitude, then downgraded further to 7.0; even at the lowest number, it’s still a powerful quake (though much less powerful than originally reported). The earthquake struck on January 23, 2018 at 9:31 UTC (3:31 a.m. CST). It occurred 174 miles (280 km) southeast of Kodiak, Alaska.
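
    To see why those revisions matter, recall the standard relation between magnitude and radiated seismic energy, log10(E) = 1.5·M + const. A quick sketch (our own illustration, not from the article):

    ```python
    def energy_ratio(m_big: float, m_small: float) -> float:
        """Ratio of radiated seismic energy between two magnitudes,
        using the standard relation log10(E) = 1.5*M + const."""
        return 10 ** (1.5 * (m_big - m_small))

    # Each downward revision implies a large change in released energy:
    print(round(energy_ratio(8.2, 7.9), 1))  # ~2.8x  (8.2 vs 7.9)
    print(round(energy_ratio(7.9, 7.0), 1))  # ~22.4x (7.9 vs 7.0)
    ```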

    The Pacific Tsunami Warning Center (PTWC) issued tsunami watches or warnings for large portions of the Pacific, including a watch for the U.S. west coast from Washington to California as well as Hawaii, and a tsunami warning for the coast of Alaska and the Canadian province of British Columbia. Subsequently, all watches and warnings were canceled, but not before causing a mass of confusion on Twitter and in other news outlets.

    There were reports of some panic in Kodiak, Alaska (sirens blaring, people being woken from sleep), near the quake’s epicenter. Waters were then said to be receding in Kodiak, and waves were said to have been “small.”

    We have not yet seen reports of damage or injuries from this event.

    The PTWC – which was still in its calculation process when this advisory was issued at 10:17 UTC (4:17 a.m. CST) today – said tsunami waves were originally forecast to be less than one foot (0.3 meters) above the tide level for the coasts of Guam, Hawaii and northwestern Hawaiian Islands, Japan, Johnston Atoll, Mexico, Midway Island, Northern Marianas, Russia, and Wake Island.

    This story is still being updated.

    “Aftershocks will follow an earthquake this size. Residents of both Alaska and Canada should be prepared. Sometimes aftershocks can be even stronger than the initial earthquake.” – Daniel McFarland

    Large earthquakes are common in the Pacific-North America plate boundary region south of Alaska. USGS explained:

    The January 23, 2018 M 7.9 earthquake southeast of Kodiak Island in the Gulf of Alaska occurred as the result of strike slip faulting within the shallow lithosphere of the Pacific plate … At the location of the earthquake, the Pacific plate is converging with the North America plate at a rate of approximately 59 mm/yr towards the north-northwest. The Pacific plate subducts beneath the North America plate at the Alaska-Aleutians Trench, about 90 km to the northwest of today’s earthquake. The location and mechanism of the January 23rd earthquake are consistent with it occurring on a fault system within the Pacific plate before it subducts, rather than on the plate boundary between the Pacific and North America plates further to the northwest.

    Bottom line: A 7.9-magnitude earthquake struck on January 23, 2018 in the Gulf of Alaska. Tsunami watches and warnings were issued, then canceled. The situation is still unfolding.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Deborah Byrd created the EarthSky radio series in 1991 and founded EarthSky.org in 1994. Today, she serves as Editor-in-Chief of this website. She has won a galaxy of awards from the broadcasting and science communities, including having an asteroid named 3505 Byrd in her honor. A science communicator and educator since 1976, Byrd believes in science as a force for good in the world and a vital tool for the 21st century. “Being an EarthSky editor is like hosting a big global party for cool nature-lovers,” she says.

     
  • richardmitnick 9:06 am on January 23, 2018 Permalink | Reply
    Tags: Applied Research & Technology

    From LBNL: “It All Starts With a ‘Spark’: Berkeley Lab Delivers Injector That Will Drive X-Ray Laser Upgrade”

    Berkeley Logo

    Berkeley Lab

    January 22, 2018
    Glenn Roberts, Jr.
    geroberts@lbl.gov
    (510) 486-5582

    Unique device will create bunches of electrons to stimulate million-per-second X-ray pulses.

    Joe Wallig, left, a mechanical engineering associate, and Brian Reynolds, a mechanical technician, work on the final assembly of the LCLS-II injector gun in a specially designed clean room at Berkeley Lab in August. (Credit: Marilyn Chung/Berkeley Lab)

    Every powerful X-ray pulse produced for experiments at a next-generation laser project, now under construction, will start with a “spark” – a burst of electrons emitted when a pulse of ultraviolet light strikes a 1-millimeter-wide spot on a specially coated surface.

    A team at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) designed and built a unique version of a device, called an injector gun, that can produce a steady stream of these electron bunches that will ultimately be used to produce brilliant X-ray laser pulses at a rapid-fire rate of up to 1 million per second.

    The injector arrived Jan. 22 at SLAC National Accelerator Laboratory (SLAC) in Menlo Park, California, the site of the Linac Coherent Light Source II (LCLS-II), an X-ray free-electron laser project.


    Stanford/SLAC Campus


    SLAC/LCLS II projected view

    An electron beam travels through a niobium cavity, a key component of a future LCLS-II X-ray laser, in this illustration. Kept at minus 456 degrees Fahrenheit, these cavities will power a highly energetic electron beam that will create up to 1 million X-ray flashes per second. (Credit: SLAC National Accelerator Laboratory)

    Getting up to speed

    The injector will be one of the first operating pieces of the new X-ray laser. Initial testing of the injector will begin shortly after its installation.

    The injector will feed electron bunches into a superconducting particle accelerator that must be supercooled to extremely low temperatures to conduct electricity with nearly zero loss. The accelerated electron bunches will then be used to produce X-ray laser pulses.

    Scientists will employ the X-ray pulses to explore the interaction of light and matter in new ways, producing sequences of snapshots that can create atomic- and molecular-scale “movies,” for example, to illuminate chemical changes, magnetic effects, and other phenomena that occur in just quadrillionths (million-billionths) of a second.

    This new laser will complement experiments at SLAC’s existing X-ray laser, which launched in 2009 and fires up to 120 X-ray pulses per second. That laser will also be upgraded as a part of the LCLS-II project.

    SLAC/LCLS

    A rendering of the completed injector gun and related beam line equipment. (Credit: Greg Stewart/SLAC National Accelerator Laboratory)

    The injector gun project teamed scientists from Berkeley Lab’s Accelerator Technology and Applied Physics Division with engineers and technologists from the Engineering Division in what Engineering Division Director Henrik von der Lippe described as “yet another success story from our longstanding partnership – (this was) a very challenging device to design and build.”

    “The completion of the LCLS-II injector project is the culmination of more than three years of effort,” added Steve Virostek, a Berkeley Lab senior engineer who led the gun construction. The Berkeley Lab team included mechanical engineers, physicists, radio-frequency engineers, mechanical designers, fabrication shop personnel, and assembly technicians.

    “Virtually everyone in the Lab’s main fabrication shop made vital contributions,” he added, in the areas of machining, welding, brazing, ultrahigh-vacuum cleaning, and precision measurements.

    The injector source is one of Berkeley Lab’s major contributions to the LCLS-II project, and builds upon its expertise in similar electron gun designs, including the completion of a prototype gun. Almost a decade ago, Berkeley Lab researchers began building a prototype for the injector system in a beam-testing area at the Lab’s Advanced Light Source.

    LBNL/ALS

    That successful effort, dubbed APEX (Advanced Photoinjector Experiment), produced a working injector that has since been repurposed for experiments that use its electron beam to study ultrafast processes at the atomic scale.

    The APEX electron gun and test beamline at the ALS Beam Test Facility. APEX team members include (from left) Daniele Filippetto, Fernando Sannibale, and John Staples of the Accelerator and Fusion Research Division and Russell Wells of the Engineering Division. (Photo by Roy Kaltschmidt, Lawrence Berkeley National Laboratory)

    Daniele Filippetto, a Berkeley Lab scientist, works on the High-Repetition-rate Electron Scattering apparatus (HiRES), which will function like an ultrafast electron camera. HiRES is a new capability that builds on the Advanced Photo-injector Experiment (APEX), a prototype electron source for advanced X-ray lasers. (Roy Kaltschmidt/Berkeley Lab)

    Fernando Sannibale, Head of Accelerator Physics at the ALS, led the development of the prototype injector gun.

    Krista Williams, a mechanical technician, works on the final assembly of LCLS-II injector components on Jan. 11. (Credit: Marilyn Chung/Berkeley Lab)

    “This is a ringing affirmation of the importance of basic technology R&D,” said Wim Leemans, director of Berkeley Lab’s Accelerator Technology and Applied Physics Division. “We knew that the users at next-generation light sources would need photon beams with exquisite characteristics, which led to highly demanding electron-beam requirements. As LCLS-II was being defined, we had an excellent team already working on a source that could meet those requirements.”

    The lessons learned with APEX inspired several design changes that are incorporated in the LCLS-II injector, such as an improved cooling system to prevent overheating and metal deformations, as well as innovative cleaning processes.

    “We’re looking forward to continued collaboration with Berkeley Lab during commissioning of the gun,” said SLAC’s John Galayda, LCLS-II project director. “Though I am sure we will learn a lot during its first operation at SLAC, Berkeley Lab’s operating experience with APEX has put LCLS-II miles ahead on its way to achieving its performance and reliability objectives.”

    Mike Dunne, LCLS director at SLAC, added, “The performance of the injector gun is a critical component that drives the overall operation of our X-ray laser facility, so we greatly look forward to seeing this system in operation at SLAC. The leap from 120 pulses per second to 1 million per second will be truly transformational for our science program.”

    How it works

    Like a battery, the injector has components called an anode and cathode. These components form a vacuum-sealed central copper chamber known as a radio-frequency accelerating cavity that sends out the electron bunches in a carefully controlled way.

    The cavity is precisely tuned to operate at very high frequencies and is ringed with an array of channels that allow it to be water-cooled, preventing overheating from the radio-frequency currents interacting with copper in the injector’s central cavity.

    A copper cone structure inside the injector gun’s central cavity. (Credit: Marilyn Chung/Berkeley Lab)

    A copper cone structure within its central cavity is tipped with a specially coated and polished slug of molybdenum known as a photocathode. Light from an infrared laser is converted to ultraviolet (UV) laser light, and this UV light is steered by mirrors onto a small spot on the cathode that is coated with cesium telluride (Cs2Te), exciting the electrons.

    These electrons are formed into bunches and accelerated by the cavity, which will, in turn, connect to the superconducting accelerator. After this electron beam is accelerated to nearly the speed of light, it will be wiggled within a series of powerful magnetic structures called undulator segments, stimulating the electrons to emit X-ray light that is delivered to experiments.

    Precision engineering and spotless cleaning

    Besides the precision engineering that was essential for the injector, Berkeley Lab researchers also developed processes for eliminating contaminants from components through a painstaking polishing process and by blasting them with dry ice pellets.

    The final cleaning and assembly of the injector’s most critical components was performed in filtered-air clean rooms by employees wearing full-body protective clothing to further reduce contaminants – the highest-purity clean room used in the final assembly is actually housed within a larger clean room at Berkeley Lab.

    “The superconducting linear accelerator is extremely sensitive to particulates,” such as dust and other types of tiny particles, Virostek said. “Its accelerating cells can become non-usable, so we had to go through quite a few iterations of planning to clean and assemble our system with as few particulates as possible.”

    Joe Wallig, a mechanical engineering associate, prepares a metal ring component of the injector gun for installation using a jet of high-purity dry ice in a clean room. (Credit: Marilyn Chung/Berkeley Lab)

    The dry ice-based cleaning processes function like sandblasting, creating tiny explosions that cleanse the surface of components by ejecting contaminants. In one form of this cleaning process, Berkeley Lab technicians enlisted a specialized nozzle to jet a very thin stream of high-purity dry ice.

    After assembly, the injector was vacuum-sealed and filled with nitrogen gas to stabilize it for shipment. The injector’s cathodes degrade over time, and the injector is equipped with a “suitcase” of cathodes, also under vacuum, that allows cathodes to be swapped out without the need to open up the device.

    “Every time you open it up you risk contamination,” Virostek explained. Once all of the cathodes in a suitcase are used up, the suitcase must be replaced with a fresh set of cathodes.

    The overall operation and tuning of the injector gun will be remotely controlled, and there is a variety of diagnostic equipment built into the injector to help ensure smooth running.

    Even before the new injector is installed, Berkeley Lab has proposed to undertake a design study for a new injector that could generate electron bunches with more than double the output energy. This would enable higher-resolution X-ray-based images for certain types of experiments.

    Berkeley Lab Contributions to LCLS-II

    John Corlett, Berkeley Lab’s senior team leader, worked closely with the LCLS-II project managers at SLAC and with Berkeley Lab managers to bring the injector project to fruition.

    Steve Virostek, a senior engineer who led the injector gun’s construction, inspects the mounted injector prior to shipment. (Credit: Marilyn Chung/Berkeley Lab)

    “In addition to the injector source, Berkeley Lab is also responsible for the undulator segments for both of the LCLS-II X-ray free-electron laser beamlines, for the accelerator physics modeling that will optimize their performance, and for technical leadership in the low-level radio-frequency controls systems that stabilize the superconducting linear accelerator fields,” Corlett noted.

    James Symons, Berkeley Lab’s associate director for physical sciences, said, “The LCLS-II project has provided a tremendous example of how multiple laboratories can bring together their complementary strengths to benefit the broader scientific community. The capabilities of LCLS-II will lead to transformational understanding of chemical reactions, and I’m proud of our ability to contribute to this important national project.”

    LCLS-II is being built at SLAC with major technical contributions from Argonne National Laboratory, Fermilab, Jefferson Lab, Berkeley Lab, and Cornell University. Construction of LCLS-II is supported by DOE’s Office of Science.

    Members of the LCLS-II injector gun team at Berkeley Lab. (Credit: Marilyn Chung/Berkeley Lab)

    View more photos of the injector gun and related equipment: here and here.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 10:53 am on January 22, 2018 Permalink | Reply
    Tags: Applied Research & Technology, Marine geodesy, Megathrust zone

    From Eos: “Modeling Megathrust Zones” 

    AGU bloc

    AGU
    Eos news bloc

    Eos

    1.22.18
    Rob Govers

    A recent paper in Reviews of Geophysics built a unifying model to predict the surface characteristics of large earthquakes.

    The Sendai coast of Japan approximately one year after the 2011 Tohoku earthquake. The harbor moorings and the quay show significant co-seismic subsidence. The dark band along the quay wall resulted from post-seismic uplift. Credit: Rob Govers.

    The past few decades have seen a number of very large earthquakes at subduction zones. Researchers now have an array of advanced technologies that provide insights into the processes of plate movement and crustal deformation. A review article recently published in Reviews of Geophysics pulled together observations from different locations worldwide to evaluate whether similar physical processes are active at different plate margins. The editors asked one of the authors to describe advances in our understanding and where additional research is still needed.

    What are “megathrust zones” and what are the main processes that occur there?

    A megathrust zone is a thin boundary layer between a tectonic plate that sinks into the Earth’s mantle and an overriding plate. The largest earthquakes and tsunamis are produced here. High friction in the shallow part of the megathrust zone effectively locks parts of the interface for decades to centuries. Ongoing plate motion slowly brings the shallow interface closer to failure, i.e., an earthquake. Other parts of the megathrust zone are mechanically weaker. They consequently attempt to creep at a rate that is required by plate tectonics, but are limited by being connected to the locked part of the interface.

    What insights have been learned from recent megathrust earthquakes at different margins?

    High-magnitude earthquakes in Indonesia (2004), Chile (2010) and Japan (2011) were recorded by new networks utilizing Global Positioning System technology, which is capable of measuring ground displacements with millimeter accuracy. This complemented seismological observations of megathrust slip during these earthquakes. The crust turned out to deform significantly during and after these earthquakes. These observations indicated that slip on weak parts of the megathrust zone may be responsible, likely in combination with the more classical stress relaxation in the Earth’s mantle. In regions where megathrust earthquakes are anticipated, crustal deformation observations allowed researchers to identify parts of the megathrust zone that are currently locked. In our review article, we integrate these perspectives into a general framework for the earthquake cycle.

    How have models been used to complement observations and better understand these processes?

    Mechanical models are needed to tie the surface observations to their causative processes that take place from a few to hundreds of kilometers deep into the Earth, which is beyond what is directly accessible by drilling. Many of the published models focus on a single earthquake along a specific megathrust zone. We wondered what deep earth processes are common to these regions globally and built a unifying model to predict their surface expressions. Our model roughly reproduced the observed surface deformation, but it also became clear that some regional diversity would be required to match the data shortly after a major earthquake.

    What have been some of the recent significant scientific advances in understanding plate boundaries?

    Creep on weak parts of the megathrust zone is a very significant contributor to the surface measurements after an earthquake. Mantle relaxation is also relevant. We demonstrate that the surface deformation of these processes may give a biased impression of low friction on the megathrust zone. Creep on the megathrust zone downdip of a major earthquake may be responsible for observations that were puzzling thus far; in an overall context of convergence and compression, tension was observed in the overriding plate shortly after recent major earthquakes.

    What are some of the unresolved questions where additional research or modeling is needed?

    Marine geodesy is an exciting new field that aims to monitor deformation of the sea floor and has already yielded important constraints on the deformation of the Japan megathrust. Measurements along various margins will tell whether all megathrusts are locked all the way up to the seafloor. A longstanding question is how observations on geological time scales of mountain building and deformation of the overriding plate are linked to the observations of active deformation. We think that the multi-earthquake cycle model that we present in this review article is a first step towards that goal.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 9:14 am on January 22, 2018 Permalink | Reply
    Tags: Applied Research & Technology, Cambridge Rindge and Latin School (CRLS), Harvard Life Sciences Outreach Program

    From Harvard Gazette: “Learning to understand their own DNA” 

    Harvard University
    Harvard University


    Harvard Gazette

    January 19, 2018
    Deborah Blackwell

    Cambridge Rindge and Latin student Hannah Thomsen isolates her DNA for sequencing during an Amgen biotech lab inside the Science Center. Kris Snibbe/Harvard Staff Photographer.

    Harvard opens its labs to help local high school students decode biotech

    On the fourth floor of Harvard’s Science Center, high school biology students from Cambridge Rindge and Latin School (CRLS) put on safety goggles and gloves, and step up to lab tables conveniently set up with pipettes, centrifuges, and other implements.

    Then they get to work isolating their own DNA.

    “This is real-life science, the stuff that people who work in biotech are actually doing in their labs, and the fact that kids get to do this at the high school level is amazing,” said Janira Arocho, a biology teacher at CRLS. “I didn’t get to do this type of stuff until I was in college.”

    Teaching younger students the tools of modern science is the goal of the Amgen Biotech Experience (ABE), a STEM (science, technology, engineering, and mathematics) program that opens the field of biotechnology to high schoolers and their teachers, while at the same time teaching them how to approach science as critical thinkers and innovators — and a lot about who they are.

    “It’s normally really, really challenging to give them a good sense of what happens just by lecturing about it,” said Tara Bennett Bristow, site director of the Massachusetts ABE. “The ABE program is not only helping to increase their scientific literacy in biotechnology, it’s exposing them in a hands-on fashion, which generates enthusiasm.”

    In its sixth year in Massachusetts, the local branch of the program is a partnership between Harvard and the Amgen Foundation. A foundation grant through the University’s Life Sciences Outreach Program provides the kits of materials and equipment for students to do labs that mirror the process of therapeutic research and development, and Massachusetts teachers participating in the program complete summer training workshops at Harvard.

    Arocho, who has participated in the program for several years, said with the training, “I was able to learn everything my students would be doing ahead of time, as opposed to learning along with them in my own classroom.”

    More than 80,000 students around the world — 6,000 of them from Massachusetts high schools, along with 100 of their teachers — participated in ABE last year. At Harvard, which in July received another three-year grant to continue ABE programing, about 500 CRLS students are able to use the undergraduate biology teaching laboratories, where their own teacher leads the lab and graduate students and postdoctoral fellows are on site for assistance.

    CRLS students Hannah Thomsen (from left) and Elizabeth Lucas-Foley work with their Biology teacher Janira Arocho, GSAS student Alyson Ramirez, and CRLS students Peter Fulweiler and Kerri Sands. Kris Snibbe/Harvard Staff Photographer.

    In one lab in December, the CRLS students isolated their own DNA (their results were sent out for sequencing, and reports returned to them several days later for analysis). In another, the students produced a red fluorescent protein — used in the field for in vivo imaging — with common biotech tools.

    Alia Qatarneh, the site coordinator of the ABE program at Harvard, leads teacher ABE workshops, training, and student labs. Qatarneh said she is particularly excited that the program was just implemented at her alma mater, Boston Latin School, where she was able to teach an ABE lab to four advanced placement biology classes last fall.

    “It was amazing to go back to Boston Latin and think of my own experience as a high school student. I was so into science and loved hands-on things, but didn’t take AP biology because I was scared,” she said. “If I were a high school student and I had a chance to hold pipettes, to change the genetic makeup of bacteria to make it glow in the dark, how cool would that be?”

    An assessment by the nonprofit research firm WestEd found that the ABE program substantially adds to students’ knowledge of biotechnology, and increases their interest and confidence in their scientific abilities. The program is open, free of charge, to participating high school biology students, including those with learning disabilities, and even those without an interest in science.

    “Students may say, ‘Wow biotech, I didn’t know that this field existed. I thought that if I liked science I had to be a doctor, and now I have this whole different path in front of me,’” Qatarneh said.

    Arocho said her students love going to Harvard, seeing what the labs look like, and doing their work there. “Alia always starts by telling them that this is the exact same lab that the Harvard freshmen are doing, and the exact same place, so they do get excited about that,” she said.

    CRLS junior Peter Fulweiler, one of Arocho’s students, said the best part is taking what he learned in the classroom and putting it all together in the lab.

    “I love the hands-on part of this. It’s really interesting, because it’s not like we are reading instructions; we are making an attempt to actually understand what we are learning by doing it,” he said. “The bonus is that we get to find out where we are from on our mothers’ side.”

    Science teacher Lawrence Spezzano is one of 10 instructors at Boston Latin now implementing the ABE program. He said it allows for flexibility and differentiation, and enhances learning opportunities as well as classroom logistics.

    “The program was perfect. As an AP biology teacher struggling to fit more labs and biotechnology into a time-constrained curriculum, the mapped-out process is creative and engaging to both me and my students,” Spezzano said.

    Kerri Sands, a junior at CRLS, said she has always dreamed of being a geneticist. She wants to eventually change the future of medicine, and now feels like she can.

    “I just love the science of this, the lab is like my home. I love the whole experience of everything from the micro pipetting to the centrifuging. I love it all,” she said. “This has made my passion for science even stronger.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Harvard University campus
    Harvard is the oldest institution of higher education in the United States, established in 1636 by vote of the Great and General Court of the Massachusetts Bay Colony. It was named after the College’s first benefactor, the young minister John Harvard of Charlestown, who upon his death in 1638 left his library and half his estate to the institution. A statue of John Harvard stands today in front of University Hall in Harvard Yard, and is perhaps the University’s best known landmark.

    Harvard University has 12 degree-granting Schools in addition to the Radcliffe Institute for Advanced Study. The University has grown from nine students with a single master to an enrollment of more than 20,000 degree candidates including undergraduate, graduate, and professional students. There are more than 360,000 living alumni in the U.S. and over 190 other countries.

     
  • richardmitnick 8:17 am on January 22, 2018 Permalink | Reply
    Tags: Applied Research & Technology, Mounting Evidence Suggests a Remote Australian Region Was Once Part of North America

    From Science Alert: “Mounting Evidence Suggests a Remote Australian Region Was Once Part of North America” 

    ScienceAlert

    Science Alert

    22 JAN 2018
    MICHELLE STARR

    (Janaka Dharmasena/Shutterstock)

    It really is a small world after all.

    Geologists tend to agree that, billions of years ago, the configuration of the continents was very different. How exactly they all fit together and when is a bit more of a puzzle, the pieces of which can be put together by studying rocks and fossils.

    Now researchers have found a series of rocks that show something surprising: part of Australia could have once been connected to part of Canada on the North American continent, around 1.7 billion years ago.

    Actually, the discovery that the two continents were once connected isn’t hugely surprising. Speculation about such a connection has existed since the late 1970s, when a paper proposed a connection dating back to the continent of Rodinia, around 1.13 billion years ago. However, an exact time and location for the connection has remained under debate.

    Found in Georgetown, a small town of just a few hundred people in the northeast of Australia, the rocks are unlike other rocks on the Australian continent.

    Instead, they show similarities to ancient rocks found in Canada, in the exposed section of the continental crust called the Canadian Shield.

    This unexpected finding, according to researchers at Curtin University, Monash University and the Geological Survey of Queensland in Australia, reveals something about the composition of the ancient supercontinent Nuna.

    “Our research shows that about 1.7 billion years ago, Georgetown rocks were deposited into a shallow sea when the region was part of North America. Georgetown then broke away from North America and collided with the Mount Isa region of northern Australia around 100 million years later,” said Curtin PhD student and lead researcher Adam Nordsvan.

    “This was a critical part of global continental reorganisation when almost all continents on Earth assembled to form the supercontinent called Nuna.”

    The last time the continents were gathered close to one another was in the major supercontinent known as Pangea, which broke apart around 175 million years ago.

    However, before Pangea, the planet went through a number of supercontinent configurations – one of which was Nuna, also called Columbia, which existed from around 2.5 billion to 1.5 billion years ago.

    The team reached its conclusion by examining new sedimentological field data, and new and existing geochronological data from both Georgetown and Mount Isa, another remote town in northeast Australia, and comparing it to rocks from Canada.

    According to the research, when Nuna started breaking up, the Georgetown area remained permanently stuck to Australia.

    This, the researchers said in their paper, challenges the current model that suggests the Georgetown region was part of the continent that would become Australia prior to 1.7 billion years ago.

    The research also found new evidence that mountain ranges in the Georgetown and Mount Isa regions were formed when the two regions collided.

    “Ongoing research by our team shows that this mountain belt, in contrast to the Himalayas, would not have been very high, suggesting the final continental assembling process that led to the formation of the supercontinent Nuna was not a hard collision like India’s recent collision with Asia,” said co-author Zheng-Xiang Li.

    “This new finding is a key step in understanding how Earth’s first supercontinent Nuna may have formed, a subject still being pursued by our multidisciplinary team here at Curtin University.”

    The research has been published in the journal Geology.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 7:25 am on January 22, 2018 Permalink | Reply
    Tags: Applied Research & Technology, Great Barrier Reef - Australia, Helping put the Great Barrier Reef on the road to recovery

    From CSIROscope: “Helping put the Great Barrier Reef on the road to recovery” 

    CSIRO bloc

    CSIROscope

    22 January 2018
    No writer credit


    The Great Barrier Reef.

    We often hear the same depressing story about the Great Barrier Reef: Australia’s iconic living structure is struggling to cope with a plethora of problems. Deteriorating water quality, rising water temperatures and ocean acidification, and consecutive bleaching events all have their detrimental impacts on the Reef.

    Despite these multiple large-scale and complex problems, many areas of the Great Barrier Reef still show resilience, which presents a window of opportunity to act.

    The Hon. Prime Minister Malcolm Turnbull recently announced a $60 million package of measures to address the challenges that face the Reef. The range of activities includes $6 million for the Australian Institute of Marine Science, ourselves and partners to scope and design a Reef Restoration and Adaptation Program (RRAP). This program will assess and develop existing and novel technologies to assist the recovery and repair of the Reef.

    Dr Peter Mayfield, our Executive Director for Environment, Energy and Resources, said the magnitude of challenges facing the Reef means it cannot be addressed by one organisation alone.

    “The RRAP will provide a unique opportunity to harness our collective knowledge and expertise across the entire research and science sector,” Dr Mayfield said.

    “We’re delighted to be working alongside our many partner institutions to help deliver material solutions for the Reef.”

    Bringing together the best

    The nature of the environmental challenge facing the Reef demands the best scientific minds across a range of Australian universities, research institutions, park managers and charities. These include the Australian Institute of Marine Science, Great Barrier Reef Foundation, James Cook University, The University of Queensland, Queensland University of Technology, the Great Barrier Reef Marine Park Authority and researchers from many other organisations.

    We have a long history of working together with AIMS and the Great Barrier Reef Marine Park Authority in the Great Barrier Reef World Heritage Area. The Reef Restoration and Adaptation Program takes this historical collaboration to a new level, involving many more national and international partners.

    Global solutions

    Coral reefs around the world support 25 per cent of all marine life and provide essential goods and services to an estimated one billion people. The solutions we uncover through this program could be used to help save reefs around the world.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    SKA/ASKAP radio telescope at the Murchison Radio-astronomy Observatory (MRO) in Mid West region of Western Australia


    CSIRO campus

    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

     