Tagged: Particle Accelerators

  • richardmitnick 1:40 pm on February 27, 2020
    Tags: "‘Flash photography’ at the LHC", , , , , Particle Accelerators, , ,   

    From Symmetry: “‘Flash photography’ at the LHC” 


    02/27/20
    Sarah Charley

    Photo by Tom Bullock

    An extremely fast new detector inside the CMS detector will allow physicists to get a sharper image of particle collisions.

    Some of the best commercially available high-speed cameras can capture thousands of frames every second. They produce startling videos of water balloons popping and hummingbirds flying in ultra-slow motion.

    But what if you want to capture an image of a process so fast that it looks blurry if the shutter is open for even a billionth of a second? This is the type of challenge scientists on experiments like CMS and ATLAS face as they study particle collisions at CERN’s Large Hadron Collider.

    When the LHC is operating to its full potential, bunches of about 100 billion protons cross each other’s paths every 25 nanoseconds. During each crossing, which lasts about 2 nanoseconds, about 50 protons collide and produce new particles. Figuring out which particle came from which collision can be a daunting task.
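
Taken together, those two numbers imply a staggering interaction rate. A quick back-of-envelope check (illustrative arithmetic only, using the figures quoted above):

```python
# Rough rates implied by the article's figures: bunch crossings every
# 25 nanoseconds, with ~50 proton collisions per crossing.
BUNCH_SPACING_S = 25e-9
COLLISIONS_PER_CROSSING = 50

crossings_per_second = 1 / BUNCH_SPACING_S
collisions_per_second = crossings_per_second * COLLISIONS_PER_CROSSING

print(f"{crossings_per_second:.0e} crossings/s")    # 4e+07
print(f"{collisions_per_second:.0e} collisions/s")  # 2e+09
```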

    “Usually in ATLAS and CMS, we measure the charge, energy and momentum of a particle, and also try to infer where it was produced,” says Karri DiPetrillo, a postdoctoral fellow working on the CMS experiment at the US Department of Energy’s Fermilab. “We’ve had timing measurements before—on the order of nanoseconds, which is sufficient to assign particles to the correct bunch crossing, but not enough to resolve the individual collisions within the same bunch.”

    Thanks to a new type of detector DiPetrillo and her collaborators are building for the CMS experiment, this is about to change.

    CERN/CMS Detector

    Physicists on the CMS experiment are devising a new detector capable of creating a more accurate timestamp for passing particles. The detector will separate the 2-nanosecond bursts of particles into several consecutive snapshots—a feat a bit like taking 30 billion pictures a second.

    This will help physicists with a mounting challenge at the LHC: collision pileup.

    Picking apart which particle tracks came from which collision is a challenge. A planned upgrade to the intensity of the LHC will increase the number of collisions per bunch crossing by a factor of four—that is from 50 to 200 proton collisions—making that challenge even greater.

    Currently, physicists look at where the collisions occurred along the beamline as a way to identify which particular tracks came from which collision. The new timing detector will add another dimension to that.

    “These time stamps will enable us to determine when in time different collisions occurred, effectively separating individual bunch crossings into multiple ‘frames,’” says DiPetrillo.

    DiPetrillo and fellow US scientists working on the project are supported by DOE’s Office of Science, which is also contributing support for the detector development.

    According to DiPetrillo, being able to separate the collisions based on when they occur will have huge downstream impacts on every aspect of the research. “Disentangling different collisions cleans up our understanding of an event so well that we’ll effectively gain three more years of data at the High-Luminosity LHC. This increase in statistics will give us more precise measurements, and more chances to find new particles we’ve never seen before,” she says.

    The precise time stamps will also help physicists search for heavy, slow-moving particles they might have missed in the past.

    “Most particles produced at the LHC travel at close to the speed of light,” DiPetrillo says. “But a very heavy particle would travel slower. If we see a particle arriving much later than expected, our timing detector could flag that for us.”
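
DiPetrillo's point can be put into numbers with textbook relativistic kinematics: over a flight path L, a particle's delay relative to light speed is L/(βc) − L/c, with β = p/E. The mass, momentum, and path length below are hypothetical values chosen only to show that such a delay would dwarf a picosecond-level time stamp:

```python
import math

C_M_PER_NS = 0.299792458  # speed of light in metres per nanosecond

def arrival_delay_ns(mass_gev, momentum_gev, path_m):
    """Extra flight time of a massive particle, relative to a light-speed
    particle, over the same path (relativistic kinematics, natural units)."""
    energy = math.hypot(momentum_gev, mass_gev)  # E^2 = p^2 + m^2
    beta = momentum_gev / energy                 # v/c
    return (path_m / C_M_PER_NS) * (1 / beta - 1)

# Hypothetical heavy particle: 1000 GeV mass, 500 GeV momentum, ~1 m
# flight path. Its delay is several nanoseconds -- enormous compared
# with a time stamp good to a few tens of picoseconds.
delay_ns = arrival_delay_ns(1000, 500, 1.0)
```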

    The new timing detector inside CMS will consist of a 5-meter-long cylindrical barrel made from 160,000 individual scintillating crystals, each approximately the width and length of a matchstick. This crystal barrel will be capped on its open ends with disks containing delicately layered radiation-hard silicon sensors. The barrel, about 2 meters in diameter, will surround the inner detectors that compose CMS’s tracking system closest to the collision point. DiPetrillo and her colleagues are currently working out how the various sensors and electronics at each end of the barrel will coordinate to give a time stamp within 30 to 50 picoseconds.
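
The two figures quoted in this article are mutually consistent: a time stamp good to roughly 33 picoseconds is the equivalent of the "30 billion pictures a second" analogy, and it slices each 2-nanosecond bunch crossing into dozens of frames (illustrative arithmetic only):

```python
# Consistency check on the figures quoted in the article.
RESOLUTION_S = 33e-12       # ~30-50 ps time stamp (taking ~33 ps)
CROSSING_DURATION_S = 2e-9  # each bunch crossing lasts about 2 ns

frames_per_second = 1 / RESOLUTION_S
frames_per_crossing = CROSSING_DURATION_S / RESOLUTION_S

print(f"{frames_per_second:.1e}")    # 3.0e+10 -- ~30 billion "pictures"/s
print(f"{frames_per_crossing:.0f}")  # prints 61 -- ~60 snapshots per crossing
```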

    “Normally when a particle passes through a detector, the energy it deposits is converted into an electrical pulse that rises steeply and then falls slowly over the course of a few nanoseconds,” says Joel Butler, the Fermilab scientist coordinating this project. “To register one of these passing particles in under 50 picoseconds, we need a signal that reaches its peak even faster.”

    Scientists can use the steep rising slopes of these signals to separate the collisions not only in space, but also in time. In the barrel of the detector, a particle passing through the crystals will release a burst of light that will be recorded by specialized electronics. Based on when the intense flash of light arrives at each sensor, physicists will be able to calculate the particle’s exact location and when it passed. Particles will also produce a quick pulse in the endcaps, which are made from a new type of silicon sensor that amplifies the signal. Each silicon sensor is about the size of a domino and can determine the location of a passing particle to within 1.3 millimeters.
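
The position-from-timing idea in this paragraph can be sketched for the simplest geometry: a single scintillating bar read out at both ends. Everything below (the effective light speed, the bar length, the function name) is an assumption for illustration, not CMS's actual reconstruction:

```python
# Locate a hit along a bar from the light-arrival times at its two ends.
V_EFF_M_PER_NS = 0.18  # assumed effective light-propagation speed in the bar

def hit_position_and_time(t_left_ns, t_right_ns, bar_length_m):
    """Return (position from bar centre in m, crossing time in ns).

    Light reaches the nearer end first, so the arrival-time *difference*
    encodes position, while the *average* (minus the common transit time)
    encodes when the particle crossed.
    """
    x = 0.5 * (t_left_ns - t_right_ns) * V_EFF_M_PER_NS  # + toward right end
    t = 0.5 * (t_left_ns + t_right_ns) - 0.5 * bar_length_m / V_EFF_M_PER_NS
    return x, t
```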

    The physicists working on the timing detector plan to have all the components ready and installed inside CMS for the start-up of the High-Luminosity LHC in 2027.

    “High-precision timing is a new concept in high-energy physics,” says DiPetrillo. “I think it will be the direction we pursue for future detectors and colliders because of its huge physics potential. For me, it’s an incredibly exciting and novel project to be on right now.”


    See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 2:23 pm on February 24, 2020
    Tags: "ATLAS experiment searches for natural supersymmetry using novel techniques", , , , , Particle Accelerators, ,   

    From CERN ATLAS via phys.org: “ATLAS experiment searches for natural supersymmetry using novel techniques” 


    February 24, 2020

    Visualisation of the highest jet multiplicity event selected in a control region used to make predictions of the background from multijet production. This event was recorded by ATLAS on 18 July 2018, and contains 19 jets, illustrated by cones. Yellow blocks represent the calorimeter energy measured in noise-suppressed clusters. Of the reconstructed jets, 16 (10) have transverse momenta above 50 GeV (80 GeV). Credit: ATLAS Collaboration/CERN

    In new results presented at CERN, the ATLAS Experiment’s search for supersymmetry (SUSY) reached new levels of sensitivity. The results examine a popular SUSY extension studied at the Large Hadron Collider (LHC): the “Minimal Supersymmetric Standard Model” (MSSM), which includes the minimum number of new particles and interactions required to make predictions at LHC energies. However, even this minimal model introduces a large number of new parameters (masses and other properties of the new particles) whose values are not predicted by the theory (free parameters).

    To frame their search, ATLAS physicists look for “natural” SUSY, which assumes that the various corrections to the Higgs mass are comparable in magnitude and that their sum is close to the electroweak scale (v ~ 246 GeV). Under this paradigm, the supersymmetric partners of the third-generation quarks (“top and bottom squarks”) and gluons (“gluinos”) could have masses close to the TeV scale, and would be produced through the strong interaction at rates large enough to be observed at the LHC.

    In a recent CERN LHC seminar, the ATLAS Collaboration presented new results in the search for natural SUSY, including searches for top squarks and gluinos using the full LHC Run-2 dataset collected between 2015 and 2018. The new results probe previously unexplored, challenging regions of the free-parameter space, thanks to new analysis techniques that improve the identification of low-energy (“soft”) and high-energy (“boosted”) particles in the final state.

    ATLAS’ search for top squarks was performed by selecting proton–proton collisions containing up to one electron or muon. For top-squark masses less than the top-quark mass of 173 GeV (see Figure 1), the resulting decay products tend to be soft and therefore difficult to identify. Physicists developed new techniques based on charged-particle tracking to better identify these decay products, thus significantly improving the experimental sensitivity. For larger top-squark masses, the decay products are boosted, resulting in high-energy, close-by decay products. Physicists improved the search in this regime by using, among other techniques, more precise estimates of the statistical significance of the missing transverse momentum in a collision event.

    Figure 1: Schematic representation of the various topologies of top-squark decays in the scenarios presented at today’s seminar (see link in footer). The region where the top-squark is lighter than the neutralino is not allowed in the models considered. Credit: ATLAS Collaboration/CERN

    The new search for gluinos looks at events containing eight or more “jets”—collimated sprays of hadrons—and missing transverse momentum generated by the production of stable neutralinos in the gluino decays, which, similar to neutrinos, are not directly detected by ATLAS. Physicists employed new reconstruction techniques to improve the energy resolution of the jets and the missing transverse momentum, allowing them to better separate the putative signal from background processes. These take advantage of “particle-flow” jet algorithms [https://arxiv.org/abs/1703.10485] that combine information from both the tracking detector and the calorimeter system.

    Figure 2: Updated exclusion limits on (left) gluino and (right) top-squark production including the new results presented by ATLAS at the CERN LHC seminar today. Credit: ATLAS Collaboration/CERN

    ATLAS physicists also optimised their event-selection criteria to enhance the contribution of possible SUSY signals compared to the Standard Model background processes. No excess was observed in the data. The results were used to derive exclusion limits on MSSM-inspired simplified models in terms of gluino, top-squark and neutralino masses (see Figure 2).

    The new analyses significantly extend the sensitivity of the searches and further constrain the available parameter space for natural SUSY. The exclusion of heavy top squarks is extended from 1 to 1.25 TeV. The search continues.

    More information: CERN LHC Seminar: Constraining natural supersymmetry with the ATLAS detector by Jeanette Miriam Lorenz indico.cern.ch/event/868249/

    See the full article here.



     
  • richardmitnick 6:47 pm on February 13, 2020
    Tags: "University of Chicago to build instrumentation for upgrades to the Large Hadron Collider", , , , , Particle Accelerators, , ,   

    From University of Chicago: “University of Chicago to build instrumentation for upgrades to the Large Hadron Collider” 


    Feb 13, 2020
    Natalie Lund

    The ATLAS detector at the Large Hadron Collider. UChicago scientists will build components for an upgrade to the detector. CERN.

    Faculty, students, engineers to design and build systems for ATLAS experiment.

    In 2012, scientists and the public around the world rejoiced at the news that CERN’s Large Hadron Collider had discovered the long-sought Higgs boson—a particle regarded as a linchpin in the Standard Model of particle physics, the theory that describes the fundamental forces and classifies all known elementary particles.


    Despite the breakthrough, subsequent collisions in the machine have yet to produce evidence of what physicists call “new physics”: science that could address the areas where the Standard Model seems to break down—like dark matter, dark energy and why there is more matter than antimatter. So now, the particle accelerator and its detectors are getting an upgrade.

    On Feb. 5, the National Science Foundation and the National Science Board gave the green light for $75 million in funding for upgrades to the ATLAS experiment, one of the collider’s two detectors that are each seven stories high and half a football field long—opening the doors for the discovery of new particles and rare processes. Approximately $5.5 million will go to the University of Chicago, a founding member of the ATLAS experiment, to design and build several components for the upgraded detector.

    “These upgrades will help the physics community answer glaring questions surrounding the structure of the fundamental particle universe,” said Asst. Prof. David Miller, a particle physicist who has worked extensively on the ATLAS detector and is co-leading the University’s participation in the upgrade. “Why do the fundamental particles that we know about exist in the first place? What is the underlying pattern and structure behind them?”

    The upgrades, which are estimated for completion in 2026, will allow researchers to study the Higgs boson in greater detail; continue the hunt for dark matter, which comprises 25% of our universe and has never been directly detected; and identify new particles, interactions, and physical properties such as new symmetries or spatial dimensions.

    The upgrades to the LHC itself will increase its luminosity—the intensity of its proton beams—by a factor of ten, substantially increasing the number of particle collisions that occur in a given amount of time. Thus the ATLAS detector, which is the “camera” capturing images of the collisions, must also be upgraded to filter larger quantities of data at high speeds and to deal with more intense radiation.

    “The biggest challenge with our existing detector is separating the signal from the background. For every interesting particle you produce, there are probably something like a million standard particle decays that look about the same,” said Prof. Mark Oreglia, a renowned expert in collider research and development and the other leader for the project.

    Researchers at the University of Chicago will build portions of the calorimeter, the system that measures the energy of the particles that enter the detector; and the trigger, which tells the detector which images, or “events,” to record or ignore.

    The challenge for the new calorimeter is building an instrument sensitive enough to instantaneously measure the light and energy coming off the 200 proton-proton collisions that will occur 40 million times per second—while remaining robust enough to withstand the accompanying intense radiation.
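
For scale, those figures multiply out to an enormous interaction rate (illustrative arithmetic only):

```python
# ~200 proton-proton collisions per crossing, 40 million crossings per second.
COLLISIONS_PER_CROSSING = 200
CROSSINGS_PER_SECOND = 40e6

collisions_per_second = COLLISIONS_PER_CROSSING * CROSSINGS_PER_SECOND
print(f"{collisions_per_second:.0e} collisions per second")  # 8e+09
```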

    UChicago researchers already have built prototypes of some components and sent them for rigorous testing to ensure they could withstand the LHC’s increased intensity. Construction of electronics is slated to begin this spring, with undergraduate students participating in the testing of the boards to look for short circuits and other flaws.

    CERN staff member Irakli Minashvili asks UChicago undergraduate student Hadar Lazar for the results of a test she is running on ATLAS detector electronics. Courtesy Mark Oreglia.

    Another challenge posed by the upgraded LHC is the volume of data produced by the collisions.

    “In their raw form, the data volume is nearly one petabyte per second, so there’s no way we can save that amount,” Miller said. “We have to come up with clever ways to determine what to keep and what to throw away.”
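
The scale of that filtering problem can be made concrete with a rough budget. The affordable storage bandwidth below is an assumed round number for illustration, not an ATLAS figure:

```python
# How hard must a trigger filter to tame a ~1 PB/s raw data stream?
RAW_BYTES_PER_S = 1e15       # ~1 petabyte per second, as quoted above
STORABLE_BYTES_PER_S = 10e9  # assumption: ~10 GB/s can be written to storage

rejection_factor = RAW_BYTES_PER_S / STORABLE_BYTES_PER_S
print(f"must discard all but ~1 part in {rejection_factor:.0e}")  # 1e+05
```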

    Miller’s team is partnering with UChicago Computer Science faculty Assoc. Prof. Risi Kondor and Asst. Prof. Yuxin Chen, tapping their pioneering work in machine learning to develop innovative algorithms and software to tackle this unprecedented task.

    “Machine learning helps us detect patterns in the data and uncover features that we might not otherwise have seen,” Miller said. “For example, I’m working with Risi Kondor to build a completely new type of neural network whose inherent structure reflects known symmetries of nature.”

    The $75 million from the National Science Foundation will complement $163 million in funding from the U.S. Department of Energy to support the upgrade to the detector.

    The project will involve multiple universities as well as national laboratories. ATLAS is a large international collaboration consisting of 3000 scientists from 182 institutions and 38 countries.

    Additional researchers and groups involved with the construction are Young-Kee Kim, the Louis Block Distinguished Service Professor of Physics; Melvyn Shochet, the Kersten Distinguished Service Professor of Physics; and the staff of the Enrico Fermi Institute’s Electronics Development Group (EDG) and MANIAC Lab: EDG director Mary Heintz, EDG research professor Kelby Anderson, EDG engineers Mircea Bogdan and Fukun Tang, and MANIAC Lab director research professor Robert Gardner, who heads an NSF effort called Scalable Systems Laboratory to develop data processing systems for all of the data collected and to provide platforms for complex data analysis tasks.

    See the full article here.


    An intellectual destination

    One of the world’s premier academic and research institutions, the University of Chicago has driven new ways of thinking since our 1890 founding. Today, UChicago is an intellectual destination that draws inspired scholars to our Hyde Park and international campuses, keeping UChicago at the nexus of ideas that challenge and change the world.

    The University of Chicago is an urban research university that has driven new ways of thinking since 1890. Our commitment to free and open inquiry draws inspired scholars to our global campuses, where ideas are born that challenge and change the world.

    We empower individuals to challenge conventional thinking in pursuit of original ideas. Students in the College develop critical, analytic, and writing skills in our rigorous, interdisciplinary core curriculum. Through graduate programs, students test their ideas with UChicago scholars, and become the next generation of leaders in academia, industry, nonprofits, and government.

    UChicago research has led to such breakthroughs as discovering the link between cancer and genetics, establishing revolutionary theories of economics, and developing tools to produce reliably excellent urban schooling. We generate new insights for the benefit of present and future generations with our national and affiliated laboratories: Argonne National Laboratory, Fermi National Accelerator Laboratory, and the Marine Biological Laboratory in Woods Hole, Massachusetts.

    The University of Chicago is enriched by the city we call home. In partnership with our neighbors, we invest in Chicago’s mid-South Side across such areas as health, education, economic growth, and the arts. Together with our medical center, we are the largest private employer on the South Side.

    In all we do, we are driven to dig deeper, push further, and ask bigger questions—and to leverage our knowledge to enrich all human life. Our diverse and creative students and alumni drive innovation, lead international conversations, and make masterpieces. Alumni and faculty, lecturers and postdocs go on to become Nobel laureates, CEOs, university presidents, attorneys general, literary giants, and astronauts.

     
  • richardmitnick 6:47 pm on February 11, 2020
    Tags: Particle Accelerators, Proton Synchrotron prepared for higher injection energies

    From CERN: “LS2 Report: Proton Synchrotron prepared for higher injection energies” 


    11 February, 2020
    Achintya Rao

    CERN’s oldest working accelerator has a new injection kicker magnet and will soon receive a new septum as well.

    The new kicker for the PS being installed in the accelerator (Image: Julien Ordan/CERN)

    CERN Proton Synchrotron

    Proton beams entering the Proton Synchrotron (PS) from the PS Booster have to be deflected into a circulating orbit before they can be accelerated. This is done by two specialised beam-line elements: a strong magnetic septum and a fast injection-kicker magnet. The latter is a precisely synchronised electromagnet that can be switched on and off in about 100 ns, providing a stable and uniform kick that only affects the injected beam batches, while leaving the already circulating beam unperturbed.

    After the ongoing second long shutdown of CERN’s accelerator complex (LS2), the PS Booster will accelerate particles to 2 GeV, almost 50% higher than the pre-LS2 value of 1.4 GeV. The PS therefore needed a new septum and a new kicker capable of coping with this increased injection energy. On 31 January, as part of the LHC Injectors Upgrade (LIU) project, the new kicker magnet was installed, replacing the kicker that had operated since 1979. The magnet will soon be aligned, connected to the vacuum system and then connected to the power and control cables.

    Like the magnet it replaced, the PS’s new kicker is made of four identical modules sitting in a 1-metre-long vacuum tank. Each module receives power from a separate pulse generator that consists of two high-power electrical switches – a main switch and a dump switch to control the pulse length – and around 280 metres of a so-called “pulse-forming line”, wound and stored on gigantic drums. These lines are thick, coaxial cables filled with sulphur hexafluoride (SF6) at a pressure of 10 bars, to provide the necessary insulation for the charging voltage of 80 kV. Since SF6 is a strong greenhouse gas, special care has to be taken to ensure that it is safely manipulated and recuperated, and that the system has no leaks.
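
The quoted line length gives a feel for the pulse such a system produces: in a pulse-forming line, the flat-top of the output pulse equals the line's round-trip transit time. The propagation speed below is an assumption (taken as roughly c, since a gas dielectric like SF6 has a relative permittivity close to 1); this is a rough sketch, not a CERN specification:

```python
# Flat-top length of a pulse-forming-line (PFL) discharge: tau = 2L/v.
C = 3.0e8            # speed of light, m/s
LINE_LENGTH_M = 280  # pulse-forming-line length quoted above
v = C                # assumed propagation speed (gas dielectric, eps_r ~ 1)

pulse_length_s = 2 * LINE_LENGTH_M / v
print(f"~{pulse_length_s * 1e6:.1f} microseconds")  # ~1.9 microseconds
```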

    In order to reduce the dependence on the SF6-based cables, part of the transmission line between the pulse generator and the magnet was replaced with conventional cables. “Disconnecting the SF6 cables from the magnet to connect the reserves was a two-person job, and required time-consuming gas-handling procedures to be followed,” explains Thomas Kramer from the TE-ABT (Accelerator Beam Transfer) group. “On the other hand, the new conventional cables have quick-release connectors and can be operated by one person fairly quickly.”

    Kramer and colleagues also replaced the old analogue control system for the kicker, parts of which had been in place since the system was constructed in the 1970s. “Things made back then still work reliably,” smiles Kramer, while noting that the new digital systems make it possible to monitor the situation remotely.

    One element that remains to be installed is the new septum. This is a delicate device used in the injection system, composed of two cavities separated by a thin wall: one cavity allows the beams from the PS Booster to enter the PS while the second is meant for the circulating beams. The new septum, which required construction of a novel power converter, will be installed upstream of the magnet in the coming weeks.

    See the full article here.


    five-ways-keep-your-child-safe-school-shootings
    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS


    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Tunnel

    CERN LHC particles

     
  • richardmitnick 9:54 am on February 6, 2020
    Tags: "Could the next generation of particle accelerators come out of the 3D printer?", , , , Consortium on the Properties of Additive-Manufactured Copper, Particle Accelerators, , ,   

    From SLAC National Accelerator Lab: “Could the next generation of particle accelerators come out of the 3D printer?” 


    February 5, 2020
    Jennifer Huber

    SLAC scientists and collaborators are developing 3D copper printing techniques to build accelerator components.

    Imagine being able to manufacture complex devices whenever you want and wherever you are. It would create unforeseen possibilities even in the most remote locations, such as building spare parts or new components on board a spacecraft. 3D printing, or additive manufacturing, could be a way of doing just that. All you would need is the materials the device will be made of, a printer and a computer that controls the process.

    Diana Gamzina, a staff scientist at the Department of Energy’s SLAC National Accelerator Laboratory; Timothy Horn, an assistant professor of mechanical and aerospace engineering at North Carolina State University; and researchers at RadiaBeam Technologies dream of developing the technique to print particle accelerators and vacuum electronic devices for applications in medical imaging and treatment, the electrical grid, satellite communications, defense systems and more.

    Examples of 3D-printed copper components that could be used in a particle accelerator: X-band klystron output cavity with micro-cooling channels (at left) and a set of coupled accelerator cavities. (Christopher Ledford/North Carolina State University)

    In fact, the researchers are closer to making this a reality than you might think.

    “We’re trying to print a particle accelerator, which is really ambitious,” Gamzina said. “We’ve been developing the process over the past few years, and we can already print particle accelerator components today. The whole point of 3D printing is to make stuff no matter where you are without a lot of infrastructure. So you can print your particle accelerator on a naval ship, in a small university lab or somewhere very remote.”

    3D printing can be done with liquids and powders of numerous materials, but there aren’t any well-established processes for 3D printing ultra-high-purity copper and its alloys – the materials Gamzina, Horn and their colleagues want to use. Their research focuses on developing the method.

    Indispensable copper

    Accelerators boost the energy of particle beams, and vacuum electronic devices are used in amplifiers and generators. Both rely on components that can be easily shaped and conduct heat and electricity extremely well. Copper has all of these qualities and is therefore widely used.

    Traditionally, each copper component is machined individually and bonded with others using heat to form complex geometries. This manufacturing technique is incredibly common, but it has its disadvantages.

    “Brazing together multiple parts and components takes a great deal of time, precision and care,” Horn said. “And any time you have a joint between two materials, you add a potential failure point. So, there is a need to reduce or eliminate those assembly processes.”

    Potential of 3D copper printing

    3D printing of copper components could offer a solution.

    It works by layering thin sheets of materials on top of one another and slowly building up specific shapes and objects. In Gamzina’s and Horn’s work, the material used is extremely pure copper powder.

    The process starts with a 3D design, or “construction manual,” for the object. Controlled by a computer, the printer spreads a few-micron-thick layer of copper powder on a platform. It then moves the platform about 50 microns – half the thickness of a human hair – and spreads a second copper layer on top of the first, heats it with an electron beam to about 2,000 degrees Fahrenheit and welds it with the first layer. This process repeats over and over until the entire object has been built.
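
The loop described above is easy to quantify: at roughly 50 microns per layer, even a small part takes thousands of passes. (Illustrative numbers; only the layer thickness comes from the article.)

```python
LAYER_THICKNESS_M = 50e-6  # the platform drops ~50 microns per layer

def layers_needed(part_height_m):
    """Number of powder layers required to build a part of a given height."""
    return round(part_height_m / LAYER_THICKNESS_M)

print(layers_needed(0.10))  # a 10 cm tall part needs 2000 layers
```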


    3D printing of a layer of a device known as a traveling wave tube using copper powder. (Christopher Ledford/North Carolina State University)

    The amazing part: no specific tooling, fixtures or molds are needed for the procedure. As a result, 3D printing eliminates design constraints inherent in traditional fabrication processes and allows the construction of objects that are uniquely complex.

    “The shape doesn’t really matter for 3D printing,” said SLAC staff scientist Chris Nantista, who designs and tests 3D-printed samples for Gamzina and Horn. “You just program it in, start your system and it can build up almost anything you want. It opens up a new space of potential shapes.”

    The team took advantage of that, for example, when building part of a klystron – a specialized vacuum tube that amplifies radiofrequency signals – with internal cooling channels at NCSU. Building it in one piece improved the device’s heat transfer and performance.

    Compared to traditional manufacturing, 3D printing is also less time consuming and could translate into cost savings of up to 70%, Gamzina said.

    A challenging technique

    But printing copper devices has its own challenges, as Horn, who began developing the technique with collaborators from RadiaBeam years ago, knows. One issue is finding the right balance between the thermal and electrical properties and the strength of the printed objects. The biggest hurdle for manufacturing accelerators and vacuum electronics, though, is that these high-vacuum devices require extremely pure, high-quality materials to avoid part failures such as cracking or vacuum leaks.

    The research team tackled these challenges by first improving the material’s surface quality, using finer copper powder and varying the way they fused layers together. However, using finer copper powder led to the next challenge. It allowed more oxygen to attach to the copper powder, increasing the oxide in each layer and making the printed objects less pure.

    So, Gamzina and Horn had to find a way to reduce the oxygen content in their copper powders. The method they came up with, which they recently reported in Applied Sciences, relies on hydrogen gas to bind oxygen into water vapor and drive it out of the powder.

    Using this method is somewhat surprising, Horn said. In a traditionally manufactured copper object, the formation of water vapor would create high-pressure steam bubbles inside the material, and the material would blister and fail. In the additive process, on the other hand, the water vapor escapes layer by layer, and is therefore released much more effectively.

    Although the technique has shown great promise, the scientists still have a ways to go to reduce the oxygen content enough to print an actual particle accelerator. But they have already succeeded in printing a few components, such as the klystron output cavity with internal cooling channels and a string of coupled cavities that could be used for particle acceleration.

    Planning to team up with industry partners

    The next phase of the project will be driven by the newly-formed Consortium on the Properties of Additive-Manufactured Copper, which is led by Horn. The consortium currently has four active industry members – Siemens, GE Additive, RadiaBeam and Calabazas Creek Research – with more on the way.

    “This would be a nice example of collaboration between an academic institution, a national lab and small and large businesses,” Gamzina said. “It would allow us to figure out this problem together. Our work has already allowed us to go from ‘just imagine, this is crazy’ to ‘we can do it’ in less than two years.”

    This work was primarily funded by the Naval Sea Systems Command, as a Small Business Technology Transfer Program with RadiaBeam, SLAC, and NCSU. Other SLAC contributors include Chris Pearson, Andy Nguyen, Arianna Gleason, Apurva Mehta, Kevin Stone, Chris Tassone and Johanna Weker. Additional contributions came from Christopher Ledford and Christopher Rock at NCSU and Pedro Frigola, Paul Carriere, Alexander Laurich, James Penney and Matt Heintz at RadiaBeam.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition



    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

    SSRL and LCLS are DOE Office of Science user facilities.

     
  • richardmitnick 2:48 pm on February 5, 2020 Permalink | Reply
    Tags: "ISOLDE steps into unexplored region of the nuclear chart to study exotic isotopes", 207Hg is a relatively close neighbor of r-process nuclei lying in this almost unexplored region. As such could help reveal some of the nuclear secrets of r-process nuclei and hence shed light on the o, A first study of the neutron structure of the mercury isotope 207Hg, , , , , , Particle Accelerators, , , psyg.org, Rapid neutron-capture process or “r-process”, This isotope is not directly involved in the rapid neutron-capture process, This study was possible thanks to three things: the HIE-ISOLDE accelerator system; the installation of the ISS; and last but not least a particle-detector system from the Argonne National Laboratory.   

    From CERN via phys.org: “ISOLDE steps into unexplored region of the nuclear chart to study exotic isotopes” 



    From CERN

    via


    phys.org

    February 5, 2020

    CERN ISOLDE Looking down into the ISOLDE experimental hall

    Instrumentation inside the ISOLDE Solenoidal Spectrometer. Credit: Ben Kay, Argonne National Laboratory

    Many heavy elements, such as gold, are thought to form in cosmic environments rich in neutrons—think supernovae or mergers of neutron stars. In these extreme settings, atomic nuclei can rapidly capture neutrons and become heavier, creating new elements. At the far reaches of the nuclear chart, which arranges all known nuclei according to their number of protons and neutrons, lie unexplored nuclei that are crucial to understanding the details of this rapid neutron-capture process. This is especially the case for nuclei with fewer than 82 protons and more than 126 neutrons.

    Researchers using CERN’s nuclear-physics facility ISOLDE have now stepped into this nearly uncharted region of the nuclear chart with a first study of the neutron structure of the mercury isotope 207Hg. This isotope is not directly involved in the rapid neutron-capture process, or “r-process,” but it is a relatively close neighbor of r-process nuclei lying in this almost unexplored region. As such, 207Hg could help reveal some of the nuclear secrets of r-process nuclei and hence shed light on the origin of heavy elements.

    To study the neutron structure of 207Hg, the researchers first took 206Hg isotopes that were produced along with hundreds of other exotic isotopes at ISOLDE by firing a 1.4 billion electronvolt proton beam from the Proton Synchrotron Booster onto a molten lead target.

    CERN The Proton Synchrotron Booster

    The 206Hg isotopes, which have one fewer neutron in the nucleus than 207Hg, were then accelerated in the facility’s HIE-ISOLDE accelerator to an energy of about 1.52 billion electronvolts—the highest energy ever achieved at HIE-ISOLDE. The researchers then focused the 206Hg isotopes at a deuterium target inside the ISOLDE Solenoidal Spectrometer (ISS), a newly developed magnetic spectrometer that was able to reveal events in which the 206Hg isotopes captured a neutron and turned into excited 207Hg isotopes.
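    Beam energies at heavy-ion machines like HIE-ISOLDE are often quoted per nucleon, so the figure above is easy to sanity-check against the roughly 10 MeV per nucleon capability mentioned later in the article:

    ```python
    # Divide the total kinetic energy of the 206Hg beam (the value quoted
    # above) by its mass number to get the energy per nucleon.
    total_energy_ev = 1.52e9   # total beam energy, in electronvolts
    mass_number = 206          # nucleons in a 206Hg nucleus

    energy_per_nucleon_mev = total_energy_ev / mass_number / 1e6
    print(round(energy_per_nucleon_mev, 2))  # ~7.38 MeV per nucleon
    ```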

    CERN ISOLDE Solenoidal Spectrometer

    From the analysis of these events, the researchers determined the binding energies of the nuclear orbitals into which the neutron is captured, that is, the degree to which the captured neutron is bound to the other neutrons and protons. They then fed these results into theoretical models of the r-process to test and challenge these models.

    “This result marks the first exploration of the neutron structure of the 207Hg nucleus, paving the way for future experimental studies, with the ISS instrument at ISOLDE and at next-generation nuclear-physics facilities, of the almost uncharted nuclear region where 207Hg lies,” says principal investigator Ben Kay from Argonne National Laboratory, where the technique that underlies the ISS was pioneered.

    “This study was possible thanks to three things: the completed HIE-ISOLDE accelerator system, which now allows radioactive isotopes to be accelerated to energies close to 10 million electronvolts per proton or neutron; the installation of the ISS, a former MRI magnet repurposed for studies of exotic nuclei by a collaboration from the UK, Belgium and CERN; and, last but not least, a particle-detector system that was supplied by the Argonne National Laboratory and allowed the experiment to be performed just before the beginning of the ongoing shutdown of CERN’s accelerator complex,” explained ISOLDE spokesperson Gerda Neyens.

    More information: T. L. Tang, et al. First Exploration of Neutron Shell Structure Below Lead and Beyond N=126.
    https://arxiv.org/abs/2001.00976

    See the full article here.



    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS


    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Tunnel

    CERN LHC particles

     
  • richardmitnick 1:38 pm on January 29, 2020 Permalink | Reply
    Tags: "Particle Physics Turns to Quantum Computing for Solutions to Tomorrow’s Big-Data Problems", , , , , Particle Accelerators, ,   

    From Lawrence Berkeley National Lab: “Particle Physics Turns to Quantum Computing for Solutions to Tomorrow’s Big-Data Problems” 

    From Lawrence Berkeley National Lab

    January 29, 2020
    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 486-5582

    Berkeley Lab researchers testing several techniques, technologies to be ready for the incoming deluge of particle data.

    Display of a simulated High-Luminosity Large Hadron Collider (HL-LHC) particle collision event in an upgraded ATLAS detector. The event has an average of 200 collisions per particle bunch crossing. (Credit: ATLAS Collaboration/CERN)

    Giant-scale physics experiments are increasingly reliant on big data and complex algorithms fed into powerful computers, and managing this multiplying mass of data presents its own unique challenges.

    To better prepare for this data deluge posed by next-generation upgrades and new experiments, physicists are turning to the fledgling field of quantum computing to find faster ways to analyze the incoming info.

    A series of articles, listed at the end of this story, profiles three student researchers who have participated in Berkeley Lab-led research projects in quantum computing; click on a name or photo in the series to learn more.

    In a conventional computer, memory takes the form of a large collection of bits, and each bit has only two values: a one or zero, akin to an on or off position. In a quantum computer, meanwhile, data is stored in quantum bits, or qubits. A qubit can represent a one, a zero, or a mixed state in which it is both a one and a zero at the same time.

    By tapping into this and other quantum properties, quantum computers hold the potential to handle larger datasets and quickly work through some problems that would trip up even the world’s fastest supercomputers. For other types of problems, though, conventional computers will continue to outperform quantum machines.
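    The superposition idea can be made concrete in a few lines. This is a generic textbook illustration, not code from the Berkeley Lab project: a qubit state is a pair of complex amplitudes whose squared magnitudes give the probabilities of measuring a zero or a one.

    ```python
    import math

    # A qubit state is a normalized pair of complex amplitudes (a, b) for the
    # basis states |0> and |1>; |a|^2 and |b|^2 are the measurement probabilities.
    amp = 1 / math.sqrt(2)
    psi = (amp + 0j, amp + 0j)           # equal superposition: "one and zero at once"

    probs = [abs(a) ** 2 for a in psi]   # Born rule
    print([round(p, 3) for p in probs])  # [0.5, 0.5]: either outcome, 50/50
    assert math.isclose(sum(probs), 1.0)  # amplitudes must stay normalized
    ```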

    The High Luminosity Large Hadron Collider (HL-LHC) Project, a planned upgrade of the world’s largest particle accelerator at the CERN laboratory in Europe, will come on line in 2026.


    It will produce billions of particle events per second – five to seven times more data than its current maximum rate – and CERN is seeking new approaches to rapidly and accurately analyze this data.
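    That data rate follows directly from two numbers: the LHC's nominal 25-nanosecond bunch spacing and the average of about 200 collisions per bunch crossing expected at the HL-LHC. A quick order-of-magnitude check:

    ```python
    # Order-of-magnitude check on "billions of particle events per second",
    # assuming the LHC's nominal 25 ns bunch spacing and an average HL-LHC
    # pileup of about 200 collisions per bunch crossing.
    bunch_spacing_s = 25e-9
    collisions_per_crossing = 200

    crossing_rate_hz = 1 / bunch_spacing_s            # ~40 million crossings/s
    collisions_per_second = crossing_rate_hz * collisions_per_crossing
    print(f"{collisions_per_second:.1e} collisions per second")  # ~8.0e+09
    ```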

    In these particle events, positively charged subatomic particles called protons collide, producing sprays of other particles, including quarks and gluons, from the energy of the collision. The interactions of particles can also cause other particles – like the Higgs boson – to pop into existence.

    Tracking the creation and precise paths (called “tracks”) of these particles as they travel through layers of a particle detector – while excluding the unwanted mess, or “noise” produced in these events – is key in analyzing the collision data.

    The data will be like a giant 3D connect-the-dots puzzle that contains many separate fragments, with little guidance on how to connect the dots.

    To address this next-gen problem, a group of student researchers and other scientists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have been exploring a wide range of new solutions.

    One such approach is to develop and test a variety of algorithms tailored to different types of quantum-computing systems. Their aim: Explore whether these technologies and techniques hold promise for reconstructing these particle tracks better and faster than conventional computers can.

    Particle detectors work by detecting energy that is deposited in different layers of the detector materials. In the analysis of detector data, researchers work to reconstruct the trajectory of specific particles traveling through the detector array. Computer algorithms can aid this process through pattern recognition, and particles’ properties can be detailed by connecting the dots of individual “hits” collected by the detector and correctly identifying individual particle trajectories.

    A new wheel-shaped muon detector is part of the ATLAS detector upgrade at CERN. This wheel-shaped detector measures more than 30 feet in diameter. (Credit: Julien Marius Ordan/CERN)

    Heather Gray, an experimental particle physicist at Berkeley Lab and a UC Berkeley physics professor, leads the Berkeley Lab-based R&D effort – Quantum Pattern Recognition for High-Energy Physics (HEP.QPR) – that seeks to identify quantum technologies to rapidly perform this pattern-recognition process in very-high-volume collision data. This R&D effort is funded as part of the DOE’s QuantISED (Quantum Information Science Enabled Discovery for High Energy Physics) portfolio.

    The HEP.QPR project is also part of a broader initiative to boost quantum information science research at Berkeley Lab and across U.S. national laboratories.

    Other members of the HEP.QPR group are: Wahid Bhimji, Paolo Calafiura, Wim Lavrijsen, and former postdoctoral researcher Illya Shapoval, who explored quantum algorithms for associative memory. Bhimji is a big data architect at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC). Calafiura is chief software architect of CERN’s ATLAS experiment and a member of Berkeley Lab’s Computational Research Division (CRD). And Lavrijsen is a CRD software engineer who is also involved in CERN’s ATLAS experiment.

    Members of the HEP.QPR project have collaborated with researchers at the University of Tokyo and from Canada on the development of quantum algorithms in high-energy physics, and jointly organized a Quantum Computing Mini-Workshop at Berkeley Lab in October 2019.

    Gray and Calafiura were also involved in a CERN-sponsored competition, launched in mid-2018, that challenged computer scientists to develop machine-learning-based techniques to accurately reconstruct particle tracks using a simulated set of HL-LHC data known as TrackML. Machine learning is a form of artificial intelligence in which algorithms can become more efficient and accurate through a gradual training process akin to human learning. Berkeley Lab’s quantum-computing effort in particle-track reconstruction also utilizes this TrackML set of simulated data.

    Berkeley Lab and UC Berkeley are playing important roles in the rapidly evolving field of quantum computing through their participation in several quantum-focused efforts, including The Quantum Information Edge, a research alliance announced in December 2019.

    The Quantum Information Edge is a nationwide alliance of national labs, universities, and industry advancing the frontiers of quantum computing systems to address scientific challenges and maintain U.S. leadership in next-generation information technology. It is led by the DOE’s Berkeley Lab and Sandia National Laboratories.

    The series of articles listed below profiles three student researchers who have participated in Berkeley Lab-led efforts to apply quantum computing to the pattern-recognition problem in particle physics:

    Lucy Linder, while working as a researcher at Berkeley Lab, developed her master’s thesis – supervised by Berkeley Lab staff scientist Paolo Calafiura – about the potential application of a quantum-computing technique called quantum annealing for finding particle tracks. She remotely accessed quantum-computing machines at D-Wave Systems Inc. in Canada and at Los Alamos National Laboratory in New Mexico.

    Linder’s approach was to first format the particle-track simulated data as something known as a QUBO (quadratic unconstrained binary optimization) problem that formulated the problem as an equation with binary values: either a one or a zero. This QUBO formatting also helped prepare the data for analysis by a quantum annealer, which uses qubits to help identify the best possible solution by applying a physics principle that describes how objects naturally seek the lowest-possible energy state.
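    To see what a QUBO looks like in miniature, consider the hypothetical toy problem below (this is not Linder's actual encoding): binary variables mark candidate track segments as kept or discarded, the Q matrix rewards pairs of segments that line up into a smooth track and penalizes pairs that conflict, and the answer is the bit string minimizing the quadratic energy. A quantum annealer searches for that minimum; with only three variables we can enumerate all eight bit strings.

    ```python
    import itertools

    # Assumed example Q matrix for 3 candidate track segments: segments 0 and 1
    # are compatible (negative coupling), segment 2 conflicts with both.
    Q = {
        (0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0,   # bias: keeping a segment is good
        (0, 1): -0.5,                               # 0 and 1 form a smooth track
        (0, 2): 2.0, (1, 2): 2.0,                   # 2 conflicts with 0 and 1
    }

    def energy(x):
        """QUBO objective E(x) = sum of Q[i, j] * x_i * x_j over binary x."""
        return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

    # Brute-force minimization, standing in for the annealer on this toy case.
    best = min(itertools.product([0, 1], repeat=3), key=energy)
    print(best)  # (1, 1, 0): keep the compatible pair, drop the conflicting segment
    ```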

    Eric Rohm, an undergraduate student working on a contract at Berkeley Lab as part of the DOE’s Science Undergraduate Laboratory Internship program, developed a quantum approximate optimization algorithm (QAOA) using quantum-computing resources at Rigetti Computing in Berkeley, California. He was supervised by Berkeley Lab physicist Heather Gray.

    This approach used a blend of conventional and quantum computing techniques to develop a custom algorithm. The algorithm, still in refinement, has been tested on the Rigetti Quantum Virtual Machine, a conventional computer that simulates a small quantum computer. The algorithm may eventually be tested on a Rigetti quantum processing unit that is equipped with actual qubits.

    Amitabh Yadav, a student research associate at Berkeley Lab since November who is supervised by Gray and Berkeley Lab software engineer Wim Lavrijsen, is working to apply a quantum version of a conventional technique called the Hough transform to identify and reconstruct particle tracks using IBM’s Quantum Experience, a cloud-based quantum-computing platform.

    The classical Hough transform can be used to detect specific features such as lines, curves, and circles in complex patterns, and the quantum version could potentially pick out more complex shapes from exponentially larger datasets.
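    The classical version is compact enough to sketch in full. In the toy example below (illustrative only, not the project's code), each hit votes for every line that could pass through it, parameterized as rho = x·cos(theta) + y·sin(theta); hits that lie on a common line concentrate their votes in a single accumulator cell.

    ```python
    import math
    from collections import Counter

    def hough_lines(points, n_theta=180, rho_step=0.5):
        """Vote each point into (rho, theta) accumulator cells."""
        votes = Counter()
        for x, y in points:
            for k in range(n_theta):
                theta = math.pi * k / n_theta
                rho = x * math.cos(theta) + y * math.sin(theta)
                votes[(round(rho / rho_step), k)] += 1
        return votes

    # Five collinear hits on the line y = x, plus one noise hit.
    hits = [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4), (5, 1)]
    top_cell, top_votes = hough_lines(hits).most_common(1)[0]
    print(top_votes)  # 5: the collinear hits all land in one accumulator cell
    ```

    In a detector the "points" are hits in successive layers, and the peak cells in the accumulator identify candidate track parameters.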

    See the full article here.



    Bringing Science Solutions to the World
    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (UC) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a UC Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    A U.S. Department of Energy National Laboratory Operated by the University of California.


     
  • richardmitnick 10:05 am on January 24, 2020 Permalink | Reply
    Tags: , , , , Particle Accelerators, , , The team has been able to ramp up the machine to 500 milliamperes (mA) of current and to keep this current stable for more than six hours.,   

    From Brookhaven National Lab: “NSLS-II Achieves Design Beam Current of 500 Milliamperes” 

    From Brookhaven National Lab

    January 22, 2020
    Cara Laasch
    laasch@bnl.gov

    Accelerator division enables new record current during studies.

    The NSLS-II accelerator division proudly gathered to celebrate their recent achievement. The screen above them shows the slow increase of the electron current in the NSLS-II storage ring and its stability.

    The National Synchrotron Light Source II (NSLS-II) at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory is a gigantic x-ray microscope that allows scientists to study the inner structure of all kinds of materials and devices in real time under realistic operating conditions. The scientists using the machine are seeking answers to questions such as how we can build longer-lasting batteries, when life started on our planet, and what kinds of new materials can be used in quantum computers, along with many other questions in a wide variety of research fields.

    The heart of the facility is a particle accelerator that circulates electrons at nearly the speed of light around the roughly half-a-mile-long ring. Steered by special magnets within the ring, the electrons generate ultrabright x-rays that enable scientists to address the broad spectrum of research at NSLS-II.

    Now, the accelerator division at NSLS-II has reached a new milestone for machine performance. During recent accelerator studies, the team has been able to ramp up the machine to 500 milliamperes (mA) of current and to keep this current stable for more than six hours. Similar to a current in a river, the current in an accelerator is a measure of the number of electrons that circulate the ring at any given time. In NSLS-II’s case, a higher electron current opens the pathway to more intense x-rays for all the experiments happening at the facility.
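    Those 500 milliamperes can be converted into an electron count with the ring parameters quoted above. The sketch below assumes NSLS-II's published 792-meter circumference for the "roughly half-a-mile-long" ring and electrons traveling at essentially the speed of light.

    ```python
    # Rough estimate of what a 500 mA stored current means in electron counts.
    E_CHARGE = 1.602e-19       # C, elementary charge
    C_LIGHT = 2.998e8          # m/s, speed of light
    CIRCUMFERENCE = 792.0      # m, NSLS-II storage ring (assumed value)

    revolution_time = CIRCUMFERENCE / C_LIGHT          # ~2.6 microseconds per lap
    current = 0.500                                    # A, the new record current

    # Current = (total circulating charge) / (revolution time), so invert:
    n_electrons = current * revolution_time / E_CHARGE
    print(f"{n_electrons:.2e}")  # ~8e12 electrons circulating at once
    ```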

    “Since we turned on the machine for the first time in 2014 with 50mA current, we have progressed steadily upwards in current and now – in just five years – we have reached 500mA,” said Timur Shaftan, NSLS-II accelerator division director. “Along the way, we encountered many significant challenges, and it is thanks to the dedication, knowledge, and expertise of the team that we were able to overcome them all to get here.”

    All good things come in threes?

    On their quest to a higher current, the accelerator division faced three major challenges: an increase in power consumption of the radiofrequency (RF) accelerating cavities, more intense “wakefields,” and the unexpected heating of some accelerator components.

    The purpose of the RF accelerating cavities can be compared to pushing a child on a swing – with the child being the electrons. With the correct timing, large amplitudes can be driven with little effort. The cavities feed more and more energy to the electrons to compensate for the energy the electrons lose as they generate x-rays in their trips around the ring.

    “The cavities use electricity to push the electrons forward, and even though our cavities are very efficient, they still draw a good amount of raw power,” said Jim Rose, RF group leader. “To reach 500 mA, we monitored this increase closely to ensure that we wouldn’t cross our limit for power, which we didn’t. However, there is another challenge we now have to face: The cavities compress the groups of electrons—we call them bunches—that rush through the machine, and by doing so they increase the heating issues that we face. To fully address this in the future, we will install other cavities of a different RF frequency that would lengthen the bunches again.”

    Rose is referring to the issue of “wakefields.” As the electrons speed around the ring, they create so-called wakefields—just like when you run your finger through still water and create waves that roll on even after your finger is gone. In the same way, the rushing electrons generate a front of electric fields that follow them around the ring.

    “Having more intense wakefields causes two challenges: First, these fields influence the next set of electrons, causing them to lose energy and become unstable, and second, they heat up the vacuum chamber in which the beam travels,” said accelerator physicist Alexei Blednykh. “One of the limiting components in our efforts to reach 500mA was the ceramic vacuum chambers, because they were overheating. We mitigated the effect by installing additional cooling fans. However, to fully solve the issue we will need to replace the existing chambers with new chambers that have a thin titanium coating on the inside.”

    The accelerator division decided to coat all the new vacuum chambers in house using a technique called direct current magnetron sputtering. During the sputtering process, a titanium target is bombarded with ionized gas so that it ejects millions of titanium atoms, which spray onto the surface of the vacuum chamber to create a thin metal film.

    “At first, coating chambers sounds easy enough, but our chambers are long and narrow, which forces you to think differently about how to apply the coating. We had to design a coating system capable of handling the geometry of our chambers,” said vacuum group leader Charlie Hetzel. “Once we developed a system that could be used to coat the chambers, we had to develop a method that could accurately measure the thickness and uniformity along the entire length of the chamber.”

    For the vacuum chambers to survive the machine at high current, the coatings had to meet a number of demanding requirements in terms of their adhesion, thickness, and uniformity.

    The third challenge the team needed to overcome was resolving the unexpected heating found between some of the vacuum flanges. Each of the vacuum joints around the half-mile-long accelerator contains a delicate RF bridge. Any errors during installation can result in additional heating and risk to the vacuum seal of the machine.

    “We knew from the beginning that increasing the current to 500 mA would be hard on the machine, however, we needed to know exactly where the real hot spots were,” explained accelerator coordination group leader Guimei Wang. “So, we installed more than 1000 temperature sensors around the whole machine, and we ran more than 400 hours of high-current beam studies over the past three years, where we monitored the temperature, vacuum, and many other parameters of the electrons very closely to really understand how our machine is behaving.”

    Based on all these studies and many more hours spent analyzing each study run, the accelerator team made the necessary decisions as to which parts needed to be coated or changed and, most importantly, how to run the machine at such a high current safely and reliably.

    Where do we go from here?

    Achieving 500mA during beam studies was an important step to begin to shed light on the physics within the machine at these high currents, as well as to understand the present limits of the accelerator. Equipped with these new insights, the accelerator division now knows that their machine can reach the 500mA current for a short time, but at this point it’s not possible to sustain high current for operations over extended periods with the RF power necessary to deliver it to users. To run the machine at this current, NSLS-II’s accelerator will need additional RF systems both to lengthen the bunches and to secure high reliability of operations, while providing sufficient RF power to the beam to generate x-rays for the growing set of beamlines.

    “Achieving 500 mA for the first time is a major milestone in the life of NSLS-II, showing that we can reach the aggressive design current goals we set for ourselves when we first started thinking about what NSLS-II could be all those years ago. This success is due to a lot of hard work, expertise, and dedication by many, many people at NSLS-II and I would like to thank them all very much,” said NSLS-II Director John Hill. “The next steps are to fully understand how the machine behaves at this current and ultimately deliver it to our users. This will require further upgrades to our accelerator systems—and we are actively working towards those now.”

    NSLS-II is a DOE Office of Science user facility.

    See the full article here.




    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 8:22 am on January 23, 2020 Permalink | Reply
    Tags: , , , , , , , Particle Accelerators, ,   

    From Fermi National Accelerator Lab: “USCMS collaboration gets green light on upgrades to CMS particle detector” 

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    January 22, 2020
    Leah Hesla

    In its ongoing quest to understand the nature of the universe’s fundamental constituents, the CMS collaboration has reached another milestone.

    CERN/CMS Detector

    In October 2019, the U.S. contingent of the CMS collaboration presented their plans to upgrade the CMS particle detector for the high-luminosity phase of the Large Hadron Collider at CERN.

    CERN LHC Tunnel

    The upgrades would enable CMS to handle the challenging environment brought on by the upcoming increase in the LHC’s particle collision rate, fully exploiting the discovery potential of the upgraded machine.

    In response, on Dec. 19, 2019, the Department of Energy Office of Science gave the plan its stamp of CD-1 approval, signaling that it favorably evaluated the project’s conceptual design, schedule range and cost, among other factors.

    “This is a major achievement because it paves the way for the next major steps in our project, in which funds are allocated to start the production phase,” said scientist Anadi Canepa, head of the Fermilab CMS Department. “The U.S. project team was extremely satisfied. Preparing for CD-1 was a monumental effort.”

    The CMS detector upgrade team met in October 2019 for a DOE review. Photo: Reidar Hahn, Fermilab

    The LHC’s increase in beam intensity is planned for 2027, when it will become the High-Luminosity LHC. Racing around its 17-mile circumference, the upgraded collider’s proton beams will smash together to reveal even more about the nature of the subatomic realm thanks to a 10-fold increase in collision rate compared to the LHC’s design value.

    The cranked-up intensity means that the High-Luminosity LHC will deliver an unprecedented amount of data, and the giant detectors that sit in the path of the beam have to be able to withstand the higher data delivery rate and radiation dose. In preparation, USCMS will upgrade the CMS detector to keep up with the increase in data output, not to mention the harsher collision environment.

    The collaboration plans to upgrade the detector with state-of-the-art technology. The new detector will exhibit improved sensitivity, with over 2 billion sensor channels — up from 80 million. USCMS is also replacing the central part of the detector so that, when charged particles fly through it, the upgraded device will take readings of their momenta at an astounding 40 million times per second, a first for hadron colliders. They’re implementing an innovative design for the detector, measuring the energy of particles using very precise silicon sensors. The upgraded CMS will also have a breakthrough component to take higher-resolution, more precisely timed images of complex particle interactions. Scientists are introducing a system using machine learning on electronic circuits called FPGAs to more efficiently select which of the billions of particle events that CMS processes every 25 nanoseconds might signal new physics.
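    The headline rates above follow from simple arithmetic; here is a minimal back-of-envelope sketch, assuming the 40-million-readings-per-second figure corresponds to a 25-nanosecond bunch spacing and using the channel counts quoted above:

    ```python
    # Back-of-envelope rates implied by the upgrade figures above.
    # Assumption: 25 ns is the LHC bunch spacing, so the readout rate is its inverse.

    bunch_spacing_s = 25e-9                   # seconds between bunch crossings
    crossing_rate_hz = 1.0 / bunch_spacing_s  # readings per second
    print(f"Bunch-crossing rate: {crossing_rate_hz:.0e} Hz")  # 4e+07 Hz, i.e. 40 million/s

    channels_before = 80e6   # current sensor channels
    channels_after = 2e9     # planned sensor channels
    print(f"Channel growth: {channels_after / channels_before:.0f}x")  # 25x
    ```

    So the quoted "40 million times per second" is just the inverse of the 25-nanosecond bunch spacing, and the new sensor count is a 25-fold increase over the current detector.
    
    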

    “The successful completion of the CD-1 review is a reflection of the competence, commitment and dedication of a very large team of Fermilab scientists and university colleagues,” said Fermilab scientist Steve Nahn, U.S. project manager for the CMS detector upgrade.

    Now USCMS will refine the plan, getting it ready to serve as the project baseline.

    “With these improvements, we’ll be able to explore uncharted territories and might discover new phenomena that revolutionize our description of nature,” Canepa said.

    The USCMS collaboration comprises Fermilab and 54 institutions.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    FNAL Icon

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 4:47 pm on January 9, 2020 Permalink | Reply
    Tags: "Department of Energy picks New York over Virginia for site of new particle collider", , , , , Particle Accelerators, , ,   

    From BNL via Science Magazine: “Department of Energy picks New York over Virginia for site of new particle collider” 

    From Brookhaven National Lab

    via

    AAAS
    Science Magazine

    Jan. 9, 2020
    Adrian Cho

    Nuclear physicists’ next dream machine will be built at Brookhaven National Laboratory in Upton, New York, officials with the Department of Energy (DOE) announced today. The Electron-Ion Collider (EIC) will smash a high-energy beam of electrons into one of protons to probe the mysterious innards of the proton. The machine will cost between $1.6 billion and $2.6 billion and should be up and running by 2030, said Paul Dabbar, DOE’s undersecretary for science, in a telephone press briefing.

    This schematic shows how the EIC will fit within the tunnel of the Relativistic Heavy Ion Collider (RHIC, background photo), reusing essential infrastructure and key components of RHIC.

    Electrons will collide with protons or larger atomic nuclei at the Electron-Ion Collider to produce dynamic 3-D snapshots of the building blocks of all visible matter.

    The EIC will allow nuclear physicists to track the arrangement of the quarks and gluons that make up the protons and neutrons of atomic nuclei.

    “It will be the first brand-new greenfield collider built in the country in decades,” Dabbar said. “The U.S. has been at the front end in nuclear physics since the end of the Second World War and this machine will enable the U.S. to stay at the front end for decades to come.”

    The site decision brings to a close the competition to host the machine. Physicists at DOE’s Thomas Jefferson National Accelerator Facility in Newport News, Virginia, had also hoped to build the EIC.

    Protons and neutrons make up the atomic nucleus, so the sort of work the EIC would do falls under the rubric of nuclear physics. Although they’re more common than dust, protons remain somewhat mysterious. Since the early 1970s, physicists have known that each proton consists of a trio of less massive particles called quarks. These bind to one another by exchanging other quantum particles called gluons.

    However, the detailed structure of the proton is far more complex. Thanks to the uncertainties inherent in quantum mechanics, its interior roils with countless gluons and quark-antiquark pairs that flit in and out of existence too quickly to be directly observed. And many of the proton’s properties—including its mass and spin—emerge from that sea of “virtual” particles. To determine how that happens, the EIC will use its electrons to probe the protons, colliding the two types of particles at unprecedented energies and in unparalleled numbers.

    Researchers at Jefferson lab already do similar work by firing their electron beam at targets rich with protons and neutrons. In 2017, researchers completed a $338 million upgrade to double the energy of the lab’s workhorse, the Continuous Electron Beam Accelerator Facility.

    Continuous Electron Beam Accelerator Facility

    With that electron accelerator in hand, Jefferson lab researchers had hoped to build the EIC by adding a new proton accelerator.

    Brookhaven researchers have studied a very different type of nuclear physics. Their Relativistic Heavy Ion Collider (RHIC) [below] collides nuclei such as gold and copper to produce fleeting puffs of an ultrahot plasma of free-flying quarks and gluons like the one that filled the universe in the split second after the big bang. The RHIC is a 3.8-kilometer-long ring consisting of two concentric and counter-circulating accelerators. Brookhaven researchers plan to make the EIC by using one of the RHIC’s rings to accelerate the protons and to add an electron accelerator to the complex.
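    For a sense of scale, the ring's 3.8-kilometer circumference fixes how often each stored particle comes around; a rough sketch, assuming (as is nearly true at collider energies) that the protons travel at essentially the speed of light:

    ```python
    # Approximate revolution frequency of a proton in RHIC's 3.8-km ring.
    # Assumption: the proton moves at ~c, a good approximation at collider energies.

    c_m_per_s = 299_792_458.0   # speed of light
    circumference_m = 3_800.0   # RHIC ring length from the text
    rev_freq_hz = c_m_per_s / circumference_m
    print(f"~{rev_freq_hz:,.0f} revolutions per second")  # ~78,893 per second
    ```
    
    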

    To decide which option to take, DOE officials convened an independent EIC site selection committee, Dabbar says. The committee weighed numerous factors, including the relative costs of the rival plans, he says. Proton accelerators are generally larger and more expensive than electron accelerators.

    The Jefferson lab won’t be left out in the cold, Dabbar says. Researchers there have critical expertise in, among other things, making the superconducting accelerating cavities that will be needed for the new collider. So, scientists there will participate in designing, building, and operating the new collider. “We certainly look forward to [the Jefferson lab] taking the lead in these areas,” Dabbar says.

    The site decision does not commit DOE to building the EIC. The project must still pass several milestones before researchers can begin construction—including the approval of a detailed design, cost estimate, and construction schedule. That process can take a few years. However, the announcement does signal the end for the RHIC, which has run since 1999. To make way for the new collider, the RHIC will shut down for good in 2024, Dabbar said at the briefing.

    The decision on a machine still 10 years away reflects the relative good times for DOE science funding, Dabbar says. “We’ve been able to start on every major project that’s been on the books for years.” DOE’s science budget is up 31% since 2016—in spite of the fact that under President Donald Trump, the White House has tried to slash it every year.

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition



     