Tagged: Particle Physics

  • richardmitnick 1:40 pm on February 27, 2020
    Tags: "‘Flash photography’ at the LHC", Particle Physics

    From Symmetry: “‘Flash photography’ at the LHC” 


    Sarah Charley

    Photo by Tom Bullock

    An extremely fast new detector inside the CMS detector will allow physicists to get a sharper image of particle collisions.

    Some of the best commercially available high-speed cameras can capture thousands of frames every second. They produce startling videos of water balloons popping and hummingbirds flying in ultra-slow motion.

    But what if you want to capture an image of a process so fast that it looks blurry if the shutter is open for even a billionth of a second? This is the type of challenge scientists on experiments like CMS and ATLAS face as they study particle collisions at CERN’s Large Hadron Collider.

    When the LHC is operating to its full potential, bunches of about 100 billion protons cross each other’s paths every 25 nanoseconds. During each crossing, which lasts about 2 nanoseconds, about 50 protons collide and produce new particles. Figuring out which particle came from which collision can be a daunting task.
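    The scale of the problem follows directly from those numbers. A back-of-envelope sketch, using the 25-nanosecond spacing and roughly 50 collisions per crossing quoted above:

```python
# Back-of-envelope rates from the figures quoted in the text.
bunch_spacing_s = 25e-9        # time between bunch crossings
collisions_per_crossing = 50   # average proton collisions per crossing

crossing_rate_hz = 1 / bunch_spacing_s                          # 40 million crossings/s
collision_rate_hz = crossing_rate_hz * collisions_per_crossing

print(f"crossings per second:  {crossing_rate_hz:.0e}")   # 4e+07
print(f"collisions per second: {collision_rate_hz:.0e}")  # 2e+09
```

    Two billion collisions a second, with ~50 of them overlapping in each 2-nanosecond window, is what makes the sorting problem so hard.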

    “Usually in ATLAS and CMS, we measure the charge, energy and momentum of a particle, and also try to infer where it was produced,” says Karri DiPetrillo, a postdoctoral fellow working on the CMS experiment at the US Department of Energy’s Fermilab. “We’ve had timing measurements before—on the order of nanoseconds, which is sufficient to assign particles to the correct bunch crossing, but not enough to resolve the individual collisions within the same bunch.”

    Thanks to a new type of detector DiPetrillo and her collaborators are building for the CMS experiment, this is about to change.

    CERN/CMS Detector

    Physicists on the CMS experiment are devising a new detector capable of creating a more accurate timestamp for passing particles. The detector will separate the 2-nanosecond bursts of particles into several consecutive snapshots—a feat a bit like taking 30 billion pictures a second.

    This will help physicists with a mounting challenge at the LHC: collision pileup.

    Picking apart which particle tracks came from which collision is already difficult. A planned upgrade to the intensity of the LHC will increase the number of collisions per bunch crossing by a factor of four—from 50 to 200 proton collisions—making that challenge even greater.

    Currently, physicists look at where the collisions occurred along the beamline as a way to identify which particular tracks came from which collision. The new timing detector will add another dimension to that.

    “These time stamps will enable us to determine when in time different collisions occurred, effectively separating individual bunch crossings into multiple ‘frames,’” says DiPetrillo.

    DiPetrillo and fellow US scientists working on the project are supported by DOE’s Office of Science, which is also funding the development of the detector.

    According to DiPetrillo, being able to separate the collisions based on when they occur will have huge downstream impacts on every aspect of the research. “Disentangling different collisions cleans up our understanding of an event so well that we’ll effectively gain three more years of data at the High-Luminosity LHC. This increase in statistics will give us more precise measurements, and more chances to find new particles we’ve never seen before,” she says.

    The precise time stamps will also help physicists search for heavy, slow moving particles they might have missed in the past.

    “Most particles produced at the LHC travel at close to the speed of light,” DiPetrillo says. “But a very heavy particle would travel slower. If we see a particle arriving much later than expected, our timing detector could flag that for us.”
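    That delay can be large compared with the detector's timing resolution. The sketch below uses standard relativistic kinematics in natural units; the particle mass, momentum, and flight path are illustrative assumptions, not numbers from the article:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_delay_ps(mass_gev, p_gev, path_m):
    """Extra arrival time (ps) of a massive particle relative to a light-speed one."""
    energy = math.hypot(mass_gev, p_gev)  # E = sqrt(m^2 + p^2) in natural units
    beta = p_gev / energy                 # v/c
    return path_m / C * (1.0 / beta - 1.0) * 1e12

# Hypothetical heavy particle: 500 GeV mass, 500 GeV momentum, 1 m flight path.
print(f"arrives {tof_delay_ps(500, 500, 1.0):.0f} ps late")  # ~1400 ps, far above 30-50 ps resolution
```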

    The new timing detector inside CMS will consist of a 5-meter-long cylindrical barrel made from 160,000 individual scintillating crystals, each approximately the width and length of a matchstick. This crystal barrel will be capped on its open ends with disks containing delicately layered radiation-hard silicon sensors. The barrel, about 2 meters in diameter, will surround the inner detectors that compose CMS’s tracking system closest to the collision point. DiPetrillo and her colleagues are currently working out how the various sensors and electronics at each end of the barrel will coordinate to give a time stamp within 30 to 50 picoseconds.
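    To get a feel for that precision: in 30 to 50 picoseconds, light itself travels only about a centimetre, so timestamps at that level can in principle tell apart collisions separated by millimetres along the beamline. The conversion below is exact; the interpretation is a rough rule of thumb:

```python
C = 299_792_458.0  # speed of light, m/s

for dt_ps in (30, 50):
    mm = C * dt_ps * 1e-12 * 1e3  # light-travel distance in millimetres
    print(f"{dt_ps} ps of timing corresponds to about {mm:.0f} mm of light travel")
```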

    “Normally when a particle passes through a detector, the energy it deposits is converted into an electrical pulse that rises steeply and then falls slowly over the course of a few nanoseconds,” says Joel Butler, the Fermilab scientist coordinating this project. “To register one of these passing particles in under 50 picoseconds, we need a signal that reaches its peak even faster.”

    Scientists can use the steep rising slopes of these signals to separate the collisions not only in space, but also in time. In the barrel of the detector, a particle passing through the crystals will release a burst of light that will be recorded by specialized electronics. Based on when the intense flash of light arrives at each sensor, physicists will be able to calculate the particle’s exact location and when it passed. Particles will also produce a quick pulse in the endcaps, which are made from a new type of silicon sensor that amplifies the signal. Each silicon sensor is about the size of a domino and can determine the location of a passing particle to within 1.3 millimeters.
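    A standard trick for bars read out at both ends is to use the difference of the two arrival times for position and their sum for the particle's time. The sketch below is a simplified illustration, not CMS reconstruction code; the effective light speed `V_EFF`, the bar length, and the readout times are invented values:

```python
# Two-ended timing readout of a scintillating bar (illustrative values only).
V_EFF = 1.2e8  # m/s, assumed effective light propagation speed in the crystal

def locate_hit(t1_s, t2_s, bar_length_m):
    """Hit position (m, from bar centre, positive toward end 1) and arrival time (s)."""
    x = 0.5 * V_EFF * (t2_s - t1_s)  # later signal at end 2 => hit closer to end 1
    t0 = 0.5 * (t1_s + t2_s) - 0.5 * bar_length_m / V_EFF
    return x, t0

x, t0 = locate_hit(t1_s=1.00e-9, t2_s=1.50e-9, bar_length_m=0.06)
print(f"hit at {x * 1e3:+.0f} mm from centre, t0 = {t0 * 1e12:.0f} ps")
```

    The sum/difference decoupling is what lets one pair of sensors deliver both a position and a picosecond-scale timestamp.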

    The physicists working on the timing detector plan to have all the components ready and installed inside CMS for the start-up of the High-Luminosity LHC in 2027.

    “High-precision timing is a new concept in high-energy physics,” says DiPetrillo. “I think it will be the direction we pursue for future detectors and colliders because of its huge physics potential. For me, it’s an incredibly exciting and novel project to be on right now.”


    CERN map

    CERN LHC Maximilien Brice and Julien Marius Ordan

    CERN LHC particles



    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS


    CERN/ALICE Detector

    CERN CMS New

    CERN LHCb New II

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 2:23 pm on February 24, 2020
    Tags: "ATLAS experiment searches for natural supersymmetry using novel techniques", Particle Physics

    From CERN ATLAS via phys.org: “ATLAS experiment searches for natural supersymmetry using novel techniques” 

    CERN/ATLAS detector

    CERN ATLAS Higgs Event

    CERN ATLAS another view Image Claudia Marcelloni ATLAS CERN




    February 24, 2020

    Visualisation of the highest jet multiplicity event selected in a control region used to make predictions of the background from multijet production. This event was recorded by ATLAS on 18 July 2018, and contains 19 jets, illustrated by cones. Yellow blocks represent the calorimeter energy measured in noise-suppressed clusters. Of the reconstructed jets, 16 (10) have transverse momenta above 50 GeV (80 GeV). Credit: ATLAS Collaboration/CERN

    In new results presented at CERN, the ATLAS Experiment’s search for supersymmetry (SUSY) reached new levels of sensitivity. The results examine a popular SUSY extension studied at the Large Hadron Collider (LHC): the “Minimal Supersymmetric Standard Model” (MSSM), which includes the minimum number of new particles and interactions required to make predictions at LHC energies. However, even this minimal model introduces a large number of new parameters (masses and other properties of the new particles) whose values are not predicted by the theory (free parameters).

    To frame their search, ATLAS physicists look for “natural” SUSY, which assumes that the various corrections to the Higgs mass are comparable in magnitude and that their sum lies close to the electroweak scale (v ~ 246 GeV). Under this paradigm, the supersymmetric partners of the third-generation quarks (“top and bottom squarks”) and gluons (“gluinos”) could have masses close to the TeV scale, and would be produced through the strong interaction at rates large enough to be observed at the LHC.

    In a recent CERN LHC seminar, the ATLAS Collaboration presented new results in the search for natural SUSY, including searches for top squarks and gluinos using the full LHC Run-2 dataset collected between 2015 and 2018. The new results explore previously unexamined, challenging regions of the free parameter space. This is achieved thanks to new analysis techniques that improve the identification of low-energy (“soft”) and high-energy (“boosted”) particles in the final state.

    ATLAS’ search for top squarks was performed by selecting proton–proton collisions containing up to one electron or muon. For top-squark masses less than the top-quark mass of 173 GeV (see Figure 1), the resulting decay products tend to be soft and therefore difficult to identify. Physicists developed new techniques based on charged-particle tracking to better identify these decay products, thus significantly improving the experimental sensitivity. For larger top-squark masses, the decay products are boosted, resulting in high-energy, close-by decay products. Physicists improved the search in this regime by using, among other techniques, more precise estimates of the statistical significance of the missing transverse momentum in a collision event.

    Figure 1: Schematic representation of the various topologies of top-squark decays in the scenarios presented at today’s seminar (see link in footer). The region where the top-squark is lighter than the neutralino is not allowed in the models considered. Credit: ATLAS Collaboration/CERN

    The new search for gluinos looks at events containing eight or more “jets”—collimated sprays of hadrons—and missing transverse momentum generated by the production of stable neutralinos in the gluino decays, which, similar to neutrinos, are not directly detected by ATLAS. Physicists employed new reconstruction techniques to improve the energy resolution of the jets and the missing transverse momentum, allowing them to better separate the putative signal from background processes. These take advantage of “particle-flow” jet algorithms [https://arxiv.org/abs/1703.10485] that combine information from both the tracking detector and the calorimeter system.
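    Missing transverse momentum is, at heart, the negative vector sum of the transverse momenta of everything the detector does see; whatever is needed to balance the event is attributed to invisible particles such as neutralinos. A minimal sketch with invented jet momenta (pt in GeV, phi in radians; not ATLAS software):

```python
import math

# Hypothetical reconstructed jets: (transverse momentum in GeV, azimuthal angle in rad)
jets = [(120.0, 0.3), (95.0, 2.8), (60.0, -1.2), (50.0, 1.9)]

# Vector sum of what is seen...
px = sum(pt * math.cos(phi) for pt, phi in jets)
py = sum(pt * math.sin(phi) for pt, phi in jets)

# ...and the missing transverse momentum is whatever balances it.
met = math.hypot(px, py)
print(f"missing transverse momentum: {met:.1f} GeV")
```

    Better jet energy resolution feeds directly into this sum, which is why the particle-flow reconstruction sharpens the separation between signal and background.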

    Figure 2: Updated exclusion limits on (left) gluino and (right) top-squark production including the new results presented by ATLAS at the CERN LHC seminar today. Credit: ATLAS Collaboration/CERN

    ATLAS physicists also optimised their event-selection criteria to enhance the contribution of possible SUSY signals compared to the Standard Model background processes. No excess was observed in the data. The results were used to derive exclusion limits on MSSM-inspired simplified models in terms of gluino, top-squark and neutralino masses (see Figure 2).

    The new analyses significantly extend the sensitivity of the searches and further constrain the available parameter space for natural SUSY. The exclusion of heavy top squarks is extended from 1 to 1.25 TeV. The search continues.

    More information: CERN LHC Seminar: Constraining natural supersymmetry with the ATLAS detector by Jeanette Miriam Lorenz indico.cern.ch/event/868249/

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition


  • richardmitnick 11:44 am on February 19, 2020
    Tags: Particle Physics

    From CERN ALPHA: “ALPHA collaboration at CERN reports first measurements of certain quantum effects in antimatter” 



    The ALPHA collaboration at CERN has reported the first measurements of certain quantum effects in the energy structure of antihydrogen, the antimatter counterpart of hydrogen. These quantum effects are known to exist in matter, and studying them could reveal as yet unobserved differences between the behaviour of matter and antimatter. The results, described in a paper published today in the journal Nature, show that these first measurements are consistent with theoretical predictions of the effects in “normal” hydrogen, and pave the way for more precise measurements of these and other fundamental quantities.

    “Finding any difference between these two forms of matter would shake the foundations of the Standard Model of particle physics, and these new measurements probe aspects of antimatter interaction – such as the Lamb shift – that we have long looked forward to addressing,” says Jeffrey Hangst, spokesperson for the ALPHA experiment.

    Standard Model of Particle Physics, Quantum Diaries

    “Next on our list is chilling large samples of antihydrogen using state-of-the-art laser cooling techniques. These techniques will transform antimatter studies and will allow unprecedentedly high-precision comparisons between matter and antimatter.”

    The ALPHA team creates antihydrogen atoms by binding antiprotons delivered by CERN’s Antiproton Decelerator with antielectrons, more commonly called “positrons”.

    CERN Antimatter factory – antiproton decelerator main device. Wikipedia

    CERN ALPHA-g experiment being installed at CERN’s Antiproton Decelerator hall. (Image CERN)

    The team then confines them in a magnetic trap in an ultra-high vacuum, which prevents them from coming into contact with matter and annihilating. Laser light is then shone onto the trapped atoms to measure their spectral response. This technique helps measure known quantum effects like the so-called fine structure and the Lamb shift, which correspond to tiny splittings in certain energy levels of the atom, and which were measured in this study in the antihydrogen atom for the first time. The team previously used this approach to measure other quantum effects in antihydrogen, the latest being a measurement of the Lyman-alpha transition.

    The fine structure was measured in atomic hydrogen more than a century ago, and laid the foundation for the introduction of a fundamental constant of nature that describes the strength of the electromagnetic interaction between elementary charged particles. The Lamb shift was discovered in the same system about 70 years ago and was a key element in the development of quantum electrodynamics, the theory of how matter and light interact.

    The Lamb-shift measurement, which won Willis Lamb the Nobel Prize in Physics in 1955, was reported in 1947 at the famous Shelter Island conference – the first important opportunity for leaders of the American physics community to gather after the war.

    Technical Note:
    Both the fine structure and the Lamb shift are small splittings in certain energy levels (or spectral lines) of an atom, which can be studied with spectroscopy. The fine-structure splitting of the second energy level of hydrogen is a separation between the so-called 2P3/2 and 2P1/2 levels in the absence of a magnetic field. The splitting is caused by the interaction between the velocity of the atom’s electron and its intrinsic (quantum) rotation. The “classic” Lamb shift is the splitting between the 2S1/2 and 2P1/2 levels, also in the absence of a magnetic field. It is the result of the effect on the electron of quantum fluctuations associated with virtual photons popping in and out of existence in a vacuum.
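    For ordinary hydrogen, the fine-structure splitting can be estimated from the standard leading-order Dirac formula for the level energies; the sketch below reproduces the familiar value near 10.9 GHz. (The Lamb shift has no comparably simple closed form at this order, which is precisely why its discovery demanded quantum electrodynamics.)

```python
# Leading-order fine structure of hydrogen:
#   E(n, j) = -(Ry / n^2) * [1 + (alpha^2 / n^2) * (n / (j + 1/2) - 3/4)]
ALPHA = 1 / 137.035999    # fine-structure constant
RY_EV = 13.605693         # Rydberg energy, eV
H_EV_S = 4.135667696e-15  # Planck constant, eV*s

def level_ev(n, j):
    return -(RY_EV / n**2) * (1 + (ALPHA**2 / n**2) * (n / (j + 0.5) - 0.75))

# 2P3/2 - 2P1/2 splitting, converted from eV to GHz
split_ghz = (level_ev(2, 1.5) - level_ev(2, 0.5)) / H_EV_S / 1e9
print(f"2P3/2 - 2P1/2 splitting: {split_ghz:.1f} GHz")  # ~10.9 GHz
```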

    In their new study, the ALPHA team determined the fine-structure splitting and the Lamb shift by inducing and studying transitions between the lowest energy level of antihydrogen and the 2P3/2 and 2P1/2 levels in the presence of a magnetic field of 1 Tesla. Using the value of the frequency of a transition that they had previously measured, the 1S–2S transition, and assuming that certain quantum interactions were valid for antihydrogen, the researchers inferred from their results the values of the fine-structure splitting and the Lamb shift. They found that the inferred values are consistent with theoretical predictions of the splittings in “normal” hydrogen, within the experimental uncertainty of 2% for the fine-structure splitting and of 11% for the Lamb shift.

    Received via email.

    Please help promote STEM in your local schools.

    Stem Education Coalition


    CERN ALPHA Antimatter Factory

    CERN ALPHA Detector

  • richardmitnick 4:01 pm on February 17, 2020
    Tags: "UNTIL THE END OF TIME: Mind, Matter, and Our Search for Meaning in an Evolving Universe" by Brian Greene, Particle Physics

    From The New York Times: “Just a Few Billion Years Left to Go” 

    From The New York Times

    Feb. 17, 2020
    Dennis Overbye

    Mind, Matter, and Our Search for Meaning in an Evolving Universe
    By Brian Greene

    Brian Greene’s main idea, his own grand, unified theory of human endeavor, is that we want to transcend death by attaching ourselves to something permanent that will outlast us. Credit Elena Seibert

    “In the fullness of time all that lives will die.” With this bleak truth Brian Greene, a physicist and mathematician at Columbia University, the author of best-selling books like The Elegant Universe and co-founder of the yearly New York celebration of science and art known as the World Science Festival, sets off in Until the End of Time on the ultimate journey, a meditation on how we go on doing what we do, why and how it will end badly, and why it matters anyway.

    For going on is what we do, building bridges, spaceships and families, composing great symphonies and other works of art, directing movies, and waging wars and presidential campaigns, even though not only are we going to die, but so is all life everywhere in the fullness of eternity, according to what science now thinks it knows about us and the universe.

    Until the End of Time is encyclopedic in its ambition and its erudition, often heartbreaking, stuffed with too many profundities that I wanted to quote, as well as potted descriptions of the theories of a galaxy of contemporary thinkers, from Chomsky to Hawking, and anecdotes from Greene’s own life — of which we should wish for more — that had me laughing.

    It is also occasionally afflicted with stretches of prose that seem as if eternity will come before you ever get through them, especially when Greene is discussing challenging topics like entropy. If I really understood entropy, I suspect I would be writing this review in an office at M.I.T., not an apartment on Manhattan’s Upper West Side.

    Greene’s main idea, his own grand unified theory of human endeavor, expanding on the thoughts of people like Otto Rank, Jean-Paul Sartre and Oswald Spengler, is that we want to transcend death by attaching ourselves to something permanent that will outlast us: art, science, our families and so forth.

    For Greene this impulse has taken the form of a lifetime devotion to mathematics and physics, of the search for laws and truths that transcend time and place. “The enchantment of a mathematical proof might be that it stands forever,” he writes.

    If he dies, the work lives on as part of the body of science and knowledge. But as a cosmologist, he knows this is an illusion: “As our trek across time will make clear, life is likely transient, and all understanding that arose with its emergence will almost certainly dissolve with its conclusion. Nothing is permanent. Nothing is absolute.”

    Depressing. But in a Starbucks one day, he says, he had a realization, a sort of conversion to gratitude. Life and thought might occupy only a minute oasis in cosmic time, but, he writes, “If you take that in fully, envisioning a future bereft of stars and planets and things that think, your regard for our era can appreciate toward reverence.” Or maybe, he jokes, he was just losing his mind.

    This book, then, is a love letter to the ephemeral cosmic moment when everything is possible. Reading it is like riding an escalator up through a giant department store. On the lower floors you find things like time, energy, gravity and the Big Bang, and biology.

    The universe is expanding — why? So far the best explanation is that a virulent antigravitational force dubbed “inflation” — and strangely allowed by Einstein’s equations — briefly switched on during the first split trillionth of a second of time and sent everything flying, but astronomers still lack the smoking-gun proof.

    All living creatures that we know about on Earth share the same genetic tool kit, based on DNA. And we are all battery-operated, deriving energy from a molecule called adenosine triphosphate, ATP for short. In order to keep going, Greene tells us, each cell in your body consumes some 10 million of these molecules every second.

    Upward we go through the emporium of ideas to floors dedicated to consciousness, free will, language and religion. We don’t linger long on any floor. Greene is like one of those custom shopping consultants. He knows the wares, the ideas being pitched in every department. He drags in all the experts — from Proust to Hawking — and tries to be an honest broker about the answers to questions we can’t really answer.

    Why do humans tell stories? Was there an evolutionary advantage to be gained from taking time out from the hunt to sit around the campfire and gab — a bonding experience? Is the shared imagination a way to practice navigating unknown territory, or a guide for living your life?

    Can physics explain not just how the mind — neurons and electrochemical impulses — works but also explain the feeling of having a mind, that is to say consciousness? Greene is cautiously hopeful it can. “That the mind can do all it does is extraordinary. That the mind may accomplish all it does with nothing more than the kinds of ingredients and types of forces holding together my coffee cup, makes it more extraordinary still. Consciousness would be demystified without being diminished.”

    But he’s not always sure. Admitting that the neurophysical facts shed only “a monochrome light” on human experience, he extols art as another dimension. “We gain access to worlds otherwise uncharted,” he says. “As Proust emphasized, this is to be celebrated. Only through art, he noted, can we enter the secret universe of another, the only journey in which we truly ‘fly from star to star,’ a journey that cannot be navigated by ‘direct and conscious methods.’”

    Two main themes run through this story. The first is natural selection, the endless inventive process of evolution that keeps molding organisms into more and more complex arrangements and codependencies. The second is what Greene calls the “entropic-two step.” This refers to the physical property known as entropy. In thermodynamics it denotes the amount of heat — wasted energy — inevitably produced by a steam engine, for example as it goes through its cycle of expansion and contraction. It’s the reason you can’t build a perpetual motion machine. In modern physics it’s a measure of disorder and information. Entropy is a big concept in information theory and black holes, as well as in biology.

    We are all little steam engines, apparently, and everything we accomplish has a cost. That is why your exhaust pipe gets too hot to touch, or why your desk tends to get more cluttered by the end of the day.

    In the end, Greene says, entropy will get us all, and everything else in the universe, tearing down what evolution has built. “The entropic two-step and the evolutionary forces of selection enrich the pathway from order to disorder with prodigious structure, but whether stars or black holes, planets or people, molecules or atoms, things ultimately fall apart,” he writes.

    In a virtuosic final section Greene describes how this will work by inviting us to climb an allegorical Empire State Building; on each floor the universe is 10 times older. If the first floor is Year 10, we now are just above the 10th (10 billion years). By the time we get to the 11th floor the sun will be gone and with it probably any life on Earth. As we climb higher we are exposed to expanses of time that make the current age of the universe look like less than the blink of an eye.
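    Greene's scheme is simple to state precisely: floor n of the allegorical building corresponds to a cosmic age of 10^n years, so any age maps back to a floor via a base-10 logarithm. A tiny sketch of the mapping:

```python
import math

def floor_for_age(years):
    """Floor of Greene's allegorical building: floor n <=> age of 10**n years."""
    return math.log10(years)

print(f"today (~1.4e10 years): floor {floor_for_age(1.4e10):.1f}")   # just above the 10th
print(f"protons dissolve (~1e38 years): floor {floor_for_age(1e38):.0f}")
```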

    Eventually the Milky Way galaxy will fall into a black hole. On about the 38th floor of the future, when the universe is 100 trillion trillion trillion years old, protons, the building blocks of atoms, will dissolve out from under us, leaving space populated by a thin haze of lightweight electrons and a spittle of radiation.

    In the far, far, far, far future, even holding a thought will require more energy than will be available in the vastly dissipated universe. It will be an empty and cold place that doesn’t remember us. “Nabokov’s description of a human life as a ‘brief crack of light between two eternities of darkness’ may apply to the phenomenon of life itself,” Greene writes.

    In the end it is up to us to make of this what we will. We can contemplate eternity, Greene concludes, “and even though we can reach for eternity, apparently we cannot touch eternity.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 3:10 pm on February 17, 2020
    Tags: Particle Physics, Shedding new light on the internal structure of atomic nuclei

    From KTH Royal Institute of Technology via phys.org: “Exotic atomic nuclei reveal traces of new form of superfluidity” 





    Published Feb 17, 2020
    David Callahan

    The team behind the discovery of the new form of superfluidity: from left, Bo Cederwall, professor of physics at KTH Royal Institute of Technology, Xiaoyu Liu, Wei Zhang, Aysegül Ertoprak, Farnaz Ghazi Moradi and Özge Aktas.

    Recent observations of the internal structure of the rare isotope ruthenium-88 shed new light on the internal structure of atomic nuclei, a breakthrough which could also lead to further insights into how some chemical elements in nature and their isotopes are formed.

    Led by Bo Cederwall, Professor of Experimental Nuclear Physics at KTH Royal Institute of Technology, an international research team identified new rotational states in the extremely neutron-deficient, deformed, atomic nucleus 88Ru. The results suggest that the structure of this exotic nuclear system is heavily influenced by the presence of strongly-coupled neutron-proton pairs.

    “Such a structure is fundamentally different from the normal conditions observed in atomic nuclei, where neutrons and protons interact in pairs in separate systems, forming a near-superfluid state,” Cederwall says.

    The results may also suggest alternative explanations for how the production of different chemical elements, and in particular their most neutron-poor isotopes, proceeds in the nucleosynthesis reactions in certain stellar environments such as neutron star-red giant binaries, he says.

    The discovery, which was published February 12 in the journal Physical Review Letters, results from an experiment at the Grand Accélérateur National d’Ions Lourds (GANIL), France, using the Advanced Gamma Tracking Array (AGATA) [below].

    The researchers used nuclear collisions to create highly unstable atomic nuclei with equal numbers of neutrons and protons. Their structure was studied by using sensitive instruments, including AGATA, detecting the radiation they emit in the form of high-energy photons, neutrons, protons and other particles.

    The Advanced Gamma Tracking Array (AGATA), which researchers from KTH used to study unstable atomic nuclei generated at the Grand Accélérateur National d’Ions Lourds.

    According to the Standard Model of particle physics describing the elementary particles and their interactions, there are two general types of particles in nature: bosons and fermions, which have integer and half-integer spins, respectively. Examples of fermions are fundamental particles like the electron and the electron neutrino, but also composite particles like the proton and the neutron and their fundamental building blocks, the quarks. Examples of bosons are the fundamental force carriers: the photon, the intermediate vector bosons, the gluons and the graviton.

    The properties of a system of particles differ considerably depending on whether it is based on fermions or bosons. As a result of the Pauli principle of quantum mechanics, in a system of fermions (such as an atomic nucleus) only one particle can occupy a given quantum state at a certain point in space and time. For several fermions to appear together, at least one property of each fermion, such as its spin, must differ. At low temperature, systems of many fermions can exhibit condensates of paired particles, manifested as superfluidity for uncharged particles (for example, superfluid 3He) and superconductivity for charged particles, such as electrons in a superconductor below the critical temperature. Bosons, on the other hand, can condense individually, with an unlimited number of particles in the same state, in so-called Bose-Einstein condensates.
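    The contrast shows up directly in the textbook occupation formulas: the mean number of particles in a state of energy E is capped at one for fermions but unbounded for bosons as E approaches the chemical potential. A small sketch, with x = (E - mu)/kT:

```python
import math

def fermi_dirac(x):    # mean occupation of a fermion state; never exceeds 1
    return 1.0 / (math.exp(x) + 1.0)

def bose_einstein(x):  # mean occupation of a boson state; diverges as x -> 0+
    return 1.0 / (math.exp(x) - 1.0)

for x in (0.1, 1.0, 3.0):
    print(f"x = {x}: fermions {fermi_dirac(x):.3f}, bosons {bose_einstein(x):.3f}")
```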

    In most atomic nuclei that are close to the line of beta stability and in their ground state, or excited to an energy not too high above it, the basic structure appears to be based on pair-correlated condensates of particles with the same isospin quantum number but with opposite spins. This means that neutrons and protons are paired separately from each other. These isovector pair correlations give rise to properties similar to superfluidity and superconductivity. In deformed nuclei, this structure is revealed, for example, as discontinuities in the rotational frequency as the rotational excitation energy of the nucleus is increased.

    Such discontinuities, which were discovered already in the early 1970s by KTH Professor emeritus Arne Johnson, have been labeled “backbending”. The backbending frequency is a measure of the energy required to break a neutron or proton pair and therefore also reflects the energy released by the formation of a pair of nucleons in the nucleus. There are long-standing theoretical predictions that systems of neutron-proton pairs can be mixed with, or even replace, the standard isovector pair correlations in exotic atomic nuclei with equal numbers of protons and neutrons. The nuclear structure resulting from the isoscalar component of such pair correlations is different from that found in “ordinary” atomic nuclei close to stability. Among different possible experimental observables, the backbending frequency in deformed nuclei is predicted to increase significantly compared with nuclei with different numbers of neutrons and protons.
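    In rotational bands the frequency is extracted from the gamma-ray energies linking states of spin I and I-2, roughly hbar*omega ~ E_gamma/2, and backbending appears where those energies stop rising or even drop with increasing spin. The sketch below uses invented gamma-ray energies for illustration; they are not the 88Ru measurements:

```python
# Hypothetical gamma-ray energies (keV) along a rotational band, for spins 2, 4, 6, ...
e_gamma_kev = [430, 610, 780, 930, 880, 910, 1010]

# hbar * omega ~ E_gamma / 2 for stretched (delta-I = 2) transitions
freqs = [eg / 2.0 for eg in e_gamma_kev]

# Backbending: the first place the rotational frequency decreases with spin
backbend = next((i for i, (a, b) in enumerate(zip(freqs, freqs[1:])) if b < a), None)
print(f"frequency drops at step {backbend}: {freqs[backbend]:.0f} -> {freqs[backbend + 1]:.0f} keV")
```

    A shift of that drop to higher frequency is the signature the text describes: it costs more energy to break a pair, pointing to stronger pair correlations.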

    The KTH research group has previously observed evidence of strong neutron-proton correlations in the spherical nucleus 92Pd, which was published in the journal Nature (B. Cederwall et al., Nature, volume 469, pp. 68-71 (2011)). The ruthenium isotope 88Ru, with 44 neutrons and 44 protons, is deformed and exhibits a rotation-like structure that has now been observed up to higher spin, or rotational frequency, than previously possible. The new measurement provides a different angle on nuclear pair correlations compared with the previous work. By confirming the theoretical predictions of a shift towards higher backbending frequency, it provides complementary evidence for the occurrence of strong isoscalar pair correlations in the heaviest nuclear systems with equal numbers of neutrons and protons.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About Science X in 100 words

    Science X™ is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004 (Physorg.com), Science X’s readership has grown steadily to include 5 million scientists, researchers, and engineers every month. Science X publishes approximately 200 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Science X community members enjoy access to many personalized features such as social networking, a personal home page set-up, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.


    An innovative European technical university

    Since its founding in 1827, KTH Royal Institute of Technology in Stockholm has grown to become one of Europe’s leading technical and engineering universities, as well as a key centre of intellectual talent and innovation. We are Sweden’s largest technical research and learning institution and home to students, researchers and faculty from around the world dedicated to advancing knowledge.

  • richardmitnick 6:47 pm on February 13, 2020 Permalink | Reply
    Tags: "University of Chicago to build instrumentation for upgrades to the Large Hadron Collider", , , , , , Particle Physics, ,   

    From University of Chicago: “University of Chicago to build instrumentation for upgrades to the Large Hadron Collider” 

    U Chicago bloc

    From University of Chicago

    Feb 13, 2020
    Natalie Lund

    The ATLAS detector at the Large Hadron Collider. UChicago scientists will build components for an upgrade to the detector. CERN.

    Faculty, students, engineers to design and build systems for ATLAS experiment.

    In 2012, scientists and the public around the world rejoiced at the news that CERN’s Large Hadron Collider had discovered the long-sought Higgs boson—a particle regarded as a linchpin in the Standard Model of particle physics, the theory that describes the fundamental forces and classifies all known elementary particles.

    Standard Model of Particle Physics, Quantum Diaries

    CERN CMS Higgs Event May 27, 2012

    CERN ATLAS Higgs Event

    Despite the breakthrough, subsequent collisions in the machine have yet to produce evidence of what physicists call “new physics”: science that could address the areas where the Standard Model seems to break down—like dark matter, dark energy and why there is more matter than antimatter. So now, the particle accelerator and its detectors are getting an upgrade.

    On Feb. 5, the National Science Foundation and the National Science Board gave the green light for $75 million in funding for upgrades to the ATLAS experiment, one of the collider’s two 7-story-high, half-a-football-field-long detectors—opening the door to the discovery of new particles and rare processes. Approximately $5.5 million will go to the University of Chicago, a founding member of the ATLAS experiment, to design and build several components for the upgraded detector.

    “These upgrades will help the physics community answer glaring questions surrounding the structure of the fundamental particle universe,” said Asst. Prof. David Miller, a particle physicist who has worked extensively on the ATLAS detector and is co-leading the University’s participation in the upgrade. “Why do the fundamental particles that we know about exist in the first place? What is the underlying pattern and structure behind them?”

    The upgrades, which are estimated for completion in 2026, will allow researchers to study the Higgs boson in greater detail; continue the hunt for dark matter, which comprises 25% of our universe and has never been directly detected; and identify new particles, interactions, and physical properties such as new symmetries or spatial dimensions.

    The upgrades to the LHC itself will increase its luminosity—the intensity of its proton beams—by a factor of ten, substantially increasing the number of particle collisions that occur in a given amount of time. The ATLAS detector, which is the “camera” capturing images of the collisions, must therefore also be upgraded to filter larger quantities of data at high speeds and to deal with more intense radiation.

    “The biggest challenge with our existing detector is separating the signal from the background. For every interesting particle you produce, there are probably something like a million standard particle decays that look about the same,” said Prof. Mark Oreglia, a renowned expert in collider research and development and the other leader for the project.

    Researchers at the University of Chicago will build portions of the calorimeter, the system that measures the energy of the particles that enter the detector; and the trigger, which tells the detector which images, or “events,” to record or ignore.

    The challenge for the new calorimeter is building an instrument sensitive enough to instantaneously measure the light and energy coming off the roughly 200 proton-proton collisions that will occur 40 million times per second—while also being robust enough to withstand the intense radiation those collisions produce.
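    The collision rate implied by those two numbers is easy to check (a back-of-envelope sketch using only the figures quoted above):

```python
# Back-of-envelope collision rate from the numbers quoted in the article:
# 40 million bunch crossings per second, ~200 collisions per crossing.
bunch_crossings_per_second = 40_000_000
collisions_per_crossing = 200

collisions_per_second = bunch_crossings_per_second * collisions_per_crossing
print(f"{collisions_per_second:.0e}")  # prints 8e+09 (eight billion collisions per second)
```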

    UChicago researchers already have built prototypes of some components and sent them for rigorous testing to ensure they could withstand the LHC’s increased intensity. Construction of electronics is slated to begin this spring, with undergraduate students participating in the testing of the boards to look for short circuits and other flaws.

    CERN staff member Irakli Minashvili asks UChicago undergraduate student Hadar Lazar for the results of a test she is running on ATLAS detector electronics. Courtesy Mark Oreglia.

    Another challenge posed by the upgraded LHC is the volume of data produced by the collisions.

    “In their raw form, the data volume is nearly one petabyte per second, so there’s no way we can save that amount,” Miller said. “We have to come up with clever ways to determine what to keep and what to throw away.”
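    The keep-or-discard decision Miller describes can be caricatured as a threshold filter. The event values and the 100 GeV cut below are invented for illustration; the real ATLAS trigger applies many layered selections in custom hardware and software:

```python
# Toy trigger: keep only events whose summed transverse energy passes a
# threshold. This is a schematic of the keep/discard decision, not the
# actual ATLAS trigger menu; the threshold and events are made up.
ILLUSTRATIVE_THRESHOLD_GEV = 100.0

def trigger_accept(event_et_gev: float,
                   threshold: float = ILLUSTRATIVE_THRESHOLD_GEV) -> bool:
    """Return True if the event should be recorded, False if discarded."""
    return event_et_gev >= threshold

events = [12.0, 250.0, 47.0, 180.0, 95.0]  # made-up per-event ET sums, in GeV
kept = [e for e in events if trigger_accept(e)]
print(kept)  # prints [250.0, 180.0]
```

    In the real experiment this decision must be made millions of times per second, which is why much of it runs in dedicated electronics rather than software.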

    Miller’s team is partnering with UChicago Computer Science faculty Assoc. Prof. Risi Kondor and Asst. Prof. Yuxin Chen, tapping their pioneering work in machine learning to develop innovative algorithms and software to tackle this unprecedented task.

    “Machine learning helps us detect patterns in the data and uncover features that we might not otherwise have seen,” Miller said. “For example, I’m working with Risi Kondor to build a completely new type of neural network whose inherent structure reflects known symmetries of nature.”

    The $75 million from the National Science Foundation will complement $163 million funding from the U.S. Department of Energy to support the upgrade to the detector.

    The project will involve multiple universities as well as national laboratories. ATLAS is a large international collaboration consisting of 3000 scientists from 182 institutions and 38 countries.

    Additional researchers and groups involved with the construction are Young-Kee Kim, the Louis Block Distinguished Service Professor of Physics; Melvyn Shochet, the Kersten Distinguished Service Professor of Physics; and the staff of the Enrico Fermi Institute’s Electronics Development Group (EDG) and MANIAC Lab: EDG director Mary Heintz, EDG research professor Kelby Anderson, EDG engineers Mircea Bogdan and Fukun Tang, and MANIAC Lab director research professor Robert Gardner, who heads an NSF effort called Scalable Systems Laboratory to develop data processing systems for all of the data collected and to provide platforms for complex data analysis tasks.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Chicago Campus

    An intellectual destination

    One of the world’s premier academic and research institutions, the University of Chicago has driven new ways of thinking since our 1890 founding. Today, UChicago is an intellectual destination that draws inspired scholars to our Hyde Park and international campuses, keeping UChicago at the nexus of ideas that challenge and change the world.

    The University of Chicago is an urban research university that has driven new ways of thinking since 1890. Our commitment to free and open inquiry draws inspired scholars to our global campuses, where ideas are born that challenge and change the world.

    We empower individuals to challenge conventional thinking in pursuit of original ideas. Students in the College develop critical, analytic, and writing skills in our rigorous, interdisciplinary core curriculum. Through graduate programs, students test their ideas with UChicago scholars, and become the next generation of leaders in academia, industry, nonprofits, and government.

    UChicago research has led to such breakthroughs as discovering the link between cancer and genetics, establishing revolutionary theories of economics, and developing tools to produce reliably excellent urban schooling. We generate new insights for the benefit of present and future generations with our national and affiliated laboratories: Argonne National Laboratory, Fermi National Accelerator Laboratory, and the Marine Biological Laboratory in Woods Hole, Massachusetts.

    The University of Chicago is enriched by the city we call home. In partnership with our neighbors, we invest in Chicago’s mid-South Side across such areas as health, education, economic growth, and the arts. Together with our medical center, we are the largest private employer on the South Side.

    In all we do, we are driven to dig deeper, push further, and ask bigger questions—and to leverage our knowledge to enrich all human life. Our diverse and creative students and alumni drive innovation, lead international conversations, and make masterpieces. Alumni and faculty, lecturers and postdocs go on to become Nobel laureates, CEOs, university presidents, attorneys general, literary giants, and astronauts.

  • richardmitnick 6:16 pm on February 13, 2020 Permalink | Reply
    Tags: , , , , Particle Physics   

    From Fermi National Accelerator Lab: “Finding hidden neutrinos with MicroBooNE” 

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    From Fermi National Accelerator Lab , an enduring source of strength for the US contribution to scientific research world wide.

    February 13, 2020
    Owen Goodwin
    Davide Porzio
    Stefan Söldner-Rembold
    Yun-Tse Tsai

    Neutrinos have baffled scientists for decades because their properties and behavior differ from those of other known elementary particles. Their masses, for example, are much smaller than the masses measured for any other elementary matter particle we know. They also carry no electric charge and interact with matter only very rarely, through the weak force. At Fermilab, a chain of accelerators generates neutrino beams so researchers can study neutrino properties and understand their role in the formation of the universe.

    Scientists working on Fermilab’s MicroBooNE experiment have published a paper [Physical Review D] describing a search for a new – hidden – type of heavier neutrino that could help explain why the masses of ordinary neutrinos are so small. It could also provide important clues about the nature of dark matter. This search is the first of its kind performed with a type of particle detector known as a liquid-argon time projection chamber.

    The MicroBooNE detector consists of a large tank holding 170 tons of liquid argon [below], located in an intense beam of neutrinos produced by the lab’s accelerators. Some of these ordinary neutrinos will hit an argon nucleus in the tank, producing other particles. The MicroBooNE detector then acts like a giant camera that records the particles produced in this collision.

    A heavier type of neutrino – which has been hypothesized but never observed – could also be produced in the accelerator-generated beam. These heavier types of neutrinos, scientifically called “heavy neutral leptons,” would not interact through the weak force and therefore could not hit an argon nucleus in the same way as ordinary neutrinos do. They could, however, leave a hint of their existence if they decayed into known particles inside the MicroBooNE detector.

    The display shows the decay of a heavy neutrino as it would be measured in the MicroBooNE detector. Scientists use such simulations to understand what a signal in data would look like. Image: MicroBooNE collaboration

    To find such signatures of heavy neutrinos, MicroBooNE scientists devised a new method that helps them distinguish the heavy neutrino decays from ordinary neutrino scatterings on argon, and it has a lot to do with timing.

    The Fermilab neutrino beam is not a continuous stream of particles. Rather, it is pulsed, and the experimenters know when these neutrino pulses are supposed to arrive at the MicroBooNE detector: The heavy neutrinos would be more massive and therefore slower than the ordinary neutrinos – a well-tested prediction of special relativity. The trick is therefore to wait just long enough — until the ordinary neutrinos in a pulse have passed through and only heavy neutrinos could arrive.
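    The delay being exploited follows directly from special relativity: a particle of mass m and energy E travels at a fraction β = √(1 − (m/E)²) of the speed of light. The mass, energy, and 470 m baseline below are illustrative assumptions, not MicroBooNE's published parameters:

```python
import math

# Arrival delay of a massive particle relative to light over a baseline L:
#   beta = sqrt(1 - (m/E)^2),  delay = (L / c) * (1 / beta - 1)
# The 0.3 GeV mass, 1 GeV energy, and 470 m baseline are assumptions
# chosen purely for illustration.
C = 299_792_458.0  # speed of light, m/s

def arrival_delay_ns(mass_gev: float, energy_gev: float, baseline_m: float) -> float:
    beta = math.sqrt(1.0 - (mass_gev / energy_gev) ** 2)
    return (baseline_m / C) * (1.0 / beta - 1.0) * 1e9

print(arrival_delay_ns(0.3, 1.0, 470.0))   # ~76 ns behind the light-speed front
print(arrival_delay_ns(1e-9, 1.0, 470.0))  # ~0 ns for an effectively massless neutrino
```

    Tens of nanoseconds is a long time against the beam's pulse structure, which is what makes the wait-them-out strategy workable.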

    In the MicroBooNE detector, a heavy neutrino would appear to come out of nowhere. The only traces of its appearance would be tracks from two charged particles emerging from its decay – a muon and a pion (see figure). Using the measured angles and energies of these two daughter particles, the mass of the invisible parent particle – assumed to be the heavy neutrino — can be calculated.
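    The reconstruction described here is the standard two-body invariant-mass calculation; the kinematic values in the example below are invented for illustration:

```python
import math

# Invariant mass of a parent particle from two daughters with energies
# E1, E2, momenta p1, p2, and opening angle theta (natural units, c = 1):
#   m^2 = m1^2 + m2^2 + 2 * (E1*E2 - p1*p2*cos(theta))
# The example energies and angle are made up; only the formula is standard.
M_MU, M_PI = 0.1057, 0.1396  # muon and charged-pion masses, GeV

def parent_mass(e1: float, m1: float, e2: float, m2: float, theta: float) -> float:
    p1 = math.sqrt(e1**2 - m1**2)
    p2 = math.sqrt(e2**2 - m2**2)
    m_sq = m1**2 + m2**2 + 2.0 * (e1 * e2 - p1 * p2 * math.cos(theta))
    return math.sqrt(m_sq)

print(parent_mass(0.8, M_MU, 0.6, M_PI, 0.4))  # ~0.372 GeV for these made-up inputs
```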

    After sifting through all the MicroBooNE data, scientists found that only a handful of heavy-neutrino candidates remained, and that their origin is consistent with muons from cosmic rays constantly bombarding the MicroBooNE detector. In very rare cases, such a muon can mimic the two charged particles from a heavy neutral lepton.

    The heavy neutrinos – if they exist – are therefore still hiding. MicroBooNE’s results are expressed as a limit on the strength of the coupling – or mixing – of the hidden neutrinos with ordinary neutrinos. In this way, the sensitivity of the MicroBooNE detector can be translated into stringent constraints on models that predict hidden neutrino states, leading to better predictions. The short-baseline liquid-argon neutrino experiments at Fermilab are going to collect much more data in the coming years. Heavy neutrinos might not be able to hide much longer.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    FNAL Icon

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

    FNAL MINERvA front face Photo Reidar Hahn


    FNAL Muon g-2 studio

    FNAL Short-Baseline Near Detector under construction

    FNAL Mu2e solenoid

    Dark Energy Camera [DECam], built at FNAL

    FNAL DUNE Argon tank at SURF


    FNAL Don Lincoln


    FNAL Cryomodule Testing Facility

    FNAL MINOS Far Detector in the Soudan Mine in northern Minnesota

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    FNAL/NOvA experiment map

    FNAL NOvA Near Detector


    FNAL Holometer

  • richardmitnick 6:47 pm on February 11, 2020 Permalink | Reply
    Tags: , , , , Particle Physics, , Proton Synchrotron prepared for higher injection energies   

    From CERN: “LS2 Report: Proton Synchrotron prepared for higher injection energies” 

    Cern New Bloc

    Cern New Particle Event

    From CERN

    11 February, 2020
    Achintya Rao

    CERN’s oldest working accelerator has a new injection kicker magnet and will soon receive a new septum as well.

    The new kicker for the PS being installed in the accelerator (Image: Julien Ordan/CERN)

    CERN Proton Synchrotron

    Proton beams entering the Proton Synchrotron (PS) from the PS Booster have to be deflected into a circulating orbit before they can be accelerated. This is done by two specialised beam-line elements: a strong magnetic septum and a fast injection-kicker magnet. The latter is a precisely synchronised electromagnet that can be switched on and off in about 100 ns, providing a stable and uniform kick that only affects the injected beam batches, while leaving the already circulating beam unperturbed.

    After the ongoing second long shutdown of CERN’s accelerator complex (LS2), the PS Booster will accelerate particles to 2 GeV, almost 50% higher than the pre-LS2 value of 1.4 GeV. The PS therefore needed a new septum and a new kicker capable of coping with this increased injection energy. On 31 January, as part of the LHC Injectors Upgrade (LIU) project, the new kicker magnet was installed, replacing the kicker that had operated since 1979. The magnet will soon be aligned, connected to the vacuum system and then connected to the power and control cables.

    Like the magnet it replaced, the PS’s new kicker is made of four identical modules sitting in a 1-metre-long vacuum tank. Each module receives power from a separate pulse generator that consists of two high-power electrical switches – a main switch and a dump switch to control the pulse length – and around 280 metres of a so-called “pulse-forming line”, wound and stored on gigantic drums. These lines are thick, coaxial cables filled with sulphur hexafluoride (SF6) at a pressure of 10 bars, to provide the necessary insulation for the charging voltage of 80 kV. Since SF6 is a strong greenhouse gas, special care has to be taken to ensure that it is safely manipulated and recuperated, and that the system has no leaks.
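    The stored line length sets the output pulse duration through the standard pulse-forming-line relation: a charged line discharged into a matched load delivers a flat pulse lasting one round trip of the line, τ = 2L/v. The propagation speed below is an assumption (close to c for a gas-insulated coaxial line), not a quoted CERN figure:

```python
# Pulse-forming line: a charged line of length L discharged into a matched
# load yields a pulse of duration tau = 2 * L / v, where v is the wave speed
# on the line. For a gas-insulated (SF6) coax, v is close to c; the exact
# factor of 0.95 here is an assumption for illustration.
C = 299_792_458.0       # speed of light, m/s
line_length_m = 280.0   # length of the pulse-forming line, from the article
v = 0.95 * C            # assumed propagation speed on the line

tau_us = 2.0 * line_length_m / v * 1e6
print(f"{tau_us:.2f} microseconds")  # prints 1.97 microseconds
```

    A line hundreds of metres long is thus what it takes to sustain a flat kick over a microsecond-scale injection window.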

    In order to reduce the dependence on the SF6-based cables, part of the transmission line between the pulse generator and the magnet was replaced with conventional cables. “Disconnecting the SF6 cables from the magnet to connect the reserves was a two-person job, and required time-consuming gas-handling procedures to be followed,” explains Thomas Kramer from the TE-ABT (Accelerator Beam Transfer) group. “On the other hand, the new conventional cables have quick-release connectors and can be operated by one person fairly quickly.”

    Kramer and colleagues also replaced the old analogue control system for the kicker, parts of which had been in place since the system was constructed in the 1970s. “Things made back then still work reliably,” smiles Kramer, while noting that the new digital systems make it possible to monitor the situation remotely.

    One element that remains to be installed is the new septum. This is a delicate device used in the injection system, composed of two cavities separated by a thin wall: one cavity allows the beams from the PS Booster to enter the PS while the second is meant for the circulating beams. The new septum, which required construction of a novel power converter, will be installed upstream of the magnet in the coming weeks.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier



    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS


    CERN/ALICE Detector

    CERN CMS New

    CERN LHCb New II


    CERN map

    CERN LHC Tunnel

    CERN LHC particles

  • richardmitnick 3:42 pm on February 11, 2020 Permalink | Reply
    Tags: , Particle Physics, , , , Zee burst   

    From Washington University in St.Louis: “Ultra-high energy events key to study of ghost particles” 

    Wash U Bloc

    From Washington University in St.Louis

    January 31, 2020 [Just now in social media]
    Talia Ogliore

    This is the highest-energy neutrino ever observed, with an estimated energy of 1.14 PeV. The IceCube Neutrino Observatory at the South Pole observed it on January 3, 2012. IceCube physicists named it Ernie. (Credit: IceCube Collaboration)

    Physicists at Washington University in St. Louis have proposed a way to use data from ultra-high energy neutrinos to study interactions beyond the standard model of particle physics. The ‘Zee burst’ model leverages new data from large neutrino telescopes such as the IceCube Neutrino Observatory in Antarctica and its future extensions.

    U Wisconsin IceCube neutrino observatory


    IceCube employs more than 5000 optical sensors lowered on 86 strings into holes drilled deep in the Antarctic ice. NSF; B. Gudbjartsson, IceCube Collaboration

    Lunar Icecube

    IceCube DeepCore annotated

    IceCube PINGU annotated

    DM-Ice II at IceCube annotated

    “Neutrinos continue to intrigue us and stretch our imagination. These ‘ghost particles’ are the least understood in the standard model, but they hold the key to what lies beyond,” said Bhupal Dev, assistant professor of physics in Arts & Sciences and author of a new study in Physical Review Letters.

    “So far, all nonstandard interaction studies at IceCube have focused only on the low-energy atmospheric neutrino data,” said Dev, who is part of Washington University’s McDonnell Center for the Space Sciences. “The ‘Zee burst’ mechanism provides a new tool to probe nonstandard interactions using the ultra-high energy neutrinos at IceCube.”

    Ultra-high energy events

    Since the discovery of neutrino oscillations two decades ago, which earned the 2015 Nobel Prize in physics, scientists have made significant progress in understanding neutrino properties — but a lot of questions remain unanswered.

    For example, the fact that neutrinos have such a tiny mass already requires scientists to consider theories beyond the standard model. In such theories, “neutrinos could have new nonstandard interactions with matter as they propagate through it, which will crucially affect their future precision measurements,” Dev said.

    In 2012, the IceCube collaboration reported the first observation of ultra-high energy neutrinos from extraterrestrial sources, which opened a new window to study neutrino properties at the highest possible energies. Since that discovery, IceCube has reported about 100 such ultra-high energy neutrino events.


    “We immediately realized that this could give us a new way to look for exotic particles, like supersymmetric partners and heavy decaying dark matter,” Dev said. For the previous several years, he had been looking for ways to find signals of new physics at different energy scales and had co-authored half a dozen papers studying the possibilities.

    “The common strategy I followed in all these works was to look for anomalous features in the observed event spectrum, which could then be interpreted as a possible sign of new physics,” he said.

    The most spectacular feature would be a resonance: what physicists witness as a dramatic enhancement of events in a narrow energy window. Dev devoted his time to thinking about new scenarios that could give rise to such a resonance feature. That’s where the idea for the current work came from.

    In the standard model, ultra-high energy neutrinos can produce a W-boson at resonance. This process, known as the Glashow resonance, has already been seen at IceCube, according to preliminary results presented at the Neutrino 2018 conference.

    “We propose that similar resonance features can be induced due to new light, charged particles, which provides a new way to probe nonstandard neutrino interactions,” Dev said.

    Bursting onto the neutrino scene

    Dev and his co-author Kaladi Babu at Oklahoma State University considered the Zee model, a popular model of radiative neutrino mass generation, as a prototype for their study. This model allows for charged scalars to be as light as 100 times the proton mass.

    “These light, charged Zee-scalars could give rise to a Glashow-like resonance feature in the ultra-high energy neutrino event spectrum at the IceCube Neutrino Observatory,” Dev said.

    Because the new resonance involves charged scalars in the Zee model, they decided to call it the ‘Zee burst.’
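    The resonance energies at play can be estimated with the standard formula for a neutrino striking an atomic electron at rest and producing a particle of mass M on-shell, E_res = M²/(2mₑ). The 100-proton-mass scalar below is the benchmark quoted above; the actual Zee-scalar mass is a free parameter of the model:

```python
# Resonance energy for neutrino + electron (at rest) -> on-shell particle
# of mass M:  E_res = M^2 / (2 * m_e).  For the W boson this is the
# Glashow resonance. Masses in GeV; the 100 * m_p scalar mass is the
# illustrative benchmark from the article, not a measured value.
M_E = 0.000511   # electron mass, GeV
M_W = 80.38      # W-boson mass, GeV
M_P = 0.9383     # proton mass, GeV

def resonance_energy_pev(mass_gev: float) -> float:
    return mass_gev**2 / (2.0 * M_E) / 1e6  # convert GeV to PeV

print(resonance_energy_pev(M_W))        # ~6.3 PeV: the Glashow resonance
print(resonance_energy_pev(100 * M_P))  # ~8.6 PeV for a 100 * m_p charged scalar
```

    Both energies sit in the ultra-high-energy window IceCube probes, which is why the Zee-burst signature would show up there and nowhere else.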

    Rendering of an observation of the ultra-high energy events that feed into the ‘Zee burst’ model. (Image by Yicong Sui, Washington University)

    Yicong Sui at Washington University and Sudip Jana at Oklahoma State, both graduate students in physics and co-authors of this study, did extensive event simulations and data analysis showing that it is possible to detect such a new resonance using IceCube data.

    “We need an effective exposure time of at least four times the current exposure to be sensitive enough to detect the new resonance — so that would be about 30 years with the current IceCube design, but only three years of IceCube-Gen2,” Dev said, referring to the proposed next-generation extension of IceCube with a 10 km³ detector volume.
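    Dev's numbers are consistent with a simple scaling in which sensitivity tracks exposure, taken here as detector volume times livetime (an assumed scaling used only to check the arithmetic):

```python
# Back-of-envelope check of the quoted exposure numbers, assuming
# exposure ~ detector volume x livetime (an illustrative scaling).
current_volume_km3 = 1.0    # IceCube's instrumented volume, roughly
gen2_volume_km3 = 10.0      # proposed IceCube-Gen2 volume, from the article
years_with_current = 30.0   # quoted time to reach the needed exposure

years_with_gen2 = years_with_current * current_volume_km3 / gen2_volume_km3
print(years_with_gen2)  # prints 3.0
```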

    “This is an effective way to look for the new charged scalars at IceCube, complementary to direct searches for these particles at the Large Hadron Collider.”

    Funding: This work was supported by the US Department of Energy and by the US Neutrino Theory Network Program.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Wash U campus

    Washington University’s mission is to discover and disseminate knowledge, and protect the freedom of inquiry through research, teaching, and learning.

    Washington University creates an environment to encourage and support an ethos of wide-ranging exploration. Washington University’s faculty and staff strive to enhance the lives and livelihoods of students, the people of the greater St. Louis community, the country, and the world.

  • richardmitnick 9:54 am on February 6, 2020 Permalink | Reply
    Tags: "Could the next generation of particle accelerators come out of the 3D printer?", , , , Consortium on the Properties of Additive-Manufactured Copper, , Particle Physics, ,   

    From SLAC National Accelerator Lab: “Could the next generation of particle accelerators come out of the 3D printer?” 

    From SLAC National Accelerator Lab

    February 5, 2020
    Jennifer Huber

    SLAC scientists and collaborators are developing 3D copper printing techniques to build accelerator components.

    Imagine being able to manufacture complex devices whenever you want and wherever you are. It would create unforeseen possibilities even in the most remote locations, such as building spare parts or new components on board a spacecraft. 3D printing, or additive manufacturing, could be a way of doing just that. All you would need is the materials the device will be made of, a printer and a computer that controls the process.

    Diana Gamzina, a staff scientist at the Department of Energy’s SLAC National Accelerator Laboratory; Timothy Horn, an assistant professor of mechanical and aerospace engineering at North Carolina State University; and researchers at RadiaBeam Technologies dream of developing the technique to print particle accelerators and vacuum electronic devices for applications in medical imaging and treatment, the electrical grid, satellite communications, defense systems and more.

    Examples of 3D-printed copper components that could be used in a particle accelerator: X-band klystron output cavity with micro-cooling channels (at left) and a set of coupled accelerator cavities. (Christopher Ledford/North Carolina State University)

    In fact, the researchers are closer to making this a reality than you might think.

    “We’re trying to print a particle accelerator, which is really ambitious,” Gamzina said. “We’ve been developing the process over the past few years, and we can already print particle accelerator components today. The whole point of 3D printing is to make stuff no matter where you are without a lot of infrastructure. So you can print your particle accelerator on a naval ship, in a small university lab or somewhere very remote.”

    3D printing can be done with liquids and powders of numerous materials, but there aren’t any well-established processes for 3D printing ultra-high-purity copper and its alloys – the materials Gamzina, Horn and their colleagues want to use. Their research focuses on developing the method.

    Indispensable copper

    Accelerators boost the energy of particle beams, and vacuum electronic devices are used in amplifiers and generators. Both rely on components that can be easily shaped and conduct heat and electricity extremely well. Copper has all of these qualities and is therefore widely used.

    Traditionally, each copper component is machined individually and bonded with others using heat to form complex geometries. This manufacturing technique is incredibly common, but it has its disadvantages.

    “Brazing together multiple parts and components takes a great deal of time, precision and care,” Horn said. “And any time you have a joint between two materials, you add a potential failure point. So, there is a need to reduce or eliminate those assembly processes.”

    Potential of 3D copper printing

    3D printing of copper components could offer a solution.

    It works by layering thin sheets of materials on top of one another and slowly building up specific shapes and objects. In Gamzina’s and Horn’s work, the material used is extremely pure copper powder.

    The process starts with a 3D design, or “construction manual,” for the object. Controlled by a computer, the printer spreads a thin layer of copper powder on a platform. It then lowers the platform by about 50 microns – half the thickness of a human hair – spreads a second copper layer on top of the first, heats it with an electron beam to about 2,000 degrees Fahrenheit and welds it to the first layer. This process repeats until the entire object has been built.
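    At roughly 50 microns per pass, the layer count for a given part follows directly; the 25 mm part height below is an illustrative assumption, not a dimension from the article:

```python
# Layer count for an electron-beam-printed part: each pass adds one
# platform step (~50 microns) of fused copper, so a part of height h
# needs h / 50 um passes. The 25 mm part height is an assumed example.
layer_thickness_um = 50.0
part_height_mm = 25.0

layers = part_height_mm * 1000.0 / layer_thickness_um
print(int(layers))  # prints 500
```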

    3D printing of copper devices
    3D printing of a layer of a device known as a traveling wave tube using copper powder. (Christopher Ledford/North Carolina State University)

    The amazing part: no specific tooling, fixtures or molds are needed for the procedure. As a result, 3D printing eliminates design constraints inherent in traditional fabrication processes and allows the construction of objects that are uniquely complex.

    “The shape doesn’t really matter for 3D printing,” said SLAC staff scientist Chris Nantista, who designs and tests 3D-printed samples for Gamzina and Horn. “You just program it in, start your system and it can build up almost anything you want. It opens up a new space of potential shapes.”

    The team took advantage of that, for example, when building part of a klystron – a specialized vacuum tube that amplifies radiofrequency signals – with internal cooling channels at NCSU. Building it in one piece improved the device’s heat transfer and performance.

Compared to traditional manufacturing, 3D printing is also less time-consuming and could translate into cost savings of up to 70%, Gamzina said.

    A challenging technique

But printing copper devices has its own challenges, as Horn, who began developing the technique with collaborators from RadiaBeam years ago, knows. One issue is finding the right balance between the thermal and electrical properties and the strength of the printed objects. The biggest hurdle for manufacturing accelerators and vacuum electronics, though, is that these high-vacuum devices require extremely pure, high-quality materials to avoid part failures, such as cracking or vacuum leaks.

The research team tackled these challenges by first improving the material's surface quality, using a finer copper powder and varying the way they fused layers together. But the finer powder introduced the next challenge: smaller particles have more surface area per gram, so more oxygen attaches to the powder, increasing the oxide content of each layer and making the printed objects less pure.

    So, Gamzina and Horn had to find a way to reduce the oxygen content in their copper powders. The method they came up with, which they recently reported in Applied Sciences, relies on hydrogen gas to bind oxygen into water vapor and drive it out of the powder.
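The underlying chemistry is the standard hydrogen reduction of copper oxide. For the cuprous oxide that forms on copper powder, the reaction can be written as (the paper may also treat cupric oxide, CuO, which reduces analogously):

```latex
\mathrm{Cu_2O\,(s)} + \mathrm{H_2\,(g)} \longrightarrow 2\,\mathrm{Cu\,(s)} + \mathrm{H_2O\,(g)}
```

The oxygen leaves the powder as water vapor, which is exactly the product that would be destructive in bulk copper but harmless when released layer by layer.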

Using this method is somewhat surprising, Horn said. In a traditionally manufactured copper object, the formation of water vapor would create high-pressure steam bubbles inside the material, causing it to blister and fail. In the additive process, by contrast, the water vapor escapes from each thin layer as it is printed, before pressure can build up inside the material.

    Although the technique has shown great promise, the scientists still have a ways to go to reduce the oxygen content enough to print an actual particle accelerator. But they have already succeeded in printing a few components, such as the klystron output cavity with internal cooling channels and a string of coupled cavities that could be used for particle acceleration.

    Planning to team up with industry partners

    The next phase of the project will be driven by the newly-formed Consortium on the Properties of Additive-Manufactured Copper, which is led by Horn. The consortium currently has four active industry members – Siemens, GE Additive, RadiaBeam and Calabazas Creek Research – with more on the way.

    “This would be a nice example of collaboration between an academic institution, a national lab and small and large businesses,” Gamzina said. “It would allow us to figure out this problem together. Our work has already allowed us to go from ‘just imagine, this is crazy’ to ‘we can do it’ in less than two years.”

This work was primarily funded by the Naval Sea Systems Command, as a Small Business Technology Transfer Program with RadiaBeam, SLAC, and NCSU. Other SLAC contributors include Chris Pearson, Andy Nguyen, Arianna Gleason, Apurva Mehta, Kevin Stone, Chris Tassone and Johanna Weker. Additional contributions came from Christopher Ledford and Christopher Rock at NCSU and Pedro Frigola, Paul Carriere, Alexander Laurich, James Penney and Matt Heintz at RadiaBeam.

See the full article here.

    Please help promote STEM in your local schools.

STEM Education Coalition


    SLAC/LCLS II projected view

    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

    SSRL and LCLS are DOE Office of Science user facilities.
