Tagged: Particle Accelerators

  • richardmitnick 10:47 am on December 6, 2017
    Tags: FAST Fermilab Accelerator Science and Technology, FNAL IOTA ring, Particle Accelerators

    From FNAL: “FAST electron beam achieves milestone energy for future accelerator R&D” 

    FNAL II photo

    FNAL Art Image by Angela Gonzales

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    December 6, 2017
    Tom Barratt

    The linear accelerator at the Fermilab Accelerator Science and Technology facility recently ramped up a beam of electrons to 300 MeV, surpassing the threshold needed to launch a new accelerator physics program at Fermilab. Photo: Giulio Stancari.

    On Nov. 15, a team at the Fermilab Accelerator Science and Technology (FAST) facility ramped up a beam of electrons to 300 million electronvolts.

    It was a double milestone event. For one, the beam surpassed the threshold needed to launch a new accelerator physics program at Fermilab supported by the DOE Office of Science. The driving goal of the FAST facility is to support a new R&D accelerator that will require a minimum of 150 million electronvolts (MeV) of electron beam energy.

    “It’s a great achievement and an important milestone for the project and the laboratory,” said Jerry Leibfritz, project engineer for FAST.

    It was also a hard-earned success for the larger particle accelerator community. For the first time anywhere, the Fermilab group achieved a beam energy of 250 MeV from a single ILC-type cryomodule. For years, a worldwide R&D effort has been under way to develop cryomodules — accelerating structures — for the proposed International Linear Collider under development by a broad international collaboration for possible construction in Japan. The beam energy milestone was one of the goals of the international ILC R&D effort.

    ILC schematic

    “Fermilab has been known for its outstanding proton beams,” said Vladimir Shiltsev, head of the Fermilab Accelerator Physics Center. “Getting to a 300 MeV electron beam is a big deal for Fermilab’s accelerator R&D. We’re happy we could advance this R&D on two fronts.”

    The delivery of the high-energy electron beam enables the start of the accelerator research program at FAST and demonstrates that the facility is up to the job of supporting the future Integrable Optics Test Accelerator, or IOTA, scheduled to come online in mid-2018.

    The IOTA idea

    FNAL IOTA ring schematic.

    The team behind IOTA, a 40-meter-circumference ring, is looking to break new ground in beam physics research. Once it is operable, scientists and engineers will inject beams of electrons from FAST accelerators into the ring, where they can be maintained and used to carry out R&D for new, advanced accelerator equipment and techniques.

    Although the technology behind accelerator-based particle physics has made huge strides over recent decades, all current accelerators use magnets with linear focusing to stabilize and guide particle beams. The technique can only advance so far, however, before fundamental physics effects, which tend to destabilize beams, limit the experimental possibilities.

    To solve this problem, Fermilab began a research program to develop a number of novel accelerator techniques, and IOTA will be the pinnacle of this effort. Scientists from around the world will be able to use the next-generation accelerator to collaborate and test innovative ideas, finding ways around the physics constraints to reach the next level of accelerator beam power.

    For IOTA to carry out successful research, the FAST scientists chose a minimum threshold of energy for the injected particles.

    “Generally speaking, the more energy, the better the particle stability, but then you have to increase the technical complexity,” said Alexander Valishev, Fermilab scientist and head of the IOTA/FAST Department. “We chose 150 MeV as the electron energy for IOTA because it’s a good balance between energy and expense.”

    The team used a specialized beamline to produce the high-energy electron beam.

    “One of the biggest challenges in achieving this milestone was just getting all the complex systems and components of the accelerator to integrate and work together,” Valishev said.

    The Fermilab Accelerator Science and Technology facility supports the research and development of accelerator technology for the next generation of particle accelerators. Photo: Jerry Leibfritz

    World-record beam

    In one sense, it was an achievement decades in the making. At the core of the electron-beam accelerator are superconducting elements whose development at Fermilab began in earnest in the late 1990s. That was the start of the development of SRF — superconducting radio-frequency technology — at Fermilab.

    SRF is the technology of choice for many current and future particle accelerators, including machines under construction now in the United States and Germany, Fermilab’s PIP-II accelerator that will provide high-intensity beams for neutrino experiments, and the proposed ILC. The ILC’s technical design, completed in 2012, requires that each 12-meter-long unit of the accelerator, called a cryomodule, deliver an electron beam energy gain of 250 MeV at a particular maximum gradient, which is a measure of how much energy the beam gains over a given distance. It was a formidable task for SRF experts and accelerator physicists worldwide.
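    As a rough sanity check (a back-of-the-envelope sketch, not an official ILC calculation), the 250 MeV energy gain is consistent with the nominal ILC gradient of about 31.5 MV/m applied over roughly eight metres of active cavity length per cryomodule; the cavity count and lengths below are assumptions for illustration only.

        # Illustrative estimate of the energy gain from one ILC-type cryomodule.
        # Assumed numbers (not from the article): ~31.5 MV/m design gradient and
        # roughly eight ~1-metre-long superconducting cavities per 12-metre cryomodule.
        design_gradient_mv_per_m = 31.5    # MV/m, nominal ILC gradient (assumption)
        cavities_per_cryomodule = 8        # assumed cavity count
        active_length_per_cavity_m = 1.0   # metres of accelerating structure per cavity (approx.)

        energy_gain_mev = design_gradient_mv_per_m * cavities_per_cryomodule * active_length_per_cavity_m
        print(f"Estimated energy gain per cryomodule: {energy_gain_mev:.0f} MeV")  # roughly 250 MeV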

    Now the highest-energy beam ever accelerated through the ILC-type cryomodule at design specifications has been demonstrated at Fermilab.

    “In October 2014 Fermilab successfully operated the superconducting elements in this cryomodule to the nominal ILC gradient, but without accelerating an actual electron beam. The milestone achieved this month proves the expected performance definitively,” said physicist Marc Ross of SLAC National Accelerator Laboratory. “The work done over the last years has paid off, the calibrations were correct, and the cryomodule delivered as required. It is a critical milestone toward the realization of ILC.”

    FAST friends

    The FAST team is now engaged in joint experiments with external collaborators, such as physicists from Northern Illinois University, the University of Chicago and Los Alamos National Laboratory.

    “We are very happy to have another working accelerator at Fermilab,” Valishev said. “FAST and IOTA will enable high-impact research and will be very useful for establishing more international collaboration.”

    The IOTA/FAST collaboration currently has 27 partners, including CERN and the University of Oxford. The team aims to finish the ring’s construction by early 2018 and to have it up and running by the summer. Injection of protons into the IOTA ring is expected a year later.

    “This is a tremendous accomplishment for all of us who have been involved in building this facility for the past decade,” Leibfritz said. “In 2006, a small group of us laid the first concrete blocks for the FAST test cave. This successful acceleration of beam really is the greatest testament to everyone who’s been involved in making it a reality.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    FNAL Icon
    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 11:59 am on December 5, 2017
    Tags: Particle Accelerators

    From GIZMODO via FNAL: “Two Teams Have Simultaneously Unearthed Evidence of an Exotic New Particle” Revised to include the DZero result 

    FNAL II photo

    FNAL Art Image by Angela Gonzales

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    GIZMODO bloc

    GIZMODO
    11/17/17
    Ryan F. Mandelbaum

    I can’t believe I’ve written three articles about this weird Xi particle.

    A tetraquark (Artwork: Fermilab)

    A few months ago, physicists observed a new subatomic particle—essentially an awkwardly named, crazy cousin of the proton. Its mere existence has energized teams of particle physicists to dream up new ideas about how matter forms, arranges itself, and exists.

    Now, a pair of new research papers using different theoretical methods have independently unearthed another, crazier particle predicted by the laws of physics. If discovered in an experiment, it would provide conclusive evidence of a whole new class of exotic particles called tetraquarks, which fall outside the established expectations for how the proton’s sub-parts, called quarks, behave. And this result is more than just mathematics.

    “We think this is not totally academic,” Chris Quigg, a theoretical physicist at the Fermi National Accelerator Laboratory, told Gizmodo. “Its discovery may well happen.”

    But first, some physics. Zoom all the way in and you’ll find that matter is made of atoms. Atoms, in turn, are made of protons, neutrons, and electrons. Protons and neutrons can further be divided into three quarks.

    Physicists have discovered six types of quarks, each with its own name, mass, and electrical charge. Protons and neutrons are made from “up” and “down” quarks, the lightest two. But there are four rarer, heavier ones. From least to most massive, they are: “strange,” “charm,” “bottom,” and “top.” Each one has an antimatter partner—the same particle, but with the opposite electrical sign. As far as physicists have confirmed, these quarks and antiquarks can only arrange themselves in pairs or threes. They cannot exist on their own in nature.

    Scientists in the Large Hadron Collider’s LHCb collaboration recently announced spotting a new arrangement of three quarks, called the Ξcc++ or the “doubly charged, doubly charmed xi particle.”

    CERN/LHCb detector

    It had an up quark and two heavy charm quarks. But “most of these particles” with three quarks “containing two heavy quarks, charm or beauty, have not yet been found,” physicist Patrick Koppenburg from Nikhef, the Dutch National Institute for Subatomic Physics, told Gizmodo back then. “This is the first in a sense.”

    The DZero collaboration at Fermilab announced the discovery of a new particle whose quark content appears to be qualitatively different from normal.

    The particle newly discovered by DZero decays into a Bs meson and pi meson. The Bs meson decays into a J/psi and a phi meson, and these in turn decay into two muons and two kaons, respectively. The dotted lines indicate promptly decaying particles.

    The study, using the full data set acquired at the Tevatron collider from 2002 to 2011, totaling 10 inverse femtobarns, identified the Bs meson through its decay into intermediate J/psi and phi mesons, which subsequently decayed into a pair of oppositely charged muons and a pair of oppositely charged K mesons, respectively. The result is published in Physical Review Letters.
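    For readers keeping track of the cascade, here is the decay chain described above written out as a simple data structure (an illustrative sketch only; the labels follow the text in this post, not DZero’s analysis code).

        # The decay cascade described above, expressed as a plain Python dictionary:
        # parent particle -> list of daughter particles.
        decay_chain = {
            "new state": ["B_s", "pi"],     # the newly reported particle decays to a Bs meson and a pi meson
            "B_s":       ["J/psi", "phi"],
            "J/psi":     ["mu+", "mu-"],    # pair of oppositely charged muons
            "phi":       ["K+", "K-"],      # pair of oppositely charged kaons
        }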

    With the knowledge that such a particle could exist (and with the knowledge of its properties, like its mass), two teams of physicists crunched the numbers in two separate ways. One team used extrapolations of the experimental data and methods they’d previously used to predict this past summer’s particle. The other used a mathematical abstraction of the real world, using approximations that take into account just how much heavier the charm, bottom, and top quarks are than the rest to simplify the calculations.

    In both new papers published in Physical Review Letters https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.119.202002 and https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.119.202001, a stable four-quark particle with two bottom quarks, an anti-up quark, and an anti-down quark fell out of the math. Furthermore, the predicted particles’ masses were not quite the same, but similar enough to raise eyebrows.

    “As you notice, the conclusions are basically identical on a qualitative level,” Marek Karliner, author of the first study from Tel Aviv University in Israel, told Gizmodo. And while lots of tetraquark candidates have been spotted, this particle’s strange identity—including the added properties and stabilization from its two heavy bottom quarks—would offer unambiguous evidence of the particle’s existence.

    “The things we’re talking about are so weird that they couldn’t be something else,” said Quigg.

    But now it’s just a matter of finding the dang things. Quigg thought a new collider, such as one proposed for China, might be required.

    Rendering of the proposed CEPC [CEPC-SppC for Circular Electron-Positron Collider and Super Proton-Proton Collider]. Photo: IHEP [China’s Institute of High Energy Physics]

    But physicists are in agreement that the sometimes-overlooked LHCb experiment has been doing some of the year’s most exciting work—Karliner thought the experiment could soon spot the particle. “My experimental colleagues are quite firm in this statement. They say that if it’s there, they will see it.” He thought the observation could come in perhaps two to three years’ time, though Quigg was less optimistic.

    Such an unambiguous detection of the tetraquark would confirm guesses dating as far back as 1964 about how quarks arrange themselves. And the independent confirmation from different methods has made both teams confident.

    “I think we have pretty great confidence that the doubly-b tetraquark could exist,” said Quigg. “It’s just a matter of looking hard for it.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    “We come from the future.”

    GIZMODO pictorial

     
  • richardmitnick 4:32 pm on November 28, 2017
    Tags: NERSC Cori II XC40 supercomputer, Particle Accelerators

    From BNL: “High-Performance Computing Cuts Particle Collision Data Prep Time” 

    Brookhaven Lab

    November 28, 2017
    Karen McNulty Walsh
    kmcnulty@bnl.gov

    New approach to raw data reconstruction has potential to turn particle tracks into physics discoveries faster.

    Mark Lukascsyk, Jérôme Lauret, and Levente Hajdu standing beside a tape silo at the RHIC & ATLAS Computing Facility at Brookhaven National Laboratory. Data sets from RHIC runs are stored on tape and were transferred from Brookhaven to NERSC.

    For the first time, scientists have used high-performance computing (HPC) to reconstruct the data collected by a nuclear physics experiment—an advance that could dramatically reduce the time it takes to make detailed data available for scientific discoveries.

    The demonstration project used the Cori supercomputer at the National Energy Research Scientific Computing Center (NERSC), a high-performance computing center at Lawrence Berkeley National Laboratory in California, to reconstruct multiple datasets collected by the STAR detector during particle collisions at the Relativistic Heavy Ion Collider (RHIC), a nuclear physics research facility at Brookhaven National Laboratory in New York.

    NERSC Cray Cori II XC40 supercomputer at NERSC at LBNL

    BNL/RHIC Star Detector


    BNL RHIC Campus

    “The reason why this is really fantastic,” said Brookhaven physicist Jérôme Lauret, who manages STAR’s computing needs, “is that these high-performance computing resources are elastic. You can call to reserve a large allotment of computing power when you need it—for example, just before a big conference when physicists are in a rush to present new results.” According to Lauret, preparing raw data for analysis typically takes many months, making it nearly impossible to provide such short-term responsiveness. “But with HPC, perhaps you could condense that many months of production time into a week. That would really empower the scientists!”

    The accomplishment showcases the synergistic capabilities of RHIC and NERSC—U.S. Department of Energy (DOE) Office of Science User Facilities located at DOE-run national laboratories on opposite coasts—connected by one of the most extensive high-performance data-sharing networks in the world, DOE’s Energy Sciences Network (ESnet), another DOE Office of Science User Facility.

    “This is a key usage model of high-performance computing for experimental data, demonstrating that researchers can get their raw data processing or simulation campaigns done in a few days or weeks at a critical time instead of spreading out over months on their own dedicated resources,” said Jeff Porter, a member of the data and analytics services team at NERSC.

    NERSC Cray XC40 Cori II supercomputer

    LBL NERSC Cray XC30 Edison supercomputer


    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    NERSC PDSF


    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    Billions of data points

    To make physics discoveries at RHIC, scientists must sort through hundreds of millions of collisions between ions accelerated to very high energy. STAR, a sophisticated, house-sized electronic instrument, records the subatomic debris streaming from these particle smashups. In the most energetic events, many thousands of particles strike detector components, producing firework-like displays of colorful particle tracks. But to figure out what these complex signals mean, and what they can tell us about the intriguing form of matter created in RHIC’s collisions, scientists need detailed descriptions of all the particles and the conditions under which they were produced. They must also compare huge statistical samples from many different types of collision events.

    Cataloging that information requires sophisticated algorithms and pattern recognition software to combine signals from the various readout electronics, and a seamless way to match that data with records of collision conditions. All the information must then be packaged in a way that physicists can use for their analyses.

    By running multiple computing jobs simultaneously on the allotted supercomputing cores, the team transformed 4.73 petabytes of raw data into 2.45 petabytes of “physics-ready” data in a fraction of the time it would have taken using in-house high-throughput computing resources, even with a two-way transcontinental data journey.

    Since RHIC started running in the year 2000, this raw data processing, or reconstruction, has been carried out on dedicated computing resources at the RHIC and ATLAS Computing Facility (RACF) at Brookhaven. High-throughput computing (HTC) clusters crunch the data, event-by-event, and write out the coded details of each collision to a centralized mass storage space accessible to STAR physicists around the world.

    But the challenge of keeping up with the data has grown with RHIC’s ever-improving collision rates and as new detector components have been added. In recent years, STAR’s annual raw data sets have reached billions of events with data sizes in the multi-Petabyte range. So the STAR computing team investigated the use of external resources to meet the demand for timely access to physics-ready data.

    Many cores make light work

    Unlike the high-throughput computers at the RACF, which analyze events one-by-one, HPC resources like those at NERSC break large problems into smaller tasks that can run in parallel. So the first challenge was to “parallelize” the processing of STAR event data.

    “We wrote workflow programs that achieved the first level of parallelization—event parallelization,” Lauret said. That means they submit fewer jobs made of many events that can be processed simultaneously on the many HPC computing cores.

    In high-throughput computing, a workload made up of data from many STAR collisions is processed event-by-event in a sequential manner to give physicists “reconstructed data” —the product they need to fully analyze the data. High-performance computing breaks the workload into smaller chunks that can be run through separate CPUs to speed up the data reconstruction. In this simple illustration, breaking a workload of 15 events into three chunks of five events processed in parallel yields the same product in one-third the time as the high-throughput method. Using 32 CPUs on a supercomputer like Cori can greatly reduce the time it takes to transform the raw data from a real STAR dataset, with many millions of events, into useful information physicists can analyze to make discoveries.

    “Imagine building a city with 100 homes. If this was done in high-throughput fashion, each home would have one builder doing all the tasks in sequence—building the foundation, the walls, and so on,” Lauret said. “But with HPC we change the paradigm. Instead of one worker per house we have 100 workers per house, and each worker has a task—building the walls or the roof. They work in parallel, at the same time, and we assemble everything together at the end. With this approach, we will build that house 100 times faster.”

    Of course, it takes some creativity to think about how such problems can be broken up into tasks that can run simultaneously instead of sequentially, Lauret added.
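    The idea of event-level parallelization can be sketched in a few lines of Python (a toy illustration only; the real STAR workflow programs, job submission and reconstruction code are far more involved, and the function names here are hypothetical).

        # Toy sketch of event-level parallelization: a workload of events is split
        # into chunks, and each chunk is reconstructed on its own core in parallel.
        from multiprocessing import Pool

        def reconstruct_event(raw_event):
            # stand-in for the real, CPU-heavy reconstruction of one collision event
            return {"event_id": raw_event, "tracks": []}

        def reconstruct_chunk(chunk):
            # each worker processes its own chunk of events sequentially
            return [reconstruct_event(ev) for ev in chunk]

        if __name__ == "__main__":
            raw_events = list(range(15))          # 15 events, as in the illustration above
            n_workers = 3                         # three chunks of five events
            chunks = [raw_events[i::n_workers] for i in range(n_workers)]
            with Pool(n_workers) as pool:
                results = pool.map(reconstruct_chunk, chunks)   # chunks run in parallel
            reconstructed = [ev for chunk in results for ev in chunk]
            print(f"Reconstructed {len(reconstructed)} events with {n_workers} workers")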

    HPC also saves time matching raw detector signals with data on the environmental conditions during each event. To do this, the computers must access a “condition database”—a record of the voltage, temperature, pressure, and other detector conditions that must be accounted for in understanding the behavior of the particles produced in each collision. In event-by-event, high-throughput reconstruction, the computers call up the database to retrieve data for every single event. But because HPC cores share some memory, events that occur close in time can use the same cached condition data. Fewer calls to the database means faster data processing.
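    A minimal sketch of that caching idea, assuming a hypothetical condition-database interface (this is not STAR’s actual code): events whose timestamps fall inside the same validity window share a single database call.

        # Sketch of caching condition-database lookups so that events close in time
        # reuse the same record instead of each triggering a new query.
        from functools import lru_cache

        CONDITION_VALIDITY_S = 60              # assume a condition record stays valid for ~60 s

        @lru_cache(maxsize=1024)
        def load_conditions(validity_window):
            # stand-in for an expensive query to the condition database
            print(f"database query for window {validity_window}")
            return {"voltage": 1.0, "temperature": 2.0, "pressure": 3.0}   # placeholder values

        def conditions_for_event(event_timestamp_s):
            # events in the same validity window hit the cache, not the database
            window = int(event_timestamp_s) // CONDITION_VALIDITY_S
            return load_conditions(window)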

    Networking teamwork

    Another challenge in migrating the task of raw data reconstruction to an HPC environment was just getting the data from New York to the supercomputers in California and back. Both the input and output datasets are huge. The team started small with a proof-of-principle experiment—just a few hundred jobs—to see how their new workflow programs would perform.

    “We had a lot of assistance from the networking professionals at Brookhaven,” said Lauret, “particularly Mark Lukascsyk, one of our network engineers, who was so excited about the science and helping us make discoveries.” Colleagues in the RACF and ESnet also helped identify hardware issues and developed solutions as the team worked closely with Jeff Porter, Mustafa Mustafa, and others at NERSC to optimize the data transfer and the end-to-end workflow.

    Start small, scale up

    This animation shows a series of collision events at STAR, each with thousands of particle tracks and the signals registered as some of those particles strike various detector components. It should give you an idea of how complex the challenge is to reconstruct a complete record of every single particle and the conditions under which it was created so scientists can compare hundreds of millions of events to look for trends and make discoveries.

    After fine-tuning their methods based on the initial tests, the team started scaling up to using 6,400 computing cores at NERSC, then up and up and up.

    “6,400 cores is already half the size of the resources available for data reconstruction at RACF,” Lauret said. “Eventually we went to 25,600 cores in our most recent test.” With everything ready ahead of time for an advance-reservation allotment of time on the Cori supercomputer, “we did this test for a few days and got an entire data production done in no time,” Lauret said.

    According to Porter at NERSC, “This model is potentially quite transformative, and NERSC has worked to support such resource utilization by, for example, linking its center-wide high-performance disk system directly to its data transfer infrastructure and allowing significant flexibility in how job slots can be scheduled.”

    The end-to-end efficiency of the entire process—the time the program was running (not sitting idle, waiting for computing resources) multiplied by the efficiency of using the allotted supercomputing slots and getting useful output all the way back to Brookhaven—was 98 percent.
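    As a simple illustration of how such a figure is composed (the component numbers below are made up; only the roughly 98 percent product comes from the article):

        # Hypothetical breakdown of an end-to-end efficiency of about 98 percent.
        fraction_of_slot_time_running = 0.99   # jobs running rather than sitting idle (assumed)
        fraction_of_output_recovered = 0.99    # useful output returned to Brookhaven (assumed)

        end_to_end_efficiency = fraction_of_slot_time_running * fraction_of_output_recovered
        print(f"End-to-end efficiency: {end_to_end_efficiency:.0%}")   # -> 98%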

    “We’ve proven that we can use the HPC resources efficiently to eliminate backlogs of unprocessed data and resolve temporary resource demands to speed up science discoveries,” Lauret said.

    He’s now exploring ways to generalize the workflow to the Open Science Grid—a global consortium that aggregates computing resources—so the entire community of high-energy and nuclear physicists can make use of it.

    This work was supported by the DOE Office of Science.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 3:43 pm on November 26, 2017
    Tags: Meet ISOLDE: Future physics with HIE-ISOLDE, Particle Accelerators

    From ISOLDE at CERN: “Meet ISOLDE: Future physics with HIE-ISOLDE” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    CERN

    16 Oct 2017 [Just now in social media.]
    Harriet Kim Jarlett

    HIE-ISOLDE has pioneered many new ideas, particularly in space-saving solutions. One way the engineers kept the system compact was to build cryomodules that each contain five cavities, not just one (Image: Maximilien Brice/CERN)

    This week, ISOLDE, CERN’s nuclear physics facility, is celebrating 50 years of physics. But after half a century of studying radioactive isotopes, the facility is on the brink of a new phase in its history, as its upgrade, HIE-ISOLDE, nears completion.

    “ISOLDE makes sure that it is always improving,” says Razvan Lica, a CERN PhD student working at the ISOLDE facility.


    Watch the fourth part in our documentary series about ISOLDE to find out more about the HIE-ISOLDE upgrade and the people building it. (Video: Christoph Madsen/CERN)

    The HIE-ISOLDE upgrade, which will allow ISOLDE to collide beams of isotopes into targets at higher energies, is a chance for the facility to reinvent itself. The HIE stands for High Intensity and Energy, and physicists hope that it will guarantee ISOLDE another ten to fifteen years at the forefront of this area of research.

    Currently, to produce radioactive isotopes, ISOLDE takes proton beams from one of CERN’s accelerators, the Proton Synchrotron Booster (PSB), and fires them into a target.

    CERN Super Proton Synchrotron

    The target then sends out many radioactive isotopes, which can be directed down beamlines to various experiments. HIE-ISOLDE uses a new, unique, linear accelerator (linac) to take these beams and accelerate them again, before sending them on to secondary targets, where nuclear reactions occur.

    The new linac had to fit into just 16 m of space. “We had to develop a very compact linac. That’s what makes it unique. In other facilities, every cavity has its own cryostat but if we had to do that it would be far too long, so we had to squeeze all of them into one cryomodule. We had to have the solenoids fitted too; they’re almost the same length as a cavity, so we had to do lots of design, research and development. The biggest challenge was to design in spaces with clearances of just 1 mm,” explains Yacine Kadi, project leader for HIE-ISOLDE. (Image: Maximilien Brice/CERN)

    “When we talk about elements we use their proton number. A heavy element is one with a higher proton number, but you can have many different isotopes of the same element. These have the same proton number but a different number of neutrons,” explains Liam Gaffney, who works on the Miniball set-up, attached to one of the HIE-ISOLDE beamlines.

    Physicists like Liam use these isotopes to research a range of topics, from astrophysics, by recreating reactions that happen in the stars, to the internal structure and shape of exotic nuclei, giving us an insight into the building blocks of the world around us.

    “Previously we couldn’t do as many of the reactions as we wanted to with radioactive isotopes, as the beam energy wasn’t high enough. To study the shape of the nuclei of the heaviest elements we need higher energies to overcome an increase in the nuclear charge. More protons means a higher positive charge, and since two positive nuclei repel each other, a higher energy is needed to collide them,” he continues.
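    The scaling Liam describes can be made concrete with the textbook Coulomb-barrier estimate (a rough illustration; the specific beam and target nuclei below are arbitrary examples, not ISOLDE’s actual experiments):

        # Rough Coulomb-barrier estimate between two nuclei:
        # V ~ Z1*Z2*e^2 / (4*pi*eps0*(R1+R2)), with nuclear radius R = r0 * A^(1/3).
        def coulomb_barrier_mev(z1, a1, z2, a2):
            r0 = 1.2                    # fm, nuclear radius parameter
            e2_coulomb = 1.44           # e^2/(4*pi*eps0) in MeV*fm
            radius_sum = r0 * (a1 ** (1 / 3) + a2 ** (1 / 3))   # fm
            return z1 * z2 * e2_coulomb / radius_sum

        # A lighter beam versus a much heavier beam on the same tin-120 target:
        print(coulomb_barrier_mev(10, 22, 50, 120))    # neon-22 beam: roughly 80 MeV
        print(coulomb_barrier_mev(88, 224, 50, 120))   # radium-224 beam: roughly 480 MeV, so a higher beam energy is needed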

    “Higher energy opens a new field. We had a stepping stone with the REX upgrade, when ISOLDE first introduced the possibility of reaccelerating isotopes, in 2001, but with the higher energies from HIE-ISOLDE, it’s a new realm,” says Karl Johnston, ISOLDE’s physics coordinator, who hopes the upgrade will mean even more applications are found for ISOLDE’s research.

    Future-proofing

    There are currently three spaces for experiments to be attached to HIE-ISOLDE, with the hope that seven or more will eventually run each year. There is one permanent station attached to the linac, called Miniball, seen here, which can be set up to run multiple different experiments (Image: Julien Ordan/CERN)

    “Higher energy gives us the chance to study many different things. We focus on fundamental questions concerning the structure of nuclei,” explains Liam. “Studying reactions inside the stars to learn more about how the different elements are produced. Asking questions like: why are there so many heavy elements, like uranium, on the planet?”

    “HIE-ISOLDE is a major breakthrough and is the result of almost eight years of research and development, of prototyping and design. It’s a huge adventure and what makes us most proud isn’t even that we managed to build the machine but that from the start we have seen new physics and new, enthusiastic users,” enthuses Maria Borge, who led the ISOLDE group from 2012 to 2017.

    Challenge accepted

    But building a machine of this scope hasn’t been easy. Yacine Kadi, who leads the HIE-ISOLDE project, starts to laugh as he spends minutes listing the challenges the project faced.

    Each cryomodule contains more than 10 000 parts, which need to be carefully cleaned, calibrated and installed. (Image: Maximilien Brice/CERN)

    With scarce resources, the design and development phase of the project relied on early-career researchers to carry out the majority of the work. It was a risk that paid off: “We didn’t have the resources to hire virtually any staff, but we made sure we only took the absolute best – we couldn’t afford not to – and they did a fantastic job. But then they left before the project was finished!” he explains.

    Among his fair share of challenges, Yacine had to rethink construction materials when the metal niobium proved too costly, and to amend the original plans to avoid a building crossing the border between Switzerland and France.

    “Engineers told me it was mission impossible,” exclaims Yacine. “It was a big, complex project and the choices we made weren’t things we had much experience of at CERN. This meant we had to develop novel ideas and at the same time profit from technological breakthroughs made at CERN for the Large Electron-Positron Collider (LEP). In the end it was just a question of the imagination of our physicists and technical staff.”

    The inflatable T-Rex at HIE-ISOLDE is the mascot of the REX experiment, which was an earlier post-accelerator at ISOLDE. (Image: Julien Ordan/CERN)

    HIE-ISOLDE is unique in its design because it had to fit a lot of accelerating power into a very compact space. Linear accelerators use radiofrequency cavities to accelerate a beam. Normally an accelerator will house each one of these cavities in its own cryostat – a vacuum chamber that supercools the cavity so that the helium needed for the superconductors to work stays liquid – but HIE-ISOLDE didn’t have the room for each cavity to have its own cryostat. Instead, one way the engineers kept the system compact was to build cryomodules that each contain five cavities but require only one cryogenic system.

    “Yes, HIE-ISOLDE was a challenge from a technical point of view, but it was a major human adventure for me. You increase your field of knowledge, and you work in different domains, so you meet many different people. I met people I wouldn’t ever have met even after spending forty years at CERN,” he continues.

    Currently, HIE-ISOLDE is nearing the completion of its energy upgrade and has already had two successful running periods with more than 15 experiments. The last of the four superconducting cryomodules is due to be installed over the winter shutdown in 2018 and will allow the machine to accelerate the radioactive beams to energies of 10 MeV/u.

    “Others might get to that energy but no other facility in the world can accelerate very heavy nuclei. We can do that,” says Maria, emphasising the importance of this new upgrade, which cements ISOLDE’s role at the forefront of nuclear physics for the foreseeable future.

    This week, ISOLDE, CERN’s nuclear facility, is celebrating 50 years of physics with a series of articles and a short documentary series that takes a closer look at the facility and the people that work there. See the rest of the series here.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition


     
  • richardmitnick 8:57 pm on November 24, 2017
    Tags: Particle Accelerators

    From CERN: “AWAKE: Closer to a breakthrough acceleration technology” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    CERN

    24 Nov 2017
    Iva Raynova

    The electron source and electron beam line of AWAKE have just been installed. (Image: Maximilien Brice, Julien Ordan/CERN)

    CERN AWAKE schematic

    We are one step closer to testing a breakthrough technology for particle acceleration. The final three key parts of AWAKE have just been put in place: its electron source, electron beam line and electron spectrometer. This marks the end of the installation phase of the Advanced Proton Driven Plasma Wakefield Acceleration Experiment (AWAKE), a proof-of-principle experiment at CERN that is developing a new technique for accelerating particles.

    The accelerators currently in use rely on electric fields generated by radiofrequency (RF) cavities to accelerate charged particles by giving them a “kick”. In AWAKE, a beam of electrons will “surf” waves of electric charges, or wakefields. These waves are created when a beam of protons is injected into the heart of AWAKE, a 10-metre plasma cell full of ionised gas. When the protons travel through the plasma, they attract free electrons, which generate wakefields. A second particle beam, this time of electrons, is injected at the right phase behind the proton beam. As a result, it feels the wakefield and is accelerated, just like a surfer riding a wave.

    After exiting the plasma cell, the electrons will pass through a dipole magnet, which will curve their path. More energetic particles will get a smaller curvature. As well as the electron beam line, another new component is the scintillator that awaits the electrons at the end of the dipole, showing whether or not they have been accelerated. Essentially, this is a screen that lights up whenever a charged particle passes through it. Successfully accelerated electrons will be bent to a lesser degree by the magnetic field and will appear on one side of the scintillator.
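    The relationship between energy and curvature follows from the standard bending-radius formula for a charged particle in a magnetic field; the sketch below uses an assumed dipole field and example electron energies (neither value is taken from the article):

        # How a dipole spectrometer separates electrons by energy:
        # bending radius r = p / (e * B), and for ultra-relativistic electrons p ~ E/c,
        # so higher-energy electrons bend less and land at a different spot on the scintillator screen.
        B_TESLA = 0.5                  # assumed dipole field strength
        C = 299_792_458.0              # speed of light, m/s

        def bending_radius_m(energy_gev):
            # r [m] = E [GeV] * 1e9 / (c [m/s] * B [T]) for an ultra-relativistic electron
            return energy_gev * 1e9 / (C * B_TESLA)

        for e_gev in (0.02, 0.5, 2.0):     # example energies: injected vs. accelerated electrons
            print(f"{e_gev:5.2f} GeV -> bending radius {bending_radius_m(e_gev):6.2f} m")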

    Ans Pardons, integration and installation coordinator of AWAKE, beside the one-metre-wide scintillator. (Image: Maximilien Brice, Julien Ordan/CERN)

    From now until the end of 2017, the whole AWAKE experiment, including the electron source and the electron beam line, will be commissioned and prepared for a very important year ahead. “It is very important to first create a high-quality electron beam with the correct energy and intensity, and then to successfully send it through the electron beam line and the plasma cell,” explains Ans Pardons, integration and installation coordinator of AWAKE.

    The first milestone was reached in December 2016, when the first data showed that the wakefields had been successfully generated. After a very successful 2017 run, it is now time for AWAKE’s next big step. Next year will be fully dedicated to proving that the acceleration of electrons in the wake of proton bunches is possible.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition


     
  • richardmitnick 11:36 am on November 23, 2017
    Tags: Identifying the major ICT challenges that face CERN, Particle Accelerators

    From CERN openlab: “CERN openlab tackles ICT challenges of High-Luminosity LHC” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    CERN


    CERN openlab has published a white paper identifying the major ICT challenges that face CERN, the European Organization for Nuclear Research, and other ‘big science’ projects in the coming years.

    CERN is home to the Large Hadron Collider (LHC), the world’s most powerful particle accelerator. The complexity of the scientific instruments at the laboratory throws up extreme ICT challenges and makes it an ideal environment for carrying out joint R&D projects and testing with industry.

    A continuing programme of upgrades to the LHC and the experiments at CERN will result in hugely increased ICT demands in the coming years. The High-Luminosity LHC, the successor to the LHC, is planned to come online in around 2026.

    By this time, the total computing capacity required by the experiments is expected to be 50-100 times greater than today, with data storage needs expected to be in the order of exabytes.

    CERN openlab works to develop and test the new ICT solutions and techniques that help to make the ground-breaking physics discoveries at CERN possible. It is a unique public-private partnership that provides a framework through which CERN can collaborate with leading ICT companies to accelerate the development of these cutting-edge technologies.

    With a new three-year phase of CERN openlab set to begin at the start of 2018, work has been carried out throughout the first half of 2017 to identify key areas for future collaboration. A series of workshops and discussions was held to examine the ICT challenges faced by the LHC research community and other ‘big science’ projects over the coming years. This white paper is the culmination of these investigations and sets out specific challenges that are ripe for tackling through collaborative R&D projects with leading ICT companies.

    The white paper identifies 16 ICT ‘challenge areas’, which have been grouped into four overarching ‘R&D topics’ (data-centre technologies and infrastructures, computing performance and software, machine learning and data analytics, applications in other disciplines). Challenges identified include ensuring that data centre architectures are flexible and cost effective; using cloud computing resources in a scalable, hybrid manner; fully modernising code, in order to exploit hardware to its maximum potential; making sure large-scale platforms are in place to enable global scientific collaboration; and successfully translating the huge potential of machine learning into concrete solutions.

    “Tackling these challenges — through a public-private partnership that brings together leading experts from each of these spheres — has the potential to make a positive impact on a range of scientific and technological fields, as well as on wider society,” says Alberto Di Meglio, head of CERN openlab.

    “With the LHC and the experiments set to undergo major upgrade work in 2019 and 2020, CERN openlab’s sixth phase offers a clear opportunity to develop ICT solutions that will already make a tangible difference for researchers when the upgraded LHC and experiments come back online in 2021,” says Maria Girone, CERN openlab CTO.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    About CERN openlab
    CERN openlab is a unique public-private partnership that accelerates the development of cutting-edge solutions for the worldwide LHC community and wider scientific research. Through CERN openlab, CERN collaborates with leading ICT companies and research institutes.

    Within this framework, CERN provides access to its complex IT infrastructure and its engineering experience, in some cases even extended to collaborating institutes worldwide. Testing in CERN’s demanding environment provides the ICT industry partners with valuable feedback on their products while allowing CERN to assess the merits of new technologies in their early stages of development for possible future use. This framework also offers a neutral ground for carrying out advanced R&D with more than one company.

    CERN openlab was created in 2001 and is now in phase V (2015-2017). This phase tackles ambitious challenges covering the most critical needs of IT infrastructures in domains such as data acquisition, computing platforms, data storage architectures, compute provisioning and management, networks and communication, and data analytics.


     
  • richardmitnick 11:21 am on November 23, 2017
    Tags: Grace C. Young, Particle Accelerators

    From CERN openlab: Women in STEM – “CERN alumna turned deep-sea explorer” Grace C. Young 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    CERN

    26 October, 2017
    No writer credit


    Each summer, the international research laboratory CERN, home to the Large Hadron Collider, welcomes dozens of students to work alongside seasoned scientists on cutting-edge particle physics research. Many of these students will pursue physics research in graduate school, but some find themselves applying the lessons they learned at CERN to new domains.

    In 2011, MIT undergraduate Grace Young was one of these CERN summer students.

    Like many young adults, Young didn’t know what career path she wanted to pursue. “I tried all the majors,” Young says. “Physics, engineering, architecture, math, computer science. Separately, I always loved both the ocean and building things; it wasn’t until I learned about ocean engineering that I knew I had found my calling.”

    Today, Young is completing her PhD in ocean engineering at the University of Oxford and is chief scientist for the deep-sea submarine Pisces VI. She develops technology for ocean research and in 2014 lived underwater for 15 days. During a recent visit to CERN, Young spoke with Symmetry writer Sarah Charley about the journey that led her from fundamental physics back to her first love, the ocean.

    As a junior in high school you competed in Intel’s International Science Fair and won a trip to CERN. What was your project?
    A classmate and I worked in a quantum physics lab at the University of Maryland. We designed and built several devices, called particle traps, that had potential applications for quantum computing. We soldered wires onto the mirror inside a flashlight to create a bowl-shaped electric field and then applied alternating current to repeatedly flip the field, which made tiny charged particles hover in mid-air.

    We were really jumping into the deep end on quantum physics; it was kind of amazing that it worked! Winning a trip to CERN was a dream come true. It was a transformative experience that had a huge impact on my career path.

    You then came back to CERN as a freshman at MIT. What is it about CERN and particle physics that made you want to return?
    My peek inside CERN the previous year sparked an interest that drove me to apply for the CERN openlab internship [a technology development collaboration between CERN scientists and members of companies or research institutes].

    Although I learned a lot from my assignment, my interest and affinity for CERN derives from the community of researchers from diverse backgrounds and disciplines from all over the world. It was CERN’s high-powered global community of scientists congregated in one beautiful place to solve big problems that was a magnet for me.

    You say you’ve always loved the ocean. What is it about the ocean that inspires you?
    I’ve loved being by the water since I was born. I find it very humbling, standing on the shore and having the waves breaking at my feet.

    This huge body of water differentiates our planet from other rocks in space, yet so little is known about it. The more time I spent on or in the water, either sailing or diving, the more I began taking a deeper interest in marine life and the essential role the ocean plays in sustaining life as we know it on Earth.

    What does an ocean engineer actually do?
    One big reason that we’ve only explored 5 percent of the ocean is because the deep sea is so forbidding for humans. We simply don’t have the biology to see or communicate underwater, much less exist for more than a few minutes just below the surface.

    But all this is changing with better underwater imaging, sensors and robotic technologies. As an ocean engineer, I design and build things such as robotic submersibles, which can monitor the health of fisheries in marine sanctuaries, track endangered species and create 3-D maps of underwater ice shelves. These tools, combined with data collected during field research, enable me and my colleagues to explore the ocean and monitor the human impact on its fragile ecosystems.

    I also design new eco-seawalls and artificial coral reefs to protect coastlines from rising sea levels and storm surges while reviving essential marine ecosystems.

    What questions are you hoping to answer during your career as an ocean engineer and researcher?
    How does the ocean support so much biodiversity? More than 70 percent of our planet is covered by water, which produces more than half the oxygen we breathe, stores more carbon dioxide than all terrestrial plant life and feeds billions of humans. And yet 95 percent of our ocean remains unexplored and essentially unknown.

    The problem we are facing today is that we are destroying so many of the ocean’s ecosystems before we even know they exist. We can learn a lot about how to stay alive and thrive by studying the oceanic habitats, leading to unforeseeable discoveries and scientific advancements.

    What are some of your big goals with this work?
    We face big existential ocean-related problems, and I’d like to help develop solutions for them. Overfishing, acidification, pollution and warming temperatures are destroying the ocean’s ecosystems and affecting humans by diminishing a vital food supply, shifting weather patterns and accelerating sea-level rise. Quite simply, if we don’t know or understand the problems, we can’t fix them.

    Have you found any unexpected overlaps between the research at CERN and the research on a submarine?
    Vision isn’t a good way to see the underwater world. The ocean is pitch black in most of its volume, and the creatures don’t rely on vision. They feel currents with their skin, use sound and can read the chemicals in the water to smell food. It would make sense for humans to use sensors that do that same thing.

    Physicists faced this same challenge and found other ways to characterize subatomic particles and the celestial bodies without relying on vision. Ocean sciences are moving in this same direction.

    What do you think ocean researchers and particle physicists can learn from each other?
    I think we already know it: That is, we can only solve big problems by working together. I’m convinced that only by working together across disciplines, ethnicities and nationalities can we survive as a species.

    Of course, the physical sciences are integral to everything related to ocean engineering, but it’s really CERN’s problem-solving methodology that’s most inspiring and applicable. CERN was created to solve big problems by combining the best of human learning irrespective of nationality, ethnicity or discipline. Our Pisces VI deep sea submarine team is multidisciplinary, multinational and—just like CERN—it’s focused on exploring the unknown that’s essential to life as we know it.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    About CERN openlab
    CERN openlab is a unique public-private partnership that accelerates the development of cutting-edge solutions for the worldwide LHC community and wider scientific research. Through CERN openlab, CERN collaborates with leading ICT companies and research institutes.

    Within this framework, CERN provides access to its complex IT infrastructure and its engineering experience, in some cases even extended to collaborating institutes worldwide. Testing in CERN’s demanding environment provides the ICT industry partners with valuable feedback on their products while allowing CERN to assess the merits of new technologies in their early stages of development for possible future use. This framework also offers a neutral ground for carrying out advanced R&D with more than one company.

    CERN openlab was created in 2001 and is now in phase V (2015-2017). This phase tackles ambitious challenges covering the most critical needs of IT infrastructures in domains such as data acquisition, computing platforms, data storage architectures, compute provisioning and management, networks and communication, and data analytics.


     
  • richardmitnick 9:15 am on November 23, 2017
    Tags: First light for pioneering SESAME light source, Particle Accelerators

    From CERN: “First light for pioneering SESAME light source” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    CERN

    23 Nov 2017
    Harriet Kim Jarlett

    SESAME Particle Accelerator Jordan interior


    SESAME Particle Accelerator, Jordan campus, an independent laboratory located in Allan in the Balqa governorate of Jordan

    At 10:50 yesterday morning scientists at the pioneering SESAME light source saw First Monochromatic Light through the XAFS/XRF (X-ray absorption fine structure/X-ray fluorescence) spectroscopy beamline, signalling the start of the laboratory’s experimental programme. This beamline, SESAME’s first to come on stream, delivers X-ray light that will be used to carry out research in areas ranging from solid state physics to environmental science and archaeology.

    “After years of preparation, it’s great to see light on target,” said XAFS/XRF beamline scientist Messaoud Harfouche. “We have a fantastic experimental programme ahead of us, starting with an experiment to investigate heavy metals contaminating soils in the region.”

    The initial research programme will be carried out at two beamlines, the XAFS/XRF beamline and the Infrared (IR) spectromicroscopy beamline that is scheduled to join the XAFS/XRF beamline this year. Both have specific characteristics that make them appropriate for various areas of research. A third beamline, devoted to materials science, will come on stream in 2018.

    “Our first three beamlines already give SESAME a wide range of research options to fulfil the needs of our research community,” said SESAME Scientific Director Giorgio Paolucci. “The future for light source research in the Middle East and neighbouring countries is looking very bright!”

    First Light is an important step in the commissioning process of a new synchrotron light source, but it is nevertheless just one step on the way to full operation. The SESAME synchrotron is currently operating with a beam current of just over 80 milliamps, while the design value is 400 milliamps. Over the coming weeks and months as experiments get underway, the current will be gradually increased.

    “SESAME is a major scientific and technological addition to research and education in the Middle East and beyond,” said Director of SESAME, Khaled Toukan. “Jordan has supported the project financially and politically since its inception in 2004 for the benefit of science and peace in the region. The young scientists, physicists, engineers and administrators who have built SESAME come for the first time from this part of the world.”

    Among the subjects likely to be studied in early experiments are environmental pollution with a view to improving public health, as well as studies aimed at identifying new drugs for cancer therapy, and cultural heritage studies ranging from bioarcheology – the study of our ancestors – to investigations of ancient manuscripts.

    “On behalf of the SESAME Council, I’d like to congratulate the SESAME staff on this wonderful milestone,” said President of the Council, Rolf Heuer. “SESAME is a great addition to the region’s research infrastructure, allowing scientists from the region access to the kind of facility that they previously had to travel to Europe or the US to use.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition


     
  • richardmitnick 12:10 pm on November 22, 2017 Permalink | Reply
    Tags: Intel, Particle Accelerators

    From CERN: “Fermilab joins CERN openlab, works on ‘data reduction’ project with CMS experiment” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    CERN

    Fermilab Wilson Hall

    Fermilab, the USA’s premier particle physics and accelerator laboratory, has joined CERN openlab as a research member. Researchers from the laboratory will collaborate with members of the CMS experiment and the CERN IT Department on efforts to improve technologies related to ‘physics data reduction’. This work will take place within the framework of an existing CERN openlab project with Intel on ‘big-data analytics’.

    CERN/CMS Detector

    ‘Physics data reduction’ plays a vital role in ensuring researchers are able to gain valuable insights from the vast amounts of particle-collision data produced by high-energy physics experiments, such as the CMS experiment on CERN’s Large Hadron Collider (LHC).

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    The project’s goal is to develop a new system — using industry-standard big-data tools — for filtering many petabytes of heterogeneous collision data to create manageable, but rich, datasets of a few terabytes for analysis. Using current systems, this kind of targeted data reduction can often take weeks; but the aim of the project is to be able to achieve this in a matter of hours.

    “Time is critical in analysing the ever-increasing volumes of LHC data,” says Oliver Gutsche, a Fermilab scientist working at the CMS experiment. “I am excited about the prospects CERN openlab brings to the table: systems that could enable us to perform analysis much faster and with much less effort and resources.” Gutsche and his colleagues will explore methods of ensuring efficient access to the data from the experiment. For this, they will investigate techniques based on Apache Spark, a popular open-source software platform for distributed processing of very large data sets on computer clusters built from commodity hardware. “The success of this project will have a large impact on the way analysis is conducted, allowing more optimised results to be produced in far less time,” says Matteo Cremonesi, a research associate at Fermilab. “I am really looking forward to using the new open-source tools; they will be a game changer for the overall scientific process in high-energy physics.”
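
    To make the idea concrete, here is a minimal sketch of what such a Spark-based reduction job can look like. It is illustrative only: the file paths, column names, and selection cuts are hypothetical, and the real CMS workflow reads experiment-specific formats with far richer event content.

        # Minimal sketch of a Spark "data reduction" (skimming) job.
        # Paths, column names, and cuts below are hypothetical placeholders.
        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("cms-data-reduction-sketch").getOrCreate()

        # Read a (hypothetical) Parquet copy of collision events stored on the cluster.
        events = spark.read.parquet("hdfs:///cms/collisions/2017/*.parquet")

        # Keep only events passing a simple analysis selection, and only the columns
        # an analysis group actually needs, shrinking petabytes toward terabytes.
        reduced = (
            events
            .filter((F.col("n_muons") >= 2) & (F.col("missing_et") > 50.0))
            .select("run", "event", "muon_pt", "muon_eta", "missing_et")
        )

        # Write the much smaller skim back out for fast, repeated analysis.
        reduced.write.mode("overwrite").parquet("hdfs:///analysis/dimuon_skim/")

    The point of the pattern is that the filtering and column pruning run in parallel across the cluster, so the expensive full dataset is read once and only the small skim is written back out.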

    The team plans to first create a prototype of the system, capable of processing 1 PB of data with about 1000 computer cores. Based on current projections, this is about 1/20th of the scale of the final system that would be needed to handle the data produced when the High-Luminosity LHC comes online in 2026.

    Using this prototype, it should be possible to produce a benchmark (or ‘reference workload’) that can be used to evaluate the optimum configuration of both hardware and software for the data-reduction system.

    “This kind of work, investigating big-data analytics techniques, is vital for high-energy physics — both in terms of physics data and data from industrial control systems on the LHC,” says Maria Girone, CERN openlab CTO. “However, these investigations also potentially have far-reaching impact for a range of other disciplines. For example, this CERN openlab project with Intel is also exploring the use of these kinds of analytics techniques for healthcare data.”

    “Intel is proud of the work it has done in enabling the high-energy physics community to adopt the latest technologies for high-performance computing, data analytics, and machine learning — and reap the benefits. CERN openlab’s project on big-data analytics is one of the strategic endeavours to which Intel has been contributing,” says Stephan Gillich, Intel Deutschland’s director of technical computing for Europe, the Middle East, and Africa. “The possibility of extending the CERN openlab collaboration to include Fermilab, one of the world’s leading research centres, is further proof of the scientific relevance and success of this private-public partnership.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    About CERN openlab

    CERN openlab is a unique public-private partnership that accelerates the development of cutting-edge solutions for the worldwide LHC community and wider scientific research. Through CERN openlab, CERN collaborates with leading ICT companies and research institutes.

    Within this framework, CERN provides access to its complex IT infrastructure and its engineering experience, in some cases even extended to collaborating institutes worldwide. Testing in CERN’s demanding environment provides the ICT industry partners with valuable feedback on their products while allowing CERN to assess the merits of new technologies in their early stages of development for possible future use. This framework also offers a neutral ground for carrying out advanced R&D with more than one company.

    CERN openlab was created in 2001 and is now in its fifth phase (2015-2017). This phase tackles ambitious challenges covering the most critical needs of IT infrastructures in domains such as data acquisition, computing platforms, data storage architectures, compute provisioning and management, networks and communication, and data analytics.


     
  • richardmitnick 8:53 am on November 22, 2017 Permalink | Reply
    Tags: Particle Accelerators

    From Futurism: “Quantum Physicists Conclude Necessary Makeup of Elusive Tetraquarks” 

    futurism-bloc

    Futurism

    Mesons Baryons Tetraquarks

    https://blog.cerebrodigital.org/tetraquark-particula-exotica-descubierta-en-fermilab/

    November 20, 2017
    Abby Norman

    Everything in the universe is made up of atoms — except, of course, atoms themselves. They’re made up of subatomic particles, namely protons, neutrons, and electrons. While electrons are classified as leptons, protons and neutrons are built from a class of particles known as quarks. Though “known” may be a bit misleading: there is far more that theoretical physicists don’t know about these particles than they know with any degree of certainty.

    As far as we know, quarks are among the fundamental particles of the universe. You can’t break a quark down into anything smaller. Imagining them as uniformly minuscule is not quite accurate, however: while they all appear point-like, they are not all the same mass. Some quarks are far heavier than others, and they can join together to create mesons (one quark plus one antiquark) or baryons (three quarks of various flavors).

    We’ve identified six possible quark flavors: up, down, top, bottom, charm, and strange. As mentioned, they usually bind either as quark-antiquark pairs or as quark threesomes, so long as the electric charges add up to a whole number. In a proton, for example, the charges +2/3, +2/3, and −1/3 sum to +1.
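
    For readers who want to see that arithmetic spelled out, here is a minimal sketch (mine, not the article’s; the hadron_charge helper and the flavor labels are purely illustrative) of the charge bookkeeping in Python:

        # Toy charge bookkeeping for hadrons (illustrative only, not from the article).
        # Up-type quarks (u, c, t) carry +2/3; down-type quarks (d, s, b) carry -1/3;
        # antiquarks (written here with a leading "~") carry the opposite sign.
        from fractions import Fraction

        QUARK_CHARGE = {
            "u": Fraction(2, 3), "c": Fraction(2, 3), "t": Fraction(2, 3),
            "d": Fraction(-1, 3), "s": Fraction(-1, 3), "b": Fraction(-1, 3),
        }

        def hadron_charge(quarks):
            """Sum the electric charges of a quark list such as ['u', 'u', 'd']."""
            total = Fraction(0)
            for q in quarks:
                charge = QUARK_CHARGE[q.lstrip("~")]
                total += -charge if q.startswith("~") else charge
            return total

        print(hadron_charge(["u", "u", "d"]))         # proton (uud): 1
        print(hadron_charge(["u", "d", "d"]))         # neutron (udd): 0
        print(hadron_charge(["u", "~d"]))             # pi+ meson: 1
        print(hadron_charge(["b", "b", "~u", "~d"]))  # bb(anti-u)(anti-d) tetraquark: -1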

    The so-called tetraquark has long eluded scientists: a hadron made of two quark-antiquark pairs, held together by the strong force. Now, it’s not enough for the quarks to simply pair off and interact only with their partner. To be a true tetraquark, all four would need to interact with one another; behaving as quantum swingers, if you will.

    “Quarky” Swingers

    It might seem like a pretty straightforward concept: throw four quarks together and they’re bound to interact, right? Well, not necessarily. And that would be assuming they’d pair off stably in the first place, which isn’t a given. As Marek Karliner of Tel Aviv University explained to LiveScience, two quarks aren’t any more likely to pair off in a stable union than two random people you throw into an apartment together. When it comes to both people and quarks, close proximity doesn’t ensure chemistry.

    “The big open question had been whether such combinations would be stable, or would they instantly disintegrate into two quark-antiquark mesons,” Karliner told Futurism. “Many years of experimental searches came up empty-handed, and no one knew for sure whether stable tetraquarks exist.”

    Most discussions of tetraquarks until recently involved those “ad hoc” tetraquarks: the ones where four quarks were paired off, but not all interacting. Finding the bona fide quark clique has been the “holy grail” of theoretical physics for years – and we’re agonizingly close.

    Recalling that quarks are not something we can actually see, it probably goes without saying that predicting the existence of such an arrangement would be incredibly hard to do. For years, the prevailing expectation was that four quarks could not come together and form a stable hadron. But two physicists found a way to simplify (as much as you can “simplify” quantum mechanics) the approach to the search for tetraquarks.

    Several years ago, Karliner and his research partner, Jonathan Rosner of the University of Chicago, set out to establish the idea that if you want to know the mass and binding energy of a rare hadron, you can start by comparing it to the common hadrons whose measurements you already have. In their research [Nature] they looked at charm quarks, whose properties are well measured and well understood (by quantum physicists, at least).
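
    Schematically (my paraphrase of the general constituent-quark-model bookkeeping, not the authors’ actual formula), this kind of estimate writes a hadron’s mass as a sum of effective quark masses plus binding and spin-dependent corrections whose coefficients are fixed by fitting well-measured hadrons:

        M_{\text{hadron}} \;\approx\; \sum_i m_i^{\text{eff}} \;+\; B \;+\; \sum_{i<j} a_{ij}\,\frac{\langle \vec{s}_i \cdot \vec{s}_j \rangle}{m_i^{\text{eff}}\, m_j^{\text{eff}}}

    Predicting an as-yet-unseen state then amounts to reusing the effective masses and coefficients extracted from hadrons that have already been measured.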

    Based on these comparisons, they proposed that a doubly charmed baryon should have a mass of 3,627 ± 12 MeV [Physical Review Letters]. The next step was to convince CERN to go tetraquark-hunting, using their math as a map.

    For all the complex work it undertakes, the vast majority of which is nothing detectable by the human eye, the Large Hadron Collider is exactly what the name implies: it’s a massive particle accelerator that smashes protons together, revealing their inner quarks.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    If you’re out to prove the existence of a very tiny theoretical particle, the LHC is where you want to start — though there’s no way to know how long it will be before, if ever, the particles you seek appear.

    It took several years, but in the summer of 2017, the LHC detected a new baryon: one with a single up quark and two heavy charm quarks — the kind of doubly charmed baryon Karliner and Rosner were hoping for. The mass of the baryon was 3,621 MeV, give or take 1 MeV, extremely close to the value Karliner and Rosner had predicted. Prior to this observation, physicists had speculated about — but never detected — a baryon containing more than one heavy quark. In terms of the hunt for the tetraquark, this was an important piece of evidence: the even heavier bottom quark could be just what is needed to bind a stable tetraquark.
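
    As a quick back-of-envelope check (mine, not the article’s) of just how close prediction and measurement were, one can combine the two quoted uncertainties in quadrature:

        import math

        predicted, sigma_pred = 3627.0, 12.0  # MeV, Karliner-Rosner prediction
        measured, sigma_meas = 3621.0, 1.0    # MeV, the 2017 LHC measurement

        # Express the difference in units of the combined uncertainty.
        combined = math.sqrt(sigma_pred**2 + sigma_meas**2)
        pull = abs(predicted - measured) / combined
        print(f"difference: {abs(predicted - measured):.0f} MeV, about {pull:.1f} sigma")

    A gap of roughly half a standard deviation counts as excellent agreement for a hadron-mass prediction.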

    The perpetual frustration of studying particles is that they don’t stay around long. These baryons, in particular, disappear faster than “blink-and-you’ll-miss-it” speed: roughly one ten-trillionth of a second. Of course, in the world of quantum physics, that’s actually plenty of time to establish existence, thanks to the LHC.
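
    One way to see why that is “plenty of time” (a rough estimate of mine, not a figure from the article): even at that lifetime, a particle moving near the speed of light covers a distance on the scale that modern vertex detectors are built to resolve.

        # Rough decay-length estimate for a lifetime of ~1e-13 s, ignoring relativistic
        # time dilation (which only lengthens the flight distance in the lab frame).
        c = 3.0e8            # speed of light, m/s
        lifetime = 1.0e-13   # s, "one ten-trillionth of a second"
        print(f"flight distance ≈ {c * lifetime * 1e6:.0f} micrometres")  # ≈ 30 micrometres

    Tens of micrometres of flight before decay is enough for the decay point to be separated from the collision point in the detector.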

    The great quantum qualm within the LHC, however, is one that presents a significant challenge in the search for tetraquarks: heavier particles are less likely to show up, and while this is all happening on an infinitesimal level, as far as the quantum scale is concerned, bottom quarks are behemoths.

    The next question for Rosner and Karliner, then, was whether it made more sense to try to build a tetraquark rather than wait around for one to show up. You’d need to generate two bottom quarks close enough together that they’d hook up, then throw in a pair of lighter antiquarks — and then do it again and again, successfully, enough times to satisfy the scientific method.

    “Our paper uses the data from recently discovered double-charmed baryon to point, for the first time, that a stable tetraquark *must* exist,” Karliner told Futurism, adding that there’s “a very good chance” the LHCb at CERN would succeed in observing the phenomenon experimentally.

    That, of course, is still a theoretical proposition, but should anyone undertake it, the LHC would keep on smashing in the meantime — and perhaps the combination would arise on its own. As Karliner reminded LiveScience, for years the assumption has been that stable tetraquarks are impossible, or at the very least profoundly at odds with the familiar picture of hadrons as mesons and baryons. But that assumption is certainly being challenged. “The tetraquark is a truly new form of strongly-interacting matter,” Karliner told Futurism, “in addition to ordinary baryons and mesons.”

    If tetraquarks are not impossible, or even particularly improbable, then thanks to Karliner and Rosner’s calculations we at least have a better sense of what we’re looking for — and where it might pop up.

    Where there’s smoke there’s fire, as they say, and while the mind-boggling realm of quantum mechanics may feel more like smoke and mirrors to us, theoretical physicists aren’t giving up just yet. Where there are two bottom quarks, there could be tetraquarks.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Futurism covers the breakthrough technologies and scientific discoveries that will shape humanity’s future. Our mission is to empower our readers and drive the development of these transformative technologies towards maximizing human potential.

     