Tagged: CERN LHC

  • richardmitnick 1:10 pm on March 10, 2019 Permalink | Reply
    Tags: CERN LHC, CERN-Future Circular Collider, HL-LHC-High-Luminosity LHC

    From WIRED: “Inside the High-Stakes Race to Make Quantum Computers Work” 

    Wired logo

    From WIRED

    03.08.19
    Katia Moskvitch

    View Pictures/Getty Images

    Deep beneath the Franco-Swiss border, the Large Hadron Collider is sleeping.

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    But it won’t be quiet for long. Over the coming years, the world’s largest particle accelerator will be supercharged, increasing the number of proton collisions per second by a factor of two and a half.

    Once the work is complete in 2026, researchers hope to unlock some of the most fundamental questions in the universe. But with the increased power will come a deluge of data the likes of which high-energy physics has never seen before. And, right now, humanity has no way of knowing what the collider might find.

    To understand the scale of the problem, consider this: When it shut down in December 2018, the LHC generated about 300 gigabytes of data every second, adding up to 25 petabytes (PB) annually. For comparison, you’d have to spend 50,000 years listening to music to go through 25 PB of MP3 songs, while the human brain can store memories equivalent to just 2.5 PB of binary data. To make sense of all that information, the LHC data was pumped out to 170 computing centers in 42 countries [http://greybook.cern.ch/]. It was this global collaboration that helped discover the elusive Higgs boson, part of the Higgs field believed to give mass to elementary particles of matter.

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    To process the looming data torrent, scientists at the European Organization for Nuclear Research, or CERN, will need 50 to 100 times more computing power than they have at their disposal today. A proposed Future Circular Collider, four times the size of the LHC and 10 times as powerful, would create an impossibly large quantity of data, at least twice as much as the LHC.

    CERN FCC Future Circular Collider map

    In a bid to make sense of the impending data deluge, some at CERN are turning to the emerging field of quantum computing. Powered by the very laws of nature the LHC is probing, such a machine could potentially crunch the expected volume of data in no time at all. What’s more, it would speak the same language as the LHC. While numerous labs around the world are trying to harness the power of quantum computing, it is the prospect of putting it to work at CERN that makes this research particularly exciting. There’s just one problem: Right now, there are only prototypes; nobody knows whether it’s actually possible to build a reliable quantum device.

    Traditional computers—be it an Apple Watch or the most powerful supercomputer—rely on tiny silicon transistors that work like on-off switches to encode bits of data.

    ORNL IBM AC922 SUMMIT supercomputer, No.1 on the TOP500. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

    Each circuit can have one of two values—either one (on) or zero (off) in binary code; the computer turns the voltage in a circuit on or off to make it work.

    A quantum computer is not limited to this “either/or” way of thinking. Its memory is made up of quantum bits, or qubits—tiny particles of matter like atoms or electrons. And qubits can do “both/and,” meaning that they can be in a superposition of all possible combinations of zeros and ones; they can be all of those states simultaneously.
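
    To make the “both/and” idea concrete, here is a minimal sketch in Python with numpy (the code and all names in it are illustrative, not from the article): a two-qubit register stored as four complex amplitudes and pushed into an equal superposition of all four basis states.

```python
import numpy as np

# A 2-qubit register is a vector of 4 complex amplitudes, one per basis
# state |00>, |01>, |10>, |11>. A classical 2-bit register holds exactly
# one of these states; the quantum register can hold a mix of all four.
state = np.zeros(4, dtype=complex)
state[0] = 1.0  # start in the definite state |00>

# Applying a Hadamard gate to each qubit spreads the amplitude evenly,
# putting the register into a superposition of all four basis states.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
state = np.kron(H, H) @ state

for basis, amp in zip(("00", "01", "10", "11"), state):
    print(f"|{basis}>: amplitude {amp.real:+.3f}, probability {abs(amp)**2:.2f}")
```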

    For CERN, the quantum promise could, for instance, help its scientists find evidence of supersymmetry, or SUSY, which so far has proven elusive.

    Standard Model of Supersymmetry via DESY

    At the moment, researchers spend weeks and months sifting through the debris from proton-proton collisions in the LHC, trying to find exotic, heavy sister-particles to all our known particles of matter. The quest has now lasted decades, and a number of physicists are questioning if the theory behind SUSY is really valid. A quantum computer would greatly speed up analysis of the collisions, hopefully finding evidence of supersymmetry much sooner—or at least allowing us to ditch the theory and move on.

    A quantum device might also help scientists understand the evolution of the early universe, the first few minutes after the Big Bang. Physicists are pretty confident that back then, our universe was nothing but a strange soup of subatomic particles called quarks and gluons. To understand how this quark-gluon plasma has evolved into the universe we have today, researchers simulate the conditions of the infant universe and then test their models at the LHC, with multiple collisions. Performing a simulation on a quantum computer, governed by the same laws that govern the very particles that the LHC is smashing together, could lead to a much more accurate model to test.

    Beyond pure science, banks, pharmaceutical companies, and governments are also waiting to get their hands on computing power that could be tens or even hundreds of times greater than that of any traditional computer.

    And they’ve been waiting for decades. Google is in the race, as are IBM, Microsoft, Intel and a clutch of startups, academic groups, and the Chinese government. The stakes are incredibly high. Last October, the European Union pledged to give $1 billion to over 5,000 European quantum technology researchers over the next decade, while venture capitalists invested some $250 million in various companies researching quantum computing in 2018 alone. “This is a marathon,” says David Reilly, who leads Microsoft’s quantum lab at the University of Sydney, Australia. “And it’s only 10 minutes into the marathon.”

    Despite the hype surrounding quantum computing and the media frenzy triggered by every announcement of a new qubit record, none of the competing teams have come close to reaching even the first milestone, fancily called quantum supremacy—the moment when a quantum computer performs at least one specific task better than a standard computer. Any kind of task, even if it is totally artificial and pointless. There are plenty of rumors in the quantum community that Google may be close, although if true, it would give the company bragging rights at best, says Michael Biercuk, a physicist at the University of Sydney and founder of quantum startup Q-CTRL. “It would be a bit of a gimmick—an artificial goal,” says Reilly. “It’s like concocting some mathematical problem that really doesn’t have an obvious impact on the world just to say that a quantum computer can solve it.”

    That’s because the first real checkpoint in this race is much further away. Called quantum advantage, it would see a quantum computer outperform normal computers on a truly useful task. (Some researchers use the terms quantum supremacy and quantum advantage interchangeably.) And then there is the finish line, the creation of a universal quantum computer. The hope is that it would deliver a computational nirvana with the ability to perform a broad range of incredibly complex tasks. At stake is the design of new molecules for life-saving drugs, helping banks to adjust the riskiness of their investment portfolios, a way to break all current cryptography and develop new, stronger systems, and for scientists at CERN, a way to glimpse the universe as it was just moments after the Big Bang.

    Slowly but surely, work is already underway. Federico Carminati, a physicist at CERN, admits that today’s quantum computers wouldn’t give researchers anything more than classical machines, but, undeterred, he’s started tinkering with IBM’s prototype quantum device via the cloud while waiting for the technology to mature. It’s the latest baby step in the quantum marathon. The deal between CERN and IBM was struck in November last year at an industry workshop organized by the research organization.

    Set up to exchange ideas and discuss potential collaborations, the event had CERN’s spacious auditorium packed to the brim with researchers from Google, IBM, Intel, D-Wave, Rigetti, and Microsoft. Google detailed its tests of Bristlecone, a 72-qubit machine. Rigetti was touting its work on a 128-qubit system. Intel showed that it was in close pursuit with 49 qubits. For IBM, physicist Ivano Tavernelli took to the stage to explain the company’s progress.

    IBM has steadily been boosting the number of qubits on its quantum computers, starting with a meagre 5-qubit computer, then 16- and 20-qubit machines, and just recently showing off its 50-qubit processor.

    IBM iconic image of Quantum computer

    Carminati listened to Tavernelli, intrigued, and during a much needed coffee break approached him for a chat. A few minutes later, CERN had added a quantum computer to its impressive technology arsenal. CERN researchers are now starting to develop entirely new algorithms and computing models, aiming to grow together with the device. “A fundamental part of this process is to build a solid relationship with the technology providers,” says Carminati. “These are our first steps in quantum computing, but even if we are coming relatively late into the game, we are bringing unique expertise in many fields. We are experts in quantum mechanics, which is at the base of quantum computing.”

    The attraction of quantum devices is obvious. Take standard computers. The prediction by former Intel CEO Gordon Moore in 1965 that the number of components in an integrated circuit would double roughly every two years has held true for more than half a century. But many believe that Moore’s law is about to hit the limits of physics. Since the 1980s, however, researchers have been pondering an alternative. The idea was popularized by Richard Feynman, an American physicist at Caltech in Pasadena. During a lecture in 1981, he lamented that computers could not really simulate what was happening at a subatomic level, with tricky particles like electrons and photons that behave like waves but also dare to exist in two states at once, a phenomenon known as quantum superposition.

    Feynman proposed to build a machine that could. “I’m not happy with all the analyses that go with just the classical theory, because nature isn’t classical, dammit,” he told the audience back in 1981. “And if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy.”

    And so the quantum race began. Qubits can be made in different ways, but the rule is that two qubits can be both in state A, both in state B, one in state A and one in state B, or vice versa, so there are four possible combinations in total. And you won’t know what state a qubit is in until you measure it and the qubit is yanked out of its quantum world of probabilities into our mundane physical reality.

    In theory, a quantum computer would process all the states a qubit can have at once, and with every qubit added to its memory size, its computational power should increase exponentially. So, for three qubits, there are eight states to work with simultaneously; for four, 16; for 10, 1,024; and for 20, a whopping 1,048,576 states. You don’t need a lot of qubits to quickly surpass the memory banks of the world’s most powerful modern supercomputers—meaning that for specific tasks, a quantum computer could find a solution much faster than any regular computer ever would. Add to this another crucial concept of quantum mechanics: entanglement. It means that qubits can be linked into a single quantum system, where operating on one affects the rest of the system. This way, the computer can harness the processing power of all the linked qubits simultaneously, massively increasing its computational ability.
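
    The exponential growth is easy to check with a few lines of Python (the 16-bytes-per-amplitude figure assumes double-precision complex numbers, a common but not universal choice):

```python
# Each extra qubit doubles the number of amplitudes a classical simulator
# must track; at 16 bytes per complex amplitude the memory cost explodes.
for n in (3, 4, 10, 20, 50):
    states = 2 ** n
    gigabytes = states * 16 / 1e9
    print(f"{n:>2} qubits -> {states:>20,} states (~{gigabytes:,.3g} GB)")
```

    At 50 qubits the state vector alone would occupy roughly 18 petabytes, which is why classical simulation gives out so quickly.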

    While a number of companies and labs are competing in the quantum marathon, many are running their own races, taking different approaches. One device has even been used by a team of researchers to analyze CERN data, albeit not at CERN. Last year, physicists from the California Institute of Technology in Pasadena and the University of Southern California managed to replicate the discovery of the Higgs boson, found at the LHC in 2012, by sifting through the collider’s troves of data using a quantum computer manufactured by D-Wave, a Canadian firm based in Burnaby, British Columbia. The findings didn’t arrive any quicker than on a traditional computer, but, crucially, the research showed a quantum machine could do the work.

    One of the oldest runners in the quantum race, D-Wave announced back in 2007 that it had built a fully functioning, commercially available 16-qubit quantum computer prototype—a claim that’s controversial to this day. D-Wave focuses on a technology called quantum annealing, based on the natural tendency of real-world quantum systems to find low-energy states (a bit like a spinning top that inevitably will fall over). A D-Wave quantum computer imagines the possible solutions of a problem as a landscape of peaks and valleys; each coordinate represents a possible solution and its elevation represents its energy. Annealing allows you to set up the problem, and then let the system fall into the answer—in about 20 milliseconds. As it does so, it can tunnel through the peaks as it searches for the lowest valleys. It finds the lowest point in the vast landscape of solutions, which corresponds to the best possible outcome—although it does not attempt to fully correct for any errors, inevitable in quantum computation. D-Wave is now working on a prototype of a universal annealing quantum computer, says Alan Baratz, the company’s chief product officer.
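
    The peaks-and-valleys picture maps onto code quite directly. The sketch below is a classical caricature, a Metropolis-style simulated annealing loop in Python; D-Wave’s hardware anneals a physical quantum system and can tunnel through barriers rather than climb over them, so treat this strictly as an analogy:

```python
import numpy as np

# A walker on a 1-D energy landscape settles into a low-energy valley
# as the "temperature" is lowered (classical analogy to annealing).
rng = np.random.default_rng(7)

xs = np.linspace(-4.0, 4.0, 2001)
energy = 0.5 * xs**2 + 2.0 * np.sin(3.0 * xs)   # many valleys, one global minimum

i = rng.integers(len(xs))                        # random starting solution
for temperature in np.geomspace(5.0, 0.01, 5000):
    j = np.clip(i + rng.integers(-5, 6), 0, len(xs) - 1)    # propose a nearby move
    dE = energy[j] - energy[i]
    if dE < 0 or rng.random() < np.exp(-dE / temperature):  # Metropolis rule
        i = j

print(f"settled at x = {xs[i]:.2f}, energy = {energy[i]:.2f} "
      f"(global minimum at x = {xs[np.argmin(energy)]:.2f})")
```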

    Apart from D-Wave’s quantum annealing, there are three other main approaches to try and bend the quantum world to our whim: integrated circuits, topological qubits and ions trapped with lasers. CERN is placing high hopes on the first method but is closely watching other efforts too.

    IBM, whose computer Carminati has just started using, as well as Google and Intel, all make quantum chips with integrated circuits—quantum gates—that are superconducting, a state in which certain metals conduct electricity with zero resistance. Each quantum gate holds a pair of very fragile qubits. Any noise will disrupt them and introduce errors—and in the quantum world, noise is anything from temperature fluctuations to electromagnetic and sound waves to physical vibrations.

    To isolate the chip from the outside world as much as possible and get the circuits to exhibit quantum mechanical effects, it needs to be supercooled to extremely low temperatures. At the IBM quantum lab in Zurich, the chip is housed in a white tank—a cryostat—suspended from the ceiling. The temperature inside the tank is a steady 10 millikelvin or –273 degrees Celsius, a fraction above absolute zero and colder than outer space. But even this isn’t enough.

    Just working with the quantum chip, when scientists manipulate the qubits, causes noise. “The outside world is continually interacting with our quantum hardware, damaging the information we are trying to process,” says physicist John Preskill at the California Institute of Technology, who in 2012 coined the term quantum supremacy. It’s impossible to get rid of the noise completely, so researchers are trying to suppress it as much as possible, hence the ultracold temperatures to achieve at least some stability and allow more time for quantum computations.

    “My job is to extend the lifetime of qubits, and we’ve got four of them to play with,” says Matthias Mergenthaler, a postdoctoral researcher from Oxford University working at IBM’s Zurich lab. That doesn’t sound like a lot, but, he explains, it’s not so much the number of qubits that counts but their quality, meaning qubits with as low a noise level as possible, to ensure they last as long as possible in superposition and allow the machine to compute. And it’s here, in the fiddly world of noise reduction, that quantum computing hits up against one of its biggest challenges. Right now, the device you’re reading this on probably performs at a level similar to that of a quantum computer with 30 noisy qubits. But if you can reduce the noise, then the quantum computer is many times more powerful.

    Once the noise is reduced, researchers try to correct any remaining errors with the help of special error-correcting algorithms, run on a classical computer. The problem is, such error correction works qubit by qubit, so the more qubits there are, the more errors the system has to cope with. Say a computer makes an error once every 1,000 computational steps; it doesn’t sound like much, but after 1,000 or so operations, the program will output incorrect results. To achieve meaningful computations and surpass standard computers, a quantum machine needs about 1,000 relatively low-noise qubits with their errors corrected as thoroughly as possible. Put together, those 1,000 qubits would make up what researchers call a logical qubit. None yet exist—so far, the best that prototype quantum devices have achieved is error correction for up to 10 qubits. That’s why these prototypes are called noisy intermediate-scale quantum computers (NISQ), a term also coined by Preskill in 2017.
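
    The overhead argument is easiest to see with a classical repetition code, the simplest possible stand-in (real quantum error correction cannot simply copy and measure qubits, so this Python sketch only conveys the scaling intuition):

```python
import numpy as np

# Toy classical repetition code: encode one logical bit into many noisy
# physical bits and recover it by majority vote. The idea of spending
# many physical (qu)bits per logical (qu)bit is the same.
rng = np.random.default_rng(42)

def logical_error_rate(physical_bits, p_flip, trials=100_000):
    # Each physical copy flips independently with probability p_flip;
    # the logical bit is wrong when more than half the copies flipped.
    flips = rng.random((trials, physical_bits)) < p_flip
    return np.mean(flips.sum(axis=1) > physical_bits // 2)

for n in (1, 3, 15, 101):
    print(f"{n:>3} physical bits: logical error rate "
          f"{logical_error_rate(n, p_flip=0.01):.2e}")
```

    Redundancy helps here because each copy fails independently; the skeptics’ objection described below is precisely that real quantum errors may be correlated, which would break this kind of scaling.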

    For Carminati, it’s clear the technology isn’t ready yet. But that isn’t really an issue. At CERN the challenge is to be ready to unlock the power of quantum computers when and if the hardware becomes available. “One exciting possibility will be to perform very, very accurate simulations of quantum systems with a quantum computer—which in itself is a quantum system,” he says. “Other groundbreaking opportunities will come from the blend of quantum computing and artificial intelligence to analyze big data, a very ambitious proposition at the moment, but central to our needs.”

    But some physicists think NISQ machines will stay just that—noisy—forever. Gil Kalai, a professor at Yale University, says that error correcting and noise suppression will never be good enough to allow any kind of useful quantum computation. And it’s not even due to technology, he says, but to the fundamentals of quantum mechanics. Interacting systems have a tendency for errors to be connected, or correlated, he says, meaning errors will affect many qubits simultaneously. Because of that, it simply won’t be possible to create error-correcting codes that keep noise levels low enough for a quantum computer with the required large number of qubits.

    “My analysis shows that noisy quantum computers with a few dozen qubits deliver such primitive computational power that it will simply not be possible to use them as the building blocks we need to build quantum computers on a wider scale,” he says. Among scientists, such skepticism is hotly debated. The blogs of Kalai and fellow quantum skeptics are forums for lively discussion, as was a recent much-shared article titled “The Case Against Quantum Computing”—followed by its rebuttal, “The Case Against the Case Against Quantum Computing.”

    For now, the quantum critics are in a minority. “Provided the qubits we can already correct keep their form and size as we scale, we should be okay,” says Ray Laflamme, a physicist at the University of Waterloo in Ontario, Canada. The crucial thing to watch out for right now is not whether scientists can reach 50, 72, or 128 qubits, but whether scaling quantum computers to this size significantly increases the overall rate of error.

    The Quantum Nano Centre in Canada is one of numerous big-budget research and development labs focused on quantum computing. James Brittain/Getty Images

    Others believe that the best way to suppress noise and create logical qubits is by making qubits in a different way. At Microsoft, researchers are developing topological qubits—although its array of quantum labs around the world has yet to create a single one. If it succeeds, these qubits would be much more stable than those made with integrated circuits. Microsoft’s idea is to split a particle—for example an electron—in two, creating Majorana fermion quasi-particles. They were theorized back in 1937, and in 2012 researchers at Delft University of Technology in the Netherlands, working at Microsoft’s condensed matter physics lab, obtained the first experimental evidence of their existence.

    “You will only need one of our qubits for every 1,000 of the other qubits on the market today,” says Chetan Nayak, general manager of quantum hardware at Microsoft. In other words, every single topological qubit would be a logical one from the start. Reilly believes that researching these elusive qubits is worth the effort, despite years with little progress, because if one is created, scaling such a device to thousands of logical qubits would be much easier than with a NISQ machine. “It will be extremely important for us to try out our code and algorithms on different quantum simulators and hardware solutions,” says Carminati. “Sure, no machine is ready for prime time quantum production, but neither are we.”

    Another company Carminati is watching closely is IonQ, a US startup that spun out of the University of Maryland. It uses the third main approach to quantum computing: trapping ions. They are naturally quantum, having superposition effects right from the start and at room temperature, meaning that they don’t have to be supercooled like the integrated circuits of NISQ machines. Each ion is a singular qubit, and researchers trap them with special tiny silicon ion traps and then use lasers to run algorithms by varying the times and intensities at which each tiny laser beam hits the qubits. The beams encode data to the ions and read it out from them by getting each ion to change its electronic states.

    In December, IonQ unveiled its commercial device, capable of hosting 160 ion qubits and performing simple quantum operations on a string of 79 qubits. Still, right now, ion qubits are just as noisy as those made by Google, IBM, and Intel, and neither IonQ nor any other labs around the world experimenting with ions have achieved quantum supremacy.

    As the noise and hype surrounding quantum computers rumbles on, at CERN, the clock is ticking. The collider will wake up in just five years, ever mightier, and all that data will have to be analyzed. A non-noisy, error-corrected quantum computer will then come in quite handy.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 1:29 pm on March 8, 2019 Permalink | Reply
    Tags: CERN LHC, U.S. Large Hadron Collider Accelerator Upgrade Project

    From Brookhaven National Lab: “Large Hadron Collider Upgrade Project Leaps Forward” 

    From Brookhaven National Lab

    March 4, 2019
    Caitlyn Buongiorno

    Staff members of the Superconducting Magnet Division at Brookhaven National Laboratory next to the “top hat”— the interface between the room temperature components of the magnet test facility and the LHC high-luminosity magnet to be tested. The magnet is attached to the bottom of the top hat and tested in superfluid helium at temperatures close to absolute zero. Left to right: Joseph Muratore, Domenick Milidantri, Sebastian Dimaiuta, Raymond Ceruti, and Piyush Joshi. Credit: Brookhaven National Laboratory

    The U.S. Large Hadron Collider Accelerator Upgrade Project is the Fermilab-led collaboration of U.S. laboratories that, in partnership with CERN and a dozen other countries, is working to upgrade the Large Hadron Collider.

    LHC AUP began just over two years ago and, on Feb. 11, it received key approvals, allowing the project to transition into its next steps.

    LHC

    CERN map

    CERN LHC Tunnel

    CERN LHC particles

    U.S. Department of Energy projects undergo a series of key reviews and approvals referred to as “Critical Decisions.” Earlier this month, the AUP earned approval for both Critical Decisions 2 and 3b from DOE. CD-2 approves the performance baseline — the scope, cost and schedule — for the AUP. In order to stay on that schedule, CD-3b allows the project to receive the funds and approval necessary to purchase base materials and produce final design models of two technologies by the end of 2019.

    The LHC, a 17-mile-circumference particle accelerator on the French-Swiss border, smashes together two opposing beams of protons to produce other particles. Researchers use the particle data to understand how the universe operates at the subatomic scale.

    In its current configuration, on average, an astonishing 1 billion collisions occur every second at the LHC. The new technologies developed for the LHC will boost that number by a factor of 10. This increase in luminosity — the number of proton-proton interactions per second — means that significantly more data will be available to experiments at the LHC. It’s also the reason behind the collider’s new name, the High-Luminosity LHC.

    This “crab cavity” is designed to maximize the chance of collision between two opposing particle beams. Photo: Paolo Berrutti

    “The need to go beyond the already excellent performance of the LHC is at the basis of the scientific method,” said Giorgio Apollinari, Fermilab scientist and HL-LHC AUP project manager. “The endorsement and support received for this U.S. contribution to the HL-LHC will allow our scientists to remain at the forefront of research at the energy frontier.”

    U.S. physicists and engineers helped research and develop two technologies to make this upgrade possible. The first upgrade is to the magnets that focus the particles. The new magnets rely on niobium-tin conductors and can exert a stronger force on the particles than their predecessors. By increasing the force, the particles in each beam are driven closer together, enabling more proton-proton interactions at the collision points.

    The second upgrade is a special type of accelerator cavity. Cavities are structures inside colliders that impart energy to the particle beam and propel them forward. This special cavity, called a crab cavity, is used to increase the overlap of the two beams so that more protons have a chance of colliding.

    “This approval is a recognition of 15 years of research and development started by a U.S. research program and completed by this project,” said Giorgio Ambrosio, Fermilab scientist and HL-LHC AUP manager for magnets.

    This completed niobium-tin magnet coil will generate a maximum magnetic field of 12 tesla, roughly 50 percent more than the niobium-titanium magnets currently in the LHC. Photo: Alfred Nobrega

    Magnets help the particles go ’round

    Superconducting niobium-tin magnets have never been used in a high-energy particle accelerator like the LHC. These new magnets will generate a maximum magnetic field of 12 tesla, roughly 50 percent more than the niobium-titanium magnets currently in the LHC. For comparison, an MRI’s magnetic field ranges from 0.5 to 3 tesla, and Earth’s magnetic field is only 50 millionths of one tesla.

    There are multiple stages to creating the niobium-tin coils for the magnets, and each brings its challenges.

    Each magnet will have four sets of coils, making it a quadrupole. Together the coils conduct the electric current that produces the magnetic field of the magnet. In order to make niobium-tin capable of producing a strong magnetic field, the coils must be baked in an oven and turned into a superconductor. The major challenge with niobium-tin is that the superconducting phase is brittle: like uncooked spaghetti, a coil can snap under a small amount of pressure if it is not well supported. Therefore, the coils must be handled delicately from this point on.

    The AUP calls for 84 coils, fabricated into 21 magnets. Fermilab will manufacture 43 coils, and Brookhaven National Laboratory in New York will manufacture another 41. Those will then be delivered to Lawrence Berkeley National Laboratory to be formed into accelerator magnets. The magnets will be sent to Brookhaven to be tested before being shipped back to Fermilab. Twenty successful magnets will be inserted into 10 containers, which are then tested by Fermilab, and finally shipped to CERN.

    With CD-2/3b approval, AUP expects to have the first magnet assembled in April and tested by July. If all goes well, this magnet will be eligible for installation at CERN.

    Crab cavities for more collisions

    Cavities accelerate particles inside a collider, boosting them to higher energies. They also form the particles into bunches: As individual protons travel through the cavity, each one is accelerated or decelerated depending on whether it is below or above an expected energy. This process essentially sorts the beam into collections of protons, or particle bunches.

    HL-LHC puts a spin on the typical cavity with its crab cavities, which get their name from how the particle bunches appear to move after they’ve passed through the cavity. When a bunch exits the cavity, it appears to move sideways, similar to how a crab walks. This sideways movement is actually a result of the crab cavity rotating the particle bunches as they pass through.

    Imagine that a football was actually a particle bunch. Typically, you want to throw a football straight ahead, with the pointed end cutting through the air. The same is true for particle bunches; they normally go through a collider like a football. Now let’s say you wanted to ensure that your football and another football would collide in mid-air. Rather than throwing it straight on, you’d want to throw the football on its side to maximize the size of the target and hence the chance of collision.

    Of course, turning the bunches is harder than turning a football, as each bunch isn’t a single, rigid object.

    To make the rotation possible, the crab cavities are placed right before and after the collision points at two of the particle detectors at the LHC, called ATLAS and CMS. An alternating electric field runs through each cavity and “tilts” the particle bunch on its side. To do this, the front section of the bunch gets a “kick” to one side on the way in and, before it leaves, the rear section gets a “kick” to the opposite side. Now, the particle bunch looks like a football on its side. When the two bunches meet at the collision point, they overlap better, which makes the occurrence of a particle collision more likely.
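
    In accelerator terms, the tilt comes from a transverse kick whose strength varies along the bunch. The toy numpy sketch below (all values illustrative, not LHC parameters) applies a kick proportional to each particle’s longitudinal position and shows the bunch emerging tilted after a drift:

```python
import numpy as np

# A bunch as a cloud of particles: longitudinal position z, transverse x.
rng = np.random.default_rng(0)
n = 10_000
z = rng.normal(0.0, 1.0, n)    # bunch length (arbitrary units)
x = rng.normal(0.0, 0.1, n)    # transverse size

# Crab kick: transverse deflection proportional to longitudinal offset,
# so the head and tail of the bunch are kicked in opposite directions.
k = 0.3                        # kick strength (illustrative)
px = k * z                     # transverse momentum after the cavity

# After drifting a distance L, the bunch is tilted ("crabbed").
L = 1.0
x_after = x + L * px

tilt = np.polyfit(z, x_after, 1)[0]
print(f"bunch tilt dx/dz after drift: {tilt:.2f} (expected k*L = {k * L:.2f})")
```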

    After the collision point, more crab cavities straighten the remaining bunches, so they can travel through the rest of the LHC without causing unwanted interactions.

    With CD-2/3b approval, all raw materials necessary for construction of the cavities can be purchased. Two crab cavity prototypes are expected by the end of 2019. Once the prototypes have been certified, the project will seek further approval for the production of all cavities destined for the LHC tunnel.

    After further testing, the cavities will be sent out to be “dressed”: placed in a cooling vessel. Once the dressed cavities pass all acceptance criteria, Fermilab will ship all 10 dressed cavities to CERN.

    “It’s easy to forget that these technological advances don’t benefit just accelerator programs,” said Leonardo Ristori, Fermilab engineer and an HL-LHC AUP manager for crab cavities. “Accelerator technology existed in the first TV screens and is currently used in medical equipment like MRIs. We might not be able to predict how these technologies will appear in everyday life, but we know that these kinds of endeavors ripple across industries.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    BNL Campus

    BNL NSLS-II


    BNL NSLS II

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 1:20 pm on February 28, 2019 Permalink | Reply
    Tags: CERN LHC, Croatia to become an Associate Member of CERN

    From CERN: “Croatia to become an Associate Member of CERN” 

    Cern New Bloc

    Cern New Particle Event

    From CERN

    28 February, 2019

    Fabiola Gianotti, CERN Director-General, and Blaženka Divjak, Minister of Science and Education of the Republic of Croatia, signed an Agreement admitting Croatia as an Associate Member of CERN.

    Zagreb. Today, the Director-General of CERN[1], Fabiola Gianotti, and the Minister of Science and Education of the Republic of Croatia, Blaženka Divjak, in the presence of Croatian Prime Minister Andrej Plenković, signed an Agreement admitting Croatia as an Associate Member of CERN. The status will come into effect on the date the Director-General receives Croatia’s notification that it has completed its internal approval procedures in respect of the Agreement.

    “It is a great pleasure to welcome Croatia into the CERN family as an Associate Member. Croatian scientists have made important contributions to a large variety of experiments at CERN for almost four decades and as an Associate Member, new opportunities open up for Croatia in scientific collaboration, technological development, education and training,” said Fabiola Gianotti.

    “Croatian participation in CERN as an Associate Member is also a way to retain young and capable people in the country because they can participate in important competitive international projects, working and studying in the Croatian educational and scientific institutions that collaborate with CERN,” said Blaženka Divjak.

    Croatian scientists have been engaged in scientific work at CERN for close to 40 years. Already in the late 1970s, researchers from Croatian institutes worked on the SPS heavy-ion programme. In 1994, research groups from Split officially joined the CMS collaboration and one year later a research group from Zagreb joined the ALICE collaboration, working with Croatian industry partners to contribute to the construction of the experiments’ detectors. Scientists from Croatia have also been involved in other CERN experiments such as CAST, NA61, ISOLDE, nTOF and OPERA.

    CERN and Croatia signed a Cooperation Agreement in 2001, setting priorities for scientific and technical cooperation. This resulted in an increased number of scientists and students from Croatia participating in CERN’s programmes, including the CERN Summer Student Programme. In May 2014, Croatia applied for Associate Membership.

    As an Associate Member, Croatia will be entitled to participate in the CERN Council, Finance Committee and Scientific Policy Committee. Nationals of Croatia will be eligible for staff positions and Croatia’s industry will be able to bid for CERN contracts, opening up opportunities for industrial collaboration in advanced technologies.

    Footnote(s)

    1. CERN, the European Organization for Nuclear Research, is one of the world’s leading laboratories for particle physics. The Organization is located on the French-Swiss border, with its headquarters in Geneva. Its Member States are: Austria, Belgium, Bulgaria, Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Israel, Italy, Netherlands, Norway, Poland, Portugal, Romania, Slovakia, Spain, Sweden, Switzerland and United Kingdom. Cyprus, Serbia and Slovenia are Associate Member States in the pre-stage to Membership. India, Lithuania, Pakistan, Turkey and Ukraine are Associate Member States. The European Union, Japan, JINR, the Russian Federation, UNESCO and the United States of America currently have Observer status.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

     
  • richardmitnick 12:00 pm on February 28, 2019 Permalink | Reply
    Tags: "First ATLAS result with full Run 2 dataset: a search for new heavy particles", , , CERN LHC, , , ,   

    From CERN ATLAS: “First ATLAS result with full Run 2 dataset: a search for new heavy particles” 

    CERN/ATLAS detector

    CERN ATLAS Higgs Event

    CERN/ATLAS

    27th February 2019
    ATLAS Collaboration

    Figure 1: Measured dielectron mass distribution for the data (black points), together with the total background fit result (red continuous line) and various possible Z’ signal distributions overlaid (dashed red line). The sub-panel shows the significance of the deviation between the observed data and the background prediction in each bin of the distribution. (Image: ATLAS Collaboration/CERN)

    Could a Grand Unified Theory resolve the remaining mysteries of the Standard Model?

    Standard Model of Particle Physics


    Standard Model of Particle Physics from Symmetry Magazine

    If verified, a Grand Unified Theory would provide an elegant description of the unification of Standard Model forces at very high energies, and might even explain the existence of dark matter and neutrino masses. ATLAS physicists are searching for evidence of new heavy particles predicted by such theories, including a neutral Z’ gauge boson.

    The ATLAS collaboration has today released its very first result utilising its entire LHC Run 2 dataset, collected between 2015 and 2018. This analysis searches for new heavy particles decaying into dilepton final states, where the leptons are either two electrons or two muons. This is one of the most sensitive channels in which to search for new physics, thanks to the ATLAS detector’s excellent energy and momentum resolution for leptons and the strong signal-to-background differentiation afforded by the simple two-lepton signature.

    The new ATLAS result also employs a novel data-driven approach for estimating the Standard Model background. While the previous analysis predominantly used simulations for the background prediction and was carried out with a fraction of the data, this new analysis takes advantage of the vast Run 2 dataset by fitting the observed data with a functional form motivated by and validated with our understanding of the Standard Model processes contributing to these events. If present, the new particles would appear as bumps on top of a smoothly falling background shape, making them straightforward to identify (see Figure 1). This is similar to one of the ways that the Higgs boson was discovered in 2012, through its decay to two photons.
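
    The bump-hunt logic is straightforward to prototype. The Python toy below fits a smoothly falling shape to a simulated mass spectrum and scans for a localized excess; the power-law form, binning and injected signal are invented for illustration and are not the ATLAS analysis function:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy dilepton mass spectrum: a smoothly falling background with a small
# injected "Z prime" bump at 1200 GeV. All numbers are illustrative.
edges = np.linspace(200.0, 2000.0, 91)
centers = 0.5 * (edges[:-1] + edges[1:])
bkg_expect = 1e12 * centers ** -2.5
sig_expect = 500.0 * np.exp(-0.5 * ((centers - 1200.0) / 30.0) ** 2)
data = rng.poisson(bkg_expect + sig_expect)

# Data-driven background estimate: fit a power law, log n = a + b log m,
# to the observed spectrum itself rather than to simulation.
a, b = np.polynomial.polynomial.polyfit(np.log(centers), np.log(data), 1)
background_fit = np.exp(a) * centers ** b

# A new particle would show up as bins sitting well above the fit.
significance = (data - background_fit) / np.sqrt(background_fit)
peak = np.argmax(significance)
print(f"largest excess: {significance[peak]:.1f} sigma at ~{centers[peak]:.0f} GeV")
```

    In this toy the injected bump stands out at a few sigma above the fitted curve; in the real analysis, as described below, no significant excess was found.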

    In addition to probing unexplored territory in the search for new physics, a great deal of work in this analysis has gone into understanding the ATLAS detector and collaborating with the various detector performance groups to improve the identification of very high-energy electrons and muons. This included accounting for the multiplicity of tracks in the inner part of the detector, as it continuously increased due to the rising average number of proton-proton collisions per bunch crossing during Run 2.

    No significant sign of new physics has been observed thus far. The result sets stringent constraints on the production rate of various types of hypothetical Z’ particles. As well as setting exclusion limits on specific theoretical models, the result has also been provided in a generic format that allows physicists to re-interpret the data under different theoretical assumptions. This study has deepened the exploration of physics at the energy frontier; ATLAS physicists are excited about further analysing the large Run 2 dataset.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    CERN map


    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN


    CERN Courier

    QuantumDiaries


    Quantum Diaries

     
  • richardmitnick 3:26 pm on February 26, 2019 Permalink | Reply
    Tags: "What’s in store for the CMS detector over the next two years?", , , CERN LHC, , , ,   

    From CERN CMS: “What’s in store for the CMS detector over the next two years?” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    From CERN CMS

    26 February, 2019
    Letizia Diamante

    CERN/CMS Detector

    A jewel of particle physics, the CMS experiment is a 14 000-tonne detector that aims to answer a wide range of questions about the mysteries surrounding the Higgs boson and dark matter.

    CERN CMS Higgs Event

    Now that the Large Hadron Collider (LHC) beam has been switched off for a two-year technical stop, Long Shutdown 2 (LS2), CMS is preparing for significant maintenance work and upgrades.

    This diagram of the CMS detector shows some of the maintenance and upgrades in store over the coming years.

    All the LHC experiments at CERN want to exploit the full benefits of the accelerator’s upgrade, the High-Luminosity LHC (HL-LHC), scheduled to start in 2026.

    The HL-LHC will produce between five and ten times more collisions than the LHC, allowing physicists to take more precise measurements of rare phenomena predicted by the Standard Model, and maybe even to detect new particles that have never been seen before. To take advantage of this, some of CMS’s components need to be replaced.

    Standard Model of Particle Physics

    Standard Model of Particle Physics from Symmetry Magazine

    In the heart of CMS

    Hidden inside several layers of subdetectors, the pixel detector surrounding the beam pipe is the core of the experiment, as it is the closest to the particle-collision point. During LS2, the innermost layer of the present pixel detector will be replaced, using more high-luminosity-tolerant and radiation-tolerant components. The beam pipe will also be replaced in LS2, with one that will allow the extremities of the future pixel detectors to get even closer to the interaction point. This third-generation pixel detector will be installed during the third long shutdown (LS3) in 2024–2026.

    CMS core removal during the Long Shutdown 2 (LS2) (Image: Maximilien Brice/Julien Ordan/CERN)

    Without missing a thing

    Beyond the core, the CMS collaboration is also planning to work on the outermost part of the detector, which detects and measures muons – particles similar to electrons, but much heavier. They are preparing to install 40 large Gas Electron Multiplier (GEM) chambers to measure muons that scatter at an angle of around 10° – one of the most challenging angles for the detector to deal with. Invented in 1997 by Fabio Sauli, GEM chambers are already used in other CERN experiments, including COMPASS, TOTEM and LHCb, but the scale of CMS is far greater than the other detectors. The GEM chambers consist of a thin, metal-clad polymer foil, chemically pierced with millions of holes, typically 50 to 100 per millimetre, submerged in a gas. As muons pass through, electrons released by the gas drift into the holes, multiply in a very strong electric field and transfer to a collection region.

    Fast-forward to the future

    Some of the existing detectors would not perform well enough during the HL-LHC phase, as the number of proton–proton collisions produced in the HL-LHC will be ten times higher than that originally planned for the CMS experiment. Therefore, the high-granularity calorimeter (HGCAL) will replace the existing endcap electromagnetic and hadronic calorimeters during LS3, between 2024 and 2026. The new detector will comprise over 1000 m² of hexagonal silicon sensors and plastic scintillator tiles, distributed over 100 layers (50 in each endcap), providing unprecedented information about electrons, photons and hadrons. Exploiting this detector is a major challenge for software and analysis, and physicists and computer science experts are already working on advanced techniques, such as machine learning.

    Ongoing tests on the modules of the high-granularity calorimeter (HGCAL). Intense R&D is planned for LS2 to ensure that the new detector will be ready for installation during LS3. (Image: Maximilien Brice/CERN)

    Building, building, building

    CMS has also been involved with the HL-LHC civil-engineering work, which kick-started in June 2018 and is ongoing. The project includes five new buildings on the surface at Cessy, France, as well as modifications to the underground cavern and galleries.

    CMS’s ambitious plan for the near and longer-term future is preparing the detector for more exciting undertakings. Stay tuned for more.

    Read more in “CMS has high luminosity in sight” in the latest CERN Courier, as well as LS2 highlights from ALICE, ATLAS and LHCb.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    CMS
    CERN CMS New

     
  • richardmitnick 2:46 pm on February 26, 2019 Permalink | Reply
    Tags: "LHCb catches fast-spinning charmonium particle", , CERN LHC, , , , ,   

    From CERN LHCb: “LHCb catches fast-spinning charmonium particle” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    From CERN LHCb

    26 February, 2019
    Ana Lopes

    The LHCb collaboration has spotted a new particle. Its mass and other properties place it squarely in the charmonium family that includes the better-known J/ψ particle, which was the first particle containing a “charm quark” to be discovered and won its discoverers a Nobel prize in physics.

    The Nobel Prize in Physics 1976

    Burton Richter. Prize share: 1/2

    Samuel Chao Chung Ting. Prize share: 1/2

    Future studies of the properties of this new charmonium state and its relatives will help physicists better understand the strong force that binds together quarks, among the smallest particles that we know of.

    Charmonium particles are two-quark particles (called mesons) composed of a charm quark and its antimatter counterpart, the charm antiquark. Charm quarks are the third most massive of six quark types. Just like atoms, mesons can be observed in excited states of higher energy, in which the mesons’ constituent quarks move around each other in different configurations. These different arrangements give rise to a gamut of particles with different masses and quantum properties such as spin, which can be thought of as the rotation of a system around its axis.

    Observing such excited states and measuring their properties provides a way of testing models of quantum chromodynamics (QCD), the theory that describes how quarks are stuck together into composite particles. What’s more, knowledge of the full collection of these states helps identify exotic states with more than three quarks, such as tetraquarks, which are also predicted by QCD but have only recently been discovered.

    Tetraquarks-School of Physics and Astronomy – The University of Edinburgh

    If all of the excited states are accounted for, physicists can be more confident that any remaining ones are exotic.

    To catch the new charmonium particle, the LHCb collaboration, which runs one of the four main experiments at the Large Hadron Collider, studied the decays of charmonium states produced in proton–proton collisions into pairs of D mesons, using data recorded between 2011 and 2018; D mesons are the lightest particles containing charm quarks. The collaboration measured the masses of the D-meson pairs and then added up how many times it recorded each mass value within the measured range. It then looked for an excess of events, or bump, in this mass distribution, and found a new, narrow peak at a mass that corresponds to a previously unobserved charmonium state dubbed the ψ3(1D). The particle has a spin value of 3, making this the first observation of a spin-3 charmonium state. The high spin value could account for the peak’s narrow width and the fact that it has taken so long to find.

    For more information, check the LHCb website.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    LHCb
    CERN LHCb New II

     
  • richardmitnick 12:04 pm on February 25, 2019 Permalink | Reply
    Tags: "LS2 report: a technological leap for SPS acceleration", , , CERN LHC, CERN LS2, , , , , The SPS radiofrequency acceleration system is being enhanced with a new technology: solid-state amplifiers   

    From CERN: “LS2 report: a technological leap for SPS acceleration” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    From CERN

    25 February, 2019
    Corinne Pralavorio

    The SPS radiofrequency acceleration system is being enhanced with a new technology: solid-state amplifiers.

    The new solid-state amplifier system developed by CERN with the Thales Gérac company comprises 32 towers, in which 2560 RF modules, each containing four transistors, will be installed. (Image: Maximilien Brice/CERN)

    Big changes are under way at the Super Proton Synchrotron (SPS). One of the major operations is the upgrade of the machine’s acceleration system. “The beams in the High-Luminosity LHC will be twice as intense, which requires an increase in radiofrequency power,” explains Erk Jensen, leader of the Radiofrequency (BE-RF) group. One aspect of the LHC Injectors Upgrade (LIU) project is therefore bringing the SPS acceleration system up to standard.

    Erk Jensen shows us around the huge Building 870, just behind the CERN Control Centre on the Prévessin site, which is a hive of activity. Everywhere you look, teams are pulling out cables, unscrewing components and removing electronic modules. Dismantling is one of the main activities of this first phase of the Long Shutdown. No fewer than 400 km of cables are being removed at Points 3 and 5 of the SPS, for example.

    In the large halls, we can see the huge power converter and amplifier installations that supply the radiofrequency (RF) accelerator cavities of the SPS. The amplifiers use an electronic tube technology dating back to the 1970s and 80s, as the SPS was commissioned in 1976 and transformed into a proton-antiproton collider in 1981. Two tube systems exist alongside each other, each producing 2 megawatts of power.

    To supply the power needed for the High-Luminosity LHC, a team from the RF group, headed by Eric Montesinos, working with the firm Thales Gérac, has developed a new system that uses solid-state amplifiers, similar to those that were recently developed for the SOLEIL and ESRF synchrotrons. The transistors for these amplifiers are assembled in sets of four on modules that supply 2 kilowatts, much less power than was delivered by the electronic tubes (between 35 and 135 kilowatts). But a total of 2560 modules, i.e. 10 240 transistors, will be spread across 32 towers. The power from 16 towers will be combined via an RF power combiner. The whole system will be able to provide RF power of two times 1.6 megawatts to the cavities.
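
    The quoted figures can be sanity-checked in a few lines of Python (the gap between the 2.56 MW raw sum and the 1.6 MW delivered is my inference about combiner losses and operating headroom, not a number from the article):

```python
# Back-of-the-envelope check of the amplifier numbers quoted above.
modules_total = 2560
transistors_per_module = 4
module_power_kw = 2.0
towers = 32

print(modules_total * transistors_per_module)              # 10,240 transistors
per_tower_kw = (modules_total / towers) * module_power_kw
print(per_tower_kw)                                         # 160.0 kW per tower
# 16 towers feed each RF combiner: 2.56 MW raw per combiner, quoted as
# 1.6 MW of usable RF power to the cavities (twice over, for two systems).
print(16 * per_tower_kw / 1000)                              # 2.56 MW
```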

    “This system is much more flexible, since the power is distributed across thousands of transistors,” observes Eric Montesinos. “If a few transistors stop working, the RF system will not stop completely, whereas if one of the tubes failed, we had to intervene quickly.” In addition, it’s much easier to change a module, especially since electronic tubes in this frequency range are an endangered species, accelerators being among the last applications of the technology.

    SPS 200 MHz accelerating RF cavities removed from their tunnel to be upgraded during LS2. (Image: Maximilien Brice/CERN)

    Development of the solid-state amplifier system began in 2016. A team from the RF group worked in collaboration with scientists from Thales Gérac, and many tests and adjustments had to be carried out. Power electronics are subject to significant thermomechanical effects, so the technique for fitting the transistors onto the plate of the module, to take one example, turned out to be a particularly tricky aspect to get right. After several dozen complex prototypes had been produced, the work finally came to a successful conclusion last year: the first tower housing 80 transistor modules operated for 1000 hours, passing the validation tests in August. This was a great success that allowed series production to begin while the tests continued.

    The structures, i.e. the 32 towers, have already been installed in a new room, giving it the air of a science-fiction movie set. Only one of them so far is equipped with its RF power modules, offering a taste of the even more futuristic look that the room will have in a few months’ time. The modules will be delivered as of May, continuing through to the end of the year; all of them will be tested on a specially designed test bench before being installed in the towers. Some painstaking work faces the teams that will install all the modules.

    In parallel, the cavities have been removed from the tunnel. The SPS has four 200 MHz cavities: two formed of four sections, and two of five sections, each section measuring four metres. “To accelerate more intense beams, we need to reduce the length of the cavities in order to maintain a sufficiently strong electromagnetic field along their whole length,” explains Erk Jensen. The teams will therefore reassemble the sections in order to form a total of six cavities: two of four sections and four of three sections.
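    The reshuffle is easy to tally from the numbers quoted (a quick check in Python; note that the new layout counts more four-metre sections than the old one contained, so some additional sections are evidently required -- the article does not say where they come from):

    # SPS 200 MHz cavity reconfiguration; every section is 4 m long.
    old = {4: 2, 5: 2}   # two cavities of four sections, two of five
    new = {4: 2, 3: 4}   # two cavities of four sections, four of three

    old_sections = sum(n * count for n, count in old.items())  # 18
    new_sections = sum(n * count for n, count in new.items())  # 20

    print(f"sections before: {old_sections}, after: {new_sections}")
    print(f"longest cavity shrinks from {max(old) * 4} m to {max(new) * 4} m")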

    At the same time, the beam control system is being replaced. The Faraday cage, which houses the electronic racks for the beam control system, has been completely emptied, ready to be fitted with the latest electronics and new infrastructure (lighting, cooling and ventilation systems, among others). Finally, an improved system for eliminating parasitic resonances will be installed, based on HOM (higher order mode) couplers, which were tested during the last run.

    The teams must stick to a tight schedule, comprising all the dismantling work, the start of installation later in 2019, and numerous tests and commissioning tasks in 2020.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

     
  • richardmitnick 4:37 pm on February 19, 2019 Permalink | Reply
    Tags: , , CERN LHC, Looking for Dark Energy at CERN,   

    From Symmetry: “Taking a collider to the dark energy problem” 

    Symmetry Mag
    From Symmetry

    02/14/19
    Sarah Charley

    Ralf Kaehler, based on a simulation by John Wise and Tom Abel

    Every second, the universe grows a little bigger. Scientists are using the LHC to try to find out why.

    With the warmth of holiday cheer in the air, Nottingham University theoretical physicist Clare Burrage and her colleagues decided to hit the pub after a conference in December 2014 and do what many physicists tend to do after work: keep talking about physics.

    That evening’s topic of conversation: dark energy particles. The chat would lead to a new line of investigation at the Large Hadron Collider at CERN.

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    Dark energy is a catch-all term that scientists coined to describe whatever seems to be pushing the bounds of the universe farther and farther apart. If gravity were the only force choreographing the interstellar ballet of stars and galaxies, then—after the initial grand jeté of all of the matter and energy in the universe during the big bang—every celestial body would slowly chassé back to a central point. But that’s not what’s happening. Instead, the universe continues to drift apart—and it’s happening at an accelerating rate.

    “We really don’t know what’s going on,” says Burrage. “At the moment, there are problems with all of our possible solutions to this problem.”

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M Blanco 4m Telescope at Cerro Tololo, Chile, which houses DECam at an altitude of 7200 feet

    Most experiments studying this mysterious cosmic expansion look at intergalactic movements and precision measurements of the effects of gravity. Dark energy could be a property of spacetime itself, or just a huge misunderstanding of how gravity works on a cosmic scale.

    But many theorists suspect that dark energy is a new type of force or field—something that changes how gravity works. And if this is true, then scientists might be able to put just the right amount of energy into that field to pop out a particle, a particle that could potentially show up in a detector at the LHC. This is the way scientists discovered the Higgs field, by interacting with it in just the right way for it to produce a Higgs boson.
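    The “right amount of energy” is fixed, at minimum, by the particle’s rest mass through E = mc². A quick unit conversion in Python (the Higgs mass is the measured value of about 125 GeV; the comparison is purely illustrative, since the mass of any dark energy particle is unknown):

    # Minimum energy to "pop a particle out of a field": its rest
    # energy, E = m c^2. Particle physicists quote masses directly in
    # energy units (electronvolts), so converting to joules only needs
    # the elementary charge.
    e_charge = 1.602176634e-19      # joules per electronvolt

    higgs_mass_gev = 125.0          # measured Higgs boson mass, ~125 GeV
    higgs_rest_energy_j = higgs_mass_gev * 1e9 * e_charge

    print(f"Higgs rest energy: {higgs_rest_energy_j:.2e} J")
    # The theory discussed here predicts *light* dark energy particles,
    # so far less energy would be needed to produce one -- the challenge
    # is spotting it, not making it.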

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    “Cosmologists know that there is new physics we don’t understand, and all the evidence is pointing towards something very fundamental about our universe,” Burrage says. “The experiments on the LHC are also very interested in the fundamentals.”

    The ATLAS and CMS experiments, the big general-purpose experiments at the LHC, search for new fundamental forces and properties of nature by recording what happens when the LHC smashes together protons at just under the speed of light.

    CERN/ATLAS detector


    The giant detectors surround the collision points and map the energy and matter released from the collisions, giving scientists a unique view of the clandestine threads that weave together to build everything in the universe.

    The theory Burrage and her colleagues were poring over at the pub predicted that if dark energy is a new type of field, it might produce light particles with strong and specific interactions with matter. “The main focus of LHC has been heavy particles, so we had to go back and re-interpret the data to look for something light,” she says.

    Burrage worked with Philippe Brax of Université Paris-Saclay and Christophe Englert of the University of Glasgow to check publicly available data from the first run of the world’s most powerful collider for signs of lightweight dark energy particles. They quickly determined that the signs they were looking for had not appeared.

    With this simple model easily eliminated, they decided to take on another idea with a more cryptic signature. They knew that more complex analyses would require the expertise of an experimentalist. So in April 2016, along with Michael Spannowsky of Durham University in the UK, they published a new hypothesis in the scientific journal Physical Review Letters—and waited.

    They found their experimentalist in Spyros Argyropoulos, a postdoc at the University of Iowa working on the ATLAS experiment, who read their article.

    “The idea of testing dark energy was intriguing,” Argyropoulos says. “It’s not something we typically look for at the LHC, and making progress on this problem is a win-win for both cosmologists and particle physicists.”

    Argyropoulos reached out to Burrage and her colleagues to define the parameters, and then he and a group of ATLAS scientists went to the data.

    According to this new theory, dark energy particles should radiate off of energetic top quarks and show up in the detector as missing energy. Argyropoulos and his colleagues went through ATLAS analyses of top quarks and, in a separate search, looked at certain other collisions to see if any of them showed the signatures they were looking for. They did not.
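    The telltale signature, missing energy, is really missing transverse momentum: the collision products must balance in the plane perpendicular to the beams, so an invisible particle shows up as an imbalance among the visible ones. A sketch of that bookkeeping in Python (illustrative only, with invented event contents -- not ATLAS analysis code):

    import math

    # Transverse momentum components (GeV) of the visible objects
    # reconstructed in one hypothetical collision event.
    visible = [
        {"px": 120.0, "py": -35.0},   # e.g. a jet from a top quark
        {"px": -40.0, "py": 80.0},    # another jet
        {"px": -25.0, "py": -10.0},   # a lepton
    ]

    # Momentum must balance in the transverse plane, so whatever offsets
    # the visible sum is attributed to invisible particles (neutrinos,
    # or -- in this search -- dark energy quanta radiated off top quarks).
    sum_px = sum(o["px"] for o in visible)
    sum_py = sum(o["py"] for o in visible)
    met = math.hypot(sum_px, sum_py)

    print(f"missing transverse momentum: {met:.1f} GeV")
    # An excess of high-MET events over the Standard Model expectation
    # would have been the hint the ATLAS team looked for; none appeared.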

    While this might seem like a disappointing result, Argyropoulos assures that it’s anything but. “Physics isn’t just about finding the right answer,” he says. “It’s also about narrowing down all the possibilities.”

    Burrage agrees: “Eliminating an idea with experimental data is a positive thing, even if it means our pet theory gets killed in the process. Theorists can always come up with more ideas, and it’s good for the field to have the spectrum of possibilities narrowed down.”

    The landscape of dark energy theories is enormous. Burrage’s specialty is scouring that landscape, searching for theories that can be tested, and then proposing ways to test them.

    “Ten years ago, nobody was thinking that collider physics could put constraints on dark energy searches,” she says. “Theories have to pass all relevant experimental tests, and it’s looking like surviving the Large Hadron Collider is going to be an important one to our field.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 7:09 pm on February 13, 2019 Permalink | Reply
    Tags: A key link in CERN’s accelerator complex, A new set of quadrupole magnets will be installed along the Booster-to-PS injection line, , Also delivering particles to several experimental areas such as the Antiproton Decelerator (AD), , CERN LHC, CERN Proton Synchrotron, , It takes ten hours to extract one magnet, , Mainly accelerating protons to 26 GeV before sending them to the Super Proton Synchrotron (SPS), New cooling systems are being installed to increase the cooling capacity of the PS, One major component of the PS that will be consolidated is the magnet system, One of the elements known as the pole-face windings which is located between the beam pipe and the magnet yoke needs replacing, , PS will undergo a major overhaul to prepare it for the higher injection and beam intensities of the LHC’s Run 3 as well as for the High-Luminosity LHC   

    From CERN- “LS2 report: The Proton Synchrotron’s magnets prepare for higher energies” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    From CERN

    13 February, 2019
    Achintya Rao

    CERN Proton Synchrotron

    One of the magnets being driven on a locomotive to the workshop (right) after being extracted from the PS itself (left) (Image: Julien Marius Ordan/Maximilien Brice/CERN)

    The Proton Synchrotron (PS), which was CERN’s first synchrotron and which turns 60 this year, once held the record for the particle accelerator with the highest energy. Today, it forms a key link in CERN’s accelerator complex, mainly accelerating protons to 26 GeV before sending them to the Super Proton Synchrotron (SPS), but also delivering particles to several experimental areas such as the Antiproton Decelerator (AD). Over the course of Long Shutdown 2 (LS2), the PS will undergo a major overhaul to prepare it for the higher injection and beam intensities of the LHC’s Run 3 as well as for the High-Luminosity LHC.

    One major component of the PS that will be consolidated is the magnet system [Many magnets will come from Fermilab and Brookhaven Lab, two US D.O.E. labs]. The synchrotron has a total of 100 main magnets within it (plus one reference magnet unit outside the ring), which bend and focus the particle beams as they whizz around it gaining energy. “During the last long shutdown (LS1) and at the beginning of LS2, the TE-MSC team performed various tests to identify weak points in the magnets,” explains Fernando Pedrosa, who is coordinating the LS2 work on the PS. The team identified 50 magnets needing refurbishment, of which seven were repaired during LS1 itself. “The remaining 43 magnets that need attention will be refurbished this year.”

    Specifically, one of the elements, known as the pole-face windings, which is located between the beam pipe and the magnet yoke, needs replacing. In order to reach into the magnet innards to replace these elements, the magnet units have to be transferred to a workshop in Building 151. Once disconnected, each magnet is placed onto a small locomotive system that drives it to the workshop. The locomotives themselves are over 50 years old, and their movement must be delicately managed. It takes ten hours to extract one magnet. So far, six magnets have been taken to the workshop, and this work will last until 18 October 2019.
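    The extraction time sets the pace of this campaign. A rough schedule check (simple arithmetic in Python; the one-extraction-per-working-day rate is my assumption, not a CERN figure):

    # Rough timeline for the remaining PS magnet refurbishment,
    # using only the figures quoted in the article.
    magnets_to_treat = 43
    already_at_workshop = 6
    hours_per_extraction = 10

    remaining = magnets_to_treat - already_at_workshop
    extraction_hours = remaining * hours_per_extraction
    print(f"{remaining} magnets left, {extraction_hours} hours of extraction ahead")
    # At one extraction per working day (an assumption, not a CERN figure),
    # that is roughly 37 working days of extraction alone -- consistent
    # with a campaign scheduled to run until 18 October 2019.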

    The workshop where the magnets are being treated is divided into two sections. In the first room, the vacuum chamber of the magnets is cut so as to access the pole-face windings. The magnet units are then taken to the second room, where prefabricated replacements are installed.

    As mentioned in the previous LS2 Report, the PS Booster will see an increase in the energy it imparts to accelerating protons, from 1.4 GeV to 2 GeV. A new set of quadrupole magnets will be installed along the Booster-to-PS injection line, to increase the focusing strength required for the higher-energy beams. Higher-energy beams require higher-energy injection elements; therefore some elements will be replaced in the PS injection region as part of the LHC Injectors Upgrade (LIU) project, namely septum 42, kicker 45 and five bumper magnets.

    Other improvements as part of the LIU project include the new cooling systems being installed to increase the cooling capacity of the PS. A new cooling station is being built at Building 355, while one cooling tower in Building 255 is being upgraded. The TT2 line, which is involved in the transfer from the PS to the SPS, will have its cooling system decoupled from the Booster’s, to allow the PS to operate independently of the Booster schedule. “The internal dumps of the PS, which are used in case the beam needs to be stopped, are also being changed, as are some other intercepting devices,” explains Pedrosa.

    “The LS2 operations are on a tight schedule,” notes Pedrosa, pointing out that work being performed on several interconnected systems constrains what can be done concurrently. As LS2 proceeds, we will bring you more news about the PS, including the installation of new instrumentation in wire scanners that help with beam-size measurement, an upgraded transverse-feedback system to stabilise the beam, and more.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New
    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Grand Tunnel

    CERN LHC particles

     
  • richardmitnick 3:51 pm on February 5, 2019 Permalink | Reply
    Tags: , , CERN LHC, , , , Particle Physics Is Doing Just Fine, ,   

    From slate.com: “Particle Physics Is Doing Just Fine” 

    SLATE

    From slate.com

    Jan 31, 2019
    Chanda Prescod-Weinstein
    Tim M.P. Tait


    CERN/ALICE Detector

    Research is a search through the unknown. If you knew the answer, there would be no need to do the research, and until you do the research, you don’t know the answer. Science is a complex social phenomenon, but certainly its history includes repeated episodes of people having ideas, trying experiments to test those ideas, and using the results to inform the next round of ideas. When an experimental result indicates that one particular idea is not correct, this is neither a failure of the experiment nor of the original idea itself; it’s an advancement of our understanding of the world around us.

    Recently, particle physics has become the target of a strange line of scientific criticism. Articles like Sabine Hossenfelder’s New York Times op-ed questioning the “uncertain future” of particle physics and Vox’s “The $22 Billion Gamble: Why Some Physicists Aren’t Excited About Building a Bigger Particle Collider” raise the specter of failed scientists. To read these articles, you’d think that unless particle physics comes home with a golden ticket in the form of a new particle, it shouldn’t come home at all. Or at least, it shouldn’t get a new shot at exploring the universe’s subatomic terrain. But the proposal that particle physicists are essentially setting money on fire comes with an insidious underlying message: that science is about the glory of discovery, rather than the joy of learning about the world. Finding out that there are no particles where we had hoped tells us about the distance between human imagination and the real world. It can operate as a motivation to expand our vision of what the real world is like at scales that are totally unintuitive. Not finding something is just as informative as finding something.

    That’s not to say resources should be infinite or to suggest that community consensus isn’t important. To the contrary, the particle physics community, like the astronomy and planetary science communities, takes the conversation about what our priorities should be so seriously that we have it every half decade or so. Right now, the European particle physics community is in the middle of a “strategy update,” and plans are underway for the U.S. particle physics community to hold the next of its “Snowmass community studies,” which take place approximately every five years. These events are opportunities to take stock of recent developments and to devise a strategy to maximize scientific progress in the field. In fact, we’d wager that they’re exactly what Hossenfelder is asking for when she suggests “it’s time for particle physicists to step back and reflect on the state of the field.”

    One of the interesting questions that both of these studies will confront is whether or not the field should prioritize construction of a new high-energy particle accelerator. In past decades, many resources have been directed toward the construction and operation of the Large Hadron Collider, a gigantic device whose tunnel spans two countries and whose budget is in the billions of dollars. Given funding constraints, it is entirely appropriate to ask whether it makes sense to prioritize a future particle accelerator at this moment in history. A new collider is likely to have a price tag measured in tens of billions of dollars and would represent a large investment—though not large compared with the scale of other areas of government spending, and the collider looks even less expensive when spread out over decades and shared by many nations.

    The LHC was designed to reach energies of 14 trillion electron volts, about seven times more than its predecessor, the Tevatron at Fermilab in Chicagoland.

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    FNAL/Tevatron map

    FNAL/Tevatron

    There was very strong motivation to explore collisions at these energies; up until the LHC began operations, our understanding of the Standard Model of particle physics, the leading theory describing subatomic particles and their interactions, contained a gaping hole.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    The theory could only consistently describe the massive fundamental particles that are observed in our experiments if one included the Higgs boson—a particle that had yet to be observed.

    Self-consistency demanded that either the Higgs or something else providing masses would appear at the energies studied by the LHC. There were a host of competing theories, and only experimental data could hope to judge which one was realized in nature.

    So we tried it. And because the LHC allowed us to actually observe the Higgs, we now know that the picture in which masses arise from the Higgs is either correct or very close to being correct.

    Peter Higgs

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    The LHC discovered a particle whose interactions with the known particles match the predictions to within about 10 percent. This represents a triumph in our understanding of the fundamental building blocks of nature, one that would have been impossible without both 1) the theoretical projections that defined the characteristics that the Higgs must have to play its role and 2) the experimental design of the accelerator and particle detectors and the analysis of the data that they collected. In order to learn nature’s secrets, theory and experiment must come together.

    [i.e., you must do the math.]

    Some people have labeled the LHC a failure because even though it confirmed the Standard Model’s vision for how particles get their masses, it did not offer any concrete hint of any further new particles besides the Higgs. We understand the disappointment. Given the exciting new possibilities opened up by exploring energy levels we’ve never been privy to here on earth, this feeling is easy to relate to. But it is also selling the accomplishments short and fails to appreciate how research works. Theorists come up with fantastical ideas about what could be. Most of them are wrong, because the laws of physics are unchanging and universal. Experimentalists are taking on the task of actually popping open the hood and looking at what’s underneath it all. Sometimes, they may not find anything new.

    A curious species, we are left to ask more questions. Why did we find this and not that? What should we look for next? What a strange and fascinating universe we live in, and how wonderful to have the opportunity to learn about it.

    It cannot be ignored that if the U.S. had built the Superconducting Super Collider, a particle accelerator complex that was under construction in the vicinity of Waxahachie, Texas, the Higgs would have been found in the U.S., and high-energy physics would not have been ceded to Europe.

    Tracing the path of the particle accelerators and tunnels planned for the Superconducting Supercollider Project. You can see the main ring circling Waxahachie.

    The Superconducting Super Collider’s planned ring circumference was 87.1 kilometres (54.1 mi), with an energy of 20 TeV per proton; it was set to be the world’s largest and most energetic collider, greatly surpassing the current record held by the Large Hadron Collider, which has a ring circumference of 27 km (17 mi) and an energy of 13 TeV per proton. The project’s director was Roy Schwitters, a physicist at the University of Texas at Austin. Dr. Louis Ianniello served as its first Project Director for 15 months. The project was cancelled in 1993 due to budget problems [Congress cancelled the Collider for having “no immediate economic value”].

    See the full article here.
    See also the possible future of HEP here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Slate is a daily magazine on the Web. Founded in 1996, we are a general-interest publication offering analysis and commentary about politics, news, business, technology, and culture. Slate’s strong editorial voice and witty take on current events have been recognized with numerous awards, including the National Magazine Award for General Excellence Online. The site, which is owned by Graham Holdings Company, does not charge for access and is supported by advertising revenues.

     