Tagged: CERN LHC

  • richardmitnick 9:00 pm on July 3, 2017 Permalink | Reply
    Tags: CERN LHC, Joe Incandela, What comes next?

    From Symmetry: “When was the Higgs actually discovered?” 

    Symmetry Mag

    Symmetry

    07/03/17
    Sarah Charley

    The announcement on July 4 was just one part of the story. Take a peek behind the scenes of the discovery of the Higgs boson.

    Maximilien Brice, Laurent Egli, CERN

    Joe Incandela, UCSB and CERN CMS

    Joe Incandela sat in a conference room at CERN and watched with his arms folded as his colleagues presented the latest results on the hunt for the Higgs boson. It was December 2011, and they had begun to see the very thing they were looking for—an unexplained bump emerging from the data.

    “I was far from convinced,” says Incandela, a professor at the University of California, Santa Barbara and the former spokesperson of the CMS experiment at the Large Hadron Collider.

    CERN CMS Higgs Event

    CERN/CMS Detector

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    For decades, scientists had searched for the elusive Higgs boson: the holy grail of modern physics and the only piece of the robust and time-tested Standard Model that had yet to be found.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    The construction of the LHC was motivated in large part by the absence of this fundamental component from our picture of the universe. Without it, physicists couldn’t explain the origin of mass or the divergent strengths of the fundamental forces.

    “Without the Higgs boson, the Standard Model falls apart,” says Matthew McCullough, a theorist at CERN. “The Standard Model was fitting the experimental data so well that most of the theory community was convinced that something playing the role of the Higgs boson would be discovered by the LHC.”

    The Standard Model predicted the existence of the Higgs but did not predict what the particle’s mass would be. Over the years, scientists had searched for it across a wide range of possible masses. By 2011, there was only a tiny region left to search; everything else had been excluded by previous generations of experimentation.

    Research with FNAL’s Tevatron had already ruled out many of the possible masses that could have been the home of the Higgs.

    FNAL Tevatron

    FNAL/Tevatron map


    FNAL/Tevatron DZero detector


    FNAL/Tevatron CDF detector

    If the predicted Higgs boson were anywhere, it had to be there, right where the LHC scientists were looking.

    But Incandela says he was skeptical about these preliminary results. He knew that the Higgs could manifest itself in many different forms, and this particular channel was extremely delicate.

    “A tiny mistake or an unfortunate distribution of the background events could make it look like a new particle is emerging from the data when in reality, it’s nothing,” Incandela says.

    A common mantra in science is that extraordinary claims require extraordinary evidence. The challenge isn’t just collecting the data and performing the analysis; it’s deciding if every part of the analysis is trustworthy. If the analysis is bulletproof, the next question is whether the evidence is substantial enough to claim a discovery. And if a discovery can be claimed, the final question is what, exactly, has been discovered? Scientists can have complete confidence in their results but remain uncertain about how to interpret them.

    In physics, it’s easy to say what something is not but nearly impossible to say what it is. A single piece of corroborated, contradictory evidence can discredit an entire theory and destroy an organization’s credibility.

    “We’ll never be able to definitively say if something is exactly what we think it is, because there’s always something we don’t know and cannot test or measure,” Incandela says. “There could always be a very subtle new property or characteristic found in a high-precision experiment that revolutionizes our understanding.”

    With all of that in mind, Incandela and his team made a decision: From that point on, everyone would refine their scientific analyses using special data samples and a patch of fake data generated by computer simulations covering the interesting areas of their analyses. Then, when they were sure about their methodology and had enough data to make a significant observation, they would remove the patch and use their algorithms on all the real data in a process called unblinding.

    “This is a nice way of providing an unbiased view of the data and helps us build confidence in any unexpected signals that may be appearing, particularly if the same unexpected signal is seen in different types of analyses,” Incandela says.
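
    The blinding strategy Incandela describes can be illustrated with a short sketch. This is not CMS code; the mass window, event format and function names below are hypothetical, chosen only to show the idea of hiding the signal region until the analysis is frozen.

    # Illustrative sketch of the blinding strategy described above -- not actual CMS code.
    # Events whose reconstructed mass falls inside the window of interest are hidden
    # until the methodology is frozen and unblinding is approved.

    def blind(events, mass_low=110.0, mass_high=140.0, unblinded=False):
        """Return only events outside the (hypothetical) signal window unless unblinded."""
        if unblinded:
            return list(events)
        return [e for e in events if not (mass_low <= e["mass"] <= mass_high)]

    # Tune selections and fits on the sidebands only:
    data = [{"mass": 91.2}, {"mass": 125.1}, {"mass": 180.4}]
    print(blind(data))                   # signal region hidden
    print(blind(data, unblinded=True))   # full dataset, used only at the very end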

    A few weeks before July 4, all the different analysis groups met with Incandela to present a first look at their unblinded results. This time the bump was very significant and showing up at the same mass in two independent channels.

    “At that point, I knew we had something,” Incandela says. “That afternoon we presented the results to the rest of the collaboration. The next few weeks were among the most intense I have ever experienced.”

    Meanwhile, the other general-purpose experiment at the LHC, ATLAS, was hot on the trail of the same mysterious bump.

    CERN/ATLAS detector

    CERN ATLAS Higgs Event

    Andrew Hard was a graduate student at the University of Wisconsin-Madison working on the ATLAS Higgs analysis with his PhD thesis advisor, Sau Lan Wu.

    “Originally, my plan had been to return home to Tennessee and visit my parents over the winter holidays,” Hard says. “Instead, I came to CERN every day for five months—even on Christmas. There were a few days when I didn’t see anyone else at CERN. One time I thought some colleagues had come into the office, but it turned out to be two stray cats fighting in the corridor.”

    Hard was responsible for writing the code that selected and calibrated the particles of light the ATLAS detector recorded during the LHC’s high-energy collisions. According to predictions from the Standard Model, the Higgs can transform into two of these particles when it decays, so scientists on both experiments knew that this project would be key to the discovery process.

    “We all worked harder than we thought we could,” Hard says. “People collaborated well and everyone was excited about what would come next. All in all, it was the most exciting time in my career. I think the best qualities of the community came out during the discovery.”

    At the end of June, Hard and his colleagues synthesized all of their work into a single analysis to see what it revealed. And there it was again—that same bump, this time surpassing the statistical threshold the particle physics community generally requires to claim a discovery.
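
    The threshold in question is, by convention, a local significance of five standard deviations, corresponding to a one-sided p-value of roughly 3 × 10⁻⁷. A minimal sketch of that conversion, using SciPy (the numbers are illustrative, not the actual ATLAS result):

    # Converting between a p-value and the "n sigma" significance quoted in particle
    # physics. 5 sigma, the conventional discovery threshold, corresponds to a
    # one-sided p-value of about 3e-7. Illustrative only.
    from scipy.stats import norm

    def sigma_to_p(z):
        return norm.sf(z)        # one-sided Gaussian tail probability

    def p_to_sigma(p):
        return norm.isf(p)       # inverse of the above

    print(sigma_to_p(5.0))       # ~2.87e-7
    print(p_to_sigma(2.87e-7))   # ~5.0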

    “Soon everyone in the group started running into the office to see the number for the first time,” Hard says. “The Wisconsin group took a bunch of photos with the discovery plot.”

    Hard had no idea whether CMS scientists were looking at the same thing. At this point, the experiments were keeping their latest results secret—with the exception of Incandela, Fabiola Gianotti (then ATLAS spokesperson) and a handful of CERN’s senior management, who regularly met to discuss their progress and results.

    Fabiola Gianotti, then the ATLAS spokesperson, now the Director-General of CERN

    “I told the collaboration that the most important thing was for each experiment to work independently and not worry about what the other experiment was seeing,” Incandela says. “I did not tell anyone what I knew about ATLAS. It was not relevant to the tasks at hand.”

    Still, rumors were circulating around theoretical physics groups both at CERN and abroad. McCullough, then a postdoc at the Massachusetts Institute of Technology, was avidly following the progress of the two experiments.

    “We had an update in December 2011 and then another one a few months later in March, so we knew that both experiments were seeing something,” he says. “When this big excess showed up in July 2012, we were all convinced that it was the guy responsible for curing the ails of the Standard Model, but not necessarily precisely that guy predicted by the Standard Model. It could have properties mostly consistent with the Higgs boson but still be not absolutely identical.”

    The week before announcing what they’d found, Hard’s analysis group had daily meetings to discuss their results. He says they were excited but also nervous and stressed: Extraordinary claims require extraordinary confidence.

    “One of our meetings lasted over 10 hours, not including the dinner break halfway through,” Hard says. “I remember getting in a heated exchange with a colleague who accused me of having a bug in my code.”

    After both groups had independently and intensely scrutinized their Higgs-like bump through a series of checks, cross-checks and internal reviews, Incandela and Gianotti decided it was time to tell the world.

    “Some people asked me if I was sure we should say something,” Incandela says. “I remember saying that this train has left the station. This is what we’ve been working for, and we need to stand behind our results.”

    On July 4, 2012, Incandela and Gianotti stood before an expectant crowd and, one at a time, announced that decades of searching and generations of experiments had finally culminated in the discovery of a particle “compatible with the Higgs boson.”

    Science journalists rejoiced and rushed to publish their stories. But was this new particle the long-awaited Higgs boson? Or not?

    Discoveries in science rarely happen all at once; rather, they build slowly over time. And even when the evidence overwhelmingly points in a clear direction, scientists will rarely speak with superlatives or make definitive claims.

    “There is always a risk of overlooking the details,” Incandela says, “and major revolutions in science are often born in the details.”

    Immediately after the July 4 announcement, theorists from around the world issued a flurry of theoretical papers presenting alternative explanations and possible tests to see if this excess really was the Higgs boson predicted by the Standard Model or just something similar.

    “A lot of theory papers explored exotic ideas,” McCullough says. “It’s all part of the exercise. These papers act as a straw man so that we can see just how well we understand the particle and what additional tests need to be run.”

    For the next several months, scientists continued to examine the particle and its properties. The more data they collected and the more tests they ran, the more the discovery looked like the long-awaited Higgs boson. By March, both experiments had twice as much data and twice as much evidence.

    “Amongst ourselves, we called it the Higgs,” Incandela says, “but to the public, we were more careful.”

    It was increasingly difficult to keep qualifying their statements about it, though. “It was just getting too complicated,” Incandela says. “We didn’t want to always be in this position where we had to talk about this particle like we didn’t know what it was.”

    On March 14, 2013—nine months and 10 days after the original announcement—CERN issued a press release quoting Incandela as saying, “to me, it is clear that we are dealing with a Higgs boson, though we still have a long way to go to know what kind of Higgs boson it is.”

    To this day, scientists are open to the possibility that the Higgs they found is not exactly the Higgs they expected.

    “We are definitely, 100 percent sure that this is a Standard-Model-like Higgs boson,” Incandela says. “But we’re hoping that there’s a chink in that armor somewhere. The Higgs is a signpost, and we’re hoping for a slight discrepancy which will point us in the direction of new physics.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 4:04 pm on June 30, 2017 Permalink | Reply
    Tags: CERN LHC, Gluons, What really happens?

    From Symmetry: “What’s really happening during an LHC collision?” 

    Symmetry Mag

    Symmetry

    06/30/17
    Sarah Charley

    It’s less of a collision and more of a symphony.

    Wow!! ATLAS collaboration.

    The Large Hadron Collider is definitely large.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    With a 17-mile circumference, it is the biggest collider on the planet. But the latter part of its name is a little misleading. That’s because what collides in the LHC are the tiny pieces inside the hadrons, not the hadrons themselves.

    Hadrons are composite particles made up of quarks and gluons.

    The quark structure of the proton. Image: Arpad Horvath, 16 March 2006

    The gluons carry the strong force, which enables the quarks to stick together and binds them into a single particle.


    The main fodder for the LHC are hadrons called protons. Protons are made up of three quarks and an indefinable number of gluons. (Protons in turn make up atoms, which are the building blocks of everything around us.)

    If a proton were enlarged to the size of a basketball, it would look empty. Just like atoms, protons are mostly empty space. The individual quarks and gluons inside are known to be extremely small, less than 1/10,000th the size of the entire proton.

    “The inside of a proton would look like the atmosphere around you,” says Richard Ruiz, a theorist at Durham University. “It’s a mixture of empty space and microscopic particles that, for all intents and purposes, have no physical volume.

    “But if you put those particles inside a balloon, you’ll see the balloon expand. Even though the internal particles are microscopic, they interact with each other and exert a force on their surroundings, inevitably producing something which does have an observable volume.”

    So how do you collide two objects that are effectively empty space? You can’t. But luckily, you don’t need a classical collision to unleash a particle’s full potential.

    In particle physics, the term “collide” can mean that two protons glide through each other, and their fundamental components pass so close together that they can talk to each other. If their voices are loud enough and resonate in just the right way, they can pluck deep hidden fields that will sing their own tune in response—by producing new particles.

    “It’s a lot like music,” Ruiz says. “The entire universe is a symphony of complex harmonies which call and respond to each other. We can easily produce the mid-range tones, which would be like photons and muons, but some of these notes are so high that they require a huge amount of energy and very precise conditions to resonate.”

    Space is permeated with dormant fields that can briefly pop a particle into existence when vibrated with the right amount of energy. These fields play important roles but almost always work behind the scenes. The Higgs field, for instance, is always interacting with other particles to help them gain mass. But a Higgs particle will only appear if the field is plucked with the right resonance.

    When protons meet during an LHC collision, they break apart and the quarks and gluons come spilling out. They interact and pull more quarks and gluons out of space, eventually forming a shower of fast-moving hadrons.

    This subatomic symbiosis is facilitated by the LHC and recorded by the experiment, but it’s not restricted to the laboratory environment; particles are also accelerated by cosmic sources such as supernova remnants. “This happens everywhere in the universe,” Ruiz says. “The LHC and its experiments are not special in that sense. They’re more like a big concert hall that provides the energy to pop open and record the symphony inside each proton.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 5:12 pm on June 29, 2017 Permalink | Reply
    Tags: CERN LHC, HPSS - High Performance Storage System, RACF - RHIC and ATLAS Computing Facility, Scientific Data and Computing Center

    From BNL: “Brookhaven Lab’s Scientific Data and Computing Center Reaches 100 Petabytes of Recorded Data” 

    Brookhaven Lab

    Ariana Tantillo
    atantillo@bnl.gov

    Total reflects 17 years of experimental physics data collected by scientists to understand the fundamental nature of matter and the basic forces that shape our universe.

    (Back row) Ognian Novakov, Christopher Pinkenburg, Jérôme Lauret, Eric Lançon, (front row) Tim Chou, David Yu, Guangwei Che, and Shigeki Misawa at Brookhaven Lab’s Scientific Data and Computing Center, which houses the Oracle StorageTek tape storage system where experimental data are recorded.

    Imagine storing approximately 1300 years’ worth of HDTV video, nearly six million movies, or the entire written works of humankind in all languages since the start of recorded history—twice over. Each of these quantities is equivalent to 100 petabytes of data: the amount of data now recorded by the Relativistic Heavy Ion Collider (RHIC) and ATLAS Computing Facility (RACF) Mass Storage Service, part of the Scientific Data and Computing Center (SDCC) at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory. One petabyte is defined as 1024^5 bytes, or 1,125,899,906,842,624 bytes, of data.
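
    As a quick sanity check of those figures, the arithmetic can be done in a few lines. The HD video bitrate below (about 20 megabits per second) is an assumption used only for illustration, not a number from the article:

    # Back-of-the-envelope check of the figures above. The HD bitrate (~20 Mbit/s)
    # is an assumption, not a number taken from the article.
    PB = 1024 ** 5                          # one petabyte in bytes
    print(PB)                               # 1125899906842624

    total_bytes = 100 * PB
    hd_bytes_per_hour = 20e6 / 8 * 3600     # ~9 GB per hour of HD video
    hours = total_bytes / hd_bytes_per_hour
    print(hours / (24 * 365))               # ~1.4e3 years, the same order as "1300 years"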

    “This is a major milestone for SDCC, as it reflects nearly two decades of scientific research for the RHIC nuclear physics and ATLAS particle physics experiments, including the contributions of thousands of scientists and engineers,” said Brookhaven Lab technology architect David Yu, who leads the SDCC’s Mass Storage Group.

    SDCC is at the core of a global computing network connecting more than 2,500 researchers around the world with data from the STAR and PHENIX experiments at RHIC—a DOE Office of Science User Facility at Brookhaven—and the ATLAS experiment at the Large Hadron Collider (LHC) in Europe.

    BNL/RHIC Star Detector

    BNL/RHIC PHENIX

    CERN/ATLAS detector

    In these particle collision experiments, scientists recreate conditions that existed just after the Big Bang, with the goal of understanding the fundamental forces of nature—gravitational, electromagnetic, strong nuclear, and weak nuclear—and the basic structure of matter, energy, space, and time.

    Big Data Revolution

    The RHIC and ATLAS experiments are part of the big data revolution.

    BNL RHIC Campus


    BNL/RHIC

    These experiments involve collecting extremely large datasets that reduce statistical uncertainty to make high-precision measurements and search for extremely rare processes and particles.

    For example, only one Higgs boson—an elementary particle whose energy field is thought to give mass to all the other elementary particles—is produced for every billion proton-proton collisions at the LHC.

    CERN CMS Higgs Event


    CERN/CMS Detector

    CERN ATLAS Higgs Event

    Moreover, once produced, the Higgs boson almost immediately decays into other particles. So detecting the particle is a rare event, with around one trillion collisions required to detect a single instance. When scientists first discovered the Higgs boson at the LHC in 2012, they observed about 20 instances, recording and analyzing more than 300 trillion collisions to confirm the particle’s discovery.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    At the end of 2016, the ATLAS collaboration released its first measurement of the mass of the W boson particle (another elementary particle that, together with the Z boson, is responsible for the weak nuclear force). This measurement, which is based on a sample of 15 million W boson candidates collected at LHC in 2011, has a relative precision of 240 parts per million (ppm)—a result that matches the best single-experiment measurement announced in 2007 by the Collider Detector at Fermilab collaboration, whose measurement is based on several years’ worth of collected data. A highly precise measurement is important because a deviation from the mass predicted by the Standard Model could point to new physics. More data samples are required to achieve the level of accuracy (80 ppm) that scientists need to significantly test this model.

    The volume of data collected by these experiments will grow significantly in the near future as new accelerator programs deliver higher-intensity beams. The LHC will be upgraded to increase its luminosity (rate of collisions) by a factor of 10. This High-Luminosity LHC, which should be operational by 2025, will provide a unique opportunity for particle physicists to look for new and unexpected phenomena within the exabytes (one exabyte equals 1000 petabytes) of data that will be collected.

    Data archiving is the first step in making available the results from such experiments. Thousands of physicists then need to calibrate and analyze the archived data and compare the data to simulations. To this end, computational scientists, computer scientists, and mathematicians in Brookhaven Lab’s Computational Science Initiative, which encompasses SDCC, are developing programming tools, numerical models, and data-mining algorithms. Part of SDCC’s mission is to provide computing and networking resources in support of these activities.

    A Data Storage, Computing, and Networking Infrastructure

    Housed inside SDCC are more than 60,000 computing cores, 250 computer racks, and tape libraries capable of holding up to 90,000 magnetic storage tape cartridges that are used to store, process, analyze, and distribute the experimental data. The facility provides approximately 90 percent of the computing capacity for analyzing data from the STAR and PHENIX experiments, and serves as the largest of the 12 Tier 1 computing centers worldwide that support the ATLAS experiment. As a Tier 1 center, SDCC contributes nearly 23 percent of the total computing and storage capacity for the ATLAS experiment and delivers approximately 200 terabytes of data (picture 62 million photos) per day to more than 100 data centers globally.

    At SDCC, the High Performance Storage System (HPSS) has been providing mass storage services to the RHIC and LHC experiments since 1997 and 2006, respectively. This data archiving and retrieval software, developed by IBM and several DOE national laboratories, manages petabytes of data on disk and in robot-controlled tape libraries. Contained within the libraries are magnetic tape cartridges that encode the data and tape drives that read and write the data. Robotic arms load the cartridges into the drives and unload them upon request.

    Inside one of the automated tape libraries at the Scientific Data and Computing Center (SDCC), Eric Lançon, director of SDCC, holds a magnetic tape cartridge. When scientists need data, a robotic arm (the piece of equipment in front of Lançon) retrieves the relevant cartridges from their slots and loads them into drives in the back of the library.

    When ranked by the volume of data stored in a single HPSS, Brookhaven’s system is the second largest in the nation and the fourth largest in the world. Currently, the RACF operates nine Oracle robotic tape libraries that constitute the largest Oracle tape storage system in the New York tri-state area. Contained within this system are nearly 70,000 active cartridges with capacities ranging from 800 gigabytes to 8.5 terabytes, and more than 100 tape drives. As the volume of scientific data to be stored increases, more libraries, tapes, and drives can be added accordingly. In 2006, this scalability was exercised when HPSS was expanded to accommodate data from the ATLAS experiment at LHC.

    “The HPSS system was deployed in the late 1990s, when the RHIC accelerator was coming on line. It allowed data from RHIC experiments to be transmitted via network to the data center for storage—a relatively new idea at the time,” said Shigeki Misawa, manager of Mass Storage and General Services at Brookhaven Lab. Misawa played a key role in the initial evaluation and configuration of HPSS, and has guided the system through significant changes in hardware (network equipment, storage systems, and servers) and operational requirements (tape drive read/write rate, magnetic tape cartridge capacity, and data transfer speed). “Prior to this system, data was recorded on magnetic tape at the experiment and physically moved to the data center,” he continued.

    Over the years, SDCC’s HPSS has been augmented with a suite of optimization and monitoring tools developed at Brookhaven Lab. One of these tools is David Yu’s scheduling software that optimizes the retrieval of massive amounts of data from tape storage. Another, developed by Jérôme Lauret, software and computing project leader for the STAR experiment, is software for organizing multiple user requests to retrieve data more efficiently.
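
    The essence of both tools is to reorder pending requests so that each tape is mounted once and read sequentially, rather than being mounted repeatedly in the order requests arrive. The sketch below illustrates that principle only; it is not the actual Brookhaven software, and the request format is hypothetical.

    # Minimal illustration of request reordering for tape retrieval: group pending
    # file requests by cartridge, then serve each cartridge's files in order of
    # their position on the tape. Not the actual BNL/HPSS software.
    from collections import defaultdict

    def batch_by_cartridge(requests):
        """requests: list of (user, cartridge_id, file_offset) tuples (hypothetical format)."""
        batches = defaultdict(list)
        for user, cartridge, offset in requests:
            batches[cartridge].append((offset, user))
        return {cart: sorted(files) for cart, files in batches.items()}

    requests = [("alice", "T0412", 5812), ("bob", "T0099", 17),
                ("carol", "T0412", 120), ("alice", "T0099", 9001)]
    for cartridge, files in batch_by_cartridge(requests).items():
        print(cartridge, files)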

    Engineers in the Mass Storage Group—including Tim Chou, Guangwei Che, and Ognian Novakov—have created other software tools customized for Brookhaven Lab’s computing environment to enhance data management and operation abilities and to improve the effectiveness of equipment usage.

    STAR experiment scientists have demonstrated the capabilities of SDCC’s enhanced HPSS, retrieving more than 4,000 files per hour (a rate of 6,000 gigabytes per hour) while using a third of HPSS resources. On the data archiving side, HPSS can store data in excess of five gigabytes per second.

    As demand for mass data storage spreads across Brookhaven, access to HPSS is being extended to other research groups. In the future, SDCC is expected to provide centralized mass storage services to multi-experiment facilities, such as the Center for Functional Nanomaterials and the National Synchrotron Light Source II—two more DOE Office of Science User Facilities at Brookhaven.

    “The tape library system of SDCC is a clear asset for Brookhaven’s current and upcoming big data science programs,” said SDCC Director Eric Lançon. “Our expertise in the field of data archiving is acknowledged worldwide.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 6:04 pm on June 20, 2017 Permalink | Reply
    Tags: CERN LHC, Cyber security

    From SA: “World’s Most Powerful Particle Collider Taps AI to Expose Hack Attacks” 

    Scientific American

    June 19, 2017
    Jesse Emspak

    A general view of the CERN Computer / Data Center and server farm. Credit: Dean Mouhtaropoulos Getty Images

    Thousands of scientists worldwide tap into CERN’s computer networks each day in their quest to better understand the fundamental structure of the universe. Unfortunately, they are not the only ones who want a piece of this vast pool of computing power, which serves the world’s largest particle physics laboratory. The hundreds of thousands of computers in CERN’s grid are also a prime target for hackers who want to hijack those resources to make money or attack other computer systems. But rather than engaging in a perpetual game of hide-and-seek with these cyber intruders via conventional security systems, CERN scientists are turning to artificial intelligence to help them outsmart their online opponents.

    Current detection systems typically spot attacks on networks by scanning incoming data for known viruses and other types of malicious code. But these systems are relatively useless against new and unfamiliar threats. Given how quickly malware changes these days, CERN is developing new systems that use machine learning to recognize and report abnormal network traffic to an administrator. For example, a system might learn to flag traffic that requires an uncharacteristically large amount of bandwidth, uses the incorrect procedure when it tries to enter the network (much like using the wrong secret knock on a door) or seeks network access via an unauthorized port (essentially trying to get in through a door that is off-limits).

    CERN’s cybersecurity department is training its AI software to learn the difference between normal and dubious behavior on the network, and to then alert staff via phone text, e-mail or computer message of any potential threat. The system could even be automated to shut down suspicious activity on its own, says Andres Gomez, lead author of a paper [Intrusion Prevention and Detection in Grid Computing – The ALICE Case] describing the new cybersecurity framework.
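
    The article does not say which algorithms CERN uses, but the general technique, unsupervised anomaly detection on network-flow features, can be sketched with scikit-learn. The features, toy data and thresholds below are hypothetical; this is not CERN’s framework.

    # Generic sketch of unsupervised anomaly detection on network-flow features,
    # in the spirit of the approach described above. Toy data only.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)
    # Columns: bytes transferred, destination port, connection duration (hypothetical features).
    normal_traffic = rng.normal(loc=[5e4, 443, 30], scale=[1e4, 5, 10], size=(1000, 3))

    model = IsolationForest(contamination=0.01, random_state=0)
    model.fit(normal_traffic)

    new_flows = np.array([[5.2e4, 443, 28],        # looks like normal traffic
                          [9.0e6, 31337, 3600]])   # huge transfer on an odd port
    print(model.predict(new_flows))                # 1 = normal, -1 = flagged as anomalous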

    CERN’s Jewel

    CERN—the French acronym for the European Organization for Nuclear Research lab, which sits on the Franco-Swiss border—is opting for this new approach to protect a computer grid used by more than 8,000 physicists to quickly access and analyze large volumes of data produced by the Large Hadron Collider (LHC).

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    The LHC’s main job is to collide atomic particles at high speed so that scientists can study how particles interact. Particle detectors and other scientific instruments within the LHC gather information about these collisions, and CERN makes it available to laboratories and universities worldwide for use in their own research projects.

    The LHC is expected to generate a total of about 50 petabytes of data (equal to 15 million high-definition movies) in 2017 alone, and demands more computing power and data storage than CERN itself can provide. In anticipation of that type of growth, the laboratory in 2002 created its Worldwide LHC Computing Grid, which connects computers from more than 170 research facilities across more than 40 countries. CERN’s computer network functions somewhat like an electrical grid, which relies on a network of generating stations that create and deliver electricity as needed to a particular community of homes and businesses. In CERN’s case the community consists of research labs that require varying amounts of computing resources, based on the type of work they are doing at any given time.

    Grid Guardians

    One of the biggest challenges to defending a computer grid is the fact that security cannot interfere with the sharing of processing power and data storage. Scientists from labs in different parts of the world might end up accessing the same computers to do their research if demand on the grid is high or if their projects are similar. CERN also has to worry about whether the computers of the scientists connecting into the grid are free of viruses and other malicious software that could enter and spread quickly due to all the sharing. A virus might, for example, allow hackers to take over parts of the grid and use those computers either to generate digital currency known as bitcoins or to launch cyber attacks against other computers. “In normal situations, antivirus programs try to keep intrusions out of a single machine,” Gomez says. “In the grid we have to protect hundreds of thousands of machines that already allow” researchers outside CERN to use a variety of software programs they need for their different experiments. “The magnitude of the data you can collect and the very distributed environment make intrusion detection on [a] grid far more complex,” he says.

    Jarno Niemelä, a senior security researcher at F-Secure, a company that designs antivirus and computer security systems, says CERN’s use of machine learning to train its network defenses will give the lab much-needed flexibility in protecting its grid, especially when searching for new threats. Still, artificially intelligent intrusion detection is not without risks—and one of the biggest is whether Gomez and his team can develop machine-learning algorithms that can tell the difference between normal and harmful activity on the network without raising a lot of false alarms, Niemelä says.

    CERN’s AI cybersecurity upgrades are still in the early stages and will be rolled out over time. The first test will be protecting the portion of the grid used by ALICE (A Large Ion Collider Experiment)—a key LHC project to study the collisions of lead nuclei. If tests on ALICE are successful, CERN’s machine learning–based security could then be used to defend parts of the grid used by the institution’s six other detector experiments.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 2:20 pm on June 12, 2017 Permalink | Reply
    Tags: But it’s not a perfect vacuum, CERN LHC, Creating secondary electrons, How to clean inside the LHC, Self-healing feature

    From Symmetry: “How to clean inside the LHC” 

    Symmetry Mag

    Symmetry

    06/12/17
    Sarah Charley

    Daniel Dominguez, CERN

    The beam pipes of the LHC need to be so clean, even air molecules count as dirt.

    The Large Hadron Collider is the world’s most powerful accelerator.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    Inside, beams of particles sprint 17 miles around in opposite directions through a pair of evacuated beam pipes that intersect at collision points surrounded by giant particle detectors.

    The inside of the beam pipes needs to be spotless, which is why the LHC is thoroughly cleaned every year before it ramps up its summer operations program.

    It’s not dirt or grime that clogs the LHC. Rather, it’s microscopic air molecules.

    “The LHC is incredibly cold and under a strong vacuum, but it’s not a perfect vacuum,” says LHC accelerator physicist Giovanni Rumolo. “There’s a tiny number of simple atmospheric gas molecules and even more frozen to the beam pipes’ walls.”

    Protons racing around the LHC crash into these floating air molecules, detaching their electrons. The liberated electrons jump after the positively charged protons but quickly crash into the beam pipe walls, depositing heat and liberating even more electrons from the frozen gas molecules there.

    This process quickly turns into an avalanche, which weakens the vacuum, heats up the cryogenic system, disrupts the proton beam and dramatically lowers the efficiency and reliability of the LHC.

    But the clouds of buzzing electrons inside the beam pipe possess an interesting self-healing feature, Rumolo says.

    “When the chamber wall is under intense electron bombardment, the probability of it creating secondary electrons decreases and the avalanche is gradually mitigated,” he says. “Before ramping the LHC up to its full intensity, we run the machine for several days with as many low-energy protons as we can safely manage and intentionally produce electron clouds. The effect is that we have fewer loose electrons during the LHC’s physics runs.”

    In other words, accelerator engineers clean the inside of the LHC a little like they would unclog a shower drain. They gradually pump the LHC full of more and more sluggish protons, which act like a scrub brush and knock off the microscopic grime clinging to the inside of the beam pipe. This loose debris is flushed out by the vacuum system. In addition, the bombardment of electrons transforms simple carbon molecules, which are still clinging to the beam pipe’s walls, into an inert and protective coating of graphite.
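
    The self-limiting behavior Rumolo describes can be caricatured with a toy model in which the secondary electron yield (SEY) of the chamber wall drops as it accumulates electron dose, eventually choking off the avalanche. Every number below is invented for illustration; this is not an LHC simulation.

    # Toy model of "scrubbing": the wall's secondary electron yield (SEY) falls as
    # electron dose accumulates, so the electron-cloud avalanche dies out.
    # All parameters are invented for illustration.

    def scrub(initial_sey=1.8, decay_per_dose=5e-9, threshold=1.0):
        sey, dose, generations = initial_sey, 0.0, 0
        electrons = 1e6                    # seed electrons from ionized gas molecules
        while sey > threshold and generations < 50:
            electrons *= sey               # each wall impact releases `sey` secondaries on average
            dose += electrons              # accumulated dose conditions the surface
            sey = initial_sey - decay_per_dose * dose
            generations += 1
        return generations, sey

    gens, final_sey = scrub()
    print(f"Avalanche self-limits after {gens} generations; SEY conditioned down to {final_sey:.2f}")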

    Cleaning the beam pipe is such an important job that there is a team of experts responsible for it (officially called the “Scrubbing Team”).

    “Scrubbing is essential if we want to operate the LHC at its full potential,” Rumolo says. “It’s challenging, because there is a fine line between thoroughly cleaning the machine and accidentally dumping the beam. When we’re scrubbing, we work around the clock in the CERN Control Center to make sure the accelerator is safe and the scrubbing is working properly.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 3:28 pm on June 6, 2017 Permalink | Reply
    Tags: CERN LHC, CERN unveils new linear accelerator

    From Symmetry: “CERN unveils new linear accelerator” 

    Symmetry Mag

    Symmetry

    05/09/17 [Don’t know how I missed this one]
    No writer credit found

    Photo by CERN

    Linac 4 will replace an older accelerator as the first step in the complex that includes the LHC.

    At a ceremony today, CERN, the European research center, inaugurated its newest accelerator.

    Linac 4 will eventually become the first step in CERN’s accelerator chain, delivering proton beams to a wide range of experiments, including those at the Large Hadron Collider.

    After an extensive testing period, Linac 4 will be connected to CERN’s accelerator complex during a long technical shutdown in 2019-20. Linac 4 will replace Linac 2, which was put into service in 1978. Linac 4 will feed the CERN accelerator complex with particle beams of higher energy.

    “We are delighted to celebrate this remarkable accomplishment,” says CERN Director General Fabiola Gianotti. “Linac 4 is a modern injector and the first key element of our ambitious upgrade program, leading to the High-Luminosity LHC. This high-luminosity phase will considerably increase the potential of the LHC experiments for discovering new physics and measuring the properties of the Higgs particle in more detail.”

    “This is an achievement not only for CERN, but also for the partners from many countries who contributed in designing and building this new machine,” says CERN Director for Accelerators and Technology Frédérick Bordry. “We also today celebrate and thank the wide international collaboration that led this project, demonstrating once again what can be accomplished by bringing together the efforts of many nations.”

    The linear accelerator is the first essential element of an accelerator chain. In the linear accelerator, the particles are produced and receive the initial acceleration. The density and intensity of the particle beams are also shaped in the linac. Linac 4 is an almost 90-meter-long machine sitting 12 meters below the ground. It took nearly 10 years to build it.

    Linac 4 will send negative hydrogen ions, consisting of a hydrogen atom with an extra electron, to CERN’s Proton Synchrotron Booster, which further accelerates the negative ions and removes the electrons. Linac 4 will bring the beam up to an energy of 160 million electronvolts, more than 3 times the energy of its predecessor. The increase in energy, together with the use of hydrogen ions, will enable doubling the beam intensity delivered to the LHC, contributing to an increase in the luminosity of the LHC by 2021.

    Luminosity is a parameter indicating the number of particles colliding within a defined amount of time. The peak luminosity of the LHC is planned to be increased by a factor of 5 by the year 2025. This will make it possible for the experiments to accumulate about 10 times more data over the period 2025 to 2035 than before.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 1:17 pm on June 6, 2017 Permalink | Reply
    Tags: , CERN LHC, , , , , When proton–proton collisions turn strange   

    From Physics Today: “When proton–proton collisions turn strange” 

    Physics Today bloc

    Physics Today

    5 Jun 2017
    Sung Chang

    Enhanced production of particles that contain strange quarks deepens the mysteries surrounding the formation of a quark–gluon plasma.

    Protons colliding into other protons at CERN’s Large Hadron Collider (LHC) flushed out the Higgs boson.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    But for studying quark–gluon plasma (QGP)—the hot, dense soup of unconfined quarks and gluons that briefly filled the universe a few microseconds after the Big Bang—those collisions were supposed to be irrelevant. They are even used as QGP-absent baselines to compare with when investigating heavy-ion collisions that do produce QGP. However, in 2010 the CMS collaboration spotted something unexpected in proton–proton collisions. In the debris of rare, so-called high-multiplicity events—that is, events that produce an unusually high number of charged particles—the researchers discovered spatial correlations reminiscent of those attributed to QGP formation in heavy-ion collisions (see the article by Barbara Jacak and Peter Steinberg, Physics Today, May 2010, page 39). The ALICE and ATLAS collaborations soon corroborated the discovery.

    In addition, ALICE researchers found that in proton–lead ion collisions, the relative yield of particles that contain strange quarks increases with multiplicity. Strangeness enhancement is another hallmark of QGP formation. Now the ALICE collaboration reports that in high-multiplicity proton–proton collisions, such as the one shown here, strangeness is similarly enhanced. The finding is the latest entry in a growing, though not yet conclusive, list of evidence that QGP can form even in proton–proton collisions. Now that the LHC is operating at higher energies, the high-multiplicity collisions are both more frequent and extend to higher multiplicities. Researchers hope to decisively establish whether QGP can indeed be created in proton–proton collisions. And if that’s the case, physicists could probe the properties of QGP in a far simpler system than the one produced in heavy-ion collisions. (J. Adam et al., ALICE collaboration, Nat. Phys. 13, 535, 2017. Image courtesy of CERN.)

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    “Our mission

    The mission of Physics Today is to be a unifying influence for the diverse areas of physics and the physics-related sciences.

    It does that in three ways:

    • by providing authoritative, engaging coverage of physical science research and its applications without regard to disciplinary boundaries;
    • by providing authoritative, engaging coverage of the often complex interactions of the physical sciences with each other and with other spheres of human endeavor; and
    • by providing a forum for the exchange of ideas within the scientific community.”

     
  • richardmitnick 2:39 pm on May 30, 2017 Permalink | Reply
    Tags: CERN LHC, Fermilab designs and builds focusing magnets for the LHC

    From FNAL: “Fermilab designs and builds focusing magnets for the LHC” 

    FNAL II photo

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Fermilab designed and built advanced, superconducting magnets to focus particle beams, preparing them for collision, at the Large Hadron Collider. The 43-foot-long, 19-ton magnets took a decade to design, develop, manufacture and test, and ultimately became part of the world’s most powerful particle collider.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    FNAL Icon
    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 10:18 am on May 23, 2017 Permalink | Reply
    Tags: CERN LHC

    From CERN: “Kick-off for the 2017 LHC physics season” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    CERN

    Data-taking has started again at the LHC for the first time in 2017.

    The experiments are continuing their exploration of physics at the unprecedented energy of 13 TeV.

    Photo by Maximilien Brice, CERN, via Symmetry

    Physics at the LHC has kicked off for another season. Today, the Large Hadron Collider shifted up a gear, allowing the experiments to start taking data for the first time in 2017. Operations are starting gradually, with just a few proton bunches per beam. The operators who control the most powerful collider in the world will gradually increase the number of bunches circulating and will also reduce the size of the beams at the interaction points.

    In a few weeks’ time, over a billion collisions will be produced every second at the heart of the experiments.

    Last year, the LHC produced an impressive amount of data, no fewer than 6.5 million billion collisions, representing an integrated luminosity over the course of the year of almost 40 inverse femtobarns.

    Luminosity, which corresponds to the number of potential collisions per surface unit in a given time period, is a crucial indicator of an accelerator’s performance.
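
    Those two figures are consistent with the basic relation N = σ × ∫L dt. A rough check, assuming an inelastic proton-proton cross-section of about 80 millibarns at 13 TeV and counting the collisions delivered to both general-purpose detectors (both of those inputs are assumptions, not numbers from this article):

    # Back-of-the-envelope link between integrated luminosity and collision count:
    # N = sigma_inelastic * integrated_luminosity. The ~80 mb cross-section and the
    # factor of two (ATLAS + CMS) are assumptions, not numbers from the article.
    sigma_inel_mb = 80.0           # inelastic pp cross-section at 13 TeV, in millibarns
    int_lumi_fb = 40.0             # integrated luminosity, in inverse femtobarns
    mb_per_inv_fb = 1e12           # 1 mb * 1 fb^-1 = 1e12 interactions

    per_experiment = sigma_inel_mb * int_lumi_fb * mb_per_inv_fb
    print(per_experiment)          # ~3.2e15 collisions in one detector
    print(2 * per_experiment)      # ~6.4e15, consistent with "6.5 million billion"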

    In 2017, the operators are hoping to produce the same number of collisions as in 2016, but over a shorter period, since the LHC has started up a month later due to the extended year-end technical stop. “Over the first two years of operation at a collision energy of 13 TeV, we built up an excellent understanding of how the LHC works, which will allow us to optimise its operation even further in the third year,” says Frédérick Bordry, Director for Accelerators and Technology at CERN. “Our goal is to increase the peak luminosity even further and to maintain the LHC’s excellent availability, which in itself would be a great achievement.”

    Particle physics relies on the statistical analysis of various phenomena, so the size of the samples is crucial. In other words, the greater the number of collisions that reveal a certain phenomenon, the more reliable the result is. The experiments intend to take advantage of the large quantity of data supplied by the LHC to continue their exploration of physics at the highest energy ever obtained by an accelerator.

    “The LHC experiments are well prepared to double their statistics compared to what they obtained in 2016 at 13 TeV. Thanks to the new data, they will be able to reduce the uncertainties that surround their observations every time we enter uncharted territory,” says Eckhard Elsen, Director for Research and Computing.

    The LHC physicists are working on two different broad areas: improving their knowledge of known phenomena and probing the unknown. The known phenomena constitute the Standard Model of Particles and Forces, a theory that encompasses all our current knowledge of elementary particles.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    The Higgs boson, discovered in 2012, plays a key role in the Standard Model.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    It is also a scalar particle, fundamentally different to the other elementary particles. In 2017, ATLAS and CMS will continue to work on determining the characteristics of this particle.

    CERN/ATLAS detector

    CERN/CMS Detector

    These two large general-purpose experiments will observe its decay modes and how it interacts with other particles. Their measurements may provide indications of possible new physics beyond the Standard Model. The experiments will also carry out precise measurements of other processes of the Standard Model, in particular those involving the top quark, the elementary particle with the greatest mass.

    Physicists hope to be able to identify disparities between their measurements and the Standard Model. This is one of the ways in which the unknown can be probed. Although it describes a lot of the phenomena of the infinitely small precisely, the Standard Model leaves many questions unanswered. For example, it describes only 5% of the universe; the rest is formed of dark matter and dark energy, the nature of which are as yet unknown. Every discrepancy with regard to the theory could direct physicists towards a larger theoretical framework of new physics that might resolve the enigmas we face.

    ATLAS, CMS and LHCb measure processes precisely to detect anomalies.

    CERN/LHCb

    ATLAS and CMS are also looking for new particles, such as those predicted by the theory of supersymmetry, which could be the components of dark matter.

    Standard model of Supersymmetry DESY

    LHCb is also interested in the imbalance between matter and antimatter. Both of these would have been created in equal quantities at the time of the Big Bang, but antimatter is now practically absent from the universe. LHCb is tracking the phenomenon known as “charge-parity violation” which is thought to be at least partly responsible for this imbalance.

    No lead ion collisions, which are the ALICE experiment’s specialist subject, are planned at the LHC this year.

    CERN/ALICE Detector

    ALICE will continue its analysis of the 2016 data and will record proton-proton collisions, which will also allow it to study the strong force. On the basis of the proton-proton collisions from 2016, ALICE recently announced that it had observed a state of matter resembling quark-gluon plasma.

    Quark gluon plasma. Duke University

    Quark-gluon plasma is the state of matter that existed a few millionths of a second after the Big Bang.

    Finally, several days of physics running with de-squeezed beams are planned for the TOTEM and ATLAS/ALFA experiments.

    CERN TOTEM

    CERN ATLAS/ALFA

    To find out more about physics at the LHC, you can watch our “Facebook Live” event tomorrow at 4 p.m. CEST [no link provided].

    Received via email, so no link to the article. Sorry.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    CernCourier
    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

    Quantum Diaries

     
  • richardmitnick 8:58 pm on May 16, 2017 Permalink | Reply
    Tags: Alastair Paragas, CERN LHC

    From FIU: “My Internship with CERN” Alastair Paragas 

    FIU bloc

    This post is dedicated to J.L.T. who will prove Loop Quantum Gravity. I hope he sees it.

    Florida International University

    05/15/2017
    Millie Acebal

    Name: Alastair Paragas

    Major: Computer Science (College of Engineering and Computing and Honors College)

    Hometown: Originally from Manila, Philippines; currently living in Homestead, Florida

    Where will you intern? Starting June 19, I will intern at CERN, located in Geneva, Switzerland. CERN is the home of the (Large) (H)adron (C)ollider where the Higgs boson particle was discovered.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    Tim Berners-Lee
    https://www.w3.org/People/Berners-Lee/

    Another great development at CERN was the creation of the (W)orld (W)ide (W)eb, the foundation of the modern internet, with web pages accessible as documents through HTTP (HyperText Transfer Protocol), as developed by Tim Berners-Lee.

    Though CERN is in Geneva, I will be living in Saint Genis-Pouilly, France. Saint Genis-Pouilly is a town on the French side of the Franco-Swiss border, with CERN being on the Swiss side of the border. Luckily enough, the commute is only 2 miles long, and crossing is quite easy because of the relaxed borders between the two countries, due largely to CERN’s importance to the European Union as a nuclear research facility. As such, I get to cross the border twice a day!

    What do you do there?

    I will be doing research and actual software engineering work with CERN’s distributed computing and data reporting/analytics team, under the mentorship of Manuel Martin Marquez. I will work on the software that transports real-time data collected from the various instruments and devices at CERN, making sure those data don’t get lost! I also get to develop software that stores such data for both online transactional and analytical processing workloads.

    How did you get your internship?

    Out of 1,560 complete applications (and more partial applications), I was happy to be chosen as one of three U.S. students, among 33 students selected in total from around the world.

    I was also lucky to also be accepted as an intern at NASA’s Langley Research Center (Virginia), under their autonomous algorithm team and the mentorship of A.J. Narkawicz, working on the DAEDALUS and ICAROUS projects for autonomous unmanned aerial and watercraft systems. Most of this software supports and runs with/on critical software that operate in all of modern American airports and air traffic control. However, I chose to turn this down for CERN.

    How does your internship connect back to your coursework?

    The internship connects back to what I learned in Operating Systems, Database, and Survey of Database Systems: managing synchronization between concurrent processes and the lower-level software aspects of a computer; managing data across various data stores; appreciating the importance of various features of a relational database; and knowing when not to use a relational database (cases which are few and far between), and so forth.

    What about this internship opportunity excites you the most?

    I am looking forward to living in Europe, completely free, for nine weeks! I never thought it would be possible for me to travel around the world in such a capacity – and for that, I am very grateful.

    Coming from a poor background as an immigrant, I would never think it possible to be a citizen of the United States, much less, be able to do things like this.

    What have you learned about yourself?

    I learned that, just like always, I am cheap and would like to live on the bare minimum. Even in my previous internships, I remember calculating my grocery costs to make sure they were optimal and that I wasn’t breaking the budget, even when I could afford the cost. I am already starting to suffer just looking at food prices at the local stores in the area.

    How will this internship help you professionally?

    I expect that just like my internships at Wolfram and Apple, I can network with highly intelligent people coming from diverse fields of study, ranging from physics, mathematics, mechanical engineering and computer science. I am always humbled working with behemoths from their respective fields, living and working on the shoulders of giants.

    What advice do you have for others starting the internship process?

    This is my third internship. I interned at Wolfram during my sophomore year in Waltham, MA, building a research project utilizing Wolfram technologies. I also completed an internship at Apple during my junior year as a software engineer in Cupertino, CA, building real-time streaming and batch data processing and reporting softwares in Apple’s Internet Software and Services Department.

    At our club – Association for Computing Machinery at FIU – we’ve also managed to create a community of highly successful and motivated students doing internships this summer at prestigious companies (all software engineering roles at companies like Chase, State Farm, Target, MathWorks, etc.). We have weekly workshops on machine learning, big data, web/mobile application development, programming languages and a lot of other real-world engineering principles that escape the more academic theory of the computer science/information technology curriculum.

    We also get tons of our members to come to hackathons with us, whether by getting their travel expenses reimbursed or carpools! Considering that we are club officers, we don’t get paid for the services we do for the club – we’re seriously and passionately committed and do care about getting as many students into the level of expertise and careers they want for themselves.

    Anything else you’d care to share?

    On a more personal note, I would also like to say that just like everyone else, I have had bouts in my life where I felt like I was not accomplishing anything and also suffered from the emotions that come with that. It is important to never place someone on a pedestal while seeing yourself as little. However hard those moments may hit, I consider it highly important to re-evaluate and to emphasize to yourself the importance of working harder and fighting against possible temptations and vices that may result from such emotions and impulses; the idea of not giving up is all the more important.

    Personally, I was able to fight through this by being a part of my local Marine Corps’ Delayed Entry Program (DEP), under the mentorship of Sgt. Ariel Tavarez, where I was able to reflect, get inspired and work through grueling physical exercises with people who have made an impactful change in their lives. Different solutions work for different people, but the one thing that stays true across all of these is to always stay your course.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    FIU Campus

    As Miami’s first and only public research university, offering bachelor’s, master’s, and doctoral degrees, FIU is worlds ahead in its service to the academic and local community.

    Designated as a top-tier research institution, FIU emphasizes research as a major component in the university’s mission. The Herbert Wertheim College of Medicine and the School of Computing and Information Sciences’ Discovery Lab, are just two of many colleges, schools, and centers that actively enhance the university’s ability to set new standards through research initiatives.

     