Tagged: CERN LHC

  • richardmitnick 10:18 am on May 23, 2017 Permalink | Reply
    Tags: CERN LHC

    From CERN: “Kick-off for the 2017 LHC physics season” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    CERN

    Data-taking has started again at the LHC for the first time in 2017.

    The experiments are continuing their exploration of physics at the unprecedented energy of 13 TeV.

    Photo by Maximilien Brice, CERN, via Symmetry

    Physics at the LHC has kicked off for another season. Today, the Large Hadron Collider shifted up a gear, allowing the experiments to start taking data for the first time in 2017. Operations are starting gradually, with just a few proton bunches per beam. The operators who control the most powerful collider in the world will gradually increase the number of bunches circulating and will also reduce the size of the beams at the interaction points.

    In a few weeks’ time, over a billion collisions will be produced every second at the heart of the experiments.

    Last year, the LHC produced an impressive amount of data, no fewer than 6.5 million billion collisions, representing an integrated luminosity over the course of the year of almost 40 inverse femtobarns.

    Luminosity, which corresponds to the number of potential collisions per unit area in a given time period, is a crucial indicator of an accelerator’s performance.
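As an order-of-magnitude check, the quoted collision counts follow directly from cross-section times luminosity. The ~80 millibarn inelastic proton-proton cross-section and the ~1.4×10³⁴ cm⁻²s⁻¹ peak luminosity used below are assumed round values, not figures from the article:

```python
# Order-of-magnitude check: collisions = cross-section x luminosity.
SIGMA_INELASTIC_B = 0.08        # inelastic pp cross-section in barns (~80 mb, assumed)
INT_LUMI_FB_INV = 40.0          # 2016 integrated luminosity, inverse femtobarns
B_INV_PER_FB_INV = 1e15         # 1 fb^-1 = 10^15 b^-1

n_collisions = SIGMA_INELASTIC_B * INT_LUMI_FB_INV * B_INV_PER_FB_INV
print(f"collisions over 2016: ~{n_collisions:.1e}")   # ~3.2e+15, i.e. millions of billions

# Assumed peak instantaneous luminosity of ~1.4e34 cm^-2 s^-1:
sigma_cm2 = SIGMA_INELASTIC_B * 1e-24                 # 1 barn = 1e-24 cm^2
rate = 1.4e34 * sigma_cm2
print(f"collision rate: ~{rate:.1e} per second")      # over a billion per second
```

Both results land in the ranges the article quotes: collisions in the "million billion" range for the year, and over a billion collisions per second at peak.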

    In 2017, the operators are hoping to produce the same number of collisions as in 2016, but over a shorter period, since the LHC has started up a month later due to the extended year-end technical stop. “Over the first two years of operation at a collision energy of 13 TeV, we built up an excellent understanding of how the LHC works, which will allow us to optimise its operation even further in the third year,” says Frédérick Bordry, Director for Accelerators and Technology at CERN. “Our goal is to increase the peak luminosity even further and to maintain the LHC’s excellent availability, which in itself would be a great achievement.”

    Particle physics relies on the statistical analysis of various phenomena, so the size of the samples is crucial. In other words, the greater the number of collisions that reveal a certain phenomenon, the more reliable the result is. The experiments intend to take advantage of the large quantity of data supplied by the LHC to continue their exploration of physics at the highest energy ever obtained by an accelerator.

    “The LHC experiments are well prepared to double their statistics compared to what they obtained in 2016 at 13 TeV. Thanks to the new data, they will be able to reduce the uncertainties that surround their observations every time we enter uncharted territory,” says Eckhard Elsen, Director for Research and Computing.
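Why doubling the statistics shrinks the uncertainties follows from Poisson counting: the uncertainty on N observed events is √N, so the relative uncertainty goes as 1/√N. A minimal sketch with a hypothetical event count:

```python
import math

# Counting (Poisson) uncertainty on N events is sqrt(N), so the *relative*
# uncertainty is 1/sqrt(N) and shrinks as the sample grows.
def relative_uncertainty(n_events):
    return 1.0 / math.sqrt(n_events)

n_2016 = 10_000                     # hypothetical event count from 2016
n_2017 = 2 * n_2016                 # "double their statistics"
improvement = relative_uncertainty(n_2016) / relative_uncertainty(n_2017)
print(f"uncertainty shrinks by a factor of {improvement:.3f}")   # sqrt(2) ~ 1.414
```

Doubling the dataset therefore tightens the error bars by a factor of √2, regardless of the starting sample size.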

    The LHC physicists are working on two different broad areas: improving their knowledge of known phenomena and probing the unknown. The known phenomena constitute the Standard Model of Particles and Forces, a theory that encompasses all our current knowledge of elementary particles.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    The Higgs boson, discovered in 2012, plays a key role in the Standard Model.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    It is also a scalar particle, fundamentally different from the other elementary particles. In 2017, ATLAS and CMS will continue to work on determining the characteristics of this particle.

    CERN/ATLAS detector

    CERN/CMS Detector

    These two large general-purpose experiments will observe its decay modes and how it interacts with other particles. Their measurements may provide indications of possible new physics beyond the Standard Model. The experiments will also carry out precise measurements of other processes of the Standard Model, in particular those involving the top quark, the elementary particle with the greatest mass.

    Physicists hope to be able to identify disparities between their measurements and the Standard Model. This is one of the ways in which the unknown can be probed. Although it describes many phenomena of the infinitely small precisely, the Standard Model leaves many questions unanswered. For example, it describes only 5% of the universe; the rest is formed of dark matter and dark energy, whose natures are as yet unknown. Any discrepancy with the theory could direct physicists towards a larger theoretical framework of new physics that might resolve the enigmas we face.

    ATLAS, CMS and LHCb measure processes precisely to detect anomalies.

    CERN/LHCb

    ATLAS and CMS are also looking for new particles, such as those predicted by the theory of supersymmetry, which could be the components of dark matter.

    Standard model of Supersymmetry DESY

    LHCb is also interested in the imbalance between matter and antimatter. Both of these would have been created in equal quantities at the time of the Big Bang, but antimatter is now practically absent from the universe. LHCb is tracking the phenomenon known as “charge-parity violation” which is thought to be at least partly responsible for this imbalance.

    No lead ion collisions, which are the ALICE experiment’s specialist subject, are planned at the LHC this year.

    CERN/ALICE Detector

    ALICE will continue its analysis of the 2016 data and will record proton-proton collisions, which will also allow it to study the strong force. On the basis of the proton-proton collisions from 2016, ALICE recently announced that it had observed a state of matter resembling quark-gluon plasma.

    Quark gluon plasma. Duke University

    Quark-gluon plasma is the state of matter that existed a few millionths of a second after the Big Bang.

    Finally, several days of physics running with de-squeezed beams are planned for the TOTEM and ATLAS/ALFA experiments.

    CERN TOTEM

    CERN ATLAS/ALFA

    To find out more about physics at the LHC, you can watch our “Facebook Live” event tomorrow at 4 p.m. CEST [no link provided].

    Received via email, so no link to the article. Sorry.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    CernCourier
    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

    Quantum Diaries

     
  • richardmitnick 8:58 pm on May 16, 2017 Permalink | Reply
    Tags: Alastair Paragas, CERN LHC

    From FIU: “My Internship with CERN” Alastair Paragas 

    FIU bloc

    This post is dedicated to J.L.T. who will prove Loop Quantum Gravity. I hope he sees it.

    Florida International University

    05/15/2017
    Millie Acebal

    Name: Alastair Paragas

    Major: Computer Science (College of Engineering and Computing and Honors College)

    Hometown: Originally from Manila, Philippines; currently living in Homestead, Florida

    Where will you intern? Starting June 19, I will intern at CERN, located in Geneva, Switzerland. CERN is the home of the Large Hadron Collider (LHC), where the Higgs boson was discovered.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    Tim Berners-Lee
    https://www.w3.org/People/Berners-Lee/

    Another great development at CERN was the creation of the World Wide Web, with web pages as accessible documents served over HTTP (HyperText Transfer Protocol), developed by Tim Berners-Lee.

    Though CERN is in Geneva, I will be living in Saint Genis-Pouilly, France, a town on the French side of the Franco-Swiss border; CERN sits on the Swiss side. Luckily, the commute is only 2 miles, and crossing is easy thanks to the relaxed border between the two countries, due largely to CERN’s importance to Europe as a nuclear research facility. As such, I get to cross the border twice a day!

    What do you do there?

    I will be doing research and software engineering work with CERN’s distributed computing and data reporting/analytics team, under the mentorship of Manuel Martin Marquez. I will help ensure that the real-time data collected from the various instruments and devices at CERN doesn’t get lost in transit! I also get to develop software that stores that data for both online transactional and analytical processing workloads.

    How did you get your internship?

    Out of 1,560 complete applications (and more partial applications), I was happy to be chosen as one of three U.S. students among the 33 students selected worldwide.

    I was also lucky to be accepted as an intern at NASA’s Langley Research Center (Virginia), on their autonomous algorithms team under the mentorship of A.J. Narkawicz, working on the DAEDALUS and ICAROUS projects for autonomous unmanned aerial and watercraft systems. Much of this software supports and runs alongside critical systems that operate in modern American airports and air traffic control. However, I chose to turn this down for CERN.

    How does your internship connect back to your coursework?

    The internship connects back to what I learned in Operating Systems, Database, and Survey of Database Systems: managing synchronization between concurrent processes and the lower-level software aspects of a computer; managing data across various data stores; understanding the importance of the various features of a relational database; and knowing when not to use a relational database (cases which are few and far between).

    What about this internship opportunity excites you the most?

    I am looking forward to living in Europe, completely free, for nine weeks! I never thought it would be possible for me to travel around the world in such a capacity – and for that, I am very grateful.

    Coming from a poor background as an immigrant, I would never think it possible to be a citizen of the United States, much less, be able to do things like this.

    What have you learned about yourself?

    I learned that, as always, I am cheap and like to live on the bare minimum. Even in my previous internships, I remember calculating my grocery costs to make sure they were optimal and I wasn’t breaking the budget, even when I could afford the cost. I am already starting to suffer looking at food prices in the local stores here.

    How will this internship help you professionally?

    I expect that, just like in my internships at Wolfram and Apple, I can network with highly intelligent people from diverse fields of study, ranging from physics and mathematics to mechanical engineering and computer science. I am always humbled working with behemoths of their respective fields, living and working on the shoulders of giants.

    What advice do you have for others starting the internship process?

    This is my third internship. I interned at Wolfram during my sophomore year in Waltham, MA, building a research project using Wolfram technologies. I also completed an internship at Apple during my junior year as a software engineer in Cupertino, CA, building real-time streaming and batch data processing and reporting software in Apple’s Internet Software and Services department.

    At our club – the Association for Computing Machinery at FIU – we’ve managed to create a community of highly successful and motivated students doing internships this summer at prestigious companies (all software engineering roles, at companies like Chase, State Farm, Target and MathWorks). We have weekly workshops on machine learning, big data, web/mobile application development, programming languages and many other real-world engineering principles that escape the more academic theory of the computer science/information technology curriculum.

    We also get tons of our members to come to hackathons with us, whether by getting their travel expenses reimbursed or by carpooling! As club officers, we don’t get paid for the services we provide to the club; we’re seriously and passionately committed, and we care about getting as many students as possible to the level of expertise and the careers they want for themselves.

    Anything else you’d care to share?

    On a more personal note, just like everyone else, I have had bouts in my life where I felt I was not accomplishing anything, and I suffered the emotions that come with that. It is important never to place someone on a pedestal while seeing yourself as small. However hard those moments hit, I consider it highly important to re-evaluate, to remind yourself of the value of working harder, and to resist the temptations and vices that can grow out of such emotions and impulses; the idea of not giving up is all the more important.

    Personally, I was able to fight through this by being part of my local Marine Corps DEP (Delayed Entry Program), under the mentorship of Sgt. Ariel Tavarez, where I was able to reflect, get inspired and work through grueling physical exercises with people who have made an impactful change in their lives. Different solutions work for different people, but the one thing that stays true across all of them is to always stay the course.

    See the full article here.


    FIU Campus

    As Miami’s first and only public research university, offering bachelor’s, master’s, and doctoral degrees, FIU is worlds ahead in its service to the academic and local community.

    Designated as a top-tier research institution, FIU emphasizes research as a major component in the university’s mission. The Herbert Wertheim College of Medicine and the School of Computing and Information Sciences’ Discovery Lab are just two of many colleges, schools, and centers that actively enhance the university’s ability to set new standards through research initiatives.

     
  • richardmitnick 3:50 pm on May 16, 2017 Permalink | Reply
    Tags: Blind studies, CERN LHC

    From Symmetry: “The facts and nothing but the facts” 

    Symmetry Mag

    Symmetry

    Artwork by Corinne Mucha

    05/16/17
    Manuel Gnida

    At a recent workshop on blind analysis, researchers discussed how to keep their expectations out of their results.

    Scientific experiments are designed to determine facts about our world. But in complicated analyses, there’s a risk that researchers will unintentionally skew their results to match what they were expecting to find. To reduce or eliminate this potential bias, scientists apply a method known as “blind analysis.”

    Blind studies are probably best known from their use in clinical drug trials, in which patients are kept in the dark about—or blind to—whether they’re receiving an actual drug or a placebo. This approach helps researchers judge whether their results stem from the treatment itself or from the patients’ belief that they are receiving it.

    Particle physicists and astrophysicists do blind studies, too. The approach is particularly valuable when scientists search for extremely small effects hidden among background noise that point to the existence of something new, not accounted for in the current model. Examples include the much-publicized discoveries of the Higgs boson by experiments at CERN’s Large Hadron Collider and of gravitational waves by the Advanced LIGO detector.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles


    Caltech/MIT Advanced aLigo Hanford, WA, USA installation


    Caltech/MIT Advanced aLigo detector installation Livingston, LA, USA

    Cornell SXS, the Simulating eXtreme Spacetimes (SXS) project


    Gravitational waves. Credit: MPI for Gravitational Physics/W.Benger-Zib

    ESA/eLISA the future of gravitational wave research

    “Scientific analyses are iterative processes, in which we make a series of small adjustments to theoretical models until the models accurately describe the experimental data,” says Elisabeth Krause, a postdoc at the Kavli Institute for Particle Astrophysics and Cosmology, which is jointly operated by Stanford University and the Department of Energy’s SLAC National Accelerator Laboratory. “At each step of an analysis, there is the danger that prior knowledge guides the way we make adjustments. Blind analyses help us make independent and better decisions.”

    Krause was the main organizer of a recent workshop at KIPAC that looked into how blind analyses could be incorporated into next-generation astronomical surveys that aim to determine more precisely than ever what the universe is made of and how its components have driven cosmic evolution.

    Black boxes and salt

    One outcome of the workshop was a finding that there is no one-size-fits-all approach, says KIPAC postdoc Kyle Story, one of the event organizers. “Blind analyses need to be designed individually for each experiment.”

    The way the blinding is done needs to leave researchers with enough information to allow a meaningful analysis, and it depends on the type of data coming out of a specific experiment.

    A common approach is to base the analysis on only some of the data, excluding the part in which an anomaly is thought to be hiding. The excluded data is said to be in a “black box” or “hidden signal box.”

    Take the search for the Higgs boson. Using data collected with the Large Hadron Collider until the end of 2011, researchers saw hints of a bump as a potential sign of a new particle with a mass of about 125 gigaelectronvolts. So when they looked at new data, they deliberately quarantined the mass range around this bump and focused on the remaining data instead.

    They used that data to make sure they were working with a sufficiently accurate model. Then they “opened the box” and applied that same model to the untouched region. The bump turned out to be the long-sought Higgs particle.
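The "hidden signal box" procedure can be sketched with a toy spectrum (all numbers invented for illustration): the background model is built on the sidebands only, and the quarantined window is counted only at the end.

```python
import random

random.seed(0)

# Toy invariant-mass spectrum: flat background between 100 and 150 GeV,
# plus a small signal-like bump near 125 GeV.
masses = [random.uniform(100.0, 150.0) for _ in range(5000)]
masses += [random.gauss(125.0, 2.0) for _ in range(150)]

# Blind step: quarantine the window around the suspected bump...
blind_lo, blind_hi = 120.0, 130.0
sidebands = [m for m in masses if not (blind_lo <= m <= blind_hi)]

# ...and validate the background model on the sidebands only
# (a flat background here, so just a density in events per GeV).
bkg_per_gev = len(sidebands) / (50.0 - (blind_hi - blind_lo))

# "Open the box": compare the window's observed count to the prediction.
expected = bkg_per_gev * (blind_hi - blind_lo)
observed = sum(blind_lo <= m <= blind_hi for m in masses)
print(f"expected ~{expected:.0f} background events, observed {observed}")
```

Because the model was frozen before anyone looked inside the window, the excess of observed over expected events cannot have been tuned in, which is the whole point of the quarantine.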

    That worked well for the Higgs researchers. However, as scientists involved with the Large Underground Xenon experiment reported at the workshop, the “black box” method of blind analysis can cause problems if the data you’re expressly not looking at contains rare events crucial to figuring out your model in the first place.

    LUX has recently completed one of the world’s most sensitive searches for WIMPs—hypothetical particles of dark matter, an invisible form of matter that is five times more prevalent than regular matter.

    LUX/Dark matter experiment at SURF

    LUX scientists have done a lot of work to guard LUX against background particles—building the detector in a cleanroom, filling it with thoroughly purified liquid, surrounding it with shielding and installing it under a mile of rock. But a few stray particles make it through nonetheless, and the scientists need to look at all of their data to find and eliminate them.

    For that reason, LUX researchers chose a different blinding approach for their analyses. Instead of using a “black box,” they use a process called “salting.”

    LUX scientists not involved in the most recent LUX analysis added fake events to the data—simulated signals that look just like real ones. Just like the patients in a blind drug trial, the LUX scientists didn’t know whether they were analyzing real or placebo data. Once they completed their analysis, the scientists who did the “salting” revealed which events were false.
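The salting procedure can be sketched as follows (a toy illustration with made-up numbers and a trivial selection cut, not LUX's actual pipeline):

```python
import random

random.seed(42)

# The salting team mixes simulated ("salt") events into the real data and keeps
# a private record of where they went; the analysts never see that record.
real_energies = [random.uniform(1.0, 10.0) for _ in range(1000)]
salt_energies = [random.uniform(1.0, 10.0) for _ in range(20)]

events = real_energies + salt_energies
order = list(range(len(events)))
random.shuffle(order)
blinded_stream = [events[i] for i in order]      # what the analysts work on
secret_key = {pos for pos, i in enumerate(order) if i >= len(real_energies)}

# The analysts apply their full selection to the blinded stream
# (here just an energy cut standing in for a real analysis):
candidates = {pos for pos, e in enumerate(blinded_stream) if e > 9.0}

# Unblinding: the salting team reveals which candidates were fake.
true_candidates = candidates - secret_key
print(f"{len(candidates)} candidates, {len(true_candidates)} after removing salt")
```

Because the analysts cannot tell salt from signal, they must treat every candidate with the same rigor, which is exactly the bias protection the method is meant to provide.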

    A similar technique was used by LIGO scientists, who eventually made the first detection of extremely tiny ripples in space-time called gravitational waves.

    High-stakes astronomical surveys

    The Blind Analysis workshop at KIPAC focused on future sky surveys that will make unprecedented measurements of dark energy and the Cosmic Microwave Background—observations that will help cosmologists better understand the evolution of our universe.

    CMB per ESA/Planck

    ESA/Planck

    Dark energy is thought to be a force that is causing the universe to expand faster and faster as time goes by. The CMB is a faint microwave glow spread out over the entire sky. It is the oldest light in the universe, left over from the time the cosmos was only 380,000 years old.

    To shed light on the mysterious properties of dark energy, the Dark Energy Science Collaboration is preparing to use data from the Large Synoptic Survey Telescope, which is under construction in Chile. With its unique 3.2-gigapixel camera, LSST will image billions of galaxies, the distribution of which is thought to be strongly influenced by dark energy.


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region, northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    “Blinding will help us look at the properties of galaxies picked for this analysis independent of the well-known cosmological implications of preceding studies,” DESC member Krause says. One way the collaboration plans on blinding its members to this prior knowledge is to distort the images of galaxies before they enter the analysis pipeline.
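A catalogue-level analogue of that image distortion (a hypothetical sketch with made-up values, not DESC's actual scheme) is to apply a secret, reversible factor to the measured quantities before they enter the pipeline, undoing it only after all analysis choices are frozen:

```python
import random

# A secret, reproducible distortion applied before analysis; only the blinding
# team knows the seed, so only they can undo it.
def blinding_factor(secret_seed):
    rng = random.Random(secret_seed)
    return 1.0 + rng.uniform(-0.05, 0.05)    # hidden shift of up to +/- 5%

SECRET_SEED = "hypothetical-blinding-key"        # known only to the blinding team
measured_shear = [0.012, 0.015, 0.011, 0.014]    # made-up galaxy-shape values
blinded = [s * blinding_factor(SECRET_SEED) for s in measured_shear]

# ...the analysis pipeline runs on `blinded`, not on `measured_shear`...

# Unblinding restores the true values:
unblinded = [b / blinding_factor(SECRET_SEED) for b in blinded]
print(all(abs(u - s) < 1e-12 for u, s in zip(unblinded, measured_shear)))  # prints True
```

Deriving the factor deterministically from a secret seed means the distortion is exactly reversible at unblinding time without storing the unblinded data anywhere.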

    Not everyone in the scientific community is convinced that blinding is necessary. Blind analyses are more complicated to design than non-blind analyses and take more time to complete. Some scientists participating in blind analyses inevitably spend time looking at fake data, which can feel like a waste.

    Yet others strongly advocate for going blind. KIPAC researcher Aaron Roodman, a particle-physicist-turned-astrophysicist, has been using blinding methods for the past 20 years.

    “Blind analyses have already become pretty standard in the particle physics world,” he says. “They’ll also be crucial for taking bias out of next-generation cosmological surveys, particularly when the stakes are high. We’ll only build one LSST, for example, to provide us with unprecedented views of the sky.”

    See the full article here.


    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 4:03 pm on May 11, 2017 Permalink | Reply
    Tags: CERN LHC

    From FNAL: “New U.S. and CERN agreements open pathways for future projects” 

    FNAL II photo

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    May 11, 2017
    No writer credit found.

    The CMS detector at the Large Hadron Collider at CERN. Photo: CERN

    The U.S. Department of Energy and CERN establish contributions for next-generation experiments and scientific infrastructure located both at CERN and in the United States

    The United States Department of Energy (DOE) and the European Organization for Nuclear Research (CERN) last week signed three new agreements securing a symbiotic partnership for scientific projects based both in the United States and Europe. These new agreements, which follow from protocols signed by both agencies in 2015, outline the contributions CERN will make to the neutrino program hosted by Fermilab in the United States and the U.S. Department of Energy’s contributions to the High-Luminosity Large Hadron Collider upgrade program at CERN.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    Researchers, engineers and technicians at CERN are currently designing detector technology for the U.S. neutrino research program hosted by Fermilab.

    CERN Proto DUNE Maximillian Brice


    Surf-Dune/LBNF Caverns at Sanford


    FNAL DUNE Argon tank at SURF


    FNAL/DUNE Near Site Layout


    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    Neutrinos are nearly massless, neutral particles that interact so rarely with other matter that trillions of them pass through our bodies each second without leaving a trace. These tiny particles could be key to a deeper understanding of our universe, but their unique properties make them very difficult to study. Using intense particle beams and sophisticated detectors, Fermilab currently operates three neutrino experiments (NOvA, MicroBooNE and MINERvA) and has three more in development, including the Deep Underground Neutrino Experiment (DUNE) and two short-baseline experiments on the Fermilab site, one of which will make use of the Italian ICARUS detector, currently being prepared for transport from CERN.

    FNAL/NOvA experiment map

    FNAL/MicrobooNE

    FNAL/MINERvA

    FNAL/ICARUS


    INFN Gran Sasso ICARUS, since moved to FNAL

    The Long Baseline Neutrino Facility will provide the infrastructure needed to support DUNE both on the Fermilab site in Illinois and at the Sanford Underground Research Facility in South Dakota. Together, LBNF/DUNE represent the first international megascience project to be built at a DOE national laboratory.


    Deep science at the frontier of physics

    The first agreement, signed last week, covers CERN’s provision of the first cryostat to house the massive DUNE detectors in South Dakota, a major investment by CERN in the U.S.-hosted neutrino program. This critical piece of technology ensures that the particle detectors can operate at about minus 184 degrees Celsius, the temperature of liquid argon, allowing them to record the traces of neutrinos as they pass through.

    The agreement also formalizes CERN’s support for construction and testing of prototype DUNE detectors. Researchers at CERN are currently working in partnership with Fermilab and other DUNE collaborating institutions to build prototypes for the huge subterranean detectors which will eventually sit a mile underground at the Sanford Underground Research Facility in South Dakota. These detectors will capture and measure neutrinos generated by Fermilab’s neutrino beam located 800 miles away. The prototypes developed at CERN will test and refine new methods for measuring neutrinos, and engineers will later integrate this new technology into the final detector designs for DUNE.

    The agreement also lays out the framework and objectives for CERN’s participation in Fermilab’s Short Baseline Neutrino Program, which is assembling a suite of three detectors to search for a hypothesized new type of neutrino. CERN has been refurbishing the ICARUS detector that originally searched for neutrinos at INFN’s Gran Sasso Laboratory in Italy and will ship it to Fermilab later this spring.

    Gran Sasso LABORATORI NAZIONALI del GRAN SASSO

    More than 1,700 scientists and engineers from DOE national laboratories and U.S. universities work on the Large Hadron Collider (LHC) experiments hosted at CERN. The LHC is the world’s most powerful particle collider, used to discover the Higgs boson in 2012 and now opening new realms of scientific discovery with higher-energy and higher-intensity beams. U.S. scientists, students, engineers and technicians contributed critical accelerator and detector components for the original construction of the LHC and subsequent upgrades, and U.S. researchers continue to play essential roles in the international community that maintains, operates and analyzes data from the LHC experiments.

    The second agreement concerns the next phase of the LHC program, which includes an upgrade of the accelerator to increase its luminosity, a measure of the number of particle collisions per second. Scientists and engineers at U.S. national laboratories and universities are partnering with CERN to design powerful focusing magnets that employ state-of-the-art superconducting technology. The final magnets will be constructed by both American and European industries and then installed inside the LHC tunnel. The higher collision rate enabled by these magnets will help generate the huge amount of data scientists need in order to search for and discover new particles and study extremely rare processes.

    American experts funded by DOE will also contribute to detector upgrades that will enable the ATLAS and CMS experiments to withstand the deluge of particles emanating from the LHC’s high-luminosity collisions. This work is detailed in the third agreement. These upgrades will make the detectors more robust and provide a high-resolution and three-dimensional picture of what is happening when rare particles metamorphose and decay. Fermilab will be a hub of upgrade activity for both the LHC accelerator and the CMS experiment upgrades, serving as the host DOE laboratory for the High-Luminosity LHC Accelerator Upgrade and the CMS Detector Upgrade projects.

    See the full article here.


    FNAL Icon
    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 7:29 am on May 9, 2017 Permalink | Reply
    Tags: CERN LHC

    From CERN: “CERN celebrates completion of Linac 4, its brand new linear particle accelerator” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    CERN

    May 9, 2017
    No writer credit

    No image caption, no image credit

    Linear accelerator 4 (Linac 4) is designed to boost negative hydrogen ions to high energies. It is scheduled to become the source of proton beams for the Large Hadron Collider (LHC) after the long shutdown in 2019-2020.

    Linac 4 will accelerate ions to 160 MeV to prepare them to enter the Proton Synchrotron Booster, which is part of the LHC injection chain. Negative hydrogen ions are pulsed through the accelerator for 400 microseconds at a time.

    Linear accelerators use radiofrequency cavities to charge cylindrical conductors. The ions pass through the conductors, which are alternately charged positive or negative. The conductors behind them push the particles and the conductors ahead of them pull, causing the particles to accelerate. Small quadrupole magnets keep the hydrogen ions in a tight beam. As the particles approach the speed of light, the energy imparted by the conductors goes increasingly into their relativistic mass-energy rather than into additional speed.
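The diminishing return in speed can be made concrete with the relativistic relation between a proton's kinetic energy and its velocity (proton rest energy 938.272 MeV; the 50 MeV and 160 MeV values are the Linac 2 and Linac 4 output energies mentioned in this article):

```python
import math

M_PROTON_MEV = 938.272            # proton rest energy in MeV

def beta(kinetic_mev):
    """Speed, as a fraction of c, of a proton with the given kinetic energy."""
    gamma = 1.0 + kinetic_mev / M_PROTON_MEV      # total energy / rest energy
    return math.sqrt(1.0 - 1.0 / gamma**2)

print(f"Linac 2 ( 50 MeV): v ~ {beta(50):.3f} c")   # ~0.314 c
print(f"Linac 4 (160 MeV): v ~ {beta(160):.3f} c")  # ~0.520 c
# Tripling the kinetic energy increases the speed by far less than a factor
# of three: the remaining energy shows up as relativistic mass-energy.
```

This is why high-energy accelerators quote beam energy rather than beam speed: beyond a few hundred MeV, protons creep ever closer to c while their energy keeps climbing.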

    Linac 4 accelerates negative hydrogen ions, which consist of a hydrogen atom with an additional electron. The ions are stripped of their two electrons during injection from Linac 4 into the Proton Synchrotron Booster to leave only protons. This allows more particles to accumulate in the synchrotron, simplifies injection, reduces beam loss at injection and gives a more brilliant beam.

    Linac 4 is 80 metres long and located 12 metres below ground. The first beams were produced in 2013, and the milestone energy of 50 MeV was reached in 2015. During the long shutdown planned for 2019-20, Linac 4 will replace Linac 2, which currently accelerates protons to 50 MeV. It is an important milestone in the project to increase the luminosity of the LHC over the next decade.

    At a ceremony today, CERN inaugurated its new linear accelerator, Linac 4, the newest addition to its accelerator complex since the Large Hadron Collider (LHC). Linac 4 is due to feed the CERN accelerator complex with particle beams of higher energy, which will allow the LHC to reach higher luminosity by 2021. After an extensive testing period, Linac 4 will be connected to CERN’s accelerator complex during the upcoming long technical shutdown in 2019-20. Linac 4 will replace Linac 2, which has been in service since 1978. It will become the first step in CERN’s accelerator chain, delivering proton beams to a wide range of experiments.

    “We are delighted to celebrate this remarkable accomplishment. Linac 4 is a modern injector and the first key element of our ambitious upgrade programme, leading up to the High-Luminosity LHC. This high-luminosity phase will considerably increase the potential of the LHC experiments for discovering new physics and measuring the properties of the Higgs particle in more detail,” said CERN Director General Fabiola Gianotti.

    “This is an achievement not only for CERN, but also for the partners from many countries who contributed to designing and building this new machine,” said CERN Director for Accelerators and Technology Frédérick Bordry. “Today, we also celebrate and thank the wide international collaboration that led this project, demonstrating once again what can be accomplished by bringing together the efforts of many nations.”

    The linear accelerator is the first essential element of an accelerator chain. In the linear accelerator, the particles are produced and receive the initial acceleration; the density and intensity of the particle beams are also shaped in the linac. Linac 4 is an almost 90-metre-long machine sitting 12 metres below the ground. It took nearly 10 years to build.

    Linac 4 will send negative hydrogen ions (hydrogen atoms carrying an additional electron) to CERN’s Proton Synchrotron Booster (PSB), which further accelerates the ions and strips off their electrons. Linac 4 will bring the beam up to an energy of 160 MeV, more than three times the energy of its predecessor. The increase in energy, together with the use of hydrogen ions, will allow twice the beam intensity to be delivered to the LHC, contributing to an increase in the luminosity of the LHC.

    Luminosity is a parameter indicating the number of particles colliding within a defined amount of time. The peak luminosity of the LHC is planned to be increased by a factor of five by 2025. This will make it possible for the experiments to accumulate about 10 times more data over the period 2025 to 2035 than before. The High-Luminosity LHC will therefore provide more accurate measurements of fundamental particles than today, as well as the possibility of observing rare processes that occur beyond the machine’s present sensitivity level.
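    To put the “10 times more data” claim in rough numbers, here is a back-of-the-envelope sketch. The figures below are commonly quoted round numbers for the LHC programme, used here as assumptions rather than values taken from this article:

    ```python
    # Back-of-the-envelope: integrated luminosity before vs during the HL-LHC era.
    # Round illustrative numbers, not figures from the article.
    pre_hl_total_fb = 300.0  # ~ total expected by the end of LHC Run 3 (fb^-1)
    hl_target_fb = 3000.0    # ~ HL-LHC goal for 2025-2035 (fb^-1)

    ratio = hl_target_fb / pre_hl_total_fb
    print(f"HL-LHC dataset ~ {ratio:.0f}x the pre-HL-LHC total")
    ```

    A five-fold increase in peak luminosity, sustained over a decade of running, is what turns a ~300 fb⁻¹ dataset into a ~3000 fb⁻¹ one.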

    You can download recent photos of Linac 4 here, as well as a set from 2015 (from the Photowalk competition). Video footage will be available later on our web pages: http://press.cern/news

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    CernCourier
    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

    Quantum Diaries

     
  • richardmitnick 6:33 am on May 3, 2017 Permalink | Reply
    Tags: , CERN LHC, , ,   

    From CERN: “The LHC has restarted for its 2017 run” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    CERN

    29 Apr 2017
    Harriet Kim Jarlett

    Final tests were performed in the LHC at the end of April, ready for the restart this weekend (Image: Maximilien Brice/CERN)

    Today, the LHC once again began circulating beams of protons, for the first time this year. This follows a 17-week-long extended technical stop.

    Over the past month, following the completion of the maintenance work that began in December 2016, each machine in the accelerator chain has, in turn, been switched on and checked, culminating this weekend when the Operations team restarted the LHC, the final machine in the chain.

    “It’s like an orchestra, everything has to be timed and working very nicely together. Once each of the parts is working properly, that’s when the beam goes in, in phases from one machine to the next all the way up to the LHC,” explains Rende Steerenberg, who leads the operations group responsible for the whole accelerator complex, including the LHC.

    Each year, the machines shut down over the winter break so that technicians and engineers can perform essential repairs and upgrades, but this year the stop was scheduled to run longer, allowing more complex work to take place. That work included the replacement of a superconducting magnet in the LHC, the installation of a new beam dump in the Super Proton Synchrotron and a massive cable removal campaign.

    Among other things, these upgrades will allow the collider to reach a higher integrated luminosity – the higher the luminosity, the more data the experiments can gather to allow them to observe rare processes.

    “Our aim for 2017 is to reach an integrated luminosity of 45 fb-1 [they reached 40 fb-1 last year] and preferably go beyond. The big challenge is that, while you can increase luminosity in different ways – you can put more bunches in the machine, you can increase the intensity per bunch and you can also increase the density of the beam – the main factor is actually the amount of time you stay in stable beams,” explains Steerenberg.

    In 2016, the machine was able to run with stable beams – beams from which the researchers can collect data – for around 49 per cent of the time, compared to just 35 per cent the previous year. The challenge the team faces this year is to maintain this or (preferably) increase it further.
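    The arithmetic behind these percentages can be made concrete with a toy model: integrated luminosity is roughly the fill-averaged luminosity times the seconds spent in stable beams. Every number below (peak luminosity, fill-averaged fraction, running days) is an illustrative assumption, not a figure from the article:

    ```python
    # Toy integrated-luminosity estimate: L_int = <L> * (seconds in stable beams).
    # 1 fb^-1 = 1e39 cm^-2, so L_int[fb^-1] = <L>[cm^-2 s^-1] * t[s] / 1e39.
    peak_lumi = 1.4e34   # cm^-2 s^-1, roughly the 2016 LHC peak (assumption)
    avg_over_peak = 0.5  # luminosity decays during a fill; fill-averaged fraction (assumption)
    run_days = 160       # approximate proton-physics days in a year (assumption)

    def integrated_fb(stable_beam_fraction):
        seconds = run_days * 86400 * stable_beam_fraction
        return peak_lumi * avg_over_peak * seconds / 1e39

    print(f"35% stable beams: {integrated_fb(0.35):.0f} fb^-1")
    print(f"49% stable beams: {integrated_fb(0.49):.0f} fb^-1")
    ```

    Even with everything else held fixed, raising the stable-beams fraction from 35% to 49% moves the toy estimate from the low 30s to the high 40s of fb⁻¹, which is why time in stable beams is described as the main factor.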

    The team will also be using the 2017 run to test new optics settings – which provide the potential for even higher luminosity and more collisions.

    “We’re changing how we squeeze the beam to its small size in the experiments, initially to the same value as last year, but with the possibility to go to even smaller sizes later, which means we can push the limits of the machine further. With the new SPS beam dump and the improvements to the LHC injector kickers, we can inject more particles per bunch and more bunches, hence more collisions,” he concludes.

    For the first few weeks, only a few bunches of particles will circulate in the LHC while the machine is debugged and validated. The number of bunches will then increase gradually over the coming weeks until there are enough particles in the machine to begin collisions and to start collecting physics data.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    CernCourier
    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

    Quantum Diaries

     
  • richardmitnick 2:25 pm on April 24, 2017 Permalink | Reply
    Tags: , , CERN LHC, Proton-proton collisions, ,   

    From Symmetry: “A tiny droplet of the early universe?” 

    Symmetry Mag

    Symmetry

    04/24/17
    Sarah Charley

    Particles seen by the ALICE experiment hint at the formation of quark-gluon plasma during proton-proton collisions. [ALREADY COVERED WITH AN ARTICLE FROM CERN HERE.]

    Mona Schweizer, CERN

    About 13.8 billion years ago, the universe was a hot, thick soup of quarks and gluons—the fundamental components that eventually combined into protons, neutrons and other hadrons.

    Scientists can produce this primitive particle soup, called the quark-gluon plasma, in collisions between heavy ions. But for the first time physicists on an experiment at the Large Hadron Collider have observed particle evidence of its creation in collisions between protons as well.

    The LHC collides protons during the majority of its run time. This new result, published in Nature Physics by the ALICE collaboration, challenges long-held notions about the nature of those proton-proton collisions and suggests that related phenomena may previously have been missed.

    “Many people think that protons are too light to produce this extremely hot and dense plasma,” says Livio Bianchi, a postdoc at the University of Houston who worked on this analysis. “But these new results are making us question this assumption.”

    Scientists at the LHC and at the US Department of Energy’s Brookhaven National Laboratory’s Relativistic Heavy Ion Collider, or RHIC, have previously created quark-gluon plasma in gold-gold and lead-lead collisions.

    BNL RHIC Campus

    BNL/RHIC Star

    BNL RHIC PHENIX

    CERN/LHC Map

    CERN LHC Tunnel


    CERN LHC

    In the quark-gluon plasma, mid-sized quarks, such as strange quarks, freely roam and eventually bond into bigger, composite particles (similar to the way quartz crystals grow within molten granite as it slowly cools). These hadrons are ejected as the plasma fizzles out and serve as a telltale signature of their soupy origin. ALICE researchers noticed numerous proton-proton collisions emitting strange hadrons at an elevated rate.

    “In proton collisions that produced many particles, we saw more hadrons containing strange quarks than predicted,” says Rene Bellwied, a professor at the University of Houston. “And interestingly, we saw an even bigger gap between the predicted number and our experimental results when we examined particles containing two or three strange quarks.”

    From a theoretical perspective, a proliferation of strange hadrons is not enough to definitively confirm the existence of quark-gluon plasma. Rather, it could be the result of some other unknown processes occurring at the subatomic scale.

    “This measurement is of great interest to quark-gluon-plasma researchers who wonder how a possible QGP signature can arise in proton-proton collisions,” says Urs Wiedemann, a theorist at CERN. “But it is also of great interest for high energy physicists who have never encountered such a phenomenon in proton-proton collisions.”

    Earlier research at the LHC found that the spatial orientation of particles produced during some proton-proton collisions mirrored the patterns created during heavy-ion collisions, suggesting that maybe these two types of collisions have more in common than originally predicted. Scientists working on the ALICE experiment will need to explore multiple characteristics of these strange proton-proton collisions before they can confirm if they are really seeing a miniscule droplet of the early universe.

    “Quark-gluon plasma is a liquid, so we also need to look at the hydrodynamic features,” Bianchi says. “The composition of the escaping particles is not enough on its own.”

    This finding comes from data collected during the first run of the LHC, between 2009 and 2013. More research over the next few years will help scientists determine whether the LHC can really make quark-gluon plasma in proton-proton collisions.

    “We are very excited about this discovery,” says Federico Antinori, spokesperson of the ALICE collaboration. “We are again learning a lot about this extreme state of matter. Being able to isolate the quark-gluon-plasma-like phenomena in a smaller and simpler system, such as the collision between two protons, opens up an entirely new dimension for the study of the properties of the primordial state that our universe emerged from.”

    Other experiments, such as those at RHIC, will provide more information about the observable traits and experimental characteristics of quark-gluon plasma at lower energies, enabling researchers to build a more complete picture of this primordial particle soup.

    “The field makes far more progress by sharing techniques and comparing results than we would be able to with one facility alone,” says James Dunlop, a researcher at RHIC. “We look forward to seeing further discoveries from our colleagues in ALICE.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 11:02 am on March 14, 2017 Permalink | Reply
    Tags: , , CERN LHC, , Vector boson plus jet event   

    From ALCF: “High-precision calculations help reveal the physics of the universe” 

    Argonne Lab
    News from Argonne National Laboratory

    ANL Cray Aurora supercomputer
    Cray Aurora supercomputer at the Argonne Leadership Computing Facility

    MIRA IBM Blue Gene Q supercomputer at the Argonne Leadership Computing Facility

    ALCF

    March 9, 2017
    Joan Koka

    With the theoretical framework developed at Argonne, researchers can more precisely predict particle interactions such as this simulation of a vector boson plus jet event. Credit: Taylor Childers, Argonne National Laboratory

    On their quest to uncover what the universe is made of, researchers at the U.S. Department of Energy’s (DOE) Argonne National Laboratory are harnessing the power of supercomputers to make predictions about particle interactions that are more precise than ever before.

    Argonne researchers have developed a new theoretical approach, ideally suited for high-performance computing systems, that is capable of making predictive calculations about particle interactions that conform almost exactly to experimental data. This new approach could give scientists a valuable tool for describing new physics and particles beyond those currently identified.

    The framework makes predictions based on the Standard Model, the theory that describes the physics of the universe to the best of our knowledge. Researchers are now able to compare experimental data with predictions generated through this framework, to potentially uncover discrepancies that could indicate the existence of new physics beyond the Standard Model. Such a discovery would revolutionize our understanding of nature at the smallest measurable length scales.

    “So far, the Standard Model of particle physics has been very successful in describing the particle interactions we have seen experimentally, but we know that there are things that this model doesn’t describe completely.


    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    We don’t know the full theory,” said Argonne theorist Radja Boughezal, who developed the framework with her team.

    “The first step in discovering the full theory and new models involves looking for deviations with respect to the physics we know right now. Our hope is that there is deviation, because it would mean that there is something that we don’t understand out there,” she said.

    The theoretical method developed by the Argonne team is currently being deployed on Mira, one of the fastest supercomputers in the world, which is housed at the Argonne Leadership Computing Facility, a DOE Office of Science User Facility.

    Using Mira, researchers are applying the new framework to analyze the production of missing energy in association with a jet, a particle interaction of particular interest to researchers at the Large Hadron Collider (LHC) in Switzerland.




    LHC at CERN

    Physicists at the LHC are attempting to produce new particles that are known to exist in the universe but have yet to be seen in the laboratory, such as the dark matter that comprises a quarter of the mass and energy of the universe.


    Dark matter cosmic web and the large-scale structure it forms The Millenium Simulation, V. Springel et al

    Although scientists have no way today of observing dark matter directly — hence its name — they believe that dark matter could leave a “missing energy footprint” in the wake of a collision that could indicate the presence of new particles not included in the Standard Model. These particles would interact very weakly and therefore escape detection at the LHC. The presence of a “jet”, a spray of Standard Model particles arising from the break-up of the protons colliding at the LHC, would tag the presence of the otherwise invisible dark matter.

    In the LHC detectors, however, the production of a particular kind of interaction — called the Z-boson plus jet process — can mimic the same signature as the potential signal that would arise from as-yet-unknown dark matter particles. Boughezal and her colleagues are using their new framework to help LHC physicists distinguish between the Z-boson plus jet signature predicted in the Standard Model from other potential signals.

    Previous attempts using less precise calculations to distinguish the two processes had so much uncertainty that they were simply not useful for being able to draw the fine mathematical distinctions that could potentially identify a new dark matter signal.

    “It is only by calculating the Z-boson plus jet process very precisely that we can determine whether the signature is indeed what the Standard Model predicts, or whether the data indicates the presence of something new,” said Frank Petriello, another Argonne theorist who helped develop the framework. “This new framework opens the door to using Z-boson plus jet production as a tool to discover new particles beyond the Standard Model.”

    Applications for this method go well beyond studies of the Z-boson plus jet. The framework will impact not only research at the LHC, but also studies at future colliders which will have increasingly precise, high-quality data, Boughezal and Petriello said.

    “These experiments have gotten so precise, and experimentalists are now able to measure things so well, that it’s become necessary to have these types of high-precision tools in order to understand what’s going on in these collisions,” Boughezal said.

    “We’re also so lucky to have supercomputers like Mira because now is the moment when we need these powerful machines to achieve the level of precision we’re looking for; without them, this work would not be possible.”

    Funding and resources for this work were previously allocated through the Argonne Leadership Computing Facility’s (ALCF’s) Director’s Discretionary program; the ALCF is supported by the DOE Office of Science’s Advanced Scientific Computing Research program. Support for this work will continue through allocations from the Innovation and Novel Computational Impact on Theory and Experiment (INCITE) program.

    The INCITE program promotes transformational advances in science and technology through large allocations of time on state-of-the-art supercomputers.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition

    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    About ALCF

    The Argonne Leadership Computing Facility’s (ALCF) mission is to accelerate major scientific discoveries and engineering breakthroughs for humanity by designing and providing world-leading computing facilities in partnership with the computational science community.

    We help researchers solve some of the world’s largest and most complex problems with our unique combination of supercomputing resources and expertise.

    ALCF projects cover many scientific disciplines, ranging from chemistry and biology to physics and materials science. Examples include modeling and simulation efforts to:

    Discover new materials for batteries
    Predict the impacts of global climate change
    Unravel the origins of the universe
    Develop renewable energy technologies

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

    Argonne Lab Campus

     
  • richardmitnick 8:51 pm on March 10, 2017 Permalink | Reply
    Tags: , , , CERN LHC, , , , , , The strong force (strong interaction)   

    From Symmetry: “A strength test for the strong force [strong interaction]” 

    Symmetry Mag

    Symmetry

    03/10/17
    Sarah Charley

    Science Saturday

    New research could tell us about particle interactions in the early universe and even hint at new physics.

    Much of the matter in the universe is made up of tiny particles called quarks. Normally it’s impossible to see a quark on its own because they are always bound tightly together in groups. Quarks only separate in extreme conditions, such as immediately after the Big Bang or in the center of stars or during high-energy particle collisions generated in particle colliders.

    Scientists at Louisiana Tech University are working on a study of quarks and the force that binds them by analyzing data from the ATLAS experiment at the LHC. Their measurements could tell us more about the conditions of the early universe and could even hint at new, undiscovered principles of physics.


    ATLAS at the LHC

    The particles that stick quarks together are aptly named “gluons.” Gluons carry the strong force, one of four fundamental forces in the universe that govern how particles interact and behave. The strong force binds quarks into particles such as protons, neutrons and atomic nuclei.

    As its name suggests, the strong force [strong interaction] is the strongest: it’s 100 times stronger than the electromagnetic force (which binds electrons into atoms), 10,000 times stronger than the weak force (which governs radioactive decay), and about 10³⁹ times stronger than gravity (which attracts you to the Earth and the Earth to the sun).

    But this ratio shifts when the particles are pumped full of energy. Just as real glue loses its stickiness when overheated, the strong force carried by gluons becomes weaker at higher energies.
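    This weakening with energy follows from the running of the strong coupling α_s, and can be sketched with the standard one-loop formula. The inputs below (α_s(M_Z) ≈ 0.118 and five active quark flavours) are conventional reference values, not numbers from this article:

    ```python
    import math

    ALPHA_S_MZ = 0.118  # strong coupling at the Z-boson mass (reference input)
    M_Z = 91.19         # Z-boson mass in GeV

    def alpha_s(q_gev, n_flavours=5):
        """One-loop running of the strong coupling from the Z mass to scale q."""
        b0 = (33 - 2 * n_flavours) / (12 * math.pi)
        return ALPHA_S_MZ / (1 + ALPHA_S_MZ * b0 * math.log(q_gev**2 / M_Z**2))

    for q in (10, 91.19, 1500):  # 1500 GeV ~ the reach quoted later in the article
        print(f"alpha_s({q:7.2f} GeV) = {alpha_s(q):.4f}")
    ```

    The coupling falls from roughly 0.17 at 10 GeV to under 0.09 at 1.5 TeV; measurements like the jet-ratio study described below test exactly this trend.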

    “Particles play by an evolving set of rules,” says Markus Wobisch from Louisiana Tech University. “The strength of the forces and their influence within the subatomic world changes as the particles’ energies increase. This is a fundamental parameter in our understanding of matter, yet has not been fully investigated by scientists at high energies.”

    Characterizing the cohesiveness of the strong force is one of the key ingredients to understanding the formation of particles after the Big Bang and could even provide hints of new physics, such as hidden extra dimensions.

    “Extra dimensions could help explain why the fundamental forces vary dramatically in strength,” says Lee Sawyer, a professor at Louisiana Tech University. “For instance, some of the fundamental forces could only appear weak because they live in hidden extra dimensions and we can’t measure their full strength. If the strong force is weaker or stronger than expected at high energies, this tells us that there’s something missing from our basic model of the universe.”

    By studying the high-energy collisions produced by the LHC, the research team at Louisiana Tech University is characterizing how the strong force pulls energetic quarks into encumbered particles. The challenge they face is that quarks are rambunctious and caper around inside the particle detectors. This subatomic soirée involves hundreds of particles, often arising from about 20 proton-proton collisions happening simultaneously. It leaves a messy signal, which scientists must then reconstruct and categorize.

    Wobisch and his colleagues developed a new method to study these rowdy groups of particles, called jets. By measuring the angles and orientations of the jets, he and his colleagues are learning important new information about what transpired during the collisions, more than they could deduce by simply counting the jets.

    The average number of jets produced by proton-proton collisions directly corresponds to the strength of the strong force in the LHC’s energetic environment.

    “If the strong force is stronger than predicted, then we should see an increase in the number of proton-protons collisions that generate three jets. But if the strong force is actually weaker than predicted, then we’d expect to see relatively more collisions that produce only two jets. The ratio between these two possible outcomes is the key to understanding the strong force.”

    After the LHC restarted at higher energy, scientists doubled their energy reach and have now determined the strength of the strong force up to 1.5 trillion electronvolts, which is roughly the average energy of every particle in the universe just after the Big Bang. Wobisch and his team are hoping to double this number again with more data.

    “So far, all our measurements confirm our predictions,” Wobisch says. “More data will help us look at the strong force at even higher energies, giving us a glimpse as to how the first particles formed and the microscopic structure of space-time.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 10:17 am on February 16, 2017 Permalink | Reply
    Tags: , , , CERN LHC, , , , SCOAP³,   

    From The Conversation: “How the insights of the Large Hadron Collider are being made open to everyone” 

    Conversation
    The Conversation

    January 12, 2017 [Just appeared in social media.]
    Virginia Barbour

    CERN CMS Higgs Event
    CERN CMS Higgs Event

    If you visit the Large Hadron Collider (LHC) exhibition, now at the Queensland Museum, you’ll see the recreation of a moment when the scientist who saw the first results indicating discovery of the Higgs boson laments she can’t yet tell anyone.

    It’s a transitory problem for her, lasting as long as it takes for the result to be thoroughly cross-checked. But it illustrates a key concept in science: it’s not enough to do it; it must be communicated.

    That’s what is behind one of the lesser known initiatives of CERN (European Organization for Nuclear Research): an ambitious plan to make all its research in particle physics available to everyone, with a big global collaboration inspired by the way scientists came together to make discoveries at the LHC.

    This initiative is called SCOAP³, the Sponsoring Consortium for Open Access in Particle Physics Publishing, and is now about to enter its fourth year of operation. It’s a worldwide collaboration of more than 3,000 libraries (including six in Australia), key funding agencies and research centres in 44 countries, together with three intergovernmental organisations.

    It aims to make work previously only available to paying subscribers of academic journals freely and immediately available to everyone. In its first three years it has made more than 13,000 articles available.

    Not only are these articles free for anyone to read, but because they are published under a Creative Commons attribution license (CC BY), they are also available for anyone to use in any way they wish, such as to illustrate a talk, pass on to a class of school children, or feed to an artificial intelligence program to extract information from. And these usage rights are enshrined forever.

    The concept of sharing research is not new in physics. Open access to research is now a growing worldwide initiative, including in Australasia. CERN, which runs the LHC, is also where the world wide web was invented in 1989 by Tim Berners-Lee, then a British computer scientist at the laboratory.

    The main purpose of the web was to enable researchers contributing to CERN from all over the world to share documents, including scientific drafts, no matter what computer systems they were using.

    Before the web, physicists had been sharing paper drafts by post for decades, so they were one of the first groups to really embrace the new online opportunities for sharing early research. Today, the pre-press site arxiv.org has more than a million free article drafts covering physics, mathematics, astronomy and more.

    But, with such a specialised field, do these “open access” papers really matter? The short answer is “yes”: downloads have doubled at journals participating in SCOAP³.

    With millions of open access articles now being downloaded across all specialities, there is enormous opportunity for new ideas and collaborations to spring from chance readership. This is an important trend: the concept of serendipity enabled by open access was explored in 2015 in an episode of ABC RN’s Future Tense program.

    Greater than the sum of the parts

    There’s also a bigger picture to SCOAP³’s open access model. Not long ago, the research literature was fragmented. Individual papers and the connections between them were only as good as the physical library, with its paper journals, that academics had access to.

    Now we can do searches in much less time than we spend thinking of the search question, and the results we are presented with are crucially dependent on how easily available the findings themselves are. And availability is not just a function of whether an article is free or not but whether it is truly open, i.e. connected and reusable.

    One concept is whether research is “FAIR”, or Findable, Accessible, Interoperable and Reusable. In short, can anyone find, read, use and reuse the work?

    The principle is most advanced for data, but in Australia work is ongoing to apply it to all research outputs. This approach was also proposed at the November 2016 meeting of the G20 Science, Technology and Innovation Ministers Meeting. Research findings that are not FAIR can, effectively, be invisible. It’s a huge waste of millions of taxpayer dollars to fund research that won’t be seen.

    There is an even bigger picture that research and research publications have to fit into: that of science in society.

    Across the world we see politicians challenging accepted scientific norms. Is the fact that most academic research remains available only to those who can pay to see it contributing to an acceptance of such misinformed views?

    If one role for science is to inform public debate, then restricting access to that science will necessarily hinder any informed public debate. Although no one suggests that most readers of news sites will regularly want to delve into the details of papers in high energy physics, open access papers are 47% more likely to end up being cited in Wikipedia, which is a source that many non-scientists do turn to.

    Even worse, work that is not available openly now may not even be available in perpetuity, something that is being discussed by scientists in the USA.

    So in the same way that CERN itself is an example of the power of international collaboration to ask some of the fundamental scientific questions of our time, SCOAP³ provides a way to ensure that the answers, whatever they are, are available to everyone, forever.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The Conversation US launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors work with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues. And hopefully allow for a better quality of public discourse and conversation.

     