Tagged: Accelerator Science

  • richardmitnick 12:12 pm on October 30, 2014
    Tags: Accelerator Science

    From FNAL: “Frontier Science Result: CDF – A charming result”


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Thursday, Oct. 30, 2014
    Diego Tonelli and Andy Beretvas

    Physicists gave funny names to the heavy quark cousins of those that make up ordinary matter: charm, strange, bottom, top. The Standard Model predicts that the laws governing the decays of strange, charm and bottom quarks differ if particles are replaced with antiparticles and observed in a mirror. This difference, CP violation in particle physics lingo, has been established for strange and bottom quarks. But for charm quarks the differences are so tiny that no one has observed them so far. Observing differences larger than predictions could provide much sought-after indications of new phenomena.

    [Image: The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.]

    A team of CDF scientists searched for these tiny differences by analyzing millions of charm-particle decays into pairs of charged kaons and pions, sifting through roughly a thousand trillion proton-antiproton collisions from the full CDF Run II data set. They studied CP violation by looking at whether the difference between the numbers of charm and anticharm decays occurring in each chunk of decay time varies with decay time itself.

    The results have a tiny uncertainty (two parts per thousand) but do not show any evidence for CP violation, as shown in the upper figure. The small residual decay asymmetry, which is constant in decay time, is due to the asymmetric layout of the detector. The combined result of charm decays into a pair of kaons and a pair of pions is the CP asymmetry parameter AΓ, which is equal to -0.12 ± 0.12 percent. The results are consistent with the current best determinations. Combined with them, they will improve the exclusion constraints on the presence of new phenomena in nature.
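
    As a rough illustration of the technique (a toy sketch with invented numbers, not the CDF analysis code), one can bin decays in decay time, form the asymmetry bin by bin, and fit a straight line: a slope consistent with zero means no time-dependent CP violation, while the intercept absorbs the constant detector-induced asymmetry mentioned above.

```python
import numpy as np

# Toy sketch (invented numbers, not CDF code): bin D and anti-D decays in
# decay time, form the asymmetry per bin, and fit a straight line.
rng = np.random.default_rng(42)
tau = 0.41  # approximate D0 lifetime in picoseconds

t_edges = np.linspace(0.0, 4 * tau, 9)        # decay-time bins
t_mid = 0.5 * (t_edges[:-1] + t_edges[1:])

# Hypothetical per-bin counts following an exponential decay distribution
n_d    = rng.poisson(1e6 * np.exp(-t_mid / tau))
n_dbar = rng.poisson(1e6 * np.exp(-t_mid / tau))

asym = (n_d - n_dbar) / (n_d + n_dbar)        # per-bin asymmetry
err  = 1.0 / np.sqrt(n_d + n_dbar)            # approximate statistical error

# Weighted linear fit: A(t) = a0 + a1 * (t / tau); a1 ~ 0 means no
# time-dependent CP violation, a0 absorbs the detector asymmetry.
coeffs, cov = np.polyfit(t_mid / tau, asym, deg=1, w=1.0 / err, cov=True)
print(f"slope = {coeffs[0]:.5f} +/- {np.sqrt(cov[0, 0]):.5f}")
```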

    [Image: These plots show the effective lifetime asymmetries as a function of decay time for D → K+K− (top) and D → π+π− (bottom) samples. Results of the fits not allowing for (dotted red line) and allowing for (solid blue line) CP violation are overlaid.]

    See the full article here.

    [Image: Fermilab Campus]

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.

  • richardmitnick 11:58 am on October 30, 2014
    Tags: Accelerator Science

    From LC Newsline: “The future of Higgs physics” 

    [Image: Linear Collider Collaboration]

    30 October 2014
    Joykrit Mitra

    In 2012, the ATLAS and CMS experiments at CERN’s Large Hadron Collider announced the discovery of the Higgs boson. The Higgs was expected to be the final piece of the particular jigsaw that is the Standard Model of particle physics, and its discovery was a monumental event.

    [Image: Event recorded with the CMS detector in 2012 at a proton-proton centre-of-mass energy of 8 TeV. The event shows characteristics expected from the decay of the SM Higgs boson to a pair of photons (dashed yellow lines and green towers). Credit: L. Taylor, CMS collaboration/CERN]

    [Image: The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.]

    [Image: CERN ATLAS]

    [Image: CERN CMS]

    But the Higgs requires more precise study than the LHC is able to provide. That is why, years earlier, a machine like the International Linear Collider had been envisioned as a Higgs factory, and the Higgs discovery set the stage for its possible construction.

    [Image: ILC schematic]

    Over the years, instruments for probing the universe have become more sophisticated. More refined data has hinted that aspects of the Standard Model are incomplete. If built, a machine such as the ILC will help reveal how wide a gulf there is between the universe and our understanding of it by probing the Higgs to unprecedented levels. And perhaps, as some physicists think, it will uproot the Standard Model and make way for an entirely new physics.

    In the textbook version, the Higgs boson is a single particle, and its alleged progenitor, the mysterious Higgs field that pervades every point in the universe, is a single field. But this theory is still to be tested.

    “We don’t know whether the Higgs field is one field or many fields,” said Michael Peskin of SLAC’s Theoretical Physics Group. “We’re just now scratching the surface at the LHC.”

    The LHC collides proton beams together, and the collision environment is not a clean one. Protons are made up of quarks and gluons, and in an LHC collision it’s really these many component parts – not the larger proton – that interact. During a collision, there are simply too many components in the mix to determine the initial energies of each one. Without knowing them, it’s not possible to precisely calculate properties of the particles generated from the collision. Furthermore, Higgs events at the LHC are exceptionally rare, and there is so much background that the amount of data that scientists have to sift through to glean information on the Higgs is astronomical.

    “There are many ways to produce an event that looks like the Higgs at the LHC,” Peskin said. “Lots of other things happen that look exactly like what you’re trying to find.”

    The ILC, on the other hand, would collide electrons and positrons, which are themselves fundamental particles with no component parts. Scientists would know their precise initial energy states, and there would be significantly fewer distractions from the measurement standpoint. The ILC is designed to be able to accelerate particle beams up to energies of 250 billion electronvolts, extendable eventually to 500 billion electronvolts. The higher the particles’ energies, the larger the number of Higgs events. It’s the best possible scenario for probing the Higgs.
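
    The kinematics behind that last point is simple (a worked aside using the beam energies quoted above, not text from the article): in a symmetric head-on collider, the collision energy is just twice the beam energy.

```latex
% Head-on e+e- collisions, beam masses negligible:
\sqrt{s} = 2E_{\text{beam}}, \qquad
E_{\text{beam}} = 250~\text{GeV} \;\Rightarrow\; \sqrt{s} = 500~\text{GeV},
% comfortably above the Higgs-production threshold for e+e- -> ZH,
% \sqrt{s} \gtrsim m_Z + m_H \approx 91 + 125 = 216~\text{GeV}.
```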

    If the ILC is built, physicists will first want to test whether the Higgs particle discovered at the LHC indeed has the properties predicted by the Standard Model. To do this, they plan to study Higgs couplings with known subatomic particles. The higher a particle’s mass, the proportionally stronger its coupling ought to be with the Higgs boson. The ILC will be sensitive enough to detect and accurately measure Higgs couplings with light particles, for instance with charm quarks. Such a coupling can be detected at the LHC in principle but is very difficult to measure accurately.
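
    In the Standard Model that proportionality is explicit: a fermion’s Higgs coupling is set by its mass and the Higgs field’s vacuum expectation value, v ≈ 246 GeV. A back-of-the-envelope comparison (standard textbook values, not figures from the article) shows why the charm coupling is such a delicate target:

```latex
y_f = \frac{\sqrt{2}\,m_f}{v}, \qquad
y_c \approx \frac{\sqrt{2}\times 1.27~\text{GeV}}{246~\text{GeV}} \approx 0.007,
\qquad
y_b \approx \frac{\sqrt{2}\times 4.18~\text{GeV}}{246~\text{GeV}} \approx 0.024.
```

    The charm coupling is roughly three times smaller than the bottom coupling, which is part of why it is so hard to pin down at the LHC.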

    The ILC can also help measure the exact lifetime of the Higgs boson. The more particles the Higgs couples to, the faster it decays and disappears. A difference between the measured lifetime and the projected lifetime, calculated from the Standard Model, could reveal what fraction of possible particles, or of the Higgs’ interactions with them, we’ve actually discovered.
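
    The reasoning can be made quantitative (standard values, not numbers from the article): the lifetime is the inverse of the total decay width, and every decay channel, seen or unseen, adds to that width.

```latex
\tau = \frac{\hbar}{\Gamma_{\text{tot}}}, \qquad
\Gamma_{\text{tot}} = \sum_i \Gamma_i, \qquad
\Gamma^{\text{SM}}_{\text{tot}} \approx 4.1~\text{MeV}
\;\Rightarrow\;
\tau \approx \frac{6.6\times10^{-22}~\text{MeV·s}}{4.1~\text{MeV}}
\approx 1.6\times10^{-22}~\text{s}.
```

    A measured lifetime shorter than this Standard Model value would point to extra, undetected decay modes.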

    “Maybe the Higgs interacts with something new that is very hard to detect at a hadron collider, for example if it cannot be observed directly, like neutrinos,” speculated John Campbell of Fermilab’s Theoretical Physics Department.

    These investigations could yield some surprises. Unexpected deviations in the measurements could point to as-yet-undiscovered particles, which in turn would indicate that the Standard Model is incomplete. The Standard Model also makes predictions for the coupling between two Higgs bosons, and physicists hope to study this as well to check whether there are indeed multiple kinds of Higgs particles.

    “It could be that the Higgs boson is only a part of the story, and it has explained what’s happened at colliders so far,” Campbell said. “The self-coupling of the Higgs is there in the Standard Model to make it self-consistent. If not the Higgs, then some other thing has to play that role that self-couplings play in the model. Other explanations could also provide dark matter candidates, but it’s all speculation at this point.”

    [Image: 3D plot showing how dark matter distribution in our universe has grown clumpier over time. Credit: NASA, ESA, R. Massey, California Institute of Technology]

    The Standard Model has been very self-consistent so far, but some physicists think it isn’t entirely valid. It ignores the universe’s accelerating expansion caused by dark energy, as well as the mysterious dark matter that still allows matter to clump together and galaxies to form. There is speculation about the existence of undiscovered mediator particles that might be exchanged between dark matter and the Higgs field. The Higgs particle could be a likely gateway to this unknown physics.

    With the LHC set to be operational again next year, an optimistic possibility is that a new particle or two might be dredged out of trillions of collision events in the near future. If built, the ILC would be able to build on such discoveries, just as in the case of the Higgs boson, and provide a platform for more precise investigation.

    The collaboration between a hadron collider like the LHC and an electron-positron collider of the scale of the ILC could uncover new territories to be explored and help map them with precision, making particle physics that much richer.

    See the full article here.

    The Linear Collider Collaboration is an organisation that brings the two most likely candidates, the Compact Linear Collider Study (CLIC) and the International Linear Collider (ILC), together under one roof. Headed by former LHC Project Manager Lyn Evans, it strives to coordinate the research and development work that is being done for accelerators and detectors around the world and to take the linear collider project to the next step: a decision that it will be built, and where.

    Some 2000 scientists – particle physicists, accelerator physicists, engineers – are involved in the ILC or in CLIC, and often in both projects. They work on state-of-the-art detector technologies, new acceleration techniques, the civil engineering aspect of building a straight tunnel of at least 30 kilometres in length, a reliable cost estimate and many more aspects that projects of this scale require. The Linear Collider Collaboration ensures that synergies between the two friendly competitors are used to the maximum.

  • richardmitnick 2:56 pm on October 28, 2014
    Tags: Accelerator Science

    From FNAL: “Mu2e moves ahead” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Tuesday, Oct. 28, 2014
    Fermilab Director Nigel Lockyer wrote this column.

    In continued alignment with goals laid out in the P5 report, we’re making progress on our newest muon experiment, Mu2e. A four-day DOE Critical Decision 2/3b review of the experiment concluded Friday. The review went extremely well and validated the design, technical progress, and the cost and schedule of the project. The reviewers praised the depth and breadth of our staff’s excellent technical work and preparation. Official sign-off for CD-2/3b is expected in the next several months, followed by construction on the Mu2e building in early 2015. Construction on the transport solenoid modules should begin in the spring. The experiment received CD-0 approval in 2009 and CD-1 approval in 2012 and is slated to start up in 2020.

    Named for the muon-to-electron conversion that researchers hope to observe, Mu2e is a crucial stepping stone on our journey beyond the Standard Model and in the hunt for new physics. It will be 10,000 times more sensitive than previous attempts to observe that transition.

    [Image: The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.]

    Experimenters will use a series of superconducting magnets to separate muons from other particles, guiding them to a stopping target. After the muons have been captured by aluminum nuclei, a very small number are expected to transform into only an electron rather than the typical decay into an electron and two neutrinos. It’s a change so rare, theorists liken it to finding a penny with a scratch on Lincoln’s head hidden in a stack of pristine pennies so tall that the stack stretches from the Earth to Mars and back again 130 times.

    The experiment will provide insight into how and why particles within one family change into others. It might also help narrow down theories about how the universe works and provide insight into data coming out of the LHC. Discovery of the muon-to-electron conversion would hint at undiscovered particles or forces and potentially illuminate a grand unification theory — not bad for a 75-foot-long experiment.

    Many months of hard work preceded last week’s review. Thank you to all who were involved in helping to move this important experiment forward.

    See the full article here.

    [Image: Fermilab Campus]

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.

  • richardmitnick 6:36 pm on October 24, 2014
    Tags: Accelerator Science

    From Nautilus: “Who Really Found the Higgs Boson” 

    Nautilus

    October 23, 2014
    By Neal Hartman
    Illustration by Owen Freeman; additional stock photos

    To those who say that there is no room for genius in modern science because everything has been discovered, Fabiola Gianotti has a sharp reply. “No, not at all,” says the former spokesperson of the ATLAS Experiment, the largest particle detector at the Large Hadron Collider at CERN. “Until the fourth of July, 2012 we had no proof that nature allows for elementary scalar fields. So there is a lot of space for genius.”

    [Image: ATLAS]

    [Image: The LHC at CERN]

    She is referring to the discovery of the Higgs boson two years ago—potentially one of the most important advances in physics in the past half century. It is a manifestation of the eponymous field that permeates all of space, and completes the Standard Model of physics: a sort of baseline description for the existence and behavior of essentially everything there is.

    By any standards, it is an epochal, genius achievement.

    What is less clear is who, exactly, the genius is. An obvious candidate is Peter Higgs, who postulated the Higgs boson, as a consequence of the Brout-Englert-Higgs mechanism, in 1964. He was awarded the Nobel Prize in 2013 along with Francois Englert (Englert and his deceased colleague Robert Brout arrived at the same result independently). But does this mean that Higgs was a genius? Peter Jenni, one of the founders and the first “spokesperson” of the ATLAS Experiment Collaboration (one of the two experiments at CERN that discovered the Higgs particle), hesitates when I ask him the question.

    “They [Higgs, Brout and Englert] didn’t think they [were working] on something as grandiose as [Einstein’s relativity],” he states cautiously. The spontaneous symmetry breaking leading to the Higgs “was a challenging question, but [Albert Einstein] saw something new and solved a whole field. Peter Higgs would tell you, he worked a few weeks on this.”

    What, then, of the leaders of the experimental effort, those who directed billions of dollars in investment and thousands of physicists, engineers, and students from almost 40 countries for over three decades? Surely there must have been a genius mastermind directing this legion of workers, someone we can single out for his or her extraordinary contribution.

    “No,” says Gianotti unequivocally, which is rare for a physicist, “it’s completely different. The instruments we have built are so complex that inventiveness and creativity manifests itself in the day-by-day work. There are an enormous amount of problems that require genius and creativity to be spread over time and over many people, and all at the same level.”

    Scientific breakthroughs often seem to be driven by individual genius, but this perception belies the increasingly collaborative nature of modern science. Perhaps nothing captures this dichotomy better than the story of the Higgs discovery, which presents a stark contrast between the fame awarded to a few on the one hand, and the institutionalized anonymity of the experiments that made the discovery possible on the other.

    An aversion to the notion of exceptional individuals is deeply rooted within the ATLAS collaboration, a part of its DNA. Almost all decisions in the collaboration are approved by representative groups, such as the Institute Board, the Collaboration Board, and a plethora of committees and task forces. Consensus is the name of the game. Even the effective CEO, a role Gianotti occupied from 2009 to 2013, is named the “Spokesperson.” She spoke for the collaboration, but did not command it.

    Collectivity is crucial to ATLAS in part because it avoids the celebration of star personalities, so that the masses of physicists in the collaboration each feel they own the research in some way. Almost 3,000 people qualify as authors on the key physics papers ATLAS produces, and the author list can take almost as many pages as the paper itself.

    [Image: The genius of crowds: Particle physics collaborations can produce academic papers with hundreds of authors. One 2010 paper was 40 pages long, with 10 pages devoted to the authors list.]

    On a more functional level, this collectivity also makes it easier to guard against bias in interpreting the data. “Almost everything we do is meant to reduce potential bias in the analysis,” asserts Kerstin Tackmann, a member of the Higgs to Gamma Gamma analysis group during the time of the Higgs discovery, and recent recipient of the Young Scientist Prize in Particle Physics. Like many physicists, Tackmann verges on the shy, and speaks with many qualifications. But she becomes more forceful when conveying the importance of eliminating bias.

    “We don’t work with real data until the very last step,” she explains. After the analysis tools—algorithms and software, essentially—are defined, they are applied to real data, a process known as the unblinding. “Once we look at the real data,” says Tackmann, “we’re not allowed to change the analysis anymore.” To do so might inadvertently create bias, by tempting the physicists to tune their analysis tools toward what they hope to see, in the worst cases actually creating results that don’t exist. The ability of the precocious individual physicist to suggest a new data cut or filter is restricted by this procedure: He or she wouldn’t even see real data until late in the game, and every analysis is vetted independently by multiple other scientists.
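
    In code, the discipline Tackmann describes might look something like the following minimal sketch (our illustration, with invented cut names and values, not actual ATLAS software): the selection is tuned on simulation, frozen, and only then applied to real data.

```python
from dataclasses import dataclass

@dataclass(frozen=True)               # frozen: the cuts cannot be edited later
class Selection:
    min_photon_pt: float              # GeV, invented threshold for illustration
    max_abs_eta: float

def passes(event: dict, cuts: Selection) -> bool:
    return (event["photon_pt"] > cuts.min_photon_pt
            and abs(event["eta"]) < cuts.max_abs_eta)

def tune_on_simulation(simulated_events) -> Selection:
    # Optimize thresholds using simulation only; real data are never seen here.
    return Selection(min_photon_pt=25.0, max_abs_eta=2.37)

def unblind(real_events, cuts: Selection):
    # The first and only look at real data; the analysis is immutable from here.
    return [e for e in real_events if passes(e, cuts)]
```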

    This collective discipline is one way that ATLAS tames the complexity of the data it produces, which in raw form is voluminous enough to fill a stack of DVDs that reaches from the earth to the moon and back again, 10 times every year. The data must be reconstructed into something that approximates an image of individual collisions in time and space, much like the processing required for raw output from a digital camera.

    But the identification of particles from collisions has become astoundingly more complex since the days of “scanning girls” and bubble chamber negatives, where actual humans sat over enlarged images of collisions and identified the lines and spirals as different particles. Experimentalists today need to have expert knowledge of the internal functioning of the different detector subsystems: pixel detector, silicon strip tracker, transition radiation tracker, muon system, and calorimeters, both hadronic and electromagnetic. Adjustments made to each subsystem’s electronics, such as gain or threshold settings, might cause the absence or inclusion of what looks like real data but isn’t. Understanding what might cause false or absent signals, and how they can be accounted for, is the most challenging and creative part of the process. “Some people are really clever and very good at this,” says Tackmann.

    The process isn’t static, either. As time goes on, the detector changes from age and radiation damage. The process of perfecting the detector’s software is never-ending, and the human requirements are enormous: roughly 100 physicists were involved in the analysis of a single and relatively straightforward particle signature, the decay of the Higgs into two photons. The overall Higgs analysis was performed by a team of more than 600 physicists.

    The depth and breadth of this effort transform the act of discovery into something anonymous and distributed—and this anonymity has been institutionalized in ATLAS culture. Marumi Kado, a young physicist with tousled hair and a quiet zen-like speech that borders on a whisper, was one of the conveners of the “combined analysis” group that was responsible for finally reaching the level of statistical significance required to confirm the Higgs discovery. But, typically for ATLAS, he downplays the importance of the statistical analysis—the last step—in light of the complexity of what came before. “The final analysis was actually quite simple,” he says. “Most of the [success] lay in how you built the detector, how well you calibrated it, and how well it was designed from the very beginning. All of this took 25 years.”

    The deeply collaborative work model within ATLAS meant that it wasn’t enough for it to innovate in physics and engineering—it also needed to innovate its management style and corporate culture. Donald Marchand, a professor of strategy execution and information management at IMD Business School in Lausanne, describes ATLAS as following a collaborative mode of working that flies in the face of standard “waterfall”—or top-down—management theory.

    Marchand conducted a case study on ATLAS during the mid-2000s, finding that the ATLAS management led with little or no formal authority. Most people in the collaboration work directly “for” someone who is in no way related to their home institute, which actually writes their paycheck. For example, during the construction phase, the project leader of the ATLAS pixel detector, one of its most data-intensive components, worked for a U.S. laboratory in California. His direct subordinate, the project engineer, worked for an institute in Italy. Even though he was managing a critical role in the production process, the project leader had no power to promote, discipline, or even formally review the project engineer’s performance. His only recourse was discussion, negotiation, and compromise. ATLAS members are more likely to feel that they work with someone, rather than for them.

    Similarly, funding came from institutes in different countries through “memorandums of understanding” rather than formal contracts. The collaboration’s spokesperson and other top managers were required to follow a policy of stewardship, looking after the collaboration rather than directing it. If collaboration members were alienated, that could mean the loss of the financial and human capital they were investing. Managers at all levels needed to find non-traditional ways to provide feedback, incentives, and discipline to their subordinates.

    The coffee chat was one way to do this, and became the predominant way to conduct the little daily negotiations that kept the collaboration running. Today there are cafés stationed all around CERN, and they are full from morning to evening with people having informal meetings. Many physicists can be seen camped out in the cafeteria for hours at a time, working on their laptops between appointments. ATLAS management also created “a safe harbor, a culture within the organization that allows [employees] to express themselves and resolve conflicts and arguments without acrimony,” Marchand says.

    The result is a management structure that is remarkably effective and flexible. ATLAS managers consistently scored in the top 5 percent of a benchmark scale that measures how they control, disseminate, and capitalize on the information capital in their organization. Marchand also found that the ATLAS management structure was effective at adapting to changing circumstances, temporarily switching to a more top-down paradigm during the core production phase of the experiment, when thousands of identical objects needed to be produced on assembly lines all over the world.

    This collaborative culture didn’t arise by chance; it was built into ATLAS from the beginning, according to Marchand. The original founders infused a collaborative ethic into every person that joined by eschewing personal credit, talking through conflicts face to face, and discussing almost everything in open meetings. But that ethic is codified nowhere; there is no written code of conduct. And yet it is embraced, almost religiously, by everyone that I spoke with.

    Collaboration members are skeptical of attributing individual credit to anything. Every paper includes the entire author list, and all of ATLAS’s outreach material is signed “The ATLAS Collaboration.” People are suspicious of those who are perceived to take too much personal credit in the media. One famous member of the collaboration (a former rock star and host of the highly successful BBC series Horizon) is looked upon dubiously by many, who see him as drawing too much attention to himself through his association with the experiment.

    [Image: MIND THE GAP: Over 60 institutes collaborated to build and install a new detector layer inside a 9-millimeter gap between the beam pipe (the evacuated pipe inside of which protons circulate) and the original detector. ATLAS Experiment © 2014 CERN]

    In searching for genius at ATLAS, and other experiments at CERN, it seems almost impossible to point at anything other than the collaborations themselves. More than any individual, including the theorists who suggest new physics and the founders of experimental programs, it is the collaborations that reflect the hallmarks of genius: imagination, persistence, open-mindedness, and accomplishment.

    The results speak for themselves: ATLAS has already reached its first key objective in just one-tenth of its projected lifetime, and continues to evolve in a highly collaborative way. This May, one of the first upgrades to the detector was installed. Called the Insertable B-Layer (IBL), it grew out of a task force formed near the end of ATLAS’s initial commissioning period, in 2008, with the express goal of documenting why inserting another layer of detector into a 9-millimeter clearance space just next to the beam pipe was considered impossible.

    Consummate opportunists, the task force members instead came up with a design that quickly turned into a new subproject. And though it’s barely larger than a shoebox, the IBL’s construction involved more than 60 institutes all over the world, because everyone wanted to be involved in this exciting new thing. When it came time to slide the Insertable B-layer sub-detector into its home in the heart of ATLAS earlier this year, with only a fraction of a millimeter of clearance over 7 meters in length, the task was accomplished in just two hours—without a hitch.

    Fresh opportunities for new genius abound. Gianotti singles out dark matter as an example. “96 percent of the universe is dark. We don’t know what it’s made of and it doesn’t interact with our instruments. We have no clue,” she says. “So there is a lot of space for genius.” But instead of coming from the wild-haired scientist holding a piece of chalk or tinkering in the laboratory, that genius may come from thousands of people working together.

    Neal Hartman is a mechanical engineer at Lawrence Berkeley National Laboratory who has been working with the ATLAS collaboration at CERN for almost 15 years. He spends much of his time on outreach and education in both physics and general science, including running CineGlobe, a science-inspired film festival at CERN.

    See the full article, with notes, here.

  • richardmitnick 4:39 pm on October 23, 2014
    Tags: Accelerator Science

    From FNAL: “Physics in a Nutshell – Unparticle physics” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Thursday, Oct. 23, 2014
    Jim Pivarski

    The first property of matter that was known to be quantized was not a surprising one like spin — it was mass. That is, mass only comes in multiples of a specific value: The mass of five electrons is 5 times 511 keV. A collection of electrons cannot have 4.9 or 5.1 times this number — it must be exactly 4 or exactly 6, and this is a quantum mechanical effect.

    We don’t usually think of mass quantization as quantum mechanical because it isn’t weird. We sometimes imagine electrons as tiny balls, all alike, each with a mass of 511 keV. While this mental image could make sense of the quantization, it isn’t correct since other experiments show that an electron is an amorphous wave or cloud. Individual electrons cannot be distinguished. They all melt together, and yet the mass of a blob of electron-stuff is always a whole number.

    The quantization of mass comes from a wave equation — physicists assume that electron-stuff obeys this equation, and when they solve the equation, it has only solutions with mass in integer multiples of 511 keV. Since this agrees with what we know, it is probably the right equation for electrons. However, there might be other forms of matter that obey different laws.

    [Image: a fractal]
    One alternative would be to obey a symmetry principle known as scale invariance. Scale invariance is a property of fractals, like the one shown above, in which the same drawing is repeated within itself at smaller and smaller scales. For matter, scale invariance is the property that the energy, momentum and mass of a blob of matter can be scaled up equally. Normal particles like electrons are not scale-invariant because the energy can be scaled by an arbitrary factor, but the mass is rigidly quantized.
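
    Stated compactly (our paraphrase of the argument, not a formula from the article):

```latex
% Scale invariance: physics unchanged under the simultaneous rescaling
(E,\ \vec{p},\ m) \;\longrightarrow\; (\lambda E,\ \lambda\vec{p},\ \lambda m),
\qquad \lambda > 0.
% An electron instead obeys E^2 = |\vec{p}|^2 c^2 + m^2 c^4 with m pinned
% at 511 keV/c^2, so E and p cannot be rescaled without rescaling m:
% quantized mass breaks scale invariance.
```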

    It is theoretically possible that another type of matter, dubbed “unparticles,” could satisfy scale invariance. In a particle detector, unparticles would look like particles with random masses. One unparticle decay might have many times the apparent mass of the next — the distribution would be broad.

    Another feature of unparticles is that they don’t interact strongly with the familiar Standard Model particles, but they interact more strongly at higher energies. Therefore, they would not have been produced in low-energy experiments, but could be discovered in high-energy experiments.

    [Image: The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.]

    Physicists searched for unparticles using the 7- and 8-TeV collisions produced by the LHC in 2011 and 2012, and they found nothing. This tightens the limits, narrowing the range of parameters the theory can have, but it does not completely rule the theory out. Next spring, the LHC is scheduled to start up at an energy of 13 TeV, which will provide a chance to test the theory more thoroughly. Perhaps the next particle to be discovered is not a particle at all.

    [Image: The LHC tunnel]

    See the full article here.

    [Image: Fermilab Campus]

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.

  • richardmitnick 1:08 pm on October 22, 2014
    Tags: Accelerator Science

    From FNAL: “From the Office of Campus Strategy and Readiness – Building the future of Fermilab” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Wednesday, Oct. 22, 2014
    Randy Ortgiesen, head of OCSR, wrote this column.

    As Fermilab and the Department of Energy continue to aggressively “make ready the laboratory” for implementing P5’s recommendations, I can’t help reflecting on all that has recently been accomplished to support the lab’s future — both less visible projects and the big stuff. As we continue to build on these accomplishments, it’s worth noting their breadth and how much headway we’ve made.

    The development of the Muon Campus is proceeding at a healthy clip. Notable in its progress is the completion of the MC-1 Building and the cryogenic systems that support the Muon g-2 experiment. The soon-to-launch beamline enclosure construction project and the soon-to-follow Mu2e building are also significant. And none of this could operate without the ongoing, complex accelerator work that will provide beam to these experiments.

    Repurposing of the former CDF building for future heavy-assembly production space and offices is well under way, with more visible exterior improvements to begin soon.

    The new remote operations center, ROC West, is open for business. Several experiments already operate from its new location adjacent to the Wilson Hall atrium.

    The Wilson Street entrance security improvements, including a new guardhouse, are also welcome additions to improved site aesthetics and security operations. Plans for a more modern and improved Pine Street entrance are beginning as well.

    The fully funded Science Laboratory Infrastructure project to replace the Master Substation and critical portions of the industrial cooling water system will mitigate the lab’s largest infrastructure vulnerability for current and future lab operations. Construction is scheduled to start in summer 2015.

    The short-baseline neutrino program is expected to start utility and site preparation very soon, with the start of the detector building construction following shortly thereafter. This is an important and significant part of the near-term future of the lab.

    The start of a demolition program for excess older and inefficient facilities is very close. The program will begin with a portion of the trailers at both the CDF and DZero trailer complexes.

    Space reconfiguration in Wilson Hall to house the new Neutrino Division and LBNF project offices is in the final planning stage and will also be starting soon.

    The atrium improvements, with the reception desk, new lighting and more modern furniture, create a more welcoming atmosphere.

    And I started the article by mentioning planning for the “big stuff.” The big stuff, as you may know, includes the lab’s highest-priority project in developing a new central campus. This project is called the Center for Integrated Engineering Research, to be located just west of Wilson Hall. It will consolidate engineering resources from across the site to most efficiently plan for, construct and operate the P5 science projects. The highest-priority Technical Campus project, called the Industrial Center Building Addition, is urgently needed to expand production capacity for the equipment required for future science projects. And lastly the Scientific Hostel, or guest house, for which plans are also under way, will complete the Central Campus theme to “eat-sleep-work to drive discovery.”

    See the full article here.

    [Image: Fermilab Campus]

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.

  • richardmitnick 3:08 pm on October 21, 2014
    Tags: Accelerator Science, Fermilab Scientific Computing

    From FNAL: “Simulation in the 21st century” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Tuesday, Oct. 21, 2014
    V. Daniel Elvira, Scientific Computing Simulation Department head

    Simulation is not magic, but it can certainly produce that feeling. Although it can’t miraculously replace particle physics experiments, revealing new physics phenomena at the touch of a key, it can help scientists design detectors that deliver the best physics at the minimum cost in time and money.

    [Image: This CMS simulated event was created using Geant4 simulation software. Credit: CMS Collaboration]

    [Image: CMS at CERN]

    Geant4 is a detector simulation software toolkit originally created at CERN and currently developed by about 100 physicists and computer scientists from all around the world to model the passage of particles through matter and electromagnetic fields. For example, physicists use simulation to optimize detectors and software algorithms with the goal of measuring, with utmost efficiency, the marks that previously unobserved particles predicted by new theories would leave in their experimental devices.

    Particle physics detectors are typically large and complex. Think of them as a set of hundreds of different shapes and materials. Particles coming from accelerator beams or high-energy collisions traverse the detectors, lose energy and transform themselves into showers of more particles as they interact with the detector material. The marks they leave behind are read by detector electronics and reconstructed by software into the original incident particles with their associated energies and trajectories.
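
    The core idea, stripped of all detail, can be caricatured in a few lines (a deliberately crude toy, nothing like real Geant4 physics; the layer count and loss fractions are invented): a particle steps through detector layers, depositing energy stochastically until it is absorbed.

```python
import random

def simulate_passage(energy_gev: float, n_layers: int = 10, seed: int = 1):
    """Crude toy of a particle crossing detector layers (not Geant4)."""
    rng = random.Random(seed)
    deposits = []
    for _ in range(n_layers):
        if energy_gev < 0.01:               # particle effectively absorbed
            break
        frac = rng.uniform(0.05, 0.35)      # random energy loss per layer
        deposits.append(energy_gev * frac)  # what this layer would "read out"
        energy_gev -= deposits[-1]
    return deposits

print(simulate_passage(50.0))  # GeV deposited in successive layers
```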

    We wouldn’t even dream of starting detector construction, much less asking for the funding to do it, without simulating the detector geometry and magnetic fields, as well as the physics of the interactions of particles with detector material, in exquisite detail. One of the goals of simulation is to demonstrate that the proposed detector would do the job.

    Geant4 includes tools to represent the detector geometry by assembling elements of different shapes, sizes and material, as well as the mathematical expressions to propagate particles and calculate the details of the electromagnetic and nuclear interactions of particles with matter.

    Geant4 is the current incarnation of Geant (short for “geometry and tracking”; géant is also French for “giant”). It has become extremely popular for physics, medical and space science applications and is the tool of choice for high-energy physics, including CERN’s LHC experiments and Fermilab’s neutrino and muon programs.

    The Fermilab Scientific Computing Simulation Department (SCS) has grown a team of Geant4 experts who participate actively in its core development and maintenance, offering detector simulation support to experiments and projects within Fermilab’s scientific program. The focus of our team is on improving physics and testing tools, as well as time and memory performance. The SCS team also spearheads an exciting R&D program to re-engineer the toolkit to run on modern computer architectures.

    New-generation machines containing chips called coprocessors, or graphics processing units such as those used in game consoles or smartphones, may be used to speed execution times significantly. Software engineers do this by exploiting the benefits of the novel circuit design of the chips, as well as by using parallel programming. For example, a program execution mode called “multi-threading” would allow us to simulate particles from showers of different physics collisions simultaneously by submitting these threads to the hundreds or thousands of processor cores contained within these novel computer systems.
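
    A minimal sketch of this event-level parallelism (our illustration in Python; Geant4’s actual multi-threading machinery lives in C++ inside the toolkit): independent collision events share no state, so they can be farmed out to worker processes.

```python
from concurrent.futures import ProcessPoolExecutor

def simulate_event(event_id: int) -> float:
    # Stand-in for a full shower simulation; returns a fake "deposited energy".
    x, total = event_id + 1, 0.0
    for _ in range(100_000):
        x = (1103515245 * x + 12345) % 2**31   # cheap deterministic "work"
        total += x / 2**31
    return total

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:        # one event per worker, in parallel
        results = list(pool.map(simulate_event, range(8)))
    print(results)
```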

    As the high-energy community builds, commissions and runs the experiments of the first half of the 21st century, a world of exciting and promising possibilities is opening in the field of simulation and detector modeling. Our Fermilab SCS team is at the forefront of this effort.

    See the full article here.

    [Image: Fermilab Campus]

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.

  • richardmitnick 2:42 pm on October 20, 2014
    Tags: Accelerator Science

    From FNAL: “New high-speed transatlantic network to benefit science collaborations across the U.S.” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Monday, Oct. 20, 2014

    Karen McNulty-Walsh, Brookhaven Media and Communications Office, kmcnulty@bnl.gov, 631-344-8350
    Kurt Riesselmann, Fermilab Office of Communication, media@fnal.gov, 630-840-3351
    Jon Bashor, Computing Sciences Communications Manager, Lawrence Berkeley National Laboratory, jbashor@lbnl.gov, 510-486-5849

    Scientists across the United States will soon have access to new, ultra-high-speed network links spanning the Atlantic Ocean thanks to a project currently under way to extend ESnet (the U.S. Department of Energy’s Energy Sciences Network) to Amsterdam, Geneva and London. Although the project is designed to benefit data-intensive science throughout the U.S. national laboratory complex, the heaviest users of the new links will be particle physicists conducting research at the Large Hadron Collider (LHC), the world’s largest and most powerful particle collider. The high capacity of this new connection will provide U.S. scientists with enhanced access to data from the LHC and other European-based experiments by accelerating the exchange of data sets between institutions in the United States and computing facilities in Europe.


    DOE’s Brookhaven National Laboratory and Fermi National Accelerator Laboratory—the primary computing centers for U.S. collaborators on the LHC’s ATLAS and CMS experiments, respectively—will make immediate use of the new network infrastructure once it is rigorously tested and commissioned. Because ESnet, based at DOE’s Lawrence Berkeley National Laboratory, interconnects all national laboratories and a number of university-based projects in the United States, tens of thousands of researchers from all disciplines will benefit as well.

    [Image: The LHC at CERN]

    [Image: ATLAS at the LHC]

    [Image: CMS at CERN]

    [Image: Brookhaven Lab campus]

    The ESnet extension will be in place before the LHC at CERN in Switzerland—currently shut down for maintenance and upgrades—is up and running again in the spring of 2015. Because the accelerator will be colliding protons at much higher energy, the data output from the detectors will expand considerably—to approximately 40 petabytes of raw data per year, compared with 20 petabytes for all of the previous lower-energy collisions produced over the three years of the LHC’s first run between 2010 and 2012.
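
    To put that in network terms (a rough average of our own, not a figure from the article): 40 petabytes per year of raw data alone corresponds to a sustained rate of roughly 10 gigabits per second, before any replication between computing centers.

```latex
% 40 PB/year expressed as an average bit rate:
\frac{40\times10^{15}~\text{B/yr} \times 8~\text{bit/B}}
     {3.15\times10^{7}~\text{s/yr}}
\approx 1.0\times10^{10}~\text{bit/s} \approx 10~\text{Gbit/s}.
```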

    The cross-Atlantic connectivity during the first successful run for the LHC experiments, which culminated in the discovery of the Higgs boson, was provided by the US LHCNet network, managed by the California Institute of Technology. In recent years, major research and education networks around the world—including ESnet, Internet2, California’s CENIC, and European networks such as DANTE, SURFnet and NORDUnet—have increased their backbone capacity by a factor of 10, using sophisticated new optical networking and digital signal processing technologies. Until recently, however, higher-speed links were not deployed for production purposes across the Atlantic Ocean—creating a network “impedance mismatch” that can harm large, intercontinental data flows.

    An evolving data model

    This upgrade coincides with a shift in the data model for LHC science. Previously, data moved in a more predictable and hierarchical pattern strongly influenced by geographical proximity, but network upgrades around the world have now made it possible for data to be fetched and exchanged more flexibly and dynamically. This change enables faster science outcomes and more efficient use of storage and computational power, but it requires networks around the world to perform flawlessly together.

    “Having the new infrastructure in place will meet the increased need for dealing with LHC data and provide more agile access to that data in a much more dynamic fashion than LHC collaborators have had in the past,” said physicist Michael Ernst of DOE’s Brookhaven National Laboratory, a key member of the team laying out the new and more flexible framework for exchanging data between the Worldwide LHC Computing Grid centers.

    Ernst directs a computing facility at Brookhaven Lab that was originally set up as a central hub for U.S. collaborators on the LHC’s ATLAS experiment. A similar facility at Fermi National Accelerator Laboratory has played this role for the LHC’s U.S. collaborators on the CMS experiment. These computing resources, dubbed Tier 1 centers, have direct links to the LHC at the European laboratory CERN (Tier 0). The experts who run them will continue to serve scientists under the new structure. But instead of serving as hubs for data storage and distribution only among U.S.-based collaborators at Tier 2 and 3 research centers, the dedicated facilities at Brookhaven and Fermilab will be able to serve data needs of the entire ATLAS and CMS collaborations throughout the world. And likewise, U.S. Tier 2 and Tier 3 research centers will have higher-speed access to Tier 1 and Tier 2 centers in Europe.

    “This new infrastructure will offer LHC researchers at laboratories and universities around the world faster access to important data,” said Fermilab’s Lothar Bauerdick, head of software and computing for the U.S. CMS group. “As the LHC experiments continue to produce exciting results, this important upgrade will let collaborators see and analyze those results better than ever before.”

    Ernst added, “As centralized hubs for handling LHC data, our reliability, performance and expertise have been in demand by the whole collaboration, and now we will be better able to serve the scientists’ needs.”

    An investment in science

    ESnet is funded by DOE’s Office of Science to meet networking needs of DOE labs and science projects. The transatlantic extension represents a financial collaboration, with partial support coming from DOE’s Office of High Energy Physics (HEP) for the next three years. Although LHC scientists will get a dedicated portion of the new network once it is in place, all science programs that make use of ESnet will now have access to faster network links for their data transfers.

    “We are eagerly awaiting the start of commissioning for the new infrastructure,” said Oliver Gutsche, Fermilab scientist and member of the CMS Offline and Computing Management Board. “After the Higgs discovery, the next big LHC milestones will come in 2015, and this network will be indispensable for the success of the LHC Run 2 physics program.”

    This work was supported by the DOE Office of Science.

    The DOE Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

    See the full article here.

    [Image: Fermilab Campus]

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.

  • richardmitnick 2:05 pm on October 17, 2014
    Tags: Accelerator Science

    From FNAL: “Frontier Science Result: CMS – Off the beaten path”


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Friday, Oct. 17, 2014
    Jim Pivarski

    The main concern for most searches for rare phenomena is controlling the backgrounds. Backgrounds are observations that resemble the signal of interest but aren’t it. For instance, fool’s gold is a background for gold prospectors. The main reason the Higgs boson was hard to find is that most Higgs decays resemble b quark pair production, which is a million times more common. You not only have to find the one-in-a-million event, you also have to identify some feature of it that proves it is not an ordinary event.

    This is particularly hard to do in proton collisions because protons break apart in messy ways — the quarks from the protons that missed each other generate a spray of particles that fly off just about everywhere. Look through a billion or a trillion of these splatter events and you can find one that resembles the pattern of new physics that you’re looking for. Physicists have many techniques for filtering out these backgrounds — requiring missing momentum from an invisible particle, high energy perpendicular to the beam, a resonance at a single energy, and the presence of electrons and muons are just a few.

    [Image: Most particles produced by proton collisions originate at the point where the beams cross. Those that do not are due to intermediate particles that travel some distance before they decay.]

    A less common yet powerful technique for eliminating backgrounds is to look for displaced particle trajectories, meaning trajectories that don’t intersect the collision point. Particles that are directly created by the proton collision or are created by short-lived intermediates always emerge from this point. Those that emerge from some other point in space must be due to a long-lived intermediate.

    A common example of this is the b quark, which can live as long as a trillionth of a second before decaying into visible particles. That might not sound like very long, but the quark is traveling so quickly that it covers several millimeters in that trillionth of a second, which is a measurable difference.
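
    The arithmetic behind “several millimeters” (typical textbook values, not numbers from the article): the laboratory-frame flight distance is the lifetime times the speed, stretched by relativistic time dilation.

```latex
L = \beta\gamma\,c\tau, \qquad
c\tau_{b\text{-hadron}} \approx 0.5~\text{mm}, \quad
\beta\gamma \approx 10
\;\Rightarrow\; L \approx 5~\text{mm}.
```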

    In a recent analysis, CMS scientists searched for displaced electrons and muons. Displaced tracks are rare, and electrons and muons are also rare, so displaced electrons and muons should be extremely rare. The only problem with this logic is that b quarks sometimes produce electrons and muons, so one other feature is needed to disambiguate. A b quark almost always produces a jet of particles, so this search for new physics also required that the electrons and muons were not close to jets.

    [Image: CERN CMS]

    With these simple selection criteria, the experimenters found only as many events as would be expected from standard physics. Therefore, it constrains any theory that predicts displaced electrons and muons. One of these is “displaced supersymmetry,” which generalizes the usual supersymmetry scenario by allowing the longest-lived supersymmetric particle to decay on the millimeter scale that this analysis tests. Displaced supersymmetry was introduced as a way that supersymmetry might exist yet be missed by most other analyses. Experiments like this one illuminate the dark corners in which supersymmetry might be hiding.

    See the full article here.

    [Image: Fermilab Campus]

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.

  • richardmitnick 2:52 pm on October 16, 2014
    Tags: Accelerator Science

    From LC Newsline: “Full ILC-type cryomodule makes the grade” 

    [Image: Linear Collider Collaboration]

    16 October 2014
    Joykrit Mitra

    For the first time, the ILC gradient specification of 31.5 megavolts per metre has been achieved on average across all of the eight cavities assembled in an ILC-type cryomodule. A team at Fermilab reached the milestone earlier this month. It is an achievement for scientists, engineers and technicians at Fermilab and Jefferson Lab in Virginia as well as their domestic and international partners in superconducting radio-frequency (SRF) technologies.

    The cryomodule, called CM2, was developed and assembled to advance superconducting radio-frequency technology and infrastructure at Americas-region laboratories. The CM2 milestone achievement has been nearly a decade in the making, since US scientists started participating in ILC research and development in 2006.

    [Image: CM2 cryomodule being assembled at Fermilab’s Industrial Center Building, 2011. Photo: Reidar Hahn]

    “We’ve reached this important milestone and it was a long time coming,” said Elvin Harms, who leads the cryomodule testing programme at Fermilab. “It’s the first time in the world this has been achieved.”

    An accelerating gradient is a measure of how much of an energy boost particle bunches receive as they zip through an accelerator. Cavities with higher gradients boost particle bunches to higher energies over shorter distances. In an operational ILC, all 16,000 of its cavities would be housed in cryomodules, which would keep the cavities at their operating temperature of 2 kelvins. While individual cavities can achieve high gradients on their own, when they are assembled together in a cryomodule unit, the average gradient drops significantly.
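
    As a worked example (the roughly one-metre active length is our assumption for a nine-cell ILC cavity, not a number from the article), the energy gain is simply gradient times length:

```latex
\Delta E_{\text{cavity}} \approx 31.5~\text{MV/m} \times 1~\text{m}
\approx 31.5~\text{MeV}, \qquad
\Delta E_{\text{cryomodule}} \approx 8 \times 31.5~\text{MeV}
\approx 250~\text{MeV}.
```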

    The road to the 31.5 MV/m milestone has been a long and arduous one. Between 2008 and 2010, all of the eight cavities in CM2 had individually been pushed to gradients above 35 MV/m at Jefferson Lab in tests in which the cavities were electropolished and vertically oriented. They were among 60 cavities evaluated globally for the prospects of reaching the ILC gradient. This evaluation was known as the S0 Global Design Effort. It was a build-up to the S1-Global Experiment, which put to the test the possibility of reaching 31.5 MV/m across an entire cryomodule. The final assembly of the S1 cryomodule setup took place at KEK in Japan, between 2010 and 2011. In S1, seven nine-cell 1.3-gigahertz niobium cavities strung together inside a cryomodule achieved an average gradient of 26 MV/m. An ILC-type cryomodule consists of eight such cavities.

    [Image: CM2 in its home at Fermilab’s NML building, as part of the future Advanced Superconducting Test Accelerator. Photo: Reidar Hahn]

    But the ILC community has taken big strides since then. Americas-region teams acquired significant expertise in increasing cavity gradients: all CM2 cavities were vertically tested in the United States, initially at Jefferson Lab, and were subjected to additional horizontal tests at Fermilab. Further, cavities manufactured by private vendors in the United States have improved in quality: three of the eight cavities that make up the CM2 cryomodule were fabricated locally.

    Hands-on experience played a major role in improving the overall CM2 gradient. In 2007, a kit for Fermilab’s Cryomodule 1, or CM1, arrived from DESY, and by 2010, when CM1 was operational, the workforce had adopted a production mentality, which was crucial for the work they did on CM2.

    “I would like to congratulate my Fermilab colleagues for their persistence in carrying out this important work and for the quality of their work, which is extremely high,” said the SRF Institute at Jefferson Lab’s Rongli Geng, who led the ILC high-gradient cavity project there from 2007 to 2012. “We are glad to be able to contribute to this success.”

    But achieving the gradient is only the first step, Harms said. “There is still a lot of work left to be done. We need to look at CM2’s longer term performance. And we need to evaluate it thoroughly.”

    Among other tasks, the CM2 group will gently push the gradients higher to determine the limits of the technology and continue to understand and refine it. They plan to power and check the magnet, manufactured at Fermilab, that will be used to focus the particle beam passing through the cryomodule. Also in the works is a plan to study the rate at which CM2 can be cooled down to 2 kelvins and warmed up again. Finally, they expect to send an actual electron beam through CM2 in 2015 to better understand how the beam and cryomodule respond in that setup.

    Scientists at Fermilab also expect that CM2 will be used in the Advanced Superconducting Test Accelerator currently under construction at Fermilab’s NML building, where CM2 is housed. The SRF technology developed for CM2 also has applications for light source instruments such as LCLS-II at SLAC in the United States and DESY’s XFEL.

    And it’s definitely a viable option for a future machine like the ILC.

    See the full article here.

    The Linear Collider Collaboration is an organisation that brings the two most likely candidates, the Compact Linear Collider Study (CLIC) and the International Linear Collider (ILC), together under one roof. Headed by former LHC Project Manager Lyn Evans, it strives to coordinate the research and development work that is being done for accelerators and detectors around the world and to take the linear collider project to the next step: a decision that it will be built, and where.

    Some 2000 scientists – particle physicists, accelerator physicists, engineers – are involved in the ILC or in CLIC, and often in both projects. They work on state-of-the-art detector technologies, new acceleration techniques, the civil engineering aspect of building a straight tunnel of at least 30 kilometres in length, a reliable cost estimate and many more aspects that projects of this scale require. The Linear Collider Collaboration ensures that synergies between the two friendly competitors are used to the maximum.
