Tagged: Particle Physics

  • richardmitnick 10:31 am on October 31, 2014
    Tags: Fermilab Today, Particle Physics

    From FNAL: “Frontier Science Result: CMS Boosted W’s”


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Friday, Oct. 31, 2014

    Don Lincoln

    Today’s article covers an interesting topic. It’s interesting not because it explores new physics, but because of how it reveals some of the mundane aspects of research at the LHC. It also shows how the high energy of the LHC makes certain topics harder to study than they were during the good old days at lower-energy accelerators.

    At the LHC, quarks or gluons are scattered out of the collision. It’s the most common thing that happens at the LHC. Regular readers of this column know that it is impossible to see isolated quarks and gluons and that these particles convert into jets as they exit the collision. Jets are collimated streams of particles that have more or less the same energy as the parent quark or gluon. Interactions that produce jets are governed by the strong force.

    In the green region, we show what a W boson looks like before it decays. Moving left to right, the boson is created with more and more momentum. In the yellow region, we repeat the exercise, this time looking at the same W boson after it decays into quarks, which have then turned into jets. Finally in the pink region, we look at a jet originating from a quark or gluon. This looks much like a high-momentum W boson decaying into quarks. Because ordinary jets are so much more common, this highlights the difficulty inherent in finding high-momentum W bosons that decay into jets.

    Things get more interesting when a W boson is produced. One reason for this is that making a W boson requires the involvement of the electroweak force, which is needed for the decay of heavy quarks. Thus studies of W bosons are important for subjects such as the production of the top quark, which is the heaviest quark. W bosons are also found in some decays of the Higgs boson.

    A W boson most often decays into two light quarks, flinging them in two different directions that can be seen as two jets.

    But there’s a complication in this scenario at the LHC, where the W bosons are produced with so much momentum that it affects the spatial distribution of particles in those two jets. As the momentum of the W boson increases, the two jets get closer together and eventually merge into a single jet.
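
    The merging can be made quantitative with a standard kinematic rule of thumb (not specific to this paper): the opening angle between the two decay quarks scales inversely with the boson’s transverse momentum,

        \Delta R \approx \frac{2\, m_W}{p_T}

    so for a W mass near 80 GeV, a boson with transverse momentum above roughly 400 GeV squeezes both quarks inside ΔR ≈ 0.4, the radius parameter of a typical LHC jet algorithm, and they are reconstructed as a single jet.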

    As mentioned earlier, individual jets are much more commonly made using the strong force. So when one sees a jet, it is very hard to identify it as coming from a W boson, which involves the electroweak force. Since identifying the existence of W bosons is very important for certain discoveries, CMS scientists needed to figure out how to tell quark- or gluon-initiated jets from the W-boson-initiated jets. So they devised algorithms that could identify when a jet contained two lumps of energy rather than one. If there were two lumps, the jet was more likely to come from the decay of a W boson.
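
    As a concrete, if cartoonish, version of the “two lumps” idea, here is a minimal Python sketch. It is not the CMS algorithm (the real analysis relies on groomed jet masses and substructure observables); it only illustrates how a momentum-balance score separates two-prong jets from one-prong jets. The constituent lists are invented toy values.

        import math

        def delta_r(a, b):
            # Angular distance in (eta, phi) between two constituents (pt, eta, phi).
            dphi = math.atan2(math.sin(a[2] - b[2]), math.cos(a[2] - b[2]))
            return math.hypot(a[1] - b[1], dphi)

        def two_lump_score(jet):
            # Seed two subjet axes with the two hardest constituents, assign every
            # constituent to the nearer axis, and return the pt fraction carried by
            # the softer subjet: near 0.5 for a two-prong (W-like) jet, near 0 for
            # a one-prong (quark- or gluon-like) jet. Needs at least two constituents.
            axes = sorted(jet, reverse=True)[:2]
            subjet_pt = [0.0, 0.0]
            for c in jet:
                nearer = 0 if delta_r(c, axes[0]) <= delta_r(c, axes[1]) else 1
                subjet_pt[nearer] += c[0]
            return min(subjet_pt) / sum(subjet_pt)

        # Toy jets: one hard core versus two balanced prongs (pt in GeV).
        quark_jet = [(200.0, 0.0, 0.0), (10.0, 0.1, 0.1), (5.0, -0.1, 0.05)]
        w_jet     = [(110.0, 0.0, 0.0), (95.0, 0.3, 0.3), (8.0, 0.15, 0.2)]
        print(two_lump_score(quark_jet))   # ~0.05 -> one lump
        print(two_lump_score(w_jet))       # ~0.48 -> two lumps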

    CMS

    In today’s paper, CMS scientists explored algorithms and studied variables one can extract from the data to identify single jets that originated from the decay of W bosons. The data agreed reasonably well with calculations, and the techniques they devised will be very helpful for future analyses involving W bosons. In addition, the same basic technique can be extended to other interesting signatures, such as the decay of Z and Higgs bosons.

    See the full article here.

    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.


     
  • richardmitnick 12:12 pm on October 30, 2014
    Tags: Particle Physics

    From FNAL: “Frontier Science Result: CDF A charming result”


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Thursday, Oct. 30, 2014
    Diego Tonelli and Andy Beretvas

    Physicists gave funny names to the heavy quark cousins of those that make up ordinary matter: charm, strange, bottom, top. The Standard Model predicts that the laws governing the decays of strange, charm and bottom quarks differ if particles are replaced with antiparticles and observed in a mirror. This difference, CP violation in particle physics lingo, has been established for strange and bottom quarks. But for charm quarks the differences are so tiny that no one has observed them so far. Observing differences larger than predictions could provide much sought-after indications of new phenomena.

    The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    A team of CDF scientists searched for these tiny differences by analyzing millions of decays of charm particles into pairs of charged kaons and pions, sifting through roughly a thousand trillion proton-antiproton collisions from the full CDF Run II data set. They studied CP violation by looking at whether the difference between the numbers of charm and anticharm decays occurring in each chunk of decay time varies with decay time itself.

    The results have a tiny uncertainty (two parts per thousand) but do not show any evidence for CP violation, as shown in the figure below. The small residual decay asymmetry, which is constant in decay time, is due to the asymmetric layout of the detector. The combined result of charm decays into a pair of kaons and a pair of pions is the CP asymmetry parameter AΓ, which is equal to -0.12 ± 0.12 percent. The results are consistent with the current best determinations. Combined with them, they will improve the exclusion constraints on the presence of new phenomena in nature.
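
    The logic of such a measurement can be sketched in a few lines of numpy (a toy with invented inputs, not CDF’s actual fit): bin the decays by decay time, form the asymmetry in each bin, and fit a straight line. The slope plays the role of the CP-violating term, while the intercept absorbs the constant detector-induced asymmetry described above.

        import numpy as np

        rng = np.random.default_rng(42)

        # Toy decay-time bins (in units of the D0 lifetime) with equal numbers of
        # charm and anticharm decays plus a 0.5% constant instrumental asymmetry.
        t = np.linspace(0.5, 6.0, 12)
        expected = 1e6 * np.exp(-t)                  # falling yield per bin
        n_charm     = rng.poisson(expected * 1.005)
        n_anticharm = rng.poisson(expected * 0.995)

        # Asymmetry per bin; a fitted slope away from zero would signal CP violation.
        asym = (n_charm - n_anticharm) / (n_charm + n_anticharm)
        slope, intercept = np.polyfit(t, asym, 1)
        print(f"slope = {slope:+.5f}  intercept = {intercept:+.5f}")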

    These plots show the effective lifetime asymmetries as a function of decay time for D → K+K- (top) and D → π+π- (bottom) samples. Results of the fits not allowing for (dotted red line) and allowing for (solid blue line) CP violation are overlaid.

    See the full article here.

    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.


     
  • richardmitnick 11:58 am on October 30, 2014
    Tags: Particle Physics

    From LC Newsline: “The future of Higgs physics” 

    Linear Collider Collaboration

    30 October 2014
    Joykrit Mitra

    In 2012, the ATLAS and CMS experiments at CERN’s Large Hadron Collider announced the discovery of the Higgs boson. The Higgs was expected to be the final piece of the particular jigsaw that is the Standard Model of particle physics, and its discovery was a monumental event.

    Event recorded with the CMS detector in 2012 at a proton-proton centre of mass energy of 8 TeV. The event shows characteristics expected from the decay of the SM Higgs boson to a pair of photons (dashed yellow lines and green towers). Image: L. Taylor, CMS collaboration/CERN

    The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    CERN ATLAS

    CERN CMS

    But more precise studies of it are needed than the LHC is able to provide. That is why, years earlier, a machine like the International Linear Collider had been envisioned as a Higgs factory, and the Higgs discovery set the stage for its possible construction.

    ILC schematic

    Over the years, instruments for probing the universe have become more sophisticated. More refined data has hinted that aspects of the Standard Model are incomplete. If built, a machine such as the ILC will help reveal how wide a gulf there is between the universe and our understanding of it by probing the Higgs to unprecedented levels. And perhaps, as some physicists think, it will uproot the Standard Model and make way for an entirely new physics.

    In the textbook version, the Higgs boson is a single particle, and its alleged progenitor, the mysterious Higgs field that pervades every point in the universe, is a single field. But this theory is still to be tested.

    “We don’t know whether the Higgs field is one field or many fields,” said Michael Peskin of SLAC’s Theoretical Physics Group. “We’re just now scratching the surface at the LHC.”

    The LHC collides proton beams together, and the collision environment is not a clean one. Protons are made up of quarks and gluons, and in an LHC collision it’s really these many component parts – not the larger proton – that interact. During a collision, there are simply too many components in the mix to determine the initial energies of each one. Without knowing them, it’s not possible to precisely calculate properties of the particles generated from the collision. Furthermore, Higgs events at the LHC are exceptionally rare, and there is so much background that the amount of data that scientists have to sift through to glean information on the Higgs is astronomical.

    “There are many ways to produce an event that looks like the Higgs at the LHC,” Peskin said. “Lots of other things happen that look exactly like what you’re trying to find.”

    The ILC, on the other hand, would collide electrons and positrons, which are themselves fundamental particles with no component parts. Scientists would know their precise initial energy states, and there would be significantly fewer distractions from the measurement standpoint. The ILC is designed to accelerate particle beams up to energies of 250 billion electronvolts, extendable eventually to 500 billion electronvolts. The higher the particles’ energies, the larger the number of Higgs events. It’s the best possible scenario for probing the Higgs.
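
    One standard illustration of what a known initial state buys (a textbook technique, not detailed in the article): in e+e− → ZH production, the Higgs mass can be measured from the recoil against the Z boson alone, without reconstructing the Higgs decay at all,

        m_\mathrm{recoil}^2 = s + m_Z^2 - 2\sqrt{s}\, E_Z

    where √s is the known collision energy and E_Z is the measured Z energy. Nothing comparable is possible at a hadron collider, where the energies of the colliding constituents are unknown event by event.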

    If the ILC is built, physicists will first want to test whether the Higgs particle discovered at the LHC indeed has the properties predicted by the Standard Model. To do this, they plan to study Higgs couplings with known subatomic particles. The higher a particle’s mass, the proportionally stronger its coupling ought to be with the Higgs boson. The ILC will be sensitive enough to detect and accurately measure Higgs couplings with light particles, for instance with charm quarks. Such a coupling can be detected at the LHC in principle but is very difficult to measure accurately.
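
    The Standard Model fixes this relation exactly: the coupling of the Higgs to a fermion f is set by the fermion’s mass and the Higgs field’s vacuum expectation value (standard values quoted here, not taken from the article),

        y_f = \frac{\sqrt{2}\, m_f}{v}, \qquad v \approx 246~\mathrm{GeV}

    Since rates scale as the coupling squared, charm quarks (mass about 1.27 GeV) yield Higgs decay rates roughly (1.27/4.18)² ≈ 0.09 times those of bottom quarks, one reason the charm coupling is so hard to measure at the LHC.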

    The ILC can also help measure the exact lifetime of the Higgs boson. The more particles the Higgs couples to, the faster it decays and disappears. A difference between the measured lifetime and the projected lifetime—calculated from the Standard Model—could reveal what fraction of possible particles—or the Higgs’ interactions with them—we’ve actually discovered.
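
    A back-of-the-envelope version of the lifetime argument, using standard quoted numbers assumed here rather than taken from the article: the lifetime is ħ divided by the total decay width, and any hidden decay channel widens the total width.

        # hbar over the total width gives the lifetime.
        hbar_gev_s = 6.582e-25      # reduced Planck constant, GeV*s
        width_sm   = 4.1e-3         # predicted SM Higgs total width, ~4.1 MeV, in GeV

        print(f"SM lifetime: {hbar_gev_s / width_sm:.2e} s")    # ~1.6e-22 s
        # A hypothetical 20% of hidden decays would shorten it accordingly:
        print(f"with hidden decays: {hbar_gev_s / (width_sm * 1.2):.2e} s")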

    “Maybe the Higgs interacts with something new that is very hard to detect at a hadron collider, for example if it cannot be observed directly, like neutrinos,” speculated John Campbell of Fermilab’s Theoretical Physics Department.

    These investigations could yield some surprises. Unexpected deviations in the measurements could point to as-yet-undiscovered particles, which in turn would indicate that the Standard Model is incomplete. The Standard Model also makes predictions for the coupling between two Higgs bosons, and physicists hope to study this as well, to check whether there are indeed multiple kinds of Higgs particles.

    “It could be that the Higgs boson is only a part of the story, and it has explained what’s happened at colliders so far,” Campbell said. “The self-coupling of the Higgs is there in the Standard Model to make it self-consistent. If not the Higgs, then some other thing has to play that role that self-couplings play in the model. Other explanations could also provide dark matter candidates, but it’s all speculation at this point.”

    3D plot showing how dark matter distribution in our universe has grown clumpier over time. (Image: NASA, ESA, R. Massey from California Institute of Technology)

    The Standard Model has been very self-consistent so far, but some physicists think it isn’t entirely valid. It ignores the universe’s accelerating expansion caused by dark energy, as well as the mysterious dark matter that still allows matter to clump together and galaxies to form. There is speculation about the existence of undiscovered mediator particles that might be exchanged between dark matter and the Higgs field. The Higgs particle could be a likely gateway to this unknown physics.

    With the LHC set to be operational again next year, an optimistic possibility is that a new particle or two might be dredged out from trillions of collision events in the near future. If built, the ILC would be able to build on such discoveries, just as in the case of the Higgs boson, and provide a platform for more precise investigation.

    The collaboration between a hadron collider like the LHC and an electron-positron collider of the scale of the ILC could uncover new territories to be explored and help map them with precision, making particle physics that much richer.

    See the full article here.

    The Linear Collider Collaboration is an organisation that brings the two most likely candidates, the Compact Linear Collider Study (CLIC) and the International Linear Collider (ILC), together under one roof. Headed by former LHC Project Manager Lyn Evans, it strives to coordinate the research and development work that is being done for accelerators and detectors around the world and to take the linear collider project to the next step: a decision that it will be built, and where.

    Some 2000 scientists – particle physicists, accelerator physicists, engineers – are involved in the ILC or in CLIC, and often in both projects. They work on state-of-the-art detector technologies, new acceleration techniques, the civil engineering aspect of building a straight tunnel of at least 30 kilometres in length, a reliable cost estimate and many more aspects that projects of this scale require. The Linear Collider Collaboration ensures that synergies between the two friendly competitors are used to the maximum.



     
  • richardmitnick 5:20 pm on October 28, 2014
    Tags: Bs meson, Particle Physics, Syracuse University

    From Syracuse University: “Syracuse Physicists Closer to Understanding Balance of Matter, Antimatter” 

    Syracuse University

    Physicists in the College of Arts and Sciences have made important discoveries regarding Bs meson particles—something that may explain why the universe contains more matter than antimatter.

    Sheldon Stone

    Distinguished Professor Sheldon Stone and his colleagues recently announced their findings at a workshop at CERN in Geneva, Switzerland. Titled Implications of LHCb Measurements and Their Future Prospects, the workshop enabled him and other members of the Large Hadron Collider beauty (LHCb) Collaboration to share recent data results.

    CERN LHCb

    The LHCb Collaboration is an international experiment, based at CERN, that seeks to explore what happened after the Big Bang that allowed matter to survive and flourish in the Universe. It involves more than 800 scientists and engineers from all over the world. At CERN, Stone heads up a team of 15 physicists from Syracuse.

    “Many international experiments are interested in the Bs meson because it oscillates between a matter particle and an antimatter particle,” says Stone, who heads up Syracuse’s High-Energy Physics Group. “Understanding its properties may shed light on charge-parity [CP] violation, which refers to the balance of matter and antimatter in the universe and is one of the biggest challenges of particle physics.”

    Scientists believe that, 14 billion years ago, energy coalesced to form equal quantities of matter and antimatter. As the universe cooled and expanded, its composition changed. Antimatter all but disappeared after the Big Bang (approximately 13.8 billion years ago), leaving behind matter to create everything from stars and galaxies to life on Earth.

    “Something must have happened to cause extra CP violation and, thus, form the universe as we know it,” Stone says.

    He thinks part of the answer lies in the Bs meson, which contains a bottom antiquark and a strange quark, bound together by the strong interaction. (A quark is a hard, point-like object found inside the protons and neutrons that form the nuclei of atoms.)

    Enter CERN, a European research organization that operates the world’s largest particle physics laboratory.

    In Geneva, Stone and his research team—which includes Liming Zhang, a former Syracuse research associate who is now a professor at Tsinghua University in Beijing, China—have studied two landmark experiments that took place at Fermilab, a high-energy physics laboratory near Chicago, in 2009.

    The Large Hadron Collider at CERN

    The experiments involved the Collider Detector at Fermilab (CDF) and the DZero (D0), four-story detectors that were part of Fermilab’s now-defunct Tevatron, then one of the world’s highest-energy particle accelerators.

    “Results from D0 and CDF showed that the matter-antimatter oscillations of the Bs meson deviated from the standard model of physics, but the uncertainties of their results were too high to make any solid conclusions,” Stone says.

    He and Zhang had no choice but to devise a technique allowing for more precise measurements of Bs mesons. Their new result shows that the difference in oscillations between the Bs and anti-Bs meson is just as the standard model has predicted.

    Stone says the new measurement dramatically restricts the realms where new physics could be hiding, forcing physicists to expand their searches into other areas. “Everyone knows there is new physics. We just need to perform more sensitive analyses to sniff it out,” he adds.

    See the full article here.

    Syracuse University was officially chartered in 1870 as a private, coeducational institution offering programs in the physical sciences and modern languages. The university is located in the heart of Central New York and is within easy driving distance of Toronto, Boston, Montreal, and New York City. SU offers a rich mix of academic programs, alumni activities, and immersion opportunities in numerous centers in the U.S. and around the globe, including major hubs in New York City, Washington, D.C., and Los Angeles. The total student population at Syracuse University represents all 50 U.S. states and 123 countries.


     
  • richardmitnick 2:56 pm on October 28, 2014
    Tags: Particle Physics

    From FNAL: “Mu2e moves ahead” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Tuesday, Oct. 28, 2014
    Fermilab Director Nigel Lockyer wrote this column.

    In continued alignment with goals laid out in the P5 report, we’re making progress on our newest muon experiment, Mu2e. A four-day DOE Critical Decision 2/3b review of the experiment concluded Friday. The review went extremely well and validated the design, technical progress, and the cost and schedule of the project. The reviewers praised the depth and breadth of our staff’s excellent technical work and preparation. Official sign-off for CD-2/3b is expected in the next several months, followed by construction on the Mu2e building in early 2015. Construction on the transport solenoid modules should begin in the spring. The experiment received CD-0 approval in 2009 and CD-1 approval in 2012 and is slated to start up in 2020.

    Named for the muon-to-electron conversion that researchers hope to observe, Mu2e is a crucial stepping stone on our journey beyond the Standard Model and in the hunt for new physics. It will be 10,000 times more sensitive than previous attempts to observe that transition.

    The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Experimenters will use a series of superconducting magnets to separate muons from other particles, guiding them to a stopping target. After the muons have been captured by aluminum nuclei, a very small number are expected to transform into only an electron rather than the typical decay into an electron and two neutrinos. It’s a change so rare, theorists liken it to finding a penny with a scratch on Lincoln’s head hidden in a stack of pristine pennies so tall that the stack stretches from the Earth to Mars and back again 130 times.
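
    The analogy checks out arithmetically under stated assumptions (penny thickness and Earth–Mars distance are assumed here, not given in the column):

        penny_thickness_m = 1.52e-3      # assumed: US penny, ~1.52 mm
        earth_mars_m      = 2.25e11      # assumed: average Earth-Mars distance, m

        pennies = 2 * earth_mars_m * 130 / penny_thickness_m   # there and back, 130 times
        print(f"{pennies:.2e} pennies -> odds of one in {pennies:.1e}")
        # ~3.8e16 pennies, i.e. a few parts in 1e17 -- the right ballpark for a
        # sensitivity some 10,000 times beyond earlier experiments.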

    The experiment will provide insight into how and why particles within one family change into others. It might also help narrow down theories about how the universe works and provide insight into data coming out of the LHC. Discovery of the muon-to-electron conversion would hint at undiscovered particles or forces and potentially illuminate a grand unification theory — not bad for a 75-foot-long experiment.

    Many months of hard work preceded last week’s review. Thank you to all who were involved in helping to move this important experiment forward.

    See the full article here.

    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.


     
  • richardmitnick 4:07 pm on October 25, 2014
    Tags: Particle Physics

    From Quanta: “Dwarf Galaxies Dim Hopes of Dark Matter” 

    Quanta Magazine

    October 25, 2014
    Natalie Wolchover

    Once again, a shadow of a signal that scientists hoped would amplify into conclusive evidence of dark matter has instead flatlined, repeating a maddening refrain in the search for the invisible, omnipresent particles.

    The Fermi Large Area Telescope (LAT) failed to detect the glow of gamma rays emitted by annihilating dark matter in miniature “dwarf” galaxies that orbit the Milky Way, scientists reported Friday at a meeting in Nagoya, Japan. The hint of such a glow showed up in a Fermi analysis last year, but the statistical bump disappeared as more data accumulated.

    NASA/Fermi Gamma Ray Spacecraft

    LAT cutaway

    “We were obviously somewhat disappointed not to see a signal,” said Matthew Wood, a postdoctoral researcher at Stanford University who was centrally involved in the Fermi-LAT collaboration’s new analysis, in an email.

    Scientists homed in on the dwarf galaxies after Dan Hooper, a theoretical astrophysicist at the Fermi National Accelerator Laboratory in Batavia, Ill., and Lisa Goodenough, his graduate student at the time, detected an unexplained gamma-ray signal coming from the center of the Milky Way in 2009. Hooper and several collaborators proposed that the gamma rays might be due to dark matter in the form of WIMPs, or weakly interacting massive particles, which are the leading candidates for the invisible substance that comprises six-sevenths of the universe’s mass. When two WIMPs collide in the dense galactic center, they should annihilate, with gamma rays as the fallout. Over the past five years the intriguing gamma-ray signal has seemed more and more likely to be the detritus of annihilating WIMPs.

    However, scientists knew that the same glow could also originate from an unknown population of millisecond pulsars in the galactic center — bright, rapidly spinning stars that spew gamma rays into space.

    Looking for ways to distinguish the two possibilities, scientists turned to dwarf galaxies, which are thought to be rich in dark matter but free of pulsars. If researchers found gamma rays pouring out of dwarf galaxies, the observation would rule out alternative explanations and provide emphatic evidence of WIMPs.

    Yet no such signal has been detected in five years’ worth of the highest-quality data from 15 nearby dwarfs, Wood and his colleagues report. “The case for the dark-matter interpretation of the galactic-center excess is substantially weakened,” he said.

    Olena Shmahalo/Quanta Magazine; data courtesy of Matthew Wood

    Under the most generous assumptions about the density of dark matter, new observations of dwarf galaxies exclude some, but not all, models of dark-matter particles that could be producing a signal coming from the center of the Milky Way. The range of particle properties proposed in a 2014 paper by Dan Hooper and colleagues (purple) is still viable, while a model proposed by Francesca Calore et al. (orange), which experts consider the most comprehensive, predicts a range of properties that is cut exactly in half. Under less generous assumptions, all except the Calore model are excluded.

    The possibility remains that the signal from the Milky Way’s center does come from dark matter, but only if the density of dark matter in the galaxy is at the high end of researchers’ estimates. If dark matter is sufficiently dense, it doesn’t have to annihilate at a very high rate to explain the signal from the galactic center. And if dark matter annihilates at a low rate, then researchers shouldn’t be surprised when they don’t see a signal coming from the more-diffuse dwarfs.
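
    The density dependence is quadratic, which is what drives this argument: annihilation needs two dark matter particles to meet, so the predicted gamma-ray flux follows the standard proportionality

        \Phi \propto \frac{\langle \sigma v \rangle}{m_\chi^2} \int_{\mathrm{l.o.s.}} \rho^2(l)\, dl

    (a textbook relation, paraphrased here). Doubling the assumed density toward the galactic center lets the annihilation cross section ⟨σv⟩ drop fourfold while explaining the same excess, and a smaller ⟨σv⟩ pushes the predicted dwarf-galaxy glow below Fermi-LAT’s sensitivity.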

    “At this stage we do not entirely exclude all of the dark-matter models proposed to explain the reported excess,” Wood said.

    Hooper, whose model barely survives the blow of the new dwarf-galaxy findings, seems unfazed, and he maintains his position that the signal from the galactic center most likely comes from colliding WIMPs that vanish in puffs of gamma rays. “That’s where my money is,” he told Quanta Magazine in March. Speaking from the meeting in Japan, he said, “That hasn’t changed in any significant way.”

    Other scientists agree that the dark-matter explanation of the gamma-ray excess is still viable, for now. “It is what it is,” said Savvas Koushiappas, a physicist at Brown University and co-author of another recent analysis of gamma rays from the dwarfs. “There is a dark-matter interpretation, and the dwarfs at the moment did not rule it out, or confirm it. However, we are close.”

    Tracy Slatyer, a physicist at the Massachusetts Institute of Technology who has collaborated with Hooper on models of the galactic-center excess, said she finds the new results “really encouraging.”

    “Of course, I would like the galactic-center excess to come from annihilating dark matter, but I would much rather know one way or the other,” she said. “This result increases the probability that we will know for sure in the near future.”

    The paradigm that dark matter is likely composed of WIMPs has long reigned among physicists because of the “WIMP miracle,” or the fact that the same hypothetical particle could account for mysteries of both the cosmic and the quantum worlds. With roughly the same mass as many of the known particles in nature, WIMPs would counteract the effects of those particles in quantum equations in a way that would make apparently faulty calculations work. And the presence of a halo of WIMPs around galaxies would explain why the galaxies rotate faster than expected at their outskirts — the most compelling indirect evidence that dark matter exists.
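
    The rotation-curve argument fits in one line of Newtonian dynamics (standard textbook material, added for context): a star in a circular orbit of radius r around enclosed mass M(r) moves at

        v(r) = \sqrt{\frac{G\, M(r)}{r}}

    Observed speeds stay roughly flat far beyond the visible disk instead of falling off as 1/√r, so M(r) must keep growing with r, implying a massive unseen halo.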

    But the fact that WIMPs would represent an elegant solution to deep questions doesn’t mean they’re real. Scientists have spent the past decade monitoring ultra-cooled vats of liquid chemicals located deep underground in repurposed mine shafts all over the world, hoping that WIMPs would occasionally leave traces of energy as they traversed the liquids. But the search has not produced a single convincing signal.

    As the experiments become ever more sensitive, they eat away at the abstract space of all viable WIMP models, giving it the look of Swiss cheese. The discouraging results have pushed researchers to get more creative. “Even though many people are working very hard on the WIMP paradigm, people are starting to think more broadly,” said Mark Trodden, a professor of theoretical physics at the University of Pennsylvania.

    Dwarf galaxies have already inspired alternatives to the standard WIMP picture. If dark-matter particles can interact with one another (instead of “weakly interacting” only with ordinary matter, as in conventional WIMP models), they will transfer heat as they collide. “When you transfer heat, you get a less dense center,” explained David Spergel, an astrophysicist at Princeton University who, along with his colleague Paul Steinhardt, first proposed the self-interacting dark-matter scenario in 2000. Indeed, astronomers have observed that the cores of dwarf galaxies are less dense than would be expected based on simulations of galaxy formation that use WIMPs.

    A map of dwarf galaxies orbiting the Milky Way Galaxy. Each dwarf contains up to several billion stars, compared to several hundred billion in the Milky Way. (Image: J. Bullock, M. Geha, R. Powell)

    Self-interacting dark matter has attracted growing interest among scientists, but not everyone feels comfortable postulating a new property to patch over the problems with current models.

    “We’re just making this invisible particle increasingly complicated,” said Justin Khoury, a theoretical physicist at the University of Pennsylvania. “I’m torn about that.”

    Meanwhile, new and improved simulations by Alyson Brooks of Rutgers University and colleagues suggest that dwarf galaxies can be modeled correctly without dark matter self-interactions after all, if the simulations include the effects of ordinary particles — the one-seventh of all matter that we actually see, but which models often ignore for the sake of simplicity. When stars go supernova, Brooks explained, they produce hot bubbles of gas that rapidly expand. “It turns out that process gives energy to the dark matter in the center of galaxies and pushes it out,” she said.

    Although Brooks’ simulations match observations, some other leading modelers can’t get the effects of ordinary matter to fix the discrepancy in their own simulations, fueling the interest in self-interacting dark matter.

    Complicating the debate is the fact that if dark-matter particles self-interact, that means they don’t annihilate upon contact in bursts of gamma rays. In that case, the signal from the Milky Way’s center would not come from dark matter.

    “If this all sounds lively and contradictory and confused, you have the right idea,” Steinhardt said.

    Khoury has moved the furthest from the WIMP picture with a recent paper postulating that dark matter may not be composed of particles at all. His theory revamps an old idea called modified Newtonian dynamics, or MOND, which proposes a change to the law of gravity. In Khoury’s theory, dark matter is a fluidlike field that permeates space, interacting with the gravitational fields of galaxies in a way that alters their rotation.

    Erik Verlinde, a theoretical physicist at the University of Amsterdam in the Netherlands, has proposed a different modified-gravity theory, one in which dark matter doesn’t exist at all and the rotational speeds of galaxies reflect the entropy, or disorder, of space and time.

    At this stage, one theorist’s guess seems as good as another’s.

    “There are many, many, many things that dark matter could be,” Trodden said. “If you gave me license to write down particle physics [models] that could give me dark matter, I could write down 10 that haven’t been thought about before.” As for which ones hold the most promise, the universe isn’t telling.

    See the full article here.

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.


     
  • richardmitnick 6:36 pm on October 24, 2014
    Tags: Particle Physics

    From Nautilus: “Who Really Found the Higgs Boson” 

    Nautilus

    October 23, 2014
    By Neal Hartman
    Illustration by Owen Freeman

    To those who say that there is no room for genius in modern science because everything has been discovered, Fabiola Gianotti has a sharp reply. “No, not at all,” says the former spokesperson of the ATLAS Experiment, the largest particle detector at the Large Hadron Collider at CERN. “Until the fourth of July, 2012 we had no proof that nature allows for elementary scalar fields. So there is a lot of space for genius.”

    ATLAS

    The LHC at CERN

    She is referring to the discovery of the Higgs boson two years ago—potentially one of the most important advances in physics in the past half century. It is a manifestation of the eponymous field that permeates all of space, and completes the standard model of physics: a sort of baseline description for the existence and behavior of essentially everything there is.

    By any standards, it is an epochal, genius achievement.

    What is less clear is who, exactly, the genius is. An obvious candidate is Peter Higgs, who postulated the Higgs boson, as a consequence of the Brout-Englert-Higgs mechanism, in 1964. He was awarded the Nobel Prize in 2013 along with Francois Englert (Englert and his deceased colleague Robert Brout arrived at the same result independently). But does this mean that Higgs was a genius? Peter Jenni, one of the founders and the first “spokesperson” of the ATLAS Experiment Collaboration (one of the two experiments at CERN that discovered the Higgs particle), hesitates when I ask him the question.

    “They [Higgs, Brout and Englert] didn’t think they [were working] on something as grandiose as [Einstein’s relativity],” he states cautiously. The spontaneous symmetry breaking leading to the Higgs “was a challenging question, but [Albert Einstein] saw something new and solved a whole field. Peter Higgs would tell you, he worked a few weeks on this.”


    What, then, of the leaders of the experimental effort, those who directed billions of dollars in investment and thousands of physicists, engineers, and students from almost 40 countries for over three decades? Surely there must have been a genius mastermind directing this legion of workers, someone we can single out for his or her extraordinary contribution.

    “No,” says Gianotti unequivocally, which is rare for a physicist, “it’s completely different. The instruments we have built are so complex that inventiveness and creativity manifests itself in the day-by-day work. There are an enormous amount of problems that require genius and creativity to be spread over time and over many people, and all at the same level.”

    Scientific breakthroughs often seem to be driven by individual genius, but this perception belies the increasingly collaborative nature of modern science. Perhaps nothing captures this dichotomy better than the story of the Higgs discovery, which presents a stark contrast between the fame awarded to a few on the one hand, and the institutionalized anonymity of the experiments that made the discovery possible on the other.

    An aversion to the notion of exceptional individuals is deeply rooted within the ATLAS collaboration, a part of its DNA. Almost all decisions in the collaboration are approved by representative groups, such as the Institute Board, the Collaboration Board, and a plethora of committees and task forces. Consensus is the name of the game. Even the effective CEO, a role Gianotti occupied from 2009 to 2013, is named the “Spokesperson.” She spoke for the collaboration, but did not command it.

    Collectivity is crucial to ATLAS in part because it’s important to avoid paying attention to star personalities, so that the masses of physicists in the collaboration each feel they own the research in some way. Almost 3,000 people qualify as authors on the key physics papers ATLAS produces, and the author list can take almost as many pages as the paper itself.

    The genius of crowds: Particle physics collaborations can produce academic papers with hundreds of authors. One 2010 paper was 40 pages long—with 10 pages devoted to the authors list, pictured here.

    On a more functional level, this collectivity also makes it easier to guard against bias in interpreting the data. “Almost everything we do is meant to reduce potential bias in the analysis,” asserts Kerstin Tackmann, a member of the Higgs to Gamma Gamma analysis group during the time of the Higgs discovery, and recent recipient of the Young Scientist Prize in Particle Physics. Like many physicists, Tackmann verges on the shy, and speaks with many qualifications. But she becomes more forceful when conveying the importance of eliminating bias.

    “We don’t work with real data until the very last step,” she explains. After the analysis tools—algorithms and software, essentially—are defined, they are applied to real data, a process known as the unblinding. “Once we look at the real data,” says Tackmann, “we’re not allowed to change the analysis anymore.” To do so might inadvertently create bias, by tempting the physicists to tune their analysis tools toward what they hope to see, in the worst cases actually creating results that don’t exist. The ability of the precocious individual physicist to suggest a new data cut or filter is restricted by this procedure: He or she wouldn’t even see real data until late in the game, and every analysis is vetted independently by multiple other scientists.
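
    A minimal sketch of that blinding discipline (a hypothetical structure for illustration, not ATLAS’s actual analysis framework): the selection is defined and frozen on simulation, and real data flows through it exactly once, unchanged.

        def define_selection():
            # Tuned and vetted on simulated events only; hypothetical cut values.
            return lambda event: event["pt_photon1"] > 40.0 and event["pt_photon2"] > 30.0

        selection = define_selection()     # frozen before anyone sees real data

        UNBLINDED = True                   # flipped only after independent review
        if UNBLINDED:
            real_events = [{"pt_photon1": 52.1, "pt_photon2": 33.4}]   # placeholder data
            selected = [e for e in real_events if selection(e)]        # no changes allowed now
            print(f"{len(selected)} event(s) pass the frozen selection")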


    This collective discipline is one way that ATLAS tames the complexity of the data it produces, which in raw form is voluminous enough to fill a stack of DVDs that reaches from the earth to the moon and back again, 10 times every year. The data must be reconstructed into something that approximates an image of individual collisions in time and space, much like the processing required for raw output from a digital camera.
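
    The DVD analogy can be sanity-checked with assumed figures (disc thickness, capacity and the Earth–Moon distance below are standard values, not from the article):

        dvd_thickness_m = 1.2e-3     # assumed standard disc thickness
        dvd_bytes       = 4.7e9      # assumed single-layer DVD capacity
        earth_moon_m    = 3.844e8    # assumed mean Earth-Moon distance

        dvds     = 2 * earth_moon_m * 10 / dvd_thickness_m   # there and back, 10x per year
        per_year = dvds * dvd_bytes
        print(f"{dvds:.1e} DVDs ~ {per_year:.1e} bytes/year ~ {per_year / 3.15e7:.1e} bytes/s")
        # ~3e22 bytes/year, about a petabyte per second of raw, pre-trigger output.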

    But the identification of particles from collisions has become astoundingly more complex since the days of “scanning girls” and bubble chamber negatives, where actual humans sat over enlarged images of collisions and identified the lines and spirals as different particles. Experimentalists today need to have expert knowledge of the internal functioning of the different detector subsystems: pixel detector, silicon strip tracker, transition radiation tracker, muon system, and calorimeters, both hadronic and electromagnetic. Adjustments made to each subsystem’s electronics, such as gain or threshold settings, might cause the absence or inclusion of what looks like real data but isn’t. Understanding what might cause false or absent signals, and how they can be accounted for, is the most challenging and creative part of the process. “Some people are really clever and very good at this,” says Tackmann.

    The process isn’t static, either. As time goes on, the detector changes from age and radiation damage. In the end the process of perfecting the detector’s software is never-ending, and the human requirements are enormous: roughly 100 physicists were involved in the analysis of a single and relatively straightforward particle signature, the decay of the Higgs into two Gamma particles. The overall Higgs analysis was performed by a team of more than 600 physicists.

    The depth and breadth of this effort transform the act of discovery into something anonymous and distributed—and this anonymity has been institutionalized in ATLAS culture. Marumi Kado, a young physicist with tousled hair and a quiet zen-like speech that borders on a whisper, was one of the conveners of the “combined analysis” group that was responsible for finally reaching the level of statistical significance required to confirm the Higgs discovery. But, typically for ATLAS, he downplays the importance of the statistical analysis—the last step—in light of the complexity of what came before. “The final analysis was actually quite simple,” he says. “Most of the [success] lay in how you built the detector, how well you calibrated it, and how well it was designed from the very beginning. All of this took 25 years.”

    The deeply collaborative work model within ATLAS meant that it wasn’t enough for it to innovate in physics and engineering—it also needed to innovate its management style and corporate culture. Donald Marchand, a professor of strategy execution and information management at IMD Business School in Lausanne, describes ATLAS as following a collaborative mode of working that flies in the face of standard “waterfall”—or top down—management theory.

    Marchand conducted a case study on ATLAS during the mid-2000s, finding that the ATLAS management led with little or no formal authority. Most people in the collaboration work directly “for” someone who is in no way related to their home institute, which actually writes their paycheck. For example, during the construction phase, the project leader of the ATLAS pixel detector, one of its most data-intensive components, worked for a U.S. laboratory in California. His direct subordinate, the project engineer, worked for an institute in Italy. Even though he was managing a critical role in the production process, the project leader had no power to promote, discipline, or even formally review the project engineer’s performance. His only recourse was discussion, negotiation, and compromise. ATLAS members are more likely to feel that they work with someone, rather than for them.

    Similarly, funding came from institutes in different countries through “memorandums of understanding” rather than formal contracts. The collaboration’s spokesperson and other top managers were required to follow a policy of stewardship, looking after the collaboration rather than directing it. If collaboration members were alienated, that could mean the loss of the financial and human capital they were investing. Managers at all levels needed to find non-traditional ways to provide feedback, incentives, and discipline to their subordinates.


    The coffee chat was one way to do this, and became the predominant way to conduct the little daily negotiations that kept the collaboration running. Today there are cafés stationed all around CERN, and they are full from morning to evening with people having informal meetings. Many physicists can be seen camped out in the cafeteria for hours at a time, working on their laptops between appointments. ATLAS management also created “a safe harbor, a culture within the organization that allows [employees] to express themselves and resolve conflicts and arguments without acrimony,” Marchand says.

    The result is a management structure that is remarkably effective and flexible. ATLAS managers consistently scored in the top 5 percent of a benchmark scale that measures how they control, disseminate, and capitalize on the information capital in their organization. Marchand also found that the ATLAS management structure was effective at adapting to changing circumstances, temporarily switching to a more top-down paradigm during the core production phase of the experiment, when thousands of identical objects needed to be produced on assembly lines all over the world.

    This collaborative culture didn’t arise by chance; it was built into ATLAS from the beginning, according to Marchand. The original founders infused a collaborative ethic into every person that joined by eschewing personal credit, talking through conflicts face to face, and discussing almost everything in open meetings. But that ethic is codified nowhere; there is no written code of conduct. And yet it is embraced, almost religiously, by everyone that I spoke with.

    Collaboration members are sceptical of attributing individual credit to anything. Every paper includes the entire author list, and all of ATLAS’s outreach material is signed “The ATLAS Collaboration.” People are suspicious of those that are perceived to take too much personal credit in the media. One famous member of the collaboration (as well as a former rock star and host of the highly successful BBC series, Horizon) is looked upon dubiously by many, who see him as drawing too much attention to himself through his association with the experiment.

    MIND THE GAP: Over 60 institutes collaborated to build and install a new detector layer inside a 9-millimeter gap between the beam pipe (the evacuated pipe inside of which protons circulate) and the original detector. Image: ATLAS Experiment © 2014 CERN

    In searching for genius at ATLAS, and other experiments at CERN, it seems almost impossible to point at anything other than the collaborations themselves. More than any individual, including the theorists who suggest new physics and the founders of experimental programs, it is the collaborations that reflect the hallmarks of genius: imagination, persistence, open-mindedness, and accomplishment.

    The results speak for themselves: ATLAS has already reached its first key objective in just one-tenth of its projected lifetime, and continues to evolve in a highly collaborative way. This May, one of the first upgrades to the detector was installed. Called the Insertable B-Layer (IBL), it grew out of a task force formed near the end of ATLAS’s initial commissioning period, in 2008, with the express goal of documenting why inserting another layer of detector into a 9-millimeter clearance space just next to the beam pipe was considered impossible.

    Consummate opportunists, the task force members instead came up with a design that quickly turned into a new subproject. And though it’s barely larger than a shoebox, the IBL’s construction involved more than 60 institutes all over the world, because everyone wanted to be involved in this exciting new thing. When it came time to slide the Insertable B-layer sub-detector into its home in the heart of ATLAS earlier this year, with only a fraction of a millimeter of clearance over 7 meters in length, the task was accomplished in just two hours—without a hitch.

    Fresh opportunities for new genius abound. Gianotti singles out dark matter as an example, saying “96 percent of the universe is dark. We don’t know what it’s made of and it doesn’t interact with our instruments. We have no clue,” she says. “So there is a lot of space for genius.” But instead of coming from the wild-haired scientist holding a piece of chalk or tinkering in the laboratory, that genius may come from thousands of people working together.

    Neal Hartman is a mechanical engineer with Lawrence Berkeley National Laboratory who has been working with the ATLAS collaboration at CERN for almost 15 years. He spends much of his time on outreach and education in both physics and general science, including running CineGlobe, a science-inspired film festival at CERN.

    See the full article, with notes, here.


     
  • richardmitnick 4:39 pm on October 23, 2014
    Tags: Particle Physics

    From FNAL: “Physics in a Nutshell – Unparticle physics” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Thursday, Oct. 23, 2014
    Jim Pivarski

    The first property of matter that was known to be quantized was not a surprising one like spin — it was mass. That is, mass only comes in multiples of a specific value: The mass of five electrons is 5 times 511 keV. A collection of electrons cannot have 4.9 or 5.1 times this number — it must be exactly 4 or exactly 6, and this is a quantum mechanical effect.

    We don’t usually think of mass quantization as quantum mechanical because it isn’t weird. We sometimes imagine electrons as tiny balls, all alike, each with a mass of 511 keV. While this mental image could make sense of the quantization, it isn’t correct since other experiments show that an electron is an amorphous wave or cloud. Individual electrons cannot be distinguished. They all melt together, and yet the mass of a blob of electron-stuff is always a whole number.

    The quantization of mass comes from a wave equation — physicists assume that electron-stuff obeys this equation, and when they solve the equation, it has only solutions with mass in integer multiples of 511 keV. Since this agrees with what we know, it is probably the right equation for electrons. However, there might be other forms of matter that obey different laws.

    One alternative would be to obey a symmetry principle known as scale invariance. Scale invariance is a property of fractals, in which the same pattern is repeated within itself at smaller and smaller scales. For matter, scale invariance is the property that the energy, momentum and mass of a blob of matter can be scaled up equally. Normal particles like electrons are not scale-invariant because the energy can be scaled by an arbitrary factor, but the mass is rigidly quantized.

    It is theoretically possible that another type of matter, dubbed “unparticles,” could satisfy scale invariance. In a particle detector, unparticles would look like particles with random masses. One unparticle decay might have many times the apparent mass of the next — the distribution would be broad.
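
    A toy simulation makes the contrast vivid. The sketch below is a rough stand-in rather than the real theory (the actual unparticle spectrum is fixed by its scaling dimension): a particle shows up as a narrow mass peak, while a scale-invariant object, having no preferred mass, is mimicked here by a heavy-tailed power law.

        import numpy as np

        rng = np.random.default_rng(0)

        particle   = rng.normal(91.0, 2.5, 100_000)           # Z-like peak, GeV
        unparticle = 20.0 * (1.0 + rng.pareto(1.5, 100_000))  # broad power law, GeV

        for name, m in (("particle", particle), ("unparticle", unparticle)):
            lo, mid, hi = np.percentile(m, [16, 50, 84])
            print(f"{name:10s} median {mid:6.1f} GeV, 68% range {lo:6.1f}-{hi:6.1f} GeV")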

    Another feature of unparticles is that they don’t interact strongly with the familiar Standard Model particles, but they interact more strongly at higher energies. Therefore, they would not have been produced in low-energy experiments, but could be discovered in high-energy experiments.

    The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Physicists searched for unparticles using the 7- and 8-TeV collisions produced by the LHC in 2011-2012, and they found nothing. This tightens limits, reducing the possible parameters that the theory can have, but it does not completely rule it out. Next spring, the LHC is scheduled to start up with an energy of 13 TeV, which would provide a chance to test the theory more thoroughly. Perhaps the next particle to be discovered is not a particle at all.

    LHC tunnel

    See the full article here.

    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.


     
  • richardmitnick 1:08 pm on October 22, 2014
    Tags: Particle Physics

    From FNAL: “From the Office of Campus Strategy and Readiness – Building the future of Fermilab” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Wednesday, Oct. 22, 2014
    Randy Ortgiesen, head of OCSR, wrote this column.

    As Fermilab and the Department of Energy continue to aggressively “make ready the laboratory” for implementing P5’s recommendations, I can’t help reflecting on all that has recently been accomplished to support the lab’s future — both less visible projects and the big stuff. As we continue to build on these accomplishments, it’s worth noting their breadth and how much headway we’ve made.

    The development of the Muon Campus is proceeding at a healthy clip. Notable in its progress is the completion of the MC-1 Building and the cryogenic systems that support the Muon g-2 experiment. The soon-to-launch beamline enclosure construction project and the soon-to-follow Mu2e building are also significant. And none of this could operate without the ongoing, complex accelerator work that will provide beam to these experiments.

    Repurposing of the former CDF building for future heavy-assembly production space and offices is well under way, with more visible exterior improvements to begin soon.

    The new remote operations center, ROC West, is open for business. Several experiments already operate from its new location adjacent to the Wilson Hall atrium.

    The Wilson Street entrance security improvements, including a new guardhouse, are also welcome additions to improved site aesthetics and security operations. Plans for a more modern and improved Pine Street entrance are beginning as well.

    The fully funded Science Laboratory Infrastructure project to replace the Master Substation and critical portions of the industrial cooling water system will mitigate the lab’s largest infrastructure vulnerability for current and future lab operations. Construction is scheduled to start in summer 2015.

    The short-baseline neutrino program is expected to start utility and site preparation very soon, with the start of the detector building construction following shortly thereafter. This is an important and significant part of the near-term future of the lab.

    The start of a demolition program for excess older and inefficient facilities is very close. The program will begin with a portion of the trailers at both the CDF and DZero trailer complexes.

    Space reconfiguration in Wilson Hall to house the new Neutrino Division and LBNF project offices is in the final planning stage and will also be starting soon.

    The atrium improvements, with the reception desk, new lighting and more modern furniture, create a more welcoming atmosphere.

    And I started the article by mentioning planning for the “big stuff.” The big stuff, as you may know, includes the lab’s highest-priority project in developing a new central campus. This project is called the Center for Integrated Engineering Research, to be located just west of Wilson Hall. It will consolidate engineering resources from across the site to most efficiently plan for, construct and operate the P5 science projects. The highest-priority Technical Campus project, called the Industrial Center Building Addition, is urgently needed to expand production capacity for the equipment required for future science projects. And lastly the Scientific Hostel, or guest house, for which plans are also under way, will complete the Central Campus theme to “eat-sleep-work to drive discovery.”

    See the full article here.

    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.


     
  • richardmitnick 3:08 pm on October 21, 2014
    Tags: Fermilab Scientific Computing, Particle Physics

    From FNAL: “Simulation in the 21st century” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Tuesday, Oct. 21, 2014
    V. Daniel Elvira, Scientific Computing Simulation Department head

    Simulation is not magic, but it can certainly produce the feeling. Although it can’t miraculously replace particle physics experiments, revealing new physics phenomena at the touch of a key, it can help scientists to design detectors for best physics at the minimum cost in time and money.

    This CMS simulated event was created using Geant4 simulation software. Image: CMS Collaboration

    CMS at CERN

    Geant4 is a detector simulation software toolkit originally created at CERN and currently developed by about 100 physicists and computer scientists from all around the world to model the passage of particles through matter and electromagnetic fields. For example, physicists use simulation to optimize detectors and software algorithms, with the goal of measuring, with utmost efficiency, the marks that previously unobserved particles predicted by new theories would leave in their experimental devices.

    Particle physics detectors are typically large and complex. Think of them as a set of hundreds of different shapes and materials. Particles coming from accelerator beams or high-energy collisions traverse the detectors, lose energy and transform themselves into showers of more particles as they interact with the detector material. The marks they leave behind are read by detector electronics and reconstructed by software into the original incident particles with their associated energies and trajectories.

    We wouldn’t even dream of starting detector construction, much less asking for the funding to do it, without simulating the detector geometry and magnetic fields, as well as the physics of the interactions of particles with detector material, in exquisite detail. One of the goals of simulation is to demonstrate that the proposed detector would do the job.

    Geant4 includes tools to represent the detector geometry by assembling elements of different shapes, sizes and material, as well as the mathematical expressions to propagate particles and calculate the details of the electromagnetic and nuclear interactions of particles with matter.
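
    To make the geometry-plus-stepping idea concrete, here is a deliberately tiny Python toy, in no way Geant4’s actual C++ API: the detector is a list of material layers, and a particle is stepped through them, depositing energy according to a per-material interaction strength. Layer names and loss fractions are invented for illustration.

        import random

        # The "detector": (material, mean fractional energy loss) per layer.
        LAYERS = [
            ("silicon tracker",      0.02),
            ("EM calorimeter",       0.60),
            ("hadronic calorimeter", 0.35),
            ("muon system",          0.03),
        ]

        def simulate(energy_gev, seed=1):
            # Step through the layers, sampling a random loss fraction per layer
            # (exponential around the material's mean, capped at 1).
            rng = random.Random(seed)
            deposits = {}
            for name, mean_loss in LAYERS:
                fraction = min(1.0, rng.expovariate(1.0 / mean_loss))
                deposits[name] = energy_gev * fraction
                energy_gev -= deposits[name]
            return deposits, energy_gev

        deposits, escaping = simulate(100.0)
        for name, e in deposits.items():
            print(f"{name:22s} {e:6.1f} GeV")
        print(f"{'escaping':22s} {escaping:6.1f} GeV")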

    Geant4 is the current incarnation of Geant (Geometry and Tracking, or “giant” in French). It has become extremely popular for physics, medical and space science applications and is the tool of choice for high-energy physics, including CERN’s LHC experiments and Fermilab’s neutrino and muon programs.

    The Fermilab Scientific Computing Simulation Department (SCS) has grown a team of Geant4 experts that participate actively in its core development and maintenance, offering detector simulation support to experiments and projects within Fermilab’s scientific program. The focus of our team is on improving physics and testing tools, as well as time and memory performance. The SCS team also spearheads an exciting R&D program to re-engineer the toolkit to run on modern computer architectures.

    New-generation machines containing chips called coprocessors, or graphics processing units such as those used in game consoles or smart phones, may be used to speed execution times significantly. Software engineers do this by exploiting the benefits of the novel circuit design of the chips, as well as by using parallel programming. For example, a program execution mode called “multi-threading” would allow us to simulate particles from showers of different physics collisions simultaneously by submitting these threads to the hundreds or thousands of processor cores contained within these novel computer systems.
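
    That parallelism can be sketched generically (plain Python worker processes here, standing in for Geant4’s multi-threaded event loop): each simulated collision is independent, so events can be farmed out to as many cores as are available.

        from concurrent.futures import ProcessPoolExecutor
        import random

        def simulate_event(seed):
            # Stand-in for the full shower simulation of one collision.
            rng = random.Random(seed)
            return sum(rng.gauss(1.0, 0.1) for _ in range(10_000))

        if __name__ == "__main__":
            # Independent events, so a pool of workers simulates many at once.
            with ProcessPoolExecutor() as pool:
                energies = list(pool.map(simulate_event, range(100)))
            print(f"simulated {len(energies)} events in parallel")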

    As the high-energy community builds, commissions and runs the experiments of the first half of the 21st century, a world of exciting and promising possibilities is opening in the field of simulation and detector modeling. Our Fermilab SCS team is at the forefront of this effort.

    See the full article here.

    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.


     