Tagged: Particle Physics

  • richardmitnick 7:43 am on July 17, 2016
    Tags: Brian Eno, Music, Particle Physics

    From Nautilus: “Brian Eno Plays the Universe” 

    Nautilus

    July 7, 2016
    Stephon Alexander
    Illustration by Ric Carrasquillo


    A physicist explains what the composer has in common with the dawn of the cosmos.

    Everyone had his or her favorite drink in hand. There were bubbles and deep reds, and the sound of ice clinking in cocktail glasses underlay the hum of contented chatter. Gracing the room were women with long hair and men dressed in black suits, with glints of gold necklaces and cuff links. But it was no Gatsby affair. It was the annual Imperial College quantum gravity cocktail hour. Like the other eager postdocs, I saw this informal meeting as an opportunity to mingle with some of the top researchers in quantum gravity and hopefully ignite a collaboration, with a drink to soothe our nerves. But for me this party would provide a chance encounter that encouraged me to connect music with the physics of the early universe.

    The host was dressed down in black from head to toe—black turtleneck, jeans, and trench coat. On my first day as a postdoctoral student at Imperial College, I had spotted him at the end of a long hallway in the theoretical physics wing of Blackett Lab. With jet-black wild hair, beard, and glasses, he definitely stood out. I said, “Hi,” as he walked by, curious who he was, and with his “How’s it going?” response, I had him pegged. “You from New York?” I asked. He was.

    My new friend was Lee Smolin, one of the fathers of a theory known as loop quantum gravity, and he was in town considering a permanent job at Imperial. Along with string theory, loop quantum gravity is one of the most compelling approaches to unifying Einstein’s general relativity with quantum mechanics. As opposed to string theory, which says that the stuff in our universe is made up of fundamental vibrating strings, loop quantum gravity focuses on space itself as a woven network of loops of the same size as the strings in string theory.

    Lee had offered up his West Kensington flat for the quantum gravity drinks that evening to give the usual annual host, Faye Dowker, a break. Faye enjoyed being the guest lecturer that evening. Bespectacled and brilliant, she was also a quantum gravity pioneer. As a postdoc, Professor Dowker studied under Stephen Hawking, working on wormholes and quantum cosmology, but her specialty later shifted to causal set theory. After a couple of hours, the contented chatter gave way to Faye as she presented her usual crystal-clear exposition of causal sets as an alternative to strings and loops. Like loop quantum gravity, causal sets are less about the stuff in the universe and more about the structure of spacetime itself. But instead of being woven out of loops, spacetime is described by a discrete structure that is organized in a causal way. The causal-set approach envisions the structure of spacetime as analogous to sand on a beach. If we view the beach from afar, we see a uniform expanse of sand. But as we zoom in, we can discern the individual sand grains. In a causal set, spacetime, like a beach made up of sand, is composed of granular “atoms” of spacetime.

    Scattered into the quantum gravity mixer were those working primarily on string theory, like the American theorist Kellogg Stelle, a pioneer of p-branes and one of my postdoc advisors. In mathematics, a membrane is a two-dimensional extended object—that is, it takes up space. A p-brane is a similar object in higher dimensions.

    The strings of string theory can collectively end on p-branes. And coming at quantum gravity from yet another route, there was Chris Isham, the philosophical topos theory man who played with mathematical entities that only “partially exist.” Postdocs studying all avenues of quantum gravity filled in the gaps between the big brains in the room. It wasn’t exactly a gathering of humble intellect. It was scenes like that that made me feel I didn’t have the chops, or the focus, to sit behind a desk in a damp office manipulating mathematical symbols for hours like the others. Fortunately, Chris had shown he believed in my ability to contribute to cosmology by encouraging me to get out of the office and get more involved with my music. Working on physics ideas and calculations between sets at the jazz dives of Camden Town, I tried hard to believe that the music would give me a creative edge in my research. But something was about to change.

    While Faye gave her living-room lecture, I homed in on someone else I had noticed throughout the evening. Dressed in black like Lee, he had a strong face and a gold tooth that shone every time someone engaged him in conversation. The way he listened to Faye, with such focus, I assumed he was a hardcore Russian theorist. It turned out he had come with Lee. When Lee noticed I was still hanging around after the talk, he invited me to join them as he walked his gold-toothed friend back to his studio in Notting Hill Gate. I was curious what research this friend was going to churn up and what school of quantum gravity he’d slot into. I had to work to keep pace with the animated duo as we walked along well-lit high streets, dipping in and out of dark London mews. This guy was no regular physicist, I soon realized. Their conversation was unlike any I had heard. It started with the structure of spacetime and the relativity of time and space according to Einstein. That wasn’t the strange part. Soon, they were throwing around commentary on the mathematics of waves and somehow kept coming back to music. This gold-toothed wonder was getting more intriguing by the minute.

    That was my first encounter with Brian Eno. Once we reached his studio, we exchanged phone numbers, and he generously lent me one of his bikes—indefinitely. At the time, I didn’t know who Brian was, but that changed a week later when I told a friend and band member about him. Tayeb, a gifted British-Algerian bassist and oud player (the oud is an Arabic stringed instrument), was at first dumbfounded by my shameful ignorance. “Bloody hell, Stephon … you met the master.”

    Brian Eno, a former member of the English rock band Roxy Music, established himself early on as a great innovator of music. He was part of the art rock and glam rock movement, when rock ‘n’ roll took on a new sound by incorporating classical and avant-garde influences. The rocker look was dressed up with flamboyant clothes, funky hair, and bright makeup: think Lou Reed, Iggy Pop, and David Bowie. Brian was the band’s synthesizer guru, with the power to program exquisite sounds. The beauty of synthesizers in those days lay in their complexity: one had to program them, unlike today’s synthesizers with preset sounds at the touch of a button. Popularity hit Roxy Music hard and fast, and Eno promptly had enough of it; he left the band, and his career continued to flourish. He produced the Talking Heads and U2 and went on to collaborate with and produce greats such as Paul Simon, David Bowie, and Coldplay. He also continued working with synthesizers and emerged as the world’s leading programmer of the legendary Yamaha DX7 synthesizer.

    I wondered why an artist like Brian would be interested in matters of spacetime and relativity. The more I got to know him, the clearer it became that it wasn’t a time filler, or something he did for his health. What I was about to discover during my two years in London was that Brian was something I’ve come to call a “sound cosmologist.” He was investigating the structure of the universe, not inspired by music, but with music.

    Often he would make a comment in passing that would even impact my research in cosmology. We began meeting up regularly at Brian’s studio in Notting Hill. It became a pit stop on my way to Imperial. We’d have a coffee and exchange ideas on cosmology and instrument design, or simply veg out and play some of Brian’s favorite Marvin Gaye and Fela Kuti songs. His studio became the birthplace of my most creative ideas. Afterward, I’d head to Imperial, head buzzing, spirits high, motivated to continue my work on calculations or discussions on research and publications with fellow theorists.

    One of the most memorable and influential moments in my physics research occurred one morning when I walked into Brian’s studio. Normally, Brian was working on the details of a new tune—getting his bass sorted out just right for a track, getting a line just slightly behind the beat. He was a pioneer of ambient music and a prolific installation artist.


    Eno described his work in the liner notes for his record Ambient 1: Music for Airports: “Ambient music must be able to accommodate many levels of listening attention without enforcing one in particular; it must be as ignorable as it is interesting.” What he sought was a music of tone and atmosphere, rather than music that demanded active listening. But creating an easy listening track is anything but easy, so he often had his head immersed in meticulous sound analysis.

    That particular morning, Brian was manipulating waveforms on his computer with an intimacy that made it feel as if he were speaking Wavalian, some native tongue of sound waves. What struck me was that Brian was playing with, arguably, the most fundamental concept in the universe—the physics of vibration. To quantum physicists, particles are described by the physics of vibration. And to quantum cosmologists, vibrations of fundamental entities such as strings could possibly be the key to the physics of the entire universe. The quantum scales those strings play are, unfortunately, terribly intangible, both mentally and physically, but there it was in front of me—sound—a tangible manifestation of vibration. This was by no means a new link I was making, but it made me start to think about its effect on my research and the question Robert Brandenberger had put to me: How did structure in our universe form?

    Sound is a vibration that pushes a medium, such as air or something solid, to create traveling waves of pressure. Different sounds create different vibrations, which in turn create different pressure waves. We can draw pictures of these waves, called waveforms. A key point in the physics of vibrations is that every wave has a measurable wavelength and height. With respect to sound, the wavelength dictates the pitch, high or low, and the height, or amplitude, describes the volume.

    If something is measurable, such as the length and height of waves, then you can give it a number. If you can put a number to something, then you can add more than one of them together, just by adding numbers together. And that’s what Brian was doing—adding up waveforms to get new ones. He was mixing simpler waveforms to make intricate sounds.

    To physicists, this notion of adding up waves is known as the Fourier transform. It’s an intuitive idea, clearly demonstrated by dropping stones in a pond. If you drop a stone in a pond, a circular wave of a definite frequency radiates from the point of contact. If you drop another stone nearby, a second circular wave radiates outward, and the waves from the two stones start to interfere with each other, creating a more complicated wave pattern. What is incredible about the Fourier idea is that any waveform can be constructed by adding waves of the simplest form together. These simple “pure waves” are ones that regularly repeat themselves.
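
    To make the wave-mixing idea concrete, here is a minimal sketch in Python (my own illustration, not Eno’s actual process): summing a handful of pure sine waves, the simplest repeating waveforms, yields a new and more intricate one.

    ```python
    import numpy as np

    # Fourier synthesis in miniature: add pure sine waves (odd harmonics,
    # with amplitudes falling as 1/n) and a square-ish wave emerges.
    sample_rate = 44100                         # audio samples per second
    t = np.arange(0, 1.0, 1.0 / sample_rate)    # one second of time points
    fundamental = 220.0                         # Hz; sets the pitch

    waveform = np.zeros_like(t)
    for n in (1, 3, 5, 7, 9):                   # a handful of pure waves
        waveform += np.sin(2 * np.pi * n * fundamental * t) / n

    # 'waveform' now looks far more angular than any of its ingredients.
    # Run in reverse, the Fourier transform recovers the recipe:
    spectrum = np.fft.rfft(waveform)            # peaks at 220, 660, 1100 Hz, ...
    ```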

    Linked by the physics of vibration, Brian Eno and I bonded. I began to view Fourier transforms in physics from the perspective of a musician mixing sound, seeing them as an avenue for creativity. The bicycle Brian lent me became the wheels necessary to get my brain from one place to another faster. For months, the power of interdisciplinary thought was my adrenaline. Music was no longer just an inspiration, not just a way to flex my neural pathways, it was absolutely and profoundly complementary to my research. I was enthralled by the idea of decoding what I saw as the Rosetta stone of vibration—there was the known language of how waves create sound and music, which Eno was clearly skilled with, and then there was the unclear vibrational message of the quantum behavior in the early universe and how it has created large-scale structures. Waves and vibration make up the common thread, but the challenge was to link them in order to draw a clearer picture of how structure is formed and, ultimately, us.

    Among the many projects Brian was working on at the time was one he called “generative music.” In 1994 Brian launched generative music to a studio full of baffled journalists and released the first generative software piece at the same time. The generative music idea that came to fruition about a decade later was an audible version of a moiré pattern. Recall our pond ripples interfering to create complex patterns. These are moiré patterns, created by overlapping identical repeating patterns, and there are an infinite variety of them. Instead of two pebbles creating waves, generative music rested on the idea of two beats, played back at different speeds. Allowed to play forward in time, simple beat inputs led to beautiful and impressive complexity—an unpredictable and endless landscape of audible patterns. It is “the idea that it’s possible to think of a system or a set of rules which once set in motion will create music for you … music you’ve never heard before.” Brian’s first experiment with moiré patterns was Discreet Music, which was released in 1975. It remains a big part of his longer ambient compositions such as Lux, a studio album released in 2012. Music becomes uncontrolled, unrepeatable, and unpredictable, very unlike classical music. The issue becomes which inputs you choose. What beats? What sounds?
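
    A minimal sketch of that idea, with made-up numbers rather than anything from Eno’s actual software: two identical click loops played back at slightly different speeds drift against each other, so the combined rhythm keeps evolving long after the inputs are set in motion.

    ```python
    # Two copies of the same pattern at slightly different playback speeds:
    # an audible moire. The offset between the loops grows steadily, so the
    # combined pattern of clicks never settles into a short repeating cycle.
    period_a = 1.00   # seconds between clicks in loop A (illustrative)
    period_b = 1.03   # loop B runs 3% slower

    for beat in range(8):
        t_a = beat * period_a
        t_b = beat * period_b
        print(f"beat {beat}: A clicks at {t_a:5.2f}s, B at {t_b:5.2f}s, drift {t_b - t_a:+.2f}s")
    ```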

    What I began to see was a close link between the physics underlying the first moments of the cosmos—how an empty featureless universe matured to have the rich structures that we see today—and Brian’s generative music. When I walked up to him one morning as he was manipulating the waveforms, he looked at me with a smile and said, “You see, Stephon, I’m trying to design a simple system that will generate an entire composition when activated.”

    A light bulb flickered in my brain. I began to seriously consider the hypothesis that the infant universe was not featureless, and its original radiation energy resonated like simple sound waves, much like Brian’s FM synthesizer. In this sense, our universe behaved like the ultimate generative music composition. The initial vibration of the energy fields sonified throughout the spacetime background like the vibrating body of an instrument, generating the first structure in our cosmos and then the first stars and eventually us.

    See the full article here.

    By Stephon Alexander

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

     
  • richardmitnick 1:14 pm on July 14, 2016
    Tags: How to Make an Intense Gamma-Ray Beam, Particle Physics

    From Physics: “Focus: How to Make an Intense Gamma-Ray Beam” 

    Physics

    May 6, 2016 [This just showed up in social media.]
    Philip Ball

    Computer simulations show that blasting plastic with strong laser pulses could produce gamma rays with unprecedented intensity, good for fundamental physics experiments and possibly cancer treatments.

    Squeezing gamma rays from plastic. Calculated electron density (green) and magnetic field strength (yellow-orange-blue) inside a structured plastic target 300 femtoseconds after being irradiated with an intense laser pulse from the left-hand side.

    Intense beams of gamma rays would find a host of uses in fundamental physics research, nuclear fusion, and medicine, but they are hard to produce. A team has now used computer simulations to show that a powerful laser hitting a plastic surface can generate intense gamma-ray emission. In the simulations, the laser light creates a plasma in the plastic and accelerates electrons enough to produce large numbers of gamma-ray photons. The researchers say that the system might work with current technology.

    In extreme astrophysical environments, such as near a supermassive black hole, matter and antimatter (electrons and positrons) regularly annihilate, producing gamma rays. Researchers would like to study the reverse process by colliding beams of gamma rays, which should create electrons and positrons, a transformation of light into matter [1]. Gamma-ray beams could also enable a wide range of other fundamental experiments and might have a role in radiation therapy and radiosurgery [2]. Previous attempts to make these beams involved the interaction of a laser beam with an electron beam [3]. But to produce copious gamma-ray photons with energies in the MeV range, the laser beam would need to be more intense than any current device.

    Alexey Arefiev at the University of Texas at Austin and his co-workers now propose a different method that requires somewhat less laser power. It involves shining pulses of a petawatt (10¹⁵ W) infrared laser onto a carbon-rich plastic target. The power density of such a pulsed laser can reach around 5×10²² W/cm², which is about 500 times greater than would be produced by focusing all of the sunlight reaching the Earth onto a pencil tip.
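
    That comparison is easy to sanity-check with round numbers of my own choosing (a solar constant of 1.36 kW/m² and a half-millimetre pencil tip; neither figure is from the article):

    ```python
    import math

    # Total sunlight intercepted by the Earth, focused onto a pencil tip,
    # versus the simulated laser intensity of 5e22 W/cm^2.
    solar_constant = 1.36e3                      # W/m^2 above the atmosphere
    earth_cross_section = math.pi * 6.371e6**2   # m^2 disc facing the Sun
    total_sunlight = solar_constant * earth_cross_section   # ~1.7e17 W

    tip_area = math.pi * 0.025**2                # cm^2, for a 0.5 mm tip
    sunlight_on_tip = total_sunlight / tip_area  # ~9e19 W/cm^2

    print(5e22 / sunlight_on_tip)                # ~570, i.e. "about 500 times"
    ```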

    In the team’s scenario, the laser pulse heats the target, creating a plasma of electrons and ions. The electrons in the plasma reach high energies, and according to special relativity, they acquire a large effective mass, making them too sluggish to follow the oscillations of the laser’s electromagnetic field. This effect renders the plasma transparent to the light, so the laser beam can penetrate tens of micrometers into the target, filling it with a dense plasma.
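
    The standard way to quantify that transparency threshold, not spelled out in the article, is the plasma “critical density”: light can only propagate where the electron density stays below it, and relativistically heavy electrons raise the cutoff. A rough sketch with textbook constants:

    ```python
    import math

    # Critical density n_c = eps0 * m_e * omega^2 / e^2 for light of angular
    # frequency omega. Electrons with effective mass gamma*m_e push the
    # cutoff up by ~gamma, letting light into an otherwise opaque plasma.
    eps0, m_e, e, c = 8.854e-12, 9.109e-31, 1.602e-19, 2.998e8   # SI units

    wavelength = 1.0e-6                      # m; a typical infrared laser (assumed)
    omega = 2 * math.pi * c / wavelength     # ~1.9e15 rad/s

    n_c = eps0 * m_e * omega**2 / e**2       # ~1.1e27 m^-3 (1.1e21 cm^-3)
    gamma = 10.0                             # illustrative Lorentz factor
    print(f"classical cutoff {n_c:.1e} m^-3, relativistic ~{gamma * n_c:.1e} m^-3")
    ```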

    In the team’s simulations, the laser pulse pushes the electrons in this plasma forward, like a leaf blower propelling leaves, and this motion of charge sets up a strong magnetic field that curls around the axis of the laser beam. This field accelerates the electrons forward even more but along zigzag trajectories as they move through the plastic. This electron motion generates so-called synchrotron radiation of very high energy (gamma-ray photons) that is emitted from the rear of the plastic target in the direction of the laser beam.

    The simulations that Arefiev and colleagues conducted using the Stampede supercomputer at the University of Texas showed that the method works in principle—but also that propagation of the laser pulse into the target may quickly become unstable and deviate from its original direction.

    Dell Poweredge U Texas Austin Stampede Supercomputer. Texas Advanced Computer Center 9.6 PF

    Then the electrons are pushed in random directions, and the multi-MeV photons they emit exit in an uncoordinated spray. “It’s similar to what happens to a garden hose that, when left unattended, can spray water uncontrollably everywhere,” says Arefiev.

    To provide a stabilizing “hand on the hose,” the researchers propose adapting the structure of the target. They simulated the same process for a target containing a cylindrical channel of lower density—a plastic foam, say, surrounded by the denser material. When the laser pulse hits this channel, the resulting plasma becomes completely transparent, whereas the surrounding material is opaque. So the channel funnels the laser pulse and ensures that it stays on course, producing a narrow, intense beam of forward-moving gamma rays.

    “We hope that our results will motivate experimentalists to test our predictions,” says Arefiev. The conditions they have simulated “might be within reach for existing laser facilities.”

    Donald Umstadter of the University of Nebraska at Lincoln, who works on new laser technologies, says that the gamma-ray beam could be used to study nuclear weapons materials that are relevant for managing the large stockpile of obsolete warheads. However, he also foresees many potential engineering difficulties in putting the idea into practice.

    The expected magnetic field would be 10 times stronger than that of any previous laser plasma, says Tony Bell of the University of Oxford, UK. But “the simulations are credible,” he says. “If it is successful, the beam of gamma rays thus generated would be extraordinarily intense.”

    This research is published in Physical Review Letters.

    References

    1. X. Ribeyre, E. d’Humières, O. Jansen, S. Jequier, V. T. Tikhonchuk, and M. Lobet, “Pair Creation in Collision of γ-Ray Beams Produced with High-Intensity Lasers,” Phys. Rev. E 93, 013201 (2016).
    2. K. J. Weeks, V. N. Litvinenko, and J. M. J. Madey, “The Compton Backscattering Process and Radiotherapy,” Med. Phys. 24, 417 (1997); B. Girolami, B. Larsson, M. Preger, C. Schaerf, and J. Stepanek, “Photon Beams for Radiosurgery Produced by Laser Compton Backscattering from Relativistic Electrons,” Phys. Med. Biol. 41, 1581 (1996).
    3. S. Chen et al., “MeV-Energy X Rays from Inverse Compton Scattering with Laser-Wakefield Accelerated Electrons,” Phys. Rev. Lett. 110, 155003 (2013).

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Physicists are drowning in a flood of research papers in their own fields and coping with an even larger deluge in other areas of physics. How can an active researcher stay informed about the most important developments in physics? Physics highlights a selection of papers from the Physical Review journals. In consultation with expert scientists, the editors choose these papers for their importance and/or intrinsic interest. To highlight these papers, Physics features three kinds of articles: Viewpoints are commentaries written by active researchers, who are asked to explain the results to physicists in other subfields. Focus stories are written by professional science writers in a journalistic style and are intended to be accessible to students and non-experts. Synopses are brief editor-written summaries. Physics provides a much-needed guide to the best in physics, and we welcome your comments (physics@aps.org).

     
  • richardmitnick 10:21 am on July 14, 2016
    Tags: Particle Physics

    From particlebites: “The dawn of multi-messenger astronomy: using KamLAND to study gravitational wave events GW150914 and GW151226” 

    particlebites

    July 13, 2016
    Eve Vavagiakis

    Article: Search for electron antineutrinos associated with gravitational wave events GW150914 and GW151226 using KamLAND
    Authors: KamLAND Collaboration
    Reference: arXiv:1606.07155

    After the chirp heard ‘round the world, the search is on for coincident astrophysical particle events to provide insight into the source and nature of the era-defining gravitational wave events detected by the LIGO Scientific Collaboration in late 2015.


    By combining information from gravitational wave (GW) events with the detection of astrophysical neutrinos and electromagnetic signatures such as gamma-ray bursts, physicists and astronomers are poised to draw back the curtain on the dynamics of astrophysical phenomena, and we’re surely in for some surprises.

    The first recorded gravitational wave event, GW150914, was likely a merger of two black holes that took place more than one billion light-years from Earth. The event’s name marks the day it was observed by the Advanced Laser Interferometer Gravitational-wave Observatory (LIGO), September 14th, 2015.

    Two black holes spiral in towards one another and merge to emit a burst of gravitational waves that Advanced LIGO can detect. Source: APS Physics.

    Caltech/MIT Advanced aLigo Hanford, WA, USA installation
    Caltech/MIT Advanced aLigo detector installation Livingston, LA, USA

    LIGO detections are named “GW” for “gravitational wave,” followed by the observation date in YYMMDD format. The second event, GW151226 (December 26th, 2015) was likely another merger of two black holes, having 8 and 14 times the mass of the sun, taking place 1.4 billion light years away from Earth.

    A computer simulation from LIGO depicts what the collision of two black holes would look like if we could get close enough to the merger. It was created by solving equations from Albert Einstein’s general theory of relativity using the LIGO data. (Source: LIGO Lab, Caltech/MIT).

    A third gravitational wave event candidate, LVT151012, a possible black hole merger which occurred on October 12th, 2015, did not reach the same detection significance as the aforementioned events, but still has a >50% chance of astrophysical origin. LIGO candidates are named differently than detections. The names start with “LVT” for “LIGO-Virgo Trigger,” but are followed by the observation date in the same YYMMDD format. The different name indicates that the event was not significant enough to be called a gravitational wave.
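
    The convention is mechanical enough to decode in a few lines; this toy parser is purely illustrative:

    ```python
    from datetime import datetime

    # Toy decoder for the naming scheme described above: "GW" marks a
    # confirmed detection, "LVT" a LIGO-Virgo Trigger (candidate), each
    # followed by the UTC observation date as YYMMDD.
    def decode(name: str) -> str:
        kind, date_str = (("detection", name[2:]) if name.startswith("GW")
                          else ("candidate", name[3:]))
        date = datetime.strptime(date_str, "%y%m%d").strftime("%B %d, %Y")
        return f"{name}: {kind} observed on {date}"

    for event in ("GW150914", "GW151226", "LVT151012"):
        print(decode(event))
    ```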

    Observations from other scientific collaborations can search for particles associated with these gravitational waves. The combined information from the gravitational wave and particle detections could identify the origin of these gravitational wave events. For example, some violent astrophysical phenomena emit not only gravitational waves, but also high-energy neutrinos. Conversely, there is currently no known mechanism for the production of either neutrinos or electromagnetic waves in a black hole merger.

    Black holes with rapidly accreting disks can be the origin of gamma-ray bursts and neutrino signals, but these disks are not expected to be present during mergers like the ones detected by LIGO. For this reason, it was surprising when the Fermi Gamma-ray Space Telescope reported a coincident gamma-ray burst occurring 0.4 seconds after the September GW event with a false alarm probability of 1 in 455. Although there is some debate in the community about whether or not this observation is to be believed, the observation motivates a multi-messenger analysis including the hunt for associated astrophysical neutrinos at all energies.

    Could a neutrino experiment like KamLAND find low energy antineutrino events coincident with the GW events, even when higher energy searches by IceCube and ANTARES did not?

    Schematic diagram of the KamLAND detector. Source: hep-ex/0212021v1

    KamLAND, the Kamioka Liquid scintillator Anti-Neutrino Detector, is located under Mt. Ikenoyama, Japan, buried beneath the equivalent of 2,700 meters of water. It consists of an 18-meter-diameter stainless steel sphere, the inside of which is covered with photomultiplier tubes, surrounding an EVOH/nylon balloon enclosed by pure mineral oil. Inside the balloon resides 1 kton of highly purified liquid scintillator. Outside the stainless steel sphere is a cylindrical 3.2 kton water-Cherenkov detector that provides shielding and enables cosmic ray muon identification.

    KamLAND is optimized to search for ~MeV neutrinos and antineutrinos. The detection of the gamma-ray burst by the Fermi telescope suggests that the detected black hole merger might have retained its accretion disk, and the spectrum of accretion-disk neutrinos around a single black hole is expected to peak around 10 MeV. KamLAND therefore searched for correlations between the LIGO GW events and ~10 MeV electron antineutrino events occurring within a 500-second window of the merger events. Researchers focused on the detection of electron antineutrinos through the inverse beta decay reaction.
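
    For a sense of the energies involved, here is the rough kinematics of that reaction (an electron antineutrino hitting a proton to produce a positron and a neutron), using standard particle masses rather than figures from the KamLAND paper:

    ```python
    # Inverse beta decay in round numbers: the positron carries away roughly
    # the antineutrino energy minus the 1.293 MeV neutron-proton mass
    # difference, and the reaction only proceeds above a ~1.8 MeV threshold.
    DELTA = 1.293       # MeV, m_n - m_p
    THRESHOLD = 1.806   # MeV, minimum antineutrino energy

    def positron_energy(e_nu: float) -> float:
        """Approximate positron energy in MeV, ignoring neutron recoil."""
        if e_nu < THRESHOLD:
            raise ValueError("below the inverse-beta-decay threshold")
        return e_nu - DELTA

    print(positron_energy(10.0))   # ~8.7 MeV for a ~10 MeV accretion-disk neutrino
    ```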

    No events were found within the target window of any gravitational wave event, and any adjacent event was consistent with background. KamLAND researchers used this information to determine a monochromatic fluence (time integrated flux) upper limit, as well as an upper limit on source luminosity for each gravitational wave event, which places a bound on the total energy released as low energy neutrinos during the merger events and candidate event. The lack of detected concurrent inverse beta decay events supports the conclusion that GW150914 was a black hole merger, and not another astrophysical event such as a core-collapse supernova.

    More information would need to be obtained to explain the gamma ray burst observed by the Fermi telescope, and work to improve future measurements is ongoing. Large uncertainties in the origin region of gamma ray bursts observed by the Fermi telescope will be reduced, and the localization of GW events will be improved, most drastically so by the addition of a third LIGO detector (LIGO India).

    As Advanced LIGO continues its operation, there will likely be many more chances for KamLAND and other neutrino experiments to search for coincidence neutrinos. Multi-messenger astronomy has only just begun to shed light on the nature of black holes, supernovae, mergers, and other exciting astrophysical phenomena — and the future looks bright.

    U Wisconsin ICECUBE neutrino detector at the South Pole
    IceCube neutrino detector interior

    ANTARES

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    What is ParticleBites?

    ParticleBites is an online particle physics journal club written by graduate students and postdocs. Each post presents an interesting paper in a brief format that is accessible to undergraduate students in the physical sciences who are interested in active research.

    The papers are accessible on the arXiv preprint server. Most of our posts are based on papers from hep-ph (high energy phenomenology) and hep-ex (high energy experiment).

    Why read ParticleBites?

    Reading a technical paper from an unfamiliar subfield is intimidating. It may not be obvious how the techniques used by the researchers really work or what role the new research plays in answering the bigger questions motivating that field, not to mention the obscure jargon! For most people, it takes years for scientific papers to become meaningful.

    Our goal is to solve this problem, one paper at a time. With each brief ParticleBite, you should not only learn about one interesting piece of current work, but also get a peek at the broader picture of research in particle physics.

    Who writes ParticleBites?

    ParticleBites is written and edited by graduate students and postdocs working in high energy physics. Feel free to contact us if you’re interested in applying to write for ParticleBites.

    ParticleBites was founded in 2013 by Flip Tanedo following the Communicating Science (ComSciCon) 2013 workshop.

    Flip Tanedo UCI Chancellor’s ADVANCE postdoctoral scholar in theoretical physics. As of July 2016, I will be an assistant professor of physics at the University of California, Riverside

    It is now organized and directed by Flip and Julia Gonski, with ongoing guidance from Nathan Sanders.

     
  • richardmitnick 12:51 pm on July 7, 2016
    Tags: Particle Physics, Testing calorimeters at CERN

    From ILC: “Practi-Cal” 

    Linear Collider Collaboration

    7 July 2016
    No writer credit found

    Testing, testing… calorimeters in the test beam at CERN.

    Better together: two technological prototypes of the high-granularity calorimeters for a future ILC detector have been tested together with particle beams at CERN in a combined mode. The Semi-Digital Hadronic CALorimeter (SDHCAL) prototype with its 48 layers and the Silicon Electromagnetic CALorimeter (SiECAL) with its 10 units, both part of the CALICE collaboration, spent two weeks taking data on the “H2” beam line at CERN’s SPS. The principal goal of this beam test was to validate the combined data acquisition (DAQ) system developed by the teams working on the two calorimeters. After a few problems that appeared during data taking were fixed, the DAQ system ran smoothly and both prototypes took common data. This is what they will have to do in the future to register electron-positron collisions at the ILC.

    Physicists and engineers from six countries participated in this beam test: Belgium, China, France, Japan, Korea and Spain. Future tests will focus on studying the common response of these two calorimeters to different kinds of particles. “The success of this combined test will certainly encourage other detectors proposed for the tracking system (Silicon and TPC detectors) to join the adventure…,” hopes Imad Laktineh, a professor at IN2P3’s Institut de Physique Nucléaire de Lyon who supervised the combined beam test.

    More about calorimeter test beams here and here.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    The Linear Collider Collaboration is an organisation that brings the two most likely candidates, the Compact Linear Collider Study (CLIC) and the International Linear Collider (ILC), together under one roof. Headed by former LHC Project Manager Lyn Evans, it strives to coordinate the research and development work that is being done for accelerators and detectors around the world and to take the linear collider project to the next step: a decision that it will be built, and where.

    Some 2000 scientists – particle physicists, accelerator physicists, engineers – are involved in the ILC or in CLIC, and often in both projects. They work on state-of-the-art detector technologies, new acceleration techniques, the civil engineering aspect of building a straight tunnel of at least 30 kilometres in length, a reliable cost estimate and many more aspects that projects of this scale require. The Linear Collider Collaboration ensures that synergies between the two friendly competitors are used to the maximum.


     
  • richardmitnick 3:40 pm on July 6, 2016
    Tags: Particle Physics, Quantum Color

    From Don Lincoln at FNAL: “Quantum Color” 


    FNAL Art Image by Angela Gonzales

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    From Don Lincoln, Fermilab

    Don Lincoln

    Published on Jun 17, 2016 [Just made it to social media]

    The strongest force in the universe is the strong nuclear force, and it governs the behavior of quarks and gluons inside protons and neutrons. The name of the theory that governs this force is quantum chromodynamics, or QCD. In this video, Fermilab’s Dr. Don Lincoln explains the intricacies of this dominant component of the Standard Model.

    Watch, enjoy, learn.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 1:02 pm on June 28, 2016
    Tags: Brookhaven E821 Muon (g-2), Particle Physics

    From Symmetry: “Preparing for their magnetic moment” 

    Symmetry

    06/28/16
    Andre Salles

    Cindy Arnold, Fermilab

    Scientists are using a plastic robot and hair-thin pieces of metal to ready a magnet that will hunt for new physics.

    Three summers ago, a team of scientists and engineers on the Muon g-2 experiment moved a 52-foot-wide circular magnet 3200 miles over land and sea. It traveled in one piece without twisting more than a couple of millimeters, lest the fragile coils inside irreparably break. It was an astonishing feat that took years to plan and immense skill to execute.

    FNAL Muon g-2 studio

    As it turns out, that was the easy part.

    The hard part—creating a magnetic field so precise that even subatomic particles see it as perfectly smooth—has been under way for seven months. It’s a labor-intensive process that has inspired scientists to create clever, often low-tech solutions to unique problems, working from a road map written 30 years ago as they drive forward into the unknown.

    The goal of Muon g-2 is to follow up on a similar experiment conducted at the US Department of Energy’s Brookhaven National Laboratory in New York in the 1990s.

    E821 Muon (g-2) At Brookhaven

    Scientists there built an extraordinary machine that generated a near-perfect magnetic field into which they fired a beam of particles called muons. The magnetic ring serves as a racetrack for the muons, and they zoom around it for as long as they exist—usually about 64 millionths of a second.

    That’s a blink of an eye, but it’s enough time to measure a particular property: the precession frequency of the muons as they hustle around the magnetic field. And when Brookhaven scientists took those measurements, they found something different from what the Standard Model, our picture of the universe, predicted they would find. They didn’t quite capture enough data to claim a definitive discovery, but the hints were tantalizing.

    Now, roughly two decades later, some of those same scientists—and dozens of others, from 34 institutions around the world—are conducting a similar experiment with the same magnet, but fed by a more powerful beam of muons at the US Department of Energy’s Fermi National Accelerator Laboratory in Illinois. Moving that magnet from New York caused quite a stir among the science-interested public, but that’s nothing compared with what a discovery from the Muon g-2 experiment would cause.

    “We’re trying to determine if the muon really is behaving differently than expected,” says Dave Hertzog of the University of Washington, one of the spokespeople of the Muon g-2 experiment. “And, if so, that would suggest either new particles popping in and out of the vacuum, or new subatomic forces at work. More likely, it might just be something no one has thought of yet. In any case, it’s all very exciting.”

    Shimming to reduce shimmy

    To start making these measurements, the magnetic field needs to be the same all the way around the ring so that, wherever the muons are in the circle, they will see the same pathway. That’s where Brendan Kiburg of Fermilab and a group of a dozen scientists, post-docs and students come in. For the past six months, they have been “shimming” the magnetic ring, shaping it to an almost inconceivably exact level.

    “The primary goal of shimming is to make the magnetic field as uniform as possible,” Kiburg says. “The muons act like spinning tops, precessing at a rate proportional to the magnetic field. If a section of the field is a little higher or a little lower, the muon sees that, and will go faster or slower.”
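
    To put a rough number on that precession (a textbook estimate with assumed values, not the collaboration’s own figures): in a field B, the spin drifts relative to the momentum at the “anomalous” frequency given by a_mu times e·B/m_mu, where a_mu = (g−2)/2.

    ```python
    import math

    # Anomalous precession frequency of a muon in the storage-ring field.
    # Any local wobble in B feeds straight into omega_a, which is why the
    # field must be shimmed so carefully.
    a_mu = 1.166e-3    # anomalous magnetic moment, (g-2)/2
    e = 1.602e-19      # C
    m_mu = 1.883e-28   # kg
    B = 1.45           # T, nominal g-2 storage-ring field (assumed here)

    omega_a = a_mu * e * B / m_mu                      # rad/s
    print(f"{omega_a / (2 * math.pi) / 1e3:.0f} kHz")  # ~230 kHz
    ```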

    Since the idea is to measure the precession rate to an extremely precise degree, the team needs to shape the magnetic field to a similar degree of uniformity. They want it to vary by no more than ten parts in a billion per centimeter. To put that in perspective, that’s like wanting a variation of no more than one second in nearly 32 years, or one sheet in a roll of toilet paper stretching from New York to London.

    How do they do this? First, they need to measure the field they have. With a powerful electromagnet that will affect any metal object inside it, that’s pretty tricky. The solution is a marriage of high-tech and low-tech: a cart made of sturdy plastic and quartz, moved by a pulley attached to a motor and continuously tracked by a laser. On this cart are probes filled with petroleum jelly, with sensors measuring the rate at which the jelly’s protons spin in the magnetic field.

    The laser can record the position of the cart to 25 microns, half the width of a human hair. Other sensors measure the distance from the top and bottom of the cart to the magnet, to the micron.

    “The cart moves through the field as it is pulled around the ring,” Kiburg says. “It takes between two and two-and-a-half hours to go around the ring. There are more than 1500 locations around the path, and it stops every three centimeters for a short moment while the field is precisely measured in each location. We then stitch those measurements into a full map of the magnetic field.”
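
    Those numbers hang together; a quick consistency check against the 52-foot magnet width quoted earlier (my arithmetic, not the article’s):

    ```python
    import math

    circumference = math.pi * 52 * 0.3048    # ~49.8 m around the ring
    stops = circumference / 0.03             # one field measurement every 3 cm
    print(round(stops))                      # ~1660, i.e. "more than 1500 locations"
    ```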

    Erik Swanson of the University of Washington is the run coordinator for this effort, meaning he directs the team as they measure the field and perform the manually intensive shimming. He also designed the new magnetic resonance probes that measure the field, upgrading them from the technology used at Brookhaven.

    “They’re functionally the same,” he says of the probes, “but the Brookhaven experiment started in the 1990s, and the old probes were designed before that. Any electronics that old, there’s the potential that they will stop working.”

    Swanson says that the accuracy to which the team has had to position the magnet’s iron pieces to achieve the desired magnetic field surprised even him. When scientists first turned the magnet on in October, the field, measured at different places around the ring, varied by as much as 1400 parts per million. That may seem smooth, but to a tiny muon it looks like a mountain range of obstacles. In order to even it out, the Muon g-2 team makes hundreds of minuscule adjustments by hand.



    Physical physics

    Stationed around the ring are about 1000 knobs that control the ways the field could become non-uniform. But when that isn’t enough, the field can be shaped by taking parts of the magnet apart and inserting extremely small pieces of steel called shims, adjusting the magnet’s geometry by thousandths of an inch.

    There are 12 sections of the magnet, and it takes an entire day to adjust just one of those sections.

    This process relies on simulations, calibrations and iterations, and with each cycle the team inches forward toward their goal, guided by mathematical predictions. Once they’re done with the process of carefully inserting these shims, some as thin as 12.5 microns, they reassemble the magnet and measure the field again, starting the process over, refining and learning as they go.

    “It’s fascinating to me how hard such a simple-seeming problem can be,” says Matthias Smith of the University of Washington, one of the students who helped design the plastic measuring robot. “We’re making very minute adjustments because this is a puzzle that can go together in multiple ways. It’s very complex.”

    His colleague Rachel Osofsky, also of the University of Washington, agrees. Osofsky has helped put in more than 800 shims around the magnet, and says she enjoys the hands-on and collaborative nature of the work.

    “When I first came aboard, I knew I’d be spending time working on the magnet, but I didn’t know what that meant,” she says. “You get your hands dirty, really dirty, and then measure the field to see what you did. Students later will read the reports we’re writing now and refer to them. It’s exciting.”

    Similarly, the Muon g-2 team is constantly consulting the work of their predecessors who conducted the Brookhaven experiment, making improvements where they can. (One upgrade that may not be obvious is the very building that the experiment is housed in, which keeps the temperature steadier than the one used at Brookhaven and reduces shape changes in the magnet itself.)

    Kiburg says the Muon g-2 team should be comfortable with the shape of the magnetic field sometime this summer. With the experiment’s beamline under construction and the detectors to be installed, the collaboration should be ready to start measuring particles by next summer. Swanson says that while the effort has been intense, it has also been inspiring.

    “It’s a big challenge to figure out how to do all this right,” he says. “But if you know scientists, when a challenge seems almost impossible, that’s the one we all go for.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 12:23 pm on June 28, 2016
    Tags: MAX IV synchrotron, Particle Physics

    From CERN: “Vacuum chambers full of ideas for the Swedish synchrotron” 


    CERN

    27 Jun 2016
    Corinne Pralavorio

    CERN’s Vacuum, Surfaces and Coatings group has contributed to the development of vacuum chambers for the MAX IV synchrotron, which has just been officially opened in Sweden.

    A section of the new 3 GeV MAX IV synchrotron at the time of installation. In the centre of the magnets you can see the vacuum chamber developed in collaboration with CERN. (Photo: Marek Grabski, MAX IV Vacuum group)

    On 21 June, the King and the Prime Minister of Sweden officially opened MAX IV, a brand-new synchrotron in Lund, Sweden. The summer solstice, the longest day of the year, was deliberately chosen for the ceremony: MAX IV, a cutting-edge synchrotron, will deliver the brightest X-rays ever produced to more than 2000 users.

    Some 1500 kilometres away, a team at CERN followed the opening ceremony with a touch of pride. The Vacuum, Surfaces and Coatings group in the Technology department (TE-VSC) participated in the construction of this new synchrotron. Its contribution lies at the very heart of the accelerator, in its vacuum chambers. The group developed the coating for most of the vacuum chambers in the larger of the two rings, which has a circumference of 528 metres and operates at an energy of 3 GeV.

    The CERN group was brought in to develop the coating for the vacuum chambers using NEG (Non-Evaporable Getter) material. A thin, micrometric layer of NEG ensures a high-grade vacuum: it traps residual gas molecules and limits the release of molecules generated by the bombardment of photons. The technology was developed at CERN in the late 1990s for the LHC: six kilometres of vacuum chambers in the LHC, i.e. those at ambient temperature, are coated with NEG material. CERN’s expertise in the field is therefore unique and recognised worldwide.

    Prototype of the surface treatment process, developed at CERN, to coat the vacuum chambers of the MAX IV synchrotron. (Photo: Pedro Costa Pinto/CERN)

    “The MAX IV design was very demanding, as the cross-section of the vacuum chambers is very small, just 2.4 centimetres compared to 8 cm at the LHC,” explains Paolo Chiggiato, TE-VSC group leader. “In addition, some parts were geometrically complex.” Synchrotron light is extracted to experimental areas every 26 metres. At the extraction point, the chamber comprises two tubes that gradually diverge.

    The CERN group began its involvement in the project in 2014 and developed the chemical surface treatment method used for almost all the vacuum chambers in the large ring of MAX IV. Treatment of the cylindrically symmetrical vacuum chambers was carried out by a European firm and a European institute, to which CERN had already transferred the technology in the past. The most complex chambers, around 120 in total, were treated at CERN. Two benches for sputtering, the coating technique used, were developed at CERN. “These benches are equipped with a wire whose material is deposited onto the surface of the chamber. For the MAX IV chambers, the wire had a diameter of 0.5 millimetres and its alignment was critical,” explains Mauro Taborelli, leader of the Surfaces, Chemistry and Coatings section in the TE-VSC group. “The procedure was all the more complicated because the extraction chambers, in which the photons are extracted, have a tiny vertical aperture, of around 1 millimetre,” confirms Pedro Costa Pinto, leader of the team responsible for the vacuum deposition process.

    The vacuum chambers were delivered in 2014 and 2015. “It’s essential for us to participate in these types of project, which require lots of ingenuity, to be able to maintain and build on our know-how,” says Paolo Chiggiato. “By developing our expertise in this way, we will be ready for new projects at CERN.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Meet CERN in a variety of places:

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    ALICE
    CMS
    LHCb

    LHC


    Quantum Diaries

     
  • richardmitnick 5:29 pm on June 26, 2016
    Tags: Particle Physics

    From Science Alert: “What’s the point of theoretical physics?” 

    Science Alert

    24 JUN 2016
    ALEXANDER LENZ

    Ahuli Labutin/Shutterstock.com

    You don’t have to be a scientist to get excited about breakthroughs in theoretical physics. Discoveries such as gravitational waves and the Higgs boson can inspire wonder at the complex beauty of the Universe no matter how little you really understand them.

    But some people will always question why they should care about scientific advances that have no apparent impact on their daily life – and why we spend millions funding them. Sure, it’s amazing that we can study black holes thousands of light-years away and that Einstein really was as much of a genius as we thought, but that won’t change the way most people live or work.

    Yet the reality is that purely theoretical studies in physics can sometimes lead to amazing changes in our society. In fact, several key pillars on which our modern society rests, from satellite communication to computers, were made possible by investigations that had no obvious application at the time.

    Around 100 years ago, quantum mechanics was a purely theoretical topic, only developed to understand certain properties of atoms. Its founding fathers such as Werner Heisenberg and Erwin Schrödinger had no applications in mind at all. They were simply driven by the quest to understand what our world is made of.

    Quantum mechanics states that you cannot observe a system without fundamentally changing it by your observation, and initially its effects on society were philosophical rather than practical.

    But today, quantum mechanics is the basis of our use of all semiconductors in computers and mobile phones. To build a modern semiconductor for use in a computer, you have to understand concepts such as the way electrons behave when atoms are held together in a solid material, something only described accurately by quantum mechanics.

    Without it, we would have been stuck using computers based on vacuum tubes.

    At a similar time as the key developments in quantum mechanics, Albert Einstein was attempting to better understand gravity, the dominating force of the universe.

    Rather than viewing gravity as a force between two bodies, he described it as a curving of space-time around each body, similar to how a rubber sheet will stretch if a heavy ball is placed on top of it. This was Einstein’s general theory of relativity.

    Today the most common application of this theory is in GPS. To use signals from satellites to pinpoint your location you need to know the precise time the signal leaves the satellite and when it arrives on Earth.

    Einstein’s theory of general relativity means that the distance of a clock from Earth’s centre of gravity affects how fast it ticks. And his theory of special relativity means that the speed a clock is moving at also affects its ticking speed.

    Without knowing how to adjust the clocks to take account of these effects, we wouldn’t be able to accurately use the satellite signals to determine our position on the ground. Despite his amazing brain, Einstein probably could not have imagined this application a century ago.
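
    The sizes of the two effects are nonetheless easy to estimate today. A back-of-envelope sketch with assumed round numbers for the GPS orbit (radius ~26,600 km, orbital speed ~3.9 km/s):

    ```python
    # Daily clock shifts for a GPS satellite, to leading order.
    GM = 3.986e14        # m^3/s^2, Earth's gravitational parameter
    c = 2.998e8          # m/s
    R_earth = 6.371e6    # m
    r_sat, v_sat = 2.66e7, 3.9e3   # m and m/s (assumed round values)
    day = 86400.0        # s

    # General relativity: higher in the gravity well, the clock runs FASTER.
    gr = (GM / c**2) * (1 / R_earth - 1 / r_sat) * day    # ~ +46 microseconds

    # Special relativity: the moving clock runs SLOWER.
    sr = -(v_sat**2 / (2 * c**2)) * day                   # ~ -7 microseconds

    print(f"net drift: {(gr + sr) * 1e6:+.0f} us/day")    # ~ +38 us/day if uncorrected
    ```

    Left uncorrected, a drift of tens of microseconds per day would translate into kilometres of position error, since light covers about 30 centimetres per nanosecond.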

    Scientific culture

    Aside from the potential, eventual applications of doing fundamental research, there are also direct financial benefits. Most of the students and post-docs working on big research projects like the Large Hadron Collider will not stay in academia but move into industry.

    During their time in fundamental physics, they are educated at the highest existing technical level and then take that expertise with them into industry. This is like educating car mechanics in Formula One racing teams.

    Despite these direct and indirect benefits, most theoretical physicists have a very different motive for their work. They simply want to improve humanity’s understanding of the Universe.

    While this might not immediately impact everyone’s lives, I believe it is just as important a reason for pursuing fundamental research.

    GPS: a relative success. Shutterstock

    This motivation may well have begun when humans first looked up at the night-sky in ancient times. They wanted to understand the world they lived in and so spent time watching nature and creating theories about it, many of them involving gods or supernatural beings.

    Today we have made huge progress in our understanding of both stars and galaxies and, at the other end of the scale, of the tiny fundamental particles from which matter is built.

    It somehow seems that every new level of understanding we achieve comes in tandem with new, more fundamental questions. It is never enough to know what we now know. We always want to continue looking behind newly arising curtains. In that respect, I consider fundamental physics a basic part of human culture.

    Now we can wait curiously to find out what unforeseen spin-offs discoveries such as the Higgs boson or gravitational waves might lead to in the long-term future. But we can also look forward to the new insights into the building blocks of nature that they will bring us, and the new questions they will raise.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 5:16 pm on June 26, 2016
    Tags: Particle Physics

    From particlebites: “Can’t Stop Won’t Stop: The Continuing Search for SUSY” 

    particlebites

    June 19, 2016
    Julia Gonski

    Presenting:

    Title: “Search for top squarks in final states with one isolated lepton, jets, and missing transverse momentum in √s = 13 TeV pp collisions with the ATLAS detector”
    Author: The ATLAS Collaboration
    Publication: Submitted 13 June 2016, arXiv:1606.03903

    Things at the LHC are going great. Run II of the Large Hadron Collider is well underway, delivering higher energies and more luminosity than ever before. ATLAS and CMS also have an exciting lead to chase down– the diphoton excess that was first announced in December 2015. So what does lots of new data and a mysterious new excess have in common? They mean that we might finally get a hint at the elusive theory that keeps refusing our invitations to show up: supersymmetry.

    Standard model of Supersymmetry DESY

    Figure 1: Feynman diagram of stop decay from proton-proton collisions.

    People like supersymmetry because it fixes a host of problems in the Standard Model. Most notably, it generates an extra Feynman diagram that cancels the quadratic divergence of the Higgs mass caused by the top quark contribution. This extra diagram comes from the stop squark, the supersymmetric partner of the top. So a natural SUSY solution would have a light stop, ideally with a mass somewhere close to the top mass of 175 GeV. This expected low mass, driven by “naturalness”, makes the stop a great place to start looking for SUSY. But according to the newest results from the ATLAS Collaboration, we’re not going to be so lucky.
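    To see what that cancellation looks like, here is a schematic numerical cartoon. The one-loop coefficients below are the commonly quoted textbook factors, and the exact cancellation displayed holds only in the idealised limit of unbroken supersymmetry, so treat this as an illustration rather than anything from the ATLAS paper:

```python
# Schematic illustration: the top-loop correction to the Higgs mass-squared
# grows with the cutoff Lambda, while a scalar "stop" with coupling y_t^2
# contributes with the opposite sign and cancels the quadratic growth.
import math

y_t = 0.99  # top Yukawa coupling, roughly sqrt(2) * m_top / 246 GeV
for lam_tev in [1, 10, 100]:
    cutoff = lam_tev * 1000.0  # cutoff scale in GeV
    top_loop  = -3 * y_t**2 / (8 * math.pi**2) * cutoff**2  # fermion loop
    stop_loop = +3 * y_t**2 / (8 * math.pi**2) * cutoff**2  # scalar partner loop
    print(f"Lambda = {lam_tev:>3} TeV: top loop = {top_loop:.2e} GeV^2, "
          f"top + stop = {top_loop + stop_loop:.1e} GeV^2")
```

    Without the stop, a 100 TeV cutoff already pushes the correction more than four orders of magnitude above the measured Higgs mass-squared of about 1.6 × 10⁴ GeV²; with it, the quadratic piece drops out.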

    Using the full 2015 dataset (about 3.2 fb⁻¹), ATLAS conducted a search for pair-produced stops, each decaying to a top quark and a neutralino, which in this case plays the role of the lightest supersymmetric particle. The top then decays as tops do, to a W boson and a b quark. The W can usually do what it wants, but here the analysis selects events where one W decays leptonically and the other decays to jets (leptons are easier to reconstruct, but have a lower branching ratio from the W, so it’s a trade-off). This whole process is shown in Figure 1. That gives a lepton from one W, jets from the other, and missing energy from the neutrino and the invisible neutralinos for a complete final state.
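    A key discriminating variable in one-lepton searches like this is the transverse mass built from the lepton and the missing energy (the distribution shown in Figure 2 below). The formula is standard, but the event numbers in this sketch are invented purely for illustration:

```python
# A minimal sketch of the transverse mass variable used to separate
# signal from W-like background in one-lepton searches.
import math

def transverse_mass(pt_lep, met, dphi):
    """m_T = sqrt(2 * pT(lepton) * MET * (1 - cos(delta phi)))."""
    return math.sqrt(2.0 * pt_lep * met * (1.0 - math.cos(dphi)))

# When all missing energy comes from a single W -> lepton + neutrino decay,
# m_T has a kinematic endpoint near the W mass (~80 GeV)...
print(transverse_mass(45.0, 40.0, math.pi))   # ~85 GeV, W-like background
# ...while stop decays add neutralinos to the missing energy, so signal
# events can populate the high-m_T tail that the analysis targets.
print(transverse_mass(60.0, 250.0, 2.5))      # ~230 GeV, well above the endpoint
```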

    Figure 2: Transverse mass distribution in one of the signal regions.

    The paper does report an excess in the data, with a significance of around 2.3 sigma. In Figure 2, you can see this excess overlaid with all the known background predictions, along with two possible signal models for various gluino and stop masses. This signal in the 700–800 GeV mass range sits right around the current limit for the stop, so it’s not entirely inconsistent with what we already know. While these sorts of excesses come and go a lot in particle physics, it’s certainly an exciting reason to keep looking.
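    If you want a feel for where a number like 2.3 sigma comes from, here is a toy counting-experiment estimate using the standard Poisson significance formula. The event yields are made up for illustration and are not taken from the paper:

```python
# Toy estimate of the significance of a counting excess over background.
import math

def poisson_significance(n_obs, b):
    """Z = sqrt(2 * (n * ln(n/b) - (n - b))), the standard
    log-likelihood-ratio significance for n observed events
    over an expected background b."""
    return math.sqrt(2.0 * (n_obs * math.log(n_obs / b) - (n_obs - b)))

# e.g. 12 observed events over an expected background of 5.5:
print(f"Z = {poisson_significance(12, 5.5):.1f} sigma")   # ~2.4 sigma
```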

    Figure 3 shows our current status with the stop and neutralino, using 8 TeV data. All the shaded regions are mass points for the stop and neutralino that physicists have already excluded at 95% confidence. So where do we go from here? You can see a sliver of white space on this plot that hasn’t been excluded yet. That region is tough to probe: because the mass splitting is so small, the neutralino emerges almost at rest, making it very hard to notice. It would be great to check out that parameter space, and there’s an effort underway to do just that. But at the end of the day, only more time (and more data) can tell.

    (P.S. This paper also reports a gluino search—too much to cover in one post, but check it out if you’re interested!)

    Figure 3: Limit curves for stop and neutralino masses, with the 8 TeV ATLAS dataset.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    What is ParticleBites?

    ParticleBites is an online particle physics journal club written by graduate students and postdocs. Each post presents an interesting paper in a brief format that is accessible to undergraduate students in the physical sciences who are interested in active research.

    The papers are accessible on the arXiv preprint server. Most of our posts are based on papers from hep-ph (high energy phenomenology) and hep-ex (high energy experiment).

    Why read ParticleBites?

    Reading a technical paper from an unfamiliar subfield is intimidating. It may not be obvious how the techniques used by the researchers really work or what role the new research plays in answering the bigger questions motivating that field, not to mention the obscure jargon! For most people, it takes years for scientific papers to become meaningful.

    Our goal is to solve this problem, one paper at a time. With each brief ParticleBite, you should not only learn about one interesting piece of current work, but also get a peek at the broader picture of research in particle physics.

    Who writes ParticleBites?

    ParticleBites is written and edited by graduate students and postdocs working in high energy physics. Feel free to contact us if you’re interested in applying to write for ParticleBites.

    ParticleBites was founded in 2013 by Flip Tanedo following the Communicating Science (ComSciCon) 2013 workshop.

    Flip Tanedo, UCI Chancellor’s ADVANCE postdoctoral scholar in theoretical physics. As of July 2016, I will be an assistant professor of physics at the University of California, Riverside.

    It is now organized and directed by Flip and Julia Gonski, with ongoing guidance from Nathan Sanders.

     
  • richardmitnick 4:47 pm on June 26, 2016 Permalink | Reply
    Tags: , , , Particle Physics   

    From astrobites: Particlebites reports on elastically decoupling dark matter 

    Astrobites bloc

    Today we feature a recent article from our sister site particlebites exploring a new theory of dark matter interactions.

    Source: Particlebites reports on elastically decoupling dark matter

    Below is an excerpt from a post by particlebites author Flip Tanedo.

    UCI Chancellor’s ADVANCE postdoctoral scholar in theoretical physics. As of July 2016, I will be an assistant professor of physics at the University of California, Riverside.

    Theoretical physicists have proposed a new theory for how dark matter interactions can explain the observed amount of dark matter in the universe today. This “elastically decoupling dark matter” framework is a hybrid of conventional and novel dark matter models.

    Presenting: Elastically Decoupling Dark Matter
    Authors: Eric Kuflik, Maxim Perelstein, Nicolas Rey-Le Lorier, Yu-Dai Tsai
    Reference: arXiv:1512.04545, Phys. Rev. Lett. 116, 221302 (2016)

    The particle identity of dark matter is one of the biggest open questions in physics. The simplest and most widely assumed explanation is that dark matter is a weakly-interacting massive particle (WIMP). Assuming that dark matter starts out in thermal equilibrium in the hot plasma of the early universe, the present cosmic abundance of WIMPs is set by the balance of two effects:

    When two WIMPs find each other, they can annihilate into ordinary matter. This depletes the number of WIMPs in the universe.
    The universe is expanding, making it harder for WIMPs to find each other.

    This process of “thermal freeze-out” leads to an abundance of WIMPs controlled by the dark matter mass and interaction strength. The term “weakly-interacting massive particle” comes from the observation that a dark matter particle with roughly the mass of the weak-force bosons, interacting through the weak nuclear force, gives the experimentally measured dark matter density today.
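    To make the freeze-out logic concrete, here is a toy numerical integration of the standard Boltzmann equation for the comoving abundance, in dimensionless units with x = mass/temperature. All normalisations are dropped and the annihilation strength “lam” is an arbitrary stand-in, so only the qualitative trend is meaningful; this is a sketch, not a calculation from the paper:

```python
# Toy freeze-out: integrate dY/dx = -(lam/x^2) * (Y^2 - Y_eq^2),
# where Y is the comoving abundance and Y_eq ~ x^(3/2) * e^(-x).
import math

def relic_abundance(lam, x_max=100.0, steps=50000):
    dx = (x_max - 1.0) / steps
    x = 1.0
    Y = math.exp(-1.0)                # start on the equilibrium curve at x = 1
    for _ in range(steps):
        Y_eq = x**1.5 * math.exp(-x)  # equilibrium abundance
        r = lam * dx / x**2
        # semi-implicit update (treat Y^2 as Y_old * Y_new), which stays
        # stable even when the annihilation term is stiff at small x
        Y = (Y + r * Y_eq**2) / (1.0 + r * Y)
        x += dx
    return Y

# Stronger annihilation keeps WIMPs in equilibrium longer, so fewer survive:
for lam in [1e2, 1e4, 1e6]:
    print(f"lam = {lam:.0e} -> relic abundance Y ~ {relic_abundance(lam):.2e}")
```

    The two competing effects in the list above are both visible here: the annihilation term depletes Y, while the 1/x² factor encodes how expansion shuts that depletion off.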

    Two ways for a new particle, X, to produce the observed dark matter abundance: (left) WIMP annihilation into Standard Model (SM) particles versus (right) SIMP 3-to-2 interactions that reduce the amount of dark matter.

    More recently, physicists noticed that dark matter with very strong interactions with itself (though not with ordinary matter) can produce the correct dark matter density in another way. These “strongly interacting massive particle” (SIMP) models regulate the amount of dark matter through 3-to-2 interactions that reduce the total number of dark matter particles, rather than through annihilation into ordinary matter.

    The authors of 1512.04545 have proposed an intermediate road that interpolates between these two types of dark matter: the “elastically decoupling dark matter” (ELDER) scenario. ELDERs have both of the above interactions: they can annihilate pairwise into ordinary matter, or sets of three ELDERs can turn into two ELDERs.

    Thermal history of ELDERs, adapted from 1512.04545.

    The cosmic history of these ELDERs is as follows:

    ELDERs are produced in the thermal bath immediately after the big bang.
    Pairs of ELDERs annihilate into ordinary matter. Like WIMPs, they interact weakly with ordinary matter.
    As the universe expands, the rate of annihilation into Standard Model particles falls below the rate at which the universe expands.
    Assuming that the ELDERs interact strongly amongst themselves, the 3-to-2 number-changing process still occurs. Because this process distributes the energy of 3 ELDERs in the initial state between 2 ELDERs in the final state, the two outgoing ELDERs carry more kinetic energy: they’re hotter. This turns out to largely counteract the cooling from the expansion of the universe (a quick kinematic check of this follows below).
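    That heating claim follows from energy conservation alone. Here is the quick kinematic check promised above, a minimal sketch assuming the three incoming ELDERs are nearly at rest (units with c = 1):

```python
# Back-of-the-envelope check of "3-to-2 heating": three nearly-at-rest
# ELDERs of mass m convert to two ELDERs, and energy conservation fixes
# the kinetic energy of each outgoing particle.
m = 1.0               # ELDER mass in arbitrary units (c = 1)
E_total = 3 * m       # initial energy: three particles, negligible kinetic energy
E_each = E_total / 2  # shared equally between the two outgoing ELDERs
ke_each = E_each - m  # kinetic energy per outgoing particle
print(f"Each outgoing ELDER carries kinetic energy {ke_each:.2f} * m")  # 0.50 * m
```

    Half a particle’s rest mass turns into kinetic energy in every 3-to-2 event, which is why the dark sector stays warm even as the surrounding universe cools.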

    The neat effect here is that the abundance of ELDERs is actually set by their interaction with ordinary matter, just like for WIMPs. However, because of this 3-to-2 heating period, they can produce the observed present-day dark matter density for very different choices of couplings. In this sense, the authors show that this framework opens up a new “phase diagram” in the space of dark matter theories:

    A “phase diagram” of dark matter models. The vertical axis represents the dark matter self-coupling strength while the horizontal axis represents the coupling to ordinary matter.

     