Tagged: Particle Physics

  • richardmitnick 1:18 pm on February 3, 2016 Permalink | Reply
    Tags: , , , Particle Physics   

    From FNAL: “Muon Campus beamline enclosure achieves beneficial occupancy” 

    FNAL II photo

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    February 3, 2016
    Rashmi Shivni

    FNAL Mu2e facility
    The Mu2e facility, under construction south of Wilson Hall, appears to the left of the completed MC-1 building in this picture. Both facilities are part of the Muon Campus, along with the Muon Delivery Ring (not pictured). Photo: Tom Hamernik, FESS

    With the Muon g-2 and Mu2e experiments, Fermilab may uncover new physics that could solve discrepancies in the Standard Model, which maps our understanding of physics in the subatomic realm. Fermilab has been building a home for the two experiments – the Muon Campus – which began construction in 2013. It is also preparing for Muon g-2 to take beam in 2017.

    FNAL Muon g-2 studio
    Muon g-2 studio

    FNAL Mu2e experiment
    Mu2e

    Standard model with Higgs New
    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    The lab met a major milestone last month, achieving beneficial occupancy on Dec. 9 for the Muon Campus’ underground beamline enclosure. The beamline links the muon experimentation facilities to the Muon Delivery Ring, which delivers beam to the Mu2e experiment. Beneficial occupancy is achieved when basic life safety systems, such as emergency lighting, fire alarms and communications, are in place.

    “That doesn’t mean the building is completely finished,” said Tom Hamernik, a FESS engineer and conventional construction manager for the Mu2e facility and beamline enclosure. “After the laboratory takes beneficial occupancy, there is a substantial period of experimental equipment installation before the facility is ready for experimentation.”

    The Muon Campus’ projected completion is in 2020.

    The Muon Campus is south of Wilson Hall, and it will be one of several experimental campuses that use the Recycler accelerator (located in the Main Injector ring). The MC-1 facility on the Muon Campus, which houses the Muon g-2 experiment, and the beamline enclosure are currently the two areas that have beneficial occupancy.

    “We’re at the peak of construction right now,” said Mary Convery, associate division head of the Accelerator Division.

    Convery oversees the Muon Campus program, which is broken into several smaller projects. Most of the construction and civil engineering projects are complete, while the accelerator upgrades and the Mu2e building construction remain.

    The Particle Physics Division’s Alignment Group is using the lab’s beneficial occupancy to create a magnet alignment network inside the Muon Delivery Ring and the new beamline enclosures. The Accelerator Division is installing equipment, such as vacuum components, instrumentation cables, beamline magnets and water cooling systems. This work is beginning now and will continue for more than a year with many other divisions at Fermilab.

    “It’s a lot of coordination between divisions, and it’s turning into a one-lab type of mentality,” said Consolato Gattuso, the Accelerator Division summer shutdown manager and Muon Campus installation coordinator.

    The amount of time and effort that goes into constructing facilities like the Muon Campus can be daunting, Gattuso said. So the construction and installation crews manage their time wisely, planning and tackling each task in bite-sized pieces to stay on schedule. But challenges are bound to arise throughout the construction process, since the project has many smaller facets.

    The Mu2e building, for example, has many underground spaces, with ceilings as high as 20 feet, that must fit the 80-foot long, S-shaped Mu2e detector and supporting infrastructure.

    “The complex geometry of detailing and designing all the corners and walls, where everything comes together, creates a unique construction challenge for everybody involved,” Hamernik said.

    For Gattuso, the biggest challenge, besides the construction itself, may be planning and scheduling everyone’s tasks.

    “It’s not just having specialized people doing their work, but also knowing the appropriate pace we need to maintain to stay on schedule,” Gattuso said. “There’s a lot of shuffling that happens when we’re talking magnets that weigh somewhere between as little as 600 pounds and as much as 20 tons.”

    Although there is plenty of work yet to be done, Fermilab benefits from having a wealth of existing inventory to draw from. For example, the former Antiproton Source (now the Muon Delivery Ring) and approximately 300 of the lab’s magnets are being repurposed for the two muon experiments.

    Construction and beneficial occupancy work are a part of the natural progression of building and innovating, Convery said, where innovation lies in gaining a firmer hold of fleeting particles such as muons.

    “Both experiments will be able to reach higher precision thanks to the new facilities and improved beam delivery that the Muon Campus provides,” she said. “We wouldn’t have these facilities if it weren’t for the many people who came together to make this a reality.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 1:23 pm on February 2, 2016 Permalink | Reply
    Tags: , , , , , Particle Physics   

    From CERN: “The story of ALICE” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    CERN

    01 February 2016
    Iva Raynova

    The discussions about the future of heavy-ion physics at CERN started in 1986, even before the building of the Large Electron-Positron Collider (LEP) had been completed. Four years later the idea of creating a dedicated heavy-ion detector for the Large Hadron Collider was born and on 13 December 1990 the Heavy Ion Proto Collaboration (HIPC) held its first meeting. Later on, during the historic Evian meeting Towards the LHC experimental programme in 1992, the expression of interest to create ALICE was submitted, followed by the letter of intent in 1994 and by the technical proposal in 1995.

    CERN ALICE Icon HUGE

    One of the people responsible for the creation of ALICE is Jurgen Schukraft. Spokesperson of the collaboration for the first 20 years of its existence, he is also the person who organised the initial meeting of HIPC. In the following interview we will try to show you the evolution of ALICE through his eyes.

    Has ALICE changed much since the beginning?

    Jurgen: In the beginning, the plan for the experiment was different from what it eventually turned out to be. We had a big TPC, we had a silicon vertex detector, we had time of flight, but the magnet was completely different. Ever since we sent the letter of intent, we had many different ideas. All the details were missing and we made a lot of additions afterwards, but the essential part of the detector was already decided by 1992.

    AliceDetectorLarge
    ALICE Detector

    In terms of the collaboration, it was very different at the time, because most of the people at CERN were doing experiments at low energies – the LEP programme at CERN. The Large Hadron Collider was still far in the future. It was after the approval of the technical proposal in 1994 when we started some serious research and development. In 1998, when the SPS experiment stopped, more people joined our collaboration.

    What are the most interesting discoveries made by ALICE?

    Jurgen: We have made many discoveries so far, but one thing which we did not expect is that each of these little “big bangs” has its own character. These explosions are so strong that every one of them is different and individual. This couldn’t be observed in the other types of collisions, where we only look at the average properties of the particles.

    The other very interesting thing for me is the discovery that there is a much deeper connection between all the QCD processes – everything which involves strong interaction – they are much [more] deeply connected than we originally thought.

    I think it would be very good if in the next 10 or 15 years we manage to embed what we have learned from the heavy-ion physics into the bigger context of the standard model.

    Standard model with Higgs New
    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Are you happy with how the experiment developed?

    Jurgen: I think overall it worked out as well as we could have hoped. The physics at the LHC turned out to be extremely interesting. Even more interesting than we initially thought. Also, the experiment worked very well. There are always things that could be done better, but we constantly learn. That is why ALICE is going to be upgraded during the next long shutdown.

    In addition, more people came to the collaboration than we thought would join. There are currently about 1500 members. In these terms we developed even better than I hoped. I am pleased and also proud of our community and of the fact that we managed to create such a huge experiment.

    We were a bit naive in the beginning, thinking that 10-12 years were going to be enough to do what eventually took us 20 years. A bit naive, but also very enthusiastic. What I am happy about is that we didn’t have big disappointments along the way. On the contrary – we had a very satisfactory development. This project was more complicated, more expensive and much bigger than what we had done before. It was a big mountain to climb and I am proud that we managed to get to the top.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Meet CERN in a variety of places:

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New
    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New

    LHC

    CERN LHC New
    CERN LHC Grand Tunnel

    LHC particles

    Quantum Diaries

     
  • richardmitnick 6:08 pm on January 27, 2016 Permalink | Reply
    Tags: , , , , Particle Physics, Unparticles   

    From FNAL: “Particles and unparticles” 

    FNAL II photo

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    January 27, 2016

    FNAL Don Lincoln
    Don Lincoln

    The LHC accelerator is in the business of discovering new things, from particles that are expected (like the Higgs boson) to particles that are sort of expected (like the panoply of particles predicted by supersymmetric models) to particles from something entirely unexpected (like the what-the-heck-is-that moment that changes our theories forever).

    CERN LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

    Supersymmetry standard model
    Standard Model of Supersymmetry

    The commonality of all of these potential discoveries is that they include particles. All particles have a fixed mass. So all electrons in the universe have the same mass, as do all muons, pions, protons or any other subatomic particle you can name.

    (This is true no matter how fast the particles are moving, and it’s worth emphasizing this point, as it may not jibe with some readers’ understanding of what happens when a particle’s velocity approaches light speed [in a vacuum]. You may have heard that the mass of a particle changes as velocity increases. We teach this to people first encountering relativity, but the statement is an illustrative one. What actually changes is the particle’s inertia, which is equivalent to mass at low velocities. You can read more about this in a previous column. So, for a particle, no matter what energy and momentum it has, it must also have a single and specific mass.)

    However, in 2007 scientist Howard Georgi had an idea: Suppose there was a kind of particle that had a mass that wasn’t constant. If you doubled the particle’s energy and momentum, you would double its mass. If you halved the energy, you’d halve the mass. Such a particle wouldn’t have a well-defined mass at all. This kind of particle is called an unparticle.
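
    A small numerical sketch may help make the contrast concrete (my own illustration, not from the column, using natural units where c = 1): for an ordinary particle the invariant mass m = sqrt(E² − p²) stays fixed no matter how E and p grow, whereas the unparticle behaviour described above means that rescaling E and p rescales the “mass” right along with them.

        import math

        def invariant_mass(E, p):
            """Invariant mass from energy E and momentum p (natural units, GeV)."""
            return math.sqrt(max(E**2 - p**2, 0.0))

        # Ordinary particle: pick an electron-like mass and boost it.
        m = 0.000511                              # GeV
        for p in (0.001, 1.0, 1000.0):            # momenta in GeV
            E = math.sqrt(p**2 + m**2)            # energy follows from the fixed mass
            print(p, invariant_mass(E, p))        # always ~0.000511: the mass never changes

        # Unparticle-like behaviour, as described above: scale E and p together
        # and the "mass" scales with them instead of staying put.
        E0, p0 = 2.0, 1.5                         # an arbitrary starting point
        for s in (1.0, 2.0, 0.5):
            print(s, invariant_mass(s * E0, s * p0))   # doubles or halves with s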

    Unparticles are governed by fractal mathematics and are highly speculative. In fact, there is no hint in the data that they must exist, nor is there a compelling theoretical reason why they should. On the other hand, they are possible, and given that we don’t know what theory will supplant the Standard Model, we should be open to all sorts of improbable ideas. We do know that unparticles, if they exist, must interact only weakly via the known forces.

    Standard model with Higgs New
    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    So, of course, CMS went looking for them.

    CERN CMS Detector
    CMS

    In a recent analysis that looked for both unparticles and dark matter, scientists studied events in which a Z boson was created along with undetected energy, which would be the signal of either a dark matter particle or an unparticle escaping.

    Sadly, no evidence was observed for either phenomenon. Truthfully, it would have been shocking if unparticles had been observed, but the fact that LHC experiments are looking for even such bizarre possibilities highlights that the scientific community is exploring all viable ideas, hoping to find something that gives us a huge advance in our understanding of the nature of reality.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 2:28 pm on January 20, 2016 Permalink | Reply
    Tags: , , Heavy-Ion run, , Particle Physics   

    From CERN: “LHC surpasses design luminosity with heavy ions” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    CERN

    Jan 15, 2016
    John Jowett

    The LHC has finished 2015 with a successful heavy-ion run. For the first time, lead nuclei have collided at an average centre-of-mass energy of 5.02 TeV per colliding nucleon pair.

    Temp 1
    First events

    The extensive modifications made to the LHC during its first long shutdown allowed the energy of the proton beams to be increased from 4 TeV in 2012 to 6.5 TeV, enabling proton–proton collisions at a centre-of-mass energy of 13 TeV, in 2015. As usual, a one-month heavy-ion run was scheduled at the end of the year. With lead nuclei colliding, the same fields in the LHC’s bending magnets would have allowed 5.13 TeV per colliding nucleon pair. However, it was decided to forego the last whisker of this increase to match the equivalent energy of the proton–lead collisions that took place in 2013, namely 5.02 TeV. Furthermore, the first week of the run was devoted to colliding protons at 2.51 TeV per beam. This will allow the LHC experiments to make precise comparisons of three different combinations of colliding particles, p–p, p–Pb and Pb–Pb, at the same effective energy of 5.02 TeV. This is crucial to disentangling the ascending complexity of the observed phenomena (CERN Courier March 2014 p17).
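
    A quick check of where these numbers come from (my own arithmetic, not from the article): the LHC’s bending magnets fix the momentum per unit charge, so at the same field a fully stripped lead ion (Z = 82, A = 208) reaches 82/208 of the proton beam energy per nucleon.

        # Beam energy per nucleon for fully stripped lead at the 2015 magnet setting.
        proton_beam_TeV = 6.5                      # proton beam energy in 2015
        Z, A = 82, 208                             # fully stripped lead

        E_per_nucleon = proton_beam_TeV * Z / A    # ~2.56 TeV per nucleon per beam
        sqrt_sNN_max  = 2 * E_per_nucleon          # ~5.13 TeV, the maximum mentioned above

        # The run was set slightly lower to match the 2013 p-Pb energy:
        sqrt_sNN_run      = 5.02                   # TeV per colliding nucleon pair
        E_per_nucleon_run = sqrt_sNN_run / 2       # ~2.51 TeV, hence the proton reference beams

        print(round(E_per_nucleon, 2), round(sqrt_sNN_max, 2), E_per_nucleon_run)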

    The first (and last, until 2018) Pb–Pb operation close to the full energy of the LHC was also the opportunity to finally assess some of its ultimate performance limits as a heavy-ion collider. A carefully targeted set of accelerator-physics studies also had to be scheduled within the tight time frame.

    Delivering luminosity

    The chain of specialised heavy-ion injectors, comprising the electron cyclotron resonance ion source, Linac3 and the LEIR ring, with its elaborate bunch-forming and cooling, was recommissioned to provide intense and dense lead bunches in the weeks preceding the run. Through a series of elaborate RF gymnastics, the PS and SPS assemble these into 24-bunch trains for injection into the LHC. The beam intensity delivered by the injectors is a crucial determinant of the luminosity of the collider.

    Planning for the recommissioning of the LHC to run in two different operational conditions after the November technical stop resembled a temporal jigsaw puzzle, with alternating phases of proton and heavy-ion set-up (the latter using proton beams at first) continually readapted to the manifold constraints imposed by other activities in the injector complex, the strictures of machine protection, and the unexpected. For Pb–Pb operation, a new heavy-ion magnetic cycle was implemented in the LHC, including a squeeze to β* = 0.8 m, together with manipulations of the crossing angle and interaction-point position at the ALICE experiment. First test collisions occurred early in the morning of 17 November, some 10 hours after first injection of lead.

    The new Pb–Pb energy was almost twice that of the previous Pb–Pb run in 2011, and some 25 times that of RHIC at Brookhaven, extending the study of the quark–gluon plasma to still-higher energy density and temperature. Although the energy per colliding nucleon pair characterises the physical processes, it is worth noting that the total energy packed into a volume on the few-fm scale exceeded 1 PeV for the first time in the laboratory.
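
    The petaelectronvolt figure follows from simple counting (again my own arithmetic, not from the article): a head-on Pb–Pb collision brings together 208 nucleon pairs at 5.02 TeV each.

        TeV_per_pair = 5.02
        pairs        = 208                              # one pair per nucleon in a head-on collision
        total_TeV    = TeV_per_pair * pairs             # ~1040 TeV, i.e. just over 1 PeV
        total_joule  = total_TeV * 1e12 * 1.602e-19     # ~1.7e-4 J in a few-femtometre volume
        print(total_TeV, total_joule)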

    After the successful collection of the required number of p–p reference collisions, the Pb–Pb configuration was validated through an extensive series of aperture measurements and collimation-loss maps. Only then could “stable beams” for physics be declared at 10.59 a.m. on 25 November, and spectacular event displays started to flow from the experiments.

    Temp 2
    Beam-loss monitor signals

    In the next few days, the number of colliding bunches in each beam was stepped up to the anticipated value of 426 and the intensity delivered by the injectors was boosted to its highest-ever values. The LHC passed a historic milestone by exceeding the luminosity of 10²⁷ cm⁻² s⁻¹, the value advertised in its official design report in 2004.

    This allowed the ALICE experiment to run in its long-awaited saturated mode with the luminosity levelled at this value for the first few hours of each fill.

    Soon afterwards, an unexpected bonus came from the SPS injection team, who pulled off the feat of shortening the rise time of the SPS injection kicker array, first to 175 ns then to 150 ns, allowing 474, then 518, bunches to be stored in the LHC. The ATLAS and CMS experiments were able to benefit from luminosities over three times the design value. A small fraction of the luminosity in this run was delivered to the LHCb experiment, a newcomer to Pb–Pb collisions.

    Nuclear beam physics

    The electromagnetic fields surrounding highly charged ultrarelativistic nuclei are strongly Lorentz-contracted into a flat “pancake”. According to the original insight of Fermi, Weizsäcker and Williams, these fields can be represented as a flash of quasi-real photons. At LHC energies, their spectrum extends up to hundreds of GeV. In a very real sense, the LHC is a photon–photon and photon–nucleus collider (CERN Courier November 2012 p9). The study of such ultraperipheral (or “near-miss”) interactions, in which the two nuclei do not overlap, is an important subfield of the LHC experimental programme, alongside its main focus on the study of truly nuclear collisions.

    From the point of view of accelerator physics, the ultraperipheral interactions with their much higher cross-sections loom still larger in importance. They dominate the luminosity “burn-off”, or rate at which particles are removed from colliding beams, leading to short beam and luminosity lifetimes. Furthermore, they do so in a way that is qualitatively different from the spray of a few watts of “luminosity debris” by hadronic interactions. Rather, the removed nuclei are slightly modified in charge and/or mass, and emerge as new, well-focussed, secondary beams. These travel along the interaction region just like the main beam but, as soon as they encounter the bending magnets of the dispersion-suppressor section, their trajectories deviate, as in a spectrometer.

    The largest contribution to the burn-off cross-section comes from the so-called bound-free pair-production (BFPP) in which the colliding photons create electron–positron pairs with the electron in a bound-state of one nucleus. A beam of these one-electron ions, carrying a power of some tens of watts, emerges from the interaction point and is eventually lost on the outer side of the beam pipe.
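
    The “tens of watts” figure can be estimated from the numbers already quoted, plus one assumption: the BFPP cross-section for Pb–Pb at this energy, which is roughly 280 barn (an approximate value I am supplying for illustration, not taken from the article).

        # Order-of-magnitude power of the BFPP secondary beam (rough sketch, assumed cross-section).
        barn       = 1e-24                        # cm^2
        sigma_bfpp = 280 * barn                   # assumed BFPP cross-section for Pb-Pb at 5.02 TeV
        luminosity = 1e27                         # cm^-2 s^-1, the design value reached in this run

        rate = sigma_bfpp * luminosity            # ~2.8e5 one-electron ions removed per second

        ion_energy_TeV = 208 * 2.51               # each lead ion carries ~522 TeV
        ion_energy_J   = ion_energy_TeV * 1e12 * 1.602e-19

        print(rate * ion_energy_J)                # ~23 W, i.e. "tens of watts" as stated above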

    Controlled quench

    The LHC operators have become used to holding their breath as the BFPP loss peaks on the beam-loss monitors rise towards the threshold for dumping the beams (figure). There has long been a concern that the energy deposited into superconducting magnet coils may cause them to quench, bringing the run to an immediate halt and imposing a limit on luminosity. In line with recent re-evaluations of the magnet-quench limits, this did not happen during physics operation in 2015 but may happen in future operation at still-higher luminosity. During this run, mitigation strategies (orbit bumps that move the losses out of the magnets) were successfully implemented. Later, in a special experiment, one of these bumps was removed and the luminosity slowly increased. This led to the first controlled steady-state quench of an LHC dipole magnet with beam, providing long-sought data on their propensity to quench. On the last night of the run, another magnet quench was deliberately induced by exciting the beam to create losses on the primary collimators.

    Photonuclear interactions also occur at comparable rates in the collisions and in the interactions with the graphite of the LHC collimator jaws. Nuclei of ²⁰⁷Pb, created by the electromagnetic dissociation of a neutron from the original ²⁰⁸Pb at the primary collimators, were identified as a source of background after traversing more than a quarter of the ring to the tertiary collimators near ALICE.

    These, and other phenomena peculiar to heavy-ion operation, must be tackled in the quest for still-higher performance in future years.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Meet CERN in a variety of places:

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New
    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New

    LHC

    CERN LHC New
    CERN LHC Grand Tunnel

    LHC particles

    Quantum Diaries

     
  • richardmitnick 4:40 pm on January 15, 2016 Permalink | Reply
    Tags: , , , , Particle Physics   

    From CERN: “A year of challenges and successes” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    CERN

    Jan 15, 2016
    No writer credit found

    Temp 1
    LHC Page 1

    2015 was a tough year for CERN’s accelerator sector. Besides assuring delivery of beam to the extensive non-LHC facilities such as the AD, ISOLDE, nTOF and the North Area, many teams also had to work hard to bring the LHC back into business after the far-reaching efforts of the long shutdown.

    At the end of 2014 and start of 2015, the LHC was cooled down sector by sector and all magnet circuits were put through a campaign of powering tests to fully re-qualify everything. The six-month-long programme of rigorous tests involved the quench-protection system, power converters, energy extraction, UPS, interlocks, electrical quality assurance and magnet-quench behaviour. The powering-test phase eventually left all magnetic circuits fully qualified for 6.5 TeV.

    Some understandable delay was incurred during this period and three things can be highlighted. First was the decision to perform in situ tests of the consolidated splices – the so-called Copper Stabilizer Continuity Measurement (CSCM) campaign. These were a success and provided confirmation of the quality work done during the shutdown.

    Second, dipole-quench re-training took some time – in particular, the dipoles of sector 45 proved a little recalcitrant and reached the target 11,080 A after some 51 training quenches.

    Third, after an impressive team effort co-ordinated by the machine-protection team to conceive, prototype, test and deploy the system, a small piece of metallic debris that was causing an earth fault in a dipole in sector 34 was successfully burnt away on the afternoon of Tuesday 31 March.

    The first beam of 2015 went around the LHC on Easter Sunday, 5 April. Initial commissioning delivered first beam at 6.5 TeV after five days and first “stable beams” after two months of careful set-up and validation.

    Ramp up

    Two scrubbing runs delivered good beam conditions for around 1500 bunches per beam, after a concerted campaign to re-condition the beam vacuum. However, the electron cloud, anticipated to be more of a problem with the nominal 25 ns bunch-spacing beam, was still significant at the end of the scrubbing campaign.

    The initial 50 ns and 25 ns intensity ramp-up phase was tough going and had to contend with a number of issues, including earth faults, unidentified falling objects (UFOs), an unidentified aperture restriction in a main dipole, and radiation affecting specific electronic components in the tunnel. Although operating the machine in these conditions was challenging, the teams succeeded in colliding beams with 460 bunches and delivered some luminosity to the experiments, albeit with poor efficiency.

    The second phase of the ramp-up following the technical stop at the start of September was dominated by the electron cloud and the heat load that it generates in the beam screens of the magnets in the cold sectors. The challenge was then for cryogenics, which had to wrestle with transients and operation close to the cooling-power limits. The ramp-up in number of bunches was consequently slow but steady, culminating in a final figure for the year of 2244 bunches per beam.
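
    For orientation on what these bunch counts mean, here is a rough sketch with my own numbers (not from the article): with 25 ns spacing the 26.7 km ring offers roughly 3,500 bunch slots, but the abort gap and the gaps between injected trains leave only around 2,800 of them usable, so the nominal filling scheme sits in the 2,700–2,800 range and 2,244 bunches was already a large fraction of it.

        # Approximate bunch-slot counting for 25 ns operation (illustrative, assumed numbers).
        c             = 2.998e8                   # m/s
        circumference = 26659.0                   # m, LHC ring
        spacing_s     = 25e-9                     # s between bunch slots

        slots  = circumference / (c * spacing_s)  # ~3,560 possible 25 ns slots
        usable = 2808                             # assumed nominal bunch count after abort/injection gaps
        print(round(slots), usable, round(2244 / usable, 2))   # 2015 ended at ~80% of nominal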

    Importantly, the electron cloud generated during physics runs at 6.5 TeV serves to slowly condition the surface of the beam screen and so reduce the heat load at a given intensity. As time passed, this effect opened up a margin for the use of more bunches. Cryogenics operations were therefore kept close to the acceptable maximum heat load, and at the same time in the most effective scrubbing regime.

    The overall machine availability is a critical factor in integrated-luminosity delivery, and remained respectable with around 32% of the scheduled time spent in stable beams during the final period of proton–proton physics from September to November. By the end of the 2015 proton run, 2244 bunches per beam were giving peak luminosities of 5.2 × 10³³ cm⁻²s⁻¹ in ATLAS and CMS, with both being delivered an integrated luminosity of around 4 fb⁻¹ for the year. Levelled luminosity of 3 × 10³² cm⁻²s⁻¹ in LHCb and 5 × 10³⁰ cm⁻²s⁻¹ in ALICE was provided throughout the run.
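
    As a rough cross-check of how those peak figures translate into the ~4 fb⁻¹ quoted (my own estimate with assumed inputs, not a number from the article):

        # Does a ~5e33 peak luminosity and ~32% stable-beam time plausibly integrate to a few fb^-1?
        peak_lumi    = 5.2e33     # cm^-2 s^-1, from the text
        stable_frac  = 0.32       # fraction of scheduled time in stable beams, from the text
        days         = 75         # assumed length of the September-November proton-physics period
        decay_factor = 0.5        # assumed average-to-peak luminosity ratio within a fill

        integrated = peak_lumi * decay_factor * stable_frac * days * 86400   # cm^-2
        print(integrated / 1e39, "fb^-1")    # ~5 fb^-1, the same ballpark as the ~4 fb^-1 quoted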

    Also of note were dedicated runs at high β* for TOTEM and ALFA. These provided important data on elastic and diffractive scattering at 6.5 TeV, and interestingly a first test of the CMS-TOTEM Precision Proton Spectrometer (CT-PPS), which aims to probe double-pomeron exchange.

    As is now traditional, the final four weeks of operations in 2015 were devoted to the heavy-ion programme. To make things more challenging, it was decided to include a five-day proton–proton reference run in this period. The proton–proton run was performed at a centre-of-mass energy of 5.02 TeV, giving the same nucleon–nucleon collision energy as that of both the following lead–lead run and the proton–lead run that took place at the start of 2013.

    Good intensities

    Both the proton reference run and ion run demanded re-set-up and validation of the machine at new energies. Despite the time pressure, both runs went well and were counted a success. Performance with ions is strongly dependent on the beam from the injectors (source, Linac3, LEIR, PS and SPS), and extensive preparation allowed the delivery of good intensities, which opened the way for delivery of a levelled design luminosity of 1 × 10²⁷ cm⁻²s⁻¹ to ALICE and more than 3 × 10²⁷ cm⁻²s⁻¹ to ATLAS and CMS. For the first time in an ion–ion run, LHCb also took data following participation in the proton–lead run. Dedicated ion machine development included crystal collimation and quench-level tests, the latter providing important input to future ion operation in the HL-LHC era.

    The travails of 2015 have opened the way for a full production run in 2016. Following initial commissioning, a short scrubbing run should re-establish the electron cloud conditions of 2015, allowing operation with 2000 bunches and more. This figure can then be incrementally increased to the nominal 2700 as conditioning progresses. Following extensive machine development campaigns in 2015, the β* will be reduced to 50 cm for the 2016 run. Nominal bunch intensity and emittance will bring the design peak luminosity of 1 × 10³⁴ cm⁻²s⁻¹ within reach. Reasonable machine availability and around 150 days of 13 TeV proton–proton physics should allow the 23 fb⁻¹ total delivered to ATLAS and CMS in 2012 to be exceeded.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Meet CERN in a variety of places:

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New
    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New

    LHC

    CERN LHC New
    CERN LHC Grand Tunnel

    LHC particles

    Quantum Diaries

     
  • richardmitnick 4:42 pm on January 12, 2016 Permalink | Reply
    Tags: , , Particle Physics, ,   

    From Science Friday: “10 Questions for Alan Guth, Pioneer of the Inflationary Model of the Universe” 

    Science Friday

    Science Friday

    January 7, 2016
    Christina Couch

    The theoretical physicist discusses the expanding universe and the infinite possibilities it brings.

    Buried under a mountain of papers and empty Coke Zero bottles, Alan Guth ponders the origins of the cosmos. A world-renowned theoretical physicist and professor at the Massachusetts Institute of Technology, Guth is best known for pioneering the theory of cosmic inflation, a model that explains the exponential growth of the universe mere fractions of a second after the Big Bang, and its continued expansion today.

    Cosmic inflation doesn’t just describe the underlying physics of the Big Bang, however. Guth believes it also supports the idea that our universe is one of many, with even more universes yet to form.

    Science Friday headed to MIT (where this writer also works, but in a different department) to chat with Guth in his office about the infinite possibilities in an unending cosmos, and the fortune cookie that changed his life.

    1
    Alan Guth in 2007. Photo by Betsy Devine/Wikipedia/CC BY-SA 3.0

    Science Friday: What made you realize that you wanted to be a scientist?
    Alan Guth: I remember an event in high school, which maybe is indicative of my desires to be a theoretical physicist in particular. I was taking high school physics, and a friend of mine was doing an experiment which consisted of taking a yard stick and punching holes in it in different places and pivoting it on these different holes and seeing how the period depended on where the hole was. At this point, I had just learned enough basic physics and calculus to be able to calculate what the answer to that question is supposed to be. I remember one afternoon, we got together and compared my formula with his data using a slide rule to do the calculations. It actually worked. I was very excited about the idea that we can really calculate things, and they actually do reflect the way the real world works.
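
    The calculation Guth describes is the textbook physical-pendulum result; as a sketch (my own reconstruction, not part of the interview), a uniform stick of length L pivoted a distance d from its centre has period T = 2π·sqrt((L²/12 + d²)/(g·d)), which can be compared against timing data for different holes.

        import math

        def period(L, d, g=9.81):
            """Small-oscillation period of a uniform stick of length L (m),
            pivoted through a hole a distance d (m) from its centre."""
            inertia_per_mass = L**2 / 12 + d**2        # parallel-axis theorem
            return 2 * math.pi * math.sqrt(inertia_per_mass / (g * d))

        L = 0.9144                                     # a yard stick, in metres
        for d in (0.05, 0.15, 0.30, 0.45):             # hole positions measured from the centre
            print(d, round(period(L, d), 3))           # period in seconds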

    You did your dissertation on particle physics and have said that it didn’t turn out exactly how you wanted. Could you tell me about that?
    My dissertation was about the quark model and about how quarks and anti-quarks could bind to form mesons. But it was really just before the theory of quarks underwent a major revolution [when physicists went from believing that quarks are heavy particles that have a large binding energy when they combine, to the quantum chromodynamics theory that quarks are actually very light and their binding energy [carried by gluons] increases as they’re pulled farther apart]. I was on the wrong side of that revolution. My thesis, more or less, became totally obsolete about the time I wrote it. I certainly learned a lot by doing it.

    What got you into cosmology?
    It wasn’t really until the eighth year of my being a [particle physics] postdoc that I got into cosmology. A fellow postdoc at Cornell named Henry Tye got interested in what was then a newfangled class of particle theories called grand unified theories [particle physics models that describe how three of the four fundamental forces in the universe—electromagnetism, weak nuclear interactions, and strong nuclear interactions—act as one force at extremely high energies]. He came to me one day and asked me whether these grand unified theories would predict that there should be magnetic monopoles [particles that have a net magnetic north charge or a net magnetic south charge.]

    I didn’t know about grand unified theories at the time, so he had to teach me, which he did, very successfully. Then I knew enough to put two and two together and conclude—as I’m sure many people did around the world—that yes, grand unified theories do predict that magnetic monopoles should exist, but that they would be outrageously heavy. They would weigh something like 10 to the 16th power times as much as a proton [which means that scientists should theoretically be able to observe them in the universe, although no one has yet].

    About six months later, there was a visit to Cornell by [Nobel laureate] Steve Weinberg, who’s a fabulous physicist and someone I had known from my graduate student days at MIT. He was working on how grand unified theories might explain the excess of matter over anti-matter [in the universe], but it involved the same basic physics that determining how many monopoles existed in the early universe would involve. I decided that if it was sensible enough for Steve Weinberg to work on, why not me, too?

    After a little while, Henry Tye and I came to the conclusion that far too many magnetic monopoles would be produced if one combined conventional cosmology with conventional grand unified theories. We were scooped in publishing that, but Henry and I decided that we would continue to try to figure out if there was anything that could be changed that maybe would make it possible for grand unified theories to be consistent with cosmology as we know it.

    How did you come up with the idea of cosmic inflation?
    A little bit before I started talking to Henry Tye about monopoles, there was a lecture at Cornell by Bob Dicke, a Princeton physicist and cosmologist, in which he presented something that was called the flatness problem, a problem about the expansion rate of the early universe and how precisely fine-tuned it had to be to produce a universe like the one we live in [that is, one that has little or no space-time curvature and is therefore almost perfectly “flat”]. In this talk, Bob Dicke told us that if you thought about the universe at one second after the beginning, the expansion rate really had to be just right to 15 decimal places, or else the universe would either fly apart too fast for any structure to form or re-collapse too fast for any structure to form.

    At the time, I thought that was kind of amazing but didn’t even understand it. But after working on this magnetic monopole question for six months, I came to the realization one night that the kind of mechanism that we were thinking about that would suppress the amount of magnetic monopoles produced after the Big Bang [the “mechanism” being a phase transition that occurs after a large amount of super-cooling] would have the surprising effect of driving the universe into a period of exponential expansion—which is what we now call inflation—and that exponential expansion would solve this flatness problem. It would also draw the universe to exactly the right expansion rate that the Big Bang required [to create a universe like ours].
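
    For readers curious where a number like “15 decimal places” comes from, here is a rough textbook-style estimate (my own sketch, not part of the interview): the deviation from critical density, |Ω − 1| = |k|/(aH)², grows roughly in proportion to t during the radiation era and to t^(2/3) during the matter era, so today’s near-flatness translates into an extraordinarily tight requirement at t = 1 s.

        # Rough growth of |Omega - 1| from t = 1 s to today (assumed round numbers).
        t_start = 1.0          # s
        t_eq    = 1.6e12       # s, approximate matter-radiation equality (assumed)
        t_now   = 4.3e17       # s, approximate age of the universe (assumed)

        growth = (t_eq / t_start) * (t_now / t_eq) ** (2.0 / 3.0)   # ~7e15 overall growth factor
        omega_dev_now = 0.01                                        # |Omega - 1| today, at most of this order
        print(omega_dev_now / growth)   # ~1e-18: the expansion rate at t = 1 s had to match the
                                        # critical value to roughly fifteen or more decimal places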

    You’ve said in previous talks that a fortune cookie played a legitimately important part in your career. How so?
    During the spring of 1980, after having come up with this idea of inflation, I decided that the best way to publicize it would be to give a lot of talks about it. I visited MIT, but MIT had not advertised any positions that year. During the very last day of this six-week trip, I was at the University of Maryland, and they took me out for a Chinese dinner, and the fortune I got in my Chinese fortune cookie said, “An exciting opportunity awaits you if you’re not too timid.” I thought about that and decided that it might be trying to tell me something. When I got back to California, I called one of the faculty members at MIT and said in some stammering way that I hadn’t applied for any jobs because there weren’t any jobs at MIT, but I wanted to tell them that if they might be interested in me, I’d be interested in coming. Then they got back to me in one day and made me an offer. It was great. I came to MIT as a faculty member, and I’ve been here ever since.

    When and where do you do your best work?
    I firmly believe that I do my best thinking in the middle of the night. I very much like to be able to have reasonably long periods of time, a few hours, when I can concentrate on something and not be interrupted, and that only happens at night. What often happens is I fall asleep at like 9:30 and wake up at 1 or 2 and start working and then fall asleep again at 5.

    Who is a dream collaborator you’d love to work with?
    I bet it would have been a lot of fun to work with [Albert] Einstein. What I really respect about Einstein is his desire to throw aside all conventional modes and just concentrate on what seems to be the closest we can get to an accurate theory of nature.

    What are you currently working on?
    The most concrete project I’m working on is a project in collaboration with a fairly large group here at MIT in which we’re trying to calculate the production of primordial black holes that might have happened with a certain version of inflation. If this works out, these primordial black holes could perhaps be the seeds for the supermassive black holes in the centers of galaxies, which are very hard to explain. It would be incredibly exciting if that turns out to be the case.

    What else are you mulling over?
    A bigger question, which has been in the back of my mind for a decade, is the problem of understanding probabilities in eternally inflating universes. In an eternally inflating universe, these pocket universes [like the one we live in] go on being formed literally forever. An infinite number of pocket universes are formed, and that means that anything that’s physically allowed will ultimately happen an infinite number of times.

    Normally we interpret probabilities as relative occurrences. We think one-headed cows are more probable than two-headed cows because we think there are a lot more one-headed cows than two-headed cows. I don’t know if there are any two-headed cows on earth, but let’s pretend there are. In an eternally inflating universe, assuming that a two-headed cow is at least possible, there will be an infinite number of two-headed cows and an infinite number of one-headed cows. It’s hard to know what you mean if you try to say that one is more common than the other.

    If anything can happen in an eternally inflating universe, is there a situation in which I am the cosmologist and you are the journalist?
    [Laughs] Probably, yes. I think what we would know for sure is that anything that’s physically possible—and I don’t see why this is not physically possible—will happen an infinite number of times.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Covering everything from the outer reaches of space to the tiniest microbes in our bodies, Science Friday is the source for entertaining and educational stories about science, technology, and other cool stuff.

    Science Friday is your trusted source for news and entertaining stories about science.

    For 25 years we’ve introduced top scientists to public radio listeners, and reminded them how much fun it is to learn something new. But we’re more than just a radio show. We produce award-winning digital videos, original web articles, and educational resources for teachers and informal educators. We like to say we’re brain fun, for curious people.

    All of our work is independently produced by the Science Friday Initiative, a non-profit organization dedicated to increasing the public’s access to science and scientific information. Public Radio International (PRI) distributes our radio show, which you can catch on public radio stations across the U.S.

     
  • richardmitnick 9:32 pm on January 11, 2016 Permalink | Reply
    Tags: , Particle Physics, , The Sky is the Limit   

    From PI: “The sky as a limit” 

    Perimeter Institute
    Perimeter Institute

    January 11, 2016

    Eamon O’Flynn
    Manager, Media Relations
    eoflynn@perimeterinstitute.ca
    (519) 569-7600 x5071

    Perimeter researchers show how the largest possible structure – the curvature of the universe as a whole – can be used as a lens onto the smallest objects observable today, elementary particles.

    1
    Elliot Nelson (left) and Niayesh Afshordi. No image credit found.

    Perimeter Associate Faculty member Niayesh Afshordi and postdoctoral fellow Elliot Nelson recently won a third-place Buchalter Cosmology Prize for uncovering an entirely new way cosmology can shed light on the future of particle physics.

    1
    Hubble Goes to the eXtreme to Assemble Farthest-Ever View of the Universe

    Like photographers assembling a portfolio of best shots, astronomers have assembled a new, improved portrait of mankind’s deepest-ever view of the universe. Called the eXtreme Deep Field, or XDF, the photo was assembled by combining 10 years of NASA Hubble Space Telescope photographs taken of a patch of sky at the center of the original Hubble Ultra Deep Field.

    NASA Hubble Telescope
    NASA/ESA Hubble

    The XDF is a small fraction of the angular diameter of the full moon. The Hubble Ultra Deep Field is an image of a small area of space in the constellation Fornax, created using Hubble Space Telescope data from 2003 and 2004. By collecting faint light over many hours of observation, it revealed thousands of galaxies, both nearby and very distant, making it the deepest image of the universe ever taken at that time. The new full-color XDF image is even more sensitive, and contains about 5,500 galaxies even within its smaller field of view. The faintest galaxies are one ten-billionth the brightness of what the human eye can see. Magnificent spiral galaxies similar in shape to our Milky Way and the neighboring Andromeda Galaxy appear in this image, as do the large, fuzzy red galaxies where the formation of new stars has ceased. These red galaxies are the remnants of dramatic collisions between galaxies and are in their declining years. Peppered across the field are tiny, faint, more distant galaxies that were like the seedlings from which today’s magnificent galaxies grew. The history of galaxies — from soon after the first galaxies were born to the great galaxies of today, like our Milky Way — is laid out in this one remarkable image.

    Hubble pointed at a tiny patch of southern sky in repeat visits (made over the past decade) for a total of 50 days, with a total exposure time of 2 million seconds. More than 2,000 images of the same field were taken with Hubble’s two premier cameras: the Advanced Camera for Surveys [ACS] and the Wide Field Camera 3 [WFC3], which extends Hubble’s vision into near-infrared light.

    NASA Hubble ACS
    ACS

    NASA Hubble WFC3
    WFC3

    “The XDF is the deepest image of the sky ever obtained and reveals the faintest and most distant galaxies ever seen. XDF allows us to explore further back in time than ever before”, said Garth Illingworth of the University of California at Santa Cruz, principal investigator of the Hubble Ultra Deep Field 2009 (HUDF09) program.

    The universe is 13.7 billion years old, and the XDF reveals galaxies that span back 13.2 billion years in time. Most of the galaxies in the XDF are seen when they were young, small, and growing, often violently as they collided and merged together. The early universe was a time of dramatic birth for galaxies containing brilliant blue stars extraordinarily brighter than our sun. The light from those past events is just arriving at Earth now, and so the XDF is a “time tunnel into the distant past.” The youngest galaxy found in the XDF existed just 450 million years after the universe’s birth in the big bang.

    Before Hubble was launched in 1990, astronomers could barely see normal galaxies to 7 billion light-years away, about halfway across the universe. Observations with telescopes on the ground were not able to establish how galaxies formed and evolved in the early universe.

    Hubble gave astronomers their first view of the actual forms and shapes of galaxies when they were young. This provided compelling, direct visual evidence that the universe is truly changing as it ages. Like watching individual frames of a motion picture, the Hubble deep surveys reveal the emergence of structure in the infant universe and the subsequent dynamic stages of galaxy evolution.

    The infrared vision of NASA’s planned James Webb Space Telescope [JWST] will be aimed at the XDF.

    NASA Webb Telescope
    JWST

    The Webb telescope will find even fainter galaxies that existed when the universe was just a few hundred million years old. Because of the expansion of the universe, light from the distant past is stretched into longer, infrared wavelengths. The Webb telescope’s infrared vision is ideally suited to push the XDF even deeper, into a time when the first stars and galaxies formed and filled the early “dark ages” of the universe with light.
    The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA’s Goddard Space Flight Center in Greenbelt, Md., manages the telescope. The Space Telescope Science Institute (STScI) in Baltimore, Md., conducts Hubble science operations. STScI is operated by the Association of Universities for Research in Astronomy, Inc., in Washington.
    Image date: 29 June 2012. Credit: NASA; ESA; G. Illingworth, D. Magee, and P. Oesch, University of California, Santa Cruz; R. Bouwens, Leiden University; and the HUDF09 Team

    2
    Andromeda Galaxy. Adam Evans

    Their work begins with the knowledge that space is flat. While there are local wrinkles, they are wrinkles in a flat space, not wrinkles in curved space. The universe as a whole is within one percent of flat.

    The problem is that it shouldn’t be. The vacuum of space is not empty; it is filled with fields that may be weak but cannot be zero – nothing quantum can ever be zero, because quantum things wiggle. According to general relativity, such fluctuations should cause spacetime to curve. In fact, a straightforward calculation of how much the vacuum should curve predicts a universe so tightly wound that the moon would not fit inside it.

    Cosmologists have typically worked around this problem – that the universe should be curved, but looks flat – by assuming there is some antigravity that exactly offsets the tendency of the vacuum to curve. This set of off-base predictions and unlikely corrections is known as the cosmological constant problem, and it has been dogging cosmology for more than half a century.

    In this paper, Nelson and Afshordi make no attempt to solve it, but where other cosmologists invoked an offsetting constant and moved on, Nelson and Afshordi went on to ask one more question: Does adding such a constant to cancel the vacuum’s energy guarantee a flat spacetime? Their answer: not quite.

    The vacuum is still filled with quantum fields, and it is the nature of quantum fields to fluctuate. Even if they are perfectly offset such that their average value is zero, they will still fluctuate around that zero point. Those fluctuations should (again) cause space to curve – just not as much.

    In this scenario, the amount of curve created by the known fields – the electromagnetic field, for example, or the Higgs field – is too small to be measured, and is therefore allowed. But any unknown field would have to be weak enough that its fluctuations would not cause an observable curve in the universe. This sets a maximum energy for unknown fields.

    A theoretical maximum on a theoretical field may not sound groundbreaking – but the work opens a new window in an unexpected place: particle physics.

    A particle, quantum mechanics teaches us, is just an excitation of a field. A photon is an excitation of the electromagnetic field, for example, and the newly discovered Higgs boson is an excitation of the Higgs field. It’s roughly similar to the way a wave is an excitation of the ocean. And just as the height of a breaking wave can tell us something about the depth of the water, the mass of a particle depends on the strength of its corresponding field.
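
    A concrete, standard example of that particle-field-mass connection (illustrative only; this is the familiar Higgs-mechanism relation, not the specific construction in the Nelson-Afshordi paper): in the Standard Model a particle’s mass is its coupling strength times the Higgs field’s constant background value, v ≈ 246 GeV.

        # Masses from couplings to the Higgs field's background value (approximate inputs).
        v     = 246.0          # GeV, Higgs vacuum expectation value
        g     = 0.65           # SU(2) gauge coupling (approximate)
        y_top = 1.0            # top-quark Yukawa coupling (approximate)

        m_W   = 0.5 * g * v              # ~80 GeV, close to the measured W mass
        m_top = y_top * v / 2 ** 0.5     # ~174 GeV, close to the measured top mass
        print(round(m_W, 1), round(m_top, 1))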

    New kinds of quantum fields are often associated with proposals to extend the Standard Model of particle physics.

    5
    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    If Afshordi and Nelson are right, and there can be no such fields whose fluctuations have enough energy to noticeably curve space, there can be no unknown particles with a mass of more than 35 TeV. The authors predict that if there are new fields and particles associated with an extension to the Standard Model, their masses will lie below that limit.

    For generations, particle physics has made progress from the bottom up: building more and more powerful colliders to create – then spot and study – heavier and heavier particles. It is as if we started from the ground floor and built up, discovering more particles at higher altitudes as we went. What Nelson and Afshordi have done is lower the sky.

    There is a great deal of debate in particle physics about whether we should build increasingly powerful accelerators to search for heavier unknown particles. Right now, the most powerful accelerator in the world, the Large Hadron Collider [LHC], runs at a top energy of about 14 TeV; a proposed new super accelerator in China would run at about 100 TeV.

    CERN LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

    As this debate unfolds, this new work could be particularly useful in helping experimentalists decide which energy levels – which skyscraper heights – are the most interesting.

    The sky does indeed have a limit, this research suggests – and we are about to hit it.

    Read the original prize-winning paper by Afshordi and Nelson

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    About Perimeter

    Perimeter Institute is a leading centre for scientific research, training and educational outreach in foundational theoretical physics. Founded in 1999 in Waterloo, Ontario, Canada, its mission is to advance our understanding of the universe at the most fundamental level, stimulating the breakthroughs that could transform our future. Perimeter also trains the next generation of physicists through innovative programs, and shares the excitement and wonder of science with students, teachers and the general public.

     
  • richardmitnick 11:35 am on January 8, 2016 Permalink | Reply
    Tags: , , , , Particle Physics   

    From AAAS: “Japan hopes to staff up to host the International Linear Collider” 

    AAAS

    AAAS

    7 January 2016
    Dennis Normile

    1
    Japan will grow its scientific workforce to handle the International Linear Collider, to be built in a tunnel through these mountains in northeastern Japan. WIKIMEDIA

    The International Linear Collider (ILC) took another small step forward yesterday when Japan’s High Energy Accelerator Research Organization (KEK) released a plan for getting the country ready to host the $10 billion project by tripling its relevant science and engineering workforce over the next 4 years.

    ILC schematic
    ilc schematic

    As currently envisioned, the collider will occupy a 31-kilometer-long tunnel in Iwate Prefecture north of Tokyo. The education ministry needs to be convinced the country has the human resources required to execute the project before granting its approval, says Yasuhiro Okada, a theorist at KEK, which has led Japan’s preliminary planning and design work. The “Action Plan,” released yesterday, “is a small but critical point to show [the ministry] we will have the necessary manpower,” says Okada, who chaired the working group charged with drafting the plan. Japan also needs to demonstrate to potential international partners that the country will shoulder its share of the final design effort, he adds.

    “We are concentrating on getting the green light from the government by 2018,” says Satoru Yamashita, a University of Tokyo physicist involved in the planning. The government would then initiate negotiations for support from other interested countries, with the goal of starting construction by 2020 and beginning experiments around 2030.

    The ILC would pick up where Europe’s Large Hadron Collider leaves off in studies of the Higgs boson and other exotic particles.

    CERN LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

    In the 1990s, groups in North America, Japan, and Europe independently started planning linear colliders to smash together electrons and antielectrons, or positrons. The project’s complexity and projected costs led the teams to pool their efforts in 2004. An international team completed a basic design in June 2013 based on superconducting techniques to accelerate the particles to energies of up to 500 gigaelectron volts. The collider could be upgraded later to even higher energies.

    Scientists in each region originally hoped to host the facility. But in 2012 the Japanese high energy physics community raised its hand and gradually got the support of American and European physicists. In August 2013 a committee picked the Iwate Prefecture site.

    Before starting a final engineering design, KEK took a hard look at the project’s manpower requirements. The U.S. and Europe are currently designing and building large physics facilities with superconducting radiofrequency cavities similar to what the ILC will use, and many of those scientists and engineers will become available to work on the ILC, Okada says. But Japan hasn’t had a similar cutting-edge project. Okada says KEK currently has 30 to 40 scientists and engineers with relevant expertise but will need about triple that number to manage its share of the final design work. KEK hopes to fill the gap by luring experienced hands as well as signing up new recruits. “We think the ILC is a project which can attract young talent,” Okada says.

    Meanwhile, Yamashita says support for the project is building among local governments and neighboring prefectures as well as among national politicians. He says the ILC may also benefit from the fact that government spending on the 2020 Olympics in Tokyo will be winding down before the first funds are needed for its construction.

    See the full article here.

    The American Association for the Advancement of Science is an international non-profit organization dedicated to advancing science for the benefit of all people.

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition

     
  • richardmitnick 9:03 pm on December 28, 2015 Permalink | Reply
    Tags: , , , , , , Particle Physics   

    From DESY: “ERC Starting Grant for characterising the Higgs boson” 

    DESY

    2015/12/28
    No writer credit found

    Temp 1
    No image credit found

    Kerstin Tackmann, a physicist at DESY, is to receive over 1.3 million euros from the European Research Council (ERC) in order to carry out research aimed at a more detailed characterisation of the Higgs boson.

    CERN ATLAS Higgs Event
    Higgs event at ATLAS

    She will use a starting grant to set up a research group to investigate the properties of the Higgs boson in great detail, as part of the international ATLAS Collaboration.

    CERN ATLAS New
    ATLAS

    These measurements are an important step towards identifying whether the particle fits the Standard Model of particle physics. The 5-year project is scheduled to begin in 2016.

    Ever since particle physicists working on the big LHC experiments ATLAS and CMS announced, in 2012, the discovery of a particle whose properties corresponded to those of the elusive Higgs boson, particle physics has faced an extremely exciting mystery: does this Higgs boson fit the Standard Model of particle physics, the currently accepted description of the elementary particles that make up matter and the forces acting between them, or will it open the path to a new, higher-level theory?

    CERN LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

    CERN CMS Detector
    CMS

    CERN CMS Event
    CMS Higgs event

    Standard model with Higgs New
    Standard Model of Particle Physics

    Using the data available so far, scientists have already been able to determine the particle’s mass of around 125 gigaelectronvolts (GeV) and its spin of zero to a fairly high degree of accuracy. To obtain even more precise information about additional properties of the particle, the researchers need to analyse far more data from proton-proton collisions in the LHC. They are particularly interested in finding out exactly how the Higgs field, of which the Higgs boson is an indication, lends elementary particles their mass. To answer this question, they have started to analyse the collision data from “LHC Run 2”, which began this summer and which is expected to produce about 15 times as many Higgs bosons as the LHC’s previous run. The analysis of this large amount of collision data will allow far more reliable conclusions to be drawn.

    Kerstin Tackmann intends to devote herself to these questions together with two post-docs and three PhD students, and will be analysing in great detail the collisions recorded by the ATLAS detector during Run 2. They will be working as part of the ATLAS Collaboration, which involves hundreds of scientists from all over the world. Her group is going to concentrate on measuring the kinematic properties of Higgs boson production. The focus will lie especially on the decays of the Higgs boson into two photons or four leptons, which allow very accurate measurements to be made. This is where deviations from the precise predictions of the Standard Model could appear, should the Higgs boson not fit the model.
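
    As a rough illustration of why the two-photon channel is so clean, the candidate Higgs mass in that channel follows from just the two photon energies and their opening angle. The sketch below was written for this post with invented numbers; it is not ATLAS analysis code.

# Illustrative sketch: reconstructing a Higgs -> two-photon candidate mass
# from the photon energies (in GeV) and their opening angle (in radians).
# The numbers below are invented, not ATLAS data.
import math

def diphoton_mass(e1, e2, opening_angle):
    """Invariant mass of two (massless) photons: m = sqrt(2*E1*E2*(1 - cos(angle)))."""
    return math.sqrt(2.0 * e1 * e2 * (1.0 - math.cos(opening_angle)))

# Two 62.5 GeV photons emitted back to back reconstruct to 125 GeV.
print(diphoton_mass(62.5, 62.5, math.pi))  # 125.0 (up to floating-point rounding)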

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    DESY

    DESY is one of the world’s leading accelerator centres. Researchers use the large-scale facilities at DESY to explore the microcosm in all its variety – from the interactions of tiny elementary particles and the behaviour of new types of nanomaterials to biomolecular processes that are essential to life. The accelerators and detectors that DESY develops and builds are unique research tools. The facilities generate the world’s most intense X-ray light, accelerate particles to record energies and open completely new windows onto the universe. 
That makes DESY not only a magnet for more than 3000 guest researchers from over 40 countries every year, but also a coveted partner for national and international cooperations. Committed young researchers find an exciting interdisciplinary setting at DESY. The research centre offers specialized training for a large number of professions. DESY cooperates with industry and business to promote new technologies that will benefit society and encourage innovations. This also benefits the metropolitan regions of the two DESY locations, Hamburg and Zeuthen near Berlin.

     
  • richardmitnick 7:00 pm on December 23, 2015 Permalink | Reply
    Tags: , , , , Particle Physics, ,   

    From AAAS: “Physicists figure out how to retrieve information from a black hole” 

    AAAS


    23 December 2015
    Adrian Cho

    Temp 1
    It would take technologies beyond our wildest dreams to extract the tiniest amount of quantum information from a black hole like this one. NASA; M. Weiss/Chandra X-Ray Center

    Black holes earn their name because their gravity is so strong not even light can escape from them. Oddly, though, physicists have come up with a bit of theoretical sleight of hand to retrieve a speck of information that’s been dropped into a black hole. The calculation touches on one of the biggest mysteries in physics: how all of the information trapped in a black hole leaks out as the black hole “evaporates.” Many theorists think that must happen, but they don’t know how.

    Unfortunately for them, the new scheme may do more to underscore the difficulty of the larger “black hole information problem” than to solve it. “Maybe others will be able to go further with this, but it’s not obvious to me that it will help,” says Don Page, a theorist at the University of Alberta in Edmonton, Canada, who was not involved in the work.

    You can shred your tax returns, but you shouldn’t be able to destroy information by tossing it into a black hole. That’s because, even though quantum mechanics deals in probabilities—such as the likelihood of an electron being in one location or another—the quantum waves that give those probabilities must still evolve predictably, so that if you know a wave’s shape at one moment you can predict it exactly at any future time. Without such “unitarity” quantum theory would produce nonsensical results such as probabilities that don’t add up to 100%.
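
    For concreteness, unitarity can be written compactly in standard quantum-mechanics notation (added here as an aside; the article states it only in words): the state evolves by a unitary operator, which automatically preserves total probability,

        |\psi(t)\rangle = U(t)\,|\psi(0)\rangle, \qquad U^{\dagger}(t)\,U(t) = \mathbb{1} \quad\Rightarrow\quad \langle\psi(t)|\psi(t)\rangle = \langle\psi(0)|\psi(0)\rangle = 1.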

    But suppose you toss some quantum particles into a black hole. At first blush, the particles and the information they encode are lost. That’s a problem, as now part of the quantum state describing the combined black hole and particle system has been obliterated, making it impossible to predict its exact evolution and violating unitarity.

    Physicists think they have a way out. In 1974, British theorist Stephen Hawking argued that black holes can radiate particles and energy. Thanks to quantum uncertainty, empty space roils with pairs of particles flitting in and out of existence. Hawking realized that if a pair of particles from the vacuum popped into existence straddling the black hole’s boundary, then one particle could fly into space while the other would fall into the black hole. Carrying away energy from the black hole, the exiting Hawking radiation should cause the black hole to slowly evaporate. Some theorists suspect information reemerges from the black hole encoded in the radiation—although how remains unclear, as the radiation is supposedly random.

    Now, Aidan Chatwin-Davies, Adam Jermyn, and Sean Carroll of the California Institute of Technology in Pasadena have found an explicit way to retrieve information from one quantum particle lost in a black hole, using Hawking radiation and the weird concept of quantum teleportation.

    Quantum teleportation enables two partners, Alice and Bob, to transfer the delicate quantum state of one particle, such as an electron, to another. In quantum theory, an electron can spin one way (up), the other way (down), or literally both ways at once. In fact, its state can be described by a point on a globe on which the north pole signifies up and the south pole signifies down. Lines of latitude denote different mixtures of up and down, and lines of longitude denote the “phase,” or how the up and down parts mesh. However, if Alice tries to measure that state, it will “collapse” one way or the other, up or down, squashing information such as the phase. So she can’t measure the state and send the information to Bob, but must transfer it intact.
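
    In the usual notation, that globe is the Bloch sphere, and the electron's spin state can be written as

        |\psi\rangle = \cos\!\left(\tfrac{\theta}{2}\right)|{\uparrow}\rangle + e^{i\varphi}\,\sin\!\left(\tfrac{\theta}{2}\right)|{\downarrow}\rangle,

    where the polar angle \theta (the latitude, measured from the north pole) fixes the mixture of up and down, and the azimuthal angle \varphi (the longitude) is the phase. Just two real numbers, \theta and \varphi, specify the state, a fact the scheme described below relies on.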

    To do that Alice and Bob can share an additional pair of electrons connected by a special quantum link called entanglement. The state of either particle in the entangled pair is uncertain—it simultaneously points everywhere on the globe—but the states are correlated so that if Alice measures her particle from the pair and finds it spinning, say, up, she’ll know instantly that Bob’s electron is spinning down. So Alice has two electrons—the one whose state she wants to teleport and her half of the entangled pair. Bob has just the one from the entangled pair.

    To perform the teleportation, Alice takes advantage of one more strange property of quantum mechanics: that measurement not only reveals something about a system, it also changes its state. So Alice takes her two unentangled electrons and performs a measurement that “projects” them into an entangled state. That measurement breaks the entanglement between the pair of electrons that she and Bob share. But at the same time, it forces Bob’s electron into the state that her to-be-teleported electron was in. It’s as if, with the right measurement, Alice squeezes the quantum information from one side of the system to the other.
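
    The protocol is easy to check numerically. The sketch below is a toy NumPy simulation written for this post, not the authors' black-hole calculation: it teleports one spin state through a shared entangled pair. In the standard protocol, Alice's joint measurement has four possible outcomes, and Bob recovers the exact state after applying the Pauli correction that corresponds to the outcome Alice announces.

# Toy simulation of the standard quantum-teleportation protocol (an
# illustrative sketch only; variable names and numbers are invented here).
import numpy as np

def kron(*factors):
    """Kronecker product of several vectors or matrices."""
    out = factors[0]
    for f in factors[1:]:
        out = np.kron(out, f)
    return out

# Basis states and Pauli operators for a single qubit (spin up / spin down).
up = np.array([1.0, 0.0], dtype=complex)
down = np.array([0.0, 1.0], dtype=complex)
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# The unknown state Alice wants to teleport: a point (theta, phi) on the globe.
theta, phi = 1.1, 0.7
psi = np.cos(theta / 2) * up + np.exp(1j * phi) * np.sin(theta / 2) * down

# Qubit 1 holds Alice's unknown state. Qubits 2 and 3 form the entangled pair
# shared by Alice and Bob, prepared in the Bell state (|up,up> + |down,down>)/sqrt(2).
pair = (kron(up, up) + kron(down, down)) / np.sqrt(2)
state = kron(psi, pair)  # full three-qubit state, 8 amplitudes

# The four possible outcomes of Alice's joint ("Bell-basis") measurement on
# qubits 1 and 2, each listed with the correction Bob applies afterwards.
outcomes = {
    "Phi+": ((kron(up, up) + kron(down, down)) / np.sqrt(2), I2),
    "Phi-": ((kron(up, up) - kron(down, down)) / np.sqrt(2), Z),
    "Psi+": ((kron(up, down) + kron(down, up)) / np.sqrt(2), X),
    "Psi-": ((kron(up, down) - kron(down, up)) / np.sqrt(2), Z @ X),
}

for name, (bell_state, correction) in outcomes.items():
    # Bob's (unnormalised) qubit if Alice obtains this outcome: project
    # qubits 1 and 2 onto the Bell state and keep the qubit-3 amplitudes.
    bob = bell_state.conj() @ state.reshape(4, 2)
    prob = np.vdot(bob, bob).real              # probability of this outcome (1/4)
    bob = correction @ (bob / np.sqrt(prob))   # Bob applies the Pauli correction
    fidelity = abs(np.vdot(psi, bob)) ** 2     # overlap with Alice's original state
    print(f"{name}: probability {prob:.2f}, fidelity {fidelity:.3f}")  # 0.25, 1.000

    Each outcome occurs with probability 1/4, and in every case the fidelity comes out to 1: the state reappears on Bob's side even though no particle travelled between the two partners.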

    Chatwin-Davies and colleagues realized that they could teleport the information about the state of an electron out of a black hole, too. Suppose that Alice is floating outside the black hole with her electron. She captures one photon from a pair born from Hawking radiation. Much like an electron, the photon can spin in either of two directions, and it will be entangled with its partner photon that has fallen into the black hole. Next, Alice measures the total angular momentum, or spin, of the black hole—both its magnitude and, roughly speaking, how much it lines up with a particular axis. With those two bits of information in hand, she then tosses in her electron, losing it forever.

    But Alice can still recover the information about the state of that electron, the team reports in a paper in press at Physical Review Letters. All she has to do is once again measure the spin and orientation of the black hole. Those measurements then entangle the black hole and the in-falling photon. They also teleport the state of the electron to the photon that Alice captured. Thus, the information from the lost electron is dragged back into the observable universe.

    Chatwin-Davies stresses that the scheme is not a plan for a practical experiment. After all, it would require Alice to almost instantly measure the spin of a black hole as massive as the sun to within a single atom’s spin. “We like to joke around that Alice is the most advanced scientist in the universe,” he says.

    The scheme also has major limitations. In particular, as the authors note, it works for one quantum particle, but not for two or more. That’s because the recipe exploits the fact that the black hole conserves angular momentum, so that its final spin is equal to its initial spin plus that of the electron. That trick enables Alice to get out exactly two bits of information—the total spin and its projection along one axis—and that’s just enough information to specify the latitude and longitude of the quantum state of one particle. But it’s not nearly enough to recapture all the information trapped in a black hole, which typically forms when a star collapses upon itself.
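
    Written out in notation added here for clarity, that conservation law reads

        \vec{J}_{\mathrm{final}} = \vec{J}_{\mathrm{initial}} + \vec{s}_{e},

    so Alice's two measurements of the black hole's spin magnitude and projection hand her exactly two numbers, matching the two parameters \theta and \varphi that fix the state of a single spin. Additional in-falling particles would add more unknowns without adding conserved quantities to read them out, which is why the trick stops at one particle.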

    To really tackle the black hole information problem, theorists would also have to account for the complex states of the black hole’s interior, says Stefan Leichenauer, a theorist at the University of California, Berkeley. “Unfortunately, all of the big questions we have about black holes are precisely about these internal workings,” he says. “So, this protocol, though interesting in its own right, will probably not teach us much about the black hole information problem in general.”

    However, delving into the interior of black holes would require a quantum mechanical theory of gravity. Of course, developing such a theory is perhaps the grandest goal in all of theoretical physics, one that has eluded physicists for decades.

    See the full article here.

    The American Association for the Advancement of Science is an international non-profit organization dedicated to advancing science for the benefit of all people.

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition

     