Tagged: Theoretical Physics

  • richardmitnick 5:29 pm on June 26, 2016 Permalink | Reply
    Tags: , , , Theoretical Physics   

    From Science Alert: “What’s the point of theoretical physics?” 

    Science Alert

    24 JUN 2016
    ALEXANDER LENZ

    Ahuli Labutin/Shutterstock.com

    You don’t have to be a scientist to get excited about breakthroughs in theoretical physics. Discoveries such as gravitational waves and the Higgs boson can inspire wonder at the complex beauty of the Universe no matter how little you really understand them.

    But some people will always question why they should care about scientific advances that have no apparent impact on their daily life – and why we spend millions funding them. Sure, it’s amazing that we can study black holes thousands of light-years away and that Einstein really was as much of a genius as we thought, but that won’t change the way most people live or work.

    Yet the reality is that purely theoretical studies in physics can sometimes lead to amazing changes in our society. In fact, several key pillars on which our modern society rests, from satellite communication to computers, were made possible by investigations that had no obvious application at the time.

    Around 100 years ago, quantum mechanics was a purely theoretical topic, only developed to understand certain properties of atoms. Its founding fathers such as Werner Heisenberg and Erwin Schrödinger had no applications in mind at all. They were simply driven by the quest to understand what our world is made of.

    Quantum mechanics states that you cannot observe a system without fundamentally changing it by your observation, and initially its effects on society were philosophical rather than practical.

    But today, quantum mechanics is the basis of our use of all semiconductors in computers and mobile phones. To build a modern semiconductor for use in a computer, you have to understand concepts such as the way electrons behave when atoms are held together in a solid material, something only described accurately by quantum mechanics.

    Without it, we would have been stuck using computers based on vacuum tubes.

    At around the same time as the key developments in quantum mechanics, Albert Einstein was attempting to better understand gravity, the dominant force of the Universe.

    Rather than viewing gravity as a force between two bodies, he described it as a curving of space-time around each body, similar to how a rubber sheet will stretch if a heavy ball is placed on top of it. This was Einstein’s general theory of relativity.

    Today the most common application of this theory is in GPS. To use signals from satellites to pinpoint your location you need to know the precise time the signal leaves the satellite and when it arrives on Earth.

    Einstein’s theory of general relativity means that the distance of a clock from Earth’s centre of gravity affects how fast it ticks. And his theory of special relativity means that the speed a clock is moving at also affects its ticking speed.

    Without knowing how to adjust the clocks to take account of these effects, we wouldn’t be able to accurately use the satellite signals to determine our position on the ground. Despite his amazing brain, Einstein probably could not have imagined this application a century ago.
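
    To make the sizes concrete, here is a minimal back-of-the-envelope sketch (my illustration, not the article's own numbers) of both effects for a GPS satellite, using standard constants and the usual first-order weak-field approximations:

```python
# Hedged sketch: approximate relativistic clock drift for a GPS satellite.
# Constants are standard textbook values; formulas are first-order
# approximations, not taken from the article.
GM = 3.986004418e14       # Earth's gravitational parameter, m^3/s^2
c = 2.99792458e8          # speed of light, m/s
R_EARTH = 6.371e6         # mean Earth radius, m
R_SAT = 2.6561e7          # GPS orbit radius (semi-major axis), m

# General relativity: a clock higher in Earth's gravity well ticks faster.
gr_rate = GM / c**2 * (1.0 / R_EARTH - 1.0 / R_SAT)

# Special relativity: a moving clock ticks slower; v^2 = GM/r on a
# circular orbit.
v_sat = (GM / R_SAT) ** 0.5
sr_rate = -v_sat**2 / (2 * c**2)

DAY = 86400  # seconds
print(f"GR effect: {gr_rate * DAY * 1e6:+.1f} microseconds/day")   # ~ +45.7
print(f"SR effect: {sr_rate * DAY * 1e6:+.1f} microseconds/day")   # ~ -7.2
print(f"Net drift: {(gr_rate + sr_rate) * DAY * 1e6:+.1f} microseconds/day")
```

    The net drift comes out to roughly +38 microseconds per day; left uncorrected, that would translate into positioning errors of several kilometres within a single day.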

    Scientific culture

    Aside from the potential, eventual applications of doing fundamental research, there are also direct financial benefits. Most of the students and post-docs working on big research projects like the Large Hadron Collider will not stay in academia but move into industry.

    During their time in fundamental physics, they are educated at the highest technical level and then take that expertise into the companies they join. This is like educating car mechanics in Formula One racing teams.

    Despite these direct and indirect benefits, most theoretical physicists have a very different motive for their work. They simply want to improve humanity’s understanding of the Universe.

    While this might not immediately impact everyone’s lives, I believe it is just as important a reason for pursuing fundamental research.

    GPS: a relative success. Shutterstock

    This motivation may well have begun when humans first looked up at the night sky in ancient times. They wanted to understand the world they lived in and so spent time watching nature and creating theories about it, many of them involving gods or supernatural beings.

    Today we have made huge progress in our understanding of both stars and galaxies and, at the other end of the scale, of the tiny fundamental particles from which matter is built.

    It somehow seems that every new level of understanding we achieve comes in tandem with new, more fundamental questions. It is never enough to know what we now know. We always want to continue looking behind newly arising curtains. In that respect, I consider fundamental physics a basic part of human culture.

    Now we can wait curiously to find out what unforeseen spin-offs discoveries such as the Higgs boson or gravitational waves might lead to in the long-term future. But we can also look forward to the new insights into the building blocks of nature that they will bring us, and the new questions they will raise.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 4:31 pm on June 17, 2016 Permalink | Reply
    Tags: Institute for Quantum Computing U Waterloo, Noncontextuality, , , Theoretical Physics, What does it mean to say the world is quantum?   

    From PI: “New Experiment Clarifies How The Universe Is Not Classical” 

    Perimeter Institute

    June 17, 2016
    Erin Bow

    “This is a great example of what’s possible when Perimeter and IQC work together. We can start with these exciting, abstract ideas and convert them to things we can actually do in our labs.”
    – Kevin Resch, Faculty member, Institute for Quantum Computing

    From left to right: Matthew Pusey (Perimeter postdoctoral researcher), Kevin Resch (IQC and University of Waterloo faculty member), Robert Spekkens (Perimeter faculty member), and Michael Mazurek (University of Waterloo and IQC PhD student) interact in a quantum optics lab at the Institute for Quantum Computing. No image credit.

    Theorists from Perimeter and experimentalists from the Institute for Quantum Computing have found a new way to test whether the universe is quantum, a test that will have widespread applicability: they’ve proven the failure of noncontextuality in the lab.

    What does it mean to say the world is quantum? It’s a surprisingly difficult question to answer, and most casual discussions on the point are heavy on the hand-waving, with references to cats in boxes.

    If we are going to turn the quantum-ness of the universe to our advantage through technologies like quantum computing, our definition of what it means to be quantum – or, more broadly, what it means to be non-classical – needs to be more rigorous. That’s one of the aims of the field of quantum foundations, and the point of new joint research carried out by theorists at Perimeter and experimentalists at the University of Waterloo’s Institute for Quantum Computing (IQC).

    “We need to make precise the notion of non-classicality,” says Robert Spekkens, a faculty member at Perimeter, who led the work from the theoretical side. “We need to find phenomena that defy classical explanation, and then subject those phenomena to direct experimental tests.”

    One candidate for something that defies classical explanation is the failure of noncontextuality.

    “You can think of noncontextuality as the ‘if it walks like a duck’ principle,” says Matthew Pusey, a postdoctoral researcher at Perimeter who also worked on the project.

    As the saying has it, if something walks like a duck and quacks like a duck, it’s probably a duck. The principle of noncontextuality pushes that further, and says that if something walks like a duck and quacks like a duck and you can’t tell it apart from a duck in any experiment, not even in principle, then it must be a duck.

    Though noncontextuality is not something we often think about, it is a feature one would expect to hold in experiments. Indeed, it’s so intuitive that it seems silly to say it aloud: if you can’t tell two things apart, even in principle, then they’re the same. Makes sense, right?

    But in the quantum universe, it’s not quite true.

    Under quantum theory, two preparations of a system can return identical results in every conceivable test. But researchers run into trouble when they try to define exactly what those systems are doing. It turns out that in quantum mechanics, any model that assigns the systems well-defined properties requires them to be different. That’s a violation of the principle of noncontextuality.

    To understand what’s happening, imagine a yellow box that spits out a mix of polarized photons – half polarized horizontally and half polarized vertically. A different box – imagine it to be orange – spits out a different mix of photons, half polarized diagonally and half polarized anti-diagonally.

    Now measure the polarization of the photons from the yellow box and of the photons from the orange box. You can measure any polarization property you like, as much as you like. Because of the way the probabilities add up, the statistics of any measurement performed on photons from the yellow box are going to be identical to the statistics of the same measurement performed on photons from the orange box. In each case, the average polarization is always zero.

    “Those two kinds of boxes, according to quantum theory, cannot be distinguished,” says Spekkens. “All the measurements are going to see exactly the same thing.”
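
    The indistinguishability is easy to verify in quantum theory's own formalism. The following sketch (my illustration, not the team's code) builds the density matrices of the two mixtures with NumPy and checks that they are identical, so no measurement statistics can separate them:

```python
import numpy as np

# Polarization states as two-component (qubit) vectors.
H = np.array([1, 0], dtype=complex)   # horizontal
V = np.array([0, 1], dtype=complex)   # vertical
D = (H + V) / np.sqrt(2)              # diagonal
A = (H - V) / np.sqrt(2)              # anti-diagonal

def mixture(states):
    """Density matrix of an equal statistical mixture of pure states."""
    return sum(np.outer(s, s.conj()) for s in states) / len(states)

rho_yellow = mixture([H, V])  # yellow box: half H, half V
rho_orange = mixture([D, A])  # orange box: half D, half A

# Both equal the maximally mixed state I/2, so every polarization
# measurement yields identical statistics on the two boxes.
print(np.allclose(rho_yellow, rho_orange))        # True
print(np.allclose(rho_yellow, np.eye(2) / 2))     # True
```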

    You might think, following the principle of noncontextuality, that since the yellow and orange boxes produce indistinguishable mixes of photons, they can be described by the same probability distributions. They walk like ducks, so you can describe them both as ducks. But as it turns out, that doesn’t work.

    In a noncontextual world, the fact that the yellow-box photons and orange-box photons are indistinguishable would be explained in the natural way: by the fact that the probability distributions over properties are the same. But the quantum universe resists such explanations – it can be proven mathematically that those two mixtures of photons cannot be described by the same distribution of properties.

    “So that’s the theoretical result,” says Spekkens. “If quantum theory is right, then we can’t have a noncontextual model.”

    But can such a theoretical result be tested? Theorists from Perimeter and experimentalists from IQC set out to discover that very thing.

    Kevin Resch, a faculty member at IQC and the Department of Physics and Astronomy at the University of Waterloo, as well as a Perimeter Affiliate, worked on the project from the experimental end in his lab.

    “The original method of testing noncontextuality required two or more preparation procedures that give exactly the same statistics,” he says. “I would argue that that’s basically not possible, because no experiments are perfect. The method described in our paper allows contextuality tests to deal with these imperfections.”

    While previous attempts to test for the predicted failure of noncontextuality have had to resort to assuming things like noiseless measurements that are not achievable in practice, the Perimeter and IQC teams wanted to avoid such unrealistic assumptions. They knew they couldn’t eliminate all error, so they designed an experiment that could make meaningful tests of noncontextuality even in the presence of error.

    Pusey hit on a clever idea to fight statistical error with statistical inference. Ravi Kunjwal, a doctoral student at the Institute for Mathematical Sciences in Chennai, India, who was visiting at the time, helped define what a test of noncontextuality should look like operationally. Michael Mazurek, a doctoral student with Waterloo’s Department of Physics and Astronomy and IQC, built the experimental apparatus – single photon emitters and detectors, just as in the yellow-and-orange box example above – and ran the tests.

    “The interesting part of the experiment is that it looks really simple on paper,” says Mazurek. “But it wasn’t simple in practice. The analysis that we did and the standards that we held ourselves to required us to really get on top of the small systematic errors that are present in every experiment. Characterizing those errors and compensating for them was quite challenging.”

    At one point, Mazurek used half a roll of masking tape to keep optical fibres from moving around in response to tiny shifts in temperature. Nothing about this experiment was easy, and much of it can only be described with statistics and diagrams. But in the end, the team made it work.

    The result: an experiment that definitively shows the failure of noncontextuality. Like the pioneering work on Bell’s theorem, this research clarifies what it means for the world to be non-classical, and experimentally confirms that it is.

    Importantly, and in contrast to previous tests of contextuality, this experiment renders its verdict without assuming any idealizations, such as noiseless measurements or statistics being exactly the same. This opens a new range of possibilities.

    Researchers in several fields are working to find “quantum advantages” – that is, things we can do if we harness the quantum-ness of the world that would not be possible in the classical world. Examples include quantum cryptography and quantum computation. Such advantages are the beams and girders of any future quantum technology we might be able to build. Noncontextuality can help researchers understand these quantum advantages.

    “We now know, for example, that for certain kinds of cryptographic tasks and computational tasks, the failure of noncontextuality is the resource,” says Spekkens.

    In other words, contextuality is the steel out of which the beams and girders are made.

    “This is a great example of what’s possible when Perimeter and IQC work together,” says Resch, Canada Research Chair in Optical Quantum Technologies. “We can start with these exciting, abstract ideas and convert them to things we can actually do in our labs.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    About Perimeter

    Perimeter Institute is a leading centre for scientific research, training and educational outreach in foundational theoretical physics. Founded in 1999 in Waterloo, Ontario, Canada, its mission is to advance our understanding of the universe at the most fundamental level, stimulating the breakthroughs that could transform our future. Perimeter also trains the next generation of physicists through innovative programs, and shares the excitement and wonder of science with students, teachers and the general public.

     
  • richardmitnick 6:39 am on May 20, 2016 Permalink | Reply
    Tags: , , Theoretical Physics   

    From CERN: “In Theory: Is theoretical physics in crisis?” 

    CERN

    20 May 2016
    Harriet Jarlett

    “The way physics develops is often a lot less logical than the theories it leads to — you cannot plan discoveries. Especially in theoretical physics.” Gian Giudice, Head of CERN’s Theory Department (Image: Sophia Bennett/ CERN)

    Over the past decade physicists have explored new corners of our world, and in doing so have answered some of the biggest questions of the past century.

    When researchers discovered the Higgs boson in 2012, it was a huge moment of achievement.

    CERN CMS Higgs Event

    It showed theorists had been right to look towards the Standard Model for answers about our Universe.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    But then the particle acted just as the theorists said it would; it obeyed every rule they predicted. If it had acted even slightly differently, it would have raised many questions about the theory and our universe. Instead, it raised few questions and gave no new clues about where to look next.

    In other words, the theorists had done too good a job.

    “We are struggling to find clear indications that can point us in the right direction. Some people see in this state of crisis a source of frustration. I see a source of excitement because new ideas have always thrived in moments of crisis.” – Gian Giudice, head of the Theory Department at CERN.

    Before these discoveries, physicists were standing on the edge of a metaphorical flat Earth, suspecting it was round but not knowing for sure. Finding both the Higgs boson, and evidence of gravitational waves has brought scientists closer than ever to understanding two of the great theories of our time – the Standard Model and the theory of relativity.

    Now the future of theoretical physics is at a critical point: theorists have proved their own theories, so what is left to do?

    So what next?

    “Taking unexplained data, trying to fit it to the ideas of the universe […] – that’s the spirit of theoretical physics” – Gian Giudice

    In an earlier article in this series [link to series is below], we spoke about how experimental physicists and theoretical physicists must work together. Their symbiotic relationship – with theorists telling experimentalists where to look, and experimentalists asking theorists for explanations of unusual findings – is necessary, if we are to keep making discoveries.

    Just four years ago, in 2012, physicists still held genuine uncertainty about whether the lynchpin of the Standard Model, the Higgs boson, existed at all. Now, there’s much less uncertainty.

    “We are still in an uncertain period. Previously we were uncertain as to how the Standard Model could be completed. Now we know it is pretty much complete, so we can focus on the questions beyond it: dark matter, the future of the universe, the beginning of the universe, little things like that,” says John Ellis, a theoretical physicist from King’s College London, who began working at CERN in 1973.

    Michelangelo Mangano moved to the US to work at Princeton just as string theory was becoming popular. “After the first big explosion of interest, there’s always a period of slowing down, because all the easier stuff has been done. And you’re struggling with more complex issues,” he explains. “This is something that today’s young theorists are finding as they struggle to make waves in fields like the Standard Model. Unexpected findings from the LHC could reignite their enthusiasm and help younger researchers to feel like they can have an impact.” (Image: Maximillien Brice/CERN)

    With the discovery of the Higgs, there’s been a shift in this relationship, with theoreticians not necessarily leading the way. Instead, experiments look for data to try and give more evidence to the already proposed theories, and if something new is thrown up theorists scramble to explain and make sense of it.

    “It’s like when you go mushroom hunting,” says Michelangelo Mangano, a theoretical physicist who works closely with experimental physicists. “You spend all your energy looking, and at the end of the day you may not find anything. Here it’s the same: there is a lot of wasted energy, because it doesn’t lead to much, but by exploring all corners of the field occasionally you find a little gold nugget, a perfect mushroom.”

    At the end of last year, both the ATLAS and CMS experiments at CERN found their mushroom, an intriguing, albeit very small, bump in the data.

    This little, unexpected bump could be the door to a whole host of new physics, because it could be a new particle. After the discovery of the Higgs most of the holes in the Standard Model had been sewn up, but many physicists were optimistic about finding new anomalies.

    “What happens in the future largely depends on what the LHC finds in its second run,” Ellis explains. “So if it turns out that there’s no other new physics and we’re focusing on understanding the Higgs boson better, that’s a different possible future for physics than if LHC Run 2 finds a new particle we need to understand.”

    While the bump is too small for physicists to announce it conclusively, hundreds of papers have been published by theoretical physicists leaping to say what it might be.

    “Taking unexplained data, trying to fit it to your ideas about the universe, revising your ideas once you get more data, and on and on until you have unravelled the story of the universe – that’s the spirit of theoretical physics,” says Giudice.

    John Ellis classifies himself as a ‘scientific optimist’, happy to pick up whatever tools are available to him to help solve the problems that he has thought up. ‘By nature I’m an optimist, so anything can happen. Yes, we might not see anything beyond the Higgs boson, but let’s just wait and see.’ Here he is interviewed by Harriet Jarlett (left) in his office at CERN. (Image: Sophia Bennett/CERN)

    But we’ll only know whether it’s something worthwhile once the LHC restarts this month, May 2016, and experimental physicists can start to take even more data and conclude what it is.

    Next generation of theory

    This unusual period of quiet in the world of theoretical physics means students studying physics might be more likely to go into experimental physics, where the major discoveries are seen as happening more often, and where young physicists have a chance to be the first to a discovery.

    Speaking to the summer students at CERN, some of whom hope to become theoretical physicists, one gets the feeling that this period of uncertainty makes following theory a luxury – one that young physicists, who need to have original ideas and publish lots of papers to get ahead, can’t afford.

    Camille Bonvin is working as a fellow in the Theory Department on cosmology, trying to understand why the universe’s expansion is accelerating. If gravity is described by Einstein’s theory of general relativity, the expansion should be slowing, not accelerating, which means there’s something we don’t understand. Bonvin is trying to find out what that is. She thinks the best theories are simple, consistent and make sense, like general relativity. “Einstein is completely logical, and his theory makes sense. Sometimes you have the impression of taking a theory which already exists and adding one element, then another, then another, to try and make the data fit it better, but it’s not a fundamental theory, so for me it’s not extremely beautiful.” (Image: Sophia Bennett/CERN)

    Camille Bonvin, a young theoretical physicist at CERN hopes that the data bump is the key to new physics, because without new discoveries it’s hard to keep a younger generation interested: “If both the LHC and the upcoming cosmological surveys find no new physics, it will be difficult to motivate new theorists. If you don’t know where to go or what to look for, it’s hard to see in which direction your research should go and which ideas you should explore.”

    The future’s bright

    Richard Feynman

    Richard Feynman, one of the most famous theoretical physicists, once joked: “Physics is like sex. Sure, it may give some practical results, but that’s not why we do it.”

    And Gian Giudice agrees – while the field’s current uncertainty makes it more difficult for young people to make breakthroughs, it’s not the promise of glory that draws people down the theory path, but a simple passion for understanding why our universe is the way it is.

    “It must be difficult for the new generations of young researchers to enter theoretical physics now when it is not clear where different directions are leading to,” he says. “But it’s much more interesting to play when you don’t know what’s going to happen, rather than when the rules of the game have already been settled.”

    “It’s much more interesting to play when you don’t know what’s going to happen, rather than when the rules of the game have already been settled,” says Giudice, who took on the role of leading the department in 2016 (Image: Sophia Bennett/CERN)

    Giudice, who took on the role of leading the theory department in January 2016, is optimistic that the turbulence the field currently faces makes it one of the most exciting times to become a theoretical physicist.

    “It has often been said that it is difficult to make predictions; especially about the future. It couldn’t be more true today in particle physics. This is what makes the present so exciting. Looking back in the history of physics you’ll see that moments of crisis and confusion were invariably followed by great revolutionary ideas. I hope it’s about to happen again,” smiles Giudice.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Meet CERN in a variety of places:

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    ALICE

    CMS

    LHCb

    LHC

    Quantum Diaries

     
  • richardmitnick 1:52 pm on May 13, 2016 Permalink | Reply
    Tags: , , , , , , Theoretical Physics   

    From FNAL: “What do theorists do?” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    May 13, 2016
    Leah Hesla
    Rashmi Shivni

    Pilar Coloma (left) and Seyda Ipek write calculations from floor to ceiling as they try to find solutions to lingering questions about our current models of the universe. Photo: Rashmi Shivni, OC

    Some of the ideas you’ve probably had about theoretical physicists are true.

    They toil away at complicated equations. The amount of time they spend on their computers rivals that of millennials on their hand-held devices. And almost nothing of what they turn up will ever be understood by most of us.

    The statements are true, but as you might expect, the resulting portrait of ivory tower isolation misses the mark.

    The theorist’s task is to explain why we see what we see and predict what we might expect to see, and such pronouncements can’t be made from the proverbial armchair. Theorists work with experimentalists, their counterparts in the proverbial field, as a vital part of the feedback loop of scientific investigation.

    “Sometimes I bounce ideas off experimentalists and learn from what they have seen in their results,” said Fermilab theorist Pilar Coloma, who studies neutrino physics. “Or they may find something profound in theory models that they want to test. My job is all about pushing the knowledge forward so other people can use it.”

    Predictive power

    Theorists in particle physics — the Higgses and Hawkings of the world — push knowledge by making predictions about particle interactions. Starting from the framework known as the Standard Model, they calculate, say, the likelihood of numerous outcomes from the interaction of two electrons, like a blackjack player scanning through the possibilities for the dealer’s next draw.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Experimentalists can then seek out the predicted phenomena, rooting around in the data for a never-before-seen phenomenon.

    Theorists’ predictions keep experimentalists from having to shoot in the dark. Like an experienced paleontologist, the theorist can tell the experimentalist where to dig to find something new.

    “We simulate many fake events,” Coloma said. “The simulated data determines the prospects for an experiment or puts a bound on a new physics model.”

    The Higgs boson provides one example.

    CERN ATLAS Higgs Event

    By 2011, a year before CERN’s ATLAS and CMS experiments announced they’d discovered the Higgs boson, theorists had put forth nearly 100 different proposals by as many different methods for the particle’s mass. Many of the predictions were indeed in the neighborhood of the mass as measured by the two experiments.

    And like the paleontologist presented with a new artifact, the theorist also offers explanations for unexplained sightings in experimentalists’ data. She might compare the particle signatures in the detector against her many fake events. Or given an intriguing measurement, she might fold it into the next iteration of calculations. If experimentalists see a particle made of a quark combination not yet on the books, theorists would respond by explaining the underlying mechanism or, if there isn’t one yet, work it out.

    “Experimentalists give you information. ‘We think this particle is of this type. Do you know of any Standard Model particle that fits?’” said Seyda Ipek, a theorist studying the matter-antimatter imbalance in the universe. “At first it might not be obvious, because when you add something new, you change the other observations you know are in the Standard Model, and that puts a constraint on your models.”

    And since the grand aim of particle physics theory is to be able to explain all of nature, the calculation developed to explain a new phenomenon must be extendible to a general principle.

    “Unless you have a very good prediction from theory, you can’t convert that experimental measurement into a parameter that appears in the underlying theory of the Standard Model,” said Fermilab theorist John Campbell, who works on precision theoretical predictions for the ATLAS and CMS experiments at the Large Hadron Collider.

    Calculating moves

    The theorist’s calculation starts with the prospect of a new measurement or a hole in a theory.

    “You look at the interesting things that an experiment is going to measure or that you have a chance of measuring,” Campbell said. “If the data agrees with theory everywhere, there’s not much room for new physics. So you look for small deviations that might be a sign of something. You’re really trying to dream up a new set of interactions that might explain why the data doesn’t agree somewhere.”

    In its raw form, particle physics data is the amount and location of the energy a particle deposits in a particle detector. The more sensitive the detector, the more accurate the experimentalists’ measurement, and the more precise the corresponding calculation needs to be.

    Fermilab theorists John Campbell (left) and Ye Li work on a calculation that describes the interactions you might expect to see in the complicated environment of the LHC. Photo: Rashmi Shivni

    The CMS detector at the Large Hadron Collider, for example, allows scientists to measure some probabilities of particle interactions to within a few percent. And that’s after taking into account that it takes one million or even one billion proton-proton collisions to produce just one interesting interaction that CMS would like to measure.

    “When you’re making the measurement that accurately, it demands a prediction at a very high level,” Campbell said. “If you’re looking for something unexpected, then you need to know the expected part in quite a lot of detail.”

    A paleontologist recognizes the vertebra of a brachiosaurus, and the theoretical particle physicist knows what the production of a pair of top quarks looks like in the detector. A departure from the known picture triggers him to take action.

    “So then you embark on this calculation,” Campbell said.

    Embark, indeed. These calculations are not pencil-and-paper assignments. A single calculation predicting the details of a particle interaction, for example, can be a prodigious effort that takes months or years.

    So-called loop corrections are one example: Theorists home in on what happens during a particle event by adding detail — a correction — to an approximate picture.

    Consider two electrons that approach each other, exchange a photon and diverge. Zooming in further, you predict that the photon emits and reabsorbs yet another pair of particles before it itself is reabsorbed by the electron pair. And perhaps you predict that, at the same time, one of the electrons emits and reabsorbs another photon all on its own.

    Each additional quantum-scale effect, or loop, in the big-picture interaction is like pennies on the dollar, changing the accounting of the total transaction — the precision of a particle mass calculation or of the interaction strength between two particles.

    With each additional loop, the task of performing the calculation becomes that much more formidable. (“Loop” reflects how the effects are represented pictorially in Feynman diagrams — details in the approximate picture of the interaction.) Theorists completed the one-loop corrections for the production of a Higgs boson from two protons only in 1991. It took another 10 years to complete the two-loop corrections for the process. And it wasn’t until this year, 2016, that they finished computing the three-loop corrections. Precise measurements at the Large Hadron Collider would (and do) require precise predictions to determine the kind of Higgs boson that scientists would see, demanding the decades-long investment.
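
    Schematically, each loop order adds one more power of the strong coupling to the prediction. The expression below is a standard illustration of such a perturbative series, not a formula quoted in the article:

```latex
% Cross section as a perturbative series in the strong coupling \alpha_s.
% c_1, c_2, c_3 are the one-, two- and three-loop coefficients; each is
% dramatically harder to compute than the one before.
\sigma \;=\; \sigma_{\mathrm{LO}}
\left( 1 + c_1\,\alpha_s + c_2\,\alpha_s^2 + c_3\,\alpha_s^3 + \cdots \right)
```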

    “Doing these calculations is not straightforward, or we would have done them a long time ago,” Campbell said.

    Once the theorist completes a calculation, they might publish a paper or otherwise make their code broadly available. From there, experimentalists can use the code to simulate how it will look in the detector. Farms of computers map out millions of fake events that take into account the new predictions provided courtesy of the theorist.

    “Without a network of computers available, our studies can’t be done in a reasonable time,” Coloma said. “A single computer cannot analyze millions of data points, just as a human being could never take on such a task.”
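
    As a flavor of what those farms of computers do, here is a deliberately tiny toy Monte Carlo (an illustration with invented numbers, not Fermilab's simulation code): generate a million fake events and count how few survive a selection window.

```python
import numpy as np

# Toy Monte Carlo: a million fake events drawn from an invented falling
# mass spectrum; count the rare ones landing in a narrow search window.
rng = np.random.default_rng(42)
n_events = 1_000_000
mass = rng.exponential(scale=50.0, size=n_events)  # toy spectrum, GeV

window = (mass > 120) & (mass < 130)  # hypothetical search region
print(f"{window.sum()} of {n_events} toy events pass the selection")
```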

    If the simulation shows that, for example, a particle might decay in more ways than what the experiment has seen, the theorist could suggest that experimentalists expand their search.

    “We’ve pushed experiments to look in different channels,” Ipek said. “They could look into decays of particles into two-body states, but why not also 10-body states?”

    Theorists also work with an experiment, or multiple experiments, to put their calculations to best use. Armed with code, experimentalists can change a parameter or two to guide them in their search for new physics. What happens, for example, if the Higgs boson interacts a little more strongly with the top quark than we expect? How would that change what we see in our detectors?

    “That’s a question they can ask and then answer,” Campbell said. “Anyone can come up with a new theory. It is best to try to provide a concrete plan that they can follow.”

    Outlandish theories and concrete plans

    Concrete plans ensure a fruitful relationship between experiment and theory. The wilder, unconventional theories scientists dream up take the field into exciting, uncharted territory, but that isn’t to say that they don’t also have their utility.

    Theorists who specialize in physics beyond the Standard Model, for example, generate thousands of theories worldwide for new physics – new phenomena seen as new energy deposits in the detector where you don’t expect to see them.

    “Even if things don’t end up existing, it encourages the experiment to look at its data in different ways,” Campbell said. An experiment could take so much data that you might worry that some fun effect is hiding, never to be seen. Having truckloads of theories helps mitigate against that. “You’re trying to come up with as many outlandish ideas as you can in the hope that you cover as many of those possibilities as you can.”

    Theorists bridge the gap between the pure mathematics that describes nature and the data through which nature manifests.

    “The field itself is challenging, but theory takes us to new places and helps us imagine new phenomena,” Ipek said. “We collectively work toward understanding every detail of our universe and that’s what ultimately matters most.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition


    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 4:12 pm on January 2, 2016 Permalink | Reply
    Tags: , , , , Theoretical Physics   

    From PI Via Daily Galaxy: “The Big Bang was a Mirage from a Collapsing Higher-Dimensional Star” February 2015 but Very Interesting 

    The Daily Galaxy

    Perimeter Institute

    February 14, 2015 [Just brought forward – again]
    No writer credit


    Big Bang was a mirage from a collapsing higher-dimensional star, theorists propose. While the recent [ESA] Planck results “prove that inflation is correct”, they leave open the question of how inflation happened.

    ESA/Planck

    The new study could help to show how inflation was triggered by the motion of the Universe through a higher-dimensional reality.
    The event horizon of a black hole — the point of no return for anything that falls in — is a spherical surface. In a higher-dimensional universe, a black hole could have a three-dimensional event horizon, which could spawn a whole new universe as it forms.

    It could be time to bid the Big Bang bye-bye. Cosmologists have speculated that the Universe formed from the debris ejected when a four-dimensional star collapsed into a black hole — a scenario that would help to explain why the cosmos seems to be so uniform in all directions.

    CMB per Planck

    The standard Big Bang model tells us that the Universe exploded out of an infinitely dense point, or singularity. But nobody knows what would have triggered this outburst: the known laws of physics cannot tell us what happened at that moment.

    “For all physicists know, dragons could have come flying out of the singularity,” says Niayesh Afshordi, an astrophysicist at the Perimeter Institute for Theoretical Physics in Waterloo, Canada.

    It is also difficult to explain how a violent Big Bang would have left behind a Universe that has an almost completely uniform temperature, because there does not seem to have been enough time since the birth of the cosmos for it to have reached temperature equilibrium.

    To most cosmologists, the most plausible explanation for that uniformity is that, soon after the beginning of time, some unknown form of energy made the young Universe inflate at a rate that was faster than the speed of light. That way, a small patch with roughly uniform temperature would have stretched into the vast cosmos we see today. But Afshordi notes that “the Big Bang was so chaotic, it’s not clear there would have been even a small homogenous patch for inflation to start working on”.

    In a paper posted last week on the arXiv preprint server, Afshordi and his colleagues turn their attention to a proposal made in 2000 by a team including Gia Dvali, a physicist now at the Ludwig Maximilians University in Munich, Germany. In that model, our three-dimensional (3D) Universe is a membrane, or brane, that floats through a ‘bulk universe’ that has four spatial dimensions.

    Afshordi’s team realized that if the bulk universe contained its own four-dimensional (4D) stars, some of them could collapse, forming 4D black holes in the same way that massive stars in our Universe do: they explode as supernovae, violently ejecting their outer layers, while their inner layers collapse into a black hole.

    In our Universe, a black hole is bounded by a spherical surface called an event horizon. Whereas in ordinary three-dimensional space it takes a two-dimensional object (a surface) to create a boundary inside a black hole, in the bulk universe the event horizon of a 4D black hole would be a 3D object — a shape called a hypersphere. When Afshordi’s team modelled the death of a 4D star, they found that the ejected material would form a 3D brane surrounding that 3D event horizon, and slowly expand.

    The authors postulate that the 3D Universe we live in might be just such a brane — and that we detect the brane’s growth as cosmic expansion. “Astronomers measured that expansion and extrapolated back that the Universe must have begun with a Big Bang — but that is just a mirage,” says Afshordi.

    The model also naturally explains our Universe’s uniformity. Because the 4D bulk universe could have existed for an infinitely long time in the past, there would have been ample opportunity for different parts of the 4D bulk to reach an equilibrium, which our 3D Universe would have inherited.

    The picture has some problems, however. Earlier this year, the European Space Agency’s Planck space observatory released data that mapped the slight temperature fluctuations in the cosmic microwave background — the relic radiation that carries imprints of the Universe’s early moments. The observed patterns matched predictions made by the standard Big Bang model and inflation, but the black-hole model deviates from Planck’s observations by about 4%. Hoping to resolve the discrepancy, Afshordi says that his team is now refining its model.

    Despite the mismatch, Dvali praises the ingenious way in which the team threw out the Big Bang model. “The singularity is the most fundamental problem in cosmology and they have rewritten history so that we never encountered it,” he says. Whereas the Planck results “prove that inflation is correct”, they leave open the question of how inflation happened, Dvali adds. The study could help to show how inflation is triggered by the motion of the Universe through a higher-dimensional reality, he says.

    Nature doi:10.1038/nature.2013.13743

    See the full article here.

    Please help promote STEM in your local schools.

    stem

    STEM Education Coalition

     
  • richardmitnick 3:47 pm on November 25, 2015 Permalink | Reply
    Tags: , , , Theoretical Physics   

    From Nature: “Theoretical physics: The origins of space and time” 2013 but Very Informative 

    Nature

    28 August 2013
    Zeeya Merali


    “Imagine waking up one day and realizing that you actually live inside a computer game,” says Mark Van Raamsdonk, describing what sounds like a pitch for a science-fiction film. But for Van Raamsdonk, a physicist at the University of British Columbia in Vancouver, Canada, this scenario is a way to think about reality. If it is true, he says, “everything around us — the whole three-dimensional physical world — is an illusion born from information encoded elsewhere, on a two-dimensional chip”. That would make our Universe, with its three spatial dimensions, a kind of hologram, projected from a substrate that exists only in lower dimensions.

    This ‘holographic principle’ is strange even by the usual standards of theoretical physics. But Van Raamsdonk is one of a small band of researchers who think that the usual ideas are not yet strange enough. If nothing else, they say, neither of the two great pillars of modern physics — general relativity, which describes gravity as a curvature of space and time, and quantum mechanics, which governs the atomic realm — gives any account for the existence of space and time. Neither does string theory, which describes elementary threads of energy.

    Van Raamsdonk and his colleagues are convinced that physics will not be complete until it can explain how space and time emerge from something more fundamental — a project that will require concepts at least as audacious as holography. They argue that such a radical reconceptualization of reality is the only way to explain what happens when the infinitely dense ‘singularity‘ at the core of a black hole distorts the fabric of space-time beyond all recognition, or how researchers can unify atomic-level quantum theory and planet-level general relativity — a project that has resisted theorists’ efforts for generations.

    “All our experiences tell us we shouldn’t have two dramatically different conceptions of reality — there must be one huge overarching theory,” says Abhay Ashtekar, a physicist at Pennsylvania State University in University Park.

    Finding that one huge theory is a daunting challenge. Here, Nature explores some promising lines of attack — as well as some of the emerging ideas about how to test these concepts.


    Gravity as thermodynamics

    One of the most obvious questions to ask is whether this endeavour is a fool’s errand. Where is the evidence that there actually is anything more fundamental than space and time?

    A provocative hint comes from a series of startling discoveries made in the early 1970s, when it became clear that quantum mechanics and gravity were intimately intertwined with thermodynamics, the science of heat.

    In 1974, most famously, Stephen Hawking of the University of Cambridge, UK, showed that quantum effects in the space around a black hole will cause it to spew out radiation as if it was hot. Other physicists quickly determined that this phenomenon was quite general. Even in completely empty space, they found, an astronaut undergoing acceleration would perceive that he or she was surrounded by a heat bath. The effect would be too small to be perceptible for any acceleration achievable by rockets, but it seemed to be fundamental. If quantum theory and general relativity are correct — and both have been abundantly corroborated by experiment — then the existence of Hawking radiation seemed inescapable.
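
    The temperature that accelerating observer would measure is given by the standard Unruh formula, added here for concreteness (the article itself only states that the effect is tiny):

```latex
% Unruh temperature for an observer with proper acceleration a:
T \;=\; \frac{\hbar\, a}{2\pi c\, k_B}
% Even at a = 10g this is only ~4 x 10^{-19} K, far below anything a
% rocket-borne thermometer could perceive.
```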

    A second key discovery was closely related. In standard thermodynamics, an object can radiate heat only by decreasing its entropy, a measure of the number of quantum states inside it. And so it is with black holes: even before Hawking’s 1974 paper, Jacob Bekenstein, now at the Hebrew University of Jerusalem, had shown that black holes possess entropy. But there was a difference. In most objects, the entropy is proportional to the number of atoms the object contains, and thus to its volume. But a black hole’s entropy turned out to be proportional to the surface area of its event horizon — the boundary out of which not even light can escape. It was as if that surface somehow encoded information about what was inside, just as a two-dimensional hologram encodes a three-dimensional image.
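
    The area scaling Bekenstein found is captured by the standard Bekenstein–Hawking formula, quoted here as a textbook reference rather than taken from the article:

```latex
% Black-hole entropy grows with horizon area A, not volume:
S_{\mathrm{BH}} \;=\; \frac{k_B\, c^3\, A}{4\,\hbar\, G}
```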

    In 1995, Ted Jacobson, a physicist at the University of Maryland in College Park, combined these two findings, and postulated that every point in space lies on a tiny black-hole horizon that also obeys the entropy–area relationship. From that, he found, the mathematics yielded [Albert] Einstein’s equations of general relativity — but using only thermodynamic concepts, not the idea of bending space-time (1).

    “This seemed to say something deep about the origins of gravity,” says Jacobson. In particular, the laws of thermodynamics are statistical in nature — a macroscopic average over the motions of myriad atoms and molecules — so his result suggested that gravity is also statistical, a macroscopic approximation to the unseen constituents of space and time.

    In 2010, this idea was taken a step further by Erik Verlinde, a string theorist at the University of Amsterdam, who showed (2) that the statistical thermodynamics of the space-time constituents — whatever they turned out to be — could automatically generate Newton’s law of gravitational attraction.

    And in separate work, Thanu Padmanabhan, a cosmologist at the Inter-University Centre for Astronomy and Astrophysics in Pune, India, showed (3) that Einstein’s equations can be rewritten in a form that makes them identical to the laws of thermodynamics — as can many alternative theories of gravity. Padmanabhan is currently extending the thermodynamic approach in an effort to explain the origin and magnitude of dark energy: a mysterious cosmic force that is accelerating the Universe’s expansion.

    Testing such ideas empirically will be extremely difficult. In the same way that water looks perfectly smooth and fluid until it is observed on the scale of its molecules — a fraction of a nanometre — estimates suggest that space-time will look continuous all the way down to the Planck scale: roughly 10−35 metres, or some 20 orders of magnitude smaller than a proton.
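
    The Planck scale quoted above follows directly from the fundamental constants; here is a quick sketch (standard formula, my arithmetic):

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8         # speed of light, m/s

l_planck = math.sqrt(hbar * G / c**3)
r_proton = 8.4e-16       # approximate proton charge radius, m

print(f"Planck length: {l_planck:.2e} m")        # ~1.6e-35 m
print(f"Orders of magnitude below a proton: "
      f"{math.log10(r_proton / l_planck):.1f}")  # ~19.7
```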

    But it may not be impossible. One often-mentioned way to test whether space-time is made of discrete constituents is to look for delays as high-energy photons travel to Earth from distant cosmic events such as supernovae and γ-ray bursts. In effect, the shortest-wavelength photons would sense the discreteness as a subtle bumpiness in the road they had to travel, which would slow them down ever so slightly. Giovanni Amelino-Camelia, a quantum-gravity researcher at the University of Rome, and his colleagues have found (4) hints of just such delays in the photons from a γ-ray burst recorded in April. The results are not definitive, says Amelino-Camelia, but the group plans to expand its search to look at the travel times of high-energy neutrinos produced by cosmic events. He says that if theories cannot be tested, “then to me, they are not science. They are just religious beliefs, and they hold no interest for me.”

    Other physicists are looking at laboratory tests. In 2012, for example, researchers from the University of Vienna and Imperial College London proposed (5) a tabletop experiment in which a microscopic mirror would be moved around with lasers. They argued that Planck-scale granularities in space-time would produce detectable changes in the light reflected from the mirror (see Nature http://doi.org/njf; 2012).

    Loop quantum gravity

    Even if it is correct, the thermodynamic approach says nothing about what the fundamental constituents of space and time might be. If space-time is a fabric, so to speak, then what are its threads?

    One possible answer is quite literal. The theory of loop quantum gravity, which has been under development since the mid-1980s by Ashtekar and others, describes the fabric of space-time as an evolving spider’s web of strands that carry information about the quantized areas and volumes of the regions they pass through (6). The individual strands of the web must eventually join their ends to form loops — hence the theory’s name — but have nothing to do with the much better-known strings of string theory. The latter move around in space-time, whereas strands actually are space-time: the information they carry defines the shape of the space-time fabric in their vicinity.

    Because the loops are quantum objects, however, they also define a minimum unit of area in much the same way that ordinary quantum mechanics defines a minimum ground-state energy for an electron in a hydrogen atom. This quantum of area is a patch roughly one Planck scale on a side. Try to insert an extra strand that carries less area, and it will simply disconnect from the rest of the web. It will not be able to link to anything else, and will effectively drop out of space-time.
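
    In the standard loop-quantum-gravity treatment, the allowed areas form a discrete spectrum; the usual formula (a detail the article omits, included here as a hedged reference) is:

```latex
% Area eigenvalues carried by a strand with spin label j:
A_j \;=\; 8\pi\gamma\,\ell_P^2\,\sqrt{j(j+1)}, \qquad j = \tfrac{1}{2}, 1, \tfrac{3}{2}, \dots
% \gamma is the Barbero-Immirzi parameter and \ell_P the Planck length;
% the j = 1/2 eigenvalue is the minimal quantum of area.
```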

    One welcome consequence of a minimum area is that loop quantum gravity cannot squeeze an infinite amount of curvature onto an infinitesimal point. This means that it cannot produce the kind of singularities that cause Einstein’s equations of general relativity to break down at the instant of the Big Bang and at the centres of black holes.

    In 2006, Ashtekar and his colleagues reported (7) a series of simulations that took advantage of that fact, using the loop quantum gravity version of Einstein’s equations to run the clock backwards and visualize what happened before the Big Bang. The reversed cosmos contracted towards the Big Bang, as expected. But as it approached the fundamental size limit dictated by loop quantum gravity, a repulsive force kicked in and kept the singularity open, turning it into a tunnel to a cosmos that preceded our own.

    This year, physicists Rodolfo Gambini at the Uruguayan University of the Republic in Montevideo and Jorge Pullin at Louisiana State University in Baton Rouge reported (8) a similar simulation for a black hole. They found that an observer travelling deep into the heart of a black hole would encounter not a singularity, but a thin space-time tunnel leading to another part of space. “Getting rid of the singularity problem is a significant achievement,” says Ashtekar, who is working with other researchers to identify signatures that would have been left by a bounce, rather than a bang, on the cosmic microwave background — the radiation left over from the Universe’s massive expansion in its infant moments.

    Loop quantum gravity is not a complete unified theory, because it does not include any other forces. Furthermore, physicists have yet to show how ordinary space-time would emerge from such a web of information. But Daniele Oriti, a physicist at the Max Planck Institute for Gravitational Physics in Golm, Germany, is hoping to find inspiration in the work of condensed-matter physicists, who have produced exotic phases of matter that undergo transitions described by quantum field theory. Oriti and his colleagues are searching for formulae to describe how the Universe might similarly change phase, transitioning from a set of discrete loops to a smooth and continuous space-time. “It is early days and our job is hard because we are fishes swimming in the fluid at the same time as trying to understand it,” says Oriti.

    Causal sets

    Such frustrations have led some investigators to pursue a minimalist programme known as causal set theory. Pioneered by Rafael Sorkin, a physicist at the Perimeter Institute in Waterloo, Canada, the theory postulates that the building blocks of space-time are simple mathematical points that are connected by links, with each link pointing from past to future. Such a link is a bare-bones representation of causality, meaning that an earlier point can affect a later one, but not vice versa. The resulting network is like a growing tree that gradually builds up into space-time. “You can think of space emerging from points in a similar way to temperature emerging from atoms,” says Sorkin. “It doesn’t make sense to ask, ‘What’s the temperature of a single atom?’ You need a collection for the concept to have meaning.”
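
    A causal set is simple enough to sketch in a few lines of code. The toy below (my illustration, not Sorkin's construction) sprinkles random events into a patch of 1+1-dimensional flat space-time and records a relation wherever one event sits inside the future light cone of another:

```python
import random

# Toy causal set: random events in 1+1-dimensional flat space-time,
# related whenever one lies in the other's future light cone (c = 1).
random.seed(0)
events = [(random.uniform(0.0, 1.0), random.uniform(-0.5, 0.5))
          for _ in range(30)]  # (t, x) pairs

def precedes(a, b):
    """True if event a can causally influence event b."""
    dt, dx = b[0] - a[0], b[1] - a[1]
    return dt > 0 and dt * dt - dx * dx > 0  # timelike separation

relations = [(i, j) for i, a in enumerate(events)
             for j, b in enumerate(events) if precedes(a, b)]
print(f"{len(events)} events, {len(relations)} causal relations")
```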

    In the late 1980s, Sorkin used this framework to estimate (9) the number of points that the observable Universe should contain, and reasoned that they should give rise to a small intrinsic energy that causes the Universe to accelerate its expansion. A few years later, the discovery of dark energy confirmed his guess. “People often think that quantum gravity cannot make testable predictions, but here’s a case where it did,” says Joe Henson, a quantum-gravity researcher at Imperial College London. “If the value of dark energy had been larger, or zero, causal set theory would have been ruled out.”

    Causal dynamical triangulations

    That hardly constituted proof, however, and causal set theory has offered few other predictions that could be tested. Some physicists have found it much more fruitful to use computer simulations. The idea, which dates back to the early 1990s, is to approximate the unknown fundamental constituents with tiny chunks of ordinary space-time caught up in a roiling sea of quantum fluctuations, and to follow how these chunks spontaneously glue themselves together into larger structures.

    The earliest efforts were disappointing, says Renate Loll, a physicist now at Radboud University in Nijmegen, the Netherlands. The space-time building blocks were simple hyper-pyramids — four-dimensional counterparts to three-dimensional tetrahedrons — and the simulation’s gluing rules allowed them to combine freely. The result was a series of bizarre ‘universes’ that had far too many dimensions (or too few), and that folded back on themselves or broke into pieces. “It was a free-for-all that gave back nothing that resembles what we see around us,” says Loll.

    But, like Sorkin, Loll and her colleagues found that adding causality changed everything. After all, says Loll, the dimension of time is not quite like the three dimensions of space. “We cannot travel back and forth in time,” she says. So the team changed its simulations to ensure that effects could not come before their cause — and found that the space-time chunks started consistently assembling themselves into smooth four-dimensional universes with properties similar to our own (10).

    Intriguingly, the simulations also hint that soon after the Big Bang, the Universe went through an infant phase with only two dimensions — one of space and one of time. This prediction has also been made independently by others attempting to derive equations of quantum gravity, and even some who suggest that the appearance of dark energy is a sign that our Universe is now growing a fourth spatial dimension. Others have shown that a two-dimensional phase in the early Universe would create patterns similar to those already seen in the cosmic microwave background.

    Holography

    Meanwhile, Van Raamsdonk has proposed a very different idea about the emergence of space-time, based on the holographic principle. Inspired by the hologram-like way that black holes store all their entropy at the surface, this principle was first given an explicit mathematical form by Juan Maldacena, a string theorist at the Institute for Advanced Study in Princeton, New Jersey, who published (11) his influential model of a holographic universe in 1998. In that model, the three-dimensional interior of the universe contains strings and black holes governed only by gravity, whereas its two-dimensional boundary contains elementary particles and fields that obey ordinary quantum laws without gravity.

    Hypothetical residents of the three-dimensional space would never see this boundary, because it would be infinitely far away. But that does not affect the mathematics: anything happening in the three-dimensional universe can be described equally well by equations in the two-dimensional boundary, and vice versa.

    In 2010, Van Raamsdonk studied what that means when quantum particles on the boundary are ‘entangled’ — meaning that measurements made on one inevitably affect the other (12). He discovered that if every particle entanglement between two separate regions of the boundary is steadily reduced to zero, so that the quantum links between the two disappear, the three-dimensional space responds by gradually dividing itself like a splitting cell, until the last, thin connection between the two halves snaps. Repeating that process will subdivide the three-dimensional space again and again, while the two-dimensional boundary stays connected. So, in effect, Van Raamsdonk concluded, the three-dimensional universe is being held together by quantum entanglement on the boundary — which means that in some sense, quantum entanglement and space-time are the same thing.
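
    Van Raamsdonk's dial-down-the-entanglement thought experiment has a one-qubit-pair caricature (my toy example, not his calculation): take a state cos(θ)|00⟩ + sin(θ)|11⟩ and watch the entanglement entropy of either half fall to zero as θ shrinks, the quantum "glue" letting go.

```python
import numpy as np

def entanglement_entropy(theta):
    """Entropy (bits) of one qubit of cos(theta)|00> + sin(theta)|11>."""
    # Reduced density matrix of either qubit is diag(cos^2, sin^2).
    p = np.array([np.cos(theta) ** 2, np.sin(theta) ** 2])
    p = p[p > 1e-12]  # drop zero eigenvalues before taking the log
    return float(-np.sum(p * np.log2(p)))

for theta in [np.pi / 4, np.pi / 8, np.pi / 16, 0.0]:
    print(f"theta = {theta:.3f}  "
          f"entropy = {entanglement_entropy(theta):.3f} bits")
# Maximal (1 bit) at theta = pi/4; zero once the entanglement is severed.
```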

    Or, as Maldacena puts it: “This suggests that quantum is the most fundamental, and space-time emerges from it.”

    [For references, please see the full article.]

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Nature is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

     