Tagged: Physics

  • richardmitnick 11:31 am on July 26, 2016 Permalink | Reply
    Tags: From dark gravity to phantom energy: what’s driving the expansion of the universe?, Physics

    From COSMOS: “From dark gravity to phantom energy: what’s driving the expansion of the universe?”

    A diagram representing the evolution of the universe, starting with the Big Bang to present day. The red arrow marks the flow of time. New research suggests it’s expanding even faster than shown here. NASA/GSFC

    There is something strange happening in the local universe, with galaxies moving away from each other faster than expected.

    What is driving this extra expansion, and what does it mean for the cosmos? To explore this, let’s start with the observations.

    The rate of cosmic expansion is encapsulated in the “Hubble constant”, although don’t let the name fool you, as it’s not a constant and changes as the universe expands.

    To determine this constant, astronomers must relate the distances to galaxies to the velocity they’re travelling away from us. But measuring astronomical distances has always proven difficult. This is because we lack convenient signposts, known as standard candles and rulers, to chart the heavens.
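
    In its simplest form, the relation being calibrated is Hubble’s law, v = H0 × d. A minimal sketch of the fit in Python, using made-up numbers (the data below are illustrative, not from the paper), looks like this:

        import numpy as np

        # Illustrative (made-up) galaxy data: distances in Mpc, recession velocities in km/s.
        distances = np.array([30.0, 55.0, 80.0, 120.0, 160.0, 210.0])
        velocities = np.array([2200.0, 3900.0, 5800.0, 8700.0, 11500.0, 15200.0])

        # Hubble's law v = H0 * d is a line through the origin, so the least-squares
        # estimate of H0 is sum(v*d) / sum(d*d).
        H0 = np.sum(velocities * distances) / np.sum(distances**2)
        print(f"Estimated H0 = {H0:.1f} km/s/Mpc")

    The velocities come almost for free from redshifts; the hard part is the distance column, which is what the ladder of overlapping methods described next provides.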

    So astronomers have built up cosmic distances through a series of steps, using overlapping methods to span the heavens. But each step in this cosmological distance ladder has its own quirks and uncertainties, and extraordinary effort over many decades has been expended to calibrate the various methods.

    A new paper has pushed this calibration even harder, using a number of methods to tie down the Hubble constant to an accuracy of 2.4% within a few hundred million light years (which is local by cosmic standards).

    We can also determine the universal expansion from observations of the cosmic microwave background, which is the radiation leftover from the Big Bang.

    Cosmic Microwave Background per ESA/Planck

    Unlike local observations, this reveals the global expansion of the universe. And this is where the problems begin, as this global expansion is 9% slower than that seen in the local universe. In both measurements, the astronomers have worked hard to reduce the uncertainties, and so are confident this difference is valid.
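
    To put rough numbers on that tension (these round values are my illustration of the size of the effect, not figures quoted in the article):

    \[
    \frac{H_0^{\mathrm{local}} - H_0^{\mathrm{global}}}{H_0^{\mathrm{global}}} \approx \frac{73 - 67}{67} \approx 0.09 ,
    \]

    with both values in km/s/Mpc; that is, the local expansion rate comes out roughly 9% higher than the global one.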

    So what can explain this tension in cosmic measurement? Here are a few of the contenders.

    Cosmic contenders
    Dark matter

    The first potential culprit is dark matter, the dominant mass in the universe. We know it is not smoothly spread through space, so perhaps the lumps and bumps, like the galaxies and clusters of galaxies, are exerting less gravitational pull in the local universe.

    Perhaps we are in a cosmic void, a region whose density is below the universal average.

    If this were the case, we would have to be inhabiting a strange corner of the universe, sitting at the centre of an immense emptiness quite unlike anything expected in our cosmological ideas.

    Dark energy

    And then there is dark energy, the dominant energy in the universe. This component is responsible for accelerating the cosmic expansion, but is assumed to have a very simple form, eternal and unchanging over all of history.

    But what if dark energy is dynamic and evolving, changing its properties as the universe expands? If it changed quite recently (in cosmic terms), the additional expansion could be imprinted on the local universe but would not yet have affected the global expansion.

    If this is the case, the universe has something to worry about, as this new form of dark energy would be a “phantom”, driving universal expansion faster and faster into a “big rip”, which is more dramatic than it sounds.

    Dark radiation

    Another potential solution is “dark radiation”, which consists of hyper-fast particles that zipped around in the early universe.

    While there is no single definition of what constitutes dark radiation, a favoured candidate is a new member of the neutrino family, affectionately known as sterile neutrinos.

    Dark radiation remains purely theoretical, with little observational evidence for its existence. But if it had been present in the early universe, it would have influenced the early expansion of the universe, which would still be imprinted on the global value of the Hubble constant, but would now be washed out of the local value.

    Dark gravity

    The potential solutions so far have considered modifying the properties of components in the universe, but there is the more drastic alternative: dark gravity.

    This suggests that we don’t fully understand the fundamental nature of the universe, and that gravity does not follow the rules laid out by Albert Einstein in his general theory of relativity.

    Such theories of modified gravity have existed for a long time and come in many forms, and it is not yet clear how to deduce the impact of such gravity on the universal expansion.

    Dark speculations

    So there are several alternatives that could potentially explain the discrepancy between the local and global measurements of the Hubble constant. Which one is correct?

    At the moment, the observations are rather raw and do not discriminate between the possibilities. And so we enter the realm of theoretical speculation, where ideas are tried and discarded until viable explanations are discovered.

    At the same time, astronomers will seek more data, and will continue to tie down calibrations and methods. This brings us to our final possibility.

    No observations are perfect, and much of science is about understanding the uncertainties of measurements. Scientists can generally wrangle random errors and understand how uncertainties in measurement impact uncertainties in results.

    But there is another uncertainty: the systematic error, which can strike fear into a researcher. Instead of scattering results, systematic errors shift all results one way or another.

    Systematic errors can also influence astronomical distance measures. And if they propagate through the distance ladder, they could potentially shift the local measurement of the Hubble constant away from the global value.

    With new data and methods, this tension may evaporate. Some astronomers are already suggesting that this is a “more reasonable explanation”.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 10:14 am on July 26, 2016 Permalink | Reply
    Tags: Physics

    From Physics Today: “High-energy lab has high-energy director” 


    21 July 2016
    Toni Feder

    CERN director general Fabiola Gianotti looks at what lies ahead for particle physics.

    Fabiola Gianotti in December 2015, just before she became CERN’s director general. Credit: CERN

    Fabiola Gianotti shot to prominence on 4 July 2012, with the announcement of the discovery of the Higgs boson. At the time, she was the spokesperson of ATLAS, which along with the Compact Muon Solenoid (CMS) experiment spotted the Higgs at the Large Hadron Collider (LHC) at CERN.

    CERN ATLAS Higgs Event; CERN/ATLAS detector

    CERN CMS Higgs Event; CERN/CMS Detector

    In the excitement over the Higgs discovery, Gianotti was on the cover of Time. She was hailed as among the most influential and the most inspirational women of our time. She was listed among the “leading global thinkers of 2013” by Foreign Policy magazine.

    “I am not very comfortable in the limelight,” says Gianotti. “Particle physics is a truly collaborative field. The discovery of the Higgs boson is the result of the work of thousands of physicists over more than 20 years.”

    Gianotti first went to CERN in 1987 as a graduate student at the University of Milan. She has been there ever since. And she seems comfortable at the helm, a job she has held since the beginning of this year.

    “The main challenge is to cope with so many different aspects, and switching my brain instantly from one problem to another one,” she says. “There are many challenges—human challenges, scientific challenges, technological challenges, budget challenges. But the challenges are interesting and engaging.”

    As of this summer, the LHC is in the middle of its second run, known as Run 2. In June the collider reached a record luminosity of 10^34 cm^−2 s^−1. It produces proton–proton collisions at an energy of 13 TeV. A further push to the design energy of 14 TeV may be made later in Run 2 or in Run 3, which is planned for 2021–23. An upgrade following the third run will increase the LHC’s luminosity by an order of magnitude.

    Physics Today’s Toni Feder caught up with Gianotti in June, about six months into her five-year appointment in CERN’s top job.

    PT: Last fall the ATLAS and CMS experiments both reported hints of a signal at 750 GeV. What would the implications be of finding a particle at that energy?

    GIANOTTI: At the moment, we don’t know if what the experiments observed last year is the first hint of a signal or just a fluctuation. But if the bump turns into a signal, then the implications are extraordinary. Its presumed features would not be something we can classify within the best-known scenarios for physics beyond the standard model. So it would be something unexpected, and for researchers there is nothing more exciting than a surprise.

    The experiments are analyzing the data from this year’s run and will release results in the coming weeks. We can expect them on the time scale of ICHEP in Chicago at the beginning of August. [ICHEP is the International Conference on High Energy Physics.]

    PT: The LHC is up to nearly the originally planned collision energy. The next step is to increase the luminosity. How will that be done?

    GIANOTTI: To increase the luminosity, we will have to replace components of the accelerator—for example, the magnets sitting on each side of the ATLAS and CMS collision regions. These are quadrupoles that squeeze the beams and therefore increase the interaction probability. We will replace them with higher-field, larger-aperture magnets. There are also other things we have to do to upgrade the accelerator. The present schedule for the installation of the hardware components is at the end of Run 3—that is, during the 2024–26 shutdown. The operation of the high-luminosity LHC will start after this installation, so on the time scale of 2027.

    The high-luminosity LHC will allow the experiments to collect 10 times as much data. Improving the precision will be extremely important, in particular for the interaction strength—so-called couplings—of the Higgs boson with other particles. New physics can alter these couplings from the standard-model expectation. Hence the Higgs boson is a door to new physics.
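
    The arithmetic behind “ten times as much data” is the standard relation between event counts and integrated luminosity (my gloss, not part of the interview): for a process with cross-section σ,

    \[
    N = \sigma \int \mathcal{L}\, dt , \qquad \frac{\delta N}{N} \sim \frac{1}{\sqrt{N}} ,
    \]

    so ten times the integrated luminosity means ten times the events, and statistical uncertainties on quantities such as the Higgs couplings shrink by about √10 ≈ 3, systematics permitting.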

    The high-luminosity LHC will also increase the discovery potential for new physics: Experiments will be able to detect particles with masses 20% to 30% larger than before the upgrade.

    And third, if new physics is discovered at the LHC in Run 2 or Run 3, the high-luminosity LHC will allow the first precise measurements of the new physics to be performed with a very well-known accelerator and very well-known experiments. So it would provide powerful constraints on the underlying theory.

    PT: What are some of the activities at CERN aside from the LHC?

    GIANOTTI: I have spent my scientific career working on high-energy colliders, which are very close to my heart. However, the open questions today in particle physics are difficult and crucial, and there is no single way to attack them. We can’t say today that a high-energy collider is the way to go and let’s forget about other approaches. Or underground experiments are the way to go. Or neutrino experiments are the way to go. There is no exclusive way. I think we have to be very inclusive, and we have to address the outstanding questions with all the approaches that our discipline has developed over the decades.

    In this vein, at CERN we have a scientific diversity program. It includes the study of antimatter through a dedicated facility, the Antiproton Decelerator; precise measurements of rare decays; and many other projects. We also participate in accelerator-based neutrino programs, mainly in the US. And we are doing R&D and design studies for the future high-energy colliders: an electron–positron collider in the multi-TeV region [the Compact Linear Collider] and future circular colliders.

    PT: Japan is the most likely host for a future International Linear Collider, an electron–positron collider (see Physics Today, March 2013, page 23). What’s your sense about whether the ILC will go ahead and whether it’s the best next step for high-energy physics?

    GIANOTTI: Japan is consulting with international partners to see if a global collaboration can be built. It’s a difficult decision to be taken, and it has to be taken by the worldwide community.

    Europe will produce a new road map, the European Strategy for Particle Physics, on the time scale of 2019–20. That will be a good opportunity to think about the future of the discipline, based also on the results from the LHC Run 2 and other facilities in the world.

    PT: How is CERN affected by tight financial situations in member countries?

    GIANOTTI: CERN has been running for many years with a constant budget, with constant revenues from member states, at a level of CHF 1.2 billion [$1.2 billion] per year. We strive to squeeze the operation of the most powerful accelerator in the world, its upgrade, and other interesting projects within this budget.

    PT: Will Brexit affect CERN?

    GIANOTTI: We are not directly affected because CERN membership is not related to being members of the European Union.

    PT: You have said you have four areas that you want to maintain and expand at CERN: science, technology and innovation, education, and peaceful collaboration. Please elaborate.

    GIANOTTI: Science first. We do research in fundamental physics, with the aim of understanding the elementary particles and their interactions, which also gives us very important indications about the structure and evolution of the universe.

    In order to accomplish these scientific goals, we have to develop cutting-edge technologies in many domains, from superconducting magnets to vacuum technology, cryogenics, electronics, computing, et cetera.

    These technologies are transferred to society and find applications in many other sectors—for example, in the medical fields with imaging and cancer therapy, but also solar panels, not to mention the World Wide Web. Fundamental research requires very sophisticated instruments and is a driver of innovation.

    Another component of our mission is education and training. The CERN population is very young: The age distribution of the 12 000 citizens from all over the world working on our experiments peaks at 27 years, and almost 50% are below 35. About half of our PhD students remain in academia or research, and about half go to industry. It is our duty to prepare them to be tomorrow’s scientists or tomorrow’s employees of industry—and in any case, good citizens.

    How do we prepare them to be good citizens? CERN was created in the early 1950s to promote fundamental research and to foster peaceful collaboration among European countries after the war. Today we have scientists of more than 110 nationalities, some from countries that are in conflict with each other, some from countries that do not even recognize each other’s right to exist. And yet they work together in a peaceful way, animated by the same passion for knowledge.

    PT: You are the first woman to head CERN. What do you see as the significance of this?

    GIANOTTI: The CERN director general should be appointed on the basis of his or her capabilities to run the laboratory and not on the basis of gender arguments. This being said, I hope that my being a woman can be useful as an encouragement to girls and young women who would like to do fundamental research but might hesitate. It shows them they have similar opportunities as their male colleagues.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    “Our mission

    The mission of Physics Today is to be a unifying influence for the diverse areas of physics and the physics-related sciences.

    It does that in three ways:

    • by providing authoritative, engaging coverage of physical science research and its applications without regard to disciplinary boundaries;
    • by providing authoritative, engaging coverage of the often complex interactions of the physical sciences with each other and with other spheres of human endeavor; and
    • by providing a forum for the exchange of ideas within the scientific community.”

     
  • richardmitnick 9:41 am on July 26, 2016 Permalink | Reply
    Tags: Asymptotically Safe Gravity, Causal Dynamical Triangulation, Dimensional reduction, Physics

    From Ethan Siegel: “Dimensional Reduction: The Key To Physics’ Greatest Mystery?” 


    Jul 26, 2016
    Sabine Hossenfelder

    A visualization of a 3-torus model of space, where lines or sheets in series could reproduce a larger-dimensional structure. Image credit: Bryan Brandenburg, under c.c.a.-s.a.-3.0.

    What if the Universe – and fundamentally, space itself – were like a pile of laundry?

    It doesn’t sound like a sober thought, but it’s got math behind it, so physicists think there might be something to it. Indeed the math has piled up lately. They call it “dimensional reduction,” the idea that space on short distances has fewer than three dimensions – and it might help physicists to quantize gravity.

    We’ve gotten used to space with additional dimensions, rolled up so small (or compactified) that we can’t observe them. But how do you get rid of dimensions instead? To understand how it works, we first have to clarify what we mean by “dimension.”

    A 3-D object like a pipe will have a Hausdorff dimension of 1, as the lines only have one dimension to spread out as long as they’d like, which is also seen in the reduction to a line as you zoom out. Image credit: Alex Dunkel (Maky) of Wikipedia, based on Brian Greene’s The Elegant Universe, under a c.c.a.-s.a.-4.0 license.

    We normally think about dimensions of space by picturing a series of lines which spread from a point. How quickly the lines dilute with the distance from the point tells us the “Hausdorff dimension” of a space. The faster the lines diverge from each other with distance, the larger the Hausdorff dimension. If you speak through a pipe, for example, sound waves spread less and your voice carries farther. The pipe hence has a lower Hausdorff dimension than our normal 3-dimensional office cubicles. It’s the Hausdorff dimension that we colloquially refer to as just dimension.

    For dimensional reduction, however, it is not the Hausdorff dimension which is relevant, but instead the “spectral dimension,” which is a slightly different concept. We can calculate it by first getting rid of the “time” in “space-time” and making it into space (period). We then place a random walker at one point and measure the probability that it returns to the same point during its walk. The smaller the average return probability, the higher the probability the walker gets lost, and the higher the number of spectral dimensions.

    Isotropic random walk on the Euclidean lattice Z^3. This picture shows three different walks after 10 000 unit steps, all three starting from the origin. Image credit: Zweistein, under c.c.a.-s.a.-3.0.
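
    As a toy illustration of that definition (an ordinary lattice walk, not any of the quantum-gravity calculations discussed below), one can estimate a spectral dimension numerically from how the return probability falls off with the number of steps:

        import numpy as np

        rng = np.random.default_rng(0)

        def return_probability(dim, steps, walkers=200_000):
            """Fraction of simple random walkers on Z^dim back at the origin after `steps` steps."""
            pos = np.zeros((walkers, dim), dtype=np.int64)
            for _ in range(steps):
                axis = rng.integers(0, dim, size=walkers)           # which coordinate to move
                sign = rng.choice(np.array([-1, 1]), size=walkers)  # step direction
                pos[np.arange(walkers), axis] += sign
            return np.all(pos == 0, axis=1).mean()

        # On an ordinary d-dimensional lattice P(t) ~ t^(-d/2) for even t,
        # so the log-log slope between two times estimates -d_s/2.
        for dim in (2, 3):
            t1, t2 = 10, 100
            p1, p2 = return_probability(dim, t1), return_probability(dim, t2)
            d_s = -2.0 * np.log(p2 / p1) / np.log(t2 / t1)
            print(f"d = {dim}: estimated spectral dimension ~ {d_s:.1f}")

    The quantum-gravity results described below amount to the analogous exponent drifting from about 4 down to about 2 as the walk gets shorter.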

    Normally, for a non-quantum space, both notions of dimension are identical. However, add quantum mechanics and the spectral dimension at short distances goes down from four to two. The return probability for short walks becomes larger than expected, and the walker is less likely to get lost – this is what physicists mean by “dimensional reduction.”

    The spectral dimension is not necessarily an integer; it can take on any value. This value starts at 4 when quantum effects can be neglected, and decreases when the walker’s sensitivity to quantum effects at shortest distances increases. Physicists therefore also like to say that the spectral dimension “runs,” meaning its value depends on the resolution at which space-time is probed.

    Dimensional reduction is an attractive idea because quantizing gravity is considerably easier in lower dimensions, where the infinities that plague traditional attempts to quantize gravity go away. A theory with a reduced number of dimensions at the shortest distances therefore has a much higher chance to remain consistent, and therefore to provide a meaningful theory for the quantum nature of space and time. Not so surprisingly, among physicists, dimensional reduction has received quite some attention lately.

    Cross section of the quintic Calabi–Yau manifold. Unlike taking a cross section, dimensional reduction is about having reduced degrees of freedom when it comes to the probability of returning to your starting point in a finite number of steps. Public domain.

    This strange property of quantum-spaces was first found in Causal Dynamical Triangulation, an approach to quantum gravity that relies on approximating curved spaces by triangular patches. In this work, the researchers did a numerical simulation of a random walk in such a triangulated quantum-space, and found that the spectral dimension goes down from four to two. Or actually, to 1.80 ± 0.25, if you want to know precisely.

    Instead of doing numerical simulations, it is also possible to study the spectral dimension mathematically, which has since been done in various other approaches. For this, physicists exploit that the behavior of the random walk is governed by a differential equation – the diffusion equation (a.k.a., the heat equation) – which depends on the curvature of space. In quantum gravity, spatial curvature has quantum fluctuations, so instead it’s the average curvature value which enters the diffusion equation. From the diffusion equation, one then calculates the return probability for the random walk.
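
    Written out (my summary of the standard method, not a quotation from any of the papers), the diffusion, or heat, equation and the quantities built from it are

    \[
    \partial_\sigma K(x, x'; \sigma) = \Delta_x K(x, x'; \sigma), \qquad
    P(\sigma) = \frac{1}{V} \int d^d x \, \sqrt{g} \, K(x, x; \sigma), \qquad
    d_s \equiv -2 \, \frac{d \ln P(\sigma)}{d \ln \sigma} ,
    \]

    where σ is the diffusion time and K the heat kernel. In flat d-dimensional space P(σ) = (4πσ)^(−d/2), so d_s = d; the results quoted here correspond to d_s running from 4 at long diffusion times to about 2 at short ones.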

    Through this method, physicists have inferred the spectral dimension also in Asymptotically Safe Gravity, an approach to quantum gravity which relies on the resolution-dependence (the “running”) of quantum field theories. And they found the same drop as in Causal Dynamical Triangulations: from four to two spectral dimensions.

    A representation of a spin network in Loop quantum gravity. Image credit: Markus Poessel (Mapos) of Wikimedia Commons, under c.c.a.-s.a.-3.0.

    Another indication that dimensional reduction might be important comes from Loop Quantum Gravity, where the scaling of the area operator with length changes at short distances. In this case, it is somewhat questionable whether the notion of curvature makes sense at all on short distances. Ignoring this philosophical conundrum, one can construct the diffusion equation anyway, and one finds that the spectral dimension – surprise – drops from four to two.

    And finally, there is Horava-Lifshitz gravity, yet another modification of gravity which some believe helps with quantizing it. Here too, dimensional reduction, from four to two, has been found.

    It is difficult to visualize what is happening with the dimensionality of space if it goes down continuously, rather than in discrete steps as in the example with the laundry pile. Perhaps a good way to picture it, as Calcagni, Eichhorn and Saueressig suggest, is to think of the quantum fluctuations of space-time as hindering a particle’s random walk, thereby slowing it down. It wouldn’t have to be that way, though. Quantum fluctuations could have also kicked the particle around wildly, thereby increasing the spectral dimension rather than decreasing it. But that’s not what the math tells us.

    Real gravitational effects occur in spacetime, not just space, and must propagate at the speed of light through space and time. Image credit: SLAC National Accelerator Laboratory.

    One shouldn’t take this picture too seriously though, because we’re talking about a random walk in space, not space-time, and so it’s not a real physical process. Turning time into space might seem strange, but it is a common mathematical simplification which is often used for calculations in quantum theory. Still, it makes it difficult to interpret what is happening physically.

    I find it intriguing that several different approaches to quantum gravity share a behavior like this. Maybe it is a general property of quantum space-time? But then, there are many different types of random walks, and while these different approaches to quantum gravity share a similar scaling behavior for the spectral dimension, they differ in the type of random walk that produces this scaling. So maybe the similarities are only superficial.

    And, of course, this idea has no observational evidence speaking for it. Maybe never will. But one day, I’m sure, all the math will click into place and everything will make perfect sense. Meanwhile, have another.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 3:36 pm on July 25, 2016 Permalink | Reply
    Tags: Physics, T2HK, VA Tech

    From phys.org: “CP violation or new physics?” 

    phys.org

    July 25, 2016
    Lisa Zyga

    This is the “South Pillar” region of the star-forming region called the Carina Nebula. Like cracking open a watermelon and finding its seeds, the infrared telescope “busted open” this murky cloud to reveal star embryos tucked inside finger-like pillars of thick dust. Credit: NASA/Spitzer

    Over the past few years, multiple neutrino experiments have detected hints for leptonic charge parity (CP) violation—a finding that could help explain why the universe is made of matter and not antimatter. So far, matter-antimatter asymmetry cannot be explained by any physics theory and is one of the biggest unsolved problems in cosmology.

    But now in a new study published in Physical Review Letters, physicists David V. Forero and Patrick Huber at Virginia Tech have proposed that the same hints could instead indicate CP-conserving “new physics,” and current experiments would have no way to tell the difference.

    Both possibilities—CP violation or new physics—would have a major impact on the scientific understanding of some of the biggest questions in cosmology. Currently, one of the most pressing problems is the search for new physics, or physics beyond the Standard Model, which is a theory that scientists know is incomplete but aren’t sure exactly how to improve. New physics could potentially explain several phenomena that the Standard Model cannot, including the matter-antimatter asymmetry problem, as well as dark matter, dark energy, and gravity.

    As the scientists show in the new study, determining whether the recent hints indicate CP violation or new physics will be very challenging. The main goal of the study was to “quantify the level of confusion” between the two possibilities. The physicists’ simulations and analysis revealed that both CP violation and new physics have distributions centered at the exact same value for what the neutrino experiments measure—something called the Dirac CP phase. This identical preference makes it impossible for current neutrino experiments to distinguish between the two cases.

    “Our results show that establishing leptonic CP violation will need exceptional care, and that new physics can in many ways lead to non-trivial confusion,” Huber told Phys.org.

    The good news is that new and future experiments may be capable of resolving the issue. One possible way to test the two proposals is to compare the measurements of the Dirac CP phase made by two slightly different experiments: DUNE (the Deep Underground Neutrino Experiment) at Fermilab in Batavia, Illinois; and T2HK (the Tokai to Hyper-Kamiokande project) at J-PARC in Tokai, Japan.

    FNAL LBNF/DUNE, from FNAL to SURF

    Proposed T2HK

    “The trick is that the type of new physics we postulate in our paper manifests itself in the way in which neutrino oscillations are affected by the amount of earth matter through which the neutrino travels,” Huber said. “The more matter travelled through, the larger the effect of this type of new physics.”

    “Now, for DUNE, neutrinos would have to travel roughly 1300 km in the earth, whereas for T2HK they would travel only about 300 km. Thus one would find two different values for the Dirac CP phase in both cases, indicating a problem.”
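
    A back-of-the-envelope way to see why the baseline matters (my sketch; the detailed analysis is in the paper): the standard matter potential felt by neutrinos travelling through the Earth is

    \[
    V = \sqrt{2}\, G_F n_e ,
    \]

    and it enters the oscillation amplitude through a phase of order V·L over a baseline L. For comparable rock densities, DUNE’s roughly 1300 km path therefore accumulates about four times the matter effect of T2HK’s roughly 300 km path, and a matter-like new interaction of the kind postulated in the paper would grow with baseline in the same way.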

    In order to be accurate, these experiments will require extremely high degrees of precision, which Huber emphasizes should not be overlooked.

    “Of course, the same result could arise if for some reason either experiment was not properly calibrated and thus precisely calibrating these experiments will be extraordinarily important—a very difficult task, which I believe is not quite getting the attention it should.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quantcast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

     
  • richardmitnick 10:32 am on July 25, 2016 Permalink | Reply
    Tags: Physics, Possible fifth force?

    From Don Lincoln of FNAL on livescience: “A Fifth Force: Fact or Fiction” 

    Livescience


    Don Lincoln, FNAL

    July 5, 2016

    Has a Hungarian lab really found evidence of a fifth force of nature? Credit: Jurik Peter / Shutterstock.com

    Science and the internet have an uneasy relationship: Science tends to move forward through a careful and tedious evaluation of data and theory, and the process can take years to complete. In contrast, the internet community generally has the attention span of Dory, the absent-minded fish of Finding Nemo (and now Finding Dory) — a meme here, a celebrity picture there — oh, look … a funny cat video.

    Thus people who are interested in serious science should be extremely cautious when they read an online story that purports to be a paradigm-shifting scientific discovery. A recent example is one suggesting that a new force of nature might have been discovered. If true, that would mean that we have to rewrite the textbooks.

    A fifth force

    So what has been claimed?

    In an article submitted on April 7, 2015, to the arXiv repository of physics papers, a group of Hungarian researchers reported on a study in which they focused an intense beam of protons (particles found in the center of atoms) on thin lithium targets. The collisions created excited nuclei of beryllium-8, which decayed into ordinary beryllium-8 and pairs of electron-positron particles. (The positron is the antimatter equivalent of the electron.)

    The Standard Model is the collection of theories that describe the smallest experimentally observed particles of matter and the interactions between energy and matter. Credit: Karl Tate, LiveScience Infographic Artist

    They claimed that their data could not be explained by known physical phenomena in the Standard Model, the reigning model governing particle physics. But, they purported, they could explain the data if a new particle existed with a mass of approximately 17 million electron volts, which is 32.7 times heavier than an electron and just shy of 2 percent the mass of a proton. The particles that emerge at this energy range, which is relatively low by modern standards, have been well studied. And so it would be very surprising if a new particle were discovered in this energy regime.
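
    Those comparisons are easy to check against the standard electron and proton masses (the anomaly is usually quoted as 16.7 MeV, which the article rounds to 17):

    \[
    \frac{16.7\ \mathrm{MeV}}{0.511\ \mathrm{MeV}} \approx 32.7 , \qquad
    \frac{16.7\ \mathrm{MeV}}{938.3\ \mathrm{MeV}} \approx 1.8\% .
    \]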

    However, the measurement survived peer review and was published on Jan. 26, 2016, in the journal Physical Review Letters, which is one of the most prestigious physics journals in the world. In this publication, the researchers, and this research, cleared an impressive hurdle.

    Their measurement received little attention until a group of theoretical physicists from the University of California, Irvine (UCI), turned their attention to it. As theorists commonly do with a controversial physics measurement, the team compared it with the body of work that has been assembled over the last century or so, to see if the new data are consistent or inconsistent with the existing body of knowledge. In this case, they looked at about a dozen published studies.

    What they found is that though the measurement didn’t conflict with any past studies, it seemed to be something never before observed — and something that couldn’t be explained by the Standard Model.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    New theoretical framework

    To make sense of the Hungarian measurement, then, this group of UCI theorists invented a new theory.

    The theory invented by the Irvine group is really quite exotic. They start with the very reasonable premise that the possible new particle is something that is not described by existing theory. This makes sense because the possible new particle is very low mass and would have been discovered before if it were governed by known physics. If this were a new particle governed by new physics, perhaps a new force is involved. Since traditionally physicists speak of four known fundamental forces (gravity, electromagnetism and the strong and weak nuclear forces), this hypothetical new force has been dubbed “the fifth force.”

    Theories and discoveries of a fifth force have a checkered history, going back decades, with measurements and ideas arising and disappearing with new data. On the other hand, there are mysteries not explained by ordinary physics like, for example, dark matter. While dark matter has historically been modeled as a single form of a stable and massive particle that experiences gravity and none of the other known forces, there is no reason that dark matter couldn’t experience forces that ordinary matter doesn’t experience. After all, ordinary matter experiences forces that dark matter doesn’t, so the hypothesis isn’t so silly.

    There is no reason dark matter couldn’t experience forces that ordinary matter doesn’t experience. Here, in the galaxy cluster Abell 3827, dark matter was observed interacting with itself during a galaxy collision. Credit: ESO

    There are many ideas about forces that affect only dark matter and the term for this basic idea is called “complex dark matter.” One common idea is that there is a dark photon that interacts with a dark charge carried only by dark matter. This particle is a dark matter analog of the photon of ordinary matter that interacts with familiar electrical charge, with one exception: Some theories of complex dark matter imbue dark photons with mass, in stark contrast with ordinary photons.

    If dark photons exist, they can couple with ordinary matter (and ordinary photons) and decay into electron-positron pairs, which is what the Hungarian research group was investigating. Because dark photons don’t interact with ordinary electric charge, this coupling can only occur because of the vagaries of quantum mechanics. But if scientists started seeing an increase in electron-positron pairs, that might mean they were observing a dark photon.

    The Irvine group found a model that included a “protophobic” particle that was not ruled out by earlier measurements and would explain the Hungarian result. Particles that are “protophobic,” which literally means “fear of protons,” rarely or never interact with protons but can interact with neutrons (neutrophilic).

    The particle proposed by the Irvine group experiences a fifth and unknown force, which has a range of about 12 femtometers, or roughly 12 times the size of a proton. The particle is protophobic and neutrophilic. The proposed particle has a mass of 17 million electron volts and can decay into electron-positron pairs. In addition to explaining the Hungarian measurement, such a particle would help explain some discrepancies seen by other experiments. This last consequence adds some weight to the idea.
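
    The quoted range and mass are two sides of the same coin: the range of a force carried by a massive boson is its Compton wavelength (my gloss, using ħc ≈ 197 MeV·fm):

    \[
    \lambda = \frac{\hbar c}{m c^2} \approx \frac{197\ \mathrm{MeV\,fm}}{17\ \mathrm{MeV}} \approx 12\ \mathrm{fm} .
    \]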

    Paradigm-shifting force?

    So this is the status.

    What is likely to be true? Obviously, data is king. Other experiments will need to confirm or refute the measurement. Nothing else really matters. But that will take a year or so and having some idea before then might be nice. The best way to estimate the likelihood the finding is real is to look at the reputations of the various researchers involved. This is clearly a shoddy way to do science, but it will help shade your expectations.

    So let’s start with the Irvine group. Many of them (the senior ones, typically) are well-regarded and established members of the field, with substantive and solid papers in their past. The group includes a spectrum of ages, with both senior and junior members. In the interest of full disclosure, I know some of them personally and, indeed, two of them have read the theoretical portions of chapters of books I have written for the public to ensure that I didn’t say anything stupid. (By the way, they didn’t find any gaffes, but they certainly helped clarify certain points.) That certainly demonstrates my high regard for members of the Irvine group, but possibly taints my opinion. In my judgment, they almost certainly did a thorough and professional job of comparing their new model to existing data. They have found a small and unexplored region of possible theories that could exist.

    On the other hand, the theory is pretty speculative and highly improbable. This isn’t an indictment … all proposed theories could be labeled in this way. After all, the Standard Model, which governs particle physics, is nearly a half century old and has been thoroughly explored. In addition, ALL new theoretical ideas are speculative and improbable and almost all of them are wrong. This also isn’t an indictment. There are many ways to add possible modifications to existing theories to account for new phenomena. They can’t all be right. Sometimes none of the proposed ideas are right.

    However, we can conclude from the reputation of the group’s members that they have generated a new idea and have compared it to all relevant existing data. The fact that they released their model means that it survived their tests and thus it remains a credible, if improbable, possibility.

    What about the Hungarian group? I know none of them personally, but the article was published in Physical Review Letters — a chalk mark in the win column. However, the group has also published two previous papers in which comparable anomalies were observed, including a possible particle with a mass of 12 million electron volts and a second publication claiming the discovery of a particle with a mass of about 14 million electron volts. Both of these claims were subsequently falsified by other experiments.

    Further, the Hungarian group has never satisfactorily disclosed what error was made that resulted in these erroneous claims. Another possible red flag is that the group rarely publishes data that doesn’t claim anomalies. That is improbable. In my own research career, most publications were confirmation of existing theories. Anomalies that persist are very, very, rare.

    So what’s the bottom line? Should you be excited about this new possible discovery? Well…sure…possible discoveries are always exciting. The Standard Model has stood the test of time for half a century, but there are unexplained mysteries and the scientific community is always looking for the discovery that points us in the direction of a new and improved theory. But what are the odds that this measurement and theory will lead to the scientific world accepting a new force with a range of 12 fm and with a particle that shuns protons? My sense is that this is a long shot. I am not so sanguine as to the chances of this outcome.

    Of course, this opinion is only that…an opinion, albeit an informed one. Other experiments will also be looking for dark photons because, even if the Hungarian measurement doesn’t stand up to scrutiny, there is still a real problem with dark matter. Many experiments looking for dark photons will explore the same parameter space (e.g. energy, mass and decay modes) in which the Hungarian researchers claim to have found an anomaly. We will soon (within a year) know if this anomaly is a discovery or just another bump in the data that temporarily excited the community, only to be discarded as better data is recorded. And, no matter the outcome, good and better science will be the eventual result.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 7:29 am on July 25, 2016 Permalink | Reply
    Tags: ‘Tractor beams’ build atom-by-atom assembly in mid-air, Physics

    From COSMOS: “‘Tractor beams’ build atom-by-atom assembly in mid-air”


    25 July 2016
    Cathal O’Connell

    Physicists have manipulated 50 individual atoms at once in a dramatic upscaling of a technique vital to quantum computing.

    Have you ever tried to catch a speck of dust between your fingers? That’s challenging enough, but what about catching a single atom?

    Controlling the position of individual atoms is vital for quantum computers, which use individual atoms as “qubits” – the quantum version of the “bits” of regular computers.

    Usually atom assembly is a painstaking process, and can only be done one at a time.

    In a new paper uploaded to the arXiv (prior to peer review), physicists at Harvard, Caltech and MIT have teamed up to manipulate up to 50 individual rubidium atoms using an array of 100 optical tweezers.

    The technique works a bit like the tractor beam from Star Trek. The atoms float around in a cloud within a vacuum chamber, and the tweezers pluck them out of mid-air (or, perhaps we should say, out of mid-vacuum).

    The starship Enterprise using its tractor beam in an episode of Star Trek: The Next Generation. Quantum physicists have now done something similar at an atomic scale. Image credit: CBS via Getty Images

    The system then automatically arranges the atoms into a precise formation in less than half a second.

    Optical tweezers are tightly focused beams of light able to hold microscopic particles, or even single atoms, in three dimensions. The technique works by focusing two laser beams on to the same spot.

    An atom caught in the crossbeam stops dead, like a deer in headlights, because it is attracted to the strong electric field right at the center of the beam.

    When the beam is moved, the atom is dragged with it.
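
    In the standard dipole-trap picture (my gloss, not the article’s wording), the light induces a dipole moment in the atom, giving it a potential energy

    \[
    U(\mathbf{r}) = -\tfrac{1}{2}\, \mathrm{Re}(\alpha)\, \langle E^2(\mathbf{r}) \rangle ,
    \]

    where α is the atom’s polarizability; for the usual red-detuned trapping light this pulls the atom toward the intensity maximum at the focus, so it follows the focus wherever it is steered.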

    Usually optical tweezers can only control one atom at a time. Now a team of American researchers, led by Mikhail Lukin at Harvard University and Manuel Endres at the California Institute of Technology, have found a way to upscale the process to control 50 atoms at once.

    The advance hinges on the team’s ability to split their laser source into 50 separate beams, and then control each beam individually.

    The team starts off with a cloud of rubidium atoms cooled to less than a degree above absolute zero, floating around in a vacuum chamber. When the scientists switch on the optical tweezers array, they create a line of 50 atom traps within the cloud.

    Most, but not all, of the traps usually succeed in catching an atom, and to check which ones are successful the researchers snap a picture using a special camera that can detect how a single atom fluoresces when trapped.

    Empty tweezers are simply switched off, while those that are holding atoms are manipulated to drag the atoms into a desired pattern. Then another picture confirms it.

    If the pattern does not match up with what has been programmed, the system can use any remaining empty tweezers to grab a few more atoms and bring them over.

    For 50 atoms, this whole process takes about 400 milliseconds. For smaller arrays, it takes even less time.
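
    The loop described above is simple enough to sketch in code. The snippet below is a schematic of the logic only; the rig interface (image, move, switch_off, reload) is hypothetical and stands in for the camera and the beam-steering hardware:

        N_TWEEZERS = 100
        TARGET = set(range(50))              # the desired defect-free line of 50 sites

        def assemble(rig, max_attempts=5):
            """Detect which tweezers hold an atom, then rearrange them into TARGET.

            `rig` is a hypothetical hardware interface, not real control software.
            """
            for _ in range(max_attempts):
                loaded = rig.image()                      # fluorescence snapshot: sites holding an atom
                if TARGET <= loaded:
                    return True                           # the confirming picture shows a complete pattern
                spares = sorted(loaded - TARGET)          # atoms caught outside the target region
                holes = sorted(TARGET - loaded)           # target sites still empty
                for src, dst in zip(spares, holes):
                    rig.move(src, dst)                    # drag the atom, tweezer and all, onto an empty site
                for idx in set(range(N_TWEEZERS)) - loaded - TARGET:
                    rig.switch_off(idx)                   # tweezers that caught nothing are simply turned off
                rig.reload()                              # remaining empty tweezers grab fresh atoms from the cloud
            return False

    In the real experiment the whole detect-move-confirm cycle, imaging included, fits inside the roughly 400 milliseconds quoted above.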

    So far the technique only works for making a single line of atoms stretching about a tenth of a millimetre across. But the team plans to scale up the process to make a two dimensional array of optical tweezers.

    They write that the “robust creation of defect-free arrays of hundreds of atoms is feasible”.

    Another problem is how long the pattern can be held. At the moment, the limit is about 10 seconds. A quantum computer, meanwhile, would require holding times on the order of 100 seconds.

    The researchers expect that using a better vacuum, and an improved laser system, might get them at least to the one-minute mark.

    After that, they’ll need to think of other tricks so that their pattern is not gone in 60 seconds.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 7:45 am on July 23, 2016 Permalink | Reply
    Tags: Asimina Arvanitaki, Physics

    From Quanta: Women in Science – “Mining Black Hole Collisions for New Physics” Asimina Arvanitaki 

    Quanta Magazine

    July 21, 2016
    Joshua Sokol

    Asimina Arvanitaki during a July visit to the CERN particle physics laboratory in Geneva, Switzerland. Samuel Rubio for Quanta Magazine

    When physicists announced in February that they had detected gravitational waves firsthand, the foundations of physics scarcely rattled. The signal exactly matched the expectations physicists had arrived at after a century of tinkering with Einstein’s theory of general relativity. “There is a question: Can you do fundamental physics with it? Can you do things beyond the standard model with it?” said Savas Dimopoulos, a theoretical physicist at Stanford University. “And most people think the answer to that is no.”

    Asimina Arvanitaki is not one of those people. A theoretical physicist at Ontario’s Perimeter Institute for Theoretical Physics, Arvanitaki has been dreaming up ways to use black holes to explore nature’s fundamental particles and forces since 2010, when she published a paper with Dimopoulos, her mentor from graduate school, and others. Together, they sketched out a “string axiverse,” a pantheon of as yet undiscovered, weakly interacting particles. Axions such as these have long been a favored candidate to explain dark matter and other mysteries.

    In the intervening years, Arvanitaki and her colleagues have developed the idea through successive papers. But February’s announcement marked a turning point, where it all started to seem possible to test these ideas. Studying gravitational waves from the newfound population of merging black holes would allow physicists to search for those axions, since the axions would bind to black holes in what Arvanitaki describes as a “black hole atom.”

    “When it came up, we were like, ‘Oh my god, we’re going to do it now, we’re going to look for this,’” she said. “It’s a whole different ball game if you actually have data.”

    That’s Arvanitaki’s knack: matching what she calls “well-motivated,” field-hopping theoretical ideas with the precise experiment that could probe them. “By thinking away from what people are used to thinking about, you see that there is low-hanging fruit that lie in the interfaces,” she said. At the end of April, she was named the Stavros Niarchos Foundation’s Aristarchus Chair at the Perimeter Institute, the first woman to hold a research chair there.

    It’s a long way to come for someone raised in the small Grecian village of Koklas, where the graduating class at her high school — at which both of her parents taught — consisted of nine students. Quanta Magazine spoke with Arvanitaki about her plan to use black holes as particle detectors. An edited and condensed version of those discussions follows.

    QUANTA MAGAZINE: When did you start to think that black holes might be good places to look for axions?

    ASIMINA ARVANITAKI: When we were writing the axiverse paper, Nemanja Kaloper, a physicist who is very good in general relativity, came and told us, “Hey, did you know there is this effect in general relativity called superradiance?” And we’re like, “No, this cannot be, I don’t think this happens. This cannot happen for a realistic system. You must be wrong.” And then he eventually convinced us that this could be possible, and then we spent like a year figuring out the dynamics.

    What is superradiance, and how does it work?

    An astrophysical black hole can rotate. There is a region around it called the “ergo region” where even light has to rotate. Imagine I take a piece of matter and throw it in a trajectory that goes through the ergo region. Now imagine you have some explosives in the matter, and it breaks apart into pieces. Part of it falls into the black hole and part escapes into infinity. The piece that is coming out has more total energy than the piece that went in the black hole.

    You can perform the same experiment by scattering radiation from a black hole. Take an electromagnetic wave pulse, scatter it from the black hole, and you see that the pulse you got back has a higher amplitude.

    So you can send a pulse of light near a black hole in such a way that it would take some energy and angular momentum from the black hole’s spin?

    This is old news, by the way, this is very old news. In ’72 Press and Teukolsky wrote a Nature paper that suggested the following cute thing. Let’s imagine you performed the same experiment as the light, but now imagine that you have the black hole surrounded by a giant mirror. What will happen in that case is the light will bounce on the mirror many times, the amplitude [of the light] grows exponentially, and the mirror eventually explodes due to radiation pressure. They called it the black hole bomb.

    The property that allows light to do this is that light is made of photons, and photons are bosons — particles that can sit in the same space at the same time with the same wave function. Now imagine that you have another boson that has a mass. It can [orbit] the black hole. The particle’s mass acts like a mirror, because it confines the particle in the vicinity of the black hole.

    In this way, axions might get stuck around a black hole?

    This process requires that the size of the particle is comparable to the black hole size. Turns out that [axion] mass can be anywhere from Hubble scale — with a quantum wavelength as big as the universe — or you could have a particle that’s tiny in size.

    So if they exist, axions can bind to black holes with a similar size and mass. What’s next?

    What happens is the number of particles in this bound orbit starts growing exponentially. At the same time the black hole spins down. If you solve for the wave functions of the bound orbits, what you find is that they look like hydrogen wave functions. Instead of electromagnetism binding your atom, what’s binding it is gravity. There are three quantum numbers you can describe, just the same. You can use the exact terminology that you can use in the hydrogen atom.
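
    (For reference, the standard leading-order formulas behind this “gravitational atom” picture, which are my addition and not part of the interview: the analogue of the fine-structure constant is the ratio of the black hole’s gravitational radius to the particle’s Compton wavelength, and the bound-state energies are hydrogen-like,

    \[
    \alpha \equiv \frac{G M \mu}{\hbar c} , \qquad E_n \simeq \mu c^2 \left( 1 - \frac{\alpha^2}{2 n^2} \right) ,
    \]

    with superradiant growth possible only for modes whose frequency satisfies ω < m Ω_H, where m is the azimuthal quantum number and Ω_H the angular velocity of the horizon.)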

    How could we check to see if any of the black holes LIGO finds have axion clouds orbiting around black hole nuclei?

    This is a process that extracts energy and angular momentum from the black hole. If you were to measure spin versus mass of black holes, you should see that in a certain mass range for black holes you see no quickly rotating black holes.

    This is where Advanced LIGO comes in.


    Caltech/MIT Advanced aLIGO installation, Hanford, WA, USA

    Caltech/MIT Advanced aLIGO detector installation, Livingston, LA, USA

    You saw the event they saw. [Their measurements] allowed them to measure the masses of the merging objects, the mass of the final object, the spin of the final object, and to have some information about the spins of the initial objects.

    If I were to take the spins of the black holes before they merged, they could have been affected by superradiance. Now imagine a graph of black hole spin versus mass. Advanced LIGO could maybe get, if the things that we hear are correct, a thousand events per year. Now you have a thousand data points on this plot. So you may trace out the region that is affected by this particle just by those measurements.

    That would be supercool.

    That’s of course indirect. So the other cool thing is that it turns out there are signatures that have to do with the cloud of particles themselves. And essentially what they do is turn the black hole into a gravitational wave laser.

    Awesome. OK, what does that mean?

    Yeah, what that means is important. Just like you have transitions of electrons in an excited atom, you can have transitions of particles in the gravitational wave atom. The rate of emission of gravitational waves from these transitions is enhanced by the 10^80 particles that you have. It would look like a very monochromatic line. It wouldn’t look like a transient. Imagine something now that emits a signal at a very fixed frequency.

    Where could LIGO expect to see signals like this?

    In Advanced LIGO, you actually see the birth of a black hole. You know when and where a black hole was born with a certain mass and a certain spin. So if you know the particle masses that you’re looking for, you can predict when the black hole will start growing the [axion] cloud around it. It could be that you see a merger in that day, and one or 10 years down the line, they go back to the same position and they see this laser turning on, they see this monochromatic line coming out from the cloud.

    You can also do a blind search. Because you have black holes that are roaming the universe by themselves, and they could still have some leftover cloud around them, you can do a blind search for monochromatic gravitational waves.

    Were you surprised to find out that axions and black holes could combine to produce such a dramatic effect?

    Oh my god yes. What are you talking about? We had panic attacks. You know how many panic attacks we had saying that this effect, no, this cannot be true, this is too good to be true? So yes, it was a surprise.

    The experiments you suggest draw from a lot of different theoretical ideas — like how we could look for high-frequency gravitational waves with tabletop sensors, or test whether dark matter oscillates using atomic clocks. When you’re thinking about making risky bets on physics beyond the standard model, what sorts of theories seem worth the effort?

    What is well motivated? Not things of the form: “What if you had this?” People imagine: “What if dark matter was this thing? What if dark matter was the other thing?” By contrast, supersymmetry makes predictions about what types of dark matter should be there. String theory makes predictions about what types of particles you should have. There is always an underlying reason why these particles are there; it’s not just the endless theoretical possibilities that we have.

    And axions fit that definition?

    This is a particle that was proposed 30 years ago to explain the smallness of the observed electric dipole moment of the neutron. There are several experiments around the world looking for it already, at different wavelengths. So this particle, we’ve been looking for it for 30 years. This can be the dark matter. That particle solves an outstanding problem of the standard model, so that makes it a good particle to look for.

    Now, whether or not the particle is there I cannot answer for nature. Nature will have to answer.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 7:45 am on July 22, 2016 Permalink | Reply
    Tags: , Physics, , Unconventional quasiparticles predicted in conventional crystals   

    From Princeton: “Unconventional quasiparticles predicted in conventional crystals (Science)” 

    Princeton University
    Princeton University

    July 22, 2016
    No writer credit found

    1
    Two electronic states known as Fermi arcs, localized on the surface of a material, emerge from the projection of a 3-fold degenerate new fermion in the bulk. This new fermion is a cousin of the Weyl fermion discovered last year in another class of topological semimetals. The new fermion has spin-1, a reflection of the 3-fold degeneracy, unlike the spin-½ of the recently discovered Weyl fermions. No image credit.

    An international team of researchers has predicted the existence of several previously unknown types of quantum particles in materials. The particles — which belong to the class of particles known as fermions — can be distinguished by several intrinsic properties, such as their responses to applied magnetic and electric fields. In several cases, fermions in the interior of the material show their presence on the surface via the appearance of electron states called Fermi arcs, which link the different types of fermion states in the material’s bulk.

    The research, published online this week in the journal Science, was conducted by a team at Princeton University in collaboration with researchers at the Donostia International Physics Center (DIPC) in Spain and the Max Planck Institute for Chemical Physics of Solids in Germany. The investigators propose that many of the materials hosting the new types of fermions are “protected metals,” which are metals that do not allow, in most circumstances, an insulating state to develop. This research represents the newest avenue in the physics of “topological materials,” an area of science that has already fundamentally changed the way researchers see and interpret states of matter.

    The team at Princeton included Barry Bradlyn and Jennifer Cano, both associate research scholars at the Princeton Center for Theoretical Science; Zhijun Wang, a postdoctoral research associate in the Department of Physics; Robert Cava, the Russell Wellman Moore Professor of Chemistry; and B. Andrei Bernevig, associate professor of physics. The research team also included Maia Vergniory, a postdoctoral research fellow at DIPC, and Claudia Felser, a professor of physics and chemistry and director of the Max Planck Institute for Chemical Physics of Solids.

    For the past century, gapless fermions, which are quantum particles with no energy gap between their highest filled and lowest unfilled states, were thought to come in three varieties: Dirac, Majorana and Weyl. Condensed matter physics, which pioneers the study of quantum phases of matter, has become fertile ground for the discovery of these fermions in different materials through experiments conducted in crystals. These experiments enable researchers to explore exotic particles using relatively inexpensive laboratory equipment rather than large particle accelerators.

    In the past four years, all three varieties of gapless fermions have been theoretically predicted and experimentally observed in different types of crystalline materials grown in laboratories around the world. The Weyl fermion was thought to be the last of the group of predicted quasiparticles in nature. Research published earlier this year in the journal Nature (Wang et al., doi:10.1038/nature17410) has shown, however, that this is not the case, with the discovery of a bulk insulator which hosts an exotic surface fermion.

    In the current paper, the team predicted and classified the possible exotic fermions that can appear in the bulk of materials. The energy of these fermions can be organized, as a function of their momentum, into so-called energy bands, or branches. Unlike the Weyl and Dirac fermions, which, roughly speaking, exhibit an energy spectrum with 2- and 4-fold branches of allowed energy states, the new fermions can exhibit 3-, 6- and 8-fold branches. The 3-, 6-, or 8-fold branches meet up at points – called degeneracy points – in the Brillouin zone, which is the parameter space where the fermion momentum takes its values.
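
    To make the 3-fold case concrete, here is a minimal numerical sketch (an illustrative toy, not code or a model taken from the paper): near a 3-fold degeneracy point the effective Hamiltonian can be modelled as H(k) = v k·S with spin-1 matrices S, whose three bands meet at k = 0 and split into E = −v|k|, 0, +v|k| away from it.

```python
import numpy as np

# Spin-1 matrices (hbar = 1)
Sx = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]]) / np.sqrt(2)
Sy = np.array([[0, -1j, 0], [1j, 0, -1j], [0, 1j, 0]]) / np.sqrt(2)
Sz = np.diag([1.0, 0.0, -1.0])

def bands(kx, ky, kz, v=1.0):
    """Band energies of the toy spin-1 Hamiltonian H(k) = v (k . S)."""
    H = v * (kx * Sx + ky * Sy + kz * Sz)
    return np.linalg.eigvalsh(H)

print(bands(0.0, 0.0, 0.0))   # three bands degenerate at the crossing point
print(bands(0.1, 0.0, 0.0))   # away from it: roughly -0.1, 0.0, +0.1
```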

    “Symmetries are essential to keep the fermions well-defined, as well as to uncover their physical properties,” Bradlyn said. “Locally, by inspecting the physics close to the degeneracy points, one can think of them as new particles, but this is only part of the story,” he said.

    Cano added, “The new fermions know about the global topology of the material. Crucially, they connect to other points in the Brillouin zone in nontrivial ways.”

    During the search for materials exhibiting the new fermions, the team uncovered a fundamentally new and systematic way of finding metals in nature. Until now, searching for metals involved performing detailed calculations of the electronic states of matter.

    “The presence of the new fermions allows for a much easier way to determine whether a given system is a protected metal or not, in some cases without the need to do a detailed calculation,” Wang said.

    Vergniory added, “One can just count the number of electrons of a crystal, and figure out, based on symmetry, if a new fermion exists within observable range.”

    The researchers suggest that this is because the new fermions require multiple electronic states to meet in energy: The 8-branch fermion requires the presence of 8 electronic states. As such, a system with only 4 electrons can only occupy half of those states and cannot be insulating, thereby creating a protected metal.
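
    As a toy illustration of that counting argument (a deliberately simplified sketch, not the paper's actual criterion), suppose the group of states pinned together by the new fermion must be filled completely for the crystal to have any chance of being insulating:

```python
def filling_allows_insulator(electrons_per_cell, degeneracy):
    """Toy electron-counting check: if the electrons cannot completely fill
    the group of `degeneracy` bands forced to meet by the new fermion,
    the material must remain metallic (a "protected metal")."""
    return electrons_per_cell % degeneracy == 0

print(filling_allows_insulator(4, 8))   # False: half-filled 8-fold fermion -> protected metal
print(filling_allows_insulator(8, 8))   # True: the filling could, in principle, be complete
```

    A True result only means the counting does not forbid an insulator; the real classification in the paper rests on the full symmetry analysis.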

    “The interplay between symmetry, topology and material science hinted by the presence of the new fermions is likely to play a more fundamental role in our future understanding of topological materials – both semimetals and insulators,” Cava said.

    Felser added, “We all envision a future for quantum physical chemistry where one can write down the formula of a material, look at both the symmetries of the crystal lattice and at the valence orbitals of each element, and, without a calculation, be able to tell whether the material is a topological insulator or a protected metal.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    Princeton University Campus

    About Princeton: Overview

    Princeton University is a vibrant community of scholarship and learning that stands in the nation’s service and in the service of all nations. Chartered in 1746, Princeton is the fourth-oldest college in the United States. Princeton is an independent, coeducational, nondenominational institution that provides undergraduate and graduate instruction in the humanities, social sciences, natural sciences and engineering.

    As a world-renowned research university, Princeton seeks to achieve the highest levels of distinction in the discovery and transmission of knowledge and understanding. At the same time, Princeton is distinctive among research universities in its commitment to undergraduate teaching.

    Today, more than 1,100 faculty members instruct approximately 5,200 undergraduate students and 2,600 graduate students. The University’s generous financial aid program ensures that talented students from all economic backgrounds can afford a Princeton education.

    Princeton Shield

     
  • richardmitnick 7:20 am on July 21, 2016 Permalink | Reply
    Tags: , , Physics   

    From COSMOS: “The more we know, the more we don’t – physics’ unsolved problems” 

    Cosmos Magazine bloc

    COSMOS

    Brian Cox’s day job is to winkle out the secrets of subatomic particles. But that has given him insights into the very large as well. He talks to Andrew Masterson.

    1
    Credit: Supplied

    Brian Cox is possibly the only particle physicist on Earth who is the subject of a book that claims to be an “unauthorised biography”.

    Contrary to expectations, though, the book, published in 2015 and written by Ben Falk, is not about Cox’s early career as a pop star in the 1990s Northern Irish dance outfit, D:REAM. Rather, according to the blurb, it concentrates on how “his affable charm and infectious enthusiasm have brought science to a whole new audience”.

    The contention is certainly true. In recent years he has written and fronted several blockbuster science television series, including The Human Universe and the forthcoming The Forces of Nature.

    For 14 seasons he has co-hosted, with Robin Ince, the popular BBC radio program The Infinite Monkey Cage. In between he’s popped up on panel shows, in the rebooted Monty Python stage production, and in specials such as The Science of Dr Who.

    In August he’ll be doing a speaking tour of Australian mainland capitals. Of course it will sell out. And of course he’ll be a special guest on every TV show that can get him. He’s a celebrity, after all.

    Given all that, it’s easy to forget that he’s also a very serious scientist. Among other things, he holds research posts at the University of Manchester, CERN and the Large Hadron Collider. When free from his recording commitments, he is part of an international team working on the installation of proton tagging detectors for use in refinements of the collider’s ATLAS and CMS experiments.

    He’s publishing, too. In June this year he was a co-author – admittedly with 5,112 others – of at least one paper lodged on preprint site arXiv. The paper is titled Measurements of the Higgs boson production and decay rates and constraints on its couplings from a combined ATLAS and CMS analysis of the LHC pp collision data at √s = 7 and 8 TeV – which, just by itself, drives home that the good professor, despite what the music mags used to say, is far from just a pretty face.

    While his day job concerns playing close attention to subatomic particles, it doesn’t at all preclude his thoughts turning to matters macro.

    The possibility of life beyond Earth is one such area. In The Human Universe he led the somewhat depressing argument that, while life per se might be abundant in the cosmos, humanity could well be the only intelligent, civilisation-building example of it in this galaxy. This is in part, he said, because the development of eukaryotic cells was so vanishingly unlikely that the chances of it happening twice among planets orbiting our galaxy’s 300 billion stars are exceedingly remote.

    Of course, there is much unknown – indeed, unknowable – beneath such a statement, because all species on Earth stem ultimately from a single common ancestor. In terms of types of life, this presents a sample size of just one – never a reliable number from which to draw conclusions.

    Because of that, it invites the question of whether life evolving once only is always the name of the game, or whether two or more entirely separate life-systems can share a planet.

    “There’s an argument that there’s a sense of inevitability about geochemistry turning into biochemistry in the right circumstances,” says Cox.

    “The theory that I think is the most promising is the idea that in hydrothermal vents in an ocean, in those kinds of conditions where you get an energy flow – an out-of-equilibrium situation – where you get acid and alkaline together, hot and cold together, biochemistry may be inevitable in those conditions.

    “The counter argument is what some evolutionary biologists say: once you get a life, once you get a replicator, it’s then very difficult for another replicator to get a foothold because the head start you get by being the first replicator is immense.

    ____________________________________________________________________________________________________

    ‘I’d say we’re sure now that dark matter must be some kind of particle. It would be a very strange thing if it wasn’t’

    ____________________________________________________________________________________________________

    “You get access to natural selection straight away, and off you go. The advantage to being first is colossal.”

    Cox’s own opinion? “Honestly? I don’t know.”

    And that, of course, is how it should be. However much certain crusty parliamentarians and commentators may like to deride them, doubt and uncertainty are essential elements in the scientific process.

    This is so, even in areas where evidence appears to be ever bolstering theory. Take the standard model of particle physics, for instance, a model at the heart of Cox’s research.

    From the outside, recent discoveries – not least of the Higgs boson and the (albeit contested) detection of gravitational waves – seem to be yet more proof of the model’s robust structure.

    Brian Cox has his doubts.

    “The standard model, with Higgs, is really not what most people would have expected,” he says.

    “It’s not clear that it’s … well, it looks quite ‘fine-tuned’, let’s put it that way. There are more natural models, in many people’s eyes.

    “Supersymmetry is a very good example – which also gives you a candidate for dark matter. This is absolutely necessary.”

    But the drawback with supersymmetry – where every particle has another twin with higher mass – is that it would be a much better model if there were the slightest experimental evidence that it actually exists. It’s an obstacle, says Cox, but not necessarily a permanent one.

    “I’d say we’re sure now that dark matter must be some kind of particle. It would be a very strange thing if it wasn’t,” he says.

    “So, it looks like the astronomers are telling us that there must be some other kind of particle out there. And you would expect it in terms of energy – naively – to be around about where the Higgs particle is. So it’s kind of surprising that we haven’t seen any hint of anything like supersymmetry.

    “But there’s a lot of data still to be taken at the LHC, and I would not be surprised at all if we didn’t see something else. Although it’s possible that we won’t, of course.”

    But if they do, it still won’t be the end of the standard model’s faults. Far from it, in fact.

    “There are huge unsolved problems in physics at the moment,” says Cox.

    “I would say that dark matter is one that should be resolved pretty soon. But dark energy is a colossal problem. There’s something very strange that we don’t understand about the way particles interact with space-time.

    “That link between general relativity and quantum theory – there’s something really terribly wrong with our understanding of that.”

    And that is significant not just in and of itself. From an anthropocentric point of view, it feeds into an even deeper issue, at least if Cox is right in suggesting intelligent life in the universe is as rare as it is.

    Bluntly, if we don’t sort out the friction between the two paradigms, nothing else in the Milky Way will do so, either. Possibly ever.

    “The most interesting questions, I think, are questions that appear to have mutually exclusive answers,” says Cox.

    “So when you start asking questions, such as how valuable is the human race, I take two views that appear to be opposed.

    “One is that physically we’re obviously insignificant. But I think we’re incredibly valuable. The reason for that is that I think in the local universe there are very few civilisations around, so we’re not likely to meet any others.

    “You could even argue that there are only a handful, possibly even one. And that’s enough for me to make us valuable.”

    Brian Cox, with Robin Ince, will be in Australia from 5-18 August. Details and tickets: https://lateralevents.com/public-events/brian-cox-journey-into-deep-space/

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 9:28 am on July 17, 2016 Permalink | Reply
    Tags: , CP-Symmetry, Physics,   

    From Science Alert: “Physicists just confirmed a pear-shaped nucleus, and it could ruin time travel forever” 

    ScienceAlert

    Science Alert

    27 JUN 2016 [Just today in social media]
    BEC CREW

    1
    http://home.cern/about/updates/2013/05/first-observations-short-lived-pear-shaped-atomic-nuclei

    Physicists have confirmed the existence of a new form of atomic nuclei, and the fact that it’s not symmetrical challenges the fundamental theories of physics that explain our Universe.

    But that’s not as bad as it sounds, because the discovery could help scientists solve one of the biggest mysteries in theoretical physics – where is all the dark matter? – and could also explain why travelling backwards in time might actually be impossible.

    “We’ve found these nuclei literally point towards a direction in space. This relates to a direction in time, proving there’s a well-defined direction in time and we will always travel from past to present,” Marcus Scheck from the University of the West of Scotland told Kenneth MacDonald at BBC News.

    So let’s back up here, because to understand this new form of atomic nuclei, you have to get to know the old ones first. Until recently, it was established that the nuclei of atoms could be one of just three shapes – spherical, discus, or rugby ball.

    These shapes are formed by the distribution of electrical charge within a nucleus, and are dictated by the specific combinations of protons and neutrons in a certain type of atom, whether it’s a hydrogen atom, a zinc atom, or a complex isotope created in a lab.

    The common factor across all three shapes is their symmetry, and this marries nicely with a theory in particle physics known as CP-Symmetry. CP-symmetry is the combination of two symmetries that are thought to exist in the Universe: C-Symmetry and P-Symmetry.

    C-Symmetry, also known as charge symmetry, states that if you flip an atomic charge to its opposite, the physics of that atom should still be the same. So if we take a hydrogen atom and an anti-hydrogen atom and mess with them, both should respond in identical ways, even though they have opposite charges.

    P-Symmetry, also known as Parity, states that the spatial coordinates describing a system can be inverted through the point at the origin, so that x, y, and z are replaced with −x, −y, and −z.

    “Your left hand and your right hand exhibit P-Symmetry from one another: if you point your thumb up and curl your fingers, your left and right hands mirror one another,” Ethan Siegel from It Starts With a Bang explains.

    CP-Symmetry is a combination of both of these assumptions. “In particle physics, if you have a particle spinning clockwise and decaying upwards, its antiparticle should spin counterclockwise and decay upwards 100 percent of the time if CP is conserved,” says Siegel. “If not, CP is violated.”
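
    In symbols (a compact restatement of the definitions above, added for clarity rather than taken from the article): parity inverts the spatial coordinates, charge conjugation swaps particles for antiparticles, and CP applies both in succession,

    \[
    \hat{P}:\;(x, y, z) \;\to\; (-x, -y, -z),
    \qquad
    \hat{C}:\;|\text{particle}\rangle \;\to\; |\text{antiparticle}\rangle,
    \qquad
    \widehat{CP} \;=\; \hat{C}\,\hat{P}.
    \]

    If every process runs at the same rate after this combined operation, CP is conserved; any process whose rate changes is CP-violating.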

    The possibility that the Universe could actually violate both C-Symmetry and CP-Symmetry is one of the conditions that have been proposed to explain the mystery of antimatter in the Universe. But proving that would mean the Standard Model of Physics needs a serious rethink.

    According to the laws of physics, at the time of the Big Bang, equal amounts of matter and antimatter had to have been created, but now, billions of years later, we’re surrounded by heaps of matter (solid, liquid, gas, and plasma), but there appears to be almost no naturally occurring antimatter.

    Okay, so back to our atomic nuclei shapes. Most of our fundamental theories of physics are based on symmetry, so when physicists at CERN discovered an asymmetrical pear-shaped nucleus in the isotope Radium-224 back in 2013, it was a bit of a shock, because it showed that nuclei could have more mass at one end than the other.

    Now, three years later, the find has been confirmed by a second study, which has shown that the nucleus of the isotope Barium-144 is also asymmetrical and pear-shaped.

    “[T]he protons enrich in the bump of the pear and create a specific charge distribution in the nucleus,” Scheck told the BBC. “This violates the theory of mirror symmetry and relates to the violation shown in the distribution of matter and antimatter in our Universe.”

    While physicists have suspected that Barium-144 has a pear-shaped nucleus for some time now, Scheck and his team finally figured out how to directly observe that, and it turns out its distortion is even more pronounced than predicted.

    So what does all of this have to do with time travel? It’s a pretty out-there hypothesis, but Scheck says that this uneven distribution of mass and charge causes Barium-144’s nucleus to ‘point’ in a certain direction in spacetime, and this bias could explain why time seems to only want to go from past to present, and not backwards, even if the laws of physics don’t care which way it goes.

    Of course, there’s no way of proving that without further evidence, but the discovery is yet another indication that the Universe might not be as symmetrical as the Standard Model of Physics needs it to be, and proving that could usher us into a whole new era of theoretical physics.

    The study has been published in Physical Review Letters, and can be accessed for free at arXiv.org.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     