Tagged: physicsworld.com

  • richardmitnick 11:31 am on January 8, 2019 Permalink | Reply
    Tags: Antiuniverse, CPT symmetry, Our universe has antimatter partner on the other side of the Big Bang say physicists, physicsworld.com, The entity that respects the symmetry is a universe–antiuniverse pair

    From physicsworld.com: “Our universe has antimatter partner on the other side of the Big Bang, say physicists” 

    From physicsworld.com

    03 Jan 2019

    (Courtesy: shutterstock/tomertu)

    Our universe could be the mirror image of an antimatter universe extending backwards in time before the Big Bang. So claim physicists in Canada, who have devised a new cosmological model positing the existence of an “antiuniverse” [Physical Review Letters] which, paired to our own, preserves a fundamental rule of physics called CPT symmetry. The researchers still need to work out many details of their theory, but they say it naturally explains the existence of dark matter.

    Standard cosmological models tell us that the universe – space, time and mass/energy – exploded into existence some 14 billion years ago and has since expanded and cooled, leading to the progressive formation of subatomic particles, atoms, stars and planets.

    However, Neil Turok of the Perimeter Institute for Theoretical Physics in Ontario reckons that these models’ reliance on ad-hoc parameters means they increasingly resemble Ptolemy’s description of the solar system. One such parameter, he says, is the brief period of rapid expansion known as inflation that can account for the universe’s large-scale uniformity. “There is this frame of mind that you explain a new phenomenon by inventing a new particle or field,” he says. “I think that may turn out to be misguided.”

    Instead, Turok and his Perimeter Institute colleague Latham Boyle set out to develop a model of the universe that can explain all observable phenomena based only on the known particles and fields. They asked themselves whether there is a natural way to extend the universe beyond the Big Bang – a singularity where general relativity breaks down – and then out the other side. “We found that there was,” he says.

    The answer was to assume that the universe as a whole obeys CPT symmetry. This fundamental principle requires that any physical process remains the same if time is reversed, space inverted and particles replaced by antiparticles. Turok says that this is not the case for the universe that we see around us, where time runs forward as space expands, and there’s more matter than antimatter.

    In a CPT-symmetric universe, time would run backwards from the Big Bang and antimatter would dominate. (L Boyle/Perimeter Institute for Theoretical Physics)

    Instead, says Turok, the entity that respects the symmetry is a universe–antiuniverse pair. The antiuniverse would stretch back in time from the Big Bang, getting bigger as it does so, and would be dominated by antimatter as well as having its spatial properties inverted compared to those in our universe – a situation analogous to the creation of electron–positron pairs in a vacuum, says Turok.

    Turok, who also collaborated with Kieran Finn of Manchester University in the UK, acknowledges that the model still needs plenty of work and is likely to have many detractors. Indeed, he says that he and his colleagues “had a protracted discussion” with the referees reviewing the paper for Physical Review Letters [link is above] – where it was eventually published – over the temperature fluctuations in the cosmic microwave background. “They said you have to explain the fluctuations and we said that is a work in progress. Eventually they gave in,” he says.

    In very broad terms, Turok says, the fluctuations are due to the quantum-mechanical nature of space–time near the Big Bang singularity. While the far future of our universe and the distant past of the antiuniverse would provide fixed (classical) points, all possible quantum-based permutations would exist in the middle. He and his colleagues counted the instances of each possible configuration of the CPT pair, and from that worked out which is most likely to exist. “It turns out that the most likely universe is one that looks similar to ours,” he says.

    Turok adds that quantum uncertainty means that universe and antiuniverse are not exact mirror images of one another – which sidesteps thorny problems such as free will.

    But problems aside, Turok says that the new model provides a natural candidate for dark matter. This candidate is an ultra-elusive, very massive particle called a “sterile” neutrino hypothesized to account for the finite (very small) mass of more common left-handed neutrinos. According to Turok, CPT symmetry can be used to work out the abundance of right-handed neutrinos in our universe from first principles. By factoring in the observed density of dark matter, he says that quantity yields a mass for the right-handed neutrino of about 5×10^8 GeV – some 500 million times the mass of the proton.
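    As a rough check on that comparison (using the standard proton rest mass of about 0.938 GeV, which is not a figure taken from the paper), the quoted neutrino mass does indeed come out at roughly 500 million proton masses:

```python
# Back-of-envelope check of the quoted figure; the proton mass is the
# standard value, not a number from the paper.
M_NU_RH_GEV = 5e8      # right-handed-neutrino mass scale quoted in the article, GeV
M_PROTON_GEV = 0.938   # proton rest mass, GeV

print(f"{M_NU_RH_GEV / M_PROTON_GEV:.2e} proton masses")  # ~5.3e8, i.e. about 500 million
```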

    Turok describes that mass as “tantalizingly” similar to the one derived from a couple of anomalous radio signals spotted by the Antarctic Impulsive Transient Antenna (ANITA). The balloon-borne experiment, which flies high over Antarctica, generally observes cosmic rays travelling down through the atmosphere. However, on two occasions ANITA appears to have detected particles travelling up through the Earth with masses between 2×10^8 and 10×10^8 GeV. Given that ordinary neutrinos would almost certainly interact before getting that far, Thomas Weiler of Vanderbilt University and colleagues recently proposed that the culprits were instead decaying right-handed neutrinos [Letters in High Energy Physics].

    Turok, however, points out a fly in the ointment – which is that the CPT symmetric model requires these neutrinos to be completely stable. But he remains cautiously optimistic. “It is possible to make these particles decay over the age of the universe but that takes a little adjustment of our model,” he says. “So we are still intrigued but I certainly wouldn’t say we are convinced at this stage.”

    See the full article here .


    Please help promote STEM in your local schools.


    Stem Education Coalition

    Perimeter Institute is the world’s largest research hub devoted to theoretical physics. The independent Institute was founded in 1999 to foster breakthroughs in the fundamental understanding of our universe, from the smallest particles to the entire cosmos. Research at Perimeter is motivated by the understanding that fundamental science advances human knowledge and catalyzes innovation, and that today’s theoretical physics is tomorrow’s technology. Located in the Region of Waterloo, the not-for-profit Institute is a unique public-private endeavour, including the Governments of Ontario and Canada, that enables cutting-edge research, trains the next generation of scientific pioneers, and shares the power of physics through award-winning educational outreach and public engagement.

    PhysicsWorld is a publication of the Institute of Physics. The Institute of Physics is a leading scientific society. We are a charitable organisation with a worldwide membership of more than 50,000, working together to advance physics education, research and application.

    We engage with policymakers and the general public to develop awareness and understanding of the value of physics and, through IOP Publishing, we are world leaders in professional scientific communications.
    IOP Institute of Physics

     
  • richardmitnick 2:33 pm on November 28, 2018 Permalink | Reply
    Tags: physicsworld.com

    From physicsworld.com: “Cosmic expansion rate remains a mystery despite new measurement” 

    From physicsworld.com

    21 Nov 2018

    Galaxy far away: an image taken by the Dark Energy Camera. (Courtesy: Fermilab)

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Víctor M Blanco 4 m Telescope, which houses DECam at Cerro Tololo, Chile, at an altitude of 7200 feet

    A new value for the Hubble constant – the expansion rate of the universe — has been calculated by an international group of astrophysicists. The team used primordial distance scales to study more than 200 supernovae observed by telescopes in Chile and Australia. The new result agrees well with previous values of the constant obtained using a specific model of cosmic expansion, while disagreeing with more direct observations from the nearby universe – so exacerbating a long-running disagreement between cosmologists and astronomers.

    The Hubble constant is calculated by looking at distant celestial objects and determining how fast they are moving away from Earth. A plot of the speeds of the objects versus their distance from Earth falls on a straight line, the slope of which is the Hubble constant.
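    As a minimal sketch of that procedure (using synthetic points and an assumed slope of 70 km/s/Mpc purely for illustration, not survey data), the constant is just the best-fit slope of the velocity–distance plot:

```python
import numpy as np

# Illustrative only: recover the Hubble constant as the slope of a
# velocity-distance relation, using synthetic points rather than real data.
rng = np.random.default_rng(0)
H0_assumed = 70.0                                    # km/s/Mpc, illustrative value
distances = rng.uniform(20, 500, size=50)            # Mpc
velocities = H0_assumed * distances + rng.normal(0, 300, size=50)  # km/s, with scatter

slope, _ = np.polyfit(distances, velocities, 1)
print(f"fitted H0 ~ {slope:.1f} km/s/Mpc")           # recovers roughly 70
```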

    Obtaining an object’s speed is straightforward and involves measuring the redshift of the light it emits, but quantifying its distance is much more complicated. Historically, this has been done using a “distance-ladder”, whereby progressively greater length scales are measured by using one type of “standard candle” to calibrate the output of another standard candle. The distance to stars known as Cepheid variables (one type of standard candle) is first established via parallax, and that information is used to calibrate the output of type Ia supernovae (another type of standard candle) located in galaxies containing Cepheids. The apparent brightness of other supernovae can then be used to work out distances to galaxies further away.
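    Each rung of the ladder rests on the standard distance-modulus relation: once a candle’s absolute magnitude M is calibrated, its measured apparent magnitude m gives its distance d directly (a textbook relation, not something specific to this study):

```latex
m - M = 5\log_{10}\!\left(\frac{d}{10\,\mathrm{pc}}\right)
\quad\Longrightarrow\quad
d = 10^{\,(m - M + 5)/5}\ \mathrm{pc}
```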

    Large discrepancy

    This approach has been refined over the years and has most recently yielded a Hubble constant of 73.5 ± 1.7 kilometres per second per megaparsec (one megaparsec being 3.26 million light-years). That number, however – obtained by starting close to Earth and moving outwards – is at odds with calculations of the Hubble constant that take the opposite approach – moving inwards from the dawn of time. The baseline in that latter case comes from length scales of temperature fluctuations in the radiation dating back to just after the Big Bang, known as the cosmic microwave background. The cosmic expansion rate at that time is extrapolated to the present day by assuming that the universe’s growth has accelerated under the influence of a particular kind of dark energy. Using the final results from the European Space Agency’s Planck satellite, a very different Hubble constant of 67.4 ± 0.5 km/s/Mpc is obtained.
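    One way to see what the disagreement amounts to is to turn each value into a rough expansion timescale, 1/H0 (a crude indicator that ignores the detailed expansion history):

```python
# Rough comparison of the two quoted values via the Hubble time 1/H0.
# This ignores the detailed expansion history, so it is only indicative.
KM_PER_MPC = 3.0857e19      # kilometres in one megaparsec
SECONDS_PER_GYR = 3.156e16  # seconds in a billion years

for H0 in (73.5, 67.4):                           # km/s/Mpc
    t_hubble_gyr = KM_PER_MPC / H0 / SECONDS_PER_GYR
    print(f"H0 = {H0}: 1/H0 ~ {t_hubble_gyr:.1f} Gyr")
# ~13.3 Gyr versus ~14.5 Gyr: the ~9% tension in H0 maps directly onto the timescale.
```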

    ESA/Planck 2009 to 2013

    To try to resolve the problem by using an alternative approach, scientists have in recent years created what is known as an “inverse distance ladder”. This also uses the cosmic microwave background as a starting point, but it calculates the expansion rate at a later time – about 10 billion years after the Big Bang – when the density fluctuations imprinted on the background radiation had grown to create clusters of galaxies distributed within “baryon acoustic oscillations”. The oscillations are used to calibrate the distance to supernovae – present in the galaxies – thanks to the fact that the oscillations lead to a characteristic separation between galaxies of 147 megaparsecs.
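    Schematically, the oscillations act as a “standard ruler”: because the characteristic comoving separation r_d is known, the angle it subtends on the sky fixes the distance to the galaxies, and hence to the supernovae they host:

```latex
\theta_{\mathrm{BAO}} \simeq \frac{r_{\mathrm{d}}}{D(z)},
\qquad r_{\mathrm{d}} \approx 147\ \mathrm{Mpc}
```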

    In the latest work, the Dark Energy Survey collaboration draws on galaxy data from the Sloan Digital Sky Survey as well as 207 newly-studied supernovae captured by the Dark Energy Camera mounted on the 4-metre Víctor M Blanco telescope in Chile. Using spectra obtained mainly at the similarly-sized Anglo-Australian Telescope in New South Wales, the collaboration calculates a value for the Hubble constant of 67.8 ± 1.3 – so agreeing with the Planck value while completely at odds with the conventional distance ladder.


    AAO Anglo Australian Telescope near Siding Spring, New South Wales, Australia, Altitude 1,100 m (3,600 ft)

    Siding Spring Mountain with Anglo-Australian Telescope dome visible near centre of image at an altitude of 1,165 m (3,822 ft)

    Fewer assumptions

    “The key thing with these results,” says team member Ed Macaulay of the University of Portsmouth in the UK, “is that the only physics you need to assume is plasma physics in the early universe. You don’t need to assume anything about dark energy.”

    Adam Riess, an astrophysicist at the Space Telescope Science Institute in Baltimore, US who studies the distance-ladder, says that the new work “adds more weight” to the disparity in values of the Hubble constant obtained from the present and early universe.

    Cosmic Distance Ladder, skynetblogs


    Dark Energy Camera Enables Astronomers a Glimpse at the Cosmic Dawn. CREDIT National Astronomical Observatory of Japan

    (Indeed, the distance-ladder itself has gained independent support from expansion rates calculated using gravitational lensing.) He reckons that the similarity between the Planck and Dark Energy Survey results means that redshifts out to z=1 (going back about 8 billion years) are “probably not where the tension develops” and that the physics of the early universe might be responsible instead.

    Chuck Bennett of Johns Hopkins University, who led the team on Planck’s predecessor WMAP, agrees. He points to a new model put forward by his Johns Hopkins colleagues Marc Kamionkowski, Vivian Poulin and others that adds extra dark energy to the universe very early on (before rapidly decaying). This model, says Bennett, “proves that it is theoretically possible to find cosmological solutions to the Hubble constant tension”.

    Macaulay is more cautious. He acknowledges the difficulty of trying to find an error, reckoning that potential systematic effects in any of the measurements “are about ten times smaller” than the disparity. But he argues that more data are needed before any serious theoretical explanations can be put forward. To that end, he and his colleagues are attempting to analyse a further 2000 supernovae observed by the Dark Energy Camera, although they are doing so without the aid of (costly) spectroscopic analysis. Picking out the right kind of supernovae and then working out their redshift “will be very difficult,” he says, “and not something that has been done with this many supernovae before”.

    A preprint describing the research is available on arXiv.

    See the full article here .



     
  • richardmitnick 7:44 am on August 14, 2018 Permalink | Reply
    Tags: physicsworld.com, String theory – not even wrong

    From Columbia University via physicsworld.com: “Not Even Wrong” 


    From Columbia University

    via

    physicsworld.com


    August 13, 2018
    Peter Woit

    In recent weeks string theory has been again getting a lot of press attention, because of claims that new progress is being made in the study of the relation of string theory and the real world, via the study of the “swampland”. This is a very old story, and I’ve often written about it here. I just added a new category, so anyone who wants to can go follow it by clicking on the Swampland category of posts.

    Recent press coverage of this includes an article by Clara Moskowitz at Scientific American, entitled String Theory May Create Far Fewer Universes Than Thought. This motivated Avi Loeb to write his own Scientific American piece highlighting the dangers of string theory speculation unmoored to any possible experimental test, which appeared as Theoretical Physics is Pointless without Experimental Tests. Loeb reports:

    “…There is a funny anecdote related to the content of this commentary. In my concluding remarks at the BHI conference we held at Harvard in May 2018, I recommended boarding a futuristic spacecraft directed at the nearest black hole to experimentally test the validity of string theory near the singularity. Nima Arkani-Hamed commented that he suspects I have an ulterior motive for sending string theorists into a black hole. For the video of this exchange, see

    https://www.youtube.com/watch?v=WdFkbsPFQi0

    Last week Natalie Wolchover reported on this controversy, with an article that appeared at Quanta magazine as Dark Energy May Be Incompatible With String Theory and at The Atlantic as The Universe as We Understand It May Be Impossible (The Atlantic headline writer misidentifies “we” as “string theorists”).

    Wolchover accurately explains part of this story as a conflict between string theorists over whether certain solutions (such as the KKLT solution and the rest of the so-called “string theory landscape”) to string theory really exist. Vafa argues they may not exist, since the proposed solutions are complicated and “Usually in physics, we have simple examples of general phenomena.” In response Eva Silverstein argues:

    “…They [Vafa and others] essentially just speculate that those things don’t exist, citing very limited and in some cases highly dubious analyses….”

    On Twitter, Jim Baggott explains the problem

    “Let’s be clear. This is not a ‘test’ of string theory. There is no ‘evidence’ here. This is yet another conjecture that ‘might be true’, on which there is no consensus in the string theory community.”

    and points to an earlier tweet thread of his about this. Sabine Hossenfelder replies with the comment that

    “…The landscape itself is already a conjecture build on a conjecture, the latter being strings to begin with. So: conjecture strings, then conjecture the landscape (so you don’t have to admit the theory isn’t unique), then conjecture the swampland because it’s still not working….”

    The Simons Center summer workshop this year has been devoted to Recent Developments in the Swampland, videos are here (this was also the case in 2006, see here). Next month in Madrid a conference will be devoted to Vistas over the Swampland, and I’m sure many more such gatherings are planned.

    Unfortunately I think the fundamental problem here somehow never gets clearly explained: String theorists don’t actually have a theory, what they have is an approximation to an unknown theory supposed to be valid in certain limits, and a list of properties they would like the unknown theory to have. If this is all you have, there’s no way to distinguish when you’re on dry land (a solution to string theory) from when you’re in the swamp (a non-solution to string theory). Different string theorists can generate different opinions, conjectures and speculations about whether some location is swamp or dry land, but in the absence of an actual theory, no one can tell who is right and who is wrong. I don’t know why Vafa back in 2005 chose “Swampland” as the metaphor for this subject, but it’s an unfortunately apt one: string theorists are stuck in a swamp, with no way of getting out since they can’t tell what’s dry land and what isn’t.

    [I do not normally “poach” a blog post, especially wordpress material, but there was no other way to get this out]

    See the full article here .


    Columbia University was founded in 1754 as King’s College by royal charter of King George II of England. It is the oldest institution of higher learning in the state of New York and the fifth oldest in the United States.

     
  • richardmitnick 7:27 pm on July 20, 2018 Permalink | Reply
    Tags: Dark Matter and Hydrogen?, physicsworld.com

    From physicsworld.com: “Did dark matter have a chilling effect on the early universe?” 

    From physicsworld.com

    10 Jul 2018
    Edwin Cartlidge

    Early days: artist’s impression of stars forming from primordial hydrogen gas. (Courtesy: E R Fuller/National Science Foundation)

    New research lends further support to the idea that a detection of surprisingly strong absorption by primordial hydrogen gas, reported earlier this year, could be evidence of dark matter. The new results, described in three papers in Physical Review Letters, are theoretical and do not settle the issue. Indeed, one group is sceptical of the dark-matter interpretation. But the work heightens interest in ongoing observations of the “cosmic dawn”, with new results from radio telescopes expected within the next year.

    According to cosmologists, the hydrogen gas that existed in the very early universe was in thermal equilibrium with the cosmic microwave background (CMB), which meant that the gas would not have been visible either through absorption of the microwave photons or through emission. But at the start of the cosmic dawn about 100 million years after the Big Bang, ultraviolet light from the first stars would have excited the hydrogen atoms and shifted the distribution of electrons between the lower and upper levels of the hyperfine transition. As such, the hydrogen would have started to absorb much more radiation at the transition wavelength (21 cm), which would be seen today as a dip at longer, redshifted wavelengths in the CMB spectrum.

    Dark Energy Camera Enables Astronomers a Glimpse at the Cosmic Dawn

    In February, researchers working on the Experiment to Detect the Global Epoch of Reionization Signature (EDGES) telescope reported in Nature that they had seen just such a dip at a wavelength of 380 cm in data from their small ground-based antenna system in Western Australia.
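    The quoted wavelength pins down the epoch of the signal: the 21 cm line stretched to roughly 380 cm corresponds to a redshift of about 17, i.e. the cosmic dawn:

```latex
1 + z = \frac{\lambda_{\mathrm{obs}}}{\lambda_{21}}
\approx \frac{380\ \mathrm{cm}}{21\ \mathrm{cm}} \approx 18
\quad\Longrightarrow\quad z \approx 17
```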

    EDGES telescope in a radio quiet zone at the Murchison Radio-astronomy Observatory in Western Australia.

    The observation was exciting news, but nevertheless in line with standard cosmological theory. However, the dip was actually twice as deep as expected – immediately leading theorists to speculate that the hydrogen was in fact interacting with particles of dark matter.

    “The stakes are high because if the signal is real, this experiment is worth two Nobel prizes,” says Abraham Loeb of Harvard University. “One for being first to detect the 21 cm signal from the cosmic dawn and the second for finding an unexpected level of hydrogen absorption that may be indicative of new physics.”

    New or old force?

    The idea is that the dark matter would have been colder than the hydrogen atoms and so interactions between the two would have transferred energy from the gas to the dark matter – so cooling the gas and boosting absorption. The possibility of this mechanism being tied to the switching on of the first stars was proposed by Rennan Barkana of Tel Aviv University in Israel, but Barkana suggested that the interaction could involve a new fundamental force between dark and ordinary matter.

    However, Loeb and Harvard colleague Julián Muñoz argued that there could be no such force as it would have led to stars cooling more quickly than is observed. Instead, they reckon that the interaction could be that of familiar electromagnetism – requiring that a small fraction of dark matter particles have little mass and carry about a millionth of the charge of the electron.

    That view has now won cautious backing from other researchers in the US. By imposing constraints from a wider range of cosmological and astrophysical observations, Asher Berlin of the SLAC National Accelerator Laboratory in California and colleagues have shown in a new paper [Physical Review Letters] that dark matter interactions could indeed explain the EDGES results if up to 2% of dark matter weighs in at less than a tenth the mass of the proton and has a charge less than 0.01% of the electron’s. Berlin and colleagues do, however, add that this scenario would require “additional forces” to subsequently deplete the dark matter so its abundance is in line with observations of the present universe. “Although it’s possible that dark matter could produce the EDGES result, it is not easy or simple to do so,” says Berlin’s colleague Dan Hooper of Fermilab near Chicago.

    Extraordinary claims

    Loeb acknowledges that “extraordinary claims require extraordinary evidence,” adding that the apparent 21 cm signal from EDGES could be nothing more than instrumental noise or absorption by dust grains in our galaxy. He looks forward to new results from other experiments operating at different sites – including SARAS-2, LEDA, and PRIzM – and expects new data to be available within the next year.

    Even if the signal is confirmed, however, dark matter is not necessarily the culprit. Guido D’Amico and colleagues at CERN in Geneva argue in the second new paper [Physical Review Letters] that proponents of the dark-matter interpretation have carried out an “incomplete analysis” by neglecting the heating effect of dark-matter annihilation. In particular, they say that annihilations could inject electrons and low-energy photons into the hydrogen gas, thereby potentially heating the gas more than it is cooled. As such, they conclude, dark-matter annihilations are “strongly constrained” by a 21 cm signal.

    In a third new paper [Physical Review Letters], on the other hand, Anastasia Fialkov of the Harvard-Smithsonian Center for Astrophysics in the US and colleagues (including Barkana) show that the dark-matter hypothesis yields an additional prediction that can be tested using different kinds of radio telescope. They have found that the 21 cm signal should vary across the sky by up to 30 times as much as it would do if there were no charged interactions between ordinary and dark matter – and pointing out that this prediction can be tested using low-frequency interferometers.

    Muñoz is enthusiastic about these spatial measurements, explaining that they are far more immune to foreground noise and other potential systematic errors than the data collected by EDGES, and are therefore, he says, “more reliable”. He reckons that a couple of interferometers – LOFAR in the Netherlands and HERA in South Africa – might have gathered sufficient data within the next five to ten years to establish definitively whether or not the dip at 21 cm really is due to charged dark matter.

    ASTRON LOFAR Radio Antenna Bank, Netherlands

    UC Berkeley Hydrogen Epoch of Reionization Array (HERA), South Africa

    See the full article here .



     
  • richardmitnick 3:05 pm on June 21, 2018 Permalink | Reply
    Tags: Muon antineutrino oscillation spotted by NOvA, physicsworld.com

    From physicsworld.com: “Muon antineutrino oscillation spotted by NOvA” 

    From physicsworld.com

    07 June 2018
    Hamish Johnston

    FNAL NOvA detector in northern Minnesota

    NOvA Far Detector Block

    The best evidence yet that muon antineutrinos can change into electron antineutrinos has been found by the NOvA experiment in the US. The measurement involved sending a beam of muon antineutrinos more than 800 km through the Earth from Fermilab near Chicago to a detector in northern Minnesota. After running for about 14 months, NOvA found that at least 13 of the muon antineutrinos had changed type, or “flavour”, during their journey.

    The results were presented at the Neutrino 2018 conference, which is being held in Heidelberg, Germany, this week. Although the measurement is still below the threshold required to claim a “discovery”, the result means that fundamental properties of neutrinos and antineutrinos can be compared in detail. This could shed light on important mysteries of physics, such as why there is very little antimatter in the universe.

    Neutrinos and antineutrinos come in three flavours: electron, muon and tau. The subatomic particles also exist in three mass states, which means that neutrinos (and antineutrinos) will continuously change flavour (or oscillate). Neutrino oscillation came as a surprise to physicists, who had originally thought that neutrinos have no mass. Indeed, the origins of neutrino mass are not well-understood and a better understanding of neutrino oscillation could point to new physics beyond the Standard Model.
    Pion focusing

    NOvA has been running for more than three years and comprises two detectors – one located at Fermilab and the other in Minnesota near the border with Canada.

    FNAL Near Detector

    The muon antineutrinos in the beam are produced at Fermilab’s NuMI facility by firing a beam of protons at a carbon target. This produces pions, which then decay to produce either muon neutrinos or muon antineutrinos – depending upon the charge of the pion. By focusing pions of one charge into a beam, researchers can create a beam of either neutrinos or antineutrinos.

    The beam is aimed on a slight downward trajectory so it can travel through the Earth to the detector in Minnesota, which weighs in at 14,000 tonnes. Electron neutrinos and antineutrinos are detected when they very occasionally collide with an atom in a liquid scintillator, which produces a tiny flash of light. This light is converted into electrical signals by photomultiplier tubes, and the type of neutrino (or antineutrino) can be worked out by studying the pattern of signals produced.

    The experiment’s first run with antineutrinos began in February 2017 and ended in April 2018. The first results were presented this week in Heidelberg by collaboration member Mayly Sanchez of Iowa State University, who reported that a total of 18 electron antineutrinos had been seen by the Minnesota detector. If muon antineutrinos did not oscillate to electron antineutrinos, then only five detections should have been made.
    “Strong evidence”

    “The result is above 4σ level, which is strong evidence for electron antineutrino appearance,” Sanchez told Physics World, adding that this is the first time that the appearance of electron antineutrinos has been seen in a beam of muon antineutrinos. While this is below the 5σ level normally accepted as a discovery in particle physics, it is much stronger evidence than found by physicists working on the T2K detector in Japan – which last year reported seeing hints of the oscillation.
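    A bare Poisson counting estimate shows why 18 events against an expected five clears the 4σ mark (this is only a toy calculation; the collaboration’s actual analysis uses a full likelihood with backgrounds, systematics and energy spectra):

```python
from scipy.stats import norm, poisson

# Toy counting estimate only: the real NOvA analysis is a full likelihood fit,
# not a raw comparison of event counts.
observed = 18   # electron-antineutrino candidates reported
expected = 5    # events expected if muon antineutrinos did not oscillate

p_value = poisson.sf(observed - 1, expected)        # P(N >= 18 | mu = 5)
z_score = norm.isf(p_value)                         # one-sided Gaussian equivalent
print(f"p ~ {p_value:.1e}, about {z_score:.1f} sigma")  # comes out a little above 4 sigma
```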

    In 2014–2017 NOvA detected 58 electron neutrinos that appeared in a muon neutrino beam. This has allowed NOvA physicists to compare the rates at which muon neutrinos and antineutrinos oscillate to their respective electron counterparts. According to Sanchez, the team has seen a small discrepancy that has a statistical significance of just 1.8σ. While this difference is well within the expected measurement uncertainty, if it persists as more data are collected it could point towards new physics.

    Sanchez says that NOvA is still running in antineutrino mode and the amount of data taken will double by 2019.

    See the full article here .



     
  • richardmitnick 1:53 pm on January 17, 2018 Permalink | Reply
    Tags: physicsworld.com

    From Physics World: “Neutrino hunter” 

    physicsworld.com

    Nigel Lockyer

    Nigel Lockyer, director of Fermilab in the US, talks to Michael Banks about the future of particle physics – and why neutrinos hold the key.

    Fermilab is currently building the Deep Underground Neutrino Experiment (DUNE). How are things progressing?

    Construction began last year with a ground-breaking ceremony held in July at the Sanford Underground Research Facility, which is home to DUNE.

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA


    FNAL DUNE Argon tank at SURF


    Surf-Dune/LBNF Caverns at Sanford



    SURF building in Lead SD USA

    By 2022 the first of four tanks of liquid argon, each 17,000 tonnes, will be in place detecting neutrinos from space. Then in 2026, when all four are installed, Fermilab will begin sending the first beam of neutrinos to DUNE, which is some 1300 km away.

    Why neutrinos?

    Neutrinos have kept throwing up surprises ever since we began studying them and we expect a lot more in the future. In many ways, the best method to study physics beyond the Standard Model is with neutrinos.

    Standard Model of Particle Physics from Symmetry Magazine

    What science do you plan when DUNE comes online?

    One fascinating aspect is detecting neutrinos from supernova explosions. Liquid argon is very good at picking up electron neutrinos and we would expect to see a signal if that occurred in our galaxy. We could then study how the explosion results in a neutron star or black hole. That would really be an amazing discovery.

    And what about when Fermilab begins firing neutrinos towards DUNE?

    One of the main goals is to investigate charge–parity (CP) violation in the lepton sector. We would be looking for the appearance of electron neutrinos and electron antineutrinos. If there is a statistical difference then this would be a sign of CP violation and could give us hints as to the reason why there is more matter than antimatter in the universe. Another aspect of the experiment is to search for proton decay.

    How will Fermilab help in the effort?

    To produce neutrinos, the protons smash into a graphite target that is currently the shape of a pencil. We are aiming to quadruple the proton beam power from 700 kW to 2.5 MW. Yet we can’t use graphite after the accelerator has been upgraded due to the high beam power so we need to have a rigorous R&D effort in materials physics.

    What kind of materials are you looking at?

    The issue we face is how to dissipate heat better. We are looking at alloys of beryllium to act as a target and potentially rotating it to cool it down better.

    What are some of the challenges in building the liquid argon detectors?

    So far the largest liquid-argon detector built in the US, at Fermilab, weighs 170 tonnes. As each full-sized tank at DUNE will be 17,000 tonnes, we face a challenge to scale up the technology. One particular issue is that the electronics are contained within the liquid argon and we need to do some more R&D in this area to make sure they can operate effectively. The other area is with the purity of the liquid argon itself. It is a noble gas and, if pure, an electron can drift forever within it. But if there are any impurities that will limit how well the detector can operate.

    How will you go about developing this technology?

    The amount of data you get out of liquid argon detectors is enormous, so we need to make sure we have all the technology tried and tested. We are in the process of building two 600 tonne prototype detectors, the first of which will be tested at CERN in June 2018.

    CERN Proto DUNE Maximillian Brice

    The UK recently announced it will contribute £65m towards DUNE, how will that be used?

    The UK is helping build components for the detector and contributing with the data-acquisition side. It is also helping to develop the new proton target, and to construct the new linear accelerator that will enable the needed beam power.

    The APA being prepped for shipment at Daresbury Laboratory. (Credit: STFC)

    First APA (Anode Plane Assembly) ready to be installed in the protoDUNE-SP detector. (Photograph: Julien Marius Ordan)

    Are you worried Brexit might derail such an agreement?

    I don’t think so. The agreement is between the UK and US governments and we expect the UK to maintain its support.

    Japan is planning a successor to its Super Kamiokande neutrino detector – Hyper Kamiokande – that would carry out similar physics. Is it a collaborator or competitor?

    Well, it’s not a collaborator. Like Super Kamiokande, Hyper Kamiokande would be a water-based detector, the technology of which is much more established than liquid argon. However, in the long run liquid argon is a much more powerful detector medium – you can get a lot more information about the neutrino from it. I think we are pursuing the right technology. We also have a longer baseline that would let us look for additional interactions between neutrinos and we will create neutrinos with a range of energies. Additionally, the DUNE detectors will be built a mile underground to shield them from cosmic interference.

    Super-Kamiokande experiment. located under Mount Ikeno near the city of Hida, Gifu Prefecture, Japan

    Hyper-Kamiokande, a neutrino physics laboratory located underground in the Mozumi Mine of the Kamioka Mining and Smelting Co. near the Kamioka section of the city of Hida in Gifu Prefecture, Japan.

    _____________________________________________________
    In the long run liquid argon is a much more powerful detector medium – you can get a lot more information about the neutrino from it.
    _____________________________________________________

    Regarding the future at the high-energy frontier, does the US support the International Linear Collider (ILC)?

    ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan

    The ILC began as an international project and in recent years Japan has come forward with an interest to host it. We think that Japan now needs to take a lead on the project and give it the go-ahead. Then we can all get around the table and begin negotiations.

    And what about plans by China to build its own Higgs factory?

    The Chinese government is looking at the proposal carefully and trying to gauge how important it is for the research community in China. Currently, Chinese accelerator scientists are busy with two upcoming projects in the country: a free-electron laser in Shanghai and a synchrotron in Beijing. That will keep them busy for the next five years, but after that this project could really take off.

    See the full article here .


     
  • richardmitnick 11:38 am on January 5, 2018 Permalink | Reply
    Tags: HANARO research reactor at the Korean Atomic Energy Research Institute South Korea, physicsworld.com

    From physicsworld.com: “Neutrons probe gravity’s inverse square law” 

    physicsworld.com

    Jan 4, 2018
    Edwin Cartlidge

    Gravitating toward Newton’s law: the J-PARC neutron facility

    A spallation neutron source has been used by physicists in Japan to search for possible violations of the inverse square law of gravity. By scattering neutrons off noble-gas nuclei, the researchers found no evidence of any deviation from the tried and tested formula. However, they could slightly reduce the wiggle room for any non-conventional interactions at distances of less than 0.1 nm, and are confident they can boost the sensitivity of their experiment over the next few months.

    According to Newton’s law of universal gravitation, the gravitational force between two objects is proportional to each of their masses and inversely proportional to the square of the distance between them. This relationship can also be derived using general relativity, when the field involved is fairly weak and objects are travelling significantly slower than the speed of light. However, there are many speculative theories – some designed to provide a quantum description of gravity – that predict that the relationship breaks down at small distances.

    Physicists have done a wide range of different experiments to look for such a deviation. These include torsion balances, which measure the tiny gravitational attraction between two masses suspended on a fibre and two fixed masses. However, this approach is limited by environmental noise such as seismic vibrations and even the effects of dust particles. As a result such experiments cannot probe gravity at very short distances, with the current limit being about 0.01 mm.

    Scattered in all directions

    Neutrons, on the other hand, can get down to the nanoscale and beyond. The idea is to fire a beam of neutrons at a gas and record how the neutrons are scattered by the constituent nuclei. In the absence of any new forces modifying gravity at short scales, the neutrons and nuclei essentially only interact via the strong force (neutrons being electrically neutral). But the strong force acts over extremely short distances – roughly the size of the nucleus, about 10^-14 m – while the neutrons have a de Broglie wavelength of around 1 nm. The neutrons therefore perceive the nuclei as point sources and as such are scattered equally in all directions.
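    A quick order-of-magnitude sketch (with standard constants, not values from the experiment) shows why the nuclei look point-like, and how slow such neutrons are:

```python
# Order-of-magnitude sketch using standard constants (not experimental values):
# the speed of a neutron with a ~1 nm de Broglie wavelength, and how that
# wavelength compares with a nuclear radius of ~1e-14 m.
H_PLANCK = 6.626e-34    # Planck constant, J s
M_NEUTRON = 1.675e-27   # neutron mass, kg

wavelength = 1e-9                                    # m, as quoted in the article
velocity = H_PLANCK / (M_NEUTRON * wavelength)       # from lambda = h / (m v)
print(f"v ~ {velocity:.0f} m/s")                     # roughly 400 m/s ("cold" neutrons)
print(f"wavelength / nuclear size ~ {wavelength / 1e-14:.0e}")  # ~1e5, so nuclei look point-like
```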

    Any new force, however, would likely extend beyond the nucleus. If its range were comparable to the neutrons’ wavelength then those neutrons would be scattered more frequently in a forward direction than at other angles. Evidence of such a force, should it exist, can therefore be sought by firing in large numbers of neutrons and measuring the distribution of their scattering angles.

    In 2008, Valery Nesvizhevsky of the Institut Laue-Langevin in France and colleagues looked for evidence of such forward scattering in data from previous neutron experiments. They ended up empty handed but could place new upper limits on the strength of any new forces, improving on the existing constraints for scales between 1 pm and 5 nm by several orders of magnitude. Those limits were then pushed back by about another order of magnitude two years ago, when Sachio Komamiya at the University of Tokyo and team scattered neutrons off atomic xenon at the HANARO research reactor at the Korean Atomic Energy Research Institute in South Korea.

    HANARO research reactor at the Korean Atomic Energy Research Institute in South Korea

    Time of flight

    In the new research, Tamaki Yoshioka of Kyushu University in Japan and colleagues use neutrons from a spallation source at the Japan Proton Accelerator Research Complex (J-PARC) in Tokai, which they fire at samples of xenon and helium. Because the J-PARC neutrons come in pulses, the researchers can easily measure their time of flight, and, from that, work out their velocity and hence their wavelength.


    J-PARC (Japan Proton Accelerator Research Complex), located in Tokai village, Ibaraki prefecture, Japan

    Armed with this information, the team can establish whether any forward scattering is due to a new force or simply caused by neutrons bouncing off larger objects in the gas, such as trace amounts of atmospheric gases. At any given wavelength, both types of scattering would be skewed in the forward direction and so would be indistinguishable from one another. But across a range of wavelengths different patterns would emerge. For atmospheric gases, the scattering angle would simply be proportional to the neutrons’ wavelength. In the case of a new force, on the other hand, the relationship would be more complex because the effective size of the nucleus would itself vary with neutron wavelength.

    Reactors can also be used to generate pulses, by “chopping” a neutron beam. But that process severely limits the beam’s intensity. Taking advantage of the superior statistics at J-PARC, Yoshioka and colleagues were able to reduce the upper limit on any new forces at distances below 0.1 nm by about an order of magnitude compared with the HANARO results – showing that their inherent strength can be at most 10^24 times that of gravity (gravity being an exceptionally weak force).

    Cost-effective search

    That is still nowhere near the sensitivity of torsion balance searches at bigger scales – which can get down to the strength of gravity itself. As Nesvizhevsky points out, torsion balances use macroscopic masses with “Avogadro numbers” (10^23) of atoms, whereas neutron scattering experiments involve at most a few tens of millions of neutrons. Nevertheless, he believes that the new line of research is well worth pursuing, pointing out that many theories positing additional gravity-like forces “predict forces in this range of observations”. Such experiments, he argues, represent “an extremely cost-effective way of looking for a new fundamental force” when compared to searches carried out in high-energy physics.

    Spurred on by the prospect of discovery, Yoshioka and colleagues are currently taking more data. The lead author of a preprint on arXiv describing the latest research, Christopher Haddock of Nagoya University, says that they hope to have new results by the summer. A series of improvements to the experiment, including less scattering from the beam stop, he says, could boost sensitivity to new forces in the sub-nanometre range by up to a further order of magnitude and should also improve existing limits at distances of up to 10 nm.

    See the full article here .


     
  • richardmitnick 4:54 pm on January 2, 2018 Permalink | Reply
    Tags: physicsworld.com, Seismic-wave attenuation studies have great potential to add knowledge in this field, Wave attenuation hints at nature of Earth’s asthenosphere

    From physicsworld.com: “Wave attenuation hints at nature of Earth’s asthenosphere” 

    physicsworld.com

    Jan 2, 2018

    Soft on the inside. No image credit.

    Researchers in Japan have used measurements of the aftershocks of the 2011 Tohoku earthquake to gain insight into the dynamics of the Earth’s crust and upper mantle. Nozomu Takeuchi and colleagues at the University of Tokyo, Kobe University, and the Japan Agency for Marine–Earth Science and Technology, analysed the attenuation of seismic waves as they propagated through the rigid lithosphere and the less viscous asthenosphere beneath. The team found that the rate of attenuation in the lithosphere showed a marked frequency dependence, whereas in the asthenosphere the relationship was much weaker. The result demonstrates the possibility of using broadband seismic attenuation data to characterize the properties of the Earth’s subsurface.

    Weak, deep and mysterious

    The lithosphere is the rigid outermost layer of the Earth. It comprises two compositional units – the crust and the upper mantle. The movement of individual fragments of the lithosphere (the tectonic plates) is responsible for the phenomenon of continental drift, and is possible due to the low mechanical strength of the underlying asthenosphere.

    Away from the active mid-ocean ridges, the lithosphere–asthenosphere boundary (LAB) lies at least tens of kilometres below the ocean floor, making direct investigation impossible for now. The LAB is even less accessible beneath the continents, where the lithosphere can be hundreds of kilometres thick. Nevertheless, seismic wave velocities and the way the continents have rebounded after deglaciation have allowed the viscosity of the asthenosphere to be estimated even though the physical cause of the mechanical contrast between the layers remains mysterious. A rise in temperature across the boundary presumably contributes, but probably does not explain the disparity completely; partial melting and differences in water content have also been proposed.

    Complex signal

    To help discriminate between these mechanisms, Takeuchi and collaborators looked to differences in the attenuating effects of the lithosphere and asthenosphere. This is a promising approach, because the process of anelastic attenuation is closely related to a material’s thermomechanical properties. The situation is complicated, however, by the fact that high-frequency seismic waves are also attenuated by scattering from small-scale features, and low-frequency waves are attenuated by geometrical spreading.

    Using a dataset obtained after the 2011 earthquake by an array of ocean-floor seismometers in the northwest Pacific, the group compared actual records of seismic waves with a series of probabilistic models. To isolate the anelastic attenuation signature for high-frequency (>3 Hz) waves, the researchers conducted simulations in which the scattering properties of the lithosphere and asthenosphere were varied. The model that most closely matched observations indicated a rate of attenuation for the asthenosphere 50 times that for the lithosphere, and suggested that this attenuation is not related to frequency. Seismic waves in the lithosphere, in contrast, seem strongly frequency dependent.

    More experiments needed

    Although Takeuchi and colleagues’ research shows that seismic-wave attenuation studies have great potential to add knowledge in this field, the results themselves do not immediately support one model over another. Laboratory experiments reveal that partial melting of a sample can produce a weak frequency dependence similar to that determined by this study for the asthenosphere, which on its own would strongly suggest that as the reason for the layer’s low viscosity. However, a similar effect has been observed for samples below the material’s solidus, undermining that explanation somewhat, and also failing to explain why the same response is not observed in the solid lithosphere. Further experiments involving additional factors will be needed to settle the issue.

    Full details of the research are published in Science. A commentary on the research, written by Colleen Dalton of Brown University in the US, is also published in the same issue.

    See the full article here .


     
  • richardmitnick 4:00 pm on December 28, 2017 Permalink | Reply
    Tags: Can Bose-Einstein condensates simulate cosmic inflation?, physicsworld.com

    From physicsworld.com: “Can Bose-Einstein condensates simulate cosmic inflation?” 

    physicsworld.com

    Dec 28, 2017
    Tim Wogan

    Rolling downhill: illustration of a coherent quantum phase transition

    Cosmological inflation, first proposed by Alan Guth in 1979, describes a hypothetical period when the early Universe expanded faster than the speed of light.

    Alan Guth, Highland Park High School and M.I.T., who first proposed cosmic inflation


    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe). Date: 2010. Credit: Alex Mittelmann, Coldcreation

    Alan Guth’s notes. http://www.bestchinanews.com/Explore/4730.html

    The model, which answers fundamental questions about the formation of the Universe we know today, has become central to modern cosmology, but many details remain uncertain. Now atomic physicists in the US have developed a laboratory analogue by shaking a Bose-Einstein condensate (BEC). The team’s initial results suggest that the Universe may have remained quantum coherent throughout inflation and beyond. The researchers hope their condensate model may provide further insights into inflation in a more accessible system; however, not everyone agrees on its usefulness.

    Dynamical instability occurs in all sorts of physical systems that are out of equilibrium. A ball perched at the top of a hill, for example, may stay put for a short time. But the tiniest perturbation will send the ball falling towards a lower-energy state at the bottom of the hill. Guth realized that a very short, very rapid period of expansion could occur if the Universe got stuck out of equilibrium around 10^-35 s after the Big Bang, causing it to expand by a factor of around 10^26 in a tiny fraction of a second. The details of the inflationary model have been revised many times, and numerous questions remain. “This is where I can contribute, even though I’m not a cosmologist,” says Cheng Chin of the University of Chicago in Illinois: “We have only one Universe, so it becomes very hard to say whether our theories really capture the whole physics as we can’t repeat the experiment.”
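    In the notation usually used for inflation, growth by a factor of around 10^26 corresponds to roughly 60 “e-folds” of expansion, a back-of-envelope figure rather than one taken from the paper:

```latex
N = \ln\!\left(\frac{a_{\mathrm{end}}}{a_{\mathrm{start}}}\right)
\approx \ln\!\left(10^{26}\right) = 26\ln 10 \approx 60
```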

    Shake it up

    Chin and colleagues created their model system by cooling 30,000 atoms in an optical trap into a BEC, in which all the atoms occupy a single quantum state. Initially, this BEC was sitting still in the centre of the trap. The researchers then began to shake the condensate by moving the trapping potential from side to side with increasing amplitude. This raised the energy of the state in which the condensate was stationary relative to the trapping potential. When the shaking amplitude was increased past a critical value, the energy of this “stationary” state became higher than the energy of two other states with the condensate oscillating in opposite directions inside the trap. The condensate therefore underwent a dynamical phase transition, splitting into two parts that each entered one of these two momentum states.

    Between 20 and 30 ms after the phase transition, the researchers saw a clear interference pattern in the density of the condensate. This shows, says Chin, that the condensate had undergone a quantum coherent separation, with each atom entering a superposition of both momentum states. After this, the clear interference pattern died out. This later period corresponds, says Chin, to the period of cosmological relaxation in which, after inflation had finished, different parts of the Universe relaxed to their new ground states. More detailed analysis of the condensate in this phase showed that, although its quantum dynamics were more complicated – with higher harmonics of the oscillation frequencies becoming more prominent – the researchers’ observations could not be described classically.

    Chin says that cosmologists may find this observation interesting. Although “in principle, everything is quantum mechanical,” he explains, the practical impossibility of performing a full quantum simulation of the Universe as its complexity grows leads cosmologists to fall back on classical models. “The value of our research is to try and point out that we shouldn’t give up [on quantum simulation] that early,” he says. “Even in inflation and the subsequent relaxation process, we have one concrete example to show that quantum mechanics and coherence still play a very essential role.”

    Inflated claims?

    James Anglin of the University of Kaiserslautern in Germany is impressed by the research. “Understanding what happens to small initial quantum fluctuations after a big instability has saturated is an important and basic question in physics, and it really is an especially relevant question for cosmology,” he explains. “The big difference, of course, is that the cosmic inflation scenario includes gravity as curved spacetime in general relativity, such that space expands enormously while the inflaton field [the field thought to drive inflation] finds its true ground state. A malicious critic might say that this experiment is a perfect analogue for cosmological inflation, except for the inflation part.”

    “This is indeed nice work,” he concludes. “The language is simply a little bit inflated!” The research is described in Nature Physics.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    PhysicsWorld is a publication of the Institute of Physics. The Institute of Physics is a leading scientific society. We are a charitable organisation with a worldwide membership of more than 50,000, working together to advance physics education, research and application.

    We engage with policymakers and the general public to develop awareness and understanding of the value of physics and, through IOP Publishing, we are world leaders in professional scientific communications.
    IOP Institute of Physics

     
  • richardmitnick 11:10 am on August 12, 2017 Permalink | Reply
    Tags: , , , , , physicsworld.com, Solar core spins four times faster than expected,   

    From physicsworld.com: “Solar core spins four times faster than expected” 

    physicsworld
    physicsworld.com

    Aug 11, 2017
    Keith Cooper

    Sunny science: the Sun still holds some mysteries for researchers. No image credit.

    The Sun’s core rotates four times faster than its outer layers – and the elemental composition of its corona is linked to the 11 year cycle of solar magnetic activity. These two findings have been made by astronomers using a pair of orbiting solar telescopes – NASA’s Solar Dynamics Observatory (SDO) and the joint NASA–ESA Solar and Heliospheric Observatory (SOHO). The researchers believe their conclusions could revolutionize our understanding of the Sun’s structure.

    NASA/SDO

    ESA/NASA SOHO

    Onboard SOHO is an instrument named GOLF (Global Oscillations at Low Frequencies), designed to search for millimetre-sized gravity, or g-mode, oscillations – waves for which buoyancy provides the restoring force – at the Sun’s surface (the photosphere). Evidence for these g-modes has, however, proven elusive: convection within the Sun disrupts the oscillations, and the Sun’s convective layer occupies its outer third. If solar g-modes exist, they do so deep within the Sun’s radiative core.

    A team led by Eric Fossat of the Université Côte d’Azur in France has therefore taken a different tack. The researchers realized that acoustic pressure, or p-mode, oscillations that penetrate all the way through to the core – which Fossat dubs “solar music” – could be used as a probe for the g-modes. Analysing more than 16 years of observations by GOLF, Fossat’s team found that p-modes passing through the solar core are modulated by the g-modes reverberating there, which slightly alters the spacing between successive p-mode peaks.
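    As a rough illustration of the detection principle (a toy model only: the cadence, the 100-day period, the noise level and the analysis below are all hypothetical and are not Fossat’s actual method or numbers), one can simulate a series of p-mode spacing measurements that drift periodically and recover the modulation period from a Fourier spectrum:

    ```python
    import numpy as np

    # Toy illustration of the detection principle, not Fossat's analysis:
    # suppose the spacing between successive p-mode peaks drifts periodically
    # because a g-mode reverberating in the core perturbs the acoustic waves.
    # The modulation period then shows up as a peak in the Fourier spectrum
    # of a long series of spacing measurements.
    rng = np.random.default_rng(0)

    dt_days = 1.0                              # hypothetical cadence: one estimate per day
    t = np.arange(0.0, 5000.0, dt_days)        # ~13.7 years, comparable to GOLF's long baseline
    g_mode_period_days = 100.0                 # hypothetical g-mode period, for illustration only

    mean_spacing = 135.0                                                # microhertz, a typical p-mode large separation
    modulation = 0.02 * np.sin(2 * np.pi * t / g_mode_period_days)      # tiny periodic shift
    noise = 0.05 * rng.standard_normal(t.size)                          # measurement noise
    spacing = mean_spacing + modulation + noise

    # Fourier transform of the de-meaned spacings reveals the modulation period.
    power = np.abs(np.fft.rfft(spacing - spacing.mean()))**2
    freqs = np.fft.rfftfreq(t.size, d=dt_days)       # cycles per day
    best = freqs[np.argmax(power[1:]) + 1]           # skip the zero-frequency bin
    print(f"recovered modulation period = {1.0 / best:.0f} days")       # prints ~100 days
    ```

    The real analysis is far more delicate than this sketch suggests, since, as Kuhn notes below, the signal lies deep in the noise.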

    Fossat describes this discovery as “a fantastic result” in terms of what g-modes can tell us about the solar interior. The properties of the g-mode oscillations depend strongly on the structure and conditions within the Sun’s core, including the ratio of hydrogen to helium, and the period of the g-modes indicates that the Sun’s core rotates approximately once per week. This is around four times faster than the Sun’s outer layers, which rotate once every 25 days at the equator and once every 35 days at the poles.
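    The quoted factor of four follows directly from those periods (simple arithmetic on the numbers above, not an additional result from the paper):

    \[
    \frac{P_{\mathrm{equator}}}{P_{\mathrm{core}}} \approx \frac{25\ \mathrm{days}}{7\ \mathrm{days}} \approx 3.6,
    \qquad
    \frac{P_{\mathrm{poles}}}{P_{\mathrm{core}}} \approx \frac{35\ \mathrm{days}}{7\ \mathrm{days}} = 5,
    \]

    so the core completes roughly four turns for every rotation of the surface at low latitudes.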

    Diving into noise

    Not everyone is convinced by the results. Jeff Kuhn of the University of Hawaii describes the findings as “interesting”, but warns that independent verification is required.

    “Over the last 30 years there have been several claims for detecting g-modes, but none have been confirmed,” Kuhn told physicsworld.com. “In their defence, [Fossat’s researchers] have tried several different tests of the GOLF data that give them confidence, but they are diving far into the noise to extract this signal.” He thinks that long-term ground-based measurements of some p-mode frequencies should also contain the signal and confirm Fossat’s findings further.

    If the results presented in Astronomy & Astrophysics can be verified, then Kuhn is excited about what a faster-spinning core could mean for the Sun. “It could pose some trouble for our basic understanding of the solar interior,” he says. When stars are born they spin rapidly, but over time their stellar winds rob the outer layers of angular momentum, slowing them down. Fossat suggests, however, that the cores could conceivably retain something close to their original spin rate.

    Solar links under scrutiny

    Turning attention from the Sun’s core to its outer layers reveals another mystery. The energy generated by nuclear reactions in the Sun’s core ultimately powers the activity in the Sun’s outer layers, including the corona. But the corona is more than a million degrees hotter than the layers of the chromosphere and photosphere below it. The source of this coronal heating is unknown, but a new paper published in Nature Communications has found a link between the elemental composition of the corona, which features a broad spectrum of atomic nuclei including iron and neon, and the Sun’s 11 year cycle of magnetic activity.

    Observations made by SDO between 2010 (when the Sun was near solar minimum) and 2014 (when its activity peaked) revealed that at minimum the corona’s composition is dominated by processes characteristic of the quiet Sun, whereas at maximum it is instead controlled by some as-yet-unidentified process that takes place around the active regions of sunspots.

    That the composition of the corona is linked not to a fixed property of the Sun (such as its rotation) but to a variable one could “prompt a new way of thinking about the coronal heating problem,” says David Brooks of George Mason University, USA, who is lead author on the paper. This is because the way in which elements are transported into the corona is thought to be closely related to how the corona is heated.

    Quest for consensus

    Many explanations for the corona’s high temperature have been proposed, ranging from magnetic reconnection to fountain-like spicules, and from magnetic Alfvén waves to nanoflares, but none has yet won a consensus among solar physicists.

    “If there’s a model that explains everything – the origins of the solar wind, coronal heating and the observed preferential transport – then that would be a very strong candidate,” says Brooks. The discovery that the elemental abundances vary with the magnetic cycle is therefore a new diagnostic against which to test models of coronal heating.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    PhysicsWorld is a publication of the Institute of Physics. The Institute of Physics is a leading scientific society. We are a charitable organisation with a worldwide membership of more than 50,000, working together to advance physics education, research and application.

    We engage with policymakers and the general public to develop awareness and understanding of the value of physics and, through IOP Publishing, we are world leaders in professional scientific communications.
    IOP Institute of Physics

     