Tagged: Physics

  • richardmitnick 3:11 pm on July 28, 2016 Permalink | Reply
    Tags: Physics, The Standard Model

    From Symmetry: “The deconstructed Standard Model equation” 

    Symmetry Mag


    Rashmi Shivni

    Yvonne Tang, SLAC National Accelerator Laboratory

    The Standard Model is far more than elementary particles arranged in a table.

    The Standard Model of particle physics is often visualized as a table, similar to the periodic table of elements, and used to describe particle properties, such as mass, charge and spin.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    The table is also organized to represent how these teeny, tiny bits of matter interact with the fundamental forces of nature.

    But it didn’t begin as a table. The grand theory of almost everything actually represents a collection of several mathematical models that proved to be timeless interpretations of the laws of physics.

    Here is a brief tour of the topics covered in this gargantuan equation.

    The whole thing

    This version of the Standard Model is written in the Lagrangian form. The Lagrangian is a compact way of writing an equation that determines the state of a changing system; roughly speaking, it encodes the difference between a system’s kinetic and potential energy, and from it the system’s equations of motion can be derived.

    Technically, the Standard Model can be written in several different formulations, but, despite appearances, the Lagrangian is one of the easiest and most compact ways of presenting the theory.
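    The idea of a Lagrangian is easier to see in a system far simpler than the Standard Model. Here is a minimal Python sketch, purely illustrative: a one-dimensional harmonic oscillator, with arbitrary values for the mass and spring constant that do not come from the article, showing how a Lagrangian of the form kinetic minus potential energy yields an equation of motion.

```python
# Illustrative only, NOT the Standard Model: the Lagrangian of a 1-D harmonic
# oscillator is L = kinetic energy - potential energy.
m, k = 2.0, 8.0  # mass and spring constant (arbitrary example values)

def lagrangian(x, v):
    return 0.5 * m * v**2 - 0.5 * k * x**2

def partial(f, arg_index, point, h=1e-6):
    """Central-difference partial derivative of f with respect to one argument."""
    hi = list(point); hi[arg_index] += h
    lo = list(point); lo[arg_index] -= h
    return (f(*hi) - f(*lo)) / (2 * h)

x, v = 0.3, 1.5
# Euler-Lagrange: d/dt (dL/dv) = dL/dx.  Here dL/dv = m*v (the momentum) and
# dL/dx = -k*x, so the equation of motion is m*a = -k*x: Hooke's law.
momentum = partial(lagrangian, 1, (x, v))  # ≈ m*v = 3.0
force = partial(lagrangian, 0, (x, v))     # ≈ -k*x = -2.4
print(momentum, force)
```

    The same recipe — write down the Lagrangian, then turn the crank of the Euler–Lagrange equations — is what physicists apply to the Standard Model Lagrangian, just with vastly more terms.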


    Section 1

    These three lines in the Standard Model are ultraspecific to the gluon, the boson that carries the strong force. Gluons come in eight types, interact among themselves and have what’s called a color charge.


    Section 2

    Almost half of this equation is dedicated to explaining interactions between bosons, particularly W and Z bosons.

    Bosons are force-carrying particles, and there are four species of bosons that interact with other particles using three fundamental forces. Photons carry electromagnetism, gluons carry the strong force and W and Z bosons carry the weak force. The most recently discovered boson, the Higgs boson, is a bit different; its interactions appear in the next part of the equation.


    Section 3

    This part of the equation describes how elementary matter particles interact with the weak force. According to this formulation, matter particles come in three generations, each with different masses. The weak force helps massive matter particles decay into less massive matter particles.

    This section also includes basic interactions with the Higgs field, from which some elementary particles receive their mass.

    Intriguingly, this part of the equation makes an assumption that contradicts discoveries made by physicists in recent years. It incorrectly assumes that particles called neutrinos have no mass.


    Section 4

    In quantum mechanics, there is no single path or trajectory a particle can take, which means that sometimes redundancies appear in this type of mathematical formulation. To clean up these redundancies, theorists use virtual particles they call ghosts.

    This part of the equation describes how matter particles interact with Higgs ghosts, virtual artifacts from the Higgs field.


    Section 5

    This last part of the equation includes more ghosts. These ones are called Faddeev-Popov ghosts, and they cancel out redundancies that occur in interactions through the weak force.


    Note: Thomas Gutierrez, an assistant professor of Physics at California Polytechnic State University, transcribed the Standard Model Lagrangian for the web. He derived it from Diagrammatica, a theoretical physics reference written by Nobel Laureate Martinus Veltman. In Gutierrez’s dissemination of the transcript, he noted a sign error he made somewhere in the equation. Good luck finding it!

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 2:55 pm on July 28, 2016 Permalink | Reply
    Tags: Physics, Solar cells research, U Illinois Chicago

    From U Illinois Chicago: “Breakthrough solar cell captures CO2 and sunlight, produces burnable fuel” 

    U Illinois bloc

    University of Illinois


    July 28, 2016
    Bill Burton

    Simulated sunlight powers a solar cell that converts atmospheric carbon dioxide directly into syngas.

    Researchers at the University of Illinois at Chicago have engineered a potentially game-changing solar cell that cheaply and efficiently converts atmospheric carbon dioxide directly into usable hydrocarbon fuel, using only sunlight for energy.

    The finding is reported in the July 29 issue of Science and was funded by the National Science Foundation and the U.S. Department of Energy. A provisional patent application has been filed.

    Unlike conventional solar cells, which convert sunlight into electricity that must be stored in heavy batteries, the new device essentially does the work of plants, converting atmospheric carbon dioxide into fuel, solving two crucial problems at once. A solar farm of such “artificial leaves” could remove significant amounts of carbon from the atmosphere and produce energy-dense fuel efficiently.

    “The new solar cell is not photovoltaic — it’s photosynthetic,” says Amin Salehi-Khojin, assistant professor of mechanical and industrial engineering at UIC and senior author on the study.

    “Instead of producing energy in an unsustainable one-way route from fossil fuels to greenhouse gas, we can now reverse the process and recycle atmospheric carbon into fuel using sunlight,” he said.

    While plants produce fuel in the form of sugar, the artificial leaf delivers syngas, or synthesis gas, a mixture of hydrogen gas and carbon monoxide. Syngas can be burned directly, or converted into diesel or other hydrocarbon fuels.

    The ability to turn CO2 into fuel at a cost comparable to a gallon of gasoline would render fossil fuels obsolete.

    Chemical reactions that convert CO2 into burnable forms of carbon are called reduction reactions, the opposite of oxidation or combustion. Engineers have been exploring different catalysts to drive CO2 reduction, but so far such reactions have been inefficient and rely on expensive precious metals such as silver, Salehi-Khojin said.

    “What we needed was a new family of chemicals with extraordinary properties,” he said.

    Amin Salehi-Khojin (left), UIC assistant professor of mechanical and industrial engineering, and postdoctoral researcher Mohammad Asadi with their breakthrough solar cell that converts atmospheric carbon dioxide directly into syngas.

    Salehi-Khojin and his coworkers focused on a family of nano-structured compounds called transition metal dichalcogenides — or TMDCs — as catalysts, pairing them with an unconventional ionic liquid as the electrolyte inside a two-compartment, three-electrode electrochemical cell.

    The best of several catalysts they studied turned out to be nanoflake tungsten diselenide.

    “The new catalyst is more active; more able to break carbon dioxide’s chemical bonds,” said UIC postdoctoral researcher Mohammad Asadi, first author on the Science paper.

    In fact, he said, the new catalyst is 1,000 times faster than noble-metal catalysts — and about 20 times cheaper.

    Other researchers have used TMDC catalysts to produce hydrogen by other means, but not by reduction of CO2, because the catalyst couldn’t survive the reaction.

    “The active sites of the catalyst get poisoned and oxidized,” Salehi-Khojin said. The breakthrough, he said, was to use an ionic fluid called ethyl-methyl-imidazolium tetrafluoroborate, mixed 50-50 with water.

    “The combination of water and the ionic liquid makes a co-catalyst that preserves the catalyst’s active sites under the harsh reduction reaction conditions,” Salehi-Khojin said.

    The UIC artificial leaf consists of two silicon triple-junction photovoltaic cells of 18 square centimeters to harvest light; the tungsten diselenide and ionic liquid co-catalyst system on the cathode side; and cobalt oxide in potassium phosphate electrolyte on the anode side.

    When light of 100 watts per square meter – about the average intensity reaching the Earth’s surface – energizes the cell, hydrogen and carbon monoxide gas bubble up from the cathode, while free oxygen and hydrogen ions are produced at the anode.

    “The hydrogen ions diffuse through a membrane to the cathode side, to participate in the carbon dioxide reduction reaction,” said Asadi.
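    The figures quoted above are enough for a quick sanity check of the light power driving the device. A back-of-the-envelope sketch in Python, assuming the 18 square centimeters refers to the area of each of the two cells (the article’s phrasing is ambiguous on that point):

```python
# Light power into the UIC artificial leaf, from figures quoted in the article.
intensity_w_per_m2 = 100.0   # simulated sunlight, ~average intensity at Earth's surface
cell_area_m2 = 18e-4         # 18 cm^2 per cell, converted to m^2
n_cells = 2                  # two triple-junction photovoltaic cells

input_power_w = intensity_w_per_m2 * cell_area_m2 * n_cells
print(input_power_w)  # ≈ 0.36 W of light power available to drive CO2 reduction
```

    Well under a watt of light, in other words — which is why claims about such devices hinge on conversion efficiency rather than raw power.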

    The technology should be adaptable not only to large-scale use, like solar farms, but also to small-scale applications, Salehi-Khojin said. In the future, he said, it may prove useful on Mars, whose atmosphere is mostly carbon dioxide, if the planet is also found to have water.

    “This work has benefitted from the significant history of NSF support for basic research that feeds directly into valuable technologies and engineering achievements,” said NSF program director Robert McCabe.

    “The results nicely meld experimental and computational studies to obtain new insight into the unique electronic properties of transition metal dichalcogenides,” McCabe said. “The research team has combined this mechanistic insight with some clever electrochemical engineering to make significant progress in one of the grand-challenge areas of catalysis as related to energy conversion and the environment.”

    “Nanostructured transition metal dichalcogenide electrocatalysts for CO2 reduction in ionic liquid” is available online at http://www.eurekalert.org/jrnls/sci/ or by contacting scipak@aaas.org.

    Co-authors with Asadi and Salehi-Khojin are Kibum Kim, Aditya Venkata Addepalli, Pedram Abbasi, Poya Yasaei, Amirhossein Behranginia, Bijandra Kumar and Jeremiah Abiade of UIC’s mechanical and industrial engineering department, who performed the electrochemical experiments and prepared the catalyst under NSF contract CBET-1512647; Robert F. Klie and Patrick Phillips of UIC’s physics department, who performed electron microscopy and spectroscopy experiments; Larry A. Curtiss, Cong Liu and Peter Zapol of Argonne National Laboratory, who did Density Functional Theory calculations under DOE contract DE-ACO206CH11357; Richard Haasch of the University of Illinois at Urbana-Champaign, who did ultraviolet photoelectron spectroscopy; and José M. Cerrato of the University of New Mexico, who did elemental analysis.

    See the full article here.


    U Illinois campus

    The University of Illinois at Urbana-Champaign community of students, scholars, and alumni is changing the world.

    With our land-grant heritage as a foundation, we pioneer innovative research that tackles global problems and expands the human experience. Our transformative learning experiences, in and out of the classroom, are designed to produce alumni who desire to make a significant, societal impact.

  • richardmitnick 5:59 am on July 28, 2016 Permalink | Reply
    Tags: Physics, Quantum spin liquid

    From Science Alert: “Physicists observe brand-new state of matter in an unexpected material” 


    Science Alert

    27 JUL 2016

    Robert Couse-Baker/Flickr

    Back in April, the physics world freaked out when scientists confirmed that they’d made the first direct observation of a brand-new state of matter, known as quantum spin liquid.

    But now a team of physicists has just announced that they’ve observed quantum spin liquid state again… and this time in a material where it should be impossible.

    Let’s back up a second, because all this isn’t as confusing as it sounds.

    Spin in the quantum world doesn’t actually mean an electron is physically spinning. It refers to a type of intrinsic angular momentum that simply describes how an electron is behaving. In quantum computing we often simplify this by saying the spin state is down, up, or in superposition (both at the same time).
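    That up/down/superposition description can be made concrete in a few lines. A toy sketch — the two-amplitude representation is standard quantum mechanics, but the variable names are mine and nothing here comes from the article:

```python
import math

# A spin-1/2 state is a pair of complex amplitudes (a_up, a_down);
# the probability of measuring "up" is |a_up|^2.
up = (1 + 0j, 0 + 0j)            # definitely spin-up
down = (0 + 0j, 1 + 0j)          # definitely spin-down
s = 1 / math.sqrt(2)
superposition = (s + 0j, s + 0j) # equal superposition: up and down at once

def prob_up(state):
    a_up, _a_down = state
    return abs(a_up) ** 2

print(prob_up(up), prob_up(down), prob_up(superposition))
# 1.0, 0.0 and 0.5 (up to floating-point rounding)
```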

    Quantum spin liquid is a state of matter that, very simply, occurs when the spins of electrons continue to fluctuate in a fluid manner even at very low temperatures, when they should be frozen in place.

    It’s like atoms inside regular materials. When they’re in a fluid state, they’re moving freely. But when temperatures drop, they’ll freeze in place in a solid arrangement. That should happen with spin orientation in magnetic materials, but in quantum spin liquid state, it doesn’t.

    Even though it was predicted in 1973, the new state of matter was only observed for the first time this year, in a two-dimensional, graphene-like material.

    That discovery made a lot of sense, because the material fit our understanding of how spin liquid state arises.

    Basically, the criterion is that a material has to have anti-ferromagnetic – or antiparallel – interactions, which, as the name suggests, are the opposite of the ferromagnetic interactions found in materials such as iron and nickel.

    It means that if one electron has a ‘down’ spin, the one next to it has to have an ‘up’ spin, and so on.

    Anti-ferromagnetic materials on their own don’t necessarily enter the quantum spin liquid state, unless they also happen to have a triangular atomic arrangement, which makes a consistent antiparallel alignment impossible.

    So, just imagine three atoms at the corners of a triangle – they can never all be antiparallel to their neighbours at once, because as soon as one flips to oppose the atom on its right, it ends up aligned with the one on its left, which then has to flip, and so on. The spins will keep flipping their alignment even at absolute zero temperature – hence, quantum spin liquid state.
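    That frustration argument can be checked by brute force. A short sketch using the simplest possible stand-in — three Ising spins (each just +1 or −1) with anti-ferromagnetic coupling J on the three bonds of a triangle; this illustrates the geometry only, not the actual Hamiltonian of any material discussed here:

```python
from itertools import product

J = 1.0  # anti-ferromagnetic coupling strength (illustrative value)

# Energy of three spins on a triangle: E = J * (s1*s2 + s2*s3 + s3*s1).
# Satisfying all three bonds (every pair antiparallel) would give E = -3J.
energies = {}
for s1, s2, s3 in product([+1, -1], repeat=3):
    energies[(s1, s2, s3)] = J * (s1 * s2 + s2 * s3 + s3 * s1)

ground = min(energies.values())
ground_states = [s for s, e in energies.items() if e == ground]
print(ground)              # -1.0, not -3.0: one bond is always frustrated
print(len(ground_states))  # 6 equally good arrangements, so the spins keep flipping
```

    No configuration reaches the fully satisfied energy of −3J, and the ground state is six-fold degenerate — exactly the kind of indecision that keeps the spins fluctuating.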

    But the new research suggests that those criteria aren’t quite right, because a German-led team was able to observe the new state of matter occurring in a material that doesn’t fit that profile.

    The material in question is a monocrystal of calcium chromium oxide (Ca₁₀Cr₇O₂₈).

    Calcium-chromium oxide is made up of what are known as Kagomé lattices – named after the pattern of triangles and hexagons woven in Japanese baskets.

    This figure shows the Kagome lattice, a two-dimensional periodic geometric structure. All triangles are equilateral. The three Bravais sub-lattices are coloured in red, green, and blue, and labelled 1, 2, 3, respectively. Credit: WilliamSix

    Basically that means the material has a complex mix of anti-ferromagnetic interactions, but also much stronger ferromagnetic interactions, which, according to conventional understanding, should prevent quantum spin liquid behaviour.

    But through a range of scattering and spectrometry experiments in Germany, France, England, Switzerland, and the US, the team was able to show that this wasn’t the case – quantum spin liquid state was happening even at temperatures as low as 20 millikelvin (around –273 degrees Celsius).

    So what’s going on here? Fortunately, the team has already come up with a hypothesis to explain why this material could behave like a quantum spin liquid without breaking our conventional understanding of the state of matter.

    Using numerical simulations, they’ve shown that competition is the key to the strange behaviour – different magnetic interactions in the materials are competing with each other, and keeping the spins flip-flopping around.

    You can see that happening in the illustration below, which shows the competing interactions on each atom (the grey and black balls). The green and red sticks represent ferromagnetic interactions, while the blue sticks represent anti-ferromagnetic interactions, which are forcing the spins to keep changing.


    “The work expands our understanding of magnetic materials, and also shows us that there are potentially far more candidates for spin liquids than expected,” said team member Bella Lake.

    The research has been published in Nature Physics, and now needs to be verified by other teams before we say for sure that quantum spin liquid state can exist in these new types of materials.

    But it’s a pretty exciting study that hugely widens the potential pool of materials that we could use in future to build quantum computers. We can’t wait to find out more.

    See the full article here.


  • richardmitnick 11:31 am on July 26, 2016 Permalink | Reply
    Tags: From dark gravity to phantom energy: what’s driving the expansion of the universe?, Physics

    From COSMOS: “From dark gravity to phantom energy: what’s driving the expansion of the universe?”

    Cosmos Magazine bloc


    There is something strange happening in the local universe, with galaxies moving away from each other faster than expected. What is driving this extra expansion, and what does it mean for the cosmos?

    A diagram representing the evolution of the universe, from the Big Bang to the present day. The red arrow marks the flow of time. New research suggests it’s expanding even faster than shown here. NASA/GSFC

    To explore this, let’s start with the observations.

    The rate of cosmic expansion is encapsulated in the “Hubble constant”, although don’t let the name fool you, as it’s not a constant and changes as the universe expands.

    To determine this constant, astronomers must relate the distances to galaxies to the velocities at which they’re travelling away from us. But measuring astronomical distances has always proven difficult. This is because we lack convenient signposts, known as standard candles and rulers, to chart the heavens.

    So astronomers have built up cosmic distances through a series of steps, using overlapping methods to span the heavens. But each step in this cosmological distance ladder has its own quirks and uncertainties, and extraordinary effort over many decades has been expended to calibrate the various methods.

    A new paper has pushed this calibration even harder, using a number of methods to tie down the Hubble constant to an accuracy of 2.4% within a few hundred million light years (which is local by cosmic standards).

    We can also determine the universal expansion from observations of the cosmic microwave background, which is the radiation leftover from the Big Bang.

    Cosmic Microwave Background per ESA/Planck

    Unlike local observations, this reveals the global expansion of the universe. And this is where the problems begin, as this global expansion is 9% slower than that seen in the local universe. In both measurements, the astronomers have worked hard to reduce the uncertainties, and so are confident this difference is valid.
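    To put rough numbers on that 9% tension: the values below are representative 2016 measurements in km/s/Mpc — they are my assumed figures, not quoted in the article itself.

```python
# Assumed representative values for the Hubble constant, in km/s/Mpc.
H0_local = 73.2   # local distance-ladder measurement (assumed figure)
H0_global = 66.9  # global value inferred from the cosmic microwave background (assumed)

discrepancy = (H0_local - H0_global) / H0_local
print(f"Global expansion is {discrepancy:.0%} slower than the local measurement.")
# prints "Global expansion is 9% slower than the local measurement."
```

    A few km/s/Mpc sounds small, but it is several times larger than the stated uncertainties of either measurement — which is what makes the tension interesting.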

    So what can explain this tension in cosmic measurement? Here are a few of the contenders.

    Cosmic contenders
    Dark matter

    The first potential culprit is dark matter, the dominant mass in the universe. We know it is not smoothly spread through space, so perhaps the lumps and bumps, like the galaxies and clusters of galaxies, are exerting less gravitational pull in the local universe.

    Perhaps we are in a cosmic void, a region whose density is below the universal average.

    If this were the case, we would have to be inhabiting a strange corner of the universe, sitting at the centre of an immense emptiness, very unlike anything expected in our cosmological ideas.

    Dark energy

    And then there is dark energy, the dominant energy in the universe. This component is responsible for accelerating the cosmic expansion, but is assumed to have a very simple form, eternal and unchanging over all of history.

    But what if dark energy is dynamic and evolving, changing its properties as the universe expands? If it changed quite recently (in cosmic terms), the additional expansion could be imprinted on the local universe, but would not yet have affected the global expansion.

    If this is the case, the universe has something to worry about, as this new form of dark energy would be a “phantom”, driving universal expansion faster and faster into a “big rip”, which is more dramatic than it sounds.

    Dark radiation

    Another potential solution is “dark radiation”, which consists of hyper-fast particles that zipped around in the early universe.

    While there is no single definition on what constitutes dark radiation, a favoured candidate is a new member of the neutrino family, affectionately known as sterile neutrinos.

    Dark radiation remains theoretical, and there is little observational evidence for its existence. But if it had been present in the early universe, it would have influenced the early expansion of the universe, which would still be imprinted on the global value of the Hubble constant, but would now be washed out of the local value.

    Dark gravity

    The potential solutions so far have considered modifying the properties of components in the universe, but there is the more drastic alternative: dark gravity.

    This suggests that we don’t fully understand the fundamental nature of the universe, and that gravity does not follow the rules laid out by Albert Einstein in his general theory of relativity.

    Such theories of modified gravity have existed for a long time, and come in many forms, and it is not clear how we deduce the impact of such gravity on the universal expansion.

    Dark speculations

    So there are several alternatives that could potentially explain the discrepancy between the local and global measurements of the Hubble constant. Which one is correct?

    At the moment, the observations are rather raw and do not discriminate between the possibilities. And so we enter the realm of theoretical speculation, where ideas are tried and discarded until viable explanations are discovered.

    At the same time, astronomers will seek more data, and will continue to tie down calibrations and methods. This brings us to our final possibility.

    No observations are perfect, and much of science is about understanding the uncertainties of measurements. Scientists can generally wrangle random errors and understand how uncertainties in measurement impact uncertainties in results.

    But there is another uncertainty: the systematic error, which can strike fear into a researcher. Instead of scattering results, systematic errors shift all results one way or another.

    Systematic errors can also influence astronomical distance measures. And if they propagate through the distance ladder, they could potentially shift the local measurement of the Hubble constant away from the global value.

    With new data and methods, this tension may evaporate. Some astronomers are already suggesting that this is a “more reasonable explanation”.

    See the full article here.


  • richardmitnick 10:14 am on July 26, 2016 Permalink | Reply
    Tags: Physics

    From Physics Today: “High-energy lab has high-energy director” 

    Physics Today bloc

    Physics Today

    21 July 2016
    Toni Feder

    CERN director general Fabiola Gianotti looks at what lies ahead for particle physics.

    Fabiola Gianotti in December 2015, just before she became CERN’s director general. Credit: CERN

    Fabiola Gianotti shot to prominence on 4 July 2012, with the announcement of the discovery of the Higgs boson. At the time, she was the spokesperson of ATLAS, which along with the Compact Muon Solenoid (CMS) experiment spotted the Higgs at the Large Hadron Collider (LHC) at CERN.

    CERN ATLAS Higgs Event; CERN/ATLAS detector

    CERN CMS Higgs Event; CERN/CMS Detector

    In the excitement over the Higgs discovery, Gianotti was on the cover of Time. She was hailed as among the most influential and the most inspirational women of our time. She was listed among the “leading global thinkers of 2013” by Foreign Policy magazine.

    “I am not very comfortable in the limelight,” says Gianotti. “Particle physics is a truly collaborative field. The discovery of the Higgs boson is the result of the work of thousands of physicists over more than 20 years.”

    Gianotti first went to CERN in 1987 as a graduate student at the University of Milan. She has been there ever since. And she seems comfortable at the helm, a job she has held since the beginning of this year.

    “The main challenge is to cope with so many different aspects, and switching my brain instantly from one problem to another one,” she says. “There are many challenges—human challenges, scientific challenges, technological challenges, budget challenges. But the challenges are interesting and engaging.”

    As of this summer, the LHC is in the middle of its second run, known as Run 2. In June the collider reached a record luminosity of 10³⁴ cm⁻²s⁻¹. It produces proton–proton collisions at an energy of 13 TeV. A further push to the design energy of 14 TeV may be made later in Run 2 or in Run 3, which is planned for 2021–23. An upgrade following the third run will increase the LHC’s luminosity by an order of magnitude.

    Physics Today’s Toni Feder caught up with Gianotti in June, about six months into her five-year appointment in CERN’s top job.

    PT: Last fall the ATLAS and CMS experiments both reported hints of a signal at 750 GeV. What would the implications be of finding a particle at that energy?

    GIANOTTI: At the moment, we don’t know if what the experiments observed last year is the first hint of a signal or just a fluctuation. But if the bump turns into a signal, then the implications are extraordinary. Its presumed features would not be something we can classify within the best-known scenarios for physics beyond the standard model. So it would be something unexpected, and for researchers there is nothing more exciting than a surprise.

    The experiments are analyzing the data from this year’s run and will release results in the coming weeks. We can expect them on the time scale of ICHEP in Chicago at the beginning of August. [ICHEP is the International Conference on High Energy Physics.]

    PT: The LHC is up to nearly the originally planned collision energy. The next step is to increase the luminosity. How will that be done?

    GIANOTTI: To increase the luminosity, we will have to replace components of the accelerator—for example, the magnets sitting on each side of the ATLAS and CMS collision regions. These are quadrupoles that squeeze the beams and therefore increase the interaction probability. We will replace them with higher-field, larger-aperture magnets. There are also other things we have to do to upgrade the accelerator. The present schedule for the installation of the hardware components is at the end of Run 3—that is, during the 2024–26 shutdown. The operation of the high-luminosity LHC will start after this installation, so on the time scale of 2027.

    The high-luminosity LHC will allow the experiments to collect 10 times as much data. Improving the precision will be extremely important, in particular for the interaction strength—so-called couplings—of the Higgs boson with other particles. New physics can alter these couplings from the standard-model expectation. Hence the Higgs boson is a door to new physics.

    The high-luminosity LHC will also increase the discovery potential for new physics: Experiments will be able to detect particles with masses 20% to 30% larger than before the upgrade.

    And third, if new physics is discovered at the LHC in Run 2 or Run 3, the high-luminosity LHC will allow the first precise measurements of the new physics to be performed with a very well-known accelerator and very well-known experiments. So it would provide powerful constraints on the underlying theory.

    PT: What are some of the activities at CERN aside from the LHC?

    GIANOTTI: I have spent my scientific career working on high-energy colliders, which are very close to my heart. However, the open questions today in particle physics are difficult and crucial, and there is no single way to attack them. We can’t say today that a high-energy collider is the way to go and let’s forget about other approaches. Or underground experiments are the way to go. Or neutrino experiments are the way to go. There is no exclusive way. I think we have to be very inclusive, and we have to address the outstanding questions with all the approaches that our discipline has developed over the decades.

    In this vein, at CERN we have a scientific diversity program. It includes the study of antimatter through a dedicated facility, the Antiproton Decelerator; precise measurements of rare decays; and many other projects. We also participate in accelerator-based neutrino programs, mainly in the US. And we are doing R&D and design studies for the future high-energy colliders: an electron–positron collider in the multi-TeV region [the Compact Linear Collider] and future circular colliders.

    PT: Japan is the most likely host for a future International Linear Collider, an electron–positron collider (see Physics Today, March 2013, page 23). What’s your sense about whether the ILC will go ahead and whether it’s the best next step for high-energy physics?

    GIANOTTI: Japan is consulting with international partners to see if a global collaboration can be built. It’s a difficult decision to be taken, and it has to be taken by the worldwide community.

    Europe will produce a new road map, the European Strategy for Particle Physics, on the time scale of 2019–20. That will be a good opportunity to think about the future of the discipline, based also on the results from the LHC Run 2 and other facilities in the world.

    PT: How is CERN affected by tight financial situations in member countries?

    GIANOTTI: CERN has been running for many years with a constant budget, with constant revenues from member states, at a level of CHF 1.2 billion [$1.2 billion] per year. We strive to squeeze the operation of the most powerful accelerator in the world, its upgrade, and other interesting projects within this budget.

    PT: Will Brexit affect CERN?

    GIANOTTI: We are not directly affected because CERN membership is not related to being members of the European Union.

    PT: You have said you have four areas that you want to maintain and expand at CERN: science, technology and innovation, education, and peaceful collaboration. Please elaborate.

    GIANOTTI: Science first. We do research in fundamental physics, with the aim of understanding the elementary particles and their interactions, which also gives us very important indications about the structure and evolution of the universe.

    In order to accomplish these scientific goals, we have to develop cutting-edge technologies in many domains, from superconducting magnets to vacuum technology, cryogenics, electronics, computing, et cetera.

    These technologies are transferred to society and find applications in many other sectors—for example, in the medical fields with imaging and cancer therapy, but also solar panels, not to mention the World Wide Web. Fundamental research requires very sophisticated instruments and is a driver of innovation.

    Another component of our mission is education and training. The CERN population is very young: The age distribution of the 12 000 citizens from all over the world working on our experiments peaks at 27 years, and almost 50% are below 35. About half of our PhD students remain in academia or research, and about half go to industry. It is our duty to prepare them to be tomorrow’s scientists or tomorrow’s employees of industry—and in any case, good citizens.

    How do we prepare them to be good citizens? CERN was created in the early 1950s to promote fundamental research and to foster peaceful collaboration among European countries after the war. Today we have scientists of more than 110 nationalities, some from countries that are in conflict with each other, some from countries that do not even recognize each other’s right to exist. And yet they work together in a peaceful way, animated by the same passion for knowledge.

    PT: You are the first woman to head CERN. What do you see as the significance of this?

    GIANOTTI: The CERN director general should be appointed on the basis of his or her capabilities to run the laboratory and not on the basis of gender arguments. This being said, I hope that my being a woman can be useful as an encouragement to girls and young women who would like to do fundamental research but might hesitate. It shows them they have similar opportunities as their male colleagues.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    “Our mission

    The mission of Physics Today is to be a unifying influence for the diverse areas of physics and the physics-related sciences.

    It does that in three ways:

    • by providing authoritative, engaging coverage of physical science research and its applications without regard to disciplinary boundaries;
    • by providing authoritative, engaging coverage of the often complex interactions of the physical sciences with each other and with other spheres of human endeavor; and
    • by providing a forum for the exchange of ideas within the scientific community.”

  • richardmitnick 9:41 am on July 26, 2016 Permalink | Reply
    Tags: , Asymptotically Safe Gravity, , Causal Dynamical Triangulation, Dimensional reduction, , , Physics   

    From Ethan Siegel: “Dimensional Reduction: The Key To Physics’ Greatest Mystery?” 

    From Ethan Siegel

    Jul 26, 2016
    Sabine Hossenfelder

    A visualization of a 3-torus model of space, where lines or sheets in series could reproduce a larger-dimensional structure. Image credit: Bryan Brandenburg, under c.c.a.-s.a.-3.0.

    What if the Universe – and fundamentally, space itself – were like a pile of laundry?

    It doesn’t sound like a sober thought, but it’s got math behind it, so physicists think there might be something to it. Indeed the math has piled up lately. They call it “dimensional reduction,” the idea that space on short distances has fewer than three dimensions – and it might help physicists to quantize gravity.

We’ve gotten used to space with additional dimensions, rolled up so small (or compactified) that we can’t observe them. But how do you get rid of dimensions instead? To understand how it works we first have to clarify what we mean by “dimension.”

    A 3-D object like a pipe will have a Hausdorff dimension of 1, as the lines only have one dimension to spread out as long as they’d like, which is also seen in the reduction to a line as you zoom out. Image credit: Alex Dunkel (Maky) of Wikipedia, based on Brian Greene’s The Elegant Universe, under a c.c.a.-s.a.-4.0 license.

    We normally think about dimensions of space by picturing a series of lines which spread from a point. How quickly the lines dilute with the distance from the point tells us the “Hausdorff dimension” of a space. The faster the lines diverge from each other with distance, the larger the Hausdorff dimension. If you speak through a pipe, for example, sound waves spread less and your voice carries farther. The pipe hence has a lower Hausdorff dimension than our normal 3-dimensional office cubicles. It’s the Hausdorff dimension that we colloquially refer to as just dimension.
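The line-spreading picture can be made concrete numerically. In this small sketch (my own illustration, not from the article), the number of lattice points within distance r of the origin grows as N(r) ~ r^d, so the slope of log N against log r recovers the Hausdorff dimension of the space:

```python
import math
from itertools import product

def points_in_ball(r, dim):
    """Count lattice points within distance r of the origin in dim dimensions."""
    rng = range(-r, r + 1)
    return sum(1 for p in product(rng, repeat=dim)
               if sum(x * x for x in p) <= r * r)

def hausdorff_estimate(dim, r1=10, r2=40):
    """N(r) ~ r**d, so a two-point log-log slope estimates d."""
    n1, n2 = points_in_ball(r1, dim), points_in_ball(r2, dim)
    return (math.log(n2) - math.log(n1)) / (math.log(r2) - math.log(r1))

print(round(hausdorff_estimate(2)))  # → 2: a plane
print(round(hausdorff_estimate(3)))  # → 3: ordinary space
```

A pipe-like space, where points can only pile up along one direction, would give a slope near 1, matching the caption above.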

    For dimensional reduction, however, it is not the Hausdorff dimension which is relevant, but instead the “spectral dimension,” which is a slightly different concept. We can calculate it by first getting rid of the “time” in “space-time” and making it into space (period). We then place a random walker at one point and measure the probability that it returns to the same point during its walk. The smaller the average return probability, the higher the probability the walker gets lost, and the higher the number of spectral dimensions.
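This definition can be checked on an ordinary lattice. The sketch below (my own illustration, not part of the article) uses the fact that for a simple random walk the return probability falls off as P(t) ~ t^(-d_s/2), so a log-log slope recovers the spectral dimension; on a one-dimensional lattice the exact binomial count gives d_s close to 1:

```python
import math

def return_prob_1d(t):
    """Exact probability that a simple 1-D random walk is back at the
    origin after t steps (zero for odd t)."""
    if t % 2:
        return 0.0
    n = t // 2
    return math.comb(t, n) / 4**n  # C(2n, n) / 2**(2n)

def spectral_dimension(t1, t2):
    """Estimate d_s from P(t) ~ t**(-d_s / 2) via a two-point log-log slope."""
    p1, p2 = return_prob_1d(t1), return_prob_1d(t2)
    slope = (math.log(p2) - math.log(p1)) / (math.log(t2) - math.log(t1))
    return -2 * slope

print(round(spectral_dimension(100, 10000), 2))  # ≈ 1.0 for the 1-D lattice
```

The same kind of fit, applied to walks on a simulated quantum geometry instead of a fixed lattice, is in essence how the numbers quoted later in the article were extracted.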

    Isotropic random walk on the euclidean lattice Z^3. This picture shows three different walks after 10 000 unit steps, all three starting from the origin. Image credit: Zweistein, under c.c.a.-s.a.-3.0.

    Normally, for a non-quantum space, both notions of dimension are identical. However, add quantum mechanics and the spectral dimension at short distances goes down from four to two. The return probability for short walks becomes larger than expected, and the walker is less likely to get lost – this is what physicists mean by “dimensional reduction.”

    The spectral dimension is not necessarily an integer; it can take on any value. This value starts at 4 when quantum effects can be neglected, and decreases when the walker’s sensitivity to quantum effects at shortest distances increases. Physicists therefore also like to say that the spectral dimension “runs,” meaning its value depends on the resolution at which space-time is probed.

    Dimensional reduction is an attractive idea because quantizing gravity is considerably easier in lower dimensions, where the infinities that plague traditional attempts to quantize gravity go away. A theory with a reduced number of dimensions at the shortest distances therefore has a much higher chance to remain consistent, and therefore to provide a meaningful theory for the quantum nature of space and time. Not so surprisingly, among physicists, dimensional reduction has received quite some attention lately.

    Cross section of the quintic Calabi–Yau manifold. Unlike taking a cross section, dimensional reduction is about having reduced degrees of freedom when it comes to the probability of returning to your starting point in a finite number of steps. Public domain.

This strange property of quantum-spaces was first found in Causal Dynamical Triangulation, an approach to quantum gravity that relies on approximating curved spaces by triangular patches. In this work, the researchers did a numerical simulation of a random walk in such a triangulated quantum-space, and found that the spectral dimension goes down from four to two. Or actually, to 1.80 ± 0.25, if you want to know precisely.

    Instead of doing numerical simulations, it is also possible to study the spectral dimension mathematically, which has since been done in various other approaches. For this, physicists exploit that the behavior of the random walk is governed by a differential equation – the diffusion equation (a.k.a., the heat equation) – which depends on the curvature of space. In quantum gravity, spatial curvature has quantum fluctuations, so instead it’s the average curvature value which enters the diffusion equation. From the diffusion equation, one then calculates the return probability for the random walk.
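In flat d-dimensional space the diffusion equation can be solved in closed form, which makes the definition concrete (a standard textbook result, spelled out here for reference rather than taken from the article):

```latex
P(\sigma) = \frac{1}{(4\pi\sigma)^{d/2}},
\qquad
d_s \equiv -2\,\frac{\partial \ln P(\sigma)}{\partial \ln \sigma} = d,
```

where \sigma is the diffusion time of the random walk and P(\sigma) its return probability. In a quantum space-time, the fluctuation-averaged curvature entering the diffusion equation makes d_s depend on \sigma, which is precisely the "running" with resolution described earlier.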

    Through this method, physicists have inferred the spectral dimension also in Asymptotically Safe Gravity, an approach to quantum gravity which relies on the resolution-dependence (the “running”) of quantum field theories. And they found the same drop as in Causal Dynamical Triangulations: from four to two spectral dimensions.

    A representation of a spin network in Loop quantum gravity. Image credit: Markus Poessel (Mapos) of Wikimedia Commons, under c.c.a.-s.a.-3.0.

Another indication that dimensional reduction might be important comes from Loop Quantum Gravity, where the scaling of the area operator with length changes at short distances. In this case, it is somewhat questionable whether the notion of curvature makes sense at all on short distances. Ignoring this philosophical conundrum, one can construct the diffusion equation anyway, and one finds that the spectral dimension – surprise – drops from four to two.

    And finally, there is Horava-Lifshitz gravity, yet another modification of gravity which some believe helps with quantizing it. Here too, dimensional reduction, from four to two, has been found.

    It is difficult to visualize what is happening with the dimensionality of space if it goes down continuously, rather than in discrete steps as in the example with the laundry pile. Perhaps a good way to picture it, as Calcagni, Eichhorn and Saueressig suggest, is to think of the quantum fluctuations of space-time as hindering a particle’s random walk, thereby slowing it down. It wouldn’t have to be that way, though. Quantum fluctuations could have also kicked the particle around wildly, thereby increasing the spectral dimension rather than decreasing it. But that’s not what the math tells us.

    Real gravitational effects occur in spacetime, not just space, and must propagate at the speed of light through space and time. Image credit: SLAC National Accelerator Laboratory.

    One shouldn’t take this picture too seriously though, because we’re talking about a random walk in space, not space-time, and so it’s not a real physical process. Turning time into space might seem strange, but it is a common mathematical simplification which is often used for calculations in quantum theory. Still, it makes it difficult to interpret what is happening physically.

    I find it intriguing that several different approaches to quantum gravity share a behavior like this. Maybe it is a general property of quantum space-time? But then, there are many different types of random walks, and while these different approaches to quantum gravity share a similar scaling behavior for the spectral dimension, they differ in the type of random walk that produces this scaling. So maybe the similarities are only superficial.

And, of course, this idea has no observational evidence speaking for it. Maybe it never will. But one day, I’m sure, all the math will click into place and everything will make perfect sense. Meanwhile, have another look at that pile of laundry.

    See the full article here .


    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

  • richardmitnick 3:36 pm on July 25, 2016 Permalink | Reply
Tags: , , , , Physics, T2HK, VA Tech

    From phys.org: “CP violation or new physics?” 


    July 25, 2016
    Lisa Zyga

    This is the “South Pillar” region of the star-forming region called the Carina Nebula. Like cracking open a watermelon and finding its seeds, the infrared telescope “busted open” this murky cloud to reveal star embryos tucked inside finger-like pillars of thick dust. Credit: NASA/Spitzer

Over the past few years, multiple neutrino experiments have detected hints for leptonic charge-parity (CP) violation—a finding that could help explain why the universe is made of matter and not antimatter. So far, matter-antimatter asymmetry cannot be explained by any physics theory and is one of the biggest unsolved problems in cosmology.

    But now in a new study published in Physical Review Letters, physicists David V. Forero and Patrick Huber at Virginia Tech have proposed that the same hints could instead indicate CP-conserving “new physics,” and current experiments would have no way to tell the difference.

    Both possibilities—CP violation or new physics—would have a major impact on the scientific understanding of some of the biggest questions in cosmology. Currently, one of the most pressing problems is the search for new physics, or physics beyond the Standard Model, which is a theory that scientists know is incomplete but aren’t sure exactly how to improve. New physics could potentially explain several phenomena that the Standard Model cannot, including the matter-antimatter asymmetry problem, as well as dark matter, dark energy, and gravity.

    As the scientists show in the new study, determining whether the recent hints indicate CP violation or new physics will be very challenging. The main goal of the study was to “quantify the level of confusion” between the two possibilities. The physicists’ simulations and analysis revealed that both CP violation and new physics have distributions centered at the exact same value for what the neutrino experiments measure—something called the Dirac CP phase. This identical preference makes it impossible for current neutrino experiments to distinguish between the two cases.

    “Our results show that establishing leptonic CP violation will need exceptional care, and that new physics can in many ways lead to non-trivial confusion,” Huber told Phys.org.

    The good news is that new and future experiments may be capable of resolving the issue. One possible way to test the two proposals is to compare the measurements of the Dirac CP phase made by two slightly different experiments: DUNE (the Deep Underground Neutrino Experiment) at Fermilab in Batavia, Illinois; and T2HK (the Tokai to Hyper-Kamiokande project) at J-PARC in Tokai, Japan.


Proposed T2HK

“The trick is that the type of new physics we postulate in our paper manifests itself in the way in which neutrino oscillations are affected by the amount of earth matter through which the neutrino travels,” Huber said. “The more matter travelled through, the larger the effect of this type of new physics.”

    “Now, for DUNE, neutrinos would have to travel roughly 1300 km in the earth, whereas for T2HK they would travel only about 300 km. Thus one would find two different values for the Dirac CP phase in both cases, indicating a problem.”

    In order to be accurate, these experiments will require extremely high degrees of precision, which Huber emphasizes should not be overlooked.

    “Of course, the same result could arise if for some reason either experiment was not properly calibrated and thus precisely calibrating these experiments will be extraordinarily important—a very difficult task, which I believe is not quite getting the attention it should.”

    See the full article here .


    About Phys.org in 100 Words

Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quantcast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

  • richardmitnick 10:32 am on July 25, 2016 Permalink | Reply
    Tags: , , , Physics, Possible fifth force?,   

    From Don Lincoln of FNAL on livescience: “A Fifth Force: Fact or Fiction” 


    FNAL Icon

    FNAL Don Lincoln

    July 5, 2016

    Has a Hungarian lab really found evidence of a fifth force of nature? Credit: Jurik Peter / Shutterstock.com

Science and the internet have an uneasy relationship: Science tends to move forward through a careful and tedious evaluation of data and theory, and the process can take years to complete. In contrast, the internet community generally has the attention span of Dory, the absent-minded fish of Finding Nemo (and now Finding Dory) — a meme here, a celebrity picture there — oh, look … a funny cat video.

    Thus people who are interested in serious science should be extremely cautious when they read an online story that purports to be a paradigm-shifting scientific discovery. A recent example is one suggesting that a new force of nature might have been discovered. If true, that would mean that we have to rewrite the textbooks.

    A fifth force

    So what has been claimed?

In an article submitted on April 7, 2015, to the arXiv repository of physics papers, a group of Hungarian researchers reported on a study in which they focused an intense beam of protons (particles found in the center of atoms) on thin lithium targets. The collisions created excited nuclei of beryllium-8, which decayed into ordinary beryllium-8 and electron-positron pairs. (The positron is the antimatter equivalent of the electron.)

    The Standard Model is the collection of theories that describe the smallest experimentally observed particles of matter and the interactions between energy and matter. Credit: Karl Tate, LiveScience Infographic Artist

They claimed that their data could not be explained by known physical phenomena in the Standard Model, the reigning model governing particle physics. But, they purported, they could explain the data if a new particle existed with a mass of approximately 17 million electron volts, which is 32.7 times heavier than an electron and just shy of 2 percent of the mass of a proton. The particles that emerge at this energy range, which is relatively low by modern standards, have been well studied. And so it would be very surprising if a new particle were discovered in this energy regime.
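For the record, those ratios correspond to the best-fit mass the Hungarian group actually reported, about 16.7 million electron volts; the check below is my own arithmetic, added for illustration.

```python
# All masses in MeV (million electron volts); values are standard ones I
# have filled in, not taken from the article.
M_X = 16.7    # best-fit mass of the claimed boson
M_E = 0.511   # electron mass
M_P = 938.3   # proton mass

print(round(M_X / M_E, 1))        # 32.7 -> "32.7 times heavier than an electron"
print(round(100 * M_X / M_P, 1))  # 1.8  -> "just shy of 2 percent" of a proton
```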

    However, the measurement survived peer review and was published on Jan. 26, 2016, in the journal Physical Review Letters, which is one of the most prestigious physics journals in the world. In this publication, the researchers, and this research, cleared an impressive hurdle.

    Their measurement received little attention until a group of theoretical physicists from the University of California, Irvine (UCI), turned their attention to it. As theorists commonly do with a controversial physics measurement, the team compared it with the body of work that has been assembled over the last century or so, to see if the new data are consistent or inconsistent with the existing body of knowledge. In this case, they looked at about a dozen published studies.

    What they found is that though the measurement didn’t conflict with any past studies, it seemed to be something never before observed — and something that couldn’t be explained by the Standard Model.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    New theoretical framework

    To make sense of the Hungarian measurement, then, this group of UCI theorists invented a new theory.

    The theory invented by the Irvine group is really quite exotic. They start with the very reasonable premise that the possible new particle is something that is not described by existing theory. This makes sense because the possible new particle is very low mass and would have been discovered before if it were governed by known physics. If this were a new particle governed by new physics, perhaps a new force is involved. Since traditionally physicists speak of four known fundamental forces (gravity, electromagnetism and the strong and weak nuclear forces), this hypothetical new force has been dubbed “the fifth force.”

    Theories and discoveries of a fifth force have a checkered history, going back decades, with measurements and ideas arising and disappearing with new data. On the other hand, there are mysteries not explained by ordinary physics like, for example, dark matter. While dark matter has historically been modeled as a single form of a stable and massive particle that experiences gravity and none of the other known forces, there is no reason that dark matter couldn’t experience forces that ordinary matter doesn’t experience. After all, ordinary matter experiences forces that dark matter doesn’t, so the hypothesis isn’t so silly.

    There is no reason dark matter couldn’t experience forces that ordinary matter doesn’t experience. Here, in the galaxy cluster Abell 3827, dark matter was observed interacting with itself during a galaxy collision. Credit: ESO

    There are many ideas about forces that affect only dark matter and the term for this basic idea is called “complex dark matter.” One common idea is that there is a dark photon that interacts with a dark charge carried only by dark matter. This particle is a dark matter analog of the photon of ordinary matter that interacts with familiar electrical charge, with one exception: Some theories of complex dark matter imbue dark photons with mass, in stark contrast with ordinary photons.

    If dark photons exist, they can couple with ordinary matter (and ordinary photons) and decay into electron-positron pairs, which is what the Hungarian research group was investigating. Because dark photons don’t interact with ordinary electric charge, this coupling can only occur because of the vagaries of quantum mechanics. But if scientists started seeing an increase in electron-positron pairs, that might mean they were observing a dark photon.

    The Irvine group found a model that included a “protophobic” particle that was not ruled out by earlier measurements and would explain the Hungarian result. Particles that are “protophobic,” which literally means “fear of protons,” rarely or never interact with protons but can interact with neutrons (neutrophilic).

The particle proposed by the Irvine group experiences a fifth and unknown force, which has a range of about 12 femtometers, or about 12 times bigger than a proton. The particle is protophobic and neutrophilic. The proposed particle has a mass of 17 million electron volts and can decay into electron-positron pairs. In addition to explaining the Hungarian measurement, such a particle would help explain some discrepancies seen by other experiments. This last consequence adds some weight to the idea.
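The quoted range and mass are tied together: for a force carried by a massive boson, the range is roughly the particle's reduced Compton wavelength, hbar*c divided by its mass-energy. A quick back-of-the-envelope check (my arithmetic, not from the article):

```python
HBAR_C = 197.327   # MeV*fm, the standard conversion constant hbar*c
m_x = 17.0         # MeV, the proposed boson's mass-energy

yukawa_range_fm = HBAR_C / m_x   # reduced Compton wavelength sets the force range
print(round(yukawa_range_fm, 1))  # ≈ 11.6 fm, consistent with the quoted ~12 fm
```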

    Paradigm-shifting force?

    So this is the status.

    What is likely to be true? Obviously, data is king. Other experiments will need to confirm or refute the measurement. Nothing else really matters. But that will take a year or so and having some idea before then might be nice. The best way to estimate the likelihood the finding is real is to look at the reputations of the various researchers involved. This is clearly a shoddy way to do science, but it will help shade your expectations.

So let’s start with the Irvine group. Many of them (the senior ones, typically) are well-regarded and established members of the field, with substantive and solid papers in their past. The group includes a spectrum of ages, with both senior and junior members. In the interest of full disclosure, I know some of them personally and, indeed, two of them have read the theoretical portions of chapters of books I have written for the public to ensure that I didn’t say anything stupid. (By the way, they didn’t find any gaffes, but they certainly helped clarify certain points.) That certainly demonstrates my high regard for members of the Irvine group, but possibly taints my opinion. In my judgment, they almost certainly did a thorough and professional job of comparing their new model to existing data. They have found a small and unexplored region of possible theories that could exist.

    On the other hand, the theory is pretty speculative and highly improbable. This isn’t an indictment … all proposed theories could be labeled in this way. After all, the Standard Model, which governs particle physics, is nearly a half century old and has been thoroughly explored. In addition, ALL new theoretical ideas are speculative and improbable and almost all of them are wrong. This also isn’t an indictment. There are many ways to add possible modifications to existing theories to account for new phenomena. They can’t all be right. Sometimes none of the proposed ideas are right.

    However, we can conclude from the reputation of the group’s members that they have generated a new idea and have compared it to all relevant existing data. The fact that they released their model means that it survived their tests and thus it remains a credible, if improbable, possibility.

    What about the Hungarian group? I know none of them personally, but the article was published in Physical Review Letters — a chalk mark in the win column. However, the group has also published two previous papers in which comparable anomalies were observed, including a possible particle with a mass of 12 million electron volts and a second publication claiming the discovery of a particle with a mass of about 14 million electron volts. Both of these claims were subsequently falsified by other experiments.

Further, the Hungarian group has never satisfactorily disclosed what error was made that resulted in these erroneous claims. Another possible red flag is that the group rarely publishes data that doesn’t claim anomalies. That is improbable. In my own research career, most publications were confirmation of existing theories. Anomalies that persist are very, very rare.

So what’s the bottom line? Should you be excited about this new possible discovery? Well…sure…possible discoveries are always exciting. The Standard Model has stood the test of time for half a century, but there are unexplained mysteries and the scientific community is always looking for the discovery that points us in the direction of a new and improved theory. But what are the odds that this measurement and theory will lead to the scientific world accepting a new force with a range of 12 fm and with a particle that shuns protons? My sense is that this is a long shot. I am not so sanguine as to the chances of this outcome.

    Of course, this opinion is only that…an opinion, albeit an informed one. Other experiments will also be looking for dark photons because, even if the Hungarian measurement doesn’t stand up to scrutiny, there is still a real problem with dark matter. Many experiments looking for dark photons will explore the same parameter space (e.g. energy, mass and decay modes) in which the Hungarian researchers claim to have found an anomaly. We will soon (within a year) know if this anomaly is a discovery or just another bump in the data that temporarily excited the community, only to be discarded as better data is recorded. And, no matter the outcome, good and better science will be the eventual result.

    See the full article here .


  • richardmitnick 7:29 am on July 25, 2016 Permalink | Reply
    Tags: , ‘Tractor beams’ build atom-by-atom assembly in mid-air, , Physics   

    From COSMOS: ” ‘Tractor beams’ build atom-by-atom assembly in mid-air” 

    Cosmos Magazine bloc


    25 July 2016
    Cathal O’Connell

    Physicists have manipulated 50 individual atoms at once in a dramatic upscaling of a technique vital to quantum computing.

    Have you ever tried to catch a speck of dust between your fingers? That’s challenging enough, but what about catching a single atom?

    Controlling the position of individual atoms is vital for quantum computers, which use individual atoms as “qubits” – the quantum version of the “bits” of regular computers.

    Usually atom assembly is a painstaking process, and can only be done one at a time.

In a new paper uploaded to the arXiv (prior to peer review), physicists at Harvard, Caltech and MIT have teamed up to manipulate up to 50 individual rubidium atoms using an array of 100 optical tweezers.

    The technique works a bit like the tractor beam from Star Trek. The atoms float around in a cloud within a vacuum chamber, and the tweezers pluck them out of mid-air (or, perhaps we should say, out of mid-vacuum).

    The starship Enterprise using its tractor beam in an episode of Star Trek: The Next Generation. Quantum physicists have now done something similar at an atomic scale.Image credit: CBS via Getty Images

    The system then automatically arranges the atoms into a precise formation in less than half a second.

Optical tweezers are tightly focused beams of light able to hold microscopic particles, or even single atoms, in three dimensions. The technique works by focusing two laser beams onto the same spot.

    An atom caught in the crossbeam stops dead, like a deer in headlights, because it is attracted to the strong electric field right at the center of the beam.

    When the beam is moved, the atom is dragged with it.

    Usually optical tweezers can only control one atom at a time. Now a team of American researchers, led by Mikhail Lukin at Harvard University and Manuel Endres at the California Institute of Technology, have found a way to upscale the process to control 50 atoms at once.

    The advance hinges on the team’s ability to split their laser source into 50 separate beams, and then control each beam individually.

    The team starts off with a cloud of rubidium atoms cooled to less than a degree above absolute zero, floating around in a vacuum chamber. When the scientists switch on the optical tweezers array, they create a line of 50 atom traps within the cloud.

    Most, but not all, of the traps usually succeed in catching an atom, and to check which ones are successful the researchers snap a picture using a special camera that can detect how a single atom fluoresces when trapped.

Empty tweezers are simply switched off, while those that are holding atoms are manipulated to drag the atoms into a desired pattern. Then another picture confirms it.

    If the pattern does not match up with what has been programmed, the system can use any remaining empty tweezers to grab a few more atoms and bring them over.

    For 50 atoms, this whole process takes about 400 milliseconds. For smaller arrays, it takes even less time.
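The load-image-rearrange loop described above can be sketched in a few lines. This is a toy illustration of the logic only; the real apparatus steers laser beams, and the function names here are mine:

```python
import random

def load_traps(n_traps, fill_prob=0.6):
    """Each tweezer stochastically catches an atom from the cold cloud."""
    return [random.random() < fill_prob for _ in range(n_traps)]

def rearrange(occupied, n_target):
    """Switch off empty tweezers and drag loaded ones into a line of
    n_target atoms. Returns the trap indices used, or None if too few
    atoms were caught and the system must grab more and retry."""
    loaded = [i for i, full in enumerate(occupied) if full]
    if len(loaded) < n_target:
        return None
    return loaded[:n_target]

# Deterministic example: 75 of 100 traps loaded, target line of 50 atoms.
occupied = [True, True, True, False] * 25
pattern = rearrange(occupied, 50)
print(len(pattern))  # 50
```

In the experiment, the fluorescence picture plays the role of the `occupied` list, and a failed `rearrange` triggers another grab with the spare tweezers.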

So far the technique only works for making a single line of atoms stretching about a tenth of a millimetre across. But the team plans to scale up the process to make a two-dimensional array of optical tweezers.

    They write that the “robust creation of defect-free arrays of hundreds of atoms is feasible”.

    Another problem is how long the pattern can be held. At the moment, the limit is about 10 seconds. A quantum computer, meanwhile, would require holding times on the order of 100 seconds.

    The researchers expect that using a better vacuum, and an improved laser system, might get them at least to the one-minute mark.

    After that, they’ll need to think of other tricks so that their pattern is not gone in 60 seconds.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 7:45 am on July 23, 2016 Permalink | Reply
    Tags: Asimina Arvanitaki, Physics   

    From Quanta: Women in Science – “Mining Black Hole Collisions for New Physics” Asimina Arvanitaki 

    Quanta Magazine

    July 21, 2016
    Joshua Sokol

    Asimina Arvanitaki during a July visit to the CERN particle physics laboratory in Geneva, Switzerland. Samuel Rubio for Quanta Magazine

    When physicists announced in February that they had detected gravitational waves firsthand, the foundations of physics scarcely rattled. The signal exactly matched the expectations physicists had arrived at after a century of tinkering with Einstein’s theory of general relativity. “There is a question: Can you do fundamental physics with it? Can you do things beyond the standard model with it?” said Savas Dimopoulos, a theoretical physicist at Stanford University. “And most people think the answer to that is no.”

    Asimina Arvanitaki is not one of those people. A theoretical physicist at Ontario’s Perimeter Institute for Theoretical Physics, Arvanitaki has been dreaming up ways to use black holes to explore nature’s fundamental particles and forces since 2010, when she published a paper with Dimopoulos, her mentor from graduate school, and others. Together, they sketched out a “string axiverse,” a pantheon of as yet undiscovered, weakly interacting particles. Axions such as these have long been a favored candidate to explain dark matter and other mysteries.

    In the intervening years, Arvanitaki and her colleagues have developed the idea through successive papers. But February’s announcement marked a turning point, where it all started to seem possible to test these ideas. Studying gravitational waves from the newfound population of merging black holes would allow physicists to search for those axions, since the axions would bind to black holes in what Arvanitaki describes as a “black hole atom.”

    “When it came up, we were like, ‘Oh my god, we’re going to do it now, we’re going to look for this,’” she said. “It’s a whole different ball game if you actually have data.”

    That’s Arvanitaki’s knack: matching what she calls “well-motivated,” field-hopping theoretical ideas with the precise experiment that could probe them. “By thinking away from what people are used to thinking about, you see that there is low-hanging fruit that lie in the interfaces,” she said. At the end of April, she was named the Stavros Niarchos Foundation’s Aristarchus Chair at the Perimeter Institute, the first woman to hold a research chair there.

    It’s a long way to come for someone raised in the small Greek village of Koklas, where the graduating class at her high school — at which both of her parents taught — consisted of nine students. Quanta Magazine spoke with Arvanitaki about her plan to use black holes as particle detectors. An edited and condensed version of those discussions follows.

    QUANTA MAGAZINE: When did you start to think that black holes might be good places to look for axions?

    ASIMINA ARVANITAKI: When we were writing the axiverse paper, Nemanja Kaloper, a physicist who is very good in general relativity, came and told us, “Hey, did you know there is this effect in general relativity called superradiance?” And we’re like, “No, this cannot be, I don’t think this happens. This cannot happen for a realistic system. You must be wrong.” And then he eventually convinced us that this could be possible, and then we spent like a year figuring out the dynamics.

    What is superradiance, and how does it work?

    An astrophysical black hole can rotate. There is a region around it called the “ergoregion” where even light has to rotate. Imagine I take a piece of matter and throw it in a trajectory that goes through the ergoregion. Now imagine you have some explosives in the matter, and it breaks apart into pieces. Part of it falls into the black hole and part escapes into infinity. The piece that is coming out has more total energy than the piece that went in the black hole.

    You can perform the same experiment by scattering radiation from a black hole. Take an electromagnetic wave pulse, scatter it from the black hole, and you see that the pulse you got back has a higher amplitude.

    So you can send a pulse of light near a black hole in such a way that it would take some energy and angular momentum from the black hole’s spin?

    This is old news, by the way, this is very old news. In ’72 Press and Teukolsky wrote a Nature paper that suggested the following cute thing. Let’s imagine you performed the same experiment as the light, but now imagine that you have the black hole surrounded by a giant mirror. What will happen in that case is the light will bounce on the mirror many times, the amplitude [of the light] grows exponentially, and the mirror eventually explodes due to radiation pressure. They called it the black hole bomb.

    The property that allows light to do this is that light is made of photons, and photons are bosons — particles that can sit in the same space at the same time with the same wave function. Now imagine that you have another boson that has a mass. It can [orbit] the black hole. The particle’s mass acts like a mirror, because it confines the particle in the vicinity of the black hole.

    In this way, axions might get stuck around a black hole?

    This process requires that the size of the particle is comparable to the black hole size. Turns out that [axion] mass can be anywhere from the Hubble scale — with a quantum wavelength as big as the universe — down to a particle that’s tiny in size.

    So if they exist, axions can bind to black holes with a similar size and mass. What’s next?

    What happens is the number of particles in this bound orbit starts growing exponentially. At the same time the black hole spins down. If you solve for the wave functions of the bound orbits, what you find is that they look like hydrogen wave functions. Instead of electromagnetism binding your atom, what’s binding it is gravity. There are three quantum numbers you can describe, just the same. You can use the exact terminology that you can use in the hydrogen atom.
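    The hydrogen analogy can be made quantitative. In place of the electromagnetic fine-structure constant, the bound cloud is governed by a gravitational coupling alpha = G·M·mu/(hbar·c), where M is the black hole mass and mu the boson mass, and superradiance is most efficient when alpha is of order one — that is, when the boson’s quantum wavelength is comparable to the black hole’s size. A back-of-the-envelope check in Python (the alpha = 0.5 threshold is an illustrative choice, not a precise bound):

    ```python
    # Physical constants (SI units)
    G     = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
    hbar  = 1.0546e-34   # reduced Planck constant, J s
    c     = 2.998e8      # speed of light, m/s
    eV    = 1.602e-19    # joules per electronvolt
    M_sun = 1.989e30     # solar mass, kg

    def axion_mass_for_alpha(bh_mass_solar, alpha=0.5):
        """Boson rest energy (in eV) giving gravitational coupling
        alpha = G*M*mu/(hbar*c) around a black hole of the given mass."""
        M = bh_mass_solar * M_sun
        mu = alpha * hbar * c / (G * M)   # boson mass in kg
        return mu * c**2 / eV             # rest energy in eV

    # A ~10-solar-mass black hole, typical of LIGO's merging black holes,
    # is matched by an ultralight boson of order 1e-11 eV.
    m_axion = axion_mass_for_alpha(10)
    ```

    Heavier black holes pair with proportionally lighter bosons, which is why a spin-versus-mass plot of many black holes can carve out the mass range a given particle would have depleted.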

    How could we check to see if any of the black holes LIGO finds have axion clouds orbiting around black hole nuclei?

    This is a process that extracts energy and angular momentum from the black hole. If you were to measure spin versus mass of black holes, you should see that in a certain mass range for black holes you see no quickly rotating black holes.

    This is where Advanced LIGO comes in.

    LSC LIGO Scientific Collaboration
    VIRGO Collaboration

    Caltech/MIT Advanced LIGO detector installation, Hanford, WA, USA
    Caltech/MIT Advanced LIGO detector installation, Livingston, LA, USA

    You saw the event they saw. [Their measurements] allowed them to measure the masses of the merging objects, the mass of the final object, the spin of the final object, and to have some information about the spins of the initial objects.

    If I were to take the spins of the black holes before they merged, they could have been affected by superradiance. Now imagine a graph of black hole spin versus mass. Advanced LIGO could maybe get, if the things that we hear are correct, a thousand events per year. Now you have a thousand data points on this plot. So you may trace out the region that is affected by this particle just by those measurements.

    That would be supercool.

    That’s of course indirect. So the other cool thing is that it turns out there are signatures that have to do with the cloud of particles themselves. And essentially what they do is turn the black hole into a gravitational wave laser.

    Awesome. OK, what does that mean?

    Yeah, what that means is important. Just like you have transitions of electrons in an excited atom, you can have transitions of particles in the gravitational wave atom. The rate of emission of gravitational waves from these transitions is enhanced by the ~10^80 particles that you have. It would look like a very monochromatic line. It wouldn’t look like a transient. Imagine something now that emits a signal at a very fixed frequency.

    Where could LIGO expect to see signals like this?

    In Advanced LIGO, you actually see the birth of a black hole. You know when and where a black hole was born with a certain mass and a certain spin. So if you know the particle masses that you’re looking for, you can predict when the black hole will start growing the [axion] cloud around it. It could be that you see a merger in that day, and one or 10 years down the line, they go back to the same position and they see this laser turning on, they see this monochromatic line coming out from the cloud.

    You can also do a blind search. Because you have black holes that are roaming the universe by themselves, and they could still have some leftover cloud around them, you can do a blind search for monochromatic gravitational waves.
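    At heart, a blind search for such a line is a peak hunt in a long power spectrum: coherent integration piles the signal’s power into one frequency bin while the noise stays spread out. The toy NumPy sketch below illustrates the idea with arbitrary, made-up numbers for the sampling rate, frequency and noise level; a real search must also track slow frequency drifts and is vastly more involved:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Toy data: a weak persistent sinusoid ("the line") buried in noise
    fs = 1024.0                        # sampling rate, Hz (arbitrary)
    t = np.arange(0, 64.0, 1.0 / fs)   # 64 seconds of data
    f_line = 201.5                     # hidden signal frequency, Hz (arbitrary)
    signal = 0.05 * np.sin(2 * np.pi * f_line * t)
    data = signal + rng.normal(0.0, 1.0, t.size)

    # Long coherent integration: the line's power lands in a single
    # frequency bin, so the tallest peak reveals the hidden frequency.
    spectrum = np.abs(np.fft.rfft(data)) ** 2
    freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
    f_found = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
    ```

    With 64 seconds of data the frequency resolution is 1/64 Hz, so even a signal 20 times weaker than the per-sample noise stands out clearly in the spectrum.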

    Were you surprised to find out that axions and black holes could combine to produce such a dramatic effect?

    Oh my god yes. What are you talking about? We had panic attacks. You know how many panic attacks we had saying that this effect, no, this cannot be true, this is too good to be true? So yes, it was a surprise.

    The experiments you suggest draw from a lot of different theoretical ideas — like how we could look for high-frequency gravitational waves with tabletop sensors, or test whether dark matter oscillates using atomic clocks. When you’re thinking about making risky bets on physics beyond the standard model, what sorts of theories seem worth the effort?

    What is well motivated? Things that are not: “What if you had this?” People imagine: “What if dark matter was this thing? What if dark matter was the other thing?” For example, supersymmetry makes predictions about what types of dark matter should be there. String theory makes predictions about what types of particles you should have. There is always an underlying reason why these particles are there; it’s not just the endless theoretical possibilities that we have.

    And axions fit that definition?

    This is a particle that was proposed 30 years ago to explain the smallness of the observed electric dipole moment of the neutron. There are several experiments around the world looking for it already, at different wavelengths. So this particle, we’ve been looking for it for 30 years. This can be the dark matter. That particle solves an outstanding problem of the standard model, so that makes it a good particle to look for.

    Now, whether or not the particle is there I cannot answer for nature. Nature will have to answer.

    See the full article here.


    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.
