Tagged: Physics

  • richardmitnick 1:35 pm on February 18, 2017 Permalink | Reply
    Tags: Breakthrough in understanding heat transport with a chain of gold atoms, Physics, Wiedemann-Franz law

    From phys.org: “Breakthrough in understanding heat transport with a chain of gold atoms” 

    phys.org

    February 17, 2017

    Artist’s view of the quantized thermal conductance of an atomically thin gold contact. Credit: Enrique Sahagun

    The precise control of electron transport in microelectronics makes possible the complex logic circuits in daily use in smartphones and laptops. Heat transport is of similar fundamental importance, and its control is necessary, for instance, to efficiently cool ever smaller chips. An international team, including the Konstanz theoretical physicists Junior Professor Fabian Pauly and Professor Peter Nielaba and their staff, has achieved a breakthrough in understanding heat transport at the nanoscale. The team used a system that experimentalists in nanoscience can nowadays realize quite routinely and that keeps serving as the “fruit fly” for breakthrough discoveries: a chain of gold atoms. They used it to demonstrate the quantization of the electronic part of the thermal conductance. The study also shows that the Wiedemann-Franz law, a relation from classical physics, remains valid down to the atomic level. The results were published in the journal Science on 16 February 2017.

    To begin with, the test object is a microscopic gold wire. This wire is pulled until its cross section is only one atom wide and a chain of gold atoms forms, before it finally breaks. The physicists send an electric current through this atomic chain, that is, through the thinnest wire conceivable. With the help of different theoretical models, the researchers can predict the conductance value of the electric transport and confirm it by experiment. This electrical conductance value indicates how much charge current flows when a voltage is applied. The thermal conductance, which indicates the amount of heat that flows for a given temperature difference, had not yet been measured for such atomic wires.

    Now the question was whether the Wiedemann-Franz law, which states that the electrical conductance and the thermal conductance are proportional to each other, also remains valid at the atomic scale. In general, both electrons and atomic oscillations (also called vibrations or phonons) contribute to heat transport. At the atomic level, quantum mechanics has to be used to describe both electron and phonon transport. The Wiedemann-Franz law, however, only describes the relation between macroscopic electronic properties. The researchers therefore first had to determine how large the phonon contribution to the thermal conductance is.

    The doctoral researchers Jan Klöckner and Manuel Matt carried out complementary theoretical calculations, which showed that the contribution of phonons to the heat transport in atomically thin gold wires is usually less than ten percent and thus not decisive. At the same time, the simulations confirm the applicability of the Wiedemann-Franz law. Manuel Matt used an efficient, albeit less accurate, method that provided statistical results for many gold-wire stretching events to calculate the electronic part of the thermal conductance, while Jan Klöckner applied density functional theory to estimate the electronic and phononic contributions in individual contact geometries. The quantization of the thermal conductance in gold chains, as proven by experiment, ultimately results from the combination of three factors: the quantization of the electrical conductance in units of the conductance quantum 2e²/h (twice the inverse Klitzing constant), the negligible role of phonons in heat transport, and the validity of the Wiedemann-Franz law.
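
    The numbers involved are easy to check. The sketch below is a back-of-the-envelope calculation using CODATA constants (not code from the study): it computes the conductance quantum and the room-temperature thermal conductance that the Wiedemann-Franz law then implies for a single spin-degenerate channel, such as a one-atom gold contact.

        # Wiedemann-Franz estimate for one spin-degenerate conduction channel
        import math

        e = 1.602176634e-19      # elementary charge, C
        h = 6.62607015e-34       # Planck constant, J s
        kB = 1.380649e-23        # Boltzmann constant, J/K

        G0 = 2 * e**2 / h                    # conductance quantum, ~7.75e-5 S
        L0 = (math.pi**2 / 3) * (kB / e)**2  # Lorenz number, ~2.44e-8 W Ohm/K^2

        T = 300.0                            # room temperature, K
        kappa = L0 * T * G0                  # thermal conductance per channel
        print(f"kappa(300 K) = {kappa:.2e} W/K")  # ~5.7e-10 W/K, i.e. ~0.57 nW/K

    A heat flow of roughly half a nanowatt per kelvin is tiny, which is why the measurement described below required such a sensitive setup.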

    For quite some time it has been possible to theoretically calculate, with the help of computer models as developed in the teams of Fabian Pauly and Peter Nielaba, how charges and heat flow through nanostructures. A highly precise experimental setup, as created by the experimental colleagues Professor Edgar Meyhofer and Professor Pramod Reddy from the University of Michigan (USA), was required to be able to compare the theoretical predictions with measurements. In previous experiments the signals from the heat flow through single atom contacts were too small. The Michigan group succeeded in improving the experiment: Now the actual signal can be filtered out and measured.

    The results of the research team make it possible to study heat transport not only in atomic gold contacts but also in many other nanosystems. They offer opportunities to experimentally and theoretically explore numerous fundamental quantum heat transport phenomena that might help to use energy more efficiently, for example by exploiting thermoelectricity.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quantcast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

     
  • richardmitnick 3:56 pm on February 17, 2017 Permalink | Reply
    Tags: Bjorken x variable, HERA collider, How strange is the proton?, Physics, Strong interactions

    From CERN ATLAS: “How strange is the proton?” 

    CERN/ATLAS

    25th January 2017
    ATLAS Collaboration

    Figure 1: The data ellipses illustrate the 68% CL coverage for the total uncertainties (full green) and total excluding the luminosity uncertainty (open black). Theoretical predictions based on various PDF sets are shown with open symbols of different colours. (Image: ATLAS Collaboration/CERN)

    The protons collided by the LHC are not elementary particles, but are instead made up of quarks, antiquarks and gluons. The theory of the strong interactions – quantum chromodynamics (QCD) – does not allow physicists to calculate the composition of protons from first principles. However, QCD can connect measurements made in different processes and at different energy scales such that universal “parton density functions” (PDFs) can be extracted. These determine the dynamic substructure of the proton.

    The discovery of quarks as the elements of the partonic structure of the proton dates back about 50 years. Soon after, QCD was born and the existence of gluons inside the proton was established. Much has since been learned through a combination of new experimental data and theoretical advances. At the LHC, reactions involve quarks or gluons that carry a certain fraction x of the proton’s momentum, expressed through the Bjorken x variable. Below x ≈ 0.01, the proton constituents are mainly gluons and a sea of quark-antiquark pairs.

    Electron-proton scattering data from the HERA collider have constrained the gluon density and the sum of all quark densities weighted by the squares of their electric charges.

    Data from the HERA collider live on (Image: DESY Hamburg)

    But the sea-quark composition at low x, expressed in terms of the lighter quarks (up, down and strange), is still not well understood. New data from the ATLAS experiment show, with unprecedented precision, the production of W and Z bosons through the weak interaction. This sheds new light on the question: how “strange” is the proton at small x?

    W production is detected through its decay into a charged lepton (electron or muon) and a neutrino, while the Z boson produces an electron-positron (or muon-antimuon) pair. The experimental detection of electrons and muons poses different challenges, so the simultaneous measurement in both channels provides an important cross-check of the results and improves the final precision achieved. The integrated cross sections for Z boson and W boson production are measured with a precision of 0.3% and 0.6%, respectively, with an additional common normalisation uncertainty of 1.8% from the luminosity determination. Differential cross sections are also measured in a variety of kinematic regions, and about half of the measurement points have a precision of 1% or better.

    Figure 2: Determination of the relative strange-to-light quark fraction R_s. Bands: Present result and its uncertainty contributions from experimental data, QCD fit, and theoretical uncertainties. Closed symbols with horizontal error bars: predictions from different NNLO PDF sets. Open square: previous ATLAS result. (Image: ATLAS Collaboration/CERN)

    The measurements are then compared to state-of-the-art QCD expectations using different PDF sets. Because the production of W and Z bosons through the weak interaction has a different dependence on the specific quark flavours than the electromagnetic interaction seen in electron-proton scattering at HERA, analysing both data sets gives new access to the strange-quark content of the proton.

    As shown in Figure 1, the predicted production rate of W bosons is very similar for all PDF sets and in good agreement with the data. In contrast, the rate for Z boson production is significantly underestimated by most PDF sets. A dedicated analysis enables this deficit to be attributed to too small a strange-quark contribution in most PDF sets. The new PDF set presented in this paper requires the strange-quark sea to be of a similar size to the up and down quark seas. This is summarised by the quantity R_s, the ratio of the strange-quark sea to the up and down quark sea, which is found to be close to one, as shown in Figure 2. This result is a striking confirmation of the hypothesis of a light-flavour symmetry of proton structure at low x. It will generate many further studies, because hitherto there had been indications from low-energy neutrino-scattering data favouring a strange-quark contribution suppressed with respect to the up and down quark parts, corresponding to an R_s close to 0.5.
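
    In parton-distribution language, the quantity quoted above is the ratio of the strange sea to the light sea. Written to match the description in the text (the paper fixes the exact x and Q² values at which it is evaluated):

        R_s = \frac{s(x,Q^2) + \bar{s}(x,Q^2)}{\bar{u}(x,Q^2) + \bar{d}(x,Q^2)}

    An R_s near one, as found here, means the strange sea is as large as the up and down sea, while the older neutrino-scattering results correspond to R_s ≈ 0.5.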

    The analysis also shows that the extent to which precise W and Z cross sections can provide useful constraints on PDFs is not limited by the now very high experimental precision, but rather by the uncertainty of the currently available theory calculations. The salient results of this paper are thus of fundamental importance for forthcoming high-precision measurements, such as that of the W boson mass, and also represent a strong incentive for further improving the theory of Drell-Yan scattering in proton-proton collisions.

    Links:

    Precision measurement and interpretation of inclusive W+, W− and Z/γ∗ production cross sections with the ATLAS detector, https://arxiv.org/abs/1612.03016

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 3:32 pm on February 17, 2017 Permalink | Reply
    Tags: A minimal extension to the standard model of particle physics involves six new particles, Astrophysical observations suggest that the mysterious dark matter is more than five times as common, Model Tries to Solve Five Physics Problems at Once, Physical Review Letters, Physics, The particles are three heavy right-handed neutrinos and a color triplet fermion and a particle called rho that both gives mass to the right-handed neutrinos and drives cosmic inflation together with

    From DESY: “Solving five big questions in particle physics in a SMASH” 

    DESY

    2017/02/16
    No writer credit found

    Extension of the standard model provides complete and consistent description of the history of the universe.

    The extremely successful standard model of particle physics has an unfortunate limitation: the current version is only able to explain about 15 percent of the matter found in the universe.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Although it describes and categorises all the known fundamental particles and interactions, it does so only for the type of matter we are familiar with. However, astrophysical observations suggest that the mysterious dark matter is more than five times as common as this familiar matter. An international team of theoretical physicists has now come up with an extension to the standard model which could not only explain dark matter but also solve five major problems of particle physics at one stroke. Guillermo Ballesteros, from the University of Paris-Saclay, and his colleagues present their SMASH model (“Standard Model Axion Seesaw Higgs portal inflation”) in the journal Physical Review Letters.

    The history of the universe according to SMASH, denoting the different phases and the dominant energies of the epochs since the Big Bang. Credit: DESY


    Model Tries to Solve Five Physics Problems at Once

    A minimal extension to the standard model of particle physics involves six new particles. http://physics.aps.org/synopsis-for/10.1103/PhysRevLett.118.071802

    The standard model has enjoyed a happy life. Ever since it was proposed four decades ago, it has passed all particle physics tests with flying colors. But it has several sticky problems. For instance, it doesn’t explain why there’s more matter than antimatter in the cosmos. A quartet of theorists from Europe has now taken a stab at solving five of these problems in one go. The solution is a model dubbed SMASH, which extends the standard model in a minimal fashion.

    SMASH adds six new particles to the seventeen fundamental particles of the standard model. The particles are three heavy right-handed neutrinos, a color triplet fermion, a particle called rho that both gives mass to the right-handed neutrinos and drives cosmic inflation together with the Higgs boson, and an axion, which is a promising dark matter candidate. With these six particles, SMASH does five things: produces the matter–antimatter imbalance in the Universe; creates the mysterious tiny masses of the known left-handed neutrinos; explains an unusual symmetry of the strong interaction that binds quarks in nuclei; accounts for the origin of dark matter; and explains inflation.

    The jury is out on whether the model will fly. For one thing, it doesn’t tackle the so-called hierarchy problem and the cosmological constant problem. On the plus side, it makes clear predictions, which the authors say can be tested with future data from observations of the cosmic microwave background and from experiments searching for axions. One prediction is that axions should have a mass between 50 and 200 μeV. Over to the experimentalists, then.

    This research is published in Physical Review Letters.

    “SMASH was actually developed from the bottom up,” explains DESY’s Andreas Ringwald, who co-authored the study. “We started off with the standard model and added only as many new concepts as were necessary in order to answer the unresolved issues.” To do this, the scientists combined various existing theoretical approaches and came up with a simple, uniform model. SMASH adds a total of six new particles to the standard model: three heavy, right-handed neutrinos and an additional quark, as well as a so-called axion and the heavy rho (ρ) particle. The latter two form a new field which extends throughout the entire universe.

    Using these extensions, the scientists were able to solve five problems. The axion is a candidate for dark matter, which astrophysical observations suggest is five times more ubiquitous than the familiar matter described by the standard model. The heavy neutrinos explain the mass of the already known, very light neutrinos, and the rho interacts with the Higgs boson to produce so-called cosmic inflation, a period during which the entire young universe suddenly expanded by a factor of at least one hundred septillion for hitherto unknown reasons. In addition, SMASH explains why our universe contains so much more matter than antimatter, even though equal amounts must have been created during the big bang, and it reveals why no violation of so-called CP symmetry is observed in the strong force, one of the fundamental interactions.

    The particles of the standard model (SM, left) and of the extension SMASH (right). Credit: Carlos Tamarit, University of Durham

    “Overall, the resulting description of the history of the universe is complete and consistent, from the period of inflation to the present day. And unlike many older models, the individual important values can be calculated to a high level of precision, for example the time at which the universe starts heating up again after inflation,” emphasises Ringwald.

    Being able to calculate these values with such precision means that SMASH could potentially be tested experimentally within the next ten years. “The good thing about SMASH is that the theory is falsifiable. For example, it contains very precise predictions of certain features of the so-called cosmic microwave background. Future experiments that measure this radiation with even greater precision could therefore soon rule out SMASH – or else confirm its predictions,” explains Ringwald. A further test of the model is the search for axions. Here too, the model is able to make accurate predictions, and if axions do indeed account for the bulk of dark matter in the universe, then SMASH requires them to have a mass of 50 to 200 micro-electronvolts, in the units conventionally used in particle physics. Experiments that examine dark matter more precisely could soon test this prediction too.
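
    That mass window maps directly onto a microwave frequency, since an axion converting to a photon in a magnetic field (the haloscope technique) would show up at ν = mc²/h. A quick sketch of the conversion using CODATA constants; the window itself is SMASH’s prediction, while the code is illustrative and not from the paper:

        # Convert the predicted axion mass window into haloscope frequencies
        e = 1.602176634e-19   # J per eV
        h = 6.62607015e-34    # Planck constant, J s

        def mass_to_ghz(m_microeV):
            return m_microeV * 1e-6 * e / h / 1e9   # nu = m c^2 / h, in GHz

        for m in (50, 200):
            print(f"{m} micro-eV -> {mass_to_ghz(m):.1f} GHz")
        # 50 micro-eV -> ~12.1 GHz; 200 micro-eV -> ~48.4 GHz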

    Javier Redondo from the University of Saragossa in Spain and Carlos Tamarit from the University of Durham in England were also involved in the study.

    Read the APS synopsis: http://physics.aps.org/synopsis-for/10.1103/PhysRevLett.118.071802

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition


    DESY is one of the world’s leading accelerator centres. Researchers use the large-scale facilities at DESY to explore the microcosm in all its variety – from the interactions of tiny elementary particles and the behaviour of new types of nanomaterials to biomolecular processes that are essential to life. The accelerators and detectors that DESY develops and builds are unique research tools. The facilities generate the world’s most intense X-ray light, accelerate particles to record energies and open completely new windows onto the universe. 
That makes DESY not only a magnet for more than 3000 guest researchers from over 40 countries every year, but also a coveted partner for national and international collaborations. Committed young researchers find an exciting interdisciplinary setting at DESY. The research centre offers specialized training for a large number of professions. DESY cooperates with industry and business to promote new technologies that will benefit society and encourage innovations. This also benefits the metropolitan regions of the two DESY locations, Hamburg and Zeuthen near Berlin.

     
  • richardmitnick 1:31 pm on February 16, 2017 Permalink | Reply
    Tags: Physics, Triangulene

    From Futurism: “Scientists Have Finally Created a Molecule That Was 70 Years in the Making” 

    Futurism

    2.16.17
    Neil C. Bhavsar

    Creating the Impossible

    Move over graphene, it’s 2017 and we have a new carbon structure to rave about: Triangulene. It’s one atom thick, six carbon hexagons in size, and in the shape of – you guessed it – a triangle.


    Development of the molecule has eluded chemists for nearly seventy years. It was first predicted mathematically in the 1950s by the Czech-born scientist Erich Clar. He noted that the molecule would be electronically unstable due to two unpaired electrons in its six-hexagon benzenoid structure. Since then, the mysterious molecule has sent generations of scientists on a pursuit of the unstable molecule – all resulting in failure, because the two unpaired electrons make it quick to oxidize.

    Now, IBM researchers in Zurich, Switzerland seem to have done the impossible: they created the molecule. While most scientists build molecules from the ground up, Leo Gross and his team decided to take the opposite approach. They worked with a larger precursor molecule and removed two hydrogen atoms from it to conjure up the apparition molecule that is triangulene.

    On top of this, they were able to successfully image the structure with a scanning probe microscope and note the molecule’s unexpected stability in the presence of copper. Their published work is available in Nature Nanotechnology.

    This new material is already proving to be impressive. The two unpaired electrons of the triangulene molecule were discovered to have aligned spins, granting the molecule magnetic properties. This means triangulene has a lot of potential in electronics, specifically by allowing us to encode and process information by manipulating electron spin – a field known as spintronics.

    The IBM researchers still have a lot to learn about triangulene. Moving forward, other teams will attempt to verify whether the researchers actually created the triangle-shaped molecule or not. Until then, the technique the team developed could be used for making other elusive structures. It still isn’t ideal, though, as it is a slow and expensive process. Even so, this could push us closer to the age of quantum computers.

    References: ScienceAlert – Latest, Nature

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Futurism covers the breakthrough technologies and scientific discoveries that will shape humanity’s future. Our mission is to empower our readers and drive the development of these transformative technologies towards maximizing human potential.

     
  • richardmitnick 9:12 am on February 16, 2017 Permalink | Reply
    Tags: Physics, Technique could help increase resolution of microscopes and telescopes

    From U Toronto: “University of Toronto physicists harness neglected properties of light” 

    University of Toronto

    February 15, 2017
    Patchen Barss

    Technique could help increase resolution of microscopes and telescopes.

    U of T researchers have demonstrated a way to increase the resolution of microscopes and telescopes beyond long-accepted limitations by tapping into previously neglected properties of light. The method allows observers to distinguish very small or distant objects that are so close together they normally meld into a single blur.

    Telescopes and microscopes are great for observing lone subjects. Scientists can precisely detect and measure a single distant star. The longer they observe, the more refined their data becomes.

    But objects like binary stars don’t work the same way.

    That’s because even the best telescopes are subject to laws of physics that cause light to spread out or “diffract.” A sharp pinpoint becomes an ever-so-slightly blurry dot. If two stars are so close together that their blurs overlap, no amount of observation can separate them out. Their individual information is irrevocably lost.

    Circumventing the limitations of the “Rayleigh Criterion”

    More than 100 years ago, British physicist John William Strutt – better known as Lord Rayleigh – established the minimum distance between objects necessary for a telescope to pick out each individually. The “Rayleigh Criterion” has stood as an inherent limitation of the field of optics ever since.
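
    For a circular aperture, Rayleigh’s limit is usually quoted as θ ≈ 1.22 λ/D radians, where λ is the wavelength and D the aperture diameter. A small sketch with illustrative numbers (not taken from the paper):

        # Rayleigh criterion: minimum resolvable angle for a circular aperture
        import math

        def rayleigh_limit_rad(wavelength_m, aperture_m):
            return 1.22 * wavelength_m / aperture_m

        wavelength = 550e-9                 # green light, m
        for D in (0.1, 2.4, 10.0):          # backyard scope, Hubble-class, large
            theta = rayleigh_limit_rad(wavelength, D)
            arcsec = math.degrees(theta) * 3600
            print(f"D = {D:5.1f} m -> {theta:.2e} rad ({arcsec:.3f} arcsec)")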

    Telescopes, though, only register light’s “intensity” or brightness. Light has other properties that now appear to allow one to circumvent the Rayleigh Criterion.

    “To beat Rayleigh’s curse, you have to do something clever,” says Professor Aephraim Steinberg, a physicist at U of T’s Centre for Quantum Information and Quantum Control, and Senior Fellow in CIFAR’s Quantum Information Science program. He’s the lead author of a paper published today in the journal Physical Review Letters.

    Professor Aephraim Steinberg is a physicist at U of T’s Centre for Quantum Information and Quantum Control, and Senior Fellow in CIFAR’s Quantum Information Science program.

    Measuring a property of light called ‘phase’

    Some of these clever ideas were recognized with the 2014 Nobel Prize in Chemistry, notes Steinberg, but those methods all still rely on intensity only, limiting the situations in which they can be applied. “We measured another property of light called ‘phase.’ And phase gives you just as much information about sources that are very close together as it does those with large separations.”

    Light travels in waves, and all waves have a phase. Phase refers to the location of a wave’s crests and troughs. Even when a pair of close-together light sources blurs into a single blob, information about their individual wave phases remains intact. You just have to know how to look for it. This realization was published by Tsang, Nair, and Lu last year in Physical Review X, and Steinberg’s group and three other experimental groups immediately set about devising a variety of ways to put it into practice.

    “Light is actually easy to slow down”

    “We tried to come up with the simplest thing you could possibly do,” Steinberg says. “To play with the phase, you have to slow a wave down, and light is actually easy to slow down.”

    PhD students Edwin (Weng Kian) Tham and Hugo Ferretti. Photo: Diana Tyszko

    His team, including PhD students Edwin (Weng Kian) Tham and Hugo Ferretti, split test images in half. Light from each half passes through glass of a different thickness, which slows the waves for different amounts of time, changing their respective phases. When the beams recombine, they create distinct interference patterns that tell the researchers whether the original image contained one object or two – at resolutions well beyond the Rayleigh Criterion.
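
    The phase picked up in glass relative to air grows with thickness: Δφ = 2π(n − 1)t/λ. A sketch with illustrative numbers (the actual glass thicknesses used in the experiment are not given here):

        # Extra optical phase accumulated in glass versus air
        import math

        n = 1.5                 # typical glass refractive index (assumed)
        wavelength = 633e-9     # red light, m (assumed)

        def extra_phase(thickness_m):
            return 2 * math.pi * (n - 1) * thickness_m / wavelength

        t1, t2 = 10.0e-6, 10.3165e-6   # two slightly different thicknesses, m
        dphi = extra_phase(t2) - extra_phase(t1)
        print(f"phase difference = {dphi:.2f} rad ({dphi / math.pi:.2f} pi)")
        # ~1.57 rad: a 0.3 micron thickness difference already shifts by pi/2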

    So far, Steinberg’s team has tested the method only in artificial situations involving highly restrictive parameters.

    True value lies in shaking up physicists’ concept of “where information actually is”

    “I want to be cautious – these are early stages,” he says. “In our laboratory experiments, we knew we just had one spot or two, and we could assume they had the same intensity. That’s not necessarily the case in the real world. But people are already taking these ideas and looking at what happens when you relax those assumptions.”

    The advance has potential applications both in observing the cosmos and in microscopy, where the method could be used to study bonded molecules and other tiny, tightly packed structures.

    Regardless of how much phase measurements ultimately improve imaging resolution, Steinberg says the experiment’s true value is in shaking up physicists’ concept of “where information actually is.”

    Steinberg’s “day job” is in quantum physics – this experiment was a departure for him. He says work in the quantum realm provided key philosophical insights about information itself that helped him beat Rayleigh’s Curse.

    “When we measure quantum states, you have something called the Uncertainty Principle, which says you can look at position or velocity, but not both. You have to choose what you measure. Now we’re learning that imaging is more like quantum mechanics than we realized,” he says. “When you only measure intensity, you’ve made a choice and you’ve thrown out information. What you learn depends on where you look.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Established in 1827, the University of Toronto has one of the strongest research and teaching faculties in North America, presenting top students at all levels with an intellectual environment unmatched in depth and breadth on any other Canadian campus.

     
  • richardmitnick 1:57 pm on February 15, 2017 Permalink | Reply
    Tags: Fixing the Big Bang Theory’s Lithium Problem, Physics

    From AAS NOVA: “Fixing the Big Bang Theory’s Lithium Problem” 

    American Astronomical Society

    15 February 2017
    Susanna Kohler

    Inflationary Universe. NASA/WMAP
    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    “The Big Bang theory is the most widely accepted cosmological model of the universe, but it still contains a few puzzles.”

    How did our universe come into being? The Big Bang theory is a widely accepted and highly successful cosmological model of the universe, but it does introduce one puzzle: the “cosmological lithium problem.” Have scientists now found a solution?

    Too Much Lithium

    In the Big Bang theory, the universe expanded rapidly from a very high-density and high-temperature state dominated by radiation. This theory has been validated again and again: the discovery of the cosmic microwave background radiation and observations of the large-scale structure of the universe both beautifully support the Big Bang theory, for instance. But one pesky trouble-spot remains: the abundance of lithium.

    The arrows show the primary reactions involved in Big Bang nucleosynthesis, and their flux ratios, as predicted by the authors’ model, are given on the right. Synthesizing primordial elements is complicated! [Hou et al. 2017]

    According to Big Bang nucleosynthesis theory, primordial nucleosynthesis ran wild during the first half hour of the universe’s existence. This produced most of the universe’s helium and small amounts of other light nuclides, including deuterium and lithium.

    But while predictions match the observed primordial deuterium and helium abundances, Big Bang nucleosynthesis theory overpredicts the abundance of primordial lithium by about a factor of three. This inconsistency is known as the “cosmological lithium problem” — and attempts to resolve it using conventional astrophysics and nuclear physics over the past few decades have not been successful.

    In a recent study led by Suqing Hou (Institute of Modern Physics, Chinese Academy of Sciences), however, a team of scientists has proposed an elegant solution to this problem.

    Time and temperature evolution of the abundances of primordial light elements during the beginning of the universe. The authors’ model (dotted lines) successfully predicts a lower abundance of the beryllium isotope — which eventually decays into lithium — relative to the classical Maxwell-Boltzmann distribution (solid lines), without changing the predicted abundances of deuterium or helium. [Hou et al. 2017]

    Questioning Statistics

    Hou and collaborators questioned a key assumption in Big Bang nucleosynthesis theory: that the nuclei involved in the process are all in thermodynamic equilibrium, and their velocities — which determine the thermonuclear reaction rates — are described by the classical Maxwell-Boltzmann distribution.

    But do nuclei still obey this classical distribution in the extremely complex, fast-expanding Big Bang hot plasma? Hou and collaborators propose that the lithium nuclei don’t, and that they must instead be described by a slightly modified version of the classical distribution, accounted for using what’s known as “non-extensive statistics”.
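
    One common realisation of non-extensive statistics is the Tsallis distribution, which deforms the Maxwell-Boltzmann exponential with a parameter q and recovers it as q → 1. The sketch below uses illustrative q values (the authors’ actual fitted parameters are in the paper) to show how strongly a small deviation reshapes the high-energy tail that drives thermonuclear reaction rates:

        # Maxwell-Boltzmann vs Tsallis (non-extensive) energy distributions
        import numpy as np

        kT = 1.0                          # work in units of kT
        E = np.linspace(0.0, 10.0, 2001)
        dE = E[1] - E[0]

        def maxwell_boltzmann(E):
            return np.sqrt(E) * np.exp(-E / kT)

        def tsallis(E, q):
            # q -> 1 recovers Maxwell-Boltzmann; q is an illustrative knob here
            base = np.clip(1.0 - (1.0 - q) * E / kT, 0.0, None)
            return np.sqrt(E) * base ** (1.0 / (1.0 - q))

        mb = maxwell_boltzmann(E)
        mb /= mb.sum() * dE
        hi = E > 5.0                      # the tail that powers nuclear reactions

        for q in (0.9, 1.1):
            f = tsallis(E, q)
            f /= f.sum() * dE
            ratio = f[hi].sum() / mb[hi].sum()
            print(f"q = {q}: tail weight (E > 5 kT) = {ratio:.3f} x Maxwell-Boltzmann")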

    The authors show that using the modified velocity distributions described by these statistics, they can successfully predict the observed primordial abundances of deuterium, helium, and lithium simultaneously. If this solution to the cosmological lithium problem is correct, the Big Bang theory is now one step closer to fully describing the formation of our universe.

    Citation

    S. Q. Hou et al. 2017 ApJ 834 165. doi:10.3847/1538-4357/834/2/165

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 12:16 pm on February 11, 2017 Permalink | Reply
    Tags: Beamsplitter, Physics, Wave function, What shape are photons? Quantum holography sheds light

    From COSMOS: “What shape are photons? Quantum holography sheds light” 

    COSMOS

    20 July 2016 [Just found this in social media]
    Cathal O’Connell

    Hologram of a single photon reconstructed from raw measurements (left) and theoretically predicted (right). FUW

    Imagine a shaft of yellow sunlight beaming through a window. Quantum physics tells us that beam is made of zillions of tiny packets of light, called photons, streaming through the air. But what does an individual photon “look” like? Does it have a shape? Are these questions even meaningful?

    Now, Polish physicists have created the first ever hologram of a single light particle. The feat, achieved by observing the interference of two intersecting light beams, is an important insight into the fundamental quantum nature of light.

    The result could also be important for technologies that require an understanding of the shape of single photons – such as quantum communication and quantum computers.

    “We performed a relatively simple experiment to measure and view something incredibly difficult to observe: the shape of wavefronts of a single photon,” says Radoslaw Chrapkiewicz, a physicist at the University of Warsaw and lead author of the new paper, published in Nature Photonics.

    For hundreds of years, physicists have been working to figure out what light is made of. In the 19th century, the debate seemed to be settled by Scottish physicist James Clerk Maxwell’s description of light as a wave of electromagnetism.

    But things got a bit more complicated at the turn of the 20th century when German physicist Max Planck, then fellow countryman Albert Einstein, showed light was made up of tiny indivisible packets called photons.

    In the 1920s, Austrian physicist Erwin Schrödinger elaborated on these ideas with his equation for the quantum wave function to describe what a wave looks like, which has proved incredibly powerful in predicting the results of experiments with photons. But, despite the success of Schrödinger’s theory, physicists still debate what the wave function really means.

    Now physicists at the University of Warsaw measured, for the first time, the shape described by Schrödinger’s equation in a real experiment.

    Photons, travelling as waves, can be in step (called having the same phase). If they interact, they produce a bright signal. If they’re out of phase, they cancel each other out. It’s like sound waves from two speakers producing loud and quiet patches in a room.
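
    The arithmetic behind that in-step/out-of-step behaviour is simple: two equal-amplitude waves combined with a phase offset Δφ give an intensity proportional to |1 + e^(iΔφ)|² = 4 cos²(Δφ/2). A minimal sketch:

        # Intensity of two superposed equal-amplitude waves vs phase offset
        import cmath, math

        def combined_intensity(dphi):
            return abs(1 + cmath.exp(1j * dphi)) ** 2

        for dphi in (0.0, math.pi / 2, math.pi):
            print(f"dphi = {dphi:.2f} rad -> intensity {combined_intensity(dphi):.2f}")
        # 0.00 -> 4.00 (bright), 1.57 -> 2.00, 3.14 -> 0.00 (dark)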

    The image – which is called a hologram because it holds information on both the photon’s shape and phase – was created by firing two light beams at a beamsplitter, made of calcite crystal, at the same time.

    The beamsplitter acts a bit like a traffic intersection, where each photon can either pass straight on through or make a turn. The Polish team’s experiment hinged on measuring which path each photon took, which depends on the shape of their wave functions.

    Scheme of the experimental setup for measuring holograms of single photons. FUW / dualcolor.pl / jch

    For a photon on its own, each path is equally probable. But when two photons approach the intersection, they interact – and these odds change.

    The team realised that if they knew the wave function of one of the photons, they could figure out the shape of the second from the positions of flashes appearing on a detector.

    It’s a little like firing two bullets to glance off one another mid-air and using the deflected trajectories to figure out the shape of each projectile.

    Each run of the experiment produced two flashes on a detector, one for each photon. After more than 2,000 repetitions, a pattern of flashes built up and the team were able to reconstruct the shape of the unknown photon’s wave function.

    The resulting image looks a bit like a Maltese cross, just like the wave function predicted from Schrödinger’s equation. In the arms of the cross, where the photons were in step, the image is bright – and where they weren’t, we see darkness.

    The experiment brings us “a step closer to understanding what the wave function really is,” says Michal Jachura, who co-authored the work, and it could be a new tool for studying the interaction between two photons, on which technologies such as quantum communication and some versions of quantum computing rely.

    The researchers also hope to recreate wave functions of more complex quantum objects, such as atoms.

    “It’s likely that real applications of quantum holography won’t appear for a few decades yet,” says Konrad Banaszek, who was also part of the team, “but if there’s one thing we can be sure of it’s that they will be surprising.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 9:44 am on February 9, 2017 Permalink | Reply
    Tags: Physics, Wave of the future: Terahertz chips a new way of seeing through matter

    From Princeton: “Wave of the future: Terahertz chips a new way of seeing through matter” 

    Princeton University

    February 8, 2017
    Tien Nguyen

    Princeton University researchers have drastically shrunk the equipment for producing terahertz — important electromagnetic pulses lasting one millionth of a millionth of a second — from a tabletop setup with lasers and mirrors to a pair of microchips small enough to fit on a fingertip (above). The simpler, cheaper generation of terahertz has potential for advances in medical imaging, communications and drug development. (Photos by Frank Wojciechowski for the Office of Engineering Communications)

    Electromagnetic pulses lasting one millionth of a millionth of a second may hold the key to advances in medical imaging, communications and drug development. But the pulses, called terahertz waves, have long required elaborate and expensive equipment to use.

    Now, researchers at Princeton University have drastically shrunk much of that equipment: moving from a tabletop setup with lasers and mirrors to a pair of microchips small enough to fit on a fingertip.

    In two articles recently published in the IEEE Journal of Solid-State Circuits, the researchers describe one microchip that can generate terahertz waves, and a second chip that can capture and read intricate details of these waves.

    “The system is realized in the same silicon chip technology that powers all modern electronic devices from smartphones to tablets, and therefore costs only a few dollars to make on a large scale,” said lead researcher Kaushik Sengupta, a Princeton assistant professor of electrical engineering.

    Terahertz waves are part of the electromagnetic spectrum — the broad class of waves that includes radio, X-rays and visible light — and sit between the microwave and infrared light wavebands. The waves have some unique characteristics that make them interesting to science. For one, they pass through most non-conducting material, so they could be used to peer through clothing or boxes for security purposes, and because they have less energy than X-rays, they don’t damage human tissue or DNA.
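
    To put the band in numbers, the sketch below converts a few frequencies to free-space wavelength and photon energy (standard physics, not taken from the papers). Terahertz photons carry only a few milli-electronvolts, far below the ionising energies of X-rays:

        # Frequency, wavelength, and photon energy across the terahertz band
        c = 299792458.0          # speed of light, m/s
        h = 6.62607015e-34       # Planck constant, J s
        e = 1.602176634e-19      # J per eV

        for f_THz in (0.1, 1.0, 10.0):
            f = f_THz * 1e12
            wavelength_um = c / f * 1e6
            energy_meV = h * f / e * 1e3
            print(f"{f_THz:5.1f} THz: lambda = {wavelength_um:7.1f} um, "
                  f"E = {energy_meV:6.2f} meV")
        # 1 THz -> ~300 um and ~4.1 meV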

    Terahertz waves also interact in distinct ways with different chemicals, so they can be used to characterize specific substances. Known as spectroscopy, the ability to use light waves to analyze material is one of the most promising — and the most challenging — applications of terahertz technology, Sengupta said.

    To do it, scientists shine a broad range of terahertz waves on a target, then observe how the waves change after interacting with it. The human eye performs a similar type of spectroscopy with visible light — we see a leaf as green because light in the green light frequency bounces off the chlorophyll-laden leaf.

    The challenge has been that generating a broad range of terahertz waves and interpreting their interaction with a target requires a complex array of equipment such as bulky terahertz generators or ultrafast lasers. The equipment’s size and expense make the technology impractical for most applications.

    Researchers have been working for years to simplify these systems. In September, Sengupta’s team reported a way to reduce the size of the terahertz generator and the apparatus that interprets the returning waves to a millimeter-sized chip. The solution lies in re-imagining how an antenna functions. When terahertz waves interact with a metal structure inside the chip, they create a complex distribution of electromagnetic fields that are unique to the incident signal. Typically, these subtle fields are ignored, but the researchers realized that they could read the patterns as a sort of signature to identify the waves. The entire process can be accomplished with tiny devices inside the microchip that read terahertz waves.

    “Instead of directly reading the waves, we are interpreting the patterns created by the waves,” Sengupta said. “It is somewhat like looking for a pattern of raindrops by the ripples they make in a pond.”

    In two recently published articles, researchers Kaushik Sengupta (left), an assistant professor of electrical engineering, and Xue Wu (right), a Princeton graduate student in electrical engineering, describe one microchip that can generate terahertz waves, and a second chip that can capture and read intricate details of these waves. Terahertz waves sit between the microwave and infrared light wavebands on the electromagnetic spectrum and have unique characteristics, such as the ability to pass through most non-conducting material such as clothing or boxes without damaging human tissue or DNA.

    Daniel Mittleman, a professor of engineering at Brown University, said the development was “a very innovative piece of work, and it potentially has a lot of impact.” Mittleman, who is the vice chair of the International Society for Infrared Millimeter and Terahertz Waves, said scientists still have work to do before the terahertz band can begin to be used in everyday devices, but the developments are promising.

    “It is a very big puzzle with many pieces, and this is just one, but it is a very important one,” said Mittleman, who is familiar with the work but had no role in it.

    On the terahertz-generation end, much of the challenge is creating a wide range of wavelengths within the terahertz band, particularly in a microchip. The researchers realized they could overcome the problem by generating multiple wavelengths on the chip. They then used precise timing to combine these wavelengths and create very sharp terahertz pulses.
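
    The idea of combining several phase-locked tones to get a sharp pulse is classic Fourier synthesis: the more evenly spaced tones are locked together, the narrower the repeating burst. A sketch with an assumed tone count and spacing (the chip’s actual parameters are in the papers):

        # Fourier-synthesis intuition: N phase-locked tones -> sharp pulse train
        import numpy as np

        f0 = 50e9                              # tone spacing: 50 GHz (assumed)
        N = 8                                  # number of locked tones (assumed)
        t = np.linspace(-10e-12, 10e-12, 4001)  # 20 ps window around t = 0

        field = sum(np.cos(2 * np.pi * (k + 1) * f0 * t) for k in range(N))
        intensity = field ** 2                 # peaks at t = 0, where tones align

        mask = intensity > intensity.max() / 2
        lo = hi = len(t) // 2                  # walk outward from the central peak
        while lo > 0 and mask[lo - 1]:
            lo -= 1
        while hi < len(t) - 1 and mask[hi + 1]:
            hi += 1
        print(f"central burst ~ {(t[hi] - t[lo]) * 1e12:.2f} ps wide, "
              f"repeating every {1 / f0 * 1e12:.0f} ps; more tones -> sharper pulse")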

    In an article published Dec. 14 in the IEEE Journal of Solid-State Circuits, the researchers explained how they created a chip to generate the terahertz waves. The next step, the researchers said, is to extend the work farther along the terahertz band. “Right now we are working with the lower part of the terahertz band,” said Xue Wu, a Princeton doctoral student in electrical engineering and an author on both papers.

    “What can you do with a billion transistors operating at terahertz frequencies?” Sengupta asked. “Only by re-imagining these complex electromagnetic interactions from fundamental principles can we invent game-changing new technology.”

    The paper “On-chip THz spectroscope exploiting electromagnetic scattering with multi-port antenna” was published Sept. 2, and the paper “Dynamic waveform shaping with picosecond time widths” was published Dec. 14, both in the IEEE Journal of Solid-State Circuits. The research was supported in part by the National Science Foundation’s Division of Electrical, Communications and Cyber Systems (grant nos. ECCS-1408490 and ECCS-1509560).

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    About Princeton: Overview

    Princeton University is a vibrant community of scholarship and learning that stands in the nation’s service and in the service of all nations. Chartered in 1746, Princeton is the fourth-oldest college in the United States. Princeton is an independent, coeducational, nondenominational institution that provides undergraduate and graduate instruction in the humanities, social sciences, natural sciences and engineering.

    As a world-renowned research university, Princeton seeks to achieve the highest levels of distinction in the discovery and transmission of knowledge and understanding. At the same time, Princeton is distinctive among research universities in its commitment to undergraduate teaching.

    Today, more than 1,100 faculty members instruct approximately 5,200 undergraduate students and 2,600 graduate students. The University’s generous financial aid program ensures that talented students from all economic backgrounds can afford a Princeton education.


     
  • richardmitnick 12:20 pm on February 7, 2017 Permalink | Reply
    Tags: First-principles evolutionary algorithm called USPEX, Na2He electride, Physics, Scientists discover helium chemistry

    From EurekaAlert: “Scientists discover helium chemistry” 

    EurekAlert!

    Crystal structure of Na2He, resembling a three-dimensional checkerboard. The purple spheres represent sodium atoms, which are inside the green cubes that represent helium atoms. The red regions inside voids of the structure show areas where localized electron pairs reside. Credit: Illustration is provided courtesy of Artem R. Oganov.

    6-Feb-2017
    Media Contact
    Asya Shepunova
    shepunova@phystech.edu
    7-916-813-0267

    Although helium is the second most-abundant element (after hydrogen) in the universe, it doesn’t play well with others. It is a member of a family of seven elements called the noble gases, which are called that because of their chemical aloofness — they don’t easily form compounds with other elements. Helium, widely believed to be the most inert element, has no stable compounds under normal conditions.

    Now, an international team of researchers led by Skoltech’s Prof. Artem R. Oganov (also a professor at Stony Brook University and head of the Computational Materials Discovery laboratory at the Moscow Institute of Physics and Technology) has predicted two stable helium compounds — Na2He and Na2HeO. The scientists experimentally confirmed and theoretically explained the stability of Na2He. This work could hold clues about the chemistry occurring inside gas giant planets and possibly even stars, where helium is a major component. The work is published in Nature Chemistry.

    The authors of the study used a crystal structure-predicting tool, the first-principles evolutionary algorithm called USPEX, to conduct a systematic search for stable helium compounds. They predicted the existence of Na2He, which was then successfully synthesized in a diamond anvil cell (DAC) experiment performed at the Carnegie Institution for Science in Washington by Prof. Alexander F. Goncharov and his colleagues. The compound appeared at pressures of about 1.1 million times Earth’s atmospheric pressure and is predicted to be stable at least up to 10 million times that.

    “The compound that we discovered is very peculiar: helium atoms do not actually form any chemical bonds, yet their presence fundamentally changes chemical interactions between sodium atoms, forces electrons to localize inside cubic voids of the structure and makes this material insulating,” says Xiao Dong, the first author of this work, who was a long-term visiting student in Oganov’s laboratory at the time when this work was done.

    Na2He is what’s called an electride, a special type of ionic, salt-like crystal. It has a positively charged sublattice of sodium ions and another negatively charged sublattice formed of localized electron pairs. Because the electrons are strongly localized, this material is an insulator, meaning that it cannot conduct the free-flowing electrons that make up an electric current.

    The other predicted helium compound, Na2HeO, was found to be stable in the pressure range from 0.15 to 1.1 million atmospheres. It is also an ionic crystal with a structure similar to that of Na2He. However, in place of electron pairs, it has negatively charged oxygen in the form of O²⁻ anions.

    “This study shows how new surprising phenomena can be discovered by combination of powerful theoretical methods and state-of-the-art experiments. It shows that very weird chemical phenomena and compounds can emerge at extreme conditions, and the role of such phenomena inside planets needs to be explored,” says Oganov.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    EurekAlert!, the premier online news source focusing on science, health, medicine and technology, is a free service for reporters worldwide.

    Since 1996, EurekAlert! has served as the leading destination for scientific organizations seeking to disseminate news to reporters and the public. Today, thousands of reporters around the globe rely on EurekAlert! as a source of ideas, background information, and advance word on breaking news stories.

    More than 1,000 peer-reviewed journals, universities, medical centers, government agencies and public relations firms have used EurekAlert! to distribute their news. EurekAlert! is an authoritative and comprehensive research news source for journalists all over the world.

     
  • richardmitnick 3:41 pm on February 4, 2017 Permalink | Reply
    Tags: “Giant acceleration of diffusion” or GAD, Brownian motion, Physics

    From Brown: “Research pushes concept of entropy out of kilter” 

    Brown University

    [THIS POST IS DEDICATED TO EBM, READY TO ROCK THE CAMPUS]

    February 2, 2017
    Kevin Stacey
    kevin_stacey@brown.edu
    401-863-3766

    Entropy, the measure of disorder in a physical system, is something that physicists understand well when systems are at equilibrium, meaning there’s no external force throwing things out of kilter. But new research by Brown University physicists takes the idea of entropy out of its equilibrium comfort zone.

    The research, published in Physical Review Letters, describes an experiment in which the emergence of a non-equilibrium phenomenon actually requires an entropic assist.

    DNA drag race. Fluorescently stained DNA molecules make their way across a fluid channel pocked with tiny pits. The pits act as “entropic barriers.”
    Stein Lab / Brown University

    “It’s not clear what entropy even means when you’re moving away from equilibrium, so to have this interplay between a non-equilibrium phenomenon and an entropic state is surprising,” said Derek Stein, a Brown University physicist and co-author of the work. “It’s the tension between these two fundamental things that is so interesting.”

    The phenomenon the research investigated is known as “giant acceleration of diffusion,” or GAD. Diffusion is the term used to describe the extent to which small, jiggling particles spread out. The jiggling refers to Brownian motion, which describes the random movement of small particles as a result of collisions with surrounding particles. In 2001, a group of researchers developed a theory of how Brownian particles would diffuse in a system that was pushed out of equilibrium.

    Imagine jiggling particles arranged on a surface with undulating bumps like a washboard. Their jiggle isn’t quite big enough to enable the particles to jump over the bumps in the board, so they don’t diffuse much at all. However, if the board were tilted to some degree (in other words, moved out of equilibrium) the bumps would become easier to jump over in the downward-facing direction. As tilt begins to increase, some particles will jiggle free of the washboard barriers and run down the board, while others will stay put. In physics terms, the particles have become more diffusive — more spread-out — as the system is moved out of equilibrium. The GAD theory quantifies this diffusivity effect and predicts that as tilt starts to increase, diffusivity accelerates. When the tilt passes the point where all the particles are able to jiggle free and move down the washboard, then diffusivity decreases again.
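
    The washboard picture translates directly into an overdamped Langevin simulation. The sketch below uses illustrative parameters (not those of the DNA experiment) for particles in U(x) = A·cos(2πx/L) − F·x; near the critical tilt F_c = 2πA/L, the ensemble spreads far faster than free diffusion, which is the GAD effect:

        # Overdamped Langevin sketch of "giant acceleration of diffusion".
        # Illustrative parameters only; D_eff is read off the ensemble spread.
        import numpy as np

        rng = np.random.default_rng(0)
        A, L, kT, gamma = 1.0, 1.0, 0.1, 1.0     # barrier, period, noise, drag
        dt, steps, n = 1e-3, 40000, 2000         # time step, duration, particles
        Fc = 2 * np.pi * A / L                   # critical tilt: barriers vanish

        def effective_D(F):
            x = np.full(n, 0.5 * L)              # start near a potential minimum
            for _ in range(steps):
                force = (2 * np.pi * A / L) * np.sin(2 * np.pi * x / L) + F
                noise = np.sqrt(2 * kT / gamma * dt) * rng.standard_normal(n)
                x += force / gamma * dt + noise
            return x.var() / (2 * steps * dt)    # D_eff from ensemble variance

        D_free = kT / gamma                      # Einstein free-diffusion value
        for F in (0.0, 0.5 * Fc, Fc, 1.5 * Fc):
            print(f"F/Fc = {F / Fc:.1f}: D_eff/D_free = {effective_D(F) / D_free:.2f}")

    With no tilt the particles barely hop at all, near the critical tilt the diffusivity shoots past the free value, and beyond it the washboard ceases to matter, which is the peaked behaviour the GAD theory predicts.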

    The theory is important, Stein says, because it’s one of only a few attempts to make solid predictions about how systems behave away from equilibrium. It’s been tested in a few other settings and has been found to make accurate predictions.

    But Stein and his team wanted to test the theory in an unfamiliar setting — one that introduces entropy into the mix.

    For the experiment, Stein and his colleagues placed DNA strands into nanofluidic channels — essentially, tiny fluid-filled hallways through which the molecules could travel. The channels, however, were lined with nanopits — tiny rectangular depressions that create deep spots within the otherwise narrow channels. At equilibrium, DNA molecules tend to arrange themselves in disordered, spaghetti-like balls. As a result, when a molecule finds its way into a nanopit, where it has more room to form a disordered ball, it tends to stay stuck there. The pits can be seen as being somewhat like the dips between bumps on the theoretical GAD washboard, but with a critical difference: the only thing actually holding the molecule in the pit is entropy.

    “This molecule is randomly jiggling around in the pit — randomly selecting different configurations to be in — and the number of possible configurations is a measure of the molecule’s entropy,” Stein explained. “It could, at some point, land on a configuration that’s thin enough to fit into the channel outside the pit, which would allow it to move from one pit to another. But that’s unlikely because there are so many more shapes that don’t go through than shapes that do. So the pit becomes an ‘entropic barrier.’”

    Stein and his colleagues wanted to see if the non-equilibrium GAD dynamic would still emerge in a system where the barriers were entropic. They used a pump to apply pressure to the nanofluidic channels, pushing them out of equilibrium. They then measured the speeds of each molecule to see if GAD emerged. What they saw was largely in keeping with the GAD theory. As the pressure increased toward a critical point, the diffusivity of the molecules increased — meaning some molecules zipped across the channel while others stayed stuck in their pits.

    “It wasn’t at all clear how this experiment would come out,” Stein said. “This is a non-equilibrium phenomenon that requires barriers, but our barriers are entropic and we don’t understand entropy away from equilibrium.”


    Anastasios Matzavinos, a professor of applied math at Brown, developed computer simulations of the experiment to help understand the forces at play.

    The fact that the barriers remained raises interesting questions about the nature of entropy, Stein says.

    “Non-equilibrium and entropy are two concepts that are kind of at odds, but we show a situation in which one depends on the other,” he said. “So what’s the guiding principle that tells what the tradeoff is between the two? The answer is: We don’t have one, but maybe experiments like this can start to give us a window into that.”

    In addition to the more profound implications, there may also be practical applications for the findings, Stein says. The researchers showed that they could estimate the tiny piconewton forces pushing the DNA forward just by analyzing the molecules’ motion. For reference, one newton of force is roughly the weight of an average apple. A piconewton is one trillionth of that.

    The experiment also showed that, with the right amount of pressure, the diffusivity of the DNA molecules was increased by a factor of 15. So a similar technique could be useful in quickly making mixtures. If such a technique were developed to take advantage of GAD, it would be a first, Stein says.

    “No one has ever harnessed a non-equilibrium phenomenon for anything like that,” he said. “So that would certainly be an interesting possibility.”

    The work was led by Stein’s graduate student Daniel Kim. Co-authors were Clark Bowman, Jackson T. Del Bonis-O’Donnell and Anastasios Matzavinos, all from Brown. The work was supported by the National Science Foundation.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition
    Welcome to Brown

    Located in historic Providence, Rhode Island and founded in 1764, Brown University is the seventh-oldest college in the United States. Brown is an independent, coeducational Ivy League institution comprising undergraduate and graduate programs, plus the Alpert Medical School, School of Public Health, School of Engineering, and the School of Professional Studies.

    With its talented and motivated student body and accomplished faculty, Brown is a leading research university that maintains a particular commitment to exceptional undergraduate instruction.

    Brown’s vibrant, diverse community consists of 6,000 undergraduates, 2,000 graduate students, 400 medical school students, more than 5,000 summer, visiting and online students, and nearly 700 faculty members. Brown students come from all 50 states and more than 100 countries.

    Undergraduates pursue bachelor’s degrees in more than 70 concentrations, ranging from Egyptology to cognitive neuroscience. Anything’s possible at Brown—the university’s commitment to undergraduate freedom means students must take responsibility as architects of their courses of study.

     