
  • richardmitnick 4:41 pm on September 22, 2017 Permalink | Reply
Tags: When a Star and a Binary Meet

    From AAS NOVA: “When a Star and a Binary Meet”

    AASNOVA

    American Astronomical Society

    22 September 2017
    Susanna Kohler

    What happens when stars interact in dense environments, such as globular clusters like the one pictured here? [HST/NASA/ESA]

    What happens in the extreme environments of globular clusters when a star and a binary system meet? A team of scientists has new ideas about how these objects can deform, change their paths, spiral around each other, and merge.

    Getting to Know Your Neighbors

    Two simulations of the interaction of a white-dwarf–compact-object binary with a single incoming compact object (progressing from left to right). When tides are not included (bottom panel), the system interacts chaotically for a while before the single compact object is ejected and the binary system leaves on a slightly modified orbit. When tides are included (top panel), the chaotic interactions eventually result in the tidal inspiral and merger of the binary (labeled in the top diagram and shown in detail in the inset). [Samsing et al. 2017]

    Stars living in dense environments, like globular clusters, experience very different lives than those in the solar neighborhood. In these extreme environments, close encounters are the norm — and this can lead to a variety of interesting interactions between the stars and systems of stars that encounter each other.

    One common type of meeting is that of a single star with a binary star system. Studies of such interactions often treat all three bodies as point masses, examining outcomes like:

    1. All three objects are mutually unbound by the interaction, resulting in three single objects.
    2. A flyby encounter occurs, in which the binary survives the encounter but its orbit becomes modified by the third star.
    3. An exchange occurs, in which the single star swaps spots with one of the binary stars and ejects it from the system.
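As a rough illustration of these three point-mass outcomes, the end state of an encounter can be read off from the pairwise orbital energies of the bodies once they separate. The sketch below is a toy classification, not the machinery used in actual scattering studies:

```python
# Toy end-state classifier for a binary-single encounter between three
# point masses (illustrative only; real studies integrate the chaotic
# three-body problem numerically). Units with G = 1.
import itertools

G = 1.0

def pair_energy(m1, m2, r1, r2, v1, v2):
    """Orbital energy of a two-body pair; negative means bound."""
    dr = [a - b for a, b in zip(r1, r2)]
    dv = [a - b for a, b in zip(v1, v2)]
    sep = sum(x * x for x in dr) ** 0.5
    v2rel = sum(x * x for x in dv)
    mu = m1 * m2 / (m1 + m2)              # reduced mass
    return 0.5 * mu * v2rel - G * m1 * m2 / sep

def classify(masses, pos, vel, original_pair=(0, 1)):
    """Map the final state to 'ionization', 'flyby', or 'exchange'."""
    bound = [p for p in itertools.combinations(range(3), 2)
             if pair_energy(masses[p[0]], masses[p[1]], pos[p[0]],
                            pos[p[1]], vel[p[0]], vel[p[1]]) < 0.0]
    if not bound:
        return "ionization"               # outcome 1: three singles
    if set(bound[0]) == set(original_pair):
        return "flyby"                    # outcome 2: binary survives
    return "exchange"                     # outcome 3: partner swapped

# A tight binary (bodies 0 and 1) with the intruder (body 2) escaping:
outcome = classify([1.0, 1.0, 1.0],
                   [(-0.5, 0.0, 0.0), (0.5, 0.0, 0.0), (100.0, 0.0, 0.0)],
                   [(0.0, 0.707, 0.0), (0.0, -0.707, 0.0), (10.0, 0.0, 0.0)])
# -> "flyby"
```

Here a surviving bound pair made of the original binary members signals a flyby, a bound pair containing the intruder signals an exchange, and no bound pair at all signals ionization.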

    Complexities of Extended Objects

    But what if you treat the bodies not as point masses, but as extended objects with actual radii (as is true in real life)? Then there are additional complexities, such as collisions when the stars’ radii overlap, general relativistic effects when the stars pass very near one another, and tidal oscillations as gravitational forces stretch the stars out during a close passage and then release afterward.

    In a recently published study led by Johan Samsing (an Einstein Fellow at Princeton University), the authors explore how these complexities change the behavior of binary-single interactions in the centers of dense star clusters.

    One example — again in the case of a white-dwarf–compact-object binary interacting with a single compact object — of the cross sections for different types of interactions. Exchanges (triangles) are generally most common, and direct collisions (circles) occur frequently, but tidal inspirals (pluses) can occur with similar frequency in such systems. Inspirals due to energy loss to gravitational waves (crosses) can occur as well. [Samsing et al. 2017]

    How Tides Change Things

    Using numerical simulations with an N-body code, and following up with analytic arguments, Samsing and collaborators show that the biggest change when they include effects such as tides is a new outcome that sometimes results from the chaotic evolution of the triple interaction: tidal inspirals.

    Tidal inspirals occur when a close passage creates tidal oscillations in a star, draining energy from the binary orbit. Under the right conditions, the loss of energy will lead to the stars’ inspiral, eventually resulting in a merger. This new channel for mergers — similar to mergers due to energy lost to gravitational waves — can occur even more frequently than collisions in some systems.
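The bookkeeping behind a tidal inspiral is simple to sketch: the binary’s orbital energy is E = −G m₁m₂/(2a), so any energy siphoned into tidal oscillations makes E more negative and the semi-major axis a smaller. The toy loop below assumes a fixed fractional energy loss per pericentre passage; that loss model is a placeholder (real tidal losses depend steeply on the pericentre distance), not the prescription of Samsing et al.:

```python
# Sketch: draining orbital energy at each pericentre passage shrinks the
# orbit until the components touch. The fixed fractional loss per
# passage is an illustrative assumption.
G = 6.674e-8          # cgs units
M_sun = 1.989e33      # g
R_sun = 6.957e10      # cm

def orbital_energy(m1, m2, a):
    """Two-body orbital energy, E = -G m1 m2 / (2 a), in erg."""
    return -G * m1 * m2 / (2.0 * a)

def passages_to_merge(a0, merge_sep, loss_frac=0.01):
    """Pericentre passages until a < merge_sep, if each passage grows
    |E| by loss_frac (so a shrinks by the factor 1 + loss_frac)."""
    a, n = a0, 0
    while a > merge_sep:
        a /= (1.0 + loss_frac)
        n += 1
    return n

# Toy white dwarf (0.6 Msun) + compact object (1.4 Msun), a = 1 Rsun:
E0 = orbital_energy(0.6 * M_sun, 1.4 * M_sun, R_sun)   # negative: bound
n = passages_to_merge(R_sun, 0.02 * R_sun)             # roughly 400 passages
```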

    Samsing and collaborators demonstrate that tidal inspirals occur more commonly for widely separated binaries and small-radius objects. Highly eccentric white-dwarf–neutron-star mergers, for example, can be dominated by tidal inspirals.

    The authors point out that this interesting population of eccentric compact binaries likely results in unique electromagnetic and gravitational-wave signatures — which suggests that further studies of these systems are important for better understanding what we can expect to observe when stars encounter each other in dense stellar systems.
    Citation

    Johan Samsing et al 2017 ApJ 846 36. doi:10.3847/1538-4357/aa7e32

    Related Journal Articles
    Further references, with links, are given in the full article.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition


    AAS Mission and Vision Statement

    The mission of the American Astronomical Society is to enhance and share humanity’s scientific understanding of the Universe.

    The Society, through its publications, disseminates and archives the results of astronomical research. The Society also communicates and explains our understanding of the universe to the public.
    The Society facilitates and strengthens the interactions among members through professional meetings and other means. The Society supports member divisions representing specialized research and astronomical interests.
    The Society represents the goals of its community of members to the nation and the world. The Society also works with other scientific and educational societies to promote the advancement of science.
    The Society, through its members, trains, mentors and supports the next generation of astronomers. The Society supports and promotes increased participation of historically underrepresented groups in astronomy.
    The Society assists its members to develop their skills in the fields of education and public outreach at all levels. The Society promotes broad interest in astronomy, which enhances science literacy and leads many to careers in science and engineering.

    Adopted June 7, 2009

     
  • richardmitnick 4:23 pm on September 22, 2017 Permalink | Reply
Tags: A mini-halo is a faint diffuse region of radio emission that surrounds a cluster of galaxies, Nature of Galaxy Cluster Mini-Halos

    From CfA: “Nature of Galaxy Cluster Mini-Halos” 

    Harvard Smithsonian Center for Astrophysics



    A galaxy cluster mini-halo as seen around the galaxy NGC 1275 in the radio, with its main structures labeled: the northern extension, the two eastern spurs, the concave edge to the south, the south-western edge and a plume of emission to the south-south-west. Astronomers used radio and X-ray data to conclude that mini-halos, rather than being simple structures resulting from turbulence, are actually the result of multiple processes. Gendron-Marsolais et al.

    A mini-halo is a faint, diffuse region of radio emission that surrounds a cluster of galaxies. So far about thirty of these cluster mini-halos have been detected via their X-ray and radio emission, the result of radiation from electrons in the ionized gas, including one mini-halo in the nearby Perseus cluster of galaxies. These electrons are thought to arise from activity around a supermassive black hole at a galactic nucleus, which injects streams of particles into the intracluster medium and which also produces turbulence and shocks. One issue puzzling astronomers is that such electrons should rapidly lose their energy, faster than the time it takes for them to reach the mini-halo regions. Suggested solutions include processes in which turbulence reaccelerates the electrons, and in which cosmic rays generate new ones.
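The timescale mismatch behind this puzzle is easy to see with a back-of-envelope estimate: radio-emitting electrons radiate away their energy (via synchrotron and inverse-Compton losses) much faster than they can travel across the halo. The numbers below (field strength, electron Lorentz factor, halo size, transport speed) are illustrative assumptions, not measured Perseus values:

```python
# Back-of-envelope illustration of the mini-halo puzzle: the radiative
# lifetime of the radio-emitting electrons is far shorter than the time
# they need to cross the halo. All input values are illustrative.
import math

SIGMA_T = 6.652e-25       # Thomson cross-section, cm^2
M_E_C2 = 8.187e-7         # electron rest energy, erg
C = 2.998e10              # speed of light, cm/s
U_CMB = 4.2e-13           # CMB photon energy density, erg/cm^3 (z ~ 0)
YEAR = 3.156e7            # seconds per year

def cooling_time_yr(gamma, B_gauss):
    """Synchrotron + inverse-Compton lifetime of an electron."""
    u_b = B_gauss**2 / (8.0 * math.pi)        # magnetic energy density
    power = (4.0 / 3.0) * SIGMA_T * C * gamma**2 * (u_b + U_CMB)
    return gamma * M_E_C2 / power / YEAR

def crossing_time_yr(size_kpc, speed_km_s):
    """Time to traverse the halo at a given transport speed."""
    return size_kpc * 3.086e21 / (speed_km_s * 1.0e5) / YEAR

t_cool = cooling_time_yr(gamma=1.0e4, B_gauss=1.0e-5)    # ~10 uG field
t_cross = crossing_time_yr(size_kpc=100.0, speed_km_s=100.0)
# t_cool comes out around 10^7 yr, t_cross around 10^9 yr: the electrons
# fade long before reaching the mini-halo edge, so they must be
# re-accelerated in place (e.g. by turbulence) or freshly generated.
```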

    CfA astronomer Reinout van Weeren and his colleagues used the radio Karl G. Jansky Very Large Array (JVLA) to obtain the first detailed study of the structure of the mini-halo in Perseus, and to compare it with Chandra X-Ray images.

    NRAO/Karl G. Jansky VLA, on the Plains of San Agustin fifty miles west of Socorro, NM, USA

    NASA/Chandra Telescope

    They find that the radio emission comes primarily from gas behind a cold front as would be expected if the gas is sloshing around within the cluster as particles are re-accelerated. They also detect unexpected, filamentary structures that seem to be associated with edges of X-ray features. The scientists conclude that mini-halos are not simply diffuse structures produced by a single process, but reflect a variety of structures and processes including turbulent re-acceleration of electrons, relativistic activity from the black hole jets, and also some magnetic field effects. Not least, the results demonstrate the sensitivity of the new JVLA and the need to obtain such sensitive images to understand the mini-halo phenomenon.

    Reference(s):

    Deep 230–470 MHz VLA Observations of the Mini-Halo in the Perseus Cluster, M. Gendron-Marsolais, J. Hlavacek-Larrondo, R. J. van Weeren, T. Clarke, A. C. Fabian, H. T. Intema, G. B. Taylor, K. M. Blundell, and J. S. Sanders, MNRAS 469, 3872, 2017.

    See the full article here.


    The Center for Astrophysics combines the resources and research facilities of the Harvard College Observatory and the Smithsonian Astrophysical Observatory under a single director to pursue studies of those basic physical processes that determine the nature and evolution of the universe. The Smithsonian Astrophysical Observatory (SAO) is a bureau of the Smithsonian Institution, founded in 1890. The Harvard College Observatory (HCO), founded in 1839, is a research institution of the Faculty of Arts and Sciences, Harvard University, and provides facilities and substantial other support for teaching activities of the Department of Astronomy.

     
  • richardmitnick 3:36 pm on September 22, 2017 Permalink | Reply
Tags: ATLAS hunts for new physics with dibosons

    From CERN Courier: “ATLAS hunts for new physics with dibosons” 


    CERN Courier

    Sep 22, 2017

    WZ data

    Beyond the Standard Model of particle physics (SM), crucial open questions remain such as the nature of dark matter, the overabundance of matter compared to antimatter in the universe, and also the mass scale of the scalar sector (what makes the Higgs boson so light?). Theorists have extended the SM with new symmetries or forces that address these questions, and many such extensions predict new resonances that can decay into a pair of bosons (diboson), for example: VV, Vh, Vγ and γγ, where V stands for a weak boson (W and Z), h for the Higgs boson, and γ is a photon.

    The ATLAS collaboration has a broad search programme for diboson resonances, and the most recent results using 36 fb⁻¹ of proton–proton collision data at the LHC taken at a centre-of-mass energy of 13 TeV in 2015 and 2016 have now been released. Six different final states characterised by different boson decay modes were considered in searches for a VV resonance: 4ℓ, ℓℓνν, ℓℓqq, ℓνqq, ννqq and qqqq, where ℓ, ν and q stand for charged leptons (electrons and muons), neutrinos and quarks, respectively. For the Vh resonance search, the dominant Higgs boson decay into a pair of b-quarks (branching fraction of 58%) was exploited together with four different V decays leading to ℓℓbb, ℓνbb, ννbb and qqbb final states. A Zγ resonance was sought in final states with two leptons and a photon.

    A new resonance would appear as an excess (bump) over the smoothly distributed SM background in the invariant mass distribution reconstructed from the final-state particles. The left figure shows the observed WZ mass distribution in the qqqq channel together with simulations of some example signals. An important key to probing very high-mass signals is identifying high-momentum hadronically decaying V and h bosons. ATLAS developed a new technique to reconstruct the invariant mass of such bosons by combining information from the calorimeters and the central tracking detectors. The resulting improved mass resolution for reconstructed V and h bosons increased the sensitivity to very heavy signals.
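The reconstruction step underlying the bump hunt can be sketched in a few lines: sum the four-momenta of the final-state objects and take m² = E² − |p|² in natural units. The example four-vectors below are invented for illustration, not ATLAS data:

```python
# Minimal sketch of invariant-mass reconstruction from final-state
# four-momenta (natural units, GeV). Example inputs are made up.
import math

def invariant_mass(four_vectors):
    """four_vectors: iterable of (E, px, py, pz) tuples in GeV."""
    E = sum(v[0] for v in four_vectors)
    px = sum(v[1] for v in four_vectors)
    py = sum(v[2] for v in four_vectors)
    pz = sum(v[3] for v in four_vectors)
    m2 = E**2 - (px**2 + py**2 + pz**2)
    return math.sqrt(max(m2, 0.0))   # guard against round-off

# Two back-to-back, effectively massless boson "jets" of 1 TeV each
# reconstruct to a 2 TeV resonance candidate:
m = invariant_mass([(1000.0, 1000.0, 0.0, 0.0),
                    (1000.0, -1000.0, 0.0, 0.0)])
# -> 2000.0 GeV
```

A genuine resonance would pile such masses up in a narrow peak, while the SM background stays smooth.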

    No evidence for a new resonance was observed in these searches, allowing ATLAS to set stringent exclusion limits. For example, a graviton signal predicted in a model with extra spatial dimensions was excluded up to masses of 4 TeV, while heavy weak-boson-like resonances (as predicted in composite Higgs boson models) decaying to WZ bosons are excluded for masses up to 3.3 TeV. Heavier Higgs partners can be excluded up to masses of about 350 GeV, assuming specific model parameters.

    See the full article here.

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

     
  • richardmitnick 3:19 pm on September 22, 2017 Permalink | Reply

    From ORNL via CERN Courier: “Miniature detector first to spot coherent neutrino-nucleus scattering” 


    Oak Ridge National Laboratory

    CERN Courier

    Detector placement. No image credit

    The COHERENT collaboration at Oak Ridge National Laboratory (ORNL) in the US has detected coherent elastic scattering of neutrinos off nuclei for the first time. The ability to harness this process, predicted 43 years ago, offers new ways to study neutrino properties and could drastically reduce the scale of neutrino detectors.

    Neutrinos famously interact very weakly, requiring very large volumes of active material to detect their presence. Typically, neutrinos interact with individual protons or neutrons inside a nucleus, but coherent elastic neutrino-nucleus scattering (CEνNS) occurs when a neutrino interacts with an entire nucleus. For this to occur, the momentum exchanged must remain small compared with the inverse of the nuclear size. This restricts the process to neutrino energies below a few tens of MeV, in contrast to the charged-current interactions by which neutrinos are usually detected. The signature of CEνNS is a low-energy nuclear recoil with all nucleon wavefunctions remaining in phase, but until now the difficulty of detecting these low-energy nuclear recoils had prevented observation of CEνNS – despite the predicted cross-section for this process being the largest of all low-energy neutrino couplings.
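A rough order-of-magnitude sketch shows why the coherent cross-section is so large: at leading order the total cross-section scales as σ ≈ G_F² E² Q_w²/(4π), where the weak nuclear charge Q_w is close to the neutron number N because the proton couplings nearly cancel. The simplified calculation below treats the nucleus as a point (no form factor), so it overstates the rate somewhat at higher energies:

```python
# Leading-order CEvNS total cross-section for a point nucleus,
# sigma ~ G_F^2 E^2 Q_w^2 / (4 pi). Form-factor and recoil-threshold
# effects are deliberately ignored in this sketch.
import math

G_F = 1.166e-5        # Fermi constant, GeV^-2
HBARC2 = 3.894e-28    # (hbar c)^2 conversion, cm^2 GeV^2
SIN2_THETA_W = 0.231  # weak mixing angle

def cevns_sigma_cm2(E_nu_gev, Z, N):
    """Total coherent cross-section in cm^2."""
    q_w = N - (1.0 - 4.0 * SIN2_THETA_W) * Z   # weak nuclear charge
    return G_F**2 * E_nu_gev**2 * q_w**2 / (4.0 * math.pi) * HBARC2

# A 30 MeV neutrino on caesium (Z=55, N=78) versus a single neutron:
sigma_cs = cevns_sigma_cm2(0.030, 55, 78)   # ~2e-38 cm^2
sigma_n = cevns_sigma_cm2(0.030, 0, 1)
enhancement = sigma_cs / sigma_n            # coherent ~N^2 boost
```

The several-thousand-fold enhancement over a single nucleon is exactly why a 14.5 kg crystal can do the work of a much larger conventional detector.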

    The COHERENT team, comprising 80 researchers from 19 institutions, used ORNL’s Spallation Neutron Source (SNS), which generates the most intense pulsed neutron beams in the world while simultaneously creating a significant yield of low-energy neutrinos.

    ORNL Spallation Neutron Source

    Approximately 5 × 10²⁰ protons are delivered per day, each returning roughly 0.08 isotropically emitted neutrinos per flavour. The researchers placed a detector, a caesium-iodide scintillator crystal doped with sodium, 20 m from the neutrino source with shielding to reduce background events associated with the neutron-induced nuclear recoils produced from the SNS. The results favour the presence of CEνNS over its absence at the 6.7σ level, with 134±22 events observed versus 173±48 predicted.

    Crucially, the result was achieved using the world’s smallest neutrino detector, with a mass of 14.5 kg. This is a consequence of the large nuclear mass of caesium and iodine, which results in a large CEνNS cross-section.

    The intense scintillation of this material for low-energy nuclear recoils, combined with the large neutrino flux of the SNS, also contributed to the success of the measurement. In effect, CEνNS allows the same detection rates as conventional neutrino detectors that are 100 times more massive.

    “It is a nearly ideal detector choice for coherent neutrino scattering,” says lead designer Juan Collar of the University of Chicago. “However, other new coherent neutrino-detector designs are appearing over the horizon that look extraordinarily promising in order to further reduce detector mass, truly realising technological applications such as reactor monitoring.”

    Yoshi Uchida of Imperial College London, who was not involved in the study, says that detecting neutrinos via the neutral-current process as opposed to the usual charged-current process is a great advantage because it is “blind” to the type of neutrino being produced and is sensitive at low energies. “So in combination with other types of detection, it could tell us a lot about a particular neutrino source of interest.” However, he adds that the SNS set-up is very specific and that, outside such ideal conditions, it might be difficult to scale a similar detector in a way that would be of practical use. “The fact that the COHERENT collaboration already has several other target nuclei (and detection methods) being used in their set-up means there will be more to come on this subject in the near future.”

    See the full article here.


    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


     
  • richardmitnick 2:46 pm on September 22, 2017 Permalink | Reply
Tags: Other CERN Linacs

    From CERN Courier: “Injecting new life into the LHC” 

    CERN Courier

    Sep 22, 2017

    Malika Meddahi
    Giovanni Rumolo

    Beam transfer magnets. No image credit

    The Large Hadron Collider (LHC) is the most famous and powerful of all CERN’s machines, colliding intense beams of protons at an energy of 13 TeV.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    But its success relies on a series of smaller machines in CERN’s accelerator complex that serve it. The LHC’s proton injectors have already been providing beams with characteristics exceeding the LHC’s design specifications. This decisively contributed to the excellent performance of the 2010–2013 LHC physics operation and, since 2015, has allowed CERN to push the machine beyond its nominal beam performance.

    Built between 1959 and 1976, the CERN injector complex accelerates proton beams to a kinetic energy of 450 GeV. It does this via a succession of accelerators: a linear accelerator called Linac 2 followed by three synchrotrons – the Proton Synchrotron Booster (PSB), the Proton Synchrotron (PS) and the Super Proton Synchrotron (SPS).

    CERN Linac 2. No image credit

    CERN The Proton Synchrotron Booster

    CERN Proton Synchrotron

    CERN Super Proton Synchrotron

    The complex also provides the LHC with ion beams, which are first accelerated through a linear accelerator called Linac 3 and the Low Energy Ion Ring (LEIR) synchrotron before being injected into the PS and the SPS.

    CERN Linac 3

    CERN Low Energy Ion Ring (LEIR) synchrotron

    The CERN injectors, besides providing beams to the LHC, also serve a large number of fixed-target experiments at CERN – including the ISOLDE radioactive-beam facility and many others.

    CERN ISOLDE

    Part of the LHC’s success lies in the flexibility of the injectors to produce various beam parameters, such as the intensity, the spacing between proton bunches and the total number of bunches in a bunch train. This was clearly illustrated in 2016 when the LHC reached peak luminosity values 40% higher than the design value of 10³⁴ cm⁻² s⁻¹, although the number of bunches in the LHC was still about 27% below the maximum achievable. This gain was due to the production of a brighter beam with roughly the same intensity per bunch but in a beam envelope of just half the size.
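The arithmetic behind that gain can be sketched with the standard round-beam luminosity formula, L = f_rev n_b N²/(4π σ²), ignoring the crossing-angle reduction factor; the parameters below are rough design-era numbers used purely for illustration:

```python
# Round-beam luminosity sketch. The crossing-angle geometric factor is
# ignored, and all parameter values are rough illustrative numbers.
import math

def luminosity(f_rev_hz, n_bunches, protons_per_bunch, sigma_cm):
    """Instantaneous luminosity (cm^-2 s^-1) for round Gaussian beams."""
    return (f_rev_hz * n_bunches * protons_per_bunch**2
            / (4.0 * math.pi * sigma_cm**2))

# Design-like configuration: 11245 Hz revolution frequency, 2808 bunches,
# 1.15e11 protons per bunch, ~16.7 um transverse beam size at the IP.
design = luminosity(11245.0, 2808, 1.15e11, 16.7e-4)   # ~1e34 cm^-2 s^-1

# 2016-style running: ~27% fewer bunches, similar bunch intensity, but a
# transverse size ~30% smaller (emittance roughly halved). The N^2/sigma^2
# scaling more than makes up for the missing bunches.
brighter = luminosity(11245.0, 2050, 1.15e11, 0.7 * 16.7e-4)
gain = brighter / design   # comes out well above 1
```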

    Despite the excellent performance of today’s injectors, the beams produced are not sufficient to meet the very demanding proton beam parameters specified by the high-luminosity upgrade of the LHC (HL-LHC).

    Indeed, as of 2025, the HL-LHC aims to accumulate an integrated luminosity of around 250 fb⁻¹ per year, to be compared with the 40 fb⁻¹ achieved in 2016. For heavy-ion operations, the goals are just as challenging: with lead ions the objective is to obtain an integrated luminosity of 10 nb⁻¹ during four runs starting from 2021 (compared to the 2015 achievement of less than 1 nb⁻¹). This has demanded a significant upgrade programme that is now being implemented.

    Immense challenges

    To prepare the CERN accelerator complex for the immense challenges of the HL-LHC, the LHC Injectors Upgrade project (LIU) was launched in 2010. In addition to enabling the necessary proton and ion injector chains to deliver beams of ions and protons required for the HL-LHC, the LIU project must ensure the reliable operation and lifetime of the injectors throughout the HL-LHC era, which is expected to last until around 2035. Hence, the LIU project is also tasked with replacing ageing equipment (such as power supplies, magnets and radio-frequency cavities) and improving radioprotection measures such as shielding and ventilation. [See https://sciencesprings.wordpress.com/2017/09/21/from-cern-next-stop-the-superconducting-magnets-of-the-future/]

    One of the first challenges faced by the LIU team members was to define the beam-performance limitations of all the accelerators in the injector chain and identify the actions needed to overcome them by the required amount. Significant machine and simulation studies were carried out over a period of years, while functional and engineering specifications were prepared to provide clear guidelines to the equipment groups. This was followed by the production of the first hardware prototype devices and their installation in the machines for testing and, where possible, early exploitation.

    Significant progress has already been made concerning the production of ion beams. Thanks to the modifications in Linac 3 and LEIR implemented after 2015 and the intensive machine studies conducted within the LIU programme over the last three years, the excellent performance of the ion injector chain could be further improved in 2016 (figure 1). This enabled the recorded luminosity for the 2016 proton–lead run to exceed the target value by a factor of almost eight. The main remaining challenges for the ion beams will be to more than double the number of bunches in the LHC through complex RF manipulations in the SPS known as “momentum slip stacking”, as well as to guarantee continued and stable performance of the ion injector chain without constant expert monitoring.

    Along the proton injector chain, the higher-intensity beams within a comparatively small beam envelope required by the HL-LHC can only be demonstrated after the installation of all the LIU equipment during Long Shutdown 2 (LS2) in 2019–2020. The main installations feature: a new injection region, a new main power supply and RF system in the PSB; a new injection region and RF system to stabilise the future beams in the PS; an upgraded main RF system; and the shielding of vacuum flanges together with partial coating of the beam chambers in order to stabilise future beams against parasitic electromagnetic interaction and electron clouds in the SPS. Beam instrumentation, protection devices and beam dumps also need to be upgraded in all the machines to match the new beam parameters. The baseline goals of the LIU project to meet the challenging HL-LHC requirements are summarised in the panel (final page of feature).

    Execution phase

    Having defined, designed and endorsed all of the baseline items during the last seven years, the LIU project is presently in its execution phase. New hardware is being produced, installed and tested in the different machines. Civil-engineering work is proceeding for the buildings that will host the new PSB main power supply and the upgraded SPS RF equipment, and to prepare the area in which the new SPS internal beam dump will be located.

    The 86 m-long Linac 4, which will eventually replace Linac 2, is an essential component of the HL-LHC upgrade.

    CERN Linac 4

    The machine, based on newly developed technology, became operational at the end of 2016 following the successful completion of acceleration tests at its nominal energy of 160 MeV. It is presently undergoing an important reliability run that will be instrumental in reaching beams with characteristics matching the requirements of the LIU project and in achieving an operational availability higher than 95%, an essential level for the first link in the proton injector chain. On 26 October 2016, the first 160 MeV negative hydrogen-ion beam was successfully sent to the injection test stand, which operated until the beginning of April 2017 and demonstrated the correct functioning of this new and critical CERN injection system as well as of the related diagnostics and controls.

    The PSB upgrade has mostly completed the equipment needed for the injection of negative hydrogen ions from Linac 4 into the PSB and is progressing with the 2 GeV energy upgrade of the PSB rings and extraction, with a planned installation date of 2019–2020 during LS2. On the beam-physics side, studies have mainly focused on the deployment of the new wideband RF system, commissioning of beam diagnostics and investigation of space-charge effects. During the 2016–2017 technical stop, the principal LIU-related activities were the removal of a large volume of obsolete cables and the installation of new beam instrumentation (e.g. a prototype transverse size measurement device and turn-by-turn orbit measurement systems). The unused cables, which had been individually identified and labelled beforehand, could be safely removed from the machine to allow cables for the new LIU equipment to be pulled.

    The procurement, construction, installation and testing of upgrade items for the PS is also progressing. Some hardware, such as new corrector magnets and power supplies, a newly developed beam gas-ionisation monitor and new injection vacuum chambers to remove aperture limitations, was already installed during past technical stops. Mitigating anticipated longitudinal beam instabilities in the PS is essential for achieving the LIU baseline beam parameters. This requires that the parasitic electromagnetic interaction of the beam with the multiple RF systems has to be reduced and a new feedback system has to be deployed to keep the beam stable. Beam-dynamics studies will determine the present intensity reach of the PS and identify any remaining needs to comfortably achieve the value required for the HL-LHC. Improved schemes of bunch rotation are also under investigation to better match the beam extracted from the PS to the SPS RF system and thus limit the beam losses at injection energy in the SPS.

    In the SPS, the LIU deployment in the tunnel has begun in earnest, with the re-arrangement and improvement of the extraction kicker system, the start of civil engineering for the new beam-dump system in LSS5 and the shielding of vacuum flanges in 10 half-cells together with the amorphous carbon coating of the adjacent beam chambers (to mitigate against electron-cloud effects). In a notable first, eight dipole and 10 focusing quadrupole magnet chambers were amorphous carbon coated in-situ during the 2016–2017 technical stop, proving the industrialisation of this process (figure 2). The new overground RF building needed to accommodate the power amplifiers of the upgraded main RF system has been completed, while procurement and testing of the solid-state amplifiers has also commenced. The prototyping and engineering for the LIU beam-dump is in progress with the construction and installation of a new SPS beam-dump block, which will be able to cope with the higher beam intensities of the HL-LHC and minimise radiation issues.

    Regarding diagnostics, the development of beam-size measurement devices based on flying wire, gas ionisation and synchrotron radiation, all of which are part of the LIU programme, is already providing meaningful results (figure 3) addressing the challenges of measuring the operating high-intensity and high-brightness beams with high precision. From the machine performance and beam dynamics side, measurements in 2015–2016 made with the very high intensities available from the PS meant that new regimes were probed in terms of electron-cloud instabilities, RF power and losses at injection. More studies are planned in 2017–2018 to clearly identify a path for the mitigation of the injection losses when operating with higher beam currents.

    Looking forward to LS2

    The success of LIU in delivering beams with the desired parameters is the key to achieving the HL-LHC luminosity target. Without the LIU beams, all of the other necessary HL-LHC developments – including high-field triplet magnets [see above], crab cavities and new collimators – would only allow a fraction of the desired luminosity to be delivered to experiments.

    Whenever possible, LIU installation work is taking place during CERN’s regular year-end technical stops. But the great majority of the upgrade requires an extended machine stop and therefore will have to wait until LS2 for implementation. The duration of access to the different accelerators during LS2 is being defined and careful preparation is ongoing to manage the work on site, ensure safety and level the available resources among the different machines in the CERN accelerator complex. After all of the LIU upgrades are in place, beams will be commissioned with the newly installed systems. The LIU goals in terms of beam characteristics are, by definition, uncharted territory. Reaching them will require not only a high level of expertise, but also careful optimisation and extensive beam-physics and machine-development studies in all of CERN’s accelerators.

    See the full article here.


     
  • richardmitnick 1:35 pm on September 22, 2017 Permalink | Reply
Tags: Dauphas and his team looked at titanium in the shales over time, Geologists often look at a particular kind of rock called shales, If you fertilize the ocean with phosphorus life will bloom, Plate tectonics is believed to be needed to create felsic rock, Study suggests significant tectonic action was already taking place 3.5 billion years ago—about half a billion years earlier than currently thought, The flood of oxygen came from a surge of photosynthetic microorganisms - cyanobacteria, The titanium timeline suggests that the primary trigger of the surge of phosphorus was the change in the makeup of mafic rock over time, Tracing the path of metallic element titanium through the Earth’s crust across time

    From U Chicago: “Study suggests tectonic plates began moving half a billion years earlier than thought” 

    U Chicago bloc

    University of Chicago

    September 21, 2017
    Louise Lerner

    While previous studies had argued that Earth’s crust 3.5 billion years ago looked like these Hawaiian lavas, a new study led by UChicago scientists suggests by then much of it had already been transformed into lighter-colored felsic rock by plate tectonics.
    Photo by Basil Greber

    The Earth’s history is written in its elements, but as the tectonic plates slip and slide over and under each other over time, they muddy that evidence—and with it the secrets of why Earth can sustain life.

    A new study led by UChicago geochemists rearranges the picture of the early Earth by tracing the path of metallic element titanium through the Earth’s crust across time. The research, published Sept. 22 in Science, suggests significant tectonic action was already taking place 3.5 billion years ago—about half a billion years earlier than currently thought.

    The crust was once made of uniformly dark, magnesium- and iron-rich mafic minerals. But today the crust looks very different between land and ocean: The crust on land is now made largely of lighter-colored felsic rock, rich in silicon and aluminum. The point at which the two diverged is important, since the composition of crustal minerals affects the flow of nutrients available to the fledgling life struggling to survive on Earth.

    “This question has been discussed since geologists first started thinking about rocks,” said lead author Nicolas Dauphas, the Louis Block Professor and head of the Origins Laboratory in the Department of the Geophysical Sciences and the Enrico Fermi Institute. “This result is a surprise and certainly an upheaval in that discussion.”

    To reconstruct the crust changing over time, geologists often look at a particular kind of rock called shales, made up of tiny bits of other rocks and minerals that are carried by water into mud deposits and compressed into rock. The only problem is that scientists have to adjust the numbers to account for different rates of weathering and transport. “There are many things that can foul you up,” Dauphas said.

    To avoid this issue, Dauphas and his team looked at titanium in the shales over time. This element doesn’t dissolve in water and isn’t taken up by plants in nutrient cycles, so they thought the data would have fewer biases with which to contend.

    They crushed shale samples of different ages from around the world and measured the isotopic composition of the titanium each contained. The proportions of titanium isotopes present should shift as the source rock changes from mafic to felsic. Instead, the team saw little change over three and a half billion years, suggesting that the transition must have occurred before then.
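    The logic of such an isotope proxy can be sketched as a simple two-endmember mixing calculation. The sketch below is illustrative only: the endmember values and the shale measurement are made-up numbers, not the study's data, and real mixing calculations must also weight each endmember by its titanium concentration.

    ```python
    # Illustrative two-endmember mixing: estimate the felsic fraction of
    # eroded crust from the titanium isotope composition of a shale.
    # The delta-49Ti endmember values are placeholder assumptions, and
    # concentration weighting of the endmembers is deliberately ignored.

    DELTA_MAFIC = 0.0   # per mil, assumed mafic endmember
    DELTA_FELSIC = 0.5  # per mil, assumed felsic endmember

    def felsic_fraction(delta_shale: float) -> float:
        """Linear mixing: fraction of Ti sourced from felsic rock."""
        f = (delta_shale - DELTA_MAFIC) / (DELTA_FELSIC - DELTA_MAFIC)
        return min(max(f, 0.0), 1.0)  # clamp to the physical range [0, 1]

    # A hypothetical shale with an intermediate isotopic signature:
    print(felsic_fraction(0.35))  # -> 0.7 under these assumed endmembers
    ```

    The team's finding of "little change over three and a half billion years" corresponds, in this toy picture, to the inferred felsic fraction staying roughly constant across shales of all ages.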

    2
    These granite peaks are an example of felsic rock, created via plate tectonics. Photo by Basil Greber

    This also would mark the beginning of plate tectonics, since that process is believed to be needed to create felsic rock.

    “With a null response like that, seeing no change, it’s difficult to imagine an alternate explanation,” said Matouš Ptáček, a UChicago graduate student who co-authored the study.

    “Our results can also be used to track the average composition of the continental crust through time, allowing us to investigate the supply of nutrients to the oceans going back 3.5 billion years ago,” said Nicolas Greber, the first author of the paper, then a postdoctoral researcher at UChicago and now with the University of Geneva.

    Phosphorus leads to life

    The question about nutrients is important for our understanding of the circumstances around a mysterious but crucial turning point called the great oxygenation event. This is when oxygen started to emerge as an important constituent of Earth’s atmosphere, wreaking a massive change on the planet—and making it possible for multi-celled beings to evolve.

    The flood of oxygen came from a surge of photosynthetic microorganisms, and their work was in turn fostered by a surge of nutrients to the oceans, particularly phosphorus. “Phosphorus is the most important limiting nutrient in the modern ocean. If you fertilize the ocean with phosphorus, life will bloom,” Dauphas said.

    The titanium timeline suggests that the primary trigger of the surge of phosphorus was the change in the makeup of mafic rock over time. As the Earth cooled, the mafic rock coming out of volcanoes and underground melts became richer in phosphorus.

    “We’ve known for a long time that mafic rock changed over time, but what we didn’t know was that their contribution to the crust has stayed rather consistent,” Ptáček said.

    Other institutions on the study were the University of California-Riverside, University of Oregon-Eugene and the University of Johannesburg.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    U Chicago Campus

    An intellectual destination

    One of the world’s premier academic and research institutions, the University of Chicago has driven new ways of thinking since our 1890 founding. Today, UChicago is an intellectual destination that draws inspired scholars to our Hyde Park and international campuses, keeping UChicago at the nexus of ideas that challenge and change the world.

     
  • richardmitnick 1:09 pm on September 22, 2017 Permalink | Reply
    Tags: , , , BLNR-Board of Land and Natural Resources, , Process to secure a Conservation District Use Permit (CDUP) to build TMT on Maunakea continued,   

    From TMT: “Hawaii Board of Land & Natural Resources Hears Final Arguments in Conservation District Use Permit Contested Case” 

    Thirty Meter Telescope Banner

    Thirty Meter Telescope
    Thirty Meter Telescope

    09.21.2017
    Press Release

    [Hilo, HI – Sep. 20, 2017] The process to secure a Conservation District Use Permit (CDUP) to build TMT on Maunakea continued in Hawaii, as the state land board – also known as the Board of Land and Natural Resources (BLNR) – heard final oral arguments from all parties involved in the contested case.

    The hearing at the Grand Naniloa Hotel in Hilo on Hawaii Island allowed the Board to listen to both sides of the debate on whether Board members should issue a CDUP to the University of Hawaii – Hilo to allow construction of the Thirty Meter Telescope on Maunakea.

    After nearly five months of evidentiary hearings that ran from October 2016 to March 2017, contested case Hearings Officer and former Judge Riki May Amano released a 305-page report recommending the state land board issue the CDUP. As required by law, today’s hearing afforded the seven-member state land board the opportunity to hear directly from the 23 participating contested case parties before deciding on the permit.

    The proceedings were open to the public. Each of the 23 parties involved was given 15 minutes to make its case on whether the CDUP should be issued or rejected.

    The three parties in support of TMT – University of Hawaii at Hilo, TMT International Observatory, and the Native Hawaiian group PUEO (Perpetuating Unique Educational Opportunities) – were first to give their oral arguments, followed by the project opponents. Oral arguments were followed by rebuttals, along with follow-up questions by the state land board.

    BLNR will now review and take into consideration all of the arguments, as well as Judge Amano’s report before making their final decision on the CDUP.

    Following the hearing, TMT International Observatory Executive Director Ed Stone said:

    “This is an important day for TMT with the conclusion of the second contested case related to the Conservation District Use Permit needed for TMT to be built on Maunakea. We deeply appreciate the time and attention given by both the Board of Land & Natural Resources and Judge Riki May Amano in considering whether a state permit should be granted.

    “Everyone had the opportunity to be heard as part of the process, and we are hopeful that the Board will act quickly on its decision and that it will be a positive one for TMT. We thank all of our supporters and friends who have been with us during the hearing process over the past 10 years.”

    Background:

    All University of Hawaii-managed lands on Maunakea, including the site for TMT, are in a conservation district, which requires a Conservation District Use Permit approved by the BLNR. In April 2013, following a contested case hearing that took seven days over the course of two months, the BLNR issued a CDUP to the University of Hawaii at Hilo for the construction of TMT on Maunakea.

    In late 2015, the Hawaii Supreme Court invalidated the permit, noting that when the permit was initially granted, the Board had also approved a contested case hearing, along with a stay on construction pending the outcome of that hearing; in effect, the permit had been issued before the hearing was held. The Supreme Court returned the case to the Hawaii Circuit Court and instructed that a new contested case hearing be conducted before the Board.

    That second contested case got underway in October 2016. Following 44 days of testimony by 71 witnesses over five months, that hearing concluded in early March 2017.

    Over the last 10 years, TMT has followed the state’s laws, procedures and processes in its efforts to build TMT on Maunakea. More than 20 public hearings have been held since 2008. An environmental impact statement (EIS) was completed and approved. For the complete process, visit http://www.maunakeaandtmt.org.

    What’s Next:

    The BLNR will review all evidence and issue its decision. An exact timeframe is not known.

    TMT is also awaiting resolution on the state’s consent to the University of Hawaii’s sublease to the TMT International Observatory.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Near the center of Pasadena, California, a team of scientists, engineers, and project specialists is busily planning and designing what eventually will become the most advanced and powerful optical telescope on Earth. When completed later this decade, the Thirty Meter Telescope (TMT) will enable astronomers to study objects in our own solar system and stars throughout our Milky Way and its neighboring galaxies, and forming galaxies at the very edge of the observable Universe, near the beginning of time.
    Partners
    The Association of Canadian Universities for Research in Astronomy
    California Institute of Technology
    Department of Science and Technology of India
    The National Astronomical Observatories, Chinese Academy of Sciences (NAOC)
    National Astronomical Observatory of Japan
    University of California

     
  • richardmitnick 12:47 pm on September 22, 2017 Permalink | Reply
    Tags: Additive manufacturing (3D printing), , , Computational Design Optimization strategic initiative, Computer-aided design (CAD) and computer-aided engineering (CAE) have not caught up to advanced manufacturing technologies and the sheer number of design possibilities they afford, DARPA funds TRAnsformative DESign (TRADES) at LLNL Autodesk the UC Berkeley International Computer Science Institute (ICSI) and the University of Texas at Austin, , LLNL Center for Design and Optimization, Powerful multi-physics computer simulations, The linchpin for LLNL's research is the Livermore Design Optimization (LiDO) code   

    From LLNL: “LLNL gears up for next generation of computer-aided design and engineering” 


    Lawrence Livermore National Laboratory

    Jeremy Thomas
    thomas244@llnl.gov
    925-422-5539

    1
    Lawrence Livermore has instituted the Center for Design and Optimization, tasked with utilizing advanced manufacturing techniques, high-performance computing and cutting-edge simulation codes to optimize design. No image credit.

    The emergence of additive manufacturing (3D printing) and of powerful multi-physics computer simulations has enabled design to reach unprecedented levels of complexity, expanding the realm of what can be engineered and created. However, conventional design tools such as computer-aided design (CAD) and computer-aided engineering (CAE) have not caught up to advanced manufacturing technologies and the sheer number of design possibilities they afford.

    To address next-generation technological capabilities and their potential impact, Lawrence Livermore National Laboratory instituted the Center for Design and Optimization last October, tasked with using advanced manufacturing techniques, high-performance computing and cutting-edge simulation codes to optimize design. With Laboratory Directed Research and Development (LDRD) strategic initiative (SI) funding, the center began a new Computational Design Optimization strategic initiative, including collaborators at the University of Illinois at Urbana-Champaign, Lund University, the University of Wisconsin–Madison and the Technical University of Denmark.

    “I want to solve real, relevant problems,” said center director Dan Tortorelli, who recently came to the Lab from a mechanical sciences and engineering professorship at the University of Illinois at Urbana-Champaign. “We want to take the great simulation capabilities that we have with our supercomputers, along with optimization and our AM (additive manufacturing) capabilities, and combine them into a single systematic design environment.”

    Although optimization has been around since the 1980s, Tortorelli said, it has been largely limited to linear elastic design problems and is often performed through trial and error, looping a simulation over and over with different parameters until reaching a functional design. The process is not only time-consuming and tedious, it’s also expensive. Through the SI, researchers want to have computers do the repetitive work, freeing up engineers to focus their attention on more creative matters.

    “It may surprise some people but engineers are still doing things by trial and error — even if they are doing simulations, it is still trial and error,” said Computational Design Optimization project co-lead Dan White. “We are trying to give the computer the objectives and constraints, give it a list of materials to use, then it does the calculations and designs the part for us. This is a new way of thinking about engineering and design.”
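    The shift White describes — from looping simulations by hand to handing the computer objectives and constraints — can be illustrated with a toy sizing problem. This is a minimal sketch of gradient-based, constrained design optimization under stated assumptions, not LiDO's actual formulation: two bars in series whose stiffness scales with cross-sectional area, a fixed material budget, and compliance as the objective.

    ```python
    # Toy gradient-based sizing optimization: distribute a fixed material
    # budget V between two bars in series so as to minimize the compliance
    # C = 1/a1 + 1/a2 (stiffness of each bar assumed proportional to its
    # area a_i). Plain gradient descent, with a projection step that
    # rescales the design back onto the budget constraint a1 + a2 = V.

    V = 2.0          # total material budget (illustrative units)
    a = [0.4, 1.6]   # initial design: deliberately unbalanced
    lr = 0.01        # step size

    for _ in range(2000):
        # analytic gradient of compliance with respect to each area
        g = [-1.0 / a[0] ** 2, -1.0 / a[1] ** 2]
        a = [ai - lr * gi for ai, gi in zip(a, g)]
        # project back onto the constraint a1 + a2 = V
        s = sum(a)
        a = [ai * V / s for ai in a]

    print(a)  # both areas converge toward the balanced optimum V/2 = 1.0
    ```

    The optimizer finds the balanced design automatically; the point of codes like LiDO is to do the same thing with millions to billions of design variables and full multi-physics simulations supplying the gradients.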

    Ultimately, researchers want to have a fully integrated optimization program that incorporates multiple scales, complex physics and transient and nonlinear phenomena. By no longer adhering to outdated design paradigms, researchers said, the possibilities are essentially endless.

    “Fundamentally changing the way design is done is really what we want to do,” said Rob Sharpe, deputy associate director for research and development within Engineering. “We want to be able to design things that are so complicated that human intuition isn’t going to work…Almost all design work has been done for things that are static. But a lot of the things that we want to create, we want it to be dynamic and, in some cases, push it until it fails, and we want them to be designed to fail in a specific way. In addition, if we just used all the advanced AM techniques to make the same designs, we’d be missing the point. That’s really why we started the center.”

    Optimal design through advanced codes

    Current 3D printers have the resolution to create parts with millions, even billions of parameters, in theory placing a different material at every point in the structure. To optimize design for those printers, engineers will need the same level of resolution in their design software, White said, a feat only possible through high-performance computing.

    The linchpin for LLNL’s research is the Livermore Design Optimization (LiDO) code. Based on LLNL’s open source Modular Finite Element Method library (MFEM), it uses gradient-based algorithms that incorporate uncertainty, multiple scales, multiple physics and multiple functionality into design. White is working on writing the software and said the code’s basic capability is already functioning; it has produced designs for cantilevers and lattice structures.

    “It’s already designing things better than a human could do,” White said. “The idea is that the computer is doing this work on the weekend when we are not here. It is fully automated; the computer sorts through all the scenarios and comes up with the best designs.”

    So far, researchers have designed different micro-architected lattice structures with new properties, including those with negative coefficient of thermal expansion (i.e., lattices that contract when heated or expand when cooled) or negative Poisson’s ratio (i.e., lattices that expand laterally when stretched or contract when compressed, which is opposite of most materials — for example, a rubber band). Other research has been done to design electrodes, multifunctional metamaterials, carbon-fiber-reinforced composites and path planning for direct ink writing (a 3D printing process) onto curvilinear surfaces. The advances someday could be applied to optimize armor plating for military vehicles, blast and impact-resistant structures, optics and electromagnetic devices, researchers said. Interestingly, some of the designs generated have resembled “organic” shapes found in nature, Sharpe said.

    “A lot of what nature has done is to do evolutionary optimization over time,” Sharpe said. “Human efforts inspired by these are often called biomimetic designs — and some do almost have an organic look…We’re moving to where the computer fully explores the possibilities, does the work and shows the possibilities. We can now begin to explore complex, nonlinear, multi-function, dynamic designs for the first time. The computer is going to be able to do larger problems than humans could possibly do — we’re talking about a billion design variables. It’s going to change the way people approach design problems and the way engineers interact with all these tools.”

    There’s a strong link between AM and design optimization, according to LLNL engineer Eric Duoss, because of the ability to control both material composition and structure at multiple length scales with 3D printing.

    “With new printing approaches, we can deterministically place material and structure into previously unobtainable form factors; that means we really need to rethink design not only to keep pace with manufacturing advances, but to fully realize the potential of AM,” said Duoss, who is working on a project to optimize printing 3D lattices onto 3D surfaces. “Achieving the goals of this strategic initiative would be revolutionary for design and have far-reaching impact beyond just additive manufacturing, but it’s a really hard problem. There’s a real need for the proposed capability right now, and we’re going to scramble to get there as fast as we can.”

    In January, the Defense Advanced Research Projects Agency (DARPA) awarded a multimillion dollar grant to LLNL, Autodesk, the UC Berkeley International Computer Science Institute (ICSI) and the University of Texas at Austin, to institute a project called TRAnsformative DESign (TRADES), aimed at advancing the tools needed to generate and better manage the enormous complexity of design afforded by new technologies.

    Under the four-year DARPA project, LLNL will be using its high-performance computing libraries to develop algorithms capable of optimizing large, complex systems and working with Autodesk to create a more robust and user-friendly graphical interface. The algorithms, when combined with emerging advanced manufacturing capabilities, will be used to design revolutionary systems for the Department of Defense.

    The collaboration partially stemmed from previous research performed with Autodesk seeking to improve performance of helmets, which resulted in designs for rigid shell materials formed from graded density, lattice structures with optimized macroscale shape and microscale material distribution.

    “As part of that project, we encountered challenges that helped inspire our current efforts in design,” said Duoss, a principal investigator on the project. “Almost by definition, helmets are required to function under highly nonlinear and dynamic conditions. Turns out those conditions are hard to simulate, let alone design for, and it is certainly a new area for design optimization. With the Autodesk collaboration, we collectively identified these gaps in capability and knew the time was right to assemble a team to attack these difficult problems.”

    LLNL looks to be world leader

    About a dozen LLNL employees are part of the Computational Design Optimization strategic initiative. In the coming years, they will work to improve the LiDO program to perform generalized shape optimization, ensure accurate simulations, increase optimization speed through Reduced Order Models (ROMs) and accommodate uncertainty.

    Specific projects include multi-physics lattices with optimal mechanical, heat transfer and electromagnetic responses, to create antennas with sufficient strength and gain. Researchers also want to enhance fabrication resolution in their designs down to the submicron level, making it possible to design and manufacture optical lenses that are simpler, cheaper and not necessarily parabolic.

    Other goals include investigating machine learning and considering a variety of response metrics. A benchmark problem will be the optimization of a large 3D-printed part consisting of a spatially varying lattice, illustrating LiDO’s ability to incorporate ROMs. The researchers will also investigate optimization under uncertainty, using models derived from Lab data, and the use of domain symmetry to simplify the analysis and design process.

    At the end of the three-year SI, the researchers hope to have funding from various Lab programs and go from having almost no capabilities in design optimization to being one of the world leaders.

    “We want to do this inverse problem where we know the outcome that we want and what our design domain should be,” Tortorelli said. “We want to be able to define the parameters, press a button and have it come out the way we want. We are not taking the designers or engineers out of the loop. We are not saying, ‘do away with them.’ We are saying, ‘do away with the drudgery.’ We are simplifying their lives so they can be more creative.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    LLNL Campus

    Operated by Lawrence Livermore National Security, LLC, for the Department of Energy’s National Nuclear Security
    Administration
    DOE Seal
    NNSA

     
  • richardmitnick 12:15 pm on September 22, 2017 Permalink | Reply
    Tags: , groundwork for additional collaboration between the U.S. DOE its national laboratories (including Fermilab) and the UK Science and Technology Facilities Council, , UK labs and universities were important partners in the main Tevatron experiments CDF and DZero, UK Minister of State for Universities Science Research and Innovation Jo Johnson, UK science   

    From FNAL: “UK science minister announces $88 million for LBNF/DUNE, visits Fermilab” 

    FNAL II photo

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    1
    Jo Johnson learns about accelerator technologies at Fermilab. From left: Fermilab Chief Strategic Partnerships Officer Alison Markovitz; Fermilab scientist Anna Grassellino; Andrew Price of the UK Science and Innovation Network; DUNE co-spokesperson Mark Thomson; STFC Chief Executive Brian Bowsher; UK Minister of State for Universities, Science, Research and Innovation Jo Johnson. Photo: Reidar Hahn

    UK minister Jo Johnson traveled to the United States this week to sign the first ever umbrella science and technology agreement between the two nations and to announce approximately $88 million in funding for the international Long-Baseline Neutrino Facility and Deep Underground Neutrino Experiment.

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA


    FNAL DUNE Argon tank at SURF


    Surf-Dune/LBNF Caverns at Sanford



    SURF building in Lead SD USA

    On Thursday, he visited the host laboratory for LBNF/DUNE, the U.S. Department of Energy’s Fermi National Accelerator Laboratory, emphasizing the importance of the project and the strong scientific partnership between the two countries.

    Johnson, the UK minister of state for universities, science, research and innovation, signed the agreement on Wednesday in Washington, D.C. Signing for the United States was Judith G. Garber, acting assistant secretary of state for oceans and international environmental and scientific affairs.

    This new agreement lays the groundwork for additional collaboration between the U.S. DOE, its national laboratories (including Fermilab) and the UK Science and Technology Facilities Council. STFC funds research in particle physics, nuclear physics, space science and astronomy in the United Kingdom. The U.S. DOE is the largest supporter of basic research in the physical sciences in the United States.

    “Our continued collaboration with the U.S. on science and innovation benefits both nations,” said Johnson, “and this agreement will enable us to share our expertise to enhance our understanding of many important topics that have the potential to be world changing.”

    LBNF/DUNE will be a world-leading international neutrino experiment based in the United States. Fermilab’s powerful particle accelerators will create the world’s most intense beam of neutrinos and send it 800 miles through Earth to massive particle detectors, which will be built a mile underground at the Sanford Underground Research Facility in South Dakota.

    The UK research community is already a major contributor to the DUNE collaboration, providing expertise and components to the facility and the experiment. UK contributions range from the high-power neutrino production target to the data acquisition systems to the software that reconstructs particle interactions into visible 3-D readouts.

    DUNE will be the first large-scale experiment hosted in the United States that runs as a truly international project, with more than 1,000 scientists and engineers from 31 countries building and operating the facility. Its goal is to learn more about ghostly particles called neutrinos, which may provide insight into why we live in a matter-dominated universe that survived the Big Bang.

    2
    The UK delegation visits the Fermilab underground neutrino experimental area. UK Minister Jo Johnson stands in the center. Immediately to his left is Fermilab Director Nigel Lockyer. Photo: Reidar Hahn

    In addition to Johnson, the UK delegation to Fermilab included Brian Bowsher, chief executive of STFC; Andrew Price of the UK Science and Innovation Network; and Martin Whalley, deputy consul general from the Great Britain Consulate in Chicago.

    They toured several areas of the lab, including the underground cavern that houses the NOvA neutrino detector, and the Cryomodule Test Facility, where components of the accelerator that will power DUNE are being tested. The UK will contribute world-leading expertise in particle accelerators to the upgrade of Fermilab’s neutrino beam and accelerator complex.

    “This investment is part of a long history of UK research collaboration with the U.S.,” said Bowsher. “International partnerships are the key to building these world-leading experiments, and I am looking forward to seeing our scientists work with our colleagues in the U.S. in developing this experiment and the exciting science that will happen as a result.”

    UK institutions have been a vital part of Fermilab’s 50-year history, from the earliest days of the laboratory. UK labs and universities were important partners in the main Tevatron experiments, CDF and DZero, in the 1980s and 1990s. UK institutions have been involved with accelerator research and development, are partners in Fermilab’s muon experiments and are at the forefront of Fermilab’s focus on neutrino physics.

    Sixteen UK institutions (14 universities and two STFC-funded labs) are contributors to the DUNE collaboration, the U.S.-hosted centerpiece for a world-class neutrino experiment. The collaboration is led by Mark Thomson, professor of experimental particle physics at the University of Cambridge, and Ed Blucher, professor and chair of the Department of Physics at the University of Chicago.

    “Our colleagues in the United Kingdom have been critical partners for Fermilab, for LBNF/DUNE and for the advancement of particle physics around the world,” said Fermilab Director Nigel Lockyer. “We look forward to the discoveries that these projects will bring.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    FNAL Icon
    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 11:49 am on September 22, 2017 Permalink | Reply
    Tags: A phonon is a collective movement of particles vibrating together, A photon (a quantum of light), At the microscopic quantum level a phonon behaves even more like a particle, , Hanbury Brown and Twiss interferometry, Optomechanical crystal, Photons and phonons combine for quantum solution, Vibrations in crystals can obey the laws of quantum mechanics and might be used to store information in quantum computers   

    From COSMOS: “Photons and phonons combine for quantum solution” 

    Cosmos Magazine bloc

    COSMOS Magazine

    22 September 2017
    Michael Lucy

    1
    A conceptual illustration showing the crystal (middle), a phonon (top) and the quantum probability distribution (bottom). Moritz Forsch / TU Delft

    Vibrations in crystals can obey the laws of quantum mechanics and might be used to store information in quantum computers, according to new research published in the journal Science.

    The new development, achieved by Sungkun Hong at the University of Vienna in Austria and colleagues, revolves around the idea of the phonon. Not to be confused with a photon (a quantum of light), a phonon is a collective movement of particles vibrating together.

    For an analogue at the macroscopic scale, think of a single wave moving through the ocean – after the wave passes, the water molecules that carried it return to where they began, but the wave itself moves on.

    At the microscopic, quantum level, a phonon behaves even more like a particle, to the point where it is often referred to as a “quasiparticle”. Physicists think phonons may provide a useful bridge between quantum and classical worlds, and the interactions between light and vibrations are a very active area of research.

    Hong’s team analysed phonons using an optomechanical crystal – described as a “microfabricated silicon nanobeam” – that is designed to vibrate in a specific way when struck by a photon.

    The researchers fired photons at the device, creating phonons within it. They then fired photons of a different frequency, which were reflected back after interacting with those phonons.

    They then analysed the reflected photons via a process called Hanbury Brown and Twiss interferometry, which lets them gain information about the quantum state of the phonons. Using this technique, they proved that a single phonon in the crystal obeys the laws of quantum mechanics rather than classical physics.
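    The core of a Hanbury Brown and Twiss measurement can be sketched numerically: split the photon stream on a beamsplitter, count single detections and coincidences, and estimate the second-order correlation g²(0). A value well below 1 signals nonclassical, single-quantum statistics. The counts below are made-up illustrative numbers, not the experiment's data.

    ```python
    # Sketch of the Hanbury Brown and Twiss analysis: estimate g2(0)
    # from click counts at the two outputs of a beamsplitter.
    # g2(0) < 0.5 witnesses a predominantly single-quantum state;
    # all counts here are invented for illustration.

    n1 = 4800           # clicks at detector 1
    n2 = 5100           # clicks at detector 2
    n12 = 3             # coincidences (both detectors fire in the same trial)
    trials = 1_000_000  # number of experimental trials

    # g2(0) = P(coincidence) / (P(click at 1) * P(click at 2))
    g2 = (n12 / trials) / ((n1 / trials) * (n2 / trials))
    print(g2)  # far below 1: sub-Poissonian, i.e. nonclassical statistics
    ```

    A classical light (or vibration) source cannot give g²(0) below 1, which is why this single number suffices to show the phonon is obeying quantum rather than classical rules.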

    Because of this quantum property, and the ability to manipulate them with light, phonons in crystals could be “an ideal candidate for storage of quantum information”, the authors say.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     