Tagged: Physics

  • richardmitnick 4:53 am on April 12, 2014 Permalink | Reply
    Tags: Physics

    From BBC: “Dark matter hunt: LUX experiment reaches critical phase” 


    8 April 2014
    Rebecca Morelle

    The quest to find the most mysterious particles in the Universe is entering a critical phase, scientists say.

    An experiment located in the bottom of a gold mine in South Dakota, US, could offer the best chance yet of detecting dark matter.

    Scientists believe this substance makes up more than a quarter of the cosmos, yet no-one has ever seen it directly.

    Early results from this detector, which is called LUX, confirmed it was the most powerful experiment of its kind.

    LUX Dark matter

    In the coming weeks, it will begin a 300-day-long run that could provide the first direct evidence of these enigmatic particles.

    Spotting WIMPs

    Beneath the snow-covered Black Hills of South Dakota, a cage rattles and creaks as it begins to descend into the darkness.

    For more than 100 years, this was the daily commute for the Homestake miners searching for gold buried deep in the rocks.

    Today, the subterranean caverns and tunnels have been transformed into a high-tech physics laboratory.

    Scientists now make the 1.5km (1-mile) journey underground in an attempt to solve one of the biggest mysteries in science.

    “We’ve moved into the 21st Century, and we still do not know what most of the matter in the Universe is made of,” says Prof Rick Gaitskell, from Brown University in Rhode Island, one of the principal investigators on the Large Underground Xenon (LUX) experiment.

    The LUX detector is located nearly 1.5km underground – and could be our best hope yet of finding dark matter

    Scientists believe all of the matter we can see – planets, stars, dust and so on – only makes up a tiny fraction of what is actually out there.

    They say about 85% of the matter in the Universe is actually dark matter, so called because it cannot be seen directly and nobody really knows what it is.

    This has not stopped physicists coming up with ideas, though. The most widely supported theory is that dark matter takes the form of Weakly Interacting Massive Particles, or WIMPs.

    Prof Gaitskell explains: “If one considers the Big Bang, 14bn years ago, the Universe was very much hotter than it is today and created an enormous number of particles.

    “The hypothesis we are working with at the moment is that a WIMP was the relic left-over from the Big Bang, and in fact dominates over the regular material you and I are made of.”

    The Homestake gold mine, which has now been converted into a lab, is in the Black Hills of South Dakota

    The presence of dark matter was first inferred because of its effect on galaxies like our own.

    As these celestial systems rotate around their dense centres, the regular matter they contain does not have enough mass to account for the gravity needed to hold everything together. On regular matter alone, a spinning galaxy should fly apart.

    Instead, scientists believe that dark matter provides the extra mass, and therefore gravity, needed to hold a galaxy together.
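The mass deficit can be illustrated with a back-of-the-envelope calculation (the numbers below are illustrative, not figures from the article): if only the visible mass set the orbits, rotation speeds should fall off as 1/√r, whereas the flat rotation speeds actually observed imply an enclosed mass that keeps growing with radius.

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def keplerian_speed(m_enclosed, r):
    """Orbital speed if only mass m_enclosed (kg) lies inside radius r (m)."""
    return math.sqrt(G * m_enclosed / r)

def implied_mass(v_flat, r):
    """Mass that must lie inside radius r to sustain a flat rotation speed v_flat."""
    return v_flat**2 * r / G

M_visible = 1.0e41   # rough visible mass of a galaxy's inner region, kg (illustrative)
v_observed = 220e3   # typical observed flat rotation speed, m/s

for r_kpc in (5, 10, 20, 40):
    r = r_kpc * 3.086e19  # kiloparsecs -> metres
    v_kep = keplerian_speed(M_visible, r)
    M_needed = implied_mass(v_observed, r)
    print(f"r = {r_kpc:2d} kpc: Keplerian v = {v_kep/1e3:5.0f} km/s, "
          f"mass needed for flat curve = {M_needed/M_visible:4.1f} x visible")
```

The Keplerian speed drops with radius while the mass needed for a flat curve grows linearly; the gap between the two is what dark matter is invoked to fill.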

    It is so pervasive throughout the Universe that researchers believe a vast number of WIMPs are streaming through the Earth every single second. Almost all pass through without a trace.

    However, on very rare occasions, it is thought that dark matter particles do bump into regular matter – and it is this weak interaction that scientists are hoping to see.

    The LUX detector is one of a number of physics experiments based in the Sanford Underground Research Facility that require a “cosmic quietness”.

    Prof Gaitskell says: “The purpose of the mile of rock above is to deal with cosmic rays. These are high-energy particles generated from outside our Solar System and also by the Sun itself, and these are very penetrating.

    “If we don’t put a mile of rock between us and space, we wouldn’t be able to do this experiment.”

    Inside a cavern in the mine, the detector is situated inside a stainless steel tank that is two storeys high.

    The detector is housed in a tank that is filled with purified water

    This is filled with about 300,000 litres (70,000 gallons) of ultra-purified water, which means it is free from traces of naturally occurring radioactive elements that could also interfere with the results.

    “With LUX, we’ve worked extremely hard to make this the quietest verified place in the world,” says Prof Gaitskell.

    At the detector’s heart is 370kg (815lb) of liquid xenon. This element has the unusual, but very useful, property of throwing out a flash of light when particles bump into it.

    And detecting a series of these bright sparks could mean that dark matter has been found.

    The LUX detector was first turned on last year for a 90-day test run. No dark matter was seen, but the results concluded that it was the most sensitive experiment of its kind.

    Now, when the experiment is run for 300 days, Prof Gaitskell says these interactions might be detected once a month or every few months.

    The team would have to see a significant number of interactions – between five and 10 – to suggest that dark matter has really been glimpsed. The more that are seen, the more statistical confidence there will be.
    LUX uses light detectors called photomultiplier tubes to record any flashes of light
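The role of counting statistics here can be sketched with a toy Poisson calculation; the expected background of one event is a hypothetical figure for illustration, not a number from the experiment:

```python
from math import exp, factorial

def poisson_p_at_least(k, mu):
    """P(N >= k) for a Poisson-distributed count with mean mu."""
    return 1.0 - sum(exp(-mu) * mu**i / factorial(i) for i in range(k))

mu_bg = 1.0  # hypothetical expected background count over the 300-day run
for k in (5, 10):
    p = poisson_p_at_least(k, mu_bg)
    print(f"{k} or more events over a background of {mu_bg}: p = {p:.2e}")
```

Under that assumed background, five events would already be hard to explain as a fluctuation, and ten would be essentially impossible, which is why more events mean more statistical confidence.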

    However, LUX is not the only experiment setting its sights on dark matter.

    With the Large Hadron Collider, scientists are attempting to create dark matter as they smash particles together, and in space, telescopes are searching for the debris left behind as dark matter particles crash into each other.

    LHC at CERN

    Mike Headley, director of the South Dakota Science and Technology Authority, which runs the Sanford laboratory, says a Nobel prize will very probably be in store for the scientists who first detect dark matter.

    He says: “There are a handful of experiments located at different underground laboratories around the world that want to be the first ones to stand up and say ‘we have discovered it’, and so it is very competitive.”

    Finding dark matter would transform our understanding of the Universe, and usher in a new era in fundamental physics.

    However, there is also a chance that it might not be spotted – and that the theory of dark matter is wrong.

    Dr Jim Dobson, based at the UK’s University of Edinburgh and affiliated with University College London, says: “We are going into unknown territory. We really don’t know what we’re going to find.

    “If we search with this experiment and then the next experiment, LUX Zeppelin, which is this much, much bigger version of LUX – if we didn’t find anything then there would be a good chance it didn’t exist.”

    He adds: “In some ways, showing that there was no dark matter would be a more interesting result than if there was. But, personally, I would rather we found some.”

    Prof Carlos Frenk, a cosmologist from Durham University, says that many scientists have gambled decades of research on finding dark matter.

    He adds: “If I was a betting man, I think LUX is the frontrunner. It has the sensitivity we need. Now, we just need the data.

    “If they don’t [find it], it means the dark matter is not what we think it is. It would mean I have wasted my whole scientific career – everything I have done is based on the hypothesis that the Universe is made of dark matter. It would mean we had better look for something else.”

    See the full article here.

    ScienceSprings is powered by MAINGEAR computers

  • richardmitnick 11:07 am on April 11, 2014 Permalink | Reply
    Tags: , , , Physics   

    From CERN: “CERN technology that could help out in space” 

    CERN New Masthead

    11 Apr 2014
    Barbara Warmbein

    This year at the world’s largest industrial fair, the Hannover Messe, CERN and the European Space Agency (ESA) have teamed up to present their technologies. Some of the technologies developed by CERN could find applications in space.

    In the planned upgrades of detectors on the Large Hadron Collider (LHC), electronics will come under intense radiation from high-energy particle beams. So electronics engineers at CERN have developed a power converter that can take a radiation dose of up to 7 million Gray and is not disturbed by single particle hits. Because power distribution is also an issue on spacecraft, the engineers – and CERN’s Knowledge Transfer group – believe that the converter could be useful in space.

    Then there’s the problem of thermal management. Devices called collimators narrow the beams in particle accelerators, acting as “brakes” that strip particles from the edges of spreading beams; on the LHC they endure intense heat. With the higher energies and intensities planned for the upgraded LHC, researchers are looking to improve on the carbon-based composite materials currently used for collimators.

    “The material [for the collimator] needs to be robust, conduct heat away quickly, have high geometrical stability and conduct electricity so as not to adversely influence the beam,” says Alessandro Bertarelli, an aerospace engineer turned beam expert at CERN.

    CERN and the European Space Agency share a stand to present their technologies at Hannover Messe, the world’s biggest industrial fair (Image: rheinland relations)

    The possible solution: a molybdenum-carbide–graphite composite. “It has better electrical properties than the current collimators,” says Bertarelli. “And its thermal conductivity is four times better – probably a world record for an engineered material.” The composite is also quite light and very stable up to high temperatures – it could find a use on aircraft or in the harsh environment of space.

    Another technology that could come in handy in space is related to surfaces in ultra-high vacuum. Particle beams have electric fields that can knock electrons from metallic surfaces around them. These electrons knock out even more electrons and so on. The process, called beam-induced multipactoring, forms an unwanted electron cloud that interferes with the beam.

    Rough surfaces or coatings with particular chemical composition can mitigate this effect by reducing the number of secondary electrons produced. So engineers sometimes use special coatings – amorphous carbon, for example.
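The runaway described above can be pictured with a toy model: each wall impact multiplies the electron population by the secondary electron yield (SEY), so the cascade grows when the SEY exceeds 1 and dies out when a coating pushes it below 1. A minimal sketch (the SEY values are illustrative):

```python
def electron_population(sey, n_impacts, n0=1.0):
    """Electron count after n_impacts wall collisions, each electron
    producing `sey` secondaries on average."""
    return n0 * sey ** n_impacts

# Coated surface (SEY < 1): the cascade dies out.
# Bare surface (SEY > 1): the electron cloud grows exponentially.
for sey in (0.8, 1.3):
    print(f"SEY = {sey}: after 20 impacts, population = {electron_population(sey, 20):.3g}")
```

This is why coatings and surface treatments aim simply to push the average yield below one secondary per incident electron.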

    But mechanically rough surfaces can have drawbacks. “They are likely to increase the radiofrequency losses in microwave applications,” says beam physicist Fritz Caspers. “So [a team at CERN is] looking at surfaces which are mechanically very smooth.”

    Instead of a surface that is rough in a mechanical way the CERN team invented a surface that is rough in a magnetic way. Tiny permanent magnets of alternating polarity applied just underneath the surface of high-frequency-carrying structures can trap secondary electrons. The technology reduces the secondary electron yield without the drawbacks of a rough surface.

    Magnetic surface roughness could be used in particle physics to power structures such as klystrons that operate at high frequencies. Spacecraft also suffer efficiency losses from stray electrons, and magnetically rough surfaces could be a starting point for low-loss power transport in space.

    The concept of magnetic surface roughness could one day find applications in radiofrequency systems on board satellites.

    CERN and ESA recently signed a cooperation agreement. With so much overlap between technology for aerospace and particle physics, it’s off to a good start.

    See the full article here.

    Meet CERN in a variety of places:

    Cern Courier



    CERN CMS New

    CERN LHCb New


    CERN LHC New

    LHC particles

    Quantum Diaries


  • richardmitnick 3:32 pm on April 10, 2014 Permalink | Reply
    Tags: Physics

    From Brookhaven Lab: “National Synchrotron Light Source II Achieves First Stored Electron Beam” 

    Brookhaven Lab

    April 10, 2014
    Chelsea Whyte

    Scientists and engineers at the U.S. Department of Energy’s Brookhaven National Laboratory achieved a major milestone in the commissioning of the state-of-the-art National Synchrotron Light Source II (NSLS-II) on April 5, 2014. For the first time, Associate Laboratory Director for Photon Sciences Steve Dierker and his project team were able to store electron beam in the NSLS-II storage ring overnight Friday into Saturday, with an initial beam lifetime of about three hours.

    NSLS-II at Brookhaven Lab

    Laboratory Director Doon Gibbs called it a “significant advance” and said, “Achieving stored beam means the team can now accelerate further optimization of the storage ring. Thanks in particular go to Division Director for Accelerator Systems Ferdinand Willeke for his strong leadership of the design, construction, and commissioning of the NSLS-II accelerator systems.”

    This achievement is the result of more than seven years of planning, design, construction, and commissioning by the Photon Sciences staff.

    More details on the technical aspects of this accomplishment and the next steps will be coming soon.

    See the full article here.

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world.

    Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.


  • richardmitnick 6:41 pm on April 9, 2014 Permalink | Reply
    Tags: Physics

    From M.I.T.: “New ‘switch’ could power quantum computing” 

    April 9, 2014
    Peter Dizikes | MIT News Office

    A light lattice that traps atoms may help scientists build networks of quantum information transmitters.

    Using a laser to place individual rubidium atoms near the surface of a lattice of light, scientists at MIT and Harvard University have developed a new method for connecting particles — one that could help in the development of powerful quantum computing systems.


    The new technique, described in a paper published today in the journal Nature, allows researchers to couple a lone atom of rubidium, a metal, with a single photon, or light particle. This allows both the atom and photon to switch the quantum state of the other particle, providing a mechanism through which quantum-level computing operations could take place.

    Moreover, the scientists believe their technique will allow them to increase the number of useful interactions occurring within a small space, thus scaling up the amount of quantum computing processing available.

    “This is a major advance of this system,” says Vladan Vuletić, a professor in MIT’s Department of Physics and Research Laboratory of Electronics (RLE), and a co-author of the paper. “We have demonstrated basically an atom can switch the phase of a photon. And the photon can switch the phase of an atom.”

    That is, photons can have two polarization states, and interaction with the atom can change the photon from one state to another; conversely, interaction with the photon can change the atom’s phase, which is equivalent to changing the quantum state of the atom from its “ground” state to its “excited” state. In this way the atom-photon coupling can serve as a quantum switch to transmit information — the equivalent of a transistor in a classical computing system. And by placing many atoms within the same field of light, the researchers may be able to build networks that can process quantum information more effectively.
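One idealized way to picture such a switch is a conditional phase gate: the atom’s state controls whether the photon’s polarization superposition acquires a relative phase flip. The sketch below uses a textbook controlled-Z matrix as a stand-in, not the experiment’s actual physics:

```python
import numpy as np

# Basis order per qubit: |0>, |1>; the joint state is atom (tensor) photon.
CZ = np.diag([1, 1, 1, -1]).astype(complex)  # conditional (controlled-Z) phase gate

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
plus = (ket0 + ket1) / np.sqrt(2)  # photon in an equal superposition of its two polarizations

# With the atom in |1>, the gate flips the relative phase of the photon's superposition...
flipped = CZ @ np.kron(ket1, plus)
# ...while with the atom in |0> the photon passes through unchanged.
unchanged = CZ @ np.kron(ket0, plus)

print(flipped.reshape(2, 2)[1])    # photon amplitudes given atom |1>
print(unchanged.reshape(2, 2)[0])  # photon amplitudes given atom |0>
```

The conditional sign change is exactly the transistor-like behaviour described above: one quantum system gates the state of the other.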

    “You can now imagine having several atoms placed there, to make several of these devices — which are only a few hundred nanometers thick, 1,000 times thinner than a human hair — and couple them together to make them exchange information,” Vuletić adds.

    Using a photonic cavity

    Quantum computing could enable the rapid performance of calculations by taking advantage of the distinctive quantum-level properties of particles. Some particles can be in a condition of superposition, appearing to exist in two places at the same time. Particles in superposition, known as qubits, could thus contain more information than particles at classical scales, and allow for faster computing.

    However, researchers are in the early stages of determining which materials best allow for quantum-scale computing. The MIT and Harvard researchers have been examining photons as a candidate material, since photons rarely interact with other particles. For this reason, an optical quantum computing system, using photons, could be harder to knock out of its delicate alignment. But since photons rarely interact with other bits of matter, they are difficult to manipulate in the first place.

    In this case, the researchers used a laser to place a rubidium atom very close to the surface of a photonic crystal cavity, a structure of light. The atoms were placed no more than 100 or 200 nanometers — less than a wavelength of light — from the edge of the cavity. At such small distances, there is a strong attractive force between the atom and the surface of the light field, which the researchers used to trap the atom in place.

    Other methods of producing a similar outcome have been considered before — such as, in effect, dropping atoms into the light and then finding and trapping them. But the researchers found that they had greater control over the particles this way.

    “In some sense, it was a big surprise how simple this solution was compared to the different techniques you might envision of getting the atoms there,” Vuletić says.

    The result is what he calls a “hybrid quantum system,” where individual atoms are coupled to microscopic fabricated devices, and in which atoms and photons can be controlled in productive ways. The researchers also found that the new device serves as a kind of router separating photons from each other.

    “The idea is to combine different things that have different strengths and weaknesses in such a way to generate something new,” Vuletić says, adding: “This is an advance in technology. Of course, whether this will be the technology remains to be seen.”

    ‘Still amazing’ to hold onto one atom

    The paper, “Nanophotonic quantum phase switch with a single atom,” is co-authored by Vuletić; Tobias Tiecke, a postdoc affiliated with both RLE and Harvard; Harvard professor of physics Mikhail Lukin; Harvard postdoc Nathalie de Leon; and Harvard graduate students Jeff Thompson and Bo Liu.

    The collaboration between the MIT and Harvard researchers is one of two advances in the field described in the current issue of Nature. Researchers at the Max Planck Institute of Quantum Optics in Germany have concurrently developed a new method of producing atom-photon interactions using mirrors, forming quantum gates, which change the direction of motion or polarization of photons.

    “The Harvard/MIT experiment is a masterpiece of quantum nonlinear optics, demonstrating impressively the preponderance of single atoms over many atoms for the control of quantum light fields,” says Gerhard Rempe, a professor at the Max Planck Institute of Quantum Optics who helped lead the German team’s new research, and who has read the paper by the U.S.-based team. “The coherent manipulation of an atom coupled to a photonic crystal resonator constitutes a breakthrough and complements our own work … with an atom in a dielectric mirror resonator.”

    Rempe adds that he thinks both techniques will be regarded as notable “achievements on our way toward a robust quantum technology with stationary atoms and flying photons.”

    If the research techniques seem a bit futuristic, Vuletić says that even as an experienced researcher in the field, he remains slightly awed by the tools at his disposal.

    “For me what is still amazing, after working in this for 20 years,” Vuletić reflects, “is that we can hold onto a single atom, we can see it, we can move it around, we can prepare quantum superpositions of atoms, we can detect them one by one.”

    Funding for the research was provided in part by the National Science Foundation, the MIT-Harvard Center for Ultracold Atoms, the Natural Sciences and Engineering Research Council of Canada, the Air Force Office of Scientific Research, and the Packard Foundation.

    See the full article here.


  • richardmitnick 2:15 pm on April 8, 2014 Permalink | Reply
    Tags: Fermilab Holometer experiment, Interferometry, Physics

    From Symmetry: “Searching for the holographic universe” 

    April 08, 2014
    Leah Hesla

    Physicist Aaron Chou keeps the Holometer experiment—which looks for a phenomenon whose implications border on the unreal—grounded in the realities of day-to-day operations.

    The beauty of the small operation—the mom-and-pop restaurant or the do-it-yourself home repair—is that pragmatism begets creativity. The industrious individual who makes do with limited resources is compelled onto paths of ingenuity, inventing rather than following rules to address the project’s peculiarities.

    As project manager for the Holometer experiment at Fermilab, physicist Aaron Chou runs a show that, though grandiose in goal, is remarkably humble in setup. Operated out of a trailer by a small team with a small budget, it has the feel more of a scrappy startup than of an undertaking that could make humanity completely rethink our universe.

    During an exceptionally snowy winter, Aaron Chou and Vanderbilt University student Brittany Kamai make their way to the Holometer’s modest home base, a relatively isolated trailer on the Fermilab prairie. Photo by: Reidar Hahn, Fermilab

    The experiment is based on the proposition that our familiar, three-dimensional universe is a manifestation of a two-dimensional, digitized space-time. In other words, all that we see around us is no more than a hologram of a more fundamental, lower-dimensional reality.

    If this were the case, then space-time would not be smooth; instead, if you zoomed in on it far enough, you would begin to see the smallest quantum bits—much as a digital photo eventually reveals its fundamental pixels.

    In 2009, the GEO600 experiment, which searches for gravitational waves emanating from black holes, was plagued by unaccountable noise. This noise could, in theory, be a telltale sign of the universe’s smallest quantum bits. The Holometer experiment seeks to measure space-time with far more precision than any experiment before—and potentially observe effects from those fundamental bits.

    Such an endeavor is thrilling—but also risky. Discovery would change the most basic assumptions we make about the universe. But there also might not be any holographic noise to find. So for Chou, managing the Holometer means building and operating the apparatus on the cheap—not shoddily, but with utmost economy.

    Thus Chou and his team take every opportunity to make rather than purchase, to pick up rather than wait for delivery, to seize the opportunity and take that measurement when all the right people are available.

    Some of the Holometer’s parts are ordered custom, and some are homemade. Chou makes sure all of them work together in harmony.
    Photo by: Reidar Hahn, Fermilab

    “It’s kind of like solving a Rubik’s cube,” Chou says. “You have an overview of every aspect of the measurement that you’re trying to make. You have to be able to tell the instant something doesn’t look right, and tell that it conflicts with some other assumption you had. And the instant you have a conflict, you have to figure out a way to resolve it. It’s a lot of fun.”

    Chou is one of the experiment’s 1.5 full-time staff members; a complement of students rounds out a team of 10. Although Chou is essentially the overseer, he runs the experiment from down in the trenches.

    Aaron Chou, project manager for Fermilab’s Holometer, tests the experiment’s instrumentation.
    Photo by: Reidar Hahn, Fermilab

    The Holometer experimental area, for example, is a couple of aboveground, dirt-covered tunnels whose walls don’t altogether keep out the water after a heavy rain. So any time the area needs the attention of a wet-dry vacuum, he and his team are down on the ground, cheerfully squeegeeing, mopping and vacuuming away.

    Research takes place as much in the trailer as in the Holometer tunnel, where the instrument itself sits.
    Photo by: Reidar Hahn, Fermilab

    “That’s why I wear such shabby clothes,” he says. “This is not the type of experiment where you sit behind the computer and analyze data or control things remotely all day long. It’s really crawling-around-on-the-floor kind of work, which I actually find to be kind of a relief, because I spent more than a decade sitting in front of a computer for more well-established experiments where the installation took 10 years and most of the resulting experiment is done from behind a keyboard.”

    As a graduate student at Stanford University, Chou worked on the SLD experiment at SLAC National Accelerator Laboratory, writing software to help look for parity violation in Z bosons. As a Fermilab postdoc on the Pierre Auger experiment, he analyzed data on ultra-high-energy cosmic rays.

    Now Chou and his team are down in the dirt, hunting for the universe’s quantum bits. In length terms, these bits are expected to be on the smallest scale of the universe, the Planck scale: 1.6 x 10^-35 meters. That’s roughly 10 trillion trillion times smaller than an atom; no existing instrument can directly probe objects that small. If humanity could build a particle collider the size of the Milky Way, we might be able to investigate Planck-scale bits directly.
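The scale mismatch is easy to check with a line of arithmetic (the atom size below is an order-of-magnitude figure, not a value from the article):

```python
planck_length = 1.6e-35  # metres
atom_diameter = 1.0e-10  # metres; an order-of-magnitude atomic size (assumed)

ratio = atom_diameter / planck_length
print(f"an atom spans roughly {ratio:.1e} Planck lengths")
```

The ratio comes out near 10^25, i.e. the "10 trillion trillion" quoted above.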

    The Holometer instead will look for a jitter arising from the cosmos’ minuscule quanta. In the experiment’s dimly lit tunnels, the team built two interferometers, L-shaped configurations of tubes. Beginning at the L’s vertex, a laser beam travels down each of the L’s 40-meter arms simultaneously, bounces off the mirrors at the ends and recombines at the starting point. Since the laser beam’s paths down each arm of the L are the same length, absent a holographic jitter, the beam should cancel itself out as it recombines. If it doesn’t, it could be evidence of the jitter, a disruption in the laser beam’s flight.

    The light path through a Michelson interferometer. The two light rays with a common source combine at the half-silvered mirror to reach the detector. They may either interfere constructively (strengthening in intensity) if their light waves arrive in phase, or interfere destructively (weakening in intensity) if they arrive out of phase, depending on the exact distances between the three mirrors.
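The cancellation can be sketched for an ideal Michelson interferometer monitored at its dark output port, where equal arm lengths give complete destructive interference; the wavelength below is an assumed value, not one from the article:

```python
import math

def recombined_intensity(delta_l, wavelength, i0=1.0):
    """Dark-port output of an ideal Michelson interferometer for arm-length
    difference delta_l. Light traverses each arm twice, so the phase difference
    is 4*pi*delta_l/wavelength; equal arms (delta_l = 0) cancel completely."""
    phase = 4 * math.pi * delta_l / wavelength
    return i0 * math.sin(phase / 2) ** 2

lam = 1064e-9  # metres; an assumed laser wavelength, not a figure from the article
for dl in (0.0, lam / 8, lam / 4):
    print(f"delta_l = {dl:.3e} m -> dark-port intensity = {recombined_intensity(dl, lam):.3f}")
```

Any light appearing at the dark port therefore signals a path-length disturbance, which is what makes the instrument sensitive to a jitter far smaller than the wavelength itself.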

    And why are there two interferometers? Because a real signal must appear in both: the particular brightening and dimming of the two beam spots will match only if it is the looked-for signal.

    “Real signals have to be in sync,” Chou says. “Random fluctuations won’t be heard by both instruments.”
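The coincidence requirement can be illustrated with a toy cross-correlation: a signal shared by both instruments survives averaging, while each instrument’s independent noise averages toward zero (all numbers below are illustrative):

```python
import random

random.seed(1)
n = 10000
signal = [random.gauss(0, 1) for _ in range(n)]  # a jitter common to both instruments

# Each interferometer records the shared signal plus its own, independent noise.
a = [s + random.gauss(0, 3) for s in signal]
b = [s + random.gauss(0, 3) for s in signal]

def correlate(x, y):
    """Average product of two zero-mean series; uncorrelated noise averages toward zero."""
    return sum(xi * yi for xi, yi in zip(x, y)) / len(x)

print(f"correlated instruments : {correlate(a, b):+.2f}  (approaches the signal variance, ~1)")
print(f"independent noise only : {correlate(a, [random.gauss(0, 3) for _ in range(n)]):+.2f}")
```

Even though each individual record is noise-dominated, the shared component stands out in the cross-correlation, which is the logic behind running two interferometers side by side.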

    Should the humble Holometer find a jitter when it looks for the signal—researchers will soon begin the initial search and expect results by 2015—the reward to physics would be extraordinarily high, especially given the scrimping behind the experiment and the fact that no one had to build an impossibly high-energy, Milky Way-sized collider. The data would support the idea that the universe we see around us is only a hologram. It would also help bring together the two thus-far-irreconcilable principles of quantum mechanics and relativity.

    “Right now, so little experimental data exists about this high-energy scale that theorists are unable to construct any meaningful models other than those based on speculation,” Chou says. “Our experiment is really a mission of exploration—to obtain data about an extremely high-energy scale that is otherwise inaccessible.”

    In the Holometer trailer, University of Michigan scientist Dick Gustafson checks a signal from the Holometer during a test.
    Photo by: Reidar Hahn, Fermilab

    What’s more, when the Holometer is up and running, it will be able to look for other phenomena that manifest themselves in the form of high-frequency gravitational waves, including topological defects in our cosmos—areas of tension between large regions in space-time that were formed by the big bang.

    “Whenever you design a new apparatus, what you’re doing is building something that’s more sensitive to some aspect of nature than anything that has ever been built before,” Chou says. “We may discover evidence of holographic jitter. But even if we don’t, if we’re smart about how we use our newly built apparatus, we may still be able to discover new aspects of our universe.”

    See the full article here.

    Symmetry is a joint Fermilab/SLAC publication.


  • richardmitnick 7:09 pm on April 3, 2014 Permalink | Reply
    Tags: Physics

    From ISOLDE at CERN: “ISOLDE sheds light on dying stars” 

    CERN New Masthead


    3 Apr 2014
    Dan Noyes

    What happens inside a dying star? A recent experiment at CERN’s REX accelerator offers clues that could help astrophysicists to recalculate the ages of some of the largest explosions in the universe.

    REX post-accelerator

    Core-collapse supernovae are spectacular stellar explosions that can briefly outshine an entire galaxy. They occur when massive stars – stars that are more than eight times as massive as our sun – collapse upon themselves. Huge amounts of matter and energy are ejected into space during these events. The cores of such stars then rapidly collapse and go on to form a neutron star or a black hole.

    The expanding remnant of SN 1987A, a Type II-P supernova in the Large Magellanic Cloud. Image: ALMA (ESO/NAOJ/NRAO)/A. Angelich; visible light: NASA/ESA Hubble Space Telescope; X-ray: NASA Chandra X-Ray Observatory.

    The sequence of events in the first few seconds of a massive star collapsing is well understood. Elements in and around the core are broken down by high-energy photons into free protons, neutrons and alpha particles. Bursts of neutrinos follow. But modelling what happens next remains a challenge for astrophysicists.

    Optical telescopes offer little detail on the explosion mechanism. Gamma-ray observatories, by contrast, offer tantalising clues, notably in the gamma rays produced by titanium-44, an isotope created naturally in supernovae, which can be detected as it is ejected from the dying stars. The amount of the isotope ejected can tell astrophysicists how the star exploded.

    The Compton Gamma Ray Observatory (CGRO) was a space observatory that detected light from 20 keV to 30 GeV in Earth orbit from 1991 to 2000. It featured four main telescopes in one spacecraft, covering X-rays and gamma rays, with various specialized sub-instruments and detectors. Following 14 years of effort, the observatory was launched from Space Shuttle Atlantis during STS-37 on 5 April 1991 and operated until its deorbit on 4 June 2000. CGRO was part of NASA’s Great Observatories series, along with the Hubble Space Telescope, the Chandra X-ray Observatory, and the Spitzer Space Telescope, and was the second of the four to launch, after Hubble. CGRO was an international collaboration, with additional contributions from the European Space Agency, various universities, and the U.S. Naval Research Laboratory.



    By understanding the behavior of titanium-44 at energies similar to those at the core of a collapsing star, researchers at CERN hope to offer some insight into the mechanisms of core-collapse supernovae.

    In a paper published in March, they reported on an experiment that used titanium-44 harvested from waste accelerator parts at the Paul Scherrer Institute (PSI) in Switzerland.

    At the ISOLDE facility at CERN, the REX team accelerated a beam of titanium-44 into a chamber of helium gas and observed the resulting collisions between the isotope and the helium atoms. The measurements – which mimic reactions occurring in the silicon-rich region just above the exploding core of a supernova – indicated that more of the isotope is ejected from core-collapse supernovae than previously thought.

    Astrophysicists can use the new data to recalculate the ages of supernovae.
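The age bookkeeping here rests on simple exponential decay: the surviving fraction of titanium-44 pins down the elapsed time. A minimal sketch (the roughly 60-year half-life is a standard literature value, not a figure from the article):

```python
import math

# Titanium-44 decays (via scandium-44 to calcium-44) with a half-life of
# roughly 60 years.  That figure is an assumption taken from standard
# nuclear data tables, not a number from the CERN result.
HALF_LIFE_YR = 60.0
DECAY_CONST = math.log(2) / HALF_LIFE_YR  # decays per year

def remaining_fraction(age_yr: float) -> float:
    """Fraction of the initial titanium-44 still present after age_yr years."""
    return math.exp(-DECAY_CONST * age_yr)

def age_from_fraction(fraction: float) -> float:
    """Invert the decay law: years elapsed, given the surviving fraction."""
    return -math.log(fraction) / DECAY_CONST

# A remnant retaining half its titanium-44 is exactly one half-life old:
print(age_from_fraction(0.5))    # 60.0 years
# A remnant the age of SN 1987A (~27 years in 2014) still holds most of it:
print(remaining_fraction(27.0))  # ~0.73
```

A revised estimate of how much titanium-44 a supernova ejects shifts the inferred initial amount, and with it the age that a measured activity implies.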

    See the full article here.


    ScienceSprings is powered by MAINGEAR computers

  • richardmitnick 1:20 pm on March 21, 2014 Permalink | Reply
    Tags: , , , General Relativity, , Physics   

    From Fermilab: “Proving special relativity: episode 1” 

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Friday, March 21, 2014

    This article was written by Dr Don Lincoln

    In 1905, Albert Einstein wrote four seminal papers.

    Albert Einstein, PhD, Nobel Laureate

    The most famous was his theory of special relativity, which describes how an object behaves as its speed increases. It predicts the most mind-bending things: Time slows down as speed goes up. Increasing speed also causes the length of an object to shrink. And, according to some science popularizations, an object’s mass increases as its velocity approaches the speed of light. (This statement is both kinda-sorta right and terribly wrong — we’ll get to that in a future column.) Perhaps its best-known prediction is that no object with mass can go faster than light. This last statement is especially disappointing, as it puts the kibosh on mankind’s dreams of zipping around the galaxy and exploring nearby stars.

    These predictions are all counterintuitive; we never see these behaviors in our everyday lives. If you’re in a high-speed jet fighter, the length of objects doesn’t shrink, objects themselves don’t get heavier, and time seems to march along at its familiar pace.

    The fact that Einstein’s predictions and common sense disagree prompts a subset of science enthusiasts to react against the theory of relativity. Science bulletin boards are full of relativity deniers, some holding firmly to the ideas of the 1800s and others espousing ideas that are alternatives to special relativity.

    Part of the gap between ordinary experience and Einstein’s predictions originates in the velocities involved. Until you get to really fast speeds, special relativity is numerically indistinguishable from the classical physics you learn in a high school or freshman college class. In fact, the difference in physics between the two approaches is less than one percent until you get to a speed of 18,600 miles per second. That’s about fast enough to get from Chicago to Honolulu in a second, and not even going the short way — that’s going via London. Given that an M-16 rifle bullet barrels through the air at about half a mile per second and that the fastest projectile ever fired moves at about 10 miles per second, it is not surprising that our intuition doesn’t accurately predict the behavior of matter under these super-high speeds.
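The "less than one percent" figure can be checked with the Lorentz factor, gamma = 1/sqrt(1 - v²/c²), the factor by which time dilates and lengths contract. A quick sketch, using the article's units of miles per second:

```python
import math

C_MILES_PER_S = 186_000.0  # speed of light, in the article's units

def lorentz_gamma(v_miles_per_s: float) -> float:
    """Factor by which time dilates and lengths contract at speed v."""
    beta = v_miles_per_s / C_MILES_PER_S
    return 1.0 / math.sqrt(1.0 - beta ** 2)

# Rifle bullet, fastest projectile ever fired, and one-tenth of light speed:
for v in (0.5, 10.0, 18_600.0):
    print(f"{v:>10.1f} mi/s  gamma = {lorentz_gamma(v):.9f}")
# At 18,600 mi/s the correction is about 0.5%; below that, smaller still.
```

At bullet speeds the factor differs from 1 only in the tenth decimal place, which is why everyday intuition never notices it.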

    Over the next three columns, we’ll talk about relativity with an emphasis on how particle and accelerator physics demonstrates without any doubt that Einstein’s ideas are correct. While you’ll have to wait for subsequent columns to learn about some detailed evidence, I can tease you with a compelling demonstration of why scientists don’t use classical physics when they design accelerators.

    Let us use the venerable Fermilab Tevatron as our example. This accelerator was a ring 3.9 miles in circumference. According to relativity, the protons in the accelerator moved at 99.99995 percent of the speed of light, or about 186,000 miles per second.

    Given these figures, relativity predicts that the protons will circle the ring about 48,000 times a second. In contrast, classical physics predicts that the velocity of protons in the Tevatron is about 46 times faster than light and therefore that a proton will orbit the ring about 2,220,000 times a second. Fermilab accelerator scientists observed the expected 48,000 times a second. Score one for relativity.
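Both predictions are easy to reproduce. The sketch below assumes a Tevatron beam energy of 980 GeV and a proton rest energy of 0.938 GeV, standard figures that the column itself does not state:

```python
import math

RING_MILES = 3.9           # Tevatron circumference
C_MILES_PER_S = 186_000.0  # speed of light
E_KIN_GEV = 980.0          # beam kinetic energy (assumed, from Tevatron specs)
M_P_GEV = 0.938            # proton rest energy

# Classical guess: E = (1/2) m v^2, so v/c = sqrt(2E / mc^2), about 46;
# classical physics literally predicts protons 46 times faster than light.
beta_classical = math.sqrt(2.0 * E_KIN_GEV / M_P_GEV)
orbits_classical = beta_classical * C_MILES_PER_S / RING_MILES

# Relativistic answer: gamma = 1 + E/mc^2 and beta = sqrt(1 - 1/gamma^2),
# which stays just below 1 no matter how much energy is added.
gamma = 1.0 + E_KIN_GEV / M_P_GEV
beta_rel = math.sqrt(1.0 - 1.0 / gamma ** 2)
orbits_rel = beta_rel * C_MILES_PER_S / RING_MILES

print(round(beta_classical))    # ~46 times the speed of light
print(round(orbits_rel))        # ~48,000 orbits per second (what is observed)
print(round(orbits_classical))  # ~2.2 million orbits per second (never seen)
```

The observed orbit rate matches the relativistic number, not the classical one, which is the whole point of the comparison.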

    In the next column, we’ll look at the energies and velocities in the various accelerators in the Fermilab complex and explore the idea of relativistic mass. Given the ability of scientists at accelerator laboratories to accelerate particles to high velocity, we are able to confront both classical and relativistic physics with real data. The message from the data is clear: Our universe obeys the laws of relativity.

    See the full article here.


    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.


  • richardmitnick 8:39 am on March 14, 2014 Permalink | Reply
    Tags: , , Physics,   

    From Quantum Diaries: “The Standard Model: a beautiful but flawed theory” 

    March 14th, 2014

    Pauline Gagnon

    This is the first part of a series of three on supersymmetry, the theory many believe could go beyond the Standard Model. First I explain what the Standard Model is and show its limitations. Then I introduce supersymmetry and explain how it would fix the main flaws of the Standard Model. Finally, I will review how experimental physicists are trying to discover “superparticles” at the Large Hadron Collider at CERN.

    The Standard Model describes what matter is made of and how it holds together. It rests on two basic ideas: all matter is made of particles, and these particles interact with each other by exchanging other particles associated with the fundamental forces.

    The basic grains of matter are fermions and the force carriers are bosons. The names of these two classes refer to their spin – or angular momentum. Fermions have half-integer values of spin whereas bosons have integer values as shown in the diagram below.

    The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Fermions come in two families. The lepton family has six members, with the electron being the best known of them. The quark family also contains six: the up and down quarks, for instance, are found inside protons and neutrons. The twelve fermions are the building blocks of matter and each has a spin value of ½.

    These particles interact with each other through fundamental forces. Each force comes with one or more force carriers. The strong nuclear force comes with the gluon and binds the quarks within protons and neutrons. The photon is associated with the electromagnetic force. The weak interaction, responsible for radioactivity, comes with the Z and W bosons. All force carriers have a spin of 1.

    The main point is: there are grains of matter, the fermions with spin ½, and force carriers, the bosons with integer values of spin.
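That split amounts to a one-line rule: half-integer spin means fermion, integer spin means boson. A tiny sketch encodes it (the spins below are the standard assignments; the Higgs, with spin 0, is included for completeness even though it is not a force carrier):

```python
# Spin (in units of h-bar) for a few Standard Model particles.
SPINS = {
    "electron": 0.5, "muon": 0.5, "tau": 0.5,
    "up": 0.5, "down": 0.5, "top": 0.5,
    "photon": 1.0, "gluon": 1.0, "W": 1.0, "Z": 1.0,
    "Higgs": 0.0,  # integer spin, so a boson, though not a force carrier
}

def is_fermion(name: str) -> bool:
    """Half-integer spin marks a grain of matter; integer spin, a boson."""
    return SPINS[name] % 1.0 == 0.5

fermions = sorted(p for p in SPINS if is_fermion(p))
bosons = sorted(p for p in SPINS if not is_fermion(p))
print(fermions)  # the matter particles
print(bosons)    # the force carriers (plus the Higgs)
```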

    The Standard Model is both remarkably simple and very powerful. There are complex equations expressing all this in a mathematical way. These equations allow theorists to make very precise predictions. Nearly every quantity that has been measured in particle physics laboratories over the past five decades falls right on the predicted value, within experimental error margins.

    So what’s wrong with the Standard Model? Essentially, one could say that the whole model lacks robustness at higher energy. As long as we observe various phenomena at low energy, as we have done so far, things behave properly. But as [particle] accelerators are getting more and more powerful, we are about to reach a level of energy which existed only shortly after the Big Bang where the equations of the Standard Model start getting shaky.

    This is a bit like with the laws of physics at low and high speed. A particle moving at near the speed of light cannot be described with the simple laws of mechanics derived by [Isaac] Newton. One needs special relativity to describe its motion.

    One major problem of the Standard Model is that it does not include gravity, one of the four fundamental forces. The model also fails to explain why gravity is so much weaker than the electromagnetic or nuclear forces. For example, a simple fridge magnet can counteract the gravitational attraction of a whole planet on a small object.

    This huge difference in the strength of fundamental forces is one aspect of the “hierarchy problem”. It also refers to the wide range in mass of the elementary particles. In the table shown above, we see the electron is about 200 times lighter than the muon and 3,500 times lighter than the tau. The same goes for the quarks: the top quark is 75,000 times heavier than the up quark. Why is there such a wide spectrum of masses among the building blocks of matter? Imagine having a Lego set containing bricks as disparate in size as that!
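The quoted ratios are easy to check against approximate particle masses (the values below are standard literature figures, not numbers from the article):

```python
# Approximate particle masses in MeV (standard literature values).
MASS_MEV = {
    "electron": 0.511,
    "muon": 105.7,
    "tau": 1776.9,
    "up": 2.3,
    "top": 173_000.0,
}

def ratio(heavier: str, lighter: str) -> float:
    """Mass ratio between two particles in the table."""
    return MASS_MEV[heavier] / MASS_MEV[lighter]

print(round(ratio("muon", "electron")))  # ~207: "about 200 times"
print(round(ratio("tau", "electron")))   # ~3,477: "about 3500 times"
print(round(ratio("top", "up")))         # ~75,000 for the quarks
```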

    The hierarchy problem is also related to the Higgs boson mass. The equations of the Standard Model establish relations between the fundamental particles. For example, in the equations, the Higgs boson has a basic mass to which theorists add a correction for each particle that interacts with it. The heavier the particle, the larger the correction. The top quark, being the heaviest particle, adds such a large correction to the theoretical Higgs boson mass that theorists wonder how the measured Higgs boson mass can be as small as it was found.

    This seems to indicate that other yet undiscovered particles exist and change the picture. In that case, the corrections to the Higgs mass from the top quark could be cancelled out by some other hypothetical particle and lead to the observed low Higgs boson mass. Supersymmetry just happens to predict the existence of such particles, hence its appeal.

    Last but not least, the Standard Model only describes visible matter, that is all matter we see around us on Earth as well as in stars and galaxies. But proofs abound telling us the Universe contains about five times more “dark matter”, a type of matter completely different from the one we know, than ordinary matter. Dark matter does not emit any light but manifests itself through its gravitational effects. Among all the particles contained in the Standard Model, none has the properties of dark matter. Hence it is clear the Standard Model gives an incomplete picture of the content of the Universe but supersymmetry could solve this problem.

    See the full article here.



  • richardmitnick 6:39 pm on March 10, 2014 Permalink | Reply
    Tags: , , , , , Physics   

    From M.I.T.: “Two-dimensional material shows promise for optoelectronics” 

    March 10, 2014
    David L. Chandler, MIT News Office

    Team creates LEDs, photovoltaic cells, and light detectors using novel one-molecule-thick material.

    A team of MIT researchers has used a novel material that’s just a few atoms thick to create devices that can harness or emit light. This proof-of-concept could lead to ultrathin, lightweight, and flexible photovoltaic cells, light emitting diodes (LEDs), and other optoelectronic devices, they say.

    In the team’s experimental setup, electricity was supplied to a tiny piece of tungsten diselenide (small rectangle at center) through two gold wires (from top left and right), causing it to emit light (bright area at center), demonstrating its potential as an LED material.
    Image courtesy of Britt Baugher and Hugh Churchill

    Microscope image shows the team’s experimental setup.
    Image courtesy of Hugh Churchill and Felice Frankel

    Their report is one of three papers by different groups describing similar results with this material, published in the March 9 issue of Nature Nanotechnology. The MIT research was carried out by Pablo Jarillo-Herrero, the Mitsui Career Development Associate Professor of Physics, graduate students Britton Baugher and Yafang Yang, and postdoc Hugh Churchill.

    The material they used, called tungsten diselenide (WSe2), is part of a class of single-molecule-thick materials under investigation for possible use in new optoelectronic devices — ones that can manipulate the interactions of light and electricity. In these experiments, the MIT researchers were able to use the material to produce diodes, the basic building block of modern electronics.

    Typically, diodes (which allow electrons to flow in only one direction) are made by “doping,” which is a process of injecting other atoms into the crystal structure of a host material. By using different materials for this irreversible process, it is possible to make either of the two basic kinds of semiconducting materials, p-type or n-type.

    But with the new material, either p-type or n-type functions can be obtained just by bringing the vanishingly thin film into very close proximity with an adjacent metal electrode, and tuning the voltage in this electrode from positive to negative. That means the material can easily and instantly be switched from one type to the other, which is rarely the case with conventional semiconductors.

    In their experiments, the MIT team produced a device with a sheet of WSe2 material that was electrically doped half n-type and half p-type, creating a working diode that has properties “very close to the ideal,” Jarillo-Herrero says.
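"Very close to the ideal" refers to the Shockley diode law, where an ideality factor n = 1 marks a perfect diode. A minimal sketch of that law (the saturation current below is an arbitrary illustrative number, not a measured WSe2 value):

```python
import math

def diode_current(v: float, i_s: float = 1e-12, n: float = 1.0,
                  t_kelvin: float = 300.0) -> float:
    """Shockley diode law: I = I_s * (exp(qV / (n k T)) - 1).

    n = 1 is the ideal diode; the saturation current i_s here is an
    arbitrary illustrative number, not a measured WSe2 value.
    """
    thermal_voltage = 8.617e-5 * t_kelvin  # kT/q in volts (~26 mV at 300 K)
    return i_s * (math.exp(v / (n * thermal_voltage)) - 1.0)

# Rectification: forward bias conducts, reverse bias blocks.
print(diode_current(0.5))   # sizeable forward current
print(diode_current(-0.5))  # roughly -i_s, essentially nothing
```

The closer a real junction's measured n sits to 1, the closer it is to this ideal behavior.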

    By making diodes, it is possible to produce all three basic optoelectronic devices — photodetectors, photovoltaic cells, and LEDs; the MIT team has demonstrated all three, Jarillo-Herrero says. While these are proof-of-concept devices, and not designed for scaling up, the successful demonstration could point the way toward a wide range of potential uses, he says.

    “It’s known how to make very large-area materials” of this type, Churchill says. While further work will be required, he says, “there’s no reason you wouldn’t be able to do it on an industrial scale.”

    In principle, Jarillo-Herrero says, because this material can be engineered to produce different values of a key property called bandgap, it should be possible to make LEDs that produce any color — something that is difficult to do with conventional materials. And because the material is so thin, transparent, and lightweight, devices such as solar cells or displays could potentially be built into building or vehicle windows, or even incorporated into clothing, he says.

    While selenium is not as abundant as silicon or other promising materials for electronics, the thinness of these sheets is a big advantage, Churchill points out: “It’s thousands or tens of thousands of times thinner” than conventional diode materials, “so you’d use thousands of times less material” to make devices of a given size.

    In addition to the diodes the team has produced, the team has also used the same methods to make p-type and n-type transistors and other electronic components, Jarillo-Herrero says. Such transistors could have a significant advantage in speed and power consumption because they are so thin, he says.

    Kirill Bolotin, an assistant professor of physics and electrical engineering at Vanderbilt University, says, “The field of two-dimensional materials is still at its infancy, and because of this, any potential devices with well-defined applications are highly desired. Perhaps the most surprising aspect of this study is that all of these devices are efficient. … It is possible that devices of this kind can transform the way we think about applications where small optoelectronic elements are needed.”

    The research was supported by the U.S. Office of Naval Research, by a Packard fellowship, and by a Pappalardo fellowship, and made use of National Science Foundation-supported facilities.

    See the full article here.


  • richardmitnick 9:59 am on March 7, 2014 Permalink | Reply
    Tags: , , Magnetics, Physics,   

    From UC Berkeley: “Colored diamonds are a superconductor’s best friend” 


    March 6, 2014
    Robert Sanders

    Flawed but colorful diamonds are among the most sensitive detectors of magnetic fields known today, allowing physicists to explore the minuscule magnetic fields in metals, exotic materials and even human tissue.

    Dmitry Budker and Ron Folman build ‘atom chips’ to probe the minuscule magnetic properties of high-temperature superconductors. Robert Sanders photo.

    University of California, Berkeley, physicist Dmitry Budker and his colleagues at Ben-Gurion University of the Negev in Israel and UCLA have now shown that these diamond sensors can measure the tiny magnetic fields in high-temperature superconductors, providing a new tool to probe these much ballyhooed but poorly understood materials.

    “Diamond sensors will give us measurements that will be useful in understanding the physics of high temperature superconductors, which, despite the fact that their discoverers won a 1987 Nobel Prize, are still not understood,” said Budker, a professor of physics and faculty scientist at Lawrence Berkeley National Laboratory.

    High-temperature superconductors are exotic mixes of materials like yttrium or bismuth that, when chilled to around 180 degrees Fahrenheit above absolute zero (-280ºF), lose all resistance to electricity, whereas low-temperature superconductors must be chilled to several degrees above absolute zero. When discovered 28 years ago, scientists predicted we would soon have room-temperature superconductors for lossless electrical transmission or magnetically levitated trains.
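The temperatures quoted are straightforward to check with a unit conversion: 180 Fahrenheit degrees above absolute zero is about 100 kelvin. A quick sketch:

```python
def kelvin_to_fahrenheit(t_k: float) -> float:
    """Convert a temperature in kelvin to degrees Fahrenheit."""
    return t_k * 9.0 / 5.0 - 459.67

# 100 K sits 180 Fahrenheit degrees above absolute zero, i.e. about -280 F:
print(kelvin_to_fahrenheit(100.0))  # -279.67
# For comparison, a "low-temperature" superconductor chilled to 4 K:
print(kelvin_to_fahrenheit(4.0))    # -452.47
```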

    It never happened.

    “The new probe may shed light on high-temperature superconductors and help theoreticians crack this open question,” said coauthor Ron Folman of Ben-Gurion University of the Negev, who is currently a Miller Visiting Professor at UC Berkeley. “With the help of this new sensor, we may be able to take a step forward.”

    Budker, Folman and their colleagues report their success in an article posted online Feb. 18 in the journal Physical Review B.

    Flawed but colorful

    Colorful diamonds, ranging from yellow and orange to purple, have been prized for millennia. Their color derives from flaws in the gem’s carbon structure: some of the carbon atoms have been replaced by an element, such as boron, that emits or absorbs a specific color of light.

    Once scientists learned how to create synthetic diamonds, they found that they could selectively alter a diamond’s optical properties by injecting impurities. In this experiment, Budker, Folman and their colleagues bombarded a synthetic diamond with nitrogen atoms to knock out carbon atoms, leaving holes in some places and nitrogen atoms in others. They then heated the crystal to force the holes, called vacancies, to move around and pair with nitrogen atoms, resulting in diamonds with so-called nitrogen-vacancy centers. For the negatively charged centers, the amount of light they re-emit when excited with light becomes very sensitive to magnetic fields, allowing them to be used as sensors that are read out by laser spectroscopy.
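Reading out a field with such a sensor comes down to the Zeeman effect: the nitrogen-vacancy spin resonance splits by an amount proportional to the magnetic field, about 2.8 MHz per gauss. A minimal sketch of that conversion (the gyromagnetic ratio is the standard NV value; the example splitting is made up):

```python
# The NV spin resonance splits in a magnetic field (Zeeman effect); the
# splitting between the two resonance lines is 2 * gamma * B, with the
# standard NV gyromagnetic ratio gamma of about 28 GHz per tesla.
GAMMA_NV_HZ_PER_T = 28.0e9

def field_from_splitting(delta_f_hz: float) -> float:
    """Magnetic field in tesla along the NV axis, from the measured
    splitting (in Hz) between the two optically read-out resonances."""
    return delta_f_hz / (2.0 * GAMMA_NV_HZ_PER_T)

# A 5.6 MHz splitting corresponds to about 1 gauss (1e-4 tesla):
print(field_from_splitting(5.6e6))
```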

    Folman noted that color centers in diamonds have the unique property of exhibiting quantum behavior, whereas most other solids at room temperature do not.

    “This is quite surprising, and is part of the reason that these new sensors have such a high potential,” Folman said.

    Applications in homeland security?

    Technology visionaries are thinking about using nitrogen-vacancy centers to probe for cracks in metals, such as bridge structures or jet engine blades, for homeland security applications, as sensitive rotation sensors, and perhaps even as building blocks for quantum computers.

    The crystal lattice of a pure diamond is pure carbon (black balls), but when a nitrogen atom replaces one carbon and an adjacent carbon is kicked out, the ‘nitrogen-vacancy center’ becomes a sensitive magnetic field sensor.

    Budker, who works on sensitive magnetic field detectors, and Folman, who builds ‘atom chips’ to probe and manipulate atoms, focused in this work on using these magnetometers to study new materials.

    “These diamond sensors combine high sensitivity with the potential for high spatial resolution, and since they operate at higher temperatures than their competitors – superconducting quantum interference device, or SQUID, magnetometers – they turn out to be good for studying high temperature superconductors,” Budker said. “Although several techniques already exist for magnetic probing of superconducting materials, there is a need for new methods which will offer better performance.”

    The team used their diamond sensor to measure properties of a thin layer of yttrium barium copper oxide (YBCO), one of the two most popular types of high-temperature superconductor. The Ben-Gurion group integrated the diamond sensor with the superconductor on one chip and used it to detect the transition from normal conductivity to superconductivity, when the material expels all magnetic fields. The sensor also detected tiny magnetic vortices, which appear and disappear as the material becomes superconducting and may be a key to understanding how these materials become superconducting at high temperatures.

    “Now that we have proved it is possible to probe high-temperature superconductors, we plan to build more sensitive and higher-resolution sensors on a chip to study the structure of an individual magnetic vortex,” Folman said. “We hope to discover something new that cannot be seen with other technologies.”

    Researchers, including Budker and Folman, are attempting to solve other mysteries through magnetic sensing. For example, they are investigating networks of nerve cells by detecting the magnetic field each nerve cell pulse emits. In another project, they aim at detecting strange never-before-observed entities called axions through their effect on magnetic sensors.

    Coauthors include Amir Waxman, Yechezkel Schlussel and David Groswasser of Ben-Gurion University of the Negev, UC Berkeley Ph.D. graduate Victor Acosta, who is now at Google [x] in Mountain View, Calif., and former UC Berkeley post-doc Louis Bouchard, now a UCLA assistant professor of chemistry and biochemistry.

    The work was supported by the NATO Science for Peace program, AFOSR/DARPA QuASAR program, the National Science Foundation and UC Berkeley’s Miller Institute for Basic Research in Science.

    See the full article here.

    Founded in the wake of the gold rush by leaders of the newly established 31st state, the University of California’s flagship campus at Berkeley has become one of the preeminent universities in the world. Its early guiding lights, charged with providing education (both “practical” and “classical”) for the state’s people, gradually established a distinguished faculty (with 22 Nobel laureates to date), a stellar research library, and more than 350 academic programs.


