Tagged: Physics

  • richardmitnick 10:59 am on June 9, 2015 Permalink | Reply
Tags: Physics

    From NYT: “A Crisis at the Edge of Physics” 

    New York Times

    The New York Times

    JUNE 5, 2015

    Gérard DuBois

    DO physicists need empirical evidence to confirm their theories?

    You may think that the answer is an obvious yes, experimental confirmation being the very heart of science. But a growing controversy at the frontiers of physics and cosmology suggests that the situation is not so simple.

A few months ago in the journal Nature, two leading researchers, George Ellis and Joseph Silk, published a controversial piece called “Scientific Method: Defend the Integrity of Physics.” They criticized a newfound willingness among some scientists to explicitly set aside the need for experimental confirmation of today’s most ambitious cosmic theories — so long as those theories are “sufficiently elegant and explanatory.” Despite working at the cutting edge of knowledge, such scientists are, for Professors Ellis and Silk, “breaking with centuries of philosophical tradition of defining scientific knowledge as empirical.”

    Whether or not you agree with them, the professors have identified a mounting concern in fundamental physics: Today, our most ambitious science can seem at odds with the empirical methodology that has historically given the field its credibility.

    How did we get to this impasse? In a way, the landmark detection three years ago of the elusive Higgs boson particle by researchers at the Large Hadron Collider marked the end of an era.

    CERN LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

Predicted about 50 years ago, the Higgs particle is the linchpin of what physicists call the “standard model” of particle physics, a powerful mathematical theory that accounts for all the fundamental entities in the quantum world (quarks and leptons) and three of the four known forces acting between them (electromagnetism and the strong and weak nuclear forces). Gravity, notably, is not part of the standard model.

    Standard Model of Particle Physics. The diagram shows the elementary particles of the Standard Model (the Higgs boson, the three generations of quarks and leptons, and the gauge bosons), including their names, masses, spins, charges, chiralities, and interactions with the strong, weak and electromagnetic forces. It also depicts the crucial role of the Higgs boson in electroweak symmetry breaking, and shows how the properties of the various particles differ in the (high-energy) symmetric phase (top) and the (low-energy) broken-symmetry phase (bottom).

But the standard model, despite the glory of its vindication, is also a dead end. It offers no path forward to unite its vision of nature’s tiny building blocks with the other great edifice of 20th-century physics: Einstein’s cosmic-scale description of gravity. Without a unification of these two theories — a so-called theory of quantum gravity — we have no idea why our universe is made up of just these particles, forces and properties. (Nor can we truly understand the Big Bang, the cosmic event that marked the beginning of time.)

    This is where the specter of an evidence-independent science arises. For most of the last half-century, physicists have struggled to move beyond the standard model to reach the ultimate goal of uniting gravity and the quantum world. Many tantalizing possibilities (like the often-discussed string theory) have been explored, but so far with no concrete success in terms of experimental validation.

    Today, the favored theory for the next step beyond the standard model is called supersymmetry (which is also the basis for string theory). Supersymmetry predicts the existence of a “partner” particle for every particle that we currently know.

    Standard Model of Supersymmetry

    It doubles the number of elementary particles of matter in nature. The theory is elegant mathematically, and the particles whose existence it predicts might also explain the universe’s unaccounted-for “dark matter.” As a result, many researchers were confident that supersymmetry would be experimentally validated soon after the Large Hadron Collider became operational.

    That’s not how things worked out, however. To date, no supersymmetric particles have been found. If the Large Hadron Collider cannot detect these particles, many physicists will declare supersymmetry — and, by extension, string theory — just another beautiful idea in physics that didn’t pan out.

    But many won’t. Some may choose instead to simply retune their models to predict supersymmetric particles at masses beyond the reach of the Large Hadron Collider’s power of detection — and that of any foreseeable substitute.

Implicit in such a maneuver is a philosophical question: How are we to determine whether a theory is true if it cannot be validated experimentally? Should we abandon it just because, at a given level of technological capacity, empirical support might be impossible? If not, how long should we wait for such experimental machinery before moving on: Ten years? Fifty years? Centuries?

    Consider, likewise, the cutting-edge theory in physics that suggests that our universe is just one universe in a profusion of separate universes that make up the so-called multiverse. This theory could help solve some deep scientific conundrums about our own universe (such as the so-called fine-tuning problem), but at considerable cost: Namely, the additional universes of the multiverse would lie beyond our powers of observation and could never be directly investigated. Multiverse advocates argue nonetheless that we should keep exploring the idea — and search for indirect evidence of other universes.

    The opposing camp, in response, has its own questions. If a theory successfully explains what we can detect but does so by positing entities that we can’t detect (like other universes or the hyperdimensional superstrings of string theory) then what is the status of these posited entities? Should we consider them as real as the verified particles of the standard model? How are scientific claims about them any different from any other untestable — but useful — explanations of reality?

    Recall the epicycles, the imaginary circles that Ptolemy used and formalized around A.D. 150 to describe the motions of planets. Although Ptolemy had no evidence for their existence, epicycles successfully explained what the ancients could see in the night sky, so they were accepted as real. But they were eventually shown to be a fiction, more than 1,500 years later. Are superstrings and the multiverse, painstakingly theorized by hundreds of brilliant scientists, anything more than modern-day epicycles?

    Just a few days ago, scientists restarted investigations with the Large Hadron Collider, after a two-year hiatus. Upgrades have made it even more powerful, and physicists are eager to explore the properties of the Higgs particle in greater detail. If the upgraded collider does discover supersymmetric particles, it will be an astonishing triumph of modern physics. But if nothing is found, our next steps may prove to be difficult and controversial, challenging not just how we do science but what it means to do science at all.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 7:15 am on June 5, 2015 Permalink | Reply
Tags: Physics

    From livescience: “The 9 Biggest Unsolved Mysteries in Physics” 2012 but Really Worth Your Time 


    July 03, 2012
    Natalie Wolchover

    In 1900, the British physicist Lord Kelvin is said to have pronounced: “There is nothing new to be discovered in physics now. All that remains is more and more precise measurement.” Within three decades, quantum mechanics and Einstein’s theory of relativity had revolutionized the field. Today, no physicist would dare assert that our physical knowledge of the universe is near completion. To the contrary, each new discovery seems to unlock a Pandora’s box of even bigger, even deeper physics questions. These are our picks for the most profound open questions of all.

    What is dark energy?

    Credit: NASA

No matter how astrophysicists crunch the numbers, the universe simply doesn’t add up. Even though gravity is pulling inward on space-time — the “fabric” of the cosmos — it keeps expanding outward faster and faster. To account for this, astrophysicists have proposed an invisible agent that counteracts gravity by pushing space-time apart. They call it dark energy. In the most widely accepted model, dark energy is a “cosmological constant” (usually denoted by the Greek capital letter lambda, Λ): an inherent property of space itself, which has “negative pressure” driving space apart. As space expands, more space is created, and with it, more dark energy. Based on the observed rate of expansion, scientists know that the sum of all the dark energy must make up more than 70 percent of the total contents of the universe. But no one knows how to look for it.

    What is dark matter?

    Credit: ESO/L. Calçada

    Evidently, about 84 percent of the matter in the universe does not absorb or emit light. “Dark matter,” as it is called, cannot be seen directly, and it hasn’t yet been detected by indirect means, either. Instead, dark matter’s existence and properties are inferred from its gravitational effects on visible matter, radiation and the structure of the universe. This shadowy substance is thought to pervade the outskirts of galaxies, and may be composed of “weakly interacting massive particles,” or WIMPs. Worldwide, there are several detectors on the lookout for WIMPs, but so far, not one has been found.

    Why is there an arrow of time?

    Credit: Image via Shutterstock

    Time moves forward because a property of the universe called “entropy,” roughly defined as the level of disorder, only increases, and so there is no way to reverse a rise in entropy after it has occurred. The fact that entropy increases is a matter of logic: There are more disordered arrangements of particles than there are ordered arrangements, and so as things change, they tend to fall into disarray. But the underlying question here is, why was entropy so low in the past? Put differently, why was the universe so ordered at its beginning, when a huge amount of energy was crammed together in a small amount of space?
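A toy counting exercise (my illustration, not from the article) makes the asymmetry concrete: with 100 coins, a perfectly ordered macrostate is realized by a single arrangement, while the 50/50 macrostate is realized by roughly 10^29 of them.

```python
from math import comb

# 100 coins, each heads or tails. A "macrostate" is the number of heads;
# the count of microstates (specific arrangements) realizing it is a
# binomial coefficient.
N = 100
ordered = comb(N, 0)        # all tails: exactly 1 arrangement
mixed = comb(N, N // 2)     # 50 heads, 50 tails: about 1.0e29 arrangements

# Random shuffling is overwhelmingly likely to land in a high-count
# (disordered) macrostate, which is why entropy rises in practice.
print(mixed / ordered)
```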

    Are there parallel universes?

    Credit: Image via Shutterstock

Astrophysical data suggests space-time might be “flat,” rather than curved, and thus that it goes on forever. If so, then the region we can see (which we think of as “the universe”) is just one patch in an infinitely large “quilted multiverse.” At the same time, the laws of quantum mechanics dictate that there are only a finite number of possible particle configurations within each cosmic patch (about 10^(10^122) distinct possibilities). So, with an infinite number of cosmic patches, the particle arrangements within them are forced to repeat — infinitely many times over. This means there are infinitely many parallel universes: cosmic patches exactly the same as ours (containing someone exactly like you), as well as patches that differ by just one particle’s position, patches that differ by two particles’ positions, and so on down to patches that are totally different from ours.
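The pigeonhole step in that argument can be sketched in a few lines (a toy model of mine, with absurdly small numbers):

```python
import itertools

# A toy "patch" has 3 cells, each in one of 2 states, so only 2**3 = 8
# configurations exist. Any collection of more than 8 patches must
# repeat a configuration; the multiverse argument is the same logic
# with finitely many configurations and infinitely many patches.
configs = list(itertools.product((0, 1), repeat=3))
assert len(configs) == 8

patches = [configs[i % len(configs)] for i in range(9)]  # 9 patches, 8 configs
assert len(set(patches)) < len(patches)                  # a repeat is forced
```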

    Is there something wrong with that logic, or is its bizarre outcome true? And if it is true, how might we ever detect the presence of parallel universes?

    Why is there more matter than antimatter?

    Credit: Image via Shutterstock

The question of why there is so much more matter than its oppositely charged twin, antimatter, is actually a question of why anything exists at all. One assumes the universe would treat matter and antimatter symmetrically, and thus that, at the moment of the Big Bang, equal amounts of matter and antimatter should have been produced. But if that had happened, there would have been a total annihilation of both: Protons would have canceled with antiprotons, electrons with anti-electrons (positrons), neutrons with antineutrons, and so on, leaving behind a dull sea of photons in a matterless expanse. For some reason, there was excess matter that didn’t get annihilated, and here we are. For this, there is no accepted explanation.

    What is the fate of the universe?

    Credit: Creative Commons Attribution-Share Alike 3.0 Unported | Bjarmason

    The fate of the universe strongly depends on a factor of unknown value: Ω, a measure of the density of matter and energy throughout the cosmos. If Ω is greater than 1, then space-time would be “closed” like the surface of an enormous sphere. If there is no dark energy, such a universe would eventually stop expanding and would instead start contracting, eventually collapsing in on itself in an event dubbed the “Big Crunch.” If the universe is closed but there is dark energy, the spherical universe would expand forever.

Alternatively, if Ω is less than 1, then the geometry of space would be “open” like the surface of a saddle. In this case, its ultimate fate is the “Big Freeze” followed by the “Big Rip”: first, the universe’s outward acceleration would tear galaxies and stars apart, leaving all matter frigid and alone. Next, the acceleration would grow so strong that it would overwhelm the effects of the forces that hold atoms together, and everything would be wrenched apart.

    If Ω = 1, the universe would be flat, extending like an infinite plane in all directions. If there is no dark energy, such a planar universe would expand forever but at a continually decelerating rate, approaching a standstill. If there is dark energy, the flat universe ultimately would experience runaway expansion leading to the Big Rip.
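The three cases can be played with numerically. Below is a crude Euler integration of a dimensionless Friedmann-style equation (a toy model of mine, not a precision cosmology code): a closed, dark-energy-free universe turns around and stops expanding, while a flat universe with dark energy keeps going.

```python
# Integrate (da/dt)^2 = omega_m/a + omega_l*a**2 + (1 - omega_m - omega_l),
# a dimensionless stand-in for the Friedmann equation, where omega_m is
# the matter density and omega_l the dark-energy (lambda) term.
def expansion_history(omega_m, omega_l, steps=50000, dt=1e-4):
    a, history = 0.01, []
    for _ in range(steps):
        rhs = omega_m / a + omega_l * a ** 2 + (1 - omega_m - omega_l)
        if rhs < 0:          # expansion halts: a closed universe turns around
            break
        a += rhs ** 0.5 * dt
        history.append(a)
    return history

closed = expansion_history(omega_m=2.0, omega_l=0.0)  # Omega > 1, no dark energy
flat = expansion_history(omega_m=0.3, omega_l=0.7)    # Omega = 1 with dark energy
```

The closed run terminates early (its scale factor stops growing), while the flat-plus-dark-energy run expands for as long as we integrate.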

    Que sera, sera.

    How do measurements collapse quantum wavefunctions?

    Credit: John D. Norton

    In the strange realm of electrons, photons and the other fundamental particles, quantum mechanics is law. Particles don’t behave like tiny balls, but rather like waves that are spread over a large area. Each particle is described by a “wavefunction,” or probability distribution, which tells what its location, velocity, and other properties are more likely to be, but not what those properties are. The particle actually has a range of values for all the properties, until you experimentally measure one of them — its location, for example — at which point the particle’s wavefunction “collapses” and it adopts just one location.
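The Born rule behind this description is easy to sketch (a toy model of mine, not tied to any particular experiment): probabilities come from squared amplitudes, and a measurement picks one outcome and discards the rest.

```python
import random

# A particle's position "wavefunction" over 4 sites, given as
# unnormalized amplitudes. Measuring position samples an outcome with
# probability |amplitude|^2 (the Born rule), after which the state
# "collapses" to the single observed position.
amplitudes = [0.1, 0.5, 0.8, 0.2]
norm = sum(a * a for a in amplitudes)
probs = [a * a / norm for a in amplitudes]      # Born-rule probabilities

outcome = random.choices(range(len(probs)), weights=probs)[0]
collapsed = [0.0] * len(amplitudes)
collapsed[outcome] = 1.0                        # one definite position now
```

How and why nature performs this update, rather than merely how to compute it, is the measurement problem.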

    But how and why does measuring a particle make its wavefunction collapse, producing the concrete reality that we perceive to exist? The issue, known as the measurement problem, may seem esoteric, but our understanding of what reality is, or if it exists at all, hinges upon the answer.

    Is string theory correct?

    Credit: Creative Commons | Lunch

    When physicists assume all the elementary particles are actually one-dimensional loops, or “strings,” each of which vibrates at a different frequency, physics gets much easier. String theory allows physicists to reconcile the laws governing particles, called quantum mechanics, with the laws governing space-time, called general relativity, and to unify the four fundamental forces of nature into a single framework. But the problem is, string theory can only work in a universe with 10 or 11 dimensions: three large spatial ones, six or seven compacted spatial ones, and a time dimension. The compacted spatial dimensions — as well as the vibrating strings themselves — are about a billionth of a trillionth of the size of an atomic nucleus. There’s no conceivable way to detect anything that small, and so there’s no known way to experimentally validate or invalidate string theory.
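The quoted size is worth unpacking (back-of-envelope arithmetic, order of magnitude only):

```python
# An atomic nucleus is roughly 1e-15 m across; "a billionth of a
# trillionth" of that is smaller by a factor of 1e-9 * 1e-12.
nucleus_m = 1e-15
string_scale_m = 1e-9 * 1e-12 * nucleus_m    # about 1e-36 m
shrink_factor = nucleus_m / string_scale_m   # strings ~1e21 times smaller
print(string_scale_m, shrink_factor)
```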

    Is there order in chaos?

    Physicists can’t exactly solve the set of equations that describes the behavior of fluids, from water to air to all other liquids and gases. In fact, it isn’t known whether a general solution of the so-called Navier-Stokes equations even exists, or, if there is a solution, whether it describes fluids everywhere, or contains inherently unknowable points called singularities. As a consequence, the nature of chaos is not well understood. Physicists and mathematicians wonder, is the weather merely difficult to predict, or inherently unpredictable? Does turbulence transcend mathematical description, or does it all make sense when you tackle it with the right math?
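For reference, the incompressible Navier–Stokes equations at the center of this question relate a fluid’s velocity field u, pressure p, density ρ, and kinematic viscosity ν:

```latex
\frac{\partial \mathbf{u}}{\partial t}
  + (\mathbf{u}\cdot\nabla)\,\mathbf{u}
  = -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u},
\qquad
\nabla\cdot\mathbf{u} = 0
```

Whether smooth solutions of these equations always exist in three dimensions is one of the Clay Mathematics Institute’s Millennium Prize Problems.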

    See the full article here.


  • richardmitnick 2:42 pm on May 30, 2015 Permalink | Reply
Tags: Physics

    From physicstoday: “Synchrotrons round the bend to make cheaper, better x rays” 

    physicstoday bloc


    June 2015
    Toni Feder

    Sweden’s new national synchrotron light source, the MAX IV in Lund, is blazing the trail to produce the brightest x rays yet from a storage ring.

    MAX IV in Lund

    The record brightness, achieved by shrinking the emittance—the product of beam size and angular divergence—of the source electrons, is thanks largely to multibend achromats (MBAs).

    Today’s synchrotrons use groups of magnets, typically two or three dipole bending magnets plus focusing and correction magnets, to send electrons around a circular storage ring. The trick with MBAs is to use more bending magnets per group, or achromat. More focusing magnets can then be interspersed between bending magnets, which makes it easier to return wayward electrons to the fold. The resulting x-ray beam is smaller, brighter, and more coherent.

    “It’s mind-boggling that in electron storage rings, which have been mature for a couple of decades, there is still a factor of 50 improvement lurking, and if we are smart enough, we can figure out how to grab it,” says Stuart Henderson, director of the upgrade project at the Advanced Photon Source (APS) at Argonne National Laboratory [ANL], near Chicago.

    APS at ANL

    The first generation of light sources, in the 1970s, was parasitic to machines built for particle-physics experiments. (See the article by Giorgio Margaritondo, Physics Today, May 2008, page 37.) The second generation was optimized for flux. “In the third generation, we deployed undulators to produce bright beams, which were accompanied by reduced emittance,” says Henderson. (See Physics Today, January 1994, page 18.) The jump in performance promised by MBAs, in parallel with other technical advances, has people calling the MAX IV and other MBA-adopting facilities fourth-generation synchrotrons.

    “Coherence is the game changer for these fourth-generation storage rings,” Henderson says. “It gives you incredible resolution, particularly in [imaging] nonperiodic systems, which after all are what most of life is made of.” The improved coherence, the increased brightness—which “takes what you can do today and puts it on steroids”—and the larger field of view hold the promise of applications across many areas of science. For example, says Henderson, “with coherent flux at high x-ray energy, you could penetrate a fully functioning battery and, with resolution approaching atomic scale, look at the electrochemistry.” Other examples include studying the early stages of crack formation in structural materials and looking at a beating heart or a breathing lung in vivo.

    Picking up the ball

    Initially, the MBA approach was widely dismissed. But now, says Hamed Tarawneh, who is in charge of insertion devices at MAX IV, “many labs are copying the idea. Lund is the Mecca.” (See the interview with Tarawneh in the Singularities department of Physics Today’s online Daily Edition.)

    The idea of MBAs for synchrotron sources goes back to a 1995 paper by Dieter Einfeld, who was a machine physicist at the Elettra light source in Trieste, Italy. But, says Einfeld, now a consultant for the upgrade to the European Synchrotron Radiation Facility (ESRF) in Grenoble, France, “my colleagues didn’t look at this option in detail.” Until, he says, Mikael Eriksson of Sweden’s MAX IV “picked up the ball” in 2003.


    Eriksson, who heads the machine group at MAX IV, says the attraction of MBAs was their relatively low cost. “In Sweden, a small country, there was no use asking for money for a large machine,” he says. “So we went the other way and looked at how to build small technology, small magnets.” The MAX IV, with a circumference half the size of some existing third-generation light sources, will produce brighter beams.

    Meanwhile, several technical advances have smoothed the way to realizing MBAs. For example, the magnet requirements drive down the size of the vacuum tubes in which the electrons circulate, from the conventional diameter of 50 mm to only 22 mm. Evacuating such narrow tubes is possible with a nonevaporable getter, a distributed pumping system that uses an alloy coating to passively absorb molecules.

    Emittance decreases as the third power of the number of bending magnets in the storage ring. The vertical emittance is already small in third-generation storage rings, typically 5–10 picometer radians; it’s the horizontal emittance that is mainly affected as the electrons fly around the ring and radiate x rays. In the synchrotron business, there is a strong push toward the diffraction limit, for which the emittance is small enough that the brightness of the x-ray beams depends only on wavelength. “That is the holy grail,” says Eriksson. “MAX IV is a factor of 20 from that.” At MAX IV, the horizontal emittance will start at about 300 pm·rad, and go down to 150 pm·rad as undulators are added.

    Integrated magnets

    MAX IV will have two storage rings: a 528-m-circumference, 3-GeV ring for hard x rays, and a 96-m, 1.5-GeV ring for soft x rays, which is a traditional research strength in Sweden. Both rings will be fed by a 1.5-GeV injector that could later be lengthened for use as a free-electron laser.

An innovative feature of MAX IV is that the MBAs are being machined into solid iron blocks (see photo on page 21). Each block is about 2.8 m long and houses a dipole bending magnet plus focusing and correcting quadrupole, sextupole, and octupole magnets. The 3-GeV ring will have 20 “seven-bend” MBAs, each made up of seven blocks. The 1.5-GeV ring will have 12 double-bend achromats. “The revolutionary thing is to have several magnets in one block,” says MAX IV laboratory director Christoph Quitmann. “Instead of installing 1000 magnets and aligning them carefully,” says Eriksson, “we only have to install 140 blocks. This is simpler. Everything is prealigned.”

    The large storage ring at the MAX IV light source in Lund, Sweden, has 20 seven-bend magnets; the lower half of one is shown here. The upper half will be added as the final installation step. Carving the magnets into iron blocks cuts costs and simplifies the alignment process.

    The computerized precision machining “makes it possible to build a huge number of magnets, mechanically stable, all for affordable cost,” says Quitmann. And, he adds, “because the magnets are smaller, the magnets in the new facility will use 10 times less power per meter of circumference than Sweden’s present third-generation machine. We will have five times more circumference but use half the electrical power. We are much more environmentally friendly, which gives a political benefit, and we save money.” The total cost of MAX IV is $500 million, including the site, buildings, three accelerators, and the first 8 of as many as 26 beamlines. Startup for users is scheduled for June 2016.

    “Everybody got excited”

    Two other new synchrotrons are being built from scratch with MBAs: Sirius, a 3-GeV facility in Campinas, Brazil, and Solaris, a replica of MAX IV’s low-energy ring, in Krakow, Poland (see the story on page 23). The ESRF is the only upgrade to MBAs yet funded, but considerations are under way at many facilities, including Soleil in France, Diamond in the UK, SPring-8 in Japan, and the APS and Advanced Light Source in the US.

    The $430 million Sirius will use five-bend MBAs. “We achieve the same emittance as Lund with fewer bends because our optics is more aggressive,” says Sirius accelerator physicist Liu Lin. “It’s a tradeoff. In principle, we have more room for insertion devices.” At Sirius, the magnets will be mounted separately, partly because the precise machining capability for the integrated magnet blocks is not locally available. Sirius is slated to turn on for debugging in 2018.

    ESRF director Francesco Sette says that the idea of an upgrade using MBAs was abandoned in 2008 because at the time switching would have meant an injection efficiency of less than 1% “or an unsustainable upgrade cost.” Then, he says, in 2012 Pantaleo Raimondi, who heads the facility’s accelerator and source division, found a solution: a hybrid seven-bend achromat, in which the dipoles are not all identical. “By adapting the bending,” explains Sette, “the energy and momentum of the electrons from the injector can be matched to the storage ring. Everybody got excited.”

    The approval and funding process for an ESRF upgrade moved quickly. “We will rip out everything in the storage ring except the straight sections,” says Sette. The horizontal emittance will shrink to 60–100 pm·rad from its current 4 nm·rad, he says. The €340 million ($380 million) upgrade began in January and is scheduled to be finished by 2022. Now, says Sette, “the biggest challenge is to deliver with minimal disruption of the user program.”

    At around the same time the ESRF upgrade got the green light, the US Department of Energy’s Basic Energy Sciences Advisory Committee looked at the US position in the international landscape of light sources. “The consensus was that the US has to get its act together in terms of light sources. The US has to have a plan to ensure competitiveness,” says Henderson.

    The APS upgrade team is looking at a hybrid MBA design similar to ESRF’s. The emittance drops rapidly with the number of bends, but there are tradeoffs to having more bends, Henderson says. “It requires gymnastics in the correction magnets. There seems to be a sweet spot around seven bends [for the APS]. Five is not aggressive enough, and nine looks too complicated.”

    Along with the switch to MBAs, the APS would decrease the energy of its storage ring from 7 GeV to 6 GeV. That’s advantageous because emittance scales as the square of the energy, explains Henderson. Combined, he says, the two changes will reduce the emittance by a factor of 50. The horizontal emittance will be about 67 pm·rad. “We can make up for the beam energy by using superconducting undulators. Replacing permanent magnets with superconducting undulator magnets gives you a boost in flux, particularly with hard x rays.”
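Combining the two scaling rules quoted here, emittance falling as the cube of the number of bends and growing as the square of the beam energy, gives a feel for the quoted factor of 50. The dipole counts below are illustrative placeholders, not APS design figures:

```python
# Ratio of new emittance to old under the article's two scaling rules:
# emittance ~ E^2 / N^3, for N bending magnets and beam energy E.
def emittance_ratio(n_bends_old, n_bends_new, e_old_gev, e_new_gev):
    bend_factor = (n_bends_old / n_bends_new) ** 3
    energy_factor = (e_new_gev / e_old_gev) ** 2
    return bend_factor * energy_factor

# Hypothetical dipole counts; the energy drop (7 GeV to 6 GeV) is quoted.
ratio = emittance_ratio(n_bends_old=80, n_bends_new=280,
                        e_old_gev=7.0, e_new_gev=6.0)
print(f"emittance shrinks by a factor of about {1 / ratio:.0f}")
```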

    Fourth-generation upgrades for APS and the Advanced Light Source are not yet priced out or funded. But to stay competitive, says Henderson, they have to be in operation by the mid 2020s. “Pretty much everyone is looking to upgrade with MBA.” And, says Eriksson, “Others are now pushing their magnet lattices harder than we dared to do.”

    Japan intends to upgrade its light source, SPring-8, on a similar time scale. The plan there is to use five-bend hybrid achromats and to reduce the storage ring energy from 8 GeV to 6 GeV; the ultimate target emittance is around 10 pm·rad, says director Tetsuya Ishikawa. The project, not yet funded, will cost an estimated ¥40 billion ($340 million).

    See the full article here.


    The American Physical Society strives to:

    Be the leading voice for physics and an authoritative source of physics information for the advancement of physics and the benefit of humanity;
    Provide effective programs in support of the physics community and the conduct of physics;
    Collaborate with national scientific societies for the advancement of science, science education and the science community;
    Cooperate with international physics societies to promote physics, to support physicists worldwide and to foster international collaboration;
    Promote an active, engaged and diverse membership, and support the activities of its units and members.

  • richardmitnick 3:32 pm on May 29, 2015 Permalink | Reply
Tags: Physics

    From ANU: “Physicists solve quantum tunneling mystery” 

    ANU Australian National University Bloc

    Australian National University

    28 May 2015
    No Writer Credit

Professor Anatoli Kheifets’ theory tackles ultrafast physics. Composite image: Stuart Hay, ANU

An international team of scientists studying ultrafast physics has solved a mystery of quantum mechanics, finding that quantum tunneling is an instantaneous process.

    The new theory could lead to faster and smaller electronic components, for which quantum tunneling is a significant factor. It will also lead to a better understanding of diverse areas such as electron microscopy, nuclear fusion and DNA mutations.

    “Timescales this short have never been explored before. It’s an entirely new world,” said one of the international team, Professor Anatoli Kheifets, from The Australian National University (ANU).

    “We have modelled the most delicate processes of nature very accurately.”

    At very small scales quantum physics shows that particles such as electrons have wave-like properties – their exact position is not well defined. This means they can occasionally sneak through apparently impenetrable barriers, a phenomenon called quantum tunneling.

    Quantum tunneling plays a role in a number of phenomena, such as nuclear fusion in the sun, scanning tunneling microscopy, and flash memory for computers. However, the leakage of particles also limits the miniaturisation of electronic components.
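For a feel of the numbers (a standard textbook estimate, not part of the ANU work): the probability of an electron leaking through a rectangular barrier falls roughly as exp(-2*kappa*L), so it is exquisitely sensitive to barrier width, which is why tunneling leakage worsens as components shrink. The barrier values below are illustrative.

```python
import math

hbar = 1.0545718e-34   # reduced Planck constant, J*s
m_e = 9.109e-31        # electron mass, kg
eV = 1.602e-19         # one electronvolt in joules

def tunneling_probability(barrier_ev, energy_ev, width_m):
    """Leading-order transmission exp(-2*kappa*L) for a rectangular barrier,
    with kappa = sqrt(2*m*(V - E))/hbar."""
    kappa = math.sqrt(2 * m_e * (barrier_ev - energy_ev) * eV) / hbar
    return math.exp(-2 * kappa * width_m)

# Halving the barrier width boosts the leakage dramatically:
thick = tunneling_probability(barrier_ev=1.0, energy_ev=0.5, width_m=1e-9)
thin = tunneling_probability(barrier_ev=1.0, energy_ev=0.5, width_m=5e-10)
```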

Professor Kheifets and Dr. Igor Ivanov, from the ANU Research School of Physics and Engineering, are members of a team which studied ultrafast experiments at the attosecond scale (10^-18 seconds), a field that has developed in the last 15 years.

    Until their work, a number of attosecond phenomena could not be adequately explained, such as the time delay when a photon ionised an atom.

    “At that timescale the time an electron takes to quantum tunnel out of an atom was thought to be significant. But the mathematics says the time during tunneling is imaginary – a complex number – which we realised meant it must be an instantaneous process,” said Professor Kheifets.

“A very interesting paradox arises, because electron velocity during tunneling may become greater than the speed of light. However, this does not contradict the special theory of relativity, as the tunneling velocity is also imaginary,” said Dr. Ivanov, who recently took up a position at the Center for Relativistic Laser Science in Korea.

    The team’s calculations, which were made using the Raijin supercomputer, revealed that the delay in photoionisation originates not from quantum tunneling but from the electric field of the nucleus attracting the escaping electron.

    The results give an accurate calibration for future attosecond-scale research, said Professor Kheifets.

    “It’s a good reference point for future experiments, such as studying proteins unfolding, or speeding up electrons in microchips,” he said.

    The research is published in Nature Physics.

    See the full article here.


    ANU Campus

    ANU is a world-leading university in Australia’s capital city, Canberra. Our location points to our unique history, ties to the Australian Government and special standing as a resource for the Australian people.

    Our focus on research as an asset, and an approach to education, ensures our graduates are in demand the world-over for their abilities to understand, and apply vision and creativity to addressing complex contemporary challenges.

  • richardmitnick 4:29 pm on May 28, 2015 Permalink | Reply
    Tags: Physics, Classical Mechanics

    From NOVA: “Ultracold Experiment Could Solve One of Physics’s Biggest Contradictions” 



    28 May 2015
    Allison Eck

    A vortex structure emerges within a rotating Bose-Einstein condensate.

    There’s a mysterious threshold that’s predicted to exist beyond the limits of what we can see. It’s called the quantum-classical transition.

    If scientists were to find it, they’d be able to solve one of the most baffling questions in physics: Why do a soccer ball and a ballet dancer obey Newtonian laws, while the subatomic particles they’re made of behave according to quantum rules? Finding the bridge between the two could usher in a new era in physics.

    We don’t yet know how the transition from the quantum world to the classical one occurs, but a new experiment, detailed in Physical Review Letters, might give us the opportunity to learn more.

    The experiment involves cooling a cloud of rubidium atoms to the point that they become virtually motionless. Theoretically, if a cloud of atoms becomes cold enough, the wave-like (quantum) nature of the individual atoms will start to expand and overlap with one another. It’s sort of like circular ripples in a pond that, as they get bigger, merge to form one large ring. This phenomenon is more commonly known as a Bose-Einstein condensate, a state of matter in which subatomic particles are chilled to near absolute zero (0 Kelvin or −273.15° C) and coalesce into a single quantum object. That quantum object is so big (compared to the individual atoms) that it’s almost macroscopic—in other words, it’s encroaching on the classical world.
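The "overlapping ripples" picture has a standard quantitative form: condensation sets in when the thermal de Broglie wavelength of each atom becomes comparable to the spacing between atoms, i.e. when the phase-space density n·λ³ reaches about 2.612. A sketch for rubidium-87 (the 100 nK temperature is an illustrative value, not taken from this experiment):

```python
import math

H = 6.626_070_15e-34              # Planck constant, J*s
KB = 1.380_649e-23                # Boltzmann constant, J/K
M_RB87 = 86.909 * 1.660_539e-27   # rubidium-87 mass, kg

def thermal_de_broglie(mass_kg, temp_k):
    """Thermal de Broglie wavelength: lambda = h / sqrt(2*pi*m*kB*T)."""
    return H / math.sqrt(2 * math.pi * mass_kg * KB * temp_k)

# At 100 nK each atom's wave packet is roughly half a micrometre across
lam = thermal_de_broglie(M_RB87, 100e-9)
# Density at which n * lambda^3 reaches the BEC threshold of ~2.612
n_critical = 2.612 / lam**3
print(f"lambda = {lam*1e6:.2f} um, critical density = {n_critical:.1e} m^-3")
```

Below that density the atoms behave as separate ripples; above it the ripples merge into the single quantum object described in the article.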

    The team of physicists cooled their cloud of atoms down to the nano-Kelvin range by trapping them in a magnetic “bowl.” To attempt further cooling, they then shot the cloud of atoms upward in a 10-meter-long pipe and let them free-fall from there, during which time the atom cloud expanded thermally. Then the scientists contained that expansion by sending another laser down onto the atoms, creating an electromagnetic field that kept the cloud from expanding further as it dropped. It created a kind of “cooling” effect, but not in the traditional way you might think—rather, the atoms have a lowered “effective temperature,” which is a measure of how quickly the atom cloud is spreading outward. At this point, then, the atom cloud can be described in terms of two separate temperatures: one in the direction of downward travel, and another in the transverse direction (perpendicular to the direction of travel).

    Here’s Chris Lee, writing for ArsTechnica:

    “This is only the start though. Like all lenses, a magnetic lens has an intrinsic limit to how well it can focus (or, in this case, collimate) the atoms. Ultimately, this limitation is given by the quantum uncertainty in the atom’s momentum and position. If the lensing technique performed at these physical limits, then the cloud’s transverse temperature would end up at a few femtoKelvin (10⁻¹⁵ K). That would be absolutely incredible.

    A really nice side effect is that combinations of lenses can be used like telescopes to compress or expand the cloud while leaving the transverse temperature very cold. It may then be possible to tune how strongly the atoms’ waves overlap and control the speed at which the transition from quantum to classical occurs. This would allow the researchers to explore the transition over a large range of conditions and make their findings more general.”

    Jason Hogan, assistant professor of physics at Stanford University and one of the study’s authors, told NOVA Next that you can understand this last part by using the Heisenberg Uncertainty Principle. As a quantum object’s uncertainty in momentum goes down, its uncertainty in position goes up. Hogan and his colleagues are essentially fine-tuning these parameters along two dimensions. If they can find a minimum uncertainty in the momentum (by cooling the particles as much as they can), then they could find the point at which the quantum-to-classical transition occurs. And that would be a spectacular discovery for the field of particle physics.
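The femtokelvin figure quoted above can be recovered from the uncertainty principle alone: for a cloud of a given size, Δp ≥ ħ/(2Δx) sets a floor on the momentum spread, which can be expressed as an effective temperature. A back-of-envelope sketch (the millimetre cloud size is an illustrative assumption, not the authors' number):

```python
import math

HBAR = 1.054_571_8e-34            # reduced Planck constant, J*s
KB = 1.380_649e-23                # Boltzmann constant, J/K
M_RB87 = 86.909 * 1.660_539e-27   # rubidium-87 mass, kg

def heisenberg_limited_temp(mass_kg, cloud_size_m):
    """Minimum momentum spread for a cloud of given size, as a temperature.

    dp >= hbar / (2*dx), and T ~ dp^2 / (m*kB).
    """
    dp = HBAR / (2 * cloud_size_m)
    return dp**2 / (mass_kg * KB)

# A perfectly collimated rubidium cloud ~1 mm across
T = heisenberg_limited_temp(M_RB87, 1e-3)
print(f"Heisenberg-limited effective temperature ~ {T:.1e} K")
```

For a millimetre-scale cloud this lands in the 10⁻¹⁵ K range, matching the "few femtokelvin" quoted above.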

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

  • richardmitnick 4:08 pm on May 26, 2015 Permalink | Reply
    Tags: Physics

    From Symmetry: “A goldmine of scientific research” 


    May 26, 2015
    Amelia Williamson Smith

    Photo by Anna Davis

    The underground home of the LUX dark matter experiment has a rich scientific history.

    There’s more than gold in the Black Hills of South Dakota. For more than five decades, the Homestake mine has hosted scientists searching for particles impossible to detect on Earth’s surface.

    It all began with the Davis Cavern.

    In the early 1960s, Ray Davis, a nuclear chemist at Brookhaven National Laboratory, designed an experiment to detect particles produced in fusion reactions in the sun. The experiment would earn him a share of the Nobel Prize in Physics in 2002.

    Davis was searching for neutrinos, fundamental particles that had been discovered only a few years before. Neutrinos are very difficult to detect; they can pass through the entire Earth without bumping into another particle. But they are constantly streaming through us. So, with a big enough detector, Davis knew he could catch at least a few.

    Davis’ experiment had to be done deep underground; without the shielding of layers of rock and earth it would be flooded by the shower of cosmic rays also constantly raining from space.

    Davis put his first small prototype detector in a limestone mine near Akron, Ohio. But it was only about half a mile underground, not deep enough.

    “The only reason for mining deep into the earth was for something valuable like gold,” says Kenneth Lande, professor of physics at the University of Pennsylvania, who worked on the experiment with Davis. “And so a gold mine became the obvious place to look.”

    But there was no precedent for hosting a particle physics experiment in such a place. “There was no case where a physics group would appear at a working mine and say, ‘Can we move in please?’”

    Davis approached the Homestake Mining Company anyway, and the company agreed to excavate a cavern for the experiment.

    BNL funded the experiment. In 1965, it was installed in a cavern 4850 feet below the surface.

    The detector consisted of a 100,000-gallon tank of perchloroethylene, a dry-cleaning fluid rich in chlorine atoms. Davis had predicted that as solar neutrinos passed through the tank, one would occasionally collide with a chlorine atom, changing it to an argon atom. After letting the detector run for a couple of months at a time, Davis’ team would flush out the tank and count the argon atoms to determine how many neutrino interactions had occurred.

    “The detector had approximately 10³¹ atoms in it. One argon atom was produced every two days,” Lande says. “To design something that could do that kind of extraction was mind-boggling.”
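The bookkeeping behind those extractions is simple but sobering: argon-37 itself decays with a roughly 35-day half-life, so the tank approaches a steady-state population where production balances decay. A sketch using the one-atom-every-two-days rate quoted above (the two-month run length is an illustrative assumption):

```python
import math

HALF_LIFE_AR37_DAYS = 35.0   # argon-37 half-life, days
PRODUCTION_RATE = 0.5        # atoms per day (one every two days)

def argon_atoms(run_days):
    """Atoms present after a run of constant production with decay:

    N(t) = (R / lam) * (1 - exp(-lam * t)),
    where lam = ln(2) / half-life.
    """
    lam = math.log(2) / HALF_LIFE_AR37_DAYS
    return (PRODUCTION_RATE / lam) * (1 - math.exp(-lam * run_days))

# After a two-month run the tank holds only ~17-18 argon atoms,
# out of ~1e31 chlorine atoms -- the needle Davis had to extract.
print(f"{argon_atoms(60):.1f} atoms")
```

The saturation value (R divided by the decay constant) is only about 25 atoms, which is why running longer than a couple of months gained little.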

    Ray Davis. Courtesy of: Brookhaven National Laboratory

    A different kind of laboratory

    During the early years of the Davis experiment, around 2000 miners worked at the mine, along with engineers and geologists. The small group of scientists working on the Davis experiment would travel down into the mine with them.

    To go down the shaft to the 4850-foot level, they would get into what was called the “cage,” a 4.4-foot by 12.5-foot metal conveyance that held 36 people. The ride down, lit only by the glow of a couple of headlamps, took about five minutes, says Tom Regan, former operations safety manager and now safety consultant, who worked as a student laborer in the mine during the early years of the Davis experiment.

    Once they reached the 4850-foot level, the scientists walked across a rock dump. “It was guarded so a person couldn’t fall down the hole,” Regan says. “But you had to sometimes wait for a production train of rock or even loads of supplies or men or materials.”

    The Davis Cavern was 24 feet long, 24 feet wide, and 30 feet high. A small room off to the side held the group’s control system. “We were basically out of touch with the rest of the world when we were underground,” Lande says. “There was no difference between day and night, heat and cold, and snow and sunshine.”

    The miners and locals from Lead, South Dakota—the community surrounding the mine—were welcoming of the scientists and interested in their work, Lande says. “We’d go out to dinner at the local restaurant and we’d hear this hot conversation in the next booth, and they would be discussing black holes and neutron stars. So science became the talk of the small town.”

    Davis Cavern, during the solar neutrino experiment. Photo by: Anna Davis

    The solar neutrino problem

    As the experiment began taking data, Davis’ group found they were detecting only about one-third the number of neutrinos predicted—a discrepancy that became known as the “solar neutrino problem.”

    Davis described the situation in his Nobel Prize biographical sketch: “My opinion in the early years was that something was wrong with the standard solar model; many physicists thought there was something wrong with my experiment.”

    However, every test of the experiment confirmed the results, and no problems were found with the model of the sun. Davis’ group began to suspect it was instead a problem with the neutrinos.

    This suspicion was confirmed in 2001, when the Sudbury Neutrino Observatory experiment [SNO] in Canada determined that as solar neutrinos travel through space, they oscillate, or change, between three flavors—electron, muon and tau. By the time neutrinos from the sun reach the Earth, they are an equal mixture of the three types.

    Sudbury Neutrino Observatory

    The Davis experiment was sensitive only to electron neutrinos, so it was able to detect only one-third of the neutrinos from the sun. The solar neutrino problem was solved.
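Why the deficit shows up as a constant fraction rather than a wiggle: over the Earth–Sun distance the oscillation phase winds through an enormous number of cycles, so the oscillating term averages away, leaving a flat suppression. A two-flavor vacuum toy model illustrates the averaging (the mixing and mass-splitting values are illustrative numbers near the solar parameters; the real one-third deficit also involves three flavors and matter effects, which this sketch does not include):

```python
import math
import random

def survival_probability(sin2_2theta, dm2_ev2, L_km, E_gev):
    """Two-flavor vacuum survival probability:

    P(nu_e -> nu_e) = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L / E).
    """
    phase = 1.27 * dm2_ev2 * L_km / E_gev
    return 1 - sin2_2theta * math.sin(phase) ** 2

random.seed(0)
# Average over the spread in Earth-Sun distance: the phase varies over
# ~1e5 radians, so sin^2 averages to 1/2 and the wiggle washes out.
samples = [survival_probability(0.85, 7.5e-5,
                                random.uniform(1.49e8, 1.51e8), 1e-3)
           for _ in range(100_000)]
avg = sum(samples) / len(samples)
print(f"averaged survival ~ {avg:.3f}")  # close to 1 - 0.85/2 = 0.575
```

Any detector sensitive only to one flavor therefore sees a steady fractional deficit, exactly the signature Davis recorded for decades.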

    Davis Cavern, during a more recent expansion. Photo by: Matthew Kapust, Sanford Underground Research Facility

    A different kind of gold

    The Davis experiment ran for almost 40 years, until the mine closed in 2003.

    But the days of science in the Davis Cavern weren’t over. In 2006, the mining company donated Homestake to the state of South Dakota. It was renamed the Sanford Underground Research Facility.

    In 2009, many former Homestake miners became technicians on a $15.2 million project to renovate the experimental area. They completed the new 30,000-square-foot Davis Campus in 2012.

    Although scientists still ride in the cage to get down to the 4850-foot level of the mine, once they arrive it looks completely different.

    “It’s a very interesting contrast,” says Stanford University professor Thomas Shutt of SLAC National Accelerator Laboratory. “Going into the mine, it’s all mining carts, rust and rock, and then you get down to the Davis Campus, and it’s a really state-of-the-art facility.”

    The campus now contains block buildings with doors and windows. It has its own heating and air conditioning system, ventilation system, humidifiers and dust filters.

    The original Davis Cavern has been expanded and now houses the Large Underground Xenon experiment, the most sensitive detector yet searching for what many consider the most promising candidate for a type of dark matter particle.

    LUX Dark matter

    Shielded from distracting background particles this far underground, scientists hope LUX will detect the rare interaction of dark matter particles with the nucleus of xenon atoms in the 368-kilogram tank.

    Another cavern nearby was excavated as part of the Davis Campus renovation project and now holds the Majorana Demonstrator experiment, which will soon start to examine whether neutrinos are their own antimatter partners.

    Majorana Demonstrator Experiment

    LUX began taking data in 2013. It is currently on its second run and will continue through spring 2016.

    After its current run, LUX will be replaced by the LUX-ZEPLIN, or LZ, experiment, which will have 50 times the usable mass and several hundred times the sensitivity of LUX.

    LZ project

    Science in the mine is still the talk of the town in Lead, says Carmen Carmona, an assistant project scientist at the University of California, Santa Barbara, who works on LUX. “When you go out on the streets and talk to people—especially the families of the miners from the gold mine days—they want to know how it is working underground now and how the experiment is going.”

    The spirit of cooperation between the mining community, the science community and the public community lives on, Regan says.

    “It’s been kind of a legacy to provide the beneficial space and be good neighbors and good hosts,” Regan says. “Our goal is for them to succeed, so we do everything we can to help and provide the best and safest place for them to do their good science.”

    In 2010, Sanford Lab enlarged the Davis Cavern to support the Large Underground Xenon experiment. Matthew Kapust, Sanford Underground Research Facility

    This cavern is being outfitted for the Compact Accelerator System Performing Astrophysical Research. CASPAR will use a low-powered accelerator to study what happens when stars die. Matthew Kapust, Sanford Underground Research Facility

    Davis Cavern undergoes outfitting for the LUX experiment. Matthew Kapust, Sanford Underground Research Facility

    Each day scientists working at the Davis Campus pass this area, known as the Big X. The entrance to the Davis Campus is to the left; Yates Shaft is to the right. Matthew Kapust, Sanford Underground Research Facility

    LUX researchers install the detector at the 4850 level. Matthew Kapust, Sanford Underground Research Facility

    The Majorana Demonstrator experiment requires a very strict level of cleanliness. Researchers work in full clean-room garb and assemble their detectors inside nitrogen-filled glove boxes. Matthew Kapust, Sanford Underground Research Facility

    The LUX detector was built in a clean room on the surface and then brought underground. Matthew Kapust, Sanford Underground Research Facility

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 3:46 pm on May 15, 2015 Permalink | Reply
    Tags: Physics

    From Physics: “Viewpoint: A More Precise Higgs Boson Mass” 

    Physics LogoAbout Physics

    Physics Logo 2


    May 14, 2015
    Chris Quigg, FNAL and ENS

    A new value for the Higgs boson mass will allow stronger tests of the standard model and of theories about the Universe’s stability.

    Figure 1: Values of the top quark and W boson masses measured in experiments (green) and inferred from calculations (blue). The inner and outer ellipses represent 68% and 95% confidence levels, respectively, for the measured and inferred values. Within current experimental and theoretical uncertainties, the two ways of determining the top quark and W boson masses agree. A more precise value of the Higgs mass would narrow the width of the blue ellipses, whereas improved measurements of the top quark and W boson masses would shrink the green ellipses, making for a more incisive test for new physics. (Note: the calculations assume the Higgs mass has a central value of 125.14 GeV, which differs insignificantly from the new measurement by ATLAS and CMS, but does not affect the width of the blue ellipses.)

    A great insight of twentieth-century science is that symmetries expressed in the laws of nature need not be manifest in the outcomes of those laws. Consider the snowflake. Its structure is a consequence of electromagnetic interactions, which are identical from any direction, but a snowflake only looks the same when rotated by multiples of 60° about a single axis. The full symmetry is hidden by the particular conditions under which the water molecules crystallize. Similarly, a symmetry relates the electromagnetic and weak interactions in the standard model of particle physics, but we know it must be concealed because the weak interactions appear much weaker than electromagnetism.

    Standard Model of Particle Physics. The diagram shows the elementary particles of the Standard Model (the Higgs boson, the three generations of quarks and leptons, and the gauge bosons), including their names, masses, spins, charges, chiralities, and interactions with the strong, weak and electromagnetic forces. It also depicts the crucial role of the Higgs boson in electroweak symmetry breaking, and shows how the properties of the various particles differ in the (high-energy) symmetric phase (top) and the (low-energy) broken-symmetry phase (bottom).

    To learn what distinguishes electromagnetism from the weak interactions was an early goal of experiments at CERN’s Large Hadron Collider (LHC).

    CERN LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

    A big part of the answer was given in mid-2012, when the ATLAS and CMS Collaborations at the LHC announced the discovery of the Higgs boson in the study of proton–proton collisions [1].


    CERN CMS Detector

    Now the discovery teams have pooled their data analyses to produce a measurement of the Higgs boson mass with 0.2% precision [2]. The new value enables physicists to make more stringent tests of the electroweak theory and of the Higgs boson’s properties.

    The electroweak theory [3] is a key element of the standard model of particle physics that weaves together ideas and observations from diverse areas of physics [4]. In the theory, interactions are prescribed by gauge symmetries. If nature displayed these symmetries explicitly, the force particles would all be massless, whereas we know experimentally that the weak interactions must—because they are short-ranged—be mediated by massive particles. The so-called Higgs field was introduced to the electroweak theory to hide the gauge symmetry, leading to weak force particles (W± and Z0) that have mass but a photon that is massless.

    The Higgs boson is a spin-zero excitation of the Higgs field and the “footprint” of the mechanism that hides the electroweak gauge symmetry in the standard model. The Higgs boson’s interactions are fully specified in terms of known couplings and masses of its decay products, but the theory does not predict its mass. Instead, experimentalists must measure the energies and momenta of the Higgs boson’s decay products and determine its mass using kinematical equations. Once that mass is known, the rates at which the Higgs boson decays into different particles can be predicted with high precision, and compared with experiment. For a mass in the neighborhood of 125 giga-electron-volts (GeV), the electroweak theory foresees a happy circumstance in which several decay paths occur at large enough rates to be detected.
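The kinematical reconstruction mentioned here is the relativistic invariant mass: square the summed energies of the decay products and subtract the squared summed momenta. A sketch for the two-photon channel, with made-up photon momenta chosen so the result lands at 125 GeV (natural units, GeV, as is conventional):

```python
import math

def invariant_mass(particles):
    """Invariant mass from four-momenta (E, px, py, pz), natural units.

    m^2 = (sum E)^2 - |sum p|^2
    """
    E = sum(p[0] for p in particles)
    px = sum(p[1] for p in particles)
    py = sum(p[2] for p in particles)
    pz = sum(p[3] for p in particles)
    return math.sqrt(E**2 - (px**2 + py**2 + pz**2))

# Two photons at right angles, each carrying E = 125/sqrt(2) GeV
E_gamma = 125.0 / math.sqrt(2)
photons = [(E_gamma, E_gamma, 0.0, 0.0),   # photon along x
           (E_gamma, 0.0, E_gamma, 0.0)]   # photon along y
print(f"m_gammagamma = {invariant_mass(photons):.2f} GeV")  # -> 125.00
```

Histogramming this quantity over many candidate events is what produces the resonance peaks described in the next paragraphs.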

    ATLAS and CMS are large, broad-acceptance detectors located in multistory caverns about 100 meters below ground [5]. In the discovery run of the LHC, the ATLAS and CMS Collaborations searched for decays of a Higgs boson into bottom-quark–antiquark pairs, tau-lepton pairs, and pairs of electroweak gauge bosons: two photons, W+W−, and Z0Z0. The actual discovery was based primarily on mass peaks associated with either the two-photon final states or Z0Z0 pairs decaying to four-lepton (electrons or muons) final states. These channels, for which the ATLAS and CMS detectors have the best mass resolution, form the basis of their new report.

    Both of the “high-resolution” final states are relatively rare: the standard model predicts that only about 1/4% of Higgs boson decays produce two-photon states; the four-lepton rate is predicted to be nearly 20 times smaller. The two-photon channel exhibits a narrow resonance peak that contains several hundred events per experiment; the Z0Z0 to four-lepton channel yields only a few tens of signal events per experiment. To see these events in the first run of the LHC, the ATLAS and CMS collaborations chose different detector technologies, and therefore different measurement and calibration methods [2]. These differences make pooling the data complicated, but also allow the experimentalists to cross-check systematic uncertainties in their separate measurements. Their combined analyses yield a Higgs boson mass of 125.09 ± 0.24 GeV, the precision of which is limited by statistics and by uncertainties in the energy or momentum scale of the ATLAS and CMS detectors.
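The flavor of such a combination can be seen with a plain inverse-variance weighted average. The input values below are illustrative stand-ins for two individual measurements, and the real ATLAS–CMS combination also has to account for correlated systematic uncertainties, which this sketch ignores:

```python
import math

def combine(measurements):
    """Inverse-variance weighted mean of (value, sigma) pairs.

    Assumes uncorrelated Gaussian uncertainties: each measurement is
    weighted by 1/sigma^2, and the combined sigma is 1/sqrt(sum of weights).
    """
    weights = [1 / s**2 for _, s in measurements]
    mean = sum(w * v for w, (v, _) in zip(weights, measurements)) / sum(weights)
    sigma = 1 / math.sqrt(sum(weights))
    return mean, sigma

# Illustrative per-experiment Higgs mass values (GeV)
mean, sigma = combine([(125.36, 0.41), (125.02, 0.30)])
print(f"combined: {mean:.2f} +/- {sigma:.2f} GeV")
```

Note how the combined uncertainty is smaller than either input: pooling the two data sets buys precision neither experiment has alone.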

    The first consequence of the new, precise mass value is sharper predictions, within the standard model, for the relative probabilities of different Higgs boson decay modes and production rates [6]. So far, the measured decay modes and production rates agree with standard-model predictions. The current uncertainties in the measured rates are large, but they will be narrowed in the coming runs at the LHC and at possible future colliders. Evidence of any deviation would suggest that the Higgs boson does not follow the standard model textbook, or that new particles or new forces are implicated in its decays.

    With a precisely known Higgs boson mass MH, theorists can also make more refined predictions of the quantum corrections to many observables, such as the Z0 decay rates. These predictions test the consistency of the electroweak theory as a quantum field theory. Figure 1 illustrates a telling example [7]. The diagonal blue ellipses show the values of the W boson and top quark masses required to reproduce a selection of electroweak observables once MH is fixed. (The narrow and wide ellipses represent 68% and 95% confidence levels, respectively.) The range of masses depends on MH, and the precision with which it is known controls the width of the blue ellipses. The preferred range overlaps the green ellipses, which show the directly measured values of the W boson and top quark masses. In the future, more precise values for the masses of the Higgs boson, W boson, and top quark could unveil a discrepancy that might lead to the discovery of new physics.

    The specific value of MH constrains speculations about physics beyond the standard model, including supersymmetric or composite models. Perhaps most provocative of all is the possibility that the measured value of the mass is special. Quantum corrections influence not just the predictions for observable quantities, but also the shape of the Higgs potential that lies behind electroweak symmetry breaking in the standard model. According to recent analyses, the newly reported value of the Higgs boson mass corresponds to a near-critical situation in which the Higgs vacuum does not lie at the state of lowest energy, but in a metastable state close to a phase transition [8]. This might imply that our Universe is living on borrowed time, or that the electroweak theory must be augmented in some way.

    With LHC Run 2 about to commence, now at higher energies, particle physicists can look forward to a new round of exploration, searches for new phenomena, and refined measurements. Combined analyses and critical evaluations, such as the measurement of the Higgs boson mass discussed here, will help make the most of the data. We still have much to learn about the Higgs boson, the electroweak theory, and beyond.


    Fermilab is operated by Fermi Research Alliance, LLC, under Contract No. DE-AC02-07CH11359 with the United States Department of Energy. I thank the Fondation Meyer pour le développement culturel et artistique for generous support.


    1. G. Aad et al. (ATLAS Collaboration), “Observation of a New Particle in the Search for the Standard Model Higgs Boson with the ATLAS Detector at the LHC,” Phys. Lett. B 716, 1 (2012); S. Chatrchyan et al. (CMS Collaboration), “Observation of a New Boson at a Mass of 125 GeV with the CMS Experiment at the LHC,” 716, 30 (2012)
    2. G. Aad et al. (ATLAS and CMS Collaborations), “Combined Measurement of the Higgs Boson Mass in pp Collisions at √s = 7 and 8 TeV with the ATLAS and CMS Experiments,” Phys. Rev. Lett. 114, 191803 (2015)
    3. The electroweak theory was developed from a proposal by S. Weinberg, “A Model of Leptons,” Phys. Rev. Lett. 19, 1264 (1967); A. Salam “Weak Electromagnetic Interactions,” in Elementary Particle Theory: Relativistic Groups and Analyticity (Nobel Symposium No. 8), edited by N. Svartholm (Almqvist and Wiksell, Stockholm, 1968), p. 367; http://j.mp/r9dJOo ; The theory is built on the SU(2)L⊗U(1)Y gauge symmetry investigated by S. L. Glashow, “Partial Symmetries of Weak Interactions,” Nucl. Phys. 22, 579 (1961)
    4. C. Quigg, “Electroweak Symmetry Breaking in Historical Perspective,” Ann. Rev. Nucl. Part. Sci.; arXiv:1503.01756
    5. ATLAS Collaboration, “The ATLAS Experiment at the CERN Large Hadron Collider,” JINST 3, S08003 (2008); CMS Collaboration, “The CMS Experiment at the CERN Large Hadron Collider,” 3, S08004 (2008)
    6. S. Heinemeyer et al. (LHC Higgs Cross Section Working Group), Handbook of LHC Higgs Cross Sections: 3. Higgs Properties, Report No. CERN-2013-004; Tables of Higgs boson branching fractions are given at http://j.mp/1OrjQL0
    7. M. Baak et al. (Gfitter Group), “The global electroweak fit at NNLO and prospects for the LHC and ILC,” Eur. Phys. J. C 74, 3046 (2014); a more detailed version of Figure 1 may be found at http://j.mp/1cvuXGQ
    8. D. Buttazzo, G. Degrassi, P. P. Giardino, G. F. Giudice, F. Sala, A. Salvio, and A. Strumia, “Investigating the Near-Criticality of the Higgs Boson,” J. High Energy Phys. 1312, 089 (2013)

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Physicists are drowning in a flood of research papers in their own fields and coping with an even larger deluge in other areas of physics. How can an active researcher stay informed about the most important developments in physics? Physics highlights a selection of papers from the Physical Review journals. In consultation with expert scientists, the editors choose these papers for their importance and/or intrinsic interest. To highlight these papers, Physics features three kinds of articles: Viewpoints are commentaries written by active researchers, who are asked to explain the results to physicists in other subfields. Focus stories are written by professional science writers in a journalistic style and are intended to be accessible to students and non-experts. Synopses are brief editor-written summaries. Physics provides a much-needed guide to the best in physics, and we welcome your comments (physics@aps.org).

  • richardmitnick 7:56 am on May 14, 2015 Permalink | Reply
    Tags: Physics

    From MIT: “Researchers build new fermion microscope” 

    MIT News

    May 13, 2015
    Jennifer Chu

    Graduate student Lawrence Cheuk adjusts the optics setup for laser cooling of sodium atoms. Photo: Jose-Luis Olivares/MIT

    Laser beams are precisely aligned before being sent into the vacuum chamber. Photo: Jose-Luis Olivares/MIT

    Sodium atoms diffuse out of an oven to form an atomic beam, which is then slowed and trapped using laser light. Photo: Jose-Luis Olivares/MIT

    A quantum gas microscope for fermionic atoms. The atoms, potassium-40, are cooled during imaging by laser light, allowing thousands of photons to be collected by the microscope. Credit: Lawrence Cheuk/MIT

    The Fermi gas microscope group: (from left) graduate students Katherine Lawrence and Melih Okan, postdoc Thomas Lompe, graduate student Matt Nichols, Professor Martin Zwierlein, and graduate student Lawrence Cheuk. Photo: Jose-Luis Olivares/MIT

    Instrument freezes and images 1,000 individual fermionic atoms at once.

    Fermions are the building blocks of matter, interacting in a multitude of permutations to give rise to the elements of the periodic table. Without fermions, the physical world would not exist.

    Examples of fermions are electrons, protons, neutrons, quarks, and atoms consisting of an odd number of these elementary particles. Because of their fermionic nature, electrons and nuclear matter are difficult to understand theoretically, so researchers are trying to use ultracold gases of fermionic atoms as stand-ins for other fermions.

    But atoms are extremely sensitive to light: When a single photon hits an atom, it can knock the particle out of place — an effect that has made imaging individual fermionic atoms devilishly hard.

    Now a team of MIT physicists has built a microscope that is able to see up to 1,000 individual fermionic atoms. The researchers devised a laser-based technique to trap and freeze fermions in place, and image the particles simultaneously.

    The new imaging technique uses two laser beams trained on a cloud of fermionic atoms in an optical lattice. The two beams, each of a different wavelength, cool the cloud, causing individual fermions to drop down an energy level, eventually bringing them to their lowest energy states — cool and stable enough to stay in place. At the same time, each fermion releases light, which is captured by the microscope and used to image the fermion’s exact position in the lattice — to an accuracy better than the wavelength of light.

    With the new technique, the researchers are able to cool and image over 95 percent of the fermionic atoms making up a cloud of potassium gas. Martin Zwierlein, a professor of physics at MIT, says an intriguing result from the technique appears to be that it can keep fermions cold even after imaging.

    “That means I know where they are, and I can maybe move them around with a little tweezer to any location, and arrange them in any pattern I’d like,” Zwierlein says.

    Zwierlein and his colleagues, including first author and graduate student Lawrence Cheuk, have published their results today in the journal Physical Review Letters.

    Seeing fermions from bosons

    For the past two decades, experimental physicists have studied ultracold atomic gases of the two classes of particles: fermions and bosons — particles such as photons that, unlike fermions, can occupy the same quantum state in limitless numbers. In 2009, physicist Marcus Greiner at Harvard University devised a microscope that successfully imaged individual bosons in a tightly spaced optical lattice. This milestone was followed, in 2010, by a second boson microscope, developed by Immanuel Bloch’s group at the Max Planck Institute of Quantum Optics.

    These microscopes revealed, in unprecedented detail, the behavior of bosons under strong interactions. However, no one had yet developed a comparable microscope for fermionic atoms.

    “We wanted to do what these groups had done for bosons, but for fermions,” Zwierlein says. “And it turned out it was much harder for fermions, because the atoms we use are not so easily cooled. So we had to find a new way to cool them while looking at them.”

    Techniques to cool atoms ever closer to absolute zero have been devised in recent decades. Carl Wieman, Eric Cornell, and MIT’s Wolfgang Ketterle were able to achieve Bose-Einstein condensation in 1995, a milestone for which they were awarded the 2001 Nobel Prize in physics. Other techniques include a process using lasers to cool atoms from 300 degrees Celsius to a few ten-thousandths of a degree above absolute zero.

    A clever cooling technique

    And yet, to see individual fermionic atoms, the particles need to be cooled further still. To do this, Zwierlein’s group created an optical lattice using laser beams, forming a structure resembling an egg carton, each well of which could potentially trap a single fermion. Through various stages of laser cooling, magnetic trapping, and further evaporative cooling of the gas, the atoms were prepared at temperatures just above absolute zero — cold enough for individual fermions to settle onto the underlying optical lattice. The team placed the lattice a mere 7 microns from an imaging lens, through which they hoped to see individual fermions.
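    The “egg carton” is just the intensity pattern of two orthogonal standing waves. A minimal sketch of that potential, assuming a hypothetical 1064 nm lattice laser (the article does not give the actual wavelengths used):

```python
import math

def lattice_potential(x, y, v0, wavelength):
    """'Egg carton' potential from two orthogonal laser standing waves.

    The minima -- the wells that can each trap a single fermion -- sit
    on a square grid with spacing wavelength / 2.  v0 is the lattice
    depth; the 1064 nm wavelength below is an assumed example.
    """
    k = 2 * math.pi / wavelength  # laser wavenumber
    return v0 * (math.sin(k * x) ** 2 + math.sin(k * y) ** 2)

wl = 1064e-9      # hypothetical lattice laser wavelength, metres
site = wl / 2     # spacing between neighboring wells

# Well bottoms repeat every half wavelength
assert abs(lattice_potential(0, 0, 1.0, wl)) < 1e-12
assert abs(lattice_potential(site, site, 1.0, wl)) < 1e-9
print(f"site spacing: {site * 1e9:.0f} nm")
```

    Under this assumption neighboring wells sit 532 nm apart, which is why the imaging system needs sub-wavelength resolution to tell one occupied site from the next.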

    However, seeing fermions requires shining light on them, and a single scattered photon can knock a fermionic atom out of its well, potentially ejecting it from the system entirely.

    “We needed a clever technique to keep the atoms cool while looking at them,” Zwierlein says.

    His team decided to use a two-laser approach to further cool the atoms; the technique manipulates an atom’s vibrational energy, the energy level it occupies within its lattice well. Each atom occupies a certain energy state — the higher that state, the more active the particle is. The team shone two laser beams of differing frequencies at the lattice, with the difference in frequencies matched to the spacing between a fermion’s vibrational levels. As a result, when both beams were directed at a fermion, the particle would absorb a photon from the lower-frequency beam and emit a photon into the higher-frequency beam, in turn dropping one energy level to a cooler, more inert state. The lens above the lattice collects the emitted photon, recording its precise position, and that of the fermion.
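    The energy bookkeeping of one cooling cycle can be sketched in a few lines. The beam frequencies and trap frequency below are hypothetical placeholders; the point is only that tuning the beat note between the beams to the vibrational spacing removes exactly one quantum per absorb-and-emit cycle:

```python
# Energy bookkeeping for one two-beam cooling cycle (illustrative numbers).
H = 6.62607015e-34  # Planck constant, J*s

trap_freq = 300e3            # hypothetical lattice vibrational frequency, Hz
f_low = 3.89e14              # hypothetical lower-frequency beam, Hz
f_high = f_low + trap_freq   # resonance: beam difference = vibrational spacing

absorbed = H * f_low         # photon taken from the lower-frequency beam
emitted = H * f_high         # photon released into the higher-frequency beam

# The atom's energy change is (absorbed - emitted): exactly minus one
# vibrational quantum, so it drops one level and ends up colder.
delta_e = absorbed - emitted
assert abs(delta_e + H * trap_freq) < 1e-33
print(f"energy removed per cycle: {-delta_e:.3e} J")
```

    Repeating this cycle walks the atom down the ladder of vibrational levels until it reaches the lowest state, cold and stable enough to stay put while it is imaged.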

    Zwierlein says such high-resolution imaging of more than 1,000 fermionic atoms simultaneously would enhance our understanding of the behavior of other fermions in nature — particularly the behavior of electrons. This knowledge may one day advance our understanding of high-temperature superconductors, which enable lossless energy transport, as well as quantum systems such as solid-state systems or nuclear matter.

    “The Fermi gas microscope, together with the ability to position atoms at will, might be an important step toward the realization of a quantum computer based on fermions,” Zwierlein says. “One would thus harness the power of the very same intricate quantum rules that so far hamper our understanding of electronic systems.”

    Zwierlein says it is a good time for Fermi gas microscopists: Around the same time his group first reported its results, teams from Harvard and the University of Strathclyde in Glasgow also reported imaging individual fermionic atoms in optical lattices, indicating a promising future for such microscopes.

    Zoran Hadzibabic, a professor of physics at Trinity College, University of Cambridge, says the group’s microscope is able to detect individual atoms “with almost perfect fidelity.”

    “They detect them reliably, and do so without affecting their positions — that’s all you want,” says Hadzibabic, who did not contribute to the research. “So far they demonstrated the technique, but we know from the experience with bosons that that’s the hardest step, and I expect the scientific results to start pouring out.”

    This research was funded in part by the National Science Foundation, the Air Force Office of Scientific Research, the Office of Naval Research, the Army Research Office, and the David and Lucile Packard Foundation.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 12:54 pm on May 12, 2015 Permalink | Reply
    Tags: ANITA, COSI, Physics, SPIDER

    From Symmetry: “High adventure physics” 


    May 12, 2015
    Angela Anderson

    Photo by Harm Schoorlemmer, ANITA

    Three groups of hardy scientists recently met up in Antarctica to launch experiments into the big blue via balloon.

    UC Berkeley grad student Carolyn Kierans recently watched her 5000-pound astrophysics experiment ascend 110,000 feet over Antarctica on the end of a helium-filled balloon the size of a football field.

    She had been up since 3 a.m. with the team that prepped and transported the telescope known as COSI—Compton Spectrometer and Imager—across the ice shelf on an oversized vehicle called “The Boss.” They waited hours at the launch site in a thick fog for the winds to die down before getting the go-ahead to fill the balloon.

    Then the sky opened up, and they were cleared for launch.

    “I was with the crew at the launch pad, in the middle of nowhere, when the clouds disappeared and I could finally see the balloon hundreds of feet up,” she recalls. “I had to stop and say, ‘Wait, I’m doing my PhD in physics right now?’”

    Kierans was among three groups of hardy physicists who met up at Antarctica’s McMurdo Station last fall to fly their curious-looking instruments during NASA’s most recent Antarctic Scientific Balloon Campaign.

    Fully assembled and flight ready, COSI gets some final adjustments from Carolyn Kierans during testing. Photo by: Laura Gerwin

    For Antarctica’s three summer months, December through February, conditions are right to conduct studies in the upper atmosphere via scientific balloon. The sun never sets during those months, so the balloons are spared nighttime temperatures that would cause significant changes in altitude. And seasonal wind patterns take the balloons on a circular route almost entirely over land.

    To allow the balloons enough time to collect data and safely land before conditions change, all launches must take place within a few weeks in December. Near the end of 2014, three teams of physicists arrived at the end of the Earth to try to launch, one after the other, within that small window.

    Each team was driven by a different scientific pursuit: COSI set out to capture images of gamma rays for clues to the life and death of stars; ANITA (Antarctic Impulsive Transient Antenna) sought rare signs of ultra-high-energy neutrinos; and SPIDER was probing the cosmic microwave background [CMB] for evidence of cosmic inflation.

    Cosmic Microwave Background  Planck
    CMB per ESA/Planck

    Months of intense preparation, naps on the floor of a barn, competition for launch times during narrow windows of opportunity, and numerous aborted attempts did not dampen spirits. The teams shared meals, supplies, hikes and live music jams with locals at one of two town bars—united by the common pursuit of physics on high.

    “The community was like a gigantic family with the same goal of getting those balloons up,” Kierans says.

    None could be sure of a successful launch. Nor could they know exactly when or where their balloon would land once it took flight or how they would navigate the icy landscape to retrieve their precious data.

    ‘The crinkling of Mylar’

    Balloon-based physics experiments take many months of preparation. The teams first met up during the summer at the Columbia Scientific Balloon Facility in Palestine, Texas, where they assembled payloads and tested science and flight systems. Then they disassembled their experiments, shipped them in boxes and put them back together at McMurdo starting in October to be launch-ready by early December. Each group had 10 to 20 team members on the continent during peak work efforts.

    “We had about eight weeks to get everything back together and perform all the calibrations—it’s an exhausting and stressful period—and a very long time to be away from family,” recalls William Jones, assistant professor of physics at Princeton University and SPIDER lead.

    A successful launch depends on the optimal functioning of gear and instruments—and the cooperation of the weather.

    First in line was the ANITA experiment. ANITA hunts for the highest energy particles ever observed. Scientists have known about ultra-high-energy neutrinos since the 1960s, but they still don’t know exactly where they come from or how they get their energy.

    “Nothing on Earth can produce such particles right now,” says Harm Schoorlemmer, a postdoctoral fellow at the University of Hawaii from the ANITA team. “They are five to seven orders of magnitude higher in energy than particles we can accelerate in machines like the LHC at CERN.”

    Neutrinos travel through the universe barely interacting with anything—until they hit the dense Earth. ANITA’s 48 antennas, mounted on a 25-foot-tall gondola, fly pointed down to capture radio pulses emerging from the Antarctic ice—signatures of ultra-high-energy neutrino interactions.

    “The ice sheet has the advantage that it is transparent for radio waves,” says Christian Miki, University of Hawaii staff scientist and ANITA on-ice lead. “By flying high—about 120,000 feet up—ANITA can capture a diameter of 600 kilometers all at once.”
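    As a sanity check on that 600-kilometer figure, the geometric horizon from float altitude sets an upper bound on the visible footprint (the usable radio-detection range is smaller than the geometric horizon):

```python
import math

EARTH_RADIUS_M = 6_371_000

def horizon_distance_m(altitude_m):
    """Distance to the geometric horizon from a given altitude.

    Standard small-angle result d = sqrt(2 * R * h); an upper bound on
    how far the payload can see, ignoring refraction and terrain.
    """
    return math.sqrt(2 * EARTH_RADIUS_M * altitude_m)

altitude = 120_000 * 0.3048  # 120,000 ft converted to metres
d_km = horizon_distance_m(altitude) / 1000
print(f"horizon at ~{d_km:.0f} km")  # ~683 km radius; the quoted 600 km
# diameter footprint sits comfortably inside this geometric bound
```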

    Numerous ANITA launch attempts were scrubbed due to weather. It took several hours from hangar to launch at the Long Duration Balloon Facility, and Antarctic weather is known for radical shifts within the hour, Miki says.

    ANITA hangs from The Boss on its way to the launch pad. Photo by: Harm Schoorlemmer, ANITA

    The day before the actual launch, the payload had been brought out of the hangar and checks were being performed when the team noticed an Emperor penguin hanging out on the edge of the launch pad. “We thought this was either good luck—getting a blessing from the Antarctic gods—or bad luck, as penguins are flightless birds,” Miki recalls.

    Apparently graced, the ANITA team rolled out on December 18 for the real deal. The 4944-pound experiment was loaded onto The Boss and taken to the launch site. Hours passed as they waited for optimal conditions; all the instruments were checked and double-checked. Finally, they got the go-ahead from NASA.

    “It’s hard to grasp the scales involved,” Schoorlemmer says. “The balloon is 800 to 900 feet above The Boss before the line is cut—buildings are about 35 to 40 feet tall. It takes one and a half hours to fill the balloon with helium, and then everything goes quiet. All we could hear is the crinkling of the Mylar and people going ‘Ooh, ooh.’”

    Hunting gamma rays

    Next up was COSI, a wide-field gamma-ray telescope that studies radiation blasted toward Earth by the most energetic or extreme environments in the universe, such as gamma-ray bursts, pulsars and nuclear decay from supernova remnants. Because gamma rays don’t make it through the Earth’s atmosphere, the telescope must rise above it. Pointed out to space, it can survey 25 percent of the sky at one time for sources of gamma-ray emissions and help detect where these high-energy photons come from. Researchers hope to use its images to learn more about the life and death of stars or the mysterious source of positrons in our galaxy.
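    A 25 percent field of view corresponds to a surprisingly wide cone. Treating it as a single circular cap on the sky, which is a simplification, the half-angle follows from the solid-angle formula:

```python
import math

def cap_half_angle_deg(sky_fraction):
    """Half-angle of a spherical cap covering a given fraction of the sky.

    Solid angle of a cap: Omega = 2*pi*(1 - cos(theta)); the full sphere
    is 4*pi, so fraction f gives cos(theta) = 1 - 2*f.  Modeling the
    field of view as one circular cap is a simplifying assumption.
    """
    return math.degrees(math.acos(1 - 2 * sky_fraction))

angle = cap_half_angle_deg(0.25)
print(f"{angle:.0f} degrees")  # a quarter of the sky spans a 60-degree cone
```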

    Testing gamma ray telescopes like COSI on balloons can help scientists develop technologies that can eventually be used on satellites. The recent COSI launch was the first to use a new ultra-long-duration balloon design in hopes of getting 100 days worth of data.

    COSI was launch-ready at the same time as ANITA but waited for it to go up before preparing to do the same. They also experienced several attempts called off due to weather.

    COSI’s super pressure balloon is finally released from the spool and takes flight. Photo by: Jeffrey Filippini, SPIDER

    “For nine days in a row, we showed up and did all the prep work,” only to abandon the efforts, Kierans says. On one attempt they got as far as laying out the balloon, which was theoretically the point of no return, before the weather turned against them. They somehow managed to put the 1.5-millimeter-thick, 5000-pound balloon back into the box. “It took 10 riggers over an hour of strenuous, delicate work” to put it back, Kierans wrote on her blog.

    Finally, on December 27 the silvery white balloon was filled with helium and cut loose, taking COSI up to the dark space above the Earth’s atmosphere.

    Jubilation at the successful launch did not last long. Just 40 hours later, a leak in the balloon forced the team to bring it back down. “It will be tough to get science data out of that short flight,” Kierans says. “But we will learn a lot. We made the decision to bring it down where we could get everything back and rebuild.”

    COSI was fully recovered by Kierans, who made three trips by Twin Otter plane to the Polar Plateau just over the Transantarctic Mountains—known as the “great flat white”—to disassemble and load up the instruments.

    Every inch of their flesh was covered to prevent frostbite. “This was not what I signed up for when I started out in physics,” she says. “But don’t get me wrong—I love it!”

    Big sky, big bang

    Last in line was SPIDER, which uses six telescopes designed to create extremely high-fidelity images of the polarization of the sky at certain wavelengths—or “colors”—of light. Scientists will use the images to search for patterns in the cosmic microwave background, the oldest light ever observed. Such patterns could provide evidence for the period of rapid expansion in the early universe known as cosmic inflation.

    Rising 118,000 feet above the Earth, the 6500-pound SPIDER is able to observe over six times more sky than Earth-based CMB experiments like BICEP.

    “Large sky coverage is the best way to be able to say whether or not the signal appears the same no matter where you look,” explains Jones, SPIDER lead.

    With just days remaining in the launch window after the COSI launch, SPIDER took advantage of a good patch of weather on the last possible day—New Year’s Eve in the US.

    SPIDER reflects its first rays of Antarctic sun with its Mylar sun shields after being rolled out of the bay. Photo by: Zigmund Kermish, SPIDER

    The team started out at 4 a.m. with what seemed like perfect weather, but the winds higher up were too fast and the launch was put on hold for about five hours. Eventually the winds died down and SPIDER was back on track to fly.

    “The launch, in particular the final few minutes once the balloon filled and released, represents the culmination of over eight years of work. It is a thrill. At the same time it is truly frightening,” Jones says.

    Princeton University graduate student Anne Gambrel left this note on the experiment’s “SPIDER on the Ice” blog: “Over the next couple of hours, we all huddled around our computers, and as each subsystem came online, working as designed, we all cheered. By 9 p.m., we were at float altitude and nothing had gone seriously wrong. I went home and slept like a rock as others got all of the details sorted and started taking data on the CMB.”

    Around and around she goes

    During the first 24 hours after their launch, the ANITA team constantly observed and tuned the instruments from the base. “There were six of us rotating in and out of the controls, while others were sleeping in cardboard boxes next to commanders,” Schoorlemmer says.

    The balloons are tracked in their circular flight around the continent, watched carefully for the optimal time to call them back to Earth.

    “Once the balloon is launched, you only have historical record to guide your intuition about where it will go,” Jones says. “No one really knows.”

    ANITA was up in the air for 22 days and 9 hours and was able to collect about twice the data of the experiment’s last polar flight.

    The instruments came down near the Australian Antarctic Station on January 9. “The Australians volunteered their services in recovering the instruments. They will go on a vessel up to Hobart and be picked up by the team in spring,” Miki says.

    SPIDER flew for about 17 days, generating approximately 85 GB of data each day, mainly from snapshots taken at about 120 images per second.
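    Those two quoted numbers imply a modest size per snapshot, as a quick back-of-envelope check shows:

```python
# Sanity check on the quoted SPIDER figures: 85 GB per day at roughly
# 120 snapshots per second implies only a few kilobytes per frame.
BYTES_PER_GB = 1e9
SECONDS_PER_DAY = 86_400

daily_bytes = 85 * BYTES_PER_GB
frames_per_day = 120 * SECONDS_PER_DAY
bytes_per_frame = daily_bytes / frames_per_day
print(f"~{bytes_per_frame / 1000:.1f} kB per snapshot")  # ~8.2 kB
```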

    This map shows SPIDER’s flight path and final resting place. Courtesy of: John Ruhl, SPIDER

    “It’s a daunting analysis task,” Jones says. But his team will eventually combine the data to make an image of the southern hemisphere representing about 10 percent of the full sky.

    SPIDER was brought down on January 17, 1500 miles from launch location “before it could go over the water and possibly not come back,” Jones says.

    The SPIDER team received assistance from the British Antarctic Survey in recovering the data. “Our experiment weighed roughly 6200 pounds, and we got back about 180,” Jones says. The rest, including the science cameras and most electronics, will remain on the West Antarctic plateau over the southern hemisphere winter.

    Other discoveries

    Finally arriving in New Zealand post-recovery, a few of the scientists went to the botanical gardens to lie on the grass.

    “To be able to walk barefoot in it!” Miki says. “I remember landing at 6 o’clock in the morning, walking out of the airport and actually smelling plants and the rain.”

    While the landscape, the science, the instruments, engineering and logistics of such balloon experiments are impressive, the Antarctic researchers were just as taken with the stalwart souls that make them happen.

    “The biggest surprise for me was the people,” Kierans says. “The contractors who work at McMurdo devote half the year to be in the harshest of continents, and they are some of the most interesting people I’ve ever met.”

    Miki concurs. “You’d be surprised who you might find working as support staff there. There was a lawyer taking a break from law; PhDs driving dozers. Some are just out of college and others are seasoned Antarctic veterans.”

    The staff is as friendly as they are professional, Miki says. “They’ll invite ‘beakers’ (what they call scientists) to parties, knitting circles, hikes, etc. With a peak population of over 900 people living in close quarters, getting along is essential.”

    Miki also reflected on the strong friendships made: “Maybe it’s the 24 hours of sunlight, living in close proximity, minimal privacy, long work hours, the desolation in which we are all immersed. Maybe it’s just that the ice attracts amazing, brilliant, talented people from around the world.”

    For Jones, the commitment such adventure-ready researchers show to their work goes above and beyond.

    “We were always supportive, always competitive, sometimes strained, sometimes ecstatic,” he says. “It’s an honor to be able to work with such talented people who are selflessly devoted to learning more about how Nature works at a fundamental level.”

    Looking down on McMurdo Station and McMurdo Sound from Observation Hill. Clio Sleator, COSI

    COSI team members Alex Lowell and Clio Sleator, with Christian Miki from ANITA, watch the launch of COSI from a distance required by safety regulations. Jeffrey Filippini, SPIDER

    Just minutes after COSI was launched, the instrument is barely visible. The balloon hasn’t yet expanded to its full size, which happens when it reaches lower pressures at float altitudes. The final shape is more like a pumpkin. Jeffrey Filippini, SPIDER

    The SPIDER parachute is prepared for launch. Jeffrey Filippini, SPIDER

    SPIDER team members inspect waveplates that rotate the polarization of the light that enters the telescopes. Anne Gambrel, SPIDER

    SPIDER generated about 85 GB of data each day of its flight. Anne Gambrel, SPIDER

    SPIDER landed right side up and then fell on its back about 1500 miles from where it launched. Sam Burrell, British Antarctic Survey

    ANITA waits on the “dance floor,” where GPS and communication systems are tested. Harm Schoorlemmer, ANITA

    The ANITA team took the appearance of this Emperor penguin on the edge of the launch pad as a “blessing from the Antarctic gods.” Christian Miki, ANITA

    ANITA’s balloon is ready to take the experiment into the big blue during launch. Harm Schoorlemmer, ANITA

    See the full article here.


    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 6:59 am on May 8, 2015 Permalink | Reply
    Tags: Physics

    From MIT: “Electrons corralled using new quantum tool” 

    MIT News

    May 7, 2015
    David L. Chandler

    Image: Jon Wyrick/NIST

    “Whispering gallery” effect confines electrons, could provide basis for new electron-optics devices.

    Researchers have succeeded in creating a new “whispering gallery” effect for electrons in a sheet of graphene — making it possible to precisely control a region that reflects electrons within the material. They say the accomplishment could provide a basic building block for new kinds of electronic lenses, as well as quantum-based devices that combine electronics and optics.

    The new system uses a needle-like probe that forms the basis of present-day scanning tunneling microscopes (STM), enabling control of both the location and the size of the reflecting region within graphene — a two-dimensional form of carbon that is just one atom thick.

    The new finding is described in a paper appearing in the journal Science, co-authored by MIT professor of physics Leonid Levitov and researchers at the National Institute of Standards and Technology (NIST), the University of Maryland, Imperial College London, and the National Institute for Materials Science (NIMS) in Tsukuba, Japan.

    When the sharp tip of the STM is poised over a sheet of graphene, it produces a circular barrier on the sheet that “acts as a perfect curved mirror” for electrons, Levitov says, reflecting them along the curved surface until they begin to interfere with themselves. This controllable reflectivity and interference is similar, he adds, to so-called “whispering gallery” confinement modes that have been used in optical and acoustic systems — but these have not been tunable or adjustable.

    “In optics, whispering gallery resonators are known and useful,” Levitov says. “They provide high-quality cavities that find applications in sensing, spectroscopy, and communications. But the usual problem in optics is they’re not tunable.” Similarly, previous attempts to create quantum “corrals” for electrons have used atoms precisely positioned on a surface, which cannot be reconfigured easily.

    The confinement in this case is produced by the boundary between two different regions on the graphene surface, corresponding to the “p” and “n” regions in a transistor. In this case, a circular region just beneath the STM tip takes on one polarity, and the surrounding region the opposite polarity, creating a controllable circular junction between the two regions. Electrons inside sheets of graphene behave like particles of light; in this case, the circular junction acts as a curved mirror that can focus and control the electrons.
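    The “curved mirror” picture can be caricatured with ray optics: a whispering-gallery mode is supported roughly when a whole number of electron wavelengths fits around the circular boundary. The radius and Fermi wavelength below are hypothetical; the actual analysis treats graphene’s Dirac electrons, which this counting argument ignores:

```python
import math

def rim_mode_number(radius_nm, electron_wavelength_nm):
    """Ray-optics caricature of a whispering-gallery resonance.

    A mode is supported roughly when an integer number of electron
    wavelengths fits around the circular p-n boundary:
    m * lambda = 2 * pi * R.  The real graphene problem requires the
    Dirac equation; this is only the counting argument behind the
    'curved mirror' confinement picture.
    """
    return (2 * math.pi * radius_nm) / electron_wavelength_nm

# Hypothetical numbers: a 50 nm junction radius, 20 nm electron wavelength
m = rim_mode_number(50, 20)
print(f"~{m:.0f} wavelengths fit around the rim")
```

    Moving or resizing the junction with the STM tip changes the radius, and hence which mode numbers resonate, which is the sense in which this cavity is tunable.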

    It’s too early to predict what specific uses might be found for this phenomenon, Levitov says, but adds, “Any resonator can be used for a variety of things.”

    “This electron resonator combines several good features. There’s clearly something special about having tunability and also high quality at the same time.”

    Philip Kim, a professor of physics at Harvard University who was not connected with this research, says it is “a very notable example of demonstrating novel electronic properties of graphene.” He adds, “Electrons in graphene behave like photons confined in a two-dimensional atomic sheet. This work unambiguously demonstrates that electrons confined in the potential created by a scanning probe microscope exhibit a wave-like resonance behavior, known as a whispering gallery mode.”

    Because the new system is based on well-established STM technology, it could be developed relatively quickly into usable devices, Levitov suggests. And conveniently, the STM not only creates the whispering gallery effect, but also provides a means of observing the results, to study the phenomenon. “The tip does double-duty in this case,” he says.

    This could be a step toward the creation of electronic lenses, Levitov says — “a concept that intrigues graphene researchers.” In principle, these could provide a way of observing objects one-thousandth the size of those visible using light waves.

    Electronic lenses would represent a fundamentally different approach from existing electron microscopes, which bombard a surface with high-energy beams of electrons, obliterating any subtle effects within the objects being observed. Electron lenses, by contrast, would be able to observe the ambient low-energy electrons within the object itself.

    An appealing feature of the setup developed at NIST is that the boundary between the two surface regions, which can serve as a lens, is movable, since it is carried along with the STM tip as it scans the surface. This could make it possible to study “subtle things about how charge carriers behave at a microscopic level, that you can’t see from the outside,” Levitov says.

    The new work by Levitov and his colleagues provides one piece of such a system — and potentially of other advanced electro-optical systems, he says, such as negative-refraction materials that have been proposed as a kind of “invisibility cloak.” The new whispering-gallery mode for electrons is part of a toolbox that could lead to a whole family of new quantum-based electron-optics devices. It could also be used for high-fidelity sensing, since such resonators “can be used to enhance your sensitivity to very small signals,” Levitov says.

    Harvard’s Kim says that this work “is an important step toward building novel electronic applications, based on the unique relativistic quantum-mechanical behavior of electrons in graphene.”

    The research team also included graduate student Joaquin Rodriguez-Nieva from MIT; Yue Zhao, Jonathan Wyrick, Fabian Natterer, Nikolai Zhitenev, and Joseph Stroscio from NIST; Cyprian Lewandowski from Imperial College London; and Kenji Watanabe and Takashi Taniguchi from NIMS.

    See the full article here.

