Tagged: Physics

  • richardmitnick 11:33 am on March 24, 2017 Permalink | Reply
    Tags: A new gem inside the CMS detector, Physics

    From Symmetry: “A new gem inside the CMS detector” 


    03/24/17
    Sarah Charley

    Photo by Maximilien Brice, CERN

    This month scientists embedded sophisticated new instruments in the heart of a Large Hadron Collider experiment.

    Sometimes big questions require big tools. That’s why a global community of scientists designed and built gigantic detectors to monitor the high-energy particle collisions generated by CERN’s Large Hadron Collider in Geneva, Switzerland. From these collisions, scientists can retrace the footsteps of the Big Bang and search for new properties of nature.

    The CMS experiment is one such detector. In 2012, it co-discovered the elusive Higgs boson with its sister experiment, ATLAS. Now, scientists want CMS to push beyond the known laws of physics and search for new phenomena that could help answer fundamental questions about our universe. But to do this, the CMS detector needed an upgrade.

    “Just like any other electronic device, over time parts of our detector wear down,” says Steve Nahn, a researcher at the US Department of Energy’s Fermi National Accelerator Laboratory and the US project manager for the CMS detector upgrades. “We’ve been planning and designing this upgrade since shortly after our experiment first started collecting data in 2010.”

    The CMS detector is built like a giant onion. It contains layers of instruments that track the trajectory, energy and momentum of particles produced in the LHC’s collisions. The vast majority of the sensors in the massive detector are packed into its center, within what is called the pixel detector. The CMS pixel detector uses sensors like those inside digital cameras but with a lightning fast shutter speed: In three dimensions, they take 40 million pictures every second.

    For the last several years, scientists and engineers at Fermilab and 21 US universities have been assembling and testing a new pixel detector to replace the current one as part of the CMS upgrade, with funding provided by the Department of Energy Office of Science and National Science Foundation.

    Maral Alyari of SUNY Buffalo and Stephanie Timpone of Fermilab measure the thermal properties of a forward pixel detector disk at Fermilab. Almost all of the construction and testing of the forward pixel detectors occurred in the United States before the components were shipped to CERN for installation inside the CMS detector. Photo by Reidar Hahn, Fermilab

    Stephanie Timpone consults a cabling map while fellow engineers Greg Derylo and Otto Alvarez inspect a completed forward pixel disk. The cabling map guides their task of routing the thin, flexible cables that connect the disk’s 672 silicon sensors to electronics boards. Photo by Maximilien Brice, CERN

    The CMS detector, located in a cavern 100 meters underground, is open for the pixel detector installation. Photo by Maximilien Brice, CERN

    Postdoctoral researcher Stefanos Leontsinis and colleague Roland Horisberger work with a mock-up of one side of the barrel pixel detector next to the LHC’s beampipe. Photo by Maximilien Brice, CERN

    Leontsinis watches the clearance as engineers slide the first part of the barrel pixel just millimeters from the LHC’s beampipe. Photo by Maximilien Brice, CERN

    Scientists and engineers lift and guide the components by hand as they prepare to insert them into the CMS detector. Photo by Maximilien Brice, CERN

    Scientists and engineers connect the cooling pipes of the forward pixel detector. The pixel detector is cooled with liquid carbon dioxide, which keeps the silicon sensors from overheating in the harsh environment close to the LHC’s high-energy collisions. Photo by Maximilien Brice, CERN

    The pixel detector consists of three sections: the innermost barrel section and two end caps called the forward pixel detectors. The tiered and can-like structure gives scientists a near-complete sphere of coverage around the collision point. Because the three pixel detectors fit on the beam pipe like three bulky bracelets, engineers designed each component as two half-moons, which latch together to form a ring around the beam pipe during the insertion process.

    Over time, scientists have increased the rate of particle collisions at the LHC. In 2016 alone, the LHC produced about as many collisions as it had in the three years of its first run combined. To be able to differentiate between dozens of simultaneous collisions, CMS needed a brand new pixel detector.

    The upgrade packs even more sensors into the heart of the CMS detector. It’s as if CMS graduated from a 66-megapixel camera to a 124-megapixel camera.
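
    To put rough numbers on the comparison (a back-of-envelope Python sketch; the pixel counts and the 40-million-pictures-per-second rate come from the article, and everything derived from them is illustrative only):

        # Illustrative arithmetic only; the inputs are the figures quoted above.
        bunch_crossings_per_s = 40e6   # "40 million pictures every second"
        old_pixels = 66e6              # original pixel detector, ~66 megapixels
        new_pixels = 124e6             # upgraded pixel detector, ~124 megapixels

        print(f"Sensor upgrade factor: {new_pixels / old_pixels:.2f}x")   # ~1.88x
        print(f"Raw pixel samples per second: {new_pixels * bunch_crossings_per_s:.1e}")
        # ~5e15 pixel samples per second before any on-detector data reduction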

    Each of the two forward pixel detectors is a mosaic of 672 silicon sensors, robust electronics and bundles of cables and optical fibers that feed electricity and instructions in and carry raw data out, according to Marco Verzocchi, a Fermilab researcher on the CMS experiment.

    The multipart, 6.5-meter-long pixel detector is as delicate as raw spaghetti. Installing the new components into a gap the size of a manhole required more than just finesse. It required months of planning and extreme coordination.

    “We practiced this installation on mock-ups of our detector many times,” says Greg Derylo, an engineer at Fermilab. “By the time we got to the actual installation, we knew exactly how we needed to slide this new component into the heart of CMS.”

    The most difficult part was maneuvering the delicate components around the pre-existing structures inside the CMS experiment.

    “In total, the full three-part pixel detector consists of six separate segments, which fit together like a three-dimensional cylindrical puzzle around the beam pipe,” says Stephanie Timpone, a Fermilab engineer. “Inserting the pieces in the right positions and right order without touching any of the pre-existing supports and protections was a well-choreographed dance.”

    For engineers like Timpone and Derylo, installing the pixel detector was the last step of a six-year process. But for the scientists working on the CMS experiment, it was just the beginning.

    “Now we have to make it work,” says Stefanos Leontsinis, a postdoctoral researcher at the University of Colorado, Boulder. “We’ll spend the next several weeks testing the components and preparing for the LHC restart.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 10:42 am on March 23, 2017 Permalink | Reply
    Tags: Physics, Scientists switch on 'artificial sun' in German lab

    From DLR via phys.org: “Scientists switch on ‘artificial sun’ in German lab” 


    German Aerospace Center

    phys.org

    March 23, 2017

    In this March 21, 2017 photo, engineer Volkmar Dohmen stands in front of xenon short-arc lamps in the DLR German national aeronautics and space research center in Juelich, western Germany. The lights are part of an artificial sun that will be used for research purposes. (Caroline Seidel/dpa via AP)

    Scientists in Germany are flipping the switch on what’s being described as “the world’s largest artificial sun,” hoping it will help shed light on new ways of making climate-friendly fuel.

    The “Synlight” experiment in Juelich, about 30 kilometers (19 miles) west of Cologne, consists of 149 giant spotlights normally used for film projectors.

    Beginning Thursday, scientists from the German Aerospace Center will experiment with this dazzling array to try to find ways of tapping the enormous amount of energy that reaches Earth in the form of light from the sun.

    One area of research will focus on how to efficiently produce hydrogen, a first step toward making artificial fuel for airplanes.

    The experiment uses as much electricity in four hours as a four-person household would in a year.
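
    A quick sanity check of that claim (hedged: the annual consumption below is a typical published figure for a four-person household, not a number from the article):

        # Assumed: a four-person household uses roughly 3500 kWh per year.
        # The four-hour runtime is from the article.
        household_kwh_per_year = 3500.0
        run_hours = 4.0

        implied_power_kw = household_kwh_per_year / run_hours
        print(f"Implied average draw: {implied_power_kw:.0f} kW")  # ~875 kW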


    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition


    DLR is the national aeronautics and space research centre of the Federal Republic of Germany. Its extensive research and development work in aeronautics, space, energy, transport and security is integrated into national and international cooperative ventures. In addition to its own research, as Germany’s space agency, DLR has been given responsibility by the federal government for the planning and implementation of the German space programme. DLR is also the umbrella organisation for the nation’s largest project management agency.

    DLR has approximately 8000 employees at 16 locations in Germany: Cologne (headquarters), Augsburg, Berlin, Bonn, Braunschweig, Bremen, Goettingen, Hamburg, Juelich, Lampoldshausen, Neustrelitz, Oberpfaffenhofen, Stade, Stuttgart, Trauen, and Weilheim. DLR also has offices in Brussels, Paris, Tokyo and Washington D.C.

     
  • richardmitnick 12:38 pm on March 22, 2017 Permalink | Reply
    Tags: NOvA sees first antineutrino, Physics

    From FNAL: “NOvA sees first antineutrino” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    March 21, 2017



    On Feb. 20, the NOvA neutrino experiment observed its first antineutrino, only two hours after the Fermilab accelerator complex switched to antineutrino delivery mode. The NOvA collaboration saw the antineutrino in the experiment’s far detector, which is located in northern Minnesota.

    NOvA scientists hope to learn more about how and why neutrinos change between one type and another. The three types, called flavors, are the muon, electron and tau neutrino. Over longer distances, neutrinos can flip between these flavors. NOvA is specifically designed to study muon neutrinos changing into electron neutrinos. Unraveling this mystery may help scientists understand why the universe is composed of matter and why that matter was not annihilated by antimatter after the Big Bang.

    This plot shows the tracks of particles resulting from an antineutrino interaction inside the NOvA far detector. Image: NOvA collaboration

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition


    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. It is America’s premier laboratory for particle physics and accelerator research, where thousands of scientists from universities and laboratories around the world collaborate on experiments at the frontiers of discovery.

     
  • richardmitnick 8:19 am on March 22, 2017 Permalink | Reply
    Tags: CHAMP Lithosphere, Physics

    From ESA: “Unravelling Earth’s magnetic field” 


    European Space Agency

    21 March 2017

    ESA’s Swarm satellites are seeing fine details in one of the most difficult layers of Earth’s magnetic field to unpick – as well as our planet’s magnetic history imprinted on Earth’s crust.



    Earth’s magnetic field can be thought of as a huge cocoon, protecting us from cosmic radiation and charged particles that bombard our planet in solar wind. Without it, life as we know it would not exist.


    Magnetosphere of Earth, original bitmap from NASA. SVG rendering by Aaron Kaase

    Most of the field (about 94%) is generated at depths greater than 3000 km by the movement of molten iron in the outer core. The remaining 6% is partly due to electrical currents in space surrounding Earth, and partly due to magnetised rocks in the upper lithosphere – the rigid outer part of Earth, consisting of the crust and upper mantle.

    Although this ‘lithospheric magnetic field’ is very weak and therefore difficult to detect from space, the Swarm trio is able to map its magnetic signals. After three years of collecting data, the highest resolution map of this field from space to date has been released.


    “By combining Swarm measurements with historical data from the German CHAMP satellite, and using a new modelling technique, it was possible to extract the tiny magnetic signals of crustal magnetisation,” explained Nils Olsen from the Technical University of Denmark, one of the scientists behind the new map.

    Lithosphere From CHAMP. http://geomag.org/info/lithosphere.html

    ESA’s Swarm mission manager, Rune Floberghagen, added: “Understanding the crust of our home planet is no easy feat. We can’t simply drill through it to measure its structure, composition and history.

    “Measurements from space have great value as they offer a sharp global view on the magnetic structure of our planet’s rigid outer shell.”

    Presented at this week’s Swarm Science Meeting in Canada, the new map shows detailed variations in this field, caused by geological structures in Earth’s crust, more precisely than previous satellite-based reconstructions.

    One of these anomalies occurs in the Central African Republic, centred around the city of Bangui, where the magnetic field is significantly sharper and stronger. The cause of this anomaly is still unknown, but some scientists speculate that it may be the result of a meteorite impact more than 540 million years ago.

    The magnetic field is in a permanent state of flux. Magnetic north wanders, and every few hundred thousand years the polarity flips so that a compass would point south instead of north.

    When new crust is generated through volcanic activity, mainly along the ocean floor, iron-rich minerals in the solidifying magma are oriented towards magnetic north, thus capturing a ‘snapshot’ of the magnetic field in the state it was in when the rocks cooled.

    Since magnetic poles flip back and forth over time, the solidified minerals form ‘stripes’ on the seafloor and provide a record of Earth’s magnetic history.

    The latest map from Swarm gives us an unprecedented global view of the magnetic stripes associated with plate tectonics, reflected in the mid-oceanic ridges.

    “These magnetic stripes are evidence of pole reversals and analysing the magnetic imprints of the ocean floor allows the reconstruction of past core field changes. They also help to investigate tectonic plate motions,” said Dhananjay Ravat from the University of Kentucky in the USA.

    “The new map defines magnetic field features down to about 250 km and will help investigate geology and temperatures in Earth’s lithosphere.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    The European Space Agency (ESA), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 19 member states. Headquartered in Paris, ESA has a staff of more than 2,000. ESA’s space flight program includes human spaceflight, mainly through the participation in the International Space Station program, the launch and operations of unmanned exploration missions to other planets and the Moon, Earth observation, science, telecommunication as well as maintaining a major spaceport, the Guiana Space Centre at Kourou, French Guiana, and designing launch vehicles. ESA science missions are based at ESTEC in Noordwijk, Netherlands, Earth Observation missions at ESRIN in Frascati, Italy, ESA Mission Control (ESOC) is in Darmstadt, Germany, the European Astronaut Centre (EAC) that trains astronauts for future missions is situated in Cologne, Germany, and the European Space Astronomy Centre is located in Villanueva de la Cañada, Spain.


     
  • richardmitnick 12:43 pm on March 21, 2017 Permalink | Reply
    Tags: Physics, Shanghai Synchrotron Radiation Facility (SSRF), Soft X-ray Free Electron Laser (SXFEL) facility

    From physicsworld.com: “China outlines free-electron laser plans” 

    physicsworld.com

    Mar 21, 2017
    Michael Banks

    Zhentang Zhao, director of the Shanghai Institute of Applied Physics.

    There was a noticeable step change in the weather today in Shanghai as the Sun finally emerged and the temperature rose somewhat.

    This time I braved the rush-hour metro system to head to the Zhangjiang Technology Park in the south of the city.

    The park is home to the Shanghai Synchrotron Radiation Facility (SSRF), which opened in 2007. The facility accelerates electrons to 3.5 GeV before making them produce X-rays that are then used by researchers to study a range of materials.

    The SSRF currently has 15 beamlines focusing on topics including energy, materials, bioscience and medicine. I was given a tour of the facility by Zhentang Zhao, director of the Shanghai Institute of Applied Physics, which operates the SSRF.

    As I found out this morning, the centre has big plans. Perhaps the sight of building materials and cranes near the SSRF should have given it away.

    Over the next six years there are plans to build a further 16 beamlines to put the SSRF at full capacity, some of which will extend 100 m or so from the synchrotron.

    Neighbouring the SSRF, scientists are also building the Soft X-ray Free Electron Laser (SXFEL) facility. The SSRF used to have a test FEL beamline, but since 2014 it has been transformed into a fully fledged centre costing 8bn RMB.

    The 250 m, 150 MeV linac for the SXFEL has been built and is currently being commissioned. Over the next couple of years two undulator beamlines will be put in place to generate X-rays with a wavelength of 9 nm at a repetition rate of 10 Hz. The X-rays will then be sent to five experimental stations that will open to users in 2019.

    There are also plans to upgrade the SXFEL so that it generates X-rays with a 2 nm wavelength (soft X-ray regime) at a frequency of 50 Hz.
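
    For reference, the quoted wavelengths translate into photon energies via the standard relation E = hc/lambda (textbook physics; the electronvolt values are derived, not from the article):

        # Photon energy for the quoted FEL wavelengths, E = h*c/lambda.
        h = 6.626e-34    # Planck constant, J s
        c = 2.998e8      # speed of light, m/s
        eV = 1.602e-19   # joules per electronvolt

        for wavelength_nm in (9.0, 2.0):
            energy_eV = h * c / (wavelength_nm * 1e-9) / eV
            print(f"{wavelength_nm} nm -> {energy_eV:.0f} eV")
        # 9 nm -> ~138 eV; 2 nm -> ~620 eV, well into the soft X-ray regime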

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    PhysicsWorld is a publication of the Institute of Physics. The Institute of Physics is a leading scientific society. We are a charitable organisation with a worldwide membership of more than 50,000, working together to advance physics education, research and application.

    We engage with policymakers and the general public to develop awareness and understanding of the value of physics and, through IOP Publishing, we are world leaders in professional scientific communications.

     
  • richardmitnick 4:35 pm on March 17, 2017 Permalink | Reply
    Tags: Physics, Scientists make microscopes from droplets, Tunable microlenses

    From MIT: “Scientists make microscopes from droplets” 

    MIT News


    March 10, 2017
    Jennifer Chu

    Researchers at MIT have devised tiny “microlenses” from complex liquid droplets, such as these pictured here, that are comparable in size to the width of a human hair. Courtesy of the researchers

    With chemistry and light, researchers can tune the focus of tiny beads of liquid.

    Liquid droplets are natural magnifiers. Look inside a single drop of water, and you are likely to see a reflection of the world around you, close up and distended as you’d see in a crystal ball.

    Researchers at MIT have now devised tiny “microlenses” from complex liquid droplets comparable in size to the width of a human hair. They report the advance this week in the journal Nature Communications.

    Each droplet consists of an emulsion, or combination of two liquids, one encapsulated in the other, similar to a bead of oil within a drop of water. Even in their simple form, these droplets can magnify and produce images of surrounding objects. But now the researchers can also reconfigure the properties of each droplet to adjust the way they filter and scatter light, similar to adjusting the focus on a microscope.

    The scientists used a combination of chemistry and light to precisely shape the curvature of the interface between the internal bead and the surrounding droplet. This interface acts as a kind of internal lens, comparable to the compounded lens elements in microscopes.
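
    The focusing power of such an interface can be estimated from the standard single-refracting-surface relation P = (n2 - n1)/R. A minimal sketch, assuming illustrative values (the refractive indices and radius below are not the paper’s measurements):

        # Optical power of one spherical interface between two media: P = (n2 - n1) / R.
        # All numerical values here are illustrative assumptions.
        n1 = 1.33     # assumed outer fluid, water-like
        n2 = 1.45     # assumed inner fluid, higher refractive index
        R = 25e-6     # assumed radius of curvature of the internal interface, m

        P = (n2 - n1) / R                        # power in diopters
        focal_length_um = 1e6 / P                # focal length in micrometers
        print(f"f ~ {focal_length_um:.0f} um")   # ~208 um for these assumptions
        # Reshaping the interface (larger or smaller R) retunes the focal length,
        # which is what the surfactant chemistry described below accomplishes.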

    “We have shown fluids are very versatile optically,” says Mathias Kolle, the Brit and Alex d’Arbeloff Career Development Assistant Professor in MIT’s Department of Mechanical Engineering. “We can create complex geometries that form lenses, and these lenses can be tuned optically. When you have a tunable microlens, you can dream up all sorts of applications.”

    For instance, Kolle says, tunable microlenses might be used as liquid pixels in a three-dimensional display, directing light to precisely determined angles and projecting images that change depending on the angle from which they are observed. He also envisions pocket-sized microscopes that could take a sample of blood and pass it over an array of tiny droplets. The droplets would capture images from varying perspectives that could be used to recover a three-dimensional image of individual blood cells.

    “We hope that we can use the imaging capacity of lenses on the microscale combined with the dynamically adjustable optical characteristics of complex fluid-based microlenses to do imaging in a way people have not done yet,” Kolle says.

    Kolle’s MIT co-authors are graduate student and lead author Sara Nagelberg, former postdoc Lauren Zarzar, junior Natalie Nicolas, former postdoc Julia Kalow, research affiliate Vishnu Sresht, professor of chemical engineering Daniel Blankschtein, professor of mechanical engineering George Barbastathis, and John D. MacArthur Professor of Chemistry Timothy Swager. Moritz Kreysing and Kaushikaram Subramanian of the Max Planck Institute of Molecular Cell Biology and Genetics are also co-authors.

    Shaping a curve

    The group’s work builds on research by Swager’s team, which in 2015 reported a new way to make and reconfigure complex emulsions. In particular, the team developed a simple technique to make and control the size and configuration of double emulsions, such as water that was suspended in oil, then suspended again in water. Kolle and his colleagues used the same techniques to make their liquid lenses.

    They first chose two transparent fluids, one with a higher refractive index (a property that relates to the speed at which light travels through a medium), and the other with a lower refractive index. The contrast between the two refractive indices can contribute to a droplet’s focusing power. The researchers poured the fluids into a vial, heated them to a temperature at which the fluids would mix, then added a water-surfactant solution. When the liquids were mixed rapidly, tiny emulsion droplets formed. As the mixture cooled, the fluids in each of the droplets separated, resulting in droplets within droplets.

    To manipulate the droplets’ optical properties, the researchers added certain concentrations and ratios of various surfactants — chemical compounds that lower the interfacial tension between two liquids. In this case, one of the surfactants the team chose was a light-sensitive molecule. When exposed to ultraviolet light, this molecule changes its shape, which modifies the tension at the droplet-water interfaces and the droplet’s focusing power. This effect can be reversed by exposure to blue light.

    “We can change focal length, for example, and we can decide where an image is picked up from, or where a laser beam focuses to,” Kolle says. “In terms of light guiding, propagation, and tailoring of light flow, it’s really a good tool.”

    Optics on the horizon

    Kolle and his colleagues tested the properties of the microlenses through a number of experiments, including one in which they poured droplets into a shallow plate, which they placed under a stencil, or “photomask,” with a cutout of a smiley face. When they turned on an overhead UV lamp, the light filtered through the holes in the photomask, activating the surfactants in the droplets underneath. Those droplets, in turn, switched from their original, flat interface to a more curved one, which strongly scattered light, thereby generating a dark pattern in the plate that resembled the photomask’s smiley face.

    The researchers also describe their idea for how the microlenses might be used as pocket-sized microscopes. They propose forming a microfluidic device with a layer of microlenses, each of which could capture an image of a tiny object flowing past, such as a blood cell. Each image would be captured from a different perspective, ultimately allowing recovery of information about the object’s three-dimensional shape.

    “The whole system could be the size of your phone or wallet,” Kolle says. “If you put some electronics around it, you have a microscope where you can flow blood cells or other cells through and visualize them in 3-D.”

    He also envisions screens, layered with microlenses, that are designed to refract light into specific directions.

    “Can we project information to one part of a crowd and different information to another part of crowd in a stadium?” Kolle says. “These kinds of optics are challenging, but possible.”

    This research was supported, in part, by the National Science Foundation, the Natural Sciences and Engineering Research Council of Canada, and the Max Planck Society.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition


    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.


     
  • richardmitnick 12:52 pm on March 16, 2017 Permalink | Reply
    Tags: Nautilus, Physics, Supersymmetry

    From Nautilus: “A Brief History of the Grand Unified Theory of Physics” 


    March 16, 2017
    Lawrence M. Krauss
    Paintings by Jonathan Feldschuh

    Particle physicists had two nightmares before the Higgs particle was discovered in 2012. The first was that the Large Hadron Collider (LHC) particle accelerator would see precisely nothing.


    CERN ATLAS and CMS Higgs event displays; the ATLAS and CMS detectors; the LHC at CERN.

    For if it saw nothing, the LHC would likely be the last large accelerator ever built to probe the fundamental makeup of the cosmos. The second was that the LHC would discover the Higgs particle predicted by theoretical physicist Peter Higgs in 1964 … and nothing else.

    Each time we peel back one layer of reality, other layers beckon. So each important new development in science generally leaves us with more questions than answers. But it also usually leaves us with at least the outline of a road map to help us begin to seek answers to those questions. The successful discovery of the Higgs particle, and with it the validation of the existence of an invisible background Higgs field throughout space (in the quantum world, every particle like the Higgs is associated with a field), was a profound validation of the bold scientific developments of the 20th century.

    Particles #22

    However, the words of Sheldon Glashow continue to ring true: The Higgs is like a toilet. It hides all the messy details we would rather not speak of. The Higgs field interacts with most elementary particles as they travel through space, producing a resistive force that slows their motion and makes them appear massive. Thus, the masses of elementary particles that we measure, and that make the world of our experience possible, are something of an illusion—an accident of our particular experience.

    As elegant as this idea might be, it is essentially an ad hoc addition to the Standard Model of physics—which explains three of the four known forces of nature, and how these forces interact with matter.


    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    It is added to the theory to do what is required to accurately model the world of our experience. But it is not required by the theory. The universe could have happily existed with massless particles and a long-range weak force (which, along with the strong force, gravity, and electromagnetism, make up the four known forces). We would just not be here to ask about them. Moreover, the detailed physics of the Higgs is undetermined within the Standard Model alone. The Higgs could have been 20 times heavier, or 100 times lighter.

    Why, then, does the Higgs exist at all? And why does it have the mass it does? (Recognizing that whenever scientists ask “Why?” we really mean “How?”) If the Higgs did not exist, the world we see would not exist, but surely that is not an explanation. Or is it? Ultimately to understand the underlying physics behind the Higgs is to understand how we came to exist. When we ask, “Why are we here?,” at a fundamental level we may as well be asking, “Why is the Higgs here?” And the Standard Model gives no answer to this question.

    Some hints do exist, however, coming from a combination of theory and experiment. Shortly after the fundamental structure of the Standard Model became firmly established, in 1974, and well before the details were experimentally verified over the next decade, two different groups of physicists at Harvard, where both Sheldon Glashow and Steven Weinberg were working, noticed something interesting. Glashow, along with Howard Georgi, did what Glashow did best: They looked for patterns among the existing particles and forces and sought out new possibilities using the mathematics of group theory.

    In the Standard Model the weak and electromagnetic forces of nature are unified at a high-energy scale, into a single force that physicists call the “electroweak force.” This means that the mathematics governing the weak and electromagnetic forces are the same, both constrained by the same mathematical symmetry, and the two forces are different reflections of a single underlying theory. But the symmetry is “spontaneously broken” by the Higgs field, which interacts with the particles that convey the weak force, but not the particles that convey the electromagnetic force. This accident of nature causes these two forces to appear as two separate and distinct forces at scales we can measure—with the weak force being short-range and electromagnetism remaining long-range.

    Georgi and Glashow tried to extend this idea to include the strong force, and discovered that all of the known particles and the three non-gravitational forces could naturally fit within a single fundamental symmetry structure. They then speculated that this symmetry could spontaneously break at some ultrahigh energy scale (and short distance scale) far beyond the range of current experiments, leaving two separate and distinct unbroken symmetries left over—resulting in separate strong and electroweak forces. Subsequently, at a lower energy and larger distance scale, the electroweak symmetry would break, separating the electroweak force into the short-range weak and the long-range electromagnetic force.

    They called such a theory, modestly, a Grand Unified Theory (GUT).

    At around the same time, Weinberg and Georgi, along with Helen Quinn, noticed something interesting, following the work of Frank Wilczek, David Gross, and David Politzer. While the strong interaction got weaker at smaller distance scales, the electromagnetic and weak interactions got stronger.

    It didn’t take a rocket scientist to wonder whether the strength of the three different interactions might become identical at some small-distance scale. When they did the calculations, they found (with the accuracy with which the interactions were then measured) that such a unification looked possible, but only if the scale of unification was about 15 orders of magnitude in scale smaller than the size of the proton.
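
    At one loop, that calculation is simple enough to reproduce schematically. In the standard parametrization (textbook coefficients and coupling values, not figures from this article), each inverse coupling runs logarithmically with energy:

        # One-loop running: alpha_i^-1(Q) = alpha_i^-1(MZ) - b_i/(2*pi) * ln(Q/MZ).
        # Coefficients and couplings are approximate Standard Model textbook values.
        import math

        MZ = 91.19                          # Z boson mass, GeV
        alpha_inv_MZ = [59.0, 29.6, 8.5]    # ~1/alpha at MZ: U(1) (GUT-normalized), SU(2), SU(3)
        b = [41/10, -19/6, -7]              # one-loop beta coefficients

        def alpha_inv(Q, i):
            return alpha_inv_MZ[i] - b[i] / (2 * math.pi) * math.log(Q / MZ)

        for Q in (1e3, 1e9, 1e15):          # energy scale in GeV
            print(f"Q = {Q:.0e} GeV:", [round(alpha_inv(Q, i), 1) for i in range(3)])
        # The three inverse couplings drift toward one another near 10^14-10^15 GeV,
        # close enough, at 1970s measurement precision, to suggest unification.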

    This was good news if the unified theory was the one proposed by Howard Georgi and Glashow—because if all the particles we observe in nature got unified this way, then new particles (called gauge bosons) would exist that produce transitions between quarks (which make up protons and neutrons), and electrons and neutrinos. That would mean protons could decay into other lighter particles, which we could potentially observe. As Glashow put it, “Diamonds aren’t forever.”

    Even then it was known that protons must have an incredibly long lifetime. Not just because we still exist almost 14 billion years after the big bang, but because we all don’t die of cancer as children. If protons decayed with an average lifetime smaller than about a billion billion years, then enough protons would decay in our bodies during our childhood to produce enough radiation to kill us. Remember that in quantum mechanics, processes are probabilistic. If an average proton lives a billion billion years, and if one has a billion billion protons, then on average one will decay each year. There are a lot more than a billion billion protons in our bodies.

    However, with the incredibly small proposed distance scale and therefore the incredibly large mass scale associated with spontaneous symmetry breaking in Grand Unification, the new gauge bosons would get large masses. That would make the interactions they mediate be so short-range that they would be unbelievably weak on the scale of protons and neutrons today. As a result, while protons could decay, they might live, in this scenario, perhaps a million billion billion billion years before decaying. Still time to hold onto your growth stocks.

    With the results of Glashow and Georgi, and Georgi, Quinn, and Weinberg, the smell of grand synthesis was in the air. After the success of the electroweak theory, particle physicists were feeling ambitious and ready for further unification.

    How would one know if these ideas were correct, however? There was no way to build an accelerator to probe an energy scale a million billion times greater than the rest mass energy of protons. Such a machine would have to have a circumference comparable to that of the moon’s orbit. Even if it were possible, considering the earlier debacle over the Superconducting Super Collider, no government would ever foot the bill.


    Superconducting Super Collider map, in the vicinity of Waxahachie, Texas.

    Happily, there was another way, using the kind of probability arguments I just presented that give limits to the proton lifetime. If the new Grand Unified Theory predicted a proton lifetime of, say, a thousand billion billion billion years, then if one could put a thousand billion billion billion protons in a single detector, on average one of them would decay each year.

    Where could one find so many protons? Simple: in about 3,000 tons of water.
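
    The proton count is easy to verify (illustrative arithmetic; the 3,000-ton figure and the assumed lifetime come from the text above):

        # Counting protons in 3,000 metric tons of water.
        avogadro = 6.022e23
        water_molar_mass_g = 18.0
        protons_per_molecule = 10      # 2 hydrogen nuclei plus 8 protons in oxygen

        mass_g = 3000 * 1e6            # 3,000 metric tons, in grams
        n_protons = mass_g / water_molar_mass_g * avogadro * protons_per_molecule
        print(f"Protons: {n_protons:.1e}")    # ~1.0e33, a thousand billion billion billion

        lifetime_years = 1e33          # the proton lifetime assumed in the text
        print(f"Expected decays per year: {n_protons / lifetime_years:.1f}")  # ~1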

    So all that was required was to get a tank of water, put it in the dark, make sure there were no radioactivity backgrounds, surround it with sensitive phototubes that can detect flashes of light in the detector, and then wait for a year to see a burst of light when a proton decayed. As daunting as this may seem, at least two large experiments were commissioned and built to do just this, one deep underground next to Lake Erie in a salt mine, and one in a mine near Kamioka, Japan. The mines were necessary to screen out incoming cosmic rays that would otherwise produce a background that would swamp any proton decay signal.

    Both experiments began taking data around 1982–83. Grand Unification seemed so compelling that the physics community was confident a signal would soon appear and Grand Unification would mean the culmination of a decade of amazing change and discovery in particle physics—not to mention another Nobel Prize for Glashow and maybe some others.

    Unfortunately, nature was not so kind in this instance. No signals were seen in the first year, the second, or the third. The simplest elegant model proposed by Glashow and Georgi was soon ruled out. But once the Grand Unification bug had caught on, it was not easy to let it go. Other proposals were made for unified theories that might cause proton decay to be suppressed beyond the limits of the ongoing experiments.

    On Feb. 23, 1987, however, another event occurred that demonstrates a maxim I have found is almost universal: Every time we open a new window on the universe, we are surprised. On that day a group of astronomers observed, in photographic plates obtained during the night, the closest exploding star (a supernova) seen in almost 400 years.

    NASA is celebrating the 30th anniversary of SN 1987A by releasing new data.

    The star, about 160,000 light-years away, was in the Large Magellanic Cloud—a small satellite galaxy of the Milky Way observable in the southern hemisphere.


    Large Magellanic Cloud. Adrian Pingstone December 2003

    If our ideas about exploding stars are correct, most of the energy released should be in the form of neutrinos, even though the visible light released is so great that supernovas are the brightest cosmic fireworks in the sky when they explode (at a rate of about one explosion per 100 years per galaxy). Rough estimates then suggested that the huge IMB (Irvine-Michigan-Brookhaven) and Kamiokande water detectors should see about 20 neutrino events.

    Irvine-Michigan-Brookhaven detector


    Super Kamiokande detector

    When the IMB and Kamiokande experimentalists went back and reviewed their data for that day, lo and behold IMB displayed eight candidate events in a 10-second interval, and Kamiokande displayed 11 such events. In the world of neutrino physics, this was a flood of data. The field of neutrino astrophysics had suddenly reached maturity. These 19 events produced perhaps 1,900 papers by physicists, such as me, who realized that they provided an unprecedented window into the core of an exploding star, and a laboratory not just for astrophysics but also for the physics of neutrinos themselves.

    Spurred on by the realization that large proton-decay detectors might serve a dual purpose as new astrophysical neutrino detectors, several groups began to build a new generation of such dual-purpose detectors. The largest one in the world was again built in the Kamioka mine and was called Super-Kamiokande, and with good reason. This mammoth 50,000-ton tank of water, surrounded by 11,800 phototubes, was operated in a working mine, yet the experiment was maintained with the purity of a laboratory clean room. This was absolutely necessary because in a detector of this size one had to worry not only about external cosmic rays, but also about internal radioactive contaminants in the water that could swamp any signals being searched for.

    Meanwhile, interest in a related astrophysical neutrino signature also reached a new high during this period. The sun produces neutrinos due to the nuclear reactions in its core that power it, and over 20 years, using a huge underground detector, physicist Ray Davis had detected solar neutrinos, but had consistently found an event rate about a factor of three below what was predicted using the best models of the sun. A new type of solar neutrino detector was built inside a deep mine in Sudbury, Canada, which became known as the Sudbury Neutrino Observatory (SNO).


    SNOLAB, Sudbury, Ontario, Canada.

    Super-Kamiokande has now been operating almost continuously, through various upgrades, for more than 20 years. No proton-decay signals have been seen, and no new supernovas observed. However, the precision observations of neutrinos at this huge detector, combined with complementary observations at SNO, definitively established that the solar neutrino deficit observed by Ray Davis is real, and moreover that it is not due to astrophysical effects in the sun but rather due to the properties of neutrinos. The implication was that at least one of the three known types of neutrinos is not massless. Since the Standard Model does not accommodate neutrino masses, this was the first definitive observation that some new physics, beyond the Standard Model and beyond the Higgs, must be operating in nature.

    Soon after this, observations of higher-energy neutrinos, which are produced when high-energy cosmic-ray protons hit the atmosphere and create a downward shower of particles, demonstrated that yet a second neutrino has mass. This mass is somewhat larger, but still far smaller than the mass of the electron. For these results, team leaders at SNO and Super-Kamiokande were awarded the 2015 Nobel Prize in Physics—a week before I wrote the first draft of these words. To date these tantalizing hints of new physics are not explained by current theories.

    The absence of proton decay, while disappointing, turned out to be not totally unexpected. Since Grand Unification was first proposed, the physics landscape had shifted slightly. More precise measurements of the actual strengths of the three non-gravitational interactions—combined with more sophisticated calculations of the change in the strength of these interactions with distance—demonstrated that if the particles of the Standard Model are the only ones existing in nature, the strength of the three forces will not unify at a single scale. In order for Grand Unification to take place, some new physics at energy scales beyond those that have been observed thus far must exist. The presence of new particles would not only change the energy scale at which the three known interactions might unify, it would also tend to drive up the Grand Unification scale and thus suppress the rate of proton decay—leading to predicted lifetimes in excess of a million billion billion billion years.

    As these developments were taking place, theorists were driven by new mathematical tools to explore a possible new type of symmetry in nature, which became known as supersymmetry.


    Standard model of Supersymmetry DESY

    This fundamental symmetry is different from any previous known symmetry, in that it connects the two different types of particles in nature, fermions (particles with half-integer spins) and bosons (particles with integer spins). The upshot of this is that if this symmetry exists in nature, then for every known particle in the Standard Model at least one corresponding new elementary particle must exist. For every known boson there must exist a new fermion. For every known fermion there must exist a new boson.

    Since we haven’t seen these particles, this symmetry cannot be manifest in the world at the level we experience it, and it must be broken, meaning the new particles will all get masses that could be heavy enough so that they haven’t been seen in any accelerator constructed thus far.

    What could be so attractive about a symmetry that suddenly doubles all the particles in nature without any evidence of any of the new particles? In large part the seduction lay in the very fact of Grand Unification. Because if a Grand Unified theory exists at a mass scale of 15 to 16 orders of magnitude higher energy than the rest mass of the proton, this is also about 13 orders of magnitude higher than the scale of electroweak symmetry breaking. The big question is why and how such a huge difference in scales can exist for the fundamental laws of nature. In particular, if the Standard Model Higgs is the true last remnant of the Standard Model, then the question arises, Why is the energy scale of Higgs symmetry breaking 13 orders of magnitude smaller than the scale of symmetry breaking associated with whatever new field must be introduced to break the GUT symmetry into its separate component forces?

    ____________________________________________________________________________
    Following three years of LHC runs, there are no signs of supersymmetry whatsoever.
    ____________________________________________________________________________

    The problem is a little more severe than it appears. When one considers the effects of virtual particles (which appear and disappear on timescales so short that their existence can only be probed indirectly), including particles of arbitrarily large mass, such as the gauge particles of a presumed Grand Unified Theory, these tend to drive up the mass and symmetry-breaking scale of the Higgs so that it essentially becomes close to, or identical to, the heavy GUT scale. This generates a problem that has become known as the naturalness problem. It is technically unnatural to have a huge hierarchy between the scale at which the electroweak symmetry is broken by the Higgs particle and the scale at which the GUT symmetry is broken by whatever new heavy field scalar breaks that symmetry.

    The mathematical physicist Edward Witten argued in an influential paper in 1981 that supersymmetry had a special property. It could tame the effect that virtual particles of arbitrarily high mass and energy have on the properties of the world at the scales we can currently probe. Because virtual fermions and virtual bosons of the same mass produce quantum corrections that are identical except for a sign, if every boson is accompanied by a fermion of equal mass, then the quantum effects of the virtual particles will cancel out. This means that the effects of virtual particles of arbitrarily high mass and energy on the physical properties of the universe on scales we can measure would now be completely removed.

    If, however, supersymmetry is itself broken (as it must be, or all the supersymmetric partners of ordinary matter would have the same mass as the observed particles and we would have observed them), then the quantum corrections will not quite cancel out. Instead they would yield contributions to masses that are of the same order as the supersymmetry-breaking scale. If that scale were comparable to the scale of electroweak symmetry breaking, it would explain why the Higgs mass scale is what it is.
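
    Schematically, in a standard textbook rendering of this argument (not Krauss’s own equations), exact supersymmetry cancels the dangerous quadratic correction to the Higgs mass,

    \[
    \delta m_H^2 \;\sim\; \frac{\Lambda^2}{16\pi^2}\left(\lambda_B - |\lambda_F|^2\right) = 0
    \quad \text{when} \quad \lambda_B = |\lambda_F|^2 ,
    \]

    while broken supersymmetry leaves a remainder set by the boson–fermion mass splitting,

    \[
    \delta m_H^2 \;\sim\; \frac{|\lambda|^2}{16\pi^2}\left(m_B^2 - m_F^2\right)\ln\frac{\Lambda}{m_B} .
    \]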

    And it also means we should expect to begin to observe a lot of new particles—the supersymmetric partners of ordinary matter—at the scale currently being probed at the LHC.

    This would solve the naturalness problem because it would protect the Higgs boson masses from possible quantum corrections that could drive them up to be as large as the energy scale associated with Grand Unification. Supersymmetry could allow a “natural” large hierarchy in energy (and mass) separating the electroweak scale from the Grand Unified scale.

    That supersymmetry could in principle solve the hierarchy problem, as it has become known, greatly increased its stock with physicists. It caused theorists to begin to explore realistic models that incorporated supersymmetry breaking and to explore the other physical consequences of this idea. When they did so, the stock price of supersymmetry went through the roof. For if one included the possibility of spontaneously broken supersymmetry into calculations of how the three non-gravitational forces change with distance, then suddenly the strength of the three forces would naturally converge at a single, very small-distance scale. Grand Unification became viable again!

    Models in which supersymmetry is broken have another attractive feature. It was pointed out, well before the top quark was discovered, that if the top quark was heavy, then through its interactions with other supersymmetric partners, it could produce quantum corrections to the Higgs particle properties that would cause the Higgs field to form a coherent background field throughout space at its currently measured energy scale if Grand Unification occurred at a much higher, superheavy scale. In short, the energy scale of electroweak symmetry breaking could be generated naturally within a theory in which Grand Unification occurs at a much higher energy scale. When the top quark was discovered and indeed was heavy, this added to the attractiveness of the possibility that supersymmetry breaking might be responsible for the observed energy scale of the weak interaction.

    _____________________________________________________________________
    In order for Grand Unification to take place, some new physics at energy scales beyond those that have been observed thus far must exist.
    _____________________________________________________________________

    All of this comes at a cost, however. For the theory to work, there must be two Higgs bosons, not just one. Moreover, one would expect to begin to see the new supersymmetric particles if one built an accelerator such as the LHC, which could probe for new physics near the electroweak scale. Finally, in what looked for a while like a rather damning constraint, the lightest Higgs in the theory could not be too heavy or the mechanism wouldn’t work.

    As searches for the Higgs continued without yielding any results, accelerators began to push closer and closer to the theoretical upper limit on the mass of the lightest Higgs boson in supersymmetric theories. The value was something like 135 times the mass of the proton, with details to some extent depending on the model. If the Higgs could have been ruled out up to that scale, it would have suggested all the hype about supersymmetry was just that.

    Well, things turned out differently. The Higgs that was observed at the LHC has a mass about 125 times the mass of the proton. Perhaps a grand synthesis was within reach.

    The answer at present is … not so clear. The signatures of new supersymmetric partners of ordinary particles should be so striking at the LHC, if they exist, that many of us thought that the LHC had a much greater chance of discovering supersymmetry than it did of discovering the Higgs. It didn’t turn out that way. Following three years of LHC runs, there are no signs of supersymmetry whatsoever. The situation is already beginning to look uncomfortable. The lower limits that can now be placed on the masses of supersymmetric partners of ordinary matter are getting higher. If they get too high, then the supersymmetry-breaking scale would no longer be close to the electroweak scale, and many of the attractive features of supersymmetry breaking for resolving the hierarchy problem would go away.

    But the situation is not yet hopeless, and the LHC has been turned on again, this time at higher energy. It could be that supersymmetric particles will soon be discovered.

    If they are, this will have another important consequence. One of the bigger mysteries in cosmology is the nature of the dark matter that appears to dominate the mass of all galaxies we can see.


    Dark matter cosmic web and the large-scale structure it forms. The Millennium Simulation, V. Springel et al.

    There is so much of it that it cannot be made of the same particles as normal matter. If it were, for example, the predictions of the abundance of light elements such as helium produced in the big bang would no longer agree with observation. Thus physicists are reasonably certain that the dark matter is made of a new type of elementary particle. But what type?

    Well, the lightest supersymmetric partner of ordinary matter is, in most models, absolutely stable and has many of the properties of neutrinos. It would be weakly interacting and electrically neutral, so that it wouldn’t absorb or emit light. Moreover, calculations that I and others performed more than 30 years ago showed that the remnant abundance today of the lightest supersymmetric particle left over after the big bang would naturally be in the range so that it could be the dark matter dominating the mass of galaxies.
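
    The standard rough relation behind such relic-abundance calculations (a textbook form, not the original papers’ detailed result) ties the leftover density to the annihilation cross section:

    \[
    \Omega_\chi h^2 \;\approx\; \frac{3 \times 10^{-27}\ \mathrm{cm^3\,s^{-1}}}{\langle \sigma v \rangle} ,
    \]

    so a weak-scale cross section of order \(3 \times 10^{-26}\ \mathrm{cm^3\,s^{-1}}\) naturally lands near the observed \(\Omega_\chi h^2 \sim 0.1\), the coincidence that made a supersymmetric dark matter candidate so attractive.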

    In that case our galaxy would have a halo of dark matter particles whizzing throughout it, including through the room in which you are reading this. As a number of us also realized some time ago, this means that if one designs sensitive detectors and puts them underground, not unlike, at least in spirit, the neutrino detectors that already exist underground, one might directly detect these dark matter particles. Around the world a half dozen beautiful experiments are now going on to do just that. So far nothing has been seen, however.

    So, we are in potentially the best of times or the worst of times. A race is going on between the detectors at the LHC and the underground direct dark matter detectors to see who might discover the nature of dark matter first. If either group reports a detection, it will herald the opening up of a whole new world of discovery, leading potentially to an understanding of Grand Unification itself. And if no discovery is made in the coming years, we might rule out the notion of a simple supersymmetric origin of dark matter—and in turn rule out the whole notion of supersymmetry as a solution of the hierarchy problem. In that case we would have to go back to the drawing board, except if we don’t see any new signals at the LHC, we will have little guidance about which direction to head in order to derive a model of nature that might actually be correct.

    Things got more interesting when the LHC reported a tantalizing possible signal due to a new particle about six times heavier than the Higgs particle. This particle did not have the characteristics one would expect for any supersymmetric partner of ordinary matter. In general the most exciting spurious hints of signals go away when more data are amassed, and about six months after this signal first appeared, after more data were amassed, it disappeared. If it had not, it could have changed everything about the way we think about Grand Unified Theories and electroweak symmetry, suggesting instead a new fundamental force and a new set of particles that feel this force. But while it generated many hopeful theoretical papers, nature seems to have chosen otherwise.

    The absence of clear experimental direction or confirmation of supersymmetry has thus far not bothered one group of theoretical physicists. The beautiful mathematical aspects of supersymmetry encouraged, in 1984, the resurrection of an idea that had been dormant since the 1960s, when Yoichiro Nambu and others tried to understand the strong force as if it were a theory of quarks connected by string-like excitations. When supersymmetry was incorporated in a quantum theory of strings, to create what became known as superstring theory, some amazingly beautiful mathematical results began to emerge, including the possibility of unifying not just the three non-gravitational forces, but all four known forces in nature into a single consistent quantum field theory.

    However, the theory requires a host of new spacetime dimensions to exist, none of which has been, as yet, observed. Also, the theory makes no other predictions that are yet testable with currently conceived experiments. And the theory has recently gotten a lot more complicated so that it now seems that strings themselves are probably not even the central dynamical variables in the theory.

    None of this dampened the enthusiasm of a hard core of dedicated and highly talented physicists who have continued to work on superstring theory, now called M-theory, over the 30 years since its heyday in the mid-1980s. Great successes are periodically claimed, but so far M-theory lacks the key element that makes the Standard Model such a triumph of the scientific enterprise: the ability to make contact with the world we can measure, resolve otherwise inexplicable puzzles, and provide fundamental explanations of how our world has arisen as it has. This doesn’t mean M-theory isn’t right, but at this point it is mostly speculation, although well-meaning and well-motivated speculation.

    It is worth remembering that if the lessons of history are any guide, most forefront physical ideas are wrong. If they weren’t, anyone could do theoretical physics. It took several centuries or, if one counts back to the science of the Greeks, several millennia of hits and misses to come up with the Standard Model.

    So this is where we are. Are great new experimental insights just around the corner that may validate, or invalidate, some of the grander speculations of theoretical physicists? Or are we on the verge of a desert where nature will give us no hint of what direction to search in to probe deeper into the underlying nature of the cosmos? We’ll find out, and we will have to live with the new reality either way.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

     
  • richardmitnick 10:47 am on March 16, 2017 Permalink | Reply
    Tags: Physics

    From CERN via Accelerating News: “CESSAMag delivering impact” 

    CERN

    Accelerating News

    3.16.17
    Livia Lapadatescu (CERN)

    Section of the SESAME Main Accelerator Ring (Image credit: CERN)

    The main objective of the FP7-CESSAMag (CERN-EC Support for SESAME Magnets) project was to support the construction of the SESAME light source in the Middle East. With a financial contribution from the EC, CERN undertook to deliver the magnetic system and its powering scheme for the SESAME main accelerator ring, as well as to support the training of SESAME staff. Completed at the end of 2016, the project fulfilled or exceeded all its objectives.

    Scientific and technical impact of CESSAMag

    Building upon SESAME studies, CESSAMag finalized the requirements and design and produced the engineering and technical drawings of the SESAME magnets and powering scheme. The first main result of CESSAMag is a set of design reports on the combined-function bending magnets, the quadrupole magnets (long and short), the sextupole magnets with their auxiliary corrector windings, and the powering scheme. These design and engineering reports served as the basis for the technical specifications needed for tendering and can serve as a reference for the construction of similar light sources.

    During the tendering process, CERN made a special effort to place orders not only with experienced European companies, but also with companies based in some of the SESAME Members (Cyprus, Israel, Pakistan, Turkey) that, with the exception of Israel, had no prior experience with accelerator components but demonstrated potential and motivation. This was achieved through effective knowledge transfer from CERN and generated potential commercial impact for the companies trained.

    All magnets successfully passed the acceptance tests at either ALBA-CELLS or CERN, and their measured field quality and magnet-to-magnet reproducibility are excellent, making them a reference for similar synchrotrons. A key result of CESSAMag is therefore the string of magnets forming the SESAME storage ring, composed of the following (tallied in the short sketch after this list):

    16 combined function bending magnets (dipole + quadrupole)

    64 quadrupoles of two types: 32 long focusing and 32 short defocusing quadrupoles

    64 sextupole/correctors
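
    As a quick cross-check of the inventory above, a minimal sketch (illustrative Python, not project code; the dictionary layout is invented):

    # Tally of the SESAME storage-ring magnet string, per the counts above.
    magnet_string = {
        "combined-function bending (dipole + quadrupole)": 16,
        "long focusing quadrupole": 32,
        "short defocusing quadrupole": 32,
        "sextupole with corrector windings": 64,
    }
    total = sum(magnet_string.values())
    print(f"Total magnets in the string: {total}")  # -> 144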

    CESSAMag also contributed an improved magnet powering scheme: rather than procuring power supplies adapted to each kind of magnet, CERN proposed an approach based on light-source standards (PSI) that allows individual powering of the quadrupoles and, by standardizing interfaces, simplifies maintenance through plug-and-play modules. With this strategy, SESAME benefits from a powering scheme that is more capable, flexible and robust than initially foreseen.
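
    To illustrate the powering idea, a hypothetical sketch (Python; the module and circuit names are invented, and the real PSI-style converter interfaces are certainly richer than this):

    # One supply module per quadrupole, all behind a single standardized
    # interface, so a failed module can be swapped plug-and-play.
    class SupplyModule:
        def __init__(self, circuit: str):
            self.circuit = circuit
            self.current_amps = 0.0

        def set_current(self, amps: float) -> None:
            self.current_amps = amps

    # Individual powering: 64 quadrupoles, 64 independent modules.
    supplies = {f"QUAD-{i:02d}": SupplyModule(f"QUAD-{i:02d}") for i in range(1, 65)}
    supplies["QUAD-07"].set_current(120.0)  # tune a single magnet in isolation

    # Maintenance becomes a module swap rather than a bespoke repair.
    supplies["QUAD-07"] = SupplyModule("QUAD-07")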

    Following the decision to procure some components from companies based in the SESAME Members, and thanks to the in-kind contribution of Pakistan, which offered to assemble 50% of the sextupoles, CESSAMag managed to deliver a more capable and complete magnetic system while reducing the financial share that SESAME was due to contribute to the project.

    Finally, CESSAMag contributed to the magnet integration and commissioning, with the goal of putting SESAME fully in control of the equipment delivered by CERN.

    The first beam was circulated in the SESAME main accelerator ring on 11 January 2017, and by mid-February it had been stored and accumulated up to 20 mA. The next steps are ramping the beam and completing the RF stations, with the final acceleration assessment expected before the end of summer. The inauguration ceremony of the SESAME light source will take place in mid-May, with high-ranking officials from SESAME Members and Observers expected to attend. The first user experiments are foreseen to start in Q3.

    Political and social impact of CESSAMag

    A significant aspect showcasing the socio-economic impact of CESSAMag is the knowledge transfer to companies from SESAME Members and the training of SESAME staff. The training provided to staff, engineers and companies from SESAME Members amounts to about 90 person-months, and the CERN personnel effort in training and knowledge transfer amounts to 16 person-months.

    In the context of CESSAMag, international collaborations and agreements were established between CERN and SESAME and between CERN and ALBA-CELLS; implementation agreements were formed with PAEC (Pakistan), TAEK (Turkey) and ILSF (Iran), along with an informal collaboration with the IAEA, which provided financial support for training and experts’ visits between CERN and SESAME. These collaborations and agreements illustrate the international and science-diplomacy dimensions of the project.

    Furthermore, the European Union acknowledged the science-diplomacy impact of CESSAMag and has taken further steps in support of SESAME. Since 2015 the EU has been an Observer in the SESAME Council, and the EC decided to further support the training of SESAME users and staff within the OPEN SESAME (Opening Synchrotron Light for Experimental Science and Applications in the Middle East) H2020 “Policy and international cooperation measures for research infrastructures” project.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    ALICE

    CMS

    LHCb

    LHC

     
  • richardmitnick 9:27 am on March 14, 2017 Permalink | Reply
    Tags: Last but not least the poly shield, Physics

    From SURF: “Last, but not least, the poly shield” 

    March 13, 2017
    Constance Walter

    Vince Guiseppe stands next to an extra lead brick monolith, which keeps the shield sealed if a working module needs to be removed for service. Credit: Constance Walter

    For nearly seven years, the Majorana Demonstrator Project’s “shield team” has been building the six-layered shield that surrounds the experiment on the 4850 Level. In early March, they placed the last piece of polyethylene on the outermost layer of the shield.

    “I’m proud of what the team has produced,” said Vince Guiseppe, assistant professor of physics at the University of South Carolina. “This was a complicated project. Every layer was added at the right time and fit perfectly.”


    U Washington Majorana Demonstrator Experiment

    The Majorana collaboration uses germanium crystals to look for a rare form of radioactive decay called neutrinoless double-beta decay. Detecting it would show that the neutrino is its own antiparticle and could help explain why matter exists. The shield is critical to the success of the experiment.

    Each layer of the shield was designed to target certain forms of radiation. “The closer the layer is to the experiment, the greater its impact,” Guiseppe said.

    The most important layer is the electroformed copper that sits closest to the experiment. Composed of 40 half-inch-thick copper plates, it was grown and machined underground. “This is clearly the hallmark of our shield system in terms of purity and cleanliness protocols,” Guiseppe said. Surrounding that portion of the shield is a 2-inch-thick layer of ultrapure commercial copper.

    Next is a “castle” built with 3,400 lead bricks. Two portable monoliths, each holding 570 bricks, support the cryostats filled with strings of germanium detectors and cryogenic hardware, what Guiseppe calls “the heart of the experiment.”

    An aluminum box encapsulating the lead castle protects the experiment from naturally occurring radon. Every minute, the team injects eight liters of nitrogen gas to purge the air within the enclosure. “We don’t want any lab air getting in.”
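
    For a sense of scale, simple arithmetic on the purge rate quoted above (a sketch; only the eight liters per minute comes from the article):

    # Nitrogen purge: 8 liters per minute, as stated above.
    purge_l_per_min = 8
    liters_per_day = purge_l_per_min * 60 * 24
    print(f"{liters_per_day:,} liters of nitrogen per day")  # 11,520

    # If the enclosure held on the order of 1,000 liters (an illustrative
    # guess, not a figure from the article), that would be roughly ten
    # full volume exchanges per day.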

    Attached to the aluminum box are scintillating plastic “veto panels” designed to detect muons, the most penetrating of all cosmic rays.

    Finally, there’s the 12 inches of polyethylene enclosing the entire experiment, including the cryogenics (chilled water heat exchangers moderate the temperature). The poly slows down neutrons that could cause very rare backgrounds. Why worry about such rare events? High-energy neutrons can bounce through just about anything, including the 22 inches of lead and copper shielding. If a neutron hits a copper atom, it could create a gamma ray right next to the experiment.

    “The poly is the final defense against backgrounds in an experiment that requires extreme quiet,” Guiseppe said.
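
    Putting the layers described above in order, innermost first (a summary sketch; thicknesses are included only where the article quotes them):

    # The six layers of the Majorana Demonstrator shield, inside out.
    shield_layers = [
        "electroformed copper (40 half-inch plates, grown underground)",
        "ultrapure commercial copper (2 inches)",
        "lead brick castle (3,400 bricks)",
        "aluminum radon-exclusion box (nitrogen-purged)",
        "plastic scintillator veto panels (muon tagging)",
        "polyethylene (12 inches, neutron moderation)",
    ]
    for depth, layer in enumerate(shield_layers, start=1):
        print(f"Layer {depth}: {layer}")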

    The entire shield, weighing 145,000 pounds, rests on a steel overfloor with channels for the poly.

    Jared Thompson, a research assistant, began his work with Majorana in 2010, etching lead bricks for the shield. In fact, in March 2014, he placed the last brick on the castle. And he was part of the group that recently placed the last piece of poly.

    “It’s really exciting,” Thompson said. “A complete shield could mean a whole new data set down the road.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    About us.
    The Sanford Underground Research Facility in Lead, South Dakota, advances our understanding of the universe by providing laboratory space deep underground, where sensitive physics experiments can be shielded from cosmic radiation. Researchers at the Sanford Lab explore some of the most challenging questions facing 21st century physics, such as the origin of matter, the nature of dark matter and the properties of neutrinos. The facility also hosts experiments in other disciplines—including geology, biology and engineering.

    The Sanford Lab is located at the former Homestake gold mine, which was a physics landmark long before being converted into a dedicated science facility. Nuclear chemist Ray Davis earned a share of the Nobel Prize for Physics in 2002 for a solar neutrino experiment he installed 4,850 feet underground in the mine.

    Homestake closed in 2003, but the company donated the property to South Dakota in 2006 for use as an underground laboratory. That same year, philanthropist T. Denny Sanford donated $70 million to the project. The South Dakota Legislature also created the South Dakota Science and Technology Authority to operate the lab. The state Legislature has committed more than $40 million in state funds to the project, and South Dakota also obtained a $10 million Community Development Block Grant to help rehabilitate the facility.

    In 2007, after the National Science Foundation named Homestake as the preferred site for a proposed national Deep Underground Science and Engineering Laboratory (DUSEL), the South Dakota Science and Technology Authority (SDSTA) began reopening the former gold mine.

    In December 2010, the National Science Board decided not to fund further design of DUSEL. However, in 2011 the Department of Energy, through the Lawrence Berkeley National Laboratory, agreed to support ongoing science operations at Sanford Lab, while investigating how to use the underground research facility for other longer-term experiments. The SDSTA, which owns Sanford Lab, continues to operate the facility under that agreement with Berkeley Lab.

    The first two major physics experiments at the Sanford Lab are 4,850 feet underground in an area called the Davis Campus, named for the late Ray Davis. The Large Underground Xenon (LUX) experiment is housed in the same cavern excavated for Ray Davis’s experiment in the 1960s.
    LUX dark matter experiment at SURF

    In October 2013, after an initial run of 80 days, LUX was determined to be the most sensitive detector yet to search for dark matter—a mysterious, yet-to-be-detected substance thought to be the most prevalent matter in the universe. The Majorana Demonstrator experiment, also on the 4850 Level, is searching for a rare phenomenon called “neutrinoless double-beta decay” that could reveal whether subatomic particles called neutrinos can be their own antiparticle. Detection of neutrinoless double-beta decay could help determine why matter prevailed over antimatter. The Majorana Demonstrator experiment is adjacent to the original Davis cavern.

    Another major experiment, the Long Baseline Neutrino Experiment (LBNE), a collaboration between Fermi National Accelerator Laboratory (Fermilab) and Sanford Lab, is in the preliminary design stages. The project got a major boost last year when Congress approved and the president signed an Omnibus Appropriations bill that will fund LBNE operations through FY 2014. Called the “next frontier of particle physics,” LBNE will follow neutrinos as they travel 800 miles through the earth, from Fermilab in Batavia, Ill., to Sanford Lab.

    Fermilab LBNE

     
  • richardmitnick 9:10 am on March 13, 2017 Permalink | Reply
    Tags: Physics, Stanford engineers use soup additive to create a stretchable plastic electrode

    From Stanford: “So long stiffness: Stanford engineers use soup additive to create a stretchable plastic electrode” 

    Stanford University

    March 10, 2017
    Shara Tonn

    Courtesy Bao Research Group
    Access mp4 video here.
    A robotic test instrument stretches a nearly transparent, flexible electrode, based on a special plastic developed in the lab of Stanford chemical engineer Zhenan Bao, over a curved surface.

    Today’s electronics are typically rigid and brittle, a poor match for the soft tissues of the human body. Chemical engineer Zhenan Bao is trying to change that. For more than a decade, her lab has been working to make electronics soft and flexible so that they feel and operate almost like a second skin. Along the way, the team has started to focus on making the brittle, electrically conductive plastics used in such devices more elastic.

    Now in Science Advances, Bao’s team describes how they took one such brittle plastic and modified it chemically to make it as bendable as a rubber band, while slightly enhancing its electrical conductivity. The result is a soft, flexible electrode that is compatible with our supple and sensitive nerves.

    “This flexible electrode opens up many new, exciting possibilities down the road for brain interfaces and other implantable electronics,” said Bao, a professor of chemical engineering. “Here, we have a new material with uncompromised electrical performance and high stretchability.”

    The material is still a laboratory prototype, but the team hopes to develop it as part of their long-term focus on creating flexible materials that interface with the human body.

    A printed electrode pattern of the new polymer being stretched to several times its original length (top), and a transparent, highly stretchy “electronic skin” patch forming an intimate interface with human skin to potentially measure various biomarkers (bottom). (Image credit: Bao Lab)

    Flexible interface

    Electrodes are fundamental to electronics. Conducting electricity, these wires carry back and forth signals that allow different components in a device to work together. In our brains, special thread-like fibers called axons play a similar role, transmitting electric impulses between neurons. Bao’s stretchable plastic is designed to make a more seamless connection between the stiff world of electronics and the flexible organic electrodes in our bodies.

    “One thing about the human brain that a lot of people don’t know is that it changes volume throughout the day,” says postdoctoral research fellow Yue Wang, the first author on the paper. “It swells and deswells.” The current generation of electronic implants can’t stretch and contract with the brain, which makes it difficult to maintain a good connection.

    “If we have an electrode with a similar softness as the brain, it will form a better interface,” said Wang.

    To create this flexible electrode, the researchers began with a plastic that had two essential qualities: high conductivity and biocompatibility, meaning that it could be safely brought into contact with the human body. But this plastic had a shortcoming: It was very brittle. Stretching it even 5 percent would break it.

    Tightly wound and brittle

    As Bao and her team sought to preserve conductivity while adding flexibility, they worked with scientists at the SLAC National Accelerator Laboratory to use a special type of X-ray to study this material at the molecular level. All plastics are polymers; that is, chains of molecules strung together like beads. The plastic in this experiment was actually made up of two different polymers that were tightly wound together. One was the electrical conductor. The other was essential to the process of making the plastic. When the two combined, they crystallized into a structure like a string of brittle, sphere-like beads. It was conductive, but not flexible.

    The researchers hypothesized that if they could find the right molecular additive to separate these two tightly wound polymers, they could prevent this crystallization and give the plastic more stretch. But they had to be careful – adding material to a conductor usually weakens its ability to transmit electrical signals.

    After testing more than 20 different molecular additives, they finally found one that did the trick. It was a molecule similar to the sort of additives used to thicken soups in industrial kitchens. This additive transformed the plastic’s chunky and brittle molecular structure into a fishnet pattern with holes in the strands that allow the material to stretch and deform. When they tested the new material’s elasticity, they were delighted to find that it became slightly more conductive when stretched to twice its original length. The plastic remained very conductive even when stretched to 800 percent of its original length.
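
    To make the stretch figures concrete, a small sketch (assuming, as the phrasing suggests, that “X percent of its original length” means a final length of X percent of the original):

    # Convert quoted stretch percentages into stretch ratios (final/original).
    def stretch_ratio(percent_of_original: float) -> float:
        return percent_of_original / 100.0

    print(stretch_ratio(200))  # doubled length -> ratio 2.0
    print(stretch_ratio(800))  # "800 percent" -> ratio 8.0, i.e. 700% strain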

    “We thought that if we add insulating material, we would get really poor conductivity, especially when we added so much,” said Bao. But thanks to their precise understanding of how to tune the molecular assembly, the researchers got the best of both worlds: the highest possible conductivity for the plastic while at the same time transforming it into a very robust and stretchy substance.

    “By understanding the interaction at the molecular level, we can develop electronics that are soft and stretchy like skin, while remaining conductive,” Wang says.

    Other authors include postdoctoral fellows Chenxin Zhu, Francisco Molina-Lopez, Franziska Lissel and Jia Liu; graduate students Shucheng Chen and Noelle I. Rabiah; Hongping Yan and Michael F. Toney, staff scientists at SLAC National Accelerator Laboratory; Christian Linder, an assistant professor of civil and environmental engineering who is also a member of Stanford Bio-X and of the Stanford Neurosciences Institute; Boris Murmann, a professor of electrical engineering and a member of the Stanford Neurosciences Institute; Lihua Jin, now an assistant professor of mechanical and aerospace engineering at the University of California, Los Angeles; Zheng Chen, now an assistant professor of nano engineering at the University of California, San Diego; and colleagues from the Materials Science Institute of Barcelona, Spain, and Samsung Advanced Institute of Technology.

    This work was funded by Samsung Electronics and the Air Force Office of Scientific Research.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.


     