Tagged: Dark Matter

  • richardmitnick 12:21 pm on July 16, 2019
    Tags: being replaced by LBNL Lux Zeplin project, Dark Matter, ending, Lead, SD, U Washington LUX Dark matter Experiment at SURF

    From Lawrence Berkeley National Lab: “Some Assembly Required: Scientists Piece Together the Largest U.S.-Based Dark Matter Experiment” 


    From Lawrence Berkeley National Lab

    July 16, 2019

    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 486-5582

    Major deliveries in June set the stage for the next phase of work on the LUX-ZEPLIN project.

    Lower (left) and upper photomultiplier tube arrays are prepared for LZ at the Sanford Underground Research Facility in Lead, South Dakota. (Credit: Matt Kapust/SURF)

    Most of the remaining components needed to fully assemble an underground dark matter-search experiment called LUX-ZEPLIN (LZ) arrived at the project’s South Dakota home during a rush of deliveries in June.

    When complete, LZ will be the largest, most sensitive U.S.-based experiment yet designed to directly detect dark matter particles. Scientists around the world have been trying for decades to solve the mystery of dark matter, which makes up about 85 percent of all matter in the universe, though we have so far detected it only indirectly, through its observed gravitational effects.

    The bulk of the digital components for LZ’s electronics system, which is designed to transmit and record signals from ever-so-slight particle interactions in LZ’s core detector vessel, were among the new arrivals at the Sanford Underground Research Facility (SURF). SURF, the site of a former gold mine now dedicated to a broad spectrum of scientific research, was also home to a predecessor search experiment called LUX.

    U Washington LUX Dark matter Experiment at SURF, Lead, SD, USA

    A final set of snugly fitting acrylic vessels, which will be filled with a special liquid designed to identify false dark matter signals in LZ’s inner detector, also arrived at SURF in June.

    An intricate, thin wire grid is visible atop an array of photomultiplier tubes. The components are part of the LZ inner detector. (Credit: Matt Kapust/SURF)

    The last two of four intricately woven wire grids, which are essential for maintaining a constant electric field and extracting signals from the experiment’s inner detector (also called the time projection chamber), also arrived in June (see related article).

    “LZ achieved major milestones in June. It was the busiest single month for delivering things to SURF — it was the peak,” said LZ Project Director Murdock Gilchriese of the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab). Berkeley Lab is the lead institution for the LZ project, which is supported by an international collaboration of about 37 participating institutions and about 250 researchers and technical support crew members.

    “A few months from now all of the action on LZ is going to be at SURF — we are already getting close to having everything there,” Gilchriese said.

    Mike Headley, executive director at SURF, said, “We’ve been collectively preparing for these deliveries for some time and everything has gone very well. It’s been exciting to see the experiment assembly work progress and we look forward to lowering the assembled detector a mile underground for installation.”

    Components for the LUX-ZEPLIN project are stored inside a water tank nearly a mile below ground. The inner detector will be installed on the central mount pictured here, and acrylic vessels (wrapped in white) will fit snugly around this inner detector. (Credit: Matt Kapust/SURF)

    All of these components will be transported down a shaft and installed in a nearly mile-deep research cavern. The rock above provides a natural shield against much of the constant bombardment of particles raining down on the planet’s surface that produce unwanted “noise.”

    LZ components have also been painstakingly tested and selected to ensure that the materials they are made of do not themselves interfere with particle signals that researchers are trying to tease out.

    LZ is particularly focused on finding a type of theoretical particle called a weakly interacting massive particle or WIMP by triggering a unique sequence of light and electrical signals in a tank filled with 10 metric tons of highly purified liquid xenon, which is among Earth’s rarest elements. The properties of xenon atoms allow them to produce light in certain particle interactions.

    Proof of dark matter particles would fundamentally change our understanding of the makeup of the universe, as our current Standard Model of Physics does not account for their existence.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    Assembly of the liquid xenon time projection chamber for LZ is now about 80 percent complete, Gilchriese said. When fully assembled later this month this inner detector will contain about 500 photomultiplier tubes. The tubes are designed to amplify and transmit signals produced within the chamber.

    An array of photomultiplier tubes that are designed to detect signals occurring within LZ’s liquid xenon tank. (Credit: Matt Kapust/SURF)

    Once assembled, the time projection chamber will be lowered carefully into a custom titanium vessel already at SURF. Before it is filled with xenon, this chamber will be lowered to a depth of about 4,850 feet. It will be carried in a frame that is specially designed to minimize vibrations, and then floated into the experimental cavern across a temporarily assembled metal runway on air-pumped pucks known as air skates.

    Finally, it will be lowered into a larger outer titanium vessel, already underground, to form the final vacuum-insulated cryostat needed to house the liquid xenon.

    That daylong journey, planned for September, will be a nail-biting experience for the entire project team, noted Berkeley Lab’s Simon Fiorucci, LZ deputy project manager.

    “It will certainly be the most stressful — this is the thing that really cannot fail. Once we’re done with this, a lot of our risk disappears and a lot of our planning becomes easier,” he said, adding, “This will be the biggest milestone that’s left besides having liquid xenon in the detector.”

    Project crews will soon begin testing the xenon circulation system, already installed underground, that will continually circulate xenon through the inner detector, further purify it, and reliquefy it. Fiorucci said researchers will use about 250 pounds of xenon for these early tests.

    Work is also nearing completion on LZ’s cryogenic cooling system that is required to convert xenon gas to its liquid form.

    Researchers from the University of Rochester in June installed six racks of electronics hardware that will be used to process signals from the LZ experiment. (Credit: University of Rochester)

    LZ digital electronics, which will ultimately connect to the arrays of photomultiplier tubes and enable the readout of signals from particle interactions, were designed, developed, delivered, and installed by University of Rochester researchers and technical staff at SURF in June.

    “All of our electronics have been designed specifically for LZ with the goal of maximizing our sensitivity for the smallest possible signals,” said Frank Wolfs, a professor of physics and astronomy at the University of Rochester who is overseeing the university’s efforts.

    He noted that more than 28 miles of coaxial cable will connect the photomultiplier tubes and their amplifying electronics – which are undergoing tests at UC Davis – to the digitizing electronics. “The successful installation of the digital electronics and the online network and computing infrastructure in June makes us eager to see the first signals emerge from LZ,” Wolfs added.

    Also in June, LZ participants exercised high-speed data connections from the site of the experiment to the surface level at SURF and then to Berkeley Lab. Data captured by the detectors’ electronics will ultimately be transferred to LZ’s primary data center, the National Energy Research Scientific Computing Center (NERSC) at Berkeley Lab, via the Energy Sciences Network (ESnet), a high-speed nationwide data network based at Berkeley Lab.

    NERSC systems pictured: the Cray Cori II supercomputer, named after Gerty Cori, the first American woman to win a Nobel Prize in science; the Hopper Cray XE6 supercomputer; the Cray XC30 Edison supercomputer; Genepool, a cluster dedicated to the DOE Joint Genome Institute’s computing needs (Denovo is a smaller test system for Genepool, used primarily by NERSC staff to test new system configurations and software); and PDSF, a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics, and nuclear science collaborations. Future: the Cray Shasta “Perlmutter” AMD Epyc/Nvidia pre-exascale supercomputer (SC18).

    The production of the custom acrylic tanks (see related article), which will contain a fluid known as a liquid scintillator, was overseen by LZ participants at the University of California, Santa Barbara.

    The top three acrylic tanks for the LUX-ZEPLIN outer detector during testing at the fabrication vendor. These tanks are now at the Sanford Underground Research Facility in Lead, South Dakota. (Credit: LZ Collaboration)

    “The partnership between LZ and SURF is tremendous, as evidenced by the success of the assembly work to date,” Headley said. “We’re proud to be a part of the LZ team and host this world-leading experiment in South Dakota.”

    NERSC and ESnet are DOE Office of Science User Facilities.

    Major support for LZ comes from the DOE Office of Science, the South Dakota Science and Technology Authority, the U.K.’s Science & Technology Facilities Council, and from collaboration members in the U.S., U.K., South Korea, and Portugal.

    More:

    For information about LZ and the LZ collaboration, visit: http://lz.lbl.gov/

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    LBNL campus

    Bringing Science Solutions to the World
    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (UC) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a UC Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    A U.S. Department of Energy National Laboratory Operated by the University of California.

    University of California Seal

    DOE Seal

     
  • richardmitnick 7:31 am on July 12, 2019
    Tags: Dark Matter

    From SLAC National Accelerator Lab: “Light dark matter is a thousand times less likely to bump into regular matter than previous astrophysical analyses allowed” 

    From SLAC National Accelerator Lab

    July 11, 2019
    Manuel Gnida

    Simulation of the dark matter structure surrounding the Milky Way. Driven by gravity, dark matter forms dense structures, referred to as halos (bright areas), in which galaxies are born. The number and distribution of halos, and therefore also of galaxies, depends on the properties of dark matter, such as its mass and its likelihood to interact with normal matter. (Ethan Nadler/Risa Wechsler/Ralf Kaehler/SLAC National Accelerator Laboratory/Stanford University)

    A team led by scientists from the Department of Energy’s SLAC National Accelerator Laboratory and Stanford University has narrowed down how strongly dark matter particles might interact with normal matter. Based on the number and distribution of small satellite galaxies seen orbiting our Milky Way, the team found this interaction to be at least a thousand times weaker than the strongest interaction allowed by previous astrophysical analyses.

    “Improving our understanding of these interactions is important because it’s one of the factors that helps us determine what dark matter can and cannot be,” said Risa Wechsler, director of the SLAC/Stanford Kavli Institute for Particle Astrophysics and Cosmology and the study’s senior author. The study can also help researchers refine their models for the evolution of the universe because dark matter and its interactions with gravity play such a fundamental role in how galaxies form, she said.

    Study lead author Ethan Nadler, a graduate student working with Wechsler, said, “Our results exclude dark matter properties in a mass range that has been largely unexplored before, nicely complementing the outcomes of other experiments that set tight limits for heavier dark matter particles.”

    The researchers recently published their results in The Astrophysical Journal Letters.

    The ‘missing satellites’ conundrum

    Most of the structure in today’s universe can be explained with a quite simple dark matter model. It assumes that dark matter is relatively “cold,” meaning it moved very slowly compared to the speed of light, and “collisionless,” meaning it doesn’t interact with itself or regular matter. As the universe expands, gravity causes dark matter to clump together and form dense dark matter halos. Dark matter also pulls in regular matter around it, concentrating regular matter and initiating galaxy formation inside dark matter halos.

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe). Date: 2010. Credit: Alex Mittelmann, Coldcreation


    Simulation of the formation of the dark matter structure surrounding the Milky Way, from the early universe to today. Gravity makes dark matter clump together and form dense structures, referred to as halos (bright areas). The number and distribution of halos depends on the properties of dark matter, such as its mass and its likelihood to interact with normal matter. Galaxies are thought to form inside these halos. In a new study, SLAC and Stanford researchers have used measurements of faint satellite galaxies orbiting the Milky Way to derive limits on how often dark matter particles can possibly collide with regular matter particles. (Ethan Nadler/Risa Wechsler/R. Kaehler/SLAC National Accelerator Laboratory/Stanford University)

    This “cold dark matter” model works well on very large scales, including clusters of galaxies, and describes how typical galaxies are clustered in the universe. But on much smaller scales – for galaxies smaller than our Milky Way, for example – the simple model seemed to cause problems.

    Caterpillar Project: a Milky-Way-size dark-matter halo and its subhalos (circled), from an enormous suite of simulations. Griffen et al. 2016

    It predicts that the Milky Way’s halo is surrounded by thousands of smaller subhalos, so there should also be thousands of smaller satellite galaxies orbiting our galaxy. Yet, by the early 2000s, researchers knew of only about 10 of them.

    “The apparent discrepancy between observations and predictions made people think there is a serious issue with the model, but recently this has become less of a problem,” Nadler said.

    “Increasingly sensitive astrophysical surveys have discovered many more faint satellite galaxies, and we expect next-generation instruments like the Large Synoptic Survey Telescope to find hundreds more if the simplest cold dark matter model is correct.

    LSST Camera, built at SLAC

    LSST telescope, currently under construction on the El Peñón peak of Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region, northern Chile, alongside the existing Gemini South and Southern Astrophysical Research telescopes.

    LSST Data Journey, illustration by Sandbox Studio, Chicago, with Ana Kova

    Thus, if fewer galaxies are observed, this could indicate that the simplest model is not exactly correct,” he said. “At the same time, we don’t expect the smallest halos to host galaxies, so understanding the connection between galaxies and halos is crucial to make conclusions about the nature of dark matter.”

    Limiting what dark matter can be

    One way the dark matter model can be modified is by assuming that dark matter was produced in a “warmer” state in the early universe, meaning it moved faster than in the simple model and was less likely to clump. This would result in a smaller number of dark matter halos and cut down the number of observable satellite galaxies. Because the mass of dark matter controls its velocity when it was produced in the early universe, the abundance of satellites can be used to determine the minimum mass of warm dark matter particles.
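    As a rough guide (this scaling is standard in the warm dark matter literature, not spelled out in the article), the suppression can be captured by a free-streaming length that shrinks as the particle mass grows,

        \lambda_{\rm fs} \propto \frac{1}{m_{\rm WDM}},

    so halos, and hence satellite galaxies, below the corresponding scale fail to form. Counting observed satellites therefore bounds the warm dark matter particle mass from below.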

    Here, the researchers looked at a different property of dark matter in other non-standard models: its interaction with normal matter. They showed that collisions between dark matter particles and regular matter particles like protons and neutrons would also reduce the observable satellite population.

    “If the interaction is very strong, it erases small dark matter halos and suppresses a lot of the small structure,” Wechsler said. “But we can actually see some smaller structures based on the tiny galaxies they host, so the interaction can’t be too strong either.” In other words, the number of observable satellite galaxies provides a path to learning about these fundamental interactions.

    In their study, the team varied the strength of the collision interaction in their dark matter model and ran simulations to predict how that affected the distribution of dark matter halos. Then, they tried to fit known satellite galaxies into the halos.

    “What’s really exciting is that our study nicely bridges experimental observations of faint galaxies today with theories of dark matter and its behavior in the early universe. It connects a lot of pieces, and by doing so it tells us something very profound about dark matter,” Nadler said.

    The researchers found that in order to make everything fit together, dark matter particles with relatively low mass must interact at least a thousand times more weakly with normal matter than the previous limit. Before this work, the leading constraints in this mass range were set by astrophysical studies based on the cosmic microwave background, the earliest light in the universe. Meanwhile, direct detection experiments, which search for signs of dark matter with sensitive underground detectors, set stringent limits on the interaction strength for heavier dark matter particles, making studies of satellite galaxies highly complementary to those experiments.

    “Although we still don’t know what dark matter is made of, our results are a step forward that sets tighter limits on what it actually can be,” Nadler said.

    Other researchers involved in the study were Vera Gluscevic at the University of Southern California and Kimberly Boddy at Johns Hopkins University. Financial support came from the National Science Foundation and the Department of Energy.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    SLAC/LCLS


    SLAC/LCLS II projected view


    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     
  • richardmitnick 5:11 pm on June 20, 2019
    Tags: Dark Matter, LZ experiment at SURF, Search for WIMPS

    From SLAC National Accelerator Lab: “SLAC sends off woven grids for LUX-ZEPLIN dark matter detector” 

    From SLAC National Accelerator Lab

    June 20, 2019
    Manuel Gnida

    Four large meshes made from 2 miles of metal wire will extract potential signals of dark matter particles.

    The ultra-sensitive LUX-ZEPLIN (LZ) detector is scheduled to begin its search for elusive dark matter next year.

    LBNL LZ project at SURF, Lead, SD, USA

    At its core: a large tank filled with 10 metric tons of liquid xenon whose atoms would produce telltale signals when struck by dark matter particles. Inside the tank, four high-voltage grids – fine circular metal meshes, each 5 feet in diameter – are needed to extract these signals.

    Over the past few months, the LZ team at the Department of Energy’s SLAC National Accelerator Laboratory, which is part of the international LZ collaboration of 250 scientists from 37 institutions, has carefully woven the grids from 2 miles of thin stainless steel wire, and yesterday they sent the last one on its way to the Sanford Underground Research Facility (SURF) in South Dakota, where the LZ detector is being assembled.


    Weaving the high-voltage grids of the LUX-ZEPLIN dark matter experiment. (Farrin Abbott/SLAC National Accelerator Laboratory)

    “Completion of the delivery of the grids from SLAC is one of the most critical project milestones,” said LZ Project Director Murdock “Gil” Gilchriese of DOE’s Lawrence Berkeley National Laboratory, which leads the project.

    “Congratulations to the grids team.”

    The quality of the grids is critical to LZ’s performance, and making them was a major challenge, said Tom Shutt, one of the directors of SLAC’s LZ team: “It took us about four years to develop the design and to manufacture and test them. It’s exciting that we’re now integrating them into the detector.” The team’s efforts included inventing a clever way of weaving the grids from metal wires.

    Rare collisions with dark matter

    Scientists have overwhelming evidence that the matter we can see makes up only a small fraction of the universe. About 85 percent of matter is invisible and interacts with everything else almost entirely through gravity. This mysterious substance is called “dark matter,” and researchers believe it’s composed of particles, just like ordinary matter is made of fundamental particles. Yet, dark matter’s building blocks have yet to show up in experiments.

    Scientists have been trying to detect dark matter particles by putting tanks of liquefied noble gases, like xenon, deep underground. Most dark matter particles rush through these tanks unhindered while traveling through our planet as if it were made of air. But from time to time, scientists theorize, a particle might collide with a noble gas atom and produce a signal that reveals dark matter’s presence and nature.

    As the newest generation of this type of “direct detection” dark matter experiment, LZ will be hundreds of times more sensitive to a particular type of dark matter candidate, called weakly interacting massive particles (WIMPs), than its predecessor.


    Dark matter hunt with the LZ experiment. (Greg Stewart/SLAC National Accelerator Laboratory)

    From collisions to flashes of light

    If and when a WIMP particle hits a xenon atom in LZ’s tank, two things will happen: The atom will emit a flash of light that is recorded by nearly 500 light-sensitive detectors, called photomultiplier tubes (PMTs), at the top and bottom of the tank. The atom will also release electrons, and that’s where the high-voltage grids come in.

    Two of the grids – the cathode at the bottom and the gate at the top – will help create an electric field that pushes electrons through the liquid xenon toward the top of the tank. There, they’ll be extracted from the liquid by a field between the gate and anode, which sits just below the top PMT array within a tightly controlled layer of xenon gas. In the gas, the electrons create another flash of light. A characteristic combination of two light flashes signals the arrival of a WIMP.
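    To make the drift-field picture concrete, here is a minimal sketch (illustrative only, not LZ software) of how the delay between the two flashes translates into event depth; the drift velocity used is an assumed round number, not an LZ specification.

        # Illustrative sketch, not LZ software. Reconstruct the depth of an
        # interaction in a xenon time projection chamber from the delay between
        # the prompt scintillation flash (S1) and the delayed electroluminescence
        # flash (S2). The drift velocity is an assumed round number.

        DRIFT_VELOCITY_MM_PER_US = 1.5  # assumed electron drift speed in liquid xenon

        def event_depth_mm(t_s1_us, t_s2_us):
            """Depth below the gate grid, inferred from the S1-to-S2 drift time."""
            drift_time_us = t_s2_us - t_s1_us
            if drift_time_us < 0:
                raise ValueError("S2 must arrive after S1")
            return DRIFT_VELOCITY_MM_PER_US * drift_time_us

        # Example: a 400-microsecond drift corresponds to 600 mm below the gate.
        print(event_depth_mm(t_s1_us=0.0, t_s2_us=400.0))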

    “Establishing the electric field is critical to be able to distinguish between potential dark matter signals and background signals,” said Ryan Linehan, a Stanford University graduate student on SLAC’s LZ team.

    Four high-voltage grids inside LZ’s tank of liquid xenon. The cathode and gate grids create an electric field that pushes electrons through the liquid xenon toward the top of the tank. There, a field between the gate and anode grids extracts the electrons. They enter a thin layer of xenon gas that floats atop the liquid, where they create a flash of light. A fourth grid at the bottom of the xenon tank shields the bottom PMT array from the high fields above. (Greg Stewart/SLAC National Accelerator Laboratory)

    A fourth grid at the bottom of the xenon tank will shield the bottom PMT array from the high electric fields above.

    Weaving a ‘metal fabric’

    To build the grids, LZ engineers and scientists had to solve a number of technical challenges. For instance, the grids can produce the required uniform electric field only if they stay very flat when mounted horizontally inside the xenon tank. They must also be transparent enough so that they don’t stop light from reaching the PMTs. To further complicate things, there are no commercially available grids in the size the LZ team needed, so they had to find a way to build their own.

    The crucial idea came from SLAC mechanical engineer Knut Skarpaas. He designed a loom similar to those used for weaving fabric. But instead of thread, LZ’s loom wove metal wires about the size of a human hair into fine meshes with wires only millimeters apart (see video at the top of this article). And instead of weaving the “fabric” on an ordinary production floor, the loom operated in a clean room to avoid contamination.

    Members of SLAC’s LZ team with the loom they used to weave high-voltage grids for the next-gen dark matter experiment. (Farrin Abbott/SLAC National Accelerator Laboratory)

    Ramping up the voltage

    Once a metal mesh was woven, the LZ team sandwiched it between two metal rings and cut out a circular piece of the right size. Then, they carefully transferred the circular grids one by one to a customized test vessel and checked their performance.

    “Nobody had studied the behavior of such large grids under high fields and in this particular xenon environment before, so there was a lot we had to test and learn,” said Rachel Mannino, a postdoctoral researcher at the University of Wisconsin-Madison working with SLAC’s LZ team. “We were particularly worried about electron emissions from the wires, which can occur under high fields and would generate false signals in the detector.”

    The tests were done in xenon gas under high pressure. While slowly ramping up the voltage on the grids, the researchers used PMTs to search for potential hotspots where electrons leave the metal mesh. The results allowed the team to define grid operating conditions that minimize unwanted electron emissions.

    In addition, the gate grid was chemically treated to further reduce those nuisance emissions and improve the experiment’s ability to search for WIMPs with lower masses.

    A SLAC team sets up a specialized vessel to test the performance of LZ’s high-voltage grids under high voltage and in a high-pressure xenon atmosphere. (Farrin Abbott/SLAC National Accelerator Laboratory)

    Getting ready for the dark matter hunt

    With the last grid on its way to SURF, the LZ team is now ready to put everything together.

    “We’ve recently begun building the detector core from the bottom up,” said SLAC’s Tomasz Biesiadzinski, one of the scientists in charge of detector integration, who splits his time between SLAC and SURF. “In the fall, we’ll move everything underground, where LZ’s outer layers are already being assembled, and integrate and connect all the parts. After all the years of preparation we’re finally getting close to collecting data.”

    LZ’s dark matter hunt is set to begin sometime next year. Then, it’ll be up to the WIMPs to show up.

    LZ’s high-voltage grids are about 5 feet in diameter. Each of the four grids is woven from hundreds of metal wires thinner than a human hair – a total of two miles of wire for all four. (Farrin Abbott/SLAC National Accelerator Laboratory)

    Major support for LZ comes from the DOE Office of Science, the South Dakota Science and Technology Authority, the U.K.’s Science & Technology Facilities Council, and from collaboration members in South Korea and Portugal.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    SLAC/LCLS


    SLAC/LCLS II projected view


    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     
  • richardmitnick 2:11 pm on June 11, 2019
    Tags: Dark Matter, Future Circular Collider (FCC), If we don’t push the frontiers of physics we’ll never learn what lies beyond our current understanding, Lepton collider, New accelerators explored, Proton collider

    From Ethan Siegel: “Does Particle Physics Have A Future On Earth?” 

    From Ethan Siegel
    Jun 11, 2019

    The inside of the LHC, where protons pass each other at 299,792,455 m/s, just 3 m/s shy of the speed of light. As powerful as the LHC is, the cancelled SSC could have been three times as powerful, and may have revealed secrets of nature that are inaccessible at the LHC. (CERN)

    If we don’t push the frontiers of physics, we’ll never learn what lies beyond our current understanding.

    At a fundamental level, what is our Universe made of? This question has driven physics forward for centuries. Even with all the advances we’ve made, we still don’t know it all. While the Large Hadron Collider discovered the Higgs boson and completed the Standard Model earlier this decade, the full suite of particles we know of makes up only 5% of the total energy in the Universe.

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    Standard Model of Particle Physics

    We don’t know what dark matter is, but the indirect evidence for it is overwhelming.

    Fritz Zwicky discovered dark matter while observing the motion of the Coma Cluster; Vera Rubin, a woman in STEM denied the Nobel Prize, did much of the foundational work on dark matter.

    Fritz Zwicky, from http://palomarskies.blogspot.com

    Coma cluster, via NASA/ESA Hubble

    Astronomer Vera Rubin at the Lowell Observatory in 1965 (The Carnegie Institution for Science)

    Vera Rubin measuring spectra (Emilio Segrè Visual Archives, AIP)

    Vera Rubin with the Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970. https://home.dtm.ciw.edu

    Same deal with dark energy.

    Dark Energy Survey

    Dark Energy Camera [DECam], built at FNAL

    NOAO/CTIO Victor M. Blanco 4-meter telescope at Cerro Tololo, Chile, which houses DECam at an altitude of 7,200 feet

    Timeline of the inflationary universe, WMAP

    The Dark Energy Survey (DES) is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. DES began searching the Southern skies on August 31, 2013.

    According to Einstein’s theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up. To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called dark energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    DES is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the DES collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time is used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.

    Or questions like why the fundamental particles have the masses they do, or why neutrinos aren’t massless, or why our Universe is made of matter and not antimatter. Our current tools and searches have not answered these great existential puzzles of modern physics. Particle physics now faces an incredible dilemma: try harder, or give up.

    The Standard Model of particle physics accounts for three of the four forces (excepting gravity), the full suite of discovered particles, and all of their interactions. Whether there are additional particles and/or interactions that are discoverable with colliders we can build on Earth is a debatable subject, but one we’ll only know the answer to if we explore past the known energy frontier. (CONTEMPORARY PHYSICS EDUCATION PROJECT / DOE / NSF / LBNL)

    The particles and interactions that we know of are all governed by the Standard Model of particle physics, plus gravity, dark matter, and dark energy. In particle physics experiments, however, it’s the Standard Model alone that matters. The six quarks, charged leptons and neutrinos, gluons, photon, gauge bosons, and Higgs boson are all that it predicts, and each of these particles has not only been discovered but has had its properties measured.

    As a result, the Standard Model is perhaps a victim of its own success. The masses, spins, lifetimes, interaction strengths, and decay ratios of every particle and antiparticle have all been measured, and they agree with the Standard Model’s predictions at every turn. There are enormous puzzles about our Universe, and particle physics has given us no experimental indications of where or how they might be solved.

    The particles and antiparticles of the Standard Model have now all been directly detected, with the last holdout, the Higgs Boson, falling at the LHC earlier this decade. All of these particles can be created at LHC energies, and the masses of the particles lead to fundamental constants that are absolutely necessary to describe them fully. These particles can be well-described by the physics of the quantum field theories underlying the Standard Model, but they do not describe everything, like dark matter. (E. SIEGEL / BEYOND THE GALAXY)

    It might be tempting, therefore, to presume that building a superior particle collider would be a fruitless endeavor. Indeed, this could be the case. The Standard Model of particle physics has explicit predictions for the couplings that occur between particles. While there are a number of parameters that remain poorly determined at present, it’s conceivable that there are no new particles that a next-generation collider could reveal.

    The heaviest Standard Model particle is the top quark, which takes roughly 180 GeV of energy to create. While the Large Hadron Collider can reach energies of 14 TeV (about 80 times the energy needed to create a top quark), there might not be any new particles present to find unless we reach energies in excess of 1,000,000 times as great. This is the great fear of many: the possible existence of a so-called “energy desert” extending for many orders of magnitude.

    There is certainly new physics beyond the Standard Model, but it might not show up until energies far, far greater than what a terrestrial collider could ever reach. Still, whether this scenario is true or not, the only way we’ll know is to look. In the meantime, properties of the known particles can be better explored with a future collider than any other tool. The LHC has failed to reveal, thus far, anything beyond the known particles of the Standard Model. (UNIVERSE-REVIEW.CA)

    But it’s also possible that there is new physics present at a modest scale beyond where we’ve presently probed. There are many theoretical extensions to the Standard Model that are quite generic, where deviations from the Standard Model’s predictions can be detected by a next-generation collider.

    If we want to know what the truth about our Universe is, we have to look, and that means pushing the present frontiers of particle physics into uncharted territory. Right now, the community is debating between multiple approaches, with each one having its pros and cons. The nightmare scenario, however, isn’t that we’ll look and won’t find anything. It’s that infighting and a lack of unity will doom experimental physics forever, and that we won’t get a next-generation collider at all.

    A hypothetical new accelerator, either a long linear one or one inhabiting a large tunnel beneath the Earth, could dwarf the sensitivity to new particles that prior and current colliders can achieve. Even at that, there’s no guarantee we’ll find anything new, but we’re certain to find nothing new if we fail to try. (ILC COLLABORATION)

    When it comes to deciding what collider to build next, there are two generic approaches: a lepton collider (where electrons and positrons are accelerated and collided), and a proton collider (where protons are accelerated and collided). Lepton colliders have several advantages:

    leptons are point particles, rather than composite particles;
    100% of the energy from electrons colliding with positrons can be converted into energy for new particles;
    the signal is clean and much easier to extract;
    and the energy is controllable, meaning we can choose to tune the energy to a specific value and maximize the chance of creating a specific particle.

    Lepton colliders, in general, are great for precision studies, and we haven’t had a cutting-edge one since LEP was operational nearly 20 years ago.

    CERN LEP Collider

    At various center-of-mass energies in electron/positron (lepton) colliders, various Higgs production mechanisms can be reached at explicit energies. While a circular collider can achieve much greater collision rates and production rates of W, Z, H, and t particles, a long-enough linear collider can conceivably reach higher energies, enabling us to probe Higgs production mechanisms that a circular collider cannot reach. This is the main advantage that linear lepton colliders possess; if they are low-energy only (like the proposed ILC), there is no reason not to go circular. (H. ABRAMOWICZ ET AL., EUR. PHYS. J. C 77, 475 (2017))

    It’s very unlikely, unless nature is extremely kind, that a lepton collider will directly discover a new particle, but it may be the best bet for indirectly discovering evidence of particles beyond the Standard Model. We’ve already discovered particles like the W and Z bosons, the Higgs boson, and the top quark, but a lepton collider could both produce them in great abundances and through a variety of channels.

    The more events of interest we create, the more deeply we can probe the Standard Model. The Large Hadron Collider, for example, will be able to tell whether the Higgs behaves consistently with the Standard Model down to about the 1% level. In a wide series of extensions to the Standard Model, ~0.1% deviations are expected, and the right future lepton collider will get you the best physics constraints possible.

    The observed Higgs decay channels vs. the Standard Model agreement, with the latest data from ATLAS and CMS included. The agreement is astounding, and yet frustrating at the same time. By the 2030s, the LHC will have approximately 50 times as much data, but the precisions on many decay channels will still only be known to a few percent. A future collider could increase that precision by multiple orders of magnitude, revealing the existence of potential new particles. (ANDRÉ DAVID, VIA TWITTER)

    These precision studies could be incredibly sensitive to the presence of particles or interactions we haven’t yet discovered. When we create a particle, it has a certain set of branching ratios, or probabilities that it will decay in a variety of ways. The Standard Model makes explicit predictions for those ratios, so if we create a million, or a billion, or a trillion such particles, we can probe those branching ratios to unprecedented precisions.
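    To spell out the statistics behind that claim (a standard estimate, not a number from the article): if N particles are produced and a given channel has branching ratio B, the observed count is binomially distributed, so the achievable precision scales as

        \sigma_B = \sqrt{\frac{B(1-B)}{N}}, \qquad \frac{\sigma_B}{B} \approx \frac{1}{\sqrt{N B}} \quad (B \ll 1).

    For example, N = 10^{12} produced particles and a channel with B = 10^{-3} give a relative precision of about 1/\sqrt{10^{9}} \approx 3 \times 10^{-5}, which is how sheer event counts translate into sensitivity to sub-percent deviations from the Standard Model.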

    If you want better physics constraints, you need more data and better data. It isn’t just the technical considerations that should determine which collider comes next, but also where and how you can get the best personnel, the best infrastructure and support, and where you can build a (or take advantage of an already-existing) strong experimental and theoretical physics community.

    The idea of a linear lepton collider has been bandied about in the particle physics community as the ideal machine to explore post-LHC physics for many decades, but that was under the assumption that the LHC would find a new particle other than the Higgs. If we want to do precision testing of Standard Model particles to indirectly search for new physics, a linear collider may be an inferior option to a circular lepton collider. (REY HORI/KEK)

    There are two general classes of proposals for a lepton collider: a circular collider and a linear collider. Linear colliders are simple: accelerate your particles in a straight line and collide them together in the center. With ideal accelerator technology, a linear collider 11 km long could reach energies of 380 GeV: enough to produce the W, Z, Higgs, or top in great abundance. With a 29 km linear collider, you could reach energies of 1.5 TeV, and with a 50 km collider, 3 TeV, although costs rise tremendously to accompany longer lengths.
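    As a quick sanity check on those figures (simple arithmetic, not from the article): treating the full length as accelerating structure, 380 GeV over 11 km implies an average gradient of roughly

        \frac{380\ \text{GeV}}{11\ \text{km}} \approx 35\ \text{MV/m},

    and 3 TeV over 50 km implies roughly 60 MV/m. The same numbers follow if each beam (carrying half the collision energy) is accelerated along half the facility, which is why reaching higher energy demands proportionally longer, costlier machines.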

    Linear colliders are slightly less expensive than circular colliders for the same energy, because you can dig a smaller tunnel to reach the same energies, and they don’t suffer energy losses due to synchrotron radiation, enabling them to reach potentially higher energies. However, the circular colliders offer an enormous advantage: they can produce much greater numbers of particles and collisions.
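    The synchrotron-radiation penalty can be made quantitative (a standard accelerator-physics result, not given in the article): the energy a particle of mass m and energy E radiates per turn in a ring of bending radius \rho scales as

        \Delta E_{\rm turn} = \frac{4\pi}{3}\,\frac{e^{2}}{\rho}\left(\frac{E}{m c^{2}}\right)^{4} \propto \frac{E^{4}}{m^{4}\,\rho},

    so an electron, being about 1,836 times lighter than a proton, radiates roughly 10^{13} times more per turn at the same energy and radius. That is why circular machines at the energy frontier collide protons, while the highest-energy lepton collider proposals are linear.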

    Future Circular Collider (FCC), a larger LHC


    The Future Circular Collider is a proposal to build, for the 2030s, a successor to the LHC with a circumference of up to 100 km: nearly four times the size of the present underground tunnels. This will enable, with current magnet technology, the creation of a lepton collider that can produce ~10⁴ times the number of W, Z, H, and t particles that have been produced by prior and current colliders. (CERN / FCC STUDY)

    While a linear collider might be able to produce 10 to 100 times as many collisions as a prior-generation lepton collider like LEP (dependent on energies), a circular version can surpass that easily: producing 10,000 times as many collisions at the energies required to create the Z boson.

    Although circular colliders have substantially higher event rates than linear colliders at the relevant energies that produce Higgs particles as well, they begin to lose their advantage at energies required to produce top quarks, and cannot reach beyond that at all, where linear colliders become dominant.

    Because all of the decay and production processes that occur in these heavy particles scale as either the number of collisions or the square root of the number of collisions, a circular collider has the potential to probe physics with many times the sensitivity of a linear collider.

    A number of the various lepton colliders, with their luminosity (a measure of the collision rate and the number of detections one can make) as a function of center-of-mass collision energy. Note that the red line, which is a circular collider option, offers many more collisions than the linear version, but gets less superior as energy increases. Beyond about 380 GeV, circular colliders cannot reach, and a linear collider like CLIC is the far superior option. (GRANADA STRATEGY MEETING SUMMARY SLIDES / LUCIE LINSSEN (PRIVATE COMMUNICATION))

    The proposed FCC-ee, or the lepton stage of the Future Circular Collider, would realistically discover indirect evidence for any new particles that coupled to the W, Z, Higgs, or top quark with masses up to 70 TeV: five times the maximum energy of the Large Hadron Collider.

    The flipside to a lepton collider is a proton collider, which — at these high energies — is essentially a gluon-gluon collider. This cannot be linear; it must be circular.

    The scale of the proposed Future Circular Collider (FCC), compared with the LHC presently at CERN and the Tevatron, formerly operational at Fermilab. The Future Circular Collider is perhaps the most ambitious proposal for a next-generation collider to date, including both lepton and proton options as various phases of its proposed scientific programme. (PCHARITO / WIKIMEDIA COMMONS)

    There is really only one suitable site for this: CERN, since it not only needs a new, enormous tunnel, but all the infrastructure of the prior stages, which only exist at CERN. (It could be built elsewhere, but the cost would be higher than at a site where infrastructure like the LHC and earlier colliders like the SPS already exists.)

    The Super Proton Synchrotron (SPS), CERN’s second-largest accelerator.

    Just as the LHC is presently occupying the tunnel previously occupied by LEP, a circular lepton collider could be superseded by a next-generation circular proton collider, such as the proposed FCC-pp. However, you cannot run both an exploratory proton collider and a precision lepton collider simultaneously; you must decommission one to finish the other.

    The CMS detector at CERN, one of the two most powerful particle detectors ever assembled. Every 25 nanoseconds, on average, a new particle bunch collides at the center-point of this detector. A next-generation detector, whether for a lepton or proton collider, may be able to record even more data, faster, and with higher-precision than the CMS or ATLAS detectors can at present. (CERN)

    It’s very important to make the right decision, as we do not know what secrets nature holds beyond the already-explored frontiers. Going to higher energies unlocks the potential for new direct discoveries, while going to higher precisions and greater statistics could provide even stronger indirect evidence for the existence of new physics.

    The first-stage linear colliders are going to cost between 5 and 7 billion dollars, including the tunnel, while a proton collider of four times the LHC’s radius, with magnets twice as strong, 10 times the collision rate and next-generation computing and cryogenics might cost a total of up to $22 billion, offering as big a leap over the LHC as the LHC was over the Tevatron. Some money could be saved if we build the circular lepton and proton colliders one after the other in the same tunnel, which would essentially provide a future for experimental particle physics after the LHC is done running at the end of the 2030s.

    The Standard Model particles and their supersymmetric counterparts. Slightly under 50% of these particles have been discovered, and just over 50% have never shown a trace that they exist. Supersymmetry is an idea that hopes to improve on the Standard Model, but it has yet to make successful predictions about the Universe in attempting to supplant the prevailing theory. However, new colliders are not being proposed to find supersymmetry or dark matter, but to perform generic searches. Regardless of what they’ll find, we’ll learn something new about the Universe itself. (CLAIRE DAVID / CERN)

    The most important thing to remember in all of this is that we aren’t simply continuing to look for supersymmetry, dark matter, or any particular extension of the Standard Model. We have a slew of problems and puzzles that indicate that there must be new physics beyond what we currently understand, and our scientific curiosity compels us to look. In choosing what machine to build, it’s vital to choose the most performant machine: the one with the highest number of collisions at the energies we’re interested in probing.

    Regardless of which specific projects the community chooses, there will be trade-offs. A linear lepton collider can always reach higher energies than a circular one, while a circular one can always create more collisions and go to higher precisions. It can gather just as much data in a tenth the time, and probe for more subtle effects, at the cost of a lower energy reach.

    Will it be successful? Regardless of what we find, that answer is unequivocally yes. In experimental physics, success does not equate to finding something, as some might erroneously believe. Instead, success means knowing something, post-experiment, that you did not know before you did the experiment. To push beyond the presently known frontiers, we’d ideally want both a lepton and a proton collider, at the highest energies and collision rates we can achieve.

    There is no doubt that new technologies and spinoffs will come from whichever collider or colliders come next, but that’s not why we do it. We are after the deepest secrets of nature, the ones that will remain elusive even after the Large Hadron Collider finishes. We have the technical capabilities, the personnel, and the expertise to build it right at our fingertips. All we need is the political and financial will, as a civilization, to seek the ultimate truths about nature.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 3:48 pm on June 10, 2019
    Tags: An alternative to the WIMP model of dark matter calls for a form of “dark electromagnetism” including “dark photons” and other particles, Dark Matter

    From UC Davis: “A New Candidate for Dark Matter and a Way to Detect It” 


    From UC Davis

    June 10, 2019
    Andy Fell

    A simulation of the large-scale structure of the universe with filaments of dark matter in blue and places of galaxy formation in yellow. Dark matter cannot yet be detected directly. UC Davis physicists have proposed a new model to explain it. (Image: Zarija Lukic/Lawrence Berkeley National Laboratory)

    Two theoretical physicists at the University of California, Davis, have a new candidate for dark matter, and a possible way to detect it. They presented their work June 6 at the Planck 2019 conference in Granada, Spain, and it has been submitted for publication.

    Dark matter is thought to make up just over a quarter of our universe, with most of the rest being even-more mysterious dark energy. It cannot be seen directly, but dark matter’s presence can be detected because its gravity determines the shape of distant galaxies and other objects.

    Many physicists believe that dark matter is made up of some particle yet to be discovered. For some time, the favorite candidate has been the weakly interacting massive particle or WIMP. But despite years of effort, WIMPs have so far not shown up in experiments designed to detect them.

    “We still don’t know what dark matter is,” said John Terning, professor of physics at UC Davis and co-author on the paper. “The primary candidate for a long time was the WIMP, but it looks like that’s almost completely ruled out.”

    An alternative to the WIMP model of dark matter calls for a form of “dark electromagnetism” including “dark photons” and other particles. Dark photons would have some weak coupling with “regular” photons.

    In their new paper, Terning and postdoctoral researcher Christopher Verhaaren add a twist to this idea: a dark magnetic “monopole” that would interact with the dark photon.

    In the macroscopic world, magnets always have two poles, north and south. A monopole is a particle that acts like one end of a magnet. Monopoles are predicted by quantum theory but have never been observed in an experiment. The scientists suggest that dark monopoles would interact with dark photons and dark electrons in the same way that theory predicts electrons and photons interact with monopoles.

    And that implies a way to detect these dark particles. The physicist Paul Dirac predicted that an electron moving in a circle near a monopole would pick up a change of phase in its wave function. Because electrons exist as both particles and waves in quantum theory, the same electron could pass on either side of the monopole and as a result be slightly out of phase on the other side.

    This interference pattern, called the Aharonov-Bohm effect, means that an electron passing around a magnetic field is influenced by it, even if it does not pass through the field itself.

    Terning and Verhaaren argue that you could detect a dark monopole because of the way it shifts the phase of electrons as they pass by.
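    For readers who want the quantitative version, the relevant formulas in textbook form (standard results, not taken from Terning and Verhaaren’s paper) are the Aharonov-Bohm phase acquired by an electron of charge e encircling a magnetic flux Φ_B,

    \[
    \Delta\varphi \;=\; \frac{e}{\hbar}\oint \mathbf{A}\cdot d\boldsymbol{\ell} \;=\; \frac{e\,\Phi_B}{\hbar},
    \]

    and Dirac’s quantization condition, which follows from requiring that the phase picked up on a full loop around a monopole of magnetic charge g be unobservable (Gaussian units):

    \[
    e\,g \;=\; \frac{n\hbar c}{2}, \qquad n \in \mathbb{Z}.
    \]

    A dark monopole passing near an electron beam would imprint an analogous, but far smaller, phase shift.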

    “This is a new type of dark matter but it comes with a new way to look for it as well,” Terning said.

    Electron beams are relatively easy to come by: Electron microscopes were used to demonstrate the Aharonov-Bohm effect in the 1960s, and electron beam technology has improved with time, Terning noted.

    Theoretically, dark matter particles are streaming through us all the time. To be detectable in Terning and Verhaaren’s model, the monopoles would have to be excited by the sun. Then they would take about a month to reach Earth, traveling at about a thousandth of the speed of light.

    On the other hand, the predicted phase shift is extremely small — smaller than that needed to detect gravitational waves, for example. However, Terning noted that when the LIGO gravitational-wave experiment was first proposed, the technology to make it work did not exist — instead, technology caught up over time.

    The work was supported by a grant from the U.S. Department of Energy.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    UC Davis Campus

    The University of California, Davis, is a major public research university located in Davis, California, just west of Sacramento. It encompasses 5,300 acres of land, making it the second largest UC campus in terms of land ownership, after UC Merced.

     
  • richardmitnick 12:38 pm on May 25, 2019 Permalink | Reply
    Tags: "CMS hunts for dark photons coming from the Higgs boson", , , Dark Matter, , One idea is that dark matter comprises dark particles that interact with each other through a mediator particle called the dark photon, , ,   

    From CERN CMS: “CMS hunts for dark photons coming from the Higgs boson” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    From CERN CMS

    24 May, 2019
    Ana Lopes

    1
    A proton–proton collision event featuring a muon–antimuon pair (red), a photon (green), and large missing transverse momentum. (Image: CERN)

    They know it’s there but they don’t know what it’s made of. That pretty much sums up scientists’ knowledge of dark matter.

    Fritz Zwicky discovered Dark Matter when observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel, did most of the work on Dark Matter.

    Fritz Zwicky from http://palomarskies.blogspot.com

    Coma cluster via NASA/ESA Hubble

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science)


    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL)


    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970. https://home.dtm.ciw.edu

    This knowledge comes from observations of the universe, which indicate that this invisible form of matter is about five to six times more abundant than visible matter.

    One idea is that dark matter comprises dark particles that interact with each other through a mediator particle called the dark photon, named in analogy with the ordinary photon that acts as a mediator between electrically charged particles. A dark photon would also interact weakly with the known particles described by the Standard Model of particle physics, including the Higgs boson.
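    In the particle physics literature, this weak coupling is usually written as a “kinetic mixing” between the ordinary photon field strength F and the dark photon field strength F′, governed by a small dimensionless parameter ε (a standard parametrization, not spelled out in this article):

    \[
    \mathcal{L} \;\supset\; -\frac{\epsilon}{2}\,F_{\mu\nu}F'^{\mu\nu}.
    \]

    The smaller ε is, the more rarely dark photons interact with Standard Model particles.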

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    At the Large Hadron Collider Physics (LHCP) conference, happening this week in Puebla, Mexico, the CMS collaboration reported the results of its latest search for dark photons.

    The collaboration used a large proton–proton collision dataset, collected during the Large Hadron Collider’s second run, to search for instances in which the Higgs boson might transform, or “decay”, into a photon and a massless dark photon. They focused on cases in which the boson is produced together with a Z boson that itself decays into electrons or their heavier cousins known as muons.

    Such instances are expected to be extremely rare, and finding them requires deducing the presence of the potential dark photon, which particle detectors won’t see. For this, researchers add up the momenta of the detected particles in the transverse direction – that is, at right angles to the colliding beams of protons – and identify any missing momentum needed to reach a total value of zero. Such missing transverse momentum indicates an undetected particle.

    But there’s another step to distinguish between a possible dark photon and known particles. This entails estimating the mass of the particle that decays into the detected photon and the undetected particle. If the missing transverse momentum is carried by a dark photon produced in the decay of the Higgs boson, that mass should correspond to the Higgs-boson mass.
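    To make the reconstruction concrete, here is a minimal sketch in Python of the two steps just described: vector-summing the visible transverse momenta to infer the missing momentum, and forming a transverse mass from the photon and that missing momentum. The transverse-mass formula is the standard variable used in such searches; the toy numbers and function names are illustrative, not CMS code.

```python
import math

def missing_transverse_momentum(particles):
    """Vector-sum the visible transverse momenta (pt, phi);
    the missing momentum is whatever balances them to zero."""
    px = sum(pt * math.cos(phi) for pt, phi in particles)
    py = sum(pt * math.sin(phi) for pt, phi in particles)
    return math.hypot(px, py), math.atan2(-py, -px)  # magnitude and direction

def transverse_mass(pt_gamma, phi_gamma, met, phi_met):
    """Transverse mass of the photon + invisible-particle system; for a
    Higgs boson decaying to a photon and a dark photon, its distribution
    should cluster near the Higgs-boson mass."""
    return math.sqrt(2 * pt_gamma * met * (1 - math.cos(phi_gamma - phi_met)))

# Toy event: a photon and two muons (pt in GeV, phi in radians).
photon = (60.0, 0.3)
muons = [(30.0, 2.8), (25.0, -2.0)]
met, phi_met = missing_transverse_momentum([photon] + muons)
print(f"MET = {met:.1f} GeV")
print(f"mT(photon, MET) = {transverse_mass(*photon, met, phi_met):.1f} GeV")
```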

    The CMS collaboration followed this approach but found no signal of dark photons. However, the search allowed the collaboration to place upper bounds on how often the Higgs boson could decay in this way.

    Another null result? Yes, but results such as these and the ATLAS results on supersymmetry also presented this week in Puebla, while not finding new particles or ruling out their existence, are much needed to guide future work, both experimental and theoretical.

    For more details about this result, see the CMS website.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier

    CMS
    CERN CMS New

     
  • richardmitnick 12:13 pm on May 19, 2019 Permalink | Reply
    Tags: "CosmoGAN Neural Network to Study Dark Matter", , , , , Dark Matter, , , New deep learning network,   

    From insideHPC: “CosmoGAN Neural Network to Study Dark Matter” 

    From insideHPC

    May 18, 2019
    Rich Brueckner

    As cosmologists and astrophysicists delve deeper into the darkest recesses of the universe, their need for increasingly powerful observational and computational tools has expanded exponentially. From facilities such as the Dark Energy Spectroscopic Instrument to supercomputers like Lawrence Berkeley National Laboratory’s Cori system at NERSC, they are on a quest to collect, simulate, and analyze increasing amounts of data that can help explain the nature of things we can’t see, as well as those we can.

    Toward this end, gravitational lensing is one of the most promising tools scientists have to extract this information by giving them the ability to probe both the geometry of the universe and the growth of cosmic structure.

    Gravitational Lensing NASA/ESA

    Gravitational lensing distorts images of distant galaxies in a way that is determined by the amount of matter in the line of sight in a certain direction, and it provides a way of looking at a two-dimensional map of dark matter, according to Deborah Bard, Group Lead for the Data Science Engagement Group at NERSC.

    “Gravitational lensing is one of the best ways we have to study dark matter, which is important because it tells us a lot about the structure of the universe,” she said. “The majority of matter in the universe is dark matter, which we can’t see directly, so we have to use indirect methods to study how it is distributed.”

    But as experimental and theoretical datasets grow, along with the simulations needed to image and analyze this data, a new challenge has emerged: these simulations are increasingly – even prohibitively – computationally expensive. So computational cosmologists often resort to computationally cheaper surrogate models, which emulate expensive simulations. More recently, however, “advances in deep generative models based on neural networks opened the possibility of constructing more robust and less hand-engineered surrogate models for many types of simulators, including those in cosmology,” said Mustafa Mustafa, a machine learning engineer at NERSC and lead author on a new study that describes one such approach developed by a collaboration involving Berkeley Lab, Google Research, and the University of KwaZulu-Natal.

    A variety of deep generative models are being investigated for science applications, but the Berkeley Lab-led team is taking a unique tack: generative adversarial networks (GANs). In a paper published May 6, 2019 in Computational Astrophysics and Cosmology, they discuss their new deep learning network, dubbed CosmoGAN, and its ability to create high-fidelity, weak gravitational lensing convergence maps.

    “A convergence map is effectively a 2D map of the gravitational lensing that we see in the sky along the line of sight,” said Bard, a co-author on the Computational Astrophysics and Cosmology paper. “If you have a peak in a convergence map that corresponds to a peak in a large amount of matter along the line of sight, that means there is a huge amount of dark matter in that direction.”

    1
    Weak lensing convergence maps for the ΛCDM cosmological model. Randomly selected maps from the validation dataset (top) and GAN-generated examples (bottom).

    Weak gravitational lensing NASA/ESA Hubble

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe) Date 2010 Credit: Alex Mittelmann, Coldcreation

    NERSC Cray Cori II supercomputer at NERSC at LBNL, named after Gerty Cori, the first American woman to win a Nobel Prize in science

    LBNL/DESI spectroscopic instrument on the Mayall 4-meter telescope at Kitt Peak National Observatory starting in 2018

    NOAO/Mayall 4 m telescope at Kitt Peak, Arizona, USA, Altitude 2,120 m (6,960 ft)

    The Advantages of GANs

    Why opt for GANs instead of other types of generative models? Performance and precision, according to Mustafa.

    “From a deep learning perspective, there are other ways to learn how to generate convergence maps from images, but when we started this project GANs seemed to produce very high-resolution images compared to competing methods, while still being computationally and neural network size efficient,” he said.

    “We were looking for two things: to be accurate and to be fast,” added co-author Zarija Lukic, a research scientist in the Computational Cosmology Center at Berkeley Lab. “GANs offer hope of being nearly as accurate compared to full physics simulations.”

    The research team is particularly interested in constructing a surrogate model that would reduce the computational cost of running these simulations. In the Computational Astrophysics and Cosmology paper, they outline a number of advantages of GANs in the study of large physics simulations.

    “GANs are known to be very unstable during training, especially when you reach the very end of the training and the images start to look nice – that’s when the updates to the network can be really chaotic,” Mustafa said. “But because we have the summary statistics that we use in cosmology, we were able to evaluate the GANs at every step of the training, which helped us determine the generator we thought was the best. This procedure is not usually used in training GANs.”

    Using the CosmoGAN generator network, the team has been able to produce convergence maps that are described – with high statistical confidence – by the same summary statistics as the fully simulated maps. This level of agreement – maps statistically indistinguishable from those produced by physics-based simulations – is an important step toward building emulators out of deep neural networks.

    “The huge advantage here was that the problem we were tackling was a physics problem that had associated metrics,” Bard said. “But with our approach, there are actual metrics that allow you to quantify how accurate your GAN is. To me that is what is really exciting about this – how these kinds of physics problems can influence machine learning methods.”

    Ultimately such approaches could transform science that currently relies on detailed physics simulations that require billions of compute hours and occupy petabytes of disk space – but there is considerable work still to be done. Cosmology data (and scientific data in general) can require very high-resolution measurements, such as full-sky telescope images.

    “The 2D images considered for this project are valuable, but the actual physics simulations are 3D and can be time-varying and irregular, producing a rich, web-like structure of features,” said Wahid Bhimji, a big data architect in the Data and Analytics Services group at NERSC and a co-author on the Computational Astrophysics and Cosmology paper. “In addition, the approach needs to be extended to explore new virtual universes rather than ones that have already been simulated – ultimately building a controllable CosmoGAN.”

    “The idea of doing controllable GANs is essentially the Holy Grail of the whole problem that we are working on: to be able to truly emulate the physical simulators we need to build surrogate models based on controllable GANs,” Mustafa added. “Right now we are trying to understand how to stabilize the training dynamics, given all the advances in the field that have happened in the last couple of years. Stabilizing the training is extremely important to actually be able to do what we want to do next.”
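    The checkpoint-selection trick Mustafa describes above – scoring the generator with a physics summary statistic during training rather than trusting the adversarial loss alone – can be sketched in a few lines. This is a minimal illustration, not the CosmoGAN code: the tiny fully connected networks, the Fourier power spectrum used as the summary statistic, and all names are assumptions made for the example.

```python
import torch
import torch.nn as nn

def power_spectrum(maps):
    """Stand-in summary statistic: batch-averaged 2D Fourier power."""
    return (torch.fft.fft2(maps).abs() ** 2).mean(dim=0).flatten()

G = nn.Sequential(nn.Linear(64, 256), nn.ReLU(), nn.Linear(256, 32 * 32), nn.Tanh())
D = nn.Sequential(nn.Linear(32 * 32, 256), nn.LeakyReLU(0.2), nn.Linear(256, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

real_maps = torch.randn(512, 32 * 32)  # placeholder for simulated convergence maps
best_score, best_state = float("inf"), None

for step in range(1000):
    real = real_maps[torch.randint(0, len(real_maps), (64,))]
    fake = G(torch.randn(64, 64))

    # Discriminator update: real maps labeled 1, generated maps labeled 0.
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator update: try to make the discriminator call fakes real.
    loss_g = bce(D(G(torch.randn(64, 64))), torch.ones(64, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

    # Checkpoint selection by summary statistic, as described in the article:
    # keep the generator whose maps best match the reference power spectrum,
    # regardless of how chaotic the adversarial losses look late in training.
    if step % 100 == 0:
        with torch.no_grad():
            gen = G(torch.randn(256, 64)).view(-1, 32, 32)
            ref = real_maps[:256].view(-1, 32, 32)
            score = (power_spectrum(gen) - power_spectrum(ref)).abs().mean().item()
        if score < best_score:
            best_score = score
            best_state = {k: v.clone() for k, v in G.state_dict().items()}
```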

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Founded on December 28, 2006, insideHPC is a blog that distills news and events in the world of HPC and presents them in bite-sized nuggets of helpfulness as a resource for supercomputing professionals. As one reader said, we’re sifting through all the news so you don’t have to!

    If you would like to contact me with suggestions, comments, corrections, errors or new company announcements, please send me an email at rich@insidehpc.com. Or you can send me mail at:

    insideHPC
    2825 NW Upshur
    Suite G
    Portland, OR 97239

    Phone: (503) 877-5048

     
  • richardmitnick 8:01 am on May 10, 2019 Permalink | Reply
    Tags: "Q&A: SLAC/Stanford researchers prepare for a new quantum revolution", , , , Dark Matter, , , , , , Quantum squeezing, , The most exciting opportunities in quantum control make use of a phenomenon known as entanglement   

    From SLAC National Accelerator Lab- “Q&A: SLAC/Stanford researchers prepare for a new quantum revolution” 

    From SLAC National Accelerator Lab

    May 9, 2019
    Manuel Gnida

    Monika Schleier-Smith and Kent Irwin explain how their projects in quantum information science could help us better understand black holes and dark matter.

    The tech world is abuzz about quantum information science (QIS). This emerging technology explores bizarre quantum effects that occur on the smallest scales of matter and could potentially revolutionize the way we live.

    Quantum computers would outperform today’s most powerful supercomputers; data transfer technology based on quantum encryption would be more secure; exquisitely sensitive detectors could pick up fainter-than-ever signals from all corners of the universe; and new quantum materials could enable superconductors that transport electricity without loss.

    In December 2018, President Trump signed the National Quantum Initiative Act into law, which will mobilize $1.2 billion over the next five years to accelerate the development of quantum technology and its applications. Three months earlier, the Department of Energy had already announced $218 million in funding for 85 QIS research awards.

    The Fundamental Physics and Technology Innovation directorates of DOE’s SLAC National Accelerator Laboratory recently joined forces with Stanford University on a new initiative called Q-FARM to make progress in the field. In this Q&A, two Q-FARM scientists explain how they will explore the quantum world through projects funded by DOE QIS awards in high-energy physics.

    Monika Schleier-Smith, assistant professor of physics at Stanford, wants to build a quantum simulator made of atoms to test how quantum information spreads. The research, she said, could even lead to a better understanding of black holes.

    Kent Irwin, professor of physics at Stanford and professor of photon science and of particle physics and astrophysics at SLAC, works on quantum sensors that would open new avenues to search for the identity of the mysterious dark matter that makes up most of the universe.

    1
    Monika Schleier-Smith and Kent Irwin are the principal investigators of three quantum information science projects in high-energy physics at SLAC. (Farrin Abbott/Dawn Harmer/SLAC National Accelerator Laboratory)

    What exactly is quantum information science?

    Irwin: If we look at the world on the smallest scales, everything we know is already “quantum.” On this scale, the properties of atoms, molecules and materials follow the rules of quantum mechanics. QIS strives to make significant advances in controlling those quantum effects that don’t exist on larger scales.

    Schleier-Smith: We’re truly witnessing a revolution in the field in the sense that we’re getting better and better at engineering systems with carefully designed quantum properties, which could pave the way for a broad range of future applications.

    What does quantum control mean in practice?

    Schleier-Smith: The most exciting opportunities in quantum control make use of a phenomenon known as entanglement – a type of correlation that doesn’t exist in the “classical,” non-quantum world. Let me give you a simple analogy: Imagine that we flip two coins. Classically, whether one coin shows heads or tails is independent of what the other coin shows. But if the two coins are instead in an entangled quantum state, looking at the result for one “coin” automatically determines the result for the other one, even though the coin toss still looks random for either coin in isolation.

    Entanglement thus provides a fundamentally new way of encoding information – not in the states of individual “coins” or bits but in correlations between the states of different qubits. This capability could potentially enable transformative new ways of computing, where problems that are intrinsically difficult to solve on classical computers might be more efficiently solved on quantum ones. A challenge, however, is that entangled states are exceedingly fragile: any measurement of the system – even unintentional – necessarily changes the quantum state. So a major area of quantum control is to understand how to generate and preserve this fragile resource.
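    In qubit notation, the two-coin analogy corresponds to the textbook Bell state (a generic example, not specific to the Stanford experiments):

    \[
    |\Phi^{+}\rangle \;=\; \frac{1}{\sqrt{2}}\big(|00\rangle + |11\rangle\big).
    \]

    Either qubit measured alone gives 0 or 1 with probability 1/2, yet the two outcomes always agree; the information lives entirely in the correlation.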

    At the same time, certain quantum technologies can also take advantage of the extreme sensitivity of quantum states to perturbations. One application is in secure telecommunications: If a sender and receiver share information in the form of quantum bits, an eavesdropper cannot go undetected, because her measurement necessarily changes the quantum state.

    Another very promising application is quantum sensing, where the idea is to reduce noise and enhance sensitivity by controlling quantum correlations, for instance, through quantum squeezing.

    What is quantum squeezing?

    Irwin: Quantum mechanics sets limits on how we can measure certain things in nature. For instance, we can’t perfectly measure both the position and momentum of a particle. The very act of measuring one changes the other. This is called the Heisenberg uncertainty principle. When we search for dark matter, we need to measure an electromagnetic signal extremely well, but Heisenberg tells us that we can’t measure the strength and timing of this signal without introducing uncertainty.

    Quantum squeezing allows us to evade limits on measurement set by Heisenberg by putting all the uncertainty into one thing (which we don’t care about), and then measuring the other with much greater precision. So, for instance, if we squeeze all of the quantum uncertainty in an electromagnetic signal into its timing, we can measure its strength much better than quantum mechanics would ordinarily allow. This lets us search for an electromagnetic signal from dark matter much more quickly and sensitively than is otherwise possible.
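    Schematically, for the two quadratures X₁ and X₂ of an electromagnetic signal (a textbook illustration, not the Dark Matter Radio’s specific readout math):

    \[
    \Delta X_1\,\Delta X_2 \;\ge\; \tfrac{1}{4}, \qquad \Delta X_1 = \tfrac{1}{2}e^{-r}, \quad \Delta X_2 = \tfrac{1}{2}e^{+r}.
    \]

    Squeezing with parameter r > 0 keeps the Heisenberg product at its minimum while shrinking the noise in the quadrature you care about by a factor e^{-r}, at the cost of inflating the other.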

    2
    Kent Irwin (at left with Dale Li) leads efforts at SLAC and Stanford to build quantum sensors for exquisitely sensitive detectors. (Andy Freeberg/SLAC National Accelerator Laboratory)

    What types of sensors are you working on?

    Irwin: My team is exploring quantum techniques to develop sensors that could break new ground in the search for dark matter.

    We’ve known since the 1930s that the universe contains much more matter than the ordinary type that we can see with our eyes and telescopes – the matter made up of atoms. Whatever dark matter is, it’s a new type of particle that we don’t understand yet. Most of today’s dark matter detectors search for relatively heavy particles, called weakly interacting massive particles, or WIMPs.

    PandaX II Dark Matter experiment at Jin-ping Underground Laboratory (CJPL) in Sichuan, China

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB deep in Sudbury’s Creighton Mine

    LBNL LZ project at SURF, Lead, SD, USA

    But what if dark matter particles were so light that they wouldn’t leave a trace in those detectors? We want to develop sensors that would be able to “see” much lighter dark matter particles.

    There would be so many of these very light dark matter particles that they would behave much more like waves than individual particles. So instead of looking for collisions of individual dark matter particles within a detector, which is how WIMP detectors work, we want to look for dark matter waves, which would be detected like a very weak AM radio signal.

    In fact, we even call one of our projects “Dark Matter Radio.” It works like the world’s most sensitive AM radio. But it’s also placed in the world’s most perfect radio shield, made up of a material called a superconductor, which keeps all normal radio waves out. However, unlike real AM radio signals, dark matter waves would be able to go right through the shield and produce a signal. So we are looking for a very weak AM radio station made by dark matter at an unknown frequency.

    Quantum sensors can make this radio much more sensitive, for instance by using quantum tricks such as squeezing and entanglement. So the Dark Matter Radio will not only be the world’s most sensitive AM radio; it will also be better than the Heisenberg uncertainty principle would normally allow.

    What are the challenges of QIS?

    Schleier-Smith: There is a lot we need to learn about controlling quantum correlations before we can make broad use of them in future applications. For example, the sensitivity of entangled quantum states to perturbations is great for sensor applications. However, for quantum computing it’s a major challenge because perturbations of information encoded in qubits will introduce errors, and nobody knows for sure how to correct for them.

    To make progress in that area, my team is studying a question that is very fundamental to our ability to control quantum correlations: How does information actually spread in quantum systems?

    The model system we’re using for these studies consists of atoms that are laser-cooled and optically trapped. We use light to controllably turn on interactions between the atoms, as a means of generating entanglement. By measuring the speed with which quantum information can spread in the system, we hope to understand how to design the structure of the interactions to generate entanglement most efficiently. We view the system of cold atoms as a quantum simulator that allows us to study principles that are also applicable to other physical systems.

    In this area of quantum simulation, one major thrust has been to advance understanding of solid-state systems, by trapping atoms in arrays that mimic the structure of a crystalline material. In my lab, we are additionally working to extend the ideas and tools of quantum simulation in new directions. One prospect that I am particularly excited about is to use cold atoms to simulate what happens to quantum information in black holes.

    3
    Monika Schleier-Smith (at center with graduate students Emily Davis and Eric Cooper) uses laser-cooled atoms in her lab at Stanford to study the transfer of quantum information. (Dawn Harmer/SLAC National Accelerator Laboratory)

    What do cold atoms have to do with black holes?

    Schleier-Smith: The idea that there might be any connection between quantum systems we can build in the lab and black holes has its origins in a long-standing theoretical problem: When particles fall into a black hole, what happens to the information they contained? There were compelling arguments that the information should be lost, but that would contradict the laws of quantum mechanics.

    More recently, theoretical physicists – notably my Stanford colleague Patrick Hayden – found a resolution to this problem: We should think of the black hole as a highly chaotic system that “scrambles” the information as fast as physically possible. It’s almost like shredding documents, but quantum information scrambling is much richer in that the result is a highly entangled quantum state.

    Although precisely recreating such a process in the lab will be very challenging, we hope to look at one of its key features already in the near term. In order for information scrambling to happen, information needs to be transferred through space exponentially fast. This, in turn, requires quantum interactions to occur over long distances, which is quite counterintuitive because interactions in nature typically become weaker with distance. With our quantum simulator, we are able to study interactions between distant atoms by sending information back and forth with photons, particles of light.

    What do you hope will happen in QIS over the next few years?

    Irwin: We need to prove that, in real applications, quantum technology is superior to the technology that we already have. We are in the early stages of this new quantum revolution, but this is already starting to happen. The things we’re learning now will help us make a leap in developing future technology, such as universal quantum computers and next-generation sensors. The work we do on quantum sensors will enable new science, not only in dark matter research. At SLAC, I also see potential for quantum-enhanced sensors in X-ray applications, which could provide us with new tools to study advanced materials and understand how biomolecules work.

    Schleier-Smith: QIS offers plenty of room for breakthroughs. There are many open questions we still need to answer about how to engineer the properties of quantum systems in order to harness them for technology, so it’s imperative that we continue to broadly advance our understanding of complex quantum systems. Personally, I hope that we’ll be able to better connect experimental observations with the latest theoretical advances. Bringing all this knowledge together will help us build the technologies of the future.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    SLAC/LCLS


    SLAC/LCLS II projected view


    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     
  • richardmitnick 3:02 pm on May 2, 2019 Permalink | Reply
    Tags: , , Dark Matter, , , , , ,   

    From University of Chicago: “Scientists invent way to trap mysterious ‘dark world’ particle at Large Hadron Collider” 

    U Chicago bloc

    From University of Chicago

    Apr 17, 2019 [Just found this via social media]
    Louise Lerner

    1
    Courtesy of Zarija Lukic/Berkeley Lab

    A new paper outlines a method to directly detect particles from the ‘dark world’ using the Large Hadron Collider. Until now we’ve only been able to make indirect measurements and simulations, such as the visualization of dark matter above.

    CERN LHC Maximilien Brice and Julien Marius Ordan

    Higgs boson could be tied to a dark particle, serve as ‘portal to the dark world’.

    Now that they’ve identified the Higgs boson, scientists at the Large Hadron Collider have set their sights on an even more elusive target.

    All around us is dark matter and dark energy—the invisible stuff that binds the galaxy together, but which no one has been able to directly detect. “We know for sure there’s a dark world, and there’s more energy in it than there is in ours,” said LianTao Wang, a University of Chicago professor of physics who studies how to find signals in large particle accelerators like the LHC.

    Wang, along with scientists from the University and UChicago-affiliated Fermilab, thinks they may be able to lead us to its tracks; in a paper published April 3 in Physical Review Letters, they laid out an innovative method for stalking dark matter in the LHC by exploiting a potential particle’s slightly slower speed.

    While the dark world makes up more than 95% of the universe, scientists only know it exists from its effects—like a poltergeist you can only see when it pushes something off a shelf. For example, we know there’s dark matter because we can see gravity acting on it—it helps keep our galaxies from flying apart.

    Theorists think there’s one particular kind of dark particle that only occasionally interacts with normal matter. It would be heavier and longer-lived than other known particles, with a lifetime up to one tenth of a second. A few times in a decade, researchers believe, this particle can get caught up in the collisions of protons that the LHC is constantly creating and measuring.

    “One particularly interesting possibility is that these long-lived dark particles are coupled to the Higgs boson in some fashion—that the Higgs is actually a portal to the dark world,” said Wang, referring to the last holdout particle in physicists’ grand theory of how the universe works, discovered at the LHC in 2012.

    Standard Model of Particle Physics

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    “It’s possible that the Higgs could actually decay into these long-lived particles.”

    The only problem is sorting out these events from the rest; there are more than a billion collisions per second in the 27-kilometer LHC, and each one of these sends subatomic chaff spraying in all directions.

    Wang, UChicago postdoctoral fellow Jia Liu and Fermilab scientist Zhen Liu (now at the University of Maryland) proposed a new way to search by exploiting one particular aspect of such a dark particle. “If it’s that heavy, it costs energy to produce, so its momentum would not be large—it would move more slowly than the speed of light,” said Liu, the first author on the study.

    That time delay would set it apart from all the rest of the normal particles. Scientists would only need to tweak the system to look for particles that are produced and then decay a bit more slowly than everything else.

    The difference is on the order of a nanosecond—a billionth of a second—or smaller. But the LHC already has detectors sophisticated enough to catch this difference; a recent study using data collected from the last run found that the method should work, and the detectors will get even more sensitive as part of the upgrade that is currently underway.
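    For a rough sense of the scale involved (illustrative numbers, not taken from the paper): a heavy particle crossing L = 10 m of detector at v = 0.9c lags a light-speed particle by

    \[
    \Delta t \;=\; \frac{L}{v} - \frac{L}{c} \;=\; \frac{10\ \text{m}}{0.9c} - \frac{10\ \text{m}}{c} \;\approx\; 3.7\ \text{ns},
    \]

    a few nanoseconds – tiny, but within reach of the timing capabilities described here.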

    “We anticipate this method will increase our sensitivity to long-lived dark particles by more than an order of magnitude—while using capabilities we already have at the LHC,” Liu said.

    Experimentalists are already working to build the trap: When the LHC turns back on in 2021, after boosting its luminosity by tenfold, all three of the major detectors will be implementing the new system, the scientists said. “We think it has great potential for discovery,” Liu said.

    CERN ATLAS Credit CERN SCIENCE PHOTO LIBRARY


    CERN/CMS Detector


    CERN/ALICE Detector

    “If the particle is there, we just have to find a way to dig it out,” Wang said. “Usually, the key is finding the question to ask.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Chicago Campus

    An intellectual destination

    One of the world’s premier academic and research institutions, the University of Chicago has driven new ways of thinking since our 1890 founding. Today, UChicago is an intellectual destination that draws inspired scholars to our Hyde Park and international campuses, keeping UChicago at the nexus of ideas that challenge and change the world.

    The University of Chicago is an urban research university that has driven new ways of thinking since 1890. Our commitment to free and open inquiry draws inspired scholars to our global campuses, where ideas are born that challenge and change the world.

    We empower individuals to challenge conventional thinking in pursuit of original ideas. Students in the College develop critical, analytic, and writing skills in our rigorous, interdisciplinary core curriculum. Through graduate programs, students test their ideas with UChicago scholars, and become the next generation of leaders in academia, industry, nonprofits, and government.

    UChicago research has led to such breakthroughs as discovering the link between cancer and genetics, establishing revolutionary theories of economics, and developing tools to produce reliably excellent urban schooling. We generate new insights for the benefit of present and future generations with our national and affiliated laboratories: Argonne National Laboratory, Fermi National Accelerator Laboratory, and the Marine Biological Laboratory in Woods Hole, Massachusetts.

    The University of Chicago is enriched by the city we call home. In partnership with our neighbors, we invest in Chicago’s mid-South Side across such areas as health, education, economic growth, and the arts. Together with our medical center, we are the largest private employer on the South Side.

    In all we do, we are driven to dig deeper, push further, and ask bigger questions—and to leverage our knowledge to enrich all human life. Our diverse and creative students and alumni drive innovation, lead international conversations, and make masterpieces. Alumni and faculty, lecturers and postdocs go on to become Nobel laureates, CEOs, university presidents, attorneys general, literary giants, and astronauts.

     
  • richardmitnick 12:10 pm on April 23, 2019 Permalink | Reply
    Tags: "Falsifiability and physics", , , , , Dark Matter, , , Karl Popper (1902-1994) "The Logic of Scientific Discovery", , ,   

    From Symmetry: “Falsifiability and physics” 

    Symmetry Mag
    From Symmetry

    04/23/19
    Matthew R. Francis

    1
    Illustration by Sandbox Studio, Chicago with Corinne Mucha

    Can a theory that isn’t completely testable still be useful to physics?

    What determines if an idea is legitimately scientific or not? This question has been debated by philosophers and historians of science, working scientists, and lawyers in courts of law. That’s because it’s not merely an abstract notion: What makes something scientific or not determines if it should be taught in classrooms or supported by government grant money.

    The answer is relatively straightforward in many cases: Despite conspiracy theories to the contrary, the Earth is not flat. Literally all evidence is in favor of a round and rotating Earth, so statements based on a flat-Earth hypothesis are not scientific.

    In other cases, though, people actively debate where and how the demarcation line should be drawn. One such criterion was proposed by philosopher of science Karl Popper (1902-1994), who argued that scientific ideas must be subject to “falsification.”

    Popper wrote in his classic book The Logic of Scientific Discovery that a theory that cannot be proven false—that is, a theory flexible enough to encompass every possible experimental outcome—is scientifically useless. He wrote that a scientific idea must contain the key to its own downfall: It must make predictions that can be tested and, if those predictions are proven false, the theory must be jettisoned.

    When writing this, Popper was less concerned with physics than he was with theories like Freudian psychology and Stalinist history. These, he argued, were not falsifiable because they were vague or flexible enough to incorporate all the available evidence and therefore immune to testing.

    But where does this falsifiability requirement leave certain areas of theoretical physics? String theory, for example, involves physics on extremely small length scales unreachable by any foreseeable experiment.

    String Theory depiction. Cross section of the quintic Calabi–Yau manifold. Jbourjai (using Mathematica output)

    Cosmic inflation, a theory that explains much about the properties of the observable universe, may itself be untestable through direct observations.

    Some critics believe these theories are unfalsifiable and, for that reason, are of dubious scientific value.

    At the same time, many physicists align with philosophers of science who identified flaws in Popper’s model, saying falsification is most useful in identifying blatant pseudoscience (the flat-Earth hypothesis, again) but relatively unimportant for judging theories growing out of established paradigms in science.

    “I think we should be worried about being arrogant,” says Chanda Prescod-Weinstein of the University of New Hampshire. “Falsifiability is important, but so is remembering that nature does what it wants.”

    Prescod-Weinstein is both a particle cosmologist and researcher in science, technology, and society studies, interested in analyzing the priorities scientists have as a group. “Any particular generation deciding that they’ve worked out all that can be worked out seems like the height of arrogance to me,” she says.

    Tracy Slatyer of MIT agrees, and argues that stringently worrying about falsification can prevent new ideas from germinating, stifling creativity. “In theoretical physics, the vast majority of all the ideas you ever work on are going to be wrong,” she says. “They may be interesting ideas, they may be beautiful ideas, they may be gorgeous structures that are simply not realized in our universe.”

    Particles and practical philosophy

    Take, for example, supersymmetry. SUSY is an extension of the Standard Model in which each known particle is paired with a supersymmetric partner.

    Standard Model of Supersymmetry via DESY

    The theory is a natural outgrowth of a mathematical symmetry of spacetime, in ways similar to the Standard Model itself. It’s well established within particle physics, even though supersymmetric particles, if they exist, may be out of scientists’ experimental reach.

    SUSY could potentially resolve some major mysteries in modern physics. For one, all of those supersymmetric particles could be the reason the mass of the Higgs boson is smaller than quantum mechanics says it should be.

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    “Quantum mechanics says that [the Higgs boson] mass should blow up to the largest mass scale possible,” says Howard Baer of the University of Oklahoma. That’s because masses in quantum theory are the result of contributions from many different particles involved in interactions—and the Higgs field, which gives other particles mass, racks up a lot of these interactions. But the Higgs mass isn’t huge, which requires an explanation.

    “Something else would have to be tuned to a huge negative [value] in order to cancel [the huge positive value of those interactions] and give you the observed value,” Baer says. That level of coincidence, known as a “fine-tuning problem,” makes physicists itchy. “It’s like trying to play the lottery. It’s possible you might win, but really you’re almost certain to lose.”
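    Schematically (a generic statement of the fine-tuning problem, not Baer’s specific calculation), the physical Higgs mass-squared is a bare term plus quantum corrections that grow with the cutoff scale Λ:

    \[
    m_h^2 \;=\; m_{h,0}^2 \;+\; \delta m_h^2, \qquad \delta m_h^2 \;\sim\; \frac{g^2}{16\pi^2}\,\Lambda^2.
    \]

    If Λ is near the Planck scale, the two terms on the right must cancel to dozens of decimal places to leave the observed m_h ≈ 125 GeV. Superpartners in the right mass range would cut off these corrections and remove the need for that cancellation.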

    If SUSY particles turn up in a certain mass range, their contributions to the Higgs mass “naturally” solve this problem, which has been an argument in favor of the theory of supersymmetry. So far, the Large Hadron Collider has not turned up any SUSY particles in the range of “naturalness.”

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    However, the broad framework of supersymmetry can accommodate even more massive SUSY particles, which may or may not be detectable using the LHC. In fact, if naturalness is abandoned, SUSY doesn’t provide an obvious mass scale at all, meaning SUSY particles might be out of range for discovery with any earthly particle collider. That point has made some critics queasy: If there’s no obvious mass scale at which colliders can hunt for SUSY, is the theory falsifiable?

    A related problem confronts dark matter researchers: Despite strong indirect evidence for a large amount of mass invisible to all forms of light, particle experiments have yet to find any dark matter particles. It could be that dark matter particles are just impossible to directly detect. A small but vocal group of researchers has argued that we need to consider alternative theories of gravity instead.

    Fritz Zwicky, the Father of Dark Matter research. No image credit after long search.

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science)

    U Washington ADMX Axion Dark Matter Experiment

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB deep in Sudbury’s Creighton Mine

    Dark Side-50 Dark Matter Experiment at Gran Sasso

    Slatyer, whose research involves looking for dark matter, considers the criticism partly as a problem of language. “When you say ‘dark matter,’ [you need] to distinguish dark matter from specific scenarios for what dark matter could be,” she says. “The community has not always done that well.”

    In other words, specific models for dark matter can stand or fall, but the dark matter paradigm as a whole has withstood all tests so far. But as Slatyer points out, no alternative theory of gravity can explain all the phenomena that a simple dark matter model can, from the behavior of galaxies to the structure of the cosmic microwave background.

    Prescod-Weinstein argues that we’re a long way from ruling out all dark matter possibilities. “How will we prove that the dark matter, if it exists, definitively doesn’t interact with the Standard Model?” she says. “Astrophysics is always a bit of a detective game. Without laboratory [detection of] dark matter, it’s hard to make definitive statements about its properties. But we can construct likely narratives based on what we know about its behavior.”

    Similarly, Baer thinks that we haven’t exhausted all the SUSY possibilities yet. “People say, ‘you’ve been promising supersymmetry for 20 or 30 years,’ but it was based on overly optimistic naturalness calculations,” he says. “I think if one evaluates the naturalness properly, then you find that supersymmetry is still even now very natural. But you’re going to need either an energy upgrade of LHC or an ILC [International Linear Collider] in order to discover it.”

    ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan

    Beyond falsifiability of dark matter or SUSY, physicists are motivated by more mundane concerns. “Even if these individual scenarios are in principle falsifiable, how much money would [it] take and how much time would it take?” Slatyer says. In other words, rather than try to demonstrate or rule out SUSY as a whole, physicists focus on particle experiments that can be performed within a certain number of budgetary cycles. It’s not romantic, but it’s true nevertheless.

    2
    Illustration by Sandbox Studio, Chicago with Corinne Mucha

    Is it science? Who decides?

    Historically, sometimes theories that seem untestable turn out to just need more time. For example, 19th century physicist Ludwig Boltzmann and colleagues showed they could explain many results in thermal physics and chemistry if everything were made up of “atoms”—what we call particles, atoms, and molecules today—governed by Newtonian physics.

    Since atoms were out of reach of experiments of the day, prominent philosophers of science argued that the atomic hypothesis was untestable in principle, and therefore unscientific.

    However, the atomists eventually won the day: J. J. Thomson demonstrated the existence of electrons, while Albert Einstein showed that water molecules could make grains of pollen dance on a pond’s surface.

    Atoms provide a case study for how falsifiability proved to be the wrong criterion. Many other cases are trickier.

    For instance, Einstein’s theory of general relativity is one of the best-tested theories in all of science. At the same time, it allows for physically unrealistic “universes,” such as a “rotating” cosmos where movement back and forth in time is possible, which are contradicted by all observations of the reality we inhabit.

    General relativity also makes predictions about things that are untestable by definition, like how particles move inside the event horizon of a black hole: No information about these trajectories can be determined by experiment.

    The first image of a black hole, Messier 87 Credit Event Horizon Telescope Collaboration, via NSF 4.10.19

    Yet no knowledgeable physicist or philosopher of science would argue that general relativity is unscientific. The success of the theory is due to enough of its predictions being testable.

    Eddington/Einstein expedition demonstrating gravitational lensing, solar eclipse of 29 May 1919

    Another type of theory may be mostly untestable, but have important consequences. One such theory is cosmic inflation, which (among other things) explains why we don’t see isolated magnetic monopoles and why the universe is a nearly uniform temperature everywhere we look.

    The key property of inflation—the extremely rapid expansion of spacetime during a tiny split second after the Big Bang—cannot be tested directly. Cosmologists look for indirect evidence for inflation, but in the end it may be difficult or impossible to distinguish between different inflationary models, simply because scientists can’t get the data. Does that mean it isn’t scientific?

    Inflation

    4
    Alan Guth, from Highland Park High School and M.I.T., who first proposed cosmic inflation

    HPHS Owls

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe) Date 2010 Credit: Alex Mittelmann, Coldcreation

    Alan Guth’s notes:
    5

    “A lot of people have personal feelings about inflation and the aesthetics of physical theories,” Prescod-Weinstein says. She’s willing to entertain alternative ideas which have testable consequences, but inflation works well enough for now to keep it around. “It’s also the case that the majority of the cosmology community continues to take inflation seriously as a model, so I have to shrug a little when someone says it’s not science.”

    On that note, Caltech cosmologist Sean M. Carroll argues that many very useful theories have both falsifiable and unfalsifiable predictions. Some aspects may be testable in principle, but not by any experiment or observation we can perform with existing technology. Many particle physics models fall into that category, but that doesn’t stop physicists from finding them useful. SUSY as a concept may not be falsifiable, but many specific models within the broad framework certainly are. All the evidence we have for the existence of dark matter is indirect, which won’t go away even if laboratory experiments never find dark matter particles. Physicists accept the concept of dark matter because it works.

    Slatyer is a practical dark matter hunter. “The questions I’m most interested in asking are not even just questions that are in principle falsifiable, but questions that in principle can be tested by data on the timescale of less than my lifetime,” she says. “But it’s not as if only problems that can be tested by data on a timescale of ‘less than Tracy’s lifetime’ are good scientific questions!”

    Prescod-Weinstein agrees, and argues for keeping an open mind. “There’s a lot we don’t know about the universe, including what’s knowable about it. We are a curious species, and I think we should remain curious.”

    See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     