Tagged: Symmetry

  • richardmitnick 4:31 pm on March 8, 2022
    Tags: "How to break a theory", Symmetry

    From Symmetry: “How to break a theory” 


    03/08/22
    Sarah Charley

    When a theory breaks, you learn how to build it better.

    In 1859, a French astronomer noticed that gradual changes to Mercury’s orbit could not be explained by Isaac Newton’s concept of gravity.

    “Mercury is within the solar system, where Newton’s laws had worked perfectly, and continue to work perfectly, in almost all cases,” says Sophie Renner, a fellow in the CERN Theory Group. “But for some reason, they saw a discrepancy.”

    Astronomers attributed the observation to a missing variable, such as an unseen planet slowly tugging Mercury off course. But Mercury’s mysterious playmate was never found.

    That’s because their equations didn’t need a new variable; their theory needed a revolution.

    Half a century later, scientists found the explanation for Mercury’s behavior: The sun was stretching spacetime and creating a gravity well that slowly changed Mercury’s path. This revelation brought to light the limits of Newtonian physics—and validated Albert Einstein’s theory of general relativity.

    These fractures between prediction and observation are what physicists look for.

    “We want to break theories for the same reason you want to break software,” says Cynthia Keeler, an assistant professor at Arizona State University. “You have to stress-test it and find out what its boundaries are. When a theory breaks, you learn how to build it better.”

    When the CMS and ATLAS experiments at CERN saw an unexpected bump in their data in 2016, theorists submitted nearly 500 papers speculating about what new physics it might reveal. (When the experiments collected more data, the bump disappeared.)

    But rigorous experimental testing is only one of the many ways to break a theory. Physicists constantly run quality-control checks designed to crack, bend and extend their favorite mathematical models of the universe.

    Ask weird questions

    Einstein had a wild imagination. He asked himself questions like: What would he feel if he rode an elevator through outer space? What would he see if he chased a beam of light?

    Other daydreamers might not have moved beyond wondering. But Einstein had a background in physics and friends with advanced degrees in mathematics. His thought experiments seeded deeper investigations that eventually showed the limitations of Newtonian mechanics.

    “What Einstein did was expose internal paradoxes of the theory itself,” says Stephon Alexander, a physics professor at Brown University. “It’s like looking at a picture of something beautiful, but then finding a new angle and the picture isn’t as beautiful or elegant as you thought.”

    Theorists must look for every possible angle, Alexander says. “As a theorist, you have the responsibility to strive for mastery and at the same time, be willing to look at things from the outside-in.”

    Today’s thought experiments sound just as bizarre as Einstein’s from 100 years ago. The internal paradoxes they reveal are just as gnarly.

    For example: “If I build a black hole out of a bunch of dictionaries, can I find the information in those dictionaries?” Keeler says. “Quantum mechanics says the information should be preserved—maybe it’s hard to get because it’s all mixed up, but it shouldn’t go away.

    “Black holes seem to contradict that. We’ve had 50 years of discussions over this problem.”

    Check the math

    Theories tell stories. What are the smallest pieces of matter? What are their characteristics? What are their relationships? What is their destiny?

    But unlike the stories of Shakespeare or Kurosawa, a physics theory is told in the language of mathematics. If the math doesn’t check out, neither does the theory.

    “A lot of it is asking, ‘Is this legal?’” Keeler says. “You might write down something that seems mathematically consistent and then run into problems later. You have to ask, could any universe be constructed with this, or would it fall apart?”

    Quantum field theory, which describes physics at subatomic scales, makes many mathematicians cringe because of its “algebraic shenanigans,” says Dorota Grabowska, a fellow in the CERN Theory Group. “If I had a conversation with a mathematician about quantum field theory, they would start crying. It’s like when your mom tells you to clean your room, so you shove everything in the closet. It looks fine, but please don’t open the closet.”

    Quantum field theory is rife with something mathematicians can’t stand: unresolved infinities. In a 1977 essay, Nobel Laureate Steven Weinberg [The University of Texas-Austin (US)] wrote that “[Quantum field theory’s] reputation among physicists suffered frequent fluctuations… at times dropping so low that quantum field theory came close to be[ing] abandoned altogether.”

    But quantum field theory survives because at the end of the day, it still makes predictions that check out with experiments, such as those at the Large Hadron Collider at CERN.

    “The LHC is like our mother, and when she opens the closet, everything is magically organized,” Grabowska says.

    Push it to extremes

    Physics before the 20th century was mostly limited to the study of speeds, sizes and energies around the human scale. But then scientists started asking, what happens if we go faster? Or smaller? Or to a higher energy?

    “A theory breaks when you try to calculate something new with the theory you have, and it gives you something absurd,” Renner says. “That’s what happened with the ultraviolet catastrophe of black-body radiation.”

    Any blacksmith can attest that there is a link between the temperature of molten iron and the color and brightness of the light it emits. Classical physics did a pretty good job predicting the intensity of this light (hotter objects glow more brightly).

    The trouble started when physicists pushed into the ultraviolet range.

    Scientists calculated the amount of ultraviolet radiation an object burning at high temperature would emit. They found the prediction from classical physics in no way reflected reality, Renner says. “It went towards infinity at high frequencies, which is not what we see at all.”

    The newly exposed edges of classical physics inspired Max Planck to reinterpret what energy actually is.

    Planck proposed that unlike speed—which can be any value up to the speed of light—energy is more like a currency that comes in discrete bills called quanta. High-frequency light costs big quanta to shine, which explained the steep drop-off in the intensity of light radiating from an object in the ultraviolet range.

    Today scientists are pushing far beyond the ultraviolet. UV light has around 3 to 30 electronvolts of energy; scientists at the Large Hadron Collider are currently studying the laws of physics at up to 13 trillion electronvolts.
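    Planck’s relation makes this concrete: a photon’s energy is E = hc/λ, so shorter wavelengths cost bigger quanta. As a rough sanity check of the numbers above (a sketch using standard physical constants; the wavelengths chosen are illustrative, not taken from the article):

```python
# Photon energy E = h*c / wavelength, converted to electronvolts.
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electronvolt

def photon_energy_ev(wavelength_m):
    """Energy of a single photon of the given wavelength, in eV."""
    return H * C / wavelength_m / EV

print(photon_energy_ev(400e-9))  # near-UV edge: ~3.1 eV
print(photon_energy_ev(40e-9))   # extreme UV: ~31 eV
```

    The quanta grow with frequency, which is why the classical prediction of ever-brighter ultraviolet emission had to fail.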

    The LHC’s enormous energy allowed physicists to finally find the legendary Higgs boson, which was theorized 50 years before its discovery and helps explain the origin of mass.

    But this discovery illuminated what might be the limits of current theory in the form of the Standard Model, which physicists use to describe subatomic particles, forces and fields.

    “If the Standard Model is valid across a large range of energies, we would expect the Higgs to have a much heavier mass than it does,” Renner says. “There’s no reason why the Higgs should be at the mass that it is, unless some new theory takes over at energies just out of our reach.”

    Physicists are pushing the limits and searching for cracks that will let them see beyond the boundaries of the Standard Model’s effectiveness. Theory alone can go only so far, and many theorists are looking to experimentalists to light the way.

    “Theoretical physics has always been driven by observations and detection,” Alexander says. “The field would be nothing without it. We’re relying on experimentalists to break the theory, and we’re really interested to see how they will surprise us.”

    See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 11:41 am on January 26, 2022
    Tags: "What is a quantum network?", Symmetry

    From Symmetry: “What is a quantum network?” 


    01/26/22
    Mara Johnson-Groh

    Illustration by Sandbox Studio, Chicago, with Ana Kova.

    As we step into the quantum age, here are four things to know about quantum networks.

    Four years, four months, and twelve days ago, a photon—a particle of light—left Proxima Centauri, the closest star to us. Just now, it finally arrived at Earth.

    The Alpha Centauri system: Alpha, Beta and Proxima Centauri. Image: Skatebiker, 27 February 2012.

    This photon, and others that have come with it, could reveal incredible secrets about the planets that orbit the red dwarf star—such as if they’re habitable, or even inhabited. However, with current instruments, we’re not able to tease out this information.

    That could one day change with technology called quantum networks.

    Quantum networks are like the classical networks we use in everyday life to transmit and share digital information. However, quantum networks use quantum bits, or qubits, which encode information in a way that is utterly foreign to the classical way of thinking. Qubits use tricks from the weird world of quantum mechanics and are fundamentally different from classical computing bits. And when employed on quantum networks, they are radically more powerful.

    Quantum networks don’t exist—and many scientists in the field will tell you they’re a long way off. But when they arrive, they could revolutionize everyday life, enabling unhackable communications for banking, medicine, navigation and more. We might not be there yet, but already scientists are testing the building blocks and putting together prototype systems.

    “There are breakthroughs happening all the time,” says Sophia Economou, a physics professor and quantum information expert at The Virginia Polytechnic Institute and State University (US).

    Already, basic quantum communications called quantum key distributions are helping secure transmissions made over short distances. But before quantum networks become commonplace, they’ll likely make their more public debut in scientific settings.


    1. Quantum networks are possible because of the weird world of quantum mechanics.

    Understanding quantum networks boils down to grasping a few fundamental quantum phenomena with sci-fi sounding names: superposition, entanglement and teleportation.

    Understanding these phenomena requires stepping out of your daily experience of how the world works.

    For example, classical computer bits are either 1 or 0—like a coin flipped heads or tails or a computer’s electrical signal switched on or off. The quantum realm, though, isn’t so decisive. Qubits, which are typically photons or electrons, can be 1 or 0. But they can also simultaneously be a 1 and 0. They’re more like spinning coins, which are undecidedly both heads and tails. Only once qubits are measured do they snap into a 1 or a 0 state. This duality is called superposition, and it allows for faster completion of some computing processes.
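    The “snap” into a definite state follows the Born rule: the probability of reading 0 is the squared magnitude of the 0-amplitude. This can be caricatured in a few lines of ordinary code (a classical sketch, not a real quantum simulation; the function name is illustrative):

```python
import random

def measure(amp0, amp1):
    """Collapse a qubit state amp0|0> + amp1|1> to a classical bit.
    The Born rule gives P(0) = |amp0|^2."""
    return 0 if random.random() < abs(amp0) ** 2 else 1

# An equal superposition: the "spinning coin" described above.
amp = 2 ** -0.5  # 1/sqrt(2), so P(0) = P(1) = 0.5

random.seed(0)
results = [measure(amp, amp) for _ in range(10_000)]
print(results.count(0) / len(results))  # close to 0.5
```

    Every individual measurement yields a definite 0 or 1; the superposition shows up only in the statistics over many measurements.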

    Furthermore, computing with qubits is more secure than with classical bits, thanks to a phenomenon known as entanglement. As described by Panagiotis Spentzouris, a scientist at The DOE’s Fermi National Accelerator Laboratory (US), “entanglement is one of the coolest and most intriguing aspects of quantum physics.”

    Entanglement allows two qubits to become inextricably interlinked no matter how much space separates them. Once entangled, two qubits can mirror one another, each fully correlated with the measurement of the other. If one qubit is switched to a 0, so will its correlated partner.

    This quirk is used to pass quantum information securely—a process known as teleportation. While this teleportation doesn’t involve moving physical objects, it does move information.

    Imagine you wanted to send a secure message to a friend connected via a quantum network.

    With a quantum network, you could send an entangled qubit to them and keep the other one for yourself. Measuring the state of the qubit would provide a key that you could use to encrypt a message sent through a non-quantum channel. Your friend’s qubit, entangled with and thus fully correlated to your qubit, would function as the key to decrypting the received message.

    An unread quantum state can’t be copied. If a spy intercepted the qubit to steal the encryption, the qubit’s state would be interrupted, leaving a clue someone was eavesdropping.
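    The key-sharing idea can be sketched with a classical stand-in for entanglement (a deliberately naive model: shared random bits cannot reproduce real entanglement, and real protocols such as BB84 or E91 involve measurement-basis choices omitted here):

```python
import random

def entangled_pair():
    """Toy stand-in for an entangled pair: measuring either half
    yields the same random bit -- perfectly correlated outcomes."""
    bit = random.randint(0, 1)
    return bit, bit

random.seed(1)
# Each party measures their half of many pairs to build a shared key.
alice_key, bob_key = zip(*(entangled_pair() for _ in range(8)))

# The key then encrypts a message on an ordinary channel (XOR one-time pad).
message = [1, 0, 1, 1, 0, 0, 1, 0]
cipher = [m ^ k for m, k in zip(message, alice_key)]
decoded = [c ^ k for c, k in zip(cipher, bob_key)]
print(decoded == message)  # True: identical keys, never sent over the wire
```

    The security of the real protocol comes from the no-cloning property described above, which this classical toy cannot capture: intercepting a genuine qubit disturbs it and betrays the eavesdropper.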

    These types of quantum-encoded messages are already being sent. Quantum key distribution has been used for bank transfers and secure ballot result transmissions. However, this type of communication is currently practical only at short, city-scale distances.

    That’s because quantum information is delicate. Qubits are typically sent as photons using the same standard fiber-optic cables that carry the bulk of the internet. The slightest bump against the wall of a fiber optic cable, a passing photon of sunlight, and even a tiny mismatch in distances traveled can all lead to two qubits falling out of entanglement.

    2. Extended quantum networks will need special repeaters to go the distance.

    Sending information halfway around the world is much harder with quantum networks than with classical networks. In classical networks, amplifiers placed periodically along the line reemit signals, splitting a marathon into a relay race. Quantum networks can’t use amplifiers, though, because reading and reemitting qubits would disrupt their entanglement, ruining the transmission.

    Researchers are instead working on building quantum repeaters, which would be able to pass along the information without having to read the qubits. To do this, quantum repeaters would create multiple entangled pairs of qubits that would link together to form a giant entangled chain—something known as entanglement swapping. Instead of a relay race, this is more like a game of “Simon Says”, where each qubit mirrors its neighbor. The system retains its security because, just as with entanglement, if an outsider tried to copy the information, the qubits’ state would be interrupted, revealing the snooper.

    While conceptually simple, entanglement swapping is incredibly hard to implement.

    “Some people have demonstrated designs that would in principle be a quantum repeater, but there aren’t any deployed in a real network,” says Emilio Nanni, an assistant professor at Stanford University (US) and The DOE’s SLAC National Accelerator Laboratory (US).

    Right now, researchers are largely focusing on developing metropolitan-scale networks, which are small enough to avoid needing quantum repeaters. Spentzouris is one such researcher. He’s creating a Chicago-wide network to test network infrastructure, like entanglement swapping, which can already be done with nodes that do not use quantum repeaters. He hopes such steps will help quantum networks be ready to expand when repeaters are available.

    Other groups around the world, such as those at The Delft University of Technology [Technische Universiteit Delft] (NL) and The University of Science and Technology of China [中国科学技术大学] (CN) of the Chinese Academy of Sciences [中国科学院] (CN), have demonstrated longer network-like connections, including linking multiple quantum devices, entanglement over a dozen or more qubits, and quantum teleportation over a thousand kilometers with satellite links, which suffer less loss than fiber optic cables. Though impressive, such demonstrations are still a long way from being true quantum networks.

    3. Quantum networks will work with existing networks.

    Quantum networks will ultimately need to be highly reliable and should seamlessly integrate into our lives. As such, it’s likely quantum networks will work off of a backbone of fiber optic cables, alongside our current networks and internet. To merge with our current infrastructure, quantum networks will need interfaces that can connect non-quantum systems—like your smartphone—with quantum processors and nodes.

    In his lab, Nanni and his collaborators are working to create a computer chip that could connect classical computers to a quantum network. Such chips and other classical-quantum bridges could one day allow us to send bank transfers or information effortlessly and securely via quantum networking without needing personal quantum computers.

    As researchers work toward more reliable networks, new prototypes and designs are being developed, with breakthroughs coming almost monthly. Most areas of research have designed multiple options with no clear winners.

    For example, qubits can be encoded in a multitude of ways—using their polarization states, spin states, times of arrival, the motions of trapped ions and atoms, and the states of superconductors. Some designs work incredibly well but only at supercooled temperatures, while others are compatible at room temperatures but are less reliable. Likely, future quantum networks will exist as a mash-up of such options, with different designs specialized for different applications.

    “A key challenge for quantum networks is being able to interface between many different types of quantum systems and whatever you choose to be the network,” Nanni says. “I strongly suspect that in the long run we will not really settle on just one type of device because different types of platforms have different inherent advantages.”

    4. Quantum networks will be important in scientific sensing first.
    It may be decades before the average person has contact with a quantum network. But their applications in science may be a lot more imminent.

    Early networks will likely be used for things like cloud supercomputing with quantum networks harnessing the power of multiple quantum computers. Quantum networks will also enable more precise scientific sensing, which can improve atomic clocks and make GPS more reliable.

    Astronomers are also looking to leverage quantum networks by connecting optical telescopes, allowing multiple observatories to function as a single giant scope—an optical interferometer.

    Scientists achieved something similar in 2019, when they used the Event Horizon Telescope to create the first-ever image of a black hole.


    Messier 87*: the first image of the event horizon of a black hole, the supermassive black hole at the center of the galaxy Messier 87. Image via the Event Horizon Telescope Collaboration, released 10 April 2019 via the National Science Foundation (US).

    The EHT was not a single telescope, but rather a network of radio telescopes located around the world. Similarly, the GRAVITY instrument on ESO’s Very Large Telescope Interferometer, which consists of telescopes spread along a small hilltop, used optical interferometry to image a planet around another star in the same year.

    ESO VLTI GRAVITY instrument.

    European Southern Observatory (EU) VLTI at Cerro Paranal, elevation 2,635 metres (8,645 ft) above sea level, with Unit Telescopes ANTU (UT1; the Sun), KUEYEN (UT2; the Moon), MELIPAL (UT3; the Southern Cross) and YEPUN (UT4; Venus, as evening star).

    The next step is to combine optical telescopes spaced even farther apart, which would improve image resolution further. This could lead to ground-breaking discoveries about the habitability of nearby planets, dark matter and the expansion of the universe.

    “Such a resolution [that could be achieved with optical interferometers] is enough to see an area like New York City on a planet in the closest star system,” says Emil Khabiboulline, a PhD student at Harvard University (US), who published a paper [Physical Review A] describing one possible way to connect telescopes with quantum networks.
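    A diffraction-limit estimate shows why such baselines are so demanding (a back-of-the-envelope sketch using θ ≈ λ/B; the target size and wavelength are assumptions, not figures from the paper):

```python
# Angular resolution of an interferometer: theta ~ wavelength / baseline.
LY = 9.461e15          # meters per light-year
dist = 4.24 * LY       # distance to Proxima Centauri, m
target = 30e3          # ~the extent of New York City, m (assumed)
wavelength = 550e-9    # visible light, m (assumed)

theta = target / dist          # angle the target subtends, rad
baseline = wavelength / theta  # telescope separation required, m
print(baseline / 1e3)  # roughly 700 km
```

    Hundreds of kilometers of separation, far beyond what direct photon recombination can survive, is exactly why passing the quantum information through a network is so attractive.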

    Increasing the distance between optical telescopes, however, is a big challenge. Photons are inevitably lost during the journey to a central hub where they’re recombined, and longer distances mean more data lost.

    Quantum networks offer one solution to this problem. If the photons’ quantum information can be recorded at each telescope and passed in a network, it could massively reduce data loss. But the huge number of photons possibly amassed by an optical telescope would overwhelm the bandwidth of quantum networks as they’re now envisioned.

    One workaround is a quantum approach, proposed by Khabiboulline and others, that could compress and store the photons’ quantum information before sending it over a quantum network using a smaller number of qubits. Other groups, like researchers at The University of Sydney (AU), have proposed using quantum hard drives, devices that would store the quantum information of photons arriving at separate telescopes until they could be physically brought together and recombined.

    Regardless of the final approach, the advances first designed by astronomers and other scientists will likely trickle down into the quantum networks the public may someday use.

    “I think with a level of enthusiasm that is present in the scientific community now, over the next decade, we’re going to really make a big impact,” Nanni says.

    See the full article here.




     
  • richardmitnick 10:51 am on January 11, 2022
    Tags: "Looking at a new quantum revolution", Symmetry

    From Symmetry: “Looking at a new quantum revolution” 


    01/11/22
    Kathryn Jepsen

    Illustration by Sandbox Studio, Chicago, with Ana Kova.

    This month, Symmetry presents a series of articles on the past, present and future of quantum research—and its many connections to particle physics, astrophysics and computing.

    On July 25, 2018, a group of scientists from Microsoft, Google and IBM sat on a stage at the Computer History Museum in Mountain View, California. Matthias Troyer, John Martinis and Pat Gumann were all working on research into quantum computing, which takes advantage of our knowledge of quantum mechanics, the physics of how the world operates at the smallest level.

    The evening was billed as a night to ask the experts quantum questions.

    [Video: CHM Live | Quantum Questions, 1:27:07]

    About an hour into the event, moderator and historian David Brock asked the scientists one last thing: “What do you think—for us as, you know, citizens of the world—what are the most important things for us to know about and keep in mind about quantum computing, as it is today?”

    Troyer called attention to the museum displays around them. “When you look back at the history of computing… the abacus works on the same principle of the most modern, fastest classical CPU. It’s discrete, digital logic. There’s been no change in the way we compute for the last 5,000 years.

    “And now is the time when this is changing,” he said, “because with quantum computing we are radically changing the way we use nature to compute.”

    Scientists have called this moment a second quantum revolution. The first quantum revolution brought us developments like the transistor, which enabled the creation of powerful, portable modern electronic devices.

    It’s not yet clear what this new revolution will bring. But plenty of computer scientists, physicists and engineers are hard at work to find out. Around the world, research institutions, universities and businesses have been ramping up their investments in quantum science.

    At the end of 2018, the United States passed the National Quantum Initiative Act, which led to the establishment of five new Department of Energy Quantum Information Science Research Centers; five new National Science Foundation Quantum Leap Challenge Institutes; and the National Institute of Standards and Technology’s Quantum Economic Development Consortium.

    Efforts to develop quantum computers, quantum sensors and quantum networks have the potential to change our lives. And some of the first applications of these developments could be in particle physics and astrophysics research.

    Throughout the month of January, Symmetry will publish a series of articles meant to give readers a better understanding of this quantum ecosystem—the physics ideas it’s based on, the ways this knowledge can be applied, and what will determine the shape of our quantum future.

    See the full article here.




     
  • richardmitnick 10:36 am on January 4, 2022
    Tags: "Theorists imagine a different kind of dark matter", Symmetry

    From Symmetry: “Theorists imagine a different kind of dark matter” 


    01/04/22
    Maxwell Bernstein

    Illustration by Sandbox Studio, Chicago.

    Physicists are revisiting what they previously assumed about how dark matter interacts with itself.

    In 1968, an astronomer named Vera Rubin measured the spectra of stars in our neighboring Andromeda Galaxy to determine how fast they were moving.

    Andromeda Galaxy Messier 31. Credit: Adam Evans.

    Rubin expected to see stars at the galaxy’s edge moving at slower speeds than those near the center, which should have felt a stronger pull from the galaxy’s gravity. But the measurements indicated that both sets of stars were moving at the same speed.

    This curious result matched up well with a theory that physicist Fritz Zwicky had proposed in 1933: Galaxies were hiding an additional source of gravitational pull in the form of invisible material he called “dunkle Materie,” or “dark matter.”

    Rubin’s observations, along with others, brought Zwicky’s idea a new level of clout. After decades of research, scientists have indirectly determined that dark matter makes up about 85% of all matter. Scientists are now seeking to directly observe the particles that form the halos of hidden matter that seem to hold together the systems of luminous matter that we know. Numerous experiments are searching, but dark matter’s identity remains a mystery.

    Scientists are still looking at the rotation of luminous material in galaxies, now to test new ideas about what dark matter could be.

    ______________________________________________________
    Dark Matter Background
    Fritz Zwicky discovered dark matter in the 1930s while observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM who was denied the Nobel Prize, did most of the work on dark matter some 30 years later.

    Fritz Zwicky.
    Coma cluster via NASA/ESA Hubble, the original example of Dark Matter discovered during observations by Fritz Zwicky and confirmed 30 years later by Vera Rubin.
    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate that it had a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.

    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that stars at the edges of galaxies orbit as fast as those near the centers, whereas, by Kepler’s laws, the outer stars should move more slowly, just as the outer planets of the solar system do. The only way to explain this is if the whole visible galaxy is just the center of some much larger structure (as if it were only the label on an LP, so to speak) whose mass keeps the rotation speed consistent from center to edge.

    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science).

    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970.

    Vera Rubin measuring spectra, worked on Dark Matter(Emilio Segre Visual Archives AIP SPL).
    Dark Matter Research

    LBNL LZ Dark Matter Experiment (US) xenon detector at Sanford Underground Research Facility(US) Credit: Matt Kapust.

    Lambda Cold Dark Matter: accelerated expansion of the universe. Credit: Alex Mittelmann.

    DAMA at Gran Sasso uses sodium iodide housed in copper to hunt for dark matter LNGS-INFN.

    Yale HAYSTAC axion dark matter experiment at Yale’s Wright Lab.

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB (CA) deep in Sudbury’s Creighton Mine.

    The LBNL LZ Dark Matter Experiment (US) Dark Matter project at SURF, Lead, SD, USA.

    DAMA-LIBRA Dark Matter experiment at the Italian National Institute for Nuclear Physics’ (INFN’s) Gran Sasso National Laboratories (LNGS) located in the Abruzzo region of central Italy.

    DARWIN Dark Matter experiment. A design study for a next-generation, multi-ton dark matter detector in Europe at The University of Zurich [Universität Zürich](CH).

    PandaX II Dark Matter experiment at Jin-ping Underground Laboratory (CJPL) in Sichuan, China.

    Inside the Axion Dark Matter eXperiment at the University of Washington (US). Credit: Mark Stone, University of Washington.
    ______________________________________________________

    A new view of dark matter

    In the late ’70s and early ’80s, a proposed particle for dark matter took center stage. The WIMP, or Weakly Interacting Massive Particle, would be a slow-moving, massive particle that would interact with known particles through gravity and the weak nuclear force that governs radioactive decay. But WIMPs would interact with other WIMPs only through gravity.

    “The guiding theory has been something where dark matter weakly interacts with itself and primarily interacts with other particles,” says Yonit Hochberg, an assistant professor of particle physics at The Hebrew University of Jerusalem [הַאוּנִיבֶרְסִיטָה הַעִבְרִית בִּירוּשָׁלַיִם‎] (IL).

    The popular theory of supersymmetry, which predicts a slew of new particles beyond those in the Standard Model of particle physics, conveniently included just such a dark-matter candidate.

    Standard Model of Supersymmetry

    But the WIMP, which took most of the limelight and has been guiding experimental searches for nearly four decades, has yet to be found.

    Scientists are designing better and better experiments aimed at finding a particle like the WIMP, but they’re also thinking about how dark matter might turn out to be something different.

    Hochberg says she asked herself and her colleagues, as a thought experiment: What if dark matter didn’t primarily interact with other particles? “What would happen if what’s most important about dark matter is how it interacts with itself?”

    There could be many particles and forces in addition to the ones we know. These hypothetical particles could be part of a dark sector, which could include both dark-matter particles and new forces that govern dark-matter self-interactions.

    If dark-matter particles can interact with each other, scientists should see the effects in the centers of galaxies and in the way gases rotate around them.

    “One of the biggest pieces of evidence of self-interactions would be the density profiles of the centers of galaxies,” says Mike Boylan-Kolchin, an associate professor of astronomy at The University of Texas (US). “And that’s usually measured by rotation curve data,” a plot comparing the orbital speed of visible matter in a galaxy to its distance from that galaxy’s center.
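To make the idea of a rotation curve concrete, here is a toy Python sketch with invented round numbers (not a fit to any real galaxy), comparing the Keplerian falloff expected if only the visible mass were present with the flatter curve produced when a crude halo, whose enclosed mass grows with radius, is added:

```python
import math

# Toy rotation curves; all masses and radii are illustrative placeholders.
G = 4.30e-6  # gravitational constant in kpc * (km/s)^2 / solar mass

def v_visible(r_kpc, m_stars=5e10):
    """Orbital speed (km/s) if only the visible mass were present: falls as 1/sqrt(r)."""
    return math.sqrt(G * m_stars / r_kpc)

def v_with_halo(r_kpc, m_stars=5e10, halo_mass_per_kpc=1e10):
    """Orbital speed when a crude halo with enclosed mass growing linearly in r is added."""
    m_halo = halo_mass_per_kpc * r_kpc
    return math.sqrt(G * (m_stars + m_halo) / r_kpc)

radii = [2, 5, 10, 20, 40]  # kpc from the galactic center
kepler = [v_visible(r) for r in radii]   # keeps dropping with distance
halo = [v_with_halo(r) for r in radii]   # levels off at large radii
```

The observed flatness of real rotation curves at large radii, where the visible-matter prediction keeps falling, is the classic signature attributed to a dark-matter halo.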

    Boylan-Kolchin looks for evidence of dark-matter self-interactions by inputting all the known, relevant cosmological variables into a computer program and then simulating the evolution of galaxies. From this, he can study how dark matter shapes cosmic structures and look for evidence of the particle nature of dark matter.

    A way to understand the idea of strong self-interactions between dark-matter particles would be to think about how energy could propagate through a dark-matter halo, says Hai-Bo Yu, an associate professor of particle physics at The University of California-Riverside (US).

    Imagine how heat transfers across a room with a heater on, through collisions among air particles, he says. In a galaxy, “dark matter particles could have a temperature gradient just like in the room, and the self-interactions would transport heat from the hot to cold regions of a dark-matter halo and thermalize it. The question is: Can we see this effect in observations?”

    Finding dark-matter self-interactions will require mapping large swathes of the night sky, he says.

    That’s a process scientists are constantly improving. Observatories like the Vera C. Rubin Observatory under construction in Chile might provide the right data to uncover the true nature of dark matter.

    The Vera C. Rubin Observatory, under construction on the El Peñón peak at Cerro Pachón, a 2,682-meter-high mountain in the Coquimbo Region of northern Chile, alongside the existing Gemini South and Southern Astrophysical Research telescopes.

    See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 5:18 pm on December 18, 2021 Permalink | Reply
    Tags: Symmetry, Triggers

    From Symmetry: “Blink and it’s gone” 

    Symmetry Mag

    From Symmetry

    07/13/21 [Found in a year-end round up]
    Eoin O’Carroll

    Fast electronics and artificial intelligence are helping physicists capture data and decide what to keep and what to throw away.

    Illustration by Sandbox Studio, Chicago with Ana Kova.

    The nucleus of the atom was discovered a century ago thanks to scientists who didn’t blink.

    Working in pitch darkness at The University of Manchester (UK) between 1909 and 1913, research assistants Hans Geiger and Ernest Marsden peered through microscopes to count flashes of alpha particles on a fluorescent screen. The task demanded total concentration, and the scientists could count accurately for only about a minute before fatigue set in. The physicist and science historian Siegmund Brandt wrote that Geiger and Marsden maintained their focus by ingesting strong coffee and “a pinch of strychnine.”

    Modern particle detectors use sensitive electronics instead of microscopes and rat poison to observe particle collisions, but now there’s a new challenge. Instead of worrying about blinking and missing interesting particle interactions, physicists worry about accidentally throwing them away.

    The Large Hadron Collider at CERN produces collisions at a rate of 40 million per second, producing enough data to fill more than 140,000 one-terabyte storage drives every hour.
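A quick back-of-envelope check of those figures (using the rounded numbers quoted above) shows what they imply for the size of a single recorded collision:

```python
# Back-of-envelope arithmetic on the quoted LHC data figures (round numbers assumed).
collision_rate_hz = 40e6        # 40 million collisions per second
terabytes_per_hour = 140_000    # one-terabyte drives filled per hour

bytes_per_collision = (terabytes_per_hour * 1e12) / (collision_rate_hz * 3600)
# works out to roughly 1 MB of raw data per collision
```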

    CERN, the European Organization for Nuclear Research, near Geneva, Switzerland.

    Map of the CERN accelerator complex.

    The LHC tunnel and beam pipe.

    SixTrack simulation of LHC particle tracks.

    Capturing all those events is impossible, so the electronics have to make some tough choices.

    To decide which collisions to retain for analysis and which ones to discard, physicists use specialized systems called trigger systems. The trigger is the only component to observe every collision. In about half the time it takes a human to blink, the CMS experiment’s triggers have processed and discarded 99.9975% of the data.
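The arithmetic behind that percentage is striking on its own. A minimal sketch, using the figures quoted above:

```python
# If the trigger discards 99.9975% of 40 million collisions per second (the
# figures quoted above), how many events per second survive for analysis?
collision_rate_hz = 40e6
discard_fraction = 0.999975

kept_per_second = collision_rate_hz * (1 - discard_fraction)  # about 1,000 events/s
```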

    The CMS detector at CERN.

    Iconic view of the ATLAS detector at CERN.

    Depending on how a trigger is programmed, it could be the first to capture evidence of new phenomena—or to lose it.

    “Once we lose the data, we lose it forever,” says Georgia Karagiorgi, a professor of physics at Columbia University (US) and the US project manager for the data acquisition system for the Deep Underground Neutrino Experiment. “We need to be constantly looking. We can’t close our eyes.”

    The challenge of deciding in a split second which data to keep, some scientists say, could be met with artificial intelligence.

    A numbers game

    Discovering new subatomic phenomena often requires amassing a colossal dataset, most of it uninteresting.

    Geiger and Marsden learned this the hard way. Working under the direction of Ernest Rutherford, the two scientists sought to reveal the structures of atoms by sending streams of alpha particles through sheets of gold foil and observing how the particles scattered. They found that for about every 8000 particles that passed straight through the foil, one particle would bounce away as though it had collided with something solid. That was the atom’s nucleus, and its discovery sent physics itself on a new trajectory.

    By today’s physics standards, Geiger and Marsden’s 1-in-8000 odds look like a safe bet. The Higgs boson is thought to appear in only one out of every 5 billion collisions at the LHC.
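Combining that with the collision rate quoted earlier gives a feel for how rare these events are in real time (round numbers assumed):

```python
# Rough Higgs-production rate implied by the one-in-5-billion figure above,
# assuming the LHC's 40 million collisions per second.
collision_rate_hz = 40e6
higgs_per_collision = 1 / 5e9

higgs_per_second = collision_rate_hz * higgs_per_collision  # 0.008 per second
seconds_between_higgs = 1 / higgs_per_second                # roughly two minutes apart
```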

    A CMS Higgs event, May 27, 2012.

    An ATLAS Higgs event.

    The triggers will soon need to get even faster. In the LHC’s Run 3, set to begin in March 2022, the total number of collisions will equal that of the two previous runs combined. The collision rate will increase dramatically during the LHC’s High-Luminosity era, which is scheduled to begin in 2027 and continue through the 2030s. That’s when the collider’s luminosity, a measure of how tightly the crossing beams are packed with particles, is set to increase tenfold over its original design value.

    Collecting this data is important because in the coming decade, scientists will intensify their searches for phenomena that are just as mysterious to today’s physicists as atomic nuclei were to Geiger and Marsden.

    And scientists have only a small window of time in which to catch them.

    “At CMS we have a massive amount of data,” says Princeton University (US) physicist Isobel Ojalvo, who has been heavily involved in upgrading the CMS trigger system. “We’re only able to store that data for about three and a half [millionths of a second] before we make decisions about keeping it or throwing it away.”

    A new physics

    In 2012, the Higgs boson became the last confirmed elementary particle of the Standard Model, the equation that succinctly describes all known forms of matter and predicts with astonishing accuracy how they interact.

    The Standard Model of Particle Physics. Credit: Quantum Diaries.

    But there are strong signs that the Standard Model, which has guided physics for nearly 50 years, won’t have the last word. In April, for instance, preliminary results from the Muon g-2 experiment at The DOE’s Fermi National Accelerator Laboratory (US) offered tantalizing hints that the muon may be interacting with a force or particle the Standard Model doesn’t include.

    The Muon g-2 experiment at DOE's Fermi National Accelerator Laboratory. As muons race around the ring, their spin axes twirl, reflecting the influence of unseen particles.

    Identifying these phenomena and many others may require a new understanding.

    “Given that we have not seen [beyond the Standard Model] physics yet, we need to revolutionize how we collect our data to enable processing data rates at least an order of magnitude higher than achieved thus far,” says The Massachusetts Institute of Technology (US) physicist Mike Williams, who is a member of the Institute for Research and Innovation in Software for High-Energy Physics, IRIS-HEP, funded by the National Science Foundation.

    Physicists agree that future triggers will need to be faster, but there’s less consensus on how they should be programmed.

    “How do we make discoveries when we don’t know what to look for?” asks Peter Elmer, executive director and principal investigator for IRIS-HEP. “We don’t want to throw anything away that might hint at new physics.”

    There are two different schools of thought, Ojalvo says.

    The more conservative approach is to search for signatures that match theoretical predictions. “Another way,” she says, “is to look for things that are different from everything else.”

    This second option, known as anomaly detection, would scan not for specific signatures, but for anything that deviates from the Standard Model, something that artificial intelligence could help with.

    “In the past, we guessed the model and used the trigger system to pick those signatures up,” Ojalvo says.

    But “now we’re not finding the new physics that we believe is out there,” Ojalvo says. “It may be that we cannot create those interactions in present-day colliders, but we also need to ask ourselves if we’ve turned over every stone.”

    Instead of searching one-by-one for signals predicted by each theory, physicists could deploy to a collider’s trigger system an unsupervised machine-learning algorithm, Ojalvo says. They could train the algorithm only on the collisions it observes, without reference to any other dataset. Over time, the algorithm would learn to distinguish common collision events from rare ones. The approach would not require knowing any details in advance about what new signals might be, and it would avoid bias toward one theory or another.
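A minimal sketch of that idea, stripped down far below anything that would run in a real trigger: learn what "common" events look like from the stream itself, then score each event by how far it sits from the learned bulk. The features, numbers and threshold here are invented placeholders.

```python
import numpy as np

# Toy unsupervised anomaly detection: estimate the bulk distribution of events
# from the stream itself, then flag events far from it. Invented features only.
rng = np.random.default_rng(0)

# Pretend each event is summarized by two reconstructed quantities.
common_events = rng.normal(loc=[50.0, 3.0], scale=[10.0, 1.0], size=(10_000, 2))

# "Training" is just estimating the mean and spread of the observed stream.
mean = common_events.mean(axis=0)
std = common_events.std(axis=0)

def anomaly_score(event):
    """Distance of an event from the learned bulk, in units of standard deviations."""
    z = (event - mean) / std
    return float(np.sqrt((z ** 2).sum()))

typical = common_events[0]              # an event like the ones trained on
weird = np.array([150.0, 9.0])          # an event unlike anything in the stream
```

A real trigger-side algorithm would be far more sophisticated (and constrained by hardware latency), but the core logic is the same: no theory-specific signature is assumed, only deviation from what has been seen.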

    MIT physicist Philip Harris says that recent advances in artificial intelligence are fueling a growing interest in this approach—but that advocates of “theoryless searches” remain a minority in the physics community.

    More generally, says Harris, using AI for triggers can create opportunities for more innovative ways to acquire data. “The algorithm will be able to recognize the beam conditions and adapt their choices,” he says. “Effectively, it can change itself.”

    Programming triggers calls for tradeoffs between efficiency, breadth, accuracy and feasibility. “All of this is wonderful in theory,” says Karagiorgi. “It’s all about hardware resource constraints, power resource constraints, and, of course, cost.”

    “Thankfully,” she adds, “we don’t need strychnine.”

    See the full article here.




     
  • richardmitnick 8:19 pm on December 7, 2021 Permalink | Reply
    Tags: "Are leptons all alike?", Symmetry

    From Symmetry: “Are leptons all alike?” 

    Symmetry Mag

    From Symmetry

    12/07/21
    Daniel Garisto

    Mounting experimental evidence suggests that the electron, muon and tau may feel different forces.

    The Standard Model of Particle Physics. Credit: Quantum Diaries.

    When the tau lepton was discovered in the 1970s, it didn’t resolve any outstanding mysteries—it raised new ones.

    Aside from its comparatively enormous mass, the tau had all the same properties as the two other charged leptons: the electron and muon. As one of its discoverers, Martin Perl, put it: “But why three?”

    Historically, “lepton” (from the Greek leptos, meaning “small, delicate”) referred to any light particle like the electron. But as particle physicists gained more details about the subatomic world, the definition of lepton became more rigorous.

    Today, matter particles are divided into two groups: quarks, which interact via the strong nuclear force, and leptons, which don’t. Leptons include the charged particles—electrons, muons and taus, which interact via the weak and electromagnetic forces—and their neutral neutrino counterparts, which interact only via the weak force.

    Physicists have long made cursory checks to understand if leptons are really as similar to one another as they seem. One aspect they’ve tested for is lepton universality—whether all of the charged leptons and neutral leptons interact via the same forces.

    For decades it seemed as though lepton universality held true. Recently, however, data from experiments like CERN’s LHCb and Fermilab’s Muon g-2 have accumulated into a compelling case that muons don’t behave as they’re predicted to. One possibility is that the electron, muon and tau could interact differently from one another, due to the presence of an undiscovered force.

    The LHCb detector at CERN.
    The Muon g-2 experiment at DOE's Fermi National Accelerator Laboratory. As muons race around the ring, their spin axes twirl, reflecting the influence of unseen particles.

    “I think it is probably the most interesting thing that’s happening now in particle physics,” says Gudrun Hiller, a theorist at The Technical University of Dortmund [Technische Universität Dortmund](DE). “It would be fantastic if this is true.”

    If lepton universality were violated, it could explain how a muon differs from an electron, or even what gives the three charged leptons their specific masses.

    Studies of lepton universality were performed during the 1990s. For example, at colliders such as the Large Electron-Positron Collider at CERN, researchers measured whether there were differences in how Z bosons decayed into different leptons.

    The Large Electron-Positron Collider at CERN.

    Still, lepton universality was generally assumed to be correct, Hiller says. Compared to the ideas du jour, like supersymmetry and extra dimensions, the idea of a possible break in lepton universality wasn’t seen as interesting. Additionally, there were large theoretical uncertainties involved in predictions of decays needed to test lepton universality.

    Then in 2004, Hiller figured out a way past the uncertainties: Let them cancel out.

    In rare decays of bottom quarks, some of the decay products should be leptons—electrons, muons and taus. If there is an undiscovered force that breaks lepton universality, it could have an effect, such as suppressing the number of observed muons coming out of these decays, as compared to other leptons.

    Muons and electrons should have the same decay uncertainties. So while it might be impossible to tell whether there were fewer total muons than expected in a type of decay, it could be possible to notice whether the ratio of muons to electrons was off.
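In the notation physicists commonly use for these tests (standard notation, not drawn from the article), one such observable compares a rare bottom-quark decay to muons against the same decay to electrons:

```latex
R_K \;=\; \frac{\mathcal{B}\!\left(B^{+} \to K^{+} \mu^{+}\mu^{-}\right)}
               {\mathcal{B}\!\left(B^{+} \to K^{+} e^{+}e^{-}\right)}
```

The Standard Model predicts a value very close to 1, and because the difficult hadronic uncertainties are common to numerator and denominator, they largely cancel in the ratio.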

    The rare bottom-quark decay experiments active at the time, Belle at KEK in Japan and BaBar at the US Department of Energy's SLAC National Accelerator Laboratory, lacked the sensitivity to make a definitive detection of lepton universality violation. In 2007, 2010 and 2012, the experiments found hints [The European Physical Journal C] in their data that there might be something interesting going on.

    The Belle detector at the High Energy Accelerator Research Organization (KEK) in Tsukuba, Ibaraki Prefecture, Japan.

    The BaBar detector at SLAC National Accelerator Laboratory.

    Then, in 2014, the LHCb experiment made waves with a new result. Using a much larger sample of rare decays, the collaboration tested lepton universality with greater precision, finding an anomaly with a statistical certainty of 2.6 sigma.

    This statistical shorthand means that, assuming there are no new forces, LHCb should see an anomaly at least that large in 1 out of every 200 runs of the experiment. That doesn’t mean there’s a 99.5% chance of new physics—it just indicates how unlikely it is that the result they’re seeing is due to random chance. Typically, 5 sigma, which corresponds to odds of roughly 1 in 3.5 million, is needed to claim a discovery.
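The translation from sigma to odds is a standard calculation on the tail of a normal distribution. A short sketch, using the one-sided convention usual in particle physics:

```python
import math

# Convert "sigma" significance into tail odds, one-sided normal convention.
def one_sided_p(sigma):
    """Probability of a fluctuation at least this many sigma above the mean."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

odds_2p6 = 1 / one_sided_p(2.6)  # about 1 in 215, i.e. roughly "1 in 200"
odds_5 = 1 / one_sided_p(5.0)    # about 1 in 3.5 million
```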

    Since then, data at LHCb has continued to pile up against lepton universality. What’s more, the anomalies in different rare particle decays all point in the same direction, away from the Standard Model prediction. LHCb’s latest result, released in October, reaffirms the trend.

    Some physicists have pointed out that adding up the anomalies leads to a greater than 5 sigma effect; others have noted that this could be interpreted as cherry-picking. As is often the case with particle physics, the devil is in the statistics.

    “This has raised a lot of curiosity and a lot of theoretical speculation and a lot of interest in the community,” says Monica Pepe-Altarelli, an experimental physicist on the LHCb experiment. “It could be true. We don’t know.”

    Hints from other experiments support the possibility that lepton universality could be violated. Perhaps the most compelling one comes from the Muon g-2 experiment at Fermi National Accelerator Laboratory.

    Muon g-2 is an upgraded version of a previous experiment at DOE’s Brookhaven National Laboratory (US), which found signs that muons were moving in an unexpected pattern in a strong magnetic field. Like a violation of lepton universality, this strange muon behavior could be a sign of an unknown force at work.

    Not every scenario that could explain the Muon g-2 result would imply a violation of lepton universality. But some recent models [Journal of High Energy Physics] not only explain both lepton anomalies, they also explain how these results could point to the presence of dark matter.

    Many more tests are needed to verify whether lepton universality is truly disrupted, let alone if any of these models are true.

    “If there’s lepton universality violation, it must show up elsewhere,” Hiller says.

    She’s working on theoretical tests to look for lepton universality violations in decays of charm and top quarks, or even neutrinos.

    On the experimental side, researchers like Pepe-Altarelli simply hope to get more data. LHCb is now undergoing an upgrade to its detectors. But Pepe-Altarelli says forthcoming analyses based on the already recorded data are expected to come out within the next year. Additionally, the CMS and ATLAS experiments at the LHC may also be able to look for lepton universality violation.

    Perhaps the most promising corroboration of LHCb and Muon g-2 would come from Belle II, the successor to the original Belle, but those results are estimated to take at least five years.

    The context for all of this is that the Standard Model has proven extremely resilient. So far, just about every anomaly that’s appeared has eventually come crumbling down with more data.

    Still, Pepe-Altarelli says she’s cautiously excited. These results give her and others hope that we’ll someday get an answer to the question, “Why three?”

    See the full article here.




     
  • richardmitnick 10:15 am on November 9, 2021 Permalink | Reply
    Tags: "The problem-solver-Cosmic inflation", Symmetry

    From Symmetry: “The problem-solver-Cosmic inflation” 

    Symmetry Mag

    From Symmetry

    11/09/21
    Sarah Wells

    Just over 40 years ago, a new theory about the early universe provided a way to tackle multiple cosmological conundrums at once.

    Illustration by Sandbox Studio, Chicago with Tara Kennedy.

    For Alan Guth, insight into the origins of the universe started in a Cornell University lecture hall in the fall of 1978.

    _____________________________________________________________________________________
    Inflation

    Alan Guth, from MIT, who first proposed cosmic inflation.

    Lambda Cold Dark Matter: accelerated expansion of the universe. Credit: Alex Mittelmann, Coldcreation.

    Alan Guth's original notes on inflation.
    _____________________________________________________________________________________

    It was that semester that Guth, then a postdoc, attended a series of talks by astronomer and physicist Robert Dicke. In his lectures, Dicke introduced a critical cosmological question that was eating away at the theory of the Big Bang: the flatness problem.

    The flatness problem asks why the universe looks the way it does. The density of matter and energy just after the Big Bang should have determined the universe’s future shape, and the range of densities that would produce a flat universe—as opposed to a curved one—was extremely narrow. And yet, as far as we can measure using several different methods, our universe is almost perfectly flat.

    Just dialing the density slightly up or down, at very early times, would have resulted in a universe very strongly curved in one direction or another. Also, perhaps troublingly, either of these options could have precluded our existence.

    In the first second after the Big Bang, if the universe had been less dense “by just one digit in the 14th decimal place,” Guth says, it would have been largely empty. This is because there would have been less mass to put the brakes on its expansion. On the other hand, a slightly denser universe would have expanded too slowly, leading it to collapse on itself in a “Big Crunch.”
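The instability behind this fine-tuning can be stated compactly. This is a standard textbook result rather than something from the article: in Friedmann cosmology the deviation of the density parameter Ω from 1 obeys

```latex
\Omega - 1 \;=\; \frac{k}{a^{2}H^{2}},
\qquad
\left|\,\Omega - 1\,\right| \;\propto\;
\begin{cases}
t & \text{(radiation era)}\\
t^{2/3} & \text{(matter era)}
\end{cases}
```

so any early deviation from Ω = 1 grows with time. A universe that is nearly flat today must have been extraordinarily close to flat at the start, which is exactly the tuning Guth found peculiar.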

    Illustration by Sandbox Studio, Chicago with Tara Kennedy.

    This kind of precise fine-tuning seemed peculiar, Guth says. Nevertheless, he initially brushed the problem aside to pursue work on a different one: magnetic monopoles.

    Popular theories posited that the early universe should have produced an enormous number of heavy particles, including magnetic monopoles. Unlike the dipole magnets we know today—which have “north” and “south” charges on opposite ends—a monopole can have just one, either a “north” or a “south” charge, but not both.

    These distinctive particles should have both proliferated and stuck around; we should still be able to find them today. But physicists have yet to encounter even one. Guth, along with then fellow Cornell University (US) postdoc Henry Tye, explored why this might be.

    Tye and Guth thought extreme supercooling could explain the universe’s apparent lack of magnetic monopoles. As Guth explains, the monopoles would have formed when twists in a chaotic quantum field became frozen at a phase transition. However, if the phase transition were delayed by extreme supercooling, the twists could have smoothed out before they were frozen, resulting in the absence of monopoles.

    It was only when rushing toward a deadline on the project, more than a year later, that Guth says he suddenly saw a crucial connection between this idea and the flatness problem.

    “Henry was preparing to leave for a six-week trip to China, so we were hurrying to finish the paper before he left,” says Guth, who by then had moved on to a postdoc at The DOE’s SLAC National Accelerator Laboratory (US) (then called the Stanford Linear Accelerator Center). “But before he left an important thing happened, which is that he said we should look at what effect the extreme supercooling would have on the expansion rate in the universe.

    “So I went home one night at the beginning of December to work out the equations that describe how the expansion rate in the universe would be affected by the supercooling of the matter,” Guth says. “And it was immediately obvious that it would affect the expansion of the universe tremendously. It would drive the universe into a period of exponential expansion, which is what we now call inflation. And the same night that I realized that this exponential expansion would also give a solution to the flatness problem.”

    According to the theory of inflation, the expansion rate of the universe exploded in its earliest moments and then slowed. This happened as a quantum field called the inflaton field underwent a transition that pushed things apart before settling into a phase of normal gravity. The first part, the rapid expansion, would have diluted any matter and energy already present when inflation started. The second part would have released a new batch.

    Luckily for those of us who appreciate the universe as it is today, the energy released was just the right amount to drive the universe toward flatness, Guth says.

    In addition to offering plausible solutions to the flatness and monopole problems, inflation also helped explain a third problem: the horizon problem.

    The horizon problem comes from our observations of the cosmic microwave background, or CMB: the afterglow left by the early universe’s first freed particles of light, explains Nobuchika Okada, a professor of physics at The University of Alabama (US).

    The cosmic microwave background, as measured by the European Space Agency's Planck satellite.

    Essentially, scientists have observed that the CMB is very nearly the exact same temperature in all directions. This was considered strange because the parts of the universe at opposite edges of our “horizon”—as far as we can detect from our vantage point on Earth—were too many light-years apart to ever have communicated with one another. They should not have been able to settle into an average, uniform temperature.

    Inflation suggests that the entire visible universe once existed as a single, contained region before the inflaton field drove it into expansion. This shared proximity of origin would explain how now disparate parts of the universe could have once mingled, Okada says.

    In January 1981, Guth published a paper in Physical Review D. It has since been cited about 13,000 times. In 2002, Guth shared the Dirac Prize for the development of the concept of inflation with physicists Andrei Linde and Paul Steinhardt. In 2012, Guth and Linde received the Breakthrough Prize in Fundamental Physics for the innovation, and in 2014, they, along with Alexei Starobinsky, were awarded the Kavli Prize in Astrophysics. Guth is now the Victor F. Weisskopf Professor of Physics at MIT.

    The theory of inflation has been widely celebrated, but a theory alone isn’t enough to close the case on the mystery of the early universe.

    That’s where research like that of Eva Silverstein, a professor of theoretical physics at Stanford University (US), comes in. She works to come up with ways to test inflation theory. Silverstein uses ancient data sources—like the CMB—to try to uncover the mechanisms behind inflation. In particular, her work explores how the mechanisms of inflation may be consistent with the tenets of quantum gravity. This quantum-gravity framework can help explain observational data on inflation’s energy plateau, she says.

    The CMB is of great interest to scientists investigating the theory of inflation. The rapid expansion should have produced gravitational waves, which would have left a unique pattern in the CMB called B-mode polarization. In 2014, the BICEP2 experiment announced that it had observed this pattern, but scientists later walked back their confidence in the result. Experiments are still searching for B-mode polarization in the CMB.

    The BICEP Array at the South Pole. Credit: Nathan Precup.

    For his part, Okada looks for ways to discover the inflaton field at particle accelerators. Guth says he isn’t sure we’ll ever be able to find inflatons, which might have decayed entirely during inflation. But, Silverstein says, if peering back to the very moment of inflation does turn out to be forever off limits, that’s okay.

    “There is a limit to the data and a causal limit to what we can see in the universe consistent with the finite speed of light,” Silverstein says. But “it is amazing how much we can see and deduce, so this remaining uncertainty is not the end of the world.”

    In fact, it’s only just the beginning.

    See the full article here.




     
  • richardmitnick 7:47 pm on October 19, 2021 Permalink | Reply
    Tags: "Eyes on the sky", Symmetry

    From Symmetry: “Eyes on the sky” 

    Symmetry Mag

    From Symmetry

    10/19/21
    Christopher Crockett

    There’s no one best way to build a telescope.

    Illustration by Sandbox Studio, Chicago with Steve Shanabruch.

    On a mountaintop in the Atacama Desert of Chile, a gargantuan new eye on the cosmos is taking shape.

    When completed in the late 2020s, the aptly named Extremely Large Telescope will be the largest optical telescope on the planet.

    With a nearly 40-meter-wide mirror—roughly four times as wide as the current record holder—it will search for Earth-like planets, seek out the first generation of galaxies and produce images up to 16 times sharper than the Hubble Space Telescope (depending on the wavelength).

    Meanwhile, NASA is preparing to launch what it dubs Hubble’s successor: the James Webb Space Telescope. It won’t be as big as the ELT; its 6.5-meter-wide mirror is middling compared to ground-based telescopes. But from its stable perch beyond Earth’s atmosphere, JWST’s infrared eyes will glimpse light from the first stars, peek into the atmospheres of worlds beyond the solar system and lift the veil on dust-enshrouded stellar nurseries.

    The NASA/ESA/Canadian Space Agency James Webb Space Telescope, annotated. Originally scheduled for launch in October 2021; delayed to December 2021.

    Telescopes in space. Telescopes on the ground. Each has its place in the exploration of the cosmos. But over the four centuries that have elapsed since Galileo pointed a handheld spyglass skyward, the telescope’s job has remained the same.

    “I would describe it as a photon bucket,” says Elizabeth George, a telescope detector engineer at the European Southern Observatory.

    Photons, tiny packets of light, are the main currency in astronomy. Light, in its many guises, is often the only intel we have from far-off locales. A telescope’s job is straightforward: collect more photons than our eyes alone can see.

    Generally, that means going big. In the same way that a large bucket collects more rain than a small pail, a telescope with a larger mirror or lens ensnares more photons, allowing astronomers to see fainter things. And for a given wavelength of light, a wide mirror or lens creates a sharper image, letting researchers see those faint things in better detail.
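These two scalings can be made concrete with the standard diffraction-limit formula, the Rayleigh criterion θ ≈ 1.22 λ/D, which the article doesn't state explicitly. The sketch below is illustrative only: the mirror diameters are Hubble's 2.4 meters and the ELT's roughly 39 meters, and the 500-nanometer wavelength is an assumed value for visible light.

```python
import math

def collecting_area(d_m):
    """Light-gathering area of a circular mirror of diameter d_m, in square meters."""
    return math.pi * (d_m / 2) ** 2

def diffraction_limit_arcsec(d_m, wavelength_m):
    """Rayleigh-criterion angular resolution, theta ~ 1.22 * lambda / D."""
    theta_rad = 1.22 * wavelength_m / d_m
    return math.degrees(theta_rad) * 3600  # radians -> arcseconds

hubble, elt = 2.4, 39.3  # mirror diameters in meters (ELT value approximate)
lam = 500e-9             # 500 nm, an assumed visible-light wavelength

area_gain = collecting_area(elt) / collecting_area(hubble)
sharpness_gain = diffraction_limit_arcsec(hubble, lam) / diffraction_limit_arcsec(elt, lam)

print(f"The ELT's mirror gathers ~{area_gain:.0f}x more photons than Hubble's")
print(f"and is ~{sharpness_gain:.0f}x sharper at the diffraction limit")
```

The sharpness gain falls out of the diameter ratio alone (39.3 / 2.4 ≈ 16), consistent with the "up to 16 times sharper than the Hubble Space Telescope" figure quoted above.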

    This holds true whether a telescope sits on the ground or lives in space, whether it focuses on visible light or expands beyond our human senses to collect radio waves or X-rays. The choice of whether to erect a telescope on land or lob one into orbit is the result of balancing several factors.

    “In space, there’s no atmosphere, that’s really the benefit,” George says.

    Telescopes on Earth must peer through shifting parcels of air that blur images and make stars twinkle.

    Space offers an incredibly stable environment. With minimal temperature swings and no mechanical stress from gravity, the precision of a telescope in space outperforms anything on the ground. “Earth is incredibly noisy, it turns out,” George says.

    Earth’s atmosphere also blocks many types of light from reaching the ground. Visible light and radio waves get through fine, but gamma rays, X-rays, most ultraviolet light and some infrared wavelengths don’t make it. Each of these wavelength ranges probes vastly different physical phenomena.

    “The electromagnetic spectrum is so diverse … we need many different types of telescopes in order to really fully understand the whole universe,” says Regina Caputo, an astrophysicist at NASA’s Goddard Space Flight Center.

    Space comes with caveats. It’s phenomenally expensive to put a telescope in orbit, and most are impossible to repair or update once they’ve escaped the atmosphere. “On the ground, you can make things really big, for cheaper and faster, and you can constantly upgrade,” George says. “The benefit of ground-based is you can try out new things.”

    Telescopes on the ground can grow as big as humans dare to build them. While Hubble, the largest optical space telescope, measures 2.4 meters across, the widest optical scope on the ground—the Gran Telescopio Canarias on the Canary Island of La Palma—spans 10.4 meters.

    Some radio telescopes are bigger still: The largest single telescope of any kind is FAST, the Five-hundred-meter Aperture Spherical radio Telescope, a radio dish in China that spans half a kilometer.

    FAST, the Five-hundred-meter Aperture Spherical radio Telescope [五百米口径球面射电望远镜], located in the Dawodang depression in Pingtang County, Guizhou Province, South China.

    With the flexibility to try new things on the ground, astronomers have gotten clever about boosting image resolution while gazing through a turbulent atmosphere. Many of the largest optical telescopes come equipped with adaptive optics: A deformable mirror—used to correct distortions—and reference points of light in the sky—bright stars or marks created by lasers—help these scopes regain much of the clarity they otherwise would have lost.

    Some ground-based observatories achieve resolution that far surpasses that of any one telescope by combining the light of many smaller scopes observing in sync. The ALMA observatory in Chile, for example, links up 66 radio dishes to produce images with the clarity (though not the sensitivity to faint light) of a single telescope 16 kilometers across.
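A sketch of why linking dishes helps, using the simple interferometry rule θ ≈ λ/B, where B is the array's longest baseline. The 1.3-millimeter observing wavelength is an assumed, representative value; the article doesn't specify one.

```python
import math

def resolution_arcsec(baseline_m, wavelength_m):
    # Angular resolution set by aperture or longest baseline: theta ~ lambda / B
    return math.degrees(wavelength_m / baseline_m) * 3600

wavelength = 1.3e-3  # 1.3 mm, an assumed millimeter-wave observing band

single_dish = resolution_arcsec(12.0, wavelength)      # one 12-meter ALMA dish
full_array = resolution_arcsec(16_000.0, wavelength)   # 16-kilometer baseline

print(f"single 12 m dish: ~{single_dish:.0f} arcsec")
print(f"16 km array:      ~{full_array:.3f} arcsec")
```

Linking the dishes sharpens the view by a factor of more than a thousand, though, as the article notes, it does not increase sensitivity to faint light the way a single giant dish would.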

    The ALMA Observatory in Chile, a partnership of the European Southern Observatory, the US National Radio Astronomy Observatory and the National Astronomical Observatory of Japan.

    But whether in space or on the ground, big or small, sensitive to one type of light or another, no one telescope is the best. “You identify the physics question you’re trying to ask, and then that will dictate what kind of telescope you need,” says Marc Postman, an astronomer at Space Telescope Science Institute in Baltimore.

    To study ultracold interstellar gas, a radio telescope on the ground is the right tool; those clouds are home to molecules and hydrogen atoms that emit specific radio wavelengths. But to probe 10-million-degree plasma that pervades clusters of galaxies, an X-ray telescope in space is the best bet; it can detect the high-energy photons emitted by electrons decelerating in those ionized gases.

    Even a single telescope can fare better or worse than another, depending on the metric. The Extremely Large Telescope and the JWST, for example, will have some overlap in the wavelengths of light they can see. At those wavelengths, the ELT will produce much sharper images, thanks to its sheer size and adaptive optics. But JWST, despite being much smaller, will see far fainter things, because there’s no infrared glow from a warm atmosphere to compete with in space.

    And sometimes, tiny is king. When Postman and colleagues wanted to measure the cosmic optical background, a feeble glow of visible light coming from all directions in space, they didn’t use a big telescope. They turned to a 21-centimeter-wide instrument—smaller than many backyard telescopes—on the New Horizons spacecraft, which buzzed Pluto in 2015.

    NASA’s New Horizons spacecraft.

    Out at the solar system’s edge, that telescope was beyond the light pollution caused by a haze of interplanetary dust particles that scatter sunlight.

    “The sky is so dark out there that even a small telescope can make observations that would be difficult for a much bigger telescope much closer to the sun,” Postman says.

    In the coming years, new ground-based observatories will push the limits of what we can see. The ELT, under construction in the southern hemisphere, and the Thirty Meter Telescope, proposed to be built in the northern hemisphere, are planned to be the largest optical eyes humans have ever trained on the sky. And when the Vera C. Rubin Observatory comes online in 2023, it will have an exceptionally wide field of view that will let it scan the entire sky visible from northern Chile every few days. It will create a time-lapse movie that will help astronomers discover anything that flashes or moves: supernovae, passing asteroids or even undiscovered planets in the remote parts of our solar system.

    In space, all eyes are on the JWST, which completed its final prelaunch tests on August 26 and is scheduled for launch December 18. In the early 2030s, the European Space Agency’s Athena observatory will give astronomers powerful new X-ray vision, which will let them probe deeper into some of the hotter, more energetic locales in the universe.

    Depiction of the European Space Agency’s Athena (Advanced Telescope for High-Energy Astrophysics) spacecraft.

    And in between, a medley of specialized space telescopes will zero in on specific science questions such as the nature of dark matter and dark energy as well as the continued hunt for more planets in the galaxy.

    All these telescopes—and too many others to name—will continue their 400-plus-year legacy of expanding our view of the cosmos. It’s a legacy that Galileo himself would be glad to hear. In his 1610 publication The Starry Messenger, where he documented the wonders his telescope revealed, he pondered his device’s future: “Perchance other discoveries still more excellent will be made from time to time by me or by other observers, with the assistance of a similar instrument.”

    See the full article here .


     
  • richardmitnick 9:37 am on October 12, 2021 Permalink | Reply
    Tags: "Is dark matter cold or warm or hot?", , , , , , , , Symmetry, ,   

    From Symmetry: “Is dark matter cold or warm or hot?” 

    Symmetry Mag

    From Symmetry

    10/12/21
    Glennda Chui

    The answer has to do with dark matter’s role in shaping the cosmos.

    The Milky Way’s dark matter halo. Credit: L. Calçada/European Southern Observatory.

    Half a century after Vera Rubin and Kent Ford confirmed that a form of invisible matter—now called dark matter—is required to account for the rotation of galaxies, the evidence for its existence is overwhelming.
    _____________________________________________________________________________________
    Dark Matter Background
    Fritz Zwicky discovered dark matter in the 1930s while observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM who was denied the Nobel Prize, did most of the work on dark matter some 30 years later.

    Fritz Zwicky, from http://palomarskies.blogspot.com.

    Coma cluster via NASA/ESA Hubble.

    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate that the cluster has a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.

    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that stars at the edges of galaxies orbit just as fast as stars near the center, whereas, if the visible matter were all there is, the outer stars should orbit more slowly, just as the outer planets of the solar system orbit the sun more slowly than the inner ones. The only way to explain these flat rotation curves is if each visible galaxy is embedded in some much larger unseen structure whose gravity keeps the rotation speed consistent from center to edge.

    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
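The mismatch Rubin found can be sketched numerically. Assuming, purely for illustration, that all of a galaxy's visible mass (a made-up round number below) sits at its center, Newtonian gravity predicts orbital speeds that fall off with distance:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_VISIBLE = 2e41   # illustrative visible mass of a galaxy, in kilograms

def keplerian_speed(r_m):
    # Orbital speed if all mass were concentrated at the center: v = sqrt(G*M/r)
    return math.sqrt(G * M_VISIBLE / r_m)

kpc = 3.086e19  # meters per kiloparsec

for r_kpc in (5, 10, 20, 40):
    v_kms = keplerian_speed(r_kpc * kpc) / 1000
    print(f"r = {r_kpc:2d} kpc -> predicted v = {v_kms:.0f} km/s")

# Predicted speeds fall as 1/sqrt(r); Rubin's measured rotation curves
# instead stay roughly flat out to large radii. The gap between the two
# is the evidence for a massive dark matter halo.
```

Quadrupling the radius should halve the orbital speed; observations show almost no drop at all.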

    Astronomer Vera Rubin at the Lowell Observatory in 1965. Credit: The Carnegie Institution for Science.

    Vera Rubin measuring spectra. Credit: Emilio Segrè Visual Archives, AIP, SPL.

    Vera Rubin with the Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970.

    Dark Matter Research

    Inside the Axion Dark Matter eXperiment at the University of Washington. Credit: Mark Stone, University of Washington.
    _____________________________________________________________________________________

    Although it is known to interact with ordinary matter only through gravity, there is such a massive amount of dark matter out there—85% of all the matter in the universe—that it has played a pivotal behind-the-scenes role in shaping all the stuff we can see, from our own Milky Way galaxy to the wispy filaments of gas that link galaxies across vast distances.

    “We think it exists because there’s evidence for it on many, many scales,” says Kevork Abazajian, a theoretical physicist and astrophysicist at the University of California, Irvine.

    There have been a lot of ideas about what form dark matter might take, from planet-sized objects called MACHOs to individual particles like WIMPs—weakly interacting massive particles roughly the size of a proton—and even tinier things like axions and sterile neutrinos.

    In the 1980s, scientists came up with a way to make sense of this growing collection: They started classifying proposed dark-matter particles as cold, warm or hot. These categories are based on how fast each type of dark matter would have traveled through the early universe—a speed that depended on its mass—and on how hot its surroundings were when it popped into existence.

    Light, fast particles are known as hot dark matter; heavy, slow ones are cold dark matter; and warm dark matter falls in between.

    In this way of seeing things, WIMPs are cold, sterile neutrinos are warm, and relic neutrinos from the early universe are hot. (Axions are a special case—both light and extremely cold. We’ll get to them later.)

    Why is their speed so important?

    “If a dark matter particle is lighter and faster, it can travel farther in a given time, and it will smooth out any structure that already exists along the way,” Abazajian says.

    On the other hand, slower, colder forms of dark matter would have helped build structure, and based on what we know and see today it must have been part of the mix.
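Abazajian's point is a simple scaling: free-streaming distance grows with speed, so faster particles smooth structure on larger scales. The toy sketch below uses entirely made-up speeds and an arbitrary time interval, chosen only to show the ordering, not to model real cosmology.

```python
C = 299_792_458.0  # speed of light, m/s

# Assumed fractions of light speed while structure was forming (illustrative only)
speeds = {
    "hot (relic neutrino)": 0.9 * C,
    "warm (sterile neutrino)": 0.01 * C,
    "cold (WIMP)": 1e-6 * C,
}

time_s = 1e13  # an arbitrary stretch of early-universe time, ~300,000 years

# A faster particle free-streams farther in the same time, erasing
# ("smoothing out") any structure smaller than the distance it covers.
streaming_length = {name: v * time_s for name, v in speeds.items()}

for name, length in streaming_length.items():
    print(f"{name:25s} smooths structure below ~{length:.1e} m")
```

Cold candidates barely move on cosmic scales, which is why they can sit in place and seed clumps, while hot candidates race past and erase them.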

    Building galaxies

    Although there are theories about when and how each type of dark-matter candidate would have formed, the only thing scientists know for sure is that dark matter was already around about 75,000 years after the Big Bang. It was then that matter started to dominate over radiation and little seeds of structure started to form, says Stanford University theoretical physicist Peter Graham.

    Most types of dark-matter particles would have been created by collisions between other particles in the hot, dense soup of the infant universe, in much the same way that high-energy particle collisions at places like the Large Hadron Collider give rise to exotic new types of particles. As the universe expanded and cooled, dark-matter particles would have wound up being hot, warm or cold—and, in fact, there could have been more than one type.

    Scientists describe them as freely “streaming” through the universe, although this term is a little misleading, Abazajian says. Unlike leaves floating on a river, all headed in the same direction in a coordinated way, “these things are not just in one place and then in another place,” he says. “They’re everywhere and going in every direction.”

    As it streamed, each type of dark matter would have had a distinctive impact on the growth of structure along the way—either adding to its clumpiness, and thus to the building of galaxies, or thwarting their growth.

    Cold dark matter, such as the WIMP, would have been a clump-builder. It moved slowly enough to glom together and form gravitational wells, which would have captured nearby bits of matter.

    Hot dark matter, on the other hand, would have been a clump-smoother, zipping by so fast that it could ignore those gravitational wells. If all dark matter were hot, none of those seeds could have grown into bigger structures, says Silvia Pascoli, a theoretical physicist at the University of Bologna. That’s why scientists now believe that hot dark-matter particles, such as relic neutrinos from the early days of the cosmos, could not constitute more than a sliver of dark matter as a whole.

    Despite their tiny contribution, Pascoli adds, “I say these relic neutrinos are currently the only known component of dark matter. They have an important impact on the evolution of the universe.”

    You might think that warm dark matter would be the best dark matter, filling the universe with a Goldilocks bowl of just-right structure. Sterile neutrinos are considered the top candidate in this category, and in theory they could indeed constitute the vast majority of dark matter.

    But most of the parameter space—the sets of conditions—where they could exist has been ruled out, says Abazajian, who as a graduate student researched how specific types of neutrino oscillations in the early universe could have produced sterile neutrino dark matter.

    Although those same oscillations could be happening today, he says, the probability that a regular neutrino would turn into a sterile one through standard oscillations in the vacuum of space is thought to be very small, with estimates ranging from 1 in 100,000 to 1 in 100 trillion.

    “You’d have to have a very good counting mechanism to count up to 100 trillion hits in your detector without missing the one hit from a sterile neutrino,” Abazajian says.

    That said, there are a few experiments out there that are giving it a try, using new approaches that don’t rely on direct hits.

    Then there’s the axion.

    Unlike the other dark-matter candidates, axions would be both extremely light—so light that they are better described as waves whose associated fields can spread over kilometers—and extremely cold, Graham says. They are so weakly coupled to other forms of matter that the frantic collisions of particles in the thermal bath of the early universe would have produced hardly any.

    “They would have been produced in a different way than the other dark matter candidates,” Graham says. “Even though the universe was very hot at the time, axions would have been very cold at birth and would stay cold forever, which means that they are absolutely cold dark matter.”

    Even though axions are very light, Graham says, “because they exist at close to absolute zero, the temperature where all motion stops, they are essentially not moving. They’re kind of this ghostly fluid, and everything else moves through it.”

    Searching for dark matter of all kinds

    Some scientists think it will take more than one type of dark matter to account for all the things we see in the universe.

    And in the past few years, as experiments aimed at detecting WIMPs and producing dark matter particles through collisions at the Large Hadron Collider have so far come up empty-handed, the search for dark matter has broadened.


    The proliferation of ideas for searches has been helped by technological advances and clever approaches that could force much lighter and even more exotic dark-matter particles out of hiding.

    Some of those efforts make use of the very clumpiness that dark matter was instrumental in creating.

    Simona Murgia, an experimentalist at the University of California, Irvine, led a team looking for signs of collisions between WIMPs and their antiparticles with the Fermi Gamma-ray Space Telescope while a postdoc at DOE’s SLAC National Accelerator Laboratory.

    Now she’s joined an international team of scientists who will conduct a vast survey of the Southern sky from the Vera C. Rubin Observatory in Chile using the world’s biggest digital camera, which is under construction at SLAC.

    The Vera C. Rubin Observatory, under construction on the El Peñón peak at Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region in northern Chile, alongside the existing Gemini South Telescope and Southern Astrophysical Research Telescope.

    One of the things this survey will do is get a much better handle on the distribution of dark matter in the universe by looking at how it bends light from the galaxies we can see.

    “It will tell us something about the nature of dark matter in a totally different way,” Murgia says. “The more clumpy its distribution is, the more consistent it is with theories that tell you dark matter is cold.”

    The camera is expected to snap images of about 20 billion galaxies over 10 years, and from those images scientists hope to infer the fundamental nature of the dark matter that shaped them.

    “We don’t only want to know the dark matter is there,” Murgia says. “We do want to understand the cosmology, but we also really want to know what dark matter is.”

    See the full article here .




     
  • richardmitnick 9:21 am on October 5, 2021 Permalink | Reply
    Tags: "How to train your magnet", , , , , Large Hadron Collider, Niobium-3-tin magnets, , Symmetry   

    From Symmetry: “How to train your magnet” 

    Symmetry Mag

    From Symmetry

    10/05/21
    Sarah Charley

    illustration by Sandbox Studio, Chicago with Tara Kennedy.

    New accelerator magnets are undergoing a rigorous training program to prepare them for the extreme conditions inside the upgraded Large Hadron Collider.

    When training for a marathon, runners must gradually ramp up the distance of their runs. They know that their runs in the early days of training do not define what they will one day be capable of; they’re building a strong foundation that will help them reach their full potential.

    The car-length magnets that steer particles around the Large Hadron Collider go through a similar process. Scientists must push them to their limits, time and again, until they can handle enormous amounts of electrical current.

    “These magnets are great marvels of engineering,” says scientist Kathleen Amm, director of the Magnet Division at the DOE’s Brookhaven National Laboratory (US) in New York. “But one thing we cannot do is put them straight into an accelerator. They have to be trained.”

    Scientists, engineers and technicians at Brookhaven are now training magnets for an even more difficult task: directing and focusing particles in a next-generation accelerator, the powered-up High-Luminosity LHC at CERN. Luckily, these magnets can not only withstand the workout, but also gain the ability to carry even more current than before.

    Withstanding lightning bolts

    Using a new type of superconducting wire based on niobium-3-tin (Nb3Sn), the HL-LHC accelerator magnets will be able to conduct about 40% more electrical current than the previous iteration of magnets for the LHC. Each will carry about 16,500 amperes—roughly as much as a small bolt of lightning. The average laptop, for reference, uses less than 5 amperes.

    LHC magnets are made from materials that are different from those used to make a laptop in an important way: They’re superconducting. That means they can carry an electrical current without losing any energy. They don’t produce any heat because they have zero electrical resistance.

    But there’s a catch: Both the old and new LHC magnets obtain the property of superconductivity only when cooled to extremely low temperatures. Inside the LHC, they are kept at 1.9 kelvin (minus 456.25 degrees Fahrenheit), just above absolute zero.

    Even that is not always enough: A tiny imperfection can cause a magnet to suddenly lose its superconducting properties in a process called quenching.

    “A quench means that a portion of the superconductor becomes normal,” says scientist Sandor Feher, who oversees HL-LHC magnet testing and training. “Its temperature starts to rise, and this heat spreads to other parts of the magnet.”

    A quench can be ruinous. “When a superconductor loses its superconducting properties, it goes from having zero electrical resistance to a very high electrical resistance,” Amm says. “In the early days [of superconductor development], magnets would get burnt out because of this rapid transition.”

    But this overheating does not always spell disaster. During magnet training, controlled quenches induce helpful structural changes on the microscopic level that improve a magnet’s performance.

    The anatomy of a magnet

    When he was 12 years old, Martel Walls won a local art competition with a detailed and realistic drawing of a courthouse in Bloomington, Illinois. “My drawing ended up inside the courthouse,” he says. “Ever since then, I knew I wanted to work in a field that would take advantage of my eye for detail and steady hand.”

    Walls’ eye for complex forms eventually led him to his job as lead technician in charge of magnetic coil development at DOE’s Fermi National Accelerator Laboratory (US) in Illinois, where teams both produce and test magnets bound for the HL-LHC.

    The magnets Walls and his team are assembling consist of 450 meters (about 1480 feet) of Nb3Sn superconducting cable wound around two interlocking support structures. The coils are about 4.5 meters (almost 15 feet) in length. Every centimeter of cable is inspected both before and during the winding process.

    The coils are then heated to 665 degrees Celsius (1229 degrees Fahrenheit) over an 11-day heat cycle, a process that transforms the ordinary niobium-tin cable into a superconductor but also makes it incredibly brittle. “It becomes as fragile as uncooked spaghetti,” Walls says.

    Handling them as gently as possible, technicians solder more components onto the coils before soaking them in epoxy. The final coils are shipped to DOE’s Lawrence Berkeley National Laboratory (US) in California, where multiple coils are fitted together and then wrapped in a strong steel casing. They are then shipped to Brookhaven to begin their training regime.

    When the Brookhaven test team connects the magnets to electricity, the coils push and pull on each other with enormous forces due to the high magnetic fields.

    Even a tiny movement on the order of just 10 to 20 microns—a fraction of the width of a human hair—can be enough to generate a quench.

    Training regime

    Early on, engineers realized that a well-built magnet could remember these microscopic movements. When an unstable component shifts into a more comfortable position, the component then normally stays put. The result is a magnet that is sturdier the next time it powers up.

    During training, scientists and engineers gradually increase the electrical current circulating in the magnet. If any portion of the magnet is going to move or release energy, it does so in a controlled laboratory setting rather than a hard-to-access subterranean accelerator complex.

    Magnet training at Brookhaven begins by immersing the magnet in a bath of liquid helium. Once it’s cooled, the test team introduces and gradually increases the electrical current.

    As soon as there’s a quench, the electricity is automatically diverted out of the magnet. The liquid helium bath evaporates, carrying with it the heat of the quench. After each quench, the helium is recollected to be reused, and the process starts again.

    “Our goal is three quenches per magnet per day,” Feher says. “We start around 5 or 6 in the morning and work in shifts until 6 or 7 in the evening.”

    Little by little, the Brookhaven test team exposes the magnet to higher and higher currents.

    “During magnet R&D, we might see 50 to 60 quenches,” Amm says. “When we go into production, the goal is to see a minimum number of quenches, around 14 or 15, before we get to the desired field level.”
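The training loop described above can be caricatured in a few lines. Everything numeric here except the 16,500-ampere operating current is invented for illustration: real per-quench gains depend on the individual magnet's construction.

```python
import random

random.seed(7)  # fixed seed so the toy run is repeatable

TARGET_A = 16_500      # HL-LHC operating current from the article, in amperes
quench_limit = 12_000  # hypothetical current at which the untrained magnet first quenches
quenches = 0

# Each controlled quench lets unstable coil components settle into place,
# so the magnet "remembers" and tolerates a bit more current on the next ramp.
while quench_limit < TARGET_A:
    quenches += 1
    quench_limit += random.randint(200, 600)  # made-up training gain per quench

print(f"reached {quench_limit} A after {quenches} training quenches")
```

The loop captures the key idea: training ends not after a fixed number of cycles but when the magnet holds the target current without quenching.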

    Once training is completed—that is, once the magnet can operate at the desired current without quenching—it is shipped back to Fermilab for further outfitting and testing. The final magnets will then be shipped to CERN.

    According to Amm, designing, building and preparing magnets for the LHC’s upgrade is more than applied physics: It’s a form of craftsmanship.

    “That’s where the art comes in along with the science,” she says. “You can do so much science and engineering, but ultimately you have to build and test a lot of magnets before you understand the sweet spot.”

    See the full article here .




     