Tagged: Symmetry

  • richardmitnick 9:26 pm on February 1, 2023 Permalink | Reply
    Tags: "A new way to explore proton’s structure with neutrinos yields first results", Symmetry

    From “Symmetry”: “A new way to explore proton’s structure with neutrinos yields first results” 

    Symmetry Mag

    From “Symmetry”

    2.1.23
    Madeleine O’Keefe

    Physicists used MINERvA, a Fermilab neutrino experiment, to measure the proton’s size and structure using a neutrino-scattering technique.

    For the first time, particle physicists have been able to precisely measure the proton’s size and structure using neutrinos.

    With data gathered from thousands of neutrino-hydrogen scattering events collected by MINERvA, a particle physics experiment at the US Department of Energy’s Fermi National Accelerator Laboratory, physicists have found a new lens for exploring protons. The results were published today in the scientific journal Nature [below].

    This measurement is also important for analyzing data from experiments that aim to measure the properties of neutrinos with great precision, including the future Deep Underground Neutrino Experiment, hosted by Fermilab.


    “The MINERvA experiment has found a novel way for us to see and understand proton structure, critical both for our understanding of the building blocks of matter and for our ability to interpret results from the flagship DUNE experiment on the horizon,” says Bonnie Fleming, Fermilab deputy director for science and technology.

    Protons and neutrons are the particles that make up the nucleus, or core, of an atom.

    Understanding their size and structure is essential to understanding particle interactions. But it is very difficult to measure things at subatomic scales. Protons—about a femtometer, or 10^−15 meters, in diameter—are too small to examine with visible light. Instead, scientists use particles accelerated to high energies. Their wavelengths are capable of probing minuscule scales.
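The scale argument can be made concrete with the de Broglie relation: a highly relativistic probe has a wavelength of roughly hc/E. A minimal sketch (numbers rounded, purely illustrative):

```python
# Estimate the beam energy needed to resolve proton-scale structure.
# For a relativistic probe, wavelength ~ h*c / E.
HC_MEV_FM = 1239.84  # h*c in MeV * femtometers

def probe_energy_mev(wavelength_fm):
    """Energy (MeV) whose wavelength matches the target scale."""
    return HC_MEV_FM / wavelength_fm

# To resolve a proton (~1 fm across), the probe needs roughly GeV energies:
energy = probe_energy_mev(1.0)
print(f"~{energy / 1000:.2f} GeV to probe 1 femtometer")
```

This is why proton-structure experiments need accelerators rather than microscopes: visible light, at hundreds of nanometers, is about a trillion times too coarse.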

    Starting in the 1950s, particle physicists used electrons to measure the size and structure of the proton. Electrons are electrically charged, which means they interact with the electromagnetic force distribution in the proton. By shooting a beam of accelerated electrons at a target containing lots of atoms, physicists can observe how the electrons interact with the protons and thus how the electromagnetic force is distributed in a proton. Performing increasingly more precise experiments, physicists now have measured the proton’s electric charge radius to be 0.877 femtometers.

    The MINERvA collaboration achieved its groundbreaking result by using particles called neutrinos in lieu of electrons. Specifically, they used antineutrinos, the antimatter partners of neutrinos. Unlike electrons, neutrinos and antineutrinos have no electric charge; they only interact with other particles via the weak nuclear force. This makes them sensitive to the “weak charge” distribution inside a proton.

    However, neutrinos and antineutrinos rarely interact with protons—hence the name weak force. To collect enough scattering events to make a statistically meaningful measurement, MINERvA scientists needed to smash a lot of antineutrinos into a lot of protons.

    Fortunately, Fermilab is home to the world’s most powerful high-energy neutrino and antineutrino beams. And MINERvA contains a lot of protons. Located 100 meters underground at Fermilab’s campus in Batavia, Illinois, MINERvA was designed to perform high-precision measurements of neutrino interactions on a wide variety of materials, including carbon, lead and plastic.

    To measure the proton structure with high precision, scientists ideally would send neutrinos or antineutrinos into a very dense target made only of hydrogen, which contains protons but no neutrons. That is experimentally challenging, if not impossible, to achieve. Instead, the MINERvA detector contains hydrogen that is closely bonded to carbon in the form of a plastic called polystyrene. But no one had ever tried to separate hydrogen data from carbon data.

    “If we were not optimists, we would say it’s impossible,” says Tejin Cai, a postdoctoral researcher at York University and lead author on the Nature paper. Cai performed this research for his doctorate at the University of Rochester.

    “The hydrogen and carbon are chemically bonded together, so the detector sees interactions on both at once. But then, I realized that the very nuclear effects that made scattering on carbon complicated also allowed us to select hydrogen and would allow us to subtract off the carbon interactions.”

    Cai and Arie Bodek, a professor at the University of Rochester, proposed to Cai’s PhD advisor, Kevin McFarland, that they use MINERvA’s polystyrene target to measure antineutrinos scattering off protons in hydrogen and carbon nuclei. Together, they developed algorithms to subtract the large carbon background by identifying neutrons produced from antineutrinos scattering off carbon atoms.

    “When Tejin and Arie first suggested trying this analysis, I thought it would be too difficult, and I wasn’t encouraging. Tejin persevered and proved it could be done,” says McFarland, a professor at the University of Rochester. “One of the best parts of being a teacher is having a student who learns enough to prove you wrong.”

    Cai and his collaborators used MINERvA to record more than a million antineutrino interactions over the course of three years. They determined that about 5,000 of these were neutrino-hydrogen scattering events.

    With these data, they inferred the size of the proton’s weak charge radius to be 0.73 ± 0.17 femtometers. It is the first statistically significant measurement of the proton’s radius using neutrinos. Within its uncertainties, the result aligns with the electric charge radius measured with electron scattering.
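The agreement claim is easy to check numerically: the weak-charge and electric-charge radii differ by well under one standard deviation of the neutrino measurement (whose uncertainty dominates here). A quick sketch:

```python
# Consistency check between the two radius measurements quoted above.
weak_radius, weak_err = 0.73, 0.17  # fm, from antineutrino scattering
electric_radius = 0.877             # fm, from electron scattering

# Treating the electron-scattering uncertainty as negligible by comparison:
tension_sigma = abs(electric_radius - weak_radius) / weak_err
print(f"difference = {tension_sigma:.2f} standard deviations")
```

A difference of under one sigma is what "aligns within its uncertainties" means in practice.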

    The result shows that physicists can use this neutrino-scattering technique to see the proton through a new lens. The result also provides a better understanding of the proton’s structure. This can be used to predict the behavior of groups of protons in an atom’s nucleus. If physicists start with a better measurement of neutrino-proton interactions, they can make better models of neutrino-nucleus interactions. This will improve the performance of other neutrino experiments, such as NOvA at Fermilab and T2K in Japan.

    Nature
    See the science paper for instructive material with images.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 10:58 am on January 31, 2023 Permalink | Reply
    Tags: "Proposed experiment seeks origin of cosmic neutrinos", Most astronomers trek to the mountaintops to study the stars but a group of physicists are seeking the secrets of the cosmos with a detector at the bottom of the ocean., P-ONE experiment UBC, Symmetry

    From “Symmetry”: “Proposed experiment seeks origin of cosmic neutrinos” 

    Symmetry Mag

    From “Symmetry”

    1.31.23
    Mara Johnson-Groh

    P-ONE experiment

    Most astronomers trek to the mountaintops to study the stars, but a group of physicists is seeking the secrets of the cosmos with a detector at the bottom of the ocean.

    Deep underwater, far off the coast of British Columbia, Canada, the world is cold and dark. Rising from the sand below bob a set of submerged buoys, securely fastened with mooring lines to the ocean floor. Tethered at intervals along each line are large glass spheres housing sensitive, light-detecting instruments.

    The scientists who constructed these deep-sea instruments aren’t biologists or oceanographers. They’re physicists and astronomers. It’s here, 2 kilometers under the frigid Pacific waves, that they are hoping to capture wily, shape-shifting, nearly massless particles called neutrinos that could change our view of the universe.

    Particle decays regularly produce mid-to-low-energy neutrinos, the kind that physicists spend the bulk of their time observing. But every once in a while a different kind of neutrino is detected—a high-energy cosmic neutrino.

    Scientists know from the intense energy of these superpowered particles that they must have been accelerated by extreme objects outside our galaxy.

    “These energies are really hard to imagine,” says Juan Pablo Yanez Garza, a physicist and assistant professor at the University of Alberta. “When you consider how we accelerate particles in our labs, like the Large Hadron Collider, and plug in the typical magnetic fields in the universe, you realize that you would need an ‘accelerator’ the size of an entire galaxy to energize neutrinos this much.”

    Using gigantic, specialized detectors, scientists have just begun to pinpoint some of the extragalactic origins of these particles. Figuring out where high-energy neutrinos come from can help solve longstanding mysteries about the behemoth cosmic accelerators that produce them, resolve unanswered questions about cosmic rays, and even provide hints about the origins of dark matter.

    The instruments in the Pacific Ocean are some of the first steps toward a proposed experiment called the Pacific Ocean Neutrino Experiment, or P-ONE, which scientists hope will help them uncover cosmic neutrino origins.

    P-ONE experiment. Credit: UBC.

    Cosmic messengers

    Due to the vastness of space and the ubiquity of view-blocking dust clouds, most of the universe is hidden from photon-based telescopes. Instead, astronomers look for messenger particles, such as neutrinos, to learn more about these dark regions, which include some of the highest-powered objects in the universe. Neutrinos are ideal cosmic messengers. Their limited interactions with other particles and chargeless state mean they can race across space unencumbered by magnetic fields and dust clouds. However, this introversion means they can be hard to capture when they reach Earth.

    Scientists have developed goliath detectors in hopes of upping the odds. Even so, the chance of catching a high-energy cosmic neutrino is slim. In its 12 years of operation, the IceCube detector—one of the premier cosmic neutrino observatories, located at the South Pole with a cubic kilometer of detection volume—has caught only a few hundred.

    “With more than 10 years of data from IceCube, we still do not know what most of the cosmic high-energy neutrino sources are,” says Lisa Schumacher, a research scientist at the Technical University of Munich, who is involved with both IceCube and P-ONE.

    But we do know about some of them. In 2018, the IceCube observatory was the first to pinpoint a source of high-energy neutrinos: a blazar 3.7 billion light years away. A blazar is the nucleus of a galaxy powered by a black hole that can accelerate particles in huge jets at nearly the speed of light.

    Then in 2022, IceCube announced a second source in another active galaxy just 47 million light years away, where scientists think neutrinos and other matter are accelerated around a giant black hole.

    While active galaxies are now a confirmed source, statistical analyses show that they alone can’t account for all high-energy astrophysical neutrinos. “The problem is we need more data,” says Elisa Resconi, a professor at the Technical University of Munich who has worked with IceCube. “Statistics is what’s limiting us right now.”

    A light in the dark

    To gather more data, scientists like Resconi have been dreaming up new neutrino observatories. Resconi spearheaded an effort to launch the new large-scale P-ONE experiment. With a goal of building hundreds of neutrino detectors along several 1-kilometer-long lines, the experiment is intended to complement IceCube. From its location in the northeast Pacific, P-ONE could grab neutrinos from different parts of the universe that IceCube can’t see.

    “The idea is to have a telescope that would be similar to IceCube, but with improvements given advances in technology over the last decade,” Resconi says. “Our aim is to really complement other detectors, as we want to be able to work together and pool our data.”

    P-ONE will detect neutrinos in the same way IceCube has for years—by looking for the tiny streaks of light neutrinos create as they bump into other particles in a medium such as water or ice.
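Those streaks are Cherenkov radiation: the fast charged particles produced when a neutrino interacts outrun light in the medium and emit light on a cone whose opening angle depends on the refractive index. A sketch with assumed textbook index values:

```python
import math

def cherenkov_angle_deg(n, beta=1.0):
    """Cherenkov emission angle for a particle moving at beta*c in index n."""
    return math.degrees(math.acos(1.0 / (n * beta)))

# Approximate refractive indices (assumed values for illustration):
print(f"water: {cherenkov_angle_deg(1.33):.1f} deg")
print(f"ice:   {cherenkov_angle_deg(1.31):.1f} deg")
```

The angle is similar in water and ice, around 41 degrees; the practical difference between the two media, as the article explains next, lies in how much the light scatters on its way to the sensors.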

    To block out other atmospheric particles that can mimic these tiny streaks, scientists often install neutrino detectors underground. Some neutrino detectors, like the Sudbury Neutrino Observatory in Canada and Super-Kamiokande in Japan, are built in former or currently operating mines. Others, like the Baikal Deep Underwater Neutrino Telescope in Russia and the future Cubic Kilometre Neutrino Telescope off the coast of Italy and France, are built deep underwater. P-ONE scientists hope to add to—and widen the reach of—the aquatic fleet.

    P-ONE’s location in water gives it an edge over IceCube, which is enshrined 2,000 meters deep in glacial ice. Though the Antarctic ice is highly transparent, its crystalline structure prevents a beam of light from traveling along a perfectly straight line. Since the angle and direction of the streaks are used to identify where in the sky the neutrino came from, this diffusion makes it harder for IceCube to identify cosmic neutrino factories.

    “The total amount of light the detectors will measure with P-ONE will be a little bit less than IceCube, but we can better reconstruct where the light came from,” Schumacher says.

    Neutrinos, bioluminescence, and more

    Scientists have proposed to build P-ONE off the backbone of Ocean Networks Canada’s oceanographic observatory, the largest permanent oceanographic infrastructure in the world. If they do, P-ONE scientists will be able to tap into the network of hundreds of kilometers of optical cables and substations already installed on the ocean floor, saving the experiment time and money.

    In return, P-ONE could also open new doors in oceanography and biology. Extra detectors, like hydrophones or oxygen sensors, could be attached to the P-ONE lines to measure the ocean’s vitals and conduct acoustic tomography, which uses low-frequency signals to measure ocean currents and temperature over large regions. And since P-ONE’s detectors are light-sensitive, they could also be used to study long-term changes in bioluminescence activity in the deep.

    In 2018, the project scientists deployed STRAW-a, an instrument cluster designed to test the suitability of the site for the P-ONE experiment. Along with STRAW-b, which finished testing in 2021, the pathfinder mission proved the location’s clear waters would make a good canvas for neutrino detection. Now, the scientists are preparing for the next stage: installing a prototype instrument, planned for spring of 2024.

    The prototype phase will see at least three lines, each with 20 detectors, installed on the ocean floor. This should allow scientists to capture 30 or so atmospheric neutrinos—enough to calibrate and provide proof-of-concept. If all goes well, P-ONE will eventually consist of 70 1,000-meter-long lines spread over a square kilometer of ocean. And if interest in the project grows, the experiment is extremely scalable.

    “With P-ONE, IceCube and other detectors in the works, we can really do neutrino astronomy properly,” Resconi says. “We’ll be able to pinpoint many objects and do population studies, which can tell us which objects produce the most neutrinos.”

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 12:11 pm on January 24, 2023 Permalink | Reply
    Tags: "Ways to weigh a neutrino", Another process that produces a neutrino is electron capture: A proton in an unstable nucleus captures an electron from the inner shell and converts to a neutron and ejects a neutrino., For decades scientists have tried to find a way to measure the mass of the lightest matter particle known to exist. Three new approaches now have a chance to succeed., In a process called beta decay a neutron in an unstable nucleus transforms into a proton to restore balance. As the neutron becomes a proton it emits a negatively charged electron—and a neutrino., Neutrinos are very very small. But they outnumber the other fundamental particles by a factor of 10 billion., Symmetry, They probably influenced the formation of structures in the early universe so knowing their mass is critical to closing gaps in our understanding of the cosmos., To measure directly or indirectly-that is the question.

    From “Symmetry”: “Ways to weigh a neutrino” 

    Symmetry Mag

    From “Symmetry”

    1.24.23
    Elise Overgaard

    For decades scientists have tried to find a way to measure the mass of the lightest matter particle known to exist. Three new approaches now have a chance to succeed.

    In 1980, Hamish Robertson was a tenured professor at Michigan State. He’d been there since his postdoc in 1971, and he was content. “I want to stress how valued and happy I felt there,” he says. “It was, and still is, an outstanding place.”

    But he and his friend and colleague, Tom Bowles, had begun to hatch an idea that would take him far from MSU. They were devising a new experiment to measure the mass of the elusive and perplexingly light neutrino.

    Neutrinos are the only fundamental particles whose mass we still don’t know. As their name implies, neutrinos are very, very small. But they outnumber the other fundamental particles by a factor of 10 billion.

    Their collective abundance makes it likely that they influenced the formation of structures in the early universe, so knowing their mass is critical to closing gaps in our understanding of the cosmos.

    But how do you measure something with a mass so small it approaches zero? Hundreds of physicists, including Robertson, have devoted their careers to solving this problem, and they’re seeing progress. Research projects underway in Europe and the United States fuel a sense of optimism that the task can be accomplished.

    To measure directly or indirectly: that is the question

    If you were asked to weigh something—your dog for example—how would you do it?

    You could set the dog in your car, watch the compression, measure how many inches the car is displaced, then convert that into a measurement of the dog’s mass. You’d need to know the weight of the car, the technical specifications of the shocks, how much air is in the tires, and something about spring constants. That’s an indirect (and hard) way.

    Alternatively, you could simply set the dog on a bathroom scale.

    The decision on which approach to use, direct versus indirect, depends on what resources you have available. If you don’t have a bathroom scale, an indirect measurement using your car could hypothetically be your best option.

    When it comes to neutrinos, scientists have faced a similar situation.

    In 1987, astrophysicists interested in the mass of neutrinos got an assist from a rare nearby supernova. The spectral data they collected from the stellar explosion helped them make an indirect measurement that gave them an upper limit on the neutrino mass. As in the car example, they used mathematical models, known quantities, and interactions between many parts of a system to make a calculation.
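The supernova trick works because a massive neutrino travels slightly slower than light, so lower-energy neutrinos lag behind; the spread in arrival times bounds the mass. A rough sketch with assumed SN 1987A-like numbers (distance roughly 168,000 light-years, neutrino energies around 10 MeV, and the 5.7-electronvolt limit quoted later in this article):

```python
# Time-of-flight delay of a massive neutrino relative to light:
#   delay ~ (L/c) * m^2 / (2 * E^2)   for m << E, both in the same units.
SECONDS_PER_YEAR = 3.156e7

def arrival_delay_s(distance_ly, mass_ev, energy_ev):
    light_travel_s = distance_ly * SECONDS_PER_YEAR
    return light_travel_s * (mass_ev / energy_ev) ** 2 / 2.0

# Assumed illustrative numbers: 168,000 ly, 10 MeV neutrinos, 5.7 eV mass
delay = arrival_delay_s(168_000, 5.7, 10e6)
print(f"delay ~ {delay:.1f} s")
```

Even across 168,000 light-years, a few-electronvolt mass stretches the arrival times by only about a second, which is why the observed burst duration translates into a mass limit rather than a measurement.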

    Cosmologists have also made an indirect measurement by looking for the imprint of neutrino mass on faint radiation in space called the cosmic microwave background [CMB], says Diana Parno, an associate professor at Carnegie Mellon University and US spokesperson for the Karlsruhe Tritium Neutrino direct mass experiment, or KATRIN.

    “What we would ideally like is an Earth-based measurement of the neutrino mass, and then we can compare that against the cosmological measurement,” Parno says.

    So, while cosmologists and astrophysicists look to the sky, experimentalists like Robertson, Bowles and Parno take the direct approach to searching for the neutrino mass. It’s tough—the ghostly particles don’t interact with electromagnetic fields or the nuclear strong force, and they’re so light that gravity barely pulls on them. Experimentalists have to get creative.

    Building a bathroom scale for neutrinos

    Back in 1980, Robertson and Bowles developed a new idea for making a direct neutrino mass measurement.

    But there were only a few places that had the equipment, the funding and other resources required for the experiment they wanted to propose. The DOE’s Los Alamos National Laboratory was one of them.

    Bowles already worked in Los Alamos, and he wanted Robertson to join him. Robertson was tempted. “Since I first was aware of science as a child, I knew about Los Alamos. I always thought it would be the most extraordinary place to be,” he says.

    To convince Robertson to take the dive, Bowles invited him to visit the lab. One night he treated Robertson to dinner at Southwest restaurant Rancho de Chimayo. “On the way back, we stopped the car and turned off the lights,” Robertson says. “We got out so I could marvel at the spectacular carpet of stars in the clear mountain air.”

    Robertson was sold. He was ready to embark on the journey to measure neutrino mass.

    So how does one go about building a bathroom scale for neutrinos? If you have a squirmy dog that won’t sit on the scale, you can weigh yourself alone, then weigh yourself with your pet. The difference between the two is the weight of the dog. Neutrino researchers use that same idea, taking advantage of processes that produce neutrinos.

    In a process called beta decay a neutron in an unstable nucleus transforms into a proton to restore balance. As the neutron becomes a proton it emits a negatively charged electron—and a neutrino.

    Another process that produces a neutrino is electron capture: A proton in an unstable nucleus captures an electron from the inner shell and converts to a neutron and ejects a neutrino.

    In both cases the events produce a very specific amount of energy—you can look it up in a table. That exact amount of energy is the difference between the mass of the parent atom and the mass of the daughter atom. And that energy is shared between the products: the neutrino and the electron in beta decay, or the neutrino and the excited daughter atom in electron capture.

    Experimentalists measure energies, so they can determine the energy taken by the neutrino. Then they take advantage of that old reliable equation E = mc² and convert the neutrino’s energy to mass.
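In tritium beta decay, for example, the shared energy (the Q-value, roughly 18.6 kiloelectronvolts) sets a maximum electron energy, and a nonzero neutrino mass lowers that endpoint by the neutrino's rest energy. A minimal sketch of why the measurement is so delicate:

```python
Q_VALUE_EV = 18_600.0  # approximate tritium beta-decay Q-value, in eV

def electron_endpoint_ev(neutrino_mass_ev):
    """Maximum electron energy: the Q-value minus the neutrino's rest energy."""
    return Q_VALUE_EV - neutrino_mass_ev

# A 0.8 eV neutrino shifts the endpoint by only 0.8 eV out of ~18,600 eV:
shift = electron_endpoint_ev(0.0) - electron_endpoint_ev(0.8)
print(f"endpoint shift: {shift:.1f} eV out of {Q_VALUE_EV:.0f} eV")
```

The signature is a shift of well under one part in ten thousand at the very tail of the spectrum, where decays are rarest.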

    Awkward space in the Standard Model

    Why is investigating the mass of neutrinos so alluring to scientists like Robertson and Bowles? Because the unresolved problem has shaken scientists’ understanding of the universe.

    The Standard Model of particle physics—our current best explanation of the fundamental forces and particles that make up everything—predicts that neutrinos should have no mass.

    But oscillation experiments in the 1990s showed that they must have mass.

    “Neutrino mass is our first laboratory evidence for physics beyond the Standard Model, and that’s so cool,” Parno says.

    As Robertson explains: “Neutrinos are the only matter particles for which the Standard Model made a prediction for what its mass would be. And that prediction was wrong.

    “We have examples of things that are not in the Standard Model. Gravity is not in the Standard Model. The mass of quarks is not predicted. But there is only this one case where the Standard Model actually made a prediction, and it got it wrong.”

    The upper limit for the neutrino mass, as determined indirectly by cosmology, is roughly one millionth the mass of the next lightest particle, the electron. That’s like the gap between one mouse, which weighs roughly 25 grams, and five elephants, which together weigh roughly 25,000 kilograms. “That’s an enormous gap,” says Parno. “It’s awkward to have this empty space in the Standard Model.”

    The gap means that neutrino masses might be special. Or they might not be. But until we have a measurement to work with, we can’t really say anything definitive. So making that measurement is a crucial first step.

    Even if researchers achieved a result for the mass of the neutrino, the work would not be done. It would help rule out some theories and models, but there would still be questions, says Patrick Huber, neutrino theorist and director of the Center for Neutrino Physics at Virginia Tech.

    “It’s not like once you have the measurement, you immediately select the right model,” he says. “It’s not like every theorist has predicted a certain neutrino mass, and somebody has the right theory once it’s measured. But it would lead to a whole bunch of broader questions.”

    Huber has dedicated his career to neutrinos for this exact reason. “If they see the neutrino mass, this will be very exciting. But what would be even more exciting is if they’re not seeing the neutrino mass where they should have seen it, because that means something new is happening,” he says. “Then we are forced to really start thinking anew.”

    A ghost worth chasing

    Back to the 1980s—Bowles’s persuasion had worked. “The siren song of Los Alamos was too strong to resist,” Robertson says.

    Robertson’s wife was also interested in heading out west. As a female nuclear physicist in a male-dominated lab in the ’70s, she had faced an uphill battle to navigate her career. Together, husband and wife secured jobs at the lab and flew down to buy a house. “On the glide path into Albuquerque, I still remember the feeling of elation,” Robertson says. “It was a new beginning.”

    A few months later, they officially made the move, with their 6-month-old son in tow.

    In 1972 a Swedish physicist named Karl-Erik Bergkvist had declared a new neutrino mass limit: 55 electronvolts. Robertson, Bowles and their colleagues thought they could do better. Their idea was to study beta decay using tritium, a radioactive isotope of hydrogen. Bergkvist had also used tritium, but it was a form implanted into aluminum. Robertson and Bowles wanted to use it in a more pure, gaseous form.

    The lab administration was open to their project. “They said, ‘How much money do you need?’ Well, I had no idea,” Robertson says. So he threw out a number. “They said, ‘Okay, fine.’ So that was it. Off we went.”

    With that funding—which in the end wasn’t quite enough, but at least got them started—they pushed the limit down to about 10 electronvolts. Importantly, they had proved that tritium decay experiments could work.

    In 1988, Robertson shifted gears and joined the Sudbury Neutrino Observatory.

    In 2001 the group demonstrated neutrino oscillation—a finding that proved neutrinos have mass and that eventually earned the 2015 Nobel Prize in Physics.

    Having confirmed that neutrinos have mass, scientists returned to the quest to build a bathroom scale for neutrinos. The KATRIN experiment [above] was forming, and the collaboration members asked everyone who had worked on a tritium experiment in the past to join. Robertson had moved to the University of Washington by then, but he joined the collaboration happily, even offering to design and provide the detector system for the project.

    Nestled in Karlsruhe, Germany, the KATRIN experiment now relies on contributions from 150 researchers from seven countries.

    Parno is one of those researchers. Like others, she was drawn to the experiment by the thrill of chasing something unknown. “Neutrinos are really weird,” she says. “They keep on surprising and confusing us. I think neutrinos still have a lot to teach us.”

    KATRIN is the most advanced direct mass measurement experiment. The KATRIN collaboration published an exciting result in February 2022: Neutrinos must weigh less than 0.8 electronvolts [Nature Physics (below)].

    Fig. 1: Illustration of the 70-m-long KATRIN beamline.
    The main components are labelled. The transport of β-electrons and magnetic adiabatic collimation of their momenta p are illustrated [a–f]. The view into the tritium source depicts three systematic effects: molecular excitations during β-decay (a), scattering of electrons off the gas molecules (b) and the spatial distribution of the electric potential in the source U_src(r, z) (c). The view into the spectrometer illustrates the main background processes arising from radon decays inside the volume of the spectrometer (d), highly excited Rydberg atoms sputtered off from the structural material via α-decays of 210Po (e) and positive ions created in a Penning trap between the two spectrometers (f). Low-energy electrons, created in the volume as a consequence of radon decays or Rydberg-atom ionizations, can be accelerated by qU_ana towards the focal-plane detector, making them indistinguishable from signal electrons.

    Oscillation experiments provided a floor, and the KATRIN result provides a ceiling. “We are closing in,” Robertson says.

    This range is much tighter than the one from the indirect astrophysical supernova measurement. “The supernova limit is about 5.7 electronvolts, about seven times looser than the current KATRIN limit,” Parno says.

    And it’s close to the limit from cosmological indirect measurements, which is somewhere in the range of 0.12 to 0.5 electronvolts, depending on which parameters are used in the model, Robertson says.
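The oscillation floor follows from the measured mass-squared splittings: at least one neutrino must weigh at least the square root of the largest splitting. A sketch with an assumed atmospheric splitting of about 2.5 × 10⁻³ eV² (an illustrative value from oscillation data):

```python
import math

ATMOSPHERIC_SPLITTING_EV2 = 2.5e-3  # assumed |dm^2| from oscillation data, eV^2
KATRIN_CEILING_EV = 0.8             # direct-measurement upper limit

# The heaviest neutrino must weigh at least sqrt of the largest splitting:
floor_ev = math.sqrt(ATMOSPHERIC_SPLITTING_EV2)
print(f"floor ~ {floor_ev:.3f} eV, ceiling = {KATRIN_CEILING_EV} eV")
```

That leaves a window of roughly 0.05 to 0.8 electronvolts for direct experiments to close.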

    KATRIN’s 0.8 electronvolt number is the new benchmark for all direct mass experiments, but other inventive scientists are on KATRIN’s heels.

    New approaches

    In 2009, Joe Formaggio, a professor at the Massachusetts Institute of Technology, and Benjamin Monreal, an associate professor at Case Western Reserve University, had another tritium-based idea, which became the foundation of the neutrino mass experiment known as Project 8.

    “They proposed just a really beautiful idea,” Robertson says. “It’s one of those ideas where you say: Oh yeah, I wish I thought of that!”

    Robertson has worked with the Project 8 team since its inception. The experiment also uses tritium decay, but Project 8 determines the energy of the emitted electron differently. They measure the frequency of the electron’s cyclotron radiation—the microwave radiation that escapes from charged particles in circular orbit in a magnetic field.

    “The amount of power radiated is really very small, but you can measure it,” Robertson says.
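The frequency being measured is the relativistic cyclotron frequency, f = eB / (2π γ mₑ); the electron's kinetic energy enters through the Lorentz factor γ, which is what turns a frequency measurement into an energy measurement. A sketch at an assumed 1-tesla field:

```python
import math

E_CHARGE = 1.602e-19    # elementary charge, coulombs
M_ELECTRON = 9.109e-31  # electron mass, kilograms
ELECTRON_REST_KEV = 511.0

def cyclotron_frequency_hz(b_tesla, kinetic_kev):
    """Relativistic cyclotron frequency: f = e*B / (2*pi*gamma*m_e)."""
    gamma = 1.0 + kinetic_kev / ELECTRON_REST_KEV
    return E_CHARGE * b_tesla / (2 * math.pi * gamma * M_ELECTRON)

# An 18.6 keV tritium-endpoint electron in a 1 T field radiates near 27 GHz:
f = cyclotron_frequency_hz(1.0, 18.6)
print(f"{f / 1e9:.1f} GHz")
```

Because γ shifts the frequency by only a few percent across the whole beta spectrum, the technique demands extremely precise microwave-frequency measurements, but frequency happens to be one of the quantities physics can measure best.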

    Various aspects of the experiment are still being developed and tested. If the new approach works as planned, the Project 8 team hopes to measure the neutrino mass with a sensitivity of approximately 0.04 electronvolts.

    The third technique currently under investigation as a way to directly measure neutrino mass uses the holmium isotope 163Ho. It’s challenging as well, says researcher Loredana Gastaldo, junior professor at the Kirchhoff Institute for Physics at the University of Heidelberg and spokesperson of the Electron Capture 163Ho, or ECHo, experiment.

    “If you want to learn more about neutrinos, you need to be creative, to find an original and clever method that allows you to really learn something about them,” she says. “The neutrinos don’t give anything up as a present, so you need to sweat a lot to gain a little bit more understanding.”

    The ECHo experiment relies on electron capture events in 163Ho. Scientists implant 163Ho ions in microcalorimeters, a totally different type of detector than the ones used in KATRIN and Project 8. “The idea is that if energy is deposited into the detector, there is an increase of temperature … and we can measure this extremely small increase in temperature with very precise thermometers,” Gastaldo says.
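    The calorimetric principle Gastaldo describes reduces to ΔT = E / C: deposited energy divided by the detector’s heat capacity. A minimal sketch, with a hypothetical heat capacity chosen only to show the scale involved (not an ECHo specification):

    ```python
    JOULES_PER_EV = 1.602176634e-19

    def temperature_rise(deposited_ev: float, heat_capacity_j_per_k: float) -> float:
        """Temperature rise of an ideal microcalorimeter, dT = E / C."""
        return deposited_ev * JOULES_PER_EV / heat_capacity_j_per_k

    # A 163Ho electron-capture event releases up to ~2.8 keV (the Q-value).
    # Assume a tiny cryogenic absorber with C ~ 1e-12 J/K at millikelvin
    # temperatures (an illustrative figure, not a real detector parameter).
    dT = temperature_rise(2800.0, 1e-12)
    print(f"temperature rise: {dT * 1e6:.1f} microkelvin")
    ```

    The smaller and colder the absorber, the smaller its heat capacity, and the larger (hence more measurable) the temperature jump from a single decay.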

    Each project is different, but through human ingenuity (and persistence), direct mass measurement experiments are making strides.

    Let’s get together

    Robertson, who is now a professor emeritus at the Center for Experimental Nuclear Physics and Astrophysics at the University of Washington in Seattle, officially retired in 2017. But he’s still working on Project 8.

    The collaboration is younger than KATRIN and, so far, has done only proof-of-concept experiments. The next step is to create a full-blown large-scale detector.

    “Some days you get up and you learn something that is just not going to work,” Robertson says. “And you say: Oh my goodness, we’re doomed.

    “And then you work for another week or two—or a month or a year—and you talk to your friends, and somebody has an idea, and suddenly the sun comes out again, and that problem is solved, and you move on.

    “I still work at this 24/7 because I really love this experiment.”

    And he’s not alone. Parno, Gastaldo and Huber are just a few of the hundreds of other neutrino experts who have dedicated their careers to finding the mass of the neutrino. And they all rely on each other.

    “We are all learning from all the other groups,” Gastaldo says. “And this makes the collaborations really alive and with super interesting discussions in which all of us are gaining a lot of knowledge.”

    To maximize neutrino physicists’ ability to learn from each other, Gastaldo organized a conference, called “NuMass”, in 2016. The inaugural meeting was a gathering of 40 scientists from the US and Europe, all with different expertise. “The discussions were so deep it was unbelievable,” Gastaldo says.

    The group repeated the conference in 2018, 2020 and 2022.

    “I think it’s fantastically interesting to have so many different fields of knowledge that turn out to be necessary in order to unlock the secrets of this incredibly lightweight, incredibly rarely interacting particle that somehow shaped the universe,” Parno says.

    The optimism of neutrino mass researchers is infectious. They continue, with tangible passion, to push the limits of human ingenuity.

    And there’s always room for newcomers.

    “You have to have young people because they’re the people who actually can get stuff done,” Robertson says. “There’s this huge group of people. I’m just wandering along jumping from the shoulder of one giant to another. That’s part of what makes science fun.”




    Nature Physics
    See the science paper for instructive material with images.

    See the full article here .

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 12:13 pm on January 17, 2023 Permalink | Reply
    Tags: "Energy consumption and cost considerations could shape future of accelerator R&D", A recent report underscores the importance of energy consumption and cost to decisions about future large-scale particle accelerator projects., , Accelerators that can achieve even higher energies than the LHC could lead to our first observations of hypothetical particles like dark matter or produce even more exotic material., Achieving energies beyond the 6.8 trillion electronvolts per beam world record the LHC set in 2022 will require major strides in accelerator R&D., Any new project is a heavy long-term investment., , , , , , Symmetry, The Accelerator Frontier ITF evaluated 24 competing proposals for new accelerator facilities aiming to achieve greater power or increased energy efficiency or a more compact design., The construction of a large-scale particle accelerator like the Large Hadron Collider at CERN requires a staggering investment of time and resources and labor., The group tasked with evaluating proposals for new accelerator facilities as part of the recent US planning process was the Accelerator Frontier Implementation Task Force [ITF]., The mammoth cost and energy requirements of major accelerator projects are not merely political or logistical obstacles somehow separate from the science of particle physics., The quest to achieve higher and higher energies and other performance improvements may ultimately be at odds with the conclusions of this year’s ITF report emphasizing reducing cost and energy., These discussions help the community align on research priorities in areas like neutrino and cosmic and accelerator physics., To make the best use of its efforts in this and other areas of study the particle physics community organizes recurring regional planning exercises., What is the next big thing? Where is the next really exciting physics out there?, Why do researchers believe higher energies could produce new and even heavier particles?   

    From “Symmetry”: “Energy consumption and cost considerations could shape future of accelerator R&D” 

    Symmetry Mag

    From “Symmetry”

    1.17.23
    R.M. Davis

    Illustration by Sandbox Studio, Chicago with Steve Shanabruch.

    A recent report underscores the importance of energy consumption and cost to decisions about future large-scale particle accelerator projects.

    The construction of a large-scale particle accelerator like the Large Hadron Collider at CERN requires a staggering investment of time and resources and labor.

    To make the best use of its efforts in this and other areas of study the particle physics community organizes recurring regional planning exercises. These discussions help the community align on research priorities in areas like neutrino and cosmic and accelerator physics.

    The group tasked with evaluating proposals for new accelerator facilities as part of the recent US planning process was the Accelerator Frontier Implementation Task Force [ITF]. In July, the task force published a preliminary report emphasizing the need for R&D to reduce the cost and energy consumption of the world’s next big particle accelerator project.

    The Accelerator Frontier ITF evaluated 24 competing proposals for new accelerator facilities, many of them aiming to achieve greater power, increased energy efficiency, or a more compact design than their predecessors—or some combination of the three. Accelerator physicist and ITF member Marlene Turner says that’s a lot of proposals for the field, especially since she expects only one or two will ever be built, if any. 

    For Turner, this discrepancy underscores the importance of the planning process. 

    “If you’re only going to build one big facility, or let’s say two, you really want to make sure that what you’re going for is the best possible thing,” Turner says. “Any new project is a heavy long-term investment.”

    Turner says one reason there’s so much accelerator R&D happening in the community today is the simple fact that facility construction of the last major accelerator project, the LHC, was completed more than a decade ago. “There’s the question about, what is the next big thing? Where is the next really exciting physics out there?” 

    Some proposed accelerator designs aim to generate detailed information on already known particles. For example, “Higgs factory” colliders would focus specifically on the Higgs boson. However, the highest-energy designs serve a more exploratory purpose. 

    Particle accelerators have a long history of facilitating groundbreaking physics discoveries. In the late ’60s and early ’70s, accelerator experiments at the US Department of Energy’s SLAC National Accelerator Laboratory provided the first evidence for the existence of quarks, which are fundamental particles—i.e., particles that are not composed of other, smaller particles. Most recently, the LHC enabled the discovery of the Higgs boson. 

    Accelerator experiments have also enabled the discovery of many different types of composite particles, which are made up of two or more fundamental particles. In fact, the unprecedented high-energy experiments conducted by the LHC have led to the discovery of nearly 60 new composite particles in just 10 years of operation. 

    Why do researchers believe higher energies could produce new and even heavier particles? Their logic is based on one of the best-known equations in modern science: “Albert Einstein’s famous formula, E=mc², says that you can make matter out of energy, and that’s what particle colliders do—they kind of apply the energy, and they hope that you can produce the matter,” Turner says. 

    Accelerators that can achieve even higher energies than the LHC could lead to our first observations of hypothetical particles like dark matter, or produce even more exotic material that has yet to enter the realm of theoretical physics. “We’re just putting as much energy as we can on the table and hope to get something that we didn’t expect,” Turner says.

    Achieving energies beyond the world record of 6.8 trillion electronvolts per beam that the LHC set in 2022 will require major strides in accelerator R&D. “It’s pushing every single limit of the technology,” Turner says. “Say you give me infinite money, but you want a 10+ TeV machine.” Turner explains that we simply would not know how to build such a machine, at least not at this point in time. 
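    Turner’s point about E=mc² and beam energy can be made concrete with a little collider arithmetic. The masses and the beam energy below are public ballpark values, used here only for illustration:

    ```python
    # In collider units (c = 1), a particle of mass m (in GeV/c^2) needs at
    # least m GeV of available collision energy to be produced.
    HIGGS_MASS_GEV = 125.25   # Higgs boson mass, approximate
    TOP_MASS_GEV = 172.7      # top quark, heaviest known fundamental particle

    beam_energy_tev = 6.8                                  # LHC record per beam (2022)
    collision_energy_gev = 2 * beam_energy_tev * 1000.0    # two beams, head-on

    for name, mass in [("Higgs boson", HIGGS_MASS_GEV), ("top quark", TOP_MASS_GEV)]:
        print(f"{name}: needs >= {mass} GeV; available: {collision_energy_gev:.0f} GeV")

    # Upper bound on the mass of anything a 13.6 TeV collision could create:
    print(f"maximum mass reach ~ {collision_energy_gev / 1000:.1f} TeV/c^2")
    ```

    In practice only a fraction of the beam energy goes into any single proton-proton interaction, which is one reason heavier hypothetical particles demand ever-higher beam energies.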

    Progress needed in accelerator R&D

    The quest to achieve higher and higher energies and other performance improvements may ultimately be at odds with the conclusions of this year’s ITF report, which emphasizes the importance of R&D centered on reducing cost and energy consumption. ITF chair Thomas Roser notes that these conclusions represent a marked departure from previous planning cycles. 

    “In the past, a lot of the R&D went into performance, to get high luminosity, how do you reach the high energies, all that,” Roser says. “But the cost has now become so big that it’s not clear that society as a whole can afford to do this. Also energy consumption—energy consumption has become in my mind a critical deciding factor as to whether we can ever build another collider.”

    For Roser, the mammoth cost and energy requirements of major accelerator projects are not merely political or logistical obstacles, somehow separate from the science of particle physics. Instead, he views them as a core consideration of accelerator R&D itself. “I am concerned that these machines are getting out of the reach of what society will tolerate if we don’t move aggressively, and with the same priority as performance R&D, towards sustainable operations,” he says. 

    Katsunobu Oide, a Japanese accelerator physicist and one of a handful of ITF members representing the international community, concurs. “We have to always pay attention to the energy efficiency—not on the accelerator gradient, not on the size of the machine,” Oide says. “We have to always pay attention to this efficiency.”

    The accelerator R&D landscape today

    The 24 accelerator proposals evaluated by the ITF vary widely in terms of approach and technical maturity. Some projects, like the International Linear Collider, have been under design development for decades. Others represent newer concepts that could require more than a quarter century of further R&D.

    The ILC provides a useful counterpoint to some of the other accelerator proposals evaluated by the ITF. “ILC is shovel ready, and if you give us the money, we’re going to build it,” Turner says. 

    The idea of the ILC has been around so long, it has become the baseline design for other proposed Higgs factory accelerators. At the same time, funding issues have left the project stalled for years—another reason the physics community has generated so much new accelerator R&D. 

    One alternative to the ILC is the Cool Copper Collider, a linear collider that would replace the traditional accelerating structure of the ILC with cryo-cooled copper, enabling faster beam acceleration and, as a result, a more compact machine.

    Cool Copper Collider. Credit: SLAC.
    A program to build a lepton-collider Higgs factory, to precisely measure the couplings of the Higgs boson to other particles, followed by a higher energy run to establish the Higgs self-coupling and expand the new physics reach, is widely recognized as a primary focus of modern particle physics. We propose a strategy that focuses on a new technology that preliminary estimates suggest can lead to a compact, affordable machine… the Cool Copper Collider (C^3).

    Achieving this would allow the CCC to be less than half the size of the ILC while delivering similar performance. 
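    The size advantage follows from simple scaling: a linear collider’s active length is roughly the target energy divided by the accelerating gradient. The gradients below are approximate public figures used only to illustrate the scaling, not official design parameters:

    ```python
    def linac_length_km(energy_gev: float, gradient_mev_per_m: float) -> float:
        """Approximate linac length (km) to reach a beam energy at a given gradient."""
        return energy_gev * 1000.0 / gradient_mev_per_m / 1000.0

    ILC_GRADIENT = 31.5   # MeV/m, superconducting RF (ILC baseline, approx.)
    C3_GRADIENT = 70.0    # MeV/m, cryo-cooled copper (C^3 target, approx.)

    # 125 GeV per beam for a 250 GeV Higgs factory
    for name, gradient in [("ILC", ILC_GRADIENT), ("C^3", C3_GRADIENT)]:
        print(f"{name}: ~{linac_length_km(125.0, gradient):.1f} km per linac")
    ```

    Roughly doubling the gradient halves the length, which is where the “less than half the size” figure comes from.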

    Another alternative is the Recycling Linear Collider, or ReLiC, an example of an “energy recovery” concept, in which energy and particles generated by the accelerator would be captured and reused for greater energy efficiency.

    Smaller or more energy-efficient colliders may seem like a clear solution to Roser’s energy and cost concerns, but those concepts have their own disadvantages. “The drawback…is the technology is not ready for that,” Roser says. 

    In general, the most technically ambitious proposals require the greatest amount of additional R&D, Roser says. And in many cases, they will incur the greatest costs.

    Looking to the future

    To Turner, ambitious proposals like the ReLiC, as well as other proposals from the ITF evaluation like muon- and plasma-based colliders, underscore the usefulness of accelerator R&D.

    “We’re pushing every possible boundary of technology,” she says. “If you develop better magnets, that’s great for colliders, but it’s also great for industry, for medical applications, for industrial purposes. Touch screens were developed back at CERN for one of their machines. The World Wide Web, also an invention of CERN.” 

    Oide suggests that while appreciating these spin-offs, we shouldn’t underestimate the value of scientific discovery. “Many discoveries in particle physics were done by accelerators—not all, but many,” he says. “Many, many, many physics discoveries need new accelerators.”

    For Roser, it all comes back to energy consumption. “The right way to approach accelerator R&D is building the machines that are much more efficient in energy usage so that you can get the high-performance machine, but it doesn’t use huge amounts of energy,” he says.

    See the full article here .



     
  • richardmitnick 2:23 pm on November 29, 2022 Permalink | Reply
    Tags: "First-time ATLAS measurement provides new look at Higgs", , , , For the first time physicists have a statistically significant measurement of the joint polarization of W and Z bosons., , Higgs in Standard Model of Particle Physics, , , , Symmetry   

    From “Symmetry”: “First-time ATLAS measurement provides new look at Higgs” 

    Symmetry Mag

    From “Symmetry”

    11.29.22
    Madeleine O’Keefe

    Event display depicting a particle collision in the ATLAS detector. Credit: ATLAS Collaboration.

    For the first time physicists have a statistically significant measurement of the joint polarization of W and Z bosons.

    W and Z bosons in a Feynman diagram. Credit: Wikipedia.

    Luka Selem says he was always a curious kid. Growing up in France, he was given copies of Science et vie junior, a science magazine for young people, by his parents.

    “Since I was very young, I was always interested in quite a lot of things,” he says. “I was always asking, ‘Why? But why that? Why that, and then why that?’ I wanted to go all the way to the end. I was never satisfied by the answer.”

    Particle physics, the study of the fundamental particles and forces that make up everything around us, turned out to be a good way for Selem to search for answers. “In particle physics, there is no other ‘why,’” he says. “No one can tell me the rest of the story. I have to find it myself with my colleagues.”

    Selem recently found a new way to search for the rest of the story while doing research for his doctoral thesis. With other physicists on the ATLAS experiment at Europe’s CERN laboratory, Selem successfully made a measurement of particles called W and Z bosons that will allow physicists to further probe the mechanism that grants particles mass.

    How the boson got its polarization

    Bosons are a category of subatomic particles associated with the four fundamental forces in the universe: gravity, electromagnetism, the strong nuclear force and the weak nuclear force. The most familiar boson is the photon, the force carrier responsible for electromagnetism. 

    While photons are massless, massive bosons, such as the W and Z bosons, have properties that set them apart from other particles. One concerns their polarization, the degree to which the quantum spin of the boson is aligned in a given direction. A boson can be transversely or longitudinally polarized. Massive bosons can be longitudinally polarized, meaning that their spin can be oriented perpendicular to their direction of motion. (There are technically two types of transverse polarization—left and right—but they are not yet differentiable in the data for joint polarizations of two bosons.)

    “Among physicists, polarization is something … you’re not sure you understand exactly what it is,” says Selem, who admits he was not very familiar with polarization himself when he chose his PhD project at Laboratoire d’Annecy de Physique des Particules, or LAPP.

    The W and Z bosons enable the weak force, the force responsible for radioactive decay, the same way that photons enable the electromagnetic force. But, unlike photons, W and Z bosons have mass, thanks to their interactions with a field generated by yet another boson, the Higgs.

    Fundamental particles’ interactions with the Higgs field are what physicists believe generate mass. 

    This is what makes polarization so interesting. “It’s a very indirect way of probing the Higgs mechanism,” says Selem.

    W and Z bosons gain mass in a process called electroweak symmetry breaking. When symmetry “breaks,” four particles are created, including the Higgs boson, says Junjie Zhu, a professor at the University of Michigan and an ATLAS physicist. 

    The other three particles are incorporated into the W+, W- and Z bosons, giving them mass and granting them a new longitudinal polarization. “[It is] very important to study the longitudinal components of the W and Z bosons because the longitudinal components are originally from the symmetry breaking,” says Zhu.

    This provides a unique window through which to search for new physics—things that can’t be explained by the current Standard Model of particle physics. Physicists want to measure longitudinal joint polarization more accurately because new physics is “often sensitive to longitudinal components,” says Zhu. 

    Since the 1990s era of the Large Electron-Positron Collider—the predecessor to the Large Hadron Collider at CERN—polarizations of W and Z bosons have been studied individually. But studies of single boson polarization measurements have not revealed any hints of new physics. 

    To keep probing the Standard Model, then, physicists next sought to measure joint polarization: two bosons polarized simultaneously. If the value they measured deviated from the Standard Model prediction, it would be an indicator of new physics, like an unknown aspect of the Higgs mechanism.

    A new measurement

    Selem and his colleagues used ATLAS data taken between 2015 and 2018 during Run 2 of the Large Hadron Collider. The researchers aligned these data with templates based on theoretical predictions for the four different combinations of polarizations. 

    “We have different [templates] from the simulations that tell us what shapes look like for longitudinal-longitudinal, transversal-transversal, longitudinal-transversal and transversal-longitudinal,” says Zhu, who does similar analyses with ATLAS.

    The physicists scaled their four templates to match the histogram of data. The final proportion of each template revealed the fraction of W and Z bosons with each joint polarization. In the analysis, they found that the longitudinal-longitudinal fraction was statistically significant.
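    The template-scaling step can be sketched as a toy fit: express a data histogram as a weighted sum of polarization templates and solve for the weights. Everything below is synthetic (random shapes, noiseless pseudo-data, a plain least-squares solve); the real ATLAS analysis uses a likelihood fit with full uncertainties:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_bins = 20

    # Four normalized template shapes standing in for LL, TT, LT, TL
    templates = np.abs(rng.normal(1.0, 0.5, size=(4, n_bins)))
    templates /= templates.sum(axis=1, keepdims=True)

    # Made-up "true" joint-polarization fractions
    true_fractions = np.array([0.07, 0.60, 0.17, 0.16])
    data = true_fractions @ templates   # noiseless pseudo-data histogram

    # Solve data ~= templates.T @ fractions for the fractions
    fit, *_ = np.linalg.lstsq(templates.T, data, rcond=None)
    print(np.round(fit, 3))
    ```

    With noiseless pseudo-data the fit recovers the input fractions exactly; with real data, the uncertainty on each fraction determines whether a component, such as longitudinal-longitudinal, is statistically significant.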

    The results support the model of the universe as we know it. “With the present sensitivity, no sign of new physics was observed, pushing even further the domain of validity of the Standard Model,” says Emmanuel Sauvan, an ATLAS physicist at LAPP and Selem’s PhD advisor. “This is per se an interesting result, even if I would have preferred myself to see new physics effects.

    “One next step is to refine this measurement, in view of reducing its uncertainties, with more data that we are recording right now [during LHC Run 3].”

    An important step further in the future will be to measure the scattering of two longitudinally polarized bosons. This interaction is even more sensitive to new physics, but researchers need 10–20 times more data than they have now to hope to detect this event. 

    “Such polarization measurements in vector boson scattering will require much more luminosity, and we will have to wait for the High-Luminosity LHC,” says Sauvan, referring to the proposed upgrade of the Large Hadron Collider scheduled for the late 2020s. “It will allow us to fully validate the Higgs mechanism of the Standard Model.”

    For now, the joint polarization result is a significant development. The researchers overcame challenges that had limited previous experiments’ attempts at the measurement, and they developed new analysis techniques, including machine learning algorithms. 

    Prachi Atmasiddha, a graduate student at the University of Michigan working with Zhu, performed event generation for the joint polarization result. “Once we had those events, that was one of the very important [reasons] why this analysis was motivated,” she says. “We [determined that we] can produce these separately polarized events, so we can build models on that, and we can do fitting with these templates of different polarization states.”

    Selem says he looks forward to finding out what physicists are able to do next. “I would say the most important part of this measurement is the roadmap it provides to do further measurements, because we did meet a lot of challenges that will happen in other similar measurements, and we gave ideas on how to solve them.”

    Selem recently accepted a postdoctoral research position in Grenoble, France, where he will work on detector development and exotic searches. There, he and his fellow physicists will continue to seek their own answers to the question, “Why?”

    See the full article here .



     
  • richardmitnick 10:38 am on November 1, 2022 Permalink | Reply
    Tags: "How to maintain a physics experiment in a desert", , Extreme swings in temperature make it difficult to maintain the temperature inside the large vacuum equipment areas of LIGO which must be kept right around 67 degrees Fahrenheit., , , , Ravens ripped seals out of windows on brand-new buildings and even caused glitches in the detector one hot summer. They pecked on the ice that builds up on the liquid nitrogen tubes., Symmetry, The holes the ravens created let rainwater leak through., The project also has to contend with the desert’s animal inhabitants. There are many different creatures roaming the grounds of the Observatory. Some do no harm. Others cause trouble., The project maintenance people have removed venomous scorpions and spiders from buildings and repaired rabbit-chewed wires. They have evicted porcupines., Tumbleweed walls-as much as 10 feet high-clog the 2.5-mile-long roads that run along the arms completely blocking the detector., Tumbleweeds are a problem. They are plants that have evolved detach from their root system and spread their seeds. Desert winds blow these desiccated plant balls into the detector.   

    From “Symmetry”: “How to maintain a physics experiment in a desert” 

    Symmetry Mag

    From “Symmetry”

    11.1.22
    Chris Patrick

    Threats of scorching heat, walls of tumbleweed, and countless critters mean innovation is a must for the facilities manager for LIGO Hanford Observatory.

    Glynn “Bubba” Gateley is not a physicist. And yet, the first thing he does upon waking is check on a physics experiment. The tablet he takes home from work provides constant updates about the HVAC system at the Laser Interferometer Gravitational-wave Observatory, or LIGO Hanford Observatory, in Richland, Washington.

    Before heading to work, Gateley checks his phone to make sure he hasn’t missed any calls about tumbleweeds, porcupines, ravens, or any other desert-dwellers that cause trouble for the LIGO facility.

    Along with its twin site in Livingston, Louisiana, LIGO Hanford Observatory—funded by the National Science Foundation—detects gravitational waves.

    In fact, in 2015 this pair of identical detectors was the first to confirm the existence of these ripples in spacetime caused by cataclysmic astronomical events billions of light-years away.

    The detectors at both LIGO locations are L-shaped structures with two 2.5-mile-long arms. Minute movements of sensitive instruments inside of these arms indicate the arrival of gravitational waves, which carry information about the events that created them, such as exploding supernovae, merging black holes, and colliding neutron stars.
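    Just how minute those movements are can be shown with order-of-magnitude arithmetic. Gravitational-wave strain is a fractional length change, ΔL = h · L; the strain value below is a typical figure for a detectable event, not a specific measurement:

    ```python
    ARM_LENGTH_M = 4000.0   # LIGO arm length: 2.5 miles is about 4 km
    STRAIN = 1e-21          # typical strain h of a detectable event

    delta_L = STRAIN * ARM_LENGTH_M
    print(f"arm length change: {delta_L:.1e} m")

    # For scale: a proton is roughly 1.7e-15 m across
    PROTON_DIAMETER_M = 1.7e-15
    print(f"fraction of a proton diameter: {delta_L / PROTON_DIAMETER_M:.4f}")
    ```

    The displacement works out to about 4×10⁻¹⁸ meters, a few thousandths of a proton’s diameter, which is why even tumbleweeds, ravens and temperature swings matter to the instrument.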

    LIGO Hanford Observatory lives in a unique environment. The low seismic activity of Richland makes it a quiet place optimal for detecting weak signals. But a trifecta of vacillating temperatures, persistent flora and resourceful fauna makes it a challenge to maintain the delicate equipment scientists use to do it.

    As the observatory’s facilities manager since 2014, Gateley is one of the people who must face this challenge. With his small staff of six, he is responsible for most of the facility’s logistics, including water, electricity, roadways, cleaning, and the aforementioned HVAC system.

    “I keep the facility functional and operational for pretty much everything but the instrument itself,” he says. “There’s definitely some interesting situations out here, to say the least.”

    Beating the heat (and the cold)

    Richland is situated east of Washington’s Cascade Mountains, in an arid shrub steppe ecosystem.

    “Usually when people think of Washington, they think of the rainforests of the Pacific Northwest, right in Seattle. That’s not the case in Richland,” says Michael Landry, head of LIGO Hanford Observatory and a physicist at the California Institute of Technology. “It’s starkly beautiful.”

    Of course, deserts mean hot temperatures. In the summer, many days in Richland are over 100 degrees Fahrenheit, sometimes reaching 118. But at night, the temperature can drop significantly. And in colder months of the year, it can plummet to zero. There are even snow days.

    These extreme swings in temperature outside make it difficult to maintain the temperature inside the large vacuum equipment areas of LIGO, which must be kept right around 67 degrees Fahrenheit.

    “If the temperature starts varying too much in the vacuum equipment areas, then the optics for the instrument start changing drastically and the scientists get upset,” Gateley says. “And then I get a lot of calls.”

    To keep the temperature steady, he does a lot of monitoring, fine-tuning, and finessing of the HVAC system. That’s why every day starts with Gateley checking its status.

    Tumbling a wall of weeds

    While temperature control is a 24/7 concern, Gateley also regularly grapples with another hallmark of the desert, tumbleweeds.

    A single tumbleweed may look innocuous enough, but at LIGO Hanford Observatory, these dust bunnies of the wild west are a nuisance. “Tumbleweeds are one of our biggest natural challenges,” Gateley says.

    Tumbleweeds are actually plants that have evolved to dry out, detach from their root system and spread their seeds as they roll along. Strong desert winds blow these desiccated plant balls into the arms of the observatory’s detector, where they get stuck. They build up quickly, forming walls that can be 10 feet tall.

    These tumble-walls clog the 2.5-mile-long roads that run along the arms, completely blocking the detector.

    However, the tanks along the detector’s arms need to be accessed on a weekly basis. The tanks hold liquid nitrogen at minus 320 degrees Fahrenheit, which the detector needs to keep its arms under vacuum.

    To clear these roads for the semitruck delivering liquid nitrogen, the observatory’s maintenance staff used to bale these tumbleweeds like hay, feeding the baler by hand. “It was a very slow process,” Gateley says. “I got to thinking and thinking, trying to figure out some way to expedite that.”

    Eventually, he thought of farm equipment. Specifically, a harvester, which drives through fields of corn or wheat and pulls up stalks.

    But if you drive a harvester into a tumbleweed, it just pushes it forward. So five years ago, Gateley purchased a harvester and added a modified hay reel to the front, creating the LIGO Franken-harvester.

    Its tined wheel grabs tumbleweeds and pulls them into the threshing cylinder, which grinds them up and shoots them out. “With the harvester we can drive down the road ten times faster than the baler ever could,” Gateley says.

    Cohabitating with critters

    Plants aren’t the only organisms that have required Gateley to get innovative with his job. In addition to being an HVAC expert, tumbleweed destroyer, and general handyman, Gateley also has to contend with the desert’s animal inhabitants. There are many different creatures roaming the grounds of LIGO Hanford Observatory. Some of these, like coyotes and deer, do no harm. Others cause trouble.

    Gateley has removed venomous scorpions and spiders from buildings. He has repaired rabbit-chewed wires. And he has evicted porcupines who took to snoozing on top of a trellis next to a building on the observatory’s campus.

    Porcupines would climb the wisteria vines threaded through the trellis and fall asleep at the top. Although the animals are nocturnal, daytime public tours departing from the building sometimes woke the porcupines up, at which point they relieved themselves before settling back to sleep.

    Luckily, Gateley noticed incriminating stains on the concrete before any passers-by received an unpleasant surprise from above.

    At first, he just tried to shoo the porcupines away, but they returned undeterred. So, again, Gateley turned to another field for a solution. This time, it was shipping.

    Shipyard workers attach disks to the ropes that tie boats to docks to prevent rats from using the lines to climb aboard. “I made some of those and clamped them around the wisteria vines so the porcupines couldn’t climb up there,” Gateley says.

    Problem solved.

    Another animal, however, has proved more bothersome. Ravens have ripped seals out of windows on brand-new buildings, and even caused glitches in the detector during one hot summer when they pecked at the ice that builds up on the liquid nitrogen tubes.

    The birds also started pecking out the caulk that joins the concrete sections enclosing the detector arms. The holes they created let rainwater leak through. “It doesn’t rain much here, but we don’t like any moisture to get under there because it could eventually create issues,” Gateley says.

    A biologist helped him figure out what the ravens were trying to do: reach mice underneath the concrete enclosure. So Gateley covered the caulk with thin aluminum strips.

    “It deterred them somewhat,” he says. “I hesitate to say it stopped them completely.”

    Despite all of the trouble it creates for him, Gateley says he much prefers the desert to the climate of Louisiana, home of the other LIGO detector, where he coincidentally grew up.

    “I do not miss the humidity,” he says. “And the desert is intriguing.”

    Although he’s lived near Richland for over 40 years now, Gateley says he never knows what the desert will throw at him next. History suggests he’ll be able to figure out a solution anyway.

    “It’s one of the most challenging jobs I’ve ever had,” he says. “But definitely the most interesting. I enjoy it immensely.”

    See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 9:48 am on November 1, 2022 Permalink | Reply
    Tags: , , , , , Cosmic Microwave Background-Stage 4 [CMB-S4] project, , In the 1960s an anomalous and faint electromagnetic glow was observed across the entire sky by Arno Penzias and Robert Wilson with the AT&T Big Horn in Holmdel New Jersey., Many experiments both space- and ground-based are already studying the CMB., Physicists later determined that the light came from the very early universe released when the first atoms formed shortly after the Big Bang., , Symmetry, The CMB-S4 collaboration plans to use hundreds of thousands of superconducting bolometers as their detectors., The current image of the CMB was caught by the ESA Planck space satellite.   

    From “Symmetry”: “The next stage of cosmic microwave background research” 

    Symmetry Mag

    From “Symmetry”

    10.18.22
    Madeleine O’Keefe

    Artwork by Sandbox Studio, Chicago.

    With CMB-S4, scientists hope to connect a sandy desert with a polar desert—and revolutionize our understanding of the early universe.

    In the 1960s, an anomalous, faint electromagnetic glow was observed across the entire sky.

    Physicists later determined that the light came from the very early universe, released when the first atoms formed shortly after the Big Bang.

    We now call this relic radiation the cosmic microwave background, or CMB. Studying it is one of the highest priority pursuits in cosmology.

    “The fantastic thing about these [photons] is they have experienced the entire history of the universe,” says Julian Borrill, a senior scientist at the DOE’s Lawrence Berkeley National Laboratory. “And everything that has ever happened in the universe has left a tiny imprint on those photons; it’s changed their distribution and their energies slightly in all kinds of subtle ways.

    “If we can measure them with enough precision and understand their statistics, we can tease out the entire history of the universe.”

    Many experiments, both space- and ground-based, are already studying the CMB. Now scientists are developing plans for an ambitious project that would increase the sensitivity of all these searches combined by a factor of 10.

    Called Cosmic Microwave Background-Stage 4, the project would comprise an array of small- and large-aperture telescopes deployed in Chile and at the South Pole. Building it would require unprecedented cooperation between two funding agencies and three scientific communities: astronomy, particle physics and polar science.

    If scientists can pull it off, CMB-S4 will connect a sandy desert with a polar desert to address major astronomical questions.

    “We’re going back to look for physics from the dawn of time and test the model for how our whole universe was created,” says John Carlstrom, a CMB-S4 project scientist and professor at the University of Chicago. “From that, we also learn a great deal about what’s in the universe, how it evolved from these quantum fluctuations to all the structure we see.

    “We’re developing the full story of the universe from its infancy and creation to the present day.”

    The science goals

    For many of the more than 400 scientists across 121 institutions worldwide who are part of the CMB-S4 collaboration, the most intriguing goal of the experiment is its search for evidence of cosmic inflation.

    ___________________________________________________________________
    Cosmic Inflation Theory

    In physical cosmology, “cosmological inflation” is a theory of exponential expansion of space in the early universe. The inflationary epoch lasted from 10^−36 seconds after the conjectured Big Bang singularity to some time between 10^−33 and 10^−32 seconds after the singularity. Following the inflationary period, the universe continued to expand, but at a slower rate. The acceleration of this expansion due to dark energy began after the universe was already over 7.7 billion years old (5.4 billion years ago).

    Inflation theory was developed in the late 1970s and early ’80s, with notable contributions by several theoretical physicists, including Alexei Starobinsky at the Landau Institute for Theoretical Physics, Alan Guth at Cornell University, and Andrei Linde at the Lebedev Physical Institute. Alexei Starobinsky, Alan Guth, and Andrei Linde won the 2014 Kavli Prize “for pioneering the theory of cosmic inflation.” The theory explains the origin of the large-scale structure of the cosmos: quantum fluctuations in the microscopic inflationary region, magnified to cosmic size, become the seeds for the growth of structure in the universe. Many physicists also believe that inflation explains why the universe appears to be the same in all directions (isotropic), why the cosmic microwave background radiation is distributed evenly, why the universe is flat, and why no magnetic monopoles have been observed.

    The detailed particle physics mechanism responsible for inflation is unknown. The basic inflationary paradigm is accepted by most physicists, as a number of inflation model predictions have been confirmed by observation;[a] however, a substantial minority of scientists dissent from this position. The hypothetical field thought to be responsible for inflation is called the inflaton.

    In 2002 three of the original architects of the theory were recognized for their major contributions; physicists Alan Guth of M.I.T., Andrei Linde of Stanford, and Paul Steinhardt of Princeton shared the prestigious Dirac Prize “for development of the concept of inflation in cosmology”. In 2012 Guth and Linde were awarded the Breakthrough Prize in Fundamental Physics for their invention and development of inflationary cosmology.

    Alan Guth, from M.I.T., who first proposed Cosmic Inflation.

    Alan Guth’s notes:
    Alan Guth’s original notes on inflation.
    ___________________________________________________________________

    Cosmic inflation is a hypothetical event in which the universe rapidly expanded. “We think that inflation is one of the many hints for resolving the inconsistency between our two great theories of physics,” General Relativity and Quantum Mechanics, says Borrill, who serves as the CMB-S4 project data scientist.

    Cosmic inflation would also explain, among other things, why areas of the universe that otherwise should not have ever been close enough together to affect one another still seem suspiciously similar.

    The inflation process should have released gravitational waves, fluctuations in space-time that CMB-S4 is designed to detect.

    Either detecting or ruling out the presence of primordial gravitational waves “would be a huge advance for our knowledge of the universe,” says Jeff McMahon, an associate professor at the University of Chicago and co-spokesperson for the CMB-S4 collaboration.

    But an experiment of this scale and sensitivity would have the potential to do much more, including discover unknown subatomic particles from the early universe, explore the nature of dark matter and dark energy, map the matter in the cosmos, and capture transient phenomena in the microwave sky.

    “I think the richness of the dataset means that it’s going to lead us in new directions, and those directions could be something new and exciting,” McMahon says. “I think there’s room for surprises.”

    The telescopes

    CMB-S4 is planning to place an array of microwave telescopes at two sites that have been vetted for their scientific value: the Atacama Plateau in Chile and the South Pole. The Simons Observatory, under construction in Chile, and the South Pole Observatory, operating in Antarctica, are among precursor “Stage-3” CMB experiments that could provide a solid basis for the development of CMB-S4.

    To fulfill some of CMB-S4’s scientific goals, scientists will need to look at the same patch of sky for a long time, and the South Pole is conveniently oriented for this, as its view of the sky changes very little over the course of the year. Scientists plan to host at least nine small-aperture telescopes, each 0.5 meters in diameter, and one 5-meter large-aperture telescope at CMB-S4’s South Pole site to conduct an ultra-deep survey of 3% of the sky.

    Other goals require scientists to collect data from a very large area of sky; the Chile site is well suited for this. At the CMB-S4 site in the Chilean Atacama Desert, scientists plan to use two 6-meter large-aperture telescopes to conduct a deep and wide survey of 70% of the sky.
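    For a rough sense of scale, those sky fractions can be converted into areas. The sketch below assumes a full sky of about 41,253 square degrees, a standard figure that is not quoted in the article:

```python
FULL_SKY_SQ_DEG = 41_253  # approximate area of the full sky in square degrees

# Survey fractions quoted for the two CMB-S4 sites
ultra_deep = 0.03 * FULL_SKY_SQ_DEG  # South Pole: ultra-deep survey of 3% of the sky
deep_wide = 0.70 * FULL_SKY_SQ_DEG   # Chile: deep-and-wide survey of 70% of the sky

print(f"Ultra-deep survey:    ~{ultra_deep:,.0f} square degrees")
print(f"Deep-and-wide survey: ~{deep_wide:,.0f} square degrees")
```

    That works out to roughly 1,200 square degrees surveyed very deeply from the South Pole versus nearly 29,000 square degrees covered from Chile.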

    The CMB-S4 collaboration plans to use hundreds of thousands of superconducting bolometers as their detectors. “The thing that makes these experiments sensitive is the number of detectors that they have in their focal plane,” says Kevin Huffenberger, a professor at Florida State University and co-spokesperson for the CMB-S4 collaboration. “[Even with] a better detector, you’re still looking through the same atmosphere, so it doesn’t really help… You have to build more detectors so that you can average down the atmosphere over more detectors.”
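    Huffenberger’s point, that sensitivity comes from averaging over many detectors, can be illustrated with a toy simulation. The sketch below treats each detector’s noise as independent Gaussian noise; real atmospheric noise is partially correlated across detectors, so the actual gain is smaller, and the numbers here are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
N_SAMPLES = 50_000  # time samples per detector (illustrative)

def averaged_noise_std(n_detectors: int) -> float:
    """Noise level left after averaging n_detectors independent streams."""
    noise = rng.standard_normal((n_detectors, N_SAMPLES))
    return float(noise.mean(axis=0).std())

single = averaged_noise_std(1)
hundred = averaged_noise_std(100)

# Uncorrelated noise averages down as 1/sqrt(N), so 100 detectors
# suppress it by roughly a factor of 10 relative to one detector.
print(f"1 detector: {single:.3f}   100 detectors: {hundred:.3f}")
```

    This 1/sqrt(N) scaling is why the collaboration is pursuing hundreds of thousands of bolometers rather than a handful of marginally better ones.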

    The design for CMB-S4, he says, offers “a big step up in the sensitivity, which allows it to do things that the other experiments couldn’t.”

    Progress and planning

    Three major studies have endorsed CMB-S4. It was recommended in 2014 by the Particle Physics Project Prioritization Panel [P5], which outlines priorities for US particle physics; in 2015 by the National Academies report A Strategic Vision for NSF Investments in Antarctic and Southern Ocean Research, which defines the goals for the National Science Foundation Office of Polar Programs; and in November 2021 by the National Academies report Pathways to Discovery in Astronomy and Astrophysics, which outlines priorities for US astronomy and astrophysics in the coming decade.

    The CMB-S4 team is planning for the project to be a partnership between the National Science Foundation and the US Department of Energy. In 2019, DOE formally established the scientific need for CMB-S4, and in 2020 it designated Lawrence Berkeley National Laboratory as the host laboratory. The funding agencies require additional reviews before approving the start of construction.

    The seemingly slow progress is not unusual for an endeavor of this size, according to those involved. It’s a big project, and it takes a lot of time for the agencies to ensure that the project is well justified, says Huffenberger.

    Marcelle Soares-Santos, an assistant professor at the University of Michigan and a convener of the group that focused on astrophysics at Snowmass2021, is also unsurprised at the pace. “We all would like it to go faster, but I think it’s not unexpected to be at this stage,” she says. “There’s a reason why it requires support from the entire community—because it requires knowledge and expertise and resources that come from different corners of the community.”

    The CMB-S4 team must also consider the allocation of the limited resources available at the NSF-managed South Pole Station, which, due to its remoteness and extreme cold, is accessible for only a few months a year. To optimally match the capabilities of CMB-S4 to the logistical constraints of deploying and operating the telescopes at the South Pole, the balance between the number of telescopes placed at each site is currently being reexamined through an “analysis of alternatives,” one of the many agency requirements that must be met before the project can be considered and approved for construction.

    “We’re going over the different changes to the instrument configurations we could make that would still meet our instrument requirements [in order] to understand what fits best within the logistical footprint—for the South Pole Station especially, but also the project overall,” McMahon says.

    That process should wrap up this year, he says, “then we should be ready to seek funding to move forward toward the next stages of design and then construction, with operations in the early 2030s.”

    See the full article here.




     
  • richardmitnick 12:48 pm on October 25, 2022 Permalink | Reply
    Tags: "Physicists work to bring more undergrads into research", "STEM": Science Technology Engineering and Math, , , Despite challenges some Physics faculty at predominantly undergraduate institutions make research experiences available to students., Most undergraduate students who get the chance to participate in experiments attend institutions that offer doctoral degrees and receive a high level of federal support for research., One way for faculty at smaller institutions to bring undergraduate students into research is for larger institutions to offer to share resources., , Primarily undergraduate institutions are not usually set up for research., Siena College for example has joined forces with Cornell University., Some primarily undergraduate institutions also offer funding for faculty research., Symmetry, The US Department of Energy and National Science Foundation provide some financial support to faculty at primarily undergraduate institutions., Undergraduate institutions lack the infrastructure and equipment available at institutions with more federal research dollars.   

    From “Symmetry”: “Physicists work to bring more undergrads into research” 

    Symmetry Mag

    From “Symmetry”

    10.25.22
    Maddi Langweil

    Illustration by Sandbox Studio, Chicago with Steve Shanabruch.

    Despite challenges, some physics faculty at predominantly undergraduate institutions make research experiences available to students.

    Sneha Dixit kept herself company during a 25-hour trip—by plane, train and taxi, from Bengaluru, India, to Chestertown, Maryland—with Stephen Hawking’s popular science book The Grand Design. She was on her way to join the class of 2024 at Washington College.

    Dixit knew she wanted to study physics like Hawking. But it was not until she began her degree that she appreciated what the work of being a physicist was actually like.

    “I’ve wanted to be a physicist for almost seven or eight years now,” she says. “And when I got to university, I realized physicists don’t just sit in front of textbooks and solve equations. They do research, that’s how they find out stuff, that’s what makes them physicists.”

    That’s what Dixit wanted to do, too.

    But most undergraduate students who get the chance to participate in experiments attend institutions that offer doctoral degrees and receive a high level of federal support for research. Dixit is at a small liberal-arts institution that mostly offers four-year degrees. Faculty at primarily undergraduate institutions like Washington College face a number of obstacles to providing research experience to their students.

    Fortunately for Dixit, Washington College Physics Professor Suyog Shrestha found a way around those barriers, at least for a small group of students. With Shrestha, Dixit was able to analyze data from the ATLAS experiment at the Large Hadron Collider.

    Physicists who participated in a planning process for the future of US high-energy physics—called Snowmass—recently championed efforts like these in a white paper, “Enhancing HEP research in predominantly undergraduate institutions and community colleges.”

    To accomplish their goal of making research experience available to a wider range of undergraduate students, they wrote, they will need the support of college and university administrations, experimental collaborations and funding agencies.

    Illustration by Sandbox Studio, Chicago with Steve Shanabruch.

    Learning through doing

    Conducting research is about learning how to find answers outside of a textbook, says Matt Bellis, an associate professor of physics at Siena College just outside Albany, New York.

    When it comes to big questions like why the universe is mostly made up of matter and not antimatter, “you can’t just go to a book and look up how to solve it,” Bellis says. “You need to become creative about how you tackle these problems for which there’s no answer.”

    Siena College is a private Franciscan college with a student body of just over 3,000 undergraduates. But Bellis makes sure his students have access to physics research.

    For students who go on to earn graduate degrees in physics, that head start is important, Shrestha says. “High-energy physics is a very challenging field in the sense where it requires years of preparation before someone can be productive.”

    Beyond bringing a student up to speed in an experimental collaboration, research experience can also make a career in physics seem more achievable. “I think I understand what physics is better than I used to,” says Josephine Swann, an undergraduate physics student at Siena. “I didn’t imagine it was so connected to so many other fields. I was very intimidated by physics before going into it, but it has been so much more accessible than I had hoped.”

    Working with Bellis, Swann recently performed calculations related to a search for dark matter particles by the CMS experiment at the LHC.

    The experience introduced her to a wider research community than she knew existed.

    “I’ve learned from so many people,” she says. “Physicists from projects at the LHC, my professors, other students, and people in non-physics fields like biology and chemistry have been so helpful. I had no idea how much went into a comprehensive calculation like [the one] I did, and ours is still full of uncertainties.

    “I had just finished [introductory physics class] ‘General Physics I’ when I started working on this project, and I can’t even begin to say how much I’ve learned.”

    Even students who do not continue to study physics can benefit from early exposure to scientific research, says Sudhir Malik, a professor of physics at the University of Puerto Rico-Mayaguez who worked on the Snowmass white paper. “Research is something that brings additional technology and additional skills, which are not part of the academic curriculum,” he says.

    Research at primarily undergraduate institutions

    Faculty at Carnegie-classified research institutions and doctoral institutions are expected to spend the majority of their time on research, typically teaching two or fewer courses per year. Graduate students are responsible for instructing many undergraduate classes.

    Faculty at primarily undergraduate institutions, by contrast, are expected to spend most of their time teaching, and they are evaluated based on their teaching ability. “Teaching is incredibly easy, unless you want to do it well,” Bellis says. “To teach well, you have to put time and effort into it.”

    The larger class load that faculty at primarily undergraduate institutions are responsible for can make it difficult for them to travel to attend experimental collaboration meetings or to fulfill service responsibilities such as data-taking, detector maintenance or computer operations.

    This blocks many undergraduate students from research, Malik says. “If the professors at [predominantly undergraduate institutions] do research, only then students can participate.”

    Primarily undergraduate institutions are not usually set up for research. Important upper-level classes that physics students need are either not offered or offered infrequently, due to the small number of students who sign up to take them. In addition, undergraduate institutions lack the infrastructure and equipment available at institutions with more federal research dollars.

    At research-focused institutions, graduate students and postdoctoral researchers support faculty. Supervising and mentoring them takes time and effort, but it also allows professors to delegate tasks.

    By contrast, professors at institutions where there are no graduate students or postdocs must move at a slower pace. “If you really want to give the student a good experience, it takes time to teach the students the basics,” Bellis says. “And that time that you’re teaching them, you could just be doing [the research] yourself.”

    All of this leaves faculty at primarily undergraduate institutions at a disadvantage when competing against faculty at other institutions for funding needed to cover expenses like collaboration membership fees, pay for student researchers, and access to equipment and computing resources.

    “If the university wants to set up something, some kind of research, they always need external funding,” Malik says. “And external funding is a challenge for us.”

    Illustration by Sandbox Studio, Chicago with Steve Shanabruch.

    Making it work

    One way for faculty at smaller institutions to bring undergraduate students into research is for larger institutions to offer to share resources.

    Siena College, for example, has joined forces with Cornell University, which provides the help of graduate students and postdocs, along with technical and administrative support and an already-established connection with the CMS experiment and its host laboratory, CERN.

    Similarly, Shrestha, who is also an adjunct assistant professor at The Ohio State University, uses his connection with the larger institution to continue his research on the ATLAS experiment and bring in students from Washington College.

    Physicists suggest in the Snowmass white paper that experimental collaborations could ease the burden on professors from smaller institutions by charging them less to participate, by offering different levels of membership, and by offering funding for travel. They could also allow faculty with heavier teaching loads to do service work remotely.

    Funding agencies currently do not allow professors to use grants to pay their institutions in exchange for relief from some of their teaching responsibilities. Allowing faculty with heavier teaching loads to do so would help professors at smaller institutions, the white paper says.

    The US Department of Energy and National Science Foundation provide some financial support to faculty at primarily undergraduate institutions, Malik says.

    NSF offers funding opportunities called Facilitating Research at Primarily Undergraduate Institutions, designed for schools that don’t have internal funding for research. The DOE Office of Science provides funding for faculty and students from institutions historically underrepresented in research through the Visiting Faculty Program, which enables faculty to collaborate during the summer with scientific and technical staff at national labs. Starting in spring 2023, as part of the Office of Science RENEW initiative, the VFP will offer spring and fall terms as well.

    Some primarily undergraduate institutions also offer funding for faculty research. Shrestha has supported summer research students, including Dixit, with a small stipend from the John S. Toll Summer Student Research Program at Washington College.

    For Dixit, who is on track to graduate early, in 2023, the opportunity has been transformative.

    “This is something I wouldn’t mind doing for the rest of my life,” she says. “And whatever hunch I had about wanting to be a physicist, it was right.”

    See the full article here.




     
  • richardmitnick 11:40 am on October 18, 2022 Permalink | Reply
    Tags: , , , , , , , , , Symmetry, With CMB-S4 scientists hope to connect a sandy desert with a polar desert—and revolutionize our understanding of the early universe.   

    From “Symmetry”: “The next stage of cosmic microwave background research” 

    Symmetry Mag

    From “Symmetry”

    10.18.22
    Madeleine O’Keefe

    1
    Artwork by Sandbox Studio, Chicago.

    With CMB-S4, scientists hope to connect a sandy desert with a polar desert—and revolutionize our understanding of the early universe.

    In the 1960s, an anomalous, faint electromagnetic glow was observed across the entire sky. Physicists later determined that the light came from the very early universe, released when the first atoms formed shortly after the Big Bang.

    We now call this relic radiation the Cosmic microwave Background, or CMB. Studying it is one of the highest priority pursuits in cosmology.

    “The fantastic thing about these [photons] is they have experienced the entire history of the universe,” says Julian Borrill, a senior scientist at the DOE’s Lawrence Berkeley National Laboratory. “And everything that has ever happened in the universe has left a tiny imprint on those photons; it’s changed their distribution and their energies slightly in all kinds of subtle ways.

    “If we can measure them with enough precision and understand their statistics, we can tease out the entire history of the universe.”

    Many experiments, both space- and ground-based, are already studying the CMB. Now scientists are developing plans for an ambitious project that would multiply by 10 the sensitivity of all these searches combined.

    Called Cosmic Microwave Background-Stage 4, the project would comprise an array of small- and large-aperture telescopes deployed in Chile and at the South Pole. Building it would require unprecedented cooperation between two funding agencies and three scientific communities: astronomy, particle physics and polar science.

    If scientists can pull it off, CMB-S4 will connect a sandy desert with a polar desert to address major astronomical questions.

    “We’re going back to look for physics from the dawn of time and test the model for how our whole universe was created,” says John Carlstrom, a CMB-S4 project scientist and professor at the University of Chicago. “From that, we also learn a great deal about what’s in the universe, how it evolved from these quantum fluctuations to all the structure we see.

    “We’re developing the full story of the universe from its infancy and creation to the present day.”

    The science goals

    For many of the over 400 scientists across 121 worldwide institutions who are part of the CMB-S4 collaboration, the most intriguing goal of the experiment is its search for evidence of cosmic inflation.

    ___________________________________________________________________
    Cosmic Inflation Theory

    In physical cosmology, cosmic inflation, cosmological inflation is a theory of exponential expansion of space in the early universe. The inflationary epoch lasted from 10^−36 seconds after the conjectured Big Bang singularity to some time between 10^−33 and 10^−32 seconds after the singularity. Following the inflationary period, the universe continued to expand, but at a slower rate. The acceleration of this expansion due to dark energy began after the universe was already over 7.7 billion years old (5.4 billion years ago).

    Inflation theory was developed in the late 1970s and early 80s, with notable contributions by several theoretical physicists, including Alexei Starobinsky at Landau Institute for Theoretical Physics, Alan Guth at Cornell University, and Andrei Linde at Lebedev Physical Institute. Alexei Starobinsky, Alan Guth, and Andrei Linde won the 2014 Kavli Prize “for pioneering the theory of cosmic inflation.” It was developed further in the early 1980s. It explains the origin of the large-scale structure of the cosmos. Quantum fluctuations in the microscopic inflationary region, magnified to cosmic size, become the seeds for the growth of structure in the Universe. Many physicists also believe that inflation explains why the universe appears to be the same in all directions (isotropic), why the cosmic microwave background radiation is distributed evenly, why the universe is flat, and why no magnetic monopoles have been observed.

    The detailed particle physics mechanism responsible for inflation is unknown. The basic inflationary paradigm is accepted by most physicists, as a number of inflation model predictions have been confirmed by observation;[a] however, a substantial minority of scientists dissent from this position. The hypothetical field thought to be responsible for inflation is called the inflaton.

    In 2002 three of the original architects of the theory were recognized for their major contributions; physicists Alan Guth of M.I.T., Andrei Linde of Stanford, and Paul Steinhardt of Princeton shared the prestigious Dirac Prize “for development of the concept of inflation in cosmology”. In 2012 Guth and Linde were awarded the Breakthrough Prize in Fundamental Physics for their invention and development of inflationary cosmology.

    4
    Alan Guth, from M.I.T., who first proposed Cosmic Inflation.

    Alan Guth’s notes:
    Alan Guth’s original notes on inflation.
    ___________________________________________________________________

    Cosmic inflation is a hypothetical event in which the universe rapidly expanded. “We think that inflation is one of the many hints for resolving the inconsistency between our two great theories of physics,” General Relativity and Quantum Mechanics, says Borrill, who serves as the CMB-S4 project data scientist.

    Cosmic inflation would also explain, among other things, why areas of the universe that otherwise should not have ever been close enough together to affect one another still seem suspiciously similar.

    The inflation process should have released gravitational waves, fluctuations in space-time that CMB-S4 is designed to detect.

    Either detecting or ruling out the presence of primordial gravitational waves “would be a huge advance for our knowledge of the universe,” says Jeff McMahon, an associate professor at the University of Chicago and co-spokesperson for the CMB-S4 collaboration.

    But an experiment of this scale and sensitivity would have the potential to do much more, including discover unknown subatomic particles from the early universe, explore the nature of dark matter and dark energy, map the matter in the cosmos, and capture transient phenomena in the microwave sky.

    “I think the richness of the dataset means that it’s going to lead us in new directions, and those directions could be something new and exciting,” McMahon says. “I think there’s room for surprises.”

    The telescopes

    CMB-S4 is planning to place an array of microwave telescopes at two sites that have been vetted for their scientific value: the Atacama Plateau in Chile and the South Pole. The Simons Observatory, under construction in Chile, and the South Pole Observatory, operating in Antarctica, are among precursor “Stage-3” CMB experiments that could provide a solid basis for the development of CMB-S4.

    To fulfill some of CMB-S4’s scientific goals, scientists will need to look at the same patch of sky for a long time, and the South Pole is conveniently oriented for this, as its view of the sky changes very little over the course of the year. Scientists plan to host at least nine small-aperture telescopes, each 0.5 meters in diameter, and one 5-meter large-aperture telescope at CMB-S4’s South Pole site to conduct an ultra-deep survey of 3% of the sky.

    Other goals require scientists to collect data from a very large area of sky; the Chile site is well suited for this. At the CMB-S4 site in the Chilean Atacama Desert, scientists plan to use two 6-meter large-aperture telescopes to conduct a deep and wide survey of 70% of the sky.

    The CMB-S4 collaboration plans to use hundreds of thousands of superconducting bolometers as their detectors. “The thing that makes these experiments sensitive is the number of detectors that they have in their focal plane,” says Kevin Huffenberger, a professor at Florida State University and co-spokesperson for the CMB-S4 collaboration. “[Even with] a better detector, you’re still looking through the same atmosphere, so it doesn’t really help… You have to build more detectors so that you can average down the atmosphere over more detectors.”

    The design for CMB-S4, he says, offers “a big step up in the sensitivity, which allows it to do things that the other experiments couldn’t.”
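Huffenberger’s point about averaging down atmospheric noise follows from basic statistics: if each of N detectors sees the same sky signal plus independent noise, the noise on their average falls as roughly 1/√N. A toy simulation (illustrative only, not CMB-S4’s actual noise model; the function name is ours) makes this concrete:

```python
import math
import random

def averaged_noise_rms(n_detectors: int, sigma: float = 1.0, trials: int = 2000) -> float:
    """Empirical RMS of the mean of n_detectors independent Gaussian noise samples."""
    total = 0.0
    for _ in range(trials):
        mean = sum(random.gauss(0.0, sigma) for _ in range(n_detectors)) / n_detectors
        total += mean * mean
    return math.sqrt(total / trials)

random.seed(0)
for n in (1, 100, 400):
    # Expect roughly sigma / sqrt(n): ~1.0, ~0.1, ~0.05
    print(n, round(averaged_noise_rms(n), 3))
```

Going from hundreds of detectors to hundreds of thousands is what buys the "big step up in the sensitivity" when the atmosphere, not the detector, sets the noise floor.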

    Progress and planning

    Three major studies have endorsed CMB-S4. It was recommended in 2014 by the Particle Physics Project Prioritization Panel, which outlines priorities for US particle physics; in 2015 by the National Academies report A Strategic Vision for NSF Investments in Antarctic and Southern Ocean Research, which defines the goals for the National Science Foundation Office of Polar Programs; and in November 2021 by the National Academies report Pathways to Discovery in Astronomy and Astrophysics, which outlines priorities for US astronomy and astrophysics in the coming decade.

    The CMB-S4 team is planning for the project to be a partnership between the National Science Foundation and the US Department of Energy. In 2019, DOE formally established the scientific need for CMB-S4, and in 2020 it designated the DOE’s Lawrence Berkeley National Laboratory as the host laboratory. The funding agencies require additional reviews before approving the start of construction.

    The seemingly slow progress is not unusual for an endeavor of this size, according to those involved. It’s a big project, and it takes a lot of time for the agencies to ensure that the project is well justified, says Huffenberger.

    Marcelle Soares-Santos, an assistant professor at the University of Michigan and a convener of the group that focused on astrophysics at Snowmass2021, is also not surprised at the pace. “We all would like it to go faster, but I think it’s not unexpected to be at this stage,” she says. “There’s a reason why it requires support from the entire community—because it requires knowledge and expertise and resources that come from different corners of the community.”

    The CMB-S4 team must also consider the allocation of the limited resources available at the NSF-managed South Pole Station, which, due to its remoteness and the extreme cold, is only accessible for a few months a year. To optimally match the capabilities of CMB-S4 to the logistical constraints of deploying and operating the telescopes at the South Pole, the balance between the number of telescopes placed at each site is currently being reexamined through an “analysis of alternatives,” one of the many agency requirements to be met before the project can be approved for construction.

    “We’re going over the different changes to the instrument configurations we could make that would still meet our instrument requirements [in order] to understand what fits best within the logistical footprint—for the South Pole Station especially, but also the project overall,” McMahon says.

    That process should wrap up this year, he says, “then we should be ready to seek funding to move forward toward the next stages of design and then construction, with operations in the early 2030s.”

    See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 11:37 am on October 4, 2022 Permalink | Reply
    Tags: "15 spectacular photos from the Dark Energy Camera", , , , , , Symmetry,   

    From “Symmetry”: “15 spectacular photos from the Dark Energy Camera” Photo Essay 

    Symmetry Mag

    From “Symmetry”

    10.4.22
    Lauren Biron

    ___________________________________________________________________
    The Dark Energy Survey

    Dark Energy Camera [DECam] built at The DOE’s Fermi National Accelerator Laboratory.

    The Víctor M. Blanco 4-meter Telescope at NSF’s NOIRLab Cerro Tololo Inter-American Observatory in Chile, which houses the Dark Energy Camera [DECam], at an altitude of 2200 meters (7200 feet).

    The Cerro Tololo Inter-American Observatory sits approximately 80 km east of La Serena, Chile.

    The Dark Energy Survey is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. The Dark Energy Survey began searching the Southern skies on August 31, 2013.

    According to Albert Einstein’s Theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up.

    _______________________________________________________________________
    Nobel Prize in Physics for 2011 Expansion of the Universe

    4 October 2011

    The Royal Swedish Academy of Sciences has decided to award the Nobel Prize in Physics for 2011

    with one half to

    Saul Perlmutter
    The Supernova Cosmology Project
    The DOE’s Lawrence Berkeley National Laboratory and The University of California-Berkeley,

    and the other half jointly to

    Brian P. Schmidt
    The High-z Supernova Search Team, The Australian National University, Weston Creek, Australia.

    and

    Adam G. Riess

    The High-z Supernova Search Team, The Johns Hopkins University and The Space Telescope Science Institute, Baltimore, MD.

    Written in the stars

    “Some say the world will end in fire, some say in ice…” *

    What will be the final destiny of the Universe? Probably it will end in ice, if we are to believe this year’s Nobel Laureates in Physics. They have studied several dozen exploding stars, called supernovae, and discovered that the Universe is expanding at an ever-accelerating rate. The discovery came as a complete surprise even to the Laureates themselves.

    In 1998, cosmology was shaken at its foundations as two research teams presented their findings. Headed by Saul Perlmutter, one of the teams had set to work in 1988. Brian Schmidt headed another team, launched at the end of 1994, where Adam Riess was to play a crucial role.

    The research teams raced to map the Universe by locating the most distant supernovae. More sophisticated telescopes on the ground and in space, as well as more powerful computers and new digital imaging sensors (CCD, Nobel Prize in Physics in 2009), opened the possibility in the 1990s to add more pieces to the cosmological puzzle.

    The teams used a particular kind of supernova, called a Type Ia supernova: the explosion of an old compact star that is as heavy as the Sun but as small as the Earth. A single such supernova can emit as much light as a whole galaxy. All in all, the two research teams found over 50 distant supernovae whose light was weaker than expected – a sign that the expansion of the Universe was accelerating. The potential pitfalls had been numerous, and the scientists found reassurance in the fact that both groups had reached the same astonishing conclusion.

    For almost a century, the Universe has been known to be expanding as a consequence of the Big Bang about 14 billion years ago. However, the discovery that this expansion is accelerating is astounding. If the expansion continues to speed up, the Universe will end in ice.

    The acceleration is thought to be driven by dark energy, but what that dark energy is remains an enigma – perhaps the greatest in physics today. What is known is that dark energy constitutes about three quarters of the Universe. Therefore the findings of the 2011 Nobel Laureates in Physics have helped to unveil a Universe that to a large extent is unknown to science. And everything is possible again.

    *Robert Frost, Fire and Ice, 1920
    ______________________________________________________________________________

    To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called Dark Energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    The Dark Energy Survey is designed to probe the origin of the accelerating universe and help uncover the nature of Dark Energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the Dark Energy Survey collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time was used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.
    ___________________________________________________________________

    Photo by Reidar Hahn, Fermilab.

    The powerful camera built for the Dark Energy Survey has taken more than 1 million photos from its perch in Chile. Here are some of the best.

    From high atop a mountain in the Chilean Andes, the Dark Energy Camera has snapped more than one million exposures of the southern sky. The images have captured around 2.5 billion astronomical objects, including galaxies and galaxy clusters, stars, comets, asteroids, dwarf planets, and supernovae.

    It has now been 10 years since the Dark Energy Camera first saw stars. The impressive 570-megapixel camera was originally built at the U.S. Department of Energy’s Fermi National Accelerator Laboratory for the Dark Energy Survey. The international DES collaboration uses the deep-space data to investigate dark energy, a phenomenon that is accelerating the expansion of space.

    The Dark Energy Survey, whose scientists are now analyzing the data collected from 2013-2019, isn’t the only experiment to benefit from the powerful piece of equipment. Other research groups have also used the camera to conduct additional astronomical observations and surveys. Here are some of the many stellar photos created using the Dark Energy Camera.

    Acknowledgment: M. Soraisam (University of Illinois). Image processing: Travis Rector (University of Alaska Anchorage), Mahdi Zamani & Davide de Martin CTIO/NOIRLab/DOE/NSF/AURA.

    The Southern Pinwheel Galaxy (also known as Messier 83 or NGC 5236) is about 15 million lightyears from Earth. It took DECam more than 11 hours of exposure time to capture this image. The camera is mounted on the Víctor M. Blanco 4-meter Telescope at Cerro Tololo Inter-American Observatory, a program of NSF’s NOIRLab.

    Acknowledgments: T.A. Rector (University of Alaska Anchorage/NSF’s NOIRLab), M. Zamani (NSF’s NOIRLab) and D. de Martin (NSF’s NOIRLab) Dark Energy Survey/DOE/FNAL/DECam/CTIO/NOIRLab/NSF/AURA.

    The Dark Energy Survey imaged one-eighth of the sky, capturing light from galaxies up to 8 billion lightyears away. The survey repeatedly imaged 10 “deep fields” like the one shown here. By returning to certain sections of the sky, scientists are able to build up and collect different wavelengths of light to image incredibly distant galaxies and faint objects. These deep fields can be used to calibrate the rest of the DES data and to hunt for supernovae.
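The “one-eighth of the sky” figure can be checked with quick solid-angle arithmetic, using the 5000-square-degree survey footprint quoted elsewhere in this piece:

```python
import math

# The whole sky subtends 4*pi steradians; convert to square degrees.
full_sky_sq_deg = 4 * math.pi * (180 / math.pi) ** 2  # ~41,253 deg^2

des_footprint_sq_deg = 5000  # the DES survey area
fraction = des_footprint_sq_deg / full_sky_sq_deg
print(f"DES imaged {fraction:.1%} of the sky")  # ~12.1%, i.e. roughly one-eighth
```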

    Marty Murphy, Nikolay Kuropatkin, Huan Lin and Brian Yanny, Dark Energy Survey.

    While the Dark Energy Survey typically looks at objects millions or billions of lightyears away, sometimes closer objects come into view. In 2014, the Dark Energy Survey spotted Comet Lovejoy traveling about 51 million miles from Earth. Each rectangle in the image represents one of the 62 CCDs that DECam uses, each one a sophisticated sensor designed to capture light from distant galaxies.

    Dark Energy Survey.

    The spiral galaxy NGC 1566, sometimes called the Spanish Dancer, is about 69 million lightyears from Earth. Each photo from DECam is the result of choices made during image processing. The camera uses five filters that each record a different wavelength of light (between 400 and 1,080 nanometers) and can be combined to make color images.
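The color-combination step described above can be sketched as a toy example (purely illustrative; real DECam processing works on calibrated FITS images with dedicated pipelines and more careful stretches, and this helper function is ours). Three single-filter exposures are normalized and stacked into the red, green, and blue channels of one image:

```python
import numpy as np

def color_composite(r_band, g_band, b_band):
    """Stack three monochrome filter exposures into an RGB image.

    Each input is a 2D array of pixel fluxes; each channel gets an
    independent linear stretch onto [0, 1].
    """
    def stretch(img):
        img = np.asarray(img, dtype=float)
        lo, hi = img.min(), img.max()
        return (img - lo) / (hi - lo) if hi > lo else np.zeros_like(img)

    return np.dstack([stretch(r_band), stretch(g_band), stretch(b_band)])

# Tiny 2x2 "exposures" standing in for three DECam filter bands
rgb = color_composite([[0, 1], [2, 3]], [[3, 2], [1, 0]], [[1, 1], [1, 1]])
print(rgb.shape)  # (2, 2, 3)
```

Which filters map to which screen colors, and how each channel is stretched, are exactly the choices made during image processing that the article mentions.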

    W. Clarkson (UM-Dearborn)/CTIO/NOIRLab/DOE/NSF /AURA/STScI, C. Johnson (STScI), and M. Rich (UCLA)

    This DECam photo, taken looking toward the center of our Milky Way galaxy, covers an area roughly twice as wide as the full moon and contains more than 180,000 stars. You can also see a wider version encompassing more of the Milky Way’s bulge. While beautiful, the stars and dust of the Milky Way block out distant galaxies needed to study dark energy—so the Dark Energy Survey typically aims the telescope in the opposite direction, away from the plane of our galaxy.

    Erin Sheldon, Dark Energy Survey.

    From our position on Earth, we see the spiral galaxy NGC 681 from the side (or edge-on). The galaxy, also known as the Little Sombrero Galaxy, is about 66.5 million lightyears away. To keep images of distant objects as sharp as possible, DECam uses a mechanism called a Hexapod, which uses six pneumatically driven pistons to align the camera’s many optical elements between exposures. In addition to the five light filters, DECam also has five optical lenses, the biggest of which is more than 3 feet wide and weighs 388 pounds.

    Image processing: Travis Rector (University of Alaska Anchorage), Mahdi Zamani and Davide de Martin
    CTIO/NOIRLab/NSF/AURA/SMASH/D. Nidever (Montana State University)

    This image shows a wide-angle view of the Small Magellanic Cloud. The Large and Small Magellanic Clouds are dwarf satellite galaxies to the Milky Way, and their proximity makes them a valuable place to study star formation. The Dark Energy Camera captured deep looks at our galactic neighbors for the Survey of the Magellanic Stellar History, or SMASH.

    Image processing: T.A. Rector (University of Alaska Anchorage/NSF’s NOIRLab), J. Miller (Gemini Observatory/NSF’s NOIRLab), M. Zamani and D. de Martin (NSF’s NOIRLab) Dark Energy Survey/DOE/FNAL/DECam/CTIO/NOIRLab/NSF/AURA

    The large galaxy at the center of this image is NGC 1515, a spiral galaxy with several neighboring galaxies in the Dorado Group. When looking at the large-scale structure of the universe, astronomers find galaxies are not distributed randomly but instead cluster together, forming a sort of cosmic web. The Dark Energy Survey has made some of the most precise maps of the universe’s structure and its evolution over time.

    Robert Gruendl, Dark Energy Survey

    NGC 288 is a globular cluster of stars located about 28,700 lightyears from Earth. These stars are bound together by gravity and are concentrated toward the center of the sphere. Globular clusters are an interesting way to study how stars and our own Milky Way evolved, though the Dark Energy Survey looks at distant galaxies and galaxy clusters to better understand dark energy.

    PI: M. Soraisam (University of Illinois at Urbana-Champaign/NSF’s NOIRLab) Image processing: T.A. Rector (University of Alaska Anchorage/NSF’s NOIRLab), M. Zamani (NSF’s NOIRLab) and D. de Martin (NSF’s NOIRLab) CTIO/NOIRLab/DOE/NSF/AURA

    This Dark Energy Camera image shows light from Centaurus A, a galaxy more than 12 million lightyears away. It is partially obscured by dark bands of dust caused by the collision of two galaxies.

    Image processing: DES, Jen Miller (Gemini Observatory/NSF’s NOIRLab), Travis Rector (University of Alaska Anchorage), Mahdi Zamani and Davide de Martin DES/DOE/Fermilab/NCSA and CTIO/NOIRLab/NSF/AURA

    The Dark Energy Survey has found several new dwarf galaxies and used the data to limit how big potential dark matter particles could be. This irregular dwarf galaxy, IC 1613, is about 2.4 million lightyears away and contains around 100 million stars. Dwarf galaxies are considered small and faint by astronomical standards; for comparison, our Milky Way galaxy is estimated to contain between 100 and 400 billion stars.

    Rob Morgan, Dark Energy Survey

    The Helix Nebula (NGC 7293) is a planetary nebula about 650 lightyears from Earth. It is shown here extending over several of the Dark Energy Camera’s CCDs. Planetary nebulae, so named because they appeared round and sharp-edged like planets, are actually the remains of stars. Here, a dying star has ejected its outer layers, leaving a small white dwarf surrounded by gas. In billions of years, our own sun will experience a similar fate.

    Dark Energy Survey

    The spiral Sculptor Galaxy is about 11 million lightyears away. It’s one of more than 500 million galaxies imaged by the Dark Energy Survey across 5000 square degrees of sky. To optimize observations, DES used automated software to point the camera and capture exposures. The software could factor in what part of the sky was overhead, weather conditions, moonlight, and which areas had been recently imaged.

    Image processing: DES, Jen Miller (Gemini Observatory/NSF’s NOIRLab), Travis Rector (University of Alaska Anchorage), Mahdi Zamani and Davide de Martin DES/DOE/Fermilab/NCSA and CTIO/NOIRLab/NSF/AURA

    The wispy shells around elliptical galaxy NGC 474 (center) are actually hundreds of millions of stars. To the left is a spiral galaxy, and in the background there are thousands of other, more distant galaxies—visible in this zoomable version. DECam images contain vast amounts of information; each one is about a gigabyte in size. The Dark Energy Survey would take a few hundred images per session, producing up to 2.5 terabytes of data in a single night.
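The data-volume figures quoted above make for simple arithmetic (illustrative only; the per-image size and nightly image count are the article’s round numbers, with 300 assumed for “a few hundred”):

```python
gb_per_image = 1.0        # "each one is about a gigabyte"
images_per_night = 300    # "a few hundred images per session" (assumed 300)
raw_tb = gb_per_image * images_per_night / 1000
print(f"~{raw_tb:.1f} TB of raw science images")
# Calibration frames and additional processing products push a busy
# night toward the quoted peak of 2.5 TB.
```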

    Dark Energy Survey

    The Dark Energy Camera captured the barred spiral galaxy NGC 1365 in its very first photographs in 2012. The galaxy sits in the Fornax cluster, about 60 million lightyears from Earth. This close-up comes from the camera’s much wider field of view, which you can explore in the interactive DECam viewer.

    See the full article here.




     