Tagged: QCD: Quantum Chromodynamics

  • richardmitnick 9:49 am on July 1, 2022
    Tags: “From RHIC to EIC: At the QCD Frontiers”, Annual users’ meeting highlights physics results; future plans; efforts to increase diversity and more, QCD: Quantum Chromodynamics, The future Electron-Ion Collider

    From The DOE’s Brookhaven National Laboratory: “From RHIC to EIC: At the QCD Frontiers”

    From The DOE’s Brookhaven National Laboratory

    June 30, 2022
    Karen McNulty Walsh
    kmcnulty@bnl.gov

    Annual users’ meeting highlights physics results; future plans; efforts to increase diversity and more.


    The annual meeting of scientists conducting research at the Relativistic Heavy Ion Collider (RHIC)[below] and its pre-injector accelerator, the Alternating Gradient Synchrotron (AGS)[below], was held virtually June 7–10, 2022. Known as the RHIC & AGS Users’ Meeting, this gathering is a chance for researchers to share their science and hear the latest plans related to RHIC, a U.S. Department of Energy Office of Science user facility for nuclear physics research at DOE’s Brookhaven National Laboratory.

    RHIC steers beams of ions—the nuclei of atoms stripped bare of their electrons—into collisions so scientists from all around the world can study fundamental characteristics of the building blocks of matter. The interactions of those building blocks, known as quarks and gluons, are governed by the theory of quantum chromodynamics (QCD). So RHIC is, in essence, a powerful machine for exploring the frontiers of QCD.
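
    For reference, the compact statement of that theory is worth seeing once. The display below is the textbook form of the QCD Lagrangian in standard notation (a general formula, with conventions that vary between texts, not anything specific to RHIC), with quark fields ψ_f of mass m_f, gluon fields A^a_μ, and a single coupling g_s:

    \[
    \mathcal{L}_{\mathrm{QCD}}
      = \sum_{f} \bar{\psi}_f \left( i \gamma^\mu D_\mu - m_f \right) \psi_f
      - \tfrac{1}{4}\, G^{a}_{\mu\nu} G^{a\,\mu\nu},
    \qquad
    D_\mu = \partial_\mu - i g_s\, t^a A^a_\mu,
    \qquad
    G^{a}_{\mu\nu} = \partial_\mu A^a_\nu - \partial_\nu A^a_\mu + g_s f^{abc} A^b_\mu A^c_\nu .
    \]

    The gluon self-interaction term g_s f^{abc} A^b_μ A^c_ν, which has no counterpart in electromagnetism, is what makes the theory so hard to solve at the energies probed in RHIC collisions, and it is why collider data are indispensable for exploring those QCD frontiers.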

    Throughout the meeting, attendees heard talks highlighting the latest physics results from RHIC’s STAR [below] and PHENIX [below] experiments—giant 3D digital cameras that take “snapshots” of particles emerging from RHIC’s particle smashups. Physicists also presented recaps of how the collider and experiments performed during RHIC’s latest round of collisions, Run 22. Spoiler alert: The run was “highly eventful, highly challenging, and eventually highly successful,” said Haiyan Gao, Brookhaven’s Associate Laboratory Director for Nuclear and Particle Physics, who oversees the RHIC research program. And there was a lot of excitement at the meeting about future programs—including “hot-off-the-press” photos showing dramatic progress on the sPHENIX detector and the transformation of RHIC into the future Electron-Ion Collider (EIC) [below].

    “Hot-off-the-press” photos of the inner hadronic calorimeter being installed in RHIC’s sPHENIX detector on Wednesday, June 8, during the annual RHIC & AGS Users’ Meeting.

    “We have overcome many challenges and accomplished much over the past two-plus years,” Gao said during the opening talk of the meeting’s first plenary session on Thursday, June 9. She was referring directly to the COVID-19 pandemic, which placed limits on how the Lab was staffed and required new ways of working to keep the science moving. But her comment also applies to the complexity of exploring a state of matter that lasts a mere fraction of a second after being created in RHIC’s particle smashups—as well as keeping up the pace of technological achievements and sustained funding required to make that exploration and future EIC science possible.

    This schematic shows the chain of accelerators that feed beams of ions into the two rings that make up the 2.4-mile-circumference Relativistic Heavy Ion Collider (RHIC), and how RHIC will be transformed into an Electron-Ion Collider (EIC) by adding components for accelerating and colliding electrons with ions.

    Gao noted completion of the “beam energy scan” (BES II) in Run 21. That series of collisions at different energies is allowing scientists to map out the phases of nuclear matter—including the free quarks and gluons that make up a “quark-gluon plasma” created at RHIC’s highest collision energies, and the protons and neutrons of ordinary “hadronic” matter—as well as the types of transitions between these phases at the different energies. “I’m very excited to see the results of the BES II data analysis,” Gao said.

    Mapping out these phases was one of two RHIC science goals spelled out in the nuclear physics community’s last Long Range Plan, published in 2015. The second goal was describing the inner workings of the quark-gluon plasma at varying length scales—a central focus of the new sPHENIX detector, an upgrade to PHENIX, which concluded its data taking in 2016. Showing off the latest pictures of inner hadronic calorimeter components installed in the sPHENIX barrel just the day before her talk, Gao said, “I congratulate the entire [sPHENIX] team for making this progress!”

    Gao also noted progress on major milestones for the EIC, including the achievement last year of DOE’s Critical Decision 1 (approval of alternative selection and cost range) and the international collaborative effort to select a reference design for the EIC project detector.

    “The EIC user community worked very hard in producing the ‘yellow report’ with detector requirements, and responding to the call for detector proposals,” she said. “We are very excited by the worldwide interest in EIC physics.”

    DOE highlights

    As Gao and other speakers noted, progress on all these efforts can only be accomplished with the generous funding allocated by Congress and the DOE Office of Science, Office of Nuclear Physics, which supports a wide variety of science.

    “We look at topics throughout the entire evolution of the universe and on physical scales from quarks to galaxies,” said Timothy Hallman, DOE’s Associate Director of the Office of Science for Nuclear Physics (NP). That expansive mission, he noted, “requires ‘microscopes’ and tools of varying resolving ‘power’”—the facilities across the nation whose construction, operations, and research programs account for a large part of the NP budget.

    Hallman noted the successful startup of the Facility for Rare Isotope Beams (FRIB) at Michigan State University and a range of other accomplishments across the complex—including at RHIC and the Continuous Electron Beam Accelerator Facility (CEBAF) at Thomas Jefferson National Accelerator Facility, Brookhaven’s partner in building the future EIC.

    He also noted the ongoing assembly of sPHENIX as “an experiment for the Office of Science, with authority delegated to the Laboratory for construction—and it was very successful.”

    Regarding the EIC, he noted that because of the significant cost ($1.7-2.8 billion) and long timeframe for the project, it will be a challenge and take “a significant amount of resolve” to keep it viewed as a top priority in terms of budgets. But he emphasized that the need for new money will be lessened once RHIC completes its scientific program, and that the facility, “is going to help maintain U.S. leadership in accelerator science and the technology of colliders, and already has a huge interest in the user community.”

    Getting into the numbers, Hallman showed that funding for fiscal year 22 (FY22) was a large ($93 million) increase over FY21. But he noted that FY21 was an anomalous year when funding for nuclear physics discovery science went down. The increase, he said, included $20 million of new funding allocated for EIC construction in FY22 (on top of $28.4 million in reprioritized funds)—still “not sufficient.” But he noted that, “it could have been a disaster if Congress had not given us that sizable increase.”

    The picture for FY23, with the President’s budget request including only an $11 million increase for NP over the FY22 enacted budget, means, “we still have work to do,” Hallman said.

    Timothy Hallman, the Department of Energy’s Associate Director of the Office of Science for Nuclear Physics (NP), presented an update that included projects exploring the entire span of the evolution of the universe and from the scale of quarks to the cosmos.

    Several points Hallman made at the start of his talk—emphasizing the importance of the nuclear physics workforce to the nation—could help to make the case for NP funding.

    “One of our most important products, in addition to knowledge, facilities, and new technologies is a highly trained, diverse workforce capable of supporting DOE and other missions,” he said. “U.S. science, commerce, medicine, and defense all benefit, in part, from a stable level of sustained competence, capability, capacity, and leadership in nuclear physics. DOE NP is the U.S. steward responsible for reliably delivering that benefit.”

    One key to success in both science and funding, Hallman advised, is for the field to stay united in establishing the next nuclear physics Long Range Plan.

    “We have a long tradition of partnership,” he said, noting how a united front has achieved much success in the field. “We need to all come up with a plan we can collectively get behind. If we imagine that if we just get rid of something someone else wants [to achieve our individual goals], that is a failed strategy,” he said.

    Later in the day, Kenneth Hicks, the DOE NP program manager for heavy ion physics, elaborated on how NP strives to balance different program initiatives with future strategies, and emphasized the need for the community to be involved.

    “I really depend on the views from experts to do my job properly,” he said. “So, if you are asked to review a project, please say yes. If you get an email survey about priorities and goals, please reply. Schedule a Zoom meeting with me if you’d like to discuss program priorities.”

    “A lot of very exciting things are going on in our field,” he said, reinforcing Hallman’s suggestion of developing a consensus. “But that’s not for me to come up with; [that’s a job] for the community.”

    James Sowinski, the retiring DOE program manager for NP facilities, shared some departing words, recounting his early days as a STAR collaborator and the role he played in the RHIC spin program before joining DOE, as well as his former roles on the RHIC & AGS Users’ Executive Committee.

    “The future is bright,” he said. “I intend to follow NP as a hobby, including the success of sPHENIX, completion of RHIC physics program, and the EIC.”

    Diverse, technically skilled workforce

    Hallman, Gao, and other plenary speakers also described the importance of increasing the diversity of the NP workforce—to bring people from more diverse backgrounds into the field, make them feel welcome, and help them to succeed.

    “I’m very encouraged by the session we had yesterday,” said Gao, pointing out how efforts discussed during a half-day workshop devoted to this topic the day before the plenary session “highlight the collaboration between the Laboratory and [RHIC & AGS] users to work together on the ‘pipeline.’”

    Hallman noted the success of a new NP traineeship program that has included students from 18 Minority Serving Institutions (MSIs). Among the first 110 students participating in this program, 40 percent are Hispanic and 40 percent Black. With the aim of retaining students once they are introduced to the field of nuclear physics, Hallman said we’ll need to find ways to sustain investment at Historically Black Colleges and Universities (HBCUs) and MSIs, possibly by linking programs to other fields important to the Office of Science. Many fields could benefit from NP-related skills, including imaging and cryogenics, and students trained with these skills could go on to apply their expertise in many types of jobs across the economy.

    Bringing the “missing millions” of underrepresented minorities into the science, technology, engineering, and mathematics (STEM) workforce is one of three main points guiding the vision of the director of the National Science Foundation (NSF), said Allena Opper, Program Director for Nuclear Physics – Experiment in Mathematical and Physical Sciences (MPS) at NSF.
    The President’s requested FY23 budget for NSF includes what she calls a “huge bump” overall—a 24 percent increase over what was allocated in FY21—with 9.6 percent to MPS, including 4 percent dedicated to physics. In describing how that funding would be allocated, she included several programs offering support to faculty, graduate students, and postdoctoral fellows that aim to develop role models and increase participation among underrepresented groups.

    The other two major areas NSF will focus on with its funding “bump” are strengthening existing research capabilities and accelerating partnerships, Opper said. She noted there had been a dip in proposals in FY21 and FY22—likely due to COVID—and encouraged RHIC & AGS users interested in learning more to reach out to her or her colleagues in the experiment and theory programs.

    At Friday’s plenary session, Noel Blackburn, Brookhaven Lab’s Chief Diversity Officer, invited meeting attendees to consider, “What type of collaboration and environment do we want to create? Is it welcoming and respectful and allowing you to be your authentic self?”

    Many diversity, equity, and inclusion efforts focus only on “compliance”—simply having programs in place, he said. “We want to take it to another level, to leverage diversity, equity, and inclusion to meet our mission.”

    Internally, he said, “We want to work with every member of the Lab’s community such that we will welcome, respect, and value all the diverse perspectives every member provides.”

    But a big part of the future is to focus efforts on reaching new external partners. “We’re looking to enhance and diversify the STEM pipeline for Brookhaven, DOE, and the nation while creating access for those unfamiliar with our mission. Our efforts will increase awareness of Brookhaven among diverse communities of potential supporters, users, and staff,” Blackburn said.

    Read more about the DEI-focused workshop at the RHIC & AGS Users Meeting.

    Troubleshooting through Run 22

    One highlight of the annual meeting is always the recap of the latest RHIC run. One main goal of Run 22 was to use RHIC’s ability to accelerate protons with their “spins” aligned in a particular direction to track how quarks and gluons within these particles move or align their spins with respect to that frame of reference, making use of newly installed upgrades of the STAR detector.

    But as Gao hinted, and as Vincent Schoefer, the Collider-Accelerator Department physicist who served as run coordinator, described in detail, the run got off to a bumpy start. Two December power failures plagued one of RHIC’s “Siberian snake” accelerator magnets.

    The snakes’ job is to flip the spins of the protons 180 degrees—up to down, or down to up—each time they pass through. The flipping acts as a reset switch to keep the particles’ spins aligned, or polarized. Without the snakes, protons quickly lose polarization, similar to the way spinning tops start to wobble before coming to a stop.
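
    As a cartoon of that “reset switch,” the toy model below tracks a single spin through many turns, applying a small depolarizing tilt on every pass, with and without a 180-degree flip after each turn. This is an illustrative sketch only: the kick size and flip axis are made up for the example, and it is not the real RHIC lattice or snake configuration. Without the flip the tilts add up coherently and the vertical polarization drains away; with the flip, each turn’s tilt is undone on the next.

# Toy model: why a spin-flipping "snake" preserves vertical polarization.
# Illustrative sketch only -- not the real RHIC lattice or snake axis.
import numpy as np

def rot_x(angle):
    # Small rotation about the radial (x) axis: a depolarizing kick per turn.
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

snake = np.diag([-1.0, -1.0, 1.0])       # 180-degree flip about the beam (z) axis

kick = rot_x(0.01)                       # made-up 10 mrad tilt per turn
spin_plain = np.array([0.0, 1.0, 0.0])   # start fully "up" (vertical = y)
spin_snake = spin_plain.copy()

for turn in range(500):
    spin_plain = kick @ spin_plain               # kicks accumulate coherently
    spin_snake = snake @ (kick @ spin_snake)     # flip cancels the previous kick

print("vertical polarization after 500 turns:")
print("  without snake:", round(spin_plain[1], 3))        # ~cos(500 * 0.01) ~ 0.28
print("  with snake:   ", round(abs(spin_snake[1]), 3))   # stays ~1

    In the real machine the snake’s effect is usually described as fixing the “spin tune” at one half so that depolarizing resonances are never crossed; the toy only captures the simplest coherent case, but it shows why losing a snake, as happened in December, is such a serious problem.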

    With help from the C-AD’s Power Supply Group and support from the Lab’s Superconducting Magnet Division in the middle of December, the physicists devised a way to wire the functioning parts of the magnet together to achieve a “partial snake”—a type of magnet used in the AGS. The team had to make a series of other fixes to deal with particles entering the collider at a tilt and “very tight steering tolerances” as they raced around the 2.4-mile-circumference accelerator at nearly the speed of light.

    “We implemented all the tricks we had learned and [the polarization] jumped up to 50 percent,” Schoefer said. That was December 29, and “the first point where the run looked almost plausible,” he said. Adjusting the beam to a slightly lower energy got the protons aligned as near to vertical as possible. “We were happy to find a solution that made [polarization in RHIC’s ‘blue’ ring] look quite nice,” Schoefer said.

    Then, in January, a motor generator in the AGS failed.

    “The good news is we have a backup, but it has a slower ramp rate, which can lead to polarization loss. It needed a lot of babysitting,” Schoefer said, until the original generator could come back online and bring the polarization back up.

    STAR-studded data delivery

    Throughout the run the accelerator physicists were in constant communication with physicists at STAR. Spin rotators at STAR were used to reliably measure the direction of the beam polarization so the physicists could account for the slight tilt in their measurements and make the most of the beams the accelerator could deliver. Those measurements at STAR also helped the accelerator physicists determine how to refine the beams.

    “This was a great example of collaboration between the experiment and the accelerator teams,” Schoefer said.

    And thanks to DOE, the team was able to extend the run by two weeks.

    “That two-week extension helped us make it to our goals; we are extremely grateful,” Schoefer said. The accelerator physicists were even able to make some progress on instrumentation commissioning for a future accelerator goal.

    “You all should be thanked for the yeoman effort you put in,” said Carl Gagliardi, a STAR collaborator from Texas A&M University. “It was truly remarkable to watch and be part of.”

    STAR collaborator Zilong Chang of Indiana University, who presented STAR’s take on Run 22, agreed. Giving credit to the STAR graduate students, Brookhaven Lab engineers, and everyone else who worked to get the STAR forward upgrades installed on budget and on time, he was extremely grateful that the December snake failure and other issues didn’t scuttle the run.

    “C-AD came up with this magic to run with a partial snake and it turned out to work really well,” he said. “Once we found the lower energy to get better stability in the polarization angle alignment to the vertical axis…it allowed us to align the forward detectors and make precise cross-section measurements.”

    Run 22 put all the new STAR detector systems through their paces with no significant issues.

    The overall luminosity ended up being better than the average for Run 17, the last time polarized protons were run at RHIC, and higher than STAR had requested. The collaboration was only a couple of percent short of achieving the data level they needed for one physics goal, with six percent more data than they needed for another.

    Run 22 also put all the new STAR detector systems through their paces with no significant issues. Looking forward to next year’s gold-gold collision run, Chang said, “Our forward upgrade detectors are ready.”
    Physics highlights from PHENIX

    Cesar da Silva of The DOE’s Los Alamos National Laboratory presented the most recent highlights from RHIC’s PHENIX collaboration, which continues to analyze data taken before it was shut down to begin the transformation to sPHENIX. He mentioned several new results in spin physics that are aimed at accessing the gluons within nuclear matter and may be used as a step toward mapping out the 3D internal structure of the proton.

    Composite photo of RHIC’s former PHENIX detector and particle tracks recorded by the detector.

    He also described measurements of direct photons emerging from heavy ion collisions (e.g., gold-gold) that can access all stages of the collision as it evolves from initial cold nuclear matter through the quark-gluon plasma to a hadronic gas. These measurements give access to the temperature of the medium created in these collisions. Comparisons with models show the measurements are consistent with thermal and pre-equilibrium stages in central events.

    Da Silva also presented new measurements of high-energy direct photons and neutral pions based on analysis of data collected in deuteron-gold collisions in 2016. The results indicate that the neutral pion yield is suppressed relative to direct photons in the most central deuteron-gold collisions. This could be the first hint of energy loss of quarks and gluons in small collision systems.

    Results on jets emerging from collisions when quarks or gluons split to form a spray of particles reveal differences in how particles in different parts of the jet—and particles with different masses—lose energy as they travel through the quark-gluon plasma.

    High statistics measurements of electrons from the decay of heavy charm and bottom quarks were also presented. These results showed that the energy loss of the more massive bottom quark is smaller than that of the charm quark, which is about one-quarter the mass. This heavy flavor energy loss measurement is a long-awaited result from PHENIX, da Silva said.

    “We still have a vibrant PHENIX collaboration,” he said. As the collaboration works to preserve data for future analyses, he noted that PHENIX physics has made unique contributions in several studies of quantum chromodynamics (QCD) and quark-gluon plasma (QGP).

    “Students who came [to the collaboration] after PHENIX ended operation are a vital part of the collaboration and [will be] responsible for more discoveries,” he said, noting a couple of upcoming publications.

    Physics highlights from STAR

    Takafumi Niida of the University of Tsukuba in Japan presented the highlights from STAR. In addition to noting again the completion of data taking for the Beam Energy Scan (BES) II and the commissioning of forward detectors in Run 22, he presented new explorations of “cold QCD”—meaning studies of collisions that don’t create hot quark-gluon plasma but can be used to explore other characteristics of nuclear matter in its initial state—as well as hot QCD.

    Composite photo of RHIC’s STAR detector and particle tracks recorded by the detector.

    For example:

    High energy collisions of spin polarized protons at STAR are exploring how gluons’ spins contribute to overall proton spin with higher precision than ever before using data from runs in 2013 and 2015.
    Looking at how back-to-back pairs of particles emerge in various collision types shows signs of a saturated state of gluons.
    Interactions of photons (particles of light) surrounding colliding ions are helping STAR physicists measure the range of the strong force interaction—the force that binds quarks.
    Results from isobar collisions show the expected difference in magnetic field strength, but not the predicted signatures of the chiral magnetic effect. Still, data collected in these collisions offer a wealth of other areas of study.

    “We have many interesting results from cold QCD and hot QCD physics at STAR, and new results from high statistics isobar and BES II data,” Niida said. “More results from the full BES II data will be coming soon, and many interesting physics with the forward upgrades will come in 2023 and beyond.”

    sPHENIX, EIC, and the future

    Friday’s plenary session also included updates on the search for signatures of the chiral magnetic effect (CME) by STAR, progress on the sPHENIX detector, and an EIC project overview followed by detailed talks on the EIC accelerator, physics goals, and detector development.

    “This will be the final presentation of sPHENIX at the RHIC & AGS Users’ Meeting before we become an operating experiment,” said Cameron Dean of the Massachusetts Institute of Technology.

    “We are starting to install all of our subsystems into the detector,” he said, sharing a video showing the installation of the inner hadronic calorimeter that had taken place just two days earlier. The calorimetry will add capabilities for studies of particle jets in ways that have never been done at RHIC before, he said. His talk included details of individual detector components and how they’ll contribute to sPHENIX’s broad physics program studying jets, heavy flavor, quarkonia, and cold QCD.


    sPHENIX Inner Hadronic Calorimeter Installation.

    He even showed images of the first particles passing through the sPHENIX detector—showers of cosmic rays from outer space!

    “sPHENIX will have all detectors in place and ready to go for first data taking in February of 2023!”

    Jim Yeck, project director for the EIC, started his talk by referring to the progress on sPHENIX: “This is our goal at EIC—to show hardware being installed! But we have a way to go yet.”

    He described the process to date for establishing the scientific goals and machine parameters for the EIC. “It’s a significant project in terms of the scale of investment and technologies needed to make it successful,” he said, noting that DOE’s “critical decision” system of gateway approvals is “a big driver in how we make our plans and propose to move forward with technical progress.”

    He noted the uncertainty in funding over the past year, but also “fantastic progress in moving forward on the project detector” reference design.

    The challenges and successes “encourage us to lean forward—to use every penny to make progress,” he said.

    “We are trying to get the new funding needed now so we can take advantage of reprioritized funds and the people who become available” when RHIC completes its operational period, he said. “Lower amounts will lead to a longer project timeline and ultimately higher cost of the machine,” he said. He noted that the project team is working closely with DOE to make these points to Congress.

    But he concluded by focusing on the progress and the collaborations forming around the machine and the project detector.

    “A lot of young people are joining the effort, which is really encouraging,” he said. “This is workforce development at its finest.”

    Meeting attendees also heard references to efforts to address current and future users’ concerns and comforts.

    “In the last couple years, we’ve been real busy,” said Associate Laboratory Director for Facilities & Operations Tom Daniels. “Even though the pandemic hit and not as many folks have been on site, it hasn’t stopped us from doing the work that needs to be done to make the Laboratory a better place.”

    A search is underway for a local food vendor to provide cafeteria services, Daniels said. The Lab will increase immediate food services on site with food trucks and pop-up vendors as needed, said Gao.

    Looking forward, single-use bathrooms at Brookhaven will soon be labeled as gender-neutral facilities and a design for a non-gender-specific, handicap-accessible dorm wing has been completed, Daniels said.

    Construction is in progress on the Science and User Support Center, which will serve as a one-stop processing and welcome center for guests, researchers, and facility users near Brookhaven Lab’s main entrance. The 75,000 gross-square-foot building is anticipated to be completed in 2024 and will house new meeting and conferencing spaces in addition to office areas.

    The future Science and User Support Center.
    Awards and election results

    The RHIC & AGS Users’ Meeting always includes the announcement of a series of awards and the results of elections for new officers on the RHIC & AGS Users’ Executive Committee (UEC). And the winners are:
    Thesis awards

    Niveditha Ram, Stony Brook University: “Nuclear modification of hard scattering processes in small systems at PHENIX”
    Krista Smith, Florida State University: “Measurement of J/Ψ and Ψ(2S) at forward and backward rapidity in p+p, p+Al, p+Au, and 3He+Au collisions”

    Merit awards

    Yingying Shi, Shandong University, “For her significant contributions to the STAR forward upgrade program, including prototyping, mass production, and quality assurance of the Forward sTGC Tracker”
    Raghav Kunnawalkam Elayavalli, Yale University, “For his numerous contributions to measuring and interpreting jet and jet-like correlations at RHIC, and for his services as working group convener in STAR”
    Xu Sun, University of Illinois at Chicago, “For his key contributions to STAR Forward Silicon Tracker construction, commissioning, and operations in Run 22, and breakthrough discovery of global spin alignment of phi mesons in heavy-ion collisions at RHIC”
    Zaochen Ye, Rice University, “For his dedication to STAR Time-of-Flight detector operation and calibration, and breakthrough analysis of dielectron continuum spectra related to QGP radiation”

    Election results

    Chair-elect: Marzia Rosati, Iowa State University
    General members:
    Megan Connors, Georgia State University
    Raghav Kunnawalkam Elayavalli, Yale University
    Hanna Paulina Zbroszczyk, Warsaw University of Technology
    Student/Postdoc members:
    Roli Esha, Stony Brook University
    Maria Stefanik, Warsaw University of Technology
    Agnieszka Sorensen, UCLA/Lawrence Berkeley National Laboratory

    Poster award

    Derek Murphy Anderson, Texas A&M: “Reconstruction of neutral-triggered recoil jets in √s_NN = 200 GeV p+p collisions at the STAR experiment”

    The meeting concluded with a big thank you to the staff in Brookhaven’s Guest, User & Visitor Center, and particularly Kelly Guiffreda, who set up registration and ensured that everything ran smoothly in the meeting’s virtual environment.

    Research at RHIC and the facility’s operations, the sPHENIX upgrade, and the EIC project are all funded by the DOE Office of Science.

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Brookhaven Campus

    One of ten national laboratories overseen and primarily funded by the DOE Office of Science, The DOE’s Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

    Research at BNL specializes in nuclear and high energy physics, energy science and technology, environmental and bioscience, nanoscience and national security. The 5,300-acre campus contains several large research facilities, including the Relativistic Heavy Ion Collider [below] and National Synchrotron Light Source II [below]. Seven Nobel Prizes have been awarded for work conducted at Brookhaven Lab.

    BNL is staffed by approximately 2,750 scientists, engineers, technicians, and support personnel, and hosts 4,000 guest investigators every year. The laboratory has its own police station, fire department, and ZIP code (11973). In total, the lab spans a 5,265-acre (21 km^2) area that is mostly coterminous with the hamlet of Upton, New York. BNL is served by a rail spur operated as-needed by the New York and Atlantic Railway. Co-located with the laboratory is the Upton, New York, forecast office of the National Weather Service.

    Major programs

    Although originally conceived as a nuclear research facility, Brookhaven Lab’s mission has greatly expanded. Its foci are now:

    Nuclear and high-energy physics
    Physics and chemistry of materials
    Environmental and climate research
    Nanomaterials
    Energy research
    Nonproliferation
    Structural biology
    Accelerator physics

    Operation

    Brookhaven National Lab was originally owned by the Atomic Energy Commission (US) and is now owned by that agency’s successor, the United States Department of Energy (DOE). DOE subcontracts the research and operation to universities and research organizations. It is currently operated by Brookhaven Science Associates LLC, which is an equal partnership of Stony Brook University and Battelle Memorial Institute. From 1947 to 1998, it was operated by Associated Universities, Inc. (AUI), but AUI lost its contract in the wake of two incidents: a 1994 fire at the facility’s High Flux Beam Reactor that exposed several workers to radiation, and reports in 1997 of a tritium leak into the groundwater of the Long Island Central Pine Barrens on which the facility sits.

    Foundations

    Following World War II, the US Atomic Energy Commission was created to support government-sponsored peacetime research on atomic energy. The effort to build a nuclear reactor in the American northeast was fostered largely by physicists Isidor Isaac Rabi and Norman Foster Ramsey Jr., who during the war witnessed many of their colleagues at Columbia University leave for new remote research sites following the departure of the Manhattan Project from its campus. Their effort to house this reactor near New York City was rivalled by a similar effort at the Massachusetts Institute of Technology to have a facility near Boston, Massachusetts. Involvement was quickly solicited from representatives of northeastern universities to the south and west of New York City such that this city would be at their geographic center. In March 1946 a nonprofit corporation was established that consisted of representatives from nine major research universities — Columbia University, Cornell University, Harvard University, Johns Hopkins University, Massachusetts Institute of Technology, Princeton University, University of Pennsylvania, University of Rochester, and Yale University.

    Out of 17 considered sites in the Boston-Washington corridor, Camp Upton on Long Island was eventually chosen as the most suitable in consideration of space, transportation, and availability. The camp had been a training center for the US Army during both World War I and World War II. After the latter war, Camp Upton was deemed no longer necessary and became available for reuse. A plan was conceived to convert the military camp into a research facility.

    On March 21, 1947, the Camp Upton site was officially transferred from the U.S. War Department to the new U.S. Atomic Energy Commission (AEC), predecessor to the U.S. Department of Energy (DOE).

    Research and facilities

    Reactor history

    In 1947 construction began on the first nuclear reactor at Brookhaven, the Brookhaven Graphite Research Reactor. This reactor, which opened in 1950, was the first reactor to be constructed in the United States after World War II. The High Flux Beam Reactor operated from 1965 to 1999. In 1959 Brookhaven built the first US reactor specifically tailored to medical research, the Brookhaven Medical Research Reactor, which operated until 2000.

    Accelerator history

    In 1952 Brookhaven began using its first particle accelerator, the Cosmotron. At the time the Cosmotron was the world’s highest energy accelerator, being the first to impart more than 1 GeV of energy to a particle.

    BNL Cosmotron 1952-1966.

    The Cosmotron was retired in 1966, after it was superseded in 1960 by the new Alternating Gradient Synchrotron (AGS).

    BNL Alternating Gradient Synchrotron (AGS).

    The AGS was used in research that resulted in 3 Nobel prizes, including the discovery of the muon neutrino, the charm quark, and CP violation.

    In 1970 BNL started the ISABELLE project to develop and build two proton intersecting storage rings.

    The groundbreaking for the project was in October 1978. In 1981, with the tunnel for the accelerator already excavated, problems with the superconducting magnets needed for the ISABELLE accelerator brought the project to a halt, and the project was eventually cancelled in 1983.

    The National Synchrotron Light Source operated from 1982 to 2014 and was involved with two Nobel Prize-winning discoveries. It has since been replaced by the National Synchrotron Light Source II [below].

    BNL National Synchrotron Light Source.

    After ISABELLE’s cancellation, physicists at BNL proposed that the excavated tunnel and parts of the magnet assembly be used in another accelerator. In 1984 the first proposal for the accelerator now known as the Relativistic Heavy Ion Collider (RHIC) [below] was put forward. Construction was funded in 1991, and RHIC has been operational since 2000. One of the world’s only two operating heavy-ion colliders, RHIC is as of 2010 the second-highest-energy collider after the Large Hadron Collider (CH). RHIC is housed in a tunnel 2.4 miles (3.9 km) long and is visible from space.

    On January 9, 2020, Paul Dabbar, undersecretary of the US Department of Energy Office of Science, announced that the BNL eRHIC design had been selected over the conceptual design put forward by DOE’s Thomas Jefferson National Accelerator Facility [JLab] (US) as the future Electron-Ion Collider (EIC) in the United States.

    Brookhaven Lab Electron-Ion Collider (EIC) to be built inside the tunnel that currently houses the RHIC.

    In addition to the site selection, it was announced that the BNL EIC had acquired CD-0 from the Department of Energy. BNL’s eRHIC design proposes upgrading the existing Relativistic Heavy Ion Collider, which collides beams of ions from light to heavy, including polarized protons, with a polarized electron facility, to be housed in the same tunnel.

    Other discoveries

    In 1958, Brookhaven scientists created one of the world’s first video games, Tennis for Two. In 1968 Brookhaven scientists patented Maglev, a transportation technology that utilizes magnetic levitation.

    Major facilities

    Relativistic Heavy Ion Collider (RHIC), which was designed to research quark–gluon plasma and the sources of proton spin. Until 2009 it was the world’s most powerful heavy ion collider. It is the only collider of spin-polarized protons.

    Center for Functional Nanomaterials (CFN), used for the study of nanoscale materials.

    BNL National Synchrotron Light Source II, Brookhaven’s newest user facility, opened in 2015 to replace the National Synchrotron Light Source (NSLS), which had operated for 30 years. NSLS was involved in the work that won the 2003 and 2009 Nobel Prizes in Chemistry.

    Alternating Gradient Synchrotron, a particle accelerator that was used in three of the lab’s Nobel prizes.
    Accelerator Test Facility, which generates, accelerates, and monitors particle beams.
    Tandem Van de Graaff, once the world’s largest electrostatic accelerator.

    Computational Science resources, including access to a massively parallel Blue Gene series supercomputer that is among the fastest in the world for scientific research, run jointly by Brookhaven National Laboratory and Stony Brook University-SUNY.

    Interdisciplinary Science Building, with unique laboratories for studying high-temperature superconductors and other materials important for addressing energy challenges.
    NASA Space Radiation Laboratory, where scientists use beams of ions to simulate cosmic rays and assess the risks of space radiation to human space travelers and equipment.

    Off-site contributions

    It is a contributing partner to the ATLAS experiment, one of the four detectors located at the Large Hadron Collider (LHC) at The European Organization for Nuclear Research [Organisation européenne pour la recherche nucléaire] (CH) [CERN].

    CERN map.

    Iconic view of the CERN ATLAS detector.

    The experiment is currently operating at CERN near Geneva, Switzerland.

    Brookhaven was also responsible for the design of the Spallation Neutron Source at DOE’s Oak Ridge National Laboratory, Tennessee.

    DOE’s Oak Ridge National Laboratory Spallation Neutron Source annotated.

    Brookhaven plays a role in a range of neutrino research projects around the world, including the Daya Bay Neutrino Experiment (CN) at a nuclear power plant approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China.

    Daya Bay Neutrino Experiment (CN) nuclear power plant, approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China .


    BNL Center for Functional Nanomaterials.

    BNL National Synchrotron Light Source II.

    BNL Relativistic Heavy Ion Collider Campus.

    BNL/RHIC STAR Detector.

    BNL/RHIC PHENIX detector.

     
  • richardmitnick 9:35 am on June 11, 2022
    Tags: “Collider data yield insights into neutron star structure”, Measurements of heavy-ion collisions predict properties of neutron stars that are consistent with those informed by astrophysical observations, Neutron stars—the culmination of the gravitational collapse of certain massive stars—pack about one to two solar masses into spheres 20–30 km across, QCD: Quantum Chromodynamics

    From “Physics Today”: “Collider data yield insights into neutron star structure” 


    From “Physics Today”

    10 Jun 2022
    Andrew Grant

    Measurements of heavy-ion collisions predict properties of neutron stars that are consistent with those informed by astrophysical observations.

    1
    A high-frequency accelerator cavity at the GSI Helmholtz Centre for Heavy Ion Research in Germany. Data from heavy-ion collision experiments at GSI helped constrain the properties of dense matter in neutron stars. Credit: J. Hosan/GSI Helmholtzzentrum für Schwerionenforschung GmbH.

    Through a combination of quantum chromodynamics theory and astronomical observations, researchers have established that neutron stars—the culmination of the gravitational collapse of certain massive stars—pack about one to two solar masses into spheres 20–30 km across. Pinning down the properties of such small objects located so far away is an impressive achievement, but astrophysicists and nuclear physicists want to do even better.

    The compression in a neutron star is so high that it may force the nuclear matter into exotic phases (see the Quick Study by Nanda Rea, Physics Today, October 2015, page 62). For a 1.4-solar-mass neutron star, a few-kilometer difference in radius could determine whether the nucleons exist as hyperons, free quarks, or something else. To help pinpoint the parameters of the densest matter in the universe, Sabrina Huth of Technical University of Darmstadt in Germany, Peter T. H. Pang of Nikhef in Amsterdam, and their colleagues have incorporated data from the densest matter on Earth: heavy ions that collide in particle accelerators.
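
    A quick back-of-the-envelope estimate shows why this counts as the densest matter in the universe. Taking a representative 1.4-solar-mass star and a 12 km radius (the middle of the 20–30 km diameter range quoted above; the specific numbers are illustrative), the average density is

    \[
    \bar{\rho} \;=\; \frac{M}{\tfrac{4}{3}\pi R^{3}}
      \;\approx\; \frac{1.4 \times 2.0 \times 10^{30}\ \mathrm{kg}}
                       {\tfrac{4}{3}\pi \,(1.2 \times 10^{4}\ \mathrm{m})^{3}}
      \;\approx\; 4 \times 10^{17}\ \mathrm{kg\,m^{-3}},
    \]

    already above the roughly 2.7 × 10^17 kg m^-3 of nuclear saturation density, with the central density higher still, which is why a few-kilometer uncertainty in radius matters so much for the state of the matter in the core.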

    The researchers strove to home in on the nuclear equation of state, which encompasses the relationship between neutron stars’ masses and radii and quantifies the stiffness of nuclear matter. The stiffer the matter, the greater its resistance to gravitational collapse and the larger the neutron star radius. Similarly, the nuclear stiffness dictates the dynamics of the compression and subsequent expansion when heavy nuclei such as those of gold slam into each other at relativistic energies inside particle colliders. The expansion of the post-collision nucleons is sensitive to the nuclear symmetry energy, a measure of how the nuclear binding energy changes with the neutron-to-proton ratio of the nucleus. That energy, in turn, is related to symmetry pressure, an important factor in the determination of the equation of state (see the article by Jorge Piekarewicz and Farrukh Fattoyev, Physics Today, July 2019, page 30).
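
    To attach symbols to the quantities just described, a standard schematic parametrization (a textbook form, not taken from the Huth and Pang paper itself) writes the energy per nucleon of nuclear matter at density n and neutron-proton asymmetry δ = (n_n − n_p)/n as

    \[
    \frac{E}{A}(n,\delta) \;\simeq\; E_{0}(n) \;+\; S(n)\,\delta^{2},
    \qquad
    S(n) \;\approx\; J \;+\; L\,\frac{n - n_{0}}{3\,n_{0}},
    \]

    where E_0(n) is the symmetric-matter term, S(n) is the symmetry energy, n_0 ≈ 0.16 fm^-3 is the saturation density, and J and L are the value and slope of S at n_0. The pressure of neutron-rich matter grows with L (for pure neutron matter near n_0 it is roughly L n_0 / 3), so a larger L means stiffer matter, a larger neutron-star radius, and a more vigorous expansion of the neutron-rich system formed in a heavy-ion collision, which is the connection the measurements exploit.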

    Huth, Pang, and colleagues analyzed data from accelerators at the GSI Helmholtz Centre for Heavy Ion Research in Germany and at the US’s Lawrence Berkeley and Brookhaven National Laboratories. After combining the heavy-ion data with nuclear-theory calculations, the researchers set constraints on the radii of and the typical pressures within 1.4-solar-mass neutron stars that are consistent with those based on astrophysical measurements. They then merged the astrophysical and heavy-ion data to tighten the constraints for both properties. The data suggest a slight increase over the prior predicted value for the radius, which is supported by recent observations from NASA’s x-ray-observing Neutron Star Interior Composition Explorer mission.

    The study highlights the value of looking beyond astrophysics and theory to understand the dense matter in neutron stars. Accelerators at GSI and elsewhere should soon be able to achieve particle densities comparable to those in neutron star cores, providing even more useful data for pinning down the equation of state. (S. Huth et al., Nature 606, 276, 2022.)

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Our mission

    The mission of “Physics Today” is to be a unifying influence for the diverse areas of physics and the physics-related sciences.

    It does that in three ways:

    • by providing authoritative, engaging coverage of physical science research and its applications without regard to disciplinary boundaries;
    • by providing authoritative, engaging coverage of the often complex interactions of the physical sciences with each other and with other spheres of human endeavor; and
    • by providing a forum for the exchange of ideas within the scientific community.”

     
  • richardmitnick 11:50 am on October 12, 2021
    Tags: “The Electron-Ion Collider- new accelerator could solve the mystery of how matter holds together”, QCD: Quantum Chromodynamics

    From The Conversation : “The Electron-Ion Collider- new accelerator could solve the mystery of how matter holds together” 

    From The Conversation

    October 11, 2021
    Daria Sokhan

    DOE’s Brookhaven National Laboratory (US) campus. Credit: Brookhaven National Laboratory.

    When the Nobel Prize-winning US physicist Robert Hofstadter and his team fired highly energetic electrons at a small vial of hydrogen at the Stanford Linear Accelerator Center in 1956, they opened the door to a new era of physics. Until then, it was thought that protons and neutrons, which make up an atom’s nucleus, were the most fundamental particles in nature. They were considered to be “dots” in space, lacking physical dimensions. Now it suddenly became clear that these particles were not fundamental at all, and had a size and complex internal structure as well.

    What Hofstadter and his team saw was a small deviation in how electrons “scattered”, or bounced, when hitting the hydrogen. This suggested there was more to a nucleus than the dot-like protons and neutrons they had imagined. The experiments that followed around the world at accelerators – machines that propel particles to very high energies – heralded a paradigm shift in our understanding of matter.

    Yet there is a lot we still don’t know about the atomic nucleus – as well as the “strong force”, one of four fundamental forces of nature, that holds it together. Now a brand-new accelerator, the Electron-Ion Collider, to be built within the decade at the DOE’s Brookhaven National Laboratory (US), with the help of 1,300 scientists from around the world, could help take our understanding of the nucleus to a new level.

    Strong but strange force

    After the revelations of the 1950s, it soon became clear that particles called quarks and gluons are the fundamental building blocks of matter. They are the constituents of hadrons, which is the collective name for protons and other particles. Sometimes people imagine that these kinds of particles fit together like Lego, with quarks in a certain configuration making up protons, and then protons and neutrons coupling up to create a nucleus, and the nucleus attracting electrons to build an atom. But quarks and gluons are anything but static building blocks.

    A theory called quantum chromodynamics describes how the strong force works between quarks, mediated by gluons, which are force carriers. Yet it cannot help us to analytically calculate the proton’s properties. This isn’t some fault of our theorists or computers — the equations themselves are simply not solvable.

    This is why the experimental study of the proton and other hadrons is so crucial: to understand the proton and the force that binds it, one must study it from every angle. For this, the accelerator is our most powerful tool.

    Yet when you look at the proton with a collider (a type of accelerator which uses two beams), what we see depends on how deep — and with what — we look: sometimes it appears as three constituent quarks, at other times as an ocean of gluons, or a teeming sea of pairs of quarks and their antiparticles (antiparticles are near identical to particles, but have the opposite charge or other quantum properties).

    How an electron colliding with a charged atom can reveal its nuclear structure. Brookhaven National Lab/Flickr, CC BY-NC.

    So while our understanding of matter at this tiniest of scales has made great progress in the past 60 years, many mysteries remain which the tools of today cannot fully address. What is the nature of the confinement of quarks within a hadron? How does the mass of the proton arise from the almost massless quarks, 1,000 times lighter?

    To answer such questions, we need a microscope that can image the structure of the proton and nucleus across the widest range of magnifications in exquisite detail, and build 3D images of their structure and dynamics. That’s exactly what the new collider will do.

    Experimental set up

    The Electron-Ion Collider (EIC) will use a very intense beam of electrons as its probe, with which it will be possible to slice the proton or nucleus open and look at the structure inside it. It will do that by colliding a beam of electrons with a beam of protons or ions (charged atoms) and looking at how the electrons scatter. The ion beam is the first of its kind in the world.

    Effects which are barely perceptible, such as scattering processes which are so rare you only observe them once in a billion collisions, will become visible. By studying these processes, other scientists and I will be able to reveal the structure of protons and neutrons, how it is modified when they are bound by the strong force, and how new hadrons are created. We could also uncover what sort of matter is made up of pure gluons — something which has never been seen.

    The collider will be tuneable to a wide range of energies: this is like turning the magnification dial on a microscope, the higher the energy, the deeper inside the proton or nucleus one can look and the finer the features one can resolve.
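
    To put a rough number on the magnification analogy (standard dimensional reasoning, not a figure from the article): a probe that delivers a momentum transfer Q resolves structure on a length scale of about

    \[
    \Delta x \;\sim\; \frac{\hbar c}{Q} \;\approx\; \frac{0.2\ \mathrm{GeV\,fm}}{Q},
    \]

    so Q ≈ 1 GeV corresponds to roughly 0.2 fm, a fraction of the proton’s ~0.84 fm charge radius, while Q ≈ 10 GeV resolves features around 0.02 fm. Higher collision energies make larger momentum transfers available, which is the sense in which turning up the energy turns up the magnification.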

    Newly formed collaborations of scientists across the world, which are part of the EIC team, are also designing detectors, which will be placed at two different collision points in the collider. Aspects of this effort are led by UK teams, which have just been awarded a grant to lead the design of three key components of the detectors and develop the technologies needed to realise them: sensors for precision tracking of charged particles, sensors for the detection of electrons scattered extremely closely to the beam line and detectors to measure the polarisation (direction of spin) of the particles scattered in the collisions.

    While it may take another ten years before the collider is fully designed and built, it is likely to be well worth the effort. Understanding the structure of the proton and, through it, the fundamental force that gives rise to over 99% of the visible mass in the universe, is one of the greatest challenges in physics today.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Conversation launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors work with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues. And hopefully allow for a better quality of public discourse and conversation.

     
  • richardmitnick 4:12 pm on May 27, 2021
    Tags: “Quark-gluon plasma [QGP] flows like water according to new study”, Fluid viscosity is governed by fundamental physical constants such as the Planck constant and the nucleon mass, Kinematic viscosity, Navier-Stokes equation which contains density and viscosity, QCD: Quantum Chromodynamics

    From Queen Mary University of London (UK) : “Quark-gluon plasma [QGP] flows like water according to new study” 

    27 May 2021

    Sophie McLachlan
    Faculty Communications Manager (Science and Engineering)
    sophie.mclachlan@qmul.ac.uk

    What does quark-gluon plasma [QGP] – the hot soup of elementary particles formed a few microseconds after the Big Bang – have in common with tap water? Scientists say it’s the way it flows.

    Quark-Gluon Plasma from BNL RHIC.

    Quark gluon plasma from Duke University (US)

    A new study, published today in the journal SciPost Physics, has highlighted the surprising similarities between quark-gluon plasma, the first matter thought to have filled the early Universe, and water that comes from our tap.

    The ratio between the viscosity of a fluid, the measure of how runny it is, and its density, decides how it flows. Whilst both the viscosity and density of quark-gluon plasma are about 16 orders of magnitude larger than in water, the researchers found that the ratio between the viscosity and density of the two types of fluids are the same. This suggests that one of the most exotic states of matter known to exist in our universe would flow out of your tap in much the same way as water.

    What is quark-gluon plasma [QGP]?

    The matter that makes up our Universe is made of atoms, which consist of nuclei with orbiting electrons. Nuclei consist of protons and neutrons, known collectively as nucleons, and these in turn consist of quarks interacting via gluons. At very high temperatures – about one million times hotter than the centre of the Sun – quarks and gluons break free from their parent nucleons and instead form a dense, hot soup known as quark-gluon plasma.

    It is thought that shortly after the Big Bang the early Universe was filled with incredibly hot quark-gluon plasma. This then cooled microseconds later to form the building blocks of all the matter found within our universe. Since the early 2000s scientists have been able to recreate quark-gluon plasma experimentally using large particle colliders, which has provided new insights into this exotic state of matter.

    The ordinary matter we encounter on a daily basis is thought to have very different properties to the quark-gluon plasma found in the early beginnings of the Universe. For example, fluids like water are governed by the behaviour of atoms and molecules that are much larger than the particles found in quark-gluon plasma, and are held together by weaker forces.

    However, the recent study shows that despite these differences the ratio of viscosity and density, known as the kinematic viscosity, is close in both quark-gluon plasma and ordinary liquids. This ratio is important because the fluid flow does not depend on viscosity alone but is governed by the Navier-Stokes equation which contains density and viscosity. Therefore, if this ratio is the same for two different fluids these two fluids will flow in the same way even if they have very different viscosities and densities.
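
    In symbols, with dynamic viscosity η and mass density ρ (this is the standard incompressible form of the equation referred to above, not anything specific to the paper):

    \[
    \nu \;=\; \frac{\eta}{\rho},
    \qquad
    \frac{\partial \mathbf{v}}{\partial t} + (\mathbf{v}\cdot\nabla)\,\mathbf{v}
      \;=\; -\,\nabla\!\left(\frac{p}{\rho}\right) + \nu\,\nabla^{2}\mathbf{v}.
    \]

    Once the pressure is written per unit mass, the kinematic viscosity ν is the only material property left in the equation, so two fluids with the same ν, flowing at comparable speeds over comparable length scales (the same Reynolds number vL/ν), trace out the same flow patterns even if their separate viscosities and densities each differ by sixteen orders of magnitude.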

    The power of physics

    Importantly, it’s not just any liquid viscosity that coincides with the viscosity of quark-gluon plasma. Indeed, liquid viscosity can vary by many orders of magnitude depending on temperature. However, there is one very particular point where liquid viscosity has a nearly-universal lower limit.

    Previous research [Science Advances] found that in that limit, fluid viscosity is governed by fundamental physical constants such as the Planck constant and the nucleon mass. It is these constants of nature that ultimately decide whether a proton is a stable particle, and they govern processes like nucleosynthesis in stars and the creation of the essential biochemical elements needed for life. The recent study found that it is this universal lower limit on the viscosity of ordinary fluids like water that turns out to be close to the viscosity of quark-gluon plasma.
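
    To indicate what "governed by fundamental constants" means here, the bound reported in that earlier work is usually quoted in roughly the following form; treat the exact prefactor as indicative rather than definitive.

```latex
% Approximate lower bound on kinematic viscosity from fundamental constants
% hbar: reduced Planck constant, m_e: electron mass, m: molecule (or nucleon) mass
\[
\nu_{\min} \;\sim\; \frac{1}{4\pi}\,\frac{\hbar}{\sqrt{m_{e}\,m}}
\;\approx\; 10^{-7}\ \mathrm{m^{2}\,s^{-1}}\quad\text{for water-like molecules,}
\]
% within an order of magnitude of the kinematic viscosity of water at room temperature.
```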

    Professor Kostya Trachenko, Professor of Physics at Queen Mary University of London and author of the recent paper, said: “We do not fully understand the origin of this striking similarity yet, but we think it could be related to the fundamental physical constants which set the universal lower limit of viscosity for both ordinary liquids and quark-gluon plasma.”

    “This study provides a fairly rare and delightful example of where we can draw quantitative comparisons between hugely disparate systems,” continues Professor Matteo Baggioli from the Autonomous University of Madrid [Universidad Autónoma de Madrid] (ES). “Liquids are described by hydrodynamics, which leaves us with many open problems that are currently at the forefront of physics research. Our result shows the power of physics to translate general principles into specific predictions about complex properties such as liquid flow in exotic types of matter like quark-gluon plasma.”

    Improving our understanding

    Understanding quark-gluon plasma and its flow is currently at the forefront of high-energy physics. The strong forces between quarks and gluons are described by quantum chromodynamics, one of the most comprehensive physical theories that exist. However, whilst quantum chromodynamics provides a theory of the strong nuclear force, it is very hard to solve, which makes it difficult to understand the properties of quark-gluon plasma from the theory alone.

    “It is conceivable that the current result can provide us with a better understanding of the quark-gluon plasma,” added Professor Vadim Brazhkin from the Russian Academy of Sciences [Росси́йская акаде́мия нау́к; (РАН) Rossíiskaya akadémiya naúk](RU). “The reason is that viscosity in liquids at their minimum corresponds to a very particular regime of liquid dynamics which we understood only recently. The similarity with the quark-gluon plasma suggests that particles in this exotic system move in the same way as in tap water.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    At Queen Mary University of London (UK), we believe that a diversity of ideas helps us achieve the previously unthinkable.

    Throughout our history, we’ve fostered social justice and improved lives through academic excellence. And we continue to live and breathe this spirit today, not because it’s simply ‘the right thing to do’ but for what it helps us achieve and the intellectual brilliance it delivers.

    Our reformer heritage informs our conviction that great ideas can and should come from anywhere. It’s an approach that has brought results across the globe, from the communities of east London to the favelas of Rio de Janeiro.

    We continue to embrace diversity of thought and opinion in everything we do, in the belief that when views collide, disciplines interact, and perspectives intersect, truly original thought takes form.

     
  • richardmitnick 3:56 pm on March 5, 2021 Permalink | Reply
    Tags: "Tantalizing Signs of Phase-change ‘Turbulence’ in RHIC Collisions", , , Despite the tantalizing hints the STAR scientists acknowledge that the range of uncertainty in their measurements is still large., , Net baryon density, , , QCD: Quantum Chromodynamics, , STAR physicists took advantage of the incredible versatility of RHIC to collide gold ions (the nuclei of gold atoms) across a wide range of energies., Strictly speaking if the scientists don’t identify either the phase boundary or the critical point they really can’t put this [QGP phase] into the textbooks and say that there is a new state of ma, Tantalizing signs of a critical point—a change in the way that quarks and gluons-the building blocks of protons and neutrons-transform from one phase to another., The work is also a true collaboration of the experimentalists with nuclear theorists around the world and the accelerator physicists at RHIC., When there is a change from high energy to low energy there is an increase in the net baryon density and the structure of matter may change going through the phase transition area.   

    From DOE’s Brookhaven National Laboratory(US): “Tantalizing Signs of Phase-change ‘Turbulence’ in RHIC Collisions” 

    From DOE’s Brookhaven National Laboratory(US)

    March 5, 2021
    Karen McNulty Walsh
    Peter Genzer

    Fluctuations in net proton production hint at a possible ‘critical point’ marking a change in the way nuclear matter transforms from one phase to another.

    1
    The STAR detector at the U.S. Department of Energy’s Brookhaven National Laboratory.

    Physicists studying collisions of gold ions at the Relativistic Heavy Ion Collider (RHIC), a U.S. Department of Energy Office of Science user facility for nuclear physics research at DOE’s Brookhaven National Laboratory, are embarking on a journey through the phases of nuclear matter—the stuff that makes up the nuclei of all the visible matter in our universe. A new analysis of collisions conducted at different energies shows tantalizing signs of a critical point: a change in the way that quarks and gluons (the building blocks of protons and neutrons) transform from one phase to another. The findings, just published by RHIC’s STAR Collaboration in the journal Physical Review Letters, will help physicists map out details of these nuclear phase changes to better understand the evolution of the universe and the conditions in the cores of neutron stars.

    “If we are able to discover this critical point, then our map of nuclear phases—the nuclear phase diagram—may find a place in the textbooks, alongside that of water,” said Bedanga Mohanty of India’s National Institute of Science Education and Research, one of hundreds of physicists collaborating on research at RHIC using the sophisticated STAR detector.

    As Mohanty noted, studying nuclear phases is somewhat like learning about the solid, liquid, and gaseous forms of water, and mapping out how the transitions take place depending on conditions like temperature and pressure. But with nuclear matter, you can’t just set a pot on the stove and watch it boil. You need powerful particle accelerators like RHIC to turn up the heat.

    2
    As physicists turned the collision energy down at RHIC, they expected to see large event-by-event fluctuations in certain measurements such as net proton production—an effect that’s similar to the turbulence an airplane experiences when entering a bank of clouds—as evidence of a “critical point” in the nuclear phase transition. Higher-level statistical analyses of the data, including the skewness and kurtosis of the distributions, revealed tantalizing hints of such fluctuations.

    RHIC’s highest collision energies “melt” ordinary nuclear matter (atomic nuclei made of protons and neutrons) to create an exotic phase called a quark-gluon plasma (QGP). Scientists believe the entire universe existed as QGP a fraction of a second after the Big Bang—before it cooled and the quarks bound together (glued by gluons) to form protons, neutrons, and eventually, atomic nuclei. But the tiny drops of QGP created at RHIC measure a mere 10^-13 centimeters across (that’s 0.0000000000001 cm) and they last for only 10^-23 seconds! That makes it incredibly challenging to map out the melting and freezing of the matter that makes up our world.

    “Strictly speaking if we don’t identify either the phase boundary or the critical point we really can’t put this [QGP phase] into the textbooks and say that we have a new state of matter,” said Nu Xu, a STAR physicist at DOE’s Lawrence Berkeley National Laboratory.

    Tracking phase transitions

    To track the transitions, STAR physicists took advantage of the incredible versatility of RHIC to collide gold ions (the nuclei of gold atoms) across a wide range of energies.

    2
    Mapping nuclear phase changes is like studying how water changes under different conditions of temperature and pressure (net baryon density for nuclear matter). RHIC’s collisions “melt” protons and neutrons to create quark-gluon plasma (QGP). STAR physicists are exploring collisions at different energies, turning the “knobs” of temperature and baryon density, to look for signs of a “critical point.”

    “RHIC is the only facility that can do this, providing beams from 200 billion electron volts (GeV) all the way down to 3 GeV. Nobody can dream of such an excellent machine,” Xu said.

    The changes in energy turn the collision temperature up and down and also vary a quantity known as net baryon density that is somewhat analogous to pressure. Looking at data collected during the first phase of RHIC’s “beam energy scan” from 2010 to 2017, STAR physicists tracked particles streaming out at each collision energy. They performed a detailed statistical analysis of the net number of protons produced. A number of theorists had predicted that this quantity would show large event-by-event fluctuations as the critical point is approached.

    The reason for the expected fluctuations comes from a theoretical understanding of the force that governs quarks and gluons. That theory, known as quantum chromodynamics, suggests that the transition from normal nuclear matter (“hadronic” protons and neutrons) to QGP can take place in two different ways. At high temperatures, where protons and anti-protons are produced in pairs and the net baryon density is close to zero, physicists have evidence of a smooth crossover between the phases. It’s as if protons gradually melt to form QGP, like butter gradually melting on a counter on a warm day. But at lower energies, they expect what’s called a first-order phase transition—an abrupt change like water boiling at a set temperature as individual molecules escape the pot to become steam. Nuclear theorists predict that in the QGP-to-hadronic-matter phase transition, net proton production should vary dramatically as collisions approach this switchover point.

    “At high energy, there is only one phase. The system is more or less invariant, normal,” Xu said. “But when we change from high energy to low energy you also increase the net baryon density and the structure of matter may change as you are going through the phase transition area.

    “It’s just like when you ride an airplane and you get into turbulence,” he added. “You see the fluctuation—boom, boom, boom. Then, when you pass the turbulence—the phase of structural changes—you are back to normal into the one-phase structure.”

    In the RHIC collision data, the signs of this turbulence are not as apparent as food and drinks bouncing off tray tables in an airplane. STAR physicists had to perform what’s known as “higher order correlation function” statistical analysis of the particle distributions, looking beyond the mean and width of the curve representing the data to quantities like how asymmetrical (skewed) and how sharply peaked (kurtosis) that distribution is.
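
    As an illustration of what such a higher-order analysis involves, here is a minimal sketch that computes the first few moments of a set of event-by-event net-proton counts. The data are made up for the example; the real STAR analysis works with measured cumulants from millions of events and applies efficiency and volume-fluctuation corrections not shown here.

```python
import numpy as np

# Hypothetical event-by-event net-proton counts (protons minus antiprotons).
# These numbers are invented for illustration only.
rng = np.random.default_rng(0)
net_protons = rng.poisson(lam=12, size=100_000) - rng.poisson(lam=2, size=100_000)

mean = net_protons.mean()
var = net_protons.var()                                        # width squared
skewness = ((net_protons - mean) ** 3).mean() / var ** 1.5     # asymmetry of the distribution
kurtosis = ((net_protons - mean) ** 4).mean() / var ** 2 - 3   # excess "peakedness"

# Near a critical point, theory predicts that combinations such as
# skewness*width or kurtosis*variance deviate strongly from their baselines.
print(f"mean={mean:.2f}  variance={var:.2f}  skewness={skewness:.3f}  kurtosis={kurtosis:.3f}")
```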

    The oscillations they see in these higher-order quantities, particularly the skewness and kurtosis, are reminiscent of another famous phase change observed when transparent liquid carbon dioxide suddenly becomes cloudy as it is heated towards its critical point, the scientists say. This “critical opalescence” comes from dramatic fluctuations in the density of the CO2—variations in how tightly packed the molecules are.

    “In our data, the oscillations signify that something interesting is happening, like the opalescence,” Mohanty said.

    Yet despite the tantalizing hints, the STAR scientists acknowledge that the range of uncertainty in their measurements is still large. The team hopes to narrow that uncertainty, and nail down a possible critical point discovery, by analyzing a second set of measurements made from many more collisions during phase II of RHIC’s beam energy scan, from 2019 through 2021.

    The entire STAR collaboration was involved in the analysis, Xu notes, with a particular group of physicists—including Xiaofeng Luo (and his student, Yu Zhang), Ashish Pandav, and Toshihiro Nonaka, from China, India, and Japan, respectively—meeting weekly with the U.S. scientists (over many time zones and virtual networks) to discuss and refine the results. The work is also a true collaboration of the experimentalists with nuclear theorists around the world and the accelerator physicists at RHIC. The latter group, in Brookhaven Lab’s Collider-Accelerator Department, devised ways to run RHIC far below its design energy while also maximizing collision rates to enable the collection of the necessary data at low collision energies.

    “We are exploring uncharted territory,” Xu said. “This has never been done before. We made lots of efforts to control the environment and make corrections, and we are eagerly awaiting the next round of higher statistical data,” he said.

    This study was supported by the DOE Office of Science, the U.S. National Science Foundation, and a wide range of international funding agencies listed in the paper. RHIC operations are funded by the DOE Office of Science. Data analysis was performed using computing resources at the RHIC and ATLAS Computing Facility (RACF) at Brookhaven Lab, the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory, and via the Open Science Grid consortium.

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    One of ten national laboratories overseen and primarily funded by the DOE(US) Office of Science, DOE’s Brookhaven National Laboratory(US) conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University(US), the largest academic user of Laboratory facilities, and Battelle(US), a nonprofit, applied science and technology organization.

    Research at BNL specializes in nuclear and high energy physics, energy science and technology, environmental and bioscience, nanoscience and national security. The 5,300 acre campus contains several large research facilities, including the Relativistic Heavy Ion Collider [below] and National Synchrotron Light Source II [below]. Seven Nobel prizes have been awarded for work conducted at Brookhaven lab.

    BNL is staffed by approximately 2,750 scientists, engineers, technicians, and support personnel, and hosts 4,000 guest investigators every year. The laboratory has its own police station, fire department, and ZIP code (11973). In total, the lab spans a 5,265-acre (21 km^2) area that is mostly coterminous with the hamlet of Upton, New York. BNL is served by a rail spur operated as-needed by the New York and Atlantic Railway. Co-located with the laboratory is the Upton, New York, forecast office of the National Weather Service.

    Major programs

    Although originally conceived as a nuclear research facility, Brookhaven Lab’s mission has greatly expanded. Its foci are now:

    Nuclear and high-energy physics
    Physics and chemistry of materials
    Environmental and climate research
    Nanomaterials
    Energy research
    Nonproliferation
    Structural biology
    Accelerator physics

    Operation

    Brookhaven National Lab was originally owned by the Atomic Energy Commission(US) and is now owned by that agency’s successor, the United States Department of Energy (DOE). DOE subcontracts the research and operation to universities and research organizations. It is currently operated by Brookhaven Science Associates LLC, which is an equal partnership of Stony Brook University(US) and Battelle Memorial Institute(US). From 1947 to 1998, it was operated by Associated Universities, Inc. (AUI), but AUI lost its contract in the wake of two incidents: a 1994 fire at the facility’s High Flux Beam Reactor that exposed several workers to radiation, and reports in 1997 of a tritium leak into the groundwater of the Long Island Central Pine Barrens on which the facility sits.

    Foundations

    Following World War II, the US Atomic Energy Commission was created to support government-sponsored peacetime research on atomic energy. The effort to build a nuclear reactor in the American northeast was fostered largely by physicists Isidor Isaac Rabi and Norman Foster Ramsey Jr., who during the war witnessed many of their colleagues at Columbia University leave for new remote research sites following the departure of the Manhattan Project from its campus. Their effort to house this reactor near New York City was rivalled by a similar effort at the Massachusetts Institute of Technology(US) to have a facility near Boston, Massachusetts(US). Involvement was quickly solicited from representatives of northeastern universities to the south and west of New York City such that this city would be at their geographic center. In March 1946 a nonprofit corporation was established that consisted of representatives from nine major research universities — Columbia(US), Cornell(US), Harvard(US), Johns Hopkins(US), MIT, Princeton University(US), University of Pennsylvania(US), University of Rochester(US), and Yale University(US).

    Out of 17 considered sites in the Boston-Washington corridor, Camp Upton on Long Island was eventually chosen as the most suitable in consideration of space, transportation, and availability. The camp had been a training center for the US Army during both World War I and World War II. After the latter war, Camp Upton was deemed no longer necessary and became available for reuse. A plan was conceived to convert the military camp into a research facility.

    On March 21, 1947, the Camp Upton site was officially transferred from the U.S. War Department to the new U.S. Atomic Energy Commission (AEC), predecessor to the U.S. Department of Energy (DOE).

    Research and facilities

    Reactor history

    In 1947 construction began on the first nuclear reactor at Brookhaven, the Brookhaven Graphite Research Reactor. This reactor, which opened in 1950, was the first reactor to be constructed in the United States after World War II. The High Flux Beam Reactor operated from 1965 to 1999. In 1959 Brookhaven built the first US reactor specifically tailored to medical research, the Brookhaven Medical Research Reactor, which operated until 2000.

    Accelerator history

    In 1952 Brookhaven began using its first particle accelerator, the Cosmotron. At the time the Cosmotron was the world’s highest energy accelerator, being the first to impart more than 1 GeV of energy to a particle.

    BNL Cosmotron 1952-1966

    The Cosmotron was retired in 1966, after it was superseded in 1960 by the new Alternating Gradient Synchrotron (AGS).

    BNL Alternating Gradient Synchrotron (AGS)

    The AGS was used in research that resulted in 3 Nobel prizes, including the discovery of the muon neutrino, the charm quark, and CP violation.

    In 1970 BNL started the ISABELLE project to develop and build two intersecting proton storage rings.

    The groundbreaking for the project was in October 1978. In 1981, with the tunnel for the accelerator already excavated, problems with the superconducting magnets needed for the ISABELLE accelerator brought the project to a halt, and the project was eventually cancelled in 1983.

    The National Synchrotron Light Source operated from 1982 to 2014 and was involved with two Nobel Prize-winning discoveries. It has since been replaced by the National Synchrotron Light Source II [below].

    BNL NSLS.

    After ISABELLE’s cancellation, physicists at BNL proposed that the excavated tunnel and parts of the magnet assembly be used in another accelerator. In 1984 the first proposal for the accelerator now known as the Relativistic Heavy Ion Collider (RHIC)[below] was put forward. Construction was funded in 1991 and RHIC has been operational since 2000. One of the world’s only two operating heavy-ion colliders, RHIC is as of 2010 the second-highest-energy collider after the Large Hadron Collider(CH). RHIC is housed in a tunnel 2.4 miles (3.9 km) long and is visible from space.

    On January 9, 2020, it was announced by Paul Dabbar, undersecretary of the US Department of Energy Office of Science, that the BNL eRHIC design had been selected over the conceptual design put forward by DOE’s Thomas Jefferson National Accelerator Facility [JLab] as the future Electron-Ion Collider (EIC) in the United States.

    Electron-Ion Collider (EIC) at BNL, to be built inside the tunnel that currently houses the RHIC.

    In addition to the site selection, it was announced that the BNL EIC had acquired CD-0 (mission need) approval from the Department of Energy. BNL’s eRHIC design proposes upgrading the existing Relativistic Heavy Ion Collider, which collides beams of ions ranging from light to heavy (including polarized protons), with a polarized electron facility to be housed in the same tunnel.

    Other discoveries

    In 1958, Brookhaven scientists created one of the world’s first video games, Tennis for Two. In 1968 Brookhaven scientists patented Maglev, a transportation technology that utilizes magnetic levitation.

    Major facilities

    Relativistic Heavy Ion Collider (RHIC), which was designed to research quark–gluon plasma and the sources of proton spin. Until 2009 it was the world’s most powerful heavy ion collider. It is the only collider of spin-polarized protons.
    Center for Functional Nanomaterials (CFN), used for the study of nanoscale materials.
    National Synchrotron Light Source II (NSLS-II), Brookhaven’s newest user facility, opened in 2015 to replace the National Synchrotron Light Source (NSLS), which had operated for 30 years. NSLS was involved in the work that won the 2003 and 2009 Nobel Prizes in Chemistry.
    Alternating Gradient Synchrotron, a particle accelerator that was used in three of the lab’s Nobel prizes.
    Accelerator Test Facility, generates, accelerates and monitors particle beams.
    Tandem Van de Graaff, once the world’s largest electrostatic accelerator.
    Computational Science resources, including access to a massively parallel Blue Gene series supercomputer that is among the fastest in the world for scientific research, run jointly by Brookhaven National Laboratory and Stony Brook University.
    Interdisciplinary Science Building, with unique laboratories for studying high-temperature superconductors and other materials important for addressing energy challenges.
    NASA Space Radiation Laboratory, where scientists use beams of ions to simulate cosmic rays and assess the risks of space radiation to human space travelers and equipment.

    Off-site contributions

    It is a contributing partner to the ATLAS experiment, one of the four large detectors located at the Large Hadron Collider (LHC).

    CERN map

    Iconic view of the CERN (CH) ATLAS detector.

    It is currently operating at CERN near Geneva, Switzerland.

    Brookhaven was also responsible for the design of the SNS accumulator ring, in partnership with the Spallation Neutron Source at DOE’s Oak Ridge National Laboratory in Tennessee.

    ORNL Spallation Neutron Source annotated.

    Brookhaven plays a role in a range of neutrino research projects around the world, including the Daya Bay Reactor Neutrino Experiment in China and the Deep Underground Neutrino Experiment at DOE’s Fermi National Accelerator Laboratory(US).

    Daya Bay, nuclear power plant, approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA.

    Brookhaven Campus.

    BNL Center for Functional Nanomaterials.

    BNL NSLS-II.

    BNL NSLS II.

    BNL RHIC Campus.

    BNL/RHIC Star Detector.

    BNL/RHIC Phenix.

     
  • richardmitnick 11:56 am on March 5, 2021 Permalink | Reply
    Tags: "Physicists Just Found 4 New Subatomic Particles That May Test The Laws of Nature", , , , , CERN(CH), , Hadrons, , Mesons, , , , Protons and neutrons, QCD: Quantum Chromodynamics, Quarks and antiquarks, , , , Tetraquarks and pentaquarks, The four new particles we've discovered recently are all tetraquarks with a charm quark pair and two other quarks., The standard model is certainly not the last word in the understanding of particles., These models are crucial to achieve the ultimate goal of the LHC: find physics beyond the standard model.   

    From CERN(CH) via Science Alert(AU): “Physicists Just Found 4 New Subatomic Particles That May Test The Laws of Nature” 

    Cern New Bloc

    Cern New Particle Event


    From CERN(CH)

    via

    ScienceAlert

    Science Alert(AU)

    5 MARCH 2021
    PATRICK KOPPENBURG
    Research Fellow in Particle Physics
    Dutch National Institute for Subatomic Physics, Dutch Research Council (NWO – Nederlandse Organisatie voor Wetenschappelijk Onderzoek)(NL)

    Harry Cliff
    Particle physicist
    University of Cambridge(UK).

    1
    The Large Hadron Collider. Credit: CERN.

    This month is a time to celebrate. CERN has just announced the discovery of four brand new particles [3 March 2021: Observation of two ccus tetraquarks and two ccss tetraquarks.] at the Large Hadron Collider (LHC) in Geneva.

    This means that the LHC has now found a total of 59 new particles, in addition to the Nobel prize-winning Higgs boson, since it started colliding protons – particles that make up the atomic nucleus along with neutrons – in 2009.

    Excitingly, while some of these new particles were expected based on our established theories, some were altogether more surprising.

    The LHC’s goal is to explore the structure of matter at the shortest distances and highest energies ever probed in the lab – testing our current best theory of nature: the Standard Model of Particle Physics.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS).

    And the LHC has delivered the goods – it enabled scientists to discover the Higgs boson [below], the last missing piece of the model. That said, the theory is still far from being fully understood.

    One of its most troublesome features is its description of the strong interaction which holds the atomic nucleus together. The nucleus is made up of protons and neutrons, which are in turn each composed of three tiny particles called quarks (there are six different kinds of quarks: up, down, charm, strange, top and bottom).

    If we switched the strong force off for a second, all matter would immediately disintegrate into a soup of loose quarks – a state that existed for a fleeting instant at the beginning of the universe.

    Don’t get us wrong: the theory of the strong interaction, pretentiously called Quantum Chromodynamics, is on very solid footing. It describes how quarks interact through the strong interaction by exchanging particles called gluons. You can think of gluons as analogues of the more familiar photon, the particle of light and carrier of the electromagnetic interaction.

    However, the way gluons interact with quarks makes the strong interaction behave very differently from electromagnetism. While the electromagnetic interaction gets weaker as you pull two charged particles apart, the strong interaction actually gets stronger as you pull two quarks apart.

    As a result, quarks are forever locked up inside particles called hadrons – particles made of two or more quarks – which include protons and neutrons. Unless, of course, you smash them open at incredible speeds, as we are doing at CERN.
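
    One common way to picture this behaviour is the phenomenological "Cornell" potential used in quarkonium models, sketched below. It is an effective description rather than a first-principles QCD result, but it captures why the energy stored between two quarks keeps growing with their separation.

```latex
% Cornell potential between a heavy quark and antiquark (phenomenological)
% alpha_s: strong coupling, sigma: string tension (roughly 1 GeV/fm), r: separation
\[
V(r) \;\simeq\; -\frac{4}{3}\,\frac{\alpha_s}{r} \;+\; \sigma\, r .
\]
% The Coulomb-like first term dominates at short distances; the linear term
% grows without bound, so pulling the pair apart eventually supplies enough
% energy to create a new quark-antiquark pair instead of freeing a lone quark.
```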

    To complicate matters further, all the particles in the standard model have antiparticles which are nearly identical to themselves but with the opposite charge (or other quantum property). If you pull a quark out of a proton, the force will eventually be strong enough to create a quark-antiquark pair, with the newly created quark going into the proton.

    You end up with a proton and a brand new “meson”, a particle made of a quark and an antiquark. This may sound weird but according to quantum mechanics, which rules the universe on the smallest of scales, particles can pop out of empty space.

    This has been shown repeatedly by experiments – we have never seen a lone quark. An unpleasant feature of the theory of the strong interaction is that calculations of what would be a simple process in electromagnetism can end up being impossibly complicated. We therefore cannot (yet) prove theoretically that quarks can’t exist on their own.

    Worse still, we can’t even calculate which combinations of quarks would be viable in nature and which would not.

    2
    Illustration of a tetraquark. Credit: CERN.

    When quarks were first discovered, scientists realized that several combinations should be possible in theory. This included pairs of quarks and antiquarks (mesons); three quarks (baryons); three antiquarks (antibaryons); two quarks and two antiquarks (tetraquarks); and four quarks and one antiquark (pentaquarks) – as long as the number of quarks minus antiquarks in each combination was a multiple of three.
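
    The counting rule in that last sentence is easy to check mechanically. The short sketch below simply tests which small quark/antiquark combinations satisfy it; the names are labels for illustration only, and the rule is applied exactly as stated in the paragraph above.

```python
# Which small quark/antiquark combinations pass the "multiple of three" rule?
# (Illustrative counting only, not a statement about binding dynamics.)

candidates = {
    "single quark": (1, 0),
    "meson":        (1, 1),   # quark + antiquark
    "baryon":       (3, 0),
    "antibaryon":   (0, 3),
    "tetraquark":   (2, 2),
    "pentaquark":   (4, 1),
}

for name, (n_quarks, n_antiquarks) in candidates.items():
    allowed = (n_quarks - n_antiquarks) % 3 == 0
    verdict = "allowed" if allowed else "forbidden"
    print(f"{name:13s} quarks={n_quarks} antiquarks={n_antiquarks} -> {verdict}")
# A lone quark fails the rule, which is consistent with never seeing one in isolation.
```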

    For a long time, only baryons and mesons were seen in experiments. But in 2003, the Belle experiment in Japan discovered a particle that didn’t fit in anywhere.

    KEK Belle detector, at the High Energy Accelerator Research Organisation (KEK) in Tsukuba, Ibaraki Prefecture, Japan.

    Belle II KEK High Energy Accelerator Research Organization Tsukuba, Japan.

    It turned out to be the first of a long series of tetraquarks.

    In 2015, the LHCb experiment [below] at the LHC discovered two pentaquarks.

    3
    Is a pentaquark tightly (above) or weakly bound (see image below)? Credit: CERN.

    The four new particles we’ve discovered recently are all tetraquarks with a charm quark pair and two other quarks. All these objects are particles in the same way as the proton and the neutron are particles. But they are not fundamental particles: quarks and electrons are the true building blocks of matter.

    Charming new particles

    The LHC has now discovered 59 new hadrons. These include the tetraquarks most recently discovered, but also new mesons and baryons. All these new particles contain heavy quarks such as “charm” and “bottom”.

    These hadrons are interesting to study. They tell us what nature considers acceptable as a bound combination of quarks, even if only for very short times.

    They also tell us what nature does not like. For example, why do all tetra- and pentaquarks contain a charm-quark pair (with just one exception)? And why are there no corresponding particles with strange-quark pairs? There is currently no explanation.

    4
    Is a pentaquark a molecule? A meson (left) interacting with a proton (right). Credit: CERN.

    Another mystery is how these particles are bound together by the strong interaction. One school of theorists considers them to be compact objects, like the proton or the neutron.

    Others claim they are akin to “molecules” formed by two loosely bound hadrons. Each newly found hadron allows experiments to measure its mass and other properties, which tell us something about how the strong interaction behaves. This helps bridge the gap between experiment and theory. The more hadrons we can find, the better we can tune the models to the experimental facts.

    These models are crucial to achieve the ultimate goal of the LHC: find physics beyond the standard model. Despite its successes, the standard model is certainly not the last word in the understanding of particles. It is for instance inconsistent with cosmological models describing the formation of the universe.

    The LHC is searching for new fundamental particles that could explain these discrepancies. These particles could be visible at the LHC, but hidden in the background of particle interactions. Or they could show up as small quantum mechanical effects in known processes.

    In either case, a better understanding of the strong interaction is needed to find them. With each new hadron, we improve our knowledge of nature’s laws, leading us to a better description of the most fundamental properties of matter.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN(CH) in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier(CH)

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS


    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Maximilien Brice and Julien Marius Ordan.


    SixTRack CERN LHC particles

    The European Organization for Nuclear Research (Organisation européenne pour la recherche nucléaire)(EU), known as CERN, is a European research organization that operates the largest particle physics laboratory in the world. Established in 1954, the organization is based in a northwest suburb of Geneva on the Franco–Swiss border and has 23 member states. Israel is the only non-European country granted full membership. CERN is an official United Nations Observer.

    The acronym CERN is also used to refer to the laboratory, which in 2019 had 2,660 scientific, technical, and administrative staff members, and hosted about 12,400 users from institutions in more than 70 countries. In 2016 CERN generated 49 petabytes of data.

    CERN’s main function is to provide the particle accelerators and other infrastructure needed for high-energy physics research – as a result, numerous experiments have been constructed at CERN through international collaborations. The main site at Meyrin hosts a large computing facility, which is primarily used to store and analyse data from experiments, as well as simulate events. Researchers need remote access to these facilities, so the lab has historically been a major wide area network hub. CERN is also the birthplace of the World Wide Web.

    The convention establishing CERN was ratified on 29 September 1954 by 12 countries in Western Europe. The acronym CERN originally represented the French words for Conseil Européen pour la Recherche Nucléaire (European Council for Nuclear Research), which was a provisional council for building the laboratory, established by 12 European governments in 1952. The acronym was retained for the new laboratory after the provisional council was dissolved, even though the name changed to the current Organisation Européenne pour la Recherche Nucléaire (European Organization for Nuclear Research)(EU) in 1954. According to Lew Kowarski, a former director of CERN, when the name was changed, the abbreviation could have become the awkward OERN, and Werner Heisenberg said that this could “still be CERN even if the name is [not]”.

    CERN’s first president was Sir Benjamin Lockspeiser. Edoardo Amaldi was the general secretary of CERN at its early stages when operations were still provisional, while the first Director-General (1954) was Felix Bloch.

    The laboratory was originally devoted to the study of atomic nuclei, but was soon applied to higher-energy physics, concerned mainly with the study of interactions between subatomic particles. Therefore, the laboratory operated by CERN is commonly referred to as the European laboratory for particle physics (Laboratoire européen pour la physique des particules), which better describes the research being performed there.

    Founding members

    At the sixth session of the CERN Council, which took place in Paris from 29 June – 1 July 1953, the convention establishing the organization was signed, subject to ratification, by 12 states. The convention was gradually ratified by the 12 founding Member States: Belgium, Denmark, France, the Federal Republic of Germany, Greece, Italy, the Netherlands, Norway, Sweden, Switzerland, the United Kingdom, and “Yugoslavia”.

    Scientific achievements

    Several important achievements in particle physics have been made through experiments at CERN. They include:

    1973: The discovery of neutral currents in the Gargamelle bubble chamber.
    1983: The discovery of W and Z bosons in the UA1 and UA2 experiments.
    1989: The determination of the number of light neutrino families at the Large Electron–Positron Collider (LEP) operating on the Z boson peak.
    1995: The first creation of antihydrogen atoms in the PS210 experiment.
    1999: The discovery of direct CP violation in the NA48 experiment.
    2010: The isolation of 38 atoms of antihydrogen.
    2011: Maintaining antihydrogen for over 15 minutes.
    2012: A boson with mass around 125 GeV/c2 consistent with the long-sought Higgs boson.

    In September 2011, CERN attracted media attention when the OPERA Collaboration reported the detection of possibly faster-than-light neutrinos. Further tests showed that the results were flawed due to an incorrectly connected GPS synchronization cable.

    The 1984 Nobel Prize for Physics was awarded to Carlo Rubbia and Simon van der Meer for the developments that resulted in the discoveries of the W and Z bosons. The 1992 Nobel Prize for Physics was awarded to CERN staff researcher Georges Charpak “for his invention and development of particle detectors, in particular the multiwire proportional chamber”. The 2013 Nobel Prize for Physics was awarded to François Englert and Peter Higgs for the theoretical description of the Higgs mechanism in the year after the Higgs boson was found by CERN experiments.

    Computer science

    The World Wide Web began as a CERN project named ENQUIRE, initiated by Tim Berners-Lee in 1989 and Robert Cailliau in 1990. Berners-Lee and Cailliau were jointly honoured by the Association for Computing Machinery in 1995 for their contributions to the development of the World Wide Web.

    Current complex

    CERN operates a network of six accelerators and a decelerator. Each machine in the chain increases the energy of particle beams before delivering them to experiments or to the next more powerful accelerator. Currently (as of 2019) active machines are:

    The LINAC 3 linear accelerator generating low energy particles. It provides heavy ions at 4.2 MeV/u for injection into the Low Energy Ion Ring (LEIR).
    The Proton Synchrotron Booster increases the energy of particles generated by the proton linear accelerator before they are transferred to the other accelerators.
    The Low Energy Ion Ring (LEIR) accelerates the ions from the ion linear accelerator LINAC 3, before transferring them to the Proton Synchrotron (PS). This accelerator was commissioned in 2005, after having been reconfigured from the previous Low Energy Antiproton Ring (LEAR).
    The 28 GeV Proton Synchrotron (PS), built during 1954—1959 and still operating as a feeder to the more powerful SPS.
    The Super Proton Synchrotron (SPS), a circular accelerator with a diameter of 2 kilometres built in a tunnel, which started operation in 1976. It was designed to deliver an energy of 300 GeV and was gradually upgraded to 450 GeV. As well as having its own beamlines for fixed-target experiments (currently COMPASS and NA62), it has been operated as a proton–antiproton collider (the SppS collider), and for accelerating high energy electrons and positrons which were injected into the Large Electron–Positron Collider (LEP). Since 2008, it has been used to inject protons and heavy ions into the Large Hadron Collider (LHC).
    The On-Line Isotope Mass Separator (ISOLDE), which is used to study unstable nuclei. The radioactive ions are produced by the impact of protons at an energy of 1.0–1.4 GeV from the Proton Synchrotron Booster. It was first commissioned in 1967 and was rebuilt with major upgrades in 1974 and 1992.
    The Antiproton Decelerator (AD), which reduces the velocity of antiprotons to about 10% of the speed of light for research of antimatter. The AD machine was reconfigured from the previous Antiproton Collector (AC) machine.
    The AWAKE experiment, which is a proof-of-principle plasma wakefield accelerator.
    The CERN Linear Electron Accelerator for Research (CLEAR) accelerator research and development facility.

    Large Hadron Collider

    Many activities at CERN currently involve operating the Large Hadron Collider (LHC) and the experiments for it. The LHC represents a large-scale, worldwide scientific cooperation project.

    The LHC tunnel is located 100 metres underground, in the region between the Geneva International Airport and the nearby Jura mountains. The majority of its length is on the French side of the border. It uses the 27 km circumference circular tunnel previously occupied by the Large Electron–Positron Collider (LEP), which was shut down in November 2000. CERN’s existing PS/SPS accelerator complexes are used to pre-accelerate protons and lead ions which are then injected into the LHC.

    Eight experiments (CMS, ATLAS, LHCb, MoEDAL, TOTEM, LHCf, FASER and ALICE) are located along the collider; each of them studies particle collisions from a different aspect, and with different technologies. Construction for these experiments required an extraordinary engineering effort. For example, a special crane was rented from Belgium to lower pieces of the CMS detector into its cavern, since each piece weighed nearly 2,000 tons. The first of the approximately 5,000 magnets necessary for construction was lowered down a special shaft at 13:00 GMT on 7 March 2005.

    The LHC has begun to generate vast quantities of data, which CERN streams to laboratories around the world for distributed processing (making use of a specialized grid infrastructure, the LHC Computing Grid). During April 2005, a trial successfully streamed 600 MB/s to seven different sites across the world.

    The initial particle beams were injected into the LHC in August 2008. The first beam was circulated through the entire LHC on 10 September 2008, but the system failed days later because of a faulty magnet connection, and it was stopped for repairs on 19 September 2008.

    The LHC resumed operation on 20 November 2009 by successfully circulating two beams, each with an energy of 3.5 teraelectronvolts (TeV). The challenge for the engineers was then to try to line up the two beams so that they smashed into each other. This is like “firing two needles across the Atlantic and getting them to hit each other” according to Steve Myers, director for accelerators and technology.

    On 30 March 2010, the LHC successfully collided two proton beams with 3.5 TeV of energy per proton, resulting in a 7 TeV collision energy. However, this was just the start of what was needed for the expected discovery of the Higgs boson. When the 7 TeV experimental period ended, the LHC revved to 8 TeV (4 TeV per proton) starting March 2012, and soon began particle collisions at that energy. In July 2012, CERN scientists announced the discovery of a new sub-atomic particle that was later confirmed to be the Higgs boson.

    CERN CMS Higgs Event May 27, 2012.


    CERN ATLAS Higgs Event
    June 12, 2012.


    Peter Higgs

    In March 2013, CERN announced that the measurements performed on the newly found particle allowed it to conclude that this is a Higgs boson. In early 2013, the LHC was deactivated for a two-year maintenance period, to strengthen the electrical connections between magnets inside the accelerator and for other upgrades.

    On 5 April 2015, after two years of maintenance and consolidation, the LHC restarted for a second run. The first ramp to the record-breaking energy of 6.5 TeV was performed on 10 April 2015. In 2016, the design collision rate was exceeded for the first time. A second two-year period of shutdown began at the end of 2018.

    Accelerators under construction

    As of October 2019, the construction is on-going to upgrade the LHC’s luminosity in a project called High Luminosity LHC (HL-LHC).

    This project should see the LHC accelerator upgraded by 2026 to an order of magnitude higher luminosity.

    As part of the HL-LHC upgrade project, also other CERN accelerators and their subsystems are receiving upgrades. Among other work, the LINAC 2 linear accelerator injector was decommissioned, to be replaced by a new injector accelerator, the LINAC4 in 2020.

    Possible future accelerators

    CERN, in collaboration with groups worldwide, is investigating two main concepts for future accelerators: A linear electron-positron collider with a new acceleration concept to increase the energy (CLIC) and a larger version of the LHC, a project currently named Future Circular Collider.

    CLIC collider

    CERN FCC Future Circular Collider details of proposed 100 km-circumference successor to LHC.

    Not discussed or described here, but worthy of consideration, is the ILC (International Linear Collider), in the planning stages for construction in Japan.

    ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan.

    Participation

    Since its foundation by 12 members in 1954, CERN has regularly accepted new members. All new members have remained in the organization continuously since their accession, except Spain and Yugoslavia. Spain first joined CERN in 1961, withdrew in 1969, and rejoined in 1983. Yugoslavia was a founding member of CERN but quit in 1961. Of the 23 members, Israel joined CERN as a full member on 6 January 2014, becoming the first (and currently only) non-European full member.

    Enlargement

    Associate Members, Candidates:

    Turkey signed an association agreement on 12 May 2014 and became an associate member on 6 May 2015.
    Pakistan signed an association agreement on 19 December 2014 and became an associate member on 31 July 2015.
    Cyprus signed an association agreement on 5 October 2012 and became an associate Member in the pre-stage to membership on 1 April 2016.
    Ukraine signed an association agreement on 3 October 2013. The agreement was ratified on 5 October 2016.
    India signed an association agreement on 21 November 2016. The agreement was ratified on 16 January 2017.
    Slovenia was approved for admission as an Associate Member state in the pre-stage to membership on 16 December 2016. The agreement was ratified on 4 July 2017.
    Lithuania was approved for admission as an Associate Member state on 16 June 2017. The association agreement was signed on 27 June 2017 and ratified on 8 January 2018.
    Croatia was approved for admission as an Associate Member state on 28 February 2019. The agreement was ratified on 10 October 2019.
    Estonia was approved for admission as an Associate Member in the pre-stage to membership state on 19 June 2020. The agreement was ratified on 1 February 2021.

     
  • richardmitnick 4:04 pm on January 27, 2021 Permalink | Reply
    Tags: "Size of helium nucleus measured more precisely than ever before", , Helium is the second most abundant element in the universe., Helium=two protons and two neutrons., , , , Paul Scherrer Institute [Paul Scherrer Institut](CH), , Proton radius mystery is fading away, QCD: Quantum Chromodynamics, , Resonance frequency, Rydberg constant, Slow muons; complicated laser system,   

    From Paul Scherrer Institute [Paul Scherrer Institut](CH): “Size of helium nucleus measured more precisely than ever before” 

    From Paul Scherrer Institute [Paul Scherrer Institut](CH)

    27 January 2021

    Text: Barbara Vonarburg

    Dr. Aldo Antognini
    Labor für Teilchenphysik
    Paul Scherrer Institute, Forschungsstrasse 111, 5232 Villigen PSI (CH)
    &
    Institute for Particle Physics and Astrophysics
    ETH Zürich, Otto-Stern-Weg 5, 8093 Zürich (CH)
    +41 56 310 46 14
    aldo.antognini@psi.ch

    Dr. Franz Kottmann
    Labor für Teilchenphysik
    Paul Scherrer Institute, Forschungsstrasse 111, 5232 Villigen PSI (CH)
    Institute for Particle Physics and Astrophysics
    ETH Zürich, Otto-Stern-Weg 5, 8093 Zürich (CH)
    +41 79273 16 39
    franz.kottmann@psi.ch

    Dr. Julian J. Krauth
    LaserLaB, Faculty of Sciences
    Quantum Metrology and Laser Applications
    Vrije Universiteit Amsterdam
    De Boelelaan 1081, 1081HV Amsterdam (NL)
    +31 20 5987438
    j.krauth@vu.nl

    Prof. Dr. Randolf Pohl
    Institut für Physik
    Johannes Gutenberg Universität, 55128 Mainz (DE)
    +49 171 41 70 752, e-mail:
    pohl@uni-mainz.de

    In experiments at the Paul Scherrer Institute PSI, an international research collaboration has measured the radius of the atomic nucleus of helium five times more precisely than ever before. With the aid of the new value, fundamental physical theories can be tested and natural constants can be determined even more precisely. For their measurements, the researchers needed muons – these particles are similar to electrons but are around 200 times heavier. PSI is the only research site in the world where enough so-called low-energy muons are produced for such experiments. The researchers are publishing their results today in the journal Nature.

    1
    Both Franz Kottmann (left) and Karsten Schuhmann did essential preparatory work for the crucial experiment. Credit: Paul Scherrer Institute/Markus Fischer.

    After hydrogen, helium is the second most abundant element in the universe. Around one-fourth of the atomic nuclei that formed in the first few minutes after the Big Bang were helium nuclei. These consist of four building blocks: two protons and two neutrons. For fundamental physics, it is crucial to know the properties of the helium nucleus, among other things to understand the processes in other atomic nuclei that are heavier than helium. “The helium nucleus is a very fundamental nucleus, which could be described as magical,” says Aldo Antognini, a physicist at PSI and ETH Zürich (CH). His colleague and co-author Randolf Pohl from Johannes Gutenberg University Mainz (DE) adds: “Our previous knowledge about the helium nucleus comes from experiments with electrons. At PSI, however, we have for the first time developed a new type of measurement method that allows much better accuracy.”

    With this, the international research collaboration succeeded in determining the size of the helium nucleus around five times more precisely than was possible in previous measurements. The group is publishing its results today in the renowned scientific journal Nature [above]. According to their findings, the so-called mean charge radius of the helium nucleus is 1.67824 femtometers (there are 1 quadrillion femtometers in 1 meter).

    “The idea behind our experiments is simple,” explains Antognini. Normally two negatively charged electrons orbit the positively charged helium nucleus. “We don’t work with normal atoms, but with exotic atoms in which both electrons have been replaced by a single muon,” says the physicist. The muon is considered to be the electron’s heavier brother; it resembles it, but it’s around 200 times heavier. A muon is much more strongly bound to the atomic nucleus than an electron and encircles it in much narrower orbits. Compared with an electron, a muon is also much more likely to be found inside the nucleus itself. “So with muonic helium, we can draw conclusions about the structure of the atomic nucleus and measure its properties,” Antognini explains.
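
    As a rough, Bohr-model illustration of those "much narrower orbits": the sketch below uses the bare muon mass rather than the reduced mass and ignores all quantum corrections, so the numbers are order-of-magnitude only.

```python
# Bohr-model estimate of how much closer a muon orbits a helium nucleus
# than an electron does.  Illustrative only; reduced-mass effects ignored.
hbar = 1.054571817e-34    # J*s, reduced Planck constant
c = 2.99792458e8          # m/s, speed of light
alpha = 7.2973525693e-3   # fine-structure constant
m_e = 9.1093837015e-31    # kg, electron mass
m_mu = 1.883531627e-28    # kg, muon mass
Z = 2                     # helium nuclear charge

def bohr_radius(mass, Z):
    """Ground-state orbit radius of a hydrogen-like system in the Bohr model."""
    return hbar / (Z * alpha * mass * c)

a_electron = bohr_radius(m_e, Z)
a_muon = bohr_radius(m_mu, Z)
print(f"electron orbit ~ {a_electron:.2e} m, muon orbit ~ {a_muon:.2e} m")
print(f"ratio ~ {a_electron / a_muon:.0f}  (roughly the muon/electron mass ratio)")
```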

    Slow muons, complicated laser system

    The muons are produced at PSI using a particle accelerator. The specialty of the facility: generating muons with low energy. These particles are slow and can be stopped in the apparatus for experiments. This is the only way researchers can form the exotic atoms in which a muon throws an electron out of its orbit and replaces it. Fast muons, in contrast, would fly right through the apparatus. The PSI system delivers more low-energy muons than all other comparable systems worldwide. “That is why the experiment with muonic helium can only be carried out here,” says Franz Kottmann, who for 40 years has been pressing ahead with the necessary preliminary studies and technical developments for this experiment.

    The muons hit a small chamber filled with helium gas. If the conditions are right, muonic helium is created, with the muon in an energy state in which it spends a significant fraction of its time inside the atomic nucleus. “Now the second important component for the experiment comes into play: the laser system,” Pohl explains. The complicated system shoots a laser pulse at the helium gas. If the laser light has the right frequency, it excites the muon and advances it to a higher energy state, in which its path is practically always outside the nucleus. When it falls from this to the ground state, it emits X-rays. Detectors register these X-ray signals.

    In the experiment, the laser frequency is varied until a large number of X-ray signals arrive; physicists then speak of the resonance frequency. With its help, the difference between the two energy states of the muon in the atom can be determined. According to theory, the measured energy difference depends on how large the atomic nucleus is. Hence, using the theoretical relation, the radius can be determined from the measured resonance. This data analysis was carried out in Randolf Pohl’s group in Mainz (DE).
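
    As a rough sketch of how a measured resonance is turned into a radius: at leading order, the finite size of the nucleus shifts the energy of muonic S states by an amount proportional to the mean square charge radius. The expression below is the textbook leading-order term only; the actual extraction includes many QED and nuclear-structure corrections not shown here.

```latex
% Leading-order finite-nuclear-size shift of an nS level in a muonic atom
% Z: nuclear charge, alpha: fine-structure constant, m_r: reduced muon mass,
% <r^2>: mean square charge radius of the nucleus
\[
\Delta E_{\mathrm{size}}
  \;=\; \frac{2}{3\,n^{3}}\,(Z\alpha)^{4}\,
        \frac{m_r^{3} c^{4}}{\hbar^{2}}\,\langle r^{2}\rangle ,
\]
% so measuring the transition frequency and subtracting the radius-independent
% part of the splitting determines <r^2>, and hence the charge radius.
```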

    Proton radius mystery is fading away

    The researchers at PSI had already measured the radius of the proton in the same way in 2010. At that time, their value did not match that obtained by other measurement methods. There was talk of a proton radius puzzle, and some speculated that a new physics might lie behind it in the form of a previously unknown interaction between the muon and the proton. This time there is no contradiction between the new, more precise value and the measurements with other methods. “This makes the explanation of the results with physics beyond the standard model more improbable,” says Kottmann. In addition, in recent years the value of the proton radius determined by means of other methods has been approaching the precise number from PSI. “The proton radius puzzle still exists, but it is slowly fading away,” says Kottmann.

    “Our measurement can be used in different ways,” says Julian Krauth, first author of the study: “The radius of the helium nucleus is an important touchstone for nuclear physics.” Atomic nuclei are held together by the so-called strong interaction, one of the four fundamental forces in physics. With the theory of strong interaction, known as quantum chromodynamics, physicists would like to be able to predict the radius of the helium nucleus and other light atomic nuclei with a few protons and neutrons. The extremely precisely measured value for the radius of the helium nucleus puts these predictions to the test. This also makes it possible to test new theoretical models of the nuclear structure and to understand atomic nuclei even better.

    The measurements on muonic helium can also be compared with experiments using normal helium atoms and ions. In experiments on these, too, energy transitions can be triggered and measured with laser systems – here, though, with electrons instead of muons. Measurements on electronic helium are under way right now. By comparing the results of the two measurements, one can draw conclusions about fundamental natural constants such as the Rydberg constant, which plays an important role in quantum mechanics.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Paul Scherrer Institute [Paul Scherrer Institut](CH) is the largest research institute for natural and engineering sciences within Switzerland. We perform world-class research in three main subject areas: Matter and Material; Energy and the Environment; and Human Health. By conducting fundamental and applied research, we work on long-term solutions for major challenges facing society, industry and science.

    The Paul Scherrer Institute (PSI) is a multi-disciplinary research institute for natural and engineering sciences in Switzerland. It is located in the Canton of Aargau in the municipalities of Villigen and Würenlingen on either side of the River Aare, and covers an area of over 35 hectares. Like ETH Zurich and EPFL, PSI belongs to the Swiss Federal Institutes of Technology Domain of the Swiss Confederation. PSI employs around 2100 people. It conducts basic and applied research in the fields of matter and materials, human health, and energy and the environment. About 37% of PSI’s research activities focus on material sciences, 24% on life sciences, 19% on general energy, 11% on nuclear energy and safety, and 9% on particle physics.

    PSI develops, builds and operates large and complex research facilities and makes them available to the national and international scientific communities. In 2017, for example, more than 2500 researchers from 60 different countries came to PSI to take advantage of the concentration of large-scale research facilities in the same location, which is unique worldwide. About 1900 experiments are conducted each year at the approximately 40 measuring stations in these facilities.

    In recent years, the institute has been one of the largest recipients of money from the Swiss lottery fund.

     
  • richardmitnick 11:00 am on May 8, 2020 Permalink | Reply
    Tags: "What Goes On in a Proton? Quark Math Still Conflicts With Experiments", A million-dollar math prize awaits anyone who can solve the type of equation used in QCD to show how massive entities like protons form., “We know absolutely that quarks and gluons interact with each other but we can’t calculate” the result., , , QCD: Quantum Chromodynamics, , , The discovery of quarks in the 1960s broke everything., The holographic principle   

    From Quanta Magazine: “What Goes On in a Proton? Quark Math Still Conflicts With Experiments” 

    From Quanta Magazine

    May 6, 2020
    Charlie Wood

    The quark structure of the proton. Credit: Arpad Horvath, 16 March 2006.

    Objects are made of atoms, and atoms are likewise the sum of their parts — electrons, protons and neutrons. Dive into one of those protons or neutrons, however, and things get weird. Three particles called quarks ricochet back and forth at nearly the speed of light, snapped back by interconnected strings of particles called gluons. Bizarrely, the proton’s mass must somehow arise from the energy of the stretchy gluon strings, since quarks weigh very little and gluons nothing at all.
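    The arithmetic behind that statement is quick to check; a back-of-the-envelope sketch in Python, using the commonly quoted approximate light-quark masses (rounded, and technically scheme-dependent):

```python
# Rough arithmetic behind the claim that quark masses contribute almost
# nothing to the proton's mass. Quark masses are scheme-dependent; the values
# below are approximate current-quark masses, in MeV.

m_up, m_down = 2.2, 4.7        # approximate light-quark masses, MeV
m_proton = 938.3               # proton mass, MeV

quark_sum = 2 * m_up + m_down  # proton = uud
print(f"Sum of quark masses: {quark_sum:.1f} MeV")
print(f"Fraction of proton mass: {quark_sum / m_proton:.1%}")
# -> roughly 9 MeV, about 1% of the proton's 938 MeV; the rest is
#    interaction energy of the quark-gluon system.
```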

    Physicists uncovered this odd quark-gluon picture in the 1960s and matched it to an equation in the ’70s, creating the theory of quantum chromodynamics (QCD). The problem is that while the theory seems accurate, it is extraordinarily complicated mathematically. Faced with a task like calculating how three wispy quarks produce the hulking proton, QCD simply fails to produce a meaningful answer.

    “It’s tantalizing and frustrating,” said Mark Lancaster, a particle physicist based at the University of Manchester in the United Kingdom. “We know absolutely that quarks and gluons interact with each other, but we can’t calculate” the result.

    A million-dollar math prize awaits anyone who can solve the type of equation used in QCD to show how massive entities like protons form. Lacking such a solution, particle physicists have developed arduous workarounds that deliver approximate answers. Some infer quark activity experimentally at particle colliders, while others harness the world’s most powerful supercomputers. But these approximation techniques have recently come into conflict, leaving physicists unsure exactly what their theory predicts and thus less able to interpret signs of new, unpredicted particles or effects.

    To understand what makes quarks and gluons such mathematical scofflaws, consider how much mathematical machinery goes into describing even well-behaved particles.

    A humble electron, for instance, can briefly emit and then absorb a photon. During that photon’s short life, it can split into a pair of matter-antimatter particles, each of which can engage in further acrobatics, ad infinitum. As long as each individual event ends quickly, quantum mechanics allows the combined flurry of “virtual” activity to continue indefinitely.

    In the 1940s, after considerable struggle, physicists developed mathematical rules that could accommodate this bizarre feature of nature. Studying an electron involved breaking down its virtual entourage into a series of possible events, each corresponding to a squiggly drawing known as a Feynman diagram and a matching equation. A perfect analysis of the electron would require an infinite string of diagrams — and a calculation with infinitely many steps — but fortunately for the physicists, the more byzantine sketches of rarer events ended up being relatively inconsequential. Truncating the series gives good-enough answers.

    The discovery of quarks in the 1960s broke everything. By pelting protons with electrons, researchers uncovered the proton’s internal parts, bound by a novel force. Physicists raced to find a description that could handle these new building blocks, and they managed to wrap all the details of quarks and the “strong interaction” that binds them into a compact equation in 1973. But their theory of the strong interaction, quantum chromodynamics, didn’t behave in the usual way, and neither did the particles.

    Feynman diagrams treat particles as if they interact by approaching each other from a distance, like billiard balls. But quarks don’t act like this. The Feynman diagram representing three quarks coming together from a distance and binding to one another to form a proton is a mere “cartoon,” according to Flip Tanedo, a particle physicist at the University of California, Riverside, because quarks are bound so strongly that they have no separate existence. The strength of their connection also means that the infinite series of terms corresponding to the Feynman diagrams grows in an unruly fashion, rather than fading away quickly enough to permit an easy approximation. Feynman diagrams are simply the wrong tool.
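    A deliberately crude toy calculation illustrates the difference. It pretends that each extra loop order simply multiplies a contribution by the coupling strength, ignoring the real diagram-by-diagram coefficients, so treat it as a caricature rather than a genuine QED or QCD computation.

```python
# Toy illustration of why truncated Feynman-diagram expansions work for QED
# but fail for low-energy QCD. Each extra "loop" is caricatured as one more
# power of the coupling, with no combinatorial coefficients.

alpha_qed = 1 / 137          # electromagnetic coupling
alpha_qcd = 1.0              # strong coupling at proton-like energies is order one

for name, g in [("QED-like", alpha_qed), ("QCD-like", alpha_qcd)]:
    terms = [g ** n for n in range(6)]                  # 0-, 1-, 2-, ... loop terms
    partial_sums = [sum(terms[: n + 1]) for n in range(6)]
    print(name, "terms:", [f"{t:.2e}" for t in terms])
    print("  partial sums:", [f"{s:.3f}" for s in partial_sums])

# QED-like: each extra diagram is ~100x smaller, so truncating after a few
# terms barely changes the answer. QCD-like: every order is as big as the
# last, so no truncation is ever "good enough".
```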

    The strong interaction is weird for two main reasons. First, whereas the electromagnetic interaction involves just one variety of charge (electric charge), the strong interaction involves three: “color” charges nicknamed red, green and blue. Weirder still, the carrier of the strong interaction, dubbed the gluon, itself bears color charge. So while the (electrically neutral) photons that comprise electromagnetic fields don’t interact with each other, collections of colorful gluons draw together into strings. “That really drives the differences we see,” Lancaster said. The ability of gluons to trip over themselves, together with the three charges, makes the strong interaction strong — so strong that quarks can’t escape each other’s company.

    Evidence piled up over the decades that gluons exist and act as predicted in certain circumstances. But for most calculations, the QCD equation has proved intractable. Physicists need to know what QCD predicts, however — not just to understand quarks and gluons, but to pin down properties of other particles as well, since they’re all affected by the dance of quantum activity that includes virtual quarks.

    One approach has been to infer incalculable values by watching how quarks behave in experiments. “You take electrons and positrons and slam them together,” said Chris Polly, a particle physicist at the Fermi National Accelerator Laboratory, “and ask how often you make quark [products] in the final state.” From those measurements, he said, you can extrapolate how often quark bundles should pop up in the hubbub of virtual activity that surrounds all particles.

    Other researchers have continued to try to wring information from the canonical QCD equation by calculating approximate solutions using supercomputers. “You just keep throwing more computing cycles at it and your answer will keep getting better,” said Aaron Meyer, a particle physicist at Brookhaven National Laboratory.

    This computational approach, known as lattice QCD, turns computers into laboratories that model the behavior of digital quarks and gluons. The technique gets its name from the way it slices space-time into a grid of points. Quarks sit on the lattice points, and the QCD equation lets them interact. The denser the grid, the more accurate the simulation. The Fermilab physicist Andreas Kronfeld remembers how, three decades ago, these simulations had just a handful of lattice points on a side. But computing power has increased, and lattice QCD can now successfully predict the proton’s mass to within a few percent of the experimentally determined value.
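    Lattice QCD itself is far beyond a blog snippet, but the underlying numerical idea can be sketched in one dimension: replace continuous space by a grid and watch the discretization error shrink as the spacing decreases. The test function and spacings below are arbitrary choices for illustration, not anything from an actual lattice calculation.

```python
# Not lattice QCD, but the same numerical idea in one dimension: replace
# continuous space by a grid of spacing a, approximate derivatives by finite
# differences, and watch the discretization error shrink on finer grids.

import math

def lattice_second_derivative(f, x, a):
    """Nearest-neighbour lattice approximation of f''(x)."""
    return (f(x + a) - 2 * f(x) + f(x - a)) / a**2

x0 = 0.7
exact = -math.sin(x0)                      # exact f'' for f = sin
for a in [0.5, 0.25, 0.125, 0.0625]:       # ever finer "lattice spacing"
    approx = lattice_second_derivative(math.sin, x0, a)
    print(f"a = {a:7.4f}   error = {abs(approx - exact):.2e}")
# The error falls roughly as a**2 -- the 1D analogue of "the denser the grid,
# the more accurate the simulation".
```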

    Kronfeld is a spokesperson for USQCD, a federation of lattice QCD groups in the United States that have banded together to negotiate for bulk supercomputer time. He serves as the principal investigator for the federation’s efforts on the Summit supercomputer, currently the world’s fastest, located at Oak Ridge National Laboratory. USQCD runs one of Summit’s largest programs, occupying nearly 4% of the machine’s annual computing capacity.

    ORNL IBM AC922 SUMMIT supercomputer, No.1 on the TOP500. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

    Theorists thought these digital laboratories were still a year or two away from becoming competitive with the collider experiments in approximating the effects quarks have on other particles. But in February a European collaboration shocked the community with a preprint claiming to nail a magnetic property of a particle called the muon to within 1% of its true value, using novel noise reduction techniques. “You might think of it as throwing down the gauntlet,” said Aida El-Khadra, a high-energy theorist at the University of Illinois, Urbana-Champaign.

    The team’s prediction for virtual quark activity around the muon clashed with the inferences from electron-positron collisions, however. Meyer, who recently co-authored a survey of the conflicting results, says that many technical details in lattice QCD remain poorly understood, such as how to hop from the gritty lattice back to smooth space. Efforts to determine what QCD predicts for the muon, which many researchers consider a bellwether for undiscovered particles, are ongoing.

    Meanwhile, mathematically minded researchers haven’t entirely despaired of finding a pen-and-paper strategy for tackling the strong interaction — and reaping the million-dollar reward offered by the Clay Mathematics Institute for a rigorous prediction of the mass of the lightest possible collection of quarks or gluons.

    One such Hail Mary pass in the theoretical world is a tool called the holographic principle. The general strategy is to translate the problem into an abstract mathematical space where some hologram of quarks can be separated from each other, allowing an analysis in terms of Feynman diagrams.

    Simple attempts look promising, according to Tanedo, but none come close to the hard-won accuracy of lattice QCD. For now, theorists will continue to refine their imperfect tools and dream of new mathematical machinery capable of taming the fundamental but inseparable quarks.

    “That would be the holy grail,” Tanedo says. QCD is “just begging for us to figure out how that actually works.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 8:54 am on May 5, 2020 Permalink | Reply
    Tags: "Three Birds with One Particle: The Possibilities of Axions", , , Matter-antimatter asymmetry, , , QCD: Quantum Chromodynamics   

    From particlebites: “Three Birds with One Particle: The Possibilities of Axions” 

    particlebites bloc

    From particlebites

    May 1, 2020
    Amara McCune

    Title: “Axiogenesis”

    Author: Raymond T. Co and Keisuke Harigaya

    Reference: https://arxiv.org/pdf/1910.02080.pdf

    On the laundry list of problems in particle physics, a rare three-for-one solution could come in the form of a theorized light scalar particle fittingly named after a detergent: the axion. Frank Wilczek coined this term in reference to its potential to “clean up” the Standard Model once he realized its applicability to multiple unsolved mysteries. Although Axion the dish soap has been somewhat phased out of our everyday consumer life (being now primarily sold in Latin America), axion particles remain as a key component of a physicist’s toolbox. While axions get a lot of hype as a promising Dark Matter candidate, and are now being considered as a solution to matter-antimatter asymmetry, they were originally proposed as a solution for a different Standard Model puzzle: the strong CP problem.

    The strong CP problem refers to a peculiarity of quantum chromodynamics (QCD), our theory of quarks, gluons, and the strong force that mediates them: while the theory permits charge-parity (CP) symmetry violation, the ardent experimental search for CP-violating processes in QCD has so far come up empty-handed. What does this mean from a physical standpoint? Consider the neutron electric dipole moment (eDM), which roughly describes the distribution of the three quarks comprising a neutron. Naively, we might expect this orientation to be a triangular one. However, measurements of the neutron eDM, carried out by tracking changes in neutron spin precession, return a value orders of magnitude smaller than classically expected. In fact, the incredibly small value of this parameter corresponds to a neutron where the three quarks are found nearly in a line.

    1
    The classical picture of the neutron (left) looks markedly different from the picture necessitated by CP symmetry (right). The strong CP problem is essentially a question of why our mental image should look like the right picture instead of the left. Source: https://arxiv.org/pdf/1812.02669.pdf

    This would not initially appear to be a problem. In fact, in the context of CP, this makes sense: a simultaneous charge conjugation (exchanging positive charges for negative ones and vice versa) and parity inversion (flipping the sign of spatial directions) when the quark arrangement is linear results in a symmetry. Yet there are a few subtleties that point to the existence of further physics. First, this tiny value requires an adjustment of parameters within the mathematics of QCD, carefully fitting some coefficients to cancel out others in order to arrive at the desired conclusion. Second, we do observe violation of CP symmetry in particle physics processes mediated by the weak interaction, such as kaon decay, which also involves quarks.

    These arguments rest upon the idea of naturalness, a principle that has been invoked successfully several times throughout the development of particle theory as a hint toward the existence of a deeper, more underlying theory. Naturalness (in one of its forms) states that such minuscule values are only allowed if they increase the overall symmetry of the theory, something that cannot be true if weak processes exhibit CP-violation where strong processes do not. This puts the strong CP problem squarely within the realm of “fine-tuning” problems in physics; although there is no known reason for CP symmetry conservation to occur, the theory must be modified to fit this observation. We then seek one of two things: either an observation of CP-violation in QCD or a solution that sets the neutron eDM, and by extension any CP-violating phase within our theory, to zero.

    2
    This term in the QCD Lagrangian allows for CP symmetry violation. Current measurements place the value of θ at no greater than 10^-10. In Peccei-Quinn symmetry, θ is promoted to a field.
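    A rough sketch of where that 10^-10 bound comes from, combining the published experimental limit on the neutron eDM (about 1.8 × 10^-26 e·cm) with the order-of-magnitude theoretical estimate d_n ~ θ × 10^-16 e·cm; the theoretical prefactor is only known to within a factor of a few, so this is order-of-magnitude arithmetic, not a precise derivation.

```python
# Order-of-magnitude version of the bound quoted in the caption above.
# QCD estimates give a neutron electric dipole moment of roughly
#   d_n ~ theta * 1e-16  e*cm   (prefactor uncertain by a factor of a few),
# while experiments see no eDM down to about 1.8e-26 e*cm.

d_n_limit = 1.8e-26      # experimental upper limit on |d_n|, in e*cm
d_n_per_theta = 1e-16    # rough theoretical d_n for theta = 1, in e*cm

theta_bound = d_n_limit / d_n_per_theta
print(f"theta  <  ~{theta_bound:.0e}")   # ~2e-10, i.e. the 10^-10 scale quoted above
```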

    When such an expected symmetry violation is nowhere to be found, where is a theoretician to look for such a solution? The most straightforward answer is to turn to a new symmetry. This is exactly what Roberto Peccei and Helen Quinn did in 1977, birthing the Peccei-Quinn symmetry, an extension of QCD which incorporates a CP-violating phase known as the θ term. The main idea behind this theory is to promote θ to a dynamical field, rather than keeping it a constant. Since quantum fields have associated particles, this also yields the particle we dub the axion. Looking back briefly to the neutron eDM picture of the strong CP problem, this means that the angular separation should also be dynamical, and hence be relegated to the minimum energy configuration: the quarks again all in a straight line. In the language of symmetries, the U(1) Peccei-Quinn symmetry is approximately spontaneously broken, giving us a non-zero vacuum expectation value and a nearly-massless Goldstone boson: our axion.
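    A toy numerical picture of that relaxation may help: once θ is dynamical, it rolls down a cosine-shaped potential and, with some friction standing in for cosmic expansion, settles at the CP-conserving minimum. This is not the actual field equation of the Peccei-Quinn mechanism, and the mass, damping, and initial angle below are arbitrary toy-unit choices.

```python
# Toy picture of "promoting theta to a dynamical field": theta rolls down a
# potential V ~ 1 - cos(theta) with damping (a stand-in for cosmic expansion)
# and settles at the CP-conserving minimum theta = 0. Illustrative units only.

import math

theta, theta_dot = 2.5, 0.0    # arbitrary initial misalignment angle and velocity
m, gamma, dt = 1.0, 0.3, 0.01  # toy axion mass, damping, and time step

for step in range(6000):
    theta_ddot = -gamma * theta_dot - m**2 * math.sin(theta)
    theta_dot += theta_ddot * dt
    theta += theta_dot * dt
    if step % 1500 == 0:
        print(f"t = {step * dt:5.1f}   theta = {theta:+.4f}")

print(f"final theta ~ {theta:+.1e}")   # driven toward zero, as the mechanism requires
```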

    This is all great, but what does it have to do with dark matter? As it turns out, axions make for an especially intriguing dark matter candidate due to their low mass and potential to be produced in large quantities. For decades, this prowess was overshadowed by the leading WIMP candidate (weakly-interacting massive particles), whose parameter space has been slowly whittled down to the point where physicists are more seriously turning to alternatives. As there are several production-mechanisms in early universe cosmology for axions, and 100% of dark matter abundance could be explained through this generation, the axion is now stepping into the spotlight.

    This increased focus is causing some theorists to turn to further avenues of physics as possible applications for the axion. In a recent paper, Co and Harigaya examined the connection between this versatile particle and matter-antimatter asymmetry (also called baryon asymmetry). This latter term refers to the simple observation that there appears to be more matter than antimatter in our universe, since we are predominantly composed of matter, yet matter and antimatter also seem to be produced in colliders in equal proportions. In order to explain this asymmetry, without which matter and antimatter would have annihilated and we would not exist, physicists look for any mechanism to trigger an imbalance in these two quantities in the early universe. This theorized process is known as baryogenesis.

    Here’s where the axion might play a part. The θ term, which settles to zero in its possible solution to the strong CP problem, could have taken on any value from 0 to 360 degrees very early on in the universe. Analyzed through the conjectures of quantum gravity, which forbid exact global symmetries, the initial axion potential cannot be perfectly symmetric [4]. By falling from some initial value through this uneven potential, which the authors describe as a wine-bottle potential with a wiggly top, θ would cycle several times through the allowed values before settling at its minimum-energy value of zero. This causes the axion field to rotate, an asymmetry that could generate a disproportion between the amounts of matter and antimatter produced. If the field rotates in one direction, we end up with more matter than antimatter, while a rotation in the opposite direction would instead leave an excess of antimatter.

    3
    The team’s findings can be summarized in the plot above. Regions in purple, red, and above the orange lines (dependent upon a particular constant X which is proportional to weak scale quantities) signify excluded portions of the parameter space. The remaining white space shows values of the axion decay constant and mass where the currently measured amount of baryon asymmetry could be generated. Source: https://arxiv.org/pdf/1910.02080.pdf

    Introducing a third fundamental mystery into the realm of axions begets the question of whether all three problems (strong CP, dark matter, and matter-antimatter asymmetry) can be solved simultaneously with axions. And, of course, there are nuances that could make alternative solutions to the strong CP problem more favorable or other dark matter candidates more likely. Like most theorized particles, there are several formulations of axion in the works. It is then necessary to turn our attention to experiment to narrow down the possibilities for how axions could interact with other particles, determine what their mass could be, and answer the all-important question: if they exist at all. Consequently, there are a plethora of axion-focused experiments up and running, with more on the horizon, that use a variety of methods spanning several subfields of physics. While these results begin to roll in, we can continue to investigate just how many problems we might be able to solve with one adaptable, soapy particle.

    Learn More:

    A comprehensive introduction to the strong CP problem, the axion solution, and other potential solutions: https://arxiv.org/pdf/1812.02669.pdf
    Axions as a dark matter candidate: https://www.symmetrymagazine.org/article/the-other-dark-matter-candidate
    More information on matter-antimatter asymmetry and baryogenesis: https://www.quantumdiaries.org/2015/02/04/where-do-i-come-from/
    The quantum gravity conjectures that axiogenesis builds upon: https://arxiv.org/abs/1810.05338
    An overview of current axion-focused experiments: https://www.annualreviews.org/doi/full/10.1146/annurev-nucl-102014-022120

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    What is ParticleBites?
    ParticleBites is an online particle physics journal club written by graduate students and postdocs. Each post presents an interesting paper in a brief format that is accessible to undergraduate students in the physical sciences who are interested in active research.

    The papers are accessible on the arXiv preprint server. Most of our posts are based on papers from hep-ph (high energy phenomenology) and hep-ex (high energy experiment).

    Why read ParticleBites?

    Reading a technical paper from an unfamiliar subfield is intimidating. It may not be obvious how the techniques used by the researchers really work or what role the new research plays in answering the bigger questions motivating that field, not to mention the obscure jargon! For most people, it takes years for scientific papers to become meaningful.

    Our goal is to solve this problem, one paper at a time. With each brief ParticleBite, you should not only learn about one interesting piece of current work, but also get a peek at the broader picture of research in particle physics.

    Who writes ParticleBites?

    ParticleBites is written and edited by graduate students and postdocs working in high energy physics. Feel free to contact us if you’re interested in applying to write for ParticleBites.

    ParticleBites was founded in 2013 by Flip Tanedo following the Communicating Science (ComSciCon) 2013 workshop.

    2
    Flip Tanedo, UCI Chancellor’s ADVANCE postdoctoral scholar in theoretical physics and, as of July 2016, assistant professor of physics at the University of California, Riverside.

    It is now organized and directed by Flip and Julia Gonski, with ongoing guidance from Nathan Sanders.

     
  • richardmitnick 10:54 am on July 27, 2019 Permalink | Reply
    Tags: "Ask Ethan: Can We Really Get A Universe From Nothing?", , , , , Because dark energy is a property of space itself when the Universe expands the dark energy density must remain constant., , , , , Galaxies that are gravitationally bound will merge together into groups and clusters while the unbound groups and clusters will accelerate away from one another., , , Negative gravity?, QCD: Quantum Chromodynamics,   

    From Ethan Siegel: “Ask Ethan: Can We Really Get A Universe From Nothing?” 

    From Ethan Siegel
    July 27, 2019

    1
    Our entire cosmic history is theoretically well-understood in terms of the frameworks and rules that govern it. It’s only by observationally confirming and revealing various stages in our Universe’s past that must have occurred, like when the first stars and galaxies formed, and how the Universe expanded over time, that we can truly come to understand what makes up our Universe and how it expands and gravitates in a quantitative fashion. The relic signatures imprinted on our Universe from an inflationary state before the hot Big Bang give us a unique way to test our cosmic history, subject to the same fundamental limitations that all frameworks possess. (NICOLE RAGER FULLER / NATIONAL SCIENCE FOUNDATION)

    And does it require the idea of ‘negative gravity’ in order to work?

    The biggest question that we’re even capable of asking, with our present knowledge and understanding of the Universe, is where did everything we can observe come from? If it came from some sort of pre-existing state, we’ll want to know exactly what that state was like and how our Universe came from it. If it emerged out of nothingness, we’d want to know how we went from nothing to the entire Universe, and what if anything caused it. At least, that’s what our Patreon supporter Charles Buchanan wants to know, asking:

    “One concept bothers me. Perhaps you can help. I see it used in many places, but never really explained. “A universe from Nothing” and the concept of negative gravity. As I learned my Newtonian physics, you could put the zero point of the gravitational potential anywhere, only differences mattered. However Newtonian physics never deals with situations where matter is created… Can you help solidify this for me, preferably on [a] conceptual level, maybe with a little calculation detail?”

    Gravitation might seem like a straightforward force, but an incredible number of aspects are anything but intuitive. Let’s take a deeper look.

    2
    Countless scientific tests of Einstein’s general theory of relativity have been performed, subjecting the idea to some of the most stringent constraints ever obtained by humanity. Einstein’s first solution was for the weak-field limit around a single mass, like the Sun; he applied these results to our Solar System with dramatic success. We can view this orbit as Earth (or any planet) being in free-fall around the Sun, traveling in a straight-line path in its own frame of reference. All masses and all sources of energy contribute to the curvature of spacetime. (LIGO SCIENTIFIC COLLABORATION / T. PYLE / CALTECH / MIT)

    MIT /Caltech Advanced aLigo



    VIRGO Gravitational Wave interferometer, near Pisa, Italy


    Caltech/MIT Advanced aLigo Hanford, WA, USA installation


    Caltech/MIT Advanced aLigo detector installation Livingston, LA, USA

    LSC LIGO Scientific Collaboration


    Cornell SXS, the Simulating eXtreme Spacetimes (SXS) project

    Gravitational waves. Credit: MPI for Gravitational Physics/W.Benger

    Gravity is talking. Lisa will listen. Dialogos of Eide

    ESA/eLISA the future of gravitational wave research

    Localizations of gravitational-wave signals detected by LIGO beginning in 2015 (GW150914, LVT151012, GW151226, GW170104) and, more recently, by the LIGO-Virgo network (GW170814, GW170817) after Virgo came online in August 2017.


    Skymap showing how adding Virgo to LIGO helps in reducing the size of the source-likely region in the sky. (Credit: Giuseppe Greco (Virgo Urbino group)

    If you have two point masses located some distance apart in your Universe, they’ll experience an attractive force that compels them to gravitate towards one another. But this attractive force that you perceive, in the context of relativity, comes with two caveats.

    The first caveat is simple and straightforward: these two masses will experience an acceleration towards one another, but whether they wind up moving closer to one another or not depends entirely on how the space between them evolves. Unlike in Newtonian gravity, where space is a fixed quantity and only the masses within that space can evolve, everything is changeable in General Relativity. Not only do matter and energy move and accelerate due to gravitation, but the very fabric of space itself can expand, contract, or otherwise flow. All masses still move through space, but space itself is no longer stationary.

    3
    The ‘raisin bread’ model of the expanding Universe, where relative distances increase as the space (dough) expands. The farther away any two raisins are from one another, the greater the observed redshift will be by the time the light is received. The redshift-distance relation predicted by the expanding Universe is borne out in observations, and has been consistent with what’s been known going all the way back to the 1920s. (NASA / WMAP SCIENCE TEAM)

    NASA/WMAP 2001 to 2010

    The second caveat is that the two masses you’re considering, even if you’re extremely careful about accounting for what’s in your Universe, are most likely not the only forms of energy around. There are bound to be other masses in the form of normal matter, dark matter, and neutrinos. There’s the presence of radiation, from both electromagnetic and gravitational waves. There’s even dark energy: a type of energy inherent to the fabric of space itself.

    Now, here’s a scenario that might exemplify where your intuition leads you astray: what happens if these masses, for the volume they occupy, have less total energy than the average energy density of the surrounding space?

    4
    The gravitational attraction (blue) of overdense regions and the relative repulsion (red) of the underdense regions, as they act on the Milky Way. Even though gravity is always attractive, there is an average amount of attraction throughout the Universe, and regions with lower energy densities than that will experience (and cause) an effective repulsion with respect to the average. (YEHUDA HOFFMAN, DANIEL POMARÈDE, R. BRENT TULLY, AND HÉLÈNE COURTOIS, NATURE ASTRONOMY 1, 0036 (2017))

    You can imagine three different scenarios:

    1. The first mass has a below-average energy density while the second has an above-average value.
    2. The first mass has an above-average energy density while the second has a below-average value.
    3. Both the first and second masses have a below-average energy density compared to the rest of space.

    In the first two scenarios, the above-average mass will begin growing as it pulls on the matter/energy all around it, while the below-average mass will start shrinking, as it’s less able to hold onto its own mass in the face of its surroundings. These two masses will effectively repel one another; even though gravitation is always attractive, the intervening matter is preferentially attracted to the heavier-than-average mass. This causes the lower-mass object to act like it’s both repelling and being repelled by the heavier-mass object, the same way a balloon held underwater will still be attracted to Earth’s center, but will be forced away from it owing to the (buoyant) effects of the water.

    5
    The Earth’s crust is thinnest over the ocean and thickest over mountains and plateaus, as the principle of buoyancy dictates and as gravitational experiments confirm. Just as a balloon submerged in water will accelerate away from the center of the Earth, a region with below-average energy density will accelerate away from an overdense region, as average-density regions will be more preferentially attracted to the overdense region than the underdense region will. (USGS)

    So what’s going to happen if you have two regions of space with below-average densities, surrounded by regions of just average density? They’ll both shrink, giving up their remaining matter to the denser regions around them. But as far as motions go, they’ll accelerate towards one another, with exactly the same magnitude they’d accelerate at if they were both overdense regions that exceeded the average density by equivalent amounts.

    You might be wondering why it’s important to think about these concerns when talking about a Universe from nothing. After all, if your Universe is full of matter and energy, it’s pretty hard to understand how that’s relevant to making sense of the concept of something coming from nothing. But just as our intuition can lead us astray when thinking about matter and energy on the spacetime playing field of General Relativity, it’s a comparable situation when we think about nothingness.

    7
    A representation of flat, empty space with no matter, energy or curvature of any type. With the exception of small quantum fluctuations, space in an inflationary Universe becomes incredibly flat like this, except in a 3D grid rather than a 2D sheet. Space is stretched flat, and particles are rapidly driven away. (AMBER STUVER / LIVING LIGO)

    You very likely think about nothingness as a philosopher would: the complete absence of everything. Zero matter, zero energy, an absolutely zero value for all the quantum fields in the Universe, etc. You think of space that’s completely flat, with nothing around to cause its curvature anywhere.

    If you think this way, you’re not alone: there are many different ways to conceive of “nothing.” You might even be tempted to take away space, time, and the laws of physics themselves, too. The problem, if you start doing that, is that you lose your ability to predict anything at all. The type of nothingness you’re thinking about, in this context, is what we call unphysical.

    If we want to think about nothing in a physical sense, you have to keep certain things. You need spacetime and the laws of physics, for example; you cannot have a Universe without them.

    8
    A visualization of QCD illustrates how particle/antiparticle pairs pop out of the quantum vacuum for very small amounts of time as a consequence of Heisenberg uncertainty.

    The quantum vacuum is interesting because it demands that empty space itself isn’t so empty, but is filled with all the particles, antiparticles and fields in various states that are demanded by the quantum field theory that describes our Universe. Put this all together, and you find that empty space has a zero-point energy that’s actually greater than zero. (DEREK B. LEINWEBER)

    But here’s the kicker: if you have spacetime and the laws of physics, then by definition you have quantum fields permeating the Universe everywhere you go. You have a fundamental “jitter” to the energy inherent to space, due to the quantum nature of the Universe. (And the Heisenberg uncertainty principle, which is unavoidable.)

    Put these ingredients together — because you can’t have a physically sensible “nothing” without them — and you’ll find that space itself doesn’t have zero energy inherent to it, but energy with a finite, non-zero value. Just as there’s a finite zero-point energy (that’s greater than zero) for an electron bound to an atom, the same is true for space itself. Empty space, even with zero curvature, even devoid of particles and external fields, still has a finite energy density to it.
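    The textbook example behind that last analogy is the quantum harmonic oscillator, whose lowest allowed energy is ħω/2 rather than zero. A minimal sketch, with an illustrative optical-scale frequency chosen purely for the sake of having a number:

```python
# The simplest quantum system already makes the point: a harmonic oscillator's
# allowed energies are E_n = (n + 1/2) * hbar * omega, so even the lowest
# state (n = 0) carries energy hbar*omega/2 > 0. A quantum field is, roughly,
# one such oscillator per mode.

import math

hbar = 1.054571817e-34                 # reduced Planck constant, J*s
omega = 2 * math.pi * 5e14             # illustrative optical-scale angular frequency, rad/s

def energy_level(n: int) -> float:
    return (n + 0.5) * hbar * omega

print(f"ground state:  {energy_level(0):.2e} J  (nonzero)")
print(f"first excited: {energy_level(1):.2e} J")
```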

    9
    The four possible fates of the Universe with only matter, radiation, curvature and a cosmological constant allowed. The top three possibilities are for a Universe whose fate is determined by the balance of matter/radiation with spatial curvature alone; the bottom one includes dark energy. Only the bottom “fate” aligns with the evidence. (E. SIEGEL / BEYOND THE GALAXY)

    From the perspective of quantum field theory, this is conceptualized as the zero-point energy of the quantum vacuum: the lowest-energy state of empty space. In the framework of General Relativity, however, it appears in a different sense: as the value of a cosmological constant, which itself is the energy of empty space, independent of curvature or any other form of energy density.

    Although we do not know how to calculate the value of this energy density from first principles, we can calculate the effects it has on the expanding Universe. As your Universe expands, every form of energy that exists within it contributes to not only how your Universe expands, but how that expansion rate changes over time. From multiple independent lines of evidence — including the Universe’s large-scale structure, the cosmic microwave background, and distant supernovae — we have been able to determine how much energy is inherent to space itself.

    10
    Constraints on dark energy from three independent sources: supernovae, the CMB (cosmic microwave background) and BAO (which is a wiggly feature seen in the correlations of large-scale structure). Note that even without supernovae, we’d need dark energy for certain, and also that there are uncertainties and degeneracies between the amount of dark matter and dark energy that we’d need to accurately describe our Universe. (SUPERNOVA COSMOLOGY PROJECT, AMANULLAH, ET AL., AP.J. (2010))

    This form of energy is what we presently call dark energy, and it’s responsible for the observed accelerated expansion of the Universe. Although it’s been a part of our conceptions of reality for more than two decades now, we don’t fully understand its true nature. All we can say is that when we measure the expansion rate of the Universe, our observations are consistent with dark energy being a cosmological constant with a specific magnitude, and not with any of the alternatives that evolve significantly over cosmic time.

    Because dark energy causes distant galaxies to appear to recede from one another more and more quickly as time goes on — since the space between those galaxies is expanding — it’s often called negative gravity. This is not only highly informal, but incorrect. Gravity is only positive, never negative. But even positive gravity, as we saw earlier, can have effects that look very much like negative repulsion.

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M. Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7200 feet

    Timeline of the Inflationary Universe WMAP

    The Dark Energy Survey (DES) is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. DES began searching the Southern skies on August 31, 2013.

    According to Einstein’s theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up. To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called dark energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    DES is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the DES collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time is used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.

    11
    How energy density changes over time in a Universe dominated by matter (top), radiation (middle), and a cosmological constant (bottom). Note that dark energy doesn’t change in density as the Universe expands, which is why it comes to dominate the Universe at late times. (E. SIEGEL)

    If there were greater amounts of dark energy present within our spatially flat Universe, the expansion rate would be greater. But this is true for all forms of energy in a spatially flat Universe: dark energy is no exception. The only difference between dark energy and the more commonly encountered forms of energy, like matter and radiation, is that as the Universe expands, the densities of matter and radiation decrease.

    But because dark energy is a property of space itself, when the Universe expands, the dark energy density must remain constant. As time goes on, galaxies that are gravitationally bound will merge together into groups and clusters, while the unbound groups and clusters will accelerate away from one another. That’s the ultimate fate of the Universe if dark energy is real.
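    Those scaling laws are easy to put into numbers: matter density dilutes as the cube of the scale factor, radiation as the fourth power, and dark energy not at all. A minimal sketch, using the usual rounded present-day density fractions (roughly 31% matter, 69% dark energy, and a radiation fraction of order 10^-4):

```python
# How the energy densities mentioned above dilute as the Universe expands
# (scale factor a): matter ~ a**-3, radiation ~ a**-4, dark energy constant.
# Present-day fractions are the usual rounded values, in units of today's
# critical density.

rho_matter0, rho_lambda0, rho_rad0 = 0.31, 0.69, 8e-5

for a in [0.5, 1.0, 2.0, 4.0, 8.0]:
    rho_m = rho_matter0 / a**3
    rho_r = rho_rad0 / a**4
    rho_L = rho_lambda0            # property of space itself: does not dilute
    total = rho_m + rho_r + rho_L
    print(f"a = {a:4.1f}   dark-energy share of total = {rho_L / total:.2f}")

# The dark-energy share climbs toward 1 as a grows -- exactly why it comes to
# dominate the Universe at late times.
```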

    Laniakea supercluster. From Nature: “The Laniakea supercluster of galaxies,” R. Brent Tully, Hélène Courtois, Yehuda Hoffman & Daniel Pomarède, http://www.nature.com/nature/journal/v513/n7516/full/nature13674.html. The Milky Way is the red dot.

    So why do we say we have a Universe that came from nothing? Because the value of dark energy may have been much higher in the distant past: before the hot Big Bang. A Universe with a very large amount of dark energy in it will behave identically to a Universe undergoing cosmic inflation. In order for inflation to end, that energy has to get converted into matter and radiation. The evidence strongly points to that happening some 13.8 billion years ago.

    When it did, though, a small amount of dark energy remained behind. Why? Because the zero-point energy of the quantum fields in our Universe isn’t zero, but a finite, greater-than-zero value. Our intuition may not be reliable when we consider the physical concepts of nothing and negative/positive gravity, but that’s why we have science. When we do it right, we wind up with physical theories that accurately describe the Universe we measure and observe.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     