Tagged: Neutrinoless double beta decay

  • richardmitnick 11:19 am on April 6, 2022
    Tags: Neutrinoless double beta decay

    From DOE’s Lawrence Berkeley National Laboratory: “CUORE team places new limits on the bizarre behavior of neutrinos” 


    April 6, 2022
    Adam Becker
    ambecker@lbl.gov
    (510) 424-2436

    Physicists are closing in on the true nature of the neutrino — and might be closer to answering a fundamental question about our own existence.

    In a laboratory under a mountain, physicists are using crystals far colder than frozen air to study ghostly particles, hoping to learn secrets from the beginning of the universe. Researchers at the Cryogenic Underground Observatory for Rare Events (CUORE) announced this week that they had placed some of the most stringent limits yet on the strange possibility that the neutrino is its own antiparticle. Neutrinos are deeply unusual particles, so ethereal and so ubiquitous that they regularly pass through our bodies without us noticing. CUORE has spent the last three years patiently waiting to see evidence of a distinctive nuclear decay process, only possible if neutrinos and antineutrinos are the same particle. CUORE’s new data show that this decay doesn’t happen for trillions of trillions of years, if it happens at all. CUORE’s limits on the behavior of these tiny phantoms are a crucial part of the search for the next breakthrough in particle and nuclear physics – and the search for our own origins.

    “Ultimately, we are trying to understand matter creation,” said Carlo Bucci, researcher at the Laboratori Nazionali del Gran Sasso (LNGS) in Italy and the spokesperson for CUORE. “We’re looking for a process that violates a fundamental symmetry of nature,” added Roger Huang, a postdoctoral researcher at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and one of the lead authors of the new study.

    CUORE – Italian for “heart” – is among the most sensitive neutrino experiments in the world. The new results from CUORE are based on a data set ten times larger than any other high-resolution search, collected over the last three years. CUORE is operated by an international research collaboration, led by the Istituto Nazionale di Fisica Nucleare (INFN) in Italy and Berkeley Lab in the US. The CUORE detector itself is located under nearly a mile of solid rock at LNGS, a facility of the INFN. U.S. Department of Energy-supported nuclear physicists play a leading scientific and technical role in this experiment. CUORE’s new results were published today in Nature.

    Peculiar Particles

    Neutrinos are everywhere — there are trillions of neutrinos passing through your thumbnail alone as you read this sentence. They are invisible to the two strongest forces in the universe, electromagnetism and the strong nuclear force, which allows them to pass right through you, the Earth, and nearly anything else without interacting.

    Despite their vast numbers, their enigmatic nature makes them very difficult to study, and has left physicists scratching their heads ever since they were first postulated over 90 years ago. It wasn’t even known whether neutrinos had any mass at all until the late 1990s — as it turns out, they do, albeit not very much.

    One of the many remaining open questions about neutrinos is whether they are their own antiparticles. All particles have antiparticles, their own antimatter counterparts: electrons have antielectrons (positrons), quarks have antiquarks, and neutrons and protons (which make up the nuclei of atoms) have antineutrons and antiprotons. But unlike all of those particles, it’s theoretically possible for neutrinos to be their own antiparticles. Particles that are their own antiparticles were first postulated by the Italian physicist Ettore Majorana in 1937 and are known as Majorana fermions.

    If neutrinos are Majorana fermions, that could explain a deep question at the root of our own existence: why there’s so much more matter than antimatter in the universe. Neutrinos and electrons are both leptons, a kind of fundamental particle. One of the fundamental laws of nature appears to be that the number of leptons is always conserved — if a process creates a lepton, it must also create an anti-lepton to balance it out. Similarly, particles like protons and neutrons are known as baryons, and baryon number also appears to be conserved. Yet if baryon and lepton numbers were always conserved, then there would be exactly as much matter in the universe as antimatter — and in the early universe, the matter and antimatter would have met and annihilated, and we wouldn’t exist. Something must violate the exact conservation of baryons and leptons. Enter the neutrino: if neutrinos are their own antiparticles, then lepton number wouldn’t have to be conserved, and our existence becomes much less mysterious.

    “The matter-antimatter asymmetry in the universe is still unexplained,” said Huang. “If neutrinos are their own antiparticles, that could help explain it.”

    Nor is this the only question that could be answered by a Majorana neutrino. The extreme lightness of neutrinos, about a million times lighter than the electron, has long been puzzling to particle physicists. But if neutrinos are their own antiparticles, then an existing solution known as the “seesaw mechanism” could explain the lightness of neutrinos in an elegant and natural way.

    CUORE detector being installed into the cryostat. Credit: Yury Suvorov and the CUORE Collaboration.

    A Rare Device for Rare Decays

    But determining whether neutrinos are their own antiparticles is difficult, precisely because they don’t interact very often at all. Physicists’ best tool for looking for Majorana neutrinos is a hypothetical kind of radioactive decay called neutrinoless double beta decay. Beta decay is a fairly common form of decay in some atoms, turning a neutron in the atom’s nucleus into a proton, changing the chemical element of the atom and emitting an electron and an anti-neutrino in the process. Double beta decay is more rare: instead of one neutron turning into a proton, two of them do, emitting two electrons and two anti-neutrinos in the process. But if the neutrino is a Majorana fermion, then theoretically, that would allow a single “virtual” neutrino, acting as its own antiparticle, to take the place of both anti-neutrinos in double beta decay. Only the two electrons would make it out of the atomic nucleus. Neutrinoless double-beta decay has been theorized for decades, but it’s never been seen.

    The CUORE experiment has gone to great lengths to catch tellurium atoms in the act of this decay. The experiment uses nearly a thousand highly pure crystals of tellurium oxide, collectively weighing over 700 kg. This much tellurium is necessary because on average, it takes billions of times longer than the current age of the universe for a single unstable atom of tellurium to undergo ordinary double beta decay. But there are trillions of trillions of atoms of tellurium in each one of the crystals CUORE uses, meaning that ordinary double beta decay happens fairly regularly in the detector, around a few times a day in each crystal. Neutrinoless double beta decay, if it happens at all, is even more rare, and thus the CUORE team must work hard to remove as many sources of background radiation as possible. To shield the detector from cosmic rays, the entire system is located underneath the mountain of Gran Sasso, the largest mountain on the Italian peninsula. Further shielding is provided by several tons of lead. But freshly mined lead is slightly radioactive due to contamination by uranium and other elements, with that radioactivity decreasing over time — so the lead used to surround the most sensitive part of CUORE is mostly lead recovered from a sunken ancient Roman ship, nearly 2000 years old.
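    The "few times a day" figure can be sanity-checked with a rough back-of-envelope calculation. The sketch below is illustrative, not CUORE's analysis: the crystal mass (~750 g), the natural Te-130 abundance (~34%), and the two-neutrino double beta decay half-life (~8 × 10^20 years) are assumed typical values rather than figures quoted in the article.

```python
import math

# Back-of-envelope estimate of ordinary (two-neutrino) double beta
# decays per day in a single TeO2 crystal. All inputs are assumed
# illustrative values, not figures quoted in the article.
AVOGADRO = 6.022e23          # formula units per mole
CRYSTAL_MASS_G = 750.0       # assumed mass of one TeO2 crystal
TEO2_MOLAR_MASS = 159.6      # g/mol (Te ~127.6 + 2 x O ~16.0)
TE130_ABUNDANCE = 0.34       # natural isotopic abundance of Te-130
HALF_LIFE_YEARS = 8e20       # approximate 2-neutrino double beta half-life

moles = CRYSTAL_MASS_G / TEO2_MOLAR_MASS       # one Te atom per formula unit
n_te130 = moles * AVOGADRO * TE130_ABUNDANCE   # Te-130 atoms in the crystal

# Exponential decay: rate = N * ln(2) / half-life
decays_per_year = n_te130 * math.log(2) / HALF_LIFE_YEARS
decays_per_day = decays_per_year / 365.25
print(f"Te-130 atoms in one crystal: {n_te130:.2e}")
print(f"Ordinary double beta decays per day: {decays_per_day:.1f}")
```

    With these assumed inputs the estimate comes out to roughly two decays per crystal per day, consistent with the "few times a day" quoted above, and the atom count is indeed of order a trillion trillion per crystal.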

    Perhaps the most impressive piece of machinery used at CUORE is the cryostat, which keeps the detector cold. To detect neutrinoless double beta decay, the temperature of each crystal in the CUORE detector is carefully monitored with sensors capable of detecting a change in temperature as small as one ten-thousandth of a degree Celsius. Neutrinoless double beta decay has a specific energy signature and would raise the temperature of a single crystal by a well-defined and recognizable amount. But in order to maintain that sensitivity, the detector must be kept very cold — specifically, it’s kept around 10 mK, a hundredth of a degree above absolute zero. “This is the coldest cubic meter in the known universe,” said Laura Marini, a research fellow at Gran Sasso Science Institute and CUORE’s Run Coordinator. The resulting sensitivity of the detector is truly phenomenal. “When there were large earthquakes in Chile and New Zealand, we actually saw glimpses of it in our detector,” said Marini. “We can also see waves crashing on the shore of the Adriatic Sea, 60 kilometers away. That signal gets bigger in the winter, when there are storms.”
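    The size of the temperature pulse from a single decay can be estimated from the Debye law for the low-temperature heat capacity of a crystal, C = (12π⁴/5) N k_B (T/θ_D)³. The sketch below is a hedged back-of-envelope estimate: the crystal mass (~750 g), the ~2.5 MeV decay energy, and the Debye temperature of TeO2 (~232 K) are assumed values, not figures from the article.

```python
import math

# Rough estimate of how much one ~2.5 MeV decay warms a TeO2
# crystal held at 10 mK, using the Debye T^3 heat capacity law.
# The Debye temperature and crystal mass are assumed values.
K_B = 1.380649e-23           # Boltzmann constant, J/K
AVOGADRO = 6.022e23
MEV_TO_J = 1.602e-13
Q_VALUE_MEV = 2.5            # approximate energy deposited by one decay
T = 0.010                    # operating temperature, 10 mK
DEBYE_TEMP = 232.0           # assumed Debye temperature of TeO2, K
CRYSTAL_MASS_G = 750.0       # assumed mass of one crystal
TEO2_MOLAR_MASS = 159.6      # g/mol
ATOMS_PER_FORMULA_UNIT = 3   # one Te plus two O

# Debye low-temperature limit: C = (12*pi^4/5) * N * k_B * (T/theta_D)^3
n_atoms = (CRYSTAL_MASS_G / TEO2_MOLAR_MASS) * AVOGADRO * ATOMS_PER_FORMULA_UNIT
heat_capacity = (12 * math.pi**4 / 5) * n_atoms * K_B * (T / DEBYE_TEMP) ** 3

delta_t = Q_VALUE_MEV * MEV_TO_J / heat_capacity
print(f"Heat capacity of one crystal at 10 mK: {heat_capacity:.2e} J/K")
print(f"Temperature rise from one decay: {delta_t * 1e6:.0f} microkelvin")
```

    Because the heat capacity shrinks as T³, the crystal becomes an exquisitely sensitive calorimeter at 10 mK: under these assumptions a single decay warms it by on the order of a hundred microkelvin, which is why sensors resolving a ten-thousandth of a degree can see individual nuclear events.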
    A Neutrino Through The Heart

    Despite that phenomenal sensitivity, CUORE hasn’t yet seen evidence of neutrinoless double beta decay. Instead, CUORE has established that, on average, this decay happens in a single tellurium atom no more often than once every 22 trillion trillion years. “Neutrinoless double beta decay, if observed, will be the rarest process ever observed in nature, with a half-life more than a million billion times longer than the age of the universe,” said Danielle Speller, an assistant professor at Johns Hopkins University and a member of the CUORE Physics Board. “CUORE may not be sensitive enough to detect this decay even if it does occur, but it’s important to check. Sometimes physics yields surprising results, and that’s when we learn the most.” Even if CUORE doesn’t find evidence of neutrinoless double beta decay, it is paving the way for the next generation of experiments. CUORE’s successor, the CUORE Upgrade with Particle Identification (CUPID), is already in the works. CUPID will be over 10 times more sensitive than CUORE, potentially allowing it to glimpse evidence of a Majorana neutrino.
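    Speller's comparison can be checked directly from the figures in this paragraph: the limit corresponds to a half-life of at least 2.2 × 10^25 years ("22 trillion trillion"), while the universe is roughly 13.8 billion years old.

```python
# Checking the quoted comparison: the half-life limit (2.2e25 years)
# against the approximate age of the universe (~1.38e10 years).
HALF_LIFE_LIMIT_YEARS = 2.2e25
AGE_OF_UNIVERSE_YEARS = 1.38e10

ratio = HALF_LIFE_LIMIT_YEARS / AGE_OF_UNIVERSE_YEARS
print(f"Half-life limit / age of universe: {ratio:.2e}")
```

    The ratio comes out near 1.6 × 10^15, i.e. more than a million billion times the age of the universe, matching the quote.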

    But regardless of anything else, CUORE is a scientific and technological triumph — not only for its new bounds on the rate of neutrinoless double beta decay, but also for its demonstration of its cryostat technology. “It’s the largest refrigerator of its kind in the world,” said Paolo Gorla, a staff scientist at LNGS and CUORE’s Technical Coordinator. “And it’s been kept at 10 mK continuously for about three years now.” Such technology has applications well beyond fundamental particle physics. In particular, it may find use in quantum computing, where keeping large amounts of machinery cold enough, and sufficiently shielded from environmental radiation, to manipulate quantum states is one of the major engineering challenges in the field.

    Meanwhile, CUORE isn’t done yet. “We’ll be operating until 2024,” said Bucci. “I’m excited to see what we find.”

    CUORE is supported by the U.S. Department of Energy, Italy’s National Institute of Nuclear Physics (Istituto Nazionale di Fisica Nucleare, or INFN), and the National Science Foundation (NSF). CUORE collaboration members include: INFN, University of Bologna, University of Genoa, University of Milano-Bicocca, and Sapienza University in Italy; California Polytechnic State University, San Luis Obispo; Berkeley Lab; Johns Hopkins University; Lawrence Livermore National Laboratory; Massachusetts Institute of Technology; University of California, Berkeley; University of California, Los Angeles; University of South Carolina; Virginia Polytechnic Institute and State University; and Yale University in the US; Saclay Nuclear Research Center (CEA) and the Irène Joliot-Curie Laboratory (CNRS/IN2P3, Paris Saclay University) in France; and Fudan University and Shanghai Jiao Tong University in China.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    LBNL campus

    LBNL Molecular Foundry

    Bringing Science Solutions to the World

    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences, one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the University of California-Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a University of California-Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    History

    1931–1941

    The laboratory was founded on August 26, 1931, by Ernest Lawrence, as the Radiation Laboratory of the University of California, Berkeley, associated with the Physics Department. It centered physics research around his new instrument, the cyclotron, a type of particle accelerator for which he was awarded the Nobel Prize in Physics in 1939.

    LBNL 88 inch cyclotron.


    Throughout the 1930s, Lawrence pushed to create larger and larger machines for physics research, courting private philanthropists for funding. He was the first to develop a large team to build big projects to make discoveries in basic research. Eventually these machines grew too large to be held on the university grounds, and in 1940 the lab moved to its current site atop the hill above campus. The team put together during this period included two other young scientists who went on to establish large laboratories: J. Robert Oppenheimer founded DOE’s Los Alamos National Laboratory, and Robert Wilson founded Fermi National Accelerator Laboratory.

    1942–1950

    Leslie Groves visited Lawrence’s Radiation Laboratory in late 1942 as he was organizing the Manhattan Project, meeting J. Robert Oppenheimer for the first time. Oppenheimer was tasked with organizing the nuclear bomb development effort and founded today’s Los Alamos National Laboratory to help keep the work secret. At the RadLab, Lawrence and his colleagues developed the technique of electromagnetic enrichment of uranium using their experience with cyclotrons. The “calutrons” (named after the University) became the basic unit of the massive Y-12 facility in Oak Ridge, Tennessee. Lawrence’s lab helped contribute to what have been judged to be the three most valuable technology developments of the war (the atomic bomb, proximity fuse, and radar). The cyclotron, whose construction was stalled during the war, was finished in November 1946. The Manhattan Project shut down two months later.

    1951–2018

    After the war, the Radiation Laboratory became one of the first laboratories to be incorporated into the Atomic Energy Commission (AEC), now the Department of Energy. The most highly classified work remained at Los Alamos, but the RadLab remained involved. Edward Teller suggested setting up a second lab similar to Los Alamos to compete with their designs. This led to the creation of an offshoot of the RadLab (now the Lawrence Livermore National Laboratory) in 1952. Some of the RadLab’s work was transferred to the new lab, but some classified research continued at Berkeley Lab until the 1970s, when it became a laboratory dedicated only to unclassified scientific research.

    Shortly after the death of Lawrence in August 1958, the UC Radiation Laboratory (both branches) was renamed the Lawrence Radiation Laboratory. The Berkeley location became the Lawrence Berkeley Laboratory in 1971, although many continued to call it the RadLab. Gradually, another shortened form came into common usage, LBNL. Its formal name was amended to Ernest Orlando Lawrence Berkeley National Laboratory in 1995, when “National” was added to the names of all DOE labs. “Ernest Orlando” was later dropped to shorten the name. Today, the lab is commonly referred to as “Berkeley Lab”.

    The Alvarez Physics Memos are a set of informal working papers of the large group of physicists, engineers, computer programmers, and technicians led by Luis W. Alvarez from the early 1950s until his death in 1988. Over 1700 memos are available on-line, hosted by the Laboratory.

    The lab remains owned by the Department of Energy, with management from the University of California. Companies such as Intel have funded the lab’s research into computing chips.

    Science mission

    From the 1950s through the present, Berkeley Lab has maintained its status as a major international center for physics research, and has also diversified its research program into almost every realm of scientific investigation. Its mission is to solve the most pressing and profound scientific problems facing humanity, conduct basic research for a secure energy future, understand living systems to improve the environment, health, and energy supply, understand matter and energy in the universe, build and safely operate leading scientific facilities for the nation, and train the next generation of scientists and engineers.

    The Laboratory’s 20 scientific divisions are organized within six areas of research: Computing Sciences; Physical Sciences; Earth and Environmental Sciences; Biosciences; Energy Sciences; and Energy Technologies. Berkeley Lab has six main science thrusts: advancing integrated fundamental energy science; integrative biological and environmental system science; advanced computing for science impact; discovering the fundamental properties of matter and energy; accelerators for the future; and developing energy technology innovations for a sustainable future.

    Berkeley Lab operates five major National User Facilities for the DOE Office of Science:

    The Advanced Light Source (ALS) is a synchrotron light source with 41 beam lines providing ultraviolet, soft x-ray, and hard x-ray light to scientific experiments.

    LBNL/ALS

    DOE’s Lawrence Berkeley National Laboratory Advanced Light Source.
    The ALS is one of the world’s brightest sources of soft x-rays, which are used to characterize the electronic structure of matter and to reveal microscopic structures with elemental and chemical specificity. About 2,500 scientist-users carry out research at the ALS every year. Berkeley Lab is proposing an upgrade of the ALS which would increase the coherent flux of soft x-rays by two to three orders of magnitude.

    The DOE Joint Genome Institute supports genomic research in support of the DOE missions in alternative energy, global carbon cycling, and environmental management. The JGI’s partner laboratories are Berkeley Lab, DOE’s Lawrence Livermore National Laboratory, DOE’s Oak Ridge National Laboratory (ORNL), DOE’s Pacific Northwest National Laboratory (PNNL), and the HudsonAlpha Institute for Biotechnology. The JGI’s central role is the development of a diversity of large-scale experimental and computational capabilities to link sequence to biological insights relevant to energy and environmental research. Approximately 1,200 scientist-users take advantage of JGI’s capabilities for their research every year.

    The LBNL Molecular Foundry [above] is a multidisciplinary nanoscience research facility. Its seven research facilities focus on Imaging and Manipulation of Nanostructures; Nanofabrication; Theory of Nanostructured Materials; Inorganic Nanostructures; Biological Nanostructures; Organic and Macromolecular Synthesis; and Electron Microscopy. Approximately 700 scientist-users make use of these facilities in their research every year.

    The DOE’s National Energy Research Scientific Computing Center (NERSC) is the scientific computing facility that provides large-scale computing for the DOE’s unclassified research programs. Its current systems provide over 3 billion computational hours annually. NERSC supports 6,000 scientific users from universities, national laboratories, and industry.

    DOE’s NERSC National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory.

    Cray Cori II supercomputer at National Energy Research Scientific Computing Center at DOE’s Lawrence Berkeley National Laboratory, named after Gerty Cori, the first American woman to win a Nobel Prize in science.

    NERSC Hopper Cray XE6 supercomputer.

    NERSC Cray XC30 Edison supercomputer.

    NERSC GPFS for Life Sciences.

    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    NERSC PDSF computer cluster in 2003.

    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    Cray Shasta Perlmutter SC18 AMD Epyc Nvidia pre-exascale supercomputer.

    NERSC is a DOE Office of Science User Facility.

    The DOE’s Energy Sciences Network (ESnet) is a high-speed network infrastructure optimized for very large scientific data flows. ESnet provides connectivity for all major DOE sites and facilities, and the network transports roughly 35 petabytes of traffic each month.

    Berkeley Lab is the lead partner in the DOE’s Joint Bioenergy Institute (JBEI), located in Emeryville, California. Other partners are DOE’s Sandia National Laboratories, the University of California campuses of Berkeley and Davis, the Carnegie Institution for Science, and DOE’s Lawrence Livermore National Laboratory (LLNL). JBEI’s primary scientific mission is to advance the development of the next generation of biofuels – liquid fuels derived from the solar energy stored in plant biomass. JBEI is one of three new U.S. Department of Energy (DOE) Bioenergy Research Centers (BRCs).

    Berkeley Lab has a major role in two DOE Energy Innovation Hubs. The mission of the Joint Center for Artificial Photosynthesis (JCAP) is to find a cost-effective method to produce fuels using only sunlight, water, and carbon dioxide. The lead institution for JCAP is the California Institute of Technology and Berkeley Lab is the second institutional center. The mission of the Joint Center for Energy Storage Research (JCESR) is to create next-generation battery technologies that will transform transportation and the electricity grid. DOE’s Argonne National Laboratory leads JCESR and Berkeley Lab is a major partner.

    The United States Department of Energy (DOE) is a cabinet-level department of the United States Government concerned with the United States’ policies regarding energy and safety in handling nuclear material. Its responsibilities include the nation’s nuclear weapons program; nuclear reactor production for the United States Navy; energy conservation; energy-related research; radioactive waste disposal; and domestic energy production. It also directs research in genomics; the Human Genome Project originated in a DOE initiative. DOE sponsors more research in the physical sciences than any other U.S. federal agency, the majority of which is conducted through its system of National Laboratories. The agency is led by the United States Secretary of Energy, and its headquarters are located in Southwest Washington, D.C., on Independence Avenue in the James V. Forrestal Building, named for James Forrestal, as well as in Germantown, Maryland.

    Formation and consolidation

    In 1942, during World War II, the United States started the Manhattan Project, a project to develop the atomic bomb, under the eye of the U.S. Army Corps of Engineers. After the war, in 1946, the Atomic Energy Commission (AEC) was created to control the future of the project. The Atomic Energy Act of 1946 also created the framework for the first National Laboratories. Among other nuclear projects, the AEC produced fabricated uranium fuel cores at locations such as Fernald Feed Materials Production Center in Cincinnati, Ohio. In 1974, the AEC gave way to the Nuclear Regulatory Commission, which was tasked with regulating the nuclear power industry, and the Energy Research and Development Administration, which was tasked with managing the nuclear weapons, naval reactor, and energy development programs.

    The 1973 oil crisis called attention to the need to consolidate energy policy. On August 4, 1977, President Jimmy Carter signed into law The Department of Energy Organization Act of 1977 (Pub.L. 95–91, 91 Stat. 565, enacted August 4, 1977), which created the Department of Energy. The new agency, which began operations on October 1, 1977, consolidated the Federal Energy Administration; the Energy Research and Development Administration; the Federal Power Commission; and programs of various other agencies. Former Secretary of Defense James Schlesinger, who served under Presidents Nixon and Ford during the Vietnam War, was appointed as the first secretary.

    President Carter created the Department of Energy with the goal of promoting energy conservation and developing alternative sources of energy. He wanted the United States to be less dependent on foreign oil and to reduce its use of fossil fuels. With international energy’s future uncertain for America, Carter acted quickly to have the department come into action in the first year of his presidency. This was an extremely important issue of the time, as the oil crisis was causing shortages and inflation. After the Three Mile Island accident, Carter was able to intervene with the department’s help, making changes within the Nuclear Regulatory Commission to fix its management and procedures. This was possible because nuclear energy and weapons are the responsibility of the Department of Energy.

    Recent

    On March 28, 2017, a supervisor in the Office of International Climate and Clean Energy asked staff to avoid the phrases “climate change,” “emissions reduction,” or “Paris Agreement” in written memos, briefings or other written communication. A DOE spokesperson denied that phrases had been banned.

    In a May 2019 press release concerning natural gas exports from a Texas facility, the DOE used the term ‘freedom gas’ to refer to natural gas. The phrase originated from a speech made by Secretary Rick Perry in Brussels earlier that month. Washington Governor Jay Inslee dismissed the term as “a joke”.

    Facilities

    The Department of Energy operates a system of national laboratories and technical facilities for research and development, as follows:

    Ames Laboratory
    Argonne National Laboratory
    Brookhaven National Laboratory
    Fermi National Accelerator Laboratory
    Idaho National Laboratory
    Lawrence Berkeley National Laboratory
    Lawrence Livermore National Laboratory
    Los Alamos National Laboratory
    National Energy Technology Laboratory
    National Renewable Energy Laboratory
    Oak Ridge National Laboratory
    Pacific Northwest National Laboratory
    Princeton Plasma Physics Laboratory
    Sandia National Laboratories
    Savannah River National Laboratory
    SLAC National Accelerator Laboratory
    Thomas Jefferson National Accelerator Facility

    Other major DOE facilities include
    Albany Research Center
    Bannister Federal Complex
    Bettis Atomic Power Laboratory – focuses on the design and development of nuclear power for the U.S. Navy
    Kansas City Plant
    Knolls Atomic Power Laboratory – operates for Naval Reactors Program Research under the DOE (not a National Laboratory)
    National Petroleum Technology Office
    Nevada Test Site
    New Brunswick Laboratory

    The University of California-Berkeley is a public land-grant research university in Berkeley, California. Established in 1868 as the state’s first land-grant university, it was the first campus of the University of California system and a founding member of the Association of American Universities. Its 14 colleges and schools offer over 350 degree programs and enroll some 31,000 undergraduate and 12,000 graduate students. Berkeley is ranked among the world’s top universities by major educational publications.

    Berkeley hosts many leading research institutes, including the Mathematical Sciences Research Institute and the Space Sciences Laboratory. It founded and maintains close relationships with three national laboratories: DOE’s Lawrence Berkeley National Laboratory, DOE’s Lawrence Livermore National Laboratory and DOE’s Los Alamos National Laboratory, and has played a prominent role in many scientific advances, from the Manhattan Project and the discovery of 16 chemical elements to breakthroughs in computer science and genomics. Berkeley is also known for student activism and the Free Speech Movement of the 1960s.

    Berkeley alumni and faculty count among their ranks 110 Nobel laureates (34 alumni), 25 Turing Award winners (11 alumni), 14 Fields Medalists, 28 Wolf Prize winners, 103 MacArthur “Genius Grant” recipients, 30 Pulitzer Prize winners, and 19 Academy Award winners. The university has produced seven heads of state or government; five chief justices, including Chief Justice of the United States Earl Warren; 21 cabinet-level officials; 11 governors; and 25 living billionaires. It is also a leading producer of Fulbright Scholars, MacArthur Fellows, and Marshall Scholars. Berkeley alumni, widely recognized for their entrepreneurship, have founded many notable companies.

    Berkeley’s athletic teams compete in Division I of the NCAA, primarily in the Pac-12 Conference, and are collectively known as the California Golden Bears. The university’s teams have won 107 national championships, and its students and alumni have won 207 Olympic medals.

    Made possible by President Lincoln’s signing of the Morrill Act in 1862, the University of California was founded in 1868 as the state’s first land-grant university by inheriting certain assets and objectives of the private College of California and the public Agricultural, Mining, and Mechanical Arts College. Although this process is often mistaken for a merger, the Organic Act created a “completely new institution” and did not actually merge the two precursor entities into the new university. The Organic Act states that the “University shall have for its design, to provide instruction and thorough and complete education in all departments of science, literature and art, industrial and professional pursuits, and general education, and also special courses of instruction in preparation for the professions”.

    Ten faculty members and 40 students made up the fledgling university when it opened in Oakland in 1869. Frederick H. Billings, a trustee of the College of California, suggested that a new campus site north of Oakland be named in honor of Anglo-Irish philosopher George Berkeley. The university began admitting women the following year. In 1870, Henry Durant, founder of the College of California, became its first president. With the completion of North and South Halls in 1873, the university relocated to its Berkeley location with 167 male and 22 female students.

    Beginning in 1891, Phoebe Apperson Hearst made several large gifts to Berkeley, funding a number of programs and new buildings and sponsoring, in 1898, an international competition in Antwerp, Belgium, where French architect Émile Bénard submitted the winning design for a campus master plan.

    20th century

    In 1905, the University Farm was established near Sacramento, ultimately becoming the University of California-Davis. In 1919, Los Angeles State Normal School became the southern branch of the University, which ultimately became the University of California-Los Angeles. By the 1920s, the number of campus buildings had grown substantially and included twenty structures designed by architect John Galen Howard.

    In 1917, one of the nation’s first ROTC programs was established at Berkeley and its School of Military Aeronautics began training pilots, including Gen. Jimmy Doolittle. Berkeley ROTC alumni include former Secretary of Defense Robert McNamara and Army Chief of Staff Frederick C. Weyand as well as 16 other generals. In 1926, future fleet admiral Chester W. Nimitz established the first Naval ROTC unit at Berkeley.

    In the 1930s, Ernest Lawrence helped establish the Radiation Laboratory (now DOE’s Lawrence Berkeley National Laboratory) and invented the cyclotron, which earned him the Nobel Prize in Physics in 1939. Using the cyclotron, Berkeley professors and Berkeley Lab researchers went on to discover 16 chemical elements—more than any other university in the world. In particular, during World War II and following Glenn Seaborg’s then-secret discovery of plutonium, Ernest Orlando Lawrence’s Radiation Laboratory began to contract with the U.S. Army to develop the atomic bomb. Physics professor J. Robert Oppenheimer was named scientific head of the Manhattan Project in 1942. Along with the Lawrence Berkeley National Laboratory, Berkeley founded and was then a partner in managing two other labs, Los Alamos National Laboratory (1943) and Lawrence Livermore National Laboratory (1952).

    By 1942, the American Council on Education ranked Berkeley second only to Harvard University in the number of distinguished departments.

    In 1952, the University of California reorganized itself into a system of semi-autonomous campuses, with each campus given its own chancellor, and Clark Kerr became Berkeley’s first Chancellor, while Sproul remained in place as the President of the University of California.

    Berkeley gained a worldwide reputation for political activism in the 1960s. In 1964, the Free Speech Movement organized student resistance to the university’s restrictions on political activities on campus—most conspicuously, student activities related to the Civil Rights Movement. The arrest in Sproul Plaza of Jack Weinberg, a recent Berkeley alumnus and chair of Campus CORE, in October 1964, prompted a series of student-led acts of formal remonstrance and civil disobedience that ultimately gave rise to the Free Speech Movement, which prevailed and served as a precedent for student opposition to America’s involvement in the Vietnam War.

    In 1982, the Mathematical Sciences Research Institute (MSRI) was established on campus with support from the National Science Foundation and at the request of three Berkeley mathematicians — Shiing-Shen Chern, Calvin Moore and Isadore M. Singer. The institute is now widely regarded as a leading center for collaborative mathematical research, drawing thousands of visiting researchers from around the world each year.

    21st century

    In the current century, Berkeley has become less politically active and more focused on entrepreneurship and fundraising, especially for STEM disciplines.

    Modern Berkeley students are less politically radical, with a greater percentage of moderates and conservatives than in the 1960s and 70s. Democrats outnumber Republicans on the faculty by a ratio of 9:1. On the whole, Democrats outnumber Republicans on American university campuses by a ratio of 10:1.

    In 2007, the Energy Biosciences Institute was established with funding from BP and Stanley Hall, a research facility and headquarters for the California Institute for Quantitative Biosciences, opened. The next few years saw the dedication of the Center for Biomedical and Health Sciences, funded by a lead gift from billionaire Li Ka-shing; the opening of Sutardja Dai Hall, home of the Center for Information Technology Research in the Interest of Society; and the unveiling of Blum Hall, housing the Blum Center for Developing Economies. Supported by a grant from alumnus James Simons, the Simons Institute for the Theory of Computing was established in 2012. In 2014, Berkeley and its sister campus, University of California-San Francisco, established the Innovative Genomics Institute, and, in 2020, an anonymous donor pledged $252 million to help fund a new center for computing and data science.

    Since 2000, Berkeley alumni and faculty have received 40 Nobel Prizes, behind only Harvard and Massachusetts Institute of Technology among US universities; five Turing Awards, behind only MIT and Stanford; and five Fields Medals, second only to Princeton University. According to PitchBook, Berkeley ranks second, just behind Stanford University, in producing VC-backed entrepreneurs.

    UC Berkeley Seal

    The University of California

    The University of California is a public land-grant research university system in the U.S. state of California. The system is composed of the campuses at Berkeley, Davis, Irvine, Los Angeles, Merced, Riverside, San Diego, San Francisco, Santa Barbara, and Santa Cruz, along with numerous research centers and academic abroad centers. The system is the state’s land-grant university.

    The University of California was founded on March 23, 1868, and operated in Oakland before moving to Berkeley in 1873. Over time, several branch locations and satellite programs were established. In March 1951, the University of California began to reorganize itself into something distinct from its campus in Berkeley, with University of California President Robert Gordon Sproul staying in place as chief executive of the University of California system, while Clark Kerr became the first chancellor of The University of California-Berkeley and Raymond B. Allen became the first chancellor of The University of California-Los Angeles. However, the 1951 reorganization was stalled by resistance from Sproul and his allies, and it was not until Kerr succeeded Sproul as University of California President that the University of California was able to evolve into a university system from 1957 to 1960. At that time, chancellors were appointed for additional campuses and each was granted a greater degree of autonomy.

    The University of California currently has 10 campuses, a combined student body of 285,862 students, 24,400 faculty members, 143,200 staff members and over 2.0 million living alumni. Its newest campus in Merced opened in fall 2005. Nine campuses enroll both undergraduate and graduate students; one campus, The University of California-San Francisco, enrolls only graduate and professional students in the medical and health sciences. In addition, the University of California Hastings College of the Law, located in San Francisco, is legally affiliated with University of California, but other than sharing its name is entirely autonomous from the rest of the system. Under the California Master Plan for Higher Education, the University of California is a part of the state’s three-system public higher education plan, which also includes the California State University system and the California Community Colleges system. University of California is governed by a Board of Regents whose autonomy from the rest of the state government is protected by the state constitution. The University of California also manages or co-manages three national laboratories for the U.S. Department of Energy: DOE’s Lawrence Berkeley National Laboratory, DOE’s Lawrence Livermore National Laboratory, and DOE’s Los Alamos National Laboratory.

    Collectively, the colleges, institutions, and alumni of the University of California make it the most comprehensive and advanced post-secondary educational system in the world, responsible for nearly $50 billion per year of economic impact. Major publications generally rank most University of California campuses as being among the best universities in the world. Eight of the campuses—Berkeley, Davis, Irvine, Los Angeles, Santa Barbara, San Diego, Santa Cruz, and Riverside—are considered Public Ivies, giving California more universities with that title than any other state. University of California campuses have large numbers of distinguished faculty in almost every academic discipline, with University of California faculty and researchers having won 71 Nobel Prizes as of 2021.

    In 1849, the state of California ratified its first constitution, which contained the express objective of creating a complete educational system including a state university. Taking advantage of the Morrill Land-Grant Acts, the California State Legislature established an Agricultural, Mining, and Mechanical Arts College in 1866. However, it existed only on paper, as a placeholder to secure federal land-grant funds.

    Meanwhile, Congregational minister Henry Durant, an alumnus of Yale University, had established the private Contra Costa Academy, on June 20, 1853, in Oakland, California. The initial site was bounded by Twelfth and Fourteenth Streets and Harrison and Franklin Streets in downtown Oakland (and is marked today by State Historical Plaque No. 45 at the northeast corner of Thirteenth and Franklin). In turn, the academy’s trustees were granted a charter in 1855 for a College of California, though the college continued to operate as a college preparatory school until it added college-level courses in 1860. The college’s trustees, educators, and supporters believed in the importance of a liberal arts education (especially the study of the Greek and Roman classics), but ran into a lack of interest in liberal arts colleges on the American frontier (as a true college, the college was graduating only three or four students per year).

    In November 1857, the college’s trustees began to acquire various parcels of land facing the Golden Gate in what is now Berkeley for a future planned campus outside of Oakland. But first, they needed to secure the college’s water rights by buying a large farm to the east. In 1864, they organized the College Homestead Association, which borrowed $35,000 to purchase the land, plus another $33,000 to purchase 160 acres (650,000 m²) of land to the south of the future campus. The Association subdivided the latter parcel and started selling lots with the hope it could raise enough money to repay its lenders and also create a new college town. But sales of new homesteads fell short.

    Governor Frederick Low favored the establishment of a state university based upon The University of Michigan plan, and thus in one sense may be regarded as the founder of the University of California. At the College of California’s 1867 commencement exercises, where Low was present, Benjamin Silliman Jr. criticized Californians for creating a state polytechnic school instead of a real university. That same day, Low reportedly first suggested a merger of the already-functional College of California (which had land, buildings, faculty, and students, but not enough money) with the nonfunctional state college (which had money and nothing else), and went on to participate in the ensuing negotiations. On October 9, 1867, the college’s trustees reluctantly agreed to join forces with the state college to their mutual advantage, but under one condition—that there not be simply an “Agricultural, Mining, and Mechanical Arts College”, but a complete university, within which the assets of the College of California would be used to create a College of Letters (now known as the College of Letters and Science). Accordingly, the Organic Act, establishing the University of California, was introduced as a bill by Assemblyman John W. Dwinelle on March 5, 1868, and after it was duly passed by both houses of the state legislature, it was signed into state law by Governor Henry H. Haight (Low’s successor) on March 23, 1868. However, as legally constituted, the new university was not an actual merger of the two colleges, but was an entirely new institution which merely inherited certain objectives and assets from each of them. The University of California’s second president, Daniel Coit Gilman, opened its new campus in Berkeley in September 1873.

    Section 8 of the Organic Act authorized the Board of Regents to affiliate the University of California with independent self-sustaining professional colleges. “Affiliation” meant University of California and its affiliates would “share the risk in launching new endeavors in education.” The affiliates shared the prestige of the state university’s brand, and University of California agreed to award degrees in its own name to their graduates on the recommendation of their respective faculties, but the affiliates were otherwise managed independently by their own boards of trustees, charged their own tuition and fees, and maintained their own budgets separate from the University of California budget. It was through the process of affiliation that University of California was able to claim it had medical and law schools in San Francisco within a decade of its founding.

    In 1879, California adopted its second and current constitution, which included unusually strong language to ensure University of California’s independence from the rest of the state government. This had lasting consequences for the Hastings College of the Law, which had been separately chartered and affiliated in 1878 by an act of the state legislature at the behest of founder Serranus Clinton Hastings. After a falling out with his own handpicked board of directors, the founder persuaded the state legislature in 1883 and 1885 to pass new laws to place his law school under the direct control of the Board of Regents. In 1886, the Supreme Court of California declared those newer acts to be unconstitutional because the clause protecting University of California’s independence in the 1879 state constitution had stripped the state legislature of the ability to amend the 1878 act. To this day, the Hastings College of the Law remains an affiliate of University of California, maintains its own board of directors, and is not governed by the Regents.

    In contrast, Toland Medical College (founded in 1864 and affiliated in 1873) and later, the dental, pharmacy, and nursing schools in SF were affiliated with University of California through written agreements, and not statutes invested with constitutional importance by court decisions. In the early 20th century, the Affiliated Colleges (as they came to be called) began to agree to submit to the Regents’ governance during the term of President Benjamin Ide Wheeler, as the Board of Regents had come to recognize the problems inherent in the existence of independent entities that shared the University of California brand but over which University of California had no real control. While Hastings remained independent, the Affiliated Colleges were able to increasingly coordinate their operations with one another under the supervision of the University of California President and Regents, and evolved into the health sciences campus known today as the University of California-San Francisco.

    In August 1882, the California State Normal School (whose original normal school in San Jose is now San Jose State University) opened a second school in Los Angeles to train teachers for the growing population of Southern California. In 1887, the Los Angeles school was granted its own board of trustees independent of the San Jose school, and in 1919, the state legislature transferred it to University of California control and renamed it the Southern Branch of the University of California. In 1927, it became The University of California-Los Angeles; the “at” would be replaced with a comma in 1958.

    Los Angeles surpassed San Francisco in the 1920 census to become the most populous metropolis in California. Because Los Angeles had become the state government’s single largest source of both tax revenue and votes, its residents felt entitled to demand more prestige and autonomy for their campus. Their efforts bore fruit in March 1951, when UCLA became the first University of California site outside of Berkeley to achieve de jure coequal status with the Berkeley campus. That month, the Regents approved a reorganization plan under which both the Berkeley and Los Angeles campuses would be supervised by chancellors reporting to the University of California President. However, the 1951 plan was severely flawed; it was overly vague about how the chancellors were to become the “executive heads” of their campuses. Due to stubborn resistance from President Sproul and several vice presidents and deans—who simply carried on as before—the chancellors ended up as glorified provosts with limited control over academic affairs and long-range planning while the President and the Regents retained de facto control over everything else.

    Upon becoming president in October 1957, Clark Kerr supervised University of California’s rapid transformation into a true public university system through a series of proposals adopted unanimously by the Regents from 1957 to 1960. Kerr’s reforms included expressly granting all campus chancellors the full range of executive powers, privileges, and responsibilities which Sproul had denied to Kerr himself, as well as the radical decentralization of a tightly knit bureaucracy in which all lines of authority had always run directly to the President at Berkeley or to the Regents themselves. In 1965, UCLA Chancellor Franklin D. Murphy tried to push this to what he saw as its logical conclusion: he advocated for authorizing all chancellors to report directly to the Board of Regents, thereby rendering the University of California President redundant. Murphy wanted to transform University of California from one federated university into a confederation of independent universities, similar to the situation in Kansas (from where he was recruited). Murphy was unable to develop any support for his proposal, and Kerr quickly put down what he thought of as “Murphy’s rebellion”; thus Kerr’s vision of University of California as a university system prevailed: “one university with pluralistic decision-making”.

    During the 20th century, University of California acquired additional satellite locations which, like Los Angeles, were all subordinate to administrators at the Berkeley campus. California farmers lobbied for University of California to perform applied research responsive to their immediate needs; in 1905, the Legislature established a “University Farm School” at Davis and in 1907 a “Citrus Experiment Station” at Riverside as adjuncts to the College of Agriculture at Berkeley. In 1912, University of California acquired a private oceanography laboratory in San Diego, which had been founded nine years earlier by local business promoters working with a Berkeley professor. In 1944, University of California acquired Santa Barbara State College from the California State Colleges, the descendants of the State Normal Schools. In 1958, the Regents began promoting these locations to general campuses, thereby creating The University of California-Santa Barbara (1958), The University of California-Davis (1959), The University of California-Riverside (1959), The University of California-San Diego (1960), and The University of California-San Francisco (1964). Each campus was also granted the right to have its own chancellor upon promotion. In response to California’s continued population growth, University of California opened two additional general campuses in 1965, with The University of California-Irvine opening in Irvine and The University of California-Santa Cruz opening in Santa Cruz. The youngest campus, The University of California-Merced opened in fall 2005 to serve the San Joaquin Valley.

    After losing campuses in Los Angeles and Santa Barbara to the University of California system, supporters of the California State College system arranged for the state constitution to be amended in 1946 to prevent similar losses from happening again in the future.

    The California Master Plan for Higher Education of 1960 established that University of California must admit undergraduates from the top 12.5% (one-eighth) of graduating high school seniors in California. Prior to the promulgation of the Master Plan, University of California was to admit undergraduates from the top 15%. University of California does not currently adhere to all tenets of the original Master Plan, such as the directives that no campus was to exceed total enrollment of 27,500 students (in order to ensure quality) and that public higher education should be tuition-free for California residents. Five campuses, Berkeley, Davis, Irvine, Los Angeles, and San Diego each have current total enrollment at over 30,000.

    After the state electorate severely limited long-term property tax revenue by enacting Proposition 13 in 1978, University of California was forced to make up for the resulting collapse in state financial support by imposing a variety of fees which were tuition in all but name. On November 18, 2010, the Regents finally gave up on the longstanding legal fiction that University of California does not charge tuition by renaming the Educational Fee to “Tuition.” As part of its search for funds during the 2000s and 2010s, University of California quietly began to admit higher percentages of highly accomplished (and more lucrative) students from other states and countries, but was forced to reverse course in 2015 in response to the inevitable public outcry and start admitting more California residents.

    As of 2019, University of California controls over 12,658 active patents. University of California researchers and faculty were responsible for 1,825 new inventions that same year. On average, University of California researchers create five new inventions per day.

    Seven of University of California’s ten campuses (UC Berkeley, UC Davis, UC Irvine, UCLA, UC San Diego, UC Santa Barbara, and UC Santa Cruz) are members of the Association of American Universities, an alliance of elite American research universities founded in 1900 at University of California’s suggestion. Collectively, the system counts among its faculty (as of 2002):

    389 members of the Academy of Arts and Sciences
    5 Fields Medal recipients
    19 Fulbright Scholars
    25 MacArthur Fellows
    254 members of the National Academy of Sciences
    91 members of the National Academy of Engineering
    13 National Medal of Science Laureates
    61 Nobel laureates
    106 members of the Institute of Medicine

    Davis, Los Angeles, Riverside, and Santa Barbara all followed Berkeley’s example by aggregating the majority of arts, humanities, and science departments into a relatively large College of Letters and Science. Therefore, at Berkeley, Davis, Los Angeles, and Santa Barbara, their respective College of Letters and Science is by far the single largest academic unit on each campus. The College of Letters and Science at Los Angeles is the largest academic unit in the entire University of California system.

    Finally, Irvine is organized into 13 schools and San Francisco is organized into four schools, all of which are relatively narrow in scope.

    In 2006 the Scholarly Publishing and Academic Resources Coalition awarded the University of California the SPARC Innovator Award for its “extraordinarily effective institution-wide vision and efforts to move scholarly communication forward”, including the 1997 founding (under then University of California President Richard C. Atkinson) of the California Digital Library (CDL) and its 2002 launching of CDL’s eScholarship, an institutional repository. The award also specifically cited the widely influential 2005 academic journal publishing reform efforts of University of California faculty and librarians in “altering the marketplace” by publicly negotiating contracts with publishers, as well as their 2006 proposal to amend University of California’s copyright policy to allow open access to University of California faculty research. On July 24, 2013, the University of California Academic Senate adopted an Open Access Policy, mandating that all University of California faculty-produced research with a publication agreement signed after that date be first deposited in University of California’s eScholarship open access repository.

    University of California system-wide research on the SAT exam found that, after controlling for family income and parental education, the so-called achievement tests known as the SAT II were 10 times more predictive of college aptitude than the SAT I.

    All University of California campuses except Hastings College of the Law are governed by the Regents of the University of California as required by the Constitution of the State of California. Eighteen regents are appointed by the governor for 12-year terms. One member is a student appointed for a one-year term. There are also seven ex officio members—the governor, lieutenant governor, speaker of the State Assembly, State Superintendent of Public Instruction, president and vice president of the alumni associations of University of California, and the University of California president. The Academic Senate, made up of faculty members, is empowered by the regents to set academic policies. In addition, the system-wide faculty chair and vice-chair sit on the Board of Regents as non-voting members.

    Originally, the president was the chief executive of the first campus, Berkeley. In turn, other University of California locations (with the exception of Hastings College of the Law) were treated as off-site departments of the Berkeley campus, and were headed by provosts who were subordinate to the president. In March 1951, the regents reorganized the university’s governing structure. Starting with the 1952–53 academic year, day-to-day “chief executive officer” functions for the Berkeley and Los Angeles campuses were transferred to chancellors who were vested with a high degree of autonomy, and reported as equals to University of California’s president. As noted above, the regents promoted five additional University of California locations to campuses and allowed them to have chancellors of their own in a series of decisions from 1958 to 1964, and the three campuses added since then have also been run by chancellors. In turn, all chancellors (again, with the exception of Hastings) report as equals to the University of California President. Today, the University of California Office of the President (UCOP) and the Office of the Secretary and Chief of Staff to the Regents of the University of California share an office building in downtown Oakland that serves as the University of California system’s headquarters.

    Kerr’s vision for University of California governance was “one university with pluralistic decision-making.” In other words, the internal delegation of operational authority to chancellors at the campus level and allowing nine other campuses to become separate centers of academic life independent of Berkeley did not change the fact that all campuses remain part of one legal entity. As a 1968 University of California centennial coffee table book explained: “Yet for all its campuses, colleges, schools, institutes, and research stations, it remains one University, under one Board of Regents and one president—the University of California.” University of California continues to take a “united approach” as one university in matters in which it inures to University of California’s advantage to do so, such as when negotiating with the legislature and governor in Sacramento. University of California continues to manage certain matters at the system wide level in order to maintain common standards across all campuses, such as student admissions, appointment and promotion of faculty, and approval of academic programs.

    The State of California currently (2021–2022) spends $3.467 billion on the University of California system, out of total University of California operating revenues of $41.6 billion. The “University of California Budget for Current Operations” lists the medical centers as the largest revenue source, contributing 39% of the budget, followed by the federal government at 11%, Core Funds (State General Funds, University of California General Funds, student tuition) at 21%, private support (gifts, grants, endowments) at 7%, and Sales and Services at 21%. In 1980, the state funded 86.8% of the University of California budget. While state funding has somewhat recovered, as of 2019 state support still lags behind even recent historic levels (e.g. 2001) when adjusted for inflation.

    According to the California Public Policy Institute, California spends 12% of its General Fund on higher education, but that percentage is divided between the University of California, California State University and California Community Colleges. Over the past forty years, state funding of higher education has dropped from 18% to 12%, resulting in a drop in University of California’s per student funding from $23,000 in 2016 to a current $8,000 per year per student.

    In May 2004, University of California President Robert C. Dynes and CSU Chancellor Charles B. Reed struck a private deal, called the “Higher Education Compact”, with Governor Schwarzenegger. They agreed to slash spending by about a billion dollars (about a third of the university’s core budget for academic operations) in exchange for a funding formula lasting until 2011. The agreement called for modest annual increases in state funds (but not enough to replace the loss in state funds Dynes and Schwarzenegger agreed to), private fundraising to help pay for basic programs, and large student fee hikes, especially for graduate and professional students. A detailed analysis of the Compact by the Academic Senate “Futures Report” indicated that, despite the large fee increases, the university core budget did not recover to 2000 levels. Undergraduate student fees rose 90% from 2003 to 2007. In 2011, for the first time in the University of California’s history, student fees exceeded contributions from the State of California.

    The First District Court of Appeal in San Francisco ruled in 2007 that the University of California owed nearly $40 million in refunds to about 40,000 students who were promised that their tuition fees would remain steady, but were hit with increases when the state ran short of money in 2003.

    In September 2019, the University of California announced it will divest its $83 billion in endowment and pension funds from the fossil fuel industry, citing ‘financial risk’.

    At present, the University of California system officially describes itself as a “ten campus” system consisting of the campuses listed below.

    Berkeley
    Davis
    Irvine
    Los Angeles
    Merced
    Riverside
    San Diego
    San Francisco
    Santa Barbara
    Santa Cruz

    These campuses are under the direct control of the Regents and President. Only these ten campuses are listed on the official University of California letterhead.

    Although it shares the name and public status of the University of California system, the Hastings College of the Law is not controlled by the Regents or President; it has a separate board of directors and must seek funding directly from the Legislature. However, under the California Education Code, Hastings degrees are awarded in the name of the Regents and bear the signature of the University of California president. Furthermore, Education Code section 92201 states that Hastings “is affiliated with the University of California, and is the law department thereof”.

     
  • richardmitnick 11:19 am on December 28, 2021 Permalink | Reply
    Tags: , A Majorana particle is one that is indistinguishable from its antimatter partner. This sets it apart from all other particles., , , It is quantum mechanics' uncertainty principle at work that makes this flavor change possible., Neutrino research places detectors in underground caverns; at the South Pole; in the ocean; and even in a van for drive-by neutrino monitoring for nuclear safeguard applications., Neutrinoless double beta decay, , Of all the known fundamental particles that have mass neutrinos are the most abundant—only the massless photon- which we see as light is more abundant., Oscillation: neutrinos co-exist in a mixture of “flavors.” While they must start out as a particular flavor upon formation they can evolve into a mixture of other flavors: tau; electron., , Some experiments are only satisfied if we find no neutrinos- as in the case of neutrinoless double-beta decay searches., The Majorana Demonstrator (Majorana) project, , We look for neutrinos from nuclear reactors; particle accelerators; the earth; our atmosphere; the sun; from supernovae.

    From The Sanford Underground Research Facility-SURF (US): “The neutrino puzzle” 

    SURF-Sanford Underground Research Facility, Lead, South Dakota, USA.

    From The Sanford Underground Research Facility-SURF (US)

    Homestake Mining, Lead, South Dakota, USA.
    Homestake Mining Company

    August 30, 2021 [Found in a year-end round up.]
    Constance Walter

    Vincent Guiseppe in a clean suit in the Majorana Demonstrator cleanroom on the 4850 Level of SURF. Behind him, the Majorana Demonstrator shielding is opened to reveal the copper core and cryostat module, which houses the inner detector components. Photo by Nick Hubbard.

    Imagine trying to put together a jigsaw puzzle that has no picture for reference, is missing several pieces and, of the pieces you do have, some don’t quite fit together.

    Welcome to the life of a neutrino researcher.

    Vincent Guiseppe began his neutrino journey 15 years ago as a post-doc at DOE’s Los Alamos National Laboratory (US). As a graduate student he had worked with germanium detectors and studied radon, and he followed the scientific community’s progress as the Solar Neutrino Problem was solved. The so-called Solar Neutrino Problem arose when Dr. Ray Davis Jr., who operated a solar neutrino experiment on the 4850 Level of the Homestake Gold Mine, detected only one-third of the neutrinos that theory predicted. Nearly 30 years after Davis began his search, the problem was solved with the discovery of neutrino oscillation.

    “I began to understand that neutrinos had much more in store for us. That led me to move to neutrino physics and set me up to transition to The Majorana Demonstrator (Majorana) project,” said Guiseppe, who is now a co-spokesperson for Majorana, located nearly a mile underground at SURF, and a senior research staff member at DOE’s Oak Ridge National Laboratory (US).

    Majorana uses germanium crystals in a search for the theorized Majorana particle—a neutrino that is believed to be its own antiparticle. Its discovery could help unravel mysteries about the origins of the universe and would add yet another piece to this baffling neutrino puzzle.

    We caught up with Guiseppe recently to talk about neutrinos—what scientists know (and don’t know), why neutrinos behave so strangely and why scientists keep searching for this ghost-like particle.

    SURF: What are neutrinos?

    Guiseppe: Let’s start with what we know. Of all the known fundamental particles that have mass, neutrinos are the most abundant—only the massless photon, which we see as light, is more abundant. We know their mass is quite small, but not zero—much lighter than their counterparts in the Standard Model of Physics—and we know there are three types and that they can change flavors. They also rarely interact with matter, which makes them difficult to study.

    Standard Model of Particle Physics, Quantum Diaries.

    All of these data points are pieces of that neutrino puzzle. But every piece is important if we want to complete the picture.

    SURF: Why should we care about the neutrino?

    Guiseppe: We care because they are so abundant. It’s almost embarrassing to have something that is so prevalent all around us and to not fully understand it. Think of it this way: You see a forest and the most abundant thing in that forest is a tree. But that’s all you know. You don’t know anything about how a tree operates. You don’t know how it grows, you don’t know why it’s green, you don’t know why it’s alive. It would be embarrassing to not know that. But that’s not the case with trees. Something so abundant as what we see in nature—animal species, trees, plants—we understand them completely, there’s nothing surprising. So, the fact that they are so abundant, and yet we know so little about them, brings a sort of duty to understand them.

    SURF: What intrigues you most about neutrino research?

    Guiseppe: Most? I would say the breadth of research and the big questions that can be answered by a single particle. While similar claims could be made about other particle research, the experimental approach is wide open. We look for neutrinos from nuclear reactors, particle accelerators, the earth, our atmosphere, the sun, from supernovae, and some experiments are only satisfied if we find no neutrinos, as in the case of neutrinoless double-beta decay searches. Neutrino research places detectors in underground caverns; at the South Pole; in the ocean; and even in a van for drive-by neutrino monitoring for nuclear safeguard applications. It’s a diverse field with big and unique questions.

    SURF: What is oscillation?

    Guiseppe: Oscillation is the idea that neutrinos can co-exist in a mixture of types or “flavors.” While they must start out as a particular flavor upon formation, they can evolve into a mixture of other flavors while traveling before falling into one flavor upon interaction with matter or detection. Hence, they are observed to oscillate between flavors from formation to detection.
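    The flavor mixing Guiseppe describes is quantitative: in the textbook two-flavor vacuum case, the probability of detecting a different flavor oscillates with distance. The sketch below uses the standard formula; the mixing angle and mass splitting are illustrative placeholders, not parameters taken from the article:

    ```python
    import math

    # Two-flavor vacuum oscillation:
    #   P(a -> b) = sin^2(2*theta) * sin^2(1.267 * dm2 * L / E)
    # with dm2 in eV^2, L in km, E in GeV (the 1.267 absorbs hbar and c).
    def oscillation_probability(theta, dm2_ev2, length_km, energy_gev):
        """Probability that a neutrino born as flavor a is seen as flavor b."""
        amplitude = math.sin(2 * theta) ** 2
        phase = 1.267 * dm2_ev2 * length_km / energy_gev
        return amplitude * math.sin(phase) ** 2

    # At the moment of formation (L = 0) the neutrino is purely its original
    # flavor; farther along, it exists in a mixture of flavors.
    print(oscillation_probability(0.6, 2.5e-3, 0.0, 1.0))    # 0.0
    print(oscillation_probability(0.6, 2.5e-3, 500.0, 1.0))  # nonzero mixture
    ```

    This is why the article says a neutrino "must start out as a particular flavor" but evolves into a mixture while traveling: the mixture grows and shrinks periodically with L/E.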

    SURF: It’s a fundamental idea that a thing can’t become another thing unless acted upon by an outside force or material. How can something spontaneously become something it wasn’t a split second ago? And why are we OK with that?

    Guiseppe: Are people really okay with the idea of neutrinos changing flavors? I think we are, inasmuch as we are really okay with the implications of quantum mechanics? (As an aside, this reminds me of a question I asked my undergraduate quantum mechanics professor. I felt I was doing fine in the class and could work the problems but was worried that I really didn’t understand quantum mechanics. He responded with a slight grin: “Oh, no one really ‘understands’ quantum mechanics.”).

    It is quantum mechanics at work that makes this flavor change possible. Since neutrinos come in three separate flavors and three separate masses (and more importantly, each flavor does not come as a definite mass), they can exist in a quantum mechanical mixture of flavors. The root of your concern stems from the idea of identity—what does it mean to change this identity?

    The comforting aspect is that neutrinos are not found to change speed, direction, mass, shape, or anything else that would require an outside force or energy in the usual sense. By changing flavor, the neutrino is only changing its personality and the rules it follows at a given time.

    While this bit of personification is probably not comforting, it is only how the neutrino must interact with other particles that changes over time. You could think of the neutrino as being formed as one type, but then realizing it is not forced into that identity. It then remains in an indecisive state while being swayed to one type over another before finally making a decision upon detection or other interaction. In that sense, it is not a spontaneous change, but the result of a well thought-out (or predictable) decision process.

    SURF: What is a Majorana Particle and why is it important?

    Guiseppe: A Majorana particle is one that is indistinguishable from its antimatter partner. This sets it apart from all other particles. With the Majorana Demonstrator, we are looking for this particle in a process called neutrinoless double-beta decay.

    Neutrinoless double-beta decay is a nuclear process whereby two neutrons transform into two protons and two electrons (a.k.a. beta particles), but without the emission of two anti-neutrinos. This is in contrast to the two-neutrino double-beta decay process, in which the two anti-neutrinos are emitted; that process has been observed.

    SURF: Why neutrinoless double-beta decay?

    Guiseppe: Neutrinoless double-beta decay experiments offer the right mix of simplicity, experimental challenges, and the potential for a fascinating discovery. The signature for neutrinoless double-beta decay is simple: a measurement made at a specific energy and at a fixed point in the detector. But it’s a rare occurrence that is easily obscured, so reducing all backgrounds (interferences) that can partially mimic this signature and foil the measurement is critical. Searching for this decay requires innovative detectors, as well as the ability to control the ubiquitous radiation found in everything around us.
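    The "measurement made at a specific energy" can be illustrated with a toy model. In ordinary two-neutrino double-beta decay the anti-neutrinos carry off a random share of the decay energy, so the summed electron energy forms a continuum below the Q-value; in the neutrinoless mode the electrons take all of it, producing a sharp peak. Only the 76Ge Q-value of 2039 keV comes from the source; the uniform energy sharing below is a deliberate simplification:

    ```python
    import random

    Q_KEV = 2039.0  # Q-value of 76Ge double-beta decay

    def summed_electron_energy(neutrinoless: bool) -> float:
        """Toy model of the summed energy of the two emitted electrons."""
        if neutrinoless:
            return Q_KEV  # no neutrinos emitted: electrons get everything
        # Two-neutrino mode: the neutrinos take a random fraction (toy model).
        return Q_KEV * random.random()

    peak = [summed_electron_energy(True) for _ in range(1000)]
    continuum = [summed_electron_energy(False) for _ in range(1000)]
    print(min(peak), max(peak))    # every 0vbb event lands exactly at 2039 keV
    print(max(continuum) < Q_KEV)  # 2vbb events spread out below the Q-value
    ```

    The experimental task is then exactly what Guiseppe describes: keep any background from piling up in the narrow window around that peak.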

    The Majorana Demonstrator’s cryostat module inside the detector shielding. Photo by Nick Hubbard.

    SURF: After so many years, how do you stay enthusiastic about neutrino research?

    Guiseppe: Its book isn’t finished yet. We have more to learn and more questions to answer—we only need the means to do so. I stay enthused due to the likelihood of some new surprises (or comforting discoveries) that await. Along the way, we can continue to make advances in detector technology and develop new (or cleaner) materials, which inevitably lead to applications outside of physics research. In the end, chasing down neutrino properties and the secrets they may hold remains exciting due to clever ideas that keep the next discovery within reach.

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    About us: The Sanford Underground Research Facility-SURF (US) in Lead, South Dakota advances our understanding of the universe by providing laboratory space deep underground, where sensitive physics experiments can be shielded from cosmic radiation. Researchers at the Sanford Lab explore some of the most challenging questions facing 21st century physics, such as the origin of matter, the nature of dark matter and the properties of neutrinos. The facility also hosts experiments in other disciplines—including geology, biology and engineering.

    The Sanford Lab is located at the former Homestake gold mine, which was a physics landmark long before being converted into a dedicated science facility. Nuclear chemist Ray Davis earned a share of the Nobel Prize for Physics in 2002 for a solar neutrino experiment he installed 4,850 feet underground in the mine.

    Homestake closed in 2003, but the company donated the property to South Dakota in 2006 for use as an underground laboratory. That same year, philanthropist T. Denny Sanford donated $70 million to the project. The South Dakota Legislature also created the South Dakota Science and Technology Authority to operate the lab. The state Legislature has committed more than $40 million in state funds to the project, and South Dakota also obtained a $10 million Community Development Block Grant to help rehabilitate the facility.

    In 2007, after the National Science Foundation named Homestake as the preferred site for a proposed national Deep Underground Science and Engineering Laboratory (DUSEL), the South Dakota Science and Technology Authority (SDSTA) began reopening the former gold mine.

    In December 2010, the National Science Board decided not to fund further design of DUSEL. However, in 2011 the Department of Energy, through the Lawrence Berkeley National Laboratory, agreed to support ongoing science operations at Sanford Lab, while investigating how to use the underground research facility for other longer-term experiments. The SDSTA, which owns Sanford Lab, continues to operate the facility under that agreement with Berkeley Lab.

    The first two major physics experiments at the Sanford Lab are 4,850 feet underground in an area called the Davis Campus, named for the late Ray Davis. The Large Underground Xenon (LUX) experiment is housed in the same cavern excavated for Ray Davis’s experiment in the 1960s.

    The LBNL LZ Dark Matter Experiment at SURF, Lead, SD, USA.

    In October 2013, after an initial run of 80 days, LUX was determined to be the most sensitive detector yet to search for dark matter—a mysterious, yet-to-be-detected substance thought to be the most prevalent matter in the universe.

    The University of Washington-led MAJORANA Demonstrator experiment (US), also on the 4850 Level, is searching for a rare phenomenon called “neutrinoless double-beta decay” that could reveal whether subatomic particles called neutrinos can be their own antiparticle. Detection of neutrinoless double-beta decay could help determine why matter prevailed over antimatter. The Majorana Demonstrator experiment is adjacent to the original Davis cavern.

    The LUX xenon dark matter detector’s mission at the Sanford Underground Research Facility was to scour the universe for WIMPs, vetoing all other signatures. It would continue to do just that for another three years before it was decommissioned in 2016.

    In the midst of the excitement over first results, the LUX collaboration was already casting its gaze forward. Planning for a next-generation dark matter experiment at Sanford Lab was already under way. Named LUX-ZEPLIN (LZ), the next-generation experiment would increase the sensitivity of LUX 100 times.

    SLAC National Accelerator Laboratory (US) physicist Tom Shutt, a previous co-spokesperson for LUX, said one goal of the experiment was to figure out how to build an even larger detector.

    “LZ will be a thousand times more sensitive than the LUX detector,” Shutt said. “It will just begin to see an irreducible background of neutrinos that may ultimately set the limit to our ability to measure dark matter.”

    We celebrate five years of LUX, and look into the steps being taken toward the much larger and far more sensitive experiment.

    Another major experiment, the Long Baseline Neutrino Experiment (LBNE), a collaboration between Fermi National Accelerator Laboratory (Fermilab) and Sanford Lab, is in the preliminary design stages. The project got a major boost when Congress approved and the president signed an Omnibus Appropriations bill that funds LBNE operations through FY 2014. Called the “next frontier of particle physics,” LBNE will follow neutrinos as they travel 800 miles through the earth, from Fermilab in Batavia, Ill., to Sanford Lab.

    FNAL DUNE LBNF (US) from FNAL to SURF, Lead, South Dakota, USA

    FNAL DUNE LBNF (US) Caverns at Sanford Lab.

    U Washington MAJORANA Neutrinoless Double-beta Decay Experiment (US) at SURF.

    The MAJORANA DEMONSTRATOR will contain 40 kg of germanium; up to 30 kg will be enriched to 86% in 76Ge. The DEMONSTRATOR will be deployed deep underground in an ultra-low-background shielded environment in the Sanford Underground Research Facility (SURF) in Lead, SD. The goal of the DEMONSTRATOR is to determine whether a future 1-tonne experiment can achieve a background goal of one count per tonne-year in a 4-keV region of interest around the 76Ge 0νββ Q-value at 2039 keV. MAJORANA plans to collaborate with the Germanium Detector Array (GERDA) experiment, which is searching for neutrinoless double-beta decay (0νββ) in 76Ge at the underground Laboratori Nazionali del Gran Sasso (LNGS), on a future tonne-scale 76Ge 0νββ search.
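    The background goal stated above translates into simple exposure arithmetic. The sketch below shows what "one count per tonne-year in a 4-keV region of interest" means in practice; the detector mass and run time are hypothetical choices for illustration:

    ```python
    # Background goal quoted in the text: 1 count per tonne-year in a 4-keV
    # window centered on the 76Ge 0vbb Q-value of 2039 keV.
    BACKGROUND_RATE = 1.0   # counts per (tonne * year) in the ROI
    ROI_KEV = 4.0
    Q_VALUE_KEV = 2039.0

    def expected_background(mass_tonnes: float, years: float) -> float:
        """Expected background counts in the ROI for a given exposure."""
        return BACKGROUND_RATE * mass_tonnes * years

    # A hypothetical 1-tonne detector running for 10 years would expect only
    # ~10 background counts in the signal window if the goal is met.
    print(expected_background(1.0, 10.0))  # 10.0
    lo, hi = Q_VALUE_KEV - ROI_KEV / 2, Q_VALUE_KEV + ROI_KEV / 2
    print(f"ROI: {lo:.0f}-{hi:.0f} keV")  # 2037-2041 keV
    ```

    Keeping the expected background at the level of a handful of counts over a decade is what makes a single sharp peak at the Q-value detectable at all.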

    Compact Accelerator System for Performing Astrophysical Research (CASPAR). Credit: Nick Hubbard.

    CASPAR is a low-energy particle accelerator that allows researchers to study processes that take place inside collapsing stars.

    The scientists are using space in the Sanford Underground Research Facility (SURF) in Lead, South Dakota, to work on a project called the Compact Accelerator System for Performing Astrophysical Research (CASPAR). CASPAR uses a low-energy particle accelerator that will allow researchers to mimic nuclear fusion reactions in stars. If successful, their findings could help complete our picture of how the elements in our universe are built. “Nuclear astrophysics is about what goes on inside the star, not outside of it,” said Dan Robertson, a Notre Dame assistant research professor of astrophysics working on CASPAR. “It is not observational, but experimental. The idea is to reproduce the stellar environment, to reproduce the reactions within a star.”

     
  • richardmitnick 8:57 am on August 31, 2021 Permalink | Reply
    Tags: , , , Neutrino oscillation, , Neutrinoless double beta decay, , , Solar Neutrino Problem   

    From Sanford Underground Research Facility-SURF: “The neutrino puzzle” 

    SURF-Sanford Underground Research Facility, Lead, South Dakota, USA.

    From Sanford Underground Research Facility-SURF

    Homestake Mining, Lead, South Dakota, USA.


    Homestake Mining Company

    August 30, 2021
    Constance Walter

    Researchers continue to piece together information about the ghostly particle.

    Imagine trying to put together a jigsaw puzzle that has no picture for reference, is missing several pieces and, of the pieces you do have, some don’t quite fit together.

    Welcome to the life of a neutrino researcher.

    Vincente Guiseppe began his neutrino journey 15 years ago as a post-doc at DOE’s Los Alamos National Laboratory (US). He worked with germanium detectors and studied radon while a graduate student and followed the scientific community’s progress as the Solar Neutrino Problem was solved. The so-called Solar Neutrino Problem was created when Dr. Ray Davis Jr., who operated a solar neutrino experiment on the 4850 Level of the Homestake Gold Mine, discovered only one-third of the neutrinos that had been theorized. Nearly 30 years after Davis began his search, the problem was solved with the discovery of neutrino oscillation.

    “I began to understand that neutrinos had much more in store for us. That led me to move to neutrino physics and set me up to transition to the Majorana Demonstrator (Majorana) project,” said Guiseppe, who is now a co-spokesperson for Majorana, located nearly a mile underground at SURF, and a senior research staff member at DOE’s Oak Ridge National Lab (ORNL).

    Majorana uses germanium crystals in a search for the theorized Majorana particle—a neutrino that is believed to be its own antiparticle. Its discovery could help unravel mysteries about the origins of the universe and would add yet another piece to this baffling neutrino puzzle.

    We caught up with Guiseppe recently to talk about neutrinos—what scientists know (and don’t know), why neutrinos behave so strangely and why scientists keep searching for this ghost-like particle.

    SURF: What are neutrinos?

    Guiseppe: Let’s start with what we know. Of all the known fundamental particles that have mass, neutrinos are the most abundant—only the massless photon, which we see as light, is more abundant. We know their mass is quite small, but not zero—much lighter than their counterparts in the Standard Model of Physics—and we know there are three types and that they can change flavors. They also rarely interact with matter, which makes them difficult to study.

    All of these data points are pieces of that neutrino puzzle. But every piece is important if we want to complete the picture.

    SURF: Why should we care about the neutrino?

    Guiseppe: We care because they are so abundant. It’s almost embarrassing to have something that is so prevalent all around us and to not fully understand it. Think of it this way: You see a forest and the most abundant thing in that forest is a tree. But that’s all you know. You don’t know anything about how a tree operates. You don’t know how it grows, you don’t know why it’s green, you don’t know why it’s alive. It would be embarrassing to not know that. But that’s not the case with trees. Something so abundant as what we see in nature—animal species, trees, plants—we understand them completely, there’s nothing surprising. So, the fact that they are so abundant, and yet we know so little about them, brings a sort of duty to understand them.

    SURF: What intrigues you most about neutrino research?

    Guiseppe: Most? I would say the breadth of research and the big questions that can be answered by a single particle. While similar claims could be made about other particle research, the experimental approach is wide open. We look for neutrinos from nuclear reactors, particle accelerators, the earth, our atmosphere, the sun, from supernovae, and some experiments are only satisfied if we find no neutrinos, as in the case of neutrinoless double-beta decay searches. Neutrino research places detectors in underground caverns, at the South Pole, in the ocean, and even in a van for drive-by neutrino monitoring for nuclear safeguard applications. It’s a diverse field with big and unique questions.

    SURF: What is oscillation?

    Guiseppe: Oscillation is the idea that neutrinos can co-exist in a mixture of types or “flavors.” While they must start out as a particular flavor upon formation, they can evolve into a mixture of other flavors while traveling before falling into one flavor upon interaction with matter or detection. Hence, they are observed to oscillate between flavors from formation to detection.

    SURF: It’s a fundamental idea that a thing can’t become another thing unless acted upon by an outside force or material. How can something spontaneously become something it wasn’t a split second ago? And why are we OK with that?

    Guiseppe: Are people really okay with the idea of neutrinos changing flavors? I think we are, inasmuch as we are really okay with the implications of quantum mechanics? (As an aside, this reminds me of a question I asked my undergraduate quantum mechanics professor. I felt I was doing fine in the class and could work the problems but was worried that I really didn’t understand quantum mechanics. He responded with a slight grin: “Oh, no one really ‘understands’ quantum mechanics.”).

    It is quantum mechanics at work that makes this flavor change possible. Since neutrinos come in three separate flavors and three separate masses (and more importantly, each flavor does not come as a definite mass), they can exist in a quantum mechanical mixture of flavors. The root of your concern stems from the idea of its identify—what does it mean to change this identity?

    The comforting aspect is that neutrinos are not found to change speed, direction, mass, shape, or anything else that would require an outside force or energy in the usual sense. By changing flavor, the neutrino is only changing its personality and the rules by which it should follow at a given time.

    While this bit of personification is probably not comforting, it is only how the neutrino must interact with other particles that changes over time. You could think of the neutrino as being formed as one type, but then realizing it is not forced into that identity. It then remains in an indecisive state while being swayed to one type over another before finally making a decision upon detection or other interaction. In that sense, it is not a spontaneous change, but the result of a well thought-out (or predictable) decision process.

    SURF: What is a Majorana Particle and why is it important?

    Guiseppe: A Majorana particle is one that is indistinguishable from its antimatter partner. This sets it apart from all other particles. With the Majorana Demonstrator, we are looking for this particle in a process called neutrinoless double-beta decay.

    Neutrinoless double-beta decay is a nuclear process whereby two neutrons transform into two protons and electrons (aka, beta particles), but without the emission of two anti-neutrinos. This is in contrast to the two neutrino double-beta decay process where the two anti-neutrinos are emitted; a process that has been observed.

    SURF: Why neutrinoless double-beta decay?

    Guiseppe: Neutrinoless double-beta decay experiments offer the right mix of simplicity, experimental challenges, and the potential for a fascinating discovery. The signature for neutrinoless double-beta decay is simple: a measurement made at a specific energy and at a fixed point in the detector. But it’s a rare occurrence that is easily obscured so reducing all background (interferences) that can partially mimic this signature and foil the measurement is critical. Searching for this decay requires innovative detectors, as well as the ability to control the ubiquitous radiation found in everything around us.

    SURF: After so many years, how do you stay enthusiastic about neutrino research?

    Guiseppe: Its book isn’t finished yet. We have more to learn and more questions to answer—we only need the means to do so. I stay enthused due to the likelihood of some new surprises (or comforting discoveries) that await. Along the way, we can continue to make advances in detector technology and develop new (or cleaner) materials, which inevitably lead to applications outside of physics research. In the end, chasing down neutrino properties and the secrets they may hold remains exciting due to clever ideas that keep the next discovery within reach.

    See the full article here .


    five-ways-keep-your-child-safe-school-shootings
    Please help promote STEM in your local schools.

    Stem Education Coalition

    About us: The Sanford Underground Research Facility-SURF in Lead, South Dakota, advances our understanding of the universe by providing laboratory space deep underground, where sensitive physics experiments can be shielded from cosmic radiation. Researchers at the Sanford Lab explore some of the most challenging questions facing 21st century physics, such as the origin of matter, the nature of dark matter and the properties of neutrinos. The facility also hosts experiments in other disciplines—including geology, biology and engineering.

    The Sanford Lab is located at the former Homestake gold mine, which was a physics landmark long before being converted into a dedicated science facility. Nuclear chemist Ray Davis earned a share of the Nobel Prize for Physics in 2002 for a solar neutrino experiment he installed 4,850 feet underground in the mine.

    Homestake closed in 2003, but the company donated the property to South Dakota in 2006 for use as an underground laboratory. That same year, philanthropist T. Denny Sanford donated $70 million to the project. The South Dakota Legislature also created the South Dakota Science and Technology Authority to operate the lab. The state Legislature has committed more than $40 million in state funds to the project, and South Dakota also obtained a $10 million Community Development Block Grant to help rehabilitate the facility.

    In 2007, after the National Science Foundation named Homestake as the preferred site for a proposed national Deep Underground Science and Engineering Laboratory (DUSEL), the South Dakota Science and Technology Authority (SDSTA) began reopening the former gold mine.

    In December 2010, the National Science Board decided not to fund further design of DUSEL. However, in 2011 the Department of Energy, through the Lawrence Berkeley National Laboratory, agreed to support ongoing science operations at Sanford Lab, while investigating how to use the underground research facility for other longer-term experiments. The SDSTA, which owns Sanford Lab, continues to operate the facility under that agreement with Berkeley Lab.

    The first two major physics experiments at the Sanford Lab are 4,850 feet underground in an area called the Davis Campus, named for the late Ray Davis. The Large Underground Xenon (LUX) experiment is housed in the same cavern excavated for Ray Davis’s experiment in the 1960s.

    In October 2013, after an initial run of 80 days, LUX was determined to be the most sensitive detector yet to search for dark matter—a mysterious, yet-to-be-detected substance thought to be the most prevalent matter in the universe. The The U Washington MAJORANA Neutrinoless Double-beta Decay Experiment Demonstrator experiment (US), also on the 4850 Level, is searching for a rare phenomenon called “neutrinoless double-beta decay” that could reveal whether subatomic particles called neutrinos can be their own antiparticle. Detection of neutrinoless double-beta decay could help determine why matter prevailed over antimatter. The Majorana Demonstrator experiment is adjacent to the original Davis cavern.

    The LUX Xenon dark matter detector | Sanford Underground Research Facility mission was to scour the universe for WIMPs, vetoing all other signatures. It would continue to do just that for another three years before it was decommissioned in 2016.

    In the midst of the excitement over first results, the LUX collaboration was already casting its gaze forward. Planning for a next-generation dark matter experiment at Sanford Lab was already under way. Named LUX-ZEPLIN (LZ), the next-generation experiment would increase the sensitivity of LUX 100 times.

    SLAC National Accelerator Laboratory(US) physicist Tom Shutt, a previous co-spokesperson for LUX, said one goal of the experiment was to figure out how to build an even larger detector.

    “LZ will be a thousand times more sensitive than the LUX detector,” Shutt said. “It will just begin to see an irreducible background of neutrinos that may ultimately set the limit to our ability to measure dark matter.”

    We celebrate five years of LUX, and look into the steps being taken toward the much larger and far more sensitive experiment.

    Another major experiment, the Long Baseline Neutrino Experiment (LBNE), a collaboration between Fermi National Accelerator Laboratory (Fermilab) and Sanford Lab, is in the preliminary design stages. The project got a major boost last year when Congress approved and the president signed an Omnibus Appropriations bill that will fund LBNE operations through FY 2014. Called the “next frontier of particle physics,” LBNE will follow neutrinos as they travel 800 miles through the earth, from Fermilab in Batavia, Ill., to Sanford Lab.
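    The quantity an experiment like LBNE is built around can be sketched with the standard two-flavor oscillation formula, P = sin²(2θ)·sin²(1.27 Δm² L/E). The sketch below is illustrative only: the baseline, beam energy, and mixing parameters are textbook-scale assumptions, not LBNE’s published design values.

```python
import math

def oscillation_probability(L_km, E_GeV, sin2_2theta, dm2_eV2):
    """Two-flavor neutrino oscillation probability.

    P = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E),
    with L in km, E in GeV, and dm2 in eV^2 (standard convention).
    """
    phase = 1.27 * dm2_eV2 * L_km / E_GeV
    return sin2_2theta * math.sin(phase) ** 2

# Illustrative values only: an ~800-mile (~1300 km) baseline like
# Fermilab-to-Sanford Lab, a few-GeV beam, and textbook-scale
# atmospheric mixing parameters (not the experiment's actual inputs).
L = 1300.0   # km
E = 2.5      # GeV
prob = oscillation_probability(L, E, sin2_2theta=1.0, dm2_eV2=2.5e-3)
print(f"P(oscillation) at L = {L:.0f} km, E = {E} GeV: {prob:.3f}")
```

    At a fixed baseline, sensitivity comes from scanning over the neutrino energy E, which sweeps the oscillation phase through its first maximum.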

    FNAL DUNE LBNF (US) from FNAL to SURF, Lead, South Dakota, USA

    FNAL DUNE LBNF (US) Caverns at Sanford Lab.

    The MAJORANA DEMONSTRATOR will contain 40 kg of germanium; up to 30 kg will be enriched to 86% in 76Ge. The DEMONSTRATOR will be deployed deep underground in an ultra-low-background shielded environment in the Sanford Underground Research Facility (SURF) in Lead, SD. The goal of the DEMONSTRATOR is to determine whether a future 1-tonne experiment can achieve a background goal of one count per tonne-year in a 4-keV region of interest around the 76Ge 0νββ Q-value at 2039 keV. MAJORANA plans to collaborate with the GERDA (Germanium Detector Array) experiment, which is searching for neutrinoless double beta decay (0νββ) in 76Ge at the underground Laboratori Nazionali del Gran Sasso (LNGS), on a future tonne-scale 76Ge 0νββ search.
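    The background goal quoted above turns into expected counts by simple arithmetic: a rate in counts per tonne-year times an exposure in tonne-years. The sketch below is minimal; the detector mass and live time are illustrative assumptions, not MAJORANA’s actual schedule.

```python
def expected_background_counts(rate_per_tonne_year, mass_kg, live_time_years):
    """Expected background counts in the region of interest,
    given a rate quoted in counts per tonne-year."""
    exposure_tonne_years = (mass_kg / 1000.0) * live_time_years
    return rate_per_tonne_year * exposure_tonne_years

# The stated tonne-scale goal: 1 count per tonne-year in the 4-keV
# window around the 76Ge Q-value (2039 keV). For an illustrative
# 1-tonne detector running 10 years:
counts = expected_background_counts(1.0, 1000.0, 10.0)
print(f"Expected ROI background: {counts:.1f} counts")
```

    At that level, a handful of events in the region of interest over a decade would stand out clearly above background, which is the point of the goal.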

    CASPAR is a low-energy particle accelerator that allows researchers to study processes that take place inside collapsing stars.

    The scientists are using space in the Sanford Underground Research Facility (SURF) in Lead, South Dakota, to work on a project called the Compact Accelerator System for Performing Astrophysical Research (CASPAR). CASPAR uses a low-energy particle accelerator that will allow researchers to mimic nuclear fusion reactions in stars. If successful, their findings could help complete our picture of how the elements in our universe are built. “Nuclear astrophysics is about what goes on inside the star, not outside of it,” said Dan Robertson, a Notre Dame assistant research professor of astrophysics working on CASPAR. “It is not observational, but experimental. The idea is to reproduce the stellar environment, to reproduce the reactions within a star.”

     
  • richardmitnick 8:25 pm on July 18, 2021 Permalink | Reply
    Tags: "Curiosity and technology drive quest to reveal fundamental secrets of the universe", A very specific particle called a J/psi might provide a clearer picture of what’s going on inside a proton’s gluonic field., , Argonne-driven technology is part of a broad initiative to answer fundamental questions about the birth of matter in the universe and the building blocks that hold it all together., , , , , , Computational Science, , , , , , Developing and fabricating detectors that search for signatures from the early universe or enhance our understanding of the most fundamental of particles., , , Exploring the hearts of protons and neutrons, , Neutrinoless double beta decay, Neutrinoless double beta decay can only happen if the neutrino is its own anti-particle., , , , , , , SLAC National Accelerator Laboratory(US), , ,   

    From DOE’s Argonne National Laboratory (US) : “Curiosity and technology drive quest to reveal fundamental secrets of the universe” 

    Argonne Lab

    From DOE’s Argonne National Laboratory (US)

    July 15, 2021
    John Spizzirri

    Argonne-driven technology is part of a broad initiative to answer fundamental questions about the birth of matter in the universe and the building blocks that hold it all together.

    Imagine the first of our species to lie beneath the glow of an evening sky. An enormous sense of awe, perhaps a little fear, fills them as they wonder at those seemingly infinite points of light and what they might mean. As humans, we evolved the capacity to ask big insightful questions about the world around us and worlds beyond us. We dare, even, to question our own origins.

    “The place of humans in the universe is important to understand,” said physicist and computational scientist Salman Habib. ​“Once you realize that there are billions of galaxies we can detect, each with many billions of stars, you understand the insignificance of being human in some sense. But at the same time, you appreciate being human a lot more.”

    The South Pole Telescope is part of a collaboration between Argonne and a number of national labs and universities to measure the CMB, considered the oldest light in the universe.

    The high altitude and extremely dry conditions of the South Pole keep water vapor from absorbing select light wavelengths.

    With no less a sense of wonder than most of us, Habib and colleagues at the U.S. Department of Energy’s (DOE) Argonne National Laboratory are actively researching these questions through an initiative that investigates the fundamental components of both particle physics and astrophysics.

    The breadth of Argonne’s research in these areas is mind-boggling. It takes us back to the very edge of time itself, to some infinitesimally small portion of a second after the Big Bang when random fluctuations in temperature and density arose, eventually forming the breeding grounds of galaxies and planets.

    It explores the heart of protons and neutrons to understand the most fundamental constructs of the visible universe, particles and energy once free in the early post-Big Bang universe, but later confined forever within a basic atomic structure as that universe began to cool.

    And it addresses slightly newer, more controversial questions about the nature of Dark Matter and Dark Energy, both of which play a dominant role in the makeup and dynamics of the universe but are little understood.
    _____________________________________________________________________________________
    Dark Energy Survey

    Dark Energy Camera [DECam] built at DOE’s Fermi National Accelerator Laboratory(US)

    NOIRLab National Optical Astronomy Observatory (US) Cerro Tololo Inter-American Observatory (CL) Víctor M. Blanco 4-meter Telescope, which houses the Dark Energy Camera (DECam), at Cerro Tololo, Chile, at an altitude of 7,200 feet.

    NSF NOIRLab NOAO (US) Cerro Tololo Inter-American Observatory (CL), approximately 80 km to the east of La Serena, Chile, at an altitude of 2,200 meters.

    Timeline of the Inflationary Universe WMAP

    The Dark Energy Survey (DES) is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. DES began searching the Southern skies on August 31, 2013.

    According to Einstein’s theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up. To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called dark energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    DES is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the DES collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time is used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.
    _____________________________________________________________________________________
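    The two possibilities described above can be stated more sharply with the Friedmann acceleration equation of General Relativity (a textbook result, sketched here for orientation):

```latex
\frac{\ddot{a}}{a} \;=\; -\,\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^{2}}\right)
```

    For ordinary matter and radiation, ρ + 3p/c² > 0, so the expansion decelerates. Acceleration (ä > 0) requires a dominant component with pressure p < −ρc²/3, that is, an equation-of-state parameter w = p/(ρc²) below −1/3; a cosmological constant, with w = −1, is the simplest such dark-energy candidate. The alternative is to modify the left-hand side, i.e., the theory of gravity itself on cosmic scales.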

    “And this world-class research we’re doing could not happen without advances in technology,” said Argonne Associate Laboratory Director Kawtar Hafidi, who helped define and merge the different aspects of the initiative.

    “We are developing and fabricating detectors that search for signatures from the early universe or enhance our understanding of the most fundamental of particles,” she added. ​“And because all of these detectors create big data that have to be analyzed, we are developing, among other things, artificial intelligence techniques to do that as well.”

    Decoding messages from the universe

    Fleshing out a theory of the universe on cosmic or subatomic scales requires a combination of observations, experiments, theories, simulations and analyses, which in turn requires access to the world’s most sophisticated telescopes, particle colliders, detectors and supercomputers.

    Argonne is uniquely suited to this mission, equipped as it is with many of those tools, the ability to manufacture others and collaborative privileges with other federal laboratories and leading research institutions to access other capabilities and expertise.

    As lead of the initiative’s cosmology component, Habib uses many of these tools in his quest to understand the origins of the universe and what makes it tick.

    And what better way to do that than to observe it, he said.

    “If you look at the universe as a laboratory, then obviously we should study it and try to figure out what it is telling us about foundational science,” noted Habib. ​“So, one part of what we are trying to do is build ever more sensitive probes to decipher what the universe is trying to tell us.”

    To date, Argonne is involved in several significant sky surveys, which use an array of observational platforms, like telescopes and satellites, to map different corners of the universe and collect information that furthers or rejects a specific theory.

    For example, the South Pole Telescope survey, a collaboration between Argonne and a number of national labs and universities, is measuring the cosmic microwave background (CMB) [above], considered the oldest light in the universe. Variations in CMB properties, such as temperature, signal the original fluctuations in density that ultimately led to all the visible structure in the universe.

    Additionally, the Dark Energy Spectroscopic Instrument and the forthcoming Vera C. Rubin Observatory are specially outfitted, ground-based telescopes designed to shed light on dark energy and dark matter, as well as the formation of luminous structure in the universe.

    DOE’s Lawrence Berkeley National Laboratory (US) DESI spectroscopic instrument on the Mayall 4-meter telescope at Kitt Peak National Observatory, in the Quinlan Mountains in the Arizona-Sonoran Desert on the Tohono O’odham Nation, 88 kilometers (55 mi) west-southwest of Tucson, Arizona, at an altitude of 2,096 m (6,877 ft).

    NSF NOIRLab NOAO (US) Mayall 4-meter telescope at Kitt Peak National Observatory, in the Quinlan Mountains in the Arizona-Sonoran Desert on the Tohono O’odham Nation, 88 kilometers (55 mi) west-southwest of Tucson, Arizona, at an altitude of 2,096 m (6,877 ft).

    NSF NOIRLab NOAO (US) Kitt Peak National Observatory on Kitt Peak of the Quinlan Mountains, in the Arizona-Sonoran Desert on the Tohono O’odham Nation, 88 kilometers (55 mi) west-southwest of Tucson, Arizona, at an altitude of 2,096 m (6,877 ft).

    NSF NOIRLab NOAO Vera C. Rubin Observatory [LSST] Telescope, currently under construction on the El Peñón peak at Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region in northern Chile, alongside the existing Gemini South Telescope and Southern Astrophysical Research Telescope.

    Darker matters

    All the data sets derived from these observations are connected to the second component of Argonne’s cosmology push, which revolves around theory and modeling. Cosmologists combine observations, measurements and the prevailing laws of physics to form theories that resolve some of the mysteries of the universe.

    But the universe is complex, and it has an annoying tendency to throw a curve ball just when we thought we had a theory cinched. Discoveries within the past 100 years have revealed that the universe is both expanding and accelerating its expansion — realizations that came as separate but equal surprises.

    Saul Perlmutter (center) [The Supernova Cosmology Project] shared the 2006 Shaw Prize in Astronomy, the 2011 Nobel Prize in Physics, and the 2015 Breakthrough Prize in Fundamental Physics with Brian P. Schmidt (right) and Adam Riess (left) [The High-z Supernova Search Team] for providing evidence that the expansion of the universe is accelerating.

    “To say that we understand the universe would be incorrect. To say that we sort of understand it is fine,” exclaimed Habib. ​“We have a theory that describes what the universe is doing, but each time the universe surprises us, we have to add a new ingredient to that theory.”

    Modeling helps scientists get a clearer picture of whether and how those new ingredients will fit a theory. They make predictions for observations that have not yet been made, telling observers what new measurements to take.

    Habib’s group is applying this same sort of process to gain an ever-so-tentative grasp on the nature of dark energy and dark matter. While scientists can tell us that both exist, and that they make up about 68% and 26% of the universe, respectively, beyond that not much else is known.

    ______________________________________________________________________________________________________________

    Dark Matter Background
    Fritz Zwicky discovered dark matter in the 1930s while observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel, did most of the work on dark matter some 30 years later.

    Fritz Zwicky, from http://palomarskies.blogspot.com.


    Coma cluster via NASA/ESA Hubble.


    In modern times, it was astronomer Fritz Zwicky who, in the 1930s, made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.
    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that stars at the edges of galaxies orbit roughly as fast as stars near their centers, whereas, if only the visible matter were present, the outer stars should move more slowly, much as the outer planets of the solar system orbit the Sun more slowly than the inner ones. The only way to explain these flat rotation curves is if each visible galaxy sits inside some much larger structure of unseen mass, so that the rotation speed stays roughly constant from center to edge.
    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
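    The logic of the rotation-curve argument can be sketched numerically. If essentially all of a galaxy’s mass were the visible mass concentrated toward its center, circular speeds would fall off as v = √(GM/r); flat measured curves therefore imply enclosed mass growing with radius. The numbers below are illustrative (a Milky-Way-like 10¹¹ solar masses), not a fit to any real galaxy.

```python
import math

G = 4.30e-6  # gravitational constant in kpc * (km/s)^2 per solar mass

def keplerian_speed(r_kpc, enclosed_mass_msun):
    """Circular speed if all the mass sits inside radius r: v = sqrt(G*M/r)."""
    return math.sqrt(G * enclosed_mass_msun / r_kpc)

# Illustrative only: a Milky-Way-like visible mass of ~1e11 solar masses.
# With no dark halo the predicted speed falls steadily with radius,
# in contrast to the roughly flat curves Rubin measured.
M_visible = 1.0e11
for r in (2, 5, 10, 20, 40):  # radii in kpc
    v = keplerian_speed(r, M_visible)
    print(f"r = {r:>2} kpc: v = {v:5.0f} km/s")
```

    Turning the argument around, a flat curve v(r) ≈ const requires the enclosed mass M(r) = v²r/G to grow linearly with radius, which is the dark-halo inference.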

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science).


    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL).


    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970

    Dark Matter Research

    Inside the Axion Dark Matter eXperiment, University of Washington (US). Credit: Mark Stone, University of Washington.
    _____________________________________________________________________________________

    Observations of cosmological structure — the distribution of galaxies and even of their shapes — provide clues about the nature of dark matter, which in turn feeds simple dark matter models and subsequent predictions. If observations, models and predictions aren’t in agreement, that tells scientists that there may be some missing ingredient in their description of dark matter.

    But there are also experiments that are looking for direct evidence of dark matter particles, which require highly sensitive detectors [above]. Argonne has initiated development of specialized superconducting detector technology for the detection of low-mass dark matter particles.

    This technology requires the ability to control properties of layered materials and adjust the temperature where the material transitions from finite to zero resistance, when it becomes a superconductor. And unlike other applications where scientists would like this temperature to be as high as possible — room temperature, for example — here, the transition needs to be very close to absolute zero.

    Habib refers to these dark matter detectors as traps, like those used for hunting — which, in essence, is what cosmologists are doing. Because it’s possible that dark matter doesn’t come in just one species, they need different types of traps.

    “It’s almost like you’re in a jungle in search of a certain animal, but you don’t quite know what it is — it could be a bird, a snake, a tiger — so you build different kinds of traps,” he said.

    Lab researchers are working on technologies to capture these elusive species through new classes of dark matter searches. Collaborating with other institutions, they are now designing and building a first set of pilot projects aimed at looking for dark matter candidates with low mass.

    Tuning in to the early universe

    Amy Bender is working on a different kind of detector — well, a lot of detectors — which are at the heart of a survey of the cosmic microwave background (CMB).

    “The CMB is radiation that has been around the universe for 13 billion years, and we’re directly measuring that,” said Bender, an assistant physicist at Argonne.

    The Argonne-developed detectors — all 16,000 of them — capture photons, or light particles, from that primordial sky through the aforementioned South Pole Telescope, to help answer questions about the early universe, fundamental physics and the formation of cosmic structures.

    Now, the CMB experimental effort is moving into a new phase, CMB-Stage 4 (CMB-S4).

    CMB-S4 is the next-generation ground-based cosmic microwave background experiment. With 21 telescopes at the South Pole and in the Chilean Atacama desert surveying the sky with 550,000 cryogenically cooled superconducting detectors for 7 years, CMB-S4 will deliver transformative discoveries in fundamental physics, cosmology, astrophysics, and astronomy. CMB-S4 is supported by the Department of Energy Office of Science and the National Science Foundation.

    This larger project tackles even more complex topics like Inflationary Theory, which suggests that the universe expanded faster than the speed of light for a fraction of a second, shortly after the Big Bang.
    _____________________________________________________________________________________
    Inflation

    Alan Guth, from Highland Park High School and M.I.T., who first proposed cosmic inflation. Credit: HPHS Owls.

    Lambda Cold Dark Matter accelerated expansion of the universe. Credit: Alex Mittelmann, Coldcreation.


    Alan Guth’s notes:

    Alan Guth’s original notes on inflation


    _____________________________________________________________________________________

    A section of a detector array with architecture suitable for future CMB experiments, such as the upcoming CMB-S4 project. Fabricated at Argonne’s Center for Nanoscale Materials, 16,000 of these detectors currently drive measurements collected from the South Pole Telescope. (Image by Argonne National Laboratory.)

    While the science is amazing, the technology to get us there is just as fascinating.

    Technically called transition edge sensing (TES) bolometers, the detectors on the telescope are made from superconducting materials fabricated at Argonne’s Center for Nanoscale Materials, a DOE Office of Science User Facility.

    Each of the 16,000 detectors acts as a combination of very sensitive thermometer and camera. As incoming radiation is absorbed on the surface of each detector, measurements are made by supercooling them to a fraction of a degree above absolute zero. (That’s over three times as cold as Antarctica’s lowest recorded temperature.)

    Changes in heat are measured and recorded as changes in electrical resistance and will help inform a map of the CMB’s intensity across the sky.
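    A toy model shows that measurement chain: absorbed power heats the sensor through a weak thermal link, and the tiny temperature rise moves the film along its steep superconducting transition, producing a resistance change. Every number below, and the linear R(T) model itself, is an illustrative assumption rather than a real detector parameter.

```python
def tes_resistance(T_K, Tc_K=0.5, R_normal_ohm=1.0, width_K=0.001):
    """Toy transition-edge model: resistance rises linearly from 0 to
    R_normal across a narrow transition of the given width around Tc."""
    lo, hi = Tc_K - width_K / 2, Tc_K + width_K / 2
    if T_K <= lo:
        return 0.0
    if T_K >= hi:
        return R_normal_ohm
    return R_normal_ohm * (T_K - lo) / width_K

def temperature_rise(absorbed_power_W, thermal_conductance_W_per_K):
    """Steady-state heating of the sensor above its bath: dT = P / G."""
    return absorbed_power_W / thermal_conductance_W_per_K

# Illustrative: a picowatt of absorbed sky power and a weak thermal
# link (G = 10 nW/K) give a measurable resistance jump when the
# sensor is biased in the middle of its transition.
T_bias = 0.5                            # K, mid-transition
dT = temperature_rise(1e-12, 1e-8)      # 1 pW absorbed
R_before = tes_resistance(T_bias)
R_after = tes_resistance(T_bias + dT)
print(f"dT = {dT * 1e3:.1f} mK, dR = {R_after - R_before:.3f} ohm")
```

    The steepness of the transition is the point: a sub-millikelvin temperature change produces a fractionally large resistance change, which is what makes these bolometers such sensitive thermometers.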

    CMB-S4 will focus on newer technology that will allow researchers to distinguish very specific patterns in light, or polarized light. In this case, they are looking for what Bender calls the Holy Grail of polarization, a pattern called B-modes.

    Capturing this signal from the early universe — one far fainter than the intensity signal — will help to either confirm or disprove a generic prediction of inflation.

    It will also require the addition of 500,000 detectors distributed among 21 telescopes in two distinct regions of the world, the South Pole and the Chilean desert. There, the high altitude and extremely dry conditions keep water vapor in the atmosphere from absorbing millimeter wavelength light, like that of the CMB.

    While previous experiments have touched on this polarization, the large number of new detectors will improve sensitivity to that polarization and grow our ability to capture it.

    “Literally, we have built these cameras completely from the ground up,” said Bender. ​“Our innovation is in how to make these stacks of superconducting materials work together within this detector, where you have to couple many complex factors and then actually read out the results with the TES. And that is where Argonne has contributed, hugely.”

    Down to the basics

    Argonne’s capabilities in detector technology don’t just stop at the edge of time, nor do the initiative’s investigations just look at the big picture.

    Most of the visible universe, including galaxies, stars, planets and people, are made up of protons and neutrons. Understanding the most fundamental components of those building blocks and how they interact to make atoms and molecules and just about everything else is the realm of physicists like Zein-Eddine Meziani.

    “From the perspective of the future of my field, this initiative is extremely important,” said Meziani, who leads Argonne’s Medium Energy Physics group. ​“It has given us the ability to actually explore new concepts, develop better understanding of the science and a pathway to enter into bigger collaborations and take some leadership.”

    Taking the lead of the initiative’s nuclear physics component, Meziani is steering Argonne toward a significant role in the development of the Electron-Ion Collider, a new U.S. Nuclear Physics Program facility slated for construction at DOE’s Brookhaven National Laboratory (US).

    Argonne’s primary interest in the collider is to elucidate the role that quarks, anti-quarks and gluons play in giving mass and a quantum angular momentum, called spin, to protons and neutrons — nucleons — the particles that comprise the nucleus of an atom.


    EIC Electron Animation, Inner Proton Motion.
    Electrons colliding with ions will exchange virtual photons with the nuclear particles to help scientists ​“see” inside the nuclear particles; the collisions will produce precision 3D snapshots of the internal arrangement of quarks and gluons within ordinary nuclear matter; like a combination CT/MRI scanner for atoms. (Image by Brookhaven National Laboratory.)

    While we once thought nucleons were the finite fundamental particles of an atom, the emergence of powerful particle colliders, like the Stanford Linear Accelerator Center at Stanford University and the former Tevatron at DOE’s Fermilab, proved otherwise.

    It turns out that quarks and gluons were independent of nucleons in the extreme energy densities of the early universe; as the universe expanded and cooled, they transformed into ordinary matter.

    “There was a time when quarks and gluons were free in a big soup, if you will, but we have never seen them free,” explained Meziani. ​“So, we are trying to understand how the universe captured all of this energy that was there and put it into confined systems, like these droplets we call protons and neutrons.”

    Some of that energy is tied up in gluons, which, despite the fact that they have no mass, confer the majority of mass to a proton. So, Meziani is hoping that the Electron-Ion Collider will allow science to explore — among other properties — the origins of mass in the universe through a detailed exploration of gluons.

    And just as Amy Bender is looking for the B-modes polarization in the CMB, Meziani and other researchers are hoping to use a very specific particle called a J/psi to provide a clearer picture of what’s going on inside a proton’s gluonic field.

    But producing and detecting the J/psi particle within the collider — while ensuring that the proton target doesn’t break apart — is a tricky enterprise, which requires new technologies. Again, Argonne is positioning itself at the forefront of this endeavor.

    “We are working on the conceptual designs of technologies that will be extremely important for the detection of these types of particles, as well as for testing concepts for other science that will be conducted at the Electron-Ion Collider,” said Meziani.

    Argonne also is producing detector and related technologies in its quest for a phenomenon called neutrinoless double beta decay. A neutrino is one of the particles emitted during the process of neutron radioactive beta decay and serves as a small but mighty connection between particle physics and astrophysics.

    “Neutrinoless double beta decay can only happen if the neutrino is its own anti-particle,” said Hafidi. ​“If the existence of these very rare decays is confirmed, it would have important consequences in understanding why there is more matter than antimatter in the universe.”
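    The link between observing the decay and the matter-antimatter question is usually quantified through the decay half-life. In the standard light-neutrino-exchange picture (a textbook relation, sketched here), the rate scales with the square of an effective Majorana neutrino mass:

```latex
\left(T_{1/2}^{0\nu}\right)^{-1} \;=\; G^{0\nu}\,\left|M^{0\nu}\right|^{2}\,
\frac{\langle m_{\beta\beta}\rangle^{2}}{m_{e}^{2}}
```

    Here G⁰ν is a calculable phase-space factor, M⁰ν a nuclear matrix element, ⟨m_ββ⟩ the effective Majorana mass, and m_e the electron mass. Seeing the decay at any rate would establish that the neutrino is its own antiparticle; the measured half-life would then constrain ⟨m_ββ⟩.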

    Argonne scientists from different areas of the lab are working with the Neutrino Experiment with Xenon Time Projection Chamber (NEXT) collaboration to design and prototype key systems for the collaboration’s next big experiment. This includes developing a one-of-a-kind test facility and an R&D program for new, specialized detector systems.

    “We are really working on dramatic new ideas,” said Meziani. ​“We are investing in certain technologies to produce some proof of principle that they will be the ones to pursue later, that the technology breakthroughs that will take us to the highest sensitivity detection of this process will be driven by Argonne.”

    The tools of detection

    Ultimately, fundamental science is science derived from human curiosity. And while we may not always see the reason for pursuing it, more often than not, fundamental science produces results that benefit all of us. Sometimes it’s a gratifying answer to an age-old question, other times it’s a technological breakthrough intended for one science that proves useful in a host of other applications.

    Through their various efforts, Argonne scientists are aiming for both outcomes. But it will take more than curiosity and brain power to solve the questions they are asking. It will take our skills at toolmaking, like the telescopes that peer deep into the heavens and the detectors that capture hints of the earliest light or the most elusive of particles.

    We will need to employ the ultrafast computing power of new supercomputers. Argonne’s forthcoming Aurora exascale machine will analyze mountains of data for help in creating massive models that simulate the dynamics of the universe or subatomic world, which, in turn, might guide new experiments — or introduce new questions.

    Depiction of ANL ALCF Cray Intel SC18 Shasta Aurora exascale supercomputer, to be built at DOE’s Argonne National Laboratory.

    And we will apply artificial intelligence to recognize patterns in complex observations — on the subatomic and cosmic scales — far more quickly than the human eye can, or use it to optimize machinery and experiments for greater efficiency and faster results.

    “I think we have been given the flexibility to explore new technologies that will allow us to answer the big questions,” said Bender. ​“What we’re developing is so cutting edge, you never know where it will show up in everyday life.”

    Funding for research mentioned in this article was provided by Argonne Laboratory Directed Research and Development; Argonne program development; DOE Office of High Energy Physics: Cosmic Frontier, South Pole Telescope-3G project, Detector R&D; and DOE Office of Nuclear Physics.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Argonne National Laboratory (US) seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems. Argonne is a science and engineering research national laboratory operated by UChicago Argonne LLC for the United States Department of Energy. The facility is located in Lemont, Illinois, outside of Chicago, and is the largest national laboratory by size and scope in the Midwest.

    Argonne had its beginnings in the Metallurgical Laboratory of the University of Chicago, formed in part to carry out Enrico Fermi’s work on nuclear reactors for the Manhattan Project during World War II. After the war, it was designated as the first national laboratory in the United States on July 1, 1946. In the post-war era the lab focused primarily on non-weapon related nuclear physics, designing and building the first power-producing nuclear reactors, helping design the reactors used by the United States’ nuclear navy, and a wide variety of similar projects. In 1994, the lab’s nuclear mission ended, and today it maintains a broad portfolio in basic science research, energy storage and renewable energy, environmental sustainability, supercomputing, and national security.

    UChicago Argonne, LLC, the operator of the laboratory, “brings together the expertise of the University of Chicago (the sole member of the LLC) with Jacobs Engineering Group Inc.” Argonne is a part of the expanding Illinois Technology and Research Corridor. Argonne formerly ran a smaller facility called Argonne National Laboratory-West (or simply Argonne-West) in Idaho next to the Idaho National Engineering and Environmental Laboratory. In 2005, the two Idaho-based laboratories merged to become the DOE’s Idaho National Laboratory.

    What would become Argonne began in 1942 as the Metallurgical Laboratory at the University of Chicago, which had become part of the Manhattan Project. The Met Lab built Chicago Pile-1, the world’s first nuclear reactor, under the stands of the University of Chicago sports stadium. Considered unsafe at its urban site, CP-1 was reconstructed in 1943 as CP-2 in what is today known as Red Gate Woods, then the Argonne Forest of the Cook County Forest Preserve District near Palos Hills. The lab was named after the surrounding forest, which in turn was named after the Forest of Argonne in France, where U.S. troops fought in World War I. Fermi’s pile was originally going to be constructed in the Argonne forest, and construction plans were set in motion, but a labor dispute brought the project to a halt. Since speed was paramount, the project was moved to the squash court under Stagg Field, the football stadium on the campus of the University of Chicago. Fermi assured planners that his calculations showed the reaction would not run away and contaminate the city.

    Other activities were added to Argonne over the next five years. On July 1, 1946, the “Metallurgical Laboratory” was formally re-chartered as Argonne National Laboratory for “cooperative research in nucleonics.” At the request of the U.S. Atomic Energy Commission, it began developing nuclear reactors for the nation’s peaceful nuclear energy program. In the late 1940s and early 1950s, the laboratory moved to a larger location in unincorporated DuPage County, Illinois and established a remote location in Idaho, called “Argonne-West,” to conduct further nuclear research.

    In quick succession, the laboratory designed and built Chicago Pile 3 (1944), the world’s first heavy-water-moderated reactor, and the Experimental Breeder Reactor I (Chicago Pile 4), built in Idaho, which lit a string of four light bulbs with the world’s first nuclear-generated electricity in 1951. A complete list of the reactors designed and, in most cases, built and operated by Argonne can be viewed on the Reactors Designed by Argonne page. The knowledge gained from the Argonne experiments conducted with these reactors 1) formed the foundation for the designs of most of the commercial reactors currently used throughout the world for electric power generation and 2) informs the current evolving designs of liquid-metal reactors for future commercial power stations.

    Conducting classified research, the laboratory was heavily secured; all employees and visitors needed badges to pass a checkpoint, many of the buildings were classified, and the laboratory itself was fenced and guarded. Such alluring secrecy drew visitors both authorized—including King Leopold III of Belgium and Queen Frederica of Greece—and unauthorized. Shortly past 1 a.m. on February 6, 1951, Argonne guards discovered reporter Paul Harvey near the 10-foot (3.0 m) perimeter fence, his coat tangled in the barbed wire. Searching his car, guards found a previously prepared four-page broadcast detailing the saga of his unauthorized entrance into a classified “hot zone”. He was brought before a federal grand jury on charges of conspiracy to obtain information on national security and transmit it to the public, but was not indicted.

    Not all nuclear technology went into developing reactors, however. While designing a scanner for reactor fuel elements in 1957, Argonne physicist William Nelson Beck put his own arm inside the scanner and obtained one of the first ultrasound images of the human body. Remote manipulators designed to handle radioactive materials laid the groundwork for more complex machines used to clean up contaminated areas, sealed laboratories or caves. In 1964, the “Janus” reactor opened to study the effects of neutron radiation on biological life, providing research for guidelines on safe exposure levels for workers at power plants, laboratories and hospitals. Scientists at Argonne pioneered a technique to analyze the moon’s surface using alpha radiation, which launched aboard the Surveyor 5 in 1967 and later analyzed lunar samples from the Apollo 11 mission.

    In addition to nuclear work, the laboratory maintained a strong presence in the basic research of physics and chemistry. In 1955, Argonne chemists co-discovered the elements einsteinium and fermium, elements 99 and 100 in the periodic table. In 1962, laboratory chemists produced the first compound of the inert noble gas xenon, opening up a new field of chemical bonding research. In 1963, they discovered the hydrated electron.

    High-energy physics made a leap forward when Argonne was chosen as the site of the 12.5 GeV Zero Gradient Synchrotron, a proton accelerator that opened in 1963. A bubble chamber allowed scientists to track the motions of subatomic particles as they zipped through the chamber; in 1970, it recorded the first observation of a neutrino interaction in a hydrogen bubble chamber.

    Meanwhile, the laboratory was also helping to design the reactor for the world’s first nuclear-powered submarine, the U.S.S. Nautilus, which steamed for more than 513,550 nautical miles (951,090 km). The next nuclear reactor model was Experimental Boiling Water Reactor, the forerunner of many modern nuclear plants, and Experimental Breeder Reactor II (EBR-II), which was sodium-cooled, and included a fuel recycling facility. EBR-II was later modified to test other reactor designs, including a fast-neutron reactor and, in 1982, the Integral Fast Reactor concept—a revolutionary design that reprocessed its own fuel, reduced its atomic waste and withstood safety tests of the same failures that triggered the Chernobyl and Three Mile Island disasters. In 1994, however, the U.S. Congress terminated funding for the bulk of Argonne’s nuclear programs.

    Argonne moved to specialize in other areas, while capitalizing on its experience in physics, chemical sciences and metallurgy. In 1987, the laboratory was the first to successfully demonstrate a pioneering technique called plasma wakefield acceleration, which accelerates particles in much shorter distances than conventional accelerators. It also cultivated a strong battery research program.

    Following a major push by then-director Alan Schriesheim, the laboratory was chosen as the site of the Advanced Photon Source, a major X-ray facility which was completed in 1995 and produced the brightest X-rays in the world at the time of its construction.

    On 19 March 2019, the Chicago Tribune reported that the laboratory was constructing the world’s most powerful supercomputer. Costing $500 million, it will have a processing power of one exaflop (a quintillion floating-point operations per second). Applications will include the analysis of stars and improvements in the power grid.

    With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    About the Advanced Photon Source

    The U. S. Department of Energy Office of Science’s Advanced Photon Source (APS) at Argonne National Laboratory is one of the world’s most productive X-ray light source facilities. The APS provides high-brightness X-ray beams to a diverse community of researchers in materials science, chemistry, condensed matter physics, the life and environmental sciences, and applied research. These X-rays are ideally suited for explorations of materials and biological structures; elemental distribution; chemical, magnetic, and electronic states; and a wide range of technologically important engineering systems from batteries to fuel injector sprays, all of which are the foundations of our nation’s economic, technological, and physical well-being. Each year, more than 5,000 researchers use the APS to produce over 2,000 publications detailing impactful discoveries, and to solve more vital biological protein structures than users of any other X-ray light source research facility. APS scientists and engineers innovate technology that is at the heart of advancing accelerator and light-source operations. This includes the insertion devices that produce extreme-brightness X-rays prized by researchers, lenses that focus the X-rays down to a few nanometers, instrumentation that maximizes the way the X-rays interact with samples being studied, and software that gathers and manages the massive quantity of data resulting from discovery research at the APS.


     
  • richardmitnick 3:39 pm on March 24, 2021 Permalink | Reply
    Tags: "Measuring the invisible" Particle physicist Lindley Winslow, ABRACADABRA experiment at MIT, , , CUORE Experiment LNGS - Gran Sasso National Laboratory(IT), Kamioka Liquid Scintillator Antineutrino Detector-KamLAND, LBNL Cryogenic Dark Matter Search(US), Lindley Winslow has participated in many experiments herein enumerated., , Neutrinoless double beta decay, , Particle physicist Lindley Winslow seeks the universe’s smallest particles for answers to its biggest questions., , ,   

    From MIT: “Measuring the invisible” Particle physicist Lindley Winslow 

    MIT News


    March 24, 2021
    Jennifer Chu

    Particle physicist Lindley Winslow seeks the universe’s smallest particles for answers to its biggest questions.

    MIT particle physicist Lindley Winslow seeks the universe’s smallest particles for answers to its biggest questions.
    Credit: M. Scott Brauer.

    When she entered the field of particle physics in the early 2000s, Lindley Winslow was swept into the center of a massive experiment to measure the invisible.

    Scientists were finalizing the Kamioka Liquid Scintillator Antineutrino Detector (KamLAND), a building-sized particle detector built within a cavernous mine deep inside the Japanese Alps.

    KamLAND Neutrino Detector at the Kamioka Observatory, Institute for Cosmic Ray Research (神岡宇宙素粒子研究施設), located in a mine in Hida, Japan.

    The experiment was designed to detect neutrinos — subatomic particles that pass by the billions through ordinary matter.

    Neutrinos are produced anywhere particles interact and decay, from the Big Bang to the death of stars in supernovae. They rarely interact with matter and are therefore pristine messengers from the environments that create them.

    By 2000, scientists had observed neutrinos from various sources, including the sun, and hypothesized that the particles were morphing into different “flavors” by oscillating. KamLAND was designed to observe the oscillation, as a function of distance and energy, in neutrinos generated by Japan’s nearby nuclear reactors.
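    Oscillation shows up in KamLAND-style data as an energy- and distance-dependent deficit of electron antineutrinos. As a rough illustration (not the collaboration’s analysis code), the two-flavor survival probability can be sketched in a few lines; the parameter values below are illustrative numbers close to the measured solar oscillation parameters, and the 180 km baseline is an assumed round figure for the flux-weighted reactor distance:

```python
import math

def survival_probability(L_km: float, E_MeV: float,
                         sin2_2theta: float = 0.85,
                         dm2_eV2: float = 7.5e-5) -> float:
    """Two-flavor electron-antineutrino survival probability:
    P = 1 - sin^2(2*theta) * sin^2(1.27 * dm^2 * L / E),
    with dm^2 in eV^2 and L/E in m/MeV (equivalently km/GeV)."""
    L_m = L_km * 1000.0
    phase = 1.27 * dm2_eV2 * L_m / E_MeV  # oscillation phase in radians
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

# A reactor antineutrino at ~4 MeV after traveling ~180 km:
p = survival_probability(L_km=180.0, E_MeV=4.0)
print(f"survival probability ≈ {p:.2f}")
```

Plotting this probability against L/E traces out the oscillatory dip-and-recovery pattern that KamLAND measured, which is why the article describes the oscillation as a function of distance and energy.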

    Winslow joined the KamLAND effort the summer before graduate school and spent months in Japan, helping to prepare the detector for operation and then collecting data.

    “I learned to drive a manual transmission on reinforced land cruisers into the mine, past a waterfall, and down a long tunnel, where we then had to hike up a steep hill to the top of the detector,” Winslow says.

    In 2002, the experiment detected neutrino oscillations for the first time.

    “It was one of those moments in science where you know something that no one else in the world does,” recalls Winslow, who was part of the scientific collaboration that received the Breakthrough Prize in Fundamental Physics in 2016 for the discovery.

    The experience was pivotal in shaping Winslow’s career path. In 2020, she received tenure as associate professor of physics at MIT, where she continues to search for neutrinos, with KamLAND and other particle-detecting experiments that she has had a hand in designing.

    “I like the challenge of measuring things that are very, very hard to measure,” Winslow says. “The motivation comes from trying to discover the smallest building blocks and how they affect the universe we live in.”

    Measuring the impossible

    Winslow grew up in Chadds Ford, Pennsylvania, where she explored the nearby forests and streams, and also learned to ride horses, even riding competitively in high school.

    She set her sights west for college, with the intention of studying astronomy, and was accepted to the University of California at Berkeley(US), where she happily spent the next decade, earning first an undergraduate degree in physics and astronomy, then a master’s and PhD in physics.

    Midway through college, Winslow learned of particle physics and the large experiments to detect elusive particles. A search for an undergraduate research project introduced her to the LBNL Cryogenic Dark Matter Search(US), or CDMS, an experiment that was run beneath the Stanford University(US) campus.

    CDMS at Stanford University

    CDMS was designed to detect weakly interacting massive particles, or WIMPs — hypothetical particles that are thought to comprise dark matter — in detectors wrapped in ultrapure copper. For her first research project, Winslow helped analyze copper samples for the experiment’s next generation.

    “I liked seeing how all these pieces worked together, from sourcing the copper to figuring out how to build an experiment to basically measure the impossible,” Winslow says.

    Her later work with KamLAND, facilitated by her quantum mechanics professor and eventual thesis advisor, further inspired her to design experiments to search for neutrinos and other fundamental particles.

    “Little particles, big questions”

    After completing her PhD, Winslow took a postdoc position with Janet Conrad, professor of physics at MIT. In Conrad’s group, Winslow had freedom to explore ideas beyond the lab’s primary projects. One day, after watching a video about nanocrystals, Conrad wondered whether the atomic-scale materials might be useful in particle detection.

    “I remember her saying, ‘These nanocrystals are really cool. What can we do with them? Go!’ And I went and thought about it,” Winslow says.

    She soon came back with an idea: What if nanocrystals made from interesting isotopes could be dissolved in liquid scintillator to enable more sensitive neutrino detection? Conrad thought it was a good idea and helped Winslow seek out grants to get the project going.

    In 2010, Winslow was awarded the L’Oréal for Women in Science Fellowship and a grant that she put toward the nanocrystal experiment, which she named “NuDot”, for the quantum dots (a type of nanocrystal) that she planned to work into a detector. When she finished her postdoc, she accepted a faculty position at the University of California at Los Angeles(US), where she continued laying plans for NuDot.

    A cold bargain

    Winslow spent two years at UCLA, during a time when the search for neutrinos circled around a new target: neutrinoless double-beta decay, a hypothetical process that, if observed, would prove that the neutrino is its own antiparticle and would help to explain why the universe has more matter than antimatter.
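    The distinction being probed can be written compactly. Ordinary two-neutrino double-beta decay, already observed in a handful of isotopes, emits two electrons and two antineutrinos; the neutrinoless mode is possible only if the neutrino is a Majorana particle, i.e. its own antiparticle:

```latex
% Ordinary two-neutrino double-beta decay:
(A,\,Z) \longrightarrow (A,\,Z+2) + 2e^{-} + 2\bar{\nu}_{e}
% Neutrinoless double-beta decay (requires a Majorana neutrino):
(A,\,Z) \longrightarrow (A,\,Z+2) + 2e^{-}
```

    Because no neutrinos carry energy away in the second mode, the two electrons share the full decay energy, so experiments like CUORE search for a sharp peak at the endpoint of the summed-electron energy spectrum.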

    At MIT, physics professor and department head Peter Fisher was looking to hire someone to explore double-beta decay. He offered the job to Winslow, who negotiated in return.

    “I told him what I wanted was a dilution refrigerator,” Winslow recalls. “The base price for one of these is not small, and it’s asking a lot in particle physics. But he was like, ‘done!’”

    Winslow joined the MIT faculty in 2015, setting her lab up with a new dilution refrigerator that would allow her to cool macroscopic crystals to millikelvin temperatures to look for heat signatures from double-beta decay and other interesting particles. Today she is continuing to work on NuDot and the new generation of KamLAND, and is also a key member of CUORE Experiment LNGS – Gran Sasso National Laboratory(IT), a massive underground experiment in Italy with a much larger dilution refrigerator, designed to observe neutrinoless double-beta decay.

    Winslow has also made her mark on Hollywood. In 2016, while settling in at MIT, a colleague at UCLA recommended her as a consultant to the remake of the film Ghostbusters. The set design department was looking for ideas for how to stage the lab of one of the movie’s characters, a particle physicist. “I had just inherited a lab with a huge amount of junk that needed to be cleared out — gigantic crates filled with old scientific equipment, some of which had started to rust,” Winslow says. “[The producers] came to my lab and said, ‘This is perfect!’ And in the end it was a really fun collaboration.”

    In 2018, her work took a surprising turn when she was approached by theorist Benjamin Safdi, then at MIT. Together with MIT physicist Jesse Thaler and former graduate student Yonatan Kahn PhD ’15, Safdi had devised a thought experiment named ABRACADABRA to detect another hypothetical particle, the axion, by simulating a magnetar, a type of neutron star with intense magnetic fields that should make any interacting axions briefly detectable. Safdi heard of Winslow’s refrigerator and wondered whether she could engineer a detector inside it to test the idea.

    ABRACADABRA experiment at MIT.

    “It was an example of the wonderfulness that is MIT,” recalls Winslow, who jumped at the opportunity to design an entirely new experiment. In its first successful run, the ABRACADABRA detector reported no evidence of axions. The team is now designing larger versions, with greater sensitivity, to add to Winslow’s stable of growing detectors.

    “That’s all part of my group’s vision for the next 25 years: building big experiments that might detect little particles, to answer big questions,” Winslow says.

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    Massachusetts Institute of Technology (MIT) is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the Bates Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad and Whitehead Institutes.

    MIT Haystack Observatory, Westford, Massachusetts, USA, Altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, MIT adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with MIT. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. MIT is a member of the Association of American Universities (AAU).

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but a combination with elements of both professional and liberal education, proposing that:

    The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after MIT was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed as the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    MIT was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, MIT faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, the MIT administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, MIT catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at MIT that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    MIT’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at MIT’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, MIT became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected MIT profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of MIT between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, MIT no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and MIT’s defense research. In this period MIT’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. MIT ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However six MIT students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at MIT over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, MIT’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    MIT has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the OpenCourseWare project has made course materials for over 2,000 MIT classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    MIT was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, MIT launched OpenCourseWare to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, MIT announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the MIT faculty adopted an open-access policy to make its scholarship publicly accessible online.

    MIT has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the MIT community with thousands of police officers from the New England region and Canada. On November 25, 2013, MIT announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of the MIT community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Laser Interferometer Gravitational-Wave Observatory (LIGO) was designed and constructed by a team of scientists from California Institute of Technology, MIT, and industrial contractors, and funded by the National Science Foundation.


    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and MIT physicist Rainer Weiss won the Nobel Prize in physics in 2017. Weiss, who is also an MIT graduate, designed the laser interferometric technique, which served as the essential blueprint for the LIGO.

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.


     
  • richardmitnick 10:27 am on May 12, 2020
    Tags: Neutrinoless double beta decay

    From Lawrence Berkeley National Lab: “CUORE Underground Experiment in Italy Carries on Despite Pandemic” 


    From Lawrence Berkeley National Lab

    May 12, 2020
    Glenn Roberts Jr.
    (510) 520-0843
    geroberts@lbl.gov

    Laura Marini, a postdoctoral researcher at UC Berkeley and a Berkeley Lab affiliate who serves as a run coordinator for the underground CUORE experiment, shares her experiences of working on CUORE and living near Gran Sasso during the COVID-19 pandemic. (Credit: Marilyn Sargent/Berkeley Lab)

    Note: This is the first part in a recurring series highlighting Berkeley Lab’s ongoing work in international physics collaborations during the pandemic.

    As the COVID-19 outbreak took hold in Italy, researchers working on a nuclear physics experiment called CUORE at an underground laboratory in central Italy scrambled to keep the ultrasensitive experiment running and launch new tools and rules for remote operations.

    This Cryogenic Underground Observatory for Rare Events experiment – designed to find a never-before-seen process involving ghostly particles known as neutrinos, to explain why matter won out over antimatter in our universe, and to also hunt for signs of mysterious dark matter – is carrying on with its data-taking uninterrupted while some other projects and experiments around the globe have been put on hold.

    Finding evidence for these rare processes requires long periods of data collection – and a lot of patience. CUORE has been collecting data since May 2017, and after upgrade efforts in 2018 and 2019 the experiment has been running continuously.

    Before the pandemic hit there were already tools in place that stabilized the extreme cooling required for CUORE’s detectors and provided some remote controls and monitoring of CUORE systems, noted Yury Kolomensky, senior faculty scientist at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and the U.S. spokesperson for CUORE.

    The rapid global spread of the disease, and related restrictions on access to the CUORE experiment at Gran Sasso National Laboratory (Laboratori Nazionali del Gran Sasso, or LNGS, operated by the Italian Nuclear Physics Institute, INFN) in central Italy, prompted CUORE leadership and researchers – working on three continents – to act quickly to ramp up the remote controls to prepare for an extended period with only limited access to the experiment.

    The CUORE experiment, at the Italian National Institute for Nuclear Physics’ (INFN) Gran Sasso National Laboratories (LNGS) in the Abruzzo region of central Italy, searches for neutrinoless double beta decay.


    Just days before the new restrictions went into effect at Gran Sasso, CUORE leadership on March 4 made the decision to rapidly deploy a new remote system and to work out the details of how to best maintain the experiment with limited staffing and with researchers monitoring in different time zones. The new system was fully operational about a week later, and researchers at Berkeley Lab played a role in rolling it out.

    “We were already planning to transition to remote shift operations, whereby a scientist at a home institution would monitor the systems in real time, respond to alarms, and call on-site and on-call personnel in case an emergency intervention is needed,” Kolomensky said, adding, “We were commissioning the system at the time of the outbreak.”

    Brad Welliver, a postdoctoral researcher, served as Berkeley Lab’s lead developer for the new remote monitoring system, and Berkeley Lab staff scientist Brian Fujikawa was the overall project lead for the enhanced remote controls, collectively known as CORC, for CUORE Online/Offline Run Check.

    Fujikawa tested controls for starting and stopping the data collection process, and also performed other electronics testing for the experiment from his home in the San Francisco Bay Area.

    He noted that the system is programmed to send email and voice alarms to the designated on-shift CUORE researcher if something is awry with any CUORE system. “This alarm system is particularly important when operating CUORE remotely,” he said, as in some cases on-site workers may need to visit the experiment promptly to perform repairs or other needed work.

    Development of so-called “slow controls,” which allow researchers to monitor and control CUORE equipment such as pumps and sensors, was led by Joe Johnston at the Massachusetts Institute of Technology.

    “Now we can perform most of the operations from 6,000 miles away,” Kolomensky said.

    And many participants across the collaboration continue to play meaningful roles in the experiment from their homes, from analyzing data and writing papers to participating in long-term planning and remote meetings.

    Despite access restrictions at Gran Sasso, experiments are still accessible for necessary work and checkups. The laboratory remains open in a limited way, and its staff still maintains all of its needed services and equipment, from shuttles to computing services.

    Laura Marini, a postdoctoral researcher at UC Berkeley who serves as a run coordinator for CUORE and is now living near Gran Sasso, is among a handful of CUORE researchers who still routinely visit the lab site.

    “As a run coordinator, I need to make sure that the experiment works fine and the data quality is good,” she said. “Before the pandemic spread, I was going underground maybe not every day, but at least a few times a week.” Now, it can be about once every two weeks.

    Sometimes she is there to carry out simple fixes, like a stuck computer that needs to be restarted, she said. Now, in addition to the requisite hard hat and heavy shoes, Marini – like so many others around the globe who are continuing to work – must wear a mask and gloves to guard against the spread of COVID-19.

    The simple act of driving into the lab site can be complicated, too, she said. “The other day, I had to go underground and the police stopped me. So I had to fill in a paper to declare why I was going underground, the fact that it was needed, and that I was not just wandering around by car,” she said. Restrictions in Italy prevent most types of travel.

    Laura Marini now wears a protective mask and gloves, in addition to a hard hat, during her visits to the CUORE experiment site. (Credit: Gran Sasso National Laboratory – INFN)

    CUORE researchers note that they are fortunate the experiment was already in a state of steady data-taking when the pandemic hit. “There is no need for continuous intervention,” Marini said. “We can do most of our checks by remote.”

    She said she is grateful to be part of an international team that has “worked together on a common goal and continues to do so” despite the present-day challenges.

    Kolomensky noted some of the regular maintenance and upgrades planned for CUORE will be put off as a result of the shelter-in-place restrictions, though there also appears to be an odd benefit of the reduced activity at the Gran Sasso site. “We see an overall reduction in the detector noise, which we attribute to a significantly lower level of activity at the underground lab and less traffic in the highway tunnel,” he said. Researchers are working to verify this.

    CUORE already had systems in place to individually and remotely monitor data-taking by each of the experiment’s 988 detectors. Benjamin Schmidt, a Berkeley Lab postdoctoral researcher, had even developed software that automatically flags periods of “noisy” or poor data-taking captured by CUORE’s array of detectors.
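The article gives no details of how this flagging software works; purely as a hypothetical illustration (not CUORE’s actual algorithm), automatic flagging of noisy data-taking periods could be sketched as a rolling-RMS threshold over the detector samples:

```python
import statistics

def flag_noisy_periods(samples, window=10, threshold=3.0):
    """Hypothetical sketch: flag windows whose RMS exceeds `threshold`
    times the median window RMS. Not CUORE's actual algorithm."""
    rms = []
    for i in range(0, len(samples) - window + 1, window):
        chunk = samples[i:i + window]
        rms.append((i, (sum(x * x for x in chunk) / window) ** 0.5))
    baseline = statistics.median(r for _, r in rms)
    return [i for i, r in rms if r > threshold * baseline]

quiet = [0.1, -0.1] * 25                       # 50 quiet samples
noisy = quiet[:20] + [5.0, -5.0] * 5 + quiet[30:]
print(flag_noisy_periods(noisy))               # flags the loud window starting at 20
```

In a real experiment the noise metric, window length, and baseline estimate would all be tuned to the detector response; the point here is only the shape of such a check.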

    Kolomensky noted that work on the CORC remote tools is continuing. “As we have gained more experience and discovered issues, improvements and bug fixes have been implemented, and these efforts are still ongoing,” he said.

    CUORE is supported by the U.S. Department of Energy Office of Science, Italy’s National Institute of Nuclear Physics (Istituto Nazionale di Fisica Nucleare, or INFN), and the National Science Foundation (NSF). CUORE collaboration members include: INFN, University of Bologna, University of Genoa, University of Milano-Bicocca, and Sapienza University in Italy; California Polytechnic State University, San Luis Obispo; Berkeley Lab; Lawrence Livermore National Laboratory; Massachusetts Institute of Technology; University of California, Berkeley; University of California, Los Angeles; University of South Carolina; Virginia Polytechnic Institute and State University; and Yale University in the US; Saclay Nuclear Research Center (CEA) and the Irène Joliot-Curie Laboratory (CNRS/IN2P3, Paris Saclay University) in France; and Fudan University and Shanghai Jiao Tong University in China.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    Bringing Science Solutions to the World
    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (UC) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a UC Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    A U.S. Department of Energy National Laboratory Operated by the University of California.


     
  • richardmitnick 11:55 am on January 14, 2020
    Tags: "A voyage to the heart of the neutrino", Neutrinoless double beta decay, SNOLAB (a Canadian underground physics laboratory at a depth of 2 km in Vale's Creighton nickel mine in Sudbury, Ontario, Canada), Super-Kamiokande experiment (located under Mount Ikeno near the city of Hida, Gifu Prefecture, Japan), The Karlsruhe Tritium Neutrino (KATRIN) experiment, The most abundant particles in the universe besides photons, The three neutrino mass eigenstates, We know now that the three neutrino flavour states we observe in experiments – νe, νμ, and ντ – are mixtures of three neutrino mass states.

    From CERN Courier: “A voyage to the heart of the neutrino” 


    From CERN Courier

    10 January 2020

    The Karlsruhe Tritium Neutrino (KATRIN) experiment has begun its seven-year-long programme to determine the absolute value of the neutrino mass.

    The KATRIN experiment aims to measure the mass of the neutrino using a huge spectrometer (interior shown). Credit: Karlsruhe Institute of Technology, Germany.

    On 11 June 2018, a tense silence filled the large lecture hall of the Karlsruhe Institute of Technology (KIT) in Germany.


    In front of an audience of more than 250 people, 15 red buttons were pressed simultaneously by a panel of senior figures including recent Nobel laureates Takaaki Kajita and Art McDonald. At the same time, operators in the control room of the Karlsruhe Tritium Neutrino (KATRIN) experiment lowered the retardation voltage of the apparatus so that the first beta electrons were able to pass into KATRIN’s giant spectrometer vessel. Great applause erupted when the first beta electrons hit the detector.

    In the long history of measuring the tritium beta-decay spectrum to determine the neutrino mass, the ensuing weeks of KATRIN’s first data-taking opened a new chapter. Everything worked as expected, and KATRIN’s initial measurements have already propelled it into the top ranks of neutrino experiments. The aim of this ultra-high-precision beta-decay spectroscope, more than 15 years in the making, is to determine, by the mid-2020s, the absolute mass of the neutrino.

    Massive discovery

    Since the discovery of the oscillation of atmospheric neutrinos by the Super-Kamiokande experiment in 1998, and of the flavour transitions of solar neutrinos by the SNO experiment shortly afterwards, it has been clear that neutrino masses are not zero, but big enough to cause interference between distinct mass eigenstates as a neutrino wavepacket evolves in time. We know now that the three neutrino flavour states we observe in experiments – νe, νμ and ντ – are mixtures of three neutrino mass states.

    Super-Kamiokande experiment, located under Mount Ikeno near the city of Hida, Gifu Prefecture, Japan.

    SNOLAB, a Canadian underground physics laboratory at a depth of 2 km in Vale’s Creighton nickel mine in Sudbury, Ontario, Canada.

    Though not massless, neutrinos are exceedingly light. Previous experiments designed to directly measure the scale of neutrino masses, in Mainz and Troitsk, produced an upper limit of 2 eV for the neutrino mass – some 250,000 times smaller than the mass of the electron, the next-lightest massive elementary particle. Nevertheless, neutrino masses are extremely important for cosmology as well as for particle physics. Relic neutrinos have a number density of around 336 cm⁻³, making them the most abundant particles in the universe besides photons, and they therefore play a distinct role in the formation of cosmic structure. Comparing data from the Planck satellite and from galaxy surveys (baryonic acoustic oscillations) with simulations of the evolution of structure yields an upper limit on the sum of all three neutrino masses of 0.12 eV at 95% confidence within the framework of the standard Lambda cold dark matter (ΛCDM) cosmological model.
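The 336 cm⁻³ figure follows from the standard relation between the relic-neutrino and CMB-photon number densities; a quick numerical cross-check (textbook values, not taken from the article itself):

```python
# Relic neutrino number density: each flavour (neutrinos + antineutrinos)
# carries (3/11) of the CMB photon number density n_gamma.
n_gamma = 411.0                       # CMB photons per cm^3 at T = 2.725 K
per_flavour = (3.0 / 11.0) * n_gamma  # one flavour, nu + nubar
total = 3.0 * per_flavour             # all three flavours
print(f"total relic neutrino density: {total:.0f} cm^-3")  # ~336 cm^-3
```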

    Lambda Cold Dark Matter accelerated expansion of the universe. Credit: Alex Mittelmann, Coldcreation.

    Considerations of “naturalness” lead most theorists to speculate that the exceedingly tiny neutrino masses do not arise from standard Yukawa couplings to the Higgs boson, as per the other fermions, but are generated by a different mass mechanism. Since neutrinos are electrically neutral, they could be identical to their antiparticles, making them Majorana particles. Via the so-called seesaw mechanism, this interesting scenario would require a new and very high particle mass scale to balance the smallness of the neutrino masses, which would be unreachable with present accelerators.

    Inner space: KATRIN’s main spectrometer, the largest ultra-high-vacuum vessel in the world, contains a dual-layer electrode system comprising 23,000 wires to shield the inner volume from charged particles. Credit: KATRIN

    As neutrino oscillations arise due to interference between mass eigenstates, neutrino-oscillation experiments are only able to determine splittings between the squares of the neutrino mass eigenstates. Three experimental avenues are currently being pursued to determine the neutrino mass. The most stringent upper limit is currently the model-dependent bound set by cosmological data, as already mentioned, which is valid within the ΛCDM model. A second approach is to search for neutrinoless double-beta decay, which allows a statement to be made about the size of the neutrino masses but presupposes the Majorana nature of neutrinos.


    The third approach – the one adopted by KATRIN – is the direct determination of the neutrino mass from the kinematics of a weak process such as beta decay, which is completely model-independent and depends only on the principle of energy and momentum conservation.

    Fig. 1. The beta spectrum of tritium (left), showing in detail the effect of different neutrino masses on the endpoint (right). Credit: CERN

    The direct determination of the neutrino mass relies on the precise measurement of the shape of the beta electron spectrum near the endpoint, which is governed by the available phase space (figure 1). This spectral shape is altered by the neutrino mass value: the smaller the mass, the smaller the spectral modification. One would expect to see three modifications, one for each neutrino mass eigenstate. However, due to the tiny neutrino mass differences, a weighted sum is observed. This “average electron neutrino mass” is formed by the incoherent sum of the squares of the three neutrino mass eigenstates, which contribute to the electron neutrino according to the PMNS neutrino-mixing matrix. The super-heavy hydrogen isotope tritium is ideal for this purpose because it combines a very low endpoint energy, Eo, of 18.6 keV and a short half-life of 12.3 years with a simple nuclear and atomic structure.
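In standard notation (with U the PMNS mixing matrix and mᵢ the mass eigenvalues), the incoherent sum described above defines the observable:

```latex
m_\beta^2 \;=\; \sum_{i=1}^{3} |U_{ei}|^2 \, m_i^2
```

Because the |Uₑᵢ|² weights are positive, no cancellations can occur in this sum, in contrast to the coherent sum probed by neutrinoless double beta decay.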

    KATRIN is born

    Around the turn of the millennium, motivated by the neutrino oscillation results, Ernst Otten of the University of Mainz and Vladimir Lobashev of INR Troitsk proposed a new, much more sensitive experiment to measure the neutrino mass from tritium beta decay. To this end, the best methods from the previous experiments in Mainz, Troitsk and Los Alamos were to be combined and upscaled by up to two orders of magnitude in size and precision. Together with new technologies and ideas, such as laser Raman spectroscopy or active background reduction methods, the apparatus would increase the sensitivity to the observable in beta decay (the square of the electron antineutrino mass) by a factor of 100, resulting in a neutrino-mass sensitivity of 0.2 eV. Accordingly, the entire experiment was designed to the limits of what was feasible and even beyond (see “Technology transfer delivers ultimate precision” box).

    _______________________________________________
    The electron transport and tritium retention system. Credit: KIT

    Many technologies had to be pushed to the limits of what was feasible or even beyond. KATRIN became a CERN-recognised experiment (RE14) in 2007 and the collaboration worked with CERN experts in many areas to achieve this. The KATRIN main spectrometer is the largest ultra-high vacuum vessel in the world, with a residual gas pressure in the range of 10⁻¹¹ mbar – a pressure that is otherwise only found in large volumes inside the LHC ring – equivalent to the pressure recorded at the lunar surface.

    Even though the inner surface was instrumented with a complex dual-layer wire electrode system for background suppression and electric-field shaping, this extreme vacuum was made possible by rigorous material selection and treatment in addition to non-evaporable getter technology developed at CERN. KATRIN’s almost 40 m-long chain of superconducting magnets with two large chicanes was put into operation with the help of former CERN experts, and a 223Ra source was produced at ISOLDE for background studies at KATRIN.


    A series of 83mKr conversion electron sources based on implanted 83Rb for calibration purposes was initially produced at ISOLDE. At present these are produced by KATRIN collaborators and further developed with regard to line stability.

    Conversely, the KATRIN collaboration has returned its knowledge and methods to the community. For example, the ISOLDE high-voltage system was calibrated twice with the ppm-accuracy KATRIN voltage dividers, and the magnetic and electrical field calculation and tracking programme KASSIOPEIA developed by KATRIN was published as open source and has become the standard for low-energy precision experiments. The fast and precise laser Raman spectroscopy developed for KATRIN is also being applied to fusion technology.
    _______________________________________________

    KIT was soon identified as the best place for such an experiment, as it had the necessary experience and infrastructure with the Tritium Laboratory Karlsruhe. The KIT board of directors quickly took up this proposal and a small international working group started to develop the project. At a workshop at Bad Liebenzell in the Black Forest in January 2001, the project received so much international support that KIT, together with nearly all the groups from the previous neutrino-mass experiments, founded the KATRIN collaboration. Currently, the 150-strong KATRIN collaboration comprises 20 institutes from six countries.

    It took almost 16 years from the first design to complete KATRIN, largely because many new technologies had to be developed, such as a novel concept to limit the temperature fluctuations of the huge tritium source to the mK scale at 30 K, or the high-voltage stabilisation and calibration to the 10 mV scale at 18.6 kV. The experiment’s two most important and also most complex components are the gaseous, windowless molecular tritium source (WGTS) and the very large spectrometer. In the WGTS, tritium gas is introduced at the midpoint of the 10 m-long beam tube, where it flows out to both sides to be pumped out again by turbomolecular pumps. After being partially cleaned it is re-injected, yielding a closed tritium cycle. This results in an almost opaque column density with a total decay rate of 10¹¹ per second. The beta electrons are guided adiabatically to a tandem of a pre- and a main spectrometer by superconducting magnets of up to 6 T. Along the way, differential and cryogenic pumping sections including geometric chicanes reduce the tritium flow by more than 14 orders of magnitude to keep the spectrometers free of tritium (figure 2).

    Fig. 2. The 70 m-long KATRIN setup showing the key stages and components. Credit: CERN

    The KATRIN spectrometers operate as so-called MAC-E filters, whereby electrons are guided by two superconducting solenoids at either end and their momenta are collimated by the magnetic field gradient. This “magnetic bottle” effect transforms almost all kinetic energy into longitudinal energy, which is filtered by an electrostatic retardation potential so that only electrons with enough energy to overcome the barrier are able to pass through. The smaller pre-spectrometer blocks the low-energy part of the beta spectrum (which carries no information on the neutrino mass), while the 10 m-diameter main spectrometer provides a much sharper filter width due to its huge size.
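The sharpness of a MAC-E filter is set by the ratio of the analysing-plane field to the maximum field, ΔE = E·B_min/B_max. A numerical sketch: the 6 T maximum field appears in the article, while the ~3×10⁻⁴ T analysing-plane field is the commonly quoted KATRIN design value and is an assumption here:

```python
# MAC-E filter energy resolution: Delta_E / E = B_min / B_max.
E0 = 18600.0    # eV, tritium endpoint energy (from the article)
B_max = 6.0     # T, maximum field in the superconducting solenoids
B_min = 3e-4    # T, analysing-plane field (assumed design value)
delta_E = E0 * B_min / B_max
print(f"filter width near the endpoint: {delta_E:.2f} eV")  # ~0.93 eV
```

This sub-eV filter width is what allows the main spectrometer to resolve the tiny spectral distortion a neutrino mass would imprint near the endpoint.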

    The transmitted electrons are detected by a high-resolution segmented silicon detector. By varying the retarding potential of the main spectrometer, a narrow region of the beta spectrum several tens of eV below the endpoint is scanned, where the imprint of a non-zero neutrino mass is maximal. Since the relative fraction of the tritium beta spectrum in the last 1 eV below the endpoint amounts to just 2 × 10⁻¹³, KATRIN demands a tritium source of the highest intensity. Of equal importance is the high precision needed to understand the measured beta spectrum. Therefore, KATRIN possesses a complex calibration and monitoring system to determine all systematics with the highest precision in situ, e.g. the source strength, the inelastic scattering of beta electrons in the tritium source, the retardation voltage and the work functions of the tritium source and the main spectrometer.
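The quoted 2 × 10⁻¹³ fraction can be checked by a phase-space scaling argument: near the endpoint the spectrum falls as (E₀ − E)², so the fraction of decays in the last ΔE scales as (ΔE/E₀)³. An order-of-magnitude sketch (the exact prefactor depends on the full spectral shape):

```python
# Fraction of tritium decays in the last Delta_E below the endpoint.
# Near the endpoint dN/dE ~ (E0 - E)^2, so the integral over the last
# Delta_E scales as (Delta_E / E0)^3.
E0 = 18600.0      # eV, tritium endpoint energy
delta_E = 1.0     # eV, window below the endpoint
fraction = (delta_E / E0) ** 3
print(f"fraction of decays in last {delta_E:.0f} eV: ~{fraction:.1e}")
```

The scaling gives ~1.6 × 10⁻¹³, consistent with the 2 × 10⁻¹³ quoted above.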

    Start-up and beyond

    After intense periods of commissioning during 2018, the tritium source activity was increased from its initial value of 0.5 GBq (which was used for the inauguration measurements) to 25 GBq (approximately 22% of nominal activity) in spring 2019. By April, the first KATRIN science run had begun and everything went like clockwork. The decisive source parameters – temperature, inlet pressure and tritium content – allowed excellent data to be taken, and the collaboration worked in several independent teams to analyse these data. The critical systematic uncertainties were determined both by Monte Carlo propagation and with the covariance-matrix method, and the analyses were also blinded so as not to generate bias. The excitement during the un-blinding process was huge within the KATRIN collaboration, which gathered for this special event, and relief spread when the result became known. The neutrino-mass square turned out to be compatible with zero within its uncertainty budget. The model fits the data very well (figure 3) and the fitted endpoint turned out to be compatible with the mass difference between ³He and tritium measured in Penning traps. The new results were presented at the international TAUP 2019 conference in Toyama, Japan, and have recently been published.

    Fig. 3. The beta-electron spectrum in the vicinity of its endpoint, with error bars enlarged 50 times, together with the best-fit model (top) and the fit residuals (bottom). Credit: CERN

    This first result shows that all aspects of the KATRIN experiment, from hardware to data acquisition to analysis, work as expected. The statistical uncertainty of the first KATRIN result is already smaller by a factor of two than that of previous experiments, and systematic uncertainties have gone down by a factor of six. A neutrino mass was not yet extracted from these first four weeks of data, but an upper limit on the neutrino mass of 1.1 eV (90% confidence) can be drawn, catapulting KATRIN directly to the top of the world's direct neutrino-mass experiments. In the mass region around 1 eV, the limit corresponds to the quasi-degenerate neutrino-mass range, where the mass splittings implied by neutrino-oscillation experiments are negligible compared to the absolute masses.

    The neutrino-mass result from KATRIN is complementary to results obtained from searches for neutrinoless double beta decay, which are sensitive to the “coherent sum” mββ of all neutrino mass eigenstates contributing to the electron neutrino. Apart from additional phases that can lead to possible cancellations in this sum, the nuclear matrix elements that must be calculated to connect the mass mββ with the observable (the half-life) still carry uncertainties of a factor of two. The result of a direct neutrino-mass determination is therefore more closely connected to results from cosmological data, which give (model-dependent) access to the neutrino-mass sum.
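    For reference, the "coherent sum" and the standard relation between mββ and the half-life observable mentioned here are conventionally written as follows (standard notation: U is the PMNS mixing matrix, whose Majorana phases can cause the cancellations noted above, m_i are the mass eigenvalues, G⁰ᵛ is a phase-space factor, and M⁰ᵛ is the nuclear matrix element carrying the factor-two uncertainty):

```latex
m_{\beta\beta} = \Bigl|\sum_{i=1}^{3} U_{ei}^{2}\, m_{i}\Bigr|,
\qquad
\bigl(T_{1/2}^{0\nu}\bigr)^{-1} = G^{0\nu}\,\bigl|M^{0\nu}\bigr|^{2}\,
\frac{m_{\beta\beta}^{2}}{m_{e}^{2}}
```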

    A sizeable influence

    Currently, KATRIN is taking more data and has already increased the source activity by a factor of four, to close to its design value. The background rate is still a challenge. Various measures, such as baking out the spectrometer and installing liquid-nitrogen-cooled baffles in front of the getter pumps, have already reduced the background by a factor of 10, and more will be implemented in the next few years. For the final KATRIN sensitivity of 0.2 eV (90% confidence) on the absolute neutrino-mass scale, a total of 1000 days of data are required. With this sensitivity, KATRIN will either find the neutrino mass or set a stringent upper limit. The former would confront standard cosmology, while the latter would exclude quasi-degenerate neutrino masses and a sizeable influence of neutrinos on the formation of structure in the universe. This will be augmented by searches for physics beyond the Standard Model, such as for sterile-neutrino admixtures with masses from the eV to the keV scale.

    Standard Model of Particle Physics

    Neutrino-oscillation results imply that the effective electron-neutrino mass probed by direct neutrino-mass experiments is at least about 10 meV (50 meV) for the normal (inverted) mass ordering. Therefore, many plans exist to cover this region in the future. At KATRIN, there is a strong R&D programme to upgrade the MAC-E filter principle from the current integral read-out to a differential one, which would allow a factor-of-two improvement in sensitivity on the neutrino mass. New approaches to determine the absolute neutrino-mass scale are also being developed: Project 8, a radio-spectroscopy method to eventually be applied to an atomic tritium source; and the electron-capture experiments ECHo and HOLMES, which intend to deploy large arrays of cryogenic bolometers with the implanted isotope 163Ho. In parallel, the next generation of neutrinoless double beta decay experiments, such as LEGEND, CUPID and nEXO (as well as future xenon-based dark-matter experiments), aims to cover the full range of the inverted neutrino-mass ordering. Finally, refined cosmological data should allow us to probe the same mass region (and beyond) within the next decades, while long-baseline neutrino-oscillation experiments, such as JUNO, DUNE and Hyper-Kamiokande, will probe the neutrino-mass ordering implemented in nature. As a result of this broad programme for the 2020s, the elusive neutrino should finally yield some of its secrets and inner properties beyond mixing.

    See the full article here .


    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 3:31 pm on September 5, 2019 Permalink | Reply
    Tags: , , , , Neutrinoless double beta decay, , ,   

    From Technische Universität München: “Closing in on elusive particles” 

    Technische Universität München

    From Technische Universität München

    Working on the germanium detector array in the clean room of Gran Sasso underground laboratory.
    Image: J. Suvorov / GERDA

    05.09.2019
    Prof. Dr. Stefan Schönert
    Technical University of Munich
    Experimental Astroparticle Physics (E15)
    Tel.: +49 89 289 12511
    E-Mail: schoenert@ph.tum.de

    Major steps forward in understanding neutrino properties.

    In the quest to prove that matter can be produced without antimatter, the GERDA experiment at the Gran Sasso Underground Laboratory is looking for signs of neutrinoless double beta decay. The experiment has the world's highest sensitivity for detecting the decay in question. To further improve the chances of success, a follow-up project, LEGEND, will use an even more refined experimental setup.

    MPG GERmanium Detector Array (GERDA) at Gran Sasso, Italy

    LEGEND Collaboration

    LEGEND experiment at Gran Sasso looking for signs of neutrinoless double beta decay

    Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy

    While the Standard Model of Particle Physics has remained mostly unchanged since its initial conception, experimental observations for neutrinos have forced the neutrino part of the theory to be reconsidered in its entirety.

    Standard Model of Particle Physics

    Neutrino oscillation was the first observation inconsistent with these predictions: it proves that neutrinos have non-zero masses, a property that contradicts the Standard Model. In 2015, the discovery was recognized with the Nobel Prize in Physics.

    Are neutrinos their own antiparticles?

    Additionally, there is the longstanding conjecture that neutrinos are so-called Majorana particles: Unlike all other constituents of matter, neutrinos might be their own antiparticles. This would also help explain why there is so much more matter than antimatter in the Universe.

    The GERDA experiment is designed to scrutinize the Majorana hypothesis by searching for the neutrinoless double beta decay of the germanium isotope 76Ge: Two neutrons inside a 76Ge nucleus simultaneously transform into two protons with the emission of two electrons. This decay is forbidden in the Standard Model because the two antineutrinos – the balancing antimatter – are missing.

    The Technical University of Munich (TUM) has been a key partner of the GERDA project (GERmanium Detector Array) for many years. Prof. Stefan Schönert, who heads the TUM research group, is the speaker of the new LEGEND project.

    The GERDA experiment achieves extreme levels of sensitivity

    GERDA is the first experiment to reach exceptionally low levels of background noise and has now pushed the half-life sensitivity for the decay beyond 10²⁶ years. In other words: GERDA shows that the process has a half-life of at least 10²⁶ years, or about 10,000,000,000,000,000 times the age of the Universe.
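    The comparison in that last sentence is easy to verify with a quick division, taking the age of the Universe as roughly 1.38 × 10¹⁰ years:

```python
# Sanity check: a half-life of 1e26 years expressed in units of
# the age of the Universe (~1.38e10 years).
half_life_years = 1e26
age_universe_years = 1.38e10

ratio = half_life_years / age_universe_years
print(f"~{ratio:.0e} times the age of the Universe")
# roughly 7e15, i.e. about 10,000,000,000,000,000 times
```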

    Physicists know that neutrinos are at least 100,000 times lighter than electrons, the next-lightest matter particles. What mass they have exactly, however, is still unknown and is another important research topic.

    In the standard interpretation, the half-life of the neutrinoless double beta decay is related to a special variant of the neutrino mass called the Majorana mass. Based on the new GERDA limit and those from other experiments, this mass must be at least a million times smaller than that of an electron, or in the terms of physicists, less than 0.07 to 0.16 eV/c² [1] (published in Science).
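    As a rough illustration of how a half-life limit of order 10²⁶ years translates into a mass bound of this size: in the standard interpretation the decay rate scales as 1/T = G·gA⁴·|M|²·(mββ/mₑ)², so a lower limit on T gives an upper bound on mββ. The phase-space factor and matrix-element span below are assumed ballpark values for germanium-76, not the collaboration's actual analysis inputs; the point is only that the scaling lands in the same ~0.1 eV regime as the quoted range, and that the spread in |M| is exactly why a range is quoted.

```python
# Rough illustration with ASSUMED ballpark inputs for 76Ge (not GERDA's
# actual analysis values): in the standard interpretation
#   1/T_half = G * g_A^4 * |M|^2 * (m_bb / m_e)^2,
# so a half-life lower limit bounds the Majorana mass m_bb from above.
import math

m_e = 0.511e6      # electron mass in eV
T_half = 1e26      # half-life lower limit in years
G = 2.4e-15        # assumed phase-space factor, 1/year
g_A = 1.27         # free-nucleon axial coupling

for M in (2.8, 6.1):   # assumed span of nuclear matrix elements
    m_bb = m_e / (g_A**2 * M * math.sqrt(G * T_half))
    print(f"|M| = {M}: m_bb < ~{m_bb:.2f} eV")
# roughly 0.2 eV down to 0.1 eV, the same regime as the quoted range
```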

    Consistent with other experiments

    Other experiments also constrain the neutrino mass. The Planck mission provides a limit on yet another variant: the sum of the masses of all known neutrino types is less than 0.12 to 0.66 eV/c².

    The tritium decay experiment KATRIN at the Karlsruhe Institute of Technology (KIT) is set up to measure the neutrino mass with a sensitivity of about 0.2 eV/c² in the coming years. These masses are not directly comparable, but they provide a cross-check on the paradigm that neutrinos are Majorana particles. So far, no discrepancy has been observed.

    From GERDA to LEGEND

    During the reported data collection period, GERDA operated detectors with a total mass of 35.6 kg of 76Ge. A newly formed international collaboration, LEGEND, will now increase this mass to 200 kg of 76Ge by 2021 and further reduce the background noise. The aim is to achieve a sensitivity of 10²⁷ years within the next five years.

    More information:

    GERDA is an international European collaboration of more than 100 physicists from Belgium, Germany, Italy, Russia, Poland and Switzerland. In Germany, GERDA is supported by the Technical Universities of Munich and Dresden, the University of Tübingen and the Max Planck Institutes for Physics and for Nuclear Physics. German funding is provided by the German Federal Ministry of Education and Research (BMBF), the German Research Foundation (DFG) via the Excellence Cluster Universe and SFB1258, as well as the Max Planck Society.

    Prof. Schönert received an ERC Advanced Grant for preparatory work on the LEGEND project in 2018. A few days ago, Prof. Susanne Mertens received an ERC grant for her work on the KATRIN experiment. In the context of that experiment, she will search for so-called sterile neutrinos.

    KATRIN Experiment schematic


    KIT Katrin experiment

    [1] In particle physics, masses are specified not in kilograms but, in accordance with Einstein’s equation E = mc², in electron volts [eV] divided by the speed of light squared. Electron volts are a measure of energy. This convention is used to circumvent unfathomably small units of mass: 1 eV/c² corresponds to 1.8 × 10⁻³⁶ kilograms.
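    The footnote's conversion factor follows directly from E = mc² and the definition of the electron volt:

```python
# Check of the footnote's conversion: 1 eV/c^2 in kilograms,
# from m = E / c^2 with 1 eV = 1.602...e-19 J.
e_charge = 1.602176634e-19   # J per eV (exact in the 2019 SI)
c = 2.99792458e8             # speed of light in m/s (exact)

kg_per_eV = e_charge / c**2
print(f"1 eV/c^2 = {kg_per_eV:.2e} kg")  # ~1.78e-36 kg, i.e. the 1.8e-36 quoted
```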

    See the full article here .


    Technische Universität München Campus

    Technische Universität München (TUM) is one of Europe’s top universities. It is committed to excellence in research and teaching, interdisciplinary education and the active promotion of promising young scientists. The university also forges strong links with companies and scientific institutions across the world. TUM was one of the first universities in Germany to be named a University of Excellence. Moreover, TUM regularly ranks among the best European universities in international rankings.

     
  • richardmitnick 12:19 pm on April 9, 2019 Permalink | Reply
    Tags: All the miners get very dirty but all the SNOLAB people are clean so the difference between them is stark., , Neutrinoless double beta decay, , Paul Dirac won the Nobel Prize in 1933 after calculating that every particle in the universe must have a corresponding antiparticle., , SNO-Sudbury Neutrino Observatory, , SNOLAB researchers share the elevator with miners on their way to work in the Vale's Creighton nickel mine., The question of what happened to all the antimatter has remained unanswered.,   

    From University of Pennsylvania: “Answering big questions by studying small particles” 

    U Penn bloc

    From University of Pennsylvania

    April 8, 2019

    Erica K. Brockmeier-Writer
    Eric Sucar- Photographer

    A view inside the SNO detector, a 40-foot acrylic sphere covered with thousands of photodetectors. The facility is located in SNOLAB, a research facility 2 km underground in Vale’s Creighton nickel mine near Sudbury, Canada (Photo credit: SNO+ Collaboration).

    Neutrinos are extremely lightweight subatomic particles produced during nuclear reactions, both here on Earth and in the centers of stars. But neutrinos aren’t harmful or radioactive: In fact, nearly 100 trillion neutrinos pass through our bodies every second without our noticing.

    Joshua Klein is an experimental particle physicist who studies neutrinos and dark matter. His group, along with retired professor Eugene Beier, collaborates with the Sudbury Neutrino Observatory (SNO), an international research endeavor focused on the study of neutrinos. Klein and Beier’s groups previously designed and now maintain the electronics at SNOLAB that collect data on these subatomic particles.

    Klein is fascinated by neutrinos and how they could help answer fundamental questions about the nature of the universe. “They may explain why the universe is made up of matter and not equal parts matter and anti-matter, they may be responsible for how stars explode, they may even tell us something about the laws of physics at the highest energy scales,” says Klein.

    Previous research on neutrinos has already led to groundbreaking discoveries in particle physics. The SNO collaboration was awarded the 2016 Breakthrough Prize in Fundamental Physics for solving the “solar neutrino problem.” The problem was that the number of neutrinos being produced by the sun was only a third of what was predicted by theoretical physicists, a discrepancy that had puzzled researchers since the 1970s.

    To solve this, researchers went about 1.2 miles underground to study neutrinos, in order to avoid the cosmic radioactive particles that could interfere with their minute and precise measurements. The SNOLAB facility in Sudbury, Canada, which houses a 40-foot-wide acrylic vessel surrounded by photodetectors, allowed physicists to measure the three different types of neutrinos at the same time. Physicists found that neutrinos were able to change from one type into another.

    The exterior of the SNO Detector as seen from the ground at SNOLAB (Photo credit: SNOLAB).

    Today, 15 years later, researchers are looking for an incredibly rare process involving neutrinos that, if found, could revolutionize the field of fundamental physics. “Now that we know that neutrinos can change form, along with the fact that neutrinos have mass but no charge, we can hypothesize that they can be their own antiparticle. If this is true, it could explain why the universe is made of only matter,” says Klein.

    The question of what happened to all the antimatter has remained unanswered since Paul Dirac won the Nobel Prize in 1933 after calculating that every particle in the universe must have a corresponding antiparticle. But the majority of the universe is made of ordinary matter, not equal parts matter and anti-matter, and scientists are trying to figure out why.

    The photodetectors at SNOLAB are now being upgraded as part of SNO+ [Physical Review D] in order to search for a rare type of radioactive decay known as neutrinoless double beta decay, a never-before-seen process that would prove that neutrinos and antineutrinos are actually the same particle. A neutrinoless double-beta decay event, if it occurs at all, is so rare and would give off such a small signal that the only way to detect it is through the combination of powerful equipment, refined analyses, and a lot of patience.

    Instead of sitting around waiting for a rare event to happen, researchers are actively taking advantage of this state-of-the-art underground facility. “One of the selling points of SNO+ is that it’s a multipurpose detector,” says graduate student Eric Marzec. “A lot of detectors are produced with a singular goal, like detecting dark matter, but SNO+ has a lot of other interesting physics that it can probe.”

    Here at Penn, students from the Klein lab conduct key maintenance and repairs on the electronic components that are instrumental to the success of SNO+. They also conduct research on new materials that can help increase the sensitivity of the detector, providing more chances of seeing a rare neutrinoless double-beta decay event.

    Marzec and Klein were part of a recent study using SNO+’s upgraded capabilities to collect new data on solar neutrinos [Physical Review D]. Before the detector vessel was filled with scintillator, a soap-like liquid that will help detect rare radioactive decays, it was briefly filled with water. This enabled researchers to collect data on which direction the neutrinos came from, which in turn allowed them to focus their efforts on studying neutrinos that came from the Sun.

    The solar neutrino problem may be solved, but new data on solar neutrinos is still incredibly useful, especially since data from SNO+ have very low background signals from things like cosmic radiation. “There’s only a few experiments that have ever been able to measure neutrinos coming from the sun,” says Marzec. “People might someday want to look at whether the neutrino production of the sun varies over time, so it’s useful to have as many time points and as many measurements over the years as possible.”

    Marzec has spent a considerable amount of time working at the SNOLAB facility in northern Ontario. He describes a typical day as starting with a 6 a.m. elevator ride that travels more than a mile underground. SNOLAB researchers share the elevator with miners on their way to work in Vale’s Creighton nickel mine. “All the miners get very dirty, but all the SNOLAB people are clean, so the difference between them is stark. It’s very obvious who is the nerd underground and who the miners are,” says Marzec.

    After descending to the 6,800-foot level, researchers walk more than half a mile through a series of tunnels to reach the entrance of SNOLAB (Photo credit: SNOLAB).

    After arriving at the 6,800-foot level, researchers walk more than half a mile from the cage shaft to SNOLAB through underground dirt tunnels. When they reach the lab, they have to shower and change into threadless uniforms to prevent any microscopic threads from getting inside the sensitive detector. After air-quality checks are completed, the researchers are free to begin their work on the detector.

    When asked what it’s like to work more than a mile underground, Marzec comments that he got used to the strangeness after a few visits. “The first time, it feels very much like you’re underground because the pressure is very noticeable, and you feel exhausted at the end of the day.” Thankfully, Marzec and his colleagues don’t have to travel a mile underground every time they want to collect data from SNO+ since they can remotely collect and analyze the hundreds of terabytes of data generated by the detector.

    To do any repair work or cleaning inside the detector, researchers must be lowered into the 40 foot tall sphere using a harness (Photo credit: SNOLAB).

    As Marzec is in the final stages of preparing his Ph.D. thesis, he says he will miss his time working on SNO+. “It’s kind of monastic,” Marzec says about his time working at SNOLAB. “You go there and meditate on physics while you’re there. But it’s also kind of a social thing as well: There are a lot of people you know who are working on the same stuff.”

    Klein and his group, including four graduate students and two post-docs, recently returned from a SNOLAB collaboration meeting, where upwards of 100 physicists met to present and discuss recent results and the upcoming plans for the next phase of the project. Klein is excited, and, admittedly, a little bit nervous, to see how everything comes together. “Putting in the liquid scintillator will change everything—there’s never been a detector being converted from a water-based detector to a scintillator detector. Here at Penn, for us, it’s big because we designed upgrades to the electronics to handle the fact that we will be getting data at a rate that’s about 100 times higher,” says Klein.

    A scientist works inside the SNO+ detector while it is partially filled with deuterated water. Each one of the gold-colored circles is an individual photodetector (Photo credit: SNOLAB).

    Despite the numerous technical and logistical challenges ahead, researchers are enthusiastic about the potential that SNO+ can bring to particle physics research. Other areas of study include learning how neutrinos change form, studying low-energy neutrinos to figure out why the Sun seems to have less “heavy” elements than astronomers expect, and measuring geoneutrinos to figure out why Earth is hotter than other nearby planets like Mars.

    But for Klein, the prospect of finding a rare neutrinoless double beta decay event remains the most thrilling aspect of this research, which, if discovered, could turn the Standard Model of particle physics on its head. “After the question of what is dark energy and what is dark matter, the question of whether neutrinos are their own antiparticle is the most important question for particle physics to answer,” Klein says. “And if neutrinos are their own antiparticle, the simplest piece you can put into the equation [within the Standard Model] blows up: It doesn’t work, it’s mathematically inconsistent. And we don’t know how we would fix that. It is a completely experimental question, so that’s why we’re excited.”

    See the full article here .


    U Penn campus

    Academic life at Penn is unparalleled, with 100 countries and every U.S. state represented in one of the Ivy League’s most diverse student bodies. Consistently ranked among the top 10 universities in the country, Penn enrolls 10,000 undergraduate students and welcomes an additional 10,000 students to our world-renowned graduate and professional schools.

    Penn’s award-winning educators and scholars encourage students to pursue inquiry and discovery, follow their passions, and address the world’s most challenging problems through an interdisciplinary approach.

     
  • richardmitnick 9:25 am on March 20, 2019 Permalink | Reply
    Tags: "Solving a 50-year-old beta decay puzzle with advanced nuclear model simulations", , , , Neutrinoless double beta decay, , Synthesis of heavy elements, Technische Universität Darmstadt, The electroweak force, , When protons inside atomic nuclei convert into neutrons or vice versa   

    From Lawrence Livermore National Laboratory and ORNL: “Solving a 50-year-old beta decay puzzle with advanced nuclear model simulations” 


    Oak Ridge National Laboratory

    From Lawrence Livermore National Laboratory

    March 19, 2019

    Anne M Stark
    stark8@llnl.gov
    925-422-9799

    First-principles calculations show that strong correlations and interactions between two nucleons slow down beta decays in atomic nuclei compared to what’s expected from the beta decays of free neutrons. This impacts the synthesis of heavy elements and the search for neutrinoless double beta decay. Image by Andy Sproles/Oak Ridge National Laboratory.

    For the first time, an international team including scientists at Lawrence Livermore National Laboratory (LLNL) has found the answer to a 50-year-old puzzle that explains why beta decays of atomic nuclei are slower than expected.

    The findings fill a long-standing gap in physicists’ understanding of beta decay (converting a neutron into a proton and vice versa), a key process stars use to create heavier elements. The research appeared in the March 11 edition of the journal Nature Physics.

    Using advanced nuclear model simulations, the team, including LLNL nuclear physicists Kyle Wendt (a Lawrence fellow) and Sofia Quaglioni and two-time summer intern Peter Gysbers (UBC/TRIUMF), found their results to be consistent with experimental data showing that beta decays of atomic nuclei are slower than what is expected based on the beta decays of free neutrons.

    “For decades, physicists couldn’t quite explain nuclear beta decay, when protons inside atomic nuclei convert into neutrons or vice versa, forming the nuclei of other elements,” Wendt said. “Combining modern theoretical tools with advanced computation, we demonstrate it is possible to reconcile, for a considerable number of nuclei, this long-standing discrepancy between experimental measurements and theoretical calculations.”

    Historically, calculated beta decay rates have been much faster than those seen experimentally. Nuclear physicists have worked around this discrepancy by artificially scaling the interaction of single nucleons with the electroweak force, a process referred to as “quenching.” This allowed physicists to describe beta decay rates, but not predict them. While nuclei near each other in mass would have similar quenching factors, the factors could differ dramatically for nuclei well separated in mass.
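    The effect of such a quenching factor is simple to sketch. Gamow-Teller beta-decay rates scale as the square of the axial coupling times the matrix element, so scaling the free-nucleon coupling gA by a factor q suppresses the predicted rate by q². The q = 0.75 below is an illustrative, typical shell-model value, not a number from this paper:

```python
# Illustration of "quenching" as described above (illustrative numbers,
# not from the paper): Gamow-Teller rates scale as (g_A * |M_GT|)^2, so
# replacing g_A by q * g_A suppresses the predicted rate by q^2.
g_A_free = 1.27                       # free-nucleon axial coupling

for q in (1.0, 0.75):                 # q = 1: bare; q = 0.75: typical quenching
    rate_scale = (q * g_A_free) ** 2  # relative decay rate, arbitrary units
    print(f"q = {q}: relative rate {rate_scale:.3f}")
# q = 0.75 suppresses the rate to ~56% of the bare-coupling prediction
```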

    Predictive calculations of beta decay require not just accurate calculations of the structure of both the mother and daughter nuclei, but also of how nucleons (both individually and as correlated pairs) couple to the electroweak force that drives beta decay. These pairwise interactions of nucleons with the weak force represented an extreme computational hurdle due to the strong nuclear correlations in nuclei.

    The team simulated beta decays from light to heavy nuclei, up to tin-100 decaying into indium-100, demonstrating their approach works consistently across the nuclei where ab initio calculations are possible. This sets the path toward accurate predictions of beta decay rates for unstable nuclei in violent astrophysical environments, such as supernova explosions or neutron star mergers that are responsible for producing most elements heavier than iron.

    “The methodology in this work also may hold the key to accurate predictions of the elusive neutrinoless double-beta decay, a process that if seen would revolutionize our understanding of particle physics,” Quaglioni said.

    Other participating institutions include Oak Ridge National Laboratory, TRIUMF and the Technische Universität Darmstadt in Germany.


    Technische Universität Darmstadt campus

    Technische Universität Darmstadt

    The work was funded by the Laboratory Directed Research and Development Program.

    See the full article here .



    LLNL Campus

    Operated by Lawrence Livermore National Security, LLC, for the National Nuclear Security Administration
    Lawrence Livermore National Laboratory (LLNL) is an American federal research facility in Livermore, California, United States, founded by the University of California, Berkeley in 1952. A Federally Funded Research and Development Center (FFRDC), it is primarily funded by the U.S. Department of Energy (DOE) and managed and operated by Lawrence Livermore National Security, LLC (LLNS), a partnership of the University of California, Bechtel, BWX Technologies, AECOM, and Battelle Memorial Institute in affiliation with the Texas A&M University System. In 2012, the laboratory had the synthetic chemical element livermorium named after it.

    LLNL is self-described as “a premier research and development institution for science and technology applied to national security.” Its principal responsibility is ensuring the safety, security and reliability of the nation’s nuclear weapons through the application of advanced science, engineering and technology. The Laboratory also applies its special expertise and multidisciplinary capabilities to preventing the proliferation and use of weapons of mass destruction, bolstering homeland security and solving other nationally important problems, including energy and environmental security, basic science and economic competitiveness.

    The Laboratory is located on a one-square-mile (2.6 km2) site at the eastern edge of Livermore. It also operates a 7,000 acres (28 km2) remote experimental test site, called Site 300, situated about 15 miles (24 km) southeast of the main lab site. LLNL has an annual budget of about $1.5 billion and a staff of roughly 5,800 employees.

    LLNL was established in 1952 as the University of California Radiation Laboratory at Livermore, an offshoot of the existing UC Radiation Laboratory at Berkeley. It was intended to spur innovation and provide competition to the nuclear weapon design laboratory at Los Alamos in New Mexico, home of the Manhattan Project that developed the first atomic weapons. Edward Teller and Ernest Lawrence,[2] director of the Radiation Laboratory at Berkeley, are regarded as the co-founders of the Livermore facility.

    The new laboratory was sited at a former naval air station of World War II. It was already home to several UC Radiation Laboratory projects that were too large for its location in the Berkeley Hills above the UC campus, including one of the first experiments in the magnetic approach to confined thermonuclear reactions (i.e. fusion). About half an hour southeast of Berkeley, the Livermore site provided much greater security for classified projects than an urban university campus.

    Lawrence tapped 32-year-old Herbert York, a former graduate student of his, to run Livermore. Under York, the Lab had four main programs: Project Sherwood (the magnetic-fusion program), Project Whitney (the weapons-design program), diagnostic weapon experiments (both for the Los Alamos and Livermore laboratories), and a basic physics program. York and the new lab embraced the Lawrence “big science” approach, tackling challenging projects with physicists, chemists, engineers, and computational scientists working together in multidisciplinary teams. Lawrence died in August 1958 and shortly after, the university’s board of regents named both laboratories for him, as the Lawrence Radiation Laboratory.

    Historically, the Berkeley and Livermore laboratories have had very close relationships on research projects, business operations, and staff. The Livermore Lab was established initially as a branch of the Berkeley laboratory. The Livermore lab was not officially severed administratively from the Berkeley lab until 1971. To this day, in official planning documents and records, Lawrence Berkeley National Laboratory is designated as Site 100, Lawrence Livermore National Lab as Site 200, and LLNL’s remote test location as Site 300.[3]

    The laboratory was renamed Lawrence Livermore Laboratory (LLL) in 1971. On October 1, 2007 LLNS assumed management of LLNL from the University of California, which had exclusively managed and operated the Laboratory since its inception 55 years before. The laboratory was honored in 2012 by having the synthetic chemical element livermorium named after it. The LLNS takeover of the laboratory has been controversial. In May 2013, an Alameda County jury awarded over $2.7 million to five former laboratory employees who were among 430 employees LLNS laid off during 2008.[4] The jury found that LLNS breached a contractual obligation to terminate the employees only for “reasonable cause.”[5] The five plaintiffs also have pending age discrimination claims against LLNS, which will be heard by a different jury in a separate trial.[6] There are 125 co-plaintiffs awaiting trial on similar claims against LLNS.[7] The May 2008 layoff was the first layoff at the laboratory in nearly 40 years.[6]

    On March 14, 2011, the City of Livermore officially expanded the city’s boundaries to annex LLNL and move it within the city limits. The unanimous vote by the Livermore city council expanded Livermore’s southeastern boundaries to cover 15 land parcels covering 1,057 acres (4.28 km2) that comprise the LLNL site. The site was formerly an unincorporated area of Alameda County. The LLNL campus continues to be owned by the federal government.

    LLNL/NIF


    DOE Seal
    NNSA

     