Tagged: Physics

  • richardmitnick 2:03 pm on October 7, 2022
    Tags: "Stabilizing polarons opens up new physics", A new approach for solving a major shortcoming of a well-established theory that physicists use to study the interactions of electrons in materials: “DFT” - density functional theory., , “DFT” is used in physics; chemistry; and materials science to study the electronic structure of many-body systems like atoms and molecules., , DFT is susceptible to spurious interactions of the electron with its own self – what physicists refer to as the “self-interaction problem”., One of the many peculiarities of quantum mechanics is that particles can also be described as waves., Physics, , Technically a polaron is a quasi-particle made up of an electron “dressed” by its self-induced phonons which represent the quantized vibrations of the crystal., The new work introduces a theoretical formulation for electron self-interaction that solves the problem of polaron localization in density functional theory.,   

    From The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH): “Stabilizing polarons opens up new physics” 


    10.7.22
    Papageorgiou

    Physicists at EPFL have developed a formulation to solve the longstanding problem of electron self-interaction when studying polarons – quasiparticles produced by electron-phonon interactions in materials. The work can lead to unprecedented calculations of polarons in large systems, systematic studies of large sets of materials, and molecular dynamics evolving over long time periods.

    One of the many peculiarities of quantum mechanics is that particles can also be described as waves. A common example is the photon, the particle associated with light.

    In ordered structures, known as crystals, electrons can be seen and described as waves that spread across the entire system – a rather harmonious picture. The ions – atoms carrying a negative or positive charge – are periodically arranged in space, and the electrons move through this periodic lattice.

    Now, if we were to add an extra electron to the crystal, its negative charge could make the ions around it move away from their equilibrium positions. The electron charge would localize in space and couple to the surrounding structural – “lattice” – distortions of the crystal, giving rise to a new particle known as a polaron.

    “Technically, a polaron is a quasi-particle, made up of an electron “dressed” by its self-induced phonons, which represent the quantized vibrations of the crystal,” says Stefano Falletta at EPFL’s School of Basic Sciences. He continues: “The stability of polarons arises from a competition between two energy contributions: the gain due to charge localization, and the cost due to lattice distortions. When the polaron destabilizes, the extra electron delocalizes over the entire system, while the ions restore their equilibrium positions.”
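    The competition Falletta describes can be illustrated with a textbook-style scaling argument (my toy sketch, not the EPFL formulation): confining the electron to a region of size L costs kinetic energy of order t/L², while coupling to the self-induced lattice distortion gains energy of order c/L. The polaron is stable when the minimum of the total energy is negative.

```python
import numpy as np

# Toy continuum model of polaron stability (illustrative only).
# E(L) = t/L^2  kinetic cost of confining the electron to a region of size L
#      - c/L    energy gained from coupling to the self-induced lattice distortion
# t and c are hypothetical parameters in arbitrary units.
t, c = 1.0, 1.5
L = np.linspace(0.5, 20, 2000)
E = t / L**2 - c / L

i = E.argmin()
print(f"optimal polaron size L* = {L[i]:.2f}, formation energy = {E[i]:.4f}")
# Analytically the minimum sits at L* = 2t/c with E* = -c^2/(4t):
# a negative formation energy means the localized polaron is stable;
# if the coupling c were too weak, E would be minimized by L -> infinity,
# i.e. the electron delocalizes and the ions relax back to equilibrium.
```

    Because the gain scales as 1/L but the cost as 1/L², a strong enough coupling always wins at some finite size, which is the qualitative content of the stability competition described above.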

    A polaron forming in magnesium oxide. Credit: S. Falletta (EPFL)

    Together with Professor Alfredo Pasquarello at EPFL, Falletta has published two papers in Physical Review Letters [below] and Physical Review B [below] describing a new approach for solving a major shortcoming of a well-established theory that physicists use to study the interactions of electrons in materials. The method is called density functional theory, or DFT, and is used in physics, chemistry, and materials science to study the electronic structure of many-body systems like atoms and molecules.

    DFT is a powerful tool for performing ab-initio calculations of materials, by simplified treatment of the electron interactions. However, DFT is susceptible to spurious interactions of the electron with its own self – what physicists refer to as the “self-interaction problem”. This self-interaction is one of the greatest limitations of DFT, often leading to an incorrect description of polarons, which can be spuriously destabilized.
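    The self-interaction problem can be made concrete with a small numerical experiment (my illustration, using a hypothetical Gaussian charge density; this is not the authors' scheme). For a single electron, the classical Hartree energy of its own density should be cancelled exactly by exchange, but approximate DFT functionals cancel it only partially, leaving a spurious residual that grows as the charge localizes.

```python
import numpy as np

# Spurious Hartree self-energy of one electron in a normalized Gaussian
# density n(r), computed on a radial grid (atomic units). In exact theory
# this energy must be cancelled exactly for a single electron; semilocal
# DFT functionals leave a residual -- the "self-interaction problem".
def hartree_self_energy(sigma, rmax=40.0, n=20000):
    r = np.linspace(1e-6, rmax, n)
    dr = r[1] - r[0]
    dens = (2*np.pi*sigma**2)**-1.5 * np.exp(-r**2 / (2*sigma**2))
    shell = 4*np.pi*r**2 * dens                     # charge per unit radius
    q_in = np.cumsum(shell) * dr                    # charge enclosed within r
    # Potential of a spherically symmetric density:
    # phi(r) = q_in(r)/r + integral_r^inf 4*pi*r'*n(r') dr'
    outer = np.cumsum((shell / r)[::-1])[::-1] * dr
    phi = q_in / r + outer
    return 0.5 * np.sum(shell * phi) * dr           # E_H = (1/2) int n*phi

for s in (1.0, 0.5):
    print(f"sigma = {s}: spurious Hartree self-energy = {hartree_self_energy(s):.4f} Ha")
# The self-energy scales as 1/sigma: the more localized the charge, the
# larger the spurious penalty -- which is why uncorrected DFT tends to
# favor the delocalized solution and destabilize polarons.
```

    The 1/sigma scaling is the key point: any energy error that penalizes localization tips the delicate stability competition against the polaron, which is the failure mode the EPFL formulation is designed to remove.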

    “In our work, we introduce a theoretical formulation for the electron self-interaction that solves the problem of polaron localization in density functional theory,” says Falletta. “This gives access to accurate polaron stabilities within a computationally-efficient scheme. Our study paves the way to unprecedented calculations of polarons in large systems, in systematic studies involving large sets of materials, or in molecular dynamics evolving over long time periods.”

    Science papers:
    Physical Review Letters
    Physical Review B

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH) is a research institute and university in Lausanne, Switzerland, that specializes in natural sciences and engineering. It is one of the two Swiss Federal Institutes of Technology, and it has three main missions: education, research and technology transfer.

    The QS World University Rankings ranks EPFL(CH) 14th in the world across all fields in their 2020/2021 ranking, whereas Times Higher Education World University Rankings ranks EPFL(CH) as the world’s 19th best school for Engineering and Technology in 2020.

    EPFL(CH) is located in the French-speaking part of Switzerland; the sister institution in the German-speaking part of Switzerland is The Swiss Federal Institute of Technology ETH Zürich [Eidgenössische Technische Hochschule Zürich] (CH). Associated with several specialized research institutes, the two universities form The Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles Polytechniques Fédérales] (CH) which is directly dependent on the Federal Department of Economic Affairs, Education and Research. In connection with research and teaching activities, EPFL(CH) operates a nuclear reactor CROCUS; a Tokamak Fusion reactor; a Blue Gene/Q Supercomputer; and P3 bio-hazard facilities.

    ETH Zürich, EPFL (Swiss Federal Institute of Technology in Lausanne) [École Polytechnique Fédérale de Lausanne](CH), and four associated research institutes form The Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH) with the aim of collaborating on scientific projects.

    The roots of modern-day EPFL(CH) can be traced back to the foundation of a private school under the name École Spéciale de Lausanne in 1853 at the initiative of Lois Rivier, a graduate of the École Centrale Paris (FR) and John Gay the then professor and rector of the Académie de Lausanne. At its inception it had only 11 students and the offices were located at Rue du Valentin in Lausanne. In 1869, it became the technical department of the public Académie de Lausanne. When the Académie was reorganized and acquired the status of a university in 1890, the technical faculty changed its name to École d’Ingénieurs de l’Université de Lausanne. In 1946, it was renamed the École polytechnique de l’Université de Lausanne (EPUL). In 1969, the EPUL was separated from the rest of the University of Lausanne and became a federal institute under its current name. EPFL(CH), like ETH Zürich (CH), is thus directly controlled by the Swiss federal government. In contrast, all other universities in Switzerland are controlled by their respective cantonal governments. Following the nomination of Patrick Aebischer as president in 2000, EPFL(CH) has started to develop into the field of life sciences. It absorbed the Swiss Institute for Experimental Cancer Research (ISREC) in 2008.

    In 1946, there were 360 students. In 1969, EPFL(CH) had 1,400 students and 55 professors. In the past two decades the university has grown rapidly and as of 2012 roughly 14,000 people study or work on campus, about 9,300 of these being Bachelor, Master or PhD students. The environment at modern day EPFL(CH) is highly international with the school attracting students and researchers from all over the world. More than 125 countries are represented on the campus and the university has two official languages, French and English.

    Organization

    EPFL is organized into eight schools, themselves formed of institutes that group research units (laboratories or chairs) around common themes:

    School of Basic Sciences
    Institute of Mathematics
    Institute of Chemical Sciences and Engineering
    Institute of Physics
    European Centre of Atomic and Molecular Computations
    Bernoulli Center
    Biomedical Imaging Research Center
    Interdisciplinary Center for Electron Microscopy
    MPG-EPFL Centre for Molecular Nanosciences and Technology
    Swiss Plasma Center
    Laboratory of Astrophysics

    School of Engineering

    Institute of Electrical Engineering
    Institute of Mechanical Engineering
    Institute of Materials
    Institute of Microengineering
    Institute of Bioengineering

    School of Architecture, Civil and Environmental Engineering

    Institute of Architecture
    Civil Engineering Institute
    Institute of Urban and Regional Sciences
    Environmental Engineering Institute

    School of Computer and Communication Sciences

    Algorithms & Theoretical Computer Science
    Artificial Intelligence & Machine Learning
    Computational Biology
    Computer Architecture & Integrated Systems
    Data Management & Information Retrieval
    Graphics & Vision
    Human-Computer Interaction
    Information & Communication Theory
    Networking
    Programming Languages & Formal Methods
    Security & Cryptography
    Signal & Image Processing
    Systems

    School of Life Sciences

    Bachelor-Master Teaching Section in Life Sciences and Technologies
    Brain Mind Institute
    Institute of Bioengineering
    Swiss Institute for Experimental Cancer Research
    Global Health Institute
    Ten Technology Platforms & Core Facilities (PTECH)
    Center for Phenogenomics
    NCCR Synaptic Bases of Mental Diseases

    College of Management of Technology

    Swiss Finance Institute at EPFL
    Section of Management of Technology and Entrepreneurship
    Institute of Technology and Public Policy
    Institute of Management of Technology and Entrepreneurship
    Section of Financial Engineering

    College of Humanities

    Human and social sciences teaching program

    EPFL Middle East

    Section of Energy Management and Sustainability

    In addition to the eight schools, there are seven closely related institutions:

    Swiss Cancer Centre
    Center for Biomedical Imaging (CIBM)
    Centre for Advanced Modelling Science (CADMOS)
    École Cantonale d’art de Lausanne (ECAL)
    Campus Biotech
    Wyss Center for Bio- and Neuro-engineering
    Swiss National Supercomputing Centre

     
  • richardmitnick 1:04 pm on October 7, 2022
    Tags: "New process could enable more efficient plastics recycling", A catalyst made of a microporous material called a zeolite containing cobalt can selectively break down various plastic polymer molecules and turn more than 80 percent of them into propane., A chemical process using a catalyst based on cobalt has been found to be very effective at breaking down a variety of plastics., A key problem is that plastics come in so many different varieties and chemical processes for breaking them down into a form that can be reused in some way tend to be very specific to each type., , , , Physics, Polyethylene (PET) and polypropylene (PP)-two widely produced forms of plastic-can be broken down into propane. Propane can then be used as a fuel or a feedstock for a variety of products., Recycling plastics has been a thorny problem because the long-chain molecules in plastics are held together by carbon bonds which are very stable and difficult to break apart., The accumulation of plastic waste is one of the major pollution issues of modern times., , The materials needed for the process-zeolites and cobalt-are both quite cheap and widely available., Today much of the plastic material gathered through recycling programs ends up in landfills anyway.   

    From The Massachusetts Institute of Technology: “New process could enable more efficient plastics recycling” 


    10.6.22
    David L. Chandler

    A new chemical process can break down a variety of plastics into usable propane — a possible solution to our inability to effectively recycle many types of plastic. Image: Courtesy of the researchers. Edited by MIT News.

    The accumulation of plastic waste in the oceans, soil, and even in our bodies is one of the major pollution issues of modern times, with over 5 billion tons disposed of so far. Despite major efforts to recycle plastic products, actually making use of that motley mix of materials has remained a challenging issue.

    A key problem is that plastics come in so many different varieties, and chemical processes for breaking them down into a form that can be reused in some way tend to be very specific to each type of plastic. Sorting the hodgepodge of waste material, from soda bottles to detergent jugs to plastic toys, is impractical at large scale. Today much of the plastic material gathered through recycling programs ends up in landfills anyway. Surely there’s a better way.

    According to new research from MIT and elsewhere, it appears there may indeed be a much better way. A chemical process using a catalyst based on cobalt has been found to be very effective at breaking down a variety of plastics, such as polyethylene (PE) and polypropylene (PP), the two most widely produced forms of plastic, into a single product, propane. Propane can then be used as a fuel for stoves, heaters, and vehicles, or as a feedstock for the production of a wide variety of products — including new plastics, thus potentially providing at least a partial closed-loop recycling system.

    The finding is described today in the open access journal JACS Au [below], in a paper by MIT professor of chemical engineering Yuriy Román-Leshkov, postdoc Guido Zichitella, and seven others at MIT, the DOE’s SLAC National Accelerator Laboratory, and the National Renewable Energy Laboratory.

    Recycling plastics has been a thorny problem, Román-Leshkov explains, because the long-chain molecules in plastics are held together by carbon bonds, which are “very stable and difficult to break apart.” Existing techniques for breaking these bonds tend to produce a random mix of different molecules, which would then require complex refining methods to separate out into usable specific compounds. “The problem is,” he says, “there’s no way to control where in the carbon chain you break the molecule.”

    But to the surprise of the researchers, a catalyst made of a microporous material called a zeolite that contains cobalt nanoparticles can selectively break down various plastic polymer molecules and turn more than 80 percent of them into propane.

    Although zeolites are riddled with tiny pores less than a nanometer wide (corresponding to the width of the polymer chains), a logical assumption had been that there would be little interaction at all between the zeolite and the polymers. Surprisingly, however, the opposite turned out to be the case: not only do the polymer chains enter the pores, but the synergy between the cobalt nanoparticles and the acid sites in the zeolite breaks the chain at the same point every time. That cleavage site turned out to correspond to chopping off exactly one propane molecule without generating unwanted methane, leaving the rest of the longer hydrocarbon ready to undergo the process, again and again.
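    The repeated clipping, and its contrast with uncontrolled bond breaking, can be sketched as a toy carbon-bookkeeping simulation (my illustration only; the real catalyst reaches roughly 80 percent selectivity, not the perfect selectivity assumed here):

```python
import random

# Two toy regimes for breaking a polymer chain of n carbons:
# - random scission: each pass breaks a uniformly random C-C bond, giving a
#   broad smear of fragment lengths that needs downstream separation;
# - selective clipping (the zeolite/cobalt picture above): each pass removes
#   exactly one C3 unit (propane), and the shortened chain re-enters the cycle.

def random_scission(n_carbons, rng):
    """Break chains at random bonds until every fragment is a light product (<= C4)."""
    fragments, queue = [], [n_carbons]
    while queue:
        c = queue.pop()
        if c <= 4:
            fragments.append(c)          # light product leaves the reactor
            continue
        cut = rng.randint(1, c - 1)      # break one random C-C bond
        queue += [cut, c - cut]
    return fragments

n = 1000
rng = random.Random(0)
products = random_scission(n, rng)
counts = {c: products.count(c) for c in (1, 2, 3, 4)}
print("random scission, product mix (C1..C4 counts):", counts)

# Selective clipping: every pass yields propane, so the product is uniform.
print(f"selective clipping: {n // 3} propane molecules, {n % 3} carbons left over")
```

    The point of the contrast is that random cleavage conserves carbon but scrambles it across many molecule sizes, whereas clipping at a fixed site converts nearly all of the chain into one separable product.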

    “Once you have this one compound, propane, you lessen the burden on downstream separations,” Román-Leshkov says. “That’s the essence of why we think this is quite important. We’re not only breaking the bonds, but we’re generating mainly a single product” that can be used for many different products and processes.

    The materials needed for the process, zeolites and cobalt, “are both quite cheap” and widely available, he says, although today most cobalt comes from troubled areas in the Democratic Republic of Congo. Some new production is being developed in Canada, Cuba, and other places. The other material needed for the process is hydrogen, which today is mostly produced from fossil fuels but can easily be made other ways, including electrolysis of water using carbon-free electricity such as solar or wind power.

    The researchers tested their system on a real example of mixed recycled plastic, producing promising results. But more testing will be needed on a greater variety of mixed waste streams to determine how much fouling takes place from various contaminants in the material — such as inks, glues, and labels attached to the plastic containers, or other nonplastic materials that get mixed in with the waste — and how that affects the long-term stability of the process.

    Together with collaborators at NREL, the MIT team is also continuing to study the economics of the system, and analyzing how it can fit into today’s systems for handling plastic and mixed waste streams. “We don’t have all the answers yet,” Román-Leshkov says, but preliminary analysis looks promising.

    The research team included Amani Ebrahim and Simone Bare at the SLAC National Accelerator Laboratory; Jie Zhu, Anna Brenner, Griffin Drake and Julie Rorrer at MIT; and Greg Beckham at the National Renewable Energy Laboratory. The work was supported by the U.S. Department of Energy (DoE), the Swiss National Science Foundation, and the DoE’s Office of Energy Efficiency and Renewable Energy, Advanced Manufacturing Office (AMO), and Bioenergy Technologies Office (BETO), as part of the Bio-Optimized Technologies to keep Thermoplastics out of Landfills and the Environment (BOTTLE) Consortium.

    Science paper:
    JACS Au
    See the science paper for instructive material.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and Whitehead Institute.

    Massachusetts Institute of Technology-Haystack Observatory, Westford, Massachusetts, USA, Altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities.

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but rather an institution combining elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    The Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, The Massachusetts Institute of Technology became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and The Massachusetts Institute of Technology’s defense research. In this period Massachusetts Institute of Technology’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced aLIGO was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and funded by the National Science Foundation.

    Caltech/MIT Advanced aLIGO

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in Physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique that served as the essential blueprint for LIGO.

    The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 10:38 am on October 5, 2022 Permalink | Reply
    Tags: Physics, "Magnetic nano mosaics", Physics team from the universities of Kiel and Hamburg discovers new class of magnetic lattices., For about ten years magnetic skyrmions - particle-like stable magnetic whirls that can form in certain materials and possess fascinating properties - have been a focus of research., "Skyrmion lattices"   

    From The Kiel University [Christian-Albrechts-Universität zu Kiel] (DE) And The University of Hamburg [Universität Hamburg] (DE): “Magnetic nano mosaics” 

    From The Kiel University [Christian-Albrechts-Universität zu Kiel] (DE)

    And


    The University of Hamburg [Universität Hamburg] (DE)

    10.5.22

    PD Dr. Kirsten von Bergmann
    Institute for Nanostructure and Solid State Physics
    University of Hamburg
    040 / 42838-6295
    kirsten.von.bergmann@physik.uni-hamburg.de

    Professor Dr. Stefan Heinze
    Institute of Theoretical Physics and Astrophysics
    Kiel University
    0431 / 880-4127
    heinze@theo-physik.uni-kiel.de

    Press Contact:
    Julia Siekmann
    Science Communication Officer
    Research area Kiel Nano Surface and Interface Sciences
    jsiekmann@uv.uni-kiel.de
    +49 (0)431/880-4855

    Physics team from the universities of Kiel and Hamburg discovers new class of magnetic lattices.

    1
    The image shows the different orientation of atomic “bar magnets” of an iron film: In a magnetic mosaic lattice (above), they are oriented in groups either upwards (purple) or downwards (white). In the skyrmion lattice (below), on the other hand, they point in all directions. © André Kubetzka.

    2
    A measurement using spin-polarised scanning tunnelling microscopy (SP-STM) makes the hexagonal arrangement in the magnetic mosaic lattice visible on the nanometre scale. Due to a twist of the mosaic lattice on the atomic lattice, two rotational domains appear which deviate from each other by about 13° (see markings and graphs on the right). © André Kubetzka.

    For about ten years, magnetic skyrmions – particle-like, stable magnetic whirls that can form in certain materials and possess fascinating properties – have been a focus of research: electrically easily controlled and only a few nanometers in size, they are suitable for future applications in spin electronics, quantum computers or neuromorphic chips. These magnetic whirls were first found in regular lattices, so-called “skyrmion lattices”, and later individual skyrmions were also observed at the University of Hamburg. Researchers from Kiel University and the University of Hamburg have now discovered a new class of spontaneously occurring magnetic lattices. They are related to skyrmion lattices, but their “atomic bar magnets” on the nanometer scale are oriented differently. A fundamental understanding of how such complex spin structures form, how they are arranged and remain stable is also needed for future applications. The results are published in the current issue of Nature Communications [below].

    Quantum mechanical interactions

    Attaching magnets to a refrigerator or reading data from a hard drive is only possible because of a quantum mechanical exchange interaction between the atomic bar magnets on the microscopic scale. This interaction, discovered by Werner Heisenberg in 1926, explains not only the parallel alignment of atomic bar magnets in ferromagnets, but also the occurrence of other magnetic configurations, such as antiferromagnets. Today many other magnetic interactions are known, which has led to a variety of possible magnetic states and new research questions. This is also important for skyrmion lattices: here the atomic bar magnets point in all spatial directions, which is only possible due to the competition of different interactions.

    “In our measurements, we found a hexagonal arrangement of magnetic contrasts, and at first we thought that was also a skyrmion lattice. Only later did it become clear that it could be a nanoscale magnetic mosaic,” says PD Dr. Kirsten von Bergmann. With her team from the University of Hamburg, she experimentally studied thin metallic films of iron and rhodium using spin-polarized scanning tunneling microscopy. This allows magnetic structures to be imaged down to the atomic scale. The observed magnetic lattices occurred spontaneously as in a ferromagnet, i.e., without an applied magnetic field. “With a magnetic field, we can invert the mosaic lattices, because the opposing spins only partially compensate for each other,” explains Dr. André Kubetzka, also from the University of Hamburg.

    Surprising: Magnetically different alignment

    Based on these measurements, the group of Prof. Dr. Stefan Heinze (Kiel University) performed quantum mechanical calculations on the supercomputers of the North German High Performance Computing Network (HLRN). They show that in the investigated iron films the tilting of the atomic bar magnets in a lattice of magnetic vortices, i.e. in all spatial directions, is very unfavorable. Instead, a nearly parallel or antiparallel alignment of neighboring atomic bar magnets is favored.

    “This result completely surprised us. A lattice of skyrmions was thus no longer an option to explain the experimental observations,” says Mara Gutzeit, doctoral researcher and first author of the study. The development of an atomistic spin model made clear that it must be a novel class of magnetic lattices, which the researchers called “mosaic lattices”. “We found out that these mosaic-like magnetic structures are caused by higher-order exchange terms, predicted only a few years ago,” says Dr. Soumyajyoti Haldar from the Kiel group.
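
    As a rough illustration of what such an atomistic spin model looks like, the energy of a chain of classical spins can be written with a nearest-neighbor Heisenberg exchange term plus one higher-order (here biquadratic) exchange term. This is a toy sketch: the `spin_energy` function and the coupling values `J` and `B` are illustrative placeholders, not the interactions computed for the iron films in the study.

```python
import numpy as np

def spin_energy(spins, J=1.0, B=0.3):
    """Energy of a 1D chain of classical unit spins (array of shape (N, 3)):
    Heisenberg exchange      -J * sum_i  S_i . S_{i+1}
    plus a biquadratic term  -B * sum_i (S_i . S_{i+1})^2.
    J and B are illustrative coupling constants, not values from the study."""
    dots = np.einsum("ij,ij->i", spins[:-1], spins[1:])  # S_i . S_{i+1} per bond
    return -J * np.sum(dots) - B * np.sum(dots**2)

# Compare a ferromagnetic (all parallel) chain with an
# antiferromagnetic (alternating) chain of N spins:
N = 8
up = np.tile([0.0, 0.0, 1.0], (N, 1))
alternating = up.copy()
alternating[1::2] *= -1          # flip every second spin

E_fm = spin_energy(up)           # all bond dot products are +1
E_afm = spin_energy(alternating) # all bond dot products are -1
```

    Note that the biquadratic term lowers the energy of parallel and antiparallel neighbors by the same amount, in line with the finding that such higher-order exchange can favor collinear, mosaic-like alignments over vortex-like tilting.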

    “The study impressively shows how diverse spin structures can be and that a close collaboration between experimentally and theoretically working research groups can be really helpful for their understanding. In this field a few more surprises can be expected in the future,” states Professor Stefan Heinze.

    Science paper:
    Nature Communications
    See the science paper for instructive images.
    _________________________________________________
    About spin electronics:

    In addition to the charge of the electrons, spin electronics also uses their so-called spin. This electron spin is a quantum mechanical property and can be understood in simplified terms as the rotation of the electrons around their own axis. This is linked to a magnetic moment that leads to the formation of “atomic bar magnets” (atomic spins) in magnetic materials. They are suitable for processing and storing information. Through targeted electrical manipulation, it would be possible to create faster, more energy-saving and more powerful components for information technology.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Hamburg [Universität Hamburg] (DE) is the largest institution for research and education in northern Germany. As one of the country’s largest universities, we offer a diverse range of degree programs and excellent research opportunities. The University boasts numerous interdisciplinary projects in a broad range of fields and an extensive partner network of leading regional, national, and international higher education and research institutions.
    Sustainable science and scholarship

    Universität Hamburg is committed to sustainability. All our faculties have taken great strides towards sustainability in both research and teaching.
    Excellent research

    As part of the Excellence Strategy of the Federal and State Governments, Universität Hamburg has been granted clusters of excellence for four core research areas: Advanced Imaging of Matter (photon and nanosciences), Climate, Climatic Change, and Society (CliCCS) (climate research), Understanding Written Artefacts (manuscript research) and Quantum Universe (mathematics, particle physics, astrophysics, and cosmology).

    An equally important core research area is Infection Research, in which researchers investigate the structure, dynamics, and mechanisms of infection processes to promote the development of new treatment methods and therapies.
    Outstanding variety: over 170 degree programs

    Universität Hamburg offers approximately 170 degree programs within its eight faculties:

    Faculty of Law
    Faculty of Business, Economics and Social Sciences
    Faculty of Medicine
    Faculty of Education
    Faculty of Mathematics, Informatics and Natural Sciences
    Faculty of Psychology and Human Movement Science
    Faculty of Business Administration (Hamburg Business School).

    Universität Hamburg is also home to several museums and collections, such as the Zoological Museum, the Herbarium Hamburgense, the Geological-Paleontological Museum, the Loki Schmidt Garden, and the Hamburg Observatory.
    History

    Universität Hamburg was founded in 1919 by local citizens. Important founding figures include Senator Werner von Melle and the merchant Edmund Siemers. Nobel Prize winners such as the physicists Otto Stern, Wolfgang Pauli, and Isidor Rabi taught and researched at the University. Many other distinguished scholars, such as Ernst Cassirer, Erwin Panofsky, Aby Warburg, William Stern, Agathe Lasch, Magdalene Schoch, Emil Artin, Ralf Dahrendorf, and Carl Friedrich von Weizsäcker, also worked here.

    The Kiel University [Christian-Albrechts-Universität zu Kiel] (DE) was founded back in 1665. It is Schleswig-Holstein’s oldest, largest and best-known university, with over 26,000 students and around 3,000 members of staff. It is also the only fully-fledged university in the state. Seven Nobel prize winners have worked here. The CAU has been successfully taking part in the Excellence Initiative since 2006. The Cluster of Excellence The Future Ocean, which was established in cooperation with GEOMAR [Helmholtz-Zentrum für Ozeanforschung Kiel] (DE) in 2006, is internationally recognized. The second Cluster of Excellence “Inflammation at Interfaces” deals with chronic inflammatory diseases. The Kiel Institute for the World Economy is also affiliated with Kiel University. The university has a great reputation for its focus on public international law. The oldest public international law institution in Germany and Europe – the Walther Schuecking Institute for International Law – is based in Kiel.

    History

    The University of Kiel was founded under the name Christiana Albertina on 5 October 1665 by Christian Albert, Duke of Holstein-Gottorp. The citizens of the city of Kiel were initially quite sceptical about the upcoming influx of students, thinking that these could be “quite a pest with their gluttony, heavy drinking and their questionable character” (German: mit Fressen, Sauffen und allerley leichtfertigem Wesen sehr ärgerlich seyn). But those in the city who envisioned economic advantages of a university in the city won, and Kiel thus became the northernmost university in the German Holy Roman Empire.

    After 1773, when Kiel had come under Danish rule, the university began to thrive, and when Kiel became part of Prussia in the year 1867, the university grew rapidly in size. The university opened one of the first botanical gardens in Germany (now the Alter Botanischer Garten Kiel), and Martin Gropius designed many of the new buildings needed to teach the growing number of students.

    The Christiana Albertina was one of the first German universities to obey the Gleichschaltung in 1933, agreeing to remove many professors and students from the school, for instance Ferdinand Tönnies and Felix Jacoby. During World War II, the University of Kiel suffered heavy damage; it was therefore later rebuilt at a different location, with only a few of the older buildings housing the medical school.

    In 2019, the university announced that it had banned full-face coverings in classrooms, citing the need for open communication that includes facial expressions and gestures.

    Faculties

    Faculty of Theology
    Faculty of Law
    Faculty of Business, Economics and Social Sciences
    Faculty of Medicine
    Faculty of Arts and Humanities
    Faculty of Mathematics and Natural Sciences
    Faculty of Agricultural Science and Nutrition
    Faculty of Engineering

     
  • richardmitnick 1:38 pm on October 4, 2022 Permalink | Reply
    Tags: "Black holes can’t trash info about what they swallow—and that’s a problem", If I tell you the mass and electric charge and spin (i.e. angular momentum) of a black hole we’re done., Information must be preserved during any process., Physicists have come up with several interesting clues that are helping us move in what’s hopefully the right direction., Physics, Solving the information paradox could unlock quantum gravity and unification of forces., We still do not have a solution to the Information Paradox.   

    From “ars technica”: “Black holes can’t trash info about what they swallow—and that’s a problem” 

    From “ars technica”

    10.3.22
    Paul Sutter

    Solving the information paradox could unlock quantum gravity and unification of forces.

    1
    Aaron Horowitz/Getty Images.

    Three numbers.

    Just three numbers—that’s all it takes to completely, unequivocally, 100 percent describe a black hole in general relativity. If I tell you the mass, electric charge, and spin (i.e., angular momentum) of a black hole, we’re done. That’s all we’ll ever know about it and all we’ll ever need to describe its features.

    Those three numbers allow us to calculate everything about how a black hole will interact with its environment, how objects around it will respond to it, and how the black hole will evolve in the future.

    For all their ferocious gravitational abilities and their unholy exotic natures, black holes are surprisingly simple. If I give you two black holes with the exact same mass, charge, and spin, you wouldn’t be able to tell them apart. If I swapped their places without you looking, you wouldn’t know that I did it.

    This also means that when you see a fully formed black hole, you have no idea what made it. Any combination of mass squeezed into a sufficiently small volume could have done the job. It could have been the ultra-dense core of a dying star. It could have been an extremely dense litter of adorable kittens squashed into oblivion.

    As long as the mass, charge, and spin are the same, the history is irrelevant. No information about the original material that created the black hole survives. Or does it?

    Founding charters

    “Information” is a bit of a loaded term; it can take on various definitions depending on who you ask and what mood they’re in. In physics, the concept of information is tightly linked to our understanding of how physical systems evolve and how we construct our theories of physics.

    We like to think that physics is a relatively useful paradigm for understanding the Universe we live in. One of the ways that physics is useful is its power of prediction. If I give you a list of all the information about a system, I should be able to apply my laws and theories of physics to tell you how that system will evolve. The reverse is also true. If I tell you the state of a system now, you can run all the math backward to figure out how the system got to its present state.

    These two concepts are known as determinism (I can predict the future) and reversibility (I can read the past) and are pretty much the foundational core of physics. If our theories of physics didn’t have these properties, we wouldn’t be able to get much work done.

    These two concepts also apply to quantum mechanics. Yes, quantum mechanics puts strict limits on what we can measure about the Universe, but that doesn’t mean all bets are off. Instead, we can simply replace a sharply defined classical state with a fuzzier quantum state and move on with our lives; the quantum state evolves according to the Schrödinger equation, which upholds both determinism and reversibility, so we’re all good.
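
    The determinism-and-reversibility pair has a compact mathematical expression in quantum mechanics: states evolve by unitary operators, and every unitary can be undone by its conjugate transpose, so the initial state is always recoverable. The numpy sketch below is an illustration added here, not part of the article; it evolves a toy four-level state forward and then runs the evolution backward exactly.

```python
import numpy as np

rng = np.random.default_rng(42)

# Build a random unitary U = exp(-iH) from a random Hermitian "Hamiltonian" H,
# via its eigendecomposition (a discrete stand-in for Schroedinger evolution).
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = (A + A.conj().T) / 2                       # Hermitian matrix
w, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-1j * w)) @ V.conj().T  # unitary evolution operator

psi0 = np.zeros(4, dtype=complex)
psi0[0] = 1.0                                  # initial state |0>

psi1 = U @ psi0                 # evolve forward: determinism
psi_back = U.conj().T @ psi1    # apply U-dagger: reversibility

assert np.allclose(psi_back, psi0)  # no information created or destroyed
```

    The final assertion is the whole point: because the evolution is unitary, running it in reverse recovers the initial state bit for bit, which is exactly the property black hole evaporation appears to violate.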

    This one-two punch of determinism and reversibility means that, in terms of physics, information must be preserved during any process. It can’t be either created or destroyed—if we were to add or remove information willy-nilly, we wouldn’t be able to predict the future or read the past. Any loss or gain means there would either be missing information or extra information, so all of physics would crumble to dust.

    There are many processes that appear to destroy information, but that’s only because we’re not keeping careful enough track. Take, for example, the burning of a book. If I gave you a pile of ashes, this would appear to be irreversible: There’s no way you could put the book back together. But if you have a sufficiently powerful microscope at your disposal (and a lot of patience) and got to watch me in the act of burning the book, you could—in principle at least, which is good enough—watch and track the motion of every single molecule in the process. You could then reverse all those motions and all those interactions to reconstruct the book. Information is not lost when you burn a book; it’s merely scrambled.

    In the traditional, classical view of black holes, all this business about information is not a problem at all. The information that went into building the black hole is simply locked away behind the event horizon—the one-way boundary at the black hole’s surface that makes it so unique. Once there, the information will never be seen in this Universe again. Whether the black hole was formed from dying stars or squashed kittens, it doesn’t practically matter. The information may not be destroyed, but it’s permanently hidden from our prying eyes.

    Hawking’s surprise

    At least, that’s what we thought until the mid-1970s, when famed astrophysicist Stephen Hawking discovered that black holes aren’t entirely… well, black.

    Hawking was exploring the nature of quantum fields near the event horizons of black holes when he discovered an unusual property. The interaction of the event horizon with the quantum fields triggered the emission of radiation; light and particles could escape from the otherwise inescapable event horizons, causing the black holes to lose mass and eventually evaporate.

    Curiously, Hawking found that the radiation emitted by a black hole was perfectly thermal, meaning that it contained no information whatsoever except for that regarding the mass, charge, and spin of the black hole. Thus was born the black hole information paradox. Unlike with a burned book, if a book fell into a black hole, there would be no way to reconstruct its words from the radiation that came out. After the black hole radiated away all its mass and disappeared in a poof of particles, all the information about all the objects (books, stars, kittens, etc.) that fell in to create the black hole would disappear along with it.
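
    Because the radiation is thermal, it is characterized by a single temperature that, for an uncharged, non-spinning black hole, depends only on the mass. The snippet below is an illustrative aside, not a calculation from the article: it evaluates the textbook expressions for the Hawking temperature and the total evaporation time using standard physical constants.

```python
import math

# Physical constants (SI units)
hbar = 1.054571817e-34   # reduced Planck constant, J*s
c = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # gravitational constant, m^3/(kg*s^2)
k_B = 1.380649e-23       # Boltzmann constant, J/K
M_sun = 1.989e30         # solar mass, kg

def hawking_temperature(M):
    """Temperature of the thermal Hawking radiation of a black hole
    of mass M:  T = hbar*c^3 / (8*pi*G*M*k_B)."""
    return hbar * c**3 / (8 * math.pi * G * M * k_B)

def evaporation_time(M):
    """Time for a black hole of mass M to evaporate completely:
    t = 5120*pi*G^2*M^3 / (hbar*c^4)."""
    return 5120 * math.pi * G**2 * M**3 / (hbar * c**4)

T = hawking_temperature(M_sun)                 # ~6e-8 K
t_years = evaporation_time(M_sun) / 3.156e7    # ~1e67 years
```

    For a solar-mass black hole this gives a temperature of roughly 6 × 10⁻⁸ K, far colder than the cosmic microwave background, and an evaporation time on the order of 10⁶⁷ years, which is why the paradox is a problem of principle rather than of observation.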

    But as we went over earlier, information can’t just disappear, so this was a bit of a puzzle.

    The problem languished for decades, with physicists arguing back and forth (and even changing their minds!) about how to fix it. Hawking’s calculations could be wrong, but that would mean we were missing something important about the nature of quantum field theory—which was well-tested. Or our understanding of gravity could be wrong, although that was well-tested, too. Or we needed to give up our cherished notions of the conservation of information… which was also well-tested.

    It won’t be spoiling the rest of this article to tell you that we still do not have a solution to the paradox. But in studying this troubling problem, physicists have come up with several interesting clues that are helping us move in what’s hopefully the right direction.

    Information wants to be free

    The first major clue came in the late 1990s when theoretical physicist Juan Maldacena calculated the entropy of a black hole. In a nutshell, this calculation of the entropy was a count of all the missing information that gets locked behind an event horizon. He found that the amount of entropy inside a black hole is proportional to the radius squared—and thus proportional to the surface area of the black hole. (That’s in contrast to the radius cubed, which is proportional to the volume.)

    For example, if you take a standard black hole and add one single bit of information to it (as encoded by, say, a single photon with a wavelength equal to the radius of the black hole), its surface area will increase by exactly the square of the Planck length.
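
    The area scaling described here is the standard Bekenstein-Hawking entropy formula; written out (a textbook expression, not quoted from the article):

```latex
S_{\mathrm{BH}} \;=\; \frac{k_B\, A}{4\,\ell_P^{2}},
\qquad
\ell_P \;=\; \sqrt{\frac{\hbar G}{c^{3}}}
```

    Adding one bit of information raises the entropy by \(k_B \ln 2\), so the horizon area grows by \(\Delta A = 4\,\ell_P^{2}\ln 2\), i.e. on the order of the Planck area, consistent with the photon thought experiment above.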

    Leading from Hawking’s insight, this result suggested that the most important property of a black hole—the place where we should focus our attention and efforts—was not the infinitely dense singularity in the center but the surface of the event horizon, which separates the insides of a black hole from the Universe outside.

    The relationship between a black hole’s surface and its entropy also dovetailed nicely with another concept evolving out of the string theory community at the time, something known as the “holographic principle.”

    String theory is an attempt to develop a theory of everything, a complete description of all the forces of nature under a single unifying framework. That attempt hasn’t seen a lot of success because nobody has been able to use string theory to develop a quantum theory of gravity—all the math just gets too complex to solve. So several physicists in the ’90s wondered if there was a way to simplify the problem. Instead of trying to work through the nasty problem of quantum gravity in our normal four-dimensional Universe, maybe we could encode all the information contained in the Universe onto an imaginary three-dimensional boundary and get an easier version of the math.

    Maldacena was able to provide a realization of that idea via what’s called the AdS/CFT correspondence. It works like this. You start by trying to solve a problem involving quantum gravity in a particular kind of Universe called anti-de Sitter space (AdS, which has no matter or radiation inside it but does have a negative cosmological constant). Mathematically, you can project all the information in that Universe onto its surface. Once you make that transformation, your impossible-to-solve quantum gravity problem turns into a merely very-difficult-to-solve problem in conformal field theory (that’s the CFT part), which is a kind of quantum field theory that doesn’t include gravity at all. You can then solve your problem and translate the solution back into the full-dimensional Universe and move on with your life.

    This correspondence between the information within a volume and the information present on that volume’s surface is the holographic principle (named so because holograms store 3D information on a 2D surface). The correspondence has yet to be proven mathematically, but it has turned out to be useful for solving various kinds of specialized problems in the realm of high-energy physics.

    What does this have to do with black holes? The fact that a black hole’s information content is related directly to its surface and not its volume seems to be a major clue that the resolution to the paradox may come from using the AdS/CFT correspondence, which recasts problems involving extended objects with gravity into surface-layer problems without gravity. Leaving aside the slightly uncomfortable fact that the inside of a black hole is definitely not an anti-de Sitter space, perhaps the black holes are trying to tell us something fundamental not just about the nature of gravity but about reality itself.

    It was based on this correspondence that Hawking declared a winner in the love-it-or-leave-it debate regarding the preservation of information. Based on the AdS/CFT holographic picture of the Universe, information must be preserved (somehow) on the surface of a black hole and end up leaving the black hole (somehow) via Hawking radiation. If you threw a book into a black hole and kept careful track of the particles emitted over the next few trillions of years, you should be able to put the book back together again.

    Somehow.

    The “promised land” of quantum gravity

    The “how” part of this story has been keeping some physicists up late at night for the past 20 years. One particular line of thinking has been to closely examine the nature of spacetime near the event horizon. In Hawking’s original approach, he assumed that a large enough black hole would curve space in the region of the horizon, but only mildly so. But we know from our (limited and incomplete) forays into quantum gravity that we may have to account for a more dramatic curvature. To fully answer the question of “what’s gravity up to around an event horizon?” we may also have to fold in the same kind of quantum fuzziness that underlies theories of subatomic particles.

    When we do that, however, we typically get uncontrollable infinities popping up everywhere in the math because such theories need to account for every possible exotic shape that spacetime can take. This is generally why we don’t have a theory of quantum gravity. That said, some brave theorists have dared to venture into those uncharted waters and have discovered some clever tricks (really hardcore stuff, too, like imaginary wormholes threading together in a complex mathematical space) to untangle some of the equations, showing that it may be possible to create scenarios where information can leak into the Hawking process.

    Still other theorists have rejected this string-theory-driven approach to black holes and focus instead on the nature of space-time at the singularity. Their approaches consider whether space and time might come in discrete little chunks, the same way that energy levels and angular momentum do. In this view, the singularity is not an infinitely dense point but merely a really tiny one. And when the black hole evaporates, it doesn’t disappear completely—instead, it leaves behind a nugget of information-rich material. But those approaches run into major hurdles of their own, like having to figure out how to make the transition from a black hole with an inescapable horizon to a lump of matter existing bare naked in the Universe.

    Ultimately, physicists remain intrigued by the information paradox because it potentially exposes a feature of quantum gravity and makes it available to our examination. Quantum gravity is usually the domain of the ultra-exotic: the initial moments of the Big Bang or unachievable particle collider energies. But black holes are real things in the real Universe; with enough determination, we could reach out and dip a toe into an event horizon.

    If we can solve the information paradox, we just might be able to unlock quantum gravity, the unification of the forces, and more.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Ars Technica was founded in 1998 when Founder & Editor-in-Chief Ken Fisher announced his plans for starting a publication devoted to technology that would cater to what he called “alpha geeks”: technologists and IT professionals. Ken’s vision was to build a publication with a simple editorial mission: be “technically savvy, up-to-date, and more fun” than what was currently popular in the space. In the ensuing years, with formidable contributions by a unique editorial staff, Ars Technica became a trusted source for technology news, tech policy analysis, breakdowns of the latest scientific advancements, gadget reviews, software, hardware, and nearly everything else found in between layers of silicon.

    Ars Technica innovates by listening to its core readership. Readers have come to demand devotion to accuracy and integrity, along with a willingness to leave each day’s meaningless, click-bait fodder by the wayside. The result is something unique: the unparalleled marriage of breadth and depth in technology journalism. By 2001, Ars Technica was regularly producing news reports, op-eds, and the like, but the company stood out from the competition by regularly providing long thought-pieces and in-depth explainers.

    And thanks to its readership, Ars Technica also accomplished a number of industry-leading moves. In 2001, Ars launched a digital subscription service when such things were non-existent for digital media. Ars was also the first IT publication to begin covering the resurgence of Apple, and the first to draw analytical and cultural ties between the world of high technology and gaming. Ars was also first to begin selling its long form content in digitally distributable forms, such as PDFs and eventually eBooks (again, starting in 2001).

     
  • richardmitnick 4:20 pm on October 3, 2022 Permalink | Reply
    Tags: "EFTs": Effective Field Theories, "EM": electric and magnetic fields, "HIGS": High Intensity Gamma Ray Source, "How Stiff Is the Proton?", , EM polarizabilities are a measure of the stiffness against the deformation induced by EM fields., In this research scientists validated EFTs using proton Compton scattering., , Nucleon Compton scattering, Physics, , The current theory of the strong nuclear force called quantum chromodynamics (QCD), The Department of Energy   

    From The Department of Energy: “How Stiff Is the Proton?” 

    From The Department of Energy

    9.30.22
    Contact
    Mohammad Ahmed
    North Carolina Central University and Triangle Universities Nuclear Laboratory
    mahmed2@nccu.edu

    Compton scattering setup at the High Intensity Gamma Ray Source. The central cylinder is the liquid hydrogen target. High energy gamma rays are scattered from the liquid hydrogen into eight large detectors that measure the gamma rays’ energy. Credit: Mohammad Ahmed, North Carolina Central University and Triangle Universities Nuclear Laboratory

    The Science

    The proton is a composite particle made up of fundamental building blocks of quarks and gluons. These components and their interactions determine the proton’s structure, including its electrical charges and currents. This structure deforms when exposed to external electric and magnetic (EM) fields, a phenomenon known as polarizability. The EM polarizabilities are a measure of the stiffness against the deformation induced by EM fields. By measuring the EM polarizabilities, scientists learn about the internal structure of the proton. This knowledge helps to validate scientific understanding of how nucleons (protons and neutrons) form by comparing the results to theoretical descriptions of gamma-ray scattering from nucleons. Scientists call this scattering process nucleon Compton scattering.
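In the language of linear response, the polarizabilities quantify how readily external fields deform the proton: the fields induce electric and magnetic dipole moments, and those induced dipoles shift the proton’s energy. The schematic relations below use the standard symbols for the static electric and magnetic dipole polarizabilities; overall numerical factors (such as powers of 4π) depend on the unit convention.

```latex
% Induced dipole moments: linear response to external EM fields
\mathbf{p}_{\mathrm{ind}} = \alpha_{E1}\,\mathbf{E},
\qquad
\mathbf{m}_{\mathrm{ind}} = \beta_{M1}\,\mathbf{H}

% Corresponding energy shift: a "stiffer" proton (smaller
% polarizabilities) is deformed less and shifted less
\Delta U = -\tfrac{1}{2}\,\alpha_{E1}\,E^{2}
           -\tfrac{1}{2}\,\beta_{M1}\,H^{2}
```

Measuring how scattered gamma rays depend on energy and angle lets experimenters extract these coefficients, which is what the Compton-scattering program described here aims to do with high precision.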

    The Impact

    When scientists examine the proton at a distance and scale where EM responses dominate, they can determine values of EM polarizabilities with high precision. To do so, they use the theoretical framework of Effective Field Theories (EFTs). EFTs hold the promise of matching the description of nucleon structure at low energies to the current theory of the strong nuclear force, quantum chromodynamics (QCD). In this research, scientists validated EFTs using proton Compton scattering, which in turn validated the framework and methodology that underlie them.

    Summary

    Proton Compton scattering is the process by which scientists scatter circularly or linearly polarized gamma rays from a hydrogen target (in this case, a liquid target), then measure the angular distribution of the scattered gamma rays. High-energy gamma rays carry EM fields strong enough that the response of the charges and currents in the nucleon becomes significant. In this study, scientists performed new measurements of Compton scattering from the proton at the High Intensity Gamma Ray Source (HIGS) at the Triangle Universities Nuclear Laboratory. This work provided a novel experimental approach to Compton scattering from the proton at low energies using polarized gamma rays. The study underscores the need for new high-precision measurements at HIGS to improve the accuracy of determinations of proton and neutron polarizabilities. These measurements validate the theories that link the low-energy description of nucleons to QCD.
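For a rough sense of scale (a back-of-the-envelope sketch, not the analysis used in the paper): in the zero-energy limit, Compton scattering from a point-like proton reduces to the classical Thomson cross section, and the polarizabilities appear as small energy-dependent corrections to that baseline. The constants below are standard values; the formula is the textbook Thomson result.

```python
import math

# Standard constants
ALPHA_EM = 1 / 137.036   # fine-structure constant
HBARC = 197.327          # hbar * c in MeV * fm
M_PROTON = 938.272       # proton rest energy in MeV

# "Thomson radius" of the proton: r = alpha * hbar*c / (M c^2), in fm
r_p = ALPHA_EM * HBARC / M_PROTON

def dsigma_domega(theta):
    """Thomson differential cross section for a point proton, in fm^2/sr."""
    return 0.5 * r_p**2 * (1 + math.cos(theta) ** 2)

# Total Thomson cross section: sigma = (8 pi / 3) r^2
sigma_fm2 = (8 * math.pi / 3) * r_p**2
sigma_microbarn = sigma_fm2 * 1e4   # 1 fm^2 = 10^4 microbarn

print(f"Point-proton Thomson cross section ~ {sigma_microbarn:.3f} microbarn")
```

The result is roughly 0.2 microbarn, which is why these experiments need an intense gamma-ray source like HIGS; the polarizability terms then modify this tiny baseline at order (photon energy)², and the angular distribution separates the electric from the magnetic response.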

    Science paper:
    Physical Review Letters

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The United States Department of Energy is a cabinet-level department of the United States government concerned with the United States’ policies regarding energy and safety in handling nuclear material. Its responsibilities include the nation’s nuclear weapons program; nuclear reactor production for the United States Navy; energy conservation; energy-related research; radioactive waste disposal; and domestic energy production. It also directs research in genomics; the Human Genome Project originated in a DOE initiative. DOE sponsors more research in the physical sciences than any other U.S. federal agency, the majority of which is conducted through its system of National Laboratories. The agency is led by the United States Secretary of Energy, and its headquarters are located in Southwest Washington, D.C., on Independence Avenue in the James V. Forrestal Building, named for James Forrestal, as well as in Germantown, Maryland.

    Formation and consolidation

    In 1942, during World War II, the United States started the Manhattan Project, a project to develop the atomic bomb, under the eye of the U.S. Army Corps of Engineers. After the war, in 1946, the Atomic Energy Commission (AEC) was created to control the future of the project. The Atomic Energy Act of 1946 also created the framework for the first National Laboratories. Among other nuclear projects, the AEC produced fabricated uranium fuel cores at locations such as Fernald Feed Materials Production Center in Cincinnati, Ohio. In 1974, the AEC gave way to the Nuclear Regulatory Commission, which was tasked with regulating the nuclear power industry, and the Energy Research and Development Administration, which was tasked with managing the nuclear weapons, naval reactor, and energy development programs.

    The 1973 oil crisis called attention to the need to consolidate energy policy. On August 4, 1977, President Jimmy Carter signed into law The Department of Energy Organization Act of 1977 (Pub.L. 95–91, 91 Stat. 565, enacted August 4, 1977), which created the Department of Energy. The new agency, which began operations on October 1, 1977, consolidated the Federal Energy Administration; the Energy Research and Development Administration; the Federal Power Commission; and programs of various other agencies. Former Secretary of Defense James Schlesinger, who served under Presidents Nixon and Ford during the Vietnam War, was appointed as the first secretary.

    President Carter created the Department of Energy with the goal of promoting energy conservation and developing alternative sources of energy. He wanted to reduce dependence on foreign oil and cut the use of fossil fuels. With America’s energy future uncertain, Carter acted quickly to have the department up and running within the first year of his presidency. This was an extremely important issue of the time, as the oil crisis was causing shortages and inflation. After the Three Mile Island accident, Carter was able to intervene with the help of the department, changing management and procedures within the Nuclear Regulatory Commission. This was possible because nuclear energy and weapons are the responsibility of the Department of Energy.

    Recent

    On March 28, 2017, a supervisor in the Office of International Climate and Clean Energy asked staff to avoid the phrases “climate change,” “emissions reduction,” or “Paris Agreement” in written memos, briefings or other written communication. A DOE spokesperson denied that phrases had been banned.

    In a May 2019 press release concerning natural gas exports from a Texas facility, the DOE used the term “freedom gas” to refer to natural gas. The phrase originated from a speech made by Secretary Rick Perry in Brussels earlier that month. Washington Governor Jay Inslee derided the term as “a joke”.

    Facilities

    The Department of Energy operates a system of national laboratories and technical facilities for research and development, as follows:

    Ames Laboratory
    Argonne National Laboratory
    Brookhaven National Laboratory
    Fermi National Accelerator Laboratory
    Idaho National Laboratory
    Lawrence Berkeley National Laboratory
    Lawrence Livermore National Laboratory
    Los Alamos National Laboratory
    National Energy Technology Laboratory
    National Renewable Energy Laboratory
    Oak Ridge National Laboratory
    Pacific Northwest National Laboratory
    Princeton Plasma Physics Laboratory
    Sandia National Laboratories
    Savannah River National Laboratory
    SLAC National Accelerator Laboratory
    Thomas Jefferson National Accelerator Facility

    Other major DOE facilities include
    Albany Research Center
    Bannister Federal Complex
    Bettis Atomic Power Laboratory – focuses on the design and development of nuclear power for the U.S. Navy
    Kansas City Plant
    Knolls Atomic Power Laboratory – operates for Naval Reactors Program Research under the DOE (not a National Laboratory)
    National Petroleum Technology Office
    Nevada Test Site
    New Brunswick Laboratory
    Office of River Protection
    Pantex
    Radiological and Environmental Laboratory
    Y-12 National Security Complex
    Yucca Mountain nuclear waste repository
    Other:

    Pahute Mesa Airstrip – Nye County, Nevada, in supporting Nevada National Security Site

     
  • richardmitnick 10:41 am on October 3, 2022 Permalink | Reply
    Tags: "Self-assembly breakthrough offers new promise for microscopic materials by mimicking biology", , , , , Physics   

    From New York University Via “COSMOS (AU)” : “Self-assembly breakthrough offers new promise for microscopic materials by mimicking biology” 

    From New York University

    Via

    “COSMOS (AU)”

    10.1.22
    Evrim Yazgin

    The illustration shows how droplets with different DNA strands first combine into chains, which are then programmed to fold into specific geometries, analogous to protein folding. The carpet highlights one folding pathway of a hexamer chain folding into a polytetrahedron. The zoom shows how the formation of DNA double helices drives droplet-droplet binding. Credit: Kaitlynn Snyder.

    A new method for the self-assembly of particles, developed by physicists at New York University (NYU), offers promise for creating complex and innovative microscopic materials.

    A note here that the “particles” exhibiting self-assembly are not subatomic particles like protons and electrons, but larger objects such as molecules, usually visible only through a microscope.

    Such self-assembling of particles is believed to be useful in future drug and vaccine delivery as well as other medical applications.

    Self-assembly was initially put forward in the early 2000s as the potential for nanotechnology began to make headlines. By “pre-programming” particles, scientists and engineers would be able to build materials at the microscopic level without human intervention. The particles organise themselves.

    Think of it like microscopic Ikea furniture that can assemble itself.

    But, don’t get the wrong end of the microscopic stick – this has nothing to do with artificial intelligence or particles with consciousness. The particles are programmed through chemistry.

    Self-assembly can be achieved reliably and to great effect if all the pieces being assembled are distinct from one another. Systems built from only a few types of particles, however, are much harder to program. The work done at NYU is aimed at producing self-assembly in these systems.

    The NYU physicists reported their breakthrough in the journal Nature [below]. Their research centres on emulsion – droplets of oil in water. Droplet chains are made to fold into unique shapes – called “foldamers” – which can be theoretically predicted from the sequence of interactions between the droplets.

    Self-assembly already exists in nature. The team borrowed from what we understand of the physical chemistry of folding in proteins and RNA, applying it to colloids – mixtures of two or more substances that are not chemically combined, such as an emulsion.

    By placing an array of DNA sequences on the tiny oil droplets, which served as assembly “instructions”, the team was able to get the droplets to first form flexible chains before sequentially folding or collapsing via the sticky DNA molecules.

    The physicists found that a simple alternating chain of up to 13 droplets, with two different types of oil, self-assembled into 11 two-dimensional ‘foldamers’ and an additional one in three dimensions.
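The idea of a short chain folding into a small set of shapes under programmed stickiness can be illustrated with a toy lattice model, in the spirit of the HP model used for protein folding. This sketch is hypothetical and much simpler than the authors’ actual system: droplets alternate between two types (standing in for the two oils), a contact matrix says which type pairs can bind (standing in for the sticky DNA), and we exhaustively enumerate all folds of a hexamer chain on a 2D square lattice.

```python
from itertools import product

# Toy 2D lattice model of a droplet chain folding via programmed stickiness.
# Illustrative only, NOT the published model: types alternate like the two
# oils, and STICKY lists which type pairs can bind via their DNA strands.
CHAIN = "BYBYBY"                                            # hexamer chain
STICKY = {("B", "B"), ("B", "Y"), ("Y", "B"), ("Y", "Y")}   # all pairs bind
MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def energy(coords):
    """Score -1 for each sticky contact between non-bonded lattice neighbors."""
    pos = {c: i for i, c in enumerate(coords)}
    e = 0
    for i, (x, y) in enumerate(coords):
        for dx, dy in MOVES:
            j = pos.get((x + dx, y + dy))
            # j > i + 1 skips chain-bonded neighbors and counts each pair once
            if j is not None and j > i + 1 and (CHAIN[i], CHAIN[j]) in STICKY:
                e -= 1
    return e

def fold(chain):
    """Enumerate all self-avoiding conformations; return the ground states."""
    n = len(chain)
    best, folds = 0, []
    # Fix the first step to (1, 0) to remove rotational symmetry
    for steps in product(MOVES, repeat=n - 2):
        coords = [(0, 0), (1, 0)]
        ok = True
        for dx, dy in steps:
            x, y = coords[-1]
            nxt = (x + dx, y + dy)
            if nxt in coords:        # enforce self-avoidance
                ok = False
                break
            coords.append(nxt)
        if ok:
            e = energy(coords)
            if e < best:
                best, folds = e, [coords]
            elif e == best and e < 0:
                folds.append(coords)
    return best, folds

emin, ground_states = fold(CHAIN)
print(f"ground-state energy {emin}, reached by {len(ground_states)} folds")
```

Editing STICKY reprograms which folds win, loosely mirroring how the DNA sequences in the experiment steer a chain toward one unique foldamer; the real system of course folds dynamically in three dimensions rather than by exhaustive search.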

    Microscopy images show a chain of alternating blue and yellow droplets folding into a crown geometry through blue-blue, blue-yellow, and finally yellow-yellow interactions, mediated by sticky DNA strands. Microscopic droplets are programmed to interact via sticky DNA strands to uniquely fold into well-defined shapes, as shown here. Credit: The Brujic Lab.

    “Being able to pre-program colloidal architectures gives us the means to create materials with intricate and innovative properties,” explains senior author Jasna Brujic, a professor in New York University’s Department of Physics. “Our work shows how hundreds of self-assembled geometries can be uniquely created, offering new possibilities for the creation of the next generation of materials.”

    They say the counterintuitive and pioneering aspect of their research is in requiring fewer building blocks to produce a wide variety of shapes.

    “Unlike a jigsaw puzzle, in which every piece is different, our process uses only two types of particles, which greatly reduces the variety of building blocks needed to encode a particular shape. The innovation lies in using folding, similar to the way that proteins do, but on a length scale 1,000 times bigger – about one-tenth the width of a strand of hair. These particles first bind together to make a chain, which then folds, according to pre-programmed interactions that guide the chain through complex pathways, into a unique geometry,” says Brujic.

    “The ability to obtain a lexicon of shapes opens the path to further assembly into larger scale materials, just as proteins hierarchically aggregate to build cellular compartments in biology.”

    Science paper:
    Nature

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    More than 175 years ago, Albert Gallatin, the distinguished statesman who served as secretary of the treasury under Presidents Thomas Jefferson and James Madison, declared his intention to establish “in this immense and fast-growing city … a system of rational and practical education fitting for all and graciously opened to all.” Founded in 1831, New York University is now one of the largest private universities in the United States. Of the more than 3,000 colleges and universities in America, New York University is one of only 60 member institutions of the distinguished Association of American Universities.

    New York University is a private research university in New York City. Chartered in 1831 by the New York State Legislature, NYU was founded by a group of New Yorkers led by then Secretary of the Treasury Albert Gallatin.

    In 1832, the initial non-denominational all-male institution began its first classes near City Hall based on a curriculum focused on a secular education. The university, in 1833, then moved and has maintained its main campus in Greenwich Village surrounding Washington Square Park. Since then, the university has added an engineering school in Brooklyn’s MetroTech Center and graduate schools throughout Manhattan. New York University has become the largest private university in the United States by enrollment, with a total of 51,848 enrolled students, including 26,733 undergraduate students and 25,115 graduate students, in 2019. New York University also receives the most applications of any private institution in the United States and admissions is considered highly selective.

    New York University is organized into 10 undergraduate schools, including the College of Arts & Science, Gallatin School, Steinhardt School, Stern School of Business, Tandon School of Engineering, and the Tisch School of the Arts. New York University’s 15 graduate schools include the Grossman School of Medicine, School of Law, Wagner Graduate School of Public Service, School of Professional Studies, School of Social Work, Rory Meyers School of Nursing, and Silver School of Social Work. The university’s internal academic centers include the Courant Institute of Mathematical Sciences, Center for Data Science, Center for Neural Science, Clive Davis Institute, Institute for the Study of the Ancient World, Institute of Fine Arts, and the New York University Langone Health System. New York University is a global university with degree-granting campuses at New York University Abu Dhabi and New York University Shanghai, and academic centers in Accra, Berlin, Buenos Aires, Florence, London, Los Angeles, Madrid, Paris, Prague, Sydney, Tel Aviv, and Washington, D.C.

    Past and present faculty and alumni include 38 Nobel Laureates, 8 Turing Award winners, 5 Fields Medalists, 31 MacArthur Fellows, 26 Pulitzer Prize winners, 3 heads of state, a U.S. Supreme Court justice, 5 U.S. governors, 4 mayors of New York City, 12 U.S. Senators, 58 members of the U.S. House of Representatives, two Federal Reserve Chairmen, 38 Academy Award winners, 30 Emmy Award winners, 25 Tony Award winners, 12 Grammy Award winners, 17 billionaires, and seven Olympic medalists. The university has also produced six Rhodes Scholars, three Marshall Scholars, 29 Schwarzman Scholars, and one Mitchell Scholar.

    Research

    New York University is classified among “R1: Doctoral Universities – Very high research activity” and research expenditures totaled $917.7 million in 2017. The university was the founding institution of the American Chemical Society. The New York University Grossman School of Medicine received $305 million in external research funding from the National Institutes of Health in 2014. New York University was granted 90 patents in 2014, the 19th most of any institution in the world. New York University owns the fastest supercomputer in New York City. As of 2016, New York University hardware researchers and their collaborators enjoy the largest outside funding level for hardware security of any institution in the United States, including grants from the National Science Foundation, the Office of Naval Research, the Defense Advanced Research Projects Agency, the United States Army Research Laboratory, the Air Force Research Laboratory, the Semiconductor Research Corporation, and companies including Twitter, Boeing, Microsoft, and Google.

    In 2019, four New York University Arts & Science departments ranked in Top 10 of Shanghai Academic Rankings of World Universities by Academic Subjects (Economics, Politics, Psychology, and Sociology).

     
  • richardmitnick 12:41 pm on September 30, 2022 Permalink | Reply
    Tags: "What it’s like to be stationed at a particle accelerator", , , , , , , Physics   

    From “Penn Today” : “What it’s like to be stationed at a particle accelerator” 

    From “Penn Today”

    at

    University of Pennsylvania

    9.29.22
    Blake Cole

    Gwen Gardner and Lauren Osojnak, Ph.D. candidates in physics, describe their work as part of the Penn ATLAS team at the Large Hadron Collider.

    ATLAS

    On July 5, 2022, the European Organization for Nuclear Research, more commonly referred to as CERN, brought all LHC systems online for its third run. This came after a three-year-long maintenance and upgrade phase, and on the tail of the 10th anniversary of one of the most significant discoveries associated with CERN: the Higgs boson, “the fundamental particle associated with the Higgs field, a field that gives mass to other fundamental particles such as electrons and quarks.”

    Gwen Gardner (third from right) and Lauren Osojnak (second from right) below the detector, standing in front of one of the access points they use to climb up to our electronics. (Image: Courtesy of Gwen Gardner and Lauren Osojnak)

    The LHC, located in Geneva on the Franco-Swiss border, is the world’s largest and most powerful particle accelerator, a 27-kilometer ring of superconducting magnets. It speeds up and increases the energy of a beam of particles by generating electric fields that accelerate the particles, and magnetic fields that steer and focus them, which gives researchers a rare glimpse into the basic constituents of matter.
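To put “most powerful” in rough numbers, here is an illustrative back-of-the-envelope calculation (not from the article) using the nominal LHC Run 3 beam energy of 6.8 TeV per proton:

```python
M_PROTON_GEV = 0.938272   # proton rest energy, GeV
E_BEAM_GEV = 6800.0       # nominal LHC Run 3 beam energy, GeV
C = 299_792_458.0         # speed of light, m/s

# Lorentz factor: total energy divided by rest energy
gamma = E_BEAM_GEV / M_PROTON_GEV

# At this gamma, 1 - v/c ~ 1/(2 gamma^2), so the shortfall below c is tiny
speed_deficit = C / (2 * gamma**2)   # m/s below the speed of light

print(f"gamma ~ {gamma:.0f}; protons travel only ~{speed_deficit:.1f} m/s "
      "slower than light")
```

Each proton in the ring is thus boosted to roughly 7,000 times its rest energy and circulates within a few meters per second of the speed of light, which is what makes the collisions energetic enough to probe the basic constituents of matter.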

    Over 600 institutes and universities around the world use CERN’s facilities. Gardner and Osojnak describe their work as part of Penn’s team.

    “What I do right now is mostly instrumentation work. It’s hands on, dealing with electronics and writing what we call low-level code, which just means that the code that we write is meant to interact with electronics and hardware,” says Gardner. “This is more along the lines of the kind of stuff you might study in electrical engineering. Most of us here learn enough of it to get by from research experience.”

    “I work on the transition radiation tracker of ATLAS,” says Osojnak. “That involves a lot of time in the control room, which is really exciting, especially since the start of Run 3 last week. I didn’t get to be in the actual control room for the first beams of Run 3, but I got to be in one of the other ATLAS buildings with a bunch of people watching it occur and cheering with everyone, which was really fun. The other half of my time is dedicated to working on a supersymmetry analysis.”

    “I always say that what I’m doing is kind of like looking for a needle in a haystack, but not even knowing if there is a needle at all,” explains Osojnak. “Not everything matches up exactly as we think that it should if the standard model was the end of the story. So, one way that it could make sense is if every particle had basically a mirror image particle of itself and the standard model was doubled. That’s what supersymmetry is. But there are other options. It could be that instead of having this mirror image supersymmetry, there could be a mirror image with a little crack in the mirror, and that might be the missing piece. But then that begs the question, ‘How specific do we go?’ If it’s a broken symmetry, maybe it’s just chaos and there is a multiverse theory and this supersymmetry is just a garbage theory. The philosophical implications of it are interesting.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Academic life at University of Pennsylvania is unparalleled, with 100 countries and every U.S. state represented in one of the Ivy League’s most diverse student bodies. Consistently ranked among the top 10 universities in the country, Penn enrolls 10,000 undergraduate students and welcomes an additional 10,000 students to our world-renowned graduate and professional schools.

    Penn’s award-winning educators and scholars encourage students to pursue inquiry and discovery, follow their passions, and address the world’s most challenging problems through an interdisciplinary approach.

    The University of Pennsylvania is a private Ivy League research university in Philadelphia, Pennsylvania. The university claims a founding date of 1740 and is one of the nine colonial colleges chartered prior to the U.S. Declaration of Independence. Benjamin Franklin, Penn’s founder and first president, advocated an educational program that trained leaders in commerce, government, and public service, similar to a modern liberal arts curriculum.

    Penn has four undergraduate schools as well as twelve graduate and professional schools. Schools enrolling undergraduates include the College of Arts and Sciences; the School of Engineering and Applied Science; the Wharton School; and the School of Nursing. Penn’s “One University Policy” allows students to enroll in classes in any of Penn’s twelve schools. Among its highly ranked graduate and professional schools are a law school whose first professor wrote the first draft of the United States Constitution, the first school of medicine in North America (Perelman School of Medicine, 1765), and the first collegiate business school (Wharton School, 1881).

    Penn is also home to the first “student union” building and organization (Houston Hall, 1896), the first Catholic student club in North America (Newman Center, 1893), the first double-decker college football stadium (Franklin Field, 1924 when second deck was constructed), and Morris Arboretum, the official arboretum of the Commonwealth of Pennsylvania. The first general-purpose electronic computer (ENIAC) was developed at Penn and formally dedicated in 1946. In 2019, the university had an endowment of $14.65 billion, the sixth-largest endowment of all universities in the United States, as well as a research budget of $1.02 billion. The university’s athletics program, the Quakers, fields varsity teams in 33 sports as a member of the NCAA Division I Ivy League conference.

    As of 2018, distinguished alumni and/or Trustees include three U.S. Supreme Court justices; 32 U.S. senators; 46 U.S. governors; 163 members of the U.S. House of Representatives; eight signers of the Declaration of Independence and seven signers of the U.S. Constitution (four of whom signed both representing two-thirds of the six people who signed both); 24 members of the Continental Congress; 14 foreign heads of state and two presidents of the United States, including Donald Trump. As of October 2019, 36 Nobel laureates; 80 members of the American Academy of Arts and Sciences; 64 billionaires; 29 Rhodes Scholars; 15 Marshall Scholars and 16 Pulitzer Prize winners have been affiliated with the university.

    History

    The University of Pennsylvania considers itself the fourth-oldest institution of higher education in the United States, though this is contested by Princeton University and Columbia University. The university also considers itself the first university in the United States with both undergraduate and graduate studies.

    In 1740, a group of Philadelphians joined together to erect a great preaching hall for the traveling evangelist George Whitefield, who toured the American colonies delivering open-air sermons. The building was designed and built by Edmund Woolley and was the largest building in the city at the time, drawing thousands of people the first time a sermon was preached in it. It was initially planned to serve as a charity school as well, but a lack of funds forced plans for the chapel and school to be suspended. According to Franklin’s autobiography, it was in 1743 that he first had the idea to establish an academy, “thinking the Rev. Richard Peters a fit person to superintend such an institution”. However, Peters declined a casual inquiry from Franklin and nothing further was done for another six years. In the fall of 1749, now more eager to create a school to educate future generations, Benjamin Franklin circulated a pamphlet titled Proposals Relating to the Education of Youth in Pensilvania, his vision for what he called a “Public Academy of Philadelphia”. Unlike the other colonial colleges that existed in 1749—Harvard University, William & Mary, Yale University, and The College of New Jersey—Franklin’s new school would not focus merely on education for the clergy. He advocated an innovative concept of higher education, one which would teach both the ornamental knowledge of the arts and the practical skills necessary for making a living and doing public service. The proposed program of study could have become the nation’s first modern liberal arts curriculum, although it was never implemented because Anglican priest William Smith (1727-1803), who became the first provost, and other trustees strongly preferred the traditional curriculum.

    Franklin assembled a board of trustees from among the leading citizens of Philadelphia, the first such non-sectarian board in America. At the first meeting of the 24 members of the board of trustees on November 13, 1749, the issue of where to locate the school was a prime concern. Although a lot across Sixth Street from the old Pennsylvania State House (later renamed and famously known since 1776 as “Independence Hall”), was offered without cost by James Logan, its owner, the trustees realized that the building erected in 1740, which was still vacant, would be an even better site. The original sponsors of the dormant building still owed considerable construction debts and asked Franklin’s group to assume their debts and, accordingly, their inactive trusts. On February 1, 1750, the new board took over the building and trusts of the old board. On August 13, 1751, the “Academy of Philadelphia”, using the great hall at 4th and Arch Streets, took in its first secondary students. A charity school also was chartered on July 13, 1753 by the intentions of the original “New Building” donors, although it lasted only a few years. On June 16, 1755, the “College of Philadelphia” was chartered, paving the way for the addition of undergraduate instruction. All three schools shared the same board of trustees and were considered to be part of the same institution. The first commencement exercises were held on May 17, 1757.

    The institution of higher learning was known as the College of Philadelphia from 1755 to 1779. In 1779, not trusting then-provost the Reverend William Smith’s “Loyalist” tendencies, the revolutionary State Legislature created a University of the State of Pennsylvania. The result was a schism, with Smith continuing to operate an attenuated version of the College of Philadelphia. In 1791, the legislature issued a new charter, merging the two institutions into a new University of Pennsylvania with twelve men from each institution on the new board of trustees.

    Penn has three claims to being the first university in the United States, according to university archives director Mark Frazier Lloyd: the 1765 founding of the first medical school in America made Penn the first institution to offer both “undergraduate” and professional education; the 1779 charter made it the first American institution of higher learning to take the name of “University”; and, whereas existing colleges had been established as seminaries, Penn was founded as a non-sectarian institution (although, as detailed earlier, Penn adopted a traditional seminary curriculum as well).

    After being located in downtown Philadelphia for more than a century, the campus was moved across the Schuylkill River to property purchased from the Blockley Almshouse in West Philadelphia in 1872, where it has since remained in an area now known as University City. Although Penn began operating as an academy or secondary school in 1751 and obtained its collegiate charter in 1755, it initially designated 1750 as its founding date; this is the year that appears on the first iteration of the university seal. Sometime later in its early history, Penn began to consider 1749 as its founding date and this year was referenced for over a century, including at the centennial celebration in 1849. In 1899, the board of trustees voted to adjust the founding date earlier again, this time to 1740, the date of “the creation of the earliest of the many educational trusts the University has taken upon itself”. The board of trustees voted in response to a three-year campaign by Penn’s General Alumni Society to retroactively revise the university’s founding date to appear older than Princeton University, which had been chartered in 1746.

    Research, innovations and discoveries

    Penn is classified as an “R1” doctoral university: “Highest research activity.” Its economic impact on the Commonwealth of Pennsylvania for 2015 amounted to $14.3 billion. Penn’s research expenditures in the 2018 fiscal year were $1.442 billion, the fourth largest in the U.S. In fiscal year 2019 Penn received $582.3 million in funding from the National Institutes of Health.

    In line with its well-known interdisciplinary tradition, Penn’s research centers often span two or more disciplines. In the 2010–2011 academic year alone, five interdisciplinary research centers were created or substantially expanded; these include the Center for Health-care Financing; the Center for Global Women’s Health at the Nursing School; the $13 million Morris Arboretum’s Horticulture Center; the $15 million Jay H. Baker Retailing Center at Wharton; and the $13 million Translational Research Center at Penn Medicine. With these additions, Penn now counts 165 research centers hosting a research community of over 4,300 faculty and over 1,100 postdoctoral fellows, 5,500 academic support staff and graduate student trainees. To further assist the advancement of interdisciplinary research President Amy Gutmann established the “Penn Integrates Knowledge” title awarded to selected Penn professors “whose research and teaching exemplify the integration of knowledge”. These professors hold endowed professorships and joint appointments between Penn’s schools.

    Penn is also among the most prolific producers of doctoral students. With 487 PhDs awarded in 2009, Penn ranks third in the Ivy League, only behind Columbia University and Cornell University (Harvard University did not report data). It also has one of the highest numbers of post-doctoral appointees (933 in number for 2004–2007), ranking third in the Ivy League (behind Harvard and Yale University) and tenth nationally.

    In most disciplines Penn professors’ productivity is among the highest in the nation and first in the fields of epidemiology, business, communication studies, comparative literature, languages, information science, criminal justice and criminology, social sciences and sociology. According to the National Research Council nearly three-quarters of Penn’s 41 assessed programs were placed in ranges including the top 10 rankings in their fields, with more than half of these in ranges including the top five rankings in these fields.

    Penn’s research tradition has historically been complemented by innovations that shaped higher education. In addition to establishing the first medical school, the first university teaching hospital, the first business school, and the first student union, Penn was also the cradle of other significant developments. In 1852, Penn Law was the first law school in the nation to publish a law journal still in existence (then called The American Law Register, now the Penn Law Review, one of the most cited law journals in the world). Under the deanship of William Draper Lewis, the law school was also one of the first schools to emphasize legal teaching by full-time professors instead of practitioners, a system that is still followed today. The Wharton School was home to several pioneering developments in business education. It established the first research center in a business school in 1921 and the first center for entrepreneurship in 1973, and it regularly introduced novel curricula for which BusinessWeek wrote, “Wharton is on the crest of a wave of reinvention and change in management education”.

    Several major scientific discoveries have also taken place at Penn. The university is probably best known as the place where the first general-purpose electronic computer (ENIAC) was born in 1946 at the Moore School of Electrical Engineering.

    ENIAC UPenn

    It was here also where the world’s first spelling and grammar checkers were created, as well as the popular COBOL programming language. Penn can also boast some of the most important discoveries in the field of medicine. The dialysis machine used as an artificial replacement for lost kidney function was conceived and devised out of a pressure cooker by William Inouye while he was still a student at Penn Med; the Rubella and Hepatitis B vaccines were developed at Penn; the discovery of cancer’s link with genes; cognitive therapy; Retin-A (the cream used to treat acne), Resistin; the Philadelphia gene (linked to chronic myelogenous leukemia) and the technology behind PET Scans were all discovered by Penn Med researchers. More recent gene research has led to the discovery of the genes for fragile X syndrome, the most common form of inherited mental retardation; spinal and bulbar muscular atrophy, a disorder marked by progressive muscle wasting; and Charcot–Marie–Tooth disease, a progressive neurodegenerative disease that affects the hands, feet and limbs.

    Conductive polymers were also developed at Penn by Alan J. Heeger, Alan MacDiarmid and Hideki Shirakawa, an invention that earned them the Nobel Prize in Chemistry. On faculty since 1965, Ralph L. Brinster developed the scientific basis for in vitro fertilization and the transgenic mouse at Penn and was awarded the National Medal of Science in 2010. The theory of superconductivity was also partly developed at Penn, by then-faculty member John Robert Schrieffer (along with John Bardeen and Leon Cooper). The university has also contributed major advancements in the fields of economics and management. Among the many discoveries are conjoint analysis, widely used as a predictive tool especially in market research; Simon Kuznets’s method of measuring Gross National Product; the Penn effect (the observation that consumer price levels in richer countries are systematically higher than in poorer ones) and the “Wharton Model” developed by Nobel laureate Lawrence Klein to measure and forecast economic activity. The idea behind Health Maintenance Organizations also belonged to Penn professor Robert Eilers, who put it into practice during then-President Nixon’s health reform in the 1970s.

    International partnerships

    Students can study abroad for a semester or a year at partner institutions such as the London School of Economics (UK), University of Barcelona [Universitat de Barcelona] (ES), Paris Institute of Political Studies [Institut d’études politiques de Paris] (FR), University of Queensland (AU), University College London (UK), King’s College London (UK), Hebrew University of Jerusalem (IL) and University of Warwick (UK).

     
  • richardmitnick 11:07 am on September 29, 2022 Permalink | Reply
    Tags: "Optimizing CLIC for reducing the electricity consumption at machine and laboratory level", , , , Electron-positron linear colliders are currently being studied as potential future Higgs-factories., , International Linear Collider (ILC) in Japan, , , Physics, The Compact Linear Collider (CLIC) at CERN   

    From CERN (CH) Accelerating News : “Optimizing CLIC for reducing the electricity consumption at machine and laboratory level” 

    From CERN (CH) Accelerating News

    9.19.22
    Steinar Stapnes
    Alexej Grudiev

    Optimized system designs for power efficiency, high efficiency klystrons, permanent magnets, renewable power… The linear collider projects are working to address power efficiency and reduce the environmental impact of the facilities.

    CLIC accelerator structures optimised for RF power efficiency under test (Image: CERN)

    Electron-positron linear colliders are currently being studied as potential future Higgs-factories. The two most mature studies are for the International Linear Collider (ILC) in Japan, and the Compact Linear Collider (CLIC) at CERN, Switzerland.

    Linear colliders rely on low-emittance, high-intensity beams created in damping rings and ultimately focused to the nanometre level at the collision point.

    The current volatility in energy prices underlines the importance of reducing the power needed to operate future facilities. Both linear collider projects, which collaborate in many areas, have extensively studied novel design and technology solutions to address power efficiency and reduce the environmental impact of the facilities. These sustainability considerations, in addition to the more traditional cost concerns and the need to develop core technologies, are today primary R&D drivers for the projects. These studies have recently been summarized in a contribution [1] to the International Atomic Energy Agency (IAEA) “Conference on Accelerators for Research and Development: from good practices towards socioeconomics impact”.

    This article briefly summarizes the studies performed and ongoing within the CLIC collaboration. The CLIC RF technology is based on normal-conducting 12 GHz accelerating structures. The initial 11.5 km stage provides collisions at 380 GeV at a luminosity of 2.3 × 10³⁴ cm⁻²s⁻¹. CLIC can be upgraded in energy and luminosity as part of a longer-term electron-positron collider programme.

    Concerning energy consumption, the CLIC power consumption has been estimated at 110 MW at 380 GeV [2]. Turning these power numbers into yearly energy consumption gives estimates of around 600 GWh. For reference, CERN uses around 1.2 TWh of electricity yearly. The initial-stage CLIC numbers are considerably lower than earlier estimates, which were largely based on scaling from the 3 TeV machine studied for the Conceptual Design Report (CDR) in 2012. The reduction is around a factor of two, a fraction of which comes from the straightforward scaling from the 500 GeV of the CDR down to the 380 GeV adopted for Higgs and top physics.
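    The step from 110 MW of drive power to roughly 600 GWh per year is simple arithmetic once an operating year is assumed. A minimal sketch, assuming a CERN-style physics year of about 5,500 hours (the hours are an assumption for illustration, not a figure from the article):

```python
# Rough conversion of CLIC drive power to annual energy consumption.
# The 110 MW figure is from the article; the operating hours are an
# assumed CERN-style physics year, not a number given in the text.
power_mw = 110          # estimated CLIC power consumption at 380 GeV
run_hours = 5_500       # assumed hours of beam operation per year

annual_energy_gwh = power_mw * run_hours / 1_000  # MWh -> GWh
print(f"{annual_energy_gwh:.0f} GWh per year")    # of order 600 GWh, matching the estimate
```

Read the other way, the same arithmetic shows that the quoted ~600 GWh corresponds to roughly half of CERN's current 1.2 TWh yearly consumption.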

    To achieve the reduced numbers several dedicated studies have been conducted to control and optimise the power consumption, in parallel with studies considering the environmental impact of the facilities in a wider sense. Many of these studies are widely applicable and generally relevant for future accelerator facilities. Among the studies carried out are:

    The designs of CLIC, including key performance parameters such as accelerating gradients, pulse lengths, bunch charges and luminosities, have been optimised for cost and, increasingly, for reduced power consumption. The parameter sets giving the lowest cost and power for a given luminosity have been identified and retained as the baseline.
    Technical developments and studies targeting reduced power consumption at the system level; primary examples are RF system design optimisation, the development of high-efficiency klystrons [3], and studies of permanent magnets for damping rings and linacs [4].
    The possibility of exploiting the fact that linear colliders are single-pass machines, i.e. the beams and hence the power are needed “shot by shot”, possibly allowing operation in daily or weekly time windows when power is available in abundance from suppliers and costs are reduced [5]. Seasonal operation is already being used for energy-cost reasons.
    Estimating the renewable power that can be made available for running the colliders by investing, for example, 10% of the overall construction costs in solar and wind energy capacity [5], again profiting from the fact that single-pass colliders can quickly adapt to changes in the energy output of such sources.
    Technical solutions for recovering energy losses in all parts of the accelerator, to be reused for acceleration and/or in the local area (homes, industry) near the facility.
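    The first item above, selecting the parameter set with the lowest cost and power for a given luminosity, can be illustrated with a toy selection over candidate designs. All numbers below are invented for illustration and do not come from the CLIC studies:

```python
# Toy illustration of baseline selection: among candidate parameter sets
# that reach the target luminosity, keep the one with the lowest combined
# cost-and-power score. All numbers are invented for illustration only.
candidates = [
    # gradient in MV/m, luminosity in 1e34 cm^-2 s^-1, power in MW, cost in arbitrary units
    {"gradient": 72,  "lumi": 1.5, "power": 105, "cost": 5.9},
    {"gradient": 72,  "lumi": 2.3, "power": 110, "cost": 6.0},
    {"gradient": 100, "lumi": 2.3, "power": 130, "cost": 5.6},
]

target_lumi = 2.3
feasible = [c for c in candidates if c["lumi"] >= target_lumi]

# Equal weighting of normalised power and cost; the weighting is arbitrary here,
# whereas the real studies use detailed cost and power models.
baseline = min(feasible, key=lambda c: c["power"] / 110 + c["cost"] / 6.0)
print(baseline["gradient"], baseline["power"])
```

The point of the sketch is only the structure of the trade-off: a higher gradient shortens (and cheapens) the machine but raises the RF power bill, so the optimum is a compromise rather than the maximum gradient.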

    In many cases the studies mentioned are still ongoing and further work is needed. For CLIC, these studies will be included in the planned Project Readiness Report for the next European Strategy Update. Among the studies planned is an analysis of the start-to-end environmental impact of CLIC, including its carbon footprint. While one can expect that energy production in a decade or two will be largely carbon-free, reducing the operational impact, the raw materials used for the civil engineering and the accelerator, and their processing, will need to be carefully analysed. Decommissioning will also be considered. The power and energy use of CLIC at 1.5 and 3 TeV will be revised, including the savings mentioned above. Current estimates date back to the CDR in 2012 and are by now outdated and too high.

    As mentioned initially, many of these studies are equally applicable to the ILC, and many will be done together with the ILC team. As the ILC is a greenfield installation, there are interesting possibilities to address sustainability from the very start of the facility.
    ___________________________________________________________
    [1] List B. et al, Sustainability studies for Linear Colliders: https://conferences.iaea.org/event/264/contributions/21011/.

    [2] The CLIC project, Snowmass White Paper, https://arxiv.org/abs/2203.09186.

    [3] Cai J. and Syratchev I., Modelling and Technical Design Study of Two-Stage Multibeam Klystron for CLIC, doi: 10.1109/TED.2020.3000191.

    [4] Shepherd B., Permanent Magnets for Accelerators, https://jacow.org/ipac2020/papers/moviro05.pdf.

    [5] Fraunhofer CLIC power/energy study: https://edms.cern.ch/document/2065162/1.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    CERN (CH) Accelerating News is a quarterly online publication for the accelerator community.
    ISSN: 2296-6536

    The publication showcases news and results from the biggest accelerator research and development projects such as ARIES, HL-LHC, TIARA, FCC study, EuroCirCol, EUPRAXIA, EASITrain as well as interesting stories on other accelerator applications. The newsletter also collects upcoming accelerator research conferences and events.

    Accelerating News is published 4 times a year, in mid March, mid June, mid September and mid December.

    You can read Accelerating News via the homepage http://www.acceleratingnews.eu or by email.

    To subscribe to Accelerating News, enter your email in the “Subscribe to our newsletter” box in the footer.

    History

    Accelerating News evolved from the EuCARD quarterly project newsletter (see past issues), which was first published in June 2009 to a subscription list of approximately 200. Initiated by EuCARD and in collaboration with additional FP7 co-funded projects, the first edition of Accelerating News was published in April 2012 to an initial distribution list of about 800 subscribers. Currently more than 1750 members receive the quarterly issues.

     
  • richardmitnick 10:38 am on September 29, 2022 Permalink | Reply
    Tags: "The miniature accelerator:: dream or reality?", , , , , , , Physics, To look into the atomic and subatomic structure of materials and cells future industry will need ever-smaller accelerators.   

    From CERN (CH) Accelerating News : “The miniature accelerator: dream or reality?” 

    From CERN (CH) Accelerating News

    9.26.22
    Maurizio Vretenar

    To look into the atomic and subatomic structure of materials and cells, future industry will need ever-smaller accelerators.

    The radio-frequency quadrupole of the MACHINA project under development (Image: CERN)

    Already now, the large majority of the almost 40,000 particle accelerators in operation worldwide are used in industry and medicine, and this number is rapidly increasing [1]. Accelerator applications are progressing fast, and particle accelerators have the potential to become a crucial tool in the ongoing fourth industrial revolution, making accessible to industry and medicine processes that allow a direct interaction with the atomic and subatomic structure of materials and cells. But to succeed, this “industrial revolution” needs small accelerators that can easily fit into a medical or industrial environment and are easy to operate, with moderate energy requirements and minimal radiation concerns. In short, what is needed are “miniature accelerators”, for the moment still a technological dream, although several accelerator teams are heading in this direction. Every technology starts from a dream, but how far are we from realizing this one?

    The first important consideration is that size is not everything. The basic parameters of an accelerator can be divided into two categories: those defining the “performance” for the final users (type of particle, energy, beam current, beam brightness, reliability) and those defining the “impact” on the operating environment (mains power, power efficiency, radiation emission and activation, dimensions, weight, construction and operation costs). The real challenge for the miniature accelerator consists in maximizing the first set of parameters while minimizing the second. Every “miniature” device must still provide a minimum performance that makes the system attractive to its potential users.

    Lighter, smaller, cheaper: Compacting the convention

    Close-up of the Compact Linear Collider prototype, on which the electron FLASH design is based (Image: CERN)

    The first direction in which to reduce the size of our accelerators is “incremental” innovation: pushing to its limits the good old concept of radiofrequency (RF) acceleration that has driven the particles within our accelerators for almost 100 years.

    In the field of proton acceleration, this translates into miniaturising the traditional “workhorses” of low-energy acceleration: the Radio Frequency Quadrupole (RFQ) and the cyclotron. Several developments are under way towards high-frequency compact RFQs [2]; while the RFQ gradient can reach some 2-3 MeV/m, the main limitation comes from the small aperture and limited cooling capability, which set a limit on the average beam current. Very compact cyclotrons, in most cases superconducting, are also a popular trend [3]. Here the average current can be higher, and the final energy is of the order of 5 MeV/m² – taking the surface of the accelerator as a reference instead of its length! In terms of cost, in both cases the driver is the ancillary equipment: the RF generator for the RFQ, and the cryogenic system for the superconducting cyclotron.

    For electrons, instead, the driving incremental development has been the decade-long R&D work done by the CLIC team to push the gradient of X-band structures. Reaching some 100 MeV/m (corresponding to some 20 MeV/m²…), X-band technology is now proposed for compact Free Electron Lasers, X-ray sources, FLASH cancer treatment, etc. [4].
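    The gradients quoted above translate directly into machine size, since the active length needed is simply the energy gain divided by the gradient. A quick sketch with round numbers in the ranges mentioned in the article (the specific energy targets are illustrative assumptions):

```python
# Active accelerating length needed for a given energy gain at the
# gradients quoted in the article. Real machines add overhead for
# focusing, diagnostics and RF distribution, so total lengths are longer.
def active_length_m(energy_gain_mev, gradient_mev_per_m):
    return energy_gain_mev / gradient_mev_per_m

# An assumed 10 MeV proton RFQ at ~2.5 MeV/m vs. an assumed
# 100 MeV electron linac using 100 MeV/m X-band structures.
print(active_length_m(10, 2.5))    # 4.0 -> about 4 m of RFQ
print(active_length_m(100, 100))   # 1.0 -> about 1 m of X-band structure
```

The factor-of-forty difference in MeV per metre is what makes X-band electron machines so much more compact than proton linacs of comparable energy.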

    The CompactLight planned facility allows the production of X-rays up to 16 keV within about 400 m of length, including the experimental hall: half the length of equivalent facilities (Image: CompactLight).

    Accelerators in a shoe-box?

    A very popular alternative avenue for miniaturizing accelerators comes from “disruptive” innovation that might eventually replace RF technology entirely with lasers, which provide energy to the particles using either plasmas or dielectric structures for the energy conversion. Here the technological landscape is very wide, but the challenges are still huge. Much experimental work is going into laser-based acceleration of protons and ions to energies of some MeV, but critical issues that remain to be solved are beam quality and reproducibility. The energy is there, but the beam is still very far from what users need. Progress with electron acceleration is more promising, with the advantage for compact accelerators of being easily single-stage, free from the difficult problem of multi-stage acceleration that still has to be solved for high-energy machines.

    Three “accelerators on a chip” made of silicon are mounted on a clear base. A shoebox-sized particle accelerator being developed under a $13.5 million Moore Foundation grant would use a series of these “accelerators on a chip” to boost the energy of electrons (The DOE’s SLAC National Accelerator Laboratory).

    The way these projects are communicated is also very attractive, as for the “accelerator in a shoe-box”, which aims at accelerating electrons through a ridged silicon glass chip fed by a laser [4]. The project is producing its first results, but again the challenge is to push enough particles through such a miniaturised structure to be of some use. These developments often aim at medical applications as a first goal (such as an accelerator in a catheter!), but while it is true that medical applications don’t need high currents, it is also true that this is the domain with the most stringent requirements in terms of beam quality and stability.
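    For chip-based schemes the energy gain per structure is again gradient times interaction length, which makes clear why many stages (or very long chips) are needed before such devices become useful. The gradient and chip length below are assumptions in the range of published dielectric laser-acceleration demonstrations, not figures from this article:

```python
# Energy gain over a laser-driven chip structure: gain = gradient x length.
# Both numbers are assumptions typical of dielectric laser-acceleration
# demonstrations, not figures quoted in this article.
gradient_mev_per_m = 300     # assumed accelerating gradient on the chip
chip_length_m = 0.5e-3       # assumed 0.5 mm interaction length

energy_gain_mev = gradient_mev_per_m * chip_length_m
print(f"{energy_gain_mev * 1000:.0f} keV per chip")
```

Even at gradients well beyond conventional RF structures, a sub-millimetre chip delivers only a fraction of an MeV, so staging many structures remains the central engineering problem.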

    In conclusion, accelerator science is progressing, but we are still far from having an accelerator that can fit in every small workshop and every medical ward. For the moment, dreams remain dreams, but there is some rapidly progressing reality in them. While the real “miniature” is still far away, compact accelerators tailored to specific uses are within reach, and may benefit from targeted R&D (additive manufacturing, high RF frequencies, solid-state RF technology) in the incremental direction, and from the fast development of powerful lasers and miniaturised chips in the disruptive direction. New accelerator applications are appearing, in particular in medicine, industry and the environment, opening new potential users and commercial markets for the technologies that will come out of the quest for the miniature accelerator!
    __________________________________________________________________________
    [1] See for example the comprehensive study “Applications of Particle Accelerators in Europe” published by the EuCARD2 project, available at http://apae.ific.uv.es/apae/wp-content/uploads/2015/04/EuCARD_Applications-of-Accelerators-2017.pdf

    [2] See for example small-scale accelerators for cancer treatment and for the study of heritage artworks.

    [3] AMIT project of CIEMAT.

    [4] See for example this project to mount “accelerators on a chip” to boost the energy of electrons.

    See the full article here.



     
  • richardmitnick 8:47 pm on September 28, 2022 Permalink | Reply
    Tags: "Exploring a new algorithm for reconstructing particles", , , , , , , , , Physics   

    From The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][ Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH)[CERN] Via phys.org : “Exploring a new algorithm for reconstructing particles” 


    Cern New Particle Event

    From The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][ Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH)[CERN]

    Via

    phys.org

    9.28.22

    Fig. 1: Schematic representation of the right-handed Cartesian coordinate system adopted to describe the detector. Credit: The European Physical Journal C (2022).

    Fig. 2: Left: schematic representation of the detector longitudinal sampling structure. Right: transverse view of the last active layer. Different colors represent different materials: copper (orange), stainless steel and lead (gray), air (white) and active sensors made of silicon (black).

    There are more instructive images in the science paper.

    A team of researchers from CERN, Massachusetts Institute of Technology, and Staffordshire University have implemented a new algorithm for reconstructing particles at the Large Hadron Collider.

    The Large Hadron Collider (LHC) is the most powerful particle accelerator ever built, sitting in a tunnel 100 meters underground at CERN, the European Organization for Nuclear Research, near Geneva in Switzerland. It is the site of long-running experiments that enable physicists worldwide to learn more about the nature of the universe.

    The project is part of the Compact Muon Solenoid (CMS) experiment [below], one of seven installed experiments that use detectors to analyze the particles produced by collisions in the accelerator.

    The subject of a new academic paper published in European Physical Journal C [below], the project has been carried out ahead of the high luminosity upgrade of the Large Hadron Collider.

    The High Luminosity Large Hadron Collider (HL-LHC) project aims to crank up the performance of the LHC in order to increase the potential for discoveries after 2029. The HL-LHC will increase the number of proton-proton interactions in an event from 40 to 200.

    Professor Raheel Nawaz, Pro Vice-Chancellor for Digital Transformation, at Staffordshire University, has supervised the research. He explained that “limiting the increase of computing resource consumption at large pileups is a necessary step for the success of the HL-LHC physics program and we are advocating the use of modern machine learning techniques to perform particle reconstruction as a possible solution to this problem.”

    He added that “this project has been both a joy and a privilege to work on and is likely to dictate the future direction of research on particle reconstruction by using a more advanced AI-based solution.”

    Dr. Jan Kieseler from the Experimental Physics Department at CERN added that “this is the first single-shot reconstruction of about 1,000 particles in an unprecedentedly challenging environment with 200 simultaneous interactions in each proton-proton collision. Showing that this novel approach, combining dedicated graph neural network layers (GravNet) and training methods (Object Condensation), can be extended to such challenging tasks while staying within resource constraints represents an important milestone towards future particle reconstruction.”
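    The Object Condensation training method mentioned here assigns each detector hit a “condensation charge” derived from a learned confidence β in (0, 1), q = arctanh²(β) + q_min, so that high-confidence hits dominate the attractive and repulsive clustering potentials. A minimal numpy sketch of that charge computation, following the published formulation rather than the actual CMS code:

```python
import numpy as np

# Condensation charge from the Object Condensation formulation:
# q_i = arctanh(beta_i)^2 + q_min, with beta_i in (0, 1) a learned
# per-hit confidence. This is a schematic of the published formula,
# not the CMS reconstruction code.
def condensation_charge(beta, q_min=0.1):
    beta = np.clip(beta, 0.0, 1.0 - 1e-4)  # arctanh diverges as beta -> 1
    return np.arctanh(beta) ** 2 + q_min

beta = np.array([0.1, 0.5, 0.9, 0.99])
q = condensation_charge(beta)
print(q)  # the charge grows steeply as the confidence approaches 1
```

Because the charge diverges as β approaches 1, the network is pushed to nominate one clearly dominant “condensation point” per particle, which is what makes single-shot reconstruction of many particles at once possible.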

    Shah Rukh Qasim, leading this project as part of his Ph.D. at CERN and Manchester Metropolitan University, says that “the amount of progress we have made on this project in the last three years is truly remarkable. It was hard to imagine we would reach this milestone when we started.”

    Professor Martin Jones, Vice-Chancellor and Chief Executive at Staffordshire University, added that “CERN is one of the world’s most respected centers for scientific research and I congratulate the researchers on this project which is effectively paving the way for even greater discoveries in years to come.”

    “Artificial Intelligence is continuously evolving to benefit many different industries and to know that academics at Staffordshire University and elsewhere are contributing to the research behind such advancements is both exciting and significant.”

    Science paper:
    European Physical Journal C

    See the full article here.



    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier


    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    ALICE

    CMS

    LHCb

    LHC

    CERN map.

    3D cut of the LHC dipole; CERN LHC underground tunnel and tube.

    CERN SixTrack LHC particles.

    OTHER PROJECTS AT CERN

    AEGIS; ALPHA Antimatter Factory; ALPHA-g Detector; AMS; ASACUSA; ATRAP; Antiproton Decelerator; BASE (Baryon Antibaryon Symmetry Experiment); CAST Axion Solar Telescope; CLOUD; COMPASS; CRIS experiment; DIRAC; FASER experiment.

    CERN FASER is designed to study the interactions of high-energy neutrinos and search for new as-yet-undiscovered light and weakly interacting particles. Such particles are dominantly produced along the beam collision axis and may be long-lived particles, travelling hundreds of metres before decaying. The existence of such new particles is predicted by many models beyond the Standard Model that attempt to solve some of the biggest puzzles in physics, such as the nature of dark matter and the origin of neutrino masses.

    The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][ Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH)[CERN] GBAR.

    The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][ Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH)[CERN] ISOLDE Looking down into the ISOLDE experimental hall.

    The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][ Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH)[CERN] LHCf.

    The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][ Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH)[CERN] The MoEDAL experiment- a new light on the high-energy frontier.

    The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][ Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH)[CERN] NA62.

    The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][ Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH)[CERN] NA64.

    The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][ Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH)[CERN] n_TOF.

    The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][ Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH)[CERN] TOTEM.

    The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][ Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH)[CERN] UA9.

    The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][ Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH)[CERN] The SPS’s new RF system.

    The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][ Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH)[CERN] Proto Dune.

    The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][ Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH)[CERN] HiRadMat-High Radiation to Materials.

    1
    The SND@LHC experiment consists of an emulsion/tungsten target for neutrinos (yellow) interleaved with electronic tracking devices (grey), followed downstream by a detector (brown) to identify muons and measure the energy of the neutrinos. (Image: Antonio Crupano/SND@LHC)

     