Tagged: Mathematics

  • richardmitnick 12:13 pm on July 20, 2021
    Tags: "A Video Tour of the Standard Model", Mathematics

    From Quanta Magazine (US) via Symmetry: “A Video Tour of the Standard Model”

    July 16, 2021
    Kevin Hartnett

    Standard Model of Particle Physics. Credit: Quanta Magazine.


    The Standard Model: The Most Successful Scientific Theory Ever.
    Video: The Standard Model of particle physics is the most successful scientific theory of all time. In this explainer, Cambridge University physicist David Tong recreates the model, piece by piece, to provide some intuition for how the fundamental building blocks of our universe fit together.
    Emily Buder/Quanta Magazine.
    Kristina Armitage and Rui Braz for Quanta Magazine.

    Recently, Quanta has explored the collaboration between physics and mathematics on one of the most important ideas in science: quantum field theory. The basic objects of a quantum field theory are quantum fields, which spread across the universe and, through their fluctuations, give rise to the most fundamental phenomena in the physical world. We’ve emphasized the unfinished business in both physics and mathematics — the ways in which physicists still don’t fully understand a theory they wield so effectively, and the grand rewards that await mathematicians if they can provide a full description of what quantum field theory actually is.

    This incompleteness, however, does not mean the work has been unsatisfying so far.

    For our final entry in this “Math Meets QFT” series, we’re exploring the most prominent quantum field theory of them all: the Standard Model. As the University of Cambridge (UK) physicist David Tong puts it in the accompanying video, it’s “the most successful scientific theory of all time” despite being saddled with a “rubbish name.”

    The Standard Model describes physics in the three spatial dimensions and one time dimension of our universe. It captures the interplay between a dozen quantum fields representing fundamental particles and a handful of additional fields representing forces. The Standard Model ties them all together into a single equation that scientists have confirmed countless times, often with astonishing accuracy. In the video, Professor Tong walks us through that equation term by term, introducing us to all the pieces of the theory and how they fit together. The Standard Model is complicated, but it is easier to work with than many other quantum field theories. That’s because sometimes the fields of the Standard Model interact with each other quite feebly, as writer Charlie Wood described in the second piece in our series.
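
    For reference, the equation Tong unpacks is usually written in a compressed, schematic form. The version below is one commonly quoted schematic of the Standard Model Lagrangian (a sketch of the standard textbook shorthand, not a transcription of the video’s exact notation), with each term corresponding to a piece of the theory:

    $$
    \mathcal{L} \;=\; -\tfrac{1}{4}\,F_{\mu\nu}F^{\mu\nu}
    \;+\; i\,\bar{\psi}\,\gamma^{\mu}D_{\mu}\,\psi
    \;+\; \bar{\psi}_i\, y_{ij}\, \psi_j\, \phi \;+\; \mathrm{h.c.}
    \;+\; \left|D_{\mu}\phi\right|^{2}
    \;-\; V(\phi)
    $$

    Read term by term: the first describes the force-carrying gauge fields, the second the kinetic terms and gauge interactions of the matter (fermion) fields, the third the Yukawa couplings through which the Higgs field gives fermions their masses, and the last two the Higgs field’s own dynamics and potential.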

    From Quanta Magazine: “Mathematicians Prove 2D Version of Quantum Gravity Really Works”

    The Standard Model has been a boon for physics, but it’s also had a bit of a hangover effect. It’s been extraordinarily effective at explaining experiments we can do here on Earth, but it can’t account for several major features of the wider universe, including the action of gravity at short distances and the presence of dark matter and dark energy. Physicists would like to move beyond the Standard Model to an even more encompassing physical theory. But, as the physicist Davide Gaiotto put it in the first piece in our series, the glow of the Standard Model is so strong that it’s hard to see beyond it.

    From Quanta Magazine: “The Mystery at the Heart of Physics That Only Math Can Solve”

    And that, maybe, is where math comes in. Mathematicians will have to develop a fresh perspective on quantum field theory if they want to understand it in a self-consistent and rigorous way. There’s reason to hope that this new vantage will resolve many of the biggest open questions in physics.

    The process of bringing QFT into math may take some time — maybe even centuries, as the physicist Nathan Seiberg speculated in the third piece in our series — but it’s also already well underway. By now, math and quantum field theory have indisputably met. It remains to be seen what happens as they really get to know each other.

    From Quanta Magazine: “Nathan Seiberg on How Math Might Complete the Ultimate Physics Theory”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 11:09 am on July 1, 2021
    Tags: "The power of two", Ellen Zhong, Mathematics, Software called cryoDRGN

    From Massachusetts Institute of Technology (US): “The power of two”

    June 30, 2021
    Saima Sidik | Department of Biology

    Graduate student Ellen Zhong helped biologists and mathematicians reach across departmental lines to address a longstanding problem in electron microscopy.

    Ellen Zhong, a graduate student from the Computational and Systems Biology Program, is using a computational pattern-recognition tool called a neural network to study the shapes of molecular machines. Credit: Matthew Brown.

    MIT’s Hockfield Court is bordered on the west by the ultramodern Stata Center, with its reflective, silver alcoves that jut off at odd angles, and on the east by Building 68, which is a simple, window-lined, cement rectangle. At first glance, Bonnie Berger’s mathematics lab in the Stata Center and Joey Davis’s biology lab in Building 68 are as different as the buildings that house them. And yet, a recent collaboration between these two labs shows how their disciplines complement each other. The partnership started when Ellen Zhong, a graduate student from the Computational and Systems Biology (CSB) Program, decided to use a computational pattern-recognition tool called a neural network to study the shapes of molecular machines. Three years later, Zhong’s project is letting scientists see patterns that run beneath the surface of their data, and deepening their understanding of the molecules that shape life.

    Zhong’s work builds on a technique from the 1970s called cryo-electron microscopy (cryo-EM), which lets researchers take high-resolution images of frozen protein complexes. Over the past decade, better microscopes and cameras have led to a “resolution revolution” in cryo-EM that’s allowed scientists to see individual atoms within proteins. But, as good as these images are, they’re still only static snapshots. In reality, many of these molecular machines are constantly changing shape and composition as cells carry out their normal functions and adjust to new situations.

    Along with former Berger lab member Tristan Bepler, Zhong devised software called cryoDRGN. The tool uses neural nets to combine hundreds of thousands of cryo-EM images, and shows scientists the full range of three-dimensional conformations that protein complexes can take, letting them reconstruct the proteins’ motion as they carry out cellular functions. Understanding the range of shapes that protein complexes can take helps scientists develop drugs that block viruses from entering cells, study how pests kill crops, and even design custom proteins that can cure disease. Covid-19 vaccines, for example, work partly because they include a mutated version of the virus’s spike protein that’s stuck in its active conformation, so vaccinated people produce antibodies that block the virus from entering human cells. Scientists needed to understand the variety of shapes that spike proteins can take in order to figure out how to force spike into its active conformation.
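
    To make the general idea concrete, here is a minimal, purely illustrative PyTorch sketch of a variational-autoencoder-style model: an encoder maps each 2D particle image to a low-dimensional latent "conformation" variable, and a decoder maps that latent to a 3D density volume. All names, layer sizes, and architectural details below are hypothetical placeholders for illustration; they are not cryoDRGN’s actual code, architecture, or interface, and the real tool’s design differs substantially.

# Toy sketch (NOT cryoDRGN's real code or API): a VAE-style model that
# encodes 2D particle images into a low-dimensional "conformation" latent
# and decodes that latent into a 3D density volume. Sizes are illustrative.
import torch
import torch.nn as nn

IMG = 64    # assumed particle image size: IMG x IMG pixels
VOL = 32    # assumed reconstructed volume size: VOL^3 voxels
ZDIM = 8    # dimensionality of the conformational latent space

class ConformationVAE(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: image -> mean and log-variance of the latent conformation.
        self.encoder = nn.Sequential(
            nn.Flatten(),
            nn.Linear(IMG * IMG, 256), nn.ReLU(),
            nn.Linear(256, 2 * ZDIM),
        )
        # Decoder: latent conformation -> 3D density volume (flattened voxels).
        self.decoder = nn.Sequential(
            nn.Linear(ZDIM, 256), nn.ReLU(),
            nn.Linear(256, VOL ** 3),
        )

    def forward(self, images):
        stats = self.encoder(images)
        mu, logvar = stats[:, :ZDIM], stats[:, ZDIM:]
        # Reparameterization trick: sample one latent point per image.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        volumes = self.decoder(z).view(-1, VOL, VOL, VOL)
        return volumes, mu, logvar

# Usage: a batch of 4 fake particle images yields 4 candidate volumes,
# one per sampled point in the learned conformation space.
model = ConformationVAE()
fake_images = torch.randn(4, IMG, IMG)
vols, mu, logvar = model(fake_images)
print(vols.shape)  # torch.Size([4, 32, 32, 32])

    The design point the sketch tries to convey is that every image contributes to a shared, continuous latent space, so instead of sorting particles into a few discrete classes, the model can interpolate along that space to reconstruct a continuum of conformations.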

    Getting off the computer and into the lab

    Zhong’s interest in computational biology goes back to 2011 when, as a chemical engineering undergrad at the University of Virginia (US), she worked with Professor Michael Shirts to simulate how proteins fold and unfold. After college, Zhong took her skills to a company called D. E. Shaw Research, where, as a scientific programmer, she took a computational approach to studying how proteins interact with small-molecule drugs.

    “The research was very exciting,” Zhong says, “but all based on computer simulations. To really understand biological systems, you need to do experiments.”

    This goal of combining computation with experimentation motivated Zhong to join MIT’s CSB PhD program, where students often work with multiple supervisors to blend computational work with bench work. Zhong “rotated” in both the Davis and Berger labs, then decided to combine the Davis lab’s goal of understanding how protein complexes form with the Berger lab’s expertise in machine learning and algorithms. Davis was interested in building up the computational side of his lab, so he welcomed the opportunity to co-supervise a student with Berger, who has a long history of collaborating with biologists.

    Davis himself holds a dual bachelor’s degree in computer science and biological engineering, so he’s long believed in the power of combining complementary disciplines. “There are a lot of things you can learn about biology by looking in a microscope,” he says. “But as we start to ask more complicated questions about entire systems, we’re going to require computation to manage the high-dimensional data that come back.”


    Reconstructing Molecules in Motion.

    Before rotating in the Davis lab, Zhong had never performed bench work before — or even touched a pipette. She was fascinated to find how streamlined some very powerful molecular biology techniques can be. Still, Zhong realized that physical limitations mean that biology is much slower when it’s done at the bench instead of on a computer. “With computational research, you can automate experiments and run them super quickly, whereas in the wet lab, you only have two hands, so you can only do one experiment at a time,” she says.

    Zhong says that synergizing the two different cultures of the Davis and Berger labs is helping her become a well-rounded, adaptable scientist. Working around experimentalists in the Davis lab has shown her how much labor goes into experimental results, and also helped her to understand the hurdles that scientists face at the bench. In the Berger lab, she enjoys having coworkers who understand the challenges of computer programming.

    “The key challenge in collaborating across disciplines is understanding each other’s ‘languages,’” Berger says. “Students like Ellen are fortunate to be learning both biology and computing dialects simultaneously.”

    Bringing in the community

    Last spring revealed another reason for biologists to learn computational skills: these tools can be used anywhere there’s a computer and an internet connection. When the Covid-19 pandemic hit, Zhong’s colleagues in the Davis lab had to wind down their bench work for a few months, and many of them filled their time at home by using cryo-EM data that’s freely available online to help Zhong test her cryoDRGN software. The difficulty of understanding another discipline’s language quickly became apparent, and Zhong spent a lot of time teaching her colleagues to be programmers. Seeing the problems that nonprogrammers ran into when they used cryoDRGN was very informative, Zhong says, and helped her create a more user-friendly interface.

    Although the paper announcing cryoDRGN was just published in February, the tool created a stir as soon as Zhong posted her code online, many months prior. The cryoDRGN team thinks this is because leveraging knowledge from two disciplines let them visualize the full range of structures that protein complexes can have, and that’s something researchers have wanted to do for a long time. For example, the cryoDRGN team recently collaborated with researchers from Harvard and Washington universities to study locomotion of the single-celled organism Chlamydomonas reinhardtii. The mechanisms they uncovered could shed light on human health conditions, like male infertility, that arise when cells lose the ability to move. The team is also using cryoDRGN to study the structure of the SARS-CoV-2 spike protein, which could help scientists design treatments and vaccines to fight coronaviruses.

    Zhong, Berger, and Davis say they’re excited to continue using neural nets to improve cryo-EM analysis, and to extend their computational work to other aspects of biology. Davis cited mass spectrometry as “a ripe area to apply computation.” This technique can complement cryo-EM by showing researchers the identities of proteins, how many of them are bound together, and how cells have modified them.

    “Collaborations between disciplines are the future,” Berger says. “Researchers focused on a single discipline can take it only so far with existing techniques. Shining a different lens on the problem is how advances can be made.”

    Zhong says it’s not a bad way to spend a PhD, either. Asked what she’d say to incoming graduate students considering interdisciplinary projects, she says: “Definitely do it.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

    Massachusetts Institute of Technology (US) is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the Bates Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad and Whitehead Institutes.

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology (US) adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with Massachusetts Institute of Technology (US) . The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology (US) is a member of the Association of American Universities (AAU).

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia (US), wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but a combination with elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after Massachusetts Institute of Technology (US) was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed as the University of Massachusetts Amherst (US). In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    Massachusetts Institute of Technology (US) was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology (US) faculty and alumni rebuffed Harvard University (US) president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, the Massachusetts Institute of Technology (US) administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology (US) catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities (US) in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at Massachusetts Institute of Technology (US) that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    Massachusetts Institute of Technology (US)‘s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at Massachusetts Institute of Technology (US)’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, Massachusetts Institute of Technology (US) became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected Massachusetts Institute of Technology (US) profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of Massachusetts Institute of Technology (US) between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, Massachusetts Institute of Technology (US) no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and Massachusetts Institute of Technology (US)’s defense research. In this period, Massachusetts Institute of Technology (US)’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. Massachusetts Institute of Technology (US) ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT (US) Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology (US) students were sentenced to prison terms at this time, and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at Massachusetts Institute of Technology (US) over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, Massachusetts Institute of Technology (US)’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    Massachusetts Institute of Technology (US) has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology (US) classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    Massachusetts Institute of Technology (US) was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, Massachusetts Institute of Technology (US) launched OpenCourseWare to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, Massachusetts Institute of Technology (US) announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology (US) faculty adopted an open-access policy to make its scholarship publicly accessible online.

    Massachusetts Institute of Technology (US) has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology (US) community with thousands of police officers from the New England region and Canada. On November 25, 2013, Massachusetts Institute of Technology (US) announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of the Massachusetts Institute of Technology (US) community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced aLIGO (US) was designed and constructed by a team of scientists from California Institute of Technology (US), Massachusetts Institute of Technology (US), and industrial contractors, and funded by the National Science Foundation (US) .

    MIT/Caltech Advanced aLIGO.

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology (US) physicist Rainer Weiss won the Nobel Prize in physics in 2017. Weiss, who is also a Massachusetts Institute of Technology (US) graduate, designed the laser interferometric technique, which served as the essential blueprint for LIGO.

    The mission of Massachusetts Institute of Technology (US) is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the Massachusetts Institute of Technology (US) community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 7:30 am on June 27, 2021
    Tags: "Looking for similarities across complex systems", Dunkel’s work is planted in theory and numerical principles of geometry; mechanics; and pattern formation., How does the motion of individual cells give rise to the structure of biological tissue?, Jörn Dunkel uses the “common language” of math to bridge disparate phenomena from an embryo’s wrinkles to the twist of spaghetti., Mathematics

    From Massachusetts Institute of Technology (US): “Looking for similarities across complex systems”

    June 28, 2021
    Jennifer Chu

    Jörn Dunkel uses the “common language” of math to bridge disparate phenomena from an embryo’s wrinkles to the twist of spaghetti.

    MIT mathematician Jörn Dunkel looks for similarities across complex systems. Credit: M. Scott Brauer.

    How does the motion of individual cells give rise to the structure of biological tissue? How do an embryo’s wrinkles relate to an animal’s shape? And what does the stability of knots have to do with the way spaghetti breaks?

    These are some of the questions Jörn Dunkel has explored at MIT, through the lens of mathematics.

    “There are many problems where, if you look at them from the right way, you can treat them in a similar manner because they have a common structure at some abstract level,” says Dunkel, who received tenure in 2020 as an associate professor in the Department of Mathematics.

    Dunkel’s work is grounded in the theory and numerical principles of geometry, mechanics, and pattern formation. From this mathematical base, he has explored wide-ranging fields, looking for ways to bridge seemingly disparate systems through what he sees as the “common language” of math.

    “What’s helpful for me is to talk to people from many different fields,” Dunkel says. “This kicks me out of my comfort zone, and it’s almost like an algorithm for generating new ideas: Talk to people to get new perspectives, and then you try to combine that with what you know. And in this way you can make progress.”

    A system resettled

    Dunkel was born and raised in East Berlin, Germany, and vividly remembers the fall of the Berlin Wall, in 1989, as a time of both disruption and possibility.

    “It was a big, big change,” Dunkel recalls. “When my sister and I were little, our family couldn’t travel anywhere except for a small number of countries in the Eastern bloc. And when the wall came down, suddenly there were a lot of opportunities, and also uncertainties. There was more freedom to travel and see and learn different things. At the same time, people had to reestablish themselves, and many things were in limbo. You learned to adapt to a new system.”

    As the country’s schools restructured and settled into a unified, national education system, Dunkel was able for the first time to explore beyond East Berlin’s now-dissolved borders. After graduating from high school, he began to study at Humboldt University of Berlin [Humboldt-Universität zu Berlin] (DE). Dunkel’s undergraduate mentor, who worked in the area of active matter, introduced him to concepts of math and physics, and how to apply them to questions of biology, such as ways to describe how cells interact to give rise to macroscale tissues and whole organisms.

    “You start to see how things come together, and the way I was taught in those days had a big influence on how I think about systems today,” Dunkel says. “I was lucky many times along my path to have met people who helped teach and guide me.”

    Honest data

    After graduating with master’s degrees in math and physics, Dunkel dabbled briefly in astrophysics as a PhD student at the MPG Institute for Astrophysics [MPG Institut für Astrophysik](DE) before moving to the University of Augsburg [Universität Augsburg] (DE), where he joined a statistical physics group headed by Peter Hänggi. There, he studied thermostatistical concepts in special relativity, and also in Brownian motion (the random motion of discrete particles), looking for ways to connect large-scale transport phenomena to their microscopic, single-particle machinery.

    He took the mathematical tools he developed in his PhD work to Oxford University (UK), where he did a postdoc with a group working in theoretical physics and exploring ways to mathematically model the individual and collective dynamics of bacterial cells. From there, he moved to University of Cambridge (UK), where he joined an experimental biophysics group as the only theory-oriented postdoc at the time. The researchers there were carrying out experiments on bacteria and algae and looking for patterns in the data to describe mathematically how the organisms swim.

    “What I learned there was, if you start working with a dataset that’s fundamentally new, it keeps you honest and forces you to think in new ways, because you have to explain that data,” Dunkel says. “So, even today when I advise my students, each one, no matter the project, gets a real dataset to work with. Because as you try to understand the dataset, you can get new ideas that you wouldn’t have thought of otherwise.”

    Conscious capacity

    In 2013, Dunkel moved from the “other” Cambridge to MIT, where he joined the Department of Mathematics as a junior faculty member in applied mathematics. In setting up his research program, he looked to develop mathematical tools to describe and predict the behavior of real-world phenomena at both the small and large scale, and to seek ways to mathematically bridge the two scales in various systems.

    He has primarily applied this thinking to understanding problems in developmental biology and soft matter, and has collaborated with experimentalists at MIT — especially professors Nikta Fakhri and Adam Martin — and elsewhere, looking through data they collect, for instance on the spiral waves in starfish eggs, for patterns that can be described and predicted through math. Dunkel has also intentionally left room to explore questions that might appear at first glance to divert from his main path of research.

    “For me it’s a conscious effort to leave capacities for new, unexpected things,” Dunkel says. “This is what makes MIT special: you have this unique combination of people here who are excellent in so many areas and also very open to collaborating across the boundaries of disciplines.”

    A recent diversion sprang from a question that some of his students posed in class: Could a dry spaghetti noodle be snapped into just two pieces? In addition to experimentally testing the idea, Dunkel helped the students develop a mathematical model to describe how the feat could be accomplished, with a precise bit of twisting.

    The work could have ended there. But it caught the attention of MIT Professor Matthias Kolle, who had developed a new type of fiber that changes color with strain. Kolle wondered whether the spaghetti model could be adapted to softer materials, to predict the strength of his fibers when knotted in certain configurations. He and Dunkel struck up a continuing collaboration, which has since drawn interest from surgeons looking to understand the stability of surgical knots, as well as biologists who are applying the model to predict the behavior of colonies of worms.

    “Though many of these systems are different, fundamentally, we can see similarities in the structure of their data,” Dunkel says. “It’s very easy to find differences. What’s more interesting is to find out what’s similar.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

    Massachusetts Institute of Technology (US) is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the Bates Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad and Whitehead Institutes.

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology (US) adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with Massachusetts Institute of Technology (US) . The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology (US) is a member of the Association of American Universities (AAU).

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia (US), wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but a combination with elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after Massachusetts Institute of Technology (US) was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed as the University of Massachusetts Amherst (US)). In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    Massachusetts Institute of Technology (US) was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology (US) faculty and alumni rebuffed Harvard University (US) president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, the Massachusetts Institute of Technology (US) administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology (US) catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities (US)in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at Massachusetts Institute of Technology (US) that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    Massachusetts Institute of Technology (US)‘s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at Massachusetts Institute of Technology (US)’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, Massachusetts Institute of Technology (US) became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected Massachusetts Institute of Technology (US) profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of Massachusetts Institute of Technology (US) between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, Massachusetts Institute of Technology (US) no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and Massachusetts Institute of Technology (US)’s defense research. In this period Massachusetts Institute of Technology (US)’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. Massachusetts Institute of Technology (US) ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT (US) Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However six Massachusetts Institute of Technology (US) students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at Massachusetts Institute of Technology (US) over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, Massachusetts Institute of Technology (US)’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    Massachusetts Institute of Technology (US) has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology (US) classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    Massachusetts Institute of Technology (US) was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, Massachusetts Institute of Technology (US) launched OpenCourseWare to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, Massachusetts Institute of Technology (US) announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology (US) faculty adopted an open-access policy to make its scholarship publicly accessible online.

    Massachusetts Institute of Technology (US) has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology (US) community with thousands of police officers from the New England region and Canada. On November 25, 2013, Massachusetts Institute of Technology (US) announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of the Massachusetts Institute of Technology (US) community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over a decade, and the lab will be staffed by MIT and IBM scientists. In October 2018, MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and Blackstone Group CEO Stephen Schwarzman. The new college will focus not only on AI research but also on interdisciplinary AI education and on how AI can be applied in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to reach $1 billion upon completion.

    The Caltech/MIT Advanced LIGO (aLIGO) (US) was designed and constructed by a team of scientists from California Institute of Technology (US), Massachusetts Institute of Technology (US), and industrial contractors, and was funded by the National Science Foundation (US).

    MIT/Caltech Advanced LIGO (aLIGO).

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology (US) physicist Rainer Weiss won the Nobel Prize in Physics in 2017. Weiss, who is also a Massachusetts Institute of Technology (US) graduate, designed the laser interferometric technique that served as the essential blueprint for LIGO.

    The mission of Massachusetts Institute of Technology (US) is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the Massachusetts Institute of Technology (US) community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 4:17 pm on June 25, 2021 Permalink | Reply
    Tags: "Nathan Seiberg on How Math Might Complete the Ultimate Physics Theory", , From the time of the ancient Babylonians and Greeks there hasn’t been a real distinction between math and physics., Mathematics, Physicists and mathematicians are motivated by different questions. And different kinds of questions lead to different insights., , QFT: Quantum Field Theory, , Seiberg’s work has helped bring the study of quantum field theories closer to pure mathematics., The Standard Model of Particle Physics explains nearly every aspect of the physical world (except gravity)., We cannot yet formulate QFT in a rigorous way that would make mathematicians perfectly happy., [Isaac] Newton was motivated by physics when he invented calculus.   

    From Quanta Magazine : “Nathan Seiberg on How Math Might Complete the Ultimate Physics Theory” 

    From Quanta Magazine

    June 24, 2021
    Kevin Hartnett

    Even in an incomplete state, quantum field theory is the most successful physical theory ever discovered. Nathan Seiberg, one of its leading architects, talks about the gaps in QFT and how mathematicians could fill them.

    1
    Nathan Seiberg crosses a bridge over Stony Brook at the Institute for Advanced Study. Credit: Sasha Maslov for Quanta Magazine.

    Nathan Seiberg, 64, still does a lot of the electrical work and even some of the plumbing around his house in Princeton, New Jersey. It’s an interest he developed as a kid growing up in Israel, where he tinkered with his car and built a radio.

    “I was always fascinated by solving problems and understanding how things work,” he said.

    Seiberg’s professional career has been about problem solving, too, though nothing as straightforward as fixing radios. He’s a physicist at the Institute for Advanced Study (US), and over the course of a long and decorated career he has made many contributions to the development of Quantum Field Theory, or QFT.

    QFT refers broadly to the set of all possible quantum field theories. These are theories whose basic objects are “fields,” which stretch across space and time. There are fields associated with fundamental particles like electrons and quarks, and fields associated with fundamental forces, like gravity and electromagnetism. The most sweeping quantum field theory — and the most successful theory in the history of physics, period — is the Standard Model.

    It combines these fields into a single equation that explains nearly every aspect of the physical world (except gravity).

    By the time Seiberg started graduate school at the Weizmann Institute of Science (IL) in 1978, QFT was already well established as the principal perspective of physics. Its predictive power wasn’t in doubt, but many basic questions remained about how and why it worked so well.

    “It’s shocking that we have these techniques and sometimes they give beautiful answers, despite the fact that we don’t know how to formulate the problems rigorously,” said Seiberg.

    Much of Seiberg’s most important work has involved teasing apart how particular quantum field theories work the way they do. In the late 1980s he and Gregory Moore worked out mathematical details of types of quantum field theories called conformal field theories and topological field theories. Shortly after, partly in collaboration with Edward Witten, he focused on understanding features of three- and four-dimensional “supersymmetric” quantum field theories. This helped explain how quarks, the particles inside protons, are confined there.

    The work is complicated, but Seiberg retains an element of childlike fascination with it. Just as he once wanted to understand how a transistor radio produces sound, as a physicist he now seeks to explain how these quantum field theories yield often startlingly accurate predictions about the physical world.

    “You’re trying to figure out how something works and then you’re trying to use it,” he said.

    Seiberg’s work has also helped bring the study of quantum field theories closer to pure mathematics. In 1994, Witten discovered how to use physical phenomena that he and Seiberg had discovered to quantify basic characteristics of a space, like the number of holes it has. Their “Seiberg-Witten invariants” are now an essential tool in math. Seiberg believes quantum field theory and math must continue to grow closer if physicists are ever really going to understand the basic features underlying all quantum field theories.

    Quanta Magazine spoke with Seiberg about how physics and math are really two sides of the same coin, the ways in which QFT is incompletely understood today, and his own abandoned effort to write a textbook for the field. The interview has been condensed and edited for clarity.

    2
    Seiberg suspects that math and physics, which became separate fields of study only relatively recently, will one day merge together under the same deep intellectual structure. Credit: Sasha Maslov for Quanta Magazine.

    Math and physics have a long history together. What are some of the most important ways they have influenced each other over the centuries?

    From the time of the ancient Babylonians and Greeks there hasn’t been a real distinction between math and physics. They studied similar questions. There has been a lot of cross-fertilization between what today we call math and physics. [Isaac] Newton is a great example. He was motivated by physics when he invented calculus. Over the 20th century, things were a bit more complicated. People specialized in math or in physics.

    Physics usually offers very concrete questions and very concrete puzzles associated with reality and experiment. It’s also kind of grounded in reality. Math usually provides more generality, more powerful methods, and more rigor and precision. All of these elements are needed.

    Do you think they’ll continue to be increasingly separate fields?

    Given that they started as one field and lately diverged, but continue to influence each other, in the future I’d guess they’ll continue to influence each other to the point that there would be no clear separation between them. I think that there will be one deep, intellectual structure that encompasses math and physics.

    Why has QFT, and physics in general, been such a provocative stimulus for math?

    I think physicists and mathematicians are motivated by different questions. And different kinds of questions lead to different insights. There have been many examples where physicists came up with some ideas — which in most cases were not even rigorous — and mathematicians looked at them and said, “This is an equality between two different things; let’s try and prove it.” So the input from physics is another source of influence for the mathematicians. From this perspective, physics is like a machine that produces conjectures.

    And the track record with these conjectures has been quite amazing, so mathematicians have learned to take physics in general and quantum field theory in particular very seriously. But what is perhaps surprising for them is that they still can’t make QFT rigorous; they still can’t figure out where these insights come from.

    Let’s focus on the physics side for now, and that amazing track record. What are some of its biggest triumphs?

    QFT is by far the most successful theory ever created by mankind to explain anything. There are many [predictions] that agree perfectly with experiments to unprecedented accuracy. We’re talking about accuracy of up to the order of 12 digits between theory and experiment. And there are literally trillions and trillions of experiments that match the theory. I don’t think historically there has ever been any theory as successful as quantum field theory. And it includes as special cases all the previous discoveries, like Newton’s theory, [James Clerk] Maxwell’s theory of electromagnetism, and of course quantum mechanics and Albert Einstein’s special relativity. All these things are special cases of this one coherent intellectual structure. It’s an amazing, spectacular achievement.

    And yet we also think QFT is incomplete. What are its limitations?

    The biggest challenge is to merge it with Einstein’s general theory of relativity. There are many ideas how to do this. String theory is the main one. There has been a lot of progress, but we’re still not at the end of the story.

    You’ve referred to QFT as not yet “mature.” What do you mean by that?

    I have my preferred maturity test for a scientific field. That is to look at textbooks and at courses at universities that teach the topic. When you look at a mature field, most of the textbooks are more or less the same. They follow the same logical sequence of ideas. Similarly, most of the courses are more or less the same. When you learn calculus, you first learn one topic, then another, and then the third. It is the same sequence in all institutions. For me, this is a sign of a mature field.

    That’s not the case for QFT. There are several books with different perspectives from different points of view, with [ideas presented] in a different order. For me this means that we have not found the ultimate, streamlined version of presenting our understanding.

    You’ve also mentioned that it’s a sign of incompleteness that QFT doesn’t have its own place in mathematics. What does that mean?

    We cannot yet formulate QFT in a rigorous way that would make mathematicians perfectly happy. In special cases we can, but in general we cannot. In all the other theories in physics — in classical physics, in quantum mechanics — there is no such problem. Mathematicians have a rigorous description of it. They can prove theorems and make deep advances. That’s not yet the case in quantum field theory.

    I should emphasize that we do not look for rigor for the sake of rigor. That’s not our goal. But I think that the fact that we don’t yet have a rigorous description of it, the fact that mathematicians are not yet comfortable with it, is a clear reflection of the fact that we don’t yet fully understand what we’re doing.

    If we do have a rigorous description of QFT, it will give us new, deeper insights into the structure of the theory. It will give us new tools to perform calculations, and it will uncover new phenomena.

    Are we even close to doing this?

    Whatever approach we take, we get stuck somewhere. One approach that gets close to being rigorous is we imagine space as a lattice of points. Then we take the limit as the points approach each other and space becomes continuous. We describe space as a lattice, and as long as we’re on the lattice there is nothing non-rigorous about it. The challenge is to prove that the limit exists as the distance [between points on the lattice] becomes small and the number of points [on the lattice] becomes large. We assume this limit exists, but we cannot prove it.
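    A standard textbook sketch makes the lattice idea concrete (this illustration is an editorial addition, not part of Seiberg’s answer). In LaTeX notation, for a free scalar field in one dimension with lattice spacing a, the action is an ordinary, perfectly well-defined sum, and the continuum theory is what one hopes to recover as a shrinks to zero:

    S_a[\phi] \;=\; \sum_{n} a\left[\frac{1}{2}\left(\frac{\phi_{n+1}-\phi_{n}}{a}\right)^{2} + \frac{m^{2}}{2}\,\phi_{n}^{2}\right]
    \;\xrightarrow{\;a\,\to\,0\;}\;
    \int dx\left[\frac{1}{2}\left(\frac{d\phi}{dx}\right)^{2} + \frac{m^{2}}{2}\,\phi^{2}\right].

    Everything on the left-hand side poses no problems of rigor; the open challenge Seiberg describes is proving that such limits exist once realistic interactions between fields are switched on.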

    So if we do it, will a rigorous understanding of quantum field theory actually merge it with general relativity? That is, will it provide a long-sought theory of quantum gravity?

    It’s quite clear to me that there is one intellectual structure that includes everything. I think of quantum field theory as being the language of physics, simply because it already appears like the language of many different phenomena in many different fields. I expect it to encompass also quantum gravity. In fact, in special circumstances, quantum gravity is described by quantum field theory.

    It might take a century or two, even three centuries, to get there. But I personally don’t think it will take that long. This is not to say that in 200-300 years science will be over. There will still be many interesting questions to address. But with a better understanding of quantum field theory, I think [discovery] will be a lot faster.

    What could remain to be discovered after QFT is fully understood?

    Most physicists aren’t trying to find a more fundamental description of nature. [Instead they say,] “Given the rules, and given what we know, can we explain known phenomena and find new phenomena, like new materials that exhibit special properties?” I think this will continue for a long time. Nature is very rich, and once we fully understand the rules of nature we’ll be able to use these rules to predict new phenomena. This is not less exciting than finding the fundamental rules of nature.

    You mentioned that one indication the field of QFT is not complete is that it doesn’t yet have a canonical textbook. I mentioned this to another physicist recently, and he said a lot of people hope you’ll write it.

    I tried, actually, but I stopped. Around 2000, I took one summer, and at the end of the summer I had many pages written, and I realized I hated what I’d written.

    Honestly, my problem is that there are all these different ways of starting to write it, but I can’t find a preferred angle. I think it’s a reflection of the status of the field, a sign that it’s not yet mature enough. The fact that there isn’t a clear starting point, to me, is a sign that we haven’t yet found the ultimate way to think about it.

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 11:37 am on June 25, 2021 Permalink | Reply
    Tags: "New math model traces the link between atmospheric CO2 and temperature over half a billion years", , , , , ESS: Earth system sensitivity, Mathematics,   

    From Rochester Institute of Technology (US) : “New math model traces the link between atmospheric CO2 and temperature over half a billion years” 

    From Rochester Institute of Technology (US)

    June 23, 2021
    Luke Auburn
    luke.auburn@rit.edu

    Lead author Assistant Professor Tony Wong’s paper featured in Nature Communications.

    1
    Assistant professor Tony Wong is the lead author of an article featured in the journal Nature Communications that outlines a new modeling method to explore the relationship between the Earth’s atmospheric carbon dioxide and surface temperature over hundreds of millions of years.

    A Rochester Institute of Technology mathematician helped develop a new modeling method to explore the relationship between the Earth’s atmospheric carbon dioxide (CO2) and surface temperature over hundreds of millions of years. Assistant professor Tony Wong is the lead author of an article featured in the journal Nature Communications that outlines the method. He hopes it will help answer fundamental questions about the geophysical processes that drive the Earth’s climate.

    Scientists often use a measurement method called equilibrium climate sensitivity (ECS) to study the long-term effect on temperature of doubling atmospheric CO2 concentrations above pre-industrial conditions. ECS is helpful for assessing the effectiveness of climate change policies and understanding changes of up to a few hundred years. Wong’s method works on a much grander scale and uses a measurement called Earth system sensitivity (ESS), which allows the method to factor in processes that take millions of years to change, such as the shifting of tectonic plates or variations in solar luminosity.

    “The model gives us better insight into how the world and its geophysical processes work,” said Wong, faculty in the School of Mathematical Sciences. “It offers this nice schematic to incorporate all of the physics that encapsulates our current understanding of the Earth’s system. If it doesn’t look good or there are areas where there are biases relative to data that we have, then that highlights a gap in the understanding of the physics of our world. So we can go back and say OK, so what do we need to do to improve our understanding of the world?”

    Wong said the new work improves upon previous studies that used more limited sets of data and less sophisticated statistical methods for calibration. He also noted that the team used a unique inverse parameter calibration approach.

    “It’s taking this kind of Sherlock Holmes approach of, once you rule out what’s not possible, what you’re left with must contain the truth,” said Wong. “It’s an interesting approach and it’s nice because there’s this satisfying cycle where you run the model forward, you see how it compares against data, and you rule out the parameter values that don’t represent the world we live in.”
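    As a rough illustration of that “rule out what does not fit” loop, here is a toy rejection scheme in Python (the forward model, the numbers, and the “observation” are all made up for this example; it is not the calibration machinery or the data used in the Nature Communications study):

    import numpy as np

    rng = np.random.default_rng(0)

    def forward_model(sensitivity, co2_ratio):
        # Minimal forward model: warming equals the sensitivity per CO2 doubling
        # times the base-2 logarithm of the CO2 concentration ratio.
        return sensitivity * np.log2(co2_ratio)

    # Hypothetical "data": 4.5 C of warming (plus or minus 1 C) for a quadrupling of CO2.
    observed_warming, observed_error, co2_ratio = 4.5, 1.0, 4.0

    # Draw candidate sensitivity values from a broad prior range (degrees C per doubling).
    candidates = rng.uniform(0.5, 6.0, size=100_000)

    # Keep only the candidates whose prediction falls within the observational error;
    # whatever survives is the set of parameter values consistent with the data.
    predictions = forward_model(candidates, co2_ratio)
    kept = candidates[np.abs(predictions - observed_warming) <= observed_error]

    print(f"{kept.size} of {candidates.size} candidates survive")
    print(f"consistent sensitivity range: {kept.min():.2f} to {kept.max():.2f} C per doubling")

    The surviving parameter values define the range still compatible with the observations; the real study applies the same logic with a far richer Earth-system model and data spanning hundreds of millions of years.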

    Wong has been working with College of Science students on ways to leverage the new technique. Recent alumnus Ken Shultes ’20 (applied mathematics) ’21 MS (applied and computational mathematics) built on the method and is developing an even more detailed statistical technique to infer the best model parameter values based on observational data.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Rochester Institute of Technology (US) is a private doctoral university within the town of Henrietta in the Rochester, New York metropolitan area.

    RIT is composed of nine academic colleges, including the National Technical Institute for the Deaf (RIT) (US). The Institute is one of only a small number of engineering institutes in the State of New York, along with New York Institute of Technology, SUNY Polytechnic Institute, and Rensselaer Polytechnic Institute (US). It is most widely known for its fine arts, computing, engineering, and imaging science programs; several fine arts programs routinely rank in the national “Top 10” according to US News & World Report.

    The university offers undergraduate and graduate degrees, including doctoral and professional degrees, as well as online master’s programs.

    The university was founded in 1829 and is the tenth largest private university in the country in terms of full-time students. It is internationally known for its science, computing, engineering, and art programs, as well as for the National Technical Institute for the Deaf, a leading deaf-education institution that provides educational opportunities to more than 1,000 deaf and hard-of-hearing students. RIT is known for its co-op program, which gives students professional and industrial experience; it has the fourth oldest and one of the largest co-op programs in the world. It is classified among “R2: Doctoral Universities – High research activity”.

    RIT’s student population is approximately 19,000 students, about 16,000 undergraduate and 3,000 graduate. Students come from all 50 U.S. states and from more than 100 countries around the world. The university has more than 4,000 active faculty and staff members who engage with students in a wide range of academic activities and research projects. It also operates global campuses abroad, located in China, Croatia, and the United Arab Emirates (Dubai).

    Fourteen RIT alumni and faculty members have been recipients of the Pulitzer Prize.

    History

    The university began as a result of an 1891 merger between the Rochester Athenæum, a literary society founded in 1829 by Colonel Nathaniel Rochester and associates, and the Mechanics Institute, a Rochester school of practical technical training for local residents founded in 1885 by a consortium of local businessmen including Captain Henry Lomb, co-founder of Bausch & Lomb. The merged institution was named the Rochester Athenæum and Mechanics Institute (RAMI). The Mechanics Institute, however, was considered the surviving school, having taken over the Rochester Athenæum’s charter. From the time of the merger until 1944, RAMI celebrated the former Mechanics Institute’s 1885 founding charter. In 1944 the school changed its name to Rochester Institute of Technology, re-established the Athenæum’s 1829 founding charter, and became a full-fledged research university.

    The university originally resided within the city of Rochester, New York, proper, on a block bounded by the Erie Canal, South Plymouth Avenue, Spring Street, and South Washington Street (approximately 43.152632°N 77.615157°W). Its art department was originally located in the Bevier Memorial Building. By the middle of the twentieth century, RIT began to outgrow its facilities, and surrounding land was scarce and expensive. Additionally, in 1959 the New York Department of Public Works announced that a new freeway, the Inner Loop, was to be built through the city along a path that bisected the university’s campus and required demolition of key university buildings. In 1961 an unanticipated donation of $3.27 million ($27,977,071 today) from local resident Grace Watson (for whom RIT’s dining hall was later named) allowed the university to purchase land for a new 1,300-acre (5.3 km^2) campus several miles south along the east bank of the Genesee River in suburban Henrietta. Upon completion in 1968, the university moved to the new suburban campus, where it resides today.

    In 1966 RIT was selected by the Federal government to be the site of the newly founded National Technical Institute for the Deaf (NTID). NTID admitted its first students in 1968 concurrent with RIT’s transition to the Henrietta campus.

    In 1979 RIT took over Eisenhower College, a liberal arts college located in Seneca Falls, New York. Despite making a five-year commitment to keep Eisenhower open, RIT announced in July 1982 that the college would close immediately. Eisenhower’s academic program operated for one final year, the 1982–83 school year, on the Henrietta campus. The final Eisenhower graduation took place in May 1983 back in Seneca Falls.

    In 1990 RIT started its first PhD program, in Imaging Science – the first PhD program of its kind in the U.S. RIT subsequently established PhD programs in six other fields: Astrophysical Sciences and Technology, Computing and Information Sciences, Color Science, Microsystems Engineering, Sustainability, and Engineering. In 1996 RIT became the first college in the U.S. to offer a Software Engineering degree at the undergraduate level.

    Colleges

    RIT has nine colleges:

    RIT College of Engineering Technology
    Saunders College of Business
    B. Thomas Golisano College of Computing and Information Sciences
    Kate Gleason College of Engineering
    RIT College of Health Sciences and Technology
    College of Art and Design
    RIT College of Liberal Arts
    RIT College of Science
    National Technical Institute for the Deaf

    There are also three smaller academic units that grant degrees but do not have full college faculties:

    RIT Center for Multidisciplinary Studies
    Golisano Institute for Sustainability
    University Studies

    In addition to these colleges, RIT operates three branch campuses in Europe, one in the Middle East and one in East Asia:

    RIT Croatia (formerly the American College of Management and Technology) in Dubrovnik and Zagreb, Croatia
    RIT Kosovo (formerly the American University in Kosovo) in Pristina, Kosovo
    RIT Dubai in Dubai, United Arab Emirates
    RIT China-Weihai Campus

    RIT also has international partnerships with the following schools:

    Yeditepe University in Istanbul, Turkey
    Birla Institute of Technology and Science in India
    Pontificia Universidad Catolica Madre y Maestra (PUCMM) in Dominican Republic
    Instituto Tecnológico de Santo Domingo (INTEC) in Dominican Republic
    Universidad Tecnologica Centro-Americana (UNITEC) in Honduras
    Universidad del Norte (UNINORTE) in Colombia
    Universidad Peruana de Ciencias Aplicadas (UPC) in Peru

    Research

    RIT’s research programs are rapidly expanding. The total value of research grants to university faculty for fiscal year 2007–2008 was $48.5 million, an increase of more than twenty-two percent over the previous year. The university currently offers eight PhD programs: Imaging Science, Microsystems Engineering, Computing and Information Sciences, Color Science, Astrophysical Sciences and Technology, Sustainability, Engineering, and Mathematical Modeling.

    In 1986 RIT founded the Chester F. Carlson Center for Imaging Science and started its first doctoral program in Imaging Science in 1989. The Imaging Science department also offers the only bachelor’s (BS) and master’s (MS) degree programs in imaging science in the country. The Carlson Center features a diverse research portfolio; its major research areas include digital image restoration, remote sensing, magnetic resonance imaging, printing systems research, color science, nanoimaging, imaging detectors, astronomical imaging, visual perception, and ultrasonic imaging.

    The Center for Microelectronic and Computer Engineering was founded by RIT in 1986. RIT was the first university to offer a bachelor’s degree in Microelectronic Engineering. The Center’s facilities include 50,000 square feet (4,600 m^2) of building space with 10,000 square feet (930 m^2) of clean room space. The building will undergo an expansion later this year. Its research programs include nano-imaging, nano-lithography, nano-power, micro-optical devices, photonics subsystems integration, high-fidelity modeling and heterogeneous simulation, microelectronic manufacturing, microsystems integration, and micro-optical networks for computational applications.

    The Center for Advancing the Study of CyberInfrastructure (CASCI) is a multidisciplinary center housed in the College of Computing and Information Sciences. The departments of Computer Science, Software Engineering, Information Technology, Computer Engineering, Imaging Science, and Bioinformatics collaborate in a variety of research programs at this center. RIT was the first university to launch a bachelor’s program in Information Technology (1991), the first to launch a bachelor’s program in Software Engineering (1996), and among the first to launch a bachelor’s program in Computer Science (1972). RIT helped standardize the Forth programming language and developed the CLAWS software package.

    The Center for Computational Relativity and Gravitation (CCRG) was founded in 2007. The CCRG comprises faculty and postdoctoral research associates working in the areas of general relativity, gravitational waves, and galactic dynamics. Computing facilities in the CCRG include gravitySimulator, a novel 32-node supercomputer that uses special-purpose hardware to achieve speeds of 4 TFlops in gravitational N-body calculations, and newHorizons, a state-of-the-art 85-node Linux cluster for numerical relativity simulations.

    2
    Gravity Simulator at the Center for Computational Relativity and Gravitation, RIT, Rochester, New York, USA.

    The Center for Detectors (CfD) was founded in 2010. The CfD designs, develops, and implements new advanced sensor technologies through collaboration with academic researchers, industry engineers, government scientists, and university and college students. The CfD operates four laboratories and has approximately a dozen funded projects to advance detectors in a broad array of applications, e.g., astrophysics, biomedical imaging, Earth system science, and interplanetary travel. Center members span eight departments and four colleges.

    RIT has also collaborated with many industry partners on research, including IBM, Xerox, Rochester’s Democrat and Chronicle, Siemens, the National Aeronautics and Space Administration (US) (NASA), and the Defense Advanced Research Projects Agency (US) (DARPA). In 2005, Russell W. Bessette, Executive Director of the New York State Office of Science, Technology & Academic Research (NYSTAR), announced that RIT would lead the SUNY University at Buffalo (US) and Alfred University (US) in an initiative to create key technologies in microsystems, photonics, nanomaterials, and remote sensing systems, and to integrate next-generation IT systems. In addition, the collaboratory is tasked with helping to facilitate economic development and tech transfer in New York State. More than 35 other notable organizations have joined the collaboratory, including Boeing, Eastman Kodak, IBM, Intel, SEMATECH, ITT, Motorola, Xerox, and several federal agencies, including NASA.

    RIT has emerged as a national leader in manufacturing research. In 2017, the U.S. Department of Energy selected RIT to lead its Reducing Embodied-Energy and Decreasing Emissions (REMADE) Institute aimed at forging new clean energy measures through the Manufacturing USA initiative. RIT also participates in five other Manufacturing USA research institutes.

     
  • richardmitnick 1:52 pm on June 18, 2021 Permalink | Reply
    Tags: “Mathematicians Prove 2D Version of Quantum Gravity Really Works”, A trilogy of landmark publications, , “Liouville field”- see the description in the full blog post., , DOZZ formula: a finding of Harald Dorn; Hans-Jörg Otto; Alexei Zamolodchikov; Alexander Zamolodchikov, Fields are central to quantum physics too; however the situation here is more complicated due to the deep randomness of quantum theory., In classical physics for example a single field tells you everything about how a force pushes objects around., In physics today the main actors in the most successful theories are fields., Mathematics, , QFT: Quantum Field Theory-a model of how one or more quantum fields each with their infinite variations act and interact., ,

    From Quanta Magazine : “Mathematicians Prove 2D Version of Quantum Gravity Really Works” 

    From Quanta Magazine

    June 17, 2021
    Charlie Wood

    In three towering papers, a team of mathematicians has worked out the details of Liouville quantum field theory, a two-dimensional model of quantum gravity.

    1
    Credit: Olena Shmahalo/Quanta Magazine.

    Alexander Polyakov, a theoretical physicist now at Princeton University (US), caught a glimpse of the future of quantum theory in 1981. A range of mysteries, from the wiggling of strings to the binding of quarks into protons, demanded a new mathematical tool whose silhouette he could just make out.

    “There are methods and formulae in science which serve as master keys to many apparently different problems,” he wrote in the introduction to a now famous four-page letter in Physics Letters B. “At the present time we have to develop an art of handling sums over random surfaces.”

    Polyakov’s proposal proved powerful. In his paper he sketched out a formula that roughly described how to calculate averages of a wildly chaotic type of surface, the “Liouville field.” His work brought physicists into a new mathematical arena, one essential for unlocking the behavior of theoretical objects called strings and building a simplified model of quantum gravity.

    Years of toil would lead Polyakov to breakthrough solutions for other theories in physics, but he never fully understood the mathematics behind the Liouville field.

    Over the last seven years, however, a group of mathematicians has done what many researchers thought impossible. In a trilogy of landmark publications, they have recast Polyakov’s formula using fully rigorous mathematical language and proved that the Liouville field flawlessly models the phenomena Polyakov thought it would.

    1
    Vincent Vargas of the National Centre for Scientific Research [Centre national de la recherche scientifique (CNRS)] (FR) and his collaborators have achieved a rare feat: a strongly interacting quantum field theory perfectly described by a brief mathematical formula.

    “It took us 40 years in math to make sense of four pages,” said Vincent Vargas, a mathematician at the French National Center for Scientific Research and co-author of the research with Rémi Rhodes of Aix-Marseille University [Aix-Marseille Université] (FR), Antti Kupiainen of the University of Helsinki [Helsingin yliopisto; Helsingfors universitet] (FI), François David of the French National Centre for Scientific Research [Centre national de la recherche scientifique (CNRS)] (FR), and Colin Guillarmou of Paris-Saclay University [Université Paris-Saclay] (FR).

    The three papers forge a bridge between the pristine world of mathematics and the messy reality of physics — and they do so by breaking new ground in the mathematical field of probability theory. The work also touches on philosophical questions regarding the objects that take center stage in the leading theories of fundamental physics: quantum fields.

    “This is a masterpiece in mathematical physics,” said Xin Sun, a mathematician at the University of Pennsylvania (US).

    Infinite Fields

    In physics today the main actors in the most successful theories are fields — objects that fill space, taking on different values from place to place.

    In classical physics for example a single field tells you everything about how a force pushes objects around. Take Earth’s magnetic field: The twitches of a compass needle reveal the field’s influence (its strength and direction) at every point on the planet.

    Fields are central to quantum physics too; however the situation here is more complicated due to the deep randomness of quantum theory. From the quantum perspective, Earth doesn’t generate one magnetic field, but rather an infinite number of different ones. Some look almost like the field we observe in classical physics, but others are wildly different.

    But physicists still want to make predictions — predictions that ideally match, in this case, what a mountaineer reads on a compass. Assimilating the infinite forms of a quantum field into a single prediction is the formidable task of a “quantum field theory,” or QFT. This is a model of how one or more quantum fields each with their infinite variations act and interact.

    Driven by immense experimental support, QFTs have become the basic language of particle physics. The Standard Model is one such QFT, depicting fundamental particles like electrons as fuzzy bumps that emerge from an infinitude of electron fields. It has passed every experimental test to date (although various groups may be on the verge of finding the first holes).

    Physicists play with many different QFTs. Some, like the Standard Model, aspire to model real particles moving through the four dimensions of our universe (three spatial dimensions plus one dimension of time). Others describe exotic particles in strange universes, from two-dimensional flatlands to six-dimensional uber-worlds. Their connection to reality is remote, but physicists study them in the hopes of gaining insights they can carry back into our own world.

    Polyakov’s Liouville field theory is one such example.

    1

    Gravity’s Field

    The Liouville field, which is based on an equation from complex analysis developed in the 1800s by the French mathematician Joseph Liouville, describes a completely random two-dimensional surface — that is, a surface, like Earth’s crust, but one in which the height of every point is chosen randomly. Such a planet would erupt with mountain ranges of infinitely tall peaks, each assigned by rolling a die with infinite faces.

    Such an object might not seem like an informative model for physics, but randomness is not devoid of patterns. The bell curve, for example, tells you how likely you are to randomly pass a seven-foot basketball player on the street. Similarly, bulbous clouds and crinkly coastlines follow random patterns, but it’s nevertheless possible to discern consistent relationships between their large-scale and small-scale features.

    Liouville theory can be used to identify patterns in the endless landscape of all possible random, jagged surfaces. Polyakov realized this chaotic topography was essential for modeling strings, which trace out surfaces as they move. The theory has also been applied to describe quantum gravity in a two-dimensional world. Einstein defined gravity as space-time’s curvature, but translating his description into the language of quantum field theory creates an infinite number of space-times — much as the Earth produces an infinite collection of magnetic fields. Liouville theory packages all those surfaces together into one object. It gives physicists the tools to measure the curvature — and hence, gravitation — at every location on a random 2D surface.

    “Quantum gravity basically means random geometry, because quantum means random and gravity means geometry,” said Sun.

    Polyakov’s first step in exploring the world of random surfaces was to write down an expression defining the odds of finding a particular spiky planet, much as the bell curve defines the odds of meeting someone of a particular height. But his formula did not lead to useful numerical predictions.

    To solve a quantum field theory is to be able to use the field to predict observations. In practice, this means calculating a field’s “correlation functions,” which capture the field’s behavior by describing the extent to which a measurement of the field at one point relates, or correlates, to a measurement at another point. Calculating correlation functions in the photon field, for instance, can give you the textbook laws of quantum electromagnetism.
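    As a minimal numerical illustration of the idea (a toy one-dimensional random field invented for this example, not the photon or Liouville field), a two-point correlation function can be estimated by sampling the field many times and averaging the product of its values at two chosen points:

    import numpy as np

    rng = np.random.default_rng(1)

    def sample_field(x, n_modes=20):
        # Toy random field: a sum of cosine waves with random amplitudes and phases.
        amps = rng.normal(size=n_modes) / np.arange(1, n_modes + 1)
        phases = rng.uniform(0, 2 * np.pi, size=n_modes)
        k = np.arange(1, n_modes + 1)
        return np.sum(amps[:, None] * np.cos(k[:, None] * x[None, :] + phases[:, None]), axis=0)

    x = np.array([0.0, 0.5])   # two measurement points
    samples = np.array([sample_field(x) for _ in range(20_000)])

    # Two-point correlation function: the average of phi(x1) * phi(x2) over many realizations.
    correlation = np.mean(samples[:, 0] * samples[:, 1])
    print(f"estimated <phi(0) phi(0.5)> = {correlation:.3f}")

    Repeating the estimate for many pairs of points maps out how strongly the field at one place “knows about” the field at another, which is the kind of information a solved quantum field theory is supposed to deliver.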

    Polyakov was after something more abstract: the essence of random surfaces, similar to the statistical relationships that make a cloud a cloud or a coastline a coastline. He needed the correlations between the haphazard heights of the Liouville field. Over the decades he tried two different ways of calculating them. He started with a technique called the Feynman path integral and ended up developing a workaround known as the bootstrap. Both methods came up short in different ways, until the mathematicians behind the new work united them in a more precise formulation.

    Add ’Em Up

    You might imagine that accounting for the infinitely many forms a quantum field can take is next to impossible. And you would be right. In the 1940s Richard Feynman, a quantum physics pioneer, developed one prescription for dealing with this bewildering situation, but the method proved severely limited.

    Take, again, Earth’s magnetic field. Your goal is to use quantum field theory to predict what you’ll observe when you take a compass reading at a particular location. To do this, Feynman proposed summing all the field’s forms together. He argued that your reading will represent some average of all the field’s possible forms. The procedure for adding up these infinite field configurations with the proper weighting is known as the Feynman path integral.

    It’s an elegant idea that yields concrete answers only for select quantum fields. No known mathematical procedure can meaningfully average an infinite number of objects covering an infinite expanse of space in general. The path integral is more of a physics philosophy than an exact mathematical recipe. Mathematicians question its very existence as a valid operation and are bothered by the way physicists rely on it.
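    Written out schematically in LaTeX notation (using the Euclidean convention common in mathematical treatments; this is a standard textbook expression, not a formula from the new papers), the prescription says that a two-point correlation function is an average over every possible field configuration, weighted by the action S:

    \langle \phi(x_1)\,\phi(x_2)\rangle \;=\; \frac{1}{Z}\int \mathcal{D}\phi\;\,\phi(x_1)\,\phi(x_2)\,e^{-S[\phi]},
    \qquad
    Z \;=\; \int \mathcal{D}\phi\; e^{-S[\phi]}.

    The “integral over all fields,” written \int \mathcal{D}\phi, is precisely the piece that has no general mathematical definition.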

    “I’m disturbed as a mathematician by something which is not defined,” said Eveliina Peltola, a mathematician at the University of Bonn [Rheinische Friedrich-Wilhelms-Universität Bonn](DE) in Germany.

    Physicists can harness Feynman’s path integral to calculate exact correlation functions for only the most boring of fields — free fields, which do not interact with other fields or even with themselves. Otherwise, they have to fudge it, pretending the fields are free and adding in mild interactions, or “perturbations.” This procedure, known as perturbation theory, gets them correlation functions for most of the fields in the Standard Model, because nature’s forces happen to be quite feeble.

    But it didn’t work for Polyakov. Although he initially speculated that the Liouville field might be amenable to the standard hack of adding mild perturbations, he found that it interacted with itself too strongly. Compared to a free field, the Liouville field seemed mathematically inscrutable, and its correlation functions appeared unattainable.

    Up by the Bootstraps

    Polyakov soon began looking for a workaround. In 1984, he teamed up with Alexander Belavin and Alexander Zamolodchikov to develop a technique called the bootstrap — a mathematical ladder that gradually leads to a field’s correlation functions.

    To start climbing the ladder, you need a function which expresses the correlations between measurements at a mere three points in the field. This “three-point correlation function,” plus some additional information about the energies a particle of the field can take, forms the bottom rung of the bootstrap ladder.

    From there you climb one point at a time: Use the three-point function to construct the four-point function, use the four-point function to construct the five-point function, and so on. But the procedure generates conflicting results if you start with the wrong three-point correlation function in the first rung.

    Polyakov, Belavin and Zamolodchikov used the bootstrap to successfully solve a variety of simple QFT theories, but just as with the Feynman path integral, they couldn’t make it work for the Liouville field.

    Then in the 1990s two pairs of physicists — Harald Dorn and Hans-Jörg Otto, and Zamolodchikov and his brother Alexei — managed to hit on the three-point correlation function that made it possible to scale the ladder, completely solving the Liouville field (and its simple description of quantum gravity). Their result, known by their initials as the DOZZ formula, let physicists make any prediction involving the Liouville field. But even the authors knew they had arrived at it partially by chance, not through sound mathematics.

    “They were these kind of geniuses who guessed formulas,” said Vargas.

    Educated guesses are useful in physics, but they don’t satisfy mathematicians, who afterward wanted to know where the DOZZ formula came from. The equation that solved the Liouville field should have come from some description of the field itself, even if no one had the faintest idea how to get it.

    “It looked to me like science fiction,” said Kupiainen. “This is never going to be proven by anybody.”

    Taming Wild Surfaces

    In the early 2010s, Vargas and Kupiainen joined forces with the probability theorist Rémi Rhodes and the physicist François David. Their goal was to tie up the mathematical loose ends of the Liouville field — to formalize the Feynman path integral that Polyakov had abandoned and, just maybe, demystify the DOZZ formula.

    As they began, they realized that a French mathematician named Jean-Pierre Kahane had discovered, decades earlier, what would turn out to be the key to Polyakov’s master theory.

    “In some sense it’s completely crazy that Liouville was not defined before us,” Vargas said. “All the ingredients were there.”

    The insight led to three milestone papers in mathematical physics completed between 2014 and 2020.

    2

    They first polished off the path integral, which had failed Polyakov because the Liouville field interacts strongly with itself, making it incompatible with Feynman’s perturbative tools. So instead, the mathematicians used Kahane’s ideas to recast the wild Liouville field as a somewhat milder random object known as the Gaussian free field. The peaks in the Gaussian free field don’t fluctuate to the same random extremes as the peaks in the Liouville field, making it possible for the mathematicians to calculate averages and other statistical measures in sensible ways.

    “Somehow it’s all just using the Gaussian free field,” Peltola said. “From that they can construct everything in the theory.”
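    For readers who want a feel for this milder object, here is a minimal numerical sketch (an editorial illustration on a finite periodic grid, not the construction used in the papers); it samples an approximate discrete Gaussian free field with a Fourier-space recipe in which the longest-wavelength bumps fluctuate the most:

    import numpy as np

    rng = np.random.default_rng(2)

    def sample_gaussian_free_field(n=256):
        # Approximate discrete Gaussian free field on an n-by-n periodic grid.
        kx = np.fft.fftfreq(n) * 2 * np.pi
        ky = np.fft.fftfreq(n) * 2 * np.pi
        k2 = kx[:, None] ** 2 + ky[None, :] ** 2   # squared wavenumber of each mode
        k2[0, 0] = np.inf                          # drop the zero mode (the overall height offset is arbitrary)

        # Complex white noise in Fourier space, damped by 1/|k|.
        noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        field_hat = noise / np.sqrt(k2)
        return np.real(np.fft.ifft2(field_hat)) * n   # back to real space (overall scale is conventional)

    surface = sample_gaussian_free_field()
    print(surface.shape, round(float(surface.std()), 3))

    Averages taken over many such sampled surfaces are the kind of quantities the rigorously defined path integral turns into honest mathematics; the Liouville theory is then built on top of this milder random object.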

    In 2014, they unveiled their result: a new and improved version of the path integral Polyakov had written down in 1981, but fully defined in terms of the trusted Gaussian free field. It’s a rare instance in which Feynman’s path integral philosophy has found a solid mathematical execution.

    “Path integrals can exist, do exist,” said Jörg Teschner, a physicist at the German Electron Synchrotron.

    With a rigorously defined path integral in hand, the researchers then tried to see if they could use it to get answers from the Liouville field and to derive its correlation functions. The target was the mythical DOZZ formula — but the gulf between it and the path integral seemed vast.

    “We’d write in our papers, just for propaganda reasons, that we want to understand the DOZZ formula,” said Kupiainen.

    The team spent years prodding their probabilistic path integral, confirming that it truly had all the features needed to make the bootstrap work. As they did so, they built on earlier work by Teschner. Eventually, Vargas, Kupiainen and Rhodes succeeded with a paper posted in 2017 [Annals of Mathematics] and another in October 2020, with Colin Guillarmou. They derived DOZZ and other correlation functions from the path integral and showed that these formulas perfectly matched the equations physicists had reached using the bootstrap.

    “Now we’re done,” Vargas said. “Both objects are the same.”

    The work explains the origins of the DOZZ formula and connects the bootstrap procedure — which mathematicians had considered sketchy — with verified mathematical objects. Altogether, it resolves the final mysteries of the Liouville field.

    “It’s somehow the end of an era,” said Peltola. “But I hope it’s also the beginning of some new, interesting things.”

    New Hope for QFTs

    Vargas and his collaborators now have a unicorn on their hands, a strongly interacting QFT perfectly described in a nonperturbative way by a brief mathematical formula that also makes numerical predictions.

    Now the literal million-dollar question is: How far can these probabilistic methods go? Can they generate tidy formulas for all QFTs? Vargas is quick to dash such hopes, insisting that their tools are specific to the two-dimensional environment of Liouville theory. In higher dimensions, even free fields are too irregular, so he doubts the group’s methods will ever be able to handle the quantum behavior of gravitational fields in our universe.

    But the fresh minting of Polyakov’s “master key” will open other doors. Its effects are already being felt in probability theory, where mathematicians can now wield previously dodgy physics formulas with impunity. Emboldened by the Liouville work, Sun and his collaborators have already imported equations from physics to solve two problems regarding random curves.

    Physicists await tangible benefits too, further down the road. The rigorous construction of the Liouville field could inspire mathematicians to try their hand at proving features of other seemingly intractable QFTs — not just toy theories of gravity but descriptions of real particles and forces that bear directly on the deepest physical secrets of reality.

    “[Mathematicians] will do things that we can’t even imagine,” said Davide Gaiotto, a theoretical physicist at the Perimeter Institute for Theoretical Physics (CA).

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 9:16 pm on April 20, 2021 Permalink | Reply
    Tags: "Looking at the stars or falling by the wayside? How astronomy is failing female scientists", , , , , , , , Mathematics, ,   

    From phys.org : “Looking at the stars or falling by the wayside? How astronomy and all of the Physical Sciences are failing female scientists” 

    From phys.org

    April 20, 2021
    Lisa Kewley

    1
    Women astronomers get disproportionately less telescope time than their male colleagues. Credit: Wikimedia Commons, CC BY-SA.

    “It will take until at least 2080 before women make up just one-third of Australia’s professional astronomers, unless there is a significant boost to how we nurture female researchers’ careers.

    Over the past decade, astronomy has been rightly recognized as leading the push towards gender equity in the sciences. But my new modeling, published today in Nature Astronomy, shows it is not working fast enough.

    The Australian Academy of Science’s decadal plan for astronomy in Australia proposes women should comprise one-third of the senior workforce by 2025.

    It’s a worthy, if modest, target. However, with new data from the academy’s Science in Australia Gender Equity (SAGE) program, I have modeled the effects of current hiring rates and practices and arrived at a depressing, if perhaps not surprising, conclusion. Without a change to the current mechanisms, it will take at least 60 years to reach that 30% level.

    However, the modeling also suggests that the introduction of ambitious, affirmative hiring programs aimed at recruiting and retaining talented women astronomers could see the target reached in just over a decade—and then growing to 50% in a quarter of a century.

    How did we get here?

    Before looking at how that might be done, it’s worth examining how the gender imbalance in physics arose in the first place. To put it bluntly: how did we get to a situation in which 40% of astronomy Ph.D.s are awarded to women, yet they occupy fewer than 20% of senior positions?

    On a broad level, the answer is simple: my analysis shows women depart astronomy at two to three times the rate of men. In Australia, from postdoc status to assistant professor level, 62% of women leave the field, compared with just 17% of men. Between assistant professor and full professor level, 47% of women leave; the male departure rate is about half that. Women’s departure rates are similar in US astronomy (see “The Leaky Pipeline for Postdocs: A study of the time between receiving a PhD and securing a faculty job for male and female astronomers”).

    The next question is: why?

    Many women leave out of sheer disillusionment. Women in physics and astronomy say their careers progress more slowly than those of male colleagues, and that the culture is not welcoming.

    They receive fewer career resources and opportunities. Randomized double-blind trials and broad research studies in astronomy and across the sciences show implicit bias, which means more men are published, cited, invited to speak at conferences, and given telescope time.

    It’s hard to build a solid research-based body of work when one’s access to tools and recognition is disproportionately limited.

    The loyalty problem

    There is another factor that sometimes contributes to the loss of women astronomers: loyalty. In situations where a woman’s male partner is offered a new job in another town or city, the woman more frequently gives up her work to facilitate the move.

    Encouraging universities or research institutes to help partners find suitable work nearby is thus one of the strategies I (and others) have suggested to help recruit women astrophysicists.

    But the bigger task at hand requires institutions to identify, tackle and overcome inherent bias—a legacy of a conservative academic tradition that, research shows, is weighted towards men.

    A key mechanism to achieve this was introduced in 2014 by the Astronomical Society of Australia. It devised a voluntary rating and assessment system known as the Pleiades Awards, which rewards institutions for taking concrete actions to advance the careers of women and close the gender gap.

    Initiatives include longer-term postdoctoral positions with part-time options, support for returning to astronomy research after career breaks, increasing the fraction of permanent positions relative to fixed-term contracts, offering women-only permanent positions, recruitment of women directly to professorial levels, and mentoring of women for promotion to the highest levels.

    Most if not all Australian organizations that employ astronomers have signed up to the Pleiades Awards, and are showing genuine commitment to change.

    So why is progress still so slow?

    Seven years on, we would expect to have seen an increase in women recruited to, and retained in, senior positions.

    And we are, but the effect is far from uniform. My own organization, the ARC Center of Excellence in All-Sky Astrophysics in 3 Dimensions (ASTRO 3D), is on track for a 50:50 women-to-men ratio working at senior levels by the end of this year.

    The University of Sydney School of Physics – Faculty of Science (AU) has made nine senior appointments over the past three years, seven of them women.

    But these examples are outliers. At many institutions, inequitable hiring ratios and high departure rates persist despite a large pool of women astronomers at postdoc levels and the positive encouragement of the Pleiades Awards.

    Using these results and my new workforce models, I have shown that current targets of 33% or 50% women at all levels are unattainable if the status quo remains.
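    To see why those targets recede under the status quo, here is a deliberately crude cohort sketch in Python (illustrative only: it reuses the stage-by-stage departure rates quoted earlier in this article, but it is a static snapshot, not the time-dependent workforce model published in Nature Astronomy):

    # Toy pipeline snapshot using the departure rates quoted above.
    phd_women, phd_men = 40.0, 60.0   # roughly 40% of astronomy PhDs are awarded to women

    departure = {
        # career transition: (fraction of women leaving, fraction of men leaving)
        "postdoc to assistant professor": (0.62, 0.17),
        "assistant to full professor":    (0.47, 0.24),   # male rate about half the female rate
    }

    women, men = phd_women, phd_men
    for stage, (leave_women, leave_men) in departure.items():
        women *= 1.0 - leave_women
        men *= 1.0 - leave_men
        share = 100.0 * women / (women + men)
        print(f"after the {stage} transition, women make up {share:.0f}% of the cohort")

    Even this static snapshot lands close to the article’s headline figures: roughly 40% of PhDs but well under 20% of the most senior cohort. Turning it into a projection of when a 33% or 50% target would be reached additionally requires assumptions about hiring volumes, retirements and affirmative measures, which is what the full workforce models explore.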

    How to move forward

    I propose a raft of affirmative measures to increase the presence of women at all senior levels in Australian astronomy—and keep them there.

    These include creating multiple women-only roles, creating prestigious senior positions for women, and hiring into multiple positions for men and women to avoid perceptions of tokenism. Improved workplace flexibility is crucial to allowing female researchers to develop their careers while balancing other responsibilities.

    Australia is far from unique when it comes to dealing with gender disparities in astronomy. Broadly similar situations persist in China, the United States and Europe. An April 2019 paper [Nature Astronomy] outlined similar discrimination experienced by women astronomers in Europe.

    Australia, however, is well placed to play a leading role in correcting the imbalance. With the right action, it wouldn’t take long to make our approach to gender equity as world-leading as our research.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About Science X in 100 words
    Science X™ is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004 (Physorg.com), Science X’s readership has grown steadily to include 5 million scientists, researchers, and engineers every month. Science X publishes approximately 200 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Science X community members enjoy access to many personalized features such as social networking, a personal home page set-up, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.
    Phys.org, one of the Science X network sites, reaches 1.75 million scientists, researchers, and engineers every month and publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quantcast's 2009 rankings include Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

     
  • richardmitnick 1:05 pm on April 14, 2021 Permalink | Reply
    Tags: "NOVEL THEORY ADDRESSES CENTURIES-OLD PHYSICS PROBLEM NOVEL THEORY ADDRESSES CENTURIES-OLD PHYSICS PROBLEM", , , , Mathematics, Mutual gravitational attraction, , The Hebrew University of Jerusalem [הַאוּנִיבֶרְסִיטָה הַעִבְרִית בְּיְרוּשָׁלַיִם ]   

    From The Hebrew University of Jerusalem [הַאוּנִיבֶרְסִיטָה הַעִבְרִית בְּיְרוּשָׁלַיִם ] (IL): “NOVEL THEORY ADDRESSES CENTURIES-OLD PHYSICS PROBLEM” 

    Hebrew U of Jerusalem bloc

    From The Hebrew University of Jerusalem [הַאוּנִיבֶרְסִיטָה הַעִבְרִית בְּיְרוּשָׁלַיִם‎] (IL)

    12/04/2021

    Hebrew University of Jerusalem Researcher introduces a new approach to the “three-body problem”; predicts its outcome statistics.

    1

    The “three-body problem,” the term coined for predicting the motion of three gravitating bodies in space, is essential for understanding a variety of astrophysical processes as well as a large class of mechanical problems, and has occupied some of the world’s best physicists, astronomers and mathematicians for over three centuries. Their attempts have led to the discovery of several important fields of science; yet its solution remained a mystery.

    At the end of the 17th century, Sir Isaac Newton succeeded in explaining the motion of the planets around the sun through a law of universal gravitation. He also sought to explain the motion of the moon. Since both the earth and the sun determine the motion of the moon, Newton became interested in the problem of predicting the motion of three bodies moving in space under the influence of their mutual gravitational attraction (see attached illustration), a problem that later became known as “the three-body problem”. However, unlike the two-body problem, Newton was unable to obtain a general mathematical solution for it. Indeed, the three-body problem proved easy to define, yet difficult to solve.

    New research, led by Prof. Barak Kol of the Racah Institute of Physics at the Hebrew University, adds a step to this scientific journey that began with Newton, touching on the limits of scientific prediction, and the role of chaos in it.

    The theoretical study presents a novel and exact reduction of the problem, enabled by a re-examination of the basic concepts that underlie previous theories. It allows for a precise prediction of the probability for each of the three bodies to escape the system.

    Following Newton and two centuries of fruitful research in the field, including by Euler, Lagrange and Jacobi, by the late 19th century the mathematician Poincaré discovered that the problem exhibits extreme sensitivity to the bodies’ initial positions and velocities. This sensitivity, which later became known as chaos, has far-reaching implications – it indicates that there is no deterministic closed-form solution to the three-body problem.

    In the 20th century, the development of computers made it possible to re-examine the problem with the help of computerized simulations of the bodies’ motion. The simulations showed that under some general assumptions, a three-body system experiences periods of chaotic, or random, motion alternating with periods of regular motion, until finally the system disintegrates into a pair of bodies orbiting their common center of mass and a third one moving away, or escaping, from them.
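
    To get a feel for what such simulations involve, here is a minimal sketch of a planar three-body integration – an illustration only, not the regularized production codes used in this kind of research. It integrates the same system twice, the second time with one initial coordinate nudged by one part in a million, and prints how far the two runs have drifted apart; that growing separation is the sensitivity Poincaré identified. The softening length and the specific initial triangle are illustrative choices, not taken from the paper.

```python
# Minimal planar three-body integration (G = 1, equal masses); an illustrative
# sketch only, not the regularized production codes used in scattering studies.
import numpy as np
from scipy.integrate import solve_ivp

MASSES = np.array([1.0, 1.0, 1.0])
SOFTENING = 0.05  # small force softening keeps this toy integration well behaved;
                  # real studies resolve close encounters with regularized integrators

def derivatives(t, y):
    # y = [x1, y1, x2, y2, x3, y3, vx1, vy1, vx2, vy2, vx3, vy3]
    pos = y[:6].reshape(3, 2)
    vel = y[6:].reshape(3, 2)
    acc = np.zeros_like(pos)
    for i in range(3):
        for j in range(3):
            if i != j:
                r = pos[j] - pos[i]
                d = np.sqrt(r @ r + SOFTENING**2)
                acc[i] += MASSES[j] * r / d**3
    return np.concatenate([vel.ravel(), acc.ravel()])

# Three bodies released from rest at the corners of a scalene triangle.
pos0 = np.array([[1.0, 3.0], [-2.0, -1.0], [1.0, -1.0]])
y0 = np.concatenate([pos0.ravel(), np.zeros(6)])

# Second run: nudge a single coordinate by one part in a million.
y0b = y0.copy()
y0b[0] += 1e-6

runs = [solve_ivp(derivatives, (0.0, 20.0), y, rtol=1e-8, atol=1e-8) for y in (y0, y0b)]
drift = np.linalg.norm(runs[0].y[:6, -1] - runs[1].y[:6, -1])
print(f"initial offset: 1e-6, position difference between the runs at t = 20: {drift:.3g}")
```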

    The chaotic nature implies that not only is a closed-form solution impossible, but also computer simulations cannot provide specific and reliable long-term predictions. However, the availability of large sets of simulations led in 1976 to the idea of seeking a statistical prediction of the system, and in particular, predicting the escape probability of each of the three bodies. In this sense, the original goal, to find a deterministic solution, was found to be wrong, and it was recognized that the right goal is to find a statistical solution.

    Determining the statistical solution has proven to be no easy task, due to three features of this problem: the system presents chaotic motion that alternates with regular motion; it is unbounded; and it is susceptible to disintegration. A year ago, Dr. Nicholas Stone of the Racah Institute of Physics at the Hebrew University and his colleagues used a new method of calculation and, for the first time, achieved a closed mathematical expression for the statistical solution. However, this method, like all its predecessor statistical approaches, rests on certain assumptions. Inspired by these results, Kol initiated a re-examination of these assumptions.

    In order to understand the novelty of the new approach, it is necessary to discuss the notion of “phase space” that underlies all statistical theories in physics. A phase space is nothing but the space of all positions and velocities of the particles that compose a system. For instance, the phase space of a single particle allowed to move on a meter-long track with a velocity of at most two meters per second, is a rectangle, whose width is 1 meter, and whose length is four meters per second (since the velocity can be directed either to the left or to the right).

    Normally, physicists identify the probability of an event of interest with its associated phase space volume (phase volume, for short). For instance, the probability for the particle to be found in the left half of the track is associated with the volume of the left half of the phase space rectangle, which is one half of the total volume.
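
    As a quick numerical check of that identification, the track example can be sampled directly: draw positions and velocities uniformly from the phase-space rectangle and count the fraction that lands in the left half. The snippet below is only a sanity check of the “probability equals phase-volume fraction” idea, not part of the paper’s calculation.

```python
# Phase space of one particle on a 1 m track with |v| <= 2 m/s:
# a rectangle 1 m wide and 4 m/s "long", total phase volume 4 (in m·m/s).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(0.0, 1.0, n)    # position along the track, metres
v = rng.uniform(-2.0, 2.0, n)   # velocity, metres per second (irrelevant to which half)

p_left = np.mean(x < 0.5)
print(f"estimated probability of finding the particle in the left half: {p_left:.3f}")
# -> about 0.5, i.e. the left half's phase volume (0.5 m x 4 m/s = 2)
#    divided by the total phase volume (1 m x 4 m/s = 4).
```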

    The three-body problem is unbounded and the gravitational force is unlimited in range. This suggests infinite phase space volumes, which would imply infinite probabilities. In order to overcome this and related issues, all previous approaches postulated a “strong interaction region” and ignored phase volumes outside of it, such that phase volumes become finite. However, since the gravitational force decreases with distance, but never disappears, this is not an accurate theory, and it introduces a certain arbitrariness into the model.

    The new study, recently published in the scientific journal Celestial Mechanics and Dynamical Astronomy, focuses on the outgoing flux of phase volume, rather than the phase volume itself. For instance, consider a volume of gas within a container with two small holes in its wall, and a marked gas molecule moving within it. In this case, the probability that the molecule eventually exits through a given hole is proportional to the flux of the surrounding gas through that hole.

    Since the flux is finite even when the volume is infinite, this flux-based approach avoids the artificial problem of infinite probabilities, without ever introducing the artificial strong interaction region.
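
    The container analogy can also be turned into a toy simulation: a single molecule random-walks inside a box whose right-hand wall has two holes of different widths, and the fraction of escapes through each hole tracks the relative flux through it. All of the numbers below (box size, hole widths, step size) are illustrative assumptions; this is a cartoon of the flux idea, not the celestial-mechanics calculation in the paper.

```python
# A molecule random-walking in a unit box; the right-hand wall has two holes,
# one three times wider than the other. Escape probabilities come out roughly
# proportional to the hole widths, i.e. to the flux through each hole.
import numpy as np

rng = np.random.default_rng(1)
BOX = 1.0
HOLE_A = (0.10, 0.20)   # narrow hole (width 0.1)
HOLE_B = (0.60, 0.90)   # wide hole (width 0.3)
STEP = 0.05             # random-walk step size

def escape_hole(max_steps=200_000):
    """Walk one molecule until it leaves through hole A or hole B."""
    p = rng.uniform(0.0, BOX, size=2)
    for _ in range(max_steps):
        p = np.abs(p + rng.normal(0.0, STEP, size=2))   # reflect off left/bottom walls
        if p[1] > BOX:
            p[1] = 2 * BOX - p[1]                       # reflect off the top wall
        if p[0] > BOX:                                  # reached the right-hand wall
            if HOLE_A[0] < p[1] < HOLE_A[1]:
                return "A"
            if HOLE_B[0] < p[1] < HOLE_B[1]:
                return "B"
            p[0] = 2 * BOX - p[0]                       # solid part of the wall: bounce back
    return None

results = [escape_hole() for _ in range(300)]
print("escaped through A:", round(results.count("A") / len(results), 2),
      "  through B:", round(results.count("B") / len(results), 2),
      "  (hole widths 0.1 vs 0.3)")
```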

    In order to treat the mix between chaotic and regular motion, the flux-based theory further introduces an unknown quantity, the emissivity. In this way, the statistical prediction factorizes exactly into a closed-form expression and the emissivity, which is presumably simpler and is left for future study.

    The flux-based theory predicts the escape probabilities of each body, under the assumption that the emissivity can be averaged out and ignored. The predictions differ from all previous frameworks, and Prof. Kol emphasizes that “tests by millions of computer simulations show strong agreement between theory and simulation.” The simulations were carried out in collaboration with Viraj Manwadkar from the University of Chicago, Alessandro Trani from the Okinawa Institute in Japan, and Nathan Leigh from the University of Concepción in Chile. This agreement proves that understanding the system requires a paradigm shift and that the new conceptual basis describes the system well.

    It turns out, then, that even for the foundations of such an old problem, innovation is possible.

    The implications of this study are wide-ranging, and it is expected to influence both the solution of a variety of astrophysical problems and the understanding of an entire class of problems in mechanics. In astrophysics, it may apply to the mechanism that creates the pairs of compact bodies that are the source of gravitational waves, and it may deepen the understanding of the dynamics within star clusters. In mechanics, the three-body problem is a prototype for a variety of chaotic problems, so progress on it is likely to carry over to additional problems in this important class.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Hebrew University of Jerusalem campus

    The Hebrew University of Jerusalem (IL), founded in 1918 and opened officially in 1925, is Israel’s premier university as well as its leading research institution. The Hebrew University is ranked internationally among the 100 leading universities in the world and first among Israeli universities.
    The recognition the Hebrew University has attained confirms its reputation for excellence and its leading role in the scientific community. It stresses excellence and offers a wide array of study opportunities in the humanities, social sciences, exact sciences and medicine. The university encourages multi-disciplinary activities in Israel and overseas and serves as a bridge between academic research and its social and industrial applications.

    The Hebrew University has set as its goals the training of public, scientific, educational and professional leadership; the preservation of and research into Jewish, cultural, spiritual and intellectual traditions; and the expansion of the boundaries of knowledge for the benefit of all humanity.

     
  • richardmitnick 4:25 pm on January 27, 2021 Permalink | Reply
    Tags: "How heavy is Dark Matter? Scientists radically narrow the potential mass range for the first time", , , Dark Matter cannot be either ‘ultra-light’ or ‘super-heavy’., Gravity acts on Dark Matter just as it acts on the visible universe., If it turns out that the mass of Dark Matter is outside of the range predicted by the Sussex team then it will also prove that an additional force acts on Dark Matter., Mathematics, , , U Sussex (UK)   

    From U Sussex (UK): “How heavy is Dark Matter? Scientists radically narrow the potential mass range for the first time” 

    From U Sussex (UK)

    27 January 2021
    Anna Ford

    1
    Credit: Greg Rakozy on Unsplash.

    Scientists have calculated the mass range for Dark Matter – and it’s tighter than the science world thought.

    Their findings – due to be published in Physics Letters B in March – radically narrow the range of potential masses for Dark Matter particles, and help to focus the search for future Dark Matter-hunters. The University of Sussex researchers used the established fact that gravity acts on Dark Matter just as it acts on the visible universe to work out the lower and upper limits of Dark Matter’s mass.

    The results show that Dark Matter cannot be either ‘ultra-light’ or ‘super-heavy’, as some have theorised, unless an as-yet undiscovered force also acts upon it.

    The team used the assumption that the only force acting on Dark Matter is gravity, and calculated that Dark Matter particles must have a mass between 10⁻³ eV and 10⁷ eV. That’s a much tighter range than the 10⁻²⁴ eV – 10¹⁹ GeV spectrum which is generally theorised.
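
    To put those figures in perspective, a one-line comparison converts both windows to the same units (electronvolts) and measures their widths in orders of magnitude; the numbers are simply those quoted above.

```python
import math

# Conventionally theorised window: 10^-24 eV up to 10^19 GeV (= 10^28 eV).
old_low_ev, old_high_ev = 1e-24, 1e19 * 1e9
# Window derived in the Sussex paper: 10^-3 eV up to 10^7 eV.
new_low_ev, new_high_ev = 1e-3, 1e7

print("conventional window:", math.log10(old_high_ev / old_low_ev), "orders of magnitude")
print("Sussex window:      ", math.log10(new_high_ev / new_low_ev), "orders of magnitude")
```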

    What makes the discovery even more significant is that if it turns out that the mass of Dark Matter is outside of the range predicted by the Sussex team, then it will also prove that an additional force – as well as gravity – acts on Dark Matter.

    Professor Xavier Calmet from the School of Mathematical and Physical Sciences at the University of Sussex, said:

    “This is the first time that anyone has thought to use what we know about quantum gravity as a way to calculate the mass range for Dark Matter. We were surprised when we realised no-one had done it before – as were the fellow scientists reviewing our paper.

    “What we’ve done shows that Dark Matter cannot be either ‘ultra-light’ or ‘super-heavy’ as some theorise – unless there is an as-yet unknown additional force acting on it. This piece of research helps physicists in two ways: it focuses the search area for Dark Matter, and it will potentially also help reveal whether or not there is a mysterious unknown additional force in the universe.”

    Folkert Kuipers, a PhD student working with Professor Calmet, at the University of Sussex, said:

    “As a PhD student, it’s great to be able to work on research as exciting and impactful as this. Our findings are very good news for experimentalists as it will help them to get closer to discovering the true nature of Dark Matter.”

    The visible universe – such as ourselves, the planets and stars – accounts for 25 per cent of all mass in the universe. The remaining 75 per cent is Dark Matter.

    It is known that gravity acts on Dark Matter because that’s what accounts for the shape of galaxies.

    Dark Matter Background
    Fritz Zwicky discovered Dark Matter in the 1930s while observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM who was denied the Nobel Prize, did most of the work on Dark Matter some 30 years later.

    Fritz Zwicky from http://palomarskies.blogspot.com.


    Coma cluster via NASA/ESA Hubble.


    In modern times, it was the astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate it had a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.
    Thirty years later, the astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that the outer regions of galaxies rotate at the same speed as the regions near their centers, whereas, by Newtonian expectations, the outskirts should rotate more slowly. But they do not. The only way to explain this is if the visible galaxy is only the central part of some much larger structure – as if it were merely the label on a vinyl LP, so to speak – causing the galaxy to show a consistent rotation speed from center to edge.
    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
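
    The rotation-curve argument can also be put into rough numbers: with only a centrally concentrated visible mass, circular speeds should fall off with radius, whereas adding a halo whose enclosed mass grows roughly linearly with radius flattens the curve. The toy model below uses arbitrary units and made-up values; it is a schematic of the reasoning, not a fit to any real galaxy.

```python
# Schematic rotation curves, v_c(r) = sqrt(G * M(<r) / r), in units where G = 1.
import numpy as np

r = np.linspace(0.5, 10.0, 6)   # radii, arbitrary units
M_visible = 1.0                 # centrally concentrated luminous mass
halo_slope = 0.4                # toy dark halo: enclosed mass M_halo(<r) = 0.4 * r

v_visible_only = np.sqrt(M_visible / r)                    # Keplerian fall-off
v_with_halo = np.sqrt((M_visible + halo_slope * r) / r)    # flattens at large radius

for ri, v1, v2 in zip(r, v_visible_only, v_with_halo):
    print(f"r = {ri:4.1f}   visible only: v = {v1:.2f}   visible + halo: v = {v2:.2f}")
```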

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science).


    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL).


    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970. https://home.dtm.ciw.edu.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Sussex (UK) is a leading research-intensive university near Brighton. We have both an international and local outlook, with staff and students from more than 100 countries and frequent engagement in community activities and services.

     
  • richardmitnick 2:55 pm on January 26, 2021 Permalink | Reply
    Tags: "Connected Moments" for Quantum Computing, , , , Mathematics,   

    From DOE’s Pacific Northwest National Laboratory: “Connected Moments for Quantum Computing” 

    From DOE’s Pacific Northwest National Laboratory

    January 12, 2021 [Just now in social media.]

    Karyn Hede
    karyn.hede@pnnl.gov

    Math shortcut shaves time and cost of quantum calculations while maintaining accuracy.

    1

    Quantum computers are exciting in part because they are being designed to show how the world is held together. This invisible “glue” is made of impossibly tiny particles and energy. And like all glue, it’s kind of messy.

    Once the formula for the glue is known, it can be used to hold molecules together in useful structures. And these new kinds of materials and chemicals may one day fuel our vehicles and warm our homes.

    But before all that, we need math. That’s where theoretical chemists Bo Peng and Karol Kowalski have excelled. The Pacific Northwest National Laboratory duo are teaching today’s computers to do the math that will reveal the universe’s subatomic glue, once full-scale quantum computing becomes feasible.

    2
    The connected moments mathematical method is helping understand the universal energy glue that binds molecules together. Credit: Nathan Johnson /Pacific Northwest National Laboratory.

    The team recently showed that they could use a mathematical tool called “connected moments” to greatly reduce the time and calculation costs of conducting one kind of quantum calculation. Using what’s called a quantum simulator, the team showed that they could accurately model simple molecules. This feat, which mathematically describes the energy glue holding together molecules, garnered “editor’s pick” in the Journal of Chemical Physics, signifying its scientific importance.

    “We showed that we can use this approach to reduce the complexity of quantum calculations needed to model a chemical system, while also reducing errors,” said Peng. “We see this as a compromise that will allow us to get from what we can do right now with a quantum computer to what will be possible in the near future.”

    Connected moments

    The research team applied a mathematical concept that was first described 40 years ago. They were attracted to the connected moments method because of its ability to accurately reconstruct the total energy of a molecular system using much less time and many fewer cycles of calculations. This is important because today’s quantum computers are prone to error. The more quantum circuits needed for a calculation, the more opportunity for error to creep in. By using fewer of these fragile quantum circuits, they reduced the error rate of the whole calculation, while maintaining an accurate result.
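
    The flavour of that 40-year-old idea can be shown with the simplest member of the connected-moments family, the classic CMX(2) formula, which builds a ground-state energy estimate from the first three Hamiltonian moments ⟨H⟩, ⟨H²⟩ and ⟨H³⟩ evaluated in a trial state. The sketch below applies it to a made-up two-level Hamiltonian; the published work uses a more elaborate, quantum-circuit-friendly functional, so treat this purely as an illustration of how moments feed an energy estimate.

```python
# Classic connected-moments (CMX(2)) estimate for a toy two-level Hamiltonian.
# The connected moments I_k are the cumulants of H in the trial state, and
# CMX(2) approximates the ground-state energy as E ~ I1 - I2**2 / I3.
import numpy as np

H = np.array([[-1.0, 0.4],
              [ 0.4, 0.6]])           # made-up Hamiltonian
phi = np.array([1.0, 0.1])            # crude trial state
phi = phi / np.linalg.norm(phi)

# Raw moments <H>, <H^2>, <H^3> in the trial state.
mu = [phi @ np.linalg.matrix_power(H, k) @ phi for k in range(1, 4)]

I1 = mu[0]
I2 = mu[1] - mu[0] ** 2                                 # variance of H
I3 = mu[2] - 3 * mu[0] * mu[1] + 2 * mu[0] ** 3         # third cumulant

E_cmx2 = I1 - I2 ** 2 / I3
E_exact = np.linalg.eigvalsh(H)[0]
print(f"trial-state energy <H> : {I1:+.4f}")
print(f"CMX(2) estimate        : {E_cmx2:+.4f}")
print(f"exact ground state     : {E_exact:+.4f}")
```

    Even with this crude trial state, the moment-corrected estimate lands much closer to the exact ground-state energy than the raw expectation value, which is the kind of gain the PNNL approach exploits while keeping the quantum circuits shallow.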

    “The design of this algorithm allows us to do the equivalent of a full-scale quantum calculation with modest resources,” said Kowalski.

    Time-saving method applies to chemistry and materials science.

    In the study, the team established the reliability of the connected moments method for accurately describing the energy in both a simple molecule of hydrogen and a simple metal impurity. Using relatively simple models allowed the team to compare its method with existing full-scale computing models known to be correct and accurate.

    “This study demonstrated that the connected moments method can advance the accuracy and affordability of electronic structure methods,” said Kowalski. “We are already working on extending the work to larger systems, and integrating it with emerging quantum computing frameworks.”

    By studying both a chemical system and a material system, the researchers showed the versatility of the approach for describing the total energy in both systems. The preparation of this so-called “initial state” is a steppingstone to studying more complex interactions between molecules—how the energy shifts around to keep molecules glued together.

    Bridge to quantum computing

    The published study [The Journal of Chemical Physics] used IBM’s QISKIT quantum computing software, but work is already under way to extend its use with other quantum computing platforms. Specifically, the research team is working to extend the work to support XACC, an infrastructure developed at Oak Ridge National Laboratory. The XACC software will allow the scientists to take advantage of the fastest, most accurate world-class computers as a quantum–classical computing hybrid.

    This discovery will now be incorporated into research to be performed in the Quantum Science Center, a U.S. Department of Energy Office of Science (DOE-SC)-supported initiative.

    “This work was conducted with a very small system of four qubits, but we hope to extend to a 12-qubit system in the near term, with an ultimate goal of a 50-qubit system within three to five years,” said Peng.

    At that point, the messy glue of the universe may be easier to apply.

    The research was supported by the DOE-SC Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Pacific Northwest National Laboratory (PNNL) is one of the United States Department of Energy National Laboratories, managed by the Department of Energy’s Office of Science. The main campus of the laboratory is in Richland, Washington.

    PNNL scientists conduct basic and applied research and development to strengthen U.S. scientific foundations for fundamental research and innovation; prevent and counter acts of terrorism through applied research in information analysis, cyber security, and the nonproliferation of weapons of mass destruction; increase the U.S. energy capacity and reduce dependence on imported oil; and reduce the effects of human activity on the environment. PNNL has been operated by Battelle Memorial Institute since 1965.

     