Tagged: Machine learning

  • richardmitnick 1:17 pm on August 2, 2022
    Tags: "Reinforcement learning" in which the model learns through trial-and-error with a reward., "Using artificial intelligence to control digital manufacturing", , , , Machine learning, , Researchers train a machine-learning model to monitor and adjust the 3D printing process to correct errors in real-time.,   

    From The MIT Computer Science & Artificial Intelligence Laboratory (CSAIL): “Using artificial intelligence to control digital manufacturing”


    From The MIT Computer Science & Artificial Intelligence Laboratory (CSAIL)

    at

    The Massachusetts Institute of Technology

    August 2, 2022

    Adam Zewe | MIT News Office

    Researchers train a machine-learning model to monitor and adjust the 3D printing process to correct errors in real-time.


    [SIGGRAPH 2022] Closed-Loop Control of Direct Ink Writing via Reinforcement Learning.

    MIT researchers have trained a machine-learning model to monitor and adjust the 3D printing process in real-time.
    Image: Courtesy of the researchers.

    Scientists and engineers are constantly developing new materials with unique properties that can be used for 3D printing, but figuring out how to print with these materials can be a complex, costly conundrum.

    Often, an expert operator must use manual trial-and-error — possibly making thousands of prints — to determine ideal parameters that consistently print a new material effectively. These parameters include printing speed and how much material the printer deposits.

    MIT researchers have now used artificial intelligence to streamline this procedure. They developed a machine-learning system that uses computer vision to watch the manufacturing process and then correct errors in how it handles the material in real-time.

    They used simulations to teach a neural network how to adjust printing parameters to minimize error, and then applied that controller to a real 3D printer. Their system printed objects more accurately than all the other 3D printing controllers they compared it to.

    The work avoids the prohibitively expensive process of printing thousands or millions of real objects to train the neural network. And it could enable engineers to more easily incorporate novel materials into their prints, which could help them develop objects with special electrical or chemical properties. It could also help technicians make adjustments to the printing process on-the-fly if material or environmental conditions change unexpectedly.

    “This project is really the first demonstration of building a manufacturing system that uses machine learning to learn a complex control policy,” says senior author Wojciech Matusik, professor of electrical engineering and computer science at MIT who leads the Computational Design and Fabrication Group (CDFG) within the Computer Science and Artificial Intelligence Laboratory (CSAIL). “If you have manufacturing machines that are more intelligent, they can adapt to the changing environment in the workplace in real-time, to improve the yields or the accuracy of the system. You can squeeze more out of the machine.”

    The co-lead authors on the research [Closed-Loop Control of Direct Ink Writing via Reinforcement Learning (below)] are Mike Foshey, a mechanical engineer and project manager in the CDFG, and Michal Piovarci, a postdoc at the Institute of Science and Technology Austria. MIT co-authors include Jie Xu, a graduate student in electrical engineering and computer science, and Timothy Erps, a former technical associate with the CDFG.

    Picking parameters

    Determining the ideal parameters of a digital manufacturing process can be one of the most expensive parts of the process because so much trial-and-error is required. And once a technician finds a combination that works well, those parameters are only ideal for one specific situation. She has little data on how the material will behave in other environments, on different hardware, or if a new batch exhibits different properties.

    Using a machine-learning system is fraught with challenges, too. First, the researchers needed to measure what was happening on the printer in real-time.

    To do this, they developed a machine-vision system using two cameras aimed at the nozzle of the 3D printer. The system shines light at the material as it is deposited and, based on how much light passes through, calculates the material’s thickness.
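    The article doesn’t give the calibration details, but the idea of turning transmitted light into a thickness estimate can be sketched with a simple attenuation model. Below is a minimal Python illustration assuming Beer-Lambert-style absorption; the constants I0 and k are hypothetical calibration values, not numbers from the paper.

    ```python
    import numpy as np

    # Hypothetical calibration constants (assumptions, not from the paper):
    I0 = 1.0   # incident light intensity, calibrated against a bare substrate
    k = 3.5    # attenuation coefficient of the ink, per millimetre

    def thickness_map(transmitted):
        """Estimate per-pixel material thickness (mm) from transmitted light.

        Assumes Beer-Lambert attenuation, I = I0 * exp(-k * t), and inverts it.
        Intensities are clipped away from zero so the logarithm stays finite.
        """
        transmitted = np.clip(transmitted, 1e-6, I0)
        return -np.log(transmitted / I0) / k

    # Synthetic camera frame: darker pixels mean more deposited material.
    frame = np.array([[0.9, 0.5],
                      [0.2, 0.05]])
    print(thickness_map(frame))
    ```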

    “You can think of the vision system as a set of eyes watching the process in real-time,” Foshey says.

    The controller would then process images it receives from the vision system and, based on any error it sees, adjust the feed rate and the direction of the printer.

    But training a neural network-based controller to understand this manufacturing process is data-intensive, and would require making millions of prints. So, the researchers built a simulator instead.

    Successful simulation

    To train their controller, they used a process known as reinforcement learning in which the model learns through trial-and-error with a reward. The model was tasked with selecting printing parameters that would create a certain object in a simulated environment. After being shown the expected output, the model was rewarded when the parameters it chose minimized the error between its print and the expected outcome.

    In this case, an “error” means the model either dispensed too much material, placing it in areas that should have been left open, or did not dispense enough, leaving open spots that should be filled in. As the model performed more simulated prints, it updated its control policy to maximize the reward, becoming more and more accurate.
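    The release doesn’t spell out the reward function, but the description above, which penalizes both excess and missing material, maps naturally onto the negative mismatch between a printed material map and a target map. A minimal sketch under that assumption, with toy array inputs:

    ```python
    import numpy as np

    def reward(printed, target):
        """Hypothetical per-print reward: the closer the printed material
        map is to the target, the higher (less negative) the reward.

        Over-extrusion (material where the target is empty) and
        under-extrusion (empty spots that should be filled) both count
        against the agent, matching the two error types described above.
        """
        over = np.clip(printed - target, 0.0, None).sum()   # excess material
        under = np.clip(target - printed, 0.0, None).sum()  # missing material
        return -(over + under)

    target = np.array([[1.0, 1.0], [0.0, 0.0]])
    printed = np.array([[0.8, 1.2], [0.3, 0.0]])
    print(reward(printed, target))  # -0.7: penalties for both error types
    ```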

    However, the real world is messier than a simulation. In practice, conditions typically change due to slight variations or noise in the printing process. So the researchers created a numerical model that approximates noise from the 3D printer. They used this model to add noise to the simulation, which led to more realistic results.
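    The form of the researchers’ numerical noise model isn’t given in the release; the sketch below shows the general idea with an AR(1) disturbance on the simulated deposition rate, which produces the slowly drifting, correlated errors real printers tend to exhibit. The parameters are assumptions for illustration only.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def noisy_flow(nominal_flow, n_steps, sigma=0.05, corr=0.9):
        """Simulated deposition rate with correlated process noise.

        An AR(1) disturbance stands in for printer noise: each step's error
        carries over a fraction `corr` of the previous step's error plus a
        fresh Gaussian kick of scale `sigma`. Both parameters are assumed.
        """
        flow = np.empty(n_steps)
        eps = 0.0
        for i in range(n_steps):
            eps = corr * eps + rng.normal(0.0, sigma)
            flow[i] = nominal_flow * (1.0 + eps)
        return flow

    print(noisy_flow(1.0, 5))  # nominal flow of 1.0 perturbed by drifting noise
    ```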

    “The interesting thing we found was that, by implementing this noise model, we were able to transfer the control policy that was purely trained in simulation onto hardware without training with any physical experimentation,” Foshey says. “We didn’t need to do any fine-tuning on the actual equipment afterwards.”

    When they tested the controller, it printed objects more accurately than any other control method they evaluated. It performed especially well at infill printing, which is printing the interior of an object. Some other controllers deposited so much material that the printed object bulged up, but the researchers’ controller adjusted the printing path so the object stayed level.

    Their control policy can even learn how materials spread after being deposited and adjust parameters accordingly.

    “We were also able to design control policies that could control for different types of materials on-the-fly. So if you had a manufacturing process out in the field and you wanted to change the material, you wouldn’t have to revalidate the manufacturing process. You could just load the new material and the controller would automatically adjust,” Foshey says.

    Now that they have shown the effectiveness of this technique for 3D printing, the researchers want to develop controllers for other manufacturing processes. They’d also like to see how the approach can be modified for scenarios where there are multiple layers of material, or multiple materials being printed at once. In addition, their approach assumed each material has a fixed viscosity (“syrupiness”), but a future iteration could use AI to recognize and adjust for viscosity in real-time.

    Additional co-authors on this work include Vahid Babaei, who leads the Artificial Intelligence Aided Design and Manufacturing Group at the Max Planck Institute; Piotr Didyk, associate professor at the University of Lugano in Switzerland; Szymon Rusinkiewicz, the David M. Siegel ’83 Professor of computer science at Princeton University; and Bernd Bickel, professor at the Institute of Science and Technology Austria.

    Science paper:
    Closed-Loop Control of Direct Ink Writing via Reinforcement Learning

    The work was supported, in part, by the FWF Lise-Meitner program, a European Research Council starting grant, and the U.S. National Science Foundation.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    The MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) is a research institute at the Massachusetts Institute of Technology (MIT) formed by the 2003 merger of the Laboratory for Computer Science (LCS) and the Artificial Intelligence Laboratory (AI Lab). Housed within the Ray and Maria Stata Center, CSAIL is the largest on-campus laboratory as measured by research scope and membership. It is part of the Schwarzman College of Computing but is also overseen by the MIT Vice President of Research.

    Research activities

    CSAIL’s research activities are organized around a number of semi-autonomous research groups, each of which is headed by one or more professors or research scientists. These groups are divided up into seven general areas of research:

    Artificial intelligence
    Computational biology
    Graphics and vision
    Language and learning
    Theory of computation
    Robotics
    Systems (includes computer architecture, databases, distributed systems, networks and networked systems, operating systems, programming methodology, and software engineering among others)

    In addition, CSAIL hosts the World Wide Web Consortium (W3C).

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and the Whitehead Institute.

    Massachusetts Institute of Technology Haystack Observatory, Westford, Massachusetts, USA; altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities (AAU).

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia , wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but a combination with elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, the Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, Massachusetts Institute of Technology became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4,000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons: philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and Massachusetts Institute of Technology’s defense research. In this period Massachusetts Institute of Technology’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, Massachusetts Institute of Technology launched OpenCourseWare to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of the Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced LIGO was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and was funded by the National Science Foundation.

    Caltech/MIT Advanced LIGO

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in Physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique, which served as the essential blueprint for LIGO.

    The mission of Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 8:24 pm on July 19, 2022
    Tags: "Hey Siri:: How Much Does This Galaxy Cluster Weigh?", , , , , Machine learning,   

    From Carnegie Mellon University: “Hey Siri: How Much Does This Galaxy Cluster Weigh?”

    From Carnegie Mellon University

    July 19, 2022
    Amy Pavlak Laird

    Jocelyn Duffy
    Mellon College of Science
    jhduffy@cmu.edu
    412-268-9982

    It’s been nearly a century since astronomer Fritz Zwicky first calculated the mass of the Coma Cluster, a dense collection of almost 1,000 galaxies located in the nearby universe. But estimating the mass of something so huge and dense, not to mention 320 million light-years away, has its share of problems — then and now. Zwicky’s initial measurements, and the many made since, are plagued by sources of error that bias the mass higher or lower.

    Now, using tools from machine learning, a team led by Carnegie Mellon University physicists has developed a deep-learning method that accurately estimates the mass of the Coma Cluster and effectively mitigates the sources of error.

    “People have made mass estimates of the Coma Cluster for many, many years. But by showing that our machine-learning methods are consistent with these previous mass estimates, we are building trust in these new, very powerful methods that are hot in the field of cosmology right now,” said Matthew Ho, a fifth-year graduate student in the Department of Physics’ McWilliams Center for Cosmology and a member of Carnegie Mellon’s NSF AI Planning Institute for Physics of the Future.

    Machine-learning methods are used successfully in a variety of fields to find patterns in complex data, but they have only gained a foothold in cosmology research in the last decade. For some researchers in the field, these methods come with a major concern: Since it is difficult to understand the inner workings of a complex machine-learning model, can they be trusted to do what they are designed to do? Ho and his colleagues set out to address these reservations with their latest research.

    To calculate the mass of the Coma Cluster, Zwicky and others used a dynamical mass measurement, in which they studied the motion or velocity of objects orbiting in and around the cluster and then used their understanding of gravity to infer the cluster’s mass. But this measurement is susceptible to a variety of errors. Galaxy clusters exist as nodes in a huge web of matter distributed throughout the universe, and they are constantly colliding and merging with each other, which distorts the velocity profile of the constituent galaxies.
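    For a sense of what this dynamical estimate looks like, the classic calculation is a virial-theorem scaling, with mass on the order of sigma^2 R / G. The sketch below plugs in illustrative Coma-like numbers (not values from the study); the prefactor of 5 is a common textbook choice for line-of-sight dispersions, used here purely for illustration.

    ```python
    # Order-of-magnitude virial mass estimate, M ~ 5 * sigma^2 * R / G.
    # Illustrative Coma-like inputs, not numbers from the study.

    G = 4.301e-9       # gravitational constant, Mpc * (km/s)^2 / Msun
    sigma = 1000.0     # line-of-sight velocity dispersion, km/s (illustrative)
    R = 3.0            # cluster radius, Mpc (illustrative)

    M = 5.0 * sigma**2 * R / G
    print(f"M ~ {M:.1e} solar masses")  # ~3.5e15 Msun, the right ballpark for Coma
    ```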

    And because astronomers are observing the cluster from a great distance, there are a lot of other things in between that can look and act like they are part of the galaxy cluster, which can bias the mass measurement. Recent research has made progress toward quantifying and accounting for the effect of these errors, but machine-learning-based methods offer an innovative data-driven approach, according to Ho.

    “Our deep-learning method learns from real data what are useful measurements and what are not,” Ho said, adding that their method eliminates errors from interloping galaxies (selection effects) and accounts for various galaxy shapes (physical effects). “The usage of these data-driven methods makes our predictions better and automated.”

    “One of the major shortcomings with standard machine learning approaches is that they usually yield results without any uncertainties,” added Associate Professor of Physics Hy Trac, Ho’s adviser. “Our method includes robust Bayesian statistics, which allow us to quantify the uncertainty in our results.”

    Ho and his colleagues developed their novel method by customizing a well-known machine-learning tool called a convolutional neural network, which is a type of deep-learning algorithm used in image recognition. The researchers trained their model by feeding it data from cosmological simulations of the universe. The model learned by looking at the observable characteristics of thousands of galaxy clusters, whose mass is already known. After in-depth analysis of the model’s handling of the simulation data, Ho applied it to a real system — the Coma Cluster — whose true mass is not known. Ho’s method calculated a mass estimate that is consistent with most of the mass estimates made since the 1980s. This marks the first time this specific machine-learning methodology has been applied to an observational system.
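    The release doesn’t publish the architecture, but the general recipe it describes, a convolutional network trained on simulated clusters whose predictions carry uncertainties, can be sketched in a few lines of PyTorch. Everything below, including the input shape and the Gaussian negative-log-likelihood loss, is an illustrative assumption rather than the authors’ actual model.

    ```python
    import torch
    import torch.nn as nn

    # Minimal sketch of a CNN mass estimator with per-prediction uncertainty.
    # Architecture, input shape, and training data are assumed for illustration.

    class MassEstimator(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Flatten(),
            )
            # Two outputs per cluster: predicted log-mass and its log-variance.
            self.head = nn.Linear(32 * 8 * 8, 2)

        def forward(self, x):
            out = self.head(self.features(x))
            return out[:, 0], out[:, 1]  # mean, log sigma^2

    def gaussian_nll(mean, log_var, target):
        """Gaussian negative log-likelihood: minimizing it trains the network
        to predict both the mass and how uncertain that prediction is."""
        return (0.5 * (log_var + (target - mean) ** 2 / log_var.exp())).mean()

    # Toy training step on random stand-ins for simulated cluster "images"
    # (e.g., 32x32 maps of galaxy velocities vs. projected position).
    model = MassEstimator()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.randn(4, 1, 32, 32)   # fake simulated clusters
    y = torch.randn(4) + 14.5       # fake log10 masses
    mean, log_var = model(x)
    loss = gaussian_nll(mean, log_var, y)
    loss.backward()
    opt.step()
    ```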

    “To build reliability of machine-learning models, it’s important to validate the model’s predictions on well-studied systems, like Coma,” Ho said. “We are currently undertaking a more rigorous, extensive check of our method. The promising results are a strong step toward applying our method on new, unstudied data.”

    Models such as these are going to be critical moving forward, especially when large-scale spectroscopic surveys, such as the Dark Energy Spectroscopic Instrument, the Vera C. Rubin Observatory and Euclid, start releasing the vast amounts of data they are collecting on the sky.

    “Soon we’re going to have a petabyte-scale data flow,” Ho explained. “That’s huge. It’s impossible for humans to parse that by hand. As we work on building models that can be robust estimators of things like mass while mitigating sources of error, another important aspect is that they need to be computationally efficient if we’re going to process this huge data flow from these new surveys. And that is exactly what we are trying to address — using machine learning to improve our analyses and make them faster.”

    This work is supported by NSF AI Institute: Physics of the Future, NSF PHY-2020295, and the McWilliams-PSC Seed Grant Program. The computing resources necessary to complete this analysis were provided by the Pittsburgh Supercomputing Center.

    Pittsburgh Supercomputing Center

    The CosmoSim database used in this paper is a service by the Leibniz-Institute for Astrophysics Potsdam (AIP).

    The study’s authors include: Trac; Michelle Ntampaka, who graduated from CMU with a doctorate in physics in 2017 and is now deputy head of Data Science at the Space Telescope Science Institute; Markus Michael Rau, a McWilliams postdoctoral fellow who is now a postdoctoral fellow at Argonne National Lab; Minghan Chen, who graduated with a bachelor’s degree in physics in 2018 and is a Ph.D. student at the University of California, Santa Barbara; Alexa Lansberry, who graduated with a bachelor’s degree in physics in 2020; and Faith Ruehle, who graduated with a bachelor’s degree in physics in 2021.

    Science paper:
    Nature Astronomy

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Carnegie Mellon University is a global research university with more than 12,000 students, 95,000 alumni, and 5,000 faculty and staff.
    CMU has been a birthplace of innovation since its founding in 1900.
    Today, we are a global leader bringing groundbreaking ideas to market and creating successful startup businesses.
    Our award-winning faculty members are renowned for working closely with students to solve major scientific, technological and societal challenges. We put a strong emphasis on creating things—from art to robots. Our students are recruited by some of the world’s most innovative companies.

    We have campuses in Pittsburgh, Qatar and Silicon Valley, and degree-granting programs around the world, including Africa, Asia, Australia, Europe and Latin America.

    The university was established by Andrew Carnegie in 1900 as the Carnegie Technical Schools; it became the Carnegie Institute of Technology in 1912 and began granting four-year degrees. In 1967, the Carnegie Institute of Technology merged with the Mellon Institute of Industrial Research, formerly a part of the University of Pittsburgh. Since then, the university has operated as a single institution.

    The university has seven colleges and independent schools, including the College of Engineering, College of Fine Arts, Dietrich College of Humanities and Social Sciences, Mellon College of Science, Tepper School of Business, Heinz College of Information Systems and Public Policy, and the School of Computer Science. The main campus is located 3 miles (5 km) from Downtown Pittsburgh, and the university also has more than a dozen degree-granting locations on six continents, including campuses in Qatar and Silicon Valley.

    Past and present faculty and alumni include 20 Nobel Prize laureates, 13 Turing Award winners, 23 Members of the American Academy of Arts and Sciences, 22 Fellows of the American Association for the Advancement of Science, 79 Members of the National Academies, 124 Emmy Award winners, 47 Tony Award laureates, and 10 Academy Award winners. Carnegie Mellon enrolls 14,799 students from 117 countries and employs 1,400 faculty members.

    Research

    Carnegie Mellon University is classified among “R1: Doctoral Universities – Very High Research Activity”. For the 2006 fiscal year, the university spent $315 million on research. The primary recipients of this funding were the School of Computer Science ($100.3 million), the Software Engineering Institute ($71.7 million), the College of Engineering ($48.5 million), and the Mellon College of Science ($47.7 million). The research money comes largely from federal sources, with a federal investment of $277.6 million. The federal agencies that invest the most money are the National Science Foundation and the Department of Defense, which contribute 26% and 23.4% of the total university research budget respectively.

    The recognition of Carnegie Mellon as one of the best research facilities in the nation has a long history. As early as the 1987 federal budget, Carnegie Mellon University was ranked third in the amount of research dollars, with $41.5 million; only the Massachusetts Institute of Technology and Johns Hopkins University received more research funds from the Department of Defense.

    The Pittsburgh Supercomputing Center is a joint effort between Carnegie Mellon, University of Pittsburgh, and Westinghouse Electric Company. Pittsburgh Supercomputing Center was founded in 1986 by its two scientific directors, Dr. Ralph Roskies of the University of Pittsburgh and Dr. Michael Levine of Carnegie Mellon. Pittsburgh Supercomputing Center is a leading partner in the TeraGrid, the National Science Foundation’s cyberinfrastructure program.

    Scarab lunar rover is being developed by the RI.

    The Robotics Institute (RI) is a division of the School of Computer Science and considered to be one of the leading centers of robotics research in the world. The Field Robotics Center (FRC) has developed a number of significant robots, including Sandstorm and H1ghlander, which finished second and third in the DARPA Grand Challenge, and Boss, which won the DARPA Urban Challenge. The Robotics Institute has partnered with a spinoff company, Astrobotic Technology Inc., to land a CMU robot on the moon by 2016 in pursuit of the Google Lunar XPrize. The robot, known as Andy, is designed to explore lunar pits, which might include entrances to caves. The RI is primarily sited at Carnegie Mellon’s main campus in Newell-Simon Hall.

    The Software Engineering Institute (SEI) is a federally funded research and development center sponsored by the U.S. Department of Defense and operated by Carnegie Mellon, with offices in Pittsburgh, Pennsylvania, USA; Arlington, Virginia, and Frankfurt, Germany. The SEI publishes books on software engineering for industry, government and military applications and practices. The organization is known for its Capability Maturity Model (CMM) and Capability Maturity Model Integration (CMMI), which identify essential elements of effective system and software engineering processes and can be used to rate the level of an organization’s capability for producing quality systems. The SEI is also the home of CERT/CC, the federally funded computer security organization. The CERT Program’s primary goals are to ensure that appropriate technology and systems management practices are used to resist attacks on networked systems and to limit damage and ensure continuity of critical services subsequent to attacks, accidents, or failures.

    The Human–Computer Interaction Institute (HCII) is a division of the School of Computer Science and is considered one of the leading centers of human–computer interaction research, integrating computer science, design, social science, and learning science. Such interdisciplinary collaboration is the hallmark of research done throughout the university.

    The Language Technologies Institute (LTI) is another unit of the School of Computer Science and is famous for being one of the leading research centers in the area of language technologies. The primary research focus of the institute is on machine translation, speech recognition, speech synthesis, information retrieval, parsing and information extraction. Until 1996, the institute existed as the Center for Machine Translation that was established in 1986. From 1996 onwards, it started awarding graduate degrees and the name was changed to Language Technologies Institute.

    Carnegie Mellon is also home to the Carnegie School of management and economics. This intellectual school grew out of the Tepper School of Business in the 1950s and 1960s and focused on the intersection of behavioralism and management. Several management theories, most notably bounded rationality and the behavioral theory of the firm, were established by Carnegie School management scientists and economists.

    Carnegie Mellon also develops cross-disciplinary and university-wide institutes and initiatives to take advantage of strengths in various colleges and departments and develop solutions in critical social and technical problems. To date, these have included the Cylab Security and Privacy Institute, the Wilton E. Scott Institute for Energy Innovation, the Neuroscience Institute (formerly known as BrainHub), the Simon Initiative, and the Disruptive Healthcare Technology Institute.

    Carnegie Mellon has made a concerted effort to attract corporate research labs, offices, and partnerships to the Pittsburgh campus. Apple Inc., Intel, Google, Microsoft, Disney, Facebook, IBM, General Motors, Bombardier Inc., Yahoo!, Uber, Tata Consultancy Services, Ansys, Boeing, Robert Bosch GmbH, and the Rand Corporation have established a presence on or near campus. In collaboration with Intel, Carnegie Mellon has pioneered research into claytronics.

     
  • richardmitnick 10:51 am on June 10, 2022
    Tags: "University of Illinois-Chicago Joins Brookhaven Lab's Quantum Center", , , , C^2QA is one of five U.S. Department of Energy (DOE) Office of Science National Quantum Information Science Research Centers (NQISRCs) established in support of the National Quantum Initiative Act., , Machine learning, , , , , The University of Illinois-Chicago, Three research areas or thrusts: Software and Algorithms; Devices and Materials., University of Illinois Chicago (UIC) has joined the Brookhaven National Laboratory-led Co-design Center for Quantum Advantage (C^2QA) making the university the C^2QA’s 24th partner institution.   

    From The DOE’s Brookhaven National Laboratory and The University of Illinois-Chicago: “University of Illinois-Chicago Joins Brookhaven Lab’s Quantum Center”

    From The DOE’s Brookhaven National Laboratory

    and

    The University of Illinois-Chicago

    June 9, 2022
    Written by Denise Yazak
    Contact:
    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    The University of Illinois Chicago’s Engineering Innovation Building in Chicago, September 5, 2019.

    University of Illinois Chicago (UIC) has joined the Brookhaven National Laboratory-led Co-design Center for Quantum Advantage (C^2QA), making the public research university C^2QA’s 24th partner institution.

    C^2QA is one of five U.S. Department of Energy (DOE) Office of Science National Quantum Information Science Research Centers (NQISRCs) established in support of the National Quantum Initiative Act, which aims to develop the full potential of quantum-based applications in computing, communication, and sensing to benefit national security, economic competitiveness, and leadership in scientific discovery. C^2QA’s primary focus is on building the tools necessary to create scalable, distributed, and fault-tolerant quantum computer systems.

    C^2QA consists of collaborative, multidisciplinary research teams that span several domains to apply quantum co-design principles in three research areas, or thrusts: Software and Algorithms; Devices; and Materials. “The Center is fortunate to count Associate Professor of Electrical and Computer Engineering Thomas Searles among the principal investigators (PIs) from UIC helping to advance the mission of C^2QA in the Devices thrust,” remarked Jens Koch, Novel Qubits & Circuit Elements subthrust leader. “Professor Searles has been an active member since the very beginning of C^2QA at his former institution, and will continue his research.” Professor Searles’ lab is currently applying machine learning methods toward error mitigation in Noisy Intermediate Scale Quantum (NISQ) devices like the Quantum Processing Units (QPUs) offered by the IBM Quantum program, thanks to funding from C^2QA. Searles is further looking forward to intensifying his work on quantum state tomography on the IBM machines and other platforms and increasing opportunities in his lab.
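    The article doesn’t describe the lab’s technique in detail; one common data-driven error-mitigation recipe is to learn a map from noisy expectation values (as measured on hardware) back to ideal values (known from classically simulating small training circuits), then apply that map to new measurements. A minimal, purely illustrative sketch, not the UIC lab’s actual method:

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Toy learning-based error mitigation: fit a regression from noisy
    # expectation values to ideal ones on training circuits, then apply it
    # to fresh hardware readings. All data below is synthetic.

    rng = np.random.default_rng(1)

    ideal = rng.uniform(-1, 1, size=200)            # simulated ground truth
    noisy = 0.8 * ideal + rng.normal(0, 0.05, 200)  # toy depolarizing-style damping

    model = LinearRegression().fit(noisy.reshape(-1, 1), ideal)

    new_noisy = np.array([[0.40], [-0.64]])         # fresh "hardware" readings
    print(model.predict(new_noisy))                 # mitigated estimates ~ [0.5, -0.8]
    ```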

    “Quantum computing has the potential to completely revolutionize how we interact with the world around us and in particular, how we approach problem solving in scientific disciplines like physics, computer science, chemistry and engineering. We have a long way to go, however, in developing better quantum devices for practical application before this is a reality,” said UIC Associate Professor of Electrical and Computer Engineering Thomas Searles. “The co-design center and its affiliated researchers are leaders in advancing quantum-based technologies through scientific research – our partnership with the co-design center opens up incredible opportunities for our students and faculty to partner on innovative discoveries in quantum computing, network and participate in seminars and career fairs.”

    “There is some very exciting work involving data-centric models for Machine Learning in quantum information science,” said C^2QA director, Andrew Houck. “Using his access to the IBM cloud machines, Thomas Searles and UIC are really helping us figure out how to more efficiently use, train, and run interesting algorithms on real hardware that’s currently available.”

    Searles, formerly of Howard University, is widely recognized as an avid and active supporter of increasing participation of underrepresented minorities in quantum research. With UIC representing one of six minority serving institutions (MSI) in C^2QA, he is an invaluable asset to everyone in the Center seeking guidance on how attract, train, and mentor the next generation of diverse researchers and engineers joining the quantum workforce. “The [NQISRCs] serve as hubs for collaboration for the entire country. I think it’s important that these hubs be as inclusive as possible,” said Searles. “We are, as a whole, an MSI in the heart of Chicago. We have fantastic students within our Electrical and Computer Engineering Department. We’re not only a Hispanic-serving institution, but a Hispanic-serving department with greater than 25 percent of our students identifying as such. C^2QA and UIC are bringing opportunities in the field of quantum to underserved groups in Chicago that don’t exist.”

    UIC is Chicago’s only public research university and is an integral part of the educational, technological and cultural fabric of the city. Chicago is not only a diverse city full of fresh new talent in the field, but it is also the epicenter of Quantum Information Science in the Midwest. “It’s the right place, the right time, and the right people. With C^2QA having a large concentration on the east coast, this partnership will broaden its reach. We’re bringing something to the Midwest that’s not there currently, so we’re very, very excited about that,” said Searles.

    Searles also acknowledged the support of C^2QA and Brookhaven Laboratory staff in facilitating this collaboration: “I wanted to thank C^2QA Director, Andrew Houck, former Director, Steve Girvin, and Operations Manager, Kimberly McGuire. I also wanted to highlight the work of Brookhaven Lab’s Diversity, Equity, and Inclusion Officer Noel Blackburn, and National Synchrotron Light Source II (NSLS-II) [below] Director John Hill.”

    Besides Brookhaven Lab and UIC, the partnering institutions in C^2QA are The DOE’s Ames Laboratory, California Institute of Technology, City College of New York, Columbia University, Harvard University, Howard University, IBM, Johns Hopkins University, Massachusetts Institute of Technology, Montana State University, National Aeronautics and Space Administration’s Ames Research Center, Northwestern University, Pacific Northwest National Laboratory, Princeton University, State University of New York Polytechnic Institute, Stony Brook University, The DOE’s Thomas Jefferson National Accelerator Facility, University of California-Santa Barbara, University of Massachusetts-Amherst, University of Pittsburgh, University of Washington, Virginia Polytechnic Institute and State University, and Yale University. In addition to its 24 existing partners, C^2QA recently welcomed The DOE’s Princeton Plasma Physics Laboratory as its first unfunded affiliate. For more information on the U.S. DOE Office of Science Quantum Centers, visit https://science.osti.gov/Initiatives/QIS/QIS-Centers.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    More than a century of discovery and service

    The University of Illinois at Chicago traces its origins to several private health colleges that were founded in Chicago during the 19th century.

    In the 20th century, new campuses were built in Chicago and later joined together to form a comprehensive learning community. In the last three decades, UIC has transformed itself into one of the top 65 research universities in the United States.

    As part of the University of Illinois, UIC grew to meet the needs of the people of Illinois, but its deepest roots are in health care. The Chicago College of Pharmacy, founded in 1859 before the Civil War, is the oldest unit in the university. Other early colleges were the College of Physicians and Surgeons and the Columbian College of Dentistry.

    These Chicago-based health colleges became fully incorporated in 1913 as the Colleges of Medicine, Dentistry and Pharmacy. The College of Pharmacy was the first pharmacy school west of the Alleghenies and emphasized laboratory instruction and research.

    The Columbian College of Dentistry became the first American dental school fully equipped with electric drills. The College of Medicine developed the country’s first occupational therapy program and grew rapidly to become the largest medical school in the U.S.

    In the decades following incorporation, several other health science colleges were created. Together with the Colleges of Medicine, Dentistry and Pharmacy, they formed the Chicago Professional Colleges of the University of Illinois. In 1961, the professional colleges became the University of Illinois at the Medical Center.

    Following World War II, the University of Illinois increased its presence in Chicago by creating a temporary, two-year branch campus on Navy Pier. The Chicago Undergraduate Division primarily accommodated student veterans on the G.I. Bill. The program allowed all students to complete their first two years of study in Chicago before going downstate to finish their undergraduate degrees at Urbana-Champaign.

    The lakeside location earned the Navy Pier campus the name “Harvard on the rocks.” The university shared the 3,000-foot pier with other tenants that included the Chicago Police Department Traffic Division and several military detachments. At that time Navy Pier was not the bright, attractive venue it is today as Chicago’s leading tourist attraction. The pier was a dreary, functioning port facility. But because the pier had only a single corridor along its half-mile length, students were able to see their peers each day.

    Brookhaven Campus

    One of ten national laboratories overseen and primarily funded by the DOE Office of Science, The DOE’s Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

    Research at BNL specializes in nuclear and high energy physics, energy science and technology, environmental and bioscience, nanoscience and national security. The 5,300-acre campus contains several large research facilities, including the Relativistic Heavy Ion Collider [below] and National Synchrotron Light Source II [below]. Seven Nobel Prizes have been awarded for work conducted at Brookhaven Lab.

    BNL is staffed by approximately 2,750 scientists, engineers, technicians, and support personnel, and hosts 4,000 guest investigators every year. The laboratory has its own police station, fire department, and ZIP code (11973). In total, the lab spans a 5,265-acre (21 km^2) area that is mostly coterminous with the hamlet of Upton, New York. BNL is served by a rail spur operated as-needed by the New York and Atlantic Railway. Co-located with the laboratory is the Upton, New York, forecast office of the National Weather Service.

    Major programs

    Although originally conceived as a nuclear research facility, Brookhaven Lab’s mission has greatly expanded. Its foci are now:

    Nuclear and high-energy physics
    Physics and chemistry of materials
    Environmental and climate research
    Nanomaterials
    Energy research
    Nonproliferation
    Structural biology
    Accelerator physics

    Operation

    Brookhaven National Lab was originally owned by the Atomic Energy Commission and is now owned by that agency’s successor, the United States Department of Energy (DOE). DOE subcontracts the research and operation to universities and research organizations. It is currently operated by Brookhaven Science Associates LLC, which is an equal partnership of Stony Brook University and Battelle Memorial Institute. From 1947 to 1998, it was operated by Associated Universities, Inc. (AUI), but AUI lost its contract in the wake of two incidents: a 1994 fire at the facility’s High Flux Beam Reactor that exposed several workers to radiation and reports in 1997 of a tritium leak into the groundwater of the Long Island Central Pine Barrens on which the facility sits.

    Foundations

    Following World War II, the US Atomic Energy Commission was created to support government-sponsored peacetime research on atomic energy. The effort to build a nuclear reactor in the American northeast was fostered largely by physicists Isidor Isaac Rabi and Norman Foster Ramsey Jr., who during the war witnessed many of their colleagues at Columbia University leave for new remote research sites following the departure of the Manhattan Project from its campus. Their effort to house this reactor near New York City was rivalled by a similar effort at the Massachusetts Institute of Technology to have a facility near Boston, Massachusetts. Involvement was quickly solicited from representatives of northeastern universities to the south and west of New York City such that this city would be at their geographic center. In March 1946 a nonprofit corporation was established that consisted of representatives from nine major research universities — Columbia University, Cornell University, Harvard University, Johns Hopkins University, Massachusetts Institute of Technology, Princeton University, University of Pennsylvania, University of Rochester, and Yale University.

    Out of 17 considered sites in the Boston-Washington corridor, Camp Upton on Long Island was eventually chosen as the most suitable in consideration of space, transportation, and availability. The camp had been a training center for the US Army during both World War I and World War II. After the latter war, Camp Upton was deemed no longer necessary and became available for reuse. A plan was conceived to convert the military camp into a research facility.

    On March 21, 1947, the Camp Upton site was officially transferred from the U.S. War Department to the new U.S. Atomic Energy Commission (AEC), predecessor to the U.S. Department of Energy (DOE).

    Research and facilities

    Reactor history

    In 1947 construction began on the first nuclear reactor at Brookhaven, the Brookhaven Graphite Research Reactor. This reactor, which opened in 1950, was the first reactor to be constructed in the United States after World War II. The High Flux Beam Reactor operated from 1965 to 1999. In 1959 Brookhaven built the first US reactor specifically tailored to medical research, the Brookhaven Medical Research Reactor, which operated until 2000.

    Accelerator history

    In 1952 Brookhaven began using its first particle accelerator, the Cosmotron. At the time the Cosmotron was the world’s highest energy accelerator, being the first to impart more than 1 GeV of energy to a particle.

    BNL Cosmotron 1952-1966.

    The Cosmotron was retired in 1966, after it was superseded in 1960 by the new Alternating Gradient Synchrotron (AGS).

    BNL Alternating Gradient Synchrotron (AGS).

    The AGS was used in research that resulted in three Nobel Prizes, awarded for the discoveries of the muon neutrino, the charm quark, and CP violation.

    In 1970 BNL started the ISABELLE project to develop and build two proton intersecting storage rings.

    The groundbreaking for the project was in October 1978. In 1981, with the tunnel for the accelerator already excavated, problems with the superconducting magnets needed for the ISABELLE accelerator brought the project to a halt, and the project was eventually cancelled in 1983.

    The National Synchrotron Light Source operated from 1982 to 2014 and was involved with two Nobel Prize-winning discoveries. It has since been replaced by the National Synchrotron Light Source II [below].

    BNL National Synchrotron Light Source.

    After ISABELLE’s cancellation, physicists at BNL proposed that the excavated tunnel and parts of the magnet assembly be used in another accelerator. In 1984 the first proposal for the accelerator now known as the Relativistic Heavy Ion Collider (RHIC) [below] was put forward. Construction was funded in 1991 and RHIC has been operational since 2000. One of the world’s only two operating heavy-ion colliders, RHIC is as of 2010 the second-highest-energy collider after the Large Hadron Collider (CH). RHIC is housed in a tunnel 2.4 miles (3.9 km) long and is visible from space.

    On January 9, 2020, it was announced by Paul Dabbar, undersecretary of the US Department of Energy Office of Science, that the BNL eRHIC design had been selected over the conceptual design put forward by DOE’s Thomas Jefferson National Accelerator Facility [Jlab] (US) as the future Electron–Ion Collider (EIC) in the United States.

    Brookhaven Lab Electron-Ion Collider (EIC) to be built inside the tunnel that currently houses the RHIC.

    In addition to the site selection, it was announced that the BNL EIC had acquired CD-0 approval from the Department of Energy. BNL’s eRHIC design proposes upgrading the existing Relativistic Heavy Ion Collider, which collides beams of ions from light to heavy, including polarized protons, with a polarized electron facility, to be housed in the same tunnel.

    Other discoveries

    In 1958, Brookhaven scientists created one of the world’s first video games, Tennis for Two. In 1968 Brookhaven scientists patented Maglev, a transportation technology that utilizes magnetic levitation.

    Major facilities

    Relativistic Heavy Ion Collider (RHIC), which was designed to research quark–gluon plasma and the sources of proton spin. Until 2009 it was the world’s most powerful heavy ion collider. It is the only collider of spin-polarized protons.

    Center for Functional Nanomaterials (CFN), used for the study of nanoscale materials.

    BNL National Synchrotron Light Source II, Brookhaven’s newest user facility, opened in 2015 to replace the National Synchrotron Light Source (NSLS), which had operated for 30 years. NSLS was involved in the work that won the 2003 and 2009 Nobel Prizes in Chemistry.

    Alternating Gradient Synchrotron, a particle accelerator that was used in three of the lab’s Nobel prizes.
    Accelerator Test Facility, generates, accelerates and monitors particle beams.
    Tandem Van de Graaff, once the world’s largest electrostatic accelerator.

    Computational Science resources, including access to a massively parallel Blue Gene series supercomputer that is among the fastest in the world for scientific research, run jointly by Brookhaven National Laboratory and Stony Brook University-SUNY.

    Interdisciplinary Science Building, with unique laboratories for studying high-temperature superconductors and other materials important for addressing energy challenges.
    NASA Space Radiation Laboratory, where scientists use beams of ions to simulate cosmic rays and assess the risks of space radiation to human space travelers and equipment.

    Off-site contributions

    It is a contributing partner to the ATLAS experiment, one of the four detectors located at The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][Organisation européenne pour la recherche nucléaire] [Europäische Organisation für Kernforschung](CH)[CERN] Large Hadron Collider (LHC).

    The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][Organisation européenne pour la recherche nucléaire] [Europäische Organisation für Kernforschung](CH)[CERN] map.

    Iconic view of the European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear] [Organisation européenne pour la recherche nucléaire] [Europäische Organisation für Kernforschung](CH) [CERN] ATLAS detector.

    It is currently operating at The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][Organisation européenne pour la recherche nucléaire] [Europäische Organisation für Kernforschung](CH) [CERN] near Geneva, Switzerland.

    Brookhaven was also responsible for the design of the Spallation Neutron Source at DOE’s Oak Ridge National Laboratory, Tennessee.

    DOE’s Oak Ridge National Laboratory Spallation Neutron Source annotated.

    Brookhaven plays a role in a range of neutrino research projects around the world, including the Daya Bay Neutrino Experiment (CN) nuclear power plant, approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China.

    Daya Bay Neutrino Experiment (CN) nuclear power plant, approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China.


    BNL Center for Functional Nanomaterials.

    BNL National Synchrotron Light Source II.

    BNL NSLS II.

    BNL Relativistic Heavy Ion Collider Campus.

    BNL/RHIC Star Detector.

    BNL/RHIC Phenix detector.

     
  • richardmitnick 9:05 am on June 10, 2022 Permalink | Reply
    Tags: "FWI": Full-Waveform Inversion, "The Big Data Revolution Unlocks New Opportunities for Seismology", Advances in computing are allowing seismologists to apply data-hungry algorithms to big data experiments., , , , , , , , In Seismology the volumes of data being acquired from individual experiments are now reaching hundreds of terabytes in passive seismology., Machine learning, Parallel and distributed computing allow scientists to perform many computations simultaneously.   

    From Eos: “The Big Data Revolution Unlocks New Opportunities for Seismology” 

    Eos news bloc

    From Eos

    at

    AGU

    9 June 2022

    Stephen J. Arrowsmith
    sarrowsmith@smu.edu
    Daniel T. Trugman

    Karianne Bergen
    Beatrice Magnani

    1
    A recent experiment that used over 50,000 seismic nodes achieved the densest seismic survey on land. These photographs show nodes being staged, transported by truck, and charged/harvested in racks. Credit: Ourabah and Crosby [2020]

    Scientists have been measuring earthquakes for hundreds of years. As instruments have advanced, so has our understanding of how and why the ground moves. A recent article published in Reviews of Geophysics describes how the “Big Data” revolution is now advancing the field of seismology. We asked some of the authors to explain how seismic waves are measured, how measurement techniques have changed over time, and how big data is being collected and used to advance science.

    In simple terms, what is seismology and why is it important?

    Seismology is a science that is based on vibrational waves (‘seismic waves’) that travel through the Earth. Seismic waves produce ground motions that are recorded by seismometers. Recorded ground motions can provide vital clues both about the sources of waves (e.g., earthquakes, volcanoes, explosions, etc.) and about the properties of the Earth the waves travel through. Seismology provides tools for understanding the physics of earthquakes, for monitoring natural hazards, and for revealing the internal structure of the Earth.

    How does a seismometer work and what important advancements in knowledge have been made since their development?

    It’s surprisingly hard to accurately measure the motion of the ground because any instrument that does so must also move with the ground (otherwise it would have to be in the air, where it couldn’t directly record ground motion). To deal with this challenge, seismometers contain a mass on a spring that remains stationary (an ‘inertial mass’), and they measure the motion of the instrument relative to that mass. Early seismometers were entirely mechanical, but it’s hard to design a mechanical system where the inertial mass stays still over a range of frequencies of ground motion.

    A key advancement was the use of electronics to keep the mass fixed and therefore record a much wider range of frequencies. An ideal seismometer can record accurately over a broad band of frequencies and a wide range of ground motion amplitudes without going off scale. This is easier said than done, but seismometers are improving every year.
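
    To make the inertial-mass idea concrete, here is a minimal Python sketch (illustrative parameters only, not any real instrument’s specifications) that models a seismometer as a damped mass on a spring and treats the motion of the frame relative to the mass as the recorded signal:

        # Minimal sketch: an inertial-mass seismometer as a damped harmonic
        # oscillator. The "recording" is the mass position relative to the frame.
        import numpy as np

        dt = 0.001                                   # time step (s)
        t = np.arange(0, 20, dt)
        ground = 1e-3 * np.sin(2 * np.pi * 2.0 * t)  # 2 Hz ground motion (m)

        f0, zeta = 0.5, 0.7      # assumed natural frequency (Hz) and damping
        w0 = 2 * np.pi * f0

        # Relative motion x obeys x'' + 2*zeta*w0*x' + w0^2*x = -u'',
        # where u is the ground displacement.
        u_acc = np.gradient(np.gradient(ground, dt), dt)
        x = v = 0.0
        rec = np.empty_like(t)
        for i, a_g in enumerate(u_acc):
            acc = -a_g - 2 * zeta * w0 * v - w0 ** 2 * x
            v += acc * dt
            x += v * dt
            rec[i] = x

        # Well above f0 the relative motion tracks the ground motion, which is
        # why long-period suspensions can record a broad band of frequencies.
        print(rec[len(t) // 2:].max() / np.abs(ground).max())   # ratio near 1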

    What is the difference between passive and exploration seismology?

    Passive seismology is about recording the seismic waves generated by natural or existing sources like earthquakes. Passive seismologists typically deploy instruments for a long time in order to gather the data they need from the spontaneous occurrence of natural sources of seismic waves. In contrast, exploration seismologists generate their own seismic waves using anthropogenic sources like explosions, air guns, or truck vibrations. Because they control the timing and location of the source of seismic waves, exploration seismologists typically work with large numbers of instruments that are deployed for a short time. Exploration seismology is most widely used in the oil industry but can also be used for more general scientific purposes when high resolution imaging is needed.

    How have advances in seismologic methods improved subsurface imaging?

    Developments in seismic imaging techniques are allowing seismologists to significantly improve the resolution of images of the subsurface. A particularly powerful technique for high resolution imaging is called Full-Waveform Inversion (FWI). FWI uses the full seismogram for imaging, trying to match data and model “wiggle for wiggle” rather than only using simplified measures like travel times, and can thus provide better image resolution. The method has become widely adopted by the exploration seismology community for this reason and is now becoming more common in the passive seismic community as well.
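
    As a toy illustration of that “wiggle for wiggle” idea, the sketch below assumes a single unknown velocity, a straight-ray travel time, and a Ricker wavelet (all simplifications for illustration; real FWI solves the full wave equation and uses adjoint-state gradients rather than a parameter scan). It selects the velocity model whose predicted waveform best matches the observed one in a least-squares sense:

        # Toy sketch of the FWI idea: pick the model that minimizes the
        # L2 misfit between predicted and observed waveforms.
        import numpy as np

        t = np.linspace(0, 4, 2001)
        dist = 6.0   # assumed source-receiver offset (km)

        def ricker(t0, f=0.5):
            a = (np.pi * f * (t - t0)) ** 2
            return (1 - 2 * a) * np.exp(-a)

        def synthetic(v):
            return ricker(dist / v)     # arrival time = offset / velocity

        observed = synthetic(3.0)       # "data" made in a 3 km/s medium

        candidates = np.linspace(2.0, 4.0, 81)
        misfits = [np.sum((synthetic(v) - observed) ** 2) for v in candidates]
        print(candidates[int(np.argmin(misfits))])   # 3.0, the best-fitting model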

    Another important innovation in imaging uses persistent sources of ambient noise like ocean waves to image the subsurface. This is particularly useful for short-term deployments where there is often insufficient time to wait around for natural sources like earthquakes to occur.
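
    A minimal sketch of the cross-correlation trick behind noise imaging, under the strong simplifying assumption that the noise at one station is just a delayed copy of the noise at another: the lag of the correlation peak recovers the inter-station travel time, which is the information imaging methods build on.

        # Idealized ambient-noise interferometry: station B hears the same
        # noise as station A, delayed by a 1.5 s propagation time.
        import numpy as np

        rng = np.random.default_rng(0)
        fs = 100                          # sampling rate (Hz)
        n = fs * 60                       # one minute of noise
        noise = rng.standard_normal(n)

        lag_true = int(1.5 * fs)          # 1.5 s inter-station travel time
        sta_a = noise
        sta_b = np.roll(noise, lag_true)  # delayed copy, standing in for propagation

        xcorr = np.correlate(sta_b, sta_a, mode="full")
        lags = np.arange(-n + 1, n) / fs
        print(lags[int(np.argmax(xcorr))])   # ~1.5 s: the recovered travel time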

    What is “Big Data” and how is it being used in seismology?

    “Big Data” is a relative term describing data that comes in greater variety, in larger volumes, or at higher velocity, and that therefore requires different analysis methods and technologies than “small data”. In seismology, the volumes of data being acquired from individual experiments are now reaching hundreds of terabytes in passive seismology, and petabytes in exploration seismology. For perspective, a typical laptop has less than one terabyte of disk storage. The velocity of data is the rate at which it is acquired or analyzed. In seismology, a new measurement technique called Distributed Acoustic Sensing (DAS) can fill a 1-terabyte hard drive in approximately 14 hours. The variety of data being used for seismic investigations is also increasing, with complementary data types like GNSS, barometric pressure, and infrasound becoming more commonly combined with seismic data.
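
    The quoted DAS figure implies a sustained data rate that is easy to check with a couple of lines of arithmetic:

        # Back-of-the-envelope check: filling a 1 TB drive in ~14 hours.
        tb = 1e12                          # bytes in 1 TB (decimal convention)
        hours = 14
        rate_mb_s = tb / (hours * 3600) / 1e6
        print(f"{rate_mb_s:.0f} MB/s")     # roughly 20 MB/s, around the clock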

    What are the main drivers of Big Data Seismology?

    There are three main drivers. First, innovations in sensing systems are allowing seismologists to conduct ‘big data’ experiments. Second, new data-hungry algorithms such as machine learning and deep neural networks are enabling seismologists to scale up their data analysis and extract more meaning from massive seismic datasets. Third, advances in computing are allowing seismologists to apply data-hungry algorithms to big data experiments. Parallel and distributed computing allow scientists to perform many computations simultaneously, with calculations often split across multiple machines, and cloud computing services provide researchers with access to on-demand computing power.
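
    As a sketch of the parallel-computing pattern described here (the synthetic records and the trigger logic are placeholders, not a real pipeline), Python’s standard library can farm a per-record analysis out to one worker process per core:

        # Placeholder per-record detection task distributed across CPU cores.
        from concurrent.futures import ProcessPoolExecutor
        import numpy as np

        def detect(record_id: int) -> int:
            rng = np.random.default_rng(record_id)   # stands in for reading a file
            trace = rng.standard_normal(20_000)
            env = np.abs(trace)
            short = np.convolve(env, np.ones(50) / 50, mode="same")
            long_ = np.convolve(env, np.ones(1000) / 1000, mode="same")
            return int(np.sum(short / long_ > 4.0))  # samples flagged as triggers

        if __name__ == "__main__":
            with ProcessPoolExecutor() as pool:      # one worker per core
                counts = list(pool.map(detect, range(32)))
            print(sum(counts), "flagged samples across 32 synthetic records")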

    Moving forward, what are some of the challenges and opportunities that Big Data seismologists face?

    In terms of challenges, the first relates to handling large amounts of data. Most seismologists are accustomed to easily accessing and sharing data via web services, with most of their processing and analysis of the data done on their own computers. This workflow and the infrastructure that supports it doesn’t scale well for Big Data Seismology. Another challenge is obtaining the skills that it takes to do research with big seismic datasets, which requires expertise not only in seismology but also in statistics and computer science. Skills in statistics and computer science are not routinely part of most Earth Science curricula, but they’re becoming increasingly important in order to do research at the cutting edge of Big Data Seismology.

    The opportunities are wide-ranging, and our paper discusses many opportunities for fundamental science discovery in detail, but it’s also hard to anticipate all the discoveries that will be made possible. Our best guide is to look back at the history of seismology, where many major discoveries have been driven by advances in data. For instance, the discovery of the layers of Earth followed the development of seismometers that were sufficiently sensitive to measure teleseismic earthquakes. The discovery of the global pattern of seismicity – which played a key part in the development of the theory of plate tectonics – was preceded by the development of the first global seismic network. The first digital global seismic network was followed by our first images of the convecting mantle. If we take the past as our guide, we can anticipate that the era of Big Data Seismology will provide the foundation for creative seismologists to make new discoveries.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 12:51 pm on June 9, 2022 Permalink | Reply
    Tags: "Keeping web-browsing data safe from hackers", , , , , Machine learning, ,   

    From The MIT Computer Science & Artificial Intelligence Laboratory (CSAIL) : “Keeping web-browsing data safe from hackers” 

    1

    From The MIT Computer Science & Artificial Intelligence Laboratory (CSAIL)

    at

    The Massachusetts Institute of Technology

    June 9, 2022
    Adam Zewe | MIT News Office

    1
    MIT researchers analyzed a powerful cyberattack, known as a website-fingerprinting attack, and then developed strategies that dramatically reduce the attacker’s chances of success. Pictured, from left to right: graduate student Jules Drean, Mengjia Yan, the Homer A. Burnell Career Development Assistant Professor of Electrical Engineering and Computer Science, and Jack Cook ’22. Image: Photo courtesy of the researchers and edited by Jose-Luis Olivares, MIT

    Malicious agents can use machine learning to launch powerful attacks that steal information in ways that are tough to prevent and often even more difficult to study.

    Attackers can capture data that “leaks” between software programs running on the same computer. They then use machine-learning algorithms to decode those signals, which enables them to obtain passwords or other private information. These are called “side-channel attacks” because information is acquired through a channel not meant for communication.

    Researchers at MIT have shown that machine-learning-assisted side-channel attacks are both extremely robust and poorly understood. The use of machine-learning algorithms, which are often impossible to fully comprehend due to their complexity, is a particular challenge. In a new paper [below], the team studied a documented attack that was thought to work by capturing signals leaked when a computer accesses memory. They found that the mechanisms behind this attack were misidentified, which would prevent researchers from crafting effective defenses.

    To study the attack, they removed all memory accesses and noticed the attack became even more powerful. Then they searched for sources of information leakage and found that the attack actually monitors events that interrupt a computer’s other processes. They show that an adversary can use this machine-learning-assisted attack to exploit a security flaw and determine the website a user is browsing with almost perfect accuracy.

    With this knowledge in hand, they developed two strategies that can thwart this attack.

    “The focus of this work is really on the analysis to find the root cause of the problem. As researchers, we should really try to delve deeper and do more analysis work, rather than just blindly using black-box machine-learning tactics to demonstrate one attack after another. The lesson we learned is that these machine-learning-assisted attacks can be extremely misleading,” says senior author Mengjia Yan, the Homer A. Burnell Career Development Assistant Professor of Electrical Engineering and Computer Science (EECS) and a member of the Computer Science and Artificial Intelligence Laboratory (CSAIL).

    The lead author of the paper is Jack Cook ’22, a recent graduate in computer science. Co-authors include CSAIL graduate student Jules Drean and Jonathan Behrens PhD ’22. The research will be presented at the International Symposium on Computer Architecture.

    A side-channel surprise

    Cook launched the project while taking Yan’s advanced seminar course. For a class assignment, he tried to replicate a machine-learning-assisted side-channel attack from the literature. Past work had concluded that this attack counts how many times the computer accesses memory as it loads a website and then uses machine learning to identify the website. This is known as a website-fingerprinting attack.

    He showed that prior work relied on a flawed machine-learning-based analysis to incorrectly pinpoint the source of the attack. Machine learning can’t prove causality in these types of attacks, Cook says.

    “All I did was remove the memory access and the attack still worked just as well, or even better. So, then I wondered, what actually opens up the side channel?” he says.

    This led to a research project in which Cook and his collaborators embarked on a careful analysis of the attack. They designed an almost identical attack, but without memory accesses, and studied it in detail.

    They found that the attack actually records a computer’s timer values at fixed intervals and uses that information to infer what website is being accessed. Essentially, the attack measures how busy the computer is over time.

    A fluctuation in the timer value means the computer is processing a different amount of information in that interval. This is due to system interrupts. A system interrupt occurs when the computer’s processes are interrupted by requests from hardware devices; the computer must pause what it is doing to handle the new request.
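
    For illustration only, a benign Python sketch of the kind of measurement described above: spin on a high-resolution timer and count how much work completes in each fixed window, so windows disturbed by interrupts show up as dips in the count. (The published attack ran in JavaScript in a browser; this sketch only captures the measurement idea.)

        # Sample how busy the machine is by counting loop spins per window.
        import time

        def activity_trace(windows=200, window_ns=2_000_000):   # 2 ms windows
            trace = []
            end = time.perf_counter_ns() + window_ns
            spins = 0
            while len(trace) < windows:
                spins += 1
                now = time.perf_counter_ns()
                if now >= end:
                    trace.append(spins)   # fewer spins => busier interval
                    spins = 0
                    end = now + window_ns
            return trace

        t = activity_trace()
        print(min(t), max(t))   # the spread reflects background system activity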

    When a website is loading, it sends instructions to a web browser to run scripts, render graphics, load videos, etc. Each of these can trigger many system interrupts.

    An attacker monitoring the timer can use machine learning to infer high-level information from these system interrupts to determine what website a user is visiting. This is possible because interrupt activity generated by one website, like CNN.com, is very similar each time it loads, but very different from other websites, like Wikipedia.com, Cook explains.
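
    Given labeled traces, the classification step is standard supervised learning. Here is a hedged sketch with synthetic traces standing in for real measurements (the per-site burst patterns are invented purely for illustration):

        # Synthetic "interrupt traces" for two sites, separated by an
        # off-the-shelf classifier. Real traces would come from the timer loop.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)

        def fake_trace(site: int) -> np.ndarray:
            base = rng.normal(100, 5, size=200)
            bursts = [20, 60, 150] if site == 0 else [40, 90, 110]
            for b in bursts:
                base[b:b + 10] -= 40   # interrupt bursts depress the spin count
            return base

        X = np.array([fake_trace(s) for s in range(2) for _ in range(200)])
        y = np.array([s for s in range(2) for _ in range(200)])
        Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
        print(f"accuracy: {clf.score(Xte, yte):.2f}")   # near-perfect separation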

    “One of the really scary things about this attack is that we wrote it in JavaScript, so you don’t have to download or install any code. All you have to do is open a website. Someone could embed this into a website and then theoretically be able to snoop on other activity on your computer,” he says.

    The attack is extremely successful. For instance, when a computer is running Chrome on the macOS operating system, the attack was able to identify websites with 94 percent accuracy. All commercial browsers and operating systems they tested resulted in an attack with more than 91 percent accuracy.

    There are many factors that can affect a computer’s timer, so determining what led to an attack with such high accuracy was akin to finding a needle in a haystack, Cook says. They ran many controlled experiments, removing one variable at a time, until they realized the signal must be coming from system interrupts, which often can’t be processed separately from the attacker’s code.

    Fighting back

    Once the researchers understood the attack, they crafted security strategies to prevent it.

    First, they created a browser extension that generates frequent interrupts, like pinging random websites to create bursts of activity. The added noise makes it much more difficult for the attacker to decode signals. This dropped the attack’s accuracy from 96 percent to 62 percent, but it slowed the computer’s performance.

    For their second countermeasure, they modified the timer to return values that are close to, but not the actual time. This makes it much harder for an attacker to measure the computer’s activity over an interval, Cook explains. This mitigation cut the attack’s accuracy from 96 percent down to just 1 percent.
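
    A minimal sketch of that fuzzed-timer idea (not the researchers’ actual mechanism; granularity and noise bounds are assumptions): snap readings to a coarse grid and add bounded random noise, so consecutive readings no longer resolve microsecond-scale scheduling gaps.

        # Fuzzed timer: coarse quantization plus bounded random jitter.
        import random
        import time

        GRID_NS = 100_000          # assumed 0.1 ms granularity

        def fuzzed_timer_ns() -> int:
            real = time.perf_counter_ns()
            coarse = (real // GRID_NS) * GRID_NS       # drop fine detail
            return coarse + random.randrange(GRID_NS)  # add bounded noise

        # Readings still advance on average, but their differences no longer
        # reveal fine-grained interrupt activity.
        a, b = fuzzed_timer_ns(), fuzzed_timer_ns()
        print(b - a)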

    “I was surprised by how such a small mitigation like adding randomness to the timer could be so effective. This mitigation strategy could really be put in use today. It doesn’t affect how you use most websites,” he says.

    Building off this work, the researchers plan to develop a systematic analysis framework for machine-learning-assisted side-channel attacks. This could help the researchers get to the root cause of more attacks, Yan says. They also want to see how they can use machine learning to discover other types of vulnerabilities.

    “This paper presents a new interrupt-based side channel attack and demonstrates that it can be effectively used for website fingerprinting attacks, while previously, such attacks were believed to be possible due to cache side channels,” says Yanjing Li, assistant professor in the Department of Computer Science at the University of Chicago, who was not involved with this research. “I liked this paper immediately after I first read it, not only because the new attack is interesting and successfully challenges existing notions, but also because it points out a key limitation of ML-assisted side-channel attacks — blindly relying on machine-learning models without careful analysis cannot provide any understanding on the actual causes/sources of an attack, and can even be misleading. This is very insightful and I believe will inspire many future works in this direction.”

    This research was funded, in part, by the National Science Foundation, the Air Force Office of Scientific Research, and the MIT-IBM Watson AI Lab.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    4

    The MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) is a research institute at the Massachusetts Institute of Technology (MIT) formed by the 2003 merger of the Laboratory for Computer Science (LCS) and the Artificial Intelligence Laboratory (AI Lab). Housed within the Ray and Maria Stata Center, CSAIL is the largest on-campus laboratory as measured by research scope and membership. It is part of the Schwarzman College of Computing but is also overseen by the MIT Vice President of Research.

    Research activities

    CSAIL’s research activities are organized around a number of semi-autonomous research groups, each of which is headed by one or more professors or research scientists. These groups are divided up into seven general areas of research:

    Artificial intelligence
    Computational biology
    Graphics and vision
    Language and learning
    Theory of computation
    Robotics
    Systems (includes computer architecture, databases, distributed systems, networks and networked systems, operating systems, programming methodology, and software engineering among others)

    In addition, CSAIL hosts the World Wide Web Consortium (W3C).

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory (US), as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and Whitehead Institute.

    Massachusetts Institute of Technology-Haystack Observatory Westford, Massachusetts, USA, Altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities (AAU).

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but an institution combining elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, the Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology (US) catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    Massachusetts Institute of Technology ‘s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at Massachusetts Institute of Technology (US)’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, Massachusetts Institute of Technology became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and Massachusetts Institute of Technology’s defense research. In this period Massachusetts Institute of Technology’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, Massachusetts Institute of Technology (US)’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, Massachusetts Institute of Technology launched OpenCourseWare to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of the Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced aLIGO was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and funded by the National Science Foundation.

    Caltech/MIT Advanced aLIGO

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in Physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique, which served as the essential blueprint for LIGO.

    The mission of Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 7:02 am on June 6, 2022 Permalink | Reply
    Tags: "How we’re using machine learning to detect coral-eating COTS", "Internet of Things", , , , CSIRO’s Data61, Machine learning, , , , The Great Barrier Reef is one of Australia’s most diverse and unique landscapes.   

    From CSIRO (AU)-Commonwealth Scientific and Industrial Research Organization: “How we’re using machine learning to detect coral-eating COTS” 

    CSIRO bloc

    From CSIRO (AU)-Commonwealth Scientific and Industrial Research Organization

    June 5th, 2022
    Alex Persley

    The Great Barrier Reef is one of Australia’s most diverse and unique landscapes. Comprising more than 2,900 individual reefs, it is home to unmatched marine biodiversity. A multidisciplinary group of researchers from Australia’s national science agency, CSIRO, have been working on projects using innovative science and technology to help combat some of the threats facing our reef.

    The team have worked with a range of stakeholders, most recently joining forces with Google and the international Kaggle community to explore ways to help with the monitoring and detection of crown-of-thorns starfish.

    Meet the team and learn more about their work here.

    Dr. Brano Kusy

    Dr. Brano Kusy is an internationally respected scientist and research group leader with CSIRO’s Data61. Dr. Kusy’s work focuses on new frontiers in networked embedded systems, mobile and wearable computing, and Internet of Things. 

    Brano, tell us about your work on the Great Barrier Reef and what attracted you to it?

    My research interests are at an intersection between digital technology and the physical world. Digital technology delivers high value in land environments; however, coastal ecosystems such as coral reefs remain poorly understood. This is due to their size, to the fact that seawater hides detail from remote sensing methods in all but the shallowest marine ecosystems, and to the general difficulty of operating digital technology in remote marine environments.

    I have championed a multi-pronged approach to solve reef challenges that relied on CSIRO’s in-house technologies, such as Internet of Things, robotics, machine learning, and computer vision.

    We have developed biosensors that can monitor the feeding of coral trout and the physiology of oysters, a new underwater hyperspectral imaging platform, and a robust method for detecting Irukandji jellyfish based on eDNA contained in seawater.

    Can you tell us more about the machine learning technology behind the crown-of-thorns starfish surveys?

    The COTS monitoring application is the culmination of edge ML (machine learning) and imaging technologies developed over the past four years.

    It is based on a close collaboration of CSIRO computer vision and edge ML experts with Google and Kaggle and it is a shining example of ML technology helping to protect the environment.

    We have built an edge ML platform for oceans that can analyse underwater images as they are collected by marine scientists in the field and basically uncover the hidden world under the surface through an intuitive touch-screen interface.

    In the COTS monitoring use case, the ML platform processes the images in real-time and shows the survey team on the boat how many COTS have been detected and their whereabouts.
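
    A hypothetical sketch of what such a survey loop could look like (all names here are placeholders rather than the project’s actual API; the stub detector keeps the sketch self-contained, where the real system runs the trained TensorFlow model on each frame):

        # Hypothetical shape of the real-time survey loop. detect_cots is a
        # stub so the sketch runs anywhere; a deployment would call the
        # open-sourced TensorFlow COTS model here instead.
        import numpy as np

        def detect_cots(frame: np.ndarray):
            """Stub detector returning (score, box) pairs for illustration."""
            rng = np.random.default_rng(int(frame.sum()) % 2**32)
            n = int(rng.integers(0, 4))                   # pretend 0-3 detections
            return [(float(rng.uniform(0.3, 0.99)),       # confidence score
                     rng.uniform(0, 1, size=4).tolist())  # normalized box
                    for _ in range(n)]

        total = 0
        for i in range(5):                                # stand-in for a video feed
            frame = np.random.randint(0, 255, (720, 1280, 3), dtype=np.uint8)
            hits = [(s, b) for s, b in detect_cots(frame) if s > 0.5]
            total += len(hits)
            print(f"frame {i}: {len(hits)} COTS above threshold, total {total}")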

    The beauty of this approach is that it is not locked in – it generalizes to many applications and devices. We demonstrated it works amazingly well for mapping COTS on coral reefs, but the method can be adapted for sea cucumbers in a sustainable aquaculture context, seagrass biomass for carbon accounting, or surveying the condition, health, and diversity of sea life on coral reefs for climate impact assessment.

    Additionally, our platform works with many different data collection technologies and supports multiple ML software frameworks; all you need is a wired or wireless connection from your data collection platform. We demonstrated the platform with in-house data collection technology, real-time GoPro camera streams, and the commercial Pro Squid platform, and will be adding more in the future.

    1
    AI model detecting crown-of-thorns starfish.

    What role can digital sciences play in ensuring the sustainability of our natural environments?

    One of our major objectives was to scale our invention to increase global impact. This was achieved by allowing fellow researchers to use our technology to explore the plethora of opportunities in this space.

    In collaboration with the Google TensorFlow team, we open-sourced the COTS ML model and workflows under the Apache license. This allowed students, scientists, and entrepreneurs worldwide to evaluate our ML technology with their own image datasets and extend it to suit their application.

    The ML model training toolchain will be released soon to retrain the ML models for other species or object identification. By democratising ML capabilities in this space, we can make a tangible difference in ocean and marine life protection.

    How important are partnerships to this kind of work?

    It’s impossible to overstate the importance of partnerships and open sharing of scientific ideas in this line of work. In addition to our technology being inherently multi-disciplinary (designed by computer geeks like me, but used and interpreted by marine scientists), careful planning is required to deploy the technology prototypes reliably and safely at sea. Conditions can change in an instant and internet connectivity is non-existent.

    The project team needs to work as a tight-knit unit. We are very fortunate to have worked with some of the most competent and experienced crews in Australia. Shout out to University of Queensland’s Heron Island and Moreton Bay research stations, University of Sydney’s One Tree Island research station, Blue Planet Marine, and GBR Marine Park Authority.

    It was also a great privilege to work with Google Tensorflow and Kaggle teams. Having access to the latest ML expertise and hardware resources coupled with the global reach of both brands was pivotal in getting the message out. Over 2,000 international ML teams participated in the competition, the video was viewed 26 million times, and we were featured in a keynote at Google’s annual I/O conference.

    Dr Joey Crosswell

    Dr Joey Crosswell is a biogeochemist with broad research interests across oceanography and engineering. His research includes diverse environments around the world, ranging from mangroves and mesoscale eddies, to arid tropical estuaries in northern Australia and fjords in Patagonia.

    Joey, tell us about your work on the Great Barrier Reef and what attracted you to it?

    My research focuses on the connectivity of coastal systems, particularly carbon and nutrient cycling between land, ocean, and atmosphere. The Reef is particularly interesting in this regard because it is one of the largest and most complex coastal ecosystems in the world. For example, human activities far up in river catchments and oceanic processes that start on the other side of the Pacific come together in the GBR to affect the health and resilience of the Reef.

    My work looks at untangling these processes across the multiple time and space scales by using novel observation methods combined with advanced modelling tools, such as eReefs. This multi-scale understanding is important for managing the Reef because it informs where meaningful local actions can be taken, such as restoration, through to needs for larger-scale efforts such as global climate action.

    I have worked in estuaries and coral reefs along the entire coast of the GBR, but I am particularly interested in those further afield. That is, the more remote, the better. These systems provide a valuable comparison that help us gauge the impact of coastal development and future change.

    The lack of existing data in many of these remote environments also presents the challenge of building a holistic understanding from the ground up, a task for which I think CSIRO’s research disciplines, researchers, and partnerships are uniquely suited. I also have a keen interest in extreme events such as cyclones and floods that are relatively brief but have lasting impacts.

    We currently have a limited ability to resolve these events, and the development of new observational tools, methods and models for extreme conditions is one of my long-term research passions.

    2
    An aerial map of the reef showing where crown-of-thorns starfish have been detected.

    How important is multidisciplinary science and collaboration between different groups in this space?

    Put simply, it is the only effective way forward.

    Just as the Reef faces combined threats from rising sea temperatures, declining water quality, COTS and coastal development, so too must we employ cross-cutting science to support Reef resilience to these threats.

    The benefits of multidisciplinary research are being widely recognized through programs like eReefs, the Reef Restoration and Adaptation Program, and the COTS Control and Innovation Program. The COTS ML model that we recently developed through a cross-disciplinary, multi-institutional collaboration clearly shows how integrative research can drive a step change in technical methods that have otherwise made little progress for decades.

    3
    AI model detecting six crown-of-thorns starfish in underwater imagery.

    Moreover, closer coupling of research and management disciplines allows technical innovations to have a ripple effect that drives systemic change. In the COTS application, more and better data collected using ML COTS detection will enable more efficient decision support for active control measures.

    New data dimensions unlocked by computer vision will also feed back into research on key relationships and thresholds, such as triggers for COTS outbreaks that can be proactively managed. Even more exciting than the potential of multidisciplinary science to mitigate threats is the potential to maximize benefits and ecosystem services.

    Last but not least is the more personal aspect of multidisciplinary science. This blog highlights only a few members and accomplishments of much larger teams of which I am a part. Not only is it fun and fulfilling to learn from diverse expertise, backgrounds and perspectives, but it also expands the impact of our research to new environments and cultures.

    Multidisciplinary research teams are a big part of why I enjoy what I do and who I do it with, which is particularly important when you spend a lot of time on small boats at sea!

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    CSIRO campus

    CSIRO (AU), the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

    CSIRO works with leading organizations around the world. From its headquarters in Canberra, CSIRO maintains more than 50 sites across Australia and in France, Chile and the United States, employing about 5,500 people.

    Federally funded scientific research began in Australia with the Advisory Council of Science and Industry, established in 1916 but hampered by insufficient available finance. In 1926 the research effort was reinvigorated by the establishment of the Council for Scientific and Industrial Research (CSIR), which strengthened national science leadership and increased research funding. CSIR grew rapidly and achieved significant early successes. In 1949 further legislated changes included renaming the organization as CSIRO.

    Notable developments by CSIRO have included the invention of atomic absorption spectroscopy; essential components of Wi-Fi technology; the development of the first commercially successful polymer banknote; the invention of the insect repellent used in Aerogard; and the introduction of a series of biological controls into Australia, such as myxomatosis and rabbit calicivirus to control rabbit populations.

    Research and focus areas

    Research Business Units

    As at 2019, CSIRO’s research areas are identified as “Impact science” and organized into the following Business Units:

    Agriculture and Food
    Health and Biosecurity
    Data61
    Energy
    Land and Water
    Manufacturing
    Mineral Resources
    Oceans and Atmosphere

    National Facilities
    CSIRO manages national research facilities and scientific infrastructure on behalf of the nation to assist with the delivery of research. The national facilities and specialized laboratories are available to both international and Australian users from industry and research. As at 2019, the following National Facilities are listed:

    Australian Animal Health Laboratory (AAHL)
    Australia Telescope National Facility – the Facility’s radio telescopes include the Australia Telescope Compact Array, the Parkes Observatory, the Mopra Radio Telescope Observatory and the Australian Square Kilometre Array Pathfinder.

    CSIRO Australia Telescope Compact Array (AU), an array of six 22-m radio antennas at the Paul Wild Observatory, located about twenty-five kilometres (16 mi) west of the town of Narrabri in Australia.

    CSIRO-Commonwealth Scientific and Industrial Research Organization (AU) Parkes Observatory [Murriyang, the traditional Indigenous name], located 20 kilometres north of the town of Parkes, New South Wales, Australia, 414.80m above sea level.

    NASA Canberra Deep Space Communication Complex, AU, Deep Space Network. Credit: NASA.

    CSIRO Canberra campus.

    ESA DSA 1 hosts a 35-metre deep-space antenna with transmission and reception in both S- and X-band; it is located 140 kilometres north of Perth, Western Australia, near the town of New Norcia.

    CSIRO-Commonwealth Scientific and Industrial Research Organisation (AU) R/V Investigator.

    NovaSAR-1 (UK), a synthetic aperture radar satellite.

    CSIRO Pawsey Supercomputing Centre (AU).

    Magnus Cray XC40 supercomputer at the Pawsey Supercomputing Centre, Perth, Australia.

    Galaxy Cray XC30 series supercomputer at the Pawsey Supercomputing Centre, Perth, Australia.

    Zeus SGI Linux cluster at the Pawsey Supercomputing Centre.

    Others not shown

    SKA

    SKA – Square Kilometre Array.

    SKA Square Kilometre Array low frequency at Murchison Widefield Array, Boolardy station in outback Western Australia on the traditional lands of the Wajarri peoples.

    EDGES telescope in a radio quiet zone at the Murchison Radio-astronomy Observatory in Western Australia, on the traditional lands of the Wajarri peoples.

     
  • richardmitnick 1:45 pm on June 1, 2022 Permalink | Reply
    Tags: "Astronomers identify 116000 new variable stars", , , , , , Machine learning, , Surveys like ASAS-SN are an especially important tool for finding systems that can reveal the complexities of stellar processes., The All-Sky Automated Survey for Supernovae (ASAS-SN)   

    From Ohio State University: “Astronomers identify 116,000 new variable stars” 

    From Ohio State University

    5.31.22

    Tatyana Woodall
    Ohio State News
    woodall.52@osu.edu

    New technique locates stellar objects that change brightness.

    Ohio State University astronomers have identified about 116,000 new variable stars, according to a new paper.

    These heavenly bodies were found by The All-Sky Automated Survey for Supernovae (ASAS-SN), a network of 20 telescopes around the world which can observe the entire sky about 50,000 times deeper than the human eye. Researchers from Ohio State have operated the project for nearly a decade.
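
    For readers who want that depth figure in astronomers' units: a flux ratio converts to a magnitude difference via the standard relation delta_m = 2.5 * log10(ratio), so a quick check (sketch below) puts "about 50,000 times deeper" at roughly 11.7 magnitudes fainter than the naked eye.

        # Standard astronomy magnitude-flux relation: delta_m = 2.5 * log10(flux_ratio).
        import math

        flux_ratio = 50_000                      # "about 50,000 times deeper"
        delta_m = 2.5 * math.log10(flux_ratio)
        print(round(delta_m, 1))                 # ~11.7 magnitudes fainter than the eye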

    Now in a paper published in MNRAS [Monthly Notices of the Royal Astronomical Society], researchers describe how they used machine learning techniques to identify and classify variable stars — celestial objects whose brightness waxes and wanes over time, especially as observed from our perspective on Earth.

    The changes these stars undergo can reveal important information about their mass, radius, temperature and even their composition. In fact, even our sun is considered a variable star. Surveys like ASAS-SN are an especially important tool for finding systems that can reveal the complexities of stellar processes, said Collin Christy, the lead author of the paper and an ASAS-SN analyst at Ohio State.

    “Variable stars are sort of like a stellar laboratory,” he said. “They’re really neat places in the universe where we can study and learn more about how stars actually work and the little intricacies that they all have.”

    But to locate more of these elusive entities, the team first had to bring in previously unused data from the project. For years, ASAS-SN gazed at the sky using V-band filters, optical lenses that can only identify stars whose light falls into the spectrum of colors visible to the naked eye. But in 2018, the project shifted to using g-band filters — lenses that can detect more varieties of blue light — and the network went from being able to observe about 60 million stars at a time to more than 100 million.

    But unlike ASAS-SN’s citizen science campaign, which relies on volunteers to sift through and classify astronomical data, Christy’s study required the help of artificial intelligence.

    “If you want to look at millions of stars, it’s impossible for a few humans to do it by themselves. It’ll take forever,” said Tharindu Jayasinghe, co-author of the paper, a doctoral student in astronomy and an Ohio State presidential fellow. “So we had to bring something creative into the mix, like machine learning techniques.”

    The new study focused on data from Gaia, a mission to chart a three-dimensional map of our galaxy, as well as from 2MASS and AllWISE. Christy’s team used a machine learning algorithm to generate a list of 1.5 million candidate variable stars from a catalog of about 55 million isolated stars.

    Afterward, researchers whittled the number of candidates down even further. Of the 1.5 million stars they studied, nearly 400,000 turned out to be real variable stars. More than half were already known to the astronomy community, but 116,027 of them proved to be new discoveries.
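
    As a hedged illustration of this kind of two-stage candidate selection (and not the team's actual pipeline), the sketch below trains a random-forest classifier on a handful of toy light-curve features, then keeps only high-confidence candidates from a larger catalog; every feature, label, and threshold here is an invented placeholder.

        # Illustrative sketch only: not the ASAS-SN pipeline. Train a random forest
        # on toy light-curve features and keep high-confidence variables.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)

        # Hypothetical features per star: mean magnitude, scatter, skewness,
        # and periodogram peak strength (all invented placeholders).
        X_train = rng.normal(size=(5000, 4))
        y_train = rng.integers(0, 2, size=5000)   # 1 = variable, 0 = constant (toy labels)

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        clf.fit(X_train, y_train)

        # Score a large catalog and keep only confident candidates.
        X_catalog = rng.normal(size=(100_000, 4))
        prob_variable = clf.predict_proba(X_catalog)[:, 1]
        candidates = np.flatnonzero(prob_variable > 0.9)
        print(f"{candidates.size} candidate variables flagged for follow-up")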

    Although the study depended on machine learning, Christy’s team says there is still a role for citizen scientists. In fact, volunteers with the citizen science campaign have already started to identify junk data, he said. “Having people tell us what our bad data looks like is super useful, because initially, the algorithm would look at the bad data and try to make sense of it,” Christy said.

    But using a training set of all that bad data allows the team to modify and improve the overall performance of their algorithm. “This is the first time that we’re actually combining citizen science with machine learning techniques in the field of variable star astronomy,” said Jayasinghe. “We’re expanding the boundaries of what you can do when you put those two together.”
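
    One simple way to use those volunteer labels, continuing the toy setup above, is to add the flagged junk to the training set as an explicit third class so the classifier learns to set bad data aside rather than misread it; again, the data and labels below are placeholders, not the actual ASAS-SN training set.

        # Sketch of folding volunteer-flagged junk into training as a third class,
        # so the model learns to set bad data aside. All data here are placeholders.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(1)

        X_good = rng.normal(size=(4000, 4))            # toy features for real stars
        y_good = rng.integers(0, 2, size=4000)         # 0 = constant, 1 = variable
        X_junk = rng.normal(loc=3.0, size=(1000, 4))   # citizen-science-flagged junk
        y_junk = np.full(1000, 2)                      # 2 = junk / bad data

        X = np.vstack([X_good, X_junk])
        y = np.concatenate([y_good, y_junk])

        clf = RandomForestClassifier(n_estimators=200, random_state=1).fit(X, y)
        # Downstream, predictions of class 2 are discarded instead of being
        # misread as exotic variability.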

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Ohio State University is a public research university in Columbus, Ohio. Founded in 1870 as a land-grant university under the Morrill Act of 1862, it was the ninth university in Ohio and was originally known as the Ohio Agricultural and Mechanical College. The college originally focused on various agricultural and mechanical disciplines, but it developed into a comprehensive university under the direction of then-Governor (later U.S. President) Rutherford B. Hayes, and in 1878 the Ohio General Assembly passed a law changing the name to “The Ohio State University”. The main campus in Columbus, Ohio, has since grown into the third-largest university campus in the United States. The university also operates regional campuses in Lima, Mansfield, Marion, Newark, and Wooster.

    The university has an extensive student life program, with over 1,000 student organizations; intercollegiate, club and recreational sports programs; student media organizations and publications, fraternities and sororities; and three student governments. Ohio State athletic teams compete in Division I of the NCAA and are known as the Ohio State Buckeyes. As of the 2016 Summer Olympics, athletes from Ohio State have won 104 Olympic medals (46 gold, 35 silver, and 23 bronze). The university is a member of the Big Ten Conference for the majority of sports.

     
  • richardmitnick 4:10 pm on May 30, 2022 Permalink | Reply
    Tags: "Frontier supercomputer debuts as world’s fastest-breaking exascale barrier", , , , , Machine learning, , ,   

    From The DOE’s Oak Ridge National Laboratory: “Frontier supercomputer debuts as world’s fastest, breaking exascale barrier” 

    From The DOE’s Oak Ridge National Laboratory

    May 30, 2022

    Media Contacts:

    Sara Shoemaker
    shoemakerms@ornl.gov,
    865.576.9219

    Secondary Media Contact
    Katie Bethea
    Oak Ridge Leadership Computing Facility
    betheakl@ornl.gov
    757.817.2832


    Frontier: The World’s First Exascale Supercomputer Has Arrived

    The Frontier supercomputer [below] at the Department of Energy’s Oak Ridge National Laboratory earned the top ranking today as the world’s fastest on the 59th TOP500 list, with 1.1 exaflops of performance. The system is the first to achieve an unprecedented level of computing performance known as exascale, a threshold of a quintillion calculations per second.

    Frontier features a theoretical peak performance of 2 exaflops, or two quintillion calculations per second, making it ten times more powerful than ORNL’s Summit system [below]. The system leverages ORNL’s extensive expertise in accelerated computing and will enable scientists to develop critically needed technologies for the country’s energy, economic and national security, helping researchers address problems of national importance that were impossible to solve just five years ago.

    “Frontier is ushering in a new era of exascale computing to solve the world’s biggest scientific challenges,” ORNL Director Thomas Zacharia said. “This milestone offers just a preview of Frontier’s unmatched capability as a tool for scientific discovery. It is the result of more than a decade of collaboration among the national laboratories, academia and private industry, including DOE’s Exascale Computing Project, which is deploying the applications, software technologies, hardware and integration necessary to ensure impact at the exascale.”

    Rankings were announced at the International Supercomputing Conference 2022 in Hamburg, Germany, which gathers leaders from around the world in the field of high-performance computing, or HPC. Frontier’s speeds surpassed those of any other supercomputer in the world, including ORNL’s Summit, which is also housed at ORNL’s Oak Ridge Leadership Computing Facility, a DOE Office of Science user facility.

    Frontier, an HPE Cray EX supercomputer, also claimed the number one spot on the Green500 list, which rates energy use and efficiency by commercially available supercomputing systems, with 62.68 gigaflops per watt. Frontier rounded out the twice-yearly rankings with the top spot in a newer category, mixed-precision computing, which rates performance in formats commonly used for artificial intelligence, with a performance of 6.88 exaflops.

    The work to deliver, install and test Frontier began during the COVID-19 pandemic, as shutdowns around the world strained international supply chains. More than 100 members of a public-private team worked around the clock, from sourcing millions of components to ensuring deliveries of system parts on deadline to carefully installing and testing 74 HPE Cray EX supercomputer cabinets, which include more than 9,400 AMD-powered nodes and 90 miles of networking cables.

    “When researchers gain access to the fully operational Frontier system later this year, it will mark the culmination of work that began over three years ago involving hundreds of talented people across the Department of Energy and our industry partners at HPE and AMD,” ORNL Associate Lab Director for computing and computational sciences Jeff Nichols said. “Scientists and engineers from around the world will put these extraordinary computing speeds to work to solve some of the most challenging questions of our era, and many will begin their exploration on Day One.”

    3

    Frontier’s overall performance of 1.1 exaflops translates to more than one quintillion floating point operations per second, or flops, as measured by the High-Performance Linpack Benchmark test. Each flop represents a possible calculation, such as addition, subtraction, multiplication or division.

    Frontier’s early performance on the Linpack benchmark amounts to more than seven times that of Summit at 148.6 petaflops. Summit continues as an impressive, highly ranked workhorse machine for open science, listed at number four on the TOP500.
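
    A quick back-of-the-envelope check of that "more than seven times" figure, using only the numbers quoted in this article:

        # Back-of-the-envelope check using the figures quoted above.
        frontier_flops = 1.1e18        # Frontier: 1.1 exaflops on HPL
        summit_flops = 148.6e15        # Summit: 148.6 petaflops
        print(frontier_flops / summit_flops)   # ~7.4, i.e. "more than seven times"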

    Frontier’s mixed-precision computing performance clocked in at roughly 6.88 exaflops, or more than 6.8 quintillion flops per second, as measured by the High-Performance Linpack-Accelerator Introspection, or HPL-AI, test. The HPL-AI test measures calculation speeds in the computing formats typically used by the machine-learning methods that drive advances in artificial intelligence.

    Detailed simulations relied on by traditional HPC users to model such phenomena as cancer cells, supernovas, the coronavirus or the atomic structure of elements require 64-bit precision, a computationally demanding form of computing accuracy. Machine-learning algorithms typically require much less precision — sometimes as little as 32-, 24- or 16-bit accuracy — and can take advantage of special hardware in the graphic processing units, or GPUs, relied on by machines like Frontier to reach even faster speeds.
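
    The trade-off is easy to see in miniature. The sketch below compares the same matrix product in 64-bit and 16-bit floating point using NumPy on a CPU; it shows only the numerical effect, since the speed advantage comes from specialized GPU hardware of the kind Frontier uses.

        # Illustration of the precision trade-off on a CPU with NumPy; the speed
        # gains come from specialized GPU hardware, which this does not show.
        import numpy as np

        rng = np.random.default_rng(0)
        a64 = rng.random((500, 500))                 # 64-bit "simulation" precision
        a16 = a64.astype(np.float16)                 # 16-bit "AI" precision

        b64 = a64 @ a64                              # full-precision product
        b16 = (a16 @ a16).astype(np.float64)         # reduced-precision product

        # Relative error shows the accuracy given up for speed and memory.
        print(np.abs(b64 - b16).max() / np.abs(b64).max())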

    ORNL and its partners continue to execute the bring-up of Frontier on schedule. Next steps include continued testing and validation of the system, which remains on track for final acceptance and early science access later in 2022, with full science operations opening at the beginning of 2023.

    4
    Credit: Laddy Fields/ORNL, U.S. Dept. of Energy.

    FACTS ABOUT FRONTIER

    The Frontier supercomputer’s exascale performance is enabled by some of the world’s most advanced pieces of technology from HPE and AMD:

    Frontier has 74 HPE Cray EX supercomputer cabinets, which are purpose-built to support next-generation supercomputing performance and scale.

    Each node contains one optimized EPYC™ processor and four AMD Instinct™ accelerators, for a total of more than 9,400 CPUs and more than 37,000 GPUs in the entire system. These nodes provide developers with easier capabilities to program their applications, due to the coherency enabled by the EPYC processors and Instinct accelerators.

    HPE Slingshot, the world’s only high-performance Ethernet fabric designed for next-generation HPC and AI solutions, addresses demands for higher speed and congestion control so that larger, data-intensive workloads run smoothly and perform well.

    An I/O subsystem from HPE that will come online this year to support Frontier and the OLCF. The I/O subsystem features an in-system storage layer and Orion, a Lustre-based enhanced center-wide file system that is also the world’s largest and fastest single parallel file system, based on the Cray ClusterStor E1000 storage system. The in-system storage layer will employ compute-node local storage devices connected via PCIe Gen4 links to provide peak read speeds of more than 75 terabytes per second, peak write speeds of more than 35 terabytes per second, and more than 15 billion random-read input/output operations per second. The Orion center-wide file system will provide around 700 petabytes of storage capacity and peak write speeds of 5 terabytes per second.

    As a next-generation supercomputing system and the world’s fastest for open science, Frontier is also energy-efficient thanks to its liquid cooling, which removes the need for a noisier air-cooled system and so makes for a quieter data center.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    Established in 1942, The DOE’s Oak Ridge National Laboratory is the largest science and energy national laboratory in the Department of Energy system (by size) and third largest by annual budget. It is located in the Roane County section of Oak Ridge, Tennessee. Its scientific programs focus on materials, neutron science, energy, high-performance computing, systems biology and national security, sometimes in partnership with the state of Tennessee, universities and other industries.

    ORNL has several of the world’s top supercomputers, including Summit [below], which the TOP500 formerly ranked as Earth’s second-most powerful and which now sits at number four.

    ORNL OLCF IBM AC922 SUMMIT supercomputer, formerly No. 1 on the TOP500.

    The lab is a leading neutron and nuclear power research facility that includes the Spallation Neutron Source and High Flux Isotope Reactor.

    ORNL Spallation Neutron Source annotated.

    It hosts the Center for Nanophase Materials Sciences, the BioEnergy Science Center, and the Consortium for Advanced Simulation of Light Water Nuclear Reactors.

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.

    Areas of research

    ORNL conducts research and development activities that span a wide range of scientific disciplines. Many research areas have a significant overlap with each other; researchers often work in two or more of the fields listed here. The laboratory’s major research areas are described briefly below.

    Chemical sciences – ORNL conducts both fundamental and applied research in a number of areas, including catalysis, surface science and interfacial chemistry; molecular transformations and fuel chemistry; heavy element chemistry and radioactive materials characterization; aqueous solution chemistry and geochemistry; mass spectrometry and laser spectroscopy; separations chemistry; materials chemistry including synthesis and characterization of polymers and other soft materials; chemical biosciences; and neutron science.
    Electron microscopy – ORNL’s electron microscopy program investigates key issues in condensed matter, materials, chemical and nanosciences.
    Nuclear medicine – The laboratory’s nuclear medicine research is focused on the development of improved reactor production and processing methods to provide medical radioisotopes, the development of new radionuclide generator systems, the design and evaluation of new radiopharmaceuticals for applications in nuclear medicine and oncology.
    Physics – Physics research at ORNL is focused primarily on studies of the fundamental properties of matter at the atomic, nuclear, and subnuclear levels and the development of experimental devices in support of these studies.
    Population – ORNL provides federal, state and international organizations with a gridded population database, called LandScan, for estimating ambient population. LandScan is a raster image, or grid, of population counts that provides human population estimates every 30 x 30 arc seconds, which translates roughly to population estimates for 1-kilometre-square windows, or grid cells, at the equator, with cell width decreasing at higher latitudes. Though many population datasets exist, LandScan is widely regarded as the best spatial population dataset that also covers the globe. Updated annually (although data releases are generally one year behind the current year), it offers continuous, updated values of population based on the most recent information. LandScan data are accessible through GIS applications and a USAID public domain application called Population Explorer.

     
  • richardmitnick 10:50 am on May 18, 2022 Permalink | Reply
    Tags: "Bargain proteins", , DOE INCITE, , Machine learning, , , , U Washington Baker Lab   

    From DOE’s ASCR Discovery: “Bargain proteins” 

    From DOE’s ASCR Discovery

    May 2022

    A Flatiron Institute biologist uses supercomputers and their quantum cousins to streamline the search for promising drugs.

    1
    An X-ray crystal structure of a self-assembling peptide helical bundle. The structure was designed on a quantum annealer and validated with a classical simulation on large-scale high-performance computing resources. Lab researchers later synthesized and characterized it experimentally. The two mirror-image subunits (one made from L-amino acids in cyan, the other made from D-amino acids in orange) were designed to pack together. Image courtesy of Vikram Mulligan, Flatiron Institute.

    Devising a drug to treat a disease isn’t easy. It means screening hundreds of thousands of compounds, testing a small pool of promising candidates first in tissue culture and later in animals, and then, after many intermediate steps, finding perhaps one molecule worthy of human testing.

    Using powerful computers to design novel proteins with the best properties can save much of this time, effort and expense. Vikram Mulligan, a research scientist at the Flatiron Institute in New York, aims to make the process even quicker and cheaper.

    “A protein’s function is uniquely determined by the way it folds, which in turn is determined by its sequence of amino-acid building blocks,” Mulligan says. “Untangling the sequence-fold-function relationship is challenging due to the vastness of both the possible sequence space and the possible conformational space” – the huge number of configurations a protein could fold into.

    Unfortunately, most researchers lack the computational resources to model protein folding. Mulligan hopes to address that limit. With allocations of supercomputer time from the Department of Energy (DOE), he and collaborators are developing machine learning methods and quantum computing technology, which relies on the strange physics that dominate at the tiniest scales, to improve protein-folding models and make them accessible to the average scientist.

    Mulligan’s quest to understand the sequence-fold-function relationship began when he was a doctoral student at The University of Toronto (CA), using lab experiments to investigate the kinetics of protein folding and misfolding in late-onset neurodegenerative diseases. As a postdoctoral researcher, he later sought additional computational skills and joined David Baker’s University of Washington lab, one of the preeminent hubs for computationally designed proteins.

    Starting in the early 2000s, Baker developed Rosetta, an open-source software suite that hundreds of labs worldwide now use to predict and design protein structures.

    Until Mulligan joined Baker’s lab, Rosetta usually was used to model proteins made from the 20 naturally occurring amino acids. Mulligan, however, saw potential in designing peptides – amino acid chains – made from nonnatural amino acids that differ from the natural ones in various ways, such as an extra chlorine atom here or a fully reconfigured side chain there. “This flexibility allows us to make structures of mixed handedness, where we have helices that spiral in opposite directions packing against one another and things like that,” Mulligan says. “Ultimately, it allows us to access much more diverse structures, which in turn means we can access more diverse functions.”

    Increasing structural diversity, however, also means increasing the challenge of computationally exploring the space of possible protein sequences. Mulligan embraced the challenge. “The ultimate test of whether we really understand how proteins fold is when we try to make something new out of building blocks that nature doesn’t use.”

    The first proof-of-principle compound derived from nonnatural amino acids that Mulligan computationally designed at Baker’s lab was a peptide that binds to and inhibits New Delhi metallo-beta-lactamase 1, an enzyme implicated in antibiotic-resistant bacteria. “If we can make something that inhibits this antibiotic-resistance factor then we can treat antibiotic-resistant infection using a combination of the inhibitor and existing antibiotics,” Mulligan says. “This would make all our old antibiotics useful again.”

    Since joining Flatiron in 2018, Mulligan has worked on methods that use ever fewer computational resources to design proteins with ever greater precision. Now he’s pursuing a pair of projects with a million node-hours on Theta, a Cray XC40 supercomputer at the Argonne Leadership Computing Facility, via a DOE INCITE (Innovative and Novel Computational Impact on Theory and Experiment) allocation.

    “On Theta, we can do a calculation in a day that might take us a week or two on smaller clusters,” Mulligan says. “That allows us to iterate very fast.” Theta’s mix of GPU- and CPU-based nodes is well suited to the research, he adds. “There are a number of computational problems related to protein folding that don’t map all that well to GPUs, so it is valuable to also have access to a lot of CPUs.”

    For his first INCITE project, Mulligan will develop machine-learning methods with low computational cost that can approximate the output of demanding validation simulations. “As we develop new methods, we need to validate the method against a reliable, established method that might be more computationally expensive,” Mulligan says. “So, we will do a one-time run of a ton of calculations on Theta to produce the data on which we will train the machine-learning method.” Once trained, researchers can use the technique to perform design and validation tasks on smaller computing systems. “It’s a one-time expensive use of computation to enable a lot of cheap computations down the road.”
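
    The surrogate idea reads roughly like the sketch below: pay once for many expensive, trusted evaluations, then train a cheap model to approximate them. Everything here, from the placeholder "simulation" to the descriptors and network size, is an invented stand-in rather than the group's actual method or Rosetta's interface.

        # Hedged sketch of the surrogate idea: run an expensive, trusted calculation
        # once on many designs, then train a cheap model to approximate it. The
        # "simulation" below is a stand-in, not Rosetta.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(42)

        def expensive_validation(x):
            """Placeholder for a costly folding/validation simulation."""
            return np.sin(x).sum(axis=1) + 0.1 * rng.normal(size=x.shape[0])

        X = rng.uniform(-2, 2, size=(20_000, 8))   # candidate design descriptors
        y = expensive_validation(X)                # one-time, expensive labeling run

        surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500)
        surrogate.fit(X, y)                        # cheap to evaluate from now on

        # New designs can now be screened in milliseconds instead of node-hours.
        print(surrogate.predict(rng.uniform(-2, 2, size=(5, 8))))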

    The second INCITE project focuses on two areas. First, Mulligan and his colleagues will use simulations of quantum computers running on Theta to design proteins using an energy function from Rosetta that’s based on classical physics. Second, they’ll attempt to improve computations of energies performed on standard computers by incorporating quantum mechanical energy calculations to complement the Rosetta energy function.

    Quantum computers could implicitly consider every possible amino acid sequence and let researchers efficiently sample from the best ones. With collaborators Hans Melo at drug-design company Menten AI and Brian Weitzner at protein-engineering firm Outpace Bio, Mulligan has successfully mapped the protein design problem to quantum annealers, special-purpose quantum computers that solve optimization problems, such as finding the most stable folding state for proteins with specific amino acid sequences.
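
    The flavor of optimization a quantum annealer solves can be imitated classically. In the toy sketch below, one option is chosen per position to minimize a pairwise energy, with simulated annealing standing in for the quantum annealer and random numbers standing in for the Rosetta-style energies that the real QUBO mapping encodes.

        # Toy version of the optimization a quantum annealer solves: choose one
        # option per position to minimize a pairwise energy. Random energies stand
        # in for Rosetta-style energies; simulated annealing stands in for the
        # quantum annealer.
        import numpy as np

        rng = np.random.default_rng(7)
        n_pos, n_opts = 10, 4                      # positions and residue options
        pair_E = rng.normal(size=(n_pos, n_opts, n_pos, n_opts))

        def energy(c):
            return sum(pair_E[i, c[i], j, c[j]]
                       for i in range(n_pos) for j in range(i + 1, n_pos))

        choice = rng.integers(0, n_opts, size=n_pos)
        E, T = energy(choice), 2.0
        for _ in range(20_000):
            trial = choice.copy()
            trial[int(rng.integers(n_pos))] = int(rng.integers(n_opts))
            E_trial = energy(trial)
            if E_trial < E or rng.random() < np.exp(-(E_trial - E) / T):
                choice, E = trial, E_trial
            T *= 0.9997                            # slow cooling schedule

        print("lowest-energy assignment found:", choice, round(E, 3))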

    With Menten AI’s Alexey Galda and Gavin Crooks at the Berkeley Institute for Theoretical Science, Mulligan also is beginning to map the problem to general-purpose gate-based quantum computers. The team has validated the first real proteins that were designed on a quantum computer, working with New York University’s Paramjit “Bobby” Arora and his graduate student, Haley Merritt, to synthesize molecules, and with UCLA researchers Michael Sawaya and Todd Yeates to solve structures.

    Although its rewards seem far off, quantum computing will open the door to exploring, for the first time, the full palette of thousands of nonnatural amino acids and other chemical building blocks available to researchers, Mulligan expects. “We hope this will be the extra little boost we need to design proteins that get across the cell membrane and bind to a target and have all the other properties we’d like to see in a drug.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    ASCR Discovery is a publication of the U.S. Department of Energy.

    The United States Department of Energy (DOE) is a cabinet-level department of the United States Government concerned with the United States’ policies regarding energy and safety in handling nuclear material. Its responsibilities include the nation’s nuclear weapons program; nuclear reactor production for the United States Navy; energy conservation; energy-related research; radioactive waste disposal; and domestic energy production. It also directs research in genomics; the Human Genome Project originated in a DOE initiative. DOE sponsors more research in the physical sciences than any other U.S. federal agency, the majority of which is conducted through its system of National Laboratories. The agency is led by the United States Secretary of Energy, and its headquarters are located in Southwest Washington, D.C., on Independence Avenue in the James V. Forrestal Building, named for James Forrestal, as well as in Germantown, Maryland.

    Formation and consolidation

    In 1942, during World War II, the United States started the Manhattan Project, a project to develop the atomic bomb, under the eye of the U.S. Army Corps of Engineers. After the war, in 1946, the Atomic Energy Commission (AEC) was created to control the future of the project. The Atomic Energy Act of 1946 also created the framework for the first National Laboratories. Among other nuclear projects, the AEC produced fabricated uranium fuel cores at locations such as Fernald Feed Materials Production Center in Cincinnati, Ohio. In 1974, the AEC gave way to the Nuclear Regulatory Commission, which was tasked with regulating the nuclear power industry, and the Energy Research and Development Administration, which was tasked with managing the nuclear weapon, naval reactor, and energy development programs.

    The 1973 oil crisis called attention to the need to consolidate energy policy. On August 4, 1977, President Jimmy Carter signed into law The Department of Energy Organization Act of 1977 (Pub.L. 95–91, 91 Stat. 565, enacted August 4, 1977), which created the Department of Energy. The new agency, which began operations on October 1, 1977, consolidated the Federal Energy Administration; the Energy Research and Development Administration; the Federal Power Commission; and programs of various other agencies. Former Secretary of Defense James Schlesinger, who served under Presidents Nixon and Ford during the Vietnam War, was appointed as the first secretary.

    President Carter created the Department of Energy with the goal of promoting energy conservation and developing alternative sources of energy. He wanted the country to be less dependent on foreign oil and to reduce the use of fossil fuels. With international energy’s future uncertain for America, Carter acted quickly to have the department begin operations in the first year of his presidency. This was an extremely important issue of the time, as the oil crisis was causing shortages and inflation. During the Three Mile Island accident, Carter was able to intervene with the help of the department, making changes within the Nuclear Regulatory Commission to fix its management and procedures. This was possible because nuclear energy and weapons are the responsibility of the Department of Energy.

    Recent

    On March 28, 2017, a supervisor in the Office of International Climate and Clean Energy asked staff to avoid the phrases “climate change,” “emissions reduction,” or “Paris Agreement” in written memos, briefings or other written communication. A DOE spokesperson denied that phrases had been banned.

    In a May 2019 press release concerning natural gas exports from a Texas facility, the DOE used the term ‘freedom gas’ to refer to natural gas. The phrase originated from a speech made by Secretary Rick Perry in Brussels earlier that month. Washington Governor Jay Inslee decried the term as “a joke”.

    Facilities

    The Department of Energy operates a system of national laboratories and technical facilities for research and development, as follows:

    Ames Laboratory
    Argonne National Laboratory
    Brookhaven National Laboratory
    Fermi National Accelerator Laboratory
    Idaho National Laboratory
    Lawrence Berkeley National Laboratory
    Lawrence Livermore National Laboratory
    Los Alamos National Laboratory
    National Renewable Energy Laboratory
    Oak Ridge National Laboratory
    Pacific Northwest National Laboratory
    Princeton Plasma Physics Laboratory
    Sandia National Laboratories
    Savannah River National Laboratory
    SLAC National Accelerator Laboratory
    Thomas Jefferson National Accelerator Facility

    Other major DOE facilities include:
    Albany Research Center
    Bannister Federal Complex
    Bettis Atomic Power Laboratory – focuses on the design and development of nuclear power for the U.S. Navy
    Kansas City Plant
    Knolls Atomic Power Laboratory – operates for Naval Reactors Program Research under the DOE (not a National Laboratory)
    National Petroleum Technology Office
    Nevada Test Site
    New Brunswick Laboratory
    Office of Fossil Energy
    Office of River Protection
    Pantex
    Radiological and Environmental Sciences Laboratory
    Y-12 National Security Complex
    Yucca Mountain nuclear waste repository
    Other:

    Pahute Mesa Airstrip – Nye County, Nevada, in supporting Nevada National Security Site

     
  • richardmitnick 10:07 am on May 16, 2022 Permalink | Reply
    Tags: "Johannes Eichstaedt- Exploring the Intersection of Psychology and AI", , Machine learning, One current focus is building a dataset of millions of Americans whose tweets will be analyzed over time to screen for individual communities’ mental health., , , , The Stanford Computational Psychology and Well-Being Lab   

    From Stanford University: “Johannes Eichstaedt: Exploring the Intersection of Psychology and AI” 

    Stanford University Name

    From Stanford University

    1
    Stanford University Human-Centered Artificial Intelligence

    May 3, 2022
    Beth Jensen

    “The beauty of what we’re doing at these interdisciplinary AI intersections is that we’re really building things that haven’t existed before,” says the Shriram Faculty Fellow.

    1
    Johannes Eichstaedt is a computational psychologist and the Ram and Vijay Shriram Faculty Fellow at the Stanford Institute for Human-Centered Artificial Intelligence. | Christine Baker.

    Johannes Eichstaedt remembers the moment he knew he wanted to switch his focus from particle physics to psychology. A decade ago, gazing at the particle accelerator at The DOE’s Argonne National Laboratory, he realized he felt out of place both professionally and personally.

    “I realized I cared more about people than I did about particles,” he says. “I wanted to work on how humans live their lives, and I wanted it to be work that had the potential to be relevant to a lot of people, so I decided to switch into the social sciences. I chose psychology because it’s a connector between fields like cognitive science, sociology, public health, and economics. It does a lot of linkage, which I appreciate.”

    Eichstaedt found himself drawn not only to the interdisciplinary potential of psychology, but also to the relatively new subfield of positive psychology, which strives to understand what makes human life most worth living. At The University of Pennsylvania, he joined a cadre of young psychologists and computer scientists eager to take advantage of governments’ increasing interest in measuring the happiness and mental health of their citizens.

    “We thought that if these governments were doing this at a small scale, maybe we could find ways to do it globally and cheaply using social media-based indicators,” he says. “We started the World Well-Being Project in an attempt to use social media and large-scale aggregated data and run it through natural language processing to create indicators of well-being for counties and cities.” Today, Eichstaedt is a computational social scientist and the Ram and Vijay Shriram Faculty Fellow at the Stanford Institute for Human-Centered Artificial Intelligence (HAI). He continues to collaborate with many of those colleagues as they study a growing volume of social media data to try to understand the mental and physical health of users.

    “Together we now have tremendous firepower, with 30 to 40 people working on this from all different aspects,” says Eichstaedt, who also directs The Stanford Computational Psychology and Well-Being Lab. “Ten years ago, we began the consortium by studying well-being, but now we want to know if we can use these methods to help understand disease and reduce mortality.”

    His team uses social media data and machine learning to gain insight into a wide range of health-related issues; they’ve recently identified heightened levels of depression in the Black community following the murder of George Floyd, and tracked the public’s COVID-19 social-distancing trends and adherence to public health guidelines. One current focus is building a dataset of millions of Americans whose tweets will be analyzed over time to screen for individual communities’ mental health, which is believed to have worsened significantly during the pandemic. Using prediction models that search for words associated with negative emotions and cognitions, Eichstaedt is able to pinpoint counties throughout the U.S. where the population is at particular risk of depression and its associated physical complications.
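
    In spirit, the simplest version of such word-based screening looks like the sketch below: score each tweet against a lexicon of negative-emotion words and aggregate by community. The lexicon, tweets, and scoring rule are invented placeholders; the group's actual models are trained prediction models, not a bare word count.

        # Hedged sketch of word-based screening: score tweets against a small
        # lexicon of negative-emotion words and aggregate by county. The lexicon,
        # tweets and scoring rule are invented placeholders.
        from collections import defaultdict

        NEGATIVE_LEXICON = {"alone", "hopeless", "tired", "worthless", "numb"}

        tweets = [
            ("county_a", "feeling hopeless and tired again"),
            ("county_a", "great day at the lake"),
            ("county_b", "so alone lately and everything feels numb"),
        ]

        rates = defaultdict(list)
        for county, text in tweets:
            words = text.lower().split()
            hits = sum(w in NEGATIVE_LEXICON for w in words)
            rates[county].append(hits / len(words))   # per-tweet negativity rate

        for county, vals in rates.items():
            print(county, sum(vals) / len(vals))      # community-level signal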

    “Communities have properties,” he says. “It turns out that these social media pipelines are really good at predicting things that are psychological in nature—things like suicides, accidents, drinking, and even atherosclerotic heart disease.”

    Eichstaedt is also working with HAI Associate Director Russell Altman to develop a project tracking the opioid epidemic using drug-related language culled from social media. The project could give healthcare professionals and policymakers much-needed community health information more quickly and cheaply than traditional surveying.

    Privacy is at the forefront of Eichstaedt’s work. When studying individuals, his team obtains the necessary permission to analyze participants’ social media feeds, follows strict privacy guidelines, and is externally reviewed for compliance and ethical research conduct. Few social media users, however, realize how much information can be revealed by allowing access to their statuses or “likes,” raising important questions on user privacy, informed consent, data protection, and data ownership. Yet the technology also holds the potential to greatly benefit public health.

    “I think we’re not even foreseeing some of the most important applications for this technology,” he says, adding that they could range from predicting communities most at risk of having low birthweight babies to assessing human stress levels following climate change-related disasters such as wildfires. The technology could be especially useful for gaining health insights into populations in under-resourced areas of the world where data collection isn’t as prolific as in the U.S.

    “The beauty of what we’re doing at these interdisciplinary AI intersections is that we’re really building things that haven’t existed before,” Eichstaedt says. “It’s often hard for interdisciplinary people like myself to find truly interdisciplinary environments in which to work. That’s why I’m really grateful for HAI, which has given me the chance to do my work at full force. And that’s been really nice.”

    For his work, Eichstaedt recently received two major early career researcher awards from The Association for Psychological Science and The International Positive Psychology Association.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Stanford University campus

    Leland and Jane Stanford founded Stanford University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

    Stanford University, officially Leland Stanford Junior University, is a private research university located in Stanford, California. Stanford was founded in 1885 by Leland and Jane Stanford in memory of their only child, Leland Stanford Jr., who had died of typhoid fever at age 15 the previous year. Stanford is consistently ranked as among the most prestigious and top universities in the world by major education publications. It is also one of the top fundraising institutions in the country, becoming the first school to raise more than a billion dollars in a year.

    Leland Stanford was a U.S. senator and former governor of California who made his fortune as a railroad tycoon. The school admitted its first students on October 1, 1891, as a coeducational and non-denominational institution. Stanford University struggled financially after the death of Leland Stanford in 1893 and again after much of the campus was damaged by the 1906 San Francisco earthquake. Following World War II, provost Frederick Terman supported faculty and graduates’ entrepreneurialism to build self-sufficient local industry in what would later be known as Silicon Valley.

    The university is organized around seven schools: three schools consisting of 40 academic departments at the undergraduate level as well as four professional schools that focus on graduate programs in law, medicine, education, and business. All schools are on the same campus. Students compete in 36 varsity sports, and the university is one of two private institutions in the Division I FBS Pac-12 Conference. It has gained 126 NCAA team championships, and Stanford has won the NACDA Directors’ Cup for 24 consecutive years, beginning in 1994–1995. In addition, Stanford students and alumni have won 270 Olympic medals including 139 gold medals.

    As of October 2020, 84 Nobel laureates, 28 Turing Award laureates, and eight Fields Medalists have been affiliated with Stanford as students, alumni, faculty, or staff. In addition, Stanford is particularly noted for its entrepreneurship and is one of the most successful universities in attracting funding for start-ups. Stanford alumni have founded numerous companies, which combined produce more than $2.7 trillion in annual revenue, roughly equivalent to the 7th largest economy in the world (as of 2020). Stanford is the alma mater of one president of the United States (Herbert Hoover), 74 living billionaires, and 17 astronauts. It is also one of the leading producers of Fulbright Scholars, Marshall Scholars, Rhodes Scholars, and members of the United States Congress.

    Stanford University was founded in 1885 by Leland and Jane Stanford and dedicated to Leland Stanford Jr., their only child. The institution opened in 1891 on the Stanfords’ former Palo Alto farm.

    Jane and Leland Stanford modeled their university after the great eastern universities, most specifically Cornell University. When it opened in 1891, Stanford was called the “Cornell of the West” because many of its faculty were former Cornell affiliates (professors, alumni, or both), including its first president, David Starr Jordan, and second president, John Casper Branner. Both Cornell and Stanford were among the first to make higher education accessible, nonsectarian, and open to women as well as men. Cornell is credited as one of the first American universities to adopt this radical departure from traditional education, and Stanford became an early adopter as well.

    Despite being impacted by earthquakes in both 1906 and 1989, the campus was rebuilt each time. In 1919, The Hoover Institution on War, Revolution and Peace was started by Herbert Hoover to preserve artifacts related to World War I. The Stanford Medical Center, completed in 1959, is a teaching hospital with over 800 beds. The DOE’s SLAC National Accelerator Laboratory (originally named the Stanford Linear Accelerator Center), established in 1962, performs research in particle physics.

    Land

    Most of Stanford is on an 8,180-acre (12.8 sq mi; 33.1 km^2) campus, one of the largest in the United States. It is located on the San Francisco Peninsula, in the northwest part of the Santa Clara Valley (Silicon Valley) approximately 37 miles (60 km) southeast of San Francisco and approximately 20 miles (30 km) northwest of San Jose. In 2008, 60% of this land remained undeveloped.

    Stanford’s main campus includes a census-designated place within unincorporated Santa Clara County, although some of the university land (such as the Stanford Shopping Center and the Stanford Research Park) is within the city limits of Palo Alto. The campus also includes much land in unincorporated San Mateo County (including the SLAC National Accelerator Laboratory and the Jasper Ridge Biological Preserve), as well as in the city limits of Menlo Park (Stanford Hills neighborhood), Woodside, and Portola Valley.

    Non-central campus

    Stanford currently operates in various locations outside of its central campus.

    On the founding grant:

    Jasper Ridge Biological Preserve is a 1,200-acre (490 ha) natural reserve south of the central campus owned by the university and used by wildlife biologists for research.
    SLAC National Accelerator Laboratory is a facility west of the central campus operated by the university for the Department of Energy. It contains the longest linear particle accelerator in the world, 2 miles (3.2 km) on 426 acres (172 ha) of land.
    Golf course and a seasonal lake: The university also has its own golf course and a seasonal lake (Lake Lagunita, actually an irrigation reservoir), both home to the vulnerable California tiger salamander. As of 2012 Lake Lagunita was often dry and the university had no plans to artificially fill it.

    Off the founding grant:

    Hopkins Marine Station, in Pacific Grove, California, is a marine biology research center owned by the university since 1892.
    Study abroad locations: unlike typical study abroad programs, Stanford itself operates in several locations around the world; thus, each location has Stanford faculty-in-residence and staff in addition to students, creating a “mini-Stanford”.

    Redwood City campus for many of the university’s administrative offices located in Redwood City, California, a few miles north of the main campus. In 2005, the university purchased a small, 35-acre (14 ha) campus in Midpoint Technology Park intended for staff offices; development was delayed by The Great Recession. In 2015 the university announced a development plan and the Redwood City campus opened in March 2019.

    The Bass Center in Washington, DC provides a base, including housing, for the Stanford in Washington program for undergraduates. It includes a small art gallery open to the public.

    China: the Stanford Center at Peking University, housed in the Lee Jung Sen Building, is a small center for researchers and students operated in collaboration with Peking University [北京大学] (CN) and its Kavli Institute for Astronomy and Astrophysics (KIAA-PKU).

    Administration and organization

    Stanford is a private, non-profit university that is administered as a corporate trust governed by a privately appointed board of trustees with a maximum membership of 38. Trustees serve five-year terms (not more than two consecutive terms) and meet five times annually. A new trustee is chosen by the current trustees by ballot. The Stanford trustees also oversee the Stanford Research Park, the Stanford Shopping Center, the Cantor Center for Visual Arts, Stanford University Medical Center, and many associated medical facilities (including the Lucile Packard Children’s Hospital).

    The board appoints a president to serve as the chief executive officer of the university, to prescribe the duties of professors and course of study, to manage financial and business affairs, and to appoint nine vice presidents. The provost is the chief academic and budget officer, to whom the deans of each of the seven schools report. Persis Drell became the 13th provost in February 2017.

    As of 2018, the university was organized into seven academic schools. The schools of Humanities and Sciences (27 departments), Engineering (nine departments), and Earth, Energy & Environmental Sciences (four departments) have both graduate and undergraduate programs while the Schools of Law, Medicine, Education and Business have graduate programs only. The powers and authority of the faculty are vested in the Academic Council, which is made up of tenure and non-tenure line faculty, research faculty, senior fellows in some policy centers and institutes, the president of the university, and some other academic administrators, but most matters are handled by the Faculty Senate, made up of 55 elected representatives of the faculty.

    The Associated Students of Stanford University (ASSU) is the student government for Stanford and all registered students are members. Its elected leadership consists of the Undergraduate Senate elected by the undergraduate students, the Graduate Student Council elected by the graduate students, and the President and Vice President elected as a ticket by the entire student body.

    Stanford is the beneficiary of a special clause in the California Constitution, which explicitly exempts Stanford property from taxation so long as the property is used for educational purposes.

    Endowment and donations

    The university’s endowment, managed by the Stanford Management Company, was valued at $27.7 billion as of August 31, 2019. Payouts from the Stanford endowment covered approximately 21.8% of university expenses in the 2019 fiscal year. In the 2018 NACUBO-TIAA survey of colleges and universities in the United States and Canada, only Harvard University, the University of Texas System, and Yale University had larger endowments than Stanford.

    In 2006, President John L. Hennessy launched a five-year campaign called the Stanford Challenge, which reached its $4.3 billion fundraising goal in 2009, two years ahead of time, but continued fundraising for the duration of the campaign. It concluded on December 31, 2011, having raised a total of $6.23 billion and breaking the previous campaign fundraising record of $3.88 billion held by Yale. Specifically, the campaign raised $253.7 million for undergraduate financial aid, as well as $2.33 billion for its initiative in “Seeking Solutions” to global problems, $1.61 billion for “Educating Leaders” by improving K-12 education, and $2.11 billion for “Foundation of Excellence” aimed at providing academic support for Stanford students and faculty. Funds supported 366 new fellowships for graduate students, 139 new endowed chairs for faculty, and 38 new or renovated buildings. The new funding also enabled the construction of a facility for stem cell research; a new campus for the business school; an expansion of the law school; a new Engineering Quad; a new art and art history building; an on-campus concert hall; a new art museum; and a planned expansion of the medical school, among other things. In 2012, the university raised $1.035 billion, becoming the first school to raise more than a billion dollars in a year.

    Research centers and institutes

    DOE’s SLAC National Accelerator Laboratory
    Stanford Research Institute, a center of innovation to support economic development in the region.
    Hoover Institution, a conservative American public policy research institution that promotes personal and economic liberty, free enterprise, and limited government.
    Hasso Plattner Institute of Design, a multidisciplinary design school in cooperation with the Hasso Plattner Institute of University of Potsdam [Universität Potsdam] (DE) that integrates product design, engineering, and business management education.
    Martin Luther King Jr. Research and Education Institute, which grew out of and still contains the Martin Luther King Jr. Papers Project.
    John S. Knight Fellowship for Professional Journalists
    Center for Ocean Solutions
    Together with UC Berkeley and UC San Francisco, Stanford is part of the Biohub, a new medical science research center founded in 2016 by a $600 million commitment from Facebook CEO and founder Mark Zuckerberg and pediatrician Priscilla Chan.

    Discoveries and innovation

    Natural sciences

    Biological synthesis of deoxyribonucleic acid (DNA) – Arthur Kornberg synthesized DNA material and won the Nobel Prize in Physiology or Medicine 1959 for his work at Stanford.
    First Transgenic organism – Stanley Cohen and Herbert Boyer were the first scientists to transplant genes from one living organism to another, a fundamental discovery for genetic engineering. Thousands of products have been developed on the basis of their work, including human growth hormone and hepatitis B vaccine.
    Laser – Arthur Leonard Schawlow shared the 1981 Nobel Prize in Physics with Nicolaas Bloembergen and Kai Siegbahn for his work on lasers.
    Nuclear magnetic resonance – Felix Bloch developed new methods for nuclear magnetic precision measurements, which are the underlying principles of the MRI.

    Computer and applied sciences

    ARPANET – Stanford Research Institute, formerly part of Stanford but on a separate campus, was the site of one of the four original ARPANET nodes.

    Internet—Stanford was the site where the original design of the Internet was undertaken. Vint Cerf led a research group to elaborate the design of the Transmission Control Protocol (TCP/IP) that he originally co-created with Robert E. Kahn (Bob Kahn) in 1973 and which formed the basis for the architecture of the Internet.

    Frequency modulation synthesis – John Chowning of the Music department invented the FM music synthesis algorithm in 1967, and Stanford later licensed it to Yamaha Corporation.

    Google – Google began in January 1996 as a research project by Larry Page and Sergey Brin when they were both PhD students at Stanford. They were working on the Stanford Digital Library Project (SDLP). The SDLP’s goal was “to develop the enabling technologies for a single, integrated and universal digital library” and it was funded through the National Science Foundation, among other federal agencies.

    Klystron tube – invented by the brothers Russell and Sigurd Varian at Stanford. Their prototype was completed and demonstrated successfully on August 30, 1937. Upon publication in 1939, news of the klystron immediately influenced the work of U.S. and UK researchers working on radar equipment.

    RISC – An ARPA-funded VLSI project on microprocessor design. Stanford and the University of California, Berkeley are the universities most associated with the popularization of this concept. The Stanford MIPS would go on to be commercialized as the successful MIPS architecture, while Berkeley RISC gave its name to the entire concept and was commercialized as the SPARC. Another success from this era was IBM’s effort, which eventually led to the IBM POWER instruction set architecture, PowerPC, and Power ISA. As these projects matured, a wide variety of similar designs flourished in the late 1980s and especially the early 1990s, representing a major force in the Unix workstation market as well as in embedded processors in laser printers, routers, and similar products.
    SUN workstation – Andy Bechtolsheim designed the SUN workstation for the Stanford University Network communications project as a personal CAD workstation, which led to Sun Microsystems.

    Businesses and entrepreneurship

    Stanford is one of the most successful universities in creating companies and licensing its inventions to existing companies; it is often held up as a model for technology transfer. Stanford’s Office of Technology Licensing is responsible for commercializing university research, intellectual property, and university-developed projects.

    The university is described as having a strong venture culture in which students are encouraged, and often funded, to launch their own companies.

    Companies founded by Stanford alumni generate more than $2.7 trillion in annual revenue, equivalent to the 10th-largest economy in the world.

    Some companies closely associated with Stanford and their connections include:

    Hewlett-Packard, 1939, co-founders William R. Hewlett (B.S., PhD) and David Packard (M.S.).
    Silicon Graphics, 1981, co-founders James H. Clark (Associate Professor) and several of his grad students.
    Sun Microsystems, 1982, co-founders Vinod Khosla (M.B.A.), Andy Bechtolsheim (PhD) and Scott McNealy (M.B.A.).
    Cisco, 1984, founders Leonard Bosack (M.S.) and Sandy Lerner (M.S.), who were in charge of the Stanford Computer Science and Graduate School of Business computer operations groups, respectively, when the hardware was developed.
    Yahoo!, 1994, co-founders Jerry Yang (B.S., M.S.) and David Filo (M.S.).
    Google, 1998, co-founders Larry Page (M.S.) and Sergey Brin (M.S.).
    LinkedIn, 2002, co-founders Reid Hoffman (B.S.), Konstantin Guericke (B.S., M.S.), Eric Lee (B.S.), and Alan Liu (B.S.).
    Instagram, 2010, co-founders Kevin Systrom (B.S.) and Mike Krieger (B.S.).
    Snapchat, 2011, co-founders Evan Spiegel and Bobby Murphy (B.S.).
    Coursera, 2012, co-founders Andrew Ng (Associate Professor) and Daphne Koller (Professor, PhD).

    Student body

    Stanford enrolled 6,996 undergraduate and 10,253 graduate students as of the 2019–2020 school year. Women comprised 50.4% of undergraduates and 41.5% of graduate students. In the same academic year, the freshman retention rate was 99%.

    Stanford awarded 1,819 undergraduate degrees, 2,393 master’s degrees, 770 doctoral degrees, and 3,270 professional degrees in the 2018–2019 school year. The four-year graduation rate for the class of 2017 was 72.9%, and the six-year rate was 94.4%. The relatively low four-year graduation rate is a function of the university’s coterminal degree (or “coterm”) program, which allows students to earn a master’s degree as a one-to-two-year extension of their undergraduate program.

    As of 2010, fifteen percent of undergraduates were first-generation students.

    Athletics

    As of 2016, Stanford had 16 male varsity sports and 20 female varsity sports, 19 club sports, and about 27 intramural sports. In 1930, following a unanimous vote by the Executive Committee for the Associated Students, the athletic department adopted the mascot “Indian.” The Indian symbol and name were dropped by President Richard Lyman in 1972, after objections from Native American students and a vote by the student senate. The sports teams are now officially referred to as the “Stanford Cardinal,” referring to the deep red color, not the cardinal bird. Stanford is a member of the Pac-12 Conference in most sports, the Mountain Pacific Sports Federation in several other sports, and the America East Conference in field hockey; its teams compete at the NCAA Division I (FBS) level.

    Its traditional sports rival is the University of California, Berkeley, the neighbor to the north in the East Bay. The winner of the annual “Big Game” between the Cal and Cardinal football teams gains custody of the Stanford Axe.

    Stanford has had at least one NCAA team champion every year since the 1976–77 school year and has earned 126 NCAA national team titles since its establishment, the most among universities, and its athletes have won 522 individual national championships, also the most of any university. Stanford has won the NACDA Directors’ Cup (formerly known as the Sears Cup), awarded to the top-ranked Division I athletic program, every year for twenty-four straight years. Stanford athletes have won medals in every Olympic Games since 1912, winning 270 Olympic medals in total, 139 of them gold. In the 2008 and 2016 Summer Olympics, Stanford won more Olympic medals than any other university in the United States. Stanford athletes won 16 medals at the 2012 Summer Olympics (12 gold, two silver, and two bronze) and 27 medals at the 2016 Summer Olympics.

    Traditions

    The unofficial motto of Stanford, selected by President Jordan, is Die Luft der Freiheit weht. Translated from German, this quotation from Ulrich von Hutten means, “The wind of freedom blows.” The motto was controversial during World War I, when anything German was suspect; at that time the university disavowed that the motto was official.
    Hail, Stanford, Hail! is the Stanford Hymn sometimes sung at ceremonies or adapted by the various University singing groups. It was written in 1892 by mechanical engineering professor Albert W. Smith and his wife, Mary Roberts Smith (in 1896 she earned the first Stanford doctorate in Economics and later became associate professor of Sociology), but was not officially adopted until after a performance on campus in March 1902 by the Mormon Tabernacle Choir.
    “Uncommon Man/Uncommon Woman”: Stanford does not award honorary degrees, but in 1953 the degree of “Uncommon Man/Uncommon Woman” was created to recognize individuals who give rare and extraordinary service to the University. Technically, this degree is awarded by the Stanford Associates, a voluntary group that is part of the university’s alumni association. As Stanford’s highest honor, it is not conferred at prescribed intervals, but only when appropriate to recognize extraordinary service. Recipients include Herbert Hoover, Bill Hewlett, Dave Packard, Lucile Packard, and John Gardner.
    Big Game events: The events in the week leading up to the Big Game vs. UC Berkeley, including Gaieties (a musical written, composed, produced, and performed by the students of Ram’s Head Theatrical Society).
    “Viennese Ball”: a formal ball with waltzes, started in the 1970s by students returning from the now-closed Stanford in Vienna overseas program. It is now open to all students.
    “Full Moon on the Quad”: An annual event at Main Quad, where students gather to kiss one another starting at midnight. Typically organized by the Junior class cabinet, the festivities include live entertainment, such as music and dance performances.
    “Band Run”: An annual festivity at the beginning of the school year, where the band picks up freshmen from dorms across campus while stopping to perform at each location, culminating in a finale performance at Main Quad.
    “Mausoleum Party”: An annual Halloween party at the Stanford Mausoleum, the final resting place of Leland Stanford Jr. and his parents. A 20-year tradition, the “Mausoleum Party” was on hiatus from 2002 to 2005 due to a lack of funding, but was revived in 2006. In 2008, it was hosted in Old Union rather than at the actual Mausoleum, because rain prevented generators from being rented. In 2009, after fundraising efforts by the Junior Class Presidents and the ASSU Executive, the event was able to return to the Mausoleum despite facing budget cuts earlier in the year.
    Former campus traditions include the “Big Game bonfire” on Lake Lagunita (a seasonal lake usually dry in the fall), which was formally ended in 1997 because of the presence of endangered salamanders in the lake bed.

    Award laureates and scholars

    Stanford’s current community of scholars includes:

    19 Nobel Prize laureates (as of October 2020, 85 affiliates in total)
    171 members of the National Academy of Sciences
    109 members of the National Academy of Engineering
    76 members of the National Academy of Medicine
    288 members of the American Academy of Arts and Sciences
    19 recipients of the National Medal of Science
    1 recipient of the National Medal of Technology
    4 recipients of the National Humanities Medal
    49 members of the American Philosophical Society
    56 fellows of the American Physical Society (since 1995)
    4 Pulitzer Prize winners
    31 MacArthur Fellows
    4 Wolf Foundation Prize winners
    2 ACL Lifetime Achievement Award winners
    14 AAAI fellows
    2 Presidential Medal of Freedom winners

    Stanford University Seal

     