Tagged: Artificial Intelligence

  • richardmitnick 5:30 am on August 2, 2022
    Tags: "Making Intelligent Traffic", Artificial Intelligence, Communication Science

    From The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH): “Making Intelligent Traffic” 

    August 2, 2022
    Tanya Petersen

    Bachelor’s project (5). Delivering sustainability in traffic is no easy feat. Could software that globally coordinates cars to avoid traffic jams and congestion be the answer?

    In 2019 drivers in Rome lost an average of 166 hours to traffic jams and congestion. In Paris it was 165, Dublin 154 and Athens 107. A 2016 report published by the Swiss Federal Government showed that traffic congestion cost the country CHF 1.6 billion annually in lost time, wasted fuel, environmental damage and accidents.

    This year a group of EPFL bachelor’s students in the School of Computer and Communication Sciences undertook a project to Make Intelligent Traffic as part of the Making Intelligent Things course. They ran different centralized traffic algorithms on groups of 3D-printed Arduino cars, aiming to coordinate traffic while tracking the road system and each car’s position, so that drivers could avoid traffic jams and congestion.

    Students designed the cars. © Anirudhh Ramesh.

    “We wanted to build a traffic simulation that was more efficient than what we have right now in the world. We had a lot of ideas and in the end built a prototype in which cars didn’t speak directly with each other or determine their own locations but where a camera was directed at the area determining the cars’ location centrally by computer vision and sending that information to the cars,” said Anirudhh Ramesh, a second-year IC bachelor’s student and team member.

    The team built a road network with crossings and streets which had a number of alternative simulation modes. In one, the cars tried to reach a randomly generated destination; in another, the cars needed to pick up passengers like a taxi service. The vehicles had small barcodes attached, and the camera detected and tracked these. The software developed by the students predicted where the cars should go, giving them instructions and steering them in the right direction.
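The students’ open-source code isn’t reproduced here, but the centralized idea — a controller that knows every car’s camera-derived position and tells each car its next move — can be sketched in a few lines of Python. The road graph, car names and `dispatch` helper below are illustrative assumptions, not the team’s actual implementation:

```python
from collections import deque

def shortest_route(roads, start, goal):
    """BFS over an undirected road graph: returns the list of
    intersections from start to goal, or None if unreachable."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nxt in roads.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

def dispatch(roads, car_positions, destinations):
    """Central controller: given camera-derived car positions,
    compute an instruction (the next intersection) for every car."""
    orders = {}
    for car, pos in car_positions.items():
        route = shortest_route(roads, pos, destinations[car])
        # The first hop of the route is the car's current position.
        orders[car] = route[1] if route and len(route) > 1 else pos
    return orders

# Toy crossing layout: four intersections in a square.
roads = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
print(dispatch(roads, {"car1": "A", "car2": "D"}, {"car1": "D", "car2": "A"}))
```

In the real project the positions would come from the camera’s barcode tracking rather than being passed in by hand, and the controller would also have to arbitrate conflicts at crossings.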

    Small scale demonstration

    Currently, autonomous car projects focus on making a single car able to successfully navigate traffic, avoid dangerous situations and stay on the road. The cars are not part of a larger network such as this one and do not communicate with each other. “On a small, pilot scale this project demonstrated that centralized traffic algorithms were able to make decisions on where the cars should travel in a coordinated manner. It’s wonderful that these undergraduates came together with all their ideas and realized them in a very short time span,” said Professor Christoph Koch, who teaches the course.

    “Our main objective was to save people’s time and make traffic safer by making travel more efficient. In creating a system that’s more efficient we also hoped to save energy and fuel, making driving more sustainable in many ways,” Ramesh continued.

    The cars. © Anirudhh Ramesh.

    A real challenge!

    But the project wasn’t all smooth sailing. “Everything that you thought would be a challenge was a challenge! From coming up with a good design for the cars and back end, to getting the camera to detect all the cars and the Bluetooth to connect, there were a lot of hoops we had to jump through to get it to work. I think in the end, ironically, the computer vision things that we had to do were the easiest!” added Louis Dumas, a third-year bachelor’s student and also a team member.

    And where might the students want to take this robust and successful simulation project? “We made our project open source and it would certainly be great to see this advance further, perhaps with future students from the course?” said Dumas. “We have laid out a very strong foundation for hardware traffic simulations, and in doing so have learnt many invaluable skills. Maybe in the future we could have a satellite system and the automobile industry could use our software on their vehicles! But we know that’s a long way off!” concluded Ramesh.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH) is a research institute and university in Lausanne, Switzerland, that specializes in natural sciences and engineering. It is one of the two Swiss Federal Institutes of Technology, and it has three main missions: education, research and technology transfer.

    The QS World University Rankings ranks EPFL(CH) 14th in the world across all fields in their 2020/2021 ranking, whereas Times Higher Education World University Rankings ranks EPFL(CH) as the world’s 19th best school for Engineering and Technology in 2020.

    EPFL(CH) is located in the French-speaking part of Switzerland; the sister institution in the German-speaking part of Switzerland is The Swiss Federal Institute of Technology ETH Zürich [Eidgenössische Technische Hochschule Zürich] (CH). Associated with several specialized research institutes, the two universities form The Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles Polytechniques Fédérales] (CH) which is directly dependent on the Federal Department of Economic Affairs, Education and Research. In connection with research and teaching activities, EPFL(CH) operates a nuclear reactor CROCUS; a Tokamak Fusion reactor; a Blue Gene/Q Supercomputer; and P3 bio-hazard facilities.

    The roots of modern-day EPFL(CH) can be traced back to the foundation of a private school under the name École Spéciale de Lausanne in 1853 at the initiative of Lois Rivier, a graduate of the École Centrale Paris (FR), and John Gay, then professor and rector of the Académie de Lausanne. At its inception it had only 11 students and the offices were located at Rue du Valentin in Lausanne. In 1869, it became the technical department of the public Académie de Lausanne. When the Académie was reorganized and acquired the status of a university in 1890, the technical faculty changed its name to École d’Ingénieurs de l’Université de Lausanne. In 1946, it was renamed the École polytechnique de l’Université de Lausanne (EPUL). In 1969, the EPUL was separated from the rest of the University of Lausanne and became a federal institute under its current name. EPFL(CH), like ETH Zürich (CH), is thus directly controlled by the Swiss federal government. In contrast, all other universities in Switzerland are controlled by their respective cantonal governments. Following the nomination of Patrick Aebischer as president in 2000, EPFL(CH) began to expand into the life sciences. It absorbed the Swiss Institute for Experimental Cancer Research (ISREC) in 2008.

    In 1946, there were 360 students. In 1969, EPFL(CH) had 1,400 students and 55 professors. In the past two decades the university has grown rapidly and as of 2012 roughly 14,000 people study or work on campus, about 9,300 of these being Bachelor, Master or PhD students. The environment at modern day EPFL(CH) is highly international with the school attracting students and researchers from all over the world. More than 125 countries are represented on the campus and the university has two official languages, French and English.

    Organization

    EPFL is organized into eight schools, themselves formed of institutes that group research units (laboratories or chairs) around common themes:

    School of Basic Sciences
    Institute of Mathematics
    Institute of Chemical Sciences and Engineering
    Institute of Physics
    European Centre of Atomic and Molecular Computations
    Bernoulli Center
    Biomedical Imaging Research Center
    Interdisciplinary Center for Electron Microscopy
    MPG-EPFL Centre for Molecular Nanosciences and Technology
    Swiss Plasma Center
    Laboratory of Astrophysics

    School of Engineering

    Institute of Electrical Engineering
    Institute of Mechanical Engineering
    Institute of Materials
    Institute of Microengineering
    Institute of Bioengineering

    School of Architecture, Civil and Environmental Engineering

    Institute of Architecture
    Civil Engineering Institute
    Institute of Urban and Regional Sciences
    Environmental Engineering Institute

    School of Computer and Communication Sciences

    Algorithms & Theoretical Computer Science
    Artificial Intelligence & Machine Learning
    Computational Biology
    Computer Architecture & Integrated Systems
    Data Management & Information Retrieval
    Graphics & Vision
    Human-Computer Interaction
    Information & Communication Theory
    Networking
    Programming Languages & Formal Methods
    Security & Cryptography
    Signal & Image Processing
    Systems

    School of Life Sciences

    Bachelor-Master Teaching Section in Life Sciences and Technologies
    Brain Mind Institute
    Institute of Bioengineering
    Swiss Institute for Experimental Cancer Research
    Global Health Institute
    Ten Technology Platforms & Core Facilities (PTECH)
    Center for Phenogenomics
    NCCR Synaptic Bases of Mental Diseases

    College of Management of Technology

    Swiss Finance Institute at EPFL
    Section of Management of Technology and Entrepreneurship
    Institute of Technology and Public Policy
    Institute of Management of Technology and Entrepreneurship
    Section of Financial Engineering

    College of Humanities

    Human and social sciences teaching program

    EPFL Middle East

    Section of Energy Management and Sustainability

    In addition to the eight schools there are seven closely related institutions:

    Swiss Cancer Centre
    Center for Biomedical Imaging (CIBM)
    Centre for Advanced Modelling Science (CADMOS)
    École Cantonale d’art de Lausanne (ECAL)
    Campus Biotech
    Wyss Center for Bio- and Neuro-engineering
    Swiss National Supercomputing Centre

     
  • richardmitnick 12:51 pm on June 9, 2022
    Tags: "Keeping web-browsing data safe from hackers", Artificial Intelligence

    From The MIT Computer Science & Artificial Intelligence Laboratory (CSAIL) at The Massachusetts Institute of Technology: “Keeping web-browsing data safe from hackers” 

    June 9, 2022
    Adam Zewe | MIT News Office

    MIT researchers analyzed a powerful cyberattack, known as a website-fingerprinting attack, and then developed strategies that dramatically reduce the attacker’s chances of success. Pictured, from left to right: graduate student Jules Drean, Mengjia Yan, the Homer A. Burnell Career Development Assistant Professor of Electrical Engineering and Computer Science, and Jack Cook ’22. Image: Photo courtesy of the researchers and edited by Jose-Luis Olivares, MIT

    Malicious agents can use machine learning to launch powerful attacks that steal information in ways that are tough to prevent and often even more difficult to study.

    Attackers can capture data that “leaks” between software programs running on the same computer. They then use machine-learning algorithms to decode those signals, which enables them to obtain passwords or other private information. These are called “side-channel attacks” because information is acquired through a channel not meant for communication.

    Researchers at MIT have shown that machine-learning-assisted side-channel attacks are both extremely robust and poorly understood. The use of machine-learning algorithms, which are often impossible to fully comprehend due to their complexity, is a particular challenge. In a new paper [below], the team studied a documented attack that was thought to work by capturing signals leaked when a computer accesses memory. They found that the mechanisms behind this attack were misidentified, which would prevent researchers from crafting effective defenses.

    To study the attack, they removed all memory accesses and noticed the attack became even more powerful. Then they searched for sources of information leakage and found that the attack actually monitors events that interrupt a computer’s other processes. They show that an adversary can use this machine-learning-assisted attack to exploit a security flaw and determine the website a user is browsing with almost perfect accuracy.

    With this knowledge in hand, they developed two strategies that can thwart this attack.

    “The focus of this work is really on the analysis to find the root cause of the problem. As researchers, we should really try to delve deeper and do more analysis work, rather than just blindly using black-box machine-learning tactics to demonstrate one attack after another. The lesson we learned is that these machine-learning-assisted attacks can be extremely misleading,” says senior author Mengjia Yan, the Homer A. Burnell Career Development Assistant Professor of Electrical Engineering and Computer Science (EECS) and a member of the Computer Science and Artificial Intelligence Laboratory (CSAIL).

    The lead author of the paper is Jack Cook ’22, a recent graduate in computer science. Co-authors include CSAIL graduate student Jules Drean and Jonathan Behrens PhD ’22. The research will be presented at the International Symposium on Computer Architecture.

    A side-channel surprise

    Cook launched the project while taking Yan’s advanced seminar course. For a class assignment, he tried to replicate a machine-learning-assisted side-channel attack from the literature. Past work had concluded that this attack counts how many times the computer accesses memory as it loads a website and then uses machine learning to identify the website. This is known as a website-fingerprinting attack.

    He showed that prior work relied on a flawed machine-learning-based analysis to incorrectly pinpoint the source of the attack. Machine learning can’t prove causality in these types of attacks, Cook says.

    “All I did was remove the memory access and the attack still worked just as well, or even better. So, then I wondered, what actually opens up the side channel?” he says.

    This led to a research project in which Cook and his collaborators embarked on a careful analysis of the attack. They designed an almost identical attack, but without memory accesses, and studied it in detail.

    They found that the attack actually records a computer’s timer values at fixed intervals and uses that information to infer what website is being accessed. Essentially, the attack measures how busy the computer is over time.

    A fluctuation in the timer value means the computer is processing a different amount of information in that interval. This is due to system interrupts. A system interrupt occurs when the computer’s processes are interrupted by requests from hardware devices; the computer must pause what it is doing to handle the new request.

    When a website is loading, it sends instructions to a web browser to run scripts, render graphics, load videos, etc. Each of these can trigger many system interrupts.

    An attacker monitoring the timer can use machine learning to infer high-level information from these system interrupts to determine what website a user is visiting. This is possible because interrupt activity generated by one website, like CNN.com, is very similar each time it loads, but very different from other websites, like Wikipedia.com, Cook explains.
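The measurement primitive behind this is simple: repeatedly check how much work fits between fixed timer ticks. Here is a minimal Python sketch of that sampling loop — a stand-in for the JavaScript the researchers actually used, with arbitrary interval and duration values chosen for illustration:

```python
import time

def sample_activity(duration_s=0.05, interval_s=0.005):
    """Count how many loop iterations fit into each fixed timer
    interval. Intervals in which the OS interrupts this process more
    often complete fewer iterations, so the resulting trace reflects
    overall system busyness -- the kind of signal the attack would
    feed to a machine-learning classifier."""
    trace = []
    end = time.perf_counter() + duration_s
    while time.perf_counter() < end:
        count = 0
        stop = time.perf_counter() + interval_s
        while time.perf_counter() < stop:
            count += 1
        trace.append(count)
    return trace

trace = sample_activity()
print(len(trace), min(trace), max(trace))
```

In the attack scenario, traces like this one, collected while a victim page loads, would serve as the feature vectors for website classification.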

    “One of the really scary things about this attack is that we wrote it in JavaScript, so you don’t have to download or install any code. All you have to do is open a website. Someone could embed this into a website and then theoretically be able to snoop on other activity on your computer,” he says.

    The attack is extremely successful. For instance, when a computer is running Chrome on the macOS operating system, the attack was able to identify websites with 94 percent accuracy. All commercial browsers and operating systems they tested resulted in an attack with more than 91 percent accuracy.

    There are many factors that can affect a computer’s timer, so determining what led to an attack with such high accuracy was akin to finding a needle in a haystack, Cook says. They ran many controlled experiments, removing one variable at a time, until they realized the signal must be coming from system interrupts, which often can’t be processed separately from the attacker’s code.

    Fighting back

    Once the researchers understood the attack, they crafted security strategies to prevent it.

    First, they created a browser extension that generates frequent interrupts, like pinging random websites to create bursts of activity. The added noise makes it much more difficult for the attacker to decode signals. This dropped the attack’s accuracy from 96 percent to 62 percent, but it slowed the computer’s performance.
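The spirit of this first countermeasure can be imitated with any background task that generates bursts of unrelated activity at random times. The following Python sketch is a toy analogue of the browser extension, not its actual code; the burst sizes and period are invented:

```python
import random
import threading
import time

def noise_burst(stop_event, period_s=0.01):
    """Hypothetical analogue of the noise extension: at short random
    intervals, run a burst of throwaway work so the extra activity
    (and the interrupts it triggers) drowns out the victim's signal."""
    while not stop_event.is_set():
        sum(i * i for i in range(random.randint(1_000, 10_000)))
        time.sleep(random.uniform(0, period_s))

# Run the noise source alongside whatever else is executing.
stop = threading.Event()
t = threading.Thread(target=noise_burst, args=(stop,), daemon=True)
t.start()
time.sleep(0.05)
stop.set()
t.join()
```

The trade-off the article notes shows up directly here: every cycle spent on noise is a cycle taken from useful work, which is why this defense slowed the computer down.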

    For their second countermeasure, they modified the timer to return values that are close to, but not exactly, the actual time. This makes it much harder for an attacker to measure the computer’s activity over an interval, Cook explains. This mitigation cut the attack’s accuracy from 96 percent down to just 1 percent.
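This second countermeasure amounts to coarsening the clock. A minimal sketch, assuming timestamps in microseconds and a hypothetical 5 ms granularity (the real mitigation targets the browser’s timer, and the granularity it uses is not stated in the article):

```python
def coarsen(timestamp_us, granularity_us=5_000):
    """Round a timer reading down to a coarse granularity: nearby
    true times collapse to one reported value, so an attacker can no
    longer resolve the fine-grained interrupt timing the attack needs."""
    return timestamp_us - (timestamp_us % granularity_us)

print(coarsen(12_345))   # 12.345 ms is reported as 10 ms
print(coarsen(13_999))   # 13.999 ms is indistinguishable from the above
```

Because many distinct true readings map to the same coarse value, per-interval activity estimates lose exactly the resolution the classifier depends on.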

    “I was surprised by how such a small mitigation like adding randomness to the timer could be so effective. This mitigation strategy could really be put in use today. It doesn’t affect how you use most websites,” he says.

    Building off this work, the researchers plan to develop a systematic analysis framework for machine-learning-assisted side-channel attacks. This could help the researchers get to the root cause of more attacks, Yan says. They also want to see how they can use machine learning to discover other types of vulnerabilities.

    “This paper presents a new interrupt-based side channel attack and demonstrates that it can be effectively used for website fingerprinting attacks, while previously, such attacks were believed to be possible due to cache side channels,” says Yanjing Li, assistant professor in the Department of Computer Science at the University of Chicago, who was not involved with this research. “I liked this paper immediately after I first read it, not only because the new attack is interesting and successfully challenges existing notions, but also because it points out a key limitation of ML-assisted side-channel attacks — blindly relying on machine-learning models without careful analysis cannot provide any understanding on the actual causes/sources of an attack, and can even be misleading. This is very insightful and I believe will inspire many future works in this direction.”

    This research was funded, in part, by the National Science Foundation, the Air Force Office of Scientific Research, and the MIT-IBM Watson AI Lab.

    See the full article here.



    The MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) is a research institute at the Massachusetts Institute of Technology (MIT) formed by the 2003 merger of the Laboratory for Computer Science (LCS) and the Artificial Intelligence Laboratory (AI Lab). Housed within the Ray and Maria Stata Center, CSAIL is the largest on-campus laboratory as measured by research scope and membership. It is part of the Schwarzman College of Computing but is also overseen by the MIT Vice President of Research.

    Research activities

    CSAIL’s research activities are organized around a number of semi-autonomous research groups, each of which is headed by one or more professors or research scientists. These groups are divided up into seven general areas of research:

    Artificial intelligence
    Computational biology
    Graphics and vision
    Language and learning
    Theory of computation
    Robotics
    Systems (includes computer architecture, databases, distributed systems, networks and networked systems, operating systems, programming methodology, and software engineering among others)

    In addition, CSAIL hosts the World Wide Web Consortium (W3C).


    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory (US), as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and Whitehead Institute.

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities (AAU).

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but rather a combination with elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, the Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology (US) catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, Massachusetts Institute of Technology became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and Massachusetts Institute of Technology’s defense research. In this period Massachusetts Institute of Technology’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam, as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis from military research toward environmental and social problems. Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, Massachusetts Institute of Technology launched OpenCourseWare to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of the Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced LIGO was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and funded by the National Science Foundation.

    Caltech/MIT Advanced LIGO

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in Physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique, which served as the essential blueprint for LIGO.

    The mission of Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 1:45 pm on June 1, 2022 Permalink | Reply
    Tags: "Astronomers identify 116000 new variable stars", Artificial Intelligence, , , , , , , Surveys like ASAS-SN are an especially important tool for finding systems that can reveal the complexities of stellar processes., The All-Sky Automated Survey for Supernovae (ASAS-SN)   

    From Ohio State University: “Astronomers identify 116000 new variable stars” 

    From Ohio State University

    5.31.22

    Tatyana Woodall
    Ohio State News
    woodall.52@osu.edu

    New technique locates stellar objects that change brightness.

    Ohio State University astronomers have identified about 116,000 new variable stars, according to a new paper.

    These heavenly bodies were found by The All-Sky Automated Survey for Supernovae (ASAS-SN), a network of 20 telescopes around the world which can observe the entire sky about 50,000 times deeper than the human eye. Researchers from Ohio State have operated the project for nearly a decade.

    Now in a paper published in MNRAS, researchers describe how they used machine learning techniques to identify and classify variable stars — celestial objects whose brightness waxes and wanes over time, especially as observed from our perspective on Earth.

    The changes these stars undergo can reveal important information about their mass, radius, temperature and even their composition. In fact, even our sun is considered a variable star. Surveys like ASAS-SN are an especially important tool for finding systems that can reveal the complexities of stellar processes, said Collin Christy, the lead author of the paper and an ASAS-SN analyst at Ohio State.

    “Variable stars are sort of like a stellar laboratory,” he said. “They’re really neat places in the universe where we can study and learn more about how stars actually work and the little intricacies that they all have.”
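To make the "stellar laboratory" idea concrete, here is a minimal, numpy-only sketch of how a period can be recovered from an irregularly sampled light curve using phase-dispersion minimization. The simulated star, its 2.5-day period, and the binning choices are all invented for illustration; this is not ASAS-SN's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a noisy light curve for a variable star with a 2.5-day period,
# observed at 500 irregular times over 100 days.
true_period = 2.5
t = np.sort(rng.uniform(0, 100, 500))
mag = 12.0 + 0.3 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, 0.02, t.size)

def phase_dispersion(t, mag, period, nbins=10):
    """Scatter of the light curve folded at a trial period (lower = better fit)."""
    phase = (t / period) % 1.0
    bins = np.minimum((phase * nbins).astype(int), nbins - 1)
    return sum(mag[bins == b].var() for b in range(nbins) if np.any(bins == b))

trial_periods = np.linspace(1.0, 5.0, 4001)
dispersions = [phase_dispersion(t, mag, p) for p in trial_periods]
best = trial_periods[int(np.argmin(dispersions))]
print(f"recovered period: {best:.3f} days")   # close to the true 2.5-day period
```

Folding the data at the true period lines the measurements up into a smooth curve, so the within-bin scatter collapses; at a wrong period the folded points scramble and the scatter stays large.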

    But to locate more of these elusive entities, the team first had to bring in previously unused data from the project. For years, ASAS-SN gazed at the sky using V-band filters, optical lenses that can only identify stars whose light falls into the spectrum of colors visible to the naked eye. But in 2018, the project shifted to using g-band filters — lenses that can detect more varieties of blue light — and the network went from being able to observe about 60 million stars at a time to more than 100 million.

    But unlike ASAS-SN’s citizen science campaign, which relies on volunteers to sift through and classify astronomical data, Christy’s study required the help of artificial intelligence.

    “If you want to look at millions of stars, it’s impossible for a few humans to do it by themselves. It’ll take forever,” said Tharindu Jayasinghe, co-author of the paper, a doctoral student in astronomy and an Ohio State presidential fellow. “So we had to bring something creative into the mix, like machine learning techniques.”

    The new study focused on data from Gaia, a mission to chart a three-dimensional map of our galaxy, as well as from 2MASS and AllWISE. Christy’s team used a machine learning algorithm to generate a list of 1.5 million candidate variable stars from a catalog of about 55 million isolated stars.
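As a toy illustration of the candidate-generation step, the sketch below computes one simple variability statistic for synthetic light curves and flags the high-scatter ones. The star counts, noise level, and threshold are made up for the example and bear no relation to the team's actual classifier or the Gaia/2MASS/AllWISE features it used.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic catalog: 1,000 constant stars and 50 sinusoidal variables,
# each observed 200 times with 0.02-mag photometric noise.
n_obs = 200
t = np.linspace(0, 50, n_obs)
constant = 15.0 + rng.normal(0, 0.02, (1000, n_obs))
variable = 15.0 + 0.2 * np.sin(2 * np.pi * t / 3.0) + rng.normal(0, 0.02, (50, n_obs))
catalog = np.vstack([constant, variable])

# One toy "feature": the scatter of each light curve. Real classifiers
# combine many such features (periodicity, color, skewness, ...).
scatter = catalog.std(axis=1)
candidates = np.flatnonzero(scatter > 0.05)   # well above the noise floor
print(f"{candidates.size} candidates out of {catalog.shape[0]} stars")
# → 50 candidates out of 1050 stars
```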

    Afterward, researchers whittled the number of candidates down even further. Of the 1.5 million stars they studied, nearly 400,000 turned out to be real variable stars. More than half were already known to the astronomy community, but 116,027 of them proved to be new discoveries.

    Although the study needed machine learning to complete it, Christy’s team says there is still a role for citizen scientists. In fact, volunteers with the citizen science campaign have already started to identify junk data, he said. “Having people tell us what our bad data looks like is super useful, because initially, the algorithm would look at the bad data and try to make sense of it,” Christy said.

    But using a training set of all that bad data allows the team to modify and improve the overall performance of their algorithm. “This is the first time that we’re actually combining citizen science with machine learning techniques in the field of variable star astronomy,” said Jayasinghe. “We’re expanding the boundaries of what you can do when you put those two together.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Ohio State University is a public research university in Columbus, Ohio. Founded in 1870 as a land-grant university under the Morrill Act of 1862, it was the ninth university established in Ohio and was originally known as the Ohio Agricultural and Mechanical College. The college first focused on various agricultural and mechanical disciplines, but it developed into a comprehensive university under the direction of then-Governor (later, U.S. President) Rutherford B. Hayes, and in 1878 the Ohio General Assembly passed a law changing the name to “The Ohio State University”. The main campus in Columbus, Ohio, has since grown into the third-largest university campus in the United States. The university also operates regional campuses in Lima, Mansfield, Marion, Newark, and Wooster.

    The university has an extensive student life program, with over 1,000 student organizations; intercollegiate, club and recreational sports programs; student media organizations and publications, fraternities and sororities; and three student governments. Ohio State athletic teams compete in Division I of the NCAA and are known as the Ohio State Buckeyes. As of the 2016 Summer Olympics, athletes from Ohio State have won 104 Olympic medals (46 gold, 35 silver, and 23 bronze). The university is a member of the Big Ten Conference for the majority of sports.

     
  • richardmitnick 4:10 pm on May 30, 2022 Permalink | Reply
    Tags: "Frontier supercomputer debuts as world’s fastest-breaking exascale barrier", , Artificial Intelligence, , , , , ,   

    From The DOE’s Oak Ridge National Laboratory: “Frontier supercomputer debuts as world’s fastest, breaking the exascale barrier” 

    From The DOE’s Oak Ridge National Laboratory

    May 30, 2022

    Media Contacts:

    Sara Shoemaker
    shoemakerms@ornl.gov,
    865.576.9219

    Secondary Media Contact
    Katie Bethea
    Oak Ridge Leadership Computing Facility
    betheakl@ornl.gov
    757.817.2832


    Frontier: The World’s First Exascale Supercomputer Has Arrived

    The Frontier supercomputer [below] at the Department of Energy’s Oak Ridge National Laboratory earned the top ranking today as the world’s fastest on the 59th TOP500 list, with 1.1 exaflops of performance. The system is the first to achieve an unprecedented level of computing performance known as exascale, a threshold of a quintillion calculations per second.

    Frontier features a theoretical peak performance of 2 exaflops, or two quintillion calculations per second, making it ten times more powerful than ORNL’s Summit system [below]. The system leverages ORNL’s extensive expertise in accelerated computing and will enable scientists to develop critically needed technologies for the country’s energy, economic and national security, helping researchers address problems of national importance that were impossible to solve just five years ago.

    “Frontier is ushering in a new era of exascale computing to solve the world’s biggest scientific challenges,” ORNL Director Thomas Zacharia said. “This milestone offers just a preview of Frontier’s unmatched capability as a tool for scientific discovery. It is the result of more than a decade of collaboration among the national laboratories, academia and private industry, including DOE’s Exascale Computing Project, which is deploying the applications, software technologies, hardware and integration necessary to ensure impact at the exascale.”

    Rankings were announced at the International Supercomputing Conference 2022 in Hamburg, Germany, which gathers leaders from around the world in the field of high-performance computing, or HPC. Frontier’s speeds surpassed those of any other supercomputer in the world, including ORNL’s Summit, which is also housed at ORNL’s Oak Ridge Leadership Computing Facility, a DOE Office of Science user facility.

    Frontier, an HPE Cray EX supercomputer, also claimed the number one spot on the Green500 list, which rates energy use and efficiency by commercially available supercomputing systems, with 62.68 gigaflops of performance per watt. Frontier rounded out the twice-yearly rankings with the top spot in a newer category, mixed-precision computing, which rates performance in formats commonly used for artificial intelligence, with a performance of 6.88 exaflops.

    The work to deliver, install and test Frontier began during the COVID-19 pandemic, as shutdowns around the world strained international supply chains. More than 100 members of a public-private team worked around the clock, from sourcing millions of components to ensuring deliveries of system parts on deadline to carefully installing and testing 74 HPE Cray EX supercomputer cabinets, which include more than 9,400 AMD-powered nodes and 90 miles of networking cables.

    “When researchers gain access to the fully operational Frontier system later this year, it will mark the culmination of work that began over three years ago involving hundreds of talented people across the Department of Energy and our industry partners at HPE and AMD,” ORNL Associate Lab Director for computing and computational sciences Jeff Nichols said. “Scientists and engineers from around the world will put these extraordinary computing speeds to work to solve some of the most challenging questions of our era, and many will begin their exploration on Day One.”


    Frontier’s overall performance of 1.1 exaflops translates to more than one quintillion floating point operations per second, or flops, as measured by the High-Performance Linpack Benchmark test. Each flop represents a possible calculation, such as addition, subtraction, multiplication or division.
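For a sense of scale, the same flops bookkeeping can be done on an ordinary machine: a dense n x n matrix multiply costs about 2*n^3 floating-point operations, the kind of arithmetic the Linpack benchmark counts at vastly larger scale. This is a rough, illustrative measurement, not the HPL benchmark itself.

```python
import time
import numpy as np

# A dense n x n matrix multiply performs about 2*n^3 floating-point
# operations (n^3 multiplications plus n^3 additions).
n = 500
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
c = a @ b
elapsed = time.perf_counter() - start

flops = 2 * n**3
print(f"~{flops / elapsed / 1e9:.1f} gigaflops on this machine")
```

A laptop typically reaches a few tens of gigaflops this way; Frontier's 1.1 exaflops is roughly ten to a hundred million times faster.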

    Frontier’s early performance on the Linpack benchmark amounts to more than seven times that of Summit at 148.6 petaflops. Summit continues as an impressive, highly ranked workhorse machine for open science, listed at number four on the TOP500.

    Frontier’s mixed-precision computing performance clocked in at roughly 6.88 exaflops, or more than 6.8 quintillion flops per second, as measured by the High-Performance Linpack-Accelerator Introspection, or HPL-AI, test. The HPL-AI test measures calculation speeds in the computing formats typically used by the machine-learning methods that drive advances in artificial intelligence.

    Detailed simulations relied on by traditional HPC users to model such phenomena as cancer cells, supernovas, the coronavirus or the atomic structure of elements require 64-bit precision, a computationally demanding form of computing accuracy. Machine-learning algorithms typically require much less precision — sometimes as little as 32-, 24- or 16-bit accuracy — and can take advantage of special hardware in the graphic processing units, or GPUs, relied on by machines like Frontier to reach even faster speeds.
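The trade-off between 64-bit and 16-bit precision is easy to see directly; this small numpy sketch is illustrative only:

```python
import numpy as np

# 64-bit arithmetic resolves a tiny increment that 16-bit rounds away.
x = np.float64(1.0) + np.float64(1e-10)
y = np.float16(1.0) + np.float16(1e-10)
print(x > 1.0)   # True
print(y > 1.0)   # False

# Machine epsilon: the smallest eps with 1 + eps != 1 in each format.
print(np.finfo(np.float64).eps)   # ~2.2e-16
print(np.finfo(np.float16).eps)   # ~9.8e-4
```

Machine-learning workloads tolerate the coarser 16-bit resolution, which is why GPUs can trade precision for the much higher mixed-precision speeds reported by HPL-AI.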

    ORNL and its partners continue to execute the bring-up of Frontier on schedule. Next steps include continued testing and validation of the system, which remains on track for final acceptance and early science access later in 2022 and open for full science at the beginning of 2023.

    Credit: Laddy Fields/ORNL, U.S. Dept. of Energy.

    FACTS ABOUT FRONTIER

    The Frontier supercomputer’s exascale performance is enabled by some of the world’s most advanced pieces of technology from HPE and AMD:

    Frontier has 74 HPE Cray EX supercomputer cabinets, which are purpose-built to support next-generation supercomputing performance and scale, once open for early science access.

    Each node contains one optimized EPYC™ processor and four AMD Instinct™ accelerators, for a total of more than 9,400 CPUs and more than 37,000 GPUs in the entire system. Thanks to the coherency enabled by the EPYC processors and Instinct accelerators, these nodes are easier for developers to program.

    HPE Slingshot, the world’s only high-performance Ethernet fabric designed for next-generation HPC and AI solutions, including larger, data-intensive workloads, provides the higher speed and congestion control that applications need to run smoothly and boost performance.

    An I/O subsystem from HPE that will come online this year to support Frontier and the OLCF. The I/O subsystem features an in-system storage layer and Orion, a Lustre-based enhanced center-wide file system that is also the world’s largest and fastest single parallel file system, based on the Cray ClusterStor E1000 storage system. The in-system storage layer will employ compute-node local storage devices connected via PCIe Gen4 links to provide peak read speeds of more than 75 terabytes per second, peak write speeds of more than 35 terabytes per second, and more than 15 billion random-read input/output operations per second. The Orion center-wide file system will provide around 700 petabytes of storage capacity and peak write speeds of 5 terabytes per second.

    As a next-generation supercomputing system and the world’s fastest for open science, Frontier is also energy-efficient, due to its liquid-cooled capabilities. This cooling system promotes a quieter data center by removing the need for a noisier, air-cooled system.

    See the full article here.



    Established in 1942, The DOE’s Oak Ridge National Laboratory is the largest science and energy national laboratory in the Department of Energy system (by size) and third largest by annual budget. It is located in the Roane County section of Oak Ridge, Tennessee. Its scientific programs focus on materials, neutron science, energy, high-performance computing, systems biology and national security, sometimes in partnership with the state of Tennessee, universities and other industries.

    ORNL has several of the world’s top supercomputers, including Summit [below], long ranked among the most powerful machines on the TOP500.

    ORNL OLCF IBM Q AC922 SUMMIT supercomputer, once No. 1 on the TOP500.

    The lab is a leading neutron and nuclear power research facility that includes the Spallation Neutron Source and High Flux Isotope Reactor.

    ORNL Spallation Neutron Source annotated.

    It hosts the Center for Nanophase Materials Sciences, the BioEnergy Science Center, and the Consortium for Advanced Simulation of Light Water Nuclear Reactors.

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.

    Areas of research

    ORNL conducts research and development activities that span a wide range of scientific disciplines. Many research areas have a significant overlap with each other; researchers often work in two or more of the fields listed here. The laboratory’s major research areas are described briefly below.

    Chemical sciences – ORNL conducts both fundamental and applied research in a number of areas, including catalysis, surface science and interfacial chemistry; molecular transformations and fuel chemistry; heavy element chemistry and radioactive materials characterization; aqueous solution chemistry and geochemistry; mass spectrometry and laser spectroscopy; separations chemistry; materials chemistry including synthesis and characterization of polymers and other soft materials; chemical biosciences; and neutron science.
    Electron microscopy – ORNL’s electron microscopy program investigates key issues in condensed matter, materials, chemical and nanosciences.
    Nuclear medicine – The laboratory’s nuclear medicine research is focused on the development of improved reactor production and processing methods to provide medical radioisotopes, the development of new radionuclide generator systems, the design and evaluation of new radiopharmaceuticals for applications in nuclear medicine and oncology.
    Physics – Physics research at ORNL is focused primarily on studies of the fundamental properties of matter at the atomic, nuclear, and subnuclear levels and the development of experimental devices in support of these studies.
    Population – ORNL provides federal, state and international organizations with a gridded population database, called LandScan, for estimating ambient population. LandScan is a raster image, or grid, of population counts, which provides human population estimates every 30 x 30 arc seconds, translating roughly to 1-kilometer-square grid cells at the equator, with cell width decreasing at higher latitudes. Though many population datasets exist, LandScan is among the most widely used global spatial population datasets. Updated annually (although data releases are generally one year behind the current year), it offers continuous, current population values based on the most recent information. LandScan data are accessible through GIS applications and a USAID public domain application called Population Explorer.

     
  • richardmitnick 3:25 pm on May 25, 2022 Permalink | Reply
    Tags: "Machine Learning Gets Smarter To Speed Up Drug Discovery", , Artificial Intelligence, , , , , , , , There are millions of molecules from which to select for use in a potential drug candidate.   

    Carnegie Mellon University – College of Engineering: “Machine Learning Gets Smarter To Speed Up Drug Discovery” 

    May 25, 2022
    Lisa Kulick

    Researchers at Carnegie Mellon University developed a self-supervised learning framework that leverages the large amounts of unlabeled data that other models can’t.

    Machine learning gets smarter to speed up drug discovery – College of Engineering at Carnegie Mellon University.

    Predicting molecular properties quickly and accurately is important to advancing scientific discovery and application in areas ranging from materials science to pharmaceuticals. Because experiments and simulations to explore potential options are time-consuming and costly, scientists have investigated using machine learning (ML) methods to aid in computational chemistry research. But most ML models can only make use of known, or labeled, data. This makes it nearly impossible to predict with accuracy the properties of novel compounds.

    In an industry like drug discovery, there are millions of molecules from which to select for use in a potential drug candidate. A prediction error as small as 1% can lead to the misidentification of more than 10,000 molecules. Improving the accuracy of ML models with limited data will play a vital role in developing new treatments for disease.

    While the amount of labeled molecule data is limited, there is a rapidly growing amount of feasible, but unlabeled, data. Researchers at Carnegie Mellon’s College of Engineering pondered using this large volume of unlabeled molecules to build ML models that could perform better on property predictions than other models.

    Their work culminated in the development of a self-supervised learning framework named “MolCLR”, short for Molecular Contrastive Learning of Representations via Graph Neural Networks. The findings were published in the journal Nature Machine Intelligence.

    “MolCLR significantly boosts the performance of ML models by leveraging approximately 10 million unlabeled molecule data,” said Amir Barati Farimani, assistant professor of mechanical engineering.

    For a simple explanation of labeled versus unlabeled data, Ph.D. student Yuyang Wang suggested thinking of two sets of images of dogs and cats. In one set, each animal is labeled with the name of its species. In the other set, no labels accompany the images. To a human, the difference between the two types of animals might be obvious. But to an ML model, the difference isn’t clear, so the unlabeled data is not reliably useful. Apply this analogy to the millions of unlabeled molecules that would take humans decades to identify manually, and the critical need for smarter ML tools becomes obvious.

    The research team sought to teach its MolCLR framework how to use unlabeled data by contrasting positive and negative pairs of augmented molecule graph representations. Graphs transformed from the same molecule are considered a positive pair, while those from different molecules are negative pairs. By this means, representations of similar molecules stay close to each other, while distinct ones are pushed far apart.
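The "pull positives together, push negatives apart" objective can be sketched with a small NT-Xent-style contrastive loss in plain numpy. This is a generic contrastive-learning illustration over invented random embeddings, not MolCLR's actual graph-neural-network model.

```python
import numpy as np

def contrastive_loss(z_a, z_b, temperature=0.1):
    """Row i of z_a and row i of z_b are two views of the same molecule
    (a positive pair); every other row pairing is a negative pair."""
    # L2-normalize so the dot product is cosine similarity.
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)
    sim = z_a @ z_b.T / temperature          # (n, n) similarity matrix
    # Cross-entropy with the diagonal (the positive pair) as the target.
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

rng = np.random.default_rng(0)
n, d = 8, 16
z = rng.normal(size=(n, d))
# Two nearly identical views of each molecule → low loss.
aligned = contrastive_loss(z, z + 0.01 * rng.normal(size=(n, d)))
# Unrelated embeddings paired up → higher loss.
random_pairs = contrastive_loss(z, rng.normal(size=(n, d)))
print(aligned < random_pairs)   # True
```

Minimizing this loss drives representations of the same molecule's two views close together while spreading different molecules apart, which is the behavior the paragraph above describes.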

    The researchers applied three graph augmentations to remove small amounts of information from the molecules: atom masking, bond deletion and subgraph removal. In atom masking, a piece of information about a molecule is hidden. In bond deletion, a chemical bond between atoms is erased. A combination of both augmentations results in subgraph removal. Through these three types of changes, MolCLR was forced to learn intrinsic information and make correlations.
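The three augmentations can be mimicked on a toy graph of atoms and bonds. The atom list, bond list, masking rate, and helper names below are all made up for illustration; the real MolCLR pipeline operates on molecular graphs built with cheminformatics tooling.

```python
import random

# A toy molecular graph: atom labels plus bonds as index pairs
# (a hypothetical stand-in for a real molecular graph).
atoms = ["C", "C", "O", "N", "C", "H"]
bonds = [(0, 1), (1, 2), (1, 3), (3, 4), (4, 5)]

def atom_mask(atoms, rate, rng):
    """Replace a random fraction of atom labels with a MASK token."""
    return [a if rng.random() > rate else "MASK" for a in atoms]

def bond_delete(bonds, rate, rng):
    """Drop a random fraction of bonds."""
    return [b for b in bonds if rng.random() > rate]

def subgraph_remove(atoms, bonds, rate, rng):
    """Combine both: mask atoms, then delete bonds touching masked atoms."""
    masked = atom_mask(atoms, rate, rng)
    removed = {i for i, a in enumerate(masked) if a == "MASK"}
    kept = [(i, j) for i, j in bonds if i not in removed and j not in removed]
    return masked, kept

rng = random.Random(42)
# Two corrupted views of the same molecule form a positive pair.
view_a = subgraph_remove(atoms, bonds, 0.25, rng)
view_b = subgraph_remove(atoms, bonds, 0.25, rng)
print(view_a)
print(view_b)
```

Because each view hides different pieces of the same molecule, a model matching the two views must learn information intrinsic to the molecule rather than any single atom or bond.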

    When the team applied MolCLR to ClinTox, a database used to predict drug toxicity, MolCLR significantly outperformed other ML baseline models. On another database, Tox21, MolCLR stood out from the other ML models with the potential to distinguish which environmental chemicals posed the most severe threats to human health.

    “We have demonstrated that MolCLR bears promise for efficient molecule design,” said Barati Farimani. “It can be applied to a wide variety of applications, including drug discovery, energy storage and environmental protection.”

    See the full article here.


    The College of Engineering is well-known for working on problems of both scientific and practical importance. Our acclaimed faculty focus on transformative results that will drive the intellectual and economic vitality of our community, nation and world. Our “maker” culture is ingrained in all that we do, leading to novel approaches and unprecedented results.

    Carnegie Mellon University is a global research university with more than 12,000 students, 95,000 alumni, and 5,000 faculty and staff.
    CMU has been a birthplace of innovation since its founding in 1900.
    Today, we are a global leader bringing groundbreaking ideas to market and creating successful startup businesses.
    Our award-winning faculty members are renowned for working closely with students to solve major scientific, technological and societal challenges. We put a strong emphasis on creating things—from art to robots. Our students are recruited by some of the world’s most innovative companies.
    We have campuses in Pittsburgh, Qatar and Silicon Valley, and degree-granting programs around the world, including Africa, Asia, Australia, Europe and Latin America.

    The university was established by Andrew Carnegie in 1900 as the Carnegie Technical Schools; it became the Carnegie Institute of Technology in 1912 and began granting four-year degrees. In 1967, the Carnegie Institute of Technology merged with the Mellon Institute of Industrial Research, formerly a part of the University of Pittsburgh. Since then, the university has operated as a single institution.

    The university has seven colleges and independent schools: the College of Engineering, College of Fine Arts, Dietrich College of Humanities and Social Sciences, Mellon College of Science, Tepper School of Business, Heinz College of Information Systems and Public Policy, and the School of Computer Science. Its main campus is located 3 miles (5 km) from Downtown Pittsburgh, and it also has over a dozen degree-granting locations on six continents, including campuses in Qatar and Silicon Valley.

    Past and present faculty and alumni include 20 Nobel Prize laureates, 13 Turing Award winners, 23 Members of the American Academy of Arts and Sciences, 22 Fellows of the American Association for the Advancement of Science, 79 Members of the National Academies, 124 Emmy Award winners, 47 Tony Award laureates, and 10 Academy Award winners. Carnegie Mellon enrolls 14,799 students from 117 countries and employs 1,400 faculty members.
    Research

    Carnegie Mellon University is classified among “R1: Doctoral Universities – Very High Research Activity”. For the 2006 fiscal year, the university spent $315 million on research. The primary recipients of this funding were the School of Computer Science ($100.3 million), the Software Engineering Institute ($71.7 million), the College of Engineering ($48.5 million), and the Mellon College of Science ($47.7 million). The research money comes largely from federal sources, with a federal investment of $277.6 million. The federal agencies that invest the most money are the National Science Foundation and the Department of Defense, which contribute 26% and 23.4% of the total university research budget respectively.

    The recognition of Carnegie Mellon as one of the best research facilities in the nation has a long history. As early as the 1987 federal budget, Carnegie Mellon ranked third in Department of Defense research funding with $41.5 million, behind only the Massachusetts Institute of Technology and Johns Hopkins University.

    The Pittsburgh Supercomputing Center (PSC) is a joint effort between Carnegie Mellon, University of Pittsburgh, and Westinghouse Electric Company. Pittsburgh Supercomputing Center was founded in 1986 by its two scientific directors, Dr. Ralph Roskies of the University of Pittsburgh and Dr. Michael Levine of Carnegie Mellon. Pittsburgh Supercomputing Center is a leading partner in the TeraGrid, the National Science Foundation’s cyberinfrastructure program.
    Scarab lunar rover is being developed by the RI.

    The Robotics Institute (RI) is a division of the School of Computer Science and is considered one of the leading centers of robotics research in the world. The Field Robotics Center (FRC) has developed a number of significant robots, including Sandstorm and H1ghlander, which finished second and third in the DARPA Grand Challenge, and Boss, which won the DARPA Urban Challenge. The Robotics Institute partnered with a spinoff company, Astrobotic Technology Inc., to land a CMU robot on the Moon by 2016 in pursuit of the Google Lunar XPrize. The robot, known as Andy, is designed to explore lunar pits, which might include entrances to caves. The RI is primarily sited at Carnegie Mellon’s main campus in Newell-Simon Hall.

    The Software Engineering Institute (SEI) is a federally funded research and development center sponsored by the U.S. Department of Defense and operated by Carnegie Mellon, with offices in Pittsburgh, Pennsylvania, USA; Arlington, Virginia, and Frankfurt, Germany. The SEI publishes books on software engineering for industry, government and military applications and practices. The organization is known for its Capability Maturity Model (CMM) and Capability Maturity Model Integration (CMMI), which identify essential elements of effective system and software engineering processes and can be used to rate the level of an organization’s capability for producing quality systems. The SEI is also the home of CERT/CC, the federally funded computer security organization. The CERT Program’s primary goals are to ensure that appropriate technology and systems management practices are used to resist attacks on networked systems and to limit damage and ensure continuity of critical services subsequent to attacks, accidents, or failures.

    The Human–Computer Interaction Institute (HCII) is a division of the School of Computer Science and is considered one of the leading centers of human–computer interaction research, integrating computer science, design, social science, and learning science. Such interdisciplinary collaboration is the hallmark of research done throughout the university.

    The Language Technologies Institute (LTI) is another unit of the School of Computer Science and is one of the leading research centers in the area of language technologies. The institute’s primary research focus is on machine translation, speech recognition, speech synthesis, information retrieval, parsing and information extraction. Established in 1986 as the Center for Machine Translation, it began awarding graduate degrees in 1996 and was renamed the Language Technologies Institute.

    Carnegie Mellon is also home to the Carnegie School of management and economics. This intellectual school grew out of the Tepper School of Business in the 1950s and 1960s and focused on the intersection of behavioralism and management. Several management theories, most notably bounded rationality and the behavioral theory of the firm, were established by Carnegie School management scientists and economists.

    Carnegie Mellon also develops cross-disciplinary and university-wide institutes and initiatives to take advantage of strengths in various colleges and departments and develop solutions in critical social and technical problems. To date, these have included the Cylab Security and Privacy Institute, the Wilton E. Scott Institute for Energy Innovation, the Neuroscience Institute (formerly known as BrainHub), the Simon Initiative, and the Disruptive Healthcare Technology Institute.

    Carnegie Mellon has made a concerted effort to attract corporate research labs, offices, and partnerships to the Pittsburgh campus. Apple Inc., Intel, Google, Microsoft, Disney, Facebook, IBM, General Motors, Bombardier Inc., Yahoo!, Uber, Tata Consultancy Services, Ansys, Boeing, Robert Bosch GmbH, and the Rand Corporation have established a presence on or near campus. In collaboration with Intel, Carnegie Mellon has pioneered research into claytronics.

     
  • richardmitnick 9:42 am on May 18, 2022 Permalink | Reply
    Tags: "Living better with algorithms" Graduate student Sarah Cen, Artificial Intelligence, MIT Laboratory for Information and Decision Systems,   

    From The Massachusetts Institute of Technology: “Living better with algorithms” Graduate student Sarah Cen 

    From The Massachusetts Institute of Technology

    May 18, 2022
    Grace Chua | MIT Laboratory for Information and Decision Systems

    Sarah Cen explores algorithm design, in areas including social media algorithms, the fairness of matching markets, and the impact of policy interventions in complex scenarios. Photo courtesy of LIDS.

    Laboratory for Information and Decision Systems (LIDS) student Sarah Cen remembers the lecture that sent her down the track to an upstream question.

    At a talk on ethical artificial intelligence, the speaker brought up a variation on the famous trolley problem, which outlines a philosophical choice between two undesirable outcomes.

    The speaker’s scenario: Say a self-driving car is traveling down a narrow alley with an elderly woman walking on one side and a small child on the other, and no way to thread between both without a fatality. Who should the car hit?

    Then the speaker said: Let’s take a step back. Is this the question we should even be asking?

    That’s when things clicked for Cen. Instead of considering the point of impact, a self-driving car could have avoided choosing between two bad outcomes by making a decision earlier on — the speaker pointed out that, when entering the alley, the car could have determined that the space was narrow and slowed to a speed that would keep everyone safe.
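    The upstream decision the speaker describes can be made concrete with simple kinematics: on entering the alley, choose a speed at which the car can fully stop within the distance it can guarantee is clear. A minimal sketch, with illustrative braking and reaction-time numbers that are assumptions, not real autonomous-vehicle parameters:

    ```python
    import math

    def safe_entry_speed(clear_distance_m, max_braking_ms2=6.0,
                         reaction_time_s=0.5):
        """Maximum speed (m/s) at which the car can still stop within
        clear_distance_m, allowing for a perception/reaction delay.

        Solves  d = v * t_react + v**2 / (2 * a)  for v
        (reaction-time travel plus braking distance).
        """
        a, t = max_braking_ms2, reaction_time_s
        # Positive root of v**2 + 2*a*t*v - 2*a*d = 0
        return -a * t + math.sqrt((a * t) ** 2 + 2 * a * clear_distance_m)

    # A narrow alley with only 8 m of guaranteed clear space ahead:
    v = safe_entry_speed(8.0)   # about 7.2 m/s, roughly 26 km/h
    ```

    Entering the alley at or below this speed dissolves the dilemma before it can arise, which is exactly the speaker's point.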

    Recognizing that today’s AI safety approaches often resemble the trolley problem, focused on downstream regulation such as liability after someone is left with no good choices, Cen wondered: What if we could design better upstream and downstream safeguards against such problems? This question has informed much of Cen’s work.

    “Engineering systems are not divorced from the social systems on which they intervene,” Cen says. Ignoring this fact risks creating tools that fail to be useful when deployed or, more worryingly, that are harmful.

    Cen arrived at LIDS in 2018 via a slightly roundabout route. She first got a taste for research during her undergraduate degree at Princeton University, where she majored in mechanical engineering. For her master’s degree, she changed course, working on radar solutions in mobile robotics (primarily for self-driving cars) at The University of Oxford (UK). There, she developed an interest in AI algorithms, curious about when and why they misbehave. So, she came to MIT and LIDS for her doctoral research, working with Professor Devavrat Shah in the Department of Electrical Engineering and Computer Science, for a stronger theoretical grounding in information systems.

    Auditing social media algorithms

    Together with Shah and other collaborators, Cen has worked on a wide range of projects during her time at LIDS, many of which tie directly to her interest in the interactions between humans and computational systems. In one such project, Cen studies options for regulating social media. Her recent work provides a method for translating human-readable regulations into implementable audits.

    To get a sense of what this means, suppose that regulators require that any public health content — for example, on vaccines — not be vastly different for politically left- and right-leaning users. How should auditors check that a social media platform complies with this regulation? Can a platform be made to comply with the regulation without damaging its bottom line? And how does compliance affect the actual content that users do see?

    Designing an auditing procedure is difficult in large part because there are so many stakeholders when it comes to social media. Auditors have to inspect the algorithm without accessing sensitive user data. They also have to work around trade secrets, which are legally protected and can prevent them from getting a close look at the very algorithm they are auditing. Other considerations come into play as well, such as balancing the removal of misinformation with the protection of free speech.

    To meet these challenges, Cen and Shah developed an auditing procedure that does not need more than black-box access to the social media algorithm (which respects trade secrets), does not remove content (which avoids issues of censorship), and does not require access to users (which preserves users’ privacy).

    In their design process, the team also analyzed the properties of their auditing procedure, finding that it ensures a desirable property they call decision robustness. As good news for the platform, they show that a platform can pass the audit without sacrificing profits. Interestingly, they also found the audit naturally incentivizes the platform to show users diverse content, which is known to help reduce the spread of misinformation, counteract echo chambers, and more.
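    To give a flavor of what black-box access means here, the following is a hypothetical sketch, not Cen and Shah's actual procedure, and every name in it is invented: the auditor queries the recommender with matched left- and right-leaning test profiles and compares how often a given health topic appears in the top-ranked results.

    ```python
    def exposure(recommender, profiles, topic, k=10):
        """Average fraction of the top-k recommendation slots carrying
        `topic` across a set of test profiles. The recommender is a
        black box: the auditor only observes its ranked output."""
        share = 0.0
        for profile in profiles:
            ranked = recommender(profile)[:k]
            share += sum(1 for item in ranked if item["topic"] == topic) / k
        return share / len(profiles)

    def audit(recommender, left_profiles, right_profiles, topic, tol=0.05):
        """Pass iff topic exposure differs between the groups by < tol."""
        gap = abs(exposure(recommender, left_profiles, topic)
                  - exposure(recommender, right_profiles, topic))
        return gap < tol, gap

    # Toy recommender that shows vaccine content in 3 of 10 slots for
    # everyone, regardless of leaning; it should pass the audit.
    def toy_recommender(profile):
        return [{"topic": "vaccines"}] * 3 + [{"topic": "other"}] * 7

    passed, gap = audit(toy_recommender, ["L1", "L2"], ["R1", "R2"], "vaccines")
    ```

    Note that the audit never inspects the recommender's internals (respecting trade secrets), never removes content, and never touches real user data; only the ranked outputs for synthetic profiles are observed.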

    Who gets good outcomes and who gets bad ones?

    In another line of research, Cen looks at whether people can receive good long-term outcomes when they not only compete for resources, but also don’t know upfront what resources are best for them.

    Some platforms, such as job-search platforms or ride-sharing apps, are part of what is called a matching market, which uses an algorithm to match one set of individuals (such as workers or riders) with another (such as employers or drivers). In many cases, individuals have matching preferences that they learn through trial and error. In labor markets, for example, workers learn their preferences about what kinds of jobs they want, and employers learn their preferences about the qualifications they seek from workers.
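    The classical algorithm behind such two-sided matching markets is Gale-Shapley deferred acceptance, sketched below with hypothetical worker and employer names. It assumes preferences are already known; the setting Cen studies adds the harder twist that participants must learn those preferences while competing.

    ```python
    def deferred_acceptance(worker_prefs, employer_prefs):
        """Gale-Shapley deferred acceptance with workers proposing.
        Returns a stable matching {worker: employer}. Assumes complete
        preference lists and equal numbers on each side."""
        # rank[e][w] = how employer e ranks worker w (lower is better)
        rank = {e: {w: i for i, w in enumerate(prefs)}
                for e, prefs in employer_prefs.items()}
        free = list(worker_prefs)              # workers not yet matched
        next_choice = {w: 0 for w in worker_prefs}
        engaged = {}                           # employer -> worker
        while free:
            w = free.pop()
            e = worker_prefs[w][next_choice[w]]  # best employer not yet tried
            next_choice[w] += 1
            if e not in engaged:
                engaged[e] = w
            elif rank[e][w] < rank[e][engaged[e]]:
                free.append(engaged[e])        # employer trades up
                engaged[e] = w
            else:
                free.append(w)                 # rejected; tries next employer
        return {w: e for e, w in engaged.items()}

    workers = {"ann": ["acme", "binc"], "bob": ["acme", "binc"]}
    employers = {"acme": ["bob", "ann"], "binc": ["ann", "bob"]}
    match = deferred_acceptance(workers, employers)
    ```

    The resulting matching is stable in the sense used above: no worker-employer pair would both prefer each other to their assigned partners.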

    But learning can be disrupted by competition. If workers with a particular background are repeatedly denied jobs in tech because of high competition for tech jobs, for instance, they may never get the knowledge they need to make an informed decision about whether they want to work in tech. Similarly, tech employers may never see and learn what these workers could do if they were hired.

    Cen’s work examines this interaction between learning and competition, studying whether it is possible for individuals on both sides of the matching market to walk away happy.

    Modeling such matching markets, Cen and Shah found that it is indeed possible to get to a stable outcome (workers aren’t incentivized to leave the matching market), with low regret (workers are happy with their long-term outcomes), fairness (happiness is evenly distributed), and high social welfare.

    Interestingly, it’s not obvious that it’s possible to get stability, low regret, fairness, and high social welfare simultaneously. So another important aspect of the research was uncovering when it is possible to achieve all four criteria at once and exploring the implications of those conditions.

    What is the effect of X on Y?

    For the next few years, though, Cen plans to work on a new project, studying how to quantify the effect of an action X on an outcome Y when it’s expensive — or impossible — to measure this effect, focusing in particular on systems that have complex social behaviors.

    For instance, when Covid-19 cases surged in the pandemic, many cities had to decide what restrictions to adopt, such as mask mandates, business closures, or stay-home orders. They had to act fast and balance public health with community and business needs, public spending, and a host of other considerations.

    Typically, in order to estimate the effect of restrictions on the rate of infection, one might compare the rates of infection in areas that underwent different interventions. If one county has a mask mandate while its neighboring county does not, one might think comparing the counties’ infection rates would reveal the effectiveness of mask mandates.

    But of course, no county exists in a vacuum. If, for instance, people from both counties gather to watch a football game in the maskless county every week, people from both counties mix. These complex interactions matter, and Cen plans to study questions of cause and effect in such settings.
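    A tiny deterministic simulation, with entirely made-up rates, illustrates how such mixing (interference, in causal-inference terms) can bias the naive county comparison:

    ```python
    def final_infections(mixing, weeks=20):
        """Deterministic toy epidemic model of two counties. County A's
        mask mandate halves its transmission rate; a fraction `mixing` of
        each county's contacts happen in the other county. All numbers
        are made up purely for illustration."""
        beta = {"A": 0.15, "B": 0.30}   # weekly transmission; A has the mandate
        s = {"A": 0.99, "B": 0.99}      # susceptible fraction
        i = {"A": 0.01, "B": 0.01}      # infected fraction
        for _ in range(weeks):
            contact = {
                "A": (1 - mixing) * i["A"] + mixing * i["B"],
                "B": (1 - mixing) * i["B"] + mixing * i["A"],
            }
            for c in "AB":
                new = beta[c] * s[c] * contact[c]
                s[c] -= new
                i[c] += new
        return i

    isolated = final_infections(mixing=0.0)
    mixed = final_infections(mixing=0.4)
    # The naive estimate of the mandate's effect is the B-minus-A gap;
    # cross-county mixing shrinks it, understating the true effect.
    ```

    With no mixing, the gap between the counties reflects the mandate's true effect; once infections spill across the border, the naive comparison understates it.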

    “We’re interested in how decisions or interventions affect an outcome of interest, such as how criminal justice reform affects incarceration rates or how an ad campaign might change the public’s behaviors,” Cen says.

    Cen has also applied the principles of promoting inclusivity to her work in the MIT community.

    As one of three co-presidents of the Graduate Women in MIT EECS student group, she helped organize the inaugural GW6 research summit featuring the research of women graduate students — not only to showcase positive role models to students, but also to highlight the many successful graduate women at MIT who are not to be underestimated.

    Whether in computing or in the community, a system taking steps to address bias is one that enjoys legitimacy and trust, Cen says. “Accountability, legitimacy, trust — these principles play crucial roles in society and, ultimately, will determine which systems endure with time.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and the Whitehead Institute.

    Massachusetts Institute of Technology-Haystack Observatory, Westford, Massachusetts, USA. Altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with the Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. The Massachusetts Institute of Technology is a member of the Association of American Universities (AAU).

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but an institution combining elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after the Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded under the Morrill Land-Grant Colleges Act, which funded institutions “to promote the liberal and practical education of the industrial classes”, making it a land-grant school. In 1863, under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    The Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at the Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, the Massachusetts Institute of Technology had become the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4,000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after the war. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and the Massachusetts Institute of Technology’s defense research. In this period, MIT’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam, as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six MIT students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched OpenCourseWare to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced aLIGO was designed and constructed by a team of scientists from California Institute of Technology, Massachusetts Institute of Technology, and industrial contractors, and funded by the National Science Foundation.


    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in Physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique that served as the essential blueprint for LIGO.

    The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 3:23 pm on March 24, 2022 Permalink | Reply
    Tags: "What Can We Learn About the Universe from Just One Galaxy?", Artificial Intelligence, CAMELS: Cosmology and Astrophysics with MachinE Learning Simulations, Omega matter: a cosmological parameter that describes how much dark matter is in the universe

    From The New Yorker: “What Can We Learn About the Universe from Just One Galaxy?” 



    From The New Yorker

    March 23, 2022
    Rivka Galchen

    Illustration by Nicholas Konrad / The New Yorker

    In new research, begun by an undergraduate, William Blake’s phrase “to see a world in a grain of sand” is suddenly relevant to astrophysics.

    Imagine if you could look at a snowflake at the South Pole and determine the size and the climate of all of Antarctica. Or study a randomly selected tree in the Amazon rain forest and, from that one tree—be it rare or common, narrow or wide, young or old—deduce characteristics of the forest as a whole. Or, what if, by looking at one galaxy among the hundred billion or so in the observable universe, one could say something substantial about the universe as a whole? A recent paper, whose lead authors include a cosmologist, a galaxy-formation expert, and an undergraduate named Jupiter (who did the initial work), suggests that this may be the case. The result at first seemed “crazy” to the paper’s authors. Now, having discussed their work with other astrophysicists and done various “sanity checks,” trying to find errors in their methods, the results are beginning to seem pretty clear. Francisco Villaescusa-Navarro, one of the lead authors of the work, said, “It does look like galaxies somehow retain a memory of the entire universe.”

    The research began as a sort of homework exercise. Jupiter Ding, while a freshman at Princeton University, wrote to the department of astrophysics, hoping to get involved in research. He mentioned that he had some experience with machine learning, a form of artificial intelligence that is adept at picking out patterns in very large data sets. Villaescusa-Navarro, an astrophysicist focused on cosmology, had an idea for what the student might work on. Villaescusa-Navarro had long wanted to look into whether machine learning could be used to help find relationships between galaxies and the universe. “I was thinking, What if you could look at only a thousand galaxies and from that learn properties about the entire universe? I wondered, What is the smallest number we could look at? What if you looked at only one hundred? I thought, O.K., we’ll start with one galaxy.”

    He had no expectation that one galaxy would provide much. But he thought that it would be a good way for Ding to practice using machine learning on a database known as CAMELS (Cosmology and Astrophysics with MachinE Learning Simulations). Shy Genel, an astrophysicist focussed on galaxy formation, who is another lead author on the paper, explained CAMELS this way: “We start with a description of reality shortly after the Big Bang. At that point, the universe is mostly hydrogen gas, and some helium and dark matter. And then, using what we know of the laws of physics, our best guess, we then run the cosmic history for roughly fourteen billion years.” Cosmological simulations have been around for about forty years, but they are increasingly sophisticated—and fast. CAMELS contains some four thousand simulated universes. Working with simulated universes, as opposed to our own, lets researchers ask questions that the gaps in our observational data preclude us from answering. They also let researchers play with different parameters, like the proportions of dark matter and hydrogen gas, to test their impact.

    Ding did the work on CAMELS from his dorm room, on his laptop. He wrote programs to work with the CAMELS data, then sent them to one of the university’s computing clusters, a collection of computers with far more power than his MacBook Air. That computing cluster contained the CAMELS data. Ding’s model trained itself by taking a set of simulated universes and looking at the galaxies within them. Once trained, the model would then be shown a sample galaxy and asked to predict features of the universe from which it was sampled.
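    That train-then-predict setup can be sketched in a few lines of Python. This is only a toy illustration, not the paper's code: the simulated "universes", the three galaxy features, and the linear model standing in for the neural network are all invented for the example.

    ```python
    # Toy sketch of the workflow described above (NOT the paper's code):
    # galaxies from simulated universes are used to train a model that,
    # shown ONE galaxy, predicts a parameter of its parent universe.
    import numpy as np

    rng = np.random.default_rng(0)

    def make_universe():
        """Hypothetical stand-in for a CAMELS simulation: a universe has
        one cosmological parameter (omega_m) and galaxies whose made-up
        features loosely track it, plus noise."""
        omega_m = rng.uniform(0.1, 0.5)
        n_galaxies = rng.integers(50, 100)
        features = omega_m * np.array([2.0, -1.0, 0.5]) \
            + rng.normal(0.0, 0.05, (n_galaxies, 3))
        return features, omega_m

    # Train: fit per-galaxy features -> omega_m (least squares as a
    # simple stand-in for the paper's neural network).
    X, y = [], []
    for _ in range(200):
        feats, om = make_universe()
        X.append(feats)
        y.append(np.full(len(feats), om))
    X = np.vstack(X)
    y = np.concatenate(y)
    coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)

    # Predict: show the model ONE galaxy from a held-out universe and
    # ask for that universe's omega_m, as the article describes.
    feats, om_true = make_universe()
    one_galaxy = feats[0]
    om_pred = np.r_[one_galaxy, 1.0] @ coef
    print(f"true omega_m={om_true:.3f}  predicted={om_pred:.3f}")
    ```

    In this toy setup the single-galaxy prediction lands close to the true value because the features were built to encode it; the surprising result of the actual research is that real simulated galaxies appear to carry similar information.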

    Ding is very humble about his contribution to the research, but he knows far more about astrophysics than even an exceptional first-year student typically does. Ding, a middle child with two sisters, grew up in State College, Pennsylvania. In high school, he took a series of college-level astronomy courses at Penn State and worked on a couple of research projects that involved machine learning. “My dad was really interested in astronomy as a high schooler,” Ding told me. “He went another direction, though.” His father is a professor of marketing at Penn State’s business school.

    Artificial intelligence is an umbrella concept for various disciplines, including machine learning. A famous early machine-learning task was to get a computer to recognize an image of a cat. This is something that a human can do easily, but, for a computer, there are no simple parameters that define the visual concept of a cat. Machine learning is now used for detecting patterns or relationships that are nearly impossible for humans to see, in part because the data is often in many dimensions. The programmer remains the captain, telling the computer what to learn, and deciding what input it’s trained on. But the computer adapts, iteratively, as it learns, and in that way becomes the author of its own algorithms. It was machine learning, for example, that discovered, through analyzing language patterns, the alleged main authors of the posts by “Q” (the supposed high-ranking government official who sparked the QAnon conspiracy theory). It was also able to identify which of Q’s posts appeared to be written by Paul Furber, a South African software developer, and by Ron Watkins, the son of the former owner of 8chan. Machine-learning programs have also been applied in health care, using data to predict which patients are most at risk of falling. Compared with the intuition of doctors, the machine-learning-based assessments reduced falls by about forty per cent, an enormous margin of improvement for a medical intervention.

    Machine learning has catapulted astrophysics research forward, too. Villaescusa-Navarro said, “As a community, we have been dealing with super-hard problems for many, many years. Problems that the smartest people in the field have been working on for decades. And from one day to the next, these problems are getting solved with machine learning.” Even generating a single simulated universe used to take a very long time. You gave a computer some initial conditions and then had to wait while it worked out what those conditions would produce some fourteen billion years down the line. It took less than fourteen billion years, of course, but there was no way to build up a large database of simulated universes in a timely way. Machine-learning advances have sped up these simulations, making a project like CAMELS possible. An even more ambitious project, Learning the Universe, will use machine learning to create simulated universes millions of times faster than CAMELS can; it will then use what’s called simulation-based inference—along with real observational data from telescopes—to determine which starting parameters lead to a universe that most closely resembles our own.

    Ding told me that one of the reasons he chose astronomy has been the proximity he feels to breakthroughs in the field, even as an undergraduate. “For example, I’m in a cosmology class right now, and when my professor talks about dark matter, she talks about it as something ‘a good friend of mine, Vera Rubin, put on the map,’ ” he said. “And dark energy was discovered by a team at Harvard University about twenty years ago, and I did a summer program there. So here I am, learning about this stuff pretty much in the places where these things were happening.” Ding’s research produced something profoundly unexpected. His model used a single galaxy in a simulated universe to pretty accurately say something about that universe. The specific characteristic it was able to predict is called Omega matter, which relates to the density of a universe. Its value was accurately predicted to within ten per cent.

    Ding was initially unsure how meaningful his results were and was curious to hear Villaescusa-Navarro’s perspective. Villaescusa-Navarro was more than skeptical. “My first thought was, This is completely crazy, I don’t believe it, this is the work of an undergraduate, there must be a mistake,” he said. “I asked him to run the program in a few other ways to see if he would still come up with similar results.” The results held.

    Villaescusa-Navarro began to do his own calculations. His doubt focussed foremost on the way that the machine learning itself worked. “One thing about neural networks is that they are amazing at finding correlations, but they also can pick up on numerical artifacts,” he said. Was a parameter wrong? Was there a bug in the code? Villaescusa-Navarro wrote his own program, to ask the same sort of question that he had assigned to Ding: What could information about one galaxy say about the universe in which it resided? Even when asked by a different program, written from scratch, the answer was still coming out the same. This suggested that the result was catching something real.

    “But we couldn’t just publish that,” Villaescusa-Navarro said. “We needed to try and understand why this might be working.” It was working for small galaxies, and for large galaxies, and for galaxies with very different features; only for a small handful of eccentric galaxies did the work not hold. Why?

    The recipe for making a universe is to start with a lot of hydrogen, a little helium, some dark matter, and some dark energy. Dark matter has mass, like the matter we’re familiar with, but it doesn’t reflect or emit light, so we can’t see it. We also can’t see dark energy, but we can think of it as working in the opposite direction of gravity. The universe’s matter, via gravity, pushes it to contract; the universe’s dark energy pushes it to expand.

    Omega matter is a cosmological parameter that describes how much dark matter is in the universe. Along with other parameters, it controls how much the universe is expanding. The higher its value, the slower the universe would grow. One of the research group’s hypotheses to explain their results is, roughly, that the amount of dark matter in a universe has a very strong effect on a galaxy’s properties—a stronger effect than other characteristics. For this reason, even one galaxy could have something to say about the Omega matter of its parent universe, since Omega matter is correlated to what can be pictured as the density of matter that makes a galaxy clump together.
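    For readers who want the textbook statement of that relationship: in the standard flat cosmological model, Omega matter enters the Friedmann equation governing the expansion rate. This is generic cosmology, not a formula from the paper:

    ```latex
    % Friedmann equation, flat universe with matter and dark energy only.
    % a(t) is the cosmic scale factor; H_0 is the Hubble constant.
    \left(\frac{\dot{a}}{a}\right)^{2}
      = H_{0}^{2}\left(\Omega_{m}\,a^{-3} + \Omega_{\Lambda}\right),
    \qquad \Omega_{m} + \Omega_{\Lambda} = 1 .
    ```

    The $\Omega_{m}\,a^{-3}$ term dilutes as the universe grows, so a larger Omega matter means gravity resists the expansion more strongly, consistent with the article's remark that a higher value slows the universe's growth.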

    In December, Genel, an expert on galaxy formation, presented the preliminary results of the paper to the galaxy-formation group he belongs to at The Flatiron Institute Center for Computational Astrophysics. “This was really one of the most fun things that happened to me,” he said. He told me that any galaxy-formation expert could have no other first reaction than to think, This is impossible. A galaxy is, on the scale of a universe, about as substantial as a grain of sand is, relative to the size of the Earth. To think that all by itself it can say something so substantial is, to the majority of the astrophysics community, extremely surprising, in a way analogous to the discovery that each of our cells—from a fingernail cell to a liver cell—contains coding describing our entire body. (Though maybe to the poetic way of thinking—to see the world in a grain of sand—the surprise is that this is surprising.)

    Rachel Somerville, an astrophysicist who was at the talk, recalled the initial reaction as “skepticism, but respectful skepticism, since we knew these were serious researchers.” She remembers being surprised that the approach had even been tried, since it seemed so tremendously unlikely that it would work. Since that time, the researchers have shared their coding and results with experts in the field; the results are taken to be credible and compelling, though the hesitations that the authors themselves have about the results remain.

    The results are not “robust”—for now, the computer can make valid predictions only on the type of universe that it has been trained on. Even within CAMELS, there are two varieties of simulations, and, if the machine is trained on one variety, it cannot be used to make predictions for galaxies in the other variety. That also means that the results cannot be used to make predictions about the universe we live in—at least not yet.

    Villaescusa-Navarro told me, “It is a very beautiful result—I know I shouldn’t say that about my own work.” But what is beauty to an astrophysicist? “It’s about an unexpected connection between two things that seemed not to be related. In this case, cosmology and galaxy formation. It’s about something hidden being revealed.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 9:27 pm on March 17, 2022 Permalink | Reply
    Tags: "‘Self-Driving’ Lab Speeds Up Research and Synthesis of Energy Materials", Artificial Intelligence, The AI algorithm selects and runs its own experiments., Using AI and automated robotic systems to perform multi-step chemical synthesis and analysis.

    From The North Carolina State University: “‘Self-Driving’ Lab Speeds Up Research and Synthesis of Energy Materials” 


    From The North Carolina State University

    March 16, 2022
    Milad Abolhasani
    abolhasani@ncsu.edu

    Matt Shipman
    matt_shipman@ncsu.edu


    Researchers from North Carolina State University and The University at Buffalo have developed and demonstrated a ‘self-driving lab’ that uses artificial intelligence (AI) and fluidic systems to advance our understanding of metal halide perovskite (MHP) nanocrystals. This self-driving lab can also be used to investigate a broad array of other semiconductor and metallic nanomaterials.

    “We’ve created a self-driving laboratory that can be used to advance both fundamental nanoscience and applied engineering,” says Milad Abolhasani, corresponding author of a paper on the work and an associate professor of chemical and biomolecular engineering at NC State.

    For their proof-of-concept demonstrations, the researchers focused on all-inorganic metal halide perovskite (MHP) nanocrystals, cesium lead halide (CsPbX3, X=Cl, Br). MHP nanocrystals are an emerging class of semiconductor materials that, because of their solution-processability and unique size- and composition-tunable properties, are thought to have potential for use in printed photonic devices and energy technologies. For example, MHP nanocrystals are very efficient optically active materials and are under consideration for use in next-generation LEDs. And because they can be made using solution processing, they have the potential to be made in a cost-effective way.

    Solution-processed materials are materials that are made using liquid chemical precursors, including high-value materials such as quantum dots, metal/metal oxide nanoparticles and metal organic frameworks.

    However, MHP nanocrystals are not in industrial use yet.

    “In part, that’s because we’re still developing a better understanding of how to synthesize these nanocrystals in order to engineer all of the properties associated with MHPs,” Abolhasani says. “And, in part, because synthesizing them requires a degree of precision that has prevented large-scale manufacturing from being cost-effective. Our work here addresses both of those issues.”

    The new technology expands on the concept of Artificial Chemist 2.0, which Abolhasani’s lab unveiled in 2020. Artificial Chemist 2.0 is completely autonomous, and uses AI and automated robotic systems to perform multi-step chemical synthesis and analysis. In practice, that system focused on tuning the bandgap of MHP quantum dots, allowing users to go from requesting a custom quantum dot to completing the relevant R&D and beginning manufacturing in less than an hour.

    “Our new self-driving lab technology can autonomously dope MHP nanocrystals, adding manganese atoms into the crystalline lattice of the nanocrystals on demand,” Abolhasani says.

    Doping the material with varying levels of manganese changes the optical and electronic properties of the nanocrystals and introduces magnetic properties to the material. For example, doping the MHP nanocrystals with manganese can change the wavelength of light emitted from the material.

    “This capability gives us even greater control over the properties of the MHP nanocrystals,” Abolhasani says. “In essence, the universe of potential colors that can be produced by MHP nanocrystals is now larger. And it’s not just color. It offers a much greater range of electronic and magnetic properties.”

    The new self-driving lab technology also offers a much faster and more efficient means of understanding how to engineer MHP nanocrystals in order to obtain the desired combination of properties. Video of how the new technology works can be found at https://www.youtube.com/watch?v=2BflpW6R4HI.

    “Let’s say you want to get an in-depth understanding of how manganese-doping and bandgap tuning will affect a specific class of MHP nanocrystals, such as CsPbX3,” Abolhasani says. “There are approximately 160 billion possible experiments that you could run, if you wanted to control for every possible variable in each experiment. Using conventional techniques, it would still generally take hundreds or thousands of experiments to learn how those two processes – manganese-doping and bandgap tuning – would affect the properties of the cesium lead halide nanocrystals.”

    But the new system does all of this autonomously. Specifically, its AI algorithm selects and runs its own experiments. The results from each completed experiment inform which experiment it will run next – and it keeps going until it understands which mechanisms control the MHP’s various properties.
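    The pick-run-refit loop described above can be sketched as follows. This is a toy illustration, not the NC State system's actual algorithm: `run_experiment` is a made-up stand-in for a robotic synthesis-and-measurement step, and the selection rule is a simple farthest-point heuristic rather than the real AI.

    ```python
    # Toy closed-loop experiment selection: each result determines which
    # condition is tried next, as in the self-driving lab described above.
    import numpy as np

    rng = np.random.default_rng(1)

    def run_experiment(dopant_level):
        """Hypothetical stand-in for one robotic synthesis + measurement."""
        return np.sin(3.0 * dopant_level) + 0.01 * rng.normal()

    candidates = np.linspace(0.0, 1.0, 200)   # candidate dopant levels
    tested_x, tested_y = [], []

    # Seed with two random experiments, then let the loop choose the rest.
    for x in rng.choice(candidates, 2, replace=False):
        tested_x.append(x)
        tested_y.append(run_experiment(x))

    for _ in range(20):
        # Uncertainty proxy: distance to the nearest already-tested point.
        dist = np.min(np.abs(candidates[:, None] - np.array(tested_x)), axis=1)
        x_next = candidates[np.argmax(dist)]    # most "uninformed" condition
        tested_x.append(x_next)
        tested_y.append(run_experiment(x_next)) # result informs the next pick

    print(f"ran {len(tested_x)} experiments covering the design space")
    ```

    A real system would replace the distance heuristic with a learned model (e.g., Bayesian optimization over synthesis parameters) and stop once the model's predictions converge, which is how dozens of experiments can substitute for an exhaustive sweep.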

    “We found, in a practical demonstration, that the system was able to get a thorough understanding of how these processes alter the properties of cesium lead halide nanocrystals in only 60 experiments,” Abolhasani says. “In other words, we can get the information we need to engineer a material in hours instead of months.”

    While the work demonstrated in the paper focuses on MHP nanocrystals, the autonomous system could also be used to characterize other nanomaterials that are made using solution processes, including a wide variety of metallic and semiconductor nanomaterials.

    “We’re excited about how this technology will broaden our understanding of how to control the properties of these materials, but it’s worth noting that this system can also be used for continuous manufacturing,” Abolhasani says. “So you can use the system to identify the best possible process for creating your desired nanocrystals, and then set the system to start producing material nonstop – and with incredible specificity.

    “We’ve created a powerful technology. And we’re now looking for partners to help us apply this technology to specific challenges in the industrial sector.”

    The science paper is published open access in the journal Advanced Intelligent Systems. The paper was co-authored by Fazel Bateni, a Ph.D. student at NC State; Robert Epps and Jeffery Bennett, postdoctoral researchers at NC State; Kameel Antami, a former Ph.D. student at NC State; Rokas Dargis, an undergraduate at NC State; and Kristofer Reyes, an assistant professor at the University at Buffalo.

    The work was done with support from the National Science Foundation, under grant number 1940959, and from the UNC Research Opportunities Initiative.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    The North Carolina State University was founded with a purpose: to create economic, societal and intellectual prosperity for the people of North Carolina and the country. We began as a land-grant institution teaching the agricultural and mechanical arts. Today, we’re a pre-eminent research enterprise that excels in science, technology, engineering, math, design, the humanities and social sciences, textiles and veterinary medicine.

    North Carolina State University students, faculty and staff take problems in hand and work with industry, government and nonprofit partners to solve them. Our 34,000-plus high-performing students apply what they learn in the real world by conducting research, working in internships and co-ops, and performing acts of world-changing service. That experiential education ensures they leave here ready to lead the workforce, confident in the knowledge that NC State consistently rates as one of the best values in higher education.

    North Carolina State University is a public land-grant research university in Raleigh, North Carolina. Founded in 1887 and part of the University of North Carolina system, it is the largest university in the Carolinas. The university forms one of the corners of the Research Triangle together with Duke University in Durham and the University of North Carolina at Chapel Hill. It is classified among “R1: Doctoral Universities – Very high research activity”.

    The North Carolina General Assembly established the North Carolina College of Agriculture and Mechanic Arts, now North Carolina State University, on March 7, 1887, originally as a land-grant college. The college underwent several name changes and officially became North Carolina State University at Raleigh in 1965, and by longstanding convention, the “at Raleigh” portion was omitted. Today, North Carolina State University has an enrollment of more than 35,000 students, making it among the largest in the country. North Carolina State University has historical strengths in engineering, statistics, agriculture, life sciences, textiles, and design and offers bachelor’s degrees in 106 fields of study. The graduate school offers master’s degrees in 104 fields, doctoral degrees in 61 fields, and a Doctor of Veterinary Medicine.

    North Carolina State University athletic teams are known as the Wolfpack. The name was adopted in 1922 when a disgruntled fan described the behavior of the student body at athletic events as being “like a wolf pack.” They compete in NCAA Division I and have won eight national championships: two NCAA championships, two AIAW championships, and four titles under other sanctioning bodies.

    The North Carolina General Assembly founded North Carolina State University on March 7, 1887 as a land-grant college under the name “North Carolina College of Agriculture and Mechanic Arts,” or “North Carolina A&M” for short. In the segregated system, it was open only to white students. As a land-grant college, North Carolina A&M would provide a liberal and practical education while focusing on military tactics, agriculture, and the mechanical arts without excluding classical studies. Since its founding, the university has maintained these objectives while building on them. After opening in 1889, North Carolina A&M saw its enrollment fluctuate and its mandate expand. In 1917, it changed its name to “North Carolina State College of Agriculture and Engineering”—or “North Carolina State” for short. During the Great Depression, the North Carolina state government, under Governor O. Max Gardner, administratively combined the University of North Carolina, the Woman’s College (now the University of North Carolina at Greensboro), and North Carolina State University. This conglomeration became the University of North Carolina in 1931. In 1937, Blake R. Van Leer joined as dean and started the graduate program in engineering. Following World War II, the university grew and developed. The G.I. Bill enabled thousands of veterans to attend college, and enrollment shot past the 5,000 mark in 1947.

    State College created new academic programs, including the School of Architecture and Landscape Design in 1947 (renamed as the School of Design in 1948), the School of Education in 1948, and the School of Forestry in 1950. In the summer of 1956, following the US Supreme Court ruling in Brown v. Board of Education (1954) that segregated public education was unconstitutional, North Carolina State College enrolled its first African-American undergraduates, Ed Carson, Manuel Crockett, Irwin Holmes, and Walter Holmes.

    In 1962, State College officials desired to change the institution’s name to North Carolina State University. Consolidated university administrators approved a change to the University of North Carolina at Raleigh, frustrating many students and alumni who protested the change with letter writing campaigns. In 1963, State College officially became North Carolina State of the University of North Carolina. Students, faculty, and alumni continued to express dissatisfaction with this name, however, and after two additional years of protest, the name was changed to the current North Carolina State University at Raleigh. However, by longstanding convention, the “at Raleigh” portion is omitted, and the shorter names “North Carolina State University” and “NC State University” are accepted on first reference in news stories. Indeed, school officials discourage using “at Raleigh” except when absolutely necessary, as the full name implies that there is another branch of the university elsewhere in the state.

    In 1966, single-year enrollment reached 10,000. In the 1970s enrollment surpassed 19,000 and the School of Humanities and Social Sciences was added.

    Celebrating its centennial in 1987, North Carolina State University reorganized its internal structure, renaming all its schools to colleges (e.g. School of Engineering to the College of Engineering). Also in this year, it gained 700 acres (2.8 km^2) of land that was developed as Centennial Campus. Since then, North Carolina State University has focused on developing its new Centennial Campus. It has invested more than $620 million in facilities and infrastructure at the new campus, with 62 acres (0.3 km^2) of space being constructed. Sixty-one private and government agency partners are located on Centennial Campus.

    North Carolina State University has almost 8,000 employees, nearly 35,000 students, a $1.495 billion annual budget, and a $1.4 billion endowment. It is the largest university in the state and one of the anchors of North Carolina’s Research Triangle, together with Duke University and the University of North Carolina at Chapel Hill.

    In 2009, North Carolina State University canceled a planned appearance by the Dalai Lama to speak on its Raleigh campus, citing concerns about a Chinese backlash and a shortage of time and resources.

    North Carolina State University Libraries Special Collections Research Center, located in D.H. Hill Library, maintains a website devoted to NC State history entitled Historical State.

    North Carolina State University is one of 17 institutions that constitute the University of North Carolina system. Each campus has a high degree of independence, but each submits to the policies of the UNC system Board of Governors. The 32 voting members of the Board of Governors are elected by the North Carolina General Assembly for four-year terms. President Thomas W. Ross heads the system.

    The Board of Trustees of North Carolina State University has thirteen members and sets all policies for the university. The UNC system Board of Governors elects eight of the trustees and the Governor of North Carolina appoints four. The student body president serves on the Board of Trustees as a voting member. The UNC system also elects the Chancellor of North Carolina State University.

    The Board of Trustees administers North Carolina State University’s eleven academic colleges. Each college grants its own degrees with the exception of the First Year College which provides incoming freshmen the opportunity to experience several disciplines before selecting a major. The College of Agriculture and Life Sciences is the only college to offer associate’s degrees and the College of Veterinary Medicine does not grant undergraduate degrees. Each college is composed of numerous departments that focus on a particular discipline or degree program, for example Food Science, Civil Engineering, Genetics or Accounting. There are a total of 66 departments administered by all eleven NC State colleges.

    In total, North Carolina State University offers nine associate’s degrees in agriculture, bachelor’s degrees in 102 areas of study, master’s degrees in 108 areas and doctorate degrees in 60 areas. North Carolina State University is known for its programs in agriculture, engineering, textiles, and design. The textile and paper engineering programs are notable, given the uniqueness of the subject area.

    As of the 2018-2019 school year, North Carolina State University has the following colleges and academic departments:

    College of Agriculture and Life Sciences
    College of Design
    College of Education
    College of Engineering
    College of Humanities and Social Sciences
    College of Natural Resources
    Poole College of Management
    College of Sciences
    Wilson College of Textiles
    College of Veterinary Medicine
    The Graduate School
    University College

    In 2014–2015 North Carolina State University became one of only fifty-four institutions in the U.S. to have earned the “Innovation and Economic Prosperity University” designation from the Association of Public and Land-grant Universities.

    For 2020, U.S. News & World Report ranks North Carolina State University tied for 84th out of all national universities and tied for 34th out of public universities in the U.S., tied at 31st for “most innovative” and 69th for “best value” schools.

    North Carolina State University’s College of Engineering was tied for 24th by U.S. News & World Report, with many of its programs ranking in the top 30 nationally. North Carolina State University’s Nuclear Engineering program is considered to be one of the best in the world and in 2020 was ranked 3rd in the country (behind The Massachusetts Institute of Technology and the University of Michigan-Ann Arbor). The biological and agricultural engineering programs are also widely recognized and were ranked 4th nationally. In 2019 North Carolina State University’s manufacturing and industrial engineering program was ranked 13th in the nation, and materials science 15th. Other notable programs included civil engineering at 20th, environmental engineering tied at 21st, chemical engineering tied for 22nd, computer engineering at 28th, and biomedical engineering ranked 28th nationally in 2019. In 2019, the Academic Ranking of World Universities ranked NC State’s electrical engineering program 9th internationally and chemical engineering 20th. In 2020, The Princeton Review ranked NC State 36th for game design.

    North Carolina State University is also home to the only college dedicated to textiles in the country, the Wilson College of Textiles, which is a partner of the National Council of Textile Organizations and is widely regarded as one of the best textiles programs in the world. In 2020 the textile engineering program was ranked 1st nationally by College Factual. In 2017, Business of Fashion Magazine ranked the college’s fashion and apparel design program 8th in the country and 30th in the world. In 2018, Fashion Schools ranked the college’s fashion and textile management program 11th in the nation.

    North Carolina State University’s Master’s program in Data Analytics was the first in the United States. Launched in 2007, it is part of the Institute for Advanced Analytics and was created as a university-wide multidisciplinary initiative to meet the rapidly growing demand in the labor market for analytics professionals. In 2012, Thomas H. Davenport and D.J. Patil highlighted the MSA program in Harvard Business Review as one of only a few sources of talent with proven strengths in data science.

    North Carolina State University is known for its College of Veterinary Medicine and in 2020 it was ranked 4th nationally, by U.S. News & World Report, 25th internationally by NTU Ranking and 36th internationally by the Academic Ranking of World Universities.

    In 2020, North Carolina State University’s College of Design was ranked 25th by College Factual. In 2018, the Animation Career Review ranked North Carolina State University’s Graphic Design program 4th in the country and best among public universities.

    In 2020, the College of Education tied for 45th in the U.S. and the Poole College of Management tied for 52nd among business schools. North Carolina State University’s Entrepreneurship program was ranked 10th internationally among undergraduate programs by The Princeton Review in 2020. In 2010, the Wall Street Journal surveyed recruiters and ranked NC State 19th among its top 25 recruiter picks. In 2018, U.S. News & World Report ranked the Department of Statistics 16th (tied) in the nation.

    In fiscal year 2019, North Carolina State University received 95 awards and $29,381,782 in National Institutes of Health (NIH) Funds for Research. For fiscal year 2017, NC State was ranked 45th in total research expenditure by the National Science Foundation.

    Kiplinger’s Personal Finance placed North Carolina State University 9th in its 2018 ranking of best value public colleges in the United States.

     
  • richardmitnick 6:13 pm on March 11, 2022 Permalink | Reply
    Tags: "Computational modeling guides development of new materials", , Artificial Intelligence, , , Made of metal atoms linked by organic molecules, , MOF's consist of metal atoms joined by organic molecules called linkers to create a rigid cage-like structure., MOF’s can be configured in hundreds of thousands of different ways., , , The materials have many pores which makes them useful for catalyzing reactions involving gases but can also make them less structurally stable., The MIT team is now using the model to try to identify MOFs that could be used to catalyze the conversion of methane gas to methanol which could be used as fuel., This work will allow researchers to test the promise of specific materials before they go through the trouble of synthesizing them.   

    From The Massachusetts Institute of Technology (US): “Computational modeling guides development of new materials” 

    MIT News

    From The Massachusetts Institute of Technology (US)

    March 11, 2022

    MIT computational chemists developed a model that can analyze the features of a metal-organic framework structure and predict if it will be stable enough to be useful. Image: Courtesy of the researchers.

    Metal-organic frameworks, a class of materials with porous molecular structures, have a variety of possible applications, such as capturing harmful gases and catalyzing chemical reactions. Made of metal atoms linked by organic molecules, they can be configured in hundreds of thousands of different ways.

    To help researchers sift through all of the possible metal-organic framework (MOF) structures and help identify the ones that would be most practical for a particular application, a team of MIT computational chemists has developed a model that can analyze the features of a MOF structure and predict if it will be stable enough to be useful.

    The researchers hope that these computational predictions will help cut the development time of new MOFs.

    “This will allow researchers to test the promise of specific materials before they go through the trouble of synthesizing them,” says Heather Kulik, an associate professor of chemical engineering at MIT.

    The MIT team is now working to develop MOFs that could be used to capture methane gas and convert it to useful compounds such as fuels.

    The researchers described their new model in two papers, one in the Journal of the American Chemical Society and one in Scientific Data. Graduate students Aditya Nandy and Gianmarco Terrones are the lead authors of the Scientific Data paper, and Nandy is also the lead author of the JACS paper. Kulik is the senior author of both papers.

    Modeling structure

    MOFs consist of metal atoms joined by organic molecules called linkers to create a rigid cage-like structure. The materials have many pores, which makes them useful for catalyzing reactions involving gases but can also make them less structurally stable.

    “The limitation in seeing MOFs realized at industrial scale is that although we can control their properties by controlling where each atom is in the structure, they’re not necessarily that stable, as far as materials go,” Kulik says. “They’re very porous and they can degrade under realistic conditions that we need for catalysis.”

    Scientists have been working on designing MOFs for more than 20 years, and thousands of possible structures have been published. A centralized repository contains about 10,000 of these structures but is not linked to any of the published findings on the properties of those structures.

    Kulik, who specializes in using computational modeling to discover structure-property relationships of materials, wanted to take a more systematic approach to analyzing and classifying the properties of MOFs.

    “When people make these now, it’s mostly trial and error. The MOF dataset is really promising because there are so many people excited about MOFs, so there’s so much to learn from what everyone’s been working on, but at the same time, it’s very noisy and it’s not systematic the way it’s reported,” she says.

    Kulik and her colleagues set out to analyze published reports of MOF structures and properties using a natural-language-processing algorithm. Using this algorithm, they scoured nearly 4,000 published papers, extracting information on the temperature at which a given MOF would break down. They also pulled out data on whether particular MOFs can withstand the conditions needed to remove solvents used to synthesize them and make sure they become porous.
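The extraction step can be pictured as a small text-mining pass over the literature. The sentences and the regular expression below are invented stand-ins for illustration; the article does not detail the group's actual natural-language-processing pipeline:

```python
import re

# Invented example sentences standing in for text from published MOF papers.
sentences = [
    "MOF-5 was found to be thermally stable up to 400 C under nitrogen.",
    "TGA showed that the framework decomposes at 350 C.",
    "HKUST-1 retained crystallinity after solvent removal.",
]

# Match phrases such as "stable up to 400 C" or "decomposes at 350 C".
pattern = re.compile(r"(?:stable up to|decomposes at)\s+(\d+)\s*C", re.IGNORECASE)

# Collect every stability/decomposition temperature mentioned.
temperatures = [int(m.group(1)) for s in sentences if (m := pattern.search(s))]
print(temperatures)  # [400, 350]
```

A production pipeline would need far more robust parsing (units, ranges, negations), but the output, a labeled breakdown temperature per paper, is the kind of training signal the stability models require.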

    Once the researchers had this information, they used it to train two neural networks to predict MOFs’ thermal stability and stability during solvent removal, based on the molecules’ structure.
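As a rough illustration of this training step, here is a toy stability classifier fit on structural features. The three features and the labels are fabricated for the example; the actual MIT models are neural networks trained on real extracted data:

```python
import numpy as np

# Fabricated features per MOF: (chemical groups on the linker,
# pore diameter in angstroms, metal coordination number).
X = np.array([
    [2.0, 8.0, 6.0],   # simple linker -> labeled stable
    [9.0, 25.0, 4.0],  # complex linker -> labeled unstable
    [3.0, 10.0, 6.0],
    [8.0, 30.0, 4.0],
])
y = np.array([1.0, 0.0, 1.0, 0.0])  # 1 = survived solvent removal

# Standardize features, then fit logistic regression by gradient descent.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
w, b = np.zeros(3), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(Xs @ w + b)))  # predicted stability probability
    w -= 0.5 * (Xs.T @ (p - y)) / len(y)
    b -= 0.5 * (p - y).mean()

preds = (1.0 / (1.0 + np.exp(-(Xs @ w + b))) > 0.5).astype(int)
print(preds.tolist())  # recovers the training labels on this separable toy set
```

The design choice mirrors the article's setup: structure goes in as a feature vector, and a probability of surviving a given condition comes out.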

    “Before you start working with a material and thinking about scaling it up for different applications, you want to know will it hold up, or is it going to degrade in the conditions I would want to use it in?” Kulik says. “Our goal was to get better at predicting what makes a stable MOF.”

    Better stability

    Using the model, the researchers were able to identify certain features that influence stability. In general, simpler linkers with fewer chemical groups attached to them are more stable. Pore size is also important: Before the researchers did their analysis, it had been thought that MOFs with larger pores might be too unstable. However, the MIT team found that large-pore MOFs can be stable if other aspects of their structure counteract the large pore size.

    “Since MOFs have so many things that can vary at the same time, such as the metal, the linkers, the connectivity, and the pore size, it is difficult to nail down what governs stability across different families of MOFs,” Nandy says. “Our models enable researchers to make predictions on existing or new materials, many of which have yet to be made.”

    The researchers have made their data and models available online. Scientists interested in using the models can get recommendations for strategies to make an existing MOF more stable, and they can also add their own data and feedback on the predictions of the models.

    The MIT team is now using the model to try to identify MOFs that could be used to catalyze the conversion of methane gas to methanol, which could be used as fuel. Kulik also plans to use the model to create a new dataset of hypothetical MOFs that haven’t been built before but are predicted to have high stability. Researchers could then screen this dataset for a variety of properties.
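Once a stability model exists, screening a hypothetical dataset reduces to filtering candidates on predicted stability before any expensive property evaluation. Everything below is a hypothetical sketch, not the group's actual code; the stand-in scoring function encodes only the paper's qualitative finding that simpler linkers with fewer chemical groups tend to be more stable:

```python
# Stand-in for a trained stability model: fewer linker chemical groups
# maps to a higher predicted stability score (illustrative only).
def predict_stability(candidate):
    return 1.0 / (1.0 + candidate["linker_groups"] / 4.0)

# Hypothetical not-yet-synthesized candidates.
candidates = [
    {"name": "hypo-MOF-A", "linker_groups": 1},
    {"name": "hypo-MOF-B", "linker_groups": 8},
    {"name": "hypo-MOF-C", "linker_groups": 3},
]

# Keep only candidates whose predicted stability clears a chosen threshold.
stable = [c["name"] for c in candidates if predict_stability(c) > 0.5]
print(stable)  # ['hypo-MOF-A', 'hypo-MOF-C']
```

Only the survivors of this filter would move on to costlier evaluation, such as simulating catalytic activity for methane-to-methanol conversion.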

    “People are interested in MOFs for things like quantum sensing and quantum computing, all sorts of different applications where you need metals distributed in this atomically precise way,” Kulik says.

    The research was funded by DARPA, the U.S. Office of Naval Research, the U.S. Department of Energy, a National Science Foundation Graduate Research Fellowship, a Career Award at the Scientific Interface from the Burroughs Wellcome Fund, and an AAAS Marion Milligan Mason Award.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    The Massachusetts Institute of Technology (US) is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory (US), the MIT Bates Research and Engineering Center (US), and the Haystack Observatory (US), as well as affiliated laboratories such as the Broad Institute of MIT and Harvard (US) and Whitehead Institute (US).

    Massachusetts Institute of Technology-Haystack Observatory (US), Westford, Massachusetts, USA; altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology (US) adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology (US). The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology (US) is a member of the Association of American Universities (AAU).

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia (US), wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but a combination with elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after Massachusetts Institute of Technology (US) was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as a land-grant school under the Morrill Land-Grant Colleges Act, which funded institutions “to promote the liberal and practical education of the industrial classes”. In 1863, under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst (US). In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    Massachusetts Institute of Technology (US) was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology (US) faculty and alumni rebuffed Harvard University (US) president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, the Massachusetts Institute of Technology (US) administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology (US) catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities (US) in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at Massachusetts Institute of Technology (US) that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    Massachusetts Institute of Technology (US)‘s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at Massachusetts Institute of Technology (US)’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, Massachusetts Institute of Technology (US) became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected Massachusetts Institute of Technology (US) profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of Massachusetts Institute of Technology (US) between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, Massachusetts Institute of Technology (US) no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and Massachusetts Institute of Technology (US)’s defense research. In this period Massachusetts Institute of Technology (US)’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. Massachusetts Institute of Technology (US) ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT (US) Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology (US) students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at Massachusetts Institute of Technology (US) over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, Massachusetts Institute of Technology (US)’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    Massachusetts Institute of Technology (US) has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology (US) classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    Massachusetts Institute of Technology (US) was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, Massachusetts Institute of Technology (US) launched OpenCourseWare to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, Massachusetts Institute of Technology (US) announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology (US) faculty adopted an open-access policy to make its scholarship publicly accessible online.

    Massachusetts Institute of Technology (US) has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology (US) community with thousands of police officers from the New England region and Canada. On November 25, 2013, Massachusetts Institute of Technology (US) announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of the Massachusetts Institute of Technology (US) community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced aLIGO (US) was designed and constructed by a team of scientists from California Institute of Technology (US), Massachusetts Institute of Technology (US), and industrial contractors, and funded by the National Science Foundation (US).

    Caltech/MIT Advanced aLIGO

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology (US) physicist Rainer Weiss won the Nobel Prize in Physics in 2017. Weiss, who is also a Massachusetts Institute of Technology (US) graduate, designed the laser interferometric technique, which served as the essential blueprint for LIGO.

    The mission of Massachusetts Institute of Technology (US) is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     