Tagged: Machine learning

  • richardmitnick 9:31 am on January 16, 2023
    Tags: "LAIs": long-acting injectables, "University of Toronto scientists use AI to fast-track drug formulation development", , , , , Machine learning, Machine-learning algorithms can be used to predict experimental drug release from long-acting injectables (LAI) and can also help guide the design of new LAIs., , Reducing ‘trial and error’ for new drug development, , Theoretical and Quantum Chemistry   

    From The University of Toronto (CA): “University of Toronto scientists use AI to fast-track drug formulation development” 

    From The University of Toronto (CA)

    1.11.23
    Kate Richards | Leslie Dan Faculty of Pharmacy

    Researchers Christine Allen and Alán Aspuru-Guzik used machine learning to predict experimental drug release from long-acting injectables (photo by Steve Southon)

    In a bid to reduce the time and cost associated with developing promising new medicines, University of Toronto scientists have successfully tested the use of artificial intelligence to guide the design of long-acting injectable drug formulations.

    The study, published this week in Nature Communications [below], was led by Professor Christine Allen in the Leslie Dan Faculty of Pharmacy and Alán Aspuru-Guzik in the departments of chemistry and computer science in the Faculty of Arts & Science.

    Fig. 1: Schematic demonstrating traditional and data-driven formulation development approaches for long-acting injectables (LAIs).
    [a] Selected routes of administration for FDA-approved LAI formulations. [b] Typical trial-and-error loop commonly employed during the development of LAIs termed “traditional LAI formulation development”. [c] Workflow employed in this study to train and analyze machine learning (ML) models to accelerate the design of new LAI systems, termed “Data-driven LAI formulation development”.

    Their multidisciplinary research shows that machine-learning algorithms can be used to predict experimental drug release from long-acting injectables (LAI) and can also help guide the design of new LAIs.

    “This study takes a critical step towards data-driven drug formulation development with an emphasis on long-acting injectables,” said Allen, who is a member of U of T’s Acceleration Consortium, a global initiative that uses artificial intelligence and automation to accelerate the discovery of materials and molecules needed for a sustainable future.

    “We’ve seen how machine learning has enabled incredible leap-step advances in the discovery of new molecules that have the potential to become medicines. We are now working to apply the same techniques to help us design better drug formulations and, ultimately, better medicines.”

    Considered one of the most promising therapeutic strategies for the treatment of chronic diseases, long-acting injectables are a class of advanced drug delivery systems that are designed to release their cargo over extended periods of time to achieve a prolonged therapeutic effect. This approach can help patients better adhere to their medication regimen, reduce side effects and increase efficacy when injected close to the site of action in the body.

    However, achieving the optimal amount of drug release over the desired period of time requires the development of a wide array of formulation candidates through extensive and time-consuming experiments. This trial-and-error approach has created a significant bottleneck in LAI development compared to more conventional types of drug formulation.

    “AI is transforming the way we do science. It helps accelerate discovery and optimization. This is a perfect example of a ‘before AI’ and an ‘after AI’ moment and shows how drug delivery can be impacted by this multidisciplinary research,” said Aspuru-Guzik, who is director of the Acceleration Consortium and holds the CIFAR Artificial Intelligence Research Chair at the Vector Institute in Toronto and the Canada 150 Research Chair in Theoretical and Quantum Chemistry.

    From left: Zeqing Bao, PhD trainee in pharmaceutical sciences, and Riley Hickman, PhD trainee in chemistry, are co-authors on the study published in Nature Communications (photo by Steve Southon)

    Reducing ‘trial and error’ for new drug development

    To investigate whether machine-learning tools could accurately predict the rate of drug release, the research team trained and evaluated a series of 11 different models, including multiple linear regression (MLR), random forest (RF), light gradient boosting machine (lightGBM) and neural networks (NN). The data set used to train the selected panel of machine learning models was constructed from previously published studies by the authors and other research groups.

    “Once we had the data set, we split it into two subsets: one used for training the models and one for testing,” said Pauric Bannigan, research associate with the Allen research group at the Leslie Dan Faculty of Pharmacy. “We then asked the models to predict the results of the test set and directly compared the predictions with previous experimental data. We found that the tree-based models, and specifically lightGBM, delivered the most accurate predictions.”
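    To make that workflow concrete, here is a minimal sketch of this kind of model screening in Python. It is not the authors' actual pipeline: the file name ("lai_release.csv"), the feature columns and the target column ("fraction_released") are hypothetical stand-ins, and it assumes the scikit-learn and lightgbm packages are installed.

```python
# Minimal sketch of screening several regression models on a held-out test set,
# NOT the published pipeline. The CSV, feature columns and target column are
# hypothetical placeholders for formulation descriptors and measured release.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error
from lightgbm import LGBMRegressor

df = pd.read_csv("lai_release.csv")              # hypothetical data file
X = df.drop(columns=["fraction_released"])       # hypothetical target column
y = df["fraction_released"]

# Hold out part of the data so predictions are compared against
# experiments the models never saw during training.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

models = {
    "MLR": LinearRegression(),
    "RF": RandomForestRegressor(random_state=0),
    "lightGBM": LGBMRegressor(random_state=0),
    "NN": MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    mae = mean_absolute_error(y_test, model.predict(X_test))
    print(f"{name}: test mean absolute error = {mae:.3f}")
```

    Swapping in real formulation descriptors and cumulative-release measurements would reproduce the kind of head-to-head comparison Bannigan describes, with the tree-based models evaluated alongside the linear and neural-network baselines.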

    As a next step, the team worked to apply these predictions and illustrate how machine learning models might be used to inform the design of new LAIs by using advanced analytical techniques to extract design criteria from the lightGBM model. This allowed the design of a new LAI formulation for a drug currently used to treat ovarian cancer.
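    One common way to pull such design criteria out of a trained tree-based model is to rank the input features by how heavily the model relies on them. Continuing the hypothetical sketch above, and not necessarily matching the study's own, more advanced analysis, this could look like:

```python
# Continues the previous sketch: rank the formulation parameters the trained
# lightGBM model leans on most. Feature importances are only one analytical
# tool; the study's own interpretation methods may differ.
import pandas as pd

lgbm = models["lightGBM"]                                # fitted LGBMRegressor from the snippet above
importance = pd.Series(lgbm.feature_importances_, index=X.columns)
print(importance.sort_values(ascending=False).head(10))  # most influential formulation parameters
```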

    Expectations around the speed with which new drug formulations are developed have risen drastically since the onset of the COVID-19 pandemic.

    “We’ve seen in the pandemic that there was a need to design a new formulation in weeks, to catch up with evolving variants. Allowing for new formulations to be developed in a short period of time, relative to what has been done in the past using conventional methods, is crucially important so that patients can benefit from new therapies,” Allen said, explaining that the research team is also investigating using machine learning to support the development of novel mRNA and lipid nanoparticle formulations.

    More robust databases needed for future advances

    The results of the current study signal the potential for machine learning to reduce reliance on trial-and-error testing. However, Allen and the research team note that the lack of available open-source data sets in pharmaceutical sciences represents a significant challenge to future progress.

    “When we began this project, we were surprised by the lack of data reported across numerous studies using polymeric microparticles,” Allen said. “This meant the studies and the work that went into them couldn’t be leveraged to develop the machine learning models we need to propel advances in this space. There is a real need to create robust databases in pharmaceutical sciences that are open access and available for all so that we can work together to advance the field.”

    To that end, Allen and the research team have published their datasets and code on the open-source platform Zenodo.

    “For this study our goal was to lower the barrier of entry to applying machine learning in pharmaceutical sciences,” Bannigan said. “We’ve made our data sets fully available so others can hopefully build on this work. We want this to be the start of something and not the end of the story for machine learning in drug formulation.”

    The study was supported by the Natural Sciences and Engineering Research Council of Canada, the Defense Advanced Research Projects Agency and the Vector Institute.

    Science paper:
    Nature Communications

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Toronto (CA) is a public research university in Toronto, Ontario, Canada, located on the grounds that surround Queen’s Park. It was founded by royal charter in 1827 as King’s College, the oldest university in the province of Ontario.

    Originally controlled by the Church of England, the university assumed its present name in 1850 upon becoming a secular institution.

    As a collegiate university, it comprises eleven colleges each with substantial autonomy on financial and institutional affairs and significant differences in character and history. The university also operates two satellite campuses located in Scarborough and Mississauga.

    University of Toronto has evolved into Canada’s leading institution of learning, discovery and knowledge creation. We are proud to be one of the world’s top research-intensive universities, driven to invent and innovate.

    Our students have the opportunity to learn from and work with preeminent thought leaders through our multidisciplinary network of teaching and research faculty, alumni and partners.

    The ideas, innovations and actions of more than 560,000 graduates continue to have a positive impact on the world.

    Academically, the University of Toronto is noted for movements and curricula in literary criticism and communication theory, known collectively as the Toronto School.

    The university was the birthplace of insulin and stem cell research, and was the site of the first electron microscope in North America; the identification of the first black hole Cygnus X-1; multi-touch technology, and the development of the theory of NP-completeness.

    The university was one of several universities involved in early research on deep learning. It receives the most annual scientific research funding of any Canadian university and is one of two members of the Association of American Universities outside the United States, the other being McGill University (CA).

    The Varsity Blues are the athletic teams that represent the university in intercollegiate league matches, with ties to gridiron football, rowing and ice hockey. The earliest recorded instance of gridiron football occurred at University of Toronto’s University College in November 1861.

    The university’s Hart House is an early example of the North American student centre, simultaneously serving cultural, intellectual, and recreational interests within its large Gothic-revival complex.

    The University of Toronto has educated three Governors General of Canada, four Prime Ministers of Canada, three foreign leaders, and fourteen Justices of the Supreme Court. As of March 2019, ten Nobel laureates, five Turing Award winners, 94 Rhodes Scholars, and one Fields Medalist have been affiliated with the university.

    Early history

    The founding of a colonial college had long been the desire of John Graves Simcoe, the first Lieutenant-Governor of Upper Canada and founder of York, the colonial capital. A University of Oxford (UK)-educated military commander who had fought in the American Revolutionary War, Simcoe believed a college was needed to counter the spread of republicanism from the United States. The Upper Canada Executive Committee recommended in 1798 that a college be established in York.

    On March 15, 1827, a royal charter was formally issued by King George IV, proclaiming “from this time one College, with the style and privileges of a University … for the education of youth in the principles of the Christian Religion, and for their instruction in the various branches of Science and Literature … to continue for ever, to be called King’s College.” The granting of the charter was largely the result of intense lobbying by John Strachan, the influential Anglican Bishop of Toronto who took office as the college’s first president. The original three-storey Greek Revival school building was built on the present site of Queen’s Park.

    Under Strachan’s stewardship, King’s College was a religious institution closely aligned with the Church of England and the British colonial elite, known as the Family Compact. Reformist politicians opposed the clergy’s control over colonial institutions and fought to have the college secularized. In 1849, after a lengthy and heated debate, the newly elected responsible government of the Province of Canada voted to rename King’s College as the University of Toronto and severed the school’s ties with the church. Having anticipated this decision, the enraged Strachan had resigned a year earlier to open Trinity College as a private Anglican seminary. University College was created as the nondenominational teaching branch of the University of Toronto. During the American Civil War the threat of Union blockade on British North America prompted the creation of the University Rifle Corps, which saw battle in resisting the Fenian raids on the Niagara border in 1866. The Corps was part of the Reserve Militia led by Professor Henry Croft.

    Established in 1878, the School of Practical Science was the precursor to the Faculty of Applied Science and Engineering, which has been nicknamed Skule since its earliest days. While the Faculty of Medicine opened in 1843, medical teaching was conducted by proprietary schools from 1853 until 1887, when the faculty absorbed the Toronto School of Medicine. Meanwhile, the university continued to set examinations and confer medical degrees. The university opened the Faculty of Law in 1887, followed by the Faculty of Dentistry in 1888 when the Royal College of Dental Surgeons became an affiliate. Women were first admitted to the university in 1884.

    A devastating fire in 1890 gutted the interior of University College and destroyed 33,000 volumes from the library but the university restored the building and replenished its library within two years. Over the next two decades a collegiate system took shape as the university arranged federation with several ecclesiastical colleges including Strachan’s Trinity College in 1904. The university operated the Royal Conservatory of Music from 1896 to 1991 and the Royal Ontario Museum from 1912 to 1968; both still retain close ties with the university as independent institutions. The University of Toronto Press was founded in 1901 as Canada’s first academic publishing house. The Faculty of Forestry founded in 1907 with Bernhard Fernow as dean was Canada’s first university faculty devoted to forest science. In 1910, the Faculty of Education opened its laboratory school, the University of Toronto Schools.

    World wars and post-war years

    The First and Second World Wars curtailed some university activities as undergraduate and graduate men eagerly enlisted. Intercollegiate athletic competitions and the Hart House Debates were suspended although exhibition and interfaculty games were still held. The David Dunlap Observatory in Richmond Hill opened in 1935 followed by the University of Toronto Institute for Aerospace Studies in 1949. The university opened satellite campuses in Scarborough in 1964 and in Mississauga in 1967. The university’s former affiliated schools at the Ontario Agricultural College and Glendon Hall became fully independent of the University of Toronto and became part of University of Guelph (CA) in 1964 and York University (CA) in 1965 respectively. Beginning in the 1980s reductions in government funding prompted more rigorous fundraising efforts.

    Since 2000

    In 2000 Kin-Yip Chun was reinstated as a professor of the university after he launched an unsuccessful lawsuit against the university alleging racial discrimination. In 2017 a human rights application was filed against the University by one of its students for allegedly delaying the investigation of sexual assault and being dismissive of their concerns. In 2018 the university cleared one of its professors of allegations of discrimination and antisemitism in an internal investigation after a complaint was filed by one of its students.

    The University of Toronto was the first Canadian university to amass a financial endowment greater than c. $1 billion in 2007. On September 24, 2020, the university announced a $250 million gift to the Faculty of Medicine from businessman and philanthropist James C. Temerty, the largest single philanthropic donation in Canadian history. This broke the previous record for the school set in 2019 when Gerry Schwartz and Heather Reisman jointly donated $100 million for the creation of a 750,000-square-foot innovation and artificial intelligence centre.

    Research

    Since 1926 the University of Toronto has been a member of the Association of American Universities, a consortium of the leading North American research universities. The university manages by far the largest annual research budget of any university in Canada, with sponsored direct-cost expenditures of $878 million in 2010. In 2018 the University of Toronto was named the top research university in Canada by Research Infosource, with a sponsored research income (external sources of funding) of $1,147.584 million in 2017. In the same year the university’s faculty averaged a sponsored research income of $428,200, while graduate students averaged a sponsored research income of $63,700. The federal government was the largest source of funding, with grants from the Canadian Institutes of Health Research, the Natural Sciences and Engineering Research Council, and the Social Sciences and Humanities Research Council amounting to about one-third of the research budget. About eight percent of research funding came from corporations, mostly in the healthcare industry.

    The first practical electron microscope was built by the physics department in 1938. During World War II the university developed the G-suit, a life-saving garment worn by Allied fighter plane pilots that was later adopted for use by astronauts. Development of the infrared chemiluminescence technique improved analyses of energy behaviours in chemical reactions. In 1963 the asteroid 2104 Toronto was discovered at the David Dunlap Observatory (CA) in Richmond Hill and is named after the university. In 1972 studies on Cygnus X-1 led to the publication of the first observational evidence proving the existence of black holes. Toronto astronomers have also discovered the Uranian moons of Caliban and Sycorax; the dwarf galaxies of Andromeda I, II and III; and the supernova SN 1987A. A pioneer in computing technology, the university designed and built UTEC, one of the world’s first operational computers, and later purchased Ferut, the second commercial computer after UNIVAC I. Multi-touch technology was developed at Toronto with applications ranging from handheld devices to collaboration walls. The AeroVelo Atlas, which won the Igor I. Sikorsky Human Powered Helicopter Competition in 2013, was developed by the university’s team of students and graduates and was tested in Vaughan.

    The discovery of insulin at the University of Toronto in 1921 is considered among the most significant events in the history of medicine. The stem cell was discovered at the university in 1963, forming the basis for bone marrow transplantation and all subsequent research on adult and embryonic stem cells. This was the first of many findings at Toronto relating to stem cells, including the identification of pancreatic and retinal stem cells. The cancer stem cell was first identified in 1997 by Toronto researchers, who have since found stem cell associations in leukemia, brain tumors and colorectal cancer. Medical inventions developed at Toronto include the glycaemic index, the infant cereal Pablum, the use of protective hypothermia in open heart surgery, and the first artificial cardiac pacemaker. The first successful single-lung transplant was performed at Toronto in 1981, followed by the first nerve transplant in 1988 and the first double-lung transplant in 1989. Researchers identified the maturation promoting factor that regulates cell division and discovered the T-cell receptor, which triggers responses of the immune system. The university is credited with isolating the genes that cause Fanconi anemia, cystic fibrosis and early-onset Alzheimer’s disease, among numerous other diseases. Between 1914 and 1972 the university operated the Connaught Medical Research Laboratories, now part of the pharmaceutical corporation Sanofi-Aventis. Among the research conducted at the laboratory was the development of gel electrophoresis.

    The University of Toronto is the primary research presence that supports one of the world’s largest concentrations of biotechnology firms. More than 5,000 principal investigators reside within 2 kilometres (1.2 mi) of the university grounds in Toronto’s Discovery District, conducting $1 billion of medical research annually. MaRS Discovery District is a research park that serves commercial enterprises and the university’s technology transfer ventures. In 2008, the university disclosed 159 inventions and had 114 active start-up companies. Its SciNet Consortium operates the most powerful supercomputer in Canada.

     
  • richardmitnick 9:56 am on January 11, 2023
    Tags: "AI to monitor changes to globally important glacier", , , , , Crevassing is an important component of ice shelf dynamics., , Machine learning, , , Un-corking the flow of ice - a process known as "unbuttressing", Using radar satellite images   

    From The University of Leeds (UK) And The University of Bristol (UK): “AI to monitor changes to globally important glacier” 


    From The University of Leeds (UK)

    And

    The University of Bristol (UK)

    1.9.23

    Crevasses on Antarctic ice shelves change the material properties of the ice and influence their flow speed. Research shows this coupling to be relevant but more complicated than previously thought for the Thwaites Glacier Ice Tongue. Credit: Dr Anna Hogg, University of Leeds.

    Scientists have developed AI to track the development of crevasses – or fractures – on the Thwaites Glacier ice tongue in west Antarctica.

    Crevasses are indicators of stress building up in the glacier.

    A team of researchers from the University of Leeds and University of Bristol have adapted an AI algorithm originally developed to identify cells in microscope images to spot crevasses forming in the ice from satellite images.

    Thwaites is a particularly important part of the Antarctic Ice Sheet because it holds enough ice to raise global sea levels by around 60 centimetres and is considered by many to be at risk of rapid retreat, threatening coastal communities around the world.

    Use of AI will allow scientists to more accurately monitor and model changes to this important glacier. 

    Published in the journal Nature Geoscience [below], the research focussed on a part of the glacier system where the ice flows into the sea and begins to float. Where this happens is known as the grounding line and it forms the start of the Thwaites Eastern ice shelf and the Thwaites Glacier ice tongue, which is also an ice shelf.

    Despite being small in comparison to the size of the entire glacier, changes to these ice shelves could have wide-ranging implications for the whole glacier system and future sea-level rise. 

    The scientists wanted to know if crevassing or fracture formation was more likely to occur with changes to the speed of the ice flow. 

    Scientists have mapped the crevasses on the Thwaites Glacier Ice Tongue through time using deep learning. This new research marks a change in the way in which the structural and dynamic properties of ice shelves can be investigated. Credit: Trystan Surawy-Stepney, University of Leeds.

    Developing the algorithm

    Using machine learning, the researchers taught a computer to look at radar satellite images and identify changes over the last decade. The images were taken by the European Space Agency’s Sentinel-1 satellites, which can “see” through the top layer of snow and onto the glacier, revealing the fractured surface of the ice normally hidden from sight.
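    Algorithms of this kind are typically convolutional segmentation networks that assign a label to every pixel of an image. The sketch below is a deliberately tiny encoder-decoder written in PyTorch and trained on random placeholder tensors; it only illustrates the general idea and is not the network the Leeds and Bristol team adapted, whose architecture, data preparation and training setup are not described in this article.

```python
# Generic sketch of a small crevasse-segmentation network, NOT the published model.
# Inputs stand in for single-channel Sentinel-1 backscatter tiles; targets are
# binary masks (1 = crevassed pixel). Random tensors replace real data.
import torch
import torch.nn as nn

class TinySegmenter(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(                        # downsample and learn texture features
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.dec = nn.Sequential(                        # upsample back to pixel resolution
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.Conv2d(16, 1, 1),                         # per-pixel crevasse logit
        )

    def forward(self, x):
        return self.dec(self.enc(x))

model = TinySegmenter()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

images = torch.randn(8, 1, 128, 128)                     # placeholder radar tiles
masks = torch.randint(0, 2, (8, 1, 128, 128)).float()    # placeholder crevasse masks

for step in range(5):                                     # tiny demo training loop
    opt.zero_grad()
    loss = loss_fn(model(images), masks)
    loss.backward()
    opt.step()
    print(f"step {step}: loss = {loss.item():.3f}")
```

    Trained on labelled satellite tiles rather than random tensors, a network like this produces a crevasse map for every new image, which is how changes can be tracked week by week across a full decade of acquisitions.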

    The analysis revealed that over the last six years, the Thwaites Glacier ice tongue has sped up and slowed down twice, by around 40% each time – from four km/year to six km/year before slowing. This is a substantial increase in the magnitude and frequency of speed change compared with past records.

    The study found a complex interplay between crevasse formation and speed of the ice flow. When the ice flow quickens or slows, more crevasses are likely to form. In turn, the increase in crevasses causes the ice to change speed as the level of friction between the ice and underlying rock alters.

    Dr Anna Hogg, a glaciologist in the Satellite Ice Dynamics group at Leeds and an author on the study, said: “Dynamic changes on ice shelves are traditionally thought to occur on timescales of decades to centuries, so it was surprising to see this huge glacier speed up and slow down so quickly.”

    “The study also demonstrates the key role that fractures play in un-corking the flow of ice – a process known as ‘unbuttressing’.

    Scientists have used radar imagery from the European Space Agency’s Sentinel-1 satellites to measure flow speed of the Thwaites Glacier Ice Tongue (shown) and analyse its structural integrity using deep learning. Credit: Benjamin J. Davison, University of Leeds.

    “Ice sheet models must be evolved to account for the fact that ice can fracture, which will allow us to measure future sea level contributions more accurately.”

    Trystan Surawy-Stepney, lead author of the paper and a doctoral researcher at Leeds, added: “The nice thing about this study is the precision with which the crevasses were mapped.

    “It has been known for a while that crevassing is an important component of ice shelf dynamics and this study demonstrates that this link can be studied on a large scale with beautiful resolution, using computer vision techniques applied to the deluge of satellite images acquired each week.” 

    Satellites orbiting the Earth provide scientists with new data over the most remote and inaccessible regions of Antarctica. The radar on board Sentinel-1 allows places like Thwaites Glacier to be imaged day or night, every week, all year round.

    Dr Mark Drinkwater of the European Space Agency commented: “Studies like this would not be possible without the large volume of high-resolution data provided by Sentinel-1. By continuing to plan future missions, we can carry on supporting work like this and broaden the scope of scientific research on vital areas of the Earth’s climate system.”

    As for Thwaites Glacier ice tongue, it remains to be seen whether such short-term changes have any impact on the long-term dynamics of the glacier, or whether they are simply isolated symptoms of an ice shelf close to its end. 

    Science paper:
    Nature Geoscience

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Bristol (UK) is one of the most popular and successful universities in the UK and was ranked within the top 50 universities in the world in the QS World University Rankings 2018.

    The University of Bristol (UK) is at the cutting edge of global research. We have made innovations in areas ranging from cot death prevention to nanotechnology.

    The University has had a reputation for innovation since its founding in 1876. Our research tackles some of the world’s most pressing issues in areas as diverse as infection and immunity, human rights, climate change, and cryptography and information security.

    The University currently has 40 Fellows of the Royal Society and 15 of the British Academy – a remarkable achievement for a relatively small institution.

    We aim to bring together the best minds in individual fields, and encourage researchers from different disciplines and institutions to work together to find lasting solutions to society’s pressing problems.

    We are involved in numerous international research collaborations and integrate practical experience in our curriculum, so that students work on real-life projects in partnership with business, government and community sectors.


    The University of Leeds is a public research university in Leeds, West Yorkshire, England. It was established in 1874 as the Yorkshire College of Science. In 1884 it merged with the Leeds School of Medicine (established 1831) and was renamed Yorkshire College. It became part of the federal Victoria University in 1887, joining Owens College (which became The University of Manchester (UK)) and University College Liverpool (which became The University of Liverpool (UK)). In 1904 a royal charter was granted to the University of Leeds by King Edward VII.

    With 36,330 students, the university is the 5th largest in the UK (out of 169). From 2006 to the present, it has consistently been ranked within the top 5 (alongside the University of Manchester, The Manchester Metropolitan University (UK), The University of Nottingham (UK) and The University of Edinburgh (SCT)) in the United Kingdom for the number of applications received. Leeds had an income of £751.7 million in 2020/21, of which £130.1 million was from research grants and contracts. The university has financial endowments of £90.5 million (2020–21), ranking outside the top ten British universities by financial endowment.

    Notable alumni include current Leader of the Labour Party Keir Starmer, former Secretary of State Jack Straw, former co-chairman of the Conservative Party Sayeeda Warsi, Piers Sellers (NASA astronaut) and six Nobel laureates.

    The university’s history is linked to the development of Leeds as an international centre for the textile industry and clothing manufacture in the United Kingdom during the Victorian era. The university’s roots can be traced back to the formation of schools of medicine in English cities to serve the general public.

    Before 1900, only six universities had been established in England and Wales: The University of Oxford (UK) (founded c. 1096–1201), The University of Cambridge (UK) (c. 1201), The University of London (UK) (1836), The University of Durham (UK) (1837), Victoria University (UK) (1880), and The University of Wales Trinity Saint David[ Prifysgol Cymru Y Drindod Dewi Sant](WLS) (1893).

    The Victoria University was established in Manchester in 1880 as a federal university in the North of England, instead of the government elevating Owens College to a university and granting it a royal charter. Owens College was the sole college of Victoria University from 1880 to 1884; in 1887 Yorkshire College became the third college to join the university.

    Leeds was given its first university in 1887 when the Yorkshire College joined the federal Victoria University on 3 November. The Victoria University had been established by royal charter in 1880; Owens College being at first the only member college. Leeds now found itself in an educational union with close social cousins from Manchester and Liverpool.

    Unlike Owens College, the Leeds campus of the Victoria University had never barred women from its courses. However, it was not until special facilities were provided at the Day Training College in 1896 that women began enrolling in significant numbers. The first female student to begin a course at the university was Lilias Annie Clark, who studied Modern Literature and Education.

    The Victoria (Leeds) University was a short-lived concept, as the multiple university locations in Manchester and Liverpool were keen to establish themselves as separate, independent universities. This was partially due to the benefits a university had for the cities of Liverpool and Manchester, whilst the institutions were also unhappy with the practical difficulties posed by maintaining a federal arrangement across broad distances. The interests of the universities and respective cities in creating independent institutions were further spurred by the granting of a charter to the University of Birmingham in 1900 after lobbying from Joseph Chamberlain.

    Following a Royal Charter and Act of Parliament in 1903, the then newly formed University of Liverpool began the fragmentation of the Victoria University by being the first member to gain independence. The University of Leeds soon followed suit and had been granted a royal charter as an independent body by King Edward VII by 1904.

    The Victoria University continued after the break-up of the group, with an amended constitution and renamed as the Victoria University of Manchester (though “Victoria” was usually omitted from its name except in formal usage) until September 2004. On 1 October 2004 a merger with the University of Manchester Institute of Science and Technology was enacted to form The University of Manchester.

    In December 2004, financial pressures forced the university’s governing body (the Council) to decide to close the Bretton campus. Activities at Bretton were moved to the main university campus in the summer of 2007 (allowing all Bretton-based students to complete their studies there). There was substantial opposition to the closure by the Bretton students. The university’s other satellite site, Manygates in Wakefield, also closed, but Lifelong Learning and Healthcare programmes are continuing on a new site next to Wakefield College.

    In May 2006, the university began re-branding itself to consolidate its visual identity to promote one consistent image. A new logo was produced, based on that used during the centenary celebrations in 2004, to replace the combined use of the modified university arms and the Parkinson Building, which has been in use since 2004. The university arms will still be used in its original form for ceremonial purposes only. Four university colours were also specified as being green, red, black and beige.

    Leeds provides the local community with over 2,000 university student volunteers. With 8,700 staff employed in 2019-20, the university is the third largest employer in Leeds and contributes around £1.23bn a year to the local economy – students add a further £211m through rents and living costs.

    The university’s educational partnerships have included providing formal accreditation of degree awards to The Leeds Arts University (UK) and The Leeds Trinity University (UK), although the latter now has the power to award its own degrees. The College of the Resurrection, an Anglican theological college in Mirfield with monastic roots, has, since its inception in 1904, been affiliated to the university, and ties remain close. The university is also a founding member of The Northern Consortium (UK).

    In August 2010, the university was one of the most targeted institutions by students entering the UCAS clearing process for 2010 admission, which matches undersubscribed courses to students who did not meet their firm or insurance choices. The university was one of nine The Russell Group Association (UK) universities offering extremely limited places to “exceptional” students after the universities in Birmingham, Bristol, Cambridge, Edinburgh and Oxford declared they would not enter the process due to courses being full to capacity.

    On 12 October 2010, The Refectory of the Leeds University Union hosted a live edition of the Channel 4 News, with students, academics and economists expressing their reaction to the Browne Review, an independent review of Higher Education funding and student finance conducted by John Browne, Baron Browne of Madingley. University of Leeds Vice-Chancellor and Russell Group chairman Michael Arthur participated, giving an academic perspective alongside current vice-chancellor of The Kingston University (UK) and former Pro Vice-Chancellor and Professor of Education at the University of Leeds, Sir Peter Scott. Midway through the broadcast a small group of protesters against the potential rise of student debt entered the building before being restrained and evacuated.

    In 2016, The University of Leeds became University of the Year 2017 in The Times and The Sunday Times’ Good University Guide. The university has risen to 13th place overall, which reflects impressive results in student experience, high entry standards, services and facilities, and graduate prospects.

    In 2018, the University of Leeds was ranked No. 93 in the world. There are currently 30,842 students studying at the university. The average tuition fee is US$12,000 – US$14,000.

    Research

    Many of the academic departments have specialist research facilities, for use by staff and students to support research from internationally significant collections in university libraries to state-of-the-art laboratories. These include those hosted at the Institute for Transport Studies, such as the University of Leeds Driving Simulator which is one of the most advanced worldwide in a research environment, allowing transport researchers to watch driver behaviour in accurately controlled laboratory conditions without the risks associated with a live, physical environment.

    With extensive links to the St James’s University Hospital through the Leeds School of Medicine, the university operates a range of high-tech research laboratories for biomedical and physical sciences, food and engineering – including clean rooms for bionanotechnology and plant science greenhouses. The university is connected to Leeds General Infirmary and the institute of molecular medicine based at St James’s University Hospital which aids integration of research and practice in the medical field.

    The university also operates research facilities in the aviation field, including an Airbus A320 flight simulator. The simulator was devised with the aim of promoting the safety and efficiency of flight operations; students use it to develop their reactions to critical situations such as engine failure, display malfunctioning and freak weather.

    In addition to these facilities, many university departments conduct research in their respective fields. There are also various research centres, including Leeds University Centre for African Studies.

    Leeds was ranked joint 19th (along with The University of St Andrews (SCT)) amongst multi-faculty institutions in the UK for the quality (GPA) of its research and 10th for its Research Power in the 2014 Research Excellence Framework.

    In 2014–15, Leeds was ranked as the 10th most targeted British university by graduate employers, a two-place decrease from 8th position in the previous 2014 rankings.

    The 2021 The Times Higher Education World University Rankings ranked Leeds as 153rd in the world. The university ranks 84th in the world in the CWTS Leiden Ranking. Leeds is ranked 91st in the world (and 15th in the UK) in the 2021 QS World University Rankings.

    The university won the biennially awarded Queen’s Anniversary Prize in 2009 for services to engineering and technology. The honour was awarded to the university’s Institute for Transport Studies (ITS), which for over forty years has been a world leader in transport teaching and research.

    The university is a founding member of The Russell Group Association (UK), comprising the leading research-intensive universities in the UK, as well as the N8 Group for research collaboration, The Worldwide Universities Network (UK), The Association of Commonwealth Universities (UK), The European University Association (EU), The White Rose University Consortium (UK), the Santander Network and the CDIO Initiative. It is also affiliated to The Universities (UK). The Leeds University Business School holds the ‘Triple Crown’ of accreditations from the Association to Advance Collegiate Schools of Business, the Association of MBAs and the European Quality Improvement System.

     
  • richardmitnick 8:24 am on January 9, 2023
    Tags: "Entrepreneurial Milestones in Life Sciences", , , , Machine learning, Measuring the many proteins in a tumor sample in high resolution., , Quantitative Biomedicine, Spatial single-cell proteomics, The field of "image-based systems biology",   

    From The University of Zürich (Universität Zürich) (CH): “Entrepreneurial Milestones in Life Sciences” 

    From The University of Zürich (Universität Zürich) (CH)

    1.9.23
    Nathalie Huber
    English translations by Philip Isler

    UZH Spin-Offs in 2022

    Three new spin-offs were founded at UZH in 2022, transferring scientific findings into industry practice. The business ventures explore new perspectives in the fight against cancer, space factories to produce human tissue, and ways to accelerate the development of novel drugs.

    The goal of the UZH spin-off Navignostics is to enable a more precise cancer diagnosis for patients. (Image: iStock / utah778)

    At UZH, new ideas evolve into pioneering technologies of the future. Last year, three groups of business founders with roots at UZH took the entrepreneurial leap and signed a licensing agreement with UZH. Their spin-offs emerged from life sciences research conducted at the Faculty of Medicine and the Faculty of Science. 

    Precision diagnostics, bespoke therapies

    Despite a wide variety of available drugs and treatment options, many people still succumb to cancer. Every tumor is unique, making it difficult to find the ideal treatment for each patient. The spin-off Navignostics develops novel diagnostic methods to perform advanced tumor sample analyses. “We want to help specialists find targeted immuno-oncology therapies that are tailored to the individual cancer patient’s tumor phenotype,” says Bernd Bodenmiller, professor of Quantitative Biomedicine.

    Navignostics leverages spatial single-cell proteomics, an approach that was developed by Bodenmiller and his research group. Their approach involves measuring the many proteins in a tumor sample in high resolution. This enables clinicians to use algorithms to determine the cell types present in the tumor as well as which of the cells’ processes are deregulated and how the tumor cells affect the surrounding cells. The aim is to use these data and artificial intelligence to recommend therapies that are tailored to the individual cancer patient.
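    As a rough illustration of the computational side, and not Navignostics' actual pipeline (which is not described in detail in this article), a typical first step with such data is to cluster the cells-by-proteins measurement matrix so that groups of cells with similar marker profiles can later be annotated as cell types. A minimal sketch on synthetic data:

```python
# Illustrative sketch only: cluster single cells by their protein profiles to
# group them into candidate cell types. Synthetic data; the company's actual
# methods and marker panel are assumptions, not taken from the article.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_cells, n_proteins = 5000, 40
counts = rng.lognormal(mean=1.0, sigma=0.8, size=(n_cells, n_proteins))  # fake marker intensities

X = StandardScaler().fit_transform(np.log1p(counts))    # log-transform and standardize each marker
labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(X)

# Each cluster could then be annotated as a cell type from its mean marker profile.
for k in range(8):
    print(f"cluster {k}: {np.sum(labels == k)} cells")
```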

    Navignostics is currently providing pharmaceutics companies with various services to support them in developing cancer drugs and companion diagnostics or to increase the chances of success of their clinical trials. Thanks to its successful round of seed financing (CHF 7.5 million), the spin-off can accelerate the development of its first diagnostic product and step up its cooperation with clinical, pharma and biotech partners.

    Human tissue from space

    The ambitious goal of Prometheus Life Technologies AG is to set up a factory that can produce human tissue – in space, no less. The spin-off wants to use the microgravity environment in space to manufacture three-dimensional organ-like tissues – dubbed organoids – using human stem cells. These tissues only grow three-dimensionally in zero gravity. On Earth and in labs, they require highly complicated auxiliary structures to do so. “At the moment, there’s an unmet demand for 3D organoids,” says Oliver Ullrich, director of the UZH Space Hub and co-inventor.

    These tissues are particularly popular among pharmaceutical companies, as they enable them to carry out toxicological trials on human tissue without first having to use animal models. Organoids produced from a patient’s stem cells could also one day be used as the building blocks for transplants to treat damaged organs, as the number of donated organs is nowhere near enough to meet the worldwide demand. Further opportunities for growth arise from replacing 2D with the more in-vivo-like 3D cell cultures.

    The spin-off’s technology is based on a previous joint project of UZH and Airbus. The research and development phase included comprehensive experiments on the ground as well as two successful production tests aboard the International Space Station (ISS). The whole process, from idea to commercialization, originated, developed and matured in the UZH Space Hub. Prometheus Life Technologies AG already won a high-ranking international award last month. The spin-off was selected as the winner of the Reef Starter Innovation Challenge, an innovation engine powered by Orbital Reef, a mixed-use space station to be built in the Earth’s lower orbit.

    Mapping drug activity contexts

    Just as statements shouldn’t be considered out of context, the effects of drugs need to be seen in a bigger picture. Founded by Lucas Pelkmans, professor of molecular biology, Apricot Therapeutics specializes in mapping drug activity contexts, or DACs. “We’re the first pharmaceutical company worldwide that focuses on DACs, and our goal is to drive forward the development of novel and innovative drugs,” Pelkmans says. The technology used by the spin-off is based on Pelkmans’ pioneering discovery that it is possible to predict the behavior of individual cells by mapping their surroundings using multi-scale microscopy and imaging technology. DACs capture how the various spatial organizations of our individual cells cause drugs to have variable effects.

    Apricot Therapeutics’ technology platform is based on methods in the field of “image-based systems biology”, for which the spin-off is currently evaluating two patent applications. The goal of the spin-off is to develop a procedure to measure all DACs relevant for drug activity and use machine learning to predict cellular responses to drugs with unprecedented accuracy. The company is the first to apply novel genomics 3.0 technologies to predict drug activity and treatment outcomes. Future clients include pharmaceutical companies, biotech and medtech start-ups, diagnostic centers, clinicians and research laboratories.

    Here are some of the milestones: 

    Successful cooperation

    Biotech company Molecular Partners concluded a licensing agreement with Novartis for Ensovibep, a drug against Covid-19. Molecular Partners sold the drug’s worldwide rights to Novartis for a one-time payment of CHF 150 million and a 22 percent royalty on sales. 
    Neuroimmune entered into a licensing agreement with AstraZeneca subsidiary Alexion to develop and market the NI006 heart drug. The spin-off also stepped up its cooperation with Japanese company Ono Pharmaceutical in the field of neurodegenerative diseases with the aim of co-developing new drugs.

    Medtech firsts

    Clemedi rolled out Tuberculini in 2022. The molecular test for drug-resistant tuberculosis can deliver results within 48 hours. 
    CUTISS AG received certification from Swissmedic that allows the UZH spin-off to manufacture personalized human skin transplants in its Schlieren facilities. On-site production increases the company’s flexibility and production capacity. In addition, CUTISS was awarded a tissue graft patent by the European Patent Office. 
    Oncobit AG obtained CE marking for its first product, oncobit™ PM. This marking, granted by European regulatory authorities, guarantees that the product can be used without restrictions throughout Europe. oncobit™ PM can be used to monitor treatment response, minimal residual disease, and disease recurrence in melanoma patients.

    New capital

    ImmunOs Therapeutics AG completed a highly successful financing round, raising over CHF 72 million. The biopharmaceutical company develops novel therapeutics for the treatment of cancer and autoimmune diseases.  
    Schlieren-based Kuros Biosciences AG announced a capital increase of CHF 6 million. The spin-off develops spinal fusion technologies that ease the burden of back pain.
    Invasight AG successfully raised CHF 4.5 million. Founded in 2020, the biotech spin-off develops protein-protein interaction antagonists (PPIAs) against invasive cancers.

    KOVE Medical and OxyPrem were each awarded an EIC Accelerator Grant funded by the State Secretariat for Education, Research and Innovation (SERI) to promote groundbreaking innovations by Swiss start-ups. KOVE is developing a method for prenatal surgical interventions, while OxyPrem is producing a device to monitor oxygen supply to the brain.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Zürich (Universität Zürich) (CH), located in the city of Zürich, is the largest university in Switzerland, with over 26,000 students. It was founded in 1833 from the existing colleges of theology, law, medicine and a new faculty of philosophy.

    Currently, the university has seven faculties: Philosophy, Human Medicine, Economic Sciences, Law, Mathematics and Natural Sciences, Theology and Veterinary Medicine. The university offers the widest range of subjects and courses of any Swiss higher education institutions.

    The university is a member of the League of European Research Universities (EU) (LERU) and the Universitas 21 (U21) network, a global network of 27 research universities from around the world, promoting research collaboration and the exchange of knowledge.

    Numerous distinctions highlight the University’s international renown in the fields of medicine, immunology, genetics, neuroscience and structural biology as well as in economics. To date, the Nobel Prize has been conferred on twelve UZH scholars.

    Sharing Knowledge

    The academic excellence of the University of Zürich brings benefits to both the public and the private sectors not only in the Canton of Zürich, but throughout Switzerland. Knowledge is shared in a variety of ways: in addition to granting the general public access to its twelve museums and many of its libraries, the University makes findings from cutting-edge research available to the public in accessible and engaging lecture series and panel discussions.

    1. Identity of the University of Zürich

    Scholarship

    The University of Zürich (UZH) is an institution with a strong commitment to the free and open pursuit of scholarship.

    Scholarship is the acquisition, the advancement and the dissemination of knowledge in a methodological and critical manner.

    Academic freedom and responsibility

    To flourish, scholarship must be free from external influences, constraints and ideological pressures. The University of Zürich is committed to unrestricted freedom in research and teaching.

    Academic freedom calls for a high degree of responsibility, including reflection on the ethical implications of research activities for humans, animals and the environment.

    Universitas

    Work in all disciplines at the University is based on a scholarly inquiry into the realities of our world.

    As Switzerland’s largest university, the University of Zürich promotes wide diversity in both scholarship and in the fields of study offered. The University fosters free dialogue, respects the individual characteristics of the disciplines, and advances interdisciplinary work.

    2. The University of Zurich’s goals and responsibilities

    Basic principles

    UZH pursues scholarly research and teaching, and provides services for the benefit of the public.

    UZH has successfully positioned itself among the world’s foremost universities. The University attracts the best researchers and students, and promotes junior scholars at all levels of their academic career.

    UZH sets priorities in research and teaching by considering academic requirements and the needs of society. These priorities presuppose basic research and interdisciplinary methods.

    UZH strives to uphold the highest quality in all its activities.
    To secure and improve quality, the University regularly monitors and evaluates its performance.

    Research

    UZH contributes to the increase of knowledge through the pursuit of cutting-edge research.

    UZH is primarily a research institution. As such, it enables and expects its members to conduct research, and supports them in doing so.

    While basic research is the core focus at UZH, the University also pursues applied research.

     
  • richardmitnick 2:13 pm on January 8, 2023
    Tags: "Unpacking the 'black box' to build better AI models", , , , , , Computer Science and Artificial Intelligence Laboratory (CSAIL), , From butterflies to bioinformatics, Machine learning, , , Stefanie Jegelka, Stefanie Jegelka seeks to understand how machine-learning models behave to help researchers build more robust models for applications in biology and computer vision and optimization and more., Teaching models to learn,   

    From The Massachusetts Institute of Technology: “Unpacking the ‘black box’ to build better AI models” Stefanie Jegelka 

    From The Massachusetts Institute of Technology

    1.8.23
    Adam Zewe

    Stefanie Jegelka seeks to understand how machine-learning models behave, to help researchers build more robust models for applications in biology, computer vision, optimization, and more.

    Stefanie Jegelka, a newly-tenured associate professor in the Department of Electrical Engineering and Computer Science at MIT, develops algorithms for deep learning applications and studies how deep learning models behave and what they can learn. Photo: M. Scott Brauer.

    “What I really loved about MIT, from the very beginning, was that the people really care deeply about research and creativity. That is what I appreciate the most about MIT. The people here really value originality and digging deep into research,” Jegelka says. Photo: M. Scott Brauer.

    When deep learning models are deployed in the real world, perhaps to detect financial fraud from credit card activity or identify cancer in medical images, they are often able to outperform humans.

    But what exactly are these deep learning models learning? Does a model trained to spot skin cancer in clinical images, for example, actually learn the colors and textures of cancerous tissue, or is it flagging some other features or patterns?

    These powerful machine-learning models are typically based on artificial neural networks that can have millions of nodes that process data to make predictions. Due to their complexity, researchers often call these models “black boxes” because even the scientists who build them don’t understand everything that is going on under the hood.

    Stefanie Jegelka isn’t satisfied with that “black box” explanation. A newly tenured associate professor in the MIT Department of Electrical Engineering and Computer Science, Jegelka is digging deep into deep learning to understand what these models can learn and how they behave, and how to build certain prior information into these models.

    “At the end of the day, what a deep-learning model will learn depends on so many factors. But building an understanding that is relevant in practice will help us design better models, and also help us understand what is going on inside them so we know when we can deploy a model and when we can’t. That is critically important,” says Jegelka, who is also a member of the Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Institute for Data, Systems, and Society (IDSS).

    Jegelka is particularly interested in optimizing machine-learning models when input data are in the form of graphs. Graph data pose specific challenges: for instance, the data carry information about individual nodes and edges as well as about the structure — what is connected to what. In addition, graphs have mathematical symmetries that need to be respected by the machine-learning model so that, for instance, the same graph always leads to the same prediction. Building such symmetries into a machine-learning model is usually not easy.

    Take molecules, for instance. Molecules can be represented as graphs, with vertices that correspond to atoms and edges that correspond to chemical bonds between them. Drug companies may want to use deep learning to rapidly predict the properties of many molecules, narrowing down the number they must physically test in the lab.

    Jegelka studies methods to build mathematical machine-learning models that can effectively take graph data as an input and output something else, in this case a prediction of a molecule’s chemical properties. This is particularly challenging since a molecule’s properties are determined not only by the atoms within it, but also by the connections between them.
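    As a rough illustration of that data flow, the sketch below (again illustrative only, with random, untrained weights) encodes ethanol as a graph of one-hot atom features and a bond adjacency matrix, passes it through two message-passing layers, and reads out a single scalar standing in for a predicted property:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ethanol (C2H5OH) as a graph: one one-hot feature per atom type (C, O, H).
atoms = ["C", "C", "O", "H", "H", "H", "H", "H", "H"]
types = {"C": 0, "O": 1, "H": 2}
x = np.eye(3)[[types[a] for a in atoms]]          # node features, shape (9, 3)

# Bonds (edges) listed once; the adjacency matrix is symmetric.
bonds = [(0, 1), (1, 2), (0, 3), (0, 4), (0, 5), (1, 6), (1, 7), (2, 8)]
a = np.zeros((len(atoms), len(atoms)))
for i, j in bonds:
    a[i, j] = a[j, i] = 1.0

# Two message-passing layers with random (untrained) weights, then a
# sum readout and a linear head that outputs one scalar "property".
w1, w2 = rng.normal(size=(3, 8)), rng.normal(size=(8, 8))
w_out = rng.normal(size=8)

h = np.tanh((x + a @ x) @ w1)     # layer 1: mix each atom with its bonded neighbors
h = np.tanh((h + a @ h) @ w2)     # layer 2
prediction = h.sum(axis=0) @ w_out
print(prediction)                 # one scalar per molecule
```

    In practice the weights would be trained against measured properties for many molecules; the sketch only shows how a graph goes in and a number comes out.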

    Other examples of machine learning on graphs include traffic routing, chip design, and recommender systems.

    Designing these models is made even more difficult by the fact that data used to train them are often different from data the models see in practice. Perhaps the model was trained using small molecular graphs or traffic networks, but the graphs it sees once deployed are larger or more complex.

    In this case, what can researchers expect this model to learn, and will it still work in practice if the real-world data are different?

    “Your model is not going to be able to learn everything because of some hardness problems in computer science, but what you can learn and what you can’t learn depends on how you set the model up,” Jegelka says.

    She approaches this question by combining her passion for algorithms and discrete mathematics with her excitement for machine learning.

    From butterflies to bioinformatics

    Jegelka grew up in a small town in Germany and became interested in science when she was a high school student; a supportive teacher encouraged her to participate in an international science competition. She and her teammates from the U.S. and Singapore won an award for a website about butterflies that they created in three languages.

    “For our project, we took images of wings with a scanning electron microscope at a local university of applied sciences. I also got the opportunity to use a high-speed camera at Mercedes Benz — this camera usually filmed combustion engines — which I used to capture a slow-motion video of the movement of a butterfly’s wings. That was the first time I really got in touch with science and exploration,” she recalls.

    Intrigued by both biology and mathematics, Jegelka decided to study bioinformatics at the University of Tübingen and the University of Texas-Austin. She had a few opportunities to conduct research as an undergraduate, including an internship in computational neuroscience at Georgetown University, but wasn’t sure what career to follow.

    When she returned for her final year of college, Jegelka moved in with two roommates who were working as research assistants at the MPG Institute in Tübingen.

    “They were working on machine learning, and that sounded really cool to me. I had to write my bachelor’s thesis, so I asked at the institute if they had a project for me. I started working on machine learning at the MPG Institute and I loved it. I learned so much there, and it was a great place for research,” she says.

    She stayed on at the MPG Institute to complete a master’s thesis, and then embarked on a PhD in machine learning at the MPG Institute and the Swiss Federal Institute of Technology.

    During her PhD, she explored how concepts from discrete mathematics can help improve machine-learning techniques.

    Teaching models to learn

    The more Jegelka learned about machine learning, the more intrigued she became by the challenges of understanding how models behave, and how to steer this behavior.

    “You can do so much with machine learning, but only if you have the right model and data. It is not just a black-box thing where you throw it at the data and it works. You actually have to think about it, its properties, and what you want the model to learn and do,” she says.

    After completing a postdoc at the University of California-Berkeley, Jegelka was hooked on research and decided to pursue a career in academia. She joined the faculty at MIT in 2015 as an assistant professor.

    “What I really loved about MIT, from the very beginning, was that the people really care deeply about research and creativity. That is what I appreciate the most about MIT. The people here really value originality and depth in research,” she says.

    That focus on creativity has enabled Jegelka to explore a broad range of topics.

    In collaboration with other faculty at MIT, she studies machine-learning applications in biology, imaging, computer vision, and materials science.

    But what really drives Jegelka is probing the fundamentals of machine learning, and most recently, the issue of robustness. Often, a model performs well on training data, but its performance deteriorates when it is deployed on slightly different data. Building prior knowledge into a model can make it more reliable, but understanding what information the model needs to be successful and how to build it in is not so simple, she says.

    She is also exploring methods to improve the performance of machine-learning models for image classification.

    Image classification models are everywhere, from the facial recognition systems on mobile phones to tools that identify fake accounts on social media. These models need massive amounts of data for training, but since it is expensive for humans to hand-label millions of images, researchers often use unlabeled datasets to pretrain models instead.

    These models then reuse the representations they have learned when they are fine-tuned later for a specific task.

    Ideally, researchers want the model to learn as much as it can during pretraining, so it can apply that knowledge to its downstream task. But in practice, these models often learn only a few simple correlations — like that one image has sunshine and one has shade — and use these “shortcuts” to classify images.

    “We showed that this is a problem in ‘contrastive learning,’ which is a standard technique for pre-training, both theoretically and empirically. But we also show that you can influence the kinds of information the model will learn to represent by modifying the types of data you show the model. This is one step toward understanding what models are actually going to do in practice,” she says.
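    For readers unfamiliar with the technique the quote refers to, contrastive pretraining typically minimizes an InfoNCE-style loss that pulls two augmented views of the same image together in embedding space and pushes views of different images apart; which augmentations are chosen largely determines which features, or shortcuts, the model can lean on. A minimal NumPy sketch of such a loss (illustrative only, not the authors' code) looks like this:

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.1):
    """Contrastive (InfoNCE-style) loss for a batch of paired embeddings.

    z1[i] and z2[i] are embeddings of two augmented views of the same image
    (a "positive pair"); every other pairing in the batch is treated as a
    negative. The loss pulls positives together and pushes negatives apart.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = (z1 @ z2.T) / temperature            # cosine similarities
    # Cross-entropy where the correct "class" for row i is column i.
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

# Toy batch: 4 pairs of 16-dimensional embeddings from two noisy "views".
rng = np.random.default_rng(1)
base = rng.normal(size=(4, 16))
view1 = base + 0.05 * rng.normal(size=(4, 16))
view2 = base + 0.05 * rng.normal(size=(4, 16))
print(info_nce_loss(view1, view2))
```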

    Researchers still don’t understand everything that goes on inside a deep-learning model, or the details of how they can influence what a model learns and how it behaves, but Jegelka looks forward to continuing to explore these topics.

    “Often in machine learning, we see something happen in practice and we try to understand it theoretically. This is a huge challenge. You want to build an understanding that matches what you see in practice, so that you can do better. We are still just at the beginning of understanding this,” she says.

    Outside the lab, Jegelka is a fan of music, art, traveling, and cycling. But these days, she enjoys spending most of her free time with her preschool-aged daughter.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    MIT Campus

    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and Whitehead Institute.

    Massachusetts Institute of Technology-Haystack Observatory Westford, Massachusetts, USA, Altitude 131 m (430 ft).


    The Computer Science and Artificial Intelligence Laboratory (CSAIL)

    The MIT Kavli Institute for Astrophysics and Space Research

    MIT’s Institute for Medical Engineering and Science is a research institute at the Massachusetts Institute of Technology

    The MIT Laboratory for Nuclear Science

    The MIT Media Lab

    The MIT School of Engineering

    The MIT Sloan School of Management

    Spectrum

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities.

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but rather an institution combining elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed as the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    The Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, The Massachusetts Institute of Technology became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and The Massachusetts Institute of Technology’s defense research. In this period, Massachusetts Institute of Technology’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced aLIGO was designed and constructed by a team of scientists from California Institute of Technology, Massachusetts Institute of Technology, and industrial contractors, and funded by the National Science Foundation.

    Caltech/MIT Advanced aLIGO

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique, which served as the essential blueprint for the LIGO.

    The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 11:40 am on December 13, 2022 Permalink | Reply
    Tags: "Glassy Discovery Offers Computational Windfall to Researchers Across Disciplines", A counterintuitive algorithmic strategy called “metadynamics”, , , , , Computational protein folding, , Crystals, Finding rare low-energy canyons in glassy materials., Folding peptide sequences into proteins, Glassy materials, Machine learning, , ,   

    From The School of Engineering and Applied Science At The University of Pennsylvania: “Glassy Discovery Offers Computational Windfall to Researchers Across Disciplines” 

    From The School of Engineering and Applied Science

    At

    U Penn bloc

    The University of Pennsylvania

    12.5.22
    Devorah Fischler

    1
    Penn Engineers used a counterintuitive algorithmic strategy called “metadynamics” to find rare low-energy canyons in glassy materials. Their breakthrough suggests the algorithm may have a wide range of useful scientific applications, potentially speeding up the pace of computational protein folding and eliminating the need for large data sets in machine learning. (Image credit: Dariusz Jemielniak)

    John Crocker had expected to see a flat line — a familiar horizontal track with some slight peaks and valleys — but the plot of energy in front of him dove sharply downward.

    “It’s a once-in-a-lifetime finding,” says Crocker. “It was as if the simulation had unexpectedly fallen into a deep canyon on an energy surface. This was lucky for two reasons. Firstly, it turned out to be a game changer for our study of glassy materials. And secondly, similar canyons have the potential to help others grappling with the same computational obstacles we face in our field, from computer scientists working on machine learning algorithms to bioengineers studying protein folding. We ended up with significant results because we were curious enough to try a method that shouldn’t have worked. But it did.”

    The method is metadynamics, a computational approach to exploring energy landscapes. Its counterintuitive application is the subject of a recent publication in PNAS [below] from a group of Penn Engineers at the University of Pennsylvania led by Crocker, Professor and Graduate Group Chair in the Department of Chemical and Biomolecular Engineering (CBE), along with Robert Riggleman, Associate Professor in CBE, and Amruthesh Thirumalaiswamy, Ph.D. student in CBE.

    Most solids are glasses (or glassy). We categorize the rest as crystals. These categorizations are not limited to glass or crystal as we might imagine them, but instead indicate how atoms in any solid are arranged. Crystals have neat, repetitive atomic structures. Glasses, however, are amorphous. Their atoms and molecules take on a vast number of disordered configurations.

    2
    Glassy and crystal solids.

    Glassy configurations get stuck while pursuing — as all systems do — their most stable, lowest energy states. Given enough time, glasses will still very slowly relax in energy, but their disordered atoms make it a slow and difficult process.

    Low-energy, stable glasses, or “ideal glasses,” are the key to a storehouse of knowledge that researchers are keen to unlock.

    Seeking to understand and eventually replicate the conditions of glassy materials that overcome the obstacles of their own atomic quirks, scientists use both experimental and theoretical approaches.

    Labs have, for example, melted and re-cooled fossilized amber to develop processes for recreating the encouraging effects that millions of years have had on its glassy pursuit of low-energy states. Crocker’s team, affiliated with the cross-disciplinary Penn Institute for Computational Science (PICS), explores physical structures with mathematical models.

    “We use computational models to simulate the positions and movements of atoms in different glasses,” says Thirumalaiswamy. “In order to keep track of a material’s particles, which are so numerous and dynamic they are impossible to visualize in three dimensions, we need to represent them mathematically in high-dimensional virtual spaces. If we have 300 atoms, for example, we need to represent them in 900 dimensions. We call these energy landscapes. We then investigate the landscapes, navigating them almost like explorers.”
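    A minimal sketch of that bookkeeping, assuming a generic pair potential rather than the model used in the study: flatten the 3D coordinates of 300 atoms into one 900-dimensional configuration point and evaluate an energy at that point.

```python
import numpy as np

rng = np.random.default_rng(2)

n_atoms = 300
positions = rng.uniform(0.0, 10.0, size=(n_atoms, 3))

# One configuration = one point in a 3 * n_atoms = 900-dimensional space.
configuration_point = positions.reshape(-1)
print(configuration_point.shape)        # (900,)

def pair_energy(config, n=n_atoms):
    """Toy Lennard-Jones-style energy of a 900-dimensional configuration
    point (a generic stand-in potential, not the study's model)."""
    pos = config.reshape(n, 3)
    diff = pos[:, None, :] - pos[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    iu = np.triu_indices(n, k=1)        # each pair counted once
    r = np.clip(dist[iu], 0.8, None)    # avoid divergence at tiny separations
    return float(np.sum(4.0 * (r ** -12 - r ** -6)))

print(pair_energy(configuration_point))
```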

    In these computational models, single configuration points, digests of atomic movement, tell the story of a glass’ energy levels. They show where a glass has gotten stuck and where it might have achieved a low-energy state.

    The problem is that until now, researchers have not been able to navigate landscapes efficiently enough to find these rare instances of stability.

    “Most studies do random walks around high-dimensional landscapes at enormous computational cost. It would take an infinite amount of time to find anything of interest. The landscapes are immense, and these walks are repetitive, wasting large amounts of time fixed in a single state before moving on to the next one,” says Riggleman.

    And so, they took a chance in trying metadynamics, a method that seemed destined to fail.

    Metadynamics is an algorithmic strategy developed to explore the entire landscape and avoid repetition. It assigns a penalty for going back to the same place twice. Metadynamics never works in high-dimensional spaces, however, because it takes too long to construct the penalties, canceling out the strategy’s potential for efficiency.
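    In its simplest form, the penalty is a Gaussian “hill” deposited wherever the simulation currently sits, so regions that have already been visited become energetically unattractive. The one-dimensional toy below (with arbitrary illustrative parameters, not the study's settings) shows the mechanics: a walker that would otherwise stay trapped in one well of a double-well landscape is gradually pushed out as hills accumulate.

```python
import numpy as np

rng = np.random.default_rng(3)

def energy(x):
    """Toy double-well landscape standing in for a high-dimensional one."""
    return (x ** 2 - 1.0) ** 2

centers, height, width = [], 0.1, 0.2

def bias(x):
    """History-dependent penalty: a Gaussian hill at every visited point."""
    if not centers:
        return 0.0
    c = np.array(centers)
    return float(np.sum(height * np.exp(-((x - c) ** 2) / (2 * width ** 2))))

x = -1.0                                 # start stuck in the left well
for step in range(2000):
    # Propose a small move and accept it by Metropolis on energy + bias.
    x_new = x + rng.normal(scale=0.1)
    delta = (energy(x_new) + bias(x_new)) - (energy(x) + bias(x))
    if delta < 0 or rng.random() < np.exp(-delta / 0.1):
        x = x_new
    if step % 20 == 0:
        centers.append(x)                # deposit a new penalty hill here

print(f"final position: {x:.2f}, hills deposited: {len(centers)}")
```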

    Yet as the researchers watched their configuration energy trend downward, they realized it had succeeded.

    “We couldn’t have guessed it, but the landscapes proved to have these canyons with floors that are only two- or three-dimensional,” says Crocker. “Our algorithm literally fell right in. We found regularly occurring low-energy configurations in several different glasses with a method we think could be revolutionary for other disciplines as well.”

    The potential applications of the Crocker Lab canyons are wide-ranging.

    In the two decades since the Human Genome Project finished its mapping, scientists have been using computational models to fold peptide sequences into proteins. Proteins that fold well in nature have, through evolution, found ways to explore low-energy states analogous to those of ideal glasses.

    Theoretical studies of proteins use energy landscapes to learn about the folding processes that create the functional (or dysfunctional) foundations for biological health. Yet measuring these structures takes time, money and energy that scientists and the populations they aim to serve don’t have to spare. Bogged down by the same computational inefficiencies that glassy materials researchers face, genomic scientists may find similar successes with metadynamics-based approaches, accelerating the pace of medical research.

    Machine learning processes have a lot in common with random walks in high-dimensional space. Training artificial intelligence takes an enormous amount of computational time and power and has a long way to go in terms of predictive accuracies.

    A neural net needs to “see,” for example, thousands to millions of faces in order to acquire enough skill for facial recognition. With a more strategic computational process, machine learning could become faster, cheaper and more accessible. The metadynamics algorithm may have the potential to overcome the need for the huge and costly datasets typical of the process.

    Not only would this provide solutions for industry efficiency, but it could also democratize AI, allowing people with modest resources to do their own training and development.

    “We’re conjecturing that the landscapes in these different fields have similar geometric structures to ours,” says Crocker. “We suspect there might be a deep mathematical reason for why these canyons exist, and they may be present in these other related systems. This is our invitation; we look forward to the dialogue it begins.”

    This work was supported by NSF-Division of Material Research 1609525 and 1720530 and computational resources provided by XSEDE (Extreme Science and Engineering Discovery Environment) through TG-DMR150034.

    Science paper:
    PNAS

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The School of Engineering and Applied Science is an undergraduate and graduate school of The University of Pennsylvania. The School offers programs that emphasize hands-on study of engineering fundamentals (with an offering of approximately 300 courses) while encouraging students to leverage the educational offerings of the broader University. Engineering students can also take advantage of research opportunities through interactions with Penn’s School of Medicine, School of Arts and Sciences and the Wharton School.

    Penn Engineering offers bachelors, masters and Ph.D. degree programs in contemporary fields of engineering study. The nationally ranked bioengineering department offers the School’s most popular undergraduate degree program. The Jerome Fisher Program in Management and Technology, offered in partnership with the Wharton School, allows students to simultaneously earn a Bachelor of Science degree in Economics as well as a Bachelor of Science degree in Engineering. SEAS also offers several masters programs, which include: Executive Master’s in Technology Management, Master of Biotechnology, Master of Computer and Information Technology, Master of Computer and Information Science and a Master of Science in Engineering in Telecommunications and Networking.

    History

    The study of engineering at The University of Pennsylvania can be traced back to 1850 when the University trustees adopted a resolution providing for a professorship of “Chemistry as Applied to the Arts”. In 1852, the study of engineering was further formalized with the establishment of the School of Mines, Arts and Manufactures. The first Professor of Civil and Mining Engineering was appointed in 1852. The first graduate of the school received his Bachelor of Science degree in 1854. Since that time, the school has grown to six departments. In 1973, the school was renamed as the School of Engineering and Applied Science.

    The early growth of the school benefited from the generosity of two Philadelphians: John Henry Towne and Alfred Fitler Moore. Towne, a mechanical engineer and railroad developer, bequeathed the school a gift of $500,000 upon his death in 1875. The main administration building for the school still bears his name. Moore was a successful entrepreneur who made his fortune manufacturing telegraph cable. A 1923 gift from Moore established the Moore School of Electrical Engineering, which is the birthplace of the first electronic general-purpose Turing-complete digital computer, ENIAC, in 1946.

    During the latter half of the 20th century the school continued to break new ground. In 1958, Barbara G. Mandell became the first woman to enroll as an undergraduate in the School of Engineering. In 1965, the university acquired two sites that were formerly used as U.S. Army Nike Missile Base (PH 82L and PH 82R) and created the Valley Forge Research Center. In 1976, the Management and Technology Program was created. In 1990, a Bachelor of Applied Science in Biomedical Science and Bachelor of Applied Science in Environmental Science were first offered, followed by a master’s degree in Biotechnology in 1997.

    The school continues to expand with the addition of the Melvin and Claire Levine Hall for computer science in 2003, Skirkanich Hall for Bioengineering in 2006, and the Krishna P. Singh Center for Nanotechnology in 2013.

    Academics

    Penn’s School of Engineering and Applied Science is organized into six departments:

    Bioengineering
    Chemical and Biomolecular Engineering
    Computer and Information Science
    Electrical and Systems Engineering
    Materials Science and Engineering
    Mechanical Engineering and Applied Mechanics

    The school’s Department of Bioengineering, originally named Biomedical Electronic Engineering, consistently garners a top-ten ranking at both the undergraduate and graduate level from U.S. News & World Report. The department also houses the George H. Stephenson Foundation Educational Laboratory & Bio-MakerSpace (aka Biomakerspace) for training undergraduate through PhD students. It is Philadelphia’s and Penn’s only Bio-MakerSpace and it is open to the Penn community, encouraging a free flow of ideas, creativity, and entrepreneurship between Bioengineering students and students throughout the university.

    Founded in 1893, the Department of Chemical and Biomolecular Engineering is “America’s oldest continuously operating degree-granting program in chemical engineering.”

    The Department of Electrical and Systems Engineering is recognized for its research in electroscience, systems science and network systems and telecommunications.

    Originally established in 1946 as the School of Metallurgical Engineering, the Materials Science and Engineering Department “includes cutting edge programs in nanoscience and nanotechnology, biomaterials, ceramics, polymers, and metals.”

    The Department of Mechanical Engineering and Applied Mechanics draws its roots from the Department of Mechanical and Electrical Engineering, which was established in 1876.

    Each department houses one or more degree programs. The Chemical and Biomolecular Engineering, Materials Science and Engineering, and Mechanical Engineering and Applied Mechanics departments each house a single degree program.

    Bioengineering houses two programs (both a Bachelor of Science in Engineering degree as well as a Bachelor of Applied Science degree). Electrical and Systems Engineering offers four Bachelor of Science in Engineering programs: Electrical Engineering, Systems Engineering, Computer Engineering, and the Networked & Social Systems Engineering, the latter two of which are co-housed with Computer and Information Science (CIS). The CIS department, like Bioengineering, offers Computer and Information Science programs under both bachelor programs. CIS also houses Digital Media Design, a program jointly operated with PennDesign.

    Research

    Penn’s School of Engineering and Applied Science is a research institution. SEAS research strives to advance science and engineering and to achieve a positive impact on society.

    U Penn campus

    Academic life at University of Pennsylvania is unparalleled, with 100 countries and every U.S. state represented in one of the Ivy League’s most diverse student bodies. Consistently ranked among the top 10 universities in the country, Penn enrolls 10,000 undergraduate students and welcomes an additional 10,000 students to our world-renowned graduate and professional schools.

    Penn’s award-winning educators and scholars encourage students to pursue inquiry and discovery, follow their passions, and address the world’s most challenging problems through an interdisciplinary approach.

    The University of Pennsylvania is a private Ivy League research university in Philadelphia, Pennsylvania. The university claims a founding date of 1740 and is one of the nine colonial colleges chartered prior to the U.S. Declaration of Independence. Benjamin Franklin, Penn’s founder and first president, advocated an educational program that trained leaders in commerce, government, and public service, similar to a modern liberal arts curriculum.

    Penn has four undergraduate schools as well as twelve graduate and professional schools. Schools enrolling undergraduates include the College of Arts and Sciences; the School of Engineering and Applied Science; the Wharton School; and the School of Nursing. Penn’s “One University Policy” allows students to enroll in classes in any of Penn’s twelve schools. Among its highly ranked graduate and professional schools are a law school whose first professor wrote the first draft of the United States Constitution, the first school of medicine in North America (Perelman School of Medicine, 1765), and the first collegiate business school (Wharton School, 1881).

    Penn is also home to the first “student union” building and organization (Houston Hall, 1896), the first Catholic student club in North America (Newman Center, 1893), the first double-decker college football stadium (Franklin Field, 1924 when second deck was constructed), and Morris Arboretum, the official arboretum of the Commonwealth of Pennsylvania. The first general-purpose electronic computer (ENIAC) was developed at Penn and formally dedicated in 1946. In 2019, the university had an endowment of $14.65 billion, the sixth-largest endowment of all universities in the United States, as well as a research budget of $1.02 billion. The university’s athletics program, the Quakers, fields varsity teams in 33 sports as a member of the NCAA Division I Ivy League conference.

    As of 2018, distinguished alumni and/or Trustees include three U.S. Supreme Court justices; 32 U.S. senators; 46 U.S. governors; 163 members of the U.S. House of Representatives; eight signers of the Declaration of Independence and seven signers of the U.S. Constitution (four of whom signed both representing two-thirds of the six people who signed both); 24 members of the Continental Congress; 14 foreign heads of state and two presidents of the United States, including Donald Trump. As of October 2019, 36 Nobel laureates; 80 members of the American Academy of Arts and Sciences; 64 billionaires; 29 Rhodes Scholars; 15 Marshall Scholars and 16 Pulitzer Prize winners have been affiliated with the university.

    History

    The University of Pennsylvania considers itself the fourth-oldest institution of higher education in the United States, though this is contested by Princeton University and Columbia University. The university also considers itself as the first university in the United States with both undergraduate and graduate studies.

    In 1740, a group of Philadelphians joined together to erect a great preaching hall for the traveling evangelist George Whitefield, who toured the American colonies delivering open-air sermons. The building was designed and built by Edmund Woolley and was the largest building in the city at the time, drawing thousands of people the first time it was preached in. It was initially planned to serve as a charity school as well, but a lack of funds forced plans for the chapel and school to be suspended. According to Franklin’s autobiography, it was in 1743 when he first had the idea to establish an academy, “thinking the Rev. Richard Peters a fit person to superintend such an institution”. However, Peters declined a casual inquiry from Franklin and nothing further was done for another six years. In the fall of 1749, now more eager to create a school to educate future generations, Benjamin Franklin circulated a pamphlet titled Proposals Relating to the Education of Youth in Pensilvania, his vision for what he called a “Public Academy of Philadelphia”. Unlike the other colonial colleges that existed in 1749—Harvard University, William & Mary, Yale University, and The College of New Jersey—Franklin’s new school would not focus merely on education for the clergy. He advocated an innovative concept of higher education, one which would teach both the ornamental knowledge of the arts and the practical skills necessary for making a living and doing public service. The proposed program of study could have become the nation’s first modern liberal arts curriculum, although it was never implemented because Anglican priest William Smith (1727-1803), who became the first provost, and other trustees strongly preferred the traditional curriculum.

    Franklin assembled a board of trustees from among the leading citizens of Philadelphia, the first such non-sectarian board in America. At the first meeting of the 24 members of the board of trustees on November 13, 1749, the issue of where to locate the school was a prime concern. Although a lot across Sixth Street from the old Pennsylvania State House (later renamed and famously known since 1776 as “Independence Hall”), was offered without cost by James Logan, its owner, the trustees realized that the building erected in 1740, which was still vacant, would be an even better site. The original sponsors of the dormant building still owed considerable construction debts and asked Franklin’s group to assume their debts and, accordingly, their inactive trusts. On February 1, 1750, the new board took over the building and trusts of the old board. On August 13, 1751, the “Academy of Philadelphia”, using the great hall at 4th and Arch Streets, took in its first secondary students. A charity school also was chartered on July 13, 1753 by the intentions of the original “New Building” donors, although it lasted only a few years. On June 16, 1755, the “College of Philadelphia” was chartered, paving the way for the addition of undergraduate instruction. All three schools shared the same board of trustees and were considered to be part of the same institution. The first commencement exercises were held on May 17, 1757.

    The institution of higher learning was known as the College of Philadelphia from 1755 to 1779. In 1779, not trusting then-provost the Reverend William Smith’s “Loyalist” tendencies, the revolutionary State Legislature created a University of the State of Pennsylvania. The result was a schism, with Smith continuing to operate an attenuated version of the College of Philadelphia. In 1791, the legislature issued a new charter, merging the two institutions into a new University of Pennsylvania with twelve men from each institution on the new board of trustees.

    Penn has three claims to being the first university in the United States, according to university archives director Mark Frazier Lloyd: the 1765 founding of the first medical school in America made Penn the first institution to offer both “undergraduate” and professional education; the 1779 charter made it the first American institution of higher learning to take the name of “University”; and existing colleges were established as seminaries (although, as detailed earlier, Penn adopted a traditional seminary curriculum as well).

    After being located in downtown Philadelphia for more than a century, the campus was moved across the Schuylkill River to property purchased from the Blockley Almshouse in West Philadelphia in 1872, where it has since remained in an area now known as University City. Although Penn began operating as an academy or secondary school in 1751 and obtained its collegiate charter in 1755, it initially designated 1750 as its founding date; this is the year that appears on the first iteration of the university seal. Sometime later in its early history, Penn began to consider 1749 as its founding date and this year was referenced for over a century, including at the centennial celebration in 1849. In 1899, the board of trustees voted to adjust the founding date earlier again, this time to 1740, the date of “the creation of the earliest of the many educational trusts the University has taken upon itself”. The board of trustees voted in response to a three-year campaign by Penn’s General Alumni Society to retroactively revise the university’s founding date to appear older than Princeton University, which had been chartered in 1746.

    Research, innovations and discoveries

    Penn is classified as an “R1” doctoral university: “Highest research activity.” Its economic impact on the Commonwealth of Pennsylvania for 2015 amounted to $14.3 billion. Penn’s research expenditures in the 2018 fiscal year were $1.442 billion, the fourth largest in the U.S. In fiscal year 2019 Penn received $582.3 million in funding from the National Institutes of Health.

    In line with its well-known interdisciplinary tradition, Penn’s research centers often span two or more disciplines. In the 2010–2011 academic year alone, five interdisciplinary research centers were created or substantially expanded; these include the Center for Health-care Financing; the Center for Global Women’s Health at the Nursing School; the $13 million Morris Arboretum’s Horticulture Center; the $15 million Jay H. Baker Retailing Center at Wharton; and the $13 million Translational Research Center at Penn Medicine. With these additions, Penn now counts 165 research centers hosting a research community of over 4,300 faculty and over 1,100 postdoctoral fellows, 5,500 academic support staff and graduate student trainees. To further assist the advancement of interdisciplinary research President Amy Gutmann established the “Penn Integrates Knowledge” title awarded to selected Penn professors “whose research and teaching exemplify the integration of knowledge”. These professors hold endowed professorships and joint appointments between Penn’s schools.

    Penn is also among the most prolific producers of doctoral students. With 487 PhDs awarded in 2009, Penn ranks third in the Ivy League, only behind Columbia University and Cornell University (Harvard University did not report data). It also has one of the highest numbers of post-doctoral appointees (933 in number for 2004–2007), ranking third in the Ivy League (behind Harvard and Yale University) and tenth nationally.

    In most disciplines Penn professors’ productivity is among the highest in the nation and first in the fields of epidemiology, business, communication studies, comparative literature, languages, information science, criminal justice and criminology, social sciences and sociology. According to the National Research Council nearly three-quarters of Penn’s 41 assessed programs were placed in ranges including the top 10 rankings in their fields, with more than half of these in ranges including the top five rankings in these fields.

    Penn’s research tradition has historically been complemented by innovations that shaped higher education. In addition to establishing the first medical school; the first university teaching hospital; the first business school; and the first student union, Penn was also the cradle of other significant developments. In 1852, Penn Law was the first law school in the nation to publish a law journal still in existence (then called The American Law Register, now the Penn Law Review, one of the most cited law journals in the world). Under the deanship of William Draper Lewis, the law school was also one of the first schools to emphasize legal teaching by full-time professors instead of practitioners, a system that is still followed today. The Wharton School was home to several pioneering developments in business education. It established the first research center in a business school in 1921 and the first center for entrepreneurship in 1973, and it regularly introduced novel curricula for which BusinessWeek wrote, “Wharton is on the crest of a wave of reinvention and change in management education”.

    Several major scientific discoveries have also taken place at Penn. The university is probably best known as the place where the first general-purpose electronic computer (ENIAC) was born in 1946 at the Moore School of Electrical Engineering.

    ENIAC UPenn

    It was here also where the world’s first spelling and grammar checkers were created, as well as the popular COBOL programming language. Penn can also boast some of the most important discoveries in the field of medicine. The dialysis machine used as an artificial replacement for lost kidney function was conceived and devised out of a pressure cooker by William Inouye while he was still a student at Penn Med; the Rubella and Hepatitis B vaccines were developed at Penn; the discovery of cancer’s link with genes; cognitive therapy; Retin-A (the cream used to treat acne), Resistin; the Philadelphia gene (linked to chronic myelogenous leukemia) and the technology behind PET Scans were all discovered by Penn Med researchers. More recent gene research has led to the discovery of the genes for fragile X syndrome, the most common form of inherited mental retardation; spinal and bulbar muscular atrophy, a disorder marked by progressive muscle wasting; and Charcot–Marie–Tooth disease, a progressive neurodegenerative disease that affects the hands, feet and limbs.

    Conductive polymer was also developed at Penn by Alan J. Heeger, Alan MacDiarmid and Hideki Shirakawa, an invention that earned them the Nobel Prize in Chemistry. On faculty since 1965, Ralph L. Brinster developed the scientific basis for in vitro fertilization and the transgenic mouse at Penn and was awarded the National Medal of Science in 2010. The theory of superconductivity was also partly developed at Penn, by then-faculty member John Robert Schrieffer (along with John Bardeen and Leon Cooper). The university has also contributed major advancements in the fields of economics and management. Among the many discoveries are conjoint analysis, widely used as a predictive tool especially in market research; Simon Kuznets’s method of measuring Gross National Product; the Penn effect (the observation that consumer price levels in richer countries are systematically higher than in poorer ones) and the “Wharton Model” developed by Nobel-laureate Lawrence Klein to measure and forecast economic activity. The idea behind Health Maintenance Organizations also belonged to Penn professor Robert Eilers, who put it into practice during then-President Nixon’s health reform in the 1970s.

    International partnerships

    Students can study abroad for a semester or a year at partner institutions such as the London School of Economics (UK), University of Barcelona [Universitat de Barcelona] (ES), Paris Institute of Political Studies [Institut d’études politiques de Paris] (FR), University of Queensland (AU), University College London (UK), King’s College London (UK), Hebrew University of Jerusalem (IL), and University of Warwick (UK).

     
  • richardmitnick 8:59 am on December 13, 2022 Permalink | Reply
    Tags: "Building Trust with the Algorithms in Our Lives", , , , Machine learning,   

    From Yale University: “Building Trust with the Algorithms in Our Lives” 

    From Yale University

    1
    Credit: Sean David Williams.

    12.6.22
    Taly Reich

    Algorithms are omnipresent in our increasingly digital lives. They offer us new music and friends. They recommend books and clothing. They deliver information about the world. They help us find romantic partners one day, efficient commutes the next, cancer diagnoses the third.

    And yet most people display an aversion to algorithms. They don’t fully trust the recommendations made by computer programs. When asked, they prefer human predictions to those put forward by algorithms.

    “But given the growing prevalence of algorithms, it seems important we learn to trust and appreciate them,” says Taly Reich, associate professor at Yale SOM. “Is there an intervention that would help reduce this aversion?”

    New research conducted by Reich and two colleagues, Alex Kaju of HEC Montreal and Sam Maglio of the University of Toronto [Journal of Consumer Psychology (below)], finds that clearly demonstrating an algorithm’s ability to learn from past mistakes increases the trust that people place in the algorithm. It also inclines people to prefer the predictions made by algorithms over those made by humans.

    In arriving at this result, Reich drew on her foundational work on the value of mistakes [Organizational Behavior and Human Decision Processes (below)]. In a series of prior papers, Reich has established how mistakes, in the right context, can create benefits; people who make mistakes can come across as more knowledgeable and credible than people who don’t. Applying this insight to predictive models, Reich and her colleagues investigated whether framing algorithms as capable of learning from their mistakes enhanced trust in the recommendations that algorithms make.

    In one of several experiments, for instance, participants were asked whether a trained psychologist or an algorithm would be better at evaluating somebody’s personality. Under one condition, no further information was provided. In another condition, identical performance data for both the psychologist and the algorithm explicitly demonstrated improvement over time. In the first three months, each one was correct 60% of the time, incorrect 40% of the time; by six months, they were correct 70% of the time; and over the course of the first year the rate moved up to 80% correct.

    Absent information about the capacity to learn, participants chose a psychologist over an algorithm 75% of the time. But when shown how the algorithm improved over time, they chose it 66% of the time—more often than the human. Participants overcame any potential algorithm aversion and instead expressed what Reich and her colleagues term “algorithm appreciation,” or even “algorithm investment,” by choosing it at a higher rate than the human. These results held across several different cases, from selecting the best artwork to finding a well-matched romantic partner. In every instance, when the algorithm exhibited learning over time, it was trusted more often than its human counterpart.
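
    To get a feel for the size of that shift, the sketch below runs a standard two-proportion z-test on the reported choice shares. It is purely illustrative: the article gives only percentages, so the sample sizes (and therefore the test statistics) are hypothetical assumptions, not figures from the study.

```python
# Illustrative only: comparing the share of participants who chose the algorithm
# without vs. with the learning-over-time information. Sample sizes are hypothetical;
# the study itself reports only the percentages quoted in the article.
from math import sqrt
from statistics import NormalDist

n1 = n2 = 200                      # hypothetical participants per condition
x1 = round(0.25 * n1)              # no-information condition: 25% chose the algorithm
x2 = round(0.66 * n2)              # learning-demonstrated condition: 66% chose it
p1, p2 = x1 / n1, x2 / n2
p_pool = (x1 + x2) / (n1 + n2)     # pooled proportion under the null hypothesis
z = (p2 - p1) / sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
p_value = 2 * (1 - NormalDist().cdf(abs(z)))
print(f"algorithm chosen: {p1:.0%} vs {p2:.0%}; z = {z:.2f}, two-sided p = {p_value:.2g}")
```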

    Of course, Reich recognizes that companies often can’t or don’t want to disclose specific details about the accuracy of their algorithms. Most likely, they won’t break outcomes down to percentages and share these with consumers. “Importantly, though, this was a hybrid paper, where we cared about the practical implications as much as the theory,” she says. “Given constraints in the real world, we wanted to know whether there were more subtle methods for dispelling this notion that AI can’t learn.”

    The researchers explored whether small changes to how predictive software is described have an impact on choice. In one study, participants were asked whether they wanted to rely on themselves to judge the quality of a piece of art, or whether they wanted to rely on technology to do it for them. The technology was described as either an “algorithm” or a “machine-learning algorithm.” When given the choice of an “algorithm,” the majority of people chose themselves. When offered a “machine-learning algorithm” instead, the majority chose the technology. Simply providing a name suggestive of an algorithm’s ability to learn proved sufficient to overcome the lack of trust.

    For Reich, this presents a clear and practical takeaway for companies that rely, in one way or another, on predictive algorithms. Companies need to be aware that consumers, for the most part, harbor distrust of the recommendations made by algorithms. But this distrust, to a point, is readily overcome: a simple semantic nod toward an algorithm’s ability to learn will build greater trust with the consumers it serves.

    “If we understand that machines, like humans, can learn from their mistakes,” Reich says, “we won’t resist them as much.”

    Science papers:
    Organizational Behavior and Human Decision Processes 2018
    Journal of Consumer Psychology

    See the full article here .

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Yale University is a private Ivy League research university in New Haven, Connecticut. Founded in 1701 as the Collegiate School, it is the third-oldest institution of higher education in the United States and one of the nine Colonial Colleges chartered before the American Revolution. The Collegiate School was renamed Yale College in 1718 to honor the school’s largest private benefactor for the first century of its existence, Elihu Yale. Yale University is consistently ranked as one of the top universities and is considered one of the most prestigious in the nation.

    Chartered by Connecticut Colony, the Collegiate School was established in 1701 by clergy to educate Congregational ministers before moving to New Haven in 1716. Originally restricted to theology and sacred languages, the curriculum began to incorporate humanities and sciences by the time of the American Revolution. In the 19th century, the college expanded into graduate and professional instruction, awarding the first PhD in the United States in 1861 and organizing as a university in 1887. Yale’s faculty and student populations grew after 1890 with rapid expansion of the physical campus and scientific research.

    Yale is organized into fourteen constituent schools: the original undergraduate college, the Yale Graduate School of Arts and Sciences and twelve professional schools. While the university is governed by the Yale Corporation, each school’s faculty oversees its curriculum and degree programs. In addition to a central campus in downtown New Haven, the university owns athletic facilities in western New Haven, a campus in West Haven, Connecticut, and forests and nature preserves throughout New England. As of June 2020, the university’s endowment was valued at $31.1 billion, the second largest of any educational institution. The Yale University Library, serving all constituent schools, holds more than 15 million volumes and is the third-largest academic library in the United States. Students compete in intercollegiate sports as the Yale Bulldogs in the NCAA Division I – Ivy League.

    As of October 2020, 65 Nobel laureates, five Fields Medalists, four Abel Prize laureates, and three Turing award winners have been affiliated with Yale University. In addition, Yale has graduated many notable alumni, including five U.S. Presidents, 19 U.S. Supreme Court Justices, 31 living billionaires, and many heads of state. Hundreds of members of Congress and many U.S. diplomats, 78 MacArthur Fellows, 252 Rhodes Scholars, 123 Marshall Scholars, and nine Mitchell Scholars have been affiliated with the university.

    Research

    Yale is a member of the Association of American Universities (AAU) and is classified among “R1: Doctoral Universities – Very high research activity”. According to the National Science Foundation, Yale spent $990 million on research and development in 2018, ranking it 15th in the nation.

    Yale’s faculty include 61 members of the National Academy of Sciences, 7 members of the National Academy of Engineering and 49 members of the American Academy of Arts and Sciences. The college is, after normalization for institution size, the tenth-largest baccalaureate source of doctoral degree recipients in the United States, and the largest such source within the Ivy League.

    Yale’s English and Comparative Literature departments were part of the New Criticism movement. Of the New Critics, Robert Penn Warren, W.K. Wimsatt and Cleanth Brooks were all Yale faculty. Later, the Yale Comparative Literature department became a center of American deconstruction. Jacques Derrida, the father of deconstruction, taught in the Department of Comparative Literature from the late 1970s to the mid-1980s. Several other Yale faculty members were also associated with deconstruction, forming the so-called “Yale School”. These included Paul de Man, who taught in the Departments of Comparative Literature and French; J. Hillis Miller and Geoffrey Hartman, who both taught in the Departments of English and Comparative Literature; and Harold Bloom (English), whose theoretical position was always somewhat distinct and who ultimately took a very different path from the rest of the group. Yale’s history department has also originated important intellectual trends. Beginning in the 1960s and 1970s, historians C. Vann Woodward and David Brion Davis are credited with starting an important stream of southern historians; likewise, the labor historian David Montgomery advised many of the current generation of labor historians in the country. Yale’s Music School and Department fostered the growth of music theory in the latter half of the 20th century: the Journal of Music Theory was founded there in 1957, and Allen Forte and David Lewin were influential teachers and scholars.

    In addition to its eminent faculty, Yale research relies heavily on the presence of roughly 1,200 postdocs of various national and international origins working in laboratories across the sciences, social sciences, humanities and professional schools of the university. The university has progressively recognized this workforce with the recent creation of the Office for Postdoctoral Affairs and the Yale Postdoctoral Association.

    Notable alumni

    Over its history, Yale has produced many distinguished alumni in a variety of fields, ranging from the public to the private sector. According to 2020 data, around 71% of undergraduates join the workforce, while the next largest group, 16.6%, goes on to attend graduate or professional schools. Yale graduates have been recipients of 252 Rhodes Scholarships, 123 Marshall Scholarships, 67 Truman Scholarships, 21 Churchill Scholarships and 9 Mitchell Scholarships. The university is also the second-largest producer of Fulbright Scholars, with a total of 1,199 in its history, and has produced 89 MacArthur Fellows. The U.S. Department of State Bureau of Educational and Cultural Affairs ranked Yale fifth among research institutions producing the most 2020–2021 Fulbright Scholars. Additionally, 31 living billionaires are Yale alumni.

    At Yale, one of the most popular undergraduate majors among juniors and seniors is political science, with many students going on to careers in government and politics. Former presidents who attended Yale as undergraduates include William Howard Taft, George H. W. Bush and George W. Bush, while former presidents Gerald Ford and Bill Clinton attended Yale Law School. Former vice president and influential antebellum-era politician John C. Calhoun also graduated from Yale. Former world leaders include Italian prime minister Mario Monti, Turkish prime minister Tansu Çiller, Mexican president Ernesto Zedillo, German president Karl Carstens, Philippine president José Paciano Laurel, Latvian president Valdis Zatlers, Taiwanese premier Jiang Yi-huah and Malawian president Peter Mutharika, among others. Prominent royals who graduated include Crown Princess Victoria of Sweden and Olympia Bonaparte, Princess Napoléon.

    Yale alumni have had considerable presence in U.S. government in all three branches. On the U.S. Supreme Court, 19 justices have been Yale alumni, including current Associate Justices Sonia Sotomayor, Samuel Alito, Clarence Thomas, and Brett Kavanaugh. Numerous Yale alumni have been U.S. Senators, including current Senators Michael Bennet, Richard Blumenthal, Cory Booker, Sherrod Brown, Chris Coons, Amy Klobuchar, Ben Sasse, and Sheldon Whitehouse. Current and former cabinet members include Secretaries of State John Kerry, Hillary Clinton, Cyrus Vance, and Dean Acheson; U.S. Secretaries of the Treasury Oliver Wolcott, Robert Rubin, Nicholas F. Brady, Steven Mnuchin, and Janet Yellen; U.S. Attorneys General Nicholas Katzenbach, John Ashcroft, and Edward H. Levi; and many others. Peace Corps founder and American diplomat Sargent Shriver and public official and urban planner Robert Moses are Yale alumni.

    Yale has produced numerous award-winning authors and influential writers, like Nobel Prize in Literature laureate Sinclair Lewis and Pulitzer Prize winners Stephen Vincent Benét, Thornton Wilder, Doug Wright, and David McCullough. Academy Award winning actors, actresses, and directors include Jodie Foster, Paul Newman, Meryl Streep, Elia Kazan, George Roy Hill, Lupita Nyong’o, Oliver Stone, and Frances McDormand. Alumni from Yale have also made notable contributions to both music and the arts. Leading American composer from the 20th century Charles Ives, Broadway composer Cole Porter, Grammy award winner David Lang, and award-winning jazz pianist and composer Vijay Iyer all hail from Yale. Hugo Boss Prize winner Matthew Barney, famed American sculptor Richard Serra, President Barack Obama presidential portrait painter Kehinde Wiley, MacArthur Fellow and contemporary artist Sarah Sze, Pulitzer Prize winning cartoonist Garry Trudeau, and National Medal of Arts photorealist painter Chuck Close all graduated from Yale. Additional alumni include architect and Presidential Medal of Freedom winner Maya Lin, Pritzker Prize winner Norman Foster, and Gateway Arch designer Eero Saarinen. Journalists and pundits include Dick Cavett, Chris Cuomo, Anderson Cooper, William F. Buckley, Jr., and Fareed Zakaria.

    In business, Yale has had numerous alumni and former students go on to found influential companies, such as William Boeing (Boeing, United Airlines), Briton Hadden and Henry Luce (Time Magazine), Stephen A. Schwarzman (Blackstone Group), Frederick W. Smith (FedEx), Juan Trippe (Pan Am), Harold Stanley (Morgan Stanley), Bing Gordon (Electronic Arts) and Ben Silbermann (Pinterest). Other business people from Yale include former Sears Holdings chairman and CEO Edward Lampert, former Time Warner president Jeffrey Bewkes, former PepsiCo chairperson and CEO Indra Nooyi, sports agent Donald Dell, and investor and philanthropist Sir John Templeton.

    Yale alumni distinguished in academia include literary critic and historian Henry Louis Gates; economists Irving Fisher, Mahbub ul Haq, and Nobel Prize laureate Paul Krugman; Nobel Prize in Physics laureates Ernest Lawrence and Murray Gell-Mann; Fields Medalist John G. Thompson; Human Genome Project leader and National Institutes of Health director Francis S. Collins; brain surgery pioneer Harvey Cushing; pioneering computer scientist Grace Hopper; influential mathematician and chemist Josiah Willard Gibbs; National Women’s Hall of Fame inductee and biochemist Florence B. Seibert; Turing Award recipient Ron Rivest; inventors Samuel F.B. Morse and Eli Whitney; Nobel Prize in Chemistry laureate John B. Goodenough; lexicographer Noah Webster; and theologians Jonathan Edwards and Reinhold Niebuhr.

    In the sporting arena, Yale alumni include baseball players Ron Darling and Craig Breslow and baseball executives Theo Epstein and George Weiss; football players Calvin Hill, Gary Fencik, Amos Alonzo Stagg, and “the Father of American Football” Walter Camp; ice hockey players Chris Higgins and Olympian Helen Resor; Olympic figure skaters Sarah Hughes and Nathan Chen; nine-time U.S. Squash men’s champion Julian Illingworth; Olympic swimmer Don Schollander; Olympic rowers Josh West and Rusty Wailes; Olympic sailor Stuart McNay; Olympic runner Frank Shorter; and others.

     
  • richardmitnick 10:09 am on November 30, 2022 Permalink | Reply
    Tags: "Breaking the scaling limits of analog computing", "MZI": Mach-Zehnder Inferometers, An analog optical neural network could perform the same tasks as a digital one but optical neural networks can run many times faster while consuming less energy., An optical neural network is composed of many connected components that function like reprogrammable tunable mirrors., Analog devices are prone to hardware errors that can make computations less precise., Conventional digital computers are struggling to keep up., In an optical neural network that has many connected components errors can quickly accumulate., Machine learning, MIT researchers have overcome this hurdle and found a way to effectively scale an optical neural network., Multiplying with light, , , ,   

    From The Schwarzman College of Computing At The Massachusetts Institute of Technology: “Breaking the scaling limits of analog computing” 

    From The Schwarzman College of Computing

    At

    The Massachusetts Institute of Technology

    11.29.22
    Adam Zewe

    1
    MIT researchers have developed a technique that greatly reduces the error in an optical neural network, which uses light to process data instead of electrical signals. With their technique, the larger an optical neural network becomes, the lower the error in its computations. This could enable them to scale these devices up so they would be large enough for commercial uses.

    As machine-learning models become larger and more complex, they require faster and more energy-efficient hardware to perform computations. Conventional digital computers are struggling to keep up.

    An analog optical neural network could perform the same tasks as a digital one, such as image classification or speech recognition, but because computations are performed using light instead of electrical signals, optical neural networks can run many times faster while consuming less energy.

    However, these analog devices are prone to hardware errors that can make computations less precise. Microscopic imperfections in hardware components are one cause of these errors. In an optical neural network that has many connected components, errors can quickly accumulate.

    Even with error-correction techniques, due to fundamental properties of the devices that make up an optical neural network, some amount of error is unavoidable. A network that is large enough to be implemented in the real world would be far too imprecise to be effective.

    MIT researchers have overcome this hurdle and found a way to effectively scale an optical neural network. By adding a tiny hardware component to the optical switches that form the network’s architecture, they can reduce even the uncorrectable errors that would otherwise accumulate in the device.

    Their work could enable a super-fast, energy-efficient, analog neural network that can function with the same accuracy as a digital one. With this technique, as an optical circuit becomes larger, the amount of error in its computations actually decreases.

    “This is remarkable, as it runs counter to the intuition of analog systems, where larger circuits are supposed to have higher errors, so that errors set a limit on scalability. This present paper allows us to address the scalability question of these systems with an unambiguous ‘yes,’” says lead author Ryan Hamerly, a visiting scientist in the MIT Research Laboratory of Electronics (RLE) and Quantum Photonics Laboratory and senior scientist at NTT Research.

    Hamerly’s co-authors are graduate student Saumil Bandyopadhyay and senior author Dirk Englund, an associate professor in the MIT Department of Electrical Engineering and Computer Science (EECS), leader of the Quantum Photonics Laboratory, and member of the RLE. The research is published today in Nature Communications [below].

    Multiplying with light

    An optical neural network is composed of many connected components that function like reprogrammable, tunable mirrors. These tunable mirrors are called Mach-Zehnder interferometers (MZIs). Neural network data are encoded into light, which is fired into the optical neural network from a laser.

    A typical MZI contains two mirrors and two beam splitters. Light enters the top of an MZI, where it is split into two parts that interfere with each other before being recombined by the second beam splitter and then reflected out the bottom to the next MZI in the array. Researchers can leverage the interference of these optical signals to perform linear algebra operations such as matrix multiplication, which is how neural networks process data.
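
    As a rough picture of the linear algebra involved, the sketch below models an idealized MZI as a 2x2 unitary matrix: two perfect 50:50 beam splitters around a programmable internal phase shifter. This is a textbook simplification rather than the authors’ device model, and the phase conventions are assumptions chosen for clarity; the point is that tuning the phase steers input power between the two output ports, and meshes of such 2x2 blocks compose into the larger matrix multiplications a neural-network layer needs.

```python
# Illustrative only: an idealized MZI as a programmable 2x2 unitary.
import numpy as np

def beam_splitter():
    """Ideal 50:50 beam splitter acting on two optical modes."""
    return np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)

def mzi(theta, phi=0.0):
    """Two 50:50 splitters around an internal phase theta, plus an external phase phi."""
    internal = np.diag([np.exp(1j * theta), 1.0])
    external = np.diag([np.exp(1j * phi), 1.0])
    return beam_splitter() @ internal @ beam_splitter() @ external

light_in = np.array([1.0, 0.0])            # all light enters the top port
for theta in (0.0, np.pi / 2, np.pi):
    powers = np.abs(mzi(theta) @ light_in) ** 2
    print(f"theta = {theta:4.2f} -> output powers {np.round(powers, 3)}")
```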

    Fig. 3: 3-splitter MZI design and simulated performance.
    3
    a) Schematic of 3-MZI. b) Splitter Möbius transformation on s∈C, which pushes the forbidden regions away from s = {0, ∞}, corresponding to a Riemann sphere rotation. c) Dependence of matrix error E0, Ec on the splitter variation σ, contrasting the standard and 3-splitter MZIs (fixed mesh size N = 256). d) Scaling of corrected error Ec with mesh size N, showing the qualitative scaling difference between MZI and 3-MZI (fixed splitter variation σ = 0.05). e) Corrected error Ec as function of both σ and N. The sudden onset of “perfect” hardware error correction (Ec=0) occurs when the coverage approaches unity (C≈1).

    But errors that can occur in each MZI quickly accumulate as light moves from one device to the next. One can avoid some errors by identifying them in advance and tuning the MZIs so earlier errors are cancelled out by later devices in the array.

    “It is a very simple algorithm if you know what the errors are. But these errors are notoriously difficult to ascertain because you only have access to the inputs and outputs of your chip,” says Hamerly. “This motivated us to look at whether it is possible to create calibration-free error correction.”

    Hamerly and his collaborators previously demonstrated a mathematical technique that went a step further. They could successfully infer the errors and correctly tune the MZIs accordingly, but even this didn’t remove all the error.

    Due to the fundamental nature of an MZI, there are instances where it is impossible to tune a device so all light flows out the bottom port to the next MZI. If the device loses a fraction of light at each step and the array is very large, by the end there will only be a tiny bit of power left.

    “Even with error correction, there is a fundamental limit to how good a chip can be. MZIs are physically unable to realize certain settings they need to be configured to,” he says.
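
    The sketch below puts a number on that limit, using the same simplified 2x2 model as above: if the two beam splitters deviate from a perfect 50:50 split, no internal phase setting can route all of the light to the cross port. The error values are exaggerated assumptions chosen for visibility, not figures from the paper.

```python
# Illustrative only: with imperfect beam splitters, a standard two-splitter MZI
# cannot be tuned to send 100% of the light out of one port -- a toy version of the
# "uncorrectable" error discussed in the article. Error magnitudes are made up.
import numpy as np

def splitter(angle):
    """General two-mode coupler; angle = pi/4 corresponds to a perfect 50:50 split."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, 1j * s], [1j * s, c]])

def mzi(theta, err=0.0):
    """Standard MZI whose two splitters both deviate from 50:50 by 'err' radians."""
    internal = np.diag([np.exp(1j * theta), 1.0])
    return splitter(np.pi / 4 + err) @ internal @ splitter(np.pi / 4 + err)

thetas = np.linspace(0.0, 2.0 * np.pi, 2001)
for err in (0.0, 0.05, 0.2):
    best = max(abs((mzi(t, err) @ np.array([1.0, 0.0]))[1]) ** 2 for t in thetas)
    print(f"splitter error {err:.2f} rad -> best cross-port transmission {best:.4f}")
```

    In this toy model the best achievable cross-port power falls to roughly cos²(2·err), which is the kind of shortfall the extra beam splitter in the 3-MZI design is meant to remove.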

    So, the team developed a new type of MZI. The researchers added an additional beam splitter to the end of the device, calling it a 3-MZI because it has three beam splitters instead of two. Due to the way this additional beam splitter mixes the light, it becomes much easier for an MZI to reach the setting it needs to send all of the light out through its bottom port.

    Importantly, the additional beam splitter is only a few micrometers in size and is a passive component, so it doesn’t require any extra wiring. Adding additional beam splitters doesn’t significantly change the size of the chip.

    Bigger chip, fewer errors

    When the researchers conducted simulations to test their architecture, they found that it can eliminate much of the uncorrectable error that hampers accuracy. And as the optical neural network becomes larger, the amount of error in the device actually drops — the opposite of what happens in a device with standard MZIs.

    Using 3-MZIs, they could potentially create a device big enough for commercial uses with error that has been reduced by a factor of 20, Hamerly says.

    The researchers also developed a variant of the MZI design specifically for correlated errors. These occur due to manufacturing imperfections — if the thickness of a chip is slightly wrong, the MZIs may all be off by about the same amount, so the errors are all about the same. They found a way to change the configuration of an MZI to make it robust to these types of errors. This technique also increased the bandwidth of the optical neural network so it can run three times faster.

    Now that they have showcased these techniques using simulations, Hamerly and his collaborators plan to test these approaches on physical hardware and continue driving toward an optical neural network they can effectively deploy in the real world.

    This research is funded, in part, by a National Science Foundation graduate research fellowship and the U.S. Air Force Office of Scientific Research.

    Science paper:
    Nature Communications
    See the science paper for instructive material with images.

    See the full article here .

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The MIT Stephen A. Schwarzman College of Computing is a college at the Massachusetts Institute of Technology (MIT), located in Cambridge, Massachusetts. Announced in 2018 to address the growing applications of computing technology, the college is an Institute-wide academic unit that works alongside MIT’s five Schools of Architecture and Planning, Engineering, Humanities, Arts, and Social Sciences, Science, and Management. The college emphasizes artificial intelligence research, interdisciplinary applications of computing, and social and ethical responsibilities of computing. It aims to be an interdisciplinary hub for work in artificial intelligence, computer science, data science, and related fields. Its creation was the first significant change to MIT’s academic structure since the early 1950s.

    The MIT Schwarzman College of Computing is named after The Blackstone Group chairman Stephen A. Schwarzman, who donated $350 million of the college’s $1.1 billion funding commitment. The college’s funding sources were met with criticism, with students and staff contrasting MIT’s stated emphasis on ethics against Schwarzman’s controversial business practices and support for Donald Trump.

    Academics and research

    The Schwarzman College of Computing has one academic department and several research enterprises which also have degree programs:

    Department of Electrical Engineering and Computer Science (EECS, more commonly known at MIT as Course 6), which is jointly administered with the School of Engineering. Upon creation of the college, the department, formerly housed solely in the School of Engineering, was reorganized into three “overlapping subunits”:
    Electrical Engineering (EE)
    Computer Science (CS)
    Artificial Intelligence and Decision-Making (AI+D)
    Operations Research Center (ORC), jointly administered with the MIT Sloan School of Management
    Institute for Data, Systems and Society (IDSS)
    Technology and Policy Program (TPP, a degree program)
    Sociotechnical Systems Research Center (SSRC)
    Center for Computational Science and Engineering (CCSE, renamed from Center for Computational Engineering upon formation of the college)

    The non-degree-granting research labs which are part of the college are:

    MIT Computer Science and Artificial Intelligence Laboratory (CSAIL)
    MIT Laboratory for Information and Decision Systems (LIDS)
    Quest for Intelligence
    MIT-IBM Watson AI Lab
    MIT Abdul Latif Jameel Clinic for Machine Learning in Health

    The establishment of the college added 50 new faculty positions to the university. Half of these positions focus on computer science, while the other half are jointly appointed in collaboration with departments across the Schools of Architecture and Planning; Engineering; Humanities, Arts, and Social Sciences; Science; and Management. The New York Times described the college’s structure as an effort to “alter traditional academic thinking and practice” and allow the university to more effectively bring computing to other fields.

    The creation of the College of Computing also started the development of three additional programs meant to integrate closely with other MIT computing activities, for which plans have not been finalized:

    Social and Ethical Responsibilities of Computing (SERC) aims to develop “responsible habits of mind and action” regarding computing technology. SERC facilitates the teaching of ethics throughout MIT courses, conducts research in social, ethical, and policy implications of technology, and coordinates public forums regarding technology and public policy.
    Common Ground for Computing Education coordinates interdepartmental teaching in computing, supporting interdisciplinary courses, majors, and minors on computing and its applications.
    Center for Advanced Studies of Computing hosts research fellows and assists project-oriented programs in computing-related topics.

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and the Whitehead Institute.

    Massachusetts Institute of Technology-Haystack Observatory, Westford, Massachusetts, USA, Altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities.

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but a combination with elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863, under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    The Massachusetts Institute of Technology‘s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at Massachusetts Institute of Technology ‘s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, The Massachusetts Institute of Technology became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and The Massachusetts Institute of Technology ‘s defense research. In this period Massachusetts Institute of Technology’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced LIGO (aLIGO) was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and was funded by the National Science Foundation.

    Caltech/MIT Advanced LIGO (aLIGO)

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in Physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique, which served as the essential blueprint for LIGO.

    The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 4:18 pm on November 29, 2022 Permalink | Reply
    Tags: "AlphaFold" is an AI algorithm developed by Google "DeepMind"., "IAP": interatomic potential, "Matterverse.ai", "Nanoengineers Develop a Predictive Database for Materials", A breakthrough algorithm expands the exploration space for materials by orders of magnitude., , , M3GNet, Machine learning, , , Scientists need to know the structure of a material to predict its properties., , The properties of a material are determined by the arrangement of its atoms., , U.S. Department of Energy Office of Basic Energy Sciences,   

    From The Jacobs School of Engineering At The University of California-San Diego: “Nanoengineers Develop a Predictive Database for Materials” 


    From The Jacobs School of Engineering

    At

    The University of California-San Diego

    11.28.22
    By:
    Emerson Dameron
    edameron@ucsd.edu

    Media Contact:
    Daniel Kane
    dbkane@ucsd.edu

    A breakthrough algorithm expands the exploration space for materials by orders of magnitude.

    1
    Nanoengineers Develop a Predictive Database for Materials. Credit: iStock.

    Nanoengineers at the University of California San Diego’s Jacobs School of Engineering have developed an AI algorithm that predicts the structure and dynamic properties of any material—whether existing or new—almost instantaneously. Known as M3GNet, the algorithm was used to develop matterverse.ai, a database of more than 31 million yet-to-be-synthesized materials with properties predicted by machine learning algorithms. Matterverse.ai facilitates the discovery of new technological materials with exceptional properties. 

    The team behind M3GNet, led by UC San Diego nanoengineering professor Shyue Ping Ong, uses matterverse.ai and the new capabilities of M3GNet in their search for safer and more energy-dense electrodes and electrolytes for rechargeable lithium-ion batteries. The project is explored in the November issue of Nature Computational Science [below].

    The properties of a material are determined by the arrangement of its atoms. However, existing approaches to obtain that arrangement are either prohibitively expensive or ineffective for many elements. 

    1
    Schematic courtesy of Shyue Ping Ong, Jacobs School of Engineering, University of California-San Diego.

    “Similar to proteins, we need to know the structure of a material to predict its properties,” said Ong, the associate director of the Sustainable Power and Energy Center at the Jacobs School of Engineering. “What we need is an AlphaFold for materials.”

    “AlphaFold” is an AI algorithm developed by “Google DeepMind” to predict protein structure. To build the equivalent for materials, Ong and his team combined graph neural networks with many-body interactions to build a deep learning architecture that works universally, with high accuracy, across all the elements of the periodic table. 

    “Mathematical graphs are really natural representations of a collection of atoms,” said Chi Chen, a former senior project scientist in Ong’s lab and first author of the work, who is now a senior quantum architect at Microsoft Quantum. “Using graphs, we can represent the full complexity of materials without being subject to the combinatorial explosion of terms in traditional formalisms.”
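
    To make the “atoms as a graph” idea concrete, here is a minimal sketch that turns a small cluster of atoms into nodes and edges by connecting every pair within a cutoff radius. It is a toy illustration only; production models such as M3GNet also handle periodic boundary conditions, three-body interactions and learned node and edge features.

```python
# Illustrative only: a toy atoms-to-graph conversion (no periodic boundaries,
# no learned features). Positions are in angstroms; the cutoff is arbitrary.
import numpy as np

def build_graph(positions, species, cutoff=3.0):
    """Return node labels and a directed edge list (i, j, distance) within a cutoff."""
    positions = np.asarray(positions, dtype=float)
    edges = []
    for i in range(len(positions)):
        for j in range(len(positions)):
            if i == j:
                continue
            d = float(np.linalg.norm(positions[i] - positions[j]))
            if d <= cutoff:
                edges.append((i, j, d))
    return list(species), edges

# A hypothetical 8-atom cluster on a 2-angstrom grid, alternating Li and O.
positions = [(x, y, z) for x in (0, 2) for y in (0, 2) for z in (0, 2)]
species = ["Li" if i % 2 == 0 else "O" for i in range(len(positions))]
nodes, edges = build_graph(positions, species, cutoff=3.0)
print(f"{len(nodes)} nodes, {len(edges)} directed edges within the cutoff")
```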

    To train their model, the team used the huge database of materials energies, forces and stresses collected in the Materials Project over the past decade. The result is the M3GNet interatomic potential (IAP), which can predict the energies and forces in any collection of atoms. Matterverse.ai was generated through combinatorial elemental substitutions on more than 5,000 structural prototypes in the Inorganic Crystal Structure Database (ICSD). The M3GNet IAP was then used to obtain the equilibrium crystal structure—a process called “relaxation”—for property prediction. 
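
    For readers who want to try the released code, a relaxation call looks roughly like the sketch below. The import paths, class names and result keys are assumptions based on the project’s public README at the time of release and may differ in later versions, so treat this as an illustrative outline rather than authoritative documentation.

```python
# Assumed usage of the open-source m3gnet package (pip install m3gnet pymatgen);
# class and key names follow the project's README and may have changed since.
from pymatgen.core import Lattice, Structure
from m3gnet.models import Relaxer

# A deliberately distorted bcc-like Mo cell standing in for a substituted prototype.
structure = Structure(Lattice.cubic(3.3), ["Mo", "Mo"],
                      [[0.0, 0.0, 0.0], [0.51, 0.49, 0.50]])

relaxer = Relaxer()                    # loads the default pre-trained M3GNet potential
result = relaxer.relax(structure)      # drives atoms and cell toward an energy minimum
relaxed = result["final_structure"]    # assumed result key
print("relaxed lattice parameters:", relaxed.lattice.abc)
```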

    Of the 31 million materials in matterverse.ai today, more than a million are predicted to be potentially stable. Ong and his team intend to greatly expand not just the number of materials, but also the number of ML-predicted properties, including high-value properties with small data sizes using a multi-fidelity approach they developed earlier.

    Beyond structural relaxations, the M3GNet IAP also has broad applications in dynamic simulations of materials and in property prediction.

    “For instance, we are often interested in how fast lithium ions diffuse in a lithium-ion battery electrode or electrolyte. The faster the diffusion, the more quickly you can charge or discharge a battery,” Ong said. “We have shown that the M3GNet IAP can be used to predict the lithium conductivity of a material with good accuracy. We truly believe that the M3GNet architecture is a transformative tool that can greatly expand our ability to explore new material chemistries and structures.”
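
    As a flavor of the post-processing such dynamic simulations involve, the sketch below estimates a diffusion coefficient from a trajectory using the Einstein relation (mean squared displacement ≈ 6·D·t) and converts it to an ionic conductivity with the Nernst–Einstein equation. The trajectory is synthetic random-walk data and every parameter is an assumed placeholder; this is not the authors’ analysis pipeline.

```python
# Illustrative only: diffusivity from a (synthetic) trajectory via the Einstein
# relation, then a Nernst-Einstein conductivity estimate. All values are made up.
import numpy as np

k_B = 1.380649e-23          # Boltzmann constant, J/K
e_charge = 1.602176634e-19  # elementary charge, C

rng = np.random.default_rng(1)
n_steps, n_ions, dt = 20000, 64, 2e-15                      # 2 fs timestep (assumed)
steps = rng.normal(scale=2e-12, size=(n_steps, n_ions, 3))  # per-step displacements, m
traj = np.cumsum(steps, axis=0)                             # positions relative to start

t = np.arange(1, n_steps + 1) * dt
msd = np.mean(np.sum(traj ** 2, axis=2), axis=1)       # mean squared displacement
D = np.polyfit(t, msd, 1)[0] / 6.0                     # Einstein relation: slope / 6

n_carriers = 2.0e28        # Li ions per cubic meter (assumed)
T = 300.0                  # temperature, K
sigma = n_carriers * e_charge ** 2 * D / (k_B * T)     # Nernst-Einstein estimate
print(f"D = {D:.2e} m^2/s, sigma = {sigma:.2e} S/m")
```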

    To promote the use of M3GNet, the team has released the framework as open-source Python code on GitHub. Since posting the preprint on arXiv in February 2022, the team has received interest from both academic and industry researchers. There are plans to integrate the M3GNet IAP as a tool in commercial materials-simulation packages.

    This work was authored by Chi Chen and Shyue Ping Ong at The University of California-San Diego. The research was primarily funded by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Materials Sciences and Engineering Division under the Materials Project program. Part of the work was funded by LG Energy Solution through the Frontier Research Laboratory Program. This work used the Extreme Science and Engineering Discovery Environment (XSEDE).

    Science paper:
    Nature Computational Science

     

    See the full article here .

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About the Jacobs School of Engineering
    Innovation Happens Here

    The University of California-San Diego Jacobs School of Engineering is a premier research school set apart by our entrepreneurial culture and integrative engineering approach.

    The Jacobs School’s Mission:

    Educate Tomorrow’s Technology Leaders
    Conduct Leading Edge Research and Drive Innovation
    Transfer Discoveries for the Benefit of Society

    The Jacobs School’s Values:

    Engineering for the global good
    Exponential impact through entrepreneurism
    Collaboration to enrich relevance
    Our education models focus on deep and broad engineering fundamentals, enhanced by real-world design and research, often in partnership with industry. Through our Team Internship Program and GlobalTeams in Engineering Service program, for example, we encourage students to develop their communications and leadership skills while working in the kind of multi-disciplinary team environment experienced by real-world engineers.

    We are home to exciting research centers, such as the San Diego Supercomputer Center, a national resource for data-intensive computing; our Powell Structural Research Laboratories, the largest and most active in the world for full-scale structural testing; and the Qualcomm Institute, which is the UC San Diego division of the California Institute for Telecommunications and Information Technology (Calit2), which is forging new ground in multi-disciplinary applications for information technology.

    Located at the hub of San Diego’s thriving information technology, biotechnology, clean technology, and nanotechnology sectors, the Jacobs School proactively seeks corporate partners to collaborate with us in research, education and innovation.

    The University of California-San Diego

The University of California-San Diego is a public research university located in the La Jolla area of San Diego, California, in the United States. The university occupies 2,141 acres (866 ha) near the coast of the Pacific Ocean, with the main campus resting on approximately 1,152 acres (466 ha). Established in 1960 near the pre-existing Scripps Institution of Oceanography, The University of California-San Diego is the seventh oldest of the 10 University of California campuses and offers over 200 undergraduate and graduate degree programs, enrolling about 22,700 undergraduate and 6,300 graduate students. The University of California-San Diego is one of America’s “Public Ivy” universities, a designation that recognizes top public research universities in the United States. The University of California-San Diego was ranked 8th among public universities and 37th among all universities in the United States, and rated the 18th Top World University by U.S. News & World Report’s 2015 rankings.

    The University of California-San Diego is organized into seven undergraduate residential colleges (Revelle; John Muir; Thurgood Marshall; Earl Warren; Eleanor Roosevelt; Sixth; and Seventh), four academic divisions (Arts and Humanities; Biological Sciences; Physical Sciences; and Social Sciences), and seven graduate and professional schools (Jacobs School of Engineering; Rady School of Management; Scripps Institution of Oceanography; School of Global Policy and Strategy; School of Medicine; Skaggs School of Pharmacy and Pharmaceutical Sciences; and the newly established Wertheim School of Public Health and Human Longevity Science). University of California-San Diego Health, the region’s only academic health system, provides patient care; conducts medical research; and educates future health care professionals at the University of California-San Diego Medical Center, Hillcrest; Jacobs Medical Center; Moores Cancer Center; Sulpizio Cardiovascular Center; Shiley Eye Institute; Institute for Genomic Medicine; Koman Family Outpatient Pavilion and various express care and urgent care clinics throughout San Diego.

    The university operates 19 organized research units (ORUs), including the Center for Energy Research; Qualcomm Institute (a branch of the California Institute for Telecommunications and Information Technology); San Diego Supercomputer Center; and the Kavli Institute for Brain and Mind, as well as eight School of Medicine research units, six research centers at Scripps Institution of Oceanography and two multi-campus initiatives, including the Institute on Global Conflict and Cooperation. The University of California-San Diego is also closely affiliated with several regional research centers, such as the Salk Institute; the Sanford Burnham Prebys Medical Discovery Institute; the Sanford Consortium for Regenerative Medicine; and the Scripps Research Institute. It is classified among “R1: Doctoral Universities – Very high research activity”. According to the National Science Foundation, UC San Diego spent $1.265 billion on research and development in fiscal year 2018, ranking it 7th in the nation.

    The University of California-San Diego is considered one of the country’s “Public Ivies”. As of February 2021, The University of California-San Diego faculty, researchers and alumni have won 27 Nobel Prizes and three Fields Medals, eight National Medals of Science, eight MacArthur Fellowships, and three Pulitzer Prizes. Additionally, of the current faculty, 29 have been elected to the National Academy of Engineering, 70 to the National Academy of Sciences, 45 to the National Academy of Medicine and 110 to the American Academy of Arts and Sciences.

    History

    When the Regents of the University of California originally authorized the San Diego campus in 1956, it was planned to be a graduate and research institution, providing instruction in the sciences, mathematics, and engineering. Local citizens supported the idea, voting the same year to transfer to the university 59 acres (24 ha) of mesa land on the coast near the preexisting Scripps Institution of Oceanography. The Regents requested an additional gift of 550 acres (220 ha) of undeveloped mesa land northeast of Scripps, as well as 500 acres (200 ha) on the former site of Camp Matthews from the federal government, but Roger Revelle, then director of Scripps Institution and main advocate for establishing the new campus, jeopardized the site selection by exposing the La Jolla community’s exclusive real estate business practices, which were antagonistic to minority racial and religious groups. This outraged local conservatives, as well as Regent Edwin W. Pauley.

    University of California President Clark Kerr satisfied San Diego city donors by changing the proposed name from University of California, La Jolla, to University of California-San Diego. The city voted in agreement to its part in 1958, and the University of California approved construction of the new campus in 1960. Because of the clash with Pauley, Revelle was not made chancellor. Herbert York, first director of DOE’s Lawrence Livermore National Laboratory, was designated instead. York planned the main campus according to the “Oxbridge” model, relying on many of Revelle’s ideas.

    According to Kerr, “San Diego always asked for the best,” though this created much friction throughout the University of California system, including with Kerr himself, because University of California-San Diego often seemed to be “asking for too much and too fast.” Kerr attributed University of California-San Diego’s “special personality” to Scripps, which for over five decades had been the most isolated University of California unit in every sense: geographically, financially, and institutionally. It was a great shock to the Scripps community to learn that Scripps was now expected to become the nucleus of a new University of California campus and would now be the object of far more attention from both the university administration in Berkeley and the state government in Sacramento.

    The University of California-San Diego was the first general campus of the University of California to be designed “from the top down” in terms of research emphasis. Local leaders disagreed on whether the new school should be a technical research institute or a more broadly based school that included undergraduates as well. John Jay Hopkins of General Dynamics Corporation pledged one million dollars for the former while the City Council offered free land for the latter. The original authorization for the University of California-San Diego campus given by the University of California Regents in 1956 approved a “graduate program in science and technology” that included undergraduate programs, a compromise that won both the support of General Dynamics and the city voters’ approval.

    Nobel laureate Harold Urey, a physicist from the University of Chicago, and Hans Suess, who had published the first paper on the greenhouse effect with Revelle in the previous year, were early recruits to the faculty in 1958. Maria Goeppert-Mayer, later the second female Nobel laureate in physics, was appointed professor of physics in 1960. The graduate division of the school opened in 1960 with 20 faculty in residence, with instruction offered in the fields of physics, biology, chemistry, and earth science. Before the main campus completed construction, classes were held in the Scripps Institution of Oceanography.

By 1963, new facilities on the mesa had been finished for the School of Science and Engineering, and new buildings were under construction for Social Sciences and Humanities. Ten additional faculty in those disciplines were hired, and the whole site was designated the First College of the new campus, later renamed after Roger Revelle. York resigned as chancellor that year and was replaced by John Semple Galbraith. The undergraduate program accepted its first class of 181 freshmen at Revelle College in 1964. Second College was founded in 1964, on the land deeded by the federal government, and named after environmentalist John Muir two years later. The University of California-San Diego School of Medicine also accepted its first students in 1966.

    Political theorist Herbert Marcuse joined the faculty in 1965. A champion of the New Left, he reportedly was the first protester to occupy the administration building in a demonstration organized by his student, political activist Angela Davis. The American Legion offered to buy out the remainder of Marcuse’s contract for $20,000; the Regents censured Chancellor William J. McGill for defending Marcuse on the basis of academic freedom, but further action was averted after local leaders expressed support for Marcuse. Further student unrest was felt at the university, as the United States increased its involvement in the Vietnam War during the mid-1960s, when a student raised a Viet Minh flag over the campus. Protests escalated as the war continued and were only exacerbated after the National Guard fired on student protesters at Kent State University in 1970. Over 200 students occupied Urey Hall, with one student setting himself on fire in protest of the war.

    Early research activity and faculty quality, notably in the sciences, was integral to shaping the focus and culture of the university. Even before The University of California-San Diego had its own campus, faculty recruits had already made significant research breakthroughs, such as the Keeling Curve, a graph that plots rapidly increasing carbon dioxide levels in the atmosphere and was the first significant evidence for global climate change; the Kohn–Sham equations, used to investigate particular atoms and molecules in quantum chemistry; and the Miller–Urey experiment, which gave birth to the field of prebiotic chemistry.

Engineering, particularly computer science, became an important part of the university’s academics as it matured. University researchers helped develop University of California-San Diego Pascal, an early machine-independent programming language that later heavily influenced Java; the National Science Foundation Network, a precursor to the Internet; and the Network News Transfer Protocol during the late 1970s to 1980s. In economics, methods for analyzing economic time series with time-varying volatility (ARCH) and with common trends (cointegration) were developed here. The University of California-San Diego maintained its research-intensive character after its founding, accumulating 25 affiliated Nobel laureates within its first 50 years, a rate of five per decade.

    Under Richard C. Atkinson’s leadership as chancellor from 1980 to 1995, the university strengthened its ties with the city of San Diego by encouraging technology transfer with developing companies, transforming San Diego into a world leader in technology-based industries. He oversaw a rapid expansion of the School of Engineering, later renamed after Qualcomm founder Irwin M. Jacobs, with the construction of the San Diego Supercomputer Center and establishment of the computer science, electrical engineering, and bioengineering departments. Private donations increased from $15 million to nearly $50 million annually, faculty expanded by nearly 50%, and enrollment doubled to about 18,000 students during his administration. By the end of his chancellorship, the quality of The University of California-San Diego graduate programs was ranked 10th in the nation by the National Research Council.

The university continued to undergo further expansion during the first decade of the new millennium with the establishment and construction of two new professional schools — the Skaggs School of Pharmacy and the Rady School of Management — and the California Institute for Telecommunications and Information Technology, a research institute run jointly with The University of California-Irvine. The University of California-San Diego also reached two financial milestones during this time, becoming the first university in the western region to raise over $1 billion in its eight-year fundraising campaign in 2007 and also obtaining an additional $1 billion through research contracts and grants in a single fiscal year for the first time in 2010. Despite this, due to the California budget crisis, the university borrowed $40 million against its own assets in 2009 to offset a significant reduction in state educational appropriations. The salary of Pradeep Khosla, who became chancellor in 2012, has been the subject of controversy amidst continued budget cuts and tuition increases.

On November 27, 2017, the university announced it would leave its longtime athletic home of the California Collegiate Athletic Association, an NCAA Division II league, to begin a transition to Division I in 2020, joining the Big West Conference, already home to four other UC campuses (Davis, Irvine, Riverside, Santa Barbara). The university began NCAA Division I competition on July 1, 2020, with the transition period running through the 2023–24 school year.

    Research

    Applied Physics and Mathematics

The Nature Index lists The University of California-San Diego as 6th in the United States for research output by article count in 2019. In 2017, The University of California-San Diego spent $1.13 billion on research, the 7th highest expenditure among academic institutions in the U.S. The university operates several organized research units, including the Center for Astrophysics and Space Sciences (CASS), the Center for Drug Discovery Innovation, and the Institute for Neural Computation. The University of California-San Diego also maintains close ties to the nearby Scripps Research Institute and Salk Institute for Biological Studies. In 1977, The University of California-San Diego developed and released the University of California-San Diego Pascal programming language. The university was designated as one of the original national Alzheimer’s disease research centers in 1984 by the National Institute on Aging. In 2018, The University of California-San Diego received $10.5 million from the DOE National Nuclear Security Administration to establish the Center for Matter under Extreme Conditions (CMEC).

The university founded the San Diego Supercomputer Center (SDSC) in 1985, which provides high performance computing for research in various scientific disciplines. In 2000, The University of California-San Diego partnered with The University of California-Irvine to create the California Institute for Telecommunications and Information Technology (Calit2), whose UC San Diego division is now the Qualcomm Institute; it integrates research in photonics, nanotechnology, and wireless telecommunication to develop solutions to problems in energy, health, and the environment.

    The University of California-San Diego also operates the Scripps Institution of Oceanography, one of the largest centers of research in earth science in the world, which predates the university itself. Together, SDSC and SIO, along with funding partner universities California Institute of Technology, San Diego State University, and The University of California-Santa Barbara, manage the High Performance Wireless Research and Education Network.

     
  • richardmitnick 9:04 am on November 23, 2022 Permalink | Reply
    Tags: "Accelerating 3D imaging", "Open-top light-sheet microscopy", 3D pathology could enable more accurate identification., , Currently most pathologists use a 2D method of imaging tissues allowing a view of only a small fraction of a sample in 2D and can lead to inaccurate diagnoses and suboptimal treatments., , Machine learning, , , The Department of Mechanical Engineering,   

From The Department of Mechanical Engineering In The College of Engineering At The University of Washington: “Accelerating 3D imaging”

    From The Department of Mechanical Engineering

    In

    The College of Engineering

    At

    The University of Washington

    10.31.22 [Just now in social media.]

    In 2017, the Molecular Biophotonics Laboratory – led by ME professor Jonathan Liu – pioneered a now-patented technology called “open-top light-sheet microscopy”. The 3D imaging method enables clinicians to see a complete microscopic view of tissue specimens, such as a biopsy or surgically removed tumor, which could improve how diseases are diagnosed and treated.

    1
    A 3D pathology dataset of a prostate biopsy stained with a fluorescent analogue of H&E. Deep learning-based image translation was used to convert the H&E dataset into a synthetic dataset that looks like it has been immunolabeled to highlight a cytokeratin biomarker (brown) expressed by the epithelial cells in all prostate glands. In turn, this synthetically immunolabeled dataset allows for accurate 3D segmentation of the prostate gland epithelium (yellow) and lumen spaces (red). Quantitative features derived from these segmented 3D structures are used to train a machine classifier to stratify between recurrent versus non-recurrent cancer. Reference: W. Xie et al., Cancer Research, 2022.

“What gets people excited at the end of the day is how this is going to impact patient care,” Liu says.

Currently, most pathologists use a 2D method of imaging tissues, which involves cutting a small percentage of the tissue sample into thin slices, staining them, and viewing them on glass slides under a standard microscope to determine their level of abnormality. This traditional method shows only a small fraction of a sample in 2D and can lead to inaccurate diagnoses and suboptimal treatments, such as a person receiving radiation or surgery even though their cancer requires only periodic monitoring.

    The nondestructive 3D imaging method developed in Liu’s lab can rapidly image 100% of certain tissue samples such as needle biopsies and keeps the tissues intact so that they can be used for other tests.

    The open-top light-sheet microscopy device shines a sheet of light that optically “slices” through samples that are made transparent through a simple and gentle “optical clearing” process. All of the optical components in the microscope are positioned below a glass or plastic sample-holder platform, allowing for rapid and simple imaging of a wide range and number of clinical specimens. Researchers in the Liu lab then develop computational analysis methods to delineate and quantify key tissue structures to determine cancer aggressiveness.
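
As a loose illustration of that “delineate and quantify” step, and not the Liu lab’s actual pipeline, one can label connected structures in a binary 3D segmentation mask and tabulate per-structure volumes with scikit-image. Volumes of glands or lumen spaces are the kind of quantitative features a downstream classifier can use; the mask and voxel size below are invented for the example.

```python
import numpy as np
from skimage import measure

# Hypothetical binary 3D mask of segmented gland epithelium (z, y, x),
# e.g. produced by a deep-learning segmentation model.
rng = np.random.default_rng(1)
mask = rng.random((64, 256, 256)) > 0.995             # sparse toy mask

labels = measure.label(mask, connectivity=1)          # 3D connected components
props = measure.regionprops(labels)

# Per-structure volume in voxels, converted with an assumed voxel size.
voxel_volume_um3 = 1.0 * 1.0 * 4.0                    # assumed voxel size in µm
volumes = np.array([p.area for p in props]) * voxel_volume_um3
print(f"{labels.max()} structures, median volume {np.median(volumes):.1f} µm^3")
```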

    In 2022, the researchers published a study in the journal Cancer Research [below] showing that 3D pathology could enable more accurate identification of aggressive prostate cancer cases that could recur within five years. Prostate cancer is the most common cancer and the second leading cause of cancer death for U.S. men. Because it can be slow-growing, identifying which cases require monitoring or treatment is important. 3D pathology could also be used in other cancer treatments, such as predicting which cancer patients might respond to immunotherapy.

    “Better ways to determine which drugs a patient should take is a huge need,” Liu says. “Machine learning can find needles in a haystack that are hard for human observers to see within our large 3D datasets, which can be important for determining the aggressiveness of a disease or how likely the disease will respond to specific treatments.”
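
Downstream of feature extraction, stratifying recurrent versus non-recurrent cases is a standard supervised-learning problem. The minimal scikit-learn sketch below uses made-up feature names and random data standing in for real biopsy measurements; it is meant only to show the shape of such a workflow, not the classifier reported in the Cancer Research paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical per-biopsy features derived from 3D segmentations, e.g.
# gland volume fraction, median lumen volume, epithelial surface-to-volume ratio.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 3))                  # 200 biopsies, 3 features
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=200) > 0).astype(int)
                                               # 1 = recurrence within 5 years (toy label)

clf = RandomForestClassifier(n_estimators=300, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"Cross-validated AUC: {scores.mean():.2f} +/- {scores.std():.2f}")
```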

Alpenglow Biosciences is a company co-founded by Liu that commercializes the lab’s technologies, such as the researchers’ newest and most versatile microscopy system, a hybrid open-top light sheet microscope. It can perform rapid, automated imaging of multiple specimens at various levels of magnification. Liu’s lab described this latest microscope system in a paper published in Nature Methods in May.

    This year, the lab is scaling up its prostate cancer research to include collaborations with universities across the country and even internationally – pointing to a growing awareness and interest in open-top light-sheet microscopy.

    Projects include National Institutes of Health-funded research with Emory University to combine 3D tissue imaging with AI-powered diagnostic imaging to better determine prostate cancer risk and working with the University of Pennsylvania to compare the 3D pathology of specific populations to develop tailored AI methods. In October, the lab will host researchers from the University of Oxford and train them on 3D pathology methods. Through a project funded by Prostate Cancer U.K., the lab aims to ship a microscope to Oxford in 2023 so that U.K. researchers can use it to image tissues.

    “Oxford has one of the largest collections of prostate tissues in the world from patients in which outcomes are being tracked for 10 to 20 years,” Liu says. “Working with Oxford could really help us to show the value of our 3D pathology methods for predicting which patients have lethal versus slow-growing disease.”

    In addition, Liu is working with UW Medicine to create a hub for 3D pathology, and the lab will conduct smaller-scale clinical studies to show the feasibility of the 3D imaging method. The goal is to use AI to assist pathologists’ final diagnosis and to guide oncologists’ treatment recommendations.

    Science paper:
    Cancer Research
    See the science paper for instructive material with images.

See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Mechanical engineering is one of the broadest and oldest of the engineering disciplines and therefore provides some of the strongest interdisciplinary opportunities in the engineering profession. Power utilization (and power generation) is often used to describe the focus of mechanical engineering. Within this focus are such diverse topics as thermodynamics, heat transfer, fluid mechanics, machine design, mechanics of materials, manufacturing, stress analysis, system dynamics, numerical modeling, vibrations, turbomachinery, combustion, heating, ventilating, and air conditioning. Degrees in mechanical engineering open doors to careers not only in the engineering profession but also in business, law, medicine, finance, and other non-technical professions.

    About The University of Washington College of Engineering

    Mission, Facts, and Stats
    Our mission is to develop outstanding engineers and ideas that change the world.

    Faculty:
    275 faculty (25.2% women)
    Achievements:

    128 NSF Young Investigator/Early Career Awards since 1984
    32 Sloan Foundation Research Awards
    2 MacArthur Foundation Fellows (2007 and 2011)

    A national leader in educating engineers, each year the College turns out new discoveries, inventions and top-flight graduates, all contributing to the strength of our economy and the vitality of our community.

    Engineering innovation

    Engineers drive the innovation economy and are vital to solving society’s most challenging problems. The College of Engineering is a key part of a world-class research university in a thriving hub of aerospace, biotechnology, global health and information technology innovation. Over 50% of The University of Washington startups in FY18 came from the College of Engineering.

    Commitment to diversity and access

    The College of Engineering is committed to developing and supporting a diverse student body and faculty that reflect and elevate the populations we serve. We are a national leader in women in engineering; 25.5% of our faculty are women compared to 17.4% nationally. We offer a robust set of diversity programs for students and faculty.
    Research and commercialization

    The University of Washington is an engine of economic growth, today ranked third in the nation for the number of startups launched each year, with 65 companies having been started in the last five years alone by UW students and faculty, or with technology developed here. The College of Engineering is a key contributor to these innovations, and engineering faculty, students or technology are behind half of all UW startups. In FY19, UW received $1.58 billion in total research awards from federal and nonfederal sources.


The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in the Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and, in the process, transform lives and our world every day.

    So, what defines us —the students, faculty and community members at The University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

The University of Washington is a public research university in Seattle, Washington, United States. Founded in 1861, The University of Washington is one of the oldest universities on the West Coast; it was established in downtown Seattle approximately a decade after the city’s founding to aid its economic development. Today, The University of Washington’s 703-acre main Seattle campus is in the University District above the Montlake Cut, within the urban Puget Sound region of the Pacific Northwest. The university has additional campuses in Tacoma and Bothell. Overall, The University of Washington encompasses over 500 buildings and more than 20 million gross square feet of space, including one of the largest library systems in the world with more than 26 university libraries, as well as the UW Tower, lecture halls, art centers, museums, laboratories, stadiums, and conference centers. The University of Washington offers bachelor’s, master’s, and doctoral degrees through 140 departments in various colleges and schools, sees a total student enrollment of roughly 46,000 annually, and functions on a quarter system.

The University of Washington is a member of the Association of American Universities and is classified among “R1: Doctoral Universities – Very high research activity”. According to the National Science Foundation, UW spent $1.41 billion on research and development in 2018, ranking it 5th in the nation. As the flagship institution of the six public universities in Washington state, it is known for its medical, engineering and scientific research as well as its highly competitive computer science and engineering programs. Additionally, The University of Washington continues to benefit from its deep historic ties and major collaborations with numerous technology giants in the region, such as Amazon, Boeing, Nintendo, and particularly Microsoft. Paul G. Allen, Bill Gates and others spent significant time in Washington computer labs on an early startup venture before going on to found Microsoft and other companies. The University of Washington’s 22 varsity sports teams are also highly competitive, competing as the Huskies in the Pac-12 Conference of NCAA Division I, representing the United States at the Olympic Games, and competing in other major events.

    The University of Washington has been affiliated with many notable alumni and faculty, including 21 Nobel Prize laureates and numerous Pulitzer Prize winners, Fulbright Scholars, Rhodes Scholars and Marshall Scholars.

    In 1854, territorial governor Isaac Stevens recommended the establishment of a university in the Washington Territory. Prominent Seattle-area residents, including Methodist preacher Daniel Bagley, saw this as a chance to add to the city’s potential and prestige. Bagley learned of a law that allowed United States territories to sell land to raise money in support of public schools. At the time, Arthur A. Denny, one of the founders of Seattle and a member of the territorial legislature, aimed to increase the city’s importance by moving the territory’s capital from Olympia to Seattle. However, Bagley eventually convinced Denny that the establishment of a university would assist more in the development of Seattle’s economy. Two universities were initially chartered, but later the decision was repealed in favor of a single university in Lewis County provided that locally donated land was available. When no site emerged, Denny successfully petitioned the legislature to reconsider Seattle as a location in 1858.

In 1861, scouting began for an appropriate 10 acres (4 ha) site in Seattle to serve as a new university campus. Arthur and Mary Denny donated eight acres, while fellow pioneers Edward Lander, and Charlie and Mary Terry, donated two acres on Denny’s Knoll in downtown Seattle. More specifically, this tract was bounded by 4th Avenue to the west, 6th Avenue to the east, Union Street to the north, and Seneca Street to the south.

John Pike, for whom Pike Street is named, was the university’s architect and builder. It was opened on November 4, 1861, as the Territorial University of Washington. The legislature passed articles incorporating the university and establishing its Board of Regents in 1862. The school initially struggled, closing three times: in 1863 for low enrollment, and again in 1867 and 1876 due to funding shortages. The University of Washington graduated its first student, Clara Antoinette McCarty Wilt, in 1876 with a bachelor’s degree in science.

    19th century relocation

    By the time Washington state entered the Union in 1889, both Seattle and The University of Washington had grown substantially. The University of Washington’s total undergraduate enrollment increased from 30 to nearly 300 students, and the campus’s relative isolation in downtown Seattle faced encroaching development. A special legislative committee, headed by The University of Washington graduate Edmond Meany, was created to find a new campus to better serve the growing student population and faculty. The committee eventually selected a site on the northeast of downtown Seattle called Union Bay, which was the land of the Duwamish, and the legislature appropriated funds for its purchase and construction. In 1895, The University of Washington relocated to the new campus by moving into the newly built Denny Hall. The University of Washington Regents tried and failed to sell the old campus, eventually settling with leasing the area. This would later become one of the University’s most valuable pieces of real estate in modern-day Seattle, generating millions in annual revenue with what is now called the Metropolitan Tract. The original Territorial University building was torn down in 1908, and its former site now houses the Fairmont Olympic Hotel.

The sole surviving remnants of The University of Washington’s first building are four 24-foot (7.3 m), white, hand-fluted cedar, Ionic columns. They were salvaged by Edmond S. Meany, one of The University of Washington’s first graduates and former head of its history department. Meany and his colleague, Dean Herbert T. Condon, dubbed the columns “Loyalty,” “Industry,” “Faith,” and “Efficiency,” or “LIFE.” The columns now stand in the Sylvan Grove Theater.

    20th century expansion

    Organizers of the 1909 Alaska-Yukon-Pacific Exposition eyed the still largely undeveloped campus as a prime setting for their world’s fair. They came to an agreement with The University of Washington ‘s Board of Regents that allowed them to use the campus grounds for the exposition, surrounding today’s Drumheller Fountain facing towards Mount Rainier. In exchange, organizers agreed Washington would take over the campus and its development after the fair’s conclusion. This arrangement led to a detailed site plan and several new buildings, prepared in part by John Charles Olmsted. The plan was later incorporated into the overall University of Washington campus master plan, permanently affecting the campus layout.

    Both World Wars brought the military to campus, with certain facilities temporarily lent to the federal government. In spite of this, subsequent post-war periods were times of dramatic growth for The University of Washington. The period between the wars saw a significant expansion of the upper campus. Construction of the Liberal Arts Quadrangle, known to students as “The Quad,” began in 1916 and continued to 1939. The University’s architectural centerpiece, Suzzallo Library, was built in 1926 and expanded in 1935.

    After World War II, further growth came with the G.I. Bill. Among the most important developments of this period was the opening of the School of Medicine in 1946, which is now consistently ranked as the top medical school in the United States. It would eventually lead to The University of Washington Medical Center, ranked by U.S. News and World Report as one of the top ten hospitals in the nation.

    In 1942, all persons of Japanese ancestry in the Seattle area were forced into inland internment camps as part of Executive Order 9066 following the attack on Pearl Harbor. During this difficult time, university president Lee Paul Sieg took an active and sympathetic leadership role in advocating for and facilitating the transfer of Japanese American students to universities and colleges away from the Pacific Coast to help them avoid the mass incarceration. Nevertheless, many Japanese American students and “soon-to-be” graduates were unable to transfer successfully in the short time window or receive diplomas before being incarcerated. It was only many years later that they would be recognized for their accomplishments during The University of Washington’s Long Journey Home ceremonial event that was held in May 2008.

    From 1958 to 1973, The University of Washington saw a tremendous growth in student enrollment, its faculties and operating budget, and also its prestige under the leadership of Charles Odegaard. The University of Washington student enrollment had more than doubled to 34,000 as the baby boom generation came of age. However, this era was also marked by high levels of student activism, as was the case at many American universities. Much of the unrest focused around civil rights and opposition to the Vietnam War. In response to anti-Vietnam War protests by the late 1960s, the University Safety and Security Division became The University of Washington Police Department.

Odegaard instituted a vision of building a “community of scholars”, convincing the Washington State legislature to increase investment in The University of Washington. Washington senators, such as Henry M. Jackson and Warren G. Magnuson, also used their political clout to gather research funds for the University of Washington. The results included an increase in the operating budget from $37 million in 1958 to over $400 million in 1973, solidifying The University of Washington as a top recipient of federal research funds in the United States. The establishment of technology giants such as Microsoft, Boeing and Amazon in the local area also proved to be highly influential in the University of Washington’s fortunes, not only improving graduate prospects but also helping to attract millions of dollars in university and research funding through its distinguished faculty and extensive alumni network.

    21st century

    In 1990, The University of Washington opened its additional campuses in Bothell and Tacoma. Although originally intended for students who have already completed two years of higher education, both schools have since become four-year universities with the authority to grant degrees. The first freshman classes at these campuses started in fall 2006. Today both Bothell and Tacoma also offer a selection of master’s degree programs.

    In 2012, The University of Washington began exploring plans and governmental approval to expand the main Seattle campus, including significant increases in student housing, teaching facilities for the growing student body and faculty, as well as expanded public transit options. The University of Washington light rail station was completed in March 2015, connecting Seattle’s Capitol Hill neighborhood to The University of Washington Husky Stadium within five minutes of rail travel time. It offers a previously unavailable option of transportation into and out of the campus, designed specifically to reduce dependence on private vehicles, bicycles and local King County buses.

The University of Washington has been listed as a “Public Ivy” in Greene’s Guides since 2001, and is an elected member of the Association of American Universities. Among the faculty by 2012, there have been 151 members of the American Association for the Advancement of Science, 68 members of the National Academy of Sciences (US), 67 members of the American Academy of Arts and Sciences, 53 members of the National Academy of Medicine, 29 winners of the Presidential Early Career Award for Scientists and Engineers, 21 members of the National Academy of Engineering, 15 Howard Hughes Medical Institute Investigators, 15 MacArthur Fellows, 9 winners of the Gairdner Foundation International Award, 5 winners of the National Medal of Science, 7 Nobel Prize laureates, 5 winners of the Albert Lasker Award for Clinical Medical Research, 4 members of the American Philosophical Society, 2 winners of the National Book Award, 2 winners of the National Medal of Arts, 2 Pulitzer Prize winners, 1 winner of the Fields Medal, and 1 member of the National Academy of Public Administration. Among The University of Washington students by 2012, there were 136 Fulbright Scholars, 35 Rhodes Scholars, 7 Marshall Scholars and 4 Gates Cambridge Scholars. UW is recognized as a top producer of Fulbright Scholars, ranking 2nd in the US in 2017.

    The Academic Ranking of World Universities has consistently ranked The University of Washington as one of the top 20 universities worldwide every year since its first release. In 2019, The University of Washington ranked 14th worldwide out of 500 by the ARWU, 26th worldwide out of 981 in the Times Higher Education World University Rankings, and 28th worldwide out of 101 in the Times World Reputation Rankings. Meanwhile, QS World University Rankings ranked it 68th worldwide, out of over 900.

    U.S. News & World Report ranked The University of Washington 8th out of nearly 1,500 universities worldwide for 2021, with The University of Washington’s undergraduate program tied for 58th among 389 national universities in the U.S. and tied for 19th among 209 public universities.

    In 2019, it ranked 10th among the universities around the world by SCImago Institutions Rankings. In 2017, the Leiden Ranking, which focuses on science and the impact of scientific publications among the world’s 500 major universities, ranked The University of Washington 12th globally and 5th in the U.S.

    In 2019, Kiplinger Magazine’s review of “top college values” named University of Washington 5th for in-state students and 10th for out-of-state students among U.S. public colleges, and 84th overall out of 500 schools. In the Washington Monthly National University Rankings The University of Washington was ranked 15th domestically in 2018, based on its contribution to the public good as measured by social mobility, research, and promoting public service.

     
  • richardmitnick 1:04 pm on November 21, 2022 Permalink | Reply
    Tags: "CfC": closed-form continuous-time neural network, "Solving brain dynamics gives rise to flexible machine-learning models", , , Differential equations enable us to compute the state of the world or a phenomenon as it evolves but not all the way through time — just step-by-step., Machine learning, , Studying the brains of small species recently helped MIT researchers better model the interaction between neurons and synapses — the building blocks of natural and artificial neural networks., , , The team reached into a bag of mathematical tricks to find a “closed form” solution that models the entire description of a whole system in a single compute step., There is early evidence of Liquid CfC models in learning tasks in one environment from visual inputs and transferring their learned skills to an entirely new environment without additional training., This framework can help solve more complex machine learning tasks — enabling better representation learning — and should be the basic building blocks of any future embedded intelligence system., With the models one can compute the equations at any time in the future and at any time in the past.   

    From The Computer Science & Artificial Intelligence Laboratory (CSAIL) At The Massachusetts Institute of Technology: “Solving brain dynamics gives rise to flexible machine-learning models” 

    1

    From The Computer Science & Artificial Intelligence Laboratory (CSAIL)

    At

    The Massachusetts Institute of Technology

    11.15.22
    Rachel Gordon

    1
    Studying the brains of small species recently helped MIT researchers better model the interaction between neurons and synapses — the building blocks of natural and artificial neural networks — into a class of flexible, robust machine-learning models that learn on the job and can adapt to changing conditions. Image: Ramin Hasani/Stable Diffusion.

Last year, MIT researchers announced that they had built “liquid” neural networks, inspired by the brains of small species: a class of flexible, robust machine learning models that learn on the job and can adapt to changing conditions, for real-world safety-critical tasks, like driving and flying. The flexibility of these “liquid” neural nets made them well suited to our connected world, yielding better decision-making for many tasks involving time-series data, such as brain and heart monitoring, weather forecasting, and stock pricing.

But these models become computationally expensive as their number of neurons and synapses increases, and they require clunky computer programs to solve their underlying, complicated math. And all of this math, like the math behind many physical phenomena, becomes harder to solve as the system grows, requiring many small computational steps to arrive at a solution.

Now, the same team of scientists has discovered a way to alleviate this bottleneck by solving the differential equation behind the interaction of two neurons through synapses, unlocking a new type of fast and efficient artificial intelligence algorithm. These models have the same characteristics as liquid neural nets — flexible, causal, robust, and explainable — but are orders of magnitude faster and more scalable. This type of neural net could therefore be used for any task that involves getting insight into data over time, as these networks are compact and adaptable even after training — while many traditional models are fixed. The differential equation at the heart of the neuron model had lacked a known closed-form solution since it was introduced in 1907.

The models, dubbed “closed-form continuous-time” (CfC) neural networks, outperformed state-of-the-art counterparts on a slew of tasks, with considerably higher speed and performance in recognizing human activities from motion sensors, modeling the physical dynamics of a simulated walker robot, and event-based sequential image processing. On a medical prediction task, for example, the new models were 220 times faster on a sample of 8,000 patients.

    A new paper on the work is published today in Nature Machine Intelligence [below].

    “The new machine-learning models we call ‘CfC’s’ replace the differential equation defining the computation of the neuron with a closed form approximation, preserving the beautiful properties of liquid networks without the need for numerical integration,” says MIT Professor Daniela Rus, director of the Computer Science and Artificial Intelligence Laboratory (CSAIL) and senior author on the new paper. “CfC models are causal, compact, explainable, and efficient to train and predict. They open the way to trustworthy machine learning for safety-critical applications.”
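
The closed-form idea can be conveyed with a toy cell that gates between two learned states using a time-dependent sigmoid, evaluated directly at the elapsed time rather than integrated step by step. The PyTorch sketch below is a deliberately simplified reading of that idea and is not the authors’ released implementation; the module name, layer sizes, and tanh nonlinearities are choices made for the example.

```python
import torch
import torch.nn as nn

class TinyCfCCell(nn.Module):
    """Simplified closed-form continuous-time cell (illustrative only).

    Output at elapsed time t:
        sigmoid(-f(z) * t) * tanh(g(z)) + (1 - sigmoid(-f(z) * t)) * tanh(h(z)),
    where z = [input, hidden state]. Loosely follows the gated CfC formulation.
    """

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.f = nn.Linear(input_size + hidden_size, hidden_size)  # time-constant head
        self.g = nn.Linear(input_size + hidden_size, hidden_size)  # "early" state head
        self.h = nn.Linear(input_size + hidden_size, hidden_size)  # "late" state head

    def forward(self, x, hidden, t):
        # x: (batch, input_size), hidden: (batch, hidden_size), t: (batch, 1) elapsed time
        z = torch.cat([x, hidden], dim=-1)
        gate = torch.sigmoid(-self.f(z) * t)   # no ODE solver: evaluate directly at time t
        return gate * torch.tanh(self.g(z)) + (1.0 - gate) * torch.tanh(self.h(z))

# Toy usage: an irregularly sampled time series of 8-dimensional inputs.
cell = TinyCfCCell(input_size=8, hidden_size=16)
hidden = torch.zeros(4, 16)
for step in range(5):
    x = torch.randn(4, 8)
    dt = torch.rand(4, 1)                      # elapsed time since the last sample
    hidden = cell(x, hidden, dt)
print(hidden.shape)  # torch.Size([4, 16])
```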

    Keeping things liquid 

Differential equations enable us to compute the state of the world or a phenomenon as it evolves, but not all the way through time — just step-by-step. To model natural phenomena through time and understand previous and future behavior, like human activity recognition or a robot’s path, for example, the team reached into a bag of mathematical tricks to find just the ticket: a “closed-form” solution that models the entire description of a whole system in a single compute step.

    With their models, one can compute this equation at any time in the future, and at any time in the past. Not only that, but the speed of computation is much faster because you don’t need to solve the differential equation step-by-step. 
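
The speed argument can be seen on a toy equation that, unlike the neuron model, has an elementary closed form: dx/dt = -kx. A numerical solver has to march through many small steps to reach a given time, while the closed form x(t) = x0 * exp(-kt) is evaluated in a single shot at any time, past or future.

```python
import numpy as np

k, x0, t_end = 2.0, 1.0, 5.0

# Step-by-step Euler integration: the cost grows with the number of steps.
n_steps = 100_000
dt = t_end / n_steps
x = x0
for _ in range(n_steps):
    x += dt * (-k * x)
print(f"Euler after {n_steps} steps: {x:.6f}")

# Closed-form solution: one evaluation, at any time, forward or backward.
print(f"Closed form at t={t_end}: {x0 * np.exp(-k * t_end):.6f}")
print(f"Closed form at t=-1 (the past): {x0 * np.exp(-k * -1.0):.6f}")
```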

Imagine an end-to-end neural network that receives driving input from a camera mounted on a car. The network is trained to generate outputs, like the car’s steering angle. In 2020, the team solved this by using liquid neural networks with 19 nodes, so 19 neurons plus a small perception module could drive a car. A differential equation describes each node of that system. If you replace those equations with their closed-form solution inside this network, the network gives you essentially the same behavior, since the closed form is a good approximation of the system’s actual dynamics. The team can thus solve the problem with an even lower number of neurons, making the network faster and less computationally expensive.

These models can receive inputs as time series (events that happened in time), which could be used for classification, controlling a car, moving a humanoid robot, or forecasting financial and medical events. Across these varied tasks, the models can also increase accuracy, robustness, and performance and, importantly, computation speed — gains that usually come at the expense of one another.

    Solving this equation has far-reaching implications for advancing research in both natural and artificial intelligence systems. “When we have a closed-form description of neurons and synapses’ communication, we can build computational models of brains with billions of cells, a capability that is not possible today due to the high computational complexity of neuroscience models. The closed-form equation could facilitate such grand-level simulations and therefore opens new avenues of research for us to understand intelligence,” says MIT CSAIL Research Affiliate Ramin Hasani, first author on the new paper.

    Portable learning

Moreover, there is early evidence that Liquid CfC models can learn tasks in one environment from visual inputs and transfer their learned skills to an entirely new environment without additional training. This is called out-of-distribution generalization, one of the most fundamental open challenges of artificial intelligence research.

    “Neural network systems based on differential equations are tough to solve and scale to, say, millions and billions of parameters. Getting that description of how neurons interact with each other, not just the threshold, but solving the physical dynamics between cells enables us to build up larger-scale neural networks,” says Hasani. “This framework can help solve more complex machine learning tasks — enabling better representation learning — and should be the basic building blocks of any future embedded intelligence system.”

    “Recent neural network architectures, such as neural ODEs and liquid neural networks, have hidden layers composed of specific dynamical systems representing infinite latent states instead of explicit stacks of layers,” says Sildomar Monteiro, AI and Machine Learning Group lead at Aurora Flight Sciences, a Boeing company, who was not involved in this paper. “These implicitly-defined models have shown state-of-the-art performance while requiring far fewer parameters than conventional architectures. However, their practical adoption has been limited due to the high computational cost required for training and inference.” He adds that this paper “shows a significant improvement in the computation efficiency for this class of neural networks … [and] has the potential to enable a broader range of practical applications relevant to safety-critical commercial and defense systems.”

Hasani and Mathias Lechner, a postdoc at MIT CSAIL, wrote the paper, supervised by Rus, alongside Alexander Amini, a CSAIL postdoc; Lucas Liebenwein SM ’18, PhD ’21; Aaron Ray, an MIT electrical engineering and computer science PhD student and CSAIL affiliate; Max Tschaikowski, associate professor in computer science at Aalborg University in Denmark; and Gerald Teschl, professor of mathematics at the University of Vienna.

    Science paper:
    Nature Machine Intelligence
    See the science paper for instructive material with images.

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    4

    The Computer Science and Artificial Intelligence Laboratory (CSAIL) is a research institute at the Massachusetts Institute of Technology (MIT) formed by the 2003 merger of the Laboratory for Computer Science (LCS) and the Artificial Intelligence Laboratory (AI Lab). Housed within the Ray and Maria Stata Center, CSAIL is the largest on-campus laboratory as measured by research scope and membership. It is part of the Schwarzman College of Computing but is also overseen by the MIT Vice President of Research.

    Research activities

    CSAIL’s research activities are organized around a number of semi-autonomous research groups, each of which is headed by one or more professors or research scientists. These groups are divided up into seven general areas of research:

    Artificial intelligence
    Computational biology
    Graphics and vision
    Language and learning
    Theory of computation
    Robotics
    Systems (includes computer architecture, databases, distributed systems, networks and networked systems, operating systems, programming methodology, and software engineering among others)

    In addition, CSAIL hosts the World Wide Web Consortium (W3C).

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center (US), and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard (US) and the Whitehead Institute (US).

Massachusetts Institute of Technology-Haystack Observatory, Westford, Massachusetts, USA, Altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities (AAU).

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but rather an institution combining elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

Two days after Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, the Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms "renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering". Unlike the Ivy League schools, the Massachusetts Institute of Technology catered more to middle-class families and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    The Massachusetts Institute of Technology's involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at MIT's Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper's Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, MIT had become the nation's largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4,000 people in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued after the war. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected the Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of "any great slackening in the pace of life at the Institute" to match the return to peacetime, remembering the "academic tranquility of the prewar years", though acknowledging the significant contributions of military research to the increased emphasis on graduate education and the rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of MIT between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, MIT no longer simply benefited the industries with which it had worked for three decades; it had developed closer working relationships with new patrons: philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and MIT's defense research. In this period, MIT's various departments were researching helicopters, smart bombs, and counterinsurgency techniques for the war in Vietnam, as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969, during a meeting of faculty members and students seeking to shift the emphasis from military research toward environmental and social problems. In response to the protests, MIT ultimately divested itself from the Instrumentation Laboratory and moved all classified research off campus to the MIT Lincoln Laboratory facility in 1973. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to "greater strength and unity" after these times of turmoil. However, six MIT students were sentenced to prison terms at this time, and some former student leaders, such as Michael Albert and George Katsiaficas, remain indignant about MIT's role in military research and its suppression of these protests. (Richard Leacock's film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was further controversy at MIT over its involvement in research on the Strategic Defense Initiative (SDI, space weaponry) and chemical and biological warfare (CBW). More recently, MIT's research for the military has included work on robots, drones, and 'battle suits'.

    Recent history

    Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, the Massachusetts Institute of Technology launched OpenCourseWare to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, MIT announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its "MITx" program, for a modest fee. The "edX" online platform supporting MITx was initially developed in partnership with Harvard and its analogous "HarvardX" initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009, the MIT faculty adopted an open-access policy to make its scholarship publicly accessible online.

    Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of the Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced LIGO (aLIGO) detector was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and was funded by the National Science Foundation.

    Caltech/MIT Advanced LIGO (aLIGO)

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detectors in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and MIT physicist Rainer Weiss won the Nobel Prize in Physics in 2017. Weiss, who is also an MIT graduate, designed the laser interferometric technique that served as the essential blueprint for LIGO.
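    As a rough illustration of what that interferometric technique measures (a minimal sketch using the commonly quoted order-of-magnitude figures for Advanced LIGO, not numbers taken from this article): a passing gravitational wave changes the relative length of the detector's two perpendicular arms, and the instrument reports the dimensionless strain

    % Back-of-the-envelope strain estimate (assumed values: arm length L ≈ 4 km,
    % detectable strain h ≈ 1e-21, as publicly reported for the 2015 detection).
    \[
        h = \frac{\Delta L}{L},
        \qquad
        \Delta L = h\,L \approx 10^{-21} \times 4\times10^{3}\,\mathrm{m}
                 = 4\times10^{-18}\,\mathrm{m},
    \]

    a displacement far smaller than the diameter of a proton (roughly \(10^{-15}\,\mathrm{m}\)), which is why the laser interferometric readout is needed.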

    The mission of the Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     