Tagged: Computer Science & Technology

  • richardmitnick 1:03 pm on January 19, 2023 Permalink | Reply
    Tags: "MIT engineers grow 'perfect' atom-thin materials on industrial silicon wafers", 2D materials can conduct electrons far more efficiently than silicon., Before the electronics industry can transition to 2D materials scientists have to find a way to engineer the materials on industry-standard silicon wafers., Computer Science & Technology, , Enter 2D materials — delicate two-dimensional sheets of perfect crystals that are as thin as a single atom., It’s considered almost impossible to grow single-crystalline 2D materials on silicon. Now researchers show you can., , MIT engineers may now have a solution: a form of nonepitaxial single-crystalline material to grow pure defect-free 2D materials onto industrial silicon wafers., Moore’s Law is predicted to soon plateau because silicon — the backbone of modern transistors — loses its electrical properties once devices made from this material dip below a certain size., Most bulk materials are polycrystalline containing multiple crystals that grow in random orientations. Where one crystal meets another the “grain boundary” acts as an electric barrier., Researchers first covered a silicon wafer in a “mask” — a coating of silicon dioxide that they patterned into tiny pockets each of which is designed to trap a crystal seed., Researchers have found other ways to fabricate 2D materials- on wafers of sapphire — a material with a hexagonal pattern of atoms encouraging 2D materials to assemble in single orientation., Silicon technology, single-crystalline growth does not require peeling and searching flakes of 2D material., , The researchers unlocked a way to catch up to Moore’s Law using 2D materials., The search for next-generation transistor materials therefore has focused on 2D materials as potential successors to silicon., The team fabricated a simple functional transistor from a type of 2D materials called transition-metal dichalcogenides., The team fabricated a simple TMD transistor and showed that its electrical performance was just as good as a pure flake of the same material., The team’s new nonepitaxial, Wafers of silicon lack sapphire’s hexagonal supporting scaffold., When researchers attempt to grow 2D materials on silicon the result is a random patchwork of crystals that merge haphazardly forming numerous grain boundaries that stymie conductivity.   

    From The Massachusetts Institute of Technology: “MIT engineers grow ‘perfect’ atom-thin materials on industrial silicon wafers” 

    From The Massachusetts Institute of Technology

    1.18.23
    Jennifer Chu

    By depositing atoms on a wafer coated in a “mask” (top left), MIT engineers can corral the atoms in the mask’s individual pockets (center middle), and encourage the atoms to grow into perfect, 2D, single-crystalline layers (bottom right). Courtesy of the researchers. Edited by MIT News.

    True to Moore’s Law, the number of transistors on a microchip has doubled roughly every two years since the 1960s. But this trajectory is predicted to soon plateau because silicon — the backbone of modern transistors — loses its electrical properties once devices made from this material dip below a certain size.

    Enter 2D materials — delicate two-dimensional sheets of perfect crystals that are as thin as a single atom. At the scale of nanometers, 2D materials can conduct electrons far more efficiently than silicon. The search for next-generation transistor materials therefore has focused on 2D materials as potential successors to silicon.

    But before the electronics industry can transition to 2D materials, scientists first have to find a way to engineer the materials on industry-standard silicon wafers while preserving their perfect crystalline form. And MIT engineers may now have a solution.

    The team has developed a method that could enable chip manufacturers to fabricate ever-smaller transistors from 2D materials by growing them on existing wafers of silicon and other materials. The new method is a form of “nonepitaxial, single-crystalline growth,” which the team used for the first time to grow pure, defect-free 2D materials onto industrial silicon wafers.

    With their method, the team fabricated a simple functional transistor from a type of 2D material called transition-metal dichalcogenides, or TMDs, which are known to conduct electricity better than silicon at nanometer scales.

    “We expect our technology could enable the development of 2D semiconductor-based, high-performance, next-generation electronic devices,” says Jeehwan Kim, associate professor of mechanical engineering at MIT. “We’ve unlocked a way to catch up to Moore’s Law using 2D materials.”

    Kim and his colleagues detail their method in a paper appearing today in Nature [below]. The study’s MIT co-authors include Ki Seok Kim, Doyoon Lee, Celesta Chang, Seunghwan Seo, Hyunseok Kim, Jiho Shin, Sangho Lee, Jun Min Suh, and Bo-In Park, along with collaborators at the University of Texas-Dallas, the University of California-Riverside, Washington University in Saint Louis, and institutions across South Korea.

    A crystal patchwork

    To produce a 2D material, researchers have typically employed a manual process by which an atom-thin flake is carefully exfoliated from a bulk material, like peeling away the layers of an onion.

    But most bulk materials are polycrystalline, containing multiple crystals that grow in random orientations. Where one crystal meets another, the “grain boundary” acts as an electric barrier. Any electrons flowing through one crystal suddenly stop when met with a crystal of a different orientation, damping a material’s conductivity. Even after exfoliating a 2D flake, researchers must then search the flake for “single-crystalline” regions — a tedious and time-intensive process that is difficult to apply at industrial scales.

    Recently, researchers have found other ways to fabricate 2D materials, by growing them on wafers of sapphire — a material with a hexagonal pattern of atoms which encourages 2D materials to assemble in the same, single-crystalline orientation.

    “But nobody uses sapphire in the memory or logic industry,” Kim says. “All the infrastructure is based on silicon. For semiconductor processing, you need to use silicon wafers.”

    However, wafers of silicon lack sapphire’s hexagonal supporting scaffold. When researchers attempt to grow 2D materials on silicon, the result is a random patchwork of crystals that merge haphazardly, forming numerous grain boundaries that stymie conductivity.

    “It’s considered almost impossible to grow single-crystalline 2D materials on silicon,” Kim says. “Now we show you can. And our trick is to prevent the formation of grain boundaries.”

    Seed pockets

    The team’s new nonepitaxial, single-crystalline growth does not require peeling and searching flakes of 2D material. Instead, the researchers use conventional vapor deposition methods to pump atoms across a silicon wafer. The atoms eventually settle on the wafer and nucleate, growing into two-dimensional crystal orientations. If left alone, each “nucleus,” or seed of a crystal, would grow in random orientations across the silicon wafer. But Kim and his colleagues found a way to align each growing crystal to create single-crystalline regions across the entire wafer.

    To do so, they first covered a silicon wafer in a “mask” — a coating of silicon dioxide that they patterned into tiny pockets, each designed to trap a crystal seed. Across the masked wafer, they then flowed a gas of atoms that settled into each pocket to form a 2D material — in this case, a TMD. The mask’s pockets corralled the atoms and encouraged them to assemble on the silicon wafer in the same, single-crystalline orientation.
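
    Why confining seeds to pockets matters can be illustrated with a toy model (purely illustrative, not from the study): tile a simulated wafer with grains, give each an orientation, and count neighboring pairs that disagree, since every mismatched pair is a grain boundary.

```python
# Toy model (illustrative only): compare grain-boundary counts for a
# wafer whose grains nucleate in random orientations versus one whose
# grains all share a single orientation, as in the masked growth above.
import random

def count_grain_boundaries(grid_size, aligned):
    """Assign an orientation (0-359 degrees) to each grain on a
    grid_size x grid_size wafer and count mismatched neighbor pairs."""
    if aligned:
        # Masked growth: every pocket seeds the same orientation.
        grid = [[0] * grid_size for _ in range(grid_size)]
    else:
        # Unmasked growth: each nucleus picks a random orientation.
        grid = [[random.randrange(360) for _ in range(grid_size)]
                for _ in range(grid_size)]
    boundaries = 0
    for i in range(grid_size):
        for j in range(grid_size):
            if i + 1 < grid_size and grid[i][j] != grid[i + 1][j]:
                boundaries += 1
            if j + 1 < grid_size and grid[i][j] != grid[i][j + 1]:
                boundaries += 1
    return boundaries

print("random orientations:", count_grain_boundaries(50, aligned=False))
print("aligned (masked):   ", count_grain_boundaries(50, aligned=True))
```

    With random orientations nearly every neighboring pair disagrees; with one shared orientation the boundary count drops to zero, which is the regime single-crystalline growth targets.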

    “That is a very shocking result,” Kim says. “You have single-crystalline growth everywhere, even if there is no epitaxial relation between the 2D material and silicon wafer.”

    With their masking method, the team fabricated a simple TMD transistor and showed that its electrical performance was just as good as that of a pure flake of the same material.

    They also applied the method to engineer a multilayered device. After covering a silicon wafer with a patterned mask, they grew one type of 2D material to fill half of each square, then grew a second type of 2D material over the first layer to fill the rest of the squares. The result was an ultrathin, single-crystalline bilayer structure within each square. Kim says that going forward, multiple 2D materials could be grown and stacked together in this way to make ultrathin, flexible, and multifunctional films.

    “Until now, there has been no way of making 2D materials in single-crystalline form on silicon wafers, thus the whole community has been struggling to realize next-generation processors without transferring 2D materials,” Kim says. “Now we have completely solved this problem, with a way to make devices smaller than a few nanometers. This will change the paradigm of Moore’s Law.”

    This research was supported in part by the U.S. Defense Advanced Research Projects Agency, Intel, the IARPA MicroE4AI program, MicroLink Devices, Inc., ROHM Co., and Samsung.

    Nature

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    MIT Campus

    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and the Whitehead Institute.

    Massachusetts Institute of Technology-Haystack Observatory, Westford, Massachusetts, USA, Altitude 131 m (430 ft).


    The Computer Science and Artificial Intelligence Laboratory (CSAIL)

    The MIT Kavli Institute for Astrophysics and Space Research

    MIT’s Institute for Medical Engineering and Science is a research institute at the Massachusetts Institute of Technology

    The MIT Laboratory for Nuclear Science

    The MIT Media Lab

    The MIT School of Engineering

    The MIT Sloan School of Management

    MIT Spectrum

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities.

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but an institution combining elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863, under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    The Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, The Massachusetts Institute of Technology had become the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4,000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after the war. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades; it had developed closer working relationships with new patrons: philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and The Massachusetts Institute of Technology’s defense research. In this period, Massachusetts Institute of Technology’s various departments were researching helicopters, smart bombs, and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis from military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced LIGO (aLIGO) was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and funded by the National Science Foundation.

    Caltech/MIT Advanced LIGO

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique, which served as the essential blueprint for the LIGO.

    The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 9:31 am on January 16, 2023 Permalink | Reply
    Tags: "LAIs": long-acting injectables, "University of Toronto scientists use AI to fast-track drug formulation development", , , , Computer Science & Technology, , Machine-learning algorithms can be used to predict experimental drug release from long-acting injectables (LAI) and can also help guide the design of new LAIs., , Reducing ‘trial and error’ for new drug development, , Theoretical and Quantum Chemistry   

    From The University of Toronto (CA): “University of Toronto scientists use AI to fast-track drug formulation development” 

    From The University of Toronto (CA)

    1.11.23
    Kate Richards | Leslie Dan Faculty of Pharmacy

    Researchers Christine Allen and Alán Aspuru-Guzik used machine learning to predict experimental drug release from long-acting injectables (photo by Steve Southon)

    In a bid to reduce the time and cost associated with developing promising new medicines, University of Toronto scientists have successfully tested the use of artificial intelligence to guide the design of long-acting injectable drug formulations.

    The study, published this week in Nature Communications [below], was led by Professor Christine Allen in the Leslie Dan Faculty of Pharmacy and Alán Aspuru-Guzik in the departments of chemistry and computer science in the Faculty of Arts & Science.

    Fig. 1: Schematic demonstrating traditional and data-driven formulation development approaches for long-acting injectables (LAIs).
    [a] Selected routes of administration for FDA-approved LAI formulations. [b] Typical trial-and-error loop commonly employed during the development of LAIs termed “traditional LAI formulation development”. [c] Workflow employed in this study to train and analyze machine learning (ML) models to accelerate the design of new LAI systems, termed “Data-driven LAI formulation development”.

    Their multidisciplinary research shows that machine-learning algorithms can be used to predict experimental drug release from long-acting injectables (LAI) and can also help guide the design of new LAIs.

    “This study takes a critical step towards data-driven drug formulation development with an emphasis on long-acting injectables,” said Allen, who is a member of U of T’s Acceleration Consortium, a global initiative that uses artificial intelligence and automation to accelerate the discovery of materials and molecules needed for a sustainable future.

    “We’ve seen how machine learning has enabled incredible leap-step advances in the discovery of new molecules that have the potential to become medicines. We are now working to apply the same techniques to help us design better drug formulations and, ultimately, better medicines.”

    Considered one of the most promising therapeutic strategies for the treatment of chronic diseases, long-acting injectables are a class of advanced drug delivery systems that are designed to release their cargo over extended periods of time to achieve a prolonged therapeutic effect. This approach can help patients better adhere to their medication regimen, reduce side effects and increase efficacy when injected close to the site of action in the body.

    However, achieving the optimal amount of drug release over the desired period of time requires the development of a wide array of formulation candidates through extensive and time-consuming experiments. This trial-and-error approach has created a significant bottleneck in LAI development compared to more conventional types of drug formulation.
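
    For intuition about the prediction target, LAI experiments typically report fractional drug release as a function of time. The sketch below evaluates the classic Korsmeyer-Peppas empirical model (which reduces to the Higuchi model at n = 0.5); the rate constants and exponents are invented for illustration and are not values from the study.

```python
# Hedged sketch: the prediction target in LAI work is typically the
# fraction of drug released by time t. The Korsmeyer-Peppas model
# Mt/Minf = k * t**n (Higuchi when n = 0.5) is a classic empirical
# description; k and n below are invented for illustration.
import numpy as np

def korsmeyer_peppas(t, k, n):
    """Fractional release Mt/Minf = k * t**n, capped at 1.0 (100%)."""
    return np.minimum(k * np.power(t, n), 1.0)

t_days = np.linspace(0, 60, 13)                   # two months, every 5 days
fast = korsmeyer_peppas(t_days, k=0.20, n=0.5)    # Higuchi-like, faster
slow = korsmeyer_peppas(t_days, k=0.05, n=0.7)    # slower, sustained

for day, f, s in zip(t_days, fast, slow):
    print(f"day {day:4.0f}: fast {f:6.1%}   slow {s:6.1%}")
```

    Mapping formulation parameters to curves like these normally takes rounds of wet-lab experiments; the machine-learning models described below aim to predict the curve directly from the formulation.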

    “AI is transforming the way we do science. It helps accelerate discovery and optimization. This is a perfect example of a ‘before AI’ and an ‘after AI’ moment and shows how drug delivery can be impacted by this multidisciplinary research,” said Aspuru-Guzik, who is director of the Acceleration Consortium and holds the CIFAR Artificial Intelligence Research Chair at the Vector Institute in Toronto and the Canada 150 Research Chair in Theoretical and Quantum Chemistry.

    From left: Zeqing Bao, PhD trainee in pharmaceutical sciences, and Riley Hickman, PhD trainee in chemistry, are co-authors on the study published in Nature Communications (photo by Steve Southon)

    Reducing ‘trial and error’ for new drug development

    To investigate whether machine-learning tools could accurately predict the rate of drug release, the research team trained and evaluated a series of 11 different models, including multiple linear regression (MLR), random forest (RF), light gradient boosting machine (lightGBM) and neural networks (NN). The data set used to train the selected panel of machine learning models was constructed from previously published studies by the authors and other research groups.

    “Once we had the data set, we split it into two subsets: one used for training the models and one for testing,” said Pauric Bannigan, research associate with the Allen research group at the Leslie Dan Faculty of Pharmacy. “We then asked the models to predict the results of the test set and directly compared with previous experimental data. We found that the tree-based models, and specifically lightGBM, delivered the most accurate predictions.”
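
    A minimal sketch of that train/test workflow, under stated assumptions: synthetic stand-in data rather than the published dataset, a single hold-out split rather than the authors’ exact protocol, and three of the eleven model families (MLR, RF, and lightGBM, via scikit-learn and the lightgbm package).

```python
# Minimal sketch (not the authors' code): hold out a test split, train
# several candidate regressors, and compare their errors on the held-out
# data. Requires scikit-learn and the lightgbm package; the synthetic
# data stands in for a real table of formulation features vs. release.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from lightgbm import LGBMRegressor

rng = np.random.default_rng(0)
X = rng.random((500, 6))      # 6 hypothetical formulation descriptors
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] ** 2 + 0.05 * rng.standard_normal(500)

# Hold out a test split, train each candidate, compare test errors.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

models = {
    "MLR": LinearRegression(),
    "RF": RandomForestRegressor(random_state=0),
    "lightGBM": LGBMRegressor(random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    mae = mean_absolute_error(y_test, model.predict(X_test))
    print(f"{name}: test MAE = {mae:.3f}")
```

    Nothing here reproduces the paper’s numbers; the authors’ actual datasets and code are on Zenodo, as noted later in this article.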

    As a next step, the team worked to apply these predictions and illustrate how machine learning models might be used to inform the design of new LAIs by using advanced analytical techniques to extract design criteria from the lightGBM model. This allowed the design of a new LAI formulation for a drug currently used to treat ovarian cancer.
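
    One hedged way to picture “extracting design criteria”: a trained tree ensemble exposes per-feature importance scores that rank which formulation parameters drive predicted release. Plain split-based importances, shown below on synthetic data with hypothetical feature names, are a simpler stand-in for the advanced model-analysis techniques the team applied.

```python
# Hedged sketch of extracting "design criteria": rank which formulation
# parameters most influence a trained lightGBM model. Feature names and
# data are hypothetical; plain split-based importances stand in for the
# more advanced analysis the study used.
import numpy as np
from lightgbm import LGBMRegressor

rng = np.random.default_rng(0)
features = ["polymer_MW", "drug_load", "LA_GA_ratio", "particle_size"]
X = rng.random((300, len(features)))
# Synthetic target: release driven mostly by drug load and particle size.
y = 0.6 * X[:, 1] + 0.3 * X[:, 3] + 0.1 * rng.random(300)

model = LGBMRegressor(random_state=0).fit(X, y)
ranked = sorted(zip(features, model.feature_importances_),
                key=lambda pair: -pair[1])
for name, score in ranked:
    print(f"{name}: importance {score}")
```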

    Expectations around the speed with which new drug formulations are developed have heightened drastically since the onset of the COVID-19 pandemic.

    “We’ve seen in the pandemic that there was a need to design a new formulation in weeks, to catch up with evolving variants. Allowing for new formulations to be developed in a short period of time, relative to what has been done in the past using conventional methods, is crucially important so that patients can benefit from new therapies,” Allen said, explaining that the research team is also investigating using machine learning to support the development of novel mRNA and lipid nanoparticle formulations.

    More robust databases needed for future advances

    The results of the current study signal the potential for machine learning to reduce reliance on trial-and-error testing. However, Allen and the research team identify that the lack of available open-source data sets in pharmaceutical sciences represents a significant challenge to future progress.

    “When we began this project, we were surprised by the lack of data reported across numerous studies using polymeric microparticles,” Allen said. “This meant the studies and the work that went into them couldn’t be leveraged to develop the machine learning models we need to propel advances in this space. There is a real need to create robust databases in pharmaceutical sciences that are open access and available for all so that we can work together to advance the field.”

    To that end, Allen and the research team have published their datasets and code on the open-source platform Zenodo.

    “For this study our goal was to lower the barrier of entry to applying machine learning in pharmaceutical sciences,” Bannigan said. “We’ve made our data sets fully available so others can hopefully build on this work. We want this to be the start of something and not the end of the story for machine learning in drug formulation.”

    The study was supported by the Natural Sciences and Engineering Research Council of Canada, the Defense Advanced Research Projects Agency and the Vector Institute.

    Science paper:
    Nature Communications

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Toronto (CA) is a public research university in Toronto, Ontario, Canada, located on the grounds that surround Queen’s Park. It was founded by royal charter in 1827 as King’s College, the oldest university in the province of Ontario.

    Originally controlled by the Church of England, the university assumed its present name in 1850 upon becoming a secular institution.

    As a collegiate university, it comprises eleven colleges each with substantial autonomy on financial and institutional affairs and significant differences in character and history. The university also operates two satellite campuses located in Scarborough and Mississauga.

    University of Toronto has evolved into Canada’s leading institution of learning, discovery and knowledge creation. We are proud to be one of the world’s top research-intensive universities, driven to invent and innovate.

    Our students have the opportunity to learn from and work with preeminent thought leaders through our multidisciplinary network of teaching and research faculty, alumni and partners.

    The ideas, innovations and actions of more than 560,000 graduates continue to have a positive impact on the world.

    Academically, the University of Toronto is noted for movements and curricula in literary criticism and communication theory, known collectively as the Toronto School.

    The university was the birthplace of insulin and stem cell research, and was the site of the first electron microscope in North America, the identification of the first black hole Cygnus X-1, multi-touch technology, and the development of the theory of NP-completeness.

    The university was one of several universities involved in early research of deep learning. It receives the most annual scientific research funding of any Canadian university and is one of two members of the Association of American Universities outside the United States, the other being McGill University (CA).

    The Varsity Blues are the athletic teams that represent the university in intercollegiate league matches, with ties to gridiron football, rowing and ice hockey. The earliest recorded instance of gridiron football occurred at University of Toronto’s University College in November 1861.

    The university’s Hart House is an early example of the North American student centre, simultaneously serving cultural, intellectual, and recreational interests within its large Gothic-revival complex.

    The University of Toronto has educated three Governors General of Canada, four Prime Ministers of Canada, three foreign leaders, and fourteen Justices of the Supreme Court. As of March 2019, ten Nobel laureates, five Turing Award winners, 94 Rhodes Scholars, and one Fields Medalist have been affiliated with the university.

    Early history

    The founding of a colonial college had long been the desire of John Graves Simcoe, the first Lieutenant-Governor of Upper Canada and founder of York, the colonial capital. As a University of Oxford (UK)-educated military commander who had fought in the American Revolutionary War, Simcoe believed a college was needed to counter the spread of republicanism from the United States. The Upper Canada Executive Committee recommended in 1798 that a college be established in York.

    On March 15, 1827, a royal charter was formally issued by King George IV, proclaiming “from this time one College, with the style and privileges of a University … for the education of youth in the principles of the Christian Religion, and for their instruction in the various branches of Science and Literature … to continue for ever, to be called King’s College.” The granting of the charter was largely the result of intense lobbying by John Strachan, the influential Anglican Bishop of Toronto who took office as the college’s first president. The original three-storey Greek Revival school building was built on the present site of Queen’s Park.

    Under Strachan’s stewardship, King’s College was a religious institution closely aligned with the Church of England and the British colonial elite, known as the Family Compact. Reformist politicians opposed the clergy’s control over colonial institutions and fought to have the college secularized. In 1849, after a lengthy and heated debate, the newly elected responsible government of the Province of Canada voted to rename King’s College as the University of Toronto and severed the school’s ties with the church. Having anticipated this decision, the enraged Strachan had resigned a year earlier to open Trinity College as a private Anglican seminary. University College was created as the nondenominational teaching branch of the University of Toronto. During the American Civil War, the threat of a Union blockade on British North America prompted the creation of the University Rifle Corps, which saw battle in resisting the Fenian raids on the Niagara border in 1866. The Corps was part of the Reserve Militia led by Professor Henry Croft.

    Established in 1878, the School of Practical Science was the precursor to the Faculty of Applied Science and Engineering, which has been nicknamed Skule since its earliest days. While the Faculty of Medicine opened in 1843, medical teaching was conducted by proprietary schools from 1853 until 1887, when the faculty absorbed the Toronto School of Medicine. Meanwhile, the university continued to set examinations and confer medical degrees. The university opened the Faculty of Law in 1887, followed by the Faculty of Dentistry in 1888 when the Royal College of Dental Surgeons became an affiliate. Women were first admitted to the university in 1884.

    A devastating fire in 1890 gutted the interior of University College and destroyed 33,000 volumes from the library, but the university restored the building and replenished its library within two years. Over the next two decades a collegiate system took shape as the university arranged federation with several ecclesiastical colleges, including Strachan’s Trinity College in 1904. The university operated the Royal Conservatory of Music from 1896 to 1991 and the Royal Ontario Museum from 1912 to 1968; both still retain close ties with the university as independent institutions. The University of Toronto Press was founded in 1901 as Canada’s first academic publishing house. The Faculty of Forestry, founded in 1907 with Bernhard Fernow as dean, was Canada’s first university faculty devoted to forest science. In 1910, the Faculty of Education opened its laboratory school, the University of Toronto Schools.

    World wars and post-war years

    The First and Second World Wars curtailed some university activities as undergraduate and graduate men eagerly enlisted. Intercollegiate athletic competitions and the Hart House Debates were suspended, although exhibition and interfaculty games were still held. The David Dunlap Observatory in Richmond Hill opened in 1935, followed by the University of Toronto Institute for Aerospace Studies in 1949. The university opened satellite campuses in Scarborough in 1964 and in Mississauga in 1967. The university’s former affiliated schools at the Ontario Agricultural College and Glendon Hall became fully independent of the University of Toronto, becoming part of the University of Guelph (CA) in 1964 and York University (CA) in 1965, respectively. Beginning in the 1980s, reductions in government funding prompted more rigorous fundraising efforts.

    Since 2000

    In 2000, Kin-Yip Chun was reinstated as a professor of the university after he launched an unsuccessful lawsuit against the university alleging racial discrimination. In 2017, a human rights application was filed against the university by one of its students for allegedly delaying the investigation of a sexual assault and being dismissive of their concerns. In 2018, the university cleared one of its professors of allegations of discrimination and antisemitism in an internal investigation after a complaint was filed by one of its students.

    The University of Toronto was the first Canadian university to amass a financial endowment greater than $1 billion, in 2007. On September 24, 2020, the university announced a $250 million gift to the Faculty of Medicine from businessman and philanthropist James C. Temerty, the largest single philanthropic donation in Canadian history. This broke the previous record for the school set in 2019, when Gerry Schwartz and Heather Reisman jointly donated $100 million for the creation of a 750,000-square-foot innovation and artificial intelligence centre.

    Research

    Since 1926, the University of Toronto has been a member of the Association of American Universities, a consortium of the leading North American research universities. The university manages by far the largest annual research budget of any university in Canada, with sponsored direct-cost expenditures of $878 million in 2010. In 2018, the University of Toronto was named the top research university in Canada by Research Infosource, with a sponsored research income (external sources of funding) of $1,147.584 million in 2017. In the same year, the university’s faculty averaged a sponsored research income of $428,200, while graduate students averaged a sponsored research income of $63,700. The federal government was the largest source of funding, with grants from the Canadian Institutes of Health Research, the Natural Sciences and Engineering Research Council, and the Social Sciences and Humanities Research Council amounting to about one-third of the research budget. About eight percent of research funding came from corporations, mostly in the healthcare industry.

    The first practical electron microscope was built by the physics department in 1938. During World War II, the university developed the G-suit, a life-saving garment worn by Allied fighter plane pilots, later adopted for use by astronauts. Development of the infrared chemiluminescence technique improved analyses of energy behaviours in chemical reactions. In 1963, the asteroid 2104 Toronto was discovered at the David Dunlap Observatory (CA) in Richmond Hill and is named after the university. In 1972, studies on Cygnus X-1 led to the publication of the first observational evidence proving the existence of black holes. Toronto astronomers have also discovered the Uranian moons of Caliban and Sycorax; the dwarf galaxies of Andromeda I, II and III; and the supernova SN 1987A. A pioneer in computing technology, the university designed and built UTEC, one of the world’s first operational computers, and later purchased Ferut, the second commercial computer after UNIVAC I. Multi-touch technology was developed at Toronto, with applications ranging from handheld devices to collaboration walls. The AeroVelo Atlas, which won the Igor I. Sikorsky Human Powered Helicopter Competition in 2013, was developed by the university’s team of students and graduates and was tested in Vaughan.

    The discovery of insulin at the University of Toronto in 1921 is considered among the most significant events in the history of medicine. The stem cell was discovered at the university in 1963, forming the basis for bone marrow transplantation and all subsequent research on adult and embryonic stem cells. This was the first of many findings at Toronto relating to stem cells, including the identification of pancreatic and retinal stem cells. The cancer stem cell was first identified in 1997 by Toronto researchers, who have since found stem cell associations in leukemia, brain tumors, and colorectal cancer. Medical inventions developed at Toronto include the glycaemic index; the infant cereal Pablum; the use of protective hypothermia in open heart surgery; and the first artificial cardiac pacemaker. The first successful single-lung transplant was performed at Toronto in 1981, followed by the first nerve transplant in 1988 and the first double-lung transplant in 1989. Researchers identified the maturation promoting factor that regulates cell division and discovered the T-cell receptor, which triggers responses of the immune system. The university is credited with isolating the genes that cause Fanconi anemia, cystic fibrosis, and early-onset Alzheimer’s disease, among numerous other diseases. Between 1914 and 1972, the university operated the Connaught Medical Research Laboratories, now part of the pharmaceutical corporation Sanofi-Aventis. Among the research conducted at the laboratory was the development of gel electrophoresis.

    The University of Toronto is the primary research presence that supports one of the world’s largest concentrations of biotechnology firms. More than 5,000 principal investigators reside within 2 kilometres (1.2 mi) of the university grounds in Toronto’s Discovery District, conducting $1 billion of medical research annually. MaRS Discovery District is a research park that serves commercial enterprises and the university’s technology transfer ventures. In 2008, the university disclosed 159 inventions and had 114 active start-up companies. Its SciNet Consortium operates the most powerful supercomputer in Canada.

     
  • richardmitnick 11:00 am on January 14, 2023 Permalink | Reply
    Tags: "How do customers feel about algorithms?", , Computer Science & Technology, , The Wharton School,   

    From The Wharton School At The University of Pennsylvania Via “Today”: “How do customers feel about algorithms?”


    From The Wharton School

    At


    The University of Pennsylvania

    Via

    “Today”

    1.12.23

    Wharton marketing professor Stefano Puntoni. (Image: Knowledge at Wharton)

    Many managers worry that algorithms alienate customers. New research from Wharton’s Stefano Puntoni looks at how the attitudes of customers are influenced by algorithmic versus human decision-making.

    Customers feel good about a company when its representatives make decisions in their favor, such as approving their loan application or gold member status. But when an algorithm reaches the same favorable conclusion, those warm and fuzzy feelings tend to fade.

    This surprising contradiction is revealed in a new paper that examines how customers react differently depending on whether a computer or a fellow human being decides their fate.


    In the study, Wharton marketing professor Stefano Puntoni and his colleagues found that customers are happiest when they receive a positive decision from a person, less happy when the positive decision is made by an algorithm, and equally unhappy with both man and machine when the news is bad. Puntoni is a co-author of “Thumbs Up or Down: Consumer Reactions to Decisions by Algorithms Versus Humans,” published in the Journal of Marketing Research [below].

    “What’s interesting is that if you talk to companies, they’ll often tell you that they’re reluctant to let algorithms make decisions because they are worried about what would happen to customers when things go wrong. But we don’t actually find that. The negative consequences of using algorithms for companies seem to be, in fact, when the news is good,” Puntoni says.

    The researchers believe the results can be explained through attribution theory, a psychology term that refers to how people translate their own experiences and perceptions to make sense of their place in the world. Simply put, people have a psychological need to feel good about themselves, and it helps to internalize a good decision and externalize a bad one. When a company representative greenlights a request, customers attribute that to their own exemplary behavior, social status, excellent credit score, or other value-adds to the firm. That’s harder to do when the decision-maker is a bot.

    “These decisions are diagnostic of some characteristic of ourselves,” Puntoni says. “People find it easier to internalize the good decision when the decision was made by a person. Now they get what they want, and it feels better to them that it was a human [deciding] than if it was an algorithm.”

    Science paper:
    Journal of Marketing Research

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    The Wharton School of the University of Pennsylvania is the business school of the University of Pennsylvania, a private Ivy League research university in Philadelphia. Generally considered to be one of the most prestigious business schools in the world, the Wharton School is the world’s oldest collegiate business school, having been established in 1881 through a donation from Joseph Wharton.

    The Wharton School awards the Bachelor of Science with a school-specific economics major, with concentrations in over 18 disciplines in Wharton’s academic departments. The degree is a general business degree focused on core business skills. At the graduate level, the Master of Business Administration (MBA) program can be pursued standalone or as dual studies leading to a joint degree from other schools (e.g., law, engineering, government). Similarly, in addition to tracks in accounting, finance, operations, statistics, and other academic departments, the doctoral and post-doctoral programs co-sponsor several diploma programs in conjunction with other schools within the University. The school was a pioneer in so-called ‘Executive Education’: custom learning experiences that provide academic enrichment but confer no academic standing.

    Since the establishment of journalistic rankings, Wharton’s undergraduate and graduate programs have been consistently ranked in the top tier. Its MBA program is ranked No. 1 in the World according to the Financial Times and No. 1 in the United States according to the 2023 U.S. News & World Report ranking. Meanwhile, Wharton’s undergraduate business program has been ranked No. 1 in the United States, and the MBA for Executives No. 2 in the US by U.S. News. MBA graduates of Wharton earn an average $175,000 (USD) first year base pay, the highest of all the leading business schools. Wharton’s MBA program is tied for the highest in the United States with an average GMAT score of 732 (97th percentile) for its entering class.

    Wharton School alumni include Tesla, SpaceX, and Twitter CEO Elon Musk, former U.S. President Donald Trump, and billionaire investor Warren Buffett. Current and former CEOs of Fortune 500 companies including Alphabet Inc., Boeing, Comcast, General Electric, Johnson & Johnson, Oracle, Pfizer, PepsiCo, and Tesla are also Wharton School alumni.

    U Penn campus

    Academic life at University of Pennsylvania is unparalleled, with 100 countries and every U.S. state represented in one of the Ivy League’s most diverse student bodies. Consistently ranked among the top 10 universities in the country, Penn enrolls 10,000 undergraduate students and welcomes an additional 10,000 students to our world-renowned graduate and professional schools.

    Penn’s award-winning educators and scholars encourage students to pursue inquiry and discovery, follow their passions, and address the world’s most challenging problems through an interdisciplinary approach.

    The University of Pennsylvania is a private Ivy League research university in Philadelphia, Pennsylvania. The university claims a founding date of 1740 and is one of the nine colonial colleges chartered prior to the U.S. Declaration of Independence. Benjamin Franklin, Penn’s founder and first president, advocated an educational program that trained leaders in commerce, government, and public service, similar to a modern liberal arts curriculum.

    Penn has four undergraduate schools as well as twelve graduate and professional schools. Schools enrolling undergraduates include the College of Arts and Sciences; the School of Engineering and Applied Science; the Wharton School; and the School of Nursing. Penn’s “One University Policy” allows students to enroll in classes in any of Penn’s twelve schools. Among its highly ranked graduate and professional schools are a law school whose first professor wrote the first draft of the United States Constitution, the first school of medicine in North America (Perelman School of Medicine, 1765), and the first collegiate business school (Wharton School, 1881).

Penn is also home to the first “student union” building and organization (Houston Hall, 1896), the first Catholic student club in North America (Newman Center, 1893), the first double-decker college football stadium (Franklin Field, when its second deck was constructed in 1924), and Morris Arboretum, the official arboretum of the Commonwealth of Pennsylvania. The first general-purpose electronic computer (ENIAC) was developed at Penn and formally dedicated in 1946. In 2019, the university had an endowment of $14.65 billion, the sixth-largest of all universities in the United States, as well as a research budget of $1.02 billion. The university’s athletics program, the Quakers, fields varsity teams in 33 sports as a member of the NCAA Division I Ivy League conference.

As of 2018, distinguished alumni and/or Trustees include three U.S. Supreme Court justices; 32 U.S. senators; 46 U.S. governors; 163 members of the U.S. House of Representatives; eight signers of the Declaration of Independence and seven signers of the U.S. Constitution (four of whom signed both, two-thirds of the six people who signed both documents); 24 members of the Continental Congress; 14 foreign heads of state; and two presidents of the United States, including Donald Trump. As of October 2019, 36 Nobel laureates; 80 members of the American Academy of Arts and Sciences; 64 billionaires; 29 Rhodes Scholars; 15 Marshall Scholars and 16 Pulitzer Prize winners have been affiliated with the university.

    History

    The University of Pennsylvania considers itself the fourth-oldest institution of higher education in the United States, though this is contested by Princeton University and Columbia University. The university also considers itself as the first university in the United States with both undergraduate and graduate studies.

In 1740, a group of Philadelphians joined together to erect a great preaching hall for the traveling evangelist George Whitefield, who toured the American colonies delivering open-air sermons. The building was designed and built by Edmund Woolley and was the largest building in the city at the time, drawing thousands of people the first time a sermon was preached in it. It was initially planned to serve as a charity school as well, but a lack of funds forced plans for the chapel and school to be suspended. According to Franklin’s autobiography, it was in 1743 that he first had the idea to establish an academy, “thinking the Rev. Richard Peters a fit person to superintend such an institution”. However, Peters declined a casual inquiry from Franklin and nothing further was done for another six years. In the fall of 1749, now more eager to create a school to educate future generations, Benjamin Franklin circulated a pamphlet titled Proposals Relating to the Education of Youth in Pensilvania, his vision for what he called a “Public Academy of Philadelphia”. Unlike the other colonial colleges that existed in 1749—Harvard University, William & Mary, Yale University, and The College of New Jersey—Franklin’s new school would not focus merely on education for the clergy. He advocated an innovative concept of higher education, one which would teach both the ornamental knowledge of the arts and the practical skills necessary for making a living and doing public service. The proposed program of study could have become the nation’s first modern liberal arts curriculum, although it was never implemented because Anglican priest William Smith (1727-1803), who became the first provost, and other trustees strongly preferred the traditional curriculum.

Franklin assembled a board of trustees from among the leading citizens of Philadelphia, the first such non-sectarian board in America. At the first meeting of the 24 members of the board of trustees on November 13, 1749, the issue of where to locate the school was a prime concern. Although a lot across Sixth Street from the old Pennsylvania State House (later renamed and famously known since 1776 as “Independence Hall”) was offered without cost by James Logan, its owner, the trustees realized that the building erected in 1740, which was still vacant, would be an even better site. The original sponsors of the dormant building still owed considerable construction debts and asked Franklin’s group to assume their debts and, accordingly, their inactive trusts. On February 1, 1750, the new board took over the building and trusts of the old board. On August 13, 1751, the “Academy of Philadelphia”, using the great hall at 4th and Arch Streets, took in its first secondary students. A charity school also was chartered on July 13, 1753, in accordance with the intentions of the original “New Building” donors, although it lasted only a few years. On June 16, 1755, the “College of Philadelphia” was chartered, paving the way for the addition of undergraduate instruction. All three schools shared the same board of trustees and were considered to be part of the same institution. The first commencement exercises were held on May 17, 1757.

    The institution of higher learning was known as the College of Philadelphia from 1755 to 1779. In 1779, not trusting then-provost the Reverend William Smith’s “Loyalist” tendencies, the revolutionary State Legislature created a University of the State of Pennsylvania. The result was a schism, with Smith continuing to operate an attenuated version of the College of Philadelphia. In 1791, the legislature issued a new charter, merging the two institutions into a new University of Pennsylvania with twelve men from each institution on the new board of trustees.

    Penn has three claims to being the first university in the United States, according to university archives director Mark Frazier Lloyd: the 1765 founding of the first medical school in America made Penn the first institution to offer both “undergraduate” and professional education; the 1779 charter made it the first American institution of higher learning to take the name of “University”; and existing colleges were established as seminaries (although, as detailed earlier, Penn adopted a traditional seminary curriculum as well).

    After being located in downtown Philadelphia for more than a century, the campus was moved across the Schuylkill River to property purchased from the Blockley Almshouse in West Philadelphia in 1872, where it has since remained in an area now known as University City. Although Penn began operating as an academy or secondary school in 1751 and obtained its collegiate charter in 1755, it initially designated 1750 as its founding date; this is the year that appears on the first iteration of the university seal. Sometime later in its early history, Penn began to consider 1749 as its founding date and this year was referenced for over a century, including at the centennial celebration in 1849. In 1899, the board of trustees voted to adjust the founding date earlier again, this time to 1740, the date of “the creation of the earliest of the many educational trusts the University has taken upon itself”. The board of trustees voted in response to a three-year campaign by Penn’s General Alumni Society to retroactively revise the university’s founding date to appear older than Princeton University, which had been chartered in 1746.

    Research, innovations and discoveries

    Penn is classified as an “R1” doctoral university: “Highest research activity.” Its economic impact on the Commonwealth of Pennsylvania for 2015 amounted to $14.3 billion. Penn’s research expenditures in the 2018 fiscal year were $1.442 billion, the fourth largest in the U.S. In fiscal year 2019 Penn received $582.3 million in funding from the National Institutes of Health.

In line with its well-known interdisciplinary tradition, Penn’s research centers often span two or more disciplines. In the 2010–2011 academic year alone, five interdisciplinary research centers were created or substantially expanded; these include the Center for Health-care Financing; the Center for Global Women’s Health at the Nursing School; the $13 million Morris Arboretum’s Horticulture Center; the $15 million Jay H. Baker Retailing Center at Wharton; and the $13 million Translational Research Center at Penn Medicine. With these additions, Penn now counts 165 research centers hosting a research community of over 4,300 faculty, over 1,100 postdoctoral fellows, and 5,500 academic support staff and graduate student trainees. To further assist the advancement of interdisciplinary research, President Amy Gutmann established the “Penn Integrates Knowledge” title, awarded to selected Penn professors “whose research and teaching exemplify the integration of knowledge”. These professors hold endowed professorships and joint appointments between Penn’s schools.

    Penn is also among the most prolific producers of doctoral students. With 487 PhDs awarded in 2009, Penn ranks third in the Ivy League, only behind Columbia University and Cornell University (Harvard University did not report data). It also has one of the highest numbers of post-doctoral appointees (933 in number for 2004–2007), ranking third in the Ivy League (behind Harvard and Yale University) and tenth nationally.

    In most disciplines Penn professors’ productivity is among the highest in the nation and first in the fields of epidemiology, business, communication studies, comparative literature, languages, information science, criminal justice and criminology, social sciences and sociology. According to the National Research Council nearly three-quarters of Penn’s 41 assessed programs were placed in ranges including the top 10 rankings in their fields, with more than half of these in ranges including the top five rankings in these fields.

Penn’s research tradition has historically been complemented by innovations that shaped higher education. In addition to establishing the first medical school, the first university teaching hospital, the first business school, and the first student union, Penn was also the cradle of other significant developments. In 1852, Penn Law was the first law school in the nation to publish a law journal still in existence (then called The American Law Register, now the Penn Law Review, one of the most cited law journals in the world). Under the deanship of William Draper Lewis, the law school was also one of the first schools to emphasize legal teaching by full-time professors instead of practitioners, a system that is still followed today. The Wharton School was home to several pioneering developments in business education. It established the first research center in a business school in 1921 and the first center for entrepreneurship in 1973, and it regularly introduced novel curricula, for which BusinessWeek wrote, “Wharton is on the crest of a wave of reinvention and change in management education”.

    Several major scientific discoveries have also taken place at Penn. The university is probably best known as the place where the first general-purpose electronic computer (ENIAC) was born in 1946 at the Moore School of Electrical Engineering.

    ENIAC UPenn

It was here also that the world’s first spelling and grammar checkers were created, as well as the popular COBOL programming language. Penn can also boast some of the most important discoveries in the field of medicine. The dialysis machine used as an artificial replacement for lost kidney function was conceived and devised out of a pressure cooker by William Inouye while he was still a student at Penn Med; the Rubella and Hepatitis B vaccines were developed at Penn; and cancer’s link with genes, cognitive therapy, Retin-A (the cream used to treat acne), Resistin, the Philadelphia chromosome (linked to chronic myelogenous leukemia) and the technology behind PET scans were all discovered by Penn Med researchers. More recent gene research has led to the discovery of the genes for fragile X syndrome, the most common form of inherited intellectual disability; spinal and bulbar muscular atrophy, a disorder marked by progressive muscle wasting; and Charcot–Marie–Tooth disease, a progressive neurodegenerative disease that affects the hands, feet and limbs.

    Conductive polymer was also developed at Penn by Alan J. Heeger, Alan MacDiarmid and Hideki Shirakawa, an invention that earned them the Nobel Prize in Chemistry. On faculty since 1965, Ralph L. Brinster developed the scientific basis for in vitro fertilization and the transgenic mouse at Penn and was awarded the National Medal of Science in 2010. The theory of superconductivity was also partly developed at Penn, by then-faculty member John Robert Schrieffer (along with John Bardeen and Leon Cooper). The university has also contributed major advancements in the fields of economics and management. Among the many discoveries are conjoint analysis, widely used as a predictive tool especially in market research; Simon Kuznets’s method of measuring Gross National Product; the Penn effect (the observation that consumer price levels in richer countries are systematically higher than in poorer ones) and the “Wharton Model” developed by Nobel-laureate Lawrence Klein to measure and forecast economic activity. The idea behind Health Maintenance Organizations also belonged to Penn professor Robert Eilers, who put it into practice during then-President Nixon’s health reform in the 1970s.

    International partnerships

Students can study abroad for a semester or a year at partner institutions such as the London School of Economics (UK), University of Barcelona [Universitat de Barcelona] (ES), Paris Institute of Political Studies [Institut d’études politiques de Paris] (FR), University of Queensland (AU), University College London (UK), King’s College London (UK), Hebrew University of Jerusalem (IL) and University of Warwick (UK).

     
  • richardmitnick 9:10 am on January 9, 2023 Permalink | Reply
    Tags: "BCIs": brain-computer interfaces, "Family matters - For the Winston siblings the intersection of software engineering and neuroscience research is relatively inspiring at The University of Washington", , Computer Science & Technology, , Neural Engineering, , ,   

    From The Paul G. Allen School of Computer Science and Engineering In The College of Engineering At The University of Washington : “Family matters – For the Winston siblings the intersection of software engineering and neuroscience research is relatively inspiring at The University of Washington” 

    From The Paul G. Allen School of Computer Science and Engineering

    In

    The College of Engineering

    At

    The University of Washington

    1.7.23
    Kristin Osborne

    1
The Winston siblings pose for a family photo in downtown Pittsburgh during the ICSE 2022 conference (from left): Caleb, Cailin, Cleah, Claris and Chloe. Credit: The University of Washington.

Back in May, a group of five student researchers advised by Allen School professors Rajesh Rao and René Just arrived in Pittsburgh, Pennsylvania, for the 44th International Conference on Software Engineering. They had traveled to ICSE 2022 from Seattle to present a paper [Proceedings of the 44th International Conference on Software Engineering (below)] describing a methodology they had developed at the University of Washington for detecting and repairing faults in brain-computer interfaces (BCIs), which are designed to enhance or restore sensorimotor function in people with neurological disorders or spinal cord injury.

    The paper was noteworthy for its contributions toward ensuring that BCIs, which decode or encode neural signals to mediate the connection between the brain and assistive devices, are safe and robust for everyday use. The team was noteworthy for their connection with each other: All five student co-authors — Cailin, twins Caleb and Chloe, Claris, and Cleah — are siblings. And all five were, or were about to become, Allen School majors.

    The research that prompted the ICSE paper had its roots in a project initiated by four of the siblings during Rao’s Neural Engineering capstone course last year. While Rao appreciated the novelty of so many siblings working on the same project, he was most appreciative of their ambition and ingenuity in tackling an open problem in neural software engineering with the potential to significantly improve people’s quality of life.

    “The field of BCIs is still in its early stages, with most researchers focusing on proof-of-concept demonstrations,” said Rao, co-director of the Center for Neurotechnology and the Cherng Jia and Elizabeth Yun Hwang Professor in the Allen School and the UW Department of Electrical & Computer Engineering. “I was therefore surprised and impressed when the Winston team proposed a forward-looking class project seeking to apply state-of-the-art techniques in software engineering to the design and implementation of BCIs.”

    Recent Allen School alum Cailin Winston (B.S., ‘20, M.S., ‘22) — the eldest of the Winston siblings — developed and evaluated components of the team’s approach, which applies widely accepted methods for automated software testing and debugging, such as partial test oracles for detecting faults, corrective heuristics for labeling faulty data and slice functions for localizing faults, to the nascent domain of BCIs. The acquired data is then used to retrain the model to correct its performance of fault-prone tasks or used to suggest additional classes of data to target data collection and labeling. A student in the Allen School’s fifth-year master’s program at the time of publication, Cailin was already keenly aware of the importance of software and computational methods to biomedical research. That awareness prompted her to seek out ways to explore the intersection of the two disciplines early in her academic career.
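The paper’s code is not reproduced here, but the loop Cailin’s components form (detect with a partial test oracle, localize with a slice function, relabel with a corrective heuristic, then retrain) is easy to sketch. The following Python toy is purely illustrative; every name in it, from ToyDecoder to slice_function, is a hypothetical stand-in rather than the team’s actual implementation:

```python
import numpy as np

# Toy linear "decoder" standing in for a trained BCI model: it maps a
# window of neural features to a predicted 2D cursor velocity. Every
# name here is a hypothetical illustration, not the paper's code.
class ToyDecoder:
    def __init__(self, n_features, rng):
        self.W = rng.normal(size=(2, n_features))

    def predict(self, features):
        return self.W @ features

def partial_oracle(velocity, max_speed=1.0):
    """Partial test oracle: it cannot certify that a prediction is
    correct, but it can flag outputs violating a known physical
    constraint (here, an implausibly fast cursor)."""
    return np.linalg.norm(velocity) <= max_speed

def corrective_heuristic(velocity, max_speed=1.0):
    """Corrective heuristic: relabel a faulty output with a plausible
    target by clipping the velocity back into the feasible range."""
    return velocity * (max_speed / np.linalg.norm(velocity))

def slice_function(features):
    """Slice function: map each input to a coarse region of the input
    space so that faults can be localized to particular slices."""
    return "high-amplitude" if np.abs(features).mean() > 1.0 else "baseline"

rng = np.random.default_rng(seed=0)
decoder = ToyDecoder(n_features=16, rng=rng)
faults_per_slice, retraining_set = {}, []

for _ in range(200):
    x = rng.normal(scale=rng.choice([0.5, 2.0]), size=16)
    v = decoder.predict(x)
    if not partial_oracle(v):                      # detect the fault
        region = slice_function(x)                 # localize it
        faults_per_slice[region] = faults_per_slice.get(region, 0) + 1
        retraining_set.append((x, corrective_heuristic(v)))  # relabel

print(faults_per_slice)                 # fault counts grouped by slice
print(len(retraining_set), "examples relabeled for retraining")
```

In a real BCI pipeline the relabeled examples would be fed back into training so the decoder improves on exactly the fault-prone slices the oracle flagged; that is the repair half of the detect-and-repair loop described above.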

    2
    Cailin Winston presents the group’s paper on techniques for repairing BCIs at ICSE 2022. Credit: The University of Washington.

    “I initially got involved by contacting research groups at the University of Washington with prior publications that piqued my interest,” explained Cailin, who joined NVIDIA as a Deep Learning Engineer after graduation. “Attending research talks and colloquiums also made me aware of the various research projects being carried out and allowed me to further my involvement.”

    As it turns out, the people who would take her involvement furthest — all the way to Pittsburgh as first author of a major conference paper — were closest to home. Her brother, Caleb Winston (B.S., ‘22), was also eager to find a pathway into research; in his case, it was a weekly reading group focused on the latest program synthesis papers organized by graduate students in the Allen School’s Programming Languages & Software Engineering (PLSE) group that set him on his way. Fast forward a few years, and Caleb and Cailin are collaborating on a methodology for real-time debugging and repair of BCIs and writing a paper accepted to one of the top conferences in the field.

    In addition to sharing responsibilities for aspects of the ICSE paper, Caleb also shared his sister’s interest in how computing intersects with biomedicine — along with many other fields.

    “Computer science intersects with so many different fields of study, from law, to healthcare, to urban planning,” noted Caleb, who is currently pursuing a Ph.D. in Computer Science at Stanford University after graduating from the Allen School in the spring. “This generalizability is what excites me to study programming languages, software engineering, AI and hardware/software systems. Simple ideas from these subfields have potentially impactful applications in many fields outside of computing.”

Claris Winston, who is now in her third year at the UW, became intrigued by the connection between computing and biomedicine in part through her experience working on a mobile app for scoliosis treatment, as well as her participation in a summer computing camp organized by Girls Who Code. After earning direct admission to the Allen School as a freshman, she worked with members of the Molecular Information Systems Lab (MISL) on a new combinatorial polymerase chain reaction method [ACS Synthetic Biology (below)] for efficient retrieval of DNA oligo pools. That work, on which Claris was first author, was published in a journal that also featured her graphic design on its front cover. As she subsequently discovered at ICSE, presenting her research at a conference offered an entirely different — and exhilarating — experience.

    “It was exciting to see researchers from all over the world and with such diverse backgrounds,” said Claris, who currently works with Allen School professor Jennifer Mankoff in the Make4All Group on research related to optimization for embroidered tactile graphics. “I was impressed by the range of topics covered, and the talks themselves had so many creative ideas and applications in the field.”

    Youngest sibling and current freshman Cleah Winston contributed to the ICSE paper even before she arrived at the UW. She credits this and other early research experiences with opening her eyes to how an Allen School education would help her reach her goal of creating real-world impact.

    “After being involved in several research projects in high school, I realized how much I enjoyed designing and developing solutions to problems in society,” she explained. “I felt that studying computer science would give me the tools and thought process for designing such solutions.”

The BCI project certainly gave her a head start in that regard: she collaborated with sisters Claris and Chloe Winston (B.S., ‘22) to implement a set of neural decoding BCI applications and to use focused data acquisition and data labeling techniques to evaluate the team’s methodology for testing and repairing BCIs. Cleah is currently working with Allen School professor Byron Boots in the Robot Learning Lab, exploring neural networks for computer vision and applications for hazard avoidance for off-road autonomous vehicles.

    Chloe, who also took the lead on the statistical analysis of the results, credited an “internship-like class” in biotechnology research that she took in high school with setting her on a path to research at the UW. Her experience in the Garden Laboratory, in UW Medicine’s Department of Neurology, further fueled her love for research.

    “I enjoyed the process of formulating research questions and designing and conducting experiments,” said Chloe, who double-majored in computer science and neuroscience at the UW. “That experience led me to seek other research opportunities throughout my undergraduate years.”

    The two senior authors, Rao and Just, saw to it that all five student researchers would be able to attend the conference in person. While the siblings enjoyed the thrill of presenting their work to more senior researchers, their first conference experience was memorable for a variety of other reasons.

    “I especially enjoyed the talks in the ‘Human Aspects of Software Engineering’ session, in which the social and cognitive aspects of the field were discussed,” said Claris. “Many of these topics were ones that I had not thought deeply about before, but this research is very important to study so we can build better engineering communities and software that benefits everyone.”

Chloe, meanwhile, found the conference eye-opening for the breadth of research happening in software engineering and the diverse problems it is trying to solve. The experience also impressed upon her the importance of researchers showing up to share their work. It’s a lesson she took with her to the University of Pennsylvania, where she is pursuing an M.D./Ph.D. with the goal of incorporating deep learning techniques into biomedical research and patient care as a physician-scientist.

    “The conference environment was highly collaborative, and I was impressed by how new ideas were sparked through presentation and discussion,” she said. “Despite the inconvenience of travel and the anxiety that can come with presenting, I aim to continue attending and presenting at conferences. This is how new research directions are formed.”

    For Caleb, ICSE offered a chance to bring what, at this point, could arguably be referred to as “the family business” full circle.

    “The first time we all worked on a research project together was in high school. We were printing, cutting, and taping together a poster on precision medicine the night before a science fair,” he recalled. “It’s exciting to think that we have gone from working on high school science fair projects to cutting-edge research at the intersection of neural engineering and software engineering, which led to us presenting at ICSE.”

Read the siblings’ retrospective on ICSE 2022 here.

    Science papers:
    Proceedings of the 44th International Conference on Software Engineering
    ACS Synthetic Biology

See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    About the University of Washington Paul G. Allen School of Computer Science and Engineering
    Mission, Facts, and Stats

    Our mission is to develop outstanding engineers and ideas that change the world.

    Faculty:
    275 faculty (25.2% women)
    Achievements:

    128 NSF Young Investigator/Early Career Awards since 1984
    32 Sloan Foundation Research Awards
    2 MacArthur Foundation Fellows (2007 and 2011)

    A national leader in educating engineers, each year the College turns out new discoveries, inventions and top-flight graduates, all contributing to the strength of our economy and the vitality of our community.

    Engineering innovation

PEOPLE

Innovation at UW ECE is exemplified by our outstanding faculty and by the exceptional group of students they advise and mentor. Students receive a robust education through a strong technical foundation, group project work and hands-on research opportunities. Our faculty work in dynamic research areas with diverse opportunities for projects and collaborations. Through their research, they address complex global challenges in health, energy, technology and the environment, and receive significant research and education grants.

IMPACT

We continue to expand our innovation ecosystem by promoting an entrepreneurial mindset in our teaching and through diverse partnerships. The field of electrical and computer engineering is at the forefront of solving emerging societal challenges, empowered by innovative ideas from our community. As our department evolves, we are dedicated to expanding our faculty and student body to meet the growing demand for engineers. We welcomed six new faculty hires in the 2018-2019 academic year. Our meaningful connections and collaborations place the department as a leader in the field.

    Engineers drive the innovation economy and are vital to solving society’s most challenging problems. The College of Engineering is a key part of a world-class research university in a thriving hub of aerospace, biotechnology, global health and information technology innovation. Over 50% of UW startups in FY18 came from the College of Engineering.

    Commitment to diversity and access

    The College of Engineering is committed to developing and supporting a diverse student body and faculty that reflect and elevate the populations we serve. We are a national leader in women in engineering; 25.5% of our faculty are women compared to 17.4% nationally. We offer a robust set of diversity programs for students and faculty.

    u-washington-campus

    The University of Washington is an engine of economic growth, today ranked third in the nation for the number of startups launched each year, with 65 companies having been started in the last five years alone by UW students and faculty, or with technology developed here.

The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and in the process transform lives and our world.


    So what defines us —the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

    The University of Washington is a public research university in Seattle, Washington, United States. Founded in 1861, University of Washington is one of the oldest universities on the West Coast; it was established in downtown Seattle approximately a decade after the city’s founding to aid its economic development. Today, the university’s 703-acre main Seattle campus is in the University District above the Montlake Cut, within the urban Puget Sound region of the Pacific Northwest. The university has additional campuses in Tacoma and Bothell. Overall, University of Washington encompasses over 500 buildings and over 20 million gross square footage of space, including one of the largest library systems in the world with more than 26 university libraries, as well as the UW Tower, lecture halls, art centers, museums, laboratories, stadiums, and conference centers. The university offers bachelor’s, master’s, and doctoral degrees through 140 departments in various colleges and schools, sees a total student enrollment of roughly 46,000 annually, and functions on a quarter system.

University of Washington is a member of the Association of American Universities and is classified among “R1: Doctoral Universities – Very high research activity”. According to the National Science Foundation, UW spent $1.41 billion on research and development in 2018, ranking it 5th in the nation. As the flagship institution of the six public universities in Washington state, it is known for its medical, engineering and scientific research as well as its highly competitive computer science and engineering programs. Additionally, University of Washington continues to benefit from its deep historic ties and major collaborations with numerous technology giants in the region, such as Amazon, Boeing, Nintendo, and particularly Microsoft. Paul G. Allen, Bill Gates and others spent significant time in Washington computer labs working on a startup venture before founding Microsoft and other companies. The University of Washington’s 22 varsity sports teams are also highly competitive, competing as the Huskies in the Pac-12 Conference of the NCAA Division I, representing the United States at the Olympic Games, and other major competitions.

    The university has been affiliated with many notable alumni and faculty, including 21 Nobel Prize laureates and numerous Pulitzer Prize winners, Fulbright Scholars, Rhodes Scholars and Marshall Scholars.

    In 1854, territorial governor Isaac Stevens recommended the establishment of a university in the Washington Territory. Prominent Seattle-area residents, including Methodist preacher Daniel Bagley, saw this as a chance to add to the city’s potential and prestige. Bagley learned of a law that allowed United States territories to sell land to raise money in support of public schools. At the time, Arthur A. Denny, one of the founders of Seattle and a member of the territorial legislature, aimed to increase the city’s importance by moving the territory’s capital from Olympia to Seattle. However, Bagley eventually convinced Denny that the establishment of a university would assist more in the development of Seattle’s economy. Two universities were initially chartered, but later the decision was repealed in favor of a single university in Lewis County provided that locally donated land was available. When no site emerged, Denny successfully petitioned the legislature to reconsider Seattle as a location in 1858.

In 1861, scouting began for an appropriate 10-acre (4 ha) site in Seattle to serve as a new university campus. Arthur and Mary Denny donated eight acres, while fellow pioneers Edward Lander and Charlie and Mary Terry donated two acres on Denny’s Knoll in downtown Seattle. More specifically, this tract was bounded by 4th Avenue to the west, 6th Avenue to the east, Union Street to the north, and Seneca Street to the south.

John Pike, for whom Pike Street is named, was the university’s architect and builder. It opened on November 4, 1861, as the Territorial University of Washington. The legislature passed articles incorporating the University and establishing its Board of Regents in 1862. The school initially struggled, closing three times: in 1863 for low enrollment, and again in 1867 and 1876 due to funding shortages. The University of Washington awarded its first degree, a bachelor’s in science, to its first graduate, Clara Antoinette McCarty Wilt, in 1876.

    19th century relocation

By the time Washington state entered the Union in 1889, both Seattle and the University had grown substantially. University of Washington’s total undergraduate enrollment increased from 30 to nearly 300 students, and the campus’s relative isolation in downtown Seattle faced encroaching development. A special legislative committee, headed by University of Washington graduate Edmond Meany, was created to find a new campus to better serve the growing student population and faculty. The committee eventually selected a site on the northeast of downtown Seattle called Union Bay, which was the land of the Duwamish, and the legislature appropriated funds for its purchase and construction. In 1895, the University relocated to the new campus by moving into the newly built Denny Hall. The University Regents tried and failed to sell the old campus, eventually settling for leasing the area. This would later become one of the University’s most valuable pieces of real estate in modern-day Seattle, generating millions in annual revenue as what is now called the Metropolitan Tract. The original Territorial University building was torn down in 1908, and its former site now houses the Fairmont Olympic Hotel.

The sole surviving remnants of Washington’s first building are four 24-foot (7.3 m), white, hand-fluted cedar, Ionic columns. They were salvaged by Edmond S. Meany, one of the University’s first graduates and former head of its history department. Meany and his colleague, Dean Herbert T. Condon, dubbed the columns “Loyalty,” “Industry,” “Faith,” and “Efficiency,” or “LIFE.” The columns now stand in the Sylvan Grove Theater.

    20th century expansion

    Organizers of the 1909 Alaska-Yukon-Pacific Exposition eyed the still largely undeveloped campus as a prime setting for their world’s fair. They came to an agreement with Washington’s Board of Regents that allowed them to use the campus grounds for the exposition, surrounding today’s Drumheller Fountain facing towards Mount Rainier. In exchange, organizers agreed Washington would take over the campus and its development after the fair’s conclusion. This arrangement led to a detailed site plan and several new buildings, prepared in part by John Charles Olmsted. The plan was later incorporated into the overall University of Washington campus master plan, permanently affecting the campus layout.

    Both World Wars brought the military to campus, with certain facilities temporarily lent to the federal government. In spite of this, subsequent post-war periods were times of dramatic growth for the University. The period between the wars saw a significant expansion of the upper campus. Construction of the Liberal Arts Quadrangle, known to students as “The Quad,” began in 1916 and continued to 1939. The University’s architectural centerpiece, Suzzallo Library, was built in 1926 and expanded in 1935.

    After World War II, further growth came with the G.I. Bill. Among the most important developments of this period was the opening of the School of Medicine in 1946, which is now consistently ranked as the top medical school in the United States. It would eventually lead to the University of Washington Medical Center, ranked by U.S. News and World Report as one of the top ten hospitals in the nation.

    In 1942, all persons of Japanese ancestry in the Seattle area were forced into inland internment camps as part of Executive Order 9066 following the attack on Pearl Harbor. During this difficult time, university president Lee Paul Sieg took an active and sympathetic leadership role in advocating for and facilitating the transfer of Japanese American students to universities and colleges away from the Pacific Coast to help them avoid the mass incarceration. Nevertheless, many Japanese American students and “soon-to-be” graduates were unable to transfer successfully in the short time window or receive diplomas before being incarcerated. It was only many years later that they would be recognized for their accomplishments during the University of Washington’s Long Journey Home ceremonial event that was held in May 2008.

    From 1958 to 1973, the University of Washington saw a tremendous growth in student enrollment, its faculties and operating budget, and also its prestige under the leadership of Charles Odegaard. University of Washington student enrollment had more than doubled to 34,000 as the baby boom generation came of age. However, this era was also marked by high levels of student activism, as was the case at many American universities. Much of the unrest focused around civil rights and opposition to the Vietnam War. In response to anti-Vietnam War protests by the late 1960s, the University Safety and Security Division became the University of Washington Police Department.

Odegaard instituted a vision of building a “community of scholars”, convincing the Washington State legislature to increase investment in the University. Washington senators, such as Henry M. Jackson and Warren G. Magnuson, also used their political clout to gather research funds for the University of Washington. The results included an increase in the operating budget from $37 million in 1958 to over $400 million in 1973, solidifying the University of Washington’s standing as a top recipient of federal research funds in the United States. The establishment of technology giants such as Microsoft, Boeing and Amazon in the local area also proved to be highly influential in the University of Washington’s fortunes, not only improving graduate prospects but also helping to attract millions of dollars in university and research funding through its distinguished faculty and extensive alumni network.

    21st century

In 1990, the University of Washington opened its additional campuses in Bothell and Tacoma. Although originally intended for students who had already completed two years of higher education, both schools have since become four-year universities with the authority to grant degrees. The first freshman classes at these campuses started in fall 2006. Today both Bothell and Tacoma also offer a selection of master’s degree programs.

    In 2012, the University began exploring plans and governmental approval to expand the main Seattle campus, including significant increases in student housing, teaching facilities for the growing student body and faculty, as well as expanded public transit options. The University of Washington light rail station was completed in March 2015, connecting Seattle’s Capitol Hill neighborhood to the University of Washington Husky Stadium within five minutes of rail travel time. It offers a previously unavailable option of transportation into and out of the campus, designed specifically to reduce dependence on private vehicles, bicycles and local King County buses.

University of Washington has been listed as a “Public Ivy” in Greene’s Guides since 2001, and is an elected member of the Association of American Universities. Among the faculty by 2012, there have been 151 members of the American Association for the Advancement of Science, 68 members of the National Academy of Sciences, 67 members of the American Academy of Arts and Sciences, 53 members of the National Academy of Medicine, 29 winners of the Presidential Early Career Award for Scientists and Engineers, 21 members of the National Academy of Engineering, 15 Howard Hughes Medical Institute Investigators, 15 MacArthur Fellows, 9 winners of the Gairdner Foundation International Award, 5 winners of the National Medal of Science, 7 Nobel Prize laureates, 5 winners of the Albert Lasker Award for Clinical Medical Research, 4 members of the American Philosophical Society, 2 winners of the National Book Award, 2 winners of the National Medal of Arts, 2 Pulitzer Prize winners, 1 winner of the Fields Medal, and 1 member of the National Academy of Public Administration. Among UW students by 2012, there were 136 Fulbright Scholars, 35 Rhodes Scholars, 7 Marshall Scholars and 4 Gates Cambridge Scholars. UW is recognized as a top producer of Fulbright Scholars, ranking 2nd in the US in 2017.

    The Academic Ranking of World Universities (ARWU) has consistently ranked University of Washington as one of the top 20 universities worldwide every year since its first release. In 2019, University of Washington ranked 14th worldwide out of 500 by the ARWU, 26th worldwide out of 981 in the Times Higher Education World University Rankings, and 28th worldwide out of 101 in the Times World Reputation Rankings. Meanwhile, QS World University Rankings ranked it 68th worldwide, out of over 900.

    U.S. News & World Report ranked University of Washington 8th out of nearly 1,500 universities worldwide for 2021, with University of Washington’s undergraduate program tied for 58th among 389 national universities in the U.S. and tied for 19th among 209 public universities.

    In 2019, it ranked 10th among the universities around the world by SCImago Institutions Rankings. In 2017, the Leiden Ranking, which focuses on science and the impact of scientific publications among the world’s 500 major universities, ranked University of Washington 12th globally and 5th in the U.S.

    In 2019, Kiplinger Magazine’s review of “top college values” named University of Washington 5th for in-state students and 10th for out-of-state students among U.S. public colleges, and 84th overall out of 500 schools. In the Washington Monthly National University Rankings University of Washington was ranked 15th domestically in 2018, based on its contribution to the public good as measured by social mobility, research, and promoting public service.

     
  • richardmitnick 6:28 pm on January 8, 2023 Permalink | Reply
    Tags: "Flash Center moves to Rochester and advances cutting-edge physics research", , , , , Computer Science & Technology, , , , , ,   

    From The University of Rochester: “Flash Center moves to Rochester and advances cutting-edge physics research” 

    From The University of Rochester

    1.6.23

    1
Petros Tzeferacos (right), associate professor of physics and astronomy at The University of Rochester, senior scientist at the University’s Laboratory for Laser Energetics (LLE), and director of the Flash Center for Computational Science, uses the University’s VISTA Collaboratory visualization facility to explain FLASH simulations of a laser-driven experiment to (from left) LLE deputy director Chris Deeney, Flash Center graduate research assistant and Horton Fellow Abigail Armstrong, and Flash Center research scientist Adam Reyes. The center is devoted to computer simulations used to advance an understanding of astrophysics, plasma science, high-energy-density physics, and fusion energy. (Photo: J. Adam Fenster/The University of Rochester.)

    The Flash Center for Computational Science offers researchers worldwide access to a computer code that simulates phenomena in astrophysics, high-energy-density science, and fusion research.

    UPDATE: New FLASH code expands possibilities for physics experiments (January 6, 2023)

    The University of Rochester is the new home of a research center devoted to computer simulations used to advance the understanding of astrophysics, plasma science, high-energy-density physics, and fusion energy.

    The Flash Center for Computational Science recently moved from the University of Chicago to the Department of Physics and Astronomy at Rochester. Located in the Bausch and Lomb building on the River Campus, the center encompasses numerous cross-disciplinary, computational physics research projects conducted using the FLASH code. The FLASH code is a publicly available multi-physics code that allows researchers to accurately simulate and model many scientific phenomena—including plasma physics, computational fluid dynamics, high-energy-density physics (HEDP), and fusion energy research—and inform the design and execution of experiments.
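FLASH itself is a large, continuously developed code, and nothing below is drawn from its actual interface. Still, the core pattern shared by grid-based multi-physics solvers (discretize the domain, compute fluxes, advance the solution in small stability-limited time steps) can be illustrated with a toy sketch. Here is a deliberately minimal 1D finite-volume advection step in Python, assuming periodic boundaries and a constant transport speed:

```python
import numpy as np

# A deliberately tiny illustration of the kind of grid-based update a
# multi-physics simulation code performs: first-order finite-volume
# upwind advection of a density pulse on a periodic 1D mesh. This is
# NOT FLASH's API; it only shows the discretize/flux/update pattern.
nx, length, speed = 200, 1.0, 1.0
dx = length / nx
x = (np.arange(nx) + 0.5) * dx               # cell-center coordinates
rho = np.exp(-((x - 0.3) ** 2) / 0.005)      # initial Gaussian density

t, t_end = 0.0, 0.4
while t < t_end:
    dt = 0.5 * dx / speed                    # CFL-limited time step
    flux = speed * rho                       # upwind interface fluxes
    rho -= (dt / dx) * (flux - np.roll(flux, 1))  # conservative update
    t += dt

# The pulse should have advected from x = 0.3 to roughly x = 0.7.
print(f"density peak is now near x = {x[np.argmax(rho)]:.2f}")
```

A production code like FLASH layers many such operators, for hydrodynamics, magnetized plasmas, and the other physics named above, onto far more capable adaptive meshes, but each time step follows this same flux-and-update structure.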

    “We are thrilled to have the Flash Center and the FLASH code join the University of Rochester research enterprise and family, and we want to thank the University of Chicago for working hand-in-hand with us to facilitate this transfer,” says Stephen Dewhurst. Dewhurst, the vice dean for research at the School of Medicine and Dentistry and associate vice president for health sciences research for the University, is currently serving a one-year appointment as interim vice president for research.

    The ‘premiere’ code used at the world’s top laser facilities

    Development of the FLASH code began in 1997 when the Flash Center was founded at the University of Chicago. The code, which is continuously updated, is currently used by more than 3,500 scientists across the globe to simulate various physics processes.

    The Flash Center fosters joint research projects between national laboratories, industry partners, and academic groups around the world. It also supports training in numerical modeling and code development for graduate students, undergraduate students, and postdoctoral research associates, while continuing to develop and steward the FLASH code itself.

“In the last five years FLASH has become the premiere academic code for designing and interpreting experiments at the world’s largest laser facilities, such as the National Ignition Facility at The DOE’s Lawrence Livermore National Laboratory and the Omega Laser Facility at the Laboratory for Laser Energetics (LLE), here at the University of Rochester,” says Michael Campbell, the director of the LLE. “Having the Flash Center and the FLASH code at Rochester significantly strengthens LLE’s position as a unique national resource for research and education in science and technology.”

    Petros Tzeferacos, an associate professor of Physics and Astronomy and a senior scientist at the LLE, serves as the center’s director. Tzeferacos’s research combines theory, numerical modeling with the FLASH code, and laboratory experiments to study fundamental processes in Plasma Physics and Astrophysics, high-energy-density laboratory Astrophysics, and Fusion Energy. Tzeferacos became director of the Flash Center in 2018 after serving for five years as associate director and code group leader, when the center was still housed at the University of Chicago.

    “The University of Rochester is a unique place where Plasma Physics, Plasma Astrophysics, and high-energy-density science are core research efforts,” Tzeferacos says. “We have in-house computational resources and leverage the high-power computing resources at LLE, the Center for Integrated Research Computing (CIRC), and national supercomputing facilities to perform our numerical studies. We also train the next generation of Computational Physics and Astrophysics scientists in the use and development of simulation codes.”

    Research at the Flash Center is funded by the DOE National Nuclear Security Administration, the DOE Office of Science Fusion Energy Sciences, the US DOE Advanced Research Projects Agency, The National Science Foundation, The DOE’s Los Alamos National Laboratory, The DOE’s Lawrence Livermore National Laboratory, and the LLE.

    “FLASH is a critically important simulation tool for academic groups engaging with NNSA’s academic programs and performing HEDP research on NNSA facilities,” says Ann J. Satsangi, federal program manager at the NNSA Office of Experimental Sciences. “The Flash Center joining forces with the LLE is a very positive development that promises to significantly contribute to advancing high-energy-density science and the NNSA mission.”

    UPDATE: New FLASH code expands possibilities for physics experiments
The Flash Center for Computational Science at the University of Rochester recently announced an exciting milestone: researchers have developed a new version of the FLASH code, the first official update of the code since the Flash Center moved to Rochester from the University of Chicago.

    The new version of the code, FLASH v4.7, increases the accuracy of simulations of magnetized plasmas and drastically expands the range of laboratory experiments the code can model.

    “This expansion fuels discovery science for thousands of researchers around the world, across application domains, while concurrently enabling the Flash Center to pursue a rich portfolio of research topics at the frontiers of plasma astrophysics, high-energy-density physics, and fusion,” says Petros Tzeferacos, an associate professor of physics and astronomy at Rochester and a senior scientist at the LLE, who serves as the center’s director.

    FLASH v4.7 is the culmination of nearly two and a half years of code development, spearheaded by Adam Reyes, the Flash Center code group leader in the Department of Physics and Astronomy, and other Flash Center personnel.

    According to Tzeferacos, the development of the FLASH code also draws heavily from the Flash Center’s robust education program that engages Rochester graduate and undergraduate students.

    “A key aspect of what we do at the Flash Center is to train the next generation of computational physicists and astrophysicists to develop multi-physics codes like FLASH and perform validated simulations,” Tzeferacos says. “Several of the items in the new FLASH release were developed and verified by our graduate students, who may ultimately use the new capabilities in their graduate research.”

    Read more about the new FLASH code release here.

See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    University of Rochester campus

    The University of Rochester is a private research university in Rochester, New York. The university grants undergraduate and graduate degrees, including doctoral and professional degrees.

The University of Rochester enrolls approximately 6,800 undergraduates and 5,000 graduate students. Its 158 buildings house over 200 academic majors. According to the National Science Foundation, The University of Rochester spent $370 million on research and development in 2018, ranking it 68th in the nation. The university is the 7th-largest employer in the Finger Lakes region of New York.

The College of Arts, Sciences, and Engineering is home to departments and divisions of note. The Institute of Optics was founded in 1929 through a grant from Eastman Kodak and Bausch and Lomb as the first educational program in the US devoted exclusively to optics. It awards approximately half of all optics degrees nationwide and is widely regarded as the premier optics program in the nation and among the best in the world.

The Departments of Political Science and Economics have made a significant and consistent impact on positivist social science since the 1960s and historically rank in the top 5 in their fields. The Department of Chemistry is noted for its contributions to synthetic organic chemistry, including the first lab-based synthesis of morphine. The Rossell Hope Robbins Library serves as the university’s resource for Old and Middle English texts and expertise. The university is also home to Rochester’s Laboratory for Laser Energetics, a Department of Energy-supported national laboratory.

    University of Rochester Laboratory for Laser Energetics.

    The University of Rochester’s Eastman School of Music ranks first among undergraduate music schools in the U.S. The Sibley Music Library at Eastman is the largest academic music library in North America and holds the third largest collection in the United States.

In its history, The University of Rochester’s alumni and faculty have earned 13 Nobel Prizes, 13 Pulitzer Prizes, 45 Grammy Awards and 20 Guggenheim Awards, and have included 5 members of the National Academy of Sciences, 4 members of the National Academy of Engineering, 3 Rhodes Scholars, 3 members of the National Academy of Inventors, and 1 inductee of the National Inventors Hall of Fame.

    History

    Early history

The University of Rochester traces its origins to The First Baptist Church of Hamilton (New York), which was founded in 1796. In 1817 the church established the Baptist Education Society of the State of New York, later renamed the Hamilton Literary and Theological Institution. This institution gave birth to both Colgate University and The University of Rochester. Its function was to train clergy in the Baptist tradition. When it aspired to grant higher degrees, it created a collegiate division separate from the theological division.

    The collegiate division was granted a charter by the State of New York in 1846 after which its name was changed to Madison University. John Wilder and the Baptist Education Society urged that the new university be moved to Rochester, New York. However, legal action prevented the move. In response, dissenting faculty, students, and trustees defected and departed for Rochester, where they sought a new charter for a new university.

    Madison University was eventually renamed as Colgate University.

    Founding

Asahel C. Kendrick, professor of Greek, was among the faculty that departed Madison University for The University of Rochester. Kendrick served as acting president while a national search was conducted, remaining in the role until 1853, when Martin Brewer Anderson of the Newton Theological Seminary in Massachusetts was selected to fill the inaugural post.

The University of Rochester’s new charter was awarded by the Regents of the State of New York on January 31, 1850. The charter stipulated that The University of Rochester have $100,000 in endowment within five years, upon which the charter would be reaffirmed. An initial gift of $10,000 was pledged by John Wilder, which helped catalyze significant gifts from individuals and institutions.

    Classes began that November with approximately 60 students enrolled, including 28 transfers from Madison. From 1850 to 1862 The University of Rochester was housed in the old United States Hotel in downtown Rochester, on Buffalo Street near Elizabeth Street (today West Main Street near the I-490 overpass). On a February 1851 visit, Ralph Waldo Emerson said of the university:

    “They had bought a hotel, once a railroad terminus depot, for $8,500, turned the dining room into a chapel by putting up a pulpit on one side, made the barroom into a Pythologian Society’s Hall, & the chambers into Recitation rooms, Libraries, & professors’ apartments, all for $700 a year. They had brought an omnibus load of professors down from Madison bag and baggage… called in a painter and sent him up the ladder to paint the title “University of Rochester” on the wall, and they had runners on the road to catch students. And they are confident of graduating a class of ten by the time green peas are ripe.”

    For the next 10 years The University of Rochester expanded its scope and secured its future through a growing endowment, student body, and faculty. In parallel, a gift of 8 acres of farmland from local businessman and Congressman Azariah Boody secured the first campus of The University of Rochester, upon which Anderson Hall was constructed and dedicated in 1862. Over the next sixty years this Prince Street Campus grew by a further 17 acres and was developed to include fraternity houses, dormitories, and academic buildings, including Anderson Hall, Sibley Library, the Eastman and Carnegie Laboratories, the Memorial Art Gallery, and Cutler Union.

    Twentieth century

    Coeducation

    The first female students were admitted in 1900, the result of an effort led by Susan B. Anthony and Helen Barrett Montgomery. During the 1890s a number of women took classes and labs at The University of Rochester as “visitors” but were not officially enrolled, nor were their records included in The University of Rochester register. President David Jayne Hill allowed the first woman, Helen E. Wilkinson, to enroll as a normal student, although she was not allowed to matriculate or to pursue a degree. Thirty-three women enrolled among the first class in 1900, and Ella S. Wilcoxen was the first to receive a degree, in 1901. The first female member of the faculty was Elizabeth Denio, who retired as Professor Emeritus in 1917. Male students moved to River Campus upon its completion in 1930, while the female students remained on the Prince Street campus until 1955.

    Expansion

    Major growth occurred under the leadership of Benjamin Rush Rhees during his 1900-1935 tenure. During this period George Eastman became a major donor, giving more than $50 million to The University of Rochester during his life. Under the patronage of Eastman, the Eastman School of Music was created in 1921. In 1925, at the behest of the General Education Board and with significant support from John D. Rockefeller, George Eastman, and Henry A. Strong’s family, medical and dental schools were created. The University of Rochester awarded its first Ph.D. that same year.

    During World War II The University of Rochester was one of 131 colleges and universities nationally that took part in the V-12 Navy College Training Program, which offered students a path to a Navy commission. In 1942, The University of Rochester was invited to join the Association of American Universities as an affiliate member, and it was made a full member by 1944. Between 1946 and 1947, in infamous uranium experiments, researchers at the university injected uranium-234 and uranium-235 into six people to study how much uranium their kidneys could tolerate before becoming damaged.

    In 1955 the separate colleges for men and women were merged into The College on the River Campus. In 1958 three new schools were created in engineering, business administration, and education. The Graduate School of Management was named after William E. Simon, former Secretary of the Treasury, in 1986. He committed significant funds to the school because of his belief in the school’s free market philosophy and grounding in economic analysis.

    Financial decline and name change controversy

    Following the princely gifts given throughout his life, George Eastman left the entirety of his estate to The University of Rochester after his death by suicide. The total of these gifts surpassed $100 million before inflation, and as such The University of Rochester enjoyed a privileged position among the most well-endowed universities. During the expansion years between 1936 and 1976, The University of Rochester’s financial position ranked third, behind Harvard University’s endowment and the University of Texas System’s Permanent University Fund. Due to a decline in the value of large investments and a lack of portfolio diversity, The University of Rochester’s position had dropped to the edge of the top 25 by the end of the 1980s. At the same time the preeminence of the city of Rochester’s major employers began to decline.

    In response The University of Rochester commissioned a study to determine if the name of the institution should be changed to “Eastman University” or “Eastman Rochester University”. The study concluded a name change could be beneficial because the use of a place name in the title led respondents to incorrectly believe it was a public university, and because the name “Rochester” connoted a “cold and distant outpost.” Reports of the latter conclusion led to controversy and criticism in the Rochester community. Ultimately, the name “The University of Rochester” was retained.

    Renaissance Plan
    In 1995 The University of Rochester’s president, Thomas H. Jackson, announced the launch of a “Renaissance Plan” for The College that reduced enrollment from 4,500 to 3,600, creating a more selective admissions process. The plan also revised the undergraduate curriculum significantly, creating the current system with only one required course and only a few distribution requirements, known as clusters. Part of this plan called for the end of graduate doctoral studies in chemical engineering, comparative literature, linguistics, and mathematics, the last of which was met by national outcry. The plan was largely scrapped, and mathematics exists as a graduate course of study to this day.

    Twenty-first century

    Meliora Challenge

    Shortly after taking office, university president Joel Seligman commenced the private phase of the “Meliora Challenge”, a $1.2 billion capital campaign, in 2005. The campaign reached its goal in 2015, a year before it was slated to conclude. In 2016, The University of Rochester announced that the Meliora Challenge had exceeded its goal, surpassing $1.36 billion. These funds were allocated to support over 100 new endowed faculty positions and nearly 400 new scholarships.

    The Mangelsdorf Years

    On December 17, 2018, The University of Rochester announced that Sarah C. Mangelsdorf would succeed Richard Feldman as President of the University. Her term started in July 2019, with a formal inauguration following in October during Meliora Weekend. Mangelsdorf is the first woman to serve as President of The University of Rochester and the first person with a degree in psychology to be appointed to Rochester’s highest office.

    In 2019 students from China, mobilized by the Chinese Students and Scholars Association (CSSA), defaced murals in the university’s access tunnels that had expressed support for the 2019 Hong Kong protests, condemned the oppression of the Uighurs, and advocated for Taiwanese independence. The act was widely seen as a continuation of overseas censorship of Chinese issues. In response, a large group of students recreated the original murals. There have also been calls for the Chinese government-run CSSA to be banned from campus.

    Research

    The University of Rochester is a member of the Association of American Universities and is classified among “R1: Doctoral Universities – Very High Research Activity”.

    The University of Rochester had a research expenditure of $370 million in 2018.

    In 2008 The University of Rochester ranked 44th nationally in research spending, but this ranking declined gradually to 68th by 2018.

    Some of the major research centers include the Laboratory for Laser Energetics, a laser-based nuclear fusion facility, and the extensive research facilities at The University of Rochester Medical Center.

    Recently The University of Rochester has also engaged in a series of new initiatives to expand its programs in Biomedical Engineering and Optics including the construction of the new $37 million Robert B. Goergen Hall for Biomedical Engineering and Optics on the River Campus.

    Other new research initiatives include a cancer stem cell program and a Clinical and Translational Sciences Institute. The University of Rochester also has the ninth highest technology revenue among U.S. higher education institutions, with $46 million paid for commercial rights to university technology and research in 2009. Notable patents include Zoloft and Gardasil. WeBWorK, a web-based system for checking homework and providing immediate feedback for students, was developed by The University of Rochester professors Gage and Pizer. The system is now in use at over 800 universities and colleges as well as several secondary and primary schools. The University of Rochester scientists work in diverse areas. For example, physicists developed a technique for etching metal surfaces such as platinum, titanium, and brass with powerful lasers, enabling self-cleaning surfaces that repel water droplets (which roll off at a tilt of just 4 degrees) and will not rust; and medical researchers are exploring how brains rid themselves of toxic waste during sleep.

     
  • richardmitnick 4:20 pm on January 8, 2023 Permalink | Reply
    Tags: , "What’s next for quantum computing", , Companies are moving away from setting qubit records in favor of practical hardware and long-term goals., Competition around the world, Computer Science & Technology, Getting serious about software, Stringing quantum computers together, Taking on the noise   

    From “The MIT Technology Review” : “What’s next for quantum computing” 

    From “The MIT Technology Review”

    1.6.23
    Michael Brooks

    1
    Stephanie Arnett/MITTR; Getty.

    Companies are moving away from setting qubit records in favor of practical hardware and long-term goals.

    In 2023, progress in quantum computing will be defined less by big hardware announcements than by researchers consolidating years of hard work, getting chips to talk to one another, and shifting away from trying to make do with noise as the field gets ever more international in scope.

    For years, quantum computing’s news cycle was dominated by headlines about record-setting systems. Researchers at Google and IBM have had spats over who achieved what—and whether it was worth the effort.

    But the time for arguing over who’s got the biggest processor seems to have passed: firms are heads-down and preparing for life in the real world. Suddenly, everyone is behaving like grown-ups.

    As if to emphasize how much researchers want to get off the hype train, IBM is expected to announce a processor in 2023 that bucks the trend of putting ever more quantum bits, or “qubits,” into play. Qubits, the processing units of quantum computers, can be built from a variety of technologies, including superconducting circuitry, trapped ions, and photons, the quantum particles of light. 

    IBM has long pursued superconducting qubits, and over the years the company has been making steady progress in increasing the number it can pack on a chip. In 2021, for example, IBM unveiled one with a record-breaking 127 of them. In November, it debuted its 433-qubit Osprey processor, and the company aims to release a 1,121-qubit processor called Condor in 2023. 

    2
    IBM Osprey 433-qubit quantum computer

    But this year IBM is also expected to debut its Heron processor, which will have just 133 qubits. It might look like a backwards step, but as the company is keen to point out, Heron’s qubits will be of the highest quality. And, crucially, each chip will be able to connect directly to other Heron processors, heralding a shift from single quantum computing chips toward “modular” quantum computers built from multiple processors connected together—a move that is expected to help quantum computers scale up significantly. 

    Heron is a signal of larger shifts in the quantum computing industry. Thanks to some recent breakthroughs, aggressive roadmapping, and high levels of funding, we may see general-purpose quantum computers earlier than many would have anticipated just a few years ago, some experts suggest. “Overall, things are certainly progressing at a rapid pace,” says Michele Mosca, deputy director of the Institute for Quantum Computing at the University of Waterloo. 

    Here are a few areas where experts expect to see progress.

    Stringing quantum computers together

    IBM’s Heron project is just a first step into the world of modular quantum computing. The chips will be connected with conventional electronics, so they will not be able to maintain the “quantumness” of information as it moves from processor to processor. But the hope is that such chips, ultimately linked together with quantum-friendly fiber-optic or microwave connections, will open the path toward distributed, large-scale quantum computers with as many as a million connected qubits. That may be how many are needed to run useful, error-corrected quantum algorithms. “We need technologies that scale both in size and in cost, so modularity is key,” says Jerry Chow, director at IBM Quantum Hardware System Development.

    Other companies are beginning similar experiments. “Connecting stuff together is suddenly a big theme,” says Peter Shadbolt, chief scientific officer of PsiQuantum, which uses photons as its qubits. PsiQuantum is putting the finishing touches on a silicon-based modular chip. Shadbolt says the last piece it requires—an extremely fast, low-loss optical switch—will be fully demonstrated by the end of 2023. “That gives us a feature-complete chip,” he says. Then warehouse-scale construction can begin: “We’ll take all of the silicon chips that we’re making and assemble them together in what is going to be a building-scale, high-performance computer-like system.” 

    The desire to shuttle qubits among processors means that a somewhat neglected quantum technology will come to the fore now, according to Jack Hidary, CEO of SandboxAQ, a quantum technology company that was spun out of Alphabet last year. Quantum communications, where coherent qubits are transferred over distances as large as hundreds of kilometers, will be an essential part of the quantum computing story in 2023, he says.

    “The only pathway to scale quantum computing is to create modules of a few thousand qubits and start linking them to get coherent linkage,” Hidary told MIT Technology Review. “That could be in the same room, but it could also be across campus, or across cities. We know the power of distributed computing from the classical world, but for quantum, we have to have coherent links: either a fiber-optic network with quantum repeaters, or some fiber that goes to a ground station and a satellite network.”

    Many of these communication components have been demonstrated in recent years. In 2017, for example, China’s Micius satellite showed that coherent quantum communications could be accomplished between nodes separated by 1,200 kilometers. And in March 2022, an international group of academic and industrial researchers demonstrated a quantum repeater that effectively relayed quantum information over 600 kilometers of fiber optics. 

    Taking on the noise

    At the same time that the industry is linking up qubits, it is also moving away from an idea that came into vogue in the last five years—that chips with just a few hundred qubits might be able to do useful computing, even though noise easily disrupts their operations. 

    This notion, called “noisy intermediate-scale quantum” (NISQ), would have been a way to see some short-term benefits from quantum computing, potentially years before reaching the ideal of large-scale quantum computers with many hundreds of thousands of qubits devoted to correcting errors. But optimism about NISQ seems to be fading. “The hope was that these computers could be used well before you did any error correction, but the emphasis is shifting away from that,” says Joe Fitzsimons, CEO of Singapore-based Horizon Quantum Computing.

    Some companies are taking aim at the classic form of error correction, using some qubits to correct errors in others. Last year, both Google Quantum AI and Quantinuum, a new company formed by Honeywell and Cambridge Quantum Computing, issued papers demonstrating that qubits can be assembled into error-correcting ensembles that outperform the underlying physical qubits.
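    The core idea behind such ensembles can be illustrated with a classical toy model: store one logical bit in several physical bits and decode by majority vote, so the encoded bit is corrupted only when most of its copies fail. Genuine quantum codes, like those in the Google and Quantinuum papers, are far more involved, since they must detect errors without directly measuring the encoded state, but the payoff is similar: below an error threshold, redundancy suppresses logical errors. A minimal, purely illustrative sketch in Python:

    ```python
    import random

    def logical_error_rate(p, trials=100_000):
        """Toy 3-bit repetition code: the logical bit fails only when
        two or more of its three physical copies flip (roughly 3*p**2)."""
        failures = 0
        for _ in range(trials):
            flips = sum(random.random() < p for _ in range(3))
            if flips >= 2:  # majority vote decodes to the wrong value
                failures += 1
        return failures / trials

    for p in (0.1, 0.01, 0.001):
        print(f"physical error rate {p}: logical error rate ~ {logical_error_rate(p):.2e}")
    ```

    Note how the logical error rate drops much faster than the physical one as p shrinks; that quadratic suppression is what makes spending extra qubits on redundancy worthwhile.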

    Other teams are trying to see if they can find a way to make quantum computers “fault tolerant” without as much overhead. IBM, for example, has been exploring characterizing the error-inducing noise in its machines and then programming in a way to subtract it (similar to what noise-canceling headphones do). It’s far from a perfect system—the algorithm works from a prediction of the noise that is likely to occur, not what actually shows up. But it does a decent job, Chow says: “We can build an error-correcting code, with a much lower resource cost, that makes error correction approachable in the near term.”

    Maryland-based IonQ, which is building trapped-ion quantum computers, is doing something similar. “The majority of our errors are imposed by us as we poke at the ions and run programs,” says Chris Monroe, chief scientist at IonQ. “That noise is knowable, and different types of mitigation have allowed us to really push our numbers.”
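    Neither company spells out its exact recipe here, but one well-known technique in this “subtract the noise” family is zero-noise extrapolation: run the same circuit at several deliberately amplified noise levels, then fit the measured expectation values and extrapolate back to the zero-noise limit. A rough sketch with synthetic numbers (everything below is illustrative, not IBM’s or IonQ’s actual method):

    ```python
    import numpy as np

    # Noise-scale factors: 1.0 is the hardware as-is; larger values mean
    # the noise has been deliberately amplified (e.g., by stretching gates).
    scales = np.array([1.0, 1.5, 2.0, 3.0])

    # Expectation values measured at each scale (synthetic numbers standing
    # in for averages over many repeated circuit executions).
    measured = np.array([0.82, 0.74, 0.67, 0.55])

    # Fit a low-degree polynomial in the noise scale, evaluate at zero noise.
    coeffs = np.polyfit(scales, measured, deg=2)
    print(f"zero-noise estimate: {np.polyval(coeffs, 0.0):.3f}")
    ```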

    Getting serious about software

    For all the hardware progress, many researchers feel that more attention needs to be given to programming. “Our toolbox is definitely limited, compared to what we need to have 10 years down the road,” says Michal Stechly of Zapata Computing, a quantum software company based in Boston. 

    The way code runs on a cloud-accessible quantum computer is generally “circuit-based,” which means the data is put through a specific, predefined series of quantum operations before a final quantum measurement is made, giving the output. That’s problematic for algorithm designers, Fitzsimons says. Conventional programming routines tend to involve looping some steps until a desired output is reached, and then moving into another subroutine. In circuit-based quantum computing, getting an output generally ends the computation: there is no option for going round again.
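    To make “circuit-based” concrete: a quantum program is a fixed sequence of gates applied to an initial state, and the final measurement both yields the output and destroys the quantum state, so there is nothing left to loop back over. Here is a self-contained toy simulation of a two-qubit circuit in Python with NumPy (real programs would target a framework such as Qiskit or Cirq, but the structure is the same):

    ```python
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
    I2 = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],                 # controlled-NOT gate
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
    state = np.kron(H, I2) @ state                 # gate 1: H on qubit 0
    state = CNOT @ state                           # gate 2: entangle the pair

    # The single final measurement produces the output and ends the program.
    probs = np.abs(state) ** 2
    outcome = np.random.choice(4, p=probs)
    print(f"measured |{outcome:02b}>")             # always 00 or 11 (a Bell pair)
    ```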

    Horizon Quantum Computing is one of the companies that have been building programming tools to allow these flexible computation routines. “That gets you to a different regime in terms of the kinds of things you’re able to run, and we’ll start rolling out early access in the coming year,” Fitzsimons says.

    Helsinki-based Algorithmiq is also innovating in the programming space. “We need nonstandard frameworks to program current quantum devices,” says CEO Sabrina Maniscalco. Algorithmiq’s newly launched drug discovery platform, Aurora, combines the results of a quantum computation with classical algorithms. Such “hybrid” quantum computing is a growing area, and it’s widely acknowledged as the way the field is likely to function in the long term. The company says it expects to achieve a useful quantum advantage—a demonstration that a quantum system can outperform a classical computer on real-world, relevant calculations—in 2023. 
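    The hybrid pattern generally looks like this: a quantum processor evaluates a cost function (by preparing a parameterized circuit and measuring it), and a classical optimizer adjusts the circuit parameters in a loop. A minimal sketch, with a closed-form expression standing in for the quantum measurement (this is the generic variational pattern, not Algorithmiq’s Aurora platform):

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def cost(theta):
        """Stand-in for a quantum evaluation: the expectation <Z> of a qubit
        rotated by angle theta is cos(theta). On hardware, this number would
        be estimated from many measurement shots of a parameterized circuit."""
        return np.cos(theta[0])

    # Classical outer loop: a conventional optimizer tunes the circuit parameter.
    result = minimize(cost, x0=np.array([0.3]), method="COBYLA")
    print(f"optimal angle ~ {result.x[0]:.2f} rad, cost ~ {result.fun:.3f}")
    ```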

    Competition around the world

    Change is likely coming on the policy front as well. Government representatives including Alan Estevez, US undersecretary of commerce for industry and security, have hinted that trade restrictions surrounding quantum technologies are coming. 

    Tony Uttley, COO of Quantinuum, says that he is in active dialogue with the US government about making sure this doesn’t adversely affect what is still a young industry. “About 80% of our system is components or subsystems that we buy from outside the US,” he says. “Putting a control on them doesn’t help, and we don’t want to put ourselves at a disadvantage when competing with other companies in other countries around the world.”

    And there are plenty of competitors. Last year, the Chinese search company Baidu opened access to a 10-superconducting-qubit processor that it hopes will help researchers make forays into applying quantum computing to fields such as materials design and pharmaceutical development. The company says it has recently completed the design of a 36-qubit superconducting quantum chip. “Baidu will continue to make breakthroughs in integrating quantum software and hardware and facilitate the industrialization of quantum computing,” a spokesman for the company told MIT Technology Review. The tech giant Alibaba also has researchers working on quantum computing with superconducting qubits.

    In Japan, Fujitsu is working with the Riken research institute to offer companies access to the country’s first home-grown quantum computer in the fiscal year starting April 2023. It will have 64 superconducting qubits. “The initial focus will be on applications for materials development, drug discovery, and finance,” says Shintaro Sato, head of the quantum laboratory at Fujitsu Research.

    Not everyone is following the well-trodden superconducting path, however. In 2020, the Indian government pledged to spend 80 billion rupees ($1.12 billion when the announcement was made) on quantum technologies. A good chunk will go to photonics technologies—for satellite-based quantum communications, and for innovative “qudit” photonics computing.

    Qudits expand the data encoding scope of qubits—they offer three, four, or more dimensions, as opposed to just the traditional binary 0 and 1, without necessarily increasing the scope for errors to arise. “This is the kind of work that will allow us to create a niche, rather than competing with what has already been going on for several decades elsewhere,” says Urbasi Sinha, who heads the quantum information and computing laboratory at the Raman Research Institute in Bangalore, India.
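    The appeal of higher dimensions is easy to quantify: a register of n qudits of dimension d spans d^n basis states, so each qudit carries log2(d) bits’ worth of state space compared with one bit’s worth for a qubit. A quick illustration (the numbers only count basis states; they say nothing about error rates or control difficulty):

    ```python
    import math

    n = 10  # register size
    for d in (2, 3, 4, 8):
        print(f"d={d}: {n} qudits span {d**n:>10,} basis states "
              f"(equivalent to ~{n * math.log2(d):.1f} qubits)")
    ```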

    Though things are getting serious and internationally competitive, quantum technology remains largely collaborative—for now. “The nice thing about this field is that competition is fierce, but we all recognize that it’s necessary,” Monroe says. “We don’t have a zero-sum-game mentality: there are different technologies out there, at different levels of maturity, and we all play together right now. At some point there’s going to be some kind of consolidation, but not yet.”

    See the full article here .

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The mission of “The MIT Technology Review” is to equip its audiences with the intelligence to understand a world shaped by technology.

     
  • richardmitnick 2:13 pm on January 8, 2023 Permalink | Reply
    Tags: "Unpacking the 'black box' to build better AI models", , , , , Computer Science & Technology, Computer Science and Artificial Intelligence Laboratory (CSAIL), , From butterflies to bioinformatics, , , , Stefanie Jegelka, Stefanie Jegelka seeks to understand how machine-learning models behave to help researchers build more robust models for applications in biology and computer vision and optimization and more., Teaching models to learn,   

    From The Massachusetts Institute of Technology: “Unpacking the ‘black box’ to build better AI models” Stefanie Jegelka 

    From The Massachusetts Institute of Technology

    1.8.23
    Adam Zewe

    Stefanie Jegelka seeks to understand how machine-learning models behave, to help researchers build more robust models for applications in biology, computer vision, optimization, and more.

    1
    Stefanie Jegelka, a newly-tenured associate professor in the Department of Electrical Engineering and Computer Science at MIT, develops algorithms for deep learning applications and studies how deep learning models behave and what they can learn. Photo: M. Scott Brauer.

    2
    “What I really loved about MIT, from the very beginning, was that the people really care deeply about research and creativity. That is what I appreciate the most about MIT. The people here really value originality and digging deep into research,” Jegelka says. Photo: M. Scott Brauer.

    When deep learning models are deployed in the real world, perhaps to detect financial fraud from credit card activity or identify cancer in medical images, they are often able to outperform humans.

    But what exactly are these deep learning models learning? Does a model trained to spot skin cancer in clinical images, for example, actually learn the colors and textures of cancerous tissue, or is it flagging some other features or patterns?

    These powerful machine-learning models are typically based on artificial neural networks that can have millions of nodes that process data to make predictions. Due to their complexity, researchers often call these models “black boxes” because even the scientists who build them don’t understand everything that is going on under the hood.

    Stefanie Jegelka isn’t satisfied with that “black box” explanation. A newly tenured associate professor in the MIT Department of Electrical Engineering and Computer Science, Jegelka is digging deep into deep learning to understand what these models can learn and how they behave, and how to build certain prior information into these models.

    “At the end of the day, what a deep-learning model will learn depends on so many factors. But building an understanding that is relevant in practice will help us design better models, and also help us understand what is going on inside them so we know when we can deploy a model and when we can’t. That is critically important,” says Jegelka, who is also a member of the Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Institute for Data, Systems, and Society (IDSS).

    Jegelka is particularly interested in optimizing machine-learning models when input data are in the form of graphs. Graph data pose specific challenges: for instance, the data contain information about individual nodes and edges as well as the structure — what is connected to what. In addition, graphs have mathematical symmetries that need to be respected by the machine-learning model so that, for instance, the same graph always leads to the same prediction. Building such symmetries into a machine-learning model is usually not easy.

    Take molecules, for instance. Molecules can be represented as graphs, with vertices that correspond to atoms and edges that correspond to chemical bonds between them. Drug companies may want to use deep learning to rapidly predict the properties of many molecules, narrowing down the number they must physically test in the lab.

    Jegelka studies methods to build mathematical machine-learning models that can effectively take graph data as an input and output something else, in this case a prediction of a molecule’s chemical properties. This is particularly challenging since a molecule’s properties are determined not only by the atoms within it, but also by the connections between them.
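    One way to see why these symmetries matter, and how they can be built in: if a model aggregates each atom’s features from its neighbors and then pools the results with an order-independent operation such as a sum, relabeling the atoms cannot change the graph-level prediction. A small NumPy sketch of this idea (a generic illustration, not a specific model from Jegelka’s work):

    ```python
    import numpy as np

    def message_pass_and_readout(adj, feats):
        """One round of neighbor averaging, then a sum readout. The sum makes
        the graph-level output invariant to how the nodes are numbered."""
        deg = adj.sum(axis=1, keepdims=True).clip(min=1)
        hidden = (adj @ feats) / deg      # each node averages its neighbors
        return hidden.sum(axis=0)         # order-independent pooling

    # Toy "molecule": three atoms in a chain, one feature per atom.
    adj = np.array([[0., 1., 0.],
                    [1., 0., 1.],
                    [0., 1., 0.]])
    feats = np.array([[1.0], [2.0], [3.0]])

    # Relabel the atoms with a permutation matrix: the readout is unchanged.
    P = np.array([[0., 0., 1.],
                  [1., 0., 0.],
                  [0., 1., 0.]])
    print(message_pass_and_readout(adj, feats))                # [6.]
    print(message_pass_and_readout(P @ adj @ P.T, P @ feats))  # [6.] again
    ```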

    Other examples of machine learning on graphs include traffic routing, chip design, and recommender systems.

    Designing these models is made even more difficult by the fact that data used to train them are often different from data the models see in practice. Perhaps the model was trained using small molecular graphs or traffic networks, but the graphs it sees once deployed are larger or more complex.

    In this case, what can researchers expect this model to learn, and will it still work in practice if the real-world data are different?

    “Your model is not going to be able to learn everything because of some hardness problems in computer science, but what you can learn and what you can’t learn depends on how you set the model up,” Jegelka says.

    She approaches this question by combining her passion for algorithms and discrete mathematics with her excitement for machine learning.

    From butterflies to bioinformatics

    Jegelka grew up in a small town in Germany and became interested in science when she was a high school student; a supportive teacher encouraged her to participate in an international science competition. She and her teammates from the U.S. and Singapore won an award for a website they created about butterflies, in three languages.

    “For our project, we took images of wings with a scanning electron microscope at a local university of applied sciences. I also got the opportunity to use a high-speed camera at Mercedes Benz — this camera usually filmed combustion engines — which I used to capture a slow-motion video of the movement of a butterfly’s wings. That was the first time I really got in touch with science and exploration,” she recalls.

    Intrigued by both biology and mathematics, Jegelka decided to study bioinformatics at the University of Tübingen and the University of Texas-Austin. She had a few opportunities to conduct research as an undergraduate, including an internship in computational neuroscience at Georgetown University, but wasn’t sure what career to follow.

    When she returned for her final year of college, Jegelka moved in with two roommates who were working as research assistants at the MPG Institute in Tübingen.

    “They were working on machine learning, and that sounded really cool to me. I had to write my bachelor’s thesis, so I asked at the institute if they had a project for me. I started working on machine learning at the MPG Institute and I loved it. I learned so much there, and it was a great place for research,” she says.

    She stayed on at the MPG Institute to complete a master’s thesis, and then embarked on a PhD in machine learning at the MPG Institute and the Swiss Federal Institute of Technology.

    During her PhD, she explored how concepts from discrete mathematics can help improve machine-learning techniques.

    Teaching models to learn

    The more Jegelka learned about machine learning, the more intrigued she became by the challenges of understanding how models behave, and how to steer this behavior.

    “You can do so much with machine learning, but only if you have the right model and data. It is not just a black-box thing where you throw it at the data and it works. You actually have to think about it, its properties, and what you want the model to learn and do,” she says.

    After completing a postdoc at the University of California-Berkeley, Jegelka was hooked on research and decided to pursue a career in academia. She joined the faculty at MIT in 2015 as an assistant professor.

    “What I really loved about MIT, from the very beginning, was that the people really care deeply about research and creativity. That is what I appreciate the most about MIT. The people here really value originality and depth in research,” she says.

    That focus on creativity has enabled Jegelka to explore a broad range of topics.

    In collaboration with other faculty at MIT, she studies machine-learning applications in biology, imaging, computer vision, and materials science.

    But what really drives Jegelka is probing the fundamentals of machine learning, and most recently, the issue of robustness. Often, a model performs well on training data, but its performance deteriorates when it is deployed on slightly different data. Building prior knowledge into a model can make it more reliable, but understanding what information the model needs to be successful and how to build it in is not so simple, she says.

    She is also exploring methods to improve the performance of machine-learning models for image classification.

    Image classification models are everywhere, from the facial recognition systems on mobile phones to tools that identify fake accounts on social media. These models need massive amounts of data for training, but since it is expensive for humans to hand-label millions of images, researchers often use unlabeled datasets to pretrain models instead.

    These models then reuse the representations they have learned when they are fine-tuned later for a specific task.

    Ideally, researchers want the model to learn as much as it can during pretraining, so it can apply that knowledge to its downstream task. But in practice, these models often learn only a few simple correlations — like that one image has sunshine and one has shade — and use these “shortcuts” to classify images.

    “We showed that this is a problem in ‘contrastive learning,’ which is a standard technique for pre-training, both theoretically and empirically. But we also show that you can influence the kinds of information the model will learn to represent by modifying the types of data you show the model. This is one step toward understanding what models are actually going to do in practice,” she says.
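    The pre-training objective at issue here pulls the embeddings of two augmented views of the same image together while pushing apart views of different images; whatever features most cheaply separate the pairs (sunshine versus shade included) are what gets learned. A compact NumPy sketch of an InfoNCE-style contrastive loss (illustrative only; practical systems use learned neural encoders in a deep-learning framework):

    ```python
    import numpy as np

    def info_nce(z1, z2, temperature=0.1):
        """Each row of z1 should match the same row of z2 (its augmented view)
        and be dissimilar to every other row (the negatives)."""
        z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
        z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
        logits = z1 @ z2.T / temperature               # pairwise similarities
        logits -= logits.max(axis=1, keepdims=True)    # numerical stability
        log_p = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
        return -np.mean(np.diag(log_p))                # reward matched pairs

    rng = np.random.default_rng(0)
    z1 = rng.normal(size=(8, 16))              # embeddings of 8 images
    z2 = z1 + 0.05 * rng.normal(size=(8, 16))  # embeddings of augmented views
    print(f"contrastive loss: {info_nce(z1, z2):.3f}")
    ```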

    Researchers still don’t understand everything that goes on inside a deep-learning model, or the details of how they can influence what a model learns and how it behaves, but Jegelka looks forward to continuing to explore these topics.

    “Often in machine learning, we see something happen in practice and we try to understand it theoretically. This is a huge challenge. You want to build an understanding that matches what you see in practice, so that you can do better. We are still just at the beginning of understanding this,” she says.

    Outside the lab, Jegelka is a fan of music, art, traveling, and cycling. But these days, she enjoys spending most of her free time with her preschool-aged daughter.

    See the full article here .

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    MIT Campus

    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and the Whitehead Institute.

    Massachusetts Institute of Technology-Haystack Observatory, Westford, Massachusetts, USA, Altitude 131 m (430 ft).

    4

    The Computer Science and Artificial Intelligence Laboratory (CSAIL)

    From The Kavli Institute For Astrophysics and Space Research

    MIT’s Institute for Medical Engineering and Science is a research institute at the Massachusetts Institute of Technology

    The MIT Laboratory for Nuclear Science

    The MIT Media Lab

    The MIT School of Engineering

    The MIT Sloan School of Management

    Spectrum

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities.

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but a combination with elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed as the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    The Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at the Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, The Massachusetts Institute of Technology became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and The Massachusetts Institute of Technology’s defense research. In this period Massachusetts Institute of Technology’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced aLIGO was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and was funded by the National Science Foundation.

    Caltech /MIT Advanced aLigo

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique, which served as the essential blueprint for the LIGO.

    The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 10:26 am on January 8, 2023 Permalink | Reply
    Tags: "Erica Prates - Bridging science across scales with computational biology", , , Computer Science & Technology, Linking the function of the smallest molecules to their effects on large-scale processes., One of Prates’ main efforts is building a computational structural systems biology workflow that lets scientists identify protein targets that can be engineered to achieve biological traits., Prates advises young people interested in a career in science to “be fearless., Prates integrates structural information on molecules into complex systems biology models., Prates is also learning the secrets of how microbes such as fungi use molecular signaling to talk to plants to support a healthy ecosystem for both., Prates is using her interdisciplinary approach to develop hardier plants that can be grown on inhospitable lands., Prates works to understand the three-dimensional structure of biomolecules with a particular interest in proteins and interacting metabolites., ,   

    From The DOE’s Oak Ridge National Laboratory: “Erica Prates – Bridging science across scales with computational biology” 

    From The DOE’s Oak Ridge National Laboratory

    1.6.23

    1
    Erica Prates has found a way to help speed the pursuit of healthier ecosystems by linking the function of the smallest molecules to their effects on large-scale processes, leveraging a combination of science, math and computing. Credit: ORNL.

    Prates is a computational systems biologist in Oak Ridge National Laboratory’s Biosciences Division. She’s using her interdisciplinary approach to develop hardier plants that can be grown on inhospitable lands to make clean jet fuels, to create healthier plants and improve carbon storage by exploring plant-microbe interactions, and to figure out how viruses affect human health.

    “I integrate structural information on molecules into complex systems biology models,” Prates said. “And I am fortunate to get to do so on the world’s fastest supercomputers here at ORNL.”

    She works to understand the three-dimensional structure of biomolecules, with a particular interest in proteins and interacting metabolites. “I help predict their structure and interactions using high-throughput methods that run on supercomputers. A molecule’s structure is tightly related to its function and how it creates physical traits in an organism. Those traits then influence ecosystems on a large scale,” Prates said.

    If scientists can describe how information passes from genes to a cascade of molecular events that produce a given biological phenomenon, they can predict how genetic variation changes biological behavior, she added.

    Versatile science

    Prates studies a wide variety of subjects, including plants, microbes, viruses and species interactions. One of her main efforts is building a computational structural systems biology workflow that lets scientists identify protein targets that can be engineered to achieve biological traits of interest.

    An example is her work identifying genes encoding proteins that can trigger desirable characteristics in plants for the Center for Bioenergy Innovation, or CBI, at ORNL. A key mission of CBI is developing improved nonfood crops like poplar and switchgrass that have greater biomass yield and resistance to pathogens and pests.

    She is also learning the secrets of how microbes such as fungi use molecular signaling to talk to plants to support a healthy ecosystem for both. Those signals, known as lipo-chitooligosaccharides, or LCOs, are believed to govern the beneficial colonization of plant roots by fungus and may be involved in other important biological processes.

    Prates played a role in ORNL’s pioneering efforts to characterize all the proteins of the SARS-CoV-2 virus for insights into its evolution and the body’s response to COVID-19. Prates and colleagues recently followed up their research with lab experiments supporting their theories about the virus’s pathogenesis. The team described how the virus inactivates an important protein in the body’s immune system.

    “This was very exciting work,” Prates said. “Early in the pandemic there was this idea that the major target of the virus was lung cells. But then it became clearer that COVID-19 was a systemic disease, affecting the whole body.” The team demonstrated in molecular detail how the virus can dismantle NEMO, a protein in the host cell that is key for an effective immune response.

    “One of the things I really enjoy about my work is the ability to migrate between very different systems,” Prates said. “I was working with a lot of plants and microbes, and then at the onset of the pandemic suddenly started working with viruses. Proteins are proteins no matter whether the organism they influence is a virus, a human or a microbe. So it’s easy and useful to migrate to these different subjects using the same tools. That’s one thing I love about this job.”

    Encouraging words

    Prates cites her mother’s influence for her successful entry into a science career.

    “You have to be confident when you practice science,” Prates said. “It was my mother who boosted my confidence every day growing up with messages that ran counter to an often sexist culture.” She also cites the influence of a physician in the family who discussed science and medicine with her routinely from a young age. When her parents built her a doll house, Prates turned it into a play laboratory.

    Prates earned her bachelor’s, master’s and doctoral degrees in chemistry from the University of Campinas, or UNICAMP, in Brazil. She first came to the United States for an internship at the University of Washington, and then spent a year at the National Renewable Energy Laboratory as a São Paulo Research Foundation Fellow researching biofuels.

    In Brazil, Prates was no stranger to bioenergy. The nation is the world’s second largest producer of ethanol. Renewables make up almost half of Brazil’s energy mix, and about 70% of that supply is from plant biomass, according to the International Energy Agency.

    It was at NREL, a key partner in CBI, that she became acquainted with ORNL and eventually joined as a postdoctoral researcher in 2018, hiring on as staff three years later.

    “I’ve been very lucky in my career to have worked with very generous scientists who opened doors for me and made me feel empowered and capable,” she said. She cited key mentors like Professor Munir Skaf, her doctoral advisor at UNICAMP, Gregg Beckham at NREL and Dan Jacobson at ORNL.

    At Oak Ridge, Prates said she feels “lucky to be around very smart co-workers. The team that I work with directly supports my work in systems biology where you need to understand the connections between molecules, and often that requires people with very different expertise working together. It makes you talk a lot, this interdependence of a team where everyone might have a different approach.” By having the same goal, the environment is more cooperative than competitive, she said.

    She also enjoys the immense capabilities of working in a national lab environment, including the supercomputers at the Oak Ridge Leadership Computing Facility. “Just working here with Summit [below] and Frontier [below] is a big achievement already,” she said.

    Fearless and flexible

    Prates advises young people interested in a career in science to “be fearless. It’s important to be confident and creative. Don’t give up, even on the ideas that at first may feel wrong. Be flexible and resilient. Just like Darwin’s theories in nature, adaptability is key to success.” She also stressed the benefit of learning how to write. “You will write more than you expect to, and it’s critical to be able to effectively communicate your ideas to others.”

    Prates’s enthusiasm extends to her personal life as her family grows. “I’m very excited by the most important project of my life: the baby girl that I’m expecting,” she said. “I plan to be very supportive of her in whatever she wants to do. I want to show her how the universe is complex and beautiful, as my inspirations did for me.”

    In her research as well as in parenting, she hopes to continue bridging the gap between the tiniest elements and the largest impact. “When you make this connection between the molecular world and the big picture, then you’re learning which of the tiny gears can influence the entire system.”

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    Established in 1942, The DOE’s Oak Ridge National Laboratory is the largest science and energy national laboratory in the Department of Energy system (by size) and third largest by annual budget. It is located in the Roane County section of Oak Ridge, Tennessee. Its scientific programs focus on materials, neutron science, energy, high-performance computing, systems biology and national security, sometimes in partnership with the state of Tennessee, universities and other industries.

    ORNL has several of the world’s top supercomputers, including Summit, which has ranked among the most powerful systems on the TOP500 list.

    ORNL OLCF IBM AC922 Summit supercomputer, No. 5 on the TOP500.

    The lab is a leading neutron and nuclear power research facility that includes the Spallation Neutron Source and High Flux Isotope Reactor.

    ORNL Spallation Neutron Source.

    It hosts the Center for Nanophase Materials Sciences, the BioEnergy Science Center, and the Consortium for Advanced Simulation of Light Water Nuclear Reactors.

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.

    Areas of research

    ORNL conducts research and development activities that span a wide range of scientific disciplines. Many research areas have a significant overlap with each other; researchers often work in two or more of the fields listed here. The laboratory’s major research areas are described briefly below.

    Chemical sciences – ORNL conducts both fundamental and applied research in a number of areas, including catalysis, surface science and interfacial chemistry; molecular transformations and fuel chemistry; heavy element chemistry and radioactive materials characterization; aqueous solution chemistry and geochemistry; mass spectrometry and laser spectroscopy; separations chemistry; materials chemistry including synthesis and characterization of polymers and other soft materials; chemical biosciences; and neutron science.
    Electron microscopy – ORNL’s electron microscopy program investigates key issues in condensed matter, materials, chemical and nanosciences.
    Nuclear medicine – The laboratory’s nuclear medicine research is focused on the development of improved reactor production and processing methods to provide medical radioisotopes, the development of new radionuclide generator systems, the design and evaluation of new radiopharmaceuticals for applications in nuclear medicine and oncology.
    Physics – Physics research at ORNL is focused primarily on studies of the fundamental properties of matter at the atomic, nuclear, and subnuclear levels and the development of experimental devices in support of these studies.
    Population – ORNL provides federal, state and international organizations with a gridded population database, called LandScan, for estimating ambient population. LandScan is a raster image, or grid, of population counts that provides human population estimates every 30 x 30 arc seconds, which translates roughly to population estimates for 1-kilometer-square windows, or grid cells, at the equator, with cell width decreasing at higher latitudes (see the sketch below). Though many population datasets exist, LandScan is regarded as the best spatial population dataset with global coverage. Updated annually (although data releases are generally one year behind the current year), it offers continuous, refreshed population values based on the most recent information. LandScan data are accessible through GIS applications and a USAID public domain application called Population Explorer.
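
    The arc-second arithmetic above is easy to make concrete. Here is a minimal sketch, assuming a spherical Earth of mean radius 6371 km (a simplification; the real grid is defined on a geodetic datum):

```python
import math

# Approximate size of a 30 x 30 arc-second LandScan cell at a given latitude,
# assuming a spherical Earth of mean radius 6371 km (a simplification).
EARTH_RADIUS_KM = 6371.0
CELL_DEG = 30.0 / 3600.0  # 30 arc seconds expressed in degrees

def cell_size_km(latitude_deg: float) -> tuple:
    km_per_degree = 2 * math.pi * EARTH_RADIUS_KM / 360.0  # ~111 km per degree
    height = CELL_DEG * km_per_degree                       # constant north-south
    width = height * math.cos(math.radians(latitude_deg))   # shrinks east-west
    return width, height

for lat in (0, 30, 60):
    w, h = cell_size_km(lat)
    print(f"latitude {lat:2d} deg: {w:.2f} km x {h:.2f} km")
# At the equator the cell is ~0.93 km square, matching the rough
# "1 kilometer" figure; at 60 degrees latitude its width halves.
```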

     
  • richardmitnick 12:28 pm on January 7, 2023 Permalink | Reply
    Tags: , "What can we learn about Quantum Computing Companies from technology history?", Apple launched the iPad-again a whole new product category, Businesses which create innovations and new products and experiments stay alive., Computer Science & Technology, DEC (Digital Equipment Corporation), Do not forget the iPhone., Fairchild Semiconductor, IBM moved into personal computing and started a whole new category of products., Innovation that doesn’t survive, Killer Applications, Sun Microsystems, The Integrated Circuit was the turning point for Electronics., What happened and what can we learn?, Will Quantum Computing Companies Explode in number when we get the Integrated Circuit equivalent?   

    From “Quantum Zeitgeist” : “What can we learn about Quantum Computing Companies from technology history?” 

    From “Quantum Zeitgeist”

    1.6.23

    1
    Retro Computer

    History isn’t always a guide to the future, but it can often help us pick out trends and similarities that might help us predict it. Is there anything we can learn from technology companies of the past, such as brands that are no longer in business? Companies like Sun Microsystems and DEC were computing powerhouses but have faded from view.

    What happened and what can we learn?

    Some might liken the quantum computing industry to the technology companies of the past and especially some of the companies making processors and chips that have powered the technological revolution. Can quantum companies learn something valuable from the early days of the computer revolution that brought us semiconductor chips, programming languages that are household names, and even the transition from mainframe to desktop computers?

    Fairchild Semiconductor

    Fairchild Semiconductor was a technology company founded in 1957 by eight engineers who had previously worked at Shockley Semiconductor. William Shockley was famous for winning the 1956 Nobel Prize in Physics and infamous for a temper that led his employees to look elsewhere for employment. Hence Fairchild was born. The company was known for developing the first commercially successful integrated circuit, revolutionizing the computer and electronics industry. Key figures at Fairchild Semiconductor include Robert Noyce, who is often credited as the co-inventor of the microchip, and Gordon Moore, who co-founded Intel and is known for Moore’s Law, which predicted the exponential increase in computer processing power over time.

    Fairchild Semiconductor did not fail outright, but it faced challenges and changes throughout its history. In the 1960s and 1970s, the company faced intense competition from other semiconductor manufacturers and struggled to adapt to changes in the industry. In 1979, Fairchild Semiconductor was acquired by Schlumberger Limited and became a subsidiary. In 1987, the company was sold to National Semiconductor, and the business was later spun out as Fairchild Semiconductor International. In 2016, Fairchild Semiconductor was acquired by ON Semiconductor, a global semiconductor company, and its product lines continue today as part of ON Semiconductor.

    Takeaway Lesson(s): William Shockley, despite his brilliance, couldn’t keep his employees. Even Fairchild, the company his disgruntled employees founded, eventually fell away into obscurity. There is an innovation cost to being first or even second. But if companies don’t invest in people and allow the right culture to flourish, no amount of innovation can keep them in business. Even publicly listed quantum computing companies could be market-leading now yet fail to catch the eventual big wave that carries them through turbulent times.

    Sun Microsystems

    Sun Microsystems was a technology company founded back in 1982 and known for its work in computer hardware, software, and network technology. It played a significant role in developing the internet and was at the forefront of many technological innovations in the computing industry. Sun was behind the popular programming language Java, which is still used by many companies today. Sun also made large workstations; its SPARC clusters and workstations were seminal in early innovations like the render farms used to produce the images for films such as Toy Story.

    However, Sun Microsystems faced several challenges and ultimately struggled to remain competitive in the rapidly evolving technology market. One of the main factors contributing to the company’s decline was increased competition from other technology companies, such as IBM, Hewlett-Packard, and Dell. Sun Microsystems faced financial challenges, including rising debt and declining profits. In 2010, Sun Microsystems was acquired by Oracle Corporation.

    Takeaway Lesson(s): Failure to adopt open standards, as happened with the SPARC clusters, meant Sun was assailed by more open architectures such as the x86 used in IBM PCs, according to Enterprise Strategy Group analyst Brian Babineau. Proprietary systems are not the way to go.

    DEC (Digital Equipment Corporation)

    DEC (Digital Equipment Corporation) was a technology company founded in 1957 and known for its work in computer hardware and software systems. The company was a pioneer in the computer industry and played a significant role in developing the minicomputer market with a variety of well-known and well-liked products, such as the PDP range (which was sold from the ’70s through to the ’90s), although the company never became a household name like IBM.

    In 1998, DEC was acquired by Compaq, a major computer company later acquired by Hewlett-Packard (HP). As a result of the acquisition, DEC became a subsidiary of Compaq and was integrated into the parent company. Many of DEC’s products and technologies were absorbed into Compaq’s product portfolio, and the DEC brand was phased out. Today, many of the products and technologies that DEC developed are no longer in use, but the company’s legacy lives on in the products and technologies it developed and introduced during its time as a leading player in the computer industry.

    There are several reasons why DEC could not sustain its success and eventually failed. One reason was that the company faced intense competition from other computer companies, particularly in the personal computer market, where DEC struggled to keep up with the rapid pace of technological change.

    The company was also thought to have made several strategic mistakes, such as failing to embrace the internet and the emergence of the World Wide Web in the 1990s, which contributed to its decline. Although Microsoft was initially slow to catch on to the WWW, it eventually got with the program.

    Takeaway Lesson(s): DEC, according to Clayton Christensen (author of the well-known book The Innovator’s Dilemma), could not innovate on price. Whilst IBM was able to create a desktop machine for less than $2,000, Digital Equipment Corporation could not compete with machines costing more than $50,000.

    Quantum computing companies should be focused on costs even now and prepare for mainstream quantum computing to ensure the right price points are possible. Of course, much will come down to the qubit technology, and we are yet to see a winner, but we should not be surprised if it comes down to cost per qubit ($/Q).
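
    As a toy illustration of that $/Q metric, the sketch below ranks some entirely invented systems by cost per qubit. The prices and qubit counts are made up for the example; a real comparison would also have to weigh qubit quality, connectivity and error rates, not just raw counts.

```python
# Toy illustration of a $/Q (cost per qubit) comparison.
# All prices and qubit counts are invented for this example,
# not real vendor figures.
systems = [
    {"name": "vendor_A", "price_usd": 10_000_000, "qubits": 100},
    {"name": "vendor_B", "price_usd": 2_500_000, "qubits": 20},
    {"name": "vendor_C", "price_usd": 15_000_000, "qubits": 400},
]

for s in systems:
    s["usd_per_qubit"] = s["price_usd"] / s["qubits"]

# Rank by cost per qubit, the way chips were once ranked by cost per transistor.
for s in sorted(systems, key=lambda s: s["usd_per_qubit"]):
    print(f"{s['name']}: ${s['usd_per_qubit']:,.0f} per qubit")
```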

    The Integrated Circuit was the turning point for Electronics.

    The integrated circuit (IC), or the microchip, was invented by Jack Kilby in 1958 while working at Texas Instruments. Kilby’s invention revolutionized the field of electronics by creating a way to shrink electronic circuits onto a small, flat piece of material, making it possible to create much more complex and powerful electronic devices. Devices could be etched into silicon, and that remains the mainstay technology today, although transistors are now often just single nanometers across, compared with the micrometre dimensions of the past.

    Before Kilby’s invention, electronic circuits were built using discrete components, such as transistors, resistors, and capacitors, connected using wires. These circuits were large, expensive, and prone to errors. Kilby’s integrated circuit, on the other hand, allowed all of the components to be fabricated together on a single piece of material, making for much smaller, more reliable, and more cost-effective electronic devices.

    While Kilby’s first microchip had only a single transistor, subsequent microchips developed in the 1960s and 1970s had hundreds or thousands of transistors, which allowed for the creation of much more powerful and complex electronic devices. Today, microchips used in modern electronic devices often have billions of transistors, are capable of performing a wide range of complex tasks, and are produced by a variety of companies, from AMD and Intel to Texas Instruments.

    Kilby’s work on the integrated circuit was recognized with the Nobel Prize in Physics in 2000. Today, integrated circuits are an essential component of many electronic devices, including computers, smartphones, and other electronic devices, and have had a profound impact on modern society.

    Processors that followed the 4004 (introduced below) included the 8008 (3,100 transistors, introduced in 1972), the 8080, introduced in 1974 (with 6,500 transistors), and the 8086, which hit the market in 1978. These processors were even more powerful than the 4004 and were used in many computers and other electronic devices.
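
    Those counts are enough to check the growth rate. A least-squares fit on log2 of the transistor counts quoted in this article (2,300 for the 4004 in 1971, detailed below; 3,100 for the 8008 in 1972; 6,500 for the 8080 in 1974) recovers a doubling time of roughly two years, in line with Moore’s Law:

```python
import math

# Transistor counts quoted in the article: the 4004 (1971), 8008 (1972)
# and 8080 (1974). A least-squares fit on log2(count) gives the implied
# doubling time; Moore's Law would predict roughly two years.
data = [(1971, 2300), (1972, 3100), (1974, 6500)]

xs = [year for year, _ in data]
ys = [math.log2(count) for _, count in data]
mean_x, mean_y = sum(xs) / len(xs), sum(ys) / len(ys)

# Slope of the fit, in doublings per year.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sum(
    (x - mean_x) ** 2 for x in xs
)
print(f"implied doubling time: {1 / slope:.2f} years")  # ~1.98 years
```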

    Will Quantum Computing Companies Explode in number when we get the Integrated Circuit equivalent?

    We are currently dealing with single qubits or a handful of qubits per device, though we are now approaching a few hundred. By analogy with the single-transistor era, are we still in the field of single qubits or meagre numbers of qubits per chip or device?

    The Intel 4004 was one of the first microprocessors: computer processors designed to be used in small electronic devices. The 4004 was developed by Intel Corporation in 1971 and was the first commercially produced microprocessor. It had a total of 2,300 transistors. As yet, there are no quantum processors commercially produced at scale that we know of, even as qubit counts, analogous to transistor counts, keep increasing. Perhaps the closest might be the SpinQ devices, but nothing approaching a mass market or commercial scale.

    2
    Image of the Intel 4004 courtesy of the London Science Museum.

    The 4004 was a very basic microprocessor with a minimal instruction set and relatively low performance compared to modern microprocessors. However, it was a significant step forward in the development of microprocessors. It paved the way for creating more advanced and powerful microprocessors used in computers and other electronic devices today.

    Innovation that doesn’t survive

    An essential thread we can see is that despite innovation, these companies didn’t have the longevity of companies such as Microsoft, Apple or IBM. What was in the DNA that was so different?

    One thesis running through companies such as DEC, Sun, and Fairchild is that their products never reached the end consumer as mass commercial products, which might be why they never survived, though it’s more complicated than that. These companies never became household names in the way IBM, Apple or Microsoft did. We might see the same with quantum computing companies, where there is a race to innovate but the real winners might be those who can wrap that innovation into something that does more than “just” prove the technology works. Think price points, scale, knowing the applications, the customers and more. Could a second wave of companies come along that exploits the technology the original quantum companies produced and leapfrogs them, just as Intel leapfrogged the likes of Shockley and Fairchild Semiconductor, which arguably invented the silicon chip?

    Perhaps IBM’s creation of the Personal Computer was pivotal in cementing its legacy because it aimed to put a computer on every desk. Later on, we had the Home Computer revolution, which aimed to put a computer in every home. There has to be an economy of scale that will permit quantum companies to survive, which means a mass market. At the moment, just as in the early days of computing, quantum products and services are not mass-market. What is in the DNA of today’s quantum pioneers that will enable them to prosper?

    Killer Applications

    Quantum computing devices, loosely defined to cover everything from full technology solutions downward, remain confined to specialist uses. We don’t yet have the application pressure that creates demand for massive quantum workloads. This contrasts with AI, whose workflows create demand for specialist processors such as GPUs. Just look at the explosion of AI, which runs on GPUs and is, of course, becoming a mass market with the likes of ChatGPT.

    Intel, AMD, NVIDIA and even IBM transitioned from industrial producers to consumer champions. For example, Intel launched the “Intel Inside” campaign in 1991 to promote its microprocessors in personal computers and other electronic devices and to convey the idea that Intel’s microprocessors were a key component of many such devices.

    Do today’s quantum hardware creators such as D-Wave, IonQ, and Xanadu have enough flexibility to survive the pressure of the eventual market needs? Some have stated that quantum computing is waiting for a killer application to drive adoption; therefore, quantum businesses must be able to provide the services that companies need. We believe those businesses that can be vertically integrated are likely to be successful.

    We think those quantum computing companies that do the following will have the best chance of success:

    Vertical integration, or full-stack. Those companies that can take real-world problems and solve them end to end on their own stack will likely deliver the most value, consistently looking for ways to solve real-world problems and give an absolute quantum advantage. Who can take a business problem, provide a solution in just a few steps, and run it on quantum hardware at a competitive price? Companies must solve consumers’ pain points.
    Deep push into applications with application frameworks and associated services. Creating the tools that enable quantum workflows. Think languages, programming frameworks, environments, and application frameworks. Think of specialist frameworks for solving problems such as chemistry, optimization, and integrations with business workflows.
    A culture of taking risks to plug the gaps, ensuring they can capture more of the process: consultancy, education, outreach and more. Specialisation may only go so far; organisations must be dynamic and nimble to ensure they don’t become technology dinosaurs. They must invest and experiment across the board.
    Ecosystems. The successful giants of the past made sure there was an ecosystem. They didn’t look to create a stranglehold. Instead, they looked to build a “flywheel” that continued to spin and accelerate as more users came on board. Think marketplaces, sharing, algorithms, knowledge and experience. Look at how Amazon’s AWS (Amazon Web Services) has shaken up the way that companies deploy digital assets.
    The realization that technological innovation isn’t enough. The quantum industry needs “killer applications”. The industry needs “real” value to be created. It’s not about the number of qubits but how they can create value for end users. That could be a $/Q (cost per qubit) pricing strategy, just as today we effectively buy millions of transistors per dollar when we buy a computer with a microprocessor.

    IBM moved into personal computing and started a whole new category of products. Apple launched the iPad, again a whole new product category, and do not forget the iPhone, an entirely new category before it. Businesses which create innovations, new products and experiments stay alive. Intel survived the push into commoditized hardware by extolling the benefits of its processors to end consumers; whether that works today is another matter, as consumers care more about what they can do with a machine: battery life, aesthetics and the applications they can run.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Quantum Zeitgeist is the original online publication for Quantum Computing News, Quantum technology Features and Articles on the Quantum Industry around the globe.

    Quantum Computing is perhaps one of the most revolutionary technologies of our time and could change multiple industries and the fabric of our world, affecting us all.

    Quantum technologies are not just about computing but represent new ways to exchange data in quantum security, quantum encryption and the quantum internet.

    See how Quantum is progressing towards a Qubit Future.

     
  • richardmitnick 9:19 am on January 3, 2023 Permalink | Reply
    Tags: , "What’s next for the chip industry", , Computer Science & Technology   

    From “The MIT Technology Review” : “What’s next for the chip industry” 

    From “The MIT Technology Review”

    1.3.23
    Zeyi Yang

    1
    Getty Images

    The year ahead was already shaping up to be a hard one for semiconductor businesses. Famously defined by cycles of soaring and dwindling demand, the chip industry is expected to see declining growth this year as the demand for consumer electronics plateaus.

    But concerns over the economic cycle—and the challenges associated with making ever more advanced chips—could easily be eclipsed by geopolitics.

    In recent months, the US has instituted the widest restrictions ever on what chips can be sold to China and who can work for Chinese companies.

    [Read Chip War: The Fight for the World’s Most Critical Technology by Chris Miller (below)]

    At the same time, it has targeted the supply side of the chip industry, introducing generous federal subsidies to attract manufacturing back to the US. Other governments in Europe and Asia that are home to major chip companies have introduced similar policies to maintain their own positions in the industry.  

    As these changes continue to take effect in 2023, they will throw a new element of uncertainty into an industry that has long relied on globally distributed supply chains and a fair amount of freedom in deciding whom it does business with.

    What will these new geopolitical machinations mean for the more than $500 billion semiconductor industry? MIT Technology Review asked experts how they think it will all play out in the coming year. Here’s what they said.

    The great “reshoring” push

    The US committed $52 billion to semiconductor manufacturing and research in 2022 with the CHIPS and Science Act. Of that, $39 billion will be used to subsidize building factories domestically. Companies will be able to officially apply for that funding in February 2023, and the awards will be announced on a rolling basis. 

    Some of the funding could be used to help firms with US-based factories manufacture military chips; the US government has long been concerned about the national security risks of sourcing chips from abroad. “Probably more and more manufacturing would be reinstated within the US with the purpose to rebuild the defense supply chain,” says Jason Hsu, a former legislator in Taiwan who is currently researching the intersection of semiconductors and geopolitics as a senior fellow at Harvard’s Kennedy School. Hsu says that defense applications are likely one of the main reasons the Taiwanese chip giant TSMC decided to invest $40 billion in manufacturing five- and three-nanometer chips, currently the two most advanced generations, in the US. 

    But “reshoring” commercial chip production is another matter. Most of the chips that go into consumer products and data centers, among other commercial applications, are produced in Asia. Moving that manufacturing to the US would be likely to push up costs and make chips less commercially competitive, even with government subsidies. In April 2022, TSMC founder Morris Chang said that chip manufacturing costs in the US are 50% higher than in Taiwan.
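
    Chang’s figure implies a simple parity calculation. Normalizing Taiwan’s cost to 1, a subsidy would have to cover about a third of the US cost just to reach price parity (illustrative arithmetic only, not a model of actual fab economics):

```python
# Back-of-envelope from Morris Chang's "50% higher" figure.
# Illustrative arithmetic only, not actual fab economics.
taiwan_cost = 1.0            # normalize Taiwan's manufacturing cost to 1
us_cost = 1.5 * taiwan_cost  # "50% higher than in Taiwan"

subsidy_needed = us_cost - taiwan_cost
share_of_us_cost = subsidy_needed / us_cost
print(f"subsidy must cover {share_of_us_cost:.0%} of the US cost")  # 33%
```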

    “The problem is going to be that Apple, Qualcomm, and Nvidia—they’re going to buy the chips manufactured in the US—are going to have to figure out how to balance those costs, because it’s going to still be cheaper to source those chips in Taiwan,” says Paul Triolo, a senior vice president at the business strategy firm Albright Stonebridge, which advises companies operating in China.

    If chip companies can’t figure out how to pay the higher labor costs in the US or keep getting subsidies from the government—which is hard to guarantee—they won’t have an incentive to keep investing in US production in the long term.

    And the US government is not the only one that wants to attract more chip factories. Taiwan passed a subsidy act in November to give chip companies large tax breaks. Japan and South Korea are doing the same.

    Woz Ahmed, a UK-based consultant and former chip industry executive, expects that subsidies from the European Union will also be moving along in 2023, although he says they likely won’t be finalized until the following year. “It’ll take them a lot longer than it will [take] the US, because of the horse trading amongst all the member states,” he says. 

    Navigating a newly restricted market

    The controls the US introduced in October on the export of advanced chips and technologies represented a major escalation in the stranglehold on China’s chip industry. Rules that once barred selling this advanced tech to a few specific Chinese companies were expanded to apply to virtually all entities in China. There are also novel measures, like restricting the sale of essential chipmaking equipment to China.

    The policies put the industry in uncharted enforcement territory. Which chips and manufacturing technologies will be considered “advanced”? If a Chinese company makes both advanced and older-generation chips, can it still source US technologies for the latter? 

    The US Department of Commerce answered some questions in a Q&A at the end of October. Among other things, it clarified that less advanced chip production lines can be spared the restrictions if they are in a separate factory building. But it’s still unclear how—and to what extent—the rules will be enforced. 

    We’ll see this play out in 2023. Chinese companies will likely look for ways to circumvent the rules. At least one has already tried to make its chips seem less advanced. Non-Chinese companies will also be motivated to find work-arounds—the Chinese market is gigantic and lucrative. 

    “If you don’t have enough enforcement people on the ground, or they can’t get the access, as soon as people realize that, lots of people will break the rules,” Ahmed says.

    Several experts believe that the US may hit China with yet more restrictions this year. Those rules may take the form of more export controls, a review process for outbound US investments, or other moves targeting chip-adjacent industries like quantum computing. 

    Not everyone agrees. Chris Miller, an international history professor at Tufts University, thinks the US administration may take a break and focus on the current restrictions. “I don’t expect major expansion of export controls on chips [in 2023],” says Miller, the author of the new book Chip War: The Fight for the World’s Most Critical Technology.

    2

    “The Biden administration spent most of the first two years in office working on those restrictions. I think they are hoping that the policy sticks and they don’t have to make changes to it for some time.”

    How China will respond

    So far, the Chinese government has had little response to the new US export controls beyond some diplomatic statements and a legal dispute filed with the World Trade Organization, which is unlikely to yield much.

    Will there be a more dramatic response to come? Most experts say no. China doesn’t seem to have a big enough advantage within the chips sector to significantly hit back at the US with trade restrictions of its own. “The Americans own enough of the core technology that they can [use it] against people who are downstream in the supply chain, like the Chinese. So by definition, that means [China doesn’t] have tools for retaliation,” says John Lee, the director of East West Futures Consulting. 

    But the country does control 80% of the world’s refining capacity for rare-earth materials, which are essential in making both military products like parts for fighter jets and everyday consumer device components like batteries and screens. Restricting exports could provide China with some leverage. The Chinese could also choose to sanction a few US companies, whether in the chip industry or not, to send a message.

    But so far, China doesn’t seem interested in a scorched-earth path when it comes to semiconductors. “I think the Chinese leaders realized that that approach will be just as costly to China as it would be to the US,” says Miller. The current Chinese chip industry cannot survive without working with the global supply chain—it depends on other companies in other countries for lithography machines, core chip IP, and wafers, so avoiding aggressive retaliation that further poisons the business environment is “probably the smartest strategy for China,” he says. 

    Instead of hitting back at the US, China is likely to focus more on propping up the domestic chip industry. It’s been reported [Reuters] that China may announce a trillion-yuan ($143 billion) support package for domestic companies as soon as the first quarter of 2023. Offering generous subsidies is a tried and tested method that has helped boost the Chinese semiconductor industry in the last decade. But there remains the question of how to allocate that funding efficiently and to the right companies, especially after the efficiency of China’s flagship government chip investment fund was questioned in 2022 and shaken by high-level corruption investigations.

    The Taiwan question

    The US doesn’t call all the shots. To pull off its chip tech blockade, it must coordinate closely with governments controlling key processes of chipmaking that China can’t replace with domestic alternatives. These include those of the Netherlands, Japan, South Korea, and Taiwan.

    That won’t be as easy as it sounds, because despite their ideological differences with China, these places also have an economic interest in maintaining the trade relationship.

    The Netherlands and Japan have reportedly agreed [Bloomberg] to codify some of the US export control rules in their own countries. But the devil is in the fine print. “There are certainly voices supporting the Americans on this,” says Lee, who’s based in Germany. “But there’re also pretty strong voices arguing that to simply follow the Americans in lockstep on this would be bad for European interests.” Peter Wennink, CEO of Dutch lithography equipment company ASML, has said that his company “sacrificed” for the export controls while American companies benefited.

    Fissures between countries may grow bigger as time goes on. “The history of these tech restriction coalitions shows that they are complex to manage over time and they require active management to keep them functional,” Miller says.

    Taiwan is in an especially awkward position. Because of their geographical proximity and historical relationship, its economy is heavily entangled with that of China. Many Taiwanese chip companies, like TSMC, sell to Chinese companies and build factories there. In October, the US granted TSMC a one-year exemption from the export restrictions, but the exemption may not be renewed when it expires in 2023. There’s also the possibility that a military conflict between Beijing and Taipei would derail all chip manufacturing activities, but most experts don’t see that happening in the near term. 

    “So Taiwanese companies must be hedging against the uncertainties,” Hsu says. This doesn’t mean they will pull out from all their operations in China, but they may consider investing more in overseas facilities, like the two chip fabs TSMC plans to build in Arizona. 

    As Taiwan’s chip industry drifts closer towards the US and an alliance solidifies around the American export-control regime, the once globalized semiconductor industry comes one step closer to being separated by ideological lines. “Effectively, we will be entering the world of two chips,” Hsu says, with the US and its allies representing one of those worlds and the other comprising China and the various countries in Southeast Asia, the Middle East, Eurasia, and Africa where China is pushing for its technologies to be adopted. Countries that have traditionally relied on China’s financial aid and trade deals with that country will more likely accept the Chinese standards when building their digital infrastructure, Hsu says.

    Though it would unfold very slowly, Hsu says this decoupling is beginning to seem inevitable. Governments will need to start making contingency plans for when it happens, he says: “The plan B should be—what’s our China strategy?”

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The mission of “The MIT Technology Review” is to equip its audiences with the intelligence to understand a world shaped by technology.

     