Tagged: Artificial Intelligence

  • richardmitnick 4:56 pm on January 12, 2022 Permalink | Reply
    Tags: "Data Science- Refining Data into Knowledge and Turning Knowledge into Action", , Artificial Intelligence, New tools for collecting reams of data from massive systems—from electron microscopes to smart watches—are what have allowed computational power to turn entire fields on their heads., , The University of Pennsylvania School of Engineering and Applied Science (US)   

    From The University of Pennsylvania School of Engineering and Applied Science (US) at The University of Pennsylvania (US) : “Data Science- Refining Data into Knowledge and Turning Knowledge into Action” 

    From The University of Pennsylvania School of Engineering and Applied Science (US)

    at

    The University of Pennsylvania (US)

    January 6, 2022
    Janelle Weaver

    Jennifer Phillips-Cremins, Rob Riggleman, Dan Roth, (upper row, left to right) Victor Preciado, Eric Stach, and Paris Perdikaris (bottom row, left to right) each use elements of data science in their fields of research, which cut across topics as diverse as genetics, medical imaging, materials design and more.

    More data is being produced across diverse fields within science, engineering, and medicine than ever before, and our ability to collect, store, and manipulate it grows by the day. With scientists of all stripes reaping the raw materials of the digital age, there is an increasing focus on developing better strategies and techniques for refining this data into knowledge, and that knowledge into action.

    Enter data science, where researchers try to sift through and combine this information to understand relevant phenomena, build or augment models, and make predictions.

    One powerful technique in data science’s armamentarium is machine learning, a type of artificial intelligence that enables computers to automatically generate insights from data without being explicitly programmed as to which correlations they should attempt to draw.
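
    As a toy illustration of that idea, the classifier below is never given a rule for what separates the two groups; it infers a label for a new point purely from labeled examples. All data points here are invented, and the nearest-neighbor method is just one simple instance of machine learning, not anything specific to the Penn research described in the article.

```python
# A minimal illustration of "learning from data": a 1-nearest-neighbor
# classifier infers labels from examples rather than hand-coded rules.

def nearest_neighbor(train, query):
    """Return the label of the training point closest to `query`."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    features, label = min(train, key=lambda pair: dist(pair[0], query))
    return label

# Toy training set: (feature vector, label) pairs.
train = [((1.0, 1.0), "benign"), ((1.2, 0.9), "benign"),
         ((6.0, 6.5), "anomalous"), ((5.8, 7.1), "anomalous")]

print(nearest_neighbor(train, (1.1, 1.0)))  # near the "benign" cluster
print(nearest_neighbor(train, (6.2, 6.8)))  # near the "anomalous" cluster
```

    No correlation was spelled out in advance; the decision boundary emerges entirely from the examples, which is the essence of the approach described above.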

    Advances in computational power, storage, and sharing have enabled machine learning to be more easily and widely applied, but new tools for collecting reams of data from massive, messy, and complex systems—from electron microscopes to smart watches—are what have allowed it to turn entire fields on their heads.

    “This is where data science comes in,” says Susan Davidson, Weiss Professor in Computer and Information Science (CIS) at Penn’s School of Engineering and Applied Science. “In contrast to fields where we have well-defined models, like in physics, where we have Newton’s laws and the theory of relativity, the goal of data science is to make predictions where we don’t have good models: a data-first approach using machine learning rather than using simulation.”

    Penn Engineering’s formal data science efforts include the establishment of the Warren Center for Network & Data Sciences, which brings together researchers from across Penn with the goal of fostering research and innovation in interconnected social, economic and technological systems. Other research communities, including Penn Research in Machine Learning and the student-run Penn Data Science Group, bridge the gap between schools, as well as between industry and academia. Programmatic opportunities for Penn students include a Data Science minor for undergraduates, and a Master of Science in Engineering in Data Science, which is directed by Davidson and jointly administered by CIS and Electrical and Systems Engineering.

    Penn academic programs and researchers on the leading edge of the data science field will soon have a new place to call home: Amy Gutmann Hall. The 116,000-square-foot, six-floor building, located on the northeast corner of 34th and Chestnut Streets near Lauder College House, will centralize resources for researchers and scholars across Penn’s 12 schools and numerous academic centers while making the tools of data analysis more accessible to the entire Penn community.

    Faculty from all six departments in Penn Engineering are at the forefront of developing innovative data science solutions, primarily relying on machine learning, to tackle a wide range of challenges. Researchers show how they use data science in their work to answer fundamental questions in topics as diverse as genetics, “information pollution,” medical imaging, nanoscale microscopy, materials design, and the spread of infectious diseases.

    See the full article here .

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Pennsylvania School of Engineering and Applied Science is an undergraduate and graduate school of The University of Pennsylvania. The School offers programs that emphasize hands-on study of engineering fundamentals (with an offering of approximately 300 courses) while encouraging students to leverage the educational offerings of the broader University. Engineering students can also take advantage of research opportunities through interactions with Penn’s School of Medicine, School of Arts and Sciences and the Wharton School.

    Penn Engineering offers bachelor's, master's and Ph.D. degree programs in contemporary fields of engineering study. The nationally ranked bioengineering department offers the School's most popular undergraduate degree program. The Jerome Fisher Program in Management and Technology, offered in partnership with the Wharton School, allows students to simultaneously earn a Bachelor of Science degree in Economics as well as a Bachelor of Science degree in Engineering. SEAS also offers several master's programs, which include: Executive Master's in Technology Management, Master of Biotechnology, Master of Computer and Information Technology, Master of Computer and Information Science and a Master of Science in Engineering in Telecommunications and Networking.

    History

    The study of engineering at The University of Pennsylvania can be traced back to 1850 when the University trustees adopted a resolution providing for a professorship of “Chemistry as Applied to the Arts”. In 1852, the study of engineering was further formalized with the establishment of the School of Mines, Arts and Manufactures. The first Professor of Civil and Mining Engineering was appointed in 1852. The first graduate of the school received his Bachelor of Science degree in 1854. Since that time, the school has grown to six departments. In 1973, the school was renamed as the School of Engineering and Applied Science.

    The early growth of the school benefited from the generosity of two Philadelphians: John Henry Towne and Alfred Fitler Moore. Towne, a mechanical engineer and railroad developer, bequeathed the school a gift of $500,000 upon his death in 1875. The main administration building for the school still bears his name. Moore was a successful entrepreneur who made his fortune manufacturing telegraph cable. A 1923 gift from Moore established the Moore School of Electrical Engineering, which is the birthplace of the first electronic general-purpose Turing-complete digital computer, ENIAC, in 1946.

    During the latter half of the 20th century the school continued to break new ground. In 1958, Barbara G. Mandell became the first woman to enroll as an undergraduate in the School of Engineering. In 1965, the university acquired two sites that were formerly used as U.S. Army Nike Missile Base (PH 82L and PH 82R) and created the Valley Forge Research Center. In 1976, the Management and Technology Program was created. In 1990, a Bachelor of Applied Science in Biomedical Science and Bachelor of Applied Science in Environmental Science were first offered, followed by a master’s degree in Biotechnology in 1997.

    The school continues to expand with the addition of the Melvin and Claire Levine Hall for computer science in 2003, Skirkanich Hall for Bioengineering in 2006, and the Krishna P. Singh Center for Nanotechnology in 2013.

    Academics

    Penn’s School of Engineering and Applied Science is organized into six departments:

    Bioengineering
    Chemical and Biomolecular Engineering
    Computer and Information Science
    Electrical and Systems Engineering
    Materials Science and Engineering
    Mechanical Engineering and Applied Mechanics

    The school’s Department of Bioengineering, originally named Biomedical Electronic Engineering, consistently garners a top-ten ranking at both the undergraduate and graduate level from U.S. News & World Report. The department also houses the George H. Stephenson Foundation Educational Laboratory & Bio-MakerSpace (aka Biomakerspace) for training undergraduate through PhD students. It is Philadelphia’s and Penn’s only Bio-MakerSpace and it is open to the Penn community, encouraging a free flow of ideas, creativity, and entrepreneurship between Bioengineering students and students throughout the university.

    Founded in 1893, the Department of Chemical and Biomolecular Engineering is “America’s oldest continuously operating degree-granting program in chemical engineering.”

    The Department of Electrical and Systems Engineering is recognized for its research in electroscience, systems science and network systems and telecommunications.

    Originally established in 1946 as the School of Metallurgical Engineering, the Materials Science and Engineering Department “includes cutting edge programs in nanoscience and nanotechnology, biomaterials, ceramics, polymers, and metals.”

    The Department of Mechanical Engineering and Applied Mechanics draws its roots from the Department of Mechanical and Electrical Engineering, which was established in 1876.

    Each department houses one or more degree programs. The Chemical and Biomolecular Engineering, Materials Science and Engineering, and Mechanical Engineering and Applied Mechanics departments each house a single degree program.

    Bioengineering houses two programs (both a Bachelor of Science in Engineering degree as well as a Bachelor of Applied Science degree). Electrical and Systems Engineering offers four Bachelor of Science in Engineering programs: Electrical Engineering, Systems Engineering, Computer Engineering, and Networked & Social Systems Engineering, the latter two of which are co-housed with Computer and Information Science (CIS). The CIS department, like Bioengineering, offers its Computer and Information Science program under both bachelor's degrees. CIS also houses Digital Media Design, a program jointly operated with PennDesign.

    Research

    Penn’s School of Engineering and Applied Science is a research institution. SEAS research strives to advance science and engineering and to achieve a positive impact on society.

    Academic life at The University of Pennsylvania is unparalleled, with 100 countries and every U.S. state represented in one of the Ivy League's most diverse student bodies. Consistently ranked among the top 10 universities in the country, The University of Pennsylvania enrolls 10,000 undergraduate students and welcomes an additional 10,000 students to its world-renowned graduate and professional schools.

    The University of Pennsylvania's award-winning educators and scholars encourage students to pursue inquiry and discovery, follow their passions, and address the world's most challenging problems through an interdisciplinary approach.

    The University of Pennsylvania (US) is a private Ivy League research university in Philadelphia, Pennsylvania. The university claims a founding date of 1740 and is one of the nine colonial colleges chartered prior to the U.S. Declaration of Independence. Benjamin Franklin, The University of Pennsylvania's founder and first president, advocated an educational program that trained leaders in commerce, government, and public service, similar to a modern liberal arts curriculum.

    The University of Pennsylvania has four undergraduate schools as well as twelve graduate and professional schools. Schools enrolling undergraduates include the College of Arts and Sciences; the School of Engineering and Applied Science; the Wharton School; and the School of Nursing. Penn's One University Policy allows students to enroll in classes in any of The University of Pennsylvania's twelve schools. Among its highly ranked graduate and professional schools are a law school whose first professor wrote the first draft of the United States Constitution, the first school of medicine in North America (Perelman School of Medicine, 1765), and the first collegiate business school (Wharton School, 1881).

    The University of Pennsylvania is also home to the first “student union” building and organization (Houston Hall, 1896), the first Catholic student club in North America (Newman Center, 1893), the first double-decker college football stadium (Franklin Field, 1924 when second deck was constructed), and Morris Arboretum, the official arboretum of the Commonwealth of Pennsylvania. The first general-purpose electronic computer (ENIAC) was developed at The University of Pennsylvania and formally dedicated in 1946. In 2019, the university had an endowment of $14.65 billion, the sixth-largest endowment of all universities in the United States, as well as a research budget of $1.02 billion. The university’s athletics program, the Quakers, fields varsity teams in 33 sports as a member of the NCAA Division I Ivy League conference.

    As of 2018, distinguished alumni and/or Trustees include three U.S. Supreme Court justices; 32 U.S. senators; 46 U.S. governors; 163 members of the U.S. House of Representatives; 8 signers of the Declaration of Independence and 7 signers of the U.S. Constitution (4 of whom signed both representing two-thirds of the six people who signed both); 24 members of the Continental Congress; 14 foreign heads of state and 2 presidents of the United States, including Donald Trump. As of October 2019, 36 Nobel laureates; 80 members of the American Academy of Arts and Sciences(US); 64 billionaires; 29 Rhodes Scholars; 15 Marshall Scholars and 16 Pulitzer Prize winners have been affiliated with the university.

    History

    The University of Pennsylvania considers itself the fourth-oldest institution of higher education in the United States, though this is contested by Princeton University (US) and Columbia University (US). The university also considers itself the first university in the United States with both undergraduate and graduate studies.

    In 1740, a group of Philadelphians joined together to erect a great preaching hall for the traveling evangelist George Whitefield, who toured the American colonies delivering open-air sermons. The building was designed and built by Edmund Woolley and was the largest building in the city at the time, drawing thousands of people the first time preaching was held there. It was initially planned to serve as a charity school as well, but a lack of funds forced plans for the chapel and school to be suspended. According to Franklin's autobiography, it was in 1743 that he first had the idea to establish an academy, "thinking the Rev. Richard Peters a fit person to superintend such an institution". However, Peters declined a casual inquiry from Franklin and nothing further was done for another six years. In the fall of 1749, now more eager to create a school to educate future generations, Benjamin Franklin circulated a pamphlet titled Proposals Relating to the Education of Youth in Pensilvania, his vision for what he called a Public Academy of Philadelphia. Unlike the other colonial colleges that existed in 1749—Harvard University (US), William & Mary (US), Yale University (US), and The College of New Jersey [later Princeton University (US)]—Franklin's new school would not focus merely on education for the clergy. He advocated an innovative concept of higher education, one which would teach both the ornamental knowledge of the arts and the practical skills necessary for making a living and doing public service. The proposed program of study could have become the nation's first modern liberal arts curriculum, although it was never implemented because Anglican priest William Smith (1727-1803), who became the first provost, and other trustees strongly preferred the traditional curriculum.

    Franklin assembled a board of trustees from among the leading citizens of Philadelphia, the first such non-sectarian board in America. At the first meeting of the 24 members of the board of trustees on November 13, 1749, the issue of where to locate the school was a prime concern. Although a lot across Sixth Street from the old Pennsylvania State House (later renamed and famously known since 1776 as “Independence Hall”), was offered without cost by James Logan, its owner, the trustees realized that the building erected in 1740, which was still vacant, would be an even better site. The original sponsors of the dormant building still owed considerable construction debts and asked Franklin’s group to assume their debts and, accordingly, their inactive trusts. On February 1, 1750, the new board took over the building and trusts of the old board. On August 13, 1751, the “Academy of Philadelphia”, using the great hall at 4th and Arch Streets, took in its first secondary students. A charity school also was chartered on July 13, 1753 by the intentions of the original “New Building” donors, although it lasted only a few years. On June 16, 1755, the “College of Philadelphia” was chartered, paving the way for the addition of undergraduate instruction. All three schools shared the same board of trustees and were considered to be part of the same institution. The first commencement exercises were held on May 17, 1757.

    The institution of higher learning was known as the College of Philadelphia from 1755 to 1779. In 1779, not trusting then-provost the Reverend William Smith’s “Loyalist” tendencies, the revolutionary State Legislature created a University of the State of Pennsylvania. The result was a schism, with Smith continuing to operate an attenuated version of the College of Philadelphia. In 1791, the legislature issued a new charter, merging the two institutions into a new University of Pennsylvania with twelve men from each institution on the new board of trustees.

    Penn has three claims to being the first university in the United States, according to university archives director Mark Frazier Lloyd: the 1765 founding of the first medical school in America made Penn the first institution to offer both “undergraduate” and professional education; the 1779 charter made it the first American institution of higher learning to take the name of “University”; and existing colleges were established as seminaries (although, as detailed earlier, Penn adopted a traditional seminary curriculum as well).

    After being located in downtown Philadelphia for more than a century, the campus was moved across the Schuylkill River to property purchased from the Blockley Almshouse in West Philadelphia in 1872, where it has since remained in an area now known as University City. Although Penn began operating as an academy or secondary school in 1751 and obtained its collegiate charter in 1755, it initially designated 1750 as its founding date; this is the year that appears on the first iteration of the university seal. Sometime later in its early history, Penn began to consider 1749 as its founding date and this year was referenced for over a century, including at the centennial celebration in 1849. In 1899, the board of trustees voted to adjust the founding date earlier again, this time to 1740, the date of “the creation of the earliest of the many educational trusts the University has taken upon itself”. The board of trustees voted in response to a three-year campaign by Penn’s General Alumni Society to retroactively revise the university’s founding date to appear older than Princeton University, which had been chartered in 1746.

    Research, innovations and discoveries

    The University of Pennsylvania is classified as an “R1” doctoral university: “Highest research activity.” Its economic impact on the Commonwealth of Pennsylvania for 2015 amounted to $14.3 billion. Penn’s research expenditures in the 2018 fiscal year were $1.442 billion, the fourth largest in the U.S. In fiscal year 2019 Penn received $582.3 million in funding from the National Institutes of Health(US).

    In line with its well-known interdisciplinary tradition, The University of Pennsylvania's research centers often span two or more disciplines. In the 2010–2011 academic year alone, five interdisciplinary research centers were created or substantially expanded; these include the Center for Health-care Financing; the Center for Global Women's Health at the Nursing School; the $13 million Morris Arboretum's Horticulture Center; the $15 million Jay H. Baker Retailing Center at Wharton; and the $13 million Translational Research Center at Penn Medicine. With these additions, The University of Pennsylvania now counts 165 research centers hosting a research community of over 4,300 faculty and over 1,100 postdoctoral fellows, 5,500 academic support staff and graduate student trainees. To further assist the advancement of interdisciplinary research, President Amy Gutmann established the "Penn Integrates Knowledge" title awarded to selected Penn professors "whose research and teaching exemplify the integration of knowledge". These professors hold endowed professorships and joint appointments between Penn's schools.

    The University of Pennsylvania is also among the most prolific producers of doctoral students. With 487 PhDs awarded in 2009, Penn ranks third in the Ivy League, only behind Columbia University (US) and Cornell University (US) (Harvard University (US) did not report data). It also has one of the highest numbers of post-doctoral appointees (933 in number for 2004–2007), ranking third in the Ivy League (behind Harvard and Yale University (US)) and tenth nationally.

    In most disciplines The University of Pennsylvania professors’ productivity is among the highest in the nation and first in the fields of epidemiology, business, communication studies, comparative literature, languages, information science, criminal justice and criminology, social sciences and sociology. According to the National Research Council nearly three-quarters of Penn’s 41 assessed programs were placed in ranges including the top 10 rankings in their fields, with more than half of these in ranges including the top five rankings in these fields.

    The University of Pennsylvania's research tradition has historically been complemented by innovations that shaped higher education. In addition to establishing the first medical school; the first university teaching hospital; the first business school; and the first student union The University of Pennsylvania was also the cradle of other significant developments. In 1852, Penn Law was the first law school in the nation to publish a law journal still in existence (then called The American Law Register, now the Penn Law Review, one of the most cited law journals in the world). Under the deanship of William Draper Lewis, the law school was also one of the first schools to emphasize legal teaching by full-time professors instead of practitioners, a system that is still followed today. The Wharton School was home to several pioneering developments in business education. It established the first research center in a business school in 1921 and the first center for entrepreneurship in 1973 and it regularly introduced novel curricula for which BusinessWeek wrote, "Wharton is on the crest of a wave of reinvention and change in management education".

    Several major scientific discoveries have also taken place at The University of Pennsylvania. The university is probably best known as the place where the first general-purpose electronic computer (ENIAC) was born in 1946 at the Moore School of Electrical Engineering. It was here also where the world’s first spelling and grammar checkers were created, as well as the popular COBOL programming language. The University of Pennsylvania can also boast some of the most important discoveries in the field of medicine. The dialysis machine used as an artificial replacement for lost kidney function was conceived and devised out of a pressure cooker by William Inouye while he was still a student at Penn Med; the Rubella and Hepatitis B vaccines were developed at Penn; the discovery of cancer’s link with genes; cognitive therapy; Retin-A (the cream used to treat acne), Resistin; the Philadelphia gene (linked to chronic myelogenous leukemia) and the technology behind PET Scans were all discovered by Penn Med researchers. More recent gene research has led to the discovery of the genes for fragile X syndrome, the most common form of inherited mental retardation; spinal and bulbar muscular atrophy, a disorder marked by progressive muscle wasting; and Charcot–Marie–Tooth disease, a progressive neurodegenerative disease that affects the hands, feet and limbs.

    Conductive polymer was also developed at The University of Pennsylvania by Alan J. Heeger, Alan MacDiarmid and Hideki Shirakawa, an invention that earned them the Nobel Prize in Chemistry. On faculty since 1965, Ralph L. Brinster developed the scientific basis for in vitro fertilization and the transgenic mouse at The University of Pennsylvania and was awarded the National Medal of Science in 2010. The theory of superconductivity was also partly developed at The University of Pennsylvania, by then-faculty member John Robert Schrieffer (along with John Bardeen and Leon Cooper). The university has also contributed major advancements in the fields of economics and management. Among the many discoveries are conjoint analysis, widely used as a predictive tool especially in market research; Simon Kuznets’s method of measuring Gross National Product; the Penn effect (the observation that consumer price levels in richer countries are systematically higher than in poorer ones) and the “Wharton Model” developed by Nobel-laureate Lawrence Klein to measure and forecast economic activity. The idea behind Health Maintenance Organizations also belonged to The University of Pennsylvania professor Robert Eilers, who put it into practice during then-President Nixon’s health reform in the 1970s.

    International partnerships

    Students can study abroad for a semester or a year at partner institutions such as the London School of Economics(UK), University of Barcelona [Universitat de Barcelona](ES), Paris Institute of Political Studies [Institut d’études politiques de Paris](FR), University of Queensland(AU), University College London(UK), King’s College London(UK), Hebrew University of Jerusalem(IL) and University of Warwick(UK).

     
  • richardmitnick 4:05 pm on December 6, 2021 Permalink | Reply
    Tags: "Evolution of intelligent data pipelines", Accelerating data science, Artificial Intelligence, As the volume variety and velocity of data continue to grow the need for intelligent pipelines is becoming critical to business operations., Logic and algorithms can be built into a data pipeline to create an “intelligent” data pipeline., ,   

    From The MIT Technology Review (US) : “Evolution of intelligent data pipelines” 

    From The MIT Technology Review (US)

    December 6, 2021
    Bill Schmarzo

    As the volume, variety, and velocity of data continue to grow, the need for intelligent pipelines is becoming critical to business operations.

    A study by Kearney titled The Impact of Analytics in 2020 highlights the untapped profitability and business impact for organizations looking for justification to accelerate their data science (AI / ML) and data management investments:

    -Explorers could improve profitability by 20% if they were as effective as Leaders
    -Followers could improve profitability by 55% if they were as effective as Leaders
    -Laggards could improve profitability by 81% if they were as effective as Leaders
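
    Read as relative uplifts, these figures translate directly into potential profit. The sketch below applies them to a hypothetical dollar figure (the maturity labels and percentages come from the study; the $100M example is invented):

```python
# Kearney's reported uplifts, read as the relative profitability
# improvement an organization could capture by matching the Leaders.
uplift = {"Explorers": 0.20, "Followers": 0.55, "Laggards": 0.81}

def potential_profit(current_profit, maturity):
    """Profit if the organization closed the gap to the Leaders."""
    return current_profit * (1 + uplift[maturity])

# A hypothetical Laggard earning $100M today could reach roughly $181M.
print(round(potential_profit(100.0, "Laggards"), 2))
```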

    The business, operational, and societal impacts could be staggering were it not for one significant organizational challenge—data. None other than the godfather of AI, Andrew Ng, has noted how data and data management hold organizations and society back from realizing the potential of AI and ML:

    “The model and the code for many applications are basically a solved problem. Now that the models have advanced to a certain point, we’ve got to make the data work as well.” — Andrew Ng

    Data is the heart of training AI and ML models. And high-quality, trusted data orchestrated through highly efficient and scalable pipelines means that AI can enable these compelling business and operational outcomes. Just as a healthy heart needs oxygen and reliable blood flow, so too do AI / ML engines need a steady stream of cleansed, accurate, enriched, and trusted data.

    For example, one CIO has a team of 500 data engineers managing over 15,000 extract, transform, and load (ETL) jobs that are responsible for acquiring, moving, aggregating, standardizing, and aligning data across hundreds of special-purpose data repositories (data marts, data warehouses, data lakes, and data lakehouses). They're performing these tasks in the organization's operational and customer-facing systems under ridiculously tight service level agreements (SLAs) to support their growing number of diverse data consumers. It seems Rube Goldberg must have become a data architect (Figure 1).

    Figure 1: Rube Goldberg data architecture.

    The debilitating spaghetti architecture of one-off, special-purpose, static ETL programs that move, cleanse, align, and transform data greatly inhibits the "time to insights" organizations need to fully exploit the unique economic characteristics of data, the world's most valuable resource according to The Economist.

    Emergence of intelligent data pipelines

    The purpose of a data pipeline is to automate and scale common and repetitive data acquisition, transformation, movement, and integration tasks. A properly constructed data pipeline strategy can accelerate and automate the processing associated with gathering, cleansing, transforming, enriching, and moving data to downstream systems and applications. As the volume, variety, and velocity of data continue to grow, the need for data pipelines that can linearly scale within cloud and hybrid cloud environments is becoming increasingly critical to the operations of a business.

    A data pipeline refers to a set of data processing activities that integrates both operational and business logic to perform advanced sourcing, transformation, and loading of data. A data pipeline can run on a scheduled basis, run in real time (streaming), or be triggered by a predetermined rule or set of conditions.
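
    The definition above can be sketched in a few lines of Python. The step names and toy records here are illustrative assumptions, not any particular product's API:

```python
# A minimal pipeline sketch: extract -> transform -> load, wrapped in a
# single runnable unit.

def extract():
    # Stand-in for reading records from a source system.
    return [{"id": 1, "value": " 42 "}, {"id": 2, "value": "17"}]

def transform(records):
    # Cleansing and standardizing: strip whitespace, cast to int.
    return [{"id": r["id"], "value": int(r["value"].strip())} for r in records]

def load(records, sink):
    sink.extend(records)

def run_pipeline(sink):
    """One pipeline run; the same function could be invoked on a
    schedule, fired by an event trigger, or applied to a stream."""
    load(transform(extract()), sink)

warehouse = []
run_pipeline(warehouse)
print(warehouse)  # [{'id': 1, 'value': 42}, {'id': 2, 'value': 17}]
```

    The same `run_pipeline` entry point covers all three execution modes in the definition: a scheduler, an event trigger, or a streaming loop would each simply call it at the appropriate moments.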

    Additionally, logic and algorithms can be built into a data pipeline to create an “intelligent” data pipeline. Intelligent pipelines are reusable and extensible economic assets that can be specialized for source systems and perform the data transformations necessary to support the unique data and analytic requirements for the target system or application.

    As machine learning and AutoML become more prevalent, data pipelines will increasingly become more intelligent. Data pipelines can move data between advanced data enrichment and transformation modules, where neural network and machine learning algorithms can create more advanced data transformations and enrichments. This includes segmentation, regression analysis, clustering, and the creation of advanced indices and propensity scores.
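
    As a hedged sketch of such an enrichment step, the snippet below segments customers by spend with a tiny one-dimensional two-means clustering. In a real intelligent pipeline this slot would hold a trained model; the spend figures and "low"/"high" labels are invented for illustration:

```python
# A toy enrichment module: cluster spend values into two groups and
# attach a segment label to each record as it flows through.

def two_means(values, iters=20):
    """Cluster values into two groups; return the sorted cluster centers."""
    centers = [min(values), max(values)]
    for _ in range(iters):
        low = [v for v in values if abs(v - centers[0]) <= abs(v - centers[1])]
        high = [v for v in values if abs(v - centers[0]) > abs(v - centers[1])]
        if low:
            centers[0] = sum(low) / len(low)
        if high:
            centers[1] = sum(high) / len(high)
    return sorted(centers)

spend = [10, 12, 11, 95, 102, 99]
lo, hi = two_means(spend)
segments = ["high" if abs(s - hi) < abs(s - lo) else "low" for s in spend]
print(segments)  # ['low', 'low', 'low', 'high', 'high', 'high']
```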

    Finally, one could integrate AI into the data pipelines such that they could continuously learn and adapt based upon the source systems, required data transformations and enrichments, and the evolving business and operational requirements of the target systems and applications.

    For example: an intelligent data pipeline in health care could analyze the grouping of health care diagnosis-related groups (DRG) codes to ensure consistency and completeness of DRG submissions and detect fraud as the DRG data is being moved by the data pipeline from the source system to the analytic systems.
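    A toy version of that in-flight check might look like the following. The DRG code range, baseline payments, and outlier threshold here are all hypothetical, chosen only to illustrate the idea of validating records as they move through a pipeline.

```python
# Flag invalid DRG codes and payments far above a typical value for
# that code, as records pass through a pipeline stage.
VALID_DRG = range(1, 1000)          # illustrative code range
TYPICAL_PAYMENT = {470: 12_000}     # hypothetical baseline per code

def check_claim(claim, outlier_ratio=5.0):
    """Return a list of flags for one claim record."""
    flags = []
    if claim["drg"] not in VALID_DRG:
        flags.append("invalid DRG code")
    baseline = TYPICAL_PAYMENT.get(claim["drg"])
    if baseline and claim["payment"] > outlier_ratio * baseline:
        flags.append("payment outlier")
    return flags

claims = [
    {"drg": 470, "payment": 12_000},
    {"drg": 470, "payment": 95_000},
    {"drg": 4700, "payment": 8_000},
]
flagged = [(c["drg"], check_claim(c)) for c in claims]
print(flagged)
# [(470, []), (470, ['payment outlier']), (4700, ['invalid DRG code'])]
```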

    Realizing business value

    Chief data officers and chief data analytic officers are being challenged to unleash the business value of their data—to apply data to the business to drive quantifiable financial impact.

    The ability to get high-quality, trusted data to the right data consumer at the right time in order to facilitate more timely and accurate decisions will be a key differentiator for today’s data-rich companies. A Rube Goldberg system of ELT scripts and disparate, special-purpose analytic repositories hinders an organization’s ability to achieve that goal.

    Learn more about intelligent data pipelines in Modern Enterprise Data Pipelines (eBook) by Dell Technologies here.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The mission of The MIT Technology Review (US) is to equip its audiences with the intelligence to understand a world shaped by technology.

     
  • richardmitnick 2:33 am on December 6, 2021 Permalink | Reply
    Tags: "Taking some of the guesswork out of drug discovery", , Artificial Intelligence, , , , GeoMol, , ,   

    From The Massachusetts Institute of Technology (US) : “Taking some of the guesswork out of drug discovery” 

    MIT News

    From The Massachusetts Institute of Technology (US)

    December 6, 2021
    Adam Zewe

    1
    MIT researchers have developed a deep learning model that can rapidly predict the likely 3D shapes of a molecule given a 2D graph of its structure. This technique could accelerate drug discovery. Image: Courtesy of the researchers edited by MIT News.

    In their quest to discover effective new medicines, scientists search for drug-like molecules that can attach to disease-causing proteins and change their functionality. It is crucial that they know the 3D shape of a molecule to understand how it will attach to specific surfaces of the protein.

    But a single molecule can fold in thousands of different ways, so solving that puzzle experimentally is a time-consuming and expensive process akin to searching for a needle in a molecular haystack.

    MIT researchers are using machine learning to streamline this complex task. They have created a deep learning model that predicts the 3D shapes of a molecule based solely on a 2D graph of its molecular structure. Molecules are typically represented as small graphs.

    Their system, GeoMol, processes molecules in only seconds and performs better than other machine learning models, including some commercial methods. GeoMol could help pharmaceutical companies accelerate the drug discovery process by narrowing down the number of molecules they need to test in lab experiments, says Octavian-Eugen Ganea, a postdoc in the Computer Science and Artificial Intelligence Laboratory (CSAIL) and co-lead author of the paper.

    “When you are thinking about how these structures move in 3D space, there are really only certain parts of the molecule that are actually flexible, these rotatable bonds. One of the key innovations of our work is that we think about modeling the conformational flexibility like a chemical engineer would. It is really about trying to predict the potential distribution of rotatable bonds in the structure,” says Lagnajit Pattanaik, a graduate student in the Department of Chemical Engineering and co-lead author of the paper.

    Other authors include Connor W. Coley, the Henri Slezynger Career Development Assistant Professor of Chemical Engineering; Regina Barzilay, the School of Engineering Distinguished Professor for AI and Health in CSAIL; Klavs F. Jensen, the Warren K. Lewis Professor of Chemical Engineering; William H. Green, the Hoyt C. Hottel Professor in Chemical Engineering; and senior author Tommi S. Jaakkola, the Thomas Siebel Professor of Electrical Engineering in CSAIL and a member of the Institute for Data, Systems, and Society. The research will be presented this week at the Conference on Neural Information Processing Systems.

    Mapping a molecule

    In a molecular graph, a molecule’s individual atoms are represented as nodes and the chemical bonds that connect them are edges.
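    In code, that graph representation is tiny. The sketch below builds an adjacency list for ethanol’s heavy atoms; it is an illustrative encoding, not GeoMol’s actual data structure.

```python
# Atoms as nodes, bonds as edges: ethanol (CH3-CH2-OH), heavy atoms only.
atoms = {0: "C", 1: "C", 2: "O"}   # node id -> element
bonds = [(0, 1), (1, 2)]           # undirected edges (single bonds)

# Adjacency list, the form a graph neural network typically consumes.
adj = {i: [] for i in atoms}
for a, b in bonds:
    adj[a].append(b)
    adj[b].append(a)

print(adj)  # {0: [1], 1: [0, 2], 2: [1]}
```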

    GeoMol leverages a recent tool in deep learning called a message passing neural network, which is specifically designed to operate on graphs. The researchers adapted a message passing neural network to predict specific elements of molecular geometry.
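    A single message-passing round can be sketched as follows: each node sums its neighbors’ feature vectors and passes the result through a transformation. Here a random matrix stands in for trained weights, and the graph and feature sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
adj = {0: [1], 1: [0, 2], 2: [1]}   # a 3-atom molecular graph
h = rng.normal(size=(3, 4))          # 3 nodes, 4 features each
W = rng.normal(size=(4, 4))          # message weight matrix (untrained)

# Each node aggregates messages from its neighbors...
messages = np.stack([sum(h[j] for j in adj[i]) for i in adj])
# ...then updates its state through a nonlinearity.
h_next = np.tanh(messages @ W)
print(h_next.shape)  # (3, 4)
```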

    Given a molecular graph, GeoMol initially predicts the lengths of the chemical bonds between atoms and the angles of those individual bonds. The way the atoms are arranged and connected determines which bonds can rotate.

    GeoMol then predicts the structure of each atom’s local neighborhood individually and assembles neighboring pairs of rotatable bonds by computing the torsion angles and then aligning them. A torsion angle describes the relative orientation of three connected segments, in this case the three chemical bonds that connect four atoms.
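    The torsion (dihedral) angle itself is standard geometry: given the 3D positions of four bonded atoms a-b-c-d, it is the signed angle between the planes (a, b, c) and (b, c, d). The following is a textbook computation, not GeoMol’s code.

```python
import numpy as np

def torsion_angle(a, b, c, d):
    """Signed dihedral angle (degrees) for four 3D points a-b-c-d."""
    b1, b2, b3 = b - a, c - b, d - c
    n1 = np.cross(b1, b2)                 # normal of plane (a, b, c)
    n2 = np.cross(b2, b3)                 # normal of plane (b, c, d)
    m1 = np.cross(n1, b2 / np.linalg.norm(b2))
    x, y = np.dot(n1, n2), np.dot(m1, n2)
    return np.degrees(np.arctan2(y, x))

# A butane-like backbone in the anti conformation (~180 degrees).
pts = [np.array(p, float) for p in
       [(0, 1, 0), (0, 0, 0), (1, 0, 0), (1, -1, 0)]]
print(round(torsion_angle(*pts)))  # 180
```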

    “Here, the rotatable bonds can take a huge range of possible values. So, the use of these message passing neural networks allows us to capture a lot of the local and global environments that influence that prediction. The rotatable bond can take multiple values, and we want our prediction to be able to reflect that underlying distribution,” Pattanaik says.

    Overcoming existing hurdles

    One major challenge to predicting the 3D structure of molecules is to model chirality. A chiral molecule can’t be superimposed on its mirror image, like a pair of hands (no matter how you rotate your hands, there is no way their features exactly line up). If a molecule is chiral, its mirror image won’t interact with the environment in the same way.

    This could cause medicines to interact with proteins incorrectly, which could result in dangerous side effects. Current machine learning methods often involve a long, complex optimization process to ensure chirality is correctly identified, Ganea says.

    Because GeoMol determines the 3D structure of each bond individually, it explicitly defines chirality during the prediction process, eliminating the need for optimization after the fact.

    After performing these predictions, GeoMol outputs a set of likely 3D structures for the molecule.

    “What we can do now is take our model and connect it end-to-end with a model that predicts this attachment to specific protein surfaces. Our model is not a separate pipeline. It is very easy to integrate with other deep learning models,” Ganea says.

    A “super-fast” model

    The researchers tested their model using a dataset of molecules and the likely 3D shapes they could take, which was developed by Rafael Gomez-Bombarelli, the Jeffrey Cheah Career Development Chair in Engineering, and graduate student Simon Axelrod.

    They evaluated how many of these likely 3D structures their model was able to capture, in comparison to machine learning models and other methods.

    In nearly all instances, GeoMol outperformed the other models on all tested metrics.

    “We found that our model is super-fast, which was really exciting to see. And importantly, as you add more rotatable bonds, you expect these algorithms to slow down significantly. But we didn’t really see that. The speed scales nicely with the number of rotatable bonds, which is promising for using these types of models down the line, especially for applications where you are trying to quickly predict the 3D structures inside these proteins,” Pattanaik says.

    In the future, the researchers hope to apply GeoMol to the area of high-throughput virtual screening, using the model to determine small molecule structures that would interact with a specific protein. They also want to keep refining GeoMol with additional training data so it can more effectively predict the structure of long molecules with many flexible bonds.

    “Conformational analysis is a key component of numerous tasks in computer-aided drug design, and an important component in advancing machine learning approaches in drug discovery,” says Pat Walters, senior vice president of computation at Relay Therapeutics, who was not involved in this research. “I’m excited by continuing advances in the field and thank MIT for contributing to broader learnings in this area.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

    The Massachusetts Institute of Technology (US) is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory (US), the MIT Bates Research and Engineering Center (US), and the Haystack Observatory (US), as well as affiliated laboratories such as the Broad Institute of MIT and Harvard (US) and Whitehead Institute (US).

    Massachusetts Institute of Technology-Haystack Observatory (US), Westford, Massachusetts, USA, Altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology (US) adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology (US). The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology (US) is a member of the Association of American Universities (AAU).

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia (US), wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but a combination with elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after Massachusetts Institute of Technology (US) was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed as the University of Massachusetts Amherst (US). In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    Massachusetts Institute of Technology (US) was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology (US) faculty and alumni rebuffed Harvard University (US) president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, the Massachusetts Institute of Technology (US) administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology (US) catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities (US) in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at Massachusetts Institute of Technology (US) that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    Massachusetts Institute of Technology (US)‘s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at Massachusetts Institute of Technology (US)’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, Massachusetts Institute of Technology (US) became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected Massachusetts Institute of Technology (US) profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of Massachusetts Institute of Technology (US) between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, Massachusetts Institute of Technology (US) no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and Massachusetts Institute of Technology (US)’s defense research. In this period Massachusetts Institute of Technology (US)’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. Massachusetts Institute of Technology (US) ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT (US) Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology (US) students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at Massachusetts Institute of Technology (US) over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, Massachusetts Institute of Technology (US)’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    Massachusetts Institute of Technology (US) has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology (US) classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    Massachusetts Institute of Technology (US) was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, Massachusetts Institute of Technology (US) launched OpenCourseWare to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, Massachusetts Institute of Technology (US) announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology (US) faculty adopted an open-access policy to make its scholarship publicly accessible online.

    Massachusetts Institute of Technology (US) has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology (US) community with thousands of police officers from the New England region and Canada. On November 25, 2013, Massachusetts Institute of Technology (US) announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of the Massachusetts Institute of Technology (US) community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced aLIGO (US) was designed and constructed by a team of scientists from California Institute of Technology (US), Massachusetts Institute of Technology (US), and industrial contractors, and funded by the National Science Foundation (US).

    Caltech /MIT Advanced aLigo

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology (US) physicist Rainer Weiss won the Nobel Prize in physics in 2017. Weiss, who is also a Massachusetts Institute of Technology (US) graduate, designed the laser interferometric technique, which served as the essential blueprint for the LIGO.

    The mission of Massachusetts Institute of Technology (US) is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 9:21 am on December 1, 2021 Permalink | Reply
    Tags: "Team builds first living robots that can reproduce", AI-designed Xenobots reveal entirely new form of biological self-replication—promising for regenerative medicine., , Artificial Intelligence, , , It’s very hard actually to get the system to keep reproducing., On its own the Xenobot parent made of some 3000 cells forms a sphere., , The AI came up with some strange designs after months of chugging away including one that resembled Pac-Man., The Wyss Institute (US), These new Xenobots can go out find cells and build copies of themselves. Again and again.   

    From The Wyss Institute (US): “Team builds first living robots that can reproduce” 

    Harvard bloc tiny
    Wyss Institute bloc
    From The Wyss Institute (US)

    at

    Harvard University (US)

    November 29, 2021
    Joshua Brown, The University of Vermont (US) Communications

    AI-designed Xenobots reveal entirely new form of biological self-replication—promising for regenerative medicine.

    1
    AI-designed (C-shaped) organisms push loose stem cells (white) into piles as they move through their environment. Credit: Douglas Blackiston and Sam Kriegman.

    To persist, life must reproduce. Over billions of years, organisms have evolved many ways of replicating, from budding plants to sexual animals to invading viruses.

    Now scientists at The University of Vermont (US), Tufts University (US), and the Wyss Institute for Biologically Inspired Engineering at Harvard University have discovered an entirely new form of biological reproduction—and applied their discovery to create the first-ever, self-replicating living robots.

    The same team that built the first living robots (“Xenobots,” assembled from frog cells—reported in 2020) [PNAS] has discovered that these computer-designed and hand-assembled organisms can swim out into their tiny dish, find single cells, gather hundreds of them together, and assemble “baby” Xenobots inside their Pac-Man-shaped “mouth”—that, a few days later, become new Xenobots that look and move just like themselves.

    And then these new Xenobots can go out, find cells, and build copies of themselves. Again and again.

    “With the right design—they will spontaneously self-replicate,” says Joshua Bongard, Ph.D., a computer scientist and robotics expert at the University of Vermont who co-led the new research.

    The results of the new research were published November 29, 2021, in PNAS.

    Into the Unknown

    In a Xenopus laevis frog, these embryonic cells would develop into skin. “They would be sitting on the outside of a tadpole, keeping out pathogens and redistributing mucus,” says Michael Levin, Ph.D., a professor of biology and director of the Allen Discovery Center at Tufts University and co-leader of the new research. “But we’re putting them into a novel context. We’re giving them a chance to reimagine their multicellularity.” Levin is also an Associate Faculty member at the Wyss Institute.

    And what they imagine is something far different than skin. “People have thought for quite a long time that we’ve worked out all the ways that life can reproduce or replicate. But this is something that’s never been observed before,” says co-author Douglas Blackiston, Ph.D., the senior scientist at Tufts University and the Wyss Institute who assembled the Xenobot “parents” and developed the biological portion of the new study.

    “This is profound,” says Levin. “These cells have the genome of a frog, but, freed from becoming tadpoles, they use their collective intelligence, a plasticity, to do something astounding.” In earlier experiments, the scientists were amazed that Xenobots could be designed to achieve simple tasks. Now they are stunned that these biological objects—a computer-designed collection of cells—will spontaneously replicate. “We have the full, unaltered frog genome,” says Levin, “but it gave no hint that these cells can work together on this new task,” of gathering and then compressing separated cells into working self-copies.

“These are frog cells replicating in a way that is very different from how frogs do it. No animal or plant known to science replicates in this way,” says Sam Kriegman, Ph.D., the lead author on the new study, who completed his Ph.D. in Bongard’s lab at UVM and is now a post-doctoral researcher at Tufts’ Allen Center and Harvard University’s Wyss Institute for Biologically Inspired Engineering.

On its own, the Xenobot parent, made of some 3,000 cells, forms a sphere. “These can make children but then the system normally dies out after that. It’s very hard, actually, to get the system to keep reproducing,” says Kriegman. But with an artificial intelligence program working on the Deep Green supercomputer cluster at UVM’s Vermont Advanced Computing Core, an evolutionary algorithm was able to test billions of body shapes in simulation—triangles, squares, pyramids, starfish—to find ones that allowed the cells to be more effective at the motion-based “kinematic” replication reported in the new research.

Deep Green supercomputer at The University of Vermont (US).

“We asked the supercomputer at UVM to figure out how to adjust the shape of the initial parents, and the AI came up with some strange designs after months of chugging away, including one that resembled Pac-Man,” says Kriegman. “It’s very non-intuitive. It looks very simple, but it’s not something a human engineer would come up with. Why one tiny mouth? Why not five? We sent the results to Doug and he built these Pac-Man-shaped parent Xenobots. Then those parents built children, who built grandchildren, who built great-grandchildren, who built great-great-grandchildren.” In other words, the right design greatly extended the number of generations.
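The search described above can be sketched in miniature: an evolutionary algorithm repeatedly mutates candidate body plans, scores each one, and keeps the best design found so far. Everything below is a toy illustration only — the grid encoding, the mutation rule, and the "fitness" score (which merely rewards mouth-like indentations) are hypothetical stand-ins for the team's actual physics simulations.

```python
import random

GRID = 8  # toy 8x8 body plan: 1 = cell present, 0 = empty space

def random_shape(rng):
    return [[rng.randint(0, 1) for _ in range(GRID)] for _ in range(GRID)]

def mutate(shape, rng, rate=0.05):
    # Flip a few random voxels to produce a child design.
    return [[1 - v if rng.random() < rate else v for v in row] for row in shape]

def fitness(shape):
    # Stand-in for the physics simulation: crudely reward "mouth"-like
    # indentations by counting empty sites with many filled neighbours.
    score = 0
    for i in range(1, GRID - 1):
        for j in range(1, GRID - 1):
            if shape[i][j] == 0:
                score += (shape[i - 1][j] + shape[i + 1][j]
                          + shape[i][j - 1] + shape[i][j + 1])
    return score

def evolve(generations=200, population=20, seed=0):
    rng = random.Random(seed)
    best = random_shape(rng)
    best_fit = fitness(best)
    for _ in range(generations):
        for _ in range(population):
            child = mutate(best, rng)
            f = fitness(child)
            if f >= best_fit:  # elitism: keep equal-or-better designs
                best, best_fit = child, f
    return best, best_fit

shape, score = evolve()
print("best toy fitness:", score)
```

In the real study, the scoring step was a full physics simulation of cells moving in fluid, which is why the search took months on a supercomputer rather than seconds on a laptop.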

An AI-designed “parent” organism (C shape; red) beside stem cells that have been compressed into a ball (“offspring”; green). Credit: Douglas Blackiston and Sam Kriegman.

    Kinematic replication is well-known at the level of molecules—but it has never been observed before at the scale of whole cells or organisms.

    “We’ve discovered that there is this previously unknown space within organisms, or living systems, and it’s a vast space,” says Bongard. “How do we then go about exploring that space? We found Xenobots that walk. We found Xenobots that swim. And now, in this study, we’ve found Xenobots that kinematically replicate. What else is out there?”

    Or, as the scientists write in the PNAS study: “life harbors surprising behaviors just below the surface, waiting to be uncovered.”

    Responding to Risk

    Some people may find this exhilarating. Others may react with concern, or even terror, to the notion of a self-replicating biotechnology. For the team of scientists, the goal is deeper understanding.

    “We are working to understand this property: replication. The world and technologies are rapidly changing. It’s important, for society as a whole, that we study and understand how this works,” says Bongard. These millimeter-sized living machines, entirely contained in a laboratory, easily extinguished, and vetted by federal, state and institutional ethics experts, “are not what keep me awake at night. What presents risk is the next pandemic; accelerating ecosystem damage from pollution; intensifying threats from climate change,” says UVM’s Bongard. “This is an ideal system in which to study self-replicating systems. We have a moral imperative to understand the conditions under which we can control it, direct it, douse it, exaggerate it.”

Bongard points to the COVID pandemic and the hunt for a vaccine. “The speed at which we can produce solutions matters deeply. If we can develop technologies, learning from Xenobots, where we can quickly tell the AI: ‘We need a biological tool that does X and Y and suppresses Z’—that could be very beneficial. Today, that takes an exceedingly long time.” The team aims to accelerate how quickly people can go from identifying a problem to generating solutions—“like deploying living machines to pull microplastics out of waterways or build new medicines,” Bongard says.

    “We need to create technological solutions that grow at the same rate as the challenges we face,” Bongard says.

    And the team sees promise in the research for advancements toward regenerative medicine. “If we knew how to tell collections of cells to do what we wanted them to do, ultimately, that’s regenerative medicine—that’s the solution to traumatic injury, birth defects, cancer, and aging,” says Levin. “All of these different problems are here because we don’t know how to predict and control what groups of cells are going to build. Xenobots are a new platform for teaching us.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Wyss Institute campus

    The Wyss (pronounced “Veese”) Institute (US) for Biologically Inspired Engineering uses Nature’s design principles to develop bioinspired materials and devices that will transform medicine and create a more sustainable world.

    Working as an alliance among Harvard’s Schools of Medicine, Engineering, and Arts & Sciences, and in partnership with Beth Israel Deaconess Medical Center, Boston Children’s Hospital, Brigham and Women’s Hospital, Dana Farber Cancer Institute, Massachusetts General Hospital, the University of Massachusetts Medical School, Spaulding Rehabilitation Hospital, Tufts University, and Boston University, the Institute crosses disciplinary and institutional barriers to engage in high-risk research that leads to transformative technological breakthroughs.

    Harvard University campus

    Harvard University is a private Ivy League research university in Cambridge, Massachusetts. Established in 1636 and named for its first benefactor, clergyman John Harvard, Harvard is the oldest institution of higher learning in the United States and among the most prestigious in the world.
The Massachusetts colonial legislature, the General Court, authorized Harvard’s founding. In its early years, Harvard College primarily trained Congregational and Unitarian clergy, although it has never been formally affiliated with any denomination. Its curriculum and student body were gradually secularized during the 18th century, and by the 19th century, Harvard had emerged as the central cultural establishment among the Boston elite. Following the American Civil War, President Charles William Eliot’s long tenure (1869–1909) transformed the college and affiliated professional schools into a modern research university; Harvard became a founding member of the Association of American Universities in 1900. James B. Conant led the university through the Great Depression and World War II; he liberalized admissions after the war.
The university is composed of ten academic faculties plus the Radcliffe Institute for Advanced Study. Arts and Sciences offers study in a wide range of academic disciplines for undergraduates and for graduates, while the other faculties offer only graduate degrees, mostly professional. Harvard has three main campuses: the 209-acre (85 ha) Cambridge campus centered on Harvard Yard; an adjoining campus immediately across the Charles River in the Allston neighborhood of Boston; and the medical campus in Boston’s Longwood Medical Area. Harvard’s endowment is valued at $41.9 billion, making it the largest of any academic institution. Endowment income helps enable the undergraduate college to admit students regardless of financial need and provide generous financial aid with no loans. The Harvard Library is the world’s largest academic library system, comprising 79 individual libraries holding about 20.4 million items.
    Harvard has more alumni, faculty, and researchers who have won Nobel Prizes (161) and Fields Medals (18) than any other university in the world and more alumni who have been members of the U.S. Congress, MacArthur Fellows, Rhodes Scholars (375), and Marshall Scholars (255) than any other university in the United States. Its alumni also include eight U.S. presidents and 188 living billionaires, the most of any university. Fourteen Turing Award laureates have been Harvard affiliates. Students and alumni have also won 10 Academy Awards, 48 Pulitzer Prizes, and 108 Olympic medals (46 gold), and they have founded many notable companies.
    Colonial
Harvard was established in 1636 by vote of the Great and General Court of the Massachusetts Bay Colony. In 1638, it acquired British North America’s first known printing press. In 1639, it was named Harvard College after deceased clergyman John Harvard, an alumnus of the University of Cambridge (UK) who had left the school £779 and his library of some 400 volumes. The charter creating the Harvard Corporation was granted in 1650.
    A 1643 publication gave the school’s purpose as “to advance learning and perpetuate it to posterity, dreading to leave an illiterate ministry to the churches when our present ministers shall lie in the dust.” It trained many Puritan ministers in its early years and offered a classic curriculum based on the English university model—many leaders in the colony had attended the University of Cambridge—but conformed to the tenets of Puritanism. Harvard has never affiliated with any particular denomination, though many of its earliest graduates went on to become clergymen in Congregational and Unitarian churches.
    Increase Mather served as president from 1681 to 1701. In 1708, John Leverett became the first president who was not also a clergyman, marking a turning of the college away from Puritanism and toward intellectual independence.
    19th century
    In the 19th century, Enlightenment ideas of reason and free will were widespread among Congregational ministers, putting those ministers and their congregations in tension with more traditionalist, Calvinist parties. When Hollis Professor of Divinity David Tappan died in 1803 and President Joseph Willard died a year later, a struggle broke out over their replacements. Henry Ware was elected to the Hollis chair in 1805, and the liberal Samuel Webber was appointed to the presidency two years later, signaling the shift from the dominance of traditional ideas at Harvard to the dominance of liberal, Arminian ideas.
    Charles William Eliot, president 1869–1909, eliminated the favored position of Christianity from the curriculum while opening it to student self-direction. Though Eliot was the crucial figure in the secularization of American higher education, he was motivated not by a desire to secularize education but by Transcendentalist Unitarian convictions influenced by William Ellery Channing and Ralph Waldo Emerson.
    20th century
    In the 20th century, Harvard’s reputation grew as a burgeoning endowment and prominent professors expanded the university’s scope. Rapid enrollment growth continued as new graduate schools were begun and the undergraduate college expanded. Radcliffe College, established in 1879 as the female counterpart of Harvard College, became one of the most prominent schools for women in the United States. Harvard became a founding member of the Association of American Universities in 1900.
    The student body in the early decades of the century was predominantly “old-stock, high-status Protestants, especially Episcopalians, Congregationalists, and Presbyterians.” A 1923 proposal by President A. Lawrence Lowell that Jews be limited to 15% of undergraduates was rejected, but Lowell did ban blacks from freshman dormitories.
    President James B. Conant reinvigorated creative scholarship to guarantee Harvard’s preeminence among research institutions. He saw higher education as a vehicle of opportunity for the talented rather than an entitlement for the wealthy, so Conant devised programs to identify, recruit, and support talented youth. In 1943, he asked the faculty to make a definitive statement about what general education ought to be, at the secondary as well as at the college level. The resulting Report, published in 1945, was one of the most influential manifestos in 20th century American education.

    Between 1945 and 1960, admissions were opened up to bring in a more diverse group of students. No longer drawing mostly from select New England prep schools, the undergraduate college became accessible to striving middle class students from public schools; many more Jews and Catholics were admitted, but few blacks, Hispanics, or Asians. Throughout the rest of the 20th century, Harvard became more diverse.
    Harvard’s graduate schools began admitting women in small numbers in the late 19th century. During World War II, students at Radcliffe College (which since 1879 had been paying Harvard professors to repeat their lectures for women) began attending Harvard classes alongside men. Women were first admitted to the medical school in 1945. Since 1971, Harvard has controlled essentially all aspects of undergraduate admission, instruction, and housing for Radcliffe women. In 1999, Radcliffe was formally merged into Harvard.
    21st century
    Drew Gilpin Faust, previously the dean of the Radcliffe Institute for Advanced Study, became Harvard’s first woman president on July 1, 2007. She was succeeded by Lawrence Bacow on July 1, 2018.
    Research
    Harvard is a founding member of the Association of American Universities and a preeminent research university with “very high” research activity (R1) and comprehensive doctoral programs across the arts, sciences, engineering, and medicine according to the Carnegie Classification.
    With the medical school consistently ranking first among medical schools for research, biomedical research is an area of particular strength for the university. More than 11,000 faculty and over 1,600 graduate students conduct research at the medical school as well as its 15 affiliated hospitals and research institutes. The medical school and its affiliates attracted $1.65 billion in competitive research grants from the National Institutes of Health in 2019, more than twice as much as any other university.

     
  • richardmitnick 9:26 am on October 30, 2021 Permalink | Reply
    Tags: "Taming The Data Deluge", , Artificial Intelligence, , , , Brain imaging neuroscience, , , , , , , , , , ,   

    From Kavli MIT Institute For Astrophysics and Space Research : “Taming The Data Deluge” 

    KavliFoundation

    http://www.kavlifoundation.org/institutes

    MIT Kavli Institute for Astrophysics and Space Research.

    From Kavli MIT Institute For Astrophysics and Space Research

    October 29, 2021

    Sandi Miller | Department of Physics

An oncoming tsunami of data threatens to overwhelm huge data-rich research projects in areas ranging from the tiny neutrino to exploding supernovae to the mysteries deep within the brain.

Left to right: Erik Katsavounidis of MIT’s Kavli Institute, Philip Harris of the Department of Physics, and Song Han of the Department of Electrical Engineering and Computer Science are part of a team from nine institutions that secured $15 million in National Science Foundation funding to set up the Accelerated AI Algorithms for Data-Driven Discovery (A3D3) Institute. Photo: Sandi Miller.

    When LIGO picks up a gravitational-wave signal from a distant collision of black holes and neutron stars, a clock starts ticking for capturing the earliest possible light that may accompany them: time is of the essence in this race.

Caltech/MIT Advanced LIGO.

    Data collected from electrical sensors monitoring brain activity are outpacing computing capacity. Information from the Large Hadron Collider (LHC)’s smashed particle beams will soon exceed 1 petabit per second.

To tackle this approaching data bottleneck in real time, a team of researchers from nine institutions led by The University of Washington (US), including The Massachusetts Institute of Technology (US), has received $15 million in funding to establish the Accelerated AI Algorithms for Data-Driven Discovery (A3D3) Institute. From MIT, the research team includes Philip Harris, assistant professor of physics, who will serve as the deputy director of the A3D3 Institute; Song Han, assistant professor of electrical engineering and computer science, who will serve as the A3D3’s co-PI; and Erik Katsavounidis, senior research scientist with the MIT Kavli Institute for Astrophysics and Space Research.

    Infused with this five-year Harnessing the Data Revolution Big Idea grant, and jointly funded by the Office of Advanced Cyberinfrastructure, A3D3 will focus on three data-rich fields: multi-messenger astrophysics, high-energy particle physics, and brain imaging neuroscience. By enriching AI algorithms with new processors, A3D3 seeks to speed up AI algorithms for solving fundamental problems in collider physics, neutrino physics, astronomy, gravitational-wave physics, computer science, and neuroscience.

    “I am very excited about the new Institute’s opportunities for research in nuclear and particle physics,” says Laboratory for Nuclear Science Director Boleslaw Wyslouch. “Modern particle detectors produce an enormous amount of data, and we are looking for extraordinarily rare signatures. The application of extremely fast processors to sift through these mountains of data will make a huge difference in what we will measure and discover.”

    The seeds of A3D3 were planted in 2017, when Harris and his colleagues at DOE’s Fermi National Accelerator Laboratory (US) and The European Organization for Nuclear Research [Organisation européenne pour la recherche nucléaire] [Europäische Organisation für Kernforschung](CH) [CERN] decided to integrate real-time AI algorithms to process the incredible rates of data at the LHC. Through email correspondence with Han, Harris’ team built a compiler, HLS4ML, that could run an AI algorithm in nanoseconds.

    “Before the development of HLS4ML, the fastest processing that we knew of was roughly a millisecond per AI inference, maybe a little faster,” says Harris. “We realized all the AI algorithms were designed to solve much slower problems, such as image and voice recognition. To get to nanosecond inference timescales, we recognized we could make smaller algorithms and rely on custom implementations with Field Programmable Gate Array (FPGA) processors in an approach that was largely different from what others were doing.”
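One general ingredient behind such speedups is shrinking the network itself, for instance by storing weights in low-precision fixed-point form — the arithmetic style FPGAs handle natively. The snippet below is a plain-Python illustration of that general idea only; it is not the HLS4ML compiler, and the bit-width choices are arbitrary.

```python
def quantize(x, total_bits=8, frac_bits=6):
    """Round x to a signed fixed-point value with `frac_bits` fractional
    bits, saturating at the representable range."""
    scale = 1 << frac_bits
    lo = -(1 << (total_bits - 1))
    hi = (1 << (total_bits - 1)) - 1
    q = max(lo, min(hi, round(x * scale)))
    return q / scale

def quantize_layer(weights, **kwargs):
    # Quantize every weight in a (toy) layer's weight matrix.
    return [[quantize(w, **kwargs) for w in row] for row in weights]

weights = [[0.731, -1.254], [0.002, 3.9]]
# Values snap to multiples of 1/64 and saturate near the +/-2 range limit.
print(quantize_layer(weights))
```

Replacing 32-bit floating-point multiplies with small fixed-point ones like these is one reason an inference that takes milliseconds in ordinary software can be pushed toward nanosecond timescales in custom hardware.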

    A few months later, Harris presented their research at a physics faculty meeting, where Katsavounidis became intrigued. Over coffee in Building 7, they discussed combining Harris’ FPGA with Katsavounidis’s use of machine learning for finding gravitational waves. FPGAs and other new processor types, such as graphics processing units (GPUs), accelerate AI algorithms to more quickly analyze huge amounts of data.

    “I had worked with the first FPGAs that were out in the market in the early ’90s and have witnessed first-hand how they revolutionized front-end electronics and data acquisition in big high-energy physics experiments I was working on back then,” recalls Katsavounidis. “The ability to have them crunch gravitational-wave data has been in the back of my mind since joining LIGO over 20 years ago.”

    Two years ago they received their first grant, and the University of Washington’s Shih-Chieh Hsu joined in. The team initiated the Fast Machine Lab, published about 40 papers on the subject, built the group to about 50 researchers, and “launched a whole industry of how to explore a region of AI that has not been explored in the past,” says Harris. “We basically started this without any funding. We’ve been getting small grants for various projects over the years. A3D3 represents our first large grant to support this effort.”

    “What makes A3D3 so special and suited to MIT is its exploration of a technical frontier, where AI is implemented not in high-level software, but rather in lower-level firmware, reconfiguring individual gates to address the scientific question at hand,” says Rob Simcoe, director of MIT Kavli Institute for Astrophysics and Space Research and the Francis Friedman Professor of Physics. “We are in an era where experiments generate torrents of data. The acceleration gained from tailoring reprogrammable, bespoke computers at the processor level can advance real-time analysis of these data to new levels of speed and sophistication.”

    The Huge Data from the Large Hadron Collider

    With data rates already exceeding 500 terabits per second, the LHC processes more data than any other scientific instrument on earth. Its future aggregate data rates will soon exceed 1 petabit per second, the biggest data rate in the world.

    “Through the use of AI, A3D3 aims to perform advanced analyses, such as anomaly detection, and particle reconstruction on all collisions happening 40 million times per second,” says Harris.

    The goal is to find within all of this data a way to identify the few collisions out of the 3.2 billion collisions per second that could reveal new forces, explain how Dark Matter is formed, and complete the picture of how fundamental forces interact with matter. Processing all of this information requires a customized computing system capable of interpreting the collider information within ultra-low latencies.
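The figures quoted in this section fit together with back-of-the-envelope arithmetic: the beams cross 40 million times per second, so 3.2 billion collisions per second implies dozens of overlapping collisions per crossing. The "pileup" value below is inferred from the article's own numbers, not an official figure.

```python
# Figures quoted in the article:
crossings_per_s = 40_000_000        # bunch crossings per second
collisions_per_s = 3_200_000_000    # collisions per second
aggregate_rate_bits = 1e15          # ~1 petabit per second (future rate)

pileup = collisions_per_s / crossings_per_s
print("collisions per crossing:", pileup)   # 80.0 (inferred)

bits_per_crossing = aggregate_rate_bits / crossings_per_s
print("megabits per crossing:", bits_per_crossing / 1e6)   # 25.0
```

Tens of megabits arriving every 25 nanoseconds is the scale at which the trigger decision has to be made, which is why the AI must run in the detector's front-end hardware rather than in offline software.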

“The challenge of running this on all of the hundreds of terabits per second in real time is daunting and requires a complete overhaul of how we design and implement AI algorithms,” says Harris. “With large increases in detector resolution leading to data rates that are even larger, the challenge of finding the one collision, among many, will become even more daunting.”

    The Brain and the Universe

Thanks to advances in techniques such as medical imaging and electrical recordings from implanted electrodes, neuroscience is also gathering larger amounts of data on how the brain’s neural networks process responses to stimuli and encode motor information. A3D3 plans to develop and implement high-throughput and low-latency AI algorithms to process, organize, and analyze massive neural datasets in real time, to probe brain function and enable new experiments and therapies.

    With Multi-Messenger Astrophysics (MMA), A3D3 aims to quickly identify astronomical events by efficiently processing data from gravitational waves, gamma-ray bursts, and neutrinos picked up by telescopes and detectors.

The A3D3 researchers also include a multidisciplinary group of 15 other researchers from the project lead, The University of Washington (US), along with The California Institute of Technology (US), Duke University (US), Purdue University (US), The University of California-San Diego (US), The University of Illinois-Urbana-Champaign (US), The University of Minnesota (US), and The University of Wisconsin-Madison (US). It will include neutrino research at The University of Wisconsin IceCube Neutrino Observatory (US) and The Fermi National Accelerator Laboratory DUNE/LBNF experiment (US), and visible astronomy at The Zwicky Transient Facility (US), and will organize deep-learning workshops and boot camps to train students and researchers on how to contribute to the framework and widen the use of fast AI strategies.

    “We have reached a point where detector network growth will be transformative, both in terms of event rates and in terms of astrophysical reach and ultimately, discoveries,” says Katsavounidis. “‘Fast’ and ‘efficient’ is the only way to fight the ‘faint’ and ‘fuzzy’ that is out there in the universe, and the path for getting the most out of our detectors. A3D3 on one hand is going to bring production-scale AI to gravitational-wave physics and multi-messenger astronomy; but on the other hand, we aspire to go beyond our immediate domains and become the go-to place across the country for applications of accelerated AI to data-driven disciplines.”

    Science paper:
    Hardware-accelerated Inference for Real-Time Gravitational-Wave Astronomy

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Mission Statement

    The mission of the MIT Kavli Institute (MKI) for Astrophysics and Space Research is to facilitate and carry out the research programs of faculty and research staff whose interests lie in the broadly defined area of astrophysics and space research. Specifically, the MKI will

    Provide an intellectual home for faculty, research staff, and students engaged in space- and ground-based astrophysics
    Develop and operate space- and ground-based instrumentation for astrophysics
    Engage in technology development
    Maintain an engineering and technical core capability for enabling and supporting innovative research
    Communicate to students, educators, and the public an understanding of and an appreciation for the goals, techniques and results of MKI’s research.

    The Kavli Foundation, based in Oxnard, California, is dedicated to the goals of advancing science for the benefit of humanity and promoting increased public understanding and support for scientists and their work.

    The Foundation’s mission is implemented through an international program of research institutes, professorships, and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics as well as prizes in the fields of astrophysics, nanoscience, and neuroscience.

    To date, The Kavli Foundation has made grants to establish Kavli Institutes on the campuses of 20 major universities. In addition to the Kavli Institutes, nine Kavli professorships have been established: three at Harvard University, two at University of California, Santa Barbara, one each at University of California, Los Angeles, University of California, Irvine, Columbia University, Cornell University, and California Institute of Technology.

    The Kavli Institutes:

    The Kavli Foundation’s 20 institutes focus on astrophysics, nanoscience, neuroscience and theoretical physics.

    Astrophysics

    The Kavli Institute for Particle Astrophysics and Cosmology at Stanford University
    The Kavli Institute for Cosmological Physics, University of Chicago
    The Kavli Institute for Astrophysics and Space Research at the Massachusetts Institute of Technology
    The Kavli Institute for Astronomy and Astrophysics at Peking University
    The Kavli Institute for Cosmology at the University of Cambridge
    The Kavli Institute for the Physics and Mathematics of the Universe at the University of Tokyo

    Nanoscience

    The Kavli Institute for Nanoscale Science at Cornell University
    The Kavli Institute of Nanoscience at Delft University of Technology in the Netherlands
    The Kavli Nanoscience Institute at the California Institute of Technology
    The Kavli Energy NanoSciences Institute at University of California, Berkeley and the Lawrence Berkeley National Laboratory
    The Kavli Institute for NanoScience Discovery at the University of Oxford

    Neuroscience

    The Kavli Institute for Brain Science at Columbia University
    The Kavli Institute for Brain & Mind at the University of California, San Diego
    The Kavli Institute for Neuroscience at Yale University
    The Kavli Institute for Systems Neuroscience at the Norwegian University of Science and Technology
    The Kavli Neuroscience Discovery Institute at Johns Hopkins University
    The Kavli Neural Systems Institute at The Rockefeller University
    The Kavli Institute for Fundamental Neuroscience at the University of California, San Francisco

    Theoretical physics

    Kavli Institute for Theoretical Physics at the University of California, Santa Barbara
    The Kavli Institute for Theoretical Physics China at the University of Chinese Academy of Sciences

     
  • richardmitnick 1:01 pm on October 1, 2021 Permalink | Reply
    Tags: "Deep-learning-based image analysis is now just a click away", Artificial Intelligence, deepImageJ, , University Carlos III of Madrid [Universidad Carlos III de Madrid](ES), Using neural networks in biomedical research   

    From Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH): “Deep-learning-based image analysis is now just a click away” 

    From Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH)

    01.10.21
    Cécilia Carron

Under an initiative by EPFL’s Center for Imaging, a team of engineers from EPFL and Universidad Carlos III de Madrid has developed a plugin that makes it easier to incorporate artificial intelligence into image analysis for life-science research. The plugin, called deepImageJ, is described in a paper appearing today in Nature Methods.

Over the past five years, image analysis has been shifting away from traditional mathematical and observational methods towards data-driven processing and artificial intelligence. This major development is making the detection and identification of valuable information in images easier, faster, and increasingly automated – in just about every research field. When it comes to life science, deep learning, a subfield of artificial intelligence, is showing increasing potential for bioimage analysis. Unfortunately, using deep-learning models often requires coding skills that few life scientists possess. To make the process easier, image analysis experts from EPFL and University Carlos III of Madrid [Universidad Carlos III de Madrid](ES), working in association with EPFL’s Center for Imaging, have developed deepImageJ – an open-source plugin that’s described in a paper published today in Nature Methods.

    Using neural networks in biomedical research

    Deep-learning models are a significant breakthrough for the many fields that rely on imaging, such as diagnostics and drug development. In bio-imaging, for example, deep learning can be used to process vast collections of images and detect lesions in organic tissue, identify synapses between nerve cells, and determine the structure of cell membranes and nuclei. It’s ideal for recognizing and classifying images, identifying specific elements, and predicting experimental results.

    This type of artificial intelligence involves training a computer to perform a task by drawing on large amounts of previously annotated data. It’s similar to CCTV systems that perform facial recognition, or to mobile-camera apps that enhance photos. Deep-learning models are based on sophisticated computational architectures called artificial neural networks that can be trained for specific research purposes, such as to recognize certain types of cells or tissue lesions or to improve image quality. The trained neural network is then saved as a computer model.
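The train-then-save workflow described above can be illustrated with the smallest possible "network": a single artificial neuron. The pixel-brightness task, the data, and the JSON "model file" below are all hypothetical stand-ins for the real bioimage models deepImageJ runs.

```python
import json
import random

def train(samples, epochs=200, lr=0.1, seed=0):
    # Perceptron learning rule: nudge the weight and bias whenever the
    # prediction disagrees with the human-supplied annotation.
    rng = random.Random(seed)
    w, b = rng.uniform(-1, 1), 0.0
    for _ in range(epochs):
        for x, label in samples:
            pred = 1 if w * x + b > 0 else 0
            w += lr * (label - pred) * x
            b += lr * (label - pred)
    return {"weight": w, "bias": b}

def predict(model, x):
    return 1 if model["weight"] * x + model["bias"] > 0 else 0

# Hypothetical "annotated data": pixel intensity -> 1 if bright, 0 if dark.
data = [(0.1, 0), (0.2, 0), (0.35, 0), (0.7, 1), (0.8, 1), (0.95, 1)]
model = train(data)

# The trained network is then saved as a model other tools can reload --
# here, serialized to JSON instead of a real model-zoo format.
saved = json.dumps(model)
restored = json.loads(saved)
print(predict(restored, 0.9))  # classifies an unseen bright pixel as 1
```

The saved file is the piece that deepImageJ-style tools hand to life scientists: the coding happens once, at training time, and afterwards anyone can load the model and run it.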

    Artificial intelligence, but without the code

    For biomedical imaging, a consortium of European researchers is developing a repository of these pre-trained deep-learning models, called the BioImage Model Zoo. “To train these models, researchers need specific resources and technical knowledge – especially in Python coding – that many life scientists do not have,” says Daniel Sage, the engineer at EPFL’s Center for Imaging who is overseeing the deepImageJ development. “But ideally, these models should be available to everyone.”

The deepImageJ plugin bridges the gap between artificial neural networks and the researchers who use them. Now, a life scientist can ask a computer engineer to design and train a machine-learning algorithm to perform a specific task, which the scientist can then easily run via a user interface – without ever seeing a single line of code. The plugin is open-source and free of charge, and will speed the dissemination of new developments in computer science and the publication of biomedical research. It is designed to be a collaborative resource that enables engineers, computer scientists, mathematicians and biologists to work together more efficiently. For example, a model developed recently by an EPFL Master’s student, working as part of a cross-disciplinary team, enables scientists to distinguish human cells from mouse cells in tissue sections.

    Researchers can train users, too

    Life scientists around the world have been hoping for such a system for several years, but – until EPFL’s Center for Imaging stepped in – no one had taken up the challenge of building one. The research group is headed by Daniel Sage and Michael Unser, the Center’s academic director, together with Arrate Muñoz-Barrutia, associate professor at UC3M [Universidad Carlos III de Madrid] (ES). Professor Muñoz-Barrutia led the operational development work along with one of her PhD students, Estibaliz Gómez-de-Mariscal, and Carlos García López de Haro, a bioengineering research assistant.

    So that as many researchers as possible can use the plugin, the group is also developing virtual seminars, training materials and online resources, with a view to better exploiting the full potential of artificial intelligence. These materials are being designed with both programmers and life scientists in mind, so that users can quickly come to grips with the new method. DeepImageJ will also be presented at ZIDAS – a week-long class on image and data analysis for life scientists in Switzerland.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    The Swiss Federal Institute of Technology in Lausanne [EPFL-École polytechnique fédérale de Lausanne] (CH) is a research institute and university in Lausanne, Switzerland, that specializes in natural sciences and engineering. It is one of the two Swiss Federal Institutes of Technology, and it has three main missions: education, research and technology transfer.

    The QS World University Rankings ranks EPFL(CH) 14th in the world across all fields in its 2020/2021 ranking, whereas the Times Higher Education World University Rankings ranks EPFL(CH) as the world’s 19th best school for Engineering and Technology in 2020.

    EPFL(CH) is located in the French-speaking part of Switzerland; the sister institution in the German-speaking part of Switzerland is the Swiss Federal Institute of Technology ETH Zürich [Eidgenössische Technische Hochschule Zürich)](CH) . Associated with several specialized research institutes, the two universities form the Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH) which is directly dependent on the Federal Department of Economic Affairs, Education and Research. In connection with research and teaching activities, EPFL(CH) operates a nuclear reactor CROCUS; a Tokamak Fusion reactor; a Blue Gene/Q Supercomputer; and P3 bio-hazard facilities.

    ETH Zürich, EPFL (Swiss Federal Institute of Technology in Lausanne) [École polytechnique fédérale de Lausanne](CH), and four associated research institutes form the Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH) with the aim of collaborating on scientific projects.

    The roots of modern-day EPFL(CH) can be traced back to the foundation of a private school under the name École spéciale de Lausanne in 1853 at the initiative of Lois Rivier, a graduate of the École Centrale Paris (FR), and John Gay, the then professor and rector of the Académie de Lausanne. At its inception it had only 11 students, and its offices were located at Rue du Valentin in Lausanne. In 1869, it became the technical department of the public Académie de Lausanne. When the Académie was reorganised and acquired the status of a university in 1890, the technical faculty changed its name to École d’ingénieurs de l’Université de Lausanne. In 1946, it was renamed the École polytechnique de l’Université de Lausanne (EPUL). In 1969, the EPUL was separated from the rest of the University of Lausanne and became a federal institute under its current name. EPFL(CH), like ETH Zürich(CH), is thus directly controlled by the Swiss federal government. In contrast, all other universities in Switzerland are controlled by their respective cantonal governments. Following the nomination of Patrick Aebischer as president in 2000, EPFL(CH) began expanding into the life sciences. It absorbed the Swiss Institute for Experimental Cancer Research (ISREC) in 2008.

    In 1946, there were 360 students. In 1969, EPFL(CH) had 1,400 students and 55 professors. In the past two decades the university has grown rapidly, and as of 2012 roughly 14,000 people study or work on campus, about 9,300 of them Bachelor’s, Master’s or PhD students. The environment at modern-day EPFL(CH) is highly international, with the school attracting students and researchers from all over the world. More than 125 countries are represented on the campus, and the university has two official languages, French and English.

    Organization

    EPFL is organised into eight schools, themselves formed of institutes that group research units (laboratories or chairs) around common themes:

    School of Basic Sciences (SB, Jan S. Hesthaven)

    Institute of Mathematics (MATH, Victor Panaretos)
    Institute of Chemical Sciences and Engineering (ISIC, Lyndon Emsley)
    Institute of Physics (IPHYS, Harald Brune)
    European Centre of Atomic and Molecular Computations (CECAM, Ignacio Pagonabarraga Mora)
    Bernoulli Center (CIB, Nicolas Monod)
    Biomedical Imaging Research Center (CIBM, Rolf Gruetter)
    Interdisciplinary Center for Electron Microscopy (CIME, Cécile Hébert)
    Max Planck-EPFL Centre for Molecular Nanosciences and Technology (CMNT, Thomas Rizzo)
    Swiss Plasma Center (SPC, Ambrogio Fasoli)
    Laboratory of Astrophysics (LASTRO, Jean-Paul Kneib)

    School of Engineering (STI, Ali Sayed)

    Institute of Electrical Engineering (IEL, Giovanni De Micheli)
    Institute of Mechanical Engineering (IGM, Thomas Gmür)
    Institute of Materials (IMX, Véronique Michaud)
    Institute of Microengineering (IMT, Olivier Martin)
    Institute of Bioengineering (IBI, Matthias Lütolf)

    School of Architecture, Civil and Environmental Engineering (ENAC, Claudia R. Binder)

    Institute of Architecture (IA, Luca Ortelli)
    Civil Engineering Institute (IIC, Eugen Brühwiler)
    Institute of Urban and Regional Sciences (INTER, Philippe Thalmann)
    Environmental Engineering Institute (IIE, David Andrew Barry)

    School of Computer and Communication Sciences (IC, James Larus)

    Algorithms & Theoretical Computer Science
    Artificial Intelligence & Machine Learning
    Computational Biology
    Computer Architecture & Integrated Systems
    Data Management & Information Retrieval
    Graphics & Vision
    Human-Computer Interaction
    Information & Communication Theory
    Networking
    Programming Languages & Formal Methods
    Security & Cryptography
    Signal & Image Processing
    Systems

    School of Life Sciences (SV, Gisou van der Goot)

    Bachelor-Master Teaching Section in Life Sciences and Technologies (SSV)
    Brain Mind Institute (BMI, Carmen Sandi)
    Institute of Bioengineering (IBI, Melody Swartz)
    Swiss Institute for Experimental Cancer Research (ISREC, Douglas Hanahan)
    Global Health Institute (GHI, Bruno Lemaitre)
    Ten Technology Platforms & Core Facilities (PTECH)
    Center for Phenogenomics (CPG)
    NCCR Synaptic Bases of Mental Diseases (NCCR-SYNAPSY)

    College of Management of Technology (CDM)

    Swiss Finance Institute at EPFL (CDM-SFI, Damir Filipovic)
    Section of Management of Technology and Entrepreneurship (CDM-PMTE, Daniel Kuhn)
    Institute of Technology and Public Policy (CDM-ITPP, Matthias Finger)
    Institute of Management of Technology and Entrepreneurship (CDM-MTEI, Ralf Seifert)
    Section of Financial Engineering (CDM-IF, Julien Hugonnier)

    College of Humanities (CDH, Thomas David)

    Human and social sciences teaching program (CDH-SHS, Thomas David)

    EPFL Middle East (EME, Dr. Franco Vigliotti)

    Section of Energy Management and Sustainability (MES, Prof. Maher Kayal)

    In addition to the eight schools, there are seven closely related institutions:

    Swiss Cancer Centre
    Center for Biomedical Imaging (CIBM)
    Centre for Advanced Modelling Science (CADMOS)
    École cantonale d’art de Lausanne (ECAL)
    Campus Biotech
    Wyss Center for Bio- and Neuro-engineering
    Swiss National Supercomputing Centre

     
  • richardmitnick 9:17 am on August 13, 2021 Permalink | Reply
    Tags: "From detecting earthquakes to preventing disease- 27 U of T research projects receive CFI funding", Aerospace Studies and Engineering, Artificial Intelligence, Baby Brain and Behaviour, , Cellular and Biomolecular Research, Chemical Engineering & Applied Chemistry, Civil and Mineral engineering, Dynamic Emotional Behavior, , Macromolecular bioelectronics encoded for self-assembly, Mechanical & Industrial Engineering, Medical Biophysics and Cancer studies, Multi-organ repair and regeneration after lung injury, Nutritional sciences, Pharmacology and Toxicology, Radiation Oncology, Stem cell models, , Sustainable Water Management and Resource Recovery, Targeted brain tumour therapies,   

    From University of Toronto (CA) : “From detecting earthquakes to preventing disease- 27 U of T research projects receive CFI funding” 

    From University of Toronto (CA)

    August 12, 2021
    Tyler Irving

    In a U of T Engineering lab, rock samples are subjected to the stress, fluid pressure and temperature conditions they experience in nature. Photo courtesy of Sebastian Goodfellow.

    Sebastian Goodfellow, a researcher at the University of Toronto (CA), listens for hidden signals that the ground is about to move beneath our feet.

    That includes so-called “induced” earthquakes that stem from human activities such as hydraulic fracturing (‘fracking’) and enhanced geothermal systems.

    “Think of the cracking sounds a cube of ice makes when you drop it in a cup of warm water, or the sound a wooden stick makes when you bend it until it breaks,” says Goodfellow, an assistant professor in the department of civil and mineral engineering in the Faculty of Applied Science & Engineering.

    “This occurs as a consequence of sudden localized changes in stress, and we study these microfracture sounds in the lab to understand how rock responds to changes in stress, fluid pressure and temperature.”

    While the frequency of these sonic clues is beyond the range of human hearing, they can be picked up with acoustic emission sensors. The challenge, however, is that scientists must listen continuously for hours in the absence of a method to predict when they will occur.

    “We’re talking about more than a terabyte of data per hour,” says Goodfellow. “We use a form of artificial intelligence called machine learning to extract patterns from these large waveform datasets.”
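    One classic way to flag candidate microfracture events in a continuous acoustic-emission record, before or alongside any machine learning, is the STA/LTA trigger: the ratio of a short-term to a long-term moving average of signal energy spikes when a burst arrives. The sketch below illustrates the idea on a synthetic waveform; it is a generic seismological technique, not a description of Goodfellow's actual pipeline, and every parameter in it is illustrative.

```python
# Minimal STA/LTA event trigger on a synthetic waveform (pure-Python sketch;
# real pipelines stream terabytes per hour through optimized or ML-based detectors).
import math, random

random.seed(1)
n = 2000
wave = [random.gauss(0, 0.1) for _ in range(n)]   # background sensor noise
for i in range(1000, 1040):                       # inject a short decaying "pop"
    wave[i] += 2.0 * math.exp(-(i - 1000) / 10.0)

def sta_lta(x, short=20, long_=200):
    """Ratio of short-term to long-term average signal energy at each sample."""
    energy = [v * v for v in x]
    ratios = []
    for i in range(long_, len(x)):
        sta = sum(energy[i - short:i]) / short
        lta = sum(energy[i - long_:i]) / long_
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

r = sta_lta(wave)
# Index of the largest ratio, mapped back to a sample index in the waveform.
trigger = max(range(len(r)), key=lambda i: r[i]) + 200
print(trigger)   # near sample 1000, where the event was injected
```

    Machine learning enters once triggers like this have carved the terabyte-scale stream into candidate events: the waveform snippets around each trigger become the inputs from which patterns are extracted.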

    Goodfellow’s induced seismicity study is one of 27 projects at U of T – and nine from U of T Engineering – to share more than $8.2 million in funding from the Canada Foundation for Innovation’s John R. Evans Leaders Fund (the full list of researchers and their projects appears below).

    Named for the late U of T President Emeritus John R. Evans, the fund equips university researchers with the technology and infrastructure they need to remain at the forefront of innovation in Canada and globally. It also helps Canadian universities attract top researchers from around the world.

    “From sustainable electric transportation and engineering of novel materials to non-invasive neuro-imaging and applications of AI in public health, U of T researchers across our three campuses are advancing some of the most important discoveries of our time,” said Leah Cowen, U of T’s associate vice-president, research.

    “Addressing such complex challenges often requires cutting-edge technology, equipment and facilities. The support provided by the Canada Foundation for Innovation will go a long way towards enabling our researchers’ important work.”

    Goodfellow’s team will use the funding to buy a triaxial geophysical imaging cell fitted with acoustic emissions sensors as well as hardware for high-frequency acquisition of acoustic emissions data. The equipment will enable them to carry out controlled experiments in the lab, test better algorithms and develop new techniques to turn the data into insights – all to better understand processes that lead to induced earthquakes.

    By learning more about how these tiny cracks and pops are related to larger seismic events such as earthquakes, the team hopes to help professionals in a wide range of sectors make better decisions. That includes industries that employ underground injection technologies – geothermal power, hydraulic fracturing and carbon sequestration, among others – along with the bodies charged with regulating them.

    “Up until now, our poor understanding of the causal links between fluid injection and large induced earthquakes has limited the economic development of these industries,” says Goodfellow.

    “Our research will help mitigate the human and environmental impacts, leading to new economic growth opportunities for Canada.”

    ______________________________________________________________________________________________________________

    Here is the full list of 27 U of T researchers who received support for their projects:

    Cristina Amon, department of mechanical & industrial engineering in the Faculty of Applied Science & Engineering: Enabling sustainable e-mobility through intelligent thermal management systems for EVs and charging infrastructure.

    Jacqueline Beaudry, department of nutritional sciences in the Temerty Faculty of Medicine and Lunenfeld-Tannenbaum Research Institute at Sinai Health: Role of pancreatic and gut hormones in energy metabolism.

    Swetaprovo Chaudhuri, U of T Institute for Aerospace Studies in the Faculty of Applied Science & Engineering: Kinetics-transport interaction towards deposition of carbon particulates in meso-channel supercritical fuel flows.

    Mark Currie, department of cell and systems biology in Faculty of Arts & Science: Structural Biology Laboratory.

    Marcus Dillon, department of biology at U of T Mississauga: The evolutionary genomics of infectious phytopathogen emergence.

    Landon Edgar, department of pharmacology and toxicology in the Temerty Faculty of Medicine: Technologies to interrogate and control carbohydrate-mediated immunity.

    Gregory Fairn, department of biochemistry in the Temerty Faculty of Medicine and St. Michael’s Hospital: Advanced live cell imaging and isothermal calorimetry for the study of immune cell dysfunction and inflammation.

    Kevin Golovin, department of mechanical and industrial engineering in the Faculty of Applied Science & Engineering: Durable Low Ice Adhesion Coatings Laboratory.

    Sebastian Goodfellow, department of civil and mineral engineering in the Faculty of Applied Science & Engineering: A study of induced seismicity through novel triaxial experiments and data analysis methodologies.

    Giovanni Grasselli, department of civil and mineral engineering in the Faculty of Applied Science & Engineering: Towards the sustainable development of energy resources – fundamentals and implications of hydraulic fracturing technology.

    Kristin Hope, department of medical biophysics in the Temerty Faculty of Medicine and Princess Margaret Cancer Centre, University Health Network: Characterizing and unlocking the therapeutic potential of stem cells and the leukemic microenvironment.

    Elizabeth Johnson, department of psychology at U of T Mississauga: Baby Brain and Behaviour Lab (BaBBL) – electrophysiological measures of infant speech and language development.

    Omar Khan, Institute of Biomedical Engineering in the Faculty of Applied Science & Engineering and department of immunology in the Temerty Faculty of Medicine: Combination ribonucleic acid treatment technology lab.

    Marianne Koritzinsky, department of radiation oncology in the Temerty Faculty of Medicine and Princess Margaret Cancer Centre, University Health Network: Targeted therapeutics to enhance radiotherapy efficacy and safety in the era of image-guided conformal treatment.

    Christopher Lawson, department of chemical engineering & applied chemistry in the Faculty of Applied Science & Engineering: The Microbiome Engineering Laboratory for Resource Recovery.

    Fa-Hsuan Lin, department of medical biophysics in the Temerty Faculty of Medicine and Sunnybrook Research Institute: Integrated non-invasive human neuroimaging and neuromodulation platform.

    Vasanti Malik, department of nutritional sciences in the Temerty Faculty of Medicine: Child obesity and metabolic health in pregnancy – a novel approach to chronic disease prevention and planetary health.

    Rafael Montenegro-Burke, Donnelly Centre for Cellular and Biomolecular Research and department of molecular genetics in the Temerty Faculty of Medicine: Mapping the dark metabolome using click chemistry tools.

    Robert Rozeske, department of psychology at U of T Scarborough: Neuronal mechanisms of dynamic emotional behavior.

    Karun Singh, department of laboratory medicine and pathobiology in the Temerty Faculty of Medicine and Toronto Western Hospital, University Health Network: Stem cell models to investigate brain function in development and disease.

    Corliss Kin I Sio, department of Earth sciences in the Faculty of Arts & Science: Constraining source compositions and timescales of mass transport using femtosecond LA-MC-ICPMS.

    Helen Tran, department of chemistry in the Faculty of Arts & Science: Macromolecular bioelectronics encoded for self-assembly, degradability and electron transport.

    Andrea Tricco, Dalla Lana School of Public Health: Expediting knowledge synthesis using artificial intelligence – CAL®-Synthesi.SR Dashboard.

    Jay Werber, department of chemical engineering and applied chemistry in the Faculty of Applied Science & Engineering: The Advanced Membranes (AM) Laboratory for Sustainable Water Management and Resource Recovery.

    Haibo Zhang, department of physiology in the Temerty Faculty of Medicine and St. Michael’s Hospital: Real time high-resolution imaging and cell sorting for studying multi-organ repair and regeneration after lung injury.

    Gang Zheng, department of medical biophysics in the Temerty Faculty of Medicine and Princess Margaret Cancer Centre, University Health Network: Preclinical magnetic resonance imaging for targeted brain tumour therapies.

    Shurui Zhou, department of electrical and computer engineering in the Faculty of Applied Science & Engineering: Improving collaboration efficiency for fork-based software development.

    See the full article here.




    The University of Toronto (CA) is a public research university in Toronto, Ontario, Canada, located on the grounds that surround Queen’s Park. It was founded by royal charter in 1827 as King’s College, the oldest university in the province of Ontario.

    Originally controlled by the Church of England, the university assumed its present name in 1850 upon becoming a secular institution.

    As a collegiate university, it comprises eleven colleges each with substantial autonomy on financial and institutional affairs and significant differences in character and history. The university also operates two satellite campuses located in Scarborough and Mississauga.

    University of Toronto has evolved into Canada’s leading institution of learning, discovery and knowledge creation. We are proud to be one of the world’s top research-intensive universities, driven to invent and innovate.

    Our students have the opportunity to learn from and work with preeminent thought leaders through our multidisciplinary network of teaching and research faculty, alumni and partners.

    The ideas, innovations and actions of more than 560,000 graduates continue to have a positive impact on the world.

    Academically, the University of Toronto is noted for movements and curricula in literary criticism and communication theory, known collectively as the Toronto School.

    The university was the birthplace of insulin and stem cell research, and was the site of the first electron microscope in North America; the identification of the first black hole, Cygnus X-1; the development of multi-touch technology; and the development of the theory of NP-completeness.

    The university was one of several universities involved in early research of deep learning. It receives the most annual scientific research funding of any Canadian university and is one of two members of the Association of American Universities (US) outside the United States, the other being McGill(CA).

    The Varsity Blues are the athletic teams that represent the university in intercollegiate league matches, with ties to gridiron football, rowing and ice hockey. The earliest recorded instance of gridiron football occurred at University of Toronto’s University College in November 1861.

    The university’s Hart House is an early example of the North American student centre, simultaneously serving cultural, intellectual, and recreational interests within its large Gothic-revival complex.

    The University of Toronto has educated three Governors General of Canada, four Prime Ministers of Canada, three foreign leaders, and fourteen Justices of the Supreme Court. As of March 2019, ten Nobel laureates, five Turing Award winners, 94 Rhodes Scholars, and one Fields Medalist have been affiliated with the university.

    Early history

    The founding of a colonial college had long been the desire of John Graves Simcoe, the first Lieutenant-Governor of Upper Canada and founder of York, the colonial capital. As a University of Oxford (UK)-educated military commander who had fought in the American Revolutionary War, Simcoe believed a college was needed to counter the spread of republicanism from the United States. The Upper Canada Executive Committee recommended in 1798 that a college be established in York.

    On March 15, 1827, a royal charter was formally issued by King George IV, proclaiming “from this time one College, with the style and privileges of a University … for the education of youth in the principles of the Christian Religion, and for their instruction in the various branches of Science and Literature … to continue for ever, to be called King’s College.” The granting of the charter was largely the result of intense lobbying by John Strachan, the influential Anglican Bishop of Toronto who took office as the college’s first president. The original three-storey Greek Revival school building was built on the present site of Queen’s Park.

    Under Strachan’s stewardship, King’s College was a religious institution closely aligned with the Church of England and the British colonial elite, known as the Family Compact. Reformist politicians opposed the clergy’s control over colonial institutions and fought to have the college secularized. In 1849, after a lengthy and heated debate, the newly elected responsible government of the Province of Canada voted to rename King’s College as the University of Toronto and severed the school’s ties with the church. Having anticipated this decision, the enraged Strachan had resigned a year earlier to open Trinity College as a private Anglican seminary. University College was created as the nondenominational teaching branch of the University of Toronto. During the American Civil War, the threat of a Union blockade on British North America prompted the creation of the University Rifle Corps, which saw battle in resisting the Fenian raids on the Niagara border in 1866. The Corps was part of the Reserve Militia led by Professor Henry Croft.

    Established in 1878, the School of Practical Science was the precursor to the Faculty of Applied Science and Engineering, which has been nicknamed Skule since its earliest days. While the Faculty of Medicine opened in 1843, medical teaching was conducted by proprietary schools from 1853 until 1887, when the faculty absorbed the Toronto School of Medicine. Meanwhile, the university continued to set examinations and confer medical degrees. The university opened the Faculty of Law in 1887, followed by the Faculty of Dentistry in 1888, when the Royal College of Dental Surgeons became an affiliate. Women were first admitted to the university in 1884.

    A devastating fire in 1890 gutted the interior of University College and destroyed 33,000 volumes from the library, but the university restored the building and replenished its library within two years. Over the next two decades, a collegiate system took shape as the university arranged federation with several ecclesiastical colleges, including Strachan’s Trinity College in 1904. The university operated the Royal Conservatory of Music from 1896 to 1991 and the Royal Ontario Museum from 1912 to 1968; both still retain close ties with the university as independent institutions. The University of Toronto Press was founded in 1901 as Canada’s first academic publishing house. The Faculty of Forestry, founded in 1907 with Bernhard Fernow as dean, was Canada’s first university faculty devoted to forest science. In 1910, the Faculty of Education opened its laboratory school, the University of Toronto Schools.

    World wars and post-war years

    The First and Second World Wars curtailed some university activities as undergraduate and graduate men eagerly enlisted. Intercollegiate athletic competitions and the Hart House Debates were suspended although exhibition and interfaculty games were still held. The David Dunlap Observatory in Richmond Hill opened in 1935 followed by the University of Toronto Institute for Aerospace Studies in 1949. The university opened satellite campuses in Scarborough in 1964 and in Mississauga in 1967. The university’s former affiliated schools at the Ontario Agricultural College and Glendon Hall became fully independent of the University of Toronto and became part of University of Guelph (CA) in 1964 and York University (CA) in 1965 respectively. Beginning in the 1980s reductions in government funding prompted more rigorous fundraising efforts.

    Since 2000

    In 2000 Kin-Yip Chun was reinstated as a professor of the university after he launched an unsuccessful lawsuit against the university alleging racial discrimination. In 2017 a human rights application was filed against the University by one of its students for allegedly delaying the investigation of sexual assault and being dismissive of their concerns. In 2018 the university cleared one of its professors of allegations of discrimination and antisemitism in an internal investigation after a complaint was filed by one of its students.

    The University of Toronto was the first Canadian university to amass a financial endowment greater than $1 billion, in 2007. On September 24, 2020, the university announced a $250 million gift to the Faculty of Medicine from businessman and philanthropist James C. Temerty, the largest single philanthropic donation in Canadian history. This broke the previous record for the school, set in 2019 when Gerry Schwartz and Heather Reisman jointly donated $100 million for the creation of a 750,000-square-foot innovation and artificial intelligence centre.

    Research

    Since 1926 the University of Toronto has been a member of the Association of American Universities (US), a consortium of the leading North American research universities. The university manages by far the largest annual research budget of any university in Canada, with sponsored direct-cost expenditures of $878 million in 2010. In 2018 the University of Toronto was named the top research university in Canada by Research Infosource, with a sponsored research income (external sources of funding) of $1,147.584 million in 2017. In the same year the university’s faculty averaged a sponsored research income of $428,200, while graduate students averaged a sponsored research income of $63,700. The federal government was the largest source of funding, with grants from the Canadian Institutes of Health Research; the Natural Sciences and Engineering Research Council; and the Social Sciences and Humanities Research Council amounting to about one-third of the research budget. About eight percent of research funding came from corporations, mostly in the healthcare industry.

    The first practical electron microscope was built by the physics department in 1938. During World War II the university developed the G-suit, a life-saving garment worn by Allied fighter plane pilots that was later adopted for use by astronauts. Development of the infrared chemiluminescence technique improved analyses of energy behaviours in chemical reactions. In 1963 the asteroid 2104 Toronto was discovered at the David Dunlap Observatory (CA) in Richmond Hill and is named after the university. In 1972 studies on Cygnus X-1 led to the publication of the first observational evidence proving the existence of black holes. Toronto astronomers have also discovered the Uranian moons of Caliban and Sycorax; the dwarf galaxies of Andromeda I, II and III; and the supernova SN 1987A. A pioneer in computing technology, the university designed and built UTEC, one of the world’s first operational computers, and later purchased Ferut, the second commercial computer after UNIVAC I. Multi-touch technology was developed at Toronto, with applications ranging from handheld devices to collaboration walls. The AeroVelo Atlas, which won the Igor I. Sikorsky Human Powered Helicopter Competition in 2013, was developed by the university’s team of students and graduates and was tested in Vaughan.

    The discovery of insulin at the University of Toronto in 1921 is considered among the most significant events in the history of medicine. The stem cell was discovered at the university in 1963, forming the basis for bone marrow transplantation and all subsequent research on adult and embryonic stem cells. This was the first of many findings at Toronto relating to stem cells, including the identification of pancreatic and retinal stem cells. The cancer stem cell was first identified in 1997 by Toronto researchers, who have since found stem cell associations in leukemia; brain tumors; and colorectal cancer. Medical inventions developed at Toronto include the glycaemic index; the infant cereal Pablum; the use of protective hypothermia in open heart surgery; and the first artificial cardiac pacemaker. The first successful single-lung transplant was performed at Toronto in 1981, followed by the first nerve transplant in 1988 and the first double-lung transplant in 1989. Researchers identified the maturation promoting factor that regulates cell division and discovered the T-cell receptor, which triggers responses of the immune system. The university is credited with isolating the genes that cause Fanconi anemia; cystic fibrosis; and early-onset Alzheimer’s disease, among numerous other diseases. Between 1914 and 1972 the university operated the Connaught Medical Research Laboratories, now part of the pharmaceutical corporation Sanofi-Aventis. Among the research conducted at the laboratory was the development of gel electrophoresis.

    The University of Toronto is the primary research presence that supports one of the world’s largest concentrations of biotechnology firms. More than 5,000 principal investigators reside within 2 kilometres (1.2 mi) of the university grounds in Toronto’s Discovery District, conducting $1 billion of medical research annually. MaRS Discovery District is a research park that serves commercial enterprises and the university’s technology transfer ventures. In 2008, the university disclosed 159 inventions and had 114 active start-up companies. Its SciNet Consortium operates the most powerful supercomputer in Canada.

     
  • richardmitnick 8:25 pm on July 18, 2021 Permalink | Reply
    Tags: "Curiosity and technology drive quest to reveal fundamental secrets of the universe", A very specific particle called a J/psi might provide a clearer picture of what’s going on inside a proton’s gluonic field., Argonne-driven technology is part of a broad initiative to answer fundamental questions about the birth of matter in the universe and the building blocks that hold it all together., Artificial Intelligence, Computational Science, Developing and fabricating detectors that search for signatures from the early universe or enhance our understanding of the most fundamental of particles., Electron-Ion Collider (EIC) at DOE's Brookhaven National Laboratory (US) to be built inside the tunnel that currently houses the Relativistic Heavy Ion Collider [RHIC]., Exploring the hearts of protons and neutrons, Neutrinoless double beta decay can only happen if the neutrino is its own anti-particle., SLAC National Accelerator Laboratory (US)

    From DOE’s Argonne National Laboratory (US) : “Curiosity and technology drive quest to reveal fundamental secrets of the universe” 

    Argonne Lab

    From DOE’s Argonne National Laboratory (US)

    July 15, 2021
    John Spizzirri

    Argonne-driven technology is part of a broad initiative to answer fundamental questions about the birth of matter in the universe and the building blocks that hold it all together.

    Imagine the first of our species to lie beneath the glow of an evening sky. An enormous sense of awe, perhaps a little fear, fills them as they wonder at those seemingly infinite points of light and what they might mean. As humans, we evolved the capacity to ask big insightful questions about the world around us and worlds beyond us. We dare, even, to question our own origins.

    “The place of humans in the universe is important to understand,” said physicist and computational scientist Salman Habib. ​“Once you realize that there are billions of galaxies we can detect, each with many billions of stars, you understand the insignificance of being human in some sense. But at the same time, you appreciate being human a lot more.”

    The South Pole Telescope is part of a collaboration between Argonne and a number of national labs and universities to measure the CMB, considered the oldest light in the universe.

    The high altitude and extremely dry conditions of the South Pole keep water vapor from absorbing select light wavelengths.

    With no less a sense of wonder than most of us, Habib and colleagues at the U.S. Department of Energy’s (DOE) Argonne National Laboratory are actively researching these questions through an initiative that investigates the fundamental components of both particle physics and astrophysics.

    The breadth of Argonne’s research in these areas is mind-boggling. It takes us back to the very edge of time itself, to some infinitesimally small portion of a second after the Big Bang when random fluctuations in temperature and density arose, eventually forming the breeding grounds of galaxies and planets.

    It explores the heart of protons and neutrons to understand the most fundamental constructs of the visible universe, particles and energy once free in the early post-Big Bang universe, but later confined forever within a basic atomic structure as that universe began to cool.

    And it addresses slightly newer, more controversial questions about the nature of Dark Matter and Dark Energy, both of which play a dominant role in the makeup and dynamics of the universe but are little understood.
    _____________________________________________________________________________________
    Dark Energy Survey

    Dark Energy Camera [DECam] built at DOE’s Fermi National Accelerator Laboratory (US).

    NOIRLab (US) Cerro Tololo Inter-American Observatory (CL) Víctor M. Blanco 4-meter Telescope, which houses the Dark Energy Camera (DECam), at Cerro Tololo, Chile, at an altitude of 7,200 feet.

    NSF NOIRLab NOAO (US) Cerro Tololo Inter-American Observatory (CL), approximately 80 km to the east of La Serena, Chile, at an altitude of 2,200 meters.

    Timeline of the Inflationary Universe WMAP

    The Dark Energy Survey (DES) is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. DES began searching the Southern skies on August 31, 2013.

    According to Einstein’s theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up. To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called dark energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    DES is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the DES collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time is used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.
    _____________________________________________________________________________________

    “And this world-class research we’re doing could not happen without advances in technology,” said Argonne Associate Laboratory Director Kawtar Hafidi, who helped define and merge the different aspects of the initiative.

    “We are developing and fabricating detectors that search for signatures from the early universe or enhance our understanding of the most fundamental of particles,” she added. ​“And because all of these detectors create big data that have to be analyzed, we are developing, among other things, artificial intelligence techniques to do that as well.”

    Decoding messages from the universe

    Fleshing out a theory of the universe on cosmic or subatomic scales requires a combination of observations, experiments, theories, simulations and analyses, which in turn requires access to the world’s most sophisticated telescopes, particle colliders, detectors and supercomputers.

    Argonne is uniquely suited to this mission, equipped as it is with many of those tools, the ability to manufacture others and collaborative privileges with other federal laboratories and leading research institutions to access other capabilities and expertise.

    As lead of the initiative’s cosmology component, Habib uses many of these tools in his quest to understand the origins of the universe and what makes it tick.

    And what better way to do that than to observe it, he said.

    “If you look at the universe as a laboratory, then obviously we should study it and try to figure out what it is telling us about foundational science,” noted Habib. ​“So, one part of what we are trying to do is build ever more sensitive probes to decipher what the universe is trying to tell us.”

    To date, Argonne is involved in several significant sky surveys, which use an array of observational platforms, like telescopes and satellites, to map different corners of the universe and collect information that furthers or rejects a specific theory.

    For example, the South Pole Telescope survey, a collaboration between Argonne and a number of national labs and universities, is measuring the cosmic microwave background (CMB) [above], considered the oldest light in the universe. Variations in CMB properties, such as temperature, signal the original fluctuations in density that ultimately led to all the visible structure in the universe.

    Additionally, the Dark Energy Spectroscopic Instrument and the forthcoming Vera C. Rubin Observatory are specially outfitted, ground-based telescopes designed to shed light on dark energy and dark matter, as well as the formation of luminous structure in the universe.

    DOE’s Lawrence Berkeley National Laboratory (US) DESI spectroscopic instrument on the Mayall 4-meter telescope at Kitt Peak National Observatory, in the Quinlan Mountains in the Arizona-Sonoran Desert on the Tohono O’odham Nation, 88 kilometers (55 mi) west-southwest of Tucson, Arizona, at an altitude of 2,096 m (6,877 ft).

    NSF NOIRLab NOAO (US) Mayall 4-meter telescope at Kitt Peak National Observatory (US).


    NSF NOIRLab NOAO (US) Vera C. Rubin Observatory [LSST] Telescope, currently under construction on the El Peñón peak at Cerro Pachón, Chile, a 2,682-meter-high mountain in Coquimbo Region in northern Chile, alongside the existing Gemini South Telescope and the Southern Astrophysical Research Telescope.

    Darker matters

    All the data sets derived from these observations are connected to the second component of Argonne’s cosmology push, which revolves around theory and modeling. Cosmologists combine observations, measurements and the prevailing laws of physics to form theories that resolve some of the mysteries of the universe.

    But the universe is complex, and it has an annoying tendency to throw a curve ball just when we thought we had a theory cinched. Discoveries within the past 100 years have revealed that the universe is both expanding and accelerating its expansion — realizations that came as separate but equal surprises.

    Saul Perlmutter (center) [The Supernova Cosmology Project] shared the 2006 Shaw Prize in Astronomy, the 2011 Nobel Prize in Physics, and the 2015 Breakthrough Prize in Fundamental Physics with Brian P. Schmidt (right) and Adam Riess (left) [The High-z Supernova Search Team] for providing evidence that the expansion of the universe is accelerating.

    “To say that we understand the universe would be incorrect. To say that we sort of understand it is fine,” exclaimed Habib. ​“We have a theory that describes what the universe is doing, but each time the universe surprises us, we have to add a new ingredient to that theory.”

    Modeling helps scientists get a clearer picture of whether and how those new ingredients will fit a theory. They make predictions for observations that have not yet been made, telling observers what new measurements to take.

    Habib’s group is applying this same sort of process to gain an ever-so-tentative grasp on the nature of dark energy and dark matter. While scientists can tell us that both exist, and that they comprise about 68% and 26% of the universe, respectively, beyond that not much else is known.
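    A quick bit of arithmetic makes the point: subtracting the two dark components from the whole leaves only a few percent for everything we can actually see. The figures below are the article's rounded values, not precise survey measurements:

    ```python
    # Cosmic energy budget, using the article's rounded figures.
    dark_energy = 0.68
    dark_matter = 0.26

    # Whatever remains is ordinary (baryonic) matter plus trace components.
    ordinary_matter = 1.0 - dark_energy - dark_matter
    print(f"Ordinary matter: ~{ordinary_matter:.0%} of the universe")
    ```

    Everything astronomers have ever imaged directly, every star, galaxy, and gas cloud, fits inside that leftover sliver.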

    ______________________________________________________________________________________________________________

    Dark Matter Background
    Fritz Zwicky discovered dark matter in the 1930s while observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM who was denied the Nobel Prize, did most of the work on dark matter some 30 years later.

    Fritz Zwicky, from http://palomarskies.blogspot.com.


    Coma cluster via NASA/ESA Hubble.


    In modern times, it was astronomer Fritz Zwicky who, in the 1930s, made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate that it had a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.
    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that stars in the outskirts of galaxies orbit just as fast as stars near the center, whereas, if the visible matter were all there is, the outer stars should move more slowly, just as the outer planets of the solar system orbit more slowly than the inner ones. The flat rotation speeds make sense only if each visible galaxy sits inside some much larger unseen structure, like the label at the center of a vinyl LP, with the surrounding mass keeping the orbital speed consistent from center to edge.
    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
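    The rotation-curve argument can be sketched numerically. This is a toy illustration, not any real galaxy's data: the central mass and the radii below are assumed round numbers, chosen only to show the Keplerian fall-off that Rubin's observations contradicted.

    ```python
    import math

    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    M_LUMINOUS = 2e41  # illustrative central (visible) mass in kg, ~1e11 suns

    def keplerian_speed(r_m):
        """Orbital speed if all mass sits inside radius r: v = sqrt(G*M/r)."""
        return math.sqrt(G * M_LUMINOUS / r_m)

    KPC = 3.086e19  # one kiloparsec in meters
    for r_kpc in (5, 10, 20, 40):
        v_kms = keplerian_speed(r_kpc * KPC) / 1000
        print(f"r = {r_kpc:2d} kpc: Keplerian v = {v_kms:5.0f} km/s")

    # Keplerian speeds fall as 1/sqrt(r); observed galactic rotation curves
    # instead stay roughly flat out to large radii -- the dark matter puzzle.
    ```

    Quadrupling the radius should halve the orbital speed; instead, measured speeds barely change, implying unseen mass far beyond the visible disk.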

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science).


    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL).


    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970

    Dark Matter Research

    Inside the Axion Dark Matter eXperiment at the University of Washington (US). Credit: Mark Stone, University of Washington.
    _____________________________________________________________________________________

    Observations of cosmological structure — the distribution of galaxies and even of their shapes — provide clues about the nature of dark matter, which in turn feeds simple dark matter models and subsequent predictions. If observations, models and predictions aren’t in agreement, that tells scientists that there may be some missing ingredient in their description of dark matter.

    But there are also experiments that are looking for direct evidence of dark matter particles, which require highly sensitive detectors [above]. Argonne has initiated development of specialized superconducting detector technology for the detection of low-mass dark matter particles.

    This technology requires the ability to control properties of layered materials and adjust the temperature where the material transitions from finite to zero resistance, when it becomes a superconductor. And unlike other applications where scientists would like this temperature to be as high as possible — room temperature, for example — here, the transition needs to be very close to absolute zero.

    Habib refers to these dark matter detectors as traps, like those used for hunting — which, in essence, is what cosmologists are doing. Because it’s possible that dark matter doesn’t come in just one species, they need different types of traps.

    “It’s almost like you’re in a jungle in search of a certain animal, but you don’t quite know what it is — it could be a bird, a snake, a tiger — so you build different kinds of traps,” he said.

    Lab researchers are working on technologies to capture these elusive species through new classes of dark matter searches. Collaborating with other institutions, they are now designing and building a first set of pilot projects aimed at looking for dark matter candidates with low mass.

    Tuning in to the early universe

    Amy Bender is working on a different kind of detector — well, a lot of detectors — which are at the heart of a survey of the cosmic microwave background (CMB).

    “The CMB is radiation that has been around the universe for 13 billion years, and we’re directly measuring that,” said Bender, an assistant physicist at Argonne.

    The Argonne-developed detectors — all 16,000 of them — capture photons, or light particles, from that primordial sky through the aforementioned South Pole Telescope, to help answer questions about the early universe, fundamental physics and the formation of cosmic structures.

    Now, the CMB experimental effort is moving into a new phase, CMB-Stage 4 (CMB-S4).

    CMB-S4 is the next-generation ground-based cosmic microwave background experiment. With 21 telescopes at the South Pole and in the Chilean Atacama Desert surveying the sky with 550,000 cryogenically cooled superconducting detectors for 7 years, CMB-S4 will deliver transformative discoveries in fundamental physics, cosmology, astrophysics, and astronomy. CMB-S4 is supported by the Department of Energy Office of Science and the National Science Foundation.

    This larger project tackles even more complex topics like Inflationary Theory, which suggests that the universe expanded faster than the speed of light for a fraction of a second, shortly after the Big Bang.
    _____________________________________________________________________________________
    Inflation

    Alan Guth, from Highland Park High School and M.I.T., who first proposed cosmic inflation. Credit: HPHS Owls.

    Lambda Cold Dark Matter accelerated expansion of the universe; the cosmic inflation suggests the existence of parallel universes. Credit: Alex Mittelmann, Coldcreation, via scinotions.com.


    Alan Guth’s notes:

    Alan Guth’s original notes on inflation


    _____________________________________________________________________________________

    A section of a detector array with architecture suitable for future CMB experiments, such as the upcoming CMB-S4 project. Fabricated at Argonne’s Center for Nanoscale Materials, 16,000 of these detectors currently drive measurements collected from the South Pole Telescope. (Image by Argonne National Laboratory.)

    While the science is amazing, the technology to get us there is just as fascinating.

    Technically called transition edge sensing (TES) bolometers, the detectors on the telescope are made from superconducting materials fabricated at Argonne’s Center for Nanoscale Materials, a DOE Office of Science User Facility.

    Each of the 16,000 detectors acts as a combination of very sensitive thermometer and camera. As incoming radiation is absorbed on the surface of each detector, measurements are made by supercooling them to a fraction of a degree above absolute zero. (That’s over three times as cold as Antarctica’s lowest recorded temperature.)
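    That parenthetical comparison is easy to check with a unit conversion. The 0.25 K operating temperature below is an assumed illustrative value, since the article says only "a fraction of a degree above absolute zero":

    ```python
    # Antarctica's lowest recorded surface temperature (Vostok Station, 1983).
    antarctica_c = -89.2
    antarctica_k = antarctica_c + 273.15  # ~184 K on the absolute scale

    # TES detectors run a fraction of a degree above absolute zero;
    # 0.25 K is an assumed, representative operating point.
    detector_k = 0.25

    print(f"Antarctica record: {antarctica_k:.2f} K, detector: {detector_k} K")
    print(f"Detectors are ~{antarctica_k / detector_k:.0f}x colder in absolute terms")
    ```

    On the absolute scale the detectors are hundreds of times colder than the Antarctic record, so "over three times as cold" is a comfortable understatement.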

    Changes in heat are measured and recorded as changes in electrical resistance and will help inform a map of the CMB’s intensity across the sky.
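    A toy model suggests why this scheme is so sensitive: near the superconducting transition, resistance climbs steeply with temperature, so a minuscule temperature rise from absorbed radiation produces a comparatively large, readable resistance change. The transition temperature, width, and normal-state resistance below are assumed illustrative values, not the South Pole Telescope's actual device parameters:

    ```python
    import math

    T_C = 0.5       # illustrative transition temperature, kelvin
    WIDTH = 0.001   # illustrative transition width, kelvin
    R_NORMAL = 1.0  # illustrative normal-state resistance, ohms

    def tes_resistance(t_k):
        """Toy TES: resistance follows a steep sigmoid through the transition."""
        return R_NORMAL / (1.0 + math.exp(-(t_k - T_C) / WIDTH))

    # A sub-millikelvin temperature rise from absorbed radiation moves the
    # resistance measurably when the sensor is biased mid-transition.
    for dt in (0.0, 1e-4, 5e-4):
        print(f"T = Tc + {dt * 1e6:4.0f} uK -> R = {tes_resistance(T_C + dt):.4f} ohm")
    ```

    Biasing the sensor on the steep part of this curve is what turns a transition-edge bolometer into an exquisitely sensitive thermometer.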

    CMB-S4 will focus on newer technology that will allow researchers to distinguish very specific patterns in light, or polarized light. In this case, they are looking for what Bender calls the Holy Grail of polarization, a pattern called B-modes.

    Capturing this signal from the early universe — one far fainter than the intensity signal — will help to either confirm or disprove a generic prediction of inflation.

    It will also require the addition of 500,000 detectors distributed among 21 telescopes in two distinct regions of the world, the South Pole and the Chilean desert. There, the high altitude and extremely dry conditions keep water vapor in the atmosphere from absorbing millimeter wavelength light, like that of the CMB.

    While previous experiments have touched on this polarization, the large number of new detectors will improve sensitivity to that polarization and grow our ability to capture it.

    “Literally, we have built these cameras completely from the ground up,” said Bender. ​“Our innovation is in how to make these stacks of superconducting materials work together within this detector, where you have to couple many complex factors and then actually read out the results with the TES. And that is where Argonne has contributed, hugely.”

    Down to the basics

    Argonne’s capabilities in detector technology don’t just stop at the edge of time, nor do the initiative’s investigations just look at the big picture.

    Most of the visible universe, including galaxies, stars, planets and people, are made up of protons and neutrons. Understanding the most fundamental components of those building blocks and how they interact to make atoms and molecules and just about everything else is the realm of physicists like Zein-Eddine Meziani.

    “From the perspective of the future of my field, this initiative is extremely important,” said Meziani, who leads Argonne’s Medium Energy Physics group. ​“It has given us the ability to actually explore new concepts, develop better understanding of the science and a pathway to enter into bigger collaborations and take some leadership.”

    Taking the lead of the initiative’s nuclear physics component, Meziani is steering Argonne toward a significant role in the development of the Electron-Ion Collider, a new U.S. Nuclear Physics Program facility slated for construction at DOE’s Brookhaven National Laboratory (US).

    Argonne’s primary interest in the collider is to elucidate the role that quarks, anti-quarks and gluons play in giving mass and a quantum angular momentum, called spin, to protons and neutrons — nucleons — the particles that comprise the nucleus of an atom.


    EIC Electron Animation, Inner Proton Motion.
    Electrons colliding with ions will exchange virtual photons with the nuclear particles to help scientists ​“see” inside the nuclear particles; the collisions will produce precision 3D snapshots of the internal arrangement of quarks and gluons within ordinary nuclear matter; like a combination CT/MRI scanner for atoms. (Image by Brookhaven National Laboratory.)

    While we once thought nucleons were the finite fundamental particles of an atom, the emergence of powerful particle colliders, like the Stanford Linear Accelerator Center at Stanford University and the former Tevatron at DOE’s Fermilab, proved otherwise.

    It turns out that quarks and gluons were independent of nucleons in the extreme energy densities of the early universe; as the universe expanded and cooled, they transformed into ordinary matter.

    “There was a time when quarks and gluons were free in a big soup, if you will, but we have never seen them free,” explained Meziani. ​“So, we are trying to understand how the universe captured all of this energy that was there and put it into confined systems, like these droplets we call protons and neutrons.”

    Some of that energy is tied up in gluons, which, despite the fact that they have no mass, confer the majority of mass to a proton. So, Meziani is hoping that the Electron-Ion Collider will allow science to explore — among other properties — the origins of mass in the universe through a detailed exploration of gluons.

    And just as Amy Bender is looking for the B-modes polarization in the CMB, Meziani and other researchers are hoping to use a very specific particle called a J/psi to provide a clearer picture of what’s going on inside a proton’s gluonic field.

    But producing and detecting the J/psi particle within the collider — while ensuring that the proton target doesn’t break apart — is a tricky enterprise, which requires new technologies. Again, Argonne is positioning itself at the forefront of this endeavor.

    “We are working on the conceptual designs of technologies that will be extremely important for the detection of these types of particles, as well as for testing concepts for other science that will be conducted at the Electron-Ion Collider,” said Meziani.

    Argonne also is producing detector and related technologies in its quest for a phenomenon called neutrinoless double beta decay. A neutrino is one of the particles emitted during the process of neutron radioactive beta decay and serves as a small but mighty connection between particle physics and astrophysics.

    “Neutrinoless double beta decay can only happen if the neutrino is its own anti-particle,” said Hafidi. ​“If the existence of these very rare decays is confirmed, it would have important consequences in understanding why there is more matter than antimatter in the universe.”

    Argonne scientists from different areas of the lab are working with the Neutrino Experiment with Xenon Time Projection Chamber (NEXT) collaboration to design and prototype key systems for the collaboration’s next big experiment. This includes developing a one-of-a-kind test facility and an R&D program for new, specialized detector systems.

    “We are really working on dramatic new ideas,” said Meziani. ​“We are investing in certain technologies to produce some proof of principle that they will be the ones to pursue later, that the technology breakthroughs that will take us to the highest sensitivity detection of this process will be driven by Argonne.”

    The tools of detection

    Ultimately, fundamental science is science derived from human curiosity. And while we may not always see the reason for pursuing it, more often than not, fundamental science produces results that benefit all of us. Sometimes it’s a gratifying answer to an age-old question, other times it’s a technological breakthrough intended for one science that proves useful in a host of other applications.

    Through their various efforts, Argonne scientists are aiming for both outcomes. But it will take more than curiosity and brain power to solve the questions they are asking. It will take our skills at toolmaking, like the telescopes that peer deep into the heavens and the detectors that capture hints of the earliest light or the most elusive of particles.

    We will need to employ the ultrafast computing power of new supercomputers. Argonne’s forthcoming Aurora exascale machine will analyze mountains of data for help in creating massive models that simulate the dynamics of the universe or subatomic world, which, in turn, might guide new experiments — or introduce new questions.

    Depiction of ANL ALCF Cray Intel SC18 Shasta Aurora exascale supercomputer, to be built at DOE’s Argonne National Laboratory.

    And we will apply artificial intelligence to recognize patterns in complex observations — on the subatomic and cosmic scales — far more quickly than the human eye can, or use it to optimize machinery and experiments for greater efficiency and faster results.

    “I think we have been given the flexibility to explore new technologies that will allow us to answer the big questions,” said Bender. ​“What we’re developing is so cutting edge, you never know where it will show up in everyday life.”

    Funding for research mentioned in this article was provided by Argonne Laboratory Directed Research and Development; Argonne program development; DOE Office of High Energy Physics: Cosmic Frontier, South Pole Telescope-3G project, Detector R&D; and DOE Office of Nuclear Physics.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Argonne National Laboratory (US) seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems. Argonne is a science and engineering research national laboratory operated by UChicago Argonne LLC for the United States Department of Energy. The facility is located in Lemont, Illinois, outside of Chicago, and is the largest national laboratory by size and scope in the Midwest.

    Argonne had its beginnings in the Metallurgical Laboratory of the University of Chicago, formed in part to carry out Enrico Fermi’s work on nuclear reactors for the Manhattan Project during World War II. After the war, it was designated as the first national laboratory in the United States on July 1, 1946. In the post-war era the lab focused primarily on non-weapon related nuclear physics, designing and building the first power-producing nuclear reactors, helping design the reactors used by the United States’ nuclear navy, and a wide variety of similar projects. In 1994, the lab’s nuclear mission ended, and today it maintains a broad portfolio in basic science research, energy storage and renewable energy, environmental sustainability, supercomputing, and national security.

    UChicago Argonne, LLC, the operator of the laboratory, “brings together the expertise of the University of Chicago (the sole member of the LLC) with Jacobs Engineering Group Inc.” Argonne is a part of the expanding Illinois Technology and Research Corridor. Argonne formerly ran a smaller facility called Argonne National Laboratory-West (or simply Argonne-West) in Idaho next to the Idaho National Engineering and Environmental Laboratory. In 2005, the two Idaho-based laboratories merged to become the DOE’s Idaho National Laboratory.
    What would become Argonne began in 1942 as the Metallurgical Laboratory at the University of Chicago, which had become part of the Manhattan Project. The Met Lab built Chicago Pile-1, the world’s first nuclear reactor, under the stands of the University of Chicago sports stadium. Considered unsafe, in 1943, CP-1 was reconstructed as CP-2, in what is today known as Red Gate Woods but was then the Argonne Forest of the Cook County Forest Preserve District near Palos Hills. The lab was named after the surrounding forest, which in turn was named after the Forest of Argonne in France where U.S. troops fought in World War I. Fermi’s pile was originally going to be constructed in the Argonne forest, and construction plans were set in motion, but a labor dispute brought the project to a halt. Since speed was paramount, the project was moved to the squash court under Stagg Field, the football stadium on the campus of the University of Chicago. Fermi told them that he was sure of his calculations, which said that it would not lead to a runaway reaction, which would have contaminated the city.

    Other activities were added to Argonne over the next five years. On July 1, 1946, the “Metallurgical Laboratory” was formally re-chartered as Argonne National Laboratory for “cooperative research in nucleonics.” At the request of the U.S. Atomic Energy Commission, it began developing nuclear reactors for the nation’s peaceful nuclear energy program. In the late 1940s and early 1950s, the laboratory moved to a larger location in unincorporated DuPage County, Illinois and established a remote location in Idaho, called “Argonne-West,” to conduct further nuclear research.

In quick succession, the laboratory designed and built Chicago Pile 3 (1944), the world's first heavy-water moderated reactor, and the Experimental Breeder Reactor I (Chicago Pile 4), built in Idaho, which lit a string of four light bulbs with the world's first nuclear-generated electricity in 1951. A complete list of the reactors designed and, in most cases, built and operated by Argonne can be viewed on the Reactors Designed by Argonne page. The knowledge gained from the Argonne experiments conducted with these reactors 1) formed the foundation for the designs of most of the commercial reactors currently used throughout the world for electric power generation and 2) informs the current evolving designs of liquid-metal reactors for future commercial power stations.

    Conducting classified research, the laboratory was heavily secured; all employees and visitors needed badges to pass a checkpoint, many of the buildings were classified, and the laboratory itself was fenced and guarded. Such alluring secrecy drew visitors both authorized—including King Leopold III of Belgium and Queen Frederica of Greece—and unauthorized. Shortly past 1 a.m. on February 6, 1951, Argonne guards discovered reporter Paul Harvey near the 10-foot (3.0 m) perimeter fence, his coat tangled in the barbed wire. Searching his car, guards found a previously prepared four-page broadcast detailing the saga of his unauthorized entrance into a classified “hot zone”. He was brought before a federal grand jury on charges of conspiracy to obtain information on national security and transmit it to the public, but was not indicted.

    Not all nuclear technology went into developing reactors, however. While designing a scanner for reactor fuel elements in 1957, Argonne physicist William Nelson Beck put his own arm inside the scanner and obtained one of the first ultrasound images of the human body. Remote manipulators designed to handle radioactive materials laid the groundwork for more complex machines used to clean up contaminated areas, sealed laboratories or caves. In 1964, the “Janus” reactor opened to study the effects of neutron radiation on biological life, providing research for guidelines on safe exposure levels for workers at power plants, laboratories and hospitals. Scientists at Argonne pioneered a technique to analyze the moon’s surface using alpha radiation, which launched aboard the Surveyor 5 in 1967 and later analyzed lunar samples from the Apollo 11 mission.

    In addition to nuclear work, the laboratory maintained a strong presence in the basic research of physics and chemistry. In 1955, Argonne chemists co-discovered the elements einsteinium and fermium, elements 99 and 100 in the periodic table. In 1962, laboratory chemists produced the first compound of the inert noble gas xenon, opening up a new field of chemical bonding research. In 1963, they discovered the hydrated electron.

    High-energy physics made a leap forward when Argonne was chosen as the site of the 12.5 GeV Zero Gradient Synchrotron, a proton accelerator that opened in 1963. A bubble chamber allowed scientists to track the motions of subatomic particles as they zipped through the chamber; in 1970, they observed the neutrino in a hydrogen bubble chamber for the first time.

Meanwhile, the laboratory was also helping to design the reactor for the world's first nuclear-powered submarine, the U.S.S. Nautilus, which steamed for more than 513,550 nautical miles (951,090 km). The next nuclear reactor model was the Experimental Boiling Water Reactor, the forerunner of many modern nuclear plants, and Experimental Breeder Reactor II (EBR-II), which was sodium-cooled and included a fuel recycling facility. EBR-II was later modified to test other reactor designs, including a fast-neutron reactor and, in 1982, the Integral Fast Reactor concept, a revolutionary design that reprocessed its own fuel, reduced its atomic waste and withstood safety tests simulating the same failures that triggered the Chernobyl and Three Mile Island disasters. In 1994, however, the U.S. Congress terminated funding for the bulk of Argonne's nuclear programs.

    Argonne moved to specialize in other areas, while capitalizing on its experience in physics, chemical sciences and metallurgy. In 1987, the laboratory was the first to successfully demonstrate a pioneering technique called plasma wakefield acceleration, which accelerates particles in much shorter distances than conventional accelerators. It also cultivated a strong battery research program.

    Following a major push by then-director Alan Schriesheim, the laboratory was chosen as the site of the Advanced Photon Source, a major X-ray facility which was completed in 1995 and produced the brightest X-rays in the world at the time of its construction.

On 19 March 2019, the Chicago Tribune reported that the laboratory was constructing the world's most powerful supercomputer. Costing $500 million, it will have a processing power of one quintillion floating-point operations per second (one exaflop). Applications will include the analysis of stars and improvements in the power grid.

    With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    About the Advanced Photon Source

    The U. S. Department of Energy Office of Science’s Advanced Photon Source (APS) at Argonne National Laboratory is one of the world’s most productive X-ray light source facilities. The APS provides high-brightness X-ray beams to a diverse community of researchers in materials science, chemistry, condensed matter physics, the life and environmental sciences, and applied research. These X-rays are ideally suited for explorations of materials and biological structures; elemental distribution; chemical, magnetic, electronic states; and a wide range of technologically important engineering systems from batteries to fuel injector sprays, all of which are the foundations of our nation’s economic, technological, and physical well-being. Each year, more than 5,000 researchers use the APS to produce over 2,000 publications detailing impactful discoveries, and solve more vital biological protein structures than users of any other X-ray light source research facility. APS scientists and engineers innovate technology that is at the heart of advancing accelerator and light-source operations. This includes the insertion devices that produce extreme-brightness X-rays prized by researchers, lenses that focus the X-rays down to a few nanometers, instrumentation that maximizes the way the X-rays interact with samples being studied, and software that gathers and manages the massive quantity of data resulting from discovery research at the APS.



     
  • richardmitnick 9:03 pm on July 5, 2021 Permalink | Reply
    Tags: "U of T researcher launches group to help detect hidden biases in AI systems", Artificial Intelligence, As AI systems are deployed in an ever-expanding range of applications bias in AI becomes an even more critical issue., , For example: the app works 80 per cent successfully on native English speakers but only 40 per cent for people whose first language is not English ., HALT AI group, HALT was launched in May as a free service., Measuring biases present in artificial intelligence systems as a first step toward fixing them., The group has studied systems for Apple; Google; and Microsoft. among others., The majority of the time there is a training set problem., The scientists found problems with Apple and Google’s voice-to-text systems., The scientists found that Microsoft’s age-estimation AI does not perform well for certain age groups.,   

    From University of Toronto (CA) : “U of T researcher launches group to help detect hidden biases in AI systems” 

    From University of Toronto (CA)

    July 05, 2021
    Matthew Tierney

    1
    Parham Aarabi, of the department of electrical and computer engineering, helped start a research group that uncovers biases in AI systems, including some belonging to Apple, Google and Microsoft. Photo by Johnny Guatto.

    A new initiative led by University of Toronto researcher Parham Aarabi aims to measure biases present in artificial intelligence systems as a first step toward fixing them.

    AI systems often reflect biases that are present in the datasets – or, sometimes, the AI’s modelling can introduce new biases.

    “Every AI system has some kind of a bias,” says Aarabi, an associate professor of communications/computer engineering in the Edward S. Rogers Sr. department of electrical and computer engineering in the Faculty of Applied Science & Engineering. “I say that as someone who has worked on AI systems and algorithms for over 20 years.”

    Aarabi is among the academic and industry experts in the University of Toronto’s HALT AI group, which tests other organizations’ AI systems using diverse input sets. HALT AI creates a diversity report – including a diversity chart for key metrics – that shows weaknesses and suggests improvements.

    “We found that most AI teams do not perform actual quantitative validation of their system,” Aarabi says. “We are able to say, for example, ‘Look, your app works 80 per cent successfully on native English speakers, but only 40 per cent for people whose first language is not English.’”
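The validation Aarabi describes amounts to computing a model's accuracy separately for each demographic slice of a test set rather than as a single overall number. A minimal sketch of that per-group breakdown (the group labels, numbers and function name here are illustrative, not HALT AI's actual data or tooling):

```python
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, correct) pairs.
    Returns a dict mapping each group to its accuracy."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for group, correct in records:
        totals[group] += 1
        hits[group] += int(correct)
    return {g: hits[g] / totals[g] for g in totals}

# Hypothetical evaluation results for a speech app, mirroring the
# 80% vs. 40% example from the article.
records = (
    [("native English", True)] * 8 + [("native English", False)] * 2 +
    [("non-native English", True)] * 4 + [("non-native English", False)] * 6
)
print(accuracy_by_group(records))
# {'native English': 0.8, 'non-native English': 0.4}
```

An overall accuracy on this toy set would read as 60 per cent and hide the gap entirely; the per-group view is what exposes it.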

HALT was launched in May as a free service. The group has conducted studies on a number of popular AI systems, including some belonging to Apple, Google and Microsoft. HALT's statistical reports provide feedback across a variety of diversity dimensions, such as gender, age and race.

    “In our own testing we found that Microsoft’s age-estimation AI does not perform well for certain age groups,” says Aarabi. “So too with Apple and Google’s voice-to-text systems: If you have a certain dialect, an accent, they can work poorly. But you do not know which dialect until you test. Similar apps fail in different ways – which is interesting, and likely indicative of the type and limitation of the training data that was used for each app.”

    HALT started early this year when AI researchers within and outside the electrical and computer engineering department began sharing their concerns about bias in AI systems. By May, the group brought aboard external experts in diversity from the private and academic sectors.

    “To truly understand and measure bias, it can’t just be a few people from U of T,” Aarabi says. “HALT is a broad group of individuals, including the heads of diversity at Fortune 500 companies as well as AI diversity experts at other academic institutions such as University College London (UK) and Stanford University (US).”

    As AI systems are deployed in an ever-expanding range of applications bias in AI becomes an even more critical issue. While AI system performance remains a priority, a growing number of developers are also inspecting their work for inherent biases.

    “The majority of the time there is a training set problem,” Aarabi says. “The developers simply don’t have enough training data across all representative demographic groups.”

    If diverse training data doesn’t improve the AI’s performance, then the model itself may be flawed and require reprogramming.

    Deepa Kundur, a professor and the chair of the department of electrical and computer engineering, says HALT AI is helping to create fairer AI systems.

    “Our push for diversity starts at home, in our department, but also extends to the electrical and computer engineering community at large – including the tools that researchers innovate for society,” she says. “HALT AI is helping to ensure a way forward for equitable and fair AI.”

    “Right now is the right time for researchers and practitioners to be thinking about this,” Aarabi adds. “They need to move from high-level abstractions and be definitive about how bias reveals itself. I think we can shed some light on that.”

See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Toronto (CA) is a public research university in Toronto, Ontario, Canada, located on the grounds that surround Queen’s Park. It was founded by royal charter in 1827 as King’s College, the oldest university in the province of Ontario.

    Originally controlled by the Church of England, the university assumed its present name in 1850 upon becoming a secular institution.

    As a collegiate university, it comprises eleven colleges each with substantial autonomy on financial and institutional affairs and significant differences in character and history. The university also operates two satellite campuses located in Scarborough and Mississauga.

    University of Toronto has evolved into Canada’s leading institution of learning, discovery and knowledge creation. We are proud to be one of the world’s top research-intensive universities, driven to invent and innovate.

    Our students have the opportunity to learn from and work with preeminent thought leaders through our multidisciplinary network of teaching and research faculty, alumni and partners.

    The ideas, innovations and actions of more than 560,000 graduates continue to have a positive impact on the world.

    Academically, the University of Toronto is noted for movements and curricula in literary criticism and communication theory, known collectively as the Toronto School.

The university was the birthplace of insulin and stem cell research, and was the site of the first electron microscope in North America; the identification of the first black hole, Cygnus X-1; multi-touch technology; and the development of the theory of NP-completeness.

    The university was one of several universities involved in early research of deep learning. It receives the most annual scientific research funding of any Canadian university and is one of two members of the Association of American Universities (US) outside the United States, the other being McGill(CA).

    The Varsity Blues are the athletic teams that represent the university in intercollegiate league matches, with ties to gridiron football, rowing and ice hockey. The earliest recorded instance of gridiron football occurred at University of Toronto’s University College in November 1861.

    The university’s Hart House is an early example of the North American student centre, simultaneously serving cultural, intellectual, and recreational interests within its large Gothic-revival complex.

    The University of Toronto has educated three Governors General of Canada, four Prime Ministers of Canada, three foreign leaders, and fourteen Justices of the Supreme Court. As of March 2019, ten Nobel laureates, five Turing Award winners, 94 Rhodes Scholars, and one Fields Medalist have been affiliated with the university.

    Early history

The founding of a colonial college had long been the desire of John Graves Simcoe, the first Lieutenant-Governor of Upper Canada and founder of York, the colonial capital. As a University of Oxford (UK)-educated military commander who had fought in the American Revolutionary War, Simcoe believed a college was needed to counter the spread of republicanism from the United States. The Upper Canada Executive Committee recommended in 1798 that a college be established in York.

    On March 15, 1827, a royal charter was formally issued by King George IV, proclaiming “from this time one College, with the style and privileges of a University … for the education of youth in the principles of the Christian Religion, and for their instruction in the various branches of Science and Literature … to continue for ever, to be called King’s College.” The granting of the charter was largely the result of intense lobbying by John Strachan, the influential Anglican Bishop of Toronto who took office as the college’s first president. The original three-storey Greek Revival school building was built on the present site of Queen’s Park.

Under Strachan’s stewardship, King’s College was a religious institution closely aligned with the Church of England and the British colonial elite, known as the Family Compact. Reformist politicians opposed the clergy’s control over colonial institutions and fought to have the college secularized. In 1849, after a lengthy and heated debate, the newly elected responsible government of the Province of Canada voted to rename King’s College as the University of Toronto and severed the school’s ties with the church. Having anticipated this decision, the enraged Strachan had resigned a year earlier to open Trinity College as a private Anglican seminary. University College was created as the nondenominational teaching branch of the University of Toronto. During the American Civil War, the threat of a Union blockade on British North America prompted the creation of the University Rifle Corps, which saw battle in resisting the Fenian raids on the Niagara border in 1866. The Corps was part of the Reserve Militia led by Professor Henry Croft.

Established in 1878, the School of Practical Science was the precursor to the Faculty of Applied Science and Engineering, which has been nicknamed Skule since its earliest days. While the Faculty of Medicine opened in 1843, medical teaching was conducted by proprietary schools from 1853 until 1887, when the faculty absorbed the Toronto School of Medicine. Meanwhile, the university continued to set examinations and confer medical degrees. The university opened the Faculty of Law in 1887, followed by the Faculty of Dentistry in 1888, when the Royal College of Dental Surgeons became an affiliate. Women were first admitted to the university in 1884.

A devastating fire in 1890 gutted the interior of University College and destroyed 33,000 volumes from the library, but the university restored the building and replenished its library within two years. Over the next two decades, a collegiate system took shape as the university arranged federation with several ecclesiastical colleges, including Strachan’s Trinity College in 1904. The university operated the Royal Conservatory of Music from 1896 to 1991 and the Royal Ontario Museum from 1912 to 1968; both still retain close ties with the university as independent institutions. The University of Toronto Press was founded in 1901 as Canada’s first academic publishing house. The Faculty of Forestry, founded in 1907 with Bernhard Fernow as dean, was Canada’s first university faculty devoted to forest science. In 1910, the Faculty of Education opened its laboratory school, the University of Toronto Schools.

    World wars and post-war years

    The First and Second World Wars curtailed some university activities as undergraduate and graduate men eagerly enlisted. Intercollegiate athletic competitions and the Hart House Debates were suspended although exhibition and interfaculty games were still held. The David Dunlap Observatory in Richmond Hill opened in 1935 followed by the University of Toronto Institute for Aerospace Studies in 1949. The university opened satellite campuses in Scarborough in 1964 and in Mississauga in 1967. The university’s former affiliated schools at the Ontario Agricultural College and Glendon Hall became fully independent of the University of Toronto and became part of University of Guelph (CA) in 1964 and York University (CA) in 1965 respectively. Beginning in the 1980s reductions in government funding prompted more rigorous fundraising efforts.

    Since 2000

    In 2000 Kin-Yip Chun was reinstated as a professor of the university after he launched an unsuccessful lawsuit against the university alleging racial discrimination. In 2017 a human rights application was filed against the University by one of its students for allegedly delaying the investigation of sexual assault and being dismissive of their concerns. In 2018 the university cleared one of its professors of allegations of discrimination and antisemitism in an internal investigation after a complaint was filed by one of its students.

The University of Toronto was the first Canadian university to amass a financial endowment greater than $1 billion, reaching that mark in 2007. On September 24, 2020, the university announced a $250 million gift to the Faculty of Medicine from businessman and philanthropist James C. Temerty, the largest single philanthropic donation in Canadian history. This broke the previous record for the school, set in 2019 when Gerry Schwartz and Heather Reisman jointly donated $100 million for the creation of a 750,000-square-foot innovation and artificial intelligence centre.

    Research

Since 1926, the University of Toronto has been a member of the Association of American Universities (US), a consortium of the leading North American research universities. The university manages by far the largest annual research budget of any university in Canada, with sponsored direct-cost expenditures of $878 million in 2010. In 2018, the University of Toronto was named the top research university in Canada by Research Infosource, with a sponsored research income (external sources of funding) of $1,147.584 million in 2017. In the same year, the university’s faculty averaged a sponsored research income of $428,200, while graduate students averaged a sponsored research income of $63,700. The federal government was the largest source of funding, with grants from the Canadian Institutes of Health Research, the Natural Sciences and Engineering Research Council, and the Social Sciences and Humanities Research Council amounting to about one-third of the research budget. About eight percent of research funding came from corporations, mostly in the healthcare industry.

The first practical electron microscope was built by the physics department in 1938. During World War II, the university developed the G-suit, a life-saving garment worn by Allied fighter plane pilots that was later adopted for use by astronauts. Development of the infrared chemiluminescence technique improved analyses of energy behaviours in chemical reactions. In 1963, the asteroid 2104 Toronto was discovered at the David Dunlap Observatory (CA) in Richmond Hill and is named after the university. In 1972, studies on Cygnus X-1 led to the publication of the first observational evidence proving the existence of black holes. Toronto astronomers have also discovered the Uranian moons Caliban and Sycorax; the dwarf galaxies Andromeda I, II and III; and the supernova SN 1987A. A pioneer in computing technology, the university designed and built UTEC, one of the world’s first operational computers, and later purchased Ferut, the second commercial computer after UNIVAC I. Multi-touch technology was developed at Toronto, with applications ranging from handheld devices to collaboration walls. The AeroVelo Atlas, which won the Igor I. Sikorsky Human Powered Helicopter Competition in 2013, was developed by the university’s team of students and graduates and was tested in Vaughan.

The discovery of insulin at the University of Toronto in 1921 is considered among the most significant events in the history of medicine. The stem cell was discovered at the university in 1963, forming the basis for bone marrow transplantation and all subsequent research on adult and embryonic stem cells. This was the first of many findings at Toronto relating to stem cells, including the identification of pancreatic and retinal stem cells. The cancer stem cell was first identified in 1997 by Toronto researchers, who have since found stem cell associations in leukemia, brain tumors, and colorectal cancer. Medical inventions developed at Toronto include the glycaemic index, the infant cereal Pablum, the use of protective hypothermia in open heart surgery, and the first artificial cardiac pacemaker. The first successful single-lung transplant was performed at Toronto in 1981, followed by the first nerve transplant in 1988 and the first double-lung transplant in 1989. Researchers identified the maturation promoting factor that regulates cell division and discovered the T-cell receptor, which triggers responses of the immune system. The university is credited with isolating the genes that cause Fanconi anemia, cystic fibrosis, and early-onset Alzheimer’s disease, among numerous other diseases. Between 1914 and 1972, the university operated the Connaught Medical Research Laboratories, now part of the pharmaceutical corporation Sanofi-Aventis. Among the research conducted at the laboratory was the development of gel electrophoresis.

The University of Toronto is the primary research presence supporting one of the world’s largest concentrations of biotechnology firms. More than 5,000 principal investigators reside within 2 kilometres (1.2 mi) of the university grounds in Toronto’s Discovery District, conducting $1 billion of medical research annually. MaRS Discovery District is a research park that serves commercial enterprises and the university’s technology transfer ventures. In 2008, the university disclosed 159 inventions and had 114 active start-up companies. Its SciNet Consortium operates the most powerful supercomputer in Canada.

     
  • richardmitnick 9:11 pm on July 2, 2021 Permalink | Reply
    Tags: "AI Designs Quantum Physics Experiments Beyond What Any Human Has Conceived", Artificial Intelligence, MELVIN had seemingly solved the problem of creating highly complex entangled states involving multiple photons. How?, MELVIN was a machine-learning algorithm., , , The algorithm had rediscovered a type of experimental arrangement that had been devised in the early 1990s., When two photons interact they become entangled and both can only be mathematically described using a single shared quantum state.   

    From Scientific American : “AI Designs Quantum Physics Experiments Beyond What Any Human Has Conceived” 

    From Scientific American

    July 2, 2021
    Anil Ananthaswamy

    1
    Credit: Getty Images.

    Quantum physicist Mario Krenn remembers sitting in a café in Vienna in early 2016, poring over computer printouts, trying to make sense of what MELVIN had found. MELVIN was a machine-learning algorithm Krenn had built, a kind of artificial intelligence. Its job was to mix and match the building blocks of standard quantum experiments and find solutions to new problems. And it did find many interesting ones. But there was one that made no sense.

    “The first thing I thought was, ‘My program has a bug, because the solution cannot exist,’” Krenn says. MELVIN had seemingly solved the problem of creating highly complex entangled states involving multiple photons (entangled states being those that once made Albert Einstein invoke the specter of “spooky action at a distance”). Krenn and his colleagues had not explicitly provided MELVIN the rules needed to generate such complex states, yet it had found a way. Eventually, he realized that the algorithm had rediscovered a type of experimental arrangement that had been devised in the early 1990s. But those experiments had been much simpler. MELVIN had cracked a far more complex puzzle.

    “When we understood what was going on, we were immediately able to generalize [the solution],” says Krenn, who is now at the University of Toronto (CA). Since then, other teams have started performing the experiments identified by MELVIN, allowing them to test the conceptual underpinnings of quantum mechanics in new ways. Meanwhile Krenn, Anton Zeilinger of the University of Vienna [Universität Wien] (AT) and their colleagues have refined their machine-learning algorithms. Their latest effort, an AI called THESEUS, has upped the ante: it is orders of magnitude faster than MELVIN, and humans can readily parse its output. While it would take Krenn and his colleagues days or even weeks to understand MELVIN’s meanderings, they can almost immediately figure out what THESEUS is saying.

    “It is amazing work,” says theoretical quantum physicist Renato Renner of the Institute for Theoretical Physics at the Swiss Federal Institute of Technology ETH Zürich [Eidgenössische Technische Hochschule Zürich)](CH), who reviewed a 2020 study about THESEUS by Krenn and Zeilinger but was not directly involved in these efforts.

    Krenn stumbled on this entire research program somewhat by accident when he and his colleagues were trying to figure out how to experimentally create quantum states of photons entangled in a very particular manner: When two photons interact, they become entangled, and both can only be mathematically described using a single shared quantum state. If you measure the state of one photon, the measurement instantly fixes the state of the other even if the two are kilometers apart (hence Einstein’s derisive comments on entanglement being “spooky”).

    In 1989 three physicists—Daniel Greenberger, the late Michael Horne and Zeilinger—described an entangled state that came to be known as “GHZ” (after their initials). It involved four photons, each of which could be in a quantum superposition of, say, two states, 0 and 1 (a quantum state called a qubit). In their paper, the GHZ state involved entangling four qubits such that the entire system was in a two-dimensional quantum superposition of states 0000 and 1111. If you measured one of the photons and found it in state 0, the superposition would collapse, and the other photons would also be in state 0. The same went for state 1. In the late 1990s Zeilinger and his colleagues experimentally observed GHZ states using three qubits for the first time.
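The collapse described above can be checked numerically. The sketch below is a minimal NumPy simulation, not code from any of the experiments discussed: it builds the three-qubit GHZ state as an 8-entry state vector and projects it after a measurement finds the first qubit in state 0.

```python
import numpy as np

# Three-qubit GHZ state |GHZ> = (|000> + |111>) / sqrt(2),
# stored as a length-8 vector over the basis 000, 001, ..., 111.
ghz = np.zeros(8)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

# Measurement probabilities are the squared amplitudes: only the
# all-zeros and all-ones outcomes are possible, each with probability 1/2.
probs = np.abs(ghz) ** 2
print(probs[0], probs[7])

# Simulate measuring the first qubit and finding it in state 0:
# project onto the basis states whose first bit is 0 (indices 0-3)
# and renormalize. The remaining qubits are then forced into state 0 too.
collapsed = ghz.copy()
collapsed[4:] = 0
collapsed /= np.linalg.norm(collapsed)
print(collapsed)  # all amplitude now sits on |000>
```

The same bookkeeping with a 27-entry vector over basis states 000 through 222 would represent the three-dimensional qutrit GHZ state that Krenn's team was after.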

Krenn and his colleagues were aiming for GHZ states of higher dimensions. They wanted to work with three photons, where each photon had a dimensionality of three, meaning it could be in a superposition of three states: 0, 1 and 2. This quantum state is called a qutrit. The entanglement the team was after was a three-dimensional GHZ state that was a superposition of states 000, 111 and 222. Such states are important ingredients for secure quantum communications and faster quantum computing. In late 2013 the researchers spent weeks designing experiments on blackboards and doing the calculations to see if their setups could generate the required quantum states. But each time they failed. “I thought, ‘This is absolutely insane. Why can’t we come up with a setup?’” Krenn says.

    To speed up the process, Krenn first wrote a computer program that took an experimental setup and calculated the output. Then he upgraded the program to allow it to incorporate in its calculations the same building blocks that experimenters use to create and manipulate photons on an optical bench: lasers, nonlinear crystals, beam splitters, phase shifters, holograms, and the like. The program searched through a large space of configurations by randomly mixing and matching the building blocks, performed the calculations and spat out the result. MELVIN was born. “Within a few hours, the program found a solution that we scientists—three experimentalists and one theorist—could not come up with for months,” Krenn says. “That was a crazy day. I could not believe that it happened.”
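The core loop described above—randomly chain building blocks, simulate the setup, check the output—can be sketched in a few lines. This is an illustrative skeleton only, not MELVIN's actual code: the function names and the toy "toolbox" below are stand-ins, and a real implementation would simulate quantum optics rather than sum integers:

```python
import random

def random_search(toolbox, simulate, is_target, max_setup_len=6,
                  trials=10_000, seed=0):
    """MELVIN-style search sketch: randomly mix and match building
    blocks from the toolbox, simulate the resulting setup, and return
    the first setup whose simulated output matches the target."""
    rng = random.Random(seed)
    for _ in range(trials):
        length = rng.randint(1, max_setup_len)
        setup = [rng.choice(toolbox) for _ in range(length)]
        if is_target(simulate(setup)):
            return setup
    return None  # no setup found within the trial budget

# Toy stand-in: "building blocks" are integers, "simulating" a setup
# means summing them, and the "target state" is a sum of exactly 10.
toolbox = [1, 2, 3, 5]
found = random_search(toolbox, simulate=sum, is_target=lambda out: out == 10)
print(found)
```

The upgrade described in the next paragraph—remembering useful setups and adding them back into the toolbox—would correspond to appending successful sub-setups to `toolbox` so later trials can reuse them as single building blocks.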

    Then he gave MELVIN more smarts. Anytime it found a setup that did something useful, MELVIN added that setup to its toolbox. “The algorithm remembers that and tries to reuse it for more complex solutions,” Krenn says.

    It was this more evolved MELVIN that left Krenn scratching his head in a Viennese café. He had set it running with an experimental toolbox that contained two crystals, each capable of generating a pair of photons entangled in three dimensions. Krenn’s naive expectation was that MELVIN would find configurations that combined these pairs of photons to create entangled states of at most nine dimensions. But “it actually found one solution, an extremely rare case, that has much higher entanglement than the rest of the states,” Krenn says.

    Eventually, he figured out that MELVIN had used a technique that multiple teams had developed nearly three decades ago. In 1991 one method was designed by Xin Yu Zou, Li Jun Wang and Leonard Mandel, all then at the University of Rochester (US). And in 1994 Zeilinger, then at the University of Innsbruck [Leopold-Franzens-Universität Innsbruck] (AT), and his colleagues came up with another. Conceptually, these experiments attempted something similar, but the configuration that Zeilinger and his colleagues devised is simpler to understand. It starts with one crystal that generates a pair of photons (A and B). The paths of these photons go right through another crystal, which can also generate two photons (C and D). The paths of photon A from the first crystal and of photon C from the second overlap exactly and lead to the same detector. If that detector clicks, it is impossible to tell whether the photon originated from the first or the second crystal. The same goes for photons B and D.

    A phase shifter is a device that effectively increases the path a photon travels by some fraction of its wavelength. If you were to introduce a phase shifter in one of the paths between the crystals and keep changing the amount of phase shift, you could cause constructive and destructive interference at the detectors. For example, each of the crystals could be generating, say, 1,000 pairs of photons per second. With constructive interference, the detectors would register 4,000 pairs of photons per second. And with destructive interference, they would detect none: the system as a whole would not create any photons even though individual crystals would be generating 1,000 pairs a second. “That is actually quite crazy, when you think about it,” Krenn says.
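The 4,000-or-nothing arithmetic follows from adding the pair-generation *amplitudes* coherently before squaring, rather than adding the rates. A minimal single-mode sketch (a simplification of the effect described above, not a full quantum simulation):

```python
import numpy as np

# Two crystals, each producing 1,000 photon pairs per second.
# The detected rate is the squared magnitude of the summed amplitudes.
rate = 1000.0
amplitude = np.sqrt(rate)

def detected_rate(phase_shift: float) -> float:
    """Pairs per second at the detectors for a given relative phase
    between the two crystals' pair-generation amplitudes."""
    total = amplitude + amplitude * np.exp(1j * phase_shift)
    return abs(total) ** 2

print(detected_rate(0.0))    # constructive: 4 * 1000 pairs/s
print(detected_rate(np.pi))  # destructive: essentially zero
```

Because the amplitudes add before squaring, in-phase crystals give |2√1000|² = 4,000 pairs per second, while a half-wavelength shift cancels them entirely.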

    MELVIN’s funky solution involved such overlapping paths. What had flummoxed Krenn was that the algorithm had only two crystals in its toolbox. And instead of using those crystals at the beginning of the experimental setup, it had wedged them inside an interferometer (a device that splits the path of, say, a photon into two and then recombines them). After much effort, he realized that the setup MELVIN had found was equivalent to one involving more than two crystals, each generating pairs of photons, such that their paths to the detectors overlapped. The configuration could be used to generate high-dimensional entangled states.

    Quantum physicist Nora Tischler, who was a Ph.D. student working with Zeilinger on an unrelated topic when MELVIN was being put through its paces, was paying attention to these developments. “It was kind of clear from the beginning [that such an] experiment wouldn’t exist if it hadn’t been discovered by an algorithm,” she says.

    Besides generating complex entangled states, the setup using more than two crystals with overlapping paths can be employed to perform a generalized form of Zeilinger’s 1994 quantum interference experiments with two crystals. Aephraim Steinberg, an experimentalist at the University of Toronto, who is a colleague of Krenn’s but has not worked on these projects, is impressed by what the AI found. “This is a generalization that (to my knowledge) no human dreamed up in the intervening decades and might never have done,” he says. “It’s a gorgeous first example of the kind of new explorations these thinking machines can take us on.”

    In one such generalized configuration with four crystals, each generating a pair of photons, and overlapping paths leading to four detectors, quantum interference can create situations where either all four detectors click (constructive interference) or none of them do so (destructive interference).

    But until recently, carrying out such an experiment remained a distant dream. Then, in a March preprint paper, a team led by Lan-Tian Feng of The University of Science and Technology of China [中国科学技术大学] (CN) at The Chinese Academy of Sciences [中国科学院] (CN), in collaboration with Krenn, reported that they had fabricated the entire setup on a single photonic chip and performed the experiment. The researchers collected data for more than 16 hours: a feat made possible because of the photonic chip’s incredible optical stability, something that would have been impossible to achieve in a larger-scale tabletop experiment. For starters, the setup would require a square meter’s worth of optical elements precisely aligned on an optical bench, Steinberg says. Besides, “a single optical element jittering or drifting by a thousandth of the diameter of a human hair during those 16 hours could be enough to wash out the effect,” he says.

    During their early attempts to simplify and generalize what MELVIN had found, Krenn and his colleagues realized that the solution resembled abstract mathematical forms called graphs, which contain vertices and edges and are used to depict pairwise relations between objects. For these quantum experiments, every path a photon takes is represented by a vertex. And a crystal, for example, is represented by an edge connecting two vertices. MELVIN first produced such a graph and then performed a mathematical operation on it. The operation, called “perfect matching,” involves generating an equivalent graph in which each vertex is connected to only one edge. This process makes calculating the final quantum state much easier, although it is still hard for humans to understand.
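The graph picture can be made concrete with a small enumeration. In the sketch below (my own illustration, using standard graph terminology rather than MELVIN's internals), vertices stand for photon paths and edges for crystals; each perfect matching—a set of edges that covers every vertex exactly once—contributes one term to the final quantum state:

```python
from itertools import combinations

def perfect_matchings(vertices, edges):
    """Yield all perfect matchings of a graph: subsets of edges in which
    every vertex appears exactly once. In the graph picture of these
    experiments, each matching contributes one term to the final state."""
    n = len(vertices) // 2  # a perfect matching uses exactly n edges
    for subset in combinations(edges, n):
        covered = [v for edge in subset for v in edge]
        if len(covered) == len(vertices) and set(covered) == set(vertices):
            yield subset

# Toy setup: four photon paths a-d, and four crystals, each able to emit
# a photon pair into the two paths its edge connects.
vertices = ["a", "b", "c", "d"]
edges = [("a", "b"), ("c", "d"), ("a", "c"), ("b", "d")]
for matching in perfect_matchings(vertices, edges):
    print(matching)
# Two matchings: {ab, cd} and {ac, bd}, i.e. two terms in the state.
```

THESEUS's simplification, described next, amounts to shrinking such a graph to the fewest edges and vertices that still produce the desired set of terms.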

    That changed with MELVIN’s successor THESEUS, which generates much simpler graphs by winnowing the first complex graph representing a solution that it finds down to the bare minimum number of edges and vertices (such that any further deletion destroys the setup’s ability to generate the desired quantum states). Such graphs are simpler than MELVIN’s perfect matching graphs, so it is even easier to make sense of any AI-generated solution.

    Renner is particularly impressed by THESEUS’s human-interpretable outputs. “The solution is designed in such a way that the number of connections in the graph is minimized,” he says. “And that’s naturally a solution we can better understand than if you had a very complex graph.”

    Eric Cavalcanti of Griffith University (AU) is both impressed by the work and circumspect about it. “These machine-learning techniques represent an interesting development. For a human scientist looking at the data and interpreting it, some of the solutions may look like ‘creative’ new solutions. But at this stage, these algorithms are still far from a level where it could be said that they are having truly new ideas or coming up with new concepts,” he says. “On the other hand, I do think that one day they will get there. So these are baby steps—but we have to start somewhere.”

    Steinberg agrees. “For now, they are just amazing tools,” he says. “And like all the best tools, they’re already enabling us to do some things we probably wouldn’t have done without them.”

    See the full article here.



    Scientific American , the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.
