Tagged: Artificial Intelligence

  • richardmitnick 9:17 am on August 13, 2021 Permalink | Reply
    Tags: "From detecting earthquakes to preventing disease- 27 U of T research projects receive CFI funding", Aerospace Studies and Engineering, Artificial Intelligence, Baby Brain and Behaviour, , Cellular and Biomolecular Research, Chemical Engineering & Applied Chemistry, Civil and Mineral engineering, Dynamic Emotional Behavior, , Macromolecular bioelectronics encoded for self-assembly, Mechanical & Industrial Engineering, Medical Biophysics and Cancer studies, Multi-organ repair and regeneration after lung injury, Nutritional sciences, Pharmacology and Toxicology, Radiation Oncology, Stem cell models, , Sustainable Water Management and Resource Recovery, Targeted brain tumour therapies,   

    From University of Toronto (CA) : “From detecting earthquakes to preventing disease- 27 U of T research projects receive CFI funding” 

    From University of Toronto (CA)

    August 12, 2021
    Tyler Irving

    In a U of T Engineering lab, rock samples are subjected to the stress, fluid pressure and temperature conditions they experience in nature. Photo courtesy of Sebastian Goodfellow.

    Sebastian Goodfellow, a researcher at the University of Toronto (CA), listens for hidden signals that the ground is about to move beneath our feet.

    That includes so-called “induced” earthquakes that stem from human activities such as hydraulic fracturing (‘fracking’) and enhanced geothermal systems.

    “Think of the cracking sounds a cube of ice makes when you drop it in a cup of warm water, or the sound a wooden stick makes when you bend it until it breaks,” says Goodfellow, an assistant professor in the department of civil and mineral engineering in the Faculty of Applied Science & Engineering.

    “This occurs as a consequence of sudden localized changes in stress, and we study these microfracture sounds in the lab to understand how rock responds to changes in stress, fluid pressure and temperature.”

    While the frequency of these sonic clues is beyond the range of human hearing, they can be picked up with acoustic emission sensors. The challenge, however, is that scientists must listen continuously for hours in the absence of a method to predict when they will occur.

    “We’re talking about more than a terabyte of data per hour,” says Goodfellow. “We use a form of artificial intelligence called machine learning to extract patterns from these large waveform datasets.”
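    The article doesn’t describe Goodfellow’s algorithms in detail, but a classic first step for pulling discrete microfracture “pops” out of continuous acoustic-emission records is a short-term-average/long-term-average (STA/LTA) trigger, which flags moments when the signal energy suddenly jumps above its recent background. The sketch below is a minimal illustration of that idea in Python with numpy; the sampling rate, window lengths and threshold are assumptions for the example, not values from the lab’s setup.

        import numpy as np

        def sta_lta(trace, fs, sta_win=1e-4, lta_win=5e-3):
            """Ratio of short-term to long-term average signal energy.

            trace: 1-D waveform; fs: sampling rate in Hz. The window
            lengths (0.1 ms and 5 ms here) are illustrative values.
            """
            nsta = max(1, int(sta_win * fs))
            nlta = max(1, int(lta_win * fs))
            energy = trace.astype(float) ** 2
            # Moving averages of signal energy computed via cumulative sums.
            csum = np.cumsum(np.concatenate(([0.0], energy)))
            sta = (csum[nsta:] - csum[:-nsta]) / nsta
            lta = (csum[nlta:] - csum[:-nlta]) / nlta
            n = min(len(sta), len(lta))  # align both series at the trace end
            return sta[-n:] / np.maximum(lta[-n:], 1e-12)

        # 0.1 s of synthetic noise at 1 MHz with one injected burst as the "event".
        fs = 1_000_000
        rng = np.random.default_rng(0)
        trace = rng.normal(size=fs // 10)
        trace[50_000:50_200] += 20 * rng.normal(size=200)
        ratio = sta_lta(trace, fs)
        print(f"{np.count_nonzero(ratio > 5.0)} samples above trigger threshold")

    On terabyte-per-hour data streams, a cheap trigger like this is typically used to pick candidate windows, which can then be passed to machine learning models that do the heavier pattern extraction.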

    Goodfellow’s induced seismicity project is one of 27 at U of T – and nine from U of T Engineering – to share more than $8.2 million in funding from the Canada Foundation for Innovation’s John R. Evans Leaders Fund (read the full list of researchers and their projects below).

    Named for the late U of T President Emeritus John R. Evans, the fund equips university researchers with the technology and infrastructure they need to remain at the forefront of innovation in Canada and globally. It also helps Canadian universities attract top researchers from around the world.

    “From sustainable electric transportation and engineering of novel materials to non-invasive neuro-imaging and applications of AI in public health, U of T researchers across our three campuses are advancing some of the most important discoveries of our time,” said Leah Cowen, U of T’s associate vice-president, research.

    “Addressing such complex challenges often requires cutting-edge technology, equipment and facilities. The support provided by the Canada Foundation for Innovation will go a long way towards enabling our researchers’ important work.”

    Goodfellow’s team will use the funding to buy a triaxial geophysical imaging cell fitted with acoustic emissions sensors as well as hardware for high-frequency acquisition of acoustic emissions data. The equipment will enable them to carry out controlled experiments in the lab, test better algorithms and develop new techniques to turn the data into insights – all to better understand processes that lead to induced earthquakes.

    By learning more about how these tiny cracks and pops are related to larger seismic events such as earthquakes, the team hopes to help professionals in a wide range of sectors make better decisions. That includes industries that employ underground injection technologies – geothermal power, hydraulic fracturing and carbon sequestration, among others – along with the bodies charged with regulating them.

    “Up until now, our poor understanding of the causal links between fluid injection and large, induced earthquakes limited the economic development of these industries,” says Goodfellow.

    “Our research will help mitigate the human and environmental impacts, leading to new economic growth opportunities for Canada.”

    ______________________________________________________________________________________________________________

    Here is the full list of 27 U of T researchers who received support for their projects:

    Cristina Amon, department of mechanical & industrial engineering in the Faculty of Applied Science & Engineering: Enabling sustainable e-mobility through intelligent thermal management systems for EVs and charging infrastructure.

    Jacqueline Beaudry, department of nutritional sciences in the Temerty Faculty of Medicine and Lunenfeld-Tannenbaum Research Institute at Sinai Health: Role of pancreatic and gut hormones in energy metabolism.

    Swetaprovo Chaudhuri, U of T Institute for Aerospace Studies in the Faculty of Applied Science & Engineering: Kinetics-transport interaction towards deposition of carbon particulates in meso-channel supercritical fuel flows.

    Mark Currie, department of cell and systems biology in the Faculty of Arts & Science: Structural Biology Laboratory.

    Marcus Dillon, department of biology at U of T Mississauga: The evolutionary genomics of infectious phytopathogen emergence.

    Landon Edgar, department of pharmacology and toxicology in the Temerty Faculty of Medicine: Technologies to interrogate and control carbohydrate-mediated immunity.

    Gregory Fairn, department of biochemistry in the Temerty Faculty of Medicine and St. Michael’s Hospital: Advanced live cell imaging and isothermal calorimetry for the study of immune cell dysfunction and inflammation.

    Kevin Golovin, department of mechanical and industrial engineering in the Faculty of Applied Science & Engineering: Durable Low Ice Adhesion Coatings Laboratory.

    Sebastian Goodfellow, department of civil and mineral engineering in the Faculty of Applied Science & Engineering: A study of induced seismicity through novel triaxial experiments and data analysis methodologies.

    Giovanni Grasselli, department of civil and mineral engineering in the Faculty of Applied Science & Engineering: Towards the sustainable development of energy resources – fundamentals and implications of hydraulic fracturing technology.

    Kristin Hope, department of medical biophysics in the Temerty Faculty of Medicine and Princess Margaret Cancer Centre, University Health Network: Characterizing and unlocking the therapeutic potential of stem cells and the leukemic microenvironment.

    Elizabeth Johnson, department of psychology at U of T Mississauga: Baby Brain and Behaviour Lab (BaBBL) – electrophysiological measures of infant speech and language development.

    Omar Khan, Institute of Biomedical Engineering in the Faculty of Applied Science & Engineering and department of immunology in the Temerty Faculty of Medicine: Combination ribonucleic acid treatment technology lab.

    Marianne Koritzinsky, department of radiation oncology in the Temerty Faculty of Medicine and Princess Margaret Cancer Centre, University Health Network: Targeted therapeutics to enhance radiotherapy efficacy and safety in the era of image-guided conformal treatment.

    Christopher Lawson, department of chemical engineering & applied chemistry in the Faculty of Applied Science & Engineering: The Microbiome Engineering Laboratory for Resource Recovery.

    Fa-Hsuan Lin, department of medical biophysics in the Temerty Faculty of Medicine and Sunnybrook Research Institute: Integrated non-invasive human neuroimaging and neuromodulation platform.

    Vasanti Malik, department of nutritional sciences in the Temerty Faculty of Medicine: Child obesity and metabolic health in pregnancy – a novel approach to chronic disease prevention and planetary health.

    Rafael Montenegro-Burke, Donnelly Centre for Cellular and Biomolecular Research and department of molecular genetics in the Temerty Faculty of Medicine: Mapping the dark metabolome using click chemistry tools.

    Robert Rozeske, department of psychology at U of T Scarborough: Neuronal mechanisms of dynamic emotional behavior.

    Karun Singh, department of laboratory medicine and pathobiology in the Temerty Faculty of Medicine and Toronto Western Hospital, University Health Network: Stem cell models to investigate brain function in development and disease.

    Corliss Kin I Sio, department of Earth sciences in the Faculty of Arts & Science: Constraining source compositions and timescales of mass transport using femtosecond LA-MC-ICPMS.

    Helen Tran, department of chemistry in the Faculty of Arts & Science: Macromolecular bioelectronics encoded for self-assembly, degradability and electron transport.

    Andrea Tricco, Dalla Lana School of Public Health: Expediting knowledge synthesis using artificial intelligence – CAL®-Synthesi.SR Dashboard.

    Jay Werber, department of chemical engineering and applied chemistry in the Faculty of Applied Science & Engineering: The Advanced Membranes (AM) Laboratory for Sustainable Water Management and Resource Recovery.

    Haibo Zhang, department of physiology in the Temerty Faculty of Medicine and St. Michael’s Hospital: Real time high-resolution imaging and cell sorting for studying multi-organ repair and regeneration after lung injury.

    Gang Zheng, department of medical biophysics in the Temerty Faculty of Medicine and Princess Margaret Cancer Centre, University Health Network: Preclinical magnetic resonance imaging for targeted brain tumour therapies.

    Shurui Zhou, department of electrical and computer engineering in the Faculty of Applied Science & Engineering: Improving collaboration efficiency for fork-based software development.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Toronto (CA) is a public research university in Toronto, Ontario, Canada, located on the grounds that surround Queen’s Park. It was founded by royal charter in 1827 as King’s College, the oldest university in the province of Ontario.

    Originally controlled by the Church of England, the university assumed its present name in 1850 upon becoming a secular institution.

    As a collegiate university, it comprises eleven colleges each with substantial autonomy on financial and institutional affairs and significant differences in character and history. The university also operates two satellite campuses located in Scarborough and Mississauga.

    University of Toronto has evolved into Canada’s leading institution of learning, discovery and knowledge creation. We are proud to be one of the world’s top research-intensive universities, driven to invent and innovate.

    Our students have the opportunity to learn from and work with preeminent thought leaders through our multidisciplinary network of teaching and research faculty, alumni and partners.

    The ideas, innovations and actions of more than 560,000 graduates continue to have a positive impact on the world.

    Academically, the University of Toronto is noted for movements and curricula in literary criticism and communication theory, known collectively as the Toronto School.

    The university was the birthplace of insulin and stem cell research, and was the site of the first electron microscope in North America; the identification of the first black hole, Cygnus X-1; multi-touch technology; and the development of the theory of NP-completeness.

    The university was one of several universities involved in early research of deep learning. It receives the most annual scientific research funding of any Canadian university and is one of two members of the Association of American Universities (US) outside the United States, the other being McGill University (CA).

    The Varsity Blues are the athletic teams that represent the university in intercollegiate league matches, with ties to gridiron football, rowing and ice hockey. The earliest recorded instance of gridiron football occurred at University of Toronto’s University College in November 1861.

    The university’s Hart House is an early example of the North American student centre, simultaneously serving cultural, intellectual, and recreational interests within its large Gothic-revival complex.

    The University of Toronto has educated three Governors General of Canada, four Prime Ministers of Canada, three foreign leaders, and fourteen Justices of the Supreme Court. As of March 2019, ten Nobel laureates, five Turing Award winners, 94 Rhodes Scholars, and one Fields Medalist have been affiliated with the university.

    Early history

    The founding of a colonial college had long been the desire of John Graves Simcoe, the first Lieutenant-Governor of Upper Canada and founder of York, the colonial capital. As a University of Oxford (UK)-educated military commander who had fought in the American Revolutionary War, Simcoe believed a college was needed to counter the spread of republicanism from the United States. The Upper Canada Executive Committee recommended in 1798 that a college be established in York.

    On March 15, 1827, a royal charter was formally issued by King George IV, proclaiming “from this time one College, with the style and privileges of a University … for the education of youth in the principles of the Christian Religion, and for their instruction in the various branches of Science and Literature … to continue for ever, to be called King’s College.” The granting of the charter was largely the result of intense lobbying by John Strachan, the influential Anglican Bishop of Toronto who took office as the college’s first president. The original three-storey Greek Revival school building was built on the present site of Queen’s Park.

    Under Strachan’s stewardship, King’s College was a religious institution closely aligned with the Church of England and the British colonial elite, known as the Family Compact. Reformist politicians opposed the clergy’s control over colonial institutions and fought to have the college secularized. In 1849, after a lengthy and heated debate, the newly elected responsible government of the Province of Canada voted to rename King’s College as the University of Toronto and severed the school’s ties with the church. Having anticipated this decision, the enraged Strachan had resigned a year earlier to open Trinity College as a private Anglican seminary. University College was created as the nondenominational teaching branch of the University of Toronto. During the American Civil War, the threat of a Union blockade on British North America prompted the creation of the University Rifle Corps, which saw battle in resisting the Fenian raids on the Niagara border in 1866. The Corps was part of the Reserve Militia led by Professor Henry Croft.

    Established in 1878, the School of Practical Science was the precursor to the Faculty of Applied Science and Engineering, which has been nicknamed Skule since its earliest days. While the Faculty of Medicine opened in 1843, medical teaching was conducted by proprietary schools from 1853 until 1887, when the faculty absorbed the Toronto School of Medicine; meanwhile, the university continued to set examinations and confer medical degrees. The university opened the Faculty of Law in 1887, followed by the Faculty of Dentistry in 1888 when the Royal College of Dental Surgeons became an affiliate. Women were first admitted to the university in 1884.

    A devastating fire in 1890 gutted the interior of University College and destroyed 33,000 volumes from the library, but the university restored the building and replenished its library within two years. Over the next two decades, a collegiate system took shape as the university arranged federation with several ecclesiastical colleges, including Strachan’s Trinity College in 1904. The university operated the Royal Conservatory of Music from 1896 to 1991 and the Royal Ontario Museum from 1912 to 1968; both still retain close ties with the university as independent institutions. The University of Toronto Press was founded in 1901 as Canada’s first academic publishing house. The Faculty of Forestry, founded in 1907 with Bernhard Fernow as dean, was Canada’s first university faculty devoted to forest science. In 1910, the Faculty of Education opened its laboratory school, the University of Toronto Schools.

    World wars and post-war years

    The First and Second World Wars curtailed some university activities as undergraduate and graduate men eagerly enlisted. Intercollegiate athletic competitions and the Hart House Debates were suspended, although exhibition and interfaculty games were still held. The David Dunlap Observatory in Richmond Hill opened in 1935, followed by the University of Toronto Institute for Aerospace Studies in 1949. The university opened satellite campuses in Scarborough in 1964 and in Mississauga in 1967. The university’s former affiliated schools at the Ontario Agricultural College and Glendon Hall became fully independent of the University of Toronto and became part of University of Guelph (CA) in 1964 and York University (CA) in 1965, respectively. Beginning in the 1980s, reductions in government funding prompted more rigorous fundraising efforts.

    Since 2000

    In 2000 Kin-Yip Chun was reinstated as a professor of the university after he launched an unsuccessful lawsuit against the university alleging racial discrimination. In 2017 a human rights application was filed against the University by one of its students for allegedly delaying the investigation of sexual assault and being dismissive of their concerns. In 2018 the university cleared one of its professors of allegations of discrimination and antisemitism in an internal investigation after a complaint was filed by one of its students.

    The University of Toronto was the first Canadian university to amass a financial endowment greater than $1 billion, reaching that mark in 2007. On September 24, 2020, the university announced a $250 million gift to the Faculty of Medicine from businessman and philanthropist James C. Temerty, the largest single philanthropic donation in Canadian history. This broke the previous record for the school set in 2019, when Gerry Schwartz and Heather Reisman jointly donated $100 million for the creation of a 750,000-square-foot innovation and artificial intelligence centre.

    Research

    Since 1926, the University of Toronto has been a member of the Association of American Universities (US), a consortium of the leading North American research universities. The university manages by far the largest annual research budget of any university in Canada, with sponsored direct-cost expenditures of $878 million in 2010. In 2018, the University of Toronto was named the top research university in Canada by Research Infosource, with a sponsored research income (external sources of funding) of $1,147.584 million in 2017. In the same year, the university’s faculty averaged a sponsored research income of $428,200, while graduate students averaged a sponsored research income of $63,700. The federal government was the largest source of funding, with grants from the Canadian Institutes of Health Research; the Natural Sciences and Engineering Research Council; and the Social Sciences and Humanities Research Council amounting to about one-third of the research budget. About eight percent of research funding came from corporations, mostly in the healthcare industry.

    The first practical electron microscope was built by the physics department in 1938. During World War II, the university developed the G-suit, a life-saving garment worn by Allied fighter plane pilots, later adopted for use by astronauts. Development of the infrared chemiluminescence technique improved analyses of energy behaviours in chemical reactions. In 1963, the asteroid 2104 Toronto was discovered at the David Dunlap Observatory (CA) in Richmond Hill and is named after the university. In 1972, studies on Cygnus X-1 led to the publication of the first observational evidence proving the existence of black holes. Toronto astronomers have also discovered the Uranian moons of Caliban and Sycorax; the dwarf galaxies of Andromeda I, II and III; and the supernova SN 1987A. A pioneer in computing technology, the university designed and built UTEC, one of the world’s first operational computers, and later purchased Ferut, the second commercial computer after UNIVAC I. Multi-touch technology was developed at Toronto, with applications ranging from handheld devices to collaboration walls. The AeroVelo Atlas, which won the Igor I. Sikorsky Human Powered Helicopter Competition in 2013, was developed by the university’s team of students and graduates and was tested in Vaughan.

    The discovery of insulin at the University of Toronto in 1921 is considered among the most significant events in the history of medicine. The stem cell was discovered at the university in 1963, forming the basis for bone marrow transplantation and all subsequent research on adult and embryonic stem cells. This was the first of many findings at Toronto relating to stem cells, including the identification of pancreatic and retinal stem cells. The cancer stem cell was first identified in 1997 by Toronto researchers, who have since found stem cell associations in leukemia; brain tumors; and colorectal cancer. Medical inventions developed at Toronto include the glycaemic index; the infant cereal Pablum; the use of protective hypothermia in open heart surgery; and the first artificial cardiac pacemaker. The first successful single-lung transplant was performed at Toronto in 1981, followed by the first nerve transplant in 1988 and the first double-lung transplant in 1989. Researchers identified the maturation promoting factor that regulates cell division and discovered the T-cell receptor, which triggers responses of the immune system. The university is credited with isolating the genes that cause Fanconi anemia; cystic fibrosis; and early-onset Alzheimer’s disease, among numerous other diseases. Between 1914 and 1972, the university operated the Connaught Medical Research Laboratories, now part of the pharmaceutical corporation Sanofi-Aventis. Among the research conducted at the laboratory was the development of gel electrophoresis.

    The University of Toronto is the primary research presence that supports one of the world’s largest concentrations of biotechnology firms. More than 5,000 principal investigators reside within 2 kilometres (1.2 mi) of the university grounds in Toronto’s Discovery District, conducting $1 billion of medical research annually. MaRS Discovery District is a research park that serves commercial enterprises and the university’s technology transfer ventures. In 2008, the university disclosed 159 inventions and had 114 active start-up companies. Its SciNet Consortium operates the most powerful supercomputer in Canada.

     
  • richardmitnick 8:25 pm on July 18, 2021 Permalink | Reply
    Tags: "Curiosity and technology drive quest to reveal fundamental secrets of the universe", A very specific particle called a J/psi might provide a clearer picture of what’s going on inside a proton’s gluonic field., , Argonne-driven technology is part of a broad initiative to answer fundamental questions about the birth of matter in the universe and the building blocks that hold it all together., Artificial Intelligence, , , , , Computational Science, , , , , , Developing and fabricating detectors that search for signatures from the early universe or enhance our understanding of the most fundamental of particles., , Electron-Ion Collider (EIC) at DOE's Brookhaven National Laboratory (US) to be built inside the tunnel that currently houses the Relativistic Heavy Ion Collider [RHIC]., Exploring the hearts of protons and neutrons, , , Neutrinoless double beta decay can only happen if the neutrino is its own anti-particle., , , , , , , SLAC National Accelerator Laboratory(US), , ,   

    From DOE’s Argonne National Laboratory (US) : “Curiosity and technology drive quest to reveal fundamental secrets of the universe” 

    Argonne Lab

    From DOE’s Argonne National Laboratory (US)

    July 15, 2021
    John Spizzirri

    Argonne-driven technology is part of a broad initiative to answer fundamental questions about the birth of matter in the universe and the building blocks that hold it all together.

    Imagine the first of our species to lie beneath the glow of an evening sky. An enormous sense of awe, perhaps a little fear, fills them as they wonder at those seemingly infinite points of light and what they might mean. As humans, we evolved the capacity to ask big insightful questions about the world around us and worlds beyond us. We dare, even, to question our own origins.

    “The place of humans in the universe is important to understand,” said physicist and computational scientist Salman Habib. ​“Once you realize that there are billions of galaxies we can detect, each with many billions of stars, you understand the insignificance of being human in some sense. But at the same time, you appreciate being human a lot more.”

    The South Pole Telescope is part of a collaboration between Argonne and a number of national labs and universities to measure the CMB, considered the oldest light in the universe.

    The high altitude and extremely dry conditions of the South Pole keep water vapor from absorbing select light wavelengths.

    With no less a sense of wonder than most of us, Habib and colleagues at the U.S. Department of Energy’s (DOE) Argonne National Laboratory are actively researching these questions through an initiative that investigates the fundamental components of both particle physics and astrophysics.

    The breadth of Argonne’s research in these areas is mind-boggling. It takes us back to the very edge of time itself, to some infinitesimally small portion of a second after the Big Bang when random fluctuations in temperature and density arose, eventually forming the breeding grounds of galaxies and planets.

    It explores the heart of protons and neutrons to understand the most fundamental constructs of the visible universe, particles and energy once free in the early post-Big Bang universe, but later confined forever within a basic atomic structure as that universe began to cool.

    And it addresses slightly newer, more controversial questions about the nature of Dark Matter and Dark Energy, both of which play a dominant role in the makeup and dynamics of the universe but are little understood.
    _____________________________________________________________________________________
    Dark Energy Survey

    Dark Energy Camera [DECam], built at DOE’s Fermi National Accelerator Laboratory (US).

    NOIRLab National Optical Astronomy Observatory (US) Cerro Tololo Inter-American Observatory (CL) Víctor M. Blanco 4-meter Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7,200 feet.

    NSF NOIRLab NOAO (US) Cerro Tololo Inter-American Observatory (CL), approximately 80 km to the east of La Serena, Chile, at an altitude of 2,200 meters.

    Timeline of the inflationary universe (WMAP).

    The Dark Energy Survey (DES) is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. DES began searching the Southern skies on August 31, 2013.

    According to Einstein’s theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up. To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called dark energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.
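    In standard cosmological notation (a textbook sketch added here for context; the symbols are not defined in the DES material itself), the choice can be phrased through the Friedmann acceleration equation:

        \[
        \frac{\ddot a}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right),
        \]

    where a is the cosmic scale factor, ρ the energy density and p the pressure. Accelerating expansion (ä > 0) requires a component with pressure p < −ρc²/3; a cosmological constant, with p = −ρc², satisfies this, while ordinary matter and radiation do not.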

    DES is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the DES collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time is used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.
    _____________________________________________________________________________________

    “And this world-class research we’re doing could not happen without advances in technology,” said Argonne Associate Laboratory Director Kawtar Hafidi, who helped define and merge the different aspects of the initiative.

    “We are developing and fabricating detectors that search for signatures from the early universe or enhance our understanding of the most fundamental of particles,” she added. ​“And because all of these detectors create big data that have to be analyzed, we are developing, among other things, artificial intelligence techniques to do that as well.”

    Decoding messages from the universe

    Fleshing out a theory of the universe on cosmic or subatomic scales requires a combination of observations, experiments, theories, simulations and analyses, which in turn requires access to the world’s most sophisticated telescopes, particle colliders, detectors and supercomputers.

    Argonne is uniquely suited to this mission, equipped as it is with many of those tools, the ability to manufacture others and collaborative privileges with other federal laboratories and leading research institutions to access other capabilities and expertise.

    As lead of the initiative’s cosmology component, Habib uses many of these tools in his quest to understand the origins of the universe and what makes it tick.

    And what better way to do that than to observe it, he said.

    “If you look at the universe as a laboratory, then obviously we should study it and try to figure out what it is telling us about foundational science,” noted Habib. ​“So, one part of what we are trying to do is build ever more sensitive probes to decipher what the universe is trying to tell us.”

    To date, Argonne is involved in several significant sky surveys, which use an array of observational platforms, like telescopes and satellites, to map different corners of the universe and collect information that furthers or rejects a specific theory.

    For example, the South Pole Telescope survey, a collaboration between Argonne and a number of national labs and universities, is measuring the cosmic microwave background (CMB) [above], considered the oldest light in the universe. Variations in CMB properties, such as temperature, signal the original fluctuations in density that ultimately led to all the visible structure in the universe.

    Additionally, the Dark Energy Spectroscopic Instrument and the forthcoming Vera C. Rubin Observatory are specially outfitted, ground-based telescopes designed to shed light on dark energy and dark matter, as well as the formation of luminous structure in the universe.

    DOE’s Lawrence Berkeley National Laboratory (US) DESI spectroscopic instrument on the Mayall 4-meter telescope at Kitt Peak National Observatory, in the Quinlan Mountains in the Arizona-Sonoran Desert on the Tohono O’odham Nation, 88 kilometers (55 mi) west-southwest of Tucson, Arizona, at an altitude of 2,096 m (6,877 ft).

    NSF NOIRLab NOAO (US) Kitt Peak National Observatory on Kitt Peak in the Quinlan Mountains, home of the Mayall 4-meter telescope.

    NSF NOIRLab NOAO (US) Vera C. Rubin Observatory [LSST] Telescope, currently under construction on the El Peñón peak at Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region in northern Chile, alongside the existing Gemini South Telescope and Southern Astrophysical Research Telescope.

    Darker matters

    All the data sets derived from these observations are connected to the second component of Argonne’s cosmology push, which revolves around theory and modeling. Cosmologists combine observations, measurements and the prevailing laws of physics to form theories that resolve some of the mysteries of the universe.

    But the universe is complex, and it has an annoying tendency to throw a curve ball just when we thought we had a theory cinched. Discoveries within the past 100 years have revealed that the universe is both expanding and accelerating its expansion — realizations that came as separate but equal surprises.

    Saul Perlmutter (center) [The Supernova Cosmology Project] shared the 2006 Shaw Prize in Astronomy, the 2011 Nobel Prize in Physics, and the 2015 Breakthrough Prize in Fundamental Physics with Brian P. Schmidt (right) and Adam Riess (left) [The High-z Supernova Search Team] for providing evidence that the expansion of the universe is accelerating.

    “To say that we understand the universe would be incorrect. To say that we sort of understand it is fine,” exclaimed Habib. ​“We have a theory that describes what the universe is doing, but each time the universe surprises us, we have to add a new ingredient to that theory.”

    Modeling helps scientists get a clearer picture of whether and how those new ingredients will fit a theory. They make predictions for observations that have not yet been made, telling observers what new measurements to take.

    Habib’s group is applying this same sort of process to gain an ever-so-tentative grasp on the nature of dark energy and dark matter. While scientists can tell us that both exist and that they comprise about 68% and 26% of the universe, respectively, beyond that not much else is known.

    ______________________________________________________________________________________________________________

    Dark Matter Background
    Fritz Zwicky discovered dark matter in the 1930s when observing the movement of the Coma Cluster. Some 30 years later, Vera Rubin, a woman in STEM who was denied the Nobel Prize, did most of the work on dark matter.

    Fritz Zwicky. Credit: http://palomarskies.blogspot.com.


    Coma cluster via NASA/ESA Hubble.


    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate it had a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.
    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that the outer regions of galaxies rotate just as fast as their centers, whereas, like planets far from the Sun, the outer stars should be orbiting more slowly. The only way to explain this is if the visible galaxy is merely the center of some much larger structure, like the label at the center of a vinyl LP, whose unseen bulk keeps the rotation speed consistent from center to edge.
    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
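    To see why flat rotation curves demand unseen mass, compare the speed predicted from visible matter alone, v = √(GM/r), with the roughly constant speed Rubin measured. The short Python comparison below uses illustrative, roughly Milky-Way-like numbers (10¹¹ solar masses of visible matter, a 220 km/s flat speed) that are assumptions for the example, not values from this article.

        import numpy as np

        G = 4.30091e-6  # gravitational constant in kpc * (km/s)^2 / Msun

        def v_keplerian(r_kpc, m_visible=1e11):
            """Orbital speed if only the visible mass sat at the center:
            v = sqrt(G M / r), falling off as 1/sqrt(r)."""
            return np.sqrt(G * m_visible / r_kpc)

        def v_flat(r_kpc, v0=220.0):
            """What Rubin actually measured: a roughly constant speed."""
            return np.full_like(r_kpc, v0)

        r = np.array([5.0, 10.0, 20.0, 40.0])  # galactocentric radii in kpc
        for ri, vk, vf in zip(r, v_keplerian(r), v_flat(r)):
            print(f"r = {ri:5.1f} kpc: expected {vk:6.1f} km/s, observed ~{vf:3.0f} km/s")

    A flat curve means the enclosed mass must keep growing roughly in proportion to radius, far beyond where the starlight ends; that growing invisible envelope is the inferred dark matter halo.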

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science).


    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL).


    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970

    Dark Matter Research

    Inside the Axion Dark Matter eXperiment at the University of Washington (US). Credit: Mark Stone, University of Washington.
    _____________________________________________________________________________________

    Observations of cosmological structure — the distribution of galaxies and even of their shapes — provide clues about the nature of dark matter, which in turn feeds simple dark matter models and subsequent predictions. If observations, models and predictions aren’t in agreement, that tells scientists that there may be some missing ingredient in their description of dark matter.

    But there are also experiments that are looking for direct evidence of dark matter particles, which require highly sensitive detectors [above]. Argonne has initiated development of specialized superconducting detector technology for the detection of low-mass dark matter particles.

    This technology requires the ability to control the properties of layered materials and adjust the temperature at which the material transitions from finite to zero resistance and becomes a superconductor. And unlike other applications where scientists would like this temperature to be as high as possible — room temperature, for example — here, the transition needs to be very close to absolute zero.

    Habib refers to these dark matter detectors as traps, like those used for hunting — which, in essence, is what cosmologists are doing. Because it’s possible that dark matter doesn’t come in just one species, they need different types of traps.

    “It’s almost like you’re in a jungle in search of a certain animal, but you don’t quite know what it is — it could be a bird, a snake, a tiger — so you build different kinds of traps,” he said.

    Lab researchers are working on technologies to capture these elusive species through new classes of dark matter searches. Collaborating with other institutions, they are now designing and building a first set of pilot projects aimed at looking for dark matter candidates with low mass.

    Tuning in to the early universe

    Amy Bender is working on a different kind of detector — well, a lot of detectors — which are at the heart of a survey of the cosmic microwave background (CMB).

    “The CMB is radiation that has been around the universe for 13 billion years, and we’re directly measuring that,” said Bender, an assistant physicist at Argonne.

    The Argonne-developed detectors — all 16,000 of them — capture photons, or light particles, from that primordial sky through the aforementioned South Pole Telescope, to help answer questions about the early universe, fundamental physics and the formation of cosmic structures.

    Now, the CMB experimental effort is moving into a new phase, CMB-Stage 4 (CMB-S4).

    CMB-S4 is the next-generation ground-based cosmic microwave background experiment. With 21 telescopes at the South Pole and in the Chilean Atacama desert surveying the sky with 550,000 cryogenically cooled superconducting detectors for seven years, CMB-S4 will deliver transformative discoveries in fundamental physics, cosmology, astrophysics, and astronomy. CMB-S4 is supported by the Department of Energy Office of Science and the National Science Foundation.

    This larger project tackles even more complex topics like Inflationary Theory, which suggests that the universe expanded faster than the speed of light for a fraction of a second, shortly after the Big Bang.
    _____________________________________________________________________________________
    Inflation

    Alan Guth, from Highland Park High School and M.I.T., who first proposed cosmic inflation.

    Lambda Cold Dark Matter accelerated expansion of the universe. Credit: Alex Mittelmann, Coldcreation.

    Alan Guth’s original notes on inflation.


    _____________________________________________________________________________________

    A section of a detector array with architecture suitable for future CMB experiments, such as the upcoming CMB-S4 project. Fabricated at Argonne’s Center for Nanoscale Materials, 16,000 of these detectors currently drive measurements collected from the South Pole Telescope. (Image by Argonne National Laboratory.)

    While the science is amazing, the technology to get us there is just as fascinating.

    Technically called transition edge sensing (TES) bolometers, the detectors on the telescope are made from superconducting materials fabricated at Argonne’s Center for Nanoscale Materials, a DOE Office of Science User Facility.

    Each of the 16,000 detectors acts as a combination of very sensitive thermometer and camera. As incoming radiation is absorbed on the surface of each detector, measurements are made by supercooling them to a fraction of a degree above absolute zero. (That’s over three times as cold as Antarctica’s lowest recorded temperature.)

    Changes in heat are measured and recorded as changes in electrical resistance and will help inform a map of the CMB’s intensity across the sky.
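    The heart of that readout is the steepness of the superconducting transition. As a rough illustration (a minimal sketch with made-up numbers, not the actual parameters of the Argonne detectors), the transition can be modeled as a resistance curve that climbs from zero to the normal-state value over about a thousandth of a degree:

        import numpy as np

        def tes_resistance(T, Tc=0.1, width=0.001, Rn=1.0):
            """Phenomenological superconducting transition: resistance rises
            from 0 to the normal value Rn over a narrow width around the
            critical temperature Tc. Tc = 100 mK and a 1 mK width are
            illustrative values only."""
            return Rn / (1.0 + np.exp(-(T - Tc) / width))

        # Near Tc the curve is so steep that a tiny deposit of energy from an
        # absorbed photon produces a large, easily measured resistance change.
        for T in (0.097, 0.099, 0.100, 0.101, 0.103):
            print(f"T = {T * 1000:5.1f} mK -> R = {tes_resistance(T):.3f} * Rn")

    Biasing each of the 16,000 sensors in the middle of its transition is what turns a fraction-of-a-millikelvin temperature wiggle into a usable electrical signal.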

    CMB-S4 will focus on newer technology that will allow researchers to distinguish very specific patterns in polarized light. In this case, they are looking for what Bender calls the Holy Grail of polarization, a pattern called B-modes.

    Capturing this signal from the early universe — one far fainter than the intensity signal — will help to either confirm or disprove a generic prediction of inflation.

    It will also require the addition of 500,000 detectors distributed among 21 telescopes in two distinct regions of the world, the South Pole and the Chilean desert. There, the high altitude and extremely dry conditions keep water vapor in the atmosphere from absorbing millimeter wavelength light, like that of the CMB.

    While previous experiments have touched on this polarization, the large number of new detectors will improve sensitivity to that polarization and grow our ability to capture it.

    “Literally, we have built these cameras completely from the ground up,” said Bender. ​“Our innovation is in how to make these stacks of superconducting materials work together within this detector, where you have to couple many complex factors and then actually read out the results with the TES. And that is where Argonne has contributed, hugely.”

    Down to the basics

    Argonne’s capabilities in detector technology don’t just stop at the edge of time, nor do the initiative’s investigations just look at the big picture.

    Most of the visible universe, including galaxies, stars, planets and people, is made up of protons and neutrons. Understanding the most fundamental components of those building blocks and how they interact to make atoms and molecules and just about everything else is the realm of physicists like Zein-Eddine Meziani.

    “From the perspective of the future of my field, this initiative is extremely important,” said Meziani, who leads Argonne’s Medium Energy Physics group. ​“It has given us the ability to actually explore new concepts, develop better understanding of the science and a pathway to enter into bigger collaborations and take some leadership.”

    Taking the lead of the initiative’s nuclear physics component, Meziani is steering Argonne toward a significant role in the development of the Electron-Ion Collider, a new U.S. Nuclear Physics Program facility slated for construction at DOE’s Brookhaven National Laboratory (US).

    Argonne’s primary interest in the collider is to elucidate the role that quarks, anti-quarks and gluons play in giving mass and a quantum angular momentum, called spin, to protons and neutrons — nucleons — the particles that comprise the nucleus of an atom.


    EIC Electron Animation, Inner Proton Motion.
    Electrons colliding with ions will exchange virtual photons with the nuclear particles to help scientists ​“see” inside the nuclear particles; the collisions will produce precision 3D snapshots of the internal arrangement of quarks and gluons within ordinary nuclear matter; like a combination CT/MRI scanner for atoms. (Image by Brookhaven National Laboratory.)

    While we once thought nucleons were the finite fundamental particles of an atom, the emergence of powerful particle colliders, like the Stanford Linear Accelerator Center at Stanford University and the former Tevatron at DOE’s Fermilab, proved otherwise.

    It turns out that quarks and gluons were independent of nucleons in the extreme energy densities of the early universe; as the universe expanded and cooled, they transformed into ordinary matter.

    “There was a time when quarks and gluons were free in a big soup, if you will, but we have never seen them free,” explained Meziani. ​“So, we are trying to understand how the universe captured all of this energy that was there and put it into confined systems, like these droplets we call protons and neutrons.”

    Some of that energy is tied up in gluons, which, despite the fact that they have no mass, confer the majority of mass to a proton. So, Meziani is hoping that the Electron-Ion Collider will allow science to explore — among other properties — the origins of mass in the universe through a detailed exploration of gluons.
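    The scale of that puzzle is easy to state in round numbers (standard particle-physics values, added here for context rather than quoted from the article):

        \[
        m_u + m_u + m_d \approx (2.2 + 2.2 + 4.7)\ \mathrm{MeV}/c^2 \approx 9\ \mathrm{MeV}/c^2,
        \qquad m_p \approx 938\ \mathrm{MeV}/c^2,
        \]

    so the proton’s three valence quarks account for only about 1% of its mass; the other ~99% is the confined energy of the gluon field and quark motion, counted as mass through E = mc².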

    And just as Amy Bender is looking for the B-modes polarization in the CMB, Meziani and other researchers are hoping to use a very specific particle called a J/psi to provide a clearer picture of what’s going on inside a proton’s gluonic field.

    But producing and detecting the J/psi particle within the collider — while ensuring that the proton target doesn’t break apart — is a tricky enterprise, which requires new technologies. Again, Argonne is positioning itself at the forefront of this endeavor.

    “We are working on the conceptual designs of technologies that will be extremely important for the detection of these types of particles, as well as for testing concepts for other science that will be conducted at the Electron-Ion Collider,” said Meziani.

    Argonne also is producing detector and related technologies in its quest for a phenomenon called neutrinoless double beta decay. A neutrino is one of the particles emitted during the process of neutron radioactive beta decay and serves as a small but mighty connection between particle physics and astrophysics.

    “Neutrinoless double beta decay can only happen if the neutrino is its own anti-particle,” said Hafidi. ​“If the existence of these very rare decays is confirmed, it would have important consequences in understanding why there is more matter than antimatter in the universe.”

    Argonne scientists from different areas of the lab are working on the Neutrino Experiment with Xenon Time Projection Chamber (NEXT) collaboration to design and prototype key systems for the collaboration’s next big experiment. This includes developing a one-of-a-kind test facility and an R&D program for new, specialized detector systems.

    “We are really working on dramatic new ideas,” said Meziani. ​“We are investing in certain technologies to produce some proof of principle that they will be the ones to pursue later, that the technology breakthroughs that will take us to the highest sensitivity detection of this process will be driven by Argonne.”

    The tools of detection

    Ultimately, fundamental science is science derived from human curiosity. And while we may not always see the reason for pursuing it, more often than not, fundamental science produces results that benefit all of us. Sometimes it’s a gratifying answer to an age-old question, other times it’s a technological breakthrough intended for one science that proves useful in a host of other applications.

    Through their various efforts, Argonne scientists are aiming for both outcomes. But it will take more than curiosity and brain power to solve the questions they are asking. It will take our skills at toolmaking, like the telescopes that peer deep into the heavens and the detectors that capture hints of the earliest light or the most elusive of particles.

    We will need to employ the ultrafast computing power of new supercomputers. Argonne’s forthcoming Aurora exascale machine will analyze mountains of data to help create massive models that simulate the dynamics of the universe or the subatomic world, which, in turn, might guide new experiments — or introduce new questions.

    Depiction of ANL ALCF Cray Intel SC18 Shasta Aurora exascale supercomputer, to be built at DOE’s Argonne National Laboratory.

    And we will apply artificial intelligence to recognize patterns in complex observations — on the subatomic and cosmic scales — far more quickly than the human eye can, or use it to optimize machinery and experiments for greater efficiency and faster results.

    “I think we have been given the flexibility to explore new technologies that will allow us to answer the big questions,” said Bender. ​“What we’re developing is so cutting edge, you never know where it will show up in everyday life.”

    Funding for research mentioned in this article was provided by Argonne Laboratory Directed Research and Development; Argonne program development; DOE Office of High Energy Physics: Cosmic Frontier, South Pole Telescope-3G project, Detector R&D; and DOE Office of Nuclear Physics.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Argonne National Laboratory (US) seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems. Argonne is a science and engineering research national laboratory operated by UChicago Argonne LLC for the United States Department of Energy. The facility is located in Lemont, Illinois, outside of Chicago, and is the largest national laboratory by size and scope in the Midwest.

    Argonne had its beginnings in the Metallurgical Laboratory of the University of Chicago, formed in part to carry out Enrico Fermi’s work on nuclear reactors for the Manhattan Project during World War II. After the war, it was designated as the first national laboratory in the United States on July 1, 1946. In the post-war era the lab focused primarily on non-weapon related nuclear physics, designing and building the first power-producing nuclear reactors, helping design the reactors used by the United States’ nuclear navy, and a wide variety of similar projects. In 1994, the lab’s nuclear mission ended, and today it maintains a broad portfolio in basic science research, energy storage and renewable energy, environmental sustainability, supercomputing, and national security.

    UChicago Argonne, LLC, the operator of the laboratory, “brings together the expertise of the University of Chicago (the sole member of the LLC) with Jacobs Engineering Group Inc.” Argonne is a part of the expanding Illinois Technology and Research Corridor. Argonne formerly ran a smaller facility called Argonne National Laboratory-West (or simply Argonne-West) in Idaho next to the Idaho National Engineering and Environmental Laboratory. In 2005, the two Idaho-based laboratories merged to become the DOE’s Idaho National Laboratory.
    What would become Argonne began in 1942 as the Metallurgical Laboratory at the University of Chicago, which had become part of the Manhattan Project. The Met Lab built Chicago Pile-1, the world’s first nuclear reactor, under the stands of the University of Chicago sports stadium. Considered unsafe at that location, CP-1 was reconstructed in 1943 as CP-2 in what is today known as Red Gate Woods, then the Argonne Forest of the Cook County Forest Preserve District near Palos Hills. The lab was named after the surrounding forest, which in turn was named after the Forest of Argonne in France, where U.S. troops fought in World War I. Fermi’s pile was originally going to be constructed in the Argonne forest, and construction plans were set in motion, but a labor dispute brought the project to a halt. Since speed was paramount, the project was moved to the squash court under Stagg Field, the football stadium on the campus of the University of Chicago. Fermi assured the project’s leaders that he was sure of his calculations, which showed that the pile would not lead to a runaway reaction that could have contaminated the city.

    Other activities were added to Argonne over the next five years. On July 1, 1946, the “Metallurgical Laboratory” was formally re-chartered as Argonne National Laboratory for “cooperative research in nucleonics.” At the request of the U.S. Atomic Energy Commission, it began developing nuclear reactors for the nation’s peaceful nuclear energy program. In the late 1940s and early 1950s, the laboratory moved to a larger location in unincorporated DuPage County, Illinois and established a remote location in Idaho, called “Argonne-West,” to conduct further nuclear research.

    In quick succession, the laboratory designed and built Chicago Pile 3 (1944), the world’s first heavy-water moderated reactor, and the Experimental Breeder Reactor I (Chicago Pile 4), built in Idaho, which lit a string of four light bulbs with the world’s first nuclear-generated electricity in 1951. A complete list of the reactors designed and, in most cases, built and operated by Argonne can be viewed on the Reactors Designed by Argonne page. The knowledge gained from the Argonne experiments conducted with these reactors 1) formed the foundation for the designs of most of the commercial reactors currently used throughout the world for electric power generation and 2) informs the current evolving designs of liquid-metal reactors for future commercial power stations.

    Conducting classified research, the laboratory was heavily secured; all employees and visitors needed badges to pass a checkpoint, many of the buildings were classified, and the laboratory itself was fenced and guarded. Such alluring secrecy drew visitors both authorized—including King Leopold III of Belgium and Queen Frederica of Greece—and unauthorized. Shortly past 1 a.m. on February 6, 1951, Argonne guards discovered reporter Paul Harvey near the 10-foot (3.0 m) perimeter fence, his coat tangled in the barbed wire. Searching his car, guards found a previously prepared four-page broadcast detailing the saga of his unauthorized entrance into a classified “hot zone”. He was brought before a federal grand jury on charges of conspiracy to obtain information on national security and transmit it to the public, but was not indicted.

    Not all nuclear technology went into developing reactors, however. While designing a scanner for reactor fuel elements in 1957, Argonne physicist William Nelson Beck put his own arm inside the scanner and obtained one of the first ultrasound images of the human body. Remote manipulators designed to handle radioactive materials laid the groundwork for more complex machines used to clean up contaminated areas, sealed laboratories or caves. In 1964, the “Janus” reactor opened to study the effects of neutron radiation on biological life, providing research for guidelines on safe exposure levels for workers at power plants, laboratories and hospitals. Scientists at Argonne pioneered a technique to analyze the moon’s surface using alpha radiation, which launched aboard the Surveyor 5 in 1967 and later analyzed lunar samples from the Apollo 11 mission.

    In addition to nuclear work, the laboratory maintained a strong presence in the basic research of physics and chemistry. In 1955, Argonne chemists co-discovered the elements einsteinium and fermium, elements 99 and 100 in the periodic table. In 1962, laboratory chemists produced the first compound of the inert noble gas xenon, opening up a new field of chemical bonding research. In 1963, they discovered the hydrated electron.

    High-energy physics made a leap forward when Argonne was chosen as the site of the 12.5 GeV Zero Gradient Synchrotron, a proton accelerator that opened in 1963. A bubble chamber allowed scientists to track the motions of subatomic particles as they zipped through the chamber; in 1970, they observed the neutrino in a hydrogen bubble chamber for the first time.

    Meanwhile, the laboratory was also helping to design the reactor for the world’s first nuclear-powered submarine, the U.S.S. Nautilus, which steamed for more than 513,550 nautical miles (951,090 km). The next reactor models were the Experimental Boiling Water Reactor, the forerunner of many modern nuclear plants, and Experimental Breeder Reactor II (EBR-II), which was sodium-cooled and included a fuel recycling facility. EBR-II was later modified to test other reactor designs, including a fast-neutron reactor and, in 1982, the Integral Fast Reactor concept—a revolutionary design that reprocessed its own fuel, reduced its atomic waste and withstood safety tests simulating the same failures that triggered the Chernobyl and Three Mile Island accidents. In 1994, however, the U.S. Congress terminated funding for the bulk of Argonne’s nuclear programs.

    Argonne moved to specialize in other areas, while capitalizing on its experience in physics, chemical sciences and metallurgy. In 1987, the laboratory was the first to successfully demonstrate a pioneering technique called plasma wakefield acceleration, which accelerates particles over much shorter distances than conventional accelerators. It also cultivated a strong battery research program.

    Following a major push by then-director Alan Schriesheim, the laboratory was chosen as the site of the Advanced Photon Source, a major X-ray facility which was completed in 1995 and produced the brightest X-rays in the world at the time of its construction.

    On March 19, 2019, the Chicago Tribune reported that the laboratory was constructing the world’s most powerful supercomputer. Costing $500 million, it will have a processing power of one exaflop (a quintillion floating-point operations per second). Applications will include the analysis of stars and improvements to the power grid.

    With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    About the Advanced Photon Source

    The U. S. Department of Energy Office of Science’s Advanced Photon Source (APS) at Argonne National Laboratory is one of the world’s most productive X-ray light source facilities. The APS provides high-brightness X-ray beams to a diverse community of researchers in materials science, chemistry, condensed matter physics, the life and environmental sciences, and applied research. These X-rays are ideally suited for explorations of materials and biological structures; elemental distribution; chemical, magnetic, electronic states; and a wide range of technologically important engineering systems from batteries to fuel injector sprays, all of which are the foundations of our nation’s economic, technological, and physical well-being. Each year, more than 5,000 researchers use the APS to produce over 2,000 publications detailing impactful discoveries, and solve more vital biological protein structures than users of any other X-ray light source research facility. APS scientists and engineers innovate technology that is at the heart of advancing accelerator and light-source operations. This includes the insertion devices that produce extreme-brightness X-rays prized by researchers, lenses that focus the X-rays down to a few nanometers, instrumentation that maximizes the way the X-rays interact with samples being studied, and software that gathers and manages the massive quantity of data resulting from discovery research at the APS.


    Argonne Lab Campus

     
  • richardmitnick 9:03 pm on July 5, 2021 Permalink | Reply
    Tags: "U of T researcher launches group to help detect hidden biases in AI systems", Artificial Intelligence, As AI systems are deployed in an ever-expanding range of applications bias in AI becomes an even more critical issue., , For example: the app works 80 per cent successfully on native English speakers but only 40 per cent for people whose first language is not English ., HALT AI group, HALT was launched in May as a free service., Measuring biases present in artificial intelligence systems as a first step toward fixing them., The group has studied systems for Apple; Google; and Microsoft. among others., The majority of the time there is a training set problem., The scientists found problems with Apple and Google’s voice-to-text systems., The scientists found that Microsoft’s age-estimation AI does not perform well for certain age groups.,   

    From University of Toronto (CA) : “U of T researcher launches group to help detect hidden biases in AI systems” 

    From University of Toronto (CA)

    July 05, 2021
    Matthew Tierney

    1
    Parham Aarabi, of the department of electrical and computer engineering, helped start a research group that uncovers biases in AI systems, including some belonging to Apple, Google and Microsoft. Photo by Johnny Guatto.

    A new initiative led by University of Toronto researcher Parham Aarabi aims to measure biases present in artificial intelligence systems as a first step toward fixing them.

    AI systems often reflect biases that are present in the datasets – or, sometimes, the AI’s modelling can introduce new biases.

    “Every AI system has some kind of a bias,” says Aarabi, an associate professor of communications/computer engineering in the Edward S. Rogers Sr. department of electrical and computer engineering in the Faculty of Applied Science & Engineering. “I say that as someone who has worked on AI systems and algorithms for over 20 years.”

    Aarabi is among the academic and industry experts in the University of Toronto’s HALT AI group, which tests other organizations’ AI systems using diverse input sets. HALT AI creates a diversity report – including a diversity chart for key metrics – that shows weaknesses and suggests improvements.

    “We found that most AI teams do not perform actual quantitative validation of their system,” Aarabi says. “We are able to say, for example, ‘Look, your app works 80 per cent successfully on native English speakers, but only 40 per cent for people whose first language is not English.’”

    HALT was launched in May as a free service. The group has conducted studies on a number of popular AI systems, including some belonging to Apple, Google and Microsoft. HALT’s statistical reports provide feedback across a variety of diversity dimensions, such as gender, age and race.

    “In our own testing we found that Microsoft’s age-estimation AI does not perform well for certain age groups,” says Aarabi. “So too with Apple and Google’s voice-to-text systems: If you have a certain dialect, an accent, they can work poorly. But you do not know which dialect until you test. Similar apps fail in different ways – which is interesting, and likely indicative of the type and limitation of the training data that was used for each app.”
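
    The kind of quantitative, per-group validation described above can be sketched in a few lines of code. The sketch below is a hypothetical illustration only—HALT AI’s actual tooling is not public—and the group labels and success rates simply echo the 80-per-cent/40-per-cent example given earlier:

        # Minimal sketch of a per-group "diversity report" for a binary success metric.
        # Hypothetical illustration; HALT AI's actual tooling is not public.
        from collections import defaultdict

        def diversity_report(results, gap_threshold=0.2):
            """results: list of (group_label, succeeded) pairs from test runs."""
            totals, successes = defaultdict(int), defaultdict(int)
            for group, ok in results:
                totals[group] += 1
                successes[group] += int(ok)
            rates = {g: successes[g] / totals[g] for g in totals}
            best = max(rates.values())
            for group, rate in sorted(rates.items(), key=lambda kv: kv[1]):
                flag = "  <-- gap exceeds threshold" if best - rate > gap_threshold else ""
                print(f"{group:>25}: {rate:6.1%} ({totals[group]} samples){flag}")
            return rates

        # Example: a voice-to-text app tested on two speaker groups.
        runs = ([("native English speakers", True)] * 80
                + [("native English speakers", False)] * 20
                + [("non-native speakers", True)] * 40
                + [("non-native speakers", False)] * 60)
        diversity_report(runs)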

    HALT started early this year when AI researchers within and outside the electrical and computer engineering department began sharing their concerns about bias in AI systems. By May, the group brought aboard external experts in diversity from the private and academic sectors.

    “To truly understand and measure bias, it can’t just be a few people from U of T,” Aarabi says. “HALT is a broad group of individuals, including the heads of diversity at Fortune 500 companies as well as AI diversity experts at other academic institutions such as University College London (UK) and Stanford University (US).”

    As AI systems are deployed in an ever-expanding range of applications, bias in AI becomes an even more critical issue. While AI system performance remains a priority, a growing number of developers are also inspecting their work for inherent biases.

    “The majority of the time there is a training set problem,” Aarabi says. “The developers simply don’t have enough training data across all representative demographic groups.”

    If diverse training data doesn’t improve the AI’s performance, then the model itself may be flawed and require reprogramming.
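
    When the diagnosis is a training set problem, a common first remedy is to rebalance the data so under-represented groups carry equal weight. The sketch below shows naive per-group oversampling; it is a generic illustration of that idea, not a method attributed to HALT AI or to any of the vendors named above:

        # Naive oversampling so every demographic group is equally represented.
        # Generic illustration of rebalancing a training set; not HALT AI's method.
        import random

        def balance_by_group(examples, seed=0):
            """examples: list of (features, label, group) tuples."""
            random.seed(seed)
            by_group = {}
            for ex in examples:
                by_group.setdefault(ex[2], []).append(ex)
            target = max(len(members) for members in by_group.values())
            balanced = []
            for members in by_group.values():
                balanced.extend(members)
                balanced.extend(random.choices(members, k=target - len(members)))
            random.shuffle(balanced)
            return balanced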

    Deepa Kundur, a professor and the chair of the department of electrical and computer engineering, says HALT AI is helping to create fairer AI systems.

    “Our push for diversity starts at home, in our department, but also extends to the electrical and computer engineering community at large – including the tools that researchers innovate for society,” she says. “HALT AI is helping to ensure a way forward for equitable and fair AI.”

    “Right now is the right time for researchers and practitioners to be thinking about this,” Aarabi adds. “They need to move from high-level abstractions and be definitive about how bias reveals itself. I think we can shed some light on that.”

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Toronto (CA) is a public research university in Toronto, Ontario, Canada, located on the grounds that surround Queen’s Park. It was founded by royal charter in 1827 as King’s College, the oldest university in the province of Ontario.

    Originally controlled by the Church of England, the university assumed its present name in 1850 upon becoming a secular institution.

    As a collegiate university, it comprises eleven colleges, each with substantial autonomy on financial and institutional affairs and significant differences in character and history. The university also operates two satellite campuses located in Scarborough and Mississauga.

    University of Toronto has evolved into Canada’s leading institution of learning, discovery and knowledge creation. We are proud to be one of the world’s top research-intensive universities, driven to invent and innovate.

    Our students have the opportunity to learn from and work with preeminent thought leaders through our multidisciplinary network of teaching and research faculty, alumni and partners.

    The ideas, innovations and actions of more than 560,000 graduates continue to have a positive impact on the world.

    Academically, the University of Toronto is noted for movements and curricula in literary criticism and communication theory, known collectively as the Toronto School.

    The university was the birthplace of insulin and stem cell research, and was the site of the first electron microscope in North America, the identification of the first black hole (Cygnus X-1), the development of multi-touch technology, and the development of the theory of NP-completeness.

    The university was one of several universities involved in early research of deep learning. It receives the most annual scientific research funding of any Canadian university and is one of two members of the Association of American Universities (US) outside the United States, the other being McGill (CA).

    The Varsity Blues are the athletic teams that represent the university in intercollegiate league matches, with ties to gridiron football, rowing and ice hockey. The earliest recorded instance of gridiron football occurred at University of Toronto’s University College in November 1861.

    The university’s Hart House is an early example of the North American student centre, simultaneously serving cultural, intellectual, and recreational interests within its large Gothic-revival complex.

    The University of Toronto has educated three Governors General of Canada, four Prime Ministers of Canada, three foreign leaders, and fourteen Justices of the Supreme Court. As of March 2019, ten Nobel laureates, five Turing Award winners, 94 Rhodes Scholars, and one Fields Medalist have been affiliated with the university.

    Early history

    The founding of a colonial college had long been the desire of John Graves Simcoe, the first Lieutenant-Governor of Upper Canada and founder of York, the colonial capital. As a University of Oxford (UK)-educated military commander who had fought in the American Revolutionary War, Simcoe believed a college was needed to counter the spread of republicanism from the United States. The Upper Canada Executive Committee recommended in 1798 that a college be established in York.

    On March 15, 1827, a royal charter was formally issued by King George IV, proclaiming “from this time one College, with the style and privileges of a University … for the education of youth in the principles of the Christian Religion, and for their instruction in the various branches of Science and Literature … to continue for ever, to be called King’s College.” The granting of the charter was largely the result of intense lobbying by John Strachan, the influential Anglican Bishop of Toronto who took office as the college’s first president. The original three-storey Greek Revival school building was built on the present site of Queen’s Park.

    Under Strachan’s stewardship, King’s College was a religious institution closely aligned with the Church of England and the British colonial elite, known as the Family Compact. Reformist politicians opposed the clergy’s control over colonial institutions and fought to have the college secularized. In 1849, after a lengthy and heated debate, the newly elected responsible government of the Province of Canada voted to rename King’s College as the University of Toronto and severed the school’s ties with the church. Having anticipated this decision, the enraged Strachan had resigned a year earlier to open Trinity College as a private Anglican seminary. University College was created as the nondenominational teaching branch of the University of Toronto. During the American Civil War, the threat of a Union blockade on British North America prompted the creation of the University Rifle Corps, which saw battle in resisting the Fenian raids on the Niagara border in 1866. The Corps was part of the Reserve Militia led by Professor Henry Croft.

    Established in 1878, the School of Practical Science was the precursor to the Faculty of Applied Science and Engineering, which has been nicknamed Skule since its earliest days. While the Faculty of Medicine opened in 1843, medical teaching was conducted by proprietary schools from 1853 until 1887, when the faculty absorbed the Toronto School of Medicine. Meanwhile, the university continued to set examinations and confer medical degrees. The university opened the Faculty of Law in 1887, followed by the Faculty of Dentistry in 1888 when the Royal College of Dental Surgeons became an affiliate. Women were first admitted to the university in 1884.

    A devastating fire in 1890 gutted the interior of University College and destroyed 33,000 volumes from the library, but the university restored the building and replenished its library within two years. Over the next two decades, a collegiate system took shape as the university arranged federation with several ecclesiastical colleges, including Strachan’s Trinity College in 1904. The university operated the Royal Conservatory of Music from 1896 to 1991 and the Royal Ontario Museum from 1912 to 1968; both still retain close ties with the university as independent institutions. The University of Toronto Press was founded in 1901 as Canada’s first academic publishing house. The Faculty of Forestry, founded in 1907 with Bernhard Fernow as dean, was Canada’s first university faculty devoted to forest science. In 1910, the Faculty of Education opened its laboratory school, the University of Toronto Schools.

    World wars and post-war years

    The First and Second World Wars curtailed some university activities as undergraduate and graduate men eagerly enlisted. Intercollegiate athletic competitions and the Hart House Debates were suspended although exhibition and interfaculty games were still held. The David Dunlap Observatory in Richmond Hill opened in 1935 followed by the University of Toronto Institute for Aerospace Studies in 1949. The university opened satellite campuses in Scarborough in 1964 and in Mississauga in 1967. The university’s former affiliated schools at the Ontario Agricultural College and Glendon Hall became fully independent of the University of Toronto and became part of University of Guelph (CA) in 1964 and York University (CA) in 1965 respectively. Beginning in the 1980s reductions in government funding prompted more rigorous fundraising efforts.

    Since 2000

    In 2000 Kin-Yip Chun was reinstated as a professor of the university after he launched an unsuccessful lawsuit against the university alleging racial discrimination. In 2017 a human rights application was filed against the University by one of its students for allegedly delaying the investigation of sexual assault and being dismissive of their concerns. In 2018 the university cleared one of its professors of allegations of discrimination and antisemitism in an internal investigation after a complaint was filed by one of its students.

    The University of Toronto was the first Canadian university to amass a financial endowment greater than $1 billion, in 2007. On September 24, 2020, the university announced a $250 million gift to the Faculty of Medicine from businessman and philanthropist James C. Temerty—the largest single philanthropic donation in Canadian history. This broke the previous record for the school, set in 2019 when Gerry Schwartz and Heather Reisman jointly donated $100 million for the creation of a 750,000-square-foot innovation and artificial intelligence centre.

    Research

    Since 1926 the University of Toronto has been a member of the Association of American Universities (US) a consortium of the leading North American research universities. The university manages by far the largest annual research budget of any university in Canada with sponsored direct-cost expenditures of $878 million in 2010. In 2018 the University of Toronto was named the top research university in Canada by Research Infosource with a sponsored research income (external sources of funding) of $1,147.584 million in 2017. In the same year the university’s faculty averaged a sponsored research income of $428,200 while graduate students averaged a sponsored research income of $63,700. The federal government was the largest source of funding with grants from the Canadian Institutes of Health Research; the Natural Sciences and Engineering Research Council; and the Social Sciences and Humanities Research Council amounting to about one-third of the research budget. About eight percent of research funding came from corporations- mostly in the healthcare industry.

    The first practical electron microscope was built by the physics department in 1938. During World War II, the university developed the G-suit, a life-saving garment worn by Allied fighter plane pilots that was later adopted for use by astronauts. Development of the infrared chemiluminescence technique improved analyses of energy behaviours in chemical reactions. In 1963, the asteroid 2104 Toronto was discovered at the David Dunlap Observatory (CA) in Richmond Hill and named after the university. In 1972, studies on Cygnus X-1 led to the publication of the first observational evidence proving the existence of black holes. Toronto astronomers have also discovered the Uranian moons Caliban and Sycorax; the dwarf galaxies Andromeda I, II and III; and the supernova SN 1987A. A pioneer in computing technology, the university designed and built UTEC, one of the world’s first operational computers, and later purchased Ferut, the second commercial computer after UNIVAC I. Multi-touch technology was developed at Toronto, with applications ranging from handheld devices to collaboration walls. The AeroVelo Atlas, which won the Igor I. Sikorsky Human Powered Helicopter Competition in 2013, was developed by the university’s team of students and graduates and was tested in Vaughan.

    The discovery of insulin at the University of Toronto in 1921 is considered among the most significant events in the history of medicine. The stem cell was discovered at the university in 1963 forming the basis for bone marrow transplantation and all subsequent research on adult and embryonic stem cells. This was the first of many findings at Toronto relating to stem cells including the identification of pancreatic and retinal stem cells. The cancer stem cell was first identified in 1997 by Toronto researchers who have since found stem cell associations in leukemia; brain tumors; and colorectal cancer. Medical inventions developed at Toronto include the glycaemic index; the infant cereal Pablum; the use of protective hypothermia in open heart surgery; and the first artificial cardiac pacemaker. The first successful single-lung transplant was performed at Toronto in 1981 followed by the first nerve transplant in 1988; and the first double-lung transplant in 1989. Researchers identified the maturation promoting factor that regulates cell division and discovered the T-cell receptor which triggers responses of the immune system. The university is credited with isolating the genes that cause Fanconi anemia; cystic fibrosis; and early-onset Alzheimer’s disease among numerous other diseases. Between 1914 and 1972 the university operated the Connaught Medical Research Laboratories- now part of the pharmaceutical corporation Sanofi-Aventis. Among the research conducted at the laboratory was the development of gel electrophoresis.

    The University of Toronto is the primary research presence that supports one of the world’s largest concentrations of biotechnology firms. More than 5,000 principal investigators reside within 2 kilometres (1.2 mi) of the university grounds in Toronto’s Discovery District, conducting $1 billion of medical research annually. MaRS Discovery District is a research park that serves commercial enterprises and the university’s technology transfer ventures. In 2008, the university disclosed 159 inventions and had 114 active start-up companies. Its SciNet Consortium operates the most powerful supercomputer in Canada.

     
  • richardmitnick 9:11 pm on July 2, 2021 Permalink | Reply
    Tags: "AI Designs Quantum Physics Experiments Beyond What Any Human Has Conceived", Artificial Intelligence, MELVIN had seemingly solved the problem of creating highly complex entangled states involving multiple photons. How?, MELVIN was a machine-learning algorithm., , , The algorithm had rediscovered a type of experimental arrangement that had been devised in the early 1990s., When two photons interact they become entangled and both can only be mathematically described using a single shared quantum state.   

    From Scientific American : “AI Designs Quantum Physics Experiments Beyond What Any Human Has Conceived” 

    From Scientific American

    July 2, 2021
    Anil Ananthaswamy

    1
    Credit: Getty Images.

    Quantum physicist Mario Krenn remembers sitting in a café in Vienna in early 2016, poring over computer printouts, trying to make sense of what MELVIN had found. MELVIN was a machine-learning algorithm Krenn had built, a kind of artificial intelligence. Its job was to mix and match the building blocks of standard quantum experiments and find solutions to new problems. And it did find many interesting ones. But there was one that made no sense.

    “The first thing I thought was, ‘My program has a bug, because the solution cannot exist,’” Krenn says. MELVIN had seemingly solved the problem of creating highly complex entangled states involving multiple photons (entangled states being those that once made Albert Einstein invoke the specter of “spooky action at a distance”). Krenn and his colleagues had not explicitly provided MELVIN the rules needed to generate such complex states, yet it had found a way. Eventually, he realized that the algorithm had rediscovered a type of experimental arrangement that had been devised in the early 1990s. But those experiments had been much simpler. MELVIN had cracked a far more complex puzzle.

    “When we understood what was going on, we were immediately able to generalize [the solution],” says Krenn, who is now at the University of Toronto (CA). Since then, other teams have started performing the experiments identified by MELVIN, allowing them to test the conceptual underpinnings of quantum mechanics in new ways. Meanwhile Krenn, Anton Zeilinger of the University of Vienna [Universität Wien] (AT) and their colleagues have refined their machine-learning algorithms. Their latest effort, an AI called THESEUS, has upped the ante: it is orders of magnitude faster than MELVIN, and humans can readily parse its output. While it would take Krenn and his colleagues days or even weeks to understand MELVIN’s meanderings, they can almost immediately figure out what THESEUS is saying.

    “It is amazing work,” says theoretical quantum physicist Renato Renner of the Institute for Theoretical Physics at the Swiss Federal Institute of Technology ETH Zürich [Eidgenössische Technische Hochschule Zürich)](CH), who reviewed a 2020 study about THESEUS by Krenn and Zeilinger but was not directly involved in these efforts.

    Krenn stumbled on this entire research program somewhat by accident when he and his colleagues were trying to figure out how to experimentally create quantum states of photons entangled in a very particular manner: When two photons interact, they become entangled, and both can only be mathematically described using a single shared quantum state. If you measure the state of one photon, the measurement instantly fixes the state of the other even if the two are kilometers apart (hence Einstein’s derisive comments on entanglement being “spooky”).

    In 1989 three physicists—Daniel Greenberger, the late Michael Horne and Zeilinger—described an entangled state that came to be known as “GHZ” (after their initials). It involved four photons, each of which could be in a quantum superposition of, say, two states, 0 and 1 (a quantum state called a qubit). In their paper, the GHZ state involved entangling four qubits such that the entire system was in a two-dimensional quantum superposition of states 0000 and 1111. If you measured one of the photons and found it in state 0, the superposition would collapse, and the other photons would also be in state 0. The same went for state 1. In the late 1990s Zeilinger and his colleagues experimentally observed GHZ states using three qubits for the first time.

    Krenn and his colleagues were aiming for GHZ states of higher dimensions. They wanted to work with three photons, where each photon had a dimensionality of three, meaning it could be in a superposition of three states: 0, 1 and 2. This quantum state is called a qutrit. The entanglement the team was after was a three-dimensional GHZ state that was a superposition of states 000, 111 and 222. Such states are important ingredients for secure quantum communications and faster quantum computing. In late 2013 the researchers spent weeks designing experiments on blackboards and doing the calculations to see if their setups could generate the required quantum states. But each time they failed. “I thought, ‘This is absolutely insane. Why can’t we come up with a setup?’” Krenn says.
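
    In bra–ket notation, the two states described above can be written out explicitly; this is a restatement of the article’s description with standard normalization, not a formula quoted from the original papers:

        % Four-qubit GHZ state (two-dimensional superposition):
        |\mathrm{GHZ}\rangle = \tfrac{1}{\sqrt{2}} ( |0000\rangle + |1111\rangle )

        % Three-qutrit, three-dimensional GHZ state sought by Krenn's team:
        |\mathrm{GHZ}_3\rangle = \tfrac{1}{\sqrt{3}} ( |000\rangle + |111\rangle + |222\rangle )

    In either case, measuring any one photon collapses the superposition and fixes the states of the others, exactly as described for the four-qubit case above.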

    To speed up the process, Krenn first wrote a computer program that took an experimental setup and calculated the output. Then he upgraded the program to allow it to incorporate in its calculations the same building blocks that experimenters use to create and manipulate photons on an optical bench: lasers, nonlinear crystals, beam splitters, phase shifters, holograms, and the like. The program searched through a large space of configurations by randomly mixing and matching the building blocks, performed the calculations and spat out the result. MELVIN was born. “Within a few hours, the program found a solution that we scientists—three experimentalists and one theorist—could not come up with for months,” Krenn says. “That was a crazy day. I could not believe that it happened.”

    Then he gave MELVIN more smarts. Anytime it found a setup that did something useful, MELVIN added that setup to its toolbox. “The algorithm remembers that and tries to reuse it for more complex solutions,” Krenn says.
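
    That loop—assemble a random setup, evaluate it, remember what works—can be sketched schematically. The code below is a reconstruction from this article’s description only; MELVIN’s real building blocks, physics simulation and scoring are not shown here, so the toolbox names and the toy score are assumptions:

        # Schematic of a MELVIN-style search: randomly assemble optical setups from
        # a toolbox, keep the useful ones, and reuse them as new building blocks.
        import random

        TOOLBOX = ["laser", "nonlinear_crystal", "beam_splitter",
                   "phase_shifter", "hologram", "mirror"]

        def simulate(setup):
            # Placeholder for the physics calculation of a setup's output state.
            # Here: a toy "usefulness" score so that the sketch actually runs.
            return random.random() * len(set(setup))

        def search(n_trials=100_000, max_len=8, threshold=4.5):
            memory = []       # useful sub-setups, reusable as composite blocks
            solutions = []
            for _ in range(n_trials):
                blocks = TOOLBOX + memory
                setup = [random.choice(blocks)
                         for _ in range(random.randint(2, max_len))]
                score = simulate(setup)
                if score > threshold:
                    memory.append(tuple(setup))   # remember and reuse the setup
                    solutions.append((setup, score))
            return solutions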

    It was this more evolved MELVIN that left Krenn scratching his head in a Viennese café. He had set it running with an experimental toolbox that contained two crystals, each capable of generating a pair of photons entangled in three dimensions. Krenn’s naive expectation was that MELVIN would find configurations that combined these pairs of photons to create entangled states of at most nine dimensions. But “it actually found one solution, an extremely rare case, that has much higher entanglement than the rest of the states,” Krenn says.

    Eventually, he figured out that MELVIN had used a technique that multiple teams had developed nearly three decades ago. In 1991 one method was designed by Xin Yu Zou, Li Jun Wang and Leonard Mandel, all then at the University of Rochester (US). And in 1994 Zeilinger, then at the University of Innsbruck [Leopold-Franzens-Universität Innsbruck] (AT), and his colleagues came up with another. Conceptually, these experiments attempted something similar, but the configuration that Zeilinger and his colleagues devised is simpler to understand. It starts with one crystal that generates a pair of photons (A and B). The paths of these photons go right through another crystal, which can also generate two photons (C and D). The paths of photon A from the first crystal and of photon C from the second overlap exactly and lead to the same detector. If that detector clicks, it is impossible to tell whether the photon originated from the first or the second crystal. The same goes for photons B and D.

    A phase shifter is a device that effectively increases the path a photon travels by some fraction of its wavelength. If you were to introduce a phase shifter in one of the paths between the crystals and keep changing the amount of phase shift, you could cause constructive and destructive interference at the detectors. For example, each of the crystals could be generating, say, 1,000 pairs of photons per second. With constructive interference, the detectors would register 4,000 pairs of photons per second. And with destructive interference, they would detect none: the system as a whole would not create any photons even though individual crystals would be generating 1,000 pairs a second. “That is actually quite crazy, when you think about it,” Krenn says.
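
    The arithmetic behind those numbers is ordinary two-path interference; the worked equation below restates the example and is not a calculation taken from the cited papers. If each crystal contributes the same pair-generation amplitude A and the phase shifter adds a relative phase \varphi between the overlapping paths, the detected pair rate scales as

        R(\varphi) = \left| A + A e^{i\varphi} \right|^{2} = 2\,|A|^{2}\,(1 + \cos\varphi)

    With |A|^{2} = 1{,}000 pairs per second, this gives R(0) = 4{,}000 pairs per second (fully constructive) and R(\pi) = 0 (fully destructive), matching the figures above.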

    MELVIN’s funky solution involved such overlapping paths. What had flummoxed Krenn was that the algorithm had only two crystals in its toolbox. And instead of using those crystals at the beginning of the experimental setup, it had wedged them inside an interferometer (a device that splits the path of, say, a photon into two and then recombines them). After much effort, he realized that the setup MELVIN had found was equivalent to one involving more than two crystals, each generating pairs of photons, such that their paths to the detectors overlapped. The configuration could be used to generate high-dimensional entangled states.

    Quantum physicist Nora Tischler, who was a Ph.D. student working with Zeilinger on an unrelated topic when MELVIN was being put through its paces, was paying attention to these developments. “It was kind of clear from the beginning [that such an] experiment wouldn’t exist if it hadn’t been discovered by an algorithm,” she says.

    Besides generating complex entangled states, the setup using more than two crystals with overlapping paths can be employed to perform a generalized form of Zeilinger’s 1994 quantum interference experiments with two crystals. Aephraim Steinberg, an experimentalist at the University of Toronto, who is a colleague of Krenn’s but has not worked on these projects, is impressed by what the AI found. “This is a generalization that (to my knowledge) no human dreamed up in the intervening decades and might never have done,” he says. “It’s a gorgeous first example of the kind of new explorations these thinking machines can take us on.”

    In one such generalized configuration with four crystals, each generating a pair of photons, and overlapping paths leading to four detectors, quantum interference can create situations where either all four detectors click (constructive interference) or none of them do so (destructive interference).

    But until recently, carrying out such an experiment remained a distant dream. Then, in a March preprint paper, a team led by Lan-Tian Feng of the University of Science and Technology [中国科学技术大学] (CN) at Chinese Academy of Sciences [中国科学院] (CN), in collaboration with Krenn, reported that they had fabricated the entire setup on a single photonic chip and performed the experiment. The researchers collected data for more than 16 hours: a feat made possible because of the photonic chip’s incredible optical stability, something that would have been impossible to achieve in a larger-scale tabletop experiment. For starters, the setup would require a square meter’s worth of optical elements precisely aligned on an optical bench, Steinberg says. Besides, “a single optical element jittering or drifting by a thousandth of the diameter of a human hair during those 16 hours could be enough to wash out the effect,” he says.

    During their early attempts to simplify and generalize what MELVIN had found, Krenn and his colleagues realized that the solution resembled abstract mathematical forms called graphs, which contain vertices and edges and are used to depict pairwise relations between objects. For these quantum experiments, every path a photon takes is represented by a vertex. And a crystal, for example, is represented by an edge connecting two vertices. MELVIN first produced such a graph and then performed a mathematical operation on it. The operation, called “perfect matching,” involves generating an equivalent graph in which each vertex is connected to only one edge. This process makes calculating the final quantum state much easier, although it is still hard for humans to understand.
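
    The graph picture lends itself to a compact sketch. Below, vertices stand for photon paths and each edge for a crystal connecting two paths, as described above; the brute-force enumeration of perfect matchings is a generic illustration, not MELVIN’s actual routine:

        # Enumerate the perfect matchings of a small undirected graph by brute force.
        # Vertices = photon paths, edges = crystals (per the description above).
        import itertools

        def perfect_matchings(vertices, edges):
            n = len(vertices)
            if n % 2:
                return []
            found = []
            for subset in itertools.combinations(edges, n // 2):
                covered = [v for edge in subset for v in edge]
                if len(set(covered)) == n:   # every vertex matched exactly once
                    found.append(subset)
            return found

        # Four photon paths with four crystals arranged in a ring:
        paths = ["a", "b", "c", "d"]
        crystals = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")]
        print(perfect_matchings(paths, crystals))
        # -> [(('a', 'b'), ('c', 'd')), (('b', 'c'), ('d', 'a'))]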

    That changed with MELVIN’s successor THESEUS, which generates much simpler graphs by winnowing the first complex graph representing a solution that it finds down to the bare minimum number of edges and vertices (such that any further deletion destroys the setup’s ability to generate the desired quantum states). Such graphs are simpler than MELVIN’s perfect matching graphs, so it is even easier to make sense of any AI-generated solution.

    Renner is particularly impressed by THESEUS’s human-interpretable outputs. “The solution is designed in such a way that the number of connections in the graph is minimized,” he says. “And that’s naturally a solution we can better understand than if you had a very complex graph.”

    Eric Cavalcanti of Griffith University (AU) is both impressed by the work and circumspect about it. “These machine-learning techniques represent an interesting development. For a human scientist looking at the data and interpreting it, some of the solutions may look like ‘creative’ new solutions. But at this stage, these algorithms are still far from a level where it could be said that they are having truly new ideas or coming up with new concepts,” he says. “On the other hand, I do think that one day they will get there. So these are baby steps—but we have to start somewhere.”

    Steinberg agrees. “For now, they are just amazing tools,” he says. “And like all the best tools, they’re already enabling us to do some things we probably wouldn’t have done without them.”

    See the full article here .


    Please help promote STEM in your local schools.


    Stem Education Coalition

    Scientific American , the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 10:26 am on June 17, 2021 Permalink | Reply
    Tags: "An Ally for Alloys", , Artificial Intelligence, “XMAT”—eXtreme environment MATerials—consortium, , , , Stronger materials are key to producing energy efficiently resulting in economic and decarbonization benefits.   

    From DOE’s Pacific Northwest National Laboratory (US) : “An Ally for Alloys” 

    From DOE’s Pacific Northwest National Laboratory (US)

    June 16, 2021
    Tim Ledbetter

    1

    Machine learning techniques have contributed to progress in science and technology fields ranging from health care to high-energy physics. Now, machine learning is poised to help accelerate the development of stronger alloys, particularly stainless steels, for America’s thermal power generation fleet. Stronger materials are key to producing energy efficiently, resulting in economic and decarbonization benefits.

    “The use of ultra-high-strength steels in power plants dates back to the 1950s and has benefited from gradual improvements in the materials over time,” says Osman Mamun, a postdoctoral research associate at Pacific Northwest National Laboratory (PNNL). “If we can find ways to speed up improvements or create new materials, we could see enhanced efficiency in plants that also reduces the amount of carbon emitted into the atmosphere.”

    Mamun is the lead author on two recent, related journal articles that reveal new strategies for machine learning’s application in the design of advanced alloys. The articles chronicle the research outcomes of a joint effort between PNNL and the DOE National Energy Technology Lab (US). In addition to Mamun, the research team included PNNL’s Arun Sathanur and Ram Devanathan and NETL’s Madison Wenzlick and Jeff Hawk.

    The work was funded under the Department of Energy’s (US) Office of Fossil Energy via the “XMAT”—eXtreme environment MATerials—consortium, which includes research contributions from seven DOE national laboratories. The consortium seeks to accelerate the development of improved heat-resistant alloys for various power plant components and to predict the alloys’ long-term performance.

    The inside story of power plants

    A thermal power plant’s internal environment is unforgiving. Operating temperatures of more than 650 degrees Celsius and stresses exceeding 50 megapascals put a plant’s steel components to the test.

    “But also, that high temperature and pressure, along with reliable components, are critical in driving better thermodynamic efficiency that leads to reduced carbon emissions and increased cost-effectiveness,” Mamun explains.

    The PNNL–NETL collaboration focused on two material types. Austenitic stainless steel is widely used in plants because it offers strength and excellent corrosion resistance, but its service life at high temperatures is limited. Ferritic-martensitic steel that contains chromium in the 9 to 12 percent range also offers strength benefits but can be prone to oxidation and corrosion. Plant operators want materials that resist rupturing and last for decades.

    Over time, “trial and error” experimental approaches have incrementally improved steel, but are inefficient, time-consuming, and costly. It is crucial to accelerate the development of novel materials with superior properties.

    Models for predicting rupture strength and life

    Recent advances in computational modeling and machine learning, Mamun says, have become important new tools in the quest for achieving better materials more quickly.

    Machine learning, a form of artificial intelligence, applies an algorithm to datasets to develop faster solutions for science problems. This capability is making a big difference in research worldwide, in some cases shaving considerable time off scientific discovery and technology developments.

    The PNNL–NETL research team’s application of machine learning was described in their first journal article, published March 9 in Scientific Reports.

    2
    PNNL’s distinctive capabilities in joining steel to aluminum alloys enable lightweight vehicle technologies for sustainable transportation. Photo by Andrea Starr | Pacific Northwest National Laboratory.

    The paper recounts the team’s effort to enhance and analyze stainless steel datasets, contributed by NETL team members, with three different algorithms. The ultimate goal was to construct an accurate predictive model for the rupture strength of the two types of alloys. The team concluded that an algorithm known as the Gradient Boosted Decision Tree best met the needs for building machine learning models for accurate prediction of rupture strength.
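
    A minimal sketch of that modelling approach follows. The alloy descriptors, data and hyperparameters here are invented stand-ins—the team’s actual datasets and settings are in the Scientific Reports paper—so this illustrates only the gradient boosted decision tree technique itself:

        # Hedged sketch: gradient boosted decision trees for rupture strength.
        # Features, data and hyperparameters are hypothetical stand-ins.
        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        # Toy alloy descriptors: wt% Cr, wt% Ni, test temperature (deg C).
        X = rng.uniform([8, 8, 550], [20, 14, 700], size=(500, 3))
        # Fake rupture strength (MPa): rises with Cr, falls with temperature.
        y = 600 + 5 * X[:, 0] - 0.5 * X[:, 2] + rng.normal(0, 10, 500)

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05,
                                          max_depth=3)
        model.fit(X_train, y_train)
        print("held-out R^2:", model.score(X_test, y_test))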

    Further, the researchers maintain that integrating the resulting models into existing alloy design strategies could speed the identification of promising stainless steels that possess superior properties for dealing with stress and strain.

    “This research project not only took a step toward better approaches for extending the operating envelope of steel in power plants, but also demonstrated machine learning models grounded in physics to enable interpretation by domain scientists,” says research team member Ram Devanathan, a PNNL computational materials scientist. Devanathan leads the XMAT consortium’s data science thrust and serves on the organization’s steering committee.

    The project team’s second article was published in npj Materials Degradation’s April 16 edition.

    The team concluded in the paper that a machine-learning-based predictive model can reliably estimate the rupture life of the two alloys. The researchers also described a methodology to generate synthetic alloys that could be used to augment existing sparse stainless steel datasets, and identified the limitations of such an approach. Using these “hypothetical alloys” in machine learning models makes it possible to assess the performance of candidate materials without first synthesizing them in a laboratory.
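
    That augmentation idea can be sketched generically: perturb the feature vectors of real alloys to produce plausible “hypothetical alloys,” then screen the candidates with the trained model before anything is synthesized in a laboratory. The function below is an assumption-laden illustration, not the specific methodology of the npj Materials Degradation paper:

        # Hedged sketch: generate synthetic alloys by jittering real feature vectors.
        # Generic illustration; not the paper's specific augmentation methodology.
        import numpy as np

        def synthesize(X_real, n_new, noise_frac=0.05, seed=0):
            """Perturb randomly chosen real alloy feature vectors by a few percent."""
            rng = np.random.default_rng(seed)
            base = X_real[rng.integers(0, len(X_real), size=n_new)]
            noise = rng.normal(1.0, noise_frac, size=base.shape)
            return base * noise

        # Usage: augment a sparse dataset, then screen candidates with the model.
        # X_aug = np.vstack([X_real, synthesize(X_real, n_new=1000)])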

    “The findings build on the earlier paper’s conclusions and represent another step toward establishing interpretable models of alloy performance in extreme environments, while also providing insights into data set development,” Devanathan says. “Both papers demonstrate XMAT’s thought leadership in this rapidly growing field.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Pacific Northwest National Laboratory (PNNL) (US) is one of the United States Department of Energy National Laboratories, managed by the Department of Energy’s Office of Science. The main campus of the laboratory is in Richland, Washington.

    PNNL scientists conduct basic and applied research and development to strengthen U.S. scientific foundations for fundamental research and innovation; prevent and counter acts of terrorism through applied research in information analysis, cyber security, and the nonproliferation of weapons of mass destruction; increase the U.S. energy capacity and reduce dependence on imported oil; and reduce the effects of human activity on the environment. PNNL has been operated by Battelle Memorial Institute since 1965.

     
  • richardmitnick 8:13 am on June 14, 2021 Permalink | Reply
    Tags: "The road ahead- Raquel Urtasun's startup to ‘unleash full power of AI’ on self-driving cars", , Artificial Intelligence, , , , Women in STEM-Raquel Urtasun   

    From University of Toronto (CA) : Women in STEM-Raquel Urtasun “The road ahead- Raquel Urtasun’s startup to ‘unleash full power of AI’ on self-driving cars” 

    From University of Toronto (CA)

    June 10, 2021
    Rahul Kalvapalle

    1
    Raquel Urtasun, a U of T professor of computer science and world-leading expert in machine learning and computer vision, has launched her own Toronto-based, self-driving vehicle company with more than $100 million in funding (photo by Natalia Dolan)

    More than $100 million in funding. Two decades of artificial intelligence expertise. Ten years of experience in self-driving technology. A 40-strong team of scientists and engineers.

    The list of resources at Raquel Urtasun’s fingertips as she takes the wheel of Waabi, an autonomous vehicle startup, is impressive to say the least. The goal? Use AI to finally resolve the technical and financial challenges that have hindered the full commercialization of self-driving technology.

    It’s the first foray into entrepreneurship for Urtasun, a professor of computer science at the University of Toronto and one of the world’s leading experts in machine learning and computer vision. She says she was inspired to start her own company after four years as chief scientist and head of Uber ATG’s self-driving car lab in Toronto, where she realized the need for a new generation of self-driving technologies that leverage AI’s full potential.

    “The thought of what would be the best way to do this grew and grew in my head until it became clear that, if you really want to change technology, the best way to do it is to start a new company,” Urtasun says.

    Urtasun’s new venture emerged from stealth mode earlier this week to announce one of the largest rounds of initial financing ever secured by a Canadian startup, raising more than $100 million from investors including Silicon Valley-based Khosla Ventures and Uber. Other investors include fellow U of T AI luminaries Geoffrey Hinton, a University Professor Emeritus, and Sanja Fidler, an associate professor of computer science, as well as Stanford University’s Fei-Fei Li and Pieter Abbeel of the University of California, Berkeley.

    Urtasun says the self-driving industry’s current players aren’t taking full advantage of the power of AI.

    “There is a little bit of AI there, but it doesn’t have a prominent role. Instead, it’s solving very specific sub-problems within the massive software stack – or brain of the self-driving car,” she says. “This causes difficulty in that it requires really complex, time-consuming manual tuning.

    “As a consequence of this, scaling the technology is costly and technically very challenging.”

    Waabi addresses this by utilizing “deep learning, probabilistic inference and complex optimization” to create a new class of algorithms, the likes of which Urtasun says have never been seen before in industry or academia.

    Key to Waabi’s approach is its novel autonomous system – essentially, the software brain of the self-driving vehicle – that is “end-to-end trainable,” meaning the entire software stack can automatically learn from data, removing the need for constant manual tuning and tweaking.

    The system is also “interpretable and explainable,” meaning it’s possible to deduce why it opts for certain manoeuvers over others – crucial for safety verification.

    It’s also capable of complex reasoning, which Urtasun says is vital for eventual real-world applications.

    “If you think about when you’re driving and arrive at an intersection, there’s a lot of things happening in your brain – you do very complex inference about what everybody’s doing at the intersection, how it will affect you, etc.” Urtasun says. “That’s what our new generation of algorithms provides – this ability to do really complex reasoning within the AI system.”

    Waabi also has a revolutionary simulator system that can test the algorithms and software with “an unprecedented level of fidelity,” Urtasun says.

    “When people in the industry say they test millions of miles of simulation, they’re really only testing the motion-planning component – which is one piece among this big software stack,” she says.

    “Waabi has the ability to simulate how the world looks at scale, how sensors observe the scene and the behaviours of humans in a way that’s very realistic and in real time.”

    That means significantly fewer hours of on-road drive testing.

    “Typically, companies have hundreds of vehicles that they’re driving so that they can observe how the system works. And every time you change something, you change the behaviour, so you have to drive again and again and again,” Urtasun says. “[Waabi] can develop, test in simulation and reduce the need for driving in the real-world.”

    It also means a system that’s safer because it can be trained to manage not only typical driving scenarios, but also ‘edge cases’ – situations that arise at extreme operating parameters.

    “We can train the system to handle those edge cases in simulation,” Urtasun says. “So, you end up with a system that is much safer, that you can develop faster and that requires less capital to develop because you need very few people compared to the traditional approach – and less testing in the real world.

    “[You] really unleash the power of AI.”

    The company’s name reflects its approach. “Waabi” means “she has vision” in Ojibwe (“a new vision to help solve self-driving,” Urtasun says) and means “simple” in Japanese – an ode to the simplicity of the software stack.

    “[It’s] a perfect definition of our technology and a perfect name for our company,” Urtasun says. “Plus, it sounds cool.”

    The potential applications for Waabi’s technology are wide-ranging, Urtasun says, but the initial focus will be the long-haul trucking sector – a departure from her time at Uber, where she worked on passenger vehicles. She notes that truck-driving is recognized as one of the most dangerous occupations, and that the industry suffers from a shortage of drivers. “Automation can serve those industry needs,” she says.

    Urtasun adds that long-haul trucking is also a prudent area to focus on because there’s less complexity involved with highway driving than is the case in cities.

    “Highways are still very difficult – don’t get me wrong – but they’re less complex compared to a city like Toronto, with all the things that might happen and how people follow the rules – well, very few people follow the rules. So, you need to handle all that complexity.”

    Toronto’s notoriously bad traffic aside, Urtasun says there’s nowhere else she’d rather set up an AI company.

    “When people ask me, ‘Why here?’ I say, ‘Why not?’ I love Toronto, I love Canada. It’s an amazing place to do innovation – there’s incredible talent and support from the government,” she says, pointing to Toronto’s emergence as a world-leading AI hub thanks to initiatives such as the Vector Institute for Artificial Intelligence, which she co-founded.

    “It’s been incredible to see the transformation that the city has gone through,” she says. “It was the case that people were leaving and going to California. Now, not only are we retaining talent but so much incredible talent is coming in – even from Silicon Valley.”

    There’s also plenty of talent to be tapped at U of T, Urtasun adds.

    “We have amazing U of T students who are doing great work within the company,” she says. “I really look forward to partnering closely with U of T to provide opportunities to the incredible talent that the university has. For me, it’s always been very important to [help develop] students – so that continues to be the case.”

    As the CEO of an AI-powered autonomous vehicle startup, Urtasun says it’s important for her to set an example for women and girls interested in pursuing careers in technology.

    “I think it’s very important that young girls, in particular, realize that this is not a man’s world. Technology is going to change the world and they definitely have a say,” she says.

    She adds Waabi and other technology companies benefit immensely from diverse leadership and perspectives – and so do their customers.

    “It’s important that in order to solve complex problems, we have diversity of opinions, approaches and backgrounds,” she says. “Waabi excels at all three types of diversity, which I think is the way to build incredible technology as well as showcase the diversity of the users who are going to use the technology at the end of the day.”

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Toronto (CA) is a public research university in Toronto, Ontario, Canada, located on the grounds that surround Queen’s Park. It was founded by royal charter in 1827 as King’s College, the oldest university in the province of Ontario.

    Originally controlled by the Church of England, the university assumed its present name in 1850 upon becoming a secular institution.

    As a collegiate university, it comprises eleven colleges, each with substantial autonomy on financial and institutional affairs and significant differences in character and history. The university also operates two satellite campuses located in Scarborough and Mississauga.

    University of Toronto has evolved into Canada’s leading institution of learning, discovery and knowledge creation. We are proud to be one of the world’s top research-intensive universities, driven to invent and innovate.

    Our students have the opportunity to learn from and work with preeminent thought leaders through our multidisciplinary network of teaching and research faculty, alumni and partners.

    The ideas, innovations and actions of more than 560,000 graduates continue to have a positive impact on the world.

    Academically, the University of Toronto is noted for movements and curricula in literary criticism and communication theory, known collectively as the Toronto School.

    The university was the birthplace of insulin and stem cell research, and was the site of the first electron microscope in North America; the identification of the first black hole, Cygnus X-1; the development of multi-touch technology; and the formulation of the theory of NP-completeness.

    The university was one of several universities involved in early research on deep learning. It receives the most annual scientific research funding of any Canadian university and is one of two members of the Association of American Universities (US) outside the United States, the other being McGill University (CA).

    The Varsity Blues are the athletic teams that represent the university in intercollegiate league matches, with ties to gridiron football, rowing and ice hockey. The earliest recorded instance of gridiron football occurred at University of Toronto’s University College in November 1861.

    The university’s Hart House is an early example of the North American student centre, simultaneously serving cultural, intellectual, and recreational interests within its large Gothic-revival complex.

    The University of Toronto has educated three Governors General of Canada, four Prime Ministers of Canada, three foreign leaders, and fourteen Justices of the Supreme Court. As of March 2019, ten Nobel laureates, five Turing Award winners, 94 Rhodes Scholars, and one Fields Medalist have been affiliated with the university.

    Early history

    The founding of a colonial college had long been the desire of John Graves Simcoe, the first Lieutenant-Governor of Upper Canada and founder of York, the colonial capital. As a University of Oxford (UK)-educated military commander who had fought in the American Revolutionary War, Simcoe believed a college was needed to counter the spread of republicanism from the United States. The Upper Canada Executive Committee recommended in 1798 that a college be established in York.

    On March 15, 1827, a royal charter was formally issued by King George IV, proclaiming “from this time one College, with the style and privileges of a University … for the education of youth in the principles of the Christian Religion, and for their instruction in the various branches of Science and Literature … to continue for ever, to be called King’s College.” The granting of the charter was largely the result of intense lobbying by John Strachan, the influential Anglican Bishop of Toronto who took office as the college’s first president. The original three-storey Greek Revival school building was built on the present site of Queen’s Park.

    Under Strachan’s stewardship, King’s College was a religious institution closely aligned with the Church of England and the British colonial elite, known as the Family Compact. Reformist politicians opposed the clergy’s control over colonial institutions and fought to have the college secularized. In 1849, after a lengthy and heated debate, the newly elected responsible government of the Province of Canada voted to rename King’s College as the University of Toronto and severed the school’s ties with the church. Having anticipated this decision, the enraged Strachan had resigned a year earlier to open Trinity College as a private Anglican seminary. University College was created as the nondenominational teaching branch of the University of Toronto. During the American Civil War, the threat of a Union blockade on British North America prompted the creation of the University Rifle Corps, which saw battle in resisting the Fenian raids on the Niagara border in 1866. The Corps was part of the Reserve Militia, led by Professor Henry Croft.

    Established in 1878, the School of Practical Science was the precursor to the Faculty of Applied Science and Engineering, which has been nicknamed Skule since its earliest days. While the Faculty of Medicine opened in 1843, medical teaching was conducted by proprietary schools from 1853 until 1887, when the faculty absorbed the Toronto School of Medicine; meanwhile, the university continued to set examinations and confer medical degrees. The university opened the Faculty of Law in 1887, followed by the Faculty of Dentistry in 1888 when the Royal College of Dental Surgeons became an affiliate. Women were first admitted to the university in 1884.

    A devastating fire in 1890 gutted the interior of University College and destroyed 33,000 volumes from the library, but the university restored the building and replenished its library within two years. Over the next two decades, a collegiate system took shape as the university arranged federation with several ecclesiastical colleges, including Strachan’s Trinity College in 1904. The university operated the Royal Conservatory of Music from 1896 to 1991 and the Royal Ontario Museum from 1912 to 1968; both still retain close ties with the university as independent institutions. The University of Toronto Press was founded in 1901 as Canada’s first academic publishing house. The Faculty of Forestry, founded in 1907 with Bernhard Fernow as dean, was Canada’s first university faculty devoted to forest science. In 1910, the Faculty of Education opened its laboratory school, the University of Toronto Schools.

    World wars and post-war years

    The First and Second World Wars curtailed some university activities as undergraduate and graduate men eagerly enlisted. Intercollegiate athletic competitions and the Hart House Debates were suspended, although exhibition and interfaculty games were still held. The David Dunlap Observatory in Richmond Hill opened in 1935, followed by the University of Toronto Institute for Aerospace Studies in 1949. The university opened satellite campuses in Scarborough in 1964 and in Mississauga in 1967. The university’s former affiliated schools at the Ontario Agricultural College and Glendon Hall became fully independent of the University of Toronto, joining the University of Guelph (CA) in 1964 and York University (CA) in 1965, respectively. Beginning in the 1980s, reductions in government funding prompted more rigorous fundraising efforts.

    Since 2000

    In 2000 Kin-Yip Chun was reinstated as a professor of the university after he launched an unsuccessful lawsuit against the university alleging racial discrimination. In 2017 a human rights application was filed against the University by one of its students for allegedly delaying the investigation of sexual assault and being dismissive of their concerns. In 2018 the university cleared one of its professors of allegations of discrimination and antisemitism in an internal investigation after a complaint was filed by one of its students.

    In 2007, the University of Toronto became the first Canadian university to amass a financial endowment greater than $1 billion. On September 24, 2020, the university announced a $250 million gift to the Faculty of Medicine from businessman and philanthropist James C. Temerty, the largest single philanthropic donation in Canadian history. This broke the previous record for the school, set in 2019 when Gerry Schwartz and Heather Reisman jointly donated $100 million for the creation of a 750,000-square-foot innovation and artificial intelligence centre.

    Research

    Since 1926, the University of Toronto has been a member of the Association of American Universities (US), a consortium of the leading North American research universities. The university manages by far the largest annual research budget of any university in Canada, with sponsored direct-cost expenditures of $878 million in 2010. In 2018, the University of Toronto was named the top research university in Canada by Research Infosource, with a sponsored research income (external sources of funding) of $1,147.584 million in 2017. In the same year, the university’s faculty averaged a sponsored research income of $428,200, while graduate students averaged a sponsored research income of $63,700. The federal government was the largest source of funding, with grants from the Canadian Institutes of Health Research, the Natural Sciences and Engineering Research Council, and the Social Sciences and Humanities Research Council amounting to about one-third of the research budget. About eight percent of research funding came from corporations, mostly in the healthcare industry.

    The first practical electron microscope was built by the physics department in 1938. During World War II, the university developed the G-suit, a life-saving garment worn by Allied fighter pilots that was later adopted for use by astronauts. Development of the infrared chemiluminescence technique improved analyses of energy behaviours in chemical reactions. In 1963, the asteroid 2104 Toronto was discovered at the David Dunlap Observatory (CA) in Richmond Hill and named after the university. In 1972, studies on Cygnus X-1 led to the publication of the first observational evidence for the existence of black holes. Toronto astronomers have also discovered the Uranian moons Caliban and Sycorax; the dwarf galaxies Andromeda I, II and III; and the supernova SN 1987A. A pioneer in computing technology, the university designed and built UTEC, one of the world’s first operational computers, and later purchased Ferut, the second commercial computer after UNIVAC I. Multi-touch technology was developed at Toronto, with applications ranging from handheld devices to collaboration walls. The AeroVelo Atlas, which won the Igor I. Sikorsky Human Powered Helicopter Competition in 2013, was developed by the university’s team of students and graduates and was tested in Vaughan.

    The discovery of insulin at the University of Toronto in 1921 is considered among the most significant events in the history of medicine. The stem cell was discovered at the university in 1963, forming the basis for bone marrow transplantation and all subsequent research on adult and embryonic stem cells. This was the first of many findings at Toronto relating to stem cells, including the identification of pancreatic and retinal stem cells. The cancer stem cell was first identified in 1997 by Toronto researchers, who have since found stem cell associations in leukemia, brain tumors and colorectal cancer. Medical inventions developed at Toronto include the glycaemic index, the infant cereal Pablum, the use of protective hypothermia in open heart surgery, and the first artificial cardiac pacemaker. The first successful single-lung transplant was performed at Toronto in 1981, followed by the first nerve transplant in 1988 and the first double-lung transplant in 1989. Researchers identified the maturation promoting factor that regulates cell division and discovered the T-cell receptor, which triggers responses of the immune system. The university is credited with isolating the genes that cause Fanconi anemia, cystic fibrosis and early-onset Alzheimer’s disease, among numerous other diseases. Between 1914 and 1972, the university operated the Connaught Medical Research Laboratories, now part of the pharmaceutical corporation Sanofi-Aventis. Among the research conducted at the laboratory was the development of gel electrophoresis.

    The University of Toronto is the primary research presence supporting one of the world’s largest concentrations of biotechnology firms. More than 5,000 principal investigators reside within 2 kilometres (1.2 mi) of the university grounds in Toronto’s Discovery District, conducting $1 billion of medical research annually. MaRS Discovery District is a research park that serves commercial enterprises and the university’s technology transfer ventures. In 2008, the university disclosed 159 inventions and had 114 active start-up companies. Its SciNet Consortium operates the most powerful supercomputer in Canada.

     
  • richardmitnick 3:16 pm on September 29, 2020 Permalink | Reply
    Tags: "Seed funding grants support plans for innovative research centers", , Artificial Intelligence, , Cyber-Physical Systems for Intelligent Transportation, Energy-efficient Magnetoelectronics, Resilience to Climate Change in Crops with Artificial Intelligence, Splicing Therapeutics,   

    From UC Santa Cruz: “Seed funding grants support plans for innovative research centers” 

    From UC Santa Cruz

    September 28, 2020
    Tim Stephens
    stephens@ucsc.edu


    The UCSC Office of Research has awarded seed funding to six campus research groups to support their efforts to develop innovative new research centers.

    The Seed Funding for Center Scale Research Initiatives program supports collaborative, multidisciplinary proposals and aims to bring together faculty with diverse backgrounds, areas of inquiry, and expertise.

    “Our goal with this program is to make strategic investments in collaborative and multidisciplinary projects that will lead to innovative, inventive, and serendipitous discoveries and findings with big impacts and significant long-term funding,” said Scott Brandt, vice chancellor for research. “Each of these projects meets these criteria and we are very excited to see what they will produce going forward.”

    The first cohort of awards includes the following groups:

    “An Interdisciplinary Research Network for Astrobiology at UCSC,” led by Natalie Batalha, professor of astronomy and astrophysics, explores new models of planetary development and how it could lead to the conditions for life, furthering our understanding of the prevalence of life in the universe.
    “Developing a Collaborative, Interdisciplinary Center Research Proposal for Energy-efficient Magnetoelectronics,” led by David Lederman, professor of physics, will design two-dimensional layered material stacks to precisely control performance properties such as magnetism and superconductivity at the quantum scale for future microelectronics.
    “The Applied Artificial Intelligence Initiative,” led by J. Xavier Prochaska, professor of astronomy and astrophysics, will continue to build bridges across disciplines and between industry, academia, and under-represented minorities to develop UCSC as a leader in the growing field of artificial intelligence.
    “Building Resilience to Climate Change in Crops with Artificial Intelligence and AgTech Devices,” led by Marco Rolandi, professor of electrical and computer engineering, will use bioelectronic devices paired with AI algorithms that can regulate specific plant hormones to increase agricultural yields and crop resilience.
    “Computation-Aware Algorithmic Design for Cyber-Physical Systems for Intelligent Transportation,” led by Ricardo Sanfelice, professor of electrical and computer engineering, will build a framework for merging feedback control algorithms with computing system designs to reduce operational risks and optimize performance for intelligent transportation systems and other cyber-physical systems.
    “UCSC Center for Open Access Splicing Therapeutics (COAST),” led by Michael Stone, professor of chemistry and biochemistry, will pursue meaningful precision therapies for patients with rare diseases. The COAST team will leverage its ribonucleic acid (RNA) expertise to intervene for small patient populations that the medical industry cannot serve with economically viable treatments.

    Each group will use their seed funding over the next year to pursue center-scale research proposals. The awards of $60,000 to $75,000 provide support for grant planning, capacity building, research development activities, and acquisition of key data.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of California, Santa Cruz, opened in 1965 and grew, one college at a time, to its current (2008-09) enrollment of more than 16,000 students. Undergraduates pursue more than 60 majors supervised by divisional deans of humanities, physical & biological sciences, social sciences, and arts. Graduate students work toward graduate certificates, master’s degrees, or doctoral degrees in more than 30 academic fields under the supervision of the divisional and graduate deans. The dean of the Jack Baskin School of Engineering oversees the campus’s undergraduate and graduate engineering programs.

    UCSC Lick Observatory, Mt Hamilton, in San Jose, California, Altitude 1,283 m (4,209 ft).

    UC Observatories Lick Automated Planet Finder, fully robotic 2.4-meter optical telescope at Lick Observatory, situated on the summit of Mount Hamilton, east of San Jose, California, USA.

    The UCO Lick C. Donald Shane telescope is a 120-inch (3.0-meter) reflecting telescope located at the Lick Observatory, Mt Hamilton, in San Jose, California, Altitude 1,283 m (4,209 ft).

    UC Santa Cruz campus.

    UCSC is the home base for the Lick Observatory.

    Lick Observatory’s 36-inch Great Refractor telescope, housed in the South (large) Dome of the main building.

     
  • richardmitnick 8:43 am on October 21, 2019 Permalink | Reply
    Tags: Artificial Intelligence, , The Giotto project   

    From École Polytechnique Fédérale de Lausanne: “With Giotto, artificial intelligence gets a third dimension” 

    EPFL bloc

    From École Polytechnique Fédérale de Lausanne

    21.10.19
    Sarah Aubort

    The Giotto project, launched by EPFL startup Learn to Forecast, intends to revolutionize the way we use artificial intelligence. Drawing on the science of shapes, Giotto pushes AI forward by making it more reliable and intuitive in areas such as materials science, neuroscience and biology. Giotto is open-source and available free of charge on GitHub, and it’s already being used by some EPFL scientists.

    Researchers use artificial intelligence to solve complex problems, but it’s not a transparent science: AI’s computational capabilities often exceed our understanding and raise issues of reliability and trust among users. “Algorithms are becoming increasingly complex,” says Matteo Caorsi, the lead scientist at Learn to Forecast (L2F). “It’s very hard to understand how they work and thus to trust the solutions they provide or predict when they might get things wrong.”

    Shapes hidden within data

    To address this problem, L2F followed an intuitive approach based on the science of shapes. The result is Giotto, a free and open-source library that aims to revolutionize the way we use machine learning. “Humans understand shapes and colors better than numbers and equations,” says Aldo Podestà, the CEO of L2F, “which is why we think that we can use topology – the science of shapes – to build a new language between AI and users.”

    Giotto offers a toolkit that uses algorithms inspired by topology to address some of the shortcomings of machine learning. Users don’t need to be fluent in advanced mathematics, since Giotto is a turnkey method of revealing structures previously hidden within a dataset. “This new form of AI is based on graphs and their multidimensional versions, in other words, geometrical objects that can reveal essential structures within the data,” says Thomas Boys, a co-founder at L2F.

    Until now, machine learning algorithms sought performance, even if that meant depriving users of a fuller understanding of the nature of the results. “Giotto helps identify the framework underlying all relationships among the data, and this allows users to understand the data better and extract meaning from them with greater accuracy,” adds Boys. The project is named for Giotto di Bondone, the 13th-century artist who first introduced perspective into painting. L2F hopes to usher in a similar paradigm change in data science by combining machine learning with topology.
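
    To make the approach concrete, here is a minimal sketch of a topology-first workflow of the kind Giotto provides. It is written against giotto-tda, the successor to the original giotto-learn package linked below, so the class names and call signatures are assumptions about that later API rather than a transcript of the 2019 release:

import numpy as np
# giotto-tda: topological data analysis tools (assumed successor API)
from gtda.homology import VietorisRipsPersistence
from gtda.diagrams import PersistenceEntropy

# Two families of toy point clouds: circles (which contain a loop)
# and Gaussian blobs (which do not).
rng = np.random.default_rng(0)
angles = rng.uniform(0, 2 * np.pi, size=(10, 100))
circles = np.stack([np.cos(angles), np.sin(angles)], axis=-1)  # (10, 100, 2)
blobs = rng.normal(size=(10, 100, 2))
point_clouds = np.concatenate([circles, blobs])

# Persistence diagrams summarize each cloud's shape: connected
# components (dimension 0) and loops (dimension 1).
vr = VietorisRipsPersistence(homology_dimensions=(0, 1))
diagrams = vr.fit_transform(point_clouds)

# Collapse the diagrams into plain feature vectors that any
# downstream machine learning model can consume.
features = PersistenceEntropy().fit_transform(diagrams)
print(features.shape)  # one entropy value per homology dimension per cloud

    The point is the shape of the data flow: raw geometry goes in, topological summaries come out, and those summaries become ordinary features – which is how the “new language between AI and users” the founders describe is meant to work.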

    New horizons

    To develop Giotto, its creators worked with EPFL researchers who use topology every day. This includes Professor Kathryn Hess Bellwald, the head of the Laboratory for Topology and Neuroscience. “One of Giotto’s main advantages is that, because of its user friendliness, it will be possible for scientists from all kinds of fields to use these tools as a regular part of their data science toolkit,” says Prof. Hess Bellwald. “This should lead to new insights in many different areas that one could not attain without Giotto.”

    Learn to Forecast (L2F) was founded at EPFL in 2017. Its aim is to use artificial intelligence to address a wide variety of issues. The company raised three million francs via 4FO Ventures to develop the Giotto library, and it now has 25 employees.

    For more information: https://www.giotto.ai
    GitHub: https://github.com/giotto-learn/giotto-learn

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    EPFL campus

    EPFL is Europe’s most cosmopolitan technical university, welcoming students, professors and staff of more than 120 nationalities. Both Swiss and international in outlook, it is guided by a constant desire to open up; its missions of teaching, research and partnership reach many circles: universities and engineering schools, developing and emerging countries, secondary schools and gymnasiums, industry and the economy, political circles and the general public.

     
  • richardmitnick 2:11 pm on September 24, 2019 Permalink | Reply
    Tags: , Artificial Intelligence, CADES, ,   

    From Oak Ridge National Laboratory: “ORNL develops, deploys AI capabilities across research portfolio” 


    From Oak Ridge National Laboratory

    September 24, 2019

    Scott S Jones
    jonesg@ornl.gov
    865-241-6491

    Processes like manufacturing aircraft parts, analyzing data from doctors’ notes and identifying national security threats may seem unrelated, but at the U.S. Department of Energy’s Oak Ridge National Laboratory, artificial intelligence is improving all of these tasks. To accelerate promising AI applications in diverse research fields, ORNL has established a labwide AI Initiative, and its success will help to ensure U.S. economic competitiveness and national security.

    Led by ORNL AI Program Director David Womble, this internal investment brings the lab’s AI expertise, computing resources and user facilities together to facilitate analyses of massive datasets that would otherwise be unmanageable. Multidisciplinary research teams are advancing AI and high-performance computing to tackle increasingly complex problems, including designing novel materials, diagnosing and treating diseases and enhancing the cybersecurity of U.S. infrastructure.

    “AI has the potential to revolutionize science and engineering, and it is exciting to be part of this,” Womble said. “With its world-class scientists and facilities, ORNL will make significant contributions.”

    Across the lab, experts in data science are applying AI tools known as machine learning algorithms (which allow computers to learn from data and predict outcomes) and deep learning algorithms (which use neural networks inspired by the human brain to uncover patterns of interest in datasets) to accelerate breakthroughs across the scientific spectrum. As part of the initiative, ORNL researchers are developing new technologies to complement and expand these capabilities, establishing AI as a force for improving both fundamental and applied science applications.
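
    As a generic illustration of those two algorithm families (this is not ORNL code, and the synthetic dataset and model choices are assumptions made for the example), the following sketch fits a classical machine learning model and a small neural network to the same data with scikit-learn:

# Contrast a classical "machine learning" model with a small neural
# network ("deep learning") on the same synthetic classification task.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = [
    RandomForestClassifier(random_state=0),  # learns decision rules from data
    MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0),  # small neural net
]
for model in models:
    model.fit(X_train, y_train)  # learn patterns from the training data
    print(type(model).__name__, model.score(X_test, y_test))  # predict outcomes on unseen data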

    Home to the world’s most powerful and smartest supercomputer, Summit, ORNL is particularly well-suited for AI research.

    ORNL IBM AC922 SUMMIT supercomputer, No.1 on the TOP500. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

    The IBM system debuted in June 2018 and resides at the Oak Ridge Leadership Computing Facility, a DOE Office of Science User Facility located at ORNL.

    With hardware optimized for AI applications, Summit provides an ideal platform for applying machine learning and deep learning to groundbreaking research. The system’s increased memory bandwidth allows AI algorithms to run at faster speeds and obtain more accurate results.

    Other AI-enabled machines include the NVIDIA DGX-2 systems located at ORNL’s Compute and Data Environment for Science.


    These appliances allow researchers to tackle data-intensive problems using unique AI strategies and to run smaller-scale simulations in preparation for later work on Summit.

    “AI is rapidly changing the way computational scientists do research, and ORNL’s history of leadership in computing and data makes it the perfect setting in which to advance the state of the art,” said Associate Laboratory Director for Computing and Computational Sciences Jeff Nichols. “While Summit’s rapid training of AI networks is already assisting researchers across the scientific spectrum in realizing the potential of AI, we have begun preparing for the post-Summit world via Frontier, a second-generation AI system that will provide new capabilities for machine learning, deep learning and data analytics.”

    Although ORNL researchers are applying the lab’s unique combination of AI expertise and powerful computing resources to address a range of scientific challenges, three areas in particular are poised to deliver major early results: additive manufacturing, health care and cyber-physical security.

    Additive manufacturing, or 3D printing, enables researchers at the Manufacturing Demonstration Facility, a DOE Office of Energy Efficiency and Renewable Energy User Facility located at ORNL, to develop reliable, energy-efficient plastic and metal parts at low cost. Using AI, they can consistently create high-quality, specialized aerospace components. AI can instantly locate cracks and other defects before they become problems, thereby reducing costs and time to market.

    Additionally, AI makes it possible for the machines to detect and repair errors in real time during the process of binder jetting, in which a liquid binding agent fuses together layers of powder particles.

    Researchers at ORNL are also optimizing AI techniques to analyze patient data from medical tests, doctors’ notes and other health records. These techniques use language processing to identify patterns among notes from different doctors, extracting previously inaccessible insights from mountains of data. When combined with results from x-rays and other relevant tests, these results could improve health care providers’ ability to diagnose and treat problems ranging from post-traumatic stress disorder to cancer.

    For example, ORNL Health Data Sciences Institute Director Gina Tourassi uses AI to automatically compile and analyze data and determine which factors are responsible for the development of certain diseases. Her team is running machine learning algorithms on Summit to scan millions of medical documents in pursuit of these types of insights.
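
    The article does not spell out the team’s pipeline, but the general pattern of language processing over clinical notes can be sketched in a few lines: vectorize the free text, then train a classifier to flag notes containing indicators of a condition. The toy notes, labels and model below are invented for illustration:

# A generic text-classification sketch (not ORNL's pipeline):
# TF-IDF features over free-text notes feeding a linear classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

notes = [
    "patient reports recurring nightmares and hypervigilance",
    "routine follow-up, no complaints, vitals normal",
    "flashbacks following deployment, startles easily",
    "annual physical, patient in good health",
]
labels = [1, 0, 1, 0]  # 1 = note contains possible PTSD indicators (toy labels)

pipeline = make_pipeline(TfidfVectorizer(), LogisticRegression())
pipeline.fit(notes, labels)
print(pipeline.predict(["patient describes hypervigilance at night"]))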

    Cybersecurity platforms such as “Situ” monitor thousands of events per second to detect anomalies that human analysts would not be able to find. Situ sorts through massive amounts of raw network data, freeing up network operators to focus on small, manageable amounts of activity to investigate potential threats and make more informed decisions.

    And through partnerships with power companies, ORNL has also used AI to improve the security of power grids by monitoring data streams and identifying suspicious activity.
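
    Situ’s internals are not described in this article, so the sketch below is only a toy version of the general idea: keep a rolling baseline of event counts and flag statistical outliers for an analyst to triage. The window size and threshold are arbitrary assumptions:

# Toy streaming anomaly detector: flag values that deviate strongly
# from a rolling baseline (a stand-in for what a platform like Situ
# does at far larger scale and sophistication).
import math
from collections import deque

class StreamingAnomalyDetector:
    def __init__(self, window=1000, threshold=4.0):
        self.history = deque(maxlen=window)  # recent per-second event counts
        self.threshold = threshold           # deviation, in std-devs, that counts as anomalous

    def observe(self, value):
        anomalous = False
        if len(self.history) >= 30:          # wait for a minimal baseline
            mean = sum(self.history) / len(self.history)
            var = sum((x - mean) ** 2 for x in self.history) / len(self.history)
            std = math.sqrt(var) or 1.0      # avoid division by zero
            anomalous = abs(value - mean) / std > self.threshold
        self.history.append(value)
        return anomalous

detector = StreamingAnomalyDetector()
counts = [100 + (i % 7) for i in range(500)] + [900]  # steady traffic, then a spike
flags = [detector.observe(c) for c in counts]
print("anomalies at indices:", [i for i, f in enumerate(flags) if f])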

    To date, ORNL researchers have earned two R&D 100 Awards and 10 patents for work related to AI research and algorithm development. The lab plans to recruit additional AI experts to continue building on this foundation.

    To ensure that U.S. researchers maintain leadership in R&D innovation and continue revolutionizing science with AI, ORNL also provides professional development opportunities including the Artificial Intelligence Summer Institute, which pairs students with ORNL researchers to solve science problems using AI, and the Data Learning Users Group, which allows OLCF users and ORNL staff to practice using deep learning techniques.

    ORNL also collaborates with the University of Tennessee, Knoxville, to support the Bredesen Center Ph.D. program in data science and engineering, a curriculum that combines data science with scientific specialties ranging from materials science to national security.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


     
  • richardmitnick 12:48 pm on September 18, 2019 Permalink | Reply
    Tags: A new approach to the problems of dark matter and dark energy, Artificial Intelligence, , , , , Deep artificial neural networks, , Facial recognition for cosmology, ,   

    From ETH Zürich: “Artificial intelligence probes dark matter in the universe” 

    ETH Zurich bloc

    From ETH Zürich

    18.09.2019
    Oliver Morsch

    A team of physicists and computer scientists at ETH Zürich has developed a new approach to the problem of dark matter and dark energy in the universe. Using machine learning tools, they programmed computers to teach themselves how to extract the relevant information from maps of the universe.

    Excerpt from a typical computer-generated dark matter map used by the researchers to train the neural network. (Source: ETH Zürich)

    Understanding how our universe came to be what it is today, and what its final destiny will be, is one of the biggest challenges in science. The awe-inspiring display of countless stars on a clear night gives us some idea of the magnitude of the problem, and yet that is only part of the story. The deeper riddle lies in what we cannot see, at least not directly: dark matter and dark energy. With dark matter pulling the universe together and dark energy causing it to expand faster, cosmologists need to know exactly how much of those two is out there in order to refine their models.

    At ETH Zürich, scientists from the Department of Physics and the Department of Computer Science have now joined forces to improve on standard methods for estimating the dark matter content of the universe through artificial intelligence. They used cutting-edge machine learning algorithms for cosmological data analysis that have a lot in common with those used for facial recognition by Facebook and other social media. Their results have recently been published in the scientific journal Physical Review D.

    Facial recognition for cosmology

    While there are no faces to be recognized in pictures taken of the night sky, cosmologists still look for something rather similar, as Tomasz Kacprzak, a researcher in the group of Alexandre Refregier at the Institute of Particle Physics and Astrophysics, explains: “Facebook uses its algorithms to find eyes, mouths or ears in images; we use ours to look for the tell-tale signs of dark matter and dark energy.” As dark matter cannot be seen directly in telescope images, physicists rely on the fact that all matter – including the dark variety – slightly bends the path of light rays arriving at the Earth from distant galaxies. This effect, known as “weak gravitational lensing”, distorts the images of those galaxies very subtly, much like far-away objects appear blurred on a hot day as light passes through layers of air at different temperatures.

    Weak gravitational lensing NASA/ESA Hubble

    Cosmologists can use that distortion to work backwards and create mass maps of the sky showing where dark matter is located. Next, they compare those dark matter maps to theoretical predictions in order to find which cosmological model most closely matches the data. Traditionally, this is done using human-designed statistics such as so-called correlation functions that describe how different parts of the maps are related to each other. Such statistics, however, are limited as to how well they can find complex patterns in the matter maps.
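
    For readers curious what such a human-designed statistic looks like in practice, here is a minimal sketch (with random noise standing in for a real mass map) of one standard choice: the radially averaged power spectrum, which encodes the map’s two-point correlation function via the Wiener–Khinchin theorem:

# Estimate the two-point statistics of a map through its radially
# averaged power spectrum. The "map" here is pure noise, a stand-in
# for a real weak-lensing mass map.
import numpy as np

rng = np.random.default_rng(42)
kappa = rng.normal(size=(256, 256))  # stand-in for a dark matter mass map

# Power spectrum: squared amplitude of the map's Fourier modes.
fourier = np.fft.fftshift(np.fft.fft2(kappa))
power2d = np.abs(fourier) ** 2

# Radially average to get structure as a function of spatial scale.
y, x = np.indices(power2d.shape)
cy, cx = power2d.shape[0] // 2, power2d.shape[1] // 2
r = np.hypot(x - cx, y - cy).astype(int)
counts = np.bincount(r.ravel())
spectrum = np.bincount(r.ravel(), weights=power2d.ravel()) / counts
print(spectrum[:10])  # largest spatial scales first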

    Neural networks teach themselves

    “In our recent work, we have used a completely new methodology”, says Alexandre Refregier. “Instead of inventing the appropriate statistical analysis ourselves, we let computers do the job.” This is where Aurelien Lucchi and his colleagues from the Data Analytics Lab at the Department of Computer Science come in. Together with Janis Fluri, a PhD student in Refregier’s group and lead author of the study, they used machine learning algorithms called deep artificial neural networks and taught them to extract the largest possible amount of information from the dark matter maps.

    Once the neural network has been trained, it can be used to extract cosmological parameters from actual images of the night sky. (Visualisations: ETH Zürich)

    In a first step, the scientists trained the neural networks by feeding them computer-generated data that simulates the universe. That way, they knew what the correct answer for a given cosmological parameter – for instance, the ratio between the total amount of dark matter and dark energy – should be for each simulated dark matter map. By repeatedly analysing the dark matter maps, the neural network taught itself to look for the right kind of features in them and to extract more and more of the desired information. In the Facebook analogy, it got better at distinguishing random oval shapes from eyes or mouths.
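
    The study’s actual architecture is not reproduced here; the sketch below, assuming PyTorch, shows only the general training loop the paragraph describes: a small convolutional network regressing a cosmological parameter from simulated maps whose true value is known:

# Minimal sketch of training a CNN to regress a cosmological parameter
# from simulated mass maps (random tensors stand in for real simulations).
import torch
import torch.nn as nn

class MapRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, 1),  # e.g. the dark matter / dark energy ratio
        )

    def forward(self, x):
        return self.net(x)

maps = torch.randn(64, 1, 64, 64)  # simulated dark matter maps
params = torch.rand(64, 1)         # known ground-truth parameter per map

model = MapRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):             # repeatedly analyse the maps
    optimizer.zero_grad()
    loss = loss_fn(model(maps), params)
    loss.backward()                # the network learns which map features matter
    optimizer.step()
    print(epoch, loss.item())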

    More accurate than human-made analysis

    The results of that training were encouraging: the neural networks came up with values that were 30% more accurate than those obtained by traditional methods based on human-made statistical analysis. For cosmologists, that is a huge improvement as reaching the same accuracy by increasing the number of telescope images would require twice as much observation time – which is expensive.

    Finally, the scientists used their fully trained neural network to analyse actual dark matter maps from the KiDS-450 dataset. “This is the first time such machine learning tools have been used in this context,” says Fluri, “and we found that the deep artificial neural network enables us to extract more information from the data than previous approaches. We believe that this usage of machine learning in cosmology will have many future applications.”

    As a next step, he and his colleagues are planning to apply their method to bigger image sets such as the Dark Energy Survey.

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7,200 feet.

    Timeline of the Inflationary Universe WMAP

    The Dark Energy Survey (DES) is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. DES began searching the Southern skies on August 31, 2013.

    According to Einstein’s theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up. To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called dark energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    DES is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the DES collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time is used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.

    In addition, more cosmological parameters and refinements, such as details about the nature of dark energy, will be fed to the neural networks.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ETH Zurich campus
    ETH Zürich is one of the leading international universities for technology and the natural sciences. It is well known for its excellent education, ground-breaking fundamental research and for implementing its results directly into practice.

    Founded in 1855, ETH Zürich today has more than 18,500 students from over 110 countries, including 4,000 doctoral students. To researchers, it offers an inspiring working environment, to students, a comprehensive education.

    Twenty-one Nobel Laureates have studied, taught or conducted research at ETH Zürich, underlining the excellent reputation of the university.

     