Tagged: Computing

  • richardmitnick 10:17 am on October 7, 2022 Permalink | Reply
    Tags: "DOE Funds Pilot Study Focused on Biosecurity for Bioenergy Crops", Computing, Research into threats from pathogens and pests would speed short-term response and spark long-term mitigation strategies.

    From The DOE’s Brookhaven National Laboratory: “DOE Funds Pilot Study Focused on Biosecurity for Bioenergy Crops” 

    From The DOE’s Brookhaven National Laboratory


    Karen McNulty Walsh
    (631) 344-8350

    Peter Genzer
    (631) 344-3174

    Research into threats from pathogens and pests would speed short-term response and spark long-term mitigation strategies.

    Pilot study on an important disease in sorghum (above) will develop understanding of threats to bioenergy crops, potentially speeding the development of short-term responses and long-term mitigation strategies. (Credit: U.S. Department of Energy Genomic Science program)

    The U.S. Department of Energy’s (DOE) Office of Science has selected Brookhaven National Laboratory to lead a new research effort focused on potential threats to crops grown for bioenergy production. Understanding how such bioenergy crops could be harmed by known or new pests or pathogens could help speed the development of rapid responses to mitigate damage and longer-term strategies for preventing such harm. The pilot project could evolve into a broader basic science capability to help ensure the development of resilient and sustainable bioenergy crops as part of a transition to a net-zero carbon economy.

    The idea is modeled on the way DOE’s National Virtual Biotechnology Laboratory (NVBL) pooled basic science capabilities to address the COVID-19 pandemic. With $5 million in initial funding, allocated over the next two years, Brookhaven Lab and its partners will develop a coordinated approach for addressing biosecurity challenges. This pilot study will lead to a roadmap for building out a DOE-wide capability known as the National Virtual Biosecurity for Bioenergy Crops Center (NVBBCC).

    “A robust biosecurity capability optimized to respond rapidly to biological threats to bioenergy crops requires an integrated and versatile platform,” said Martin Schoonen, Brookhaven Lab’s Associate Laboratory Director for Environment, Biology, Nuclear Science & Nonproliferation, who will serve as principal investigator for the pilot project. “With this initial funding, we’ll develop a bio-preparedness platform for sampling and detecting threats, predicting how they might propagate, and understanding how pests or pathogens interact with bioenergy crops at the molecular level—all of which are essential for developing short-term control measures and long-term solutions.”

    The team will invest in new research tools—including experimental equipment and an integrating computing environment for data sharing, data analysis, and predictive modeling. Experiments on an important disease of energy sorghum, a leading target for bioengineering as an oil-producing crop, will serve as a model to help the team establish optimized protocols for studying plant-pathogen interactions.

    In addition, a series of workshops will bring together experts from a range of perspectives and institutions to identify partnerships within and outside DOE, as well as any future investments needed, to establish the full capabilities of an end-to-end biosecurity platform.

    “NVBBCC is envisioned to be a distributed, virtual center with multiple DOE-labs at its core to maximize the use of unique facilities and expertise across the DOE complex,” Schoonen said. “The center will support plant pathology research driven by the interests of the bioenergy crop community, as well as broader plant biology research that could impact crop health.”

    Building the platform

    The pilot study experiments and workshops will be organized around four main themes: detection and sampling, biomolecular characterization, assessment, and mitigation.

    In this initial phase, the research will focus on energy sorghum. This crop’s potential oil yield per acre far exceeds that of soybeans, currently the world’s primary source of biodiesel.

    “Sorghum is susceptible to a devastating fungal disease, caused by Colletotrichum sublineola, which can result in yield losses of up to 67 percent,” said John Shanklin, chair of Brookhaven Lab’s Biology Department and co-lead of the assessment theme. “Finding ways to thwart this pathogen is a high priority for the bioenergy crop community.”

    The NVBBCC team will use a range of tools—including advanced remote-sensing technologies, COVID-19-like rapid test strips, and in-field sampling—to detect C. sublineola. Additional experiments will assess airborne propagation of fungal spores, drawing on Brookhaven Lab’s expertise in modeling the dispersal of aerosol particles.

    The team will also use state-of-the-art biomolecular characterization tools—including cryo-electron microscopes in Brookhaven’s Laboratory for BioMolecular Structure (LBMS) and x-ray crystallography beamlines at the National Synchrotron Light Source-II (NSLS-II)—to explore details of how pathogen proteins and plant proteins interact. In addition, they’ll add a new tool—a cryogenic-focused ion beam—to produce samples for high-resolution three-dimensional cellular imaging and other advanced imaging modalities.

    Together, these experiments will reveal mechanistic details that provide insight into how plants respond to infections, including how some strains of sorghum develop resistance to C. sublineola. The team will also draw on extensive information about the genetic makeup of sorghum and C. sublineola to identify factors that control expression of the various plant and pathogen proteins.

    The program will be supported by an integrating computing infrastructure with access to sophisticated computational tools across the DOE complex and at partner institutions, enabling integrated data analysis and collaboration using community data standards and tools. The infrastructure will also provide capabilities to develop, train, and verify new analytical and predictive computer models, including novel artificial intelligence (AI) solutions.

    “NVBBCC will build on the Johns Hopkins University-developed SciServer environment, which has been used successfully in large data-sharing and analysis projects in cosmology and soil ecology,” said Kerstin Kleese van Dam, head of Brookhaven Lab’s Computational Science Initiative. “NVBBCC’s computational infrastructure will allow members to easily coordinate research across different domains and sites, accelerating discovery and response times through integrated knowledge sharing.”

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    Brookhaven Campus

    One of ten national laboratories overseen and primarily funded by the DOE Office of Science, The DOE’s Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

    Research at BNL specializes in nuclear and high-energy physics, energy science and technology, environmental and bioscience, nanoscience and national security. The 5,265-acre campus contains several large research facilities, including the Relativistic Heavy Ion Collider [below] and National Synchrotron Light Source II [below]. Seven Nobel Prizes have been awarded for work conducted at Brookhaven Lab.

    BNL is staffed by approximately 2,750 scientists, engineers, technicians, and support personnel, and hosts 4,000 guest investigators every year. The laboratory has its own police station, fire department, and ZIP code (11973). In total, the lab spans a 5,265-acre (21 km²) area that is mostly coterminous with the hamlet of Upton, New York. BNL is served by a rail spur operated as-needed by the New York and Atlantic Railway. Co-located with the laboratory is the Upton, New York, forecast office of the National Weather Service.

    Major programs

    Although originally conceived as a nuclear research facility, Brookhaven Lab’s mission has greatly expanded. Its foci are now:

    Nuclear and high-energy physics
    Physics and chemistry of materials
    Environmental and climate research
    Energy research
    Structural biology
    Accelerator physics


    Brookhaven National Lab was originally owned by the Atomic Energy Commission and is now owned by that agency’s successor, the United States Department of Energy (DOE). DOE subcontracts the research and operation to universities and research organizations. It is currently operated by Brookhaven Science Associates LLC, which is an equal partnership of Stony Brook University and Battelle Memorial Institute. From 1947 to 1998, it was operated by Associated Universities, Inc. (AUI), but AUI lost its contract in the wake of two incidents: a 1994 fire at the facility’s high-flux beam reactor that exposed several workers to radiation and reports in 1997 of a tritium leak into the groundwater of the Long Island Central Pine Barrens on which the facility sits.


    Following World War II, the US Atomic Energy Commission was created to support government-sponsored peacetime research on atomic energy. The effort to build a nuclear reactor in the American northeast was fostered largely by physicists Isidor Isaac Rabi and Norman Foster Ramsey Jr., who during the war witnessed many of their colleagues at Columbia University leave for new remote research sites following the departure of the Manhattan Project from its campus. Their effort to house this reactor near New York City was rivalled by a similar effort at the Massachusetts Institute of Technology to have a facility near Boston, Massachusetts. Involvement was quickly solicited from representatives of northeastern universities to the south and west of New York City such that this city would be at their geographic center. In March 1946 a nonprofit corporation was established that consisted of representatives from nine major research universities — Columbia University, Cornell University, Harvard University, Johns Hopkins University, Massachusetts Institute of Technology, Princeton University, University of Pennsylvania, University of Rochester, and Yale University.

    Out of 17 considered sites in the Boston-Washington corridor, Camp Upton on Long Island was eventually chosen as the most suitable in consideration of space, transportation, and availability. The camp had been a training center for the US Army during both World War I and World War II. After the latter war, Camp Upton was deemed no longer necessary and became available for reuse. A plan was conceived to convert the military camp into a research facility.

    On March 21, 1947, the Camp Upton site was officially transferred from the U.S. War Department to the new U.S. Atomic Energy Commission (AEC), predecessor to the U.S. Department of Energy (DOE).

    Research and facilities

    Reactor history

    In 1947 construction began on the first nuclear reactor at Brookhaven, the Brookhaven Graphite Research Reactor. This reactor, which opened in 1950, was the first reactor to be constructed in the United States after World War II. The High Flux Beam Reactor operated from 1965 to 1999. In 1959 Brookhaven built the first US reactor specifically tailored to medical research, the Brookhaven Medical Research Reactor, which operated until 2000.

    Accelerator history

    In 1952 Brookhaven began using its first particle accelerator, the Cosmotron. At the time the Cosmotron was the world’s highest energy accelerator, being the first to impart more than 1 GeV of energy to a particle.

    BNL Cosmotron 1952-1966.

    The Cosmotron was retired in 1966, after it was superseded in 1960 by the new Alternating Gradient Synchrotron (AGS).

    BNL Alternating Gradient Synchrotron (AGS).

    The AGS was used in research that resulted in three Nobel Prizes, including the discovery of the muon neutrino, the charm quark, and CP violation.

    In 1970 BNL started the ISABELLE project to develop and build two proton intersecting storage rings.

    The groundbreaking for the project was in October 1978. In 1981, with the tunnel for the accelerator already excavated, problems with the superconducting magnets needed for the ISABELLE accelerator brought the project to a halt, and the project was eventually cancelled in 1983.

    The National Synchrotron Light Source operated from 1982 to 2014 and was involved with two Nobel Prize-winning discoveries. It has since been replaced by the National Synchrotron Light Source II [below].

    BNL National Synchrotron Light Source.

    After ISABELLE’s cancellation, physicists at BNL proposed that the excavated tunnel and parts of the magnet assembly be used in another accelerator. In 1984 the first proposal for the accelerator now known as the Relativistic Heavy Ion Collider (RHIC) [below] was put forward. Construction was funded in 1991, and RHIC has been operational since 2000. One of the world’s only two operating heavy-ion colliders, RHIC is, as of 2010, the second-highest-energy collider after the Large Hadron Collider (CH). RHIC is housed in a tunnel 2.4 miles (3.9 km) long and is visible from space.

    On January 9, 2020, Paul Dabbar, undersecretary of the US Department of Energy Office of Science, announced that the BNL eRHIC design had been selected over the conceptual design put forward by DOE’s Thomas Jefferson National Accelerator Facility [JLab] as the future Electron-Ion Collider (EIC) in the United States.

    In addition to the site selection, it was announced that the BNL EIC had received CD-0 approval from the Department of Energy. BNL’s eRHIC design proposes upgrading the existing Relativistic Heavy Ion Collider, which collides beams of ions ranging from light to heavy, including polarized protons, with a polarized electron facility to be housed in the same tunnel.

    Other discoveries

    In 1958, Brookhaven scientists created one of the world’s first video games, Tennis for Two. In 1968 Brookhaven scientists patented Maglev, a transportation technology that utilizes magnetic levitation.

    Major facilities

    Relativistic Heavy Ion Collider (RHIC), which was designed to research quark–gluon plasma and the sources of proton spin. Until 2009 it was the world’s most powerful heavy ion collider. It is the only collider of spin-polarized protons.

    Center for Functional Nanomaterials (CFN), used for the study of nanoscale materials.

    BNL National Synchrotron Light Source II, Brookhaven’s newest user facility, opened in 2015 to replace the National Synchrotron Light Source (NSLS), which had operated for 30 years. NSLS was involved in the work that won the 2003 and 2009 Nobel Prizes in Chemistry.

    Alternating Gradient Synchrotron, a particle accelerator that was used in three of the lab’s Nobel prizes.
    Accelerator Test Facility, generates, accelerates and monitors particle beams.
    Tandem Van de Graaff, once the world’s largest electrostatic accelerator.

    Computational Science resources, including access to a massively parallel Blue Gene series supercomputer that is among the fastest in the world for scientific research, run jointly by Brookhaven National Laboratory and Stony Brook University-SUNY.

    Interdisciplinary Science Building, with unique laboratories for studying high-temperature superconductors and other materials important for addressing energy challenges.
    NASA Space Radiation Laboratory, where scientists use beams of ions to simulate cosmic rays and assess the risks of space radiation to human space travelers and equipment.

    Off-site contributions

    It is a contributing partner to the ATLAS experiment, one of the four detectors located at The European Organization for Nuclear Research [CERN] (CH) Large Hadron Collider (LHC).

    The European Organization for Nuclear Research [CERN] (CH) map.

    Iconic view of the CERN ATLAS detector.

    The ATLAS detector is currently operating at CERN near Geneva, Switzerland.

    Brookhaven was also responsible for the design of the Spallation Neutron Source at DOE’s Oak Ridge National Laboratory, Tennessee.

    DOE’s Oak Ridge National Laboratory Spallation Neutron Source annotated.

    Brookhaven plays a role in a range of neutrino research projects around the world, including the Daya Bay Neutrino Experiment (CN), located at a nuclear power plant approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China.

    Daya Bay Neutrino Experiment (CN).

    BNL Center for Functional Nanomaterials.

    BNL National Synchrotron Light Source II.


    BNL Relativistic Heavy Ion Collider Campus.

    BNL/RHIC Phenix detector.

  • richardmitnick 1:18 pm on February 8, 2022 Permalink | Reply
    Tags: "Engineering the future of manufacturing", Computing, Finding sustainable replacements for plastics, Renewable manufacturing

    From The University of Delaware (US): “Engineering the future of manufacturing” 

    U Delaware bloc

    From The University of Delaware (US)

    February 07, 2022
    Article by Maddy Lauria
    Photo by Evan Krape

    Marianthi Ierapetritou, the Bob and Jane Gore Centennial Chair of Chemical and Biomolecular Engineering, has received $3 million in funding from the National Science Foundation’s Future Manufacturing program to explore renewable raw materials for chemical manufacturing.

    COE’s Marianthi Ierapetritou leads $3 million National Science Foundation (US) effort.

    Solving the climate crisis isn’t just about everyone driving electric vehicles and installing solar panels on our homes. It’s about redesigning the way we live, including sweeping changes to the way we produce everyday products.

    As researchers race to find the latest and greatest technologies the world needs to be more resilient and sustainable, one team of educators at the University of Delaware is aiming to create a blueprint for a more renewable manufacturing future with a $3 million grant from the National Science Foundation.

    “It’s important to educate the new generation of engineers to try to change the mentality of how we’re utilizing the limited resources we have,” said Marianthi Ierapetritou, UD’s Bob and Jane Gore Centennial Chair of Chemical and Biomolecular Engineering. She will lead the project as she works with Department of Chemical and Biomolecular Engineering Professors Dionisios Vlachos and Raul Lobo, Department of Electrical and Computer Engineering Associate Professor Hui Fang and Joseph R. Biden, Jr. School of Public Policy and Administration Assistant Professor Kalim Shah to launch the future manufacturing project in 2022.

    The goal is to thoroughly examine existing literature around renewable products and processes in manufacturing, which will help researchers synthesize existing data and identify gaps in knowledge. From there, researchers can develop a framework for examining the potential economic, environmental and market impacts of alternative products and processes, while also evaluating the realistic probability of introducing new “green” solutions into existing supply chains and consumer markets.

    “The big idea here is how to better utilize available information,” Ierapetritou said. “It’s a collaboration between chemical engineering, computer science and public policy.”

    Several students at the undergraduate and graduate levels in both the College of Engineering and the Biden School will participate in some of the computational work, research and design while collaborating with real-world chemical companies and with the American Institute of Chemical Engineers’s Rapid Advancement in Process Intensification Deployment (RAPID) Institute as a manufacturing partner. The project’s funding is expected to span four years.

    By using computers to mine for innovations in existing studies — a task that would take multiple graduate students months or years to complete — these researchers can extract the information needed to better understand what it will take to change the way we produce and consume products.

    “We’re kind of a supporting team, while the chemical engineering team needs to use the information we are extracting,” Fang said. “It’s like we’re collecting all of the available recipes so we can enable the chef to create some new dishes.”

    Since many of the products used every day are created from petrochemicals (fossil fuels), researchers are looking for ways to create more renewable products that require less energy and produce less waste. Scientists around the world broadly agree that greenhouse gas emissions must reach zero globally to avoid a level of global warming expected to result in more catastrophic climate disasters than the deadly floods, fires and storms seen in recent years worldwide.

    But finding renewable and realistic replacements to the way societies manufacture products means getting innovative at the molecular level, explained Vlachos, Unidel Dan Rich Chair in Energy Professor of Chemical and Biomolecular Engineering, director of the Catalysis Center for Energy Innovation and director of the Delaware Energy Institute.

    That task can be tackled much more efficiently when computer intelligence gets involved. Instead of using expensive laboratory equipment, chemicals, molecules and catalysts, researchers can use chemistry-informed and data science-informed computer programs to point them in the right direction.

    “You don’t want to build a $20 billion plant and then it fails. That would be a disaster,” Vlachos said. “I want my computer programs to make better predictions. So, how do you build the new chemical route to go from here to there? We need the computer to tell us.”

    That also means teaching computers chemistry, which means the models can only be as good as they’re trained to be. Still, these programs will be able to process information exponentially faster than a group of human researchers engaged in trial and error experiments, while also eliminating the need for physical resources.

    For example, computer programs could extract everything from existing scientific literature about how a molecule like the sugar glucose reacts. With that information, the model could then explore how different combinations of different molecules act, and what new outcomes varying combinations might have, like how combinations of sugar and salt might impact a cake mix.
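    As a toy illustration of that kind of literature mining, a few lines of Python can pull simple "A + B -> C" reaction mentions out of free text. The sentences and pattern below are invented for this sketch and are not drawn from the project; real pipelines use trained language models rather than regular expressions.

```python
import re

# Invented example sentences standing in for mined literature text.
ABSTRACTS = [
    "Under mild catalysis we observe glucose + oxygen -> gluconic acid.",
    "No reaction was detected for the control sample.",
    "High pressure gives fructose + hydrogen -> mannitol.",
]

# Match "reactant + reactant -> product" phrases; the product may
# span multiple words and ends at the sentence's closing period.
pattern = re.compile(r"(\w+)\s*\+\s*(\w+)\s*->\s*([\w ]+)")

reactions = []
for text in ABSTRACTS:
    for match in pattern.finditer(text):
        reactions.append(tuple(part.strip() for part in match.groups()))

print(reactions)
```

    Running this collects the two reaction mentions while skipping the sentence with no reaction, which is the essence of turning prose into structured, machine-usable data.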

    “We’re transforming this whole process and using computer science and computer simulations to understand options and give you the best alternative without even going into the lab,” said Ierapetritou. “That’s why it’s called ‘future manufacturing.’ It doesn’t address the current needs of manufacturing, but rather the future needs and where we’d like to go.”

    Beyond searching for the ideal molecules and processes to build more sustainable products, the project will also look at the feasibility of producing those items. If, for example, the raw materials to create something are only available seasonally at certain locations, like useable waste from corn after the fall harvest, would it be more efficient and cost-effective to have modular manufacturing units that can be relocated at certain times of the year instead of building one huge manufacturing plant that would require additional transportation of raw items?

    “I think it might revolutionize the way we’re thinking about supply chain,” Ierapetritou said. “Especially now that we’re all paying the price of not optimizing supply chains.”

    This same interdisciplinary team also is working on a similar project funded by NSF in collaboration with The University of Kansas (US) and Pittsburg State University (US) to find sustainable replacements for plastics.

    To work in practice, these solutions also need to be grounded in reality, meaning they must account for the logistics and costs of production and processing, real-world markets, consumer attitudes and potential environmental impacts.

    “The second piece is the market piece,” said Shah, assistant professor with the Biden School. “How do we market this green or eco-friendly approach to industry?”

    The modeling Shah will work with through this project — called “agent-based modeling” — will allow researchers to simulate real-world circumstances to explore whether a certain product would work at a scaled-up level, he explained. But human behavior isn’t exactly easy to model.

    “We’re not going to assume we know how different kinds of actors are going to act,” Shah said. “We’re going to do behavioral surveys of people, communities, businesses, and use the behavioral and physical principles and ideas to try to translate what we get from the surveys into rules that we can program.”

    This project focuses on optimizing future processes that will be needed to develop new products that could offer climate-related benefits, either from the way that foundational materials are harvested, how those base chemicals are processed to how the products are actually produced. That includes the supply chain processes from start to finish, as well as the role that marketing a new “green” solution will play.

    By using agent-based modeling, researchers can simulate how a product and its related processing would fit into particular sectors, or where their proposed idea might hit unexpected roadblocks.

    “Think of it like whatever goes into the programming of the Sims game,” Shah said, noting that their model outputs will be a collection of numbers, diagrams and statistics, not nearly as aesthetic as a multi-million dollar virtual game.
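    A threshold-style agent-based model of the kind Shah describes can be sketched in a few lines of Python. Everything below (the thresholds, parameters, and adoption rule) is invented for illustration; the project's actual models would draw these rules from behavioral surveys rather than from a random number generator.

```python
import random

def run_adoption_model(n_agents=200, steps=30, seed=42):
    """Toy agent-based sketch: each agent adopts a 'green' product once
    the population-wide adoption rate reaches its personal threshold."""
    rng = random.Random(seed)
    # Thresholds are skewed low (illustrative assumption) so that a small
    # group of early adopters can trigger a wider cascade.
    thresholds = [rng.random() * 0.8 for _ in range(n_agents)]
    adopted = [t < 0.05 for t in thresholds]  # unconditional early adopters
    for _ in range(steps):
        rate = sum(adopted) / n_agents
        # An agent adopts when observed adoption meets its threshold.
        adopted = [a or (t <= rate) for a, t in zip(adopted, thresholds)]
    return sum(adopted) / n_agents

final_rate = run_adoption_model()
print(f"final adoption rate: {final_rate:.2f}")
```

    Even this toy version shows the characteristic ABM behavior: whether adoption cascades or stalls depends on the distribution of individual thresholds, which is exactly the kind of rule the surveys are meant to pin down.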

    As scientists try to communicate a dire need for swift changes to address the climate crisis, simulations like the one Shah plans to develop for this project could be applied to other projects, as well. The computational and modeling work led by Fang could also be applied to other similar projects.

    “There are models at multiple scales and multiple levels, and eventually we want to bring that all together,” Vlachos said. “Then we need to bring in society and decision-making, not just the science itself, and see where we go.”

    “A science-based, fact-based, data-based model can help the country move in the right direction with the right science. Now we need to deliver. We will deliver.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Delaware campus

    The University of Delaware (US) is a public land-grant research university located in Newark, Delaware. It is the largest university in Delaware. It offers three associate’s programs, 148 bachelor’s programs, 121 master’s programs (with 13 joint degrees), and 55 doctoral programs across its eight colleges. The main campus is in Newark, with satellite campuses in Dover, the Wilmington area, Lewes, and Georgetown. It is considered a large institution with approximately 18,200 undergraduate and 4,200 graduate students. It is a privately governed university which receives public funding for being a land-grant, sea-grant, and space-grant state-supported research institution.

    University of Delaware (US) is classified among “R1: Doctoral Universities – Very high research activity”. According to The National Science Foundation (US), UD spent $186 million on research and development in 2018, ranking it 119th in the nation. It is recognized with the Community Engagement Classification by the Carnegie Foundation for the Advancement of Teaching.

    University of Delaware (US) is one of only four schools in North America with a major in art conservation. In 1923, it was the first American university to offer a study-abroad program.

    University of Delaware (US) traces its origins to a “Free School,” founded in New London, Pennsylvania in 1743. The school moved to Newark, Delaware by 1765, becoming the Newark Academy. The academy trustees secured a charter for Newark College in 1833 and the academy became part of the college, which changed its name to Delaware College in 1843. While it is not considered one of the colonial colleges because it was not a chartered institution of higher education during the colonial era, its original class of ten students included George Read, Thomas McKean, and James Smith, all three of whom went on to sign the Declaration of Independence. Read also later signed the United States Constitution.

    Science, Technology and Advanced Research (STAR) Campus

    On October 23, 2009, the University of Delaware (US) signed an agreement with Chrysler to purchase a shuttered vehicle assembly plant adjacent to the university for $24.25 million as part of Chrysler’s bankruptcy restructuring plan. The university has developed the 272-acre (1.10 km²) site into the Science, Technology and Advanced Research (STAR) Campus. The site is the new home of University of Delaware (US)’s College of Health Sciences, which includes teaching and research laboratories and several public health clinics. The STAR Campus also includes research facilities for University of Delaware (US)’s vehicle-to-grid technology, as well as Delaware Technology Park, SevOne, CareNow, Independent Prosthetics and Orthotics, and the East Coast headquarters of Bloom Energy. In 2020 [needs an update], University of Delaware (US) expects to open the Ammon Pinozzotto Biopharmaceutical Innovation Center, which will become the new home of the UD-led National Institute for Innovation in Manufacturing Biopharmaceuticals. Also, Chemours recently opened its global research and development facility, known as the Discovery Hub, on the STAR Campus in 2020. The new Newark Regional Transportation Center on the STAR Campus will serve passengers of Amtrak and regional rail.


    The university is organized into nine colleges:

    Alfred Lerner College of Business and Economics
    College of Agriculture and Natural Resources
    College of Arts and Sciences
    College of Earth, Ocean and Environment
    College of Education and Human Development
    College of Engineering
    College of Health Sciences
    Graduate College
    Honors College

    There are also five schools:

    Joseph R. Biden, Jr. School of Public Policy and Administration (part of the College of Arts & Sciences)
    School of Education (part of the College of Education & Human Development)
    School of Marine Science and Policy (part of the College of Earth, Ocean and Environment)
    School of Nursing (part of the College of Health Sciences)
    School of Music (part of the College of Arts & Sciences)

  • richardmitnick 10:51 am on January 11, 2022 Permalink | Reply
    Tags: "Looking at a new quantum revolution", Computing

    From Symmetry: “Looking at a new quantum revolution” 


    Kathryn Jepsen

    Illustration by Sandbox Studio, Chicago with Ana Kova.

    This month, Symmetry presents a series of articles on the past, present and future of quantum research—and its many connections to particle physics, astrophysics and computing.

    On July 25, 2018, a group of scientists from Microsoft, Google and IBM sat on a stage at the Computer History Museum in Mountain View, California. Matthias Troyer, John Martinis and Pat Gumann were all working on research into quantum computing, which takes advantage of our knowledge of quantum mechanics, the physics of how the world operates at the smallest level.

    The evening was billed as a night to ask the experts “Quantum Questions.”

    [Video: CHM Live | Quantum Questions]

    About an hour into the event, moderator and historian David Brock asked the scientists one last thing: “What do you think—for us as, you know, citizens of the world—what are the most important things for us to know about and keep in mind about quantum computing, as it is today?”

    Troyer called attention to the museum displays around them. “When you look back at the history of computing… the abacus works on the same principle as the most modern, fastest classical CPU. It’s discrete, digital logic. There’s been no change in the way we compute for the last 5,000 years.

    “And now is the time when this is changing,” he said, “because with quantum computing we are radically changing the way we use nature to compute.”

    Scientists have called this moment a second quantum revolution. The first quantum revolution brought us developments like the transistor, which enabled the creation of powerful, portable modern electronic devices.

    It’s not yet clear what this new revolution will bring. But plenty of computer scientists, physicists and engineers are hard at work to find out. Around the world, research institutions, universities and businesses have been ramping up their investments in quantum science.

    At the end of 2018, the United States passed the National Quantum Initiative Act, which led to the establishment of five new Department of Energy Quantum Information Science Research Centers; five new National Science Foundation Quantum Leap Challenge Institutes; and the National Institute of Standards and Technology’s Quantum Economic Development Consortium.

    Efforts to develop quantum computers, quantum sensors and quantum networks have the potential to change our lives. And some of the first applications of these developments could be in particle physics and astrophysics research.

    Throughout the month of January, Symmetry will publish a series of articles meant to give readers a better understanding of this quantum ecosystem—the physics ideas it’s based on, the ways this knowledge can be applied, and what will determine the shape of our quantum future.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 9:38 am on November 16, 2021 Permalink | Reply
    Tags: "Stanford researchers design a frugal way to study complex systems and materials", 3-body problem; “n-body” problem, “N-body” problems occur in everything from how proteins fold to understanding complex materials., Computing, Scientists are particularly interested in the physics of exotic materials including new magnetic materials unlike anything found in nature., Scientists invented a new way to rapidly prototype complex geometries mirroring symmetries present in problems of interest., Various macroscopic analogies that replicate multi-body interaction and specific geometry of the problem are tremendously insightful.

    From Stanford University (US) : “Stanford researchers design a frugal way to study complex systems and materials” 


    November 15, 2021
    Taylor Kubota

    To celebrate the 60th birthday of King Oscar II of Sweden and Norway in 1889, the journal Acta Mathematica offered a prize for manuscripts that could help solve the following question, generally referred to as the 3-body problem: Can we predict the orbits of planets, moons and other celestial bodies over time? Although mathematician Henri Poincaré was awarded the gold medal and 2,500 Swedish kronor prize for his submission (later found to have an error), a general analytical solution to the “n-body” problem has remained elusive. Beyond celestial mechanics, “n-body” problems occur in everything from how proteins fold to understanding complex materials.
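    Because no general closed-form solution exists, n-body systems are studied numerically or, as in the work described below, by physical analogy. As a rough illustration only (not taken from the Stanford work), here is a minimal sketch of direct numerical integration of a toy three-body system; the masses, initial conditions, and leapfrog timestep are all invented toy values.

```python
# Minimal 2D gravitational n-body sketch (toy values, illustrative only).
# Leapfrog (kick-drift-kick) integration of pairwise inverse-square attraction.

def accelerations(positions, masses, G=1.0, softening=1e-3):
    """Acceleration on each body from all other bodies."""
    n = len(positions)
    acc = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = positions[j][0] - positions[i][0]
            dy = positions[j][1] - positions[i][1]
            r2 = dx * dx + dy * dy + softening  # softening avoids divide-by-zero
            inv_r3 = r2 ** -1.5
            acc[i][0] += G * masses[j] * dx * inv_r3
            acc[i][1] += G * masses[j] * dy * inv_r3
    return acc

def leapfrog(positions, velocities, masses, dt=0.01, steps=1000):
    """Advance the system with the symplectic kick-drift-kick scheme."""
    acc = accelerations(positions, masses)
    for _ in range(steps):
        for i in range(len(positions)):
            velocities[i][0] += 0.5 * dt * acc[i][0]  # half kick
            velocities[i][1] += 0.5 * dt * acc[i][1]
            positions[i][0] += dt * velocities[i][0]  # full drift
            positions[i][1] += dt * velocities[i][1]
        acc = accelerations(positions, masses)
        for i in range(len(positions)):
            velocities[i][0] += 0.5 * dt * acc[i][0]  # half kick
            velocities[i][1] += 0.5 * dt * acc[i][1]
    return positions, velocities

# Toy three-body setup: one heavy central mass, two light orbiters.
masses = [1.0, 1e-3, 1e-3]
positions = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.5]]
velocities = [[0.0, 0.0], [0.0, 1.0], [-0.8, 0.0]]
positions, velocities = leapfrog(positions, velocities, masses)
```

    The point of the sketch is the scaling: every step touches every pair of bodies, which is what makes large n-body problems expensive and macroscopic analog experiments attractive.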

    Illustration of droplet lattices that can be rapidly prototyped to study the role of geometry in macroscopic analogs of complex materials. Image credit: Rebecca Konte, Resident Artist, Prakash Lab.

    While no single analytical insight has cracked these complex problems, various macroscopic analogies that replicate the multi-body interactions and specific geometry of the problem are tremendously insightful. Among them, now, is a simple table-top experimental method developed by Stanford University engineers. All that’s needed to begin is a slippery surface (say, a glass slide), a permanent marker and a mixture of water and propylene glycol, a common ingredient in food coloring.

    With these supplies, the researchers invented a new way to rapidly prototype complex geometries mirroring symmetries present in problems of interest. Instead of planets strewn about the solar system, many tiny droplets interact with each other at a distance and the observer can directly watch and manipulate how the system evolves over time. The researchers detailed their new method in a paper published Aug. 24 in PNAS.

    The simple – and inexpensive – technique could be applied to many different questions across myriad fields. Other methods for studying these problems tend to be purely theoretical or require expensive and rare equipment at the nanoscale. For their part, the researchers are particularly interested in the physics of exotic materials including new magnetic materials unlike anything found in nature.

    “People are beginning to be able to fabricate almost any material in any geometry that they want – but these systems are so complex that people don’t have the capability to truly understand them,” said Anton Molina, lead author of the paper and a graduate student in the lab of Manu Prakash, associate professor of bioengineering. “So, we are excited about being able to use this rapid, frugal tool to quickly explore many possible configurations.”

    Back of the envelope experiments

    The idea to explore the role of geometry in self-interacting systems originated many years ago, when the group described a new class of active matter system dubbed “dancing droplets” [Nature] – in which a complex interplay of interactions emerges among evaporating droplets that can sense each other and move autonomously, almost like crawling cells. The next big challenge in taming that system was to incorporate defined interactions.

    A droplet lattice made of gold with “dancing droplets” of propylene glycol. Image credit: Prakash Lab.

    “What we wanted was the simplest possible system in which the geometry of the interaction is completely programmable and tunable, while at the same time it behaves like a complex system and results in things that we could not predict,” said Prakash, who is senior author of the paper.

    Building on that dancing droplet work, the researchers knew this food additive was capable of imitating the interactions between different “bodies” in many-body systems – such as the gravitational forces between celestial objects or the electrostatic forces between atoms. Next, to introduce complex geometry critical in emulating many complex systems, the researchers created droplet lattices either carefully printed in gold or literally drawn with permanent marker.

    “We decided to use lithography printing for precision, but we also did a lot of the prototyping using permanent markers,” said Molina. The simple process meant the researchers could go from sketching a design while relaxing in the courtyard outside their lab to experimentally testing that design within hours. Prakash likened the process to “back of the envelope calculations” for experiments. This rapid exploration of the role of geometry in dynamical systems enables insights into exotic configurations and testing new ideas at a fast pace.

    The researchers focused first on hexagonal lattices made of smaller hexagons because that is the simplest structure that leads to a non-trivial evolution of dynamics in these droplet lattices. Motivated to achieve the lowest possible energy state for the system as a whole, the droplets form clumps of three and, overall, move toward the center of the lattice. Their first moves happen soon after they are put into the lattice, but individual changes – which then trigger changes among other droplets – continue for several minutes. The multi-stage organization of the system is a universal feature of systems with long-range interactions, where all droplets are simultaneously communicating and pulling and pushing on each other via an invisible vapor phase.
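    This kind of relaxation toward a low-energy configuration can be caricatured in a few lines of code. The following is purely a toy sketch, not the model from the PNAS paper: the pair potential E(r) = 1/r² − 2/r (short-range repulsion plus longer-range attraction, minimized at r = 1) and the plain gradient-descent “dynamics” are invented stand-ins for the real vapor-mediated droplet interactions.

```python
# Toy energy-relaxation sketch: "droplets" as 2D points that attract at a
# distance and repel up close, relaxing by gradient descent. Assumed model,
# not the published one.
import math

def pair_energy(points):
    """Total energy summed over all droplet pairs, E(r) = 1/r^2 - 2/r."""
    e = 0.0
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            r = math.dist(points[i], points[j])
            e += 1.0 / r**2 - 2.0 / r
    return e

def relax(points, lr=0.02, steps=2000):
    """Move every droplet a small step down the energy gradient, repeatedly."""
    pts = [list(p) for p in points]
    for _ in range(steps):
        grad = [[0.0, 0.0] for _ in pts]
        for i in range(len(pts)):
            for j in range(len(pts)):
                if i == j:
                    continue
                dx = pts[i][0] - pts[j][0]
                dy = pts[i][1] - pts[j][1]
                r = math.hypot(dx, dy)
                # dE/dr for E = 1/r^2 - 2/r, projected onto x and y
                dEdr = -2.0 / r**3 + 2.0 / r**2
                grad[i][0] += dEdr * dx / r
                grad[i][1] += dEdr * dy / r
        for i in range(len(pts)):
            pts[i][0] -= lr * grad[i][0]
            pts[i][1] -= lr * grad[i][1]
    return pts

# Six droplets placed on a hexagon of radius 2, then allowed to relax.
hexagon = [[2 * math.cos(k * math.pi / 3), 2 * math.sin(k * math.pi / 3)]
           for k in range(6)]
e_before = pair_energy(hexagon)
relaxed = relax(hexagon)
e_after = pair_energy(relaxed)
```

    Even in this crude cartoon, every droplet interacts with every other droplet at every step, mirroring the all-to-all, long-range character of the vapor-mediated coupling described above.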

    “Over time, everything’s evaporating and water vapor is leaving the system. Locally, these triplets preserve the lifetime of the droplet the longest,” explained Molina. “Watching the system gives you an answer for how these droplets actually do it, and yet it’s still puzzling to understand the individual motion of a given droplet.”

    An invitation to explore

    In addition to their hexagons, the researchers created various square lattices that map onto common models in physics, computing and materials science. Further understanding these systems could help inform the design of next-generation substrates for computing, which might depart from the conventional architectures we are used to seeing in current microchips.

    However, as often happens with work from the Prakash lab, this experimental tool is not only intended for academics and experts. A marker and some food coloring could lead to solving Acta Mathematica’s prize question, or it could be a means of explaining the fundamental role of geometry in materials or complex energy states to schoolchildren.

    “You might say, ‘Oh, to do fundamental science and discover new rules of nature, I need this and that.’ But even for experiments that’s not true,” said Prakash. “So, I hope people consider this an open invitation to explore because we have taken these experiments that were incredibly hard and made them, really, really simple. With creativity, there are ways of asking some really fundamental questions.”

    Additional Stanford co-authors include postdoctoral scholar Shailabh Kumar and former postdoctoral scholar Stefan Karpitschka (now at the MPG Institute for Dynamics and Self-Organization [MPG Institut für Dynamik und Selbstorganisation] (DE)). Prakash is also a senior fellow at the Stanford Woods Institute for the Environment; a member of Bio-X, the Maternal & Child Health Research Institute and the Wu Tsai Neurosciences Institute; a faculty fellow at The Howard Hughes Medical Institute (HHMI) (US); and an investigator at the Chan Zuckerberg Biohub.

    This research was funded by the National Science Foundation, the Keck Foundation, an HHMI-Gates Faculty Scholar Award and a CZI Biohub Investigator Award.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Stanford University campus

    Leland and Jane Stanford founded Stanford University (US) to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

    Stanford University, officially Leland Stanford Junior University, is a private research university located in Stanford, California. Stanford was founded in 1885 by Leland and Jane Stanford in memory of their only child, Leland Stanford Jr., who had died of typhoid fever at age 15 the previous year. Stanford is consistently ranked as among the most prestigious and top universities in the world by major education publications. It is also one of the top fundraising institutions in the country, becoming the first school to raise more than a billion dollars in a year.

    Leland Stanford was a U.S. senator and former governor of California who made his fortune as a railroad tycoon. The school admitted its first students on October 1, 1891, as a coeducational and non-denominational institution. Stanford University struggled financially after the death of Leland Stanford in 1893 and again after much of the campus was damaged by the 1906 San Francisco earthquake. Following World War II, provost Frederick Terman supported faculty and graduates’ entrepreneurialism to build self-sufficient local industry in what would later be known as Silicon Valley.

    The university is organized around seven schools: three schools consisting of 40 academic departments at the undergraduate level as well as four professional schools that focus on graduate programs in law, medicine, education, and business. All schools are on the same campus. Students compete in 36 varsity sports, and the university is one of two private institutions in the Division I FBS Pac-12 Conference. It has gained 126 NCAA team championships, and Stanford has won the NACDA Directors’ Cup for 24 consecutive years, beginning in 1994–1995. In addition, Stanford students and alumni have won 270 Olympic medals including 139 gold medals.

    As of October 2020, 84 Nobel laureates, 28 Turing Award laureates, and eight Fields Medalists have been affiliated with Stanford as students, alumni, faculty, or staff. In addition, Stanford is particularly noted for its entrepreneurship and is one of the most successful universities in attracting funding for start-ups. Stanford alumni have founded numerous companies, which combined produce more than $2.7 trillion in annual revenue, roughly equivalent to the 7th largest economy in the world (as of 2020). Stanford is the alma mater of one president of the United States (Herbert Hoover), 74 living billionaires, and 17 astronauts. It is also one of the leading producers of Fulbright Scholars, Marshall Scholars, Rhodes Scholars, and members of the United States Congress.

    Stanford University was founded in 1885 by Leland and Jane Stanford, dedicated to Leland Stanford Jr, their only child. The institution opened in 1891 on Stanford’s previous Palo Alto farm.

    Jane and Leland Stanford modeled their university after the great eastern universities, most specifically Cornell University. When Stanford opened in 1891 it was called the “Cornell of the West” because many of its faculty were former Cornell affiliates (professors, alumni, or both), including its first president, David Starr Jordan, and second president, John Casper Branner. Both Cornell and Stanford were among the first universities to make higher education accessible, nonsectarian, and open to women as well as to men. Cornell is credited as one of the first American universities to adopt this radical departure from traditional education, and Stanford became an early adopter as well.

    Although the campus was damaged by earthquakes in both 1906 and 1989, it was rebuilt each time. In 1919, the Hoover Institution on War, Revolution and Peace was started by Herbert Hoover to preserve artifacts related to World War I. The Stanford Medical Center, completed in 1959, is a teaching hospital with over 800 beds. The DOE’s SLAC National Accelerator Laboratory (US) (originally named the Stanford Linear Accelerator Center), established in 1962, performs research in particle physics.


    Most of Stanford is on an 8,180-acre (12.8 sq mi; 33.1 km^2) campus, one of the largest in the United States. It is located on the San Francisco Peninsula, in the northwest part of the Santa Clara Valley (Silicon Valley) approximately 37 miles (60 km) southeast of San Francisco and approximately 20 miles (30 km) northwest of San Jose. In 2008, 60% of this land remained undeveloped.

    Stanford’s main campus includes a census-designated place within unincorporated Santa Clara County, although some of the university land (such as the Stanford Shopping Center and the Stanford Research Park) is within the city limits of Palo Alto. The campus also includes much land in unincorporated San Mateo County (including the SLAC National Accelerator Laboratory and the Jasper Ridge Biological Preserve), as well as in the city limits of Menlo Park (Stanford Hills neighborhood), Woodside, and Portola Valley.

    Non-central campus

    Stanford currently operates in various locations outside of its central campus.

    On the founding grant:

    Jasper Ridge Biological Preserve is a 1,200-acre (490 ha) natural reserve south of the central campus owned by the university and used by wildlife biologists for research.
    SLAC National Accelerator Laboratory is a facility west of the central campus operated by the university for the Department of Energy. It contains the longest linear particle accelerator in the world, 2 miles (3.2 km) on 426 acres (172 ha) of land.
    Golf course and a seasonal lake: The university also has its own golf course and a seasonal lake (Lake Lagunita, actually an irrigation reservoir), both home to the vulnerable California tiger salamander. As of 2012 Lake Lagunita was often dry and the university had no plans to artificially fill it.

    Off the founding grant:

    Hopkins Marine Station, in Pacific Grove, California, is a marine biology research center owned by the university since 1892.
    Study abroad locations: unlike typical study abroad programs, Stanford itself operates in several locations around the world; thus, each location has Stanford faculty-in-residence and staff in addition to students, creating a “mini-Stanford”.

    Redwood City campus for many of the university’s administrative offices located in Redwood City, California, a few miles north of the main campus. In 2005, the university purchased a small, 35-acre (14 ha) campus in Midpoint Technology Park intended for staff offices; development was delayed by The Great Recession. In 2015 the university announced a development plan and the Redwood City campus opened in March 2019.

    The Bass Center in Washington, DC provides a base, including housing, for the Stanford in Washington program for undergraduates. It includes a small art gallery open to the public.

    China: Stanford Center at Peking University, housed in the Lee Jung Sen Building, is a small center for researchers and students, in collaboration with Peking University [北京大学] (CN) and its Kavli Institute for Astronomy and Astrophysics (KIAA-PKU).

    Administration and organization

    Stanford is a private, non-profit university that is administered as a corporate trust governed by a privately appointed board of trustees with a maximum membership of 38. Trustees serve five-year terms (not more than two consecutive terms) and meet five times annually. A new trustee is chosen by the current trustees by ballot. The Stanford trustees also oversee the Stanford Research Park, the Stanford Shopping Center, the Cantor Center for Visual Arts, Stanford University Medical Center, and many associated medical facilities (including the Lucile Packard Children’s Hospital).

    The board appoints a president to serve as the chief executive officer of the university, to prescribe the duties of professors and course of study, to manage financial and business affairs, and to appoint nine vice presidents. The provost is the chief academic and budget officer, to whom the deans of each of the seven schools report. Persis Drell became the 13th provost in February 2017.

    As of 2018, the university was organized into seven academic schools. The schools of Humanities and Sciences (27 departments), Engineering (nine departments), and Earth, Energy & Environmental Sciences (four departments) have both graduate and undergraduate programs while the Schools of Law, Medicine, Education and Business have graduate programs only. The powers and authority of the faculty are vested in the Academic Council, which is made up of tenure and non-tenure line faculty, research faculty, senior fellows in some policy centers and institutes, the president of the university, and some other academic administrators, but most matters are handled by the Faculty Senate, made up of 55 elected representatives of the faculty.

    The Associated Students of Stanford University (ASSU) is the student government for Stanford and all registered students are members. Its elected leadership consists of the Undergraduate Senate elected by the undergraduate students, the Graduate Student Council elected by the graduate students, and the President and Vice President elected as a ticket by the entire student body.

    Stanford is the beneficiary of a special clause in the California Constitution, which explicitly exempts Stanford property from taxation so long as the property is used for educational purposes.

    Endowment and donations

    The university’s endowment, managed by the Stanford Management Company, was valued at $27.7 billion as of August 31, 2019. Payouts from the Stanford endowment covered approximately 21.8% of university expenses in the 2019 fiscal year. In the 2018 NACUBO-TIAA survey of colleges and universities in the United States and Canada, only Harvard University(US), the University of Texas System(US), and Yale University(US) had larger endowments than Stanford.

    In 2006, President John L. Hennessy launched a five-year campaign called the Stanford Challenge, which reached its $4.3 billion fundraising goal in 2009, two years ahead of time, but continued fundraising for the duration of the campaign. It concluded on December 31, 2011, having raised a total of $6.23 billion and breaking the previous campaign fundraising record of $3.88 billion held by Yale. Specifically, the campaign raised $253.7 million for undergraduate financial aid, as well as $2.33 billion for its initiative in “Seeking Solutions” to global problems, $1.61 billion for “Educating Leaders” by improving K-12 education, and $2.11 billion for “Foundation of Excellence” aimed at providing academic support for Stanford students and faculty. Funds supported 366 new fellowships for graduate students, 139 new endowed chairs for faculty, and 38 new or renovated buildings. The new funding also enabled the construction of a facility for stem cell research; a new campus for the business school; an expansion of the law school; a new Engineering Quad; a new art and art history building; an on-campus concert hall; a new art museum; and a planned expansion of the medical school, among other things. In 2012, the university raised $1.035 billion, becoming the first school to raise more than a billion dollars in a year.

    Research centers and institutes

    DOE’s SLAC National Accelerator Laboratory(US)
    Stanford Research Institute, a center of innovation to support economic development in the region.
    Hoover Institution, a conservative American public policy institution and research institution that promotes personal and economic liberty, free enterprise, and limited government.
    Hasso Plattner Institute of Design, a multidisciplinary design school in cooperation with the Hasso Plattner Institute of University of Potsdam [Universität Potsdam] (DE) that integrates product design, engineering, and business management education.
    Martin Luther King Jr. Research and Education Institute, which grew out of and still contains the Martin Luther King Jr. Papers Project.
    John S. Knight Fellowship for Professional Journalists
    Center for Ocean Solutions
    Together with UC Berkeley(US) and UC San Francisco(US), Stanford is part of the Biohub, a new medical science research center founded in 2016 by a $600 million commitment from Facebook CEO and founder Mark Zuckerberg and pediatrician Priscilla Chan.

    Discoveries and innovation

    Natural sciences

    Biological synthesis of deoxyribonucleic acid (DNA) – Arthur Kornberg synthesized DNA material and won the 1959 Nobel Prize in Physiology or Medicine for his work at Stanford.
    First transgenic organism – Stanley Cohen and Herbert Boyer were the first scientists to transplant genes from one living organism to another, a fundamental discovery for genetic engineering. Thousands of products have been developed on the basis of their work, including human growth hormone and the hepatitis B vaccine.
    Laser – Arthur Leonard Schawlow shared the 1981 Nobel Prize in Physics with Nicolaas Bloembergen and Kai Siegbahn for his work on lasers.
    Nuclear magnetic resonance – Felix Bloch developed new methods for nuclear magnetic precision measurements, which are the underlying principles of the MRI.

    Computer and applied sciences

    ARPANET – Stanford Research Institute, formerly part of Stanford but on a separate campus, was the site of one of the four original ARPANET nodes.

    Internet—Stanford was the site where the original design of the Internet was undertaken. Vint Cerf led a research group to elaborate the design of the Transmission Control Protocol (TCP/IP) that he originally co-created with Robert E. Kahn (Bob Kahn) in 1973 and which formed the basis for the architecture of the Internet.

    Frequency modulation synthesis – John Chowning of the Music department invented the FM music synthesis algorithm in 1967, and Stanford later licensed it to Yamaha Corporation.

    Google – Google began in January 1996 as a research project by Larry Page and Sergey Brin when they were both PhD students at Stanford. They were working on the Stanford Digital Library Project (SDLP). The SDLP’s goal was “to develop the enabling technologies for a single, integrated and universal digital library” and it was funded through the National Science Foundation, among other federal agencies.

    Klystron tube – invented by the brothers Russell and Sigurd Varian at Stanford. Their prototype was completed and demonstrated successfully on August 30, 1937. Upon publication in 1939, news of the klystron immediately influenced the work of U.S. and UK researchers working on radar equipment.

    RISC – ARPA-funded VLSI project of microprocessor design. Stanford and UC Berkeley are most associated with the popularization of this concept. The Stanford MIPS would go on to be commercialized as the successful MIPS architecture, while Berkeley RISC gave its name to the entire concept, commercialized as the SPARC. Another success from this era was IBM’s effort that eventually led to the IBM POWER instruction set architecture, PowerPC, and Power ISA. As these projects matured, a wide variety of similar designs flourished in the late 1980s and especially the early 1990s, representing a major force in the Unix workstation market as well as embedded processors in laser printers, routers and similar products.
    SUN workstation – Andy Bechtolsheim designed the SUN workstation for the Stanford University Network communications project as a personal CAD workstation, which led to Sun Microsystems.

    Businesses and entrepreneurship

    Stanford is one of the most successful universities in creating companies and licensing its inventions to existing companies; it is often held up as a model for technology transfer. Stanford’s Office of Technology Licensing is responsible for commercializing university research, intellectual property, and university-developed projects.

    The university is described as having a strong venture culture in which students are encouraged, and often funded, to launch their own companies.

    Companies founded by Stanford alumni generate more than $2.7 trillion in annual revenue, equivalent to the 10th-largest economy in the world.

    Some companies closely associated with Stanford and their connections include:

    Hewlett-Packard, 1939, co-founders William R. Hewlett (B.S, PhD) and David Packard (M.S).
    Silicon Graphics, 1981, co-founders James H. Clark (Associate Professor) and several of his grad students.
    Sun Microsystems, 1982, co-founders Vinod Khosla (M.B.A), Andy Bechtolsheim (PhD) and Scott McNealy (M.B.A).
    Cisco, 1984, founders Leonard Bosack (M.S) and Sandy Lerner (M.S) who were in charge of Stanford Computer Science and Graduate School of Business computer operations groups respectively when the hardware was developed.
    Yahoo!, 1994, co-founders Jerry Yang (B.S, M.S) and David Filo (M.S).
    Google, 1998, co-founders Larry Page (M.S) and Sergey Brin (M.S).
    LinkedIn, 2002, co-founders Reid Hoffman (B.S), Konstantin Guericke (B.S, M.S), Eric Lee (B.S), and Alan Liu (B.S).
    Instagram, 2010, co-founders Kevin Systrom (B.S) and Mike Krieger (B.S).
    Snapchat, 2011, co-founders Evan Spiegel and Bobby Murphy (B.S).
    Coursera, 2012, co-founders Andrew Ng (Associate Professor) and Daphne Koller (Professor, PhD).

    Student body

    Stanford enrolled 6,996 undergraduate and 10,253 graduate students as of the 2019–2020 school year. Women comprised 50.4% of undergraduates and 41.5% of graduate students. In the same academic year, the freshman retention rate was 99%.

    Stanford awarded 1,819 undergraduate degrees, 2,393 master’s degrees, 770 doctoral degrees, and 3,270 professional degrees in the 2018–2019 school year. The four-year graduation rate for the class of 2017 cohort was 72.9%, and the six-year rate was 94.4%. The relatively low four-year graduation rate is a function of the university’s coterminal degree (or “coterm”) program, which allows students to earn a master’s degree as a 1-to-2-year extension of their undergraduate program.

    As of 2010, fifteen percent of undergraduates were first-generation students.


    As of 2016 Stanford had 16 male varsity sports and 20 female varsity sports, 19 club sports and about 27 intramural sports. In 1930, following a unanimous vote by the Executive Committee for the Associated Students, the athletic department adopted the mascot “Indian.” The Indian symbol and name were dropped by President Richard Lyman in 1972, after objections from Native American students and a vote by the student senate. The sports teams are now officially referred to as the “Stanford Cardinal,” referring to the deep red color, not the cardinal bird. Stanford is a member of the Pac-12 Conference in most sports and the Mountain Pacific Sports Federation in several other sports, competes in the America East Conference in field hockey, and participates in the NCAA’s Division I FBS.

    Its traditional sports rival is the University of California, Berkeley, the neighbor to the north in the East Bay. The winner of the annual “Big Game” between the Cal and Cardinal football teams gains custody of the Stanford Axe.

    Stanford has had at least one NCAA team champion every year since the 1976–77 school year and has earned 126 NCAA national team titles since its establishment, the most among universities; its athletes have also won 522 individual national championships, again the most of any university. Stanford has won the NACDA Directors’ Cup, awarded to the top-ranked Division I athletic program (formerly known as the Sears Cup), every year for the past twenty-four years. Stanford athletes have won medals in every Olympic Games since 1912, winning 270 Olympic medals in total, 139 of them gold. In both the 2008 and 2016 Summer Olympics, Stanford won more Olympic medals than any other university in the United States. Stanford athletes won 16 medals at the 2012 Summer Olympics (12 gold, two silver and two bronze) and 27 medals at the 2016 Summer Olympics.


    Traditions

    The unofficial motto of Stanford, selected by President Jordan, is Die Luft der Freiheit weht. Translated from the German, this quotation from Ulrich von Hutten means, “The wind of freedom blows.” The motto was controversial during World War I, when anything German was suspect; at that time the university disavowed that the motto was official.
    Hail, Stanford, Hail! is the Stanford Hymn sometimes sung at ceremonies or adapted by the various University singing groups. It was written in 1892 by mechanical engineering professor Albert W. Smith and his wife, Mary Roberts Smith (in 1896 she earned the first Stanford doctorate in Economics and later became associate professor of Sociology), but was not officially adopted until after a performance on campus in March 1902 by the Mormon Tabernacle Choir.
    “Uncommon Man/Uncommon Woman”: Stanford does not award honorary degrees, but in 1953 the degree of “Uncommon Man/Uncommon Woman” was created to recognize individuals who give rare and extraordinary service to the University. Technically, this degree is awarded by the Stanford Associates, a voluntary group that is part of the university’s alumni association. As Stanford’s highest honor, it is not conferred at prescribed intervals, but only when appropriate to recognize extraordinary service. Recipients include Herbert Hoover, Bill Hewlett, Dave Packard, Lucile Packard, and John Gardner.
    Big Game events: The events in the week leading up to the Big Game vs. UC Berkeley, including Gaieties (a musical written, composed, produced, and performed by the students of Ram’s Head Theatrical Society).
    “Viennese Ball”: a formal ball with waltzes that was initially started in the 1970s by students returning from the now-closed Stanford in Vienna overseas program. It is now open to all students.
    “Full Moon on the Quad”: An annual event at Main Quad, where students gather to kiss one another starting at midnight. Typically organized by the Junior class cabinet, the festivities include live entertainment, such as music and dance performances.
    “Band Run”: An annual festivity at the beginning of the school year, where the band picks up freshmen from dorms across campus while stopping to perform at each location, culminating in a finale performance at Main Quad.
    “Mausoleum Party”: An annual Halloween Party at the Stanford Mausoleum, the final resting place of Leland Stanford Jr. and his parents. A 20-year tradition, the “Mausoleum Party” was on hiatus from 2002 to 2005 due to a lack of funding, but was revived in 2006. In 2008, it was hosted in Old Union rather than at the actual Mausoleum, because rain prohibited generators from being rented. In 2009, after fundraising efforts by the Junior Class Presidents and the ASSU Executive, the event was able to return to the Mausoleum despite facing budget cuts earlier in the year.
    Former campus traditions include the “Big Game bonfire” on Lake Lagunita (a seasonal lake usually dry in the fall), which was formally ended in 1997 because of the presence of endangered salamanders in the lake bed.

    Award laureates and scholars

    Stanford’s current community of scholars includes:

    19 Nobel Prize laureates (as of October 2020, 85 affiliates in total)
    171 members of the National Academy of Sciences
    109 members of the National Academy of Engineering
    76 members of the National Academy of Medicine
    288 members of the American Academy of Arts and Sciences
    19 recipients of the National Medal of Science
    1 recipient of the National Medal of Technology
    4 recipients of the National Humanities Medal
    49 members of the American Philosophical Society
    56 fellows of the American Physical Society (since 1995)
    4 Pulitzer Prize winners
    31 MacArthur Fellows
    4 Wolf Foundation Prize winners
    2 ACL Lifetime Achievement Award winners
    14 AAAI fellows
    2 Presidential Medal of Freedom winners

    Stanford University Seal

  • richardmitnick 2:10 pm on September 28, 2021 Permalink | Reply
    Tags: "The co-evolution of particle physics and computing", Computing

    From Symmetry: “The co-evolution of particle physics and computing” 


    From Symmetry

    Stephanie Melchor

    Illustration by Sandbox Studio, Chicago with Ariel Davis.

    Over time, particle physics and astrophysics and computing have built upon one another’s successes. That co-evolution continues today.

    In the mid-twentieth century, particle physicists were peering deeper into the history and makeup of the universe than ever before. Over time, their calculations became too complex to fit on a blackboard—or to farm out to armies of human “computers” doing calculations by hand.

    To deal with this, they developed some of the world’s earliest electronic computers.

    Physics has played an important role in the history of computing. The transistor—the switch that controls the flow of electrical signal within a computer—was invented by a group of physicists at Bell Labs. The incredible computational demands of particle physics and astrophysics experiments have consistently pushed the boundaries of what is possible. They have encouraged the development of new technologies to handle tasks from dealing with avalanches of data to simulating interactions on the scales of both the cosmos and the quantum realm.

    But this influence doesn’t just go one way. Computing plays an essential role in particle physics and astrophysics as well. As computing has grown increasingly more sophisticated, its own progress has enabled new scientific discoveries and breakthroughs.

    Illustration by Sandbox Studio, Chicago with Ariel Davis.

    Managing an onslaught of data

    In 1973, scientists at DOE’s Fermi National Accelerator Laboratory (US) in Illinois got their first big mainframe computer: a 7-year-old hand-me-down from DOE’s Lawrence Berkeley National Laboratory (US). Called the CDC 6600, it weighed about 6 tons. Over the next five years, Fermilab added five more large mainframe computers to its collection.

    Then came the completion of the Tevatron—at the time, the world’s highest-energy particle accelerator—which would provide the particle beams for numerous experiments at the lab.


    FNAL/Tevatron map

    Tevatron Accelerator


    FNAL/Tevatron CDF detector

    FNAL/Tevatron DØ detector


    By the mid-1990s, two four-story particle detectors would begin selecting, storing and analyzing data from millions of particle collisions at the Tevatron per second. Called the Collider Detector at Fermilab and the DØ detector, these new experiments threatened to overpower the lab’s computational abilities.

    In December of 1983, a committee of physicists and computer scientists released a 103-page report highlighting the “urgent need for an upgrading of the laboratory’s computer facilities.” The report said the lab “should continue the process of catching up” in terms of computing ability, and that “this should remain the laboratory’s top computing priority for the next few years.”

    Instead of simply buying more large computers (which were incredibly expensive), the committee suggested a new approach: They recommended increasing computational power by distributing the burden over clusters or “farms” of hundreds of smaller computers.

    Thanks to Intel’s 1971 development of a new commercially available microprocessor the size of a domino, computers were shrinking. Fermilab was one of the first national labs to try the concept of clustering these smaller computers together, treating each particle collision as a computationally independent event that could be analyzed on its own processor.
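    Treating each collision as an independent unit of work is what is now called an embarrassingly parallel pattern. A minimal sketch in Python, with invented event data and a hypothetical `analyze_event` function standing in for real reconstruction code:

```python
from multiprocessing import Pool

def analyze_event(event):
    """Toy per-event analysis: sum the energies of the event's hits.

    Each event is independent of every other, so any worker can process
    it in isolation -- the property that made farm-style clustering work.
    """
    return sum(hit["energy"] for hit in event["hits"])

if __name__ == "__main__":
    # Hypothetical collision events, each a bundle of detector hits.
    events = [
        {"id": i, "hits": [{"energy": float(e)} for e in range(i + 1)]}
        for i in range(100)
    ]
    # A pool of workers plays the role of a farm of small computers.
    with Pool(processes=4) as pool:
        totals = pool.map(analyze_event, events)
    print(totals[:5])  # [0.0, 1.0, 3.0, 6.0, 10.0]
```

    The same decomposition scales from four local processes to hundreds of machines precisely because no event needs data from any other.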

    Like many new ideas in science, it wasn’t accepted without some pushback.

    Joel Butler, a physicist at Fermilab who was on the computing committee, recalls, “There was a big fight about whether this was a good idea or a bad idea.”

    A lot of people were enchanted with the big computers, he says. They were impressive-looking and reliable, and people knew how to use them. And then along came “this swarm of little tiny devices, packaged in breadbox-sized enclosures.”

    The computers were unfamiliar, and the companies building them weren’t well-established. On top of that, it wasn’t clear how well the clustering strategy would work.

    As for Butler? “I raised my hand [at a meeting] and said, ‘Good idea’—and suddenly my entire career shifted from building detectors and beamlines to doing computing,” he chuckles.

    Not long afterward, innovation that sparked for the benefit of particle physics enabled another leap in computing. In 1989, Tim Berners-Lee, a computer scientist at European Organization for Nuclear Research [Organisation européenne pour la recherche nucléaire] [Europäische Organisation für Kernforschung](CH) [CERN], launched the World Wide Web to help CERN physicists share data with research collaborators all over the world.

    To be clear, Berners-Lee didn’t create the internet—that was already underway in the form of the ARPANET, developed by the US Department of Defense.


    But the ARPANET connected only a few hundred computers, and it was difficult to share information across machines with different operating systems.

    The web Berners-Lee created was an application that ran on the internet, like email, and started as a collection of documents connected by hyperlinks. To get around the problem of accessing files between different types of computers, he developed HTML (HyperText Markup Language), a markup language that formatted and displayed files in a web browser independent of the local computer’s operating system.

    Berners-Lee also developed the first web browser, allowing users to access files stored on the first web server (Berners-Lee’s computer at CERN).

    NCSA MOSAIC Browser


    He implemented the concept of a URL (Uniform Resource Locator), specifying how and where to access desired web pages.
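    In modern terms, a URL packs the “how” (the scheme) and the “where” (the host and path) into a single string. Python’s standard library can take one apart; the URL below, widely cited as the address of the first web page, is used purely as an example:

```python
from urllib.parse import urlparse

# Dissect a URL into the parts Berners-Lee's scheme specifies:
# how to fetch the resource (scheme) and where it lives (host, path).
url = "http://info.cern.ch/hypertext/WWW/TheProject.html"
parts = urlparse(url)
print(parts.scheme)  # "http" -> the protocol to use
print(parts.netloc)  # "info.cern.ch" -> the server to contact
print(parts.path)    # "/hypertext/WWW/TheProject.html" -> the resource to request
```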

    What started out as an internal project to help particle physicists share data within their institution fundamentally changed not just computing, but how most people experience the digital world today.

    Back at Fermilab, cluster computing wound up working well for handling the Tevatron data. Eventually, it became industry standard for tech giants like Google and Amazon.

    Over the next decade, other US national laboratories adopted the idea, too. DOE’s SLAC National Accelerator Laboratory (US)—then called Stanford Linear Accelerator Center—transitioned from big mainframes to clusters of smaller computers to prepare for its own extremely data-hungry experiment, BaBar.

    SLAC National Accelerator Laboratory(US) BaBar

    Both SLAC and Fermilab also were early adopters of Berners-Lee’s web server. The labs set up the first two websites in the United States, paving the way for this innovation to spread across the continent.

    In 1989, in recognition of the growing importance of computing in physics, Fermilab Director John Peoples elevated the computing department to a full-fledged division. The head of a division reports directly to the lab director, making it easier to get resources and set priorities. Physicist Tom Nash formed the new Computing Division, along with Butler and two other scientists, Irwin Gaines and Victoria White. Butler led the division from 1994 to 1998.

    High-performance computing in particle physics and astrophysics

    These computational systems worked well for particle physicists for a long time, says Berkeley Lab astrophysicist Peter Nugent. That is, until Moore’s Law started grinding to a halt.

    Moore’s Law is the idea that the number of transistors in a circuit will double, making computers faster and cheaper, every two years. The term was first coined in the mid-1970s, and the trend reliably proceeded for decades. But now, computer manufacturers are starting to hit the physical limit of how many tiny transistors they can cram onto a single microchip.
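    That doubling rule compounds quickly: after t years it implies a growth factor of roughly 2^(t/2). A one-line check of what two decades of doubling every two years adds up to:

```python
def moore_growth(years, doubling_period=2.0):
    """Growth factor implied by Moore's Law after `years` years."""
    return 2 ** (years / doubling_period)

# Twenty years at one doubling every two years is ten doublings:
print(moore_growth(20))  # 1024.0 -> about a thousandfold increase
```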

    Because of this, says Nugent, particle physicists have been looking to take advantage of high-performance computing instead.

    Nugent says high-performance computing is “something more than a cluster, or a cloud-computing environment that you could get from Google or AWS, or at your local university.”

    What it typically means, he says, is that you have high-speed networking between computational nodes, allowing them to share information with each other very, very quickly. When you are computing on up to hundreds of thousands of nodes simultaneously, it massively speeds up the process.

    On a single traditional computer, he says, 100 million CPU hours translates to more than 11,000 years of continuous calculations. But for scientists using a high-performance computing facility at Berkeley Lab, DOE’s Argonne National Laboratory (US) or DOE’s Oak Ridge National Laboratory (US), 100 million CPU hours is a typical large allocation for a single year.
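    The “11,000 years” figure is straightforward arithmetic, as a quick sanity check shows:

```python
cpu_hours = 100_000_000        # a typical large annual HPC allocation
hours_per_year = 24 * 365      # ignoring leap years
years_single_cpu = cpu_hours / hours_per_year
print(round(years_single_cpu))  # 11416 -> over 11,000 years on one CPU
```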

    Although astrophysicists have always relied on high-performance computing for simulating the birth of stars or modeling the evolution of the cosmos, Nugent says they are now using it for their data analysis as well.

    This includes rapid image-processing computations that have enabled the observations of several supernovae, including SN 2011fe, captured just after it began. “We found it just a few hours after it exploded, all because we were able to run these pipelines so efficiently and quickly,” Nugent says.

    According to Berkeley Lab physicist Paolo Calafiura, particle physicists also use high-performance computing for simulations—for modeling not the evolution of the cosmos, but rather what happens inside a particle detector. “Detector simulation is significantly the most computing-intensive problem that we have,” he says.

    Scientists need to evaluate multiple possibilities for what can happen when particles collide. To properly correct for detector effects when analyzing particle detector experiments, they need to simulate more data than they collect. “If you collect 1 billion collision events a year,” Calafiura says, “you want to simulate 10 billion collision events.”
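    At its core, detector simulation means generating far more synthetic events than real ones and smearing each “true” outcome with the detector’s finite resolution. A toy sketch, in which the 10% Gaussian energy resolution and the event sample are invented for illustration:

```python
import random

def simulate_event(true_energy, resolution=0.10, rng=random):
    """Toy detector response: smear the true energy with Gaussian noise
    whose width is a fixed fraction of the true value."""
    return rng.gauss(true_energy, resolution * true_energy)

rng = random.Random(42)               # fixed seed for reproducibility
true_energies = [100.0] * 10_000      # 10k identical "true" events
measured = [simulate_event(e, rng=rng) for e in true_energies]

mean = sum(measured) / len(measured)
print(round(mean, 1))  # close to 100.0, with ~10% spread per event
```

    Scaling this idea up to full detector geometry and particle interactions is what makes simulation the dominant computing cost Calafiura describes.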

    Calafiura says that right now, he’s more worried about finding a way to store all of the simulated and actual detector data than he is about producing it, but he knows that won’t last.

    “When does physics push computing?” he says. “When computing is not good enough… We see that in five years, computers will not be powerful enough for our problems, so we are pushing hard with some radically new ideas, and lots of detailed optimization work.”

    That’s why The Department of Energy’s Exascale Computing Project aims to build, in the next few years, computers capable of performing a quintillion (that is, a billion billion) operations per second. The new computers will be 1000 times faster than the current fastest computers.

    Depiction of ANL ALCF Cray Intel SC18 Shasta Aurora exascale supercomputer, to be built at DOE’s Argonne National Laboratory.

    The exascale computers will also be used for other applications ranging from precision medicine to climate modeling to national security.

    Machine learning and quantum computing

    Innovations in computer hardware have enabled astrophysicists to push the kinds of simulations and analyses they can do. For example, Nugent says, the introduction of graphics processing units (GPUs) has sped up astrophysicists’ ability to do calculations used in machine learning, leading to an explosive growth of machine learning in astrophysics.

    With machine learning, which uses algorithms and statistics to identify patterns in data, astrophysicists can simulate entire universes in microseconds.

    Machine learning has been important in particle physics as well, says Fermilab scientist Nhan Tran. “[Physicists] have very high-dimensional data, very complex data,” he says. “Machine learning is an optimal way to find interesting structures in that data.”

    The same way a computer can be trained to tell the difference between cats and dogs in pictures, it can learn how to identify particles from physics datasets, distinguishing between things like pions and photons.
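    That training idea can be illustrated with a deliberately tiny classifier. Real analyses use neural networks on high-dimensional detector data; this sketch uses a nearest-centroid rule on two invented shower-shape features:

```python
# Toy nearest-centroid classifier: label a particle by whichever class's
# average feature vector ("centroid") it sits closest to. The features
# (say, shower width and depth) and all numbers are invented.

def centroid(samples):
    return [sum(col) / len(samples) for col in zip(*samples)]

def classify(x, centroids):
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda label: dist2(x, centroids[label]))

# Invented training examples: [shower_width, shower_depth]
photons = [[1.0, 8.0], [1.2, 7.5], [0.9, 8.3]]
pions   = [[3.0, 4.0], [3.4, 4.4], [2.8, 3.9]]
centroids = {"photon": centroid(photons), "pion": centroid(pions)}

print(classify([1.1, 7.9], centroids))  # "photon"
print(classify([3.2, 4.1], centroids))  # "pion"
```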

    Tran says using computation this way can accelerate discovery. “As physicists, we’ve been able to learn a lot about particle physics and nature using non-machine-learning algorithms,” he says. “But machine learning can drastically accelerate and augment that process—and potentially provide deeper insight into the data.”

    And while teams of researchers are busy building exascale computers, others are hard at work trying to build another type of supercomputer: the quantum computer.

    Remember Moore’s Law? Previously, engineers were able to make computer chips faster by shrinking the size of electrical circuits, reducing the amount of time it takes for electrical signals to travel. “Now our technology is so good that literally the distance between transistors is the size of an atom,” Tran says. “So we can’t keep scaling down the technology and expect the same gains we’ve seen in the past.”

    To get around this, some researchers are redefining how computation works at a fundamental level—like, really fundamental.

    The basic unit of data in a classical computer is called a bit, which can hold one of two values: 1, if it has an electrical signal, or 0, if it has none. But in quantum computing, data is stored in quantum systems—things like electrons, which have either up or down spins, or photons, which are polarized either vertically or horizontally. These data units are called “qubits.”

    Here’s where it gets weird. Through a quantum property called superposition, qubits have more than just two possible states. An electron can be up, down, or in a variety of stages in between.

    What does this mean for computing? A collection of three classical bits can exist in only one of eight possible configurations: 000, 001, 010, 011, 100, 101, 110 or 111. But through superposition, three qubits can be in all eight of these configurations at once. A quantum computer can use that information to tackle problems that are impossible to solve with a classical computer.
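    The counting argument is easy to make concrete: n bits select one of 2^n configurations, while an n-qubit state carries an amplitude for every configuration at once. A sketch using an equal superposition as the example state:

```python
from itertools import product

n = 3
# All 2**n classical configurations of three bits:
configs = ["".join(bits) for bits in product("01", repeat=n)]
print(configs)       # ['000', '001', '010', '011', '100', '101', '110', '111']
print(len(configs))  # 8

# A quantum state assigns an amplitude to every configuration at once.
# In an equal superposition each amplitude is 1/sqrt(8), and the squared
# amplitudes (the measurement probabilities) sum to 1.
amp = 1 / len(configs) ** 0.5
state = {c: amp for c in configs}
print(round(sum(a ** 2 for a in state.values()), 6))  # 1.0
```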

    Fermilab scientist Aaron Chou likens quantum problem-solving to throwing a pebble into a pond. The ripples move through the water in every possible direction, “simultaneously exploring all of the possible things that it might encounter.”

    In contrast, a classical computer can only move in one direction at a time.

    But this makes quantum computers faster than classical computers only when it comes to solving certain types of problems. “It’s not like you can take any classical algorithm and put it on a quantum computer and make it better,” says University of California, Santa Barbara physicist John Martinis, who helped build Google’s quantum computer.

    Although quantum computers work in a fundamentally different way than classical computers, designing and building them wouldn’t be possible without traditional computing laying the foundation, Martinis says. “We’re really piggybacking on a lot of the technology of the last 50 years or more.”

    The kinds of problems that are well suited to quantum computing are intrinsically quantum mechanical in nature, says Chou.

    For instance, Martinis says, consider quantum chemistry. Solving quantum chemistry problems with classical computers is so difficult, he says, that 10 to 15% of the world’s supercomputer usage is currently dedicated to the task. “Quantum chemistry problems are hard for the very reason why a quantum computer is powerful”—because to complete them, you have to consider all the different quantum-mechanical states of all the individual atoms involved.

    Because making better quantum computers would be so useful in physics research, and because building them requires skills and knowledge that physicists possess, physicists are ramping up their quantum efforts. In the United States, the National Quantum Initiative Act of 2018 called for The National Institute of Standards and Technology (US), The National Science Foundation (US) and The Department of Energy (US) to support programs, centers and consortia devoted to quantum information science.

    Coevolution requires cooperation

    In the early days of computational physics, the line between who was a particle physicist and who was a computer scientist could be fuzzy. Physicists used commercially available microprocessors to build custom computers for experiments. They also wrote much of their own software—ranging from printer drivers to the software that coordinated the analysis between the clustered computers.

    Nowadays, roles have somewhat shifted. Most physicists use commercially available devices and software, allowing them to focus more on the physics, Butler says. But some people, like Anshu Dubey, work right at the intersection of the two fields. Dubey is a computational scientist at DOE’s Argonne National Laboratory (US) who works with computational physicists.

    When a physicist needs to computationally interpret or model a phenomenon, sometimes they will sign up a student or postdoc in their research group for a programming course or two and then ask them to write the code to do the job. Although these codes are mathematically complex, Dubey says, they aren’t logically complex, making them relatively easy to write.

    A simulation of a single physical phenomenon can be neatly packaged within fairly straightforward code. “But the real world doesn’t want to cooperate with you in terms of its modularity and encapsularity,” she says.

    Multiple forces are always at play, so to accurately model real-world complexity, you have to use more complex software—ideally software that doesn’t become impossible to maintain as it gets updated over time. “All of a sudden,” says Dubey, “you start to require people who are creative in their own right—in terms of being able to architect software.”

    That’s where people like Dubey come in. At Argonne, Dubey develops software that researchers use to model complex multi-physics systems—incorporating processes like fluid dynamics, radiation transfer and nuclear burning.

    Hiring computer scientists for research projects in physics and other fields of science can be a challenge, Dubey says. Most funding agencies specify that research money can be used for hiring students and postdocs, but not for software development or for hiring dedicated engineers. “There is no viable career path in academia for people whose careers are like mine,” she says.

    In an ideal world, universities would establish endowed positions for a team of research software engineers in physics departments with a nontrivial amount of computational research, Dubey says. These engineers would write reliable, well-architected code, and their institutional knowledge would stay with a team.

    Physics and computing have been closely intertwined for decades. However the two develop—toward new analyses using artificial intelligence, for example, or toward the creation of better and better quantum computers—it seems they will remain on this path together.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 10:58 am on August 13, 2021 Permalink | Reply
    Tags: "University of Washington and Microsoft researchers develop 'nanopore-tal' enabling cells to talk to computers", A commercially available nanopore array — in this case the Oxford Nanopore Technologies MinION device., A new class of reporter proteins that can be directly read by a commercially available nanopore sensing device., Computing, Genetically encoded reporter proteins have been a mainstay of biotechnology research., Scientists are currently working to scale up the number of "NTERs" to hundreds; thousands; maybe even millions more., The new system-dubbed “Nanopore-addressable protein Tags Engineered as Reporters” also known as NanoporeTERs or NTERs for short., This is a fundamentally new interface between cells and computers., University of Washington Paul G. Allen College of Electrical and Computer Engineering (US)

    From University of Washington Paul G. Allen College of Electrical and Computer Engineering (US): “University of Washington and Microsoft researchers develop ‘nanopore-tal’ enabling cells to talk to computers” 

    From University of Washington Paul G. Allen College of Electrical and Computer Engineering (US)

    August 12, 2021

    MISL researcher Nicolas Cardozo pipettes cell cultures containing NanoporeTERs onto a portable MinION nanopore sensing device for processing as professor Jeff Nivala looks on. Credit: Dennis Wise/University of Washington.

    Genetically encoded reporter proteins have been a mainstay of biotechnology research, allowing scientists to track gene expression, understand intracellular processes and debug engineered genetic circuits. But conventional reporting schemes that rely on fluorescence and other optical approaches come with practical limitations that could cast a shadow over the field’s future progress. Now, thanks to a team of researchers at the University of Washington and Microsoft, scientists are about to see reporter proteins in a whole new light.

    In a paper published today in the journal Nature Biotechnology, members of the Molecular Information Systems Laboratory housed at the UW’s Paul G. Allen School of Computer Science & Engineering introduce a new class of reporter proteins that can be directly read by a commercially available nanopore sensing device. The new system ― dubbed “Nanopore-addressable protein Tags Engineered as Reporters” also known as NanoporeTERs or NTERs for short ― can perform multiplexed detection of protein expression levels from bacterial and human cell cultures far beyond the capacity of existing techniques.

    You could say the new system offers a “nanopore-tal” into what is happening inside these complex biological systems where, up until this point, scientists have largely been operating in the dark.

    “NanoporeTERs offer a new and richer lexicon for engineered cells to express themselves and shed new light on the factors they are designed to track. They can tell us a lot more about what is happening in their environment all at once,” said co-lead author Nicolas Cardozo, a graduate student in the UW’s molecular engineering Ph.D. program. “We’re essentially making it possible for these cells to ‘talk’ to computers about what’s happening in their surroundings at a new level of detail, scale and efficiency that will enable deeper analysis than what we could do before.”

    Raw nanopore signals streaming from the MinION device, which contains an array of hundreds of nanopore sensors; each color represents data from an individual nanopore. The team uses machine learning to interpret these signals as NanoporeTERs barcodes. Credit: Dennis Wise/University of Washington.

    Conventional methods that employ optical reporter proteins, such as green fluorescent protein (GFP), are limited in the number of distinct genetic outputs that they can track simultaneously due to their overlapping spectral properties. For example, it’s difficult to distinguish between more than three different fluorescent protein colors, limiting multiplexed reporting to a maximum of three outputs. In contrast, NTERs were designed to carry distinct protein “barcodes” composed of strings of amino acids that, when used in combination, enable a degree of multiplexing approaching an order of magnitude more. These synthetic proteins are secreted outside of the cell into the surrounding environment, where they are collected and directly analyzed using a commercially available nanopore array — in this case the Oxford Nanopore Technologies MinION device. To make nanopore analysis possible, the NTER proteins were engineered with charged “tails” that get pulled into the tiny nanopore sensors by an electric field. Machine learning is then used to classify their electrical signals in order to determine the output levels of each NTER barcode.
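    The classification step at the end can be sketched in miniature. The actual system trains machine-learning models on raw nanopore signals; this toy version, with invented reference levels and traces, reduces each trace to its mean current and picks the nearest known barcode:

```python
# Toy NanoporeTER-style readout: reduce each raw current trace to a
# summary statistic and match it to the nearest known barcode level.
# The tag names, reference levels and traces are invented for illustration.

REFERENCE_LEVELS = {"NTER-01": 80.0, "NTER-02": 110.0, "NTER-03": 140.0}

def classify_trace(trace):
    """Assign a trace to the barcode whose reference current is closest
    to the trace's mean current."""
    mean_current = sum(trace) / len(trace)
    return min(REFERENCE_LEVELS,
               key=lambda tag: abs(REFERENCE_LEVELS[tag] - mean_current))

print(classify_trace([79.0, 81.5, 80.2]))     # "NTER-01"
print(classify_trace([142.0, 138.5, 141.0]))  # "NTER-03"
```

    The real pipeline classifies far richer signal shapes than a single mean level, which is what lets it distinguish dozens of barcodes in parallel.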

    “This is a fundamentally new interface between cells and computers,” explained Allen School research professor and corresponding author Jeff Nivala. “One analogy I like to make is that fluorescent protein reporters are like lighthouses, and NanoporeTERs are like messages in a bottle. Lighthouses are really useful for communicating a physical location, as you can literally see where the signal is coming from, but it’s hard to pack more information into that kind of signal. A message in a bottle, on the other hand, can pack a lot of information into a very small vessel, and you can send many of them off to another location to be read. You might lose sight of the precise physical location where the messages were sent, but for many applications that’s not going to be an issue.”

    In developing this new, more expressive vessel, Nivala and his colleagues eschewed time-consuming sample preparation or the need for other specialized laboratory equipment to minimize both latency and cost. The NTERs scheme is also highly extensible. As a proof of concept, the team developed a library of more than 20 distinct tags; according to co-lead author Karen Zhang, the potential is significantly greater.

    Co-authors of the Nature Biotechnology paper (left to right): Karen Zhang, Nicolas Cardozo, Kathryn Doroschak and Jeff Nivala. Not pictured: Aerilynn Nguyen, Zoheb Siddiqui, Nicholas Bogard, Karin Strauss and Luis Ceze. Credit: Tara Brown Photography.

    “We are currently working to scale up the number of NTERs to hundreds; thousands; maybe even millions more,” Zhang, who graduated this year from the UW with bachelor’s degrees in biochemistry and microbiology, explained. “The more we have, the more things we can track. We’re particularly excited about the potential in single-cell proteomics, but this could also be a game-changer in terms of our ability to do multiplexed biosensing to diagnose disease and even target therapeutics to specific areas inside the body. And debugging complicated genetic circuit designs would become a whole lot easier and much less time consuming if we could measure the performance of all the components in parallel instead of by trial and error.”

    MISL researchers have made novel use of the ONT MinION device before. Allen School alumna Kathryn Doroschak (Ph.D., ‘21), one of the lead co-authors of this paper, was also involved in an earlier project in which she and her teammates developed a molecular tagging system to replace conventional inventory control methods. That system relied on barcodes comprising synthetic strands of DNA that could be decoded on demand using the portable ONT reader. This time, she and her colleagues went a step further in demonstrating how versatile such devices can be.

    “This is the first paper to show how a commercial nanopore sensor device can be repurposed for applications other than the DNA and RNA sequencing for which they were originally designed,” explained Doroschak. “This is exciting as a precursor for nanopore technology becoming more accessible and ubiquitous in the future. You can already plug a nanopore device into your cell phone; I could envision someday having a choice of ‘molecular apps’ that will be relatively inexpensive and widely available outside of traditional genomics.”

    Additional co-authors of the paper include research assistants Aerilynn Nguyen and Zoheb Siddiqui; former postdoc Nicholas Bogard; Allen School affiliate professor Karin Strauss, senior principal research manager at Microsoft; and Allen School professor Luis Ceze.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    About the University of Washington Paul G. Allen College of Electrical and Computer Engineering (US)

    Mission, Facts, and Stats

    Our mission is to develop outstanding engineers and ideas that change the world.

    275 faculty (25.2% women)

    128 NSF Young Investigator/Early Career Awards since 1984
    32 Sloan Foundation Research Awards
    2 MacArthur Foundation Fellows (2007 and 2011)

    A national leader in educating engineers, each year the College turns out new discoveries, inventions and top-flight graduates, all contributing to the strength of our economy and the vitality of our community.

    Engineering innovation

    People

    Innovation at UW ECE is exemplified by our outstanding faculty and by the exceptional group of students they advise and mentor. Students receive a robust education through a strong technical foundation, group project work and hands-on research opportunities. Our faculty work in dynamic research areas with diverse opportunities for projects and collaborations. Through their research, they address complex global challenges in health, energy, technology and the environment, and receive significant research and education grants.

    Impact

    We continue to expand our innovation ecosystem by promoting an entrepreneurial mindset in our teaching and through diverse partnerships. The field of electrical and computer engineering is at the forefront of solving emerging societal challenges, empowered by innovative ideas from our community. As our department evolves, we are dedicated to expanding our faculty and student body to meet the growing demand for engineers. We welcomed six new faculty hires in the 2018-2019 academic year. Our meaningful connections and collaborations place the department as a leader in the field.

    Engineers drive the innovation economy and are vital to solving society’s most challenging problems. The College of Engineering is a key part of a world-class research university in a thriving hub of aerospace, biotechnology, global health and information technology innovation. Over 50% of UW startups in FY18 came from the College of Engineering.

    Commitment to diversity and access

    The College of Engineering is committed to developing and supporting a diverse student body and faculty that reflect and elevate the populations we serve. We are a national leader in women in engineering; 25.5% of our faculty are women compared to 17.4% nationally. We offer a robust set of diversity programs for students and faculty.
    Research and commercialization

    The University of Washington is an engine of economic growth, today ranked third in the nation for the number of startups launched each year, with 65 companies having been started in the last five years alone by UW students and faculty, or with technology developed here. The College of Engineering is a key contributor to these innovations, and engineering faculty, students or technology are behind half of all UW startups. In FY19, UW received $1.58 billion in total research awards from federal and nonfederal sources.


    The University of Washington (US) is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and, in the process, transform lives and our world.

    So what defines us —the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

    The University of Washington (US) is a public research university in Seattle, Washington, United States. Founded in 1861, University of Washington is one of the oldest universities on the West Coast; it was established in downtown Seattle approximately a decade after the city’s founding to aid its economic development. Today, the university’s 703-acre main Seattle campus is in the University District above the Montlake Cut, within the urban Puget Sound region of the Pacific Northwest. The university has additional campuses in Tacoma and Bothell. Overall, University of Washington encompasses over 500 buildings and over 20 million gross square feet of space, including one of the largest library systems in the world with more than 26 university libraries, as well as the UW Tower, lecture halls, art centers, museums, laboratories, stadiums, and conference centers. The university offers bachelor’s, master’s, and doctoral degrees through 140 departments in various colleges and schools, sees a total student enrollment of roughly 46,000 annually, and functions on a quarter system.

    University of Washington is a member of the Association of American Universities(US) and is classified among “R1: Doctoral Universities – Very high research activity”. According to the National Science Foundation(US), UW spent $1.41 billion on research and development in 2018, ranking it 5th in the nation. As the flagship institution of the six public universities in Washington state, it is known for its medical, engineering and scientific research as well as its highly competitive computer science and engineering programs. Additionally, University of Washington continues to benefit from its deep historic ties and major collaborations with numerous technology giants in the region, such as Amazon, Boeing, Nintendo, and particularly Microsoft. Paul G. Allen, Bill Gates and others spent significant time in Washington computer labs before founding Microsoft and other ventures. The University of Washington’s 22 varsity sports teams are also highly competitive, competing as the Huskies in the Pac-12 Conference of the NCAA Division I, representing the United States at the Olympic Games, and other major competitions.

    The university has been affiliated with many notable alumni and faculty, including 21 Nobel Prize laureates and numerous Pulitzer Prize winners, Fulbright Scholars, Rhodes Scholars and Marshall Scholars.

    In 1854, territorial governor Isaac Stevens recommended the establishment of a university in the Washington Territory. Prominent Seattle-area residents, including Methodist preacher Daniel Bagley, saw this as a chance to add to the city’s potential and prestige. Bagley learned of a law that allowed United States territories to sell land to raise money in support of public schools. At the time, Arthur A. Denny, one of the founders of Seattle and a member of the territorial legislature, aimed to increase the city’s importance by moving the territory’s capital from Olympia to Seattle. However, Bagley eventually convinced Denny that the establishment of a university would assist more in the development of Seattle’s economy. Two universities were initially chartered, but later the decision was repealed in favor of a single university in Lewis County provided that locally donated land was available. When no site emerged, Denny successfully petitioned the legislature to reconsider Seattle as a location in 1858.

    In 1861, scouting began for an appropriate 10-acre (4 ha) site in Seattle to serve as a new university campus. Arthur and Mary Denny donated eight acres, while fellow pioneers Edward Lander and Charlie and Mary Terry donated two acres on Denny’s Knoll in downtown Seattle. More specifically, this tract was bounded by 4th Avenue to the west, 6th Avenue to the east, Union Street to the north, and Seneca Street to the south.

    John Pike, for whom Pike Street is named, was the university’s architect and builder. It opened on November 4, 1861, as the Territorial University of Washington. The legislature passed articles incorporating the University and establishing its Board of Regents in 1862. The school initially struggled, closing three times: in 1863 for low enrollment, and again in 1867 and 1876 due to funding shortages. University of Washington awarded its first degree, a bachelor’s in science, to Clara Antoinette McCarty Wilt in 1876.

    19th century relocation

    By the time Washington state entered the Union in 1889, both Seattle and the University had grown substantially. University of Washington’s total undergraduate enrollment increased from 30 to nearly 300 students, and the campus’s once-isolated site in downtown Seattle faced encroaching development. A special legislative committee, headed by University of Washington graduate Edmond Meany, was created to find a new campus to better serve the growing student population and faculty. The committee eventually selected a site northeast of downtown Seattle called Union Bay, which was the land of the Duwamish, and the legislature appropriated funds for its purchase and construction. In 1895, the University relocated to the new campus by moving into the newly built Denny Hall. The University Regents tried and failed to sell the old campus, eventually settling on leasing it. This would later become one of the University’s most valuable pieces of real estate in modern-day Seattle, generating millions in annual revenue with what is now called the Metropolitan Tract. The original Territorial University building was torn down in 1908, and its former site now houses the Fairmont Olympic Hotel.

    The sole surviving remnants of Washington’s first building are four 24-foot (7.3 m), white, hand-fluted cedar, Ionic columns. They were salvaged by Edmond S. Meany, one of the University’s first graduates and former head of its history department. Meany and his colleague, Dean Herbert T. Condon, dubbed the columns “Loyalty”, “Industry”, “Faith”, and “Efficiency”, or “LIFE”. The columns now stand in the Sylvan Grove Theater.

    20th century expansion

    Organizers of the 1909 Alaska-Yukon-Pacific Exposition eyed the still largely undeveloped campus as a prime setting for their world’s fair. They came to an agreement with Washington’s Board of Regents that allowed them to use the campus grounds for the exposition, surrounding today’s Drumheller Fountain facing towards Mount Rainier. In exchange, organizers agreed Washington would take over the campus and its development after the fair’s conclusion. This arrangement led to a detailed site plan and several new buildings, prepared in part by John Charles Olmsted. The plan was later incorporated into the overall University of Washington campus master plan, permanently affecting the campus layout.

    Both World Wars brought the military to campus, with certain facilities temporarily lent to the federal government. In spite of this, subsequent post-war periods were times of dramatic growth for the University. The period between the wars saw a significant expansion of the upper campus. Construction of the Liberal Arts Quadrangle, known to students as “The Quad,” began in 1916 and continued to 1939. The University’s architectural centerpiece, Suzzallo Library, was built in 1926 and expanded in 1935.

    After World War II, further growth came with the G.I. Bill. Among the most important developments of this period was the opening of the School of Medicine in 1946, which is now consistently ranked as the top medical school in the United States. It would eventually lead to the University of Washington Medical Center, ranked by U.S. News and World Report as one of the top ten hospitals in the nation.

    In 1942, all persons of Japanese ancestry in the Seattle area were forced into inland internment camps as part of Executive Order 9066 following the attack on Pearl Harbor. During this difficult time, university president Lee Paul Sieg took an active and sympathetic leadership role in advocating for and facilitating the transfer of Japanese American students to universities and colleges away from the Pacific Coast to help them avoid the mass incarceration. Nevertheless, many Japanese American students and soon-to-be graduates were unable to transfer successfully in the short time window or receive diplomas before being incarcerated. It was only many years later that they would be recognized for their accomplishments during the University of Washington’s Long Journey Home ceremonial event that was held in May 2008.

    From 1958 to 1973, the University of Washington saw tremendous growth in student enrollment, its faculty and operating budget, and also its prestige under the leadership of Charles Odegaard. University of Washington student enrollment had more than doubled to 34,000 as the baby boom generation came of age. However, this era was also marked by high levels of student activism, as was the case at many American universities. Much of the unrest focused around civil rights and opposition to the Vietnam War. In response to anti-Vietnam War protests by the late 1960s, the University Safety and Security Division became the University of Washington Police Department.

    Odegaard instituted a vision of building a “community of scholars”, convincing the Washington State legislature to increase investment in the University. Washington senators, such as Henry M. Jackson and Warren G. Magnuson, also used their political clout to gather research funds for the University of Washington. The results included an increase in the operating budget from $37 million in 1958 to over $400 million in 1973, solidifying University of Washington as a top recipient of federal research funds in the United States. The establishment of technology giants such as Microsoft, Boeing and Amazon in the local area also proved to be highly influential in the University of Washington’s fortunes, not only improving graduate prospects but also helping to attract millions of dollars in university and research funding through its distinguished faculty and extensive alumni network.

    21st century

    In 1990, the University of Washington opened its additional campuses in Bothell and Tacoma. Although originally intended for students who had already completed two years of higher education, both schools have since become four-year universities with the authority to grant degrees. The first freshman classes at these campuses started in fall 2006. Today both Bothell and Tacoma also offer a selection of master’s degree programs.

    In 2012, the University began exploring plans and governmental approval to expand the main Seattle campus, including significant increases in student housing, teaching facilities for the growing student body and faculty, as well as expanded public transit options. The University of Washington light rail station was completed in March 2015, connecting Seattle’s Capitol Hill neighborhood to the University of Washington Husky Stadium within five minutes of rail travel time. It offers a previously unavailable option of transportation into and out of the campus, designed specifically to reduce dependence on private vehicles, bicycles and local King County buses.

    University of Washington has been listed as a “Public Ivy” in Greene’s Guides since 2001, and is an elected member of the Association of American Universities. Among the faculty by 2012, there have been 151 members of American Association for the Advancement of Science, 68 members of the National Academy of Sciences(US), 67 members of the American Academy of Arts and Sciences, 53 members of the National Academy of Medicine(US), 29 winners of the Presidential Early Career Award for Scientists and Engineers, 21 members of the National Academy of Engineering(US), 15 Howard Hughes Medical Institute Investigators, 15 MacArthur Fellows, 9 winners of the Gairdner Foundation International Award, 5 winners of the National Medal of Science, 7 Nobel Prize laureates, 5 winners of Albert Lasker Award for Clinical Medical Research, 4 members of the American Philosophical Society, 2 winners of the National Book Award, 2 winners of the National Medal of Arts, 2 Pulitzer Prize winners, 1 winner of the Fields Medal, and 1 member of the National Academy of Public Administration. Among UW students by 2012, there were 136 Fulbright Scholars, 35 Rhodes Scholars, 7 Marshall Scholars and 4 Gates Cambridge Scholars. UW is recognized as a top producer of Fulbright Scholars, ranking 2nd in the US in 2017.

    The Academic Ranking of World Universities (ARWU) has consistently ranked University of Washington as one of the top 20 universities worldwide every year since its first release. In 2019, University of Washington ranked 14th worldwide out of 500 by the ARWU, 26th worldwide out of 981 in the Times Higher Education World University Rankings, and 28th worldwide out of 101 in the Times World Reputation Rankings. Meanwhile, QS World University Rankings ranked it 68th worldwide, out of over 900.

    U.S. News & World Report ranked University of Washington 8th out of nearly 1,500 universities worldwide for 2021, with University of Washington’s undergraduate program tied for 58th among 389 national universities in the U.S. and tied for 19th among 209 public universities.

    In 2019, it ranked 10th among the universities around the world by SCImago Institutions Rankings. In 2017, the Leiden Ranking, which focuses on science and the impact of scientific publications among the world’s 500 major universities, ranked University of Washington 12th globally and 5th in the U.S.

    In 2019, Kiplinger Magazine’s review of “top college values” named University of Washington 5th for in-state students and 10th for out-of-state students among U.S. public colleges, and 84th overall out of 500 schools. In the Washington Monthly National University Rankings University of Washington was ranked 15th domestically in 2018, based on its contribution to the public good as measured by social mobility, research, and promoting public service.

  • richardmitnick 9:43 pm on July 26, 2021 Permalink | Reply
    Tags: "Midgard - a paradigm shift in data center technology", , Communications, Computing,   

    From Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH): “Midgard – a paradigm shift in data center technology” 

    From Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH)


    EPFL researchers have pioneered an innovative approach to implementing virtual memory in data centers, which will greatly increase server efficiency.

    As big data, used by everything from AI to the Internet of Things, increasingly dominates our modern lives, cloud computing has grown massively in importance. It relies heavily on virtual memory: a single data server runs many services for many different customers at the same time, using virtual memory both to process those services and to keep each customer’s data secure from the others.

    However, the way this virtual memory is deployed dates back to the 1960s, and ever-growing memory capacity is actually beginning to slow things down. For example, data centers that provide services such as social networks or business analytics spend more than 20% of their processing time on virtual memory and protection checks. That means any gains made in this area will represent a huge benefit in efficiency.
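    Why translation overhead grows with memory capacity can be sketched with a simple effective-access-time model. This is only an illustrative back-of-the-envelope calculation; the hit rates and latencies below are assumed round numbers, not measurements from the Midgard work.

```python
# Sketch: average memory-access time when a TLB miss triggers a page-table
# walk. All latencies and hit rates are assumptions for illustration.

def effective_access_ns(tlb_hit_rate, access_ns=1.0, walk_ns=20.0):
    """Average access time: every access pays access_ns; misses add a walk."""
    return access_ns + (1.0 - tlb_hit_rate) * walk_ns

# As working sets outgrow what the TLB can cover, the hit rate drops
# and the translation overhead balloons.
for hit_rate in (0.999, 0.99, 0.95):
    overhead = effective_access_ns(hit_rate) - 1.0
    print(f"TLB hit rate {hit_rate:.3f}: +{overhead:.2f} ns per access")
```

    Even a small drop in hit rate is costly: at a 95% hit rate the walk cost alone doubles the effective access time in this toy model, which is the kind of tax behind the 20% figure above.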

    Midgard: saving energy in the cloud

    Now, researchers working with EPFL’s Ecocloud Center for Sustainable Cloud Computing have developed Midgard, a software-modelled proof-of-concept prototype designed to greatly increase server efficiency. Their research paper, Rebooting Virtual Memory with Midgard, has just been presented at ISCA’21, the world’s flagship conference in computer architecture, and is the first of several steps toward demonstrating a fully working system.

    “Midgard is a technology that can allow for growing memory capacity, while continuing to guarantee the security of the data of each user in the cloud services,” explains Professor Babak Falsafi, Founding Director of the Ecocloud Center and one of the paper’s authors. “With Midgard, the all-important data lookups and protection checks are done directly in on-chip memory rather than virtual memory, removing so much of the traditional hierarchy of lookups and translations that it scores a net gain in efficiency, even as more memory is deployed,” he continued.

    In recent testing at low loads, the Midgard system was 5% behind standard performance, but with 256 MB of aggregate last-level cache it outperformed traditional systems in terms of virtual memory overheads.

    An outstanding feature of Midgard technology is that, while it does represent a paradigm shift, it is compatible with existing operating systems such as Windows, MacOS and Linux. Future work will address the wide spectrum of topics needed to realize Midgard in real systems, such as compatibility development, packaging strategies and maintenance plans.
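    The core idea, deferring the expensive virtual-to-physical translation until a cache miss while running cheap protection checks up front, can be caricatured in a few lines of Python. This is a loose sketch of the concept only: the function names, latencies, and the 95% hit rate are invented for illustration and do not come from the paper.

```python
# Loose sketch of Midgard-style deferred translation: a cheap, range-based
# protection check happens on every access, but the expensive translation
# to a physical address is paid only on a cache miss.
# All names and latencies here are invented for illustration.

def access_cost_ns(cache_hit, check_ns=0.5, cache_ns=2.0,
                   translate_ns=20.0, dram_ns=80.0):
    cost = check_ns + cache_ns          # protection check + cache lookup
    if not cache_hit:                   # only misses pay full translation
        cost += translate_ns + dram_ns
    return cost

# Assume a workload where 95 of every 100 accesses hit in the cache.
pattern = [True] * 95 + [False] * 5
avg = sum(access_cost_ns(hit) for hit in pattern) / len(pattern)
print(f"average access cost: {avg:.2f} ns")
```

    In this toy model, growing the on-chip cache raises the hit rate and shrinks the fraction of accesses that ever pay for translation, which is why the scheme keeps scaling as more memory is deployed.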

    For more information about Midgard click here.

    See the full article here.




    The Swiss Federal Institute of Technology in Lausanne [EPFL-École polytechnique fédérale de Lausanne] (CH) is a research institute and university in Lausanne, Switzerland, that specializes in natural sciences and engineering. It is one of the two Swiss Federal Institutes of Technology, and it has three main missions: education, research and technology transfer.

    The QS World University Rankings ranks EPFL(CH) 14th in the world across all fields in their 2020/2021 ranking, whereas Times Higher Education World University Rankings ranks EPFL(CH) as the world’s 19th best school for Engineering and Technology in 2020.

    EPFL(CH) is located in the French-speaking part of Switzerland; the sister institution in the German-speaking part of Switzerland is the Swiss Federal Institute of Technology ETH Zürich [Eidgenössische Technische Hochschule Zürich] (CH). Associated with several specialized research institutes, the two universities form the Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH), which is directly dependent on the Federal Department of Economic Affairs, Education and Research. In connection with research and teaching activities, EPFL(CH) operates the CROCUS nuclear reactor; a tokamak fusion reactor; a Blue Gene/Q supercomputer; and P3 bio-hazard facilities.

    The roots of modern-day EPFL(CH) can be traced back to the foundation of a private school under the name École spéciale de Lausanne in 1853 at the initiative of Lois Rivier, a graduate of the École Centrale Paris (FR), and John Gay, the then professor and rector of the Académie de Lausanne. At its inception it had only 11 students and its offices were located at Rue du Valentin in Lausanne. In 1869, it became the technical department of the public Académie de Lausanne. When the Académie was reorganised and acquired the status of a university in 1890, the technical faculty changed its name to École d’ingénieurs de l’Université de Lausanne. In 1946, it was renamed the École polytechnique de l’Université de Lausanne (EPUL). In 1969, the EPUL was separated from the rest of the University of Lausanne and became a federal institute under its current name. EPFL(CH), like ETH Zürich(CH), is thus directly controlled by the Swiss federal government. In contrast, all other universities in Switzerland are controlled by their respective cantonal governments. Following the nomination of Patrick Aebischer as president in 2000, EPFL(CH) started to develop in the field of life sciences. It absorbed the Swiss Institute for Experimental Cancer Research (ISREC) in 2008.

    In 1946, there were 360 students. In 1969, EPFL(CH) had 1,400 students and 55 professors. In the past two decades the university has grown rapidly and as of 2012 roughly 14,000 people study or work on campus, about 9,300 of these being Bachelor, Master or PhD students. The environment at modern day EPFL(CH) is highly international with the school attracting students and researchers from all over the world. More than 125 countries are represented on the campus and the university has two official languages, French and English.


    EPFL is organised into eight schools, themselves formed of institutes that group research units (laboratories or chairs) around common themes:

    School of Basic Sciences (SB, Jan S. Hesthaven)

    Institute of Mathematics (MATH, Victor Panaretos)
    Institute of Chemical Sciences and Engineering (ISIC, Emsley Lyndon)
    Institute of Physics (IPHYS, Harald Brune)
    European Centre of Atomic and Molecular Computations (CECAM, Ignacio Pagonabarraga Mora)
    Bernoulli Center (CIB, Nicolas Monod)
    Biomedical Imaging Research Center (CIBM, Rolf Gruetter)
    Interdisciplinary Center for Electron Microscopy (CIME, Cécile Hébert)
    Max Planck-EPFL Centre for Molecular Nanosciences and Technology (CMNT, Thomas Rizzo)
    Swiss Plasma Center (SPC, Ambrogio Fasoli)
    Laboratory of Astrophysics (LASTRO, Jean-Paul Kneib)

    School of Engineering (STI, Ali Sayed)

    Institute of Electrical Engineering (IEL, Giovanni De Micheli)
    Institute of Mechanical Engineering (IGM, Thomas Gmür)
    Institute of Materials (IMX, Michaud Véronique)
    Institute of Microengineering (IMT, Olivier Martin)
    Institute of Bioengineering (IBI, Matthias Lütolf)

    School of Architecture, Civil and Environmental Engineering (ENAC, Claudia R. Binder)

    Institute of Architecture (IA, Luca Ortelli)
    Civil Engineering Institute (IIC, Eugen Brühwiler)
    Institute of Urban and Regional Sciences (INTER, Philippe Thalmann)
    Environmental Engineering Institute (IIE, David Andrew Barry)

    School of Computer and Communication Sciences (IC, James Larus)

    Algorithms & Theoretical Computer Science
    Artificial Intelligence & Machine Learning
    Computational Biology
    Computer Architecture & Integrated Systems
    Data Management & Information Retrieval
    Graphics & Vision
    Human-Computer Interaction
    Information & Communication Theory
    Programming Languages & Formal Methods
    Security & Cryptography
    Signal & Image Processing

    School of Life Sciences (SV, Gisou van der Goot)

    Bachelor-Master Teaching Section in Life Sciences and Technologies (SSV)
    Brain Mind Institute (BMI, Carmen Sandi)
    Institute of Bioengineering (IBI, Melody Swartz)
    Swiss Institute for Experimental Cancer Research (ISREC, Douglas Hanahan)
    Global Health Institute (GHI, Bruno Lemaitre)
    Ten Technology Platforms & Core Facilities (PTECH)
    Center for Phenogenomics (CPG)
    NCCR Synaptic Bases of Mental Diseases (NCCR-SYNAPSY)

    College of Management of Technology (CDM)

    Swiss Finance Institute at EPFL (CDM-SFI, Damir Filipovic)
    Section of Management of Technology and Entrepreneurship (CDM-PMTE, Daniel Kuhn)
    Institute of Technology and Public Policy (CDM-ITPP, Matthias Finger)
    Institute of Management of Technology and Entrepreneurship (CDM-MTEI, Ralf Seifert)
    Section of Financial Engineering (CDM-IF, Julien Hugonnier)

    College of Humanities (CDH, Thomas David)

    Human and social sciences teaching program (CDH-SHS, Thomas David)

    EPFL Middle East (EME, Dr. Franco Vigliotti)

    Section of Energy Management and Sustainability (MES, Prof. Maher Kayal)

    In addition to the eight schools there are seven closely related institutions

    Swiss Cancer Centre
    Center for Biomedical Imaging (CIBM)
    Centre for Advanced Modelling Science (CADMOS)
    École cantonale d’art de Lausanne (ECAL)
    Campus Biotech
    Wyss Center for Bio- and Neuro-engineering
    Swiss National Supercomputing Centre

  • richardmitnick 8:24 am on July 24, 2021 Permalink | Reply
    Tags: "20 Years Ago Steve Jobs Built the ‘Coolest Computer Ever.’ It Bombed", Apple Computer, , Computing, The Power Mac G4 Cube,   

    From WIRED : “20 Years Ago Steve Jobs Built the ‘Coolest Computer Ever.’ It Bombed” 

    From WIRED

    07.24.2020 [Re-issued 7.24.21]
    Steven Levy

    Plus: An interview from the archives, the most-read story in WIRED history, and bottled-up screams.

    The Power Mac G4 Cube, released in 2000 and discontinued in 2001, violated the wisdom of Jobs’ product plan. Photograph: Apple/Getty Images.

    The Plain View

    This month marks the 20th anniversary of the Power Mac G4 Cube, which debuted July 19, 2000. It also marks the 19th anniversary of Apple’s announcement that it was “putting the Cube on ice”. That’s not my joke, it’s Apple’s, straight from the headline of its July 3, 2001, press release that officially pulled the plug.

    The idea of such a quick turnaround was nowhere in the mind of Apple CEO Steve Jobs on the eve of the product’s announcement at that summer 2000 Macworld Expo. I was reminded of this last week, as I listened to a cassette tape recorded 20 years prior, almost to the day. It documented a two-hour session with Jobs in Cupertino, California, shortly before the launch. The main reason he had summoned me to Apple’s headquarters was sitting under the cover of a dark sheet of fabric on the long table in the boardroom of One Infinite Loop.

    “We have made the coolest computer ever,” he told me. “I guess I’ll just show it to you.”

    He yanked off the fabric, exposing an 8-inch stump of transparent plastic with a block of electronics suspended inside. It looked less like a computer than a toaster born from an immaculate conception between Philip K. Dick and Ludwig Mies van der Rohe. (But the fingerprints were, of course, Jony Ive’s.) Alongside it were two speakers encased in Christmas-ornament-sized, glasslike spheres.

    “The Cube,” Jobs said, in a stage whisper, hardly containing his excitement.

    He began by emphasizing that while the Cube was powerful, it was air-cooled. (Jobs hated fans. Hated them.) He demonstrated how it didn’t have a power switch, but could sense a wave of your hand to turn on the juice. He showed me how Apple had eliminated the tray that held CDs—with the Cube, you just hovered the disk over the slot and the machine inhaled it.

    And then he got to the plastics. It was as if Jobs had taken to heart that guy in The Graduate who gave career advice to Benjamin Braddock. “We are doing more with plastics than anyone else in the world,” he told me. “These are all specially formulated, and it’s all proprietary, just us. It took us six months just to formulate these plastics. They make bulletproof vests out of it! And it’s incredibly sturdy, and it’s just beautiful! There’s never been anything like that. How do you make something like that? Nobody ever made anything like that! Isn’t that beautiful? I think it’s stunning!”

    I admitted it was gorgeous. But I had a question for him. Earlier in the conversation, he had drawn Apple’s product matrix, four squares representing laptop and desktop, high and low end. Since returning to Apple in 1997, he had filled in all the quadrants with the iMac, Power Mac, iBook, and PowerBook. The Cube violated the wisdom of his product plan. It didn’t have the power features of the high-end Power Mac, like slots or huge storage. And it was way more expensive than the low-end iMac, even before you paid for the separate display that Cube owners needed. Knowing I was risking his ire, I asked him: Just who was going to buy this?

    Jobs didn’t miss a beat. “That’s easy!” he said. “A ton of people who are pros. Every designer is going to buy one.”

    Here was his justification for violating his matrix theory: “We realized there was an incredible opportunity to make something in the middle, sort of a love child, that was truly a breakthrough,” he said. The implicit message was that it was so great that people would alter their buying patterns to purchase one.

    That didn’t happen. For one thing, the price was prohibitive: by the time you bought the display, it was almost three times the price of an iMac and even more than some Power Macs. By and large, people don’t spend their art budget on computers.

    That wasn’t the only issue with the G4 Cube. Those plastics were hard to manufacture, and people reported flaws. The air cooling had problems. If you left a sheet of paper on top of the device, it would shut down to prevent overheating. And because it had no On button, a stray wave of your hand would send the machine into action, like it or not.

    In any case, the G4 Cube failed to push buttons on the computer-buying public. Jobs told me it would sell millions. But Apple sold fewer than 150,000 units. The apotheosis of Apple design was also the apex of Apple hubris. Listening to the tape, I was struck by how much Jobs had been drunk on the elixir of aesthetics. “Do you really want to put a hole in this thing and put a button there?” Jobs asked me, justifying the lack of a power switch. “Look at the energy we put into this slot drive so you wouldn’t have a tray, and you want to ruin that and put a button in?”

    But here is something else about Jobs and the Cube that speaks not of failure but of why he was a successful leader: Once it was clear that his Cube was a brick, he was quick to cut his losses and move on.

    In a 2017 talk at the University of Oxford, Apple CEO Tim Cook talked about the G4 Cube, which he described as “a spectacular commercial failure, from the first day, almost.” But Jobs’ reaction to the bad sales figures showed how quickly, when it became necessary, he could abandon even a product dear to his heart. “Steve, of everyone I’ve known in life,” Cook said at Oxford, “could be the most avid proponent of some position, and within minutes or days, if new information came out, you would think that he never ever thought that before.”

    But he did think it, and I have the tape to prove it. Happy birthday to Steve Jobs’ digital love child.

    Time Travel

    My July 2000 Newsweek article about the Cube came with a sidebar of excerpts from my interview with Steve Jobs. Here are a few:

    Levy: Last January you dropped the “interim” from your CEO title. Has this had any impact?

    Jobs: No, even when I first came and wasn’t sure how long I’d be here, I made decisions for the long term. The reason I finally changed the title was that it was becoming sort of a joke. And I don’t want anything at Apple to become a joke.

    Levy: Rumors have recirculated about you becoming CEO of Disney. Is there anything about running a media giant that appeals to you?

    Jobs: I was thinking of giving you a witty answer, like “Isn’t that what I’m doing now?” But no, it doesn’t appeal to me at all. I’m a product person. I believe it’s possible to express your feelings and your caring about things from your products, whether that product is a computer system or Toy Story 2. It’s wonderful to make a pure expression of something and then make a million copies. Like the G4 Cube. There will be a million copies of this out there.

    Levy: The G4 Cube reminds a lot of people that your previous company, NeXT, also made a cube-shaped machine.

    Jobs: Yeah, we did one before. Cubes are very efficient spaces. What makes this one [special] for me is not the fact that it’s a cube but it’s like a brain in a beaker. It’s just hanging from this perfectly clear, pristine crystal enclosure. That’s what’s so drop-dead about it. It’s incredibly functional. The whole thing is perfect.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 11:17 am on June 16, 2021 Permalink | Reply
    Tags: "New software turns ‘mental handwriting’ into words on computer screens", , Computing,   

    From Stanford University Engineering : “New software turns ‘mental handwriting’ into words on computer screens” 

    From Stanford University Engineering

    The new “mindwriting” technology enables a man with immobilized limbs to create text messages nearly as fast as people who use their thumbs to tap words onto smartphone keyboards.

    New software and hardware taps into the brain to convert thoughts about handwriting into text on a computer screen. Credit: Aaron Burden/Unsplash

    Stanford University investigators have coupled artificial intelligence software with a device, called a brain-computer interface (BCI), implanted in the brain of a man with full-body paralysis.

    The software was able to decode information from the BCI to quickly convert the man’s thoughts about handwriting into text on a computer screen.

    The man was able to write using this approach more than twice as quickly as he could using a previous method developed by the Stanford researchers, who reported those findings in 2017 in the journal eLife.

    The new findings, published online May 12 in Nature, could spur further advances benefiting hundreds of thousands of Americans, and millions globally, who’ve lost the use of their upper limbs or their ability to speak due to spinal-cord injuries, strokes or amyotrophic lateral sclerosis, also known as Lou Gehrig’s disease, said Jaimie Henderson, professor of neurosurgery.

    “This approach allowed a person with paralysis to compose sentences at speeds nearly comparable to those of able-bodied adults of the same age typing on a smartphone,” said Henderson, the John and Jene Blume–Robert and Ruth Halperin Professor at Stanford Medicine. “The goal is to restore the ability to communicate by text.”

    The participant in the study produced text at a rate of about 18 words per minute. By comparison, able-bodied people of the same age can punch out about 23 words per minute on a smartphone.

    The participant, referred to as T5, lost practically all movement below the neck because of a spinal-cord injury in 2007. Nine years later, Henderson placed two brain-computer-interface chips, each the size of a baby aspirin, on the left side of T5’s brain. Each chip has 100 electrodes that pick up signals from neurons firing in the part of the motor cortex – a region of the brain’s outermost surface – that governs hand movement.

    Those neural signals are sent via wires to a computer, where artificial intelligence algorithms decode the signals and surmise T5’s intended hand and finger motion. The algorithms were designed in Stanford’s Neural Prosthetics Translational Laboratory, co-directed by Henderson and Krishna Shenoy, the Hong Seh and Vivian W. M. Lim Professor in the School of Engineering and professor of electrical engineering.

    Shenoy and Henderson, who have been collaborating on BCIs since 2005, are the senior co-authors of the new study. The lead author is Frank Willett, a research scientist in the Neural Prosthetics Translational Laboratory and with the Howard Hughes Medical Institute (HHMI).

    “We’ve learned that the brain retains its ability to prescribe fine movements a full decade after the body has lost its ability to execute those movements,” Willett said. “And we’ve learned that complicated intended motions involving changing speeds and curved trajectories, like handwriting, can be interpreted more easily and more rapidly by the artificial intelligence algorithms we’re using than can simpler intended motions like moving a cursor in a straight path at a steady speed. Alphabetical letters are different from one another, so they’re easier to tell apart.”
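    As a purely illustrative sketch of the decoding idea Willett describes: imagine each character summarized by a template of average electrode firing rates learned from repeated writing attempts, with a new neural sample assigned to the nearest template. Everything here, the electrode count, the firing-rate numbers, and the nearest-template rule, is an assumption for illustration; the actual Stanford system used far more sophisticated machine-learning models.

    ```python
    import math

    def learn_templates(labeled_samples):
        """Average the firing-rate vectors recorded for each character."""
        sums, counts = {}, {}
        for label, rates in labeled_samples:
            acc = sums.setdefault(label, [0.0] * len(rates))
            for i, r in enumerate(rates):
                acc[i] += r
            counts[label] = counts.get(label, 0) + 1
        return {label: [v / counts[label] for v in acc]
                for label, acc in sums.items()}

    def decode(templates, rates):
        """Decode a new sample as the character with the closest template."""
        return min(templates, key=lambda c: math.dist(templates[c], rates))

    # Ten noisy "attempts" per letter (hypothetical 2-electrode readings),
    # echoing the ten repetitions used in the study's training phase.
    training = ([("a", [10 + 0.1 * k, 2.0]) for k in range(10)] +
                [("b", [2.0, 10 + 0.1 * k]) for k in range(10)])
    templates = learn_templates(training)

    print(decode(templates, [9.8, 2.3]))  # → a (sample resembling "a")
    ```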

    In the 2017 study, three participants with limb paralysis, including T5 – all with BCIs placed in the motor cortex – were asked to concentrate on using an arm and hand to move a cursor from one key to the next on a computer-screen keyboard display, then to focus on clicking on that key.

    In that study, T5 set what was until now the all-time record: copying displayed sentences at about 40 characters per minute. Another study participant was able to write extemporaneously, selecting whatever words she wanted, at 24.4 characters per minute.

    If the paradigm underlying the 2017 study was analogous to typing, the model for the new Nature study is analogous to handwriting. T5 concentrated on trying to write individual letters of the alphabet on an imaginary legal pad with an imaginary pen, despite his inability to move his arm or hand. He repeated each letter 10 times, permitting the software to “learn” to recognize the neural signals associated with his effort to write that particular letter.

    In numerous multi-hour sessions that followed, T5 was presented with groups of sentences and instructed to make a mental effort to “handwrite” each one. No uppercase letters were employed. Examples of the sentences were “i interrupted, unable to keep silent,” and “within thirty seconds the army had landed.” Over time, the algorithms improved their ability to differentiate among the neural firing patterns typifying different characters. The algorithms’ interpretation of whatever letter T5 was attempting to write appeared on the computer screen after a roughly half-second delay.

    In further sessions, T5 was instructed to copy sentences the algorithms had never been exposed to. He was eventually able to generate 90 characters, or about 18 words, per minute. Later, asked to give his answers to open-ended questions, which required some pauses for thought, he generated 73.8 characters (close to 15 words, on average) per minute, tripling the previous free-composition record set in the 2017 study.

    T5’s sentence-copying error rate was about one mistake in every 18 or 19 attempted characters. His free-composition error rate was about one in every 11 or 12 characters. When the researchers used an after-the-fact autocorrect function – similar to the ones incorporated into our smartphone keyboards – to clean things up, those error rates were markedly lower: below 1% for copying, and just over 2% for freestyle.
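    The rates and error figures above can be checked against the standard typing convention of five characters per word (an assumption; the authors may count words differently):

    ```python
    # Convert the article's characters-per-minute figures to words per minute
    # using the common 5-characters-per-word convention (an assumption).
    CHARS_PER_WORD = 5

    def chars_to_wpm(chars_per_min):
        return chars_per_min / CHARS_PER_WORD

    print(chars_to_wpm(90))    # copying: 90 chars/min, i.e. 18 wpm
    print(chars_to_wpm(73.8))  # free composition: "close to 15" wpm

    # Error rates quoted in the text, expressed as percentages:
    copy_err = 1 / 18.5   # "one mistake in every 18 or 19 characters"
    free_err = 1 / 11.5   # "one in every 11 or 12 characters"
    print(round(copy_err * 100, 1), round(free_err * 100, 1))
    ```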

    These error rates are quite low compared with other BCIs, said Shenoy, who is also a Howard Hughes Medical Institute investigator.

    “While handwriting can approach 20 words per minute, we tend to speak around 125 words per minute, and this is another exciting direction that complements handwriting. If combined, these systems could together offer even more options for patients to communicate effectively,” Shenoy said.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Stanford Engineering has been at the forefront of innovation for nearly a century, creating pivotal technologies that have transformed the worlds of information technology, communications, health care, energy, business and beyond.

    The school’s faculty, students and alumni have established thousands of companies and laid the technological and business foundations for Silicon Valley. Today, the school educates leaders who will make an impact on global problems and seeks to define what the future of engineering will look like.

    Our mission is to seek solutions to important global problems and educate leaders who will make the world a better place by using the power of engineering principles, techniques and systems. We believe it is essential to educate engineers who possess not only deep technical excellence, but the creativity, cultural awareness and entrepreneurial skills that come from exposure to the liberal arts, business, medicine and other disciplines that are an integral part of the Stanford experience.

    Our key goals are to:

    Conduct curiosity-driven and problem-driven research that generates new knowledge and produces discoveries that provide the foundations for future engineered systems
    Deliver world-class, research-based education to students and broad-based training to leaders in academia, industry and society
    Drive technology transfer to Silicon Valley and beyond with deeply and broadly educated people and transformative ideas that will improve our society and our world.

    The Future of Engineering

    The engineering school of the future will look very different from what it looks like today. So, in 2015, we brought together a wide range of stakeholders, including mid-career faculty, students and staff, to address two fundamental questions: In what areas can the School of Engineering make significant world‐changing impact, and how should the school be configured to address the major opportunities and challenges of the future?

    One key output of the process is a set of 10 broad, aspirational questions on areas where the School of Engineering would like to have an impact in 20 years. The committee also returned with a series of recommendations that outlined actions across three key areas — research, education and culture — where the school can deploy resources and create the conditions for Stanford Engineering to have significant impact on those challenges.

    Stanford University

    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.


  • richardmitnick 3:52 pm on May 13, 2021 Permalink | Reply
    Tags: "Harnessing the hum of fluorescent lights for more efficient computing", A team led by University of Michigan researchers has developed a material that’s at least twice as “magnetostrictive” and far less costly than other materials in its class., , Computing, Magnetoelectric chips could make everything from massive data centers to cell phones far more energy efficient., Magnetoelectric devices use magnetic fields instead of electricity to store the digital ones and zeros of binary data.   

    From University of Michigan : “Harnessing the hum of fluorescent lights for more efficient computing” 


    From University of Michigan

    May 12, 2021

    Gabe Cherry

    Nicole Casal Moore

    The property that makes fluorescent lights buzz could power a new generation of more efficient computing devices that store data with magnetic fields, rather than electricity.

    A team led by University of Michigan researchers has developed a material that’s at least twice as “magnetostrictive” and far less costly than other materials in its class. In addition to computing, it could also lead to better magnetic sensors for medical and security devices.

    Magnetostriction, which causes the buzz of fluorescent lights and electrical transformers, occurs when a material’s shape and magnetic field are linked—that is, a change in shape causes a change in magnetic field. The property could be key to a new generation of computing devices called magnetoelectrics.
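    For readers who want the textbook definition behind the buzz: magnetostriction is conventionally quantified by a dimensionless coefficient, the fractional change in length as the material is magnetized. The figures below are general textbook values, not numbers from this study.

    ```latex
    % Saturation magnetostriction coefficient: the fractional length
    % change when the material is magnetized to saturation.
    \[
      \lambda_s = \frac{\Delta L}{L}
    \]
    % Typical values are parts per million; high-performance rare-earth
    % alloys reach \lambda_s on the order of 10^{-3} (about 1000 ppm).
    ```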

    Magnetoelectric chips could make everything from massive data centers to cell phones far more energy efficient, slashing the electricity requirements of the world’s computing infrastructure.

    Made of a combination of iron and gallium, the material is detailed in a paper published May 12 in Nature Communications. The team is led by U-M materials science and engineering professor John Heron and includes researchers from Intel; Cornell University; the University of California, Berkeley; the University of Wisconsin; Purdue University; and elsewhere.

    Magnetoelectric devices use magnetic fields instead of electricity to store the digital ones and zeros of binary data. Tiny pulses of electricity cause them to expand or contract slightly, flipping their magnetic field from positive to negative or vice versa. Because they don’t require a steady stream of electricity, as today’s chips do, they use a fraction of the energy.
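    The behavior described in the paragraph above can be caricatured in a few lines of code: a voltage pulse strains the material and flips its magnetic polarity, and between pulses the stored bit persists with no sustaining current. This toy class is an illustration of the described principle, not a simulation of a real device.

    ```python
    class MagnetoelectricBit:
        """Toy model of a magnetoelectric memory element."""

        def __init__(self):
            self.polarity = +1  # magnetic field sign (+1 or -1) encodes the bit

        def pulse(self):
            """A brief electrical pulse strains the material, flipping its
            magnetic field -- the only moment energy is drawn."""
            self.polarity = -self.polarity

        def read(self):
            """Reading the stored state requires no steady current."""
            return 1 if self.polarity > 0 else 0

    bit = MagnetoelectricBit()
    bit.pulse()
    print(bit.read())  # → 0: one pulse flipped the initial state
    ```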

    “A key to making magnetoelectric devices work is finding materials whose electrical and magnetic properties are linked,” Heron said. “And more magnetostriction means that a chip can do the same job with less energy.”

    Cheaper magnetoelectric devices with a tenfold improvement

    Most of today’s magnetostrictive materials use rare-earth elements, which are too scarce and costly to be used in the quantities needed for computing devices. But Heron’s team has found a way to coax high levels of magnetostriction from inexpensive iron and gallium.

    Ordinarily, explains Heron, the magnetostriction of iron-gallium alloy increases as more gallium is added. But those increases level off and eventually begin to fall as the higher amounts of gallium begin to form an ordered atomic structure.

    So the research team used a process called low-temperature molecular-beam epitaxy to essentially freeze atoms in place, preventing them from forming an ordered structure as more gallium was added. This way, Heron and his team were able to double the amount of gallium in the material, netting a tenfold increase in magnetostriction compared to unmodified iron-gallium alloys.

    “Low-temperature molecular-beam epitaxy is an extremely useful technique—it’s a little bit like spray painting with individual atoms,” Heron said. “And ‘spray painting’ the material onto a surface that deforms slightly when a voltage is applied also made it easy to test its magnetostrictive properties.”

    Researchers are working with Intel’s MESO program

    The magnetoelectric devices made in the study are several microns in size—large by computing standards. But the researchers are working with Intel to find ways to shrink them to a more useful size that will be compatible with the company’s magnetoelectric spin-orbit device (or MESO) program, one goal of which is to push magnetoelectric devices into the mainstream.

    “Intel is great at scaling things and at the nuts and bolts of making a technology actually work at the super-small scale of a computer chip,” Heron said. “They’re very invested in this project and we’re meeting with them regularly to get feedback and ideas on how to ramp up this technology to make it useful in the computer chips that they call MESO.”

    While a device that uses the material is likely decades away, Heron’s lab has filed for patent protection through the U-M Office of Technology Transfer.

    The research is supported by IMRA America and the National Science Foundation (grant numbers NNCI-1542081, EEC-1160504, DMR-1719875 and DMR-1539918).

    Other researchers on the paper include U-M associate professor of materials science and engineering Emmanouil Kioupakis; U-M assistant professor of materials science and engineering Robert Hovden; and U-M graduate student research assistants Peter Meisenheimer and Suk Hyun Sung.

    See the full article here.


    Please support STEM education in your local school system

    Stem Education Coalition

    U Michigan Campus

    The University of Michigan (U-M, UM, UMich, or U of M), frequently referred to simply as Michigan, is a public research university located in Ann Arbor, Michigan, United States. Originally founded in 1817 in Detroit as the Catholepistemiad, or University of Michigania, 20 years before the Michigan Territory officially became a state, the University of Michigan is the state’s oldest university. The university moved to Ann Arbor in 1837 onto 40 acres (16 ha) of what is now known as Central Campus. Since its establishment in Ann Arbor, the university campus has expanded to include more than 584 major buildings with a combined area of more than 34 million gross square feet (781 acres or 3.16 km²), and has two satellite campuses located in Flint and Dearborn. The University was one of the founding members of the Association of American Universities.

    Considered one of the foremost research universities in the United States, the university has very high research activity and its comprehensive graduate program offers doctoral degrees in the humanities, social sciences, and STEM fields (Science, Technology, Engineering and Mathematics) as well as professional degrees in business, medicine, law, pharmacy, nursing, social work and dentistry. Michigan’s body of living alumni (as of 2012) comprises more than 500,000. Besides academic life, Michigan’s athletic teams compete in Division I of the NCAA and are collectively known as the Wolverines. They are members of the Big Ten Conference.
