Tagged: Applied Research & Technology

  • richardmitnick 4:37 pm on October 15, 2021
    Tags: "Life on LEO: Plants to be Added to the Landscape Evolution Observatory at Biosphere 2", Applied Research & Technology, The University of Arizona (US)

    From The University of Arizona (US) : “Life on LEO: Plants to be Added to the Landscape Evolution Observatory at Biosphere 2” 


    Daniel Stolte

    Surprisingly little is known about how rain moves through landscapes once it’s on the ground. The University of Arizona’s Landscape Evolution Observatory is designed to provide answers. A $3.5 million grant will allow scientists to study the roles plants and microbes play in the process.

    One of three artificial hillslopes in the Landscape Evolution Observatory. Each is equipped with 1,900 sensors and sampling devices that enable scientists to monitor water, carbon and energy cycling processes and the physical and chemical evolution of the landscape at small and large scales. Credit: Aaron Bugaj.

    The National Science Foundation (US) has awarded $3.5 million to a team led by University of Arizona researchers to study how life prevails in barren landscapes, such as those disturbed by wildfires, volcanic eruptions or mining operations.

    The research will yield new insights into the effects of a changing climate on such landscapes, and could someday even help astronauts raise crops on Mars.

    Researchers from The University of Arizona, DOE’s Lawrence Berkeley National Laboratory (US) and California Lutheran University (US) will establish a complete ecosystem – with plants, artificial rain and sophisticated monitoring technology – on the large artificial hillslopes at the Landscape Evolution Observatory, or LEO, located inside The University of Arizona’s Biosphere 2. The experiment will offer scientists a detailed look at how emergent plant life interacts with soil, water and carbon dioxide from the atmosphere to create more complex ecosystems.

    “In a nutshell, we’re getting ready to put life on LEO in the form of plants,” said Scott Saleska, a professor in the Department of Ecology and Evolutionary Biology who took over as LEO’s director of science earlier this year. “This grant will allow us to answer a question central to ecology: Can we predict what is going to happen when we build up an ecosystem from scratch? LEO allows us to literally watch life’s complexity build up from ground zero.”

    LEO is the world’s largest laboratory experiment in the interdisciplinary earth sciences. The experiment consists of three artificial landscapes that mimic watersheds in the natural world, each contained within elaborate steel structures housed in three adjacent bays under the glass-and-steel domes of Biosphere 2. Each hillslope is 100 feet long and 35 feet wide and blanketed with 1 million pounds of crushed basalt rock, layered 3 feet deep. Each of LEO’s hillslopes is studded with 1,900 sensors that allow scientists to observe each step in the landscapes’ evolution – from lifeless soil to living, breathing landscapes that will ultimately support complex microbial and vascular plant communities.
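As a quick sanity check on the figures above (assuming they describe loose, crushed rock rather than solid basalt), the quoted footprint, depth and mass imply a plausible bulk density:

```python
# Rough consistency check of the LEO hillslope figures quoted above:
# 100 ft x 35 ft footprint, basalt layered 3 ft deep, ~1 million lb per slope.
# The numbers come from the article; the conversion factor is standard.

volume_ft3 = 100 * 35 * 3        # 10,500 cubic feet of crushed basalt
mass_lb = 1_000_000

density_lb_ft3 = mass_lb / volume_ft3
density_g_cm3 = density_lb_ft3 * 0.0160185   # 1 lb/ft^3 = 0.0160185 g/cm^3

print(f"{density_lb_ft3:.0f} lb/ft^3 = {density_g_cm3:.2f} g/cm^3")
```

That works out to roughly 1.5 g/cm³, consistent with loosely packed crushed basalt (solid basalt is nearer 3 g/cm³), so the quoted mass and dimensions hang together.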

    The first organisms to colonize barren landscapes are microbes and less complex plants, such as these mosses growing in the Landscape Evolution Observatory on hillslope soils created from crushed basalt rock that originated in a volcanic eruption. Credit: Aaron Bugaj.

    Over the past five years, researchers have used LEO to gain in-depth knowledge of how landscapes evolve in the absence of plant life other than microbes and mosses. Those studies focused on the interactions between soil and water, with the water being provided through a sophisticated irrigation system that simulates various kinds of rain. The new NSF grant kicks off a new phase of the project, allowing researchers to study more complex interactions between the physical and biological components of LEO’s ecosystem, particularly between tiny microbial communities and higher plants.

    Water, Water Everywhere – But What Does it Do and Where Does it Go?

    The world faces the increasingly urgent question of how to better understand and manage complex physical-biological systems to address pressing problems such as how to restore degraded landscapes, practice sustainable ecosystem management and terraform planets beyond Earth. Terraforming is the science of transforming hostile environments into land that can grow crops.

    By adding plants with roots and vascular systems to LEO, Saleska’s team will study how plant life affects a well-established physical system and test hypotheses about the interactions between plants and microbes.

    Project co-leader Katrina Dlugosch, associate professor of ecology and evolutionary biology, selected alfalfa as the model plant organism to be planted at LEO because it has been thoroughly studied, and its genome has been sequenced and is well-known. Alfalfa also commonly enters into symbioses – or partnerships – with microbes capable of scrubbing nitrogen from the atmosphere and converting it into nutrients the plants can use.

    “Alfalfa provides one of the key features of primary succession – the process of life colonizing an environment that has very little to offer in terms of nutrients,” Dlugosch explained.

    “We think there will be a strong selection in this harsh environment on how these plants establish and maintain their partnerships with the microbes, and we are looking to understand both the ecology of that and, down the road, the biological evolution of this hillslope community as a whole,” said Malak Tfaily, assistant professor in The University of Arizona Department of Environmental Science.

    The team also will use LEO’s hillslopes as models for watershed environments in the natural world. Experiments will test how water flows through landscapes, how that affects the weathering of rock to soil, and the effects of those processes on landscapes and their biological habitability.

    “The basic question boils down to: What happens to the rain?” said Peter Troch, University of Arizona professor of hydrology and atmospheric science and a member of the project’s steering committee. “We are going to test how water is used by plants through root water uptake or contributes to aquifer recharge and streamflow.”

    Troch expects the results to inform land management practices such as water conservation measures in water-limited environments and plant selection in landscape restoration efforts.

    A key part of the project is its scalability, Saleska added. What researchers learn from studying small patches of plants growing on the LEO hillslope can be applied to full landscapes.

    The project, titled "Growing a new science of landscape terraformation: The convergence of rock, fluids, and life to form complex ecosystems across scales," was selected by NSF under its Growing Convergence Research program, which aims to solve complex research problems with a focus on societal needs. In addition to experts in hydrology, geochemistry, evolutionary genomics and ecology, the LEO team will include anthropologists who study cultures of science, with the goal of breaking new ground in how researchers from historically separate disciplines can better share and integrate their ideas and insights for the benefit of the world.

    “These are extremely competitive grants, specifically created to address some of the world’s greatest challenges, and to even be considered requires a portfolio of interdisciplinary scholarship and technological capability that the university excels at bringing together,” said University of Arizona President Robert C. Robbins. “The fact that our researchers continue to attract these types of grants speaks to the unique ecosystem of talent, technology and perseverance that our faculty bring to the table.”

    Other members of the LEO project steering committee include Jon Chorover, head of the Department of Environmental Science; Jennifer Croissant, associate professor in the Department of Gender and Women’s Studies; Elizabeth “Betsy” Arnold, a professor in the School of Plant Sciences and the Department of Ecology and Evolutionary Biology; and William Riley, senior scientist at Lawrence Berkeley National Lab in Berkeley.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    As of 2019, The University of Arizona (US) enrolled 45,918 students in 19 separate colleges/schools, including The University of Arizona College of Medicine in Tucson and Phoenix and the James E. Rogers College of Law, and is affiliated with two academic medical centers (Banner – University Medical Center Tucson and Banner – University Medical Center Phoenix). The University of Arizona is one of three universities governed by the Arizona Board of Regents. The university is a member of the Association of American Universities – the only member from Arizona – and of the Universities Research Association (US). The university is classified among “R1: Doctoral Universities – Very High Research Activity”.

    Known as the Arizona Wildcats (often shortened to “Cats”), The University of Arizona’s intercollegiate athletic teams are members of the Pac-12 Conference of the NCAA. The University of Arizona athletes have won national titles in several sports, most notably men’s basketball, baseball, and softball. The official colors of the university and its athletic teams are cardinal red and navy blue.

    After the passage of the Morrill Land-Grant Act of 1862, the push for a university in Arizona grew. The Arizona Territory’s “Thieving Thirteenth” Legislature approved The University of Arizona in 1885 and selected the city of Tucson to receive the appropriation to build the university. Tucson had hoped to receive the appropriation for the territory’s mental hospital, which carried a $100,000 allocation, rather than the $25,000 allotted to the territory’s only university (Arizona State University (US) was also chartered in 1885, but it was created as Arizona’s normal school, not a university). Flooding on the Salt River delayed Tucson’s legislators, and by the time they reached Prescott, back-room deals allocating the most desirable territorial institutions had been made. Tucson was largely disappointed with receiving what was viewed as an inferior prize.

    With no parties willing to provide land for the new institution, the citizens of Tucson prepared to return the money to the Territorial Legislature until two gamblers and a saloon keeper decided to donate the land to build the school. Construction of Old Main, the first building on campus, began on October 27, 1887, and classes met for the first time in 1891 with 32 students in Old Main, which is still in use today. Because there were no high schools in Arizona Territory, the university maintained separate preparatory classes for the first 23 years of operation.


    The University of Arizona is classified among “R1: Doctoral Universities – Very high research activity”. UArizona is the fourth-largest recipient of research funding from the National Aeronautics and Space Administration (US) among public universities. The University of Arizona was awarded over $325 million for its Lunar and Planetary Laboratory (LPL) to lead NASA’s 2007–08 mission to Mars to explore the Martian Arctic, and $800 million for its OSIRIS-REx mission, the first in U.S. history to sample an asteroid.

    The LPL’s role in the Cassini mission orbiting Saturn is the largest of any university in the world; the University of Arizona laboratory designed and operated the probe’s atmospheric radiation investigations and imaging. The University of Arizona operates the HiRISE camera, part of the Mars Reconnaissance Orbiter. While using the HiRISE camera in 2011, University of Arizona alumnus Lujendra Ojha and his team discovered evidence of liquid water on the surface of Mars—a discovery confirmed by NASA in 2015. The University of Arizona receives more NASA grants annually than the next nine top NASA/JPL-Caltech (US)-funded universities combined. As of March 2016, The University of Arizona’s Lunar and Planetary Laboratory is actively involved in ten spacecraft missions: Cassini VIMS; GRAIL; the HiRISE camera orbiting Mars; the Juno mission orbiting Jupiter; the Lunar Reconnaissance Orbiter (LRO); MAVEN, which explores Mars’ upper atmosphere and its interactions with the Sun; Solar Probe Plus, the first mission to fly into the Sun’s atmosphere; Rosetta’s VIRTIS; WISE; and OSIRIS-REx, the first U.S. sample-return mission to a near-Earth asteroid, which launched on September 8, 2016.

    The University of Arizona students have been selected as Truman, Rhodes, Goldwater, and Fulbright Scholars. According to The Chronicle of Higher Education, UArizona is among the top 25 producers of Fulbright awards in the U.S.

    The University of Arizona is a member of the Association of Universities for Research in Astronomy (US), a consortium of institutions pursuing research in astronomy. The association operates observatories and telescopes, notably Kitt Peak National Observatory (US) just outside Tucson. Led by Roger Angel, researchers in the Steward Observatory Mirror Lab at The University of Arizona are working in concert to build the world’s most advanced telescope. Known as the Giant Magellan Telescope (CL), it will produce images 10 times sharper than those from the Earth-orbiting Hubble Telescope.

    Giant Magellan Telescope, 21 meters, to be sited at the Carnegie Institution for Science’s (US) Las Campanas Observatory (CL), some 115 km (71 mi) north-northeast of La Serena, Chile, at over 2,500 m (8,200 ft) elevation.

    The telescope is set to be completed in 2021. GMT will ultimately cost $1 billion. Researchers from at least nine institutions are working to secure the funding for the project. The telescope will include seven 18-ton mirrors capable of providing clear images of volcanoes and riverbeds on Mars and mountains on the moon at a rate 40 times faster than the world’s current large telescopes. The mirrors of the Giant Magellan Telescope will be built at The University of Arizona and transported to a permanent mountaintop site in the Chilean Andes where the telescope will be constructed.

    Reaching Mars in March 2006, the Mars Reconnaissance Orbiter carried the HiRISE camera, with Principal Investigator Alfred McEwen leading the project. This National Aeronautics and Space Administration (US) mission to Mars carrying the UArizona-designed camera is capturing the highest-resolution images of the planet ever seen. The orbiter’s journey spanned 300 million miles. In August 2007, The University of Arizona, under the direction of scientist Peter Smith, led the Phoenix Mars Mission, the first mission completely controlled by a university. Reaching the planet’s surface in May 2008, the mission’s purpose was to improve knowledge of the Martian Arctic. The Arizona Radio Observatory (US), part of The University of Arizona Department of Astronomy’s Steward Observatory (US), operates the Submillimeter Telescope on Mount Graham.

    The National Science Foundation(US) funded the iPlant Collaborative in 2008 with a $50 million grant. In 2013, iPlant Collaborative received a $50 million renewal grant. Rebranded in late 2015 as “CyVerse”, the collaborative cloud-based data management platform is moving beyond life sciences to provide cloud-computing access across all scientific disciplines.

    In June 2011, the university announced it would assume full ownership of the Biosphere 2 scientific research facility in Oracle, Arizona, north of Tucson, effective July 1. Biosphere 2 was constructed by private developers (funded mainly by Texas businessman and philanthropist Ed Bass) with its first closed system experiment commencing in 1991. The university had been the official management partner of the facility for research purposes since 2007.

    The University of Arizona Mirror Lab: Where else in the world can you find an astronomical observatory mirror lab under a football stadium?

    University of Arizona’s Biosphere 2, located in the Sonoran desert. An entire ecosystem under a glass dome? Visit our campus, just once, and you’ll quickly understand why the UA is a university unlike any other.

  • richardmitnick 3:46 pm on October 15, 2021
    Tags: "Two Impacts-Not Just One-May Have Formed The Moon", Applied Research & Technology

    From Sky & Telescope : “Two Impacts-Not Just One-May Have Formed The Moon” 


    October 14, 2021
    Asa Stahl

    In this image, the proposed hit-and-run collision is simulated in 3D, shown about an hour after impact. Theia, the impactor, barely escapes the collision. A. Emsenhuber / The University of Bern [Universität Bern](CH) / The Ludwig Maximilians University of Munich [Ludwig-Maximilians-Universität München](DE).

    Scientists have long thought that the Moon formed with a bang, when a protoplanet the size of Mars hit the newborn Earth. Evidence from Moon rocks and simulations backs up this idea.

    But a new study suggests that the protoplanet most likely hit Earth twice. The first time, the impactor (dubbed “Theia”) only glanced off Earth. Then, some hundreds of thousands of years later, it came back to deliver the final blow.

    The study, which simulated the literally Earth-shattering impact thousands of times, found that such a “hit-and-run return” scenario could help answer two longstanding questions surrounding the creation of the Moon. At the same time, it might explain how Earth and Venus ended up so different.

    The One-Two Punch

    “The key issue here is planetary diversity,” says Erik Asphaug (The University of Arizona (US)), who led the study. Venus and Earth have similar sizes, masses, and distances from the Sun. If Venus is a “crushing hot-house,” he asks, “why is Earth so amazingly blue and rich?”

    The Moon might hold the secret. Its creation was the last major episode in Earth’s formation, a catastrophic event that set the stage for the rest of our planet’s evolution. “You can’t understand how Earth formed without understanding how the Moon formed,” Asphaug explains. “They are part of the same puzzle.”

    The new simulations, which were published in the October issue of The Planetary Science Journal, put a few more pieces of that puzzle into place.

    The first has to do with the speed of Theia’s impact. If Theia had hit our planet too fast, it would have exploded into an interplanetary plume of debris and eroded much of Earth. Yet if it had come in too slowly, the result would be a Moon whose orbit looks nothing like what we see today. The original impact theory doesn’t explain why Theia traveled at a just-right speed between these extremes.

    “[This] new scenario fixes that,” says Matthias Meier (Natural History Museum, Switzerland), who was not involved in the study. Initially, Theia could have been going much faster, but the first impact would have slowed it down to the perfect speed for the second one.

    The other problem with the original impact theory is that our Moon ought to be mostly made of primordial Theia. But Moon rocks from the Apollo missions show that Earth and the Moon have nearly identical compositions when it comes to certain kinds of elements. How could they have formed from two different building blocks?

    “The canonical giant-impact scenario is really bad at solving [this issue],” Meier says (though others have tried).

    A hit-and-run return, on the other hand, would enable Earth’s and Theia’s materials to mix more than in a single impact, ultimately forming a Moon chemically more similar to Earth. Though Asphaug and colleagues don’t quite fix the mismatch, they argue that more advanced simulations would yield even better results.

    Earth vs. Venus

    Resolving this aspect of the giant-impact theory would be no mean feat. But Asphaug’s real surprise came when he saw how hit-and-run impacts would have affected Venus compared to Earth.

    “I first thought maybe there was a mistake,” he recalls.

    The new simulations showed that the young Earth tended to pass on half of its hit-and-runners to Venus, while Venus accreted almost everything that came its way. This dynamic could help explain the drastic differences between the two planets: If more runners ended up at Venus, they would have enriched the planet in more outer solar system material compared to Earth. And since the impactors that escaped Earth to go on to Venus would have been the faster ones, each planet would have experienced generally different collisions.

    This finding flips the original purpose of the study on its head. If Venus suffered more giant impacts than Earth, the question would no longer be “why does Earth have a moon?” but “why doesn’t Venus?”

    Perhaps there was only one hit-and-run event, the one that made our Moon. Perhaps there were many, but for the same reason that Venus collected more impacts than Earth, it also accreted more destructive debris, obliterating any moon it already had. Or perhaps the last of Venus’ impacts was just particularly violent.

    Finding out means taking a trip to Venus. That would provide “the next leap in understanding,” Meier says. If Earth and Venus both had hit-and-runs, for example, then the surface of Venus ought to be more like Earth’s than previously expected. And if Venus turns out to be chemically similar to the Moon and Earth as well, that would dispose of the giant-impact theory’s last remaining problem.

    “Getting samples from Venus,” Asphaug concludes, “is the key to answering all these questions.”

    See the full article here.



    Sky & Telescope, founded in 1941 by Charles A. Federer Jr. and Helen Spence Federer, has the largest, most experienced staff of any astronomy magazine in the world. Its editors are virtually all amateur or professional astronomers, and every one has built a telescope, written a book, done original research, developed a new product, or otherwise distinguished him or herself.

    Sky & Telescope magazine, now in its eighth decade, came about because of some happy accidents. Its earliest known ancestor was a four-page bulletin called The Amateur Astronomer, which was begun in 1929 by the Amateur Astronomers Association in New York City. Then, in 1935, the American Museum of Natural History opened its Hayden Planetarium and began to issue a monthly bulletin that became a full-size magazine called The Sky within a year. Under the editorship of Hans Christian Adamson, The Sky featured large illustrations and articles from astronomers all over the globe. It immediately absorbed The Amateur Astronomer.

    Despite initial success, by 1939 the planetarium found itself unable to continue financial support of The Sky. Charles A. Federer, who would become the dominant force behind Sky & Telescope, was then working as a lecturer at the planetarium. He was asked to take over publishing The Sky. Federer agreed and started an independent publishing corporation in New York.

    “Our first issue came out in January 1940,” he noted. “We dropped from 32 to 24 pages, used cheaper quality paper…but editorially we further defined the departments and tried to squeeze as much information as possible between the covers.” Federer was The Sky’s editor, and his wife, Helen, served as managing editor. In that January 1940 issue, they stated their goal: “We shall try to make the magazine meet the needs of amateur astronomy, so that amateur astronomers will come to regard it as essential to their pursuit, and professionals to consider it a worthwhile medium in which to bring their work before the public.”

  • richardmitnick 12:27 pm on October 15, 2021
    Tags: "Holey metalens!", Applied Research & Technology

    From Harvard University John A Paulson School of Engineering and Applied Sciences (US) : “Holey metalens!” 




    October 13, 2021
    Leah Burrows

    New metalens focuses light with ultra-deep holes.


    Metasurfaces are nanoscale structures that interact with light. Today, most metasurfaces use monolith-like nanopillars to focus, shape and control light. The taller the nanopillar, the more time it takes for light to pass through the nanostructure, giving the metasurface more versatile control of each color of light. But very tall pillars tend to fall or cling together. What if, instead of building tall structures, you went the other way?

    In a recent paper, researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) developed a metasurface that uses very deep, very narrow holes, rather than very tall pillars, to focus light to a single spot.

    The research is published in Nano Letters.

    The new metasurface uses more than 12 million needle-like holes drilled into a 5-micrometer silicon membrane, about 1/20 the thickness of a human hair. The diameter of these long, thin holes is only a few hundred nanometers, making the aspect ratio — the ratio of height to width — nearly 30:1.
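The geometry is easy to check: a hole can be at most as deep as the 5-micrometer membrane is thick, so an aspect ratio near 30:1 implies a diameter of roughly 170 nm. The exact diameter is not given in the article; this is only a back-of-envelope inference from the quoted figures.

```python
# Back-of-envelope check of the hole geometry described above.
# The silicon membrane is 5 micrometers thick, so the holes are at most
# 5,000 nm deep; an aspect ratio near 30:1 then implies a diameter of
# roughly 170 nm, in the "few hundred nanometers" range the article quotes.

depth_nm = 5_000        # membrane thickness, from the article
aspect_ratio = 30       # height : width, from the article

diameter_nm = depth_nm / aspect_ratio
print(f"implied hole diameter = {diameter_nm:.0f} nm")
```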

    It is the first time that holes with such a high aspect ratio have been used in meta-optics.

    “This approach may be used to create large achromatic metalenses that focus various colors of light to the same focal spot, paving the way for a generation of high-aspect ratio flat optics, including large-area broadband achromatic metalenses,” said Federico Capasso, the Robert L. Wallace Professor of Applied Physics and Vinton Hayes Senior Research Fellow in Electrical Engineering at SEAS and senior author of the paper.

    A scanning electron microscopy (SEM) image (left) of the holes on side I of the holey metalens and (right) SEM image of the holes on side II of the metalens. Credit: Capasso Lab/Harvard SEAS.

    “If you tried to make pillars with this aspect ratio, they would fall over,” said Daniel Lim, a graduate student at SEAS and co-first author of the paper. “The holey platform increases the accessible aspect ratio of optical nanostructures without sacrificing mechanical robustness.”

    Just like with nanopillars, which vary in size to focus light, the holey metalens has holes of varying size precisely positioned over the 2 mm lens diameter. The hole size variation bends the light towards the lens focus.
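The article doesn't spell out how the hole sizes map to focusing, but in standard metalens design each element imparts a position-dependent phase following a hyperbolic profile, so that light from every point on the lens arrives at the focus in step. The sketch below uses that textbook profile with an assumed wavelength and focal length (neither is given in the article); only the 2 mm diameter comes from the text.

```python
import math

# Textbook hyperbolic phase profile for a metalens: an element at radius r
# must impart phase phi(r) = -(2*pi/wavelength) * (sqrt(r^2 + f^2) - f)
# so that all rays reach the focal point with equal optical path length.
# Wavelength and focal length below are illustrative assumptions.

wavelength = 1.55e-6    # assumed operating wavelength (m)
focal_length = 5e-3     # assumed focal length (m)
lens_radius = 1e-3      # 2 mm lens diameter (from the article) -> 1 mm radius

def required_phase(r: float) -> float:
    """Phase (radians) a metasurface element at radius r must impart."""
    return -(2 * math.pi / wavelength) * (
        math.sqrt(r**2 + focal_length**2) - focal_length
    )

# A fabricated structure realizes phase only modulo 2*pi; in a holey
# metalens that wrapped phase is what the varying hole sizes encode.
for r in (0.0, 0.5 * lens_radius, lens_radius):
    phi = required_phase(r) % (2 * math.pi)
    print(f"r = {r * 1e3:.1f} mm -> required phase mod 2pi = {phi:.2f} rad")
```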

    “Holey metasurfaces add a new dimension to lens design by controlling the confinement and propagation of light over a wide parameter space and make new functionalities possible,” said Maryna Meretska, a postdoctoral fellow at SEAS and co-first author of the paper. “Holes can be filled in with nonlinear optical materials, which will lead to multi-wavelength generation and manipulation of light, or with liquid crystals to actively modulate the properties of light.”

    The metalenses were fabricated using conventional semiconductor industry processes and standard materials, allowing them to be manufactured at scale in the future.

    The Harvard Office of Technology Development has protected the intellectual property relating to this project and is exploring commercialization opportunities.

    This project is supported by the Defense Advanced Research Projects Agency (DARPA), under award number HR00111810001. Lim is supported by A*STAR Singapore through the National Science Scholarship Scheme. Meretska is supported by NWO Rubicon Grant 019.173EN.010 from the Dutch Funding Agency NWO.

    See the full article here.



    Through research and scholarship, the Harvard John A. Paulson School of Engineering and Applied Sciences (US) will create collaborative bridges across Harvard and educate the next generation of global leaders. By harnessing the power of engineering and applied sciences we will address the greatest challenges facing our society.

    Specifically, that means that SEAS will provide to all Harvard College students an introduction to and familiarity with engineering and technology, as this is essential knowledge in the 21st century.

    Moreover, our concentrators will be immersed in the liberal arts environment and be able to understand the societal context for their problem solving, capable of working seamlessly with others, including those in the arts, the sciences, and the professional schools. They will focus on the fundamental engineering and applied science disciplines for the 21st century, as we will not teach legacy 20th-century engineering disciplines.

    Instead, our curriculum will be rigorous but inviting to students, and be infused with active learning, interdisciplinary research, entrepreneurship and engineering design experiences. For our concentrators and graduate students, we will educate “T-shaped” individuals – with depth in one discipline but capable of working seamlessly with others, including arts, humanities, natural science and social science.

    To address current and future societal challenges, knowledge from fundamental science, art, and the humanities must all be linked through the application of engineering principles with the professions of law, medicine, public policy, design and business practice.

    In other words, solving important issues requires a multidisciplinary approach.

    With the combined strengths of SEAS, the Faculty of Arts and Sciences, and the professional schools, Harvard is ideally positioned to both broadly educate the next generation of leaders who understand the complexities of technology and society and to use its intellectual resources and innovative thinking to meet the challenges of the 21st century.

    Ultimately, we will provide to our graduates a rigorous quantitative liberal arts education that is an excellent launching point for any career and profession.

    Harvard University campus

    Harvard University (US) is the oldest institution of higher education in the United States, established in 1636 by vote of the Great and General Court of the Massachusetts Bay Colony. It was named after the College’s first benefactor, the young minister John Harvard of Charlestown, who upon his death in 1638 left his library and half his estate to the institution. A statue of John Harvard stands today in front of University Hall in Harvard Yard, and is perhaps the University’s best-known landmark.

    Harvard University (US) has 12 degree-granting Schools in addition to the Radcliffe Institute for Advanced Study. The University has grown from nine students with a single master to an enrollment of more than 20,000 degree candidates including undergraduate, graduate, and professional students. There are more than 360,000 living alumni in the U.S. and over 190 other countries.

    The Massachusetts colonial legislature, the General Court, authorized Harvard University (US)’s founding. In its early years, Harvard College primarily trained Congregational and Unitarian clergy, although it has never been formally affiliated with any denomination. Its curriculum and student body were gradually secularized during the 18th century, and by the 19th century, Harvard University (US) had emerged as the central cultural establishment among the Boston elite. Following the American Civil War, President Charles William Eliot’s long tenure (1869–1909) transformed the college and affiliated professional schools into a modern research university; Harvard became a founding member of the Association of American Universities in 1900. James B. Conant led the university through the Great Depression and World War II; he liberalized admissions after the war.

    The university is composed of ten academic faculties plus the Radcliffe Institute for Advanced Study. Arts and Sciences offers study in a wide range of academic disciplines for undergraduates and for graduates, while the other faculties offer only graduate degrees, mostly professional. Harvard has three main campuses: the 209-acre (85 ha) Cambridge campus centered on Harvard Yard; an adjoining campus immediately across the Charles River in the Allston neighborhood of Boston; and the medical campus in Boston’s Longwood Medical Area. Harvard University (US)’s endowment is valued at $41.9 billion, making it the largest of any academic institution. Endowment income helps enable the undergraduate college to admit students regardless of financial need and provide generous financial aid with no loans. The Harvard Library is the world’s largest academic library system, comprising 79 individual libraries holding about 20.4 million items.

    Harvard University (US) has more alumni, faculty, and researchers who have won Nobel Prizes (161) and Fields Medals (18) than any other university in the world and more alumni who have been members of the U.S. Congress, MacArthur Fellows, Rhodes Scholars (375), and Marshall Scholars (255) than any other university in the United States. Its alumni also include eight U.S. presidents and 188 living billionaires, the most of any university. Fourteen Turing Award laureates have been Harvard affiliates. Students and alumni have also won 10 Academy Awards, 48 Pulitzer Prizes, and 108 Olympic medals (46 gold), and they have founded many notable companies.


    Harvard University (US) was established in 1636 by vote of the Great and General Court of the Massachusetts Bay Colony. In 1638, it acquired British North America’s first known printing press. In 1639, it was named Harvard College after deceased clergyman John Harvard, an alumnus of the University of Cambridge (UK) who had left the school £779 and his library of some 400 volumes. The charter creating the Harvard Corporation was granted in 1650.

    A 1643 publication gave the school’s purpose as “to advance learning and perpetuate it to posterity, dreading to leave an illiterate ministry to the churches when our present ministers shall lie in the dust.” It trained many Puritan ministers in its early years and offered a classic curriculum based on the English university model—many leaders in the colony had attended the University of Cambridge—but conformed to the tenets of Puritanism. Harvard University (US) has never affiliated with any particular denomination, though many of its earliest graduates went on to become clergymen in Congregational and Unitarian churches.

    Increase Mather served as president from 1681 to 1701. In 1708, John Leverett became the first president who was not also a clergyman, marking a turning of the college away from Puritanism and toward intellectual independence.

    19th century

    In the 19th century, Enlightenment ideas of reason and free will were widespread among Congregational ministers, putting those ministers and their congregations in tension with more traditionalist, Calvinist parties. When Hollis Professor of Divinity David Tappan died in 1803 and President Joseph Willard died a year later, a struggle broke out over their replacements. Henry Ware was elected to the Hollis chair in 1805, and the liberal Samuel Webber was appointed to the presidency two years later, signaling the shift from the dominance of traditional ideas at Harvard to the dominance of liberal, Arminian ideas.

    Charles William Eliot, president 1869–1909, eliminated the favored position of Christianity from the curriculum while opening it to student self-direction. Though Eliot was the crucial figure in the secularization of American higher education, he was motivated not by a desire to secularize education but by Transcendentalist Unitarian convictions influenced by William Ellery Channing and Ralph Waldo Emerson.

    20th century

    In the 20th century, Harvard University (US)’s reputation grew as a burgeoning endowment and prominent professors expanded the university’s scope. Rapid enrollment growth continued as new graduate schools were begun and the undergraduate college expanded. Radcliffe College, established in 1879 as the female counterpart of Harvard College, became one of the most prominent schools for women in the United States. Harvard University (US) became a founding member of the Association of American Universities in 1900.

    The student body in the early decades of the century was predominantly “old-stock, high-status Protestants, especially Episcopalians, Congregationalists, and Presbyterians.” A 1923 proposal by President A. Lawrence Lowell that Jews be limited to 15% of undergraduates was rejected, but Lowell did ban blacks from freshman dormitories.

    President James B. Conant reinvigorated creative scholarship to guarantee Harvard University (US)’s preeminence among research institutions. He saw higher education as a vehicle of opportunity for the talented rather than an entitlement for the wealthy, so Conant devised programs to identify, recruit, and support talented youth. In 1943, he asked the faculty to make a definitive statement about what general education ought to be, at the secondary as well as at the college level. The resulting Report, published in 1945, was one of the most influential manifestos in 20th century American education.

    Between 1945 and 1960, admissions were opened up to bring in a more diverse group of students. No longer drawing mostly from select New England prep schools, the undergraduate college became accessible to striving middle class students from public schools; many more Jews and Catholics were admitted, but few blacks, Hispanics, or Asians. Throughout the rest of the 20th century, Harvard became more diverse.

    Harvard University (US)’s graduate schools began admitting women in small numbers in the late 19th century. During World War II, students at Radcliffe College (which since 1879 had been paying Harvard University (US) professors to repeat their lectures for women) began attending Harvard University (US) classes alongside men. Women were first admitted to the medical school in 1945. Since 1971, Harvard University (US) has controlled essentially all aspects of undergraduate admission, instruction, and housing for Radcliffe women. In 1999, Radcliffe was formally merged into Harvard University (US).

    21st century

    Drew Gilpin Faust, previously the dean of the Radcliffe Institute for Advanced Study, became Harvard University (US)’s first woman president on July 1, 2007. She was succeeded by Lawrence Bacow on July 1, 2018.

  • richardmitnick 11:03 am on October 15, 2021 Permalink | Reply
    Tags: "New nanowire architectures boost computers' processing power", Applied Research & Technology, ,   

    From Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH): “New nanowire architectures boost computers’ processing power” 

    From Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH)

    Sandy Evangelista

    Valerio Piazza is creating new 3D architectures built from an inventive form of nanowire. His research aims to push the boundaries of miniaturization and pave the way to more powerful electronic devices. He has just won the 2020 Piaget Scientific Award, whose prize money will fund his work at EPFL for a year.

    Piazza, a scientist at EPFL’s Laboratory of Semiconductor Materials, studies semiconductors at the nanoscale. His focus is nanowires, or nanostructures made of semiconducting materials, and his goal is to move transistors beyond their saturation point. That’s because transistors are everywhere – in cars, traffic lights, and even coffee makers – but their miniaturization capacity is reaching a limit because existing designs are nearly saturated. “The main challenges we now face in processing power relate to overcoming the transistor saturation point, which we can do with nanowires and other kinds of nanostructures,” says Piazza.

    Valerio Piazza characterizes nanowires to optimize their electrical properties © 2021 EPFL Alain Herzog.

    Much of the recent improvement in processing power stems from advancements in microfabrication methods. These methods are what have allowed engineers to develop compact, yet sophisticated electronic devices like smartphones and smartwatches. By reducing the size of transistors, engineers can fit more on a circuit, resulting in greater processing power for a given surface area. But that also means there’s a limit to just how small processors can go, based on the size of their transistors. At least that’s true for the current generation of processing technology. Piazza’s work aims to overcome that obstacle by developing new kinds of transistors based on nanowires for use in next-generation quantum computers.

    Today’s computers are made up of electronic components and integrated circuits like processing chips. Each bit corresponds to an electrical charge that indicates whether current is running through a wire or not (i.e., “on” or “off”). On the other hand, quantum computers are not limited to just two states but can accommodate an infinite number of states. The fundamental element of quantum computing is the qubit, which is the smallest unit of memory. And it’s precisely at this sub-micron level that Piazza is conducting his research.

    Nanowires are made up of atoms from groups III and V of the periodic table © 2021 EPFL Alain Herzog.

    Piazza’s horizontal nanowires – they can be vertical, too – are made up of atoms from groups III and V of the periodic table: gallium, aluminum, indium, nitrogen, phosphorus and arsenic. “Each step of our development work comes with its own set of challenges. First we have to nanostructure the substrate and create the material – here the challenge is to improve the quality of our crystals. Then we’ll need to characterize our nanowires, with the goal of improving their electrical properties,” he says.

    A complex network of nanowires © 2021 EPFL Alain Herzog.

    Processor transistors currently measure around 10 nm. Piazza’s (horizontal) nanowires are the same size but should offer better electrical performance, depending on crystal quality. His method involves etching nanoconductors on substrate surfaces in order to create different patterns, which will let him test various structures for enhancing performance. “Take a city’s highways as an example. If there’s just one road, you can get only from Point A to Point B. But if there are lots of exits and side streets, you can travel to different neighborhoods and go even farther,” says Piazza. In other words, he’s creating a network. Over the next few months he’ll focus on identifying factors that could improve the process.

    The Piaget Scientific Award, sponsored by Piaget, is a prestigious award given out by EPFL every year to promote groundbreaking research in the broader field of miniaturization and microengineering. The award comes with prize money allowing the winner to conduct research at an EPFL lab for one year. It’s open to outstanding young PhD graduates who have the potential of becoming pioneering researchers in the field.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    EPFL bloc

    EPFL campus

    The Swiss Federal Institute of Technology in Lausanne [EPFL-École polytechnique fédérale de Lausanne] (CH) is a research institute and university in Lausanne, Switzerland, that specializes in natural sciences and engineering. It is one of the two Swiss Federal Institutes of Technology, and it has three main missions: education, research and technology transfer.

    The QS World University Rankings ranks EPFL(CH) 14th in the world across all fields in their 2020/2021 ranking, whereas Times Higher Education World University Rankings ranks EPFL(CH) as the world’s 19th best school for Engineering and Technology in 2020.

    EPFL(CH) is located in the French-speaking part of Switzerland; the sister institution in the German-speaking part of Switzerland is the Swiss Federal Institute of Technology ETH Zürich [Eidgenössische Technische Hochschule Zürich] (CH). Associated with several specialized research institutes, the two universities form the Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH) which is directly dependent on the Federal Department of Economic Affairs, Education and Research. In connection with research and teaching activities, EPFL(CH) operates a nuclear reactor CROCUS; a Tokamak Fusion reactor; a Blue Gene/Q Supercomputer; and P3 bio-hazard facilities.

    ETH Zürich, EPFL (Swiss Federal Institute of Technology in Lausanne) [École polytechnique fédérale de Lausanne](CH), and four associated research institutes form the Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH) with the aim of collaborating on scientific projects.

    The roots of modern-day EPFL(CH) can be traced back to the foundation of a private school under the name École spéciale de Lausanne in 1853 at the initiative of Lois Rivier, a graduate of the École Centrale Paris (FR), and John Gay, the then professor and rector of the Académie de Lausanne. At its inception it had only 11 students and its offices were located at Rue du Valentin in Lausanne. In 1869, it became the technical department of the public Académie de Lausanne. When the Académie was reorganised and acquired the status of a university in 1890, the technical faculty changed its name to École d’ingénieurs de l’Université de Lausanne. In 1946, it was renamed the École polytechnique de l’Université de Lausanne (EPUL). In 1969, the EPUL was separated from the rest of the University of Lausanne and became a federal institute under its current name. EPFL(CH), like ETH Zürich(CH), is thus directly controlled by the Swiss federal government. In contrast, all other universities in Switzerland are controlled by their respective cantonal governments. Following the nomination of Patrick Aebischer as president in 2000, EPFL(CH) has started to develop into the field of life sciences. It absorbed the Swiss Institute for Experimental Cancer Research (ISREC) in 2008.

    In 1946, there were 360 students. In 1969, EPFL(CH) had 1,400 students and 55 professors. In the past two decades the university has grown rapidly and as of 2012 roughly 14,000 people study or work on campus, about 9,300 of these being Bachelor, Master or PhD students. The environment at modern day EPFL(CH) is highly international with the school attracting students and researchers from all over the world. More than 125 countries are represented on the campus and the university has two official languages, French and English.


    EPFL is organised into eight schools, themselves formed of institutes that group research units (laboratories or chairs) around common themes:

    School of Basic Sciences (SB, Jan S. Hesthaven)

    Institute of Mathematics (MATH, Victor Panaretos)
    Institute of Chemical Sciences and Engineering (ISIC, Emsley Lyndon)
    Institute of Physics (IPHYS, Harald Brune)
    European Centre of Atomic and Molecular Computations (CECAM, Ignacio Pagonabarraga Mora)
    Bernoulli Center (CIB, Nicolas Monod)
    Biomedical Imaging Research Center (CIBM, Rolf Gruetter)
    Interdisciplinary Center for Electron Microscopy (CIME, Cécile Hébert)
    Max Planck-EPFL Centre for Molecular Nanosciences and Technology (CMNT, Thomas Rizzo)
    Swiss Plasma Center (SPC, Ambrogio Fasoli)
    Laboratory of Astrophysics (LASTRO, Jean-Paul Kneib)

    School of Engineering (STI, Ali Sayed)

    Institute of Electrical Engineering (IEL, Giovanni De Micheli)
    Institute of Mechanical Engineering (IGM, Thomas Gmür)
    Institute of Materials (IMX, Michaud Véronique)
    Institute of Microengineering (IMT, Olivier Martin)
    Institute of Bioengineering (IBI, Matthias Lütolf)

    School of Architecture, Civil and Environmental Engineering (ENAC, Claudia R. Binder)

    Institute of Architecture (IA, Luca Ortelli)
    Civil Engineering Institute (IIC, Eugen Brühwiler)
    Institute of Urban and Regional Sciences (INTER, Philippe Thalmann)
    Environmental Engineering Institute (IIE, David Andrew Barry)

    School of Computer and Communication Sciences (IC, James Larus)

    Algorithms & Theoretical Computer Science
    Artificial Intelligence & Machine Learning
    Computational Biology
    Computer Architecture & Integrated Systems
    Data Management & Information Retrieval
    Graphics & Vision
    Human-Computer Interaction
    Information & Communication Theory
    Programming Languages & Formal Methods
    Security & Cryptography
    Signal & Image Processing

    School of Life Sciences (SV, Gisou van der Goot)

    Bachelor-Master Teaching Section in Life Sciences and Technologies (SSV)
    Brain Mind Institute (BMI, Carmen Sandi)
    Institute of Bioengineering (IBI, Melody Swartz)
    Swiss Institute for Experimental Cancer Research (ISREC, Douglas Hanahan)
    Global Health Institute (GHI, Bruno Lemaitre)
    Ten Technology Platforms & Core Facilities (PTECH)
    Center for Phenogenomics (CPG)
    NCCR Synaptic Bases of Mental Diseases (NCCR-SYNAPSY)

    College of Management of Technology (CDM)

    Swiss Finance Institute at EPFL (CDM-SFI, Damir Filipovic)
    Section of Management of Technology and Entrepreneurship (CDM-PMTE, Daniel Kuhn)
    Institute of Technology and Public Policy (CDM-ITPP, Matthias Finger)
    Institute of Management of Technology and Entrepreneurship (CDM-MTEI, Ralf Seifert)
    Section of Financial Engineering (CDM-IF, Julien Hugonnier)

    College of Humanities (CDH, Thomas David)

    Human and social sciences teaching program (CDH-SHS, Thomas David)

    EPFL Middle East (EME, Dr. Franco Vigliotti)

    Section of Energy Management and Sustainability (MES, Prof. Maher Kayal)

    In addition to the eight schools there are seven closely related institutions

    Swiss Cancer Centre
    Center for Biomedical Imaging (CIBM)
    Centre for Advanced Modelling Science (CADMOS)
    École cantonale d’art de Lausanne (ECAL)
    Campus Biotech
    Wyss Center for Bio- and Neuro-engineering
    Swiss National Supercomputing Centre

  • richardmitnick 10:40 am on October 15, 2021 Permalink | Reply
    Tags: "A New Link to an Old Model Could Crack the Mystery of Deep Learning", ANNs: artificial neural networks, Applied Research & Technology, , SVMs: support vector machines   

    From Quanta Magazine (US) : “A New Link to an Old Model Could Crack the Mystery of Deep Learning” 

    From Quanta Magazine (US)

    October 11, 2021
    Anil Ananthaswamy

    Olena Shmahalo/Quanta Magazine.

    In the machine learning world, the sizes of artificial neural networks — and their outsize successes — are creating conceptual conundrums. When a network named AlexNet won an annual image recognition competition in 2012, it had about 60 million parameters. These parameters, fine-tuned during training, allowed AlexNet to recognize images that it had never seen before. Two years later, a network named VGG wowed the competition with more than 130 million such parameters. Some artificial neural networks, or ANNs, now have billions of parameters.

    These massive networks — astoundingly successful at tasks such as classifying images, recognizing speech and translating text from one language to another — have begun to dominate machine learning and artificial intelligence. Yet they remain enigmatic. The reason behind their amazing power remains elusive.

    But a number of researchers are showing that idealized versions of these powerful networks are mathematically equivalent to older, simpler machine learning models called kernel machines. If this equivalence can be extended beyond idealized neural networks, it may explain how practical ANNs achieve their astonishing results.

    Part of the mystique of artificial neural networks is that they seem to subvert traditional machine learning theory, which leans heavily on ideas from statistics and probability theory. In the usual way of thinking, machine learning models — including neural networks, trained to learn about patterns in sample data in order to make predictions about new data — work best when they have just the right number of parameters.

    If the parameters are too few, the learned model can be too simple and fail to capture all the nuances of the data it’s trained on. Too many and the model becomes overly complex, learning the patterns in the training data with such fine granularity that it cannot generalize when asked to classify new data, a phenomenon called overfitting. “It’s a balance between somehow fitting your data too well and not fitting it well at all. You want to be in the middle,” said Mikhail Belkin, a machine learning researcher at The University of California-San Diego (US).
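    That balance can be made concrete with a small sketch (the sine curve, noise level, and polynomial degrees below are illustrative choices, not from the article): a straight line has too few parameters and underfits, while a polynomial with as many coefficients as data points fits the training noise exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ten noisy training samples drawn from a smooth underlying curve.
x_train = np.linspace(0.0, 1.0, 10)
y_train = np.sin(2 * np.pi * x_train) + 0.3 * rng.standard_normal(10)

# A clean held-out set for measuring generalization.
x_test = np.linspace(0.05, 0.95, 50)
y_test = np.sin(2 * np.pi * x_test)

def errors(degree):
    # Fit a polynomial of the given degree; return (train, test) mean
    # squared error.
    coeffs = np.polyfit(x_train, y_train, degree)
    train = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train, test

tr1, te1 = errors(1)   # too few parameters: underfits
tr3, te3 = errors(3)   # "in the middle"
tr9, te9 = errors(9)   # ten points, ten coefficients: fits the noise exactly
```

    The straight line misses the structure entirely, while the degree-9 polynomial drives its training error essentially to zero by memorizing the noise along with the signal.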

    By all accounts, deep neural networks like VGG have way too many parameters and should overfit. But they don’t. Instead, such networks generalize astoundingly well to new data — and until recently, no one knew why. It wasn’t for lack of trying. For example, Naftali Tishby, a computer scientist and neuroscientist at The Hebrew University of Jerusalem הַאוּנִיבֶרְסִיטָה הַעִבְרִית בִּירוּשָׁלַיִם‎ (IL) who died in August, argued that deep neural networks first fit the training data and then discard irrelevant information (by going through an information bottleneck), which helps them generalize. But others have argued that this doesn’t happen in all types of deep neural networks, and the idea remains controversial.

    Now, the mathematical equivalence of kernel machines and idealized neural networks is providing clues to why or how these over-parameterized networks arrive at (or converge to) their solutions. Kernel machines are algorithms that find patterns in data by projecting the data into extremely high dimensions. By studying the mathematically tractable kernel equivalents of idealized neural networks, researchers are learning why deep nets, despite their shocking complexity, converge during training to solutions that generalize well to unseen data.

    “A neural network is a little bit like a Rube Goldberg machine. You don’t know which part of it is really important,” said Belkin. “I think reducing [them] to kernel methods — because kernel methods don’t have all this complexity — somehow allows us to isolate the engine of what’s going on.”

    Find the Line

    Kernel methods, or kernel machines, rely on an area of mathematics with a long history. It goes back to the 19th-century German mathematician Carl Friedrich Gauss, who came up with the eponymous Gaussian kernel, which maps a variable x to a function with the familiar shape of a bell curve. The modern usage of kernels took off in the early 20th century, when the English mathematician James Mercer used them for solving integral equations. By the 1960s, kernels were being used in machine learning to tackle data that was not amenable to simple techniques of classification.

    Understanding kernel methods requires starting with algorithms in machine learning called linear classifiers. Let’s say that cats and dogs can be classified using data in only two dimensions, meaning that you need two features (say the size of the snout, which we can plot on the x-axis, and the size of the ears, which goes on the y-axis) to tell the two types of animals apart. Plot this labeled data on the xy-plane, and cats should be in one cluster and dogs in another.

    One can then train a linear classifier using the labeled data to find a straight line that separates the two clusters. This involves finding the coefficients of the equation representing the line. Now, given new unlabeled data, it’s easy to classify it as a dog or a cat by seeing which side of the line it falls on.
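    As a hedged illustration (the animal measurements and the perceptron update rule here are stand-ins chosen for brevity, not taken from the article), training such a linear classifier can be sketched like this:

```python
import numpy as np

# Toy labeled data: snout size on x, ear size on y; -1 = cat, +1 = dog.
X = np.array([[1.0, 1.2], [1.5, 0.8], [0.8, 1.5], [1.2, 1.0],   # cats
              [3.0, 3.2], [3.5, 2.8], [2.8, 3.5], [3.2, 3.0]])  # dogs
y = np.array([-1.0, -1.0, -1.0, -1.0, 1.0, 1.0, 1.0, 1.0])

# Perceptron training: nudge the line's coefficients (w, b) whenever a
# point lands on the wrong side of the line.
w, b = np.zeros(2), 0.0
for _ in range(100):
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:   # misclassified point
            w += yi * xi
            b += yi

# Classify by checking which side of the line each point falls on.
predictions = np.sign(X @ w + b)
```

    Once the coefficients are found, a new unlabeled point is classified with the same one-line sign test.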

    Dog and cat lovers, however, would be aghast at such oversimplification. Actual data about the snouts and ears of the many types of cats and dogs almost certainly can’t be divided by a linear separator. In such situations, when the data is linearly inseparable, it can be transformed or projected into a higher-dimensional space. (One simple way to do this would be to multiply the values of the two features to create a third; maybe there is something about the correlation between the sizes of the snouts and ears that separates dogs from cats.)

    More generally, looking at the data in higher-dimensional space makes it easier to find a linear separator, known as a hyperplane when the space has more than three dimensions. When this hyperplane is projected back to the lower dimensions, it’ll take the shape of a nonlinear function with curves and wiggles that separates the original lower-dimensional data into two clusters.
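    A minimal sketch of that projection, using made-up XOR-style data rather than anything from the article: in two dimensions no straight line separates the classes, but adding the product of the two features as a third coordinate makes a flat separator possible.

```python
import numpy as np

# Four XOR-style points: opposite corners share a class, so no straight
# line in the plane can separate +1 from -1.
X = np.array([[-1.0, -1.0], [1.0, 1.0],    # class +1
              [-1.0, 1.0], [1.0, -1.0]])   # class -1
y = np.array([1.0, 1.0, -1.0, -1.0])

# Lift to 3-D: the new third feature is the product of the first two.
X3 = np.column_stack([X, X[:, 0] * X[:, 1]])

# In the lifted space, the flat plane z = 0 separates the classes; its
# normal vector points along the new axis.
w = np.array([0.0, 0.0, 1.0])
predictions = np.sign(X3 @ w)
```

    Projected back down to the plane, that flat separator becomes the curved boundary the text describes.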

    When we’re working with real data, though, it’s often computationally inefficient — and sometimes impossible — to find the coefficients of the hyperplane in high dimensions. But it isn’t a problem for kernel machines.

    Kernel of Truth

    The power of kernel machines involves their ability to do two things. First, they map each point in a low-dimensional data set to a point that lives in higher dimensions. The dimensionality of this hyperspace can be infinite, depending on the mapping, which can pose a problem: Finding the coefficients of the separating hyperplane involves calculating something called an inner product for each pair of high-dimensional features, and that becomes difficult when the data is projected into infinite dimensions.

    Samuel Velasco/Quanta Magazine.

    So here’s the second thing kernel machines do: Given two low-dimensional data points, they use a kernel function to spit out a number that’s equal to the inner product of the corresponding higher-dimensional features. Crucially, the algorithm can use this trick to find the coefficients of the hyperplane, without ever actually stepping into the high-dimensional space.
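    The identity behind the trick can be checked directly. For a degree-2 polynomial kernel on 2-D points — a standard textbook example, not one taken from the article — the kernel value computed in low dimensions exactly equals the inner product of an explicit 3-D feature map:

```python
import numpy as np

def phi(x):
    # Explicit feature map: lifts a 2-D point into 3-D.
    return np.array([x[0] ** 2, x[1] ** 2, np.sqrt(2.0) * x[0] * x[1]])

def kernel(u, v):
    # Degree-2 polynomial kernel, computed without ever leaving 2-D.
    return (u @ v) ** 2

u = np.array([1.0, 2.0])
v = np.array([3.0, 0.5])

lifted = phi(u) @ phi(v)  # inner product in the higher-dimensional space
tricked = kernel(u, v)    # the same number, from the low-dimensional data
```

    For richer kernels like the Gaussian, the corresponding feature space is infinite-dimensional, so the left-hand computation is impossible in practice while the right-hand one stays cheap.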

    “The great thing about the kernel trick is that all the computations happen in the low-dimensional space” rather than the possibly infinite-dimensional space, said Bernhard Boser, a professor emeritus at The University of California-Berkeley (US).

    Boser, together with his colleagues Isabelle Guyon and Vladimir Vapnik, invented a class of kernel machines called support vector machines (SVMs) in the late 1980s and early 1990s, when they were all at Bell Labs in Holmdel, New Jersey. While kernel machines of various types had made their mark in machine learning from the 1960s onward, it was with the invention of SVMs that they took center stage. SVMs proved extraordinarily powerful. By the early 2000s, they were used in fields as diverse as bioinformatics (for finding similarities between different protein sequences and predicting the functions of proteins, for example), machine vision and handwriting recognition.

    SVMs went on to dominate machine learning until deep neural networks came of age in 2012 with the arrival of AlexNet. As the machine learning community pivoted to ANNs, SVMs were left stranded, but they (and kernel machines generally) remain powerful models that have much to teach us. For example, they can do more than just use the kernel trick to find a separating hyperplane.

    “If you have a powerful kernel, then you are mapping the data to a kernel space that is kind of infinite-dimensional and very powerful,” said Chiyuan Zhang, a research scientist at Google Research’s Brain Team. “You can always find a linear separator in this powerful hidden space that separates the data, and there are infinitely many possible solutions.” But kernel theory lets you pick not just an arbitrary linear separator, but the best possible one (for some definition of “best”), by limiting the space of solutions to search. This is akin to reducing the number of parameters in a model to prevent it from overfitting, a process called regularization. Zhang wondered if deep neural networks might be doing something similar.

    Deep neural networks are made of layers of artificial neurons. They have an input layer, an output layer and at least one hidden layer sandwiched between them. The more hidden layers there are, the deeper the network. The parameters of the network represent the strengths of the connections between these neurons. Training a network for, say, image recognition involves repeatedly showing it previously categorized images and determining values for its parameters that help it correctly characterize those images. Once trained, the ANN represents a model for turning an input (say, an image) into an output (a label or category).

    In 2017, Zhang and colleagues carried out a series of empirical tests on networks like AlexNet and VGG to see whether the algorithms that are used to train these ANNs are somehow effectively reducing the number of tunable parameters, resulting in a form of implicit regularization. In other words, did the training regime render these networks incapable of overfitting?

    The team found that this was not the case. Using cleverly manipulated data sets, Zhang’s team showed that AlexNet and other such ANNs are indeed capable of overfitting and not generalizing. But the same networks trained with the same algorithm didn’t overfit — rather, they generalized well — when given unaltered data. This kind of implicit regularization couldn’t be the answer. The finding called for “a better explanation to characterize generalization in deep neural networks,” said Zhang.
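    Zhang’s actual experiments used deep networks on manipulated image sets; as a loose analogy only (with made-up data), an over-parameterized linear model shows the same capability of fitting even random labels exactly:

```python
import numpy as np

rng = np.random.default_rng(1)

# Over-parameterized setting: 20 training examples, 50 tunable parameters.
n_samples, n_params = 20, 50
X = rng.standard_normal((n_samples, n_params))
random_labels = rng.choice([-1.0, 1.0], size=n_samples)

# Least squares finds weights that reproduce even pure-noise labels
# exactly, because there are more parameters than constraints.
w, *_ = np.linalg.lstsq(X, random_labels, rcond=None)
train_error = np.mean((X @ w - random_labels) ** 2)
```

    A model that can perfectly fit noise is, by definition, capable of overfitting; the puzzle is why trained networks nonetheless generalize on real data.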

    Infinite Neurons

    Meanwhile, studies were showing that wider neural networks are typically as good or better at generalization than their narrower counterparts. To some this was a hint that maybe ANNs could be understood by adopting a strategy from physics, where “studying limiting cases can sometimes simplify a problem,” said Yasaman Bahri, a research scientist on Google Research’s Brain Team. To tackle such situations, physicists often simplify the problem by considering extreme cases. What happens when the number of particles in a system goes to infinity, for example? “Statistical effects can become easier to deal with in those limits,” said Bahri. What would happen to a neural network, mathematically speaking, if the width of its layers — the number of neurons in a single layer — were infinite?

    In 1994, Radford Neal, now a professor emeritus at The University of Toronto (CA), asked this exact question of a network with a single hidden layer. He showed that if the weights of this network were set up, or initialized, with certain statistical properties, then at initialization (before any training), such a network was mathematically equivalent to a well-known kernel-based model called a Gaussian process. More than two decades later, in 2017, two groups, including Bahri’s, showed that the same holds true of idealized infinite-width deep neural networks with many hidden layers.

    This had a startling implication. Usually, even after a deep net has been trained, an analytical mathematical expression cannot be used to make predictions about unseen data. You just have to run the deep net and see what it says — it’s something of a black box. But in the idealized scenario, at initialization the network is equivalent to a Gaussian process. You can throw away your neural network and just train the kernel machine, for which you have the mathematical expressions.

    “Once you map it over to a Gaussian process … you can calculate analytically what the prediction should be,” said Bahri.
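    A hedged sketch of what “calculate analytically” means here: Gaussian-process regression predicts with a closed-form expression built from the kernel — one kernel matrix and one linear solve, no training loop. (The data, length scale and jitter below are illustrative choices, not from the article.)

```python
import numpy as np

def gaussian_kernel(a, b, length=1.0):
    # Gauss's bell-curve kernel: similarity of every pair of inputs.
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

# Observations of a smooth function at five points.
x_train = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y_train = np.sin(x_train)
x_query = np.array([0.5])

# Posterior mean of the Gaussian process, in closed form.
jitter = 1e-6 * np.eye(len(x_train))   # small diagonal for stability
K = gaussian_kernel(x_train, x_train) + jitter
k_star = gaussian_kernel(x_query, x_train)
prediction = k_star @ np.linalg.solve(K, y_train)
```

    The prediction at the query point lands close to the underlying function, with no black-box iteration involved.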

    This was already a landmark result, but it didn’t mathematically describe what happens during the most common form of training used in practice. In this latter setting, it was unclear how the solution could generalize so well.

    Begin the Descent

    Part of the mystery centered on how deep neural networks are trained, which involves an algorithm called gradient descent. The word “descent” refers to the fact that, during training, the network traverses a complex, high-dimensional landscape full of hills and valleys, where each location in the landscape represents the error made by the network for a given set of parameter values. Eventually, once the parameters have been suitably tuned, the ANN reaches a region called the global minimum, meaning it’s as close as possible to accurately classifying the training data. Training a network is essentially a problem of optimization, of finding the global minimum, with the trained network representing an almost optimal function that maps inputs to outputs. It’s a complex process that’s difficult to analyze.
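    As a toy illustration of the descent itself (not the training loop of any real network), gradient descent on a simple bowl-shaped loss reliably walks downhill to the global minimum; the difficulty with real deep nets is that their landscapes are not bowls.

    ```python
    import numpy as np

    def gradient_descent(grad, x0, lr=0.1, steps=200):
        # Repeatedly step opposite the gradient of the loss.
        x = np.asarray(x0, dtype=float)
        for _ in range(steps):
            x = x - lr * grad(x)
        return x

    # A bowl-shaped loss L(x) = ||x - c||^2, whose global minimum sits at c.
    c = np.array([3.0, -2.0])
    grad_L = lambda x: 2.0 * (x - c)
    x_min = gradient_descent(grad_L, [0.0, 0.0])  # converges to c
    ```

    For this convex loss each step contracts the distance to the minimum by a constant factor, which is exactly the kind of guarantee that is missing for a general, non-convex deep-net landscape.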

    “No existing theory can guarantee that if you apply some widely used algorithm like gradient descent, [the ANN] can converge to the global minimum,” said Simon Du, an expert on machine learning at The University of Washington (US). By the end of 2018, researchers had begun to understand why.

    Again, as often happens with major scientific advances, multiple groups arrived at a possible answer at the same time, based on mathematical analyses of infinite-width networks and how they relate to the better-understood kernel machines. Around the time Du’s group and others put out papers, a young Swiss graduate student named Arthur Jacot presented his group’s work at NeurIPS 2018, the field’s flagship conference.

    While the teams differed in the details and the framing of their work, the essence was this: Deep neural networks of infinite width, whose weights are initialized with certain statistical properties, are exactly equivalent to kernels not just at initialization, but throughout the training process. A key assumption is that the individual weights change very little during training (though the net effect of an infinite number of small changes is significant). Under this assumption, Jacot and his colleagues at the EPFL (Swiss Federal Institute of Technology in Lausanne) [École polytechnique fédérale de Lausanne](CH) showed that an infinite-width deep neural network is always equivalent to a kernel that never changes during training. The kernel does not even depend on the training data; it depends only on the architecture of the neural network, such as its depth and type of connectivity. The team named it the neural tangent kernel, after some of its geometric properties.
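    For intuition, the finite-width (“empirical”) analogue of this kernel can be computed directly as an inner product of parameter gradients of the network output. A rough sketch for a tiny hypothetical one-hidden-layer network follows; it uses finite differences purely for illustration, whereas real implementations use automatic differentiation.

    ```python
    import numpy as np

    def ntk_entry(x1, x2, params, f, eps=1e-5):
        # Empirical neural tangent kernel entry:
        #   Theta(x1, x2) = <df(x1)/dtheta, df(x2)/dtheta>.
        # Gradients via central finite differences (illustration only).
        def grad(x):
            g = np.zeros_like(params)
            for i in range(len(params)):
                p_hi = params.copy(); p_hi[i] += eps
                p_lo = params.copy(); p_lo[i] -= eps
                g[i] = (f(p_hi, x) - f(p_lo, x)) / (2 * eps)
            return g
        return grad(x1) @ grad(x2)

    # A tiny hypothetical net: scalar input and output, 3 hidden units.
    rng = np.random.default_rng(0)
    theta = rng.normal(size=6)  # first 3: input weights, last 3: output weights

    def net(p, x):
        hidden = np.tanh(p[:3] * x)
        return p[3:] @ hidden / np.sqrt(3)

    k_diag = ntk_entry(0.5, 0.5, theta, net)     # a squared gradient norm, so >= 0
    k_offdiag = ntk_entry(0.3, 0.7, theta, net)  # symmetric in its two inputs
    ```

    At finite width this kernel depends on the random initialization and drifts during training; the infinite-width result says that drift vanishes in the limit.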

    “We know that at least in some cases neural networks can behave like kernel methods,” said Jacot. “It’s the first step to try to really compare these methods in trying to understand the similarities and differences.”

    Getting to All ANNs

    The most important outcome of this result is that it explains why deep neural networks, at least in this ideal scenario, converge to a solution. This convergence is difficult to prove mathematically when we look at an ANN in parameter space, that is, in terms of its parameters and the complex loss landscape. But because the idealized deep net is equivalent to a kernel machine, we can use the training data to train either the deep net or the kernel machine, and each will eventually find a near-optimal function that transforms inputs to outputs.

    During training, the evolution of the function represented by the infinite-width neural network matches the evolution of the function represented by the kernel machine. When seen in function space, the neural network and its equivalent kernel machine both roll down a simple, bowl-shaped landscape in some hyper-dimensional space. It’s easy to prove that gradient descent will get you to the bottom of the bowl — the global minimum. At least for this idealized scenario, “you can prove global convergence,” said Du. “That’s why the learning theory community people are very excited.”

    Not everyone is convinced that this equivalence between kernels and neural networks will hold for practical neural networks, which have finite width and whose parameters can change dramatically during training. “I think there are some dots that still need to be connected,” said Zhang. There’s also the psychological aspect: Neural networks have a mystique about them, and to reduce them to kernel machines feels disappointing for Zhang. “I kind of hope it’s not the answer, because it makes things less interesting in the sense that the old theory can be used.”

    But others are excited. Belkin, for example, thinks that even if kernel methods are old theory, they are still not fully understood. His team has shown empirically that kernel methods don’t overfit and do generalize well to test data without any need for regularization, similar to neural networks and contrary to what you’d expect from traditional learning theory. “If we understand what’s going on with kernel methods, then I think that really gives us a key to open this magic box of [neural networks],” said Belkin.

    Not only do researchers have a firmer mathematical grasp of kernels, making it easier to use them as analogues to understand neural nets, but they’re also empirically easier to work with than neural networks. Kernels are far less complex, they don’t require the random initialization of parameters, and their performance is more reproducible. Researchers have begun investigating links between realistic networks and kernels and are excited to see just how far they can take this new understanding.

    “If we establish absolute, complete equivalence, then I think it would kind of change the whole game,” said Belkin.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine (US) is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

  • richardmitnick 4:40 pm on October 14, 2021 Permalink | Reply
    Tags: "Department of Energy gives green light for a flagship petawatt laser facility at SLAC", Applied Research & Technology, , , , Locating high-energy high-power lasers next to an XFEL can now be realized., , , Two state-of-the-art laser systems ­– a high-power petawatt laser and a high-energy kilojoule laser., University of Rochester’s Laboratory for Laser Energetics (LLE)   

    From DOE’s SLAC National Accelerator Laboratory (US) : “Department of Energy gives green light for a flagship petawatt laser facility at SLAC” 

    From DOE’s SLAC National Accelerator Laboratory (US)

    October 7, 2021
    Ali Sundermier
    Glennda Chui

    High-power lasers will work in concert with the lab’s X-ray laser to dramatically improve our understanding of matter in extreme conditions.

    Petawatt lasers are the most powerful on the planet, generating a million billion watts to produce some of the most extreme conditions seen on Earth. But today’s petawatt lasers are standalone facilities, with limited ability to fully diagnose the conditions they produce.

    A new facility at the Department of Energy’s SLAC National Accelerator Laboratory will change that. It will be the first to combine these powerful lasers with an X-ray free-electron laser (XFEL) that can probe the extreme conditions they create as never before. Coupled to the lab’s Linac Coherent Light Source (LCLS), the Matter in Extreme Conditions Upgrade, or MEC-U, promises to dramatically improve our understanding of the conditions needed to produce fusion energy and to replicate a wide range of astrophysical phenomena here on Earth.

    In a new underground experimental facility coupled to SLAC’s Linac Coherent Light Source (LCLS), two state-of-the-art laser systems – a high-power petawatt laser and a high-energy kilojoule laser – will feed into two new experimental areas dedicated to the study of hot dense plasmas, astrophysics, and planetary science. (Gilliss Dyer/SLAC National Accelerator Laboratory)

    The project got approval from the DOE Office of Science (SC) on Monday to move from its conceptual design phase to preliminary design and execution, having passed what is known as Critical Decision 1.

    “It’s been gratifying to see the community rally together to support this project, and I think this achievement really validates those efforts. It shows that this notion of locating high-energy high-power lasers next to an XFEL can now be realized,” said SLAC scientist Arianna Gleason.

    “Working in concert, they’ll allow us to look behind the curtain of physics at extreme conditions to see how it’s all stitched together, opening a new frontier.”

    A national opportunity

    SLAC will work in partnership with the DOE’s Lawrence Livermore National Laboratory (US) and the University of Rochester’s Laboratory for Laser Energetics (LLE) to design and construct the facility in a new underground cavern.

    University of Rochester (US) Laboratory for Laser Energetics.

    There, two state-of-the-art laser systems – a high-power petawatt laser and a high-energy kilojoule laser – will feed into two new experimental areas dedicated to the study of hot dense plasmas, astrophysics, and planetary science.

    “Not only are we working with some of the leading laser laboratories in the world, but we’re also working with world experts in experimental science, high energy density science and the operation of DOE Office of Science user facilities, where scientists from all over the world can come to do experiments,” said Alan Fry, MEC-U Project Director.

    Scientists started discussing what would be needed to make a quantum leap in this field in 2014 at a series of high-power laser workshops at SLAC. Three years later, a National Academies report called “Opportunities in intense ultrafast lasers: Reaching for the brightest light” highlighted the importance of this field of science. It recommended that DOE secure a key global advantage for the U.S. by locating high-intensity lasers “with existing infrastructure, such as particle accelerators.”

    Building on success

    This project builds on the success achieved at the existing Matter in Extreme Conditions (MEC) instrument at LCLS. Funded by DOE SC’s Fusion Energy Sciences program (FES), MEC uses short-pulse lasers coupled to X-ray laser pulses from LCLS to probe the characteristics of matter with unprecedented precision. These experiments have delivered a wealth of outstanding science and attracted worldwide media attention, with examples ranging from the study of “diamond rain” thought to exist on Neptune, to the signatures of asteroid impacts on Earth, to potential failure mechanisms of satellites caused by solar flares.

    The Matter in Extreme Conditions instrument at SLAC serves hundreds of scientists from across the community, providing the tools necessary to investigate extremely hot, dense matter similar to that found in the centers of stars and giant planets. Credit: Matt Beardsley/SLAC National Accelerator Laboratory.

    The existing MEC instrument is, however, limited in the regimes it can access: its modest laser capabilities don’t allow it to reach the conditions of highest interest to researchers. The community therefore called for investment in a petawatt laser that can produce unprecedented light pressures and generate plasmas at the even higher temperatures found in cosmic collisions, the cores of stars and planets, and fusion devices, giving scientists access to the more extreme forms of matter needed to address the most important scientific challenges identified by the broad community of scientific users.

    “The new high-power lasers being designed by Livermore and Rochester are world-leading in their own right,” Fry said. “The fact that they’re coupled to LCLS then really puts it over the top in terms of capabilities.”

    MEC-U will also take advantage of the LCLS-II upgrade to the LCLS facility, which will provide X-ray laser beams of unsurpassed brilliance for probing those plasmas, doubling the X-ray energy that has been attainable to date.

    SLAC/LCLS II projected view.

    Magnets called undulators stretch roughly 100 meters down a tunnel at SLAC National Accelerator Laboratory, with one side (right) producing hard X-rays and the other soft X-rays. Credit: SLAC National Accelerator Laboratory.

    New scientific frontiers

    Access to the facility will be open to researchers from across the country and around the world, facilitated in part by LaserNetUS, a research network that is boosting access to high-intensity laser facilities at labs and universities across the country. This will allow more MEC users in a broader range of fields to use the facility, while also helping train new staff and develop new techniques.

    “This new facility will lead to a greater understanding of everything from fusion energy to the most extreme phenomena in the universe, shedding light on cosmic rays, planetary physics and stellar conditions,” said Siegfried Glenzer, director of the High Energy Density Division at SLAC. “It really shows the DOE’s dedication to continue to tackle the most important and exciting problems in plasma physics.”


    SLAC National Accelerator Laboratory (US), originally named the Stanford Linear Accelerator Center, is a Department of Energy (US) national laboratory operated by Stanford University (US) under the programmatic direction of the Department of Energy (US) Office of Science and located in Menlo Park, California. It is the site of the Stanford Linear Accelerator, a 3.2-kilometer (2-mile) linear accelerator constructed in 1966 and shut down in the 2000s, which could accelerate electrons to energies of 50 GeV.
    Today SLAC research centers on a broad program in atomic and solid-state physics, chemistry, biology, and medicine using X-rays from synchrotron radiation and a free-electron laser as well as experimental and theoretical research in elementary particle physics, astroparticle physics, and cosmology.

    Founded in 1962 as the Stanford Linear Accelerator Center, the facility is located on 172 hectares (426 acres) of Stanford University-owned land on Sand Hill Road in Menlo Park, California—just west of the University’s main campus. The main accelerator is 3.2 kilometers (2 mi) long—the longest linear accelerator in the world—and has been operational since 1966.

    Research at SLAC has produced three Nobel Prizes in Physics

    1976: The charm quark—see J/ψ meson
    1990: Quark structure inside protons and neutrons
    1995: The tau lepton

    SLAC’s meeting facilities also provided a venue for the Homebrew Computer Club and other pioneers of the home computer revolution of the late 1970s and early 1980s.

    In 1984 the laboratory was named an ASME National Historic Engineering Landmark and an IEEE Milestone.

    SLAC developed and, in December 1991, began hosting the first World Wide Web server outside of Europe.

    In the early-to-mid 1990s, the Stanford Linear Collider (SLC) investigated the properties of the Z boson using the Stanford Large Detector.

    As of 2005, SLAC employed over 1,000 people, some 150 of whom were physicists with doctorate degrees, and served over 3,000 visiting researchers yearly, operating particle accelerators for high-energy physics and the Stanford Synchrotron Radiation Laboratory (SSRL) for synchrotron light radiation research, which was “indispensable” in the research leading to the 2006 Nobel Prize in Chemistry awarded to Stanford Professor Roger D. Kornberg.

    In October 2008, the Department of Energy announced that the center’s name would be changed to SLAC National Accelerator Laboratory. The reasons given include a better representation of the new direction of the lab and the ability to trademark the laboratory’s name. Stanford University had legally opposed the Department of Energy’s attempt to trademark “Stanford Linear Accelerator Center”.

    In March 2009, it was announced that the SLAC National Accelerator Laboratory was to receive $68.3 million in Recovery Act Funding to be disbursed by Department of Energy’s Office of Science.

    In October 2016, Bits and Watts launched as a collaboration between SLAC and Stanford University to design “better, greener electric grids”. SLAC later pulled out over concerns about an industry partner, the state-owned Chinese electric utility.


    The main accelerator was an RF linear accelerator that accelerated electrons and positrons up to 50 GeV. At 3.2 km (2.0 mi) long, it was the longest linear accelerator in the world and was claimed to be “the world’s most straight object” until 2017, when the European X-ray free-electron laser opened. The main accelerator is buried 9 m (30 ft) below ground and passes underneath Interstate Highway 280. The above-ground klystron gallery atop the beamline was the longest building in the United States until the LIGO project’s twin interferometers were completed in 1999. It is easily distinguishable from the air and is marked as a visual waypoint on aeronautical charts.

    A portion of the original linear accelerator is now part of the Linac Coherent Light Source [below].

    Stanford Linear Collider

    The Stanford Linear Collider was a linear accelerator that collided electrons and positrons at SLAC. The center of mass energy was about 90 GeV, equal to the mass of the Z boson, which the accelerator was designed to study. Grad student Barrett D. Milliken discovered the first Z event on 12 April 1989 while poring over the previous day’s computer data from the Mark II detector. The bulk of the data was collected by the SLAC Large Detector, which came online in 1991. Although largely overshadowed by the Large Electron–Positron Collider at CERN, which began running in 1989, the highly polarized electron beam at SLC (close to 80%) made certain unique measurements possible, such as parity violation in Z Boson-b quark coupling.

    Presently, no beam enters the south and north arcs of the machine, which lead to the Final Focus; this section is therefore mothballed so that beam can be run into the PEP-II section from the beam switchyard.

    The SLAC Large Detector (SLD) was the main detector for the Stanford Linear Collider. It was designed primarily to detect Z bosons produced by the accelerator’s electron-positron collisions. Built in 1991, the SLD operated from 1992 to 1998.

    SLAC National Accelerator Laboratory(US)Large Detector


    PEP (the Positron-Electron Project) began operation in 1980, with center-of-mass energies up to 29 GeV. At its apex, PEP had five large particle detectors in operation, as well as a sixth smaller detector. About 300 researchers made use of PEP. PEP stopped operating in 1990, and PEP-II began construction in 1994.


    From 1999 to 2008, the main purpose of the linear accelerator was to inject electrons and positrons into the PEP-II accelerator, an electron-positron collider with a pair of storage rings 2.2 km (1.4 mi) in circumference. PEP-II was host to the BaBar experiment, one of the so-called B-Factory experiments studying charge-parity symmetry.

    SLAC National Accelerator Laboratory(US) BaBar

    SLAC National Accelerator Laboratory(US)/SSRL

    Fermi Gamma-ray Space Telescope

    SLAC plays a primary role in the mission and operation of the Fermi Gamma-ray Space Telescope, launched in August 2008. The principal scientific objectives of this mission are:

    To understand the mechanisms of particle acceleration in active galactic nuclei (AGNs), pulsars, and supernova remnants (SNRs).
    To resolve the gamma-ray sky: unidentified sources and diffuse emission.
    To determine the high-energy behavior of gamma-ray bursts and transients.
    To probe dark matter and fundamental physics.


    The Stanford PULSE Institute (PULSE) is a Stanford Independent Laboratory located in the Central Laboratory at SLAC. PULSE was created by Stanford in 2005 to help Stanford faculty and SLAC scientists develop ultrafast x-ray research at LCLS.

    The Linac Coherent Light Source (LCLS) [below] is a free-electron laser facility located at SLAC. The LCLS is partially a reconstruction of the last third of the original linear accelerator at SLAC and can deliver extremely intense X-ray radiation for research in a number of areas. It achieved first lasing in April 2009.

    The laser produces hard X-rays with 10^9 times the relative brightness of traditional synchrotron sources and is the most powerful X-ray source in the world. LCLS enables a variety of new experiments and provides enhancements for existing experimental methods. Often, X-rays are used to take “snapshots” of objects at the atomic level before obliterating the samples. The laser’s wavelength, ranging from 6.2 to 0.13 nm (200 to 9,500 electron volts (eV)), is similar to the width of an atom, providing extremely detailed information that was previously unattainable. Additionally, the laser is capable of capturing images with a “shutter speed” measured in femtoseconds, or million-billionths of a second, necessary because the intensity of the beam is often high enough that the sample explodes on the femtosecond timescale.
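    The quoted wavelength and photon-energy ranges are two descriptions of the same thing, related by E[eV] = hc/λ ≈ 1239.84 / λ[nm]. A quick sanity check of the numbers:

    ```python
    def photon_energy_ev(wavelength_nm):
        # E = h*c / lambda, with h*c approximately 1239.84 eV·nm.
        return 1239.84 / wavelength_nm

    low = photon_energy_ev(6.2)    # longest wavelength -> lowest energy, ~200 eV
    high = photon_energy_ev(0.13)  # shortest wavelength -> highest energy, ~9,500 eV
    ```

    Both endpoints match the figures given in the text to within rounding.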

    The LCLS-II [below] project will provide a major upgrade to LCLS by adding two new X-ray laser beams. The new system will utilize the 500 m (1,600 ft) of existing tunnel to add a new superconducting accelerator at 4 GeV and two new sets of undulators that will increase the available energy range of LCLS. Discoveries enabled by these new capabilities may include new drugs, next-generation computers, and new materials.


    In 2012, the first two-thirds (~2 km) of the original SLAC LINAC were recommissioned for a new user facility, the Facility for Advanced Accelerator Experimental Tests (FACET). This facility was capable of delivering 20 GeV, 3 nC electron (and positron) beams with short bunch lengths and small spot sizes, ideal for beam-driven plasma acceleration studies. The facility ended operations in 2016 for the construction of LCLS-II, which will occupy the first third of the SLAC LINAC. The FACET-II project will re-establish electron and positron beams in the middle third of the LINAC for the continuation of beam-driven plasma acceleration studies in 2019.

    The Next Linear Collider Test Accelerator (NLCTA) is a 60-120 MeV high-brightness electron beam linear accelerator used for experiments on advanced beam manipulation and acceleration techniques. It is located at SLAC’s End Station B.

    SSRL and LCLS are DOE Office of Science user facilities.

    Stanford University (US)

    Leland and Jane Stanford founded the university to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing its students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

    Stanford University, officially Leland Stanford Junior University, is a private research university located in Stanford, California. Stanford was founded in 1885 by Leland and Jane Stanford in memory of their only child, Leland Stanford Jr., who had died of typhoid fever at age 15 the previous year. Stanford is consistently ranked as among the most prestigious and top universities in the world by major education publications. It is also one of the top fundraising institutions in the country, becoming the first school to raise more than a billion dollars in a year.

    Leland Stanford was a U.S. senator and former governor of California who made his fortune as a railroad tycoon. The school admitted its first students on October 1, 1891, as a coeducational and non-denominational institution. Stanford University struggled financially after the death of Leland Stanford in 1893 and again after much of the campus was damaged by the 1906 San Francisco earthquake. Following World War II, provost Frederick Terman supported faculty and graduates’ entrepreneurialism to build self-sufficient local industry in what would later be known as Silicon Valley.

    The university is organized around seven schools: three schools consisting of 40 academic departments at the undergraduate level as well as four professional schools that focus on graduate programs in law, medicine, education, and business. All schools are on the same campus. Students compete in 36 varsity sports, and the university is one of two private institutions in the Division I FBS Pac-12 Conference. It has gained 126 NCAA team championships, and Stanford has won the NACDA Directors’ Cup for 24 consecutive years, beginning in 1994–1995. In addition, Stanford students and alumni have won 270 Olympic medals including 139 gold medals.

    As of October 2020, 84 Nobel laureates, 28 Turing Award laureates, and eight Fields Medalists have been affiliated with Stanford as students, alumni, faculty, or staff. In addition, Stanford is particularly noted for its entrepreneurship and is one of the most successful universities in attracting funding for start-ups. Stanford alumni have founded numerous companies, which combined produce more than $2.7 trillion in annual revenue, roughly equivalent to the 7th largest economy in the world (as of 2020). Stanford is the alma mater of one president of the United States (Herbert Hoover), 74 living billionaires, and 17 astronauts. It is also one of the leading producers of Fulbright Scholars, Marshall Scholars, Rhodes Scholars, and members of the United States Congress.

    Stanford University was founded in 1885 by Leland and Jane Stanford and dedicated to the memory of Leland Stanford Jr., their only child. The institution opened in 1891 on the Stanfords’ former Palo Alto farm.

    Jane and Leland Stanford modeled their university after the great eastern universities, most specifically Cornell University. When it opened in 1891, Stanford was called the “Cornell of the West” because many of its faculty were former Cornell affiliates (professors, alumni, or both), including its first president, David Starr Jordan, and its second, John Casper Branner. Both Cornell and Stanford were among the first to make higher education accessible, nonsectarian, and open to women as well as to men. Cornell is credited as one of the first American universities to adopt this radical departure from traditional education, and Stanford became an early adopter as well.

    Despite being impacted by earthquakes in both 1906 and 1989, the campus was rebuilt each time. In 1919, the Hoover Institution on War, Revolution and Peace was started by Herbert Hoover to preserve artifacts related to World War I. The Stanford Medical Center, completed in 1959, is a teaching hospital with over 800 beds. The DOE’s SLAC National Accelerator Laboratory (US) (originally named the Stanford Linear Accelerator Center), established in 1962, performs research in particle physics.


    Most of Stanford is on an 8,180-acre (12.8 sq mi; 33.1 km^2) campus, one of the largest in the United States. It is located on the San Francisco Peninsula, in the northwest part of the Santa Clara Valley (Silicon Valley) approximately 37 miles (60 km) southeast of San Francisco and approximately 20 miles (30 km) northwest of San Jose. In 2008, 60% of this land remained undeveloped.

    Stanford’s main campus includes a census-designated place within unincorporated Santa Clara County, although some of the university land (such as the Stanford Shopping Center and the Stanford Research Park) is within the city limits of Palo Alto. The campus also includes much land in unincorporated San Mateo County (including the SLAC National Accelerator Laboratory and the Jasper Ridge Biological Preserve), as well as in the city limits of Menlo Park (Stanford Hills neighborhood), Woodside, and Portola Valley.

    Non-central campus

    Stanford currently operates in various locations outside of its central campus.

    On the founding grant:

    Jasper Ridge Biological Preserve is a 1,200-acre (490 ha) natural reserve south of the central campus owned by the university and used by wildlife biologists for research.
    SLAC National Accelerator Laboratory is a facility west of the central campus operated by the university for the Department of Energy. It contains the longest linear particle accelerator in the world, 2 miles (3.2 km) on 426 acres (172 ha) of land.
    Golf course and a seasonal lake: The university also has its own golf course and a seasonal lake (Lake Lagunita, actually an irrigation reservoir), both home to the vulnerable California tiger salamander. As of 2012 Lake Lagunita was often dry and the university had no plans to artificially fill it.

    Off the founding grant:

    Hopkins Marine Station, in Pacific Grove, California, is a marine biology research center owned by the university since 1892.
    Study abroad locations: unlike typical study abroad programs, Stanford itself operates in several locations around the world; thus, each location has Stanford faculty-in-residence and staff in addition to students, creating a “mini-Stanford”.

    Redwood City campus for many of the university’s administrative offices located in Redwood City, California, a few miles north of the main campus. In 2005, the university purchased a small, 35-acre (14 ha) campus in Midpoint Technology Park intended for staff offices; development was delayed by The Great Recession. In 2015 the university announced a development plan and the Redwood City campus opened in March 2019.

    The Bass Center in Washington, DC provides a base, including housing, for the Stanford in Washington program for undergraduates. It includes a small art gallery open to the public.

    China: The Stanford Center at Peking University, housed in the Lee Jung Sen Building, is a small center for researchers and students operated in collaboration with Peking University [北京大学] (CN) and its Kavli Institute for Astronomy and Astrophysics (KIAA-PKU).

    Administration and organization

    Stanford is a private, non-profit university that is administered as a corporate trust governed by a privately appointed board of trustees with a maximum membership of 38. Trustees serve five-year terms (not more than two consecutive terms) and meet five times annually. A new trustee is chosen by the current trustees by ballot. The Stanford trustees also oversee the Stanford Research Park, the Stanford Shopping Center, the Cantor Center for Visual Arts, Stanford University Medical Center, and many associated medical facilities (including the Lucile Packard Children’s Hospital).

    The board appoints a president to serve as the chief executive officer of the university, to prescribe the duties of professors and course of study, to manage financial and business affairs, and to appoint nine vice presidents. The provost is the chief academic and budget officer, to whom the deans of each of the seven schools report. Persis Drell became the 13th provost in February 2017.

    As of 2018, the university was organized into seven academic schools. The schools of Humanities and Sciences (27 departments), Engineering (nine departments), and Earth, Energy & Environmental Sciences (four departments) have both graduate and undergraduate programs while the Schools of Law, Medicine, Education and Business have graduate programs only. The powers and authority of the faculty are vested in the Academic Council, which is made up of tenure and non-tenure line faculty, research faculty, senior fellows in some policy centers and institutes, the president of the university, and some other academic administrators, but most matters are handled by the Faculty Senate, made up of 55 elected representatives of the faculty.

    The Associated Students of Stanford University (ASSU) is the student government for Stanford and all registered students are members. Its elected leadership consists of the Undergraduate Senate elected by the undergraduate students, the Graduate Student Council elected by the graduate students, and the President and Vice President elected as a ticket by the entire student body.

    Stanford is the beneficiary of a special clause in the California Constitution, which explicitly exempts Stanford property from taxation so long as the property is used for educational purposes.

    Endowment and donations

    The university’s endowment, managed by the Stanford Management Company, was valued at $27.7 billion as of August 31, 2019. Payouts from the Stanford endowment covered approximately 21.8% of university expenses in the 2019 fiscal year. In the 2018 NACUBO-TIAA survey of colleges and universities in the United States and Canada, only Harvard University(US), the University of Texas System(US), and Yale University(US) had larger endowments than Stanford.

    In 2006, President John L. Hennessy launched a five-year campaign called the Stanford Challenge, which reached its $4.3 billion fundraising goal in 2009, two years ahead of time, but continued fundraising for the duration of the campaign. It concluded on December 31, 2011, having raised a total of $6.23 billion and breaking the previous campaign fundraising record of $3.88 billion held by Yale. Specifically, the campaign raised $253.7 million for undergraduate financial aid, as well as $2.33 billion for its initiative in “Seeking Solutions” to global problems, $1.61 billion for “Educating Leaders” by improving K-12 education, and $2.11 billion for “Foundation of Excellence” aimed at providing academic support for Stanford students and faculty. Funds supported 366 new fellowships for graduate students, 139 new endowed chairs for faculty, and 38 new or renovated buildings. The new funding also enabled the construction of a facility for stem cell research; a new campus for the business school; an expansion of the law school; a new Engineering Quad; a new art and art history building; an on-campus concert hall; a new art museum; and a planned expansion of the medical school, among other things. In 2012, the university raised $1.035 billion, becoming the first school to raise more than a billion dollars in a year.

    Research centers and institutes

    DOE’s SLAC National Accelerator Laboratory(US)
    Stanford Research Institute, a center of innovation to support economic development in the region.
    Hoover Institution, a conservative American public policy institution and research institution that promotes personal and economic liberty, free enterprise, and limited government.
    Hasso Plattner Institute of Design, a multidisciplinary design school in cooperation with the Hasso Plattner Institute of University of Potsdam [Universität Potsdam](DE) that integrates product design, engineering, and business management education.
    Martin Luther King Jr. Research and Education Institute, which grew out of and still contains the Martin Luther King Jr. Papers Project.
    John S. Knight Fellowship for Professional Journalists
    Center for Ocean Solutions
    Together with UC Berkeley(US) and UC San Francisco(US), Stanford is part of the Biohub, a new medical science research center founded in 2016 by a $600 million commitment from Facebook CEO and founder Mark Zuckerberg and pediatrician Priscilla Chan.

    Discoveries and innovation

    Natural sciences

    Biological synthesis of deoxyribonucleic acid (DNA) – Arthur Kornberg synthesized DNA material and won the 1959 Nobel Prize in Physiology or Medicine for his work at Stanford.
    First transgenic organism – Stanley Cohen and Herbert Boyer were the first scientists to transplant genes from one living organism to another, a fundamental discovery for genetic engineering. Thousands of products have been developed on the basis of their work, including human growth hormone and hepatitis B vaccine.
    Laser – Arthur Leonard Schawlow shared the 1981 Nobel Prize in Physics with Nicolaas Bloembergen and Kai Siegbahn for his work on lasers.
    Nuclear magnetic resonance – Felix Bloch developed new methods for nuclear magnetic precision measurements, which are the underlying principles of the MRI.

    Computer and applied sciences

    ARPANET – Stanford Research Institute, formerly part of Stanford but on a separate campus, was the site of one of the four original ARPANET nodes.

    Internet – Stanford was the site where the original design of the Internet was undertaken. Vint Cerf led a research group to elaborate the design of the Transmission Control Protocol (TCP/IP) that he originally co-created with Robert E. Kahn (Bob Kahn) in 1973 and which formed the basis for the architecture of the Internet.

    Frequency modulation synthesis – John Chowning of the Music department invented the FM music synthesis algorithm in 1967, and Stanford later licensed it to Yamaha Corporation.
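    Chowning's technique is well documented: the phase of a carrier sinusoid is modulated by a second sinusoid, y(t) = A·sin(2πf_c·t + I·sin(2πf_m·t)), where the modulation index I sets how rich the resulting timbre is. A minimal sketch (the frequencies and index below are illustrative, not Chowning's values):

```python
import math

def fm_sample(t, fc=440.0, fm=110.0, index=2.0, amp=1.0):
    """One sample of Chowning-style FM: a carrier sine at fc whose phase
    is modulated by a sine at fm; `index` controls timbral richness."""
    return amp * math.sin(2 * math.pi * fc * t + index * math.sin(2 * math.pi * fm * t))

# Render half a second of the tone at an 8 kHz sample rate.
sample_rate = 8000
tone = [fm_sample(n / sample_rate) for n in range(sample_rate // 2)]
```

    Because one modulator adds a whole series of sidebands around the carrier, a single pair of oscillators can produce spectra that would need many oscillators under additive synthesis, which is what made the algorithm attractive for Yamaha's hardware.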

    Google – Google began in January 1996 as a research project by Larry Page and Sergey Brin when they were both PhD students at Stanford. They were working on the Stanford Digital Library Project (SDLP). The SDLP’s goal was “to develop the enabling technologies for a single, integrated and universal digital library” and it was funded through the National Science Foundation, among other federal agencies.

    Klystron tube – invented by the brothers Russell and Sigurd Varian at Stanford. Their prototype was completed and demonstrated successfully on August 30, 1937. Upon publication in 1939, news of the klystron immediately influenced the work of U.S. and UK researchers working on radar equipment.

    RISC – ARPA-funded VLSI project of microprocessor design. Stanford and UC Berkeley are most associated with the popularization of this concept. The Stanford MIPS would go on to be commercialized as the successful MIPS architecture, while Berkeley RISC gave its name to the entire concept, commercialized as the SPARC. Another success from this era was IBM’s effort that eventually led to the IBM POWER instruction set architecture, PowerPC, and Power ISA. As these projects matured, a wide variety of similar designs flourished in the late 1980s and especially the early 1990s, representing a major force in the Unix workstation market as well as embedded processors in laser printers, routers and similar products.
    SUN workstation – Andy Bechtolsheim designed the SUN workstation for the Stanford University Network communications project as a personal CAD workstation, which led to Sun Microsystems.

    Businesses and entrepreneurship

    Stanford is one of the most successful universities in creating companies and licensing its inventions to existing companies; it is often held up as a model for technology transfer. Stanford’s Office of Technology Licensing is responsible for commercializing university research, intellectual property, and university-developed projects.

    The university is described as having a strong venture culture in which students are encouraged, and often funded, to launch their own companies.

    Companies founded by Stanford alumni generate more than $2.7 trillion in annual revenue, equivalent to the 10th-largest economy in the world.

    Some companies closely associated with Stanford and their connections include:

    Hewlett-Packard, 1939, co-founders William R. Hewlett (B.S, PhD) and David Packard (M.S).
    Silicon Graphics, 1981, co-founders James H. Clark (Associate Professor) and several of his grad students.
    Sun Microsystems, 1982, co-founders Vinod Khosla (M.B.A), Andy Bechtolsheim (PhD) and Scott McNealy (M.B.A).
    Cisco, 1984, founders Leonard Bosack (M.S) and Sandy Lerner (M.S) who were in charge of Stanford Computer Science and Graduate School of Business computer operations groups respectively when the hardware was developed.[163]
    Yahoo!, 1994, co-founders Jerry Yang (B.S, M.S) and David Filo (M.S).
    Google, 1998, co-founders Larry Page (M.S) and Sergey Brin (M.S).
    LinkedIn, 2002, co-founders Reid Hoffman (B.S), Konstantin Guericke (B.S, M.S), Eric Lee (B.S), and Alan Liu (B.S).
    Instagram, 2010, co-founders Kevin Systrom (B.S) and Mike Krieger (B.S).
    Snapchat, 2011, co-founders Evan Spiegel and Bobby Murphy (B.S).
    Coursera, 2012, co-founders Andrew Ng (Associate Professor) and Daphne Koller (Professor, PhD).

    Student body

    Stanford enrolled 6,996 undergraduate and 10,253 graduate students as of the 2019–2020 school year. Women comprised 50.4% of undergraduates and 41.5% of graduate students. In the same academic year, the freshman retention rate was 99%.

    Stanford awarded 1,819 undergraduate degrees, 2,393 master’s degrees, 770 doctoral degrees, and 3,270 professional degrees in the 2018–2019 school year. The four-year graduation rate for the class of 2017 cohort was 72.9%, and the six-year rate was 94.4%. The relatively low four-year graduation rate is a function of the university’s coterminal degree (or “coterm”) program, which allows students to earn a master’s degree as a 1-to-2-year extension of their undergraduate program.

    As of 2010, fifteen percent of undergraduates were first-generation students.


    As of 2016 Stanford had 16 male varsity sports and 20 female varsity sports, 19 club sports and about 27 intramural sports. In 1930, following a unanimous vote by the Executive Committee for the Associated Students, the athletic department adopted the mascot “Indian.” The Indian symbol and name were dropped by President Richard Lyman in 1972, after objections from Native American students and a vote by the student senate. The sports teams are now officially referred to as the “Stanford Cardinal,” referring to the deep red color, not the cardinal bird. Stanford is a member of the Pac-12 Conference in most sports, the Mountain Pacific Sports Federation in several other sports, and the America East Conference in field hockey, and it competes at the NCAA Division I FBS level.

    Its traditional sports rival is the University of California, Berkeley, the neighbor to the north in the East Bay. The winner of the annual “Big Game” between the Cal and Cardinal football teams gains custody of the Stanford Axe.

    Stanford has had at least one NCAA team champion every year since the 1976–77 school year and has earned 126 NCAA national team titles since its establishment, the most among universities, and Stanford has won 522 individual national championships, the most by any university. Stanford has won the award for the top-ranked Division 1 athletic program—the NACDA Directors’ Cup, formerly known as the Sears Cup—annually for the past twenty-four straight years. Stanford athletes have won medals in every Olympic Games since 1912, winning 270 Olympic medals total, 139 of them gold. In both the 2008 and 2016 Summer Olympics, Stanford won more Olympic medals than any other university in the United States. Stanford athletes won 16 medals at the 2012 Summer Olympics (12 gold, two silver and two bronze), and 27 medals at the 2016 Summer Olympics.


    The unofficial motto of Stanford, selected by President Jordan, is Die Luft der Freiheit weht. Translated from the German language, this quotation from Ulrich von Hutten means, “The wind of freedom blows.” The motto was controversial during World War I, when anything in German was suspect; at that time the university disavowed that this motto was official.
    Hail, Stanford, Hail! is the Stanford Hymn sometimes sung at ceremonies or adapted by the various University singing groups. It was written in 1892 by mechanical engineering professor Albert W. Smith and his wife, Mary Roberts Smith (in 1896 she earned the first Stanford doctorate in Economics and later became associate professor of Sociology), but was not officially adopted until after a performance on campus in March 1902 by the Mormon Tabernacle Choir.
    “Uncommon Man/Uncommon Woman”: Stanford does not award honorary degrees, but in 1953 the degree of “Uncommon Man/Uncommon Woman” was created to recognize individuals who give rare and extraordinary service to the University. Technically, this degree is awarded by the Stanford Associates, a voluntary group that is part of the university’s alumni association. As Stanford’s highest honor, it is not conferred at prescribed intervals, but only when appropriate to recognize extraordinary service. Recipients include Herbert Hoover, Bill Hewlett, Dave Packard, Lucile Packard, and John Gardner.
    Big Game events: The events in the week leading up to the Big Game vs. UC Berkeley, including Gaieties (a musical written, composed, produced, and performed by the students of Ram’s Head Theatrical Society).
    “Viennese Ball”: a formal ball with waltzes that was initially started in the 1970s by students returning from the now-closed Stanford in Vienna overseas program. It is now open to all students.
    “Full Moon on the Quad”: An annual event at Main Quad, where students gather to kiss one another starting at midnight. Typically organized by the Junior class cabinet, the festivities include live entertainment, such as music and dance performances.
    “Band Run”: An annual festivity at the beginning of the school year, where the band picks up freshmen from dorms across campus while stopping to perform at each location, culminating in a finale performance at Main Quad.
    “Mausoleum Party”: An annual Halloween Party at the Stanford Mausoleum, the final resting place of Leland Stanford Jr. and his parents. A 20-year tradition, the “Mausoleum Party” was on hiatus from 2002 to 2005 due to a lack of funding, but was revived in 2006. In 2008, it was hosted in Old Union rather than at the actual Mausoleum, because rain prohibited generators from being rented. In 2009, after fundraising efforts by the Junior Class Presidents and the ASSU Executive, the event was able to return to the Mausoleum despite facing budget cuts earlier in the year.
    Former campus traditions include the “Big Game bonfire” on Lake Lagunita (a seasonal lake usually dry in the fall), which was formally ended in 1997 because of the presence of endangered salamanders in the lake bed.

    Award laureates and scholars

    Stanford’s current community of scholars includes:

    19 Nobel Prize laureates (as of October 2020, 85 affiliates in total)
    171 members of the National Academy of Sciences
    109 members of National Academy of Engineering
    76 members of National Academy of Medicine
    288 members of the American Academy of Arts and Sciences
    19 recipients of the National Medal of Science
    1 recipient of the National Medal of Technology
    4 recipients of the National Humanities Medal
    49 members of American Philosophical Society
    56 fellows of the American Physical Society (since 1995)
    4 Pulitzer Prize winners
    31 MacArthur Fellows
    4 Wolf Foundation Prize winners
    2 ACL Lifetime Achievement Award winners
    14 AAAI fellows
    2 Presidential Medal of Freedom winners

    Stanford University Seal

  • richardmitnick 9:38 am on October 14, 2021 Permalink | Reply
    Tags: "How ‘ice needles’ weave patterns of stones in frozen landscapes", Applied Research & Technology, , Repeating patterns of stones that form in cold landscapes,   

    From The University of Washington (US) : “How ‘ice needles’ weave patterns of stones in frozen landscapes” 

    From The University of Washington (US)

    October 6, 2021
    Hannah Hickey

    Circles of stones in Svalbard, Norway. Each circle measures roughly 10 feet, or 3 meters, across. New research provides insight into how these features form in rocky, frost-prone landscapes. Credit: Bernard Hallet/University of Washington.

    Nature is full of repeating patterns that are part of the beauty of our world. An international team, including a researcher from the University of Washington, used modern tools to explain repeating patterns of stones that form in cold landscapes.

    The new study, published Oct. 5 in PNAS, uses experimental tools to show how needles of ice growing randomly on frozen ground can gradually move rocks into regular, repeating patterns. The team, based mainly in China and Japan, combines novel experiments and computer modeling to describe these striking features with new theoretical insights.

    “The presence of these amazing patterns that develop without any intervention from humans is pretty striking in nature,” said co-author Bernard Hallet, a UW professor emeritus of Earth and space sciences and member of the Quaternary Research Center. “It’s like a Japanese garden, but where is the gardener?”

    Lines of stones in Hawaii. Repeated freeze-thaw cycles create lines when the stones are on more steeply sloping ground. Credit: Bernard Hallet/University of Washington.

    Hallet specializes in studying the patterns that form in polar, high-mountain and other cold environments. One cause of the patterns is needle ice: as the temperature drops, moisture in the soil freezes into spikes of ice crystals that protrude from the ground.

    “When you go out in the backyard after a freezing night and you feel a little crunch under the foot, you’re probably walking on needle ice,” Hallet said.

    As needle ice forms it tends to push up soil particles and, if there are any, small stones. More needle ice can form on patches of bare soil compared to rock-covered areas, Hallet said. The ice needles will slightly displace any remaining stones in the barer region. Over years, the stones begin to cluster in groups, leaving the bare patches essentially stone-free.

    “That kind of selective growth involves interesting feedbacks between the size of the stones, the moisture in the soil and the growth of the ice needles,” Hallet said.

    Labyrinths of stones in Svalbard, Norway. Labyrinth patterns form where the stones are on a gentle slope. New research provides insight into how these features form in rocky, frost-prone landscapes. Credit: Bernard Hallet/University of Washington.

    Hallet had previously reviewed another scientific paper by first author Anyuan Li, formerly at Shaoxing University [绍兴文理学院](CN) and now at The University of Tsukuba [筑波大学](JP). The two began a collaboration that mixes Hallet’s longtime expertise investigating patterns in nature with Li and his collaborators’ background in experimental science and computer modeling.

    Senior author Quan-Xing Liu at East China Normal University [华东师范大学](CN) uses fieldwork and lab experiments to understand self-organized patterns in nature. For this study, the experimental setup was a flat square of wet soil a little over 1 foot (0.4 meters) on each side that began with stones spaced uniformly on the surface. The researchers ran the experiment through 30 freeze-thaw cycles. By the end of that time, regular patterns had started to appear.

    “The videos are pretty striking, and they show that the ice just comes up and in a single cycle it pushes up stones and moves them slightly to the side,” Hallet said. “Because of those experiments and the abilities of the individuals involved to analyze those results, we have much more tangible, quantitative descriptions of these features.”

    Further experiments looked at how the pattern changed depending on the concentration of stones, the slope of the ground, and the height of the ice needles, which is also affected by the stone concentration. Based on those results, the authors wrote a computer model that predicts what patterns will appear depending on the concentration of stones on the frost-prone surface.

    Two different computer models predict the long-term distribution of stones on freezing ground depending on the stones’ initial concentration. The left column starts with 20% stone coverage, which creates islands, shown here in white; the middle columns have 30% and 40% stone coverage, which create labyrinths and worm-like shapes; and the fourth column has 80% stone coverage, which gives no pattern. The right column shows 20% stone coverage on slightly sloping ground; the stones tend to form lines. Credit: Li et al./PNAS.
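    The published model is far richer than anything reproducible here, but the feedback Hallet describes, with ice growing taller on barer ground and nudging stones toward stonier ground, can be caricatured in a few lines. In this hypothetical 1-D sketch (all names and parameters are illustrative, not the authors' model), each freeze-thaw cycle nudges every stone one cell toward whichever side holds more stones within a small window, so a random scatter gradually clusters:

```python
import random

def freeze_thaw_cycles(stones, cycles=30, window=3, size=100, seed=1):
    """Toy 1-D caricature of needle-ice sorting on a ring of `size` cells.
    Each cycle, a stone is nudged one cell toward whichever side has more
    stones within `window` cells (ice grows taller on the barer side)."""
    rng = random.Random(seed)
    pos = set(stones)
    for _ in range(cycles):
        for p in sorted(pos, key=lambda _p: rng.random()):  # random update order
            left = sum((p - d) % size in pos for d in range(1, window + 1))
            right = sum((p + d) % size in pos for d in range(1, window + 1))
            step = 1 if right > left else (-1 if left > right else 0)
            q = (p + step) % size
            if step and q not in pos:  # move only into an empty cell
                pos.discard(p)
                pos.add(q)
    return pos

# Start from a random 20% stone coverage and run 30 freeze-thaw cycles.
start = set(random.Random(0).sample(range(100), 20))
clustered = freeze_thaw_cycles(start)
```

    The intent is only to illustrate the feedback loop in the study: bare ground favors ice growth, ice growth displaces stones toward existing clusters, and clusters then reinforce themselves, the 1-D analogue of the islands and labyrinths in the figure.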

    Other co-authors on the new study are Norikazu Matsuoka at the University of Tsukuba; Fujun Niu at the South China University of Technology [華南理工大學](CN); Jing Chen and Wensi Hu at East China Normal University; Desheng Li at Shanghai Jiao Tong University [上海交通大学](CN); Johan van de Koppel at The University of Groningen [Rijksuniversiteit Groningen] (NL); and Nigel Goldenfeld at The University of California-San Diego(US).

    The research was funded by the Second Tibetan Plateau Scientific Expedition and Research program; the Japan Society for the Promotion of Science; the National Natural Science Foundation of China; the Chinese Academy of Sciences; and the China Scholarship Council.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    The University of Washington (US) is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound, whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in the Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, the university brings students and faculty together to turn ideas into impact and, in the process, transform lives and our world every day.

    So what defines us —the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

    The University of Washington (US) is a public research university in Seattle, Washington, United States. Founded in 1861, University of Washington is one of the oldest universities on the West Coast; it was established in downtown Seattle approximately a decade after the city’s founding to aid its economic development. Today, the university’s 703-acre main Seattle campus is in the University District above the Montlake Cut, within the urban Puget Sound region of the Pacific Northwest. The university has additional campuses in Tacoma and Bothell. Overall, University of Washington encompasses over 500 buildings and over 20 million gross square feet of space, including one of the largest library systems in the world with more than 26 university libraries, as well as the UW Tower, lecture halls, art centers, museums, laboratories, stadiums, and conference centers. The university offers bachelor’s, master’s, and doctoral degrees through 140 departments in various colleges and schools, sees a total student enrollment of roughly 46,000 annually, and functions on a quarter system.

    University of Washington is a member of the Association of American Universities(US) and is classified among “R1: Doctoral Universities – Very high research activity”. According to the National Science Foundation(US), UW spent $1.41 billion on research and development in 2018, ranking it 5th in the nation. As the flagship institution of the six public universities in Washington state, it is known for its medical, engineering and scientific research as well as its highly competitive computer science and engineering programs. Additionally, University of Washington continues to benefit from its deep historic ties and major collaborations with numerous technology giants in the region, such as Amazon, Boeing, Nintendo, and particularly Microsoft. Paul G. Allen, Bill Gates and others spent significant time in Washington computer labs on a startup venture before founding Microsoft. The University of Washington’s 22 varsity sports teams are also highly competitive, competing as the Huskies in the Pac-12 Conference of NCAA Division I; Husky athletes have represented the United States at the Olympic Games and other major competitions.

    The university has been affiliated with many notable alumni and faculty, including 21 Nobel Prize laureates and numerous Pulitzer Prize winners, Fulbright Scholars, Rhodes Scholars and Marshall Scholars.

    In 1854, territorial governor Isaac Stevens recommended the establishment of a university in the Washington Territory. Prominent Seattle-area residents, including Methodist preacher Daniel Bagley, saw this as a chance to add to the city’s potential and prestige. Bagley learned of a law that allowed United States territories to sell land to raise money in support of public schools. At the time, Arthur A. Denny, one of the founders of Seattle and a member of the territorial legislature, aimed to increase the city’s importance by moving the territory’s capital from Olympia to Seattle. However, Bagley eventually convinced Denny that the establishment of a university would assist more in the development of Seattle’s economy. Two universities were initially chartered, but later the decision was repealed in favor of a single university in Lewis County provided that locally donated land was available. When no site emerged, Denny successfully petitioned the legislature to reconsider Seattle as a location in 1858.

    In 1861, scouting began for an appropriate 10 acres (4 ha) site in Seattle to serve as a new university campus. Arthur and Mary Denny donated eight acres, while fellow pioneers Edward Lander, and Charlie and Mary Terry, donated two acres on Denny’s Knoll in downtown Seattle. More specifically, this tract was bounded by 4th Avenue to the west, 6th Avenue to the east, Union Street to the north, and Seneca Street to the south.

    John Pike, for whom Pike Street is named, was the university’s architect and builder. It was opened on November 4, 1861, as the Territorial University of Washington. The legislature passed articles incorporating the University and establishing its Board of Regents in 1862. The school initially struggled, closing three times: in 1863 for low enrollment, and again in 1867 and 1876 due to a shortage of funds. The University of Washington awarded its first degree, a bachelor’s in science, to its first graduate, Clara Antoinette McCarty Wilt, in 1876.

    19th century relocation

    By the time Washington state entered the Union in 1889, both Seattle and the University had grown substantially. University of Washington’s total undergraduate enrollment increased from 30 to nearly 300 students, and the campus’s relative isolation in downtown Seattle faced encroaching development. A special legislative committee, headed by University of Washington graduate Edmond Meany, was created to find a new campus to better serve the growing student population and faculty. The committee eventually selected a site on the northeast of downtown Seattle called Union Bay, which was the land of the Duwamish, and the legislature appropriated funds for its purchase and construction. In 1895, the University relocated to the new campus by moving into the newly built Denny Hall. The University Regents tried and failed to sell the old campus, eventually settling on leasing the area. This would later become one of the University’s most valuable pieces of real estate in modern-day Seattle, generating millions in annual revenue with what is now called the Metropolitan Tract. The original Territorial University building was torn down in 1908, and its former site now houses the Fairmont Olympic Hotel.

    The sole surviving remnants of Washington’s first building are four 24-foot (7.3 m), white, hand-fluted cedar, Ionic columns. They were salvaged by Edmond S. Meany, one of the University’s first graduates and former head of its history department. Meany and his colleague, Dean Herbert T. Condon, dubbed the columns “Loyalty,” “Industry,” “Faith” and “Efficiency”, or “LIFE.” The columns now stand in the Sylvan Grove Theater.

    20th century expansion

    Organizers of the 1909 Alaska-Yukon-Pacific Exposition eyed the still largely undeveloped campus as a prime setting for their world’s fair. They came to an agreement with Washington’s Board of Regents that allowed them to use the campus grounds for the exposition, surrounding today’s Drumheller Fountain facing towards Mount Rainier. In exchange, organizers agreed Washington would take over the campus and its development after the fair’s conclusion. This arrangement led to a detailed site plan and several new buildings, prepared in part by John Charles Olmsted. The plan was later incorporated into the overall University of Washington campus master plan, permanently affecting the campus layout.

    Both World Wars brought the military to campus, with certain facilities temporarily lent to the federal government. In spite of this, subsequent post-war periods were times of dramatic growth for the University. The period between the wars saw a significant expansion of the upper campus. Construction of the Liberal Arts Quadrangle, known to students as “The Quad,” began in 1916 and continued to 1939. The University’s architectural centerpiece, Suzzallo Library, was built in 1926 and expanded in 1935.

    After World War II, further growth came with the G.I. Bill. Among the most important developments of this period was the opening of the School of Medicine in 1946, which is now consistently ranked as the top medical school in the United States. It would eventually lead to the University of Washington Medical Center, ranked by U.S. News and World Report as one of the top ten hospitals in the nation.

    In 1942, all persons of Japanese ancestry in the Seattle area were forced into inland internment camps as part of Executive Order 9066 following the attack on Pearl Harbor. During this difficult time, university president Lee Paul Sieg took an active and sympathetic leadership role in advocating for and facilitating the transfer of Japanese American students to universities and colleges away from the Pacific Coast to help them avoid the mass incarceration. Nevertheless, many Japanese American students and soon-to-be graduates were unable to transfer successfully in the short time window or receive diplomas before being incarcerated. It was only many years later that they would be recognized for their accomplishments during the University of Washington’s Long Journey Home ceremonial event that was held in May 2008.

    From 1958 to 1973, the University of Washington saw a tremendous growth in student enrollment, its faculties and operating budget, and also its prestige under the leadership of Charles Odegaard. University of Washington student enrollment had more than doubled to 34,000 as the baby boom generation came of age. However, this era was also marked by high levels of student activism, as was the case at many American universities. Much of the unrest focused around civil rights and opposition to the Vietnam War. In response to anti-Vietnam War protests by the late 1960s, the University Safety and Security Division became the University of Washington Police Department.

    Odegaard instituted a vision of building a “community of scholars”, convincing the Washington State legislature to increase investment in the University. Washington senators, such as Henry M. Jackson and Warren G. Magnuson, also used their political clout to gather research funds for the University of Washington. The results included an increase in the operating budget from $37 million in 1958 to over $400 million in 1973, solidifying University of Washington as a top recipient of federal research funds in the United States. The establishment of technology giants such as Microsoft, Boeing and Amazon in the local area also proved to be highly influential in the University of Washington’s fortunes, not only improving graduate prospects but also helping to attract millions of dollars in university and research funding through its distinguished faculty and extensive alumni network.

    21st century

    In 1990, the University of Washington opened additional campuses in Bothell and Tacoma. Although originally intended for students who had already completed two years of higher education, both schools have since become four-year universities with the authority to grant degrees. The first freshman classes at these campuses started in fall 2006. Today both Bothell and Tacoma also offer a selection of master’s degree programs.

    In 2012, the university began exploring plans, and seeking governmental approval, to expand the main Seattle campus, including significant increases in student housing, teaching facilities for the growing student body and faculty, and expanded public transit options. The University of Washington light rail station opened in March 2016, connecting Seattle’s Capitol Hill neighborhood to the University of Washington Husky Stadium within five minutes of rail travel time. It offers a previously unavailable transportation option into and out of the campus, designed specifically to reduce dependence on private vehicles, bicycles and local King County buses.

    The University of Washington has been listed as a “Public Ivy” in Greene’s Guides since 2001, and is an elected member of the Association of American Universities. Among the faculty by 2012, there have been 151 members of the American Association for the Advancement of Science, 68 members of the National Academy of Sciences (US), 67 members of the American Academy of Arts and Sciences, 53 members of the National Academy of Medicine (US), 29 winners of the Presidential Early Career Award for Scientists and Engineers, 21 members of the National Academy of Engineering (US), 15 Howard Hughes Medical Institute Investigators, 15 MacArthur Fellows, 9 winners of the Gairdner Foundation International Award, 5 winners of the National Medal of Science, 7 Nobel Prize laureates, 5 winners of the Albert Lasker Award for Clinical Medical Research, 4 members of the American Philosophical Society, 2 winners of the National Book Award, 2 winners of the National Medal of Arts, 2 Pulitzer Prize winners, 1 winner of the Fields Medal, and 1 member of the National Academy of Public Administration. Among UW students by 2012, there were 136 Fulbright Scholars, 35 Rhodes Scholars, 7 Marshall Scholars and 4 Gates Cambridge Scholars. UW is recognized as a top producer of Fulbright Scholars, ranking 2nd in the US in 2017.

    The Academic Ranking of World Universities (ARWU) has consistently ranked the University of Washington among the top 20 universities worldwide every year since its first release. In 2019, the University of Washington ranked 14th worldwide out of 500 by the ARWU, 26th worldwide out of 981 in the Times Higher Education World University Rankings, and 28th worldwide out of 101 in the Times Higher Education World Reputation Rankings. Meanwhile, the QS World University Rankings ranked it 68th worldwide, out of over 900.

    U.S. News & World Report ranked University of Washington 8th out of nearly 1,500 universities worldwide for 2021, with University of Washington’s undergraduate program tied for 58th among 389 national universities in the U.S. and tied for 19th among 209 public universities.

    In 2019, it ranked 10th among universities worldwide in the SCImago Institutions Rankings. In 2017, the Leiden Ranking, which focuses on science and the impact of scientific publications among the world’s 500 major universities, ranked the University of Washington 12th globally and 5th in the U.S.

    In 2019, Kiplinger Magazine’s review of “top college values” named University of Washington 5th for in-state students and 10th for out-of-state students among U.S. public colleges, and 84th overall out of 500 schools. In the Washington Monthly National University Rankings University of Washington was ranked 15th domestically in 2018, based on its contribution to the public good as measured by social mobility, research, and promoting public service.

  • richardmitnick 4:25 pm on October 13, 2021 Permalink | Reply
    Tags: "How to better identify dangerous volcanoes", Applied Research & Technology, , , ,   

    From Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich] (CH): “How to better identify dangerous volcanoes” 

    From Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich] (CH)

    Felix Würsten

    The more water is dissolved in the magma, the greater the risk that a volcano will explode. A new ETH study now shows that this simple rule is only partially true. Paradoxically, very high water content significantly reduces the risk of explosion.

    During the eruption of Mount Pinatubo in June 1991, large quantities of ash particles were ejected into the stratosphere. The eruption’s impact on the climate lasted for years. Credit: Dave Harlow, The Geological Survey (US).

    Volcanologists have long been troubled by two questions: When exactly will a volcano erupt next? And how will that eruption unfold? Will the lava flow down the mountain as a viscous paste, or will the volcano explosively drive a cloud of ash kilometres up into the atmosphere?

    The first question of “when” can now be answered relatively precisely, explains Olivier Bachmann, Professor of Magmatic Petrology at ETH Zürich. He points to monitoring data from the Canary Island of La Palma, where the Cumbre Vieja volcano recently emitted a lava flow that poured down to the sea. Using seismic data, the experts were able to track the rise of the lava in real time, so to speak, and predict the eruption to within a few days.

    Unpredictable forces of nature

    The “how”, on the other hand, is still a major headache for volcanologists. Volcanoes on islands such as La Palma or Hawaii are known to be unlikely to produce huge explosions. But this question is much more difficult to answer for the large volcanoes located along subduction zones, such as those found in the Andes, on the US West Coast, in Japan, Indonesia, or in Italy and Greece. This is because all these volcanoes can erupt in many different ways, with no way to predict which will occur.

    To better understand how a volcano erupts, in recent years many researchers have focused on what happens in the volcanic conduit. It has been known for some time that the dissolved gases in the magma, which then emerges as lava at the Earth’s surface, are an important factor. If there are large quantities of dissolved gases in the magma, gas bubbles form in response to the decrease in pressure as the magma rises up through the conduit, similar to what happens in a shaken champagne bottle. These gas bubbles, if they cannot escape, then lead to an explosive eruption. In contrast, a magma containing little dissolved gas flows gently out of the conduit and is therefore much less dangerous for the surrounding area.

    What happens in the run-​up?

    Bachmann and his postdoctoral researcher Răzvan-​Gabriel Popa have now focused on the magma chamber in a new study they recently published in the journal Nature Geoscience. In an extensive literature study, they analysed data from 245 volcanic eruptions, reconstructing how hot the magma chamber was before the eruption, how many solid crystals there were in the melt and how high the dissolved water content was. This last factor is particularly important, because the dissolved water later forms the infamous gas bubbles during the magma’s ascent, turning the volcano into a champagne bottle that was too quickly uncorked.

    The data initially confirmed the existing doctrine: if the magma contains little water, the risk of an explosive eruption is low. The risk is also low if the magma already contains many crystals. This is because these ensure the formation of gas channels in the conduit through which the gas can easily escape, Bachmann explains. In the case of magma with few crystals and a water content of more than 3.5 percent, on the other hand, the risk of an explosive eruption is very high – just as the prevailing doctrine predicts.

    What surprised Bachmann and Popa, however, was that the picture changes again with high water content: if there is more than about 5.5 percent water in the magma, the risk of an explosive eruption drops markedly, even though many gas bubbles can certainly form as the lava rises. “So there’s a clearly defined area of risk that we need to focus on,” Bachmann explains.
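The eruption-style rule of thumb in the preceding paragraphs can be condensed into a toy classifier. This is an illustrative sketch only: the 3.5 and roughly 5.5 percent water thresholds come from the article, while the 40 percent crystal-content cutoff is an assumed placeholder standing in for "many crystals", not a value from the study.

```python
def explosion_risk(water_wt_pct: float, crystal_vol_pct: float) -> str:
    """Rough explosion-risk category for a rising magma, following the
    thresholds summarized in the article (crystal cutoff is assumed)."""
    if crystal_vol_pct > 40:
        # Crystal-rich magma: crystals promote gas channels, so gas escapes gently.
        return "low"
    if water_wt_pct < 3.5:
        # Water-poor magma: too few gas bubbles form to drive an explosion.
        return "low"
    if water_wt_pct > 5.5:
        # Very water-rich magma: bubbles connect at depth and the gas leaks out.
        return "reduced"
    # The dangerous window: 3.5-5.5 wt% water in a crystal-poor magma.
    return "high"

print(explosion_risk(4.5, crystal_vol_pct=10))  # prints "high"
```

The point of the sketch is that explosivity is not monotonic in water content: the risk first rises with water and then falls again past the second threshold.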

    Gases as a buffer

    The two volcanologists explain their new finding by way of two effects, both related to the very high water content, which causes gas bubbles to form not only in the conduit but also down in the magma chamber. First, the many gas bubbles link up early on, at great depth, to form channels in the conduit, making it easier for the gas to escape; it can then leak into the atmosphere without any explosive effect. Second, the gas bubbles present in the magma chamber delay the eruption of the volcano and thus reduce the risk of an explosion.

    “Before a volcano erupts, hot magma rises from great depths and enters the subvolcanic chamber of the volcano, which is located 6 to 8 kilometres below the surface, and increases the pressure there,” Popa explains. “As soon as the pressure in the magma chamber is high enough to crack the overlying rocks, an eruption occurs.”
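A rough solubility scaling shows why only very wet magmas grow bubbles down at chamber depth. Water solubility in silicic melt increases roughly with the square root of pressure, so the saturation pressure is approximately (water content / s)². The coefficient s = 0.41 wt% per √MPa (a common rhyolite approximation) and the crustal density used here are illustrative assumptions, not values from the study.

```python
S = 0.41                    # assumed solubility coefficient, wt% per sqrt(MPa)
RHO, G = 2700.0, 9.8        # assumed crustal density (kg/m^3) and gravity (m/s^2)
MPA_PER_KM = RHO * G * 1000 / 1e6   # lithostatic pressure gradient, ~26.5 MPa/km

for water_wt in (3.5, 5.5):
    p_sat = (water_wt / S) ** 2     # pressure below which the melt is saturated
    depth_km = p_sat / MPA_PER_KM   # depth at which bubbles first nucleate
    print(f"{water_wt} wt% water -> bubbles first appear near {depth_km:.1f} km depth")
```

Under these assumptions, a melt with 3.5 wt% water only starts to bubble at around 3 km, inside the conduit, while one with 5.5 wt% is already bubbling near 7 km, within the 6 to 8 km chamber depth Popa describes.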

    If the molten rock in the magma chamber contains gas bubbles, these act as a buffer: they are compressed by the material rising from below, slowing the pressure buildup in the magma chamber. This delay gives the magma more time to absorb heat from below, such that the lava is hotter and thus less viscous when it finally erupts. This makes it easier for the gas in the conduit to escape from the magma without explosive side effects.

    COVID-​19 as a stroke of luck

    These new findings make it theoretically possible to arrive at better forecasts for when to expect a dangerous explosion. The question is, how can scientists determine in advance the quantity of gas bubbles in the magma chamber and the extent to which the magma has already crystallised? “We’re currently discussing with geophysicists which methods could be used to best record these crucial parameters,” Bachmann says. “I think the solution is to combine different metrics – seismic, gravimetric, geoelectric and magnetic data, for example.”

    To conclude, Bachmann mentions a side aspect of the new study: “If it weren’t for the coronavirus crisis, we probably wouldn’t have written this paper,” he says with a grin. “When the first lockdown meant we suddenly couldn’t go into the field or the lab, we had to rethink our research activities at short notice. So we took the time we now had on our hands and spent it going through the literature to verify an idea we’d already had based on our own measurement data. We probably wouldn’t have done this time-​consuming research under normal circumstances.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ETH Zurich campus
    Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich] (CH) is a public research university in the city of Zürich, Switzerland. Founded by the Swiss Federal Government in 1854 with the stated mission to educate engineers and scientists, the school focuses exclusively on science, technology, engineering and mathematics. Like its sister institution Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH), it is part of the Swiss Federal Institutes of Technology Domain (ETH Domain), part of the Swiss Federal Department of Economic Affairs, Education and Research [EAER][Eidgenössisches Departement für Wirtschaft, Bildung und Forschung] [Département fédéral de l’économie, de la formation et de la recherche] (CH).

    The university is an attractive destination for international students thanks to low tuition fees of 809 CHF per semester, PhD and graduate salaries that are amongst the world’s highest, and a world-class reputation in academia and industry. There are currently 22,200 students from over 120 countries, of which 4,180 are pursuing doctoral degrees. In the 2021 edition of the QS World University Rankings ETH Zürich is ranked 6th in the world and 8th by the Times Higher Education World Rankings 2020. In the 2020 QS World University Rankings by subject it is ranked 4th in the world for engineering and technology (2nd in Europe) and 1st for earth & marine science.

    As of November 2019, 21 Nobel laureates, 2 Fields Medalists, 2 Pritzker Prize winners, and 1 Turing Award winner have been affiliated with the Institute, including Albert Einstein. Other notable alumni include John von Neumann and Santiago Calatrava. It is a founding member of the IDEA League and the International Alliance of Research Universities (IARU) and a member of the CESAER network.

    ETH Zürich was founded on 7 February 1854 by the Swiss Confederation and began giving its first lectures on 16 October 1855 as a polytechnic institute (eidgenössische polytechnische Schule) at various sites throughout the city of Zurich. It was initially composed of six faculties: architecture, civil engineering, mechanical engineering, chemistry, forestry, and an integrated department for the fields of mathematics, natural sciences, literature, and social and political sciences.

    It is locally still known as Polytechnikum, or simply as Poly, derived from the original name eidgenössische polytechnische Schule, which translates to “federal polytechnic school”.

    ETH Zürich is a federal institute (i.e., under direct administration by the Swiss government), whereas the University of Zürich [Universität Zürich ] (CH) is a cantonal institution. The decision for a new federal university was heavily disputed at the time; the liberals pressed for a “federal university”, while the conservative forces wanted all universities to remain under cantonal control, worried that the liberals would gain more political power than they already had. In the beginning, both universities were co-located in the buildings of the University of Zürich.

    From 1905 to 1908, under the presidency of Jérôme Franel, the course program of ETH Zürich was restructured into that of a full university, and ETH Zürich was granted the right to award doctorates. The first doctorates were awarded in 1909. In 1911, it was given its current name, Eidgenössische Technische Hochschule. In 1924, another reorganization structured the university into 12 departments; it now has 16.

    ETH Zürich, EPFL (Swiss Federal Institute of Technology in Lausanne) [École polytechnique fédérale de Lausanne](CH), and four associated research institutes form the Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH) with the aim of collaborating on scientific projects.

    Reputation and ranking

    ETH Zürich is ranked among the top universities in the world. Popular rankings typically place it as the best university in continental Europe; it is consistently ranked among the top five universities in Europe and among the top ten in the world.

    Historically, ETH Zürich has achieved its reputation particularly in the fields of chemistry, mathematics and physics. There are 32 Nobel laureates who are associated with ETH Zürich, the most recent of whom is Richard F. Heck, awarded the Nobel Prize in chemistry in 2010. Albert Einstein is perhaps its most famous alumnus.

    In 2018, the QS World University Rankings placed ETH Zürich 7th overall in the world. In 2015, ETH Zürich was ranked 5th in the world in Engineering, Science and Technology, just behind the Massachusetts Institute of Technology (US), Stanford University (US) and the University of Cambridge (UK). In 2015, ETH Zürich also ranked 6th in the world in Natural Sciences, and in 2016 ranked 1st in the world for Earth & Marine Sciences for the second consecutive year.

    In 2016, the Times Higher Education World University Rankings ranked ETH Zürich 9th overall in the world and 8th in the world in the field of Engineering & Technology, just behind the Massachusetts Institute of Technology (US), Stanford University (US), California Institute of Technology (US), Princeton University (US), University of Cambridge (UK), Imperial College London (UK) and the University of Oxford (UK).

    In a comparison of Swiss universities by swissUP Ranking and in rankings published by CHE comparing the universities of German-speaking countries, ETH Zürich traditionally is ranked first in natural sciences, computer science and engineering sciences.

    In the CHE ExcellenceRanking survey on the quality of Western European graduate school programs in the fields of biology, chemistry, physics and mathematics, ETH Zürich was assessed as one of the three institutions with excellent programs in all the fields considered, the other two being Imperial College London (UK) and the University of Cambridge (UK).

  • richardmitnick 12:55 pm on October 13, 2021 Permalink | Reply
    Tags: "Scientists capture image of bizarre 'electron ice' for the first time", Applied Research & Technology, , , Wigner crystal — a strange honeycomb-pattern material inside another material made entirely out of electrons.   

    From Live Science (US) : “Scientists capture image of bizarre ‘electron ice’ for the first time” 

    From Live Science (US)

    Ben Turner

    The scanning tunnelling image of the graphene sheet shows the honeycomb imprint of the ‘electron ice’ underneath it. (Image credit: H. Li et al./Nature)

    Physicists have taken the first ever image of a Wigner crystal — a strange honeycomb-pattern material inside another material made entirely out of electrons.

    Hungarian physicist Eugene Wigner first theorized this crystal in 1934, but it’s taken more than eight decades for scientists to finally get a direct look at the “electron ice.” The fascinating first image shows electrons squished together into a tight, repeating pattern — like tiny blue butterfly wings, or pressings of an alien clover.

    The researchers behind the study, published on Sept. 29 in the journal Nature, say that while this isn’t the first time that a Wigner crystal has been plausibly created or even had its properties studied, the visual evidence they collected is the most emphatic proof of the material’s existence yet.

    “If you say you have an electron crystal, show me the crystal,” study co-author Feng Wang, a physicist at The University of California (US), told Nature News.

    Inside ordinary conductors like silver or copper, or semiconductors like silicon, electrons zip around so fast that they are barely able to interact with each other. But at very low temperatures, they slow down to a crawl, and the repulsion between the negatively charged electrons begins to dominate. The once highly mobile particles grind to a halt, arranging themselves into a repeating, honeycomb-like pattern to minimize their total energy.
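That crossover can be checked on the back of an envelope by comparing the Coulomb repulsion between neighboring electrons with the thermal energy. The ~20 nm spacing below is an illustrative assumption for a trapped-electron lattice, not a number from the paper, and dielectric screening by the semiconductor layers would shrink the ratio, but the conclusion survives: at a few kelvin, repulsion wins by orders of magnitude.

```python
K_COULOMB_eV_nm = 1.44      # e^2 / (4*pi*eps0) expressed in eV*nm (vacuum value)
K_BOLTZMANN_eV = 8.617e-5   # Boltzmann constant in eV/K

spacing_nm = 20.0           # assumed electron-electron spacing
temperature_K = 5.0         # temperature scale of the experiment

coulomb_eV = K_COULOMB_eV_nm / spacing_nm    # ~0.07 eV of mutual repulsion
thermal_eV = K_BOLTZMANN_eV * temperature_K  # ~4e-4 eV of thermal agitation

print(f"repulsion / thermal energy ~ {coulomb_eV / thermal_eV:.0f}")
```

When the ratio is large, the electrons cannot afford to wander past one another and freeze into the lattice; warming the sample shrinks the ratio and melts the crystal.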

    To see this in action, the researchers trapped electrons in the gap between atom-thick layers of two tungsten semiconductors — one tungsten disulfide and the other tungsten diselenide. Then, after applying an electric field across the gap to remove any potentially disruptive excess electrons, the researchers chilled their electron sandwich down to 5 degrees above absolute zero. Sure enough, the once-speedy electrons stopped, settling into the repeating structure of a Wigner crystal.

    The researchers then used a device called a scanning tunneling microscope (STM) to view the new crystal. STMs work by applying a tiny voltage across a very sharp metal tip and running it just above a material, causing electrons to leap down from the tip to the material’s surface. The rate at which electrons jump from the tip depends on what lies beneath it, so researchers can build up a picture of the Braille-like contours of a 2D surface by measuring the current flowing into the surface at each point.
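The measuring principle in that paragraph, a current that falls off exponentially with tip-surface distance so that a scan converts surface height into a current map, can be sketched in a few lines. Everything here is a toy: the decay constant, ripple amplitude and 1-D geometry are invented for illustration.

```python
import math

def tunneling_current(gap_nm: float, i0: float = 1.0, kappa_per_nm: float = 10.0) -> float:
    """Toy tunneling current (arbitrary units): exponential decay with the gap."""
    return i0 * math.exp(-kappa_per_nm * gap_nm)

# A 1-D "surface": a gentle ripple standing in for the crystal's imprint on the
# graphene sheet; values are the tip-surface gap in nm at each scan position.
surface = [0.50 + 0.05 * math.cos(math.pi * x / 2) for x in range(12)]
image = [tunneling_current(gap) for gap in surface]

# High points of the surface (small gaps) show up as peaks in the current map.
assert image[surface.index(min(surface))] == max(image)
```

Real STMs typically run in constant-current mode with feedback on the tip height, but this constant-height picture is the simplest way to see why the current map traces the surface contours.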

    But the current provided by the STM was at first too much for the delicate electron ice, “melting” it upon contact. To stop this, the researchers inserted a single-atom layer of graphene just above the Wigner crystal, enabling the crystal to interact with the graphene and leave an impression on it that the STM could safely read — much like a photocopier. By tracing the image imprinted on the graphene sheet completely, the STM captured the first snapshot of the Wigner crystal, proving its existence beyond all doubt.

    Now that they have conclusive proof that Wigner crystals exist, scientists can use the crystals to answer deeper questions about how multiple electrons interact with each other, such as why the crystals arrange themselves in honeycomb orderings, and how they “melt.” The answers will offer a rare glimpse into some of the most elusive properties of the tiny particles.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 12:35 pm on October 13, 2021 Permalink | Reply
    Tags: "Levitation yields better neutron-lifetime measurement", Applied Research & Technology, , , , ,   

    From DOE’s Los Alamos National Laboratory (US) via Science Alert (US) : “Levitation yields better neutron-lifetime measurement” 


    From DOE’s Los Alamos National Laboratory (US)



    Science Alert (US)

    13 OCTOBER 2021

    TanyaLovus/iStock/Getty Images Plus.

    We now know, to within a tenth of a percent, how long a neutron can survive outside the atomic nucleus before decaying into a proton.

    This is the most precise measurement yet of the lifespan of these fundamental particles, representing a more than two-fold improvement over previous measurements. This has implications for our understanding of how the first matter in the Universe was created from a soup of protons and neutrons in the minutes after the Big Bang.

    “The process by which a neutron ‘decays’ into a proton – with an emission of a light electron and an almost massless neutrino – is one of the most fascinating processes known to physicists,” said nuclear physicist Daniel Salvat of Indiana University Bloomington (US).

    “The effort to measure this value very precisely is significant because understanding the precise lifetime of the neutron can shed light on how the universe developed – as well as allow physicists to discover flaws in our model of the subatomic universe that we know exist but nobody has yet been able to find.”

    The research was conducted at the Los Alamos Neutron Science Center, where a special experiment is set up just for measuring neutron lifespans. It’s called the UCNtau project, and it involves ultra-cold neutrons (UCNs) stored in a magneto-gravitational trap.

    The neutrons are cooled almost to absolute zero, and placed in the trap, a bowl-shaped chamber lined with thousands of permanent magnets, which levitate the neutrons, inside a vacuum jacket.

    The magnetic field prevents the neutrons from depolarizing and, combined with gravity, keeps the neutrons from escaping. This design allows neutrons to be stored for up to 11 days.

    The researchers stored their neutrons in the UCNtau trap for 30 to 90 minutes, then counted the remaining particles after the allotted time. Over the course of repeated experiments, conducted between 2017 and 2019, they counted over 40 million neutrons, obtaining enough statistical data to determine the particles’ lifespan with the greatest precision yet.

    This lifespan is around 877.75 ± 0.28 seconds (14 minutes and 38 seconds), according to the researchers’ analysis. The refined measurement can help place important physical constraints on the Universe, including the formation of matter and dark matter.
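The counting scheme behind that number can be illustrated with the classic paired-holding-time relation: if N1 neutrons survive a short hold t1 and N2 survive a long hold t2, then N(t) = N0·exp(-t/τ) gives τ = (t2 − t1) / ln(N1/N2). The sketch below runs that formula on idealized synthetic counts generated from the published lifetime; the real analysis uses tens of millions of counts and a careful treatment of systematics.

```python
import math

TAU_TRUE = 877.75   # seconds; the lifetime reported in the article

t1, t2 = 30 * 60.0, 90 * 60.0        # the 30- and 90-minute storage times
N0 = 1_000_000.0                     # synthetic initial population
N1 = N0 * math.exp(-t1 / TAU_TRUE)   # idealized survivors after the short hold
N2 = N0 * math.exp(-t2 / TAU_TRUE)   # idealized survivors after the long hold

tau_est = (t2 - t1) / math.log(N1 / N2)
print(f"recovered lifetime: {tau_est:.2f} s")   # prints 877.75
```

With noise-free counts the published value is recovered exactly; real data carries Poisson scatter, which is why more than 40 million stored neutrons were needed to reach a 0.28-second uncertainty.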

    After the Big Bang, things happened relatively quickly. In the very first moments, the hot, ultra-dense matter that filled the Universe cooled into quarks and electrons; just millionths of a second later, the quarks coalesced into protons and neutrons.

    Knowing the lifespan of the neutron can help physicists understand what role, if any, decaying neutrons play in the formation of the mysterious mass in the Universe known as dark matter. This information can also help test the validity of something called the Cabibbo-Kobayashi-Maskawa matrix, which helps explain the behavior of quarks under the Standard Model of physics, the researchers said.

    “The underlying model explaining neutron decay involves the quarks changing their identities, but recently improved calculations suggest this process may not occur as previously predicted,” Salvat said.

    “Our new measurement of the neutron lifetime will provide an independent assessment to settle this issue, or provide much-searched-for evidence for the discovery of new physics.”

    The research has been accepted for publication in Physical Review Letters.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Los Alamos National Laboratory (US) mission is to solve national security challenges through scientific excellence.

    LANL campus
    DOE’s Los Alamos National Laboratory, a multidisciplinary research institution engaged in strategic science on behalf of national security, is managed by Triad, a public-service-oriented national security science organization equally owned by its three founding members: The University of California, Texas A&M University (US) and Battelle Memorial Institute (Battelle), for the Department of Energy’s National Nuclear Security Administration. Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.

