Tagged: Civil Engineering

  • richardmitnick 12:08 pm on January 25, 2023
    Tags: "Build more but pollute less - The University of Toronto (CA) research centre tackles need for sustainable infrastructure", Civil Engineering

    From Faculty of Applied Science & Engineering At The University of Toronto (CA): “Build more but pollute less – The University of Toronto (CA) research centre tackles need for sustainable infrastructure” 


    1.19.23
    Tyler Irving

    A crane is reflected in a window at a construction site in downtown Toronto (Lance McMillan/Toronto Star via Getty Images)

    The newest research centre at the University of Toronto’s Faculty of Applied Science & Engineering will develop innovative ways to meet the urgent and growing need for infrastructure – without further exacerbating the climate crisis.

    The Centre for the Sustainable Built Environment brings together seven researchers from across U of T, as well as a dozen companies in construction and related industries. The goal is to identify strategies that lower the environmental footprint of new infrastructure by reimagining how it is designed, where it is built and what materials are used in its construction.

    “In Canada and around the world, we have a huge housing and infrastructure deficit – there’s a big social need to build much more than we have right now,” says Shoshanna Saxe, associate professor in the department of civil and mineral engineering and Canada Research Chair in Sustainable Infrastructure.

    “At the same time, construction resource use accounts for up to a third of total global greenhouse gas emissions each year, a problem that is getting worse. It’s been estimated that if we continue current ways of construction, by 2050 the emissions due to new housing alone will cause us to blow past two degrees of global warming,” she adds. “If we want to avoid that, let alone reach net zero by 2050, we need to find ways to do more with less.”

    Saxe and her collaborators – Evan Bentz, Chris Essert, Elias Khalil, Heather MacLean, Daman Panesar and Daniel Posen, all fellow U of T researchers – plan to approach this complex challenge from several different angles. Some efficiencies can be found by looking at where new housing is built, as well as what it looks like.

    “The average person living in a city consumes fewer resources than the average person living in a suburb, because in a city you have more people per kilometre of sewer, road or electrical infrastructure. There are big rewards for well-designed cities,” Saxe says.
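    Saxe's city-versus-suburb comparison is, at bottom, sharing arithmetic: the same kilometre of sewer or road serves more people at higher density. A minimal sketch of that point, with invented numbers (the function name and all values are illustrative, not from the study):

    ```python
    # Hypothetical illustration: denser settlements share linear
    # infrastructure (sewers, roads, wiring) among more residents, so the
    # per-person material burden drops. All numbers are invented.

    def infrastructure_per_capita(residents: int, km_of_networks: float) -> float:
        """Metres of linear infrastructure attributable to each resident."""
        return km_of_networks * 1000 / residents

    city = infrastructure_per_capita(residents=10_000, km_of_networks=20.0)
    suburb = infrastructure_per_capita(residents=2_000, km_of_networks=30.0)

    print(f"city:   {city:.1f} m per person")    # 2.0 m per person
    print(f"suburb: {suburb:.1f} m per person")  # 15.0 m per person
    ```

    Under these made-up figures, each suburban resident "owns" several times more pipe and pavement, and hence more embodied material and emissions, than a city resident.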

    “The shape and types of buildings we build are also important. For example, Toronto has a lot of long skinny apartments, where a lot of the space is in the hallway. If we design differently, we can better use that space to provide more housing, or avoid [extra space] altogether and save materials, emissions and cost.”

    Saxe and her team have also shown that large concrete basements account for a high proportion of the emissions due to construction – building more of the structure above ground could improve the environmental bottom line. Other potential solutions involve alternative building materials, such as new types of concrete that are less carbon-intensive.

    The multidisciplinary team – whose researchers cover a wide range of expertise, from carrying out life-cycle analysis of construction projects to defining national carbon budgets – will address issues well beyond the traditional bounds of engineering. For example, the group plans to explore the legal frameworks that translate established housing rights into practical built spaces.

    “It’s absurd to say that the right to housing means that everyone has to live in a space the size of a closet,” Saxe says. “But it’s also absurd to expect everyone to have their own 3,500-square-foot house. Can we find a middle ground where everyone can live in dignity, without consuming in a way that threatens the planet?”

    The research collaboration includes 12 external partners in the construction sector: Colliers; the Cement Association of Canada; Chandos Construction; Mattamy Homes; Northcrest; Pomerleau; Purpose Building, Inc.; ZGF Architects; Arup; SvN Architects + Planners; Entuitive; and KPMB Architects.

    By working closely with this core group, Saxe and her collaborators aim to speed up knowledge translation, ensuring that the insights gained through their research can be applied in industry.

    “The conversations we have with our partners can inform their design and construction, as well as the conversations they then have with their clients, raising everyone’s level of knowledge and awareness,” she says.

    “We hope that by giving people – policymakers, designers and builders – the tools they need to address these challenges of building more with fewer emissions, we can improve outcomes across the built environment and create a more sustainable future for everyone.”

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Faculty of Applied Science and Engineering is an academic division of the University of Toronto devoted to study and research in engineering. Founded in 1873 as the School of Practical Science, it is still known today by the longtime nickname of Skule. The faculty is based primarily across 16 buildings on the southern side of the university campus in Downtown Toronto, in addition to operating the Institute for Aerospace Studies facility. The faculty administers undergraduate, master’s and doctoral degree programs, as well as a dual-degree program with the Rotman School of Management.

    Departments

    Department of Chemical Engineering & Applied Chemistry (Chem)
    Department of Civil and Mineral Engineering (Civ/Min)
    The Edward S. Rogers Sr. Department of Electrical & Computer Engineering (ECE)
    Department of Materials Science & Engineering (MSE)
    Department of Mechanical & Industrial Engineering (MIE)

    Divisions

    Division of Engineering Science (EngSci)
    Division of Environmental Engineering & Energy Systems (DEEES)

    Specialized institutes

    University of Toronto Institute for Aerospace Studies (UTIAS)
    Institute of Biomedical Engineering (BME)

    Affiliated research institutes and centres

    BioZone
    Centre for Advanced Coating Technologies (CACT)
    Centre for Advanced Diffusion-Wave Technologies (CADIFT)
    Centre for Advanced Nanotechnology
    Centre for Global Engineering (CGEN)
    Centre for Maintenance Optimization & Reliability Engineering (C-MORE)
    Centre for Management of Technology & Entrepreneurship (CMTE)
    Centre for Research in Healthcare Engineering (CRHE)
    Centre for the Resilience of Critical Infrastructure (RCI)
    Centre for Technology & Social Development
    Emerging Communications Technology Institute (ECTI)
    Identity, Privacy & Security Institute (IPSI)
    Institute for Leadership Education in Engineering (ILead)
    Institute for Multidisciplinary Design & Innovation (UT-IMDI)
    Institute for Optical Sciences
    Institute for Robotics & Mechatronics (IRM)
    Institute for Sustainable Energy (ISE)
    Intelligent Transportation Systems (ITS) Centre & Test Bed
    Lassonde Institute of Mining
    Pulp & Paper Centre
    Southern Ontario Centre for Atmospheric Aerosol Research (SOCAAR)
    Terrence Donnelly Centre for Cellular & Biomolecular Research
    Ontario Centre for the Characterization of Advanced Materials (OCCAM)

    The University of Toronto (CA) is a public research university in Toronto, Ontario, Canada, located on the grounds that surround Queen’s Park. It was founded by royal charter in 1827 as King’s College, the oldest university in the province of Ontario.

    Originally controlled by the Church of England, the university assumed its present name in 1850 upon becoming a secular institution.

    As a collegiate university, it comprises eleven colleges, each with substantial autonomy on financial and institutional affairs and significant differences in character and history. The university also operates two satellite campuses located in Scarborough and Mississauga.

    University of Toronto has evolved into Canada’s leading institution of learning, discovery and knowledge creation. We are proud to be one of the world’s top research-intensive universities, driven to invent and innovate.

    Our students have the opportunity to learn from and work with preeminent thought leaders through our multidisciplinary network of teaching and research faculty, alumni and partners.

    The ideas, innovations and actions of more than 560,000 graduates continue to have a positive impact on the world.

    Academically, the University of Toronto is noted for movements and curricula in literary criticism and communication theory, known collectively as the Toronto School.

    The university was the birthplace of insulin and stem cell research, and was the site of the first electron microscope in North America; the identification of the first black hole, Cygnus X-1; the development of multi-touch technology; and the development of the theory of NP-completeness.

    The university was one of several universities involved in early research of deep learning. It receives the most annual scientific research funding of any Canadian university and is one of two members of the Association of American Universities outside the United States, the other being McGill University (CA).

    The Varsity Blues are the athletic teams that represent the university in intercollegiate league matches, with ties to gridiron football, rowing and ice hockey. The earliest recorded instance of gridiron football occurred at University of Toronto’s University College in November 1861.

    The university’s Hart House is an early example of the North American student centre, simultaneously serving cultural, intellectual, and recreational interests within its large Gothic-revival complex.

    The University of Toronto has educated three Governors General of Canada, four Prime Ministers of Canada, three foreign leaders, and fourteen Justices of the Supreme Court. As of March 2019, ten Nobel laureates, five Turing Award winners, 94 Rhodes Scholars, and one Fields Medalist have been affiliated with the university.

    Early history

    The founding of a colonial college had long been the desire of John Graves Simcoe, the first Lieutenant-Governor of Upper Canada and founder of York, the colonial capital. As a University of Oxford (UK)-educated military commander who had fought in the American Revolutionary War, Simcoe believed a college was needed to counter the spread of republicanism from the United States. The Upper Canada Executive Committee recommended in 1798 that a college be established in York.

    On March 15, 1827, a royal charter was formally issued by King George IV, proclaiming “from this time one College, with the style and privileges of a University … for the education of youth in the principles of the Christian Religion, and for their instruction in the various branches of Science and Literature … to continue for ever, to be called King’s College.” The granting of the charter was largely the result of intense lobbying by John Strachan, the influential Anglican Bishop of Toronto who took office as the college’s first president. The original three-storey Greek Revival school building was built on the present site of Queen’s Park.

    Under Strachan’s stewardship, King’s College was a religious institution closely aligned with the Church of England and the British colonial elite, known as the Family Compact. Reformist politicians opposed the clergy’s control over colonial institutions and fought to have the college secularized. In 1849, after a lengthy and heated debate, the newly elected responsible government of the Province of Canada voted to rename King’s College as the University of Toronto and severed the school’s ties with the church. Having anticipated this decision, the enraged Strachan had resigned a year earlier to open Trinity College as a private Anglican seminary. University College was created as the nondenominational teaching branch of the University of Toronto. During the American Civil War, the threat of a Union blockade on British North America prompted the creation of the University Rifle Corps, which saw battle in resisting the Fenian raids on the Niagara border in 1866. The Corps was part of the Reserve Militia led by Professor Henry Croft.

    Established in 1878, the School of Practical Science was the precursor to the Faculty of Applied Science and Engineering, which has been nicknamed Skule since its earliest days. While the Faculty of Medicine opened in 1843, medical teaching was conducted by proprietary schools from 1853 until 1887, when the faculty absorbed the Toronto School of Medicine. Meanwhile, the university continued to set examinations and confer medical degrees. The university opened the Faculty of Law in 1887, followed by the Faculty of Dentistry in 1888, when the Royal College of Dental Surgeons became an affiliate. Women were first admitted to the university in 1884.

    A devastating fire in 1890 gutted the interior of University College and destroyed 33,000 volumes from the library, but the university restored the building and replenished its library within two years. Over the next two decades, a collegiate system took shape as the university arranged federation with several ecclesiastical colleges, including Strachan’s Trinity College in 1904. The university operated the Royal Conservatory of Music from 1896 to 1991 and the Royal Ontario Museum from 1912 to 1968; both still retain close ties with the university as independent institutions. The University of Toronto Press was founded in 1901 as Canada’s first academic publishing house. The Faculty of Forestry, founded in 1907 with Bernhard Fernow as dean, was Canada’s first university faculty devoted to forest science. In 1910, the Faculty of Education opened its laboratory school, the University of Toronto Schools.

    World wars and post-war years

    The First and Second World Wars curtailed some university activities as undergraduate and graduate men eagerly enlisted. Intercollegiate athletic competitions and the Hart House Debates were suspended, although exhibition and interfaculty games were still held. The David Dunlap Observatory in Richmond Hill opened in 1935, followed by the University of Toronto Institute for Aerospace Studies in 1949. The university opened satellite campuses in Scarborough in 1964 and in Mississauga in 1967. The university’s former affiliated schools at the Ontario Agricultural College and Glendon Hall became fully independent of the University of Toronto, becoming part of the University of Guelph (CA) in 1964 and York University (CA) in 1965, respectively. Beginning in the 1980s, reductions in government funding prompted more rigorous fundraising efforts.

    Since 2000

    In 2000 Kin-Yip Chun was reinstated as a professor of the university after he launched an unsuccessful lawsuit against the university alleging racial discrimination. In 2017 a human rights application was filed against the University by one of its students for allegedly delaying the investigation of sexual assault and being dismissive of their concerns. In 2018 the university cleared one of its professors of allegations of discrimination and antisemitism in an internal investigation after a complaint was filed by one of its students.

    In 2007, the University of Toronto became the first Canadian university to amass a financial endowment greater than $1 billion. On September 24, 2020, the university announced a $250 million gift to the Faculty of Medicine from businessman and philanthropist James C. Temerty; it was the largest single philanthropic donation in Canadian history. This broke the previous record for the school, set in 2019 when Gerry Schwartz and Heather Reisman jointly donated $100 million for the creation of a 750,000-square-foot innovation and artificial intelligence centre.

    Research

    Since 1926, the University of Toronto has been a member of the Association of American Universities, a consortium of the leading North American research universities. The university manages by far the largest annual research budget of any university in Canada, with sponsored direct-cost expenditures of $878 million in 2010. In 2018, the University of Toronto was named the top research university in Canada by Research Infosource, with a sponsored research income (external sources of funding) of $1,147.584 million in 2017. In the same year, the university’s faculty averaged a sponsored research income of $428,200, while graduate students averaged $63,700. The federal government was the largest source of funding, with grants from the Canadian Institutes of Health Research, the Natural Sciences and Engineering Research Council, and the Social Sciences and Humanities Research Council amounting to about one-third of the research budget. About eight percent of research funding came from corporations, mostly in the healthcare industry.

    The first practical electron microscope was built by the physics department in 1938. During World War II, the university developed the G-suit, a life-saving garment worn by Allied fighter pilots that was later adopted for use by astronauts. Development of the infrared chemiluminescence technique improved analyses of energy behaviours in chemical reactions. In 1963, the asteroid 2104 Toronto was discovered at the David Dunlap Observatory (CA) in Richmond Hill and named after the university. In 1972, studies on Cygnus X-1 led to the publication of the first observational evidence proving the existence of black holes. Toronto astronomers have also discovered the Uranian moons Caliban and Sycorax; the dwarf galaxies Andromeda I, II and III; and the supernova SN 1987A. A pioneer in computing technology, the university designed and built UTEC, one of the world’s first operational computers, and later purchased Ferut, the second commercial computer after UNIVAC I. Multi-touch technology was developed at Toronto, with applications ranging from handheld devices to collaboration walls. The AeroVelo Atlas, which won the Igor I. Sikorsky Human Powered Helicopter Competition in 2013, was developed by the university’s team of students and graduates and was tested in Vaughan.

    The discovery of insulin at the University of Toronto in 1921 is considered among the most significant events in the history of medicine. The stem cell was discovered at the university in 1963, forming the basis for bone marrow transplantation and all subsequent research on adult and embryonic stem cells. This was the first of many findings at Toronto relating to stem cells, including the identification of pancreatic and retinal stem cells. The cancer stem cell was first identified in 1997 by Toronto researchers, who have since found stem cell associations in leukemia, brain tumors and colorectal cancer. Medical inventions developed at Toronto include the glycaemic index, the infant cereal Pablum, the use of protective hypothermia in open heart surgery and the first artificial cardiac pacemaker. The first successful single-lung transplant was performed at Toronto in 1981, followed by the first nerve transplant in 1988 and the first double-lung transplant in 1989. Researchers identified the maturation promoting factor that regulates cell division and discovered the T-cell receptor, which triggers responses of the immune system. The university is credited with isolating the genes that cause Fanconi anemia, cystic fibrosis and early-onset Alzheimer’s disease, among numerous other diseases. Between 1914 and 1972, the university operated the Connaught Medical Research Laboratories, now part of the pharmaceutical corporation Sanofi-Aventis. Among the research conducted at the laboratory was the development of gel electrophoresis.

    The University of Toronto is the primary research presence that supports one of the world’s largest concentrations of biotechnology firms. More than 5,000 principal investigators reside within 2 kilometres (1.2 mi) of the university grounds in Toronto’s Discovery District, conducting $1 billion of medical research annually. MaRS Discovery District is a research park that serves commercial enterprises and the university’s technology transfer ventures. In 2008, the university disclosed 159 inventions and had 114 active start-up companies. Its SciNet Consortium operates the most powerful supercomputer in Canada.

     
  • richardmitnick 9:34 pm on January 13, 2023
    Tags: "Drag and drop", "Friction and drag", "Moving water and earth", An MIT team has come up with a better formula to calculate how much sediment a fluid can push across a granular bed: "bed load transport"., As a river cuts through a landscape it can operate like a conveyor belt., Civil Engineering, How much sediment a fluid can push across a granular bed, Managing river restoration and coastal erosion

    From The Massachusetts Institute of Technology: “Moving water and earth” 

    From The Massachusetts Institute of Technology

    1.11.23
    Jennifer Chu

    An MIT team has developed a more accurate formula to calculate how much sediment a fluid can push across a granular bed, which could help engineers manage river restoration and coastal erosion. The key to the new formula comes down to the shape of the sediment grains. Credit: Courtesy of the researchers.

    As a river cuts through a landscape it can operate like a conveyor belt, moving truckloads of sediment over time. Knowing how quickly or slowly this sediment flows can help engineers plan for the downstream impact of restoring a river or removing a dam. But the models currently used to estimate sediment flow can be off by a wide margin.

    An MIT team has come up with a better formula to calculate how much sediment a fluid can push across a granular bed — a process known as “bed load transport”. The key to the new formula comes down to the shape of the sediment grains.

    It may seem intuitive: A smooth, round stone should skip across a river bed faster than an angular pebble. But flowing water also pushes harder on the angular pebble, which could erase the round stone’s advantage. Which effect wins? Existing sediment transport models surprisingly don’t offer an answer, mainly because the problem of measuring grain shape is too unwieldy: How do you quantify a pebble’s contours?

    The MIT researchers found that instead of considering a grain’s exact shape, they could boil the concept of shape down to two related properties: “friction and drag”. A grain’s drag, or resistance to fluid flow, relative to its internal friction, the resistance to sliding past other grains, can provide an easy way to gauge the effects of a grain’s shape.

    When they incorporated this new mathematical measure of grain shape into a standard model for bed load transport, the new formula made predictions that matched experiments that the team performed in the lab.

    “Sediment transport is a part of life on Earth’s surface, from the impact of storms on beaches to the gravel nests in mountain streams where salmon lay their eggs,” the team writes of their new study, appearing today in Nature [below]. “Damming and sea level rise have already impacted many such terrains and pose ongoing threats. A good understanding of bed load transport is crucial to our ability to maintain these landscapes or restore them to their natural states.”

    The study’s authors are Eric Deal, Santiago Benavides, Qiong Zhang, Ken Kamrin, and Taylor Perron of MIT, and Jeremy Venditti and Ryan Bradley of Simon Fraser University in Canada.

    Figuring flow

    Video of glass spheres (top) and natural river gravel (bottom) undergoing bed load transport in a laboratory flume, slowed down 17x relative to real time. Average grain diameter is about 5 mm. This video shows how rolling and tumbling natural grains interact with one another in a way that is not possible for spheres. What can’t be seen so easily is that natural grains also experience higher drag forces from the flowing water than spheres do.
    Credit: Courtesy of the researchers.

    Bed load transport is the process by which a fluid such as air or water drags grains across a bed of sediment, causing the grains to hop, skip, and roll along the surface as a fluid flows through. This movement of sediment in a current is what drives rocks to migrate down a river and sand grains to skip across a desert.

    Being able to estimate bed load transport can help scientists prepare for situations such as urban flooding and coastal erosion. Since the 1930s, one formula has been the go-to model for calculating bed load transport; it’s based on a quantity known as the Shields parameter, after the American engineer who originally derived it. This formula sets a relationship between the force of a fluid pushing on a bed of sediment, and how fast the sediment moves in response. Albert Shields incorporated certain variables into this formula, including the average size and density of a sediment’s grains — but not their shape.
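    The classical Shields parameter is the standard dimensionless ratio of the fluid's shear stress on the bed to the submerged weight of a grain. A minimal sketch of this shape-blind version (not the MIT team's modified formula; function names and the sample values are illustrative):

    ```python
    # Classical Shields parameter: tau* = tau_b / ((rho_s - rho_f) * g * D).
    # It compares the fluid force pushing on the bed (tau_b, in Pa) with the
    # submerged weight holding a grain of diameter D (in m) in place.

    def shields_parameter(tau_bed: float, rho_sediment: float,
                          rho_fluid: float, grain_diameter: float,
                          g: float = 9.81) -> float:
        return tau_bed / ((rho_sediment - rho_fluid) * g * grain_diameter)

    # 5 mm quartz-density gravel under 1 Pa of bed shear stress:
    tau_star = shields_parameter(tau_bed=1.0, rho_sediment=2650.0,
                                 rho_fluid=1000.0, grain_diameter=0.005)
    print(f"tau* = {tau_star:.4f}")
    ```

    Average grain size and density enter through D and rho_s; the MIT team's contribution is to fold grain shape in as well, via the drag-to-friction ratio described below.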

    “People may have backed away from accounting for shape because it’s one of these very scary degrees of freedom,” says Kamrin, a professor of mechanical engineering at MIT. “Shape is not a single number.”

    And yet, the existing model has been known to be off by a factor of 10 in its predictions of sediment flow. The team wondered whether grain shape could be a missing ingredient, and if so, how the nebulous property could be mathematically represented.

    “The trick was to focus on characterizing the effect that shape has on sediment transport dynamics, rather than on characterizing the shape itself,” says Deal.

    “It took some thinking to figure that out,” says Perron, a professor of geology in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “But we went back to derive the Shields parameter, and when you do the math, this ratio of drag to friction falls out.”

    “Drag and drop”

    Their work showed that the Shields parameter — which predicts how much sediment is transported — can be modified to include not just size and density, but also grain shape, and furthermore, that a grain’s shape can be simply represented by a measure of the grain’s drag and its internal friction. The math seemed to make sense. But could the new formula predict how sediment actually flows?

    To answer this, the researchers ran a series of flume experiments, in which they pumped a current of water through an inclined tank with a floor covered in sediment. They ran tests with sediment of various grain shapes, including beds of round glass beads, smooth glass chips, rectangular prisms, and natural gravel. They measured the amount of sediment that was transported through the tank in a fixed amount of time. They then determined the effect of each sediment type’s grain shape by measuring the grains’ drag and friction.

    For drag, the researchers simply dropped individual grains down through a tank of water and gathered statistics for the time it took the grains of each sediment type to reach the bottom. For instance, a flatter grain type takes a longer time on average, and therefore has greater drag, than a round grain type of the same size and density.

    To measure friction, the team poured grains through a funnel and onto a circular tray, then measured the resulting pile’s angle, or slope — an indication of the grains’ friction, or ability to grip onto each other.
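    Both measurements map onto textbook relations: a terminal-velocity force balance gives an effective drag coefficient, and the tangent of the angle of repose gives a bulk friction coefficient. A sketch under those standard assumptions (sphere-equivalent grain geometry; all numbers are illustrative, not the paper's data):

    ```python
    import math

    # Drag: at terminal velocity the drag force balances the grain's
    # submerged weight, 0.5*Cd*rho_f*A*v^2 = (rho_s - rho_f)*g*V.
    # With sphere-equivalent area and volume for a grain of diameter d,
    # Cd = 4*(rho_s - rho_f)*g*d / (3*rho_f*v^2).
    def drag_coefficient(settling_velocity: float, grain_diameter: float,
                         rho_sediment: float = 2650.0,
                         rho_fluid: float = 1000.0, g: float = 9.81) -> float:
        return (4.0 * (rho_sediment - rho_fluid) * g * grain_diameter
                / (3.0 * rho_fluid * settling_velocity**2))

    # Friction: the slope of a poured pile (angle of repose) measures how
    # well grains grip one another.
    def friction_coefficient(angle_of_repose_deg: float) -> float:
        return math.tan(math.radians(angle_of_repose_deg))

    # A flatter grain settles more slowly, so it carries a larger drag
    # coefficient than a rounder grain of the same size and density:
    cd_round = drag_coefficient(settling_velocity=0.40, grain_diameter=0.005)
    cd_flat = drag_coefficient(settling_velocity=0.25, grain_diameter=0.005)
    mu = friction_coefficient(35.0)  # an assumed pile angle for gravel

    print(f"drag/friction, round grain: {cd_round / mu:.2f}")
    print(f"drag/friction, flat grain:  {cd_flat / mu:.2f}")
    ```

    The ratio of these two numbers is the kind of single shape measure the researchers fold into the modified Shields parameter.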

    For each sediment type, they then worked the corresponding shape’s drag and friction into the new formula, and found that it could indeed predict the bed load transport, or the amount of moving sediment that the researchers measured in their experiments.

    The team says the new model more accurately represents sediment flow. Going forward, scientists and engineers can use the model to better gauge how a river bed will respond to scenarios such as sudden flooding from severe weather or the removal of a dam.

    “If you were trying to make a prediction of how fast all that sediment will get evacuated after taking a dam out, and you’re wrong by a factor of three or five, that’s pretty bad,” Perron says. “Now we can do a lot better.”

    This research was supported, in part, by the U.S. Army Research Laboratory.

    Science paper:
    Nature

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and Whitehead Institute.

    Massachusetts Institute of Technology-Haystack Observatory, Westford, Massachusetts, USA, Altitude 131 m (430 ft).


    The Computer Science and Artificial Intelligence Laboratory (CSAIL)

    The Kavli Institute for Astrophysics and Space Research

    MIT’s Institute for Medical Engineering and Science is a research institute at the Massachusetts Institute of Technology

    The MIT Laboratory for Nuclear Science

    The MIT Media Lab

    The MIT School of Engineering

    The MIT Sloan School of Management


    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities.

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but rather an institution combining elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    The Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at the Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, The Massachusetts Institute of Technology had become the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4,000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after the war. Post-war government-sponsored research at MIT included SAGE, guidance systems for ballistic missiles, and Project Apollo.

    These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and The Massachusetts Institute of Technology’s defense research. In this period, Massachusetts Institute of Technology departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam, as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969, during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time, and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced aLIGO was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and funded by the National Science Foundation.

    Caltech/MIT Advanced aLIGO

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in Physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique, which served as the essential blueprint for LIGO.

    The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 7:57 am on January 12, 2023 Permalink | Reply
    Tags: "Preparing for a changing climate", A multi-institutional effort to identify the best models to calculate flood risk at coastal military installations where climate change threatens to increase the risk of flood damage., , Civil Engineering, , , , Many military installations are located along the coast and they can’t be easily relocated. They need to be protected., , , The findings could have broader implications for coastal communities., The goal is to be able to accurately predict what kind of flooding or damage a certain site might experience during a hurricane impact., The models have to be able to process information quickly enough so that there’s time for a response., The more complex the model is the more physics it includes and the more computationally demanding it is., , UD civil engineers lead research to examine models for coastal readiness at U.S. military bases.   

    From The University of Delaware : “Preparing for a changing climate” 

    U Delaware bloc

    From The University of Delaware

    1.11.23
    Maddy Lauria
    Photo courtesy of Christopher Lashley, Stephanie Patch and NASA.
    Photo illustrations by Joy Smoker.

    1
    Jack Puleo, chair of the University of Delaware’s Department of Civil and Environmental Engineering, is leading a research effort that could have broad implications for coastal communities and calculating risk in the face of a changing climate and rising sea levels.

    UD civil engineers lead research to examine models for coastal readiness at U.S. military bases.

    University of Delaware civil engineers are leading a multi-institutional effort to identify the best models to calculate flood risk at coastal military installations where climate change threatens to increase the risk of flood damage from sea level rise and storm surge.

    The four-year project, which launched in mid-2022 and will run through spring 2025, is funded by a $2.2 million grant from the U.S. Department of Defense (DoD). Project partners include faculty and students from the Netherlands, North Carolina State University, the University of South Alabama, Texas A&M and the United States Geological Survey (USGS).

    “Many military installations are located along the coast and they can’t be easily relocated. They need to be protected,” said Jack Puleo, chair of UD’s Department of Civil and Environmental Engineering and project lead. “To do that, we need to understand what the flooding risk is.”

    The DoD-funded research will explore numerical models that calculate total water levels in the face of sea level rise, tides, wind-induced surge, waves and other environmental variables to determine which approaches not only perform the best but are also the most cost-effective. The team of researchers will apply their work to three military sites: the Virginia-based Naval Station Norfolk on the Atlantic Coast, Tyndall Air Force Base on the Gulf Coast of Florida and the Ronald Reagan Ballistic Missile Defense Test Site on the Marshall Islands in the Pacific Ocean.

    The goal is to be able to accurately predict what kind of flooding or damage a certain site might experience during a hurricane impact, for example, when there’s been another foot of sea level rise.

    “But it’s not just getting wet that’s important,” Puleo said. “It’s about flooding duration and depth. If a prediction says there will be 1 inch of water on a roadway, maybe you don’t care as much. But if it says you’ll have 1 foot of water for multiple tidal cycles, that’s important to know. It could hamper critical services and evacuation.”
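    The components the researchers combine can be illustrated with a first-order “total water level” calculation, in which the contributions the article lists (sea level rise, tide, surge, wave runup) are simply superposed. The sketch below is a hypothetical illustration of that idea, not the project’s method or data; real models resolve the nonlinear interaction of these components, and all names and numbers here are invented for the example.

    ```python
    # First-order total-water-level (TWL) sketch: superpose the contributions
    # named in the article. Illustrative only -- operational models such as
    # XBeach resolve how these components interact rather than adding them.

    def total_water_level(sea_level_rise_m, tide_m, surge_m, wave_runup_m):
        """Crude TWL estimate: components simply add (units: metres)."""
        return sea_level_rise_m + tide_m + surge_m + wave_runup_m

    def flood_depth(twl_m, ground_elevation_m):
        """Inundation depth at a point; zero if the point stays dry."""
        return max(0.0, twl_m - ground_elevation_m)

    # Hypothetical example: a road at 1.2 m elevation during a storm with
    # 0.3 m of sea level rise, a 0.8 m tide, 0.6 m of surge, 0.2 m of runup.
    twl = total_water_level(0.3, 0.8, 0.6, 0.2)   # about 1.9 m
    depth = flood_depth(twl, 1.2)                  # about 0.7 m on the road
    ```

    As Puleo notes, the depth alone is not the whole story: pairing such an estimate with how long the water stays above a threshold is what turns it into a planning tool.
    
    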

    2
    This image shows one of the areas a team of civil engineers will be focusing on — the coastline near Tyndall Air Force Base along the Gulf of Mexico — during a multi-year research project to examine the varying strengths and weaknesses of coastal flooding models, particularly in the face of changing water levels.

    Their findings could have broader implications for coastal communities by identifying which applications work best in which settings because running high-fidelity models isn’t cheap or easy.

    Modeling strengths and weaknesses

    There is a wide range of predictive models available to use, from those that handle basic calculations (but are still highly technical) to those that can produce highly localized results. Combining those models with witness accounts and existing data will help researchers “tease out the importance of knowing the fine details,” Puleo said.

    The question is how much information is really needed to make accurate predictions that could help these military installations become more resilient in the face of a changing climate, especially along the coast. It’s also about timing: the models have to be able to process information quickly enough so that there’s time for a response, such as moving assets out of the way if necessary.

    “We’re the team testing out all of these models and methods to be able to provide a kind of roadmap for when to use which model and what it will cost computationally or resource-wise to be able to do that,” said Stephanie Patch, an associate professor at the University of South Alabama’s Department of Civil, Coastal, and Environmental Engineering. The answer would largely depend on the event — a heavy rain or a major hurricane — as well as the specific location.

    The military is interested in learning about the best options because there can be a steep cost associated with running the higher-end models — upwards of $250,000 per site for data collection, supercomputer access and manpower to generate model input, at a rough estimate. On the other hand, there could also be a steep cost with responding to an event that never happens if the model’s prediction doesn’t play out — or the opposite if an event turns unexpectedly catastrophic and there’s no time to respond.

    3
    These images show water elevations after Hurricane Michael of 2018 at Tyndall Air Force Base along the Gulf of Mexico. Total water levels were estimated using a model called XBeach, run by the University of South Alabama’s Stephanie Patch and colleagues.

    While Patch is focusing on a model that’s very closely tied to a small area of beach and dune and the impacts of erosion, North Carolina State University’s Casey Dietrich is working with larger-scale models capable of simulating storm effects over large areas, like an entire state or the entire Gulf of Mexico. But the information from the varying models can be linked to help the smaller-scale studies make more accurate predictions, Dietrich explained.

    “The goal is to provide guidance to the DoD about the strengths and weaknesses of each model in comparison. They’re all going to have things they’re good with and things they struggle with,” Dietrich said. Those comparisons will help the agencies decide what types of models they want to use to get what types of information — depending on how much time, effort and funding they want to commit.

    There’s also a goal of reducing cost and building smarter models, he said.

    “If we are able to improve our predictions at very specific sites along the coast, we also can have better predictions at other specific sites along the coast, like someone’s house or a bridge or other infrastructure,” Dietrich said.

    Still, differences in the geographic location of the military facilities themselves will play a role in the physics of varying environmental factors, such as wind-driven waves or storm surge, and how those variables interact with the land. That’s why researchers are exploring sites on the Atlantic, Gulf and Pacific coasts. 

    But knowing everything everywhere isn’t always possible, Puleo said. Information on what the seafloor or topography looks like may rely on data collected decades ago or sparse patches of information.

    “There’s so many models to choose, and they’re not all easy to just pick up and use,” said Patch. “I think this project is so great because we’re getting a team of people together who have this expertise in different models who can determine the benefits of those.”

    Planning for the future

    Making as-accurate-as-possible predictions despite any data gaps and potential funding constraints is part of the real-world balance decision-makers must tackle in the face of storm preparedness.

    UD postdoc Christopher Lashley is using data from 2011’s Hurricane Irene to see how a particular level of modeling will perform. His job, he said, is to make sure he’s giving the model the correct input — because the model is only as accurate as the information it’s given.

    “The more complex the model is, the more physics it includes, the more computationally demanding it is,” Lashley said. “One simulation could take maybe 100 people with individual laptops running at the same time, if you weren’t using a supercomputer. Lesser models can be run in one hour or a few minutes. It can vary significantly.”
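    Lashley’s point about computational demand can be illustrated with a standard scaling argument (a general property of explicit grid-based solvers, not a figure from this project): refining a two-dimensional model’s grid increases the cell count quadratically, and the CFL stability limit simultaneously shrinks the allowable time step, so run cost grows roughly with the cube of the refinement factor.

    ```python
    # Back-of-the-envelope cost scaling for an explicit 2D hydrodynamic
    # solver. Halving the grid spacing quadruples the cell count AND (via
    # the CFL stability limit) halves the allowable time step -- roughly
    # an 8x increase in work. Numbers are illustrative, not project data.

    def relative_cost(dx_coarse_m, dx_fine_m):
        """Cost ratio of a fine-grid run vs a coarse one:
        (dx_coarse/dx_fine)**3 -- two powers for cells in 2D,
        one power for the smaller time step."""
        return (dx_coarse_m / dx_fine_m) ** 3

    print(relative_cost(100.0, 50.0))   # halve dx: 8x the work
    print(relative_cost(100.0, 10.0))   # 10x finer grid: 1000x the work
    ```

    This cubic growth is why the team’s “roadmap” matters: a tenfold-finer simulation does not cost ten times as much, but closer to a thousand times.
    
    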

    4
    These three images show how the position of a hurricane can impact the water level, due to surge, in certain coastal areas.

    UD Professor Fengyan Shi, a numerical modeling expert with decades of experience and a core faculty member of UD’s Center for Applied Coastal Research, will lead the modeling group. He said working with fluid environments is very complicated because of the various elements you have to consider, like wind fields and how waves are generated.

    Add on top of that the long-term process of sea level rise and physics happening in different places, such as the way water flows in a harbor versus the open ocean, and it’s easy to see how researchers can become very detailed with their modeling approaches.

    “This is real applied research,” Shi said, noting that it will also help researchers further study the impact of physics in model predictions.

    Ultimately, what the team tests and validates could be useful to everyone, Lashley said. If findings indicate that less-complex models work well at predicting, say, devastating hurricane impacts that would require evacuations days in advance, the coastal engineering work they’re doing could ultimately benefit countries and communities without access to supercomputers or the time to wait for slower models to run.

    “If you know, then you can plan,” he said.

    This kind of forward-looking research also represents the future of coastal engineering, said Patch.

    “We’re learning a lot about the models in terms of how they compare with each other,” she said. “I hope the outcome of this work doesn’t just benefit these specific locations, but also military installations around the world and communities around the world. It’s so translatable and transferable. I would love to see an outcome of this project — even if it’s indirect — to learn enough to apply it worldwide on all of our coasts as climate changes.”

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Delaware campus

    The University of Delaware is a public land-grant research university located in Newark, Delaware. It is the largest university in Delaware. It offers three associate’s programs, 148 bachelor’s programs, 121 master’s programs (with 13 joint degrees), and 55 doctoral programs across its eight colleges. The main campus is in Newark, with satellite campuses in Dover, the Wilmington area, Lewes, and Georgetown. It is considered a large institution with approximately 18,200 undergraduate and 4,200 graduate students. It is a privately governed university that receives public funding as a land-grant, sea-grant, and space-grant state-supported research institution.

    The University of Delaware is classified among “R1: Doctoral Universities – Very high research activity”. According to the National Science Foundation, UD spent $186 million on research and development in 2018, ranking it 119th in the nation. It is recognized with the Community Engagement Classification by the Carnegie Foundation for the Advancement of Teaching.

    The University of Delaware is one of only four schools in North America with a major in art conservation. In 1923, it was the first American university to offer a study-abroad program.

    The University of Delaware traces its origins to a “Free School,” founded in New London, Pennsylvania in 1743. The school moved to Newark, Delaware by 1765, becoming the Newark Academy. The academy trustees secured a charter for Newark College in 1833 and the academy became part of the college, which changed its name to Delaware College in 1843. While it is not considered one of the colonial colleges because it was not a chartered institution of higher education during the colonial era, its original class of ten students included George Read, Thomas McKean, and James Smith, all three of whom went on to sign the Declaration of Independence. Read also later signed the United States Constitution.

    Science, Technology and Advanced Research (STAR) Campus

    On October 23, 2009, The University of Delaware signed an agreement with Chrysler to purchase a shuttered vehicle assembly plant adjacent to the university for $24.25 million as part of Chrysler’s bankruptcy restructuring plan. The university has developed the 272-acre (1.10 km²) site into the Science, Technology and Advanced Research (STAR) Campus. The site is the new home of the University of Delaware’s College of Health Sciences, which includes teaching and research laboratories and several public health clinics. The STAR Campus also includes research facilities for the University of Delaware’s vehicle-to-grid technology, as well as Delaware Technology Park, SevOne, CareNow, Independent Prosthetics and Orthotics, and the East Coast headquarters of Bloom Energy. In 2020 [needs an update], the University of Delaware expected to open the Ammon Pinozzotto Biopharmaceutical Innovation Center, which will become the new home of the UD-led National Institute for Innovation in Manufacturing Biopharmaceuticals. Also, Chemours opened its global research and development facility, known as the Discovery Hub, on the STAR Campus in 2020. The new Newark Regional Transportation Center on the STAR Campus will serve passengers of Amtrak and regional rail.

    Academics

    The university is organized into nine colleges:

    Alfred Lerner College of Business and Economics
    College of Agriculture and Natural Resources
    College of Arts and Sciences
    College of Earth, Ocean and Environment
    College of Education and Human Development
    College of Engineering
    College of Health Sciences
    Graduate College
    Honors College

    There are also five schools:

    Joseph R. Biden, Jr. School of Public Policy and Administration (part of the College of Arts & Sciences)
    School of Education (part of the College of Education & Human Development)
    School of Marine Science and Policy (part of the College of Earth, Ocean and Environment)
    School of Nursing (part of the College of Health Sciences)
    School of Music (part of the College of Arts & Sciences)

     
  • richardmitnick 8:53 am on January 7, 2023 Permalink | Reply
    Tags: "Debris-covered glaciers": glaciers that are covered by sand and rocks and boulders., "SSPs": shared socioeconomic pathways, "Team projects two out of three glaciers could be lost by 2100", "Tidewater glaciers": glaciers that terminate in the ocean, , , Civil Engineering, , David Rounce led an international effort to produce new projections of glacier mass loss through the century under different emissions scenarios., , , , , North American and Central European glacial regions will almost disappear completely., Only recently have researchers been able to produce global predictions for total glacial mass change using the new "SSPs"., , The report warned that policymakers have less than three years to act to avert catastrophic and irreversible changes to our climate., The world could lose as much as 41 percent of its total glacier mass this century—or as little as 26 percent—depending on today’s climate change mitigation efforts.   

    From The College of Engineering At Carnegie Mellon University: “Team projects two out of three glaciers could be lost by 2100” 

    From The College of Engineering

    At

    Carnegie Mellon University

    1
    Glaciers from a research expedition. Credit: Carnegie Mellon College of Engineering.

    1.7.23
    Dan Carroll
    dccarrol@andrew.cmu.edu

    David Rounce led an international effort to produce new projections of glacier mass loss through the century under different emissions scenarios.

    The projections were aggregated into global temperature change scenarios to support adaptation and mitigation discussions, such as those at the recent United Nations Conference of Parties (COP 27). His work showed that the world could lose as much as 41 percent of its total glacier mass this century—or as little as 26 percent—depending on today’s climate change mitigation efforts.

    The most recent IPCC report for policymakers brought together thousands of internationally recognized climate experts in an urgent plea to citizens and their governments to fight for drastic and immediate reductions to greenhouse gas emissions. The report warned that policymakers have less than three years to act to avert catastrophic and irreversible changes to our climate. The shared socioeconomic pathways, or SSPs, they used to model future scenarios for climate change are based on factors like population, economic growth, education, urbanization, and innovation. These new pathways illustrate a more complete picture of socioeconomic trends that could impact future greenhouse gas emissions.

    Only recently have researchers been able to produce global predictions for total glacial mass change using the new “SSPs”. Rounce’s work aggregates these future climate scenarios based on their increase in global mean temperature to evaluate the corresponding impacts associated with temperature change scenarios ranging from +1.5° C to +4° C. His model is also calibrated with an unprecedented amount of data, including individual mass change observations for every glacier, and uses state-of-the-art calibration methods that require the use of supercomputers.

    Rounce, an assistant professor of Civil and Environmental Engineering, and his team found that in the SSP with continued investment in fossil fuels, more than 40 percent of the glacial mass will be gone within the century, and more than 80 percent of glaciers by number could well disappear. Even in a best-case, low-emissions scenario, where the increase in global mean temperature is limited to +1.5° C relative to pre-industrial levels, more than 25 percent of glacial mass will be gone and nearly 50 percent of glaciers by number are projected to disappear. A majority of these lost glaciers are small (less than one km²) by glacial standards, but their loss can negatively affect local hydrology, tourism, glacier hazards, and cultural values.
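    The scenario figures quoted here can be turned into a rough interpolation. This is an illustrative sketch only: the ~26%/~41% mass-loss and ~49%/~80% count-loss endpoints come from the article, but the linear interpolation between them is an assumption for illustration, not the study's method.

```python
def projected_loss(delta_t_c):
    """Rough global glacier losses by 2100 (mass fraction, count fraction)
    for a warming of delta_t_c degrees C above pre-industrial, linearly
    interpolated between the +1.5 C and +4 C scenarios in the article."""
    lo_t, hi_t = 1.5, 4.0
    mass_lo, mass_hi = 0.26, 0.41    # ~26% (best case) to ~41% of glacier mass
    count_lo, count_hi = 0.49, 0.80  # ~half to >80% of glaciers by number
    f = min(max((delta_t_c - lo_t) / (hi_t - lo_t), 0.0), 1.0)
    return (mass_lo + f * (mass_hi - mass_lo),
            count_lo + f * (count_hi - count_lo))

# The ~2.7 C trajectory implied by COP-26 pledges:
mass, count = projected_loss(2.7)
print(f"~{mass:.0%} of glacier mass, ~{count:.0%} of glaciers by number")
```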

    Many processes govern how glaciers lose mass, and Rounce is working to advance how models account for different types of glaciers, including tidewater and debris-covered glaciers. “Tidewater glaciers” are glaciers that terminate in the ocean, which causes them to lose a lot of mass at this interface. “Debris-covered glaciers” are glaciers that are covered by sand, rocks, and boulders. Prior work [Geophysical Research Letters (below)] by Rounce has shown that the thickness and distribution of debris cover can have a positive or negative effect on glacial melt rates across an entire region, depending on the debris thickness. In this newest work, he found that accounting for these processes had relatively little impact on the global glacier projections, but substantial differences in mass loss were found when analyzing individual glaciers.
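    The debris effect Rounce describes is often summarized by an Østrem-style curve: a thin dusting of debris darkens the ice surface and speeds melt, while a thick layer insulates it. Below is a minimal sketch of such a curve; the functional form and constants are illustrative assumptions, not the study's parameterization.

```python
def debris_melt_factor(h_m, h_crit=0.03, h_star=0.10):
    """Sub-debris melt relative to clean ice for debris thickness h_m (metres).
    Thin debris (< h_crit) enhances melt; thicker debris insulates."""
    if h_m <= 0.0:
        return 1.0                                 # clean ice
    if h_m < h_crit:
        return 1.0 + 0.5 * h_m / h_crit            # thin, dark debris: up to 1.5x
    return 1.5 * h_star / (h_star - h_crit + h_m)  # thick debris: insulating decay

for h in (0.0, 0.02, 0.03, 0.10, 0.50):
    print(f"{h:4.2f} m debris -> {debris_melt_factor(h):.2f}x clean-ice melt")
```

    The two branches meet at h_crit, so the curve rises to a peak at thin cover and decays smoothly for thicker cover, matching the "positive or negative effect depending on debris thickness" described above.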


    David Rounce: The Response of Glaciers, Water Resources, and Hazards to Climate Change.

    His work provides better context for regional glacier modeling, and he hopes it will spur climate policymakers to set temperature change goals below the 2.7° C of warming that pledges from COP-26 are projected to produce. Smaller glacial regions like Central Europe, low latitudes like the Andes, and the upper areas of North America will be disproportionately affected by temperatures rising more than 2° C. At a 3° C rise, these glacial regions almost disappear completely.

    Rounce noted that glaciers respond slowly to changes in climate; he describes them as extremely slow-moving rivers. Cutting emissions today will not remove previously emitted greenhouse gases, nor can it instantly halt the warming they have already set in motion, meaning even a complete halt to emissions would still take between 30 and 100 years to be reflected in glacier mass-loss rates.
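    That lag can be pictured with a simple first-order relaxation model, assuming glacier mass-loss rates approach a new equilibrium with a response time of 30–100 years (the range quoted above). This is an illustration of the delay, not the study's glacier-evolution model.

```python
import math

def fraction_adjusted(years, tau):
    """Fraction of the eventual change in mass-loss rate realized after
    `years`, for a glacier with response time `tau` (both in years)."""
    return 1.0 - math.exp(-years / tau)

for tau in (30, 100):
    print(f"tau = {tau:>3} yr: {fraction_adjusted(50, tau):.0%} of the "
          "response realized 50 years after emissions change")
```

    Even under this idealized model, a glacier with a century-scale response time has realized well under half of its eventual adjustment 50 years on.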

    Science papers:
    Geophysical Research Letters 2021
    See the above science paper for instructive material with images.
    Science
    Science

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The College of Engineering is well-known for working on problems of both scientific and practical importance. Our acclaimed faculty focus on transformative results that will drive the intellectual and economic vitality of our community, nation and world. Our “maker” culture is ingrained in all that we do, leading to novel approaches and unprecedented results.

    Carnegie Mellon University is a global research university with more than 12,000 students, 95,000 alumni, and 5,000 faculty and staff.

    CMU has been a birthplace of innovation since its founding in 1900.

    Today, we are a global leader bringing groundbreaking ideas to market and creating successful startup businesses.
    Our award-winning faculty members are renowned for working closely with students to solve major scientific, technological and societal challenges. We put a strong emphasis on creating things—from art to robots. Our students are recruited by some of the world’s most innovative companies.

    We have campuses in Pittsburgh, Qatar and Silicon Valley, and degree-granting programs around the world, including Africa, Asia, Australia, Europe and Latin America.

     
  • richardmitnick 8:29 pm on January 4, 2023 Permalink | Reply
    Tags: "Turning coal mine drainage into a source of rare minerals", A novel process for lessening the negative environmental impact of coal mine drainage, , , Civil Engineering, , , Extracting rare-earth elements from coal mine drainage, Geodetic Engineering, Getting rare-earths out of the ground can cause immense environmental and social harm., Study finds way to extract metals needed for modern tech., The Ohio State team used a passive system to neutralize the coal drainage and capture the rare-earth elements., , The process captured a variety of metals used in modern technology including terbium and neodymium and europium.   

    From The Ohio State University: “Turning coal mine drainage into a source of rare minerals” 

    From The Ohio State University

    1.3.23

    Tatyana Woodall
    Ohio State News
    woodall.52@osu.edu

    1
    Coal mine drainage impairs thousands of miles of waterways in the U.S. every year, disrupting the growth of all kinds of aquatic plants and animals. Photo: Getty Images.

    Study finds way to extract metals needed for modern tech.

    A new study investigates a novel process for lessening the negative environmental impact of coal mine drainage and extracting rare-earth elements from it, precious minerals needed to manufacture many high-tech devices.

    “Rare-earth elements, like yttrium, for example, are necessary components of electronics, computers, and other gadgets that we use every day,” said Jeff Bielicki, co-author of the study and an associate professor in civil, environmental and geodetic engineering and the John Glenn College of Public Affairs at The Ohio State University.

    The study, published in the journal Environmental Engineering Science [below], assesses an experimental process patented by the team that was shown to successfully clean coal mine drainage while producing rare-earth elements in samples from various rivers across Ohio, Pennsylvania, and West Virginia.

    “One thing that surprised me was just how well our process cleans up the water,” said Bielicki. “From an environmental standpoint, the major benefit of this work is that we’re successfully trapping and neutralizing so much pollution.”

    When abandoned coal mines leak water, the subsequent drainage can pollute thousands of miles of natural waterways, turning them orange and causing serious harm to ecosystems.

    Although the rare-earth elements recoverable from coal mine drainage are in increasingly high demand, viable natural deposits of these minerals are found in only a few areas around the world, meaning that only a few countries can supply them.

    For example, much of the Western world, including the United States, relies on China to supply about 80% of these critical resources. As a result, many government agencies seek to reduce this dependence by establishing a domestic supply of rare-earth elements, especially because getting them out of the ground can cause immense environmental and social harm, Bielicki said.

    “By sourcing these materials from other countries, we don’t really have any oversight of the environmental consequences of how they’re mining and producing the materials,” he said. “Domestic production is good in a variety of ways, in part because we can have regulations that better protect the environment and the people in the communities from where we get them.”

    Currently, coal mine drainage is treated using active treatment systems, which employ chemicals to clean the water, or passive treatment systems, which often depend on bacterial activity or geochemical methods.

    According to the study, passive approaches tend to require fewer resources and have fewer environmental impacts. The Ohio State team used a passive system employing a combination of alkaline industrial byproducts, including materials like water treatment plant sludge, to neutralize the coal drainage and capture the rare-earth elements.

    “It’s designed to let the natural seepage of coal mine drainage percolate through the material to trap and extract it,” said Bielicki. The average time it takes to rid water of waste often varies, because the process largely depends on how quickly water flows out from the mine.

    The process captured a variety of metals used in modern technology, including terbium, neodymium and europium, which play critical roles in phone displays, batteries, microphones, speakers and other parts.
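    As a back-of-the-envelope illustration of what such a passive bed might intercept, the recoverable mass scales simply with drainage flow and dissolved concentration. The flow rate, concentration, and capture efficiency below are assumed values for illustration, not figures from the study.

```python
def annual_ree_capture_kg(flow_l_per_min, ree_ug_per_l, capture_efficiency):
    """Rare-earth mass captured per year (kg) for a mine seep with the given
    drainage flow (L/min), total dissolved REE concentration (ug/L), and the
    fraction of dissolved REEs the treatment bed traps."""
    litres_per_year = flow_l_per_min * 60 * 24 * 365
    return litres_per_year * ree_ug_per_l * capture_efficiency * 1e-9  # ug -> kg

# e.g. a 100 L/min seep at 500 ug/L total REE with 90% capture:
print(f"{annual_ree_capture_kg(100, 500, 0.9):.1f} kg of REEs per year")
```

    Even at trace concentrations, a steady seep processes tens of millions of litres a year, which is why a passive bed can accumulate meaningful amounts of these metals.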

    The process currently costs more than the market price of the recovered metals, but further advances will bring the cost down, Bielicki said.

    Bielicki said he hopes their research will inform future policy surrounding coal waste disposal and help the public to examine the environmental repercussions of mining outside of typical costs, like its impact on human health and the ecosystem at large.

    “Nothing we do to our environment is benign, so while shifting away from coal and other fossil fuels is beneficial in several different dimensions, we need to effect these transitions in ways that address a larger sphere of issues than just cost,” he said. “Our research is a vital step in addressing the legacies of those environmental and social consequences.”

    Other Ohio State co-authors of the study were Marcos Miranda, Soomin Chun, and Chin-Min Cheng. Other members of the team include Ohio State professors John Lenhart and Tarunjit Butalia. This work was supported by the Environmental Research Education Foundation and the U.S. Department of Energy.

    Science paper:
    Environmental Engineering Science
    See the science paper for instructive material with images.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Ohio State University is a public research university in Columbus, Ohio. Founded in 1870 as a land-grant university and the ninth university in Ohio with the Morrill Act of 1862, the university was originally known as the Ohio Agricultural and Mechanical College.

    Ohio State has been ranked by major institutional rankings among the best public universities in the United States. Originally focused on various agricultural and mechanical disciplines, it developed into a comprehensive university under the direction of then-Governor and later U.S. President Rutherford B. Hayes, and in 1878, the Ohio General Assembly passed a law changing the name to “The Ohio State University” and broadening the scope of the university. Admissions tightened and became far more selective throughout the 2000s and 2010s.

    Ohio State’s political science department and faculty have greatly contributed to the construction and development of the constructivist and realist schools of international relations; a 2004 LSE study ranked the program as first among public institutions and fourth overall in the world. A member of the Association of American Universities since 1916, Ohio State is a leading producer of Fulbright Scholars, and is the only school in North America that offers an Accreditation Board for Engineering and Technology, Inc-accredited undergraduate degree in welding engineering. The university’s endowment of $6.8 billion in 2021 is among the largest in the world. Past and present alumni and faculty include five Nobel Prize laureates, nine Rhodes Scholars, seven Churchill Scholars, one Fields Medalist, seven Pulitzer Prize winners, 64 Goldwater scholars, six U.S. Senators, 15 U.S. Representatives, and 108 Olympic medalists. It is classified among “R1: Doctoral Universities – Very high research activity.” As of 2021, Ohio State has the most students in the 95th percentile or above on standardized testing of any public university in the United States.

    The university has an extensive student life program, with over 1,000 student organizations; intercollegiate, club and recreational sports programs; student media organizations and publications; fraternities and sororities; and three student governments. Its athletic teams compete in Division I of the NCAA and are known as the Ohio State Buckeyes, and it’s a member of the Big Ten Conference for the majority of its sports. The school’s football program has had great success and is one of the major programs of college football; their rivalry against the University of Michigan has been termed one of the greatest in North American sports. As of 2017, Ohio State’s football program is valued at $1.5 billion, the highest valuation of any such program in the country. The main campus in Columbus has grown into the third-largest university campus in the United States, with nearly 50,000 undergraduate students and nearly 15,000 graduate students.

    In 1906, Ohio State President William Oxley Thompson, along with the university’s supporters in the state legislature, put forth the Lybarger Bill with the aim of shifting virtually all higher education support to the continued development of Ohio State while funding only the “normal school” functions of the state’s other public universities. Although the Lybarger Bill failed narrowly to gain passage, in its place the Eagleson Bill was passed as a compromise, which determined that all doctoral education and research functions would be the role of Ohio State, and that Miami University and Ohio University would not offer instruction beyond the master’s degree level – an agreement that would remain in place until the 1950s.

    With the onset of the Great Depression, Ohio State would face many of the challenges affecting universities throughout America as budget support was slashed, and students without the means of paying tuition returned home to support families. By the mid-1930s, however, enrollment had stabilized due in large part to the role of the Federal Emergency Relief Administration and later the National Youth Administration. By the end of the decade, enrollment had still managed to grow to over 17,500. In 1934, the Ohio State Research Foundation was founded to bring in outside funding for faculty research projects. In 1938, a development office was opened to begin raising funds privately to offset reductions in state support.

    In 1952, Ohio State founded the interdisciplinary Mershon Center for International Security Studies, which it still houses. The work of this program led to the United States Department of Homeland Security basing the National Academic Consortium for Homeland Security at the university in 2003.

    The Ohio State University and the University of Michigan football programs participated in The Ten Year War between 1969 and 1978. In consistently close matches, it pitted coaches Woody Hayes of Ohio State and Bo Schembechler of Michigan against each other. This heated era led to the persistent Michigan–Ohio State football rivalry.

    Ohio State had an open admissions policy until the late 1980s; particularly since the early 2000s, the college has greatly raised standards for admission, and it has been increasingly cited as one of the best public universities in the United States. As of 2021, it has by far the most students in the 95th percentile or above on the ACT and SAT of any public university in the country. The trend began under former university administrator William Kirwan in 1998, who set out to greatly increase the quality of applicants and make Ohio State an elite academic institution.

    Michael V. Drake, former chancellor of the University of California-Irvine, became the 15th president of the Ohio State University on June 30, 2014. He announced on November 21, 2019, that he would retire at the end of the 2019–2020 academic year. In 2019, Ohio State filed for trademark protection of “the” when it is used to refer to Ohio State; the application was denied. On June 3, 2020, the Ohio State Board of Trustees appointed Kristina M. Johnson, the former chancellor of The State University of New York, as the 16th president of the Ohio State University.

    On June 22, 2022, the United States Patent and Trademark Office granted the university a trademark on the word “the” in relation to clothing, such as T-shirts, baseball caps and hats distributed and/or sold through athletic or collegiate channels. Ohio State and its fans, in particular those of its athletics program, frequently emphasize the word “THE” when referring to the school.

    The Public Ivies: America’s Flagship Public Universities (2000) by Howard and Matthew Greene listed Ohio State as one of a select number of public universities offering the highest educational quality. In its 2021 edition, U.S. News & World Report ranked Ohio State as tied for the 17th-best public university in the United States, and tied for 53rd among all national universities. They ranked the college’s political science, audiology, sociology, speech–language pathology, finance, accounting, public affairs, nursing, social work, healthcare administration and pharmacy programs as among the top 20 programs in the country. The Academic Ranking of World Universities placed Ohio State 42–56 nationally and 101–150 globally for 2020. In its 2021 rankings, Times Higher Education World University Rankings ranked it tied for 80th in the world. In 2021, QS World University Rankings ranked the university 108th in the world. The Washington Monthly college rankings, which seek to evaluate colleges’ contributions to American society based on factors of social mobility, research and service to the country by their graduates, placed Ohio State 98th among national universities in 2020.

    In 1916, Ohio State became the first university in Ohio to be extended membership into the Association of American Universities, and remains the only public university in Ohio among the organization’s 60 members. Ohio State is also the only public university in Ohio to be classified among “R1: Doctoral Universities – Highest Research Activity” and have its undergraduate admissions classified as “more selective.”

    Ohio State’s political science program is ranked among the top programs globally. Considered to be one of the leading departments in the United States, it has played a particularly significant role in the construction and development of the constructivist and realist schools of international relations. Notable political scientists who have worked at the university include Alexander Wendt, John Mueller, Randall Schweller, Gene Sharp and Herb Asher. In 2004, it was ranked as first among public institutions and fourth overall in the world by British political scientist Simon Hix at the London School of Economics and Political Science, while a 2007 study in the academic journal PS: Political Science & Politics ranked it ninth in the United States. It is a leading producer of Fulbright Scholars.

    Bloomberg Businessweek ranked the undergraduate business program at Ohio State’s Fisher College of Business as the 14th best in the nation in its 2016 rankings. U.S. News & World Report ranks the MBA program tied for 30th in America. Fisher’s Executive MBA program was ranked third nationally for return on investment by The Wall Street Journal in 2008, citing a 170 percent return on an average of $66,900 invested in tuition and expenses during the 18-month program.

    The Ohio State linguistics department was recently ranked among the top 10 programs nationally, and top 20 internationally by QS World University Rankings.

    Ohio State’s research expenditures for the 2019 fiscal year were $968.3 million. The university is among the top 12 U.S. public research universities and third among all universities in industry-sponsored research (National Science Foundation). It is also named as one of the most innovative universities in the nation (U.S. News & World Report) and in the world (Reuters). In a 2007 report released by the National Science Foundation, Ohio State’s research expenditures for 2006 were $652 million, placing it seventh among public universities and 11th overall, also ranking third among all American universities for private industry-sponsored research. Research expenditures at Ohio State were $864 million in 2017. In 2006, Ohio State announced it would designate at least $110 million of its research efforts toward what it termed “fundamental concerns” such as research toward a cure for cancer, renewable energy sources and sustainable drinking water supplies. In 2021, President Kristina M. Johnson announced the university would invest at least $750 million over the next 10 years toward research and researchers. This was announced in conjunction with Ohio State’s new Innovation District, which will be an interdisciplinary research facility and act as a hub for healthcare and technology research, serving Ohio State faculty and students as well as public and private partners. Construction is expected to be completed in 2023.

    Research facilities include Aeronautical/Astronautical Research Laboratory, Byrd Polar Research Center, Center for Automotive Research (OSU CAR), Chadwick Arboretum, Biomedical Research Tower, Biological Sciences Building, CDME, Comprehensive Cancer Center, David Heart and Lung Research Institute, Electroscience Laboratory, Large Binocular Telescope (LBT, originally named the Columbus Project), Mershon Center for International Security Studies, Museum of Biological Diversity, National Center for the Middle Market, Stone Laboratory on Gibraltar Island, OH, Center for Urban and Regional Analysis and Ohio Agricultural Research and Development Center.

    Ohio State’s faculty currently includes 21 members of the National Academy of Sciences or National Academy of Engineering, four members of the Institute of Medicine and 177 elected fellows of the American Association for the Advancement of Science. In 2009, 17 Ohio State faculty members were elected as AAAS Fellows. Each year since 2002, Ohio State has either led or been second among all American universities in the number of their faculty members elected as fellows to the AAAS.

    In surveys conducted in 2005 and 2006 by the Collaborative on Academic Careers in Higher Education (COACHE), Ohio State was rated as “exemplary” in four of the seven measured aspects of workplace satisfaction for junior faculty members at 31 universities: overall tenure practices, policy effectiveness, compensation and work-family balance.

    In the last quarter century, 32 Ohio State faculty members have received the Guggenheim Fellowship, more than all other public and private Ohio universities combined. In 2008, three Ohio State faculty members were awarded Guggenheim Fellowships, placing Ohio State among the top 15 universities in the United States. Since the 2000–2001 award year, 55 Ohio State faculty members have been named as Fulbright Fellows, the most of any Ohio university.

     
  • richardmitnick 11:06 am on December 30, 2022 Permalink | Reply
    Tags: "EQSIM Shakes up Earthquake Research at the Exascale Level", , , Civil Engineering, , , , ECP has gone from simulating the model at 2–2.5 Hz at the start of this project to simulating more than 300 billion grid points at 10 Hz which is a huge computational lift., , , , Researchers have been applying high-performance computing to model site specific motions and better understand what forces a structure is subjected to during a seismic event., Scientists want to reduce the uncertainty in earthquake ground motions and how a structure is going to respond to earthquakes., , , The challenge is that tremendous computer horsepower is required to model seismicity. Fortunately the emergence of exascale computing has changed the equation., , , The earth is very heterogeneous and the geology is very complicated., The excitement of ECP is that we now have new exascale computers that can do a billion billion calculations per second with a tremendous volume of memory., The prediction of future earthquakes at a specific site is a challenging problem because the processes associated with earthquakes and the response of structures is very complicated., The whole goal with EQSIM was to advance the state of modeling all the way from the fault rupture to the waves propagating through the earth to the waves interacting with the structure., When the earthquake fault ruptures it releases energy in a very complex way and that energy manifests and propagates as seismic waves through the earth.   

    From The DOE’s Lawrence Berkeley National Laboratory Via The DOE’s Exascale Computing Project: “EQSIM Shakes up Earthquake Research at the Exascale Level” 

    From The DOE’s Lawrence Berkeley National Laboratory

    Via

    The DOE’s Exascale Computing Project

    12.7.22
    Kathy Kincade | The DOE’s Lawrence Berkeley National Laboratory

    Since 2017, EQSIM—one of several projects supported by the DOE’s Exascale Computing Project (ECP)—has been breaking new ground in efforts to understand how seismic activity affects the structural integrity of buildings and infrastructure. While small-scale models and historical observations are helpful, they only scratch the surface of quantifying a geological event as powerful and far-reaching as a major earthquake.

    EQSIM bridges this gap by using physics-based supercomputer simulations to predict the ramifications of an earthquake on buildings and infrastructure and create synthetic earthquake records that can provide much larger analytical datasets than historical, single-event records.

    To accomplish this, however, has presented a number of challenges, noted EQSIM principal investigator David McCallen, a senior scientist in Lawrence Berkeley National Laboratory’s Earth and Environmental Sciences Area and director of the Center for Civil Engineering Earthquake Research at the University of Nevada Reno.

    1
    David McCallen is a senior scientist in Lawrence Berkeley National Laboratory’s Earth and Environmental Sciences Area, director of the Center for Civil Engineering Earthquake Research at the University of Nevada Reno, and principal investigator of ECP’s EQSIM project.

    “The prediction of future earthquake motions that will occur at a specific site is a challenging problem because the processes associated with earthquakes and the response of structures is very complicated,” he said. “When the earthquake fault ruptures, it releases energy in a very complex way, and that energy manifests and propagates as seismic waves through the earth. In addition, the earth is very heterogeneous and the geology is very complicated. So when those waves arrive at the site or piece of infrastructure you are concerned with, they interact with that infrastructure in a very complicated way.”

    Over the last decade-plus, researchers have been applying high-performance computing to model these processes to more accurately predict site-specific motions and better understand what forces a structure is subjected to during a seismic event.

    “The challenge is that tremendous computer horsepower is required to do this,” McCallen said. “It‘s hard to simulate ground motions at a frequency content that is relevant to engineered structures. It takes super-big models that run very efficiently. So, it’s been very challenging computationally, and for some time we didn’t have the computational horsepower to do that and extrapolate to that.”

    Fortunately, the emergence of exascale computing has changed the equation.

    “The excitement of ECP is that we now have these new computers that can do a billion billion calculations per second with a tremendous volume of memory, and for the first time we are on the threshold of being able to solve, with physics-based models, this very complex problem,” McCallen said. “So our whole goal with EQSIM was to advance the state of computational capabilities so we could model all the way from the fault rupture to the waves propagating through the earth to the waves interacting with the structure—with the idea that ultimately we want to reduce the uncertainty in earthquake ground motions and how a structure is going to respond to earthquakes.”

    A Team Effort

    Over the last 5 years, using both the Cori [below] and Perlmutter [below] supercomputers at The DOE’s Lawrence Berkeley National Laboratory and the Summit system at The DOE’s Oak Ridge National Laboratory, the EQSIM team has focused primarily on modeling earthquake scenarios in the San Francisco Bay Area.

    These supercomputing resources helped them create a detailed, regional-scale model that includes all of the necessary geophysics modeling features, such as 3D geology, earth surface topography, material attenuation, nonreflecting boundaries, and fault rupture.

    “We’ve gone from simulating this model at 2–2.5 Hz at the start of this project to simulating more than 300 billion grid points at 10 Hz, which is a huge computational lift,” McCallen said.
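    A back-of-envelope calculation — our own round numbers, not figures from the project — shows why a 4x jump in resolved frequency is such a computational lift: grid spacing must shrink in proportion to the shortest wavelength, so the 3-D grid-point count grows with the cube of frequency, and the stable timestep shrinks linearly on top of that.

```python
# Rough scaling argument (illustrative, not EQSIM's actual accounting):
# resolving frequency f requires grid spacing ~ velocity / f, so in 3-D
# the point count grows as f**3 and total work as roughly f**4.
f_old, f_new = 2.5, 10.0
ratio = f_new / f_old            # 4x higher resolved frequency
points_factor = ratio ** 3       # ~64x more grid points
work_factor = ratio ** 4         # ~256x more total work (more steps too)
print(points_factor, work_factor)
```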

    Other notable achievements of this ECP project include:

    Making important advances to the SW4 geophysics code, including how it is coupled to local engineering models of the soil and structure system.
    Developing a schema for handling the huge datasets used in these models. “For a single earthquake we are running 272 TB of data, so you have to have a strategy for storing, visualizing, and exploiting that data,” McCallen said.
    Developing a visualization tool that allows very efficient browsing of this data.

    “The development of the computational workflow and how everything fits together is one of our biggest achievements, starting with the initiation of the earthquake fault structure all the way through to the response of the engineered system,” McCallen said. “We are solving one high-level problem but also a whole suite of lower-level challenges to make this work. The ability to envision, implement, and optimize that workflow has been absolutely essential.”

    None of this could have happened without the contributions of multiple partners across a spectrum of science, engineering, and mathematics, he emphasized. Earthquake engineers, seismologists, computer scientists, and applied mathematicians from Berkeley Lab and The DOE’s Lawrence Livermore National Laboratory formed the multidisciplinary, closely integrated team necessary to address the computational challenges.

    “This is an inherently multidisciplinary problem,” McCallen said. “You are starting with the way a fault ruptures and the way waves propagate through the earth, and that is the domain of a seismologist. Then those waves are arriving at a site where you have a structure founded on soil, so it transforms into a geotechnical engineering and structural engineering problem.”

    It doesn’t stop there, he added. “You absolutely need this melding of people who have the scientific and engineering domain knowledge, but they are enabled by the applied mathematicians who can develop really fast and efficient algorithms and the computer scientists who know how to program and optimally parallelize and handle all the I/O on these really big problems.”

    Looking ahead, the EQSIM team is already involved in another DOE project with an office that deals with energy systems. Their goal is to transition and leverage everything they’ve done through the ECP program to look at earthquake effects on distributed energy systems.

    This new project involves applying these same capabilities to programs within the DOE Office of Cybersecurity, Energy Security, and Emergency Response, which is concerned with the integrity of energy systems in the United States. The team is also working to make its large earthquake datasets available as open-access to both the research community and practicing engineers.

    “That is common practice for historical measured earthquake records, and we want to do that with synthetic earthquake records that give you a lot more data because you have motions everywhere, not just locations where you had an instrument measuring an earthquake,” McCallen said.

    Being involved with ECP has been a key boost to this work, he added, enabling EQSIM to push the envelope of computing performance.

    “We have extended the ability of doing these direct, high-frequency simulations a tremendous amount,” he said. “We have a plot that shows the increase in performance and capability, and it has gone up orders of magnitude, which is really important because we need to run really big problems really, really fast. So that, coupled with the exascale hardware, has really made a difference. We’re doing things now that we only thought about doing a decade ago, like resolving high-frequency ground motions. It is really an exciting time for those of us who are working on simulating earthquakes.”

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About The DOE’s Exascale Computing Project
    The ECP is a collaborative effort of two DOE organizations – the DOE’s Office of Science and the DOE’s National Nuclear Security Administration. As part of the National Strategic Computing initiative, ECP was established to accelerate delivery of a capable exascale ecosystem, encompassing applications, system software, hardware technologies and architectures, and workforce development to meet the scientific and national security mission needs of DOE in the early-2020s time frame.

    About The Office of Science

    The DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit https://science.energy.gov/.

    About The NNSA

    Established by Congress in 2000, NNSA is a semi-autonomous agency within the DOE responsible for enhancing national security through the military application of nuclear science. NNSA maintains and enhances the safety, security, and effectiveness of the U.S. nuclear weapons stockpile without nuclear explosive testing; works to reduce the global danger from weapons of mass destruction; provides the U.S. Navy with safe and effective nuclear propulsion; and responds to nuclear and radiological emergencies in the United States and abroad. https://nnsa.energy.gov

    The Goal of ECP’s Application Development focus area is to deliver a broad array of comprehensive science-based computational applications that effectively utilize exascale HPC technology to provide breakthrough simulation and data analytic solutions for scientific discovery, energy assurance, economic competitiveness, health enhancement, and national security.

    Awareness of ECP and its mission is growing and resonating—and for good reason. ECP is an incredible effort focused on advancing areas of key importance to our country: economic competitiveness, breakthrough science and technology, and national security. And, fortunately, ECP has a foundation that bodes extremely well for the prospects of its success, with the demonstrably strong commitment of the US Department of Energy (DOE) and the talent of some of America’s best and brightest researchers.

    ECP is composed of about 100 small teams of domain, computer, and computational scientists, and mathematicians from DOE labs, universities, and industry. We are tasked with building applications that will execute well on exascale systems, enabled by a robust exascale software stack, and supporting necessary vendor R&D to ensure the compute nodes and hardware infrastructure are adept and able to do the science that needs to be done with the first exascale platforms.

    LBNL campus

    Bringing Science Solutions to the World

    In the world of science, The Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of The National Academy of Sciences, one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to The National Academy of Engineering, and three of our scientists have been elected into The Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by The DOE through its Office of Science. It is managed by the University of California and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above The University of California-Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a University of California-Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    History

    1931–1941

    The laboratory was founded on August 26, 1931, by Ernest Lawrence, as the Radiation Laboratory of the University of California-Berkeley, associated with the Physics Department. It centered physics research around his new instrument, the cyclotron, a type of particle accelerator for which he was awarded the Nobel Prize in Physics in 1939.

    LBNL 88 inch cyclotron.

    Throughout the 1930s, Lawrence pushed to create larger and larger machines for physics research, courting private philanthropists for funding. He was the first to develop a large team to build big projects to make discoveries in basic research. Eventually these machines grew too large to be held on the university grounds, and in 1940 the lab moved to its current site atop the hill above campus. Part of the team put together during this period includes two other young scientists who went on to establish large laboratories; J. Robert Oppenheimer founded The DOE’s Los Alamos Laboratory, and Robert Wilson founded The DOE’s Fermi National Accelerator Laboratory.

    1942–1950

    Leslie Groves visited Lawrence’s Radiation Laboratory in late 1942 as he was organizing the Manhattan Project, meeting J. Robert Oppenheimer for the first time. Oppenheimer was tasked with organizing the nuclear bomb development effort and founded today’s Los Alamos National Laboratory to help keep the work secret. At the RadLab, Lawrence and his colleagues developed the technique of electromagnetic enrichment of uranium using their experience with cyclotrons. The “calutrons” (named after the University) became the basic unit of the massive Y-12 facility in Oak Ridge, Tennessee. Lawrence’s lab helped contribute to what have been judged to be the three most valuable technology developments of the war (the atomic bomb, proximity fuse, and radar). The cyclotron, whose construction was stalled during the war, was finished in November 1946. The Manhattan Project shut down two months later.

    1951–2018

    After the war, the Radiation Laboratory became one of the first laboratories to be incorporated into the Atomic Energy Commission (AEC), now The Department of Energy. The most highly classified work remained at Los Alamos, but the RadLab remained involved. Edward Teller suggested setting up a second lab similar to Los Alamos to compete with their designs. This led to the creation of an offshoot of the RadLab (now The DOE’s Lawrence Livermore National Laboratory) in 1952. Some of the RadLab’s work was transferred to the new lab, but some classified research continued at Berkeley Lab until the 1970s, when it became a laboratory dedicated only to unclassified scientific research.

    Shortly after the death of Lawrence in August 1958, the UC Radiation Laboratory (both branches) was renamed the Lawrence Radiation Laboratory. The Berkeley location became the Lawrence Berkeley Laboratory in 1971, although many continued to call it the RadLab. Gradually, another shortened form came into common usage, LBNL. Its formal name was amended to Ernest Orlando Lawrence Berkeley National Laboratory in 1995, when “National” was added to the names of all DOE labs. “Ernest Orlando” was later dropped to shorten the name. Today, the lab is commonly referred to as “Berkeley Lab”.

    The Alvarez Physics Memos are a set of informal working papers of the large group of physicists, engineers, computer programmers, and technicians led by Luis W. Alvarez from the early 1950s until his death in 1988. Over 1700 memos are available on-line, hosted by the Laboratory.

    The lab remains owned by the Department of Energy, with management from the University of California. Companies such as Intel have funded the lab’s research into computing chips.

    Science mission

    From the 1950s through the present, Berkeley Lab has maintained its status as a major international center for physics research, and has also diversified its research program into almost every realm of scientific investigation. Its mission is to solve the most pressing and profound scientific problems facing humanity, conduct basic research for a secure energy future, understand living systems to improve the environment, health, and energy supply, understand matter and energy in the universe, build and safely operate leading scientific facilities for the nation, and train the next generation of scientists and engineers.

    The Laboratory’s 20 scientific divisions are organized within six areas of research: Computing Sciences; Physical Sciences; Earth and Environmental Sciences; Biosciences; Energy Sciences; and Energy Technologies. Berkeley Lab has six main science thrusts: advancing integrated fundamental energy science; integrative biological and environmental system science; advanced computing for science impact; discovering the fundamental properties of matter and energy; accelerators for the future; and developing energy technology innovations for a sustainable future.

    Berkeley Lab operates five major National User Facilities for the DOE Office of Science:

    The Advanced Light Source (ALS) is a synchrotron light source with 41 beam lines providing ultraviolet, soft x-ray, and hard x-ray light to scientific experiments.

    The DOE’s Lawrence Berkeley National Laboratory Advanced Light Source.
    The ALS is one of the world’s brightest sources of soft x-rays, which are used to characterize the electronic structure of matter and to reveal microscopic structures with elemental and chemical specificity. About 2,500 scientist-users carry out research at ALS every year. Berkeley Lab is proposing an upgrade of ALS which would increase the coherent flux of soft x-rays by two to three orders of magnitude.

    Berkeley Lab Laser Accelerator (BELLA) Center

    The DOE Joint Genome Institute supports genomic research in support of the DOE missions in alternative energy, global carbon cycling, and environmental management. The JGI’s partner laboratories are Berkeley Lab, DOE’s Lawrence Livermore National Laboratory, DOE’s Oak Ridge National Laboratory (ORNL), DOE’s Pacific Northwest National Laboratory (PNNL), and the HudsonAlpha Institute for Biotechnology. The JGI’s central role is the development of a diversity of large-scale experimental and computational capabilities to link sequence to biological insights relevant to energy and environmental research. Approximately 1,200 scientist-users take advantage of JGI’s capabilities for their research every year.

    LBNL Molecular Foundry

    The LBNL Molecular Foundry is a multidisciplinary nanoscience research facility. Its seven research facilities focus on Imaging and Manipulation of Nanostructures; Nanofabrication; Theory of Nanostructured Materials; Inorganic Nanostructures; Biological Nanostructures; Organic and Macromolecular Synthesis; and Electron Microscopy. Approximately 700 scientist-users make use of these facilities in their research every year.

    The DOE’s NERSC National Energy Research Scientific Computing Center is the scientific computing facility that provides large-scale computing for the DOE’s unclassified research programs. Its current systems provide over 3 billion computational hours annually. NERSC supports 6,000 scientific users from universities, national laboratories, and industry.

    DOE’s NERSC National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory.

    Cray Cori II supercomputer at National Energy Research Scientific Computing Center at DOE’s Lawrence Berkeley National Laboratory, named after Gerty Cori, the first American woman to win a Nobel Prize in science.

    NERSC Hopper Cray XE6 supercomputer.

    NERSC Cray XC30 Edison supercomputer.

    NERSC GPFS for Life Sciences.

    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    NERSC PDSF computer cluster in 2003.

    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    Cray Shasta Perlmutter SC18 AMD Epyc Nvidia pre-exascale supercomputer.

    NERSC is a DOE Office of Science User Facility.

    The DOE’s Energy Sciences Network (ESnet) is a high-speed network infrastructure optimized for very large scientific data flows. ESnet provides connectivity for all major DOE sites and facilities, and the network transports roughly 35 petabytes of traffic each month.
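    For scale, a rough conversion — ours, not an official ESnet figure — turns 35 petabytes per month into a sustained average data rate:

```python
# Convert ~35 PB/month of network traffic into an average bit rate.
# Assumes decimal petabytes and a 30-day month; purely illustrative.
PB = 1e15
seconds_per_month = 30 * 24 * 3600
avg_bits_per_s = 35 * PB * 8 / seconds_per_month
print(round(avg_bits_per_s / 1e9), "Gbit/s average")
```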

    Berkeley Lab is the lead partner in the DOE’s Joint Bioenergy Institute (JBEI), located in Emeryville, California. Other partners are the DOE’s Sandia National Laboratory, the University of California (UC) campuses of Berkeley and Davis, the Carnegie Institution for Science, and DOE’s Lawrence Livermore National Laboratory (LLNL). JBEI’s primary scientific mission is to advance the development of the next generation of biofuels – liquid fuels derived from the solar energy stored in plant biomass. JBEI is one of three new U.S. Department of Energy (DOE) Bioenergy Research Centers (BRCs).

    Berkeley Lab has a major role in two DOE Energy Innovation Hubs. The mission of the Joint Center for Artificial Photosynthesis (JCAP) is to find a cost-effective method to produce fuels using only sunlight, water, and carbon dioxide. The lead institution for JCAP is the California Institute of Technology and Berkeley Lab is the second institutional center. The mission of the Joint Center for Energy Storage Research (JCESR) is to create next-generation battery technologies that will transform transportation and the electricity grid. DOE’s Argonne National Laboratory leads JCESR and Berkeley Lab is a major partner.

     
  • richardmitnick 10:21 am on December 22, 2022 Permalink | Reply
    Tags: "Efforts to Hold Up The Leaning Tower of Pisa Are Going Better Than Expected", , Civil Engineering, Considering it is an 850-year-old patient with a tilt of around five meters and a subsidence of over three meters the state of health of the Leaning Tower of Pisa is excellent., Construction of the Tower of Pisa began in 1174 and within a few years – following construction of its first few tiers in fact – it was obvious something was wrong., , In 2013 researchers from Australia's national science agency-CSIRO-also mapped every nook and cranny of the tower using 3D scanners creating some ghostly digital reconstructions of the tower., Italy's Leaning Tower of Pisa has weathered four earthquakes., , The decade-long stabilization project was eventually completed in 2001 after which the tower had straightened up some 40 centimeters and now its tilt is just shy of 4 degrees., The same soft soils beneath the tower's foundation that produced its characteristic lean might actually now offer some protection from earthquakes., The tower has crept upright by about 4 centimeters (1.6 inches) in the 21 years since the last stabilization works were done., The tower’s shallow foundations were constructed on an unstable base of mud and sand and clay that was softer on the southern side.   

    From “Science Alert (AU)” : “Efforts to Hold Up The Leaning Tower of Pisa Are Going Better Than Expected” 

    ScienceAlert

    From “Science Alert (AU)”

    12.22.22
    Clare Watson

    (1001Love/Getty Images)

    Take one look at Italy’s Leaning Tower of Pisa and a single question springs to mind: just how close is it to toppling right over?

    For decades going on centuries, engineers, historians, and onlookers have held their collective breath at the fate of the iconic bell tower, which has weathered four earthquakes and swayed back and forth, yet somehow still stands with that eponymous lean.

    It’s not without some clever intervention that the tower has avoided a date with gravity. In fact, before it was even finished engineers battled to return the structure to an upright position.

    We can all now breathe a sigh of relief, thanks to the latest survey of the bell tower, which found that its health is much better than forecasted. The tower has crept upright by about 4 centimeters (1.6 inches) in the 21 years since the last stabilization works were done.

    The survey was conducted by a team of geotechnical engineers and funded by Opera Primaziale Pisana (OPA), a non-profit organization established to oversee construction works to preserve the historic site.

    “Considering it is an 850-year-old patient with a tilt of around five meters and a subsidence of over three meters, the state of health of the Leaning Tower of Pisa is excellent,” an OPA spokesperson told Italy’s national news agency ANSA earlier this month.

    Construction of the Tower of Pisa began in 1174, and within a few years – following construction of its first few tiers in fact – it was obvious something was wrong. Its shallow foundations were constructed on an unstable base of mud, sand, and clay that was softer on the southern side.

    Engineers tried to correct the lean as they went, making upper floors taller on one side than the other, resulting in what you could say is a marvelous building that’s curved as well as lopsided.

    Over many years, with its tilt increasing, engineers attempted to shore up the eight-storey tower, sometimes making the problem worse. By the 1990s, the Tower of Pisa was no closer to solid ground, tilting 5.5 degrees to the south, just beyond the point at which engineers thought the tower would collapse.

    Laser scan of the Tower of Pisa. (CyArk/Wikimedia Commons/CC BY-SA 4.0)

    Shortly thereafter, the tower was closed to the public and the Italian government enlisted a group of experts, chaired by civil engineer Michele Jamiolkowski, to work out how to save it. They thought about injecting cement beneath the tower but decided that was too risky and instead tried anchoring the north side down with 900 tons (816 metric tonnes) of lead weights to counterbalance the sunken south.

    When that didn’t work, they excavated soil from beneath the tower’s north side. Slowly, it began to rise up – and rotate. Anyone who has played a gravity-defying game of Jenga would know how nerve-wracking that would be.


    Stabilizing the Leaning Tower of Pisa.

    The decade-long stabilization project was eventually completed in 2001, after which the tower had straightened up some 40 centimeters and now its tilt is just shy of 4 degrees – still twice as much as the building’s original lean when construction finished in 1350.
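    The angular and linear figures quoted above can be related with simple trigonometry: the horizontal overhang is roughly height × sin(tilt). The tower height below is our assumption for illustration, not a number from the article.

```python
# Overhang implied by a given tilt, assuming a ~56 m tall tower
# (the height is an illustrative assumption, not from the article).
import math

height_m = 56.0
for tilt_deg in (5.5, 4.0):      # roughly before and after stabilization
    overhang = height_m * math.sin(math.radians(tilt_deg))
    print(tilt_deg, "deg ->", round(overhang, 1), "m overhang")
```

    Since the tower is curved as well as tilted, published displacement figures depend on how they are measured; this is only a geometric sketch.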

    In 2013, researchers from Australia’s national science agency, CSIRO, also mapped every nook and cranny of the tower using 3D scanners, creating some ghostly digital reconstructions of the tower that could be used should the building ever need repairing.

    Topple, hopefully, it will not. The tower now sways so slightly, oscillating on average around half a millimeter a year, according to Nunziante Squeglia, a professor of geotechnics at the University of Pisa who is part of the monitoring group.

    “Although what counts the most is the stability of the bell tower, which is better than expected,” Squeglia told ANSA.

    In a country steeped in antiquity, Italy’s Leaning Tower of Pisa is not the only historic landmark under close inspection for fear of collapse. Scientists have for centuries been eyeing cracks in the ankles of Michelangelo’s David that could bring down the world’s most perfect statue, an effort which ramped up after a 2014 paper found that a slight tilt of 5 degrees has already caused damage and could eventually lead to catastrophic failure. Earthquakes the same year didn’t help allay tensions.

    While the fate of David is precarious, thankfully the Leaning Tower of Pisa should be secure for at least the next 300 years and maybe more, experts say. Some engineers even think that the restoration efforts could be so successful that the infamous tower may one day right itself.

    Ironically, research shows that the same soft soils beneath the tower’s foundation that produced its characteristic lean might actually now offer some protection from earthquakes, giving the structure a longer, less destructive natural vibration period if rocked.
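    This period-lengthening effect can be sketched with the textbook single-degree-of-freedom formula T = 2π√(m/k): a softer foundation lowers the effective stiffness k, lengthening the natural period. Every number below is invented for illustration — neither the mass nor the stiffness values are measurements of the actual tower.

```python
# Natural period of a single-degree-of-freedom oscillator, T = 2*pi*sqrt(m/k).
# Softer soil -> lower effective stiffness -> longer period, which tends to
# move a structure away from the short-period energy of many earthquakes.
import math

def natural_period(mass_kg, stiffness_n_per_m):
    return 2.0 * math.pi * math.sqrt(mass_kg / stiffness_n_per_m)

mass = 14.5e6                           # hypothetical mass, kg
t_stiff = natural_period(mass, 2.0e9)   # hypothetical stiff-soil stiffness
t_soft = natural_period(mass, 2.0e8)    # hypothetical soft-soil stiffness
print(round(t_stiff, 2), round(t_soft, 2))
```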

    So perhaps a better question to ask ourselves is: will the Leaning Tower of Pisa ever stand up straight? And what then will become of its name?

    See the full article here.


     
  • richardmitnick 9:02 pm on December 13, 2022 Permalink | Reply
    Tags: "Go ahead - Tell Gaurav Sant his ideas won’t work", , , Civil Engineering, , , , , ,   

    From The Henry Samueli School of Engineering and Applied Science At The University of California-Los Angeles: “Go ahead – Tell Gaurav Sant his ideas won’t work” 

    From The Henry Samueli School of Engineering and Applied Science

    At

    The University of California-Los Angeles

    12.13.22

    Sant, the son and grandson of civil engineers, says working at UCLA has given him the flexibility to pursue projects that are atypical for academia. Credit: Temasek Foundation.

    Sometimes taking on a big problem means trying to do something that others say won’t work. For an example, look no further than Gaurav Sant, UCLA’s Pritzker Professor of Sustainability.

    With an eye toward providing remedies for climate change, he and his colleagues take approaches to removing carbon dioxide from the atmosphere that are considered unlikely by some. Several of his projects have emerged from UCLA’s labs to spawn startups that are bringing new technologies into the marketplace, where they can make a difference in people’s lives.

    In an interview, Sant — a professor of civil and environmental engineering and of materials science and engineering at the UCLA Samueli School of Engineering and a member of the California NanoSystems Institute at UCLA — reflected on tackling major challenges, what he does when someone says his ideas won’t work, how his research fits into a tradition, and how it doesn’t.

    What shaped your research interest in sustainability?

    If you look at the arc of civilization, civil engineers have been at the forefront of improving our standard of life in many ways. Major improvements from sanitation and water treatment to roads and highways have come from civil engineering.

    So, in the spirit of working on big problems with big societal implications, I was drawn to this particular one: While cement and steel have undoubtedly been foundational to our way of life, the carbon footprint of producing construction materials is enormous. I became interested in reducing the carbon intensity of these materials.

    To that end, for nearly a decade you’ve been developing technology to produce concrete using carbon dioxide. What did the prospects look like early on?

    When I first talked about it with people in academia and industry — folks for whom I have incredible respect — they told me, “There’s little possibility this is ever going to work.” And I said, “Challenge accepted.”

    It turns out that nobody had carefully looked at this process for using carbon dioxide to make concrete. We went through the due diligence to make sure it wasn’t physically infeasible, and we eventually realized that we could build and scale up the process. From there, I think we just persevered longer than others might have. It’s taken hard work from generations of Ph.D. students, postdoctoral researchers and staff scientists.

    Your team won the NRG COSIA Carbon XPRIZE, a global competition for carbon removal technologies. How did that affect your work?

    Participating in the competition prompted us to think much bigger than we might have otherwise. It really made us focus on trying to develop processes that will be viable in the real world, something that most academics aren’t necessarily challenged or inclined to do.

    You founded the company CarbonBuilt, based at CNSI’s Magnify incubator, to move the technology forward. What’s the latest on that venture?

    The company is building its first commercial plants in the U.S. at this point, and CarbonBuilt’s technology will be deployed at an industrial scale in early 2023.

    With colleagues, you have also helped develop a system for removing carbon from the ocean, so that seawater can take up more carbon dioxide from the atmosphere. What inspired that initiative?

    I have a really simple way of looking at it. Carbon management requires two levels of solutions. You need sector-specific ones, such as decarbonizing construction. But you also need solutions focused on decarbonizing our global economy and way of life.

    We thought more broadly about how we could make our technologies more applicable to society at large. That framing led us to create this technology, which a UCLA startup is now commercializing.

    Both projects are connected to UCLA’s cross-disciplinary Institute for Carbon Management, which you direct. What do you see as the institute’s role?

    Our goal is to translate research into impact. The ICM plays in a space where academia, industry and national laboratories generally don’t operate. Academia and government labs tend to work on fundamental research related to materials and processes. And for-profit companies are disinclined to take on technology development efforts that aren’t assured commercial success.

    At the ICM, we emphasize translation rather than discovery, where success means bringing technologies to maturity — rapidly, scalably and repeatably — so commercial ventures can take them and scale them to success. We set out to build devices, systems and processes that are the first of their kind.

    We do this in two areas: carbon management solutions, with a focus on removing, mitigating and avoiding emissions; and new processes for expanding supplies of lithium, nickel, cobalt, manganese and molybdenum, which are materials foundational to the clean-energy transition.

    Given that so much of your work deals with limiting climate change, where do you find such optimism in the face of an increasingly dire problem?

Fundamentally, we rely on the idea that with careful planning, focused effort and commitment, success often emerges. Such optimistic thinking is a prerequisite to making a change for the better.

    Optimism is important, and ambition is important, because otherwise the question turns into, “Why bother?” And “Why bother?” does not lead to innovative solutions. Fortunately, optimism and ambition are endemic to human character.

    What first sparked your interest in science and engineering?

    I’m a third-generation civil engineer. My grandfather built prominent projects in a city in India called Pune, and my father built prominent projects in Goa, the state in India where I grew up. As a consequence, I’ve been engineering-oriented since I was a little kid.

    How does your work fit into that family tradition?

    I’m hesitant to equate these things, because there’s a great difference between what my grandfather did, what my father continues to do and what I do. They did things that made a difference in people’s day-to-day lives. With the work that we’re doing, the benefits are delayed. That said, if we succeed — which will take a decade or more to know — the outcome could be just as impactful.

    But the philosophical commonality is that all three of us, in our own ways, have done, or are doing, things that matter to us as individuals, and to society at large.

    You’ve been at UCLA for your entire career as a professor. What keeps you here?

    This is a world-class community. I’ve received support from some incredible people — in our leadership, among my peers and among our supporters, not just at UCLA Samueli but around the campus, in L.A. and around the world. I’ve had the freedom and flexibility to do things that were perhaps not typical for academia, and to take chances. So I have been fortunate to be doing fun things with a team of people I want to work with today, tomorrow and from there on.

See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The UCLA Henry Samueli School of Engineering and Applied Science is the school of engineering at the University of California-Los Angeles. It opened as the College of Engineering in 1945, and was renamed the School of Engineering in 1969. Since its initial enrollment of 379 students, the school has grown to approximately 6,100 students. The school is ranked 16th among all engineering schools in the United States. The school offers 28 degree programs and is home to eight externally funded interdisciplinary research centers, including those in space exploration, wireless sensor systems, and nanotechnology.

    The University of California-Los Angeles

    UC LA Campus

    For nearly 100 years, The University of California-Los Angeles has been a pioneer, persevering through impossibility, turning the futile into the attainable.

    We doubt the critics, reject the status quo and see opportunity in dissatisfaction. Our campus, faculty and students are driven by optimism. It is not naïve; it is essential. And it has fueled every accomplishment, allowing us to redefine what’s possible, time after time.

    This can-do perspective has brought us 12 Nobel Prizes, 12 Rhodes Scholarships, more NCAA titles than any university and more Olympic medals than most nations. Our faculty and alumni helped create the Internet and pioneered reverse osmosis. And more than 100 companies have been created based on technology developed at UCLA.

The University of California-Los Angeles is a public land-grant research university in Los Angeles, California. The University of California-Los Angeles traces its early origins back to 1882 as the southern branch of the California State Normal School (now San Jose State University). It became the Southern Branch of The University of California in 1919, making it the second-oldest (after University of California-Berkeley) of the 10-campus University of California system.

    The University of California-Los Angeles offers 337 undergraduate and graduate degree programs in a wide range of disciplines, enrolling about 31,500 undergraduate and 12,800 graduate students. The University of California-Los Angeles had 168,000 applicants for Fall 2021, including transfer applicants, making the school the most applied-to of any American university.

    The university is organized into six undergraduate colleges; seven professional schools; and four professional health science schools. The undergraduate colleges are the College of Letters and Science; Samueli School of Engineering; School of the Arts and Architecture; Herb Alpert School of Music; School of Theater, Film and Television; and School of Nursing.

    The University of California-Los Angeles is called a “Public Ivy”, and is ranked among the best public universities in the United States by major college and university rankings. This includes one ranking that has The University of California-Los Angeles as the top public university in the United States in 2021. As of October 2020, 25 Nobel laureates; three Fields Medalists; five Turing Award winners; and two Chief Scientists of the U.S. Air Force have been affiliated with The University of California-Los Angeles as faculty; researchers or alumni. Among the current faculty members, 55 have been elected to the National Academy of Sciences; 28 to the National Academy of Engineering ; 39 to the Institute of Medicine; and 124 to the American Academy of Arts and Sciences . The university was elected to the Association of American Universities in 1974.

    The University of California-Los Angeles student-athletes compete as the Bruins in the Pac-12 Conference. The Bruins have won 129 national championships, including 118 NCAA team championships- more than any other university except Stanford University, whose athletes have won 126. The University of California-Los Angeles students, coaches, and staff have won 251 Olympic medals: 126 gold; 65 silver; and 60 bronze. The University of California-Los Angeles student-athletes have competed in every Olympics since 1920 with one exception (1924) and have won a gold medal in every Olympics the U.S. participated in since 1932.

    History

In March 1881, at the request of state senator Reginaldo Francisco del Valle, the California State Legislature authorized the creation of a southern branch of the California State Normal School (now San José State University) in downtown Los Angeles to train teachers for the growing population of Southern California. The Los Angeles branch of the California State Normal School opened on August 29, 1882, on what is now the site of the Central Library of the Los Angeles Public Library system. The facility included an elementary school where teachers-in-training could practice their technique with children. That elementary school is related to the present-day University of California-Los Angeles Lab School. In 1887, the branch campus became independent and changed its name to Los Angeles State Normal School.

In 1914, the school moved to a new campus on Vermont Avenue (now the site of Los Angeles City College) in East Hollywood. In 1917, UC Regent Edward Augustus Dickson, the only regent representing the Southland at the time, and Ernest Carroll Moore, Director of the Normal School, began to lobby the State Legislature to enable the school to become the second University of California campus, after University of California-Berkeley. They met resistance from University of California-Berkeley alumni, Northern California members of the state legislature, and Benjamin Ide Wheeler, President of the University of California from 1899 to 1919, all of whom were vigorously opposed to the idea of a southern campus. However, David Prescott Barrows, the new President of the University of California, did not share Wheeler’s objections.

On May 23, 1919, the Southern Californians’ efforts were rewarded when Governor William D. Stephens signed Assembly Bill 626 into law, which acquired the land and buildings and transformed the Los Angeles Normal School into the Southern Branch of the University of California. The same legislation added its general undergraduate program, the Junior College. The Southern Branch campus opened on September 15 of that year, offering two-year undergraduate programs to 250 Junior College students and 1,250 students in the Teachers College under Moore’s continued direction. Southern Californians were furious that their so-called “branch” provided only an inferior junior college program (mocked at the time by The University of Southern California students as “the twig”) and continued to fight Northern Californians (specifically, Berkeley) for the right to three and then four years of instruction culminating in bachelor’s degrees. On December 11, 1923, the Board of Regents authorized a fourth year of instruction and transformed the Junior College into the College of Letters and Science, which awarded its first bachelor’s degrees on June 12, 1925.

Under University of California President William Wallace Campbell, enrollment at the Southern Branch expanded so rapidly that by the mid-1920s the institution was outgrowing the 25-acre Vermont Avenue location. The Regents searched for a new location and announced their selection of the so-called “Beverly Site”—just west of Beverly Hills—on March 21, 1925, edging out the panoramic hills of the still-empty Palos Verdes Peninsula. After the athletic teams entered the Pacific Coast Conference in 1926, the Southern Branch student council adopted the nickname “Bruins”, a name offered by the student council at The University of California-Berkeley. In 1927, the Regents renamed the Southern Branch the University of California at Los Angeles (the word “at” was officially replaced by a comma in 1958, in line with other UC campuses). In the same year, the state broke ground in Westwood on land sold for $1 million (less than one-third its value) by real estate developers Edwin and Harold Janss, for whom the Janss Steps are named. The campus in Westwood opened to students in 1929.

The original four buildings were the College Library (now Powell Library); Royce Hall; the Physics-Biology Building (which became the Humanities Building and is now the Renee and David Kaplan Hall); and the Chemistry Building (now Haines Hall), arrayed around a quadrangular courtyard on the 400-acre (1.6 km^2) campus. The first undergraduate classes on the new campus were held in 1929 with 5,500 students. After lobbying by alumni, faculty, administration and community leaders, University of California-Los Angeles was permitted to award the master’s degree in 1933 and the doctorate in 1936, against continued resistance from The University of California-Berkeley.

    Maturity as a university

    During its first 32 years University of California-Los Angeles was treated as an off-site department of The University of California. As such its presiding officer was called a “provost” and reported to the main campus in Berkeley. In 1951 University of California-Los Angeles was formally elevated to co-equal status with The University of California-Berkeley, and its presiding officer Raymond B. Allen was the first chief executive to be granted the title of chancellor. The appointment of Franklin David Murphy to the position of Chancellor in 1960 helped spark an era of tremendous growth of facilities and faculty honors. By the end of the decade University of California-Los Angeles had achieved distinction in a wide range of subjects. This era also secured University of California-Los Angeles’s position as a proper university and not simply a branch of the University of California system. This change is exemplified by an incident involving Chancellor Murphy, which was described by him:

    “I picked up the telephone and called in from somewhere and the phone operator said, “University of California.” And I said, “Is this Berkeley?” She said, “No.” I said, “Well who have I gotten to?” ” University of California-Los Angeles.” I said, “Why didn’t you say University of California-Los Angeles?” “Oh”, she said, “we’re instructed to say University of California.” So, the next morning I went to the office and wrote a memo; I said, “Will you please instruct the operators, as of noon today, when they answer the phone to say, ‘ University of California-Los Angeles.'” And they said, “You know they won’t like it at Berkeley.” And I said, “Well, let’s just see. There are a few things maybe we can do around here without getting their permission.”

    Recent history

On June 1, 2016, two men were killed in a murder-suicide in an engineering building on campus. School officials put the campus on lockdown as Los Angeles Police Department officers, including SWAT, cleared the campus.

    In 2018, a student-led community coalition known as “Westwood Forward” successfully led an effort to break University of California-Los Angeles and Westwood Village away from the existing Westwood Neighborhood Council and form a new North Westwood Neighborhood Council with over 2,000 out of 3,521 stakeholders voting in favor of the split. Westwood Forward’s campaign focused on making housing more affordable and encouraging nightlife in Westwood by opposing many of the restrictions on housing developments and restaurants the Westwood Neighborhood Council had promoted.

    Academics

    Divisions

    Undergraduate

    College of Letters and Science
    Social Sciences Division
    Humanities Division
    Physical Sciences Division
    Life Sciences Division
    School of the Arts and Architecture
    Henry Samueli School of Engineering and Applied Science (HSSEAS)
    Herb Alpert School of Music
    School of Theater, Film and Television
    School of Nursing
    Luskin School of Public Affairs

    Graduate

    Graduate School of Education & Information Studies (GSEIS)
    School of Law
    Anderson School of Management
    Luskin School of Public Affairs
    David Geffen School of Medicine
    School of Dentistry
    Jonathan and Karin Fielding School of Public Health
    Semel Institute for Neuroscience and Human Behavior
    School of Nursing

    Research

    University of California-Los Angeles is classified among “R1: Doctoral Universities – Very high research activity” and had $1.32 billion in research expenditures in FY 2018.

  • richardmitnick 10:48 am on October 26, 2022 Permalink | Reply
    Tags: "Studying floods to better predict their dangers", Katerina Boukin, Civil Engineering, MIT Concrete Sustainability Hub

    From The Massachusetts Institute of Technology: “Studying floods to better predict their dangers” Katerina Boukin 

    From The Massachusetts Institute of Technology

    10.13.22
    Andrew Paul Laurent | MIT Concrete Sustainability Hub

    A fourth-generation civil engineer, graduate student Katerina Boukin researches the growing yet misunderstood threat of pluvial flooding, including flash floods.

    1
    “If we don’t know how a flood is propagating, we don’t know the risk it poses to the urban environment. And if we don’t understand the risk, we can’t really discuss mitigation strategies,” says CSHub researcher Katya Boukin. “That’s why I pursue improving flood propagation models.” Photo: Andrew Paul Laurent.

    “My job is basically flooding Cambridge,” says Katerina “Katya” Boukin, a graduate student in civil and environmental engineering at MIT and the MIT Concrete Sustainability Hub’s resident expert on flood simulations.

    You can often find her fine-tuning high-resolution flood risk models for the City of Cambridge, Massachusetts, or talking about hurricanes with fellow researcher Ipek Bensu Manav.

Flooding represents one of the world’s gravest natural hazards. Extreme climate events that induce flooding, such as severe storms, winter storms, and tropical cyclones, caused an estimated $128.1 billion in damages in 2021 alone.

    Climate simulation models suggest that severe storms will become more frequent in the coming years, necessitating a better understanding of which parts of cities are most vulnerable — an understanding that can be improved through modeling.

    A problem with current flood models is that they struggle to account for an oft-misunderstood type of flooding known as pluvial flooding.

    “You might think of flooding as the overflowing of a body of water, like a river. This is fluvial flooding. This can be somewhat predictable, as you can think of proximity to water as a risk factor,” Boukin explains.

    However, the “flash flooding” that causes many deaths each year can happen even in places nowhere near a body of water. This is an example of pluvial flooding, which is affected by terrain, urban infrastructure, and the dynamic nature of storm loads.

“If we don’t know how a flood is propagating, we don’t know the risk it poses to the urban environment. And if we don’t understand the risk, we can’t really discuss mitigation strategies,” says Boukin. “That’s why I pursue improving flood propagation models.”

    Boukin is leading development of a new flood prediction method that seeks to address these shortcomings. By better representing the complex morphology of cities, Boukin’s approach may provide a clearer forecast of future urban flooding.

    2
    Katya Boukin developed this model of the City of Cambridge, Massachusetts. The base model was provided through a collaboration between MIT, the City of Cambridge, and Dewberry Engineering. Image: Katya Boukin.

    “In contrast to the more typical traditional catchment model, our method has rainwater spread around the urban environment based on the city’s topography, below-the-surface features like sewer pipes, and the characteristics of local soils,” notes Boukin.

    “We can simulate the flooding of regions with local rain forecasts. Our results can show how flooding propagates by the foot and by the second,” she adds.
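The core idea described above, rainwater routed cell-to-cell over a terrain grid with losses to soils and sewers, can be illustrated with a toy simulation. This is a hypothetical sketch for intuition only, not the CSHub model: it assumes uniform rainfall, a constant per-cell infiltration loss standing in for soils and sewer capacity, and water relaxing toward the lowest neighbouring water surface each step.

```python
import numpy as np

def simulate_pluvial_flood(elevation, rain_per_step, infiltration, steps):
    """Toy pluvial-flood propagation on a terrain grid.

    Each step: add rainfall everywhere, subtract an infiltration/drainage
    loss, then let each wet cell shed water toward its lowest neighbour so
    that the water surface (elevation + depth) levels out locally.
    """
    depth = np.zeros_like(elevation, dtype=float)
    rows, cols = elevation.shape
    for _ in range(steps):
        depth += rain_per_step                       # uniform rainfall
        depth = np.maximum(depth - infiltration, 0)  # soil/sewer losses
        surface = elevation + depth
        new_depth = depth.copy()
        for r in range(rows):
            for c in range(cols):
                if depth[r, c] <= 0:
                    continue
                # Find the lowest neighbouring water surface.
                best, br, bc = surface[r, c], r, c
                for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols and surface[nr, nc] < best:
                        best, br, bc = surface[nr, nc], nr, nc
                if (br, bc) != (r, c):
                    # Move half the surface difference, capped by available water.
                    move = min(depth[r, c], (surface[r, c] - best) / 2)
                    new_depth[r, c] -= move
                    new_depth[br, bc] += move
        depth = new_depth
    return depth

# Tiny demo: a 3x3 terrain with a depression at the centre.
elevation = np.array([[2., 2., 2.],
                      [2., 0., 2.],
                      [2., 2., 2.]])
depth = simulate_pluvial_flood(elevation, rain_per_step=0.1, infiltration=0.0, steps=20)
# With no infiltration, all rain is retained and pools in the depression,
# far from any river: a caricature of pluvial (not fluvial) flooding.
```

Even this caricature shows why terrain and drainage dominate pluvial risk: the deepest water collects in the local depression, not near any body of water.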

    While Boukin’s current focus is flood simulation, her unconventional academic career has taken her research in many directions, like examining structural bottlenecks in dense urban rail systems and forecasting ground displacement due to tunneling.

    “I’ve always been interested in the messy side of problem-solving. I think that difficult problems present a real chance to gain a deeper understanding,” says Boukin.

    Boukin credits her upbringing for giving her this perspective. A native of Israel, Boukin says that civil engineering is the family business. “My parents are civil engineers, my mom’s parents are, too, her grandfather was a professor in civil engineering, and so on. Civil engineering is my bloodline.”

    However, the decision to follow the family tradition did not come so easily. “After I took the Israeli equivalent of the SAT, I was at a decision point: Should I go to engineering school or medical school?” she recalls.

    “I decided to go on a backpacking trip to help make up my mind. It’s sort of an Israeli rite to explore internationally, so I spent six months in South America. I think backpacking is something everyone should do.”

    After this soul searching, Boukin landed on engineering school, where she fell in love with structural engineering. “It was the option that felt most familiar and interesting. I grew up playing with AutoCAD on the family computer, and now I use AutoCAD professionally!” she notes.

    “For my master’s degree, I was looking to study in a department that would help me integrate knowledge from fields like climatology and civil engineering. I found the MIT Department of Civil and Environmental Engineering to be an excellent fit,” she says.

    “I am lucky that MIT has so many people that work together as well as they do. I ended up at the Concrete Sustainability Hub, where I’m working on projects which are the perfect fit between what I wanted to do and what the department wanted to do.”

    Boukin’s move to Cambridge has given her a new perspective on her family and childhood.

    “My parents brought me to Israel when I was just 1 year old. In moving here as a second-time immigrant, I have a new perspective on what my parents went through during the move to Israel. I moved when I was 27 years old, the same age as they were. They didn’t have a support network and worked any job they could find,” she explains.

“I am incredibly grateful to them for the morals they instilled in my sister, who recently graduated from medical school, and me. I know I can call my parents if I ever need something, and they will do whatever they can to help.”

    Boukin hopes to honor her parents’ efforts through her research.

    “Not only do I want to help stakeholders understand flood risks, I want to make awareness of flooding more accessible. Each community needs different things to be resilient, and different cultures have different ways of delivering and receiving information,” she says.

    “Everyone should understand that they, in addition to the buildings and infrastructure around them, are part of a complex ecosystem. Any change to a city can affect the rest of it. If designers and residents are aware of this when considering flood mitigation strategies, we can better design cities and understand the consequences of damage.”

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and Whitehead Institute.

Massachusetts Institute of Technology-Haystack Observatory, Westford, Massachusetts, USA, altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities.

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but an institution combining elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863, under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

The Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, The Massachusetts Institute of Technology became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4,000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after the war. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and The Massachusetts Institute of Technology’s defense research. In this period, the Massachusetts Institute of Technology’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam, as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen as highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time, and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced aLIGO was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and funded by the National Science Foundation.

    Caltech/MIT Advanced aLIGO

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in Physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique that served as the essential blueprint for LIGO.

    The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.
