Updates from richardmitnick Toggle Comment Threads | Keyboard Shortcuts

  • richardmitnick 1:32 pm on October 7, 2022 Permalink | Reply
    Tags: "New route to evolution - how DNA from our mitochondria gets into our genomes", , , Each mitochondrion has its own DNA distinct to the rest of the human genome which is comprised of nuclear DNA., , , It is not clear exactly how the mitochondrial DNA inserts itself., Mitochondrial DNA also appears in some cancer DNA suggesting that it acts as a sticking plaster to try and repair damage to our genetic code., Mitochondrial DNA is passed down the maternal line., Scientists have shown that in one in every 4000 births some of the genetic code from our mitochondria – the ‘batteries’ that power our cells – inserts itself into our DNA.,   

    From The University of Cambridge (UK): “New route to evolution – how DNA from our mitochondria gets into our genomes” 


    From The University of Cambridge (UK)

    10.5.22
    Craig Brierley

    Mitochondria surrounded by cytoplasm. Credit: Dr David Furness.

    Scientists have shown that in one in every 4,000 births, some of the genetic code from our mitochondria – the ‘batteries’ that power our cells – inserts itself into our DNA, revealing a surprising new insight into how humans evolve.

    In a study published today in Nature [below], researchers at the University of Cambridge and Queen Mary University of London show that mitochondrial DNA also appears in some cancer DNA, suggesting that it acts as a sticking plaster to try and repair damage to our genetic code.

    Mitochondria are tiny ‘organelles’ that sit within our cells, where they act like batteries, providing energy in the form of the molecule ATP to power the cells. Each mitochondrion has its own DNA – mitochondrial DNA – that is distinct from the rest of the human genome, which is composed of nuclear DNA.

    Mitochondrial DNA is passed down the maternal line – that is, we inherit it from our mothers, not our fathers. However, a study published in PNAS [below] in 2018 from researchers at the Cincinnati Children’s Hospital Medical Center in the USA reported evidence that suggested some mitochondrial DNA had been passed down the paternal line.

    To investigate these claims, the Cambridge team looked at the DNA from over 11,000 families recruited to Genomics England’s 100,000 Genomes Project, searching for patterns that looked like paternal inheritance. The Cambridge team found mitochondrial DNA ‘inserts’ in the nuclear DNA of some children that were not present in that of their parents. This meant that the US team had probably reached the wrong conclusions: what they had observed was not paternally inherited mitochondrial DNA, but rather these inserts.
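    The detection problem is conceptually simple: find a stretch of mitochondrial sequence embedded in a child’s nuclear DNA that appears in neither parent. The Python sketch below illustrates that idea with exact k-mer matching on plain strings. It is a deliberately simplified, hypothetical toy, not the study’s actual pipeline, which works on aligned whole-genome sequencing reads and tolerates mismatches; all sequences and function names here are invented.

```python
# Toy illustration of the insert-detection idea (hypothetical code, not the
# study's pipeline): flag stretches of mitochondrial DNA that appear in a
# child's nuclear sequence but in neither parent's.

def mito_kmers(mito_genome: str, k: int = 30) -> set[str]:
    """All k-length windows of the mitochondrial genome."""
    return {mito_genome[i:i + k] for i in range(len(mito_genome) - k + 1)}

def de_novo_inserts(child: str, mother: str, father: str,
                    mito_genome: str, k: int = 30) -> list[int]:
    """Positions in the child's nuclear DNA where a mitochondrial k-mer
    occurs but the same k-mer is absent from both parents."""
    kmers = mito_kmers(mito_genome, k)
    hits = []
    for i in range(len(child) - k + 1):
        window = child[i:i + k]
        if window in kmers and window not in mother and window not in father:
            hits.append(i)
    return hits

# Tiny demo with made-up sequences:
mito = "ATGCCGTA" * 8                                # toy "mitochondrial genome"
mother = "GATTACA" * 30
father = "CCTAGGA" * 30
child = mother[:100] + mito[10:45] + mother[100:]    # child carries a 35-base insert
print(de_novo_inserts(child, mother, father, mito, k=20))  # positions near 100
```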

    Now, extending this work to over 66,000 people, the team showed that new inserts are arising all the time, revealing a new way in which our genome evolves.

    Professor Patrick Chinnery, from the Medical Research Council Mitochondrial Biology Unit and Department of Clinical Neurosciences at the University of Cambridge, explained: “Billions of years ago, a primitive animal cell took in a bacterium that became what we now call mitochondria. These supply energy to the cell to allow it to function normally, while removing oxygen, which is toxic at high levels. Over time, bits of these primitive mitochondria have passed into the cell nucleus, allowing their genomes to talk to each other.

    “This was all thought to have happened a very long time ago, mostly before we had even formed as a species, but what we’ve discovered is that that’s not true. We can see this happening right now, with bits of our mitochondrial genetic code transferring into the nuclear genome in a measurable way.”

    The team estimate that mitochondrial DNA transfers to nuclear DNA in around one in every 4,000 births. If that individual has children of their own, they will pass these inserts on – the team found that most of us carry five of the new inserts, and one in seven of us (14%) carry very recent ones. Once in place, the inserts can occasionally lead to very rare diseases, including a rare genetic form of cancer.

    It is not clear exactly how the mitochondrial DNA inserts itself – whether it does so directly or via an intermediary, such as RNA – but Professor Chinnery says it is likely to occur within the mother’s egg cells.

    When the team looked at sequences taken from 12,500 tumour samples, they found that mitochondrial DNA was even more common in tumour DNA, arising in around one in 1,000 cancers, and in some cases the mitochondrial DNA inserts actually cause the cancer.

    “Our nuclear genetic code is breaking and being repaired all the time,” said Professor Chinnery. “Mitochondrial DNA appears to act almost like a Band-Aid, a sticking plaster to help the nuclear genetic code repair itself. And sometimes this works, but on rare occasions it might make things worse or even trigger the development of tumours.”

    More than half (58%) of the insertions were in regions of the genome that code for proteins. In the majority of cases, the body recognizes the invading mitochondrial DNA and silences it in a process known as methylation, whereby a molecule attaches itself to the insert and switches it off. A similar process occurs when viruses manage to insert themselves into our DNA. However, this method of silencing is not perfect, as some of the mitochondrial DNA inserts go on to be copied and move around the nucleus itself.

    The team looked for evidence that the reverse might happen – that mitochondrial DNA absorbs parts of our nuclear DNA – but found none. There are likely to be several reasons why this should be the case.

    Firstly, cells only have two copies of nuclear DNA, but thousands of copies of mitochondrial DNA, so the chances of mitochondrial DNA being broken and passing into the nucleus are much greater than the other way around.

    Secondly, the DNA in mitochondria is packaged inside two membranes with no holes in them, so it would be difficult for nuclear DNA to get in. By contrast, if mitochondrial DNA manages to get out, holes in the membrane surrounding nuclear DNA would allow it to pass through with relative ease.

    Professor Sir Mark Caulfield, Vice Principal for Health at Queen Mary University of London, said: “I am so delighted that the 100,000 Genomes Project has unlocked the dynamic interplay between mitochondrial DNA and our genome in the cell’s nucleus. This defines a new role in DNA repair, but also one that could occasionally trigger rare disease, or even malignancy.”

    The research was mainly funded by the Medical Research Council, Wellcome, and the National Institute for Health Research.

    Science papers:
    Nature
    PNAS 2018
    See the science papers for instructive material.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Cambridge Campus

    The University of Cambridge (UK) [legally The Chancellor, Masters, and Scholars of the University of Cambridge] is a collegiate public research university in Cambridge, England. Founded in 1209, Cambridge is the second-oldest university in the English-speaking world and the world’s fourth-oldest surviving university. It grew out of an association of scholars who left the University of Oxford (UK) after a dispute with townsfolk. The two ancient universities share many common features and are often jointly referred to as “Oxbridge”.

    Cambridge is formed from a variety of institutions which include 31 semi-autonomous constituent colleges and over 150 academic departments, faculties and other institutions organized into six schools. All the colleges are self-governing institutions within the university, each controlling its own membership and with its own internal structure and activities. All students are members of a college. Cambridge does not have a main campus and its colleges and central facilities are scattered throughout the city. Undergraduate teaching at Cambridge is organized around weekly small-group supervisions in the colleges – a feature unique to the Oxbridge system. These are complemented by classes, lectures, seminars, laboratory work and occasionally further supervisions provided by the central university faculties and departments. Postgraduate teaching is provided predominantly centrally.

    Cambridge University Press, a department of the university, is the oldest university press in the world and currently the second largest university press in the world. Cambridge Assessment, also a department of the university, is one of the world’s leading examining bodies and provides assessment to over eight million learners globally every year. The university also operates eight cultural and scientific museums, including the Fitzwilliam Museum, as well as a botanic garden. Cambridge’s libraries – of which there are 116 – hold a total of around 16 million books, around nine million of which are in Cambridge University Library, a legal deposit library. The university is home to – but independent of – the Cambridge Union, the world’s oldest debating society. The university is closely linked to the development of the high-tech business cluster known as “Silicon Fen”. It is the central member of Cambridge University Health Partners, an academic health science centre based around the Cambridge Biomedical Campus.

    By both endowment size and consolidated assets Cambridge is the wealthiest university in the United Kingdom. In the fiscal year ending 31 July 2019, the central university – excluding colleges – had a total income of £2.192 billion of which £592.4 million was from research grants and contracts. At the end of the same financial year the central university and colleges together possessed a combined endowment of over £7.1 billion and overall consolidated net assets (excluding “immaterial” historical assets) of over £12.5 billion. It is a member of numerous associations and forms part of the ‘golden triangle’ of English universities.

    Cambridge has educated many notable alumni, including eminent mathematicians, scientists, politicians, lawyers, philosophers, writers, actors, monarchs and other heads of state. As of October 2020, 121 Nobel laureates, 11 Fields Medalists, 7 Turing Award winners, and 14 British prime ministers have been affiliated with Cambridge as students, alumni, faculty or research staff. University alumni have won 194 Olympic medals.

    History

    By the late 12th century, the Cambridge area already had a scholarly and ecclesiastical reputation due to monks from the nearby bishopric church of Ely. However, it was an incident at Oxford which is most likely to have led to the establishment of the university: three Oxford scholars were hanged by the town authorities for the death of a woman without consulting the ecclesiastical authorities, who would normally take precedence (and pardon the scholars) in such a case but who were at that time in conflict with King John. Fearing more violence from the townsfolk, scholars from the University of Oxford started to move away to cities such as Paris, Reading, and Cambridge. Subsequently, enough scholars remained in Cambridge to form the nucleus of a new university when it had become safe enough for academia to resume at Oxford. In order to claim precedence, it is common for Cambridge to trace its founding to the 1231 charter from Henry III granting it the right to discipline its own members (ius non-trahi extra) and an exemption from some taxes; Oxford was not granted similar rights until 1248.

    A bull in 1233 from Pope Gregory IX gave graduates from Cambridge the right to teach “everywhere in Christendom”. After Cambridge was described as a studium generale in a letter from Pope Nicholas IV in 1290 and confirmed as such in a bull by Pope John XXII in 1318 it became common for researchers from other European medieval universities to visit Cambridge to study or to give lecture courses.

    Foundation of the colleges

    The colleges at the University of Cambridge were originally an incidental feature of the system. No college is as old as the university itself. The colleges were endowed fellowships of scholars. There were also institutions without endowments called hostels. The hostels were gradually absorbed by the colleges over the centuries; but they have left some traces, such as the name of Garret Hostel Lane.

    Hugh Balsham, Bishop of Ely, founded Peterhouse, Cambridge’s first college, in 1284. Many colleges were founded during the 14th and 15th centuries, but colleges continued to be established until modern times. There was a gap of 204 years between the founding of Sidney Sussex in 1596 and that of Downing in 1800. The most recently established college is Robinson, built in the late 1970s. However, Homerton College only achieved full university college status in March 2010, making it the newest full college (it was previously an “Approved Society” affiliated with the university).

    In medieval times many colleges were founded so that their members would pray for the souls of the founders, and were often associated with chapels or abbeys. The colleges’ focus changed in 1536 with the Dissolution of the Monasteries. Henry VIII ordered the university to disband its Faculty of Canon Law and to stop teaching “scholastic philosophy”. In response, colleges changed their curricula away from canon law and towards the classics, the Bible, and mathematics.

    Nearly a century later the university was at the centre of a Protestant schism. Many nobles, intellectuals and even commoners saw the ways of the Church of England as too similar to the Catholic Church and felt that it was used by the Crown to usurp the rightful powers of the counties. East Anglia was the centre of what became the Puritan movement. In Cambridge the movement was particularly strong at Emmanuel, St Catharine’s Hall, Sidney Sussex, and Christ’s College. They produced many “non-conformist” graduates who, greatly influenced by social position or preaching, left for New England and especially the Massachusetts Bay Colony during the Great Migration decade of the 1630s. Oliver Cromwell, Parliamentary commander during the English Civil War and head of the English Commonwealth (1649–1660), attended Sidney Sussex.

    Modern period

    After the Cambridge University Act of 1856 formalized the organizational structure of the university, the study of many new subjects was introduced, such as theology, history and modern languages. Resources necessary for new courses in the arts, architecture and archaeology were donated by Viscount Fitzwilliam of Trinity College, who also founded the Fitzwilliam Museum. In 1847 Prince Albert was elected Chancellor of the University of Cambridge after a close contest with the Earl of Powis. Albert used his position as Chancellor to campaign successfully for reformed and more modern university curricula, expanding the subjects taught beyond the traditional mathematics and classics to include modern history and the natural sciences. Between 1896 and 1902 Downing College sold part of its land to build the Downing Site, with new scientific laboratories for anatomy, genetics, and Earth sciences. During the same period the New Museums Site was erected, including the Cavendish Laboratory, which has since moved to the West Cambridge Site, and other departments for chemistry and medicine.

    The University of Cambridge began to award PhD degrees in the first third of the 20th century. The first Cambridge PhD in mathematics was awarded in 1924.

    In the First World War 13,878 members of the university served and 2,470 were killed. Teaching and the fees it earned came almost to a stop and severe financial difficulties followed. As a consequence, the university first received systematic state support in 1919 and a Royal Commission appointed in 1920 recommended that the university (but not the colleges) should receive an annual grant. Following the Second World War the university saw a rapid expansion of student numbers and available places; this was partly due to the success and popularity gained by many Cambridge scientists.

     
  • richardmitnick 1:04 pm on October 7, 2022 Permalink | Reply
    Tags: "New process could enable more efficient plastics recycling", A catalyst made of a microporous material called a zeolite containing cobalt can selectively break down various plastic polymer molecules and turn more than 80 percent of them into propane., A chemical process using a catalyst based on cobalt has been found to be very effective at breaking down a variety of plastics., A key problem is that plastics come in so many different varieties and chemical processes for breaking them down into a form that can be reused in some way tend to be very specific to each type., , , , , Polyethylene (PET) and polypropylene (PP)-two widely produced forms of plastic-can be broken down into propane. Propane can then be used as a fuel or a feedstock for a variety of products., Recycling plastics has been a thorny problem because the long-chain molecules in plastics are held together by carbon bonds which are very stable and difficult to break apart., The accumulation of plastic waste is one of the major pollution issues of modern times., , The materials needed for the process-zeolites and cobalt-are both quite cheap and widely available., Today much of the plastic material gathered through recycling programs ends up in landfills anyway.   

    From The Massachusetts Institute of Technology: “New process could enable more efficient plastics recycling” 

    From The Massachusetts Institute of Technology

    10.6.22
    David L. Chandler

    A new chemical process can break down a variety of plastics into usable propane — a possible solution to our inability to effectively recycle many types of plastic. Image: Courtesy of the researchers. Edited by MIT News.

    The accumulation of plastic waste in the oceans, soil, and even in our bodies is one of the major pollution issues of modern times, with over 5 billion tons disposed of so far. Despite major efforts to recycle plastic products, actually making use of that motley mix of materials has remained a challenging issue.

    A key problem is that plastics come in so many different varieties, and chemical processes for breaking them down into a form that can be reused in some way tend to be very specific to each type of plastic. Sorting the hodgepodge of waste material, from soda bottles to detergent jugs to plastic toys, is impractical at large scale. Today much of the plastic material gathered through recycling programs ends up in landfills anyway. Surely there’s a better way.

    According to new research from MIT and elsewhere, it appears there may indeed be a much better way. A chemical process using a catalyst based on cobalt has been found to be very effective at breaking down a variety of plastics, such as polyethylene (PE) and polypropylene (PP), the two most widely produced forms of plastic, into a single product, propane. Propane can then be used as a fuel for stoves, heaters, and vehicles, or as a feedstock for the production of a wide variety of products — including new plastics, thus potentially providing at least a partial closed-loop recycling system.

    The finding is described today in the open access journal JACS Au [below], in a paper by MIT professor of chemical engineering Yuriy Román-Leshkov, postdoc Guido Zichitella, and seven others at MIT, the DOE’s SLAC National Accelerator Laboratory, and the National Renewable Energy Laboratory.

    Recycling plastics has been a thorny problem, Román-Leshkov explains, because the long-chain molecules in plastics are held together by carbon bonds, which are “very stable and difficult to break apart.” Existing techniques for breaking these bonds tend to produce a random mix of different molecules, which would then require complex refining methods to separate out into usable specific compounds. “The problem is,” he says, “there’s no way to control where in the carbon chain you break the molecule.”

    But to the surprise of the researchers, a catalyst made of a microporous material called a zeolite that contains cobalt nanoparticles can selectively break down various plastic polymer molecules and turn more than 80 percent of them into propane.

    Zeolites are riddled with tiny pores less than a nanometer wide, corresponding to the width of the polymer chains, so a logical assumption had been that there would be little interaction at all between the zeolite and the polymers. Surprisingly, however, the opposite turned out to be the case: not only do the polymer chains enter the pores, but the synergistic action of the cobalt and the acid sites in the zeolite breaks the chain at the same point each time. That cleavage site turned out to correspond to chopping off exactly one propane molecule without generating unwanted methane, leaving the rest of the longer hydrocarbon ready to undergo the process again and again.
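    To see why producing “mainly a single product” matters, it helps to contrast uncontrolled bond breaking (the problem Román-Leshkov describes above) with chain-end-selective cleavage. The Python sketch below is a toy counting model, not the actual catalysis: it treats a polymer as a count of carbon atoms and compares cutting random C–C bonds against repeatedly chopping off one three-carbon (propane) unit. The chain length and yields here are illustrative assumptions, not the paper’s data.

```python
import random

def random_scission(chain_len: int) -> list[int]:
    """Cut a carbon chain at uniformly random C-C bonds until every
    fragment has 4 carbons or fewer; return the fragment sizes."""
    fragments, todo = [], [chain_len]
    while todo:
        n = todo.pop()
        if n <= 4:
            fragments.append(n)
        else:
            cut = random.randint(1, n - 1)   # pick a random C-C bond
            todo += [cut, n - cut]
    return fragments

def selective_cleavage(chain_len: int) -> list[int]:
    """Chop exactly one propane (C3) unit off the chain end per step,
    as the cobalt-zeolite catalyst is described as doing."""
    fragments = []
    while chain_len > 3:
        fragments.append(3)
        chain_len -= 3
    fragments.append(chain_len)
    return fragments

random.seed(0)
for name, frags in [("random scission", random_scission(1000)),
                    ("selective cleavage", selective_cleavage(1000))]:
    propane_carbon = sum(f for f in frags if f == 3)
    print(f"{name}: {100 * propane_carbon / 1000:.0f}% of carbon becomes propane")
```

    Random scission yields a broad mix of fragment sizes that would need downstream separation, while the selective route converts essentially all of the chain into one compound, which is the advantage Román-Leshkov highlights.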

    “Once you have this one compound, propane, you lessen the burden on downstream separations,” Román-Leshkov says. “That’s the essence of why we think this is quite important. We’re not only breaking the bonds, but we’re generating mainly a single product” that can be used for many different products and processes.

    The materials needed for the process, zeolites and cobalt, “are both quite cheap” and widely available, he says, although today most cobalt comes from troubled areas in the Democratic Republic of Congo. Some new production is being developed in Canada, Cuba, and other places. The other material needed for the process is hydrogen, which today is mostly produced from fossil fuels but can easily be made other ways, including electrolysis of water using carbon-free electricity such as solar or wind power.

    The researchers tested their system on a real example of mixed recycled plastic, producing promising results. But more testing will be needed on a greater variety of mixed waste streams to determine how much fouling takes place from various contaminants in the material — such as inks, glues, and labels attached to the plastic containers, or other nonplastic materials that get mixed in with the waste — and how that affects the long-term stability of the process.

    Together with collaborators at NREL, the MIT team is also continuing to study the economics of the system, and analyzing how it can fit into today’s systems for handling plastic and mixed waste streams. “We don’t have all the answers yet,” Román-Leshkov says, but preliminary analysis looks promising.

    The research team included Amani Ebrahim and Simone Bare at the SLAC National Accelerator Laboratory; Jie Zhu, Anna Brenner, Griffin Drake and Julie Rorrer at MIT; and Greg Beckham at the National Renewable Energy Laboratory. The work was supported by the U.S. Department of Energy (DoE), the Swiss National Science Foundation, and the DoE’s Office of Energy Efficiency and Renewable Energy, Advanced Manufacturing Office (AMO), and Bioenergy Technologies Office (BETO), as part of the Bio-Optimized Technologies to keep Thermoplastics out of Landfills and the Environment (BOTTLE) Consortium.

    Science paper:
    JACS Au
    See the science paper for instructive material.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and Whitehead Institute.

    Massachusetts Institute of Technology Haystack Observatory, Westford, Massachusetts, USA. Altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities.

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but rather an institution combining elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    The Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at the Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, The Massachusetts Institute of Technology had become the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4,000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and The Massachusetts Institute of Technology’s defense research. In this period the Massachusetts Institute of Technology’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced LIGO (aLIGO) was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and funded by the National Science Foundation.

    Caltech/MIT Advanced LIGO.

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in Physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique which served as the essential blueprint for LIGO.

    The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 12:27 pm on October 7, 2022 Permalink | Reply
    Tags: "To help meet global EV demand researchers develop sustainable method of recycling older lithium-ion batteries", A new more sustainable method to mine valuable metals – including lithium but also cobalt nickel and manganese – from lithium-ion batteries that have reached the end of their useful lifespan., Achieving the 2050 target will require an increase in the supply of critical metals where prices of are already very high., Conventional processes for recycling lithium-ion batteries: ‘pyrometallurgy” which uses extremely high temperature; “hydrometallurgy” which uses acids and reducing agents for extraction., If we recycle existing batteries we can sustain the constrained supply chain and help bring down the cost of EV batteries., In landfills corrosive electrolyte leaching can occur contaminating underground water systems., Not only can recycling provide these materials at a lower cost but it also reduces the need to mine raw ore., Part of Canada’s commitment to reach net-zero emissions by 2050 includes a target requiring 100 per cent of new light-duty cars and passenger trucks sold in the country to be electric by 2035., Pyrometallurgy produces greenhouse gas emissions while hydrometallurgy creates wastewater that needs to be processed and handled., Scientists are now moving towards commercialization of this method to increase its technology readiness level., The lab group is using supercritical fluid extraction to recover metals from end-of-life lithium-ion batteries., The lab process matched the extraction efficiency of lithium and nickel and cobalt and manganese to 90 per cent when compared to the conventional leaching processes., The scientists used carbon dioxide as a solvent which was brought to a supercritical phase by increasing the temperature above 31 C and the pressure up to 7 megapascals., , Today many batteries are discarded improperly and end up in landfills.   

    From The University of Toronto (CA): “To help meet global EV demand researchers develop sustainable method of recycling older lithium-ion batteries” 

    From The University of Toronto (CA)

    10.3.22
    Safa Jinje

    Professor Gisele Azimi and PhD candidate Jiakai (Kevin) Zhang have proposed a new, more sustainable method to recover valuable metals from lithium-ion batteries that have reached the end of their useful lives (photo by Safa Jinje)

    A University of Toronto researcher has developed a new technique to help recycle the metals in lithium-ion batteries, which are in high demand amid surging global sales of electric vehicles.

    Gisele Azimi, a professor in the departments of materials science and engineering and chemical engineering and applied chemistry in the Faculty of Applied Science & Engineering, and her team have proposed a new, more sustainable method to mine valuable metals – including lithium, but also cobalt, nickel and manganese – from lithium-ion batteries that have reached the end of their useful lifespan.

    “Getting these metals from raw ore takes a lot of energy,” says Jiakai (Kevin) Zhang, a PhD candidate in chemical engineering and applied chemistry who is lead author on a new paper recently published in Resources, Conservation and Recycling [below].

    “If we recycle existing batteries we can sustain the constrained supply chain and help bring down the cost of EV batteries, making the vehicles more affordable.”

    Part of Canada’s commitment to reach net-zero emissions by 2050 includes a mandatory target requiring 100 per cent of new light-duty cars and passenger trucks sold in the country to be electric by 2035.

    Achieving this target will require an increase in the supply of critical metals, the price of which is already very high. For example, cobalt, a key ingredient in the production of cathodes for lithium-nickel-manganese-cobalt-oxide (commonly abbreviated as NMC) batteries widely used in EVs, is also one of the most expensive components of lithium-ion batteries due to its limited reserves.

    “We are about to reach a point where many lithium-ion batteries are reaching their end of life,” says Azimi. “These batteries are still very rich in elements of interest and can provide a crucial resource for recovery.”

    Not only can recycling provide these materials at a lower cost but it also reduces the need to mine raw ore that comes with environmental and ethical costs.

    The life expectancy of EV batteries is from 10 to 20 years, but most car manufacturers only provide a guarantee for eight years or 160,000 kilometres – whichever comes first. When EV batteries reach end of life, they can be refurbished for second-life uses or recycled to recover metals. But today many batteries are discarded improperly and end up in landfills.

    “If we keep mining lithium, cobalt and nickel for batteries and then just landfill them at end-of-life, there will be a negative environmental impact, especially if corrosive electrolyte leaching occurs and contaminates underground water systems,” says Zhang.

    Gisele Azimi and Jiakai (Kevin) Zhang conducted their supercritical fluid extraction experiments in a 100-millilitre high-pressure reactor (photo by Safa Jinje)

    Conventional processes for recycling lithium-ion batteries are based on pyrometallurgy, which uses extremely high temperatures, or hydrometallurgy, which uses acids and reducing agents for extraction. Both processes are energy intensive: pyrometallurgy produces greenhouse gas emissions, while hydrometallurgy creates wastewater that needs to be processed and handled.

    In contrast, Azimi’s lab group is using supercritical fluid extraction to recover metals from end-of-life lithium-ion batteries. This process separates one component from another by using an extracting solvent at a temperature and pressure above its critical point, where it adopts the properties of both a liquid and a gas.

    To recover the metals, Zhang used carbon dioxide as a solvent, which was brought to a supercritical phase by raising the temperature above 31 °C and the pressure to about 7.4 megapascals, just past carbon dioxide’s critical point.
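    As a quick illustration of the “supercritical” criterion described above, the sketch below checks operating conditions against the textbook critical point of CO2 (about 31.0 °C and 7.38 MPa). The threshold constants are standard reference values; the function itself is an illustrative assumption, not anything from the paper.

```python
# Sanity check of quoted operating conditions against the critical point
# of CO2. Above both thresholds the fluid blends liquid- and gas-like
# behaviour, which is what supercritical fluid extraction exploits.

CO2_CRITICAL_T_C = 31.0     # critical temperature of CO2, in degrees C
CO2_CRITICAL_P_MPA = 7.38   # critical pressure of CO2, in MPa

def co2_is_supercritical(temp_c: float, pressure_mpa: float) -> bool:
    """True when both temperature and pressure exceed the critical point."""
    return temp_c > CO2_CRITICAL_T_C and pressure_mpa > CO2_CRITICAL_P_MPA

print(co2_is_supercritical(35.0, 7.5))   # True: roughly the reported conditions
print(co2_is_supercritical(35.0, 5.0))   # False: pressure below critical
```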

    In the paper, the team showed that this process achieved extraction efficiencies of up to 90 per cent for lithium, nickel, cobalt and manganese, matching conventional leaching processes, while also using fewer chemicals and generating significantly less secondary waste. In fact, the main source of energy expended during the supercritical fluid extraction process was the compression of the CO2.

    “The advantage of our method is that we are using carbon dioxide from the air as the solvent instead of highly hazardous acids or bases,” she says. “Carbon dioxide is abundant, cheap and inert, and it’s also easy to handle, vent and recycle.” 

    Supercritical fluid extraction is not a new process. It has been used in the food and pharmaceutical industries to extract caffeine from coffee beans since the 1970s. Azimi and her team’s work builds on previous research in the Laboratory for Strategic Materials to recover rare earth elements from nickel-metal-hydride batteries.

    However, this is the first time that this process has been used to recover metals from lithium-ion batteries, she says.

    “We really believe in the success and the benefits of this process,” says Azimi.

    “We are now moving towards commercialization of this method to increase its technology readiness level. Our next step is to finalize partnerships to build industrial-scale recycling facilities for secondary resources. If it’s enabled, it would be a big game changer.”

    The research was supported by the Natural Sciences and Engineering Research Council of Canada and Ontario’s Ministry of Economic Development, Job Creation and Trade.

    Science paper:
    Resources, Conservation and Recycling

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Toronto (CA) is a public research university in Toronto, Ontario, Canada, located on the grounds that surround Queen’s Park. It was founded by royal charter in 1827 as King’s College, the oldest university in the province of Ontario.

    Originally controlled by the Church of England, the university assumed its present name in 1850 upon becoming a secular institution.

    As a collegiate university, it comprises eleven colleges each with substantial autonomy on financial and institutional affairs and significant differences in character and history. The university also operates two satellite campuses located in Scarborough and Mississauga.

    The University of Toronto has evolved into Canada’s leading institution of learning, discovery and knowledge creation. We are proud to be one of the world’s top research-intensive universities, driven to invent and innovate.

    Our students have the opportunity to learn from and work with preeminent thought leaders through our multidisciplinary network of teaching and research faculty, alumni and partners.

    The ideas, innovations and actions of more than 560,000 graduates continue to have a positive impact on the world.

    Academically, the University of Toronto is noted for movements and curricula in literary criticism and communication theory, known collectively as the Toronto School.

    The university was the birthplace of insulin and stem cell research, and was the site of the first electron microscope in North America, the identification of the first black hole (Cygnus X-1), multi-touch technology, and the development of the theory of NP-completeness.

    The university was one of several universities involved in early research on deep learning. It receives the most annual scientific research funding of any Canadian university and is one of two members of the Association of American Universities outside the United States, the other being McGill University (CA).

    The Varsity Blues are the athletic teams that represent the university in intercollegiate league matches, with ties to gridiron football, rowing and ice hockey. The earliest recorded instance of gridiron football occurred at University of Toronto’s University College in November 1861.

    The university’s Hart House is an early example of the North American student centre, simultaneously serving cultural, intellectual, and recreational interests within its large Gothic-revival complex.

    The University of Toronto has educated three Governors General of Canada, four Prime Ministers of Canada, three foreign leaders, and fourteen Justices of the Supreme Court. As of March 2019, ten Nobel laureates, five Turing Award winners, 94 Rhodes Scholars, and one Fields Medalist have been affiliated with the university.

    Early history

    The founding of a colonial college had long been the desire of John Graves Simcoe, the first Lieutenant-Governor of Upper Canada and founder of York, the colonial capital. As a University of Oxford (UK)-educated military commander who had fought in the American Revolutionary War, Simcoe believed a college was needed to counter the spread of republicanism from the United States. The Upper Canada Executive Committee recommended in 1798 that a college be established in York.

    On March 15, 1827, a royal charter was formally issued by King George IV, proclaiming “from this time one College, with the style and privileges of a University … for the education of youth in the principles of the Christian Religion, and for their instruction in the various branches of Science and Literature … to continue for ever, to be called King’s College.” The granting of the charter was largely the result of intense lobbying by John Strachan, the influential Anglican Bishop of Toronto who took office as the college’s first president. The original three-storey Greek Revival school building was built on the present site of Queen’s Park.

    Under Strachan’s stewardship, King’s College was a religious institution closely aligned with the Church of England and the British colonial elite, known as the Family Compact. Reformist politicians opposed the clergy’s control over colonial institutions and fought to have the college secularized. In 1849, after a lengthy and heated debate, the newly elected responsible government of the Province of Canada voted to rename King’s College as the University of Toronto and severed the school’s ties with the church. Having anticipated this decision, the enraged Strachan had resigned a year earlier to open Trinity College as a private Anglican seminary. University College was created as the nondenominational teaching branch of the University of Toronto. During the American Civil War the threat of Union blockade on British North America prompted the creation of the University Rifle Corps, which saw battle in resisting the Fenian raids on the Niagara border in 1866. The Corps was part of the Reserve Militia led by Professor Henry Croft.

    Established in 1878, the School of Practical Science was the precursor to the Faculty of Applied Science and Engineering, which has been nicknamed Skule since its earliest days. While the Faculty of Medicine opened in 1843, medical teaching was conducted by proprietary schools from 1853 until 1887, when the faculty absorbed the Toronto School of Medicine. Meanwhile, the university continued to set examinations and confer medical degrees. The university opened the Faculty of Law in 1887, followed by the Faculty of Dentistry in 1888 when the Royal College of Dental Surgeons became an affiliate. Women were first admitted to the university in 1884.

    A devastating fire in 1890 gutted the interior of University College and destroyed 33,000 volumes from the library, but the university restored the building and replenished its library within two years. Over the next two decades a collegiate system took shape as the university arranged federation with several ecclesiastical colleges, including Strachan’s Trinity College in 1904. The university operated the Royal Conservatory of Music from 1896 to 1991 and the Royal Ontario Museum from 1912 to 1968; both still retain close ties with the university as independent institutions. The University of Toronto Press was founded in 1901 as Canada’s first academic publishing house. The Faculty of Forestry, founded in 1907 with Bernhard Fernow as dean, was Canada’s first university faculty devoted to forest science. In 1910, the Faculty of Education opened its laboratory school, the University of Toronto Schools.

    World wars and post-war years

    The First and Second World Wars curtailed some university activities as undergraduate and graduate men eagerly enlisted. Intercollegiate athletic competitions and the Hart House Debates were suspended, although exhibition and interfaculty games were still held. The David Dunlap Observatory in Richmond Hill opened in 1935, followed by the University of Toronto Institute for Aerospace Studies in 1949. The university opened satellite campuses in Scarborough in 1964 and in Mississauga in 1967. The university’s former affiliated schools at the Ontario Agricultural College and Glendon Hall became fully independent of the University of Toronto and became part of the University of Guelph (CA) in 1964 and York University (CA) in 1965, respectively. Beginning in the 1980s, reductions in government funding prompted more rigorous fundraising efforts.

    Since 2000

    In 2000, Kin-Yip Chun was reinstated as a professor at the university after he launched an unsuccessful lawsuit against the university alleging racial discrimination. In 2017, a human rights application was filed against the university by one of its students for allegedly delaying the investigation of a sexual assault and being dismissive of their concerns. In 2018, the university cleared one of its professors of allegations of discrimination and antisemitism in an internal investigation after a complaint was filed by one of its students.

    The University of Toronto was the first Canadian university to amass a financial endowment greater than $1 billion, reaching that milestone in 2007. On September 24, 2020, the university announced a $250 million gift to the Faculty of Medicine from businessman and philanthropist James C. Temerty, the largest single philanthropic donation in Canadian history. This broke the previous record for the school set in 2019, when Gerry Schwartz and Heather Reisman jointly donated $100 million for the creation of a 750,000-square-foot innovation and artificial intelligence centre.

    Research

    Since 1926, the University of Toronto has been a member of the Association of American Universities, a consortium of the leading North American research universities. The university manages by far the largest annual research budget of any university in Canada, with sponsored direct-cost expenditures of $878 million in 2010. In 2018, the University of Toronto was named the top research university in Canada by Research Infosource, with a sponsored research income (external sources of funding) of $1,147.584 million in 2017. In the same year, the university’s faculty averaged a sponsored research income of $428,200, while graduate students averaged a sponsored research income of $63,700. The federal government was the largest source of funding, with grants from the Canadian Institutes of Health Research; the Natural Sciences and Engineering Research Council; and the Social Sciences and Humanities Research Council amounting to about one-third of the research budget. About eight percent of research funding came from corporations, mostly in the healthcare industry.

    The first practical electron microscope was built by the physics department in 1938. During World War II, the university developed the G-suit, a life-saving garment worn by Allied fighter pilots and later adopted for use by astronauts. Development of the infrared chemiluminescence technique improved analyses of energy behaviours in chemical reactions. In 1963, the asteroid 2104 Toronto was discovered at the David Dunlap Observatory (CA) in Richmond Hill and named after the university. In 1972, studies on Cygnus X-1 led to the publication of the first observational evidence for the existence of black holes. Toronto astronomers have also discovered the Uranian moons Caliban and Sycorax; the dwarf galaxies Andromeda I, II and III; and the supernova SN 1987A. A pioneer in computing technology, the university designed and built UTEC, one of the world’s first operational computers, and later purchased Ferut, the second commercial computer after UNIVAC I. Multi-touch technology was developed at Toronto, with applications ranging from handheld devices to collaboration walls. The AeroVelo Atlas, which won the Igor I. Sikorsky Human Powered Helicopter Competition in 2013, was developed by the university’s team of students and graduates and was tested in Vaughan.

    The discovery of insulin at the University of Toronto in 1921 is considered among the most significant events in the history of medicine. The stem cell was discovered at the university in 1963, forming the basis for bone marrow transplantation and all subsequent research on adult and embryonic stem cells. This was the first of many findings at Toronto relating to stem cells, including the identification of pancreatic and retinal stem cells. The cancer stem cell was first identified in 1997 by Toronto researchers, who have since found stem cell associations in leukemia; brain tumors; and colorectal cancer. Medical inventions developed at Toronto include the glycaemic index; the infant cereal Pablum; the use of protective hypothermia in open heart surgery; and the first artificial cardiac pacemaker. The first successful single-lung transplant was performed at Toronto in 1981, followed by the first nerve transplant in 1988 and the first double-lung transplant in 1989. Researchers identified the maturation promoting factor that regulates cell division and discovered the T-cell receptor, which triggers responses of the immune system. The university is credited with isolating the genes that cause Fanconi anemia; cystic fibrosis; and early-onset Alzheimer’s disease, among numerous other diseases. Between 1914 and 1972, the university operated the Connaught Medical Research Laboratories, now part of the pharmaceutical corporation Sanofi-Aventis. Among the research conducted at the laboratory was the development of gel electrophoresis.

    The University of Toronto is the primary research presence that supports one of the world’s largest concentrations of biotechnology firms. More than 5,000 principal investigators reside within 2 kilometres (1.2 mi) of the university grounds in Toronto’s Discovery District, conducting $1 billion of medical research annually. MaRS Discovery District is a research park that serves commercial enterprises and the university’s technology transfer ventures. In 2008, the university disclosed 159 inventions and had 114 active start-up companies. Its SciNet Consortium operates the most powerful supercomputer in Canada.

     
  • richardmitnick 11:28 am on October 7, 2022 Permalink | Reply
    Tags: "Mapping human brain development", , , , Researchers at ETH Zürich are growing human brain-​like tissue from stem cells and are then mapping the cell types.,   

    From The Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich] (CH): “Mapping human brain development” 

    From The Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich] (CH)

    10.7.22
    Peter Rüegg

    Researchers at ETH Zürich are growing human brain-​like tissue from stem cells and are then mapping the cell types that occur in different brain regions and the genes that regulate their development.

    1
    Brain organoid from human stem cells under the fluorescence microscope: the protein GLI3 is stained purple and marks neuronal precursor cells in forebrain regions of the organoid. Neurons are stained green. (Photograph: F. Sanchís Calleja, A. Jain, P. Wahle / ETH Zürich)

    The human brain is probably the most complex organ in the entire living world and has long been an object of fascination for researchers. However, studying the brain, and especially the genes and molecular switches that regulate and direct its development, is no easy task.

    To date, scientists have proceeded using animal models, primarily mice, but their findings cannot be transferred directly to humans. A mouse’s brain is structured differently and lacks the furrowed surface typical of the human brain. Cell cultures have thus far been of limited value in this field, as cells tend to spread over a large area when grown on a culture dish; this does not correspond to the natural three-dimensional structure of the brain.

    Mapping molecular fingerprints

    A group of researchers led by Barbara Treutlein, ETH Professor at the Department of Biosystems Science and Engineering in Basel, has now taken a new approach to studying the development of the human brain: they are growing and using organoids – millimetre-sized three-dimensional tissues that can be grown from what are known as pluripotent stem cells.

    Provided these stem cells receive the right stimulus, researchers can program them to become any kind of cell present in the body, including neurons. When the stem cells are aggregated into a small ball of tissue and then exposed to the appropriate stimulus, they can even self-organize and form a three-dimensional brain organoid with a complex tissue architecture.

    In a new study just published in Nature [below], Treutlein and her colleagues have now studied thousands of individual cells within a brain organoid at various points in time and in great detail. Their goal was to characterise the cells in molecular-genetic terms: to capture the totality of gene transcripts (the transcriptome) as a measure of gene expression, as well as the accessibility of the genome as a measure of regulatory activity. They have managed to represent this data as a kind of map showing the molecular fingerprint of each cell within the organoid.

    However, this procedure generates immense data sets: roughly 20,000 genes are measured in each cell, and each organoid in turn consists of many thousands of cells. “This results in a gigantic matrix, and the only way we can solve it is with the help of suitable programs and machine learning,” explains Jonas Fleck, a doctoral student in Treutlein’s group and one of the study’s co-lead authors. To analyse all this data and predict gene regulation mechanisms, the researchers developed their own program. “We can use it to generate an entire interaction network for each individual gene and predict what will happen in real cells when that gene fails,” Fleck says.
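    To make the scale concrete, the analysis can be pictured as operations on a cells-by-genes matrix. The following Python sketch is purely illustrative: the counts are simulated, the gene names are placeholders, and a naive correlation threshold stands in for the group’s actual machine-learning program, which infers directed regulatory networks.

    ```python
    import numpy as np

    # Toy single-cell expression matrix: rows = cells, columns = genes.
    # Real organoid data spans many thousands of cells and ~20,000 genes;
    # everything here is simulated for illustration only.
    rng = np.random.default_rng(0)
    n_cells, n_genes = 500, 6
    genes = ["GLI3", "G1", "G2", "G3", "G4", "G5"]
    X = rng.poisson(lam=5.0, size=(n_cells, n_genes)).astype(float)
    X[:, 1] += 0.8 * X[:, 0]  # make G1 co-vary with GLI3
    X[:, 2] += 0.5 * X[:, 1]  # and G2 with G1, forming a small cascade

    # Naive co-expression "network": correlate every pair of genes and
    # keep edges above a threshold. Published methods infer directed,
    # regulatory relationships; plain correlation is only a stand-in.
    corr = np.corrcoef(X, rowvar=False)
    edges = [(genes[i], genes[j], round(corr[i, j], 2))
             for i in range(n_genes) for j in range(i + 1, n_genes)
             if abs(corr[i, j]) > 0.3]
    print("inferred edges:", edges)
    ```

    A real pipeline would also normalise the counts and integrate the chromatin-accessibility data, but the underlying matrix view is the same.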

    Identifying genetic switches

    The aim of this study was to systematically identify those genetic switches that have a significant impact on the development of neurons in the different regions of brain organoids.

    With the help of a CRISPR-Cas9 system, the ETH researchers selectively switched off a single gene in each cell, covering about two dozen genes in total across the entire organoid. This enabled them to find out what role each of these genes plays in the development of the brain organoid.

    “This technique can be used to screen genes involved in disease. In addition, we can look at the effect these genes have on how different cells within the organoid develop,” explains Sophie Jansen, also a doctoral student in Treutlein’s group and the second co-lead author of the study.
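    In very reduced form, the readout of such a pooled screen is a comparison of cell-type composition between cells carrying a given knockout and control cells. The Python sketch below shows only that bookkeeping; the guide labels, cell types and proportions are invented, not taken from the study.

    ```python
    import numpy as np
    from collections import Counter

    # Each profiled cell carries a guide RNA naming the knocked-out gene
    # ("control" = unperturbed) plus an assigned cell type. All labels
    # and proportions below are invented for illustration.
    rng = np.random.default_rng(1)
    guides = rng.choice(["control", "GLI3_ko"], size=2000)
    cell_types = ["forebrain", "midbrain", "hindbrain"]
    cells = [(g, rng.choice(cell_types,
                            p=[0.5, 0.3, 0.2] if g == "control"
                            else [0.2, 0.45, 0.35]))  # knockout shifts mix
             for g in guides]

    # Compare cell-type composition per perturbation.
    for guide in ("control", "GLI3_ko"):
        counts = Counter(t for g, t in cells if g == guide)
        total = sum(counts.values())
        print(guide, {t: round(counts[t] / total, 2) for t in cell_types})
    ```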

    2
    Map of a brain organoid: The colours of the cells shown as circles indicate different cell types. Right: Regulatory network of transcription factor genes that controls the development of a brain organoid. (Graphics: Barbara Treutlein / ETH Zürich)

    Checking pattern formation in the forebrain

    To test their theory, the researchers chose the GLI3 gene as an example. This gene is the blueprint for the transcription factor of the same name, a protein that docks onto certain sites on DNA in order to regulate another gene. When GLI3 is switched off, the cellular machinery is prevented from reading this gene and transcribing it into an RNA molecule.

    In mice, mutations in the GLI3 gene can lead to malformations in the central nervous system. Its role in human neuronal development was previously unexplored, but it is known that mutations in the gene lead to diseases such as Greig cephalopolysyndactyly syndrome and Pallister-Hall syndrome.

    Silencing this GLI3 gene enabled the researchers both to verify their theoretical predictions and to determine directly in the cell culture how the loss of this gene affected the brain organoid’s further development. “We have shown for the first time that the GLI3 gene is involved in the formation of forebrain patterns in humans. This had previously been shown only in mice,” Treutlein says.

    Model systems reflect developmental biology

    “The exciting thing about this research is that it lets you use genome-wide data from so many individual cells to postulate what roles individual genes play,” she explains. “What’s equally exciting in my opinion is that these model systems made in a Petri dish really do reflect developmental biology as we know it from mice.”

    Treutlein also finds it fascinating how the culture medium can give rise to self-organized tissue with structures comparable to those of the human brain – not only at the morphological level but also (as the researchers have shown in their latest study) at the level of gene regulation and pattern formation. “Organoids like this are truly an excellent way to study human developmental biology,” she points out.

    Versatile brain organoids

    Research on organoids made up of human cell material has the advantage that the findings are transferable to humans. They can be used to study not only basic developmental biology but also the role of genes in diseases or developmental brain disorders. For example, Treutlein and her colleagues are working with organoids of this type to investigate the genetic cause of autism and of heterotopia; in the latter, neurons appear outside their usual anatomical location in the cerebral cortex.

    Organoids may also be used for testing drugs, and possibly for culturing transplantable organs or organ parts. Treutlein confirms that the pharmaceutical industry is very interested in these cell cultures.

    However, growing organoids takes both time and effort. Moreover, each clump of cells develops individually rather than in a standardised way. That is why Treutlein and her team are working to improve the organoids and automate their manufacturing process.
    __________________________________________________
    Human Cell Atlas

    The research and mapping of brain organoids is embedded in the Human Developmental Cell Atlas; this, in turn, is part of the Human Cell Atlas. The Human Cell Atlas is an attempt by researchers worldwide both to map all cell types in the human body and to compile data on which genes are active in which cells at which times as well as on which genes might be involved in diseases. The head of the Human Cell Atlas project is Aviv Regev, a biology professor at MIT; she received an honorary doctorate from ETH Zürich in 2021. ETH Professor Barbara Treutlein is co-coordinating the Organoid Cell Atlas subsection, which aims to map all the cell stages that can be produced in cell culture and then to compare them with the original cells of the human body.
    __________________________________________________

    Science paper:
    Nature
    See the science paper for instructive material.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ETH Zurich campus

    The Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich] (CH) is a public research university in the city of Zürich, Switzerland. Founded by the Swiss Federal Government in 1854 with the stated mission to educate engineers and scientists, the school focuses exclusively on science, technology, engineering and mathematics. Like its sister institution The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH), it is part of The Swiss Federal Institutes of Technology Domain (ETH Domain), part of The Swiss Federal Department of Economic Affairs, Education and Research [EAER][Eidgenössisches Departement für Wirtschaft, Bildung und Forschung] [Département fédéral de l’économie, de la formation et de la recherche] (CH).

    The university is an attractive destination for international students thanks to low tuition fees of 809 CHF per semester, PhD and graduate salaries that are amongst the world’s highest, and a world-class reputation in academia and industry. There are currently 22,200 students from over 120 countries, of whom 4,180 are pursuing doctoral degrees. In the 2021 edition of the QS World University Rankings, ETH Zürich is ranked 6th in the world; it is ranked 8th by the Times Higher Education World Rankings 2020. In the 2020 QS World University Rankings by subject, it is ranked 4th in the world for engineering and technology (2nd in Europe) and 1st for earth & marine science.

    As of November 2019, 21 Nobel laureates, 2 Fields Medalists, 2 Pritzker Prize winners, and 1 Turing Award winner have been affiliated with the Institute, including Albert Einstein. Other notable alumni include John von Neumann and Santiago Calatrava. It is a founding member of the IDEA League and the International Alliance of Research Universities (IARU) and a member of the CESAER network.

    ETH Zürich was founded on 7 February 1854 by the Swiss Confederation and began giving its first lectures on 16 October 1855 as a polytechnic institute (eidgenössische polytechnische schule) at various sites throughout the city of Zurich. It was initially composed of six faculties: architecture, civil engineering, mechanical engineering, chemistry, forestry, and an integrated department for the fields of mathematics, natural sciences, literature, and social and political sciences.

    It is locally still known as Polytechnikum, or simply as Poly, derived from the original name eidgenössische polytechnische schule, which translates to “federal polytechnic school”.

    ETH Zürich is a federal institute (i.e., under direct administration by the Swiss government), whereas The University of Zürich [Universität Zürich] (CH) is a cantonal institution. The decision for a new federal university was heavily disputed at the time; the liberals pressed for a “federal university”, while the conservative forces wanted all universities to remain under cantonal control, worried that the liberals would gain more political power than they already had. In the beginning, both universities were co-located in the buildings of the University of Zürich.

    From 1905 to 1908, under the presidency of Jérôme Franel, the course program of ETH Zürich was restructured to that of a real university and ETH Zürich was granted the right to award doctorates. In 1909 the first doctorates were awarded. In 1911, it was given its current name, Eidgenössische Technische Hochschule. In 1924, another reorganization structured the university in 12 departments. However, it now has 16 departments.

    ETH Zürich, EPFL (Swiss Federal Institute of Technology in Lausanne) [École polytechnique fédérale de Lausanne](CH), and four associated research institutes form The Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH) with the aim of collaborating on scientific projects.

    Reputation and ranking

    ETH Zürich is ranked among the top universities in the world. Typically, popular rankings place the institution as the best university in continental Europe; ETH Zürich is consistently ranked among the top 1-5 universities in Europe and among the top 3-10 universities in the world.

    Historically, ETH Zürich has achieved its reputation particularly in the fields of chemistry, mathematics and physics. There are 32 Nobel laureates who are associated with ETH Zürich, the most recent of whom is Richard F. Heck, awarded the Nobel Prize in chemistry in 2010. Albert Einstein is perhaps its most famous alumnus.

    In 2018, the QS World University Rankings placed ETH Zürich at 7th overall in the world. In 2015, ETH Zürich was ranked 5th in the world in Engineering, Science and Technology, just behind the Massachusetts Institute of Technology, Stanford University and University of Cambridge (UK). In 2015, ETH Zürich also ranked 6th in the world in Natural Sciences, and in 2016 ranked 1st in the world for Earth & Marine Sciences for the second consecutive year.

    In 2016, Times Higher Education World University Rankings ranked ETH Zürich 9th overall in the world and 8th in the world in the field of Engineering & Technology, just behind the Massachusetts Institute of Technology, Stanford University, California Institute of Technology, Princeton University, University of Cambridge (UK), Imperial College London (UK) and University of Oxford (UK).

    In a comparison of Swiss universities by swissUP Ranking and in rankings published by CHE comparing the universities of German-speaking countries, ETH Zürich traditionally is ranked first in natural sciences, computer science and engineering sciences.

    In the survey CHE Excellence Ranking on the quality of Western European graduate school programs in the fields of biology, chemistry, physics and mathematics, ETH Zürich was assessed as one of the three institutions with excellent programs in all the considered fields, the other two being Imperial College London (UK) and the University of Cambridge (UK).

     
  • richardmitnick 10:17 am on October 7, 2022 Permalink | Reply
    Tags: "DOE Funds Pilot Study Focused on Biosecurity for Bioenergy Crops", , , , , , , , , Research into threats from pathogens and pests would speed short-term response and spark long-term mitigation strategies.,   

    From The DOE’s Brookhaven National Laboratory: “DOE Funds Pilot Study Focused on Biosecurity for Bioenergy Crops” 

    From The DOE’s Brookhaven National Laboratory

    10.6.22

    Karen McNulty Walsh
    kmcnulty@bnl.gov
    (631) 344-8350

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    Research into threats from pathogens and pests would speed short-term response and spark long-term mitigation strategies.

    1
    Pilot study on an important disease in sorghum (above) will develop understanding of threats to bioenergy crops, potentially speeding the development of short-term responses and long-term mitigation strategies. (Credit: U.S. Department of Energy Genomic Science program)

    The U.S. Department of Energy’s (DOE) Office of Science has selected Brookhaven National Laboratory to lead a new research effort focused on potential threats to crops grown for bioenergy production. Understanding how such bioenergy crops could be harmed by known or new pests or pathogens could help speed the development of rapid responses to mitigate damage and longer-term strategies for preventing such harm. The pilot project could evolve into a broader basic science capability to help ensure the development of resilient and sustainable bioenergy crops as part of a transition to a net-zero carbon economy.

    The idea is modeled on the way DOE’s National Virtual Biotechnology Laboratory (NVBL) pooled basic science capabilities to address the COVID-19 pandemic. With $5 million in initial funding, allocated over the next two years, Brookhaven Lab and its partners will develop a coordinated approach for addressing biosecurity challenges. This pilot study will lead to a roadmap for building out a DOE-wide capability known as the National Virtual Biosecurity for Bioenergy Crops Center (NVBBCC).

    “A robust biosecurity capability optimized to respond rapidly to biological threats to bioenergy crops requires an integrated and versatile platform,” said Martin Schoonen, Brookhaven Lab’s Associate Laboratory Director for Environment, Biology, Nuclear Science & Nonproliferation, who will serve as principal investigator for the pilot project. “With this initial funding, we’ll develop a bio-preparedness platform for sampling and detecting threats, predicting how they might propagate, and understanding how pests or pathogens interact with bioenergy crops at the molecular level—all of which are essential for developing short-term control measures and long-term solutions.”

    The team will invest in new research tools—including experimental equipment and an integrating computing environment for data sharing, data analysis, and predictive modeling. Experiments on an important disease of energy sorghum, a leading target for bioengineering as an oil-producing crop, will serve as a model to help the team establish optimized protocols for studying plant-pathogen interactions.

    In addition, a series of workshops will bring together experts from a range of perspectives and institutions to identify partnerships within and outside DOE, as well as any future investments needed, to establish the full capabilities of an end-to-end biosecurity platform.

    “NVBBCC is envisioned to be a distributed, virtual center with multiple DOE labs at its core to maximize the use of unique facilities and expertise across the DOE complex,” Schoonen said. “The center will support plant pathology research driven by the interests of the bioenergy crop community, as well as broader plant biology research that could impact crop health.”

    Building the platform

    2
    The pilot study experiments and workshops will be organized around four main themes: detection and sampling, biomolecular characterization, assessment, and mitigation.

    In this initial phase, the research will focus on energy sorghum. This crop’s potential oil yield per acre far exceeds that of soybeans, currently the world’s primary source of biodiesel.

    “Sorghum is susceptible to a devastating fungal disease, caused by Colletotrichum sublineola, which can result in yield losses of up to 67 percent,” said John Shanklin, chair of Brookhaven Lab’s Biology Department and co-lead of the assessment theme. “Finding ways to thwart this pathogen is a high priority for the bioenergy crop community.”

    The NVBBCC team will use a range of tools—including advanced remote-sensing technologies, COVID-19-like rapid test strips, and in-field sampling—to detect C. sublineola. Additional experiments will assess airborne propagation of fungal spores, drawing on Brookhaven Lab’s expertise in modeling the dispersal of aerosol particles.
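    Spore transport of this kind is often approximated with a Gaussian plume model. The Python sketch below implements the textbook ground-reflected plume formula; the source strength, wind speed and dispersion coefficients are invented numbers, a stand-in for, not a description of, the project’s actual aerosol models.

    ```python
    import numpy as np

    def plume_concentration(x, y, z, Q=1e6, u=3.0, H=1.0):
        """Ground-reflected Gaussian plume concentration (spores per m^3).

        x: downwind distance (m), y: crosswind offset (m), z: height (m),
        Q: source strength (spores/s), u: wind speed (m/s), H: release
        height (m). The power-law spreads below are crude stand-ins for
        the usual stability-class tables.
        """
        sigma_y = 0.08 * x ** 0.9  # crosswind spread (m)
        sigma_z = 0.06 * x ** 0.9  # vertical spread (m)
        norm = Q / (2 * np.pi * u * sigma_y * sigma_z)
        crosswind = np.exp(-y**2 / (2 * sigma_y**2))
        vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                    + np.exp(-(z + H)**2 / (2 * sigma_z**2)))
        return norm * crosswind * vertical

    # Centerline concentration at canopy height (1 m) downwind of a source:
    for x in (10, 50, 100, 500):
        print(f"{x:4d} m: {plume_concentration(x, 0.0, 1.0):10.2f} spores/m^3")
    ```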

    The team will also use state-of-the-art biomolecular characterization tools—including cryo-electron microscopes in Brookhaven’s Laboratory for BioMolecular Structure (LBMS) and x-ray crystallography beamlines at the National Synchrotron Light Source II (NSLS-II)—to explore details of how pathogen proteins and plant proteins interact. In addition, they’ll add a new tool, a cryogenic focused ion beam, to produce samples for high-resolution three-dimensional cellular imaging and other advanced imaging modalities.

    Together, these experiments will reveal mechanistic details that provide insight into how plants respond to infections, including how some strains of sorghum develop resistance to C. sublineola. The team will also draw on extensive information about the genetic makeup of sorghum and C. sublineola to identify factors that control expression of the various plant and pathogen proteins.

    The program will be supported by an integrating computing infrastructure with access to sophisticated computational tools across the DOE complex and at partner institutions, enabling integrated data analysis and collaboration using community data standards and tools. The infrastructure will also provide capabilities to develop, train, and verify new analytical and predictive computer models, including novel artificial intelligence (AI) solutions.

    “NVBBCC will build on the Johns Hopkins University-developed SciServer environment, which has been used successfully in large data-sharing and analysis projects in cosmology and soil ecology,” said Kerstin Kleese van Dam, head of Brookhaven Lab’s Computational Science Initiative. “NVBBCC’s computational infrastructure will allow members to easily coordinate research across different domains and sites, accelerating discovery and response times through integrated knowledge sharing.”

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Brookhaven Campus

    One of ten national laboratories overseen and primarily funded by the DOE Office of Science, The DOE’s Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit applied science and technology organization.

    Research at BNL specializes in nuclear and high energy physics, energy science and technology, environmental and bioscience, nanoscience and national security. The 5,300-acre campus contains several large research facilities, including the Relativistic Heavy Ion Collider [below] and National Synchrotron Light Source II [below]. Seven Nobel Prizes have been awarded for work conducted at Brookhaven Lab.

    BNL is staffed by approximately 2,750 scientists, engineers, technicians, and support personnel, and hosts 4,000 guest investigators every year. The laboratory has its own police station, fire department, and ZIP code (11973). In total, the lab spans a 5,265-acre (21 km^2) area that is mostly coterminous with the hamlet of Upton, New York. BNL is served by a rail spur operated as-needed by the New York and Atlantic Railway. Co-located with the laboratory is the Upton, New York, forecast office of the National Weather Service.

    Major programs

    Although originally conceived as a nuclear research facility, Brookhaven Lab’s mission has greatly expanded. Its foci are now:

    Nuclear and high-energy physics
    Physics and chemistry of materials
    Environmental and climate research
    Nanomaterials
    Energy research
    Nonproliferation
    Structural biology
    Accelerator physics

    Operation

    Brookhaven National Lab was originally owned by the Atomic Energy Commission and is now owned by that agency’s successor, the United States Department of Energy (DOE). DOE subcontracts the research and operation to universities and research organizations. It is currently operated by Brookhaven Science Associates LLC, which is an equal partnership of Stony Brook University and Battelle Memorial Institute. From 1947 to 1998, it was operated by Associated Universities, Inc. (AUI), but AUI lost its contract in the wake of two incidents: a 1994 fire at the facility’s High Flux Beam Reactor that exposed several workers to radiation, and reports in 1997 of a tritium leak into the groundwater of the Long Island Central Pine Barrens on which the facility sits.

    Foundations

    Following World War II, the US Atomic Energy Commission was created to support government-sponsored peacetime research on atomic energy. The effort to build a nuclear reactor in the American northeast was fostered largely by physicists Isidor Isaac Rabi and Norman Foster Ramsey Jr., who during the war witnessed many of their colleagues at Columbia University leave for new remote research sites following the departure of the Manhattan Project from its campus. Their effort to house this reactor near New York City was rivalled by a similar effort at the Massachusetts Institute of Technology to have a facility near Boston, Massachusetts. Involvement was quickly solicited from representatives of northeastern universities to the south and west of New York City such that this city would be at their geographic center. In March 1946 a nonprofit corporation was established that consisted of representatives from nine major research universities — Columbia University, Cornell University, Harvard University, Johns Hopkins University, Massachusetts Institute of Technology, Princeton University, University of Pennsylvania, University of Rochester, and Yale University.

    Out of 17 considered sites in the Boston-Washington corridor, Camp Upton on Long Island was eventually chosen as the most suitable in consideration of space, transportation, and availability. The camp had been a training center for the US Army during both World War I and World War II. After the latter war, Camp Upton was deemed no longer necessary and became available for reuse. A plan was conceived to convert the military camp into a research facility.

    On March 21, 1947, the Camp Upton site was officially transferred from the U.S. War Department to the new U.S. Atomic Energy Commission (AEC), predecessor to the U.S. Department of Energy (DOE).

    Research and facilities

    Reactor history

    In 1947 construction began on the first nuclear reactor at Brookhaven, the Brookhaven Graphite Research Reactor. This reactor, which opened in 1950, was the first reactor to be constructed in the United States after World War II. The High Flux Beam Reactor operated from 1965 to 1999. In 1959 Brookhaven built the first US reactor specifically tailored to medical research, the Brookhaven Medical Research Reactor, which operated until 2000.

    Accelerator history

    In 1952 Brookhaven began using its first particle accelerator, the Cosmotron. At the time the Cosmotron was the world’s highest energy accelerator, being the first to impart more than 1 GeV of energy to a particle.

    BNL Cosmotron 1952-1966.

    The Cosmotron was retired in 1966, after it was superseded in 1960 by the new Alternating Gradient Synchrotron (AGS).

    BNL Alternating Gradient Synchrotron (AGS).

    The AGS was used in research that resulted in 3 Nobel prizes, including the discovery of the muon neutrino, the charm quark, and CP violation.

    In 1970 BNL started the ISABELLE project to develop and build two proton intersecting storage rings.

    The groundbreaking for the project was in October 1978. In 1981, with the tunnel for the accelerator already excavated, problems with the superconducting magnets needed for the ISABELLE accelerator brought the project to a halt, and the project was eventually cancelled in 1983.

    The National Synchrotron Light Source operated from 1982 to 2014 and was involved with two Nobel Prize-winning discoveries. It has since been replaced by the National Synchrotron Light Source II [below].

    BNL National Synchrotron Light Source.

    After ISABELLE’s cancellation, physicists at BNL proposed that the excavated tunnel and parts of the magnet assembly be used in another accelerator. In 1984 the first proposal for the accelerator now known as the Relativistic Heavy Ion Collider (RHIC) [below] was put forward. Construction was funded in 1991, and RHIC has been operational since 2000. One of the world’s only two operating heavy-ion colliders, RHIC is, as of 2010, the second-highest-energy collider after the Large Hadron Collider (CH). RHIC is housed in a tunnel 2.4 miles (3.9 km) long and is visible from space.

    On January 9, 2020, it was announced by Paul Dabbar, undersecretary of the US Department of Energy Office of Science, that the BNL eRHIC design had been selected over the conceptual design put forward by DOE’s Thomas Jefferson National Accelerator Facility [JLab] as the future Electron-Ion Collider (EIC) in the United States.

    In addition to the site selection, it was announced that the BNL EIC had acquired CD-0 (approval of mission need) from the Department of Energy. BNL’s eRHIC design proposes upgrading the existing Relativistic Heavy Ion Collider, which collides beams of ions from light to heavy, including polarized protons, with a polarized electron facility to be housed in the same tunnel.

    Other discoveries

    In 1958, Brookhaven scientists created one of the world’s first video games, Tennis for Two. In 1968 Brookhaven scientists patented Maglev, a transportation technology that utilizes magnetic levitation.

    Major facilities

    Relativistic Heavy Ion Collider (RHIC), which was designed to research quark–gluon plasma and the sources of proton spin. Until 2009 it was the world’s most powerful heavy ion collider. It is the only collider of spin-polarized protons.

    Center for Functional Nanomaterials (CFN), used for the study of nanoscale materials.

    BNL National Synchrotron Light Source II, Brookhaven’s newest user facility, opened in 2015 to replace the National Synchrotron Light Source (NSLS), which had operated for 30 years. NSLS was involved in the work that won the 2003 and 2009 Nobel Prizes in Chemistry.

    Alternating Gradient Synchrotron, a particle accelerator that was used in three of the lab’s Nobel prizes.
    Accelerator Test Facility, generates, accelerates and monitors particle beams.
    Tandem Van de Graaff, once the world’s largest electrostatic accelerator.

    Computational Science resources, including access to a massively parallel Blue Gene series supercomputer that is among the fastest in the world for scientific research, run jointly by Brookhaven National Laboratory and Stony Brook University-SUNY.

    Interdisciplinary Science Building, with unique laboratories for studying high-temperature superconductors and other materials important for addressing energy challenges.
    NASA Space Radiation Laboratory, where scientists use beams of ions to simulate cosmic rays and assess the risks of space radiation to human space travelers and equipment.

    Off-site contributions

    It is a contributing partner to the ATLAS experiment, one of the four detectors located at The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH)[CERN] Large Hadron Collider (LHC).

    The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH)[CERN] map.

    Iconic view of the European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear] [Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH) [CERN] ATLAS detector.

    It is currently operating at The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH) [CERN] near Geneva, Switzerland.

    Brookhaven was also responsible for the design of the Spallation Neutron Source at DOE’s Oak Ridge National Laboratory, Tennessee.

    DOE’s Oak Ridge National Laboratory Spallation Neutron Source annotated.

    Brookhaven plays a role in a range of neutrino research projects around the world, including the Daya Bay Neutrino Experiment (CN), located at a nuclear power plant approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China.

    Daya Bay Neutrino Experiment (CN) nuclear power plant, approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China .


    BNL Center for Functional Nanomaterials.

    BNL National Synchrotron Light Source II.

    BNL NSLS II.

    BNL Relative Heavy Ion Collider Campus.

    BNL/RHIC Phenix detector.


     
  • richardmitnick 9:49 am on October 7, 2022 Permalink | Reply
    Tags: "UCLA-Led Study Could Be Step Toward Cheaper Hydrogen-Based Energy", Researchers devise method for predicting performance of catalysts in fuel cells., , The Henry Samueli School of Engineering and Applied Science,   

    From The DOE’s Brookhaven National Laboratory And The Henry Samueli School of Engineering and Applied Science At The University of California-Los Angeles: “UCLA-Led Study Could Be Step Toward Cheaper Hydrogen-Based Energy” 

    From The DOE’s Brookhaven National Laboratory

    And

    The Henry Samueli School of Engineering and Applied Science

    At

    UCLA bloc

    The University of California-Los Angeles

    10.3.22

    Researchers devise method for predicting performance of catalysts in fuel cells.

    1
    The research group is now collaborating with Toyota Motor Corp. to develop fuel cell catalysts with possible real-world applications. (Pictured: Toyota hydrogen fuel cell concept vehicle, 2019. Unsplash/Darren Halstead)

    A study led by UCLA researchers could help accelerate the use of hydrogen as an environmentally friendly source of energy in transportation and other applications.

    The team developed a method for predicting platinum alloys’ potency and stability — two key indicators of how they will perform as catalysts in hydrogen fuel cells. Then, using that technique, they designed and produced an alloy that yielded excellent results under conditions approximating real-world use. The findings are published in the journal Nature Catalysis [below].

    “For the sustainability of our planet, we can’t keep living the way we do, and reinventing energy is one major way to change our path,” said corresponding author Yu Huang, a professor of materials science and engineering at the UCLA Samueli School of Engineering and a member of the California NanoSystems Institute at UCLA. “We have fuel cell cars, but we need to make them cheaper. In this study, we came up with an approach to allow researchers to identify the right catalysts much faster.”

    Fuel cells generate power using oxygen from the atmosphere and hydrogen. A key step in the process is using a catalyst to break the bonds between pairs of oxygen atoms. The catalysts that work best are highly active, in order to drive the reaction, while also being stable enough to be used for long periods of time. And for those designing fuel cells, finding the best catalysts has been a major challenge.
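    For readers who want the chemistry spelled out, the half-reactions below are the standard ones for the proton-exchange-membrane hydrogen fuel cells used in vehicles; the cathode step, the oxygen reduction reaction, is where the platinum catalyst must break the oxygen-oxygen bonds:

    ```latex
    % Standard PEM hydrogen fuel cell half-reactions
    \begin{align*}
    \text{Anode:}         \quad & 2\,\mathrm{H_2} \rightarrow 4\,\mathrm{H^+} + 4\,\mathrm{e^-} \\
    \text{Cathode (ORR):} \quad & \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,\mathrm{e^-} \rightarrow 2\,\mathrm{H_2O} \\
    \text{Overall:}       \quad & 2\,\mathrm{H_2} + \mathrm{O_2} \rightarrow 2\,\mathrm{H_2O}
    \end{align*}
    ```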

    Platinum is the best element for the purpose, but its rarity makes the technology prohibitively expensive for large-scale adoption. An alloy combining platinum with a more readily accessible metal or metals would reduce the cost, but there has never been a practical, real-world method for quickly screening which alloy would make the best catalyst.

    As a result, advances in the technology have so far come through trial and error.

    “This is a decisive step forward toward the rational design, down to the microscopic scale, of catalysts with optimal performance,” said Alessandro Fortunelli of Italy’s National Research Council, a co-corresponding author of the paper. “Nobody has ever come up with a method, either theoretical or experimental, to predict the stability of platinum alloy catalysts.”

    The new method predicts both the potency and the stability of platinum alloy catalysts. It was developed using a combination of experiments, complex computation and X-ray spectroscopy, which allowed the investigators to precisely identify chemical properties.
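    Schematically, a method like this assigns each candidate alloy a predicted activity and a predicted stability and ranks candidates on a joint score. The Python sketch below illustrates only that ranking logic, with an invented “volcano” activity model and made-up descriptor values; it is not the model published in the paper.

    ```python
    # Schematic descriptor-based screen of candidate Pt-alloy catalysts.
    # Activity follows a "volcano": highest near an optimal oxygen binding
    # energy; stability is a simple dissolution-resistance proxy in [0, 1].
    # Every alloy name and number here is invented for illustration.
    OPTIMAL_O_BINDING = -1.0  # eV, hypothetical volcano peak

    candidates = {
        # alloy: (O binding energy / eV, stability proxy)
        "Pt":     (-1.40, 0.90),
        "Pt3Ni":  (-1.10, 0.60),
        "Pt3Co":  (-1.05, 0.65),
        "PtNiCo": (-0.98, 0.80),
    }

    def activity(e_bind: float) -> float:
        """Closer to the volcano peak -> higher predicted activity."""
        return 1.0 / (1.0 + abs(e_bind - OPTIMAL_O_BINDING))

    ranked = sorted(candidates.items(),
                    key=lambda kv: activity(kv[1][0]) * kv[1][1],
                    reverse=True)
    for alloy, (e_bind, stability) in ranked:
        print(f"{alloy:7s} activity={activity(e_bind):.2f} "
              f"stability={stability:.2f}")
    ```

    The point of such a screen is that candidates can be ranked before synthesis, so only the most promising alloys need to be made and tested.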

    The researchers then created catalysts combining precise amounts of platinum, nickel and cobalt in a specific atomic structure and configuration based on their experimental measurements. They showed that the alloy they designed is both highly active and highly stable, a rare but much-needed combination for fuel cell catalysts.

    Huang said that the method could be applied to potential catalysts mixing platinum with a subset of metals beyond nickel and cobalt.

    The paper’s other co-corresponding authors are chemist Qingying Jia of Northeastern University and theorist William Goddard of Caltech. Huang, whose UCLA laboratory was primarily responsible for designing and testing the catalyst, said the collaboration with scientists and engineers at other institutions was vital to the study’s success.

    “Lacking any of these partners, this work would be impossible,” she said. “For a long-term, curiosity-driven collaboration such as this one, the most important thing is to have the right people. Every single one of us was focused on digging deep and trying to figure out what’s happening. It also helped that this was a fun team to work with.”

    Huang’s group is now collaborating with Toyota Motor Corp. to develop fuel cell catalysts with possible real-world applications.

    The study’s first author is Jin Huang, who earned a doctorate from UCLA in 2021. Other UCLA co-authors are doctoral students Zeyan Liu, Bosi Peng and Yang Liu; former graduate students Mufan Li and Sung-Joon Lee; postdoctoral researcher Chengzhang Wan; assistant project scientist Enbo Zhu, who worked on the study as both a doctoral student and postdoctoral researcher at UCLA; and Xiangfeng Duan, a professor of chemistry and biochemistry. Other authors are from the Brookhaven National Laboratory in New York, Italy’s National Research Council, Northeastern University and UC Irvine.

    The research was supported by the U.S. Office of Naval Research and the National Science Foundation.

    Science paper:
    Nature Catalysis
    See the science paper for instructive material.

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The UCLA Henry Samueli School of Engineering and Applied Science is the school of engineering at the University of California-Los Angeles. It opened as the College of Engineering in 1945, and was renamed the School of Engineering in 1969. Since its initial enrollment of 379 students, the school has grown to approximately 6,100 students. The school is ranked 16th among all engineering schools in the United States. The school offers 28 degree programs and is home to eight externally funded interdisciplinary research centers, including those in space exploration, wireless sensor systems, and nanotechnology.

    The University of California-Los Angeles

    UC LA Campus

    For nearly 100 years, The University of California-Los Angeles has been a pioneer, persevering through impossibility, turning the futile into the attainable.

    We doubt the critics, reject the status quo and see opportunity in dissatisfaction. Our campus, faculty and students are driven by optimism. It is not naïve; it is essential. And it has fueled every accomplishment, allowing us to redefine what’s possible, time after time.

    This can-do perspective has brought us 12 Nobel Prizes, 12 Rhodes Scholarships, more NCAA titles than any university and more Olympic medals than most nations. Our faculty and alumni helped create the Internet and pioneered reverse osmosis. And more than 100 companies have been created based on technology developed at UCLA.

    The University of California-Los Angeles is a public land-grant research university in Los Angeles, California. The University of California-Los Angeles traces its early origins back to 1882 as the southern branch of the California State Normal School (now San Jose State University). It became the Southern Branch of The University of California in 1919, making it the second-oldest (after University of California-Berkeley ) of the 10-campus University of California system.

    The University of California-Los Angeles offers 337 undergraduate and graduate degree programs in a wide range of disciplines, enrolling about 31,500 undergraduate and 12,800 graduate students. The University of California-Los Angeles had 168,000 applicants for Fall 2021, including transfer applicants, making the school the most applied-to of any American university.

    The university is organized into six undergraduate colleges; seven professional schools; and four professional health science schools. The undergraduate colleges are the College of Letters and Science; Samueli School of Engineering; School of the Arts and Architecture; Herb Alpert School of Music; School of Theater, Film and Television; and School of Nursing.

    The University of California-Los Angeles is called a “Public Ivy”, and is ranked among the best public universities in the United States by major college and university rankings. This includes one ranking that has The University of California-Los Angeles as the top public university in the United States in 2021. As of October 2020, 25 Nobel laureates; three Fields Medalists; five Turing Award winners; and two Chief Scientists of the U.S. Air Force have been affiliated with The University of California-Los Angeles as faculty, researchers or alumni. Among the current faculty members, 55 have been elected to the National Academy of Sciences; 28 to the National Academy of Engineering; 39 to the Institute of Medicine; and 124 to the American Academy of Arts and Sciences. The university was elected to the Association of American Universities in 1974.

    The University of California-Los Angeles student-athletes compete as the Bruins in the Pac-12 Conference. The Bruins have won 129 national championships, including 118 NCAA team championships, more than any other university except Stanford University, whose athletes have won 126. The University of California-Los Angeles students, coaches, and staff have won 251 Olympic medals: 126 gold; 65 silver; and 60 bronze. The University of California-Los Angeles student-athletes have competed in every Olympics since 1920 with one exception (1924) and have won a gold medal in every Olympics in which the U.S. has participated since 1932.

    History

    In March 1881, at the request of state senator Reginaldo Francisco del Valle, the California State Legislature authorized the creation of a southern branch of the California State Normal School (now San José State University) in downtown Los Angeles to train teachers for the growing population of Southern California. The Los Angeles branch of the California State Normal School opened on August 29, 1882, on what is now the site of the Central Library of the Los Angeles Public Library system. The facility included an elementary school where teachers-in-training could practice their technique with children. That elementary school is related to the present-day University of California-Los Angeles Lab School. In 1887, the branch campus became independent and changed its name to Los Angeles State Normal School.

    In 1914, the school moved to a new campus on Vermont Avenue (now the site of Los Angeles City College) in East Hollywood. In 1917, UC Regent Edward Augustus Dickson, the only regent representing the Southland at the time, and Ernest Carroll Moore, director of the Normal School, began to lobby the State Legislature to enable the school to become the second University of California campus, after University of California-Berkeley. They met resistance from University of California-Berkeley alumni, Northern California members of the state legislature, and Benjamin Ide Wheeler, president of the University of California from 1899 to 1919, who were all vigorously opposed to the idea of a southern campus. However, David Prescott Barrows, the new president of the University of California, did not share Wheeler’s objections.

    On May 23, 1919, the Southern Californians’ efforts were rewarded when Governor William D. Stephens signed Assembly Bill 626 into law, acquiring the land and buildings and transforming the Los Angeles Normal School into the Southern Branch of the University of California. The same legislation added its general undergraduate program, the Junior College. The Southern Branch campus opened on September 15 of that year, offering two-year undergraduate programs to 250 Junior College students and 1,250 students in the Teachers College under Moore’s continued direction. Southern Californians were furious that their so-called “branch” provided only an inferior junior college program (mocked at the time by The University of Southern California students as “the twig”) and continued to fight Northern Californians (specifically, Berkeley) for the right to three and then four years of instruction culminating in bachelor’s degrees. On December 11, 1923, the Board of Regents authorized a fourth year of instruction and transformed the Junior College into the College of Letters and Science, which awarded its first bachelor’s degrees on June 12, 1925.

    Under University of California President William Wallace Campbell, enrollment at the Southern Branch expanded so rapidly that by the mid-1920s the institution was outgrowing the 25-acre Vermont Avenue location. The Regents searched for a new location and announced their selection of the so-called “Beverly Site”, just west of Beverly Hills, on March 21, 1925, edging out the panoramic hills of the still-empty Palos Verdes Peninsula. After the athletic teams entered the Pacific Coast conference in 1926, the Southern Branch student council adopted the nickname “Bruins”, a name offered by the student council at The University of California-Berkeley. In 1927, the Regents renamed the Southern Branch the University of California at Los Angeles (the word “at” was officially replaced by a comma in 1958, in line with other UC campuses). In the same year, the state broke ground in Westwood on land sold for $1 million, less than one-third its value, by real estate developers Edwin and Harold Janss, for whom the Janss Steps are named. The campus in Westwood opened to students in 1929.

    The original four buildings were the College Library (now Powell Library); Royce Hall; the Physics-Biology Building (which became the Humanities Building and is now the Renee and David Kaplan Hall); and the Chemistry Building (now Haines Hall), arrayed around a quadrangular courtyard on the 400-acre (1.6 km^2) campus. The first undergraduate classes on the new campus were held in 1929 with 5,500 students. After lobbying by alumni, faculty, administration and community leaders, University of California-Los Angeles was permitted to award the master’s degree in 1933 and the doctorate in 1936, against continued resistance from The University of California-Berkeley.

    Maturity as a university

    During its first 32 years, University of California-Los Angeles was treated as an off-site department of The University of California. As such, its presiding officer was called a “provost” and reported to the main campus in Berkeley. In 1951, University of California-Los Angeles was formally elevated to co-equal status with The University of California-Berkeley, and its presiding officer, Raymond B. Allen, was the first chief executive to be granted the title of chancellor. The appointment of Franklin David Murphy to the position of chancellor in 1960 helped spark an era of tremendous growth of facilities and faculty honors. By the end of the decade, University of California-Los Angeles had achieved distinction in a wide range of subjects. This era also secured University of California-Los Angeles’s position as a proper university and not simply a branch of the University of California system. This change is exemplified by an incident involving Chancellor Murphy, which was described by him:

    “I picked up the telephone and called in from somewhere and the phone operator said, “University of California.” And I said, “Is this Berkeley?” She said, “No.” I said, “Well who have I gotten to?” ” University of California-Los Angeles.” I said, “Why didn’t you say University of California-Los Angeles?” “Oh”, she said, “we’re instructed to say University of California.” So, the next morning I went to the office and wrote a memo; I said, “Will you please instruct the operators, as of noon today, when they answer the phone to say, ‘ University of California-Los Angeles.'” And they said, “You know they won’t like it at Berkeley.” And I said, “Well, let’s just see. There are a few things maybe we can do around here without getting their permission.”

    Recent history

    On June 1, 2016, two men were killed in a murder-suicide at an engineering building on campus. School officials put the campus on lockdown as Los Angeles Police Department officers, including SWAT, cleared the campus.

    In 2018, a student-led community coalition known as “Westwood Forward” successfully led an effort to break University of California-Los Angeles and Westwood Village away from the existing Westwood Neighborhood Council and form a new North Westwood Neighborhood Council with over 2,000 out of 3,521 stakeholders voting in favor of the split. Westwood Forward’s campaign focused on making housing more affordable and encouraging nightlife in Westwood by opposing many of the restrictions on housing developments and restaurants the Westwood Neighborhood Council had promoted.

    Academics

    Divisions

    Undergraduate

    College of Letters and Science
    Social Sciences Division
    Humanities Division
    Physical Sciences Division
    Life Sciences Division
    School of the Arts and Architecture
    Henry Samueli School of Engineering and Applied Science (HSSEAS)
    Herb Alpert School of Music
    School of Theater, Film and Television
    School of Nursing
    Luskin School of Public Affairs

    Graduate

    Graduate School of Education & Information Studies (GSEIS)
    School of Law
    Anderson School of Management
    Luskin School of Public Affairs
    David Geffen School of Medicine
    School of Dentistry
    Jonathan and Karin Fielding School of Public Health
    Semel Institute for Neuroscience and Human Behavior
    School of Nursing

    Research

    University of California-Los Angeles is classified among “R1: Doctoral Universities – Very high research activity” and had $1.32 billion in research expenditures in FY 2018.

    Brookhaven Campus

    One of ten national laboratories overseen and primarily funded by The DOE Office of Science, The DOE’s Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit applied science and technology organization.

    Research at BNL specializes in nuclear and high energy physics, energy science and technology, environmental and bioscience, nanoscience and national security. The 5,300-acre campus contains several large research facilities, including the Relativistic Heavy Ion Collider [below] and National Synchrotron Light Source II [below]. Seven Nobel Prizes have been awarded for work conducted at Brookhaven Lab.

    BNL is staffed by approximately 2,750 scientists, engineers, technicians, and support personnel, and hosts 4,000 guest investigators every year. The laboratory has its own police station, fire department, and ZIP code (11973). In total, the lab spans a 5,265-acre (21 km^2) area that is mostly coterminous with the hamlet of Upton, New York. BNL is served by a rail spur operated as-needed by the New York and Atlantic Railway. Co-located with the laboratory is the Upton, New York, forecast office of the National Weather Service.

    Major programs

    Although originally conceived as a nuclear research facility, Brookhaven Lab’s mission has greatly expanded. Its foci are now:

    Nuclear and high-energy physics
    Physics and chemistry of materials
    Environmental and climate research
    Nanomaterials
    Energy research
    Nonproliferation
    Structural biology
    Accelerator physics

    Operation

    Brookhaven National Lab was originally owned by the Atomic Energy Commission and is now owned by that agency’s successor, the United States Department of Energy (DOE). DOE subcontracts the research and operation to universities and research organizations. It is currently operated by Brookhaven Science Associates LLC, which is an equal partnership of Stony Brook University and Battelle Memorial Institute. From 1947 to 1998, it was operated by Associated Universities, Inc. (AUI), but AUI lost its contract in the wake of two incidents: a 1994 fire at the facility’s High Flux Beam Reactor that exposed several workers to radiation, and reports in 1997 of a tritium leak into the groundwater of the Long Island Central Pine Barrens on which the facility sits.

    Foundations

    Following World War II, the US Atomic Energy Commission was created to support government-sponsored peacetime research on atomic energy. The effort to build a nuclear reactor in the American northeast was fostered largely by physicists Isidor Isaac Rabi and Norman Foster Ramsey Jr., who during the war witnessed many of their colleagues at Columbia University leave for new remote research sites following the departure of the Manhattan Project from its campus. Their effort to house this reactor near New York City was rivaled by a similar effort at the Massachusetts Institute of Technology to have a facility near Boston, Massachusetts. Involvement was quickly solicited from representatives of northeastern universities to the south and west of New York City, so that the city would be at their geographic center. In March 1946 a nonprofit corporation was established that consisted of representatives from nine major research universities — Columbia University, Cornell University, Harvard University, Johns Hopkins University, Massachusetts Institute of Technology, Princeton University, University of Pennsylvania, University of Rochester, and Yale University.

    Out of 17 sites considered in the Boston-Washington corridor, Camp Upton on Long Island was eventually chosen as the most suitable in terms of space, transportation, and availability. The camp had been a US Army training center during both World War I and World War II. After the latter war, Camp Upton was deemed no longer necessary and became available for reuse. A plan was conceived to convert the military camp into a research facility.

    On March 21, 1947, the Camp Upton site was officially transferred from the U.S. War Department to the new U.S. Atomic Energy Commission (AEC), predecessor to the U.S. Department of Energy (DOE).

    Research and facilities

    Reactor history

    In 1947 construction began on the first nuclear reactor at Brookhaven, the Brookhaven Graphite Research Reactor. This reactor, which opened in 1950, was the first reactor to be constructed in the United States after World War II. The High Flux Beam Reactor operated from 1965 to 1999. In 1959 Brookhaven built the first US reactor specifically tailored to medical research, the Brookhaven Medical Research Reactor, which operated until 2000.

    Accelerator history

    In 1952 Brookhaven began using its first particle accelerator, the Cosmotron. At the time the Cosmotron was the world’s highest energy accelerator, being the first to impart more than 1 GeV of energy to a particle.

    BNL Cosmotron 1952-1966.

    The Cosmotron was retired in 1966, after it was superseded in 1960 by the new Alternating Gradient Synchrotron (AGS).

    BNL Alternating Gradient Synchrotron (AGS).

    The AGS was used in research that resulted in three Nobel Prizes, including the discovery of the muon neutrino, the charm quark, and CP violation.

    In 1970 BNL started the ISABELLE project to develop and build two intersecting proton storage rings.

    The groundbreaking for the project was in October 1978. In 1981, with the tunnel for the accelerator already excavated, problems with the superconducting magnets needed for the ISABELLE accelerator brought the project to a halt, and the project was eventually cancelled in 1983.

    The National Synchrotron Light Source operated from 1982 to 2014 and was involved with two Nobel Prize-winning discoveries. It has since been replaced by the National Synchrotron Light Source II [below].

    BNL National Synchrotron Light Source.

    After ISABELLE’s cancellation, physicists at BNL proposed that the excavated tunnel and parts of the magnet assembly be used in another accelerator. In 1984 the first proposal for the accelerator now known as the Relativistic Heavy Ion Collider (RHIC) [below] was put forward. Construction was funded in 1991, and RHIC has been operational since 2000. One of the world’s only two operating heavy-ion colliders, RHIC is, as of 2010, the second-highest-energy collider after the Large Hadron Collider (CH). RHIC is housed in a tunnel 2.4 miles (3.9 km) long and is visible from space.

    On January 9, 2020, it was announced by Paul Dabbar, undersecretary of the US Department of Energy Office of Science, that the BNL eRHIC design has been selected over the conceptual design put forward by DOE’s Thomas Jefferson National Accelerator Facility [Jlab] as the future Electron–ion collider (EIC) in the United States.

    In addition to the site selection, it was announced that the BNL EIC had acquired CD-0 from the Department of Energy. BNL’s eRHIC design proposes upgrading the existing Relativistic Heavy Ion Collider, which collides beams of ions from light to heavy, including polarized protons, with a polarized electron facility, to be housed in the same tunnel.

    Other discoveries

    In 1958, Brookhaven scientists created one of the world’s first video games, Tennis for Two. In 1968 Brookhaven scientists patented Maglev, a transportation technology that utilizes magnetic levitation.

    Major facilities

    Relativistic Heavy Ion Collider (RHIC), which was designed to research quark–gluon plasma and the sources of proton spin. Until 2009 it was the world’s most powerful heavy ion collider. It is the only collider of spin-polarized protons.

    Center for Functional Nanomaterials (CFN), used for the study of nanoscale materials.

    BNL National Synchrotron Light Source II, Brookhaven’s newest user facility, opened in 2015 to replace the National Synchrotron Light Source (NSLS), which had operated for 30 years. NSLS was involved in the work that won the 2003 and 2009 Nobel Prizes in Chemistry.

    Alternating Gradient Synchrotron, a particle accelerator that was used in three of the lab’s Nobel prizes.
    Accelerator Test Facility, generates, accelerates and monitors particle beams.
    Tandem Van de Graaff, once the world’s largest electrostatic accelerator.

    Computational Science resources, including access to a massively parallel Blue Gene series supercomputer that is among the fastest in the world for scientific research, run jointly by Brookhaven National Laboratory and Stony Brook University-SUNY.

    Interdisciplinary Science Building, with unique laboratories for studying high-temperature superconductors and other materials important for addressing energy challenges.
    NASA Space Radiation Laboratory, where scientists use beams of ions to simulate cosmic rays and assess the risks of space radiation to human space travelers and equipment.

    Off-site contributions

    It is a contributing partner to the ATLAS experiment, one of the four detectors located at The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][Organization européenne pour la recherche nucléaire][Europäische Organization für Kernforschung](CH)[CERN] Large Hadron Collider (LHC).

    The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH)[CERN] map.

    Iconic view of the European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear] [Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH) [CERN] ATLAS detector.

    It is currently operating at The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][Organization européenne pour la recherche nucléaire] [Europäische Organization für Kernforschung](CH) [CERN] near Geneva, Switzerland.

    Brookhaven was also responsible for the design of the Spallation Neutron Source at DOE’s Oak Ridge National Laboratory, Tennessee.

    DOE’s Oak Ridge National Laboratory Spallation Neutron Source annotated.

    Brookhaven plays a role in a range of neutrino research projects around the world, including the Daya Bay Neutrino Experiment (CN), located at a nuclear power plant approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China.



    BNL Center for Functional Nanomaterials.

    BNL National Synchrotron Light Source II.


    BNL Relativistic Heavy Ion Collider Campus.

    BNL/RHIC PHENIX detector.


     
  • richardmitnick 9:13 am on October 7, 2022 Permalink | Reply
    Tags: "Why NIST Is Putting Its CHIPS Into U.S. Manufacturing", A typical integrated circuit today contains billions of tiny on-off switches known as transistors., An area of major excitement at NIST is “advanced packaging.”, Artificial diamonds are currently used as the semiconductors in chips for aerospace applications., “Integrated circuits”, Cell phones send and receive Wi-Fi and cellular signals thanks to semiconductor chips inside them., Chips also abound on the exteriors of homes inside everything from security cameras to solar panels., Chips typically need to go through a dizzying series of steps-and different suppliers-before they become finished products., CPUs and GPUs in computers, Digital cameras contain chips that detect light and turn it into an image., , Gallium nitride is resistant to damage from cosmic rays and other radiation in space so it’s commonly the material of choice for electronic devices in satellites., Light emitting diodes (LEDs) on chips, Manufacturers typically mass-produce dozens of integrated circuits on a single semiconductor wafer and then dice the wafer to separate the individual pieces., Measurement science plays a key role in up to 50% of semiconductor manufacturing steps., Memory chips store data., , NIST has the measurement science and technical standards expertise that is needed by the U.S. chip industry., President Joe Biden recently signed into law the "CHIPS Act"., Semiconductor chips, Silicon carbide can handle larger amounts of electricity and voltage than other materials so it has been used in chips for electric vehicles., Silicon is a type of material known as a semiconductor., Silicon is the most frequently used raw material for chips., The average car can have upward of 1200 chips in it., , Today’s cars are computers on wheels.   

    From The National Institute of Standards and Technology: “Why NIST Is Putting Its CHIPS Into U.S. Manufacturing” 

    From The National Institute of Standards and Technology

    10.7.22

    Ben P. Stein

    1
    A NIST NanoFab user works with an optical microscope and computer software to inspect samples and take pictures.
    Credit: B. Hayes/NIST.

    Right after the pandemic hit, I bought a new vacuum cleaner. I wanted to step up my housecleaning skills since I knew I’d be home a lot more. I was able to buy mine right away, but friends who wanted new appliances weren’t so lucky. My relatives had to wait months for their new refrigerator to arrive. And it wasn’t just appliances. New cars were absent from dealership lots, while used cars commanded a premium. What do all these things have in common? Semiconductor chips.

    The pandemic disrupted the global supply chain, and semiconductor chips were particularly vulnerable. The chip shortage delivered a wakeup call for our country to make our supply chain more resilient and increase domestic manufacturing of chips, which are omnipresent in modern life.

    “To an astonishing degree, the products and services we encounter every day are powered by semiconductor chips,” says Mike Molnar, director of NIST’s Office of Advanced Manufacturing.

    Think about your kitchen. Dishwashers have chips that sense how dirty your loads are and precisely time their cleaning cycles to reduce your energy and water bills. Some rice cookers use chips with “fuzzy logic” to judge how long to cook rice. Many toasters now have chips that make sure your bread is perfectly browned.

    We commonly think of chips as the “brains” that crunch numbers, and that is certainly true for the CPUs in computers, but chips do all sorts of useful things. Memory chips store data. Digital cameras contain chips that detect light and turn it into an image. Modern TVs produce their colorful displays with arrays of light emitting diodes (LEDs) on chips. Phones send and receive Wi-Fi and cellular signals thanks to semiconductor chips inside them. Chips also abound on the exteriors of homes, inside everything from security cameras to solar panels.

    The average car can have upward of 1,200 chips in it, and you can’t make a new car unless you have all of them. “Today’s cars are computers on wheels,” an auto mechanic said to me a few years ago, and his words were never more on point than during the height of the pandemic. In 2021, the chip shortage was estimated to have caused a loss of $110 billion in new vehicle sales worldwide.

    The chips in today’s cars are a combination of low-tech, mature chips and high-tech, state-of-the-art processors (which you’ll especially find in electric vehicles and those that have autonomous driving capabilities).

    2
    It takes a lot of chemistry to make a computer chip. Here a NanoFab user is working with acids while wearing the proper personal protective equipment (PPE). Credit: B. Hayes/NIST.

    Whether mature or cutting-edge, chips typically need to go through a dizzying series of steps — and different suppliers — before they become finished products. And most of this work is currently done outside this country. The U.S., once a leader in chip manufacturing, currently only has about a 12% share in the market.

    To reestablish our nation’s leadership in chip manufacturing, Congress recently passed, and President Joe Biden recently signed into law, the “CHIPS Act”. The CHIPS Act aims to help U.S. manufacturers grow an ecosystem in which they produce both mature and state-of-the-art chips at all stages of the manufacturing process and supply chain, and NIST is going to play a big role in this effort.

    The Dirt on Semiconductor Chips

    Silicon is the most frequently used raw material for chips, and one of the most abundant elements on Earth. To give you a sense of its abundance: silicon and oxygen are the main ingredients of most beach sand and major components of glass, rocks and soil (which means that you can also find it in actual, not just metaphorical, dirt).

    3
    Making a “wafer” of semiconductor material, like the one shown here, is the first step for making a chip.
    Credit: MS Mikel/Shutterstock.

    Silicon is a type of material known as a semiconductor. Electricity flows through semiconductors better than it does through insulators (such as rubber and cotton), but not quite as well as it does through conductors (such as metals and water).

    But that’s a good thing. In semiconductors, you can control electric current precisely — and without any moving parts. By applying a small voltage to them, you can either cause current to flow or to stop — making the semiconductor (or a small region within it) act like a conductor or insulator depending on what you want to do.
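    To make that switching behavior concrete, here is a minimal toy model in Python (my illustration, not NIST code; the 0.7 V threshold is an assumed, textbook-style turn-on voltage for a silicon junction, and real transistors are analog devices that are far more complex):

```python
# Toy model: a transistor as a voltage-controlled switch.
# The 0.7 V threshold is an illustrative assumption, roughly the
# turn-on voltage of a silicon junction; real devices are analog.

def transistor_conducts(gate_voltage: float, threshold: float = 0.7) -> bool:
    """True -> the region acts like a conductor; False -> like an insulator."""
    return gate_voltage >= threshold

for v in (0.0, 0.5, 1.0):
    state = "conducts" if transistor_conducts(v) else "blocks"
    print(f"gate at {v:.1f} V -> {state}")
```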

    The first step for making a chip is to start with a thin slice of a semiconductor material, known as a “wafer,” often round in shape. On top of the wafer, manufacturers then create complex miniature electric circuits, commonly called “integrated circuits” (ICs) because they are embedded as one piece on the wafer. A typical IC today contains billions of tiny on-off switches known as transistors that enable a chip to perform a wide range of complex tasks from sending signals to processing information. Increasingly, these circuits also have “photonic” components in which light travels alongside electricity.

    Manufacturers typically mass-produce dozens of ICs on a single semiconductor wafer and then dice the wafer to separate the individual pieces. When each of them is packaged as a self-contained device, you have a “chip,” which can then be placed in smartphones, computers and so many other products.
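    As a rough illustration of that mass-production step, a standard back-of-the-envelope formula estimates how many dies fit on a wafer from the wafer diameter and die area; the sketch below uses assumed example numbers, not figures from this article:

```python
import math

def gross_dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Classic approximation: wafer area divided by die area, minus an
    edge-loss term proportional to the wafer circumference."""
    d, s = wafer_diameter_mm, die_area_mm2
    return int(math.pi * d**2 / (4 * s) - math.pi * d / math.sqrt(2 * s))

# Assumed example: a 300 mm wafer and a 100 mm^2 die.
print(gross_dies_per_wafer(300, 100))  # ~640 dies, before yield losses
```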

    4
    An array of photonic integrated circuit chips, which use light to process information. These diced photonics chips are ready for assembly and packaging at AIM Photonics, an Albany, New York-based research facility that is part of the national Manufacturing USA network. Credit: AIM Photonics.

    Though silicon is the most commonly used raw material for chips, other semiconductors are used depending on the application. For example, gallium nitride is resistant to damage from cosmic rays and other radiation in space, so it’s commonly the material of choice for electronic devices in satellites. Gallium arsenide is frequently employed to make LEDs, because silicon typically produces heat instead of light if you try to make an LED with it.

    Non-silicon semiconductors are used in the growing field of “power electronics” in vehicles and energy systems such as wind and solar. Silicon carbide can handle larger amounts of electricity and voltage than other materials, so it has been used in chips for electric vehicles to perform functions such as converting DC battery power into the AC power delivered to the motors.

    Diamonds are semiconductors too — and they have the greatest ability to conduct heat of any known material. Artificial diamonds are currently used as the semiconductors in chips for aerospace applications, as they can draw heat away from the power loads generated in those chips.

    So Why NIST?

    Measurement science plays a key role in up to 50% of semiconductor manufacturing steps, according to a recent NIST report. Good measurements enable manufacturers to mass-produce high-quality, high-performance chips.

    NIST has the measurement science and technical standards expertise that is needed by the U.S. chip industry, and our programs to advance manufacturing and support manufacturing networks across the U.S. mean we can partner with industry to find out what they need and deliver on it.

    5
    This is a test chip NIST has developed, as part of a research and development agreement with Google, for measuring the performance of semiconductor devices used in a range of advanced applications such as artificial intelligence. Credit: B. Hoskins/NIST.

    NIST researchers already work on semiconductor materials for many reasons. For example, researchers have developed new ways to measure semiconductor materials in order to detect defects (such as a stray aluminum atom in silicon) that could cause chips to malfunction. As electronic components get smaller, chips need to be increasingly free of such defects.

    “Modern chips may contain over 100 billion complex nanodevices that are less than 50 atoms across — all must work nearly identically for the chip to function,” the NIST report points out.

    Flexible and Printable Chips

    NIST researchers also measure the properties of new materials that could be useful for future inventions. All of the semiconductor materials I mentioned above are brittle and can’t be bent. But devices with chips — from pacemakers to blood pressure monitors to defibrillators — are increasingly being made with flexible materials so they can be “wearable” and you can attach them comfortably to the contours of your body. NIST researchers have been at the forefront of the work to develop these “flexible” chips.

    6
    A circuit made from organic thin-film transistors is fabricated on a flexible plastic substrate. Credit: Patrick Mansell/Penn State.

    Researchers are also studying materials that could serve as “printable” chips that would be cheaper and more environmentally friendly. Instead of going through the complicated multistep process of making chips in a factory, we are developing ways to print circuits directly onto materials such as paper using technology that’s similar to ink-jet printers.

    And while we’ve lost a lot of overall chip manufacturing share, U.S. companies still make many of the machines that carry out the individual steps for fabricating chips, such as those that deposit ultrathin layers of material on top of semiconductors. But what if, instead of these machines being shipped abroad, more domestic manufacturers developed expertise in using them?

    To support this effort, NIST researchers are planning to perform measurements with these very machines in their labs. They will study materials that these machines use and the manufacturing processes associated with them. The information from the NIST work could help more domestic manufacturers develop the know-how for making chips. This work can help create an ecosystem with many domestic chip manufacturers, not just a few, leading to a more resilient supply chain.

    7
    Three researchers at NIST’s NanoFab talk science with a state-of-the-art Atomic Layer Deposition (ALD) system in the background. Credit: B. Hayes/NIST.

    “Reliance on only one supplier is problematic, as we saw with the recent shortage in baby formula,” NIST’s Jyoti Malhotra pointed out to me. Malhotra serves on the senior leadership team of NIST’s Manufacturing Extension Partnership (MEP). MEP has been connecting NIST labs to the U.S. suppliers and manufacturers who produce materials, components, devices and equipment enabling U.S. chip manufacturing.

    Advanced Packaging

    Last but not least, an area of major excitement at NIST is “advanced packaging.” No, we don’t mean the work of those expert gift-wrappers you may find at stores during the holiday season. When we talk about chip packaging, we’re referring to everything that goes around a chip to protect it from damage and connect it to the rest of the device. Advanced packaging takes things to the next level: It uses ingenious techniques during the chipmaking process to connect multiple chips to each other and the rest of the device in as tiny a space as possible.

    But it’s about more than just making a smartphone that fits in your pocket. Advanced packaging enables our devices to be faster and more energy-efficient because information can be exchanged between chips over shorter distances, which in turn reduces energy consumption.
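    A rough way to see the energy argument: the dynamic energy to signal one bit scales with the wire’s capacitance, and capacitance grows with wire length, so shortening chip-to-chip links directly cuts the energy per bit. All the values in this sketch are illustrative assumptions, not measurements:

```python
# Dynamic signaling energy per bit is roughly C * V^2, where the wire
# capacitance C scales with its length. All values below are assumed.
C_PER_MM = 0.2e-12  # farads of wire capacitance per mm (assumption)
V = 1.0             # signaling voltage in volts (assumption)

def energy_per_bit_pj(wire_length_mm: float) -> float:
    """Picojoules to drive one bit across a wire of the given length."""
    return C_PER_MM * wire_length_mm * V**2 * 1e12

for name, length_mm in [("board-level link, 50 mm", 50.0),
                        ("in-package link, 2 mm", 2.0)]:
    print(f"{name}: ~{energy_per_bit_pj(length_mm):.2f} pJ/bit")
# Moving from a 50 mm board trace to a 2 mm in-package link cuts the
# switching energy for that hop by about 25x.
```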

    One great byproduct of advanced packaging’s innovations can be found on my wrist — namely, the smartwatch I wear for my long-distance runs. My watch uses GPS to measure how far I ran. It also measures my heart rate, and after my workouts, it uploads my running data wirelessly to my phone. Its battery lasts for days; it had plenty of juice left even after I ran a full marathon last month.

    Twenty years ago, running watches were big and clunky, with much less functionality. My friends and I had a particular model with a huge face and a bulky slab that fit over the insides of our wrists. When a friend and I opened up his watch to replace his battery, we saw that the GPS receiver was on a completely separate circuit board from the rest of the watch electronics.

    9
    A running friend of mine still has his old running watch, and he recently took a picture of it alongside the modern one that he now uses. The GPS chip in the old watch is on its own circuit board underneath the buttons, apart from the rest of the watch electronics. The modern watch has all the electronic components beneath the small watch face. Credit: Ron Weber.

    Under the small and thin face of my current watch you will find all its electronics, including a GPS sensor, battery, heart-rate monitor, wireless communications device and so many other things.

    Further development of advanced packaging could produce even more powerful devices for monitoring a patient’s vitals, measuring pollutants in the environment, and increasing situational awareness for soldiers in the field.

    10
    This illustration shows the staggering number of ultrathin semiconductor layers that are possible thanks to “advanced packaging” techniques. When I saw this, it reminded me of one of those amazing sandwiches that the cartoon character Dagwood would eat, but I think this is even more impressive! Credit: DoE 3DFeM center at Penn State University.

    Advanced packaging is also a potential niche for domestic manufacturers to grow global market share (currently at 3% for this part of the chipmaking process). Chips are becoming so complex that design and manufacturing processes, once separate steps, are now increasingly intertwined — and the U.S. remains a world leader in chip design. NIST’s measurements to support advanced packaging in chips and standards for the packaging process could give domestic manufacturers a decisive edge in this area.

    All the NIST experts I’ve spoken to talk about a future in which chip manufacturers work increasingly closely with their customers, such as automakers. Closer relationships would mean that customers could collaborate with manufacturers to create more customized chips that bring about completely new products.

    And as we’ve seen, incorporating chips into existing products tends to make them “smart,” whether it’s an appliance figuring out how long to bake the bread, or solar panels that maximize electricity production by coordinating the power output from individual panels. With more domestic manufacturers on the scene, there are more opportunities to incorporate chips into products — that could also be manufactured in the U.S.A.

    I first encountered semiconductor chips in the 1970s, when the U.S. was a dominant force in chip manufacturing. Inside a department store with my mom, I saw pocket calculators on display, and they fascinated me. You could punch their number keys and they would instantly solve any addition or multiplication problem. As a 6-year-old, I thought that they had little brains in them!

    Since then, semiconductor chips have been a big part of my life. And after the pandemic, I realize I can’t take them for granted. I’m glad to be part of an agency that is working to create a more resilient supply chain — and bring back chip manufacturing in this country.
    __________________________________________________

    Semiconductor Chip Glossary

    Semiconductor: Material that can act either as a conductor or an insulator of electricity, depending on small changes in voltage

    Silicon: Semiconductor material that serves as the basis for many circuits in industry

    Transistor: Simple switch, made with a semiconductor material, that turns on or off depending on changes in voltage and can combine with other transistors to create complex devices

    Integrated circuit: Many transistors (anywhere from several to billions) combined to make a small circuit on a chip

    Wafer: Thin piece of semiconductor material (such as silicon) that we use as a base for building multiple integrated circuits

    Lithography: Process of etching into or building onto the surface of a wafer in order to produce patterns of integrated circuits

    Chip: Self-contained piece including the semiconductor surface and integrated circuit, independently packaged for use in electronics such as cellphones or computers

    Fab: Industrial facility where raw silicon wafers become fully functioning electronic chips
    __________________________________________________

    11
    NIST graphic designer Brandon Hayes and me in our bunny suits as we prepared to enter the NIST NanoFab, where Brandon took many amazing pictures, several of which you see in this blog post. Look for more NanoFab photos from Brandon as we continue to cover this topic in the coming months and years!
    Credit: J. Zhang/NIST

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD.

    The National Institute of Standards and Technology‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress, stated that, “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838 that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901, in response to a bill proposed by Congressman James H. Southard (R-Ohio), the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington, DC, and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures, the Bureau developed instruments for electrical units and for measurement of light. In 1905 a meeting was called that would be the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency, the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use, but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production, even operating its own facility to produce optical glass when European supplies were cut off. Between the wars, Harry Diamond of the Bureau developed a blind-approach radio aircraft landing system. During World War II, military research and development was carried out, including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and shortly afterwards the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC: the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time the Standards Western Automatic Computer was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954.

    Due to a changing mission, the “National Bureau of Standards” became “The National Institute of Standards and Technology” in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.

    Organization

    NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

    Communications Technology Laboratory (CTL)
    Engineering Laboratory (EL)
    Information Technology Laboratory (ITL)
    Center for Neutron Research (NCNR)
    Material Measurement Laboratory (MML)
    Physical Measurement Laboratory (PML)

    Extramural programs include:

    Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
    Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
    Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

    NIST’s Boulder laboratories are best known for NIST-F1, a cesium fountain atomic clock.

    NIST‑F1 serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium—which defines the second—NIST broadcasts time signals via longwave radio station WWVB near Fort Collins in Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha in Hawai’i, respectively.
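    For reference, this is the standard SI relation behind that statement (general metrology, not a detail specific to this article): the cesium-133 ground-state hyperfine transition frequency is fixed by definition, and the second follows directly from it.

```latex
% SI definition (fixed by convention):
\Delta\nu_{\mathrm{Cs}} = 9\,192\,631\,770\ \mathrm{Hz}
% One second is the duration of exactly that many periods:
1\ \mathrm{s} = 9\,192\,631\,770 \,/\, \Delta\nu_{\mathrm{Cs}}
```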

    NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR).

    The NCNR provides scientists access to a variety of neutron scattering instruments which they use in many research fields (materials science; fuel cells; biotechnology etc.).

    The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation in continuous operation since 1961.

    SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

    The Center for Nanoscale Science and Technology performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility.

    This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).

    Committees

    NIST has seven standing committees:

    Technical Guidelines Development Committee (TGDC)
    Advisory Committee on Earthquake Hazards Reduction (ACEHR)
    National Construction Safety Team Advisory Committee (NCST Advisory Committee)
    Information Security and Privacy Advisory Board (ISPAB)
    Visiting Committee on Advanced Technology (VCAT)
    Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
    Manufacturing Extension Partnership National Advisory Board (MEPNAB)

    Measurements and standards

    As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content, used as calibration standards for measuring equipment and procedures, quality control benchmarks for industrial processes, and experimental control samples.

    Handbook 44

    NIST publishes the Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through cooperation of the Committee on Specifications and Tolerances of the NCWM and the Weights and Measures Division (WMD) of the NIST. The purpose of the book is a partial fulfillment of the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

    NIST has been publishing various forms of what is now the Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the concept of the primary use of the SI (metric) measurements recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.

     
  • richardmitnick 8:10 am on October 7, 2022 Permalink | Reply
    Tags: "How satellites harm astronomy - what’s being done", , , , , , , International Telecommunication Union, , , Square Kilometer Array Observatory (SKAO),   

    From “EarthSky” : “How satellites harm astronomy – what’s being done” 

    1

    From “EarthSky”

    10.6.22
    Kelly Kizer Whitt

    1
    Artist’s concept shows the 30,000 planned satellites from the Starlink Generation 2 constellation as of 2022. Different sub-constellations are in different colors. Learn more about how mega constellations of satellites harm astronomy. Image via The European Southern Observatory [La Observatorio Europeo Austral] [Observatoire européen austral][Europäische Südsternwarte](EU)(CL).

    You may have heard the growing complaints from astronomers as companies such as SpaceX add more satellites to our sky. Astronomers are not against the communication networks that the satellites provide, but they have valid concerns for the future of ground-based explorations of the universe. And there is only so much astronomers can do on their own to mitigate the problem. A report from the 2021 conference for Dark and Quiet Skies stated:

    “The advantages to society that the communication constellations are offering cannot be disputed, but their impact on the pristine appearance of the night sky and on astronomy must be considered with great attention because they affect both the cultural heritage of humanity and the progress of science.”

    How satellites harm astronomy: The problem with increasing satellites

    Astronomers face a variety of problems with the increasing numbers of satellites filling low-Earth orbit. Optical and near-infrared telescopes feel the impacts from these mega constellations. Some of the biggest impacts are on wide-field surveys, longer exposures, and evening and morning twilight observations, when sunlight reflects off the satellites. The European Southern Observatory reported these findings from a 2021 study [Astronomy & Astrophysics (below)]:

    “The effect is more pronounced for long exposures, up to three percent of which may be ruined during twilight. The study also found that the greatest impact of new satellite constellations will be on wide-field surveys made by telescopes such as the US National Science Foundation’s Vera C. Rubin Observatory, with up to 30-50 percent of twilight observations being seriously impacted.”

    And because we’re talking about scientists, of course they’ve officially started studying the issue. Studies in 2020 [Astronomy and Astrophysics (below)] and 2021 [Astronomy & Astrophysics (below)] showed the impact on optical and near-infrared telescopes. They found that telescopes such as the Very Large Telescope (VLT) and the future Extremely Large Telescope (ELT) will be “moderately affected” by new satellite mega constellations.

    Some telescopes, such as the Rubin Observatory under construction in Chile, will experience greater impacts. These telescopes scan wide areas quickly. This makes them crucial in spotting supernovae or potentially dangerous asteroids.

    The impact on radio astronomy

    Radio astronomy has its own particular concerns. Radio telescopes don’t look in the visible wavelengths of the electromagnetic spectrum, so it’s not the same “visibility” issue. For radio telescopes, the main problem is with the signals the satellites transmit down to Earth. Plus, radio telescopes aren’t only looking at dim lights in the night. They’re looking at the sky 24/7. So, satellites are a problem every hour of the day, not just at twilight.

    But there’s more. A satellite’s signal is much, much stronger than the faint background sources that radio astronomers study. And a satellite doesn’t have to pass right in front of the object of study to cause interference. Satellite sources in a radio telescope’s “peripheral vision” also interfere.
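    To see the scale of the problem, compare flux densities; the numbers in this sketch are assumptions for a generic downlink, not measured values for any particular constellation:

```python
import math

JY = 1e-26  # 1 jansky, in W / m^2 / Hz

def flux_density_jy(eirp_w: float, distance_m: float, bandwidth_hz: float) -> float:
    """Free-space flux density of a transmitter, spread over its bandwidth."""
    return eirp_w / (4 * math.pi * distance_m**2 * bandwidth_hz) / JY

# Assumed values: ~10 W EIRP downlink, 550 km overhead, 250 MHz channel.
satellite = flux_density_jy(10, 550e3, 250e6)   # ~1e6 Jy
faint_source = 1e-3                             # a 1 millijansky radio source

print(f"satellite: {satellite:.2e} Jy")
print(f"ratio: {satellite / faint_source:.1e}x brighter than the source")
```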

    The European Southern Observatory (ESO) described the potential impact of satellites on radio astronomy:

    “They amount to hundreds of radio transmitters above the observatory’s horizon, which will affect the measurements made by our highly sensitive radio telescopes.”

    Radio astronomy has some protection against interference. Radio astronomers call this spectrum management, and the Radio Communication Sector of the International Telecommunication Union (ITU-R) creates regulations that help protect astronomers studying certain frequency bands and wavelength ranges. But the recent large constellations of telecommunication satellites pose new threats.

    One recommendation is for satellite designs that avoid direct illumination of radio telescopes and radio-quiet zones. Also, the cumulative background electromagnetic noise created by satellite constellations should be kept below the limit already agreed to by the ITU.

    Philip Diamond of the Square Kilometer Array Observatory (SKAO) summed up the issue:

    “The deployment of thousands of satellites in low-Earth orbit in the coming years will inevitably change this landscape by creating a much larger number of fast-moving radio sources in the sky, which will interfere with humanity’s ability to explore the universe.”

    What can visual astronomers do?

    It would be great if a computer program could quickly eliminate all the satellite trails or interference from astronomers’ data. But it’s not quite that easy. One recent report outlined the problem that low-Earth orbit satellites pose for images:

    “They leave traces of their transit on astronomical images, significantly decreasing the scientific usability of the collected data. Post-processing of the affected images only partially remedies the problem: the brighter trails may saturate the detectors, making portions of images unusable, while the removal of the fainter trails leaves residual effects that seriously affect important scientific programs, as, for example, statistical, automated surveys of faint galaxies.”

    But there are some things astronomers could do, and have been doing thus far. They can avoid observing where satellites will pass, limit observations to areas of the sky that are in Earth’s shadow and close the shutter precisely when a satellite crosses the field of view. This all takes a lot of knowledge of the paths of thousands of satellites and plenty of pre-planning. Obviously, these are not realistic possibilities for many situations.
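    The pre-planning step can be sketched in a few lines with the open-source skyfield library; this is my illustration, not an observatory pipeline. The TLE orbital-element lines are placeholders that must be replaced with current elements from a public source such as CelesTrak, and the site coordinates are approximately those of the Vera C. Rubin Observatory:

```python
# Sketch: flag the minutes in an observing window when a satellite is
# above the local horizon (pip install skyfield). The TLE lines are
# placeholders; substitute real, current elements before running.
from skyfield.api import EarthSatellite, load, wgs84

line1 = "1 00000U 00000A   ..."  # placeholder TLE line 1 (hypothetical)
line2 = "2 00000  53.0000 ..."   # placeholder TLE line 2 (hypothetical)

ts = load.timescale()
sat = EarthSatellite(line1, line2, "EXAMPLE-SAT", ts)

# Approximate Vera C. Rubin Observatory site on Cerro Pachon, Chile.
observatory = wgs84.latlon(-30.2446, -70.7494, elevation_m=2647)

times = ts.utc(2022, 10, 7, 0, range(0, 30))  # sample a 30-minute window
alt, az, _ = (sat - observatory).at(times).altaz()

for t, a in zip(times, alt.degrees):
    if a > 0:  # above the horizon: a potential trail through the field
        print(t.utc_strftime("%H:%M"), f"altitude {a:.1f} deg")
```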

    What can satellite operators do?

    Another way to mitigate the problem is for satellite operators to adjust their designs (for example, darkening the satellite). They can also operate the satellites in a way that would raise their orbits out of vision of the optical telescopes, deorbit satellites that are no longer functioning, as well as other considerations for minimizing disruption. In several cases, the satellite operators have shown willingness to cooperate on this.

    Unfortunately, the companies planning these mega satellite constellations did not warn astronomers in advance. As a result, many of these satellites were already filling the skies without any restrictions while astronomers scrambled to figure out how to save their observations and lessen the impact. Their efforts led to the creation of a new center that is collecting data from the community, astronomers and the general public, among others, to learn more about the effects on the night sky.

    Official efforts to reduce harm from satellites

    In June 2022, the International Astronomical Union (IAU), together with the National Science Foundation’s National Optical-Infrared Astronomy Research Laboratory (NOIRLab) and SKAO, opened the Center for the Protection of the Dark and Quiet Sky from Satellite Constellation Interference (CPS). The center highlights the dramatically increased risk of interference from low-Earth orbit satellites – both planned and already in orbit – that provide broadband services. On their website, you can see a running total of the number of operational constellation satellites (2,994) and the number of planned constellation satellites (431,713), among other stats.

    Co-director Connie Walker from NOIRLab said:

    “Three years ago SpaceX launched the first 60 Starlink satellites. The number of satellites from this and other companies is increasing exponentially and impacting the field of astronomy. During the last two years, four key workshops identified issues and recommended mitigation solutions with the help of astronomers, satellite industry folk, space lawyers and people from the general community worldwide.”

    In the peer-reviewed journal Air & Space Law [below], scientists at ESO published a study in September 2021 warning extensively of the dangers that unlimited satellites pose to astronomy. They’re trying to address satellite constellations’ impact on astronomy and are coordinating solutions so both satellites and observational astronomy can continue developing without harmful interference.

    A reminder of what we’re losing when satellites harm astronomy

    One of ESO’s studies estimated that in the future, up to 100 satellites could be visible to the unaided eye during twilight. Imagine how that will change your own view of the night sky. Then imagine if your profession depended upon seeing what is beyond the satellites. How will we learn about the universe or detect potential threats to Earth?

    The IAU created the Dark and Quiet Skies Working Group. As Debra Elmegreen, IAU President, summed up:

    “Interference of our view of the sky caused by ground-based artificial lights, optical and infrared trails of satellite constellations and radio transmission on the ground and in space is an existential threat to astronomical observations. Viewing the night sky has been culturally important throughout humanity’s history, and dark skies are important for wildlife as well.”

    Science papers:
    Astronomy & Astrophysics
    Astronomy and Astrophysics 2020
    Astronomy & Astrophysics 2021
    Air & Space Law 2021
    See the science papers for instructive material.

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    Deborah Byrd created the EarthSky radio series in 1991 and founded EarthSky.org in 1994. Today, she serves as Editor-in-Chief of this website. She has won a galaxy of awards from the broadcasting and science communities, including having an asteroid named 3505 Byrd in her honor. A science communicator and educator since 1976, Byrd believes in science as a force for good in the world and a vital tool for the 21st century. “Being an EarthSky editor is like hosting a big global party for cool nature-lovers,” she says.

     
  • richardmitnick 8:31 pm on October 6, 2022 Permalink | Reply
    Tags: "Metamorphic core complexes", "Study Shows Gravitational Forces Deep Within the Earth Have Great Impact on Landscape Evolution", , , Collaborative national research centers on integrating tectonics climate and mammal diversity., , , ,   

    From Stony Brook University – SUNY : “Study Shows Gravitational Forces Deep Within the Earth Have Great Impact on Landscape Evolution” 

    Stony Brook bloc

    From Stony Brook University – SUNY

    10.6.22

    Collaborative national research centers on integrating tectonics, climate and mammal diversity.

    Stony Brook University is leading a research project that focuses on the interplay between the evolution of the landscape, climate and fossil record of mammal evolution and diversification in the Western United States. A little explored aspect of this geosciences research is the connection between gravitational forces deep in the Earth and landscape evolution. Now in a newly published paper in Nature Communications [below], the researchers show by way of computer modeling that deep roots under mountain belts (analogous to the massive ice below the tip of an iceberg) trigger dramatic movements along faults that result in collapse of the mountain belt and exposure of rocks that were once some 15 miles below the surface.
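    The iceberg analogy can be made quantitative with the classic Airy isostasy relation; the back-of-the-envelope sketch below uses standard textbook densities as assumptions and is in no way the paper’s full 3-D model:

```python
# Airy isostasy: topography of height h floats on a crustal root of
# thickness r = h * rho_crust / (rho_mantle - rho_crust).
# Densities are typical textbook values, not the study's parameters.
RHO_CRUST = 2800.0   # kg/m^3
RHO_MANTLE = 3300.0  # kg/m^3

def root_thickness_km(topography_km: float) -> float:
    """Root depth needed to buoyantly support the given topography."""
    return topography_km * RHO_CRUST / (RHO_MANTLE - RHO_CRUST)

for h in (1.0, 3.0, 4.0):
    print(f"{h:.0f} km of topography -> ~{root_thickness_km(h):.0f} km root")
# About 4 km of mountains implies a root of roughly 22 km, iceberg-style.
```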

    The origin of these enigmatic exposures, called “metamorphic core complexes,” has been hotly debated within the scientific community. This study finding may alter the way scientists attempt to uncover the history of Earth as an evolving planet.

    Lead principal investigator William E. Holt, PhD, a Professor of Geophysics in the Department of Geosciences in the School of Arts and Sciences at Stony Brook University, first author Alireza Bahadori, a former PhD student under Holt and now at Columbia University, and colleagues found that these core complexes are a fossil signature of past mountain belts in the Western United States that occupied regions around Phoenix and Las Vegas. These mountain areas left traces in the form of gravel deposits from ancient northward- and eastward-flowing rivers, found today south and west of Flagstaff, Arizona.

    1
    These visuals from the modeling illustrate metamorphic core complex development showing crustal stresses and strain rates, faults, uplift of deeper rocks, and sedimentation from surface erosion. These processes of core complex development occur after a thickened crustal root supporting topography is weakened through the introduction of heat, fluids, and partial melt. Credit: Alireza Bahadori and William E. Holt.

    The work articulated in the paper highlights the development of what the research team terms a general model for metamorphic core complex formation, and a demonstration that these complexes result from the collapse of a mountain belt supported by a thickened crustal root.

    The authors further explain: “We show that gravitational body forces generated by topography and crustal root cause an upward flow pattern of the ductile lower-middle crust, facilitated by a detachment surface evolving into a low-angle normal fault. This detachment surface acquires large amounts of finite strain, consistent with thick mylonite zones found in metamorphic core complexes.”

The work builds on research also published in Nature Communications [below] in 2022. In that paper, Holt and colleagues published a first-of-its-kind three-dimensional model linking climate and tectonics to simulate the landscape and erosion/deposition history of the region before, during and after the formation of these metamorphic core complexes.

    This modeling was linked to a global climate model that predicted precipitation trends throughout the southwestern U.S. over time. The 3-D model accurately predicts deposition of sediments in basins that contain the mammal fossil and climate records.

    The group also published a paper in Science Advances [below] in November 2021, led by team member Katie Loughney.

    This research showed that a major peak in mammal diversification can be statistically tied to the peak in extensional collapse of the ancient mountain belts. Thus, the collaborative study is the first of its kind to quantify how deep Earth forces combine with climate to influence the landscape and impact mammal diversification and species dispersal found within the fossil record.

The study required the vast computing resources provided by the High-Performance Computing Cluster SeaWulf at Stony Brook University. The climate modeling, produced by Ran Feng of the University of Connecticut, was supported by the Cheyenne supercomputer at the NCAR-Wyoming Supercomputing Center.

Much of the research that led to the findings reported in each of the papers was supported by multiple grants from the National Science Foundation, including grant number EAR-1814051 to Stony Brook University.

In addition to Holt, the national collaborative team included several researchers from Stony Brook University: Drs. Emma Troy Rasbury, Daniel Davis, Ali Bahadori (now at Columbia University) and Tara Smiley. Other colleagues include researchers from the University of Michigan (Drs. Catherine Badgley and Katie Loughney – now University of Georgia); University of Connecticut (Dr. Ran Feng); Purdue University (Dr. Lucy Flesch); as well as a researcher from the consulting firm e4Sciences (Dr. Bruce Ward).

    Science papers:
    Nature Communications
    Nature Communications
    Science Advances 2021
    See the science papers for instructive material.

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

Stony Brook campus

    Stony Brook University-SUNY’s reach extends from its 1,039-acre campus on Long Island’s North Shore–encompassing the main academic areas, an 8,300-seat stadium and sports complex and Stony Brook Medicine–to Stony Brook Manhattan, a Research and Development Park, four business incubators including one at Calverton, New York, and the Stony Brook Southampton campus on Long Island’s East End. Stony Brook also co-manages Brookhaven National Laboratory, joining Princeton, the University of Chicago, Stanford, and the University of California on the list of major institutions involved in a research collaboration with a national lab.

And Stony Brook is still growing. To the students, the scholars, the health professionals, the entrepreneurs and all the valued members who make up the vibrant Stony Brook community, this is not only a great local and national university but also one that is making an impact on a global scale.

     
  • richardmitnick 4:46 pm on October 6, 2022 Permalink | Reply
    Tags: "JPL Developing More Tools to Help Search for Life in Deep Space", "OWLS": Oceans Worlds Life Surveyor, A key difficulty the OWLS team faced was how to process liquid samples in space., “OCEANS” uses a technique called capillary electrophoresis – basically running an electric current through a sample to separate it into its components., “OCEANS”: Organic Capillary Electrophoresis Analysis System pressure-cooks liquid samples and feeds them to instruments that search for the chemical building blocks of life., Creating the most powerful instrument system you could design to look for both chemical and biological signs of life., , One vision for OWLS is to use it to analyze frozen water from a vapor plume erupting from Saturn’s moon Enceladus., OWLS’ microscope system would be the first in space capable of imaging cells., The team designed two instruments that can extract a liquid sample and process it in the conditions of space., Using algorithms computers would select only the most interesting data to be sent home while also offering a “manifest” of information still on board.   

    From NASA JPL-Caltech: “JPL Developing More Tools to Help Search for Life in Deep Space” 

    From NASA JPL-Caltech

    10.6.22

    Ian J. O’Neill
    Jet Propulsion Laboratory, Pasadena, Calif.
    818-354-2649
    ian.j.oneill@jpl.nasa.gov

    Melissa Pamer
    Jet Propulsion Laboratory, Pasadena, Calif.
    626-314-4928
    melissa.pamer@jpl.nasa.gov

    1
    Counterclockwise from top: California’s Mono Lake was the site of a field test for JPL’s Ocean Worlds Life Surveyor. A suite of eight instruments designed to detect life in liquid samples from icy moons, OWLS can autonomously track lifelike movement in water flowing past its microscopes. Credit: NASA/JPL-Caltech.

    A team at the Lab has invented new technologies that could be used by future missions to analyze liquid samples from watery worlds and look for signs of alien life.

    Are we alone in the universe? An answer to that age-old question has seemed tantalizingly within reach since the discovery of ice-encrusted moons in our solar system with potentially habitable subsurface oceans. But looking for evidence of life in a frigid sea hundreds of millions of miles away poses tremendous challenges. The science equipment used must be exquisitely complex yet capable of withstanding intense radiation and cryogenic temperatures. What’s more, the instruments must be able to take diverse, independent, complementary measurements that together could produce scientifically defensible proof of life.

    To address some of the difficulties that future life-detection missions might encounter, a team at NASA’s Jet Propulsion Laboratory in Southern California has developed “OWLS”, a powerful suite of science instruments unlike any other. Short for Oceans Worlds Life Surveyor, “OWLS” is designed to ingest and analyze liquid samples. It features eight instruments – all automated – that, in a lab on Earth, would require the work of several dozen people.

    2
    JPL’s OWLS combines powerful chemical-analysis instruments that look for the building blocks of life with microscopes that search for cells. This version of OWLS would be miniaturized and customized for use on future missions. Credit: NASA/JPL-Caltech.

    One vision for “OWLS” is to use it to analyze frozen water from a vapor plume erupting from Saturn’s moon Enceladus.

    “How do you take a sprinkling of ice a billion miles from Earth and determine – in the one chance you’ve got, while everyone on Earth is waiting with bated breath – whether there’s evidence of life?” said Peter Willis, the project’s co-principal investigator and science lead. “We wanted to create the most powerful instrument system you could design for that situation to look for both chemical and biological signs of life.”

    “OWLS” has been funded by JPL Next, a technology accelerator program run by the Lab’s Office of Space Technology. In June, after a half-decade of work, the project team tested its equipment – currently the size of a few filing cabinets – on the salty waters of Mono Lake in California’s Eastern Sierra. OWLS found chemical and cellular evidence of life, using its built-in software to identify that evidence without human intervention.

“We have demonstrated the first generation of the OWLS suite,” Willis said. “The next step is to customize and miniaturize it for specific mission scenarios.”


    The science autonomy software on JPL’s OWLS tracks particles as water flows past the microscope, using machine-learning algorithms to look for evidence of lifelike motion. Here, particle tracks that the autonomy believes belong to “motile” organisms are colored magenta.
    Credit: NASA/JPL-Caltech

    Challenges, Solutions

    A key difficulty the “OWLS” team faced was how to process liquid samples in space. On Earth, scientists can rely on gravity, a reasonable lab temperature, and air pressure to keep samples in place, but those conditions don’t exist on a spacecraft hurtling through the solar system or on the surface of a frozen moon. So the team designed two instruments that can extract a liquid sample and process it in the conditions of space.

    Since it is not clear what form life might take on an ocean world, “OWLS” also needed to include the broadest possible array of instruments, capable of measuring a size range from single molecules to microorganisms. To that end, the project joined two subsystems: one that employs a variety of chemical analysis techniques using multiple instruments, and one with several microscopes to examine visual clues.

The “OWLS” microscope system would be the first in space capable of imaging cells. Developed in conjunction with scientists at Portland State University in Oregon, it combines a digital holographic microscope, which can identify cells and motion throughout the volume of a sample, with two fluorescent imagers, which use dyes to observe chemical content and cellular structures. Together, they provide overlapping views at a resolution of less than a single micron, or about 0.00004 inches.

    Dubbed Extant Life Volumetric Imaging System (“ELVIS”), the microscope subsystem has no moving parts – a rarity. And it uses machine-learning algorithms to both home in on lifelike movement and detect objects lit up by fluorescent molecules, whether naturally occurring in living organisms or as added dyes bound to parts of cells.
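
The idea of separating lifelike motion from passive drift can be illustrated with a standard mean-squared-displacement (MSD) test: purely Brownian particles have an MSD that grows roughly linearly with time, while self-propelled swimmers show nearly quadratic growth. The sketch below is a generic illustration of that principle, not JPL's actual OWLS/ELVIS software; the synthetic tracks and the 1.5 exponent threshold are assumptions for demonstration:

```python
# Illustrative motility test (not JPL's flight software): classify particle
# tracks by the scaling exponent of their mean squared displacement.
# Brownian motion gives MSD ~ t (exponent ~1); directed, self-propelled
# motion gives MSD ~ t^2 (exponent ~2).

import numpy as np

def msd_exponent(track: np.ndarray, max_lag: int = 20) -> float:
    """Fit log(MSD) vs log(lag) for a track of shape (n_frames, 2)."""
    lags = np.arange(1, min(max_lag, len(track) - 1))
    msd = np.array([np.mean(np.sum((track[lag:] - track[:-lag]) ** 2, axis=1))
                    for lag in lags])
    slope, _ = np.polyfit(np.log(lags), np.log(msd), 1)
    return slope

rng = np.random.default_rng(0)
# Synthetic tracks: a passive particle vs. a steadily drifting "swimmer".
brownian = np.cumsum(rng.normal(0, 1.0, (200, 2)), axis=0)
swimmer = np.cumsum(rng.normal(0, 0.2, (200, 2)) + [0.5, 0.3], axis=0)

for name, track in [("brownian", brownian), ("swimmer", swimmer)]:
    alpha = msd_exponent(track)
    label = "motile?" if alpha > 1.5 else "passive"
    print(f"{name}: MSD exponent ~{alpha:.2f} -> {label}")
```

The threshold here is arbitrary; the point is only that motion statistics, rather than appearance, can separate the two populations, which is why tracking every particle in the flow (the "armfuls of hay") pays off.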

    “It’s like looking for a needle in a haystack without having to pick up and examine every single piece of hay,” said co-principal investigator Chris Lindensmith, who leads the microscope team. “We’re basically grabbing big armfuls of hay and saying, ‘Oh, there’s needles here, here, and here.’”

    To examine much tinier forms of evidence, “OWLS” uses its Organic Capillary Electrophoresis Analysis System (“OCEANS”), which essentially pressure-cooks liquid samples and feeds them to instruments that search for the chemical building blocks of life: all varieties of amino acids, as well as fatty acids and organic compounds. The system is so sensitive, it can even detect unknown forms of carbon. Willis, who led development of “OCEANS”, compares it to a shark that can smell just one molecule of blood in a billion molecules of water – and also tell the blood type. It would be only the second instrument system to perform liquid chemical analysis in space, after the Microscopy, Electrochemistry, and Conductivity Analyzer (“MECA”) instrument on NASA’s Phoenix Mars Lander.

“OCEANS” uses a technique called capillary electrophoresis – basically, running an electric current through a sample to separate it into its components. The sample is then routed to three types of detectors, including a mass spectrometer, the most powerful tool for identifying organic compounds.
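
As a rough illustration of how that separation works in principle: each analyte migrates toward the detector at a velocity v = μE, so species with different mobilities μ arrive at different, characteristic times. The numbers below are generic textbook-scale values, not OCEANS flight parameters, and the two analytes are hypothetical:

```python
# Back-of-the-envelope capillary electrophoresis sketch (illustrative
# numbers only): analytes migrate at v = mu * E, so different mobilities
# translate into different arrival times at the detector window.

def migration_time_s(mobility, voltage_v=20e3,
                     total_len_m=0.5, effective_len_m=0.4):
    """Time (s) for an analyte to migrate to the detector window.

    mobility: apparent mobility in m^2/(V*s), combining the analyte's
    electrophoretic mobility and the bulk electroosmotic flow.
    """
    field = voltage_v / total_len_m   # electric field along the capillary, V/m
    velocity = mobility * field       # migration speed toward the detector, m/s
    return effective_len_m / velocity

# Hypothetical mobilities for two amino-acid-like analytes:
for name, mu in [("analyte A", 3.0e-8), ("analyte B", 2.5e-8)]:
    print(f"{name}: ~{migration_time_s(mu) / 60:.1f} min to detector")
```

In a real electropherogram, those arrival-time peaks, combined with the detector responses, are what identify the amino acids, fatty acids, and other building blocks described above.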

    Sending It Home

    These subsystems produce massive amounts of data, just an estimated 0.0001% of which could be sent back to faraway Earth because of data transmission rates that are more limited than dial-up internet from the 1980s. So “OWLS” has been designed with what’s called “onboard science instrument autonomy.” Using algorithms, computers would analyze, summarize, prioritize, and select only the most interesting data to be sent home while also offering a “manifest” of information still on board.
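
In outline, such an autonomy scheme can be thought of as a scoring-and-budgeting loop. The following is a hypothetical sketch of that idea – the Product fields, scores, and budget are invented for illustration, and this is not JPL's onboard software:

```python
# Hedged sketch of onboard data triage: score data products, fill a
# downlink budget with the highest-value items, and keep a manifest of
# what stays on board so scientists can request it later.

from dataclasses import dataclass

@dataclass
class Product:
    name: str
    size_kb: float
    interest: float  # score from onboard analysis; higher = more interesting

def select_downlink(products, budget_kb):
    """Greedily pick the most interesting products that fit the budget."""
    send, manifest, used = [], [], 0.0
    for p in sorted(products, key=lambda p: p.interest, reverse=True):
        if used + p.size_kb <= budget_kb:
            send.append(p)
            used += p.size_kb
        else:
            manifest.append(p)  # stays on board, but is listed for Earth
    return send, manifest

# Hypothetical data products from one observation cycle:
products = [Product("motile_track_042", 120, 0.95),
            Product("holo_volume_007", 900, 0.40),
            Product("ce_electropherogram_3", 60, 0.80),
            Product("raw_frame_dump", 5000, 0.10)]

send, manifest = select_downlink(products, budget_kb=1000)
print("downlink:", [p.name for p in send])
print("manifest (on board):", [p.name for p in manifest])
```

The greedy budget fill stands in for the "analyze, summarize, prioritize, and select" pipeline described above; everything not sent is still cataloged, mirroring the "manifest" of information left on board.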

    “We’re starting to ask questions now that necessitate more sophisticated instruments,” said Lukas Mandrake, the project’s instrument autonomy system engineer. “Are some of these other planets habitable? Is there defensible scientific evidence for life rather than a hint that it might be there? That requires instruments that take a lot of data, and that’s what “OWLS” and its science autonomy is set up to accomplish.”

    For more about JPL’s “OWLS” project, go to:

    https://www.jpl.nasa.gov/go/owls

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    NASA JPL-Caltech Campus

    NASA JPL-Caltech is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge, on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.

    NASA Deep Space Network. Credit: NASA.

    NASA Deep Space Network Station 56 Madrid Spain added in early 2021.

    NASA Deep Space Network Station 14 at Goldstone Deep Space Communications Complex in California

    NASA Canberra Deep Space Communication Complex, AU, Deep Space Network. Credit: NASA

    NASA Deep Space Network Madrid Spain. Credit: NASA.

    The National Aeronautics and Space Administration (NASA) is the agency of the United States government that is responsible for the nation’s civilian space program and for aeronautics and aerospace research.

    President Dwight D. Eisenhower established the National Aeronautics and Space Administration (NASA) in 1958 with a distinctly civilian (rather than military) orientation encouraging peaceful applications in space science. The National Aeronautics and Space Act was passed on July 29, 1958, disestablishing NASA’s predecessor, the National Advisory Committee for Aeronautics (NACA). The new agency became operational on October 1, 1958.

    Since that time, most U.S. space exploration efforts have been led by NASA, including the Apollo moon-landing missions, the Skylab space station, and later the Space Shuttle. Currently, NASA is supporting the International Space Station and is overseeing the development of the Orion Multi-Purpose Crew Vehicle and Commercial Crew vehicles. The agency is also responsible for the Launch Services Program (LSP) which provides oversight of launch operations and countdown management for unmanned NASA launches. Most recently, NASA announced a new Space Launch System that it said would take the agency’s astronauts farther into space than ever before and lay the cornerstone for future human space exploration efforts by the U.S.

NASA science is focused on better understanding Earth through the Earth Observing System, advancing heliophysics through the efforts of the Science Mission Directorate’s Heliophysics Research Program, exploring bodies throughout the Solar System with advanced robotic missions such as New Horizons, and researching astrophysics topics, such as the Big Bang, through the Great Observatories [Hubble, Chandra, Spitzer, and associated programs]. NASA shares data with various national and international organizations, such as the [JAXA] Greenhouse Gases Observing Satellite.

     