Updates from richardmitnick

  • richardmitnick 10:01 am on July 4, 2022
    Tags: "Flexible organic LED produces ‘romantic’ candle-like light"

    From “physicsworld.com” : “Flexible organic LED produces ‘romantic’ candle-like light” 


    29 Jun 2022
    Isabelle Dumé

    A bendable organic LED with a natural mica backing releases a strong, candlelight-like glow. (Courtesy: Andy Chen and Ambrose Chen)

    A new bendable organic light-emitting diode (OLED) that produces warm, candle-like light with hardly any emissions at blue wavelengths might find a place in flexible lighting and smart displays that can be used at night without disrupting the body’s biological clock. The device, which is an improved version of one developed recently by a team of researchers from National Tsing Hua University in Taiwan, is made from a light-emitting layer on a mica substrate that is completely free of plastic.

    Jwo-Huei Jou and Ying-Hao Chu of the National Tsing Hua University’s Department of Materials Science and Engineering and colleagues recently patented OLEDs that produce warm, white light. However, these earlier devices still emit some unwanted blue light, which decreases production of the “sleep hormone” melatonin and can therefore disrupt sleeping patterns. A further issue is that these OLEDs were built on rigid substrates and were therefore not flexible.

    Mica, a natural layered mineral

    One way to make OLEDs flexible is to paste them onto a plastic backing, but most plastics cannot be bent repeatedly – a prerequisite for real-world flexible applications. Jou, Chu and colleagues therefore decided to investigate backings made from mica, a natural layered mineral that can be cleaved into bendable, transparent sheets.

    The researchers began by depositing a clear indium tin oxide (ITO) film onto a mica sheet as the LED’s anode. They then mixed a luminescent material, N,N’-dicarbazole-1,1’-biphenyl, with red and yellow phosphorescent dyes to fabricate the device’s light-emitting layer. Next, they sandwiched this layer between electrically conductive solutions, with the anode on one side and an aluminium layer on the other, to create a flexible OLED.

    Tests showed that when coated with a transparent conductor, the mica substrate is robust to a bending curvature of 1/5 mm^-1 – a record high – and survives 50 000 bending cycles at a 7.5 mm bending radius. The OLED is also highly resistant to moisture and oxygen and has a lifetime that is 83% that of similar devices on glass.
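
    The curvature and radius figures quoted above are reciprocals of one another. A minimal sketch, using only the values quoted in this article, makes the conversion explicit:

```python
# Bending curvature kappa (mm^-1) and bending radius r (mm) are
# reciprocals: kappa = 1 / r. The numbers below are the figures
# quoted in the article.

def curvature_to_radius(kappa_per_mm: float) -> float:
    """Convert a bending curvature (mm^-1) to a bending radius (mm)."""
    return 1.0 / kappa_per_mm

record_curvature = 1 / 5           # 1/5 mm^-1, the record-high curvature
print(curvature_to_radius(record_curvature))  # -> 5.0 (mm)

cycling_radius_mm = 7.5            # radius used for the 50 000-cycle test
print(round(1 / cycling_radius_mm, 3))        # -> 0.133 (mm^-1)
```

    So the record curvature corresponds to bending the sheet around a 5 mm radius, a tighter bend than the radius used for the repeated-cycling test.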

    “Romantic” light

    The new device emits bright, warm light upon the application of a constant current. This light contains even less blue-wavelength light than natural candlelight, Jou and Chu report, meaning that the permissible exposure limit for humans is 47 000 s, compared with just 320 s for a cold-white counterpart, according to the team’s calculations. This means that a person exposed to the OLED for 1.5 hours would see their melatonin production suppressed by about 1.6%, compared with 29% for a cold-white compact fluorescent lamp over the same period.
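
    The comparison above reduces to two simple ratios. A minimal sketch, using only the figures quoted in the article (the ratios themselves are derived here, not stated in the article):

```python
# Blue-light exposure limits quoted in the article.
candle_oled_limit_s = 47_000   # candlelight-style OLED
cold_white_limit_s = 320       # cold-white counterpart

# The warm OLED permits roughly 147x longer exposure.
print(round(candle_oled_limit_s / cold_white_limit_s))  # -> 147

# Melatonin suppression after 1.5 hours of exposure.
oled_suppression = 1.6    # percent, candlelight OLED
cfl_suppression = 29.0    # percent, cold-white compact fluorescent lamp

# The cold-white lamp suppresses melatonin roughly 18x more strongly.
print(round(cfl_suppression / oled_suppression, 1))  # -> 18.1
```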

    “We have fabricated an OLED emitting a psychologically-warm but physically-cool, scorching-free romantic candle-like light on a bendable mica substrate using our patented candlelight OLED technology,” Jou tells Physics World. “This technology could provide designers and artists with more freedom in designing variable lighting systems that fit into different spaces, thanks to their flexibility.”

    The researchers now hope to make their OLEDs completely transparent. “When lit, these candlelight OLEDs could then be seen from both sides,” Chu says.

    The present work is detailed in ACS Applied Electronic Materials.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition (http://www.stemedcoalition.org/)

    physicsworld is a publication of the Institute of Physics. The Institute of Physics is a leading scientific society. We are a charitable organisation with a worldwide membership of more than 50,000, working together to advance physics education, research and application.

    We engage with policymakers and the general public to develop awareness and understanding of the value of physics and, through IOP Publishing, we are world leaders in professional scientific communications.

  • richardmitnick 9:41 am on July 4, 2022
    Tags: "Cosmic manatee accelerates particles from head", Ground based Čerenkov Astronomy, The Manatee Nebula

    From The European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne] [Europäische Weltraumorganisation](EU): “Cosmic manatee accelerates particles from head” 


    04/07/2022 [Just today in social media.]


    ESA’s XMM-Newton has X-rayed this beautiful cosmic creature, known as the Manatee Nebula, pinning down the location of unusual particle acceleration in its “head”.

    The Manatee Nebula, or W50, is thought to be a large supernova remnant created when a giant star exploded around 30 000 years ago, flinging its shells of gases out across the sky. It is one of the largest such features known, spanning the equivalent size of four full Moons.
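
    Four full Moons corresponds to roughly 2° on the sky. Combined with the commonly quoted distance of about 18 000 light-years to SS 433 – an assumption here, since the article gives no distance – the small-angle approximation yields a rough physical extent for the nebula:

```python
import math

# Rough small-angle estimate of W50's physical size.
# Assumptions not stated in the article: distance ~18 000 light-years;
# one full Moon spans ~0.5 degrees, so four Moons span ~2 degrees.
distance_ly = 18_000
angular_size_deg = 4 * 0.5

extent_ly = distance_ly * math.radians(angular_size_deg)
print(round(extent_ly))  # -> 628, i.e. roughly 600 light-years across
```

    A scale of several hundred light-years is consistent with the 100–300 light-year jet distances mentioned later in the article.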

    Unusually for a supernova remnant, a black hole remains in its core. This central “microquasar”, known as SS 433, emits powerful jets of particles travelling at speeds close to a quarter the speed of light that punch through the gassy shells, creating the double-lobed shape.

    SS 433 is identified by the red dot in the middle of the image. The X-ray data acquired by XMM-Newton are represented in yellow (soft X-rays), magenta (medium energy X-rays) and cyan (hard X-ray emission), while red shows radio emission imaged by the Very Large Array and green shows optical wavelengths imaged by the Skinakas Observatory in Greece. NASA NuSTAR and Chandra data were also used for the study (not shown in this image).

    The nebula attracted attention in 2018 when the High-Altitude Water Čerenkov Observatory, which is sensitive to very high energy gamma-ray photons, revealed the presence of highly energetic particles (hundreds of teraelectronvolts) but could not pinpoint where within the Manatee the particles were originating.

    XMM-Newton was crucial in homing in on the region of particle acceleration in the X-ray jet blasting from the Manatee’s head, which begins about 100 light years away from the microquasar (represented by the magenta and cyan colours towards the left side of SS 433) and extends to approximately 300 light years (coinciding with the radio ‘ear’ where the shock terminates).

    Samar Safi-Harb of the University of Manitoba, Canada, who led the study, says: “Thanks to the new XMM-Newton data, supplemented with NuSTAR and Chandra data, we believe the particles are getting accelerated to very high energies in the head of the Manatee through an unusually energetic particle acceleration process. The black hole outflow likely made its way there and has been re-energized to high-energy radiation at that location, perhaps due to shock waves in the expanding gas clouds and enhanced magnetic fields.”

    The nebula acts as a nearby laboratory for exploring a wide range of astrophysical phenomena associated with the outflows of many galactic and extragalactic sources and will be subject to further investigation. Furthermore, follow-up studies by ESA’s future Athena X-ray observatory will provide even more sensitive details about the inner workings of this curious cosmic Manatee.

    The paper has been accepted for publication in The Astrophysical Journal.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne] [Europäische Weltraumorganisation] (EU), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 19 member states. Headquartered in Paris, ESA has a staff of more than 2,000.

    ESA’s space flight programme includes human spaceflight (mainly through participation in the International Space Station program); the launch and operation of uncrewed exploration missions to other planets and the Moon; Earth observation, science and telecommunication; designing launch vehicles; and maintaining a major spaceport, the Guiana Space Centre [Centre Spatial Guyanais; CSG, also called Europe’s Spaceport] at Kourou, French Guiana. The main European launch vehicle Ariane 5 is operated through Arianespace, with ESA sharing in the costs of launching and further developing this launch vehicle. The agency is also working with NASA to manufacture the Orion spacecraft service module that will fly on the Space Launch System.

    The agency’s facilities are distributed among the following centres:

    ESA European Space Research and Technology Centre (ESTEC) (NL) in Noordwijk, Netherlands;
    ESA Centre for Earth Observation [ESRIN] (IT) in Frascati, Italy;
    ESA European Space Operations Centre [ESOC] (DE), ESA Mission Control, in Darmstadt, Germany;
    ESA European Astronaut Centre [EAC] (DE), which trains astronauts for future missions, in Cologne, Germany;
    European Centre for Space Applications and Telecommunications (ECSAT) (UK), a research institute created in 2009, in Harwell, England;
    ESA European Space Astronomy Centre [ESAC] (ES) in Villanueva de la Cañada, Madrid, Spain.

    The European Space Agency Science Programme is a long-term programme of space science and space exploration missions.


    After World War II, many European scientists left Western Europe in order to work with the United States. Although the 1950s boom made it possible for Western European countries to invest in research and specifically in space-related activities, Western European scientists realized solely national projects would not be able to compete with the two main superpowers. In 1958, only months after the Sputnik shock, Edoardo Amaldi (Italy) and Pierre Auger (France), two prominent members of the Western European scientific community, met to discuss the foundation of a common Western European space agency. The meeting was attended by scientific representatives from eight countries, including Harrie Massey (United Kingdom).

    The Western European nations decided to have two agencies: one concerned with developing a launch system, ELDO (European Launcher Development Organisation), and the other the precursor of the European Space Agency, ESRO (European Space Research Organisation). The latter was established on 20 March 1964 by an agreement signed on 14 June 1962. From 1968 to 1972, ESRO launched seven research satellites.

    ESA in its current form was founded with the ESA Convention in 1975, when ESRO was merged with ELDO. ESA had ten founding member states: Belgium, Denmark, France, West Germany, Italy, the Netherlands, Spain, Sweden, Switzerland, and the United Kingdom. These signed the ESA Convention in 1975 and deposited the instruments of ratification by 1980, when the convention came into force. During this interval the agency functioned in a de facto fashion. ESA launched its first major scientific mission in 1975, Cos-B, a space probe monitoring gamma-ray emissions in the universe, which was first worked on by ESRO.


    Later activities

    ESA collaborated with the National Aeronautics and Space Administration (NASA) on the International Ultraviolet Explorer (IUE), the world’s first high-orbit telescope, which was launched in 1978 and operated successfully for 18 years.


    A number of successful Earth-orbit projects followed, and in 1986 ESA began Giotto, its first deep-space mission, to study the comets Halley and Grigg–Skjellerup. Hipparcos, a star-mapping mission, was launched in 1989 and in the 1990s SOHO, Ulysses and the Hubble Space Telescope were all jointly carried out with NASA. Later scientific missions in cooperation with NASA include the Cassini–Huygens space probe, to which ESA contributed by building the Titan landing module Huygens.

    ESA’s Huygens probe, carried by Cassini, landed on Titan.

    As the successor of ELDO, ESA has also constructed rockets for scientific and commercial payloads. Ariane 1, launched in 1979, carried mostly commercial payloads into orbit from 1984 onward. The next two versions of the Ariane rocket were intermediate stages in the development of a more advanced launch system, the Ariane 4, which operated between 1988 and 2003 and established ESA as the world leader in commercial space launches in the 1990s. Although the succeeding Ariane 5 experienced a failure on its first flight, it has since firmly established itself within the heavily competitive commercial space launch market, with 82 successful launches as of 2018. The successor to Ariane 5, the Ariane 6, is under development and is envisioned to enter service in the 2020s.

    The beginning of the new millennium saw ESA become, along with agencies like the National Aeronautics and Space Administration, the Japan Aerospace Exploration Agency, the Indian Space Research Organisation, the Canadian Space Agency (CA) and Roscosmos (RU), one of the major participants in scientific space research. Although ESA had relied on co-operation with NASA in previous decades, especially the 1990s, changed circumstances (such as tough legal restrictions on information sharing by the United States military) led to decisions to rely more on itself and on co-operation with Russia. A 2011 press issue thus stated:

    “Russia is ESA’s first partner in its efforts to ensure long-term access to space. There is a framework agreement between ESA and the government of the Russian Federation on cooperation and partnership in the exploration and use of outer space for peaceful purposes, and cooperation is already underway in two different areas of launcher activity that will bring benefits to both partners.”

    Notable ESA programmes include SMART-1, a probe testing cutting-edge space propulsion technology, the Mars Express and Venus Express missions, as well as the development of the Ariane 5 rocket and its role in the ISS partnership. ESA maintains its scientific and research projects mainly for astronomy and space science missions such as CoRoT, launched on 27 December 2006, a milestone in the search for exoplanets.

    On 21 January 2019, ArianeGroup and Arianespace announced a one-year contract with ESA to study and prepare for a mission to mine the Moon for lunar regolith.


    The treaty establishing the European Space Agency reads:

    The purpose of the Agency shall be to provide for and to promote, for exclusively peaceful purposes, cooperation among European States in space research and technology and their space applications, with a view to their being used for scientific purposes and for operational space applications systems…

    ESA is responsible for setting a unified space and related industrial policy, recommending space objectives to the member states, and integrating national programmes, such as satellite development, into the European programme as much as possible.

    Jean-Jacques Dordain – ESA’s Director General (2003–2015) – outlined the European Space Agency’s mission in a 2003 interview:

    “Today space activities have pursued the benefit of citizens, and citizens are asking for a better quality of life on Earth. They want greater security and economic wealth, but they also want to pursue their dreams, to increase their knowledge, and they want younger people to be attracted to the pursuit of science and technology. I think that space can do all of this: it can produce a higher quality of life, better security, more economic wealth, and also fulfill our citizens’ dreams and thirst for knowledge, and attract the young generation. This is the reason space exploration is an integral part of overall space activities. It has always been so, and it will be even more important in the future.”


    According to the ESA website, the activities are:

    Observing the Earth
    Human Spaceflight
    Space Science
    Space Engineering & Technology
    Telecommunications & Integrated Applications
    Preparing for the Future
    Space for Climate


    Copernicus Programme
    Cosmic Vision
    Horizon 2000
    Living Planet Programme

    Every member country must contribute to these programmes:

    Technology Development Element Programme
    Science Core Technology Programme
    General Study Programme
    European Component Initiative


    Depending on their individual choices the countries can contribute to the following programmes, listed according to:

    Earth Observation
    Human Spaceflight and Exploration
    Space Situational Awareness


    ESA has formed partnerships with universities. ESA_LAB@ refers to research laboratories at universities. There are currently ESA_LAB@ laboratories at:

    Technische Universität Darmstadt (DE)
    École des hautes études commerciales de Paris (HEC Paris) (FR)
    Université de recherche Paris Sciences et Lettres (FR)
    The University of Central Lancashire (UK)

    Membership and contribution to ESA

    By 2015, ESA was an intergovernmental organization of 22 member states. Member states participate to varying degrees in the mandatory (25% of total expenditures in 2008) and optional space programmes (75% of total expenditures in 2008). The 2008 budget amounted to €3.0 billion whilst the 2009 budget amounted to €3.6 billion. The total budget amounted to about €3.7 billion in 2010, €3.99 billion in 2011, €4.02 billion in 2012, €4.28 billion in 2013, €4.10 billion in 2014 and €4.33 billion in 2015. English is the main language within ESA. Additionally, official documents are also provided in German and documents regarding the Spacelab are also provided in Italian. If found appropriate, the agency may conduct its correspondence in any language of a member state.

    Non-full member states
    Since 2016, Slovenia has been an associated member of ESA.

    Latvia became the second current associated member on 30 June 2020, when the Association Agreement was signed by ESA Director General Jan Wörner and the Minister of Education and Science of Latvia, Ilga Šuplinska, in Riga. The Saeima ratified it on July 27. Previously associated members were Austria, Norway and Finland, all of which later joined ESA as full members.

    Since 1 January 1979, Canada has had the special status of a Cooperating State within ESA. By virtue of this accord, The Canadian Space Agency [Agence spatiale canadienne, ASC] (CA) takes part in ESA’s deliberative bodies and decision-making and also in ESA’s programmes and activities. Canadian firms can bid for and receive contracts to work on programmes. The accord has a provision ensuring a fair industrial return to Canada. The most recent Cooperation Agreement was signed on 15 December 2010 with a term extending to 2020. For 2014, Canada’s annual assessed contribution to the ESA general budget was €6,059,449 (CAD$8,559,050). For 2017, Canada has increased its annual contribution to €21,600,000 (CAD$30,000,000).


    After the decision of the ESA Council of 21/22 March 2001, the procedure for accession of European states was detailed in a document titled The Plan for European Co-operating States (PECS). Nations that want to become a full member of ESA do so in three stages. First a Cooperation Agreement is signed between the country and ESA. In this stage, the country has very limited financial responsibilities. If a country wants to co-operate more fully with ESA, it signs a European Cooperating State (ECS) Agreement. The ECS Agreement makes companies based in the country eligible for participation in ESA procurements. The country can also participate in all ESA programmes, except for the Basic Technology Research Programme. While the financial contribution of the country concerned increases, it is still much lower than that of a full member state. The agreement is normally followed by a Plan For European Cooperating State (or PECS Charter). This is a 5-year programme of basic research and development activities aimed at improving the nation’s space industry capacity. At the end of the 5-year period, the country can either begin negotiations to become a full member state or an associated state, or sign a new PECS Charter.

    During the Ministerial Meeting in December 2014, ESA ministers approved a resolution calling for discussions to begin with Israel, Australia and South Africa on future association agreements. The ministers noted that “concrete cooperation is at an advanced stage” with these nations and that “prospects for mutual benefits are existing”.

    A separate space exploration strategy resolution calls for further co-operation with the United States, Russia and China on “LEO” exploration, including a continuation of ISS cooperation; the development of a robust plan for the coordinated use of space transportation vehicles and systems for exploration purposes; participation in robotic missions for the exploration of the Moon; the robotic exploration of Mars, leading to a broad Mars Sample Return mission in which Europe should be involved as a full partner; and human missions beyond LEO in the longer term.

    Relationship with the European Union

    The political perspective of the European Union (EU) was to make ESA an agency of the EU by 2014, although this date was not met. The EU member states provide most of ESA’s funding, and they are all either full ESA members or observers.


    At the time ESA was formed, its main goals did not encompass human space flight; rather it considered itself primarily a scientific research organisation for uncrewed space exploration, in contrast to its American and Soviet counterparts. It is therefore not surprising that the first non-Soviet European in space was not an ESA astronaut on a European spacecraft: it was the Czechoslovak Vladimír Remek who in 1978 became the first person in space from a country other than the Soviet Union or the United States (the first man in space being Yuri Gagarin of the Soviet Union) – on a Soviet Soyuz spacecraft, followed by the Pole Mirosław Hermaszewski and the East German Sigmund Jähn in the same year. This Soviet co-operation programme, known as Intercosmos, primarily involved the participation of Eastern Bloc countries. In 1982, however, Jean-Loup Chrétien became the first non-Communist Bloc astronaut on a flight to the Soviet Salyut 7 space station.

    Because Chrétien did not officially fly into space as an ESA astronaut, but rather as a member of the French CNES astronaut corps, the German Ulf Merbold is considered the first ESA astronaut to fly into space. He participated in the STS-9 Space Shuttle mission that included the first use of the European-built Spacelab in 1983. STS-9 marked the beginning of an extensive ESA/NASA joint partnership that included dozens of space flights of ESA astronauts in the following years. Some of these missions with Spacelab were fully funded and organizationally and scientifically controlled by ESA (such as two missions by Germany and one by Japan) with European astronauts as full crew members rather than guests on board. Beside paying for Spacelab flights and seats on the shuttles, ESA continued its human space flight co-operation with the Soviet Union and later Russia, including numerous visits to Mir.

    During the latter half of the 1980s, European human space flights changed from being the exception to routine and therefore, in 1990, the European Astronaut Centre in Cologne, Germany was established. It selects and trains prospective astronauts and is responsible for the co-ordination with international partners, especially with regard to the International Space Station. As of 2006, the ESA astronaut corps officially included twelve members, including nationals from most large European countries except the United Kingdom.

    In the summer of 2008, ESA started to recruit new astronauts so that final selection would be due in spring 2009. Almost 10,000 people registered as astronaut candidates before registration ended in June 2008. 8,413 fulfilled the initial application criteria. Of the applicants, 918 were chosen to take part in the first stage of psychological testing, which narrowed down the field to 192. After two-stage psychological tests and medical evaluation in early 2009, as well as formal interviews, six new members of the European Astronaut Corps were selected – five men and one woman.

    Cooperation with other countries and organizations

    ESA has signed co-operation agreements with the following states that currently neither plan to integrate as tightly with ESA institutions as Canada, nor envision future membership of ESA: Argentina, Brazil, China, India (for the Chandrayaan mission), Russia and Turkey.

    Additionally, ESA has joint projects with the European Union, NASA of the United States and is participating in the International Space Station together with the United States (NASA), Russia and Japan (JAXA).

    European Union
    ESA and EU member states
    ESA-only members
    EU-only members

    ESA is not an agency or body of the European Union (EU), and has non-EU countries (Norway, Switzerland, and the United Kingdom) as members. There are however ties between the two, with various agreements in place and being worked on, to define the legal status of ESA with regard to the EU.

    There are common goals between ESA and the EU. ESA has an EU liaison office in Brussels. On certain projects, the EU and ESA co-operate, such as the upcoming Galileo satellite navigation system. Space policy has since December 2009 been an area for voting in the European Council. Under the European Space Policy of 2007, the EU, ESA and its Member States committed themselves to increasing co-ordination of their activities and programmes and to organising their respective roles relating to space.

    The Lisbon Treaty of 2009 reinforces the case for space in Europe and strengthens the role of ESA as an R&D space agency. Article 189 of the Treaty gives the EU a mandate to elaborate a European space policy and take related measures, and provides that the EU should establish appropriate relations with ESA.

    Former Italian astronaut Umberto Guidoni, during his tenure as a Member of the European Parliament from 2004 to 2009, stressed the importance of the European Union as a driving force for space exploration, “…since other players are coming up such as India and China it is becoming ever more important that Europeans can have an independent access to space. We have to invest more into space research and technology in order to have an industry capable of competing with other international players.”

    The first EU-ESA International Conference on Human Space Exploration took place in Prague on 22 and 23 October 2009. A road map which would lead to a common vision and strategic planning in the area of space exploration was discussed. Ministers from all 29 EU and ESA members as well as members of parliament were in attendance.

    National space organisations of member states:

    The Centre National d’Études Spatiales (FR) (CNES) (National Centre for Space Studies) is the French government space agency (administratively, a “public establishment of industrial and commercial character”). Its headquarters are in central Paris. CNES is the main participant on the Ariane project. Indeed, CNES designed and tested all Ariane family rockets (mainly from its centre in Évry near Paris).
    The UK Space Agency is a partnership of the UK government departments which are active in space. Through the UK Space Agency, the partners provide delegates to represent the UK on the various ESA governing bodies. Each partner funds its own programme.
    The Italian Space Agency A.S.I. – Agenzia Spaziale Italiana was founded in 1988 to promote, co-ordinate and conduct space activities in Italy. Operating under the Ministry of the Universities and of Scientific and Technological Research, the agency cooperates with numerous entities active in space technology and with the president of the Council of Ministers. Internationally, the ASI provides Italy’s delegation to the Council of the European Space Agency and to its subordinate bodies.
    The German Aerospace Center (DLR)[Deutsches Zentrum für Luft- und Raumfahrt e. V.] is the national research centre for aviation and space flight of the Federal Republic of Germany and of other member states in the Helmholtz Association. Its extensive research and development projects are included in national and international cooperative programmes. In addition to its research projects, the centre is the assigned space agency of Germany bestowing headquarters of German space flight activities and its associates.
    The Instituto Nacional de Técnica Aeroespacial (INTA)(ES) (National Institute for Aerospace Technique) is a Public Research Organization specialised in aerospace research and technology development in Spain. Among other functions, it serves as a platform for space research and acts as a significant testing facility for the aeronautic and space sector in the country.

    National Aeronautics and Space Administration (NASA)

    ESA has a long history of collaboration with NASA. Since ESA’s astronaut corps was formed, the Space Shuttle has been the primary launch vehicle used by ESA’s astronauts to get into space through partnership programmes with NASA. In the 1980s and 1990s, the Spacelab programme was an ESA-NASA joint research programme that had ESA develop and manufacture orbital labs for the Space Shuttle, for several flights on which ESA astronauts took part in experiments.

    In robotic science and exploration missions, NASA has been ESA’s main partner. Cassini–Huygens was a joint NASA-ESA mission, along with the Infrared Space Observatory, INTEGRAL, SOHO, and others.

    National Aeronautics and Space Administration/European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne][Europäische Weltraumorganisation](EU)/ASI Italian Space Agency [Agenzia Spaziale Italiana](IT) Cassini Spacecraft.

    European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne][Europäische Weltraumorganisation](EU) Integral spacecraft

    European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne] [Europäische Weltraumorganisation] (EU)/National Aeronautics and Space Administration SOHO satellite. Launched in 1995.

    Also, the Hubble Space Telescope is a joint project of NASA and ESA.

    National Aeronautics and Space Administration/European Space Agency[La Agencia Espacial Europea] [Agence spatiale européenne] [Europäische Weltraumorganisation](EU) Hubble Space Telescope

    ESA-NASA joint projects include the James Webb Space Telescope and the proposed Laser Interferometer Space Antenna.

    National Aeronautics and Space Administration/European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne] [Europäische Weltraumorganisation](EU)/Canadian Space Agency [Agence Spatiale Canadienne](CA) James Webb Space Telescope annotated. Launched in December 2021.

    Gravity is talking. LISA will listen. Dialogos of Eide.

    The European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne][Europäische Weltraumorganisation](EU)/National Aeronautics and Space Administration eLISA space based, the future of gravitational wave research.

    NASA has committed to provide support to ESA’s proposed MarcoPolo-R mission to return an asteroid sample to Earth for further analysis. NASA and ESA will also likely join together for a Mars Sample Return Mission. In October 2020 the ESA entered into a memorandum of understanding (MOU) with NASA to work together on the Artemis program, which will provide an orbiting lunar gateway and also accomplish the first manned lunar landing in 50 years, whose team will include the first woman on the Moon.

    NASA ARTEMIS spacecraft depiction.

    Cooperation with other space agencies

    Since China has started to invest more money into space activities, the Chinese Space Agency [中国国家航天局] (CN) has sought international partnerships. ESA is, besides The Russian Federal Space Agency [Государственная корпорация по космической деятельности «Роскосмос»] (RU), one of its most important partners. The two space agencies cooperated in the development of the Double Star Mission. In 2017, ESA sent two astronauts to China for two weeks of sea survival training with Chinese astronauts in Yantai, Shandong.

    ESA entered into a major joint venture with Russia in the form of the CSTS, the preparation of French Guiana spaceport for launches of Soyuz-2 rockets and other projects. With India, ESA agreed to send instruments into space aboard the ISRO’s Chandrayaan-1 in 2008. ESA is also co-operating with Japan, the most notable current project in collaboration with JAXA is the BepiColombo mission to Mercury.

    European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne][Europäische Weltraumorganisation](EU)/Japan Aerospace Exploration Agency [国立研究開発法人宇宙航空研究開発機構](JP) BepiColombo in flight illustration. Artist’s impression of BepiColombo, ESA’s first mission to Mercury. ESA’s Mercury Planetary Orbiter (MPO) will be operated from ESOC Germany.


    Speaking to reporters at an air show near Moscow in August 2011, ESA head Jean-Jacques Dordain said ESA and Russia’s Roskosmos space agency would “carry out the first flight to Mars together.”

  • richardmitnick 8:43 am on July 4, 2022 Permalink | Reply
    Tags: "The current state of Citizen Science", , , Citizen-or community-Science continues to grow and engage nonscientists in scientific research., It’s safe to say many projects would not function without community volunteers.   

    From “COSMOS (AU)” : “The current state of Citizen Science” 

    Cosmos Magazine bloc

    From “COSMOS (AU)”

    3 July 2022
    Qamariya Nasrullah

    The ups and downs of volunteer-based research.

    Credit: zorandimzr / Getty.

    Citizen (or community) science continues to grow and engage nonscientists in scientific research. Depending on who you ask, citizen science can be anything from planting trees, counting birds or analyzing data, all the way through to formulating and executing a science project in its entirety.

    While it’s impossible to put a dollar value on citizen science, it’s safe to say many projects would not function without community volunteers. For example, 17% of research publications on the monarch butterfly (Danaus plexippus) and 50% of studies on migratory birds and climate change have utilized citizen science efforts.

    CSIRO’s own Radio Galaxy Zoo involved over 12,000 volunteers who analysed radio sky images, and made over 2.29 million classifications – equivalent to 122 years of full-time work if done by a single astronomer.
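    As a quick sanity check on that equivalence (the hours-per-year figure below is an assumption, not a number quoted by CSIRO), the arithmetic implies a classification rate of roughly ten per hour:

```python
# Back-of-envelope check of the "122 years of full-time work" figure.
# hours_per_year is an assumed value, not one quoted by CSIRO.
classifications = 2_290_000
fulltime_years = 122
hours_per_year = 1_800          # assumed full-time working hours per year

total_hours = fulltime_years * hours_per_year    # 219,600 hours
per_hour = classifications / total_hours

print(f"{total_hours:,} hours, ~{per_hour:.0f} classifications per hour")
```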

    Radio Galaxy Zoo

    Radio Galaxy Zoo joined Zooniverse in December 2013, asking citizen scientists to analyse radio sky images from the Very Large Array in New Mexico and CSIRO’s Australia Telescope Compact Array, along with infrared images from NASA’s Spitzer and WISE space telescopes, which map the stars in galaxies.

    The main idea was to ask citizen scientists to match the radio plasma (radio images) with the galaxy (seen in infrared) from which they thought the plasma originated. The radio plasma typically comes from star formation or the growth of supermassive black holes within galaxies.

    Despite the use of volunteers for science research, little investigation has been done on the bigger picture. Is citizen science data usable? In which areas do we need to improve to make citizen science sustainable?

    Is Citizen Science accessible for everyone?

    New research published in BioScience has found that of the 3,894 participants surveyed, 77% were involved in multiple Citizen Science projects. Some volunteers were even “super-users”, taking part in as many as 50 projects. This suggests that Citizen Science is primarily being carried out by a small pool of already interested volunteers.

    Participants were also five times more likely to have an advanced degree than the general population, and six to seven times more likely to already be working in STEM fields. Less than 5% of the volunteers who answered questions about their cultural background identified as Black, Asian-American, Pacific Islander, Native American, or Latin American.

    “Participation in Citizen Science isn’t reaching as far into different segments of the public as we had hoped for in the field,” says study co-author Associate Professor Caren Cooper, from North Carolina State University. “We’re seeing that most volunteers are mostly highly educated white people, with a high percentage of STEM professionals. We’re not even reaching other types of professionals. This is part of the wake-up call that’s underway in the field right now.”

    “Through these projects, volunteers can learn about science, but also about their own communities,” says lead author Bradley Allf, also from NC State. “If those benefits are being concentrated in people who already have a lot of access to power in society, and to science generally, then Citizen Science is doing a disservice to the underserved.”

    A potential way to get more of the community engaged in Citizen Science may be through school-based programs. Dr Erinn Fagan-Jeffries of the South Australian Museum and the University of Adelaide has been leading a program called Insect Investigators, which has seen students and teachers across 50 regional schools in South Australia, Queensland and Western Australia set up and run insect traps. The insects are then collected and sent back to entomologists for identification and DNA barcoding. The students even get to help come up with scientific names if any new species are discovered.

    “The response has been really positive so far,” says Fagan-Jeffries. “We’ve had really good success rates in terms of completing the trapping and sending back the samples to us. We’ve also had quite a bit of engagement through a discussion board online with quite a lot of schools posting photos and comments.”

    The Flinders University Palaeontology Laboratory has always been a hub for a diverse range of science-trained and community volunteers. A few years ago, the FU Palaeo laboratory introduced the James Moore Memorial Prize, with the aim of providing rural high school students with the opportunity to participate in a paleontology excavation and assist with laboratory research. It’s already showing signs of increasing accessibility of science to more remote communities.

    “I’m already seeing names popping up in degree enrollments from previous James Moore Prize applicants from over the past few years,” says Professor Gavin Prideaux, one of the leaders at the FU Palaeo lab. The program has now expanded to fund one rural and one metro school student per year.

    Another way to make Citizen Science more accessible is through technology. An estimated 7.26 billion people (91.54% of the world’s population) have a smartphone.

    The iNaturalist app, a joint initiative of the California Academy of Sciences and National Geographic Society, has more than 2.5 million current users around the world, at all age and education levels. Last year over 29 million observations were made by citizen scientists, which resulted in over 39 million identifications confirmed by trained scientists. The peak month for observations in Australia is October, when the annual Great Southern BioBlitz survey happens.

    The recently released machine-learning-powered app BirdNET gives users free bird sound identification. The app includes more than 3,000 species; it supports 13 languages, with species names translated into an additional 12 languages. In the past three years of its trial phase, more than 2 million users from over 100 countries have generated over 40 million submissions. Some of the scientific results based on data collected by the BirdNET app have been published in PLOS Biology.

    “The most exciting part of this work is how simple it is for people to participate in bird research and conservation,” says lead author Dr Connor Wood, from Cornell University. “You don’t need to know anything about birds, you just need a smartphone, and the BirdNET app can then provide both you and the research team with a prediction for what bird you’ve heard. This has led to tremendous participation worldwide, which translates to an incredible wealth of data. It’s really a testament to an enthusiasm for birds that unites people from all walks of life.”

    Is citizen science data usable?

    The easy answer is yes. In general, citizen scientists do a great job. The main variables are the complexity of the task, and the level of training they receive.

    A new study utilized Citizen Science in a project at Chicago’s Field Museum, in the US. In order to better understand the impacts of climate change on the liverwort, a type of tiny plant, guests were asked to draw fine lines on photographs of microscopic lobes (a type of primitive leaf) to measure how lobe size has changed across different regions and through time. After two years, a total of 11,000 participants generated almost 100,000 measurements.

    “It was surprising how all age groups from young children, families, youth, and adults were able to generate high-quality taxonomic data sets, making observations and preparing measurements, and at the same time empowering community scientists through authentic contributions to science,” says Dr Matt von Konrat, Head of Botanical Collections at the museum, and co-author of research published in Research Ideas and Outcomes.

    However, not all citizen scientist projects are easily done. For the FU Palaeo lab, due to the delicate and complicated nature of fossils, volunteers need adequate training if they’re to produce usable data.

    “It’s a cost-benefit situation,” says Prideaux. “For researchers who are time poor, who would like to have volunteers, it’s a huge investment in both time and resources to properly train them.”

    One strategy to address this is to get university students involved as volunteers, for instance Bachelor of Science (Palaeontology) students, who are instructed in paleontological techniques from day one of their degree.

    Flinders University – Palaeontology Clean Labs.

    “In an ideal world, I would have sufficient funding to train and supervise a team of volunteers,” says Prideaux. “Although it’s unfortunate not to be able to include the broader community, at the moment it’s most feasible to prioritise an investment in these students.”

    “I think it comes down to the design of the project, and setting things up to make them easy to follow,” says Fagan-Jeffries. “Are you providing training to your participants or volunteers so that they have the skills to be able to do it well? And accurately?”

    Science for all citizens

    Perhaps the greatest benefit of Citizen Science is the opportunity it provides for everyone to have a level of scientific literacy that will help them throughout their life.

    “Not all people are going to become scientists, because that’s not a job for everyone,” says Fagan-Jeffries. “What we need is a society where people who are in all different careers have an appreciation for science and an understanding of why it’s important and how it’s done.

    “An understanding of science allows you to make more informed decisions when you’re looking into policies or voting for governments, or even just reading the back of a skincare product, just that general awareness of the scientific process and critical thinking.”

    Citizen Science: Everybody Counts | Caren Cooper | TEDxGreensboro.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 8:05 am on July 4, 2022 Permalink | Reply
    Tags: "Keeping the Energy in The Room", "MKIDs": Microwave Kinetic Inductance Detectors, A Cooper pair is able to move about without resistance., A thin layer of the metal indium-placed between the superconducting sensor and the substrate-drastically reduced the energy leaking out of the sensor., An MKID Exoplanet Camera can detect even faint signals., An MKID uses a superconductor in which electricity can flow with no resistance., , , CMOS sensors are semiconductors based on silicon., , In a superconductor all the electrons are paired up., In a superconductor two electrons will pair up-one spin up and one spin down-in a Cooper pair., , Right now scientists can only do spectroscopy for a tiny subset of exoplanets-those where the planet passes between its star and Earth., Scientists can use spectroscopy to identify the composition of objects both nearby and across the entire visible universe., Sensor Technology, , The gap energy in a superconductor is about 10000 times less than in semiconductors based on silicon., The indium essentially acted like a fence., The photo-electric effect CMOS sensor: a photon strikes the sensor knocking off an electron that can then be detected as a signal suitable for processing by a microprocessor., The scientists chose indium because it is also a superconductor at the temperatures at which the MKID will operate and adjacent superconductors tend to cooperate if they are thin., The technique cut down the wavelength measurement uncertainty from 10% to 5%., The University of California-Santa Barbara, This will all soon be possible with the capabilities of the next generation of 30-meter telescopes., With better MKIDs scientists can use light reflected off the surface of a planet rather than transmitted through its narrow atmosphere alone.   

    From The University of California-Santa Barbara: “Keeping the Energy in The Room” 

    UC Santa Barbara Name bloc

    From The University of California-Santa Barbara

    July 1, 2022

    Harrison Tasoff
    (805) 893-7220

    Professor Ben Mazin talks superconductors, exoplanets and dance clubs as he explains advances in sensor technology.

    The sensor mounted for use in an MKID Exoplanet Camera. Photo Credit: Ben Mazin.

    It may seem like technology advances year after year, as if by magic. But behind every incremental improvement and breakthrough revolution is a team of scientists and engineers hard at work.

    UC Santa Barbara Professor Ben Mazin is developing precision optical sensors for telescopes and observatories. In a paper published in Physical Review Letters, he and his team improved the spectral resolution of their superconducting sensor, a major step toward their ultimate goal: analyzing the composition of exoplanets.

    “We were able to roughly double the spectral resolving power of our detectors,” said first author Nicholas Zobrist, a doctoral student in the Mazin Lab.

    “This is the largest energy resolution increase we’ve ever seen,” added Mazin. “It opens up a whole new pathway to science goals that we couldn’t achieve before.”

    The Mazin lab works with a type of sensor called an MKID. Most light detectors — like the CMOS sensor in a phone camera — are semiconductors based on silicon. These operate via the photo-electric effect: a photon strikes the sensor knocking off an electron that can then be detected as a signal suitable for processing by a microprocessor.

    An MKID uses a superconductor in which electricity can flow with no resistance. In addition to zero resistance, these materials have other useful properties. For instance, semiconductors have a gap energy that needs to be overcome to knock the electron out. The related gap energy in a superconductor is about 10,000 times less, so it can detect even faint signals.

    What’s more, a single photon can knock many electrons off of a superconductor, as opposed to only one in a semiconductor. By measuring the number of mobile electrons, an MKID can actually determine the energy (or wavelength) of the incoming light. “And the energy of the photon, or its spectra, tells us a lot about the physics of what emitted that photon,” Mazin said.
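    As a rough illustration of why that matters (the gap values below are textbook-scale assumptions for the sake of the arithmetic, not figures from the paper), compare the energy budget of a single 1,000 nm photon in the two kinds of detector:

```python
# Why a superconductor's tiny gap lets an MKID measure photon energy.
# Gap values are illustrative assumptions, not figures from the paper.
h_c = 1239.84                   # eV*nm; photon energy E = h*c / wavelength
wavelength_nm = 1000.0
photon_energy_eV = h_c / wavelength_nm           # ~1.24 eV

gap_silicon_eV = 1.12                            # typical silicon bandgap
gap_superconductor_eV = gap_silicon_eV / 10_000  # "about 10,000 times less"

# In silicon, one photon frees roughly one electron, so the photon's
# energy is lost:
electrons_semiconductor = int(photon_energy_eV / gap_silicon_eV)

# In a superconductor, the same photon can create thousands of
# excitations (ignoring losses to phonons), so counting them encodes
# the photon's energy:
excitations_superconductor = photon_energy_eV / gap_superconductor_eV

print(electrons_semiconductor)                  # 1
print(f"~{excitations_superconductor:,.0f}")    # ~11,070
```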

    Leaking energy

    The researchers had hit a limit as to how sensitive they could make these MKIDs. After much scrutiny, they discovered that energy was leaking from the superconductor into the sapphire crystal wafer that the device is made on. As a result, the signal appeared weaker than it truly was.

    In typical electronics, current is carried by mobile electrons. But these have a tendency to interact with their surroundings, scattering and losing energy in what’s known as resistance. In a superconductor two electrons will pair up — one spin up and one spin down — and this Cooper pair, as it’s called, is able to move about without resistance.

    “It’s like a couple at a club,” Mazin explained. “You’ve got two people who pair up, and then they can move together through the crowd without any resistance. Whereas a single person stops to talk to everybody along the way, slowing them down.”

    In a superconductor, all the electrons are paired up. “They’re all dancing together, moving around without interacting with other couples very much because they’re all gazing deeply into each other’s eyes.

    “A photon hitting the sensor is like someone coming in and spilling a drink on one of the partners,” he continued. “This breaks the couple up, causing one partner to stumble into other couples and create a disturbance.” This is the cascade of mobile electrons that the MKID measures.

    But sometimes this happens at the edge of the dancefloor. The offended party stumbles out of the club without knocking into anyone else. Great for the rest of the dancers, but not for the scientists. If this happens in the MKID, then the light signal will seem weaker than it actually was.

    Fencing them in

    Mazin, Zobrist and their co-authors discovered that a thin layer of the metal indium — placed between the superconducting sensor and the substrate — drastically reduced the energy leaking out of the sensor. The indium essentially acted like a fence around the dancefloor, keeping the jostled dancers in the room and interacting with the rest of the crowd.

    They chose indium because it is also a superconductor at the temperatures at which the MKID will operate, and adjacent superconductors tend to cooperate if they are thin. The metal did present a challenge to the team, though. Indium is softer than lead, so it has a tendency to clump up. That’s not great for making the thin, uniform layer the researchers needed.

    But their time and effort paid off. The technique cut down the wavelength measurement uncertainty from 10% to 5%, the study reports. For example, photons with a wavelength of 1,000 nanometers can now be measured to a precision of 50 nm with this system. “This has real implications for the science we can do,” Mazin said, “because we can better resolve the spectra of the objects that we’re looking at.”
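    Those percentages map directly onto spectral resolving power, R = λ/Δλ. A minimal sketch using the article’s numbers:

```python
# Resolving power implied by the quoted wavelength uncertainties.
wavelength_nm = 1000.0

for uncertainty in (0.10, 0.05):   # before and after the indium layer
    delta_nm = wavelength_nm * uncertainty
    resolving_power = wavelength_nm / delta_nm
    print(f"{uncertainty:.0%} -> +/-{delta_nm:.0f} nm, R = {resolving_power:.0f}")
```

Halving the uncertainty doubles R, from 10 to 20, which is the 50 nm precision at 1,000 nm that the study reports.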

    Different phenomena emit photons with specific spectra (or wavelengths), and different molecules absorb photons of different wavelengths. Using this light, scientists can use spectroscopy to identify the composition of objects both nearby and across the entire visible universe.

    Mazin is particularly interested in applying these detectors to exoplanet science. Right now scientists can only do spectroscopy for a tiny subset of exoplanets. The planet needs to pass between its star and Earth, and it must have a thick atmosphere so that enough light passes through it for researchers to work with. Still, the signal to noise ratio is abysmal, especially for rocky planets, Mazin said.

    With better MKIDs scientists can use light reflected off the surface of a planet rather than transmitted through its narrow atmosphere alone. This will soon be possible with the capabilities of the next generation of 30-meter telescopes.

    The Mazin group is also experimenting with a completely different approach to the energy-loss issue. Although the results from this paper are impressive, Mazin said he believes the indium technique could be obsolete if his team is successful with this new endeavor. Either way, he added, the scientists are rapidly closing in on their goals.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    UC Santa Barbara Seal

    The University of California-Santa Barbara is a public land-grant research university in Santa Barbara, California, and one of the ten campuses of the University of California system. Tracing its roots back to 1891 as an independent teachers’ college, The University of California-Santa Barbara joined the University of California system in 1944, and is the third-oldest undergraduate campus in the system.

    The university is a comprehensive doctoral university and is organized into five colleges and schools offering 87 undergraduate degrees and 55 graduate degrees. It is classified among “R1: Doctoral Universities – Very high research activity”. According to the National Science Foundation, The University of California-Santa Barbara spent $235 million on research and development in fiscal year 2018, ranking it 100th in the nation. In his 2001 book The Public Ivies: America’s Flagship Public Universities, author Howard Greene labeled The University of California-Santa Barbara a “Public Ivy”.

    The University of California-Santa Barbara is a research university with 10 national research centers, including the Kavli Institute for Theoretical Physics and the Center for Control, Dynamical-Systems and Computation. Current University of California-Santa Barbara faculty includes six Nobel Prize laureates; one Fields Medalist; 39 members of the National Academy of Sciences; 27 members of the National Academy of Engineering; and 34 members of the American Academy of Arts and Sciences. The University of California-Santa Barbara was the No. 3 host on the ARPANET and was elected to the Association of American Universities in 1995. The faculty also includes two Academy and Emmy Award winners and recipients of a Millennium Technology Prize; an IEEE Medal of Honor; a National Medal of Technology and Innovation; and a Breakthrough Prize in Fundamental Physics.
    The University of California-Santa Barbara Gauchos compete in the Big West Conference of the NCAA Division I. The Gauchos have won NCAA national championships in men’s soccer and men’s water polo.


    The University of California-Santa Barbara traces its origins back to the Anna Blake School, which was founded in 1891, and offered training in home economics and industrial arts. The Anna Blake School was taken over by the state in 1909 and became the Santa Barbara State Normal School which then became the Santa Barbara State College in 1921.

    In 1944, intense lobbying by an interest group in the City of Santa Barbara led by Thomas Storke and Pearl Chase persuaded the State Legislature, Gov. Earl Warren, and the Regents of the University of California to move the State College over to the more research-oriented University of California system. The State College system sued to stop the takeover but the governor did not support the suit. A state constitutional amendment was passed in 1946 to stop subsequent conversions of State Colleges to University of California campuses.

    From 1944 to 1958, the school was known as Santa Barbara College of the University of California, before taking on its current name. When the vacated Marine Corps training station in Goleta was purchased for the rapidly growing college, Santa Barbara City College moved into the vacated State College buildings.

    Originally the regents envisioned a small, several-thousand-student liberal arts college, a so-called “Williams College of the West”, at Santa Barbara. Chronologically, The University of California-Santa Barbara is the third general-education campus of the University of California, after The University of California-Berkeley and The University of California-Los Angeles (the only other state campus to have been acquired by the University of California system). The original campus the regents acquired in Santa Barbara was located on only 100 acres (40 ha) of largely unusable land on a seaside mesa. The availability of a 400-acre (160 ha) portion of the land used as Marine Corps Air Station Santa Barbara until 1946 on another seaside mesa in Goleta, which the regents could acquire for free from the federal government, led to that site becoming the Santa Barbara campus in 1949.

    Originally only 3,000–3,500 students were anticipated, but the post-WWII baby boom led to the designation of general campus in 1958, along with a name change from “Santa Barbara College” to “University of California-Santa Barbara,” and the discontinuation of the industrial arts program for which the state college was famous. A chancellor, Samuel B. Gould, was appointed in 1959.

    In 1959 The University of California-Santa Barbara professor Douwe Stuurman hosted the English writer Aldous Huxley as the university’s first visiting professor. Huxley delivered a lecture series called The Human Situation.

    In the late ’60s and early ’70s The University of California-Santa Barbara became nationally known as a hotbed of anti–Vietnam War activity. A bombing at the school’s faculty club in 1969 killed the caretaker, Dover Sharp. In the spring of 1970 multiple occasions of arson occurred, including the burning of the Bank of America branch building in the student community of Isla Vista, during which time one student, Kevin Moran, was shot and killed by police. The University of California-Santa Barbara’s anti-Vietnam activity impelled then-Gov. Ronald Reagan to impose a curfew and order the National Guard to enforce it. Armed guardsmen were a common sight on campus and in Isla Vista during this time.

    In 1995 The University of California-Santa Barbara was elected to the Association of American Universities– an organization of leading research universities with a membership consisting of 59 universities in the United States (both public and private) and two universities in Canada.

    On May 23, 2014 a killing spree occurred in Isla Vista, California, a community in close proximity to the campus. All six people killed during the rampage were students at The University of California-Santa Barbara. The murderer was a former Santa Barbara City College student who lived in Isla Vista.

    Research activity

    According to the National Science Foundation, The University of California-Santa Barbara spent $236.5 million on research and development in fiscal 2013, ranking it 87th in the nation.

    From 2005 to 2009 UCSB was ranked fourth in terms of relative citation impact in the U.S. (behind Massachusetts Institute of Technology, California Institute of Technology, and Princeton University) according to Thomson Reuters.

    The University of California-Santa Barbara hosts 12 National Research Centers, including The Kavli Institute for Theoretical Physics, the National Center for Ecological Analysis and Synthesis, the Southern California Earthquake Center, the UCSB Center for Spatial Studies, an affiliate of the National Center for Geographic Information and Analysis, and the California Nanosystems Institute. Eight of these centers are supported by The National Science Foundation. UCSB is also home to Microsoft Station Q, a research group working on topological quantum computing where American mathematician and Fields Medalist Michael Freedman is the director.

    Research impact rankings

    The Times Higher Education World University Rankings ranked The University of California-Santa Barbara 48th worldwide for 2016–17, while the Academic Ranking of World Universities (ARWU) in 2016 ranked it 42nd in the world and 28th in the nation; in 2015 it tied for 17th worldwide in engineering.

    In the United States National Research Council rankings of graduate programs, 10 University of California-Santa Barbara departments were ranked in the top ten in the country: Materials; Chemical Engineering; Computer Science; Electrical and Computer Engineering; Mechanical Engineering; Physics; Marine Science Institute; Geography; History; and Theater and Dance. Among U.S. university Materials Science and Engineering programs, The University of California-Santa Barbara was ranked first in each measure of a study by the National Research Council of the NAS.

    The Centre for Science and Technologies Studies at

  • richardmitnick 12:28 pm on July 3, 2022 Permalink | Reply
    Tags: "When autism spectrum disorder occurs with intellectual disability a convergent mechanism for two top-ranking risk genes may be the cause", A significant proportion — approximately 31% — of people with ASD also exhibit ID., , , , , , Microglia are very sensitive to pathological changes in the central nervous system and are the main form of active immune defense to maintain brain health., Preclinical study reveals that immune cells in the brain could be possible new drug targets for ASD and intellectual disability., The paper focuses on ADNP and POGZ-the two top-ranked risk factor genes for ASD/ID., The researchers are hopeful that future research will determine whether chronic neuroinflammation in which targeting microglia or inflammatory signaling pathways could prove to be a useful treatment., The University at Buffalo-SUNY, two top-ranked genetic risk factors for autism spectrum disorder/intellectual disability (ASD/ID) lead to these neurodevelopmental disorders.   

    From The University at Buffalo-SUNY: “When autism spectrum disorder occurs with intellectual disability a convergent mechanism for two top-ranking risk genes may be the cause” 

    SUNY Buffalo

    From The University at Buffalo-SUNY

    June 30, 2022
    Ellen Goldbaum

    “When designing clinical trials to evaluate treatment effectiveness, I think our research underscores the importance of considering the genetic factors involved in an individual’s ASD/ID,” said Conrow-Graham. The paper published in Brain is the culmination of her PhD work in the Jacobs School of Medicine and Biomedical Sciences. (Photo: Sandra Kicman)

    Preclinical study reveals that immune cells in the brain could be possible new drug targets for ASD and intellectual disability.

    University at Buffalo scientists have discovered a convergent mechanism that may be responsible for how two top-ranked genetic risk factors for autism spectrum disorder/intellectual disability (ASD/ID) lead to these neurodevelopmental disorders.

    While ASD is distinct from ID, a significant proportion — approximately 31% — of people with ASD also exhibit ID. Neither condition is well-understood at the molecular level.

    “Given the vast number of genes known to be involved in ASD/ID and the many potential mechanisms contributing to the disorders, it is exciting to find a shared process between two different genes at the molecular level that could be underlying the behavioral changes,” said Megan Conrow-Graham, first author and an MD/PhD candidate in the Jacobs School of Medicine and Biomedical Sciences at UB.

    Published today in the journal Brain, the paper focuses on ADNP and POGZ, the two top-ranked risk factor genes for ASD/ID. The research demonstrates that mutations in these genes result in abnormal activation and overexpression of immune response genes and genes for a type of immune cell in the brain called microglia.

    “Our finding opens the possibility of targeting microglia and immune genes for treating ASD/ID, but much remains to be studied, given the heterogeneity and complexity of these brain disorders,” said Zhen Yan, PhD, senior author and SUNY Distinguished Professor in the Department of Physiology and Biophysics in the Jacobs School.

    The UB scientists found that mutations in the two genes studied activate microglia and cause immune genes in the brain to be overexpressed. The researchers hypothesize that this leads to abnormal synaptic function in the brain, a characteristic of ASD/ID.

    The research involved studies on postmortem brain tissue from humans with ASD/ID, as well as studies on mice in which ADNP and POGZ were silenced through viral delivery of small interfering RNA. These mice showed impaired performance on cognitive tasks testing spatial memory, object recognition and long-term memory.

    Weakening a repressive function

    “Under normal conditions, cells in the central nervous system should not express large quantities of genes that activate the immune system,” said Conrow-Graham. “ADNP and POGZ both work to repress these genes so that inflammatory pathways are not continuously activated, which could damage surrounding cells. When that repression is weakened, these immune and inflammatory genes are then able to be expressed in large quantities.”

    In the mouse prefrontal cortex, the genes upregulated by ADNP or POGZ deficiency activated the pro-inflammatory response.

    “This is consistent with what we see in upregulated genes in the prefrontal cortex of humans with ASD/ID,” said Conrow-Graham. The prefrontal cortex is the part of the brain responsible for executive function, such as cognition and emotional control.

    Mutations in the genes also activate microglia, glial cells that serve as support cells for neurons and perform an immune function in the brain; microglia comprise 10–15% of all brain cells.

    Sensitive microglia

    “Microglia are very sensitive to pathological changes in the central nervous system and are the main form of active immune defense to maintain brain health,” explained Yan. “Aberrant activation of microglia, which we demonstrate occurs as a result of deficiency in ADNP or POGZ, could lead to the damage and loss of synapses and neurons.”

    The researchers are hopeful that future research will determine whether chronic neuroinflammation could be directly contributing to at least some cases of ASD/ID, in which case targeting microglia or inflammatory signaling pathways could prove to be a useful treatment.

    The researchers pointed out that the clinical presentation of both ASD and ID is incredibly varied. Significant variation is also likely present in the kinds of mechanisms responsible for the symptoms of ASD and/or ID.

    “We found that changes in two risk genes lead to a convergent mechanism, likely involving immune activation,” said Conrow-Graham. “However, this probably isn’t the case for all individuals with ASD/ID. When designing clinical trials to evaluate treatment effectiveness, I think our research underscores the importance of considering the genetic factors involved in an individual’s ASD/ID.”

    The research is the culmination of Conrow-Graham’s PhD work; she has now returned to complete the last two years of the MD degree in the Jacobs School. She described her experience pursuing both an MD and a PhD as extremely complementary.

    The immune system has a role

    “My training at each level was super helpful to supplement the other,” she said. “When I began my PhD, I had completed two years of MD training, so I was familiar with the basics of physiology, anatomy and pathology. Because of this, I was able to bring a broader perspective to my neuroscience research, identifying how the immune system might be playing a role. Prior to this, our lab had not really investigated immunology-related pathways, so having that background insight was really beneficial.”

    She added that she learned so much from all of her colleagues in Yan’s lab, including faculty members, lab technicians and other students. “I learned so many technical skills that I had never used before joining the lab, thanks to the dedication of lab co-workers for my training,” she said.

    Her experience at the lab bench working on the basic science underlying neuropsychiatric disorders will definitely influence her work as a clinician.

    “I plan to pursue a career as a child and adolescent psychiatrist, so I may be able to work directly with this patient population,” she said. “We’re learning now that better care may be able to be provided by taking a personalized medicine approach, taking into account genetics, psychosocial factors and others. Being able to take a very deep dive into the field of psychiatric genetics was a privilege that I hope will help me to provide the best care for patients.”

    The research was funded by the Nancy Lurie Marks Family Foundation and by a National Institutes of Health Ruth L. Kirschstein Individual Predoctoral NRSA for MD/PhD F30 fellowship for Conrow-Graham.

    In addition to Conrow-Graham and Yan, co-authors are Jamal B. Williams, PhD, former graduate student; Jennifer Martin, PhD, former postdoctoral fellow; Ping Zhong, PhD, senior research scientist; Qing Cao, PhD, postdoctoral fellow; and Benjamin Rein, PhD, former graduate student.

    All are current or former members of Yan’s lab.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    SUNY Buffalo Campus

    The University at Buffalo-SUNY is a public research university with campuses in Buffalo and Amherst, New York, United States. The university was founded in 1846 as a private medical college and merged with the State University of New York system in 1962. It is one of four university centers in the system, in addition to The University at Albany-SUNY, The University at Binghamton-SUNY, and The University at Stony Brook-SUNY. As of fall 2020, the university enrolls 32,347 students in 13 colleges, making it the largest public university in the state of New York.

    Since its founding by a group which included future United States President Millard Fillmore, the university has evolved from a small medical school to a large research university. Today, in addition to the College of Arts and Sciences, the university houses the largest state-operated medical school, dental school, education school, business school, engineering school, and pharmacy school, and is also home to SUNY’s only law school. The University at Buffalo has the largest enrollment, largest endowment, and most research funding among the universities in the SUNY system. The university offers bachelor’s degrees in over 100 areas of study, as well as 205 master’s degrees, 84 doctoral degrees, and 10 professional degrees. The University at Buffalo and The University of Virginia are the only colleges founded by United States Presidents.

    The University at Buffalo is classified as an R1 University, meaning that it engages in a very high level of research activity. In 1989, UB was elected to The Association of American Universities, a selective group of major research universities in North America. University at Buffalo’s alumni and faculty have included five Nobel laureates, five Pulitzer Prize winners, one head of government, two astronauts, three billionaires, one Academy Award winner, one Emmy Award winner, and Fulbright Scholars.

    The University at Buffalo intercollegiate athletic teams are the Bulls. They compete in Division I of the NCAA, and are members of the Mid-American Conference.

    The University at Buffalo is organized into 13 academic schools and colleges.

    The School of Architecture and Planning is the only combined architecture and urban planning school in the State University of New York system, offers the only accredited professional master’s degree in architecture, and is one of two SUNY schools that offer an accredited professional master’s degree in urban planning. In addition, the Buffalo School of Architecture and Planning also awards the original undergraduate four year pre-professional degrees in architecture and environmental design in the SUNY system. Other degree programs offered by the Buffalo School of Architecture and Planning include a research-oriented Master of Science in architecture with specializations in historic preservation/urban design, inclusive design, and computing and media technologies; a PhD in urban and regional planning; and, an advanced graduate certificate in historic preservation.

    The College of Arts and Sciences was founded in 1915 and is the largest and most comprehensive academic unit at University at Buffalo with 29 degree-granting departments, 16 academic programs, and 23 centers and institutes across the humanities, arts, and sciences.

    The School of Dental Medicine was founded in 1892 and offers accredited programs in DDS, oral surgery, and other oral sciences.

    The Graduate School of Education was founded in 1931 and is one of the largest graduate schools at University at Buffalo. The school has four academic departments: counseling and educational psychology, educational leadership and policy, learning and instruction, and library and information science. In academic year 2008–2009, the Graduate School of Education awarded 472 master’s degrees and 52 doctoral degrees.

    The School of Engineering and Applied Sciences was founded in 1946 and offers undergraduate and graduate degrees in six departments. It is the largest public school of engineering in the state of New York. University at Buffalo is the only public school in New York State to offer a degree in Aerospace Engineering.

    The School of Law was founded in 1887 and is the only law school in the SUNY system. The school awarded 265 JD degrees in the 2009–2010 academic year.

    The School of Management was founded in 1923 and offers AACSB-accredited undergraduate, MBA, and doctoral degrees.

    The School of Medicine and Biomedical Sciences is the founding faculty of the University at Buffalo and began in 1846. It offers undergraduate and graduate degrees in the biomedical and biotechnical sciences as well as an MD program and residencies.

    The School of Nursing was founded in 1936 and offers bachelors, masters, and doctoral degrees in nursing practice and patient care.

    The School of Pharmacy and Pharmaceutical Sciences was founded in 1886, making it the second-oldest faculty at University at Buffalo and one of only two pharmacy schools in the SUNY system.

    The School of Public Health and Health Professions was founded in 2003 from the merger of the Department of Social and Preventive Medicine and the University at Buffalo School of Health Related Professions. The school offers a bachelor’s degree in exercise science as well as professional, master’s and PhD degrees.

    The School of Social Work offers graduate MSW and doctoral degrees in social work.

    The Roswell Park Graduate Division is an affiliated academic unit within the Graduate School of UB, in partnership with Roswell Park Comprehensive Cancer Center, an independent NCI-designated Comprehensive Cancer Center. The Roswell Park Graduate Division offers five PhD programs and two MS programs in basic and translational biomedical research related to cancer. Roswell Park Comprehensive Cancer Center was founded in 1898 by Dr. Roswell Park and was the world’s first cancer research institute.

    The University at Buffalo houses two New York State Centers of Excellence (out of the total 11): Center of Excellence in Bioinformatics and Life Sciences (CBLS) and Center of Excellence in Materials Informatics (CMI). Emphasis has been placed on developing a community of research scientists centered around an economic initiative to promote Buffalo and create the Center of Excellence for Bioinformatics and Life Sciences as well as other advanced biomedical and engineering disciplines.

    Total research expenditures for the fiscal year of 2017 were $401 million, ranking 59th nationally.

    SUNY’s administrative offices are in Albany, the state’s capital, with satellite offices in Manhattan and Washington, D.C.

    With 25,000 acres of land, SUNY’s largest campus is The SUNY College of Environmental Science and Forestry, which neighbors the State University of New York Upstate Medical University – the largest employer in the SUNY system with over 10,959 employees. While the SUNY system doesn’t officially recognize a flagship university, the University at Buffalo and Stony Brook University are sometimes treated as unofficial flagships.

    The State University of New York was established in 1948 by Governor Thomas E. Dewey, through legislative implementation of recommendations made by the Temporary Commission on the Need for a State University (1946–1948). The commission was chaired by Owen D. Young, who was at the time Chairman of General Electric. The system was greatly expanded during the administration of Governor Nelson A. Rockefeller, who took a personal interest in design and construction of new SUNY facilities across the state.

    Apart from units of the unrelated City University of New York (CUNY), SUNY comprises all state-supported institutions of higher education.

  • richardmitnick 11:58 am on July 3, 2022 Permalink | Reply
    Tags: "US and Czech Scientists Collaborate To Explore Gamma-Ray Production With High Power Lasers", , , , , , , , , The L3-HAPLS laser system installed at the ELI Beamlines Research Center in Dolní Břežany Czech Republic.,   

    From The University of California-San Diego: “US and Czech Scientists Collaborate To Explore Gamma-Ray Production With High Power Lasers” 

    From The University of California-San Diego

    July 01, 2022
    Daniel Kane

    The U.S. National Science Foundation (NSF) and the Czech Science Foundation (GACR) are funding a new collaborative project of scientists from the University of California San Diego in the U.S. and ELI Beamlines (Institute of Physics of the Czech Academy of Sciences) in the Czech Republic which aims to leverage the capabilities of the ELI Beamlines multi-petawatt laser facility.

    Researchers hope these experiments can achieve a breakthrough by demonstrating efficient generation of dense gamma-ray beams.

    Stellar objects like pulsars can create matter and antimatter directly from light because of their extreme energies. In fact, the magnetic field, or “magnetosphere,” of a pulsar is filled with electrons and positrons that are created by colliding photons.

    Reproducing the same phenomena in a laboratory on Earth is extremely challenging. It requires a dense cloud of photons with energies that are millions of times higher than visible light, an achievement that has so far eluded the scientists working in this field. However, theories suggest that high-power lasers ought to be able to produce such a photon cloud.
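    The energy scale involved can be made concrete with the two-photon (Breit–Wheeler) pair-production threshold, a standard textbook result; the numbers below are illustrative estimates, not figures from the project itself:

    \[
      s \;=\; 2\,E_1 E_2\,(1-\cos\theta) \;\ge\; \left(2 m_e c^2\right)^2
    \]
    For a head-on collision ($\theta = \pi$) of two equal-energy photons this reduces to
    \[
      E_\gamma \;\ge\; m_e c^2 \;\approx\; 0.511\ \mathrm{MeV},
    \]
    roughly $10^5$–$10^6$ times the $\sim 2\ \mathrm{eV}$ energy of a visible-light photon, which is why a dense cloud of gamma-ray photons is needed.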

    As the first international laser research infrastructure dedicated to the application of high-power and high-intensity lasers, the Extreme Light Infrastructure (ELI ERIC) facilities will enable such research possibilities. The ELI ERIC is a multi-site research infrastructure based on the specialized and complementary facilities ELI Beamlines (Czech Republic) and ELI ALPS (Hungary). The new capabilities at ELI will create the necessary conditions to test the theories in a laboratory.

    Supercomputer simulation of energetic gamma-ray emission (yellow arrows) by a dense plasma (green) irradiated by a high-intensity laser beam (red and blue). The laser propagates from left to right, with the emitted photons flying in the same direction. The smooth blue and red regions represent a strong magnetic field generated by the plasma, whereas the oscillating region corresponds to the laser’s magnetic field.

    This project combines theoretical expertise from the University of California San Diego (U.S.), experimental expertise from ELI Beamlines, as well as target fabrication and engineering expertise from General Atomics (U.S.). The roughly $1,000,000 project, jointly funded by NSF and GACR, will be led by Prof. Alexey Arefiev at UC San Diego. Target development for rep-rated deployment will take place at General Atomics, led by Dr. Mario Manuel, while the primary experiments will be conducted at ELI Beamlines by a team led by Dr. Florian Condamine and Dr. Stefan Weber.

    The concept for the project was developed by Arefiev’s research group at UC San Diego, which specializes in supercomputer simulations of intense light-matter interactions. The approach for this project leverages an effect that occurs when electrons in a plasma are accelerated to near light speeds by a high-powered laser. This effect is called “relativistic transparency” because it causes a previously opaque dense plasma to become transparent to laser light.
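    The standard estimate behind relativistic transparency can be sketched as follows (a generic back-of-the-envelope condition, not a calculation specific to the ELI experiments):

    \[
      n_c \;=\; \frac{\varepsilon_0\, m_e\, \omega^2}{e^2}
      \qquad \text{(classical critical density for laser frequency } \omega\text{)}
    \]
    In an intense laser field the electrons’ effective mass grows to $\gamma m_e$, with $\gamma \simeq \sqrt{1 + a_0^2/2}$ for linear polarization, where $a_0$ is the normalized laser amplitude. The effective critical density therefore rises to
    \[
      n_c^{\mathrm{rel}} \;=\; \gamma\, n_c .
    \]
    A plasma with electron density $n_c < n_e < \gamma\, n_c$ is opaque to low-intensity light but turns transparent at relativistic intensities, which is the regime the project exploits.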

    In this regime, extremely strong magnetic fields are generated as the laser propagates through the plasma. During this process, the relativistic electrons oscillate in the magnetic field, which in turn causes the emission of gamma-rays, predominantly in the direction of the laser.

    “It is very exciting that we are in a position to generate the sort of magnetic fields that previously only existed in extreme astrophysical objects, such as neutron stars,” says Arefiev. “The ability of the ELI Beamlines lasers to reach very high on-target intensity is the key to achieving this regime.”

    These experiments will provide the first statistically relevant study of gamma-ray generation using high-powered lasers. Researchers hope the work will open the way to secondary high-energy photon sources that can be used not only for fundamental physics studies, but also for a range of important industrial applications such as materials science, nuclear waste imaging, nuclear fuel assay, security, and high-resolution deep-penetration radiography. Such “extreme imaging” requires robust, reproducible, and well-controlled gamma-ray sources, and the project aims to develop precisely such sources.

    The experiments will be greatly assisted by another technological advance. Until recently, high-power laser facilities could execute about one shot every hour, which limited the amount of data that could be collected. However, new facilities like ELI Beamlines are capable of multiple shots per second. These capabilities allow for statistical studies of laser-target interactions in ways that were impossible only a few years ago. That means a shift in the way such experiments are designed and executed is necessary to take full advantage of the possibilities.

    “The P3 installation at ELI Beamlines is a unique and versatile experimental infrastructure for sophisticated high-field experiments and perfectly adapted to the planned program,” comments Condamine. Weber notes, “This collaboration between San Diego and ELI Beamlines is expected to be a major step forward to bring together the US community and the ELI-team for joint experiments.”

    Thus, a major part of this project is training the next generation of scientists at ELI Beamlines to develop techniques that can fully leverage its rep-rated capabilities. UC San Diego students and postdoctoral researchers will also train on rep-rated target deployment and data acquisition on General Atomics’ new GALADRIEL laser facility to help improve the efficiency of the experiments conducted at ELI Beamlines.

    The P3 (Plasma Physics Platform)-installation at ELI Beamlines where the experiments will take place.

    “This is the first project jointly funded by the Czech Science Foundation and the US National Science Foundation. I believe that the new collaboration between the agencies will lead to a number of successful projects and collaborating scientific teams from the Czech Republic and the USA will benefit from it,” says GACR president Dr. Petr Baldrian.

    “We are thrilled to be working with our counterparts in the Czech Republic to further expand international scientific cooperation in artificial intelligence, nanotechnology, and plasma science research. I am optimistic this will be the first of many collaborative projects between NSF and GACR,” says the Director of NSF, Dr. Sethuraman Panchanathan.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of California-San Diego is a public research university located in the La Jolla area of San Diego, California, in the United States. The university occupies 2,141 acres (866 ha) near the coast of the Pacific Ocean with the main campus resting on approximately 1,152 acres (466 ha). Established in 1960 near the pre-existing Scripps Institution of Oceanography, University of California-San Diego is the seventh oldest of the 10 University of California campuses and offers over 200 undergraduate and graduate degree programs, enrolling about 22,700 undergraduate and 6,300 graduate students. The University of California-San Diego is one of America’s “Public Ivy” universities, which recognizes top public research universities in the United States. The University of California-San Diego was ranked 8th among public universities and 37th among all universities in the United States, and rated the 18th Top World University by U.S. News & World Report’s 2015 rankings.

    The University of California-San Diego is organized into seven undergraduate residential colleges (Revelle; John Muir; Thurgood Marshall; Earl Warren; Eleanor Roosevelt; Sixth; and Seventh), four academic divisions (Arts and Humanities; Biological Sciences; Physical Sciences; and Social Sciences), and seven graduate and professional schools (Jacobs School of Engineering; Rady School of Management; Scripps Institution of Oceanography; School of Global Policy and Strategy; School of Medicine; Skaggs School of Pharmacy and Pharmaceutical Sciences; and the newly established Wertheim School of Public Health and Human Longevity Science). University of California-San Diego Health, the region’s only academic health system, provides patient care; conducts medical research; and educates future health care professionals at the University of California-San Diego Medical Center, Hillcrest; Jacobs Medical Center; Moores Cancer Center; Sulpizio Cardiovascular Center; Shiley Eye Institute; Institute for Genomic Medicine; Koman Family Outpatient Pavilion and various express care and urgent care clinics throughout San Diego.

    The university operates 19 organized research units (ORUs), including the Center for Energy Research; Qualcomm Institute (a branch of the California Institute for Telecommunications and Information Technology); San Diego Supercomputer Center; and the Kavli Institute for Brain and Mind, as well as eight School of Medicine research units, six research centers at Scripps Institution of Oceanography and two multi-campus initiatives, including the Institute on Global Conflict and Cooperation. The University of California-San Diego is also closely affiliated with several regional research centers, such as the Salk Institute; the Sanford Burnham Prebys Medical Discovery Institute; the Sanford Consortium for Regenerative Medicine; and the Scripps Research Institute. It is classified among “R1: Doctoral Universities – Very high research activity”. According to the National Science Foundation, UC San Diego spent $1.265 billion on research and development in fiscal year 2018, ranking it 7th in the nation.

    The University of California-San Diego is considered one of the country’s “Public Ivies”. As of February 2021, The University of California-San Diego faculty, researchers and alumni have won 27 Nobel Prizes and three Fields Medals, eight National Medals of Science, eight MacArthur Fellowships, and three Pulitzer Prizes. Additionally, of the current faculty, 29 have been elected to the National Academy of Engineering, 70 to the National Academy of Sciences, 45 to the National Academy of Medicine and 110 to the American Academy of Arts and Sciences.


    When the Regents of the University of California originally authorized the San Diego campus in 1956, it was planned to be a graduate and research institution, providing instruction in the sciences, mathematics, and engineering. Local citizens supported the idea, voting the same year to transfer to the university 59 acres (24 ha) of mesa land on the coast near the preexisting Scripps Institution of Oceanography. The Regents requested an additional gift of 550 acres (220 ha) of undeveloped mesa land northeast of Scripps, as well as 500 acres (200 ha) on the former site of Camp Matthews from the federal government, but Roger Revelle, then director of Scripps Institution and main advocate for establishing the new campus, jeopardized the site selection by exposing the La Jolla community’s exclusive real estate business practices, which were antagonistic to minority racial and religious groups. This outraged local conservatives, as well as Regent Edwin W. Pauley.

    University of California President Clark Kerr satisfied San Diego city donors by changing the proposed name from University of California, La Jolla, to University of California-San Diego. The city voted in agreement to its part in 1958, and the University of California approved construction of the new campus in 1960. Because of the clash with Pauley, Revelle was not made chancellor. Herbert York, first director of DOE’s Lawrence Livermore National Laboratory, was designated instead. York planned the main campus according to the “Oxbridge” model, relying on many of Revelle’s ideas.

    According to Kerr, “San Diego always asked for the best,” though this created much friction throughout the University of California system, including with Kerr himself, because University of California-San Diego often seemed to be “asking for too much and too fast.” Kerr attributed University of California-San Diego’s “special personality” to Scripps, which for over five decades had been the most isolated University of California unit in every sense: geographically, financially, and institutionally. It was a great shock to the Scripps community to learn that Scripps was now expected to become the nucleus of a new University of California campus and would now be the object of far more attention from both the university administration in Berkeley and the state government in Sacramento.

    The University of California-San Diego was the first general campus of the University of California to be designed “from the top down” in terms of research emphasis. Local leaders disagreed on whether the new school should be a technical research institute or a more broadly based school that included undergraduates as well. John Jay Hopkins of General Dynamics Corporation pledged one million dollars for the former while the City Council offered free land for the latter. The original authorization for the University of California-San Diego campus given by the University of California Regents in 1956 approved a “graduate program in science and technology” that included undergraduate programs, a compromise that won both the support of General Dynamics and the city voters’ approval.

    Nobel laureate Harold Urey, a physicist from the University of Chicago, and Hans Suess, who had published the first paper on the greenhouse effect with Revelle in the previous year, were early recruits to the faculty in 1958. Maria Goeppert-Mayer, later the second female Nobel laureate in physics, was appointed professor of physics in 1960. The graduate division of the school opened in 1960 with 20 faculty in residence, with instruction offered in the fields of physics, biology, chemistry, and earth science. Before the main campus completed construction, classes were held in the Scripps Institution of Oceanography.

    By 1963, new facilities on the mesa had been finished for the School of Science and Engineering, and new buildings were under construction for Social Sciences and Humanities. Ten additional faculty in those disciplines were hired, and the whole site was designated the First College, later renamed after Roger Revelle, of the new campus. York resigned as chancellor that year and was replaced by John Semple Galbraith. The undergraduate program accepted its first class of 181 freshmen at Revelle College in 1964. Second College was founded in 1964, on the land deeded by the federal government, and named after environmentalist John Muir two years later. The University of California-San Diego School of Medicine also accepted its first students in 1966.

    Political theorist Herbert Marcuse joined the faculty in 1965. A champion of the New Left, he reportedly was the first protester to occupy the administration building in a demonstration organized by his student, political activist Angela Davis. The American Legion offered to buy out the remainder of Marcuse’s contract for $20,000; the Regents censured Chancellor William J. McGill for defending Marcuse on the basis of academic freedom, but further action was averted after local leaders expressed support for Marcuse. Further student unrest was felt at the university, as the United States increased its involvement in the Vietnam War during the mid-1960s, when a student raised a Viet Minh flag over the campus. Protests escalated as the war continued and were only exacerbated after the National Guard fired on student protesters at Kent State University in 1970. Over 200 students occupied Urey Hall, with one student setting himself on fire in protest of the war.

    Early research activity and faculty quality, notably in the sciences, was integral to shaping the focus and culture of the university. Even before The University of California-San Diego had its own campus, faculty recruits had already made significant research breakthroughs, such as the Keeling Curve, a graph that plots rapidly increasing carbon dioxide levels in the atmosphere and was the first significant evidence for global climate change; the Kohn–Sham equations, used to investigate particular atoms and molecules in quantum chemistry; and the Miller–Urey experiment, which gave birth to the field of prebiotic chemistry.

    Engineering, particularly computer science, became an important part of the university’s academics as it matured. University researchers helped develop University of California-San Diego Pascal, an early machine-independent programming language that later heavily influenced Java; the National Science Foundation Network, a precursor to the Internet; and the Network News Transfer Protocol during the late 1970s to 1980s. In economics, the methods for analyzing economic time series with time-varying volatility (ARCH), and with common trends (cointegration) were developed. The University of California-San Diego maintained its research intense character after its founding, racking up 25 Nobel Laureates affiliated within 50 years of history; a rate of five per decade.

    Under Richard C. Atkinson’s leadership as chancellor from 1980 to 1995, the university strengthened its ties with the city of San Diego by encouraging technology transfer with developing companies, transforming San Diego into a world leader in technology-based industries. He oversaw a rapid expansion of the School of Engineering, later renamed after Qualcomm founder Irwin M. Jacobs, with the construction of the San Diego Supercomputer Center and establishment of the computer science, electrical engineering, and bioengineering departments. Private donations increased from $15 million to nearly $50 million annually, faculty expanded by nearly 50%, and enrollment doubled to about 18,000 students during his administration. By the end of his chancellorship, the quality of The University of California-San Diego graduate programs was ranked 10th in the nation by the National Research Council.

    The university continued to undergo further expansion during the first decade of the new millennium with the establishment and construction of two new professional schools, the Skaggs School of Pharmacy and Rady School of Management, and the California Institute for Telecommunications and Information Technology, a research institute run jointly with University of California Irvine. The University of California-San Diego also reached two financial milestones during this time, becoming the first university in the western region to raise over $1 billion in its eight-year fundraising campaign in 2007 and also obtaining an additional $1 billion through research contracts and grants in a single fiscal year for the first time in 2010. Despite this, due to the California budget crisis, the university borrowed $40 million against its own assets in 2009 to offset a significant reduction in state educational appropriations. The salary of Pradeep Khosla, who became chancellor in 2012, has been the subject of controversy amidst continued budget cuts and tuition increases.

    On November 27, 2017, the university announced it would leave its longtime athletic home, the California Collegiate Athletic Association, an NCAA Division II league, to begin a transition to Division I competition on July 1, 2020. At that point it joins the Big West Conference, already home to four other UC campuses (Davis, Irvine, Riverside, Santa Barbara); the transition period runs through the 2023–24 school year.


    Applied Physics and Mathematics

    The Nature Index lists The University of California-San Diego as 6th in the United States for research output by article count in 2019. In 2017, The University of California-San Diego spent $1.13 billion on research, the 7th highest expenditure among academic institutions in the U.S. The university operates several organized research units, including the Center for Astrophysics and Space Sciences (CASS), the Center for Drug Discovery Innovation, and the Institute for Neural Computation. The University of California-San Diego also maintains close ties to the nearby Scripps Research Institute and Salk Institute for Biological Studies. In 1977, The University of California-San Diego developed and released the University of California-San Diego Pascal programming language. The university was designated as one of the original national Alzheimer’s disease research centers in 1984 by the National Institute on Aging. In 2018, The University of California-San Diego received $10.5 million from the DOE National Nuclear Security Administration to establish the Center for Matter under Extreme Conditions (CMEC).

    The university founded the San Diego Supercomputer Center (SDSC) in 1985, which provides high performance computing for research in various scientific disciplines. In 2000, The University of California-San Diego partnered with The University of California-Irvine to create the Qualcomm Institute – University of California-San Diego, which integrates research in photonics, nanotechnology, and wireless telecommunication to develop solutions to problems in energy, health, and the environment.

    The University of California-San Diego also operates the Scripps Institution of Oceanography, one of the largest centers of research in earth science in the world, which predates the university itself. Together, SDSC and SIO, along with funding partner universities California Institute of Technology, San Diego State University, and The University of California-Santa Barbara, manage the High Performance Wireless Research and Education Network.

  • richardmitnick 11:27 am on July 3, 2022 Permalink | Reply
    Tags: "CERN’s Higgs boson discovery: The pinnacle of international scientific collaboration?", , , , , , , , ,   

    From “Physics Today” : “CERN’s Higgs boson discovery: The pinnacle of international scientific collaboration?” 


    From “Physics Today”

    30 Jun 2022
    Michael Riordan

    Decades of effort to establish a global, scientist-managed high-energy-physics laboratory culminated in the discovery of the final missing piece of the discipline’s standard model.

    Credit: Abigail Malate for Physics Today

    Ten years ago, two of the largest scientific collaborations ever—spanning six continents and encompassing more than 60 nations—announced their discovery at CERN of the long-sought Higgs boson, the capstone of the standard model.

    Physicists from all the countries involved could take well-earned credit for what will surely stand as one of the 21st century’s greatest scientific breakthroughs. It was a remarkable diplomatic achievement, too, at a moment of relative world peace, perhaps the pinnacle of international scientific cooperation. And it would not have been possible without a series of farsighted decisions and actions.


    Part of CERN’s success as a citadel of modern physics is due to the early-1950s decision to establish it in Geneva, Switzerland, a city and nation widely recognized for cosmopolitanism and political neutrality. Many thousands of scientists of diverse nationalities, not just Europeans, have eagerly pursued high-energy-physics research in this highly appealing environment, given its many cultural amenities—plus world-class hiking, mountain climbing, and skiing in the nearby Alps.

    CERN grew steadily during more than five decades of increasingly important high-energy-physics research, reusing existing accelerators and colliders wherever possible in the construction of new facilities. It gradually developed a talented, cohesive staff that could effectively manage the difficult construction of the multibillion-euro Large Hadron Collider (LHC) and its four gigantic detectors: ALICE, ATLAS, CMS, and LHCb.









    After the 1993 demise of the Superconducting Super Collider (SSC), CERN leaders decided to pursue construction of the LHC, but they realized they needed to attract significant funds for the project from beyond Europe. That transformation—effectively to make it a “world laboratory”—required extending its organizational framework and lab culture to embrace those contributions and the large contingents of non-European physicists that would accompany them.

    Given that accomplishment, CERN will likely remain the focus of world high-energy physics as the discipline begins building the next generation of particle colliders.

    Especially after the savage Russian invasion of Ukraine and the looming bifurcation of the world order, the lab now offers an island of stability in a global sea of uncertainty. National governments require strong assurances that the money and equipment they send abroad for scientific megaprojects are being well managed on behalf of their scientists and citizenry. In that regard, CERN has a remarkably robust, decades-long track record.

    Funding international collaborations

    Establishing a vigorous, productive laboratory culture does not happen overnight. It requires years, if not decades. In the late 1980s, SSC proponents failed to appreciate that essential process. Rather than electing to build their gargantuan new collider in Illinois adjacent to Fermilab and adapt the lab’s existing Tevatron to serve as a proton injector, they selected a new, “green field” site just south of Dallas, Texas.

    [Before the LHC at CERN, the DOE’s Fermi National Accelerator Laboratory had sought the Higgs with the Tevatron accelerator.

    But the Tevatron could muster only about 2 TeV [teraelectronvolts] of collision energy, not enough to find the Higgs. CERN’s LHC is capable of 13 TeV.

    The other possible attempt in the U.S. would have been the Superconducting Super Collider.

    Fermilab has gone on to become a world powerhouse in neutrino research with the LBNF/DUNE project, which will send neutrinos 800 miles to SURF, the Sanford Underground Research Facility in Lead, South Dakota.]

    Other factors were involved in the project’s collapse, too, among them the internecine politics of Washington, DC (see my article, Physics Today, October 2016, page 48). But mismanagement of the project (whether real or perceived) by a contentious, untested organization of accelerator physicists and military managers contributed heavily to the SSC’s October 1993 termination by the US Congress.

    When the US quest to build the SSC finally ended, CERN was ready to push ahead with plans for its fledgling LHC project—and to make it a global endeavor. Whereas the SSC project had severe difficulty in securing foreign contributions for building the collider, CERN reached beyond its 19 European member states for contributions to the LHC. By the time the CERN Council gave conditional approval to proceed with the project in December 1994, the lab could anticipate sufficient funding from Europe for an initial construction phase based on a proposed “missing magnet” scheme: Just two-thirds of the proton collider’s superconducting dipole magnets would at first be installed in the existing 27 km tunnel of the Large Electron–Positron (LEP) Collider after its physics research ended. Some doubted whether the scheme was feasible, but it permitted the project to begin hardly a year after the SSC termination. And CERN then opened the door to additional contributions from nonmember states that would allow LHC construction to occur in a single phase.

    In May 1995 Japan became the first non-European nation to offer a major contribution to LHC construction, committing a total of 5 billion yen (then worth about 65 million Swiss francs or $50 million). Russia made a similar commitment the following year, mainly for the construction of the LHC detectors. Canada, China, India, and Israel soon followed suit (although with smaller contributions). The US—still smarting from the SSC debacle—took longer. After lengthy negotiations with the Department of Energy and Congress, CERN director general Christopher Llewellyn Smith finally succeeded in securing a major US commitment worth $531 million in December 1997, including $200 million for collider construction. The US, Japan, and Russia were granted special “observer” status on the CERN Council, giving them a say in LHC management.

    Russia provides an excellent case history of the negotiations and agreements involved in extending CERN participation to include nonmember states. Soviet and Russian physicists had been involved in research there since the mid-1970s, when they began working on fixed-target experiments on the Super Proton Synchrotron.

    In the early 1990s, Russian physicists made major contributions to the design of the CMS detector for the LHC, for which the RDMS (Russia and Dubna member states) collaboration, led by the Joint Institute of Nuclear Research (JINR) in Dubna, Russia, played a formative role.

    Cutaway view of the original Compact Muon Solenoid, or CMS, detector. Credit: CERN.

    The total cost of materials and equipment produced in Russia for the CMS has been estimated at $15 million, with part of the amount provided by CERN and its member states. Russian institutes contributed a similar value of equipment and materials to the ATLAS experiment—again funded partly by CERN and its member states. Hundreds of Russian physicists have since been involved in both experiments.

    And those globe-spanning experimental collaborations benefited extensively from the creation and development of the World Wide Web at CERN by Tim Berners-Lee.

    By the time CERN shut down the LEP in November 2000 and began full-fledged LHC construction, the lab had effectively been transformed from a European center for high-energy physics into a world laboratory for the discipline. The “globalization” of high-energy physics was off to a good start.

    A crucial aspect of that global scientific laboratory is the Worldwide LHC Computing Grid, a multitier system of more than 150 computing centers linked by high-speed internet and private fiber-optic cables designed to cope with the torrent of information being generated by the LHC detectors—typically many terabytes of data daily. Initial event processing occurs on CERN mainframe computers, which send the results to 13 regional academic institutions (Fermilab and JINR, for example) for further processing and distribution. The grid enables experimenters to do much of the data analysis at their home institutions, supplemented by occasional in-person visits to CERN to interact directly with collaborators and detector hardware. In addition, thousands of these physicists make extensive use of the World Wide Web to share designs, R&D efforts, and initial results as well as to draft scientific articles for publication.

    CERN has been able to establish a successful laboratory culture, conducive to the best possible work by thousands of high-energy physicists, because the lab has essentially complete control of its budget, which exceeded a billion Swiss francs annually as the new century began. Accommodations have been made for specific national needs (for example, the costs of German reunification), but the resulting budget remains under CERN auspices. Important decisions are made by physicists—not bureaucrats or politicians—who better appreciate the ramifications of those decisions for the quality of the scientific research to be done. Contrary to the case of the SSC, meddlesome military managers were not involved.

    Discovering the Higgs boson

    Scientists’ control of their own workplace, which begins with laboratory design and construction and continues into its management and operations, is an important factor in doing successful research. When a meltdown of dozens of superconducting dipole magnets occurred shortly after LHC commissioning began in September 2008, for example, it was a crack team of CERN accelerator physicists who dealt with and solved the utterly challenging problem, taking more than a year to bring the machine back to life. Protons finally began colliding in November 2009, albeit at a reduced collision energy of 7 TeV and at very low luminosity (collision rate).

    Serious data taking began in 2011, as LHC operators nudged the luminosity steadily higher and proton collision data began to pour in. By year’s end, both the ATLAS and CMS experiments were seeing small excesses of two-photon and four-lepton events—the decay channels expected to give the clearest indication of Higgs boson production—in the vicinity of 125 GeV. But both collaborations stopped short of claiming a discovery.

    When similar excesses appeared in the experiments during the spring 2012 run, their confidence swelled—especially after combinations of the two-photon and four-lepton events exceeded the rigorous five-sigma statistical significance required in high-energy physics. I was fortunate to be present at CERN (if a little groggy from jet lag) when that crucial threshold was crossed in late June by a group of ATLAS experimenters, many hailing from China and the US, who began noisily celebrating in an adjacent office. (See the accompanying essay by Sau Lan Wu.)
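    The five-sigma convention mentioned above translates into a concrete probability: the chance that a background fluctuation alone would mimic a signal at least that strong. A minimal check using only the Python standard library:

```python
import math

def sigma_to_pvalue(n_sigma):
    """One-sided tail probability of a standard normal beyond n_sigma."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2.0))

# 5 sigma: roughly a 1-in-3.5-million chance of a background fluke
p_five_sigma = sigma_to_pvalue(5.0)
```

    That roughly one-in-3.5-million threshold is why the collaborations waited for the combined 2011 and 2012 data before celebrating.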

    The 4 July 2012 CERN press conference announcing the Higgs discovery—timed to coincide with the opening day of the 36th International Conference on High Energy Physics in Melbourne, Australia—was televised around the globe to rapt physicist audiences on at least six continents. Americans had to awaken in the early morning hours of their nation’s 236th birthday to watch the proceedings. In the packed auditorium, along with former CERN directors (including Llewellyn Smith) and current managers sitting prominently and proudly in the front row, sat theorists François Englert and Peter Higgs, who would soon share the Nobel Prize in physics for anticipating this epochal discovery (see Physics Today, December 2013, page 10). “I think we have it,” stated CERN director general Rolf-Dieter Heuer after the ATLAS and CMS presentations, perhaps a bit guardedly. “We have observed a new particle consistent with a Higgs boson.”

    At the Higgs discovery announcement, CERN Director General Rolf Heuer congratulates François Englert and Peter Higgs, who would later receive the 2013 Nobel Prize in Physics for their theoretical description of the origin of mass—which was confirmed by the Higgs boson detection.

    It was certainly a European triumph, a vindication of the continent’s patient and enduring support of science—but also a triumph for the global physics community. Both the ATLAS and CMS collaborations then involved about 3000 physicists. ATLAS physicists hailed from 177 institutions in 38 nations; CMS included 182 institutions in 40 nations. Physicists from Brazil, Canada, China, India, Japan, Russia, Ukraine, and the US, among many other nations, could rejoice in the superb achievement, along with those from Belgium, France, Germany, Italy, the Netherlands, Poland, Spain, Sweden, the UK, and other CERN member states.

    If the Higgs boson discovery does not represent the pinnacle of international scientific cooperation, it surely sets a high standard. It will be a difficult one to match in the coming decades, given the conflicts and cleavages that have been erupting since Russia’s brutal Ukraine invasion. Russian scientific institutes have been at least temporarily excluded from future CERN projects—and the ban may well become permanent. And the costs of European rearmament could easily impact the CERN budget in the coming years. The first two decades of the 21st century will certainly represent a special moment in history when so many nations could work together peacefully in a common scientific pursuit of the greatest significance.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Our mission

    The mission of “Physics Today” is to be a unifying influence for the diverse areas of physics and the physics-related sciences.

    It does that in three ways:

    • by providing authoritative, engaging coverage of physical science research and its applications without regard to disciplinary boundaries;
    • by providing authoritative, engaging coverage of the often complex interactions of the physical sciences with each other and with other spheres of human endeavor; and
    • by providing a forum for the exchange of ideas within the scientific community.”

  • richardmitnick 10:10 am on July 3, 2022 Permalink | Reply
    Tags: "Webb Telescope Will Look for Signs of Life Way Out There", , , , , , ,   

    From “The New York Times” : “Webb Telescope Will Look for Signs of Life Way Out There” 

    From “The New York Times”

    July 2, 2022
    Carl Zimmer

    The folded-up James Webb Space Telescope as it was prepared for mounting on a rocket and launch last year at the European spaceport in Kourou, French Guiana. Credit: Chris Gunn/NASA.

    The first question astronomers want to answer about exoplanets: Do they have atmospheres friendly to life?

    This month will mark a new chapter in the search for extraterrestrial life, when the most powerful space telescope yet built will start spying on planets that orbit other stars. Astronomers hope that the James Webb Space Telescope will reveal whether some of those planets harbor atmospheres that might support life.

    Identifying an atmosphere in another solar system would be remarkable enough. But there is even a chance — albeit tiny — that one of these atmospheres will offer what is known as a biosignature: a signal of life itself.

    “I think we will be able to find planets that we think are interesting — you know, good possibilities for life,” said Megan Mansfield, an astronomer at the University of Arizona. “But we won’t necessarily be able to just identify life immediately.”

    So far, Earth remains the only planet in the universe where life is known to exist. Scientists have been sending probes to Mars for almost 60 years and have not yet found Martians. But it is conceivable that life is hiding under the surface of the Red Planet or waiting to be discovered on a moon of Jupiter or Saturn. Some scientists have held out hope that even Venus, despite its scorching atmosphere of sulfur dioxide clouds, might be home to Venusians.

    Even if Earth turns out to be the only planet harboring life in our own solar system, many other solar systems in the universe hold so-called exoplanets.

    In 1995, Swiss astronomers spotted the first exoplanet orbiting a sunlike star. Known as 51 Pegasi b, the exoplanet turned out to be an unpromising home for life — a puffy gas giant bigger than Jupiter, and a toasty 1,800 degrees Fahrenheit.

    In the years since, scientists have found more than 5,000 other exoplanets. Some of them are far more similar to Earth — roughly the same size, made of rock rather than gas and orbiting in a “Goldilocks zone” around their star, not so close as to get cooked but not so far as to be frozen.

    An artist’s rendering of the exoplanet 51 Pegasi b, the first exoplanet discovered orbiting a sunlike star. Credit: The European Southern Observatory [La Observatorio Europeo Austral][Observatoire européen austral][Europäische Südsternwarte](EU)(CL).

    Unfortunately, the relatively small size of these exoplanets has made them extremely difficult to study, until now. The James Webb Space Telescope, launched last Christmas, will change that, acting as a magnifying glass to let astronomers look more closely at these worlds.

    Since its launch from Kourou, French Guiana, the telescope has traveled a million miles from Earth, entering an orbit around the second sun–Earth Lagrange point, L2.

    There, a shield protects its 21-foot mirror from any heat or light from the sun or Earth. In this profound darkness, the telescope can detect faint, distant glimmers of light, including those that could reveal new details about faraway planets.

    The space telescope “is the first big space observatory to take the study of exoplanet atmospheres into account in its design,” Dr. Mansfield said.

    NASA engineers began taking pictures of an array of objects with the Webb telescope in mid-June and will release its first images to the public on July 12.

    Exoplanets will be in that first batch of pictures, said Eric Smith, the program’s lead scientist. Because the telescope will spend relatively little time observing the exoplanets, Dr. Smith considered those first images a “quick and dirty” look at the telescope’s power.

    Those quick looks will be followed by a series of much longer observations, starting in July, offering a much clearer picture of the exoplanets.

    A number of teams of astronomers are planning to look at the seven planets that orbit a star called Trappist-1.

    The TRAPPIST-1 star and planet system; the Belgian robotic TRAPPIST telescope at ESO’s La Silla Observatory in Chile.


    Earlier observations have suggested that three of the planets occupy the habitable zone.

    “It’s an ideal place to look for traces of life outside of the solar system,” said Olivia Lim, a graduate student at the University of Montreal who will be observing the Trappist-1 planets starting around July 4.

    Because Trappist-1 is a small, cool star, its habitable zone is closer to it than in our own solar system. As a result, its potentially habitable planets orbit at close range, taking just a few days to circle the star. Every time the planets pass in front of Trappist-1, scientists will be able to tackle a basic but crucial question: Do any of them have an atmosphere?
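    Those few-day orbits follow directly from Kepler's third law once the star's small mass is plugged in. The rough check below uses an assumed stellar mass of about 9% of the sun's and an assumed orbital distance of 0.03 astronomical units, ballpark figures for TRAPPIST-1 rather than values from this article:

```python
import math

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30         # kg
AU = 1.496e11            # m

m_star = 0.09 * M_SUN    # assumed TRAPPIST-1 mass, ~9% of the sun's
a = 0.03 * AU            # assumed orbital distance, well inside Mercury's orbit

# Kepler's third law: P = 2*pi*sqrt(a^3 / (G*M))
period_days = 2 * math.pi * math.sqrt(a ** 3 / (G * m_star)) / 86_400
```

    The result comes out to roughly a week, which is why each Trappist-1 planet offers many transits per month of observing time.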

    “If it doesn’t have air, it’s not habitable, even if it’s in the habitable zone,” said Nikole Lewis, an astronomer at Cornell University.

    Dr. Lewis and other astronomers would not be surprised to find no atmospheres surrounding Trappist-1’s planets. Even if the planets had developed atmospheres when they formed, the star might have blasted them away long ago with ultraviolet and X-ray radiation.

    “It’s possible that they could just strip away all of the atmosphere on a planet before it even had a chance to like start forming life,” Dr. Mansfield said. “That’s the first-order question that we’re trying to answer here: whether these planets could have an atmosphere long enough that they’d be able to develop life.”

    A planet passing in front of Trappist-1 will create a tiny shadow, but the shadow will be too small for the space telescope to capture. Instead, the telescope will detect a slight dimming in the light traveling from the star.

    “It’s like looking at a solar eclipse with your eyes shut,” said Jacob Lustig-Yaeger, an astronomer doing a postdoctoral fellowship at the Johns Hopkins Applied Physics Laboratory. “You might have some sense that the light has dimmed.”

    A planet with an atmosphere would dim the star behind it differently than a bare planet would. Some of the star’s light will pass straight through the atmosphere, but the gases will absorb light at certain wavelengths. If astronomers look only at starlight at those wavelengths, the planet will dim Trappist-1 even more.
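    The size of these signals can be estimated from geometry alone: the dimming fraction is the ratio of the planet's disk area to the star's. The numbers below are illustrative assumptions (an Earth-sized planet, a star about 12% of the sun's radius, and an opaque atmosphere 50 km thick), not measurements from the article:

```python
R_SUN_KM = 696_000

def transit_depth(r_planet_km, r_star_km):
    """Fraction of starlight blocked during transit: (Rp/Rs)^2."""
    return (r_planet_km / r_star_km) ** 2

r_star = 0.12 * R_SUN_KM   # assumed TRAPPIST-1 radius
r_planet = 6_371           # an Earth-sized planet

bare = transit_depth(r_planet, r_star)
# at a wavelength the atmosphere absorbs, the planet looks slightly bigger
veiled = transit_depth(r_planet + 50, r_star)
extra_ppm = (veiled - bare) * 1e6   # the extra dimming, in parts per million
```

    The baseline transit blocks about half a percent of the starlight, while the atmosphere adds only tens of parts per million on top, which is the "teeny-tiny signal" the astronomers describe below.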

    The telescope will send these observations of Trappist-1 back to Earth. “And then you get an email that’s like, ‘Hello, your data are available,’” Dr. Mansfield said.

    But the light coming from Trappist-1 will be so faint that it will take time to make sense of it. “Your eye is used to dealing with millions of photons per second,” Dr. Smith said. “But these telescopes, they’re just collecting a few photons a second.”

    Before Dr. Mansfield or her fellow astronomers can analyze exoplanets passing in front of Trappist-1, they will first have to distinguish the planets’ signals from tiny fluctuations produced by the telescope’s own machinery.

    “A lot of the work that I actually do is making sure that we’re carefully correcting for anything weird that the telescope is doing, so that we can see those teeny-tiny signals,” Dr. Mansfield said.

    An artist’s concept of the view from one of the planets in the Trappist-1 system. Credit: M. Kornmesser/European Southern Observatory, via European Pressphoto Agency.

    It is possible that at the end of those efforts, Dr. Mansfield and her colleagues will discover an atmosphere around a Trappist-1 planet. But that result alone will not reveal the nature of the atmosphere. It might be rich in nitrogen and oxygen, like on Earth, or more akin to the toxic stew of carbon dioxide and sulfuric acid on Venus. Or it could be a mix that scientists have never seen before.

    “We have no idea what these atmospheres are made of,” said Alexander Rathcke, an astronomer at the Technical University of Denmark. “We have ideas, simulations, and all this stuff, but we really have no idea. We have to go and look.”

    The James Webb Space Telescope, sometimes called the J.W.S.T., may prove powerful enough to determine the specific ingredients of exoplanet atmospheres because each kind of molecule absorbs a different range of wavelengths of light.

    But those discoveries will depend on the weather on the exoplanets. A bright, reflective blanket of clouds could prevent any starlight from entering an exoplanet’s atmosphere, ruining any attempt to find alien air.

    “It is really hard to distinguish between an atmosphere with clouds or no atmosphere,” Dr. Rathcke said.

    If the weather cooperates, astronomers are especially eager to see if the exoplanets have water in their atmospheres. At least on Earth, water is an essential requirement for biology. “We think that would probably be a good starting point to look for life,” Dr. Mansfield said.

    But a watery atmosphere will not necessarily mean that an exoplanet harbors life. To be sure a planet is alive, scientists will have to detect a biosignature, a molecule or a combination of several molecules that is distinctively made by living things.

    Scientists are still debating what a reliable biosignature would be. Earth’s atmosphere is unique in our solar system in that it contains a lot of oxygen, largely the product of plants and algae. But oxygen can also be produced without life’s help, when water molecules in the air are split. Methane, likewise, can be released by living microbes but also by volcanoes.

    It is possible that there is a particular balance of gases that can provide a clear biosignature, one that cannot be maintained without the help of life.

    “We need extremely favorable scenarios to find these biosignatures,” said Dr. Rathcke. “I’m not saying that it’s not possible. I just think it’s far-fetched. We need to be extremely lucky.”

    Joshua Krissansen-Totton, a planetary scientist at the University of California-Santa Cruz, said that finding such a balance may require the Webb telescope to observe a planet repeatedly passing in front of Trappist-1.

    “If anyone comes forward in the next five years and says, ‘Yes, we’ve found life with J.W.S.T.,’ I’ll be very skeptical of that claim,” Dr. Krissansen-Totton said.

    It is possible that the James Webb Space Telescope simply will not be capable of finding biosignatures. That task may have to wait for the next generation of space telescopes, more than a decade away. These will study exoplanets the same way that people look at Mars or Venus in the night sky: by observing starlight reflecting off them against the black background of space, rather than observing them as they pass in front of a star.

    “Mostly, we’ll be doing the very important groundwork for future telescopes,” Dr. Rathcke predicted. “I would be very surprised if J.W.S.T. delivers biosignature detections, but I hope to stand corrected. I mean, this is basically what I’m doing this work for.”

    See the full article here.



  • richardmitnick 8:30 am on July 3, 2022 Permalink | Reply
    Tags: "Gravity Could Solve Clean Energy’s One Major Drawback", A fundamental quirk of electricity: It is impossible to store., , Finding green energy when the winds are calm and the skies are cloudy has been a challenge. Storing it in giant concrete blocks could be the answer., Grids with a high percentage of wind and solar power are susceptible to sudden swings in electricity supply., Has the moment for gravity energy storage finally arrived?, In many parts of the world the era of burning fossil fuels to produce electricity is drawing to a close., Pumped hydro, The race to decarbonize our power grids poses challenges we haven’t faced before., The tricky part however would be figuring out a way to lift and stack weights autonomously., We are living through a revolution in electricity production., , Without a way to decarbonize the world’s electricity supply we’ll never hit net zero greenhouse gas emissions by 2050.   

    From “WIRED”: “Gravity Could Solve Clean Energy’s One Major Drawback” 

    From “WIRED”

    Jan 4, 2022 [Just now in social media.]
    Matt Reynolds

    The Commercial Demonstration Unit lifts blocks weighing 35 tons each. Photograph: Giovanni Frondoni.

    Finding green energy when the winds are calm and the skies are cloudy has been a challenge. Storing it in giant concrete blocks could be the answer.

    In a Swiss valley, an unusual multi-armed crane lifts two 35-ton concrete blocks high into the air. The blocks delicately inch their way up the blue steel frame of the crane, where they hang suspended from either side of a 66-meter-wide horizontal arm. There are three arms in total, each one housing the cables, winches, and grabbing hooks needed to hoist another pair of blocks into the sky, giving the apparatus the appearance of a giant metallic insect lifting and stacking bricks with steel webs. Although the tower is 75 meters tall, it is easily dwarfed by the forested flanks of southern Switzerland’s Lepontine Alps, which rise from the valley floor in all directions.

    Thirty meters. Thirty-five. Forty. The concrete blocks are slowly hoisted upwards by motors powered with electricity from the Swiss power grid. For a few seconds they hang in the warm September air, then the steel cables holding the blocks start to unspool and they begin their slow descent to join the few dozen similar blocks stacked at the foot of the tower. This is the moment that this elaborate dance of steel and concrete has been designed for. As each block descends, the motors that lift the blocks start spinning in reverse, generating electricity that courses through the thick cables running down the side of the crane and onto the power grid. In the 30 seconds during which the blocks are descending, each one generates about one megawatt of electricity: enough to power roughly 1,000 homes.
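A quick back-of-the-envelope check of those figures: the power available from a falling mass is its gravitational potential energy, E = mgh, divided by the descent time. The sketch below uses the article’s 35-ton block and 30-second descent, and assumes the full 75-meter tower height as the effective drop (an assumption, since the article doesn’t give the exact figure); the result comes out close to the quoted one megawatt.

```python
# Back-of-the-envelope check of the power from one descending block.
# Mass and descent time are from the article; the 75 m drop height is
# an assumption (the full height of the prototype tower).
G = 9.81           # gravitational acceleration, m/s^2
mass_kg = 35_000   # one 35-ton concrete block
drop_m = 75        # assumed effective drop height
descent_s = 30     # descent time quoted in the article

energy_j = mass_kg * G * drop_m   # potential energy released per drop
power_w = energy_j / descent_s    # average power during descent

print(f"Energy per drop: {energy_j / 3.6e6:.1f} kWh")   # ≈ 7.2 kWh
print(f"Average power:   {power_w / 1e6:.2f} MW")        # ≈ 0.86 MW
```

With friction and conversion losses ignored, ~0.86 MW per block is in line with the “about one megawatt” the article quotes.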

    This tower is a prototype from Switzerland-based Energy Vault, one of a number of startups finding new ways to use gravity to generate electricity. A fully-sized version of the tower might contain 7,000 bricks and provide enough electricity to power several thousand homes for eight hours. Storing energy in this way could help solve the biggest problem facing the transition to renewable electricity: finding a zero-carbon way to keep the lights on when the wind isn’t blowing and the sun isn’t shining. “The greatest hurdle we have is getting low-cost storage,” says Robert Piconi, CEO and cofounder of Energy Vault.

    Without a way to decarbonize the world’s electricity supply we’ll never hit net zero greenhouse gas emissions by 2050. Electricity production and heat add up to a quarter of all global emissions [IPCC] and, since almost every activity you can imagine requires electricity, cleaning up power grids has huge knock-on effects. If our electricity gets greener, so do our homes, industries, and transport systems. This will become even more critical as more parts of our lives become electrified— particularly heating and transport, which will be difficult to decarbonize in any other way. All of this electrification is expected to double electricity production by 2050 according to the International Atomic Energy Agency. But without an easy way to store large amounts of energy and then release it when we need it, we may never undo our reliance on dirty, polluting, fossil-fuel-fired power stations.

    This is where gravity energy storage comes in. Proponents of the technology argue that gravity provides a neat solution to the storage problem. Rather than relying on lithium-ion batteries, which degrade over time and require rare-earth metals that must be dug out of the ground, Piconi and his colleagues say that gravity systems could provide a cheap, plentiful, and long-lasting store of energy that we’re currently overlooking. But to prove it, they’ll need to build an entirely new way of storing electricity, and then convince an industry already going all-in on lithium-ion batteries that the future of storage involves extremely heavy weights falling from great heights.

    Energy Vault’s test site is in a small town called Arbedo-Castione in Ticino, the southernmost of Switzerland’s 26 cantons and the only one where the sole official language is Italian. The foothills of the Swiss Alps is a fitting location for a gravity energy storage startup: A short drive east from Energy Vault’s offices will take you to the Contra Dam, a concrete edifice made famous in the opening scene of GoldenEye, where James Bond bungee-jumps down the dam’s 220-meter-high face to infiltrate a top-secret Soviet chemical weapons facility. Just to the north of Arbedo-Castione, another towering dam blocks the upper Blenio Valley, holding back the waters of the Luzzone reservoir.

    Water and height—Switzerland has both of these resources in abundance, which is why the country was an early pioneer of the oldest and most widely used large-scale energy storage on the planet: pumped hydro. In the very north of Switzerland is the oldest working pumped hydro facility in the world. Built in 1907, the Engeweiher pumped hydro facility works on the same basic premise as Energy Vault’s tower. When electricity supply is plentiful, water is pumped upwards from the nearby Rhine to fill the 90,000-cubic-meter Engeweiher reservoir. When energy demand is at its highest, some of this water is released through a set of gates and plunges down to a hydroelectric power plant, where the downward movement of the water turns the blades of a turbine and generates electricity. Engeweiher now doubles as a local beauty spot, popular with joggers and dog walkers from the nearby town of Schaffhausen, but pumped hydro has come a long way since the early 20th century. Over 94 percent of the world’s large-scale energy storage is pumped hydro, most of it built between the 1960s and ’90s to harness cheap electricity produced by nuclear power plants running overnight.
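The energy stored behind a pumped-hydro reservoir follows the same arithmetic, E = ρVgh, for water of density ρ, volume V, and head h. The rough sketch below uses the 90,000 m³ Engeweiher volume from the article; the 30 m head is an illustrative assumption, not a published figure for the facility.

```python
# Rough stored-energy estimate for a small pumped-hydro reservoir.
# The volume is from the article; the 30 m head is an illustrative
# assumption, not a published figure for Engeweiher.
G = 9.81            # gravitational acceleration, m/s^2
RHO = 1000          # density of water, kg/m^3
volume_m3 = 90_000  # Engeweiher reservoir volume
head_m = 30         # assumed drop from reservoir to turbine

energy_j = RHO * volume_m3 * G * head_m
print(f"Stored energy: {energy_j / 3.6e9:.1f} MWh")   # ≈ 7.4 MWh
```

Even under this generous, loss-free assumption, a reservoir this small stores only a few megawatt-hours, which is why modern pumped-hydro plants are built at vastly larger scales.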

    The simplicity of pumped hydro made it the obvious starting point for Bill Gross, a serial entrepreneur and founder of the California-based startup incubator Idealab. “I always wanted to figure out a way to make what I was thinking was an artificial dam. How can we take the properties of a dam, which are so great, but build it wherever we want?” he says. Although new pumped hydro plants are still being built, the technology has some big drawbacks. New projects take years to plan and build, and they only work in places where height and water are plentiful. Gross wanted to re-create the simplicity of pumped hydro, but in a way that meant the storage could be built anywhere. In 2009 he cofounded a startup called Energy Cache, which planned to store energy by lifting gravel bags up hillsides using a jerry-rigged ski lift. Gross and his cofounder Aaron Fyke eventually built a small prototype of the device in 2012 on a hillside in Irwindale, California, but they struggled to find customers and shortly afterwards the startup folded. “For years I thought about that. I was saddened about that,” he says. “But I kept on thinking that the real thing that energy storage has to have is that you need to be able to put it wherever you want.” While Gross was brooding on his failed startup, the case for energy storage was only getting stronger. Between 2010 and 2016, the cost of solar electricity went from 38 cents (28p) per kilowatt hour to just 11 cents. Gross became convinced that it might be time to return to his gravity storage idea, with a new startup and a new design. And he knew exactly who he wanted to build it.

    Blocks raised by the Commercial Demonstration Unit “plug” into the blocks below. Photograph: Giovanni Frondoni.

    Andrea Pedretti has a background in building improbable structures. At his family’s civil engineering firm in Ticino he helped build the main stage for the annual Kongsberg Jazz Festival in Norway: a 20-meter-high floating PVC blanket with a bulging horn that pours sound into the town square. In 2016, Pedretti received a call from Gross asking him to help design a very different kind of structure: an energy storage device that would re-create pumped hydro without the need for mountains. The pair started drafting rough ideas for structures, calculating how much each one would cost to build and discussing the designs over frequent calls between Ticino and California. “[Gross] is always obsessed with reducing the cost of everything—he’s very good at this,” says Pedretti, now Energy Vault’s chief technology officer. One of their first designs took the form of a steel-walled tank 100 meters tall and 30 meters wide, where water would be pumped to the top and then released to plunge back down to the bottom, turning a turbine connected to a generator. Later they considered building a series of elevated plastic troughs that would tilt as water dropped between the levels. None of the designs brought the cost down low enough, so Pedretti and Gross returned to one of their very first ideas: using a crane to lift and drop weights. Cranes are cheap and the technology is everywhere, reasoned Pedretti. This way they wouldn’t have to reinvent the wheel just to get their idea off the ground.

    The tricky part, however, would be figuring out a way to lift and stack weights autonomously. The storage system would work by stacking thousands of blocks in concentric rings around a central tower, which would require millimeter-precise placement of the blocks and the ability to compensate for wind and the pendulum effect caused by a heavy weight swinging at the end of a cable. On the demonstrator tower in Arbedo-Castione, the trolleys that hold the cables that lift the bricks move back and forth to compensate for this motion; the blackboard in Pedretti’s office in Westlake Village, California, is still covered with equations he used to work out the best way to smoothly lift and stack blocks.

    In July 2017, Pedretti went online and bought a 40-year-old crane for €5,000. “It was rusty, but it was fine. It did the job,” he says. With his colleague at Energy Vault, Johnny Zani, he replaced the crane’s electronics and set it up in a town called Biasca, north of Energy Vault’s current test site. For their first test of the software, they instructed the crane to lift a bag of dirt and move it to a specific point a short distance away. “It was amazing—it worked the first time. This never happens! It took the weight, moved it and stopped it exactly ten metres away,” says Pedretti. A week later they swapped the bag of dirt for a stack of bright blue barrels and took a video of the crane stacking the barrels. “This was the video that basically started the company,” says Pedretti.

    By October 2017, Energy Vault had officially become a company, with Robert Piconi, a former healthcare executive and another of Gross’s collaborators, as its CEO. Now they had to convince investors that their 40-year-old crane was just the beginning of a company that could help solve the world’s growing renewable electricity dilemma.

    Energy Vault’s 75-meter-tall Commercial Demonstration Unit at night, in Arbedo-Castione, Switzerland. Photograph: Giovanni Frondoni.

    We are living through a revolution in electricity production. In many parts of the world the era of burning fossil fuels to produce electricity is drawing to a close. In 2020, the UK went a record-breaking 67 days without firing up one of its few remaining coal power plants, a staggering feat for a country that produced one-third of its electricity from coal less than 10 years ago. Since 2010, the rapid deployment of wind and solar has pushed the share of global electricity produced by renewables up from 20 percent to just under 29 percent. According to the International Energy Agency, by 2023 total installed wind and solar capacity will surpass that of natural gas. By 2024 it will shoot past coal and a year later renewables as a whole are set to become the single largest source of electricity generation worldwide. “If we are serious about trying to deal with climate change, we better be in a situation where we are moving towards a high renewables penetration system,” says Dharik Mallapragada, a research scientist at Massachusetts Institute of Technology’s Energy Initiative. “That’s our best card from a technology perspective. Just deploy as much wind and solar into the system as we can.”

    The race to decarbonize our grids poses challenges we haven’t faced before. Running a power grid is a high-wire act where electricity generation must be carefully balanced with demand at all times. The system is always on the verge of veering dangerously out of equilibrium. Generate too much electricity and the grid breaks down. Generate too little electricity and, well, the grid breaks down. This is exactly what happened in Texas in February 2021, when one of the coldest winter storms in decades hit the state. Texans raced to turn up their heating and defend against temperatures so low that the pipelines running to gas and nuclear power stations froze solid. As demand surged and supply plummeted in the early hours of February 15, staff in the control room at the Electric Reliability Council of Texas (ERCOT) frantically called utilities, asking them to cut power to their customers. Millions of Texans were left without electricity for days. Some died of hypothermia inside their own homes while they waited for the power to come back online. A few days after the crisis, ERCOT’s chief executive officer Bill Magness admitted that the entire grid was only “seconds and minutes” away from an uncontrolled blackout that could have left tens of millions of residents without power for several weeks.

    Grids with a high percentage of wind and solar power are susceptible to sudden swings in electricity supply. When the skies darken or the winds grow calm, that electricity generation simply disappears from the grid, leaving utilities to plug the gap using fossil fuels. The opposite situation poses problems too. Around 32 percent of California’s electricity is generated from renewables, but on cool spring days, when the skies are clear and the winds steady, this can spike to almost 95 percent. Unfortunately, solar power peaks at around midday, hours before electricity demand reaches its highest level as people return home from work, crank up the air-conditioning, and turn on the TV. Since solar power isn’t generated late in the evening, this peak demand is usually met by gas power plants instead. When researchers at California Independent System Operator charted this gap between solar production and peak energy demand on a graph, they noticed that the line traced the round belly and slender neck of a duck, and christened one of renewables’ most vexing complications the “duck curve.” The cute-looking curve is such a problem that California sometimes has to pay neighboring states to take excess solar energy off its hands to avoid overloading its power lines. In Hawaii, where the difference between peak solar electricity generation and peak demand is even more pronounced, this curve has another name: the “Nessie curve.”

    All of these problems are down to a fundamental quirk of electricity: It is impossible to store. A spark of electricity produced at a coal-fired power plant cannot stay still; it has to go somewhere. To keep networks in balance, grid operators are constantly matching supply and demand, but the more wind and solar you add to the grid, the more uncertainty you introduce into this balancing act. Utilities hedge against this by keeping fossil-fuel power plants around to dispatch reliable energy whenever necessary. Energy storage offers one way out of this bind. By converting electrical energy into a different form of energy—chemical energy in a lithium-ion battery, or gravitational potential energy in one of Energy Vault’s hanging bricks—you can hold onto that energy and deploy it exactly when you need it. That way you squeeze more value out of renewable power sources and reduce the need for backup from fossil fuel power plants. “It’s a shift that has to happen, and battery technology and energy storage more generally is an important part of that shift towards renewable power,” says Alex Holland, a senior technology analyst at IDTechEx. According to Bloomberg New Energy Finance, energy storage is on the verge of an exponential rise: Its 2019 report predicts a 122-fold increase in storage by 2040, requiring up to half a trillion pounds in new investments.

    A rendering of how retired coal-plant sites could be reused for Energy Vault Resiliency Centers. Photograph: Energy Vault Inc.

    Even as his company started work on the multi-arm crane design in 2018, it was becoming clear to Piconi that the next version of his energy storage system would need a major overhaul. For a start, a full-scale tower would weigh an astronomical amount and require deep foundations to keep it stable. The blocks alone would add up to about 245,000 tons—nearly half the weight of the Burj Khalifa skyscraper in Dubai. The exposed design also posed potential problems. If snow was trapped between two blocks it could be compacted into ice, making stacking more blocks impossible. Sandstorms could pose a similar risk.

    To solve these problems, Piconi and his colleagues decided to put their gravity storage system inside vast modular buildings—a system they call EVx. Each proposed building would measure at least 100 meters tall and contain thousands of weights. Getting rid of the crane simplifies the logistics of working with so many weights. Instead of having to be stacked precisely in concentric circles, now the weights can simply be lifted vertically by a trolley system and stored on a rack at the top of the building until they are ready to come back down again. The design can also be altered depending on storage requirements: A long but thin building would provide lots of energy over a relatively short period of time, while adding further width to the building would increase the timespan over which it could release energy. A one-gigawatt-hour system that could provide roughly enough energy to power around 100,000 homes for 10 hours would have a footprint of 25 to 30 acres. “I mean, it’s pretty massive,” Piconi says, but he points out that the systems are likely to be deployed in places where there is no shortage of space, including near existing wind and solar farms. The system is also garnering interest from power-hungry heavy industries eager to use more renewable energy. One potential customer is an ammonia manufacturer in the Middle East and another a large mining firm in Australia. Piconi says that the majority of customers will buy the storage system outright, but some can be leased on a monthly storage-as-a-service model. So far, the biggest deals on the table for Energy Vault are with big industrial clients. “As things have evolved and people are looking at alternatives and [solar power] has come down so low, these industrial applications become very interesting,” Piconi says.
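The sizing quoted above is internally consistent: one gigawatt-hour discharged over ten hours is 100 megawatts, or roughly a kilowatt of continuous draw per home across 100,000 homes. A minimal check, using only the figures given in the article:

```python
# Sanity check of the article's EVx sizing: a 1 GWh system powering
# ~100,000 homes for 10 hours implies ~1 kW of continuous draw per home.
capacity_wh = 1e9    # 1 gigawatt-hour of storage
hours = 10           # discharge duration
homes = 100_000      # homes served

power_w = capacity_wh / hours    # discharge power
per_home_w = power_w / homes     # continuous draw per home

print(f"Discharge power: {power_w / 1e6:.0f} MW")   # 100 MW
print(f"Per home:        {per_home_w:.0f} W")        # 1000 W
```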

    The most important question facing Energy Vault is whether it can get the cost of its buildings low enough that it makes gravity the most attractive form of energy storage. Since 1991, the cost of lithium-ion batteries has fallen by 97 percent, and analysts expect that price to keep dropping in the coming decades. “Really, any storage technology has to compete against lithium-ion, because lithium-ion is on this incredible cost-reduction trajectory,” says Oliver Schmidt, a visiting researcher at Imperial College London. Over the next couple of decades, hundreds of millions of electric vehicles will roll off production lines, and almost every single one of them will contain a lithium-ion battery. In mid-2018, Tesla’s Gigafactory was producing more than 20 gigawatt hours of lithium-ion batteries every year—more than the total grid-scale battery storage installed in the entire world. The boom in electric vehicles is driving the cost of lithium-ion down, and energy storage is coming along for the ride.

    The price of Energy Vault’s systems might not have so far to fall. Every facility will require the construction of a new building, although Gross says the team is already working on ways to cut costs by reducing the amount of material required and automating parts of the construction. One advantage it has is the weights. The several thousand 30-ton blocks in each EVx system can be made out of soil from the building site or other materials destined for landfill, plus a little binder. In July 2021, Energy Vault announced a partnership with Italian energy firm Enel Green Power to use fiberglass from decommissioned wind turbine blades to form part of its bricks. At its test site in Arbedo-Castione, it has a brick press that can churn out a new block every 15 minutes. “That’s what’s great about the way we’ve designed the supply chain. There’s nothing to stop us. It’s dirt. It’s waste product. We can build these brick machines in four months, we can build 25 to 50 of them,” says Piconi.

    Edinburgh-based energy storage startup Gravitricity has found a novel way to keep the costs of gravity storage down: dropping its weights down disused mineshafts, rather than building towers. “We believe that to get the sort of cost, engineering and physics to work for large scale systems … we need to use the geology of the Earth to hold the weight up,” says Gravitricity managing director Charlie Blair. In April 2021, Gravitricity started tests on a 15-meter-high demonstration system assembled in Leith, Scotland, but the company’s first commercial system may end up being in Czechia, where politicians are keen to find a new use for soon-to-be-decommissioned coal mines. Another potential location is South Africa, which has plenty of its own mines plus the added problems of an unstable electricity grid and frequent power blackouts.

    Gravitricity is targeting a different part of the energy market from Energy Vault: providing short bursts of electricity at crucial times to keep expensive energy infrastructure from being damaged. Power grids are designed to operate at a certain frequency; European grids run at 50 hertz while in the US it’s 60 hertz. This frequency is maintained by keeping a balance between supply and demand on the grid, but a sudden spike in either of these threatens to send the frequency rising or falling. In fossil-fuel power plants, spinning turbines act like shock absorbers, smoothing out small changes in frequency while operators either increase or decrease energy supply to match demand. Solar and wind power plants don’t work like this, so when they stop generating electricity, grids need another source of power to quickly step in to maintain frequency while generation elsewhere is ramped up. Blair says that Gravitricity’s systems will be able to respond to frequency changes in less than a second, and that combining its system with other technologies could shorten this response time even further. This service, called frequency response, is so crucial that power network operators pay a heavy premium for companies that can respond with split-second timing.
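Frequency response of this kind is commonly modeled as a proportional ("droop") controller: inject power when the measured frequency sags below nominal, absorb it when the frequency rises. The sketch below is purely illustrative; the 50 Hz European nominal frequency comes from the article, while the gain and the power cap are made-up numbers, not Gravitricity specifications.

```python
# Toy illustration of proportional ("droop") frequency response.
# A storage asset injects (+) or absorbs (-) power in proportion to the
# deviation from nominal frequency, clamped to its rated power.
# The gain and cap below are illustrative assumptions.
NOMINAL_HZ = 50.0        # European grid nominal frequency
GAIN_MW_PER_HZ = 200.0   # assumed response gain
CAP_MW = 100.0           # assumed rated power of the asset

def response_mw(measured_hz: float) -> float:
    """Power to inject (+) or absorb (-) for a measured grid frequency."""
    deviation = NOMINAL_HZ - measured_hz
    return max(-CAP_MW, min(CAP_MW, GAIN_MW_PER_HZ * deviation))

print(round(response_mw(49.8), 1))   # 40.0  (under-frequency: inject)
print(round(response_mw(50.1), 1))   # -20.0 (over-frequency: absorb)
print(response_mw(48.0))             # 100.0 (clamped at rated power)
```

Real frequency-response markets layer deadbands, ramp limits, and sub-second metering on top of this basic proportional behavior, but the clamped linear response is the core idea.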

    Has the moment for gravity energy storage finally arrived? In the last decade, multiple gravity startups have launched, failed and then reappeared in different forms. None of them have yet sold and built a system for a customer, although Energy Vault has eight deals signed with several projects slated to begin by the middle of 2022. In September 2021, the company announced that it would soon list on the New York Stock Exchange after a merger with a special purpose acquisition company (SPAC): an in-vogue alternative to an IPO that offers firms a quicker and easier route into going public. The company behind Energy Vault’s listing, Novus Capital, was also behind another SPAC which took the farming technology firm AppHarvest public in February 2021. Since then, AppHarvest’s share price has been on a dramatic downward slide, and the company is now subject to a class action lawsuit alleging that the firm misled investors about its projected financial results.

    The latest SPAC valued Energy Vault at $1.1 billion (£808 million), but some experts aren’t convinced that the potential for gravity energy storage is as widespread as its proponents suggest. “There’s a lot of money floating around, generally, green energy storage technologies. And I think you can ride that wave to a certain extent,” says Alex Holland, the analyst at IDTechEx. In 2019 Energy Vault announced a $110 million investment from SoftBank’s Vision Fund, although SoftBank only delivered $25 million of this before pausing the funding in 2020. SoftBank later re-invested in Energy Vault as part of a Series C round in August 2021 and again as part of the SPAC deal. Other investors in Energy Vault include Saudi Aramco Energy Ventures, Prime Movers Lab, and several investment firms.

    As with other early-stage storage companies, Energy Vault has had to strike a careful balancing act in how it pitches itself: disruptive enough to attract investors looking for the next big thing, but reliable and cheap enough that utilities will consider making it a part of their energy infrastructure. On one hand there is the moonshot of a fully renewable world, on the other the brute economics of cheap energy storage. One wall in the company’s Ticino offices holds a framed tweet from Bill Gates calling Energy Vault an “exciting company.” On the opposite side of the wall is another framed quote, this time from Robert Piconi himself, about dispatching stored energy below the cost of fossil fuels.

    Schmidt was also surprised to see a billion-dollar valuation. The need for long-term storage really starts to bite when energy systems are made up of more than 80 percent renewable energy. That figure is a very long way off for most countries. In the meantime, we still have other ways of achieving flexibility: thermal power plants burning biomass with carbon capture, interconnections between power grids and reducing demand for electricity. Schmidt thinks that lithium-ion will satisfy most of the world’s need for new storage until national power grids hit 80 percent renewables, and then the need for longer-term storage will be met by a host of competing technologies, including flow batteries, compressed air, thermal storage and gravity storage. “The first challenge with renewables, as you get to high penetrations, is second-to-second, minute-to-minute volatility, and if you can’t solve those stability problems you won’t ever get to 80 percent renewable penetration,” says Marek Kubik, a managing director at Fluence, an energy storage company that has built 3.4 gigawatts of grid-scale battery storage—almost all of it lithium ion. “Today, lithium ion has just been the dominant technology because of the cost declines, which are driven not by the stationary storage industry but by electric vehicles. That is a very formidable force.”

    Pedretti points out, however, that lithium ion batteries degrade over time and have to be replaced. Gravity is a form of storage that theoretically shouldn’t lose efficacy. “Today, people think short-term,” he says. “Politicians, managers, everyone is measured on short-term performance.” Switching the world to renewable electricity will require a shift in thinking from just a few years ahead to decades and even centuries to come. The people who built Switzerland’s dams and pumped hydro plants didn’t take a short-term view, he adds. The Engeweiher pumped hydro plant in Schaffhausen is still contracted to run for another 31 years; by the end of that contract it will have been in operation for nearly one and a half centuries. Building the power grid for a zero-carbon world is a similar exercise in long-term thinking: “In the past the people who made the dams didn’t think short-term. They thought more long-term. And today this is missing.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 3:03 pm on July 2, 2022 Permalink | Reply
    Tags: "Destruction and recovery of kelp forests driven by changes in sea urchin behavior", A dramatic outbreak of kelp-eating sea urchins along the Central Coast of California in 2014 leads to a significant reduction in the region’s kelp forests., A long-term study of kelp forest dynamics on California’s Central Coast highlights the critical role of sea urchin behavior not just the size of the urchin population., A series of major disruptions to California’s kelp forest ecosystems began in 2013 with the emergence of sea star wasting disease., Scientists showed that sea otters were crucial to maintaining patches of healthy kelp forest in Monterey Bay by preying on sea urchins., the 2014 outbreak of purple sea urchins in southern Monterey Bay was primarily driven by a behavioral shift and not by a population increase., The researchers found that kelp forests can recover if sea urchins move off of a reef.,   

    From The University of California-Santa Cruz: “Destruction and recovery of kelp forests driven by changes in sea urchin behavior” 

    From The University of California-Santa Cruz

    June 29, 2022
    Tim Stephens

    A long-term study of kelp forest dynamics on California’s Central Coast highlights the critical role of sea urchin behavior, not just the size of the urchin population.

    Purple sea urchins hidden in a reef crevice. (Photo by Michael Langhans)

    Along the Monterey Peninsula, “urchin barrens” with no kelp (foreground) are interspersed with remnant patches of kelp forest (background). (Photo by Michael Langhans)

    In 2019, researchers found a forest of bull kelp growing on a deep reef that had been a sea urchin barren the previous year. (Photo by Patrick Webster)

    A diver conducts a survey in a kelp forest. (Photo by Michael Langhans)

    A dramatic outbreak of kelp-eating sea urchins along the Central Coast of California in 2014, which led to a significant reduction in the region’s kelp forests, was driven primarily by the emergence of sea urchins from their hiding places rather than an increase in the urchin population. In subsequent years, sea urchin movements enabled kelp forest recovery at sites that had been denuded “urchin barrens.”

    Those are among the key findings of a long-term study of sea urchins and kelp forest dynamics in Monterey Bay conducted by scientists at the University of California, Santa Cruz. Published June 29 in Ecology Letters, the new findings could be helpful in efforts to restore decimated kelp forests along the California coast, said Joshua Smith, who led the study as a Ph.D. student at UCSC and is now a postdoctoral researcher at the National Center for Ecological Analysis and Synthesis at UC Santa Barbara.

    “As people are thinking about practical ways to facilitate kelp forest recovery, most people would agree we need to reduce the number of sea urchins, but it’s also really important to consider the role of sea urchin behavior,” Smith said.

    Smith and coauthor Tim Tinker, an adjunct professor of ecology and evolutionary biology at UCSC, used 22 years of long-term monitoring data to show that the 2014 outbreak of purple sea urchins in southern Monterey Bay was primarily driven by a behavioral shift and not by a population increase.

    Over the next three years, they tracked the foraging behavior of sea urchins in the area, which had become a patchy mosaic of kelp forests and urchin barrens. They found that sea urchin behavior was linked to transitions between the two states in both directions.

    Ecosystem disruptions

    A series of major disruptions to California’s kelp forest ecosystems began in 2013 with the emergence of sea star wasting disease, which wiped out a sea urchin predator, the sunflower sea star. The next year an extraordinary marine heatwave bathed the coast in warm water, creating poor conditions for the growth of kelp. That set the stage for the unprecedented outbreak of sea urchins. All of a sudden, the rocky reefs where kelp forests grew were covered with purple sea urchins grazing on the living kelp.

    Normally, Smith explained, sea urchins hide from predators in the cracks and crevices of the rocky reef and feed on kelp detritus that drifts their way on the currents. With reduced kelp productivity due to the warm water, the usual food deliveries weren’t happening, while at the same time a known sea urchin predator had disappeared.

    “In 2014, they came storming out of the crevices looking for kelp,” Smith said. “These were big adult sea urchins that showed up all of a sudden, especially in central and northern California, and that behavioral shift here along the Monterey Peninsula led to these urchin barrens where there had been kelp forest.”

    In a previous study, Smith and coauthors showed that sea otters were crucial to maintaining patches of healthy kelp forest in Monterey Bay by preying on sea urchins. In northern California, where there are no sea otters, the kelp forests are almost entirely gone.

    In the new study, Smith and Tinker found no evidence of an unusual pulse of juvenile sea urchins increasing the population in 2014. In subsequent years, however, sea urchin “recruitment” (the addition of juvenile urchins to the population) did increase, so there are now more sea urchins than there were before 2014, Smith said.

    Nevertheless, the researchers found that kelp forests can recover if sea urchins move off of a reef. In 2018, for example, they went out to a site that had been an urchin barren the previous year and found a kelp forest. The sea urchins had moved away from the reef, which they had stripped of kelp, into shallower water where there was an abundance of red foliose algae.

    “They prefer kelp, but they had eaten all the kelp, so they moved up into shallower water and that allowed the kelp to regrow on the deeper reef,” Smith said. “Interestingly, the kelp that came back was bull kelp, not the giant kelp that is typically dominant on the central coast.”

    He noted that water conditions this year are excellent for kelp growth. An unusually windy spring has driven strong upwelling of cold, nutrient-rich water along the coast.

    “We’re all waiting to see how that impacts kelp recovery,” Smith said. “There are just a lot more sea urchins now out on the reefs actively grazing, so we need to consider how we might get their numbers back down and enough predation to drive them back into the crevices.”

    This work was supported by the National Science Foundation.





    The University of California-Santa Cruz opened in 1965 and grew, one college at a time, to an enrollment of more than 16,000 students (as of 2008-09). Undergraduates pursue more than 60 majors supervised by divisional deans of humanities, physical & biological sciences, social sciences, and arts. Graduate students work toward graduate certificates, master’s degrees, or doctoral degrees in more than 30 academic fields under the supervision of the divisional and graduate deans. The dean of the Jack Baskin School of Engineering oversees the campus’s undergraduate and graduate engineering programs.

    UCSC is the home base for the Lick Observatory.

    Lick Observatory, operated since 1888 on Mount Hamilton east of San Jose, California (altitude 1,283 m / 4,209 ft), hosts instruments including the 36-inch Great Refractor, the 120-inch (3.0-meter) C. Donald Shane reflecting telescope, and the fully robotic 2.4-meter Automated Planet Finder.

    Search for extraterrestrial intelligence expands at Lick Observatory
    New instrument scans the sky for pulses of infrared light
    March 23, 2015
    By Hilary Lebow
    Astronomers are expanding the search for extraterrestrial intelligence into a new realm with detectors tuned to infrared light at UC’s Lick Observatory. A new instrument, called NIROSETI, will soon scour the sky for messages from other worlds.

    “Infrared light would be an excellent means of interstellar communication,” said Shelley Wright, an assistant professor of physics at UC San Diego who led the development of the new instrument while at the University of Toronto’s Dunlap Institute for Astronomy & Astrophysics.

    Wright worked on an earlier SETI project at Lick Observatory as a UC Santa Cruz undergraduate, when she built an optical instrument designed by UC Berkeley researchers. The infrared project takes advantage of new technology not available for that first optical search.

    Infrared light would be a good way for extraterrestrials to get our attention here on Earth, since pulses from a powerful infrared laser could outshine a star, if only for a billionth of a second. Interstellar gas and dust is almost transparent to near infrared, so these signals can be seen from great distances. It also takes less energy to send information using infrared signals than with visible light.
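    The energy-efficiency point follows directly from photon physics: a photon’s energy is E = hc/λ, so longer wavelengths carry less energy per photon. A minimal back-of-the-envelope sketch, using illustrative wavelengths (500 nm visible vs. 1500 nm near-infrared; these specific values are assumptions, not from the article):

```python
# Photon energy E = h*c / wavelength: longer wavelengths mean less
# energy per photon, so delivering a given number of near-infrared
# photons costs less energy than the same count of visible photons.

H = 6.626e-34   # Planck constant (J*s)
C = 2.998e8     # speed of light (m/s)

def photon_energy(wavelength_m):
    """Energy in joules of a single photon of the given wavelength."""
    return H * C / wavelength_m

e_visible = photon_energy(500e-9)    # green light, 500 nm
e_infrared = photon_energy(1500e-9)  # near-infrared, 1500 nm

# A 1500 nm photon carries one third the energy of a 500 nm photon,
# since energy scales inversely with wavelength.
print(e_infrared / e_visible)
```

    The ratio depends only on the two wavelengths, which is why the advantage holds regardless of the exact constants used.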

    The NIROSETI instrument saw first light on the Nickel 1-meter Telescope at Lick Observatory on March 15, 2015. (Photo by Laurie Hatch.)
    Alumna Shelley Wright, now an assistant professor of physics at UC San Diego, discusses the dichroic filter of the NIROSETI instrument, developed at the University of Toronto’s Dunlap Institute for Astronomy and Astrophysics and installed on the Nickel Telescope at Lick Observatory. (Photo by Laurie Hatch.)

    NIROSETI team, from left to right: Rem Stone, UCO Lick Observatory; Dan Werthimer, UC Berkeley; Jérôme Maire, U Toronto; Shelley Wright, UC San Diego; Patrick Dorval, U Toronto; Richard Treffers, Starman Systems. (Image by Laurie Hatch.)


    Frank Drake, professor emeritus of astronomy and astrophysics at UC Santa Cruz and director emeritus of the SETI Institute, said there are several additional advantages to a search in the infrared realm.

    Frank Drake with his Drake Equation. Credit Frank Drake.

    “The signals are so strong that we only need a small telescope to receive them. Smaller telescopes can offer more observational time, and that is good because we need to search many stars for a chance of success,” said Drake.

    The only downside is that extraterrestrials would need to be transmitting their signals in our direction, Drake said, though he sees this as a positive side to that limitation. “If we get a signal from someone who’s aiming for us, it could mean there’s altruism in the universe. I like that idea. If they want to be friendly, that’s who we will find.”

    Scientists have searched the skies for radio signals for more than 50 years and expanded their search into the optical realm more than a decade ago. The idea of searching in the infrared is not a new one, but instruments capable of capturing pulses of infrared light only recently became available.

    “We had to wait,” Wright said. “I spent eight years waiting and watching as new technology emerged.”

    Now that technology has caught up, the search will extend to stars thousands of light years away, rather than just hundreds. NIROSETI, or Near-Infrared Optical Search for Extraterrestrial Intelligence, could also uncover new information about the physical universe.
