Updates from richardmitnick

  • richardmitnick 2:56 pm on June 22, 2018

    From Scientific American: “Evidence Builds for a New Kind of Neutrino” 


    June 7, 2018
    Clara Moskowitz


    Physicists have caught ghostly particles called neutrinos misbehaving at an Illinois experiment, suggesting an extra species of neutrino exists. If borne out, the findings would be nothing short of revolutionary, introducing a new fundamental particle to the lexicon of physics that might even help explain the mystery of dark matter.

    Undeterred by the fact that no one agrees on what the observations actually mean, experts gathered at a neutrino conference this week in Germany are already excitedly discussing these and other far-reaching implications.

    Neutrinos are confusing to begin with. Formed long ago in the universe’s first moments and today in the hearts of stars and the cores of nuclear reactors, the minuscule particles travel at nearly the speed of light and scarcely interact with anything else; billions pass harmlessly through your body each day, and a typical neutrino could traverse a layer of lead a light-year thick unscathed. From their discovery in the mid–20th century, neutrinos were assumed to weigh nothing at all, but experiments in the 1990s showed they do have some mass—although physicists still do not know exactly how much. Stranger still, they come in three known varieties, or flavors—electron neutrinos, muon neutrinos and tau neutrinos—and, most bizarrely, can transform from one flavor to another. Because of these oddities and others, many physicists have been betting on neutrinos to open the door to the next frontier in physics.

    Now some think the door has cracked ajar. The discovery comes from 15 years’ worth of data gathered by the Mini Booster Neutrino Experiment (MiniBooNE) at Fermi National Accelerator Laboratory in Batavia, Ill. MiniBooNE detects and characterizes neutrinos by the flashes of light they occasionally create when they strike atomic nuclei in a giant vat filled with 800 tons of pure mineral oil. Its design is similar to that of an earlier project, the Liquid Scintillator Neutrino Detector (LSND) at Los Alamos National Laboratory in New Mexico. In the 1990s LSND observed a curious anomaly, a greater-than-expected number of electron neutrinos in a beam of particles that started out as muon neutrinos; MiniBooNE has now seen the same thing, in a neutrino beam generated by one of Fermilab’s particle accelerators.

    Because muon neutrinos could not have transformed directly into electron flavor over the short distance of the LSND experiment, theorists at the time proposed that some of the particles were oscillating into a fourth flavor—a “sterile neutrino”—and then turning into electron neutrinos, producing the mysterious excess. Although the possibility was tantalizing, many physicists assumed the findings were a fluke, caused by some mundane error particular to LSND. But now that MiniBooNE has observed the very same pattern, scientists are being forced to reckon with potentially more profound causes for the phenomenon. “Now you have to really say you have two experiments seeing the same physics effect, so there must be something fundamental going on,” says MiniBooNE co-spokesperson Richard Van de Water of Los Alamos. “People can’t ignore this anymore.”
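    The oscillation physics behind this interpretation is usually written with the standard two-flavor appearance formula, P(νμ→νe) = sin²(2θ)·sin²(1.27 Δm²[eV²] L[km]/E[GeV]). A minimal sketch of that formula, using illustrative round numbers (a ~1 eV² mass splitting probed at a MiniBooNE-like baseline and beam energy; these are assumptions, not the experiments' fitted values):

    ```python
    import math

    def oscillation_probability(sin2_2theta, delta_m2_ev2, L_km, E_GeV):
        """Two-flavor appearance probability P(nu_mu -> nu_e):
        P = sin^2(2*theta) * sin^2(1.27 * dm^2[eV^2] * L[km] / E[GeV])."""
        phase = 1.27 * delta_m2_ev2 * L_km / E_GeV
        return sin2_2theta * math.sin(phase) ** 2

    # Illustrative round numbers, NOT fitted values: a ~1 eV^2 splitting
    # probed over ~0.54 km with a ~0.8 GeV beam.
    p = oscillation_probability(sin2_2theta=0.01, delta_m2_ev2=1.0,
                                L_km=0.54, E_GeV=0.8)
    print(f"P(nu_mu -> nu_e) = {p:.4f}")
    ```

    The key point the short baseline exploits: for the tiny mass splittings among the three known flavors, the sine-squared phase is essentially zero over half a kilometre, so any appearance signal at this distance points to a much larger splitting, i.e. a fourth state.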

    The MiniBooNE team submitted its findings on May 30 to the preprint server arXiv, and is presenting them this week at the XXVIII International Conference on Neutrino Physics and Astrophysics in Heidelberg, Germany.

    A Fourth Flavor

    Sterile neutrinos are an exciting prospect, but outside experts say it is too early to conclude such particles are behind the observations. “If it is sterile neutrinos, it’d be revolutionary,” says Mark Thomson, a neutrino physicist and chief executive of the U.K.’s Science and Technology Facilities Council who was not part of the research. “But that’s a big ‘if.’”

    This new flavor would be called “sterile” because the particles would not feel any of the forces of nature, save for gravity, which would effectively block off communication with the rest of the particle world. Even so, they would still have mass, potentially making them an attractive explanation for the mysterious “dark matter” that seems to contribute additional mass to galaxies and galaxy clusters. “If there is a sterile neutrino, it’s not just some extra particle hanging out there, but maybe some messenger to the universe’s ‘dark sector,’” Van de Water says. “That’s why this is really exciting.” Yet the sterile neutrinos that might be showing up at MiniBooNE seem to be too light to account for dark matter themselves—rather they might be the first vanguard of a whole group of sterile neutrinos of various masses. “Once there is one [sterile neutrino], it begs the question: How many?” says Kevork Abazajian, a theoretical physicist at the University of California, Irvine. “They could participate in oscillations and be dark matter.”

    The findings are hard to interpret, however, because if neutrinos are transforming into sterile neutrinos in MiniBooNE, then scientists would expect to measure not just the appearance of extra electron neutrinos, but a corresponding disappearance of the muon neutrinos they started out as, balanced like two sides of an equation. Yet MiniBooNE and other experiments do not see such a disappearance. “That’s a problem, but it’s not a huge problem,” says theoretical physicist André de Gouvêa of Fermilab. “The reason this is not slam-dunk evidence against the sterile neutrino hypothesis is that [detecting] disappearance is very hard. You have to know exactly how much you had at the beginning, and that’s a challenge.”

    Another Mystery?

    Or perhaps MiniBooNE has discovered something big, but not sterile neutrinos. Maybe some other new aspect of the universe is responsible for the unexpected pattern of particles in the experiment’s beam. “Right now people are thinking about whether there are other new phenomena out there that could resolve this ambiguity,” de Gouvêa says. “Maybe the neutrinos have some new force that we haven’t thought about, or maybe the neutrinos decay in some funny way. It kind of feels like we haven’t hit the right hypothesis yet.”

    Unusually, this is one mystery physicists will not have to wait too long to solve. Another experiment at Fermilab called MicroBooNE was designed to follow MiniBooNE and will be able to study the excess more closely.


    One drawback of MiniBooNE is that it cannot be sure the flashes of light it sees are truly coming from neutrinos—it is possible that some unknown process is producing an excess of photons that mimic the neutrino signal. MicroBooNE, which should deliver its first data later this year, can distinguish between neutrino signals and impostors. If the signal turns out to be an excess of ordinary photons, rather than electron neutrinos, then all bets are off. “We don’t know what would do that in terms of physics, but if it is due to photons, we know that this sterile neutrino interpretation is not correct,” de Gouvêa says.

    In addition to MicroBooNE, Fermilab is building two other detectors to sit on the same beam of neutrinos and work in concert to study the neutrino oscillations going on there. Known collectively as the Short-Baseline Neutrino Program, the new system should be up and running by 2020 and could deliver definitive data in the early part of that decade, says Steve Brice, head of Fermilab’s Neutrino Division.

    FNAL Short baseline neutrino detector

    Until then physicists will continue to debate the mysteries of neutrinos—a field that is growing in size and excitement every year. The meeting happening now in Heidelberg, for example, is the largest neutrino conference ever. “It’s been a steady ramp-up over the last decade,” Brice says. “It’s an area that’s hard to study, but it’s proving to be a very fruitful field for physics.”

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

  • richardmitnick 2:32 pm on June 22, 2018
    Tags: A SINFONI of Exoplanets

    From ESOblog: “A SINFONI of Exoplanets” 


    Science Snapshots

    22 June 2018

    Exoplanets have fast become a huge research area and astronomers are now trying to study their atmospheres. The possibility of finding an exoplanet with an atmosphere that may be able to support life is incredibly exciting. We spoke to Jens Hoeijmakers, from the Geneva Observatory and the Center for Space Habitability in Bern, Switzerland, to find out more about these distant worlds.

    Q: Let’s start simple: what is an exoplanet?

    A: Since 1995, we have known that many stars other than the Sun have their own “solar systems,” with the majority of stars hosting one or multiple planets. Exoplanetary systems come in all shapes and colours, meaning that they are very diverse. Astronomers have discovered planets ranging from gas giants to smaller, rocky planets. Some planets orbit far away from their star like the gas and ice giants in our Solar System, and some orbit very closely, with surface temperatures greater than 1000°C.

    Q: Why do you think it’s important and exciting to study exoplanets?

    A: The study of exoplanets has grown into a major branch of astronomy in the past two decades. We now know of the existence of thousands of exoplanets, and this has shown that planets may even be more common than stars in our Universe! This ubiquitous presence of planets all around us raises the question of whether it’s possible for extraterrestrial life to exist. This is a major driving force behind the continued search for exoplanets and the detailed study of those that we’ve already discovered. But besides the exciting prospect of discovering life, the exoplanet population also gives us a unique window into understanding our own Solar System and the possible outcomes of the same planet formation processes that have made our Solar System the way that we see it today — essentially, studying exoplanets can help us understand how we got to be here.

    This composite image represents the close environment of Beta Pictoris as seen in near-infrared light. A very careful subtraction of the much brighter stellar halo reveals this very faint environment. The outer part of the image shows the light reflected off the dust disc, as observed with ADONIS on the ESO 3.6-metre telescope. The inner part shows the innermost region of the system, as seen with NACO on the Very Large Telescope.
    Credit: ESO/A.-M. Lagrange et al.

    ADONIS Infrared Cameras


    Q: Your research looked at one exoplanet in particular: Beta Pictoris b. Why did you choose to look at this system?

    A: Beta Pictoris b is maybe the most famous directly-imaged exoplanet, meaning that astronomers have managed to actually take a snapshot of the planet rather than infer its existence through its indirect effect on its star, as is most commonly done. Beta Pictoris b orbits a bright star about 70 light-years away from Earth, is in a system about 20 to 25 million years old and has a fairly hot surface, about 1700°C.

    Beta Pictoris b is one of the easier (but still challenging) planets to image directly because it’s young and hot enough to be observed at infrared wavelengths. When stars and planets form in a large disk of gas and dust, known as the protoplanetary disk, the material from which the planets form is very hot. This means that newborn planets start off with very high temperatures, and throughout the first tens of millions of years of their lives, they slowly cool down as they radiate this heat away, making these planets visible at infrared wavelengths. This is the class of planets that we can directly image, and Beta Pictoris b is a typical example of such a young planet, which is why we chose to observe it — and, indeed, why it is one of the most famous directly imaged exoplanets.
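    The link between youth, temperature and infrared visibility follows from Wien's displacement law: a body at Beta Pictoris b's quoted ~1700°C surface temperature radiates most strongly in the near-infrared. A quick sketch (treating the planet as an ideal blackbody, which is only an approximation):

    ```python
    # Wien's displacement law: lambda_peak = b / T (ideal blackbody assumed).
    b_wien = 2.898e-3            # Wien's displacement constant, metre-kelvin
    T_kelvin = 1700 + 273.15     # the ~1700 C surface temperature quoted above
    lambda_peak_um = b_wien / T_kelvin * 1e6
    print(f"Peak thermal emission near {lambda_peak_um:.2f} micrometres")
    ```

    A peak near 1.5 micrometres falls squarely in the near-infrared bands that instruments like SINFONI observe, which is why young, still-cooling planets are the natural targets for direct imaging.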

    Q: How did you observe Beta Pictoris b and what were you aiming to find?

    A: We used existing data of the planet from the SINFONI spectrograph on ESO’s Very Large Telescope located at the Paranal Observatory in Chile. Our aim was actually to test out the instrument — to investigate to what extent an adaptive-optics-assisted integral field spectrograph like SINFONI can be used to study an exoplanet’s atmosphere.


    SINFONI is a special instrument. Not only does it perform the high-contrast imaging necessary to separately image the planet from its brighter host star, but it also simultaneously generates a spectrum of each pixel in that image at a high enough resolution. This allows us to see absorption lines in the spectrum of the planet. These absorption lines are what tell us about the chemicals in the planet’s atmosphere, and also about the planet’s temperature and other physical parameters. In fact, our new technique relies on the fact that the planet’s spectrum has absorption lines that are not present in the star that it orbits. This helps us disentangle the planet from its much brighter star, effectively increasing the contrast on top of the already high-contrast imaging from SINFONI. The instrument was not actually designed to be used in this way, so we’re the first to apply this technique.
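    One way to picture this disentangling step is as correlating each pixel's spectrum with a template of molecular absorption lines: a pixel containing the planet correlates strongly, while the smooth stellar glare does not. The toy sketch below illustrates the idea only; the line positions, amplitudes and noise level are invented, and this is not the team's actual reduction pipeline:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000
    x = np.arange(n)

    # Invented molecular template: a comb of absorption lines.
    template = np.zeros(n)
    template[::37] = -1.0

    # One pixel dominated by a smooth star-like continuum plus noise...
    star_pixel = 100.0 + 0.01 * x + rng.normal(0.0, 0.3, n)
    # ...and a "planet" pixel: the same glare plus a faint copy of the lines.
    planet_pixel = star_pixel + 0.5 * template

    def line_score(spectrum):
        """Correlate the continuum-subtracted spectrum with the template."""
        continuum = np.poly1d(np.polyfit(x, spectrum, 1))(x)
        return float(np.dot(spectrum - continuum, template)
                     / np.linalg.norm(template))

    planet_score = line_score(planet_pixel)
    star_score = line_score(star_pixel)
    print(planet_score, star_score)  # the planet pixel scores much higher
    ```

    Because the star's spectrum lacks the planet's lines, the template picks the planet signal out of glare that is otherwise overwhelming, effectively boosting the contrast beyond what the imaging alone provides.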

    The only other instrument in the world that can currently perform this type of research is the OSIRIS spectrograph at the Keck Observatory in Hawaii.

    UCO Keck OSIRIS being installed

    It is very similar to SINFONI but is located at a much more northern latitude, meaning that SINFONI and OSIRIS can access complementary parts of the sky.

    Molecular maps of carbon monoxide (left) and water (right) around Beta Pictoris. Beta Pictoris b is starkly visible in the lower right side of both maps. The axes give x- and y-position in arcseconds. Credit: J. Hoeijmakers.

    Yepun, the fourth Unit Telescope of the VLT, is angled at a very low altitude, revealing the cell holding its main mirror and the SINFONI integral-field spectrograph.
    Credit: ESO

    Q: So what did you and your team find out?

    A: First of all, our analysis of the existing dataset confidently shows the presence of water and carbon monoxide in the atmosphere of Beta Pictoris b. This in itself is not a new result because both species were known (and expected) to be present. However, it is the first time that a high-contrast imaging instrument has been used to directly detect these absorption lines in an exoplanet’s atmosphere, thereby uniquely and robustly confirming their presence.

    Q: Did you face any challenges during your research?

    A: Our analysis was quite challenging because these observations were experimental. SINFONI is not tuned for these kinds of observations, so when the data was initially taken in 2014, it was quickly deemed too challenging even for the detection of the planet, let alone a measurement of its spectrum. In the case of this dataset, our method is more sensitive to the planet, but we also had to overcome the fact that the instrument is simply not designed to image a very faint planet next to a very bright star. This is why we strongly advocate that future, SINFONI-like instruments (such as the planned HARMONI instrument on ESO’s Extremely Large Telescope) should be outfitted with a coronagraph, which blocks out much of the starlight, making such observations even more powerful.

    This composite image shows the movement of Beta Pictoris b around its star, observed by the NACO on the VLT over six years.
    Credit: ESO/A.-M. Lagrange

    Q: What do you personally find most exciting about this research?

    A: This is a clear example of using an existing dataset and instrument in a completely new way and finding exciting results. I think that there is no reason why the same analysis and observations couldn’t have been carried out 10 years ago, achieving the same results — and something similar is true for the entire field of exoplanets! The first exoplanets could have been discovered with technology that was already available over a decade earlier if only astronomers had taken the possibility of the existence of hot Jupiters seriously. That’s why I sometimes wonder what other new discoveries or applications of existing facilities are still hiding under our noses right now.

    Q: What might this research lead to in the future? And what are the next big steps in the field?

    A: The strength of our signal spells good news for the future, when new instruments will come online that are similar to SINFONI but much more powerful in terms of contrast and spectral resolution. For instance, our result came from over two hours of observations with SINFONI, but we calculated that the same result could be obtained using the Extremely Large Telescope in only 90 seconds — for a planet like Beta Pictoris b that is five times closer to its host star! In this sense, our result is a clear demonstration of this analysis technique and should encourage ongoing development of these future instruments, especially for making them suitable for the high-contrast imaging of exoplanets.

    See the full article here.




    ESO is the foremost intergovernmental astronomy organisation in Europe and the world’s most productive ground-based astronomical observatory by far. It is supported by 16 countries: Austria, Belgium, Brazil, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Poland, Portugal, Spain, Sweden, Switzerland and the United Kingdom, along with the host state of Chile. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research. ESO operates three unique world-class observing sites in Chile: La Silla, Paranal and Chajnantor. At Paranal, ESO operates the Very Large Telescope, the world’s most advanced visible-light astronomical observatory and two survey telescopes. VISTA works in the infrared and is the world’s largest survey telescope and the VLT Survey Telescope is the largest telescope designed to exclusively survey the skies in visible light. ESO is a major partner in ALMA, the largest astronomical project in existence. And on Cerro Armazones, close to Paranal, ESO is building the 39-metre European Extremely Large Telescope, the E-ELT, which will become “the world’s biggest eye on the sky”.

    ESO LaSilla
    ESO/Cerro LaSilla 600 km north of Santiago de Chile at an altitude of 2400 metres.

    VLT at Cerro Paranal, with an elevation of 2,635 metres (8,645 ft) above sea level.

    ESO Vista Telescope
    ESO/Vista Telescope at Cerro Paranal, with an elevation of 2,635 metres (8,645 ft) above sea level.

    ESO/NTT at Cerro LaSilla 600 km north of Santiago de Chile at an altitude of 2400 metres.

    ESO VLT Survey telescope
    VLT Survey Telescope at Cerro Paranal with an elevation of 2,635 metres (8,645 ft) above sea level.

    ALMA Array
    ALMA on the Chajnantor plateau at 5,000 metres.

    ESO/E-ELT to be built at Cerro Armazones at 3,060 m.

    APEX Atacama Pathfinder 5,100 meters above sea level, at the Llano de Chajnantor Observatory in the Atacama desert.

    Leiden MASCARA instrument, La Silla, located in the southern Atacama Desert 600 kilometres (370 mi) north of Santiago de Chile at an altitude of 2,400 metres (7,900 ft)

    Leiden MASCARA cabinet at ESO Cerro la Silla located in the southern Atacama Desert 600 kilometres (370 mi) north of Santiago de Chile at an altitude of 2,400 metres (7,900 ft)

    ESO Next Generation Transit Survey at Cerro Paranal, 2,635 metres (8,645 ft) above sea level

    SPECULOOS four 1m-diameter robotic telescopes 2016 in the ESO Paranal Observatory, 2,635 metres (8,645 ft) above sea level

    ESO TAROT telescope at Paranal, 2,635 metres (8,645 ft) above sea level

    ESO ExTrA telescopes at Cerro LaSilla at an altitude of 2400 metres

  • richardmitnick 1:50 pm on June 22, 2018
    Tags: Where are they?

    From SETI Institute: “If Extraterrestrials are out there, why haven’t we found them?” 

    From SETI Institute

    Jun 18, 2018
    Seth Shostak, Senior Astronomer

    The Fermi Paradox, named for Dr. Enrico Fermi, describes the apparent contradiction between the lack of evidence of extraterrestrial civilizations and the high probability that such alien life exists. AP

    “Where is everybody?”

    For those who want to understand why we haven’t found any space aliens, the Fermi Paradox is as popular as cheeseburgers. First proposed by physicist Enrico Fermi in 1950, this perennial head-scratcher rests on the idea that it would take only a few tens of millions of years for an advanced civilization to colonize the Milky Way — leaving their mark on every last star system in the galaxy.
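    That "few tens of millions of years" figure is a back-of-the-envelope estimate rather than a precise result. A hedged sketch of how it arises, where every input is an assumed round number:

    ```python
    # Back-of-the-envelope galactic colonization time; every input is an
    # assumption, chosen only to show the order of magnitude.
    galaxy_diameter_ly = 100_000   # rough diameter of the Milky Way's disk
    probe_speed_c = 0.05           # assume starships at 5% of light speed
    stopover_factor = 10           # assume time lost settling and building
                                   # new ships at each stop along the way

    crossing_time_yr = galaxy_diameter_ly / probe_speed_c
    total_yr = crossing_time_yr * stopover_factor
    print(f"~{total_yr / 1e6:.0f} million years to span the galaxy")
    ```

    Even with generous allowances for slow ships and long stopovers, the answer stays in the tens of millions of years, a blink compared with the galaxy's multi-billion-year age, which is what gives the paradox its force.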

    So why hasn’t some ambitious race of aliens done that? After all, the Milky Way is three times older than Earth, so they’ve had plenty of opportunity to finish the project. We should see outposts of someone’s galactic empire in every direction. Why don’t we?

    As Fermi put it, “Where is everybody?”

    A Russian physicist named A.A. Berezin recently addressed this cosmic conundrum in a short paper. He thinks he knows why we haven’t espied aliens. Mind you, he’s not the first. The Fermi Paradox has prompted dozens if not hundreds of explanations. One possibility is that colonizing the galaxy is simply too costly. Or maybe alien societies are out there, but we lack the instruments to find them. Others favor the idea that extraterrestrials find Homo sapiens inconsequential and juvenile — so they keep a low profile and avoid us.

    Berezin suggests something else. He presumes that at some point in the 13.8 billion years since the Big Bang, an extraterrestrial civilization managed to develop the capability to travel between the stars. Soon thereafter, they embarked on a project to spread out. But as they — or their robot underlings — took over the galaxy, they eradicated everyone else. Some of this might have been inadvertent, in the same way that construction crews mindlessly obliterate ants.

    Does this sound like a variation on Douglas Adams’ “Hitchhiker’s Guide to the Galaxy,” in which Earth is unintentionally destroyed to make way for a hyperspace bypass? Well, it’s the same basic idea. But unlike Adams’ story, Berezin’s doesn’t make much sense. To begin with, it’s unclear how this suggestion really differs from the original paradox. If some ancient society of Galactans took over our galaxy (and maybe all the nearby galaxies too — there’s been time enough), why don’t we see evidence of that?

    By 200 A.D., the Roman Empire had infested nearly all the lands edging the Mediterranean. If you were living within the empire, you’d definitely know it — you could find fluted architecture just about everywhere. So if the Galactans have been all over the place, why don’t we notice? In addition, these hypothesized alien colonists couldn’t just sweep through the Milky Way once and leave it at that. A new species — such as Homo sapiens — might arise at any time, offering a new challenge to imperial dominance and forcing the Galactans to clean house again.

    Keeping control of the galaxy would be an endless project, and one that couldn’t be managed from some central “headquarters.” Even at the speed of light, it takes tens of thousands of years to get from one random spot in the Milky Way to another. Compare that to the response time for Rome — the time between learning that there was trouble afoot and getting their armies in place to confront it. That was typically weeks, not tens of thousands of years.

    Ask yourself: Would the Roman Empire have existed if the legions took centuries or more to trudge to Germania every time the troublesome Alemanni crossed the Rhine? Germania would cease being Roman before you could say “barbarian.”

    It seems clear that Galactans would have to adopt the Roman strategy: Station some defensive infrastructure throughout the Milky Way so it’s possible to deal with problems quickly. Sounds easy, but it would present a difficult logistical problem. How do you adequately maintain and update such a massive network when travel times are measured in millennia?

    Berezin’s idea of how to resolve the puzzle presented by the Fermi Paradox seems neither more convincing nor more plausible than many of the others. It replaces one paradox with another by arguing that the galaxy is, indeed, inhabited everywhere by a pervasive culture that presumably sprang up billions of years ago but somehow manages to evade all our detection efforts.

    The paradox continues to fuel many lunchtime conversations, which at least is a nice diversion from gossip or politics. But if we someday find a signal from space, Fermi’s question will become nothing more than an historical curiosity — a bit of misplaced musing that confounded Homo sapiens for a few decades.

    Meanwhile the aliens — and who could doubt they exist? — keep their own company.

    Originally published at https://www.nbcnews.com/mach/science/if-space-aliens-are-out-there-why-haven-t-we-ncna881951

    See the full article here.


    SETI Institute – 189 Bernardo Ave., Suite 100
    Mountain View, CA 94043
    Phone 650.961.6633 – Fax 650-961-7099

  • richardmitnick 1:25 pm on June 22, 2018

    From Brookhaven Lab: “Upgrades to ATLAS and LHC Magnets for Run 2 and Beyond” 



    Peter Genzer, (631) 344-3174

    The following news release was issued by CERN, the European Organization for Nuclear Research, home to the Large Hadron Collider (LHC). Scientists from the U.S. Department of Energy’s Brookhaven National Laboratory play multiple roles in the research at the LHC and are making major contributions to the high-luminosity upgrade described in this news release, including the development of new niobium tin superconducting magnets that will enable significantly higher collision rates; new particle tracking and signal readout systems for the ATLAS experiment that will allow scientists to capture and analyze the most significant details from vastly larger data sets; and increases in computing capacity devoted to analyzing and sharing that data with scientists around the world. Brookhaven Lab also hosts the Project Office for the U.S. contribution to the HL-LHC detector upgrades of the ATLAS experiment. For more information about Brookhaven’s roles in the high-luminosity upgrade or to speak with a Brookhaven/LHC scientist, contact Karen McNulty Walsh, (631) 344-8350, kmcnulty@bnl.gov.

    Brookhaven physicists play critical roles in LHC restart and plans for the future of particle physics.

    The ATLAS detector at the Large Hadron Collider, an experiment with large involvement from physicists at Brookhaven National Laboratory. Image credit: CERN

    July 6, 2015

    At the beginning of June, the Large Hadron Collider at CERN, the European research facility, began smashing together protons once again. The high-energy particle collisions taking place deep underground along the border between Switzerland and France are intended to allow physicists to probe the furthest edges of our knowledge of the universe and its tiniest building blocks.

    The Large Hadron Collider returns to operations after a two-year offline period, Long Shutdown 1, which allowed thousands of physicists worldwide to undertake crucial upgrades to the already cutting-edge particle accelerator. The LHC now begins its second multi-year operating period, Run 2, which will take the collider through 2018 with collision energies nearly double those of Run 1. In other words, Run 2 will nearly double the energies that allowed researchers to detect the long-sought Higgs Boson in 2012.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    The U.S. Department of Energy’s Brookhaven National Laboratory is a crucial player in the physics program at the Large Hadron Collider, in particular as the U.S. host laboratory for the pivotal ATLAS experiment, one of the two large experiments that discovered the Higgs. Physicists at Brookhaven were busy throughout Long Shutdown 1, undertaking projects designed to maximize the LHC’s chances of detecting rare new physics as the collider reaches into a previously unexplored subatomic frontier.

    While the technology needed to produce a new particle is a marvel on its own terms, equally remarkable is everything the team at ATLAS and other experiments must do to detect these potentially world-changing discoveries. Because the production of such particles is a rare phenomenon, it isn’t enough to just be able to smash one proton into another. The LHC needs to be able to collide proton bunches, each bunch consisting of hundreds of billions of particles, every 50 nanoseconds—an interval shrinking to 25 nanoseconds in Run 2—and be ready to sort through the colossal amounts of data that all those collisions produce.
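    Those bunch spacings translate directly into the crossing rates the detectors must handle. A quick sketch, where the rate is simply the reciprocal of the spacing quoted in the text:

    ```python
    # Bunch-crossing rates implied by the quoted spacings.
    run1_spacing_s = 50e-9   # 50 nanoseconds between bunch crossings
    run2_spacing_s = 25e-9   # Run 2 target: 25 nanoseconds

    run1_rate_hz = 1 / run1_spacing_s
    run2_rate_hz = 1 / run2_spacing_s
    print(f"Run 1: {run1_rate_hz / 1e6:.0f} million crossings/s; "
          f"Run 2: {run2_rate_hz / 1e6:.0f} million crossings/s")
    ```

    Tens of millions of crossings per second is far more than can ever be written to disk, which is why the trigger and data-management systems described below matter as much as the accelerator itself.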

    It is with those interwoven challenges—maximizing the number of collisions within the LHC, capturing the details of potentially noteworthy collisions, and then managing the gargantuan amount of data those collisions produce—that scientists at Brookhaven National Laboratory are making their mark on the Large Hadron Collider and its search for new physics—and not just for the current Run 2, but looking forward to the long-term future operation of the collider.

    Restarting the Large Hadron Collider



    Brookhaven physicist Srini Rajagopalan, operation program manager for U.S. ATLAS, works to keep manageable the colossal amounts of data that are generated by the Large Hadron Collider and sent to Brookhaven’s RHIC and ATLAS Computing Facility.

    The Large Hadron Collider is the largest single machine in the world, so it’s tempting to think of its scale just in terms of its immense size. The twin beamlines of the particle accelerator sit about 300 to 600 feet underground in a circular tunnel more than 17 miles around. Over 1,600 magnets, each weighing more than 25 tons, are required to keep the beams of protons focused and on the correct paths, and nearly 100 tons of liquid helium is necessary to keep the magnets operating at temperatures barely above absolute zero. Then there are the detectors, each of which stands several stories high.

    But the scale of the LHC extends not just in space, but in time as well. A machine of this size and complexity doesn’t just switch on or off with the push of a button, and even relatively simple maintenance can require weeks, if not months, to perform. That’s why the LHC recently completed Long Shutdown 1, a two-year offline period in which physicists undertook the necessary repairs and upgrades to get the collider ready for the next three years of near-continuous operation. As the U.S. host laboratory for the ATLAS experiment, Brookhaven National Laboratory was pivotal in upgrading and improving one of the cornerstones of the LHC apparatus.

    “After having run for three years, the detector needs to be serviced much like your car,” said Brookhaven physicist Srini Rajagopalan, operation program manager for U.S. ATLAS. “Gas leaks crop up that need to be fixed. Power supplies, electronic boards and several other components need to be repaired or replaced. Hence a significant amount of detector consolidation work occurs during the shutdown to ensure an optimal working detector when beam returns.”

    Beyond these vital repairs, the major goal of the upgrade work during Long Shutdown 1 was to increase the LHC’s center-of-mass energy from the previous 8 trillion electron volts (TeV) to 13 TeV, near the operational maximum of 14 TeV.

    “Upgrading the energy means you’re able to probe much higher mass ranges, and you have access to new particles that might be substantially heavier,” said Rajagopalan. “If you have a very heavy particle that cannot be produced, it doesn’t matter how much data you collect, you just cannot reach that. That’s why it was very important to go from 8 to 13 TeV. Doubling the energy allows us to access the new physics much more easily.”

    As the LHC probes higher and higher energies, the phenomena that the researchers hope to observe will happen more and more rarely, meaning the particle beams need to create many more collisions than they did before. Beyond this increase in collision rates, or luminosity, however, the entire infrastructure of data collection and management has to evolve to deal with the vastly increased volume of information the LHC can now produce.

    “Much of the software had to be evolved or rewritten,” said Rajagopalan, “from patches and fixes that are more or less routine software maintenance to implementing new algorithms and installing new complex data management systems capable of handling the higher luminosity and collision rates.”

    Making More Powerful Magnets

    Brookhaven physicist Peter Wanderer, head of the laboratory’s Superconducting Magnet Division, stands in front of the oven in which niobium tin is made into a superconductor.

    The Large Hadron Collider works by accelerating twin beams of protons to speeds close to that of light. The two beams, traveling in opposite directions along the path of the collider, both contain many bunches of protons, with each bunch containing about 100 billion protons. When the bunches meet, not all of the protons inside them interact, and only a tiny fraction of the colliding bunches are likely to yield potentially interesting physics. As such, it’s absolutely vital to control those beams to maximize the chances of useful collisions occurring.

    The best way to achieve that and the desired increase in luminosity—both during the current Run 2, and looking ahead to the long-term future of the LHC—is to tighten the focus of the beam. The more tightly packed protons are, the more likely they’ll smash into each other. This means working with the main tool that controls the beam inside the accelerator: the magnets.
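    Why a tighter beam pays off so directly can be sketched with the standard luminosity formula for head-on collisions of Gaussian beams, L = f·n_b·N²/(4π·σx·σy). The parameter values below are roughly the LHC's nominal design figures, quoted approximately and only for illustration:

```python
import math

def luminosity(n_protons, n_bunches, rev_freq_hz, sigma_x_cm, sigma_y_cm):
    """Instantaneous luminosity (cm^-2 s^-1) for head-on collisions of
    Gaussian bunches: L = f * n_b * N^2 / (4 * pi * sigma_x * sigma_y)."""
    return rev_freq_hz * n_bunches * n_protons**2 / (4 * math.pi * sigma_x_cm * sigma_y_cm)

# Roughly nominal LHC design values (approximate, for illustration only):
# ~1.15e11 protons per bunch, 2808 bunches, 11.245 kHz revolution frequency,
# ~16.7 micron transverse beam size at the interaction point.
L = luminosity(1.15e11, 2808, 11245, 16.7e-4, 16.7e-4)
print(f"L ~ {L:.2e} cm^-2 s^-1")   # on the order of 1e34

# Halving the beam size quadruples the luminosity -- the payoff of focusing:
L_half = luminosity(1.15e11, 2808, 11245, 8.35e-4, 8.35e-4)
print(f"ratio: {L_half / L:.1f}")
```

    The quadratic dependence on beam size is why the magnet upgrades described below target stronger focusing rather than, say, more protons per bunch.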

    Magnets such as this one, mounted on a test stand at Fermilab, are being developed for the High-Luminosity LHC. Photo: Reidar Hahn

    “Most of the length of the circumference along a circular machine like the LHC is taken up with a regular sequence of magnets,” said Peter Wanderer, head of Brookhaven Lab’s Superconducting Magnet Division, which made some of the magnets for the current LHC configuration and is working on new designs for future upgrades. “The job of these magnets is to bend the proton beams around to the next point or region where you can do something useful with them, like produce collisions, without letting the beam get larger.”

    A beam of protons is a collection of positively charged particles that all repel one another, so they tend to spread apart, he explained. Physicists use magnetic fields to keep the particles from drifting away from the desired path.

    “You insert different kinds of magnets, different sequences of magnets, in order to make the beams as small as possible, to get the most collisions possible when the beams collide,” Wanderer said.

    The magnets currently in use in the LHC are made of the superconducting material niobium titanium (NbTi). When the electromagnets are cooled in liquid helium to temperatures of about 4 Kelvin (-452.5 degrees Fahrenheit), they lose all electric resistance and are able to achieve a much higher current density compared with a conventional conductor like copper. A magnetic field gets stronger as its current is more densely packed, meaning a superconductor can produce a much stronger field over a smaller radius than copper.

    But there’s an upper limit to how high a field the present niobium titanium superconductors can reach. So Wanderer and his team at Brookhaven have been part of a decade-long project to refine the next generation of superconducting magnets for a future upgrade to the LHC. These new magnets will be made from niobium tin (Nb3Sn).

    “Niobium tin can go to higher fields than niobium titanium, which will give us even stronger focusing,” Wanderer said. “That will allow us to get a smaller beam, and even more collisions.” Niobium tin can also function at a slightly higher temperature, so the new magnets will be easier to cool than those currently in use.

    There are a few catches. For one, niobium tin, unlike niobium titanium, isn’t initially superconducting. The team at Brookhaven has to first heat the material for two days at 650 degrees Celsius (1200 degrees Fahrenheit) before beginning the process of turning the raw materials into the wires and cables that make up an electromagnet.

    “And when niobium tin becomes a superconductor, then it’s very brittle, which makes it really challenging,” said Wanderer. “You need tooling that can withstand the heat for two days. It needs to be very precise, to within thousandths of an inch, and when you take it out of the tooling and want to put it into a magnet, and wrap it with iron, you have to handle it very carefully. All that adds a lot to the cost. So one of the things we’ve worked out over 10 years is how to do it right the first time, almost always.”

    Fortunately, there’s still time to work out any remaining kinks. The new niobium tin magnets aren’t set to be installed at the LHC until around 2022, when the changeover from niobium titanium to niobium tin will be a crucial part of converting the Large Hadron Collider into the High-Luminosity Large Hadron Collider (HL-LHC).

    Managing Data at Higher Luminosity

    As the luminosity of the LHC increases in Run 2 and beyond, perhaps the biggest challenge facing the ATLAS team at Brookhaven lies in recognizing a potentially interesting physics event when it occurs. That selectivity is crucial, because even CERN’s worldwide computing grid—which includes about 170 global sites, and of which Brookhaven’s RHIC and ATLAS Computing Facility is a major center—can record only the tiniest fraction of the more than 100 million collisions that occur each second. That means it’s just as important to quickly recognize the millions of events that don’t need to be recorded as it is to recognize the handful that do.

    “What you have to do is, on the fly, analyze each event and decide whether you want to save it to disk for later use or not,” said Rajagopalan. “And you have to be careful you don’t throw away good physics events. So you’re looking for signatures. If it’s a good signature, you say, ‘Save it!’ Otherwise, you junk it. That’s how you bring the data rate down to a manageable amount you can write to disk.”

    Physicists screen out unwanted data using what’s known as a trigger system. The principle is simple: as the data from each collision comes in, it’s analyzed for a preset signature pattern, or trigger, that would mark it as potentially interesting.
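    The trigger idea can be sketched as a simple predicate applied to each event as it streams in. The event fields and thresholds here are invented for illustration; they are not ATLAS's actual trigger menu:

```python
# Toy sketch of a trigger: keep an event only if it matches a preset
# signature. Field names and thresholds are hypothetical illustrations,
# not the real ATLAS trigger criteria.
def passes_trigger(event):
    # e.g. "at least one high-momentum lepton, or large missing energy"
    return event["max_lepton_pt_gev"] > 25 or event["missing_et_gev"] > 100

events = [
    {"max_lepton_pt_gev": 3.1, "missing_et_gev": 12.0},   # junk -> reject
    {"max_lepton_pt_gev": 41.7, "missing_et_gev": 8.5},   # signature -> keep
    {"max_lepton_pt_gev": 6.4, "missing_et_gev": 180.0},  # signature -> keep
]

recorded = [e for e in events if passes_trigger(e)]
print(f"kept {len(recorded)} of {len(events)} events")
```

    The real system applies layers of such selections in hardware and software within microseconds, which is why a rejected event is, as Gordon notes, gone forever.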

    “We can change the trigger, or make the trigger more sophisticated to be more selective,” said Brookhaven’s Howard Gordon, a leader in the ATLAS physics program. “If we don’t select the right events, they are gone forever.”

    The current trigger system can handle the luminosities of Run 2, but with future upgrades it will no longer be able to screen out and reject enough collisions to keep the number of recorded events manageable. So the next generation of ATLAS triggers will have to be even more sophisticated in terms of what they can instantly detect—and reject.

    A more difficult problem comes with the few dozen events in each bunch of protons that look like they might be interesting, but aren’t.

    “Not all protons in a bunch interact, but it’s not necessarily going to be only one proton in a bunch that interacts with a proton from the opposite bunch,” said Rajagopalan. “You could have 50 of them interact. So now you have 50 events on top of each other. Imagine the software challenge when just one of those is the real, new physics we’re interested in discovering, but you have all these 49 others—junk!—sitting on top of it.”

    “We call it pileup!” Gordon quipped.

    Finding one good result among 50 is tricky enough, but in 10 years that number will be closer to 1 in 150 or 200, with all those additional extraneous results interacting with each other and adding exponentially to the complexity of the task. Being able to recognize instantly as many characteristics of the desired particles as possible will go a long way to keeping the data manageable.
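    A Poisson model gives a feel for these pileup figures: the number of overlapping interactions per bunch crossing fluctuates around a mean set by the luminosity. The mean of 50 follows the example in the text; the independence assumption is a simplification:

```python
import math

def poisson_pmf(k, mu):
    """Probability of exactly k interactions in one crossing, mean mu,
    assuming interactions occur independently."""
    return math.exp(-mu) * mu**k / math.factorial(k)

mu = 50  # mean pileup, matching the example quoted in the text
p_exactly_50 = poisson_pmf(50, mu)
p_more_than_70 = 1 - sum(poisson_pmf(k, mu) for k in range(71))

print(f"P(exactly 50) = {p_exactly_50:.3f}")
print(f"P(more than 70) = {p_more_than_70:.4f}")
```

    Even at a mean of 50 the crossing-to-crossing spread is wide, so the reconstruction software must cope with occasional crossings far busier than average, and that spread grows with the mean as luminosity rises.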

    Further upgrades are planned over the next decade to cope with the ever-increasing luminosity and collision rates. For example, the Brookhaven team and collaborators will be working to develop an all-new silicon tracking system and a full replacement of the readout electronics with state-of-the-art technology that will allow physicists to collect and analyze ten times more data for LHC Run 4, scheduled for 2026.

    The physicists at CERN, Brookhaven, and elsewhere have strong motivation for meeting these challenges. Doing so will not only offer the best chance of detecting rare physics events and expanding the frontiers of physics, but will also allow the physicists to do it within a reasonable timespan.

    As Rajagopalan put it, “We are ready for the challenge. The next few years are going to be an exciting time as we push forward to explore a new uncharted energy frontier.”

    Brookhaven’s role in the LHC is supported by the DOE Office of Science.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition



    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

  • richardmitnick 3:29 pm on June 21, 2018 Permalink | Reply
    Tags: ALMA Discover Exciting Structures in a Young Protoplanetary Disk That Support Planet Formation, , , , , ,   

    From ALMA: “ALMA Discover Exciting Structures in a Young Protoplanetary Disk That Support Planet Formation” 

    ESO/NRAO/NAOJ ALMA Array in Chile in the Atacama at Chajnantor plateau, at 5,000 metres

    From ALMA

    20 June, 2018

    Ruobing Dong
    Steward Observatory, University of Arizona, USA
    Institute of Astronomy and Astrophysics, Academia Sinica, Taiwan
    +1 609 423 5625

    Nicolás Lira
    Education and Public Outreach Coordinator
    Joint ALMA Observatory, Santiago – Chile
    Phone: +56 2 2467 6519
    Cell phone: +56 9 9445 7726

    Masaaki Hiramatsu
    Education and Public Outreach Officer, NAOJ Chile, Tokyo – Japan
    +81 422 34 3630

    Charles E. Blue
    Public Information Officer
    National Radio Astronomy Observatory Charlottesville, Virginia – USA
    Phone: +1 434 296 0314
    Cell phone: +1 202 236 6324

    Richard Hook
    Public Information Officer, ESO
    Garching bei München, Germany
    Phone: +49 89 3200 6655
    Cell phone: +49 151 1537 3591

    ALMA image of the 0.87 mm continuum emission from the MWC 758 disk. Credit: ALMA (ESO/NAOJ/NRAO)/Dong et al.

    Since the early 2000s, rich structures, including gaps and rings, dust clumps, and spiral arm-like features, have been discovered in a few tens of disks surrounding newborn stars. Believing that planets are forming inside them, astronomers named these disks protoplanetary disks.

    The origin of these structures is in hot debate among astronomers. In one scenario, they are thought to be produced by unseen planets forming inside and gravitationally interacting with the host disks, as planets open gaps, shepherd dust clumps, and excite spiral arms.

    Alternative ways to produce the observed disk structures without invoking planets have also been proposed. For example, large central cavities may be the outcome of photoevaporation, in which high-energy radiation from the central star evaporates the inner disk. Also, under certain conditions shadows in disks may mimic the spiral arms seen in reflected light.

    The protoplanetary disk around the young star MWC 758 is located about 500 light-years from us. In 2012, a pair of nearly symmetric giant spiral arms was discovered in reflected light. In thermal dust emission and molecular gas line emission at millimeter wavelengths, a big inner hole and two major dust clumps have been found, too.

    Now, with the new ALMA image, the previously known cavity of MWC 758 is shown to be off-centered from the star, with its shape well described by an ellipse with one focus on the star. Also, a millimeter dust emission feature corresponds nicely to one of the two spiral arms previously seen in reflected light. Both discoveries are firsts among protoplanetary disks.

    “MWC 758 is a rare breed!” says Sheng-Yuan Liu at ASIAA, co-author of this study. “All major types of disk structures have been found in this system. It reveals to us one of the most comprehensive suites of evidence of planet formation in all protoplanetary disks.”

    Previously, in 2015, Dr. Dong and his collaborators proposed that the two arms in the MWC 758 disk could be explained as driven by a super-Jupiter planet just outside the disk.

    “Our new ALMA observations lend crucial support to planet-based origins for all the structures,” says Dr. Takayuki Muto at Kogakuin University, Japan, co-author of this research. “For example, it’s exciting to see an ellipse with one focus on the star. That’s Kepler’s first law! It points to a dynamical origin, possibly an interaction with planets.”
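    The Kepler-style geometry can be written down directly: in polar coordinates centered on a focus, an ellipse is r(θ) = a(1−e²)/(1+e·cosθ), and its geometric center sits a distance a·e from that focus. The values of a and e below are placeholders, not the measured MWC 758 cavity parameters:

```python
import math

def focal_radius(theta, a, e):
    """Distance from a focus to the ellipse, in the Kepler-style polar
    form r = a * (1 - e^2) / (1 + e * cos(theta))."""
    return a * (1 - e**2) / (1 + e * math.cos(theta))

# Placeholder cavity parameters (NOT the measured MWC 758 values):
a, e = 50.0, 0.1   # semi-major axis in AU, eccentricity

r_near = focal_radius(0.0, a, e)      # cavity edge closest to the star
r_far = focal_radius(math.pi, a, e)   # cavity edge farthest from the star
offset = a * e                        # ellipse center's offset from the star

print(f"cavity edge ranges from {r_near:.1f} to {r_far:.1f} AU")
print(f"ellipse center offset from star: {offset:.1f} AU")
```

    Fitting an observed cavity rim with this form and finding the focus on the star, rather than the ellipse's center, is what makes the dynamical interpretation natural.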

    The off-centered cavity, on the other hand, strongly disfavors alternative explanations such as photoevaporation, which has no azimuthal dependence.

    Various disk structures are marked. The green dotted contours mark the boundaries of the disk; the small circle at the center roughly marks the location of the star; the two green solid contours represent the extent of the two bright clumps; the solid, dotted and dashed white arcs trace out the inner, middle, and outer rings, respectively; and the arrow points out the spiral arm. The resolution (beam size, ~6.5 AU) of the image is labeled at the lower left corner. Credit: ALMA (ESO/NAOJ/NRAO)/Dong et al.

    The fact that the south spiral branch is present in the millimeter emission tracing the dust establishes that it is a density arm. Other scenarios, such as shadows, which treat the spiral arms as surface features, are not expected to reproduce the observations. The ultra-high resolution achieved in the new ALMA dataset also enables the detection of a slight offset between the arm locations in reflected light and in dust emission, which is consistent with models of planet-induced density waves.

    “These fantastic new details are only made possible thanks to the amazing angular resolution delivered by ALMA,” says co-author Eiji Akiyama at Hokkaido University, Japan. “We took full advantage of ALMA’s long baseline capabilities, and now the MWC 758 disk joins the elite club of ultra-high-resolution ALMA disks alongside only a handful of others.”

    Additional information

    This research was presented in a paper “The Eccentric Cavity, Triple Rings, Two-Armed Spirals, and Double Clumps of the MWC 758 Disk” by Dong et al. to appear in The Astrophysical Journal.

    The team is composed of Ruobing Dong (U. of Arizona, USA; ASIAA, Taiwan), Sheng-yuan Liu (ASIAA, Taiwan), Josh Eisner (University of Arizona, USA), Sean Andrews (Harvard-Smithsonian Center for Astrophysics, USA), Jeffrey Fung (UC Berkeley, USA), Zhaohuan Zhu (UNLV, USA) Eugene Chiang (UC Berkeley, USA), Jun Hashimoto (Astrobiology Center, NINS, Japan), Hauyu Baobab Liu (European Southern Observatory, Germany), Simon Casassus (University of Chile, Chile), Thomas Esposito (UC Berkeley, USA), Yasuhiro Hasegawa (JPL/Caltech, USA), Takayuki Muto (Kogakuin University, Japan), Yaroslav Pavlyuchenkov (Russian Academy of Sciences, Russia), David Wilner (Harvard-Smithsonian Center for Astrophysics, USA), Eiji Akiyama (Hokkaido University, Japan), Motohide Tamura (The University of Tokyo; Astrobiology Center, NINS, Japan), and John Wisniewski (U. of Oklahoma, USA).

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Atacama Large Millimeter/submillimeter Array (ALMA), an international astronomy facility, is a partnership of Europe, North America and East Asia in cooperation with the Republic of Chile. ALMA is funded in Europe by the European Organization for Astronomical Research in the Southern Hemisphere (ESO), in North America by the U.S. National Science Foundation (NSF) in cooperation with the National Research Council of Canada (NRC) and the National Science Council of Taiwan (NSC) and in East Asia by the National Institutes of Natural Sciences (NINS) of Japan in cooperation with the Academia Sinica (AS) in Taiwan.

    ALMA construction and operations are led on behalf of Europe by ESO, on behalf of North America by the National Radio Astronomy Observatory (NRAO), which is managed by Associated Universities, Inc. (AUI) and on behalf of East Asia by the National Astronomical Observatory of Japan (NAOJ). The Joint ALMA Observatory (JAO) provides the unified leadership and management of the construction, commissioning and operation of ALMA.


  • richardmitnick 3:05 pm on June 21, 2018 Permalink | Reply
    Tags: , , Muon antineutrino oscillation spotted by NOvA, ,   

    From physicsworld.com: “Muon antineutrino oscillation spotted by NOvA” 

    From physicsworld.com

    07 June 2018
    Hamish Johnston


    The best evidence yet that muon antineutrinos can change into electron antineutrinos has been found by the NOvA experiment in the US. The measurement involved sending a beam of muon antineutrinos more than 800 km through the Earth from Fermilab near Chicago to a detector in northern Minnesota. After running for about 14 months, NOvA found that at least 13 of the muon antineutrinos had changed type, or “flavour”, during their journey.

    The results were presented at the Neutrino 2018 conference, which is being held in Heidelberg, Germany, this week. Although the measurement is still below the threshold required to claim a “discovery”, the result means that fundamental properties of neutrinos and antineutrinos can be compared in detail. This could shed light on important mysteries of physics, such as why there is very little antimatter in the universe.

    Neutrinos and antineutrinos come in three flavours: electron, muon and tau. The subatomic particles also exist in three mass states, which means that neutrinos (and antineutrinos) will continuously change flavour (or oscillate). Neutrino oscillation came as a surprise to physicists, who had originally thought that neutrinos have no mass. Indeed, the origins of neutrino mass are not well-understood and a better understanding of neutrino oscillation could point to new physics beyond the Standard Model.
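    In a simplified two-flavor picture, the probability that a neutrino changes flavour over a baseline L and energy E is P = sin²(2θ)·sin²(1.27·Δm²·L/E), with Δm² in eV², L in km, and E in GeV. The numbers below are round illustrative values near NOvA's baseline and beam energy, not the experiment's fitted parameters; the real analysis is three-flavor and includes matter effects:

```python
import math

def two_flavor_appearance(L_km, E_gev, dm2_ev2, sin2_2theta):
    """Simplified two-flavor oscillation probability:
    P = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)."""
    return sin2_2theta * math.sin(1.27 * dm2_ev2 * L_km / E_gev) ** 2

# Round illustrative numbers (not NOvA's fitted parameters):
# ~810 km baseline, ~2 GeV beam energy, dm2 ~ 2.5e-3 eV^2.
phase = 1.27 * 2.5e-3 * 810 / 2.0
p = two_flavor_appearance(810, 2.0, 2.5e-3, 1.0)
print(f"oscillation phase: {phase:.2f} rad")
print(f"P (assuming maximal mixing): {p:.2f}")
```

    With these round numbers the phase lands near the first oscillation maximum, which is by design: long-baseline experiments pick L and E so the flavor change is close to its largest when the beam arrives.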
    Pion focusing

    NOvA has been running for more than three years and comprises two detectors – one located at Fermilab and the other in Minnesota near the border with Canada.


    The muon antineutrinos in the beam are produced at Fermilab’s NuMI facility by firing a beam of protons at a carbon target. This produces pions, which then decay to produce either muon neutrinos or muon antineutrinos – depending upon the charge of the pion. By focusing pions of one charge into a beam, researchers can create a beam of either neutrinos or antineutrinos.

    The beam is aimed on a slight downward trajectory so it can travel through the Earth to the detector in Minnesota, which weighs in at 14,000 tons. Electron neutrinos and antineutrinos are detected when they very occasionally collide with an atom in a liquid scintillator, which produces a tiny flash of light. This light is converted into electrical signals by photomultiplier tubes, and the type of neutrino (or antineutrino) can be worked out by studying the pattern of signals produced.

    The experiment’s first run with antineutrinos began in February 2017 and ended in April 2018. The first results were presented this week in Heidelberg by collaboration member Mayly Sanchez of Iowa State University, who reported that a total of 18 electron antineutrinos had been seen by the Minnesota detector. If muon antineutrinos did not oscillate into electron antineutrinos, only about five detections would have been expected.
    “Strong evidence”

    “The result is above 4σ level, which is strong evidence for electron antineutrino appearance,” Sanchez told Physics World, adding that this is the first time that the appearance of electron antineutrinos has been seen in a beam of muon antineutrinos. While this is below the 5σ level normally accepted as a discovery in particle physics, it is much stronger evidence than found by physicists working on the T2K detector in Japan – which last year reported seeing hints of the oscillation.
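    The quoted significance can be roughly checked with a simple counting estimate: the chance that a background of about five expected events fluctuates up to the 18 observed, converted to a Gaussian-equivalent z-score. This back-of-envelope calculation ignores systematic uncertainties and the collaboration's full likelihood analysis, so it only approximates their result:

```python
import math
from statistics import NormalDist

def poisson_tail(n_obs, mu):
    """P(N >= n_obs) for a Poisson background with mean mu."""
    return 1.0 - sum(math.exp(-mu) * mu**k / math.factorial(k)
                     for k in range(n_obs))

# 18 observed vs ~5 expected without oscillation (numbers from the article):
p_value = poisson_tail(18, 5.0)
z = NormalDist().inv_cdf(1.0 - p_value)   # one-sided Gaussian equivalent
print(f"p = {p_value:.1e}, z ~ {z:.1f} sigma")
```

    Counting statistics alone already put the excess above the 4σ mark, consistent with the "strong evidence" language, while falling short of the 5σ discovery convention.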

    Between 2014 and 2017, NOvA detected 58 electron neutrinos appearing in a muon neutrino beam. This has allowed NOvA physicists to compare the rates at which muon neutrinos and antineutrinos oscillate into their respective electron counterparts. According to Sanchez, the team has seen a small discrepancy with a statistical significance of just 1.8σ. While this difference is well within the expected measurement uncertainty, if it persists as more data are collected it could point towards new physics.

    Sanchez says that NOvA is still running in antineutrino mode and the amount of data taken will double by 2019.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    PhysicsWorld is a publication of the Institute of Physics. The Institute of Physics is a leading scientific society. We are a charitable organisation with a worldwide membership of more than 50,000, working together to advance physics education, research and application.

    We engage with policymakers and the general public to develop awareness and understanding of the value of physics and, through IOP Publishing, we are world leaders in professional scientific communications.

  • richardmitnick 2:41 pm on June 21, 2018 Permalink | Reply
    Tags: , , , , ,   

    From Fermilab: “New laser technology shows success in particle accelerators” 


    From Fermilab, an enduring source of strength for the US contribution to scientific research worldwide.

    June 21, 2018
    Sarah Lawhun

    David Johnson, left, and Todd Johnson work on the recently installed laser notcher in the Fermilab accelerator complex. The laser notcher, the first application of its kind in an in-production particle accelerator, has helped boost particle beam production at the lab. Photo: Reidar Hahn

    Lasers — used in medicine, manufacturing and made wildly popular by science fiction — are finding a new use in particle physics.

    Fermilab scientists and engineers have developed a tool called a laser notcher, which takes advantage of the laser’s famously precise targeting abilities to do something unexpected: boost the number of particles that accelerators send to experiments. It’s cranked up the lab’s particle output considerably — by an incredible 15 percent — giving scientists more opportunities to study nature’s tiniest constituents.

    While lasers have been used during accelerator tests and diagnostics, this is the first application of its kind used in a fully operational accelerator.

    “For such a new design, the laser notcher has been remarkably reliable,” said Fermilab engineer Bill Pellico, who manages one of the laboratory’s major accelerator upgrade programs, called the Proton Improvement Plan. “It’s already shown it will provide a considerable increase in the number of particles we can produce.”

    The notcher increases particle production, counterintuitively, by removing particles from a particle beam.

    Bunching out

    The process of removing particles isn’t new. Typically, an accelerator generates a particle beam in bunches — compact packets that each contain hundreds of millions of particles. Imagine each bunch in a beam as a pearl on a strand. Bunches can be arranged in patterns according to the acceleration needs. Perhaps the needed pattern is an 80-bunch-long string followed by a three-bunch-long gap. Often, the best way to create the gap is to start with a regular, uninterrupted string of bunches and simply remove the unneeded ones.

    But it isn’t so simple. Traditionally, beam bunches are kicked out by a fast-acting magnet, called a magnetic kicker. It’s a messy business: Particles fly off, strike beamline walls and generally create a subatomic obstacle course for the beam. While it’s not impossible for the beam to pass through such a scene, it also isn’t smooth sailing.

    Accelerator experts refer to the messy phenomenon as beam loss, and it’s a measurable, predictable predicament. They accommodate it by holding back on the amount of beam they accelerate in the first place, setting a ceiling on the number of particles they pack into the beam.

    That ceiling is a limitation for Fermilab’s new and upcoming experiments, which require greater numbers of particles than the accelerator complex could previously handle. So the lab’s accelerator specialists look for ways to raise the particle beam ceiling and meet the experimental needs for beam.

    The most straightforward way to do this is to eliminate the thing that’s keeping the ceiling low and stifling particle delivery — beam loss.

    Lasers against loss

    The new laser notcher works by directing powerful pulses of laser light at particle bunches, taking them out of commission. Both the position and precision of the notcher allow it to create gaps cleanly—delivering a one-two punch in curbing beam loss.

    First, the notcher is positioned early in the series of Fermilab’s accelerators, when the particle beam hasn’t yet achieved the close-to-light speeds it will attain by the time it exits the accelerator chain. (At this early stage, the beam lumbers along at 4 percent of the speed of light, a mere 27 million miles per hour.) This far upstream, the beam loss resulting from ejecting bunches doesn’t have much of an impact.
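    The 4-percent figure follows from basic relativistic kinematics. Assuming the notcher sits where the proton kinetic energy is around 750 keV (an illustrative assumption; the article states only the speed), the speed works out as:

```python
import math

PROTON_MASS_MEV = 938.272  # proton rest energy in MeV

def beta_from_kinetic_energy(t_mev):
    """Speed as a fraction of c for a proton with kinetic energy t_mev:
    gamma = 1 + T/m, beta = sqrt(1 - 1/gamma^2)."""
    gamma = 1.0 + t_mev / PROTON_MASS_MEV
    return math.sqrt(1.0 - 1.0 / gamma**2)

# Assumed ~750 keV beam energy at the notcher (illustrative assumption):
beta = beta_from_kinetic_energy(0.750)
print(f"beta = {beta:.3f}")  # about 0.04, i.e. ~4% of the speed of light
```

    At this speed the protons are still comfortably non-relativistic, which is part of why losses here are so much more benign than losses later in the chain.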

    “We moved the process to a place where, when we lose particles, it really doesn’t matter,” said David Johnson, Fermilab engineering physicist who led the laser notcher project.

    Second, the laser notcher is, like a scalpel, surgical in its bunch removal. It ejects bunches precisely, individually, bunch by bunch. That enables scientists to create gaps of exactly the right lengths needed by later acceleration stages.

    For Fermilab’s accelerator chain, the winning formula is for the notcher to create a gap that is 80 nanoseconds (billionths of a second) long every 2,200 nanoseconds. It’s the perfect-length gap needed by one of Fermilab’s later-stage accelerators, called the Booster.
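    The timing pattern is easy to quantify. If the incoming beam carries bunches at a 201.25 MHz RF frequency (an assumption for illustration; the article gives only the gap lengths), an 80-nanosecond notch every 2,200 nanoseconds removes roughly 16 bunches per notch and costs under 4 percent of the beam:

```python
GAP_NS = 80.0        # notch length (from the article)
PERIOD_NS = 2200.0   # notch repetition period (from the article)
RF_MHZ = 201.25      # assumed linac bunch frequency (not in the article)

bunch_spacing_ns = 1e3 / RF_MHZ            # ~5 ns between bunches
bunches_removed = GAP_NS / bunch_spacing_ns
beam_fraction_lost = GAP_NS / PERIOD_NS

print(f"~{bunches_removed:.0f} bunches removed per notch")
print(f"{beam_fraction_lost:.1%} of the beam removed")
```

    Trading a few percent of the beam for clean, loss-free extraction downstream is what lets operators raise the overall intensity ceiling.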

    A graceful exit

    The Fermilab Booster feeds beam to the next accelerator stages or directly to experiments.

    Prior to the laser notcher’s installation, a magnetic kicker would boot specified bunches as they entered the Booster, resulting in messy beam loss.

    With the laser notcher now on the scene, the Booster receives a beam that has prefab, well-defined gaps. These 80-nanosecond-long windows of opportunity mean that, as the beam leaves the Booster and heads toward its next stop, it can make a clean, no-fuss, no-loss exit.

    With Booster beam loss brought down to low levels, Fermilab accelerator operators can raise the ceiling on the numbers of particles they can pack into the beam. The results so far are promising: The notcher has already allowed beam power to increase by a whopping 15 percent.

    Thanks to this innovation and other upgrade improvements, the Booster accelerator is now operating at its highest efficiency ever and at record-setting beam power.

    “Although lasers have been used in proton accelerators in the past for diagnostics and tests, this is the first-of-its-kind application of lasers in an operational proton synchrotron, and it establishes a technological framework for using laser systems in a variety of other bunch-by-bunch applications, which would further advance the field of high-power proton accelerators,” said Sergei Nagaitsev, head of the Fermilab Office of Accelerator Science Programs.

    Plentiful protons and other particles

    The laser notcher, installed in January, is a key part of a larger program, the Proton Improvement Plan (PIP), to upgrade the lab’s chain of particle accelerators to produce powerful proton beams.

    As the name of the program implies, it starts with protons.

    Fermilab sends protons barreling through the lab’s accelerator complex, and they’re routed to various experiments. Along the way, some of them are transformed into other particles needed by experiments, for example into neutrinos—tiny, omnipresent particles that could hold the key to filling in gaps in our understanding of the universe’s evolution. Fermilab experiments need boatloads of these particles to carry out the lab’s scientific program. Some of the protons are transformed into muons, which can provide scientists with hints about the nature of the vacuum.

    With more protons coming down the pipe, thanks to PIP and the laser notcher, the accelerator can generate more neutrinos, muons and other particles, feeding Fermilab’s muon experiments, Muon g-2 and Mu2e, and its neutrino experiments, including its largest operating neutrino experiment, NOvA, and its flagship, the Deep Underground Neutrino Experiment and Long-Baseline Neutrino Facility.

    “Considering all the upgrades and improvements to Fermilab accelerators as a beautiful cake with frosting, the increase in particle production we managed to achieve with the laser notcher is like the cherry on top of the cake,” Nagaitsev said.

    “It’s a seemingly small change with a significant impact,” Johnson said.

    As the Fermilab team moves forward, they’ll continue to put the notcher through its paces, investigating paths for improvement.

    With this innovation, Fermilab adds another notch in the belt of what lasers can do.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a U.S. Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

  • richardmitnick 2:22 pm on June 21, 2018 Permalink | Reply
    Tags: 'Red Nuggets' are Galactic Gold for Astronomers

    From NASA Chandra: “‘Red Nuggets’ are Galactic Gold for Astronomers” 


    From NASA Chandra

    June 21, 2018

    Megan Watzke
    Chandra X-ray Center, Cambridge, Mass.

    Credit: X-ray: NASA/CXC/MTA-Eötvös University/N. Werner et al.; Illustration: NASA/CXC/M.Weiss
    Press Image, Caption, and Videos

    The central black holes may be the driving force in how much star formation occurs in a certain type of rare galaxy.

    ‘Red nuggets’ are the relics of the first massive galaxies that formed within a billion years after the Big Bang.

    While most red nuggets merged with other galaxies, some remained untouched throughout the history of the Universe.

    Astronomers used Chandra to learn more about how the black holes in these galaxies affect star formation.

    About a decade ago, astronomers discovered a population of small, but massive galaxies called “red nuggets.” A new study using NASA’s Chandra X-ray Observatory indicates that black holes have squelched star formation in these galaxies and may have used some of the untapped stellar fuel to grow to unusually massive proportions.

    Red nuggets were first discovered by the Hubble Space Telescope at great distances from Earth, corresponding to times only about three or four billion years after the Big Bang. They are relics of the first massive galaxies that formed within only one billion years after the Big Bang. Astronomers think they are the ancestors of the giant elliptical galaxies seen in the local Universe. The masses of red nuggets are similar to those of giant elliptical galaxies, but they are only about a fifth of their size.

    While most red nuggets merged with other galaxies over billions of years, a small number managed to slip through the long history of the cosmos untouched. These unscathed red nuggets represent a golden opportunity to study how the galaxies, and the supermassive black holes at their centers, act over billions of years of isolation.

    For the first time, Chandra has been used to study the hot gas in two of these isolated red nuggets, MRK 1216 and PGC 032873. They are located only 295 million and 344 million light years from Earth, respectively, rather than billions of light years for the first known red nuggets. This X-ray emitting hot gas contains the imprint of activity generated by the supermassive black holes in each of the two galaxies.

    “These galaxies have existed for 13 billion years without ever interacting with another of its kind,” said Norbert Werner of MTA-Eötvös University Lendület Hot Universe and Astrophysics Research Group in Budapest, Hungary, who led the study. “We are finding that the black holes in these galaxies take over and the result is not good for new stars trying to form.”

    Astronomers have long known that the material falling towards black holes can be redirected outward at high speeds due to intense gravitational and magnetic fields. These high-speed jets can tamp down the formation of stars. This happens because the blasts from the vicinity of the black hole provide a powerful source of heat, preventing the galaxy’s hot interstellar gas from cooling enough to allow large numbers of stars to form.

    The temperature of the hot gas is higher in the center of the MRK 1216 galaxy compared to its surroundings, showing the effects of recent heating by the black hole. Also, radio emission is observed from the center of the galaxy, a signature of jets from black holes. Finally, the X-ray emission from the vicinity of the black hole is about a hundred million times lower than a theoretical limit on how fast a black hole can grow — called the “Eddington limit” — where the outward pressure of radiation is balanced by the inward pull of gravity. This low level of X-ray emission is typical for black holes producing jets. All these factors provide strong evidence that activity generated by the central supermassive black holes in these red nugget galaxies is suppressing the formation of new stars.
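As a rough illustration (not a calculation from the study itself), the Eddington limit described above can be estimated with a few lines of Python using the standard formula L_Edd = 4πGMm_p·c/σ_T and a black hole mass of about five billion Suns, the approximate figure quoted for these galaxies:

```python
# Back-of-envelope estimate of the Eddington luminosity, the theoretical
# limit mentioned above, for a ~5-billion-solar-mass black hole.
# Illustrative only; the article does not give the study's exact inputs.
import math

G       = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c       = 2.998e8     # speed of light, m/s
m_p     = 1.673e-27   # proton mass, kg
sigma_T = 6.652e-29   # Thomson scattering cross-section, m^2
M_sun   = 1.989e30    # solar mass, kg

def eddington_luminosity(mass_kg):
    """L_Edd = 4*pi*G*M*m_p*c / sigma_T: radiation pressure balances gravity."""
    return 4 * math.pi * G * mass_kg * m_p * c / sigma_T

L_edd = eddington_luminosity(5e9 * M_sun)
print(f"Eddington luminosity: {L_edd:.2e} W")              # ~6e40 W
print(f"At 1e-8 of the limit: {L_edd * 1e-8:.2e} W")
```

For scale, 6 × 10⁴⁰ watts is over a hundred trillion times the Sun’s output, so even a black hole radiating at a hundred-millionth of this limit, as observed here, is far from dormant.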

    The black holes and the hot gas may have another connection. The authors suggest that much of the black hole mass may have accumulated from the hot gas surrounding both galaxies. The black holes in both MRK 1216 and PGC 032873 are among the most massive known, with estimated masses of about five billion times that of the Sun, based on optical observations of the speeds of stars near the galaxies’ centers. Furthermore, the masses of the MRK 1216 black hole and possibly the one in PGC 032873 are estimated to be a few percent of the combined masses of all the stars in the central regions of the galaxies, whereas in most galaxies, the ratio is about ten times less.

    “Apparently, left to their own devices, black holes can act a bit like a bully,” said co-author Kiran Lakhchaura, also of MTA-Eötvös University.

    “Not only do they prevent new stars from forming,” said co-author Massimo Gaspari, an Einstein fellow from Princeton University, “they may also take some of that galactic material and use it to feed themselves.”

    In addition, the hot gas in and around PGC 032873 is about ten times fainter than the hot gas around MRK 1216. Because both galaxies appear to have evolved in isolation over the past 13 billion years, this difference might have arisen from more ferocious outbursts from PGC 032873’s black hole in the past, which blew most of the hot gas away.

    “The Chandra data tell us more about what the long, solitary journey through cosmic time has been like for these red nugget galaxies,” said co-author Rebecca Canning of Stanford University. “Although the galaxies haven’t interacted with others, they’ve shown plenty of inner turmoil.”

    A paper describing these results appears in the latest issue of the Monthly Notices of the Royal Astronomical Society. The authors of the paper are Norbert Werner (MTA-Eötvös University Lendület Hot Universe and Astrophysics Research Group in Budapest, Hungary), Kiran Lakhchaura (MTA-Eötvös University), Rebecca Canning (Stanford University), Massimo Gaspari (Princeton University), and Aurora Simeonescu (ISAS/JAXA).

    See the full article here.



    NASA’s Marshall Space Flight Center in Huntsville, Ala., manages the Chandra program for NASA’s Science Mission Directorate in Washington. The Smithsonian Astrophysical Observatory controls Chandra’s science and flight operations from Cambridge, Mass.

  • richardmitnick 1:35 pm on June 21, 2018 Permalink | Reply
    Tags: The nearby galaxy ESO 325-G004

    From NASA/ESA Hubble Telescope and ESO VLT: “Most Precise Test of Einstein’s General Relativity Outside Milky Way” 

    From NASA/ESA Hubble Telescope



    From European Southern Observatory


    Thomas Collett
    University of Portsmouth
    Portsmouth, UK
    Tel: +44 239 284 5146
    Email: thomas.collett@port.ac.uk

    Bob Nichol
    University of Portsmouth
    Portsmouth, UK
    Tel: +44 239 284 3117
    Email: bob.nichol@port.ac.uk

    Mathias Jäger
    ESA/Hubble, Public Information Officer
    Garching bei München, Germany
    Tel: +49 176 62397500
    Email: mjaeger@partner.eso.org

    Richard Hook
    ESO Public Information Officer
    Garching bei München, Germany
    Tel: +49 89 3200 6655
    Cell: +49 151 1537 3591
    Email: pio@eso.org


    An international team of astronomers using the NASA/ESA Hubble Space Telescope and the European Southern Observatory’s Very Large Telescope has made the most precise test of general relativity yet outside our Milky Way. The nearby galaxy ESO 325-G004 acts as a strong gravitational lens, distorting light from a distant galaxy behind it to create an Einstein ring around its centre. By comparing the mass of ESO 325-G004 with the curvature of space around it, the astronomers found that gravity on these astronomical length-scales behaves as predicted by general relativity. This rules out some alternative theories of gravity.

    Using the NASA/ESA Hubble Space Telescope and European Southern Observatory’s Very Large Telescope (VLT), a team led by Thomas Collett (University of Portsmouth, UK), was able to perform the most precise test of general relativity outside the Milky Way to date.

    The theory of general relativity predicts that objects deform spacetime, causing any light that passes by to be deflected and resulting in a phenomenon known as gravitational lensing. This effect is only noticeable for very massive objects. A few hundred strong gravitational lenses are known, but most are too distant to precisely measure their mass. However, the elliptical galaxy ESO 325-G004 is amongst the closest lenses at just 450 million light-years from Earth.

    Using the MUSE instrument on the VLT the team calculated the mass of ESO 325-G004 by measuring the movement of stars within it.

    ESO MUSE on the VLT
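A dynamical mass of this sort can be roughed out from the stellar velocity dispersion with a simple virial-style estimate, M ~ σ²R/G. The dispersion and radius below are illustrative guesses for a massive elliptical, not the values measured for ESO 325-G004:

```python
# Order-of-magnitude dynamical mass of an elliptical galaxy from the
# motions of its stars: M ~ sigma^2 * R / G (virial-style scaling).
# The dispersion and radius are assumed, illustrative numbers.

G     = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30         # solar mass, kg
KPC   = 3.086e19         # metres per kiloparsec

sigma  = 330e3           # assumed stellar velocity dispersion, m/s
radius = 5 * KPC         # assumed effective radius, ~5 kpc

mass_kg = sigma**2 * radius / G
print(f"Dynamical mass: ~{mass_kg / M_sun:.1e} solar masses")
```

With a full MUSE dispersion map the team could do far more detailed dynamical modelling than this one-liner, but the scaling shows why a galaxy of this kind packs enough mass to act as a strong lens.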

    Using Hubble the scientists were able to observe an Einstein ring resulting from light from a distant galaxy being distorted by the intervening ESO 325-G004. Studying the ring allowed the astronomers to measure how light, and therefore spacetime, is being distorted by the huge mass of ESO 325-G004.
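The geometry of such a ring can be sketched with the textbook Einstein-radius formula, θ_E = √(4GM/c² · D_LS/(D_L·D_S)). The lens mass and source distance below are illustrative assumptions, not the values from the Collett et al. study, and the distances use a simple flat-space shortcut rather than proper cosmological distances:

```python
# Rough estimate of the angular size of an Einstein ring:
#   theta_E = sqrt( (4*G*M/c^2) * D_LS / (D_L * D_S) )
# Lens mass and source distance are illustrative guesses only.
import math

G     = 6.674e-11          # m^3 kg^-1 s^-2
c     = 2.998e8            # m/s
M_sun = 1.989e30           # kg
LY    = 9.461e15           # metres per light-year

M_lens = 1.5e11 * M_sun    # assumed lens mass (a massive elliptical)
D_L    = 450e6 * LY        # lens distance: 450 million light-years
D_S    = 4 * D_L           # assumed source distance, well behind the lens
D_LS   = D_S - D_L         # lens-to-source separation (flat-space shortcut)

theta_E = math.sqrt((4 * G * M_lens / c**2) * D_LS / (D_L * D_S))  # radians
arcsec  = math.degrees(theta_E) * 3600
print(f"Einstein radius: {arcsec:.1f} arcsec")
```

The point is only that a galaxy-scale lens at this distance produces a ring a few arcseconds across, comfortably resolvable by Hubble.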

    Collett comments: “We know the mass of the foreground galaxy from MUSE and we measured the amount of gravitational lensing we see from Hubble. We then compared these two ways to measure the strength of gravity — and the result was just what general relativity predicts, with an uncertainty of only nine percent. This is the most precise test of general relativity outside the Milky Way to date. And this using just one galaxy!”

    General relativity has been tested with exquisite accuracy on Solar System scales, and the motions of stars around the black hole at the centre of the Milky Way are under detailed study, but previously there had been no precise tests on larger astronomical scales. Testing the long range properties of gravity is vital to validate our current cosmological model.

    These findings may have important implications for models of gravity alternative to general relativity. These alternative theories predict that the effects of gravity on the curvature of spacetime are “scale dependent”. This means that gravity should behave differently across astronomical length-scales from the way it behaves on the smaller scales of the Solar System. Collett and his team found that this is unlikely to be true unless these differences only occur on length scales larger than 6000 light-years.

    “The Universe is an amazing place providing such lenses which we can use as our laboratories,” adds team member Bob Nichol (University of Portsmouth). “It is so satisfying to use the best telescopes in the world to challenge Einstein, only to find out how right he was.”

    The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA’s Goddard Space Flight Center manages the telescope. The Space Telescope Science Institute (STScI), a free-standing science center located on the campus of The Johns Hopkins University and operated by the Association of Universities for Research in Astronomy (AURA) for NASA, conducts Hubble science operations.

    More information

    This research was presented in a paper entitled A precise extragalactic test of General Relativity by Collett et al., to appear in the journal Science.

    The team is composed of T. E. Collett (Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth, UK), L. J. Oldham (Institute of Astronomy, University of Cambridge, Cambridge, UK), R. Smith (Centre for Extragalactic Astronomy, Durham University, Durham, UK), M. W. Auger (Institute of Astronomy, University of Cambridge, Cambridge, UK), K. B. Westfall (Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth, UK; University of California Observatories – Lick Observatory, Santa Cruz, USA), D. Bacon (Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth, UK), R. C. Nichol (Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth, UK), K. L. Masters (Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth, UK), K. Koyama (Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth, UK), R. van den Bosch (Max Planck Institute for Astronomy, Königstuhl, Heidelberg, Germany).


    ESO is the foremost intergovernmental astronomy organisation in Europe and the world’s most productive ground-based astronomical observatory by far. It is supported by 16 countries: Austria, Belgium, Brazil, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Poland, Portugal, Spain, Sweden, Switzerland and the United Kingdom, along with the host state of Chile. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research. ESO operates three unique world-class observing sites in Chile: La Silla, Paranal and Chajnantor. At Paranal, ESO operates the Very Large Telescope, the world’s most advanced visible-light astronomical observatory and two survey telescopes. VISTA works in the infrared and is the world’s largest survey telescope and the VLT Survey Telescope is the largest telescope designed to exclusively survey the skies in visible light. ESO is a major partner in ALMA, the largest astronomical project in existence. And on Cerro Armazones, close to Paranal, ESO is building the 39-metre European Extremely Large Telescope, the E-ELT, which will become “the world’s biggest eye on the sky”.

    See the full NASA/ESA Hubble article here.
    See the full ESO/VLT article here.



  • richardmitnick 11:42 am on June 21, 2018 Permalink | Reply
    Tags: Biomedical engineering - Bringing a human touch to modern prosthetics

    From JHU HUB: “Biomedical engineering – Bringing a human touch to modern prosthetics” 

    Johns Hopkins

    From JHU HUB


    ‘Electronic skin’ allows user to experience a sense of touch and pain; ‘After many years, I felt my hand, as if a hollow shell got filled with life again,’ amputee volunteer says.

    Amy Lunday

    Amputees often experience the sensation of a “phantom limb”—a feeling that a missing body part is still there.

    That sensory illusion is closer to becoming a reality thanks to a team of engineers at Johns Hopkins University that has created an electronic skin. When layered on top of prosthetic hands, this e-dermis brings back a real sense of touch through the fingertips.

    “After many years, I felt my hand, as if a hollow shell got filled with life again,” says the amputee who served as the team’s principal volunteer. (The research protocol used in the study does not allow identification of the amputee volunteers.)

    Made of fabric and rubber laced with sensors to mimic nerve endings, e-dermis recreates a sense of touch as well as pain by sensing stimuli and relaying the impulses back to the peripheral nerves.

    “We’ve made a sensor that goes over the fingertips of a prosthetic hand and acts like your own skin would,” says Luke Osborn, a graduate student in biomedical engineering. “It’s inspired by what is happening in human biology, with receptors for both touch and pain.

    Luke Osborn interacts with a prosthetic hand sporting the e-dermis. Image credit: Larry Canner / Homewood Photography

    “This is interesting and new,” Osborn adds, “because now we can have a prosthetic hand that is already on the market and fit it with an e-dermis that can tell the wearer whether he or she is picking up something that is round or whether it has sharp points.”

    The work, published online in the journal Science Robotics, shows it’s possible to restore a range of natural, touch-based feelings to amputees who use prosthetic limbs. The ability to detect pain could be useful, for instance, not only in prosthetic hands but also in lower limb prostheses, alerting the user to potential damage to the device.

    Human skin is made up of a complex network of receptors that relay a variety of sensations to the brain. This network provided a biological template for the research team, which includes members from the Johns Hopkins departments of Biomedical Engineering, Electrical and Computer Engineering, and Neurology, and from the Singapore Institute of Neurotechnology.

    Video: American Association for the Advancement of Science

    Bringing a more human touch to modern prosthetic designs is critical, especially when it comes to incorporating the ability to feel pain, Osborn says.

    “Pain is, of course, unpleasant, but it’s also an essential, protective sense of touch that is lacking in the prostheses that are currently available to amputees,” he says. “Advances in prosthesis designs and control mechanisms can aid an amputee’s ability to regain lost function, but they often lack meaningful, tactile feedback or perception.”

    That’s where the e-dermis comes in, conveying information to the amputee by stimulating peripheral nerves in the arm, making the so-called phantom limb come to life. Inspired by human biology, the e-dermis enables its user to sense a continuous spectrum of tactile perceptions, from light touch to noxious or painful stimulus.

    The e-dermis does this by electrically stimulating the amputee’s nerves in a non-invasive way, through the skin, says the paper’s senior author, Nitish Thakor, a professor of biomedical engineering and director of the Biomedical Instrumentation and Neuroengineering Laboratory at Johns Hopkins.

    “For the first time, a prosthesis can provide a range of perceptions from fine touch to noxious to an amputee, making it more like a human hand,” says Thakor, co-founder of Infinite Biomedical Technologies, the Baltimore-based company that provided the prosthetic hardware used in the study.

    The team created a “neuromorphic model” mimicking the touch and pain receptors of the human nervous system, allowing the e-dermis to electronically encode sensations just as the receptors in the skin would. Tracking brain activity via electroencephalography, or EEG, the team determined that the test subject was able to perceive these sensations in his phantom hand.
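The core idea of encoding sensations as stimulation rates can be sketched in a few lines. This toy model is hypothetical: the study’s actual neuromorphic model is far more sophisticated (it mimics spiking receptor behavior), and the thresholds and function names here are invented purely to illustrate rate coding of touch versus pain:

```python
# Toy sketch of rate-coded touch/pain encoding, loosely inspired by the
# e-dermis concept. Thresholds, bands, and names are hypothetical,
# chosen only to illustrate the idea; they are not from the study.

def encode_stimulation(pressure, sharpness, pain_threshold=0.7):
    """Return (label, pulse_rate_hz) for normalized inputs in [0, 1]."""
    if sharpness >= pain_threshold:
        # Noxious stimulus: a distinct high-frequency band signals "pain"
        return ("pain", 80 + 40 * sharpness)
    # Innocuous touch: pulse rate scales with contact pressure
    return ("touch", 10 + 50 * pressure)

print(encode_stimulation(0.3, 0.2))   # light touch on a rounded object
print(encode_stimulation(0.3, 0.9))   # same pressure on a sharp point
```

In the real system, outputs along these lines drive the transcutaneous nerve stimulation the researchers used to reach the volunteer’s peripheral nerves.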

    The researchers then connected the e-dermis output to the volunteer by using a noninvasive method known as transcutaneous electrical nerve stimulation, or TENS. In a pain-detection task, the team determined that the test subject and the prosthesis were able to experience a natural, reflexive reaction both to pain while touching a pointed object and to non-pain when touching a round object.

    The e-dermis is not sensitive to temperature—for this study, the team focused on detecting object curvature (for touch and shape perception) and sharpness (for pain perception). The e-dermis technology could be used to make robotic systems more human, and it could also be extended to astronaut gloves and space suits, Osborn says.

    The researchers plan to further develop the technology and work to better understand how to provide meaningful sensory information to amputees in the hopes of making the system ready for widespread patient use.

    Johns Hopkins is a pioneer in the field of upper-limb dexterous prostheses. More than a decade ago, the university’s Applied Physics Laboratory led the development of the advanced Modular Prosthetic Limb, which an amputee patient controls with the muscles and nerves that once controlled his or her real arm or hand.

    See the full article here.


    About the Hub

    We’ve been doing some thinking — quite a bit, actually — about all the things that go on at Johns Hopkins. Discovering the glue that holds the universe together, for example. Or unraveling the mysteries of Alzheimer’s disease. Or studying butterflies in flight to fine-tune the construction of aerial surveillance robots. Heady stuff, and a lot of it.

    In fact, Johns Hopkins does so much, in so many places, that it’s hard to wrap your brain around it all. It’s too big, too disparate, too far-flung.

    We created the Hub to be the news center for all this diverse, decentralized activity, a place where you can see what’s new, what’s important, what Johns Hopkins is up to that’s worth sharing. It’s where smart people (like you) can learn about all the smart stuff going on here.

    At the Hub, you might read about cutting-edge cancer research or deep-trench diving vehicles or bionic arms. About the psychology of hoarders or the delicate work of restoring ancient manuscripts or the mad motor-skills brilliance of a guy who can solve a Rubik’s Cube in under eight seconds.

    There’s no telling what you’ll find here because there’s no way of knowing what Johns Hopkins will do next. But when it happens, this is where you’ll find it.

    Johns Hopkins Campus

    The Johns Hopkins University opened in 1876, with the inauguration of its first president, Daniel Coit Gilman. “What are we aiming at?” Gilman asked in his installation address. “The encouragement of research … and the advancement of individual scholars, who by their excellence will advance the sciences they pursue, and the society where they dwell.”

    The mission laid out by Gilman remains the university’s mission today, summed up in a simple but powerful restatement of Gilman’s own words: “Knowledge for the world.”

    What Gilman created was a research university, dedicated to advancing both students’ knowledge and the state of human knowledge through research and scholarship. Gilman believed that teaching and research are interdependent, that success in one depends on success in the other. A modern university, he believed, must do both well. The realization of Gilman’s philosophy at Johns Hopkins, and at other institutions that later attracted Johns Hopkins-trained scholars, revolutionized higher education in America, leading to the research university system as it exists today.
