Updates from June, 2018

  • richardmitnick 10:21 pm on June 19, 2018 Permalink | Reply
    Tags: NASA’s Next Flagship Mission May Be A Crushing Disappointment For Astrophysics

    From Ethan Siegel: “NASA’s Next Flagship Mission May Be A Crushing Disappointment For Astrophysics” 

    From Ethan Siegel
    Jun 19, 2018

    1
    Various long-exposure campaigns, like the Hubble eXtreme Deep Field (XDF) shown here, have revealed thousands of galaxies in a volume of the Universe that represents a fraction of a millionth of the sky. Ambitious, flagship-class observatories are needed to take the next great leap forward for science. NASA, ESA, H. Teplitz and M. Rafelski (IPAC/Caltech), A. Koekemoer (STScI), R. Windhorst (Arizona State University), and Z. Levay (STScI)

    Every ten years, the field of astronomy and astrophysics undergoes a Decadal Survey. This charts out the path that NASA’s astrophysics division will follow for the next decade, including what types of questions they’ll investigate, which missions will be funded, and what won’t be chosen. The greatest scientific advances of all come when we invest a large amount of resources in a single, ultra-powerful, multi-purpose observatory, such as the Hubble Space Telescope.

    NASA/ESA Hubble Telescope

    These are high-risk, high-reward propositions. If the mission succeeds, we can learn an unprecedented amount about the Universe.

    2
    Star birth in the Carina Nebula, in the optical (top) and the infrared (bottom). Our willingness to invest in fundamental science is directly related to how much we can learn about the Universe. NASA, ESA and the Hubble SM4 ERO Team

    Even though the mission proposals go through NASA, it’s the National Research Council and the National Academy of Sciences that ultimately make the recommendations. Since the inception of NASA in the 1960s, these Decadal Surveys have shaped the field of astronomy and astrophysics research. They brought us our greatest ground-based and space-based observatories. On the ground, radio arrays like the Very Large Array and the Very Long Baseline Array, as well as the Atacama Large Millimeter Array, owe their origins to the decadal surveys.

    NRAO/Karl V Jansky VLA, on the Plains of San Agustin fifty miles west of Socorro, NM, USA, at an elevation of 6970 ft (2124 m)

    NRAO VLBA

    ESO/NRAO/NAOJ ALMA Array in Chile in the Atacama at Chajnantor plateau, at 5,000 metres

    Space-based missions include NASA’s great observatories: the Hubble Space Telescope, the Chandra X-ray observatory, the Spitzer Space Telescope, and the Compton Gamma-Ray Observatory, launched in the 1990s and early 2000s.

    NASA/Chandra X-ray Telescope


    NASA/Spitzer Infrared Telescope

    NASA Compton Gamma Ray Observatory

    4
    NASA’s Fermi Satellite has constructed the highest resolution, high-energy map of the Universe ever created. Without space-based observatories such as this one, we could never learn all that we have about the Universe. NASA/DOE/Fermi LAT Collaboration

    NASA/Fermi LAT


    NASA/Fermi Gamma Ray Space Telescope

    More recent Decadal Surveys, conducted this millennium, will bring us the James Webb Space Telescope, the WFIRST observatory designed to probe dark energy and exoplanets, and the Large Synoptic Survey Telescope (LSST), among others.

    NASA/ESA/CSA Webb Telescope annotated

    NASA/WFIRST

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak of Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    They’ve identified the major, most important science goals of astronomy and astrophysics, including dark energy, exoplanets, supernovae, mergers of extreme objects, and the formation of the first stars and the large-scale structure of the Universe. But there was a warning issued in 2001’s report that hasn’t been heeded, and now it’s creating an enormous problem.

    5
    The 2010 NASA mission timeline doesn’t just show a planned James Webb, but an enormous suite of missions that require ongoing funding. Without a commensurate increase in funds, that means fewer resources available for new missions. NASA Astrophysics Division.

    While a robust astronomy program has many benefits for the nation and the world, it’s vital to have a diverse portfolio of missions and observatories. Prior Decadal Surveys have simultaneously stressed the importance of the large flagship missions that drive the field forward like no other type of mission can, while warning against investing too much in these flagships at the expense of other small and medium-sized missions.

    They’ve also stressed the importance of providing additional funding or securing external funding to support ongoing missions, facilities, and observatories. Without it, the development of new missions is hamstrung by the need to continually fund the existing ones.

    6
    As a percentage of the federal budget, investment in NASA is at a 58-year low; at only 0.4% of the budget, you have to go back to 1959 to find a year when we invested a smaller percentage in our nation’s space agency. Office of Management & Budget.

    Many austerity proponents and budget-hawks — both in politics and among the general public — will often point to the large cost of these flagship missions, which can balloon if unexpected problems arise. The far greater problem, however, would arise if one of these flagship missions failed.

    When Hubble launched with its flawed mirror, unable to properly focus the light it gathered, fixing it became mandatory [Soon after Hubble began sending images from space, scientists discovered that the telescope’s primary mirror had a flaw called spherical aberration. The outer edge of the mirror was ground too flat by a depth of 4 microns (roughly equal to one-fiftieth the thickness of a human hair). The flaw resulted in images that were fuzzy because some of the light from the objects being studied was being scattered. After this discovery, scientists and engineers developed COSTAR, corrective optics that functioned like eyeglasses to restore Hubble’s vision. By placing small and carefully designed mirrors in front of the original Hubble instruments, COSTAR — installed during the 1993 First Servicing Mission — successfully improved their vision to their original design goals (Thank you, Sandy Faber)]. Yes, it was expensive, but the far greater cost — to science, to society, and to humanity — would have been not to fix it. Our choice to invest in repairing (and upgrading) Hubble directly led to some of our greatest discoveries of all time.

    James Webb, similarly, is now over budget, and will require additional funds to complete. But the small, additional cost to get it right enormously outweighs the cost we’d all bear if we cheated ourselves and didn’t finish this incredible investment. [Also, here, we have commitments from CSA and ESA]

    7
    The science instruments aboard the ISIM module being lowered and installed into the main assembly of JWST in 2016. The telescope must be folded and properly stowed in order to fit aboard the Ariane 5 rocket which will launch it, and all its components must work together, correctly, to deliver a successful mission outcome. NASA / Chris Gunn.

    Now, the 2020 Decadal Survey approaches. The future course of astronomy and astrophysics will be charted, and one flagship mission will be selected as the top priority, the premier mission of the 2030s. (James Webb was that mission for the 2010s; WFIRST will be it for the 2020s.) Unfortunately, a memorandum was just released by Paul Hertz, the astrophysics director of NASA’s Science Mission Directorate. In it, each of the four finalist teams was instructed to develop a second architecture: a lower-cost, scientifically inferior option.

    8
    This figure shows the real stars in the sky for which a planet in the habitable zone can be observed. The color coding shows the probability of observing an exoEarth candidate if it’s present around that star (green is a high probability, red is a low one). Note how the size of your telescope/observatory in space impacts what you can see. C. Stark and J. Tumlinson, STScI.

    This directive flies in the face of what a flagship mission actually is. Speaking at this year’s big American Astronomical Society meeting, NASA Associate Administrator Thomas Zurbuchen said,

    “What we learn from these flagship missions is why we study the Universe. This is civilization-scale science… If we don’t do this, we aren’t NASA.”

    8
    A simulated view of the same part of the sky, with the same observing time, with both Hubble (L) and the initial architecture of LUVOIR (R). The difference is breathtaking, and represents what civilization-scale science can deliver. G. Snyder, STScI /M. Postman, STScI.

    And yet, these scaled-down architectures are by definition not as ambitious. It’s an indication from NASA that, unless the budget is increased to accommodate the actual costs of doing civilization-scale science, we won’t be doing it. Each of the four finalists has been instructed to propose an option with a total cost below $5 billion, which will severely curtail the capabilities of such an observatory.

    9
    The concept design of the LUVOIR space telescope would place it at the L2 Lagrange point, where a 15.1-meter primary mirror would unfold and begin observing the Universe, bringing us untold scientific and astronomical riches. NASA / LUVOIR concept team; Serge Brunier (background)

    As an example, one of the proposals, LUVOIR, was designed to be the ultimate successor to Hubble: 40 times as powerful, with a diameter of up to ~15 meters. It was designed to tackle problems in our Solar System, to measure molecular biosignatures on exoplanets, to perform a cosmic census of stars in every type of galaxy, to achieve the sensitivity needed to see every galaxy in the Universe, to directly image and map the gas in galaxies everywhere, and to measure the rotation of galaxies (and thereby better understand dark matter) throughout the Universe.

    But the new architecture would have only half the diameter, half the resolution, and a quarter of the light-gathering power of the original design. It would basically be an optical version of the James Webb Space Telescope. The sweeping ambition of the original project, with the potential to revolutionize our view of the Universe, would be lost.
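    To make the scaling concrete, here is a minimal back-of-envelope sketch. The mirror diameters are approximate and the scaling laws are the standard diffraction-limited ones, not mission-specific figures: collecting area grows as the square of the diameter, while angular resolution improves linearly with it.

```python
# Aperture scaling, assuming a diffraction-limited telescope.
# Light-gathering power scales as D**2; angular resolution scales as lambda / D.
hubble_d = 2.4    # Hubble primary mirror diameter, meters
luvoir_d = 15.0   # proposed LUVOIR-class diameter, meters (approximate)

print(f"Light-gathering power vs Hubble: {(luvoir_d / hubble_d) ** 2:.0f}x")  # ~39x, i.e. "40 times as powerful"

# Halving the diameter of any design halves the resolution and quarters the
# collecting area, so exposures must run ~4x longer for the same photon count.
print(f"Area of a half-size mirror relative to full size: {0.5 ** 2:.2f}")    # 0.25
```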

    9
    A simulated image of what Hubble would see for a distant, star-forming galaxy (L), versus what a 10-15 meter class telescope would see for the same galaxy (R). With a telescope of half the size, the resolution would be halved, and the light-gathering time would need to be four times as great to create that inferior image. NASA / Greg Snyder / LUVOIR-HDST concept team.

    The other three proposals are more easily scaled down, but they too lose their power. HabEx, designed to directly image Earth-like planets around other stars, loses 87.5% of the interesting planets it can survey if its size is reduced by half. With such a reduction, it might not offer enough beyond the other missions that will fly, like WFIRST (especially if WFIRST gets a starshade), to justify being the flagship. Lynx, designed to be a next-generation X-ray observatory vastly superior to Chandra and XMM-Newton, might not be much superior to ESA’s upcoming Athena mission on such a budget. Its spatial and energy resolution were supposed to be its big selling points; on a reduced budget, it’s hard to see how it will achieve those.

    10
    An artist’s concept of the Origins Space Telescope, with the (architecture 1) 9.1 meter primary mirror. At lower resolutions and sizes, it still offers a tremendous improvement over current-and-previous far-IR observatories. NASA/GSFC

    The best bet might be OST: the Origins Space Telescope, which would represent a huge upgrade over Spitzer, the only other far-infrared observatory NASA has ever sent to space. Its 9.1-meter design is likely impossible at that price point, but a reduction in size is less devastating to this mission. At a lower price tag, it can still teach us a huge amount about space, from our Solar System to exoplanets to black holes to distant, early galaxies. There is no NASA or European counterpart to compete with, and unlike the optical part of the spectrum, it’s notoriously challenging to attempt astronomy at these wavelengths from the ground. The closest we have is the airplane-borne SOFIA, which is fantastic, but has a number of limitations.

    11
    NASA’s Stratospheric Observatory for Infrared Astronomy (SOFIA) with open telescope doors. This joint partnership between NASA and the German organization DLR enables us to take a state-of-the-art infrared telescope to any location on Earth’s surface, allowing us to observe events wherever they occur. NASA / Carla Thomas

    This is NASA. This is the pre-eminent space agency in the world. This is where science, research, development, discovery, and innovation all come together. The spinoff technologies alone justify the investment, but that’s not why we do it. We are here to discover the Universe. We are here to learn all that we can about the cosmos and our place within it. We are here to find out what the Universe looks like and how it came to be the way it is today.

    It’s time for the United States government to step up to the plate and invest in fundamental science in a way the world hasn’t seen in decades. It’s time to stop asking the scientific community to do more with less, and give them a realistic but ambitious goal: to do more with more. If we can afford an ill-thought-out space force, perhaps we can afford to learn about the greatest unexplored natural resource of all: the Universe, and the vast unknowns hiding in the great cosmic ocean.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 4:19 pm on June 19, 2018 Permalink | Reply
    Tags: Sunway TaihuLight China, Why the US and China's brutal supercomputer war matters

    From Wired: “Why the US and China’s brutal supercomputer war matters” 

    Wired logo

    Wired

    19 June 2018
    Chris Stokel-Walker

    ORNL IBM AC922 SUMMIT supercomputer. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

    Thought global arms races are all about ballistic missiles, space or nuclear development? Think again: the new diplomatic frontline is over processing power and computer chips.
    Multi-million dollar projects to eke out an advantage in processing power aren’t really about science, they’re an exercise in soft power.

    A major shift has taken place, with a new claimant to the crown of world’s fastest supercomputer. IBM’s Summit at Oak Ridge National Laboratory in Tennessee uses Power9 CPUs and NVIDIA Tesla V100 GPUs and has 4,608 servers powered by ten petabytes of memory working concurrently to process 200,000 trillion calculations per second – 200 petaflops. That’s a lot of numbers – and here’s one more. Summit’s processing power is 117 petaflops more than the previous record-holder, China’s TaihuLight.
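    A rough sanity check on those headline numbers, using publicly quoted figures for Summit's configuration. The node count, GPUs per node, and per-GPU peak below are assumptions for illustration, not official benchmark results:

```python
# Back-of-envelope peak performance: nodes x GPUs per node x per-GPU peak.
nodes = 4608                 # approximate Summit node count (assumed)
gpus_per_node = 6            # NVIDIA Tesla V100s per node (assumed)
tflops_per_gpu = 7.8         # rough FP64 peak per V100, teraflops (assumed)

peak_pflops = nodes * gpus_per_node * tflops_per_gpu / 1000
print(f"GPU peak: ~{peak_pflops:.0f} petaflops")   # ~216 PF peak; sustained benchmark
                                                   # figures (the ~200 PF above) are lower.

# An "exascale" machine targets 1,000 petaflops, i.e. about 5x Summit's quoted figure.
print(f"Exaflop target vs Summit: {1000 / 200:.0f}x")
```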

    Sunway TaihuLight, China, US News

    While it may seem significant, it’s actually largely symbolic, says Andrew Jones of the Numerical Algorithms Group, a high-performance computing consultancy. “I put no value on being twice as fast or 20 per cent faster other than bragging rights.”

    That’s not to say that supercomputers don’t matter. They are “being driven by science”, says Jack Dongarra, a computer science professor at the University of Tennessee and the compiler of the world’s top 500 supercomputer list. And science is driven today by computer simulation, he adds – with high-powered computers crucial to carry out those tests.

    Supercomputers can crunch data far faster and more easily than regular computers, making them ideal for handling big data – from cybersecurity to medical informatics to astronomy. “We could quite easily go another four or five orders of magnitude and still find scientific and business reasons to benefit from it,” says Jones.

    Oak Ridge, where Summit is housed, is already soliciting bids for a project called Coral II, the successor to the Coral project that produced Summit. Coral II will involve three separate hardware systems, each with a price tag of $600 million, says Dongarra. The goal? To build a supercomputer capable of calculating at exaflop rates – five times faster than Summit.

    While they are faster and more powerful, supercomputers are actually not much different from the hardware we interact with on a daily basis, says Jones. “The basic components are the same as a standard server,” he says. But because of their scale, and the complexity involved in programming them to process information as a single, co-ordinated unit, supercomputer projects require significant financial outlay to build, and political support to attract that funding.

    That political involvement transforms them from a simple computational tool into a way of exercising soft power and stoking intercontinental rivalries.

    With Summit, the US has wrested back the title of the world’s most powerful supercomputer for the first time since 2012 – though it still languishes behind China in terms of overall processing power. China is the home of 202 of the 500 most powerful supercomputers, having overtaken the US in November 2017.

    “What’s quite striking is that in 2001 there were no Chinese machines that’d be considered a supercomputer, and today they dominate,” explains Dongarra. The sudden surge of supercomputers in China over the last two decades is an indication of significant investment, says Jones. “It’s more a reflection of who’s got their lobbying sorted than anything else,” he adds.

    Recently, the Chinese leadership has been drifting away “from an aspirational ‘catch-up with the west’ mentality to aspiring to be world class and to lead,” says Jonathan Sullivan, director of the China Policy Institute at the University of Nottingham. “These achievements like the longest bridge, biggest dam and most powerful supercomputer aren’t just practical solutions, they also have symbolic meaning,” he adds.

    Or putting it differently: bragging rights matter enormously to whoever’s on top.

    [TaihuLight is about 2 years old. The Chinese supercomputer people have not been sitting on their hands. They knew this was coming. We will see how long Summit is at the top.]

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 3:43 pm on June 19, 2018 Permalink | Reply
    Tags: ClusterStor, Cray Introduces All Flash Lustre Storage Solution Targeting HPC, L300F a scalable all-flash storage solution, Lustre 2.11

    From HPC Wire: “Cray Introduces All Flash Lustre Storage Solution Targeting HPC” 

    From HPC Wire

    June 19, 2018
    John Russell

    1

    Citing the rise of IOPS-intensive workflows and more affordable flash technology, Cray today introduced the L300F, a scalable all-flash storage solution whose primary use case is to support high IOPS rates to/from a scratch storage pool in the Lustre file system. Cray also announced that sometime in August it would begin supporting Lustre 2.11, which was released in April. This rapid productizing of Lustre’s latest release is likely to be appreciated by the user community, which sometimes criticizes vendors for being slow to commercialize the latest features of the open-source parallel file system.

    “Lustre 2.11 has been one of the drivers for us because it has unique performance enhancements, usability enhancements, and we think some of those features will pair nicely with a flash-based solution that’s sitting underneath the file system,” said Mark Wiertalla, product marketing director.

    The broader driver is the rise in use cases with demanding IOPS characteristics often including files of small size. Hard disk drives, by their nature, handle these workloads poorly. Cray cites AI, for example, as a good use case with high IOPS requirements.

    2

    Here’s a brief description from Cray of how L300F fits into the Cray ClusterStor systems:

    Unlike the existing building blocks in the ClusterStor family, which use a 5U84 form factor (5 rack units high/84 drive slots) mainly for Hard Disk Drives (HDD), the L300F is a 2U24 form factor filled exclusively with Solid State Drives (SSD).
    Like the existing building blocks (L300 and L300N) the L300F features two embedded server modules in a high availability configuration for the Object Storage Server (OSS) functionality of the open source, parallel file system Lustre.
    Like the existing building blocks, the L300F converges the Lustre Object Storage Servers (OSS) and the Object Storage Targets (OST) in the same building block for linear scalability.
    Like all ClusterStor building blocks the L300F is purpose-engineered to deliver the most effective parallel file system storage infrastructure for the leadership class of supercomputing environments.

    The existing L300 model is an all-HDD Lustre solution, well suited for environments using applications with large, sequential I/O workloads. The L300N model, by contrast, is a hybrid SSD/HDD solution with flash-accelerated NXD software that redirects I/O to the appropriate storage medium, delivering cost-effective, consistent performance on mixed I/O workloads while shielding the application, file system and users from complexity through transparent flash acceleration.

    In positioning L300F, Cray said, “L300F enables users such as engineers, researchers and scientists to dramatically reduce the runtime of their applications allowing jobs to reliably complete within their required schedule, supporting more iterations and faster time to insight. Supplementing Cray’s ClusterStor portfolio with an all-flash storage option, the ClusterStor L300F integrates with and complements the existing L300/L300N models to provide a comprehensive storage architecture. It allows customers to address performance bottlenecks without needlessly overprovisioning HDD storage capacity, creating a cost-competitive solution for improved application run time.”

    3

    Analysts are likewise bullish on flash. “Flash is poised to become an essential technology in every HPC storage solution,” said Jeff Janukowicz, IDC’s Research vice president, Solid State Drives and Enabling Technologies. “It has the unique role of satisfying the high-performance appetite of artificial intelligence applications even while helping customers optimize their storage budget for big data. With the ClusterStor L300F, Cray has positioned itself to be at the leading edge of next generation of HPC storage solutions.”

    According to Cray, the L300F simplifies storage management for storage administrators, allowing them to stand up a high-performance flash pool within their existing Lustre file system using existing tools and skills. “This eliminates the need for product-unique training or to administer a separate file system. Using ClusterStor Manager, administrators can reduce the learning curve and accelerate time-to-proficiency, thereby improving ROI. When coupled with Cray’s exclusive monitoring application Cray View for ClusterStor, administrators get an end-to-end view of Lustre jobs, network status and storage system performance. Cray View for ClusterStor provides visibility into job runtime variability, event correlation, trend analysis and offers custom alerts based on any selected metric,” according to the announcement.

    Price remains an issue for flash. It’s currently about 13X more expensive on a per-terabyte basis. “But when flash is viewed on a dollar per IOPS basis, it is a small fraction of the cost compared to hard disk drives. What our customers are telling us is they have unlocked that secret. Now they can think about use cases and say here’s three of them that make sense immediately. That’s how they will deploy it. They’ll use it as a tactical tool,” said Wiertalla.
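    The dollars-per-terabyte versus dollars-per-IOPS argument is easy to see with a toy comparison. All of the prices and IOPS figures below are hypothetical placeholders chosen only to illustrate the shape of the trade-off, not Cray or market pricing:

```python
# Hypothetical devices: a nearline HDD and an enterprise SSD (~13x the $/TB).
devices = {
    "HDD": {"usd_per_tb": 30.0,  "iops": 200},       # assumed figures
    "SSD": {"usd_per_tb": 390.0, "iops": 100_000},   # assumed figures
}

capacity_tb = 4  # arbitrary device size for the comparison
for name, d in devices.items():
    cost = d["usd_per_tb"] * capacity_tb
    print(f"{name}: ${d['usd_per_tb']:.0f}/TB  ->  ${cost / d['iops']:.4f}/IOPS")

# The SSD costs far more per terabyte but far less per IOPS, which is why flash
# gets deployed tactically for IOPS-bound scratch pools rather than bulk capacity.
```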

    “We see the L300F allowing many customers to start testing the waters with flash storage. We are seeing RFPs [and] we think we are going to see, as the delta in prices between flash and disk narrows over the next 3-5 years, that customers will find incrementally new use cases where flash become cost competitive and they will adopt it gradually. Maybe in the 2020s we’ll start to see customers think about putting file systems exclusively on flash.”

    Given that Cray is approaching the first anniversary of its acquisition of the ClusterStor portfolio, it is likely to showcase the line at ISC2018 (booth #E-921) next week (see the HPCwire article, Cray Moves to Acquire the Seagate ClusterStor Line) and perhaps issue other news in its storage line.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    HPCwire is the #1 news and information resource covering the fastest computers in the world and the people who run them. With a legacy dating back to 1987, HPCwire has delivered world-class editorial and top-notch journalism, making it the portal of choice for science, technology and business professionals interested in high-performance and data-intensive computing. For topics ranging from late-breaking news and emerging technologies in HPC, to new trends, expert analysis, and exclusive features, HPCwire delivers it all and remains the HPC community’s most reliable and trusted resource. Don’t miss a thing – subscribe now to HPCwire’s weekly newsletter recapping the previous week’s HPC news, analysis and information at: http://www.hpcwire.com.

     
  • richardmitnick 3:19 pm on June 19, 2018 Permalink | Reply

    From World Community Grid (WCG): “Microbiome Immunity Project Researchers Create Ambitious Plans for Data” 

    New WCG Logo

    WCGLarge

    From World Community Grid (WCG)

    By: Dr. Tomasz Kościółek and Bryn Taylor
    University of California San Diego
    19 Jun 2018

    Summary
    The Microbiome Immunity Project researchers—from Boston, New York, and San Diego—met in person a few weeks ago to make plans that include a 3D map of the protein universe and other far-ranging uses for the data from the project.


    The research team members pictured above are (from left to right): Vladimir Gligorijevic (Simons Foundation’s Flatiron Institute), Tommi Vatanen (Broad Institute of MIT and Harvard), Tomasz Kosciolek (University of California San Diego), Rob Knight (University of California San Diego), Rich Bonneau (Simons Foundation’s Flatiron Institute), Doug Renfrew (Simons Foundation’s Flatiron Institute), Bryn Taylor (University of California San Diego), Julia Koehler Leman (Simons Foundation’s Flatiron Institute). Visit the project’s Research Participants page for additional team members.

    During the week of May 28, researchers from all Microbiome Immunity Project (MIP) institutions (University of California San Diego, Broad Institute of MIT and Harvard, and the Simons Foundation’s Flatiron Institute) met in San Diego to discuss updates on the project and plan future work.

    Our technical discussions included a complete overview of the practical aspects of the project, including data preparation, pre-processing, grid computations, and post-processing on our machines.

    We were excited to notice that if we keep the current momentum of producing new structures for the project, we will double the universe of known protein structures (compared to the Protein Data Bank) by mid-2019! We also planned how to extract the most useful information from our data, store it effectively for future use, and extend our exploration strategies.

    We outlined three major areas we want to focus on over the next six months.

    Structure-Aided Function Predictions

    We can use the structures of proteins to gain insight into protein function—or what the proteins actually do. Building on research from MIP co-principal investigator Richard Bonneau’s lab, we will extend their state-of-the-art algorithms to predict protein function using structural models generated through MIP. Using this new methodology based on deep learning, akin to the artificial intelligence algorithms of IBM, we hope to see improvements over more simplistic methods and provide interesting examples from the microbiome (e.g., discover new genes creating antibiotic resistance).

    Map of the Protein Universe

    Together we produce hundreds of high-quality protein models every month! To help researchers navigate this ever-growing space, we need to put them into perspective of what we already know about protein structures and create a 3D map of the “protein universe.” This map will illustrate how the MIP has eliminated the “dark matter” from this space one structure at a time. It will also be made available as a resource for other researchers to explore interactively.

    Structural and Functional Landscape of the Human Gut Microbiome

    We want to show what is currently known about the gut microbiome in terms of functional annotations and how our function prediction methods can help us bridge the gap in understanding of gene functions. Specifically, we want to follow up with examples from early childhood microbiome cohorts (relevant to Type-1 diabetes, or T1D) and discuss how our methodology can help us to better understand T1D and inflammatory bowel disease.

    The future of the Microbiome Immunity Project is really exciting, thanks to everyone who makes our research possible. Together we are making meaningful contributions to not one, but many scientific problems!

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Ways to access the blog:
    https://sciencesprings.wordpress.com
    http://facebook.com/sciencesprings

    World Community Grid (WCG) brings people together from across the globe to create the largest non-profit computing grid benefiting humanity. It does this by pooling surplus computer processing power. We believe that innovation combined with visionary scientific research and large-scale volunteerism can help make the planet smarter. Our success depends on like-minded individuals – like you.
    WCG projects run on BOINC software from UC Berkeley.
    BOINCLarge

    BOINC is a leader in the field(s) of Distributed Computing, Grid Computing and Citizen Cyberscience. BOINC is more properly the Berkeley Open Infrastructure for Network Computing.

    BOINC WallPaper

    CAN ONE PERSON MAKE A DIFFERENCE? YOU BET!!

    My BOINC
    MyBOINC
    “Download and install secure, free software that captures your computer’s spare power when it is on, but idle. You will then be a World Community Grid volunteer. It’s that simple!” You can download the software at either WCG or BOINC.

    Please visit the project pages-

    Microbiome Immunity Project

    FightAIDS@home Phase II

    FAAH Phase II
    OpenZika

    Rutgers Open Zika

    Help Stop TB
    WCG Help Stop TB
    Outsmart Ebola together

    Outsmart Ebola Together

    Mapping Cancer Markers
    mappingcancermarkers2

    Uncovering Genome Mysteries
    Uncovering Genome Mysteries

    Say No to Schistosoma

    GO Fight Against Malaria

    Drug Search for Leishmaniasis

    Computing for Clean Water

    The Clean Energy Project

    Discovering Dengue Drugs – Together

    Help Cure Muscular Dystrophy

    Help Fight Childhood Cancer

    Help Conquer Cancer

    Human Proteome Folding

    FightAIDS@Home

    faah-1-new-screen-saver

    faah-1-new

    World Community Grid is a social initiative of IBM Corporation
    IBM Corporation
    ibm

    IBM – Smarter Planet
    sp

     
  • richardmitnick 2:42 pm on June 19, 2018 Permalink | Reply
    Tags: ASKAP, MeerKAT

    From AAAS: “New radio telescope in South Africa will study galaxy formation” 

    AAAS

    From AAAS

    Jun. 19, 2018
    Daniel Clery

    SKA Meerkat telescope, 90 km outside the small Northern Cape town of Carnarvon, SA

    Today, the Square Kilometre Array (SKA), a continent-spanning radio astronomy project, announced that Spain has come on board as the collaboration’s 11th member. That boost will help the sometimes-troubled project as, over the next year or so, it forms an international treaty organization and negotiates funding to start construction. Meanwhile, on the wide-open plains of the Karoo, a semiarid desert northeast of Cape Town, South Africa, part of the telescope is already in place in the shape of the newly completed MeerKAT, the largest and most powerful radio telescope in the Southern Hemisphere.

    The last of 64 13.5-meter dishes was installed late last year, and next month South African President Cyril Ramaphosa will officially open the facility. Spread across 8 kilometers, the dishes have a collecting area similar to that of the great workhorse of astrophysics, the Karl G. Jansky Very Large Array (VLA) near Socorro, New Mexico.

    NRAO/Karl V Jansky VLA, on the Plains of San Agustin fifty miles west of Socorro, NM, USA, at an elevation of 6970 ft (2124 m)

    But with new hardware designs and a powerful supercomputer to process data, the newcomer could have an edge on its 40-year-old northern cousin.

    “For certain studies, it will be the best” in the world, says Fernando Camilo, chief scientist of the South African Radio Astronomy Observatory in Cape Town, which operates MeerKAT. Sensitive across a wide swath of the radio spectrum, MeerKAT can study how hydrogen gas moves into galaxies to fuel star formation. With little experience, South Africa has “a major fantastic achievement,” says Heino Falcke of Radboud University in Nijmegen, the Netherlands.

    MeerKAT, which stands for Karoo Array Telescope along with the Afrikaans word for "more," is one of several precursor instruments for the SKA. The first phase of the SKA could begin in 2020 at a cost of €798 million. It would add another 133 dishes to MeerKAT, extending it across 150 kilometers, and place 130,000 smaller radio antennas across Australia—but only if member governments agree to fully fund the work. Months of delicate negotiations lie ahead. “In every country, people are having that discussion on what funding is available,” Falcke says.

    With MeerKAT’s 64 dishes now in place, engineers are learning how to process the data they gather. In a technique called interferometry, computers correlate the signals from pairs of dishes to build a much sharper image than a single dish could produce. For early science campaigns last year, 16 dishes were correlated. In March, the new supercomputer came online, and the team hopes to be fully operational by early next year. “It’s going to be a challenge,” Camilo says.
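    A toy sketch of what the correlator does. The signal model is deliberately simplified (real processing involves delay tracking, calibration and Fourier inversion): every pair of dishes forms a baseline, and the correlator multiplies and averages the two voltage streams to pull the common sky signal out of the uncorrelated noise.

```python
import numpy as np

n_dishes = 64
baselines = n_dishes * (n_dishes - 1) // 2
print(f"{n_dishes} dishes -> {baselines} simultaneous baselines")  # 2016

# Two simulated dish voltages: a shared source term buried in independent noise.
rng = np.random.default_rng(0)
source = rng.normal(size=100_000)
dish_a = source + rng.normal(size=source.size)
dish_b = source + rng.normal(size=source.size)

visibility = np.mean(dish_a * dish_b)   # one correlator output ("visibility")
print(f"Correlated power: {visibility:.3f}")  # ~1.0, the source power; noise averages away
```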

    MeerKAT’s dishes are smaller than the VLA’s, but having more of them puts it in “a sweet spot of sensitivity and resolution,” Camilo says. Its dishes are split into a densely packed core, which boosts sensitivity, and widely dispersed arms, which increase resolution. The VLA can opt for sensitivity or resolution, but not both at once—and only after the slow process of moving its 27 dishes into a different configuration.

    The combination makes MeerKAT ideal for mapping hydrogen, the fuel of star and galaxy formation. Because of a spontaneous transition in the atoms of neutral hydrogen, the gas constantly emits microwaves with a wavelength of 21 centimeters. Stretched to radio frequencies by the expansion of the universe, these photons land in the telescope’s main frequency band. It should have the sensitivity to map the faint signal to greater distances than before, and the resolution to see the gas moving in and around galaxies.
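    The redshifting is simple to quantify: the 21 cm line is emitted at 1420.4 MHz, and expansion shifts it to nu_obs = 1420.4 MHz / (1 + z). The band edge used below is only an approximate value for MeerKAT's main band, chosen to illustrate the reachable redshift range:

```python
# Observed frequency of the redshifted 21 cm (1420.4 MHz) hydrogen line.
rest_mhz = 1420.4
for nu_obs in (1420.4, 1000.0, 900.0):   # 900 MHz ~ approximate lower band edge (assumed)
    z = rest_mhz / nu_obs - 1
    print(f"{nu_obs:7.1f} MHz observed -> emitted at redshift z = {z:.2f}")
```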

    MeerKAT will also watch for pulsars, dense and rapidly spinning stellar remnants. Their metronomic radio wave pulses serve as precise clocks that help astronomers study gravity in extreme conditions. “By finding new and exotic pulsars, MeerKAT can provide tests of physics,” says Philip Best of the University of Edinburgh. Falcke wants to get a better look at a highly magnetized pulsar discovered in 2013. He hopes it will shed light on the gravitational effects of the leviathan it orbits: the supermassive black hole at the center of the Milky Way.

    Other SKA precursors are taking shape. The Australian SKA Pathfinder (ASKAP) at the Murchison Radio-astronomy Observatory in Western Australia is testing a novel survey technology with its 36 12-meter dishes that could be used in a future phase of the SKA.

    SKA/ASKAP radio telescope at the Murchison Radio-astronomy Observatory (MRO) in Mid West region of Western Australia

    Whereas a conventional radio dish has a single-element detector—the equivalent of a single pixel—the ASKAP’s detectors have 188 elements, which should help it quickly map galaxies across large areas of the sky.

    Nearby is the Murchison Widefield Array (MWA), an array of 2048 antennas, each about a meter across, that look like metallic spiders.

    SKA Murchison Widefield Array, Boolardy station in outback Western Australia, at the Murchison Radio-astronomy Observatory (MRO)

    Sensitive to lower frequencies than MeerKAT, the MWA can pick up the neutral hydrogen signal from as far back as 500 million years after the big bang, when the first stars and galaxies were lighting up the universe. Astronomers have been chasing the faint signal for years, and earlier this year, one group reported a tentative detection. “We’re really curious to see if it can be replicated,” says MWA Director Melanie Johnston-Hollitt of Curtin University in Perth, Australia.

    If the MWA doesn’t deliver a verdict, the SKA, with 130,000 similar antennas, almost certainly will. Although the MWA may detect the universe lighting up, the SKA intends to map out where it happened.

    The American Association for the Advancement of Science is an international non-profit organization dedicated to advancing science for the benefit of all people.

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 2:14 pm on June 19, 2018 Permalink | Reply

    From Science News: “Magnetic fields may be propping up the Pillars of Creation” 


    From Science News

    June 15, 2018
    Emily Conover

    The structure’s internal magnetism could mean the columns of gas and dust will be long-lived.

    1
    PILLAR OF STRENGTH Columns of cosmic gas and dust dubbed the Pillars of Creation (shown in this image from the Hubble Space Telescope) may be propped up by an internal magnetic field. NASA, ESA, Hubble Heritage Team/STScI and AURA

    The Pillars of Creation may keep standing tall due to the magnetic field within the star-forming region.

    For the first time, scientists have made a detailed map of the magnetic field inside the pillars, made famous by an iconic 1995 Hubble Space Telescope image (SN Online: 1/6/15). The data reveal that the field runs along the length of each pillar, perpendicular to the magnetic field outside. This configuration may be slowing the destruction of the columns of gas and dust, astronomer Kate Pattle and colleagues suggest in the June 10 Astrophysical Journal Letters.

    Hot ionized gas called plasma surrounds the pillars, located within the Eagle Nebula about 7,000 light-years from Earth. The pressure from that plasma could cause the pillars to pinch in at the middle like an hourglass before breaking up. However, the researchers suggest, the organization of the magnetic field within the pillars could be providing an outward force that resists the plasma’s onslaught, preventing the columns from disintegrating.
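    A minimal way to see the balance being described, using standard textbook expressions rather than anything taken from the paper itself: the outward support of a field of strength B is its magnetic pressure, which competes with the thermal pressure of the surrounding plasma,

```latex
P_{\mathrm{mag}} = \frac{B^{2}}{2\mu_{0}}, \qquad
P_{\mathrm{th}}  = n\,k_{B}\,T .
```

    Roughly speaking, a pillar whose internal magnetic pressure is comparable to the external plasma pressure can resist being pinched; mapping the field's strength and geometry is what makes that comparison possible.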

    2
    FIELD OF DREAMS A map of the magnetic field within the Pillars of Creation reveals that the orientation of the field runs roughly parallel to each skinny column. White bars indicate the field’s orientation in that location. K. Pattle et al/Astrophysical Journal Letters 2018

    Eagle Nebula NASA/ESA Hubble Public Domain

    The team studied light emitted from the pillars, measuring its polarization — the direction of the wiggling of the light’s electromagnetic waves — using the James Clerk Maxwell Telescope in Hawaii. Dust grains within the pillars are aligned with each other due to the magnetic field. These aligned particles emit polarized light, allowing the researchers to trace the direction of the magnetic field at various spots.
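    In practice the map is built from the Stokes Q and U parameters of the polarized dust emission. A minimal sketch, assuming the usual convention that the plane-of-sky magnetic field is perpendicular to the dust polarization angle (the Stokes values below are toy numbers, not JCMT data):

```python
import numpy as np

def plane_of_sky_field_angle(Q, U):
    """Infer a field orientation (degrees) from Stokes Q, U of dust emission."""
    psi = 0.5 * np.arctan2(U, Q)      # polarization angle, radians
    return np.degrees(psi) + 90.0     # dust-emission B-field is rotated 90 degrees

print(plane_of_sky_field_angle(Q=0.8, U=0.3))  # one map pixel's field orientation
```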

    “There are few clear measurements of the magnetic fields in objects like pillars,” says Koji Sugitani of Nagoya City University in Japan. To fully understand the formation of such objects, more observations are needed, he says.

    Studying objects where stars are born, such as the pillars, could help scientists better understand the role that magnetic fields may play in star formation (SN: 6/9/18, p. 12). “This is really one of the big unanswered questions,” says Pattle, of National Tsing Hua University in Hsinchu, Taiwan. “We just don’t have a very good idea of whether magnetic fields are important and, if they are, what they are doing.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 1:49 pm on June 19, 2018 Permalink | Reply
    Tags: HL-LHC

    From CERN: “Major work starts to boost the luminosity of the LHC” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    From CERN

    1
    Civil works have begun on the ATLAS and CMS sites to build new underground structures for the High-Luminosity LHC. (Image: Julien Ordan / CERN)

    CERN map

    The Large Hadron Collider (LHC) is officially entering a new stage. Today, a ground-breaking ceremony at CERN celebrates the start of the civil-engineering work for the High-Luminosity LHC (HL-LHC): a new milestone in CERN’s history. By 2026 this major upgrade will have considerably improved the performance of the LHC, by increasing the number of collisions in the large experiments and thus boosting the probability of the discovery of new physics phenomena.

    The LHC started colliding particles in 2010. Inside the 27-km LHC ring, bunches of protons travel at almost the speed of light and collide at four interaction points. These collisions generate new particles, which are measured by detectors surrounding the interaction points. By analysing these collisions, physicists from all over the world are deepening our understanding of the laws of nature.

    While the LHC is able to produce up to 1 billion proton-proton collisions per second, the HL-LHC will increase this number, referred to by physicists as “luminosity”, by a factor of between five and seven, allowing about 10 times more data to be accumulated between 2026 and 2036. This means that physicists will be able to investigate rare phenomena and make more accurate measurements. For example, the LHC allowed physicists to unearth the Higgs boson in 2012, thereby making great progress in understanding how particles acquire their mass. The HL-LHC upgrade will allow the Higgs boson’s properties to be defined more accurately, and to measure with increased precision how it is produced, how it decays and how it interacts with other particles. In addition, scenarios beyond the Standard Model will be investigated, including supersymmetry (SUSY), theories about extra dimensions and quark substructure (compositeness).
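    The arithmetic behind "10 times more data" is just the event-yield relation: for a process with cross-section sigma, the expected number of events is set by the integrated luminosity,

```latex
N = \sigma \int \mathcal{L}\, dt .
```

    As a rough illustration (the numbers are approximate, not official projections): with a Higgs production cross-section of order 50 pb and an HL-LHC integrated luminosity of roughly 3000 fb⁻¹, that gives N ≈ 5×10⁴ fb × 3000 fb⁻¹ ≈ 1.5×10⁸ Higgs bosons, consistent with the tenfold increase in data quoted above.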

    “The High-Luminosity LHC will extend the LHC’s reach beyond its initial mission, bringing new opportunities for discovery, measuring the properties of particles such as the Higgs boson with greater precision, and exploring the fundamental constituents of the universe ever more profoundly,” said CERN Director-General Fabiola Gianotti.

    The HL-LHC is an international endeavour involving 29 institutes from 13 countries. The project began in November 2011 and two years later was identified as one of the main priorities of the European Strategy for Particle Physics, before being formally approved by the CERN Council in June 2016. After successful prototyping, many new hardware elements will be constructed and installed in the years to come. Overall, more than 1.2 km of the current machine will need to be replaced with many new high-technology components such as magnets, collimators and radiofrequency cavities.

    2
    Prototype of a quadrupole magnet for the High-Luminosity LHC. (Image: Robert Hradil, Monika Majer/ProStudio22.ch)

    FNAL magnets such as this one, which is mounted on a test stand at Fermilab, for the High-Luminosity LHC Photo Reidar Hahn

    The secret to increasing the collision rate is to squeeze the particle beam at the interaction points so that the probability of proton-proton collisions increases. To achieve this, the HL-LHC requires about 130 new magnets, in particular 24 new superconducting focusing quadrupoles to focus the beam and four superconducting dipoles. Both the quadrupoles and dipoles reach a field of about 11.5 tesla, as compared to the 8.3 tesla dipoles currently in use in the LHC. Sixteen brand-new “crab cavities” will also be installed to maximise the overlap of the proton bunches at the collision points. Their function is to tilt the bunches so that they appear to move sideways – just like a crab.
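    The "squeeze" has a compact expression. The luminosity of a bunched colliding-beam machine is usually written as

```latex
\mathcal{L} \;=\; \frac{N_b^{2}\, n_b\, f_{\mathrm{rev}}\, \gamma_r}
{4\pi\, \varepsilon_n\, \beta^{*}}\; R ,
```

    where N_b is the number of protons per bunch, n_b the number of bunches, f_rev the revolution frequency, gamma_r the relativistic factor, epsilon_n the normalized emittance, beta* the optical function at the interaction point, and R ≤ 1 a geometric reduction factor from the beam crossing angle. The new focusing quadrupoles attack beta* (a tighter squeeze at the interaction points), while the crab cavities recover much of R, which is how the factor of five to seven in luminosity is meant to be reached.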

    FNAL Crab cavities for the HL-LHC

    CERN crab cavities that will be used in the HL-LHC

    Another key ingredient in increasing the overall luminosity in the LHC is to enhance the machine’s availability and efficiency. For this, the HL-LHC project includes the relocation of some equipment to make it more accessible for maintenance. The power converters of the magnets will thus be moved into separate galleries, connected by new innovative superconducting cables capable of carrying up to 100 kA with almost zero energy dissipation.

    “Audacity underpins the history of CERN and the High-Luminosity LHC writes a new chapter, building a bridge to the future,” said CERN’s Director for Accelerators and Technology, Frédérick Bordry. “It will allow new research and with its new innovative technologies, it is also a window to the accelerators of the future and to new applications for society.”

    To allow all these improvements to be carried out, major civil-engineering work at two main sites is needed, in Switzerland and in France. This includes the construction of new buildings, shafts, caverns and underground galleries. Tunnels and underground halls will house new cryogenic equipment, the electrical power supply systems and various plants for electricity, cooling and ventilation.

    During the civil engineering work, the LHC will continue to operate, with two long technical stop periods that will allow preparations and installations to be made for high luminosity alongside yearly regular maintenance activities. After completion of this major upgrade, the LHC is expected to produce data in high-luminosity mode from 2026 onwards. By pushing the frontiers of accelerator and detector technology, it will also pave the way for future higher-energy accelerators.


    The LHC will receive a major upgrade and transform into the High-Luminosity LHC over the coming years. But what does this mean and how will its goals be achieved? Find out in this video featuring several people involved in the project. (Video: Polar Media/CERN.)

    Fermilab is leading the U.S. contribution to the HL-LHC, in addition to building new components for the upgraded detector for the CMS experiment. The main innovation contributed by the United States for the HL-LHC is a novel type of accelerator cavity that uses breakthrough superconducting technology.

    Fermilab is also contributing to the design and construction of superconducting magnets that will focus the particle beam much more tightly than the magnets currently in use in the LHC. Fermilab scientists and engineers have also partnered with other CMS collaborators on new designs for tracking modules in the CMS detector, enabling it to respond more quickly to the increased number of collisions in the HL-LHC.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

    OTHER PROJECTS AT CERN

    CERN AEGIS

    CERN ALPHA

    CERN AMS

    CERN ASACUSA

    CERN ATRAP

    CERN AWAKE

    CERN CAST Axion Solar Telescope

    CERN CLOUD

    CERN COMPASS

    CERN DIRAC

    CERN ISOLDE

    CERN LHCf

    CERN NA62

    CERN NTOF

    CERN TOTEM

    CERN UA9

     
  • richardmitnick 12:56 pm on June 19, 2018 Permalink | Reply
    Tags: The paleo-detector

    From astrobites: “A Paleo-Detector for Dark Matter: How Ancient Rocks Could Help Unravel the Mystery” 

    Astrobites bloc

    From astrobites

    Title: Searching for Dark Matter with Paleo-Detectors
    Authors: S. Baum, A. K. Drukier, K. Freese, M. Górski, & P. Stengel
    First Author’s Institution: The Oskar Klein Centre for Cosmoparticle Physics, Department of Physics, Stockholm University, Sweden
    1
    Status: Pre-print available [open access on arXiv]

    Dark matter is, by its very nature, elusive. Though we can detect its presence by observing its gravitational influence, dark matter remains invisible because it doesn’t interact electromagnetically. The most widely accepted explanation for dark matter is the existence of weakly interacting massive particles (WIMPs). WIMPs, if eventually observed, would constitute a new, massive kind of elementary particle. Their discovery would be revolutionary for particle physics and cosmology; therefore, countless direct (in labs) and indirect (observing their annihilation or decay) detection experiments are being conducted to identify them. Today’s astrobite discusses a novel proposal for direct dark matter detection that seems more fit for scientists in Jurassic Park than for particle physicists: the paleo-detector.

    The authors of today’s featured paper theorize that ancient rocks could contain evidence of interactions between WIMPs and nuclei in the minerals, forming a completely natural “detector” that would allow scientists to search for evidence of the massive particles using excavated rocks. This approach differs significantly from other direct detection efforts, which look for evidence of WIMPs hitting Earth-based detectors in real time. The paleo-detector would instead trace nanometer-scale “tracks” of chemical and physical change in the rocks, the result of WIMP-induced nuclear recoils that occurred long ago.
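    The nanometer scale quoted above follows from simple scattering kinematics. For elastic WIMP-nucleus scattering, the maximum recoil energy a nucleus of mass m_N can receive from a WIMP of mass m_chi moving at speed v is

```latex
E_R^{\max} = \frac{2\,\mu^{2} v^{2}}{m_N},
\qquad \mu = \frac{m_\chi\, m_N}{m_\chi + m_N} .
```

    With illustrative values (m_chi ≈ m_N ≈ 100 GeV/c² and v ≈ 10⁻³ c, typical of the galactic halo), E_R^max comes out at a few tens of keV; a heavy nucleus recoiling with that energy stops within roughly tens to hundreds of nanometers in a mineral, which is the track-length scale the paleo-detector idea relies on. These numbers are order-of-magnitude assumptions, not values taken from the paper.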

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    What do we do?

    Astrobites is a daily astrophysical literature journal written by graduate students in astronomy. Our goal is to present one interesting paper per day in a brief format that is accessible to undergraduate students in the physical sciences who are interested in active research.
    Why read Astrobites?

    Reading a technical paper from an unfamiliar subfield is intimidating. It may not be obvious how the techniques used by the researchers really work or what role the new research plays in answering the bigger questions motivating that field, not to mention the obscure jargon! For most people, it takes years for scientific papers to become meaningful.
    Our goal is to solve this problem, one paper at a time. In 5 minutes a day reading Astrobites, you should not only learn about one interesting piece of current work, but also get a peek at the broader picture of research in a new area of astronomy.

     
  • richardmitnick 12:23 pm on June 19, 2018 Permalink | Reply
    Tags: Waiting for a sign

    From Symmetry: “Waiting for a sign” 

    Symmetry Mag
    From Symmetry

    06/19/18
    Diana Kwon

    Some scientists spend decades trying to catch a glimpse of a rare process. But with good experimental design and a lot of luck, they often need only a handful of signals to make a discovery.

    In 2009, University of Naples physicist Giovanni de Lellis had a routine. Almost every day, he would sit at a microscope to examine the data from his experiment, the Oscillation Project with Emulsion-tRacking Apparatus, or OPERA, located in Gran Sasso, Italy. He was seeking the same thing he had been looking for since 1996, when he was with the CHORUS experiment at CERN: a tau neutrino.

    OPERA at Gran Sasso

    CHORUS installation at CERN

    More specifically, he was looking for evidence of a muon neutrino oscillating into a tau neutrino.

    Neutrinos come in three flavors: electron, muon and tau. At the time, scientists knew that they oscillated, changing flavors as they traveled at close to the speed of light. But they had never seen a muon neutrino transform into a tau neutrino.

    Until November 30, 2009. On that day, de Lellis and the rest of the OPERA collaboration spotted their first tau neutrino in a beam of muon neutrinos coming from the CERN research center, 730 kilometers away.

    “Normally, what you would do is look and look, and nothing comes,” says de Lellis, now spokesperson for the OPERA collaboration. “So it’s quite an exciting moment when you spot your event.”

    For physicists seeking rare events, patience is key. Experiments like these often involve many years of waiting for a signal to appear. Some phenomena, such as neutrinoless double-beta decay, proton decay and dark matter, continue to elude researchers, despite decades of searching.

    Still, scientists hope that after the lengthy wait, there will be a worthwhile reward. Finding neutrinoless double-beta decay would let researchers know that neutrinos are actually their own antiparticles and help explain why there’s more matter than antimatter. Discovering proton decay would test several grand unified theories—and let us know that one of the key components of atoms doesn’t last forever. And discovering dark matter would finally tell us what makes up about a quarter of the mass and energy in the universe.

    “These are really hard experiments,” says Reina Maruyama, a physicist at Yale University working on the neutrinoless double-beta decay experiment CUORE (Cryogenic Underground Observatory for Rare Events), as well as a number of direct dark matter searches.

    CUORE experiment, at the Italian National Institute for Nuclear Physics’ (INFN’s) Gran Sasso National Laboratories (LNGS) in Italy, a search for neutrinoless double-beta decay

    “But they will help answer really fundamental questions that have implications for how the universe was put together.”

    Seeking signs, cutting noise

    For the OPERA collaboration, finding a likely tau neutrino candidate was just the beginning. Hours of additional work, including further analyses and verification from other scientists, were required to confirm that the signal didn’t originate from another source.

    Luckily, the first signal passed all the checks, and the team was able to observe four more candidate events in the following years. By 2015, the team had gathered enough data to confidently confirm that muon neutrinos had transformed into tau neutrinos. More specifically, they achieved a 5σ result, the gold standard of detection in particle physics, meaning there was only about a 1 in 3.5 million chance of seeing a signal at least that strong if it were merely a statistical fluke.
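
    That “1 in 3.5 million” figure is simply the one-sided tail probability of a standard normal distribution beyond five standard deviations, which is easy to verify:

```python
from scipy.stats import norm

# One-sided tail probability beyond 5 standard deviations of a standard normal:
# the conventional "5 sigma" discovery threshold in particle physics.
p_value = norm.sf(5)                          # P(Z > 5) ~ 2.87e-07
print(f"p-value: {p_value:.2e}")
print(f"about 1 in {1 / p_value:,.0f}")       # ~1 in 3.5 million
```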

    For some experiments, seeing as few as two or three events could be enough to make a discovery, says Tiziano Camporesi, a physicist working on the CMS experiment at CERN.

    CERN/CMS Detector

    This was true when scientists at CERN’s Super Proton Synchrotron discovered the Z boson, a neutral elementary particle carrying the weak force, in 1983. “The Z boson discovery was basically made looking at three events,” Camporesi says, “but these three events were so striking that no other kind of particle being produced at the accelerator at the time could fake it.”

    Z boson depiction

    There are a number of ways scientists can improve their odds of catching an elusive event. In general, they can boost signals by making their detectors bigger and by improving the speed and precision with which they record incoming events.

    But a lot depends on background noise: How prevalent are other phenomena that could create a false signal that looks like the one the scientists are searching for?

    When it comes to rare events, scientists often have to go to great lengths to eliminate—or at least reduce—all sources of potential background noise. “Designing an experiment that is immune to background is challenging,” says Augusto Ceccucci, spokesperson for NA62, an experiment searching for an extremely rare kaon decay.

    For its part, NA62 scientists remove background noise by, for example, studying only the decay products that coincide in time with the passage of incoming particles from a kaon beam, and carefully identifying the characteristics of signals that could mimic what they’re looking for so they can eliminate them.

    The Super Cryogenic Dark Matter Search experiment, or SuperCDMS, led by SLAC National Accelerator Laboratory, goes to great lengths to protect its detectors from cosmic rays, particles that regularly rain down on Earth from space.

    SLAC SuperCDMS, at SNOLAB (Vale Inco Mine, Sudbury, Canada)

    To eliminate this source of background, scientists shield the detectors with iron, ship them by ground and sea, and operate them deep underground. “So it would not take many dark matter particles detected to satisfy the 5-sigma detection rule,” says Fermilab’s Dan Bauer, spokesperson for SuperCDMS.
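
    Bauer’s point follows from simple Poisson counting: when the expected background is close to zero, even a handful of observed events is wildly improbable as a fluke. Below is a minimal sketch with purely illustrative background levels, not numbers from SuperCDMS.

```python
from scipy.stats import norm, poisson

def counting_significance(n_obs, expected_background):
    """Gaussian-equivalent significance of observing n_obs events when only
    expected_background events are expected from known (non-signal) sources."""
    p = poisson.sf(n_obs - 1, expected_background)   # P(N >= n_obs | background)
    return norm.isf(p)                               # one-sided z-score

# Illustrative background levels only: with ~0.01 expected background events,
# observing just three events already clears the 5-sigma threshold.
for b in (1.0, 0.1, 0.01):
    print(f"background = {b:5.2f}  ->  {counting_significance(3, b):.1f} sigma")
```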

    At particle accelerators, the search for rare phenomena looks a little different. Rather than simply waiting for a particle to show up in a detector, physicists try to create them in particle collisions. The more elusive a phenomenon is, the more collisions it requires to find. Thus, at the Large Hadron Collider, “in order to achieve smaller and smaller probability of production, we’re getting more and more intense beams,” Camporesi says.
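
    The underlying arithmetic is that the expected yield of a process is its cross-section times the integrated luminosity (times the selection efficiency), so a process a thousand times rarer needs roughly a thousand times more data to produce the same number of events. A toy sketch with made-up numbers:

```python
def expected_events(cross_section_fb, integrated_lumi_fb_inv, efficiency=1.0):
    """Expected yield: N = cross-section x integrated luminosity x efficiency."""
    return cross_section_fb * integrated_lumi_fb_inv * efficiency

# Purely illustrative numbers: a 1 fb process seen with 50% efficiency in
# 150 fb^-1 of data gives ~75 events; a process 1000x rarer gives ~0.075,
# which is why rarer signals demand ever more intense beams and more data.
print(expected_events(1.0, 150.0, 0.5))      # ~75
print(expected_events(0.001, 150.0, 0.5))    # ~0.075
```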

    Triangulating the results of different experiments can help scientists build a picture of the particles or processes they’re looking for without actually finding them. For example, by understanding what dark matter is not, physicists can constrain what it could be. “You take combinations of different experiments and you start rejecting different hypotheses,” Maruyama says.

    Only time will tell whether scientists will be able to detect neutrinoless double-beta decay, proton decay, dark matter or other rare events that have yet to be spotted at physicists’ detectors. But once they do, and once scientists know what specific signatures to look for, Maruyama says, “it becomes a lot easier to look for these things, and you can go ahead and study the heck out of them.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 11:52 am on June 19, 2018 Permalink | Reply
    Tags: , , , Halina Abramowicz, , , , ,   

    From Symmetry: Women in STEM - “Q&A: Planning Europe’s physics future,” Halina Abramowicz

    Symmetry Mag
    From Symmetry

    06/13/18
    Lauren Biron

    1
    Artwork by Sandbox Studio, Chicago

    Halina Abramowicz leads the group effort to decide the future of European particle physics.

    Physics projects are getting bigger, more global, more collaborative and more advanced than ever—with long lead times for complex physics machines. That translates into more international planning to set the course for the future.

    In 2014, the United States particle physics community set its priorities for the coming years using recommendations from the Particle Physics Project Prioritization Panel, or P5.

    FNAL Particle Physics Project Prioritization Panel -P5

    In 2020, the European community will refresh its vision with the European Strategy Update for Particle Physics.

    The first European strategy launched in 2006 and was revisited in 2013. In 2019, teams will gather input through planning meetings in preparation for the next refresh.

    Halina Abramowicz, a physicist who works on the ATLAS experiment at CERN’s Large Hadron Collider and the FCAL research and development collaboration through Tel Aviv University, is the chair of the massive undertaking. During a visit to Fermilab to provide US-based scientists with an overview of the process, she sat down with Symmetry writer Lauren Biron to discuss the future of physics in Europe.

    LB: What do you hope to achieve with the next European Strategy Update for Particle Physics?
    HA: Europe is a very good example of the fact that particle physics is very international, because of the size of the infrastructure that we need to progress, and because of the financial constraints.

    The community of physicists working on particle physics is very large; Europe has probably about 10,000 physicists. They have different interests, different expertise, and somehow, we have to make sure to have a very balanced program, such that the community is satisfied, and that at the same time it remains attractive, dynamic, and pushing the science forward. We have to take into account the interests of various national programs, universities, existing smaller laboratories, CERN, and make sure that there is a complementarity, a spread of activities—because that’s the way to keep the field attractive, that is, to be able to answer more questions faster.

    LB: How do you decide when to revisit the European plan for particle physics?
    HA: Once the Higgs was discovered, it became clear that it was time to revisit the strategy, and the first update happened in 2013. The recommendation was to vigorously pursue the preparations for the high-luminosity upgrade of the [Large Hadron Collider].

    The high-luminosity LHC program was formally approved by the CERN Council in September 2016. By the end of 2018, the LHC experiments will have collected almost a factor of 10 more data. It will be a good time to reflect on the latest results, to think about mid-term plans, to discuss what are the different options to consider next and their possible timelines, and to ponder what would make sense as we look into the long-term future.

    CERN HL-LHC map

    Machines, Projects and Experiments operating at CERN LHC and CLIC at three levels of power

    The other aspect which is very important is the fact that the process is called “strategy,” rather than “roadmap,” because it is a discussion not only of the scientific goals and associated projects, but also of how to achieve them. The strategy basically is about everything that the community should be doing in order to achieve the roadmap.

    LB: What’s the difference between a strategy and a roadmap?
    HA: The roadmap is about prioritizing the scientific goals and about the way to address them, while the strategy covers also all the different aspects to consider in order to make the program a success. For example, outreach is part of the strategy. We have to make sure we are doing something that society knows about and is interested in. Education: making sure we share our knowledge in a way which is understandable. Detector developments. Technology transfer. Work with industry. Making sure the byproducts of our activities can also be used for society. It’s a much wider view.

    LB: What is your role in this process?
    HA: The role of the secretary of the strategy is to organize the process and to chair the discussions so that they proceed in an orderly way. At this stage, we have one year to prepare all the elements of the process that are needed, that is, to collect the input. In the near future we will have to nominate people for the physics preparatory group that will help us organize the open symposium, which is basically the equivalent of a town-hall meeting.

    The hope is that if it’s well organized and we can reach a consensus, especially on the most important aspects, the outcome will come from the community. We have to make sure through interaction with the European community and the worldwide community that we aren’t forgetting anything. The more inputs we have, the better. It is very important that the process be open.

    The first year we debate the physics goals and try to organize the community around a possible plan. Then comes the process that is maybe a little shorter than a year, during which the constraints related to funding and interests of various national communities have to be integrated. I’m of course also hoping that we will get, as an input to the strategy discussions, some national roadmaps. It’s the role of the chair to keep this process flowing.

    LB: Can you tell us a little about your background and how you came to serve as the chair for European Strategy Update?
    HA: That’s a good question. I really don’t know. I did my PhD in 1978; I was one of the youngest PhDs at Warsaw University, so I’ve spent 40 years in the field. That means that I have participated in at least five large experiments and at least two or three smaller projects. I have a very broad view, not necessarily a deep view, but a broad view of what’s happening.

    LB: There are major particle physics projects going on around the world, like DUNE in the US and Belle II in Japan. How much will the panel look beyond Europe to coordinate activities, and how will it incorporate feedback from scientists on those projects?

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    KEK Belle 2 detector, in Tsukuba, Ibaraki Prefecture, Japan

    HA: This is one of the issues that was very much discussed during my visit. We shouldn’t try to organize the whole world—in fact, a little bit of competition is very healthy. And complementarity is also very important.

    At the physics-level discussions, we’ll make sure that we have representatives from the United States and other countries so we are provided with all the information. As I was discussing with many people here, if there are ideas, experiments or existing collaborations which already include European partners, then of course, there is no issue [because the European partners will provide input to the strategy].

    LB: How do you see Europe working with Asia, in particular China, which has ambitions for a major collider?
    HA: Collaboration is very important, and at the global level we have to find the right balance between competition, which is stimulating, and complementarity. So we’re very much hoping to have one representative from China in the physics preparatory group, because China seems to have ambitions to realize some of the projects which have been discussed. And I’m not talking only about the equivalent of [the Future Circular Collider]; they are also thinking about an [electron-positron] circular collider, and there are also other projects that could potentially be realized in China. I also think that if the Chinese community decides on one of these projects, it may need contributions from around the world. Funding is an important aspect for any future project, but it is also important to reach a critical mass of expertise, especially for large research infrastructures.

    LB: This is a huge effort. What are some of the benefits and challenges of meeting with physicists from across Europe to come up with a single plan?
    HA: The benefits are obvious. The more input we have, the fuller the picture we have, and the more likely we are to converge on something that satisfies maybe not everybody, but at least the majority—which I think is very important for a good feeling in the community.

    The challenges are also obvious. On one hand, we rely very much on individuals and their creative ideas. These are usually the people who also happen to be the big pushers and tend to generate most controversies. So we will have to find a balance to keep the process interesting but constructive. There is no doubt that there will be passionate and exciting discussions that will need to happen; this is part of the process. There would be no point in only discussing issues on which we all agree.

    The various physics communities, in the ideal situation, get organized. We have the neutrino community, [electron-positron collider] community, precision measurements community, the axion community—and here you can see all kinds of divisions. But if these communities can get organized and come up with what one could call their own white paper, or what I would call a 10-page proposal, of how various projects could be lined up, and what would be the advantages or disadvantages of such an approach, then the job will be very easy.

    LB: And that input is what you’re aiming to get by December 2018?
    HA: Yes, yes.

    LB: How far does the strategy look out?
    HA: It doesn’t have an end date. This is why one of the requests for the input is for people to estimate the time scale—how much time would be needed to prepare and to realize the project. This will allow us to build a timeline.

    We have at present a large project that is approved: the high-luminosity LHC. This will keep an important part of our community busy for the next 10 to 20 years. But will the entire community remain fully committed for the whole duration of the program if there are no major discoveries?

    I’m not sure that we can be fed intellectually by one project. I think we need more than one. There’s a diversity program—diversity in the sense of trying to maximize the physics output by asking questions which can be answered with the existing facilities. Maybe this is the time to pause and diversify while waiting for the next big step.

    LB: Do you see any particular topics that you think are likely to come up in the discussion?
    HA: There are many questions on the table. For example, should we go for a proton-proton or an [electron-positron] program? There are, for instance, voices advocating for a dedicated Higgs factory, which would allow us to make measurements of the Higgs properties to a precision that would be extremely hard to achieve at the LHC. So we will have to discuss if the next machine should be an [electron-positron] machine and check whether it is realistic and on what time scale.

    One of the subjects that I’m pretty sure will come up as well is about pushing the accelerating technologies. Are we getting to the limit of what we can do with the existing technologies, and is it time to think about something else?

    To learn more about the European Strategy Update for Particle Physics, watch Abramowicz’s colloquium at Fermilab.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     