Tagged: Nature Magazine

  • richardmitnick 8:02 pm on March 21, 2019 Permalink | Reply
    Tags: "Gigantic EU research programme takes shape" (U.S.- they are eating your lunch), Horizon Europe will fund a mix of academia–industry collaborations and discovery science, Innovation innovation innovation, Nature Magazine   

    From Nature: “Gigantic EU research programme takes shape” (U.S.- they are eating your lunch) 

    Nature Mag
    From Nature

    20 March 2019

    Horizon Europe will fund a mix of academia–industry collaborations and discovery science — but its proposed budget of €100 billion has yet to be agreed.

    The European Parliament Building, Brussels, Belgium, 22 December 2017, Steven Lek

    The European Union’s three governing institutions — the European parliament, council and commission — reached agreement in the small hours of 20 March on the outline of the EU’s next seven-year research-funding programme, Horizon Europe.

    Like its predecessor, Horizon 2020, the new programme will fund collaborations between academia and industry, and prestigious discovery science. But the agreement also includes some fresh ideas, including a greater focus on innovation and initiatives to help poorer nations compete for funds.

    One big element that is yet to be decided is the budget for Horizon Europe — due to launch in 2021 — which has been proposed at around €100 billion (US$114 billion) and is expected to be the largest EU research programme yet.

    “Europe wants to go big on research,” says Christian Ehler, a Member of the European Parliament from Germany and one of the rapporteurs for Horizon Europe.

    The agreement marks the end of a series of tough negotiations between the three EU bodies. Talks began in January to resolve sticking points in the commission’s original proposal, which was published last June. The framework’s structure must please both the parliament and the EU’s individual member states.

    The agreement’s details show that at least half of Horizon Europe’s money will be spent on collaborative programmes, in which academic scientists, research institutes and industry work together.

    These will include heavily financed ‘mission’ projects that target specific societal problems, akin to the billion-euro flagship schemes in the current EU research programme, Horizon 2020, that focus on the brain, graphene and quantum technologies. The topics of Horizon Europe’s missions are yet to be decided.

    Most of the rest of the money will go to familiar, prestigious programmes for discovery science, such as the European Research Council and the Marie Skłodowska-Curie Actions scheme — which trains young scientists and promotes international mobility — as well as to programmes to support innovation.

    Innovation, innovation, innovation

    Horizon Europe has a greater focus than its predecessor on innovation: a pumped-up European Innovation Council will invest in small and medium-sized technology companies, and provide competitive grants and other forms of support. The council will work alongside the established European Institute of Innovation and Technology, which supports large communities of scientists in industry and academia to develop innovative products or services.

    New elements in Horizon Europe include programmes aimed at supporting collaboration between museums in EU nations; a fast-track application procedure to develop innovative ideas proposed by the scientific community; and special initiatives to help former-communist countries to compete for research funds.

    The agreement must still be formally approved by the full European Parliament and the council. As well as the budget, it leaves open a key but sensitive decision — the rules under which non-EU member states will be able to participate.

    The three EU institutions want much of Horizon Europe, and particularly the parts relating to global societal challenges, to be open to scientists around the globe. But how this will be organized depends on final budget agreements. The European Commission originally proposed a budget of €94.1 billion, a 22% increase on Horizon 2020’s funds, but the parliament has called for €120 billion.
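    As a quick sanity check on the figures above, a proposal of €94.1 billion described as a 22% increase implies a Horizon 2020 baseline of roughly €77 billion. A minimal arithmetic sketch, using only the numbers quoted in the article:

```python
# Implied Horizon 2020 baseline from the figures quoted above.
proposed = 94.1   # Commission's proposal, billions of euros
increase = 0.22   # "a 22% increase on Horizon 2020's funds"

baseline = proposed / (1 + increase)
print(f"implied Horizon 2020 budget: ~EUR {baseline:.0f} billion")  # ~77
```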

    The EU institutions will consider these aspects again after the European Parliament elections in May, but are unlikely to reach a decision before the end of the year.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Nature is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

  • richardmitnick 9:34 am on October 23, 2018 Permalink | Reply
    Tags: , , , KAUST, Nature Magazine   

    KAUST via Nature Magazine: “The holistic approach to catalyzing change” 




    Nature Magazine

    Rethinking industry-scale catalytic processes could slash global energy consumption and even turn carbon dioxide into a valuable commodity.

    Sep 6, 2018


    Jorge Gascon
    Professor/Center Director

    Chemical catalysts don’t spring to mind as revolutionary materials, yet Jorge Gascon, director of the KAUST Catalysis Center, says catalysts have sparked some of the biggest revolutions in human history. Take the Haber-Bosch process, for example. This first practical method for industrial synthetic fertilizer production, developed in the early 1900s, triggered the agricultural revolution that fuels farming today.

    Catalysis research is poised to change the world again, Gascon claims. “We are about to have another revolution in the way we use our resources and in the way we produce and store energy, and I believe catalysis will play a huge role,” he says. “We at KAUST are in an excellent position to contribute strongly to that transition.”

    Gascon’s research—and that of the Center he has led since joining KAUST in October 2017—revolves around sustainability. “The main purpose of my group is to develop and deploy sustainable technologies for the production of chemicals, energy carriers and new environmental applications. Process intensification, feedstock efficiency and reduction of energy usage are our main objectives.”

    For example, the team recently gained insights that could significantly enhance the performance of catalysts that convert methanol into major chemical feedstocks called olefins [1,2]. These high-demand chemicals are traditionally sourced from oil, but new catalysts—which Gascon’s work is helping to make more efficient—enable olefin production from coal and natural gas, alleviating a bottleneck in olefin supply.

    Another major area of focus in Gascon’s lab, as well as other labs in the Center, is to develop catalysts that can efficiently turn carbon dioxide into a valuable chemical feedstock. The team has developed several catalysts that can combine CO2 with hydrogen, converting the troublesome greenhouse gas into a range of useful small hydrocarbon molecules.

    At the moment, the hydrogen for the process comes from natural gas in such a way that it generates CO2. “If the situation changes and we start to use solar energy to produce hydrogen from water, then that hydrogen can be used to make very useful products out of carbon dioxide,” Gascon says. Should governments introduce a tax on carbon dioxide emissions, recycling CO2 would become even more favorable. “Our main target is to make those technologies as efficient as possible so it becomes attractive to valorize carbon dioxide.”

    The catalysts Gascon works with are typically porous crystalline solids, such as zeolites and metal-organic frameworks. “I like these materials because working with crystalline structures gives you much more control over design,” Gascon says. The structures of these materials can be tuned at the nanoscale. By making such changes and noting the effects on catalytic performance, it is possible to gain deep insights into how the catalysts function and thus how they can be improved. “Being able to explain a thing you can measure at the macroscale, by the structures that you build at the nanoscale, is super nice,” Gascon says.

    The great strength of the Catalysis Center is that there are researchers focused on every aspect of catalytic reaction development and implementation, Gascon adds. “We design new active sites at the nanoscale, but we also design how the catalyst particles should look, and now we are starting to design how reactors should look,” Gascon says. “We are starting to have a holistic approach. I think the Catalysis Center is probably unique in that we are able to cover almost every relevant aspect in catalysis.”

    One of the Center’s flagship projects, which began its second phase in early 2018, is the one-step conversion of crude oil to chemicals. The project illustrates the power of the holistic approach. Today, refineries pass crude oil through cleaning steps, then separate the oil into various chemical fractions, before those fractions are catalytically processed to form chemical feedstocks and fuels. “We want to avoid all those initial steps and go directly to the processing part,” says Gascon. Cutting these steps could save a lot of energy.

    To directly make chemicals from crude oil, you need catalysts that are very robust and resistant to poisoning by contaminants in the oil. But for the process to be successful, the team needs to go far beyond the catalyst itself. “You need to think of different reactor concepts to the ones that are used at the moment,” Gascon says. “You need to redesign the whole process. This is the type of research where I believe our Center can make a difference.”

    The project is a revolutionary idea in the best tradition of catalysis research. And the unique funding structure, facilities and expertise at KAUST make the Catalysis Center the place to do it, says Gascon. “From a research point of view, this is like Disneyland,” he says. “The possibilities here are absolutely amazing. This is probably the only place in the world where you are your own limit.”

    See the full article here.



  • richardmitnick 11:59 am on September 17, 2018 Permalink | Reply
    Tags: 2015 Paris climate agreement, , Australia has no climate-change policy — again, , , Nature Magazine   

    From Nature: “Australia has no climate-change policy — again” 

    From Nature

    Scientists say the country will now struggle to meet its commitments to the Paris agreement.

    17 September 2018
    Adam Morton

    Large parts of Australia are enduring a crippling drought. Credit: David Gray/Reuters

    Australia’s new prime minister has abandoned the country’s policy for cutting greenhouse-gas emissions. Climate scientists say the move means the government has effectively dropped its commitment to the 2015 Paris climate agreement.

    “They’ve walked away from Paris without saying it, hoping no one would notice,” says Lesley Hughes, a climate-change scientist at Macquarie University in Sydney. Without a policy to cut carbon dioxide pollution, the government is dropping its international commitment by default, she says.

    Australia now becomes the second advanced economy after the United States to drop emissions-reduction policies since the 2015 Paris climate conference. President Donald Trump signed an executive order to start removing climate regulations in March 2017 and pulled the US out of the Paris agreement in June 2017.

    Australia’s effective abandonment of Paris can be traced back to late August, when the ruling conservative Liberal Party abruptly replaced former leader Malcolm Turnbull with Prime Minister Scott Morrison. The leadership change came after some party members objected to a policy that would have required electricity companies to meet emissions targets. Morrison subsequently said that he was abandoning the policy, called the National Energy Guarantee (NEG), and would instead focus on reducing the cost of energy for the public.

    The NEG is the fourth national climate policy rejected by Australia’s conservative government since it was elected in 2013, and comes as large parts of the country feel the effects of global warming — a crippling drought grips the eastern states and dozens of bushfires have erupted unseasonably early in those regions.

    Some government members have even suggested that the country should join the Trump administration in officially withdrawing from the Paris agreement. Morrison has rejected this idea. He says Australia is on track to meet the target it announced before the Paris conference: to cut emissions by 26–28% below 2005 levels by 2030.

    But there is little evidence to suggest the government will be able to meet this target without new policies. In August, government advisers said it was unlikely that the electricity sector, responsible for one-third of Australia’s emissions, would reduce its emissions by 26% unless a policy was introduced to drive cleaner energy generation over the next decade.

    National emissions have risen each year since 2014, when the government repealed laws requiring big industrial emitters to pay for their emissions. There are also no significant policies to reduce the other major sources of pollution, such as transport, agriculture, heavy industry and mining, which together generate nearly two-thirds of Australia’s carbon emissions.

    Although the NEG was a modest policy, proposed after several more effective schemes failed to win political support, it had the potential to win the backing of the centre-left opposition Labor Party, says John Church, a specialist in sea-level rise at the Climate Change Research Centre (CCRC) at the University of New South Wales in Sydney. That would have enabled the policy to pass through parliament and into law. The policy also had the support of the business community, which has been calling for climate and energy strategies that encourage investment in new and cleaner power plants, he says. “Walking away from it was a disaster.”

    Sarah Perkins-Kirkpatrick, an authority on heatwaves, also at the CCRC, says government motivation to do something about climate change seems to have disappeared altogether. When she briefed senior officials on the latest climate-change science in August, she left the meeting feeling optimistic that more policies were coming. “People were trying to get things done, but now that’s not the case at all,” she says. “I’m extremely frustrated.”

    Public concern

    The decision to drop the policy also goes against the public’s support for action on climate change, says Hughes. A poll of 1,756 people, published on 12 September by research and advocacy organization the Australia Institute, found that 73% of respondents were concerned about climate change and 68% wanted domestic climate targets in line with the country’s Paris commitment.

    But Australia’s lack of climate policy could be short-lived. A national election is due by May 2019, and recent polls suggest that the Labor Party, led by former union boss Bill Shorten, is favoured to win. Labor says it would set a new emissions target of a 45% cut by 2030, although it has not revealed how it would reach the target. In the meantime, some states have mandated ambitious renewable-energy targets, and business leaders say investment in clean energy is increasing because it is now the cheapest option.

    See the full article here.



  • richardmitnick 12:43 pm on September 7, 2018 Permalink | Reply
    Tags: Nature Magazine, Peer reviewers unmasked: largest global survey reveals trends   

    From Nature: “Peer reviewers unmasked: largest global survey reveals trends” 

    From Nature

    07 September 2018
    Inga Vesper

    Scientists in emerging economies respond fastest to peer review invitations, but are invited least.

    Scientists in developed countries provide nearly three times as many peer reviews per paper submitted as researchers in emerging nations, according to the largest ever survey of the practice.

    The report — which surveyed more than 11,000 researchers worldwide — also finds a growing “reviewer fatigue”, with editors having to invite more reviewers to get each review done. The number rose from 1.9 invitations in 2013 to 2.4 in 2017.

    The Global State of Peer Review report was undertaken by Publons, a website that helps academics to track their reviews and other contributions to scientific journals. The authors used data from the survey, conducted from May to July 2018, as well as data from the Publons, Web of Science Core Collection and ScholarOne Manuscripts databases.

    The report notes that finding peer reviewers is becoming harder, even as the overall volume of publications rises globally (see ‘Is reviewer fatigue setting in?’).

    Source: Global State of Peer Review 2018

    And although contributions to peer review from emerging economies are lower compared with developed countries, they are rising rapidly, says Andrew Preston, managing director of Publons, in London. “Peer reviews lag publication, so it will take a few years for emerging regions to catch up,” he says.
    Data digging

    Researchers in leading science locations, such as the United States, the United Kingdom and Japan, write nearly 2 peer reviews per submitted article of their own, compared with about 0.6 peer reviews per submission by those in emerging countries such as China, Brazil, India and Poland, the study found (see ‘Uneven contributions’).

    Source: Global State of Peer Review 2018

    Scientists in emerging economies are more likely to accept requests for peer review and complete their reviews faster than those from established economies. But their reviews also tend to be shorter than those from colleagues in wealthy countries.

    The report says scientists from emerging economies might review less because editors’ networks and scientific publishing are still largely centred in developed nations.

    In 2013–17, the United States contributed nearly 33% of peer reviews, and published 25.4% of articles worldwide. By contrast, emerging nations did 19% of peer reviews, and published 29% of all articles.

    China stood out — the country accounted for 13.8% of scientific articles during the period, but did only 8.8% of reviews. Even so, China overtook the United Kingdom in numbers of peer reviews conducted by its scientists in 2015, the study says.
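    The headline ratios above can be reproduced directly from the figures quoted in the article. A quick arithmetic sketch, using only values stated in the text:

```python
# Reviews written per submitted article (figures quoted above):
developed = 2.0   # "nearly 2" for the United States, United Kingdom and Japan
emerging = 0.6    # "about 0.6" for China, Brazil, India and Poland
print(f"developed/emerging ratio: {developed / emerging:.1f}")  # ~3.3, i.e. "nearly three times"

# Share of world peer reviews versus share of world articles, 2013-17:
shares = {
    "United States":    (0.330, 0.254),
    "Emerging nations": (0.190, 0.290),
    "China":            (0.088, 0.138),
}
for region, (review_share, article_share) in shares.items():
    # A value below 1 means the region reviews less than it publishes.
    print(f"{region}: {review_share / article_share:.2f}")
```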

    Peer review in numbers

    Data from the Global State of Peer Review report for 2013–17

    68.5 million hours spent reviewing globally each year

    16.4 days is the median review time

    5 hours is the median time spent writing each review

    477 words is the average length of review reports

    10% of reviewers are responsible for 50% of peer reviews

    41% of survey respondents see peer review as part of their job

    75% of journal editors say the hardest part of their job is finding willing reviewers

    71% of researchers decline review requests because the article is outside their area of expertise

    42% of researchers decline review requests because they are too busy

    39% of reviewers never received any peer-review training


    The inclusion of China could skew the picture, says John Walsh, a sociologist at the Georgia Institute of Technology in Atlanta.

    He thinks the difference in peer-review activity between rich and poor nations is “actually surprisingly low”, considering the huge discrepancy in science funding and excellence. “China is the really dramatic case,” he says. “If you took China out, the picture would look different.”

    The study notes that the number of peer reviews from emerging nations grew by 193% in 2013–17. That’s not surprising, Walsh says, because peer review offers several perks to researchers, including — usually — a few months of free access to the journal and the opportunity to view the latest research before it gets published.

    Review requests

    The study’s main message, Preston says, is that scientists in emerging nations are keen to do peer review, but do not receive as many requests as their colleagues. This is despite the fact that journals find it increasingly difficult to get their articles peer-reviewed.

    This chimes with experience on the ground. Mohd Abas Shah, an entomologist at the ICAR Central Potato Research Station in Jalandhar, India, says he has published five articles in international journals, but has received only four peer-review requests throughout his whole career. “Peer review provides opportunity to develop a good reputation among colleagues and possible collaborations,” he says. “Fewer opportunities for peer review means missing out on that.”

    The solution, the study recommends, is for scientists to cast a wider net when looking for potential peer reviewers.

    But journal editors can also do their part by being more considerate of people’s language skills and by forming alliances with journals in emerging science regions, says Juan Corley, an ecologist at Argentina’s national agricultural-research institute in Buenos Aires, and editor of the International Journal of Pest Management.

    “We need to increase the number of editors and journal board members from developing economies,” he says. The study found that fewer than 4% of journal editors in its sample came from emerging economies.

    See the full article here.



  • richardmitnick 5:03 am on August 15, 2018 Permalink | Reply
    Tags: , Nature Magazine, , ,   

    From Nature via U Wisconsin IceCube: “Special relativity validated by neutrinos” 

    U Wisconsin ICECUBE neutrino detector at the South Pole

    IceCube employs more than 5,000 detectors lowered on 86 strings into almost 100 holes in the Antarctic ice. Credit: NSF/B. Gudbjartsson, IceCube Collaboration


    From Nature

    13 August 2018
    Matthew Mewes

    Neutrinos are tiny, ghost-like particles that habitually change identity. A measurement of the rate of change in high-energy neutrinos racing through Earth provides a record-breaking test of Einstein’s special theory of relativity.

    The existence of extremely light, electrically neutral particles called neutrinos was first postulated in 1930 to explain an apparent violation of energy conservation in the decays of certain unstable atomic nuclei. Writing in Nature Physics, the IceCube Collaboration [1] now uses neutrinos seen in the world’s largest particle detector to scrutinize another cornerstone of physics: Lorentz invariance. This principle states that the laws of physics are independent of the speed and orientation of the experimenter’s frame of reference, and serves as the mathematical foundation for Albert Einstein’s special theory of relativity. Scouring their data for signs of broken Lorentz invariance, the authors carry out one of the most stringent tests of special relativity so far, and demonstrate how the peculiarities of neutrinos can be used to probe the foundations of modern physics.

    Physicists generally assume that Lorentz invariance holds exactly. However, in the late 1990s, the principle began to be systematically challenged [2], largely because of the possibility that it was broken slightly in proposed theories of fundamental physics, such as string theory [3]. Over the past two decades, researchers have tested Lorentz invariance in objects ranging from photons to the Moon [4].

    The IceCube Collaboration instead tested the principle using neutrinos. Neutrinos interact with matter through the weak force — one of the four fundamental forces of nature. The influence of the weak force is limited to minute distances. As a result, interactions between neutrinos and matter are extremely improbable, and a neutrino can easily traverse the entire Earth unimpeded. This poses a challenge for physicists trying to study these elusive particles, because almost every neutrino will simply pass through any detector completely unnoticed.

    The IceCube Neutrino Observatory, located at the South Pole, remedies this problem by monitoring an immense target volume to glimpse the exceedingly rare interactions. At the heart of the detector are more than 5,000 light sensors, which are focused on 1 cubic kilometre (1 billion tonnes) of ice. The sensors constantly look for the telltale flashes of light that are produced when a neutrino collides with a particle in the ice.

    The main goal of the IceCube Neutrino Observatory is to observe comparatively scarce neutrinos that are produced during some of the Universe’s most violent astrophysical events. However, in its test of Lorentz invariance, the collaboration studied more-abundant neutrinos that are generated when fast-moving charged particles from space collide with atoms in Earth’s atmosphere. There are three known types of neutrino: electron, muon and tau. Most of the neutrinos produced in the atmosphere are muon neutrinos.

    Atmospheric neutrinos generated around the globe travel freely to the South Pole, but can change type along the way. Such changes stem from the fact that electron, muon and tau neutrinos are not particles in the usual sense. They are actually quantum combinations of three ‘real’ particles — ν1, ν2 and ν3 — that have tiny but different masses.

    In a simple approximation relevant to the IceCube experiment, the birth of a muon neutrino in the atmosphere can be thought of as the simultaneous production of two quantum-mechanical waves: one for ν2 and one for ν3 (Fig. 1). These waves are observed as a muon neutrino only because they are in phase, which means the peaks of the two waves are seen at the same time. By contrast, a tau neutrino results from out-of-phase waves, whereby the peak of one wave arrives with the valley of the other.

    Figure 1 | Propagation of neutrinos through Earth. There are three known types of neutrino: electron, muon and tau. a, A muon neutrino produced in Earth’s atmosphere can be thought of as the combination of two quantum-mechanical waves (red and blue) that are in phase — the peaks of the waves are observed at the same time. If a principle known as Lorentz invariance were violated, these waves could travel at different speeds through Earth’s interior and be detected in the out-of-phase tau-neutrino state. b, The IceCube Collaboration [1] reports no evidence of such conversion, constraining the extent to which Lorentz invariance could be violated.

    If neutrinos were massless and Lorentz invariance held exactly, the two waves would simply travel in unison, always maintaining the in-phase muon-neutrino state. However, small differences in the masses of ν2 and ν3 or broken Lorentz invariance could cause the waves to travel at slightly different speeds, leading to a gradual shift from the muon-neutrino state to the out-of-phase tau-neutrino state. Such transitions are known as neutrino oscillations and enable the IceCube detector to pick out potential violations of Lorentz invariance. Oscillations resulting from mass differences are expected to be negligible at the neutrino energies considered in the authors’ analysis, so the observation of an oscillation would signal a possible breakdown of special relativity.
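    The mass-driven oscillations described above follow the standard two-flavour approximation, P(νμ→ντ) = sin²(2θ) · sin²(1.27 Δm²[eV²] L[km] / E[GeV]). The sketch below uses typical textbook values for the atmospheric mass splitting and near-maximal mixing; these are illustrative assumptions, not parameters from the IceCube analysis.

```python
import math

def osc_probability(l_km, e_gev, dm2_ev2=2.5e-3, sin2_2theta=1.0):
    """Two-flavour muon-to-tau oscillation probability:
    P = sin^2(2*theta) * sin^2(1.27 * dm2[eV^2] * L[km] / E[GeV])."""
    return sin2_2theta * math.sin(1.27 * dm2_ev2 * l_km / e_gev) ** 2

EARTH_DIAMETER_KM = 12_742  # baseline for neutrinos crossing the whole Earth

# At tens of GeV, mass differences drive near-maximal oscillation...
p_low = osc_probability(EARTH_DIAMETER_KM, 25)
# ...but at TeV energies the mass-driven effect is negligible, which is
# why any oscillation seen at high energy would point to broken Lorentz
# invariance rather than to ordinary neutrino masses.
p_high = osc_probability(EARTH_DIAMETER_KM, 1000)

print(f"P(nu_mu -> nu_tau) at 25 GeV: {p_low:.3f}")   # near 1
print(f"P(nu_mu -> nu_tau) at  1 TeV: {p_high:.4f}")  # well below 1%
```

    Because the probability falls off as 1/E² at high energy, the still-more-energetic neutrinos in IceCube's sample should show essentially no mass-driven conversion at all.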

    The IceCube Collaboration is not the first group to seek Lorentz-invariance violation in neutrino oscillations [5–10]. However, two key factors allowed the authors to carry out the most precise search so far. First, atmospheric neutrinos that are produced on the opposite side of Earth to the detector travel a large distance (almost 13,000 km) before being observed, maximizing the probability that a potential oscillation will occur. Second, the large size of the detector allows neutrinos to be observed that have much higher energies than those that can be seen in other experiments.

    Such high energies imply that the quantum-mechanical waves have tiny wavelengths, down to less than one-billionth of the width of an atom. The IceCube Collaboration saw no sign of oscillations, and therefore inferred that the peaks of the waves associated with ν2 and ν3 are shifted by no more than this distance after travelling the diameter of Earth. Consequently, the speeds of the waves differ by no more than a few parts per 10^28 — a result that is one of the most precise speed comparisons in history.
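    The quoted bound can be checked at order-of-magnitude level. Assuming an illustrative neutrino energy of 100 TeV (a round number, not a figure taken from the paper), the quantum wavelength λ = hc/E comes out near 10⁻²⁰ m, and dividing by Earth's diameter gives the fractional speed difference:

```python
# Order-of-magnitude check of the speed comparison quoted above.
# The 100 TeV neutrino energy is an illustrative assumption.
H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
EV = 1.602e-19   # joules per electronvolt

e_joules = 100e12 * EV          # 100 TeV in joules
wavelength = H * C / e_joules   # quantum wavelength, ~1.2e-20 m
atom_width = 1e-10              # rough width of an atom, m
earth_diameter = 1.274e7        # m

# The wavelength is indeed below one-billionth of an atom's width:
assert wavelength < atom_width * 1e-9

# A peak shift of less than one wavelength over Earth's diameter bounds
# the fractional speed difference between the two waves:
dv_over_v = wavelength / earth_diameter
print(f"dv/v < {dv_over_v:.1e}")  # ~1e-27: a few parts per 10^28
```

    The result lands at a few parts in 10²⁸, consistent with the figure quoted in the text.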

    The authors’ analysis provides support for special relativity and places tight constraints on a number of different classes of Lorentz-invariance violation, many for the first time. Although already impressive, the IceCube experiment has yet to reach its full potential. Because of limited data, the authors restricted their attention to violations that are independent of the direction of neutrino propagation, neglecting possible direction-dependent violations that could arise more generally.

    With a greater number of neutrino detections, the experiment, or a larger future version [11], could search for direction-dependent violations. Eventually, similar studies involving more-energetic astrophysical neutrinos propagating over astronomical distances could test the foundations of physics at unprecedented levels.

    See the full article here.



    IceCube is a particle detector at the South Pole that records the interactions of a nearly massless sub-atomic particle called the neutrino. IceCube searches for neutrinos from the most violent astrophysical sources: events like exploding stars, gamma ray bursts, and cataclysmic phenomena involving black holes and neutron stars. The IceCube telescope is a powerful tool to search for dark matter, and could reveal the new physical processes associated with the enigmatic origin of the highest energy particles in nature. In addition, exploring the background of neutrinos produced in the atmosphere, IceCube studies the neutrinos themselves; their energies far exceed those produced by accelerator beams. IceCube is the world’s largest neutrino detector, encompassing a cubic kilometer of ice.

  • richardmitnick 1:04 pm on August 14, 2018 Permalink | Reply
    Tags: Brute-force approach to particle hunt, Nature Magazine   

    From Nature: “LHC physicists embrace brute-force approach to particle hunt” 

    Nature Mag
    From Nature

    14 August 2018
    Davide Castelvecchi

    The world’s most powerful particle collider has yet to turn up new physics [since Higgs] — now some physicists are turning to a different strategy.

    The ATLAS detector at the Large Hadron Collider near Geneva, Switzerland.Credit: Stefano Dal Pozzolo/Contrasto /eyevine

    A once-controversial approach to particle physics has entered the mainstream at the Large Hadron Collider (LHC).


    CERN map

    CERN LHC Tunnel

    CERN LHC particles

    The LHC’s major ATLAS experiment has officially thrown its weight behind the method — an alternative way to hunt through the reams of data created by the machine — as the collider’s best hope for detecting behaviour that goes beyond the standard model of particle physics. Conventional techniques have so far come up empty-handed.

    So far, almost all studies at the LHC — at CERN, Europe’s particle-physics laboratory near Geneva, Switzerland — have involved ‘targeted searches’ for signatures of favoured theories. The ATLAS collaboration now describes its first all-out ‘general’ search of the detector’s data, in a preprint posted on the arXiv server last month and submitted to the European Physical Journal C. Another major LHC experiment, CMS, is working on a similar project.

    “My goal is to try to come up with a really new way to look for new physics” — one driven by the data rather than by theory, says Sascha Caron of Radboud University Nijmegen in the Netherlands, who has led the push for the approach at ATLAS. General searches are to the targeted ones what spell checking an entire text is to searching that text for a particular word. These broad searches could realize their full potential in the near future, when combined with increasingly sophisticated artificial-intelligence (AI) methods.

    LHC researchers hope that the methods will lead them to their next big discovery — something that hasn’t happened since the detection of the Higgs boson in 2012, which put in place the final piece of the standard model. Developed in the 1960s and 1970s, the model describes all known subatomic particles, but physicists suspect that there is more to the story — the theory doesn’t account for dark matter, for instance. But big experiments such as the LHC have yet to find evidence for such behaviour. That means it’s important to try new things, including general searches, says Gian Giudice, who heads CERN’s theory department and is not involved in any of the experiments. “This is the right approach, at this point.”

    Collision course

    The LHC smashes together millions of protons per second at colossal energies to produce a profusion of decay particles, which are recorded by detectors such as ATLAS and CMS. Many different types of particle interaction can produce the same debris. For example, the decay of a Higgs might produce a pair of photons, but so do other, more common, processes. So, to search for the Higgs, physicists first ran simulations to predict how many of those ‘impostor’ pairs to expect. They then counted all photon pairs recorded in the detector and compared them to their simulations. The difference — a slight excess of photon pairs within a narrow range of energies — was evidence that the Higgs existed.
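    The counting logic described above can be illustrated with a toy bump hunt. The per-bin counts below are invented for illustration, not LHC data; the significance estimate uses the simple Gaussian approximation valid for large counts:

```python
import math

# Hypothetical diphoton counts per energy bin: 'expected' comes from a
# background-only simulation of impostor pairs, 'observed' from data.
expected = [1000, 980, 960, 940, 920, 900]
observed = [1010, 975, 1115, 945, 910, 905]

def excess_significance(obs, exp):
    """Approximate significance (in sigma) of an excess in one bin:
    (observed - expected) / sqrt(expected), valid for large counts."""
    return (obs - exp) / math.sqrt(exp)

sigmas = [excess_significance(o, e) for o, e in zip(observed, expected)]
bump_bin = max(range(len(sigmas)), key=lambda i: sigmas[i])
```

    Here the third bin shows a roughly five-sigma excess — the kind of localized surplus of photon pairs that signalled the Higgs.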

    ATLAS and CMS have run hundreds more of these targeted searches to look for particles that do not appear in the standard model.

    CERN/ATLAS detector

    CERN/CMS Detector

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Standard Model of Particle Physics from Symmetry Magazine

    Many searches have looked for various flavours of supersymmetry, a theorized extension of the model that includes hypothesized particles such as the neutralino, a candidate for dark matter. But these searches have come up empty so far.

    Standard model of Supersymmetry DESY

    This leaves open the possibility that there are exotic particles that produce signatures no one has thought of — something that general searches have a better chance of finding. Physicists have yet to look, for example, at events that produce three photons instead of two, Caron says. “We have hundreds of people looking at Higgs decay and supersymmetry, but maybe we are missing something nobody thought of,” says Arnd Meyer, a CMS member at Aachen University in Germany.

    Whereas targeted searches typically look at only a handful of the many types of decay product, the latest study looked at more than 700 types at once. The study analysed data collected in 2015, the first year after an LHC upgrade raised the energy of proton collisions in the collider from 8 teraelectronvolts (TeV) to 13 TeV. At CMS, Meyer and a few collaborators have conducted a proof-of-principle study, which hasn’t been published, on a smaller set of data from the 8 TeV run.

    Neither experiment has found significant deviations so far. This was not surprising, the teams say, because the data sets were relatively small. Both ATLAS and CMS are now searching the data collected in 2016 and 2017, a trove tens of times larger.

    Statistical cons

    The approach “has clear advantages, but also clear shortcomings”, says Markus Klute, a physicist at the Massachusetts Institute of Technology in Cambridge. Klute is part of CMS and has worked on general searches at previous experiments, but he was not directly involved in the more recent studies. One limitation is statistical power. If a targeted search finds a positive result, there are standard procedures for calculating its significance; when casting a wide net, however, some false positives are bound to arise. That was one reason that general searches had not been favoured in the past: many physicists feared that they could lead down too many blind alleys. But the teams say they have put a lot of work into making their methods more solid. “I am excited this came forward,” says Klute.
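    The false-positive problem is the ‘look-elsewhere effect’: a fluctuation that looks significant in one channel becomes unremarkable once hundreds of channels are scanned. A toy Šidák-style correction makes the point, under the (unrealistic) assumption that the channels are statistically independent:

```python
def global_p_value(p_local, n_channels):
    """Probability that at least one of n_channels independent
    searches fluctuates to p_local or better. A toy stand-in for the
    trials-factor bookkeeping a general search needs; real LHC
    channels are correlated, so this is only a rough sketch."""
    return 1.0 - (1.0 - p_local) ** n_channels
```

    A local three-sigma fluctuation (p ≈ 0.00135) looks striking in a single targeted search, but scanned across 700 event classes the chance of seeing one somewhere is over 60% — no discovery at all.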

    Most of the people power and resources at the LHC experiments still go into targeted searches, and that might not change anytime soon. “Some people doubt the usefulness of such general searches, given that we have so many searches that exhaustively cover much of the parameter space,” says Tulika Bose of Boston University in Massachusetts, who helps to coordinate the research programme at CMS.

    Many researchers who work on general searches say that they eventually want to use AI to do away with standard-model simulations altogether. Proponents of this approach hope to use machine learning to find patterns in the data without any theoretical bias. “We want to reverse the strategy — let the data tell us where to look next,” Caron says. Computer scientists are also pushing towards this type of ‘unsupervised’ machine learning — compared with the supervised type, in which the machine ‘learns’ from going through data that have been tagged previously by humans.
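    The unsupervised idea can be sketched without any machine-learning library: score each event by how far it sits from the bulk of the data, with no reference to a standard-model simulation. This toy z-score distance is only a stand-in for the far more sophisticated methods the experiments would actually use:

```python
import math
from statistics import mean, stdev

def anomaly_scores(events):
    """Minimal unsupervised outlier score for a list of events, each a
    tuple of numeric features: standardize every feature against the
    data itself, then return each event's Euclidean distance in
    z-score space. Events far from the bulk score highest."""
    n_features = len(events[0])
    mus = [mean(e[j] for e in events) for j in range(n_features)]
    sds = [stdev(e[j] for e in events) or 1.0 for j in range(n_features)]
    return [math.sqrt(sum(((e[j] - mus[j]) / sds[j]) ** 2
                          for j in range(n_features)))
            for e in events]
```

    The key design point is that nothing here encodes a theory of what an anomaly should look like — the data define ‘normal’, and deviations tell the physicists where to look next.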


  • richardmitnick 2:37 pm on May 5, 2018 Permalink | Reply
    Tags: Cash prizes of US$12,000, $8,000 and $5,000 - not exactly inspirational, Hosted by Google-owned company Kaggle, Nature Magazine, Too much data for existing computing assets, TrackML challenge   

    From Nature: “Particle physicists turn to AI to cope with CERN’s collision deluge” 

    Nature Mag

    04 May 2018
    No writer credit found

    The pixel detector at CERN’s CMS experiment records particles that emerge from collisions.Credit: CERN


    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    Physicists at the world’s leading atom smasher are calling for help. In the next decade, they plan to produce up to 20 times more particle collisions in the Large Hadron Collider (LHC) than they do now, but current detector systems aren’t fit for the coming deluge. So this week, a group of LHC physicists has teamed up with computer scientists to launch a competition to spur the development of artificial-intelligence techniques that can quickly sort through the debris of these collisions. Researchers hope these will help the experiment’s ultimate goal of revealing fundamental insights into the laws of nature.

    At the LHC at CERN, Europe’s particle-physics laboratory near Geneva, two bunches of protons collide head-on inside each of the machine’s detectors 40 million times a second. Every proton collision can produce thousands of new particles, which radiate from a collision point at the centre of each cathedral-sized detector. Millions of silicon sensors are arranged in onion-like layers and light up each time a particle crosses them, producing one pixel of information every time. Collisions are recorded only when they produce potentially interesting by-products. When they are, the detector takes a snapshot that might include hundreds of thousands of pixels from the piled-up debris of up to 20 different pairs of protons. (Because particles move at or close to the speed of light, a detector cannot record a full movie of their motion.)

    From this mess, the LHC’s computers reconstruct tens of thousands of tracks in real time, before moving on to the next snapshot. “The name of the game is connecting the dots,” says Jean-Roch Vlimant, a physicist at the California Institute of Technology in Pasadena who is a member of the collaboration that operates the CMS detector at the LHC.
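    The ‘connecting the dots’ step can be illustrated with the simplest possible track model: a least-squares straight line through a set of hits. Real tracks curve into helices in the detector’s magnetic field, so the linear fit is only a sketch of the idea:

```python
def fit_track(hits):
    """Least-squares straight-line fit y = m*x + b through detector
    hits given as (x, y) pairs: the simplest instance of connecting
    the dots into a track candidate."""
    n = len(hits)
    sx = sum(x for x, _ in hits)
    sy = sum(y for _, y in hits)
    sxx = sum(x * x for x, _ in hits)
    sxy = sum(x * y for x, y in hits)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b
```

    The real challenge is combinatorial: before any fit can run, the reconstruction must decide which of the hundreds of thousands of pixels belong to the same particle — which is exactly the part the TrackML challenge asks algorithms to speed up.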

    The yellow lines depict reconstructed particle trajectories from collisions recorded by CERN’s CMS detector.Credit: CERN

    CERN CMS Higgs Event

    After future planned upgrades, each snapshot is expected to include particle debris from 200 proton collisions. Physicists currently use pattern-recognition algorithms to reconstruct the particles’ tracks. Although these techniques would be able to work out the paths even after the upgrades, “the problem is, they are too slow”, says Cécile Germain, a computer scientist at the University of Paris South in Orsay. Without major investment in new detector technologies, LHC physicists estimate that the collision rates will exceed the current capabilities by at least a factor of 10.

    Researchers suspect that machine-learning algorithms could reconstruct the tracks much more quickly. To help find the best solution, Vlimant and other LHC physicists teamed up with computer scientists including Germain to launch the TrackML challenge. For the next three months, data scientists will be able to download 400 gigabytes of simulated particle-collision data — the pixels produced by an idealized detector — and train their algorithms to reconstruct the tracks.

    Participants will be evaluated on the accuracy with which they do this. The top three performers in this phase, which is hosted by the Google-owned company Kaggle, will receive cash prizes of US$12,000, $8,000 and $5,000. A second competition will then evaluate algorithms on the basis of speed as well as accuracy, Vlimant says.
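    Accuracy here means, roughly, how many hits end up assigned to the right track. A toy, unweighted stand-in for the competition's metric (the real TrackML score weights hits so that complete, unambiguous tracks score higher):

```python
def track_accuracy(truth, predicted):
    """Fraction of hits whose predicted track label matches the truth
    label. truth and predicted are parallel lists of track IDs, one
    entry per hit. An unweighted toy version of a tracking score."""
    assert len(truth) == len(predicted)
    correct = sum(t == p for t, p in zip(truth, predicted))
    return correct / len(truth)
```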

    Prize appeal

    Such competitions have a long tradition in data science, and many young researchers take part to build up their CVs. “Getting well ranked in challenges is extremely important,” says Germain. Perhaps the most famous of these contests was the 2009 Netflix Prize. The entertainment company offered US$1 million to whoever worked out the best way to predict what films its users would like to watch, based on their previous ratings. TrackML isn’t the first challenge in particle physics, either: in 2014, teams competed to ‘discover’ the Higgs boson in a set of simulated data (the LHC discovered the Higgs, long predicted by theory, in 2012). Other science-themed challenges have involved data on anything from plankton to galaxies.

    From the computer-science point of view, the Higgs challenge was an ordinary classification problem, says Tim Salimans, one of the top performers in that race (after the challenge, Salimans went on to get a job at the non-profit effort OpenAI in San Francisco, California). But the fact that it was about LHC physics added to its lustre, he says. That may help to explain the challenge’s popularity: nearly 1,800 teams took part, and many researchers credit the contest for having dramatically increased the interaction between the physics and computer-science communities.

    TrackML is “incomparably more difficult”, says Germain. In the Higgs case, the reconstructed tracks were part of the input, and contestants had to do another layer of analysis to ‘find’ the particle. In the new problem, she says, contestants must find something like 10,000 arcs of ellipse among 100,000 points. She thinks the winning technique might end up resembling those used by the program AlphaGo, which made history in 2016 when it beat a human champion at the complex game of Go. In particular, winning entries might use reinforcement learning, in which an algorithm learns by trial and error on the basis of ‘rewards’ that it receives after each attempt.

    Vlimant and other physicists are also beginning to consider more untested technologies, such as neuromorphic computing and quantum computing. “It’s not clear where we’re going,” says Vlimant, “but it looks like we have a good path.”


  • richardmitnick 5:59 pm on January 11, 2018 Permalink | Reply
    Tags: Mystery funders of Arecibo radio telescope can celebrate an early success, Nature Magazine   

    From Nature: “Mystery funders of Arecibo radio telescope can celebrate an early success” 

    Nature Mag

    10 January 2018

    The Arecibo radio telescope has revealed new details about fast radio bursts.Credit: Jay M. Pasachoff/Getty.

    When the US National Science Foundation (NSF) drew up a plan to demolish its radio telescope near Arecibo, Puerto Rico, it did conclude that something positive would result — although it was only a minor and short-term benefit. Five specialists in explosives would need to spend a month on the Caribbean island, and, the NSF said in an environmental-impact statement last year, the local community could profit from what the visitors would spend on meals and lodging.

    Hoteliers and restaurant owners aside, most of the local workers and researchers who help to keep the giant dish functioning breathed a sigh of relief last November, when the NSF announced that the telescope would remain standing. At least one partner organization had pledged to help fund it, solving a cash crunch at the decades-old facility.

    The identity of the saviours is still a closely guarded secret (although everyone in the astronomy community has their own idea of the funders’ identity, ranging from overseas agencies to universities). Whoever they are, they are sure to be smiling to themselves this week. Their new toy has shown what it can still do.

    In a paper in Nature this week, astronomer Daniele Michilli of the University of Amsterdam and his colleagues describe how they used the Arecibo dish to track a mysterious signal from deep space called a fast radio burst (D. Michilli et al. Nature 553, 182–185; 2018). These powerful but short-lived flashes of radio noise were first discovered a decade ago, but their source remains unknown. They are one of the biggest outstanding astrophysical mysteries today.

    Most of these sources blaze into life just once and then vanish. But a fast radio burst in the constellation Auriga, first spotted in November 2012, has shown itself many times since. Indeed, Michilli and his team recorded at least 16 separate flashes of its activity. Each time, they gleaned a little more information about its probable origin.

    The trick, it turns out, lies in looking at the polarization of radiation coming from the burst. The plane of polarization rotates when the light travels through a magnetic field, an effect first seen by physicist Michael Faraday in 1845. For the Auriga burst, the Faraday rotation is large and variable — suggesting that the light must be travelling through a highly magnetized environment.
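    The rotation angle grows with the square of the wavelength, which is how astronomers extract a ‘rotation measure’ (RM) by comparing polarization angles across observing frequencies. A sketch of that λ² law; the RM magnitude in the comment reflects the extreme value reported for this source, but the numbers are otherwise illustrative:

```python
import math

def faraday_rotation_deg(rm_rad_m2, freq_hz):
    """Polarization-angle rotation, in degrees, for radio waves of the
    given frequency passing through a magnetized plasma with rotation
    measure RM: delta_theta = RM * lambda**2. The Auriga burst was
    reported with an RM of order 1e5 rad/m^2, far larger than typical
    interstellar values."""
    c = 2.998e8                   # speed of light, m/s
    wavelength = c / freq_hz      # metres
    return math.degrees(rm_rad_m2 * wavelength ** 2)
```

    Because the rotation scales as λ², lower observing frequencies see much larger twists — comparing, say, 1.4 GHz and 4.5 GHz measurements pins down the RM and hence how magnetized the burst's environment is.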

    Until now, this type of Faraday rotation has been seen only close to black holes. So one possible explanation for this fast radio burst is that something is producing radio emissions very near to a black hole. Imagine, perhaps, a dense neutron star burping out radiation that twists and rotates as it travels through its highly magnetized surroundings. The work is the most precise look yet at what could be powering fast radio bursts (or at least one of them).

    The announcement of the discovery comes after a tumultuous couple of years for the Arecibo telescope. Alongside the uncertainty over its funding, the facility — like much of Puerto Rico — was battered and put temporarily out of action by Hurricane Maria last year. On restarting its science observations last November, the first thing the big dish did was to return its gaze to Auriga.

    Like many veteran science experiments, Arecibo has an impressive back catalogue. In cinema history, it’s where Jodie Foster listened for aliens in 1997’s Contact, and where Pierce Brosnan’s James Bond dispatched villain Sean Bean in GoldenEye (1995). In scientific history, the telescope beamed a message meant for extraterrestrials to the globular star cluster M13 in 1974, and has probed dangerous near-Earth asteroids to help protect the planet from cosmic impacts.

    Now the NSF wants to free up money for newer astronomical facilities by offloading some of its older ones, including Arecibo. With the demolition plan nixed, the current funding arrangement will end in April and the NSF will officially hand the controls to the mystery newcomers, who have agreed to step in as the agency scales down its annual contributions from US$8 million to $2 million over the next 5 years. (NASA will continue to pay one-third of the observatory’s costs.)

    The dish that the benefactors get for their money is no longer the world’s biggest telescope of its type. China switched on its larger Five-hundred-meter Aperture Spherical radio Telescope (FAST) in 2016, and the facility is already making headlines by chalking up discoveries — three new pulsars last month alone. But the sky is a big place, and there is plenty of science to go around. Arecibo is rightly safe from the dynamite for now.
