Tagged: Science Node

  • richardmitnick 8:13 am on March 14, 2019 Permalink | Reply
    Tags: "Can computing change the world?", Advanced Computing for Social Change, Computing4Change, Science Node

    From Science Node: “Can computing change the world?” 


    13 Mar, 2019
    Ellen Glover

    Last November, sixteen undergraduate students from around the world came together in Texas to combine their skills and tackle the issue of violence.


    The Computing4Change program brings together undergraduate students for a 48-hour intensive competition to apply computing to urgent social issues. The 2018 topic was “Resisting Cultural Acceptance of Violence.”

    This was part of Computing4Change, a program dedicated to empowering students of all races, genders, and backgrounds to implement change through advanced computing and research.

    The challenge was developed by Kelly Gaither and Rosalia Gomez from the Texas Advanced Computing Center (TACC), and Linda Akli of the Southeastern Universities Research Association.

    Three years ago, as chair of the 2016 XSEDE conference in Miami, Gaither wanted to ensure that she authentically represented students’ voices to other conference attendees. Akli and Gomez led the student programs at the conference, bringing together a large, diverse group of students from Miami and the surrounding area.

    So she asked the students what issues they cared about. “It was shocking that most of the issues had nothing to do with their school life and everything to do with the social conditions that they deal with every day,” Gaither says.

    After that, Gaither, Gomez, and Akli promised that they would start a larger program to give students a platform for the issues they found important. They brought in Ruby Mendenhall from the University of Illinois Urbana-Champaign and Sue Fratkin, a public policy analyst concentrating on technology and communication issues.

    48-hour challenge. The student competitors had only 48 hours to do all of their research and come up with a 30-minute presentation before a panel of judges at the SC18 conference in Dallas, TX. Courtesy Computing4Change.

    Out of that collaboration came Advanced Computing for Social Change, a program that gave students a platform to use computing to investigate hot-button topics like Black Lives Matter and immigration. The inaugural competition was held at SC16 and was supported by the conference and by the National Science Foundation-funded XSEDE project.

    “The students at the SC16 competition were so empowered by being able to work on Black Lives Matter that they actually asked if they could work overnight and do the presentations later the next day,” Gaither says. “They felt like there was more work that needed to be done. I have never before seen that kind of enthusiasm for a given problem.”

    In 2018, Gaither, Gomez, and Akli made some big changes to the program and partnered with the Special Interest Group on High Performance Computing (SIGHPC). As a result of SIGHPC’s sponsorship, the program was renamed Computing4Change. Applications were opened up to national and international undergraduate students to ensure a diverse group of participants.

    “We know that the needle is not shifting with respect to diversity. We know that the pipeline is not coming in any more diverse, and we are losing diverse candidates when they do come into the pipeline,” Gaither says.

    The application included questions about what issues the applicants found important: What topics were they most passionate about and why? How did they see technology fitting into solutions?

    Within weeks, the program received almost 300 applications for 16 available spots. An additional four students from Chaminade University of Honolulu were brought in to participate in the competition.

    In the months leading up to the conference, Gaither, Gomez, and Akli hosted a series of webinars teaching everything from data analytics to public speaking and understanding differences in personality types.

    All expenses, including flight, hotel, meals, and conference fees, were covered for each student. “For some of these kids, this is the first time they’ve ever traveled on an airplane. We had a diverse set of academic backgrounds. For example, we had a student from Yale and a community college student,” says Gaither. “Their backgrounds span the gamut, but they all come in as equals.”

    Although they interacted online, the students didn’t meet in person until they showed up to the conference. That’s when they were assigned to their group of four and the competition topic of violence was revealed. The students had to individually decide what direction to take with the research and how that would mesh with their other group members’ choices.

    “Each of those kids had to have their individual hypothesis so that no one voice was more dominant than the other,” Gaither says. “And then they had to work together to find out what the common theme might be. We worked with them to assist with scope, analytics, and messaging.”

    The teams had 48 hours to do all of their research and prepare a 30-minute presentation to deliver before a panel of judges at the SC18 conference in Dallas, TX.

    All mentors stayed with the students, making sure they approached their research from a more personal perspective and worked through any unexpected roadblocks—just like they would have to in a real-world research situation.

    For example, one student wanted to find data on why people leave Honduras and seek asylum in the United States. Little explicit data exists on that topic, but there is data on why people from all countries seek asylum. The mentors encouraged her to look there for correlations.

    “That was a process of really trying to be creative about getting to the answer,” Gaither says. “But that’s life. With real data, that’s life.”

    The Computing4Change mentors also coached the students to analyze their data and present it clearly to the judges. Gaither hopes the students leave the program not only knowing more about advanced computing, but also more aware of their power to effect change. She says it’s easy to teach someone a skill, but it’s much more impactful to help them find a personal passion within that skill.

    “If you’re passionate about something, you’ll stick with it,” Gaither says. “You can plug into very large, complex problems that are relevant to all of us.”

    The next Computing4Change event will be held in Denver, CO, co-located with the SC19 conference Nov 16-22, 2019. Travel, housing, meals, and SC19 conference registration are covered for the 20 students selected. The application deadline is April 8, 2019. Apply here.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 11:15 am on December 28, 2018 Permalink | Reply
    Tags: Science Node, The cosmos in a computer

    From Science Node: “The cosmos in a computer” 


    28 Nov, 2018
    Ellen Glover

    How simulated galaxies could bring us one step closer to the origin of our universe.

    Thanks to telescopes like the Hubble and spacecraft like Kepler, we know more than ever about the Milky Way Galaxy and what lies beyond. However, these observations only tell part of the story.

    NASA/ESA Hubble Telescope

    NASA/Kepler Telescope

    How did our incomprehensibly vast universe come to be? What’s it going to look like millions of years from now? These age-old questions are now getting answers thanks to simulations created by supercomputers.

    One of these supercomputers is a Cray XC50, nicknamed ATERUI II and located at the National Astronomical Observatory of Japan (NAOJ).

    NAOJ ATERUI II Cray XC50 supercomputer, located at the National Astronomical Observatory of Japan (NAOJ)

    It is the fastest supercomputer dedicated to astronomy and is ranked #83 among the top 500 most powerful supercomputers in the world.

    Named after a prominent 9th-century chief, ATERUI II is located in the same city where Aterui led his tribe in a battle against Emperor Kanmu. Despite the odds, Aterui and his people fought well. Since then, Aterui has become a symbol of intelligence, bravery, and unification.

    100 billion. ATERUI II is able to calculate the mutual gravitational interactions between each of the more than 100 billion stars that make up our galaxy, allowing for the most detailed Milky Way simulation yet. Courtesy National Astronomical Observatory of Japan.

    “We named the supercomputer after him so that our astronomers can be brave and smart. While we are not the fastest in the world, we hope the ATERUI II can be used in a smart way to help unify us so we can better understand the universe,” says Eiichiro Kokubo, project director of the Center for Computational Astrophysics at NAOJ.

    ATERUI II was officially launched last June and serves as a bigger and better version of its decommissioned predecessor, ATERUI. With more than 40,000 processing cores and 385 terabytes of memory, ATERUI II can perform as many as 3 quadrillion operations per second.

    In other words: it’s an incredibly powerful machine that is allowing us to boldly go where no one has ever gone before, from the Big Bang to the death of a star. It’s also exceedingly popular with researchers—150 astronomers are slated to use the supercomputer by the end of the year.

    ATERUI II’s unique power means it is capable of solving problems deemed too difficult for other supercomputers. For example, an attempt to simulate the Milky Way on a different machine meant researchers had to group the stars together in order to calculate their gravitational interactions.

    ATERUI II doesn’t have that problem. It can calculate the mutual gravitational interactions among each of the more than 100 billion stars that make up our galaxy individually, allowing for the most detailed Milky Way Galaxy simulation yet.
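    To see why that brute-force approach is so expensive, here is a minimal, illustrative sketch in Python of direct-summation gravity (not NAOJ’s actual code, which relies on specialized N-body methods and hardware): every star’s acceleration is computed from every other star individually, an O(N²) cost that is exactly what forces smaller machines to lump distant stars together.

    import numpy as np

    G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

    def direct_accelerations(pos, mass, softening=1e16):
        # Pairwise (O(N^2)) gravitational accelerations for N point masses.
        # pos: (N, 3) positions in metres; mass: (N,) masses in kilograms.
        # softening: small length (m) that avoids infinities for very close pairs.
        diff = pos[None, :, :] - pos[:, None, :]         # (N, N, 3) separation vectors
        dist2 = np.sum(diff**2, axis=-1) + softening**2  # squared pair distances
        inv_d3 = dist2**-1.5
        np.fill_diagonal(inv_d3, 0.0)                    # no self-interaction
        # a_i = G * sum_j m_j * (r_j - r_i) / |r_j - r_i|^3
        return G * np.einsum('ij,ijk->ik', mass[None, :] * inv_d3, diff)

    # Toy example: 1,000 "stars" instead of 100 billion -- the same arithmetic,
    # but the full Milky Way problem has roughly 10^16 times more star pairs.
    rng = np.random.default_rng(0)
    positions = rng.normal(scale=3e20, size=(1000, 3))   # galactic-scale spread, metres
    masses = np.full(1000, 2e30)                         # roughly one solar mass each, kg
    print(direct_accelerations(positions, masses).shape)  # (1000, 3)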

    The death of a star a thousand years ago left behind a superdense neutron star that expels extremely high-energy particles. By simulating events like these, ATERUI II gives astronomers insights that can’t be discovered through observation alone. Courtesy NASA/JPL-Caltech/ESA/CXC/Univ. of Ariz./Univ. of Szeged.

    While computational astronomy is a fairly young field, we need it in order to understand the universe beyond just observing celestial bodies. With its superior computational power, Kokubo says there are plans for ATERUI II to simulate everything from Saturn’s rings and binary star formation to the large-scale structure of the universe.

    “If we produce the universe in a computer, then we can use it to simulate the past and the future as well,” Kokubo says. “The universe exists in four dimensions: the first three are space and the last one is time. If we can capture the space, then we can better observe it through time.”

    ATERUI II isn’t only working on ways to better understand the stars and planets that make up the universe; it is also being used to explore the possibility of alien life. That exploration starts with life on Earth.

    “If we can simulate and understand the origin of life on Earth and what it means to be habitable, we will be even closer to finding it elsewhere in the universe,” Kokubo says. “I’m interested in life and why we are here.”

    Kokubo isn’t alone. The mystery of how we came to be and what it all means has fascinated mankind for centuries. Our unknown origins have been explored in great pieces of art and literature throughout history and are at the core of every religion. Now, thanks to ATERUI II, we are one step closer to getting our answer.

    See the full article here.



     
  • richardmitnick 9:39 am on December 13, 2018 Permalink | Reply
    Tags: HPC Spaceborne Computer, Science Node, Spaceborne Computer is first step in helping NASA get humanity to Mars

    From Science Node: “Launching a supercomputer into space” 


    03 Dec, 2018
    Kevin Jackson

    HPC Spaceborne supercomputer replica.

    Spaceborne Computer is first step in helping NASA get humanity to Mars.

    The world needs more scientists like Dr. Mark Fernandez. His southern drawl and warm personality almost make you overlook the fact that he’s probably forgotten more about high-performance computing (HPC) than you’ll ever know.


    The Spaceborne Computer is currently flying aboard the International Space Station to prove that high-performance computing hardware can survive and operate in outer space conditions. Courtesy HPE.

    Fernandez is the Americas HPC Technology Officer for Hewlett Packard Enterprise (HPE). His current baby is the Spaceborne Computer, a supercomputer that has spent more than a year aboard the International Space Station (ISS).

    In this time, the Spaceborne Computer has run through a gamut of tests to ensure it works like it’s supposed to. Now, it’s a race to accomplish as much as possible before the machine is brought home.

    Computing for the stars

    The Spaceborne Computer’s history extends well before its launch to the ISS. In fact, Fernandez explains that the project began about three years prior.

    “NASA Ames was in a meeting with us in the summer of 2014 and they said that, for a mission to Mars or for a lunar outpost, the distance was so far that they would not be able to continue their mission of supporting the space explorers,” says Fernandez. “And so they just sort of off-handedly said, ‘take part of our current supercomputer and see what it would take to get it operating in space.’ And we took up the challenge.”

    When astronauts send and receive data to and from Earth, this information moves at the speed of light. On the ISS, which orbits about 240 miles (400 kilometers) above Earth, data transmission still happens very quickly. The same won’t be true when humans begin our journey into the rest of the cosmos.

    “All science and engineering done here on Earth requires some type of high performance computing to make it function,” says Fernandez. “You don’t want to be 24 minutes away and trying to do your Mars dust storm predictions. You want to be able to take those scientific and engineering computations that are currently done here on Earth and bring them with you.”
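    The latency argument is simple arithmetic. A quick back-of-the-envelope check in Python (distances are approximate; the Earth-Mars distance varies from roughly 55 million to 400 million km depending on where the planets are in their orbits):

    C_KM_PER_S = 299_792.458  # speed of light

    def one_way_delay_seconds(distance_km):
        return distance_km / C_KM_PER_S

    print(f"ISS, ~400 km up:                 {one_way_delay_seconds(4.0e2):.4f} s")
    print(f"Mars at its closest, ~5.5e7 km:  {one_way_delay_seconds(5.5e7) / 60:.1f} min")
    print(f"Mars at its farthest, ~4.0e8 km: {one_way_delay_seconds(4.0e8) / 60:.1f} min")
    # The ~24-minute figure quoted above corresponds to a one-way distance of
    # roughly 4.3e8 km, near the far end of the Earth-Mars range.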

    To get ready for these kinds of tasks, the Spaceborne Computer has spent the past year performing standard benchmarking tests in what Fernandez calls the “acceptance phase.” Now that these experiments are done, it’s time to get interesting.

    The sky’s not the limit

    For traditional supercomputers, powering and cooling the machine often represents a huge cost. This isn’t true in space.

    “The Moderate Temperature Loop (MTL) is how the environment for the human astronauts is maintained at a certain temperature,” says Fernandez. “Our experiments are allowed to tap into that MTL, and that’s where we put our heat. Our heat is then expelled into the coldness of space for free. We have free electricity coming from the solar cells, and we have free cooling from the coldness of space and therefore, by definition, we have the most energy efficient supercomputer in existence anywhere on Earth or elsewhere.”

    The cost-neutral aspect of the Spaceborne Computer allows HPE to give researchers access to the machine for free before it must return to Earth. One of these experiments, announced at SC18, concerns Entry, Descent, and Landing (EDL) software.

    “If you’re going to build a Mars habitat, you need to land carefully,” says Fernandez. “This EDL software runs in real time, it’s connected to the thrusters on the spacecraft, and in real time determines where you are and adjusts your thrusters so that you can land within 50 meters of your target. Now, it’s never been tested in space, and the only place it will ever run is in space. So they’re very excited about getting it to run on the Spaceborne Computer.”

    While Fernandez is delighted that his machine will be able to test important innovations like this, he seems dismayed by all the science he won’t be able to do. The Spaceborne Computer will soon be brought back home by NASA, and he’s doing what he can to cram in as many important experiments as possible.

    Fernandez’s attitude speaks volumes about the mental outlook we’ll need to traverse the cosmos. He often uses the term “space explorers” in place of “astronauts” or even “researchers.” It’s a term that cuts to the heart of what scientists like him are attempting to do.

    “We’re proud to be good space explorers,” says Fernandez. “I say, let’s all work together. We’ve got free electricity. We have free cooling. Let’s push science as far and as hard as we can.”

    See the full article here.



     
  • richardmitnick 10:34 am on November 29, 2018 Permalink | Reply
    Tags: Science Node

    From Science Node: “The race to exascale” 


    30 Jan, 2018
    Alisa Alering

    Who will get the first exascale machine – a supercomputer capable of 10^18 floating point operations per second? Will it be China, Japan, or the US?

    When it comes to computing power you can never have enough. In the last sixty years, processing power has increased more than a trillionfold.

    Researchers around the world are excited because these new, ultra-fast computers represent a 50- to 100-fold increase in speed over today’s supercomputers and promise significant breakthroughs in many areas. That exascale supercomputers are coming is pretty clear. We can even predict the date, most likely in the mid-2020s. But the question remains as to what kind of software will run on these machines.

    Exascale computing heralds an era of ubiquitous massive parallelism, in which processors perform coordinated computations simultaneously. But the number of processors will be so high that computer scientists will have to constantly cope with failing components.

    The overhead of coordinating so many processors can also slow programs tremendously. The consequence is that, beyond the exascale hardware, we will also need exascale brains to develop new algorithms and implement them in exascale software.
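    One standard coping strategy for failing components, mentioned here only as an illustration of the kind of software problem involved (not anything specific to SPPEXA), is application-level checkpointing: a long computation periodically saves enough state to resume after a failure instead of starting over. A minimal sketch in Python:

    import os
    import pickle

    CHECKPOINT = "state.pkl"  # illustrative path

    def load_state():
        # Resume from the last checkpoint if one exists, otherwise start fresh.
        if os.path.exists(CHECKPOINT):
            with open(CHECKPOINT, "rb") as f:
                return pickle.load(f)
        return {"step": 0, "value": 0.0}

    def save_state(state):
        # Write to a temporary file and rename, so a crash mid-write
        # never leaves a corrupted checkpoint behind.
        tmp = CHECKPOINT + ".tmp"
        with open(tmp, "wb") as f:
            pickle.dump(state, f)
        os.replace(tmp, CHECKPOINT)

    state = load_state()
    for step in range(state["step"], 1_000_000):
        state["value"] += 1e-6        # stand-in for one unit of real work
        state["step"] = step + 1
        if state["step"] % 100_000 == 0:
            save_state(state)         # lose at most 100,000 steps to a failure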

    In 2011, the German Research Foundation established a priority program, “Software for Exascale Computing” (SPPEXA), to address fundamental research on various aspects of high performance computing (HPC) software, making the program the first of its kind in Germany.

    SPPEXA connects relevant sub-fields of computer science with the needs of computational science and engineering and HPC. The program provides the framework for closer cooperation and a co-design-driven approach. This is a shift from the current service-driven model, in which groups focused on fundamental HPC methodology (computer science and mathematics) collaborate at arm’s length with those working on science applications and maintaining the large codes (science and engineering).

    Despite exascale computing still being several years away, SPPEXA scientists are well ahead of the game, developing scalable and efficient algorithms that will make the best use of resources when the new machines finally arrive. SPPEXA drives research towards extreme-scale computing in six areas: computational algorithms, system software, application software, data management and exploration, programming, and software tools.

    Some major projects include research on alternative sources of clean energy; stronger, lighter-weight steel manufacturing; and unprecedented simulations of the Earth’s convective processes:

    EXAHD supports Germany’s long-standing research into the use of plasma fusion as a clean, safe, and sustainable carbon-free energy source. One of the main goals of the EXAHD project is to develop scalable and efficient algorithms to run on distributed systems, with the aim of facilitating the progress of plasma fusion research.

    EXASTEEL is a massively parallel simulation environment for computational material science. Bringing together experts from mathematics, material and computer sciences, and engineering, EXASTEEL will serve as a virtual laboratory for testing new forms of steel with greater strength and lower weight.

    TerraNeo addresses the challenges of understanding the convection of Earth’s mantle – the cause of most of our planet’s geological activity, from plate tectonics to volcanoes and earthquakes. Due to the sheer scale and complexity of the models, the advent of exascale computing offers a tremendous opportunity for greater understanding. But in order to take full advantage of the coming resources, TerraNeo is working to design new software with optimal algorithms that permit a scalable implementation.

    Exascale hardware is expected to have less consistent performance than current supercomputers due to fabrication, power, and heat issues. The machines’ sheer size and unprecedented number of components will likely increase fault rates. Fast and Fault-Tolerant Microkernel-based Operating System for Exascale Computing (FFMK) aims to address these challenges through a coordinated approach that connects system software, computational algorithms, and application software.

    Mastering the various challenges related to the paradigm shift from moderately to massively parallel processing will be the key to any future capability computing application at exascale. It will also be crucial for learning how to deal effectively and efficiently with smaller-scale or capacity computing tasks on near-future commodity systems. No matter who puts the first machine online, exascale supercomputing is coming. SPPEXA is making sure we are prepared to take full advantage of it.

    See the full article here.



     
  • richardmitnick 9:38 am on November 29, 2018 Permalink | Reply
    Tags: 1. Summit (US), 2. Sierra (US), 3. Sunway TaihuLight (China), 4. Tianhe-2 (China), 5. Piz Daint (Switzerland), Science Node

    From Science Node: “The 5 fastest supercomputers in the world” 


    Countries around the world strive to reach the peak of computing power–but there can be only one.

    19 Nov, 2018
    11.29.18 update
    Kevin Jackson

    Peak performance within supercomputing is a constantly moving target. In fact, a supercomputer is defined as any machine “that performs at or near the currently highest operational rate.” The field is a continual battle to be the best. Those who achieve the top rank may only hang on to it for a fleeting moment.

    Competition is what makes supercomputing so exciting, continually driving engineers to reach heights that were unimaginable only a few years ago. To celebrate this amazing technology, let’s take a look at the fastest computers as defined by computer ranking project TOP500—and at what these machines are used for.

    5. Piz Daint (Switzerland)

    Cray Piz Daint supercomputer of the Swiss National Supercomputing Center (CSCS)

    Named after a mountain in the Swiss Alps, Piz Daint has been Europe’s fastest supercomputer since its debut in November 2013. But a recent 40-million-euro upgrade has boosted the Swiss National Supercomputer Centre’s machine into the global top five, now running at 21.2 petaFLOPS and utilizing 387,872 cores.

    The machine has helped scientists at the University of Basel make discoveries about “memory molecules” in the brain. Other Swiss scientists have taken advantage of its ultra-high resolutions to set up a near-global climate simulation.

    4. Tianhe-2 (China)

    China’s Tianhe-2 Kylin Linux supercomputer at National Supercomputer Center, Guangzhou, China

    Tianhe-2, whose name translates as “MilkyWay-2,” has also seen recent updates. But despite now boasting a whopping 4,981,760 cores and running at 61.4 petaFLOPS, that hasn’t stopped it from slipping two spots in just one year—from #2 to #4.

    TOP500 reported that the machine, developed by the National University of Defense Technology (NUDT) in China, is intended mainly for government security applications. This means that much of the work done by Tianhe-2 is kept secret, but if its processing power is anything to judge by, it must be working on some pretty important projects.

    3. Sunway TaihuLight (China)

    Sunway NRCPC TaihuLight, China

    A former number one, Sunway TaihuLight had dominated the list since its debut in June 2016. At that time, its 93.01 petaFLOPS and 10,649,000 cores made it the world’s most powerful supercomputer by a wide margin, boasting more than five times the processing power of its nearest competitor (ORNL’s Titan) and nearly 19 times more cores.

    But given the non-stop pace of technological advancement, no position is ever secure for long. TaihuLight ceded the top spot to competitors in June 2018.

    Located at the National Supercomputing Center in Wuxi, China, TaihuLight’s creators are using the supercomputer for tasks ranging from climate science to advanced manufacturing. It has also found success in marine forecasting, helping ships avoid rough seas while also helping with offshore oil drilling.

    2. Sierra (US)

    LLNL IBM NVIDIA Mellanox ATS-2 Sierra Supercomputer

    Sierra initially debuted at #3 on the June 2018 list with 71.6 petaFLOPS, but optimization has since pushed the processing speed on its 1,572,480 cores to 94.6 petaFLOPS, earning it the #2 spot in November 2018.

    Incorporating both IBM central processing units (CPUs) and NVIDIA graphics processing units (GPUs), Sierra is specifically designed for modeling and simulations essential for the US National Nuclear Security Administration.

    1. Summit (US)

    ORNL IBM AC922 SUMMIT supercomputer. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

    Showing further evidence of the US Department of Energy’s renewed commitment to supercomputing power, Oak Ridge National Laboratory’s (ORNL) Summit first claimed the #1 spot in June 2018, taking the top rank from China for the first time in 6 years. Further upgrades have cemented that spot—at least until the next list comes out in June 2019.

    In the five months since its debut on the June 2018 list, Summit has widened its lead as the number one system, improving its High Performance Linpack (HPL) performance from 122.3 to 143.5 petaFLOPS.

    Scientists are already putting the world’s most powerful computer to work. A seven-member team from ORNL won the 2018 Gordon Bell Prize for their deployment of Summit to process genetic data in order to better understand how individuals develop chronic pain and respond to opioids.

    The race to possess the most powerful supercomputer never really ends. This friendly competition between countries has propelled a boom in processing power, and it doesn’t look like it’ll be slowing down anytime soon. With scientists using supercomputers for important projects such as curing debilitating diseases, we can only hope it will continue for years to come. [Whoever thinks this is a “friendly competition between countries” is way off base. This is a part of the Chinese route to world dominance]

    See the full article here.



     
  • richardmitnick 10:54 am on November 15, 2018 Permalink | Reply
    Tags: "Searching for ocean microbes", Bermuda Atlantic Time Series, Cyverse, DNA Databank of Japan, European Bioinformatics Institute, Hawaiian Ocean Time Series, Hurwitz Lab-University of Arizona, iMicrobe platform, National Center for Biotechnology Information, National Microbiome Collaborative, Planet Microbe, Science Node, The Hurwitz Lab corrals big data sets into a more searchable form to help scientists study microorganisms

    From Science Node: “Searching for ocean microbes” 


    07 Nov, 2018
    Susan McGinley

    How one lab is consolidating ocean data to track climate change.

    Courtesy David Clode/Unsplash.

    Scientists have been making monthly observations of the physical, biological, and chemical properties of the ocean since 1988. Now, thanks to the Hurwitz Lab at the University of Arizona (UA), researchers around the world have greater access than ever before to the information collected at these remote ocean sites.


    Led by Bonnie Hurwitz, assistant professor of biosystems engineering at UA, the Hurwitz Lab corrals big data sets into a more searchable form to help scientists study microorganisms – bacteria, fungi, algae, viruses, protozoa – and how they relate to each other, their hosts and the environment.

    Sample collection. Bonnie Hurwitz next to the metal pod that serves as the main chamber for the Alvin submersible that scientists operate to collect samples from the deepest parts of the ocean not accessible to people. Courtesy Stefan Sievert, Woods Hole Oceanographic Institution.

    The lab is building a data infrastructure on top of Cyverse to integrate information from diverse data stores in collaboration with the broader cyber community. The goal is to give people the ability to use data sets that span a range of storage servers, all in one place.

    “One of the exciting things my lab is funded for is Planet Microbe, a three-year project through the National Science Foundation (NSF), to bring together genomic and environmental data sets coming from ocean research cruises,” Hurwitz said.

    “Samples of water are taken using an instrument called a CTD that measures salinity, temperature, depth, and other features to create a scan of ocean conditions across the water column.”

    As the CTD descends into the ocean, bottles are triggered at different depths to collect water samples for a variety of experiments including sequencing the DNA/RNA of microbes. The moment each sample leaves the ship is often the last time these valuable and varied data appear together.

    The first phase of the project focuses on the Hawaiian Ocean Time Series and the Bermuda Atlantic Time Series. At both locations, samples are collected across an ocean transect at a variety of depths across the water column, from surface to deep ocean.

    A CTD device that measures water conductivity (salinity), temperature and depth is mounted underneath a set of water bottles used for collecting samples at varying depths in a column of water. Courtesy Tara Clemente, University of Hawaii.

    The readings taken at each level stream out to data banks around the world. Different labs conduct the analyses, but the Hurwitz lab reunites all of the data sets, including data from these long-term ecological sites used for monitoring climate and changes in the oceans.
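    In data terms, “reuniting” these measurements is essentially a join on a shared sample identifier. A minimal sketch with pandas (the column names and values below are hypothetical placeholders, not Planet Microbe’s actual schema):

    import pandas as pd

    # Hypothetical CTD readings: one row per bottle fired during the cast.
    ctd = pd.DataFrame({
        "sample_id":    ["S001", "S002", "S003"],
        "depth_m":      [5, 150, 1000],
        "temp_c":       [26.1, 18.4, 4.2],
        "salinity_psu": [36.2, 36.6, 34.9],
    })

    # Hypothetical sequencing metadata: one row per DNA/RNA sample.
    seq = pd.DataFrame({
        "sample_id":      ["S001", "S002", "S003"],
        "run_accession":  ["RUN123", "RUN124", "RUN125"],
        "reads_millions": [41.2, 38.7, 52.9],
    })

    # Re-associate each sequencing run with the ocean conditions it came from.
    merged = ctd.merge(seq, on="sample_id", how="inner")
    print(merged)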

    “Oceanographers have different tool kits. They are collecting data on ship to observe both the ocean environment and the genetics of microbes to understand the role they play in the ocean,” Hurwitz said. “We are including these data in a very simple web-based platform where users can run their own analyses and data pipelines to use the data in new ways.”

    While still in year one of the project, the first data have just been released under the iMicrobe platform, which connects users with computational resources for analyzing and visualizing the data.

    The platform’s bioinformatics tools let researchers analyze the data in new ways that may not have originally been possible when the data were collected, or to compare these global ocean data sets with new data as it becomes available.

    “We’re plumbers, actually, creating the pipelines between the world’s oceanographic data sets. We’re trying to enable scientists to access data from the world’s oceans,” Hurwitz said.

    A larger mission

    In addition to their Planet Microbe work, Hurwitz and her team work with the three entities that store and sync all of the world’s “omics” (genomics, proteomics) data – the European Bioinformatics Institute, the National Center for Biotechnology Information, and the DNA Databank of Japan – as well as other partners.

    “We are working with the National Microbiome Collaborative, a national effort to bring together the world’s data in the microbiome sciences, from human to ocean and everything in between,” Hurwitz said.

    “Having those data sets captured and searchable is great,” said Hurwitz. “They are so big they can’t be housed in any one place. The infrastructure allows you to search across these areas.”

    Going deep. Hurwitz and Amy Apprill, associate scientist at Woods Hole Oceanographic Institution, in front of the human-piloted Alvin submersible. Deep-water samples are collected using the pod’s robotic arm because the pressure of the water is too intense for divers. Courtesy Stefan Sievert, Woods Hole Oceanographic Institution.

    “If we want to start looking at things together in a holistic manner, we need to be able to remotely access data that are not on our servers. We are essentially indexing the world’s data and becoming a search engine for microbiome sciences.”

    By reconnecting ‘omics data with environmental data from oceanographic cruises, Hurwitz and her team are speeding up discoveries into environmental changes affecting the marine microbes that are responsible for producing half the oxygen that we breathe.

    These data can be used in the future to predict how our oceans respond to change and to specific environmental conditions.

    “Our researchers can not only use a $30 million supercomputer at XSEDE (Extreme Science and Engineering Discovery Environment) supported by the NSF for running analyses, they also have access to modern big data architectures through a simple computer interface.”

    “We’re trying to understand where all the data are and how we can sync them,” Hurwitz said. “How data are structured and assembled together has been like the Wild West. We’re figuring it out.”

    See the full article here.



     
  • richardmitnick 11:09 am on November 1, 2018 Permalink | Reply
    Tags: Abigail Hsu, Harshitha Menon, Laura Stephey, Margaret Lawson, More women more science, Science Node, Tess Bernard, Who says women don't like science?,   

    From Science Node: Women in STEM – “Who says women don’t like science? From renewable energy to big data, these five women are making a difference with advanced computing.”


    31 Oct, 2018
    Alisa Alering


    From renewable energy to big data, these five women are making a difference with advanced computing.

    There’s a misconception out there that women don’t like science. Or computers.

    But let’s not forget that it was Ada Lovelace who kicked off the computer era in 1843 when she outlined a sequence of operations for solving mathematical problems with Charles Babbage’s Analytical Engine. Or that up through the 1960s, women actually were the computers and the primary programmers.

    Times have changed, but women’s contributions to computing haven’t. So to correct some mistaken ideas, here are five cool things women are doing with high-performance computing.

    Speeding up our understanding of the Universe

    The Dark Energy Spectroscopic Instrument (DESI) survey will make the largest, most-detailed 3D map of the Universe ever created and help scientists better understand dark energy. Every night for 5 years, DESI will take images of the night sky that will be used to construct a 3D map spanning the nearby universe to 11 billion light years.

    LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson, Ariz, USA


    NOAO/Mayall 4 m telescope at Kitt Peak, Arizona, USA, Altitude 2,120 m (6,960 ft)

    But in order for that map to be made, images from the telescope must be processed by the Cori supercomputer.

    NERSC Cray Cori II supercomputer at NERSC at LBNL, named after Gerty Cori, the first American woman to win a Nobel Prize in science

    Laura Stephey, a postdoctoral fellow at Lawrence Berkeley National Lab (LBNL), is optimizing data processing for the DESI experiment so that results can be returned overnight, letting researchers plan their next night of observation.

    Developing fusion as a renewable energy source


    Model power source. University of Texas student Tess Bernard is developing computer simulations to model the physics of plasmas in order to design successful fusion experiments. Courtesy Kurzgesagt.

    Plasma is the fourth state of matter, made up of energetic, charged particles. Fusion happens when two light elements, like hydrogen, fuse together to form a heavier element, such as helium, and give off a lot of energy. This process happens naturally in stars like our sun, but scientists are working to recreate this in a lab.

    Tess Bernard, a graduate student at the University of Texas at Austin, is developing computer simulations to model the physics of plasmas in order to help design successful fusion experiments. Says Bernard, “If we can successfully harness fusion energy on earth, we can provide a clean, renewable source of energy for the world.”

    Dealing with big data

    Modern scientific computing addresses a wide variety of real-world problems, from developing efficient fuels to predicting extreme weather. But these applications produce immense volumes of data which are cumbersome to store, manage, and explore.

    Which is why Margaret Lawson, a PhD student at the University of Illinois at Urbana-Champaign and Sandia National Laboratories, is creating a system that allows scientists working with massive amounts of data to tag and search specific data. This makes it easier for scientists to make discoveries since the most interesting data is highlighted for further analysis.

    Preparing for exascale

    Exascale computing will represent a 50- to 100-fold increase in speed over today’s supercomputers and promises significant breakthroughs in many areas. But to reach these speeds, exascale machines will be massively parallel, and applications must be able to perform on a wide variety of architectures.

    Abigail Hsu, a PhD student at Stony Brook University, is investigating how different approaches to parallel optimization impact the performance portability of unstructured mesh Fortran codes. She hopes this will encourage the development of Fortran applications for exascale architectures.

    Sanity-checking simulations

    Computers make mistakes. And sometimes those failures have serious consequences. Like during the Gulf War, when an American missile failed to intercept an incoming Iraqi Scud. The Scud struck a barracks, killing 28 soldiers and injuring a hundred others. A report attributed this to computer arithmetic error – specifically a small error of 0.34 seconds in the system’s internal clock.
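    The arithmetic behind that figure is worth seeing. In the commonly cited analysis of the failure, the system counted time in tenths of a second using a 24-bit fixed-point register; because 0.1 has no exact binary representation, every tick carried a tiny truncation error that accumulated over roughly 100 hours of continuous operation. A rough reconstruction in Python (the exact register format used here is an assumption based on that analysis):

    # 0.1 cannot be represented exactly in binary; truncating it to a fixed
    # number of fractional bits leaves a small error on every clock tick.
    FRACTION_BITS = 23                      # assumed fractional precision of the register
    exact = 0.1
    stored = int(exact * 2**FRACTION_BITS) / 2**FRACTION_BITS
    per_tick_error = exact - stored         # about 9.5e-8 seconds per tick

    hours_up = 100                          # roughly how long the system had been running
    ticks = hours_up * 3600 * 10            # one tick every 0.1 s
    drift = per_tick_error * ticks
    print(f"per-tick error: {per_tick_error:.2e} s")
    print(f"drift after {hours_up} h: {drift:.2f} s")   # ~0.34 s
    # At a Scud's closing speed of roughly 1,700 m/s, a 0.34 s timing error
    # shifts the predicted intercept point by more than half a kilometre.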

    Harshitha Menon, a computer scientist at Lawrence Livermore National Laboratory (LLNL), is developing a method to understand the impact of arithmetic errors in computing. Her tool identifies vulnerable regions of code to ensure that simulations give correct results.

    Says Menon, “We need to understand the impact of these errors on our computer programs because scientists and policy makers rely on their results to make accurate predictions that can have lasting impact.”

    More women, more science

    Want to find out more? All of these researchers—and many more—will be presenting their work at SC18 in the Women in HPC workshop on Sunday, November 11, 2018.

    So that covers astronomy, physics, computer science, and math. And they say women don’t like science. We say that’s a pretty unscientific conclusion.

    See the full article here.



     
  • richardmitnick 11:12 am on October 18, 2018 Permalink | Reply
    Tags: Power of NeuroGaming Center, Science Node, UC San Diego Qualcomm Institute

    From Science Node: “Don’t look away” 


    16 Oct, 2018
    Alicia Clarke

    What if eye-tracking games could improve the lives of people living with autism?

    At some point we’ve probably all found ourselves immersed in a video game—having fun while trying to advance to the next level. But what if games could do more than entertain? What if they could improve cognitive behaviors and motor skills at the same time?

    If you look away, you crash your spaceship. Gaze-driven games harness the connection between eye movement and attention, training players that better engagement gets better results. Courtesy Alex Matthews, UC San Diego Qualcomm Institute.

    Those are some of the questions that led neuroscientist and eye-tracking expert Leanne Chukoskie to create video games that do just that. Chukoskie now directs the Power of NeuroGaming Center (aptly shortened to PoNG) at the Qualcomm Institute. There she and her team create video games to help people on the autism spectrum lead fuller lives.

    Filling a gap

    Together with Jeanne Townsend, director of UC San Diego’s Research on Autism and Development Lab, Chukoskie saw an opening to explore neurogaming as a way to improve attention, gaze control and other behaviors associated with autism. The video games are gaze-driven, which means that they are played with the eyes, and not a mouse or a touchscreen.

    “We realized there was enormous growth potential in autism intervention—where you translate research into tools that can help people,” Chukoskie said. “Jeanne and I wanted to intervene, not just measure things. We wanted our work to be useful to the world sooner rather than later. And these games are the result of that goal.”


    The power of attention. UCSD researchers are developing games that train attention-orienting skills like a muscle, improving social development outcomes for children with autism. Courtesy Global Silicon Valleys.

    Chukoskie and her team, which includes adults on the autism spectrum and high school students, created four games and are busy making more. Their work was recently on display at the Qualcomm Institute during PoNG’s 2018 Internship Showcase.

    “Dr. Mole and Mr. Hide is one of our favorites. It’s basically what you think it is—all these little moles pop out of holes and you have to look at them to knock them back down. There are ninja moles you want to hit. Then the player begins to see professor moles, which we don’t want them to hit. (My joke is we don’t hit professors at UC San Diego!) This promotes fast and accurate eye movement and builds inhibitory control,” she explained.

    Beyond the lab

    Getting the games in the hands of people who can benefit from them most is another aspect that keeps Chukoskie busy. She and Townsend co-founded BrainLeap Technologies in 2017 to make that goal a reality. BrainLeap Technologies is headquartered in the Qualcomm Institute Innovation Space, just a short walk from the PoNG lab.

    Dr. Mole and Mr. Hide. Knocking down moles as they pop out of holes promotes fast and accurate eye movement and builds inhibition control. Courtesy BrainLeap Technologies.

    “We want to make the games available to families, and eventually schools, so they do the most good for the most people.” said Chukoskie. “Starting a company wasn’t what I had in mind initially, but it soon became clear that’s what we needed to do.”

    As with her lab, students and interns play a critical role at BrainLeap Technologies. They bring their creativity, energy and skill. In return, they develop professional skills they can take into the workforce and their communities.

    The power of collaboration

    Not just for autism. Neuroscientist Leanne Chukoskie is also exploring using video game simulations with sensors that monitor stress responses as a possible intervention against human trafficking. Courtesy Alex Matthews, UC San Diego Qualcomm Institute.

    Chukoskie’s enthusiasm and knack for developing products with real-world applications is creating buzz within the walls of the Qualcomm Institute. She is exploring other fields where neurogaming could have an impact. One area is human trafficking. Could video simulations with sensors that monitor stress responses help people recognize subtle signs of danger first in a simulation and then later in the real world? The opportunities for interdisciplinary collaborations are endless.

    “UC San Diego, and especially the Qualcomm Institute, opened my eyes to what can happen when we bring the power of our expertise together,” Chukoskie said. “On top of that, the institute has a strong social mission. It didn’t take long for it to become obvious that the Qualcomm Institute was the right place for our lab and our business.”

    See the full article here.



     
  • richardmitnick 10:34 am on September 6, 2018 Permalink | Reply
    Tags: Science Node

    From Science Node: “Putting neutrinos on ice” 


    29 Aug, 2018
    Ken Chiacchia
    Jan Zverina

    IceCube Collaboration/Google Earth: PGC/NASA U.S. Geological Survey Data; SIO, NOAA, U.S. Navy, NGA, GEBCO; Landsat/Copernicus.

    Identification of cosmic-ray source by IceCube Neutrino Observatory depends on global collaboration.

    Four billion years ago—before the first life had developed on Earth—a massive black hole shot out a proton at nearly the speed of light.

    Fast forward—way forward—to 45.5 million years ago. At that time, the Antarctic continent had begun accumulating an ice sheet. Eventually Antarctica would capture 61 percent of the fresh water on Earth.

    Thanks to XSEDE resources and help from XSEDE Extended Collaborative Support Service (ECSS) experts, scientists running the IceCube Neutrino Observatory in Antarctica and their international partners have taken advantage of those events to answer a hundred-year-old scientific mystery: Where do cosmic rays come from?

    U Wisconsin IceCube neutrino detector at the South Pole. IceCube employs more than 5,000 detectors lowered on 86 strings into almost 100 holes in the Antarctic ice. Courtesy NSF/B. Gudbjartsson, IceCube Collaboration.

    Making straight the path

    First identified in 1912, cosmic rays have puzzled scientists. The higher in the atmosphere you go, the more of them you can measure. The Earth’s thin shell of air, scientists came to realize, was protecting us from potentially harmful radiation that filled space. Most cosmic ray particles consist of a single proton. That’s the smallest positively charged particle of normal matter.

    Cosmic-ray particles are ridiculously powerful. Gonzalo Merino, computing facilities manager for the Wisconsin IceCube Particle Astrophysics Center at the University of Wisconsin-Madison (UW), compares the energy of a proton accelerated by the LHC, the world’s largest atom smasher, to that of a mosquito flying into a person.

    By comparison, the “Oh-My-God” cosmic-ray particle detected by the University of Utah in 1991 carried the energy of a baseball flying at 58 miles per hour.

    Because cosmic-ray particles are electrically charged, they are pushed and pulled by every magnetic field they encounter along the way. Cosmic rays do not travel in a straight line, particularly if they come from a powerful object far away in the universe, so you can’t tell where they originated from the direction they arrive at Earth.
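
    To put a number on that scrambling, one can estimate a proton’s gyroradius, the radius of the circle a magnetic field bends it onto: roughly r ≈ E/(qBc) for a highly relativistic proton. The Python sketch below is a back-of-envelope illustration using typical textbook values for the galactic field; the numbers are assumptions for illustration, not values from the IceCube analysis.

```python
# Back-of-envelope: gyroradius of a cosmic-ray proton in the galactic magnetic field.
# All values are illustrative assumptions, not taken from the article.

E_eV = 1e15        # proton energy in eV (a "knee"-region cosmic ray)
B_tesla = 3e-10    # typical galactic magnetic field, about 3 microgauss
q = 1.602e-19      # proton charge in coulombs
c = 3.0e8          # speed of light in m/s

# For an ultra-relativistic proton, momentum p ~ E/c, so r = p/(qB) ~ E/(qBc).
E_joules = E_eV * q
r_meters = E_joules / (q * B_tesla * c)
r_lightyears = r_meters / 9.46e15

print(f"gyroradius ~ {r_lightyears:.1f} light years")
# Roughly a light year: tiny compared with the tens of thousands of light years
# a cosmic ray may cross, so its arrival direction says little about its source.
```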

    Particle-physics theorists came to the rescue.

    “If cosmic rays hit any matter around them, the collision will generate secondary products,” Merino says. “A byproduct of any high-energy interaction with the protons that make up much of a cosmic ray will be neutrinos.”

    Neutrinos respond to gravity and to what’s known as the weak subatomic force, like most matter. But they aren’t affected by the electromagnetic forces that send cosmic rays on a drunkard’s walk. Scientists realized that the intense showers of protons at the source of cosmic rays had to be hitting matter nearby, producing neutrinos that can be tracked back to their source.

    The shape of water

    But if the matter that makes up your instrument almost never interacts with an incoming neutrino, how are you going to detect it? The answer lay in making the detector big.

    “The probability that a neutrino will interact with matter is extremely low, but not zero,” Merino explains. “If you want to see neutrinos, you need to build a huge detector so that they collide with matter at a reasonable rate.”
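
    The scaling behind Merino’s point is simple: the expected number of interactions grows in direct proportion to the number of target nucleons, and therefore to the detector’s volume. The sketch below makes that concrete with round, made-up values for the neutrino flux and interaction cross section; only the proportionality, not the absolute rate, is the point.

```python
# Rough scaling of neutrino interaction rate with detector size.
# Flux and cross-section values are placeholders, not IceCube measurements.

AVOGADRO = 6.022e23

def target_nucleons(volume_m3, ice_density_kg_m3=917.0):
    """Approximate number of nucleons in a block of ice of the given volume."""
    mass_grams = volume_m3 * ice_density_kg_m3 * 1e3
    return mass_grams * AVOGADRO  # ~ Avogadro's number of nucleons per gram of matter

def expected_events(volume_m3, flux_per_cm2_s, cross_section_cm2, seconds=3.15e7):
    """Expected interactions per year: N_targets * flux * cross section * time."""
    return target_nucleons(volume_m3) * flux_per_cm2_s * cross_section_cm2 * seconds

flux = 1e-8    # neutrinos per cm^2 per second (made-up)
sigma = 1e-33  # interaction cross section in cm^2 (made-up)

big = expected_events(1e9, flux, sigma)     # a cubic-kilometer detector
small = expected_events(10.0, flux, sigma)  # a 10 m^3 tank
print(f"km-scale detector sees {big / small:.0e} times more interactions")
```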

    Multimessenger astronomy combines information from different cosmic messengers—cosmic rays, neutrinos, gamma rays, and gravitational waves—to learn about the distant and extreme universe. Courtesy IceCube Collaboration.

    Enter the Antarctic ice sheet. The ice there is nearly pure water and could itself serve as a detector. From 2005 through 2010, a UW-led team built the IceCube Neutrino Observatory by drilling 86 holes deep into the ice, lowering strings of sensors into them, and letting the water refreeze around the instruments. The finished detector consists of 5,160 optical sensors suspended in a cube of ice roughly six-tenths of a mile on each side.

    The IceCube scientists weren’t quite ready to detect cosmic-ray-associated neutrinos yet. While the observatory’s ice was nearly pure water, it wasn’t completely pure. As a natural formation, its transparency can differ a bit from spot to spot, which could affect detection.

    “Progress in understanding the precise optical properties of the ice leads to increasing complexity in simulating the propagation of photons in the instrument and to a better overall performance of the detector,” says Francis Halzen, a UW professor of physics and the lead scientist for the IceCube Neutrino Observatory.

    GPUs to the rescue

    The collaborators simulated the effects of neutrinos hitting the ice using traditional supercomputers built around standard central processing units (CPUs). They realized, though, that portions of their computations would run faster on graphics processing units (GPUs), originally invented to speed up video-game animation.

    “We realized that a part of the simulation is a very good match for GPUs,” Merino says. “These computations run 100 to 300 times faster on GPUs than on CPUs.”
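
    The good match Merino describes is photon propagation: the simulation follows enormous numbers of photons through the ice, and each photon scatters and is absorbed independently of all the others, exactly the kind of embarrassingly parallel work GPUs excel at. The toy Monte Carlo below, written with NumPy so that all photons step forward together, is only meant to illustrate that structure; it is not IceCube’s actual simulation code, and the optical parameters are placeholders.

```python
import numpy as np

# Toy Monte Carlo of photon propagation in ice: every photon is independent,
# which is why the real simulation maps so well onto GPUs.
# Scattering/absorption lengths are placeholders, not measured ice properties.

rng = np.random.default_rng(0)

def propagate(n_photons, absorption_m=100.0, scattering_m=25.0, max_steps=200):
    """Track photons until they are absorbed; return total path length per photon."""
    alive = np.ones(n_photons, dtype=bool)
    path = np.zeros(n_photons)
    for _ in range(max_steps):
        # Distance to the next scattering event, drawn per photon (vectorized).
        step = rng.exponential(scattering_m, size=n_photons)
        path = np.where(alive, path + step, path)
        # Photon survives the step with probability exp(-step / absorption length).
        survived = rng.random(n_photons) < np.exp(-step / absorption_m)
        alive &= survived
        if not alive.any():
            break
    return path

lengths = propagate(1_000_000)
print(f"mean photon path ~ {lengths.mean():.1f} m")
```

    Because each photon (or batch of photons) maps naturally onto its own GPU thread, speedups of the magnitude Merino cites become possible.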

    UW-Madison’s own GPU cluster and the GPU systems on collaborators’ campuses helped, but it wasn’t enough.


    Then Merino had a talk with XSEDE ECSS expert Sergiu Sanielevici from the Pittsburgh Supercomputing Center (PSC), lead of XSEDE’s Novel and Innovative Projects.

    Sanielevici filled him in on the large GPU capability of XSEDE supercomputing systems. The IceCube team wound up using a number of XSEDE machines for GPU and CPU computations: Bridges at PSC, Comet at the San Diego Supercomputer Center (SDSC), XStream at Stanford University and the collection of clusters available through the Open Science Grid Consortium.

    Bridges at PSC

    SDSC Dell Comet supercomputer at the San Diego Supercomputer Center

    Stanford U Cray XStream supercomputer

    The IceCube scientists could not assume that their computer code would run well on the XSEDE systems. Their massive and complex flow of calculations could have slowed down considerably if it clashed with the new machines. ECSS expertise was critical to making the integration smooth.

    “XSEDE’s resources integrated seamlessly; that was very important for us,” Merino says. “XSEDE has been very collaborative, extremely open in facilitating that integration.”

    Paydirt

    Their detector built and simulated, the IceCube scientists had to wait for it to detect a cosmic neutrino. On Sept. 22, 2017, it happened. An automated system tuned to the signature of a cosmic-ray neutrino sent a message to the members of the IceCube Collaboration, an international team with more than 300 scientists in 12 countries.

    This was important. A single neutrino detection would not have been proof by itself. Scientists at observatories that detect other types of radiation expected from cosmic rays needed to look at the same spot in the sky.

    Blazars are a type of active galaxy with one of their jets pointing toward us. They emit both neutrinos and gamma rays that can be detected by the IceCube Neutrino Observatory as well as by other telescopes on Earth and in space. Courtesy IceCube/NASA.

    They found multiple types of radiation coming from the same spot in the sky as the neutrino. At that spot was a “blazar” called TXS 0506+056, about 4 billion light years from Earth. A type of active galactic nucleus (AGN), a blazar is a huge black hole sitting in the center of a distant galaxy, flaring as it devours the galaxy’s matter. Blazars are AGNs whose jets happen to be pointed straight at us.

    The scientists think that the vast forces surrounding the black hole are likely the catapult that shot cosmic-ray particles on their way toward Earth. After a journey of 4 billion years across the vastness of space, one of the neutrinos created by those particles blazed a path through IceCube’s detector.

    The IceCube scientists went back over nine and a half years of detector data recorded before they’d set up their automated warning. They found several earlier detections from TXS 0506+056, greatly raising their confidence.
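
    The reason a handful of archival events matters so much is statistical: if background alone rarely produces neutrinos from that patch of sky, then seeing several is very unlikely to be a fluke. The sketch below shows the Poisson logic with placeholder counts; these are not the collaboration’s published numbers or its actual statistical method.

```python
from math import exp, factorial

def p_value(observed, expected_background):
    """Probability of seeing at least `observed` events from background alone."""
    p_fewer = sum(
        exp(-expected_background) * expected_background**k / factorial(k)
        for k in range(observed)
    )
    return 1.0 - p_fewer

# Placeholder counts: suppose background predicts ~2 events from that direction
# over the archival period, and the search turns up 10.
print(p_value(10, 2.0))  # a few times 1e-5: very unlikely to be chance
```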

    The findings led to two papers in the prestigious journal Science in July 2018. Future work will focus on confirming that blazars are the source—or at least a major source—of the high-energy particles that fill the Universe.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     