Tagged: Science Node

  • richardmitnick 12:35 pm on June 10, 2017 Permalink | Reply
    Tags: Irish Centre for High-End Computing, PRACE, Science Node, Sinéad Ryan

    From Science Node: Women in STEM – “A day in the life of an Irish particle physicist” Sinéad Ryan

    Science Node

    02 Jun, 2017
    Tristan Fitzpatrick

    Sinéad Ryan is a quantum chromodynamics expert in Dublin. She relies on PRACE HPC resources to calculate the mass of quarks, gluons, and hadrons — and uncover the secrets of the universe.

    Uncovering the mysteries of the cosmos is just another day in the office for Sinéad Ryan.

    Ryan, professor of theoretical high energy physics at Trinity College Dublin, specializes in quantum chromodynamics (QCD). The field examines how quarks and gluons form hadrons, the fundamental starting point of our universe.

    “Quarks and gluons are the building blocks for everything in the world around us and for our universe,” says Ryan. “The question is, how do these form the matter that we see around us?”

    To answer this, Ryan performs numerical simulations on high-performance computing (HPC) resources managed by the Partnership for Advanced Computing in Europe (PRACE).

    “I think PRACE is crucial for our field,” says Ryan, “and I’m sure other people would tell you the same thing.”

    When quarks are pulled apart, energy grows between them, similar to the tension in a rubber band when it is stretched. Eventually, enough energy is produced to create more quarks, which then form hadrons in accordance with Einstein’s equation E=mc².

    The problem, according to Ryan, comes in solving the equations of QCD. PRACE’s HPC resources make Ryan’s work possible because they enable her to run simulations on a larger scale than simple pen and paper would allow.

    “It’s a huge dimensional integral to solve, and we’re talking about solving a million times a million matrices that we must invert,” says Ryan.

    “This is where HPC comes in. If you want to make predictions in the theory, you need to be able to do the simulations numerically.”
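
    To give a feel for the pattern of computation involved: lattice QCD codes spend most of their cycles solving enormous sparse linear systems (effectively “inverting” the discretized Dirac operator) with iterative Krylov methods. Below is a minimal sketch in Python; the matrix here is a simple stand-in, not a real Dirac operator.

    ```python
    # Minimal sketch of the core numerical task in lattice QCD: solving A x = b
    # for a huge sparse matrix A. Here A is a toy symmetric positive definite
    # matrix standing in for the discretized Dirac operator.
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import cg

    n = 100_000  # real lattice problems reach millions of unknowns
    A = sp.diags(
        [4.0 * np.ones(n), -1.0 * np.ones(n - 1), -1.0 * np.ones(n - 1)],
        offsets=[0, -1, 1],
        format="csr",
    )
    b = np.random.default_rng(0).normal(size=n)  # a source vector

    # Krylov solvers such as conjugate gradient never form A^-1 explicitly;
    # they need only matrix-vector products, which is what makes
    # million-by-million systems tractable on HPC machines.
    x, info = cg(A, b)
    print("converged:", info == 0, "residual:", np.linalg.norm(A @ x - b))
    ```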

    In Ireland, the Irish Centre for High-End Computing is one resource Ryan has tapped in her research, but PRACE enables her and her collaborators to access resources not just locally but across the world.

    IITAC IBM supercomputer

    “This sort of work tends to be very collaborative and international,” says Ryan. “We can apply through PRACE for time on HPC machines throughout Europe. In my field, any machine anywhere is fair game.”

    Besides providing resources, PRACE also determines whether HPC resources are suitable for the kinds of research questions scientists are interested in answering.

    “PRACE’s access to these facilities means that good science gets done on these machines,” says Ryan. “These are computations that are based around fundamental questions posed by people who have a track record for doing good science and asking the right questions. I think that’s crucial.”

    Without PRACE’s support, Ryan’s work examining how quarks and gluons form matter and the beginnings of our universe would be greatly diminished, leaving us a step further from uncovering the building blocks of the universe.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    STEM Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 7:56 pm on June 9, 2017 Permalink | Reply
    Tags: Science Node, SDSC – San Diego Supercomputer Center

    From Science Node: “XSEDE cuts through the noise” 

    Science Node

    06 June, 2017
    Alisa Alering

    Courtesy LIGO; Caltech; MIT; Sonoma State; Aurore Simonnet.

    Over two billion years ago, when multicellular life had only just begun to evolve on Earth, two black holes collided and merged to form a new black hole.

    The merger produced a black hole with a mass 49 times that of our sun and set off ripples in space-time that radiated from the event like waves from a stone thrown into a pond. Predicted by Albert Einstein in 1916 and known as gravitational waves, those ripples are still traveling.


    Surf’s up! The Extreme Science and Engineering Discovery Environment (XSEDE) provides the HPC resources required to pluck gravitational waves from the noise found on LIGO detectors. Courtesy XSEDE.

    Able to pass through dust, matter, or anything else without being distorted, gravitational waves carry unique information about cosmic events that can’t be obtained in any other way. When the waves reach Earth, they give astrophysicists a completely new way to explore the universe.

    The first such waves were detected on September 14, 2015 by the Laser Interferometer Gravitational-Wave Observatory (LIGO) Scientific Collaboration. In the months since, two more gravitational wave events have been confirmed, one in December 2015 and the most recent on January 4, 2017.


    Caltech/MIT Advanced LIGO detector installation, Hanford, WA, USA


    Caltech/MIT Advanced LIGO detector installation, Livingston, LA, USA

    Cornell SXS, the Simulating eXtreme Spacetimes (SXS) project


    Gravitational waves. Credit: MPI for Gravitational Physics/W.Benger-Zib

    ESA/eLISA the future of gravitational wave research

    Signal from noise

    When gravitational waves pass by, they change the distance between objects. The change is so infinitesimal that it can’t be felt, or seen with a microscope. But incredibly sensitive scientific instruments—interferometers—can detect a change that is a thousand times smaller than a proton.

    The LIGO Scientific Collaboration, a body of more than 1,000 international scientists who collectively perform LIGO research, operates two interferometers located about 1,900 miles (3,002 km) apart in Washington and Louisiana, USA.

    Despite the sensitivity of the instruments, it’s not easy to detect a gravitational wave. When a signal is received, scientists must determine what it means and how likely it is to be noise or a real gravitational wave. Making that determination requires high-performance computing.

    Since 2013, LIGO has collaborated with the Extreme Science and Engineering Discovery Environment (XSEDE), a National Science Foundation (NSF)-funded cyberinfrastructure network that includes not just high-performance computing systems but also experts who help researchers move projects forward.

    Better, faster, cheaper

    In order to validate the discovery of a gravitational wave, researchers measure the significance of the signal by calculating a false alarm rate for the event.
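
    As an illustration of the underlying idea, here is a toy matched filter in Python. This is a sketch of the technique only, not LIGO’s production pipeline, which works with noise-weighted frequency-domain inner products and estimates false alarm rates by time-shifting data between detectors.

    ```python
    # Toy matched filter: recover a weak "chirp" buried in noise by correlating
    # the data against a known template waveform.
    import numpy as np

    rng = np.random.default_rng(42)
    fs = 4096                              # sample rate (Hz)
    t = np.arange(0, 1.0, 1 / fs)

    # Template: a short chirp whose frequency sweeps upward, loosely like an inspiral.
    template = np.sin(2 * np.pi * (30 * t + 100 * t**2)) * np.exp(-((t - 0.5) ** 2) / 0.01)
    template /= np.linalg.norm(template)   # unit norm so the correlation reads as SNR

    data = rng.normal(size=t.size)         # toy detector noise: white Gaussian
    data += 5.0 * template                 # bury a weak signal in it

    # Slide the template over the data; with unit-variance noise and a unit-norm
    # template, the peak correlation approximates the matched-filter SNR.
    snr = np.abs(np.correlate(data, template, mode="same"))
    print(f"peak SNR ~ {snr.max():.1f} at t = {t[snr.argmax()]:.3f} s")
    ```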


    Making waves, taking names. The top part of the animation shows two black holes orbiting each other until they merge, and the lower part shows the two distinct gravitational waves emitted. Thanks to supercomputers at TACC and SDSC, researchers can pick out these waves from other detector noise. Courtesy Simulating eXtreme Spacetimes collaboration.

    TACC Maverick HP NVIDIA supercomputer

    TACC Lonestar Cray XC40 supercomputer

    Dell PowerEdge Stampede supercomputer, Texas Advanced Computing Center (9.6 PF)

    TACC HPE Apollo 8000 Hikari supercomputer

    SDSC Triton HP supercomputer

    SDSC Gordon-Simons supercomputer

    SDSC Dell Comet supercomputer

    Once confirmed, further supercomputer analysis is used to extract precise estimates of the physical properties of the event, including the masses of the colliding objects, position, orientation, and distance from the Earth, carefully checking millions of combinations of these characteristics and testing how well the predicted waveform matches the signal detected by LIGO.

    To draw larger conclusions about the nature of black holes requires careful modeling based on the received data. Each simulation can take from a week to one month to complete, depending upon the complexity.

    Such intensive data analysis requires large scale high-throughput computing with parallel workflows at the scale of tens of thousands of cores for long periods of time. LIGO has been allocated millions of hours on XSEDE’s high-performance computers, including Stampede at the Texas Advanced Computing Center (TACC) and Comet at the San Diego Supercomputer Center (SDSC).

    Over the first year of XSEDE’s collaboration with LIGO, XSEDE worked to increase the speed of the applications, making them 8-10x faster on average.

    “The strategic collaboration between the two NSF-funded projects allows for accelerated scientific discovery which also translates into cost-savings for LIGO on the order of tens of millions of dollars so far,” says Pedro Marronetti, Gravitational Physics program director at the NSF.

    Waves of the future

    LIGO plans to upgrade its observatories and improve the sensitivity of its detectors before the next observational period begins in late 2018. LIGO predicts that once its observatories reach their most sensitive state, they may be able to detect as many as 40 gravitational waves per year.

    More instruments like LIGO will soon be listening for waves around the world in Italy, Japan, and India. Scientists also hope to place interferometers in orbit in order to avoid interference from Earth noise.

    And that means much more computing power will be required to verify the signals and extract information about the nature and origins of our universe.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    STEM Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 1:12 pm on May 27, 2017 Permalink | Reply
    Tags: 1 million° and breezy: Your solar forecast, Science Node

    From Science Node: “1 million° and breezy: Your solar forecast”

    Science Node

    24 May, 2017
    Alisa Alering

    Space is a big place, so modeling activities out there calls for supercomputers that match. PRACE provided scientists the resources to run the Vlasiator code and simulate the solar wind around the earth.

    Courtesy Minna Palmroth; Finnish Meteorological Institute.

    Outer space is a tough place to be a lonely blue planet.

    With only a thin atmosphere standing between a punishing solar wind and the 1.5 million species living on its surface, any indication of the solar mood is appreciated.

    The sun emits a continuous flow of plasma traveling at speeds up to 900 km/s and temperatures as high as 1 million° Celsius. The earth’s magnetosphere blocks this wind and allows it to flow harmlessly around the planet like water around a stone in the middle of a stream.

    Magnetosphere of Earth, original bitmap from NASA. SVG rendering by Aaron Kaase

    But under the force of the solar bombardment, the earth’s magnetic field responds dramatically, changing size and shape. The highly dynamic conditions this creates in near-Earth space are known as space weather.

    Vlasiator, a new simulation developed by Minna Palmroth, professor in computational space physics at the University of Helsinki, models the entire magnetosphere. It helps scientists to better understand interesting and hard-to-predict phenomena that occur in near-Earth space weather.

    Unlike previous models that could only simulate a small segment of the magnetosphere, Vlasiator allows scientists to study causal relationships between plasma phenomena for the first time and to consider smaller scale phenomena in a larger context.

    “With Vlasiator, we are simulating near-Earth space with better accuracy than has even been possible before,” says Palmroth.

    Navigating near-Earth

    Over 1,000 satellites and other near-Earth spacecraft are currently in operation around the earth, including the International Space Station and the Hubble Telescope.

    Nearly all communications on Earth — including television and radio, telephone, internet, and military — rely on links to these spacecraft.

    Still other satellites support navigation and global positioning and meteorological observation.

    New spacecraft are launched every day, and the future promises even greater dependence on their signals. But we are launching these craft into a sea of plasma that we barely understand.

    “Consider a shipping company that would send its vessel into an ocean without knowing what the environment was,” says Palmroth. “That wouldn’t be very smart.”

    Space weather has an enormous impact on spacecraft, capable of deteriorating signals to the navigation map on your phone and disrupting aviation. Solar storms even have the potential to overwhelm transformers and black out the power grid.

    Through better comprehension and prediction of space weather, Vlasiator’s comprehensive model will help scientists protect vital communications and other satellite functions.

    Three-level parallelization

    Vlasiator’s simulations are so detailed that they can model the most important physical phenomena in near-Earth space at the ion-kinetic scale. This amounts to a volume of 1 million km³ — a massive computational challenge that has not previously been possible.

    After being awarded several highly competitive grants from the European Research Council, Palmroth secured computation time on HPC resources managed by the Partnership for Advanced Computing in Europe (PRACE).

    Hazel Hen

    She began with the Hornet supercomputer and then its successor Hazel Hen, both at the High-Performance Computing Center Stuttgart. Most recently she has been using the Marconi supercomputer at CINECA in Italy.

    Marconi supercomputer at CINECA in Italy

    Palmroth’s success is due to three-level parallelization of the simulation code. Her team uses domain decomposition to split near-Earth space into grid cells, dividing the cells among the simulation’s processes with load balancing.

    Within each process, the work is parallelized across cores using OpenMP. Finally, they vectorize the code so that each core operates on several data elements at once (a sketch of this layering follows below).
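
    As a rough illustration of the layering, here is a toy sketch in Python. This is not Vlasiator itself, which is built on MPI, OpenMP, and hardware vector units; it only shows where each level of parallelism would act.

    ```python
    # Toy three-level layering on a 1-D grid (illustrative only):
    #   level 1, domain decomposition: split the grid into subdomains, which a
    #     real code distributes across nodes with MPI and load balancing;
    #   level 2, threading: process subdomains with a worker pool, the role
    #     OpenMP threads play within a node;
    #   level 3, vectorization: update each subdomain with whole-array NumPy
    #     expressions, standing in for SIMD instructions inside a core.
    from concurrent.futures import ThreadPoolExecutor
    import numpy as np

    def update_subdomain(cells: np.ndarray) -> np.ndarray:
        # Vectorized "physics" update: one expression touches every cell.
        # (A real code would also exchange halo cells between neighbors.)
        return 0.5 * (np.roll(cells, 1) + np.roll(cells, -1))

    grid = np.random.default_rng(1).random(1_000_000)
    subdomains = np.array_split(grid, 8)               # level 1

    with ThreadPoolExecutor(max_workers=4) as pool:    # level 2
        updated = list(pool.map(update_subdomain, subdomains))

    grid = np.concatenate(updated)                     # reassemble the domain
    print(grid.shape)
    ```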

    Even so, simulation datasets range from 1 to 100 terabytes, depending on how often they save the simulations, and require anywhere from 500 to 100,000 cores, possibly more, on Hazel Hen.

    “We are continuously making algorithmic improvements in the code, making new optimizations, and utilizing the latest advances in HPC to improve the efficiency of the calculations all the time,” says Palmroth.

    Taking off into the future

    In addition to advancing our knowledge of space weather, Vlasiator also helps scientists to better understand plasma physics. Until now, most fundamental plasma physical phenomena have been discovered from space because it’s the best available laboratory.

    But the universe is 99.9 percent plasma, the fourth state of matter. In order to understand the universe, you need to understand plasma physics. For scientists undertaking any kind of matter research, Vlasiator’s capacity to simulate near-Earth space is significant.

    “As a scientist, I’m curious about what happens in the world,” says Palmroth. “I can’t really draw a line beyond which I don’t want to know what happens.”

    Significantly, Vlasiator has recently helped to explain some features of ultra-low frequency waves in the earth’s foreshock that have perplexed scientists for decades.

    A collaboration with NASA in the US helped validate those results with the THEMIS spacecraft, a constellation of five identical probes designed to gather information about large-scale space physics.

    Exchanging information with her colleagues at NASA allows Palmroth to get input from THEMIS’s direct observation of space phenomena and to exchange modeling results with the observational community.

    “The work we are doing now is important for the next generation,” says Palmroth. “We’re learning all the time. If future generations build upon our advances, their understanding of the universe will be on much more certain ground.”

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    STEM Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 9:48 am on May 25, 2017 Permalink | Reply
    Tags: Science Node

    From Science Node: “More than meets the eye” 

    Science Node

    22 Mar, 2017 [Where has this been?]
    Tristan Fitzpatrick

    Courtesy Innovega.

    Hold on tight, because an NSF-funded contact lens and eyewear combo is about to plunge us all into the Metaverse.

    Augmented reality (AR) has been steadily making inroads into society. Sure, the gaming is fun, but when you consider fields like medical training and remote site access for safety inspectors and science teachers, AR offers a lot of promise.

    However, many head-mounted displays (HMDs) are clunky and cumbersome, and they continue to restrict wide-scale adoption of AR. What’s more, continual access to the digital world while navigating the real world presents a safety challenge.


    Close encounter. A smart investment by the NSF, eMacula puts augmented reality in a contact lens.

    So what’s the remedy that will vault us into the Metaverse?

    “Contact lenses appear to be an optimal solution, but only if the user experience can deliver on the promise in comfort, real functionality, and a price point that consumers can afford,” says Chris Collins, founder and technical lead for the Center for Simulations and Virtual Environments Research (UCSIM) at the University of Cincinnati.

    “A product that meets all of those challenges could be a game-changer and bring us that much closer to a seamless immersive experience.”

    Virtual freedom

    Finally, scientists have come up with a way to free users from HMDs and bring the virtual world closer than ever before.

    To better integrate the two worlds, Seattle-based startup Innovega has developed eMacula.

    “Allowing a user to have their digital media within their normal and unobstructed field of view means that people will stop staring at their phones and devices and start looking at each other again,” says Jay Marsh, vice president of engineering at Innovega.

    “It means that bio-metric health monitoring can be truly ‘real time, all the time’ and with driving/navigation directions painted on the road in front of you, you will never be distracted with a gaze shift to your mobile devices for guidance.”

    Funded in part by Small Business Innovation Research grants from the US National Science Foundation, Innovega has developed a filtered lens in a hybrid style contact lens along with lightweight and stylish eyewear.

    Pairing the glasses with a contact lens significantly reduces the burdens previously associated with HMDs.

    “NSF was a critical player in the development of our current soft lens technology,” says Marsh.

    Double trouble. Augmented reality has been held back by cumbersome interfaces and uncomfortable user experience. Courtesy David Hoffman, et al.

    Troubled technology

    With traditional AR, a problem known as vergence-accommodation conflict arises since our eyes try to focus on the screen in front of them, but end up converging at a farther distance. Often eyestrain, headaches, and nausea result.

    The contact lenses circumvent this problem because the media on the lenses is in focus while the eyes converge on the glasses behind.

    Merging the real and virtual worlds has broad implications for many professions, Marsh says. Doctors and technicians that need both hands to execute their work while also needing access to critical information will find great use in merging the virtual and real worlds.

    In an industry setting, Marsh foresees applications that allow for system level safety data to be provided to all operators in real time, and more refined information based on physical locations.

    The lenses are currently undergoing FDA approval and should be on the market later in 2017.

    It appears the next step in the digital mobile evolution of technology may be on the horizon, and the natural integration of the virtual world into our real experience is at hand.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    STEM Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 11:45 am on May 24, 2017 Permalink | Reply
    Tags: HPC heard around the world, Science Node

    From Science Node: “HPC heard around the world” 

    Science Node

    19 May, 2017
    Tristan Fitzpatrick

    A new advanced computing alliance indicates cooperation and collaboration are alive and well in the global research community.

    It was an international celebration in Barcelona, Spain, as representatives from three continents met at PRACEdays17 to sign a memorandum of understanding (MoU), formalizing a new era in advanced research computing.

    On hand to recognize the partnership were John Towns, principal investigator of the Extreme Science and Engineering Discovery Environment (XSEDE), Serge Bogaerts, managing director of the Partnership for Advanced Computing in Europe (PRACE), and Masahiro Seki, president of the Japanese Research Organization for Information Science and Technology (RIST).

    “We are excited about this development and fully expect this effort will support the growing number of international collaborations emerging across all fields of scholarship,” says Towns.

    As steward of the US supercomputing infrastructure, XSEDE will share their socio-technical platform that integrates and coordinates the advanced digital services that support contemporary science across the country.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    STEM Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 1:14 pm on May 7, 2017 Permalink | Reply
    Tags: Into the wild simulated yonder, Science Node

    From Science Node: “Into the wild simulated yonder” 

    Science Node

    28 Apr, 2017
    Tristan Fitzpatrick

    So much of our universe is known to us, but there is so much more we don’t know – yet.

    For example, normal matter that is easily observable (such as gas and rocks) makes up only four percent of the known mass-energy in the universe, according to the Canada-France Hawaii Lensing Survey (CFHTLenS).

    Image: NASA/ESA



    CFHT Telescope, Mauna Kea, Hawaii, USA

    The other 96 percent is dark matter and dark energy. These two components are critical to science’s understanding of how the universe was formed, but there’s one problem: dark matter and dark energy can be observed only through how they affect the four percent of the universe scientists can measure.

    Telescopes don’t help much in observing these galactic effects, and this creates a challenge for scholars.

    Building a mystery

    One research project seeks to bring science one step closer to solving this cosmic mystery.

    Carnegie Mellon University associate professor Rachel Mandelbaum uses generative adversarial networks (GANs) to simulate galaxies warped by gravitational lensing – and takes science one step closer to solving the origins of our cosmos.

    Gravitational Lensing NASA/ESA


    Gravitational microlensing, S. Liebes, Physical Review 133 (1964): B835

    Gravitational lensing is a process by which mass bends light, an effect predicted by Albert Einstein’s theory of general relativity. The larger an object, the greater its gravitational field will be, and the greater its ability to bend light rays.
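
    In the simplest case of a point mass M, general relativity predicts that a light ray passing at impact parameter b is deflected by an angle α = 4GM/(c²b), twice the value Newtonian gravity would give; that factor of two is what Eddington’s 1919 eclipse expedition famously confirmed.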

    ______________________________________________________________________

    Because light bending is sensitive to the strength of the gravitational field it goes through, observing the lensing effect can be used to measure dark matter.
    ______________________________________________________________________

    There are several difficulties with observing rays of light, however. According to Mandelbaum’s research [ScienceDirect], detector imperfections, telescopic blurring and distortion, atmospheric effects, and noise can all affect the quality of the data, making research challenging for scientists.

    A traditional neural network, unlike a GAN, learns to tell images apart only if people have first tagged those images and written descriptions. Eventually the network learns to distinguish images by itself, but only after training on this manually labeled data.

    GANs save resources compared to other neural networks because fewer people are needed to operate them: there is no manual tagging and description step. Instead, a generator network produces fake images on its own, and a second network learns to tell them apart from real ones, with both improving as they compete.
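
    For readers who want the mechanics, here is a minimal, generic GAN training loop (a sketch of the technique, not Mandelbaum’s code): a generator and a discriminator are trained in alternation, and the “images” are just points drawn from a 1-D Gaussian so the example runs in seconds on a CPU.

    ```python
    # Minimal GAN sketch: G learns to produce samples that D cannot
    # distinguish from real data drawn from N(2.0, 0.5).
    import torch
    import torch.nn as nn

    torch.manual_seed(0)
    latent_dim = 8

    G = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, 1))
    D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

    opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
    bce = nn.BCELoss()

    for step in range(2000):
        real = torch.randn(64, 1) * 0.5 + 2.0        # "real" data
        fake = G(torch.randn(64, latent_dim))        # generated data

        # Discriminator step: label real samples 1, generated samples 0.
        d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
        opt_d.zero_grad(); d_loss.backward(); opt_d.step()

        # Generator step: try to fool the discriminator into saying "real".
        g_loss = bce(D(fake), torch.ones(64, 1))
        opt_g.zero_grad(); g_loss.backward(); opt_g.step()

    with torch.no_grad():
        samples = G(torch.randn(1000, latent_dim))
    print(f"generated mean ~ {samples.mean().item():.2f} (target 2.0)")
    ```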

    Astronomical implications

    Mandelbaum’s research has wide implications in astronomy, as it can serve as a useful starting point for astronomical image analysis when telescopic problems and other issues create obstacles for scientists.

    A portion of Mandelbaum’s simulation of a focal plane on the Large Synoptic Survey Telescope [?Not built yet]. Courtesy Mandelbaum, et al.

    Astrophysicist Peter Nugent at the Computational Research Division of Lawrence Berkeley National Laboratory and his colleagues have researched a gravitationally lensed supernova using computer simulations, which will shed light on how matter is distributed throughout the galaxy.

    Discovering new ways to explore the unknown universe is just one of the many possibilities computational science offers.

    To learn more about gravitational lensing, visit the CFHTLenS website or check out Mandelbaum’s research paper.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    STEM Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 2:53 pm on May 6, 2017 Permalink | Reply
    Tags: Apolo, Purdue University, Science Node, Universidad EAFIT in Medellín

    From Science Node: “Supercomputing sister sites” 

    Science Node

    03 May, 2017
    Kirsten Gibson

    Juan Carlos Vergara used to have to go two weeks at a time without his personal computer while it was busy modeling earthquakes. Then he found Apolo.

    Long-distance relationship. Purdue University and EAFIT have teamed up to bring supercomputing to Colombia. Here, Gerry McCartney and Donna Cumberland from Purdue University discuss Apolo with Juan David Pineda from EAFIT. Courtesy Purdue University, EAFIT.

    Apolo, the first research supercomputer at the Universidad EAFIT in Medellín, is the fruit of a partnership between Purdue University’s research computing unit and the Apolo Scientific Computing Center.

    With Apolo, Vergara finishes his work in days instead of months, and can expand the scale of his simulations five million times.

    With Apolo comes a staff to run it, including Juan David Pineda, Apolo’s technology coordinator, and Mateo Gomez, a high-performance computing (HPC) analyst.

    “Sometimes they would be up until 1 a.m. helping me solve problems,” says Vergara, a doctoral student of applied mechanics at EAFIT. “I saw them as part of my team, fundamental to what I do every day.”

    For their part of the partnership, Purdue brought a lot of experience accelerating discoveries in science and engineering. Purdue’s central information technology organization has built and operated nine HPC systems for faculty researchers in as many years, most rated among the world’s top 500 supercomputers.

    Hardware from one of those machines, the retired Steele cluster, became the foundation of Apolo.

    Steele cluster

    People powered

    While the hardware is important, the partnership is more about people. Purdue research computing staff have traveled to Colombia to help train and work with EAFIT colleagues. EAFIT students have participated in Purdue’s Summer Undergraduate Research Fellowship (SURF) program, working with many supercomputing experts.

    EAFIT and Purdue have also sent joint teams to student supercomputing competitions in New Orleans and Frankfurt, Germany. Some of the Colombian students on the teams have become key staff members at the Apolo center, which, in turn, trains the next generation of Colombia’s high-performance computing experts.

    Juan Luis Mejía, rector at Universidad EAFIT, says EAFIT had been searching for an international partner to help reverse decades of isolation. What it found in Purdue was unexpected.

    “Finding an alliance with a true interest in sharing knowledge of technology and without a hidden agenda allows us to progress,” Mejía says. “I believe that the relationship between our university and Purdue is one of the most valuable.”

    Quantum leap

    Because of the partnership with Purdue, Apolo has enabled research ranging from earthquake science, to a groundbreaking examination of the tropical disease leishmaniasis, to the most ‘green’ way to process cement, to quantum mechanics – in all cases, Apolo accelerates EAFIT researchers’ time to science.

    And since EAFIT is one of the few Colombian universities with a supercomputer and a strong partnership with a major American research university, it is poised to receive big money from the Colombia Científica program.

    EAFIT has already attracted the attention of Grupo Nutresa, a Latin American food processing company headquartered in Medellín, and researchers like Pilar Cossio, a Colombian HIV researcher working for the Max Planck Institute in Germany.

    When Cossio came home to Colombia after studying and working in Italy, the US, and Germany, the biophysicist figured that one big task she was going to face would be building her own supercomputer and finding someone to run it.

    But thanks to the partnership with Purdue, she conducts her research at the Universidad de Antioquia in Medellín with help from the Apolo Scientific Computing Center at EAFIT.

    Cossio’s research combines physics, computational biology, and chemistry. She’s studying protein changes at the atomic level which can help design drugs to cure HIV. That endeavor requires examining around two million different compounds to see which ones bind the best with particular proteins.

    “There are only two supercomputers in Colombia for bioinformatics,” Cossio says. “Apolo is the only one that focuses on satisfying scientific needs. It’s important for us in the developing countries to have partnerships with universities that can help us access these crucial scientific tools.”

    As it is for many scientists, high-performance and parallel computing power are vital for her research — she just didn’t anticipate finding a ready-made solution in her home country.

    Then she found Apolo.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    STEM Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 1:03 pm on April 29, 2017 Permalink | Reply
    Tags: Science Node, The risks to science-based policy we aren’t talking about

    From Science Node: “The risks to science-based policy we aren’t talking about” 

    Science Node

    19 Apr, 2017 [Where has this been?]
    Gretchen Goldman

    Courtesy Jesse Springer.

    You’d think public policy would benefit the public, but increasingly that’s not the case. Gretchen Goldman from the Union of Concerned Scientists outlines the threats to evidence-based policies.


    The evidence of how the relationship between corporations and the political system is playing out.

    “Thank you, Dr. Goldman. That was frightening,” moderator Keesha Gaskins-Nathan said to me after I spoke last week as the only scientist at the Stetson University Law Review Symposium.

    “My talk covered the ways that the role of science in federal decisionmaking is being degraded by the Trump administration, by Congress, and by corporate and ideological forces.

    Together, these alarming moves are poised to damage the crucial role that science plays in keeping us all safe and healthy — this is why I will march at the March for Science on April 22.

    If current trends proceed unabated, science-based policy as we know it could change forever. Indeed, some of its core tenets are being chipped away. And a lot is at stake if we fail to stop it.

    We are currently witnessing efforts by this administration and Congress to freeze and roll back the federal government’s work to protect public health and safety. Congress is attempting to pollute the science advice that decisionmakers depend on, and is appointing decisionmakers who are openly hostile to the very missions of the science agencies they now lead.

    Threats to science-based America

    We cannot afford to make decisions without science. But now, this very process by which we make science-based policies in this country is under threat.

    Our decisionmakers have deep conflicts of interest, disrespect for science, and aren’t being transparent.

    This is a recipe for disaster.

    How can our leaders use science effectively to inform policy decisions if they can’t even make independent decisions and don’t recognize the value of science?

    EPA chief administrator Scott Pruitt, for example, this month said that carbon dioxide “is not a primary contributor to global warming.” (It is.)

    This blatant misinforming on climate science occurred on top of his extensive record of suing the agency over the science-based ozone rule I just described (among other rules).

    This type of disrespect for science-based policies from cabinet members is an alarming signal of the kind of scientific integrity losses we can expect under this administration.

    Congress is trying to degrade science advice.

    A cornerstone of science-based policy is the role of independent science advice feeding into policy decisions.

    But Congress wants to change who sits on science advisory committees and redefine what counts as science. The Regulatory Accountability Act, for example, would threaten how federal agencies can use science to make policy decisions.

    Past versions of the bill (which has already passed the House this year and is expected to be introduced soon in the Senate) have included concerning provisions. One mandated that government agencies could only use science if all of the underlying data and methods were publicly available — including health data, proprietary data, trade secrets, and intellectual property.

    In another case, the bill added more than 70 new regulatory procedures that would effectively shut down the government’s ability to protect us from new threats to our health, safety, and the environment. It is a dangerous precedent when politicians — not scientists — are deciding how science can inform policy decisions.

    Scientists face intimidation, muzzling, and political attacks.

    No one becomes a scientist because they want a political target on their back. But this is unfortunately what many scientists are now facing.

    While it won’t be enacted in its current form, the president’s budget shows his frightening priorities, which apparently include major cuts to science agencies like the EPA, Department of Energy, and NOAA.

    Communication gag orders, disappearing data, and review of scientific documents by political appointees in the first month of the administration have created a chilling effect for scientists within the government.

    Congress has even revived the Holman Rule, which allows them to reduce the salary of a federal employee down to $1.

    It is easy to see how such powers could be used to target government scientists producing politically controversial science.

    Hurting science hurts real people

    Importantly, we must be clear about who will be affected most if science-based policymaking is dismantled. In many cases, these burdens will disproportionately fall on low-income communities and communities of color.

    If we cannot protect people from ozone pollution, those in urban areas, those without air conditioning, and those with lung diseases will be hurt most.

    If we cannot address climate change, frontline communities in low-lying areas will bear the brunt of it.

    If we cannot keep harmful chemicals out of children’s toys, families who buy cheaper products at dollar stores will pay the price.

    If we cannot protect people from unsafe drugs (FDA), contaminated food (USDA, FDA), occupational hazards (OSHA), chemical disasters (EPA, OSHA, DHS), dangerous vehicles (DOT) and unsafe consumer products (CPSC), then we’re all at risk.

    This is about more than science. It is about protecting people using the power of science. We have everything to lose.

    But we can take action. We can articulate the benefits of science to decisionmakers, the media, and the public.

    We can hold our leaders accountable for moves they make to dismantle the science-based policy process.

    And we can support our fellow scientists both in and outside of the federal government.

    It starts with marching — but it cannot end here.”

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    STEM Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 4:34 pm on April 28, 2017 Permalink | Reply
    Tags: Open Science Grid, Science Node, XSEDE – Extreme Science and Engineering Discovery Environment

    From Science Node: “A record year for the Open Science Grid” 

    Science Node

    Courtesy Open Science Grid.

    27 Apr, 2017
    Greg Moore

    Serving researchers across a wide variety of scientific disciplines, the Open Science Grid (OSG) weaves the national fabric of distributed high throughput computing.

    Over the last 12 months, the OSG has handled over one billion CPU hours. These record numbers have transformed the face of science nationally.

    “We just had a record week recently of over 30 million hours (close to 32.8 million) and the trend is pointing to frequent 30 million-hour weeks — it will become typical,” says Scott Teige, manager of OSG’s Grid Operations Center at Indiana University (IU).

    “To reach 32.8 million, we need 195,000 cores running 24/7 for a week.”
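
    The arithmetic checks out: 195,000 cores × 24 hours per day × 7 days = 32.76 million core-hours in a week.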

    Teige’s job is to keep things running smoothly. The OSG Grid Operations Center provides operational support for users, developers, and system administrators. They are also on point for real-time monitoring and problem tracking, grid service maintenance, security incident response, and information repositories.

    Big and small

    Where is all this data coming from? Teige explains that the largest amount of data is coming from the experiments associated with the Large Hadron Collider (LHC), for which the OSG was originally designed.

    But the LHC is just part of the story. There are plenty of CPU cycles to go around, so opportunistic use has become a much larger focus. When OSG resources are not busy, scientists from many disciplines use those hours to revolutionize their science.

    For example, the Structural Protein-Ligand Interactome (SPLINTER) project by the Indiana University School of Medicine predicts the interaction of thousands of small molecules with thousands of proteins using the three-dimensional structure of the bound complex between each pair of protein and compound.

    By using the OSG, SPLINTER finds a quick and efficient solution to its computing needs — and develops a systems biology approach to target discovery.

    The opportunistic resources deliver millions of CPU hours in a matter of days, greatly reducing simulation time. This allows researchers to identify small molecule candidates for individual proteins, or new protein targets for existing FDA-approved drugs and biologically active compounds.

    “We serve virtual organizations (VOs) that may not have their own resources,” says Teige. “SPLINTER is a prime example of how we partner with the OSG to transform research — our resources alone cannot meet their needs.”

    Hoosier nexus

    Because Teige’s group is based at Indiana University, a lot of the OSG operational infrastructure is run out of the IU Data Center. And, because IU is an Extreme Science and Engineering Discovery Environment (XSEDE) resource, the university also handles submissions to the OSG.

    OSG meets LHC. A view inside the Compact Muon Solenoid (CMS) detector, a particle detector on the LHC. The OSG was designed for the massive datasets generated in the search for particles like the Higgs boson. Courtesy Tighe Flanagan. (CC BY-SA 3.0)

    That means scientists and researchers nationwide can connect both to XSEDE’s collection of integrated digital resources and services and to OSG’s opportunistic resources.

    “We operate information services to determine the states of resources used in how jobs are submitted,” says Teige. “We operate the various user interfaces like the GOC homepage, support tools, and the ticket system. We also operate a global file system called Oasis, where files are deposited to be available for use in a reasonably short time span. And we provide certification services for the user community.”

    From LHC big data to smaller opportunistic research computing needs, Teige’s team makes sure the OSG has the support researchers depend on, so discovery moves forward reliably and transparently.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    STEM Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 10:41 am on March 14, 2017 Permalink | Reply
    Tags: dark web, Science Node, Wrangling crime in the deep, dark web

    From Science Node: “Wrangling crime in the deep, dark web” 

    Science Node

    06 Mar, 2017
    Jorge Salazar

    Much of the internet hides like an iceberg below the surface.

    This so-called ‘deep web’ is estimated to be 500 times bigger than the ‘surface web’ seen through search engines like Google. For scientists and others, the deep web holds important computer code and licensing agreements.

    Nestled further inside the deep web, one finds the ‘dark web,’ a place where images and video are used by traders in illicit drugs, weapons, and human lives.

    “Behind forms and logins, there are bad things,” says Chris Mattmann, chief architect in the instrument and science data systems section of the NASA Jet Propulsion Laboratory (JPL) at the California Institute of Technology.

    “Behind the dynamic portions of the web, people are doing nefarious things, and on the dark web, they’re doing even more nefarious things. They traffic in guns and human organs. They’re doing these activities and then they’re tying them back to terrorism.”

    In 2014, the Defense Advanced Research Projects Agency (DARPA) started a program called Memex to make the deep web accessible. “The goal of Memex was to provide search engines the retrieval capacity to deal with those situations and to help defense and law enforcement go after the bad guys on the deep web,” Mattmann says.

    At the same time, the US National Science Foundation (NSF) invested $11.2 million in a first-of-its-kind data-intensive supercomputer – the Wrangler supercomputer, now housed at the Texas Advanced Computing Center (TACC). The NSF asked engineers and computer scientists at TACC, Indiana University, and the University of Chicago if a computer could be built to handle massive amounts of input and output.


    TACC Wrangler

    Wrangler does just that, enabling the speedy file transfers needed to fly past big data bottlenecks that can slow down even the fastest computers. It was built to work in tandem with number crunchers such as TACC’s Stampede, which in 2013 was the sixth fastest computer in the world.


    Dell PowerEdge Stampede supercomputer, Texas Advanced Computing Center (9.6 PF)

    “Although we have a lot of search-based queries through different search engines like Google, it’s still a challenge to query the system in a way that answers your questions directly,” says Karanjeet Singh.

    Singh is a University of Southern California graduate student who works with Chris Mattmann on Memex and other projects.

    “The objective is to get more and more domain-specific information from the internet and to associate facts from that information.”

    Once Memex users extract the information they need, they can apply tools such as named entity recognition, sentiment analysis, and topic summarization. This can help law enforcement agencies find links between different activities, such as illegal weapon sales and human trafficking.

    The problem is that even the fastest computers like Stampede weren’t designed to handle the input and output of the millions of files needed for the Memex project.

    “Let’s say that we have one system directly in front of us, and there is some crime going on,” Singh says. “What the JPL is trying to do is automate a lot of domain-specific query processes into a system where you can just feed in the questions and receive the answers.”

    For that, he works with an open source web crawler called Apache Nutch. It retrieves and collects web page and domain information of the deep web. The MapReduce framework powers those crawls with a divide-and-conquer approach to big data that breaks it up into small pieces that run simultaneously.
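
    To make the divide-and-conquer idea concrete, here is a tiny generic map/reduce in Python. This is an illustration of the pattern only, not Nutch or Hadoop themselves, which distribute the chunks across many machines rather than local processes.

    ```python
    # Tiny map/reduce illustration: split crawl records into chunks, count
    # terms in each chunk in parallel (map), then merge the counts (reduce).
    from collections import Counter
    from functools import reduce
    from multiprocessing import Pool

    pages = [
        "guns for sale contact via forum",
        "forum post about rare coins",
        "sale sale sale weapons forum",
    ]

    def map_chunk(chunk: list[str]) -> Counter:
        # Map step: count terms within one chunk of pages.
        counts = Counter()
        for page in chunk:
            counts.update(page.split())
        return counts

    if __name__ == "__main__":
        chunks = [pages[i::2] for i in range(2)]          # split the input
        with Pool(2) as pool:
            partials = pool.map(map_chunk, chunks)        # map in parallel
        totals = reduce(lambda a, b: a + b, partials)     # reduce: merge counts
        print(totals.most_common(3))
    ```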

    Wrangler avoids data overload by virtue of its 600 terabytes of speedy flash storage. What’s more, Wrangler supports the Hadoop framework, which runs using MapReduce.

    Together, Wrangler and Memex constitute a powerful crime-fighting duo. NSF investment in advanced computation has placed powerful tools in the hands of public defense agencies, moving law enforcement beyond the limitations of commercial search engines.

    “Wrangler is a fantastic tool that we didn’t have before as a mechanism to do research,” says Mattmann. “It has been an amazing resource that has allowed us to develop techniques that are helping save people, stop crime, and stop terrorism around the world.”

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    STEM Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     