Tagged: isgtw

  • richardmitnick 10:27 am on July 29, 2015
    Tags: isgtw

    From isgtw: “Supercomputers listen for extraterrestrial life” 


    international science grid this week

    July 29, 2015
    Lance Farrell

    Last week, NASA’s New Horizons spacecraft thrilled us with images from its close encounter with Pluto.

    NASA New Horizons spacecraft II
    NASA/New Horizons

    New Horizons now heads into the Kuiper belt and to points spaceward. Will it find life?


    Known objects in the Kuiper belt beyond the orbit of Neptune (scale in AU; epoch as of January 2015).

    That’s the question motivating Aline Vidotto, scientific collaborator at the Observatoire de Genève in Switzerland. Her recent study harnesses supercomputers to find out how to tune our radio dials to listen in on other planets.

    Model of an interplanetary medium. Stellar winds stream from the star and interact with the magnetosphere of the hot-Jupiters. Courtesy Vidotto

    Vidotto has been studying interstellar environments for a while now, focusing on the interplanetary atmosphere surrounding so-called hot-Jupiter exoplanets since 2009. Similar in size to our Jupiter, these exoplanets orbit their star up to 20 times as closely as Earth orbits the sun, and are considered ‘hot’ due to the extra irradiation they receive.

    Every star generates a stellar wind, and the characteristics of this wind depend on the star from which it originates. The speed of its rotation, its magnetism, its gravity, and how active it is are among the factors affecting this wind. These variables also modify the effect this wind will have on planets in its path.

    Since the winds of different star systems are likely to be very different from our own, we need computers to help us boldly go where no one has ever gone before. “Observationally, we know very little about the winds and the interplanetary space of other stars,” Vidotto says. “This is why we need models and numerical simulations.”

    Vidotto’s research focuses on planets four to nine times closer to their host star than Mercury is to the sun. She takes observations of the magnetic fields around five stars from astronomers at the Canada-France-Hawaii Telescope (CFHT) in Hawaii and the Bernard-Lyot Telescope in France and feeds them into 3D simulations. For her most recent study, she divided the computational load between the Darwin cluster (part of the DiRAC network) at the University of Cambridge (UK) and the Piz Daint at the Swiss National Supercomputing Center.

    Canada-France-Hawaii Telescope
    CFHT interior
    CFHT

    Bernard Lyot telescope
    Bernard Lyot telescope interior
    Bernard Lyot

    The Darwin cluster consists of 9,728 cores, with a theoretical peak in excess of 202 teraFLOPS. Piz Daint consists of 5,272 compute nodes with 32 GB of RAM per node, and is capable of 7.8 petaFLOPS — that’s more computation in a day than a typical laptop could manage in a millennium.
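
    As a quick back-of-the-envelope check on that comparison (the ~20 gigaFLOPS figure for a typical laptop below is an assumption, not from the article):

    ```python
    # Rough check of the "one day vs. one millennium" comparison.
    PIZ_DAINT_FLOPS = 7.8e15      # 7.8 petaFLOPS peak, as quoted above
    LAPTOP_FLOPS = 20e9           # ~20 gigaFLOPS, an assumed figure for a typical laptop

    seconds_per_day = 24 * 3600
    seconds_per_millennium = 1000 * 365.25 * 24 * 3600

    ops_per_day = PIZ_DAINT_FLOPS * seconds_per_day              # ~6.7e20 operations
    ops_per_millennium = LAPTOP_FLOPS * seconds_per_millennium   # ~6.3e20 operations

    print(f"Piz Daint, one day:     {ops_per_day:.2e} FLOP")
    print(f"Laptop, one millennium: {ops_per_millennium:.2e} FLOP")
    ```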

    Vidotto’s analysis of the DiRAC simulations reveals a much different interplanetary medium than in our home solar system, with an overall interplanetary magnetic field 100 times larger than ours, and stellar wind pressures at the point of orbit in excess of 10,000 times ours.

    This immense pressure means these planets must have a very strong magnetic shield (magnetosphere) or their atmospheres would be blown away by the stellar wind, as we suspect happened on Mars. A planet’s atmosphere is thought to be intimately related to its habitability.

    A planet’s magnetism can also tell us something about the interior properties of the planet such as its thermal state, composition, and dynamics. But since the actual magnetic fields of these exoplanets have not been observed, Vidotto is pursuing a simple hypothesis: What if they were similar to our own Jupiter?

    A model of an exoplanet magnetosphere interacting with an interstellar wind. Knowing the characteristics of the interplanetary medium and the flux of the exoplanet radio emissions in this medium can help us tune our best telescopes to listen for distant signs of life. Courtesy Vidotto.

    If this were the case, then the magnetosphere around these planets would extend five times the radius of the planet (Earth’s magnetosphere extends 10-15 Earth radii). Where the magnetosphere mingles with the onrushing stellar wind, it creates the effect familiar to us as an aurora display. Indeed, Vidotto’s research reveals the auroral power in these exoplanets is more impressive than Jupiter’s. “If we were ever to live on one of these planets, the aurorae would be a fantastic show to watch!” she says.
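
    For readers curious how such magnetosphere sizes are estimated, here is a minimal sketch of the textbook pressure-balance (Chapman-Ferraro) estimate; the field strength and wind values are illustrative placeholders, not the inputs used in Vidotto’s study:

    ```python
    import math

    MU0 = 4 * math.pi * 1e-7   # vacuum permeability, SI units

    def standoff_radius(B_eq, rho_wind, v_wind):
        """Magnetopause distance, in planetary radii, from pressure balance.

        Balances the planet's dipole magnetic pressure, B(r)**2 / (2*mu0) with
        B(r) = B_eq * (R_p / r)**3, against the stellar wind ram pressure rho * v**2.
        """
        return (B_eq**2 / (2 * MU0 * rho_wind * v_wind**2)) ** (1.0 / 6.0)

    # Illustrative numbers only: a roughly Jupiter-strength field meeting a dense,
    # fast wind at a close-in orbit. These are NOT the study's actual parameters.
    B_eq = 4e-4     # equatorial surface field, tesla (~4 gauss)
    rho = 5e-18     # wind mass density at the planet's orbit, kg/m^3
    v = 5e5         # wind speed, m/s

    print(f"Magnetopause standoff = {standoff_radius(B_eq, rho, v):.1f} planetary radii")
    ```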

    Knowing this auroral power enables astronomers to realistically characterize the interplanetary medium around the exoplanets, as well as the auroral ovals through which cosmic and stellar particles can penetrate the exoplanet atmosphere. This helps astronomers correctly estimate the flux of exoplanet radio emissions and how sensitive equipment on Earth would have to be to detect them. In short, knowing how to listen is a big step toward hearing.

    Radio emissions from these hot-Jupiters would present a challenge to our current class of radio telescopes, such as the Low Frequency Array for radio astronomy (LOFAR). However, “there is one radio array that is currently being designed where these radio fluxes could be detected — the Square Kilometre Array (SKA),” Vidotto says. The SKA is set for completion in 2023, and in the DiRAC clusters Vidotto finds some of the few supercomputers in the world capable of testing correlation software solutions.

    LOFAR radio telescope

    While there’s much more work ahead of us, Vidotto’s research presents a significant advance in radio astronomy and is helping refine our ability to detect signals from beyond. With her 3D exoplanet simulations, the DiRAC computation power, and the ears of SKA, it may not be long before we’re able to hear radio signals from distant worlds.

    Stay tuned!

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 1:47 pm on July 21, 2015
    Tags: isgtw

    From isgtw: “Simulations reveal a less crowded universe” 


    international science grid this week

    July 15, 2015
    Jan Zverina

    Blue Waters supercomputer

    Simulations conducted on the Blue Waters supercomputer at the National Center for Supercomputing Applications (NCSA) suggest there may be far fewer galaxies in the universe than expected.

    The study, published this week in Astrophysical Journal Letters, shows the first results from the Renaissance Simulations, a suite of extremely high-resolution adaptive mesh refinement calculations of high redshift galaxy formation. Taking advantage of data transferred to SDSC Cloud at the San Diego Supercomputer Center (SDSC), these simulations show hundreds of well-resolved galaxies.

    “Most critically, we show that the ultraviolet luminosity function of our simulated galaxies is consistent with observations of high-redshift galaxy populations at the bright end of the luminosity function, but at lower luminosities is essentially flat rather than rising steeply,” says principal investigator and lead author Brian W. O’Shea, an associate professor at Michigan State University.

    This discovery allows researchers to make several novel and verifiable predictions ahead of the October 2018 launch of the James Webb Space Telescope, a new space observatory succeeding the Hubble Space Telescope.

    NASA Webb Telescope
    NASA/Webb

    NASA Hubble Telescope
    NASA/ESA Hubble

    “The Hubble Space Telescope can only see what we might call the tip of the iceberg when it comes to taking inventory of the most distant galaxies,” said SDSC director Michael Norman. “A key question is how many galaxies are too faint to see. By analyzing these new, ultra-detailed simulations, we find that there are 10 to 100 times fewer galaxies than a simple extrapolation would predict.”
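
    A toy calculation shows why the faint-end slope matters so much for galaxy counts. The Schechter-style form, slopes, and magnitude limits below are invented for illustration; they are not the fits from the Renaissance Simulations paper:

    ```python
    import numpy as np

    # Toy faint end of a galaxy ultraviolet luminosity function (counts per magnitude bin).
    M = np.arange(-22.0, -14.0, 0.1)      # absolute magnitude, bright to faint
    M_star = -20.0                        # characteristic magnitude (assumed)

    def schechter_counts(M, alpha):
        """Schechter function expressed in magnitudes (shape only, arbitrary normalization)."""
        x = 10 ** (0.4 * (M_star - M))    # L / L*
        return 0.4 * np.log(10) * x ** (alpha + 1) * np.exp(-x)

    steep = schechter_counts(M, alpha=-2.0)   # 'simple extrapolation': steeply rising faint end
    flat = schechter_counts(M, alpha=-1.0)    # essentially flat faint end, as in the simulations

    faint = M > -17.0                         # galaxies below a Hubble-like detection limit (assumed)
    ratio = steep[faint].sum() / flat[faint].sum()
    print(f"The steep extrapolation predicts roughly {ratio:.0f}x more faint galaxies")
    ```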

    The simulations ran on the National Science Foundation (NSF) funded Blue Waters supercomputer, one of the largest and most powerful academic supercomputers in the world. “These simulations are physically complex and very large — we simulate thousands of galaxies at a time, including their interactions through gravity and radiation, and that poses a tremendous computational challenge,” says O’Shea.

    Blue Waters, based at the University of Illinois, is used to tackle a wide range of challenging problems, from predicting the behavior of complex biological systems to simulating the evolution of the cosmos. The supercomputer has more than 1.5 petabytes of memory — enough to store 300 million images from a digital camera — and can achieve a peak performance level of more than 13 quadrillion calculations per second.

    “The flattening at lower luminosities is a key finding and significant to researchers’ understanding of the reionization of the universe, when the gas in the universe changed from being mostly neutral to mostly ionized,” says John H. Wise, Dunn Family assistant professor of physics at the Georgia Institute of Technology.

    Matter overdensity (top row) and ionized fraction (bottom row) for the regions simulated in the Renaissance Simulations. The red triangles represent locations of galaxies detectable with the Hubble Space Telescope. The James Webb Space Telescope will detect many more distant galaxies, shown by the blue squares and green circles. These first galaxies reionized the universe, shown in the image as blue bubbles around the galaxies. Courtesy Brian W. O’Shea (Michigan State University), John H. Wise (Georgia Tech); Michael Norman and Hao Xu (UC San Diego).

    The term ‘reionized’ is used because the universe was ionized immediately after the fiery big bang. During that time, ordinary matter consisted mostly of hydrogen atoms with positively charged protons stripped of their negatively charged electrons. Eventually, the universe cooled enough for electrons and protons to combine and form neutral hydrogen. These neutral atoms didn’t give off any optical or UV light — and without light, conventional telescopes are of no use in finding traces of how the cosmos evolved during these Dark Ages. The light returned when reionization began.

    Earlier simulations, described in a previous paper, concluded that the universe was 20 percent ionized about 300 million years after the big bang, 50 percent ionized at 550 million years, and fully ionized at 860 million years after its creation.

    “Our work suggests that there are far fewer faint galaxies than one could previously infer,” says O’Shea. “Observations of high redshift galaxies provide poor constraints on the low-luminosity end of the galaxy luminosity function, and thus make it challenging to accurately account for the full budget of ionizing photons during that epoch.”

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 3:39 pm on June 24, 2015
    Tags: isgtw

    From isgtw: “Analyzing a galaxy far, far away for clues to our origins” 


    international science grid this week

    June 24, 2015
    Makeda Easter

    The Earth’s location in the universe. Courtesy Andrew Z. Colvin. CC BY-SA 3.0 or GFDL, via Wikimedia Commons.

    The Andromeda Galaxy (M31) lies more than two million light years away from Earth.

    Andromeda. Image courtesy Adam Evans.

    In 2011, an international group of astronomers began a four-year program to map and study the millions of stars comprising the galaxy. With the help of the Hubble telescope, Extreme Science and Engineering Discovery Environment (XSEDE), and the Texas Advanced Computing Center (TACC), they not only produced the best Andromeda pictures ever seen, but also put the question of universal star formation to rest.

    To map M31, the Panchromatic Hubble Andromeda Treasury (PHAT) looked to its namesake Hubble Space Telescope (HST). Because the HST orbits the Earth, it can provide information to astronomers that ground-based telescopes cannot. But more than just stunning pictures, each star revealed by the HST holds clues to the history of the galaxy’s formation — and thus our own. For instance, by analyzing a star’s color, researchers can infer its age. From its luminosity, scientists can measure its distance from Earth.
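
    The distance part of that inference rests on the standard distance-modulus relation. A minimal sketch (the magnitudes below are illustrative, not PHAT measurements):

    ```python
    def distance_parsecs(apparent_mag, absolute_mag):
        """Distance from the distance modulus: m - M = 5 * log10(d / 10 pc)."""
        return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

    # Illustrative values: a star of roughly solar intrinsic brightness (M ~ 4.83)
    # seen at an Andromeda-like apparent magnitude of ~29.
    d_pc = distance_parsecs(apparent_mag=29.0, absolute_mag=4.83)
    print(f"Distance = {d_pc:.2e} pc = {d_pc * 3.26:.2e} light years")   # ~2 million light years
    ```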

    PHAT used this information to develop star formation histories for M31, which meant decoding the number of stars of each type (age, mass, chemistry) and how much dust is obscuring their light. Modeling the star formation history of 100 million stars requires powerful computation, so the team turned to the US National Science Foundation (NSF), XSEDE, and TACC.

    “We had to measure over 100 million objects with 100 different parameters for every single one of them,” says Julianne Dalcanton, principal investigator on the PHAT project. “Having XSEDE resources has been absolutely fantastic because we were able to easily run the same process over and over again in parallel.”

    XSEDE enables researchers to interactively share computing resources, data, and expertise. Through XSEDE, the team gained access to the Stampede supercomputer at TACC, which was essential to determining the ages of every star mapped, patterns of star formation, and how the galaxy evolved over time.
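
    The pattern Dalcanton describes is embarrassingly parallel: the same fit is run independently on every object. A minimal sketch of that structure (the catalog records and the fit_star function are placeholders, not the PHAT pipeline):

    ```python
    from multiprocessing import Pool

    def fit_star(star):
        """Stand-in for the per-object model fit (age, mass, dust, ...)."""
        # The real pipeline fits ~100 parameters per object; here we just return
        # a trivial derived quantity to show the independent, per-object structure.
        return {"id": star["id"], "score": sum(star["photometry"]) / len(star["photometry"])}

    if __name__ == "__main__":
        # Hypothetical catalog chunk; the full survey has over 100 million objects.
        catalog = [{"id": i, "photometry": [0.1 * i, 0.2 * i, 0.3 * i]} for i in range(10_000)]
        with Pool(processes=8) as pool:
            results = pool.map(fit_star, catalog, chunksize=500)
        print(f"Fit {len(results)} objects independently")
    ```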

    Read more about the PHAT team’s quest to understand infinity here.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 7:33 pm on April 15, 2015
    Tags: isgtw

    From isgtw: “Supercomputing enables researchers in Norway to tackle cancer” 


    international science grid this week

    April 15, 2015
    Yngve Vogt

    Cancer researchers are using the Abel supercomputer at the University of Oslo in Norway to detect which versions of genes are only found in cancer cells. Every form of cancer, even every tumour, has its own distinct variants.

    “This charting may help tailor the treatment to each patient,” says Rolf Skotheim, who is affiliated with the Centre for Cancer Biomedicine and the research group for biomedical informatics at the University of Oslo, as well as the Department of Molecular Oncology at Oslo University Hospital.

    “Charting the versions of the genes that are only found in cancer cells may help tailor the treatment offered to each patient,” says Skotheim. Image courtesy Yngve Vogt.

    His research group is working to identify the genes that cause bowel and prostate cancer, which are both common diseases. There are 4,000 new cases of bowel cancer in Norway every year. Only six out of ten patients survive the first five years. Prostate cancer affects 5,000 Norwegians every year. Nine out of ten survive.

    Comparisons between healthy and diseased cells

    In order to identify the genes that lead to cancer, Skotheim and his research group are comparing genetic material in tumours with genetic material in healthy cells. In order to understand this process, a brief introduction to our genetic material is needed:

    Our genetic material consists of just over 20,000 genes. Each gene consists of thousands of base pairs, represented by a specific sequence of the four building blocks, adenine, thymine, guanine, and cytosine, popularly abbreviated to A, T, G, and C. The sequence of these building blocks is the very recipe for the gene. Our whole DNA consists of some six billion base pairs.

    The DNA strand carries the molecular instructions for activity in the cells. In other words, DNA contains the recipe for proteins, which perform the tasks in the cells. DNA, nevertheless, does not actually produce proteins. First, a copy of DNA is made: this transcript is called RNA and it is this molecule that is read when proteins are produced.

    RNA corresponds to only a small part of the DNA: its active constituents. Most of the DNA is inactive; only 1–2% of the DNA strand is active.

    In cancer cells, something goes wrong with the RNA transcription. There is either too much RNA, which means that far too many proteins of a specific type are formed, or the composition of base pairs in the RNA is wrong. The latter is precisely the area being studied by the University of Oslo researchers.

    Wrong combinations

    All genes can be divided into active and inactive parts. A single gene may consist of tens of active stretches of nucleotides (exons). “RNA is a copy of a specific combination of the exons from a specific gene in DNA,” explains Skotheim. There are many possible combinations, and it is precisely this search for all of the possible combinations that is new in cancer research.

    Different cells can combine the nucleotides in a single gene in different ways. A cancer cell can create a combination that should not exist in healthy cells. And as if that didn’t make things complicated enough, sometimes RNA can be made up of stretches of nucleotides from different genes in DNA. These special, complex genes are called fusion genes.
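
    To see why this search is so large, consider a toy gene with ten exons: every order-preserving subset of exons is a candidate transcript, and fusion transcripts add joins across genes. The gene and exon names below are invented, with TMPRSS2-ERG mentioned only as a well-known real-world example of a prostate cancer fusion:

    ```python
    from itertools import combinations

    # Hypothetical gene with 10 exons: every non-empty, order-preserving subset
    # is a candidate RNA isoform, which is why the combinatorial search explodes.
    exons = [f"exon{i}" for i in range(1, 11)]
    isoforms = [combo for r in range(1, len(exons) + 1)
                for combo in combinations(exons, r)]
    print(f"{len(isoforms)} candidate exon combinations from a single 10-exon gene")  # 1023

    # A fusion transcript joins exons from two different genes (toy names here;
    # TMPRSS2-ERG is a well-known real example in prostate cancer):
    fusion = ("geneA_exon1", "geneA_exon2", "geneB_exon4")
    print("Example fusion transcript:", "-".join(fusion))
    ```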

    “We need powerful computers to crunch the enormous amounts of raw data,” says Skotheim. “Even if you spent your whole life on this task, you would not be able to find the location of a single nucleotide.”

    In other words, researchers must look for errors both inside genes and between the different genes. “Fusion genes are usually found in cancer cells, but some of them are also found in healthy cells,” says Skotheim. In patients with prostate cancer, researchers have found some fusion genes that are only created in diseased cells. These fusion genes may then be used as a starting-point in the detection of and fight against cancer.

    The researchers have also found fusion genes in bowel cells, but they were not cancer-specific. “For some reason, these fusion genes can also be found in healthy cells,” adds Skotheim. “This discovery was a let-down.”

    Improving treatment

    There are different RNA errors in the various cancer diseases. The researchers must therefore analyze the RNA errors of each disease.

    Among other things, the researchers are comparing RNA in diseased and healthy tissue from 550 patients with prostate cancer. The patients that make up the study do not receive any direct benefits from the results themselves. However, the research is important in order to be able to help future patients.

    “We want to find the typical defects associated with prostate cancer,” says Skotheim. “This will make it easier to understand what goes wrong with healthy cells, and to understand the mechanisms that develop cancer. Once we have found the cancer-specific molecules, they can be used as biomarkers.” In some cases, the biomarkers can be used to find cancer, determine the level of severity of the cancer and the risk of spreading, and whether the patient should be given a more aggressive treatment.

    Even though the researchers find deviations in the RNA, there is no guarantee that there is appropriate, targeted medicine available. “The point of our research is to figure out more of the big picture,” says Skotheim. “If we identify a fusion gene that is only found in cancer cells, the discovery will be so important in itself that other research groups around the world will want to begin working on this straight away. If a cure is found that counteracts the fusion genes, this may have enormous consequences for the cancer treatment.”

    Laborious work

    Recreating RNA is laborious work. The set of RNA molecules consists of about 100 million bases, divided into a few thousand bases from each gene.

    The laboratory machine reads millions of short nucleotide fragments, each only about 100 base pairs long. In order for the researchers to be able to place them in the right location, they must run large statistical analyses. The RNA analysis of a single patient can take a few days.

    All of the nucleotides must be matched with the DNA strand. Unfortunately the researchers do not have the DNA strands of each patient. In order to learn where the base pairs come from in the DNA strand, they must therefore use the reference genome of the human species. “This is not ideal, because there are individual differences,” explains Skotheim. The future potentially lies in fully sequencing the DNA of each patient when conducting medical experiments.
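
    The mapping step described here is, at its core, placing short reads on a reference sequence. A deliberately naive sketch (real pipelines use indexed, mismatch-tolerant aligners, and the sequences here are made up):

    ```python
    def align_reads(reference, reads):
        """Naive exact-match placement of short reads on a reference sequence."""
        return {read: reference.find(read) for read in reads}   # -1 means 'not placed'

    reference = "ATGGCGTACGTTAGCCGTATCGGATCCTTAGC"      # toy reference, not a real genome
    reads = ["TACGTTAGC", "GGATCCTTA", "AAAAAAAAA"]     # toy 9-mers standing in for 100 bp reads

    for read, pos in align_reads(reference, reads).items():
        status = "unmapped" if pos < 0 else f"mapped at position {pos}"
        print(read, "->", status)
    ```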

    Supercomputing

    There is no way this research could be carried out using pen and paper. “We need powerful computers to crunch the enormous amounts of raw data. Even if you spent your whole life on this task, you would not be able to find the location of a single nucleotide. This is a matter of millions of nucleotides that must be mapped correctly in the system of coordinates of the genetic material. Once we have managed to find the RNA versions that are only found in cancer cells, we will have made significant progress. However, the work to get that far requires advanced statistical analyses and supercomputing,” says Skotheim.

    The analyses are so demanding that the researchers must use the University of Oslo’s Abel supercomputer, which has a theoretical peak performance of over 250 teraFLOPS. “With the ability to run heavy analyses on such large amounts of data, we have an enormous advantage not available to other cancer researchers,” explains Skotheim. “Many medical researchers would definitely benefit from this possibility. This is why they should spend more time with biostatisticians and informaticians. RNA samples are taken from the patients only once. The types of analyses that can be run are only limited by the imagination.”

    “We need to be smart in order to analyze the raw data.” He continues: “There are enormous amounts of data here that can be interpreted in many different ways. We just got started. There is lots of useful information that we have not seen yet. Asking the right questions is the key. Most cancer researchers are not used to working with enormous amounts of data, and how to best analyze vast data sets. Once researchers have found a possible answer, they must determine whether the answer is chance or if it is a real finding. The solution is to find out whether they get the same answers from independent data sets from other parts of the world.”

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 4:07 am on April 2, 2015
    Tags: isgtw

    From isgtw: “Supporting research with grid computing and more” 


    international science grid this week

    April 1, 2015
    Andrew Purcell

    “In order for researchers to be able to collaborate and share data with one another efficiently, the underlying IT infrastructures need to be in place,” says Gomes. “With the amount of data produced by research collaborations growing rapidly, this support is of paramount importance.”

    Jorge Gomes is the principal investigator of the computing group at the Portuguese Laboratory of Instrumentation and Experimental Particle Physics (LIP) in Lisbon and a member of the European Grid Infrastructure (EGI) executive board. As the technical coordinator of the Portuguese national grid infrastructure (INCD), he is also responsible for Portugal’s contribution to the Worldwide LHC Computing Grid (WLCG).

    iSGTW speaks to Gomes about the importance of supporting researchers through a variety of IT infrastructures ahead of the EGI Conference in Lisbon from 18 to 22 May 2015.

    What’s the main focus of your work at LIP?

    I’ve been doing research in the field of grid computing since 2001. LIP participates in both the ATLAS and CMS experiments on the Large Hadron Collider (LHC) at CERN, which is why we’ve been working on research and development projects for the grid computing infrastructure that supports these experiments.

    CERN ATLAS New
    ATLAS

    CERN CMS New II
    CMS

    CERN LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC

    CERN Control Center
    CERN

    Here in Portugal, we now have a national ‘road map’ for research infrastructures, which includes IT infrastructures. Our work in the context of the Portuguese national grid infrastructure now involves supporting a wide range of research communities, not just high-energy physics. Today, we support research in fields such as astrophysics, life sciences, chemistry, civil engineering, and environmental modeling, among others. For us, it’s very important to support as wide a range of communities as possible.

    So, when you talk about supporting researchers by providing ‘IT infrastructures’, it’s about much more than grid computing, right?

    Yes, today we’re engaged in cloud computing, high-performance computing, and a wide range of data-related services. This larger portfolio of services has evolved to match the needs of the Portuguese research community.

    Cloud computing metaphor: For a user, the network elements representing the provider-rendered services are invisible, as if obscured by a cloud.

    Why is it important to provide IT infrastructures to support research?

    Research is no longer done by isolated individuals; instead, it is increasingly common for it to be carried out by large collaborations, often on an international or even an intercontinental basis. So, in order for researchers to be able to collaborate and share data with one another efficiently, the underlying IT infrastructures need to be in place. With the amount of data produced by research collaborations growing rapidly, this support is of paramount importance.

    Here in Portugal, we have a lot of communities that don’t yet have access to these services, but they really do need them. Researchers don’t want to have to set up their own IT infrastructures, they want to concentrate on doing research in their own specialist field. This is why it’s important for IT specialists to provide them with these underlying services.

    Also, particularly in relatively small countries like Portugal, it’s important that resources scattered across universities and other research institutions can be integrated, in order to extract the maximum possible value.

    When it comes to encouraging researchers to make use of the IT infrastructures you provide, what are the main challenges you face?

    Trust, in particular, is a very important aspect. For researchers to build scientific software on top of IT infrastructures, they need to have confidence that the infrastructures will still be there several years down the line. This is also connected to challenges like ‘vendor lock in’ and standards in relation to cloud computing infrastructure. We need to have common solutions so that if a particular IT infrastructure provider — either public or private — fails, users can move to other available resources.

    Another challenge is related to the structure of some research communities. The large, complex experimental apparatuses involved in high-energy physics mean that these research communities are very structured and there is often a high degree of collaboration between research groups. In other domains, however, where it is common to have much smaller research groups, this is often not the case, which means it can be much more difficult to develop standard IT solutions and to achieve agreement on a framework for sharing IT resources.

    Why do you believe it is important to provide grid computing infrastructure at a European scale, through EGI, rather than just at a national scale?

    More and more research groups are working internationally, so it’s no longer enough to provide IT infrastructures at a national level. That’s why we also collaborate with our colleagues in Spain to provide IberGrid.

    EGI is of great strategic importance to research in Europe. We’re now exploring a range of exciting opportunities through the European Strategy Forum on Research Infrastructures (ESFRI) to support large flagship European research projects.

    The theme for the upcoming EGI conference is ‘engaging the research community towards an open science commons’. What’s the role of EGI in helping to establish this commons?

    In Europe we still have a fragmented ecosystem of services provided by many entities with interoperability issues. A better level of integration and sharing is needed to take advantage of the growing amounts of scientific data available. EGI proposes an integrated vision that encompasses data, instruments, ICT services, and knowledge to reduce the barriers to scientific collaboration and result sharing.

    EGI is in a strategic position to integrate services at the European level and to enable access to open data, thus promoting knowledge sharing. By gathering key players, next month’s conference will be an excellent opportunity to further develop this vision.

    Finally, what are you most looking forward to about the conference?

    The conference is a great opportunity for users, developers, and resource providers to meet and exchange experiences and ideas at all levels. It’s also an excellent opportunity for researchers to discuss their requirements and to shape the development of future IT infrastructures. I look forward to seeing a diverse range of people at the event!

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 11:43 am on January 24, 2015
    Tags: isgtw

    From isgtw: “Unlocking the secrets of vertebrate evolution” 


    international science grid this week

    January 21, 2015
    Lance Farrell

    Conventional wisdom holds that snakes evolved a particular form and skeleton by losing regions in their spinal column over time. These losses were previously explained by a disruption in Hox genes responsible for patterning regions of the vertebrae.

    Paleobiologists P. David Polly, professor of geological sciences at Indiana University, US, and Jason Head, assistant professor of earth and atmospheric sciences at the University of Nebraska-Lincoln, US, overturned that assumption. Recently published in Nature, their research instead reveals that snake skeletons are just as regionalized as those of limbed vertebrates.

    Using Quarry, a supercomputer at Indiana University that was taken out of service on January 30, 2015 and replaced by Karst, Polly and Head arrived at a compelling new explanation for why snake skeletons are so different: Vertebrates like mammals, birds, and crocodiles evolved additional skeletal regions independently from ancestors like snakes and lizards.

    Karst
    Karst

    Despite having no limbs and more vertebrae, snake skeletons are just as regionalized as lizards’ skeletons.

    “Our study finds that snakes did not require extensive modification to their regulatory gene systems to evolve their elongate bodies,” Head notes.

    P. David Polly. Photo courtesy Indiana University.

    Polly and Head had to overcome challenges in collection and analysis to arrive at this insight. “If you are sequencing a genome all you really need is a little scrap of tissue, and that’s relatively easy to get,” Polly says. “But if you want to do something like we have done, you not only need an entire skeleton, but also one for a whole lot of species.”

    To arrive at their conclusion, Head and Polly sampled 56 skeletons from collections worldwide. They began by photographing and digitizing the bones, then chose specific landmarks on each spinal segment. Using the digital coordinates of each vertebra, they then applied a technique called geometric morphometrics, a multivariate analysis that uses x and y coordinates to analyze an object’s shape.

    Armed with shape information, the scientists then fit a series of regressions and tracked each vertebra’s gradient over the entire spine. This led to a secondary challenge — with 36,000 landmarks applied to 3,000 digitized vertebrae, the regression analyses required to peer into the snake’s past called for a new analytical tool.

    “The computations required iteratively fitting four or more segmented regression models, each with 10 to 83 parameters, for every regional permutation of up to 230 vertebrae per skeleton. The amount of computational power required is well beyond any desktop system,” Head observes.

    Researchers like Polly and Head increasingly find quantitative analyses of data sets this size require the computational resources to match. With 7.2 million different models making up the data for their study, nothing less than a supercomputer would do.
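
    In outline, each of those models is a segmented regression: the spine is split into candidate regions, a separate curve is fitted within each, and the best-scoring split is kept. A minimal single-breakpoint sketch on synthetic data (the shape scores and region boundary below are invented, not the study’s measurements):

    ```python
    import numpy as np

    def segmented_fit_score(x, y, breakpoints):
        """Total squared residual of independent linear fits within each segment."""
        edges = [x.min()] + sorted(breakpoints) + [x.max() + 1]
        rss = 0.0
        for lo, hi in zip(edges[:-1], edges[1:]):
            mask = (x >= lo) & (x < hi)
            if mask.sum() < 2:
                return np.inf                     # too few vertebrae to fit a line
            coeffs = np.polyfit(x[mask], y[mask], deg=1)
            rss += np.sum((y[mask] - np.polyval(coeffs, x[mask])) ** 2)
        return rss

    # Synthetic 'vertebral column': a shape score along the spine with two hidden regions.
    rng = np.random.default_rng(0)
    x = np.arange(1, 101)                         # vertebra number along the spine
    y = np.where(x < 40, 0.05 * x, 2.0 - 0.01 * x) + rng.normal(0, 0.05, x.size)

    # Exhaustively score every single-breakpoint model and keep the best; the real
    # study iterates over multi-breakpoint models for every one of the 56 skeletons.
    best = min(range(5, 96), key=lambda b: segmented_fit_score(x, y, [b]))
    print(f"Best single regional boundary found near vertebra {best}")
    ```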

    Jason Head with ball python. Photo courtesy Craig Chandler, University of Nebraska-Lincoln.

    “Our supercomputing environments serve a broad base of users and purposes,” says David Hancock, manager of IU’s high performance systems. “We often support the research done in the hard sciences and math such as Polly’s, but we also see analytics done for business faculty, marketing and modeling for interior design projects, and lighting simulations for theater productions.”

    Analyses of the scale Polly and Head needed would have been unapproachable even a decade ago, and without US National Science Foundation support remain beyond the reach of most institutions. “A lot of the big jobs ran on Quarry,” says Polly. “To run one of these exhaustive models on a single snake took about three and a half days. Ten years ago we could barely have scratched the surface.”

    As high-performance computing resources reshape the future, scientists like Polly and Head have greater abilities to look into the past and unlock the secrets of evolution.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 5:00 pm on January 21, 2015
    Tags: isgtw, Simulation Astronomy

    From isgtw: “Exploring the universe with supercomputing” 


    international science grid this week

    January 21, 2015
    Andrew Purcell

    The Center for Computational Astrophysics (CfCA) in Japan recently upgraded its ATERUI supercomputer, doubling the machine’s theoretical peak performance to 1.058 petaFLOPS. Eiichiro Kokubo, director of the center, tells iSGTW how supercomputers are changing the way research is conducted in astronomy.

    What’s your research background?

    I investigate the origin of planetary systems. I use many-body simulations to study how planets form and I also previously worked on the development of the Gravity Pipe, or ‘GRAPE’ supercomputer.

    Why is it important to use supercomputers in this work?

    In the standard scenario of planet formation, small solid bodies — known as ‘planetesimals’ — interact with one another and this causes their orbits around the sun to evolve. Collisions between these building blocks lead to the formation of rocky planets like the Earth. To understand this process, you really need to do very-large-scale many-body simulations. This is where the high-performance computing comes in: supercomputers act as telescopes for phenomena we wouldn’t otherwise be able to see.

    The scales of mass, energy, and time are generally huge in astronomy. However, as supercomputers have become ever more powerful, we’ve become able to program the relevant physical processes — motion, fluid dynamics, radiative transfer, etc. — and do meaningful simulation of astronomical phenomena. We can even conduct experiments by changing parameters within our simulations. Simulation is numerical exploration of the universe!
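
    A minimal, heavily scaled-down sketch of what such a many-body calculation looks like; the particle count, units, and time step are arbitrary, and real planet-formation runs use far more sophisticated integrators and collision handling:

    ```python
    import numpy as np

    G = 1.0                                        # gravitational constant in code units

    def accelerations(pos, mass, softening=1e-3):
        """Direct-summation gravitational accelerations (O(N^2) pairwise sums)."""
        diff = pos[None, :, :] - pos[:, None, :]                  # r_j - r_i
        dist3 = (np.sum(diff**2, axis=-1) + softening**2) ** 1.5
        np.fill_diagonal(dist3, np.inf)                           # no self-force
        return G * np.sum(mass[None, :, None] * diff / dist3[:, :, None], axis=1)

    # Tiny toy swarm of equal-mass planetesimals; real runs evolve millions of bodies.
    rng = np.random.default_rng(1)
    n = 200
    pos = rng.uniform(-1.0, 1.0, (n, 3))
    vel = np.zeros((n, 3))
    mass = np.full(n, 1.0 / n)

    dt = 1e-3
    for _ in range(100):                           # leapfrog (kick-drift-kick) steps
        vel += 0.5 * dt * accelerations(pos, mass)
        pos += dt * vel
        vel += 0.5 * dt * accelerations(pos, mass)

    print(f"Integrated {n} planetesimals for 100 leapfrog steps")
    ```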

    How has supercomputing changed the way research is carried out?

    ‘Simulation astronomy’ has now become a third major methodological approach within the field, alongside observational and theoretical astronomy. Telescopes rely on electromagnetic radiation, but there are still many things that we cannot see even with today’s largest telescopes. Supercomputers enable us to use complex physical calculations to visualize phenomena that would otherwise remain hidden to us. Their use also gives us the flexibility to simulate phenomena across a vast range of spatial and temporal scales.

    Simulation can be used to simply test hypotheses, but it can also be used to explore new worlds that are beyond our current imagination. Sometimes you get results from a simulation that you really didn’t expect — this is often the first step on the road to making new discoveries and developing new astronomical theories.

    ATERUI has made the leap to become a petaFLOPS-scale supercomputer. Image courtesy NAOJ/Makoto Shizugami (VERA/CfCA, NAOJ).

    In astronomy, there are three main kinds of large-scale simulation: many-body, fluid dynamics, and radiative transfer. These problems can all be parallelized effectively, meaning that massively parallel computers — like the Cray XC30 system we’ve installed — are ideally suited to performing these kinds of simulations.

    “Supercomputers act as telescopes for phenomena we wouldn’t otherwise be able to see,” says Kokubo.

    What research problems will ATERUI enable you to tackle?

    There are over 100 users in our community and they are tackling a wide variety of problems. One project, for example, is looking at supernovae: having very high-resolution 3D simulations of these explosions is vital to improving our understanding. Another project is looking at the distribution of galaxies throughout the universe, and there is a whole range of other things being studied using ATERUI too.

    Since installing ATERUI, it’s been used at over 90% of its capacity, in terms of the number of CPUs running at any given time. Basically, it’s almost full every single day!

    Don’t forget, we also have the K computer here in Japan. The National Astronomical Observatory of Japan, of which the CfCA is part, is actually one of the consortium members of the K supercomputer project. As such, we have plenty of researchers using that machine as well. High-end supercomputers like K are absolutely great, but it is also important to have middle-class supercomputers dedicated to specific research fields available.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 6:51 pm on January 20, 2015
    Tags: isgtw

    From isgtw and Sandia Lab: “8 Mind-Blowing Scientific Research Machines” 

    ISGTW

    Sandia Lab

    Scientific innovation and discovery are defining characteristics of humanity’s innate curiosity. Mankind has developed advanced scientific research machines to help us better understand the universe. They constitute some of the greatest human endeavors for the sake of technological and scientific progress. These projects also connect people of many nations and cultures, and inspire future generations of engineers and scientists.

    Apart from the last two experiments that are under construction, the images in this article are not fake or altered; they are real and showcase machines on the frontier of scientific innovation and discovery. Read on to learn more about the machines, what the images show, and how NI technology helps make them possible.

    Borexino, a solar neutrino experiment, recently confirmed the energy output of the sun has not changed in 100,000 years. Its large underground spherical detector contains 2,000 soccer-ball-sized photomultiplier tubes.

    Borexino and DarkSide

    Gran Sasso National Laboratory, Assergi, Italy

    PMTs are contained inside the Liquid Scintillator Veto spherical tank, a component of the DarkSide Experiment used to actively suppress background events from radiogenic and cosmogenic neutrons.

    Borexino and DarkSide are located 1.4 km (0.87 miles) below the earth’s surface in the world’s largest underground laboratory for experiments in particle astrophysics. Only a tiny fraction of the contents of the universe is visible matter; the rest is thought to be composed of dark matter and dark energy. A leading hypothesis for dark matter is that it comprises Weakly Interacting Massive Particles (WIMPs). The DarkSide experiment attempts to detect these particles to better understand the nature of dark matter and its interactions.

    These experiments use NI oscilloscopes to acquire electrical signals resulting from scintillation light captured by the photomultiplier tubes (PMTs). In DarkSide, 200 high-speed, high-resolution channels need to be tightly synchronized to make time-of-flight measurements of photons. Watch the NIWeek 2013 keynote or view a technical presentation for more information.
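
    As a one-dimensional toy of why those channels must share a tightly aligned clock (the geometry, light speed, and timing numbers below are invented and do not describe the actual DarkSide detector):

    ```python
    LIGHT_SPEED_M_PER_NS = 0.20   # assumed effective light propagation speed, metres per nanosecond

    def event_position(t_left_ns, t_right_ns):
        """1-D localization between two synchronized sensors from the arrival-time difference."""
        return 0.5 * (t_right_ns - t_left_ns) * LIGHT_SPEED_M_PER_NS

    # A 1 ns synchronization error between channels shifts the reconstructed
    # position by about 10 cm, which is why tight channel alignment matters.
    print(f"{event_position(10.0, 14.0):+.2f} m from the detector centre")
    print(f"{event_position(10.0, 15.0):+.2f} m when one channel is skewed by 1 ns")
    ```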

    Joint European Torus (JET)

    Culham Centre for Fusion Energy (CCFE), Oxfordshire, United Kingdom

    Plasma is contained and heated in a torus within the interior of the JET tokamak.

    Currently the largest experimental tokamak fusion reactor in the world, JET uses magnetic confinement to contain plasma at around 100 million degrees Celsius, nearly seven times the temperature of the sun’s core (15 million degrees Celsius). Nuclear fusion is the process that powers the sun. Harnessing this type of energy can help solve the world’s growing energy demand. This facility is crucial to the research and development for future larger fusion reactors.

    Large Hadron Collider (LHC)
    CERN, Geneva, Switzerland

    The A Toroidal LHC ApparatuS (ATLAS) is the LHC’s largest particle detector and was involved in the recent discovery of the Higgs boson.

    The LHC is the largest and most powerful particle accelerator in the world, located in a 27 km (16.78 mile) ring tunnel underneath Switzerland and France. The experiment recently discovered the Higgs boson, deemed the “God Particle” that gives everything its mass. CERN is set to reopen the upgraded LHC in early 2015 at much higher energies to help physicists probe deeper into the nature of the universe and address the questions of supersymmetry and dark matter.

    National Ignition Facility (NIF)
    Lawrence Livermore National Laboratory (LLNL), California, USA

    The image looks up into NIF’s 10 m (33 ft) diameter spherical target chamber with the target held on the protruding pencil-shaped arm.

    NIF is the largest inertial confinement fusion device in the world. The experiment converges the beams of 192 high-energy lasers on a single fuel-filled target, producing a 500 TW flash of light to trigger nuclear fusion. The aim of this experiment is to produce a condition known as ignition, in which the fusion reaction becomes self-sustaining. The machine was also used as the set for the warp drive in the latest Star Trek movie.

    Z Machine
    Sandia National Laboratories, Albuquerque, New Mexico, USA

    The Z Machine creates residual lightning as it releases 350 TW of stored energy.

    The world’s largest X-ray generator is used for various high-pulsed power experiments requiring extreme temperatures and pressures. This includes inertial confinement fusion research. The extremely high voltages are achieved by rapidly discharging huge capacitors in a large insulated bath of oil and water onto a central target.

    European Extremely Large Telescope (E-ELT)

    European Southern Observatory (ESO), Cerro Armazones, Chile

    This artist’s rendition of the E-ELT shows it at its high-altitude Atacama Desert site.

    The E-ELT is the largest optical/near-infrared ground-based telescope being built by ESO in northern Chile. It will allow astronomers to probe deep into space and investigate many unanswered questions about the universe. Images from E-ELT will be 16 times sharper than those from the Hubble Space Telescope, allowing astronomers to study the creation and atmospheres of extrasolar planets. The primary M1 mirror (shown in the image) is nearly 40 m (131 ft) in diameter, consisting of about 800 hexagonal segments.

    NASA Hubble Telescope
    Hubble

    International Thermonuclear Experimental Reactor (ITER)
    ITER Organization, Cadarache, France

    This cutaway computer model shows ITER with plasma at its core. A technician is shown to demonstrate the machine’s size.

    ITER is an international effort to build the largest experimental fusion tokamak in the world, a critical step toward future fusion power plants. The European Union, India, Japan, China, Russia, South Korea, and United States are collaborating on the project, which is currently under construction in southern France.

     
  • richardmitnick 5:55 pm on December 10, 2014
    Tags: isgtw

    From isgtw: “Supercomputer compares modern and ancient DNA” 


    international science grid this week

    December 10, 2014
    Jorge Salazar, Texas Advanced Computing Center

    What if you researched your family’s genealogy, and a mysterious stranger turned out to be an ancestor? A team of scientists who peered back into Europe’s murky prehistoric past thousands of years ago had the same surprise. With sophisticated genetic tools, supercomputing simulations and modeling, they traced the origins of modern Europeans to three distinct populations. The international research team’s results are published in the journal Nature.

    The Stuttgart skull, from a 7,000-year-old skeleton found in Germany among artifacts from the first widespread farming culture of central Europe. Right: Blue eyes and dark skin – how the European hunter-gatherer appeared 7,000 years ago. Artist depiction based on La Braña 1, whose remains were recovered at La Braña-Arintero site in León, Spain. Images courtesy Consejo Superior de Investigaciones Cientificas.

    “Europeans seem to be a mixture of three different ancestral populations,” says study co-author Joshua Schraiber, a National Science Foundation postdoctoral fellow at the University of Washington, in Seattle, US. Schraiber says the results surprised him because the prevailing view among scientists held that only two distinct groups mixed between 7,000 and 8,000 years ago in Europe, as humans first started to adopt agriculture.

    Scientists have only a handful of ancient remains well preserved enough for genome sequencing. An 8,000-year-old skull discovered in Loschbour, Luxembourg, provided DNA evidence for the study. Other remains used in the research were found at the caves of Loschbour and La Braña, at Stuttgart, at a ritual site at Motala, and at Mal’ta.

    The third mystery group that emerged from the data is ancient northern Eurasians. “People from the Siberia area is how I conceptualize it,” says Schraiber. “We don’t know too much anthropologically about who these people are. But the genetic evidence is relatively strong because we do have ancient DNA from an individual that’s very closely related to that population, too.”

    The individual is a three-year-old boy whose remains were found near Lake Baikal in Siberia at the Mal’ta site. Scientists determined his arm bone to be 24,000 years old. They then sequenced his genome, making it the second-oldest modern human genome sequenced. Interestingly enough, in late 2013 scientists used the Mal’ta genome to find that about one-third of Native American ancestry originated through gene flow from these ancient North Eurasians.

    The researchers took the genomes from these ancient humans and compared them to those from 2,345 modern-day Europeans. “I used the POPRES data set, which had been used before to ask similar questions just looking at modern Europeans,” Schraiber says. “Then I used software called Beagle, which was written by Brian Browning and Sharon Browning at the University of Washington, which computationally detects these regions of identity by descent.”
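
    The core intuition behind identity-by-descent detection is finding unusually long stretches where two individuals’ genetic markers agree. The toy scan below only captures that intuition; Beagle itself works on phased haplotypes with probabilistic models, and the marker strings here are invented:

    ```python
    def shared_segments(markers_a, markers_b, min_length=8):
        """Return (start, end) runs where two marker sequences agree for at least min_length sites."""
        segments, start = [], None
        for i, (a, b) in enumerate(zip(markers_a, markers_b)):
            if a == b:
                start = i if start is None else start
            else:
                if start is not None and i - start >= min_length:
                    segments.append((start, i))
                start = None
        if start is not None and len(markers_a) - start >= min_length:
            segments.append((start, len(markers_a)))
        return segments

    # Invented 0/1/2 genotype codes for two individuals at 25 markers.
    a = "0120120120222222222201201"
    b = "0110200121222222222211200"
    print(shared_segments(a, b))   # one long shared run: [(10, 20)]
    ```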

    The National Science Foundation’s XSEDE (Extreme Science and Engineering Discovery Environment) and the Stampede supercomputer at the Texas Advanced Computing Center provided the computational resources used in the study. The research was funded in part by the National Cancer Institute of the National Institutes of Health.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 10:28 pm on December 3, 2014 Permalink | Reply
    Tags: , , , , , , , isgtw, , ,   

    From isgtw: “Volunteer computing: 10 years of supporting CERN through LHC@home” 


    international science grid this week

    December 3, 2014
    Andrew Purcell

    LHC@home recently celebrated a decade since its launch in 2004. Through its SixTrack project, the LHC@home platform harnesses the power of volunteer computing to model the progress of sub-atomic particles traveling at nearly the speed of light around the Large Hadron Collider (LHC) at CERN, near Geneva, Switzerland. It typically simulates about 60 particles whizzing around the collider’s 27-km-long ring for ten seconds, or up to one million loops.

    Results from SixTrack were used to help the engineers and physicists at CERN design stable beam conditions for the LHC, so today the beams stay on track and don’t cause damage by flying off course into the walls of the vacuum tube. It is now also being used to carry out simulations relevant to the design of the next phase of the LHC, known as the High-Luminosity LHC.
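
    Conceptually, this kind of long-term tracking asks whether a particle’s oscillation amplitude stays bounded turn after turn. The sketch below is only a toy model of that idea (a linear one-turn map plus a simple nonlinear kick), not SixTrack itself, and all parameter values are invented.

    import math, random

    # Toy single-particle tracking (not SixTrack): apply a one-turn phase-space
    # rotation plus a small nonlinear kick, and flag particles whose amplitude
    # ever exceeds the aperture as "lost".
    TUNE = 0.31          # fractional betatron tune (assumed value)
    KICK = 0.8           # strength of the toy nonlinearity
    APERTURE = 1.0       # normalised aperture
    TURNS = 10_000       # SixTrack tracks up to ~1,000,000 turns; kept small here

    def survives(x, px):
        c, s = math.cos(2 * math.pi * TUNE), math.sin(2 * math.pi * TUNE)
        for _ in range(TURNS):
            x, px = c * x + s * px, -s * x + c * px   # linear one-turn map
            px += KICK * x * x                        # sextupole-like kick
            if x * x + px * px > APERTURE * APERTURE:
                return False                          # particle hits the aperture
        return True

    random.seed(1)
    particles = [(random.uniform(-0.4, 0.4), random.uniform(-0.4, 0.4)) for _ in range(60)]
    lost = sum(not survives(x, px) for x, px in particles)
    print(f"{lost} of {len(particles)} test particles lost within {TURNS:,} turns")

    In SixTrack the one-turn map is built from the full LHC lattice rather than a single rotation and kick, but the stability question being asked is the same.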

    CERN LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

    “The results of SixTrack played an essential role in the design of the LHC, and the high-luminosity upgrades will naturally require additional development work on SixTrack,” explains Frank Schmidt, who works in the Accelerator and Beam Physics Group of CERN’s Beams Department and is the main author of the SixTrack code.

    “In addition to its use in the design stage, SixTrack is also a key tool for the interpretation of data taken during the first run of the LHC,” adds Massimo Giovannozzi, who also works in the Accelerator and Beam Physics Group. “We use it to improve our understanding of particle dynamics, which will help us to push the LHC performance even further over the coming years of operation.” He continues: “Managing a project like SixTrack within LHC@home requires resources and competencies that are not easy to find: Igor Zacharov, a senior scientist at the Particle Accelerator Physics Laboratory (LPAP) of the Swiss Federal Institute of Technology in Lausanne (EPFL), provides valuable support for SixTrack by helping with BOINC integration.”

    c
    Volunteer computing is a type of distributed computing through which members of the public donate computing resources (usually processing power) to aid research projects. Image courtesy Eduardo Diez Viñuela, Flickr (CC BY-SA 2.0).

    Before LHC@home was created, SixTrack was run only on desktop computers at CERN, using a platform called the Compact Physics Screen Saver (CPSS). This proved to be a useful proof of concept, but it was only with the launch of the LHC@home platform in 2004 that things really took off. “I am surprised and delighted by the support from our volunteers,” says Eric McIntosh, who formerly worked in CERN’s IT Department and is now an honorary member of the Beams Department. “We now have over 100,000 users all over the world and many more hosts. Every contribution is welcome, however small, as our strength lies in numbers.”

    Virtualization to the rescue

    Building on the success of SixTrack, the Virtual LHC@home project (formerly known as Test4Theory) was launched in 2011. It enables users to run simulations of high-energy particle physics using their home computers, with the results submitted to a database used as a common resource by both experimental and theoretical scientists working on the LHC.

    Whereas the code for SixTrack was ported to run on Windows, OS X, and Linux, the high-energy-physics code used by each of the LHC experiments is far too large to port in a similar way. It is also being constantly updated. “The experiments at CERN have their own libraries and they all run on Linux, while the majority of people out there have common-or-garden variety Windows machines,” explains Ben Segal, an honorary staff member of CERN’s IT department and chief technology officer of the Citizen Cyberscience Centre. “Virtualization is the way to solve this problem.”

    The birth of the LHC@home platform

    In 2004, Ben Segal and François Grey, who were both members of CERN’s IT department at the time, were asked to plan an outreach event for CERN’s 50th anniversary that would help people around the world to get an impression of the computational challenges facing the LHC. “I had been an early volunteer for SETI@home after it was launched in 1999,” explains Grey. “Volunteer computing was often used as an illustration of what distributed computing means when discussing grid technology. It seemed to me that it ought to be feasible to do something similar for LHC computing and perhaps even combine volunteer computing and grid computing this way.”

    “I contacted David Anderson, the person behind SETI@home, and it turned out the timing was good, as he was working on an open-source platform called BOINC to enable many projects to use the SETI@home approach,” Grey continues. BOINC (Berkeley Open Infrastructure for Network Computing) is an open-source software platform for computing with volunteered resources. It was first developed at the University of California, Berkeley in the US to manage the SETI@home project, and it uses otherwise idle CPU and GPU cycles on volunteers’ computers to support scientific research.
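
    The pattern that BOINC implements can be sketched in a few lines (this is not the BOINC API, and all names and numbers below are invented): a server splits a task into work units, sends each unit to more than one volunteer, and accepts a result only when the returned copies agree.

    import random

    # Conceptual sketch of the volunteer-computing pattern BOINC implements
    # (not the BOINC API): a server splits work into units, sends each unit
    # to two volunteers, and accepts a result only when the copies agree.

    def make_work_units(data, size):
        """Split the full job into small, independent work units."""
        return [data[i:i + size] for i in range(0, len(data), size)]

    def volunteer_compute(unit, reliable=True):
        """Stand-in for the real science code running on a volunteer's spare cycles."""
        result = sum(x * x for x in unit)
        return result if reliable else result + random.randint(1, 10)  # faulty host

    def run_project(data):
        validated = 0
        for unit in make_work_units(data, size=4):
            # Redundant computing: the same unit goes to two (simulated) volunteers.
            r1 = volunteer_compute(unit, reliable=random.random() > 0.1)
            r2 = volunteer_compute(unit, reliable=random.random() > 0.1)
            if r1 == r2:
                validated += r1
            # A disagreement would cause the unit to be reissued to another host.
        return validated

    random.seed(0)
    print("validated partial result:", run_project(list(range(40))))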

    “I vividly remember the day we phoned up David Anderson in Berkeley to see if we could make a SETI-like computing challenge for CERN,” adds Segal. “We needed a CERN application that ran on Windows, as over 90% of BOINC volunteers used that. The SixTrack people had ported their code to Windows and had already built a small CERN-only desktop grid to run it on, as they needed lots of CPU power. So we went with that.”

    A runaway success

    “I was worried that no one would find the LHC as interesting as SETI. Bear in mind that this was well before the whole LHC craziness started with the Angels and Demons movie, and news about possible mini black holes destroying the planet making headlines,” says Grey. “We made a soft launch, without any official announcements, in 2004. To our astonishment, the SETI@home community immediately jumped in, having heard about LHC@home by word of mouth. We had over 1,000 participants in 24 hours, and over 7,000 by the end of the week — our server’s maximum capacity.” He adds: “We’d planned to run the volunteer computing challenge for just three months, at the time of the 50th anniversary. But the accelerator physicists were hooked and insisted the project should go on.”

    Predrag Buncic, who is now coordinator of the offline group within the ALICE experiment, led work to create the CERN Virtual Machine in 2008. He, Artem Harutyunyan (former architect and lead developer of CernVM Co-Pilot), and Segal subsequently adopted this virtualization technology for use within Virtual LHC@home. This has made it significantly easier for the experiments at CERN to create their own volunteer computing applications, since it is no longer necessary for them to port their code. The long-term vision for Virtual LHC@home is to support volunteer-computing applications for each of the large LHC experiments.

    Growth of the platform

    The ATLAS experiment recently launched a project that simulates the creation and decay of supersymmetric bosons and fermions. “ATLAS@Home offers the chance for the wider public to participate in the massive computation required by the ATLAS experiment and to contribute to the greater understanding of our universe,” says David Cameron, a researcher at the University of Oslo in Norway. “ATLAS also gains a significant computing resource at a time when even more resources will be required for the analysis of data from the second run of the LHC.”

    CERN ATLAS New
    ATLAS

    ATLAS@home

    Meanwhile, the LHCb experiment has been running a limited test prototype for over a year now, with an application running Beauty physics simulations set to be launched for the Virtual LHC@home project in the near future. The CMS and ALICE experiments also have plans to launch similar applications.

    CERN LHCb New
    LHCb

    CERN CMS New
    CMS

    CERN ALICE New
    ALICE

    An army of volunteers

    “LHC@home allows CERN to get additional computing resources for simulations that cannot easily be accommodated on regular batch or grid resources,” explains Nils Høimyr, the member of the CERN IT department responsible for running the platform. “Thanks to LHC@home, thousands of CPU years of accelerator beam dynamics simulations for LHC upgrade studies have been done with SixTrack, and billions of events have been simulated with Virtual LHC@home.” He continues: “Furthermore, the LHC@home platform has been an outreach channel, giving publicity to LHC and high-energy physics among the general public.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition


     