Tagged: isgtw

  • richardmitnick 11:43 am on January 24, 2015 Permalink | Reply
    Tags: isgtw

    From isgtw: “Unlocking the secrets of vertebrate evolution” 


    international science grid this week

    January 21, 2015
    Lance Farrell

    Conventional wisdom holds that snakes evolved a particular form and skeleton by losing regions in their spinal column over time. These losses were previously explained by a disruption in Hox genes responsible for patterning regions of the vertebrae.

    Paleobiologists P. David Polly, professor of geological sciences at Indiana University, US, and Jason Head, assistant professor of earth and atmospheric sciences at the University of Nebraska-Lincoln, US, overturned that assumption. Recently published in Nature, their research instead reveals that snake skeletons are just as regionalized as those of limbed vertebrates.

    Using Quarry [being taken out of service Jan 30, 2015, and replaced by Karst], a supercomputer at Indiana University, Polly and Head arrived at a compelling new explanation for why snake skeletons are so different: Vertebrates like mammals, birds, and crocodiles evolved additional skeletal regions independently from ancestors like snakes and lizards.

    Karst

    Despite having no limbs and more vertebrae, snake skeletons are just as regionalized as lizards’ skeletons.

    “Our study finds that snakes did not require extensive modification to their regulatory gene systems to evolve their elongate bodies,” Head notes.

    P. David Polly. Photo courtesy Indiana University.

    Polly and Head had to overcome challenges in collection and analysis to arrive at this insight. “If you are sequencing a genome all you really need is a little scrap of tissue, and that’s relatively easy to get,” Polly says. “But if you want to do something like we have done, you not only need an entire skeleton, but also one for a whole lot of species.”

    To arrive at their conclusion, Head and Polly sampled 56 skeletons from collections worldwide. They began by photographing and digitizing the bones, then chose specific landmarks on each spinal segment. Using the digital coordinates of each vertebra, they then applied a technique called geometric morphometrics, a multivariate analysis that uses x and y coordinates to analyze an object’s shape.
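
    The article does not spell out the shape-analysis step, but a standard first move in geometric morphometrics is a Procrustes-style superimposition that strips away position, scale, and rotation so only shape remains. The sketch below is a minimal illustration of that idea with made-up landmark coordinates; the function name and the two-vertebra setup are assumptions, not details from the study.

```python
import numpy as np

def align_landmarks(ref, target):
    """Ordinary Procrustes alignment: translate, scale, and rotate `target`
    so it best matches `ref`. Both are (k, 2) arrays of landmark x, y coordinates."""
    ref_c = ref - ref.mean(axis=0)            # remove position
    tgt_c = target - target.mean(axis=0)
    ref_c = ref_c / np.linalg.norm(ref_c)     # remove scale (unit centroid size)
    tgt_c = tgt_c / np.linalg.norm(tgt_c)
    u, _, vt = np.linalg.svd(tgt_c.T @ ref_c) # optimal rotation via SVD
    return tgt_c @ (u @ vt)

# Hypothetical landmark sets for two vertebrae (three landmarks each).
vert_a = np.array([[0.0, 0.0], [1.0, 0.1], [0.5, 0.8]])
vert_b = np.array([[2.0, 1.0], [3.1, 1.3], [2.4, 2.0]])
print(align_landmarks(vert_a, vert_b))  # vert_b expressed in vert_a's shape space
```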

    Armed with shape information, the scientists then fit a series of regressions and tracked each vertebra’s gradient over the entire spine. This led to a secondary challenge — with 36,000 landmarks applied to 3,000 digitized vertebrae, the regression analyses required to peer into the snake’s past called for a new analytical tool.

    “The computations required iteratively fitting four or more segmented regression models, each with 10 to 83 parameters, for every regional permutation of up to 230 vertebrae per skeleton. The amount of computational power required is well beyond any desktop system,” Head observes.
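
    A minimal sketch of the kind of computation Head describes: exhaustively trying every way to split a column of vertebrae into contiguous regions, fitting a separate regression in each region, and keeping the split with the lowest residual error. This toy version uses a single shape score per vertebra and simple linear fits; the real analysis used far richer shape data and many more parameters, so treat the variable names and data here as hypothetical.

```python
import numpy as np
from itertools import combinations

def fit_segments(position, shape_score, n_regions):
    """Exhaustively search breakpoints dividing the vertebral column into
    `n_regions` contiguous blocks; fit a separate linear regression in each
    block and return the best segmentation by total residual sum of squares."""
    n = len(position)
    best = (np.inf, None)
    # Breakpoints are indices where a new region starts (never the first vertebra).
    for cuts in combinations(range(1, n), n_regions - 1):
        bounds = (0,) + cuts + (n,)
        rss = 0.0
        for lo, hi in zip(bounds[:-1], bounds[1:]):
            x, y = position[lo:hi], shape_score[lo:hi]
            if len(x) < 2:          # need at least two vertebrae per region
                rss = np.inf
                break
            coeffs = np.polyfit(x, y, 1)                     # linear fit in region
            rss += np.sum((np.polyval(coeffs, x) - y) ** 2)  # accumulate error
        if rss < best[0]:
            best = (rss, bounds)
    return best

# Hypothetical data: 20 vertebrae with a shape gradient that changes at vertebra 8.
pos = np.arange(20, dtype=float)
score = np.concatenate([0.5 * pos[:8], 4.0 - 0.2 * pos[8:]])
print(fit_segments(pos, score, n_regions=2))  # best split should land at index 8
```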

    Researchers like Polly and Head increasingly find that quantitative analyses of data sets this size require computational resources to match. With 7.2 million different models making up the data for their study, nothing less than a supercomputer would do.

    Jason Head with ball python. Photo courtesy Craig Chandler, University of Nebraska-Lincoln.

    “Our supercomputing environments serve a broad base of users and purposes,” says David Hancock, manager of IU’s high performance systems. “We often support the research done in the hard sciences and math such as Polly’s, but we also see analytics done for business faculty, marketing and modeling for interior design projects, and lighting simulations for theater productions.”

    Analyses of the scale Polly and Head needed would have been unapproachable even a decade ago, and without US National Science Foundation support remain beyond the reach of most institutions. “A lot of the big jobs ran on Quarry,” says Polly. “To run one of these exhaustive models on a single snake took about three and a half days. Ten years ago we could barely have scratched the surface.”

    As high-performance computing resources reshape the future, scientists like Polly and Head have greater abilities to look into the past and unlock the secrets of evolution.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 5:00 pm on January 21, 2015 Permalink | Reply
    Tags: isgtw, Simulation Astronomy

    From isgtw: “Exploring the universe with supercomputing” 


    international science grid this week

    January 21, 2015
    Andrew Purcell

    The Center for Computational Astrophysics (CfCA) in Japan recently upgraded its ATERUI supercomputer, doubling the machine’s theoretical peak performance to 1.058 petaFLOPS. Eiichiro Kokubo, director of the center, tells iSGTW how supercomputers are changing the way research is conducted in astronomy.

    What’s your research background?

    I investigate the origin of planetary systems. I use many-body simulations to study how planets form and I also previously worked on the development of the Gravity Pipe, or ‘GRAPE’ supercomputer.

    Why is it important to use supercomputers in this work?

    In the standard scenario of planet formation, small solid bodies — known as ‘planetesimals’ — interact with one another and this causes their orbits around the sun to evolve. Collisions between these building blocks lead to the formation of rocky planets like the Earth. To understand this process, you really need to do very-large-scale many-body simulations. This is where high-performance computing comes in: supercomputers act as telescopes for phenomena we wouldn’t otherwise be able to see.

    The scales of mass, energy, and time are generally huge in astronomy. However, as supercomputers have become ever more powerful, we’ve become able to program the relevant physical processes — motion, fluid dynamics, radiative transfer, etc. — and do meaningful simulation of astronomical phenomena. We can even conduct experiments by changing parameters within our simulations. Simulation is numerical exploration of the universe!
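
    As a concrete illustration of what a many-body (N-body) planetesimal simulation involves at its core, here is a minimal direct-summation sketch with a leapfrog integrator. It is a toy in code units with invented initial conditions, nothing like the GRAPE hardware or the center’s production codes, but the gravitational bookkeeping is the same.

```python
import numpy as np

G = 1.0  # gravitational constant in code units

def accelerations(pos, mass, softening=1e-3):
    """Direct-summation gravitational accelerations for all bodies (O(N^2))."""
    diff = pos[None, :, :] - pos[:, None, :]            # pairwise separation vectors
    dist3 = (np.sum(diff**2, axis=-1) + softening**2) ** 1.5
    np.fill_diagonal(dist3, np.inf)                     # no self-interaction
    return G * np.sum(mass[None, :, None] * diff / dist3[:, :, None], axis=1)

def leapfrog(pos, vel, mass, dt, steps):
    """Kick-drift-kick leapfrog integration of the N-body system."""
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc
    return pos, vel

# Hypothetical toy system: a 'sun' plus 50 planetesimals with random positions.
rng = np.random.default_rng(0)
n = 51
mass = np.full(n, 1e-6); mass[0] = 1.0
pos = rng.normal(scale=1.0, size=(n, 3)); pos[0] = 0.0
vel = rng.normal(scale=0.3, size=(n, 3)); vel[0] = 0.0
pos, vel = leapfrog(pos, vel, mass, dt=0.01, steps=100)
```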

    How has supercomputing changed the way research is carried out?

    ‘Simulation astronomy’ has now become a third major methodological approach within the field, alongside observational and theoretical astronomy. Telescopes rely on electromagnetic radiation, but there are still many things that we cannot see even with today’s largest telescopes. Supercomputers enable us to use complex physical calculations to visualize phenomena that would otherwise remain hidden to us. Their use also gives us the flexibility to simulate phenomena across a vast range of spatial and temporal scales.

    Simulation can be used to simply test hypotheses, but it can also be used to explore new worlds that are beyond our current imagination. Sometimes you get results from a simulation that you really didn’t expect — this is often the first step on the road to making new discoveries and developing new astronomical theories.

    ATERUI has made the leap to become a petaFLOPS-scale supercomputer. Image courtesy NAOJ/Makoto Shizugami (VERA/CfCA, NAOJ).

    In astronomy, there are three main kinds of large-scale simulation: many-body, fluid dynamics, and radiative transfer. These problems can all be parallelized effectively, meaning that massively parallel computers — like the Cray XC30 system we’ve installed — are ideally suited to performing these kinds of simulations.
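
    The reason these problems map so well onto massively parallel machines is that the expensive inner loop decomposes into independent pieces. Below is a rough sketch of that decomposition, splitting the direct-sum force calculation from the previous example across worker processes; the chunking scheme is purely illustrative and not how a production code on a Cray XC30 would be structured.

```python
from multiprocessing import Pool
import numpy as np

G = 1.0  # gravitational constant in code units

def acc_chunk(args):
    """Accelerations for one block of bodies against all others (direct sum)."""
    chunk, pos, mass, soft = args
    diff = pos[None, :, :] - pos[chunk, None, :]          # (len(chunk), N, 3)
    dist3 = (np.sum(diff**2, axis=-1) + soft**2) ** 1.5
    dist3[np.arange(len(chunk)), chunk] = np.inf          # drop self-interaction
    return G * np.sum(mass[None, :, None] * diff / dist3[:, :, None], axis=1)

def parallel_accelerations(pos, mass, soft=1e-3, nproc=4):
    """Split the bodies into blocks, compute each block's accelerations in a
    separate process, and concatenate the results back in order."""
    chunks = np.array_split(np.arange(len(pos)), nproc)
    with Pool(nproc) as pool:
        parts = pool.map(acc_chunk, [(c, pos, mass, soft) for c in chunks])
    return np.concatenate(parts)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    pos = rng.normal(size=(1000, 3))
    mass = np.ones(1000)
    print(parallel_accelerations(pos, mass).shape)  # (1000, 3)
```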

    “Supercomputers act as telescopes for phenomena we wouldn’t otherwise be able to see,” says Kokubo.

    What research problems will ATERUI enable you to tackle?

    There are over 100 users in our community and they are tackling a wide variety of problems. One project, for example, is looking at supernovae: having very high-resolution 3D simulations of these explosions is vital to improving our understanding. Another project is looking at the distribution of galaxies throughout the universe, and there is a whole range of other things being studied using ATERUI too.

    Since ATERUI was installed, it has been used at over 90% of its capacity, in terms of the number of CPUs running at any given time. Basically, it’s almost full every single day!

    Don’t forget, we also have the K computer here in Japan. The National Astronomical Observatory of Japan, of which the CfCA is part, is actually one of the consortium members of the K supercomputer project. As such, we have plenty of researchers using that machine as well. High-end supercomputers like K are absolutely great, but it is also important to have mid-range supercomputers dedicated to specific research fields.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 6:51 pm on January 20, 2015 Permalink | Reply
    Tags: isgtw

    From isgtw and Sandia Lab: “8 Mind-Blowing Scientific Research Machines” 

    ISGTW

    Sandia Lab

    Scientific innovation and discovery are defining characteristics of humanity’s innate curiosity. Mankind has developed advanced scientific research machines to help us better understand the universe. They constitute some of the greatest human endeavors for the sake of technological and scientific progress. These projects also connect people of many nations and cultures, and inspire future generations of engineers and scientists.

    Apart from the last two experiments that are under construction, the images in this article are not fake or altered; they are real and showcase machines on the frontier of scientific innovation and discovery. Read on to learn more about the machines, what the images show, and how NI technology helps make them possible.

    Borexino, a solar neutrino experiment, recently confirmed the energy output of the sun has not changed in 100,000 years. Its large underground spherical detector contains 2,000 soccer-ball-sized photomultiplier tubes.

    Borexino and DarkSide

    Gran Sasso National Laboratory, Assergi, Italy

    PMTs are contained inside the Liquid Scintillator Veto spherical tank, a component of the DarkSide Experiment used to actively suppress background events from radiogenic and cosmogenic neutrons.

    Borexino and DarkSide are located 1.4 km (0.87 miles) below the earth’s surface in the world’s largest underground laboratory for experiments in particle astrophysics. Only a tiny fraction of the contents of the universe is visible matter; the rest is thought to be composed of dark matter and dark energy. A leading hypothesis for dark matter is that it comprises Weakly Interacting Massive Particles (WIMPs). The DarkSide experiment attempts to detect these particles to better understand the nature of dark matter and its interactions.

    These experiments use NI oscilloscopes to acquire electrical signals resulting from scintillation light captured by the photomultiplier tubes (PMTs). In DarkSide, 200 high-speed, high-resolution channels need to be tightly synchronized to make time-of-flight measurements of photons. Watch the NIWeek 2013 keynote or view a technical presentation for more information.
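
    The article does not detail the time-of-flight analysis, but conceptually it comes down to comparing photon arrival times across channels that share a common clock. The fragment below is a purely illustrative post-processing step on hypothetical hit records; it is not based on the NI acquisition API and the record layout is an assumption.

```python
import numpy as np

# Hypothetical hit records: (channel, timestamp in ns) from synchronized digitizers.
hits = np.array(
    [(12, 1000.4), (47, 1000.9), (103, 1001.1), (12, 1534.2)],
    dtype=[("channel", "i4"), ("t_ns", "f8")],
)

def time_of_flight(hits, trigger_ns, window_ns=50.0):
    """Return per-channel arrival delays relative to a common trigger time,
    keeping only hits that fall inside the coincidence window."""
    sel = (hits["t_ns"] >= trigger_ns) & (hits["t_ns"] <= trigger_ns + window_ns)
    return {int(h["channel"]): float(h["t_ns"] - trigger_ns) for h in hits[sel]}

print(time_of_flight(hits, trigger_ns=1000.0))  # delays for channels 12, 47, 103
```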

    Joint European Torus (JET)

    Culham Centre for Fusion Energy (CCFE), Oxfordshire, United Kingdom

    Plasma is contained and heated in a torus within the interior of the JET tokamak.

    Currently the largest experimental tokamak fusion reactor in the world, JET uses magnetic confinement to contain plasma at around 100 million degrees Celsius, nearly seven times the temperature of the sun’s core (15 million degrees Celsius). Nuclear fusion is the process that powers the sun, and harnessing this type of energy could help meet the world’s growing energy demand. The facility is crucial to research and development for future, larger fusion reactors.

    Large Hadron Collider (LHC)
    CERN, Geneva, Switzerland

    The A Toroidal LHC ApparatuS (ATLAS) is the LHC’s largest particle detector, involved in the recent discovery of the Higgs boson.

    The LHC is the largest and most powerful particle accelerator in the world, located in a 27 km (16.78 mile) ring tunnel beneath the border of Switzerland and France. Its experiments recently discovered the Higgs boson, popularly dubbed the “God Particle,” which gives elementary particles their mass. CERN is set to restart the upgraded LHC in early 2015 at much higher energies to help physicists probe deeper into the nature of the universe and address the questions of supersymmetry and dark matter.

    National Ignition Facility (NIF)
    Lawrence Livermore National Laboratory (LLNL), California, USA

    The image looks up into NIF’s 10 m (33 ft) diameter spherical target chamber with the target held on the protruding pencil-shaped arm.

    NIF is the largest inertial confinement fusion device in the world. The experiment converges the beams of 192 high-energy lasers on a single fuel-filled target, producing a 500 TW flash of light to trigger nuclear fusion. The aim of this experiment is to produce a condition known as ignition, in which the fusion reaction becomes self-sustaining. The machine was also used as the set for the warp drive in the latest Star Trek movie.

    Z Machine
    Sandia National Laboratories, Albuquerque, New Mexico, USA

    The Z Machine creates residual lightning as it releases 350 TW of stored energy.

    The world’s largest X-ray generator is used for a variety of pulsed-power experiments requiring extreme temperatures and pressures, including inertial confinement fusion research. The extremely high voltages are achieved by rapidly discharging huge capacitors in a large insulated bath of oil and water onto a central target.

    European Extremely Large Telescope (E-ELT)

    European Southern Observatory (ESO), Cerro Armazones, Chile

    This artist’s rendition of the E-ELT shows it at its high-altitude Atacama Desert site.

    The E-ELT, being built by ESO in northern Chile, will be the largest optical/near-infrared ground-based telescope in the world. It will allow astronomers to probe deep into space and investigate many unanswered questions about the universe. Images from the E-ELT will be 16 times sharper than those from the Hubble Space Telescope, allowing astronomers to study the formation and atmospheres of extrasolar planets. The primary M1 mirror (shown in the image) is nearly 40 m (131 ft) in diameter, consisting of about 800 hexagonal segments.

    NASA Hubble Space Telescope

    International Thermonuclear Experimental Reactor (ITER)
    ITER Organization, Cadarache, France

    This cutaway computer model shows ITER with plasma at its core. A technician is shown to demonstrate the machine’s size.

    ITER is an international effort to build the largest experimental fusion tokamak in the world, a critical step toward future fusion power plants. The European Union, India, Japan, China, Russia, South Korea, and the United States are collaborating on the project, which is currently under construction in southern France.

     
  • richardmitnick 5:55 pm on December 10, 2014 Permalink | Reply
    Tags: isgtw

    From isgtw: “Supercomputer compares modern and ancient DNA” 


    international science grid this week

    December 10, 2014
    Jorge Salazar, Texas Advanced Computing Center

    What if you researched your family’s genealogy, and a mysterious stranger turned out to be an ancestor? A team of scientists who peered back into Europe’s murky prehistoric past thousands of years ago had the same surprise. With sophisticated genetic tools, supercomputing simulations, and modeling, they traced the origins of modern Europeans to three distinct populations. The international research team’s results are published in the journal Nature.

    The Stuttgart skull, from a 7,000-year-old skeleton found in Germany among artifacts from the first widespread farming culture of central Europe. Right: Blue eyes and dark skin – how the European hunter-gatherer appeared 7,000 years ago. Artist depiction based on La Braña 1, whose remains were recovered at La Braña-Arintero site in León, Spain. Images courtesy Consejo Superior de Investigaciones Cientificas.

    “Europeans seem to be a mixture of three different ancestral populations,” says study co-author Joshua Schraiber, a National Science Foundation postdoctoral fellow at the University of Washington, in Seattle, US. Schraiber says the results surprised him because the prevailing view among scientists held that only two distinct groups mixed between 7,000 and 8,000 years ago in Europe, as humans first started to adopt agriculture.

    Scientists have only a handful of ancient remains well preserved enough for genome sequencing. An 8,000-year-old skull discovered in Loschbour, Luxembourg, provided DNA evidence for the study. Other remains used in the study were found at La Braña, Stuttgart, a ritual site at Motala, and Mal’ta.

    The third mystery group that emerged from the data is ancient northern Eurasians. “People from the Siberia area is how I conceptualize it,” says Schraiber. “We don’t know too much anthropologically about who these people are. But the genetic evidence is relatively strong because we do have ancient DNA from an individual that’s very closely related to that population, too.”

    The individual is a three-year-old boy whose remains were found near Lake Baikal in Siberia, at the Mal’ta site. Scientists determined his arm bone to be 24,000 years old. They then sequenced his genome, making it the second-oldest modern human genome sequenced. Interestingly enough, in late 2013 scientists used the Mal’ta genome to find that about one-third of Native American ancestry originated through gene flow from these ancient North Eurasians.

    The researchers took the genomes from these ancient humans and compared them to those from 2,345 modern-day Europeans. “I used the POPRES data set, which had been used before to ask similar questions just looking at modern Europeans,” Schraiber says. “Then I used software called Beagle, which was written by Brian Browning and Sharon Browning at the University of Washington, which computationally detects these regions of identity by descent.”
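
    Downstream of a detector like Beagle, a typical question is how much total identity-by-descent each modern individual shares with the ancient genomes. A small sketch of that aggregation step follows; the tab-delimited column layout (sample IDs in the first and third columns, segment start and end in the sixth and seventh) is an assumption about the segment file, so adjust the indices to whatever the detector actually writes.

```python
import csv
from collections import defaultdict

def total_ibd_sharing(path, ancient_ids):
    """Sum the length of IBD segments each modern sample shares with any
    ancient individual. Column positions below are assumptions about a
    tab-delimited segment file; change them to match the real output."""
    sharing = defaultdict(float)
    with open(path) as fh:
        for row in csv.reader(fh, delimiter="\t"):
            s1, s2 = row[0], row[2]                    # assumed sample-ID columns
            start, end = float(row[5]), float(row[6])  # assumed coordinate columns
            for modern, ancient in ((s1, s2), (s2, s1)):
                if ancient in ancient_ids and modern not in ancient_ids:
                    sharing[modern] += (end - start) / 1e6  # megabases shared
    return dict(sharing)

# Hypothetical usage: rank modern individuals by sharing with two ancient genomes.
# shared = total_ibd_sharing("segments.ibd", ancient_ids={"Loschbour", "Stuttgart"})
```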

    The National Science Foundation’s XSEDE (Extreme Science and Engineering Discovery Environment) and Stampede supercomputer at the Texas Advanced Computing Center provided computational resources used in the study. The research was funded in part by the National Cancer Institute of the National Institutes of Health.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 10:28 pm on December 3, 2014 Permalink | Reply
    Tags: isgtw

    From isgtw: “Volunteer computing: 10 years of supporting CERN through LHC@home” 


    international science grid this week

    December 3, 2014
    Andrew Purcell

    LHC@home recently celebrated a decade since its launch in 2004. Through its SixTrack project, the LHC@home platform harnesses the power of volunteer computing to model the progress of sub-atomic particles traveling at nearly the speed of light around the Large Hadron Collider (LHC) at CERN, near Geneva, Switzerland. It typically simulates about 60 particles whizzing around the collider’s 27km-long ring for ten seconds, or up to one million loops. Results from SixTrack were used to help the engineers and physicists at CERN design stable beam conditions for the LHC, so today the beams stay on track and don’t cause damage by flying off course into the walls of the vacuum tube. It’s now also being used to carry out simulations relevant to the design of the next phase of the LHC, known as the High-Luminosity LHC.
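
    To give a flavour of what tracking particles for up to a million loops means computationally, here is a toy single-plane tracker: rotate the particle in phase space once per turn, add a small nonlinear kick, and record whether it ever exceeds an aperture. It is only a cartoon of the idea behind long-term stability studies, not SixTrack’s physics or numerics, and all parameter values are invented.

```python
import numpy as np

def one_turn_map(phase_advance):
    """Linear one-turn matrix for a single transverse plane (x, x')."""
    c, s = np.cos(phase_advance), np.sin(phase_advance)
    return np.array([[c, s], [-s, c]])

def survives(x0, turns=100_000, tune=0.31, kick=1.0, aperture=1.0):
    """Track one particle: rotate in phase space each turn, apply a toy
    sextupole-like kick, and report how many turns it stays inside the aperture."""
    m = one_turn_map(2 * np.pi * tune)
    state = np.array([x0, 0.0])
    for turn in range(turns):
        state = m @ state
        state[1] += kick * state[0] ** 2   # toy nonlinearity
        if abs(state[0]) > aperture:
            return turn                    # lost: would have hit the chamber wall
    return turns                           # survived the whole run

# Hypothetical amplitude scan: print how many turns each starting amplitude lasts.
for amp in (0.05, 0.2, 0.4, 0.8):
    print(amp, survives(amp))
```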

    CERN LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

    “The results of SixTrack played an essential role in the design of the LHC, and the high-luminosity upgrades will naturally require additional development work on SixTrack,” explains Frank Schmidt, who works in CERN’s Accelerators and Beam Physics Group of the Beams Department and is the main author of the SixTrack code. “In addition to its use in the design stage, SixTrack is also a key tool for the interpretation of data taken during the first run of the LHC,” adds Massimo Giovannozzi, who also works in CERN’s Accelerators and Beams Physics Group. “We use it to improve our understanding of particle dynamics, which will help us to push the LHC performance even further over the coming years of operation.” He continues: “Managing a project like SixTrack within LHC@home requires resources and competencies that are not easy to find: Igor Zacharov, a senior scientist at the Particle Accelerator Physics Laboratory (LPAP) of the Swiss Federal Institute of Technology in Lausanne (EPFL), provides valuable support for SixTrack by helping with BOINC integration.”

    Volunteer computing is a type of distributed computing through which members of the public donate computing resources (usually processing power) to aid research projects. Image courtesy Eduardo Diez Viñuela, Flickr (CC BY-SA 2.0).

    Before LHC@home was created, SixTrack was run only on desktop computers at CERN, using a platform called the Compact Physics Screen Saver (CPSS). This proved to be a useful tool for a proof of concept, but it was only with the launch of the LHC@home platform in 2004 that things really took off. “I am surprised and delighted by the support from our volunteers,” says Eric McIntosh, who formerly worked in CERN’s IT Department and is now an honorary member of the Beams Department. “We now have over 100,000 users all over the world and many more hosts. Every contribution is welcome, however small, as our strength lies in numbers.”

    Virtualization to the rescue

    Building on the success of SixTrack, the Virtual LHC@home project (formerly known as Test4Theory) was launched in 2011. It enables users to run simulations of high-energy particle physics using their home computers, with the results submitted to a database used as a common resource by both experimental and theoretical scientists working on the LHC.

    Whereas the code for SixTrack was ported for running on Windows, OS X, and Linux, the high-energy-physics code used by each of the LHC experiments is far too large to port in a similar way. It is also being constantly updated. “The experiments at CERN have their own libraries and they all run on Linux, while the majority of people out there have common-or-garden variety Windows machines,” explains Ben Segal, an honorary staff member of CERN’s IT department and chief technology officer of the Citizen Cyberscience Centre. “Virtualization is the way to solve this problem.”

    The birth of the LHC@home platform

    In 2004, Ben Segal and François Grey, who were both members of CERN’s IT department at the time, were asked to plan an outreach event for CERN’s 50th anniversary that would help people around the world to get an impression of the computational challenges facing the LHC. “I had been an early volunteer for SETI@home after it was launched in 1999,” explains Grey. “Volunteer computing was often used as an illustration of what distributed computing means when discussing grid technology. It seemed to me that it ought to be feasible to do something similar for LHC computing and perhaps even combine volunteer computing and grid computing this way.”

    “I contacted David Anderson, the person behind SETI@home, and it turned out the timing was good, as he was working on an open-source platform called BOINC to enable many projects to use the SETI@home approach,” Grey continues. BOINC (Berkeley Open Infrastructure for Network Computing) is an open-source software platform for computing with volunteered resources. It was first developed at the University of California, Berkeley in the US to manage the SETI@home project, and it uses idle CPU and GPU cycles on a computer to support scientific research.
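
    As a purely conceptual sketch (not the BOINC server API), the volunteer-computing pattern amounts to cutting a large parameter scan into independent work units, issuing each to more than one volunteer, and accepting a result only when the replicas agree. All names and values below are illustrative.

```python
from collections import Counter

def make_workunits(param_grid, replication=2):
    """Split a parameter scan into independent work units, each issued to
    `replication` different volunteers so results can be cross-checked."""
    return [(params, copy) for params in param_grid for copy in range(replication)]

def validate(results_by_workunit):
    """Accept a work unit's result only when a majority of replicas agree."""
    accepted = {}
    for params, results in results_by_workunit.items():
        winner, count = Counter(results).most_common(1)[0]
        if count * 2 > len(results):
            accepted[params] = winner
    return accepted

# Hypothetical round trip: two volunteers agree, a third disagrees.
print(make_workunits([("seed", 42)], replication=3))
print(validate({("seed", 42): ["stable", "stable", "unstable"]}))  # keeps "stable"
```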

    “I vividly remember the day we phoned up David Anderson in Berkeley to see if we could make a SETI-like computing challenge for CERN,” adds Segal. “We needed a CERN application that ran on Windows, as over 90% of BOINC volunteers used that. The SixTrack people had ported their code to Windows and had already built a small CERN-only desktop grid to run it on, as they needed lots of CPU power. So we went with that.”

    A runaway success

    “I was worried that no one would find the LHC as interesting as SETI. Bear in mind that this was well before the whole LHC craziness started with the Angels and Demons movie, and news about possible mini black holes destroying the planet making headlines,” says Grey. “We made a soft launch, without any official announcements, in 2004. To our astonishment, the SETI@home community immediately jumped in, having heard about LHC@home by word of mouth. We had over 1,000 participants in 24 hours, and over 7,000 by the end of the week — our server’s maximum capacity.” He adds: “We’d planned to run the volunteer computing challenge for just three months, at the time of the 50th anniversary. But the accelerator physicists were hooked and insisted the project should go on.”

    Predrag Buncic, who is now coordinator of the offline group within the ALICE experiment, led work to create the CERN Virtual Machine in 2008. He, Artem Harutyunyan (former architect and lead developer of CernVM Co-Pilot), and Segal subsequently adopted this virtualization technology for use within Virtual LHC@home. This has made it significantly easier for the experiments at CERN to create their own volunteer computing applications, since it is no longer necessary for them to port their code. The long-term vision for Virtual LHC@home is to support volunteer-computing applications for each of the large LHC experiments.

    Growth of the platform

    The ATLAS experiment recently launched a project that simulates the creation and decay of supersymmetric bosons and fermions. “ATLAS@Home offers the chance for the wider public to participate in the massive computation required by the ATLAS experiment and to contribute to the greater understanding of our universe,” says David Cameron, a researcher at the University of Oslo in Norway. “ATLAS also gains a significant computing resource at a time when even more resources will be required for the analysis of data from the second run of the LHC.”

    ATLAS

    ATLAS@home

    Meanwhile, the LHCb experiment has been running a limited test prototype for over a year now, with an application running Beauty physics simulations set to be launched for the Virtual LHC@home project in the near future. The CMS and ALICE experiments also have plans to launch similar applications.

    LHCb

    CMS

    ALICE

    An army of volunteers

    “LHC@home allows CERN to get additional computing resources for simulations that cannot easily be accommodated on regular batch or grid resources,” explains Nils Høimyr, the member of the CERN IT department responsible for running the platform. “Thanks to LHC@home, thousands of CPU years of accelerator beam dynamics simulations for LHC upgrade studies have been done with SixTrack, and billions of events have been simulated with Virtual LHC@home.” He continues: “Furthermore, the LHC@home platform has been an outreach channel, giving publicity to LHC and high-energy physics among the general public.”

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 2:59 pm on October 22, 2014 Permalink | Reply
    Tags: isgtw

    From isgtw: “Laying the groundwork for data-driven science” 


    international science grid this week

    October 22, 2014
    Amber Harmon

    The ability to collect and analyze massive amounts of data is rapidly transforming science, industry, and everyday life — but many of the benefits of big data have yet to surface. Interoperability, tools, and hardware are still evolving to meet the needs of diverse scientific communities.

    Image courtesy istockphoto.com.

    One of the US National Science Foundation’s (NSF’s) goals is to improve the nation’s capacity in data science by investing in the development of infrastructure, building multi-institutional partnerships to increase the number of data scientists, and augmenting the usefulness and ease of using data.

    As part of that effort, the NSF announced $31 million in new funding to support 17 innovative projects under the Data Infrastructure Building Blocks (DIBBs) program. Now in its second year, the 2014 DIBBs awards support research in 22 states and touch on research topics in computer science, information technology, and nearly every field of science supported by the NSF.

    “Developed through extensive community input and vetting, NSF has an ambitious vision and strategy for advancing scientific discovery through data,” says Irene Qualters, division director for Advanced Cyberinfrastructure. “This vision requires a collaborative national data infrastructure that is aligned to research priorities and that is efficient, highly interoperable, and anticipates emerging data policies.”

    Of the 17 awards, two support early implementations of research projects that are more mature; the others support pilot demonstrations. Each is a partnership between researchers in computer science and other science domains.

    One of the two early implementation grants will support a research team led by Geoffrey Fox, a professor of computer science and informatics at Indiana University, US. Fox’s team plans to create middleware and analytics libraries that enable large-scale data science on high-performance computing systems. Fox and his team plan to test their platform with several different applications, including geospatial information systems (GIS), biomedicine, epidemiology, and remote sensing.

    “Our innovative architecture integrates key features of open source cloud computing software with supercomputing technology,” Fox says. “And our outreach involves ‘data analytics as a service’ with training and curricula set up in a Massive Open Online Course, or MOOC.” Among others, US institutions collaborating on the project include Arizona State University in Phoenix; Emory University in Atlanta, Georgia; and Rutgers University in New Brunswick, New Jersey.

    Ken Koedinger, professor of human computer interaction and psychology at Carnegie Mellon University in Pittsburgh, Pennsylvania, US, leads the other early implementation project. Koedinger’s team concentrates on developing infrastructure that will drive innovation in education.

    The team will develop a distributed data infrastructure, LearnSphere, that will make more educational data accessible to course developers, while also motivating more researchers and companies to share their data with the greater learning sciences community.

    “We’ve seen the power that data has to improve performance in many fields, from medicine to movie recommendations,” Koedinger says. “Educational data holds the same potential to guide the development of courses that enhance learning while also generating even more data to give us a deeper understanding of the learning process.”

    The DIBBs program is part of a coordinated strategy within NSF to advance data-driven cyberinfrastructure. It complements other major efforts like the DataOne project, the Research Data Alliance, and Wrangler, a groundbreaking data analysis and management system for the national open science community.

    See the full article here.

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

    ScienceSprings relies on technology from

    MAINGEAR computers

    Lenovo

    Dell

     
  • richardmitnick 6:57 pm on July 23, 2014 Permalink | Reply
    Tags: isgtw

    From isgtw: “A case for computational mechanics in medicine” 

    international science grid this week

    July 23, 2014
    Monica Kortsha

    Members of the US National Committee on Theoretical and Applied Mechanics and collaborators, including Thomas Hughes, director of the computational mechanics group at the Institute for Computational Engineering and Sciences (ICES) at The University of Texas at Austin, US, and Shaolie Hossain, ICES research fellow and research scientist at the Texas Heart Institute, have published an article reviewing the new opportunities computational mechanics is creating in medicine.

    New treatments for tumor growth and heart disease are just two opportunities presenting themselves. The article is published in the Journal of the Royal Society Interface. “This journal truly serves as an interface between medicine and science,” Hossain says. “If physicians are looking for computational research advancements, the article is sure to grab their attention.”

    The article presents three research areas where computational medicine has already made important progress and will likely continue to do so: nano- and microdevices; biomedical devices, including diagnostic systems and organ models; and cellular mechanics.

    “[Disease is a] multi-scale phenomena and investigators research diverse aspects of it,” says Hossain, explaining that although disease may be perceived at an organ level, treatments usually function at the molecular and cellular scales.

    Hughes and Hossain’s research on vulnerable plaques (VPs), a category of atherosclerosis responsible for 70% of all lethal heart attacks, is an example of applied research incorporating all three notable areas.

    Hughes and Hossain pictured next to a simulation of a vulnerable plaque within an artery. Current medical techniques cannot effectively detect vulnerable plaques. However, Hughes and Hossain say that nano-particles and computational modeling technologies offer diagnostic and treatment solutions. Image courtesy the Institute for Computational Engineering and Sciences at The University of Texas at Austin, US.

    “The detection and treatment of VPs represents an enormous unmet clinical need,” says Hughes. “Progress on this has the potential to save innumerable lives. Computational mechanics combined with high-performance computing provides new and unique technologies for investigating disease, unlike anything that has been traditionally used in medical research.”

    HeartFlow uses anatomic data from coronary artery CT scans to create a 3D model of the coronary arteries. Coronary blood flow and pressure are computed by applying the principles of coronary physiology and computational fluid dynamics. Fractional flow reserve (FFRCT) is calculated as the ratio of distal coronary pressure to proximal aortic pressure, under conditions simulating maximal coronary hyperemia. The image demonstrates a stenosis (narrowing) of the left anterior descending coronary artery with an FFRCT of 0.58 distal to the stenosis (in red). FFR values ≤0.80 are hemodynamically significant (meaning they obstruct blood flow) and indicate that the patient may benefit from coronary revascularization (removing or bypassing blockages). Image courtesy HeartFlow.
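
    The FFRCT criterion in the caption is simple arithmetic once the simulated pressures are available. Below is a toy check of that ratio, not HeartFlow’s actual pipeline, and the pressure values are invented for illustration.

```python
def fractional_flow_reserve(p_distal_mmHg, p_aortic_mmHg, threshold=0.80):
    """FFR as described above: distal coronary pressure divided by proximal
    aortic pressure under simulated maximal hyperemia. Values at or below the
    threshold are treated as hemodynamically significant."""
    ffr = p_distal_mmHg / p_aortic_mmHg
    return ffr, ffr <= threshold

# Hypothetical reading matching the caption: FFR of roughly 0.58 distal to a stenosis.
print(fractional_flow_reserve(55.1, 95.0))  # prints roughly (0.58, True): significant
```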

    The high mortality rate attributed to VPs stems from their near clinical invisibility; conventional plaque detection techniques such as MRI and CT scanning do not register VPs because significant vascular narrowing is not present. Hughes and Hossain, however, have developed a computational toolset that can aid in making the plaques visible through targeted delivery of functionalized nanoparticles.

    Their computational models draw on patient-specific data to predict how well nanoparticles can adhere to a potential plaque, thus enabling researchers to test and refine site-specific treatments. If a VP is detected, the same techniques can be employed to send nanoparticles containing medicine directly to the VP.

    The models are being applied at the Texas Heart Institute, where Hossain is a research scientist and assistant professor. “Early intervention and prevention of heart attacks are where we certainly want to go and we are excited about the possibilities for computational mechanics being a vehicle to get us there safely and more rapidly,” says James Willerson, Texas Heart Institute president.

    Other computationally aided models are already being used to help physicians evaluate and treat patients. HeartFlow, a company founded by Charles Taylor, uses CT scan data to create patient-specific models of arteries, which can be used to diagnose coronary artery disease.

    Despite its success and demonstrated potential, computational mechanics in the medical field is still a new concept for scientists and physicians alike, says Hossain. “The potential that we have, in my opinion, hasn’t been tapped to the fullest because of the gap in knowledge.”

    To help integrate medicine into a field that has historically focused on more traditional engineering domains, the article advocates for incorporating biology and chemistry questions into computational mechanics classes, as well as offering classes that can benefit both medical and computational science students.

    See the full article here.

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 6:02 pm on October 2, 2013 Permalink | Reply
    Tags: isgtw

    From isgtw: “Preparing for tomorrow’s big data” 

    Until recently, the large CERN experiments, ATLAS and CMS, owned and controlled the computing infrastructure they operated on in the US, and accessed data only when it was locally available on the hardware they operated. However, [Frank] Würthwein, UC San Diego, explains, with data-taking rates set to increase dramatically by the end of LS1 in 2015, the current operational model is no longer viable to satisfy peak processing needs. Instead, he argues, large-scale processing centers need to be created dynamically to cope with spikes in demand. To this end, Würthwein and colleagues carried out a successful proof-of-concept study, in which the Gordon Supercomputer at the San Diego Supercomputer Center was dynamically and seamlessly integrated into the CMS production system to process a 125-terabyte data set.

    SDSC’s Gordon Supercomputer. Photo: Alan Decker. Gordon is part of the National Science Foundation’s (NSF) Extreme Science and Engineering Discovery Environment, or XSEDE program, a nationwide partnership comprising 16 supercomputers and high-end visualization and data analysis resources.

    See the full article here.

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 11:36 am on July 17, 2013 Permalink | Reply
    Tags: isgtw

    From isgtw: "Mystery solved: X-ray light emitted from black holes" 

    July 17, 2013
    Amber Harmon

    “Exactly how do black holes produce so many high-power X-rays? The answer has remained a mystery to scientists for decades – until now. Supported by 40 years of theoretical progress, astrophysicists have conducted research that finally bridges the gap between theory and observation, demonstrating that gas spiraling toward a black hole inevitably results in X-ray emissions.

    Published in May in The Astrophysical Journal, the study reveals that gas spiraling toward a black hole through an accretion disk (formed by material in orbit, typically around a star) heats up to roughly 10 million degrees Celsius. The main body of the disk is roughly 2,000 times hotter than the sun, and emits low-energy or “soft” X-rays. However, observations also detect “hard” X-rays, which produce up to 100 times higher energy levels. The collaborators showed for the first time that high-energy light emission is an inevitable outcome of gas being drawn into a black hole.

    As the quality and quantity of high-energy light observations improved over the years, increasing evidence showed that photons are created in a hot, tenuous region called the corona. This corona, boiling violently above the comparatively cool accretion disk, is similar to the corona surrounding the sun, which is responsible for much of the ultra-violet and X-ray luminosity seen in the solar spectrum.

    Collaborators on the study include Julian Krolik, professor of physics and astronomy at Johns Hopkins University in Maryland, US, Jeremy Schnittman, lead author and research astrophysicist at the NASA Goddard Space Flight Center in Maryland, US, and Scott Noble, an associate research scientist at the Center for Computational Relativity and Gravitation at Rochester Institute of Technology in New York, US.”

    See the full article here.

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 6:04 am on July 4, 2013 Permalink | Reply
    Tags: isgtw

    From isgtw: “Here comes the sun: Harvard and the World Community Grid light the way to affordable solar cells” 

    July 3, 2013
    Ceci Jones Schrock

    “The dream of a solar-powered world recently took a major step forward. On 24 June, in a joint announcement with the US White House and IBM, the Harvard Clean Energy Project released a database of 2.3 million organic compounds – 35,000 of which have promising properties for high-performance semiconductors – that solar-power developers could use to make affordable photovoltaic cells (PVCs). In the spirit of open science, the project provides the Harvard Clean Energy Project Database (CEPDB) free of charge to the public and researchers.

    Affordability is crucial to the success of solar power. The sun has always been up to the task of powering our planet, with enough sunlight reaching Earth every hour to supply our energy needs for an entire year. However, humans have yet to find a way to convert light into energy sufficiently cheaply and efficiently. Today’s commercially available solar cells are typically made of inorganic semiconductor materials, such as silicon, and require a lot of energy and capital to produce.

    The Clean Energy Project’s announcement is a potential game changer. Led by Harvard quantum chemist Alán Aspuru-Guzik, the project seeks to develop candidate molecules for high-performance solar cells made of plastic. These materials can be made in the form of sheets, films, and coatings — imagine powering your home with PVCs painted on the roof. But how do scientists determine which molecules – out of millions of possible combinations – will most efficiently absorb light and convert it into electricity?

    For Aspuru-Guzik, the answer was clear: run computer simulations through World Community Grid, the world’s largest nonprofit computing grid. This IBM-sponsored initiative allows anyone who owns a computer to install secure, free software that uses the machine’s spare compute resources when it is idle. All of the WCG’s sponsored projects help humanity in some way, and the scientific results are available in the public domain. In addition to the Clean Energy Project, volunteers with the World Community Grid are donating their computing time to help fight malaria and childhood cancer, find clean water, and cure muscular dystrophy.”
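
    Once a database like this is public, a typical first use is screening it for compounds whose predicted photovoltaic properties clear a threshold. The sketch below assumes a local CSV export with hypothetical column names ("smiles", "predicted_pce"); the real CEPDB schema may differ, so treat this purely as an illustration of the screening step.

```python
import csv

def promising_candidates(path, min_pce=10.0):
    """Filter a candidate-compound table for molecules whose predicted power
    conversion efficiency (PCE, percent) clears a screening threshold.
    Column names are hypothetical; match them to the actual database export."""
    keep = []
    with open(path, newline="") as fh:
        for row in csv.DictReader(fh):
            if float(row["predicted_pce"]) >= min_pce:
                keep.append((row["smiles"], float(row["predicted_pce"])))
    return sorted(keep, key=lambda item: -item[1])  # best candidates first

# Hypothetical usage against a local export of the database:
# top = promising_candidates("cepdb_export.csv", min_pce=10.0)
```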

    WCG

    See the full article here.

    All WCG projects run on BOINC software from the Space Sciences Laboratory at UC Berkeley.

    BOINC

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”


    ScienceSprings is powered by MAINGEAR computers

     