Tagged: Computational research

  • richardmitnick 2:59 pm on October 22, 2014 Permalink | Reply
    Tags: Computational research, , ,   

    From iSGTW: “Laying the groundwork for data-driven science”


    international science grid this week

    October 22, 2014
    Amber Harmon

    The ability to collect and analyze massive amounts of data is rapidly transforming science, industry, and everyday life — but many of the benefits of big data have yet to surface. Interoperability, tools, and hardware are still evolving to meet the needs of diverse scientific communities.

    Image courtesy istockphoto.com.

    One of the US National Science Foundation’s (NSF’s) goals is to improve the nation’s capacity in data science by investing in the development of infrastructure, building multi-institutional partnerships to increase the number of data scientists, and augmenting the usefulness of data and the ease of using it.

    As part of that effort, the NSF announced $31 million in new funding to support 17 innovative projects under the Data Infrastructure Building Blocks (DIBBs) program. Now in its second year, the 2014 DIBBs awards support research in 22 states and touch on research topics in computer science, information technology, and nearly every field of science supported by the NSF.

    “Developed through extensive community input and vetting, NSF has an ambitious vision and strategy for advancing scientific discovery through data,” says Irene Qualters, division director for Advanced Cyberinfrastructure. “This vision requires a collaborative national data infrastructure that is aligned to research priorities and that is efficient, highly interoperable, and anticipates emerging data policies.”

    Of the 17 awards, two support early implementations of research projects that are more mature; the others support pilot demonstrations. Each is a partnership between researchers in computer science and other science domains.

    One of the two early implementation grants will support a research team led by Geoffrey Fox, a professor of computer science and informatics at Indiana University, US. Fox’s team plans to create middleware and analytics libraries that enable large-scale data science on high-performance computing systems. Fox and his team plan to test their platform with several different applications, including geospatial information systems (GIS), biomedicine, epidemiology, and remote sensing.

    “Our innovative architecture integrates key features of open source cloud computing software with supercomputing technology,” Fox says. “And our outreach involves ‘data analytics as a service’ with training and curricula set up in a Massive Open Online Course, or MOOC.”

    US institutions collaborating on the project include Arizona State University in Phoenix; Emory University in Atlanta, Georgia; and Rutgers University in New Brunswick, New Jersey, among others.

    Ken Koedinger, professor of human computer interaction and psychology at Carnegie Mellon University in Pittsburgh, Pennsylvania, US, leads the other early implementation project. Koedinger’s team concentrates on developing infrastructure that will drive innovation in education.

    The team will develop a distributed data infrastructure, LearnSphere, that will make more educational data accessible to course developers, while also motivating more researchers and companies to share their data with the greater learning sciences community.

    “We’ve seen the power that data has to improve performance in many fields, from medicine to movie recommendations,” Koedinger says. “Educational data holds the same potential to guide the development of courses that enhance learning while also generating even more data to give us a deeper understanding of the learning process.”

    The DIBBs program is part of a coordinated strategy within NSF to advance data-driven cyberinfrastructure. It complements other major efforts like the DataOne project, the Research Data Alliance, and Wrangler, a groundbreaking data analysis and management system for the national open science community.

    See the full article here.

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

    ScienceSprings relies on technology from

    MAINGEAR computers

    Lenovo

    Dell

     
  • richardmitnick 10:33 am on August 8, 2013 Permalink | Reply
    Tags: , , , Computational research, ,   

    From SLAC: “New Analysis Shows How Proteins Shift Into Working Mode” 

    August 8, 2013
    Mike Ross

    “In an advance that will help scientists design and engineer proteins, a team including researchers from SLAC and Stanford has found a way to identify how protein molecules flex into specific atomic arrangements required to catalyze chemical reactions essential for life.

    The achievement, published Sunday (Aug. 4, 2013) in Nature Methods, uses a new computer algorithm to analyze data from X-ray studies of crystallized proteins. Scientists were able to identify cascades of atomic adjustments that shift protein molecules into new shapes, or conformations.

    This 3-D figure of the enzyme dihydrofolate reductase (DHFR) shows the nine different areas where a small fluctuation in one part of this flexible molecule causes a sequence of atomic movements to propagate like falling dominoes. A new computer algorithm, CONTACT, identified these areas, which are colored red, yellow, green, orange, salmon, grey, light blue, dark blue and purple.

    ‘Proteins need to move around to do their part in keeping the organism alive,’ said Henry van den Bedem, first author on the paper and a researcher with the Structure Determination Core of the Joint Center for Structural Genomics (JCSG) at the SSRL Directorate of SLAC. ‘But often these movements are very subtle and difficult to discern. Our research is aimed at identifying those fluctuations from X-ray data and linking them to a protein’s biological functions. Our work provides important new insights, which will eventually allow us to re-engineer these molecular machines.'”

    Henry van den Bedem. (Matt Beardsley/SLAC)

    Central to the technique is a new computer algorithm, called CONTACT, that analyzes protein structures determined by room-temperature X-ray crystallography. Built upon an earlier algorithm created by van den Bedem, CONTACT detects how subtle features in the experimental data produced by changing conformations propagate through the protein, and it identifies regions within the protein where these cascades of small changes are likely to result in stable conformations.
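    As a rough illustration of the contact-network idea, here is a toy sketch in Python. It is not the CONTACT algorithm itself, which works on alternate atomic conformations fit to room-temperature diffraction data; this version simply takes one representative coordinate per residue, a hypothetical set of residues flagged as flexible, and an assumed contact cutoff, then groups the flagged residues into spatially connected clusters:

```python
from collections import defaultdict
import math

def contact_clusters(coords, flexible, cutoff=6.0):
    """Group flexible residues into clusters linked by spatial contacts.

    coords   -- {residue_id: (x, y, z)} one representative point per residue
    flexible -- set of residue ids showing alternate conformations
    cutoff   -- contact distance threshold in angstroms (assumed value)
    """
    # Build adjacency among the flexible residues only.
    adj = defaultdict(set)
    flex = sorted(flexible)
    for i, a in enumerate(flex):
        for b in flex[i + 1:]:
            if math.dist(coords[a], coords[b]) <= cutoff:
                adj[a].add(b)
                adj[b].add(a)

    # Connected components = candidate networks of coupled motions.
    seen, clusters = set(), []
    for start in flex:
        if start in seen:
            continue
        stack, component = [start], set()
        while stack:
            r = stack.pop()
            if r in component:
                continue
            component.add(r)
            stack.extend(adj[r] - component)
        seen |= component
        clusters.append(sorted(component))
    return clusters

# Tiny made-up example: residues 1-3 touch in a chain, residue 10 is isolated.
coords = {1: (0, 0, 0), 2: (4, 0, 0), 3: (8, 0, 0), 10: (50, 0, 0)}
print(contact_clusters(coords, flexible={1, 2, 3, 10}))  # -> [[1, 2, 3], [10]]
```

    The real method is considerably more sophisticated, but the output has the same flavor: groups of residues whose small motions plausibly propagate together.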

    The research team also included scientists from the University of California, San Francisco, and The Scripps Research Institute in La Jolla.

    See the full article here.

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 11:39 am on July 3, 2013 Permalink | Reply
    Tags: , Computational research,   

    From Fermilab: “Synergia pushes the state of the art” 

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Wednesday, July 3, 2013

    Jim Amundson, deputy head of the Computational Physics Department and leader of the Computational Physics for Accelerators Group, wrote this column.

    “In an era when the field of particle physics is looking to decide what accelerator projects to pursue, accelerator modeling expertise is of tremendous importance. Fermilab’s Synergia simulation tool is helping accelerator experts here and at CERN optimize their machines and plan for the future.

    A little over 10 years ago, a small accelerator modeling team at Fermilab received its first grant from the then newly established Scientific Discovery through Advanced Computing (SciDAC) program. The thrust of this grant was to combine state-of-the-art space charge calculations with similarly advanced software for single-particle beam dynamics—a capability that did not exist in this field at that point. This work requires the sort of advanced high-performance computing (HPC) platforms championed by the SciDAC program. We named our new program Synergia (Συνεργια)—the Greek word for synergy.

    Our first application of Synergia in 2002 was to improve the modeling and, ultimately, performance of the Fermilab Booster, when delivering protons for the Tevatron collider experiments and MiniBooNE was the lab’s highest priority. After our initial successes modeling the Booster, we have continued to use SciDAC to enhance Synergia through both funding and collaboration with other physicists and computer scientists. In the past decade Synergia has evolved into a general framework for the calculation of intensity-dependent effects in beam dynamics. And while running Synergia efficiently on 128 parallel processors used to seem like a major accomplishment, we have now demonstrated efficient running on 131,072 cores, keeping us at the leading edge of the rapidly changing field of HPC.”
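    For a rough sense of what efficient running on 131,072 cores demands, a minimal strong-scaling sketch helps. The core counts below come from the column above; the serial fractions are purely assumed for illustration and say nothing about Synergia's actual code:

```python
# Back-of-the-envelope strong-scaling estimate using Amdahl's law
# (illustrative only; serial fractions are assumed, not measured).
def speedup(cores, serial_fraction):
    """Amdahl's-law speedup for a fixed-size problem."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for cores in (128, 131072):
    for f in (1e-3, 1e-5):  # assumed non-parallelizable fractions of the work
        s = speedup(cores, f)
        print(f"{cores:>7} cores, serial fraction {f:g}: "
              f"speedup {s:>9,.0f}, parallel efficiency {s / cores:.0%}")
```

    At 128 cores a 0.1% serial fraction still gives roughly 89% efficiency; at 131,072 cores the same fraction caps the speedup near 1,000, which is why codes that scale to this level must eliminate essentially all serial work and communication bottlenecks.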

    See the full article here.

    Fermilab campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 7:33 pm on May 3, 2013 Permalink | Reply
    Tags: , Computational research,   

    From Berkeley Lab: “Brain Visualization Prototype Holds Promise for Precision Medicine” 


    Berkeley Lab

    From the Computational Research Division

    Berkeley Lab, UCSF and Oblong Industries Show Brain Browser at Summit

    “The ability to combine all of a patient’s neurological test results into one detailed, interactive “brain map” could help doctors diagnose and tailor treatment for a range of neurological disorders, from autism to epilepsy. But before this can happen, researchers need a suite of automated tools and techniques to manage and make sense of these massive complex datasets.

    Computational researchers from Berkeley Lab used existing computational tools to translate laboratory data collected at UCSF into 3D visualizations of brain structures and activity.

    To get an idea of what these tools would look like, computational researchers from the Lawrence Berkeley National Laboratory (Berkeley Lab) are working with neuroscientists from the University of California, San Francisco (UCSF). So far, the Berkeley Lab team has used existing computational tools to translate UCSF laboratory data into 3D visualizations of brain structures and activity. Earlier this year, Los Angeles-based Oblong Industries joined the collaboration and implemented a state-of-the-art, gesture-based navigation interface that allows researchers to interactively explore 3D brain visualizations with hand poses and movements.

    This is terrific new science.

    See the full article here.

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 10:14 am on February 13, 2013 Permalink | Reply
    Tags: , Computational research, , ,   

    From ESA Technology: “Silicon brains to oversee satellites” 

    European Space Agency

    XMM-Newton

    Herschel

    Planck

    13 February 2013
    No Writer Credit

    A beautiful and expensive sight: upwards of €6 million-worth of silicon wafers, crammed with the complex integrated circuits that sit at the heart of each and every ESA mission. Years of meticulous design work went into these tiny brains, empowering satellites with intelligence.

    Silicon wafers etched with integrated circuits for space missions. No image credit.

    The image shows a collection of six silicon wafers that contain some 14 different chip designs developed by several European companies during the last eight years with ESA’s financial and technical support.

    Each of these 20 cm-diameter wafers contains between 30 and 80 replicas of each chip, each one carrying up to about 10 million transistors or basic circuit switches.
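    A quick bit of arithmetic shows roughly what those figures imply for die size. The assumption that every wafer carries all 14 designs is purely for illustration (the article does not say how the designs are distributed), and edge loss and scribe lanes are ignored:

```python
import math

# Rough die-size arithmetic from the figures quoted above (illustrative).
wafer_diameter_mm = 200.0
wafer_area_mm2 = math.pi * (wafer_diameter_mm / 2) ** 2   # ~31,400 mm^2

designs = 14
for replicas_per_design in (30, 80):
    dies = designs * replicas_per_design
    print(f"{replicas_per_design} replicas per design -> ~{dies} dies/wafer, "
          f"average die area ~{wafer_area_mm2 / dies:.0f} mm^2")
```

    Either way the estimate lands in the tens of square millimetres per die.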

    To save money on the high cost of fabrication, various chips designed by different companies and destined for multiple ESA projects are crammed onto the same silicon wafers, etched into place at specialised semiconductor manufacturing plants or ‘fabs’, in this case LFoundry (formerly Atmel) in France.

    Once manufactured, the chips, still on the wafer, are tested. The wafers are then chopped up. They become ready for use when placed inside protective packages – just like standard terrestrial microprocessors – and undergo final quality tests.

    Through little metal pins or balls sticking out of their packages these miniature brains are then connected to other circuit elements – such as sensors, actuators, memory or power systems – used across the satellite.

    To save the time and money needed to develop complex chips like these, ESA’s Microelectronics section maintains a catalogue of chip designs, known as Intellectual Property (IP) cores, available to European industry through ESA licence.

    See the full article here.

    The European Space Agency (ESA), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 19 member states. Headquartered in Paris, ESA has a staff of more than 2,000. ESA’s space flight program includes human spaceflight, mainly through the participation in the International Space Station program, the launch and operations of unmanned exploration missions to other planets and the Moon, Earth observation, science, telecommunication as well as maintaining a major spaceport, the Guiana Space Centre at Kourou, French Guiana, and designing launch vehicles. ESA science missions are based at ESTEC in Noordwijk, Netherlands, Earth Observation missions at ESRIN in Frascati, Italy, ESA Mission Control (ESOC) is in Darmstadt, Germany, the European Astronaut Centre (EAC) that trains astronauts for future missions is situated in Cologne, Germany, and the European Space Astronomy Centre is located in Villanueva de la Cañada, Spain.

    ESA Technology


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 6:12 pm on February 10, 2013 Permalink | Reply
    Tags: , , , , Computational research,   

    From Argonne Lab: “New classes of magnetoelectric materials promise advances in computing technology” 

    News from Argonne National Laboratory

    February 7, 2013
    Jared Sagoff

    Although scientists have been aware that magnetism and electricity are two sides of the same proverbial coin for almost 150 years, researchers are still trying to find new ways to use a material’s electric behavior to influence its magnetic behavior, or vice versa.

    An illustration of a titanium-europium oxide cage lattice studied in the experiment. Image by Renee Carlson.

    Thanks to new research led by the U.S. Department of Energy’s Argonne National Laboratory, an international team of physicists has developed new methods for controlling magnetic order in a particular class of materials known as “magnetoelectrics.”

    Magnetoelectrics get their name from the fact that their magnetic and electric properties are coupled to each other. Because this physical link potentially allows control of their magnetic behavior with an electrical signal or vice versa, scientists have taken a special interest in magnetoelectric materials.

    ‘Electricity and magnetism are intrinsically coupled – they’re the same entity,’ said Philip Ryan, a physicist at Argonne’s Advanced Photon Source. ‘Our research is designed to accentuate the coupling between the electric and magnetic parameters by subtly altering the structure of the material.

    This new approach to cross-coupling magnetoelectricity could prove a key step toward the development of next-generation memory storage, improved magnetic field sensors, and many other applications long dreamed about. Unfortunately, scientists still have a long way to go in translating these findings into commercial devices.’

    ‘Instead of having just a ‘0’ or a ‘1,’ you could have a broader range of different values,’ Ryan said. ‘A lot of people are looking into what that kind of logic would look like.’

    A paper based on the research, “Reversible control of magnetic interactions by electric field in a single-phase material,” was published in Nature Communications.

    See the full article here.

    Argonne Lab Campus

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 1:50 pm on January 22, 2013 Permalink | Reply
    Tags: , Computational research, ,   

    From iSGTW: “The mystery of the slowing space probes”

    Find out how big data preservation helped to solve the Pioneer anomaly

    January 16, 2013
    Stefan Janusz

    It could have been the slowing of their on-board clocks. They could have been feeling the effects of dark energy. Or, perhaps, they provided just the evidence needed to support a theory of modified Newtonian dynamics – proposed to explain why spiral galaxies don’t lose their shape as they spin.

    Pioneer 10* (Image NASA Ames Research Center)

    Thermal radiation was emanating from the decaying radioisotopes that serve as the probes’ power sources, and this was producing a small amount of thermal recoil. Thermal recoil is a minuscule force that results from the emission of thermal photons from a surface. If the emission of these photons is unevenly distributed across the surface of a spacecraft, it can cause an imbalance in the forces acting on different parts of the spacecraft. This is exactly what happened in the case of Pioneer 10 and 11, because each probe’s radioactive power source is held on the end of a long boom to prevent it from interfering with sensor equipment.

    Larry Kellogg, a former Pioneer team member, had a hunch that the answer lay in careful reanalysis of the data. He had been preserving it for years, scrupulously transferring it from magnetic tape and magneto-optical disks that had been abandoned under a staircase at the NASA Jet Propulsion Laboratory onto a modern hard disk, where it could be more easily accessed. Forty gigabytes of Doppler data, combined with meticulous computer modeling of the craft (there were no CAD models for Pioneer 10 when it was launched 40 years ago), eventually identified thermal recoil as the most likely cause of the slowdown.
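    The back-of-the-envelope sketch below shows why a modest thermal anisotropy is enough to produce the observed slowdown. The acceleration figure of about 8.7 x 10^-10 m/s^2 is the commonly quoted value for the Pioneer anomaly, and the RTG power and spacecraft mass are rough ballpark numbers assumed for illustration; none of them come from this article:

```python
# Order-of-magnitude estimate of the thermal-recoil effect (illustrative only;
# power and mass are assumed ballpark values, not mission-measured figures).
c = 2.998e8          # speed of light, m/s
P_thermal = 2.5e3    # RTG thermal output, W (assumed)
mass = 250.0         # spacecraft mass, kg (assumed)
anomaly = 8.7e-10    # commonly quoted anomalous acceleration, m/s^2

# If a fraction f of the thermal power P is radiated preferentially in one
# direction, the recoil acceleration is a = f * P / (m * c). Solving for f:
f_needed = anomaly * mass * c / P_thermal
print(f"Anisotropic fraction needed: {f_needed:.1%}")   # roughly a few percent
```

    A few percent of a kilowatt-scale heat source, radiated slightly off-balance because of where the booms and equipment sit, is all it takes, which is why the careful thermal modeling described above could account for the anomaly without new physics.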

    See the full article here.

    *Pioneer 11 looks identical to Pioneer 10

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 1:00 pm on January 7, 2013 Permalink | Reply
    Tags: , , Computational research, ,   

    From ESA Space Engineering: “LEON: the space chip that Europe built” 

    ESA Space Engineering

    European Space Agency

    XMM-Newton

    Herschel

    Planck

    7 January 2013
    No Writer Credit

    Just like home computers, the sophisticated capabilities of today’s space missions are made possible by the power of their processor chips. ESA’s coming Alphasat telecom satellite, the Proba-V microsatellite, the Earth-monitoring Sentinel family and the BepiColombo mission to Mercury are among the first missions to use an advanced 32-bit microprocessor – engineered and built in Europe.

    Layout of the LEON2-FT chip, alias AT697

    All of them incorporate the new LEON2-FT chip, commercially known as the AT697. Engineered to operate within spacecraft computers, this microprocessor is manufactured by Atmel in France but originally designed by ESA.

    The underlying LEON design has also been made available to Europe’s space industry as the basis for company-owned ‘system-on-chip’ microprocessors optimised for dedicated tasks. For instance, Astrium is using it to create a space-based GPS/Galileo satnav receiver.

    LEON2-FT chip within Proba-2’s computer

    Independence from non-European parts is also a driver of our European Components Initiative, in place for the last decade, which is working with European industry to bring new components to market.

    See the full article here.

    The European Space Agency (ESA), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 19 member states. Headquartered in Paris, ESA has a staff of more than 2,000. ESA’s space flight program includes human spaceflight, mainly through the participation in the International Space Station program, the launch and operations of unmanned exploration missions to other planets and the Moon, Earth observation, science, telecommunication as well as maintaining a major spaceport, the Guiana Space Centre at Kourou, French Guiana, and designing launch vehicles. ESA science missions are based at ESTEC in Noordwijk, Netherlands, Earth Observation missions at ESRIN in Frascati, Italy, ESA Mission Control (ESOC) is in Darmstadt, Germany, the European Astronaut Centre (EAC) that trains astronauts for future missions is situated in Cologne, Germany, and the European Space Astronomy Centre is located in Villanueva de la Cañada, Spain.


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 11:12 am on January 7, 2013 Permalink | Reply
    Tags: Computational research, ,   

    From Fermilab: “Fermilab explores scientific use of cloud computing through FermiCloud” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Monday, Jan. 7, 2013
    Leah Hesla

    Ever on the lookout for ways to make scientists’ lives easier, members of the Scientific Computing Division continually explore how to provide researchers with more effective access to the laboratory’s computer systems. One of these undertakings is a project called FermiCloud, which aims to develop easy-to-use computing in two different but related areas: cloud services and virtual machines.


    See the full article here.

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 11:37 am on January 3, 2013 Permalink | Reply
    Tags: , , , Computational research,   

    From Berkeley Lab: “How Computers Push on the Molecules They Simulate” 


    Berkeley Lab

    Berkeley Lab bioscientists and their colleagues decipher a far-reaching problem in computer simulations

    January 03, 2013
    Paul Preuss

    Because modern computers have to depict the real world with digital representations of numbers instead of physical analogues, they must simulate the continuous passage of time by slicing it into small, discrete steps. This kind of simulation is essential in disciplines ranging from medical and biological research to new materials to fundamental questions in quantum mechanics, and the fact that it inevitably introduces errors is an ongoing problem for scientists.

    Dynamic computer simulations of molecular systems depend on finite time steps, but these introduce apparent extra work that pushes the molecules around. Using models of water molecules in a box, researchers have learned to separate this shadow work from the protocol work explicitly modeled in the simulations. No image credit.

    Scientists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have now identified and characterized the source of tenacious errors and come up with a way to separate the realistic aspects of a simulation from the artifacts of the computer method. The research was done by David Sivak and his advisor Gavin Crooks in Berkeley Lab’s Physical Biosciences Division and John Chodera, a colleague at the California Institute of Quantitative Biosciences (QB3) at the University of California at Berkeley. The three report their results in Physical Review X.
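    The shadow-work decomposition itself is the subject of the paper, but the underlying point, that a finite time step perturbs a simulated system in a step-size-dependent way, can be seen with a toy integrator. The sketch below is a plain velocity Verlet harmonic oscillator, not the water-box simulations described above; it shows the total-energy error shrinking roughly a hundredfold when the time step shrinks tenfold:

```python
# Toy illustration of time-step discretization error (not the paper's method):
# integrate a unit-mass, unit-frequency harmonic oscillator with velocity
# Verlet and record the worst total-energy error over a fixed simulated time.
def max_energy_error(dt, total_time=100.0):
    x, v = 1.0, 0.0                      # initial position and velocity
    e0 = 0.5 * v * v + 0.5 * x * x       # exact conserved energy
    worst = 0.0
    for _ in range(int(total_time / dt)):
        v += 0.5 * dt * (-x)             # half kick with force -x
        x += dt * v                      # drift
        v += 0.5 * dt * (-x)             # second half kick
        e = 0.5 * v * v + 0.5 * x * x
        worst = max(worst, abs(e - e0))
    return worst

for dt in (0.1, 0.01):
    print(f"dt = {dt:>4}: max |energy error| = {max_energy_error(dt):.2e}")
```

    The error scales roughly as the square of the time step, so halving the step quarters the error; real molecular dynamics codes face the same trade-off between accuracy and the cost of taking more, smaller steps.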

    See the full and very informative article here.
    A U.S. Department of Energy National Laboratory Operated by the University of California



    ScienceSprings is powered by MAINGEAR computers

     