From Science Node: “A record year for the Open Science Grid”

Courtesy Open Science Grid.

27 Apr, 2017
Greg Moore

Serving researchers across a wide variety of scientific disciplines, the Open Science Grid (OSG) weaves the national fabric of distributed high throughput computing.

Over the last 12 months, the OSG has handled over one billion CPU hours. These record numbers have transformed the face of science nationally.

“We just had a record week recently of over 30 million hours (close to 32.8 million) and the trend is pointing to frequent 30 million-hour weeks — it will become typical,” says Scott Teige, manager of OSG’s Grid Operations Center at Indiana University (IU).

“To reach 32.8 million, we need 195,000 cores running 24/7 for a week.”
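(As a rough check of that figure, using only the numbers in the quote: 195,000 cores × 24 hours/day × 7 days ≈ 32.76 million core-hours, consistent with the 32.8 million-hour week.)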

Teige’s job is to keep things running smoothly. The OSG Grid Operations Center provides operational support for users, developers, and system administrators. The center also handles real-time monitoring and problem tracking, grid service maintenance, security incident response, and information repositories.

Big and small

Where is all this data coming from? Teige explains that the largest amount of data is coming from the experiments associated with the Large Hadron Collider (LHC), for which the OSG was originally designed.

But the LHC is just part of the story. There are plenty of CPU cycles to go around, so opportunistic use has become a much larger focus. When OSG resources are not busy, scientists from many disciplines use those hours to revolutionize their science.

For example, the Structural Protein-Ligand Interactome (SPLINTER) project by the Indiana University School of Medicine predicts the interaction of thousands of small molecules with thousands of proteins using the three-dimensional structure of the bound complex between each pair of protein and compound.

By using the OSG, SPLINTER finds a quick and efficient solution to its computing needs — and develops a systems biology approach to target discovery.

The opportunistic resources deliver millions of CPU hours in a matter of days, greatly reducing simulation time. This allows researchers to identify small molecule candidates for individual proteins, or new protein targets for existing FDA-approved drugs and biologically active compounds.

“We serve virtual organizations (VOs) that may not have their own resources,” says Teige. “SPLINTER is a prime example of how we partner with the OSG to transform research — our resources alone cannot meet their needs.”

Hoosier nexus

Because Teige’s group is based at Indiana University, a lot of the OSG operational infrastructure is run out of the IU Data Center. And, because IU is an Extreme Science and Engineering Discovery Environment (XSEDE) resource, the university also handles submissions to the OSG.

OSG meets LHC. A view inside the Compact Muon Solenoid (CMS) detector, a particle detector on the LHC. The OSG was designed for the massive datasets generated in the search for particles like the Higgs boson. Courtesy Tighe Flanagan. (CC BY-SA 3.0)

That means scientists and researchers nationwide can connect both to XSEDE’s collection of integrated digital resources and services and to OSG’s opportunistic resources.

“We operate information services to determine states of resources used in how jobs are submitted,” says Teige. “We operate the various user interfaces like the GOC homepage, support tools, and the ticket system. We also operate a global file system called Oasis where files are deposited to be available for use in a reasonably short time span. And we provide certification services for the user community.”

From LHC big data to smaller opportunistic research computing needs, Teige’s team makes sure the OSG delivers the support researchers depend on, so discovery moves forward reliably and transparently.

See the full article here.

Please help promote STEM in your local schools.

STEM Education Coalition

Science Node is an international weekly online publication that covers distributed computing and the research it enables.

“We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”
