October 16, 2012
Glenn Roberts Jr.
“In a darkened barn in Sweden in 1941, astronomer Erik Holmberg constructed two identical sets of 37 lightbulbs, arranged in rings, to study the effects of a close encounter by two passing galaxies.
Using a light sensor connected to a device that measured electric current, Holmberg carefully charted, by hand, the light falling on each bulb as a stand-in for gravitational pull, since both light intensity and gravity fall off with the square of distance. As he moved the two sets of bulbs closer together, he noted the emergence of “spiral arm” patterns.
He correctly concluded that galaxies can cluster and merge together as a result of such close passages. The experiment demonstrated the power of simulations and visualizations in understanding complex astrophysical phenomena, even before the era of computing.
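The core of Holmberg's trick is that summed light intensity obeys the same inverse-square law as gravity. A minimal sketch of that idea follows; the ring layout and helper names (`ring_galaxy`, `net_attraction`) are illustrative assumptions, since the article does not give his exact bulb arrangement:

```python
import math

def ring_galaxy(cx, cy, n_bulbs=37, rings=(1.0, 2.0, 3.0)):
    """Lay out point 'bulbs' as one central bulb plus concentric rings,
    loosely echoing Holmberg's 37-bulb model galaxy (layout assumed)."""
    bulbs = [(cx, cy)]                      # central bulb
    per_ring = (n_bulbs - 1) // len(rings)  # split the rest evenly
    for r in rings:
        for k in range(per_ring):
            a = 2 * math.pi * k / per_ring
            bulbs.append((cx + r * math.cos(a), cy + r * math.sin(a)))
    return bulbs

def net_attraction(target, sources, eps=1e-9):
    """Net inverse-square pull on `target` from all source bulbs --
    the quantity Holmberg's photocell effectively measured as light."""
    fx = fy = 0.0
    tx, ty = target
    for sx, sy in sources:
        dx, dy = sx - tx, sy - ty
        r2 = dx * dx + dy * dy + eps        # eps avoids division by zero
        inv_r = 1.0 / math.sqrt(r2)
        fx += dx * inv_r / r2               # unit vector scaled by 1/r^2
        fy += dy * inv_r / r2
    return fx, fy
```

Placing a second galaxy to the right of the first and evaluating `net_attraction` at the first galaxy's center yields a pull pointing toward the companion, which is the quantity Holmberg recorded bulb by bulb.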
In a windowless room at the Kavli Institute for Particle Astrophysics and Cosmology [Stanford University], visitors wearing 3D glasses witness the grand gravitational interplay of two large galaxies passing in close proximity, their spiral arms swinging out like choreographed combatants. Then the galaxies collide in a burst of light, with the scattered bits circling back and joining a new, larger galaxy. The two-minute, highly detailed visualization encapsulates 2 billion years, incorporates 40 million particles, and plays out on a 123-inch screen. It offers an immersive, interactive way to dial back the universe’s clock and, by comparing the visualization with observations, refine calculations about how the universe has evolved.
To create a visualization, researchers today start with simulations that are based on theories and models, as well as an underlying question that they hope to answer. They can program in the laws of physics and wind back the clock to let their mini universe evolve—dusty clouds form into planets; stars and galaxies take shape as the universe expands; and dark matter spreads its invisible tendrils amongst it all.
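Stripped to its essentials, "programming in the laws of physics" means computing pairwise gravitational accelerations and advancing every particle one small time step at a time. The following is a minimal sketch, not any lab's production code; the softening length, time step, and unit choice (`G = 1`) are assumptions:

```python
import math

G = 1.0  # gravitational constant in simulation units (an assumption)

def accelerations(pos, mass, soft=0.05):
    """Pairwise inverse-square gravity with a softening term so that
    accelerations stay finite when two particles pass very close."""
    n = len(pos)
    acc = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            r2 = dx * dx + dy * dy + soft * soft
            inv_r3 = 1.0 / (r2 * math.sqrt(r2))
            acc[i][0] += G * mass[j] * dx * inv_r3
            acc[i][1] += G * mass[j] * dy * inv_r3
    return acc

def leapfrog_step(pos, vel, mass, dt=0.01):
    """One kick-drift-kick leapfrog step: the integrator that carries
    the mini universe from one snapshot to the next."""
    acc = accelerations(pos, mass)
    for i in range(len(pos)):
        vel[i][0] += 0.5 * dt * acc[i][0]   # half kick
        vel[i][1] += 0.5 * dt * acc[i][1]
        pos[i][0] += dt * vel[i][0]         # drift
        pos[i][1] += dt * vel[i][1]
    acc = accelerations(pos, mass)
    for i in range(len(pos)):
        vel[i][0] += 0.5 * dt * acc[i][0]   # half kick
        vel[i][1] += 0.5 * dt * acc[i][1]
    return pos, vel
```

Each call to `leapfrog_step` produces one "time step" of the kind the article describes; at supercomputer scale, the same loop runs over tens of millions of particles, and each snapshot is what the visualization software renders.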
Andy Nonaka, an applied mathematician in the Center for Computational Sciences and Engineering at Lawrence Berkeley National Laboratory, works with visualizations for complex simulations. These simulations can require supercomputing power equivalent to tens to hundreds of thousands of desktop computers, and the data from each time step can fill hundreds of gigabytes of memory.
The top supercomputers, such as the Hopper system [Cray XE6, with a peak performance of 1.28 petaflops, 153,216 compute cores, 212 terabytes of memory, and 2 petabytes of disk. Hopper placed number 5 on the November 2010 TOP500 supercomputer list.] at Berkeley Lab’s National Energy Research Scientific Computing Center, complete quadrillions of calculations per second at peak operation, with the combined power of hundreds of thousands of processor cores. Nonaka’s visualizations are typically less computing-intensive than the simulations themselves but can still require hundreds of processor cores to quickly render the graphics.
At Oak Ridge National Laboratory, researchers use one of the largest supercomputers [Cray XK7 Titan, which replaced the Jaguar system, a Cray XT5 later upgraded to the XK6, which placed 6th on the TOP500 list] in the world to run powerful simulations and visualizations of exploding stars, or supernovae, and other phenomena. There, a team of scientists recently used high-resolution visualizations to understand how some supernova explosions can lead to the formation of incredibly dense neutron stars, which measure only about 12 miles in diameter but have a mass greater than our sun’s, and pulsars, which are spinning neutron stars that spew brilliant streams of particles from their magnetic poles.”
There is a whole lot more to learn from the full article here.