Updates from richardmitnick

  • richardmitnick 4:14 pm on May 24, 2017 Permalink | Reply

    From The Atlantic via SETI@home: “A Brief History of SETI@Home” 

    SETI@home

    The Atlantic

    Frank Drake (left), president of SETI (Search for Extraterrestrial Intelligence), reviews data from radio telescopes used to scan the universe for intelligent life.

    How astronomers deputized early internet users to help find alien civilizations.

    The year was 1999, and the people were going online. AOL, Compuserve, mp3.com, and AltaVista loaded bit by bit after dial-up chirps, on screens across the world. Watching the internet extend its reach, a small group of scientists thought a more extensive digital leap was in order, one that encompassed the galaxy itself. And so it was that before the new millennium dawned, researchers at the University of California released a citizen-science program called SETI@Home.

    The idea went like this: When internet-farers abandoned their computers long enough that a screen saver popped up, that saver wouldn’t be WordArt bouncing around, 3-D neon-metallic pipes installing themselves inch by inch, or a self-satisfied flying Windows logo. No. Their screens would be saved by displays of data analysis, showing which and how much data from elsewhere their CPUs were churning through during down-time. The data would come from observations of distant stars, conducted by astronomers searching for evidence of an extraterrestrial intelligence. Each participating computer would dig through SETI data for suspicious signals, possibly containing a “Hello, World” or two from aliens. Anyone with 28 kbps could be the person to discover another civilization.

    When the researchers launched SETI@Home, in May of ’99, they thought maybe 1,000 people might sign up. That number—and the bleaker view from outsiders, who said perhaps no one would join the crew—informed a poor decision: to set up a single desktop to farm out the data and take back the analysis.

    But the problem was, people really liked the idea of letting their computers find aliens while they did nothing except not touch the mouse. And for SETI@Home’s launch, a million people signed up. Of course, the lone data-serving desktop staggered. SETI@Home fell down as soon as it started walking. Luckily, now-defunct Sun Microsystems donated computers to help the program get back on its feet. In the years since, more than 4 million people have tried SETI@Home. Together, they make up a collective computing power that exceeds 2008’s premier supercomputer.

    But they have yet to find any aliens.

    SETI is a middle-aged science, with 57 years under its sagging belt. It began in 1960, when an astronomer named Frank Drake used an 85-foot radio telescope in Green Bank, West Virginia, to scan two Sun-like stars for signs of intelligent life—radio emissions the systems couldn’t produce on their own, like the thin-frequency broadcasts of our radio stations, or blips that repeated in a purposeful-looking way.

    Green Bank today: GBO radio telescope, Green Bank, West Virginia, USA

    Since then, scientists and engineers have used radio and optical telescopes to search much more of the sky—for those “narrowband” broadcasts, for fast pings, for long drones, for patterns distinguishing themselves from the chaotic background static and natural signals from stars and supernovae.
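
    At its core, this kind of radio SETI search is a spectral search: transform the sampled voltage stream, then flag frequency bins that stand far above the noise floor. Below is a minimal, hedged sketch in Python of the narrowband case; every number and name in it is illustrative, and real pipelines also correct for Doppler drift and excise terrestrial interference.

        # Toy narrowband search: flag FFT bins far above the noise floor.
        import numpy as np

        rng = np.random.default_rng(42)
        n, fs = 2**20, 2.5e6                 # samples and sampling rate (Hz), arbitrary
        t = np.arange(n) / fs
        x = rng.normal(size=n)               # receiver noise
        x += 0.02 * np.sin(2 * np.pi * 456_789.0 * t)  # hypothetical weak carrier

        power = np.abs(np.fft.rfft(x))**2 / n
        freqs = np.fft.rfftfreq(n, d=1/fs)
        floor = np.median(power)             # robust estimate of the noise floor
        hits = freqs[power > 30 * floor]     # bins ~30x above the floor
        print("candidate narrowband hits (Hz):", hits)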

    But the hardest part about SETI is that scientists don’t know where ET may live, or how ET’s civilization might choose to communicate. And so they have to look for a rainbow of possible missives from other solar systems, all of which move and spin at their own special-snowflake speeds through the universe. There’s only one way to do that, says Dan Werthimer, the chief SETI scientist at Berkeley and a co-founder of SETI@Home: “We need a lot of computing power.”

    In the 1970s, when Werthimer’s Berkeley colleagues launched a SETI project called SERENDIP, they sucked power from all the computers in their building, then the neighboring building. In a way, it was a SETI@Home prototype. In the decades that followed, they turned to supercomputers. And then, they came for your CPUs.

    The idea for SETI@Home originated at a cocktail party in Seattle, when computer scientist David Gedye asked a friend what it might take to excite the public about science. Could computers somehow do something similar to what the Apollo program had done? Gedye dreamed up the idea of “volunteer computing,” in which people gave up their hard drives for the greater good when those drives were idle, much like people give up their idle cars, for periods of time, to Turo (if Turo didn’t make money and also served the greater good). What might people volunteer to help with? His mind wandered to The X-Files, UFOs, hit headlines fronting the National Enquirer. People were so interested in all that. “It’s a slightly misguided interest, but still,” says David Anderson, Gedye’s graduate-school advisor at Berkeley. Interest is interest is interest, misguided or guided perfectly.

    But Gedye wasn’t a SETI guy—he was a computer guy—so he didn’t know if or how a citizen-computing project would work. He got in touch with astronomer Woody Sullivan, who worked at the University of Washington in Seattle. Sullivan turned him over to Werthimer. And Gedye looped in Anderson. They had a quorum, of sorts.

    Anderson, who worked in industry at the time, dedicated evenings to writing software that could take data from the Arecibo radio telescope, mother-bird it into digestible bits, send it to your desktop, command it to hunt for aliens, and then send the results back to the Berkeley home base. No small task.
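
    The article doesn’t show the pipeline itself, but the work-unit pattern it describes can be sketched in a few lines. Everything below (split_recording, analyze_chunk, the chunk size) is hypothetical, not SETI@Home’s actual code.

        # Toy version of the SETI@Home work-unit cycle: split a recording
        # into digestible chunks, hand each to an idle volunteer machine,
        # and collect the results back at home base.
        from queue import Queue

        def split_recording(samples, chunk_size=262_144):
            """Server side: mother-bird the data into digestible bits."""
            for i in range(0, len(samples), chunk_size):
                yield samples[i:i + chunk_size]

        def analyze_chunk(chunk):
            """Client side: stand-in for the signal hunt run during idle time."""
            return {"n_samples": len(chunk), "candidates": []}

        work_units, results = Queue(), []
        for chunk in split_recording(list(range(1_000_000))):
            work_units.put(chunk)

        while not work_units.empty():        # each loop pass stands in for one volunteer
            results.append(analyze_chunk(work_units.get()))
        print(len(results), "work units analyzed")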

    They raised some money—notably, $50,000 from the Planetary Society and $10,000 from a Paul Allen-backed company. But most of the work-hours, like the computer-hours they were soliciting, were volunteer labor. Out of necessity, they did hire a few people with operating-system expertise, to deal with the wonky screensaver behavior of both Windows and Macintosh. “It’s difficult trying to develop a program that’s intended to run on every computer in the world,” says Anderson.

    __________________________________________________________________
    Today, you can use BOINC to serve up your computer’s free time to develop malaria drugs, cancer drugs, HIV drugs.

    __________________________________________________________________

    And yet, by May 17, 1999, they were up, and soon after, they were running. And those million people in this world were looking for not-people on other worlds.

    One morning, early in the new millennium, the team came into the office and surveyed the record of what those million had done so far. In the previous 24 hours, the volunteers had done what would have taken a single desktop one thousand years to do. “Suppose you’re a scientist, and you have some idea, and it’s going to take 1,000 years,” says Anderson. “You’re going to discard it. But we did it.”

    After being noses-down to their keyboards since the start, it was their first feeling of triumph. “It was really a battle for survival,” says Anderson. “We didn’t really have time to look up and realize what an amazing thing we were doing.”

    Then, when they looked up again, at the SETI@Home forums, they saw something else: “It was probably less than a year after we started that we started getting notices about the weddings of people who met through SETI@Home,” says Eric Korpela, a SETI@Home project scientist and astronomer at Berkeley.

    The SETI astronomers began to collect more and different types of data, from the likes of the Arecibo radio telescope. Operating systems evolved. There were new signal types to search for, like pulses so rapid they would have seemed like notes held at pianissimo to previous processors. With all that change, they needed to update the software frequently. But they couldn’t put out a new version every few months and expect people to download it.

    Anderson wanted to create a self-updating infrastructure that would solve that problem—and be flexible enough that other, non-SETI projects could bring their work onboard and benefit from distributed computing. And so BOINC—Berkeley Open Infrastructure for Network Computing—was born.

    Today, you can use BOINC to serve up your computer’s free time to develop malaria drugs, cancer drugs, HIV drugs. You can fold proteins or help predict the climate. You can search for gravitational waves or run simulations of the heart’s electrical activity, or any of 30 projects. And you can now run BOINC on GPUs—graphical processing units, brought to you by gamers—and on Android smartphones. Nearly half a million people use the infrastructure now, making the whole system a 19-petaflop supercomputer, the third-largest megacalculator on the planet.

    Home computers have gotten about 100 times faster since 1999, thank God, and on the data distribution side, Berkeley has gotten about 10 times faster. They’re adding BOINC as a bandwidth-increasing option to the Texas Advanced Computing Center and nanoHUB, and also letting people sign up for volunteer computing, tell the system what they think are the most important scientific goals, and then have their computers be automatically matched to projects as those projects need time. It’s like OkCupid dating, for scientific research. BOINC and SETI@Home can do more work than ever.

    The thing is, though, they’ve already done a lot of work—so much work they can’t keep up with themselves. Sitting in a database are 7 billion possible alien signals that citizen scientists and their idle computers have already uncovered.

    Most of these are probably human-made interference: short-circuiting electric fences, airport radar, XM satellite radio, or a microwave opened a second too soon. Others are likely random noise that added up to a masquerade of significance. As Anderson says, “Random noise has the property that whatever you’re looking for, it eventually occurs. If you generate random letters, you eventually get the complete works of Shakespeare.” Or the emissions are just miscategorized natural signals.
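
    Anderson’s Shakespeare point is easy to make quantitative: the expected number of appearances of a specific k-letter string in n uniformly random letters is roughly (n - k + 1) / 26^k, so short patterns are everywhere while long ones are vanishingly rare. A quick back-of-the-envelope check:

        # Expected occurrences of a fixed k-letter pattern in a trillion
        # random letters: short patterns abound (false positives), long
        # ones essentially never appear.
        n = 10**12
        for k in (5, 10, 15):
            expected = (n - k + 1) / 26**k
            print(f"length-{k} pattern: ~{expected:.3g} expected occurrences")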

    Anderson has been working on a program called Nebula that will trawl that billions-and-billions-strong database, reject the interference, and upvote the best candidates that might—just might—be actual alien signals. Four thousand computers at the Max Planck Institute for Gravitational Physics in Germany help him narrow down the digital location of that holiest of grails. Once something alien in appearance pops up—say from around the star Vega—the software automatically searches the rest of the data. It finds all the other times, in the 18 years of SETI@Home history, that Arecibo or the recently added telescopes from a $100 million initiative called Breakthrough Listen have looked at Vega. Was the signal there then too? “We’re kind of hoping that the aliens are sending a constant beacon,” says Korpela, “and that every time a telescope passes over a point in the sky, we see it.”
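
    A toy version of that “search the rest of the data for the same spot on the sky” step might look like the sketch below. The data layout, match radius, and thresholds are assumptions for illustration, not the actual Nebula design.

        # Cross-check a promising candidate against an archive of past hits
        # at (roughly) the same sky position.
        from dataclasses import dataclass

        @dataclass
        class Candidate:
            ra_deg: float    # right ascension
            dec_deg: float   # declination
            mjd: float       # observation date
            snr: float       # signal-to-noise ratio

        def repeats(cand, archive, radius_deg=0.1, min_snr=10.0):
            """Return other detections near the same position (crude box match;
            a real match would scale RA by cos(dec) and use angular distance)."""
            return [c for c in archive
                    if abs(c.ra_deg - cand.ra_deg) < radius_deg
                    and abs(c.dec_deg - cand.dec_deg) < radius_deg
                    and c.mjd != cand.mjd
                    and c.snr >= min_snr]

        archive = [Candidate(279.23, 38.78, 51000.5, 14.2),  # hypothetical hits near Vega
                   Candidate(279.24, 38.79, 56700.1, 11.8)]
        best = Candidate(279.23, 38.78, 57900.0, 22.5)
        print(len(repeats(best, archive)), "prior detections at this position")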

    If no old data exists—or if the old data is particularly promising—the researchers request new telescope time and ask SETI colleagues to verify the signal with their own telescopes, to see if they can intercept that beacon, that siren, that unequivocal statement of what SETI scientists and SETI@Home participants hope is true: That we are not alone.

    So far, that’s a no-go. “We’ve never had a candidate so exciting that we call the director and say, ‘Throw everybody off the telescope,’” says Werthimer. “We’ve never had anything that resembles ET.”

    And partly for that reason, the SETI@Homers are now working on detecting “wideband” signals—ones that come at a spread spectrum of frequencies, like the beam-downs from DIRECTV. Humans (and by extension, extraterrestrials) can embed more information more efficiently in these spread-spectrum emissions. If the goal is to disseminate information, rather than just graffiti “We’re here!” on the fabric of spacetime, wideband is the way to go. And SETI scientists’ thinking goes like this: We’ve been looking mostly for purposeful, obvious transmissions, ones wrapped neatly for us. But we haven’t found any—which might mean they just aren’t there. Extraterrestrial communications might be aimed at members of their own civilizations, in which case they’re more likely to go the DIRECTV route, and we’re likely to find only the “leakage” of those communication lines.

    “If there really are these advanced civilizations, it’d be trivial to contact us,” says Werthimer. “They’d be landing on the White House—well, maybe not this White House. But they’d be shining a laser in Frank Drake’s eyes. I don’t see why they would make it so difficult that we would have to do all this hard stuff.”

    And so humans, and our sleeping computers, may have to eavesdrop on messages not addressed to us—the ones the aliens send to their own (for lack of a better word) people, and then insert ourselves into the chatter. “I don’t mean to interrupt,” we might someday say, “but I couldn’t help overhearing…” And because of SETI@Home and BOINC, it might be your laptop that gets that awkward conversation started.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The science of SETI@home
    SETI (Search for Extraterrestrial Intelligence) is a scientific area whose goal is to detect intelligent life outside Earth. One approach, known as radio SETI, uses radio telescopes to listen for narrow-bandwidth radio signals from space. Such signals are not known to occur naturally, so a detection would provide evidence of extraterrestrial technology.

    Radio telescope signals consist primarily of noise (from celestial sources and the receiver’s electronics) and man-made signals such as TV stations, radar, and satellites. Modern radio SETI projects analyze the data digitally. More computing power enables searches to cover greater frequency ranges with more sensitivity. Radio SETI, therefore, has an insatiable appetite for computing power.

    Previous radio SETI projects have used special-purpose supercomputers, located at the telescope, to do the bulk of the data analysis. In 1995, David Gedye proposed doing radio SETI using a virtual supercomputer composed of large numbers of Internet-connected computers, and he organized the SETI@home project to explore this idea. SETI@home was originally launched in May 1999.

    SETI@home is not a part of the SETI Institute

    The SETI@home screensaver

    To participate in this project, download and install the BOINC software on which it runs. Then attach to the project. While you are at BOINC, look at some of the other projects which you might find of interest.

    MAJOR PROJECTS RUNNING ON BOINC SOFTWARE

    SETI@home The search for extraterrestrial intelligence. (The project is described under “The science of SETI@home” above.)


    SETI@home is the birthplace of BOINC software. Originally, it ran only as a screensaver when the computer on which it was installed was doing no other work. With the power and memory available today, BOINC can run 24/7 without in any way interfering with other ongoing work.

    The famous SETI@home screen saver, a beauteous thing to behold.

    einstein@home The search for pulsars. “Einstein@Home uses your computer’s idle time to search for weak astrophysical signals from spinning neutron stars (also called pulsars) using data from the LIGO gravitational-wave detectors, the Arecibo radio telescope, and the Fermi gamma-ray satellite. Einstein@Home volunteers have already discovered more than a dozen new neutron stars, and we hope to find many more in the future. Our long-term goal is to make the first direct detections of gravitational-wave emission from spinning neutron stars. Gravitational waves were predicted by Albert Einstein almost a century ago, but have never been directly detected. Such observations would open up a new window on the universe, and usher in a new era in astronomy.”

    MilkyWay@Home “MilkyWay@Home uses the BOINC platform to harness volunteered computing resources, creating a highly accurate three-dimensional model of the Milky Way galaxy using data gathered by the Sloan Digital Sky Survey. This project enables research in both astroinformatics and computer science.”

    Leiden Classical “Join in and help to build a Desktop Computer Grid dedicated to general Classical Dynamics for any scientist or science student!”

    World Community Grid (WCG) World Community Grid is a special case at BOINC. WCG is part of the social initiative of IBM Corporation and the Smarter Planet. WCG has under its umbrella currently eleven disparate projects at globally wide ranging institutions and universities. Most projects relate to biological and medical subject matter. There are also projects for Clean Water and Clean Renewable Energy. WCG projects are treated respectively and respectably on their own at this blog. Watch for news.

    Rosetta@home “Rosetta@home needs your help to determine the 3-dimensional shapes of proteins in research that may ultimately lead to finding cures for some major human diseases. By running the Rosetta program on your computer while you don’t need it you will help us speed up and extend our research in ways we couldn’t possibly attempt without your help. You will also be helping our efforts at designing new proteins to fight diseases such as HIV, Malaria, Cancer, and Alzheimer’s….”

    GPUGrid.net “GPUGRID.net is a distributed computing infrastructure devoted to biomedical research. Thanks to the contribution of volunteers, GPUGRID scientists can perform molecular simulations to understand the function of proteins in health and disease.” GPUGrid is a special case in that all processor work done by the volunteers is GPU processing. There is no CPU processing, which is the more common processing. Other projects (Einstein, SETI, Milky Way) also feature GPU processing, but they offer CPU processing for those not able to do work on GPUs.


    These projects are just the oldest and most prominent projects. There are many others from which you can choose.

    There are currently some 300,000 users with about 480,000 computers working on BOINC projects. That is in a world of over one billion computers. We sure could use your help.


     
  • richardmitnick 3:46 pm on May 24, 2017 Permalink | Reply
    Tags: Early days

    From FNAL: “Early Tevatron design days” 

    FNAL II photo

    FNAL Art Image by Angela Gonzales

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    May 24, 2017
    Tom Nicol

    Fermilab technicians assemble a magnet spool piece for the Tevatron. Photo: Fermilab

    When I started at the lab in December 1977, work on the dipole magnets for the Tevatron was well under way in what was then called the Energy Doubler Department in the Technical Services Section.

    My first project was to work on the quadrupole magnets and spools, which hadn’t really been started yet. The spool is a special unit that attaches to each quadrupole and the adjacent dipole. It contains what we used to call “the stuff that wouldn’t fit anywhere else” – correction magnets and their power leads, quench stoppers to dump the energy from all the magnets, beam position monitors, relief valves, things like that.

    At the time, we were located in the Village in the old director’s complex, which now houses the daycare center. We had a large open area where the engineers, designers and drafters worked and a small conference room where we kept up-to-date models of some of the things we were working on.

    A team tests a magnet spool piece. Photo: Fermilab

    For several weeks we worked feverishly on the design of the quadrupole and spool combination — we in the design room and the model makers in the model shop on their full-scale models. We would work all week, then have a meeting with the lab director, Bob Wilson. Dr. Wilson would come out to see how we were doing, but more importantly to see what our designs looked like.

    It turns out he was very interested in that and very fussy that things — even those buried in the tunnel — looked just so.

    After every one of those meetings we’d walk back into the design room and tell everyone to tear up what we’d been working on and start over. The same would hold for the model makers. This went on for several weeks until Dr. Wilson was happy. We began to really dread going into those meetings, but in the end they served us very well.

    FNAL Tevatron

    FNAL/Tevatron map


    FNAL/Tevatron DZero detector


    FNAL/Tevatron CDF detector

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    FNAL Icon
    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 3:04 pm on May 24, 2017 Permalink | Reply
    Tags: Taiwan

    From Temblor: “M=5.0 Taiwan earthquake preceded by foreshock sequence” 

    temblor

    May 24, 2017
    David Jacobson

    Southwestern Taiwan was hit by a M=5.0 earthquake today. This quake was preceded by a foreshock sequence that lasted approximately 33 hours. (Photo from: holidayssg.com)

    At 9:10 p.m. local time today (24 May), a M=5.0 earthquake struck western Taiwan near the city of Chiayi, which is home to over 250,000 people. This earthquake was preceded by a foreshock sequence of five earthquakes beginning approximately 33 hours earlier. The foreshock sequence began with a M=3.6 and culminated with another M=3.6 five minutes before the M=5.0. Most earthquakes are not preceded by a foreshock sequence, making this quake rare. At this stage, there have been no reports of damage. According to the Taiwan Central Weather Bureau, the M=5.0 produced moderate shaking, which can rock buildings and cause slight damage, so it is possible that minor damage was sustained close to the epicenter. Should we hear any reports of damage, we will update this post.

    This Temblor map shows the location of the M=5.0 earthquake in Taiwan. In addition to the location from EMSC, the USGS location is also shown to illustrate the discrepancy in the catalogs. One of the earthquakes in the foreshock sequence is also shown.

    At this stage, there is a discrepancy between where the USGS and EMSC plot the location of today’s quake. The USGS has it in a stepover between the Chiuchiungkeng and Muchiliao-Liuchia faults, while EMSC has it just to the east of the Chiuchiungkeng Fault. The USGS location has been added to the Temblor map above so that this discrepancy can be seen (for any location outside the United States, Temblor shows EMSC data). The USGS has also produced a focal mechanism for this earthquake; it suggests both strike-slip and extensional components of slip, which is not consistent with the regional geology. Should a Taiwan focal mechanism come out, we will update this post.

    Based on the location shown in Temblor, this earthquake was likely associated with the Chiuchiungkeng Fault, a thrust fault within the southwestern foothills of Taiwan. Because of high slip rates associated with this fault, the region is believed to have a high probability of experiencing a large magnitude earthquake. This is verified when we look at the Taiwan Earthquake Model (see below). This model shows the likelihood of strong ground motion in the next 50 years.

    This figure shows the Taiwan Earthquake Model with recent earthquakes shown. The colors in the figure represent ground motion values (g) with a 10% likelihood of being exceeded in 50 years. This is the spectral acceleration at a period of 0.3 seconds (3.3 Hz).

    In addition to the Taiwan Earthquake Model, we can also consult the Global Earthquake Activity Rate (GEAR) model to see what earthquake magnitude is likely for this portion of Taiwan. This model, which uses global strain rates and seismicity since 1977, forecasts the likely earthquake magnitude in your lifetime for any location on Earth. From the Temblor map below, one can see that a M=7.5 earthquake is likely in your lifetime. Such a quake could be devastating to the country, as a significant portion of the country’s agriculture is grown in southwestern Taiwan, and a large earthquake could damage valuable resources. Should anything change regarding the location or focal mechanism from today’s earthquake, we will update this post.

    This Temblor map shows the Global Earthquake Activity Rate (GEAR) model for Taiwan. What can be seen from this figure is that the area around today’s earthquake is susceptible to M=7.5+ quakes. Such an earthquake would be devastating to the area.

    References
    European-Mediterranean Seismological Centre (EMSC)
    USGS
    Taiwan Earthquake Model (TEM)
    Taiwan Central Weather Bureau

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    You can help many citizen scientists in detecting earthquakes and getting the data to emergency services people in affected areas.
    QCN bloc

    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative for developing the world’s largest, low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide a better understanding of earthquakes and give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford, and a year at Caltech, the QCN project is moving to the University of Southern California Dept. of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer hosted computers into a real-time motion sensing network. QCN is one of many scientific computing projects that runs on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).


    The volunteer computers monitor vibrational sensors called MEMS accelerometers, and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. QCN’s servers sift through these signals, and determine which ones represent earthquakes, and which ones represent cultural noise (like doors slamming, or trucks driving by).
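
    QCN’s exact trigger logic isn’t spelled out here. A standard way to detect “strong new motions” in seismology, though, is a short-term-average/long-term-average (STA/LTA) test; the sketch below uses that technique as an assumption, with simplified centered windows rather than a production trailing-window design.

        # STA/LTA trigger sketch: fire when the short-term average of the
        # signal energy jumps well above the long-term average.
        import numpy as np

        def sta_lta_trigger(accel, sta_n=50, lta_n=1000, ratio=3.0):
            """Return sample indices where STA/LTA exceeds the trigger level."""
            energy = accel ** 2
            sta = np.convolve(energy, np.ones(sta_n) / sta_n, mode="same")
            lta = np.convolve(energy, np.ones(lta_n) / lta_n, mode="same")
            return np.nonzero(sta > ratio * np.maximum(lta, 1e-12))[0]

        rng = np.random.default_rng(0)
        trace = rng.normal(0.0, 0.01, 5000)            # quiet background
        trace[3000:3200] += rng.normal(0.0, 0.2, 200)  # sudden strong shaking
        hits = sta_lta_trigger(trace)
        print("first trigger at sample", hits[0] if hits.size else "none")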

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors. 1) By mounting them to the floor, they measure more reliable shaking than mobile devices. 2) These sensors typically have lower noise and better resolution of 3D motion. 3) Desktops are often left on and do not move. 4) The USB sensor is physically removed from the game, phone, or laptop, so human interaction with the device doesn’t reduce the sensors’ performance. 5) USB sensors can be aligned to North, so we know what direction the horizontal “X” and “Y” axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC is a leader in the fields of Distributed Computing, Grid Computing and Citizen Cyberscience. BOINC is more properly the Berkeley Open Infrastructure for Network Computing, developed at UC Berkeley.

    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. The Quake-Catcher Network links existing networked laptops and desktops in hopes of forming the world’s largest strong-motion seismic network.

    Below, the QCN Quake-Catcher Network map

     
  • richardmitnick 2:42 pm on May 24, 2017 Permalink | Reply
    Tags: New imaging technique aims to ensure surgeons completely remove cancer

    From Wash U: “New imaging technique aims to ensure surgeons completely remove cancer” 

    Wash U Bloc

    Washington University in St. Louis

    Caltech Logo

    Caltech

    May 17, 2017
    Tamara Bhandari
    tbhandari@wustl.edu

    1
    A new imaging technique based on light and sound produces images doctors can use to distinguish cancerous breast tissue (below the dotted blue line) from normal tissue more quickly than is currently possible. Pathologists routinely inspect surgical specimens to make sure all cancerous tissue has been removed. The new technique (right) produces images as detailed and accurate as traditional methods (left), but in far less time. The researchers are working to make the technique fast enough to be used during a surgery, so patients don’t have to return for a second surgery. (Image: Terence T.W. Wong)

    Of the quarter-million women diagnosed with breast cancer every year in the United States, about 180,000 undergo surgery to remove the cancerous tissue while preserving as much healthy breast tissue as possible.

    However, there’s no accurate method to tell during surgery whether all of the cancerous tissue has been successfully removed. The gold-standard analysis takes a day or more, much too long for a surgeon to wait before wrapping up an operation. As a result, about a quarter of women who undergo lumpectomies receive word later that they will need a second surgery because a portion of the tumor was left behind.

    Now, researchers at Washington University School of Medicine in St. Louis and California Institute of Technology report that they have developed a technology to scan a tumor sample and produce images detailed and accurate enough to be used to check whether a tumor has been completely removed.

    Called photoacoustic imaging, the new technology takes less time than standard analysis techniques. But more work is needed before it is fast enough to be used during an operation.

    The research is published May 17 in Science Advances.

    “This is a proof of concept that we can use photoacoustic imaging on breast tissue and get images that look similar to traditional staining methods without any sort of tissue processing,” said Deborah Novack, MD, PhD, an associate professor of medicine, and of pathology and immunology, and a co-senior author on the study.

    The researchers are working on improvements that they expect will bring the time needed to scan a specimen down to 10 minutes, fast enough to be used during an operation. The current gold-standard method of analysis, which is based on preserving the tissue and then staining it to make the cells easier to see, hasn’t gotten any faster since it was first developed in the mid-20th century.

    For solid tumors in most parts of the body, doctors use a technique known as a frozen section to do a quick check of the excised lump during the surgery. They look for a thin rim of normal cells around the tumor. Malignant cells at the margins suggest the surgeon missed some of the tumor, increasing the chances that the disease will recur.

    But frozen sections don’t work well on fatty specimens like those from the breast, so the surgeon must finish a breast lumpectomy without knowing for sure how successful it was.

    “Right now, we don’t have a good method to assess margins during breast cancer surgeries,” said Rebecca Aft, MD, PhD, a professor of surgery and a co-senior author on the study. Aft, a breast cancer surgeon, treats patients at Siteman Cancer Center at Barnes-Jewish Hospital and Washington University School of Medicine.

    Currently, after surgery a specimen is sent to a pathologist, who slices it, stains it and inspects the margins for malignant cells under a microscope. Results are sent back to the surgeon within a few days.

    To speed up the process, the researchers took advantage of a phenomenon known as the photoacoustic effect. When a beam of light of the right wavelength hits a molecule, some of the energy is absorbed and then released as sound in the ultrasound range. These sound waves can be detected and used to create an image.
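
    The depth information in such an image comes from simple time-of-flight arithmetic: sound crosses soft tissue at roughly 1,540 meters per second, so the delay between the laser pulse and the detected ultrasound locates the absorbing molecule. A minimal sketch, with generic values not taken from this study:

        # Convert laser-to-ultrasound delay into absorber depth.
        SPEED_OF_SOUND_TISSUE = 1540.0   # m/s, a typical soft-tissue value

        def depth_mm(delay_s):
            """Depth of the absorber below the detector, in millimeters."""
            return SPEED_OF_SOUND_TISSUE * delay_s * 1e3

        for delay_us in (0.5, 2.0, 6.5):
            print(f"{delay_us:4.1f} us delay -> {depth_mm(delay_us * 1e-6):5.2f} mm deep")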

    “All molecules absorb light at some wavelength,” said co-senior author Lihong Wang, who conducted the work when he was a professor of biomedical engineering at Washington University’s School of Engineering & Applied Science. He is now at Caltech. “This is what makes photoacoustic imaging so powerful. Essentially, you can see any molecule, provided you have the ability to produce light of any wavelength. None of the other imaging technologies can do that. Ultrasound will not do that. X-rays will not do that. Light is the only tool that allows us to provide biochemical information.”

    The researchers tested their technique by scanning slices of tumors removed from three breast cancer patients. For comparison, they also stained each specimen according to standard procedures.

    The photoacoustic image matched the stained samples in all key features. The architecture of the tissue and subcellular detail such as the size of nuclei were clearly visible.

    “It’s the pattern of cells – their growth pattern, their size, their relationship to one another – that tells us if this is normal tissue or something malignant,” Novack said. “Overall, the photoacoustic images had a lot of the same features that we see with standard staining, which means we can use the same criteria to interpret the photoacoustic imaging. We don’t have to come up with new criteria.”

    Having established that photoacoustic techniques can produce usable images, the researchers are working on reducing the scanning time.

    “We expect to be able to speed up the process,” Wang said. “For this study, we had only a single channel for emitting light. If you have multiple channels, you can scan in parallel and that reduces the imaging time. Another way to speed it up is to fire the laser faster. Each laser pulse gives you one data point. Faster pulsing means faster data collection.”

    Aft, Novack and Wang are applying for a grant to build a photoacoustic imaging machine with multiple channels and fast lasers.

    “One day we think we’ll be able to take a specimen straight from the patient, plop it into the machine in the operating room and know in minutes whether we’ve gotten all the tumor out or not,” Aft said. “That’s the goal.”

    This work was supported by the National Institutes of Health, grant numbers DP1 EB016986 and R01 CA186567, and by Washington University’s Siteman Cancer Center’s 2014 Research Development Award.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”


    Caltech campus

    Wash U campus

    Washington University’s mission is to discover and disseminate knowledge, and protect the freedom of inquiry through research, teaching, and learning.

    Washington University creates an environment to encourage and support an ethos of wide-ranging exploration. Washington University’s faculty and staff strive to enhance the lives and livelihoods of students, the people of the greater St. Louis community, the country, and the world.

     
  • richardmitnick 1:55 pm on May 24, 2017 Permalink | Reply
    Tags: Charm mesons and baryons

    From FNAL: “Fermilab measures lifetimes and properties of charm mesons and baryons”

    FNAL II photo

    FNAL Art Image by Angela Gonzales

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.


    May 24, 2017
    Troy Rummler

    Heavy quarks produced in high-energy collisions decay within a tiny fraction of a second, traveling less than a few centimeters from the collision point. To study properties of these particles, Fermilab began using microstrip detectors in the late 1970s. These detectors are made of thin slices of silicon and placed close to the interaction point in order to take advantage of the microstrip’s tremendous position resolution. Over time, Fermilab developed this technology, improving our understanding of silicon’s capabilities and adapting the technology to other detectors, including those at CDF and DZero.

    FNAL Tevatron

    FNAL/Tevatron map


    FNAL/Tevatron DZero detector


    FNAL/Tevatron CDF detector

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    FNAL Icon
    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 1:22 pm on May 24, 2017 Permalink | Reply

    From U Wisconsin IceCube: “IceCube sets new best limits for dark matter searches in neutrino detectors” 

    U Wisconsin IceCube South Pole Neutrino Observatory

    24 May 2017
    Sílvia Bravo

    Studies aimed at understanding the nature and origin of dark matter include experiments in astronomy, astrophysics and particle physics. Astronomical observations point to the existence of dark matter in large amounts and in many cosmic environments, including the Milky Way. However, at the same time, the international quest to detect a dark matter interaction has so far been unsuccessful.

    IceCube has proven to be a champion detector for indirect searches of dark matter using neutrinos. As the amount of data grows and a better understanding of the detector allows ever more precise measurements, the IceCube Collaboration continues to explore a vast range of dark matter energies and decay channels. In the most recent study, the collaboration sets the best limits on a neutrino signal from dark matter particles with masses between 10 and 100 GeV. These results have recently been submitted to the European Physical Journal C.

    Comparison of upper limits on ⟨σ_A v⟩, i.e., the velocity-averaged product of the dark matter self-annihilation cross section and the relative velocity of the dark matter particles, versus WIMP mass, for dark matter self-annihilating through taus to neutrinos. The ‘natural scale’ refers to the value that is needed for WIMPs to be a thermal relic. Credit: IceCube Collaboration.

    Searches for dark matter usually focus on a generic candidate, called a weakly interacting massive particle, or WIMP. Physicists expect WIMPs to interact with other matter particles or to self-annihilate, producing a cascade of known particles, which for many channels and energies include neutrinos that can be detected on Earth. If this is the case, a neutrino detector on Earth is expected to detect an excess of neutrinos related to the distribution of dark matter in our galaxy. A similar signal is expected for photons.
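
    For readers who want the scaling behind that expectation, the generic indirect-detection formula (textbook form, not necessarily the notation of the IceCube paper) ties the expected flux to the quantity limited in the figure above:

        \begin{equation}
          \frac{d\Phi_\nu}{dE} \;\propto\;
          \frac{\langle \sigma_A v \rangle}{m_\chi^{2}} \, \frac{dN_\nu}{dE}
          \int_{\Delta\Omega} \int_{\mathrm{l.o.s.}} \rho_\chi^{2}(r)\, dl\, d\Omega
        \end{equation}

    Here m_χ is the dark matter mass, dN_ν/dE the neutrino yield per annihilation, and ρ_χ the dark matter density; the ρ² line-of-sight integral is why the expected signal traces the distribution of dark matter in our galaxy.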

    “The enormous size of IceCube allows the rare detection of high-energy neutrinos, but it is also essential for the detection of neutrinos at lower energies as it serves to identify incoming muons produced in cosmic ray air showers, which is a major challenge in searching for a signal from the Southern Hemisphere,” explains Morten Medici, a PhD student at the Niels Bohr Institute in Denmark and corresponding author of this study.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    ICECUBE neutrino detector
    IceCube is a particle detector at the South Pole that records the interactions of a nearly massless sub-atomic particle called the neutrino. IceCube searches for neutrinos from the most violent astrophysical sources: events like exploding stars, gamma ray bursts, and cataclysmic phenomena involving black holes and neutron stars. The IceCube telescope is a powerful tool to search for dark matter, and could reveal the new physical processes associated with the enigmatic origin of the highest energy particles in nature. In addition, exploring the background of neutrinos produced in the atmosphere, IceCube studies the neutrinos themselves; their energies far exceed those produced by accelerator beams. IceCube is the world’s largest neutrino detector, encompassing a cubic kilometer of ice.

     
  • richardmitnick 1:11 pm on May 24, 2017 Permalink | Reply

    From JPL-Caltech: “NASA Moves Up Launch of Psyche Mission to a Metal Asteroid” 

    NASA JPL Banner

    JPL-Caltech

    May 24, 2017
    D.C. Agle
    Jet Propulsion Laboratory, Pasadena, Calif.
    818-393-9011
    agle@jpl.nasa.gov

    Karin Valentine
    Arizona State University School of Earth and Space Exploration, Tempe
    480-965-9345
    karin.valentine@asu.edu

    Laurie Cantillo
    NASA Headquarters, Washington
    202-358-1077
    laura.l.cantillo@nasa.gov

    Dwayne Brown
    NASA Headquarters, Washington
    202-358-1726
    dwayne.c.brown@nasa.gov

    This artist’s-concept illustration depicts the spacecraft of NASA’s Psyche mission near the mission’s target, the metal asteroid Psyche. The artwork was created in May 2017 to show the five-panel solar arrays planned for the spacecraft.
    The spacecraft’s structure will include power and propulsion systems to travel to, and orbit, the asteroid. These systems will combine solar power with electric propulsion to carry the scientific instruments used to study the asteroid through space.
    The mission plans launch in 2022 and arrival at Psyche, between the orbits of Mars and Jupiter, in 2026. This selected asteroid is made almost entirely of nickel-iron metal. It offers evidence about violent collisions that created Earth and other terrestrial planets.
    Image credit: NASA/JPL-Caltech/Arizona State Univ./Space Systems Loral/Peter Rubin

    Psyche, NASA’s Discovery Mission to a unique metal asteroid, has been moved up one year with launch in the summer of 2022, and with a planned arrival at the main belt asteroid in 2026 — four years earlier than the original timeline.

    “We challenged the mission design team to explore if an earlier launch date could provide a more efficient trajectory to the asteroid Psyche, and they came through in a big way,” said Jim Green, director of the Planetary Science Division at NASA Headquarters in Washington. “This will enable us to fulfill our science objectives sooner and at a reduced cost.”

    The Discovery program announcement of opportunity had directed teams to propose missions for launch in either 2021 or 2023. The Lucy mission was selected for the first launch opportunity in 2021, and Psyche was to follow in 2023. Shortly after selection in January, NASA gave the direction to the Psyche team to research earlier opportunities.

    Lucy

    “The biggest advantage is the excellent trajectory, which gets us there about twice as fast and is more cost effective,” said Principal Investigator Lindy Elkins-Tanton of Arizona State University in Tempe. “We are all extremely excited that NASA was able to accommodate this earlier launch date. The world will see this amazing metal world so much sooner.”

    The revised trajectory is more efficient, as it eliminates the need for an Earth gravity assist, which ultimately shortens the cruise time. In addition, the new trajectory stays farther from the sun, reducing the amount of heat protection needed for the spacecraft. The trajectory will still include a Mars gravity assist in 2023.

    “The change in plans is a great boost for the team and the mission,” said Psyche Project Manager Henry Stone at NASA’s Jet Propulsion Laboratory, Pasadena, California. “Our mission design team did a fantastic job coming up with this ideal launch opportunity.”

    The Psyche spacecraft is being built by Space Systems Loral (SSL), Palo Alto, California. In order to support the new mission trajectory, SSL redesigned the solar array system from a four-panel array in a straight row on either side of the spacecraft to a more powerful five-panel x-shaped design, commonly used for missions requiring more capability. Much like a sports car, by combining a relatively small spacecraft body with a very high-power solar array design, the Psyche spacecraft will speed to its destination at a faster pace than is typical for a larger spacecraft.

    “By increasing the size of the solar arrays, the spacecraft will have the power it needs to support the higher velocity requirements of the updated mission,” said SSL Psyche Program Manager Steve Scott.

    The Psyche Mission

    Psyche, an asteroid orbiting the sun between Mars and Jupiter, is made almost entirely of nickel-iron metal. As such, it offers a unique look into the violent collisions that created Earth and the terrestrial planets.

    The Psyche Mission was selected for flight earlier this year under NASA’s Discovery Program, a series of lower-cost, highly focused robotic space missions that are exploring the solar system.

    The scientific goals of the Psyche mission are to understand the building blocks of planet formation and explore firsthand a wholly new and unexplored type of world. The mission team seeks to determine whether Psyche is the core of an early planet, how old it is, whether it formed in similar ways to Earth’s core, and what its surface is like. The spacecraft’s instrument payload will include magnetometers, multispectral imagers, and a gamma ray and neutron spectrometer.

    For more information about NASA’s Psyche mission go to:

    http://www.nasa.gov/psyche

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA JPL Campus

    Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge, on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.

    Caltech Logo

    NASA image

     
  • richardmitnick 12:45 pm on May 24, 2017 Permalink | Reply
    Tags: Enter the ‘Synestia’

    From Centauri Dreams: “Enter the ‘Synestia’” 

    Centauri Dreams

    May 24, 2017
    Paul Gilster

    What happens when giant objects collide? We know the result will be catastrophic, as when we consider the possibility that the Moon was formed by a collision between the Earth and a Mars-sized object in the early days of the Solar System. But Sarah Stewart (UC-Davis) and Simon Lock (a graduate student at Harvard University) have produced a different possible outcome. Perhaps an impact between two infant planets would produce a single, disk-shaped object like a squashed doughnut, made up of vaporized rock and having no solid surface.

    Call it a ‘synestia,’ a coinage invoking the Greek goddess Hestia (goddess of the hearth, family, and domestic life, although the authors evidently drew on Hestia’s mythological connections to architecture). Stewart and Lock got interested in the possibility of such structures by asking about the effects of angular momentum, which would be conserved in any collision. Thus two giant bodies smashing into each other should result in the angular momentum of each being added together. Given enough energy (and there should be plenty), the hypothesized structure should form, an indented disk much larger than either planet.
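
    The conservation bookkeeping behind “added together” can be written out explicitly; this is a generic sketch with textbook symbols, not the paper’s notation:

        \begin{equation}
          \vec{L}_{\mathrm{post}} = \vec{L}_{\mathrm{spin},1} + \vec{L}_{\mathrm{spin},2}
            + \mu \, \vec{b} \times \vec{v}_{\mathrm{imp}},
          \qquad \mu = \frac{m_1 m_2}{m_1 + m_2}
        \end{equation}

    where b is the impact parameter and v_imp the relative velocity at contact. For giant impacts the orbital term is typically large, which is why the merged, vaporized body can carry more angular momentum than an ordinary corotating planet can hold.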

    Image: The structure of a planet, a planet with a disk and a synestia, all of the same mass. Credit: Simon Lock and Sarah Stewart

    The paper [AGU Journal of Geophysical Research] on this work notes that “…the structure of post-impact bodies influences the physical processes that control accretion, core formation and internal evolution. Synestias also lead to new mechanisms for satellite formation.” Moreover, Stewart and Lock believe that rocky planets are vaporized multiple times during their formation. Thus synestias should be a common outcome in young systems.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Tracking Research into Deep Space Exploration

    Alpha Centauri and other nearby stars seem impossible destinations not just for manned missions but even for robotic probes like Cassini or Galileo. Nonetheless, serious work on propulsion, communications, long-life electronics and spacecraft autonomy continues at NASA, ESA and many other venues, some in academia, some in private industry. The goal of reaching the stars is a distant one and the work remains low-key, but fascinating ideas continue to emerge. This site will track current research. I’ll also throw in the occasional musing about the literary and cultural implications of interstellar flight. Ultimately, the challenge may be as much philosophical as technological: to reassert the value of the long haul in a time of jittery short-term thinking.

     
  • richardmitnick 12:27 pm on May 24, 2017 Permalink | Reply
    Tags: Galaxy IC 342, NASA/DLR SOFIA

    From SOFIA: “Understanding Star Formation in the Nucleus of Galaxy IC 342” 

    NASA SOFIA Banner

    NASA SOFIA

    SOFIA (Stratospheric Observatory For Infrared Astronomy)

    May 23, 2017
    Nicholas A. Veronico
    NVeronico@sofia.usra.edu
    SOFIA Science Center
    NASA Ames Research Center
    Moffett Field, California

    A BIMA-SONG radio map of the IC 342 central molecular zone; dots indicate locations of SOFIA/GREAT observations.
    Credits: Röllig et al.

    An international team of researchers used NASA’s Stratospheric Observatory for Infrared Astronomy, SOFIA, to make maps of the ring of molecular clouds that encircles the nucleus of galaxy IC 342. The maps determined the proportion of hot gas surrounding young stars as well as cooler gas available for future star formation. The SOFIA maps indicate that most of the gas in the central zone of IC 342, like the gas in a similar region of our Milky Way Galaxy, is heated by already-formed stars, and relatively little is in dormant clouds of raw material.

    At a distance of about 13 million light years, galaxy IC 342 is considered relatively nearby. It is about the same size and type as our Milky Way Galaxy, and oriented face-on so we can see its entire disk in an undistorted perspective. Like our galaxy, IC 342 has a ring of dense molecular gas clouds surrounding its nucleus in which star formation is occurring. However, IC 342 is located behind dense interstellar dust clouds in the plane of the Milky Way, making it difficult to study by optical telescopes.

    The team of researchers from Germany and the Netherlands, led by Markus Röllig of the University of Cologne, Germany, used the German Receiver for Astronomy at Terahertz frequencies, GREAT, onboard SOFIA to scan the center of IC 342 at far-infrared wavelengths to penetrate the intervening dust clouds. Röllig’s group mapped the strengths of two far-infrared spectral lines – one line, at a wavelength of 158 microns, is emitted by ionized carbon, and the other, at 205 microns, is emitted by ionized nitrogen.

    The 158-micron line is produced both by cold interstellar gas that is the raw material for new stars, and also by hot gas illuminated by stars that have already finished forming. The 205-micron spectral line is only emitted by the hot gas around already-formed young stars. Comparing the strengths of the two spectral lines allows researchers to determine the amount of warm gas versus cool gas in the clouds.
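
    The [NII] 205-micron line comes only from ionized gas, so the decomposition the team performed can be sketched as a short calculation, assuming a representative ionized-gas [CII]/[NII] intensity ratio. The value below is a typical literature number, not necessarily the one used in the paper.

        # Split observed [CII] 158-micron emission into a part from hot,
        # ionized, star-heated gas and a part from cool, neutral raw material.
        R_ION = 4.0   # assumed [CII]158/[NII]205 ratio in purely ionized gas

        def split_cii(cii_158, nii_205, r_ion=R_ION):
            """Return (ionized fraction, neutral fraction) of the [CII] line."""
            cii_ionized = min(r_ion * nii_205, cii_158)
            return cii_ionized / cii_158, (cii_158 - cii_ionized) / cii_158

        f_ion, f_neutral = split_cii(cii_158=10.0, nii_205=1.5)  # hypothetical intensities
        print(f"ionized: {f_ion:.0%}, neutral: {f_neutral:.0%}")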

    Röllig’s team found that most of the ionized gas in IC 342’s central molecular zone (CMZ) is in clouds heated by fully formed stars rather than in cooler gas found farther out in the zone, like the situation in the Milky Way’s CMZ. The team’s research was published in Astronomy and Astrophysics, volume 591.

    “SOFIA and its powerful GREAT instrument allowed us to map star formation in the center of IC 342 in unprecedented detail,” said Markus Röllig of the University of Cologne, Germany. “These measurements are not possible from ground-based telescopes or existing space telescopes.”

    Researchers previously used SOFIA’s GREAT spectrometer for a corresponding study of the Milky Way’s CMZ. That research, published in 2015 by principal investigator W. D. Langer et al., appeared in the journal Astronomy & Astrophysics 576, A1; an overview of that study can be found here.

    For more information about SOFIA, visit:

    http://www.nasa.gov/sofia
    http://www.dlr.de/en/sofia

    For information about SOFIA’s science mission and scientific instruments, visit:

    http://www.sofia.usra.edu
    http://www.dsi.uni-stuttgart.de/index.en.html

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    SOFIA is a joint project of NASA and the German Aerospace Center (DLR). The aircraft is based at and the program is managed from NASA Armstrong Flight Research Center’s facility in Palmdale, California. NASA’s Ames Research Center, manages the SOFIA science and mission operations in cooperation with the Universities Space Research Association (USRA) headquartered in Columbia, Maryland, and the German SOFIA Institute (DSI) at the University of Stuttgart.

    NASA image

    DLR Bloc

     
  • richardmitnick 12:15 pm on May 24, 2017 Permalink | Reply
    Tags: HD 192163

    From Chandra: “Crescent Nebula: Live Fast, Blow Hard and Die Young” (from 2003, but worth it)

    NASA Chandra Banner

    NASA Chandra Telescope

    NASA Chandra

    October 14, 2003 [From before I was doing this. But worth it.]

    Credit: X-ray: NASA/UIUC/Y. Chu & R. Gruendl et al. Optical: SDSU/MLO/Y. Chu et al.

    Massive stars lead short, spectacular lives. This composite X-ray (blue)/optical (red and green) image reveals dramatic details of a portion of the Crescent Nebula, a giant gaseous shell created by powerful winds blowing from the massive star HD 192163 (a.k.a. WR 136, the star is out of the field of view to the lower right).

    After only 4.5 million years (one-thousandth the age of the Sun), HD 192163 began its headlong rush toward a supernova catastrophe. First it expanded enormously to become a red giant and ejected its outer layers at about 20,000 miles per hour. Two hundred thousand years later – a blink of the eye in the life of a normal star – the intense radiation from the exposed hot, inner layer of the star began pushing gas away at speeds in excess of 3 million miles per hour!

    When this high-speed “stellar wind” rammed into the slower red giant wind, a dense shell was formed. In the image, a portion of the shell is shown in red. The force of the collision created two shock waves: one that moved outward from the dense shell to create the green filamentary structure, and one that moved inward to produce a bubble of million-degree Celsius X-ray-emitting gas (blue). The brightest X-ray emission is near the densest part of the compressed shell of gas, indicating that the hot gas is evaporating matter from the shell. The massive star HD 192163 that has produced the nebula appears as the bright dot at the center of the full-field image.

    HD 192163 will likely explode as a supernova in about a hundred thousand years. This image enables astronomers to determine the mass, energy, and composition of the gaseous shell around this pre-supernova star. An understanding of such environments provides important data for interpreting observations of supernovas and their remnants.

    SDSU MLO Mount Laguna Observatory telescope, approximately 75 kilometers (47 mi) east of downtown San Diego, California (USA)

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA’s Marshall Space Flight Center in Huntsville, Ala., manages the Chandra program for NASA’s Science Mission Directorate in Washington. The Smithsonian Astrophysical Observatory controls Chandra’s science and flight operations from Cambridge, Mass.

     