Tagged: Basic Research

  • richardmitnick 8:00 pm on August 27, 2015 Permalink | Reply
Tags: Basic Research

    From Rockefeller: “Research identifies a protein that helps determine the fate of RNA” 

    Rockefeller U bloc

    Rockefeller University

    August 27, 2015
    Wynne Parry | 212-327-7789

    Ready to read: The newly identified protein, HNRNPA2B1 (green), which recognizes the m6A tag, is found within the nuclei (blue) of cells. Actin filaments, important elements of the cell’s structure, show up in red. Credit: Lisa Noble and Gloria Wu

    After it is transcribed from DNA, RNA can go on to many fates. While the most familiar path may lead directly to the production of protein, RNA molecules themselves can also become capable of altering the expression of genes. New research helps explain how the destiny of an RNA sequence is achieved.

    In a study published August 27 in Cell, Rockefeller University scientists and a colleague at Columbia University have identified a protein that recognizes a chemical instruction tag affixed to an RNA sequence, an important step in the decision-making process.

    “This tag, known as m6A, was identified more than four decades ago on RNA sequences. Since then, this abundant label has been implicated in several important processes that influence the production of proteins, as well as so-called microRNAs,” says study author Sohail Tavazoie, the Leon Hess Associate Professor and head of the Elizabeth and Vincent Meyer Laboratory of Systems Cancer Biology. (MicroRNAs are small RNA molecules that do not code for proteins, but instead turn down the activity of genes.)

    “However,” Tavazoie says, “a lingering question remained: What reads this chemical tag within the nucleus of cells? Claudio Alarcón, a research associate in my lab, has identified one such ‘reader’ protein. Because of the fundamental nature of the processes involved, this discovery has implications for cells’ normal function and for disease.”

    The tag, called m6A, is a methyl group attached to a particular part of an adenosine, a component of RNA’s sequence. In previous work, Alarcón and colleagues identified the m6A tag as an important regulator of the production of microRNAs. The “writer” protein that places this tag was already known to mark RNA molecules that need to be spliced before they are translated into proteins. Many genes contain sections that must be cut out, and the RNA splicing process is crucial to the function and identity of a cell.
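    To make “a methyl group attached to a particular part of an adenosine” concrete in sequence terms, here is a minimal sketch. It relies only on the widely reported DRACH consensus that surrounds methylated adenosines (D = A/G/U, R = A/G, H = A/C/U); it is an illustration of where candidate sites sit in a sequence, not the method used in the study, and the example sequence is made up:

    ```python
    import re

    # m6A marks tend to fall on the A inside a "DRACH" consensus motif:
    # D = A/G/U, R = A/G, then the methylatable A, then C, then H = A/C/U.
    # The lookahead lets overlapping motifs all be reported.
    DRACH = re.compile(r"(?=([AGU][AG]AC[ACU]))")

    def candidate_m6a_sites(rna):
        """Return 0-based positions of the methylatable A in each DRACH match."""
        return [m.start() + 2 for m in DRACH.finditer(rna)]

    # Hypothetical toy sequence: the As at positions 2 and 9 sit in DRACH motifs.
    sites = candidate_m6a_sites("GGACUAAGGACA")
    ```

    A scan like this only nominates candidates; pinning down which adenosines actually carry the tag, and which proteins bind there, is what the experiments in the paper were for.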

    The recent experiments show that the newly discovered reader, a protein known as HNRNPA2B1, recognizes m6A tags on RNA destined for two separate fates: trimming to become microRNAs, or splicing for proper production of protein. After recognizing m6A tags on microRNA precursors, HNRNPA2B1 recruits the cutting machinery responsible for further trimming and processing those RNA molecules. Future work is required to understand how HNRNPA2B1 interacts with the proteins involved in the splicing of RNA.

    The researchers suspected that the HNRNPA2B1 protein acts as a reader because they found that it frequently binds to the same sites on the RNA molecule where the m6A tag attaches. To determine what this putative reader was doing there, the team reduced its presence in cells.

    In cells with reduced HNRNPA2B1 levels, they found a shift in the expression of microRNAs overall, with many microRNAs reduced. They also looked at the effects on RNA destined for splicing. Here too, they found telling changes in the splicing of different RNA molecules that are dependent on m6A tags.

    HNRNPA2B1 is the first m6A nuclear reader to be identified, and evidence from the experiments suggests the existence of additional readers within the nucleus that also recognize this tag.

    “The discovery of this new m6A reader has ramifications for a broad range of processes,” Alarcón says. “RNA splicing establishes the repertoire of proteins available in cells. Meanwhile, abnormalities in microRNAs have been associated with several diseases, including cancer. This work also contributes to growing evidence that information beyond the sequence of RNA, such as chemical modifications, can determine the ultimate fate and function of RNA molecules.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Rockefeller U Campus

    The Rockefeller University is a world-renowned center for research and graduate education in the biomedical sciences, chemistry, bioinformatics and physics. The university’s 76 laboratories conduct both clinical and basic research and study a diverse range of biological and biomedical problems with the mission of improving the understanding of life for the benefit of humanity.

    Founded in 1901 by John D. Rockefeller, the Rockefeller Institute for Medical Research was the country’s first institution devoted exclusively to biomedical research. The Rockefeller University Hospital was founded in 1910 as the first hospital devoted exclusively to clinical research. In the 1950s, the institute expanded its mission to include graduate education and began training new generations of scientists to become research leaders around the world. In 1965, it was renamed The Rockefeller University.

  • richardmitnick 7:43 pm on August 27, 2015 Permalink | Reply
    Tags: Basic Research, Building a CubeSat

    From ASU: “How a university went into space: ASU’s story” 

    ASU Bloc


    August 27th, 2015
    Scott Seckel

    A Mars rover replica at ASU. The university has played roles in 25 missions to eight planets, three asteroids, two moons and the sun.
    Photo by: ASU

    Only 30 institutions in the United States can build spacecraft. Only seven build interplanetary spacecraft that leave Earth’s orbit.

    Arizona State University is one of them.

    ASU’s space program is in elite company. And this week’s CubeSat mission announcement adds to the university’s stellar resume: It will be the first time ASU will lead an interplanetary science expedition.

    It’s not the university’s first outing by a long shot, however.

    ASU has played roles in 25 missions to eight planets, three asteroids, two moons and the sun.

    The School of Earth and Space Exploration was created in 2006. As an institution, however, ASU’s space program started much earlier. This is the story of how a traditional geology program merged with the astronomy side of the physics department and grew into a powerhouse that builds spacecraft.

    Rocks and fighter jocks

    ASU’s space exploration origins lie in the quest to send men to the moon in the 1960s. Ron Greeley, one of the founders of planetary geology, was working at NASA, helping select landing sites for the Apollo missions and assisting in geologic training for astronauts.

    Back in the Apollo days, science was incidental to missions. Engineers, who just wanted to put boots on the moon, frequently clashed with scientists, who wanted to do at least some science as long as they were going all that way.

    One famous story illustrating the rift centered on a geologist who suggested a rock hammer be included in an astronaut’s tool bag. “But we took one of those on the last mission!” an engineer exploded.

    Early astronauts tended to be fighter jocks who weren’t much interested in rocks either. Greeley succeeded in teaching them to describe rocks with more sophistication than simply “big” or “little,” and to differentiate between an interesting rock and a more prosaic sample.

    “He was trying to get them to think about the geology and the rocks and what to look for when they got to the moon,” said Phil Christensen, a Regents Professor of geological sciences in ASU’s School of Earth and Space Exploration. “If you listen to the transcripts of those astronauts, Ron and others who trained them did a fantastic job. There were a few (astronauts) who were classic test pilots, Navy guys on an adventure and, oh, I picked up a few rocks. Most of them did a good job.”

    In 1977, Greeley was hired at ASU and focused his research on data from early robotic NASA missions. He received a number of honors during his career, from an asteroid named for him (30785 Greeley) in 1988 to numerous NASA awards.

    From rocks to gadgets

    If Greeley was the father of ASU’s space program, Christensen is the founder of what the program has become.

    Back in 1981, Greeley hired Christensen as a young postdoc who was starting to get involved in space missions. Christensen won a big NASA grant to put an instrument on one of the Mars orbiters.

    He’s since become a Regents Professor (top tenured faculty who have made significant contributions to their field) and is the director of the Mars Space Flight Facility in SESE.

    Greeley was a brilliant field geologist and planetary scientist, but he wasn’t an instrument guy, Christensen said.

    “Ron was a pioneer in looking at the data that came back from these probes, looking at images of the moon and Mars and analyzing them, thinking about them,” he said. “He had no interest in building the instruments, building the cameras, building the spectrometers. … He was on the team, he had access to the data, he was a leader in the field, but he was mostly looking at data that existed and doing the usual science. That’s what ASU did. They didn’t build anything.”

    And when Christensen won a huge contract to build an instrument in the early 1980s, hardly anyone at the university jumped for joy. In fact, the reaction was nervousness, and uncertainty about where to put the new operation.

    Christensen asked an associate dean for office space.

    “He said, ‘Well, there’s a couple of filing cabinets you can have.’ They just didn’t get it. We had this 10, 20 million dollar contract. It was the biggest contract ASU had ever done. They had no idea how to do it. They had no idea how to deal with an aerospace company. So to go from someone offering me two file cabinets to (the current space program and state-of-the-art facilities) … there’s been a lot of changes at this university. It’s been really amazing to watch this grow.”

    Jim Bell is a professor in SESE, the deputy principal investigator of the LunaH-Map CubeSat mission, and director of the NewSpace Initiative at ASU.

    LunaH-Map CubeSat

    The latter is a program that connects students and faculty doing space-related work with outside entities doing the same thing. They range “from SpaceX to a couple of teenagers in a garage,” Bell said. “Where do they need our help? Can you do a mission for 1 percent of the cost of a big NASA mission?” (They don’t know the answer to that yet.)

    Until now, ASU’s space program has revolved around making instruments that are snapped up by NASA. ASU faculty have been involved with all of NASA’s robotic missions.

    “NASA knows us scientifically, but also from an engineering standpoint,” said Bell, who has built several cameras currently on Mars or in space.

    And that is because of Christensen and Greeley.

    “Those two guys were part of the bedrock foundation of the NASA work here at ASU,” Bell said.

    How to woo NASA

    In the early 1980s, NASA picked the University of Arizona to run a Mars mission. That university asked Christensen if he could build an instrument for it.

    At the same time, defense contractor Raytheon shut down the Santa Barbara facility where Christensen had been working for ASU. Three or four of his colleagues became available. He thought if they came in, and ASU helped out, an instrument could be built at ASU. The instrument they wanted was very similar to one they had already built.

    “It was a perfect storm,” Christensen said. “We were one instrument that was part of a bigger project. It wasn’t a huge risk to NASA to pick the UofA to run this mission and one of the instruments will be built at ASU. It was very similar to what we’d built before. … It was fortuitous that everything came together just right.”

    They worked their tails off for five years.

    “This was a one-shot deal,” Christensen said. “Reputation works both ways. If we screw this up, they’re never going to talk to ASU again. Fifteen people on this project took that really seriously. Not just their careers; ASU had spent a lot of money on this building and these facilities. There was a lot riding on us succeeding. People took a lot of pride in this succeeding. And it did.”

    The campus where spacecraft are built

    ASU had no place to build instruments or spacecraft when Christensen landed at the university in the early 1980s.

    “Now we can build a NASA flight-quality instrument in this building,” he said. “Ten years ago we would have laughed: ‘We can’t do that. We don’t have the facilities, the people, the credibility.’ But we’ve done it. And now because of that, people are coming to me to build them instruments for Europa and other missions.

    NASA Europa

    Jim Bell can say we can build and test cameras here. We have new faculty coming in. Ten years from now there will be several people building instruments in this building. ASU will eventually win a Discovery-class mission.”

    NASA’s Discovery missions are low-cost missions within the solar system with narrow focus. (Cost is relative in space. Discovery missions still cost what an average person would consider a vast sum, but they’re cheap compared with anything involving people being present.)

    “A NASA mission is 90 percent about the process,” Christensen said. “How do you do it? How do you make it work? All things you have to do, all the people working together, keeping them together, keeping them from killing each other – to me that’s half the fun. … Within NASA, like a lot of other places, it’s all about reputation. Can you do it? Once you can, that’s a huge step. Suddenly you’re building more, and people come because of that. It sort of mushrooms.”

    And the university’s physical investment in its space program has come a long way from two battered filing cabinets.

    The 300,000-square-foot Interdisciplinary Science & Technology Building IV (or ISTB4, in local parlance) opened in 2012. It boasts labs, clean rooms, offices, high bays, a 250-seat auditorium and one of two mission operations centers on campus.

    “My colleagues at any other institution come here and they’re jealous,” Christensen said. Last week a Jet Propulsion Lab delegation met with Christensen at the space building. They were jealous, too.

    “It takes money to make money,” Christensen said. “You build a facility like this, it pays for itself. NASA does not want you building stuff out of spit and baling wire. When they come here and see this, they say, ‘You guys are for real.’ ”

    Incidentally, 40 countries can build spacecraft, but only four can build interplanetary spacecraft. That puts ASU ahead of most countries in that respect.

    The clean rooms in the ASU space building are about the size of a small high school gym.

    “That’s where we’ll build the (LunaH-Map) spacecraft,” Bell said.

    It has the usual desks, monitors and chairs. What isn’t usual are the two vacuum chambers, one the size of a packing crate and the other about the size of a Volkswagen bus. They’re used to simulate space conditions. The lab team can pump the air out of a chamber, drop the temperature toward absolute zero (minus 459.67 degrees Fahrenheit), and see how what they’ve built stands up to space conditions.
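    That Fahrenheit figure is just the standard conversion of absolute zero (0 kelvin); a one-line sketch of the arithmetic:

    ```python
    def kelvin_to_fahrenheit(k):
        """Convert a temperature in kelvin to degrees Fahrenheit."""
        return k * 9.0 / 5.0 - 459.67

    # Absolute zero, 0 K, works out to -459.67 F, the figure quoted above;
    # for a sanity check, 273.15 K (the freezing point of water) gives 32 F.
    coldest = kelvin_to_fahrenheit(0.0)
    ```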

    “You turn it into outer space,” Bell said. “It’s pretty rare for a college campus (to be able to test instruments in that environment). Only a handful of campuses around the country have that capability. Typically you only find that in NASA centers and big aerospace companies.”

    Working together, beating things up

    Space system engineer Jekan Thanga came to ASU two years ago, attracted by the school and the space program. He specializes in robots, artificial evolution, exploration of extreme environments, and CubeSats, the small spacecraft like the one ASU is sending to the moon. (He is the chief engineer on the project.)

    The institute’s collaborative nature drew Thanga here. It’s not a conventional aerospace environment. A scientist can walk down the hall, tell an engineer like Thanga he needs to get data from somewhere really nasty and inaccessible, and the engineer can figure out how to make a machine that will go there, survive and get the data home.

    “To the engineering world, it’s a radical departure,” Thanga said. “There is determination here.”

    Thanga and his team spend a lot of time in the clean rooms. They have put machines inside the vacuum chambers, thrown in a bunch of dust and rocks, and cranked them up to see how they fared. (If you were put in one, your eyeballs would pop, the blood in your veins would boil, and eventually you’d boil away. Outer space is a tough place.)

    It’s not uncommon to come into the clean rooms at 7 a.m. on a Saturday and find grad students working on projects. About 15 to 20 people are working on all aspects of design and development at any given time.

    The cutting edge of space exploration

    It’s a far cry from the ’60s, when engineers fought scientists. Now they are in the same building, unseparated by distance or bureaucratic walls.

    “The cutting edge of space exploration is that it’s not good enough to just tell somebody to go build a camera and show up and use it later,” Bell said. “You really have to have your goals in mind while that instrument is on paper. You really have to dive in and become an optics expert. I’ve got to work with optics experts and electrical engineers and all that because I want to make a certain measurement to a certain level of accuracy in a certain environment.

    “The more I can partner with people who understand the engineering and the guts of the electronics, the better my experiments will be. Building those people into the department that is my home at the university is just incredibly efficient and wonderful.”

    Mars rocks

    Some 40 years after Greeley’s time, NASA comes to ASU’s door.

    “When you do things well – really, really well – people notice,” Christensen said. “It’s not just me. ‘Oh, ASU can build those instruments.’ And that flows over to Jim and Craig (Hardgrove, principal investigator on the lunar CubeSat mission) and Erik (Asphaug, working on how to perform a CAT scan on a comet) and Linda (Elkins-Tanton, school director). We’ve built ASU’s reputation.”

    The Mars Rover helped a lot too, he said.

    “Being world leaders in something as visible as exploring Mars got a lot of attention to ASU that leveraged a lot of things going on here now,” Christensen said. “A lot of science is fabulous but, I’m sorry, landing on Mars is not the same as discovering a new type of plastic for Coke bottles; OK, great. Landing on Mars gets you on the cover of magazines.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    ASU is the largest public university by enrollment in the United States.[11] Founded in 1885 as the Territorial Normal School at Tempe, the school underwent a series of changes in name and curriculum. In 1945 it was placed under control of the Arizona Board of Regents and was renamed Arizona State College.[12][13][14] A 1958 statewide ballot measure gave the university its present name.
    ASU is classified as a research university with very high research activity (RU/VH) by the Carnegie Classification of Institutions of Higher Education, one of 78 U.S. public universities with that designation. Since 2005 ASU has been ranked among the Top 50 research universities, public and private, in the U.S. based on research output, innovation, development, research expenditures, number of awarded patents and awarded research grant proposals. The Center for Measuring University Performance currently ranks ASU 31st among top U.S. public research universities.[15]

    ASU awards bachelor’s, master’s and doctoral degrees in 16 colleges and schools at five locations: the original Tempe campus, the West campus in northwest Phoenix, the Polytechnic campus in eastern Mesa, the Downtown Phoenix campus and the Colleges at Lake Havasu City. ASU’s “Online campus” offers 41 undergraduate degrees, 37 graduate degrees and 14 graduate or undergraduate certificates, earning ASU a Top 10 rating for Best Online Programs.[16] ASU also offers international academic program partnerships in Mexico, Europe and China. ASU is accredited as a single institution by The Higher Learning Commission.

    ASU Tempe Campus

  • richardmitnick 3:17 pm on August 27, 2015 Permalink | Reply
    Tags: Basic Research

    From CfA: “Interstellar Seeds Could Create Oases of Life” 

    Smithsonian Astrophysical Observatory

    August 27, 2015
    Christine Pulliam
    Media Relations Manager
    Harvard-Smithsonian Center for Astrophysics


    We only have one example of a planet with life: Earth. But within the next generation, it should become possible to detect signs of life on planets orbiting distant stars. If we find alien life, new questions will arise. For example, did that life arise spontaneously? Or could it have spread from elsewhere? If life crossed the vast gulf of interstellar space long ago, how would we tell?

    New research by Harvard astrophysicists shows that if life can travel between the stars (a process called panspermia), it would spread in a characteristic pattern that we could potentially identify.

    “In our theory, clusters of life form, grow, and overlap like bubbles in a pot of boiling water,” says lead author Henry Lin of the Harvard-Smithsonian Center for Astrophysics (CfA).

    There are two basic ways for life to spread beyond its host star. The first would be via natural processes such as gravitational slingshotting of asteroids or comets. The second would be for intelligent life to deliberately travel outward. The paper does not deal with how panspermia occurs. It simply asks: if it does occur, could we detect it? In principle, the answer is yes.

    The model assumes that seeds from one living planet spread outward in all directions. If a seed reaches a habitable planet orbiting a neighboring star, it can take root. Over time, the result of this process would be a series of life-bearing oases dotting the galactic landscape.

    “Life could spread from host star to host star in a pattern similar to the outbreak of an epidemic. In a sense, the Milky Way galaxy would become infected with pockets of life,” explains CfA co-author Avi Loeb.

    If we detect signs of life in the atmospheres of alien worlds, the next step will be to look for a pattern. For example, in an ideal case where the Earth is on the edge of a “bubble” of life, all the nearby life-hosting worlds we find will be in one half of the sky, while the other half will be barren.
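    The half-sky signature can be sketched with a toy simulation. Everything here is made up for illustration (a 2-D star field, an arbitrary hop distance, a fixed number of spreading generations); it is not the authors’ model, only a demonstration that an observer on the edge of a contiguous “bubble” of life sees all the other life-bearing stars crowded into one half of the sky:

    ```python
    import math
    import random

    random.seed(42)

    # Toy 2-D "galaxy": one seed star at the center plus stars at random positions.
    stars = [(50.0, 50.0)] + [
        (random.uniform(0, 100), random.uniform(0, 100)) for _ in range(1999)
    ]

    # Panspermia as a nearest-neighbour epidemic: each generation, life hops
    # from every infected star to all stars within HOP distance, growing a
    # contiguous bubble around the seed.
    HOP, GENERATIONS = 4.0, 5
    alive = {0}
    frontier = {0}
    for _ in range(GENERATIONS):
        nxt = set()
        for i in frontier:
            xi, yi = stars[i]
            for j, (xj, yj) in enumerate(stars):
                if j not in alive and math.hypot(xj - xi, yj - yi) <= HOP:
                    alive.add(j)
                    nxt.add(j)
        frontier = nxt

    # Put the observer on the bubble's edge (the life-bearing star farthest
    # from the bubble's centroid) and ask what fraction of the other living
    # stars lie in the inward-facing half of its sky.
    cx = sum(stars[i][0] for i in alive) / len(alive)
    cy = sum(stars[i][1] for i in alive) / len(alive)
    edge = max(alive, key=lambda i: math.hypot(stars[i][0] - cx, stars[i][1] - cy))
    ex, ey = stars[edge]
    others = [i for i in alive if i != edge]
    same_half = sum(
        1 for i in others
        if (stars[i][0] - ex) * (cx - ex) + (stars[i][1] - ey) * (cy - ey) >= 0
    )
    fraction = same_half / len(others) if others else 1.0
    ```

    For an edge observer the fraction comes out at essentially 1.0: every point of a bubble lies within 90 degrees of the inward direction, which is exactly the “one half of the sky populated, the other half barren” pattern described above.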

    Lin and Loeb caution that a pattern will only be discernible if life spreads somewhat rapidly. Since stars in the Milky Way drift relative to each other, stars that are neighbors now won’t be neighbors in a few million years. In other words, stellar drift would smear out the bubbles.

    This research has been accepted for publication in The Astrophysical Journal Letters.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    About CfA

    The Center for Astrophysics combines the resources and research facilities of the Harvard College Observatory and the Smithsonian Astrophysical Observatory under a single director to pursue studies of those basic physical processes that determine the nature and evolution of the universe. The Smithsonian Astrophysical Observatory (SAO) is a bureau of the Smithsonian Institution, founded in 1890. The Harvard College Observatory (HCO), founded in 1839, is a research institution of the Faculty of Arts and Sciences, Harvard University, and provides facilities and substantial other support for teaching activities of the Department of Astronomy. The long relationship between the two organizations, which began when the SAO moved its headquarters to Cambridge in 1955, was formalized by the establishment of a joint center in 1973. The CfA’s history of accomplishments in astronomy and astrophysics is reflected in a wide range of awards and prizes received by individual CfA scientists.

    Today, some 300 Smithsonian and Harvard scientists cooperate in broad programs of astrophysical research supported by Federal appropriations and University funds as well as contracts and grants from government agencies. These scientific investigations, touching on almost all major topics in astronomy, are organized into the following divisions, scientific departments and service groups.

  • richardmitnick 3:04 pm on August 27, 2015 Permalink | Reply
    Tags: Basic Research

    From U Texas McDonald Observatory: “Dying Stars Suffer from ‘Irregular Heartbeats’” 

    McDonald Observatory bloc

    McDonald Observatory

    26 August 2015

    White Dwarf Outburst

    Keaton Bell

    Some dying stars suffer from ‘irregular heartbeats,’ research led by astronomers at The University of Texas at Austin and the University of Warwick has discovered.

    The team discovered rapid brightening events — outbursts — in two otherwise normal pulsating white dwarf stars. Ninety-seven percent of all stars, including the Sun, will end their lives as extremely dense white dwarfs after they exhaust their nuclear fuel. Such outbursts have never been seen in this type of star before.

    “It’s the discovery of an entirely new phenomenon,” said graduate student Keaton Bell of The University of Texas at Austin. Bell reported the first pulsating white dwarf to show these outbursts, KIC 4552982, in a recent issue of The Astrophysical Journal.

    This week, a team led by recent University of Texas PhD J.J. Hermes, now of the University of Warwick, is reporting the second white dwarf to show this trait: PG1149+057. Hermes’ team includes Bell and others from The University of Texas. Their research is published in the current Astrophysical Journal Letters.

    Both white dwarf discoveries were made using data from the Kepler space mission.

    NASA Kepler Telescope

    The Kepler spacecraft trails Earth in its orbit around the Sun, recording time-lapse movies of a few patches of sky for months on end.

    The Kepler data show that in addition to the regular rhythm of pulsations expected from a white dwarf, which cause the star to get a few percent brighter and fainter every few minutes, both stars also experienced arrhythmic, massive outbursts every few days, breaking their regular pulse and significantly heating up their surfaces for many hours.
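    The contrast between the regular pulse and an outburst can be sketched with a toy light curve. The numbers below are illustrative only, loosely matching the scales quoted in the article (a few-percent pulsation every few minutes, a roughly 15% outburst lasting hours); they are not the Kepler data:

    ```python
    import math
    import random

    random.seed(1)

    # Toy white-dwarf light curve: a 3% pulsation with a 10-minute period,
    # sampled once per minute for five days, plus one 15% outburst lasting
    # six hours on day two, with a little measurement noise.
    MINUTES = 5 * 24 * 60
    START, LENGTH = 2 * 24 * 60, 6 * 60          # outburst window (minutes)
    flux = []
    for t in range(MINUTES):
        f = 1.0 + 0.03 * math.sin(2 * math.pi * t / 10.0)
        if START <= t < START + LENGTH:
            f += 0.15                            # the "irregular heartbeat"
        flux.append(f + random.gauss(0.0, 0.005))

    # Flagging the outburst is easy once you can stare uninterrupted: points
    # more than 10% above the median flux sit far outside the normal
    # pulsation-plus-noise envelope.
    median = sorted(flux)[MINUTES // 2]
    flagged = [t for t, f in enumerate(flux) if f > 1.10 * median]
    ```

    In this sketch every flagged minute falls inside the injected outburst window, which is the point of the quote that follows: the regular few-percent heartbeat never crosses the threshold, so the arrhythmic events stand out unambiguously.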

    “We have essentially found rogue waves in a pulsating star, akin to ‘irregular heartbeats,’” Hermes explained. “These were truly a surprise to see: We have been watching pulsating white dwarfs for more than 50 years now from the ground, and only by being able to stare uninterrupted for months from space have we been able to catch these events.”

    Bell elaborated: “When we build a telescope that observes the sky in an entirely new way, we’re going to end up discovering things that we never expected.” Though Kepler’s notoriety derives from its prowess as a planet hunter, “it’s told us at least as much about stars as it has about planets,” Bell said.

    White dwarfs have been known to pulsate for decades, and some are exceptional clocks, with pulsations that have kept nearly perfect time for more than 40 years. Pulsations are believed to be a naturally occurring stage when a white dwarf reaches the right temperature to generate a mix of partially ionized hydrogen atoms at its surface.

    That mix of excited atoms can store up and then release energy, causing the star to resonate with pulsations characteristically every few minutes. Astronomers can use the regular periods of these pulsations just like seismologists use earthquakes on Earth, to see below the surface of the star into its exotic interior. This was why astronomers targeted these stars with Kepler, hoping to learn more about their dense cores. In the process, they caught these unexpected outbursts.

    “These are highly energetic events, which can raise the star’s overall brightness by more than 15% and its overall temperature by more than 750 degrees in a matter of an hour,” Hermes said. “For context, the Sun will only increase in overall brightness by about 1% over the next 100 million years.”

    There is a narrow range of surface temperatures where pulsations can be excited in white dwarfs, and so far irregularities have only been seen in the coolest of those that pulsate. Thus, these irregular outbursts may not be just an oddity; they have the potential to change the way astronomers understand how pulsations, the regular heartbeats, ultimately cease in white dwarfs.

    “The theory of stellar pulsations has long failed to explain why pulsations in white dwarfs stop at the temperature we observe them to,” Texas’ Keaton Bell said. “That both stars exhibiting this new outburst phenomenon are right at the temperature where pulsations shut down suggests that the outbursts could be the key to revealing the missing physics in our pulsation theory.”

    Astronomers are still trying to settle on an explanation for these outbursts. Given the similarity between the first two stars to show this behavior, they suspect it might have to do with how the pulsation waves interact with themselves, perhaps via a resonance.

    “Ultimately, this may be a new type of nonlinear behavior that is triggered when the amplitude of a pulsation passes a certain threshold, perhaps similar to rogue waves on the open seas here on Earth, which are massive, spontaneous waves that can be many times larger than average surface waves,” Hermes said. “Still, this is a fresh discovery from observations, and there may be more to these irregular stellar heartbeats than we can imagine yet.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    McDonald Observatory Campus

    Telescopes Are Windows To the Universe

    Astronomers use them to study everything from the asteroids and planets in our own solar system to galaxies billions of light-years away in space and time. Though they bring the mysteries of the universe to us, their workings are anything but mysterious. They gather and focus light from objects in the sky, so that it can be directed into an instrument attached to the telescope, and ultimately, studied in detail by a scientist. At McDonald Observatory, we have several telescopes, built at various times since the Observatory’s founding in the 1930s.

    Here is an introduction to the telescopes that McDonald Observatory astronomers use for their research:

    McDonald Observatory Hobby-Eberly Telescope
    Hobby-Eberly Telescope

    McDonald Observatory Harlan J Smith Telescope
    Harlan J. Smith Telescope

    McDonald Observatory Otto Struve telescope
    Otto Struve Telescope

    McDonald Observatory .8 meter telescope
    0.8-meter Telescope

    McDonald Observatory .9 meter telescope
    0.9-meter Telescope

    McDonald Observatory Rebecca Gale Telescope Park
    Rebecca Gale Telescope Park

  • richardmitnick 2:51 pm on August 27, 2015 Permalink | Reply
    Tags: Basic Research

    From NERSC: “NERSC, Cray Move Forward With Next-Generation Scientific Computing” 

    NERSC Logo

    April 22, 2015
    Jon Bashor, jbashor@lbl.gov, 510-486-5849

    The Cori Phase 1 system will be the first supercomputer installed in the new Computational Research and Theory Facility now in the final stages of construction at Lawrence Berkeley National Laboratory.

    The U.S. Department of Energy’s (DOE) National Energy Research Scientific Computing Center (NERSC) and Cray Inc. announced today that they have finalized a new contract for a Cray XC40 supercomputer that will be the first NERSC system installed in the newly built Computational Research and Theory facility at Lawrence Berkeley National Laboratory.


    This supercomputer will serve as Phase 1 of NERSC’s next-generation system, named “Cori” in honor of biochemist and Nobel laureate Gerty Cori. Expected to be delivered this summer, the Cray XC40 supercomputer will feature the Intel Haswell processor. The second phase, the previously announced Cori system, will be delivered in mid-2016 and will feature the next-generation Intel Xeon Phi™ processor, “Knights Landing,” a self-hosted manycore processor with on-package high-bandwidth memory that offers more than 3 teraflop/s of double-precision peak performance per single-socket node.

    NERSC serves as the primary high performance computing facility for the Department of Energy’s Office of Science, supporting some 6,000 scientists annually on more than 700 projects. This latest contract represents the Office of Science’s ongoing commitment to supporting computing to address challenges such as developing new energy sources, improving energy efficiency, understanding climate change and analyzing massive data sets from observations and experimental facilities around the world.

    “This is an exciting year for NERSC and for NERSC users,” said Sudip Dosanjh, director of NERSC. “We are unveiling a brand new, state-of-the-art computing center and our next-generation supercomputer, designed to help our users begin the transition to exascale computing. Cori will allow our users to take their science to a level beyond what our current systems can do.”

    “NERSC and Cray share a common vision around the convergence of supercomputing and big data, and Cori will embody that overarching technical direction with a number of unique, new technologies,” said Peter Ungaro, president and CEO of Cray. “We are honored that the first supercomputer in NERSC’s new center will be our flagship Cray XC40 system, and we are also proud to be continuing and expanding our longstanding partnership with NERSC and the U.S. Department of Energy as we chart our course to exascale computing.”
    Support for Data-Intensive Science

    A key goal of the Cori Phase 1 system is to support the increasingly data-intensive computing needs of NERSC users. Toward this end, Phase 1 of Cori will feature more than 1,400 Intel Haswell compute nodes, each with 128 gigabytes of memory. The system will provide about the same sustained application performance as NERSC’s Hopper system, which will be retired later this year. The Cori interconnect will have a dragonfly topology based on the Aries interconnect, identical to NERSC’s Edison system.

    However, Cori Phase 1 will have twice as much memory per node as NERSC’s current Edison supercomputer (a Cray XC30 system) and will include a number of advanced features designed to accelerate data-intensive applications:

    A large number of login/interactive nodes to support applications with advanced workflows
    Immediate-access queues for jobs requiring real-time data ingestion or analysis
    High-throughput and serial queues that can handle large numbers of jobs for screening, uncertainty quantification, genomic data processing, image processing and similar parallel analysis
    Network connectivity that allows compute nodes to interact with external databases and workflow controllers
    The first half of an approximately 1.5 terabytes/sec NVRAM-based Burst Buffer for high-bandwidth, low-latency I/O
    A Cray Lustre-based file system with over 28 petabytes of capacity and 700 gigabytes/second of I/O bandwidth

    In addition, NERSC is collaborating with Cray on two ongoing R&D efforts to maximize Cori’s data potential by enabling higher bandwidth transfers in and out of the compute node, high-transaction rate data base access, and Linux container virtualization functionality on Cray compute nodes to allow custom software stack deployment.

    “The goal is to give users as familiar a system as possible, while also allowing them the flexibility to explore new workflows and paths to computation,” said Jay Srinivasan, the Computational Systems Group lead. “The Phase 1 system is designed to enable users to start running their workload on Cori immediately, while giving data-intensive workloads from other NERSC systems the ability to run on a Cray platform.”
    Burst Buffer Enhances I/O

    A key element of Cori Phase 1 is Cray’s new DataWarp technology, which accelerates application I/O and addresses the growing performance gap between compute resources and disk-based storage. This capability, often referred to as a “Burst Buffer,” is a layer of NVRAM designed to move data more quickly between processor and disk and allow users to make the most efficient use of the system. Cori Phase 1 will feature approximately 750 terabytes of capacity and approximately 750 gigabytes/second of I/O bandwidth. NERSC, Sandia and Los Alamos national laboratories and Cray are collaborating to define use cases and test early software that will provide the following capabilities:

    Improve application reliability (checkpoint-restart)
    Accelerate application I/O performance for small blocksize I/O and analysis files
    Enhance quality of service by providing dedicated I/O acceleration resources
    Provide fast temporary storage for out-of-core applications
    Serve as a staging area for jobs requiring large input files or persistent fast storage between coupled simulations
    Support post-processing analysis of large simulation data as well as in situ and in transit visualization and analysis using the Burst Buffer nodes
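The checkpoint-restart use case above is easy to size from the article’s own figures: 1,400 Haswell nodes with 128 GB each, drained through a Burst Buffer with roughly 750 GB/s of bandwidth. The four-minute estimate below is our arithmetic, not a quoted specification:

```python
# Sizing the checkpoint-restart use case from figures quoted in the article:
# 1,400 Haswell nodes x 128 GB of memory, drained through a Burst Buffer
# with roughly 750 GB/s of I/O bandwidth. The time estimate is ours.
nodes = 1400
mem_per_node_gb = 128
burst_buffer_gb_per_s = 750.0

total_memory_gb = nodes * mem_per_node_gb              # 179,200 GB (~179 TB)
drain_seconds = total_memory_gb / burst_buffer_gb_per_s
print(f"Full-memory checkpoint drains in ~{drain_seconds / 60:.0f} minutes")
```

This is the kind of estimate that motivates a burst buffer in the first place: draining the same data through the 700 GB/s disk-based file system while also serving other jobs would take substantially longer.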

    Combining Extreme Scale Data Analysis and HPC on the Road to Exascale

    As previously announced, Phase 2 of Cori will be delivered in mid-2016 and will be combined with Phase 1 on the same high speed network, providing a unique resource. When fully deployed, Cori will contain more than 9,300 Knights Landing compute nodes and more than 1,900 Haswell nodes, along with the file system and a doubling of the Burst Buffer’s application I/O acceleration.
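Those node counts allow a rough peak-performance estimate for the Knights Landing partition, using the article’s “more than 3 teraflop/s per node” figure. The Haswell partition is omitted because no per-node rate is quoted for it; this is a sketch, not an official specification:

```python
# Rough peak for the Knights Landing partition from the article's figures:
# more than 9,300 nodes at "more than 3 teraflop/s" double precision each.
# The Haswell partition is left out since no per-node rate is quoted.
knl_nodes = 9300
tflops_per_node = 3.0   # double-precision peak per node, per the article

knl_peak_pflops = knl_nodes * tflops_per_node / 1000.0
print(f"KNL partition peak: ~{knl_peak_pflops:.0f} petaflop/s")
```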

    “In the scientific computing community, the line between large scale data analysis and simulation and modeling is really very blurred,” said Katie Antypas, head of NERSC’s Scientific Computing and Data Services Department. “The combined Cori system is the first system to be specifically designed to handle the full spectrum of computational needs of DOE researchers, as well as emerging needs in which data- and compute-intensive work are part of a single workflow. For example, a scientist will be able to run a simulation on the highly parallel Knights Landing nodes while simultaneously performing data analysis using the Burst Buffer on the Haswell nodes. This is a model that we expect to be important on exascale-era machines.”

    NERSC is funded by the Office of Advanced Scientific Computing Research in the DOE’s Office of Science.

    See the full article here.


    The National Energy Research Scientific Computing Center (NERSC) is the primary scientific computing facility for the Office of Science in the U.S. Department of Energy. As one of the largest facilities in the world devoted to providing computational resources and expertise for basic scientific research, NERSC is a world leader in accelerating scientific discovery through computation. NERSC is a division of the Lawrence Berkeley National Laboratory, located in Berkeley, California. NERSC itself is located at the UC Oakland Scientific Facility in Oakland, California.

    More than 5,000 scientists use NERSC to perform basic scientific research across a wide range of disciplines, including climate modeling, research into new materials, simulations of the early universe, analysis of data from high energy physics experiments, investigations of protein structure, and a host of other scientific endeavors.

    The NERSC Hopper system, a Cray XE6 with a peak theoretical performance of 1.29 petaflop/s. To highlight its mission, powering scientific discovery, NERSC names its systems for distinguished scientists. Grace Hopper was a pioneer in the field of software development and programming languages and the creator of the first compiler. Throughout her career she was a champion for increasing the usability of computers, understanding that their power and reach would be limited unless they were made more user-friendly.

    (Historical photo of Grace Hopper courtesy of the Hagley Museum & Library, PC20100423_201. Design: Caitlin Youngquist/LBNL Photo: Roy Kaltschmidt/LBNL)

    NERSC is known as one of the best-run scientific computing facilities in the world. It provides some of the largest computing and storage systems available anywhere, but what distinguishes the center is its success in creating an environment that makes these resources effective for scientific research. NERSC systems are reliable and secure, and provide a state-of-the-art scientific development environment with the tools needed by the diverse community of NERSC users. NERSC offers scientists intellectual services that empower them to be more effective researchers. For example, many of our consultants are themselves domain scientists in areas such as material sciences, physics, chemistry and astronomy, well-equipped to help researchers apply computational resources to specialized science problems.

  • richardmitnick 2:12 pm on August 27, 2015 Permalink | Reply
    Tags: Basic Research, , ,   

    From Symmetry: “Looking for strings inside inflation” 


    August 27, 2015
    Troy Rummler


    Theorists from the Institute for Advanced Study have proposed a way forward in the quest to test string theory.

    Two theorists recently proposed a way to find evidence for an idea famous for being untestable: string theory. It involves looking for particles that were around 14 billion years ago, when a very tiny universe hit a growth spurt that used 15 billion times more energy than a collision in the Large Hadron Collider.

    Scientists can’t crank the LHC up that high, not even close. But they could possibly observe evidence of these particles through cosmological studies, with the right technological advances.
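For scale, the “15 billion times” figure can be turned into an energy estimate. A minimal sketch, assuming the LHC’s 13 TeV collision energy (the article does not specify which collision energy the comparison uses, so that value is our assumption):

```python
# Turning the article's "15 billion times" into an energy scale. We assume
# the LHC's 13 TeV proton-proton collision energy (the article does not
# state which collision energy the comparison uses).
lhc_collision_energy_gev = 13e3   # 13 TeV, expressed in GeV (assumed)
inflation_factor = 15e9           # the article's "15 billion times"

inflation_energy_gev = lhc_collision_energy_gev * inflation_factor
print(f"Implied inflation-era collision energy: ~{inflation_energy_gev:.2e} GeV")
```

The result lands around 10^14 GeV, far beyond the reach of any conceivable accelerator, which is why cosmological observations are the only window on this regime.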
    Unknown particles

    During inflation—the flash of hyperexpansion that happened 10⁻³³ seconds after the big bang—particles were colliding with astronomical power. We see remnants of that time in tiny fluctuations in the haze of leftover energy called the cosmic microwave background [CMB].

    Cosmic Background Radiation Planck
    CMB per Planck

    ESA Planck

    Scientists might be able to find remnants of any prehistoric particles that were around during that time as well.

    “If new particles existed during inflation, they can imprint a signature on the primordial fluctuations, which can be seen through specific patterns,” says theorist Juan Maldacena of the Institute for Advanced Study at Princeton University.

    Maldacena and his IAS collaborator, theorist Nima Arkani-Hamed, have used quantum field theory calculations to figure out what these patterns might look like. The pair presented their findings at Strings ’15, the annual string theory conference, held in Bengaluru, India, in June.

    The probable, impossible string

    String theory is frequently summed up by its basic tenet: that the fundamental units of matter are not particles. They are one-dimensional, vibrating strings of energy.

    The theory’s purpose is to bridge a mathematical conflict between quantum mechanics and [Albert] Einstein’s theory of general relativity. Inside a black hole, for example, the two theories make irreconcilable predictions about gravity. Any attempt to adjust one theory to fit the other causes the whole delicate system to collapse. Instead of trying to do this, string theory creates a new mathematical framework in which both theories are natural results. Out of this framework emerges an astonishingly elegant way to unify the forces of nature, along with a correct qualitative description of all known elementary particles.

    As a system of mathematics, string theory makes a tremendous number of predictions. Testable predictions? None so far.

    Strings are thought to be the smallest objects in the universe, and computing their effects on the relatively enormous scales of particle physics experiments is no easy task. String theorists predict that new particles exist, but they cannot compute their masses.

    To exacerbate the problem, string theory can describe a variety of universes that differ by numbers of forces, particles or dimensions. Predictions at accessible energies depend on these unknown or very difficult details. No experiment can definitively prove a theory that offers so many alternative versions of reality.
    Putting string theory to the test

    But scientists are working out ways that experiments could at least begin to test parts of string theory. One prediction that string theory makes is the existence of particles with a unique property: a spin of greater than two.

    Spin is a property of fundamental particles. Particles that don’t spin decay in symmetric patterns. Particles that do spin decay in asymmetric patterns, and the greater the spin, the more complex those patterns get. Highly complex decay patterns from collisions between these particles would have left signature impressions on the universe as it expanded and cooled.

    Scientists could find the patterns of particles with spin greater than 2 in subtle variations in the distribution of galaxies or in the cosmic microwave background, according to Maldacena and Arkani-Hamed. Observational cosmologists would have to measure the primordial fluctuations over a wide range of length scales to be able to see these small deviations.

    The IAS theorists calculated what those measurements would theoretically be if these massive, high-spin particles existed. Such a particle would be much more massive than anything scientists could find at the LHC.

    A challenging proposition

    Cosmologists are already studying patterns in the cosmic microwave background. Experiments such as Planck, BICEP and POLARBEAR are searching for polarization, which would be evidence that a nonrandom force acted on the CMB.

    BICEP 2
    BICEP 2 interior

    POLARBEAR McGill Telescope

    If they rewind the effects of time and mathematically undo all other forces that have interacted with this energy, they hope that the pattern that remains will match the predicted twists imprinted by inflation.

    The patterns proposed by Maldacena and Arkani-Hamed are much subtler and much more susceptible to interference. So any expectation of experimentally finding such signals is still a long way off.

    But this research could point us toward someday finding such signatures and illuminating our understanding of particles that have perhaps left their mark on the entire universe.
    The value of strings

    Whether or not anyone can prove that the world is made of strings, people have proven that the mathematics of string theory can be applied to other fields.

    In 2009, researchers discovered that string theory math could be applied to conventional problems in condensed matter physics. Since then, researchers have been applying string theory to the study of superconductors.

    Fellow IAS theorist Edward Witten, who received the Fields Medal in 1990 for his mathematical contributions to quantum field theory and supersymmetry, says Maldacena and Arkani-Hamed’s presentation was among the most innovative work he saw at the Strings ’15 conference.

    Witten and others believe that such successes in other fields indicate that string theory actually underlies all other theories at some deeper level.

    “Physics—like history—does not precisely repeat itself,” Witten says. However, with similar structures appearing at different scales of lengths and energies, “it does rhyme.”

    See the full article here.


    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 1:29 pm on August 27, 2015 Permalink | Reply
    Tags: , Basic Research,   

    From Hubble: “Hubble Finds That the Nearest Quasar Is Powered by a Double Black Hole” 

    NASA Hubble Telescope


    August 27, 2015

    Ray Villard
    Space Telescope Science Institute, Baltimore, Md.

    Jana Smith
    University of Oklahoma, Norman, Ok.

    Xinyu Dai
    University of Oklahoma, Norman, Ok.

    Quasar Host Galaxy Markarian 231
    This Hubble Space Telescope image reveals a bright, starlike glow in the center of the interacting galaxy Markarian 231, the nearest quasar to Earth. Located 581 million light-years away, the galaxy appears to us as it looked before multicelled life first appeared on Earth. Quasars are powered by a central black hole that heats the gas around it to unleash tremendous amounts of energy. Hubble spectroscopic observations indicate the presence of two supermassive black holes whirling around each other. Because such a dynamic duo is found in the nearest quasar, many quasars may host binary-black-hole systems, a natural result of galaxy mergers.
    Object Names: Markarian 231, Mrk 231, UGC 8058, VII Zw 490, QSO B1254+571
    Image Type: Astronomical
    Credit: NASA, ESA, the Hubble Heritage Team (STScI/AURA)-ESA/Hubble Collaboration, and A. Evans (University of Virginia, Charlottesville/NRAO/Stony Brook University)
    The galaxy pair was imaged with the ACS/WFC instrument with filters F435W (B) and F814W (I) on May 10, 2002.

    NASA Hubble ACS

    NASA Hubble WFC3

    Astronomers using NASA’s Hubble Space Telescope have found that Markarian 231 (Mrk 231), the nearest galaxy to Earth that hosts a quasar, is powered by two central black holes furiously whirling about each other.

    The finding suggests that quasars — the brilliant cores of active galaxies — may commonly host two central supermassive black holes that fall into orbit about one another as a result of the merger between two galaxies. Like a pair of whirling skaters, the black-hole duo generates tremendous amounts of energy that makes the core of the host galaxy outshine the glow of the galaxy’s population of billions of stars, which scientists then identify as quasars.

    Scientists looked at Hubble archival observations of ultraviolet radiation emitted from the center of Mrk 231 to discover what they describe as “extreme and surprising properties.”

    If only one black hole were present in the center of the quasar, the whole accretion disk made of surrounding hot gas would glow in ultraviolet rays. Instead, the ultraviolet glow of the dusty disk abruptly drops off towards the center. This provides observational evidence that the disk has a big donut hole encircling the central black hole. The best explanation for the observational data, based on dynamical models, is that the center of the disk is carved out by the action of two black holes orbiting each other. The second, smaller black hole orbits at the inner edge of the accretion disk and has its own mini-disk with an ultraviolet glow.

    “We are extremely excited about this finding because it not only shows the existence of a close binary black hole in Mrk 231, but also paves a new way to systematically search for binary black holes via the nature of their ultraviolet light emission,” said Youjun Lu of the National Astronomical Observatories of China, Chinese Academy of Sciences.

    “The structure of our universe, such as those giant galaxies and clusters of galaxies, grows by merging smaller systems into larger ones, and binary black holes are natural consequences of these mergers of galaxies,” added co-investigator Xinyu Dai of the University of Oklahoma.

    The central black hole is estimated to be 150 million times the mass of our sun, and the companion weighs in at 4 million solar masses. The dynamic duo completes an orbit around each other every 1.2 years.
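Those figures can be sanity-checked with Kepler’s third law in solar-system units (masses in solar masses, period in years, separation in astronomical units). The implied separation is our inference, not a number from the article:

```python
# Sanity check of the quoted orbit with Kepler's third law in solar-system
# units: a^3 = (M1 + M2) * P^2, with masses in solar masses, P in years,
# and a in astronomical units. The separation is our inference.
m_primary_msun = 150e6
m_companion_msun = 4e6
period_yr = 1.2

separation_au = ((m_primary_msun + m_companion_msun) * period_yr**2) ** (1 / 3)
print(f"Implied separation: ~{separation_au:.0f} AU")
```

A separation of a few hundred AU, tiny compared with the galaxy itself, is consistent with the tight, fast binary the dynamical models describe.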

    The lower-mass black hole is the remnant of a smaller galaxy that merged with Mrk 231. Evidence of a recent merger comes from the host galaxy’s asymmetry and the long tidal tails of young blue stars.

    The result of the merger has been to make Mrk 231 an energetic starburst galaxy with a star-formation rate 100 times greater than that of our Milky Way galaxy. The infalling gas fuels the black hole “engine,” triggering outflows and gas turbulence that incites a firestorm of star birth.

    The binary black holes are predicted to spiral together and collide within a few hundred thousand years.

    Mrk 231 is located 581 million light-years away.

    The results were published in the August 14, 2015, edition of The Astrophysical Journal.

    See the full article here.


    The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA’s Goddard Space Flight Center manages the telescope. The Space Telescope Science Institute (STScI), a free-standing science center located on the campus of The Johns Hopkins University and operated by the Association of Universities for Research in Astronomy (AURA) for NASA, conducts Hubble science operations.

    ESA50 Logo large

    AURA Icon

  • richardmitnick 11:53 am on August 27, 2015 Permalink | Reply
    Tags: , Basic Research, , , , ,   

    From U Maryland: “Evidence Suggests Subatomic Particles Could Defy the Standard Model” 

    U Maryland bloc

    University of Maryland

    August 26, 2015
    Matthew Wright

    Large Hadron Collider team finds hints of leptons acting out against time-tested predictions

    The Standard Model of particle physics, which explains most of the known behaviors and interactions of fundamental subatomic particles, has held up remarkably well over several decades.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    This far-reaching theory does have a few shortcomings, however—most notably that it doesn’t account for gravity. In hopes of revealing new, non-standard particles and forces, physicists have been on the hunt for conditions and behaviors that directly violate the Standard Model.

    Now, a team of physicists working at CERN’s Large Hadron Collider (LHC) has found new hints of particles—leptons, to be more precise—being treated in strange ways not predicted by the Standard Model. The discovery, scheduled for publication in the September 4, 2015 issue of the journal Physical Review Letters, could prove to be a significant lead in the search for non-standard phenomena.

    CERN LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

    In this event display from the LHCb experiment at CERN’s Large Hadron Collider, proton-proton collisions at the interaction point (far left) result in a shower of leptons and other charged particles. The yellow and green lines are computer-generated reconstructions of the particles’ trajectories through the layers of the LHCb detector. Image credit: CERN/LHCb Collaboration

    LHCb Detector

    The team, which includes physicists from the University of Maryland who made key contributions to the study, analyzed data collected by the LHCb detector during the first run of the LHC in 2011-12. The researchers looked at B meson decays, processes that produce lighter particles, including two types of leptons: the tau lepton and the muon. Unlike their stable lepton cousin, the electron, tau leptons and muons are highly unstable and quickly decay within a fraction of a second.

    According to a Standard Model concept called “lepton universality,” which holds that all fundamental forces treat leptons equally, decays to the tau lepton and to the muon should happen at the same rate, once corrected for the mass difference. However, the team found a small but notable deviation from the predicted decay rates, suggesting that as-yet-undiscovered forces or particles could be interfering in the process.
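Such a deviation is typically quantified by comparing the measured ratio of branching fractions with its Standard Model prediction, in units of the combined uncertainty. The numbers below are illustrative values in line with the 2015 LHCb result as widely reported; consult the published paper for the definitive figures:

```python
import math

# How a lepton-universality deviation is quantified: compare the measured
# ratio of branching fractions with its Standard Model prediction, in units
# of the combined uncertainty. The numbers below are illustrative values in
# line with the 2015 LHCb result as widely reported, not quotes from this
# article.
r_measured, stat_err, syst_err = 0.336, 0.027, 0.030
r_standard_model, sm_err = 0.252, 0.003

total_err = math.sqrt(stat_err**2 + syst_err**2 + sm_err**2)
significance = (r_measured - r_standard_model) / total_err
print(f"Deviation from the Standard Model prediction: {significance:.1f} sigma")
```

A roughly two-sigma deviation from one experiment is intriguing but not conclusive, which is why the independent cross-check against BaBar discussed below matters so much.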

    “The Standard Model says the world interacts with all leptons in the same way. There is a democracy there. But there is no guarantee that this will hold true if we discover new particles or new forces,” said study co-author and UMD team lead Hassan Jawahery, Distinguished University Professor of Physics and Gus T. Zorn Professor at UMD. “Lepton universality is truly enshrined in the Standard Model. If this universality is broken, we can say that we’ve found evidence for non-standard physics.”

    The LHCb result adds to a previous lepton decay finding, from the BaBar experiment at the Stanford Linear Accelerator Center, which suggested a similar deviation from Standard Model predictions.

    SLAC Babar

    (The UMD team has participated in the BaBar experiment since its inception in the 1990s.) While both experiments involved the decay of B mesons, electron collisions drove the BaBar experiment and higher-energy proton collisions drove the LHC experiment.

    “The experiments were done in totally different environments, but they reflect the same physical model. This replication provides an important independent check on the observations,” explained study co-author Brian Hamilton, a physics research associate at UMD. “The added weight of two experiments is the key here. This suggests that it’s not just an instrumental effect—it’s pointing to real physics.”

    “While these two results taken together are very promising, the observed phenomena won’t be considered a true violation of the Standard Model without further experiments to verify our observations,” said co-author Gregory Ciezarek, a physicist at the Dutch National Institute for Subatomic Physics (NIKHEF).

    “We are planning a range of other measurements. The LHCb experiment is taking more data during the second run right now. We are working on upgrades to the LHCb detector within the next few years,” Jawahery said. “If this phenomenon is corroborated, we will have decades of work ahead. It could point theoretical physicists toward new ways to look at standard and non-standard physics.”

    With the discovery of the Higgs boson—the last major missing piece of the Standard Model—during the first LHC run, physicists are now looking for phenomena that do not conform to Standard Model predictions.

    Higgs Boson Event
    Higgs Boson event at CMS

    CERN CMS Detector
    CMS Detector in the LHC at CERN

    Jawahery and his colleagues are excited for the future, as the field moves into unknown territory.

    “Any knowledge from here on helps us learn more about how the universe evolved to this point. For example, we know that dark matter and dark energy exist, but we don’t yet know what they are or how to explain them. Our result could be a part of that puzzle,” Jawahery said. “If we can demonstrate that there are missing particles and interactions beyond the Standard Model, it could help complete the picture.”


    In addition to Jawahery and Hamilton, UMD Graduate Assistants Jason Andrews and Jack Wimberley are co-authors on the paper. The UMD LHCb team also includes Research Associate William Parker and Engineer Thomas O’Bannon, who are not coauthors on the paper.

    The research paper, “Measurement of the ratio of branching fractions…,” The LHCb Collaboration, is scheduled to appear online August 31, 2015 and to be published September 4, 2015 in the journal Physical Review Letters.

    See the full article here.


    U Maryland Campus

    Driven by the pursuit of excellence, the University of Maryland has enjoyed a remarkable rise in accomplishment and reputation over the past two decades. By any measure, Maryland is now one of the nation’s preeminent public research universities and on a path to become one of the world’s best. To fulfill this promise, we must capitalize on our momentum, fully exploit our competitive advantages, and pursue ambitious goals with great discipline and entrepreneurial spirit. This promise is within reach. This strategic plan is our working agenda.

    The plan is comprehensive, bold, and action oriented. It sets forth a vision of the University as an institution unmatched in its capacity to attract talent, address the most important issues of our time, and produce the leaders of tomorrow. The plan will guide the investment of our human and material resources as we strengthen our undergraduate and graduate programs and expand research, outreach and partnerships, become a truly international center, and enhance our surrounding community.

    Our success will benefit Maryland in the near and long term, strengthen the State’s competitive capacity in a challenging and changing environment and enrich the economic, social and cultural life of the region. We will be a catalyst for progress, the State’s most valuable asset, and an indispensable contributor to the nation’s well-being. Achieving the goals of Transforming Maryland requires broad-based and sustained support from our extended community. We ask our stakeholders to join with us to make the University an institution of world-class quality with world-wide reach and unparalleled impact as it serves the people and the state of Maryland.

  • richardmitnick 11:18 am on August 27, 2015 Permalink | Reply
    Tags: , Basic Research, Electromagnetism,   

    From phys.org: “New theory leads to radiationless revolution” 


    August 27, 2015
    No Writer Credit

    Dr. Miroshnichenko with his visualization of anapoles as dark matter. Credit: Stuart Hay, ANU

    Physicists have found a radical new way to confine electromagnetic energy without it leaking away, akin to throwing a pebble into a pond with no splash.

    The theory could have wide-ranging applications, from explaining dark matter to combating energy losses in future technologies.

    However, it appears to contradict a fundamental tenet of electrodynamics, that accelerated charges create electromagnetic radiation, said lead researcher Dr Andrey Miroshnichenko from The Australian National University (ANU).

    “This problem has puzzled many people. It took us a year to get this concept clear in our heads,” said Dr Miroshnichenko, from the ANU Research School of Physics and Engineering.

    The fundamental new theory could be used in quantum computers, lead to new laser technology and may even hold the key to understanding how matter itself hangs together.

    “Ever since the beginning of quantum mechanics people have been looking for a configuration which could explain the stability of atoms and why orbiting electrons do not radiate,” Dr Miroshnichenko said.

    The absence of radiation is the result of the current being divided between two different components: a conventional electric dipole and a toroidal dipole (associated with a poloidal current configuration), which produce identical fields at a distance.

    If these two configurations are out of phase then the radiation will be cancelled out, even though the electromagnetic fields are non-zero in the area close to the currents.
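The cancellation can be illustrated with a toy calculation: two sources with identical far-field amplitudes, driven half a cycle out of phase, sum to zero at a distance. The unit amplitude below is arbitrary; this is a cartoon of the phase argument, not an electromagnetic field solver:

```python
import cmath

# Toy illustration of the anapole cancellation: two sources with identical
# far-field amplitudes, driven pi radians out of phase, sum to zero at a
# distance, while the near fields need not vanish. The unit amplitude is
# an arbitrary illustrative choice.
amplitude = 1.0
electric_dipole_far = amplitude * cmath.exp(1j * 0.0)       # reference phase
toroidal_dipole_far = amplitude * cmath.exp(1j * cmath.pi)  # pi out of phase

total_far_field = electric_dipole_far + toroidal_dipole_far
print(f"Net far-field amplitude: {abs(total_far_field):.2e}")  # effectively zero
```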

    Visualization of dark matter as energy confined within non-radiating anapoles. Credit: Andrey Miroshnichenko

    Dr Miroshnichenko, in collaboration with colleagues from Germany and Singapore, successfully tested his new theory with single silicon nanodiscs between 160 and 310 nanometres in diameter and 50 nanometres high, which he was able to make effectively invisible by cancelling the discs’ scattering of visible light.

    This type of excitation is known as an anapole (from the Greek, ‘without poles’).

    Dr Miroshnichenko’s insight came while trying to reconcile differences between two different mathematical descriptions of radiation; one based on Cartesian multipoles and the other on vector spherical harmonics used in a Mie basis set.

    “The two gave different answers, and they shouldn’t. Eventually we realised the Cartesian description was missing the toroidal components,” Dr Miroshnichenko said.

    “We realised that these toroidal components were not just a correction, they could be a very significant factor.”

    Dr Miroshnichenko said the confined energy of anapoles could be important in the development of tiny lasers on the surface of materials, called spasers, and also in the creation of efficient X-ray lasers by high-order harmonic generation.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quantcast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

  • richardmitnick 11:04 am on August 27, 2015 Permalink | Reply
    Tags: , Basic Research, Digital seafloor maps,   

    From isgtw: “World’s first digital ocean floor map” 

    international science grid this week

    August 26, 2015

    Download mp4 here.

    Researchers from the University of Sydney’s School of Geosciences in Australia have created the world’s first digital map of the seafloor. Because the ocean is the Earth’s largest storehouse of carbon, understanding the seabed is critical to knowing how climate change will affect the ocean environment.


    “In order to understand environmental change in the oceans we need to better understand the seabed,” says lead researcher Dr. Adriana Dutkiewicz. “Our research opens the door to a better understanding of the workings and history of the marine carbon cycle. We urgently need to understand how the ocean responds to climate change.”

    The last seabed map was hand drawn more than 40 years ago. Using a machine-learning method called a support vector machine, experts at National ICT Australia (NICTA) turned an assemblage of seabed descriptions and sediment samples collected since the 1950s into a single contiguous digital map.
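    A support vector machine learns a decision boundary from labelled samples and then predicts labels for unsampled locations, which is how scattered sediment observations become a continuous map. The sketch below shows the general idea using scikit-learn; the feature columns, sediment classes, and depth-driven toy rule are hypothetical stand-ins, not the actual NICTA dataset or pipeline.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)

    # Toy training data: each sample is (latitude, longitude, water depth in m)
    # with a sediment label observed at that point (hypothetical classes).
    X = rng.uniform([-60, 0, 100], [60, 360, 6000], size=(200, 3))
    y = np.where(X[:, 2] > 4000, "clay", "calcareous ooze")  # toy depth rule

    # Scale features, then fit an RBF-kernel support vector classifier.
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    model.fit(X, y)

    # Predict the dominant sediment type at a location never sampled directly.
    print(model.predict([[10.0, 150.0, 5000.0]]))
    ```

    In practice the published map would combine many more predictors than this toy example, but the workflow is the same: train on point samples, then classify every grid cell to produce a contiguous map.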

    “The difference between the new and old map is a little like comparing a barren tundra landscape with an exotic tropical paradise full of diversity,” says Dutkiewicz. “The ocean floor used to be portrayed as a monotonous seascape whereas the new map echoes the colorful patchworks of dreamtime art.”

    The map data can be downloaded for free [I got the download, but could not find a program to open it], and you can see the dreamy interactive 3D globe here.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”
