Updates from richardmitnick Toggle Comment Threads | Keyboard Shortcuts

  • richardmitnick 5:05 pm on September 20, 2014 Permalink | Reply
    Tags: Fossil Record

    From astrobio.net: “Meteorite that doomed the dinosaurs helped the forests bloom” 

    Astrobiology Magazine

    Sep 20, 2014
    Source: PLOS
    No Writer Credit

    66 million years ago, a 10-km diameter chunk of rock hit the Yucatán Peninsula near the site of the small town of Chicxulub with the force of 100 teratons of TNT. It left a crater more than 150 km across, and the resulting megatsunami, wildfires, global earthquakes and volcanism are widely accepted to have wiped out the dinosaurs and made way for the rise of the mammals. But what happened to the plants on which the dinosaurs fed?

    A new study led by researchers from the University of Arizona reveals that the meteorite impact that spelled doom for the dinosaurs also decimated the evergreen flowering plants to a much greater extent than their deciduous peers. They hypothesize that the properties of deciduous plants made them better able to respond rapidly to chaotically varying post-apocalyptic climate conditions. The results were published on September 16 in the open access journal PLOS Biology.

    Applying biomechanical formulae to a treasure trove of thousands of fossilized leaves of angiosperms — flowering plants, as distinct from conifers — the team was able to reconstruct the ecology of a diverse plant community thriving during a 2.2 million-year period spanning the cataclysmic impact event, which is believed to have wiped out more than half of the plant species living at the time. The fossilized leaf samples span the last 1,400,000 years of the Cretaceous and the first 800,000 years of the Paleogene.

    The researchers found evidence that after the impact, fast-growing, deciduous angiosperms had replaced their slow-growing, evergreen peers to a large extent. Living examples of evergreen angiosperms, such as holly and ivy, tend to prefer shade, don’t grow very fast and sport dark-colored leaves.

    “When you look at forests around the world today, you don’t see many forests dominated by evergreen flowering plants,” said the study’s lead author, Benjamin Blonder. “Instead, they are dominated by deciduous species, plants that lose their leaves at some point during the year.”

    leaf
    Seen here is a Late Cretaceous specimen from the Hell Creek Formation, morphotype HC62, taxon ”Rhamnus” cleburni. Specimens are housed at the Denver Museum of Nature and Science in Denver, Colorado. Image credit: Benjamin Blonder.

    Blonder and his colleagues studied a total of about 1,000 fossilized plant leaves collected from a location in southern North Dakota, embedded in rock layers known as the Hell Creek Formation, which at the end of the Cretaceous was a lowland floodplain crisscrossed by river channels.

    The collection consists of more than 10,000 identified plant fossils and is housed primarily at the Denver Museum of Nature and Science. “When you hold one of those leaves that is so exquisitely preserved in your hand knowing it’s 66 million years old, it’s a humbling feeling,” said Blonder.

    “If you think about a mass extinction caused by a catastrophic event such as a meteorite impacting Earth, you might imagine all species are equally likely to die,” Blonder said. “Survival of the fittest doesn’t apply — the impact is like a reset button. The alternative hypothesis, however, is that some species had properties that enabled them to survive.

    “Our study provides evidence of a dramatic shift from slow-growing plants to fast-growing species,” he said. “This tells us that the extinction was not random, and the way in which a plant acquires resources predicts how it can respond to a major disturbance. And potentially this also tells us why we find that modern forests are generally deciduous and not evergreen.”

    Previously, other scientists found evidence of a dramatic drop in temperature caused by dust from the impact. “The hypothesis is that the impact winter introduced a very variable climate,” Blonder said. “That would have favored plants that grew quickly and could take advantage of changing conditions, such as deciduous plants.”

    “We measured the mass of a given leaf in relation to its area, which tells us whether the leaf was a chunky, expensive one to make for the plant, or whether it was a more flimsy, cheap one,” Blonder explained. “In other words, how much carbon the plant had invested in the leaf.” In addition the researchers measured the density of the leaves’ vein networks, a measure of the amount of water a plant can transpire and the rate at which it can acquire carbon.

    “There is a spectrum between fast- and slow-growing species,” said Blonder. “There is the ‘live fast, die young’ strategy and there is the ‘slow but steady’ strategy. You could compare it to financial strategies investing in stocks versus bonds.” The analyses revealed that while slow-growing evergreens dominated the plant assemblages before the extinction event, fast-growing flowering species had taken their places afterward.
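The leaf-economics proxy described above boils down to a simple ratio. The sketch below uses hypothetical numbers, not the study’s actual data or its calibrated regressions; only the quantity itself (leaf mass per area, a standard proxy for a plant’s carbon investment per leaf) comes from the article.

```python
# Leaf mass per area (LMA): a "chunky, expensive" leaf has high LMA and
# suggests a slow-growing evergreen strategy; a "flimsy, cheap" leaf has
# low LMA and suggests a fast-growing deciduous strategy.

def leaf_mass_per_area(mass_g, area_cm2):
    """Return LMA in g/cm^2 for a single leaf."""
    return mass_g / area_cm2

# Two illustrative fossil leaves (made-up values):
evergreen_like = leaf_mass_per_area(mass_g=0.30, area_cm2=20.0)   # 0.015 g/cm^2
deciduous_like = leaf_mass_per_area(mass_g=0.10, area_cm2=25.0)   # 0.004 g/cm^2

# The higher-LMA leaf represents the larger carbon investment per unit area.
assert evergreen_like > deciduous_like
```

In the study’s terms, a pre-impact assemblage would skew toward the high-LMA end of this spectrum and a post-impact assemblage toward the low-LMA end.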

    See the full article here.

    NASA

    ScienceSprings relies on technology from

    MAINGEAR computers

    Lenovo
    Lenovo

    Dell
    Dell

     
  • richardmitnick 4:41 pm on September 20, 2014 Permalink | Reply

    From Princeton: “‘Solid’ light could compute previously unsolvable problems” 

    Princeton University

    Sep 08, 2014
    John Sullivan

    Researchers at Princeton University have begun crystallizing light as part of an effort to answer fundamental questions about the physics of matter.

    The researchers are not shining light through crystal – they are transforming light into crystal. As part of an effort to develop exotic materials such as room-temperature superconductors, the researchers have locked together photons, the basic element of light, so that they become fixed in place.

    “It’s something that we have never seen before,” said Andrew Houck, an associate professor of electrical engineering and one of the researchers. “This is a new behavior for light.”

    The results raise intriguing possibilities for a variety of future materials. But the researchers also intend to use the method to address questions about the fundamental study of matter, a field called condensed matter physics.

    “We are interested in exploring – and ultimately controlling and directing – the flow of energy at the atomic level,” said Hakan Türeci, an assistant professor of electrical engineering and a member of the research team. “The goal is to better understand current materials and processes and to evaluate materials that we cannot yet create.”

    The team’s findings, reported online on Sept. 8 in the journal Physical Review X, are part of an effort to answer fundamental questions about atomic behavior by creating a device that can simulate the behavior of subatomic particles. Such a tool could be an invaluable method for answering questions about atoms and molecules that are not answerable even with today’s most advanced computers.

    light

    In part, that is because current computers operate under the rules of classical mechanics, which is a system that describes the everyday world containing things like bowling balls and planets. But the world of atoms and photons obeys the rules of quantum mechanics, which include a number of strange and very counterintuitive features. One of these odd properties is called “entanglement” in which multiple particles become linked and can affect each other over long distances.

    The difference between the quantum and classical rules limits a standard computer’s ability to efficiently study quantum systems. Because the computer operates under classical rules, it simply cannot grapple with many of the features of the quantum world. Scientists have long believed that a computer based on the rules of quantum mechanics could allow them to crack problems that are currently unsolvable. Such a computer could answer the questions about materials that the Princeton team is pursuing, but building a general-purpose quantum computer has proven to be incredibly difficult and requires further research.

    Another approach, which the Princeton team is taking, is to build a system that directly simulates the desired quantum behavior. Although each machine is limited to a single task, it would allow researchers to answer important questions without having to solve some of the more difficult problems involved in creating a general-purpose quantum computer. In a way, it is like answering questions about airplane design by studying a model airplane in a wind tunnel – solving problems with a physical simulation rather than a digital computer.

    In addition to answering questions about currently existing material, the device also could allow physicists to explore fundamental questions about the behavior of matter by mimicking materials that only exist in physicists’ imaginations.

    To build their machine, the researchers created a structure made of superconducting materials that contains 100 billion atoms engineered to act as a single “artificial atom.” They placed the artificial atom close to a superconducting wire containing photons.

    By the rules of quantum mechanics, the photons on the wire inherit some of the properties of the artificial atom – in a sense linking them. Normally photons do not interact with each other, but in this system the researchers are able to create new behavior in which the photons begin to interact in some ways like particles.

    “We have used this blending together of the photons and the atom to artificially devise strong interactions among the photons,” said Darius Sadri, a postdoctoral researcher and one of the authors. “These interactions then lead to completely new collective behavior for light – akin to the phases of matter, like liquids and crystals, studied in condensed matter physics.”

    Türeci said that scientists have explored the nature of light for centuries; discovering that sometimes light behaves like a wave and other times like a particle. In the lab at Princeton, the researchers have engineered a new behavior.

    “Here we set up a situation where light effectively behaves like a particle in the sense that two photons can interact very strongly,” Türeci said. “In one mode of operation, light sloshes back and forth like a liquid; in the other, it freezes.”

    The current device is relatively small, with only two sites where an artificial atom is paired with a superconducting wire. But the researchers say that by expanding the device and the number of interactions, they can increase their ability to simulate more complex systems – growing from the simulation of a single molecule to that of an entire material. In the future, the team plans to build devices with hundreds of sites with which they hope to observe exotic phases of light such as superfluids and insulators.

    “There is a lot of new physics that can be done even with these small systems,” said James Raftery, a graduate student in electrical engineering and one of the authors. “But as we scale up, we will be able to tackle some really interesting questions.”

    Besides Houck, Türeci, Sadri and Raftery, the research team included Sebastian Schmidt, a senior researcher at the Institute for Theoretical Physics at ETH Zurich, Switzerland. Support for the project was provided by: the Eric and Wendy Schmidt Transformative Technology Fund; the National Science Foundation; the David and Lucile Packard Foundation; the U.S. Army Research Office; and the Swiss National Science Foundation.

    See the full article here.

    About Princeton: Overview

    Princeton University is a vibrant community of scholarship and learning that stands in the nation’s service and in the service of all nations. Chartered in 1746, Princeton is the fourth-oldest college in the United States. Princeton is an independent, coeducational, nondenominational institution that provides undergraduate and graduate instruction in the humanities, social sciences, natural sciences and engineering.

    As a world-renowned research university, Princeton seeks to achieve the highest levels of distinction in the discovery and transmission of knowledge and understanding. At the same time, Princeton is distinctive among research universities in its commitment to undergraduate teaching.

    Today, more than 1,100 faculty members instruct approximately 5,200 undergraduate students and 2,600 graduate students. The University’s generous financial aid program ensures that talented students from all economic backgrounds can afford a Princeton education.

    Princeton Shield

     
  • richardmitnick 4:25 pm on September 20, 2014 Permalink | Reply

    From phys.org: “UCI team is first to capture motion of single molecule in real time” 

    phys.org

    September 16, 2014
    No Writer Credit

    UC Irvine chemists have scored a scientific first: capturing moving images of a single molecule as it vibrates, or “breathes,” and shifts from one quantum state to another.

    aa

    The groundbreaking achievement, led by Ara Apkarian, professor of chemistry, and Eric Potma, associate professor of chemistry, opens a window into the strange realm of quantum mechanics – where nanoscopic bits of matter seemingly defy the logic of classical physics.

    phy
    A simplified view of the fields of modern physics. Historically the picture is more complicated: when quantum mechanics was originally formulated, it was applied to models whose correspondence limit was non-relativistic, and many attempts were made to merge quantum mechanics with special relativity via covariant equations such as the Dirac equation. Relativistic quantum mechanics has since been superseded by quantum field theory.

    This could lead to a wide variety of important applications, including lightning-fast quantum computers and uncrackable encryption of private messages. It also moves researchers a step closer to viewing the molecular world in action – being able to see the making and breaking of bonds, which controls biological processes such as enzymatic reactions and cellular dynamics.

    The August issue of Nature Photonics features this breakthrough as its cover story.

    “Our work is the first to capture the motion of one molecule in real time,” Apkarian said. While still images of single molecules have been possible since the 1980s, recording a molecule’s extremely rapid movements had proven elusive.

    In addition to using precisely tuned, ultrafast lasers and microscopes, the researchers had to equip the molecule with a tiny antenna consisting of two gold nanospheres in order to track its activity and record measurements over the course of an hour.

    When the many repeated measurements were averaged, an astonishing finding emerged: The molecule was oscillating from one quantum state to another.

    The scientists have produced a movie in which a small, glowing dot appears to emit pulses of bright light. “That’s the light broadcast from the antenna every time the molecule completes a cycle of its vibrational motion,” Apkarian said. “The bond moves at a rate of 10^13 cycles per second – ten million million cycles in one second.” Making the movie was like freeze-frame photography with a very fast flash, repeating the measurement over and over again.
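As a quick check on that timescale: a vibration at 10^13 cycles per second has a period of only 100 femtoseconds, which is why ultrafast laser pulses are needed as the “flash” in the freeze-frame analogy.

```python
# Period of a molecular vibration at 10^13 cycles per second.
frequency_hz = 1e13
period_s = 1.0 / frequency_hz     # 1e-13 s

period_fs = period_s / 1e-15      # convert seconds to femtoseconds
print(period_s)                   # 1e-13
# period_fs is ~100: each vibrational cycle lasts about 100 femtoseconds,
# far faster than any camera shutter, hence the stroboscopic approach.
```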

    Seeing a molecule as it moves is “essential to a deeper understanding of how it forms and breaks chemical bonds,” Potma said. “The aim of the present experiment was to demonstrate that we can capture a molecule in motion on its own timescale.”

    The next and even more ambitious goal is to acquire moving images of molecules in their natural environment without tethering them to an antenna. “Ultimately, we’d like to be able to [examine] a molecule … as it’s undergoing chemistry,” Apkarian said.

    See the full article here.

    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quancast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.


     
  • richardmitnick 3:35 pm on September 20, 2014 Permalink | Reply

    From NASA: A Colorful Lunar Eclipse 

    NASA Science News

    Mark your calendar: On Oct. 8th, the Moon will pass through the shadow of Earth for a total lunar eclipse. Sky watchers in the USA will see the Moon turn a beautiful shade of celestial red and maybe turquoise, too. Watch, enjoy, learn.

    NASA leads the nation on a great journey of discovery, seeking new knowledge and understanding of our planet Earth, our Sun and solar system, and the universe out to its farthest reaches and back to its earliest moments of existence. NASA’s Science Mission Directorate (SMD) and the nation’s science community use space observatories to conduct scientific studies of the Earth from space, to visit and return samples from other bodies in the solar system, and to peer out into our Galaxy and beyond. NASA’s science program seeks answers to profound questions that touch us all.

    This is NASA’s science vision: using the vantage point of space to achieve with the science community and our partners a deep scientific understanding of our planet, other planets and solar system bodies, the interplanetary environment, the Sun and its effects on the solar system, and the universe beyond. In so doing, we lay the intellectual foundation for the robotic and human expeditions of the future while meeting today’s needs for scientific information to address national concerns, such as climate change and space weather. At every step we share the journey of scientific exploration with the public and partner with others to substantially improve science, technology, engineering and mathematics (STEM) education nationwide.

    NASA


     
  • richardmitnick 3:43 pm on September 19, 2014 Permalink | Reply

    From astrobio.net: “What set the Earth’s plates in motion?” 

    Astrobiology Magazine

    Sep 19, 2014
    Source: University of Sydney

    The mystery of what kick-started the motion of Earth’s massive tectonic plates across its surface has been explained by researchers at the University of Sydney.

    “Earth is the only planet in our solar system where the process of plate tectonics occurs,” said Professor Patrice Rey, from the University of Sydney’s School of Geosciences.

    “The geological record suggests that until three billion years ago the earth’s crust was immobile, so what sparked this unique phenomenon has fascinated geoscientists for decades. We suggest it was triggered by the spreading of early continents, and that it eventually became a self-sustaining process.”

    Professor Rey is lead author of an article on the findings published in Nature on Wednesday, 17 September.

    The other authors on the paper are Nicolas Flament, also from the School of Geosciences, and Nicolas Coltice, from the University of Lyon.

    split
    The image shows a snapshot from the film after 45 million years of spreading. The pink is the region where the mantle underneath the early continent has melted, facilitating its spreading, and the initiation of the plate tectonic process. Credit: Patrice Rey, Nicolas Flament and Nicolas Coltice.


    There are eight major tectonic plates that move above Earth’s mantle at rates of up to 150 millimetres per year.

    In simple terms the process involves plates being dragged into the mantle at certain points and moving away from each other at others, in what has been dubbed ‘the conveyor belt’.

    Plate tectonics depends on the inverse relationship between density of rocks and temperature.

    At mid-oceanic ridges, rocks are hot and their density is low, making them buoyant, or more able to float. As they move away from those ridges they cool down and their density increases until they become denser than the underlying hot mantle, at which point they sink and are ‘dragged’ under.

    ridge
    Mid-ocean ridge

    But three to four billion years ago, the earth’s interior was hotter, volcanic activity was more prominent and tectonic plates did not become cold and dense enough to sink spontaneously.

    “So the driving engine for plate tectonics didn’t exist,” Professor Rey said.

    “Instead, thick and buoyant early continents erupted in the middle of immobile plates. Our modelling shows that these early continents could have placed major stress on the surrounding plates. Because they were buoyant they spread horizontally, forcing adjacent plates to be pushed under at their edges.”

    “This spreading of the early continents could have produced intermittent episodes of plate tectonics until, as the earth’s interior cooled and its crust and plate mantle became heavier, plate tectonics became a self-sustaining process which has never ceased and has shaped the face of our modern planet.”

    The new model also makes a number of predictions explaining features that have long puzzled the geoscience community.

    See the full article here.

    NASA


     
  • richardmitnick 3:21 pm on September 19, 2014 Permalink | Reply

    From Space.com: “The Physics of the Death Star” 

    space-dot-com logo

    SPACE.com

    September 18, 2014
    Ethan Siegel [Starts With a Bang]

    How to destroy an Alderaan-sized planet.

    “What’s that star?
    It’s the Death Star.
    What does it do?
    It does Death. It does Death, buddy. Get out of my way!” -Eddie Izzard

    It’s one of the most iconic sequences in all of film: the evil galactic empire takes the captured princess to her home planet of Alderaan, a world not so different from Earth, threatening to destroy it unless she tells them the location of the hidden rebel base. Distressed but loyal to her cause, she lies, giving them a false location, a lie they have no way of detecting. Nevertheless, they give the order to fire, and despite her protestations, this is what happens next.

    I want you to think about this for a moment:

    A battle station the size of the Moon,
    With a mysterious, unexplained power source at its core,
    Charges up and fires a laser-like ray at an entire, Earth-sized planet,
    And completely destroys it.

    Not only does the Death Star completely destroy Alderaan from the force of its blast, it does so in a matter of seconds, and kicks off at least a substantial fraction of the world into interplanetary space with an incredible velocity.

    See for yourself!

    blow up

    From a physics point of view — and using the Earth as a proxy for Alderaan — how much energy/power would it take to cause this destruction, and what are the physical possibilities for actually making this happen?

    First off, let’s consider the planet Earth, and force binding it together.

    earth

    As Obi-Wan famously said, “It surrounds us and penetrates us; it binds the galaxy together.” But the force binding the Earth together isn’t the mysterious one from the Star Wars Universe, but simply gravitation. And the gravitational binding energy of our planet — which is the minimum amount of energy we’d have to put into it to blast it apart — is an astounding 2.24 × 10^32 Joules, or 224,000,000,000,000,000,000,000,000,000,000 Joules of energy!

    To put that in perspective, think about the entire energy output of the Sun, a “mere” 3.8 × 10^26 Watts.

    sun

    It would take a full week’s worth of the Sun’s total energy output — delivered to an entire planet in the span of a few seconds — to cause that kind of reaction!
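Both figures above are easy to verify. The textbook uniform-density estimate for a planet’s gravitational binding energy, U = 3GM²/(5R), lands on the article’s 2.24 × 10^32 J almost exactly, and one week of solar output is indeed comparable:

```python
# Check of the two energy figures quoted above.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_earth = 5.972e24   # Earth mass, kg
R_earth = 6.371e6    # Earth radius, m

# Uniform-density gravitational binding energy: U = 3GM^2 / (5R)
U = 3 * G * M_earth**2 / (5 * R_earth)
print(f"binding energy ~ {U:.2e} J")          # ~2.24e32 J

# One week of the Sun's total output at L = 3.8e26 W:
L_sun = 3.8e26                                # W
week_s = 7 * 24 * 3600                        # seconds in a week
E_week = L_sun * week_s
print(f"one week of sunlight ~ {E_week:.2e} J")   # ~2.3e32 J
```

So the comparison in the article is not rhetorical: a week of the Sun’s entire output really is the right order of magnitude.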

    Remember what goes on inside an actual Sun-like star: hydrogen is burned via the process of nuclear fusion into heavier isotopes and elements, resulting in helium. Each second in the Sun, 4.3 billion kilograms of mass are converted into pure energy, which is the source of the Sun’s energy output. Let’s imagine that’s exactly what the Death Star is doing, in the most efficient way possible.

    watts

    We could simply have the Death Star fire a beam of light into the planet (e.g., laser light), requiring that it generate all that energy on board and then fire it at Alderaan. This would be catastrophically inefficient, however: imagine a solid material structure — even one as big as our Moon — trying to generate, direct and expel all that energy in just a matter of a few seconds. Releasing that much energy in one direction (2.24 × 10^32 Joules) would cause a Moon-mass object to accelerate in the opposite direction from rest to a speed of 78 km/s, something that clearly didn’t happen when the Death Star was fired.

    flash

    In fact, there was no discernible recoil at all! And that’s not even considering how such intense energy would be managed, since it would heat up everything surrounding it (by simple heat diffusion) and quite clearly melt the tubes inside. But there’s another way this planetary destruction could’ve happened, predicated on one simple, indisputable fact: Princess Leia is made up of matter, and not antimatter.

    Since she’s made of matter and grew up on Alderaan, we can assume Alderaan is made of matter as well, meaning that if the Death Star instead fired pure antimatter at Alderaan, it would only need to supply half the total energy, since the target (Alderaan itself) would provide the other half of the fuel.

    If this were the case, “only” 1.24 trillion tonnes of antimatter would suffice to provide the minimum amount of energy needed to blast that world apart. In the grand scheme of things, that isn’t so big.
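The 1.24-trillion-tonne figure follows directly from E = mc², with the planet supplying half the fuel:

```python
# Antimatter mass needed to release Earth's binding energy on annihilation.
# Each kilogram of antimatter annihilates a kilogram of planetary matter,
# so the total energy is 2*m*c^2 and m = E / (2 c^2).
c = 2.998e8          # speed of light, m/s
E = 2.24e32          # J, Earth's gravitational binding energy

m_kg = E / (2 * c**2)
m_tonnes = m_kg / 1e3
print(f"{m_tonnes:.2e} tonnes")   # ~1.25e12 tonnes, i.e. roughly 1.24 trillion
```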

    hunks
    Image credit: montage by Emily Lakdawalla of the Planetary Society, via http://www.planetary.org/blogs/emily-lakdawalla/2008/1634.html, all credits as follows: NASA / JPL / Ted Stryk except: Mathilde: NASA / JHUAPL / Ted Stryk; Steins: ESA / OSIRIS team; Eros: NASA / JHUAPL; Itokawa: ISAS / JAXA / Emily Lakdawalla; Halley: Russian Academy of Sciences / Ted Stryk; Tempel 1: NASA / JPL / UMD; Wild 2: NASA / JPL

    Here are some of the larger asteroids and comet nuclei known in the Solar System; 1.24 trillion tonnes is only about the mass of the asteroid 5535 Annefrank, or one of the smaller asteroids in this montage. It’s larger than Dactyl and smaller than Ida, and denser than any of the cometary nuclei like Halley or Tempel.

    In fact, if we were to compare 5535 Annefrank with Earth — an Alderaan-sized planet — it would appear only about one tenth the size that Ida does in the image below.

    ida
    Image credit: Matt Francis of Galileo’s Pendulum, via http://galileospendulum.org/2012/03/05/moonday-a-bite-sized-moon/

    In other words, the “antimatter” asteroid that would theoretically destroy an entire planet would barely be a single pixel in the above image!

    It’s not completely inconceivable that such a small amount of antimatter could be generated and fired at a planet! Storing that much antimatter in a Death Star-sized object might be the hard part, but here’s the thing: just like matter binds to itself through the electromagnetic force and — if you get a large amount of “stuff” together — through gravitation, antimatter behaves in exactly the same way.

    two
    We’ve been able to create neutral antimatter and store it, successfully, for reasonably long periods of time: not mere picoseconds, microseconds or even milliseconds, but long enough that it’s only our failure to keep normal matter away from it that causes it to annihilate in short order.

    It isn’t unreasonable that an advanced technological civilization — one that’s mastered hyperdrive and faster-than-light travel — could harness, say, the energy from an uninhabited star and use it to produce neutral antimatter. The way we do it on Earth in particle accelerators is relatively simple: we collide protons with other protons at high energies, producing three protons and one antiproton as a result. That antiproton could then be merged with a positron to produce neutral antihydrogen. You might wish for rocky, crystalline structures based on elements like silicon or carbon, but under the right conditions, hydrogen can produce a crystal-like structure.

    pro
    The quark structure of the proton. The color assignment of individual quarks is arbitrary, but all three colors must be present. Forces between quarks are mediated by gluons

    ap
    The quark structure of the antiproton.

    draw
    Image credit: NASA/R.J. Hall, via http://en.wikipedia.org/wiki/File:Jupiter_interior.png

    In the interiors of gas giants like Jupiter and Saturn, the incredibly thick hydrogen atmosphere extends down for tens of thousands of kilometers. Whereas the pressure at Earth’s surface is around 100,000 Pascals (where a Pascal is a N/m^2), at pressures of tens of Gigapascals (10^10 Pascals) hydrogen can enter a metallic phase, something that should no doubt happen in the interiors of gas giant planets.

    If we could achieve this state of matter, hydrogen would actually become an electrical conductor; this metallic phase is thought to be responsible for the intense magnetic field of Jupiter. All the laws of physics suggest that if this is how matter behaves, and we can do this with hydrogen, then this must also be how antimatter — and hence antihydrogen — behaves, too.

    So all it would take, if you want to destroy an (Earth-like) planet like Alderaan, is a little over a trillion tonnes of metallic antihydrogen, and to transport it down to the planet’s surface. Once it hits the planet’s surface, it should have no trouble clearing a path down near the core, where the densities are highest.

    graph
    Image credit: Wikimedia Commons user AllenMcC, via http://www.gps.caltech.edu/uploads/File/People/dla/DLApepi81.pdf.

    And as matter and antimatter annihilate according to E=mc^2, the result is the release of pure energy. So long as it’s more than the gravitational binding energy of the planet — and that’s not a whole lot of antimatter, mind you — the result could be literally world-ending!

    blast
    Image credit: user Jugus of the Halo Wikia, via http://halo.wikia.com/wiki/Shield_0459. It’s the same idea.

    But if you wanted to destroy an entire planet, it would only take a small amount of antimatter to do the job: just 0.00000002% the mass of the planet in question. For comparison, a single antimatter star — and not necessarily a behemoth, but something like a relatively common A-star like Vega — would be able to undo an entire Milky Way-sized galaxy.
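That 0.00000002% figure is easy to check with a back-of-the-envelope calculation: set the energy released by annihilation equal to the gravitational binding energy of the planet. Here is a rough Python sketch, using standard constants and the uniform-density binding-energy formula (an approximation), and noting that each kilogram of antimatter annihilates a kilogram of ordinary matter, releasing 2mc^2 in total:

```python
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s
M = 5.972e24    # mass of Earth, kg
R = 6.371e6     # radius of Earth, m

# Gravitational binding energy of a uniform-density sphere: U = 3GM^2 / (5R)
U = 3 * G * M**2 / (5 * R)   # roughly 2.2e32 J

# Annihilating m kg of antimatter with m kg of matter releases 2*m*c^2,
# so the antimatter mass needed to unbind the planet is:
m_anti = U / (2 * c**2)      # roughly 1.2e15 kg, a little over a trillion tonnes

fraction = m_anti / M        # roughly 2e-10, i.e. about 0.00000002% of Earth's mass
print(m_anti, fraction)
```

This reproduces both numbers in the article: a little over a trillion tonnes of antihydrogen, and roughly 0.00000002% of the planet’s mass.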

When you think about it, it should make you really, really glad that matter won out over antimatter in the Universe, and that there aren’t starships, planets, stars and galaxies made out of antimatter out there. The way the Universe is coming apart — slowly and gradually — is more than sufficient as-is.

    Leave your planet-destroying comments at the Starts With A Bang forum here!

See the full article, with video, here.

    ScienceSprings relies on technology from

    MAINGEAR computers

    Lenovo
    Lenovo

    Dell
    Dell

     
  • richardmitnick 12:34 pm on September 19, 2014 Permalink | Reply

    From Fermilab: “Frontier Science Result: CMS: Three ways to be invisible” 


Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Friday, Sept. 19, 2014
    Jim Pivarski

There is a common misconception that the LHC was built only to search for the Higgs boson. It is intended to answer many different questions about subatomic particles and the nature of our universe, so the collision data are reused by thousands of scientists, each studying their own favorite questions. Usually, a single analysis only answers one question, but recently one CMS analysis addressed three different new-physics scenarios at once: dark matter, extra dimensions and unparticles.

    CERN CMS New
    CMS

    CERN LHC Grand Tunnel
    CERN LHC Map
    CERN LHC particles
    LHC

    The study focused on proton collisions that resulted in a single jet of particles and nothing else. This can only happen if some of the collision products are invisible — for instance, one proton may emit a jet before collision and the collision itself produces only invisible particles. The jet is needed to be sure that a collision took place, but the real interest is in the invisible part.

    proton
    The quark structure of the proton. The color assignment of individual quarks is arbitrary, but all three colors must be present. Forces between quarks are mediated by gluons

Sometimes, the reason that nothing else was seen in the detector is mundane. Particles may be lost because their trajectories missed the active area of the detector, or because a component of the detector was malfunctioning during the event. More often, the reason is known physics: 20 percent of Z bosons decay into invisible neutrinos. If there were an excess of invisible events, more than predicted by the Standard Model, these extra events would be evidence of new phenomena.
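Experimentally, “invisible” shows up as missing transverse momentum. As a toy illustration (this is invented example code, not CMS software): the colliding protons carry essentially no net transverse momentum, so the visible transverse momenta must sum to zero, and any imbalance is attributed to invisible particles.

```python
import math

def missing_pt(visible):
    """Missing transverse momentum from a list of (pt, phi) visible objects."""
    # Negative vector sum of visible transverse momenta
    px = -sum(pt * math.cos(phi) for pt, phi in visible)
    py = -sum(pt * math.sin(phi) for pt, phi in visible)
    return math.hypot(px, py)

# A "monojet" event: one hard jet and nothing else visible.
monojet = [(250.0, 0.0)]                      # one jet with pT = 250 GeV
balanced = [(100.0, 0.0), (100.0, math.pi)]   # two back-to-back jets

print(missing_pt(monojet))    # 250.0: something unseen recoils against the jet
print(missing_pt(balanced))   # essentially 0: nothing is missing
```

The monojet topology described above is exactly the first case: a large imbalance with a single visible jet.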

    The classic scenario involving invisible particles is dark matter. Dark matter has been observed through its gravitational effects on galaxies and the expansion of the universe, but it has never been detected in the laboratory. Speculations about the nature of dark matter abound, but it will remain mysterious until its properties can be studied experimentally.

    Another way to get invisible particles is through extra dimensions. If our universe has more than three spatial dimensions (with only femtometers of “breathing room” in the other dimensions), then the LHC could produce gravitons that spin around the extra dimensions. Gravitons interact very weakly with ordinary matter, so they would appear to be invisible.

A third possibility is that there is a new form of matter that isn’t made of indivisible particles. These so-called unparticles can be produced in batches of 1½, 2¾, or any other amount. Unparticles, if they exist, would also interact weakly with matter.

    All three scenarios produce something invisible, so if the CMS data had revealed an excess of invisible events, any one of the scenarios could have been responsible. Follow-up studies would have been needed to determine which one it was. As it turned out, however, there was no excess of invisible events, so the measurement constrains all three models at once. Three down in one blow!

    LHC scientists are eager to see what the higher collision energy of Run 2 will deliver.

    See the full article here.

    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.


     
  • richardmitnick 12:15 pm on September 19, 2014 Permalink | Reply
    Tags: Mechanical Engineering

    From M.I.T.: “Fingertip sensor gives robot unprecedented dexterity” 


    MIT News

    September 19, 2014
    Larry Hardesty | MIT News Office

    Researchers at MIT and Northeastern University have equipped a robot with a novel tactile sensor that lets it grasp a USB cable draped freely over a hook and insert it into a USB port.

    hook
    Armed with the GelSight sensor, a robot can grasp a freely hanging USB cable and plug it into a USB port. Photo: Melanie Gonick/MIT

    The sensor is an adaptation of a technology called GelSight, which was developed by the lab of Edward Adelson, the John and Dorothy Wilson Professor of Vision Science at MIT, and first described in 2009. The new sensor isn’t as sensitive as the original GelSight sensor, which could resolve details on the micrometer scale. But it’s smaller — small enough to fit on a robot’s gripper — and its processing algorithm is faster, so it can give the robot feedback in real time.

    gel

    Industrial robots are capable of remarkable precision when the objects they’re manipulating are perfectly positioned in advance. But according to Robert Platt, an assistant professor of computer science at Northeastern and the research team’s robotics expert, for a robot taking its bearings as it goes, this type of fine-grained manipulation is unprecedented.

    “People have been trying to do this for a long time,” Platt says, “and they haven’t succeeded because the sensors they’re using aren’t accurate enough and don’t have enough information to localize the pose of the object that they’re holding.”

    The researchers presented their results at the International Conference on Intelligent Robots and Systems this week. The MIT team — which consists of Adelson; first author Rui Li, a PhD student; Wenzhen Yuan, a master’s student; and Mandayam Srinivasan, a senior research scientist in the Department of Mechanical Engineering — designed and built the sensor. Platt’s team at Northeastern, which included Andreas ten Pas and Nathan Roscup, developed the robotic controller and conducted the experiments.

    Synesthesia

    Whereas most tactile sensors use mechanical measurements to gauge mechanical forces, GelSight uses optics and computer-vision algorithms.

    “I got interested in touch because I had children,” Adelson says. “I expected to be fascinated by watching how they used their visual systems, but I was actually more fascinated by how they used their fingers. But since I’m a vision guy, the most sensible thing, if you wanted to look at the signals coming into the finger, was to figure out a way to transform the mechanical, tactile signal into a visual signal — because if it’s an image, I know what to do with it.”

    A GelSight sensor — both the original and the new, robot-mounted version — consists of a slab of transparent, synthetic rubber coated on one side with a metallic paint. The rubber conforms to any object it’s pressed against, and the metallic paint evens out the light-reflective properties of diverse materials, making it much easier to make precise optical measurements.

    In the new device, the gel is mounted in a cubic plastic housing, with just the paint-covered face exposed. The four walls of the cube adjacent to the sensor face are translucent, and each conducts a different color of light — red, green, blue, or white — emitted by light-emitting diodes at the opposite end of the cube. When the gel is deformed, light bounces off of the metallic paint and is captured by a camera mounted on the same cube face as the diodes.

    From the different intensities of the different-colored light, the algorithms developed by Adelson’s team can infer the three-dimensional structure of ridges or depressions of the surface against which the sensor is pressed.
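The four-color lighting scheme is a variant of photometric stereo. As a rough sketch of the underlying math (this is not Adelson’s actual algorithm; the light directions and the Lambertian, uniform-albedo assumption are invented for illustration), intensities measured under several known light directions determine the surface normal at each pixel by least squares:

```python
import numpy as np

# Unit light directions, one per LED color (values assumed for illustration).
L = np.array([
    [ 0.8,  0.0, 0.6],
    [-0.8,  0.0, 0.6],
    [ 0.0,  0.8, 0.6],
    [ 0.0, -0.8, 0.6],
])

# Simulate a pixel on a tilted patch of the painted gel (albedo rho = 1).
true_n = np.array([0.2, -0.1, 1.0])
true_n /= np.linalg.norm(true_n)
I = L @ true_n                      # Lambertian intensities: I_i = rho * (L_i . n)

# Recover the scaled normal by least squares, then normalize.
g, *_ = np.linalg.lstsq(L, I, rcond=None)
n = g / np.linalg.norm(g)
print(n)                            # recovers true_n
```

Integrating the per-pixel normals then yields the height map of ridges and depressions.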

Judged by its ability to localize an object like a USB plug, even the lower-resolution, robot-mounted version of the GelSight sensor is about 100 times more sensitive than a human finger.

    Plug ‘n play

    In Platt’s experiments, a Baxter robot from MIT spinout Rethink Robotics was equipped with a two-pincer gripper, one of whose pincers had a GelSight sensor on its tip. Using conventional computer-vision algorithms, the robot identified the dangling USB plug and attempted to grasp it. It then determined the position of the USB plug relative to its gripper from an embossed USB symbol. Although there was a 3-millimeter variation, in each of two dimensions, in where the robot grasped the plug, it was still able to insert it into a USB port that tolerated only about a millimeter’s error.

    “Having a fast optical sensor to do this kind of touch sensing is a novel idea,” says Daniel Lee, a professor of electrical and systems engineering at the University of Pennsylvania and director of the GRASP robotics lab, “and I think the way that they’re doing it with such low-cost components — using just basically colored LEDs and a standard camera — is quite interesting.”

    How GelSight fares against other approaches to tactile sensing will depend on “the application domain and what the price points are,” Lee says. “What Rui’s device has going for it is that it has very good spatial resolution. It’s able to see heights on the level of tens of microns. Compared to other devices in the domain that use things like barometers, the spatial resolution is very good.”

“As roboticists, we are always looking for new sensors,” Lee adds. “This is a promising prototype. It could be developed into a practical device.”

    See the full article, with video, here.


     
  • richardmitnick 12:00 pm on September 19, 2014 Permalink | Reply

    From Chandra: “Tarantula Nebula (30 Doradus): A New View of the Tarantula Nebula” 2012 

    NASA Chandra

    April 17, 2012

    A new composite of 30 Doradus (aka, the Tarantula Nebula) contains data from Chandra (blue), Hubble (green), and Spitzer (red). 30 Doradus is one of the largest star-forming regions located close to the Milky Way. This region contains thousands of young massive stars, making it an excellent place to study how stars are born.

    NASA Hubble Telescope
    NASA/ESA Hubble

    NASA Spitzer Telescope
    NASA/Spitzer

    clomp
    Composite

    xray
    X-ray

    infra
    Infrared

    opt
    Optical
    Credit X-ray: NASA/CXC/PSU/L.Townsley et al.; Optical: NASA/STScI; Infrared: NASA/JPL/PSU/L.Townsley et al.
    Release Date April 17, 2012

    To celebrate its 22nd anniversary in orbit, the Hubble Space Telescope has released a dramatic new image of the star-forming region 30 Doradus, also known as the Tarantula Nebula because its glowing filaments resemble spider legs. A new image from all three of NASA’s Great Observatories – Chandra, Hubble, and Spitzer – has also been created to mark the event.

30 Doradus is located in the neighboring galaxy called the Large Magellanic Cloud, and is one of the largest star-forming regions located close to the Milky Way. At the center of 30 Doradus, thousands of massive stars are blowing off material and producing intense radiation along with powerful winds. The Chandra X-ray Observatory detects gas that has been heated to millions of degrees by these stellar winds and also by supernova explosions. These X-rays, colored blue in this composite image, come from shock fronts — similar to sonic booms — formed by this high-energy stellar activity.

    lmc
    Large Magellanic Cloud

The Hubble data in the composite image, colored green, reveal the light from these massive stars along with different stages of star birth, including embryonic stars a few thousand years old still wrapped in cocoons of dark gas. Infrared emission from Spitzer, seen in red, shows cooler gas and dust that have giant bubbles carved into them. These bubbles are sculpted by the same searing radiation and strong winds that come from the massive stars at the center of 30 Doradus.

    See the full article here.

    Another view:

    tr
    This first light image of the TRAPPIST national telescope at La Silla shows the Tarantula Nebula, located in the Large Magellanic Cloud (LMC) — one of the galaxies closest to us. Also known as 30 Doradus or NGC 2070, the nebula owes its name to the arrangement of bright patches that somewhat resembles the legs of a tarantula. Taking the name of one of the biggest spiders on Earth is very fitting in view of the gigantic proportions of this celestial nebula — it measures nearly 1000 light-years across! Its proximity, the favourable inclination of the LMC, and the absence of intervening dust make this nebula one of the best laboratories to help understand the formation of massive stars better. The image was made from data obtained through three filters (B, V and R) and the field of view is about 20 arcminutes across.
    8 June 2010

    ESO TRAPPIST telescope
    ESO Trappist Interior
    ESO/TRAPPIST Telescope

    NASA’s Marshall Space Flight Center in Huntsville, Ala., manages the Chandra program for NASA’s Science Mission Directorate in Washington. The Smithsonian Astrophysical Observatory controls Chandra’s science and flight operations from Cambridge, Mass.


     
  • richardmitnick 11:35 am on September 19, 2014 Permalink | Reply

    From phys.org: “Physical constant is constant even in strong gravitational fields” 

    physdotorg
    phys.org

    September 19, 2014
    Ans Hekkenberg

    An international team of physicists has shown that the mass ratio between protons and electrons is the same in weak and in very strong gravitational fields. Their study, which was partly funded by the FOM Foundation, is published online on 18 September 2014 in Physical Review Letters.

The idea that the laws of physics and their fundamental constants do not depend on local circumstances is called the equivalence principle. This principle is a cornerstone of [Albert] Einstein’s theory of general relativity. To put the principle to the test, FOM physicists working at the LaserLaB at VU University Amsterdam determined whether one fundamental constant, the mass ratio between protons and electrons, depends on the strength of the gravitational field that the particles are in.

    lab
    Picture of the laser system with which the hydrogen molecules were investigated on earth. Credit: LaserLaB VU University Amsterdam/Wim Ubachs

    Laboratories on earth and in space

The researchers compared the proton-electron mass ratio near the surface of a white dwarf star to the mass ratio in a laboratory on Earth. White dwarf stars, which are in a late stage of their life cycle, have collapsed to less than 1% of their original size. The gravitational field at the surface of these stars is therefore much larger than that on Earth, by a factor of 10,000. The physicists concluded that even under these strong gravitational conditions, the proton-electron mass ratio is the same to within a margin of 0.005%. In both cases, the proton mass is 1836.152672 times as big as the electron mass.
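That factor of 10,000 follows directly from the collapse to 1% of the original radius: Newtonian surface gravity goes as M/R^2, so shrinking the radius a hundredfold at fixed mass multiplies it by 100^2. A quick check (a Sun-like progenitor is assumed purely for illustration):

```python
G = 6.674e-11                      # gravitational constant, m^3 kg^-1 s^-2

def surface_gravity(M, R):
    """Newtonian surface gravity, g = G*M/R^2."""
    return G * M / R**2

M_sun, R_sun = 1.989e30, 6.957e8   # solar mass (kg) and radius (m)
g_before = surface_gravity(M_sun, R_sun)         # roughly 274 m/s^2
g_after = surface_gravity(M_sun, 0.01 * R_sun)   # radius collapsed to 1%

print(g_after / g_before)          # about 10,000, since g scales as 1/R^2
```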

    Absorption spectra

    To reach their conclusion, the Dutch physicists collaborated with astronomers of the University of Leicester, the University of Cambridge and the Swinburne University of Technology in Melbourne. The team analysed absorption spectra of hydrogen molecules in white dwarf photospheres (the outer shell of a star from which light is radiated). The spectra were then compared to spectra obtained with a laser at LaserLaB in Amsterdam.

Absorption spectra reveal which radiation frequencies are absorbed by a particle. A small deviation in the proton-electron mass ratio would affect the structure of the molecule, and therefore the absorption spectrum as well. However, the comparison revealed that the spectra were very similar, which shows that the value of the proton-electron mass ratio is indeed independent of the strength of the gravitational field.
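The comparison works because a change in the mass ratio shifts each molecular line by an amount proportional to a line-dependent sensitivity coefficient. The sketch below uses an assumed, illustrative coefficient, not a value from the paper:

```python
def fractional_line_shift(K, dmu_over_mu):
    """Fractional frequency shift of one transition: delta_nu/nu = K * delta_mu/mu."""
    return K * dmu_over_mu

K = -0.02            # assumed sensitivity coefficient for one H2 line (illustrative)
dmu_over_mu = 5e-5   # the 0.005% bound quoted above

print(fractional_line_shift(K, dmu_over_mu))   # about one part in a million
```

Because different lines have different coefficients K, comparing many lines at once separates a genuine change in the mass ratio from an overall Doppler or gravitational redshift, which moves every line by the same fraction.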

    Rock-solid

    FOM PhD student Julija Bagdonaite: “Previously, we confirmed the constancy of this fundamental constant on a cosmological time scale with the Very Large Telescope in Chile. Now we searched for a dependence on strong gravitational fields using the Hubble Space Telescope. Gradually we find that the fundamental constants seem to be rock-solid and eternal.”

    ESO VLT Interferometer
    ESO/VLT

    NASA Hubble Telescope
    NASA/ESA Hubble

    See the full article here.

    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quancast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.


     