Tagged: ars technica

  • richardmitnick 10:41 am on August 26, 2015
    Tags: ars technica

    From ars technica: “Quantum dots may be key to turning windows into photovoltaics” 


    Aug 26, 2015
    John Timmer

Some day, this might generate electricity. Credit: Flickr user Ricardo Wang

    While wind may be one of the most economical power sources out there, photovoltaic solar energy has a big advantage: it can go small. While wind gets cheaper as turbines grow larger, the PV hardware scales down to fit wherever we have infrastructure. In fact, simply throwing solar on our existing building stock could generate a very large amount of carbon-free electricity.

But that also highlights solar’s weakness: we have to install it after the infrastructure is in place, and that installation adds considerably to its cost. Now, researchers have come up with hardware that could allow photovoltaics to be incorporated into a basic building component: windows. The solar windows would filter out a small chunk of the solar spectrum and convert roughly a third of it to electricity.

    As you’re probably aware, photovoltaic hardware has to absorb light in order to work, and a typical silicon panel appears black. So, to put any of that hardware (and its supporting wiring) into a window that doesn’t block the view is rather challenging. One option is to use materials that only capture a part of the solar spectrum, but these tend to leave the light that enters the building with a distinctive tint.

    The new hardware takes a very different approach. The entire window is filled with a diffuse cloud of quantum dots that absorb almost all of the solar spectrum. As a result, the “glass” portion of things simply dims the light passing through the window slightly. (The quantum dots are actually embedded in a transparent polymer, but that could be embedded in or coat glass.) The end result is what optics people call a neutral density filter, something often used in photography. In fact, tests with the glass show that the light it transmits meets the highest standards for indoor lighting.
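
For readers who think in photographic terms, here’s a minimal Python sketch converting the windows’ absorption into standard neutral-density ratings (the 10 and 20 percent absorption figures come from the prototypes described below; the formulas are just the usual photographic definitions):

```python
import math

# Neutral-density ratings for windows that absorb 10% and 20% of the
# incoming light (transmission 0.9 and 0.8), the two prototypes the
# article describes further down.
for transmission in (0.9, 0.8):
    density = -math.log10(transmission)   # photographic optical density
    stops = math.log2(1 / transmission)   # halvings of the light
    print(f"T = {transmission:.0%}: OD {density:.2f}, {stops:.2f} stops")
```

Both work out to a fraction of a stop, far lighter than the ND filters photographers usually carry, which is the point.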

    Of course, simply absorbing the light doesn’t help generate electricity. And, in fact, the quantum dots aren’t used to generate the electricity. Instead, the authors generated quantum dots made of copper, indium, and selenium, covered in a layer of zinc sulfide. (The authors note that there are no toxic metals involved here.) These dots absorb light across a broad band of spectrum, but re-emit it at a specific wavelength in the infrared. The polymer they’re embedded in acts as a waveguide to take many of the photons to the thin edge of the glass.

    And here’s where things get interesting: the wavelength of infrared the quantum dots emit happens to be very efficiently absorbed by a silicon photovoltaic device. So, if you simply place these devices along the edges of the glass, they’ll be fed a steady diet of photons.

The authors model the device’s behavior and find that nearly half the infrared photons end up being fed to the photovoltaic devices (the remainder is converted to heat or escapes the window entirely, in roughly equal amounts). It’s notable that the devices are small, though (about 12cm squares)—larger panes would presumably allow even more photons to escape.

The authors tested two of the devices: one that filtered out 20 percent of the sunlight and one that captured only 10 percent. The low-level filter sent about one percent of the incident light to the sides, while the darker one sent over three percent.
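
A rough sketch of what those percentages mean in electrical terms, assuming full sun and a placeholder 20 percent efficiency for the edge-mounted silicon cells (neither assumption comes from the paper):

```python
# Back-of-the-envelope electrical yield for the two prototype windows.
# The edge-delivery fractions (~1% and ~3% of incident light) are the
# article's measured figures; the irradiance and cell efficiency below
# are assumed placeholder values.
SOLAR_IRRADIANCE = 1000.0   # W per square metre, standard full-sun value
PV_EFFICIENCY = 0.20        # assumed efficiency of the edge-mounted cells

for absorbed, edge_fraction in ((0.10, 0.01), (0.20, 0.03)):
    electrical = SOLAR_IRRADIANCE * edge_fraction * PV_EFFICIENCY
    print(f"window absorbing {absorbed:.0%}: ~{edge_fraction:.0%} of the "
          f"light reaches the edges, ~{electrical:.0f} W per square metre "
          f"of glass becomes electricity")
```

A few watts per square metre is tiny next to a rooftop panel, which is exactly the trade-off the next paragraph describes.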

    There will be losses in the conversion to electricity as well, so this isn’t going to come close to competing with a dedicated panel on a sunny roof. Which is fine, because it’s simply not meant to. Any visit to a major city will serve as a good reminder that we’re regularly building giant walls of glass that currently reflect vast amounts of sunlight, blinding or baking (or both!) the city’s inhabitants on a sunny day. If we could cheaply harvest a bit of that instead, we’re ahead of the game.

    Nature Nanotechnology, 2015. DOI: 10.1038/NNANO.2015.178 (About DOIs).

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon
    Stem Education Coalition
    Ars Technica was founded in 1998 when Founder & Editor-in-Chief Ken Fisher announced his plans for starting a publication devoted to technology that would cater to what he called “alpha geeks”: technologists and IT professionals. Ken’s vision was to build a publication with a simple editorial mission: be “technically savvy, up-to-date, and more fun” than what was currently popular in the space. In the ensuing years, with formidable contributions by a unique editorial staff, Ars Technica became a trusted source for technology news, tech policy analysis, breakdowns of the latest scientific advancements, gadget reviews, software, hardware, and nearly everything else found in between layers of silicon.

Ars Technica innovates by listening to its core readership. Readers have come to demand devotion to accuracy and integrity, paired with a willingness to leave each day’s meaningless, click-bait fodder by the wayside. The result is something unique: the unparalleled marriage of breadth and depth in technology journalism. By 2001, Ars Technica was regularly producing news reports, op-eds, and the like, but the company stood out from the competition by regularly providing long thought-pieces and in-depth explainers.

And thanks to its readership, Ars Technica also accomplished a number of industry-leading moves. In 2001, Ars launched a digital subscription service when such things were non-existent for digital media. Ars was also the first IT publication to begin covering the resurgence of Apple, and the first to draw analytical and cultural ties between the world of high technology and gaming. Ars was also first to begin selling its long-form content in digitally distributable forms, such as PDFs and eventually eBooks (again, starting in 2001).

     
  • richardmitnick 6:54 am on July 28, 2015
    Tags: ars technica

    From Ars Technica: “Inside the world’s quietest room” 


    Jul 28, 2015
    Sebastian Anthony

    In a hole, on some bedrock a few miles outside central Zurich, there lived a spin-polarised scanning electron microscope. Not a nasty, dirty, wet hole: it was a nanotech hole, and that means quiet. And electromagnetically shielded. And vibration-free. And cool.

When you want to carry out experiments at the atomic scale—when you want to pick up a single atom and move it to the other end of a molecule—it requires incredibly exacting equipment. That equipment, though, is worthless without an equally exacting laboratory to put it in. If you’re peering down the (figurative) barrel of a microscope at a single atom, you need to make sure there are absolutely no physical vibrations at all, or you’ll just get a blurry image. Similarly, atoms really don’t like to sit still: you don’t want to spend a few hours setting up a transmission electron microscope (TEM), only to have a temperature fluctuation or EM field imbue the atoms with enough energy to start jumping around of their own accord.

    One solution, as you have probably gathered from the introduction to this story, is to build a bunker deep underground, completely from scratch, with every facet of the project simulated, designed, and built with a singular purpose in mind: to block out the outside world entirely. That’s exactly what IBM Research did back in 2011, when it opened the Binnig and Rohrer Nanotechnology Center.


    The Center, which is located just outside Zurich in Rüschlikon, cost about €80 million (£60 million, $90 million) to build, which includes equipment costs of around €27 million (£20 million, $30 million). IBM constructed and owns the building, but IBM Research and ETH Zurich have shared use of the building and equipment. ETH and IBM collaborate on a lot of research, especially on nanoscale stuff.

The entrance hall to the Binnig and Rohrer Nanotechnology Center.


Deep below the Center there are six quiet rooms—or, to put it another way, rooms that are almost completely devoid of any kind of noise, from acoustic waves to physical vibrations to electromagnetic radiation. Each room is dedicated to a different nanometre-scale experiment: in one room, I was shown a Raman microscope, which is used for “fingerprinting” molecules; in another, a giant TEM, which is like an optical microscope, but it uses a beam of electrons instead of light to resolve details as small as 0.09nm. Every room is eerily deadened and quiet, a stillness belied by the hulking silhouette of the multi-million-pound apparatus sitting in the middle of it. After investigating a few rooms, I notice that my phone is uncharacteristically lifeless. “That’s the nickel-iron box that encases every room,” my guide informs me.

    It’s impossible to go into every design feature of the noise-free rooms, but I’ll run through the most important and the most interesting. For a start, the rooms are built directly on the bedrock, significantly reducing vibrations from a nearby road and an underground train. Then, the walls of each room are clad with the aforementioned nickel-iron alloy, screening against most external electromagnetic fields, including those produced by experiments in nearby rooms. There are dozens of external sources of EM radiation, but the strongest are generated by mobile phone masts, overhead power lines, and the (electric) underground train, all of which would play havoc with IBM’s nanoscale experiments.

Internally, most rooms are divided in two: there’s a small antechamber, which is where the human controller sits, and then the main space with the actual experiment/equipment. Humans generate around 100 watts of heat, and not inconsiderable amounts of noise and vibration, so it’s best to keep them away from experiments while they’re running.

    To provide even more isolation, there are two separate floors in each room: one suspended floor for the scientists to walk on, and another separate floor that only the equipment sits on. The latter isn’t actually a floor: it’s a giant (up-to-68-ton) concrete block that rests on active air suspension. Any vibrations that make it through the bedrock, or that come from large trucks rumbling by, are damped in real time by the air suspension.

    We’re not done yet! To minimise acoustic noise (i.e. sound), the rooms are lined with acoustically absorbent material. Furthermore, if an experiment has noisy ancillary components (a vacuum pump, electrical transformer, etc.), they are placed in another room away from the main apparatus, so that they’re physically and audibly isolated.

And finally, there’s some very clever air conditioning that’s quiet, generates minimal air flux, and is capable of keeping the temperature in the rooms very stable. In every room, the suspended floor (the human-designated bit) is perforated with holes. Cold air slowly seeps out of these holes, rises to the ceiling, and is then sucked out. The air flow was hardly noticeable, except on my ankles: in a moment of unwarranted hipness earlier that morning, I had decided to wear boat shoes without socks.

That’s about it for the major, physical features of IBM Research’s quiet rooms, but there are two other bits that are pretty neat. First, the whole place is lit with LEDs, driven by a DC power supply that is far enough away that its EM emissions don’t interfere. Second, each room is equipped with three pairs of Helmholtz coils, oriented so that they cover the X, Y, and Z axes. These coils are tuned to cancel out any residual magnetic fields, such as the Earth’s, that haven’t already been damped by the various other shields.
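
For a sense of the numbers behind those coils: the field at the centre of a Helmholtz pair is B = (4/5)^(3/2)·μ0·nI/R. The sketch below uses an invented coil geometry, not IBM’s actual design, to estimate the current needed to null the Earth’s roughly 50 µT field:

```python
import math

# Current needed for a Helmholtz pair to null Earth's field.
# B = (4/5)**1.5 * mu0 * n * I / R at the centre of the pair.
# The coil geometry here is hypothetical; only the ~50 microtesla
# Earth field is a real-world number.
MU0 = 4 * math.pi * 1e-7   # vacuum permeability, T*m/A
B_EARTH = 50e-6            # typical Earth field, tesla
n_turns = 100              # assumed turns per coil
radius = 1.5               # assumed coil radius, metres

current = B_EARTH * radius / ((4 / 5) ** 1.5 * MU0 * n_turns)
print(f"~{current:.2f} A per axis to null a {B_EARTH * 1e6:.0f} uT field")
```

In other words, cancelling the Earth’s field is a modest sub-ampere job; the hard part is sensing and tracking the residual fields precisely.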

Labelled images of IBM’s noise-free labs, showing various important features.

    Just how quiet are the rooms?

    So, after all that effort—each of the six rooms cost about €1.4 million to build, before equipment—just how quiet are the rooms below the Binnig and Rohrer Nanotechnology Center? Let’s break it down by the type of noise.

    The temperature at waist height in the rooms is set to 21 degrees Celsius, with a stability of 0.1°C per hour (i.e. it would take an hour for the temperature to rise to 21.1°C).

Electromagnetic fields produced by AC sources are damped to less than 3 nT (nanotesla)—or about 1,500 times weaker than the magnetic field produced by a fridge magnet. Fields from DC sources are damped to less than 20 nT.

    The vibration damping is probably the most impressive: for the equipment on the concrete pedestals, movement is reduced to less than 300nm/s at 1Hz, and less than 10nm/s above 100Hz. These are well below the specs of NIST’s Advanced Measurement Laboratory in Maryland, USA.
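
A quick sanity check on those figures (the fridge-magnet field is simply backed out of the article’s own comparison, and the displacement follows from the standard sinusoidal relation between velocity and position):

```python
import math

# Sanity-check the shielding and vibration figures quoted above.
ac_residual = 3e-9                          # tesla, residual AC field
implied_fridge_field = ac_residual * 1500   # the article's "1,500x" factor
print(f"implied fridge-magnet field: {implied_fridge_field * 1e6:.1f} uT")

# Vibration spec: the pedestals move at less than 300 nm/s at 1 Hz.
# For a sinusoid, velocity amplitude = 2*pi*f*displacement, so:
v, f = 300e-9, 1.0
displacement = v / (2 * math.pi * f)
print(f"peak displacement at 1 Hz: ~{displacement * 1e9:.0f} nm")
```

That works out to a displacement of under 50 nanometres, roughly the size of a large virus, for the multi-ton equipment blocks.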

    Somewhat ironically for the world’s quietest rooms, the weakest link is acoustic noise. Even though the rooms themselves are shielded from outside noises, and the acoustically absorbent material does a good job of stopping internal sound waves dead, there’s no avoiding the quiet hum of some of the machines or the slight susurration of the ventilation system.

    The acoustic noise level in the rooms is always below 30 dB, dipping down as low as 21 dB if there isn’t a noisy experiment running. In human terms, the rooms were definitely quiet, but not so quiet that I could feel my sanity slipping away, or anything crazy like that. I was a little bit disappointed that I couldn’t hear my various internal organs shifting around, truth be told.

    Why did IBM build six of these rooms?

    “You’re only as good as your tools.” It’s a trite, overused statement, but in this case it perfectly describes why IBM and ETH Zurich spent so many millions of euros on the quiet rooms.

    Big machines like the TEM or spin-SEM need to be kept very still, with as little outside interference as possible: if you can’t stay within the machine’s nominal operational parameters, you’re not going to get much scientifically useful data out of it.

    On the flip side, however, if you surpass the machine’s optimal parameters—if you reduce the amount of vibration, noise, etc. beyond the “recommended specs”—then you can produce images and graphs with more resolution than even the manufacturer thought possible.

IBM Research’s spin-SEM, for example, used to be located in the basement of the main building, on a normal concrete floor. After the machine was relocated to the quiet rooms, the lead scientist who uses the spin-SEM said its resolution is 2-3 times better (an utterly huge gain, in case you were wondering).

    For much the same reason, my guide said that “several tooling manufacturers” have contacted IBM Research to ask if they can test their equipment in the noise-free labs: they want to see just how well it will perform under near-perfect conditions.

    The best story, though, I saved for last. Back in the ’80s and ’90s, before the Center was built, the IBM researchers didn’t have a specialised nanotechnology facility: they just worked in their labs, which were usually located down in the basement. When Gerd Binnig and Heinrich Rohrer invented the scanning tunnelling microscope (STM)—an achievement that would later net them a Nobel prize—they worked in the dead of night to minimise vibrations from the nearby road and other outside interference.

After the new building was finished—which, incidentally, is named after Binnig and Rohrer—my guide spoke to some IBM retirees who had just finished inspecting the noise-free rooms. “We wish we’d had these rooms back in the ’80s and ’90s, so that we didn’t have to work at 3am,” they said.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 11:36 am on July 14, 2015
    Tags: ars technica, Coma Cluster

    From ars technica: “Huge population of “Ultra-Dark Galaxies” discovered” 


    Jul 11, 2015
    Xaq Rzetelny


    About 321 million light-years away from us is the Coma Cluster, a massive grouping of more than 1,000 galaxies.

A Sloan Digital Sky Survey/Spitzer Space Telescope mosaic of the Coma Cluster in long-wavelength infrared (red), short-wavelength infrared (green), and visible light. The many faint green smudges are dwarf galaxies in the cluster.
Credit: NASA/JPL-Caltech/GSFC/SDSS

Some of its galaxies are a little unusual, however: they’re incredibly dim. So dim, in fact, that they have earned the title of “Ultra-Dark Galaxies” (UDGs). (The formal term is actually “Ultra-Diffuse Galaxies,” as their visible matter is thinly spread, though “ultra-dark” has been used by some sources and, let’s face it, sounds a lot better.) They were first reported earlier this year in a study that identified 47 such galaxies.

    Dimness isn’t necessarily unusual in a galaxy. Most of a galaxy’s light comes from its stars, so the smaller a galaxy is (and thus the fewer stars it has), the dimmer it will be. We’ve found many dwarf galaxies that are significantly dimmer than their larger cousins.

    What was so unusual about these 47 is that they’re not small enough to account for their dimness. In fact, many of them are roughly the size of our own Milky Way (ranging in diameter from 1.5 to 4.6 kiloparsecs, compared with the Milky Way’s roughly 3.6) but have only roughly one thousandth of the Milky Way’s stars. The authors of the recent study interpret this to mean that these galaxies must be even more dominated by dark matter than are ordinary galaxies.
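
The standard astronomical magnitude scale makes that dimness concrete: a Milky Way-sized galaxy with a thousandth of the stars is about 7.5 magnitudes fainter per unit area. A one-line check in Python:

```python
import math

# A galaxy with ~1/1000 of the Milky Way's stars at roughly the same
# size has ~1/1000 the surface brightness. Each factor of 100 in
# brightness is defined as 5 magnitudes:
flux_ratio = 1 / 1000
delta_mag = -2.5 * math.log10(flux_ratio)
print(f"~{delta_mag:.1f} magnitudes fainter per unit area")  # 7.5
```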

    Finding the dark

Intrigued by this tantalizing observation, a group of researchers conducted a more detailed study. Using archival data from the 8.2-meter Subaru telescope, they examined the sky region in question and discovered more UDGs—854 of them. Given that the images they were working with don’t cover the full cluster, the researchers estimated that there should be roughly 1,000 UDGs visible in the cluster altogether.

The NAOJ Subaru Telescope (exterior and interior). Credit: NAOJ

    There are a lot of small caveats to this conclusion. First of all, it’s not certain that all these galaxies are actually in the Coma Cluster, as some might just be along the same line of sight. However, it’s very likely that most of them do lie within the cluster. If the UDGs aren’t part of the cluster, then they’re probably a typical sample of what we’d observe in any patch of sky the same size as the Subaru observation. If that’s true, then the Universe has an absurdly high number of UDGs, and we should have seen more of them already.

    In this particular patch of sky, the concentration of UDGs is stronger towards the center of the Coma Cluster. While that doesn’t prove they’re part of the cluster, it’s strongly suggestive.

    Dark tug-of-war

    The dim galaxies’ relationship to the cluster probably has something to do with the mechanism that made the UDGs so dark in the first place. These galaxies would have had an ample supply of gas with which to make stars, so something must have prevented that from happening. This could be because the gas was somehow stripped from its galaxy or because something cut off a supply of gas from elsewhere.

    The dense environment in the cluster might be responsible for this. Gravitational interactions can pull the galaxies apart or strip them of their gas. These encounters can also deplete the gas near the galaxies, cutting off the inflow of new material. Since there are plenty of galaxies swirling around in the dense cluster, there are plenty of opportunities for this to happen to an unfortunate galaxy. The victims of these vampiric attacks might become dark, losing their ability to form stars. Neither living nor dead, these bodies still roam the Universe, perhaps waiting to strip unsuspecting galaxies of their gas.

But unlike those bitten by movie vampires, the galaxies have a way to fight back. Rather than letting their blood (or in this case gas) get sucked away, the galaxy’s own gravity can hang onto it. And since most of a galaxy’s mass comes in the form of dark matter, the mysterious substance is pretty important in the tug-of-war over the galaxy’s star-forming material. The more dark matter a galaxy’s got, the more likely it is to hold onto its material when other galaxies pass by.

    “We believe that something invisible must be protecting the fragile star systems of these galaxies, something with a high mass,” said Jin Koda, an astrophysicist with Stony Brook University and the paper’s lead author. “That ‘something’ is very likely an excessive amount of dark matter.”

    The role dark matter plays in this struggle is useful for researchers here on Earth. If they want to find out how much dark matter one of these UDGs has, all they have to do is look at how much material the galaxy has held onto. While the results of an encounter between galaxies are complicated and dependent on many factors, this technique can at least give them a rough idea.
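
One textbook way to formalize this tug-of-war is the tidal (Jacobi) radius, which estimates how far from a satellite galaxy’s centre material stays bound against the cluster’s pull. The sketch below uses invented round numbers, not values from the study, purely to show the shape of the estimate:

```python
# Illustrative tidal-radius estimate: a galaxy of mass m orbiting at
# distance D from a cluster of mass M keeps material inside roughly
#     r_t ~ D * (m / (3 * M)) ** (1/3)    (the Jacobi radius).
# If a UDG of a known size survives intact, its total mass m must be
# large enough that r_t exceeds that size. All values are hypothetical.
D = 300.0   # kpc, assumed distance from the cluster centre
M = 1e15    # solar masses, rough mass of a Coma-like cluster
m = 1e10    # solar masses, trial total mass for the UDG

r_t = D * (m / (3 * M)) ** (1 / 3)
print(f"tidal radius ~ {r_t:.1f} kpc")  # ~4.5 kpc for these inputs
```

Turn the logic around (fix the observed size and solve for m) and you get the kind of rough dark matter mass estimate the researchers describe.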

    Close to the core

    Near the core of the Coma Cluster, there’s a higher density of galaxies, and so many more opportunities for galaxies to lose their gas in encounters. Tidal forces are much stronger there, and as such it takes more dark matter to continue to hold onto material.

    The earlier study’s smaller sample of UDGs didn’t see any of them very close to the core, and it seemed safe to assume any potential UDGs deeper in had been ripped apart entirely. That provided a clue as to the amount of dark matter these galaxies contain: not enough to hold them together in the core. The authors of that study used this information to put an upper limit on the percentage of dark matter in the UDGs, but it was very high—up to 98 percent. But even galaxies with 98 percent dark matter shouldn’t survive in the rough center of the cluster.

Thus, in the new study, researchers didn’t expect to find UDGs any closer to the core. But they did. These galaxies are less clearly resolved because, in the cluster’s center, more interference from background objects mucks up the view. But assuming they have been correctly identified, they’ve got even more dark matter than the previous estimate: greater than 99 percent. There can be no doubt these UDGs live up to their (unofficial) name, as everything else the UDG includes—stars, black holes, planets, gas—makes up less than one percent of the galaxy’s mass.

    Into the dark

The discovery of so many dark galaxies in the Coma Cluster is a stride forward in the exploration of these objects. (Note: some of the objects included in the study had been previously discovered and were included in galaxy catalogs, but they were inconsistently classified, with many of them not identified as UDGs at all). The study’s large sample size strengthens its conclusions and also provides a more detailed picture of how these dark galaxies come to be.

    Many questions remain for future work to address, however. It’s still not known exactly how many of the objects identified in the study are actually part of the Coma Cluster, though it is likely that most are. Another question is whether the Coma Cluster’s UDG distribution is typical of other clusters, which will determine how well the findings of this study can be extrapolated elsewhere in the Universe. Modeling should also provide a more detailed look into the complex interactions of galaxies in the cluster, including the exact mechanisms responsible for the creation of UDGs.

    And crucially, UDGs offer an excellent opportunity to observe and study dark matter. Situations like this one, where dark matter’s interactions with baryonic (ordinary) matter can be observed, are ripe for study.

“This discovery of dark galaxies may be the tip of the iceberg,” said Dr. Koda. “We may find more if we look for fainter galaxies embedded in a large amount of dark matter with the Subaru Telescope, and additional observations may expose this hidden side of the Universe.”

    The Astrophysical Journal Letters, 2015. DOI: 10.1088/2041-8205/807/1/L2 (About DOIs)

Surprisingly, the institution responsible for this research is not named, nor are we given the names of the team members and their affiliations.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon
    Stem Education Coalition

     
  • richardmitnick 10:55 am on March 18, 2015
    Tags: ars technica

    From ars technica: “Shining an X-Ray torch on quantum gravity” 


    Mar 17, 2015
    Chris Lee

This free electron laser could eventually provide a test of quantum gravity. Credit: BNL

    Quantum mechanics has been successful beyond the wildest dreams of its founders. The lives and times of atoms, governed by quantum mechanics, play out before us on the grand stage of space and time. And the stage is an integral part of the show, bending and warping around the actors according to the rules of general relativity. The actors—atoms and molecules—respond to this shifting stage, but they have no influence on how it warps and flows around them.

This is puzzling to us. Why is it such a one-directional thing: general relativity influences quantum mechanics, but quantum mechanics has no influence on general relativity? It’s a puzzle that is born of human expectation rather than evidence. We expect that, since quantum mechanics is punctuated by sharp jumps, somehow space and time should do the same.

    There’s also the expectation that, if space and time acted a bit more quantum-ish, then the equations of general relativity would be better behaved. In general relativity, it is possible to bend space and time infinitely sharply. This is something we simply cannot understand: what would infinitely bent space look like? To most physicists, it looks like something that cannot actually be real, indicating a problem with the theory. Might this be where the actors influence the stage?

    Quantum mechanics and relativity on the clock

    To try and catch the actors modifying the stage requires the most precise experiments ever devised. Nothing we have so far will get us close, so a new idea from a pair of German physicists is very welcome. They focus on what’s perhaps the most promising avenue for detecting quantum influences on space-time: time-dilation experiments. Modern clocks rely on the quantum nature of atoms to measure time. And the flow of time depends on relative speed and gravitational acceleration. Hence, we can test general relativity, special relativity, and quantum mechanics all in the same experiment.

To get an idea of how this works, let’s take a look at the traditional atomic clock. In an atomic clock, we carefully prepare some atoms in a predefined superposition state: that is, the atom is prepared such that it has a fifty percent chance of being in state A, and a fifty percent chance of being in state B. As time passes, the environment around the atom forces the superposition state to change. At some later point, it will have a seventy-five percent chance of being in state A; even later, it will certainly be in state A. Keep on going, however, and the chance of being in state A starts to shrink, and it continues to do so until the atom is certainly in state B. Provided that the atom is undisturbed, these oscillations will continue.
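
A minimal sketch of that oscillation, treating the atom as an idealized two-level system with an arbitrary period (this is the generic textbook picture, not the physics of any particular clock):

```python
import math

def prob_state_A(t, period=1.0):
    """Two-level superposition: P(A) starts at 0.5, rises to 1,
    falls through 0.5 to 0, and returns: one full cycle per period."""
    return 0.5 * (1 + math.sin(2 * math.pi * t / period))

for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"t = {t:.2f}: P(A) = {prob_state_A(t):.2f}")
```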

    These periodic oscillations provide the perfect ticking clock. We simply define the period of an oscillation to be our base unit of time. To couple this to general relativity measurements is, in principle, rather simple. Build two clocks and place them beside each other. At a certain moment, we start counting ticks from both clocks. When one clock reaches a thousand (for instance), we compare the number of ticks from the two clocks. If we have done our job right, both clocks should have reached a thousand ticks.

If we shoot one into space, however, and perform the same experiment, relativity demands that the clock in orbit record more ticks than the clock on Earth. The way we record the passing of time is via a phenomenon that is purely quantum in nature, while the passing of time is modified by gravity. These experiments work really well. But at present, they are not sensitive enough to detect any deviation from either quantum mechanics or general relativity.
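
As a hedged worked example with textbook values (a GPS-style orbit; nothing here comes from the article itself): the gravitational term makes the orbiting clock run fast, the velocity term makes it run slow, and the net effect is a few tens of microseconds per day.

```python
import math

# Gravitational vs. velocity time dilation for a GPS-like clock,
# using standard textbook values.
GM = 3.986e14        # Earth's gravitational parameter, m^3/s^2
c = 2.998e8          # speed of light, m/s
r_ground = 6.371e6   # Earth's surface radius, m
r_orbit = 2.656e7    # GPS orbital radius (~20,200 km altitude), m

grav = GM * (1 / r_ground - 1 / r_orbit) / c**2   # orbiting clock runs fast
v_orbit = math.sqrt(GM / r_orbit)                 # circular orbital speed
vel = v_orbit**2 / (2 * c**2)                     # orbiting clock runs slow

net = grav - vel
print(f"net fractional rate difference: {net:.2e}")
print(f"~{net * 86400 * 1e6:.0f} microseconds gained per day")  # ~38-39
```

GPS satellites really do carry this correction; it is one of the cleanest everyday confirmations of quantum timekeeping and relativity working together.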

    Going nuclear

That’s where the new ideas come in. The researchers propose, essentially, to create something similar to an atomic clock, but instead of tracking the oscillation of atomic states, they want to track nuclear states. Usually, when I discuss atoms, I ignore the nucleus entirely. Yes, it is there, but I only really care about the influence the nucleus has on the energetic states of the electrons that surround it. However, in one key way the nucleus is just like the electron cloud that surrounds it: it has its own set of energetic states. It is possible to excite nuclear states (using X-Ray radiation) and, afterwards, they will return to the ground state by emitting an X-Ray.

    So let’s imagine that we have a crystal of silver sitting on the surface of the Earth. The silver atoms all experience a slightly different flow of time because the atoms at the top of the crystal are further away from the center of the Earth compared to the atoms at the bottom of the crystal.

    To kick things off, we send in a single X-Ray photon, which is absorbed by the crystal. This is where the awesomeness of quantum mechanics puts on sunglasses and starts dancing. We don’t know which silver atom absorbed the photon, so we have to consider that all of them absorbed a tiny fraction of the photon. This shared absorption now means that all of the silver atoms enter a superposition state of having absorbed and not absorbed a photon. This superposition state changes with time, just like in an atomic clock.

    In the absence of an outside environment, all the silver atoms will change in lockstep. And when the photon is re-emitted from the crystal, all the atoms will contribute to that emission. So each atom behaves as if it is emitting a partial photon. These photons add together, and a single photon flies off in the same direction as the absorbed photon had been traveling. Essentially because all the atoms are in lockstep, the charge oscillations that emit the photon add up in phase only in the direction that the absorbed photon was flying.

    Gravity, though, causes the atoms to fall out of lockstep. So when the time comes to emit, the charge oscillations are all slightly out of phase with each other. But they are not random: those at the top of the crystal are just slightly ahead of those at the bottom of the crystal. As a result, the direction for which the individual contributions add up in phase is not in the same direction as the flight path of the absorbed photon, but at a very slight angle.

How big is this angle? That depends on the size of the crystal and how long it takes the environment to randomize the emission process. For a crystal of silver atoms that is less than 1mm thick, the angle could be as large as 100 micro-degrees, which is small but probably measurable.

Spinning crystals

    That, however, is only the beginning of a seam of clever. If the crystal is placed on the outside of a cylinder and rotated during the experiment, then the top atoms of the crystal are moving faster than the bottom, meaning that the time-dilation experienced at the top of the crystal is greater than that at the bottom. This has exactly the same effect as placing the crystal in a gravitational field, but now the strength of that field is governed by the rate of rotation.

    In any case, by spinning a 10mm diameter cylinder very fast (70,000 revolutions per second), the angular deflection is vastly increased. For silver, for instance, it reaches 90 degrees. With such a large signal, even smaller deviations from the predictions of general relativity should be detectable in the lab. Importantly, these deviations happen on very small length scales, where we would normally start thinking about quantum effects in matter. Experiments like these may even be sensitive enough to see the influence of quantum mechanics on space and time.
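
Plugging the article’s own figures into the standard velocity time-dilation formula shows the scale involved (the 1 mm crystal thickness is carried over from the static example above; treating the inner face as sitting 1 mm below the rim is my assumption):

```python
import math

# Rim speed and time dilation for the spinning-crystal scheme, using
# the figures quoted above: a 10 mm diameter cylinder at 70,000 rev/s,
# carrying a crystal assumed to be 1 mm thick.
c = 2.998e8                 # speed of light, m/s
freq = 70_000               # revolutions per second
r_outer = 0.005             # m (10 mm diameter cylinder)
r_inner = r_outer - 0.001   # m, assumed inner face of the crystal

for label, r in (("outer face", r_outer), ("inner face", r_inner)):
    v = 2 * math.pi * r * freq
    dilation = v**2 / (2 * c**2)   # fractional slowing of time
    print(f"{label}: v = {v:,.0f} m/s, fractional dilation = {dilation:.2e}")
```

The difference between the two faces is what drives the atoms out of lockstep, just as gravity does in the static case.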

    A physical implementation of this experiment will be challenging but not impossible. The biggest issue is probably the X-Ray source and doing single photon experiments in the X-Ray regime. Following that, the crystals need to be extremely pure, and something called a coherent state needs to be created within them. This is certainly not trivial. Given that it took atomic physicists a long time to achieve this for electronic transitions, I think it will take a lot more work to make it happen at X-Ray frequencies.

On the upside, free electron lasers have come a very long way, and they offer much better control over beam intensity and stability. This is, hopefully, the sort of challenge that beam-line scientists live for.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon
    Stem Education Coalition

     
  • richardmitnick 11:07 am on March 9, 2015
    Tags: ars technica

    From ars technica: “Imaging a supernova with neutrinos” 


    Mar 4, 2015
    John Timmer

Two men in a rubber raft inspect the wall of photodetectors of the partly filled Super-Kamiokande neutrino detector. Credit: BNL

There are lots of ways to describe how rarely neutrinos interact with normal matter. Duke’s Kate Scholberg, who works on them, provided yet another. A 10 mega-electron-volt (MeV) gamma ray will, on average, go through 20 centimeters of carbon before it’s absorbed; a 10 MeV neutrino will go a light year. “It’s called the weak interaction for a reason,” she quipped, referring to the weak-force-generated processes that produce and absorb these particles.
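
That comparison in raw numbers, as a one-line check:

```python
# Scholberg's comparison: a 10 MeV gamma ray is absorbed in ~20 cm of
# carbon, while a 10 MeV neutrino travels about a light year.
LIGHT_YEAR_M = 9.461e15   # metres in one light year
gamma_path = 0.20         # metres
print(f"neutrino path is ~{LIGHT_YEAR_M / gamma_path:.0e} times longer")
```

That ratio, nearly 5 × 10^16, is why neutrino detectors have to be enormous and why a supernova’s flood of particles is such a gift.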

    But there’s one type of event that produces so many of these elusive particles that we can’t miss it: a core-collapse supernova, which occurs when a star can no longer produce enough energy to counteract the pull of gravity. We typically spot these through the copious amounts of light they produce, but in energetic terms, that’s just a rounding error: Scholberg said that 99 percent of the gravitational energy of the supernova goes into producing neutrinos.

    Within instants of the start of the collapse, gravity forces electrons and protons to fuse, producing neutrons and releasing neutrinos. While the energy that goes into producing light gets held up by complicated interactions with the outer shells of the collapsing star, neutrinos pass right through any intervening matter. Most of them do, at least; there are so many produced that their rare interactions collectively matter, though our supernova models haven’t quite settled on how yet.

    But our models do say that, if we could detect them all, we’d see their flavors (neutrinos come in three of them) change over time, and distinct patterns of emission during the star’s infall, accretion of matter, and then post-supernova cooling. Black hole formation would create a sudden stop to their emission, so they could provide a unique window into the events. Unfortunately, there’s the issue of too few of them interacting with our detectors to learn much.

The last nearby supernova, SN 1987a, produced a burst of 20 electron antineutrinos, detected about 2.5 hours before the light from the explosion became visible.

Remnant of SN 1987A seen in light overlays of different spectra. ALMA data (radio, in red) shows newly formed dust in the center of the remnant. Hubble (visible, in green) and Chandra (X-ray, in blue) data show the expanding shock wave.


    (Scholberg quipped that the Super-Kamiokande detector “generated orders of magnitude more papers than neutrinos.”) But researchers weren’t looking for this, so the burst was only recognized after the fact.


That’s changed now. Researchers can go to a Web page hosted by Brookhaven National Lab and have an alert sent to them if any of a handful of detectors pick up a burst of neutrinos. (The Daya Bay, IceCube, and Super-Kamiokande detectors are all part of this program.) When the next burst of neutrinos arrives, astronomers will be alert and searching for the source.


    “The neutrinos are coming!” Scholberg said. “The supernovae have already happened, their wavefronts are on their way.” She said estimates are that there are three core collapse supernovae in our neighborhood each century and, by that measure, “we’re due.”

If that supernova occurs in the galactic core, it will put on quite a show. Rather than detecting individual events, the entire area of ice monitored by the IceCube detector will end up glowing. The Super-Kamiokande detector will see 10,000 individual neutrinos; “It will light up like a Christmas tree,” Scholberg said.

It’ll be an impressive show, and it’s one that I’m sure most physicists (along with me) hope happens in their lifetimes. But if it takes a little time, the show may be even better. There are apparently plans afoot to build a “Hyper-Kamiokande,” which would be able to detect 100,000 neutrinos from a galactic core supernova. Imagine how many papers that would produce.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon
    Stem Education Coalition

     
  • richardmitnick 9:02 pm on October 5, 2014
    Tags: ars technica

    From ars technica: “Exploring the monstrous creatures at the edges of the dark matter map” 


Sept 30, 2014
    Matthew Francis

    So far, we’ve focused on the simplest dark matter models, consisting of one type of object and minimal interactions among individual dark matter particles. However, that’s not how ordinary matter behaves: the interactions among different particle types enable the existence of atoms, molecules, and us. Maybe the same sort of thing is true for dark matter, which could be subject to new forces acting primarily between particles.

    Some theories describe a kind of “dark electromagnetism” where particles carry charges like electricity, but they’re governed by a force that doesn’t influence electrons and the like. Just as normal electromagnetism describes light, these models include “dark photons,” which sound like something from the last season of Star Trek: The Next Generation (after the writers ran out of ideas).

Diagram of a solenoid and its magnetic field lines. The shape of all lines is correct according to the laws of electrodynamics.

    Like many WDM candidates, dark photons would be difficult—if not impossible—to detect directly, but if they exist, they would carry energy away from interacting dark matter systems. That would be detectable by its effect on things like the structure of neutron stars and other compact astronomical bodies. Observations of these objects would let researchers place some stringent limits on the strength of dark forces. Another consequence is that dark forces would tend to turn spherical galactic halos into flatter, more disk-like structures. Since we don’t see that in real galaxies, there are strong constraints on how much dark forces can affect dark matter motion.

The “Sombrero” galaxy shows that matter interacting with itself flattens into disks. Dark matter doesn’t seem to do that, limiting the strength of possible interactions between particles.
Credit: NASA, ESA, and The Hubble Heritage Team (STScI/AURA)


    Another side effect of dark forces is that there should be dark antimatter and dark matter-antimatter annihilation. The results of such interactions could include ordinary photons, another intriguing hint in the wake of observations of excess gamma-rays, possibly due to dark matter annihilation in the Milky Way and other galaxies.

    What’s cooler than cold dark matter?

    While most low-mass particles are “hot,” a hypothetical particle known as the axion is an exception. Axions were first predicted as a solution to a thorny problem in the physics of the strong nuclear force, but certain properties make them appealing as dark matter candidates. Mainly, they are electrically neutral and don’t interact directly with ordinary matter except through gravity.

    Axions are also very low-mass (at least in one proposed version), but unlike hot dark matter, they “condensed” in the early Universe into a slow, thick soup. In other words, they behave much like cold dark matter, but without the large mass usually implied by the term.

    Axions aren’t part of the Standard Model, but in a sense they’re a minimally invasive addition. Unlike supersymmetry, which involves adding one particle for each type in the Standard Model, axions are just one particle type, albeit one with some unique properties. (To be fair, these aren’t mutually exclusive concepts: it’s possible both SUSY particles and axions are real, and some versions of SUSY even include a hypothetical partner for axions.)

The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Like WDM, axions don’t interact directly with ordinary matter. But according to theory, in a strong magnetic field, axions and photons can oscillate into each other, switching smoothly between particle types. That means axions could be created all the time near black holes, neutron stars, or other places with intense magnetic fields—possibly including superconducting circuits here on Earth. This is how experiments hunt for axions, most notably the Axion Dark Matter eXperiment (ADMX).

    So far, no experiment has turned up axions, at least of the type we’d expect to see. Particle physics has a lot of wiggle-room for possibilities, so it’s too soon to say no axions exist, but axion partisans are disappointed. A universe with axions makes more sense than one without, but it wouldn’t be the first time something that really seemed to be a good idea didn’t quite work out.

    A physicist’s fear

Long as it is becoming, this list is far from complete. We’ve excluded exotic particles with sufficiently tiny electric charges to be nearly invisible, weird (but unlikely) interactions that change the character of known particles under special circumstances, plus a number of other possibilities. One interesting candidate is jokingly known as the WIMPzilla, which consists of one or more particle types more than a trillion times the mass of a proton. These would have been born at a much earlier era than WIMPs, when the Universe was even hotter. Because they are so much heavier, WIMPzillas can be rarer and interact more readily with normal matter, but—as with other more exotic candidates—they aren’t really considered to be a strong possibility.

If the leading ideas for dark matter don’t hold up to experimental scrutiny, then we’ve definitely sailed off the map into the unknown.
Credit: Castle Gallery, College of New Rochelle

    And more non-WIMP dark matter candidates seem to crop up every year, though many are implausible enough they won’t garner much attention even from other theorists. However, each guess—even unlikely ones—can help us understand what dark matter can be, and what it can’t.

    We’ve also omitted a whole other can of worms known as “modified gravity”—a proposition that the matter we see is all there is, and the observational phenomena that don’t make sense can be explained by a different theory of gravity. So far, no modified gravity model has reproduced all the observed phenomena attributed to dark matter, though of course that doesn’t say it can never happen.

    To put it another way: most astronomers and cosmologists accept that dark matter exists because it’s the simplest explanation that accounts for all the observational data. If you want a more grumpy description, you could say that dark matter is the worst idea, except for all the other options.

    Of course, Nature is sly. Perhaps more than one of these dark matter candidates is out there. A world with both axions and WIMPs—motivated as they are by different problems arising from the Standard Model—would be confounding but not beyond reason. Given the unexpected zoo of normal particles discovered in the 20th century, maybe we’ll be pleasantly surprised; after all, wouldn’t it be nice if several of our hypotheses were simultaneously correct for once? (I’m a both/and kind of guy.) More than one type might also help explain why we have yet to see any dark matter in our detectors so far. If a substantial fraction of dark matter particles is made of axions, then the density of WIMPs or WDM must be correspondingly lower and vice versa.

    But a bigger worry lurks in the minds of many researchers. Maybe dark matter doesn’t interact with ordinary matter at all, and it doesn’t annihilate in a way we can detect easily. Then the “dark sector” is removed from anything we can probe experimentally, and that’s an upsetting thought. Researchers would have a hard time explaining how such particles came to be after the Big Bang, but worse: without a way to study their properties in the lab, we would be stuck with the kind of phenomenology we have now. Dark matter would be perpetually assigned to placeholder status.

    In old maps made by European cartographers, distant lands were sometimes shown populated by monstrous beings. Today of course, everyone knows that those lands are inhabited by other human beings and creatures that, while sometimes strange, aren’t the monsters of our imagination. Our hope is that the monstrous beings of our theoretical space imaginings will some day seem ordinary, too, and “dark matter” will be part of physics as we know it.

    See the full article here.


ScienceSprings relies on technology from MAINGEAR, Lenovo, and Dell.

     
  • richardmitnick 12:06 pm on July 29, 2014
    Tags: ars technica

    From ARS Technica: “Dark matter makes up 80% of the Universe—but where is it all?” 


July 27, 2014
    Matthew Francis

    It’s in the room with you now. It’s more subtle than the surveillance state, more transparent than air, more pervasive than light. We may not be aware of the dark matter around us (at least without the ingestion of strong hallucinogens), but it’s there nevertheless.

Composite image of X-ray (pink) and weak gravitational lensing (blue) of the famous Bullet Cluster of galaxies.
X-ray: NASA/CXC/CfA/M.Markevitch et al.; Lensing map: NASA/STScI, ESO WFI, Magellan/U.Arizona/D.Clowe et al.; Optical: NASA/STScI, Magellan/U.Arizona/D.Clowe et al.

Although we can’t see dark matter, we know a bit about how much there is and where it’s located. Measurements of the cosmic microwave background show that 80 percent of the total mass of the Universe is made of dark matter, but they can’t tell us exactly where that matter is located. From theoretical considerations, we expect some regions—the cosmic voids—to have little or none of the stuff, while the central regions of galaxies have high density. As with so many things involving dark matter, though, it’s hard to pin down the details.

Cosmic microwave background map. Credit: ESA/Planck

    Unlike ordinary matter, we can’t see where dark matter is by using the light it emits or absorbs. Astronomers can only map dark matter’s distribution using its gravitational effects. That’s especially complicated in the denser parts of galaxies, where the chaotic stew of gas, stars, and other forms of ordinary matter can mask or mimic the presence of dark matter. Even in the galactic suburbs or intergalactic space, dark matter’s transparency to all forms of light makes it hard to locate with precision.

    Despite that difficulty, astronomers are making significant progress. While individual galaxies are messy, analyzing surveys of huge numbers of them can provide a gravitational map of the cosmos. Astronomers also hope to overcome the messiness of galaxies and estimate how much dark matter must be in the central regions using careful observation of the motion of stars and gas.

    There’s also been a tantalizing hint of dark matter particles themselves in the form of a signal that may come from their annihilation near the center of the Milky Way. If this is borne out by other observations, it could constrain dark matter’s properties while avoiding messy gravitational considerations. Adding it all up, it’s a promising time for mapping the location of dark matter, even as researchers still build particle detectors to identify what it is.

    A (very) brief history of dark matter

    In the 1930s, Fritz Zwicky measured the motion of galaxies within the Coma galaxy cluster. Based on simple gravitational calculations, he found that they shouldn’t move as they did unless the cluster contained a lot more mass than he could see. As it turned out, Zwicky’s estimates of how much matter there was were too large by a huge factor. Still, he was correct in the broader picture: more than 80 percent of a galaxy cluster’s mass isn’t in the form of atoms.

    Zwicky’s work didn’t get a lot of attention at the time, but Vera Rubin’s later observations of spiral galaxies were another matter. She found that the combined stars and gas had too little mass to explain the rotation rates she measured. Between Rubin’s work and subsequent measurements, astronomers established that every spiral galaxy is engulfed by a roughly spherical halo (as it is called) of matter—matter that’s transparent to every form of light.
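
    To see why flat rotation curves demand unseen mass, here is a minimal illustrative sketch; the luminous mass and radii are assumed round numbers for a Milky Way-like galaxy, not Rubin’s data. If essentially all the mass were the visible matter concentrated toward the center, orbital speeds should fall off with radius; observed curves instead stay roughly flat.

        import math

        G = 6.674e-11          # m^3 kg^-1 s^-2
        M_SUN = 1.989e30       # kg
        KPC = 3.086e19         # one kiloparsec, m

        M_visible = 1e11 * M_SUN   # assumed luminous mass of the galaxy

        for r_kpc in (5, 10, 20, 40):
            r = r_kpc * KPC
            # Speed expected if only the central visible mass pulls on a star:
            v = math.sqrt(G * M_visible / r) / 1e3   # km/s
            print(f"r = {r_kpc:>2} kpc: Keplerian v ~ {v:5.0f} km/s")
        # Prints ~294, 208, 147, 104 km/s: a steady fall-off. Measured curves
        # hover near ~200 km/s even at large radii, so the enclosed mass must
        # keep growing with radius -- a dark halo.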

    The Bullet Cluster

    That leads us to the “Bullet Cluster,” one of the most important systems in astronomy.

    X-ray photo by the Chandra X-ray Observatory of the Bullet Cluster (1E0657-56). Exposure time was 0.5 million seconds (~140 hours), and the scale is shown in megaparsecs. The redshift is z = 0.3, meaning the cluster’s light has wavelengths stretched by a factor of 1.3; based on today’s models, that places the cluster about 4 billion light-years away. In this photograph, a rapidly moving galaxy cluster with a shock wave trailing behind it appears to have hit another cluster at high speed. The gases collide, and the gravitational fields of the stars and galaxies interact. Based on black-body temperature readings, the collision heated the gas to 160 million degrees, emitting X-rays in great intensity and making this the hottest known galaxy cluster. Studies of the Bullet Cluster, announced in August 2006, provide some of the best evidence to date for the existence of dark matter.
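
    Two of the numbers in that caption follow from simple relations; a quick check (the Hubble constant value below is assumed):

        z = 0.3
        print(f"Wavelength stretch: {1 + z}")   # observed = (1 + z) x emitted = 1.3

        # Rough distance from Hubble's law, d ~ c*z / H0 (adequate at low z):
        c = 3.0e5     # km/s
        H0 = 70.0     # km/s per Mpc, assumed
        d_mpc = c * z / H0
        print(f"Distance: ~{d_mpc * 3.262e6 / 1e9:.1f} billion light-years")  # ~4.2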

    First described in 2006, it’s actually a pair of galaxy clusters observed in the act of colliding. Researchers mapped it in visible and X-ray light, finding that it consists of two clumps of galaxies. But it’s the stuff they couldn’t image directly that ensured the Bullet Cluster is rightfully cited as one of the best pieces of evidence for dark matter’s existence (the title of the paper announcing the discovery even calls it “direct empirical proof”).

    Galaxy clusters are the biggest individual objects in the Universe. They can contain thousands of galaxies bound to each other by mutual gravity. However, the stuff within those galaxies—stars, gas, dust—is outweighed by an extremely hot, gaseous plasma between them, which shines brightly in X-rays. In the Bullet Cluster, the collision between the two clusters created a shock wave in the plasma (the shape of this shock wave gives the structure its name).

    More dramatically, though, the astronomers who described the cluster used gravitational lensing—the distortion of light from more distant galaxies by the mass within the cluster—to map the distribution of most of the material in the Bullet Cluster. That method is known as “weak gravitational lensing.” Unlike the sexier strong lensing, weak lensing doesn’t create multiple images of the more distant galaxies. Instead, it slightly warps the light from background objects in a small but measurable way, depending on the amount and concentration of mass in the “lens”—in this case, the cluster.
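
    As an illustration of how such a lensing map turns distortions into mass, here is a minimal sketch. Everything below is assumed for illustration (the distances and the measured distortion); real analyses derive angular-diameter distances from a full cosmological model. The averaged distortion of background galaxies yields the convergence, the projected surface density in units of a “critical” density set by the lens geometry.

        import math

        G = 6.674e-11    # m^3 kg^-1 s^-2
        C = 3.0e8        # speed of light, m/s
        GPC = 3.086e25   # one gigaparsec, m

        # Assumed lens geometry (angular-diameter distances):
        D_lens = 1.0 * GPC   # observer to cluster
        D_src = 2.0 * GPC    # observer to background galaxies
        D_ls = 1.2 * GPC     # cluster to background galaxies

        # Critical surface density: the projected mass per area needed for
        # order-unity lensing with this geometry.
        sigma_crit = C**2 * D_src / (4 * math.pi * G * D_lens * D_ls)
        print(f"Sigma_crit ~ {sigma_crit:.1f} kg/m^2")        # ~5.8 kg/m^2

        kappa = 0.05   # assumed weak-lensing convergence from galaxy shapes
        print(f"Projected mass density ~ {kappa * sigma_crit:.2f} kg/m^2")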

    Astronomers found that the shocked plasma, which represents most of the ordinary matter in the Bullet Cluster, sat almost entirely in the region between the two galaxy clumps, separated from the galaxies. The lensing map, however, showed that most of the mass was concentrated around the galaxies themselves. This enabled a clear, independent measurement of the amount of dark matter, separate from the mass of the gas.

    The results also confirmed some predictions about the behavior of dark matter. Thanks to the shock of the collision, the plasma stayed in the region between the two clusters. Since the dark matter doesn’t interact much with either itself or normal matter, it passed right through the collision without any noticeable change.

    It’s a phenomenal discovery, but it’s only one galaxy cluster, and that ain’t enough. Science is inherently greedy for evidence (as it should be). A single example of anything tells us very little in a Universe full of possibilities. We want to know if dark matter always clusters around galaxies or if it can be more widely dispersed. We want to know where all the dark matter is, in all galaxy clusters and beyond, throughout the entire cosmos.

    A dark matter census

    Weak gravitational lensing provides a method to search for dark matter in other galaxy clusters, too, as well as even larger and smaller structures. Princeton University astronomers Neta Bahcall and Andrea Kulier took a weak lensing census of 132,473 galaxy groups and clusters, all within a well-defined patch of the sky but at a range of distances from the Milky Way. (“Groups” are smaller associations of galaxies; for example, the Milky Way is the second largest galaxy in the Local Group, after the Andromeda galaxy.) While individual galaxy clusters usually can’t tell us much, a large sample allowed the astronomers to treat the problem statistically—weak lensing effects that were too small to spot for a single cluster became obvious when looking at hundreds of thousands.
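
    The statistical gain is just averaging: each cluster’s lensing signal is swamped by the intrinsic scatter in galaxy shapes, but stacking N clusters beats that noise down by a factor of the square root of N. A toy sketch with invented numbers:

        import numpy as np

        rng = np.random.default_rng(42)

        true_signal = 0.01   # assumed per-cluster lensing distortion
        shape_noise = 0.3    # galaxy-shape scatter, dominant for one cluster

        for n in (1, 100, 132_473):
            stacked = (true_signal + shape_noise * rng.standard_normal(n)).mean()
            print(f"N = {n:>7}: stacked signal = {stacked:+.4f}, "
                  f"expected noise ~ {shape_noise / np.sqrt(n):.4f}")
        # With one cluster the signal is invisible; with ~10^5 clusters the
        # noise (~0.0008) falls well below the signal (0.01).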

    For example, a typical quantity used in studying galaxies is the mass-to-light ratio. To measure this statistically, Bahcall and Kulier looked at the cumulative amount of light (mostly emitted by stars) and weak lensing (mostly from dark matter), starting from the centers of each cluster and working outward. They found something intriguing: the amount of mass and light increased in tandem and then leveled off together. That means neither the dark matter nor the light extends farther than the other: the stars inside these groups and clusters were a very good tracer for the dark matter. That’s surprising because stars are typically less than two percent of the mass in a cluster, with the balance of ordinary matter made up by gas and dust.
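
    A toy version of that cumulative measurement shows what “leveling off together” implies. The profiles below are invented purely for illustration; the point is the contrast between dark matter that traces the light and dark matter that extends farther out.

        import numpy as np

        r = np.linspace(0.5, 10, 20)                  # radius, arbitrary units

        light = 1 - np.exp(-r / 2.0)                  # cumulative light (toy)
        mass_traces = 300 * (1 - np.exp(-r / 2.0))    # dark matter follows light
        mass_extended = 300 * (1 - np.exp(-r / 5.0))  # dark matter more spread out

        print("   r   M/L (traces)  M/L (extended)")
        for i in range(0, len(r), 5):
            print(f"{r[i]:4.1f}  {mass_traces[i] / light[i]:12.0f}"
                  f"  {mass_extended[i] / light[i]:14.0f}")
        # The traced case gives a flat ratio (300 at every radius); the extended
        # case keeps climbing with radius. Bahcall and Kulier saw the flat case.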

    As Kulier told Ars, “The total amount of dark matter in galaxy groups and clusters might be accounted for entirely by the amount of dark matter in the halos of their constituent galaxies.” That’s an average result, though; the details could look quite different. “This does not necessarily imply that the halos are still ‘attached’ to the galaxies,” Kulier said. In other words, when galaxies came together to form clusters, the stronger forces acting on galaxies and their stars could in principle separate them from their dark matter but leave everything inside the cluster thanks to mutual gravity.

    Kulier pointed out that these results provide strong support for the “hierarchical” model of structure formation: “smaller structures collapse earlier than larger ones, so that galaxies form first and then merge together to form larger structures like clusters.” The Bullet Cluster is an archetypical example of this, but things could be otherwise. For instance, dark matter could have ended up in the center of clusters, separate from the galaxies and their individual halos.

    But that’s not what astronomers see. In their analysis, Bahcall and Kulier also calculated that the total ratio of dark matter to ordinary matter in galaxy clusters matches that of the Universe as a whole. That’s another strong piece of evidence in favor of the standard model in cosmology: maybe most of the dark matter everywhere is in galactic halos.

    Every galaxy wears a halo

    Computer reconstruction of the location of mass in terms of how it affects the image of distant galaxies through weak lensing.
    S. Colombi (IAP), CFHT Team

    So what about the halos themselves and the galaxies that wear them? Historically, dark matter was first recognized for its role in spiral galaxies. However, it’s one thing to say that dark matter is present. It’s another to map out where it is—especially in the dense, star-choked inner parts of galaxies.

    Spiral galaxies consist of three basic parts: the disk, the bulge, and the halo. The disk is a thin region containing the spiral arms and most of the bright stars. The bulge is the central, densest part, with large populations of older stars and (at its very heart) a supermassive black hole. The halo is a more or less spherical region containing a smattering of stars; it envelops the other regions, extending several times beyond the limit of the disk. For example, the Milky Way’s disk is about 100,000 light-years in diameter, but its halo is between 300,000 and 1 million light-years across.

    Because of the relative sizes of the different regions, most of a galaxy’s dark matter is in the halo. Relatively little is in the disk; Jo Bovy and Scott Tremaine showed that, near the Sun, the disk and halo contain less than the equivalent mass of 100 Earths in a cube one light-year across. That may sound like a lot, but Earth isn’t that large, and a light-year defines a big volume. That amount isn’t enough to strongly affect the Sun’s orbit around the galactic center. (It’s still enough for a few particles to drift through detectors like LUX, though.)
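
    Converting that limit into the units detection experiments use is a short calculation; a rough check (assuming exactly 100 Earth masses per cubic light-year):

        M_EARTH = 5.97e24              # kg
        LY = 9.461e15                  # one light-year, m
        KG_PER_GEV = 1.783e-27         # 1 GeV/c^2 in kg

        rho = 100 * M_EARTH / LY**3    # kg/m^3
        rho_gev_cm3 = rho / KG_PER_GEV * 1e-6
        print(f"{rho:.1e} kg/m^3  ~  {rho_gev_cm3:.2f} GeV/cm^3")
        # ~7e-22 kg/m^3, about 0.4 GeV/cm^3: the same ballpark as the
        # ~0.3 GeV/cm^3 local density usually assumed by experiments like LUX.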

    By contrast, the amount of dark matter increases toward the galaxy’s center, so the density should be much higher in the bulge than anywhere else. For that reason, a number of astronomers look to the central part of the Milky Way for indications of dark matter annihilation, which (under some models) would produce gamma rays. This would occur if dark matter particles are their own antimatter partners, so that their (very rare) collisions result in mutual destruction and some high-energy photons. This winter, a group of researchers announced a possible detection of excess gamma rays originating in the Milky Way’s core, based on data from the orbiting [NASA] Fermi gamma-ray observatory.

    NASA’s Fermi Gamma-ray Space Telescope. NASA/Fermi

    However, the bulge also has the highest density of stars, making it a tangled mess. Many things in that region could produce an excess of gamma rays. As University of Melbourne cosmologist Katherine Mack told me, “The Galactic Center is a really messy place, and the analysis of the signal is complicated. It’ll take a lot to show that the signal has to be dark matter annihilation rather than some un-accounted-for astrophysical source.” We can’t rule out the possibility of dark matter annihilation, but it’s definitely too soon to break out the champagne.

    The contrast between how easy it is to calculate an average density and how hard it is to detect dark matter directly illustrates the general problem with mapping dark matter inside galaxies. It’s relatively simple to put limits on how much there is in the disk, since that’s a small fraction of the total volume of a galaxy. The tougher questions include how steeply the density falls off from the galactic center, how far the halo actually extends, and how lumpy the halo is.

    For instance, our galaxy’s halo is big enough to encompass its satellite galaxies, including the Magellanic Clouds and a host of smaller objects. But these galaxies also have their own halos in accordance with the hierarchical model. Because they’re denser dark matter lumps inside the Milky Way’s larger halo, the satellites’ halos create a substructure.

    Our dark matter models predict how much substructure should be present. However, dwarf galaxies are very faint, so astronomers have difficulty determining if there are enough of them to account for all the predicted substructure. This is known as the “missing satellite problem,” but many astronomers suspect the problem will evaporate as they get better at finding these faint objects.

    A hopeful conclusion

    So where is the dark matter? Based on both theory and observation, it looks like most of it is in galactic halos. Surveys using weak gravitational lensing are ongoing, with many more planned for the future. These surveys will show where most of the mass in the Universe is located in unprecedented detail.

    How dark matter is distributed within those halos is still a bit mysterious, but there are several hopeful approaches. By looking for “dark galaxies”—small satellites with few stars but high dark matter concentrations—astronomers can determine the substructure within larger halos. The [ESA] Gaia mission is working to produce a three-dimensional map of a billion stars and their motions, which will provide information about the structure of the Milky Way and its surrounding satellites. That in turn will allow researchers to work backward, determining the gravitational field dictating the motion of these stars. With that data in hand, we should have a good map of the dark matter in many regions that are currently difficult to study.

    Dark matter may be subtle and invisible, but we’re much closer than ever to knowing exactly where it hides.


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 12:32 pm on December 17, 2010 Permalink | Reply
    Tags: ars technica, , ,   

    From ars technica: “Theorists seek dark matter in hot neutron stars”

    By Chris Lee

    [I am not going to tell Chris’s story here, he is an excellent writer and knows his subject. I just want to get you intrigued.]

    “Dark matter is an enigma wrapped in a conundrum. We have lots of gravitational evidence for the presence of dark matter. In fact, the evidence is from so many different types of observations, and is all so consistent, that very few astronomers or cosmologists appear to doubt that some type of dark matter exists. That is the enigma: it is very likely to exist, but we know very little of the specific details about what exactly exists.

    Going further than that has been a problem. The very nature of dark matter, which makes cosmologists so certain of its existence, means it has the very properties that make it so damn hard to find by any means other than gravity—something of a conundrum, really. A recent paper that looks at how dark matter might be detectable in neutron stars inadvertently makes this problem very clear.

    Most cosmologists believe that dark matter consists of weakly interacting massive particles (WIMPs). As the name suggests, they are heavy and they only talk to normal matter very rarely. But rarely is not never and, if we can find a place where there is an awful lot of both normal matter and dark matter, we might be able to observe the consequences of the two colliding. There are a fair number of experiments going on that attempt to do this, and they have some tantalizing results. But tantalizing is all they are—nothing that would get you calling your Mom in excitement in the middle of the night.

    So, when I stumbled across a paper discussing the effects of dark matter on neutron stars, I was intrigued. The basic idea, it turned out, was that neutron stars have huge densities, so the likelihood of dark matter colliding with normal matter is greater there than in any other objects in the observable universe. If the neutron star happens to be near the galactic center or in a globular cluster, then there should be a lot of WIMPs around to play with….”
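
    A back-of-the-envelope estimate shows why those densities matter so much. All inputs below are assumed typical values, with a WIMP-nucleon cross section picked near the experimental limits of the time:

        # Chance that a WIMP crossing a neutron star scatters at least once:
        # optical depth tau ~ n * sigma * L, where n is the nucleon number
        # density, sigma the scattering cross section, L the path length.
        M_NEUTRON = 1.675e-27   # kg
        rho_ns = 4e17           # typical neutron star density, kg/m^3 (assumed)
        radius = 1.0e4          # ~10 km neutron star radius, m
        sigma = 1e-49           # assumed WIMP-nucleon cross section, m^2

        n = rho_ns / M_NEUTRON            # nucleons per cubic meter
        tau = n * sigma * (2 * radius)    # one diameter of path
        print(f"tau ~ {tau:.2f}")
        # ~0.5: order unity. A WIMP passing through a neutron star has a real
        # chance of colliding, versus essentially zero in ordinary matter.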

    Image credit: NASA

    So, click on the link and read Chris’s story here.

     