Tagged: livescience

  • richardmitnick 9:36 pm on April 15, 2016
    Tags: livescience

    From livescience: “Why Do So Many Earthquakes Strike Japan?” 


    April 15, 2016
    Denise Chow

    A 7.0-magnitude earthquake struck the Kumamoto region on Japan’s Kyushu Island at 1:25 a.m. local time on April 16 (12:25 p.m. ET on April 15).

    A magnitude-7.0 earthquake struck southern Japan today, less than two days after a 6.2-magnitude temblor rocked the same region, triggering tsunami advisories in the area.

    The most recent earthquake struck the Kumamoto region on Japan’s Kyushu Island early Saturday (April 16) at 1:25 a.m. local time (12:25 p.m. ET on April 15), according to the U.S. Geological Survey (USGS). The smaller 6.2-magnitude quake on Thursday (April 14) killed nine people and injured hundreds more, CBS News reported.

    With residents of the Kumamoto region reeling from two sizable earthquakes in as many days, and with memories of the massive 9.0-magnitude earthquake and tsunami that devastated Tohoku, Japan, in 2011 still fresh in people’s minds, what is it about this part of the world that makes it so seismically active?

    For starters, Japan is located along the so-called Pacific Ring of Fire, which is the most active earthquake belt in the world. This “ring” is actually an imaginary horseshoe-shaped zone that follows the rim of the Pacific Ocean, where many of the world’s earthquakes and volcanic eruptions occur.

    Within the Ring of Fire, several tectonic plates — including the Pacific Plate beneath the Pacific Ocean and the Philippine Sea Plate — mash and collide.

    “The Earth’s surface is broken up into about a dozen or so major chunks that are all moving around. Where they all interact at their edges, interesting things happen,” said Douglas Given, a geophysicist with the USGS in Pasadena, California.

    Today’s earthquake seems to have been caused by the Philippine Sea Plate diving underneath the Eurasian Plate, according to Paul Caruso, a geophysicist with the USGS.

    While Japan is no stranger to earthquakes, the 7.0-magnitude temblor is one of the largest ever recorded in this part of southern Japan, Caruso told Live Science.

    “The second-largest was probably on March 20, 1939 — there was a magnitude-6.7 in this area. And we’ve had magnitude-6.5 and magnitude-6.3 earthquakes, but this is the largest quake that has been measured in that vicinity,” he said.

    A tsunami advisory was issued after today’s earthquake, but it was subsequently lifted by the Japan Meteorological Agency, and there are currently no major tsunami warnings or advisories in effect.

    Not all earthquakes trigger tsunamis, Caruso said. In general, there are three key ingredients that can produce a dangerous earthquake-tsunami combination, he added. First, the earthquake must be at least a magnitude-7 temblor. Second, the quake’s epicenter has to be underneath the ocean, Caruso said. And finally, the earthquake has to be shallow.

    “We have quakes around Fiji all the time, but those are sometimes 400 miles [640 kilometers] underground, so they aren’t going to generate a tsunami,” he said.

    Today’s earthquake was shallow — about 6 miles (10 km) underground — but the epicenter was on land, meaning there aren’t likely to be any dangerous tsunamis as a result, Caruso said.
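    Caruso’s three ingredients can be written down as a tiny screening rule. The magnitude and undersea conditions come straight from the article; the 70 km depth cutoff is an illustrative assumption, since the article only says the quake must be "shallow":

```python
# Toy check of the three tsunami "ingredients": big enough, undersea, shallow.
# The 70 km depth cutoff is an assumption; the article only says "shallow."

def tsunami_risk(magnitude, depth_km, undersea):
    """True only when all three ingredients for a dangerous tsunami line up."""
    return magnitude >= 7.0 and depth_km <= 70 and undersea

# The Kumamoto quake: magnitude 7.0 and shallow (~10 km), but on land.
print(tsunami_risk(7.0, 10, undersea=False))   # False
# A deep Fiji-style quake (~640 km down) fails the shallowness test.
print(tsunami_risk(7.5, 640, undersea=True))   # False
```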

    Given said he hasn’t seen many damage reports yet, but Japanese authorities and scientists at the USGS will be monitoring the area for potentially dangerous aftershocks, which are smaller quakes that follow the largest event in a series and that generally decrease in strength.

    “This seems to be a pretty energetic sequence, and there are lots of large aftershocks,” Given told Live Science. “And of course, after a large earthquake, structures are often weakened as a result. Additional damage can be expected.”

    Residents of the area should expect more shaking in the coming days, according to Caruso.

    “We can say for certain that there are going to be more aftershocks in this area,” he said. “Exactly when and how big they’re going to be is difficult to say, though. No one can predict that.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 7:57 am on April 14, 2016
    Tags: livescience

    From FNAL’s Don Lincoln on livescience: “Collider Unleashed! The LHC Will Soon Hit Its Stride” 


    April 12, 2016

    FNAL Don Lincoln
    Don Lincoln, Senior Scientist, Fermi National Accelerator Laboratory; Adjunct Professor of Physics, University of Notre Dame

    CERN/LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

    If you’re a science groupie and would love nothing better than for a cornerstone scientific theory to be overthrown and replaced with something newer and better, then 2016 might well be your year. The world’s largest particle accelerator, the Large Hadron Collider (LHC), is resuming operations after a pause during the winter months, when the cost for electricity in France is highest.

    So why is it such a big deal that the LHC is coming back online? It’s because this is the year the accelerator will operate at something approaching its design specifications. Scientists will smash the gas pedal to the floor, crank the fire hose wide open, turn the amplifier up to eleven or enact whatever metaphor you like. This year is the first real year of full-scale LHC operations.

    A particle smasher reborn

    Now if you actually are a science groupie, you know what the LHC is and have probably heard about some of its accomplishments. You know it smashes together two beams of protons traveling at nearly the speed of light. You know scientists using the LHC found the Higgs boson.

    CERN ATLAS Higgs Event

    CERN CMS Higgs Event

    You know that this marvel is the largest scientific device ever built.

    So what’s different now? Well, let’s go back in time to 2008, when the LHC circulated its first beams. At the time, the world’s premier particle accelerator was the U.S. Department of Energy’s Fermilab Tevatron, which collided beams at a whopping 2 trillion electron volts (TeV) of energy and with a beam brightness of about 2 × 10^32 cm^-2 s^-1.

    FNAL/Tevatron map

    FNAL/Tevatron CDF detector
    FNAL/DZero detector

    The technical term for beam brightness is “instantaneous luminosity,” and basically it’s a density. More precisely, when a beam passes through a target, the instantaneous luminosity (L) is the number of particles per second in the beam that pass a location (ΔN_B/Δt), divided by the area of the beam (A) and multiplied by the number of targets (N_T): L = (ΔN_B/Δt) × (1/A) × N_T. (And the target can be another beam.)

    The simplest analogy that will help you understand this quantity is a light source and a magnifying glass. You can increase the “luminosity” of the light by turning up the brightness of the light source or by focusing the light more tightly. It is the same way with a beam. You can increase the instantaneous luminosity by increasing the number of beam or target particles, or by concentrating the beam into a smaller area.
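    To make the formula concrete, here is a rough numerical sketch. The revolution frequency and bunch figures are LHC-like, but the effective beam area is a made-up value chosen only to land near the design brightness; none of these are official machine parameters:

```python
# L = (ΔN_B/Δt) × (1/A) × N_T, evaluated with rough LHC-like inputs.

revolution_hz = 11_245        # each bunch laps the 27 km ring ~11,245 times/s
n_bunches = 2808              # bunches per beam
protons_per_bunch = 1.1e11    # ~10^11 protons per bunch

beam_rate = revolution_hz * n_bunches * protons_per_bunch  # ΔN_B/Δt
beam_area_cm2 = 3.2e-5        # A: assumed effective transverse area
n_target = protons_per_bunch  # N_T: the oncoming bunch acts as the target

L = beam_rate * (1.0 / beam_area_cm2) * n_target
print(f"L = {L:.1e} cm^-2 s^-1")   # on the order of 100 × 10^32
```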

    The LHC was built to replace the Tevatron and trounce that machine’s already-impressive performance numbers.

    [If our USA Congress was not filled with idiots, we would have built in Texas the Superconducting Super Collider and not lost this HEP race.]

    The new accelerator was designed to collide beams at a collision energy of 14 TeV and to have a beam brightness — instantaneous luminosity — of at least 100 × 10^32 cm^-2 s^-1. So the beam energy was to be seven times higher, and the beam brightness would increase 50- to 100-fold.

    Sadly, in 2008, a design flaw was uncovered in the LHC when an electrical short caused severe damage, requiring two years to repair. Further, when the LHC actually did run, in 2010, it operated at half the design energy (7 TeV) and at a beam brightness basically the same as that of the Fermilab Tevatron. The lower energy left a large safety margin, as the design flaw had been only patched, not completely reengineered.

    The situation improved in 2011, when the beam brightness got as high as 30 × 10^32 cm^-2 s^-1, although with the same beam energy. In 2012, the beam energy was raised to 8 TeV, and the beam brightness was higher still, peaking at about 65 × 10^32 cm^-2 s^-1.

    The LHC was shut down during 2013 and 2014 to retrofit the accelerator to make it safe to run at closer to design specifications. The retrofits consisted mostly of additional industrial safety measures that allowed for better monitoring of the electrical currents in the LHC. This helps ensure there are no electrical shorts and that there is sufficient venting. The venting guarantees no catastrophic ruptures of the LHC magnets (which steer the beams) in the event that cryogenic liquids — helium and nitrogen — in the magnets warm up and turn into a gas. In 2015, the LHC resumed operations, this time at 13 TeV and with a beam brightness of 40 × 10^32 cm^-2 s^-1.

    So what’s expected in 2016?

    The LHC will run at 13 TeV and with a beam brightness that is expected to approach 100 × 10^32 cm^-2 s^-1 and possibly even slightly exceed that mark. Essentially, the LHC will be running at design specifications.

    In addition, there is a technical change in 2016. The protons in the LHC beams will be spread more uniformly around the ring, thus reducing the number of protons colliding simultaneously, resulting in better data that is easier to interpret.

    At a technical level, this is kind of interesting. A particle beam isn’t continuous like a laser beam or water coming out of a hose. Instead, the beam comes in a couple of thousand distinct “bunches.” A bunch looks a little bit like a stick of uncooked spaghetti, except it is about a foot long and much thinner — about 0.3 millimeters, most of the time. These bunches travel in the huge 16.8-mile-long (27 kilometers) circle that is the LHC, with each bunch separated from the other bunches by a distance that (until now) has been about 50 feet (15 meters).

    The technical change in 2016 is to take the same number of beam protons (roughly 3 × 10^14 protons) and split them up into 2,808 bunches, each separated not by 50 feet, but by 25 feet (7.6 m). This doubles the number of bunches, but cuts the number of protons in each bunch in half. (Each bunch contains about 10^11 protons.)
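    The arithmetic is easy to verify:

```python
# Same beam total, twice the bunches -> half the protons per bunch.
total_protons = 3e14          # protons per beam (from the article)
n_bunches = 2808              # bunch count after the 2016 change

per_bunch = total_protons / n_bunches
print(f"{per_bunch:.2e} protons per bunch")   # ~1.07e11, i.e. about 10^11
```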

    Because the LHC has the same number of protons, now separated into more bunches, there are fewer collisions per crossing when two bunches cross in the center of a detector. Since most collisions are boring, low-energy affairs, having a lot of them occur at the same time as an interesting collision just clutters up the data.

    Ideally, you’d like to have only an interesting collision and no simultaneous boring ones. This change of bunch separation distance from 50 feet to 25 feet brings the data collection closer to ideal.

    Luminous beams

    Another crucial design element is the integrated beam. Beam brightness (instantaneous luminosity) is related to the number of proton collisions per second, while integrated beam (integrated luminosity) is related to the total number of collisions that occur as the two counter-rotating beams continually pass through the detector. Integrated luminosity is something that adds up over the days, months and years.

    The unit of integrated luminosity is the pb^-1, or inverse picobarn. This unit is a bit confusing, but not so bad. The “b” in “pb” stands for a barn (more on that in a moment). A barn is 10^-24 cm^2, and a picobarn (pb) is 10^-36 cm^2. The barn is a unit of area and comes from another particle physics term called a cross section, which is related to how likely it is that two particles will interact and generate a specific outcome. Two objects that have a large effective area will interact easily, while objects with a small effective area will interact rarely.

    An object with an area of one barn is a square with a side length of 10^-12 cm. That’s about the size of the nucleus of a uranium atom.

    During World War II, physicists at Purdue University in Indiana were working with uranium and needed to mask their work for security reasons. So they invented the term “barn,” defining it as an area about the size of a uranium nucleus. Given how big this area is in the eyes of nuclear and particle physicists, the Purdue scientists were co-opting the phrase “as big as a barn.” In the luminosity world, with its units of inverse area (1/barn), a smaller area unit means more luminosity: one inverse picobarn represents a trillion times more collisions than one inverse barn.

    This trend is evident in the integrated luminosity seen at the LHC each year as scientists improved their ability to operate the accelerator. The integrated luminosity in 2010 was 45 pb^-1. In 2011 and 2012, it was 6,100 pb^-1 and 23,300 pb^-1, respectively. As time went on, the accelerator ran more reliably, resulting in far higher numbers of recorded collisions.

    Because the accelerator had been reconfigured during the 2013-2014 shutdown, the luminosity was lower in 2015, coming in at 4,200 pb^-1, although, of course, at the much higher beam energy. The 2016 projection could be as high as 35,000 pb^-1. The predicted increase merely reflects the accelerator operators’ increased confidence in their ability to operate the facility.

    This means in 2016, we could actually record eight times as much data as we did in 2015. And it is expected that 2017 will bring even higher performance.
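    Integrated luminosity translates directly into event counts: the expected number of events of a given type is the cross section times the integrated luminosity, and with cross sections in pb and luminosity in pb^-1 the units cancel. As a rough illustration (the ~50 pb Higgs production cross section at 13 TeV is an outside ballpark figure, not from the article):

```python
# N_events = cross_section [pb] × integrated_luminosity [pb^-1]
higgs_xsec_pb = 50.0    # rough 13 TeV Higgs production cross section (assumed)
lumi_2015 = 4_200       # pb^-1, recorded (from the article)
lumi_2016 = 35_000      # pb^-1, projected (from the article)

print(f"2015: ~{higgs_xsec_pb * lumi_2015:,.0f} Higgs bosons produced")
print(f"2016: ~{higgs_xsec_pb * lumi_2016:,.0f} Higgs bosons produced")
print(f"data ratio: {lumi_2016 / lumi_2015:.1f}x")   # ~8.3x, the "eight times" above
```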

    Illuminating new science

    Let’s think about what these improvements mean. When the LHC first collided beams, in 2010, the Higgs boson had yet to be observed.

    Higgs Boson Event

    On the other hand, the particle was already predicted, and there was good circumstantial evidence to expect that the Higgs would be discovered. And, without a doubt, it must be admitted that the discovery of the Higgs boson was an enormous scientific triumph.

    But confirming previously predicted particles, no matter how impressive, is not why the LHC was built.

    Scientists’ current theory of the particle world is called the Standard Model, and it was developed in the late 1960s, half a century ago.

    The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    While it is an incredibly successful theory, it is known to have holes. Although it explains why particles have mass, it doesn’t explain why some particles have more mass than others. It doesn’t explain why there are so many fundamental particles, given that only a handful of them are needed to constitute the ordinary matter of atoms and puppies and pizzas. It doesn’t explain why the universe is composed solely of matter, when the theory predicts that matter and antimatter should exist in equal quantities. It doesn’t identify dark matter, which is five times more prevalent than ordinary matter and is necessary to explain why galaxies rotate in a stately manner and don’t rip themselves apart.

    When you get right down to it, there is a lot the Standard Model doesn’t explain. And while there are tons of ideas about new and improved theories that could replace it, ideas are cheap. The trick is to find out which idea is right.

    That’s where the LHC comes in. The LHC can explore what happens if we expose matter to more and more severe conditions. Using Einstein’s equation E = mc^2, we can see how the high collision energies only achievable in the LHC are converted into forms of matter never before seen. We can sift through the LHC data to find clues that point us in the right direction to hopefully figure out the next, bigger and more effective theory. We can take another step toward our ultimate goal of finding a theory of everything.

    With the LHC now operating at essentially design spec, we can finally use the machine to do what we built it for: to explore new realms, to investigate phenomena never before seen and, stealing a line from my favorite television show, “to boldly go where no one has gone before.” We scientists are excited. We’re giddy. We’re pumped. In fact, there can be but one way to express how we view this upcoming year:

    See the full article here.

  • richardmitnick 7:53 am on March 29, 2016
    Tags: livescience

    From livescience: “Oklahoma Is Now an Earthquake Hotspot, New Map Shows” 


    March 28, 2016
    Laura Geggel

    This USGS map shows the potential for an area to experience damage from natural or human-induced earthquakes in 2016. Credit: USGS

    The chances of a damaging earthquake are now as likely in parts of Oklahoma and some neighboring states as they are in temblor-heavy California, according to a report by the United States Geological Survey (USGS).

    The culprit: Man-made activities related to oil and gas production are creating the shaky conditions in regions of the Central and Eastern U.S., the USGS seismologists say.

    USGS scientists just released their first map that includes earthquake risks from both natural and human-induced causes for the coming year. Until now, the government agency included only temblor risks linked to natural causes.

    The report, which is part of a 50-year forecast examining earthquake hazards, reveals that about 7 million people live and work in areas at risk of human-induced seismicity. Areas in the Central and Eastern U.S. (CEUS) are at risk of experiencing a quake of the same magnitude as the naturally occurring ones in California, the USGS said.

    “By including human-induced events, our assessment of earthquake hazards has significantly increased in parts of the U.S.,” Mark Petersen, chief of the USGS National Seismic Hazard Mapping Project, said in a statement. “This research also shows that much more of the nation faces a significant chance of having damaging earthquakes over the next year, whether natural or human-induced.”

    The CEUS’ induced earthquakes are often the product of wastewater disposal, the USGS said. This wastewater, a byproduct of oil and gas production, is pumped into deep underground disposal wells. This is different from hydraulic fracturing, also known as fracking, in which water, sand and chemicals get pumped into the Earth to break up rock and extract oil and gas. Fracking itself is probably a more infrequent cause of felt earthquakes, the USGS said. (Wastewater from fracking is generally pumped back into wastewater injection wells.)

    Still, wastewater injection practices have put six states on the earthquake map. Oklahoma has the highest risk, followed by Kansas, Texas, Colorado, New Mexico and Arkansas, the USGS reported. Oklahoma and Texas have the largest populations living near induced earthquake hotspots.

    “In the past five years, the USGS has documented high shaking and damage in areas of these six states, mostly from induced earthquakes,” Petersen said. “Furthermore, the USGS Did You Feel It? website has archived tens of thousands of reports from the public who experienced shaking in those states, including about 1,500 reports of strong shaking or damage.”

    For instance, from 1973 to 2008, an average of 24 earthquakes with a magnitude of 3.0 or larger shook the Central United States each year. But from 2009 to 2015, that number increased to an average of 318 earthquakes of this magnitude per year. The year 2015 saw the greatest number, with 1,010 earthquakes of magnitude 3.0 or greater. [Video: Watch 2,500+ Earthquakes in Oklahoma Linked to Humans]

    And through mid-March of this year, 226 earthquakes of a magnitude-3.0 or larger have already hit the Central United States, the USGS said. The largest earthquake to occur near a wastewater injection site was a magnitude-5.6 temblor near Prague, Oklahoma, in 2011.
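    The jump in those rate figures is stark when laid out side by side:

```python
# Back-of-the-envelope check of the rate increase quoted above.
avg_1973_2008 = 24     # M3.0+ quakes per year, Central U.S. (from the article)
avg_2009_2015 = 318    # M3.0+ quakes per year (from the article)
quakes_2015 = 1_010    # M3.0+ quakes in 2015 alone (from the article)

print(f"2009-2015 vs. historical average: {avg_2009_2015 / avg_1973_2008:.0f}x")
print(f"2015 alone vs. historical average: {quakes_2015 / avg_1973_2008:.0f}x")
```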

    Overall, USGS researchers found 21 areas with increased rates of human-induced seismicity. Some areas — such as regions within Alabama and Ohio — experienced human-induced earthquakes in the past, but have relatively little risk in the coming year because the activities that caused these quakes have decreased.

    Other areas of Alabama and some parts of Mississippi, however, have shown an increase in these activities, but researchers are still determining whether earthquakes there happened naturally or were human-induced, the USGS said.

    The scientists found the greatest risk for a human-induced earthquake in north-central Oklahoma and the southernmost part of Kansas, where they calculated a 10 to 12 percent risk of an earthquake with strong shaking this year. Such an earthquake, they estimated, would register a 6 or greater on the Modified Mercalli Intensity Scale, meaning it would easily be felt but would likely cause just slight damage.

    Though scientists disagree about whether wastewater injection leads to larger or smaller earthquakes compared with natural ones, in the CEUS region, when a large temblor does strike, thousands of faults could rupture, according to the USGS. What’s more, human-induced quakes tend to come in swarms of smaller events at shallower depths, where shaking is more likely to be felt and cause damage.

    The new earthquake report will help architects determine how to safely design buildings within areas of high risk. People who live in earthquake territory can read about safety measures at FEMA’s Ready Campaign.

    See the full article here.

  • richardmitnick 8:57 am on February 13, 2016
    Tags: livescience, MyShake

    From livescience: “‘MyShake’ App Turns Your Smartphone into Earthquake Detector” 


    February 12, 2016
    Mindy Weisberger


    Seismologists and app developers are shaking things up with a new app that transforms smartphones into personal earthquake detectors.

    By tapping into a smartphone’s accelerometer — the motion-detection instrument — the free Android app, called MyShake, can pick up and interpret nearby quake activity, estimating the earthquake’s location and magnitude in real time, and then relaying the information to a central database for seismologists to analyze.

    In time, an established network of users could enable MyShake to be used as an early-warning system, the researchers said.

    UC Berkeley MyShake
    MyShake network

    Crowdsourcing quakes

    Seismic networks worldwide detect earthquakes and convey quake data to scientists around the clock, providing a global picture of the tremors that are part of Earth’s ongoing dynamic processes. But there are areas where the network is thin, which means researchers are missing pieces of the seismic puzzle. However, “citizen scientists” with smartphones could fill those gaps, according to Richard Allen, leader of the MyShake project and director of the Berkeley Seismological Laboratory in California.

    “As smartphones became more popular and it became easier to write software that would run on smartphones, we realized that we had the potential to use the accelerometer that runs in every smartphone to record earthquakes,” Allen told Live Science.

    How it works

    Accelerometers measure forces related to acceleration: vibration, tilt and movement, and also the static force of gravity’s pull. In smartphones, accelerometers detect changes in the device’s orientation, allowing the phone to know exactly which end is up and to adjust visual displays to correspond to the direction it’s facing.

    Fitness apps for smartphones use accelerometers to pinpoint specific changes in motion in order to calculate the number of steps you take, for example. And the MyShake app is designed to recognize when a smartphone’s accelerometer picks up the signature shaking of an earthquake, Allen said, which is different from other types of vibrating motion, or “everyday shaking.”

    In fact, the earthquake-detection engine in MyShake is designed to recognize an earthquake’s vibration profile much like a fitness app recognizes steps, according to Allen.

    “It’s about looking at the amplitude and the frequency content of the earthquake,” Allen said, “and it’s quite different from the amplitude and frequency content of most everyday shakes. It’s very low-frequency energy and the amplitude is not as big as the amplitude for most everyday activities.”

    In other words, the difference between the highs and lows of the motion generated by an earthquake is smaller than the range you’d find in other types of daily movement, he said.
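    As a rough sketch of the idea, the snippet below separates “quake-like” from “everyday” motion using just peak amplitude and a crude zero-crossing frequency estimate. The thresholds and the classifier itself are illustrative inventions, not MyShake’s actual algorithm:

```python
import math

# Toy amplitude/frequency test: earthquake shaking is low-frequency and
# modest in amplitude compared with everyday phone motion.
# Thresholds (1.0 g, 5 Hz) are illustrative guesses, not MyShake's values.

def looks_like_quake(samples, sample_rate_hz):
    """Classify a 1-D acceleration trace (in g) with two crude features."""
    peak = max(abs(s) for s in samples)
    # Estimate dominant frequency from sign changes (zero crossings).
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    duration_s = len(samples) / sample_rate_hz
    freq = crossings / (2 * duration_s)
    return peak < 1.0 and freq < 5.0   # low amplitude AND low frequency

rate = 50  # Hz, a typical phone accelerometer sampling rate
t = [i / rate for i in range(200)]
quake = [0.3 * math.sin(2 * math.pi * 2.0 * x) for x in t]     # 2 Hz, 0.3 g
walking = [1.5 * math.sin(2 * math.pi * 10.0 * x) for x in t]  # 10 Hz, 1.5 g

print(looks_like_quake(quake, rate))    # True
print(looks_like_quake(walking, rate))  # False
```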

    Quake, rattle and roll

    When a smartphone’s MyShake app detects an earthquake, it instantly sends an alert to a central processing site. A network detection algorithm is activated by incoming data from multiple phones in the same area, to “declare” an earthquake, identify its location and estimate its magnitude, Allen said.
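    A stripped-down version of that network step might look like the following. The grid size, time window and four-phone quorum are invented for illustration; the real detection algorithm is surely more sophisticated:

```python
from collections import defaultdict

# Sketch of "network detection": only declare an earthquake when several
# phones in the same area trigger within a short window. The 0.5-degree
# grid, 10 s window and 4-phone quorum are assumptions for illustration.

def declare_quakes(triggers, min_phones=4, window_s=10.0, cell_deg=0.5):
    """triggers: list of (timestamp_s, lat, lon) reports from phones."""
    buckets = defaultdict(list)
    for t, lat, lon in triggers:
        cell = (round(lat / cell_deg), round(lon / cell_deg))
        buckets[cell].append(t)
    declared = []
    for cell, times in buckets.items():
        times.sort()
        for i, t0 in enumerate(times):
            if sum(1 for t in times[i:] if t - t0 <= window_s) >= min_phones:
                declared.append(cell)
                break
    return declared

# Five phones near (32.8, 130.7) trigger within seconds; one lone phone elsewhere.
reports = [(100 + i, 32.8, 130.7) for i in range(5)] + [(100, 37.8, -122.3)]
print(declare_quakes(reports))   # one declared cell; the lone report is ignored
```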

    For now, the app will only collect and transmit data to the central processor. But the end goal, Allen said, is for future versions of the app to send warnings back to individual users.

    An iPhone version of the app is also part of future plans for MyShake, according to Allen.

    For seismologists, the more data they can gather about earthquakes, the better, Allen said. A bigger data pool means an improved understanding of quake behavior, which could help experts design better early-warning systems and safety protocols, things that are especially critical in urban areas prone to frequent quake activity. With 2.6 billion smartphones currently in circulation worldwide and an anticipated 6 billion by 2020, according to an Ericsson Mobility Report released in 2015, a global network of handheld seismic detectors could go a long way toward keeping people safe by improving quake preparation and response.

    The findings were published online today (Feb. 12) in the journal Science Advances, and the MyShake app is available for download at myshake.berkeley.edu.

    See the full article here.

  • richardmitnick 12:46 pm on February 6, 2016
    Tags: livescience

    From livescience: “What If the Supercontinent Pangaea Had Never Broken Up?” 


    Brought forward 2.6.16
    Original date May 13, 2011

    Adam Hadhazy

    Things would be a little different.

    Pangaea and its breakup

    From about 300 million to 200 million years ago, all seven modern continents were mashed together as one landmass, dubbed Pangaea. The continents have since “drifted” apart because of the movements of the Earth’s crust, known as plate tectonics. Some continents have maintained their puzzle-piece-like shapes: Look at how eastern South America tucks into western Africa.

    Tectonic plates
    The tectonic plates of the world were mapped in the second half of the 20th century.

    Life would be: Far less diverse. A prime driver of speciation (the development of new species from existing ones) is geographical isolation, which leads to the evolution of new traits by subjecting creatures to different selective pressures. Consider, for example, the large island of Madagascar, which broke off from Gondwana, Pangaea’s southern half, 160 million years ago. About nine out of 10 plant and mammal species that have evolved on the island are not found anywhere else on the planet, according to Conservation International.

    A locked-in Pangaea further constrains life’s possibilities because much of its interior would be arid and hot, said Damian Nance, a professor of geosciences at Ohio University. “Because of Pangaea’s size, moisture-bearing clouds would lose most of their moisture before getting very far inland,” Nance told Life’s Little Mysteries.

    Excess mass on a spinning globe shifts away from the poles, so the supercontinent would also become centered on the equator, the warmest part of the planet. Reptiles could deal with such a climate better than most other animals, which is partly why dinosaurs, which emerged while the planet’s surface was one giant chunk, thrived before mammals did.

    See the full article here.

  • richardmitnick 10:45 am on January 20, 2016
    Tags: Baking soda and premature death, livescience

    From livescience: “Baking-Soda Ingredient May Lower Risk of Premature Death” 


    January 19, 2016
    Christopher Wanjek

    Older people may be at increased risk of premature death if they have low levels of bicarbonate, a main ingredient in baking soda, in their blood, a new study suggests.

    In the study, researchers examined nearly 3,000 relatively healthy adults ages 70 to 79 over a 10-year period. During this time, about half of these people died from natural causes. But the adults with low levels of bicarbonate in their blood were nearly 25 percent more likely than the adults with normal or high levels of bicarbonate in their blood to die during the study period, the researchers found.

    The reason for the link isn’t exactly clear, but it may have to do with the ill effects of having slightly acidic blood, the researchers said. Bicarbonate, a base, is a natural byproduct of metabolism that the body uses to regulate the pH level of the blood. Bicarbonate counters carbon dioxide and other acidic byproducts of eating and breathing to keep the blood at a neutral pH.

    Another possible explanation is that the low levels might point to underlying (and undiagnosed) kidney problems, the researchers said. Given that it’s easy to test for bicarbonate in the blood, it may be prudent for doctors to monitor bicarbonate levels in older adults to reduce their risk of premature death, the researchers said.

    The study, led by Dr. Kalani Raphael of the University of Utah in Salt Lake City, appears this week in the Clinical Journal of the American Society of Nephrology.

    Acidic blood

    Scientists have long understood the importance and basic mechanisms for regulating blood pH. As aerobic organisms, humans breathe in oxygen and use this chemical with food nutrients to create energy, with the byproduct being the slightly acidic chemical carbon dioxide. Digestion also produces acids such as sulfuric acid. Dangerous levels of acid could build up in the bloodstream without a mechanism to neutralize or eliminate it.

    Many organs produce bicarbonate, which can buffer the acid, and the kidneys excrete excess acid through urine.
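    That buffering relationship is usually quantified with the Henderson-Hasselbalch equation for the bicarbonate system (a standard physiology formula, not something from the study itself); the sample values below are typical clinical figures:

```python
import math

# Henderson-Hasselbalch equation for blood:
#   pH = 6.1 + log10( [HCO3-] / (0.03 × pCO2) )
# with bicarbonate in mEq/L and pCO2 in mmHg.

def blood_ph(bicarbonate_meq_l, pco2_mmhg):
    return 6.1 + math.log10(bicarbonate_meq_l / (0.03 * pco2_mmhg))

print(f"normal bicarbonate (24 mEq/L): pH {blood_ph(24, 40):.2f}")  # ~7.40
print(f"low bicarbonate    (18 mEq/L): pH {blood_ph(18, 40):.2f}")  # more acidic
```

Less bicarbonate in the numerator means a lower, more acidic pH, which is the direction of the effect the researchers describe.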

    The new study found only an association, not a cause-and-effect relationship, between bicarbonate levels and the risk of early death, the researchers noted. They could not determine whether the low levels of bicarbonate in the blood contributed to the deaths or were an indicator of an underlying medical problem that could have contributed to these people’s deaths.

    Previous studies have found that low bicarbonate levels are associated with declining kidney function over time, even in people without kidney disease, and this increases the risk of death, including cardiovascular death, Raphael told Live Science. Low bicarbonate levels are also associated with inflammation and a loss of bone mineral and muscle mass, “so these factors may play a role,” Raphael said.

    Dr. Michael Emmett, chief of internal medicine at Baylor University Medical Center in Dallas, who was not part of the study, added that kidney function declines naturally with age, making it harder for the body to excrete acid loads derived from the diet. Acid retention reduces bicarbonate levels, and even low to normal levels may contribute to a variety of age-related disorders, such as osteoporosis, reduced muscle mass and strength, and kidney stones, he said.

    The findings, if confirmed in larger studies, may provide primary care doctors with a simple measurement to help identify patients who have an elevated risk of early death. However, the remedy to reduce this risk isn’t yet clear.

    Doctors who find low levels of bicarbonate in a patient’s blood might want to look for the root cause, to see if an underlying condition is responsible. They also might want to have patients increase their bicarbonate levels through the diet — but that doesn’t necessarily mean eating more baking soda or the baking goods that might contain it, Raphael said.

    Older people in general might benefit from a diet with less meat and more fruits and vegetables, Emmett said. Meat, which is rich in protein and thus amino acids, increases the body’s acid load, and this could be a challenge for aging kidneys to excrete. But when plant-based foods, particularly fruits, are digested, they produce bicarbonate. This, in theory, could raise bicarbonate levels in the blood.

    This simple premise of pH regulation has given rise to the alternative medical practice of consuming baking soda (sodium bicarbonate) or other sources of bicarbonate, which is purported to slow the aging process and cure cancer. But there is no evidence to support such claims, scientists say. [8 Tips for Healthy Aging]

    Worse, consuming too much baking soda can lead to health problems such as a perforated stomach, high blood pressure due to the sodium load, and possibly kidney stones from excess calcium excreted in the urine, Emmett said. Potassium bicarbonate might be a better alternative to baking soda, but it should be taken only under a doctor’s supervision, he said.

    “We don’t know for sure if raising low bicarbonate levels into the normal range with baking soda or taking baking soda if your bicarbonate levels are normal improves health,” Raphael said. “People with kidney, heart, lung and liver diseases, and women who are pregnant, should never self-medicate with bicarbonate or baking soda.”

    But no disrespect to baking soda — it gets out stains, deodorizes closets and refrigerators, whitens teeth, and makes fluffy baked goods. That ain’t too shabby.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 9:22 am on January 18, 2016 Permalink | Reply
    Tags: , , , livescience   

    From livescience: “727 People on Chesapeake Bay Island Could Become America’s First ‘Climate Refugees’”


    December 11, 2015
    Stephanie Pappas

    An aerial view of the town of Tangier on Tangier Island in Virginia’s Chesapeake Bay. Homes sit on yards bordered by estuarine marshes and tidal creeks. Most of the 700 or so inhabitants of Tangier get around on foot or by bicycle or golf cart. Credit: U.S. Army Corps of Engineers & David Schulte

    Rising seas will likely render the last inhabited island in Virginia uninhabitable in 50 years, a new study finds.

    The Chesapeake Bay’s Tangier Island, the site of the town of Tangier (population 727), will become uninhabitable by 2063 under a midrange estimate of climate-driven sea level rise, researchers report in the Dec. 10 issue of the journal Scientific Reports.

    Already, more than 500 low-lying islands in the Chesapeake Bay have vanished since Europeans first arrived in the area in the 1600s, said study leader David Schulte, an oceanographer with the U.S. Army Corps of Engineers Norfolk District. Engineering efforts could shore up Tangier, Schulte told Live Science, but saving the island and its neighbors will ultimately require action on climate.

    “There are actions that we can take,” he said. “But obviously the best action we could take would be to do something about the bigger issue.”

    Tangier is one of the Tangier Islands, a series of grassy spits of land about 14 miles (22 kilometers) east of mainland Virginia, within the Chesapeake Bay. Tangier is the southernmost of the islands, which also include Goose Island, Uppards Island and Port Isobel.

    A seawall protects the small airport at the town of Tangier on Tangier Island. Erected in 1989, this seawall keeps erosion from storms and sea level rise at bay on the western end of the island. New research, however, suggests that the island will be uninhabitable by 2063.
    Credit: U.S. Army Corps of Engineers & David Schulte

    Thirty-nine islands in the Chesapeake Bay were once habitable, Schulte said. Today, Tangier and Smith Island in Maryland are the only two that remain so. Erosion and sea level rise (and, to some extent, other factors like land subsidence due to groundwater pumping) have eaten away at the rest.

    Reliable maps of the Tangier Islands date back to the 1850s. Schulte and his colleagues compared these maps to the modern geography of the islands and then used projected rates of local sea level rise to estimate land loss in the future. Sea level has been rising globally between 0.04 and 0.1 inches (0.1 and 0.25 centimeters) per year, according to the National Oceanic and Atmospheric Administration, and that rate is accelerating. In addition, because of local geology, wind and ocean patterns, some regions will see relatively larger sea level increases. One such hotspot is along the U.S. East Coast from Boston, Massachusetts, to Cape Hatteras, North Carolina, a stretch that includes the Chesapeake Bay.

    Already, 66.75 percent of the Tangier Islands’ 1850 land mass has been lost, Schulte and his colleagues found. On the west side of Tangier Island, erosion from large storms plays a big role in the loss, Schulte said. On the east side, gradual sea level rise is mostly to blame.
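
    The scale of the loss is easy to sketch from the figures above (66.75 percent of the 1850 land mass gone over roughly 165 years of map records). This back-of-the-envelope linear extrapolation is our own illustration, not the study’s model, and it understates the risk because sea level rise is accelerating:

```python
# Fraction of the islands' 1850 land area lost by ~2015 (from the study)
fraction_lost = 0.6675
years_elapsed = 2015 - 1850  # 165 years of reliable map records

# Average loss rate, as a fraction of the 1850 area per year
loss_rate = fraction_lost / years_elapsed

# Naive linear extrapolation: years until the remaining land is gone
years_remaining = (1 - fraction_lost) / loss_rate
print(round(years_remaining))  # ~82 years; the study's midrange estimate
                               # is shorter (~50 years) because sea level
                               # rise is accelerating
```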

    A conservative, midrange estimate of sea level rise gives Tangier Island a mere 50 years to live, Schulte said. “If you take the more extreme high sea level rise, they’ve got about half the time, maybe 25 years,” he said.

    To the north, Goose Island is likely to be inundated by 2038 in the midrange sea level rise scenario, and Uppards will be mostly inundated by 2063 and gone by 2113. As firm land converts to sea marsh, the town of Tangier will likely be uninhabitable by 2063.

    Climate displacement?

    That could make the 700 or so townspeople of Tangier the first climate refugees in the United States, Schulte said. And the loss of the islands has ecological and economic consequences, too. By 2063, an estimated $1.75 million per year in “ecological services,” such as water filtration, bird nesting habitat and blue crab habitat, will be lost, Schulte said.

    A rock wall on Tangier Island, built in 1989, already protects a small airport from erosion. Other engineering solutions, like breakwaters and man-made dunes, could extend the life of the island by a few decades, Schulte said. The changes in the island, however, are already visible to the naked eye. A settlement on the island’s north end called Canaan was abandoned in the 1920s because of frequent flooding, but 10 years ago, visitors could still see old foundations and a graveyard, Schulte said. Today, it’s all gone.

    “What was land is now underwater,” Schulte said.

    See the full article here.


  • richardmitnick 11:21 am on January 17, 2016 Permalink | Reply
    Tags: , Death of our sun, livescience   

    From livescience: “What Will Happen to Earth When the Sun Dies?” 


    December 01, 2010 [Brought forward today.]
    No writer credit
    Space.com and Life’s Little Mysteries Staff

    The sun is 93 million miles (149.6 million km) away from Earth. No image credit.

    The sun is dying, and when it finally kicks, it will take Earth with it. We probably won’t be around to see it, though: The sun’s death throes will have taken out life here well before it swallows the planet.

    The good news? We’ve got a very, very long time before any of this happens.

    A panel of scientists at the annual meeting of the American Association for the Advancement of Science described the situation in 2000, and it still holds true. Astronomers generally agree that the sun will burn up its hydrogen fuel supply sometime in the next 5 billion to 7 billion years. As it does, the sun’s core will contract under gravity, which will ratchet up the heat on the remaining hydrogen and cause the sun to expand into a red giant.

    At this point, the sun will swallow the Earth.

    “Earth will end up in the sun, vaporizing and blending its material with that of the sun,” said Iowa State University’s Lee Anne Willson. “That part of the sun then blows away into space, so one might say Earth is cremated and the ashes are scattered into interstellar space.”

    By then, the sun will be hot enough to burn all its stored helium, and it will fluctuate in size. The sun isn’t quite massive enough to explode in an awesome supernova, so it will merely collapse into a relatively cool white dwarf.

    Perhaps a moot point, though, because we’ll most likely be long dead before this occurs. As the sun revs up to its red giant phase, it’s getting about 10 percent brighter every billion years. At that rate, scientists estimate that all the water on the planet will evaporate in the next billion years.
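
    That 10-percent-per-billion-years figure compounds. A quick sketch of what the brightening implies — our own illustration, using the rate quoted above:

```python
def relative_luminosity(billions_of_years, rate=0.10):
    """Sun's brightness relative to today, assuming ~10% growth
    per billion years, compounded."""
    return (1 + rate) ** billions_of_years

# In 1 billion years -- roughly when Earth's oceans are expected
# to have evaporated -- the sun is about 10% brighter than today
print(round(relative_luminosity(1), 2))  # 1.1

# By the red giant transition in ~5 billion years, ~61% brighter
print(round(relative_luminosity(5), 2))  # 1.61
```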

    See the full article here.


  • richardmitnick 11:02 am on January 17, 2016 Permalink | Reply
    Tags: , , , livescience,   

    From livescience: “Stephen Hawking: Black Holes Have ‘Hair’”


    January 14, 2016
    Tia Ghose

    This artist’s concept shows a black hole’s surroundings, including its accretion disk, jet and magnetic field. Credit: ESO/L. Calçada

    Black holes may sport a luxurious head of “hair” made up of ghostly, zero-energy particles, says a new hypothesis proposed by Stephen Hawking and other physicists.

    Dr. Stephen Hawking

    The new paper, which was published online Jan. 5 on the preprint server arXiv, proposes that at least some of the information devoured by a black hole is stored in these electric hairs.

    Still, the new proposal doesn’t prove that all the information that enters a black hole is preserved.

    “The million dollar question is whether all the information is stored in this way, and we have made no claims about that,” said study author Andrew Strominger, a physicist at Harvard University in Massachusetts. “It seems unlikely that the kind of hair that we described is rich enough to store all the information.”

    Black holes

    According to [Albert] Einstein’s theory of general relativity, black holes are extremely dense celestial objects that warp space-time so strongly that no light or matter can escape their clutches. Some primordial black holes formed soon after the Big Bang and may be the size of a single atom yet as massive as a mountain, according to NASA. Others form as gigantic stars collapse in on themselves, while supermassive black holes lie at the hearts of almost all galaxies.

    In the 1960s, physicist John Wheeler and colleagues proposed that black holes “have no hair,” a metaphor meaning that black holes were shorn of all complicated particularities. In Wheeler’s formulation, all black holes were identical except for their mass, electric charge and angular momentum (spin).

    Then, in the 1970s, Stephen Hawking proposed the notion now called Hawking radiation. In this formulation, all black holes “leak” mass in the form of ghostly quantum particles that escape over time. Eventually, Hawking radiation causes black holes to evaporate altogether, leaving a single, unique vacuum. The vacuums left by these black holes, according to the original theory, would be identical, and thus incapable of storing information about the objects from which they were formed, Strominger said.
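
    Hawking radiation carries a temperature that falls as the black hole’s mass grows. The standard formula, T = ħc³/(8πGMk_B), is textbook physics rather than anything from the new paper, but it shows why evaporation is so slow for astrophysical black holes:

```python
import math

# Physical constants (SI units)
HBAR = 1.0546e-34   # reduced Planck constant, J*s
C = 2.9979e8        # speed of light, m/s
G = 6.674e-11       # gravitational constant, m^3/(kg*s^2)
K_B = 1.3807e-23    # Boltzmann constant, J/K
M_SUN = 1.989e30    # solar mass, kg

def hawking_temperature(mass_kg):
    """Hawking temperature T = hbar * c^3 / (8 * pi * G * M * k_B)."""
    return HBAR * C**3 / (8 * math.pi * G * mass_kg * K_B)

# A solar-mass black hole radiates at only ~6e-8 kelvin -- far colder
# than the cosmic microwave background, so today it absorbs more energy
# than it emits and has not even begun to shrink.
print(f"{hawking_temperature(M_SUN):.2e} K")
```

    The inverse dependence on mass means tiny primordial black holes would be the hot, fast-evaporating ones, while stellar and supermassive black holes evaporate over timescales vastly longer than the age of the universe.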

    Since the Hawking radiation leaking from a black hole is completely random, that would mean black holes lose information over time, and there would be no way of knowing much about the celestial objects that formed the black holes. Yet that notion creates a paradox, because on the smallest scale, the laws of physics are completely reversible, meaning information that existed in the past should be theoretically recoverable. In recent years, Hawking has walked back the notion of information loss and conceded that black holes do store information after all.

    Black hole “snowflakes”

    In the past several years, Strominger has been dismantling some of these notions. First, he asked the question: What happens if you add a “soft” photon, or a particle of light with no energy, to the vacuum left behind after a black hole evaporates?

    Though most people have never heard of soft photons, the particles are ubiquitous, Strominger said. (Other particles, called soft gravitons, are hypothetical quantum particles that transmit gravity. Though they have never been detected, most physicists believe these particles exist and are also incredibly abundant, Strominger said.)

    “Every collision at the Large Hadron Collider produces an infinite number of soft photons and soft gravitons,” Strominger said. “We’re swimming in them all the time.”

    After working through the equations, he — together with Hawking and Malcolm Perry, who are both physicists at the University of Cambridge in England — found that the black hole vacuum would have the same energy but different angular momentum after the addition of a soft photon. That meant the vacuum state of an evaporated black hole is a kind of celestial snowflake, with its individual properties dependent on its origin and history.

    “Far from being a simple, vanilla object, it’s like a large hard drive which can store essentially an infinite amount of information in the form of these zero-energy photons and gravitons,” Strominger told Live Science.

    The new work is an extension of a short paper Hawking put out in 2014, which argued that the event horizon, or the point of no return before an object would get swallowed into a black hole forever, may not be a fixed boundary. The new paper posits that hairs of soft photons and gravitons fringe a black hole’s event horizon.

    Information paradox stands

    The problem is that this information is “incredibly scrambled up,” so retrieving it from a black hole is akin to determining what someone tossed into a bonfire after it has burned up, Strominger said. Essentially, the new work is the black hole equivalent of using smoke and fire to figure out the identity of the original object that was burnt, he added.

    “It’s not a final answer to the information problem, but it does seem like a step in the right direction,” said Aidan Chatwin-Davies, a physicist at the California Institute of Technology, who was not involved in the study.

    While some of the information in a black hole may be contained in its hairy halo of soft photons and gravitons, not all of it necessarily resides there, he said.

    “If anything, it puts forward some new ideas for us to think about which could prove very helpful in understanding black holes and how they encode information,” Chatwin-Davies told Live Science.

    See the full article here.


  • richardmitnick 6:32 pm on January 3, 2016 Permalink | Reply
    Tags: , livescience, Plague in the USA   

    From livescience: “Ingredients of Plague Risk in Western US Identified” 


    December 29, 2015
    Laura Geggel

    Fleas that bite rodents infected with the bacteria that cause the plague can transmit the disease to people. Credit: Janice Haney Carr/CDC

    Small outbreaks of the plague still occur in the western United States, and new research shows these clusters don’t happen at random. Instead, they tend to pop up in areas that have a certain mix of climates, animals and elevation, the study finds.

    Every year, an average of seven people in the western United States are infected with the bacteria that cause plague (Yersinia pestis). The bacteria — infamous for killing millions of people in Europe during the Middle Ages — typically live in rodents and fleas.

    In the new study, researchers wanted “to identify and map those areas with the greatest potential for human exposure to this infection,” Michael Walsh, an assistant professor in the Department of Epidemiology and Biostatistics in the School of Public Health at the SUNY Downstate Medical Center in New York, said in a statement. The researchers used surveillance data of plague in wild and domestic animals from all over the American West.

    The researchers determined that plague cases in the United States tend to happen in areas that have large populations of deer mice (Peromyscus maniculatus), rainy weather, moderate elevations and ground largely covered with man-made surfaces, such as roads and buildings.

    A map of the 66 confirmed animal plague cases that occurred in the United States between 2000 and 2015.
    Credit: Walsh M. and Haseeb M.A., PeerJ, 2015.

    Plague first came to the United States in 1900, when steamships carrying infected rats docked at U.S. port cities, according to the Centers for Disease Control and Prevention. The bacteria then spread from urban rats to rural rodents, eventually becoming endemic (or constantly present) in animals in the rural American West.

    These days, most human cases of plague in the United States happen in two regions: one area stretches across southern Colorado and the northern parts of New Mexico and Arizona, while the other region includes California, southern Oregon and western Nevada, the researchers said.

    But little is known about what specific factors — such as climate, land type and elevation — lead to small clusters of plague cases within these broad areas. To investigate, the researchers mapped 66 confirmed cases of plague in wild animals and pets that officials had documented between 2000 and 2015. Then, the researchers zeroed in on several conditions to determine what had contributed to outbreaks.

    Plague risk factors

    The resulting models showed that the presence of deer mice was the most influential factor contributing to plague cases, followed by elevation, the distance between the place where an infected animal was found and a man-made surface, and the average rainfall during the area’s wettest and driest seasons.

    Areas at higher elevations were associated with increased risk of plague in animals, but only at elevations lower than 1.2 miles (2 kilometers), the researchers found.

    “The reason for such a threshold is not entirely clear,” but might have to do with habitat availability, the researchers wrote in the study. For instance, deer mice prefer living around pinyon and juniper pines, trees that grow at moderate but not high elevations, the researchers said.

    Moreover, rainfall influenced plague risk. Places that had wet weather during the rainy season had a higher plague risk, but only up to 4 inches (100 millimeters) of rain in a three-month period. Beyond that threshold, plague risk declined, the researchers found.

    Likewise, increased rainfall during the dry season also corresponded to increased plague risk, but only up to a threshold of 2 inches (50 mm) of rain, after which plague risk dropped to zero. It’s likely that some (but not too much) rain leads to better food availability for rodents, the researchers said, which would explain this threshold.
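
    The rainfall thresholds described above can be read as a piecewise relationship. The sketch below is a toy illustration of those thresholds only — the study fit a proper ecological niche model to its surveillance data, and these functions are not its actual equations:

```python
def wet_season_risk(rain_mm):
    """Toy model: relative risk rises with wet-season rainfall up to
    ~100 mm per three months, then declines beyond the threshold."""
    if rain_mm <= 100:
        return rain_mm / 100.0  # rising leg, normalized to peak at 1.0
    return max(0.0, 1.0 - (rain_mm - 100) / 100.0)  # declining leg

def dry_season_risk(rain_mm):
    """Toy model: relative risk rises with dry-season rainfall up to
    ~50 mm, then drops to zero beyond the threshold."""
    if rain_mm <= 50:
        return rain_mm / 50.0
    return 0.0

# Risk peaks at the thresholds, not at the rainfall extremes
print(wet_season_risk(100))  # 1.0
print(wet_season_risk(150))  # 0.5
print(dry_season_risk(60))   # 0.0
```

    The shape matches the intuition the researchers offer: some rain improves rodent food supply, but too much suppresses plague transmission.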

    Finally, areas of animal habitat that were close to man-made surfaces also had an increased plague risk.

    “To the best of the authors’ knowledge, this is the first study to demonstrate an influence of developed land on animal plague occurrence in the U.S.,” the researchers said. It’s likely that developed areas bring wild animals closer to people and domestic animals, increasing the risk of spreading plague, the researchers said.

    The findings may help public health officials monitor areas in the American West that are at high risk of plague infection, Walsh said.

    The study was published online Dec. 14 in the journal PeerJ.

    See the full article here.

