Tagged: The Conversation

  • richardmitnick 4:14 pm on July 16, 2017 Permalink | Reply
    Tags: , Comparing competitiveness, Is America’s digital leadership on the wane?, Just five companies – Comcast Spectrum Verizon CenturyLink and AT&T – serve more than 80 percent of wired-internet customers in the US, The Conversation, The US is stalling out, We looked not only at current conditions but also at how fast those conditions are changing

    From The Conversation: “Is America’s digital leadership on the wane?” Simple Answer Yes 

    The Conversation

    July 16, 2017
    Bhaskar Chakravorti

    American leadership in technology innovation and economic competitiveness is at risk if U.S. policymakers don’t take crucial steps to protect the country’s digital future. The country that gave the world the internet and the very concept of the disruptive startup could find its role in the global innovation economy slipping from reigning incumbent to a disrupted has-been.

    My research, conducted with Ravi Shankar Chaturvedi, investigates our increasingly digital global society, in which physical interactions – in communications, social and political exchange, commerce, media and entertainment – are being displaced by electronically mediated ones. Our most recent report, “Digital Planet 2017: How Competitiveness and Trust in Digital Economies Vary Across the World,” confirms that the U.S. is on the brink of losing its long-held global advantage in digital innovation.

    Our yearlong study examined factors that influence innovation, such as economic conditions, governmental backing, startup funding, research and development spending and entrepreneurial talent across 60 countries. We found that while the U.S. has a very advanced digital environment, the pace of American investment and innovation is slowing. Other countries – not just major powers like China, but also smaller nations like New Zealand, Singapore and the United Arab Emirates – are building significant public and private efforts that we expect to become foundations for future generations of innovation and successful startup businesses.

    Based on our findings, I believe that rolling back net neutrality rules [NYT] will jeopardize the digital startup ecosystem that has created value for customers, wealth for investors and globally recognized leadership for American technology companies and entrepreneurs. The digital economy in the U.S. is already on the verge of stalling; failing to protect an open internet [freepress] would further erode the United States’ digital competitiveness, making a troubling situation even worse.

    Comparing 60 countries’ digital economies. Harvard Business Review, used and reproducible by permission, CC BY-ND.

    Comparing competitiveness

    In the U.S., the reins of internet connectivity are tightly controlled. Just five companies – Comcast, Spectrum, Verizon, CenturyLink and AT&T – serve more than 80 percent of wired-internet customers. What those companies provide is both slower and more expensive than in many countries around the world. Ending net neutrality, as the Trump administration has proposed, would give internet providers even more power, letting them decide which companies’ innovations can reach the public, and at what costs and speeds.

    However, our research shows that the U.S. doesn’t need more limits on startups. Rather, it should work to revive the creative energy that has been America’s gift to the digital planet. For each of the 60 countries we examined, we combined 170 factors – including elements that measure technological infrastructure, government policies and economic activity – into a ranking we call the Digital Evolution Index.

    To evaluate a country’s competitiveness, we looked not only at current conditions, but also at how fast those conditions are changing. For example, we noted not only how many people have broadband internet service, but also how quickly access is becoming available to more of a country’s population. And we observed not just how many consumers are prepared to buy and sell online, but whether this readiness to transact online is increasing each year and by how much.

    The countries formed four major groups:

    “Stand Out” countries can be considered the digital elite; they are both highly digitally evolved and advancing quickly.
    “Stall Out” countries have reached a high level of digital evolution, but risk falling behind due to a slower pace of progress and would benefit from a heightened focus on innovation.
    “Break Out” countries score relatively low for overall digital evolution, but are evolving quickly enough to suggest they have the potential to become strong digital economies.
    “Watch Out” countries are neither well advanced nor improving rapidly. They have a lot of work to do, both in terms of infrastructure development and innovation.
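    The four zones amount to a two-axis classification: how digitally evolved a country is, and how fast it is changing. As an illustration of how such a classification works (the scoring function, cutoffs and country values below are hypothetical, not the Digital Evolution Index's actual data or thresholds):

```python
# Illustrative sketch only: classify countries into the four zones using a
# hypothetical "state" score (current digital evolution) and "momentum"
# score (rate of change). Cutoffs and values are invented for illustration.

def classify(state, momentum, state_cutoff=50.0, momentum_cutoff=2.0):
    """Map a (state, momentum) pair to one of the four zones."""
    if state >= state_cutoff:
        return "Stand Out" if momentum >= momentum_cutoff else "Stall Out"
    return "Break Out" if momentum >= momentum_cutoff else "Watch Out"

# Hypothetical example values, purely for illustration.
countries = {
    "Country A": (72.0, 3.1),  # advanced and fast-moving
    "Country B": (68.0, 0.9),  # advanced but slowing
    "Country C": (41.0, 4.2),  # behind but catching up quickly
    "Country D": (35.0, 0.5),  # behind and slow-moving
}
for name, (state, momentum) in countries.items():
    print(name, "->", classify(name and state, momentum))
```

    The real index combines 170 factors per country into each axis; the point of the sketch is only that both the level and the first derivative feed the classification.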

    The US is stalling out

    The picture that emerges for the U.S. is not a pretty one. America is the 10th-most digitally advanced country today, but its progress is slowing. It is close to joining the major EU countries and the Nordic nations in a club of nations that are, digitally speaking, stalling out.

    The “Stand Out” countries are setting new global standards of high states of evolution and high rates of change, and exploring various innovations such as self-driving cars or robot policemen. New Zealand, for example, is investing in a superior telecommunications system and adopting forward-looking policies that create incentives for entrepreneurs. Singapore plans to invest more than US$13 billion in high-tech industries by 2020. The United Arab Emirates has created free-trade zones and is transforming the city of Dubai into a “smart city,” linking sensors and government offices with residents and visitors to create an interconnected web of transportation, utilities and government services.

    India’s smartphone market – a key element of internet connectivity there – is growing rapidly. Shailesh Andrade/Reuters.

    The “Break Out” countries, many in Asia, are typically not as advanced as others at present, but are catching up quickly, and are on pace to surpass some of today’s “Stand Out” nations in the near future. For example, China – the world’s largest retail and e-commerce market, with the world’s largest number of people using the internet – has the fastest-changing digital economy. Another “Break Out” country is India, which is already the world’s second-largest smartphone market. Though only one-fifth of its 1.3 billion people have online access today, by 2030, some estimates suggest, 1 billion Indians will be online.

    By contrast, the U.S. is on the edge between “Stand Out” and “Stall Out.” One reason is that the American startup economy is slowing down: Private startups are attracting huge investments, but those investments aren’t paying off when the startups are acquired by bigger companies or go public.

    Investors, business leaders and policymakers need to take a more realistic look at the best way to profit from innovation, balancing efforts toward both huge results and modest ones. They may need to recall the lesson from the founding of the internet itself: If government invests in key aspects of digital infrastructure, either directly or by creating subsidies and tax incentives, that lays the groundwork for massive private investment and innovation that can transform the economy.

    In addition, investments in Asian digital startups have exceeded those in the U.S. for the first time. According to CB Insights and PwC, US$19.3 billion in venture capital from sources around the world was invested in Asian tech startups in the second quarter of 2017, while the U.S. had $18.4 billion in new investment over the same period.

    This is consistent with our findings that Asian high-momentum countries are the ones in the “Break Out” zone; these countries are the ones most exciting for investors. Over time, the U.S.-Asia gap could widen; both money and talent could migrate to digital hot spots elsewhere, such as China and India, or smaller destinations, such as Singapore and New Zealand.

    For the country that gave the world the foundations of the digital economy and a president who seems perpetually plugged in, falling behind would, indeed, be a disgrace.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The Conversation US launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors work with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues. And hopefully allow for a better quality of public discourse and conversation.

     
  • richardmitnick 10:29 am on May 8, 2017 Permalink | Reply
    Tags: , Cycling to work: major new study suggests health benefits are staggering, The Conversation   

    From The Conversation: “Cycling to work: major new study suggests health benefits are staggering” 

    The Conversation

    April 19, 2017 [Not exactly prompt.]
    Jason Gill
    Reader, Institute of Cardiovascular and Medical Sciences, University of Glasgow

    Carlos Celis-Morales
    Research Associate, Institute of Cardiovascular and Medical Sciences, University of Glasgow

    Pump action. Csaba Peterdi

    Research has consistently shown that people who are less physically active are both more likely to develop health problems like heart disease and type 2 diabetes, and to die younger. Yet there is increasing evidence that physical activity levels are on the decline.

    The problem is that when there are many demands on our time, many people find prioritising exercise difficult. One answer is to multi-task by cycling or walking to work. We’ve just completed the largest ever study into how this affects your health.

    Published in the British Medical Journal today, the results for cycling in particular have important implications. They suggest that councils and governments need to make it a top priority to encourage as many commuters to get on their bikes as possible.

    The findings

    Cycling or walking to work, sometimes referred to as active commuting, is not very common in the UK. Only 3% of commuters cycle to work and 11% walk, one of the lowest rates in Europe. At the other end of the scale, 43% of the Dutch and 30% of Danes cycle daily.

    To get a better understanding of what the UK could be missing, we looked at 263,450 people with an average age of 53 who were either in paid employment or self-employed, and didn’t always work at home. Participants were asked whether they usually travelled to work by car, public transport, walking, cycling or a combination.

    We then grouped our commuters into five categories: non-active (car/public transport); walking only; cycling (including some who also walked); mixed-mode walking (walking plus non-active); and mixed-mode cycling (cycling plus non-active, including some who also walked).
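    The grouping rule described above can be sketched as a small function (the function name and the encoding of reported modes are invented for illustration; only the five category names come from the study):

```python
# Hypothetical sketch of the study's five-way commuter grouping.
# A participant reports a set of usual commute modes; cycling takes
# precedence over walking, per the categories described in the text.

def commute_category(modes):
    """Map a set of reported commute modes to one of five categories."""
    non_active = {"car", "public transport"}
    uses_non_active = bool(modes & non_active)
    if "cycling" in modes:
        return "mixed-mode cycling" if uses_non_active else "cycling"
    if "walking" in modes:
        return "mixed-mode walking" if uses_non_active else "walking only"
    return "non-active"

print(commute_category({"car"}))                 # non-active
print(commute_category({"cycling", "walking"}))  # cycling
print(commute_category({"walking", "car"}))      # mixed-mode walking
```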

    We followed people for around five years, counting the incidences of heart disease, cancers and death. Importantly, we adjusted for other health influences including sex, age, deprivation, ethnicity, smoking, body mass index, other types of physical activity, time spent sitting down and diet. Any potential differences in risk associated with road accidents are also accounted for in our analysis, while we excluded participants who had heart disease or cancer already.

    We found that cycling to work was associated with a 41% lower risk of dying overall compared to commuting by car or public transport. Cycle commuters had a 52% lower risk of dying from heart disease and a 40% lower risk of dying from cancer. They also had a 46% lower risk of developing heart disease and a 45% lower risk of developing cancer.

    Walking to work was not associated with a lower risk of dying from all causes. Walkers did, however, have a 27% lower risk of heart disease and a 36% lower risk of dying from it.

    The mixed-mode cyclists enjoyed a 24% lower risk of death from all causes, a 32% lower risk of developing cancer and a 36% lower risk of dying from cancer. They did not have a significantly lower risk of heart disease, however, while mixed-mode walkers did not have a significantly lower risk of any of the health outcomes we analysed.

    For both cyclists and walkers, there was a trend for a greater lowering of risk in those who commuted longer distances. In addition, those who cycled part of the way to work still saw benefits – this is important as many people live too far from work to cycle the entire distance.

    As for walkers, the fact that their health benefits were more modest may be related to distance, since they commute fewer miles on average in the UK – six per week compared to 30 for cyclists. They may therefore need to walk longer distances to elicit meaningful benefits. Equally, however, it may be that the lower benefits from walking are related to the fact that it’s a less intense activity.

    What now?

    Our work builds on the evidence from previous studies [American Journal of Epidemiology] in a number of important ways. Our quarter of a million participants was larger than all previous studies combined, which enabled us to show the associations between cycling/walking to work and health outcomes more clearly than before.

    In particular, the findings resolve previous uncertainties about the association with cancer, and also with heart attacks and related fatalities. We also had enough participants to separately evaluate cycling, walking and mixed-mode commuting for the first time, which helped us confirm that cycling to work is more beneficial than walking.

    In addition, much of the previous research was undertaken in places like China and the Nordic countries where cycling to work is common and the supporting infrastructure is good. We now know that the same benefits apply in a country where active commuting is not part of the established culture.

    It is important to stress that while we did our best to eliminate other potential factors which might influence the findings, it is never possible to do this completely. This means we cannot conclusively say active commuting is the cause of the health outcomes that we measured. Nevertheless, the findings suggest policymakers can make a big difference to public health by encouraging cycling to work in particular. And we should not forget other benefits such as reducing congestion and motor emissions.

    Some countries are well ahead of the UK in encouraging cyclists. In Copenhagen and Amsterdam, for instance, people cycle because it is the easiest way to get around town.

    Dutch courage. S-F

    It was not always this way – both cities pursued clear strategies to improve cycle infrastructure first. Ways to achieve this include increasing provision for cycle lanes, city bike hire schemes, subsidised bike purchase schemes, secure cycle parking and more facilities for bicycles on public transport.

    For the UK and other countries that have lagged behind, the new findings suggest there is a clear opportunity. If decision makers are bold enough to rise to the challenge, the long-term benefits are potentially transformative.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition


     
  • richardmitnick 9:22 am on April 8, 2017 Permalink | Reply
    Tags: , , , , , , Exoplanet discovery by an amateur astronomer shows the power of citizen science, The Conversation   

    From CSIRO via The Conversation: “Exoplanet discovery by an amateur astronomer shows the power of citizen science “ 

    CSIRO bloc

    Commonwealth Scientific and Industrial Research Organisation

    The Conversation

    April 7, 2017
    Ray Norris

    You don’t need to be a professional astronomer to find new worlds orbiting distant stars. Darwin mechanic and amateur astronomer Andrew Grey this week helped to discover a new exoplanet system with at least four orbiting planets.

    An artist’s impression of some of the thousands of exoplanets discovered by NASA’s Kepler Space Telescope. Credit: NASA/JPL

    But Andrew did have professional help and support.

    The discovery was a highlight moment of this week’s three-evening special ABC Stargazing Live, featuring British physicist Brian Cox, presenter Julia Zemiro and others.

    Viewers were encouraged to join in the search for exoplanets – planets orbiting distant stars – using the Exoplanet Explorers website. After a quick tutorial they were then asked to trawl through data on thousands of stars recently observed with NASA’s Kepler Space Telescope.

    NASA/Kepler Telescope

    Grey checked out more than 1,000 stars on the website before spotting, in one star’s data, the characteristic dips in brightness that signify an exoplanet.

    As the planet passes in front of the star, it hides part of the star, causing a characteristic dip in brightness. ABC/Zooniverse
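    The underlying idea, flagging points that fall well below a star's typical brightness, can be sketched on a toy light curve. This is not the Exoplanet Explorers pipeline; the data and threshold below are invented, and real Kepler light curves are far noisier and are searched for *periodic* dips:

```python
# Toy transit detection: flag samples that dip well below the median flux.
# The light curve here is noiseless and invented purely for illustration.
import statistics

def find_dips(flux, n_sigma=3.0):
    """Return indices where flux falls more than n_sigma below the median."""
    med = statistics.median(flux)
    sigma = statistics.pstdev(flux)  # crude noise estimate
    return [i for i, f in enumerate(flux) if f < med - n_sigma * sigma]

# Constant brightness with two 1% transit dips.
flux = [1.0] * 20
flux[5] = flux[15] = 0.99
print(find_dips(flux))  # [5, 15]
```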

    Together with other co-discoverers, Grey’s name will appear on a scientific paper reporting the very significant discovery of a star with four planets, orbiting closer to the star than Mercury is to our Sun.

    Grey told Stargazing Live:

    “That is amazing. Definitely my first scientific publication … just glad that I can contribute. It feels very good.”

    Cox was clearly impressed by the new discovery:

    “In the seven years I’ve been making Stargazing Live this is the most significant scientific discovery we’ve ever made.”

    A breakthrough for citizen science

    So just what does this discovery signify? First, let’s be clear: this is no publicity stunt, or a bit of fake news dressed up to make a good story.

    This is a real scientific discovery, to be reported in the scientific literature like other discoveries made by astronomers.

    It will help us understand the formation of our own Earth. It’s also a step towards establishing whether we are alone in the universe, or whether there are other planets populated by other civilisations.

    On the other hand, it must be acknowledged that this discovery joins the list of more than 2,300 known exoplanets discovered by Kepler so far. There are thousands more candidate planets to be examined.

    If Grey and his colleagues hadn’t discovered this new planetary system, then somebody else would have eventually discovered it. But that can be said of all discoveries. The fact remains that this particular discovery was made by Grey and his fellow citizen scientists.

    Amateurs and professionals working together

    I think that the greatest significance of this discovery is that it heralds a change in the way we do science.

    As I said earlier, Grey didn’t make this discovery alone. He used data from the Kepler spacecraft with a mission cost of US$600 million.

    Although we can build stunning telescopes that produce vast amounts of valuable data, we can’t yet build an algorithm that approaches the extraordinary abilities of the human brain to examine that data.

    A human brain can detect patterns in the data far more effectively than any machine-learning algorithm yet devised. Because of the large volume of data generated by Kepler and other scientific instruments, we need large teams of human brains – larger than any research lab.

    But the brains don’t need to be trained astrophysicists; they just need to have the amazing cognitive abilities of the human brain.

    This results in a partnership where big science produces data, and citizen scientists inspect the data to help make discoveries. It means that anyone can be involved in cutting-edge science, accelerating the growth of human knowledge.

    A gathering of brainpower

    This is happening all over science and even the arts, from butterfly hunting to transcribing Shakespeare’s handwriting.

    Last year citizen scientists in the Australian-led Radio Galaxy Zoo project discovered the largest known cluster of galaxies.

    None of these projects would be possible without widespread access to the internet, and readily-available tools to build citizen science projects, such as the Zooniverse project.

    Will machines ever make citizen scientists redundant? I have argued before that we need to build algorithms called “machine scientists” to make future discoveries from the vast volumes of data we are generating.

    But these algorithms still need to be trained by humans. The larger our human-generated training set, the better our machine scientists will work.

    So rather than making citizen scientists redundant, the machine scientists multiply the power of citizen scientists, so that a discovery made by a future Andrew Grey may result in hundreds of discoveries by machines trained using his discovery.

    I see the power of citizen scientists continuing to grow. I suspect this is only the start. We can do much more. We can increase the “fun” of doing citizen science by introducing “gaming” elements into citizen science programs, or by taking advantage of new technologies such as augmented reality and immersive virtual reality.

    Perhaps we can tap into other human qualities such as imagination and creativity to achieve goals that still frustrate machines.

    I look forward to the day when a Nobel prize is won by someone in a developing country without access to a traditional university education, but who uses the power of their mind, the wealth of information on the web and the tools of citizen science to transcend the dreams of traditional science.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    CSIRO campus

    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

     
  • richardmitnick 8:17 am on April 6, 2017 Permalink | Reply
    Tags: , , Paradoxes of probability and other statistical strangeness, The Conversation   

    From COSMOS: “Paradoxes of probability and other statistical strangeness” 

    Cosmos Magazine bloc

    COSMOS

    Statistics and probability can sometimes yield mind-bending results. Shutterstock

    Statistics is a useful tool for understanding the patterns in the world around us. But our intuition often lets us down when it comes to interpreting those patterns. In this series we look at some of the common mistakes we make and how to avoid them when thinking about statistics, probability and risk.

    You don’t have to wait long to see a headline proclaiming that some food or behaviour is associated with either an increased or a decreased health risk, or often both. How can it be that seemingly rigorous scientific studies can produce opposite conclusions?

    Nowadays, researchers can access a wealth of software packages that can readily analyse data and output the results of complex statistical tests. While these are powerful resources, they also open the door to people without a full statistical understanding to misunderstand some of the subtleties within a dataset and to draw wildly incorrect conclusions.

    Here are a few common statistical fallacies and paradoxes and how they can lead to results that are counterintuitive and, in many cases, simply wrong.

    Simpson’s paradox
    What is it?

    This is where trends that appear within different groups disappear when data for those groups are combined. When this happens, the overall trend might even appear to be the opposite of the trends in each group.

    One example of this paradox is where a treatment can be detrimental in all groups of patients, yet can appear beneficial overall once the groups are combined.
    How does it happen?

    This can happen when the sizes of the groups are uneven. A trial with careless (or unscrupulous) selection of the numbers of patients could conclude that a harmful treatment appears beneficial.
    Example

    Consider the following double blind trial of a proposed medical treatment. A group of 120 patients (split into subgroups of sizes 10, 20, 30 and 60) receive the treatment, and 120 patients (split into subgroups of corresponding sizes 60, 30, 20 and 10) receive no treatment.

    The overall results make it look like the treatment was beneficial to patients, with a higher recovery rate for patients with the treatment than for those without it.

    The Conversation, CC BY-ND

    However, when you drill down into the various groups that made up the cohort in the study, you see in all groups of patients, the recovery rate was 50% higher for patients who had no treatment.

    The Conversation, CC BY-ND

    But note that the size and age distribution of each group is different between those who took the treatment and those who didn’t. This is what distorts the numbers. In this case, the treatment group is disproportionately stacked with children, whose recovery rates are typically higher, with or without treatment.
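    The reversal is easy to reproduce numerically. The subgroup figures below are invented to match the setup described above (treated subgroups of 10, 20, 30 and 60 patients against untreated subgroups of 60, 30, 20 and 10, with the untreated recovery rate 50% higher within every subgroup); they are not the numbers from The Conversation's own table:

```python
# Simpson's paradox with made-up numbers matching the trial described above.
# Each entry is (subgroup size, number who recovered), youngest to oldest.
treated   = [(10, 1), (20, 4), (30, 12), (60, 36)]
untreated = [(60, 9), (30, 9), (20, 12), (10, 9)]

def rate(groups):
    return sum(r for _, r in groups) / sum(n for n, _ in groups)

for (nt, rt), (nu, ru) in zip(treated, untreated):
    # within every subgroup, untreated recovery is exactly 1.5x treated
    assert 2 * ru * nt == 3 * rt * nu

print(f"overall treated:   {rate(treated):.1%}")    # 44.2%
print(f"overall untreated: {rate(untreated):.1%}")  # 32.5%
```

    Because the treated cohort is stacked with the subgroups that recover most anyway, the aggregate comparison flips: treatment looks better overall even though it is worse in every subgroup.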

    Base rate fallacy
    What is it?

    This fallacy occurs when we disregard important information when making a judgement on how likely something is.

    If, for example, we hear that someone loves music, we might think it’s more likely they’re a professional musician than an accountant. However, there are many more accountants than there are professional musicians. Here we have neglected the fact that the base rate of accountants is far higher than that of musicians, so we were unduly swayed by the information that the person likes music.
    How does it happen?

    The base rate fallacy occurs when the base rate for one option is substantially higher than for another.
    Example

    Consider testing for a rare medical condition, such as one that affects only 4% (1 in 25) of a population.

    Let’s say there is a test for the condition, but it’s not perfect. If someone has the condition, the test will correctly identify them as being ill around 92% of the time. If someone doesn’t have the condition, the test will correctly identify them as being healthy 75% of the time.

    So if we test a group of people, and find that over a quarter of them are diagnosed as being ill, we might expect that most of these people really do have the condition. But we’d be wrong.

    In a typical sample of 300 patients, for every 11 people correctly identified as unwell, a further 72 are incorrectly identified as unwell. The Conversation, CC BY-ND

    According to our numbers above, of the 4% of patients who are ill, almost 92% will be correctly diagnosed as ill (that is, about 3.67% of the overall population). But of the 96% of patients who are not ill, 25% will be incorrectly diagnosed as ill (that’s 24% of the overall population).

    What this means is that of the approximately 27.67% of the population who are diagnosed as ill, only around 3.67% actually are. So of the people who were diagnosed as ill, only around 13% (that is, 3.67%/27.67%) actually are unwell.
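    That arithmetic can be spelled out directly with the article's numbers (4% prevalence, a 92% true-positive rate and a 75% true-negative rate):

```python
# Base rate fallacy: the fraction of positive tests that are true positives.
prevalence, sensitivity, specificity = 0.04, 0.92, 0.75

true_pos = prevalence * sensitivity                # ill and flagged ill
false_pos = (1 - prevalence) * (1 - specificity)   # healthy but flagged ill
flagged = true_pos + false_pos

print(f"flagged as ill:          {flagged:.2%}")              # 27.68%
print(f"actually ill if flagged: {true_pos / flagged:.0%}")   # 13%
```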

    Worryingly, when a famous study asked general practitioners to perform a similar calculation to inform patients of the correct risks associated with mammogram results, just 15% of them did so correctly.

    Will Rogers paradox
    What is it?

    This occurs when moving something from one group to another raises the average of both groups, even though no values actually increase.

    The name comes from the American comedian Will Rogers, who joked that “when the Okies left Oklahoma and moved to California, they raised the average intelligence in both states”.

    Former New Zealand Prime Minister Rob Muldoon provided a local variant on the joke in the 1980s, regarding migration from his nation into Australia.

    How does it happen?

    When a datapoint is reclassified from one group to another, if the point is below the average of the group it is leaving, but above the average of the one it is joining, both groups’ averages will increase.
    Example

    Consider the case of six patients whose life expectancies (in years) have been assessed as being 40, 50, 60, 70, 80 and 90.

    The patients who have life expectancies of 40 and 50 have been diagnosed with a medical condition; the other four have not. This gives an average life expectancy within diagnosed patients of 45 years and within non-diagnosed patients of 75 years.

    If an improved diagnostic tool is developed that detects the condition in the patient with the 60-year life expectancy, then the average within both groups rises by 5 years.

    The Conversation, CC BY-ND
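    The six-patient example above is quick to verify: reclassifying the 60-year patient raises both group averages by 5 years, even though no individual's life expectancy changed.

```python
# Will Rogers paradox with the life expectancies from the example above.
def avg(xs):
    return sum(xs) / len(xs)

diagnosed, non_diagnosed = [40, 50], [60, 70, 80, 90]
print(avg(diagnosed), avg(non_diagnosed))  # 45.0 75.0

# The improved diagnostic tool reclassifies the 60-year patient.
non_diagnosed.remove(60)
diagnosed.append(60)
print(avg(diagnosed), avg(non_diagnosed))  # 50.0 80.0
```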

    Berkson’s paradox
    What is it?

    Berkson’s paradox can make it look like there’s an association between two independent variables when there isn’t one.
    How does it happen?

    This happens when we have a set with two independent variables, which means they should be entirely unrelated. But if we only look at a subset of the whole population, it can look like there is a negative trend between the two variables.

    This can occur when the subset is not an unbiased sample of the whole population. It has been frequently cited in medical statistics. For example, if patients only present at a clinic with disease A, disease B or both, then even if the two diseases are independent, a negative association between them may be observed.

    Example

    Consider the case of a school that recruits students based on both academic and sporting ability. Assume that these two skills are totally independent of each other. That is, in the whole population, an excellent sportsperson is just as likely to be strong or weak academically as is someone who’s poor at sport.

    If the school admits only students who are excellent academically, excellent at sport or excellent at both, then within this group it would appear that sporting ability is negatively correlated with academic ability.

    To illustrate, assume that every potential student is ranked on both academic and sporting ability from 1 to 10. There are an equal proportion of people in each band for each skill. Knowing a person’s band in either skill does not tell you anything about their likely band in the other.

    Assume now that the school only admits students who are at band 9 or 10 in at least one of the skills.

    If we look at the whole population, the average academic rank of the weakest sportsperson and the best sportsperson are both equal (5.5).

    However, within the set of admitted students, the average academic rank of the elite sportsperson is still that of the whole population (5.5), but the average academic rank of the weakest sportsperson is 9.5, wrongly implying a negative correlation between the two abilities.

    The Conversation, CC BY-ND

    Multiple comparisons fallacy
    What is it?

    This is where unexpected trends can occur through random chance alone in a data set with a large number of variables.

    How does it happen?

    When looking at many variables and mining for trends, it is easy to overlook how many possible trends you are testing. For example, with 1,000 variables, there are almost half a million (1,000×999/2) potential pairs of variables that might appear correlated by pure chance alone.

    While each pair is extremely unlikely to look dependent, the chances are that from the half million pairs, quite a few will look dependent.
    Example

    The Birthday paradox is a classic example of the multiple comparisons fallacy.

    In a group of 23 people (assuming each of their birthdays is an independently chosen day of the year with all days equally likely), it is more likely than not that at least two of the group have the same birthday.

    People often disbelieve this, recalling that it is rare that they meet someone who shares their own birthday. If you just pick two people, the chance they share a birthday is, of course, low (roughly 1 in 365, which is less than 0.3%).

    However, with 23 people there are 253 (23×22/2) pairs of people who might have a common birthday. So by looking across the whole group you are testing to see if any one of these 253 pairings, each of which independently has a 0.3% chance of coinciding, does indeed match. These many possibilities of a pair actually make it statistically very likely for coincidental matches to arise.

    For a group of as few as 40 people, a shared birthday is almost nine times as likely as not.
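The pair-counting argument can be verified with the exact calculation: multiply the chances that each successive person misses all the earlier birthdays (ignoring leap years, as the article does), and subtract from one. A minimal sketch:

```python
from math import comb

def p_shared_birthday(n, days=365):
    """Probability that at least two of n people share a birthday."""
    p_all_distinct = 1.0
    for i in range(n):
        # Person i+1 must avoid the i birthdays already taken.
        p_all_distinct *= (days - i) / days
    return 1.0 - p_all_distinct

print(comb(23, 2))                      # 253 pairs being tested
print(round(p_shared_birthday(23), 3))  # 0.507 -- better than even
print(round(p_shared_birthday(40), 3))  # 0.891 -- ~8x as likely as not
```

The 253 pairs at 23 people push the probability just past one half, and at 40 people the odds of a match versus no match are roughly eight to one.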

    The probability of no shared birthdays drops as the number of people in a group increases. The Conversation, CC BY-ND

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 8:57 am on February 20, 2017 Permalink | Reply
    Tags: , Man-made earthquakes, The Conversation   

    From The Conversation: “Earthquakes triggered by humans pose growing risk” 

    Conversation
    The Conversation

    January 22, 2017
    No writer credit found

    Devastation in Sichuan province after the 2008 Wenchuan earthquake, thought to be induced by industrial activity at a nearby reservoir. dominiqueb/flickr

    People knew we could induce earthquakes before we knew what they were. As soon as people started to dig minerals out of the ground, rockfalls and tunnel collapses must have become recognized hazards.

    Today, earthquakes caused by humans occur on a much greater scale. Events over the last century have shown mining is just one of many industrial activities that can induce earthquakes large enough to cause significant damage and death. Filling of water reservoirs behind dams, extraction of oil and gas, and geothermal energy production are just a few of the modern industrial activities shown to induce earthquakes.

    As more and more types of industrial activity were recognized to be potentially seismogenic, the Nederlandse Aardolie Maatschappij BV, an oil and gas company based in the Netherlands, commissioned us to conduct a comprehensive global review of all human-induced earthquakes.

    Our work assembled a rich picture from the hundreds of jigsaw pieces scattered throughout the national and international scientific literature of many nations. The sheer breadth of industrial activity we found to be potentially seismogenic came as a surprise to many scientists. As the scale of industry grows, the problem of induced earthquakes is increasing also.

    In addition, we found that, because small earthquakes can trigger larger ones, industrial activity has the potential, on rare occasions, to induce extremely large, damaging events.

    How humans induce earthquakes

    As part of our review we assembled a database of cases that is, to our knowledge, the fullest drawn up to date. On Jan. 28, we will release this database publicly. We hope it will inform citizens about the subject and stimulate scientific research into how to manage this very new challenge to human ingenuity.

    Our survey showed mining-related activity accounts for the largest number of cases in our database.

    Earthquakes caused by humans

    Last year, the Nederlandse Aardolie Maatschappij BV commissioned a comprehensive global review of all human-induced earthquakes. The sheer breadth of industrial activity that is potentially seismogenic came as a surprise to many scientists. These examples are now catalogued at The Induced Earthquakes Database.

    Mining 37.4%
    Water reservoir impoundment 23.3%
    Conventional oil and gas 15%
    Geothermal 7.8%
    Waste fluid injection 5%
    Fracking 3.9%
    Nuclear explosion 3%
    Research experiments 1.8%
    Groundwater extraction 0.7%
    Construction 0.3%
    Carbon capture and storage 0.3%

    Source: Earth-Science Reviews

    Initially, mining technology was primitive. Mines were small and relatively shallow. Collapse events would have been minor – though this might have been little comfort to anyone caught in one.

    But modern mines exist on a totally different scale. Precious minerals are extracted from mines that may be over two miles deep or extend several miles offshore under the oceans. The total amount of rock removed by mining worldwide now amounts to several tens of billions of tons per year. That’s double what it was 15 years ago – and it’s set to double again over the next 15. Meanwhile, much of the coal that fuels the world’s industry has already been exhausted from shallow layers, and mines must become bigger and deeper to satisfy demand.

    As mines expand, mining-related earthquakes become bigger and more frequent. Damage and fatalities, too, scale up. Hundreds of deaths have occurred in coal and mineral mines over the last few decades as a result of earthquakes up to magnitude 6.1 that have been induced.

    Other activities that might induce earthquakes include the erection of heavy superstructures. The 700,000-tonne Taipei 101 tower in Taiwan, completed in 2004, was blamed for the increasing frequency and size of nearby earthquakes.

    Since the early 20th century, it has been clear that filling large water reservoirs can induce potentially dangerous earthquakes. This came into tragic focus in 1967 when, just five years after the 32-mile-long Koyna reservoir in west India was filled, a magnitude 6.3 earthquake struck, killing at least 180 people and damaging the dam.

    Throughout the following decades, ongoing cyclic earthquake activity accompanied rises and falls in the annual reservoir-level cycle. An earthquake larger than magnitude 5 occurs there on average every four years. Our report found that, to date, some 170 reservoirs the world over have reportedly induced earthquake activity.

    Magnitude of human-induced earthquakes

    The magnitudes of the largest earthquakes postulated to be associated with projects of different types varies greatly. This graph shows the number of cases reported for projects of various types vs. maximum earthquake magnitude for the 577 cases for which data are available.

    *”Other” category includes carbon capture and storage, construction, groundwater extraction, nuclear explosion, research experiments, and unspecified oil, gas and waste water.

    Source: Earth-Science Reviews

    The production of oil and gas was implicated in several destructive earthquakes in the magnitude 6 range in California. This industry is becoming increasingly seismogenic as oil and gas fields become depleted. In such fields, in addition to mass removal by production, fluids are also injected to flush out the last of the hydrocarbons and to dispose of the large quantities of salt water that accompany production in expiring fields.

    A relatively new technology in oil and gas is shale-gas hydraulic fracturing, or fracking, which by its very nature generates small earthquakes as the rock fractures. Occasionally, this can lead to a larger-magnitude earthquake if the injected fluids leak into a fault that is already stressed by geological processes.

    The largest fracking-related earthquake that has so far been reported occurred in Canada, with a magnitude of 4.6. In Oklahoma, multiple processes are underway simultaneously, including oil and gas production, wastewater disposal and fracking. There, earthquakes as large as magnitude 5.7 have rattled skyscrapers that were erected long before such seismicity was expected. If such an earthquake is induced in Europe in the future, it could be felt in the capital cities of several nations.

    Our research shows that production of geothermal steam and water has been associated with earthquakes up to magnitude 6.6 in the Cerro Prieto Field, Mexico. Geothermal energy is not renewable by natural processes on the timescale of a human lifetime, so water must be reinjected underground to ensure a continuous supply. This process appears to be even more seismogenic than production. There are numerous examples of earthquake swarms accompanying water injection into boreholes, such as at The Geysers, California.

    What this means for the future

    Nowadays, earthquakes induced by large industrial projects no longer meet with surprise or even denial. On the contrary, when an event occurs, the tendency may be to look for an industrial project to blame. In 2008, an earthquake in the magnitude 8 range struck Ngawa Prefecture, China, killing about 90,000 people, devastating over 100 towns, and collapsing houses, roads and bridges. Attention quickly turned to the nearby Zipingpu Dam, whose reservoir had been filled just a few months previously, although the link between the earthquake and the reservoir has yet to be proven.

    The minimum amount of stress loading scientists think is needed to induce earthquakes is creeping steadily downward. The great Three Gorges Dam in China, which now impounds 10 cubic miles of water, has already been associated with earthquakes as large as magnitude 4.6 and is under careful surveillance.

    Scientists are now presented with some exciting challenges. Earthquakes can produce a “butterfly effect”: Small changes can have a large impact. Thus, not only can a plethora of human activities load Earth’s crust with stress, but just tiny additions can become the last straw that breaks the camel’s back, precipitating great earthquakes that release the accumulated stress loaded onto geological faults by centuries of geological processes. Whether or when that stress would have been released naturally in an earthquake is a challenging question.

    An earthquake in the magnitude 5 range releases as much energy as the atomic bomb dropped on Hiroshima in 1945. An earthquake in the magnitude 7 range releases as much energy as the largest nuclear weapon ever tested, the Tsar Bomba test conducted by the Soviet Union in 1961. The risk of inducing such earthquakes is extremely small, but the consequences if one were to happen are extremely large. This poses a health and safety issue that may be unique in industry for the maximum size of disaster that could, in theory, occur. However, rare and devastating earthquakes are a fact of life on our dynamic planet, regardless of whether or not there is human activity.
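The scaling behind these comparisons can be made explicit with the standard Gutenberg-Richter energy relation, log10(E) = 1.5M + 4.8 (E in joules). This formula is textbook seismology rather than something quoted in the article; it shows that each whole magnitude step multiplies radiated energy by about 32, so the jump from magnitude 5 to magnitude 7 is a factor of roughly 1,000:

```python
# Standard Gutenberg-Richter energy relation (not from the article):
# log10(E) = 1.5*M + 4.8, with E in joules.
def seismic_energy_joules(magnitude):
    return 10 ** (1.5 * magnitude + 4.8)

e5 = seismic_energy_joules(5.0)
e7 = seismic_energy_joules(7.0)
print(f"{e5:.1e} J")   # 2.0e+12 J
print(f"{e7:.1e} J")   # 2.0e+15 J
print(round(e7 / e5))  # 1000: each magnitude unit is ~31.6x the energy
```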

    Our work suggests that the only evidence-based way to limit the size of potential earthquakes may be to limit the scale of the projects themselves. In practice, this would mean smaller mines and reservoirs, fewer minerals and less oil and gas extracted from fields, shallower boreholes and smaller injected volumes. A balance must be struck between the growing need for energy and resources and the level of risk that is acceptable in every individual project.

    See the full article here .


    The Conversation US launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors work with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues. And hopefully allow for a better quality of public discourse and conversation.

     
  • richardmitnick 10:17 am on February 16, 2017 Permalink | Reply
    Tags: , , , , , , , SCOAP³, The Conversation   

    From The Conversation: “How the insights of the Large Hadron Collider are being made open to everyone” 

    Conversation
    The Conversation

    January 12, 2017 [Just appeared in social media.]
    Virginia Barbour

    CERN CMS Higgs Event

    If you visit the Large Hadron Collider (LHC) exhibition, now at the Queensland Museum, you’ll see the recreation of a moment when the scientist who saw the first results indicating discovery of the Higgs boson laments she can’t yet tell anyone.

    It’s a transitory problem for her, lasting as long as it takes for the result to be thoroughly cross-checked. But it illustrates a key concept in science: it’s not enough to do it; it must be communicated.

    That’s what is behind one of the lesser known initiatives of CERN (European Organization for Nuclear Research): an ambitious plan to make all its research in particle physics available to everyone, with a big global collaboration inspired by the way scientists came together to make discoveries at the LHC.

    This initiative is called SCOAP³, the Sponsoring Consortium for Open Access in Particle Physics Publishing, and is now about to enter its fourth year of operation. It’s a worldwide collaboration of more than 3,000 libraries (including six in Australia), key funding agencies and research centres in 44 countries, together with three intergovernmental organisations.

    It aims to make work previously only available to paying subscribers of academic journals freely and immediately available to everyone. In its first three years it has made more than 13,000 articles available.

    Not only are these articles free for anyone to read, but because they are published under a Creative Commons attribution license (CC BY), they are also available for anyone to use in any way they wish, such as to illustrate a talk, pass on to a class of school children, or feed to an artificial intelligence program to extract information from. And these usage rights are enshrined forever.

    The concept of sharing research is not new in physics. Open access to research is now a growing worldwide initiative, including in Australasia. CERN, which runs the LHC, is also where the World Wide Web was invented in 1989 by Tim Berners-Lee, a British computer scientist.

    The main purpose of the web was to enable researchers contributing to CERN from all over the world to share documents, including scientific drafts, no matter what computer systems they were using.

    Before the web, physicists had been sharing paper drafts by post for decades, so they were one of the first groups to really embrace the new online opportunities for sharing early research. Today, the pre-press site arxiv.org has more than a million free article drafts covering physics, mathematics, astronomy and more.

    But, with such a specialised field, do these “open access” papers really matter? The short answer is “yes”: downloads from journals participating in SCOAP³ have doubled.

    With millions of open access articles now being downloaded across all specialities, there is enormous opportunity for new ideas and collaborations to spring from chance readership. This is an important trend: the concept of serendipity enabled by open access was explored in 2015 in an episode of ABC RN’s Future Tense program.

    Greater than the sum of the parts

    There’s also a bigger picture to SCOAP³’s open access model. Not long ago, the research literature was fragmented. Individual papers and the connections between them were only as good as the physical library, with its paper journals, that academics had access to.

    Now we can do searches in much less time than we spend thinking of the search question, and the results we are presented with are crucially dependent on how easily available the findings themselves are. And availability is not just a function of whether an article is free or not but whether it is truly open, i.e. connected and reusable.

    One concept is whether research is “FAIR”, or Findable, Accessible, Interoperable and Reusable. In short, can anyone find, read, use and reuse the work?

    The principle is most advanced for data, but in Australia work is ongoing to apply it to all research outputs. This approach was also proposed at the November 2016 meeting of the G20 Science, Technology and Innovation Ministers Meeting. Research findings that are not FAIR can, effectively, be invisible. It’s a huge waste of millions of taxpayer dollars to fund research that won’t be seen.

    There is an even bigger picture that research and research publications have to fit into: that of science in society.

    Across the world we see politicians challenging accepted scientific norms. Is the fact that most academic research remains available only to those who can pay to see it contributing to an acceptance of such misinformed views?

    If one role for science is to inform public debate, then restricting access to that science will necessarily hinder any informed public debate. Although no one suggests that most readers of news sites will regularly want to delve into the details of papers in high energy physics, open access papers are 47% more likely to end up being cited in Wikipedia, which is a source that many non-scientists do turn to.

    Even worse, work that is not available openly now may not even be available in perpetuity, something that is being discussed by scientists in the USA.

    So in the same way that CERN itself is an example of the power of international collaboration to ask some of the fundamental scientific questions of our time, SCOAP³ provides a way to ensure that the answers, whatever they are, are available to everyone, forever.

    See the full article here .


     
  • richardmitnick 9:29 am on January 30, 2017 Permalink | Reply
    Tags: , , , , The Conversation   

    From The Conversation: “Giant atoms could help unveil ‘dark matter’ and other cosmic secrets” 

    Conversation
    The Conversation

    January 5, 2017
    Diego A. Quiñones

    Composite image showing the galaxy cluster 1E 0657-56. Chandra X-Ray Observatory/NASA

    The universe is an astonishingly secretive place. Mysterious substances known as dark matter and dark energy account for some 95% of it. Despite huge effort to find out what they are, we simply don’t know.

    We know dark matter exists because of the gravitational pull of galaxy clusters – the matter we can see in a cluster just isn’t enough to hold it together by gravity. So there must be some extra material there, made up of unknown particles that simply aren’t visible to us. Several candidate particles have already been proposed.

    Scientists are trying to work out what these unknown particles are by looking at how they affect the ordinary matter we see around us. But so far it has proven difficult, so we know it interacts only weakly with normal matter at best. Now my colleague Benjamin Varcoe and I have come up with a new way to probe dark matter that may just prove successful: by using atoms that have been stretched to be 4,000 times larger than usual.

    Advantageous atoms

    We have come a long way from the Greeks’ vision of atoms as the indivisible components of all matter. The first evidence-based argument for the existence of atoms was presented in the early 1800s by John Dalton. But it wasn’t until the beginning of the 20th century that JJ Thomson and Ernest Rutherford discovered that atoms consist of electrons and a nucleus. Soon after, Erwin Schrödinger described the atom mathematically using what is today called quantum theory.

    Modern experiments have been able to trap and manipulate individual atoms with outstanding precision. This knowledge has been used to create new technologies, like lasers and atomic clocks, and future computers may use single atoms as their primary components.

    Individual atoms are hard to study and control because they are very sensitive to external perturbations. This sensitivity is usually an inconvenience, but our study suggests that it makes some atoms ideal as probes for the detection of particles that don’t interact strongly with regular matter – such as dark matter.

    Our model is based on the fact that a weakly interacting particle must bounce off the nucleus of the atom it collides with, exchanging a small amount of energy with it – similar to the collision between two pool balls. The energy exchange will produce a sudden displacement of the nucleus that will eventually be felt by the electron. This means the entire energy of the atom changes, which can be analysed to obtain information about the properties of the colliding particle.

    However the amount of transferred energy is very small, so a special kind of atom is necessary to make the interaction relevant. We worked out that the so-called “Rydberg atom” would do the trick. These are atoms with long distances between the electron and the nucleus, meaning they possess high potential energy. Potential energy is a form of stored energy. For example, a ball on a high shelf has potential energy because this could be converted to kinetic energy if it falls off the shelf.

    In the lab, it is possible to trap atoms and prepare them in a Rydberg state – making them as big as 4,000 times their original size. This is done by illuminating the atoms with a laser with light at a very specific frequency.

    This prepared atom is likely much heavier than the dark matter particles. So rather than one pool ball striking another, a more appropriate description would be a marble hitting a bowling ball. It seems strange that big atoms are more perturbed by collisions than small ones – one might expect the opposite (smaller things are usually more affected when a collision occurs).

    The explanation is related to two features of Rydberg atoms. First, they are highly unstable because of their elevated energy, so minor perturbations disturb them more. Second, because of their large size, the probability of the atoms interacting with passing particles is increased, so they suffer more collisions.
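The pool-ball and marble analogies correspond to a textbook kinematics result (not derived in the article): in a head-on elastic collision, the fraction of the projectile's kinetic energy handed to a stationary target of mass M is 4mM/(m+M)². A minimal sketch:

```python
def energy_transfer_fraction(m, M):
    """Fraction of kinetic energy a projectile of mass m transfers to a
    stationary target of mass M in a head-on elastic collision."""
    return 4 * m * M / (m + M) ** 2

# Equal masses (two pool balls): all of the energy is transferred.
print(energy_transfer_fraction(1.0, 1.0))    # 1.0

# Light projectile on a heavy target (marble on a bowling ball): only a
# sliver of energy is exchanged -- which is why a huge, easily perturbed
# Rydberg atom helps make that sliver detectable at all.
print(energy_transfer_fraction(0.001, 1.0))  # ~0.004
```

The mass ratio here is purely illustrative; the point is the vanishing transfer fraction when the target vastly outweighs the projectile.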

    Spotting the tiniest of particles

    Current experiments typically look for dark matter particles by trying to detect their scattering off atomic nuclei or electrons on Earth. They do this by looking for light or free electrons in big tanks of liquid noble gases that are generated by energy transfer between the dark matter particle and the atoms of the liquid.

    The Large Underground Xenon experiment installed 4,850 ft underground inside a 70,000-gallon water tank shield. Gigaparsec at English Wikipedia, CC BY-SA

    But, according to the laws of quantum mechanics, there needs to be a certain minimum energy transfer for the light to be produced. An analogy would be a particle colliding with a guitar string: it will produce a note that we can hear, but if the particle is too small the string will not vibrate at all.

    So the problem with these methods is that the dark matter particle has to be big enough if we are to detect it this way. However, our calculations show that Rydberg atoms will be disturbed in a significant way even by low-mass particles – meaning they can be used to search for dark matter candidates that other experiments miss. One such candidate is the axion, a hypothetical particle that is a strong contender for dark matter.

    Experiments would require the atoms to be treated with extreme care, but they would not need to be conducted in a deep underground facility like other experiments, as Rydberg atoms are expected to be less susceptible to cosmic rays than they are to dark matter.

    We are working to further improve the sensitivity of the system, aiming to extend the range of particles that it may be able to perceive.

    Beyond dark matter, we are also aiming one day to apply it to the detection of gravitational waves, the ripples in the fabric of space predicted by Einstein a century ago. These perturbations of the space-time continuum have recently been detected, but we believe that by using atoms we may be able to detect gravitational waves at frequencies different from the ones already observed.

    See the full article here .


     
    • gregoriobaquero 9:46 am on January 30, 2017 Permalink | Reply

      Precisely to the point of my paper. If I am right, nothing is going to be found. No new particles. The density of neutrinos (“hot dark matter”) we can measure in our frame of reference does not tell the whole picture, since we have the same local time as the neutrinos passing by. What had not been taken into account is that gravitational time dilation is accumulating neutrinos when compared to neutrinos passing far away from the galaxy.

      Like

    • gregoriobaquero 9:48 am on January 30, 2017 Permalink | Reply

      Also, this phenomenon is similar to how relativity explains electromagnetism. Veritasium has a good video about it.


      Like


    • richardmitnick 10:19 am on January 30, 2017 Permalink | Reply

      Thank you so much for coming on to comment. I appreciate it very much.

      Like

  • richardmitnick 12:06 pm on January 16, 2017 Permalink | Reply
    Tags: ASKAP finally hits the big-data highway, , , , , , , The Conversation, WALLABY - Widefield ASKAP L-band Legacy All-sky Blind surveY   

    From The Conversation for SKA: “The Australian Square Kilometre Array Pathfinder finally hits the big-data highway” 

    Conversation
    The Conversation

    SKA/ASKAP radio telescope at the Murchison Radio-astronomy Observatory (MRO) in the Mid West region of Western Australia

    January 15, 2017
    Douglas Bock
    Director of Astronomy and Space Science, CSIRO

    Antony Schinckel
    ASKAP Director, CSIRO

    You know how long it takes to pack the car to go on holidays. But there’s a moment when you’re all in, everyone has their seatbelt on, you pull out of the drive and you’re off.

    Our ASKAP (Australian Square Kilometre Array Pathfinder) telescope has just pulled out of the drive, so to speak, at its base in Western Australia at the Murchison Radio-astronomy Observatory (MRO), about 315km northeast of Geraldton.

    ASKAP is made of 36 identical 12-metre-wide dish antennas that all work together, 12 of which are currently in operation. Thirty ASKAP antennas have now been fitted with specialised phased array feeds; the rest will be installed later in 2017.

    Until now, we’d been taking data mainly to test how ASKAP performs. Having shown the telescope’s technical excellence it’s now off on its big trip, starting to make observations for the big science projects it’ll be doing for the next five years.

    And it’s taking lots of data. Its antennas are now churning out 5.2 terabytes of data per second (about 15 per cent of the internet’s current data rate).

    Once out of the telescope, the data is going through a new, almost automatic data-processing system we’ve developed.

    It’s like a bread-making machine: put in the data, make some choices, press the button and leave it overnight. In the morning you have a nice batch of freshly made images from the telescope.

    Go the WALLABIES

    The first project we’ve been taking data for is one of ASKAP’s largest surveys, WALLABY (Widefield ASKAP L-band Legacy All-sky Blind surveY).

    On board the survey are a happy band of 100-plus scientists – affectionately known as the WALLABIES – from many countries, led by one of our astronomers, Bärbel Koribalski, and Lister Staveley-Smith of the International Centre for Radio Astronomy Research (ICRAR), University of Western Australia.

    They’re aiming to detect and measure neutral hydrogen gas in galaxies over three-quarters of the sky. To see the farthest of these galaxies they’ll be looking three billion years back into the universe’s past, with a redshift of 0.26.
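The two quoted figures are consistent: a rough lookback-time integral for a flat Lambda-CDM cosmology returns about three billion years at redshift 0.26. The sketch below assumes round parameter values (H0 = 70 km/s/Mpc, Ωm = 0.3) that are conventional choices, not numbers given in the article:

```python
# Lookback time at z = 0.26, flat Lambda-CDM with assumed round
# parameters H0 = 70 km/s/Mpc (h = 0.7) and Omega_m = 0.3.
H0_INV_GYR = 9.78 / 0.7  # Hubble time 1/H0 in Gyr for h = 0.7

def E(z, omega_m=0.3):
    """Dimensionless Hubble parameter H(z)/H0 for a flat universe."""
    return (omega_m * (1 + z) ** 3 + (1 - omega_m)) ** 0.5

def lookback_time_gyr(z, steps=10_000):
    """Midpoint-rule integral of dz' / ((1 + z') * E(z')) times 1/H0."""
    dz = z / steps
    total = sum(dz / ((1 + (i + 0.5) * dz) * E((i + 0.5) * dz))
                for i in range(steps))
    return H0_INV_GYR * total

print(f"{lookback_time_gyr(0.26):.1f} Gyr")  # ~3.0 Gyr
```

So light from the farthest WALLABY galaxies has indeed been travelling for roughly three billion years.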

    Neutral hydrogen gas in one of the galaxies, IC 5201 in the southern constellation of Grus (The Crane), imaged in early observations for the WALLABY project. Matthew Whiting, Karen Lee-Waddell and Bärbel Koribalski (all CSIRO); WALLABY team, Author provided

    Neutral hydrogen – just lonely individual hydrogen atoms floating around – is the basic form of matter in the universe. Galaxies are made up of stars but also dark matter, dust and gas – mostly hydrogen. Some of the hydrogen turns into stars.

    Although the universe has been busy making stars for most of its 13.7-billion-year life, there’s still a fair bit of neutral hydrogen around. In the nearby (low-redshift) universe, most of it hangs out in galaxies. So mapping the neutral hydrogen is a useful way to map the galaxies, which isn’t always easy to do with just starlight.

    But as well as mapping where the galaxies are, we want to know how they live their lives, get on with their neighbours, grow and change over time.

    When galaxies live together in big groups and clusters they steal gas from each other, through processes called accretion and stripping. Seeing how the hydrogen gas is disturbed or missing tells us what the galaxies have been up to.

    We can also use the hydrogen signal to work out a lot of a galaxy’s individual characteristics, such as its distance, how much gas it contains, its total mass, and how much dark matter it contains.

    This information is often used in combination with characteristics we learn from studying the light of the galaxy’s stars.

    Oh what big eyes you have, ASKAP

    ASKAP sees large pieces of sky with a field of view of 30 square degrees. The WALLABY team will observe 1,200 of these fields. Each field contains about 500 galaxies detectable in neutral hydrogen, giving a total of 600,000 galaxies.

    One of the first fields targeted by WALLABY, the NGC 7232 galaxy group. Ian Heywood (CSIRO); WALLABY team, Author provided

    This image (above) of the NGC 7232 galaxy group was made with just two nights’ worth of data.

    ASKAP has now made 150 hours of observations of this field, which has been found to contain 2,300 radio sources (the white dots), almost all of them galaxies.

    It has also observed a second field, one containing the Fornax cluster of galaxies, and started on two more fields over the Christmas and New Year period.

    Even more will be dug up by targeted searches. Simply detecting all the WALLABY galaxies will take more than two years, and interpreting the data even longer. ASKAP’s data will live in a huge archive that astronomers will sift through over many years with the help of supercomputers at the Pawsey Centre in Perth, Western Australia.

    ASKAP has nine other big survey projects planned, so this is just the beginning of the journey. It’s really a very exciting time for ASKAP and the more than 350 international scientists who’ll be working with it.

    Who knows where this Big Trip will take them, and what they’ll find along the way?

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The Conversation US launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered directly to the public.
    Our team of professional editors works with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high-quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues, and hopefully to allow for a better quality of public discourse and conversation.

     
  • richardmitnick 11:43 am on August 4, 2016 Permalink | Reply
    Tags: The Conversation

    From The Conversation: “Expanding citizen science models to enhance open innovation” 

    The Conversation

    August 3, 2016
    Kendra L. Smith

    Over the years, citizen scientists have provided vital data and contributed in invaluable ways to various scientific quests. But they’re typically relegated to helping traditional scientists complete tasks the pros don’t have the time or resources to deal with on their own. Citizens are asked to count wildlife, for instance, or classify photos that are of interest to the lead researchers.

    This type of top-down engagement has consigned citizen science to the fringes, where it fills a manpower gap but not much more. As a result, its full value has not been realized. Marginalizing the citizen scientists and their potential contribution is a grave mistake – it limits how far we can go in science and the speed and scope of discovery.

    Instead, by harnessing globalization’s increased interconnectivity, citizen science should become an integral part of open innovation. Science agendas can be set by citizens, data can be open, and open-source software and hardware can be shared to assist in the scientific process. And as the model proves itself, it can be expanded even further, into nonscience realms.

    Since 1900 the Audubon Society has sponsored its annual Christmas Bird Count, which relies on amateur volunteers nationwide. USFWS Mountain-Prairie, CC BY

    Some major citizen science successes

    Citizen-powered science has been around for over 100 years, utilizing the collective brainpower of regular, everyday people to collect, observe, input, identify and crossmatch data that contribute to and expand scientific discovery. And there have been some marked successes.

    eBird allows scores of citizen scientists to record bird abundance via field observation; those data have contributed to over 90 peer-reviewed research articles. Did You Feel It? crowdsources information from people around the world who have experienced an earthquake. Snapshot Serengeti uses volunteers to identify, classify and catalog photos taken daily in this African ecosystem.

    FoldIt is an online game where players are tasked with using the tools provided to virtually fold protein structures. The goal is to help scientists figure out if these structures can be used in medical applications. A set of users determined the crystal structure of an enzyme involved in the monkey version of AIDS in just three weeks – a problem that had previously gone unsolved for 15 years.

    Galaxy Zoo is perhaps the most well-known online citizen science project. It uploads images from the Sloan Digital Sky Survey [SDSS] and allows users to assist with the morphological classification of galaxies. The citizen astronomers discovered an entirely new class of galaxy – “green pea” galaxies – that have gone on to be the subject of over 20 academic articles.

    SDSS Telescope at Apache Point, NM, USA

    These are all notable successes, with citizens contributing to the projects set out by professional scientists. But there’s so much more potential in the model. What does the next generation of citizen science look like?

    People can contribute to crowdsourced projects from just about anywhere. Nazareth College, CC BY

    Open innovation could advance citizen science

    The time is right for citizen science to join forces with open innovation. This is a concept that describes partnering with other people and sharing ideas to come up with something new. The assumption is that more can be achieved when boundaries are lowered and resources – including ideas, data, designs and software and hardware – are opened and made freely available.

    Open innovation is collaborative, distributed, cumulative and it develops over time. Citizen science can be a critical element here because its professional-amateurs can become another significant source of data, standards and best practices that could further the work of scientific and lay communities.

    Globalization has spurred on this trend through the ubiquity of internet and wireless connections, affordable devices to collect data (such as cameras, smartphones, smart sensors, wearable technologies), and the ability to easily connect with others. Increased access to people, information and ideas points the way to unlock new synergies, new relationships and new forms of collaboration that transcend boundaries. And individuals can focus their attention and spend their time on anything they want.

    We are seeing this emerge in what has been termed the “solution economy” – where citizens find fixes to challenges that are traditionally managed by government.

    Consider the issue of accessibility. The 1990 Americans with Disabilities Act aimed to improve accessibility in the U.S. But more than two decades later, individuals with disabilities still face substantial mobility issues in public spaces – due to street conditions, cracked or nonexistent sidewalks, missing curb cuts, obstructions, or only portions of a building being accessible. All of these can create physical and emotional challenges for the disabled.

    To help deal with this issue, several individual solution seekers have merged citizen science, open innovation and open sourcing to create mobile and web applications that provide information about navigating city streets. For instance, Jason DaSilva, a filmmaker with multiple sclerosis, developed AXS Map – a free online and mobile app powered by Google Places API. It crowdsources information from people across the country about wheelchair accessibility in cities nationwide.

    Broadening the model

    There’s no reason the diffuse resources and open process of the citizen scientist model need be applied only to science questions.

    For instance, Science Gossip is a Zooniverse citizen science project. It’s rooted in Victorian-era natural history – the period considered to be the dawn of modern science – but it crosses disciplinary boundaries. At the time, scientific information was produced everywhere and recorded in letters, books, newspapers and periodicals (it was also the beginning of mass printing). Science Gossip allows citizen scientists to pore through pages of Victorian natural history periodicals. The site prompts them with questions meant to ensure continuity with other user entries.

    The final product is digitized data based on the 140,000 pages of 19th-century periodicals. Anyone can access it through the Biodiversity Heritage Library easily and for free. This work has obvious benefits for natural history researchers, but it can also be used by art enthusiasts, ethnographers, biographers, historians, rhetoricians, authors of historical fiction, or filmmakers of period pieces who seek to create accurate settings. The collection possesses value that goes beyond scientific data and becomes critical to understanding the period in which the data was collected.

    It’s also possible to imagine flipping the citizen science script, with the citizens themselves calling the shots about what they want to see investigated. Implementing this version of citizen science in disenfranchised communities could be a means of access and empowerment. Imagine Flint, Michigan residents directing expert researchers on studies of their drinking water.

    Or consider the aim of many localities to become so-called smart cities – connected cities that integrate information and communication technologies to improve the quality of life for residents as well as manage the city’s assets. Citizen science could have a direct impact on community engagement and urban planning via data consumption and analysis, feedback loops and project testing. Or residents can even collect data on topics important to local government. With technology and open innovation, much of this is practical and possible.

    What stands in the way?

    Perhaps the most pressing limitation on scaling up the citizen science model is reliability. While many of these projects have proven reliable, others have fallen short.

    For instance, crowdsourced damage assessments from satellite images following 2013’s Typhoon Haiyan in the Philippines fell short: according to aid agencies, remote damage assessments by citizen scientists had a devastatingly low accuracy of 36 percent, and overrepresented “destroyed” structures by 134 percent.

    Crowds can’t reliably rate typhoon damage like this without adequate training. Bronze Yu, CC BY-NC-ND

    Reliability problems often stem from a lack of training, coordination and standardization in platforms and data collection. In the case of Typhoon Haiyan, it turned out the satellite imagery did not provide enough detail or high enough resolution for contributors to accurately classify buildings. Further, volunteers weren’t given proper guidance on making accurate assessments, and there were no standardized validation procedures for reviewing contributor data.
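    The missing validation step is, at its simplest, an agreement rule: only accept a classification when several independent volunteers converge on the same label. A minimal sketch of that idea follows; the labels, vote counts and two-thirds threshold are invented for illustration, not drawn from the Haiyan project:

    ```python
    # Sketch of one standardization step for volunteer classifications:
    # accept a label only when enough independent volunteers agree.
    # The labels and the 2/3 agreement threshold are illustrative choices.
    from collections import Counter

    def consensus(labels, min_votes=3, min_agreement=2 / 3):
        """Return (label, agreement) if volunteers agree strongly enough, else None."""
        if len(labels) < min_votes:
            return None                  # too few classifications to decide
        label, count = Counter(labels).most_common(1)[0]
        agreement = count / len(labels)
        return (label, agreement) if agreement >= min_agreement else None

    # Five volunteers classify the same building in a satellite image.
    votes = ["destroyed", "damaged", "destroyed", "destroyed", "destroyed"]
    print(consensus(votes))                               # ('destroyed', 0.8)
    print(consensus(["destroyed", "damaged", "intact"]))  # None - no consensus
    ```

    Disagreements that fall below the threshold aren’t discarded in a real pipeline – they are typically routed to an expert reviewer or to additional volunteers, which is exactly the kind of standardized review procedure the Haiyan effort lacked.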

    Another challenge for open source innovation is organizing and standardizing data in a way that would be useful to others. Understandably, we collect data to fit our own needs – there isn’t anything wrong with that. However, those in charge of databases need to commit to data collection and curation standards so anyone may use the data with complete understanding of why, by whom and when they were collected.

    Finally, deciding to open data – making it freely available for anyone to use and republish – is critical. There’s been a strong, popular push for government to open data of late, but it isn’t done widely or well enough to have widespread impact. Further, the opening of nonproprietary data from nongovernment entities – nonprofits, universities, businesses – is lacking. If they are in a position to, organizations and individuals should seek to open their data to spur innovation ecosystems in the future.

    Citizen science has proven itself in some fields and has the potential to expand to others as organizers leverage the effects of globalization to enhance innovation. To do so, we must keep an eye on citizen science reliability, open data whenever possible, and constantly seek to expand the model to new disciplines and communities.

    See the full article here.


     
  • richardmitnick 9:30 pm on July 17, 2016 Permalink | Reply
    Tags: The Conversation

    From The Conversation via ANU: “How to keep more women in science, technology, engineering and mathematics (STEM)” 

    The Conversation

    Australian National University

    July 12, 2016

    http://www.masterstudies.com/article/Why-Science-is-%28also%29-for-Women/

    Over the years, the major political parties have made myriad promises to fund programs aimed at increasing the number of women pursuing careers in science, technology, engineering and mathematics (STEM).

    Although some of the policies do target disciplines where women are underrepresented, there seems to be very little acknowledgement of the bigger problem.

    Attracting women to STEM careers is one issue; retaining them is another. And that does not seem to get the same level of attention.

    Simply trying to get more women into STEM without addressing broader systemic issues will achieve nothing except more loss through a leaky pipeline.

    Higher Education Research Data from 2014 shows more females than males were being awarded undergraduate degrees in STEM fields. Among early career researchers, classified as level A and B academics, the genders are equally represented.

    Gender disparity in STEM fields at the higher academic levels (C-E) based on Higher Education Research Data, 2014. Science in Australia Gender Equity (SAGE)

    At senior levels, though, the gender disparity plainly manifests – males comprise almost 80% of the most senior positions.

    A biological and financial conundrum

    Studies in the United States have found that women who have children within five to ten years of completing their PhD are less likely to hold tenured or tenure-track positions, and are more likely to earn less than their male or childless female colleagues.

    Angela (name changed) is a single parent and a PhD student in the sciences. She told me she is determined to forge a career for herself in academia, despite the bureaucratic and financial hurdles she has to overcome.

    ” Finding ways to get enough money to afford childcare […] jumping through bureaucratic hoops […] It was ridiculous and at times I wondered if it was all worth it.

    It may be just one reason for women leaving STEM, especially those with children, and doubly so for single-parent women.”

    Women tend to be the primary caregivers for children, and are more likely to work part time, so perhaps this could explain the financial disparity. But according to the latest report from the Office of the Chief Scientist on Australia’s STEM workforce, men who also work part time consistently earn more, irrespective of their level of qualification.

    Percentage of doctorate-level STEM graduates working part time who earned more than $104,000 annually, by age group and gender. Australia’s STEM Workforce March 2016 report from the Office of the Australian Chief Scientist, CC BY-NC-SA

    The same report also shows that women who do not have children tend to earn more than women who do, but both groups still earn less than men.

    Perhaps children do play a part in earning capacity, but the pay disparities or part-time employment do not seem to fully explain why women leave STEM.

    Visible role models

    The absence of senior females in STEM removes a source of visible role models for existing and aspiring women scientists. This is a problem for attracting and retaining female scientists.

    Having female role models in STEM helps younger women envision STEM careers as potential pathways they can take, and mentors can provide vital support.

    Yet even with mentoring, women in STEM still have higher attrition rates than their male colleagues.

    So what else can we do?

    There are many programs and initiatives that are already in place to attract and support women in STEM, including the Science in Australia Gender Equity (SAGE) pilot, based on the United Kingdom’s Athena SWAN charter.

    But women’s voices are still absent from leadership tables to our detriment.

    Homeward Bound

    This absence is especially noticeable in STEM and policymaking arenas, and was the impetus for Australian leadership expert Fabian Dattner, in collaboration with Dr Jess Melbourne-Thomas from the Australian Antarctic Division, to create Homeward Bound.

    Dattner says she believes the absence of women from leadership “possibly, if not probably, places us at greatest peril”.

    To address this, Homeward Bound is aimed at developing the leadership, strategic and scientific capabilities of female scientists to enhance their impact in influencing policy and decisions affecting the sustainability of the planet.

    Initially, it will involve 77 women scientists from around the world. But this is only the first year of the program, and it heralds the beginning of a global collaboration of 1,000 women over ten years.

    These women are investing heavily – financially, emotionally and professionally – and it is clearly not an option for everyone.

    Flexible approaches

    There are other simple ways to support women in STEM, which anyone can do.

    Simply introducing genuinely flexible work arrangements could do a lot to alleviate the pressure, as Angela’s experience shows:

    ” My supervisor made sure that we never had meetings outside of childcare hours […] or I could Skype her from home once my child was in bed. They really went above and beyond to make sure that I was not disadvantaged.”

    We have already attracted some of the best and brightest female minds to STEM.

    If keeping them there means providing support, publicly celebrating high-achieving women, and being flexible in how meetings are held, surely that’s an investment we can all make.

    See the full article here.


     