Tagged: Science Node

  • richardmitnick 10:41 am on March 14, 2017 Permalink | Reply
    Tags: dark web, Science Node, Wrangling crime in the deep

    From Science Node: “Wrangling crime in the deep, dark web” 


    06 Mar, 2017
    Jorge Salazar

    Much of the internet hides like an iceberg below the surface.

    This so-called ‘deep web’ is estimated to be 500 times bigger than the ‘surface web’ seen through search engines like Google. For scientists and others, the deep web holds important computer code and licensing agreements.

    Nestled further inside the deep web, one finds the ‘dark web,’ a place where images and video are used by traders in illicit drugs, weapons, and human lives.

    “Behind forms and logins, there are bad things,” says Chris Mattmann, chief architect in the instrument and science data systems section of the NASA Jet Propulsion Laboratory (JPL) at the California Institute of Technology.

    “Behind the dynamic portions of the web, people are doing nefarious things, and on the dark web, they’re doing even more nefarious things. They traffic in guns and human organs. They’re doing these activities and then they’re tying them back to terrorism.”

    In 2014, the Defense Advanced Research Projects Agency (DARPA) started a program called Memex to make the deep web accessible. “The goal of Memex was to provide search engines the retrieval capacity to deal with those situations and to help defense and law enforcement go after the bad guys on the deep web,” Mattmann says.

    At the same time, the US National Science Foundation (NSF) invested $11.2 million in a first-of-its-kind data-intensive supercomputer – the Wrangler supercomputer, now housed at the Texas Advanced Computing Center (TACC). The NSF asked engineers and computer scientists at TACC, Indiana University, and the University of Chicago if a computer could be built to handle massive amounts of input and output.


    TACC Wrangler

    Wrangler does just that, enabling the speedy file transfers needed to fly past big data bottlenecks that can slow down even the fastest computers. It was built to work in tandem with number crunchers such as TACC’s Stampede, which in 2013 was the sixth fastest computer in the world.


    Dell PowerEdge Stampede supercomputer at the Texas Advanced Computing Center, University of Texas at Austin (9.6 petaflops).

    “Although we have a lot of search-based queries through different search engines like Google, it’s still a challenge to query the system in a way that answers your questions directly,” says Karanjeet Singh.

    Singh is a University of Southern California graduate student who works with Chris Mattmann on Memex and other projects.

    “The objective is to get more and more domain-specific information from the internet and to associate facts from that information.”

    Once the Memex user extracts the information they need, they can apply tools such as named entity recognition, sentiment analysis, and topic summarization. These can help law enforcement agencies find links between different activities, such as illegal weapon sales and human trafficking.
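    As a rough illustration of the kind of entity extraction and linking described above (not the actual Memex toolchain), the sketch below uses the open source spaCy library to pull named entities out of scraped postings and flag entities shared across documents. The sample postings and the linking heuristic are hypothetical.

    ```python
    # Minimal sketch of entity extraction and cross-document linking,
    # loosely in the spirit of the Memex tools described above.
    # Requires: pip install spacy; python -m spacy download en_core_web_sm
    import spacy
    from collections import defaultdict

    nlp = spacy.load("en_core_web_sm")

    # Hypothetical scraped postings (stand-ins for deep-web page text).
    postings = {
        "ad-001": "Contact Alex Smith in Houston for rare hardware.",
        "ad-002": "Alex Smith ships worldwide from Houston, no questions asked.",
    }

    # Map each named entity to the postings that mention it.
    mentions = defaultdict(set)
    for doc_id, text in postings.items():
        for ent in nlp(text).ents:
            mentions[(ent.text, ent.label_)].add(doc_id)

    # Entities shared by multiple postings suggest a link worth reviewing.
    for (name, label), ids in mentions.items():
        if len(ids) > 1:
            print(f"{name} ({label}) links postings: {sorted(ids)}")
    ```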

    The problem is that even the fastest computers like Stampede weren’t designed to handle the input and output of the millions of files needed for the Memex project.

    “Let’s say that we have one system directly in front of us, and there is some crime going on,” Singh says. “What the JPL is trying to do is automate a lot of domain-specific query processes into a system where you can just feed in the questions and receive the answers.”

    For that, he works with an open source web crawler called Apache Nutch. It retrieves and collects web page and domain information from the deep web. The MapReduce framework powers those crawls with a divide-and-conquer approach that breaks big data into small pieces processed simultaneously.
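    The MapReduce pattern itself is easy to sketch. The toy example below (plain Python, not Nutch's actual Hadoop jobs) counts crawled pages per domain by mapping each URL to a (domain, 1) pair, grouping by key, and reducing each group with a sum: the same divide-and-conquer shape, minus the distribution across machines.

    ```python
    # Toy MapReduce: count crawled pages per domain.
    # Illustrative only; Nutch runs this pattern at scale on Hadoop.
    from itertools import groupby
    from urllib.parse import urlparse

    urls = [
        "http://example.org/a",
        "http://example.org/b",
        "http://foo.net/x",
    ]

    # Map phase: emit one (key, value) pair per input record.
    pairs = [(urlparse(u).netloc, 1) for u in urls]

    # Shuffle phase: bring pairs with the same key together.
    pairs.sort(key=lambda kv: kv[0])

    # Reduce phase: combine the values for each key.
    counts = {domain: sum(v for _, v in group)
              for domain, group in groupby(pairs, key=lambda kv: kv[0])}

    print(counts)  # {'example.org': 2, 'foo.net': 1}
    ```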


    Wrangler avoids data overload by virtue of its 600 terabytes of speedy flash storage. What’s more, Wrangler supports the Hadoop framework, which runs using MapReduce.

    Together, Wrangler and Memex constitute a powerful crime-fighting duo. NSF investment in advanced computation has placed powerful tools in the hands of public defense agencies, moving law enforcement beyond the limitations of commercial search engines.

    “Wrangler is a fantastic tool that we didn’t have before as a mechanism to do research,” says Mattmann. “It has been an amazing resource that has allowed us to develop techniques that are helping save people, stop crime, and stop terrorism around the world.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 3:52 pm on February 12, 2017 Permalink | Reply
    Tags: Gender discrimination is not just a women's issue, Science Node

    From Science Node: Women in STEM – “Continue the conversation” Very Important 


    07 Feb, 2017
    Helen Patton


    According to recent research conducted by the US National Science Foundation (NSF), although women comprise more than half of the U.S. workforce, only 28% are employed in STEM-related fields, and of those, only 11% are pursuing a career in information security.

    Similarly, when looking at workplace diversity, minorities represented 29% of the STEM-related workforce, with approximately 6% Hispanic and 8% African-American representation in the IT sector.

    Leading ladies. Participants in a gender and diversity panel at the Internet2 Technology Exchange. From left, Theresa Semmens, Helen Patton, Mary Dunker, and Kimberly Milford. Courtesy Internet2.

    When asked why women chose to leave the profession or why they might not consider a career in information security and IT, often the answers are as complex as the problem.

    Some cite stereotyping, organizational culture, and the lack of encouragement and support from management and fellow colleagues. Others cite the lack of guidance from management and uncertainty about their career trajectory.

    Moving forward, my co-panelists and I offer the following guiding principles to anyone interested in supporting gender and diversity initiatives.

    Engaging everyone in the dialogue

    Gender discrimination is not just a women’s issue – it’s a men’s issue, too. Similarly, making concerted efforts to challenge the lack of diversity in the workplace should be everyone’s concern. It’s important to include both men and women in the conversation and work collectively at solving the gender discrimination and diversity problems in the workplace.

    Building a community of allies

    Many of our male colleagues have expressed their desire to put an end to gender discrimination and make real change to improve diversity. There need to be tools and resources, such as this one about male allies, that help our colleagues become allies for women both at work and at home.

    Sharing success stories

    It’s important to move beyond simply presenting the data on gender discrimination in the workplace. In addition to making tools accessible, we must highlight possible solutions and share success stories alongside the data. A good reference for this is the National Center for Women and Information Technology (NCWIT).


    Target practices. Building on insights from behavioral economics, Iris Bohnet argues that to overcome gender bias in organizations and society, we should focus on de-biasing systems — how we evaluate performance, hire, promote, structure tests, form groups — rather than on trying to de-bias people.

    Another great resource is Iris Bohnet’s book What Works: Gender Equality by Design, which suggests ways we can recruit, hire, develop, and promote gender-diverse talent.

    Inclusive language

    We want to be conscious of how we present the profession through the use of language. We want to avoid using terms and descriptions that may come across as biased, either consciously or unconsciously. We want to ensure the terms and language we use are gender-inclusive.

    Commitment to mentorship

    A coach, mentor, or advocate will instill in the person seeking help and advice the idea that they can make a difference and are valued for their contributions. Mentorship forms a support system that enhances a positive experience of growth and development for an individual’s career.

    Research suggests that the most beneficial mentoring is based on mutual learning, active engagement, and striving to push the leadership capabilities of mentees.

    Championing diversity

    We need to ensure that everyone who has an interest and desire to break into information security has the opportunity, comfort level, and confidence to do so.

    Diversity in the workplace contributes to an institution’s creativity and adds new perspectives to professional conversations. It creates a well-rounded team and allows for more efficiencies, diverse ideas, varied technical skill sets, broader communication forums, and business management skill sets.

    Women and minorities need champions: those who advocate, support, and recognize their efforts and contributions.

    See the full article here.


     
  • richardmitnick 1:35 pm on February 11, 2017 Permalink | Reply
    Tags: Francine Berman, IoT Bill of Rights, Science Node, Toward an ethics of the Internet of Things   

    From Science Node: “Toward an ethics of the Internet of Things” 


    02 Feb, 2017
    Lance Farrell

    Courtesy Esther Emmely Gons. (CC BY 2.0)

    Francine Berman

    The Internet of Things (IoT): let’s define some terms here.

    Internet: Well that’s simple enough — that electronic communications network through which we all gather our mail, news, and cat videos.

    Then what’s this ‘things’ part?

    Examples might help here: Refrigerators, phones, copy machines, baby monitors, automobiles, microwaves, street lights – really, any computerized object that can link to the internet.

    In this brave new world of networked devices, how do we maintain individual rights and manage the IoT safely and ethically?

    That’s the question we put to Francine Berman, the Edward P. Hamilton distinguished professor in computer science at Rensselaer Polytechnic Institute.

    What’s the promise and peril inherent in IoT?

    The IoT has tremendous potential to enhance society, work, and life. Smart, networked systems can make us safer in our homes and vehicles, increase our efficiency at work and at play, empower us through information, and create new opportunities. But technologies have neither social values nor ethics.

    The same systems that can be used to enhance our lives can also be used to facilitate misbehavior. Denial-of-service attacks or hackers can put smart infrastructure systems — and the people they serve — in danger. Remote surveillance systems that allow parents to check on their infants, or adult children to check on aging parents have also been used to spy on unsuspecting individuals and scream at babies.

    The potential of the IoT will be achieved when we have a common sense of appropriate behavior, social mechanisms to enforce responsibility and accountability, and when we enable technical architectures that incorporate safety, security, and protections. For best results, we need to develop all of these in coordination and not just after technologies have matured.

    Many people assume the rights and protections we enjoy in democratic society are applicable to the IoT realm. Is this not the case?

    Whether we’re dealing with rights and protections in existing scenarios or new ones, the IoT will be a brave new world. We will need to conceptualize, extend, or re-establish a working notion of individual rights and the public good.

    Our mileage will vary: In some cases, rights and protections from other spheres of life will be extensible to the IoT, although the IoT and digital technologies will vastly impact the interpretation and potential consequences of existing policy and law.

    For example, we have seen policy makers and lawmakers struggle to extend copyright law into digital media such as YouTube and apply health information privacy laws to smartphone health apps.

    These scenarios provide vastly different environments from the ones originally covered by law and policy, and both law and policy will need to evolve to adequately promote responsible behavior in IoT environments.

    The IoT will also necessitate new rights and protections. For example, in environments with embedded surveillance, do you have a right to opt out? It may be that in many instances you don’t. What are your rights and what is society’s responsibility toward you in these environments?

    An IoT ‘Bill of Rights’ sounds like a good idea — to what extent will or won’t it work?

    An IoT Bill of Rights provides an important framework for thinking about the impact of IoT, but will only be as good as its scope, interpretation, and enforcement.

    For example, a ‘right to privacy’ that gives individuals control of the data they generate and the metadata collected about them could ensure control over a digital persona.

    However, the technical infrastructure that implements this right may be challenging to engineer.

    _______________________________________________________________
    “The potential of the IoT will be achieved when we have a common sense of appropriate behavior, social mechanisms to enforce responsibility and accountability, and when we enable technical architectures that incorporate safety, security, and protections.” ~Francine Berman
    _______________________________________________________________

    Will individuals want to, or be able to, sift through all records for all online IoT services and devices they use (smartphone, refrigerator, car, shopping site, browser, etc.) to pick and choose which information is private and what can be shared?

    Will a strong individual right to privacy also make public data in the IoT less valuable? For example, if half of the residents in an area choose to keep the location and images of their homes private, the Google map of that area may cease to be useful.

    How we determine what information is private and under what circumstances, who can control it, who can access it, who can enforce it, and what happens when things go wrong will be critical for a ‘right to privacy’ to be meaningful.

    So how to safely and ethically deploy IoT?


    I don’t think that we should set up a governance system for the IoT without substantive discussion and experience with IoT technologies.

    So now is exactly the right time for thought leadership and exploration. We need to be developing the intellectual groundwork for IoT governance, policy, and laws to understand how to prioritize and promote the public good within the IoT.

    We need to understand intended and unintended consequences for a broad spectrum of potential IoT policy, regulation, and frameworks.

    We need to work now to understand the specifics of how the IoT will impact current social mechanisms or necessitate new ones to create an environment in which the IoT can achieve its highest potential.

    We also need to pilot and experiment with various IoT policies now to gain experience with how various approaches will work. Smart systems, cities, and workplaces can collect information on the success and challenges of various policies, system coordination and security mechanisms, and approaches to data rights, privacy, stewardship and use.

    This is already happening in opportunistic areas like transportation (Who is responsible when a self-driving car has an accident? What information should be private within a vehicle-net?) but is needed for the broader spectrum of IoT scenarios, systems, and devices.

    See the full article here.


     
  • richardmitnick 5:38 pm on February 8, 2017 Permalink | Reply
    Tags: E-wave propagation index (EPI), Find a better way of predicting blood clots in the heart, Science Node

    From Science Node: “Protecting the heart with supercomputers” 


    30 Jan, 2017 [Where has this been?]
    Aaron Dubrow

    Simulations on the Stampede supercomputer find a better way of predicting blood clots in the heart.

    What pump can run for 80 years without breaking down? Why, the heart, of course.

    But when it does malfunction, the results can be dire. To head these problems off at the pass, researchers harnessed some Texas supercomputers to find a better way to predict which patients are at risk from blood clots.

    Matters of the heart

    Blood clots, frequently the byproduct of a heart weakened by disease or injury, are among the leading causes of death related to heart disease. Since the chambers of the heart are the largest reservoirs of blood in the body, they are most at risk for generating clots.

    Rajat Mittal led a team that tapped Texas supercomputers to develop a better way to predict blood clot risk. Courtesy Johns Hopkins University.

    The challenge for physicians is predicting when a patient is in danger of developing a blood clot.

    The degree to which a stream of blood penetrates the mitral valve into the left ventricle of the heart is the critical factor. If this mitral jet doesn’t travel deep enough into the ventricle, it can prevent the heart from properly flushing, leading to clots and other consequences.

    The metric that characterizes the jet penetration, the E-wave propagation index (EPI), assesses patient risk of clot formation much more accurately than current tools and procedures.

    “The beauty of the index is that it doesn’t require additional measurements. It reformulates echocardiogram data into a new metric,” says Rajat Mittal, professor of mechanical engineering at Johns Hopkins University and one of the principal investigators on the research. “The clinician doesn’t have to do any additional work.”
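    The article doesn't give the formula, but the spirit of "reformulating echocardiogram data" can be sketched. Below, EPI is assumed (for illustration only) to be the distance the E-wave jet travels, its velocity integrated over the inflow window, divided by the length of the left ventricle; the velocity profile, ventricle length, and risk threshold are all hypothetical numbers, not the published clinical values.

    ```python
    # Hedged sketch of an E-wave propagation index computed from
    # echocardiogram-style data. The formula, numbers, and threshold
    # are illustrative assumptions, not the published clinical values.
    import numpy as np

    # Hypothetical E-wave velocity profile (m/s) over the inflow window (s).
    t = np.linspace(0.0, 0.2, 50)                 # 200 ms inflow window
    velocity = 0.8 * np.sin(np.pi * t / 0.2)      # idealized E-wave shape

    penetration = np.trapz(velocity, t)           # distance the jet travels (m)
    lv_length = 0.09                              # left ventricle length (m)

    epi = penetration / lv_length
    print(f"EPI = {epi:.2f}")

    # Hypothetical cutoff: a jet that penetrates too little of the
    # ventricle suggests poor flushing and elevated clot risk.
    if epi < 1.0:
        print("Flag patient for elevated clot risk")
    ```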

    Mittal’s findings were based on simulations performed at the Texas Advanced Computing Center (TACC) and validated using data from patients who both did and did not experience post-heart attack blood clots.


    “Because we understood the fluid dynamics in the heart using our computational models,” Mittal observes, “we can see that the ejection fraction (the current procedure of choice in cardiology) is not able to stratify clot risk, whereas the EPI can very accurately stratify who will get a clot and who will not.”

    Patient computing

    Mittal and his team required large computing resources to derive and test their hypothesis. Run in parallel on 256 to 512 processors, each simulation took several hundred thousand computing hours to complete.

    “This work cannot be done by simulating a single case. Having a large enough sample size to base conclusions on was essential for this research,” Mittal says. “We could never come close to being able to do what we needed to do if it weren’t for the Stampede supercomputer.”

    Dell PowerEdge Stampede supercomputer at the Texas Advanced Computing Center (9.6 petaflops).

    To arrive at their hypothesis, Mittal’s team captured detailed measurements from 13 patients and used those to make patient-specific models of the heart that consider fluid flow, physical structures, and bio-chemistry.

    These models led, in turn, to new insights into the factors that correlate most closely to stagnation in the left ventricle, chief among them, mitral jet penetration.

    Working in collaboration with clinicians, including lead author Thura Harfi of Ohio State University, the team tested their hypothesis using data from 75 individuals — 25 healthy patients, 25 patients who experienced clots in their left ventricle, and 25 patients who had a compromised heart but who didn’t have any clots.

    They found that, based on the EPI measurement, one in every five patients with severe cardiomyopathy who are currently not being treated with anti-clotting drugs would be at risk of a left ventricular clot and would benefit from anticoagulation.

    Future flows

    In addition to establishing the new diagnostic tool for clinicians, Mittal’s research helps advance new, efficient computational models that will be necessary to make patient-specific diagnostics feasible.

    Mittal foresees a time where doctors will perform patient-specific heart simulations routinely to determine the best course of treatment. However, hospitals would need systems hundreds of times faster than a current desktop computer to be able to figure out a solution locally in a reasonable timeframe.

    The team plans to continue to test their hypothesis, applying the EPI metric to a larger dataset. They hope in the future to run a clinical study with a forward-looking analysis.

    “These research results are an important first step to move our basic scientific understanding of the physics of how blood flows in the heart to real-time predictions and treatments for the well-being of patients,” says Ronald Joslin, NSF Fluid Dynamics program director.

    With a better understanding of the mechanics of blood clots and ways to predict them, the researchers have turned their attention to other sources of blood clots, including bio-prosthetic heart valves and atrial fibrillation (AFib) – a quivering or irregular heartbeat that affects around 3 million Americans.

    “The potential for impact in this area is very motivating,” Mittal said, “not just for me but for my collaborators, students and post-docs as well.”

    See the full article here.


     
  • richardmitnick 9:09 am on January 13, 2017 Permalink | Reply
    Tags: 2016 in review, Science Node

    From Science Node: “2016 in review: A big first year for Science Node” 


    20 Dec, 2016
    Sarah Engel

    Science Node celebrated its first full year in September. As we look back on these last 12 months, we noticed a few patterns worth highlighting.

    By December, we were also celebrating over 24,000 connections across our newsletter and social media platforms.

    This growth is due in no small part to partners like XSEDE, Internet2, Open Science Grid, ESnet, and, of course, Indiana University. We’re also grateful for past support from the US National Science Foundation, CERN (the European Organization for Nuclear Research), and the European Commission (via the e-Science Talk project, as well as others).

    Our growth is about more than connections, though. It’s due in large part to the persistence of Managing Editor Lance Farrell – and behind the scenes help from Indiana University’s Greg Moore. In late 2016, we also welcomed two new writers, Alisa Alering and Tristan Fitzpatrick. You’ve seen some of their work already, and you can expect even more in the coming months.

    Science gets personal

    Citizen science and personalized medicine are two examples of how science now reaches into our daily lives – and promises to, on the one hand, hold us close to discovery and, on the other hand, improve our ability to avoid and manage disease.

    Check out Alisa’s take on how science is closer to us than ever before.

    For the history books

    2016 was also a year of amazing discoveries. Scientists confirmed Albert Einstein’s 100-year-old prediction of gravitational waves when LIGO heard the echo of a massive merger of black holes. Science Node was there to cover the computational collaboration that made the discovery possible.

    We also cheered when astrophysicists revved up galactic-sized supercomputer simulations and discovered evidence of a dark planet lurking at the distant edge of our solar system. All that remains is for Konstantin Batygin to actually locate this planet that the models say must be there!

    Find these stories and more in Tristan’s article about the big science news of the year.

    An international focus

    We’re very proud of our global science community – like the German scientist who used a Swiss supercomputer to spot a lake of lava under an island in the Sea of Japan, and the Australian scientists who adapted a firefighting technique to a supercomputing environment and found a smart way to combat invasive species.

    Explore these examples in Lance’s around the world article.

    See the full article here.


     
  • richardmitnick 2:49 pm on January 3, 2017 Permalink | Reply
    Tags: A day in the life of a molecular machine, Georgia State University, NERSC - National Energy Research Scientific Computing Center, Science Node

    From Science Node: “A day in the life of a molecular machine” 


    01 Dec, 2016 [Where has this been?]
    Jorge Salazar

    Courtesy Macmillan Publishers Ltd; Yuan He, et al.

    Supercomputers and cryo-electron microscopy take a perfect picture of molecular machines.

    It sounds like something out of Star Trek: Nano-sized robots self-assemble to form biological machines that do the work of living. And yet this is not science fiction – this really happens.

    Every cell in our body has identical DNA, the twisted staircase of nucleic acids uniquely coded to each organism. Molecular machines take pieces of DNA called genes and make a brain cell when needed, instead of, say, a bone cell.

    Model scientist. Ivaylo Ivanov, associate professor of chemistry at Georgia State University, conducted over four million hours of supercomputer simulations to model molecular machines.

    Scientists today are just starting to understand their structure and function using the latest microscopes and supercomputers.

    Cryo-electron microscopy (cryo-EM) combined with supercomputer simulations have created the best model yet of a vital molecular machine, the human pre-initiation complex (PIC).

    “For the first time, structures have been detailed of the complex groups of molecules that open human DNA,” says study co-author Ivaylo Ivanov, associate professor of chemistry at Georgia State University.

    Ivanov led the computational work that modeled the atoms of the different proteins that act like cogs of the PIC molecular machine.

    The experiment began with images painstakingly taken of PIC. They were made by a group led by study co-author Eva Nogales, senior faculty scientist at Lawrence Berkeley National Laboratory.

    Nogales’ group used cryo-EM to freeze human PIC bound to DNA before zapping it with electron beams. Thanks to recent advances, cryo-EM can now image, at near-atomic resolution, large and complicated biological structures that have proven too difficult to crystallize.

    In all, over 1.4 million cryo-EM ‘freeze frames’ of PIC were processed using supercomputers at the National Energy Research Scientific Computing Center (NERSC).

    NERSC Cray Cori supercomputer
    LBL NERSC Cray XC30 Edison supercomputer
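    Part of the reason so many freeze frames are needed is basic statistics: averaging N aligned, noisy particle images improves the signal-to-noise ratio (defined here as a power ratio) roughly in proportion to N. The toy demonstration below shows this on a one-dimensional stand-in for a projection image; the hard, compute-intensive part in practice is aligning and classifying the images, which is omitted here.

    ```python
    # Toy demonstration: averaging N aligned, noisy images of the same
    # object raises the signal-to-noise ratio (power ratio) roughly in
    # proportion to N. A 1-D signal stands in for a particle projection;
    # real pipelines must also align and classify the images.
    import numpy as np

    rng = np.random.default_rng(2)
    signal = np.sin(np.linspace(0, 2 * np.pi, 100))  # stand-in "particle"

    def snr_of_average(n_images, noise_sigma=5.0):
        noisy = signal + rng.normal(0.0, noise_sigma, size=(n_images, 100))
        average = noisy.mean(axis=0)
        return np.var(signal) / np.var(average - signal)

    for n in (1, 100, 10000):
        print(f"N = {n:>6}: SNR ~ {snr_of_average(n):.1f}")
    ```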

    “Cryo-EM is going through a great expansion,” Nogales says. “It is allowing us to get higher resolution of more structures in different states so that we can describe several pictures showing how they are moving. We don’t see a continuum, but we see snapshots through the process of action.”

    Using eXtreme Science and Engineering Discovery Environment (XSEDE) resources, scientists next built an accurate model that made physical sense of the density maps of PIC.

    Ice queen. Eva Nogales, senior faculty scientist at Lawrence Berkeley National Laboratory, uses cryo-electron microscopy to produce near atomic-level resolution images of molecular structure. Courtesy Eva Nogales.

    To model complex molecular machines, including those for this study, Ivanov’s team ran over four million core hours of simulations on the Stampede supercomputer at the Texas Advanced Computing Center (TACC).
    Dell PowerEdge Stampede supercomputer at the Texas Advanced Computing Center (9.6 petaflops).

    The goal of all this computational effort is to produce atomic models that tell the full story of the structure and function of the protein complex of molecules. To get there, Ivanov’s team took the twelve components of the PIC assembly and created homology models for each component that accounted for their amino acid sequences and their relation to similar known protein 3-D structures.

    XSEDE was “absolutely necessary” for this modeling, says Ivanov. “When we include water and counter ions in addition to the PIC complex in a molecular dynamics simulation box, we get the simulation system size of over a million atoms. For that we need to go to a thousand cores. In this case, we went up to two thousand and forty-eight cores – for that we needed Stampede,” Ivanov said.

    One of the insights gained in the study is a working model of how PIC opens the otherwise stable DNA double helix for transcription. Imagine a cord made of two threads twisted around each other, Nogales explains. Hold one end very tightly, then grab the other and twist it in the opposite direction of the threading to unravel the cord. That’s basically how the living machines that keep us alive do it.

    Changing stations. By aligning the three models of holo-PICs, sequential states are morphed with a special focus on the nucleic acids regions. Courtesy Macmillan Publishers Ltd; Yuan He, et al.

    Both scientists said that they are just beginning to get an atomic-level understanding of transcription, crucial to gene expression and ultimately disease.

    “Many disease states come about because there are errors in how much a certain gene is being read and how much a certain protein with a certain activity in the cell is present,” Nogales says. “Those disease states could be due to excess production of the protein, or conversely not enough. It is very important to understand the molecular process that regulates this production so that we can understand the disease state.”

    While this fundamental work does not directly produce cures, it does lay the foundation to help develop them in the future, said Ivanov. “In order to understand disease, we have to understand how these complexes function in the first place… A collaboration between computational modelers and experimental structural biologists could be very fruitful in the future.”

    The results, Near-atomic resolution visualization of human transcription promoter opening, were recently published in Nature.

    The article was authored by Yuan He, Lawrence Berkeley National Laboratory and now at Northwestern University; Chunli Yan and Ivaylo Ivanov, Georgia State University; Jie Fang, Carla Inouye, Robert Tjian, Eva Nogales, UC Berkeley.

    Funding came from the National Institute of General Medical Sciences (NIH) and the National Science Foundation.

    See the full article here.


     
  • richardmitnick 9:50 pm on December 23, 2016 Permalink | Reply
    Tags: Science Node

    From Science Node: “Supercomputing an earthquake-ready building” 


    19 Dec, 2016
    Tristan Fitzpatrick

    Preparing for an earthquake takes more than luck, thanks to natural hazard engineers and their supercomputers.

    Courtesy Ellen Rathje.

    If someone is inside a building during an earthquake, there isn’t much they can do except duck under a table and hope for the best.

    That’s why designing safe buildings is an important priority for natural hazards researchers.

    Natural hazards engineering involves experimentation, numerical simulation, and data analysis to improve seismic design practices.

    To facilitate this research, the US National Science Foundation (NSF) has invested in the DesignSafe cyberinfrastructure so that researchers can fully harness the vast amount of data available in natural hazards engineering.

    Led by Ellen Rathje at the University of Texas and developed by the Texas Advanced Computing Center (TACC), DesignSafe includes an interactive web interface, repositories to share data sets, and a cloud-based workspace for researchers to perform simulation, computation, data analysis, and other tasks.


    For example, scientists may use a device known as a shake table to simulate earthquake movement and measure how buildings respond to it.

    “From a shaking table test we can measure the movements of a building due to a certain seismic loading,” Rathje says, “and then we can develop a numerical model of that building subjected to the same earthquake loading.”

    Researchers then compare the simulation to experimental data that’s been collected previously from observations in the field.

    “In natural hazards engineering, we take advantage of a lot of experimental data,” Rathje says, “and try to couple it with numerical simulations, as well as field data from observations, and bring it all together to make advances.”
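    That comparison step is easy to picture as code. The sketch below computes a root-mean-square error between a simulated and a measured displacement history; the records and the acceptance tolerance are synthetic stand-ins, not DesignSafe outputs.

    ```python
    # Sketch of the validation step described above: compare a model's
    # predicted displacement history against a shake-table measurement.
    # Both records and the acceptance tolerance here are synthetic.
    import numpy as np

    t = np.linspace(0.0, 10.0, 1001)                         # 10 s of response
    measured = 0.050 * np.sin(2 * np.pi * 1.2 * t)           # shake table (m)
    simulated = 0.048 * np.sin(2 * np.pi * 1.2 * t + 0.05)   # model output (m)

    rmse = np.sqrt(np.mean((simulated - measured) ** 2))
    peak_error = np.max(np.abs(simulated - measured))
    print(f"RMSE = {rmse:.4f} m, peak error = {peak_error:.4f} m")

    if rmse < 0.005:  # hypothetical acceptance tolerance
        print("Model acceptably reproduces the experiment")
    ```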

    The computational resources of Extreme Science and Engineering Discovery Environment (XSEDE) make these simulations possible. DesignSafe facilitates the use of these resources within the natural hazards engineering research community.

    Taming the tsunami? The 2011 Tohoku tsunami caused severe structural damage and the loss of many lives — almost 16,000 dead, over 6,000 injured, and 2,500 missing. Natural hazards engineers use supercomputer simulations and shake tables to minimize damage by designing safer buildings. Courtesy EPA.

    According to Rathje, the collaboration between the two groups is beneficial for both and for researchers interested in natural hazards engineering.

    Rathje previously researched disasters such as the Haiti earthquake in 2010 and earthquakes in Japan. While the collaboration between XSEDE and TACC is a step forward for natural hazards research, Rathje says it’s just another step toward making buildings safer during earthquakes.

    “There’s still a lot of work to be done in natural hazards engineering,” she admits, “but we’ve been able to bring it all under one umbrella so that natural hazards researchers can come to one place to get the data they need for their research.”

    See the full article here.


     
  • richardmitnick 6:50 pm on December 11, 2016 Permalink | Reply
    Tags: Science Node, The right way to simulate the Milky Way

    From Science Node: “The right way to simulate the Milky Way” 


    13 Sep, 2016 [Where oh where has this been?]
    Whitney Clavin

    Astronomers have created the most detailed computer simulation to date of our Milky Way galaxy’s formation, from its inception billions of years ago as a loose assemblage of matter to its present-day state as a massive, spiral disk of stars.

    The simulation solves a decades-old mystery surrounding the tiny galaxies that swarm around the outside of our much larger Milky Way. Previous simulations predicted that thousands of these satellite, or dwarf, galaxies should exist. However, only about 30 of the small galaxies have ever been observed. Astronomers have been tinkering with the simulations, trying to understand this ‘missing satellites’ problem to no avail.


    Access the mp4 video here.
    Supercomputers and superstars. Caltech associate professor of theoretical astrophysics Phil Hopkins and Carnegie-Caltech research fellow Andrew Wetzel use XSEDE supercomputers to build the most detailed and realistic simulation of galaxy formation ever created. The results solve a decades-long mystery regarding dwarf galaxies around our Milky Way. Courtesy Caltech.

    Now, with the new simulation — which used resources from the Extreme Science and Engineering Discovery Environment (XSEDE) running in parallel for 700,000 central processing unit (CPU) hours — astronomers at the California Institute of Technology (Caltech) have created a galaxy that looks like the one we live in today, with the correct, smaller number of dwarf galaxies.

    “That was the aha moment, when I saw that the simulation can finally produce a population of dwarf galaxies like the ones we observe around the Milky Way,” says Andrew Wetzel, postdoctoral fellow at Caltech and Carnegie Observatories in Pasadena, and lead author of a paper about the new research, published August 20 in Astrophysical Journal Letters.

    One of the main updates to the new simulation relates to how supernovae, explosions of massive stars, affect their surrounding environments. In particular, the simulation incorporated detailed formulas that describe the dramatic effects that winds from these explosions can have on star-forming material and dwarf galaxies. These winds, which reach speeds up to thousands of kilometers per second, “can blow gas and stars out of a small galaxy,” says Wetzel.

    Indeed, the new simulation showed the winds can blow apart young dwarf galaxies, preventing them from reaching maturity. Previous simulations that were producing thousands of dwarf galaxies weren’t taking the full effects of supernovae into account.

    “We had thought before that perhaps our understanding of dark matter was incorrect in these simulations, but these new results show we don’t have to tinker with dark matter,” says Wetzel. “When we more precisely model supernovae, we get the right answer.”

    Astronomers simulate our galaxy to understand how the Milky Way, and our solar system within it, came to be. To do this, the researchers tell a computer what our universe was like in the early cosmos. They write complex codes for the basic laws of physics and describe the ingredients of the universe, including everyday matter like hydrogen gas as well as dark matter, which, while invisible, exerts gravitational tugs on other matter. The computers then go to work, playing out all the possible interactions between particles, gas, and stars over billions of years.

    “In a galaxy, you have 100 billion stars, all pulling on each other, not to mention other components we don’t see, like dark matter,” says Caltech’s Phil Hopkins, associate professor of theoretical astrophysics and principal scientist for the new research. “To simulate this, we give a supercomputer equations describing those interactions and then let it crank through those equations repeatedly and see what comes out at the end.”
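    What Hopkins describes is the classic N-body loop. Below is a minimal direct-summation sketch in Python: every particle pulls on every other, and a leapfrog integrator cranks through the equations step after step. Production galaxy-formation codes replace this O(N^2) loop with tree or mesh methods and add gas dynamics, supernova feedback, and dark matter; the particle counts and units here are toy values.

    ```python
    # Minimal direct-summation N-body step: every particle pulls on every
    # other, and a leapfrog integrator cranks through the equations again
    # and again. Galaxy-formation codes replace this O(N^2) loop with tree
    # or mesh methods and add gas, feedback, and dark matter physics.
    import numpy as np

    rng = np.random.default_rng(0)
    n, G, dt, soft = 64, 1.0, 1e-3, 0.05      # toy units, not astrophysical

    pos = rng.normal(size=(n, 3))
    vel = np.zeros((n, 3))
    mass = np.full(n, 1.0 / n)

    def accel(pos):
        # Pairwise separations r_ij = pos_j - pos_i, softened near r = 0.
        diff = pos[None, :, :] - pos[:, None, :]
        inv_r3 = ((diff ** 2).sum(-1) + soft ** 2) ** -1.5
        np.fill_diagonal(inv_r3, 0.0)          # no self-force
        return G * (diff * (mass[None, :] * inv_r3)[..., None]).sum(axis=1)

    for step in range(1000):                   # kick-drift-kick leapfrog
        vel += 0.5 * dt * accel(pos)
        pos += dt * vel
        vel += 0.5 * dt * accel(pos)

    print("center-of-mass drift:", np.abs((mass[:, None] * vel).sum(0)).max())
    ```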

    The researchers are not done simulating our Milky Way. They plan to use even more computing time, up to 20 million CPU hours, in their next rounds. This should lead to predictions about the very faintest and smallest of dwarf galaxies yet to be discovered. Not a lot of these faint galaxies are expected to exist, but the more advanced simulations should be able to predict how many are left to find.

    The study was funded by Caltech, a Sloan Research Fellowship, the US National Science Foundation (NSF), NASA, an Einstein Postdoctoral Fellowship, the Space Telescope Science Institute, UC San Diego, and the Simons Foundation.

    Other coauthors on the study are: Ji-Hoon Kim of Stanford University, Claude-André Faucher-Giguére of Northwestern University, Dušan Kereš of UC San Diego, and Eliot Quataert of UC Berkeley.

    See the full article here.


     
  • richardmitnick 10:20 am on October 22, 2016 Permalink | Reply
    Tags: From greenhouse gas to usable ethanol, Science Node

    From Science Node: “From greenhouse gas to usable ethanol” 


    19 Oct, 2016
    Morgan McCorkle

    ORNL scientists find a way to use nano-spike catalysts to convert carbon dioxide directly into ethanol.

    In a new twist to waste-to-fuel technology, scientists at the Department of Energy’s Oak Ridge National Laboratory (ORNL) have developed an electrochemical process that uses tiny spikes of carbon and copper to turn carbon dioxide, a greenhouse gas, into ethanol. Their finding, which involves nanofabrication and catalysis science, was serendipitous.


    Access the mp4 video here.
    Serendipitous science. Looking to understand a chemical reaction, scientists accidentally discovered a method for converting combustion waste products into ethanol. The chance discovery may revolutionize the ability to use variable energy sources. Courtesy ORNL.

    “We discovered somewhat by accident that this material worked,” said ORNL’s Adam Rondinone, lead author of the team’s study published in ChemistrySelect. “We were trying to study the first step of a proposed reaction when we realized that the catalyst was doing the entire reaction on its own.”

    The team used a catalyst made of carbon, copper, and nitrogen and applied voltage to trigger a complicated chemical reaction that essentially reverses the combustion process. With the help of the nanotechnology-based catalyst, which contains multiple reaction sites, the solution of carbon dioxide dissolved in water turned into ethanol with a yield of 63 percent. Typically, this type of electrochemical reaction results in a mix of several different products in small amounts.

    “We’re taking carbon dioxide, a waste product of combustion, and we’re pushing that combustion reaction backwards with very high selectivity to a useful fuel,” Rondinone said. “Ethanol was a surprise — it’s extremely difficult to go straight from carbon dioxide to ethanol with a single catalyst.”

    The catalyst’s novelty lies in its nanoscale structure, consisting of copper nanoparticles embedded in carbon spikes. This nano-texturing approach avoids the use of expensive or rare metals such as platinum that limit the economic viability of many catalysts.

    “By using common materials, but arranging them with nanotechnology, we figured out how to limit the side reactions and end up with the one thing that we want,” Rondinone said.

    The researchers’ initial analysis suggests that the spiky textured surface of the catalysts provides ample reactive sites to facilitate the carbon dioxide-to-ethanol conversion.

    “They are like 50-nanometer lightning rods that concentrate electrochemical reactivity at the tip of the spike,” Rondinone said.

    Given the technique’s reliance on low-cost materials and an ability to operate at room temperature in water, the researchers believe the approach could be scaled up for industrially relevant applications. For instance, the process could be used to store excess electricity generated from variable power sources such as wind and solar.

    “A process like this would allow you to consume extra electricity when it’s available to make and store as ethanol,” Rondinone said. “This could help to balance a grid supplied by intermittent renewable sources.”

    The researchers plan to refine their approach to improve the overall production rate and further study the catalyst’s properties and behavior.

    ORNL’s Yang Song, Rui Peng, Dale Hensley, Peter Bonnesen, Liangbo Liang, Zili Wu, Harry Meyer III, Miaofang Chi, Cheng Ma, Bobby Sumpter and Adam Rondinone are coauthors on the study.

    The work was supported by DOE’s Office of Science and used resources at ORNL’s Center for Nanophase Materials Sciences, a DOE Office of Science User Facility.

    See the full article here.


     
  • richardmitnick 9:11 am on September 20, 2016 Permalink | Reply
    Tags: Science Node

    From Science Node: “Seeing the solar wind” 


    02 Sep, 2016 [Just appeared in social media.]
    Lance Farrell

    The details of how rays in the sun’s upper atmosphere transitioned into solar wind have always been a mystery. Now they’re starring in a movie.

    Until recently, the solar wind was virtually an article of faith. But in the summer of 2016 scientists were able to see it for the first time.

    Know your sun

    Moving from the center outward, our dear sun has a core, radiative and convective zones, a photosphere, a chromosphere, and a corona — the sun’s atmosphere. The corona extends about a million miles into space, and is what we see when we look into the sky, or what we drew as children.

    As solar plasma escapes from the sun, the sun’s gravitational hold weakens with distance, eventually reaching a transition point between the corona and the solar wind.

    At about 20 million miles out from the sun, solar particles – some streaming at a million miles per hour – gain turbulence as the sun’s gravitational hold loosens. (Think of how a stream of water loses its force and focus the further away it gets from the nozzle of your garden hose.)

    This turbulent blast of plasma extends in all directions from the sun, reaching the outer edges of our solar system. Scientists refer to this ‘bubble’ as the heliosphere. We refer to it as home.

    The sun is vast, but its atmosphere is even bigger. Bigger still is the area covered by the solar wind, which blows throughout our solar system. Courtesy of NASA.

    Suncatchers

    Reporting in The Astrophysical Journal, scientists offer a nuanced definition of our sun with visual evidence to back it up.

    Turns out, the sun is neither a hard ball in space nor an immense raging ball of fire.

    Rather, it is a mass of particles and magnetic fields extending outward well beyond the sun’s actual surface. This atmosphere pushes outward throughout our solar system, engulfing all the planets in the solar wind.

    To create the first movie of this wind, scientists started with images taken over a 15-day interval in late December of 2008 from the Heliospheric Imager in the SECCHI suite onboard NASA’s Solar Terrestrial Relations Observatory (STEREO).

    NASA/STEREO spacecraft


    Blowin’ in the wind. Animation of filtered imagery taken from the Heliospheric Imager in the SECCHI suite onboard NASA’s Solar Terrestrial Relations Observatory (STEREO). Courtesy C. E. DeForest, et al.

    Then, using novel image processing techniques including background brightness suppression, scientists were able to discern the boundary between the upper corona and the solar wind.
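    A crude version of that background suppression is easy to sketch: estimate the static bright background as a low per-pixel percentile over the image sequence and subtract it, leaving the faint moving signal. The synthetic frames below are stand-ins; the published pipeline involves far more careful filtering than this.

    ```python
    # Crude background-suppression sketch for an image sequence: take a
    # low per-pixel percentile over time as the static bright background
    # and subtract it, leaving faint moving features. The real pipeline
    # (DeForest et al.) uses far more careful filtering than this.
    import numpy as np

    rng = np.random.default_rng(1)
    frames = rng.normal(100.0, 1.0, size=(30, 64, 64))  # synthetic stack

    # Inject a faint feature drifting across the field of view.
    for k in range(30):
        frames[k, 32, 2 * k : 2 * k + 2] += 3.0

    # A low percentile along the time axis is robust to the transient
    # feature passing through any given pixel.
    background = np.percentile(frames, 10, axis=0)
    cleaned = frames - background[None, :, :]

    print("residual background mean:", round(float(cleaned.mean()), 2))
    print("feature contrast in frame 10:", round(float(cleaned[10, 32].max()), 2))
    ```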

    No longer mere theory or an article of faith, the solar wind is now something astrophysicists can see with their own eyes.

    Imaging the solar wind is significant because it identifies a shift in plasma texture as it flows away from the sun.

    Objects placed in this upper extreme of the solar corona (say, a satellite, space ship, or a person) will be safer and more productive with this better understanding of the solar wind.

    And the images look pretty cool, too.

    See the full article here.


     