Tagged: Computing

  • richardmitnick 5:24 pm on April 1, 2018
    Tags: Computer searches telescope data for evidence of distant planets, Computing

    From MIT: “Computer searches telescope data for evidence of distant planets” 

    MIT News

    March 29, 2018
    Larry Hardesty

    A young sun-like star encircled by its planet-forming disk of gas and dust.
    Image: NASA/JPL-Caltech

    Machine-learning system uses physics principles to augment data from NASA crowdsourcing project.

    As part of an effort to identify distant planets hospitable to life, NASA has established a crowdsourcing project in which volunteers search telescopic images for evidence of debris disks around stars, which are good indicators of exoplanets.

    Using the results of that project, researchers at MIT have now trained a machine-learning system to search for debris disks itself. The scale of the search demands automation: There are nearly 750 million possible light sources in the data accumulated through NASA’s Wide-Field Infrared Survey Explorer (WISE) mission alone.

    NASA/WISE Telescope

    In tests, the machine-learning system agreed with human identifications of debris disks 97 percent of the time. The researchers also trained their system to rate debris disks according to their likelihood of containing detectable exoplanets. In a paper describing the new work in the journal Astronomy and Computing, the MIT researchers report that their system identified 367 previously unexamined celestial objects as particularly promising candidates for further study.

    The work represents an unusual approach to machine learning, which has been championed by one of the paper’s coauthors, Victor Pankratius, a principal research scientist at MIT’s Haystack Observatory. Typically, a machine-learning system will comb through a wealth of training data, looking for consistent correlations between features of the data and some label applied by a human analyst — in this case, stars circled by debris disks.

    But Pankratius argues that in the sciences, machine-learning systems would be more useful if they explicitly incorporated a little bit of scientific understanding, to help guide their searches for correlations or identify deviations from the norm that could be of scientific interest.

    “The main vision is to go beyond what A.I. is focusing on today,” Pankratius says. “Today, we’re collecting data, and we’re trying to find features in the data. You end up with billions and billions of features. So what are you doing with them? What you want to know as a scientist is not that the computer tells you that certain pixels are certain features. You want to know ‘Oh, this is a physically relevant thing, and here are the physics parameters of the thing.’”

    Classroom conception

    The new paper grew out of an MIT seminar that Pankratius co-taught with Sara Seager, the Class of 1941 Professor of Earth, Atmospheric, and Planetary Sciences, who is well-known for her exoplanet research. The seminar, Astroinformatics for Exoplanets, introduced students to data science techniques that could be useful for interpreting the flood of data generated by new astronomical instruments. After mastering the techniques, the students were asked to apply them to outstanding astronomical questions.

    For her final project, Tam Nguyen, a graduate student in aeronautics and astronautics, chose the problem of training a machine-learning system to identify debris disks, and the new paper is an outgrowth of that work. Nguyen is first author on the paper, and she’s joined by Seager, Pankratius, and Laura Eckman, an undergraduate majoring in electrical engineering and computer science.

    From the NASA crowdsourcing project, the researchers had the celestial coordinates of the light sources that human volunteers had identified as featuring debris disks. The disks are recognizable as ellipses of light with slightly brighter ellipses at their centers. The researchers also used the raw astronomical data generated by the WISE mission.

    To prepare the data for the machine-learning system, Nguyen carved it up into small chunks, then used standard signal-processing techniques to filter out artifacts caused by the imaging instruments or by ambient light. Next, she identified those chunks with light sources at their centers, and used existing image-segmentation algorithms to remove any additional sources of light. These types of procedures are typical in any computer-vision machine-learning project.
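
    To make that pipeline concrete, here is a minimal sketch in Python of the kind of chunking, artifact filtering, and central-source isolation described above. It is not the authors’ code: the chunk size, the thresholds, and the Gaussian filter standing in for the signal-processing step are all illustrative assumptions.

    ```python
    import numpy as np
    from scipy import ndimage

    def candidate_chunks(image, chunk=64, noise_sigma=1.5, center_thresh=5.0):
        """Yield denoised chunks of a survey image with a light source at the center."""
        smoothed = ndimage.gaussian_filter(image, sigma=noise_sigma)  # suppress instrument artifacts
        background = np.median(smoothed)
        cy = cx = chunk // 2
        for i in range(0, image.shape[0] - chunk, chunk):
            for j in range(0, image.shape[1] - chunk, chunk):
                tile = smoothed[i:i + chunk, j:j + chunk].copy()
                # keep the chunk only if its center is much brighter than the background
                if tile[cy, cx] <= background + center_thresh * tile.std():
                    continue
                # stand-in for the image-segmentation step: label bright regions
                # and flatten any source not connected to the central one
                labels, _ = ndimage.label(tile > background + 2 * tile.std())
                central = labels[cy, cx]
                tile[(labels != central) & (labels != 0)] = background
                yield tile
    ```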

    Coded intuitions

    But Nguyen used basic principles of physics to prune the data further. For one thing, she looked at the variation in the intensity of the light emitted by the light sources across four different frequency bands. She also used standard metrics to evaluate the position, symmetry, and scale of the light sources, establishing thresholds for inclusion in her data set.
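
    A hedged sketch of that pruning step, with made-up feature names and cutoff values: each candidate source is scored on its brightness variation across WISE’s four infrared bands plus simple centroid-offset and symmetry metrics, and it survives only if every score passes its threshold.

    ```python
    import numpy as np

    def passes_physics_cuts(chunks_by_band, max_band_variation=0.8,
                            max_center_offset=3.0, min_symmetry=0.7):
        """chunks_by_band: dict mapping WISE band name -> 2D array for one source.
        Threshold values are illustrative, not those used in the paper."""
        fluxes = np.array([c.sum() for c in chunks_by_band.values()])
        # a debris disk re-emits starlight in the infrared, so total flux should
        # vary smoothly rather than wildly from band to band
        variation = fluxes.std() / fluxes.mean()
        ref = next(iter(chunks_by_band.values()))
        cy, cx = ref.shape[0] // 2, ref.shape[1] // 2
        # the flux-weighted centroid should sit near the chunk center
        ys, xs = np.indices(ref.shape)
        total = ref.sum()
        offset = np.hypot((ys * ref).sum() / total - cy,
                          (xs * ref).sum() / total - cx)
        # crude symmetry score: correlation of the image with its 180-degree rotation
        symmetry = np.corrcoef(ref.ravel(), np.rot90(ref, 2).ravel())[0, 1]
        return (variation < max_band_variation
                and offset < max_center_offset
                and symmetry > min_symmetry)
    ```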

    In addition to the tagged debris disks from NASA’s crowdsourcing project, the researchers also had a short list of stars that astronomers had identified as probably hosting exoplanets. From that information, their system also inferred characteristics of debris disks that were correlated with the presence of exoplanets, to select the 367 candidates for further study.

    “Given the scalability challenges with big data, leveraging crowdsourcing and citizen science to develop training data sets for machine-learning classifiers for astronomical observations and associated objects is an innovative way to address challenges not only in astronomy but also several different data-intensive science areas,” says Dan Crichton, who leads the Center for Data Science and Technology at NASA’s Jet Propulsion Laboratory. “The use of the computer-aided discovery pipeline described to automate the extraction, classification, and validation process is going to be helpful for systematizing how these capabilities can be brought together. The paper does a nice job of discussing the effectiveness of this approach as applied to debris disk candidates. The lessons learned are going to be important for generalizing the techniques to other astronomy and different discipline applications.”

    “The Disk Detective science team has been working on its own machine-learning project, and now that this paper is out, we’re going to have to get together and compare notes,” says Marc Kuchner, a senior astrophysicist at NASA’s Goddard Space Flight Center and leader of the crowdsourcing disk-detection project known as Disk Detective. “I’m really glad that Nguyen is looking into this because I really think that this kind of machine-human cooperation is going to be crucial for analyzing the big data sets of the future.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.
     
  • richardmitnick 1:13 pm on February 19, 2018
    Tags: Automating materials design, Computing

    From MIT: “Automating materials design” 

    MIT News

    February 2, 2018 [Just showed up in social media.]
    Larry Hardesty

    New software identified five different families of microstructures, each defined by a shared “skeleton” (blue), that optimally traded off three mechanical properties. Courtesy of the researchers.

    With new approach, researchers specify desired properties of a material, and a computer system generates a structure accordingly.

    For decades, materials scientists have taken inspiration from the natural world. They’ll identify a biological material that has some desirable trait — such as the toughness of bones or conch shells — and reverse-engineer it. Then, once they’ve determined the material’s “microstructure,” they’ll try to approximate it in human-made materials.

    Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory have developed a new system that puts the design of microstructures on a much more secure empirical footing. With their system, designers numerically specify the properties they want their materials to have, and the system generates a microstructure that matches the specification.

    The researchers have reported their results in Science Advances. In their paper, they describe using the system to produce microstructures with optimal trade-offs between three different mechanical properties. But according to associate professor of electrical engineering and computer science Wojciech Matusik, whose group developed the new system, the researchers’ approach could be adapted to any combination of properties.

    “We did it for relatively simple mechanical properties, but you can apply it to more complex mechanical properties, or you could apply it to combinations of thermal, mechanical, optical, and electromagnetic properties,” Matusik says. “Basically, this is a completely automated process for discovering optimal structure families for metamaterials.”

    Joining Matusik on the paper are first author Desai Chen, a graduate student in electrical engineering and computer science; and Mélina Skouras and Bo Zhu, both postdocs in Matusik’s group.

    Finding the formula

    The new work builds on research reported last summer, in which the same quartet of researchers generated computer models of microstructures and used simulation software to score them according to measurements of three or four mechanical properties. Each score defines a point in a three- or four-dimensional space, and through a combination of sampling and local exploration, the researchers constructed a cloud of points, each of which corresponded to a specific microstructure.

    Once the cloud was dense enough, the researchers computed a bounding surface that contained it. Points near the surface represented optimal trade-offs between the mechanical properties; for those points, it was impossible to increase the score on one property without lowering the score on another.
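
    The “points near the surface” idea is essentially Pareto optimality. A minimal sketch of that generic technique (not the paper’s specific surface construction) looks like this:

    ```python
    import numpy as np

    def pareto_optimal(scores):
        """scores: (n_points, n_properties) array, higher is better.
        Returns a boolean mask of points on the optimal trade-off frontier."""
        n = scores.shape[0]
        optimal = np.ones(n, dtype=bool)
        for i in range(n):
            if optimal[i]:
                # point i is dominated if some other point is >= on every
                # property and strictly > on at least one
                dominated_by = (np.all(scores >= scores[i], axis=1)
                                & np.any(scores > scores[i], axis=1))
                if dominated_by.any():
                    optimal[i] = False
        return optimal

    # e.g. a cloud of microstructures scored on three mechanical properties
    cloud = np.random.rand(10_000, 3)
    frontier = cloud[pareto_optimal(cloud)]
    ```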


    That’s where the new paper picks up. First, the researchers used some standard measures to evaluate the geometric similarities of the microstructures corresponding to the points along the boundaries. On the basis of those measures, the researchers’ software clusters together microstructures with similar geometries.

    For every cluster, the software extracts a “skeleton” — a rudimentary shape that all the microstructures share. Then it tries to reproduce each of the microstructures by making fine adjustments to the skeleton and constructing boxes around each of its segments. Both of these operations — modifying the skeleton and determining the size, locations, and orientations of the boxes — are controlled by a manageable number of variables. Essentially, the researchers’ system deduces a mathematical formula for reconstructing each of the microstructures in a cluster.

    Next, the researchers use machine-learning techniques to determine correlations between specific values for the variables in the formulae and the measured properties of the resulting microstructures. This gives the system a rigorous way to translate back and forth between microstructures and their properties.
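
    As a rough illustration of such a two-way translation, one can fit a regression from the formula variables to the simulated properties, then invert it numerically to ask which microstructure best matches a target specification. The model choice, the optimizer, and the stand-in data below are assumptions, not the authors’ method.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from scipy.optimize import differential_evolution

    # X: (n_samples, n_params) skeleton/box variables; Y: (n_samples, 3) properties
    X, Y = np.random.rand(5000, 8), np.random.rand(5000, 3)   # stand-in data
    model = RandomForestRegressor(n_estimators=200).fit(X, Y)  # forward map: params -> properties

    def microstructure_for(target):
        """Search for parameters whose predicted properties are closest to `target`."""
        loss = lambda p: np.sum((model.predict(p[None, :])[0] - target) ** 2)
        bounds = [(0.0, 1.0)] * X.shape[1]
        return differential_evolution(loss, bounds, maxiter=50).x

    params = microstructure_for(np.array([0.9, 0.5, 0.7]))  # a desired property triple
    ```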


    On automatic

    Every step in this process, Matusik emphasizes, is completely automated, including the measurement of similarities, the clustering, the skeleton extraction, the formula derivation, and the correlation of geometries and properties. As such, the approach would apply as well to any collection of microstructures evaluated according to any criteria.

    By the same token, Matusik explains, the MIT researchers’ system could be used in conjunction with existing approaches to materials design. Besides taking inspiration from biological materials, he says, researchers will also attempt to design microstructures by hand. But either approach could be used as the starting point for the sort of principled exploration of design possibilities that the researchers’ system affords.

    “You can throw this into the bucket for your sampler,” Matusik says. “So we guarantee that we are at least as good as anything else that has been done before.”

    In the new paper, the researchers do report one aspect of their analysis that was not automated: the identification of the physical mechanisms that determine the microstructures’ properties. Once they had the skeletons of several different families of microstructures, they could determine how those skeletons would respond to physical forces applied at different angles and locations.

    But even this analysis is subject to automation, Chen says. The simulation software that determines the microstructures’ properties can also identify the structural elements that deform most under physical pressure, a good indication that they play an important functional role.

    The work was supported by the U.S. Defense Advanced Research Projects Agency’s Simplifying Complexity in Scientific Discovery program.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition


     
  • richardmitnick 7:06 am on February 6, 2018
    Tags: Computing

    From CSIROscope: “Cybersecurity: we can hack it” 

    CSIROscope

    6 February 2018
    Chris Chelvan


    It is estimated that 3,885,567,619 people across the world have access to the internet, roughly 51.7% of the world population. More often than not, the internet is used to benefit society — from connecting opposite sides of the world to making knowledge more accessible. But sometimes, the anonymity provided by the internet creates risks of cyberbullying as well as threats to cyber security.

    Every month, at least 50,000 new cyber threats arise that expose internet users to risk. The National Vulnerability Database (NVD) operated by the National Institute of Standards and Technology suggests that between 500 and 1,000 new vulnerabilities emerge every month, of which at least 25 per cent are critical and pose a risk for significant damage.

    Some of the largest cybersecurity threats emerged just last year. The WannaCry ransomware attack in May 2017 affected more than 300,000 computers across 150 countries, causing billions of dollars in damage. Spectre and Meltdown, too, revealed critical vulnerabilities in computers and mobile phones around the world, exposing millions of people to hackers — in fact, Data61 researcher Dr Yuval Yarom from the Trustworthy Systems Group was one of the contributors whose research uncovered the Spectre issue.

    Not only are cyber threats increasing, they’re also evolving. First the focus was on attacking technology: hacking, malware, and remote access. Then the focus shifted to attacking humans with phishing, social engineering and ransomware, like WannaCry. Now cyber attacks are more sophisticated than ever and even harder to detect.

    And yet, given all these threats, Australia has next to no cyber security specialists. The Australian Cyber Security Growth Network has said the demand for skills in the sector far outstrips supply. A recent Government report estimated Australia would need another 11,000 cyber security specialists over the next decade.

    It’s against this diverse backdrop of new and constantly changing threats that we celebrate Safer Internet Day and call on our future generation of science, technology, engineering and mathematics (STEM) leaders to fill the glaring shortage of cybersecurity professionals in Australia.

    Not only are we short of information security professionals now, but data show that by 2022 we’ll be short up to 1.8 million positions. This is particularly urgent in Australia, where women make up just one in three students studying STEM — a proportion that needs to rise to meet the country’s growing cyber security needs.

    Introducing STEM Professionals in Schools, our education program that shows young women how they can make an impact in Australia and across the world. STEM Professionals in Schools is Australia’s leading STEM education volunteering program, bringing real world STEM into the classroom to inspire students and teachers.

    Our Data61 CEO, Adrian Turner, visited Melbourne Girls College to talk about safer internet usage and the importance of STEM.

    “These students are our future innovators, scientists and engineers,” Mr Turner said.

    “It’s essential to equip them with the skills they need in school, and to capture their interest in cybersecurity and why it matters now and in the future so they can see how much of a crucial role it is and will continue to play in Australia’s data-driven future and digital economy.”

    A rewarding career in STEM can take on many forms, too. Data61’s STEM graduates have worked in various roles and research projects, spanning everything from machine learning and robotics to analytics and, of course, cybersecurity.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    SKA/ASKAP radio telescope at the Murchison Radio-astronomy Observatory (MRO) in Mid West region of Western Australia

    So what can we expect these new radio projects to discover? We have no idea, but history tells us that they are almost certain to deliver some major surprises.

    Making these new discoveries may not be so simple. Gone are the days when astronomers could just notice something odd as they browse their tables and graphs.

    Nowadays, astronomers are more likely to be distilling their answers from carefully-posed queries to databases containing petabytes of data. Human brains are just not up to the job of making unexpected discoveries in these circumstances, and instead we will need to develop “learning machines” to help us discover the unexpected.

    With the right tools and careful insight, who knows what we might find.


    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

     
  • richardmitnick 12:10 pm on November 29, 2017
    Tags: Bridging gaps in high-performance computing languages, Computing, Generative programming, Programming languages, Tiark Rompf

    From ASCRDiscovery: “Language barrier” 

    ASCRDiscovery
    Advancing Science Through Computing

    November 2017
    No writer credit

    A Purdue University professor is using a DOE early-career award to bridge gaps in high-performance computing languages.

    Detail from an artwork made through generative programming. Purdue University’s Tiark Rompf is investigating programs that create new ones to bring legacy software up to speed in the era of exascale computing. Painting courtesy of Freddbomba via Wikimedia Commons.

    A Purdue University assistant professor of computer science leads a group effort to find new and better ways to generate high-performance computing codes that run efficiently on as many different kinds of supercomputer architectures as possible.

    That’s the challenging goal Tiark Rompf has set for himself with his recent Department of Energy Early Career Research Program award – to develop what he calls “program generators” for exascale architectures and beyond.

    “Programming supercomputers is hard,” Rompf says. Coders typically write software in so-called general-purpose languages. The languages are low-level, meaning “specialized to a given machine architecture. So when a machine is upgraded or replaced, one has to rewrite most of the software.”

    As an alternative to this rewriting, which involves tediously translating low-level code from one supercomputer platform into another, programmers would prefer to use high-level languages “written in a way that feels natural” to them, Rompf says, and “closer to the way a programmer thinks about the computation.”

    But high-level and low-level languages are far apart, with a steel wall of differences between the ways the two types of languages are written, interpreted and executed. In particular, high-level languages rarely perform as well as desired. Executing them requires special so-called smart compilers that must use highly specialized analysis to figure out what the program “really means and how to match it to machine instructions.”

    Tiark Rompf. Photo courtesy of Purdue University.

    Rompf and his group propose avoiding that with something called generative programming, which he has worked on since before he received his 2012 Ph.D. from École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland. The idea is to create special programs structured so they’re able to make additional programs where needed.

    In a 2015 paper, Rompf and research colleagues at EPFL, Stanford University and ETH Zurich also called for a radical reassessment of high-level languages. “We really need to think about how to design programming languages and (software) libraries that embrace this generative programming idea,” he adds.

    Program generators “are attractive because they can automate the process of producing very efficient code,” he says. But building them “has also been very hard, and therefore only a few exist today. We’re planning to build the necessary infrastructure to make it an order of magnitude easier.”
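
    The flavor of generative programming fits in a few lines. The toy sketch below (plain Python, not Rompf’s Scala-based tooling) is a program generator in miniature: instead of computing a dot product, it emits the source of a dot-product routine specialized to one fixed length, the kind of specialization a real generator would perform for one target.

    ```python
    def stage_dot_product(n):
        """Generate Python source for a dot product specialized to length n."""
        terms = " + ".join(f"a[{i}] * b[{i}]" for i in range(n))
        return f"def dot{n}(a, b):\n    return {terms}\n"

    code = stage_dot_product(4)
    print(code)
    # def dot4(a, b):
    #     return a[0] * b[0] + a[1] * b[1] + a[2] * b[2] + a[3] * b[3]

    namespace = {}
    exec(code, namespace)  # compile the generated program and use it
    assert namespace["dot4"]([1, 2, 3, 4], [5, 6, 7, 8]) == 70
    ```

    The generated routine has no loop and no length check left in it; the generator paid those costs once, at generation time.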

    As he noted in his early-career award proposal, progress building program generators is extremely difficult for more reasons than just programmer-computer disharmony. Other obstacles include compiler limitations, differing capabilities of supercomputer processors, the changing ways data are stored and the ways software libraries are accessed. Rompf plans to use his five-year, $750,000 award to evaluate generative programming as a way around some of those roadblocks.

    One idea, for instance, is to identify and create an extensible stack of intermediate languages that could serve as transitional steps when high-level codes must be translated into machine code. These also are described as “domain-specific languages” or DSLs, as they encode more knowledge about the application subject than general-purpose languages.

    Eventually, programmers hope to entirely phase out legacy languages such as C and Fortran, substituting only high-level languages and DSLs. Rompf points out that legacy codes can be decades older than the processors they run on, and some have been heavily adapted to run on new generations of machines, an investment that can make legacy codes difficult to jettison.

    __________________________________________________
    Rompf started Project Lancet to integrate generative approaches into a virtual machine for high-level languages.
    __________________________________________________

    Generative programming was the basis for Rompf’s doctoral research. It was described as an approach called Lightweight Modular Staging, or LMS, in a 2010 paper he wrote with his EPFL Ph.D. advisor, Martin Odersky. That’s “a software platform that provides capabilities for other programmers to develop software in a generative style,” Rompf says.

    LMS also underpins Delite, a software framework Rompf later developed in collaboration with a Stanford University group to build DSLs targeting parallel processing in supercomputer architectures – “very important for the work I’m planning to do,” he says.

    While working at Oracle Labs between 2012 and 2014, Rompf started Project Lancet to integrate generative approaches into a virtual machine for high-level languages. Virtual machines are code that can induce real computers to run selected programs. In the case of Lancet, software executes high-level languages and then performs selective compilations in machine code.

    Born and raised in Germany, Rompf joined Purdue in the fall of 2014. It’s “a great environment for doing this kind of research,” he says. “We have lots of good students in compilers, high-performance and databases. We’ve been hiring many new assistant professors. There are lots of young people who all want to accomplish things.”

    He calls his DOE Early Career award a great honor. “I think there are many opportunities for future work in getting more of the DOE community in the interaction.” Although he is the project’s only principal investigator, he is collaborating with other groups at Purdue, ETH Zurich and Stanford and has received recent and related National Science Foundation research grants.

    As a busy assistant professor, he has six graduate students on track to get their doctorates, plus a varying number of undergraduate assistants. Rompf also is a member of the Purdue Research on Programming Languages group (PurPL), with 10 faculty members and their students.

    “It’s a very vibrant group, which like the Purdue computer science department has been growing a lot in recent years,” he says.

    Now in its eighth year, the DOE Office of Science’s Early Career Research Program for researchers in universities and DOE national laboratories supports the development of individual research programs of outstanding scientists early in their careers and stimulates research careers in the disciplines supported by the Office of Science. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    ASCRDiscovery is a publication of The U.S. Department of Energy

     
  • richardmitnick 2:04 pm on October 13, 2017
    Tags: Computing

    From BNL: “Scientists Use Machine Learning to Translate ‘Hidden’ Information that Reveals Chemistry in Action” 

    Brookhaven Lab

    October 10, 2017
    Karen McNulty Walsh
    kmcnulty@bnl.gov
    (631) 344-8350

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    New method allows on-the-fly analysis of how catalysts change during reactions, providing crucial information for improving performance.

    A sketch of the new method that enables fast, “on-the-fly” determination of three-dimensional structure of nanocatalysts. The neural network converts the x-ray absorption spectra into geometric information (such as nanoparticle sizes and shapes) and the structural models are obtained for each spectrum.

    Chemistry is a complex dance of atoms. Subtle shifts in position and shuffles of electrons break and remake chemical bonds as participants change partners. Catalysts are like molecular matchmakers that make it easier for sometimes-reluctant partners to interact.

    Now scientists have a way to capture the details of chemistry choreography as it happens. The method—which relies on computers that have learned to recognize hidden signs of the steps—should help them improve the performance of catalysts to drive reactions toward desired products faster.

    The method—developed by an interdisciplinary team of chemists, computational scientists, and physicists at the U.S. Department of Energy’s Brookhaven National Laboratory and Stony Brook University—is described in a new paper published in the Journal of Physical Chemistry Letters. The paper demonstrates how the team used neural networks and machine learning to teach computers to decode previously inaccessible information from x-ray data, and then used that data to decipher 3D nanoscale structures.

    Decoding nanoscale structures

    “The main challenge in developing catalysts is knowing how they work—so we can design better ones rationally, not by trial-and-error,” said Anatoly Frenkel, leader of the research team who has a joint appointment with Brookhaven Lab’s Chemistry Division and Stony Brook University’s Materials Science Department. “The explanation for how catalysts work is at the level of atoms and very precise measurements of distances between them, which can change as they react. Therefore it is not so important to know the catalysts’ architecture when they are made but more important to follow that as they react.”

    Anatoly Frenkel (standing) with co-authors (l to r) Deyu Lu, Yuewei Lin, and Janis Timoshenko.

    Trouble is, important reactions—those that create important industrial chemicals such as fertilizers—often take place at high temperatures and under pressure, which complicates measurement techniques. For example, x-rays can reveal some atomic-level structures by causing atoms that absorb their energy to emit electronic waves. As those waves interact with nearby atoms, they reveal their positions in a way that’s similar to how distortions in ripples on the surface of a pond can reveal the presence of rocks. But the ripple pattern gets more complicated and smeared when high heat and pressure introduce disorder into the structure, thus blurring the information the waves can reveal.

    So instead of relying on the “ripple pattern” of the x-ray absorption spectrum, Frenkel’s group figured out a way to look into a different part of the spectrum associated with low-energy waves that are less affected by heat and disorder.

    “We realized that this part of the x-ray absorption signal contains all the needed information about the environment around the absorbing atoms,” said Janis Timoshenko, a postdoctoral fellow working with Frenkel at Stony Brook and lead author on the paper. “But this information is hidden ‘below the surface’ in the sense that we don’t have an equation to describe it, so it is much harder to interpret. We needed to decode that spectrum but we didn’t have a key.”

    Fortunately Yuewei Lin and Shinjae Yoo of Brookhaven’s Computational Science Initiative and Deyu Lu of the Center for Functional Nanomaterials (CFN) had significant experience with so-called machine learning methods. They helped the team develop a key by teaching computers to find the connections between hidden features of the absorption spectrum and structural details of the catalysts.

    “Janis took these ideas and really ran with them,” Frenkel said.

    The team used theoretical modeling to produce simulated spectra of several hundred thousand model structures, and used those to train the computer to recognize the features of the spectrum and how they correlated with the structure.

    “Then we built a neural network that was able to convert the spectrum into structures,” Frenkel said.
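
    In outline, the training step is ordinary supervised regression from spectra to structure descriptors. The sketch below uses made-up array sizes and a generic scikit-learn network; the paper’s actual architecture and descriptors differ in detail.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    n_samples, n_energies, n_descriptors = 50_000, 100, 3   # assumed sizes
    spectra = np.random.rand(n_samples, n_energies)         # simulated absorption spectra
    structures = np.random.rand(n_samples, n_descriptors)   # e.g. size/shape descriptors

    net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=20)
    net.fit(spectra, structures)  # train on theory-generated (spectrum, structure) pairs

    # once trained, inverting a measured spectrum is one cheap forward pass,
    # which is what makes "on-the-fly" use during an experiment possible
    measured = np.random.rand(1, n_energies)
    predicted_structure = net.predict(measured)
    ```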

    When they tested whether the method could decipher the shapes and sizes of well-defined platinum nanoparticles (using x-ray absorption spectra previously published by Frenkel and his collaborators), it did.

    “This method can now be used on the fly,” Frenkel said. “Once the network is constructed it takes almost no time for the structure to be obtained in any real experiment.”

    That means scientists studying catalysts at Brookhaven’s National Synchrotron Light Source II (NSLS-II), for example, could obtain real-time structural information to decipher why a particular reaction slows down, or starts producing an unwanted product—and then tweak the reaction conditions or catalyst chemistry to achieve desired results. This would be a big improvement over waiting to analyze results after completing the experiments and then figuring out what went wrong.

    In addition, this technique can process and analyze spectral signals from very low-concentration samples, and will be particularly useful at new high flux and high-energy-resolution beamlines incorporating special optics and high-throughput analysis techniques at NSLS-II.

    “This will offer completely new methods of using synchrotrons for operando research,” Frenkel said.

    This work was funded by the DOE Office of Science (BES) and by Brookhaven’s Laboratory Directed Research and Development program. Previously published spectra for the model nanoparticles used to validate the neural network were collected at the Advanced Photon Source (APS) at DOE’s Argonne National Laboratory and the original National Synchrotron Light Source (NSLS) at Brookhaven Lab, now replaced by NSLS-II. CFN, NSLS-II, and APS are DOE Office of Science User Facilities. In addition to Frenkel and Timoshenko, Lu and Lin are co-authors on the paper.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition
    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 9:32 am on July 7, 2017
    Tags: Computing, St. Jude Children’s Research Hospital

    From ORNL: “ORNL researchers apply imaging, computational expertise to St. Jude research” 

    Oak Ridge National Laboratory

    July 6, 2017
    Stephanie G. Seay
    seaysg@ornl.gov
    865.576.9894

    Left to right: ORNL’s Derek Rose, Matthew Eicholtz, Philip Bingham, Ryan Kerekes, and Shaun Gleason.

    Measuring migrating neurons in a developing mouse brain.

    Identifying and analyzing neurons in a mouse auditory cortex.

    In the quest to better understand and cure childhood diseases, scientists at St. Jude Children’s Research Hospital accumulate enormous amounts of data from powerful video microscopes. To help St. Jude scientists mine that trove of data, researchers at Oak Ridge National Laboratory have created custom algorithms that can provide a deeper understanding of the images and quicken the pace of research.

    The work resides in St. Jude’s Department of Developmental Neurobiology in Memphis, Tennessee, where scientists use advanced microscopy to capture the details of phenomena such as nerve cell growth and migration in the brains of mice. ORNL researchers take those videos and leverage their expertise in image processing, computational science, and machine learning to analyze the footage and create statistics.

    A recent Science article details St. Jude research on brain plasticity, or the ability of the brain to change and form new connections between neurons. In this work, ORNL helped track mouse brain cell electrical activity in the auditory cortex when the animals were exposed to certain tones.

    ORNL researchers created an algorithm to measure electrical activations, or signals, across groups of neurons, collecting statistics and making correlations between cell activity in the auditory cortex and tones heard by the mice. To ensure a proper analysis, the team first had to stabilize the video, which was taken while the mice were awake and moving, said Derek Rose, who now leads the work at ORNL.
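
    Frame registration of this kind is commonly done with phase correlation; the sketch below shows that standard technique, which is not necessarily the ORNL team’s exact algorithm.

    ```python
    import numpy as np
    from scipy import ndimage

    def stabilize(frames):
        """frames: (t, h, w) array of a microscopy video. Returns a shift-corrected copy."""
        ref_f = np.fft.fft2(frames[0])
        out = np.empty_like(frames)
        out[0] = frames[0]
        for t in range(1, len(frames)):
            cur_f = np.fft.fft2(frames[t])
            # the normalized cross-power spectrum peaks at the frame-to-frame translation
            cross = ref_f * np.conj(cur_f)
            corr = np.abs(np.fft.ifft2(cross / (np.abs(cross) + 1e-9)))
            dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
            # wrap shifts larger than half the frame into negative offsets
            h, w = corr.shape
            dy = dy - h if dy > h // 2 else dy
            dx = dx - w if dx > w // 2 else dx
            out[t] = ndimage.shift(frames[t], (dy, dx), mode="nearest")
        return out
    ```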

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


     
  • richardmitnick 3:39 pm on May 14, 2017
    Tags: 22-Year-Old Researcher Accidentally Stops Global Cyberattack, Computing, Massive cyberattack thwarted

    From Inverse: “22-Year-Old Researcher Accidentally Stops Global Cyberattack” 

    INVERSE

    May 13, 2017
    Grace Lisa Scott

    And then he blogged about how he did it.

    On Friday, a massive cyberattack spread across 74 countries, infiltrating global companies like FedEx and Nissan, telecommunication networks, and most notably the UK’s National Health Service. It left the NHS temporarily crippled, with test results and patient records becoming unavailable and phones not working.

    The ransomware attack employed a malware called WannaCrypt that encrypts a user’s data and then demands a payment — in this instance $300 worth of bitcoin — to retrieve and unlock said data. The malware is spread through email and exploits a vulnerability in Windows. Microsoft did release a patch that fixes the vulnerability back in March, but any computer without the update would have remained vulnerable.

    The attack was suddenly halted early Friday afternoon (Eastern Standard Time) thanks to a 22-year-old cybersecurity researcher from southwest England. Going by the pseudonym MalwareTech on Twitter, the researcher claimed he accidentally activated the software’s “kill switch” by registering a complicated domain name hidden in the malware.

    After getting home from lunch with a friend and realizing the true severity of the cyberattack, the cybersecurity expert started looking for a weakness within the malware with the help of a few fellow researchers. On Saturday, he detailed how he managed to stop the malware’s spread in a blog post endearingly titled “How to Accidentally Stop a Global Cyber Attacks”.

    “You’ve probably read about the WannaCrypt fiasco on several news sites, but I figured I’d tell my story,” he says.

    MalwareTech had registered the domain as a way to track the spread. “My job is to look for ways we can track and potentially stop botnets (and other kinds of malware), so I’m always on the lookout to pick up unregistered malware control server (C2) domains. In fact I registered several thousand of such domains in the past year,” he says.

    By registering the domain and setting up a sinkhole server, he planned to track the spread of WannaCrypt.

    Fortunately, that didn’t turn out to be necessary, because just by registering the domain, MalwareTech had engaged what was possibly an obscure but intentional kill switch for the ransomware. A peer pointed MalwareTech to fellow researcher Darien Huss, who had just tweeted the discovery.
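
    In sketch form, the kill switch amounted to a domain-resolution check: before spreading, the malware tried to resolve a hard-coded gibberish domain and stood down if the lookup succeeded. The domain below is a placeholder, not the real one.

    ```python
    import socket

    KILL_SWITCH_DOMAIN = "example-gibberish-domain.test"  # placeholder, not the actual domain

    def kill_switch_active():
        try:
            socket.gethostbyname(KILL_SWITCH_DOMAIN)
            return True   # domain resolves: someone registered it, so stand down
        except socket.gaierror:
            return False  # unregistered: in the real attack, spreading continued
    ```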

    The move gave companies and institutions time to patch their systems to avoid infection before the attackers could change the code and get the ransomware going again.

    In an interview with The Guardian Saturday, MalwareTech warned that the attack was probably not over. “The attackers will realize how we stopped it, they’ll change the code and then they’ll start again. Enable windows update, update and then reboot.”

    As for MalwareTech himself, he says he prefers to remain anonymous. “…It just doesn’t make sense to give out my personal information, obviously we’re working against bad guys and they’re not going to be happy about this,” he told The Guardian.

    To get into the nitty-gritty of just why MalwareTech’s sinkhole managed to stop the international ransomware, you can read his full blog post here.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 1:31 pm on April 4, 2017
    Tags: Computing, Tim Berners-Lee wins $1 million Turing Award

    From MIT: “Tim Berners-Lee wins $1 million Turing Award” 

    MIT News

    April 4, 2017
    Adam Conner-Simons

    Tim Berners-Lee was honored with the Turing Award for his work inventing the World Wide Web, the first web browser, and “the fundamental protocols and algorithms [that allowed] the web to scale.” Photo: Henry Thomas

    CSAIL researcher honored for inventing the web and developing the protocols that spurred its global use.

    MIT Professor Tim Berners-Lee, the researcher who invented the World Wide Web and is one of the world’s most influential voices for online privacy and government transparency, has won the most prestigious honor in computer science, the Association for Computing Machinery (ACM) A.M. Turing Award. Often referred to as “the Nobel Prize of computing,” the award comes with a $1 million prize provided by Google.

    In its announcement today, ACM cited Berners-Lee for “inventing the World Wide Web, the first web browser, and the fundamental protocols and algorithms allowing the web to scale.” This year marks the 50th anniversary of the award.

    A principal investigator at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) with a joint appointment in the Department of Electrical Engineering and Computer Science, Berners-Lee conceived of the web in 1989 at the European Organization for Nuclear Research (CERN) as a way to allow scientists around the world to share information with each other on the internet. He introduced a naming scheme (URIs), a communications protocol (HTTP), and a language for creating webpages (HTML). His open-source approach to coding the first browser and server is often credited with helping catalyze the web’s rapid growth.

    “I’m humbled to receive the namesake award of a computing pioneer who showed that what a programmer could do with a computer is limited only by the programmer themselves,” says Berners-Lee, the 3COM Founders Professor of Engineering at MIT. “It is an honor to receive an award like the Turing that has been bestowed to some of the most brilliant minds in the world.”

    Berners-Lee is founder and director of the World Wide Web Consortium (W3C), which sets technical standards for web development, as well as the World Wide Web Foundation, which aims to establish the open web as a public good and a basic right. He also holds a professorship at Oxford University.

    As director of CSAIL’s Decentralized Information Group, Berners-Lee has developed data systems and privacy-minded protocols such as “HTTP with Accountability” (HTTPA), which monitors the transmission of private data and enables people to examine how their information is being used. He also leads Solid (“social linked data”), a project to re-decentralize the web that allows people to control their own data and make it available only to desired applications.

    “Tim Berners-Lee’s career — as brilliant and bold as they come — exemplifies MIT’s passion for using technology to make a better world,” says MIT President L. Rafael Reif. “Today we celebrate the transcendent impact Tim has had on all of our lives, and congratulate him on this wonderful and richly deserved award.”

    While Berners-Lee was initially drawn to programming through his interest in math, there was also a familial connection: His parents met while working on the Ferranti Mark 1, the world’s first commercial general-purpose computer. Years later, he wrote a program called Enquire to track connections between different ideas and projects, indirectly inspiring what later became the web.

    “Tim’s innovative and visionary work has transformed virtually every aspect of our lives, from communications and entertainment to shopping and business,” says CSAIL Director Daniela Rus. “His work has had a profound impact on people across the world, and all of us at CSAIL are so very proud of him for being recognized with the highest honor in computer science.”

    Berners-Lee has received multiple accolades for his technical contributions, from being knighted by Queen Elizabeth to being named one of TIME magazine’s “100 Most Important People of the 20th Century.” He will formally receive the Turing Award during the ACM’s annual banquet June 24 in San Francisco.

    Past Turing Award recipients who have taught at MIT include Michael Stonebraker (2014), Shafi Goldwasser and Silvio Micali (2013), Barbara Liskov (2008), Ronald Rivest (2002), Butler Lampson (1992), Fernando Corbato (1990), John McCarthy (1971) and Marvin Minsky (1969).

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition


     
  • richardmitnick 2:48 pm on January 30, 2017
    Tags: Computing, Dr. Miriam Eisenstein, GSK-3, Modeling of molecules on the computer

    From Weizmann: Women in STEM – “Staff Scientist: Dr. Miriam Eisenstein” 

    Weizmann Institute of Science

    30.01.2017
    No writer credit found

    Name: Dr. Miriam Eisenstein
    Department: Chemical Research Support

    “The modeling of molecules on the computer,” says Dr. Miriam Eisenstein, Head of the Macromolecular Modeling Unit of the Weizmann Institute of Science’s Chemical Research Support Department, “is sometimes the only way to understand exactly how such complex molecules as proteins interact.”

    Eisenstein was one of the first to develop molecular docking methods while working with Prof. Ephraim Katzir – over two decades ago – and she has worked in collaboration with many groups at the Weizmann Institute.

    But even with all her experience, protein interactions can still surprise her. This was the case in a recent collaboration with the lab group of Prof. Hagit Eldar-Finkelman of Tel Aviv University, in research that was hailed as a promising new direction for finding treatments for Alzheimer’s disease. Eldar-Finkelman and her group were investigating an enzyme known as GSK-3, which affects the activity of various proteins by clipping a particular type of chemical tag, known as a phosphate group, onto them. GSK-3 thus performs quite a few crucial functions in the body, but it can also become overactive, and this extra activity has been implicated in a number of diseases, including diabetes and Alzheimer’s.

    The Tel Aviv group, explains Eisenstein, was exploring a new way of blocking, or at least damping down, the activity of this enzyme. GSK-3 uses ATP — a small, phosphate-containing molecule — in the chemical tagging process, transferring one of the ATP phosphate groups to a substrate. The ATP binding site on the enzyme is often targeted with ATP-like drug compounds that, by binding there themselves, prevent the ATP from binding and thus block the enzyme’s activity. But such compounds are not discriminating enough, often blocking related enzymes in the process, which is an undesired side effect. This is why Eldar-Finkelman and her team looked for molecules that would compete with the substrate and occupy its binding cavity, so that the enzyme’s normal substrates cannot attach to GSK-3 and receive their phosphate tags.

    After identifying one molecule – a short piece of protein, or peptide – that substituted for GSK-3’s substrates in experiments, Eldar-Finkelman turned to Eisenstein to design peptides that would be better at competing with the substrate. At first Eisenstein computed model structures of the enzyme with an attached protein substrate and the enzyme with an attached peptide; she then characterized the way in which the enzyme binds either the substrate or the competing peptide. The model structures pinpointed the contacts, and these were verified experimentally by Eldar-Finkelman.

    This led to the next phase, a collaborative effort to introduce alterations to the peptide so as to improve its binding capabilities. One of the new peptides was predicted by Eisenstein to be a good substrate, and Eldar-Finkelman’s experiments showed that it indeed was. Once chemically tagged, the new peptide proved to be excellent at binding to GSK-3 – many times better than the original – and this was the surprise, because normally, once they are tagged, such substrates are repelled from the substrate-binding cavity and end up dissociating from the enzyme. Molecular modeling explained what was happening. After initially binding as a substrate and attaining a phosphate group, the peptide slid within the substrate-binding cavity, changing its conformation in the process, and attached tightly to a position normally occupied by the protein substrate.

    Experiments in Eldar-Finkelman’s group showed that this peptide is also active in vivo and, moreover, was able to reduce the symptoms of an Alzheimer-like condition in mice. The results of this research appeared in Science Signaling.

    “This experiment is a great example of the synergy between biologists and computer modelers,” says Eisenstein. “Hagit understands the function of this enzyme in the body, and she had this great insight on a possible way to control its actions. I am interested in the way that two proteins fit together and influence one another at the molecular and atomic levels, so I can provide the complementary insight.”

    “Molecular modeling is such a useful tool, it has enabled me to work with a great many groups and take part in a lot of interesting, exciting work, over the years,” she adds. “Computers have become much stronger in that time, but the basic, chemical principles of attraction and binding between complex molecules remain the same, and our work is as relevant as ever.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition


    The Weizmann Institute of Science is one of the world’s leading multidisciplinary research institutions. Hundreds of scientists, laboratory technicians and research students working on its lushly landscaped campus embark daily on fascinating journeys into the unknown, seeking to improve our understanding of nature and our place within it.

    Guiding these scientists is the spirit of inquiry so characteristic of the human race. It is this spirit that propelled humans upward along the evolutionary ladder, helping them reach their utmost heights. It prompted humankind to pursue agriculture, learn to build lodgings, invent writing, harness electricity to power emerging technologies, observe distant galaxies, design drugs to combat various diseases, develop new materials and decipher the genetic code embedded in all the plants and animals on Earth.

    The quest to maintain this increasing momentum compels Weizmann Institute scientists to seek out places that have not yet been reached by the human mind. What awaits us in these places? No one has the answer to this question. But one thing is certain – the journey fired by curiosity will lead onward to a better future.

     
  • richardmitnick 4:00 pm on May 8, 2016
    Tags: Computing

    From INVERSE: “What Will Replace Moore’s Law as Technology Advances Beyond the Microchip?” 

    INVERSE

    May 5, 2016
    Adam Toobin

    The mathematics of Moore’s Law has long baffled observers, even as it underlies much of the technological revolution that has transformed the world over the past 50 years. But as chips get smaller, there is renewed speculation that the law will finally be squeezed out.

    In 1965, Intel cofounder Dr. Gordon Moore observed that the number of transistors on a single microchip doubled every two years. The trend has stuck ever since: computers the size of entire rooms now rest in the palm of your hand, at a fraction of the cost.
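
    A back-of-the-envelope check of that doubling, taking the roughly 2,300 transistors of Intel’s 1971 4004 as a starting point (the projection is for illustration only):

    ```python
    def moores_law(year, base_year=1971, base_count=2_300):
        """Projected transistor count assuming a doubling every two years."""
        return base_count * 2 ** ((year - base_year) / 2)

    print(f"{moores_law(2016):,.0f}")
    # ~14 billion: the same order of magnitude as real high-end chips of that year
    ```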

    But with the undergirding technology approaching the size of a single atom, many fear the heyday of the digital revolution is coming to a close, forcing technologists around the world to rethink their business strategies and their notions of computing altogether.

    We have faced the end of Moore’s Law before — in fact, Brian Krzanich, Intel’s chief executive, jokes he has seen the doomsday prediction made no less than four times in his life. But what makes the coming barrier different is that whether we have another five or even ten years of boosting the silicon semiconductors that constitute the core of modern computing, we are going to hit a physical wall sooner rather than later.


    Transistor counts for integrated circuits plotted against their dates of introduction. The curve shows Moore’s law – the doubling of transistor counts every two years. The y-axis is logarithmic, so the line corresponds to exponential growth.

    If Moore’s Law is to survive, it would require a radical innovation, rather than the predictable progress that has sustained chip makers over recent decades.

    And most technology companies in the world are beginning to acknowledge the changing forecast for digital hardware. Semiconductor industry associations of the United States, Europe, Japan, South Korea, and Taiwan will issue only one more report forecasting chip technology growth. Intel’s CEO casts these gloomy predictions as premature and refused to participate in the final report. Krzanich insists Intel has the technical capabilities to keep improving chips while keeping costs low for manufacturers, though few in the industry believe the faltering company will maintain its quixotic course for long.


    Access the mp4 video here.

    The rest of the industry is casting forth to new opportunities. New technologies like graphene (an atomic-scale honeycomb-like web of carbon atoms) and quantum computing offer a unique way out of the physical limitations imposed by silicon semiconductors. Graphene has recently enthralled chipmakers with its affordable carbon base and a configuration that makes it an ideal candidate for faster, though still largely conventional, digital processing.

    The ideal crystalline structure of graphene is a hexagonal grid.

    “As you look at Intel saying the PC industry is slowing and seeing the first signs of slowing in mobile computing, people are starting to look for new places to put semiconductors,” David Kanter, a semiconductor industry analyst at Real World Technologies in San Francisco, told The New York Times.

    Quantum computing, on the other hand, would tap the ambiguity inherent in the universe to change computing forever. The prospect has long intrigued tech companies, and the recent debut of some radical early stage designs have reignited the fervor of quantum’s advocates.

    This image appeared in an IBM promotion that read: “IBM unlocks quantum computing capabilities, lifts limits of innovation.”

    For many years, the end of Moore’s Law was viewed as a kind of apocalypse scenario for the technology industry: What would we do when there was no more room on the chip? Much of what has been forecast about the future of the digital world has been predicated on the notion that we will continue to make the incredible improvements of the past half century.

    It’s perhaps a good sign that technology companies are soberly looking to the future and getting excited about new, promising developments that may yet yield entirely new frontiers.

    Photos via Wgsimon [CC BY-SA 3.0 (http://creativecommons.org/licenses/by-sa/3.0) or GFDL (http://www.gnu.org/copyleft/fdl.html)], via Wikimedia Commons; AlexanderAlUS (Own work) [CC BY-SA 3.0 or GFDL], via Wikimedia Commons; IBM; Jamie Baxter

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     