Tagged: Computing

  • richardmitnick 7:51 am on August 13, 2018 Permalink | Reply
Tags: Computers can’t have needs cravings or desires, Computing

    From aeon: “Robot says: Whatever” 


    From aeon

    8.13.18
    Margaret Boden

    Chief priest Bungen Oi holds a robot AIBO dog prior to its funeral ceremony at the Kofukuji temple in Isumi, Japan, on 26 April 2018. Photo by Nicolas Datiche /AFP/Getty
    https://www.headlines24.nl/nieuwsartikel/135259/201808/robot–whatever

    What stands in the way of all-powerful AI isn’t a lack of smarts: it’s that computers can’t have needs, cravings or desires.

    In Henry James’s intriguing novella The Beast in the Jungle (1903), a young man called John Marcher believes that he is marked out from everyone else in some prodigious way. The problem is that he can’t pinpoint the nature of this difference. Marcher doesn’t even know whether it is good or bad. Halfway through the story, his companion May Bartram – a wealthy, New-England WASP, naturally – realises the answer. But by now she is middle-aged and terminally ill, and doesn’t tell it to him. On the penultimate page, Marcher (and the reader) learns what it is. For all his years of helpfulness and dutiful consideration towards May, detailed at length in the foregoing pages, not even she had ever really mattered to him.

    That no one really mattered to Marcher does indeed mark him out from his fellow humans – but not from artificial intelligence (AI) systems, for which nothing matters. Yes, they can prioritise: one goal can be marked as more important or more urgent than another. In the 1990s, the computer scientists Aaron Sloman and Ian Wright even came up with a computer model of a nursemaid in charge of several unpredictable and demanding babies, in order to illustrate aspects of Sloman’s theory about anxiety in humans who must juggle multiple goals. But this wasn’t real anxiety: the computer couldn’t care less.

See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Since 2012, Aeon has established itself as a unique digital magazine, publishing some of the most profound and provocative thinking on the web. We ask the big questions and find the freshest, most original answers, provided by leading thinkers on science, philosophy, society and the arts.

    Aeon has three channels, and all are completely free to enjoy:

    Essays – Longform explorations of deep issues written by serious and creative thinkers

    Ideas – Short provocations, maintaining Aeon’s high editorial standards but in a more nimble and immediate form. Our Ideas are published under a Creative Commons licence, making them available for republication.

    Video – A mixture of curated short documentaries and original Aeon productions

    Through our Partnership program, we publish pieces from university research groups, university presses and other selected cultural organisations.

    Aeon was founded in London by Paul and Brigid Hains. It now has offices in London, Melbourne and New York. We are a not-for-profit, registered charity operated by Aeon Media Group Ltd. Aeon is endorsed as a Deductible Gift Recipient (DGR) organisation in Australia and, through its affiliate Aeon America, registered as a 501(c)(3) charity in the US.

    We are committed to big ideas, serious enquiry and a humane worldview. That’s it.

     
  • richardmitnick 9:12 am on June 18, 2018 Permalink | Reply
Tags: Computing, Deep Neural Network Training with Analog Memory Devices

    From HPC Wire: “IBM Demonstrates Deep Neural Network Training with Analog Memory Devices” 

    From HPC Wire

    June 18, 2018
    Oliver Peckham

    Crossbar arrays of non-volatile memories can accelerate the training of fully connected neural networks by performing computation at the location of the data. (Source: IBM)

From smarter, more personalized apps to seemingly ubiquitous Google Assistant and Alexa devices, AI adoption is showing no signs of slowing down – and yet the hardware used for AI is far from perfect. Currently, GPUs and other digital accelerators are used to speed the processing of deep neural network (DNN) tasks – but all of those systems effectively waste time and energy shuttling data back and forth between memory and processing. As the scale of AI applications continues to increase, those cumulative losses are becoming massive.

In a paper published this month in Nature by Stefano Ambrogio, Pritish Narayanan, Hsinyu Tsai, Robert M. Shelby, Irem Boybat, Carmelo di Nolfo, Severin Sidler, Massimo Giordano, Martina Bodini, Nathan C. P. Farinha, Benjamin Killeen, Christina Cheng, Yassine Jaoudi, and Geoffrey W. Burr, IBM researchers demonstrate DNN training on analog memory devices that they report achieves accuracy equivalent to a GPU-accelerated system. IBM’s solution performs DNN calculations right where the data are located, storing and adjusting weights in memory, with the effect of conserving energy and improving speed.

    Analog computing, which uses variable signals rather than binary signals, is rarely employed in modern computing due to inherent limits on precision. IBM’s researchers, building on a growing understanding that DNN models operate effectively at lower precision, decided to attempt an accurate approach to analog DNNs.

    The research team says it was able to accelerate key training algorithms, notably the backpropagation algorithm, using analog non-volatile memories (NVM). Writing for the IBM blog, lead author Stefano Ambrogio explains:

    “These memories allow the “multiply-accumulate” operations used throughout these algorithms to be parallelized in the analog domain, at the location of weight data, using underlying physics. Instead of large circuits to multiply and add digital numbers together, we simply pass a small current through a resistor into a wire, and then connect many such wires together to let the currents build up. This lets us perform many calculations at the same time, rather than one after the other. And instead of shipping digital data on long journeys between digital memory chips and processing chips, we can perform all the computation inside the analog memory chip.”
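The quoted description maps directly onto a few lines of linear algebra. Below is a minimal NumPy sketch, not IBM’s code, of the crossbar idea: conductances stand in for weights, applied voltages for activations, and the currents summed on each wire deliver the multiply-accumulate results in one step.

```python
import numpy as np

# A minimal sketch (not IBM's implementation) of the analog multiply-accumulate
# idea: weights are stored as conductances G in a crossbar, inputs are applied
# as voltages V, and Ohm's law (I = G * V) plus Kirchhoff's current law
# (currents summing on a shared wire) yield a vector-matrix product in a
# single step, at the location of the data.

rng = np.random.default_rng(0)
n_inputs, n_outputs = 4, 3
V = rng.uniform(0.0, 1.0, size=n_inputs)               # input voltages (activations)
G = rng.uniform(0.1, 1.0, size=(n_inputs, n_outputs))  # conductances (weights)

# Each output wire accumulates the currents of all its resistors:
# I_j = sum_i V_i * G_ij, i.e. all multiply-accumulates happen at once.
I = V @ G

# The same result computed the digital way, one multiply-add at a time.
I_digital = np.zeros(n_outputs)
for j in range(n_outputs):
    for i in range(n_inputs):
        I_digital[j] += V[i] * G[i, j]

assert np.allclose(I, I_digital)
print(I)
```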

The authors note that their mixed hardware-software approach is able to achieve classification accuracies equivalent to pure software-based training using TensorFlow, despite the imperfections of existing analog memory devices. Writes Ambrogio:

    “By combining long-term storage in phase-change memory (PCM) devices, near-linear update of conventional Complementary Metal-Oxide Semiconductor (CMOS) capacitors and novel techniques for cancelling out device-to-device variability, we finessed these imperfections and achieved software-equivalent DNN accuracies on a variety of different networks. These experiments used a mixed hardware-software approach, combining software simulations of system elements that are easy to model accurately (such as CMOS devices) together with full hardware implementation of the PCM devices. It was essential to use real analog memory devices for every weight in our neural networks, because modeling approaches for such novel devices frequently fail to capture the full range of device-to-device variability they can exhibit.”

Ambrogio and his team believe that their early design efforts indicate that a full implementation of the analog approach “should indeed offer equivalent accuracy, and thus do the same job as a digital accelerator – but faster and at lower power.” The team is exploring the design of prototype NVM-based accelerator chips as part of an IBM Research Frontiers Institute project.

The team estimates that it will be able to deliver chips with a computational energy efficiency of 28,065 GOP/sec/W and a throughput-per-area of 3.6 TOP/sec/mm². According to the researchers, this would be an improvement of two orders of magnitude over today’s GPUs.

The researchers will now turn their attention to demonstrating their approach on larger networks that call for large, fully connected layers, such as recurrently connected Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks with emerging utility for machine translation, captioning and text analytics. As new and better forms of analog memory are developed, they expect continued improvements in areal density and energy efficiency.

See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

HPCwire is the #1 news and information resource covering the fastest computers in the world and the people who run them. With a legacy dating back to 1987, HPCwire has delivered world-class editorial and top-notch journalism, making it the portal of choice for science, technology and business professionals interested in high performance and data-intensive computing. For topics ranging from late-breaking news and emerging technologies in HPC to new trends, expert analysis, and exclusive features, HPCwire delivers it all and remains the HPC community’s most reliable and trusted resource. Don’t miss a thing – subscribe now to HPCwire’s weekly newsletter recapping the previous week’s HPC news, analysis and information at: http://www.hpcwire.com.

     
  • richardmitnick 5:24 pm on April 1, 2018 Permalink | Reply
Tags: Computer searches telescope data for evidence of distant planets, Computing

    From MIT: “Computer searches telescope data for evidence of distant planets” 

MIT News

    March 29, 2018
    Larry Hardesty

    A young sun-like star encircled by its planet-forming disk of gas and dust.
    Image: NASA/JPL-Caltech

    Machine-learning system uses physics principles to augment data from NASA crowdsourcing project.

    As part of an effort to identify distant planets hospitable to life, NASA has established a crowdsourcing project in which volunteers search telescopic images for evidence of debris disks around stars, which are good indicators of exoplanets.

    Using the results of that project, researchers at MIT have now trained a machine-learning system to search for debris disks itself. The scale of the search demands automation: There are nearly 750 million possible light sources in the data accumulated through NASA’s Wide-Field Infrared Survey Explorer (WISE) mission alone.


    In tests, the machine-learning system agreed with human identifications of debris disks 97 percent of the time. The researchers also trained their system to rate debris disks according to their likelihood of containing detectable exoplanets. In a paper describing the new work in the journal Astronomy and Computing, the MIT researchers report that their system identified 367 previously unexamined celestial objects as particularly promising candidates for further study.

    The work represents an unusual approach to machine learning, which has been championed by one of the paper’s coauthors, Victor Pankratius, a principal research scientist at MIT’s Haystack Observatory. Typically, a machine-learning system will comb through a wealth of training data, looking for consistent correlations between features of the data and some label applied by a human analyst — in this case, stars circled by debris disks.

    But Pankratius argues that in the sciences, machine-learning systems would be more useful if they explicitly incorporated a little bit of scientific understanding, to help guide their searches for correlations or identify deviations from the norm that could be of scientific interest.

    “The main vision is to go beyond what A.I. is focusing on today,” Pankratius says. “Today, we’re collecting data, and we’re trying to find features in the data. You end up with billions and billions of features. So what are you doing with them? What you want to know as a scientist is not that the computer tells you that certain pixels are certain features. You want to know ‘Oh, this is a physically relevant thing, and here are the physics parameters of the thing.’”

    Classroom conception

    The new paper grew out of an MIT seminar that Pankratius co-taught with Sara Seager, the Class of 1941 Professor of Earth, Atmospheric, and Planetary Sciences, who is well-known for her exoplanet research. The seminar, Astroinformatics for Exoplanets, introduced students to data science techniques that could be useful for interpreting the flood of data generated by new astronomical instruments. After mastering the techniques, the students were asked to apply them to outstanding astronomical questions.

    For her final project, Tam Nguyen, a graduate student in aeronautics and astronautics, chose the problem of training a machine-learning system to identify debris disks, and the new paper is an outgrowth of that work. Nguyen is first author on the paper, and she’s joined by Seager, Pankratius, and Laura Eckman, an undergraduate majoring in electrical engineering and computer science.

    From the NASA crowdsourcing project, the researchers had the celestial coordinates of the light sources that human volunteers had identified as featuring debris disks. The disks are recognizable as ellipses of light with slightly brighter ellipses at their centers. The researchers also used the raw astronomical data generated by the WISE mission.

    To prepare the data for the machine-learning system, Nguyen carved it up into small chunks, then used standard signal-processing techniques to filter out artifacts caused by the imaging instruments or by ambient light. Next, she identified those chunks with light sources at their centers, and used existing image-segmentation algorithms to remove any additional sources of light. These types of procedures are typical in any computer-vision machine-learning project.
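As a rough illustration of those preprocessing steps, here is a hedged Python sketch using NumPy and SciPy; the chunk size, filter, and thresholds are placeholders, since the paper’s exact pipeline is not reproduced here.

```python
import numpy as np
from scipy import ndimage

# An illustrative sketch of the preprocessing just described; all numbers
# below are placeholders, not the values used in the paper.

def preprocess_chunk(chunk, k=3.0):
    """Filter artifacts, keep the chunk only if a light source sits at its
    center, and mask out any additional light sources."""
    # Suppress hot pixels and instrument artifacts with a small median filter.
    smooth = ndimage.median_filter(chunk, size=3)

    # Segment: pixels brighter than background + k*sigma count as "source".
    background = np.median(smooth)
    mask = smooth > background + k * smooth.std()
    labels, n_sources = ndimage.label(mask)
    if n_sources == 0:
        return None  # no light source in this chunk

    # Require a source at the chunk's center.
    central_label = labels[tuple(s // 2 for s in chunk.shape)]
    if central_label == 0:
        return None

    # Zero out every other source, as the image-segmentation step does.
    return np.where((labels == central_label) | (labels == 0), smooth, background)

# Toy usage: a 64x64 chunk with a central source and an off-center contaminant.
rng = np.random.default_rng(0)
chunk = rng.normal(0.0, 1.0, (64, 64))
chunk[30:34, 30:34] += 20.0  # central source
chunk[5:8, 50:53] += 15.0    # contaminating source
print(preprocess_chunk(chunk) is not None)  # True
```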

    Coded intuitions

    But Nguyen used basic principles of physics to prune the data further. For one thing, she looked at the variation in the intensity of the light emitted by the light sources across four different frequency bands. She also used standard metrics to evaluate the position, symmetry, and scale of the light sources, establishing thresholds for inclusion in her data set.

    In addition to the tagged debris disks from NASA’s crowdsourcing project, the researchers also had a short list of stars that astronomers had identified as probably hosting exoplanets. From that information, their system also inferred characteristics of debris disks that were correlated with the presence of exoplanets, to select the 367 candidates for further study.
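In outline, that final ranking step resembles the following hedged sketch, with hypothetical features and an off-the-shelf scikit-learn classifier standing in for whatever model the team actually used.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Illustrative only: the paper's actual model and feature set are not
# reproduced here. The idea is to learn which debris-disk characteristics
# co-occur with stars already known to host exoplanets, then rank
# unexamined disks by that learned likelihood.

rng = np.random.default_rng(1)

# Hypothetical per-disk features: [brightness variation across the four
# bands, ellipticity, symmetry score, angular scale].
X_known = rng.normal(size=(200, 4))
y_known = rng.integers(0, 2, size=200)  # 1 = star known to host exoplanets

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_known, y_known)

# Rank the unexamined disks; the top of the list becomes the candidate set.
X_unexamined = rng.normal(size=(10000, 4))
scores = clf.predict_proba(X_unexamined)[:, 1]
candidates = np.argsort(scores)[::-1][:367]  # most promising first
print(candidates[:10])
```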

    “Given the scalability challenges with big data, leveraging crowdsourcing and citizen science to develop training data sets for machine-learning classifiers for astronomical observations and associated objects is an innovative way to address challenges not only in astronomy but also several different data-intensive science areas,” says Dan Crichton, who leads the Center for Data Science and Technology at NASA’s Jet Propulsion Laboratory. “The use of the computer-aided discovery pipeline described to automate the extraction, classification, and validation process is going to be helpful for systematizing how these capabilities can be brought together. The paper does a nice job of discussing the effectiveness of this approach as applied to debris disk candidates. The lessons learned are going to be important for generalizing the techniques to other astronomy and different discipline applications.”

    “The Disk Detective science team has been working on its own machine-learning project, and now that this paper is out, we’re going to have to get together and compare notes,” says Marc Kuchner, a senior astrophysicist at NASA’s Goddard Space Flight Center and leader of the crowdsourcing disk-detection project known as Disk Detective. “I’m really glad that Nguyen is looking into this because I really think that this kind of machine-human cooperation is going to be crucial for analyzing the big data sets of the future.”

See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition


    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.


     
  • richardmitnick 1:13 pm on February 19, 2018 Permalink | Reply
Tags: Automating materials design, Computing

    From MIT: “Automating materials design” 

MIT News

    February 2, 2018 [Just showed up in social media.]
    Larry Hardesty

    New software identified five different families of microstructures, each defined by a shared “skeleton” (blue), that optimally traded off three mechanical properties. Courtesy of the researchers.

    With new approach, researchers specify desired properties of a material, and a computer system generates a structure accordingly.

    For decades, materials scientists have taken inspiration from the natural world. They’ll identify a biological material that has some desirable trait — such as the toughness of bones or conch shells — and reverse-engineer it. Then, once they’ve determined the material’s “microstructure,” they’ll try to approximate it in human-made materials.

    Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory have developed a new system that puts the design of microstructures on a much more secure empirical footing. With their system, designers numerically specify the properties they want their materials to have, and the system generates a microstructure that matches the specification.

    The researchers have reported their results in Science Advances. In their paper, they describe using the system to produce microstructures with optimal trade-offs between three different mechanical properties. But according to associate professor of electrical engineering and computer science Wojciech Matusik, whose group developed the new system, the researchers’ approach could be adapted to any combination of properties.

    “We did it for relatively simple mechanical properties, but you can apply it to more complex mechanical properties, or you could apply it to combinations of thermal, mechanical, optical, and electromagnetic properties,” Matusik says. “Basically, this is a completely automated process for discovering optimal structure families for metamaterials.”

    Joining Matusik on the paper are first author Desai Chen, a graduate student in electrical engineering and computer science; and Mélina Skouras and Bo Zhu, both postdocs in Matusik’s group.

    Finding the formula

    The new work builds on research reported last summer, in which the same quartet of researchers generated computer models of microstructures and used simulation software to score them according to measurements of three or four mechanical properties. Each score defines a point in a three- or four-dimensional space, and through a combination of sampling and local exploration, the researchers constructed a cloud of points, each of which corresponded to a specific microstructure.

    Once the cloud was dense enough, the researchers computed a bounding surface that contained it. Points near the surface represented optimal trade-offs between the mechanical properties; for those points, it was impossible to increase the score on one property without lowering the score on another.
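In optimization terms, that bounding surface is a Pareto front. The sketch below illustrates the idea on synthetic scores; the paper’s actual sampling and surface construction are more sophisticated.

```python
import numpy as np

# A minimal sketch of the "bounding surface" idea: from a cloud of property
# scores (one point per simulated microstructure), keep only the points that
# are Pareto-optimal, i.e. no other point scores at least as well on every
# property and strictly better on one. Dimensions and scores are synthetic.

def pareto_front(points):
    """Return a boolean mask marking Pareto-optimal rows (higher = better)."""
    optimal = np.ones(len(points), dtype=bool)
    for i in range(len(points)):
        # Point i is dominated if some other point is >= on all properties
        # and > on at least one.
        dominated = (np.all(points >= points[i], axis=1)
                     & np.any(points > points[i], axis=1)).any()
        optimal[i] = not dominated
    return optimal

rng = np.random.default_rng(0)
scores = rng.uniform(size=(5000, 3))  # three mechanical-property scores
mask = pareto_front(scores)
print(f"{mask.sum()} of {len(scores)} microstructures lie on the trade-off surface")
```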

No image caption or credit.

    That’s where the new paper picks up. First, the researchers used some standard measures to evaluate the geometric similarities of the microstructures corresponding to the points along the boundaries. On the basis of those measures, the researchers’ software clusters together microstructures with similar geometries.

    For every cluster, the software extracts a “skeleton” — a rudimentary shape that all the microstructures share. Then it tries to reproduce each of the microstructures by making fine adjustments to the skeleton and constructing boxes around each of its segments. Both of these operations — modifying the skeleton and determining the size, locations, and orientations of the boxes — are controlled by a manageable number of variables. Essentially, the researchers’ system deduces a mathematical formula for reconstructing each of the microstructures in a cluster.

    Next, the researchers use machine-learning techniques to determine correlations between specific values for the variables in the formulae and the measured properties of the resulting microstructures. This gives the system a rigorous way to translate back and forth between microstructures and their properties.
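A schematic of that translation step is sketched below, with made-up formula variables and properties; the paper’s actual parameterization and model are not shown here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# A schematic of the "translate back and forth" step, with made-up formula
# variables and made-up property scores.

rng = np.random.default_rng(2)

params = rng.uniform(size=(2000, 6))  # hypothetical skeleton/box variables
properties = np.stack([               # hypothetical simulated property scores
    params[:, 0] * params[:, 1],
    np.sin(params[:, 2]) + params[:, 3],
    1.0 - params[:, 4] * params[:, 5],
], axis=1)

# Learn the forward map: formula variables -> measured properties.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(params, properties)

# Going the other way: search sampled variables for a desired property set.
target = np.array([0.4, 1.0, 0.7])
trials = rng.uniform(size=(10000, 6))
best = trials[np.linalg.norm(model.predict(trials) - target, axis=1).argmin()]
print(best)
```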


    On automatic

    Every step in this process, Matusik emphasizes, is completely automated, including the measurement of similarities, the clustering, the skeleton extraction, the formula derivation, and the correlation of geometries and properties. As such, the approach would apply as well to any collection of microstructures evaluated according to any criteria.

    By the same token, Matusik explains, the MIT researchers’ system could be used in conjunction with existing approaches to materials design. Besides taking inspiration from biological materials, he says, researchers will also attempt to design microstructures by hand. But either approach could be used as the starting point for the sort of principled exploration of design possibilities that the researchers’ system affords.

    “You can throw this into the bucket for your sampler,” Matusik says. “So we guarantee that we are at least as good as anything else that has been done before.”

    In the new paper, the researchers do report one aspect of their analysis that was not automated: the identification of the physical mechanisms that determine the microstructures’ properties. Once they had the skeletons of several different families of microstructures, they could determine how those skeletons would respond to physical forces applied at different angles and locations.

    But even this analysis is subject to automation, Chen says. The simulation software that determines the microstructures’ properties can also identify the structural elements that deform most under physical pressure, a good indication that they play an important functional role.

    The work was supported by the U.S. Defense Advanced Research Projects Agency’s Simplifying Complexity in Scientific Discovery program.

See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition


    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.


     
  • richardmitnick 7:06 am on February 6, 2018 Permalink | Reply
Tags: Computing

    From CSIROscope: “Cybersecurity: we can hack it” 


    CSIROscope

    6 February 2018
    Chris Chelvan

No image caption or credit.

    It is estimated that 3,885,567,619 people across the world have access to the internet, roughly 51.7% of the world population. More often than not, the internet is used to benefit society — from connecting opposite sides of the world to making knowledge more accessible. But sometimes, the anonymity provided by the internet creates risks of cyberbullying as well as threats to cyber security.

    Every month, at least 50,000 new cyber threats arise that expose internet users to risk. The National Vulnerability Database (NVD) operated by the National Institute of Standards and Technology suggests that between 500 and 1,000 new vulnerabilities emerge every month, of which at least 25 per cent are critical and pose a risk for significant damage.

    Some of the largest cybersecurity threats emerged just last year. The WannaCry ransomware attack in May 2017 affected more than 300,000 computers across 150 countries causing billions of dollars in damage. Spectre and Meltdown, too, exposed critical cyber vulnerabilities in computers and mobile phones around the world, exposing millions of people to hackers — in fact, Data61 researcher Dr Yuval Yarom from the Trustworthy Systems Group was one of the contributors whose research uncovered the Spectre issue.

Not only are cyber threats increasing, they’re also evolving. First the focus was on attacking technology: hacking, malware, and remote access. Then the focus shifted to attacking humans with phishing, social engineering and ransomware, like WannaCry. Now cyber attacks are more sophisticated than ever and even harder to detect.

And yet, given all these threats, Australia has next to no cyber security specialists. The Australian Cyber Security Growth Network has said the demand for skills in the sector far outstrips supply. A recent Government report estimated Australia would need another 11,000 cyber security specialists over the next decade.

    It’s against this diverse backdrop of new and constantly changing threats that we celebrate Safer Internet Day and call on our future generation of science, technology, engineering and mathematics (STEM) leaders to fill the glaring shortage of cybersecurity professionals in Australia.

    Not only are we short of information security professionals now, but data show that by 2022 we’ll be short up to 1.8 million positions. This is particularly urgent in Australia, where women make up just one in three students studying STEM — a proportion that needs to rise to meet the country’s growing cyber security needs.

Introducing STEM Professionals in Schools, our education program that shows young women how they can make an impact in Australia and across the world. STEM Professionals in Schools is Australia’s leading STEM education volunteering program, bringing real-world STEM into the classroom to inspire students and teachers.

    Our Data61 CEO, Adrian Turner, visited Melbourne Girls College to talk about safer internet usage and the importance of STEM.

    “These students are our future innovators, scientists and engineers,” Mr Turner said.

    “It’s essential to equip them with the skills they need in school, and to capture their interest in cybersecurity and why it matters now and in the future so they can see how much of a crucial role it is and will continue to play in Australia’s data-driven future and digital economy.”

    A rewarding career in STEM can take on many forms, too. Data61’s STEM graduates have worked in various roles and research projects, spanning everything from machine learning and robotics to analytics and of course — cybersecurity.

See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition



    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

     
  • richardmitnick 12:10 pm on November 29, 2017 Permalink | Reply
Tags: Bridging gaps in high-performance computing languages, Computing, Generative programming, Programming languages, Tiark Rompf

    From ASCRDiscovery: “Language barrier” 

    ASCRDiscovery
    Advancing Science Through Computing

    November 2017
    No writer credit

    A Purdue University professor is using a DOE early-career award to bridge gaps in high-performance computing languages.

    Detail from an artwork made through generative programming. Purdue University’s Tiark Rompf is investigating programs that create new ones to bring legacy software up to speed in the era of exascale computing. Painting courtesy of Freddbomba via Wikimedia Commons.

    A Purdue University assistant professor of computer science leads a group effort to find new and better ways to generate high-performance computing codes that run efficiently on as many different kinds of supercomputer architectures as possible.

    That’s the challenging goal Tiark Rompf has set for himself with his recent Department of Energy Early Career Research Program award – to develop what he calls “program generators” for exascale architectures and beyond.

    “Programming supercomputers is hard,” Rompf says. Coders typically write software in so-called general-purpose languages. The languages are low-level, meaning “specialized to a given machine architecture. So when a machine is upgraded or replaced, one has to rewrite most of the software.”

    As an alternative to this rewriting, which involves tediously translating low-level code from one supercomputer platform into another, programmers would prefer to use high-level languages “written in a way that feels natural” to them, Rompf says, and “closer to the way a programmer thinks about the computation.”

    But high-level and low-level languages are far apart, with a steel wall of differences between the ways the two types of languages are written, interpreted and executed. In particular, high-level languages rarely perform as well as desired. Executing them requires special so-called smart compilers that must use highly specialized analysis to figure out what the program “really means and how to match it to machine instructions.”

    Tiark Rompf. Photo courtesy of Purdue University.

Rompf and his group propose avoiding that with something called generative programming, which he has worked on since before he received his 2012 Ph.D. from École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland. The idea is to create special programs structured so they’re able to make additional programs where needed.

    In a 2015 paper, Rompf and research colleagues at EPFL, Stanford University and ETH Zurich also called for a radical reassessment of high-level languages. “We really need to think about how to design programming languages and (software) libraries that embrace this generative programming idea,” he adds.

    Program generators “are attractive because they can automate the process of producing very efficient code,” he says. But building them “has also been very hard, and therefore only a few exist today. We’re planning to build the necessary infrastructure to make it an order of magnitude easier.”
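A toy example conveys the generative idea, though it is far simpler than LMS: a small program that writes and compiles a specialized program on demand.

```python
# A toy illustration (far simpler than LMS, but in the same spirit) of a
# program generator: rather than hand-writing a specialized kernel for every
# vector length, one generator emits fully unrolled code on demand.

def make_dot_product(n):
    """Generate and compile a dot product specialized to length-n vectors."""
    body = " + ".join(f"a[{i}] * b[{i}]" for i in range(n))  # unrolled loop
    src = f"def dot(a, b):\n    return {body}\n"
    namespace = {}
    exec(src, namespace)  # "compile" the generated program
    return namespace["dot"], src

dot4, source = make_dot_product(4)
print(source)                              # the generated specialized code
print(dot4([1, 2, 3, 4], [5, 6, 7, 8]))   # -> 70
```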

    As he noted in his early-career award proposal, progress building program generators is extremely difficult for more reasons than just programmer-computer disharmony. Other obstacles include compiler limitations, differing capabilities of supercomputer processors, the changing ways data are stored and the ways software libraries are accessed. Rompf plans to use his five-year, $750,000 award to evaluate generative programming as a way around some of those roadblocks.

    One idea, for instance, is to identify and create an extensible stack of intermediate languages that could serve as transitional steps when high-level codes must be translated into machine code. These also are described as “domain-specific languages” or DSLs, as they encode more knowledge about the application subject than general-purpose languages.

    Eventually, programmers hope to entirely phase out legacy languages such as C and Fortran, substituting only high-level languages and DSLs. Rompf points out that legacy codes can be decades older than the processors they run on, and some have been heavily adapted to run on new generations of machines, an investment that can make legacy codes difficult to jettison.

    __________________________________________________
    Rompf started Project Lancet to integrate generative approaches into a virtual machine for high-level languages.
    __________________________________________________

    Generative programming was the basis for Rompf’s doctoral research. It was described as an approach called Lightweight Modular Staging, or LMS, in a 2010 paper he wrote with his EPFL Ph.D. advisor, Martin Odersky. That’s “a software platform that provides capabilities for other programmers to develop software in a generative style,” Rompf says.

    LMS also underpins Delite, a software framework Rompf later developed in collaboration with a Stanford University group to build DSLs targeting parallel processing in supercomputer architectures – “very important for the work I’m planning to do,” he says.

    While working at Oracle Labs between 2012 and 2014, Rompf started Project Lancet to integrate generative approaches into a virtual machine for high-level languages. Virtual machines are code that can induce real computers to run selected programs. In the case of Lancet, software executes high-level languages and then performs selective compilations in machine code.

    Born and raised in Germany, Rompf joined Purdue in the fall of 2014. It’s “a great environment for doing this kind of research,” he says. “We have lots of good students in compilers, high-performance and databases. We’ve been hiring many new assistant professors. There are lots of young people who all want to accomplish things.”

    He calls his DOE Early Career award a great honor. “I think there are many opportunities for future work in getting more of the DOE community in the interaction.” Although he is the project’s only principal investigator, he is collaborating with other groups at Purdue, ETH Zurich and Stanford and has received recent and related National Science Foundation research grants.

    As a busy assistant professor, he has six graduate students on track to get their doctorates, plus a varying number of undergraduate assistants. Rompf also is a member of the Purdue Research on Programming Languages group (PurPL), with 10 faculty members and their students.

    “It’s a very vibrant group, which like the Purdue computer science department has been growing a lot in recent years,” he says.

    Now in its eighth year, the DOE Office of Science’s Early Career Research Program for researchers in universities and DOE national laboratories supports the development of individual research programs of outstanding scientists early in their careers and stimulates research careers in the disciplines supported by the Office of Science. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    ASCRDiscovery is a publication of The U.S. Department of Energy

     
  • richardmitnick 2:04 pm on October 13, 2017 Permalink | Reply
Tags: Computing

    From BNL: “Scientists Use Machine Learning to Translate ‘Hidden’ Information that Reveals Chemistry in Action” 

    Brookhaven Lab

    October 10, 2017
    Karen McNulty Walsh
    kmcnulty@bnl.gov
    (631) 344-8350

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    New method allows on-the-fly analysis of how catalysts change during reactions, providing crucial information for improving performance.

    A sketch of the new method that enables fast, “on-the-fly” determination of three-dimensional structure of nanocatalysts. The neural network converts the x-ray absorption spectra into geometric information (such as nanoparticle sizes and shapes) and the structural models are obtained for each spectrum. No image credit.

    Chemistry is a complex dance of atoms. Subtle shifts in position and shuffles of electrons break and remake chemical bonds as participants change partners. Catalysts are like molecular matchmakers that make it easier for sometimes-reluctant partners to interact.

    Now scientists have a way to capture the details of chemistry choreography as it happens. The method—which relies on computers that have learned to recognize hidden signs of the steps—should help them improve the performance of catalysts to drive reactions toward desired products faster.

    The method—developed by an interdisciplinary team of chemists, computational scientists, and physicists at the U.S. Department of Energy’s Brookhaven National Laboratory and Stony Brook University—is described in a new paper published in the Journal of Physical Chemistry Letters. The paper demonstrates how the team used neural networks and machine learning to teach computers to decode previously inaccessible information from x-ray data, and then used that data to decipher 3D nanoscale structures.

    Decoding nanoscale structures

    “The main challenge in developing catalysts is knowing how they work—so we can design better ones rationally, not by trial-and-error,” said Anatoly Frenkel, leader of the research team who has a joint appointment with Brookhaven Lab’s Chemistry Division and Stony Brook University’s Materials Science Department. “The explanation for how catalysts work is at the level of atoms and very precise measurements of distances between them, which can change as they react. Therefore it is not so important to know the catalysts’ architecture when they are made but more important to follow that as they react.”

    Anatoly Frenkel (standing) with co-authors (l to r) Deyu Lu, Yuewei Lin, and Janis Timoshenko. No image credit.

    Trouble is, important reactions—those that create important industrial chemicals such as fertilizers—often take place at high temperatures and under pressure, which complicates measurement techniques. For example, x-rays can reveal some atomic-level structures by causing atoms that absorb their energy to emit electronic waves. As those waves interact with nearby atoms, they reveal their positions in a way that’s similar to how distortions in ripples on the surface of a pond can reveal the presence of rocks. But the ripple pattern gets more complicated and smeared when high heat and pressure introduce disorder into the structure, thus blurring the information the waves can reveal.

    So instead of relying on the “ripple pattern” of the x-ray absorption spectrum, Frenkel’s group figured out a way to look into a different part of the spectrum associated with low-energy waves that are less affected by heat and disorder.

    “We realized that this part of the x-ray absorption signal contains all the needed information about the environment around the absorbing atoms,” said Janis Timoshenko, a postdoctoral fellow working with Frenkel at Stony Brook and lead author on the paper. “But this information is hidden ‘below the surface’ in the sense that we don’t have an equation to describe it, so it is much harder to interpret. We needed to decode that spectrum but we didn’t have a key.”

    Fortunately Yuewei Lin and Shinjae Yoo of Brookhaven’s Computational Science Initiative and Deyu Lu of the Center for Functional Nanomaterials (CFN) had significant experience with so-called machine learning methods. They helped the team develop a key by teaching computers to find the connections between hidden features of the absorption spectrum and structural details of the catalysts.

    “Janis took these ideas and really ran with them,” Frenkel said.

    The team used theoretical modeling to produce simulated spectra of several hundred thousand model structures, and used those to train the computer to recognize the features of the spectrum and how they correlated with the structure.

    “Then we built a neural network that was able to convert the spectrum into structures,” Frenkel said.
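In outline, the training workflow resembles the hedged sketch below; the forward model, structural descriptors, and network size are synthetic stand-ins, not the team’s actual setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# A schematic stand-in for the workflow described above: simulate spectra
# for many model structures, train a network to invert spectrum -> structure,
# then apply it to new spectra. Everything here is a synthetic placeholder.

rng = np.random.default_rng(3)
n_train, n_energies = 5000, 100
energy = np.linspace(0.0, 1.0, n_energies)

def simulate_spectrum(d):
    """Toy forward model: a descriptor-dependent absorption curve plus noise."""
    return d[0] * np.exp(-energy * d[1]) + d[2] * energy + rng.normal(0, 0.01, n_energies)

descriptors = rng.uniform(1.0, 5.0, size=(n_train, 3))  # e.g. size/shape parameters
spectra = np.array([simulate_spectrum(d) for d in descriptors])

net = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=500, random_state=0)
net.fit(spectra, descriptors)

# "On the fly": converting a new measured spectrum is a single forward pass.
new_spectrum = simulate_spectrum(np.array([2.0, 3.0, 1.5]))
print(net.predict(new_spectrum.reshape(1, -1)))
```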

When they tested whether the method could decipher the shapes and sizes of well-defined platinum nanoparticles (using x-ray absorption spectra previously published by Frenkel and his collaborators), it did.

    “This method can now be used on the fly,” Frenkel said. “Once the network is constructed it takes almost no time for the structure to be obtained in any real experiment.”

    That means scientists studying catalysts at Brookhaven’s National Synchrotron Light Source II (NSLS-II), for example, could obtain real-time structural information to decipher why a particular reaction slows down, or starts producing an unwanted product—and then tweak the reaction conditions or catalyst chemistry to achieve desired results. This would be a big improvement over waiting to analyze results after completing the experiments and then figuring out what went wrong.

    In addition, this technique can process and analyze spectral signals from very low-concentration samples, and will be particularly useful at new high flux and high-energy-resolution beamlines incorporating special optics and high-throughput analysis techniques at NSLS-II.

    “This will offer completely new methods of using synchrotrons for operando research,” Frenkel said.

    This work was funded by the DOE Office of Science (BES) and by Brookhaven’s Laboratory Directed Research and Development program. Previously published spectra for the model nanoparticles used to validate the neural network were collected at the Advanced Photon Source (APS) at DOE’s Argonne National Laboratory and the original National Synchrotron Light Source (NSLS) at Brookhaven Lab, now replaced by NSLS-II. CFN, NSLS-II, and APS are DOE Office of Science User Facilities. In addition to Frenkel and Timoshenko, Lu and Lin are co-authors on the paper.

See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 9:32 am on July 7, 2017 Permalink | Reply
Tags: Computing, St. Jude Children’s Research Hospital

    From ORNL: “ORNL researchers apply imaging, computational expertise to St. Jude research” 


    Oak Ridge National Laboratory

    July 6, 2017
    Stephanie G. Seay
    seaysg@ornl.gov
    865.576.9894

Left to right: ORNL’s Derek Rose, Matthew Eicholtz, Philip Bingham, Ryan Kerekes, and Shaun Gleason.

Measuring migrating neurons in a developing mouse brain.

Identifying and analyzing neurons in a mouse auditory cortex.
No image credits for the above images.

    In the quest to better understand and cure childhood diseases, scientists at St. Jude Children’s Research Hospital accumulate enormous amounts of data from powerful video microscopes. To help St. Jude scientists mine that trove of data, researchers at Oak Ridge National Laboratory have created custom algorithms that can provide a deeper understanding of the images and quicken the pace of research.

    The work resides in St. Jude’s Department of Developmental Neurobiology in Memphis, Tennessee, where scientists use advanced microscopy to capture the details of phenomena such as nerve cell growth and migration in the brains of mice. ORNL researchers take those videos and leverage their expertise in image processing, computational science, and machine learning to analyze the footage and create statistics.

A recent Science article details St. Jude research on brain plasticity, or the ability of the brain to change and form new connections between neurons. In this work, ORNL helped track electrical activity in mouse brain cells in the auditory cortex as the animals were exposed to certain tones.

ORNL researchers created an algorithm to measure electrical activations, or signals, across groups of neurons, collecting statistics and making correlations between cell activity in the auditory cortex and tones heard by the mice. Because the footage was captured while the mice were awake and moving, the team first had to stabilize the video to ensure a proper analysis, said Derek Rose, who now leads the work at ORNL.
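For a flavor of the measurement, here is a minimal sketch, not ORNL’s algorithm, that extracts one neuron’s fluorescence trace from stabilized video and checks it against the tone times; every number in it is illustrative.

```python
import numpy as np

# Not ORNL's algorithm, just a minimal sketch of the kind of measurement it
# performs: given motion-stabilized video and a mask per neuron, extract the
# neuron's fluorescence trace, convert it to dF/F, and check whether
# activations coincide with the presented tones. All numbers are illustrative.

rng = np.random.default_rng(4)
video = rng.normal(100.0, 1.0, size=(600, 64, 64))  # frames x height x width
video[200:205, 10:14, 10:14] += 25.0                # a neuron firing at frame 200

neuron_mask = np.zeros((64, 64), dtype=bool)
neuron_mask[10:14, 10:14] = True                    # this neuron's pixels

trace = video[:, neuron_mask].mean(axis=1)          # mean ROI intensity per frame
baseline = np.percentile(trace, 20)                 # crude resting fluorescence
dff = (trace - baseline) / baseline                 # dF/F activity signal

tone_frames = [200, 400]                            # frames when tones played
activations = dff > 5 * dff.std()
for t in tone_frames:
    print(f"tone at frame {t}: response = {activations[t:t + 10].any()}")
```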

See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


     
  • richardmitnick 3:39 pm on May 14, 2017 Permalink | Reply
Tags: 22-Year-Old Researcher Accidentally Stops Global Cyberattack, Computing, Massive cyberattack thwarted

    From Inverse: “22-Year-Old Researcher Accidentally Stops Global Cyberattack” 

INVERSE

    May 13, 2017
    Grace Lisa Scott

    And then he blogged about how he did it.

    On Friday, a massive cyberattack spread across 74 countries, infiltrating global companies like FedEx and Nissan, telecommunication networks, and most notably the UK’s National Health Service. It left the NHS temporarily crippled, with test results and patient records becoming unavailable and phones not working.

The ransomware attack employed a malware called WannaCrypt that encrypts a user’s data and then demands a payment — in this instance $300 worth of bitcoin — to retrieve and unlock said data. The malware is spread through email and exploits a vulnerability in Windows. Microsoft did release a patch that fixes the vulnerability back in March, but any computer without the update would have remained vulnerable.

    The attack was suddenly halted early Friday afternoon (Eastern Standard Time) thanks to a 22-year-old cybersecurity researcher from southwest England. Going by the pseudonym MalwareTech on Twitter, the researcher claimed he accidentally activated the software’s “kill switch” by registering a complicated domain name hidden in the malware.
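As publicly described, the mechanism reduces to a domain-resolution check. Here is a schematic reconstruction in Python, with a placeholder domain rather than the real one.

```python
import socket

# A schematic reconstruction of the kill-switch logic as publicly described:
# the malware tried to resolve a hard-coded, gibberish domain and halted if
# the lookup succeeded, so registering that domain flipped the switch for
# every new infection. The domain below is a placeholder, not the real one.

KILL_SWITCH_DOMAIN = "placeholder-not-the-real-domain.example"

def kill_switch_engaged(domain: str) -> bool:
    """Return True if the domain resolves, i.e. someone has registered it."""
    try:
        socket.gethostbyname(domain)
        return True   # domain is live: the malware halts instead of spreading
    except socket.gaierror:
        return False  # unregistered: the malware would have carried on

print(kill_switch_engaged(KILL_SWITCH_DOMAIN))
```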

After getting home from lunch with a friend and realizing the true severity of the cyberattack, the cybersecurity expert started looking for a weakness within the malware with the help of a few fellow researchers. On Saturday, he detailed how he managed to stop the malware’s spread in a blog post endearingly titled “How to Accidentally Stop a Global Cyber Attacks”.

    “You’ve probably read about the WannaCrypt fiasco on several news sites, but I figured I’d tell my story,” he says.

    MalwareTech had registered the domain as a way to track the spread. “My job is to look for ways we can track and potentially stop botnets (and other kinds of malware), so I’m always on the lookout to pick up unregistered malware control server (C2) domains. In fact I registered several thousand of such domains in the past year,” he says.

By registering the domain and setting up a sinkhole server, he planned to track the WannaCrypt spread.

Fortunately, that didn’t turn out to be necessary: just by registering the domain, MalwareTech had engaged what was possibly an obscure but intentional kill switch for the ransomware. A peer linked MalwareTech to a tweet by fellow researcher Darien Huss, who had just tweeted the discovery.

    The move gave companies and institutions time to patch their systems to avoid infection before the attackers could change the code and get the ransomware going again.

    In an interview with The Guardian Saturday, MalwareTech warned that the attack was probably not over. “The attackers will realize how we stopped it, they’ll change the code and then they’ll start again. Enable windows update, update and then reboot.”

    As for MalwareTech himself, he says he prefers to remain anonymous. “…It just doesn’t make sense to give out my personal information, obviously we’re working against bad guys and they’re not going to be happy about this,” he told the Guardian.

To get into the nitty-gritty of just why MalwareTech’s sinkhole managed to stop the international ransomware, you can read his full blog post here.

See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 1:31 pm on April 4, 2017 Permalink | Reply
Tags: Computing, Tim Berners-Lee wins $1 million Turing Award

    From MIT: “Tim Berners-Lee wins $1 million Turing Award” 

MIT News

    April 4, 2017
    Adam Conner-Simons

    Tim Berners-Lee was honored with the Turing Award for his work inventing the World Wide Web, the first web browser, and “the fundamental protocols and algorithms [that allowed] the web to scale.” Photo: Henry Thomas

    CSAIL researcher honored for inventing the web and developing the protocols that spurred its global use.

    MIT Professor Tim Berners-Lee, the researcher who invented the World Wide Web and is one of the world’s most influential voices for online privacy and government transparency, has won the most prestigious honor in computer science, the Association for Computing Machinery (ACM) A.M. Turing Award. Often referred to as “the Nobel Prize of computing,” the award comes with a $1 million prize provided by Google.

    In its announcement today, ACM cited Berners-Lee for “inventing the World Wide Web, the first web browser, and the fundamental protocols and algorithms allowing the web to scale.” This year marks the 50th anniversary of the award.

A principal investigator at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) with a joint appointment in the Department of Electrical Engineering and Computer Science, Berners-Lee conceived of the web in 1989 at the European Organization for Nuclear Research (CERN) as a way to allow scientists around the world to share information with each other on the internet. He introduced a naming scheme (URIs), a communications protocol (HTTP), and a language for creating webpages (HTML). His open-source approach to coding the first browser and server is often credited with helping catalyze the web’s rapid growth.
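Those three inventions still interlock the same way today: the URI names a page, HTTP fetches it, and the reply is HTML for the browser to render. A minimal illustrative exchange over a raw socket, using HTTP/1.0 for simplicity:

```python
import socket

# A minimal illustration of the three pieces working together: a URI names
# the resource, HTTP fetches it, and the reply is HTML for a browser to
# render. Raw HTTP/1.0 over a socket, for simplicity.

host, path = "example.com", "/"  # from the URI http://example.com/

with socket.create_connection((host, 80)) as sock:
    request = f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n"
    sock.sendall(request.encode("ascii"))
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

headers, _, html = response.partition(b"\r\n\r\n")
print(headers.decode("ascii", "replace").splitlines()[0])  # e.g. HTTP/1.0 200 OK
print(html[:60])                                           # start of the HTML page
```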

    “I’m humbled to receive the namesake award of a computing pioneer who showed that what a programmer could do with a computer is limited only by the programmer themselves,” says Berners-Lee, the 3COM Founders Professor of Engineering at MIT. “It is an honor to receive an award like the Turing that has been bestowed to some of the most brilliant minds in the world.”

    Berners-Lee is founder and director of the World Wide Web Consortium (W3C), which sets technical standards for web development, as well as the World Wide Web Foundation, which aims to establish the open web as a public good and a basic right. He also holds a professorship at Oxford University.

    As director of CSAIL’s Decentralized Information Group, Berners-Lee has developed data systems and privacy-minded protocols such as “HTTP with Accountability” (HTTPA), which monitors the transmission of private data and enables people to examine how their information is being used. He also leads Solid (“social linked data”), a project to re-decentralize the web that allows people to control their own data and make it available only to desired applications.

    “Tim Berners-Lee’s career — as brilliant and bold as they come — exemplifies MIT’s passion for using technology to make a better world,” says MIT President L. Rafael Reif. “Today we celebrate the transcendent impact Tim has had on all of our lives, and congratulate him on this wonderful and richly deserved award.”

    While Berners-Lee was initially drawn to programming through his interest in math, there was also a familial connection: His parents met while working on the Ferranti Mark 1, the world’s first commercial general-purpose computer. Years later, he wrote a program called Enquire to track connections between different ideas and projects, indirectly inspiring what later became the web.

“Tim’s innovative and visionary work has transformed virtually every aspect of our lives, from communications and entertainment to shopping and business,” says CSAIL Director Daniela Rus. “His work has had a profound impact on people across the world, and all of us at CSAIL are so very proud of him for being recognized with the highest honor in computer science.”

    Berners-Lee has received multiple accolades for his technical contributions, from being knighted by Queen Elizabeth to being named one of TIME magazine’s “100 Most Important People of the 20th Century.” He will formally receive the Turing Award during the ACM’s annual banquet June 24 in San Francisco.

    Past Turing Award recipients who have taught at MIT include Michael Stonebraker (2014), Shafi Goldwasser and Silvio Micali (2013), Barbara Liskov (2008), Ronald Rivest (2002), Butler Lampson (1992), Fernando Corbato (1990), John McCarthy (1971) and Marvin Minsky (1969).

See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition


    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.


     