Tagged: Computer Science

  • richardmitnick 9:11 am on March 31, 2017 Permalink | Reply
    Tags: Computer Science, Wei Xu

    From BNL: Women in STEM “Visualizing Scientific Big Data in Informative and Interactive Ways” Wei Xu 

    Brookhaven Lab

    March 31, 2017
    Ariana Tantillo

    Brookhaven Lab computer scientist Wei Xu develops visualization tools for analyzing large and varied datasets.

    Wei Xu, a computer scientist who is part of Brookhaven Lab’s Computational Science Initiative, helps scientists analyze large and varied datasets by developing visualization tools, such as the color-mapping tool seen projected from her laptop onto the large screen.

    Humans are visual creatures: our brain processes images 60,000 times faster than text, and 90 percent of information sent to the brain is visual. Visualization is becoming increasingly useful in the era of big data, in which we are generating so much data at such high rates that we cannot keep up with making sense of it all. In particular, visual analytics—a research discipline that combines automated data analysis with interactive visualizations—has emerged as a promising approach to dealing with this information overload.

    “Visual analytics provides a bridge between advanced computational capabilities and human knowledge and judgment,” said Wei Xu, a computer scientist in the Computational Science Initiative (CSI) at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory and a research assistant professor in the Department of Computer Science at Stony Brook University. “The interactive visual representations and interfaces enable users to efficiently explore and gain insights from massive datasets.”

    At Brookhaven, Xu has been leading the development of several visual analytics tools to facilitate the scientific decision-making and discovery process. She works closely with Brookhaven scientists, particularly those at the National Synchrotron Light Source II (NSLS-II) and the Center for Functional Nanomaterials (CFN)—both DOE Office of Science User Facilities.


    By talking to researchers early on, Xu learns about their data analysis challenges and requirements. She continues the conversation throughout the development process, demoing initial prototypes and making refinements based on their feedback. She also does her own research and proposes innovative visual analytics methods to the scientists.

    Recently, Xu has been collaborating with the Visual Analytics and Imaging (VAI) Lab at Stony Brook University—her alma mater, where she completed doctoral work in computed tomography with graphics processing unit (GPU)-accelerated computing.

    Though Xu continued work in these and related fields when she first joined Brookhaven Lab in 2013, she switched her focus to visualization by the end of 2015.

    “I realized how important visualization is to the big data era,” Xu said. “The visualization domain, especially information visualization, is flourishing, and I knew there would be lots of research directions to pursue because we are dealing with an unsolved problem: how can we most efficiently and effectively understand the data? That is a quite interesting problem not only in the scientific world but also in general.”

    It was at this time that Xu was awarded a grant for a visualization project proposal she submitted to DOE’s Laboratory Directed Research and Development program, which funds innovative and creative research in areas of importance to the nation’s energy security. At the same time, Klaus Mueller—Xu’s PhD advisor at Stony Brook and director of the VAI Lab—was seeking to extend his research to a broader domain. Xu thought it would be a great opportunity to collaborate: she would present the visualization problem that originated from scientific experiments and potential approaches to solve it, and, in turn, doctoral students in Mueller’s lab would work with her and their professor to come up with cutting-edge solutions.

    This Brookhaven-Stony Brook collaboration first led to the development of an automated method for mapping data involving multiple variables to color. Variables with a similar distribution of data points have similar colors. Users can manipulate the color maps, for example, enhancing the contrast to view the data in more detail. According to Xu, these maps would be helpful for any image dataset involving multiple variables.
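The idea can be sketched in a few lines of Python. This is not the published algorithm, only a minimal illustration of pseudo-coloring a multivariable image: each variable gets its own hue, and every pixel blends those hues according to the variables' relative intensities. The function name and the evenly spaced hue assignment are my own assumptions.

```python
import colorsys
import numpy as np

def pseudo_color(channels):
    """Blend per-variable hues into a single RGB image.

    channels: array of shape (n_vars, H, W) with non-negative
    intensities, one 2-D map per variable.
    Returns an (H, W, 3) RGB image with values in [0, 1].
    """
    n, h, w = channels.shape
    # Assign each variable an evenly spaced hue on the color wheel.
    base = np.array([colorsys.hsv_to_rgb(i / n, 1.0, 1.0) for i in range(n)])
    total = channels.sum(axis=0, keepdims=True)
    weights = channels / np.maximum(total, 1e-12)   # relative ratios per pixel
    # Weighted blend: a pixel dominated by one variable takes on its hue.
    rgb = np.tensordot(weights, base, axes=([0], [0]))  # (H, W, 3)
    return np.clip(rgb, 0.0, 1.0)
```

A pixel where only the first variable is present comes out in that variable's pure hue; mixed pixels land between the hues, which is what makes co-located elements visually separable.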

    The color-mapping tool was used to visualize a multivariable fluorescence dataset from the Hard X-ray Nanoprobe (HXN) beamline at Brookhaven’s National Synchrotron Light Source II. The color map (a) shows how the different variables—the chemical elements cerium (Ce), cobalt (Co), iron (Fe), and gadolinium (Gd)—are distributed in a sample of an electrolyte material used in solid oxide fuel cells. The fluorescence spectrum of the selected data point (the circle indicated by the overlaid white arrows) is shown by the colored bars, with their height representing the relative elemental ratios. The fluorescence image (b), pseudo-colored based on the color map in (a), represents a joint colorization of the individual images in (d), whose colors are based on the four points at the circle boundary (a) for each of the four elements. The arrow indicates where new chemical phases can exist—something hard to detect when observing the individual plots (d). Enhancing the color contrast—for example, of the rectangular region in (b)—enables a more detailed view, in this case providing better contrast between Fe (red) and Co (green) in image (c).

    “Different imaging modalities—such as fluorescence, differential phase contrasts, x-ray scattering, and tomography—would benefit from this technique, especially when integrating the results of these modalities,” she said. “Even subtle differences that are hard to identify in separate image displays, such as differences in elemental ratios, can be picked up with our tool—a capability essential for new scientific discovery.” Xu is currently working to install the color-mapping tool at NSLS-II beamlines, with advanced features to be added gradually.

    In conjunction with CFN scientists, the team is also developing a multilevel display for exploring large image sets. When scientists scan a sample, they generate one scattering image at each point within the sample, known as the raw image level. They can zoom in on this image to check the individual pixel values (the pixel level). For each raw image, scientific analysis tools are used to generate a series of attributes that represent the analyzed properties of the sample (the attribute level), with a scatterplot showing a pseudo-color map of any user-chosen attribute from the series—for example, the sample’s temperature or density. In the past, scientists had to hop between multiple plots to view these different levels. The interactive display under development will enable scientists to see all of these levels in a single view, making it easier to identify how the raw data are related and to analyze data across the entire scanned sample. Users will be able to zoom in and out on different levels of interest, similar to how Google Maps works.
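A toy sketch of the Google-Maps-like behavior, not the actual tool: a continuous zoom factor selects one of the six detail levels described in the text. The log-scaled mapping, the maximum zoom, and the level names are all assumptions of mine.

```python
import math

# Hypothetical names for the six levels described in the text.
LEVELS = [
    "scatterplot",       # 0: overview of all scan points
    "attribute",         # 1: pseudo-colored attribute map
    "attribute-zoom",    # 2: zoomed-in attribute view
    "raw-image",         # 3: one scattering image per scan point
    "raw-image-zoom",    # 4: zoomed-in raw image
    "pixel",             # 5: individual pixel values
]

def level_for_zoom(zoom, max_zoom=64.0):
    """Pick a discrete detail level from a continuous zoom factor (>= 1)."""
    zoom = min(max(zoom, 1.0), max_zoom)
    # Log-scale the zoom range onto the available levels, as map tiles do.
    frac = math.log(zoom) / math.log(max_zoom)
    return min(int(frac * len(LEVELS)), len(LEVELS) - 1)
```

At zoom 1 the viewer shows the scatterplot overview; each doubling of zoom steps toward the pixel level, so scientists never have to hop between separate plots.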

    The multilevel display tool enables scientists conducting scattering experiments to explore the resulting image sets at the scatterplot level (0), attribute pseudo-color level (1), zoom-in attribute level (2), raw image level (3), zoom-in raw image level (4), and pixel level (5), all in a single display.

    The ability to visually reconstruct a complete joint dataset from several partial marginal datasets is at the core of another visual analytics tool that Xu’s Stony Brook collaborators developed. This web-based tool enables users to reconstruct all possible solutions to a given problem and locate the subset of preferred solutions through interactive filtering.

    “Scientists commonly describe a single object with datasets from different sources—each covering only a portion of the complete properties of that object—for example, the same sample scanned in different beamlines,” explained Xu. “With this tool, scientists can recover a property with missing fields by refining its potential ranges and interactively acquiring feedback about whether the result makes sense.”

    Their research led to a paper that was published in the Institute of Electrical and Electronics Engineers (IEEE) journal Transactions on Visualization and Computer Graphics and awarded the Visual Analytics Science and Technology (VAST) Best Paper Honorable Mention at the 2016 IEEE VIS conference.

    At this same conference, another group of VAI Lab students who worked with Xu received the Scientific Visualization (SciVis) Best Poster Honorable Mention for their poster, “Extending Scatterplots to Scalar Fields.” Their plotting technique helps users link correlations between attributes and data points in a single view, with contour lines that show how the numerical values of the attributes change. For their case study, the students demonstrated how the technique could help college applicants select the right university by plotting the desired attributes (e.g., low tuition, high safety, small campus size) with different universities (e.g., University of Virginia, Stanford University, MIT). The closer a particular university is to an attribute, the higher that attribute’s value is for that university.

    The scatterplots above are based on a dataset containing 46 universities with 14 attributes of interest for prospective students: academics, athletics, housing, location, nightlife, safety, transportation, weather, score, tuition, dining, PhD/faculty, population, and income. The large red nodes represent the attributes and the small blue points represent the universities; the contour lines (middle plot) show how the numerical values of the attributes change. This prospective student wants to attend a university with good academics (>9/10). Universities that meet this criterion are within the contour lines whose value exceeds 9. To determine which universities meet multiple criteria, students would see where the universities and attributes overlap (right plot).
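The filtering step in the case study boils down to thresholding attributes, which can be sketched with toy data. The university names and scores below are made up for illustration; the real dataset has 46 universities and 14 attributes.

```python
# Toy data: hypothetical scores on a 0-10 scale.
universities = {
    "Univ A": {"academics": 9.5, "tuition": 4.0, "safety": 8.0},
    "Univ B": {"academics": 8.7, "tuition": 9.0, "safety": 9.0},
    "Univ C": {"academics": 9.2, "tuition": 7.5, "safety": 6.5},
}

def meets(criteria, scores):
    """True if every listed attribute meets its minimum threshold."""
    return all(scores[attr] >= lo for attr, lo in criteria.items())

def shortlist(criteria):
    """Names of universities satisfying all criteria, alphabetically."""
    return sorted(name for name, scores in universities.items()
                  if meets(criteria, scores))
```

For example, `shortlist({"academics": 9.0})` plays the role of the "value exceeds 9" contour in the figure; adding a second criterion intersects the contours, just as the overlap in the right plot does.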

    According to Xu, this kind of technique also could be applied to visualize artificial neural networks—the deep learning (a type of machine learning) frameworks that are used to address problems such as image classification and speech recognition.

    “Because neural network models have a complex structure, it is hard to understand how their intrinsic learning process works and how they arrive at intermediate results, and thus quite challenging to debug them,” explained Xu. “Neural networks are still largely regarded as black boxes. Visualization tools like this one could help researchers get a better idea of their model’s performance.”

    Besides her Stony Brook collaborations, Xu is currently involved in the Co-Design Center for Online Data Analysis and Reduction at the Exascale (CODAR), which Brookhaven is partnering on with other national laboratories and universities through DOE’s Exascale Computing Project. Her role is to visualize data evaluating the performance of computing clusters, applications, and workflows that the CODAR team is developing to analyze and reduce data online before the data are written to disk for possible further offline analysis. Exascale computer systems are projected to provide unprecedented increases in computational speed, but the input/output (I/O) rates of transferring the computed results to storage disks are not expected to keep pace, so it will be infeasible for scientists to save all of their scientific results for offline analysis. Xu’s visualization will help the team “diagnose” any performance issues with the computation processes, including individual application execution, computation job management in the clusters, I/O performance in the runtime system, and data reduction and reconstruction efficiency.
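The I/O argument comes down to simple arithmetic: if results are produced faster than the file system can absorb them, the surplus must be reduced online. A sketch, with purely illustrative rates (the real figures depend on the machine):

```python
def min_reduction_factor(output_rate_gbs, io_rate_gbs):
    """Smallest factor by which data must be reduced online so that
    the result stream fits the available disk write bandwidth."""
    if io_rate_gbs <= 0:
        raise ValueError("I/O rate must be positive")
    if output_rate_gbs <= io_rate_gbs:
        return 1.0          # everything can be written as-is
    return output_rate_gbs / io_rate_gbs

# Illustrative numbers only: a simulation emitting 500 GB/s of results
# against a 20 GB/s parallel file system needs at least 25x online
# reduction before anything reaches disk.
```

This is why CODAR treats reduction as part of the workflow itself rather than a post-processing step.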

    Xu is also part of a CSI effort to build a virtual reality (VR) lab for an interactive data visualization experience. “It would be a more natural way to observe and interact with data. VR techniques replicate a realistic and immersive 3D environment,” she said.

    Xu’s passion for visualization most likely stemmed from an early interest in drawing.

    “As a child, I liked to draw,” she said. “In growing up, I took my drawings from paper to the computer.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

  • richardmitnick 3:55 pm on November 13, 2014 Permalink | Reply
    Tags: Computer Science, Steve Ballmer

    From Harvard Physics: “Why I’m making a big investment in Harvard’s computer science faculty” 

    Harvard University

    November 13, 2014

    Steve Ballmer
    Former CEO, Microsoft

    The field of computer science—what it is, what it enables, what it takes to do it well—is undergoing rapid change. Decades of advances in theory and practice have shaped what it now demands of students and scholars, and what it makes possible. When any field undergoes rapid change, what it takes to succeed also changes. New possibilities are created, and new competitors emerge.

    Now, the field has turned outward, driving scientific discovery and deepening our knowledge of everything from how the body works to why people form communities to what shapes the global economy. The power of computational thinking has the potential to spark breakthroughs in nearly every sphere of human endeavor.

    Few areas of achievement can claim as broad a range of influence as computer science can. But what more can we do to accelerate change? It’s a question I thought a lot about while running Microsoft, and the answer is simple: support the very best talent at a place where talent can be fully realized. To that end, I am helping support an increase in the size of the computer science faculty at Harvard by 50%. There is, in my mind, no place better positioned to catch the next wave in computer science than greater Boston. I’ll give you four reasons why:

    First, this area has led the scientific revolution in computing. Harvard has been home to major breakthroughs in engineering and applied sciences, from the invention of the first large-scale automatic digital computer to significant advances in cryptography, learning theory, computer graphics and robotics. During World War II, MIT helped the US Navy design a universal flight simulator that led to the first radar-based air defense system. MIT faculty, including Harvard engineering PhD Leo Beranek, founded Cambridge’s BBN Technologies, which designed and built early Internet infrastructure. Computer science at Harvard and MIT is thriving today as advances in algorithms, systems and artificial intelligence combine to transform the way in which we live with and interact with computers and information. While Stanford and Carnegie Mellon are serious competitors with strong computer science programs powering important breakthroughs, neither on its own matches the combined strength of Harvard and MIT.

    Second, the next generation of computer science will be outward facing: bringing expertise in harnessing data and computing power not only to statistics, engineering, and applied math, but to biology, chemistry, neuroscience, design, the social sciences and public policy. At Harvard, CS exists as a department without walls, within a school (Engineering and Applied Sciences) known for reaching across disciplines to tackle challenging problems. MIT and Harvard computer scientists routinely collaborate—edX is one great example—and use each other’s school as a recruiting tool, since students can cross-register for classes at both schools, and it is not unusual for labs to have scientists from both universities. This openness creates an inquisitive and entrepreneurial culture, a tendency to share ideas and collaborate and hustle that is energizing research as well as teaching and learning.

    Third, Boston and Cambridge have the unmatched advantage of more top-ranked academic departments than any other metropolitan area in the world. Where other than Harvard can you find eminent schools of business, law, and medicine and a renowned design school, public policy school, and schools of education and public health? What other region has the Whitehead Institute, the Broad Institute, and the Wyss Institute, all taking the study of biology, medicine, and engineering to new heights thanks in large part to what is being enabled by machine learning and big data? Imagine the puzzles we can solve—the questions we can ask—by uniting expertise. What is intelligence? Which disease patterns can be discerned through new algorithms? How can cities and urban environments become more efficient and more sustainable? It is not an exaggeration to say that in this region, a top-notch collaborator on these and other fundamental questions can be found down the street or even down the hall.

    Fourth, Harvard is transforming Allston into a hub of entrepreneurship that will draw on the region’s extraordinary strengths to attract thinkers and doers from around the world. With a soon-to-be-built signature building for Engineering and Applied Sciences, the University will rethink how spaces are best configured for the education and research of tomorrow, for collaborative research, and for meeting and brainstorming. Kendall Square stands as a testament to what’s possible when land is developed for technology companies—start-ups and giants both—right next to top-notch computer scientists and their collaborators. Harvard’s dramatically expanded CS faculty will be located just across the street from the existing Innovation Lab and adjacent to a new enterprise zone that will house both start-ups and established businesses. Allston will be the place to be for anyone who wants to watch the ripple of his or her idea spread and gain speed.

    Computer science and computational thinking have transformed the world. But too often, ideas generated here have been commercialized on the West Coast. By hiring the brightest, most creative professors, by educating the world’s best students, and by leveraging the strengths of all of Harvard as well as MIT, this region can catch the next wave and lead the world in shaping the computing of tomorrow. If successful, Boston will set a standard for the field that will bring the very best of computer science to big problems in every field, and change all of our lives for the better.

    See the full article here.

    Harvard is the oldest institution of higher education in the United States, established in 1636 by vote of the Great and General Court of the Massachusetts Bay Colony. It was named after the College’s first benefactor, the young minister John Harvard of Charlestown, who upon his death in 1638 left his library and half his estate to the institution. A statue of John Harvard stands today in front of University Hall in Harvard Yard, and is perhaps the University’s best known landmark.

    Harvard University has 12 degree-granting Schools in addition to the Radcliffe Institute for Advanced Study. The University has grown from nine students with a single master to an enrollment of more than 20,000 degree candidates including undergraduate, graduate, and professional students. There are more than 360,000 living alumni in the U.S. and over 190 other countries.



    ScienceSprings relies on technology from

    MAINGEAR computers



  • richardmitnick 12:44 pm on March 29, 2013 Permalink | Reply
    Tags: Computer Science

    From PNNL Lab: “Striking While the Iron Is Hot” 

    Chromatography combined with database search strategy identifies hard-to-find heme proteins

    March 2013
    Suraiya Farukhi
    Christine Sharp

    Results: Heme c is an important iron-containing post-translational modification found in many proteins. It plays an important role in respiration, metal reduction, and nitrogen fixation, especially anaerobic respiration of environmental microbes. Such bacteria and their c-type cytochromes are studied extensively because of their potential use in bioremediation, microbial fuel cells, and electrosynthesis of valuable biomaterials.

    Heme C

    Until recently, these modifications were hard to find using traditional proteomic methods. Scientists at Pacific Northwest National Laboratory combined a heme c-targeted protein affinity purification strategy called histidine affinity chromatography (HAC) with enhanced database searching. This combination confidently identified heme c peptides in liquid chromatography-tandem mass spectrometry (LC-MS/MS) experiments, improving their detection by as much as 100-fold in some cases.

    Why It Matters: Iron is a critical part of many biological processes; however, it is often not biologically available or it can be toxic in high quantities. So, biological systems have developed intricate methods to use and store iron. Many environmentally important microbes and microbial communities are rich in c-type cytochromes. Combining HAC and data analysis tailored to the unique properties of heme c peptides should enable more detailed study of the role of c-type cytochromes in these microbes and microbial communities.

    ‘Several proteomics studies have analyzed the expression of c-type cytochromes under various conditions,’ said PNNL postdoctoral researcher Dr. Eric Merkley, lead author of a paper that appeared in the Journal of Proteome Research. ‘A shared feature of these studies is that the cytochrome-rich fractions, the cell envelope or extracellular polymeric substance, were purified and explicitly analyzed to efficiently detect cytochromes. Analyses of large-scale proteomics datasets have typically suggested that c-type cytochromes, particularly the heme c peptides, are under-represented.’
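The paper's actual search strategy is more involved, but the sequence feature being hunted is well defined: c-type cytochromes bind heme c covalently at a CXXCH motif (two cysteines separated by any two residues, followed by a histidine). A minimal motif scan, with a made-up example peptide, might look like this:

```python
import re

# Canonical heme c binding motif of c-type cytochromes: C-X-X-C-H.
HEME_C_MOTIF = re.compile(r"C..CH")

def heme_c_sites(sequence):
    """Return the start indices of candidate heme c binding motifs
    in a protein sequence given in one-letter amino acid codes."""
    return [m.start() for m in HEME_C_MOTIF.finditer(sequence)]
```

Database search tools go well beyond this, scoring the mass shift of the covalently attached heme against tandem mass spectra, but the motif is what makes heme c peptides identifiable in the first place.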

    See the full article here.

    Pacific Northwest National Laboratory (PNNL) is one of the United States Department of Energy National Laboratories, managed by the Department of Energy’s Office of Science. The main campus of the laboratory is in Richland, Washington.

    PNNL scientists conduct basic and applied research and development to strengthen U.S. scientific foundations for fundamental research and innovation; prevent and counter acts of terrorism through applied research in information analysis, cyber security, and the nonproliferation of weapons of mass destruction; increase the U.S. energy capacity and reduce dependence on imported oil; and reduce the effects of human activity on the environment. PNNL has been operated by Battelle Memorial Institute since 1965.



  • richardmitnick 7:22 pm on February 4, 2013 Permalink | Reply
    Tags: Computer Science

    From CERN: “CERN and Oracle celebrate 30 years of collaboration” 


    4 Feb 2013
    Andrew Purcell

    On Friday 1 February, 2013, CERN and Oracle celebrated 30 years of collaboration. In addition to providing hardware and software to CERN for three decades, Oracle has now been involved in the CERN openlab project for 10 years.

    Rolf Heuer and Loïc le Guisquet cut cake to celebrate 30 years of collaboration between CERN and Oracle (Image: CERN)

    The celebration, which capped off the ‘IT requirements for the next generation of research infrastructures workshop’ held at CERN, saw CERN Director-General Rolf Heuer present Loïc le Guisquet, executive vice president of Oracle Europe, Middle East, and Africa, with a small award to mark the occasion. Heuer presented le Guisquet with an Oracle tape mounted in glass and marked with the following inscription: ‘LHC data are stored on Oracle tapes similar to the one presented on this award. This specific tape stores the videos of the announcement of the discovery of the new boson, which took place at CERN on 4th July 2012’.

    ‘It is important that IT infrastructures for research embrace new technologies in a manner that is not only useful for researchers, but also improves the competitiveness of many business sectors,’ says Heuer, who cites the collaboration between Oracle and CERN as an excellent example of this. ‘CERN has been working continuously with Oracle over the last 30 years,’ he adds. ‘Oracle is also a long-standing partner of CERN openlab and I think it has developed into a successful model over the last decade now of public-private partnerships in the IT domain.’

    CERN openlab is a unique public-private partnership between CERN and a range of leading IT companies. Its mission is to accelerate the development of cutting-edge solutions to be used by the worldwide LHC community. ‘By using CERN openlab as a showcase, companies can then promote their products and their services to other labs and different business sectors,’ says Bob Jones, head of the organization. ‘We are proud to be part of this collaboration,’ says Le Guisquet. ‘We are energised by it and we want it to go on because it always stretches our limits.’

    See the full article here.

    Meet CERN in a variety of places: CERN Courier, CMS, LHCb, the LHC, and Quantum Diaries.


  • richardmitnick 7:32 pm on January 29, 2013 Permalink | Reply
    Tags: Computer Science

    From Stanford University: “Stanford Researchers Break Million-core Supercomputer Barrier” 


    Researchers at the Center for Turbulence Research set a new record in supercomputing, harnessing a million computing cores to model supersonic jet noise. Work was performed on the newly installed Sequoia IBM Bluegene/Q system at Lawrence Livermore National Laboratories.

    Friday, January 25, 2013
    Andrew Myers

    Stanford Engineering’s Center for Turbulence Research (CTR) has set a new record in computational science by successfully using a supercomputer with more than one million computing cores to solve a complex fluid dynamics problem—the prediction of noise generated by a supersonic jet engine.

    Joseph Nichols, a research associate in the center, worked on the newly installed Sequoia IBM Bluegene/Q system at Lawrence Livermore National Laboratories (LLNL), funded by the Advanced Simulation and Computing (ASC) Program of the National Nuclear Security Administration (NNSA). Sequoia once topped the list of the world’s most powerful supercomputers, boasting 1,572,864 compute cores (processors) and 1.6 petabytes of memory connected by a high-speed five-dimensional torus interconnect.
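Those two figures imply roughly one gigabyte of memory per core, a back-of-the-envelope check worth making (assuming decimal petabytes, i.e. 10^15 bytes):

```python
# Sequoia's published figures, as quoted in the article above.
cores = 1_572_864            # compute cores
memory_bytes = 1.6e15        # 1.6 PB, taking 1 PB = 10**15 bytes

per_core_gb = memory_bytes / cores / 1e9
# works out to about 1 GB of memory available to each core
```

That per-core budget is what makes domain-decomposed fluid simulations feasible: each core holds a manageable slice of the jet's computational grid.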

    A floor view of the newly installed Sequoia supercomputer at the Lawrence Livermore National Laboratories. (Photo: Courtesy of Lawrence Livermore National Laboratories)

    Because of Sequoia’s impressive numbers of cores, Nichols was able to show for the first time that million-core fluid dynamics simulations are possible—and also to contribute to research aimed at designing quieter aircraft engines.

    An image from the jet noise simulation. A new design for an engine nozzle is shown in gray at left. Exhaust temperatures are in red/orange. The sound field is blue/cyan. Chevrons along the nozzle rim enhance turbulent mixing to reduce noise. (Illustration: Courtesy of the Center for Turbulence Research, Stanford University)

    Andrew Myers is associate director of communications for the Stanford University School of Engineering.

    See the full article here.

    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

  • richardmitnick 1:00 pm on January 7, 2013 Permalink | Reply
    Tags: Computer Science

    From ESA Space Engineering: “LEON: the space chip that Europe built” 

    European Space Agency



    7 January 2013
    No Writer Credit

    Just like home computers, the sophisticated capabilities of today’s space missions are made possible by the power of their processor chips. ESA’s coming Alphasat telecom satellite, the Proba-V microsatellite, the Earth-monitoring Sentinel family and the BepiColombo mission to Mercury are among the first missions to use an advanced 32-bit microprocessor – engineered and built in Europe.

    Layout of the LEON2-FT chip, alias AT697

    All of them incorporate the new LEON2-FT chip, commercially known as the AT697. Engineered to operate within spacecraft computers, this microprocessor is manufactured by Atmel in France but originally designed by ESA.

    The underlying LEON design has also been made available to Europe’s space industry as the basis for company-owned ‘system-on-chip’ microprocessors optimised for dedicated tasks. For instance, Astrium is using it to create a space-based GPS/Galileo satnav receiver.

    LEON2-FT chip within Proba-2’s computer

    Independence from non-European parts is also a driver of our European Components Initiative, in place for the last decade, which is working with European industry to bring new components to market.

    See the full article here.

    The European Space Agency (ESA), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 19 member states. Headquartered in Paris, ESA has a staff of more than 2,000. ESA’s space flight program includes human spaceflight, mainly through the participation in the International Space Station program, the launch and operations of unmanned exploration missions to other planets and the Moon, Earth observation, science, telecommunication as well as maintaining a major spaceport, the Guiana Space Centre at Kourou, French Guiana, and designing launch vehicles. ESA science missions are based at ESTEC in Noordwijk, Netherlands, Earth Observation missions at ESRIN in Frascati, Italy, ESA Mission Control (ESOC) is in Darmstadt, Germany, the European Astronaut Centre (EAC) that trains astronauts for future missions is situated in Cologne, Germany, and the European Space Astronomy Centre is located in Villanueva de la Cañada, Spain.


  • richardmitnick 12:03 pm on December 20, 2012 Permalink | Reply
    Tags: Computer Science

    From Sandia Lab: “Supercomputing on the XPRESS track” 

    Sandia aims to create exascale computing operating system

    December 20, 2012

    “In the stratosphere of high-performance supercomputing, a team led by Sandia National Laboratories is designing an operating system that can handle the million trillion mathematical operations per second of future exascale computers, and then creating prototypes of several programming components.

    Called the XPRESS project (eXascale Programming Environment and System Software), the effort to achieve a major milestone in million-trillion-operations-per-second supercomputing is funded at $2.3 million a year for three years by DOE’s Office of Science. The team includes Indiana University and Louisiana State University; the Universities of North Carolina, Oregon and Houston; and Oak Ridge and Lawrence Berkeley national laboratories. Work began Sept. 1.

    ‘The project’s goal is to devise an innovative operating system and associated components that will enable exascale computing by 2020, making contributions along the way to improve current petaflop (a million billion operations a second) systems,’ said Sandia program lead Ron Brightwell.”
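    The units quoted above are easy to mix up: a petaflop is 10^15 operations per second and an exaflop is 10^18, a thousandfold jump. A quick back-of-the-envelope Python sketch (the workload size is purely hypothetical, not from the article) shows what that factor means in wall-clock time:

    ```python
    # Petaflop vs. exaflop scale, using the definitions quoted above.
    PETAFLOP = 1e15  # a million billion (10^15) operations per second
    EXAFLOP = 1e18   # a million trillion (10^18) operations per second

    work = 1e21  # hypothetical job: 10^21 floating-point operations

    hours_peta = work / PETAFLOP / 3600  # ~278 hours on a petaflop machine
    hours_exa = work / EXAFLOP / 3600    # ~0.28 hours (about 17 minutes) at exascale

    print(f"petaflop machine: {hours_peta:.1f} h")
    print(f"exaflop machine:  {hours_exa:.2f} h ({EXAFLOP / PETAFLOP:.0f}x faster)")
    ```

    The same job that ties up a petaflop system for more than a week and a half finishes in under twenty minutes on an exascale machine, which is why the operating system and runtime have to keep up.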

    See the full post here.

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.


  • richardmitnick 12:17 pm on January 6, 2012 Permalink | Reply
    Tags: , , , , Computer Science, , , ,   

    From SLAC Today: “Organic Semiconductor” 

    January 6, 2012
    Diane Rezendes Khirallah

    “Simply put, an organic semiconductor is an organic material whose conductivity can be switched on and off at will. This helpful property gives semiconductors a critical role in the on-off switches at the heart of digital devices.

    Many associate the word organic with pesticide-free farm products. But in chemistry, organic refers to compounds that contain the element carbon.

    Today’s most common semiconductor is silicon, an element in its own right, which contains no carbon. By controlling conditions such as the percentage and type of impurities in the material, and by varying the amount of electrical current and the intensity of light – whether visible, infrared or X-ray – scientists can control how the semiconductor behaves.
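    The impurity (“doping”) control described above can be made concrete with the standard drift-conductivity formula, sigma = q·(n·mu_n + p·mu_p). The sketch below is illustrative only: it uses textbook room-temperature values for silicon, not numbers from the article.

    ```python
    # Drift conductivity of silicon: sigma = q * (n * mu_n + p * mu_p).
    # Textbook room-temperature values; illustrative, not from the article.
    q = 1.602e-19                # elementary charge, C
    mu_n, mu_p = 1400.0, 450.0   # electron / hole mobilities in Si, cm^2/(V*s)
    n_i = 1.0e10                 # intrinsic carrier concentration, cm^-3

    # Pure (intrinsic) silicon: electrons and holes both at n_i.
    sigma_intrinsic = q * n_i * (mu_n + mu_p)

    # Lightly doped n-type silicon: roughly one donor per five million Si atoms.
    n_donor = 1.0e16             # donor concentration, cm^-3
    sigma_doped = q * n_donor * mu_n   # electrons dominate; holes negligible

    print(f"intrinsic silicon: {sigma_intrinsic:.2e} S/cm")
    print(f"doped silicon:     {sigma_doped:.2e} S/cm")
    ```

    Even one donor atom per roughly five million silicon atoms raises the conductivity by nearly six orders of magnitude, which is why impurity control so thoroughly dictates how a semiconductor behaves.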

    But while silicon crystals are durable and allow electrical current to flow rapidly, they are also rigid and expensive to produce, making large-scale implementation cost-prohibitive (for example, in a large-scale solar array).

    In contrast, organic semiconductors – typically plastics and polymers that can be produced in sheets as thin as a single molecule – offer an inexpensive, lightweight, more flexible option. But they don’t yet conduct electricity as efficiently as silicon or operate for as long, which has limited their commercial use.”

    The full article is here.

    Magnified view of organic semiconductor crystals recently grown by Stanford chemical engineers, who studied their structural properties at SLAC’s Stanford Synchrotron Radiation Lightsource. Image courtesy Gaurav Giri, Chemical Engineering, Stanford University

    Now, here is an example of how this research is being applied today.

    The Clean Energy Project (CEP2) at Harvard University is doing work in collaboration with research teams at SLAC.

    CEP2 is a public distributed-computing project under the World Community Grid (WCG) arm of IBM’s Smarter Planet initiative. You can contribute to it with the idle CPU cycles on your computer(s). WCG projects run on a small piece of software from UC Berkeley called BOINC (the Berkeley Open Infrastructure for Network Computing). Just visit the WCG or BOINC web site, download and install the BOINC software, then visit the WCG web site to attach to the project. While you are at WCG, take a look at the other very worthwhile projects and attach to as many as you wish.

    Also, at the BOINC web site, you will find a whole host of other projects in the Physical Sciences, Astronomy and Cosmology, Mathematics and other areas. Again, you can attach to as many projects as you like.

    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.
