Tagged: Science Node

  • richardmitnick 8:05 am on October 19, 2017 Permalink | Reply
    Tags: For the first time researchers could calculate the quantitative contributions from constituent quarks gluons and sea quarks to nucleon spin, Nucleons — protons and neutrons — are the principal constituents of the atomic nuclei, Piz Daint supercomputer, Quarks contribute only 30 percent of the proton spin, Science Node, Theoretical models originally assumed that the spin of the nucleon came only from its constituent quarks, To calculate the spin of the different particles in their simulations the researchers consider the true physical mass of the quarks

    From Science Node: “The mysterious case of Piz Daint and the proton spin puzzle” 

    Science Node bloc
    Science Node

    10 Oct, 2017 [Better late than…]
    Simone Ulmer

    Nucleons — protons and neutrons — are the principal constituents of atomic nuclei. Those particles are in turn made up of yet smaller elementary particles: their constituent quarks and gluons.

    Each nucleon has its own intrinsic angular momentum, or spin. Knowing the spin of elementary particles is important for understanding physical and chemical processes. University of Cyprus researchers may have solved the proton spin puzzle – with a little help from the Piz Daint supercomputer.

    Cray Piz Daint supercomputer of the Swiss National Supercomputing Center (CSCS)

    Proton spin crisis

    Spin is responsible for a material’s fundamental properties, such as phase changes in non-conducting materials that suddenly turn them into superconductors at very low temperatures.

    Inside job. Artist’s impression of what the proton is made of. The quarks and gluons contribute to give exactly half the spin of the proton. The question of how it is done and how much each contributes has been a puzzle since 1987. Courtesy Brookhaven National Laboratory.

    Theoretical models originally assumed that the spin of the nucleon came only from its constituent quarks. But then in 1987, high-energy physics experiments conducted by the European Muon Collaboration precipitated what came to be known as the ‘proton spin crisis’: experiments performed at European Organization for Nuclear Research (CERN), Deutsches Elektronen-Synchrotron (DESY) and Stanford Linear Accelerator Center (SLAC) showed that quarks contribute only 30 percent of the proton spin.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    DESY

    DESY Belle II detector

    DESY European XFEL

    DESY Helmholtz Centres & Networks

    DESY Nanolab II

    SLAC

    SLAC Campus

    SLAC/LCLS

    SLAC/LCLS II

    Since then, it has been unclear what other effects contribute to the spin, and to what extent. Further high-energy physics studies suggested that quark-antiquark pairs, with their short-lived intermediate states, might be in play here – in other words, purely relativistic quantum effects.

    Thirty years later, these mysterious effects have finally been accounted for in the calculations performed on CSCS supercomputer Piz Daint by a research group led by Constantia Alexandrou of the Computation-based Science and Technology Research Center of the Cyprus Institute and the Physics Department of the University of Cyprus in Nicosia. That group also included researchers from DESY-Zeuthen, Germany, and from the University of Utah and Temple University in the US.

    For the first time, researchers could calculate the quantitative contributions from constituent quarks, gluons, and sea quarks to nucleon spin. (Sea quarks are short-lived intermediate states of quark-antiquark pairs inside the nucleon.) With their calculations, the group made a crucial step towards solving the puzzle that brought on the proton spin crisis.

    To calculate the spin of the different particles in their simulations, the researchers consider the true physical mass of the quarks.

    “This is a numerically challenging task, but of essential importance for making sure that the values of the used parameters in the simulations correspond to reality,” says Karl Jansen, lead scientist at DESY-Zeuthen and project co-author.

    The strong [interaction] acting here, which is transmitted by the gluons, is one of the four fundamental forces of physics. The strong [interaction] is indeed strong enough to prevent the removal of a quark from a proton. This property, known as confinement, results in huge binding energy that ultimately holds together the nucleon constituents.

    The researchers used the mass of the pion, a so-called meson consisting of one up quark and one down antiquark – the ‘light quarks’ – to fix the masses of the up and down quarks entering the simulations to their physical values. If the pion mass calculated from the simulation agrees with the experimentally determined value, the researchers take the simulation to be running with the actual physical quark masses.
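
    The tuning described above is, at heart, a one-dimensional calibration loop: adjust the bare light-quark mass until the computed pion mass matches experiment. The sketch below shows only that logic; simulate_pion_mass is a hypothetical stand-in for what is in reality a full lattice-QCD computation, and all numbers are placeholders, not values from the study.

    def simulate_pion_mass(bare_quark_mass):
        # Placeholder for an expensive lattice-QCD run; fakes a smooth,
        # monotonic dependence of the pion mass (in MeV) on the bare quark mass.
        return 2500.0 * bare_quark_mass ** 0.5

    def tune_quark_mass(target_mpi=135.0, lo=1e-4, hi=1e-1, tol=0.1):
        # Bisect the bare quark mass until the simulated pion mass agrees
        # with the experimental value to within `tol` MeV.
        while True:
            mid = 0.5 * (lo + hi)
            mpi = simulate_pion_mass(mid)
            if abs(mpi - target_mpi) < tol:
                return mid, mpi
            if mpi > target_mpi:
                hi = mid   # pion came out too heavy: try lighter quarks
            else:
                lo = mid   # pion came out too light: try heavier quarks

    mass, mpi = tune_quark_mass()
    print(f"tuned bare quark mass parameter: {mass:.5f} (simulated pion mass {mpi:.1f} MeV)")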

    And that is exactly what Alexandrou and her team have achieved in their research recently published in Physical Review Letters.

    Their simulations also took into account the valence quarks (constituent quarks), sea quarks, and gluons. The researchers used the lattice theory of quantum chromodynamics (lattice QCD) to calculate this sea of particles and their QCD interactions [ETH Zürich].

    Elaborate conversion to physical values

    The biggest challenge with the simulations was to reduce statistical errors in calculating the ‘spin contributions’ from sea quarks and gluons, says Alexandrou. “In addition, a significant part was to carry out the renormalisation of these quantities.”

    Spin cycle. Composition of the proton spin among the constituent quarks (blue and purple columns with the lines), sea quarks (blue, purple, and red solid columns) and gluons (green column). The errors are shown by the bars. Courtesy Constantia Alexandrou.

    In other words, they had to convert the dimensionless values determined by the simulations into a physical value that can be measured experimentally – such as the spin carried by the constituent and sea quarks and the gluons that the researchers were seeking.

    Alexandrou’s team is the first to have achieved this computation including gluons, whereby they had to calculate millions of the ‘propagators’ describing how quarks move between two points in space-time.
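
    Computing a propagator amounts to solving a very large, sparse linear system: the lattice Dirac operator applied to the unknown propagator column equals a point source. The toy sketch below shows only the shape of that step using SciPy; the matrix is a small random stand-in, not a real Dirac operator, and production codes use highly tuned solvers on machines like Piz Daint.

    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import gmres

    n = 2000                                   # real lattices: many millions of unknowns
    # Small, well-conditioned stand-in for the Dirac operator D
    D = sp.identity(n) + 0.1 * sp.random(n, n, density=1e-3, format="csr", random_state=0)

    b = np.zeros(n)
    b[0] = 1.0                                 # point source at one lattice site

    x, info = gmres(D, b, atol=1e-10)          # x is one column of the quark propagator
    print("solver converged" if info == 0 else f"gmres stopped with info={info}")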

    “Making powerful supercomputers like Piz Daint open and available across Europe is extremely important for European science,” notes Jansen.

    “Simulations as elaborate as this were possible only thanks to the power of Piz Daint,” adds Alexandrou.

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 11:23 am on October 9, 2017 Permalink | Reply
    Tags: Aurora supercomputer, Science Node

    From Science Node: “US Coalesces Plans for First Exascale Supercomputer: Aurora in 2021” 

    Science Node bloc
    Science Node

    September 27, 2017
    Tiffany Trader

    ANL ALCF Cray Aurora supercomputer

    At the Advanced Scientific Computing Advisory Committee (ASCAC) meeting, in Arlington, Va., yesterday (Sept. 26), it was revealed that the “Aurora” supercomputer is on track to be the United States’ first exascale system. Aurora, originally named as the third pillar of the CORAL “pre-exascale” project, will still be built by Intel and Cray for Argonne National Laboratory, but the delivery date has shifted from 2018 to 2021 and target capability has been expanded from 180 petaflops to 1,000 petaflops (1 exaflop).

    The fate of the Argonne Aurora “CORAL” supercomputer has been in limbo since the system failed to make it into the U.S. DOE budget request, while the same budget proposal called for an exascale machine “of novel architecture” to be deployed at Argonne in 2021.

    Until now, the only official word from the U.S. Exascale Computing Project was that Aurora was being “reviewed for changes and would go forward under a different timeline.”

    Officially, the contract has been “extended,” and not cancelled, but the fact remains that the goal of the Collaboration of Oak Ridge, Argonne, and Lawrence Livermore (CORAL) initiative to stand up two distinct pre-exascale architectures was not met.

    According to sources we spoke with, a number of people at the DOE are not pleased with the Intel/Cray (Intel is the prime contractor, Cray is the subcontractor) partnership. It’s understood that the two companies could not deliver on the 180-200 petaflops system by next year, as the original contract called for. Now Intel/Cray will push forward with an exascale system that is some 50x larger than any they have stood up.

    It’s our understanding that the cancellation of Aurora is not a DOE budgetary measure as has been speculated, and that the DOE and Argonne wanted Aurora. Although it was referred to as an “interim,” or “pre-exascale” machine, the scientific and research community was counting on that system, was eager to begin using it, and they regarded it as a valuable system in its own right. The non-delivery is regarded as disruptive to the scientific/research communities.

    Another question: since Intel/Cray failed to deliver Aurora and have moved on to a larger exascale system contract, why hasn’t their original CORAL contract been cancelled and put out to bid again?

    With increased global competitiveness, it seems that the DOE stakeholders did not want to further delay the non-IBM/Nvidia side of the exascale track. Conceivably, they could have done a rebid for the Aurora system, but that would leave them with an even bigger gap if they had to spin up a new vendor/system supplier to replace Intel and Cray.

    Starting the bidding process over again would have delayed progress toward exascale – it might even have been the death knell for exascale by 2021. Instead, Intel and Cray now have a giant performance leap to make and three years to do it. There is an open question on the processor front, as the retooled Aurora will not be powered by Phi/Knights Hill as originally proposed.

    These events raise the question of whether the IBM-led effort (IBM, Nvidia, and Mellanox) is looking very good by comparison. The other CORAL thrusts — Summit at Oak Ridge and Sierra at Lawrence Livermore — are on track, with Summit several weeks ahead of Sierra, although it now looks like neither will make the cut-off for entry onto the November Top500 list, as many had speculated they would.

    ORNL IBM Summit supercomputer depiction

    LLNL IBM Sierra supercomputer

    We reached out to representatives from Cray, Intel and the Exascale Computing Project (ECP) seeking official comment on the revised Aurora contract. Cray and Intel declined to comment and we did not hear back from ECP by press time. We will update the story as we learn more.

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 11:48 am on October 4, 2017 Permalink | Reply
    Tags: Science Node, Women once powered the tech industry: Can they do it again?

    From Science Node: “Women once powered the tech industry: Can they do it again?” 

    Science Node bloc
    Science Node

    02 Oct, 2017
    Alisa Alering

    As women enter a field, compensation tends to decline. Is the tech meritocracy a lie?

    Marie Hicks wants us to think about how gender and sexuality influence technological progress and why diversity might matter more in tech than in other fields.

    An assistant professor of history at the University of Wisconsin-Madison, Hicks studies the history of technological progress and the global computer revolution.

    In Programmed Inequality: How Britain discarded women technologists and lost its edge in computing, Hicks discusses how Britain undermined its early success in computation after World War II by neglecting its trained technical workforce — at that time largely composed of women.

    We had a few questions for Hicks about what lessons Britain’s past mistakes might hold for the seemingly-unstoppable economic engine that is Silicon Valley today.

    ‘Technical’ used to be associated with low status, less-skilled work, but now tech jobs are seen as high-status. How did the term evolve?

    In the UK, the class system was such that touching a machine in any way, even if it was an office computer, was seen as lower-class. For a time, there was enormous resistance to doing anything technical by the white men who were in the apex position of society.

    The US had less of that sort of built-in bias against technical work, but there was still the assumption that if you were working with a machine, the machine was doing most of the work. You were more of a tender or a minder—you were pushing buttons.

    The change resulted from a very intentional, concerted push from people inside these nascent fields to professionalize and raise the status of their jobs. All of these professional bodies that we have today, the IEEE and so on, were created in this period. They were helped along by the fact that this is difficult work, and there was a lot of call for it, leading to persistent shortages of people who could do the work.

    We’re in an interesting moment, when these professions are at their peak, and now we’re starting to see them decline in importance and remuneration. More and more, people are hired into jobs that are broken down in ways that require less skill or less training. New college hires are brought into them and the turnover is such that people no longer have the guarantee of a career.

    Will diversity initiatives, rather than elevating women, devalue the status of the field, as happened previously in professions like teaching and librarianship?

    We can see that already happening for certain subfields. Women are pushed into areas like quality assurance rather than what would be considered higher-level, more important, infrastructural engineering positions. The jobs require, in many cases, identical skills, and yet those subfields are paid less and have a lower status.

    The discrepancies are very much linked to the fact that there is a higher proportion of women doing the work. It’s a cycle: High pay and high status professions usually become more male-dominated. If that changes and more women enter the field, pay declines. The perception of the field changes, even if the work remains the same.

    Does the tech industry have a greater problem with structural inequality, or is the conversation just more visible?

    The really significant thing about tech is that it’s so powerful. It’s becoming the secondary branch of our government at this point. That’s why it’s so critical to look at lack of diversity in Silicon Valley.

    There’s just so much at stake in terms of who has the power to decide how we live, how we die, how we’re governed, just the entire shape of our lives.

    How do you suggest we tackle the problem?

    There’s this whole myth of meritocracy that attempts to solve the problem of diversity in STEM through the pipeline model — that, essentially, if we get enough white women and people of color into the beginning end of the pipeline, they’ll come out the other end as captains of industry who are in a position to make real changes in these fields.

    But, of course, what happens is that they just leak out of the pipeline, because stuffing more and more people into a discriminatory system in an attempt to fix it doesn’t work.

    If you want more women and people of color in management, you have to hire them into those higher positions. You have to get people to make lateral moves from other industries. You have to promote people rather than saying, “Oh, you come in at the bottom level, and you somehow prove yourself.” It’s not going to be possible to get people to the top in that way.

    What we’ve seen is decades and decades where people have been kept at the bottom after they come in at the bottom. We have to have a real disruption in how we think about these jobs and how we hire for them. We can’t just do the same old thing but try to add more women and more people of color into the mix.

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 1:46 pm on September 21, 2017 Permalink | Reply
    Tags: Barcelona Supercomputer Center, Science Node, Turbulence

    From Science Node: “Smoothing out our notion of turbulence” 

    Science Node bloc
    Science Node

    Barcelona Supercomputer Center

    Barcelona Supercomputer Center MareNostrum Intel Xeon Platinum supercomputer

    21 Sep, 2017
    Alisa Alering

    Advances in high-performance computing at the Barcelona Supercomputing Center create greater accuracy in turbulence modeling.

    Turbulence is what makes you clutch your seat and contemplate your mortality 36,000 feet above the ground as the flimsy tin can you’re flying in bounces violently.

    Turbulence is also present in ocean currents and the lava discharged from a volcano. It’s active in the smoke from a chimney, oil churning through pipelines and, yes, the air around aircraft wings.

    In fact, the flow of most fluids is turbulent, including the movement of blood in our arteries.

    Since turbulence is so prevalent, its study has many industrial and environmental applications.

    Scientists model turbulence to improve vehicle design, diagnose atherosclerosis, build safer bridges, and reduce air pollution.

    Caused by excess energy that can’t be absorbed by a fluid’s viscosity, turbulent flow is by nature irregular and therefore hard to predict. The speed of the fluid at any point constantly fluctuates in both magnitude and direction, presenting researchers with a long-standing challenge.
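
    The article doesn’t give the standard yardstick, but the balance it describes between driving energy and viscous damping is conventionally measured by the Reynolds number, written here in LaTeX:

    \mathrm{Re} \;=\; \frac{\rho\, u\, L}{\mu} \;=\; \frac{u\, L}{\nu}

    where ρ is the fluid density, u a characteristic speed, L a characteristic length, μ the dynamic viscosity, and ν = μ/ρ the kinematic viscosity. Flows with large Re – fast, large, or weakly viscous – are the ones that tend to become turbulent.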

    But new research has validated a theory proposed in the first half of the twentieth century whose math was too complex to confirm until recently.

    The need to follow fluid lumps in time, space, and scale results in equations that generate too much information: Even now, only a small part of the flow will fit in a computer simulation.

    Scientists use models to make up the missing part. But if those models are wrong, then the simulation is also wrong and no longer represents the flow it’s attempting to simulate.

    Recent research by José Cardesa [Science], an aeronautical engineer in Javier Jiménez’s Fluid Dynamics Group at Universidad Politécnica de Madrid (UPM), attempts to gain new insights into the physics behind turbulent flows and reduce the gaps between simulated flows and the flows around real devices.

    “A main source of discrepancy between computer-modeled flows and the flow around a real airplane is given by the poor performance of the models,” says Cardesa.

    An underlying simplicity

    In the 1940s, mathematician Andrey Kolmogorov proposed that turbulence occurs in a cascade.

    A turbulent flow contains whirls of many different sizes. According to Kolmogorov, energy is transferred from the large whirls to smaller and more numerous whirls, rather than dispersing to farther distances.
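
    The classic quantitative statement of that cascade (Kolmogorov’s 1941 theory, rather than the specific quantity tracked in Cardesa’s study) is the inertial-range energy spectrum:

    E(k) \;=\; C\,\varepsilon^{2/3}\,k^{-5/3}

    where k is the wavenumber (inverse eddy size), ε is the mean rate at which energy is handed down the cascade and ultimately dissipated, and C ≈ 1.5 is the experimentally determined Kolmogorov constant.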

    But, Cardesa says, the chaotic behavior of a fluid makes it hard to observe any trend with the naked eye.

    Hoping to track individual eddy structures and determine if a recurrent behavior is at work in how turbulence spreads, Cardesa and his colleagues at UPM simulated a turbulent flow using the MinoTauro cluster at the Barcelona Supercomputing Center.

    MinoTauro cluster at the Barcelona Supercomputing Center

    The code was run in parallel on 32 NVIDIA Tesla M2090 cards, using a hybrid CUDA-MPI code developed by Alberto Vela-Martin. The simulation took almost three months to complete and resulted in over one hundred terabytes of compressed data.

    Progress in analyzing the stored simulation data was initially slow, until Cardesa adjusted the code so it would fit on a single node of a computer cluster with 48 GB of RAM per node. This way, he could run the process independently on twelve different nodes and was able to complete the task within just one month.
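
    The speed-up came from an embarrassingly parallel pattern: once the analysis fit within a single 48 GB node, the stored snapshots could be divided among twelve nodes with no communication between them. A minimal sketch of that pattern follows; the file layout and analyse_snapshot are hypothetical placeholders, not the group’s actual code.

    import sys
    from pathlib import Path

    N_NODES = 12
    node_id = int(sys.argv[1])                   # 0..11, one value per cluster node

    snapshots = sorted(Path("simulation_output").glob("snapshot_*.h5"))
    my_share = snapshots[node_id::N_NODES]       # round-robin split, no overlap between nodes

    def analyse_snapshot(path):
        # Placeholder for the eddy-tracking analysis; must fit in 48 GB of RAM.
        print(f"node {node_id}: processing {path.name}")

    for snap in my_share:
        analyse_snapshot(snap)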

    Their results validated Kolmogorov’s theory, revealing an underlying simplicity in the apparently random motion of turbulent wind or water. The next step may be to try to understand the cause of the trend Cardesa has detected or to implement the new insights into flow simulation software.

    Cardesa’s work has benefited from advances in computational speed and storage capacity. He points out that his work would have been possible about ten years ago, but the expense would have been such that it would have required a ‘heroic’ computational effort.

    “The reduced cost of technology has made it possible for us to play with these datasets,” says Cardesa. “This is an extremely useful situation to be in when doing fundamental research and throwing all our efforts at an unsolved problem.”

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 7:59 am on August 25, 2017 Permalink | Reply
    Tags: Caty Pilachowski, Science Node

    From Science Node: Women in Stem -“A Hoosier’s view of the heavens” Caty Pilachowski 

    Science Node bloc
    Science Node

    24 Aug, 2017
    Tristan Fitzpatrick

    Caty Pilachowski

    Courtesy Emily Sterneman; Indiana University.

    “An eclipse violates our sense of what’s right.”

    So says Caty Pilachowski. Pilachowski, past president of the American Astronomical Society and now the Kirkwood Chair in Astronomy at Indiana University, has just returned from Hopkinsville, Kentucky where she observed the eclipse on the path of totality and watched the phenomena associated with a solar eclipse.

    “There are all kinds of effects that we can see during an eclipse,” says Pilachowski. “For example, we’re able to see the corona, which we can never see during the daytime without special equipment.”

    The surface of the sun, Pilachowski explains, has a temperature of roughly 5,780 kelvins (10,000º Fahrenheit). The thin gas that makes up the corona far above the sun, however, has a much hotter temperature — over a million kelvins.

    “That process of transporting energy into the highest atmosphere of the sun is not well understood,” she observes. “It’s the region just above the bright lower atmosphere of the sun that we’re best able to see during the eclipse, and that’s where the energy transport occurs.”

    Smile for the camera

    But the star in our own neighborhood isn’t the only one Pilachowski is keeping her eye on.

    When they’re not watching eclipses, Pilachowski and her colleagues at the IU Department of Astronomy use the One Degree Imager (ODI) on the WIYN 3.5M Observatory at Kitt Peak outside Tucson, Arizona.

    One Degree Imager (ODI) on the WIYN 3.5M Observatory


    NOAO WIYN 3.5 meter telescope at Kitt Peak, AZ, USA

    The ODI was designed to image one square degree of sky at a time (the full moon takes up about half a square degree). Each image produced with the ODI is potentially 1 – 2 gigabytes in size.


    Kitt Peak outside of Tucson, Arizona hosts the 3.5 meter WIYN telescope, the primary research telescope for IU astronomers. Courtesy IU Astronomy; UITS Advanced Visualization Laboratory.

    IU astronomers collect thousands of these images, creating huge datasets that need to be examined quickly for scholarly insight.

    “Datasets from the ODI are much larger than can be handled with methods astronomers previously used, such as a CD-ROM or a portable hard drive,” says Arvind Gopu, manager of the Scalable Compute Archive team.

    This is where IU’s computationally rich resources are critically important.

    The ODI Portal, Pipeline, and Archive (ODI-PPA) leverages the Karst, Big Red II, and Carbonate supercomputers at IU to quickly process these large amounts of data for analysis.

    Karst supercomputer

    Big Red II supercomputer

    These HPC tools allow researchers to perform statistical analysis and source extraction from the original image data. With these resources, they can determine if they’ve located stars, galaxies, or other items of interest from the large slice of the universe they’ve been viewing.
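
    As a purely generic illustration of what “source extraction” means here (this is not the ODI-PPA pipeline, and the file name is hypothetical), a single frame can be searched for star-like sources with astropy and photutils:

    from astropy.io import fits
    from astropy.stats import sigma_clipped_stats
    from photutils.detection import DAOStarFinder

    data = fits.getdata("odi_image.fits")                  # one 1-2 GB ODI frame
    _, median, std = sigma_clipped_stats(data, sigma=3.0)  # robust background estimate

    finder = DAOStarFinder(fwhm=3.0, threshold=5.0 * std)  # 5-sigma point-source detection
    sources = finder(data - median)                        # astropy Table of detections (or None)

    n_found = 0 if sources is None else len(sources)
    print(f"found {n_found} candidate sources")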

    “The advantage of using ODI-PPA is that you don’t have to have a lot of supercomputing experience,” says Gopu. “The idea is for astronomers to do the astronomy, and for us at UITS Research Technologies to do the computer science for them.”

    This makes the workflow on the ODI much faster than for other optical instruments. When collecting images of the universe, some instruments run into the crowded field problem, where stars are so close to each other they blend together when imaged. Teasing them apart requires a lot of computational heft.

    Another advantage ODI-PPA offers is its user-friendly web portal that makes it easy for researchers to view out-of-this-world images on their own machines, without requiring multiple trips to Kitt Peak.

    “Without the portal, IU astronomers would be dead in the water,” Pilachowski admits. “Lots and lots of data, with no way to get the science done.”

    Out of the fire and into the frying pan

    Pilachowski is also a principal investigator on the Blanco DECam Bulge Survey (BDBS). A three-year US National Science Foundation-funded project, BDBS uses the Dark Energy Camera (DECam) attached to the Blanco Telescope in Chile to map the bulge at the heart of the Milky Way.

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M Blanco 4m Telescope, which houses DECam at Cerro Tololo, Chile, at an altitude of 7,200 feet

    Like the yolk of a fried egg rising above egg whites in a frying pan, billions of stars orbit together to form a bulge that rises out of the galactic center.

    With the help of the DECam, Pilachowski can analyze populations of stars in the Milky Way’s bulge to study their properties.

    Astronomers use three different variables to catalog stars: how much hydrogen a star has, how much helium it has, and how much ‘metals’ it has (that is, all the elements that aren’t hydrogen or helium).
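
    The article doesn’t spell out the bookkeeping, but the conventional way to express that third quantity is a logarithmic abundance ratio relative to the Sun, most often quoted for iron:

    [\mathrm{Fe}/\mathrm{H}] \;=\; \log_{10}\!\left(\frac{N_{\mathrm{Fe}}}{N_{\mathrm{H}}}\right)_{\star} - \log_{10}\!\left(\frac{N_{\mathrm{Fe}}}{N_{\mathrm{H}}}\right)_{\odot}

    so [Fe/H] = 0 means a star with the Sun’s iron-to-hydrogen ratio, and [Fe/H] = -1 means one tenth of it. Bulge surveys use quantities like this to trace how stellar populations formed.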

    When the data from the survey is processed, Pilachowski can explore a large amount of information about stars in the bulge, giving her clues about how the Milky Way’s central star system formed.

    “Most large astronomical catalogues are in the range of 500 million stars,” says Michael Young, astronomer and senior developer analyst at UITS Research Technologies. “When we’re done with this project, we should have a catalog of about a billion stars for researchers to use.”

    Journey of two eclipses

    As a child of the atomic age, Pilachowski grew up devouring books about the evolution of stars. She read as many books as she could about how they were formed, what stages they went through, and how they died.

    “That interest in stars has been a lifelong love for me,” Pilachowski says. “It’s neat to me that what I found exciting as a kid is what I get to spend my whole career studying.”

    She observed the last total solar eclipse in the continental US on February 26, 1979, an event she says further inspired her research in astronomy.

    “For me that eclipse was a combination of, ‘Wow, this is so amazing,’” Pilachowski recalls.

    “On the other hand, the observer in me saw cool things that were present, like planets that were visible right near the sun in the day time.”

    Regardless of whether scientists get closer to answering why the sun’s outer atmosphere is much hotter than its surface, Pilachowski says the eclipse has an eerie, unnerving effect on viewers.

    “We have this deep, ingrained understanding that the sun rises every morning and sets every evening,” says Pilachowski. “Things are as they’re supposed to be. An eclipse is something so rare and counter to our intuition that it just affects us deeply.”

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 12:20 pm on July 13, 2017 Permalink | Reply
    Tags: Chinese Sunway TaihuLight supercomputer currently #1 on the TOP500 list of supercomputers, How supercomputers are uniting the US and China, Science Node

    From Science Node: “How supercomputers are uniting the US and China” 

    Science Node bloc
    Science Node

    12 July 2017
    Tristan Fitzpatrick

    38 years ago, US President Jimmy Carter and China Vice Premier Deng Xiaoping signed the US – China Agreement on Cooperation in Science and Technology, outlining broad opportunities to promote science and technology research.

    Since then the two nations have worked together on a variety of projects, including energy and climate research. Now, however, there is another goal that each country is working towards: The pursuit of exascale computing.

    At the PEARC17 conference in New Orleans, Louisiana, representatives from the high-performance computing communities in the US and China participated in the first international workshop on American and Chinese collaborations in experience and best practice in supercomputing.

    Both countries face the same challenges implementing and managing HPC resources across a large nation-state. The hardware and software technologies are rapidly evolving, the user base is ever-expanding, and the technical requirements for maintaining these large and fast machines are accelerating.

    It would be a major coup for either country’s scientific prowess if exascale computing could be reached, as it’s believed to be the order of processing for the human brain at the neural level. Initiatives like the Human Brain Project consider it to be a hallmark to advance computational power.

    “It’s less like an arms race between the two countries to see who gets there first and more like the Olympics,” says Dan Stanzione, executive director at the Texas Advanced Computing Center (TACC).

    TACC Maverick HP NVIDIA supercomputer

    TACC Lonestar Cray XC40 supercomputer

    Dell Poweredge U Texas Austin Stampede Supercomputer. Texas Advanced Computer Center 9.6 PF

    TACC HPE Apollo 8000 Hikari supercomputer

    “We’d like to win and get the gold medal but hearing what China is doing with exascale research is going to help us get closer to this goal.”

    ___________________________________________________________________

    Exascale refers to computing systems that can perform a billion billion calculations per second — at least 50 times faster than the fastest supercomputers in the US.

    ___________________________________________________________________
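
    A quick back-of-the-envelope check of that definition (illustrative only; the figure of roughly 18 petaflops for the fastest US system on the June 2017 TOP500 list is approximate):

    exaflops  = 1e18       # a billion billion calculations per second
    petaflops = 1e15
    print(exaflops / petaflops)          # 1000.0  -> 1 exaflops = 1,000 petaflops
    print(exaflops / (18 * petaflops))   # ~55.6   -> consistent with "at least 50 times faster"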

    Despite the bona fides that would be awarded to whoever achieves the milestone first, TACC data mining and statistics group manager Weijia Xu stresses that collaboration is a greater motivator for both the US and China than just a race to see who gets there first.

    “I don’t think it’s really a competition,” Xu says. “It’s more of a common goal we all want to reach eventually. How you reach the goal is not exactly clear to everyone yet. Furthermore, there are many challenges ahead, such as how systems can be optimized for various applications.”

    The computational resources at China’s disposal could make it a great ally in the pursuit of exascale power. As of June 2017, China has the two fastest supercomputers in the top 500 supercomputers list, followed by five entries from the United States in the top ten.

    Chinese Sunway TaihuLight supercomputer, currently #1 on the TOP500 list of supercomputers.

    “While China has the top supercomputer in the world, China and the US probably have about fifty percent each of those top 500 machines besides the European countries,” says Si Liu, HPC software tools researcher at TACC. “We really believe if we have some collaboration between the US and China, we could do some great projects together and benefit the whole HPC community.”

    Besides pursuing the elusive exascale goal, Stanzione says the workshop opened up other ideas for how to improve the overall performance of HPC efforts in both nations. Workshop participants spoke on topics including in situ simulations, artificial intelligence, and deep learning, among others.

    “We also ask questions like how do we run HPC systems, what do we run on them, and how it’s going to change in the next few years,” Stanzione says. “It’s a great time to get together and talk about details of processors, speeds, and feeds.”

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 1:21 pm on July 8, 2017 Permalink | Reply
    Tags: Science Node, UCSD Comet supercomputer

    From Science Node: “Cracking the CRISPR clock” 

    Science Node bloc
    Science Node

    05 Jul, 2017
    Jan Zverina

    SDSC Dell Comet supercomputer

    Capturing the motion of gyrating proteins at time intervals up to one thousand times greater than previous efforts, a team led by University of California, San Diego (UCSD) researchers has identified the myriad structural changes that activate and drive CRISPR-Cas9, the innovative gene-splicing technology that’s transforming the field of genetic engineering.

    By shedding light on the biophysical details governing the mechanics of CRISPR-Cas9 (clustered regularly interspaced short palindromic repeats) activity, the study provides a fundamental framework for designing a more efficient and accurate genome-splicing technology that doesn’t yield the ‘off-target’ DNA breaks currently frustrating the potential of the CRISPR-Cas9 system, particularly for clinical uses.


    Shake and bake. Gaussian accelerated molecular dynamics simulations and state-of-the-art supercomputing resources reveal the conformational change of the HNH domain (green) from its inactive to active state. Courtesy Giulia Palermo, McCammon Lab, UC San Diego.

    “Although the CRISPR-Cas9 system is rapidly revolutionizing life sciences toward a facile genome editing technology, structural and mechanistic details underlying its function have remained unknown,” says Giulia Palermo, a postdoctoral scholar with the UC San Diego Department of Pharmacology and lead author of the study [PNAS].

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 12:05 pm on July 1, 2017 Permalink | Reply
    Tags: NRENs - national research and education networks, Science Node, Solar energy benefits education and research in Africa, Solar-powered batteries   

    From Science Node: “Solar energy benefits education and research in Africa” 

    Science Node bloc
    Science Node

    28 June, 2017
    Megan Ray Nichols


    Research and education networks are under threat in Africa due to frequent power outages. Solar-powered batteries may hold the key to network resilience and scientific autonomy.

    It’s hard to imagine that in our technologically advanced society there are people without electricity, but this is exactly the situation in many parts of Africa.

    With many remote regions and an unstable electrical grid, the science and education made possible by national research and education networks (NRENs) are often in jeopardy. Solar-powered batteries might just be the solution.

    Electricity, education, and research in Africa

    It is estimated that millions of families in Africa are without power, and the policies governments must enact to make electricity more available are slow in coming. Finding a viable and economical way to connect everyone to the grid has been a challenge.


    Wow!! Power to the people. Microgrids, like the one featured in this Tesla video, combine solar panels and rechargeable batteries to liberate remote regions from the tyranny of power outages. Courtesy Tesla.

    Electrical service disruption directly affects network operating centers (NOCs), network points of presence (PoPs), research institutions, and students throughout the continent.

    “Information and communication technology (ICT) services define our daily lives,” notes Stein Mkandawire, chief technical officer for the Zambia Research and Education Network.

    “Funding standby generators for daily running of NOCs, PoPs and institutions is required, and that results in high service provision costs.”

    Even in less remote locales with an electrical infrastructure in place, blackouts occur frequently. The net result is an extreme hindrance for the scientific and educational projects underway in Africa.

    “Power outages often worsen the challenges faced when establishing NRENs in Africa because periods where power mains fail in excess of two days are still common,” says Isaac Kasana, CEO of the Research and Education Network for Uganda (RENU).

    Here comes the sun. Electrical infrastructure is often taxed by the rugged expanses of Africa, handicapping scientific communications. Solar power is lighting the way to a solution. Courtesy McKinsey and Co.

    “Failure is so repetitive that the mains-charged battery systems are unable to sustain sufficient levels of operating autonomy to prevent site power shutdowns from occurring.”

    Power outages not only affect a specific site or campus but also the connectivity of other linked campuses. For instance, RENU’s network follows a sub-ring topology with typically eight or nine daisy-chained campus networks.

    Multiply that by the number of researchers, teachers, students, and communities depending on ICT services, and the fragility of the enterprise becomes apparent.

    In the face of these challenges, NREN engineers are looking to solar power as a way to sustain electricity during frequent blackouts.

    Harnessing solar power


    Being able to tap into solar energy for electrical power works best when there is a way to store that energy. In the past, batteries haven’t always worked as well as they should.

    But with advances in technology, solar-powered rechargeable batteries now make renewable energy systems reliable and viable.

    “Many African countries have plenty of sunshine which can be used as alternative source of energy, so solar energy is a means to sustain the NRENs in times of blackouts,” says Mkandawire.

    The doctor is in. Remote researchers (and their data) cut off by intermittent power supplies may find respite with the implementation of solar-powered rechargeable batteries. Courtesy Johns Hopkins School of Public Health.

    Since most days have sufficient periods of intense sunshine, this would ensure near-continuous solar charging. When tied into a hybrid-charged power system, batteries can greatly enhance NREN resilience.
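
    To make the resilience argument concrete, here is an illustrative sizing exercise showing roughly how much battery and panel capacity a small PoP would need to ride out a two-day mains failure. The numbers are assumptions chosen only to show the arithmetic; they are not figures from the article or from any NREN.

    load_w        = 300      # assumed average draw of a PoP's routers, switches, and servers (watts)
    autonomy_h    = 48       # desired autonomy: a two-day mains failure
    usable_depth  = 0.8      # fraction of battery capacity that can safely be drawn down
    system_eff    = 0.9      # inverter / charge-controller losses

    bank_wh = load_w * autonomy_h / (usable_depth * system_eff)
    sun_h   = 5              # assumed full-sun hours per day
    panel_w = bank_wh / (2 * sun_h)   # recharge the bank over roughly two sunny days

    print(f"battery bank ~ {bank_wh/1000:.0f} kWh, solar array ~ {panel_w/1000:.1f} kW "
          f"(plus capacity to carry the normal daytime load)")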

    “For up-country campuses and rural-located research stations (such as the NIH station at Rakai), solar-charged batteries may provide the most cost-efficient means of powering connectivity and other ICT equipment,” says Kasana. “This will increase an NREN’s national coverage by enabling the connection of remote research stations and enhancing access for researchers who have to be based at such remote sites.”

    By supplying countries with a reliable source of solar power, African NRENs can deliver a steady stream of services to institutions, research bases, and communities. This, in turn, gives better access to learning materials.

    The benefits of solar power

    There are many affordable options for families in Africa to bring electricity through solar power into their homes. Using apps on their phones and equipment they can buy at the store, they can power their homes for less than $60 per year. Several places have already started using solar power — it provides electricity to areas that desperately need it, creates jobs, and furthers research and education.

    An education is one of life’s most precious acquisitions. But without the resources needed to teach and learn, knowledge-creation stalls.

    Solar power is brightening the future of science and research in Africa.

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 3:27 pm on June 25, 2017 Permalink | Reply
    Tags: Science Node, TACC Lonestar supercomputer, TACC Stampede supercomputer

    From Science Node: “Computer simulations and big data advance cancer immunotherapy” 

    Science Node bloc
    Science Node

    09 Jun, 2017 [Where has this been?]
    Aaron Dubrow

    Courtesy National Institute of Allergy and Infectious Diseases.

    Supercomputers help classify immune response, design clinical trials, and analyze immune repertoire data.

    The body has a natural way of fighting cancer – it’s called the immune system, and it is tuned to defend our cells against outside infections and internal disorder. But occasionally, it needs a helping hand.

    In recent decades, immunotherapy has become an important tool in treating a wide range of cancers, including breast cancer, melanoma and leukemia.

    But alongside its successes, scientists have discovered that immunotherapy sometimes has powerful — even fatal — side-effects.

    Identifying patient-specific immune treatments

    Not every immune therapy works the same on every patient. Differences in an individual’s immune system may mean one treatment is more appropriate than another. Furthermore, tweaking one’s immune system might heighten the efficacy of certain treatments.

    Scanning electron micrograph of a human T lymphocyte (also called a T cell) from the immune system of a healthy donor. Immunotherapy fights cancer by supercharging the immune system’s natural defenses (including T cells) or contributing additional immune elements that can help the body kill cancer cells. [Credit: NIAID]

    Researchers from Wake Forest School of Medicine and Zhejiang University in China developed a novel mathematical model to explore the interactions between prostate tumors and common immunotherapy approaches, individually and in combination.

    In a study published in Nature Scientific Reports, they used their model to predict how prostate cancer would react to four common immunotherapies.

    The researchers incorporated data from animal studies into their complex mathematical models and simulated tumor responses to the treatments using the Stampede supercomputer at the Texas Advanced Computing Center (TACC).

    Dell Poweredge U Texas Austin Stampede Supercomputer. Texas Advanced Computer Center 9.6 PF

    “We do a lot of modeling which relies on millions of simulations,” says Jing Su, a researcher at the Center for Bioinformatics and Systems Biology at Wake Forest School of Medicine and assistant professor in the Department of Diagnostic Radiology.

    “To get a reliable result, we have to repeat each computation at least 100 times. We want to explore the combinations and effects and different conditions and their results.”

    TACC’s high performance computing resources allowed the researchers to highlight a potential therapeutic strategy that may manage prostate tumor growth more effectively.

    Designing more efficient clinical trials

    Biological agents used in immunotherapy — including those that target a specific tumor pathway, aim for DNA repair, or stimulate the immune system to attack a tumor — function differently from radiation and chemotherapy.

    Because traditional dose-finding designs are not suitable for trials of biological agents, novel designs that consider both the toxicity and efficacy of these agents are imperative.

    Chunyan Cai, assistant professor of biostatistics at UT Health Science Center (UTHSC)’s McGovern Medical School, uses TACC systems to design new kinds of dose-finding trials for combinations of immunotherapies.

    Writing in the Journal of the Royal Statistical Society: Series C (Applied Statistics), Cai and her collaborators Ying Yuan and Yuan Ji described efforts to identify biologically optimal dose combinations for agents that target the PI3K/AKT/mTOR signaling pathway, which has been associated with several genetic aberrations related to the promotion of cancer.

    After 2,000 simulations on the Lonestar supercomputer for each of six proposed dose-finding designs, they discovered the optimal combination gives higher priority to trying new doses in the early stage of the trial.

    TACC Lonestar Cray XC40 supercomputer

    The best design also assigns patients, toward the end of the trial, to the most effective dose that is still safe.

    “Extensive simulation studies show that the design proposed has desirable operating characteristics in identifying the biologically optimal dose combination under various patterns of dose–toxicity and dose–efficacy relationships,” Cai concludes.

    Whether in support of population-level immune response studies, clinical dosing trials, or community-wide efforts, TACC’s advanced computing resources are helping scientists put the immune system to work to better fight cancer.

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 12:35 pm on June 10, 2017 Permalink | Reply
    Tags: Irish Centre for High-End Computing, PRACE, Science Node, Sinéad Ryan

    From Science Node: Women in STEM - “A day in the life of an Irish particle physicist” Sinéad Ryan

    Science Node bloc

    Science Node

    02 Jun, 2017
    Tristan Fitzpatrick

    Sinéad Ryan is a quantum chromodynamics expert in Dublin. She relies on PRACE HPC resources to calculate the mass of quarks, gluons, and hadrons — and uncover the secrets of the universe.

    Uncovering the mysteries of the cosmos is just another day in the office for Sinéad Ryan.

    Ryan, professor of theoretical high energy physics at Trinity College Dublin, specializes in quantum chromodynamics (QCD). The field examines how quarks and gluons form hadrons, the fundamental starting point of our universe.

    “Quarks and gluons are the building blocks for everything in the world around us and for our universe,” says Ryan. “The question is, how do these form the matter that we see around us?”

    To answer this, Ryan performs numerical simulations on high-performance computing (HPC) resources managed by the Partnership for Advanced Computing in Europe (PRACE).

    “I think PRACE is crucial for our field,” says Ryan, “and I’m sure other people would tell you the same thing.”

    When quarks are pulled apart, energy grows between them, similar to the tension in a rubber band when it is stretched. Eventually, enough energy is produced to create more quarks which then form hadrons in accordance with Einstein’s equation E = mc².

    The problem, according to Ryan, comes in solving the equations of QCD. PRACE’s HPC resources make Ryan’s work possible because they enable her to run simulations on a larger scale than simple pen and paper would allow.

    “It’s a huge dimensional integral to solve, and we’re talking about solving a million times a million matrices that we must invert,” says Ryan.

    “This is where HPC comes in. If you want to make predictions in the theory, you need to be able to do the simulations numerically.”
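
    Written schematically (and glossing over many technical details), the “huge dimensional integral” Ryan refers to is the lattice QCD path integral, estimated by Monte Carlo sampling of gauge-field configurations U; the enormous matrices that must be inverted are the lattice Dirac operator D[U] evaluated on each sampled configuration:

    \langle \mathcal{O} \rangle \;=\; \frac{1}{Z} \int \mathcal{D}U \; \mathcal{O}[U]\, \big(\det D[U]\big)^{N_f}\, e^{-S_g[U]}

    where S_g is the gauge action, N_f the number of quark flavours, and Z the same integral without the observable. On a finite lattice the integral has millions of dimensions, which is why HPC resources such as those provided through PRACE are indispensable.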

    In Ireland, the Irish Centre for High-End Computing is one resource Ryan has tapped in her research, but PRACE enables her and her collaborators to access resources not just locally but across the world.

    IITAC IBM supercomputer

    “This sort of work tends to be very collaborative and international,” says Ryan. “We can apply through PRACE for time on HPC machines throughout Europe. In my field, any machine anywhere is fair game.”

    Besides providing resources, PRACE also determines whether HPC resources are suitable for the kinds of research questions scientists are interested in answering.

    “PRACE’s access to these facilities means that good science gets done on these machines,” says Ryan. “These are computations that are based around fundamental questions posed by people who have a track record for doing good science and asking the right questions. I think that’s crucial.”

    Without PRACE’s support, Ryan’s work examining how quarks and gluons form matter and the beginnings of our universe would be greatly diminished, leaving us a step further from uncovering the building blocks of the universe.

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     