Tagged: Physics

  • richardmitnick 1:34 pm on October 19, 2017 Permalink | Reply
    Tags: LTEM - Laser terahertz emission microscopy, Physics, This new technique enables measurements down to a resolution of 20 nanometers

    From Brown: “Terahertz spectroscopy goes nano” 

    Brown University


    Going nano
    Researchers have improved the resolution of terahertz spectroscopy by 1,000 times, making the technique useful at the nanoscale.
    Mittleman Lab / Brown University

    Brown University researchers have demonstrated a way to bring a powerful form of spectroscopy — a technique used to study a wide variety of materials — into the nano-world.

    Laser terahertz emission microscopy (LTEM) is a burgeoning means of characterizing the performance of solar cells, integrated circuits and other systems and materials. Laser pulses illuminating a sample material cause the emission of terahertz radiation, which carries important information about the sample’s electrical properties.

    “This is a well-known tool for studying essentially any material that absorbs light, but it’s never been possible to use it at the nanoscale,” said Daniel Mittleman, a professor in Brown’s School of Engineering and corresponding author of a paper describing the work. “Our work has improved the resolution of the technique so it can be used to characterize individual nanostructures.”

    Typically, LTEM measurements are performed with a resolution of a few tens of microns. This new technique achieves a resolution of 20 nanometers, roughly 1,000 times finer than was previously possible with traditional LTEM.

    The research, published in the journal ACS Photonics, was led by Pernille Klarskov, a postdoctoral researcher in Mittleman’s lab, with Hyewon Kim and Vicki Colvin from Brown’s Department of Chemistry.

    For their research, the team adapted for terahertz radiation a technique already used to improve the resolution of infrared microscopes. The technique uses a metal pin, tapered down to a sharpened tip only a few tens of nanometers across, that hovers just above a sample to be imaged. When the sample is illuminated, a tiny portion of the light is captured directly beneath the tip, which enables imaging resolution roughly equal to the size of the tip. By moving the tip around, it’s possible to create ultra-high resolution images of an entire sample.
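The scanning scheme described above can be illustrated with a toy simulation. Everything here is hypothetical (a 1D "sample" with a 20-nanometer gold nanorod, one value per nanometer); it is not the Brown group's analysis, only a sketch of why the achievable resolution tracks the tip size:

```python
def raster_scan(sample, tip_nm, step_nm):
    """Simulate tip-based near-field imaging: at each tip position the
    measured signal averages the sample response over a window roughly
    the size of the tip apex, so resolution ~ tip size."""
    image = []
    for pos in range(0, len(sample) - tip_nm + 1, step_nm):
        window = sample[pos:pos + tip_nm]
        image.append(sum(window) / len(window))
    return image

# Hypothetical 1D sample: a 20 nm nanorod (response 1.0) on a substrate (0.0)
sample = [0.0] * 200 + [1.0] * 20 + [0.0] * 200

image = raster_scan(sample, tip_nm=20, step_nm=5)    # 20 nm tip resolves the rod
blurred = raster_scan(sample, tip_nm=50, step_nm=5)  # bigger tip smears it out
print(max(image), max(blurred))
```

With a 20 nm tip the nanorod appears at full contrast; with a 50 nm tip its peak signal drops to 0.4, because the rod only ever fills part of the measurement window.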

    Klarskov was able to show that the same technique could be used to increase the resolution of terahertz emission as well. For their study, she and her colleagues were able to image an individual gold nanorod with 20-nanometer resolution using terahertz emission.

    The researchers believe their new technique could be broadly useful in characterizing the electrical properties of materials in unprecedented detail.

    “Terahertz emission has been used to study lots of different materials — semiconductors, superconductors, wide-band-gap insulators, integrated circuits and others,” Mittleman said. “Being able to do this down to the level of individual nanostructures is a big deal.”

    One example of a research area that could benefit from the technique, Mittleman says, is the characterization of perovskite solar cells, an emerging solar technology studied extensively by Mittleman’s colleagues at Brown.

    “One of the issues with perovskites is that they’re made of multi-crystalline grains, and the grain boundaries are what limits the transport of charge across a cell,” Mittleman said. “With the resolution we can achieve, we can map out each grain to see if different arrangements or orientations have an influence on charge mobility, which could help in optimizing the cells.”

    That’s one example of where this could be useful, Mittleman said, but it’s certainly not limited to that.

    “This could have fairly broad applications,” he noted.

    The research was supported by the National Science Foundation, the Danish Council for Independent Research and by Honeywell Federal Manufacturing & Technologies.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    Welcome to Brown

    Brown U Robinson Hall
    Located in historic Providence, Rhode Island and founded in 1764, Brown University is the seventh-oldest college in the United States. Brown is an independent, coeducational Ivy League institution comprising undergraduate and graduate programs, plus the Alpert Medical School, School of Public Health, School of Engineering, and the School of Professional Studies.

    With its talented and motivated student body and accomplished faculty, Brown is a leading research university that maintains a particular commitment to exceptional undergraduate instruction.

    Brown’s vibrant, diverse community consists of 6,000 undergraduates, 2,000 graduate students, 400 medical school students, more than 5,000 summer, visiting and online students, and nearly 700 faculty members. Brown students come from all 50 states and more than 100 countries.

    Undergraduates pursue bachelor’s degrees in more than 70 concentrations, ranging from Egyptology to cognitive neuroscience. Anything’s possible at Brown—the university’s commitment to undergraduate freedom means students must take responsibility as architects of their courses of study.

  • richardmitnick 10:09 am on October 18, 2017 Permalink | Reply
    Tags: Nigel Goldenfeld, Physics

    From Quanta: “Seeing Emergent Physics Behind Evolution” 

    Quanta Magazine
    Quanta Magazine

    August 31, 2017
    Jordana Cepelewicz

    Nigel Goldenfeld, director of the NASA Astrobiology Institute for Universal Biology, spends little time in his office in the physics department at the University of Illinois, Urbana-Champaign. But even when working on biology studies, he applies to them the informative principles of condensed matter physics and emergent states. Seth Lowe for Quanta Magazine

    The physicist Nigel Goldenfeld hates biology — “at least the way it was presented to me” when he was in school, he said. “It seemed to be a disconnected collection of facts. There was very little quantitation.” That sentiment may come as a surprise to anyone who glances over the myriad projects Goldenfeld’s lab is working on. He and his colleagues monitor the individual and swarm behaviors of honeybees, analyze biofilms, watch genes jump, assess diversity in ecosystems and probe the ecology of microbiomes. Goldenfeld himself is director of the NASA Astrobiology Institute for Universal Biology, and he spends most of his time not in the physics department at the University of Illinois but in his biology lab on the Urbana-Champaign campus.

    Goldenfeld is one in a long list of physicists who have sought to make headway on questions in biology: In the 1930s Max Delbrück transformed the understanding of viruses; later, Erwin Schrödinger published What is Life? The Physical Aspect of the Living Cell; Francis Crick, a pioneer of X-ray crystallography, helped discover the structure of DNA. Goldenfeld wants to make use of his expertise in condensed matter theory, in which he models how patterns in dynamic physical systems evolve over time, to better understand diverse phenomena including turbulence, phase transitions, geological formations and financial markets. His interest in emergent states of matter has compelled him to explore one of biology’s greatest mysteries: the origins of life itself. And he’s only branched out from there. “Physicists can ask questions in a different way,” Goldenfeld said. “My motivation has always been to look for areas in biology where that kind of approach would be valued. But to be successful, you have to work with biologists and essentially become one yourself. You need both physics and biology.”

    Quanta Magazine recently spoke with Goldenfeld about collective phenomena, expanding the Modern Synthesis model of evolution, and using quantitative and theoretical tools from physics to gain insights into mysteries surrounding early life on Earth and the interactions between cyanobacteria and predatory viruses. A condensed and edited version of that conversation follows.

    Physics has an underlying conceptual framework, while biology does not. Are you trying to get at a universal theory of biology?

    God, no. There’s no unified theory of biology. Evolution is the nearest thing you’re going to get to that. Biology is a product of evolution; there aren’t exceptions to the fact that life and its diversity came from evolution. You really have to understand evolution as a process to understand biology.

    So how can collective effects in physics inform our understanding of evolution?

    When you think about evolution, you typically tend to think about population genetics, the frequency of genes in a population. But if you look to the Last Universal Common Ancestor — the organism ancestral to all others, which we can trace through phylogenetics [the study of evolutionary relationships] — that’s not the beginning of life. There was definitely simpler life before that — life that didn’t even have genes, when there were no species. So we know that evolution is a much broader phenomenon than just population genetics.

    The Last Universal Common Ancestor is dated to be about 3.8 billion years ago. The Earth is 4.6 billion years old. Life went from zero to essentially the complexity of the modern cell in less than a billion years. In fact, probably a lot less: Since then, relatively little has happened in terms of the evolution of cellular architecture. So evolution was slow for the last 3.5 billion years, but very fast initially.

    Why did life evolve so fast?

    [The late biophysicist] Carl Woese and I felt that it was because it evolved in a different way. The way life evolves in the present era is through vertical descent: You give your genes to your children, they give their genes to your grandchildren, and so on. Horizontal gene transfer gives genes to an organism that’s not related to you. It happens today in bacteria and other organisms, with genes that aren’t really so essential to the structure of the cell. Genes that give you resistance to antibiotics, for example — that’s why bacteria evolve defenses against drugs so quickly. But in the earlier phase of life, even the core machinery of the cell was transmitted horizontally. Life early on would have been a collective state, more of a community held together by gene exchange than simply the sum of a collection of individuals. There are many other well-known examples of collective states: for example, a bee colony or a flock of birds, where the collective seems to have its own identity and behavior, arising from the constituents and the ways that they communicate and respond to each other. Early life communicated through gene transfer.

    How do you know?

    Life could only have evolved as rapidly and optimally as it did if we assume this early network effect, rather than a [family] tree. We discovered about 10 years ago that this was the case with the genetic code, the rules that tell the cell which amino acids to use to make protein. Every organism on the planet has the same genetic code, with very minor perturbations. In the 1960s Carl was the first to have the idea that the genetic code we have is about as good as it could possibly be for minimizing errors. Even if you get the wrong amino acid — through a mutation, or because the cell’s translational machinery made a mistake — the genetic code specifies an amino acid that’s probably similar to the one you should have gotten. In that way, you’ve still got a chance that the protein you make will function, so the organism won’t die. David Haig [at Harvard University] and Laurence Hurst [at the University of Bath] were the first to show that this idea could be made quantitative through Monte Carlo simulation — they looked for which genetic code is most resilient against these kinds of errors. And the answer is: the one that we have. It’s really amazing, and not as well known as it should be.
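The kind of Monte Carlo comparison described here can be sketched in a toy form. The amino-acid "property" values and the cost function below are invented stand-ins (real analyses such as Haig and Hurst's used measured polarity scales and ranked the canonical code against random alternatives); the sketch only shows how candidate codes can be scored by resilience to single-nucleotide errors:

```python
import random
from itertools import product

BASES = "ACGU"
CODONS = ["".join(c) for c in product(BASES, repeat=3)]  # all 64 codons
AMINO_ACIDS = list(range(20))  # amino acids as indices into a property table

def random_code(rng):
    """A random genetic code: each codon -> one of 20 amino acids,
    with every amino acid guaranteed to appear at least once."""
    code = {c: rng.randrange(20) for c in CODONS}
    for aa, codon in zip(AMINO_ACIDS, rng.sample(CODONS, 20)):
        code[codon] = aa
    return code

def neighbors(codon):
    """All codons reachable by a single point mutation."""
    for i, b in enumerate(codon):
        for nb in BASES:
            if nb != b:
                yield codon[:i] + nb + codon[i + 1:]

def error_cost(code, prop):
    """Mean squared change in amino-acid 'property' over all
    single-nucleotide mutations: lower = more error-resilient."""
    total, n = 0.0, 0
    for c in CODONS:
        for m in neighbors(c):
            d = prop[code[c]] - prop[code[m]]
            total += d * d
            n += 1
    return total / n

rng = random.Random(0)
prop = [rng.uniform(0, 1) for _ in range(20)]  # hypothetical property values
costs = [error_cost(random_code(rng), prop) for _ in range(200)]
print(min(costs), sum(costs) / len(costs), max(costs))
```

Ranking many sampled codes this way is the essence of the simulation: the striking empirical finding is that the code life actually uses sits at the low-cost extreme of such a distribution.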

    Later, Carl and I, together with Kalin Vetsigian [at the University of Wisconsin-Madison], did a digital life simulation of communities of organisms with many synthetic, hypothetical genetic codes. We made computer virus models that mimicked living systems: They had a genome, expressed proteins, could replicate, experienced selection, and their fitness was a function of the proteins that they had. We found that it was not just their genomes that evolved. Their genetic code evolved, too. If you just have vertical evolution [between generations], the genetic code never becomes unique or optimal. But if you have this collective network effect, then the genetic code evolves rapidly and to a unique, optimal state, as we observe today.
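A drastically simplified sketch can illustrate the bare mechanism at work. This is not the Vetsigian-Woese-Goldenfeld model (whose digital organisms had genomes, proteins and selection, which is what drives the code to an *optimal* state); it shows only the homogenizing effect of horizontal transfer: with it, independent lineages converge toward a shared code, while vertical-only lineages stay essentially uncorrelated:

```python
import random

def run(lineages, steps, hgt_rate, rng):
    """Toy model: each lineage carries a 'genetic code' (64 codon ->
    amino-acid assignments). Vertical descent just mutates a lineage's
    own code; horizontal transfer also copies single assignments
    between lineages, pulling them toward a shared code."""
    codes = [[rng.randrange(20) for _ in range(64)] for _ in range(lineages)]
    for _ in range(steps):
        # vertical: a random lineage mutates one assignment
        l = rng.randrange(lineages)
        codes[l][rng.randrange(64)] = rng.randrange(20)
        # horizontal: with probability hgt_rate, one lineage adopts
        # an assignment from an unrelated lineage
        if rng.random() < hgt_rate:
            donor, recipient = rng.sample(range(lineages), 2)
            i = rng.randrange(64)
            codes[recipient][i] = codes[donor][i]
    return codes

def disagreement(codes):
    """Average fraction of positions at which two lineages' codes differ."""
    n, L = len(codes), 64
    pairs = [(a, b) for a in range(n) for b in range(a + 1, n)]
    return sum(codes[a][i] != codes[b][i]
               for a, b in pairs for i in range(L)) / (len(pairs) * L)

rng = random.Random(42)
no_hgt = disagreement(run(5, 20000, 0.0, rng))
with_hgt = disagreement(run(5, 20000, 0.9, rng))
print(no_hgt, with_hgt)
```

Without transfer the lineages' codes remain close to the ~95% disagreement expected of independent random codes; turning transfer on pulls the disagreement well below that, the network effect the interview describes.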

    So those findings, and the questions about how life could get this error-minimizing genetic code so quickly, suggest that we should see signatures of horizontal gene transfer earlier than the Last Universal Common Ancestor, for example. Sure enough, some of the enzymes that are associated with the cell’s translation machineries and gene expression show strong evidence of early horizontal gene transfers.

    How have you been able to build on those findings?

    Tommaso Biancalani [now at the Massachusetts Institute of Technology] and I discovered in the last year or so — and our paper on this has been accepted for publication — that life automatically shuts off the horizontal gene transfer once it has evolved enough complexity. When we simulate it, it basically shuts itself off on its own. It’s still trying to do horizontal gene transfer, but almost nothing sticks. Then the only evolutionary mechanism that dominates is vertical evolution, which was always present. We’re now trying to do experiments to see whether all the core cellular machinery has gone through this transition from horizontal to vertical transmission.

    Is this understanding of early evolution why you’ve said that we need a new way to talk about biology?

    People tend to think about evolution as being synonymous with population genetics. I think that’s fine, as far as it goes. But it doesn’t go far enough. Evolution was going on before genes even existed, and that can’t possibly be explained by the statistical models of population genetics alone. There are collective modes of evolution that one needs to take seriously, too. Processes like horizontal gene transfer, for example.

    It’s in that sense that I think our view of evolution as a process needs to be expanded — by thinking about dynamical systems, and how it is possible that systems capable of evolving and reproducing can exist at all. If you think about the physical world, it is not at all obvious why you don’t just make more dead stuff. Why does a planet have the capability to sustain life? Why does life even occur? The dynamics of evolution should be able to address that question. Remarkably, we don’t have an idea even in principle of how to address that question — which, given that life started as something physical and not biological, is fundamentally a physics question.

    How does your work on cyanobacteria fit into these applications of condensed matter theory?

    My graduate student Hong-Yan Shih and I modeled the ecosystem of an organism called Prochlorococcus, a type of cyanobacteria that lives in the ocean and gets its energy through photosynthesis. I think it may well be the most numerous cellular organism on the planet. There are viruses, called phages, that prey on the bacteria. Ten years or so ago, it was discovered that these phages have photosynthesis genes, too. Now, you normally wouldn’t think of a virus as needing to do photosynthesis. So why are they carrying these genes around?

    It seems that the bacteria and phages don’t quite behave as the dynamics of a predator-prey ecosystem would predict. The bacteria actually benefit from the phages. In fact, the bacteria could prevent the phages from attacking them in many ways, but they don’t, not entirely. The phages’ photosynthesis genes originally came from the bacteria — and, amazingly, the phages then transferred them back to the bacteria. Photosynthesis genes have shuttled back and forth between the bacteria and the phages several times over the last 150 million years.

    It turns out that genes evolve much more rapidly in the viruses than they do in the bacteria, because the replication process for the viruses is much shorter and more likely to make mistakes. As a side effect of the phages’ predation on the bacteria, bacterial genes sometimes get transferred into the viruses, where they can spread, evolve quickly and then be given back to the bacteria, which can then reap the benefits. So the phages have been useful to the bacteria. For example, there are two strains of Prochlorococcus, which live at different depths. One of those ecotypes adapted to live closer to the surface, where the light is much more intense and has a different frequency. That adaptation could occur because the viruses made rapid evolution available.

    And the viruses benefit from the genes, too. When a virus infects its host and replicates, the number of new viruses it makes depends on how long the hijacked cell can survive. If the virus carries with it a life-support system — the photosynthesis genes — it can keep the cell alive longer to make more copies of the virus. The virus that carries the photosynthesis genes has a competitive advantage over one that doesn’t. There’s a selection pressure on the viruses to carry genes that benefit the host. You’d expect that because the viruses have such a high mutation rate, their genes would deteriorate rapidly. But in the calculations that we’ve done, we’ve found that the bacteria filter the good genes and transfer them to the viruses.

    So there’s a nice story here: a collective behavior between the bacteria and the viruses that mimics the kind of things that happen in condensed matter systems — and that we can model, so that we can predict features of the system.
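The baseline such models start from can be sketched with the textbook Lotka-Volterra predator-prey system (this is the generic ecosystem dynamics the bacteria and phages are said to *deviate* from, not the Goldenfeld-Shih model itself; all parameter values are hypothetical):

```python
def lotka_volterra(b0, p0, r, a, eff, m, dt, steps):
    """Euler integration of classic predator-prey dynamics:
    bacteria (B) grow at rate r and are killed by phage at rate a*B*P;
    phage (P) grow from infections with efficiency eff and decay at rate m."""
    B, P = b0, p0
    traj = [(B, P)]
    for _ in range(steps):
        dB = r * B - a * B * P
        dP = eff * a * B * P - m * P
        B += dB * dt
        P += dP * dt
        traj.append((B, P))
    return traj

# Hypothetical parameters; equilibrium is B* = m/(eff*a) = 30, P* = r/a = 5
traj = lotka_volterra(b0=40.0, p0=9.0, r=0.1, a=0.02, eff=0.5,
                      m=0.3, dt=0.01, steps=20000)
Bs = [b for b, p in traj]
Ps = [p for b, p in traj]
```

The populations cycle around the equilibrium rather than settling: predation knocks the bacteria down, the phages then starve, and the bacteria recover. Gene shuttling of the kind described above adds couplings on top of these bare dynamics.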

    Seth Lowe for Quanta Magazine

    We’ve been talking about a physics-based approach to biology. Have you encountered the reverse, where the biology has informed the physics?

    Yes. I work on turbulence. When I go home at night, that’s what I lie awake thinking about. In a paper published last year in Nature Physics, Hong-Yan Shih, Tsung-Lin Hsieh and I wanted to better understand how a fluid in a pipe goes from being laminar, where it flows smoothly and predictably, to turbulent, where its behavior is unpredictable, irregular and stochastic. We discovered that very close to the transition, turbulence behaves kind of like an ecosystem. There’s a particular dynamical mode of the fluid flow that’s like a predator: It tries to “eat” the turbulence, and the interplay between this mode and the emerging turbulence gives rise to some of the phenomena that you see as the fluid becomes turbulent. Ultimately, our work predicts that a certain type of phase transition happens in fluids, and indeed that’s what the experiments show. Because the physics problem turned out to be mappable onto this biology problem — the ecology of predator and prey — Hong-Yan and I knew how to simulate and model the system and reproduce what people see in experiments. Knowing the biology actually helped us understand the physics.

    What are the limitations to a physics-based approach to biology?

    On one hand, there is a danger of replicating only what is known, so that you can’t make any new predictions. On the other, sometimes your abstraction or minimal representation is oversimplified, and then you’ve lost something in the process.

    You can’t think too theoretically. You have to roll up your sleeves and learn the biology, be closely tied with real experimental phenomena and real data. That’s why our work is done in collaboration with experimentalists: With experimentalist colleagues, I’ve collected microbes from the hot springs of Yellowstone National Park, watched jumping genes in real time in living cells, sequenced the gastrointestinal microbiome of vertebrates. Every day you’ll find me working in the Institute for Genomic Biology, even though my home department is physics.

    See the full article here.


    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

  • richardmitnick 2:17 pm on October 17, 2017 Permalink | Reply
    Tags: HBOOK and PAW and ROOT and GEANT, Physics, René Brun

    From ALICE at CERN: “40 Years of Large Scale Data Analysis in HEP: Interview with René Brun” 

    CERN New Masthead

    16 October 2017
    Virginia Greco

    Over a 40-year career at CERN, René Brun developed a number of software packages that became widely used in high-energy physics. For these fundamental contributions he was recently awarded a special prize of the EPS High Energy Particle Physics Division. We talked with him about the key events of this (hi)story.

    René Brun giving a seminar at CERN (on October 4, 2017) about “40 Years of Large Scale Data Analysis in HEP – the HBOOK, Paw and Root Story”. [Credit: Virginia Greco]

    It is hard to imagine that one and the same person can be behind many of the most important and most widely used software packages developed at CERN and in high-energy physics: HBOOK, PAW, ROOT and GEANT. This passionate and visionary person is René Brun, now an honorary member of CERN, who was recently awarded a special prize of the EPS High Energy Particle Physics Division “for his outstanding and original contributions to the software tools for data management, detector simulation, and analysis that have shaped particle and high energy physics experiments for many decades”. Over his 40-year career at CERN he worked with various brilliant scientists, and we cannot forget that the realization of such endeavors is always the product of a collaborative effort. Nevertheless, René has had the undoubted merit of conceiving new ideas, proposing projects, and working hard and enthusiastically to turn them into reality.

    One of his creations, ROOT, is a data analysis tool widely used in high-energy and nuclear physics experiments, at CERN and in other laboratories. It has already spread beyond the bounds of physics and is now being applied in other scientific fields and even in finance. GEANT, another extremely successful package developed by René Brun, makes it possible to simulate physics experiments and particle interactions in detectors. Its latest version, GEANT4, is currently the first choice of particle physicists dealing with detector simulations.

    But before ROOT and GEANT4, which are very well known even among the youngest physicists, many other projects had been proposed and many software tools developed. It is a fascinating story, which René was invited to tell in a recent colloquium organized at CERN by the EP department.

    As he recounts, it all started in 1973, when he was hired into the Data Handling (DD) division at CERN to work with Carlo Rubbia on the R602 experiment at the ISR. His duty was to help develop a special hardware processor for the online reconstruction of collision patterns. But since this development was moving slowly and was not occupying much of his work time, René was asked to write some software for event reconstruction in multiwire proportional chambers. “At that time, I hated software,” René confesses, smiling. “I had written software during my PhD thesis, while studying in Clermont-Ferrand and working at CERN during the weekends, and I hadn’t really enjoyed it. I had joined Rubbia’s group with the ‘promise’ that I would work on hardware, but very quickly I became a software guy again…”

    In a short time, René implemented in software (programming in FORTRAN IV) what they could not realize in hardware and, in addition, he developed a histogram package called HBOOK. It supported a very basic analysis of the data: creating histograms, filling them and sending the output to a line printer. He also wrote a program called HPLOT, specialized in drawing histograms generated by HBOOK.

    At that time, there were no graphic devices, so the only way to visualize histograms was printing them using a line printer, and programs were written in the form of punched cards.
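The workflow HBOOK supported — book a histogram, fill it event by event, render it on a line printer — can be sketched in a few lines of modern Python. The function names here are hypothetical stand-ins, not HBOOK's actual FORTRAN entry points:

```python
import random

def fill(hist, x, lo, hi):
    """HBOOK-style fill: increment the bin containing x
    (values outside [lo, hi) are silently dropped)."""
    nbins = len(hist)
    if lo <= x < hi:
        hist[int((x - lo) / (hi - lo) * nbins)] += 1

def lineprinter_plot(hist, lo, hi, width=50):
    """Render the histogram as rows of '*', one row per bin,
    the way a 1970s line printer would print it."""
    peak = max(hist) or 1
    step = (hi - lo) / len(hist)
    lines = []
    for i, n in enumerate(hist):
        bar = "*" * round(n / peak * width)
        lines.append(f"{lo + i * step:8.2f} |{bar} {n}")
    return "\n".join(lines)

rng = random.Random(7)
h = [0] * 10                       # "book" a 10-bin histogram on [0, 1)
for _ in range(1000):              # fill it event by event
    fill(h, rng.gauss(0.5, 0.15), 0.0, 1.0)
print(lineprinter_plot(h, 0.0, 1.0))
```

The asterisk rows stand in for the only graphics available at the time: a printed page per histogram.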

    René remembers with affection the time spent punching cards, not for the procedure itself, which was slow and quite tedious, but for the long chats he used to have in the room where the card punchers and printers of the DD department were located, as well as in the cafeteria nearby. In those long hours, he could discuss ideas and comment on new technologies with colleagues.

    A huge step forward came with the introduction of the teletype, which replaced the card punchers. Users could generate programs on a disk file and communicate with a central machine, called FOCUS, while at the same time seeing on a roll of paper what they were typing, as on a normal typewriter. “The way it worked can make people smile today,” René recounts. “To log in to FOCUS, one had to type a command that caused a red light to flash in the computer centre. Seeing the light, the operator would mount into the machine’s memory the tape of the connected person, who could thus run a session on the disk. When the user logged out, the session was again dumped to tape. You can imagine the traffic! But this was still much faster than punching cards.”

    Some time later the teletype was in turn replaced by a Tektronix 4010 terminal, which brought a big revolution, since it made it possible to display results in graphic form. This new, very expensive device allowed René to speed up the development of his software: HBOOK first, then another package called ZBOOK, and the first version of GEANT. Created in 1974 with his colleagues in the Electronic Experiments (EE) group, GEANT1 was a tool for performing simple detector simulations. Gradually they added features to this software and became able to generate collision simulations: GEANT2 was born.

    In 1975 René joined the NA4 experiment, a deep inelastic muon scattering experiment in the North Area, led by Carlo Rubbia. There he collaborated on the development of new graphic tools that could print histograms on a device called a CalComp plotter. This machine, which worked with a 10-meter-long roll of paper, offered much better resolution than line printers, but was very expensive. In 1979 a microfilm system was introduced: histograms saved on film could be inspected before being sent to the plotter, so that only the interesting ones were printed. This reduced the expense of using the CalComp.

    René was then supposed to follow Rubbia in the UA1 experiment, for which he had been doing many simulations – “Without knowing that I was simulating for UA1,” René highlights. But instead, at the end of 1980, he joined the OPAL experiment, where he performed all the simulations and created GEANT3.

    While working on the HBOOK system, in 1974 René had developed a memory management and I/O system called ZBOOK. This tool was an alternative to the HYDRA system, which was being developed in the bubble chambers group by the late Julius Zoll (also author of another management system called Patchy).

    Thinking it was pointless to have two competing systems, in 1981 the late Emilio Pagiola proposed the development of a new software package called GEM. While three people were working hard on the GEM project, René and Julius together began running benchmarks comparing their systems, ZBOOK and HYDRA, with GEM. These tests led them to the conclusion that the new system was far slower than theirs.

    In 1983 Ian Butterworth, the then Director for Computing, decided that only the ZBOOK system would be supported at CERN, that GEM had to be stopped, and that HYDRA would be frozen. “My group leader, Hans Grote, came to my office, shook my hand and told me: ‘Congratulations René, you won.’ But I immediately thought that this decision was not fair, because both systems actually had good features and Julius Zoll was a great software developer.”

    As a consequence of this decision, René and Julius joined forces to develop a package integrating the best features of both ZBOOK and HYDRA. The new project was called ZEBRA, a combination of the names of the two original systems. “When Julius and I announced that we were collaborating, Ian Butterworth immediately called both of us to his office and told us that if, in six months, the ZEBRA system was not functioning, we would be fired from CERN. But indeed, less than two months later we were already able to show a running first version of the ZEBRA system.”

    At the same time, histogram and visualization tools were under development. René put together an interactive version of HBOOK and HPLOT, called HTV, which ran on Tektronix machines. But in 1982 the advent of personal workstations marked a revolution. The first personal workstation introduced in Europe, the Apollo, represented a leap in characteristics and performance: it was faster and had more memory and a better user interface than any previous device. “I was invited by the Apollo company to go to Boston and visit them,” René recounts. “When I first saw the Apollo workstation, I was shocked. I immediately realized that it could speed up our development by a factor of 10. I set to work, and I think that in just three days I adapted some 20,000 lines of code for it.”

    René’s work adapting HTV for the Apollo workstation attracted the interest of the late Rudy Böck, Luc Pape and Jean-Pierre Revol from the UA1 collaboration, who also suggested some improvements. In 1984 they drew up a proposal for a new package, based on HBOOK and ZEBRA, that they called PAW, for Physics Analysis Workstation.

    The PAW team: (from the left) René Brun, Pietro Zanarini, Olivier Couet (standing) and Carlo Vandoni.

    After a first period of uncertainties, the PAW project developed quickly and many new features were introduced, thanks also to the increasing memory space of the workstations. “At a certain point, the PAW software was growing so fast that we started to receive complaints from users who could not keep up with the development,” says René smiling. “Maybe we were a bit naïve, but certainly full of enthusiasm.”

    The programming language generally used for scientific computing was FORTRAN. In particular, at that time FORTRAN 77 (introduced in 1977) was widespread in the high-energy physics community and the main reason for its success was the fact that it was well structured and quite easy to learn. Besides, very efficient implementations of it were available on all the machines used at the time. As a consequence, when the new FORTRAN 90 appeared, it seemed obvious that it would replace FORTRAN 77 and that it would be as successful as the previous version. “I remember well the leader of the computing division, Paolo Zanella, saying: ‘I don’t know what the next programming language will do but I know its name: FORTRAN.’”

    In 1990 and 1991 René, together with Mike Metcalf, a great expert in FORTRAN, worked hard to adapt the ZEBRA package to FORTRAN 90. But this effort did not lead to a satisfactory result, and discussions arose about whether to keep working in FORTRAN or to move to another language. It was the period when object-oriented programming was taking its first steps, and also when Tim Berners-Lee joined René’s group.

    Berners-Lee was supposed to develop a documentation system, called XFIND, to replace the previous FIND, which could run only on IBM machines, and make it usable on other devices. He believed, though, that the procedure he was supposed to implement was clumsy and certainly not the best approach to the problem. So he proposed a different solution with a more decentralized and adaptable approach, which first of all required standardization work. In this context, Berners-Lee developed the by-now-very-famous idea of World Wide Web servers and clients, implemented using an object-oriented language (Objective-C).

    It was a hectic period, because the design and simulation phase of the experiments for the new LHC accelerator had been launched. It was important to take a decision about the programming language and software tools to use in these new projects.

    At the Erice workshop, organized by INFN in November 1990, and then at the Computing in High Energy Physics (CHEP) conference in Annecy, France, in September 1992, the high-energy physics “software gurus” of the world gathered to discuss programming languages and possible directions for software in HEP. Among the many languages proposed were Eiffel, Prolog, Modula-2 and others.

    In 1994 two Research and Development (RD) projects were launched: RD44, with the objective of implementing in C++ a new version of GEANT (which would become GEANT4), and RD45, aiming to investigate object-oriented database solutions for the LHC experiments.

    According to René, his division was split into three camps: those who wanted to stay with FORTRAN 90, those who bet on C++ and those who were interested in using commercial products. “I presented a proposal to develop a package that would take PAW into the OO world. But the project, which I called ZOO, was rejected and I was even invited to take a sabbatical leave,” René admits.

    This blow, though, later proved to be a stroke of luck for René. His division leader, David Williams, suggested that he join the NA49 experiment in the North Area, which needed somebody to help develop its software. At first, he refused. He had been leading both the GEANT and the PAW projects for years, doing simulation and developing software for many different groups and applications, so going back to work on a single experiment seemed to him a big limitation.

    But he gave it a second thought and realized that it was an opportunity to take some time to develop new software, with total freedom. He went to visit the NA49 building on the Prevessin site and, seeing pine trees and squirrels from the windows, he felt that it was indeed the kind of quiet environment he needed for his new project. He therefore moved his workstation to the Prevessin site (“I did it during a weekend, without even telling David Williams”) and, while working for NA49, taught himself C++ by converting a large part of his HBOOK software into the new OO language.

    At the beginning of 1995, René was joined in NA49 by Fons Rademakers, with whom he had already collaborated. The two of them worked very hard for several months and produced the first version of what became the famous ROOT system. The name comes simply from combining the first letters of the founders’ email addresses (René and Rdm, for Rademakers), the double O of Object Oriented and the word Technology. But the meaning of the word ‘root’ also fitted well with its being a basic framework on which more software would be developed, and with the use of tree structures in its architecture.

    In November of the same year, René gave a seminar to present the ROOT system. “The Computing Division auditorium was unexpectedly crowded!” René recalls, “I think it was because people thought that Fons and I had disappeared from the software arena, while all of a sudden we were back again!” And actually the ROOT system generated considerable interest.

    But while René and Fons were completely absorbed by the work on their new software package, the RD45 project, which had the mandate to decide what new software should be adopted by the LHC experiments, had proposed using the commercial product “Objectivity”, and a lot of work was ongoing to develop applications around it to meet HEP needs. According to René, there was a clear intention to obstruct the development and diffusion of ROOT. In spring 1996 the CERN director for computing, Lorenzo Foa, declared that the ROOT project was considered a private initiative of NA49, not supported by the CERN management, and that the official line of development was the one around Objectivity.

    “I think that the LHC Computing Board didn’t have the right insight into the architecture of these software tools to be able to judge which solution was the best. Thus, they had to trust what they were told,” René comments. “It is always a problem when there is such a divide between the experts – and users – working on something and the people who have to take important decisions.”

    Nevertheless, René and Fons continued developing ROOT and implementing new features, taking advantage of the lessons learnt with the previous software packages (in particular the requests and criticisms of the users). In addition, they followed closely the development of the official line with Objectivity, in order to know what people using it were looking for and what the problems or difficulties were. “The more we looked into Objectivity, the more we realized it could not meet the needs of our community,” René adds, “we knew that the system would fail and that eventually people would realize it. This gave us even more energy and motivation to work hard and improve our product.”

    They had continuous support from the NA49 and ALICE collaborations, as well as from many people in ATLAS and CMS, who saw great potential in their software package. At the time, René was collaborating with many people in both experiments, including Fabiola Gianotti and Daniel Froidevaux, in particular on detector simulations. In addition, many users trusted them thanks to the relationship built up over many years through the user support of PAW and GEANT.

    Things started to change when interest in ROOT grew outside CERN. In 1998, the two Fermilab experiments, CDF and D0, decided to discuss the future of their software approach in view of the upcoming Run II of the Tevatron. They opened two calls for proposals of software solutions, one for data storage and one for data analysis and visualization. René submitted ROOT to both. The proposals were discussed during the CHEP conference in Chicago, and on the last day it was publicly announced that CDF and D0 would adopt ROOT. “I was not expecting it,” says René. “I remember that when the announcement was made, everybody turned and looked at me.” Soon after, the experiments at RHIC at Brookhaven National Laboratory took the same decision. The BaBar experiment at SLAC, after years spent attempting to use Objectivity, had realized that it was not as good a system as expected, so it moved to ROOT as well.

    Gradually, it became clear that the HEP community was ‘naturally’ moving towards ROOT, so the CERN management had to accept the situation and, eventually, support it. But this happened only in 2002. With more manpower allocated to the project, ROOT continued to develop fast and the number of users increased dramatically. It also started to spread to other branches of science and into the financial world. “In 2010, we had on average 12,000 downloads of the software package per month, and the ROOT website had more visitors than the CERN one.”

    The logo of the ROOT software package.

    René retired in 2012, but his two most important brainchildren, ROOT and GEANT, keep growing thanks to the work of many young scientists. “I think that it is essential to have a continuous stimulus that pushes you to improve your products and come up with new solutions. For this, the contribution of young people is very important,” comments René. But, as he admits, what really made him and his colleagues work hard for so many years is the fact that the software packages they were developing always had competitors and, in many cases, were challenged and even obstructed. “When you are contrasted, but you know you are right, you are condemned to succeed.”

    Close attention to users’ needs has also been very important, because it helped to shape the software and build a relationship of trust with people. “I have always said that you have to put the user support at the highest priority,” René explains. “If you reply to a request in 10 minutes you get 10 points, in one hour you get 2 points, and in one day you go already to -10 points. Answering questions and comments is fundamental, because if the users are satisfied with the support you give them, they are willing to trust what you propose next.”

    Now that he is retired, René still follows software development at CERN, but only as an external observer. This does not mean that he has set aside his scientific interests; on the contrary, he is now dedicating most of his energy to a more theoretical project, developing a physics model. In his spare time, he likes gardening. He loves flowers, but he cannot avoid looking at them with a scientific eye: “A colleague of mine, who is a mathematician, and I developed a mathematical model of the way flowers are structured and grow.”

    Brilliant minds are always at work.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Meet CERN in a variety of places:

    Cern Courier

    CERN/ATLAS detector


    CERN/CMS Detector




    CERN/LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

    Quantum Diaries

  • richardmitnick 1:59 pm on October 17, 2017 Permalink | Reply
    Tags: Eliane Epple, Physics

    From ALICE at CERN: Women in STEM – “Focus on Eliane Epple” 

    CERN New Masthead

    16 October 2017
    Virginia Greco

    Eliane Epple
    A postdoctoral researcher at Yale University, Eliane is working on an analysis involving hard scattering events that produce direct photons and has recently done her first shift as Run Manager for ALICE.

    When she started studying physics in Stuttgart, her hometown, Eliane Epple was already passionate about particle physics. But since it was not possible to specialize in this field at her university, after two years she moved to Munich and attended the Technical University of Munich (TUM). There, she followed courses for two more years before joining a research project led by Prof. Laura Fabbietti, who had just received a big grant and was starting her research group. The subject of Eliane’s Diploma thesis was the study of the interactions of kaons – and other particles containing strange quarks – with nuclear matter (protons and neutrons). More specifically, for her Diploma she analyzed the decay products of a resonance called Λ(1405), which some theories treat as a molecular bound state of an anti-kaon and a nucleon. It is in this sense a pre-stage of a kaonic nuclear cluster that she later studied during her PhD, still working with Prof. Fabbietti.

    In particular, Eliane and colleagues were investigating the possible existence of anti-kaonic bound states formed by, for example, two nucleons and one anti-kaon. Besides Fabbietti’s team, other groups all over the world were working on this topic, since a number of theoretical physicists had hypothesized that the attraction between nucleons and anti-kaons should be strong enough to give rise to this bound state, at least for a short time. “I analyzed data from the High Acceptance Di-Electron Spectrometer (HADES) at GSI.


    In particular, I looked for particles produced in p+p collisions that could originate from the decay of this anti-kaon–nucleon bound state,” explains Eliane. “It was a very controversial topic at the time, because there were groups that, analyzing a certain set of data, could see a signal compatible with the detection of such a bound state, while others couldn’t. I didn’t find any signal proving this hypothesis, but at the same time my results set an upper limit for the existence of this bound state at the beam energy of 3.5 GeV.”

    “In order to set a limit,” Eliane continues, “you compare the result of your data analysis with the outcome of a simulation, performed assuming the hypothesis that the signal you are looking for but didn’t see exists. In other words, you develop a model for this case and study how much signal you can introduce and still keep consistency with your data. You proceed to add more and more signal strength to your model in little steps, until you reach a threshold: if you overcome it, the model doesn’t fit anymore with the data. This threshold is an upper limit.”
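The scan Eliane describes can be sketched in a few lines of Python. Everything below is illustrative: the bin counts, the peak shape and the chi-square threshold are invented for the example, not taken from the HADES analysis.

```python
# Toy sketch of a signal-injection upper-limit scan (illustrative numbers only).

def chi2(data, model):
    """Pearson chi-square between observed counts and model prediction."""
    return sum((d - m) ** 2 / m for d, m in zip(data, model))

def upper_limit(data, background, signal_shape, chi2_max, step=0.01):
    """Inject signal in small steps on top of the background model until
    background + signal no longer fits the data; return that strength."""
    strength = 0.0
    while True:
        model = [b + strength * s for b, s in zip(background, signal_shape)]
        if chi2(data, model) > chi2_max:
            return strength
        strength += step

# Data consistent with background only: only a limited amount of signal
# can be hidden before the fit degrades past the threshold.
data = [100, 102, 98, 101]          # observed counts per bin (invented)
background = [100, 100, 100, 100]   # background-only model (invented)
signal = [0, 10, 10, 0]             # hypothetical peak shape of the bound state

limit = upper_limit(data, background, signal, chi2_max=9.49)
```

The returned `limit` is the largest signal strength still consistent with the data, i.e. the upper limit; a real analysis would use a proper likelihood and confidence-level construction rather than this toy chi-square cut.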

    She also combined her results with data from other experiments and showed that it was very unlikely that the signal seen by some other groups could be due to an anti-kaon-nucleon bound state. “Actually, I think that this signal exists because there are many compelling reasons from our theory colleagues, but it is very challenging to see, first of all because the production cross section of this state is probably very small, which means that it occurs rarely, so we need to take a lot of data. In addition, it might be a very broad state, so we are not going to find a narrow peak. As a consequence, understanding the background well is essential.”

    When she completed her PhD in 2014, she decided to change field. “In that situation, you have two possible choices,” explains Eliane. “Either you stay on the same topic and become an expert in a very specific field, or you change and broaden your horizons. In this second case, you do not become a specialist in one topic but rather increase your ‘portfolio’. I preferred to go for this second option and do something completely new. This way is much harder, because you basically start from the beginning, but I think it benefits a researcher in the long term to look at a field, in this case QCD, from many perspectives. I thus encourage young researchers to give low-energy QCD research a chance and see what people do beyond the TeV scale.” Therefore, she joined the research group led by John Harris and Helen Caines at Yale University, in New Haven (US), where she has been working for two and a half years now, and entered the ALICE collaboration.

    Her present research activities focus on hard probes in high-energy collisions. “The proton is a very fascinating object, there is a lot going on in it,” Eliane comments. “When you scatter two protons at low energy (an energy range I was previously working in), you see how the ‘entire’ proton behaves; you are not able to distinguish its internal structure. On the contrary, at the high energies of the LHC, when you collide two protons you start seeing what happens inside: you can observe how partons collide with each other.”

    In these hard scattering events, particles with a high transverse momentum are present in the final state. Eliane is analyzing Pb-Pb events in which a parton and a photon (a gamma) are produced. Photons do not interact with strongly-interacting matter; hence, when the Quark Gluon Plasma (QGP) is created in ALICE by smashing lead nuclei together, a photon produced in the collision can traverse this medium and emerge unaffected. In the opposite direction, a parton moves away from the collision vertex and fragments into a particle shower. The sum of the momenta of the particles in this shower has to balance the momentum of the photon (combining these fragments with the gamma on the other side is called gamma-hadron correlation), and altogether they carry the total momentum of the mother parton.

    The objective of this research is to measure the fragmentation function, which describes the correlation between the momentum of the mother parton and the momenta of the particles in the shower. Normally, most of the daughter particles carry a small fraction (less than 20%) of the momentum of the mother, whereas very few of them carry a high fraction of it. “By studying the behaviour of the particle shower in Pb-Pb collisions, in comparison with pp and p-Pb collisions, we can understand how the QGP medium modifies it,” explains Eliane. “We may have, for example, fewer of these very high momentum fragments and therefore more of the low momentum ones, or the shower might be broader. This study gives information about the properties of the medium that is created.”
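The observable behind the fragmentation function is the momentum fraction z = p_hadron / p_gamma of each particle recoiling against the photon, with the photon used as a proxy for the parton momentum. A minimal sketch with invented momenta, in which the shower exactly balances the photon as described above:

```python
# Toy gamma-hadron event: the photon momentum tags the parton momentum,
# so each recoiling fragment is characterized by z = p_hadron / p_gamma.

def momentum_fractions(p_gamma, shower_momenta):
    """Momentum fraction z carried by each particle of the recoiling shower."""
    return [p / p_gamma for p in shower_momenta]

p_gamma = 50.0                               # GeV/c, photon momentum (invented)
shower = [20.0, 12.0, 8.0, 5.0, 3.0, 2.0]    # GeV/c, fragment momenta (invented)

z_values = momentum_fractions(p_gamma, shower)
soft = sum(1 for z in z_values if z < 0.2)   # count of soft (low-z) fragments
balance = sum(shower)                        # should match the photon momentum
```

Comparing the distribution of `z_values` between pp and Pb-Pb collisions is, in essence, how a modification of the fragmentation function by the QGP would show up.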

    Measurements of gamma-hadron correlations performed at PHENIX at 200 GeV show that in gold-gold collisions the fragmentation function changes, giving fewer particles with high momentum fractions and many more particles with low momentum fractions. ALICE is investigating what happens at higher energies.

    Eliane is now working in collaboration with a graduate student at her institute and other colleagues in Berkeley. “We are performing a very complex analysis. In our events, we have to identify gammas on one side and the hadron showers on the other. But gammas can also be decay products of other particles, such as pions and other mesons. Thus, it is important to reject this background signal and take into consideration only events in which the gamma is produced at the primary vertex. This is not easy and requires a number of successive steps.”

    Eliane will continue working at Yale for some time. Then, she will either look for another post-doctoral position in ALICE or will directly apply for some grants, most likely in Europe. “There are various opportunities in Germany to get research funding to start your own research group.”

    Even though she likes her present analysis topic, in the future she might switch to something more fundamental: the substructure and dynamics of the proton. “The proton is a very complex and fascinating object in its own right, we still do not know much about its internal dynamics,” she highlights. In any case, the most important thing for her is to settle on a research topic that will give her deeper insight into QCD properties — something she is very intrigued by.

    In addition to doing data analysis, Eliane coordinates the activities of the EMCal calibration group and the EMCal photon object group and, lately, has served as Run Manager for data taking. With so much work and a four-year-old daughter, there is not much time left. Nevertheless, when she can, she attends modern dance classes to de-stress and relax.

    See the full article here.


  • richardmitnick 11:45 am on October 17, 2017 Permalink | Reply
    Tags: LZ-LUX-ZEPLIN experiment, Physics

    From SURF: “LZ team installs detector in water tank” 

    SURF logo
    Sanford Underground levels

    Sanford Underground Research facility

    October 16, 2017
    Constance Walter

    Sally Shaw, a post-doc with the University of California Santa Barbara, poses next to the sodium iodide detector recently installed inside the water tank. Courtesy photo.

    The huge water tank that for four years housed the Large Underground Xenon (LUX) dark matter detector now stands empty. A small sign over the opening that reads, “Danger! Confined space,” bars physical entry, but a solitary note sung by Michael Gaylor, a science professor from Dakota State University, once jumped that barrier and reverberated for 35.4 seconds.

    Starting this week, the tank will be filled with the sounds of collaboration members installing a small detector that will be used to measure radioactivity in the cavern. It’s all part of the plan to build and install the much larger, second-generation dark matter detector, LUX-ZEPLIN (LZ).

    LBNL Lux Zeplin project at SURF

    “We need to pin down the background event rate to better shield our experiment,” said Sally Shaw, a postdoc from the University of California, Santa Barbara (UCSB).

    The detector, a 5-inch by 5-inch cylinder of sodium iodide, will be placed inside the water tank and surrounded by 8 inches of lead bricks. The crystal will be covered on all sides except one, which will be left bare to measure the gamma rays that are produced when things like thorium, uranium and potassium decay. Over the next two weeks, the team will change the position of the detector five times to determine the directionality of the gamma rays.

    Scott Haselschwardt, a graduate student at UCSB, said this is especially important because there is a rhyolite intrusion that runs below the tank and up the west wall of the cavern.

    “This rock is more radioactive than other types of rock, so it can create more backgrounds,” he said. This wasn’t a problem for LUX, Haselschwardt said, because LUX was smaller than LZ and was therefore surrounded by more ultra-pure water.

    But LZ is 10 times larger and still must fit inside the same tank, potentially exposing it to more of the radiation that naturally occurs within the rock cavern. And while this radiation is harmless to humans, it can wreak havoc on highly sensitive experiments like LZ.

    “Because it is so much closer to the edges of the water tank, there was a proposal to put in extra shielding—perhaps a lead ring at the bottom of the tank to shield the experiment,” Shaw said.

    Like its much smaller cousin, LZ hopes to find WIMPs, weakly interacting massive particles. Every component must be tested to ensure it is free of any backgrounds, including more than 500 photomultiplier tubes, the titanium for the cryostat and the liquid scintillator that will surround the xenon container. But if the backgrounds emanating from the walls of the cavern are too high, it won’t matter.

    “The whole point is to see whether the lead needs to be used in the design of the shield,” said Umit Utku, a graduate student at University College London. “Maybe we will realize we don’t need it.”

    Shaw, who created a design for lead shielding within the tank, said it’s critical to fully understand the backgrounds now.

    “If we do need extra shielding, we must adjust the plans before installation of the experiment begins,” she said.

    See the full article here.


    About us.
    The Sanford Underground Research Facility in Lead, South Dakota, advances our understanding of the universe by providing laboratory space deep underground, where sensitive physics experiments can be shielded from cosmic radiation. Researchers at the Sanford Lab explore some of the most challenging questions facing 21st century physics, such as the origin of matter, the nature of dark matter and the properties of neutrinos. The facility also hosts experiments in other disciplines—including geology, biology and engineering.

    The Sanford Lab is located at the former Homestake gold mine, which was a physics landmark long before being converted into a dedicated science facility. Nuclear chemist Ray Davis earned a share of the Nobel Prize for Physics in 2002 for a solar neutrino experiment he installed 4,850 feet underground in the mine.

    Homestake closed in 2003, but the company donated the property to South Dakota in 2006 for use as an underground laboratory. That same year, philanthropist T. Denny Sanford donated $70 million to the project. The South Dakota Legislature also created the South Dakota Science and Technology Authority to operate the lab. The state Legislature has committed more than $40 million in state funds to the project, and South Dakota also obtained a $10 million Community Development Block Grant to help rehabilitate the facility.

    In 2007, after the National Science Foundation named Homestake as the preferred site for a proposed national Deep Underground Science and Engineering Laboratory (DUSEL), the South Dakota Science and Technology Authority (SDSTA) began reopening the former gold mine.

    In December 2010, the National Science Board decided not to fund further design of DUSEL. However, in 2011 the Department of Energy, through the Lawrence Berkeley National Laboratory, agreed to support ongoing science operations at Sanford Lab, while investigating how to use the underground research facility for other longer-term experiments. The SDSTA, which owns Sanford Lab, continues to operate the facility under that agreement with Berkeley Lab.

    The first two major physics experiments at the Sanford Lab are 4,850 feet underground in an area called the Davis Campus, named for the late Ray Davis. The Large Underground Xenon (LUX) experiment is housed in the same cavern excavated for Ray Davis’s experiment in the 1960s.
    LUX dark matter experiment at SURF

    In October 2013, after an initial run of 80 days, LUX was determined to be the most sensitive detector yet to search for dark matter—a mysterious, yet-to-be-detected substance thought to be the most prevalent matter in the universe. The Majorana Demonstrator experiment, also on the 4850 Level, is searching for a rare phenomenon called “neutrinoless double-beta decay” that could reveal whether subatomic particles called neutrinos can be their own antiparticle. Detection of neutrinoless double-beta decay could help determine why matter prevailed over antimatter. The Majorana Demonstrator experiment is adjacent to the original Davis cavern.

    Another major experiment, the Long Baseline Neutrino Experiment (LBNE)—a collaboration between Fermi National Accelerator Laboratory (Fermilab) and Sanford Lab—is in the preliminary design stages. The project got a major boost last year when Congress approved and the president signed an Omnibus Appropriations bill that will fund LBNE operations through FY 2014. Called the “next frontier of particle physics,” LBNE will follow neutrinos as they travel 800 miles through the earth, from Fermilab in Batavia, Ill., to Sanford Lab.

    Fermilab LBNE

  • richardmitnick 7:42 am on October 17, 2017 Permalink | Reply
    Tags: Physics, SSEN-steady-state electrical network

    From PPPL: “PPPL completes shipment of electrical components to power site for ITER, the international fusion experiment” 


    October 16, 2017
    Jeanne Jackson DeVoe

    Electrical components procured by PPPL. Pictured clockwise: switchgear, HV protection and control cubicles, resistors, and insulators. (Photo courtesy of © ITER Organization, http://www.iter.org/)

    The arrival of six truckloads of electrical supplies at a warehouse for the international ITER fusion experiment on Oct. 2 brings to a successful conclusion a massive project that will provide 120 megawatts of power – enough to light up a small city – to the 445-acre ITER site in France.

    ITER Tokamak in Saint-Paul-lès-Durance, which is in southern France

    The Princeton Plasma Physics Laboratory (PPPL), with assistance from the Department of Energy’s Princeton Site Office, headed the $34 million, five-year project on behalf of US ITER to provide three quarters of the components for the steady-state electrical network (SSEN), which supplies electricity for the lights, pumps, computers, heating, ventilation and air conditioning of the huge fusion energy experiment. The European Union is providing the other 25 percent. ITER connected the first transformer to France’s electrical grid in March.

    The shipment was the 35th and final delivery of equipment over the past three years from companies all over the world, including the United States.

    “I think it’s a great accomplishment to finish this,” said Hutch Neilson, head of ITER Fabrication. “The successful completion of the SSEN program is a very important accomplishment both for the US ITER project and for PPPL as a partner in the US ITER project.”

    The six trucks that arrived carried a total of 63 crates of uninterruptible power supply equipment weighing 107 metric tons. The trucks took a seven-hour, 452-mile journey from Gutor UPS and Power Conversion in Wettingen, Switzerland, northwest of Zurich, to an ITER storage facility in Port-Saint-Louis-Du-Rhône, France. The equipment will eventually be used to provide emergency power to critical ITER systems in the event of a power outage.

    “This represents the culmination of a very complex series of technical specifications and global purchases, and we are grateful to the entire PPPL team and their vendors for outstanding commitment and performance,” said Ned Sauthoff, director of the US ITER Project Office at Oak Ridge National Laboratory, where all U.S. contributions to ITER are managed for the U.S. Department of Energy’s Office of Science.

    A device known as a tokamak, ITER will be the largest and most powerful fusion machine in the world. Designed to produce 500 megawatts of fusion power from 50 megawatts of input power, it will be the first fusion device to create net energy – getting more energy out than is put in. Fusion, the process by which stars like the sun create energy, is the fusing of light elements into heavier ones, releasing energy in the process.
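The net-energy statement is usually expressed through the fusion gain factor Q, the ratio of fusion power produced to external heating power supplied; with the design figures quoted above:

```python
# ITER design targets as quoted in the article.
p_fusion = 500.0  # MW, fusion power produced
p_input = 50.0    # MW, external heating power supplied

Q = p_fusion / p_input  # fusion gain factor
# Q > 1 means net energy; ITER's design goal is Q = 10.
```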

    A separate electrical system for the pulsed power electrical network (PPEN), procured by China, will power the ITER tokamak.

    The first SSEN delivery in 2014 was among the first plant components to be delivered to the ITER site. The SSEN project is now one of the first U.S. packages to be completed in its entirety, Neilson said. He noted that the final shipment arrived 10 days ahead of PPPL’s deadline.

    In addition to the electrical components, PPPL is also responsible for seven diagnostic instruments and for integrating the instruments inside ITER port plugs. While PPPL is continuing work on an antenna for one diagnostic, most of the diagnostic and port integration work has been put on hold amid uncertainty over U.S. funding for its contributions to ITER.

    The SSEN project was a complex enterprise. PPPL researched potential suppliers, solicited and accepted bids, and oversaw the production and testing of electrical components in 16 separate packages worth a total of about $30 million. The effort involved PPPL engineers, as well as procurement and quality assurance staff members who worked to make sure that the components met ITER specifications and would do exactly what they are supposed to do. “It’s really important that we deliver to ITER equipment that exactly meets the requirements they specify and that it be quality equipment that doesn’t give them trouble down the road,” Neilson said. “So every member of the team makes sure that gets done.”

    Many of the components were for the high-voltage switchyard. A massive transformer procured by PPPL was connected to the French electrical grid in March. PPPL procured and managed the purchase and transportation of the 87-ton transformer and three others, which were built in South Korea by Hyundai Heavy Industries, a branch of the company known for producing cars.

    The SSEN components came from as close to home as Mount Pleasant, Pennsylvania, and as far away as Turkey, with other components coming from Mexico, Italy, Spain, France, Germany, South Korea and the Netherlands.

    John Dellas, the head of electrical systems and the team leader for the project, has been working on the ITER SSEN project for the entire five years of the program. He traveled to Schweinfurt, Germany, to oversee testing of the control and protection systems for the high-voltage switchyard.

    Dellas took over the project from Charles Neumeyer after Neumeyer became engineering director for the NSTX-U Recovery Project last year. Dellas said Neumeyer deserves most of the credit for the program. “Charlie took the team down to the 10-yard line and I put everything in the end zone,” Dellas said. “I was working with Charlie but Charlie was the quarterback.”

    Neumeyer worked on the project from 2006, when the project was in the planning stages, until 2016. He said he was happy to see the project completed. “It’s very gratifying to see roughly 10 years of work come to a satisfying conclusion under budget and on schedule,” he said.

See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    PPPL campus

    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University. PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

  • richardmitnick 5:10 pm on October 16, 2017 Permalink | Reply
    Tags: , , Physics, , SLAC LCLS & LCLS II, X-ray synchrotrons and electron microscopes, XFEL's-X-ray free-electron lasers   

    From SLAC via R&D: “Ultrafast X-Ray Science: Groundbreaking Laser Takes Discovery to New Extremes” 

    SLAC Lab



Mike Dunne, Ph.D., Director of LCLS, SLAC National Accelerator Laboratory, Stanford University

    The LCLS Coherent X-ray Imaging Experimental Station. Credit: Nathan Taylor, SLAC National Accelerator Laboratory


    The world at the atomic scale is never at rest, with particles moving so quickly and molecular bonds changing so rapidly that we have been unable to capture their motion directly until now. Previously, we’ve had to rely on static pictures of the molecular world (using X-ray synchrotrons or electron microscopes), or infer dynamic behavior from spectroscopic signatures (using short-pulse optical lasers). All that changed in 2009, when the world’s first X-ray free-electron laser (XFEL) was successfully commissioned. The field of ultrafast X-ray science took a huge leap forward – with a source that was billions of times brighter than anything that came before, delivering bursts of X-rays on timescales that are many orders of magnitude shorter – reaching the femtosecond domain.

    A flash of light as short as this can freeze the motion of atoms in molecules, allowing us to make slow-motion movies of how nature works. A femtosecond (a millionth of a billionth of a second) is at the fundamental scale of atomic and molecular physics, and so underpins the initiating events of the chemical, material, and biological processes that make up our world.
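To make the femtosecond scale concrete, a one-line back-of-envelope calculation (using only the speed of light) shows how little distance even light covers in that time:

```python
# How far does light travel in one femtosecond?
C = 299_792_458.0    # speed of light, m/s
FEMTOSECOND = 1e-15  # seconds

distance_m = C * FEMTOSECOND
print(f"light travels ~{distance_m * 1e9:.0f} nm in 1 fs")  # ~300 nm
```

Roughly 0.3 micrometres, which is why femtosecond pulses can freeze atomic motion.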

    The concept of a free-electron laser was introduced by John Madey at Stanford in 1971, in which the passage of an electron beam through a series of magnets causes the emission of photons, which ultimately can interact to create a coherent burst of light. Later, in the 1990s, Claudio Pellegrini and collaborators proposed to extend free-electron lasers to the X-ray regime – a hugely ambitious concept that was ultimately proven with the construction of the Linac Coherent Light Source (LCLS) by the U.S. Department of Energy (DOE) at the SLAC National Accelerator Laboratory.

    This new source combines three critical features. First, the wavelength of the X-ray light is at the Angstrom scale, and tunable, so that the individual atoms in a molecule can be imaged. Second, these X-rays are delivered on a short enough timescale to freeze the motion of atoms in molecules, capture the initiating events of molecular bond formation and the dynamics of electrons as they orbit an atom or carry charge around a molecule. Third, over a trillion X-rays are delivered in each pulse, resulting in a source that is so incredibly bright it can deliver precise information from just a single pulse. With over a hundred pulses per second, movies can be made of how atomic and molecular systems evolve – allowing unprecedented insight into fields as diverse as chemical catalysis, structural biology, quantum materials science, and the physics of planetary formation.
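The first and third of these features can be quantified with a short back-of-envelope sketch. Assuming a 1-angstrom wavelength and the quoted "over a trillion" photons per pulse (these are illustrative figures, not official LCLS specifications), standard physical constants give the photon energy and total pulse energy:

```python
# Back-of-envelope numbers for an angstrom-wavelength X-ray pulse.
H = 6.62607015e-34    # Planck constant, J*s
C = 299_792_458.0     # speed of light, m/s
EV = 1.602176634e-19  # joules per electronvolt

wavelength_m = 1e-10  # 1 angstrom, assumed for illustration
photon_energy_ev = H * C / wavelength_m / EV
print(f"photon energy ~{photon_energy_ev / 1e3:.1f} keV")  # ~12.4 keV

photons_per_pulse = 1e12  # "over a trillion X-rays ... in each pulse"
pulse_energy_mj = photons_per_pulse * photon_energy_ev * EV * 1e3
print(f"pulse energy ~{pulse_energy_mj:.1f} mJ")  # ~2.0 mJ
```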

    What underpins these fields is the need to make direct observations of fundamental charge, spin, and orbital or lattice dynamics on natural timescales – put simply, to see the motion of electrons and ions as they respond to their environment or to external stimuli.

    From a capability point of view, the progress in XFEL performance has been dramatic, creating precision tools with unprecedented peak intensity and time-averaged brightness. In less than a decade, the X-ray pulse duration has been shortened from over 100 femtoseconds to 5 fs (and likely 0.5 fs later this year); full polarization control has been introduced, so we can see chiral molecules that are important for many pharmaceutical drugs; and a wide array of dual-pulse options have been developed that provide the ability to drive a system and monitor its response on timescales that range from femtoseconds to microseconds.

    This is an illustration of an electron beam traveling through a niobium cavity – a key component of SLAC’s future LCLS-II X-ray laser. Kept at minus 456 degrees Fahrenheit, a temperature at which niobium conducts electricity without losses, these cavities will power a highly energetic electron beam that will create up to 1 million X-ray flashes per second – more than any other current or planned X-ray laser. Credit: SLAC National Accelerator Laboratory

    Looking forward

The pace of progress is set to accelerate further over the next few years. The first X-ray laser, LCLS, delivered 120 pulses per second, followed by SACLA in Japan at 60 per second. A new facility, the European XFEL in Hamburg, Germany, turned on in mid-2017, delivering pulses at 27,000 per second.

    DESY European XFEL

    And now an additional $1 billion is being invested by the DOE to create the LCLS-II upgrade that will provide up to a million pulses per second by 2020.
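Using the repetition rates quoted above, a quick comparison shows how long each machine would need to accumulate one million X-ray shots, which is why the jump to LCLS-II is described as transformative:

```python
# Time needed to accumulate one million X-ray shots at each quoted rate.
rates_hz = {
    "LCLS": 120,
    "SACLA": 60,
    "European XFEL": 27_000,
    "LCLS-II (planned)": 1_000_000,
}
TARGET_SHOTS = 1_000_000

for name, rate in rates_hz.items():
    seconds = TARGET_SHOTS / rate
    print(f"{name:>18}: {seconds:>10,.0f} s")
```

At 120 Hz a million-shot data set takes over two hours; at a million pulses per second it takes one second.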


    This will be transformative, allowing the study of real-world systems that are simply inaccessible today, including statistical fluctuations and heterogeneous materials, rather than idealized samples.

    Normally when a field advances, the performance of the system increases incrementally, or sometimes by a factor of 10 or so. Here, the field had to cope with a factor of a billion increase in capability – requiring a completely new approach to the measurements, and innovation in almost all aspects of the technology.

    The incredible efforts in delivering this novel X-ray source are thus only part of the story. Precision science requires the integration of new instrumentation and measurement techniques that can take advantage of the characteristics of the new source, and so provide quantitative information. This in turn requires detectors that can sense photons individually or by the thousands and operate at the same repetition rate as the source; optics that can handle the intense X-ray power; delivery of a continuous stream of samples for the X-rays to probe; and data acquisition systems that can cope with unprecedented rates.

    Applications across industries

    The degree to which this has been achieved is astonishing, and has touched a very large number of fields. In chemical science, the ultrashort bursts of XFEL light have been used to capture the birth of a chemical bond and follow the ultrafast dynamics of catalytic activity as gas is passed over a metal surface – allowing insight into the fundamental reactions that drive major industrial processes.

    Similarly, the brightness of the beam has been used to create detailed “molecular movies” that for the first time track the opening of a ring molecule after a bond is broken by a burst of light. This ability to watch molecules evolve their geometrical structure in real time removes the great uncertainties associated with only being able to see the initial and final stages of structure formation and having to rely on complex numerical models.

    Interestingly, this approach has had great impact in the complex field of structural biology and the associated problem of drug discovery for improved medicines. In that area, LCLS has revolutionized the study of membrane proteins by allowing a new technique known as “diffraction before destruction” that can measure the atomic structure of very delicate samples. It has also allowed observations of how molecular machines work in living systems with sub-picosecond precision. Applications have ranged from investigating neurotransmitters connected with the study of Alzheimer’s disease, to finding weaknesses in the parasite that causes African sleeping sickness, to gaining a new understanding of how the body can regulate blood pressure and anxiety. Moving to the atomic scale, LCLS has provided the first direct evidence of superfluidity in nanometer-sized quantum systems, and has imaged the process of electron charge transfer to better understand how to harness photosynthesis for energy generation.

    With the advent of the new XFEL sources in Japan, Germany, the United States, Switzerland and the Republic of Korea, this field is set to expand into many new research areas. Much work is currently underway to define the most compelling science opportunities, and thus guide the direction of facility development. But the most impactful science may well come from fields that have never previously used X-ray sources or been able to peer into the ultrafast world.

See the full article here.

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

  • richardmitnick 8:51 pm on October 13, 2017 Permalink | Reply
    Tags: 2-D structure of turbulence in tokamaks, , , Physics,   

    From PPPL: “PPPL takes detailed look at 2-D structure of turbulence in tokamaks” 


    October 13, 2017
    John Greenwald

    Correlation analysis of three plasma discharges on NSTX for each of five different radial locations near the plasma edge. The red regions marked with a blue cross have high positive correlation around the origin point, while the blue regions marked with a yellow cross have high negative correlation. Images courtesy of Stewart Zweben.

    A key hurdle for fusion researchers is understanding turbulence, the ripples and eddies that can cause the superhot plasma that fuels fusion reactions to leak heat and particles and keep fusion from taking place. Comprehending and reducing turbulence will facilitate the development of fusion as a safe, clean and abundant source of energy for generating electricity from power plants around the world.

At the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL), scientists have assembled a large database of detailed measurements of the two-dimensional (2-D) structure of edge plasma turbulence made visible by a diagnostic technique known as gas puff imaging. The two dimensions, measured inside a fusion device called a tokamak, represent the radial and vertical structure of the turbulence.

    Step toward fuller understanding

    “This study is an incremental step toward a fuller understanding of turbulence,” said physicist Stewart Zweben, lead author of the research published in the journal Physics of Plasmas. “It could help us understand how turbulence functions as the main cause of leakage of plasma confinement.”

    Fusion occurs naturally in space, merging the light elements in plasma to release the energy that powers the sun and stars. On Earth, researchers create fusion in facilities like tokamaks, which control the hot plasma with magnetic fields. But turbulence frequently causes heat to leak from its magnetic confinement.

    PPPL scientists have now delved beyond previously published characterizations of turbulence and analyzed the data to focus on the 2-D spatial correlations within the turbulence. This correlation provides clues to the origin of the turbulent behavior that causes heat and particle leakage, and will serve as an additional basis for testing computer simulations of turbulence against empirical evidence.

    Studying 20 discharges of plasma

The study examined 20 plasma discharges chosen as a representative sample of those created in PPPL’s National Spherical Torus Experiment (NSTX) before its recent upgrade. In each of these discharges, a gas puff illuminated the turbulence near the edge of the plasma, where turbulence is of special interest. The puffs, a source of neutral atoms that glow in response to density changes within a well-defined region, allowed researchers to see fluctuations in the density of the turbulence. A fast camera recorded the resulting light at 400,000 frames per second, with an image frame 64 pixels wide by 80 pixels high.
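The camera figures quoted above imply a substantial raw data rate. A sketch, assuming a hypothetical 16-bit pixel depth (the article does not state the actual bit depth):

```python
# Raw data rate of the fast camera: 400,000 frames/s at 64 x 80 pixels.
FRAME_RATE_HZ = 400_000
WIDTH_PX, HEIGHT_PX = 64, 80
BYTES_PER_PIXEL = 2  # assumed 16-bit depth; not stated in the article

pixels_per_second = FRAME_RATE_HZ * WIDTH_PX * HEIGHT_PX
rate_gb_per_s = pixels_per_second * BYTES_PER_PIXEL / 1e9
print(f"~{rate_gb_per_s:.1f} GB/s at the assumed bit depth")  # ~4.1 GB/s
```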

    Zweben and co-authors performed computational analysis of the data from the camera, determining the correlations between different regions of the frames as the turbulent eddies moved through them. “We’re observing the patterns of the spatial structure,” Zweben said. “You can compare it to the structure of clouds drifting by. Some large clouds can be massed together or there can be a break with just plain sky.”
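The kind of zero-lag correlation map described here can be sketched in a few lines of NumPy on synthetic data. The array sizes and the co-varying pixels below are invented for illustration, not taken from NSTX:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for gas-puff-imaging data: (time, height, width)
# brightness frames. Sizes and signals are invented for illustration.
frames = rng.normal(size=(1000, 8, 6))
frames[:, 4, 3] += frames[:, 4, 2]  # make two neighbouring pixels co-vary

ref_y, ref_x = 4, 2  # the "origin point" (blue cross) for the correlation map
ref = frames[:, ref_y, ref_x]

# Zero-lag normalized cross-correlation of every pixel's time series
# with the reference pixel's time series.
f = frames - frames.mean(axis=0)
r = ref - ref.mean()
corr = (f * r[:, None, None]).sum(axis=0) / (
    np.sqrt((f ** 2).sum(axis=0)) * np.sqrt((r ** 2).sum())
)

print(f"correlation with itself:  {corr[ref_y, ref_x]:.2f}")  # 1.00
print(f"correlation at neighbour: {corr[4, 3]:.2f}")          # strongly positive
```

The resulting `corr` array is the kind of 2-D map in which red regions (positive correlation) and blue regions (negative correlation) can be identified around the origin point.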

    Detailed view of turbulence

    The correlations provide a detailed view of the nature of plasma turbulence. “Simple things about turbulence like its size and time scale have long been known,” said PPPL physicist Daren Stotler, a coauthor of the paper. “These simulations take a deep dive into another level to look at how turbulence in one part of the plasma varies with respect to turbulence in another part.”

    In the resulting graphics, a blue cross indicates the point of focus for a calculation; the red and yellow areas around the cross are regions in which the turbulence is evolving similarly to the turbulence at the focal point. Farther away, researchers found regions in which the turbulence is changing opposite to the changes at the focal point. These farther-away regions are shown as shades of blue in the graphics, with the yellow cross indicating the point with the most negative correlation.

    For example, if the red and yellow images were a region of high density turbulence, the blue images indicated low density. “The density increase must come from somewhere,” said Zweben. “Maybe from the blue regions.”

    Going forward, knowledge of these correlations could be used to predict the behavior of turbulence in magnetically confined plasma. Success of the effort could deepen understanding of a fundamental cause of the loss of heat from fusion reactions.

    Also contributing to this study were Filippo Scotti of the Lawrence Livermore National Laboratory and J. R. Myra of Lodestar Research Corporation. Support for this work comes from the DOE Office of Science.

See the full article here.


  • richardmitnick 8:24 pm on October 13, 2017 Permalink | Reply
    Tags: A new ultrafast optical technique for thermal measurements—time-domain thermoreflectance, , Chengyun Hua, , , Physics,   

    From ORNL: Women in STEM – “Laser-Focused: Chengyun Hua turns the heat up on materials research” 


    Oak Ridge National Laboratory

    October 13, 2017
    Bill Cabage

    Chengyun Hua applied for a Liane B. Russell Distinguished Early Career Fellowship after meeting ORNL researchers at a Society of Women Engineers conference.

    In Chengyun Hua’s research, everything revolves around heat and how it moves. As a Russell Fellow at the Department of Energy’s Oak Ridge National Laboratory, Hua carefully analyzes nanoscale heat transfer mechanisms using laser spectroscopy.

    “Heat is being generated from everywhere and we can collect that heat and convert it to energy,” she explained. “We essentially have enough heat being produced 24/7 through electronics and other sources that we could potentially impact the world’s energy production and ease today’s energy concerns.”

Although waste heat represents a vast potential source of energy, heat that is not channeled properly can also become a problem.

    “We’ve seen recent news of cell phones bursting into flames,” Hua said. “The reason is too much heat is produced locally, and it has nowhere to go in a short period of time. The challenge is to capture that heat flow at the nanoscale and understand how we can more effectively dissipate it.”

Through Hua’s work in ORNL’s Building Equipment Research group, a new ultrafast optical technique for thermal measurements—time-domain thermoreflectance—was deployed at ORNL for the first time. The technique measures the thermal properties of materials, including thermal conductivity. Using ORNL’s Ultrafast Laser Spectroscopy Laboratory, Hua measures material conductivity down to nanometer length scales.

“When a material is heated using a pulsed laser, thermal stress is induced,” she explained. “The objective of raising the temperature of a material is to unveil the microscopic processes of the phonons [quantized lattice vibrations, quasiparticles that play an important role in many physical properties of solids, such as thermal and electrical conductivity] that govern the heat transport in solids. Ultimately, with this better understanding, we can design the next generation of materials—materials that not only withstand heat but also manage the heat and convert it into energy.”
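As a rough illustration of a pulsed-laser thermal measurement (not the full multilayer diffusion model used in real TDTR analysis), a toy single-exponential decay can be generated and fitted to recover a thermal time constant. All numbers below are assumed:

```python
import numpy as np

# Toy pump-probe decay: a pulsed laser heats the surface and the probe
# signal decays as the heat diffuses away. Real TDTR analysis fits a
# multilayer diffusion model; this single-exponential is only a sketch.
rng = np.random.default_rng(1)

TAU_TRUE_NS = 5.0                  # assumed thermal decay time
t_ns = np.linspace(0.1, 20, 200)   # pump-probe delay times
signal = np.exp(-t_ns / TAU_TRUE_NS) * (1 + 0.01 * rng.normal(size=t_ns.size))

# A log-linear least-squares fit recovers the decay time.
slope, _intercept = np.polyfit(t_ns, np.log(signal), 1)
tau_fit_ns = -1.0 / slope
print(f"fitted decay time: {tau_fit_ns:.2f} ns")  # close to the assumed 5.0 ns
```

In a real measurement, the fitted decay is compared against a heat-diffusion model to extract the thermal conductivity.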

    For the love of physics

    Hua grew up a world away in Shanghai, China. An only child of accountant parents, she excelled in mathematics and science, something that was not unusual in her home country.

    “It’s easy to get a job in the engineering discipline in China; it’s a highly respected profession,” she said. For Hua, however, getting accepted to study engineering physics at the University of Michigan, Ann Arbor, was an opportunity not to be missed.

“Studying in Michigan was the first time I had ever been to the United States,” she said. “But it wasn’t until I entered the mechanical engineering program at Caltech that I truly felt at home.”

Hua completed her PhD in mechanical engineering at the California Institute of Technology (Caltech) in Pasadena. There she met an advisor and professor who helped steer her current career path, challenging her to continue focusing on nanoscale heat transfer properties. “Caltech was a unique playground if you love mathematics and physics,” she said.

    After meeting some ORNL researchers at a Society of Women Engineers conference, Hua made the decision in early 2016 to apply for a fellowship that would allow her to focus on micro- and nanoscale heat transfer and energy conversion at the lab. The Liane B. Russell Distinguished Early Career Fellowship attracts scientists who have demonstrated outstanding scientific ability and research interests that align with core capabilities at the lab.

    “My advisor encouraged me to apply and within one week I wrote my proposal on ‘Exploring Thermal Transport in Nanostructured Materials for Thermal Energy Conversion and Management.’ I interviewed in November 2015 and four days after the new year, I was invited to become a fellow at ORNL,” she said.

    Uprooting again to East Tennessee, Hua has found a supportive community that encourages the sharing of new ideas and interdisciplinary research.

    “I’ve been able to live in different parts of the U.S.,” she said. “But, everywhere I’ve been, I’ve found support and an environment that promotes ideas and stimulating conversation between scientists.”

    While Hua has adapted to many moves and changes, one part of her research and studies remains unchanged.

    “Heat always flows from hot to cold,” she said. “It’s the constant in the continuum.”

See the full article here.

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


  • richardmitnick 8:09 pm on October 13, 2017 Permalink | Reply
    Tags: , Baby MIND, , , , , , Physics   

    From CERN: “Baby MIND born at CERN now ready to move to Japan” 



    13 Oct 2017
    Stefania Pandolfi

    Baby MIND under test on the T9 beamline at the Proton Synchrotron experimental hall in the East Area, summer 2017 (Image: Alain Blondel/University of Geneva)

A member of the CERN Neutrino Platform family of neutrino detectors, Baby MIND, is now ready to be shipped from CERN to Japan in four containers to begin the experimental endeavour it was designed and built for. The containers are being loaded on 17 and 18 October and are scheduled to arrive by mid-December.

Baby MIND is a 75-tonne neutrino detector prototype for a Magnetised Iron Neutrino Detector (MIND). Its goal is to precisely identify and track the positively or negatively charged muons produced when muon neutrinos from the T2K (Tokai-to-Kamioka) beam line interact with matter in the WAGASCI neutrino detector in Japan.

    T2K map, T2K Experiment, Tokai to Kamioka, Japan

The more detailed the identification of a muon crossing the Baby MIND detector, the more we can learn about the original neutrino, contributing to a more precise understanding of the phenomenon of neutrino oscillations*.

    The journey of these muon neutrinos starts from the Japan Proton Accelerator Research Complex (J-PARC) in Tokai. They travel all the way to the Super-Kamiokande Detector in Kamioka, some 295 km away.

    Super-Kamiokande Detector, located under Mount Ikeno near the city of Hida, Gifu Prefecture, Japan

On their journey, the neutrinos pass through the near detector complex building, located 280 m downstream of the neutrino production target, where the WAGASCI + Baby MIND suite of detectors sits. Baby MIND aims to measure the velocity and charge of muons produced by neutrino interactions with matter in the WAGASCI detector. Precise tracking of these muons will help test our ability to reconstruct important characteristics of their parent neutrinos. This, in turn, is important because in studying muon neutrino oscillations on the journey from Tokai to Kamioka, it is crucial to know how strongly and how often the neutrinos interact with matter.
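A magnetised detector determines a muon's charge and momentum from how it bends in the field: the sign of the bend gives the charge, and the bend radius gives the momentum. A minimal sketch using the standard bending-radius relation, with an assumed illustrative field strength (not Baby MIND's actual specification):

```python
# A charged particle in a magnetic field bends with radius r = p / (0.3 * B)
# (r in metres, p in GeV/c, B in tesla, for unit charge). B below is an
# assumed illustrative value, not Baby MIND's actual field.
def bend_radius_m(p_gev_per_c, b_tesla):
    return p_gev_per_c / (0.3 * b_tesla)

print(f"0.5 GeV/c muon in a 1.5 T field: r ~ {bend_radius_m(0.5, 1.5):.2f} m")
```

Higher-momentum muons bend less, so the curvature measured along the track constrains the parent neutrino's energy as well as the muon's charge.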

Born from prototyping activities launched within the AIDA project, the Baby MIND collaboration – comprising CERN, the University of Geneva, the Institute for Nuclear Research in Moscow, and the Universities of Glasgow, Kyoto, Sofia, Tokyo, Uppsala and Valencia – has been busy designing, prototyping, constructing and testing the detector since its approval by the CERN Research Board in December 2015. The magnet construction phase, which lasted six months, was completed in mid-February 2017, two weeks ahead of schedule.

The fully assembled Baby MIND detector was tested on a beam line in the experimental zone of the Proton Synchrotron in the East Hall during summer 2017. These tests showed that the detector is working as expected and is therefore ready to go.


    *Neutrino oscillations

    Neutrinos are everywhere. Each second, several billion of these particles coming from the Sun, the Earth and our galaxy, pass through our bodies. And yet, they fly past unnoticed. Indeed, despite their cosmic abundance and ubiquity, neutrinos are extremely difficult to study because they hardly interact with matter. For this reason, they are among the least understood particles in the Standard Model (SM) of particle physics.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

What we know is that they come in three types, or ‘flavours’ – electron neutrino, muon neutrino and tau neutrino. From their first detection in 1956 until the late 1990s, neutrinos were thought to be massless, in line with the SM predictions. However, the Super-Kamiokande experiment in Japan and, a few years later, the Sudbury Neutrino Observatory in Canada independently demonstrated that neutrinos can change (oscillate) from one flavour to another spontaneously.

Sudbury Neutrino Observatory, no longer operating

This is only possible if neutrinos have masses, however small; the probability of changing flavour depends on the differences between the squared neutrino masses and on the distance travelled relative to the neutrino energy. This ground-breaking discovery was recognized with the 2015 Nobel Prize in Physics.
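In the standard two-flavour approximation, the oscillation probability follows the well-known formula P = sin²(2θ) · sin²(1.27 Δm² L / E). Plugging in typical published T2K-scale parameter values (assumed here only for illustration) shows that the 295 km baseline sits near the muon-neutrino disappearance maximum:

```python
import math

# Two-flavour approximation of muon-neutrino survival:
#   P(numu -> numu) = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)
# with dm2 in eV^2, L in km and E in GeV. The parameter values below are
# typical published figures, used here only for illustration.
def survival_probability(dm2_ev2, sin2_2theta, baseline_km, energy_gev):
    arg = 1.27 * dm2_ev2 * baseline_km / energy_gev
    return 1.0 - sin2_2theta * math.sin(arg) ** 2

# T2K: 295 km baseline, beam energy peaked near 0.6 GeV
p = survival_probability(2.5e-3, 1.0, 295.0, 0.6)
print(f"P(numu survives) ~ {p:.3f}")  # near zero: close to the oscillation maximum
```

The near-total disappearance at this L/E combination is precisely why the beam energy and baseline were chosen.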

See the full article here.

