From Symmetry: Women in STEM – “Finding happiness in hardware” – Francesca Ricci-Tam and Grace Cummings

Illustration by Sandbox Studio, Chicago with Corinne Mucha
Francesca Ricci-Tam

September 10, 2019
Sarah Charley

Working on hardware doesn’t come easily to all physicists, but Francesca Ricci-Tam has learned that what matters most is a willingness to put in the practice.

Francesca Ricci-Tam remembers an organic chemistry lab she took during her undergraduate studies, before she became a physicist.

“The professor told us that the vacuum tubes were very expensive and delicate and that we shouldn’t destroy them,” she recalls.

Five minutes later, her tube exploded.

“I never considered myself very good at lab work,” she says. “I was very awkward.”

The student who bashfully cleaned shards of glass from her lab bench is now a hardware specialist building electronics for one of the largest scientific experiments in the world. Over time, she has learned that this work is a skill to be learned through practice and that early mistakes like hers with the vacuum tube are an essential part of the process.

Facing a fear of failure

Ricci-Tam entered the University of California, Davis in 2006 as a premed student with a double major in biochemistry and physics. She was home-schooled for most of her education and had very little experience working with her hands.

She describes herself as a perfectionist, a trait she struggled with while adjusting to the laboratory. “I was always worried about adding one too many drops of solution or breaking something,” she says.

After being rejected from several medical schools, she was faced with two choices: Take a year to gain more experience through a clinical internship and then try again, or change course and apply to graduate school in physics. She chose the latter.

Being a physicist requires learning the basic principles and equations that describe matter, and then performing experiments to test and possibly push beyond them. The transition from the classroom into the laboratory is where the next generation of physicists learns what being an experimentalist is all about—and that possessing a high level of intelligence means very little if you don’t cultivate an accompanying amount of persistence and just plain do the hard work.

A few years into her PhD, Ricci-Tam’s advisor asked her to help the UC Davis team build components for the 14,000-ton CMS detector, which a collaboration of about 4000 scientists uses to study the collisions generated by the Large Hadron Collider at CERN.


CERN CMS Higgs Event, May 27, 2012. Image: CERN/CMS

Ricci-Tam had never done anything like it. She closely watched her colleagues as they unscrewed electronics and attached cables to the CMS pixel detector.

She remembers flipping into a completely different mindset when it was her turn to work with the electronics. “I would be completely focused—and panic later,” she says.

One day, a colleague told her that she worked like a surgeon. Ricci-Tam says the comment changed her perception of herself. “I thought, I can do this,” she says.

“I’ve been doing hardware work on and off ever since.”

The more Ricci-Tam worked on hardware, the more she discovered her own capabilities. As she gained experience and confidence, she began to find a balance between intense focus and relaxation while working on tasks. She gradually let go of her perfectionist mindset and learned to give herself more space and time to work through problems.

“You cannot afford to be a perfectionist,” she says. “Working on hardware teaches you patience.”

Illustration by Sandbox Studio, Chicago with Corinne Mucha
Grace Cummings

Gaining an ally

Ricci-Tam is now a postdoctoral researcher at the University of Maryland working on upgrades to the Hadronic Calorimeter, a part of the CMS detector that records the energy and trajectory of fundamental particles called quarks.

Images of CMS HCAL Forward Calorimeter (HF) – CERN Document Server

Scientists are preparing CMS for the High-Luminosity LHC, an upgrade to the LHC that will increase the collision rate by a factor of 10 and provide scientists with the huge amount of data they need to look for and study rare subatomic processes.

The upgrades will make the CMS detector both more robust and more sensitive to the tiny particles produced in the collisions.

Last winter, Ricci-Tam started working with University of Virginia graduate student Grace Cummings on assembling and testing new electronics for the calorimeter called ngCCMs, or next-generation clock and control modules. Cummings was the resident expert on the project, and Ricci-Tam was impressed with her organization and self-assurance. The two soon became friends.

“I’m not a very confident person, so I look to other people to learn how to be more confident,” Ricci-Tam says. “Grace is one of them.”

Unlike Ricci-Tam, Cummings started her pursuit of experimental physics with a strong desire to work on hardware. Cummings connects it to the satisfaction she found building massive towers out of blocks and creating three-dimensional sculptures during her art classes as a kid. “I’ve always liked working with my hands,” Cummings says. “It makes me feel connected to my work.”

She applied to colleges as a physics major and early on knew she wanted to go to graduate school. “I knew I wouldn’t be happy if I wasn’t asking questions and answering them,” she says.

During a summer internship at the US Department of Energy’s Fermi National Accelerator Laboratory, she was introduced to particle physics hardware and how a detector actually works. “I learned what scintillators are and how wavelength shifters work,” she says. “I got really excited. I wrote about how I wanted to do hardware in my graduate school applications.”

Working on hardware showed Cummings that part of being an experimentalist is looking to answer questions she never realized she would need to ask—including “What’s that smudge?”

In summer 2018 Cummings was tasked with inspecting freshly arrived electronics for the CMS calorimeter at Fermilab.

CMS calorimeter at Fermilab. https://www.fnal.gov/pub/science/experiments/energy/lhc/cms.html

She and her colleagues found an entire shipment of circuit boards, each with a strange blotch on one side.

“It wouldn’t come off, so we thought it might be something intrinsic to the printed circuit board,” Cummings says. “These are going to be in the detector for the rest of the lifetime of CMS, so we want to make sure that everything is as perfect as it possibly can be and think about all the ways it could fail. Even if you don’t think something’s a big deal, it could become a big deal later.”

They ran through a series of tests and inspections, and the cards all seemed to be functioning as expected. She and her colleagues were scratching their heads when one of them thought to ask how the electronics had been packaged.

“It turns out that the Fermilab logo hadn’t been completely dry when they were packaged,” Cummings says. “Those were our white smudges: the imprint of the screen-printing ink.”

Cummings and her colleagues laugh about the situation today, but they know the work they do has serious implications for the experiments they’re building and repairing.

Cummings says every time she goes underground to install electronics in the four-story CMS detector, she is amazed at just how important every little piece becomes. “Working on hardware for me has been the biggest thing that shows why CMS signs all its papers as ‘CMS collaboration,’” she says. “I’m flabbergasted it works. It’s really a wonder.

“At the same time, I know how much time, effort and love I put into my work. If everyone cares half as much as I care, we’ll be fine.”

See the full article here.



Please help promote STEM in your local schools.


Stem Education Coalition

Symmetry is a joint Fermilab/SLAC publication.


#applied-research-technology, #basic-research, #fnal-cms, #francesca-ricci-tam, #grace-cummings, #symmetry-magazine, #women-in-stem

From Fermi National Accelerator Lab: “Fermilab’s HEPCloud goes live”

FNAL Art Image by Angela Gonzales

From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

August 5, 2019
Marcia Teckenbrock

To meet the evolving needs of high-energy physics experiments, the underlying computing infrastructure must also evolve. Say hi to HEPCloud, the new, flexible way of meeting the peak computing demands of high-energy physics experiments using supercomputers, commercial services and other resources.

Five years ago, Fermilab scientific computing experts began addressing the computing resource requirements for research occurring today and in the next decade. Back then, in 2014, some of Fermilab’s neutrino programs were just starting up. Looking further into the future, plans were under way for two big projects. One was Fermilab’s participation in the future High-Luminosity Large Hadron Collider at the European laboratory CERN.

The other was the expansion of the Fermilab-hosted neutrino program, including the international Deep Underground Neutrino Experiment. All of these programs would be accompanied by unprecedented data demands.

To meet these demands, the experts had to change the way they did business.

HEPCloud, the flagship project pioneered by Fermilab, changes the computing landscape because it employs an elastic computing model. Tested successfully over the last couple of years, it officially went into production as a service for Fermilab researchers this spring.

Scientists on Fermilab’s NOvA experiment were able to execute around 2 million hardware threads at the Cori II supercomputer at the Office of Science’s National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory, named after Gerty Cori, the first American woman to win a Nobel Prize in science. Scientists on the CMS experiment have been running workflows using HEPCloud at NERSC as a pilot project. Photo: Roy Kaltschmidt, Lawrence Berkeley National Laboratory

Experiments currently have a fixed computing capacity that meets, but doesn’t overshoot, their everyday needs. For times of peak demand, HEPCloud enables elasticity, allowing experiments to rent computing resources from other sources, such as supercomputers and commercial clouds, and manages those resources to satisfy the demand. The prior method was to purchase enough local resources to cover peak demand, overshooting day-to-day needs. In this new way, HEPCloud reduces the costs of providing computing capacity.

“Traditionally, we would buy enough computers for peak capacity and put them in our local data center to cover our needs,” said Fermilab scientist Panagiotis Spentzouris, former HEPCloud project sponsor and a driving force behind HEPCloud. “However, the needs of experiments are not steady. They have peaks and valleys, so you want an elastic facility.”

In addition, HEPCloud optimizes resource usage across all types, whether these resources are on site at Fermilab, on a grid such as Open Science Grid, in a cloud such as Amazon or Google, or at supercomputing centers like those run by the DOE Office of Science Advanced Scientific Computing Research program (ASCR). And it provides a uniform interface for scientists to easily access these resources without needing expert knowledge about where and how best to run their jobs.
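As the article notes below, HEPCloud is built on the HTCondor workload management system, so from a user’s point of view a job is still described and handed to the batch system in the usual way while the facility decides where it actually runs. The sketch below is illustrative only: it shows a generic job submission through the standard htcondor Python bindings, with a hypothetical script name and made-up resource requests; it is not a HEPCloud-specific API, and the exact submission call varies between HTCondor versions.

```python
# Illustrative sketch: a generic HTCondor job submission via the htcondor
# Python bindings. HEPCloud sits behind the batch system and decides whether
# jobs land on local Fermilab resources, a commercial cloud or an HPC center;
# none of that routing logic is exposed (or shown) here.
import htcondor

# Describe the job: executable, I/O files and resource requests.
# "run_analysis.sh" and its argument are hypothetical placeholders.
job = htcondor.Submit({
    "executable": "run_analysis.sh",
    "arguments": "dataset_list.txt",
    "output": "job_$(ProcId).out",
    "error": "job_$(ProcId).err",
    "log": "job.log",
    "request_cpus": "1",
    "request_memory": "2GB",
    "request_disk": "4GB",
})

# Hand the jobs to the local schedd; the facility takes it from there.
# (Schedd.submit with a count argument is the HTCondor >= 9.0 interface.)
schedd = htcondor.Schedd()
result = schedd.submit(job, count=10)
print("Submitted cluster", result.cluster())
```

That is the point of the uniform interface described above: the submission side stays the same, while HEPCloud decides behind the scenes whether to provision local, grid, cloud or supercomputing resources.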

The idea to create a virtual facility to extend Fermilab’s computing resources began in 2014, when Spentzouris and Fermilab scientist Lothar Bauerdick began exploring ways to best provide resources for experiments at CERN’s Large Hadron Collider. The idea was to provide those resources based on the overall experiment needs rather than a certain amount of horsepower. After many planning sessions with computing experts from the CMS experiment at the LHC and beyond, and after a long period of hammering out the idea, a scientific facility called “One Facility” was born. DOE Associate Director of Science for High Energy Physics Jim Siegrist coined the name “HEPCloud” — a computing cloud for high-energy physics — during a general discussion about a solution for LHC computing demands. But interest beyond high-energy physics was also significant. DOE Associate Director of Science for Advanced Scientific Computing Research Barbara Helland was interested in HEPCloud for its relevancy to other Office of Science computing needs.

The CMS detector at CERN collects data from particle collisions at the Large Hadron Collider. Now that HEPCloud is in production, CMS scientists will be able to run all of their physics workflows on the expanded resources made available through HEPCloud. Photo: CERN

The project was a collaborative one. In addition to many individuals at Fermilab, Miron Livny at the University of Wisconsin-Madison contributed to the design, enabling HEPCloud to use the workload management system known as Condor (now HTCondor), which is used for all of the lab’s current grid activities.

Since its inception, HEPCloud has achieved several milestones as it moved through the several development phases leading up to production. The project team first demonstrated the use of cloud computing on a significant scale in February 2016, when the CMS experiment used HEPCloud to achieve about 60,000 cores on the Amazon cloud, AWS. In November 2016, CMS again used HEPCloud to run 160,000 cores using Google Cloud Services, doubling the total size of the LHC’s computing worldwide. Most recently, in May 2018, NOvA scientists were able to execute around 2 million hardware threads at a supercomputer at the Office of Science’s National Energy Research Scientific Computing Center (NERSC), increasing both the scale and the amount of resources provided. During these activities, the experiments were executing and benefiting from real physics workflows. NOvA was even able to report significant scientific results at the Neutrino 2018 conference in Germany, one of the most attended conferences in neutrino physics.

CMS has been running workflows using HEPCloud at NERSC as a pilot project. Now that HEPCloud is in production, CMS scientists will be able to run all of their physics workflows on the expanded resources made available through HEPCloud.

Next, HEPCloud project members will work to expand the reach of HEPCloud even further, enabling experiments to use the leadership-class supercomputing facilities run by ASCR at Argonne National Laboratory and Oak Ridge National Laboratory.

Fermilab experts are working to ensure that, eventually, all Fermilab experiments can be configured to use these extended computing resources.

This work is supported by the DOE Office of Science.

See the full article here.



Please help promote STEM in your local schools.

Stem Education Coalition


Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

#fermilabs-hepcloud-goes-live, #accelerator-science, #cern-cms, #cloud-computing, #fnal, #fnal-cms, #hep, #supercomputing

From Fermi National Accelerator Lab: “Fermilab scientists help push AI to unprecedented speeds”


FNAL Art Image by Angela Gonzales

From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

January 29, 2019

Javier Duarte
Sergo Jindariani
Ben Kreis
Nhan Tran


Machine learning is revolutionizing data analysis across academia and industries and is having an impact on our daily lives. Recent leaps in driverless car navigation and the voice recognition features of personal assistants are possible because of this form of artificial intelligence. As data sets in the Information Age continue to grow, companies such as Google and Microsoft are building tools that make machine learning faster and more efficient.

Researchers at Fermilab are taking cues from industry to improve their own “big data” processing challenges.

Data sets in particle physics are growing at unprecedented rates as accelerators are upgraded to higher performance and detectors become more fine-grained and complex. More sophisticated methods are required to analyze these large data sets without losing computing efficiency. For well over two decades, machine learning has proven useful in a wide range of particle physics applications.

To fully exploit the power of modern machine learning algorithms, Fermilab CMS scientists are preparing to deploy these algorithms in the first level of data filtering in their experiment, that is, in the “trigger.”


In particle physics lingo, the trigger is the series of electronics and algorithms used to select which collisions are recorded and which are discarded.

Fermilab scientists are exploring a new approach that uses high-throughput, low-latency programmable microchips called field-programmable gate arrays (FPGAs). The trigger algorithms have to operate in a daunting environment: they must process events at the Large Hadron Collider (LHC) collision rate of 40 MHz and reach a decision in as little as a few hundred nanoseconds.
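To make the selection step concrete, here is a purely conceptual toy in Python: an event is kept only if it contains something potentially interesting. The quantities and thresholds are invented for illustration; the real trigger runs as firmware on custom electronics and FPGAs, not as Python, and its menu of conditions is far richer.

```python
# Toy sketch of a trigger decision: keep a collision event only if it shows
# signs of something interesting. All quantities and thresholds here are
# invented for illustration; the real CMS trigger is implemented in firmware
# within a tight, fixed latency budget.
def trigger_accept(event):
    """Return True if the event should be recorded, False if discarded."""
    has_energetic_muon = any(mu["pt"] > 25.0 for mu in event["muons"])        # GeV
    has_large_energy_sum = event["total_calo_et"] > 200.0                     # GeV
    has_many_jets = sum(1 for jet in event["jets"] if jet["pt"] > 40.0) >= 4
    return has_energetic_muon or has_large_energy_sum or has_many_jets

# Example: an event with one 30 GeV muon passes; a quiet event does not.
print(trigger_accept({"muons": [{"pt": 30.0}], "jets": [], "total_calo_et": 80.0}))  # True
print(trigger_accept({"muons": [], "jets": [], "total_calo_et": 50.0}))              # False
```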


In a growing collaboration with CERN, MIT, University of Florida, University of Illinois at Chicago and other institutions, Fermilab researchers have recently developed a software tool, called hls4ml, that helps users implement their own custom machine learning algorithms on FPGAs. hls4ml translates models built with industry-standard machine learning frameworks, such as Keras, TensorFlow and PyTorch, into instructions for the FPGA, called firmware. This tool leverages a new way to create firmware called high-level synthesis (HLS), which is similar to writing standard software and reduces development time. hls4ml also allows users to take advantage of the capabilities of FPGAs to speed up computations, such as the ability to do many multiplications in parallel with reduced (but sufficient) precision.
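As a rough sketch of that workflow, converting a small Keras network with hls4ml looks something like the following. The layer sizes and output directory are placeholders, and while config_from_keras_model and convert_from_keras_model follow hls4ml’s documented Python interface, the exact options and defaults differ between hls4ml releases and FPGA toolchains.

```python
# Sketch of the hls4ml flow: define (or load) a Keras model, generate an
# hls4ml configuration, and convert the network into an HLS project that
# can then be synthesized into FPGA firmware with the vendor tools.
import hls4ml
from tensorflow import keras

# A small dense classifier, loosely inspired by the jet-tagging example:
# 16 input features and 5 output classes (quark, gluon, W, Z, top).
model = keras.Sequential([
    keras.layers.Dense(64, activation="relu", input_shape=(16,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(5, activation="softmax"),
])

# Generate a default conversion config (fixed-point precision, parallelism).
config = hls4ml.utils.config_from_keras_model(model, granularity="model")

# Translate the network into an HLS project; the output directory is a placeholder.
hls_model = hls4ml.converters.convert_from_keras_model(
    model,
    hls_config=config,
    output_dir="hls4ml_jet_tagger",
)

# Compile a bit-accurate C simulation of the firmware for quick checks;
# hls_model.build() would run the full HLS synthesis step.
hls_model.compile()
```

In practice one would train the model on labeled jets first and inspect the synthesized design’s latency and resource usage before deploying anything in a trigger.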

The first proof-of-concept implementation of the tool showed that a neural network with over 100 hidden neurons could classify jets originating from different particles, such as quarks, gluons, W bosons, Z bosons or top quarks, in under 75 nanoseconds. Neural networks can also be used for tasks such as determining the momentum of a muon passing through the CMS endcap detectors. Using hls4ml, CMS collaborators have shown that the ability to reject fake muons was up to 80 percent better than with previous methods.

Ultrafast, low-latency machine learning inference in FPGA hardware has much broader implications. Beyond real-time LHC data processing, applications can be found in neutrino and dark matter experiments and particle accelerator beamline controls. Even more broadly, accelerating machine learning with specialized hardware such as FPGAs and dedicated circuits called ASICs (application-specific integrated circuits) is an area of active development for large-scale computing challenges. Industry drivers such as Amazon Web Services with Xilinx FPGAs, Microsoft Azure and Intel have invested heavily in FPGAs, while Google has developed its own ASIC (a tensor processing unit, TPU). Specialized hardware platforms coupled with CPUs, referred to as co-processors, are driving the heterogeneous computing revolution. hls4ml can be applied in such co-processor platforms. Combining heterogeneous computing and hls4ml for low-latency machine learning inference offers exciting potential for solving future computing challenges in particle physics.

The authors are members of the Fermilab CMS Department.

See the full article here.



Please help promote STEM in your local schools.

Stem Education Coalition


Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.


FNAL/MINERvA

FNAL DAMIC

FNAL Muon g-2 studio

FNAL Short-Baseline Near Detector under construction

FNAL Mu2e solenoid

Dark Energy Camera [DECam], built at FNAL

FNAL DUNE Argon tank at SURF

FNAL/MicrobooNE

FNAL Don Lincoln

FNAL/MINOS

FNAL Cryomodule Testing Facility

FNAL Minos Far Detector

FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

FNAL/NOvA experiment map

FNAL NOvA Near Detector

FNAL ICARUS

FNAL Holometer

#accelerator-science, #fermilab-scientists-help-push-ai-to-unprecedented-speeds, #fnal, #fnal-cms, #hep, #particle-physics, #physics

From FNAL: “CMS experiment at the LHC sees first 2018 collisions”


FNAL Art Image by Angela Gonzales

Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

April 19, 2018
Cecilia Gerber
Sergo Jindariani

Cecilia Gerber and Sergo Jindariani are co-coordinators of the LHC Physics Center at Fermilab.

After months of winter shutdown, the CMS experiment at the Large Hadron Collider (LHC) is once again seeing collisions and is ready to take data.

CERN CMS Higgs event and pre-Higgs event displays

The shutdown months have been very busy for CMS physicists, who used this downtime to improve the performance of the detector by completing upgrades and repairs of detector components. The LHC will continue running until December 2018 and is expected to deliver an additional 50 inverse femtobarns of integrated luminosity to the ATLAS and CMS experiments. This year of data-taking will conclude Run-2, after which the collider and its experiments will go into a two-year-long shutdown for further upgrades.

Run-2 of the LHC has been highly successful, with close to 100 inverse femtobarns of integrated luminosity already delivered to the experiments in 2016 and 2017. These data sets enabled CMS physicists to perform many measurements of Standard Model parameters and searches for new physics. New data will allow CMS to further advance into previously uncharted territory. Physicists from the LHC Physics Center at Fermilab have been deeply involved in the work during the winter shutdown. They are now playing key roles in processing and certification of data recorded by the CMS detector, while looking forward to analyzing the new data sets for a chance to discover new physics.


This is an event display of one of the early 2018 collisions that took place at the CMS experiment at CERN.

See the full article here.

Please help promote STEM in your local schools.


Stem Education Coalition


Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.



#accelerator-science, #basic-research, #cern-cms, #fnal, #fnal-cms, #hep, #particle-accelerators, #particle-physics, #physics

From FNAL: “From the CMS Center – CMS: design, construction, operations”


Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

Wednesday, Oct. 1, 2014
Steve Nahn, U.S. CMS detector upgrade project manager, wrote this column.

Wilson Hall’s 10th and 11th floors, where we work on CMS, are very busy and sometimes hectic places these days. Rather than progressing sequentially through the design, construction and operations phases of the CMS detector upgrades, we are going through all three simultaneously. This leads to a certain amount of jumping around.


The design component addresses the high-luminosity LHC era commencing in the mid-2020s, at which time the LHC’s total luminosity will increase 10-fold. To exploit the physics opportunities afforded by the more intense beam while coping with increased radiation dose, we must replace or upgrade key components of the detector. A large fraction of the collaboration spent the summer studying what sort of detector we would need in that demanding environment. The result, a 300-plus-page technical proposal, is nearly ready for release, and R&D efforts at Fermilab and collaborating institutes are already framing the technologies needed to make these Phase 2 upgrades a reality.

The construction component, the Phase 1 Upgrade Project, is a set of strategically targeted upgrades to cope with the imminent increased instantaneous luminosity starting next year and continually growing up to the high-luminosity LHC era. The design for this phase is complete, and the job at hand is to build the new sensors, back-end electronics and online triggering system. This project just went through Critical Decision 2 and 3 reviews simultaneously. The conclusion was a resounding recommendation for approval after a few technical details are resolved. The approval, which we hope will come through in November, will allow us to transition into production mode, launching activities at SiDet, Wilson Hall and the Feynman Center at Fermilab, as well as at the 30 collaborating U.S. universities, to move the project from design to installation in the next few years.

Lest we forget, there is the ongoing, operating experiment, perhaps the most exciting of the three phases. The LHC is poised to restart in spring 2015, after a two-year shutdown, at twice the center-of-mass energy, the last significant energy step foreseen. The low mass of the Higgs argues for new physics that may appear in the next run, and the collaboration is gearing up to find it. This involves a program of extended running of the entire detector with cosmic rays before the beam returns, to bring the detector back to peak efficiency; computing challenges to make sure the offline data production is ready; and increased effort on the analysis chain, particularly for potential early high-profile discoveries. A new discovery in 2015 would be fantastic, full stop, and we are committed to ensuring we are ready for such an opportunity.

There is indeed a lot of exciting work going on. And amid all this, there’s still one more thing to mention: Our fearless leader Patty McBride is transitioning from U.S. CMS program manager into her role as head of the Particle Physics Division. We know she isn’t going far — only three floors down in Wilson Hall — but we’ll miss her anyway. We take this opportunity to give her a giant “thank you” for her leadership and tireless efforts up here on the 11th floor. PPD is lucky!

See the full article here.

Fermilab Campus

Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.

ScienceSprings relies on technology from MAINGEAR computers, Lenovo and Dell.

#accelerator-science, #basic-research, #fnal-cms, #hep, #particle-accelerators, #particle-physics

From Fermilab: “From the CMS Center – Getting ready for the second run of the LHC”


Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

Wednesday, June 11, 2014

Kevin Burkett, acting head of the CMS Center, wrote this column.

The end of the LHC shutdown is now in sight, and members of CMS and machine experts are both beginning preparations for the restart of LHC operations in 2015. The current long shutdown started after the completion of LHC Run 1 in February 2013. Run 1 was a tremendous success, and the experiments are still completing all their analyses using the data accumulated during the run.


Last week LHC machine experts gathered near CERN in Evian, France, to discuss plans for LHC operation in 2015. While the final decision on the collision energy will come after hardware tests of the LHC magnets later this year, the goal will be to deliver collisions at a center-of-mass energy of 13 TeV. This is close to the design energy and a significant increase compared to the 8-TeV collisions in 2012. A second goal is to cut the time between collisions in half, from 50 to 25 nanoseconds.

Members of CMS have been active during the shutdown, performing maintenance and improving the detector, as well as working to improve the algorithms used to reconstruct and identify the particles produced in collisions. Experts in computing have focused on improving the efficiency and reliability of the infrastructure while developing new tools for users.

An important milestone in our preparation for the start of data taking in 2015 is the upcoming Computing, Software and Analysis challenge, or CSA14. Simulated data samples are placed at sites around the globe and analyzed by members of the experiment. As the name suggests, this challenge allows us to test the readiness of many of the key aspects of our computing, offline software and physics analysis. Special emphasis will be placed on new procedures for users to access data and on validation of the output from the improved reconstruction algorithms.

Fermilab’s Joel Butler will lead CSA14. The exercise will require significant work from US CMS computing personnel, especially from the Scientific Computing Division. University members of the LHC Physics Center at Fermilab will also be active in CSA14 analysis. With time to address any issues uncovered in CSA14, CMS will be ready to go when the LHC starts up again in 2015.

See the full article here.

Fermilab Campus

Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.


ScienceSprings is powered by MAINGEAR computers

#accelerator-science, #fnal-cms, #hep, #particle-accelerators, #particle-physics

From Fermilab: “Frontier Science Result: CMS – Connecting the dots”


Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

Friday, June 6, 2014

Dr. Don Lincoln wrote this article

When two protons collide in the center of the CMS detector, the collision energy can create hundreds of electrically charged particles. These particles roar through the apparatus, crossing individual detector elements. Each particle marks the location of its passage, leaving a string of dots that can be seen on a computer screen.


One of the trickiest jobs in particle physics is to teach a computer how to connect the dots and reconstruct the tracks of all of the particles that exited the collision. That’s correct: The child’s simple pastime of connect-the-dots can consume the efforts of many of the finest minds in an experiment like CMS. The difficulty stems from the fact that there are hundreds of tracks and that, in a bit of an inconvenient oversight, nobody bothered to put numbers beside the dots to tell the computer which to connect.

Reconstructing tracks is one of the first tasks an experiment must accomplish in order to begin analyzing the data. Before the tracks are identified, the data is a mess of little dots. Once the tracks are determined, scientists can begin to sort out the physical process that occurred by figuring out that this particle went this way while another particle went that way.

In addition to reconstructing the tracks of particles, scientists also reconstruct the origin of the particles. This is the location at which the collision between two protons occurred. Until you know the origin and trajectory of the particles, you can’t even begin to understand what sort of collision was recorded.
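A cartoon version of both tasks, fitting lines through hits and intersecting two fitted tracks to estimate the collision point, fits in a few lines of Python. This is only a toy to make the idea concrete: real CMS tracks curve in the magnetic field, the hits come without labels saying which track they belong to, and the actual pattern recognition is far more sophisticated than a least-squares line fit.

```python
# Toy illustration of "connecting the dots": fit straight lines to noisy
# detector hits and intersect two fitted tracks to estimate where the
# collision happened. Real CMS reconstruction is vastly more involved.
import numpy as np

rng = np.random.default_rng(0)

def make_hits(slope, intercept, n_layers=8, noise=0.02):
    """Simulate hits left by a straight track crossing detector layers."""
    x = np.linspace(1.0, 8.0, n_layers)               # layer positions (cm)
    y = slope * x + intercept + rng.normal(0.0, noise, n_layers)
    return x, y

def fit_track(x, y):
    """Least-squares straight-line fit: returns (slope, intercept)."""
    return np.polyfit(x, y, deg=1)

def vertex(track_a, track_b):
    """Intersect two fitted lines to estimate the collision origin."""
    (m1, b1), (m2, b2) = track_a, track_b
    x0 = (b2 - b1) / (m1 - m2)
    return x0, m1 * x0 + b1

# Two tracks that really did come from the origin (0, 0).
track1 = fit_track(*make_hits(slope=0.5, intercept=0.0))
track2 = fit_track(*make_hits(slope=-1.2, intercept=0.0))
print("estimated vertex:", vertex(track1, track2))   # close to (0, 0)
```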

CMS scientists have worked long and hard to develop the algorithms that accomplish these challenging tasks. In a recent paper, they described the result of their efforts. Particles leaving the collision at angles near 90 degrees, measured from the beam, can be reconstructed about 94 percent of the time. For the special case of isolated muons, the reconstruction probability rises to 100 percent. The location of the origin of the collision can be determined with a precision of about 0.01 millimeters, or about half the size of the finest human hair. These algorithms are fast and flexible, and scientists continue to improve on them in anticipation of the resumption of operations in early 2015.

See the full article here.

Fermilab Campus

Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.


ScienceSprings is powered by MAINGEAR computers

#accelerator-science, #don-lincoln, #fnal-cms, #hep, #particle-accelerators, #particle-physics