Tagged: MARS 2020

  • richardmitnick 1:05 pm on February 8, 2020 Permalink | Reply
Tags: "All About the Laser (and Microphone) Atop Mars 2020, NASA's Next Rover", MARS 2020, SuperCam

    From NASA: “All About the Laser (and Microphone) Atop Mars 2020, NASA’s Next Rover” 

    From NASA

    Feb. 7, 2020

    Andrew Good
    Jet Propulsion Laboratory, Pasadena, Calif.

    Josh Handal
    NASA Headquarters, Washington

    Mars 2020’s mast, or “head,” includes a laser instrument called SuperCam that can vaporize rock material and study the resulting plasma. Credits: NASA/JPL-Caltech


    NASA is sending a new laser-toting robot to Mars. But unlike the lasers of science fiction, this one is used for studying mineralogy and chemistry from up to about 20 feet (7 meters) away. It might help scientists find signs of fossilized microbial life on the Red Planet, too.

    NASA Mars 2020 Rover

    One of seven instruments aboard the Mars 2020 rover that launches this summer, SuperCam was built by a team of hundreds and packs what would typically require several sizable pieces of equipment into something no bigger than a cereal box. It fires a pulsed laser beam out of the rover’s mast, or “head,” to vaporize small portions of rock from a distance, providing information that will be essential to the mission’s success.

    Here’s a closer look at what makes the instrument so special:

    A Far Reach

    Using a laser beam will help researchers identify minerals that are beyond the reach of the rover’s robotic arm or in areas too steep for the rover to go. It will also enable them to analyze a target before deciding whether to guide the rover there for further analysis. Of particular interest: minerals that formed in the presence of liquid water, like clays, carbonates and sulfates. Liquid water is essential to the existence of life as we know it, including microbes, which could have survived on Mars billions of years ago.

    Scientists can also use the information from SuperCam to help decide whether to capture rock cores for the rover’s sample caching system. Mars 2020 will collect these core samples in metal tubes, eventually depositing them at a predetermined location for a future mission to retrieve and bring back to Earth.

    The Mast Unit for Mars 2020’s SuperCam, shown being tested here, will use a laser to vaporize and study rock material on the Red Planet’s surface. Credits: LANL

    Laser Focus

    SuperCam is essentially a next-generation version of the Curiosity rover’s ChemCam. Like its predecessor, SuperCam can use an infrared laser beam to heat the material it impacts to around 18,000 degrees Fahrenheit (10,000 degrees Celsius), vaporizing it — a method called laser-induced breakdown spectroscopy, or LIBS. A special camera can then determine the chemical makeup of these rocks from the resulting plasma.
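A rough illustration of the downstream analysis: LIBS instruments identify elements by matching peaks in the plasma emission spectrum against known atomic line lists. The toy sketch below assumes a tiny, hypothetical line table with rounded textbook wavelengths — it is not SuperCam's actual pipeline or calibration data:

```python
# Toy sketch of LIBS spectral matching (illustrative only; wavelengths
# are rounded textbook values, not SuperCam's calibration data).

# A few strong atomic emission lines, in nanometers (hypothetical subset).
REFERENCE_LINES = {
    "Fe": [371.99, 404.58, 438.35],
    "Ca": [393.37, 396.85, 422.67],
    "Si": [288.16, 390.55],
    "Mg": [279.55, 285.21],
}

def identify_elements(measured_peaks, tolerance_nm=0.5):
    """Match measured peak wavelengths to known emission lines."""
    found = set()
    for peak in measured_peaks:
        for element, lines in REFERENCE_LINES.items():
            if any(abs(peak - line) <= tolerance_nm for line in lines):
                found.add(element)
    return sorted(found)

# Peaks that might come out of a plasma spectrum of a calcium-rich rock:
peaks = [393.4, 396.9, 285.3]
print(identify_elements(peaks))  # ['Ca', 'Mg']
```

Real pipelines fit the full spectrum rather than matching isolated peaks, but the idea — plasma emission lines fingerprint the elements present — is the same.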

    In a test shown here, the SuperCam Mast Unit — which sits in the mast, or “head,” of the Mars 2020 rover — zaps marks across a piece of metal.

    Just like ChemCam, SuperCam will use artificial intelligence to seek out rock targets worth zapping during and after drives, when humans are out of the loop. In addition, this upgraded A.I. lets SuperCam point very precisely at small rock features.

    Another new feature in SuperCam is a green laser that can determine the molecular composition of surface materials. This green beam excites the chemical bonds in a sample and produces a signal depending on which elements are bonded together — a technique called Raman spectroscopy. SuperCam also uses the green laser to cause some minerals and carbon-based chemicals to emit light, or fluoresce.
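Raman signals are conventionally reported as a shift, in wavenumbers, between the excitation line and the scattered light. A minimal sketch of that conversion (the scattered wavelength below is an illustrative value, not SuperCam data):

```python
def raman_shift_cm1(excitation_nm, scattered_nm):
    """Raman shift in wavenumbers (cm^-1) from excitation and scattered wavelengths."""
    return 1e7 / excitation_nm - 1e7 / scattered_nm

# Green laser at 532 nm, scattered light observed near 564.6 nm:
# a shift of about 1085 cm^-1, roughly where the symmetric-stretch
# band of carbonate minerals appears.
shift = raman_shift_cm1(532.0, 564.6)
print(round(shift))  # 1085
```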

    Minerals and organic chemicals fluoresce at different rates, so SuperCam’s light sensor features a shutter that can remain open for as little as 100 nanoseconds at a time — so brief that very few photons of light will enter it. Altering the shutter timing (a technique called time-resolved luminescence spectroscopy) will enable scientists to better determine the compounds present.
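Why short, delayed gates help can be sketched numerically: organic fluorescence typically decays within nanoseconds, while many mineral luminescence signals persist far longer, so opening the detector gate after a short delay isolates the slow component. The lifetimes below are assumed, illustrative values:

```python
import math

# Sketch of time-resolved luminescence discrimination (assumed lifetimes,
# not SuperCam measurements): a fast "organic-like" decay versus a slow
# "mineral-like" decay, compared at two gate-opening delays.

def remaining_fraction(lifetime_ns, gate_start_ns):
    """Fraction of an exponential decay still emitting when the gate opens."""
    return math.exp(-gate_start_ns / lifetime_ns)

organic_lifetime_ns = 5.0      # fast decay (assumed)
mineral_lifetime_ns = 5000.0   # slow decay (assumed)

for gate in (0, 100):
    org = remaining_fraction(organic_lifetime_ns, gate)
    mino = remaining_fraction(mineral_lifetime_ns, gate)
    print(f"gate opens at {gate:>3} ns: organic {org:.2e}, mineral {mino:.3f}")
```

With a 100 ns delay the fast component has decayed away almost entirely while the slow component is nearly untouched, which is what lets different gate timings separate the compounds.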

    Moreover, SuperCam can use visible and infrared (VISIR) light reflected from the Sun to study the mineral content of rocks and sediments. This VISIR technique complements the Raman spectroscopy; each technique is sensitive to different types of minerals.

    Laser With a Mic Check

    SuperCam includes a microphone so scientists can listen each time the laser hits a target. The popping sound created by the laser subtly changes depending on a rock’s material properties.

    “The microphone serves a practical purpose by telling us something about our rock targets from a distance. But we can also use it to directly record the sound of the Martian landscape or the rover’s mast swiveling,” said Sylvestre Maurice of the Institute for Research in Astrophysics and Planetary Science in Toulouse, France.

    The Mars 2020 rover marks the third time this particular microphone design will go to the Red Planet, Maurice said. In the late 1990s, the same design rode aboard the Mars Polar Lander, which crashed on the surface. In 2008, the Phoenix mission experienced electronics issues that prevented the microphone from being used.

    In the case of Mars 2020, SuperCam doesn’t have the only microphone aboard the rover: an entry, descent and landing microphone will capture all the sounds of the car-sized rover making its way to the surface. It will add audio to full-color video recorded by the rover’s cameras, capturing a Mars landing like never before.


    SuperCam is led by Los Alamos National Laboratory in New Mexico, where the instrument’s Body Unit was developed. That part of the instrument includes several spectrometers, control electronics and software.

    The Mast Unit was developed and built by several laboratories of the CNRS (French research center) and French universities under the contracting authority of CNES (French space agency). Calibration targets on the rover deck are provided by Spain’s University of Valladolid.


    JPL is building and will manage operations of the Mars 2020 rover for the NASA Science Mission Directorate at the agency’s headquarters in Washington.




    See the full article here.


    Please help promote STEM in your local schools.

    The National Aeronautics and Space Administration (NASA) is the agency of the United States government that is responsible for the nation’s civilian space program and for aeronautics and aerospace research.

    President Dwight D. Eisenhower established the National Aeronautics and Space Administration (NASA) in 1958 with a distinctly civilian (rather than military) orientation encouraging peaceful applications in space science. The National Aeronautics and Space Act was passed on July 29, 1958, disestablishing NASA’s predecessor, the National Advisory Committee for Aeronautics (NACA). The new agency became operational on October 1, 1958.

    Since that time, most U.S. space exploration efforts have been led by NASA, including the Apollo moon-landing missions, the Skylab space station, and later the Space Shuttle. Currently, NASA is supporting the International Space Station and is overseeing the development of the Orion Multi-Purpose Crew Vehicle and Commercial Crew vehicles. The agency is also responsible for the Launch Services Program (LSP) which provides oversight of launch operations and countdown management for unmanned NASA launches. Most recently, NASA announced a new Space Launch System that it said would take the agency’s astronauts farther into space than ever before and lay the cornerstone for future human space exploration efforts by the U.S.

    NASA science is focused on better understanding Earth through the Earth Observing System, advancing heliophysics through the efforts of the Science Mission Directorate’s Heliophysics Research Program, exploring bodies throughout the Solar System with advanced robotic missions such as New Horizons, and researching astrophysics topics, such as the Big Bang, through the Great Observatories (Hubble, Chandra, Spitzer) and associated programs. NASA shares data with various national and international organizations, such as JAXA’s Greenhouse Gases Observing Satellite.

  • richardmitnick 12:35 pm on December 9, 2018 Permalink | Reply
    Tags: AI at NASA, MARS 2020

    From ars technica: “NASA’s next Mars rover will use AI to be a better science partner” 

    Ars Technica
    From ars technica

    Alyson Behr

    Experience gleaned from the EO-1 satellite will help JPL build science smarts into the next rover.

    NASA Mars 2020 rover schematic


    NASA can’t yet put a scientist on Mars. But in its next rover mission to the Red Planet, NASA’s Jet Propulsion Laboratory is hoping to use artificial intelligence to at least put the equivalent of a talented research assistant there. Steve Chien, head of the AI Group at NASA JPL, envisions working with the Mars 2020 Rover “much more like [how] you would interact with a graduate student instead of a rover that you typically have to micromanage.”

    The 13-minute delay in communications between Earth and Mars means that the movements and experiments conducted by past and current Martian rovers have had to be meticulously planned. While more recent rovers have had the capability of recognizing hazards and performing some tasks autonomously, they’ve still placed great demands on their support teams.
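The quoted 13-minute figure is a representative one-way light time; the actual delay follows directly from the Earth–Mars distance, which varies over the synodic cycle. A quick sketch (distances are approximate):

```python
# One-way signal delay to Mars at the speed of light.
C_KM_S = 299_792.458  # speed of light, km/s

def one_way_delay_minutes(distance_km):
    return distance_km / C_KM_S / 60

# Approximate closest and farthest Earth–Mars distances:
print(f"{one_way_delay_minutes(54.6e6):.1f} min")   # ~3.0 min at closest approach
print(f"{one_way_delay_minutes(401e6):.1f} min")    # ~22.3 min near conjunction
```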

    Chien sees AI’s future role in the human spaceflight program as one in which humans focus on the hard parts, like directing robots in a natural way while the machines operate autonomously and give the humans a high-level summary.

    “AI will be almost like a partner with us,” Chien predicted. “It’ll try this, and then we’ll say, ‘No, try something that’s more elongated, because I think that might look better,’ and then it tries that. It understands what elongated means, and it knows a lot of the details, like trying to fly the formations. That’s the next level.

    “Then, of course, at the dystopian level it becomes sentient,” Chien joked. But he doesn’t see that happening soon.

    Old-school autonomy

    NASA has a long history with AI and machine-learning technologies, Chien said. Much of that history has been focused on using machine learning to help interpret extremely large amounts of data. While much of that machine learning involved spacecraft data sent back to Earth for processing, there’s a good reason to put more intelligence directly on the spacecraft: to help manage the volume of communications.

    Earth Observing-1 (EO-1) was an early example of putting intelligence aboard a spacecraft. Launched in November 2000, EO-1 was originally planned to have a one-year mission, part of which was to test how basic AI could handle some scientific tasks onboard. One of the AI systems tested aboard EO-1 was the Autonomous Sciencecraft Experiment (ASE), a set of software that allowed the satellite to make decisions based on data collected by its imaging sensors. ASE included onboard science algorithms that performed image data analysis to detect trigger conditions to make the spacecraft pay more attention to something, such as interesting features discovered or changes relative to previous observations. The software could also detect cloud cover and edit it out of final image packages transmitted home. EO-1’s ASE could also adjust the satellite’s activities based on the science collected in a previous orbit.

    With volcano imagery, for example, Chien said, JPL had trained the machine-learning software to recognize volcanic eruptions from spectral and image data. Once the software spotted an eruption, it would then act out pre-programmed policies on how to use that data and schedule follow-up observations. For example, scientists might set the following policy: if the spacecraft spots a thermal emission that is above two megawatts, the spacecraft should keep observing it on the next overflight. The AI software aboard the spacecraft already knows when it’s going to overfly the emission next, so it calculates how much space is required for the observation on the solid-state recorder as well as all the other variables required for the next pass. The software can also push other observations off for an orbit to prioritize emerging science.
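The policy described above can be sketched as a simple thresholded trigger. The observation size, target names, and data structures below are hypothetical stand-ins for illustration — this is not EO-1's flight software:

```python
from dataclasses import dataclass

# Hypothetical sketch of an onboard trigger policy like the one described:
# if an observed thermal emission exceeds a threshold, schedule a follow-up
# observation on the next overflight, budgeting solid-state recorder space.

THRESHOLD_MW = 2.0    # megawatts, the example policy threshold from the text
OBS_SIZE_MB = 250.0   # assumed recorder space needed per follow-up observation

@dataclass
class Observation:
    target: str
    thermal_emission_mw: float

def plan_followups(observations, recorder_free_mb):
    """Return targets to re-observe next orbit, within recorder capacity."""
    followups = []
    for obs in sorted(observations, key=lambda o: -o.thermal_emission_mw):
        if obs.thermal_emission_mw > THRESHOLD_MW and recorder_free_mb >= OBS_SIZE_MB:
            followups.append(obs.target)
            recorder_free_mb -= OBS_SIZE_MB
    return followups

obs = [
    Observation("Erebus", 4.1),
    Observation("Etna", 1.2),     # below threshold: ignored
    Observation("Kilauea", 2.7),  # above threshold, but no recorder space left
]
print(plan_followups(obs, recorder_free_mb=300.0))  # ['Erebus']
```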

    2020 and beyond

    “That’s a great example of things that we were able to do and that are now being pushed in the future to more complicated missions,” Chien said. “Now we’re looking at putting a similar scheduling system onboard the Mars 2020 rover, which is much more complicated. Since a satellite follows a very predictable orbit, the only variable that an orbiter has to deal with is the science data it collects.

    “When you plan to take a picture of this volcano at 10am, you pretty much take a picture of the volcano at 10am, because it’s very easy to predict,” Chien continued. “What’s unpredictable is whether the volcano is erupting or not, so the AI is used to respond to that.” A rover, on the other hand, has to deal with a vast collection of environmental variables that shift moment by moment.

    Even for an orbiting satellite, scheduling observations can be very complicated. So AI plays an important role even when a human is making the decisions, said Chien. “Depending on mission complexity and how many constraints you can get into the software, it can be done completely automatically or with the AI increasing the person’s capabilities. The person can fiddle with priorities and see what different schedules come out and explore a larger proportion of the space in order to come up with better plans. For simpler missions, we can just automate that.”
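The kind of priority-driven scheduling Chien describes can be illustrated with a greedy toy planner. Real mission schedulers handle far richer constraints (power, pointing, thermal), and every name and number below is invented for the sketch:

```python
# Minimal sketch of constraint-limited observation scheduling: rank
# requests by priority and pack them into a fixed time budget.

def schedule(requests, budget_minutes):
    """requests: list of (name, priority, duration_minutes). Greedy by priority."""
    plan, used = [], 0.0
    for name, priority, duration in sorted(requests, key=lambda r: -r[1]):
        if used + duration <= budget_minutes:
            plan.append(name)
            used += duration
    return plan

requests = [
    ("dust_devil_watch", 3, 20),
    ("rock_target_A", 9, 45),
    ("sky_survey", 5, 40),
    ("rock_target_B", 8, 30),
]
print(schedule(requests, budget_minutes=90))  # ['rock_target_A', 'rock_target_B']
```

A human in the loop would "fiddle with priorities", in Chien's phrase, by adjusting the priority values and rerunning the planner to explore alternative schedules.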

    Despite the lessons learned from EO-1, Chien said that spacecraft using AI remain “the exception, not the norm. I can tell you about different space missions that are using AI, but if you were to pick a space mission at random, the chance that it was using AI in any significant fashion is very low. As a practitioner, that’s something we have to increase uptake on. That’s going to be a big change.”

    See the full article here.



    Ars Technica was founded in 1998 when Founder & Editor-in-Chief Ken Fisher announced his plans for starting a publication devoted to technology that would cater to what he called “alpha geeks”: technologists and IT professionals. Ken’s vision was to build a publication with a simple editorial mission: be “technically savvy, up-to-date, and more fun” than what was currently popular in the space. In the ensuing years, with formidable contributions by a unique editorial staff, Ars Technica became a trusted source for technology news, tech policy analysis, breakdowns of the latest scientific advancements, gadget reviews, software, hardware, and nearly everything else found in between layers of silicon.

    Ars Technica innovates by listening to its core readership. Readers have come to demand devotion to accuracy and integrity, paired with a willingness to leave each day’s meaningless, click-bait fodder by the wayside. The result is something unique: the unparalleled marriage of breadth and depth in technology journalism. By 2001, Ars Technica was regularly producing news reports, op-eds, and the like, but the company stood out from the competition by regularly providing long thought-pieces and in-depth explainers.

    And thanks to its readership, Ars Technica also accomplished a number of industry leading moves. In 2001, Ars launched a digital subscription service when such things were non-existent for digital media. Ars was also the first IT publication to begin covering the resurgence of Apple, and the first to draw analytical and cultural ties between the world of high technology and gaming. Ars was also first to begin selling its long form content in digitally distributable forms, such as PDFs and eventually eBooks (again, starting in 2001).

  • richardmitnick 12:19 pm on December 6, 2018 Permalink | Reply
    Tags: MARS 2020, Mars 2020 Mastcam-Z from ASU

    From Arizona State University: “Mars 2020 rover mission camera system ‘Mastcam-Z’ testing begins at ASU” 


    From Arizona State University

    December 4, 2018

    Arizona State University research technician and Mars 2020 Mastcam-Z calibration engineer Andy Winhold waited patiently on the loading dock of ASU’s Interdisciplinary Science and Technology Building IV in anticipation of the arrival of a very special delivery.

    On board the delivery truck was precious cargo from Malin Space Science Systems, a test model of “Mastcam-Z,” the mast-mounted camera system for NASA’s Mars 2020 rover mission.

    NASA Mars 2020 rover schematic


    Mars 2020 Mastcam-Z

    The Eyes of NASA’s Next Mars Rover
    Mastcam-Z is the name of the mast-mounted camera system that is equipped with a zoom function on the Mars 2020 rover. Mastcam-Z has cameras that can zoom in, focus, and take 3D pictures and video at high speed to allow detailed examination of distant objects. The principal investigator for the instrument is professor and planetary scientist Jim Bell of the School of Earth and Space Exploration.

    Mastcam-Z is being designed, built and tested under the direction of principal investigator Jim Bell of ASU’s School of Earth and Space Exploration. The dual camera system can zoom in (hence the “Z” in “Mastcam-Z”), focus and take 3D pictures and panoramas at a variety of scales. This will allow the Mars 2020 rover to provide a detailed examination of both close and distant objects.

    The test model that arrived on the Tempe campus in November, otherwise known as an engineering qualification model or EQM, is an important step in designing and building instruments for space. These models not only serve as a way to run the instruments through the rigors of launch and functionality in space, they also serve as a way for the instrument team to evaluate the design and testing plans before the final cameras are fully assembled.

    Testing the Mastcam-Z engineering model

    The engineering model essentially allows the team to do a “dry run” through the complete design and build process of the instrument before the final versions of the cameras are complete.

    “Parts may take longer to build than expected, a certain assembly step may be more difficult than initially thought or resources from third parties could become scarce on short notice,” Winhold said. “These are all things we can learn about and prepare for in advance using the engineering model.”

    The team first verifies that the test instrument operates correctly in terms of parts, power consumption and software. They also use the model to ensure the instrument meets mission requirements in terms of functionality, size and weight. “For Mastcam-Z, one of the primary interests with the engineering model was evaluating the instrument’s ability to change focal length — or zoom,” Winhold said.

    Specifically, the team tested the engineering model in the thermal vacuum chamber, located in ASU’s Interdisciplinary Science and Technology Building IV, to confirm that their support equipment was designed appropriately and allowed the camera to be placed securely in the chamber and view out the chamber’s window clearly. They also timed the tests so they knew how long testing the actual cameras will take, and they tested the IT network’s ability to share data quickly between people inside the cleanroom and other support team members outside of the room and around the world.

    Winhold describes his role on the mission as similar to someone playing the game “Operation,” where the patient is the Mastcam-Z cameras and the tweezers are the support pieces.

    The Mastcam-Z team testing the engineering model in ASU’s cleanrooms. Team members include Jim Bell, Andy Winhold, Alex Hayes, Ken Herkenhoff, Elsa Jensen, Tex Kubacki, Jake Schaffner, Paul Corlies, Christian David Tate, Megan Emch, Kristen Paris, Ernest Cisneros, Winston Carter, Corrine Rojas, Shane Thompson and Rick Hoppe. Photo courtesy ASU

    A calibration target used to assess the image quality of the cameras. It consists of geometric patterns, slanted edges, and very finely spaced lines that evaluate the camera’s optics and their ability to accurately capture the resolution and contrast of the imaged scene on the camera’s image sensor. Photo courtesy ASU

    ASU research technician and Mars 2020 Mastcam-Z calibration engineer Andy Winhold with ASU’s thermal vacuum chamber in ISTB IV on the Tempe campus. The thermal vacuum chamber simulates the space environment so instruments can be tested for the rigors of space exploration. Photo courtesy ASU

    The engineering qualification model — a test model of Mastcam-Z, the mast-mounted camera system for NASA’s Mars 2020 rover mission — in the cleanroom of ISTB IV on the ASU Tempe campus. Photo courtesy ASU

    “But in my case,” said Winhold, “I’m only shown pictures of the board game, and based on those pictures I need to design and create the best tweezers for removing ailments without hurting the patient.”

    And according to the team, the testing has been a success so far.

    “We had a few hiccups we worked around, like cables not being long enough, not understanding best communication procedures, that sort of thing; but nothing truly unexpected,” Winhold said. “That’s exactly how we like things. In testing equipment that will be going to space, a boring day that goes according to procedure is a good one.”

    Next steps for the Mastcam-Z team

    In December, the actual Mastcam-Z flight cameras will arrive on the ASU Tempe campus for testing. They will then be delivered to NASA’s Jet Propulsion Laboratory and installed on the Mars 2020 rover, which will launch in summer 2020, landing on Mars in February 2021. The mission is expected to last at least one Mars year (687 Earth days).

    “The tests we ran on the engineering unit at ASU are almost identical to the tests we’ll be running on the actual cameras when they arrive,” Winhold said.

    Once the instrument is finalized and installed in the Mars 2020 rover, the engineering model continues to have a purpose.

    “Largely it is considered a ‘flight spare’ and will be a back-up unit should something happen to the flight cameras before launch,” Winhold explained. “Once the rover launches in the summer of 2020 we won’t be able to do any hands-on interaction with the flight cameras, though, so we’ll have the engineering model as a reference for possible problem solving and as a reference for subsequent rover missions.”

    About Mastcam-Z

    The cameras weigh about 8.8 pounds and will produce images of color quality similar to that of a consumer digital HD camera (2 megapixels). The cameras will help other Mars 2020 experiments on the rover by looking at the whole landscape and identifying rocks and soil (regolith) that deserve a closer look by other instruments. They will also spot important rocks for the rover to sample and cache on the surface of Mars, for eventual return (by a future mission) to Earth.

    Mastcam-Z’s purpose is to take high resolution panoramic color and 3D images of the Martian surface and features in the atmosphere with a zoom lens to magnify distant targets. It will be mounted on the Mars 2020 rover mast at the eye level of a 6-foot-5-inch person. The two cameras are separated by 9.5 inches to provide stereo vision. These cameras, with their all-seeing sharp vision, will provide images for science team members to pick out the best rocks, to hunt for evidence of past habitability recorded in the geology and texture of the landscape, and to look for signs of past water on Mars.
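The stereo baseline matters because distance is recovered from the disparity between the two cameras' images. A minimal sketch using the published 9.5-inch separation — the focal length and pixel pitch are assumed values for illustration, not Mastcam-Z's actual optics:

```python
# Stereo range from disparity: range = baseline * focal_length / disparity.
# Only the baseline comes from the article; the optics are assumed.

BASELINE_M = 0.241        # 9.5 inches, from the article
FOCAL_LENGTH_MM = 34.0    # assumed zoom setting (illustrative)
PIXEL_PITCH_UM = 7.4      # assumed sensor pixel size (illustrative)

def range_from_disparity(disparity_px):
    """Distance (m) to a feature given its left/right image disparity."""
    focal_px = FOCAL_LENGTH_MM * 1000 / PIXEL_PITCH_UM  # focal length in pixels
    return BASELINE_M * focal_px / disparity_px

for d in (100, 10, 1):
    print(f"disparity {d:>3} px -> range {range_from_disparity(d):7.1f} m")
```

The pattern is general: nearby rocks produce large disparities and precise ranges, while distant features shift by only a pixel or two, which is why a wider baseline improves long-range depth estimates.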

    Mastcam-Z’s principal investigator is Professor Jim Bell of the School of Earth and Space Exploration. The deputy principal investigator is Dr. Justin Maki of NASA’s Jet Propulsion Laboratory, the Planetary Society serves as the instrument’s education and public outreach partner, and the prime subcontractor for instrument development is Malin Space Science Systems, Inc.

    NASA’s Mars 2020 rover mission

    The Mars 2020 rover mission is part of NASA’s Mars Exploration Program, a long-term effort of robotic exploration of the Red Planet. The Mars 2020 mission addresses high-priority science goals for Mars exploration, including key questions about the potential for life on Mars. The mission also seeks to gather knowledge and to demonstrate technologies that address the challenges of future human expeditions to Mars. These include testing a method for producing oxygen from the Martian atmosphere, identifying other resources (such as subsurface water), improving landing techniques, and characterizing weather, dust, and other potential environmental conditions that could affect future astronauts living and working on Mars.

    Mastcam-Z Team

    On February 6, 2018, the Mastcam-Z team captured their traditional team photo in an unusual way: with the stereo testbed model of the camera. Just as the real camera will do on Mars, the testbed rotated to multiple positions to gather in the full scene. To produce this panoramic view, the team corrected the images for geometric distortion and assembled them into a mosaic.

    From left to right, the pictured team members are: Jim Bell, Justin Maki, Jeffrey Johnson, Mark Lemmon, Ken Edgett, Mike Wolff, Ken Herkenhoff, Samantha Jacob, Ed Cloutis, Andy Winhold, Zach Bailey, Danika Wellington, Nicole Schmitz, Rob Sullivan, Peter Martin, Paul Corlies, Jim Bell, Sarah Fagents, Kristen Paris, Stephanie Holaday, Elsa Jensen, Piluca Caballo Perucha, Ernest Cisneros, Jake Adler, Melissa Rice, Christian Tate, Kjartan Kinch, Darian Dixon, Gerhard Paar, Kathleen Hoza, Jon Proton, Jim Bell, and Mat Kaplan.

    Principal Investigator: Jim Bell, Arizona State University

    Deputy Principal Investigator: Justin Maki, NASA’s Jet Propulsion Laboratory

    Education and Public Outreach Partner: The Planetary Society

    Instrument Development: Malin Space Science Systems
    Team Blogs

    What’s the latest on the Mastcam-Z team? Check out the Planetary Society Mastcam-Z team blogs.


    Planetary Society Mastcam-Z Press Room

    NASA Mars 2020 Mission Newsroom

    Additional Resources

    NASA Mastcam-Z webpage

    Planetary Society Mastcam-Z webpage

    See the full article here.



    ASU is the largest public university by enrollment in the United States. Founded in 1885 as the Territorial Normal School at Tempe, the school underwent a series of changes in name and curriculum. In 1945 it was placed under control of the Arizona Board of Regents and was renamed Arizona State College. A 1958 statewide ballot measure gave the university its present name.
    ASU is classified as a research university with very high research activity (RU/VH) by the Carnegie Classification of Institutions of Higher Education, one of 78 U.S. public universities with that designation. Since 2005 ASU has been ranked among the Top 50 research universities, public and private, in the U.S. based on research output, innovation, development, research expenditures, number of awarded patents and awarded research grant proposals. The Center for Measuring University Performance currently ranks ASU 31st among top U.S. public research universities.

    ASU awards bachelor’s, master’s and doctoral degrees in 16 colleges and schools on five locations: the original Tempe campus, the West campus in northwest Phoenix, the Polytechnic campus in eastern Mesa, the Downtown Phoenix campus and the Colleges at Lake Havasu City. ASU’s “Online campus” offers 41 undergraduate degrees, 37 graduate degrees and 14 graduate or undergraduate certificates, earning ASU a Top 10 rating for Best Online Programs. ASU also offers international academic program partnerships in Mexico, Europe and China. ASU is accredited as a single institution by The Higher Learning Commission.

    ASU Tempe Campus
