Tagged: MARS 2020

  • richardmitnick 12:35 pm on December 9, 2018
    Tags: AI at NASA, MARS 2020

    From ars technica: “NASA’s next Mars rover will use AI to be a better science partner” 


    Alyson Behr

    Experience gleaned from EO-1 satellite will help JPL build science smarts into next rover.

    NASA Mars 2020 rover schematic. Image: NASA

    NASA can’t yet put a scientist on Mars. But in its next rover mission to the Red Planet, NASA’s Jet Propulsion Laboratory is hoping to use artificial intelligence to at least put the equivalent of a talented research assistant there. Steve Chien, head of the AI Group at NASA JPL, envisions working with the Mars 2020 Rover “much more like [how] you would interact with a graduate student instead of a rover that you typically have to micromanage.”

    The 13-minute delay in communications between Earth and Mars means that the movements and experiments conducted by past and current Martian rovers have had to be meticulously planned. While more recent rovers have had the capability of recognizing hazards and performing some tasks autonomously, they’ve still placed great demands on their support teams.

    Chien sees AI’s future role in the human spaceflight program as one in which humans focus on the hard parts, like directing robots in a natural way while the machines operate autonomously and give the humans a high-level summary.

    “AI will be almost like a partner with us,” Chien predicted. “It’ll try this, and then we’ll say, ‘No, try something that’s more elongated, because I think that might look better,’ and then it tries that. It understands what elongated means, and it knows a lot of the details, like trying to fly the formations. That’s the next level.

    “Then, of course, at the dystopian level it becomes sentient,” Chien joked. But he doesn’t see that happening soon.

    Old-school autonomy

    NASA has a long history with AI and machine-learning technologies, Chien said. Much of that history has been focused on using machine learning to help interpret extremely large amounts of data. While much of that machine learning involved spacecraft data sent back to Earth for processing, there’s a good reason to put more intelligence directly on the spacecraft: to help manage the volume of communications.

    Earth Observing One was an early example of putting intelligence aboard a spacecraft. Launched in November 2000, EO-1 was originally planned to have a one-year mission, part of which was to test how basic AI could handle some scientific tasks onboard. One of the AI systems tested aboard EO-1 was the Autonomous Sciencecraft Experiment (ASE), a set of software that allowed the satellite to make decisions based on data collected by its imaging sensors. ASE included onboard science algorithms that performed image data analysis to detect trigger conditions to make the spacecraft pay more attention to something, such as interesting features discovered or changes relative to previous observations. The software could also detect cloud cover and edit it out of final image packages transmitted home. EO-1’s ASE could also adjust the satellite’s activities based on the science collected in a previous orbit.

    With volcano imagery, for example, Chien said, JPL had trained the machine-learning software to recognize volcanic eruptions from spectral and image data. Once the software spotted an eruption, it would then act out pre-programmed policies on how to use that data and schedule follow-up observations. For example, scientists might set the following policy: if the spacecraft spots a thermal emission that is above two megawatts, the spacecraft should keep observing it on the next overflight. The AI software aboard the spacecraft already knows when it’s going to overfly the emission next, so it calculates how much space is required for the observation on the solid-state recorder as well as all the other variables required for the next pass. The software can also push other observations off for an orbit to prioritize emerging science.
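The trigger-and-follow-up policy Chien describes can be sketched as a small decision routine. The sketch below is purely illustrative: the function and variable names are hypothetical, and only the two-megawatt threshold and the general behavior (reserve recorder space, push lower-priority observations off an orbit) come from the article.

```python
# Illustrative sketch of an ASE-style onboard trigger policy. Names, units,
# and data structures are assumptions, not JPL's actual flight software.

THERMAL_TRIGGER_WATTS = 2e6  # follow up on thermal emissions above two megawatts

def plan_followup(emission_watts, recorder_free_bytes, obs_size_bytes, pending):
    """Decide whether to schedule a follow-up observation on the next overflight.

    pending: list of (priority, size_bytes) for already-scheduled observations,
    where a lower priority number means more important. Returns the updated
    schedule with the follow-up inserted, dropping the least important pending
    observations until the solid-state recorder has room.
    """
    if emission_watts <= THERMAL_TRIGGER_WATTS:
        return pending  # nothing interesting spotted; keep the existing schedule
    schedule = sorted(pending)  # most important observations first
    # Push lower-priority observations off this orbit until the new data fits.
    while schedule and recorder_free_bytes - sum(s for _, s in schedule) < obs_size_bytes:
        schedule.pop()  # drop the least important remaining observation
    schedule.append((0, obs_size_bytes))  # the follow-up gets top priority
    return schedule
```

With a 3 MW detection and a nearly full recorder, the routine bumps the lowest-priority pending observation to make room for the follow-up, mirroring the "push other observations off for an orbit" behavior the article describes.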

    2020 and beyond

    “That’s a great example of things that we were able to do and that are now being pushed in the future to more complicated missions,” Chien said. “Now we’re looking at putting a similar scheduling system onboard the Mars 2020 rover, which is much more complicated. Since a satellite follows a very predictable orbit, the only variable that an orbiter has to deal with is the science data it collects.

    “When you plan to take a picture of this volcano at 10am, you pretty much take a picture of the volcano at 10am, because it’s very easy to predict,” Chien continued. “What’s unpredictable is whether the volcano is erupting or not, so the AI is used to respond to that.” A rover, on the other hand, has to deal with a vast collection of environmental variables that shift moment by moment.

    Even for an orbiting satellite, scheduling observations can be very complicated. So AI plays an important role even when a human is making the decisions, said Chien. “Depending on mission complexity and how many constraints you can get into the software, it can be done completely automatically or with the AI increasing the person’s capabilities. The person can fiddle with priorities and see what different schedules come out and explore a larger proportion of the space in order to come up with better plans. For simpler missions, we can just automate that.”
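The workflow Chien describes, where a person adjusts priorities and inspects the schedules that come out, can be illustrated with a toy greedy scheduler. This is not JPL's software; it is a minimal sketch assuming observations are simply packed into a time budget in priority order.

```python
def greedy_schedule(observations, time_budget):
    """observations: list of (name, priority, duration); a lower priority
    number means more important. Greedily pack the most important observations
    that still fit in the remaining time. A toy stand-in for the
    constraint-based schedulers the article alludes to."""
    chosen, used = [], 0
    for name, _priority, duration in sorted(observations, key=lambda o: o[1]):
        if used + duration <= time_budget:
            chosen.append(name)
            used += duration
    return chosen

obs = [("volcano", 1, 5), ("cloud_survey", 2, 4), ("glacier", 3, 3)]
print(greedy_schedule(obs, time_budget=8))    # ['volcano', 'glacier']

# "Fiddling with priorities": promote the cloud survey and a different
# schedule comes out, which the scientist can then compare against the first.
obs2 = [("volcano", 1, 5), ("cloud_survey", 0, 4), ("glacier", 3, 3)]
print(greedy_schedule(obs2, time_budget=8))   # ['cloud_survey', 'glacier']
```

Rerunning the scheduler under different priority assignments is the toy equivalent of "exploring a larger proportion of the space in order to come up with better plans."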

    Despite the lessons learned from EO-1, Chien said that spacecraft using AI remain “the exception, not the norm. I can tell you about different space missions that are using AI, but if you were to pick a space mission at random, the chance that it was using AI in any significant fashion is very low. As a practitioner, that’s something we have to increase uptake on. That’s going to be a big change.”

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    Ars Technica was founded in 1998 when Founder & Editor-in-Chief Ken Fisher announced his plans for starting a publication devoted to technology that would cater to what he called “alpha geeks”: technologists and IT professionals. Ken’s vision was to build a publication with a simple editorial mission: be “technically savvy, up-to-date, and more fun” than what was currently popular in the space. In the ensuing years, with formidable contributions by a unique editorial staff, Ars Technica became a trusted source for technology news, tech policy analysis, breakdowns of the latest scientific advancements, gadget reviews, software, hardware, and nearly everything else found in between layers of silicon.

    Ars Technica innovates by listening to its core readership. Readers have come to expect devotion to accuracy and integrity, paired with a willingness to leave each day’s meaningless, click-bait fodder by the wayside. The result is something unique: the unparalleled marriage of breadth and depth in technology journalism. By 2001, Ars Technica was regularly producing news reports, op-eds, and the like, but the company stood out from the competition by regularly providing long thought-pieces and in-depth explainers.

    And thanks to its readership, Ars Technica also accomplished a number of industry-leading moves. In 2001, Ars launched a digital subscription service when such things were non-existent for digital media. Ars was also the first IT publication to begin covering the resurgence of Apple, and the first to draw analytical and cultural ties between the world of high technology and gaming. Ars was also first to begin selling its long-form content in digitally distributable forms, such as PDFs and eventually eBooks (again, starting in 2001).

  • richardmitnick 12:19 pm on December 6, 2018
    Tags: MARS 2020, Mars 2020 Mastcam-Z from ASU

    From Arizona State University: “Mars 2020 rover mission camera system ‘Mastcam-Z’ testing begins at ASU” 

    From Arizona State University

    December 4, 2018

    Arizona State University research technician and Mars 2020 Mastcam-Z calibration engineer Andy Winhold waited patiently on the loading dock of ASU’s Interdisciplinary Science and Technology Building IV in anticipation of the arrival of a very special delivery.

    On board the delivery truck was precious cargo from Malin Space Science Systems, a test model of “Mastcam-Z,” the mast-mounted camera system for NASA’s Mars 2020 rover mission.

    NASA Mars 2020 rover schematic. Image: NASA

    Mars 2020 Mastcam-Z

    The Eyes of NASA’s Next Mars Rover
    Mastcam-Z is the name of the mast-mounted camera system that is equipped with a zoom function on the Mars 2020 rover. Mastcam-Z has cameras that can zoom in, focus, and take 3D pictures and video at high speed to allow detailed examination of distant objects. The principal investigator for the instrument is professor and planetary scientist Jim Bell of the School of Earth and Space Exploration.

    Mastcam-Z is being designed, built and tested under the direction of principal investigator Jim Bell of ASU’s School of Earth and Space Exploration. The dual camera system can zoom in (hence the ‘Z’ in ‘Mastcam-Z’), focus, and take 3D pictures and panoramas at a variety of scales, allowing the Mars 2020 rover to examine both close and distant objects in detail.

    The test model that arrived on the Tempe campus in November, otherwise known as an engineering qualification model or EQM, is an important step in designing and building instruments for space. These models not only let the team put the instruments through the rigors of launch and operation in space; they also let the instrument team evaluate the design and testing plans before the final cameras are fully assembled.

    Testing the Mastcam-Z engineering model

    The engineering model essentially allows the team to do a “dry run” through the complete design and build process of the instrument before the final versions of the cameras are complete.

    “Parts may take longer to build than expected, a certain assembly step may be more difficult than initially thought or resources from third parties could become scarce on short notice,” Winhold said. “These are all things we can learn about and prepare for in advance using the engineering model.”

    The team first verifies that the test instrument operates correctly in terms of parts, power consumption and software. They also use the model to ensure the instrument meets mission requirements in terms of functionality, size and weight. “For Mastcam-Z, one of the primary interests with the engineering model was evaluating the instrument’s ability to change focal length — or zoom,” Winhold said.

    Specifically, the team tested the engineering model in the thermal vacuum chamber, located in ASU’s Interdisciplinary Science and Technology Building IV, to confirm that their support equipment was designed appropriately, held the camera securely in the chamber, and gave it a clear view out the chamber’s window. They also timed the tests so they would know how long testing the actual cameras will take, and they verified the IT network’s ability to share data quickly between people inside the cleanroom and support team members outside the room and around the world.

    Winhold describes his role on the mission as similar to someone playing the game “Operation,” where the patient is the Mastcam-Z cameras and the tweezers are the support pieces.

    The Mastcam-Z team testing the engineering model in ASU’s cleanrooms. Team members include Jim Bell, Andy Winhold, Alex Hayes, Ken Herkenhoff, Elsa Jensen, Tex Kubacki, Jake Schaffner, Paul Corlies, Christian David Tate, Megan Emch, Kristen Paris, Ernest Cisneros, Winston Carter, Corrine Rojas, Shane Thompson and Rick Hoppe. Photo courtesy ASU

    A calibration target used to assess the image quality of the cameras. It consists of geometric patterns, slanted edges, and very finely spaced lines that exercise the cameras’ optics and their ability to accurately capture the resolution and contrast of the imaged scene on the image sensor. Photo courtesy ASU

    ASU research technician and Mars 2020 Mastcam-Z calibration engineer Andy Winhold with ASU’s thermal vacuum chamber in ISTB IV on the Tempe campus. The thermal vacuum chamber simulates the space environment so instruments can be tested for the rigors of space exploration. Photo courtesy ASU

    The engineering qualification model — a test model of Mastcam-Z, the mast-mounted camera system for NASA’s Mars 2020 rover mission — in the cleanroom of ISTB IV on the ASU Tempe campus. Photo courtesy ASU

    “But in my case,” said Winhold, “I’m only shown pictures of the board game, and based on those pictures I need to design and create the best tweezers for removing ailments without hurting the patient.”

    And according to the team, the testing has been a success so far.

    “We had a few hiccups we worked around, like cables not being long enough, not understanding best communication procedures, that sort of thing; but nothing truly unexpected,” Winhold said. “That’s exactly how we like things. In testing equipment that will be going to space, a boring day that goes according to procedure is a good one.”

    Next steps for the Mastcam-Z team

    In December, the actual Mastcam-Z flight cameras will arrive on the ASU Tempe campus for testing. They will then be delivered to NASA’s Jet Propulsion Laboratory and installed on the Mars 2020 rover, which will launch in summer 2020, landing on Mars in February 2021. The mission is expected to last at least one Mars year (687 Earth days).

    “The tests we ran on the engineering unit at ASU are almost identical to the tests we’ll be running on the actual cameras when they arrive,” Winhold said.

    Once the instrument is finalized and installed in the Mars 2020 rover, the engineering model continues to have a purpose.

    “Largely it is considered a ‘flight spare’ and will be a back-up unit should something happen to the flight cameras before launch,” Winhold explained. “Once the rover launches in the summer of 2020 we won’t be able to do any hands-on interaction with the flight cameras, though, so we’ll have the engineering model as a reference for possible problem solving and as a reference for subsequent rover missions.”

    About Mastcam-Z

    The cameras weigh about 8.8 pounds and will produce images of color quality similar to that of a consumer digital HD camera (2 megapixels). The cameras will help other Mars 2020 experiments on the rover by looking at the whole landscape and identifying rocks and soil (regolith) that deserve a closer look by other instruments. They will also spot important rocks for the rover to sample and cache on the surface of Mars, for eventual return (by a future mission) to Earth.

    Mastcam-Z’s purpose is to take high resolution panoramic color and 3D images of the Martian surface and features in the atmosphere with a zoom lens to magnify distant targets. It will be mounted on the Mars 2020 rover mast at the eye level of a 6-foot-5-inch person. The two cameras are separated by 9.5 inches to provide stereo vision. These cameras, with their all-seeing sharp vision, will provide images for science team members to pick out the best rocks, to hunt for evidence of past habitability recorded in the geology and texture of the landscape, and to look for signs of past water on Mars.
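For readers curious how a 9.5-inch camera separation yields range information: under a standard pinhole stereo model, range equals focal length times baseline divided by disparity. Only the baseline below comes from the article; the focal length in pixels and the example disparity are assumed values for illustration, not published Mastcam-Z specifications.

```python
BASELINE_M = 9.5 * 0.0254  # the article's 9.5-inch stereo separation, ~0.241 m

def stereo_range_m(focal_length_px, disparity_px):
    """Pinhole-model range estimate: range = focal * baseline / disparity.
    A larger disparity between the left and right images means a closer target.
    focal_length_px is an assumed camera parameter, not a Mastcam-Z spec."""
    return focal_length_px * BASELINE_M / disparity_px

# With an assumed 1000-pixel focal length, a feature shifted 10 px between
# the two eyes would sit roughly 24 m away.
print(round(stereo_range_m(1000, 10), 2))
```

The same relation shows why a wide baseline helps: doubling the separation doubles the disparity at a given range, making distant rocks easier to place in 3D.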

    Mastcam-Z’s principal investigator is Professor Jim Bell of the School of Earth and Space Exploration. The deputy principal investigator is Dr. Justin Maki of NASA’s Jet Propulsion Laboratory, the Planetary Society serves as the instrument’s education and public outreach partner, and the prime subcontractor for instrument development is Malin Space Science Systems, Inc.

    NASA’s Mars 2020 rover mission

    The Mars 2020 rover mission is part of NASA’s Mars Exploration Program, a long-term effort of robotic exploration of the Red Planet. The Mars 2020 mission addresses high-priority science goals for Mars exploration, including key questions about the potential for life on Mars. The mission also seeks to gather knowledge and to demonstrate technologies that address the challenges of future human expeditions to Mars. These include testing a method for producing oxygen from the Martian atmosphere, identifying other resources (such as subsurface water), improving landing techniques, and characterizing weather, dust, and other potential environmental conditions that could affect future astronauts living and working on Mars.

    Mastcam-Z Team

    On February 6, 2018, the Mastcam-Z team captured their traditional team photo in an unusual way: with the stereo testbed model of the camera. Just as the real camera will do on Mars, the testbed rotated to multiple positions to gather in the full scene. To produce this panoramic view, the team corrected the images for geometric distortion and assembled them into a mosaic.

    From left to right, the pictured team members are: Jim Bell, Justin Maki, Jeffrey Johnson, Mark Lemmon, Ken Edgett, Mike Wolff, Ken Herkenhoff, Samantha Jacob, Ed Cloutis, Andy Winhold, Zach Bailey, Danika Wellington, Nicole Schmitz, Rob Sullivan, Peter Martin, Paul Corlies, Jim Bell, Sarah Fagents, Kristen Paris, Stephanie Holaday, Elsa Jensen, Piluca Caballo Perucha, Ernest Cisneros, Jake Adler, Melissa Rice, Christian Tate, Kjartan Kinch, Darian Dixon, Gerhard Paar, Kathleen Hoza, Jon Proton, Jim Bell, and Mat Kaplan.

    Principal Investigator: Jim Bell, Arizona State University

    Deputy Principal Investigator: Justin Maki, NASA’s Jet Propulsion Laboratory

    Education and Public Outreach Partner: The Planetary Society

    Instrument Development: Malin Space Science Systems

    Team Blogs

    What’s the latest on the Mastcam-Z team? Check out the Planetary Society Mastcam-Z team blogs.


    Planetary Society Mastcam-Z Press Room

    NASA Mars 2020 Mission Newsroom

    Additional Resources

    NASA Mastcam-Z webpage

    Planetary Society Mastcam-Z webpage

    See the full article here.



    ASU is the largest public university by enrollment in the United States. Founded in 1885 as the Territorial Normal School at Tempe, the school underwent a series of changes in name and curriculum. In 1945 it was placed under control of the Arizona Board of Regents and was renamed Arizona State College. A 1958 statewide ballot measure gave the university its present name.
    ASU is classified as a research university with very high research activity (RU/VH) by the Carnegie Classification of Institutions of Higher Education, one of 78 U.S. public universities with that designation. Since 2005 ASU has been ranked among the Top 50 research universities, public and private, in the U.S. based on research output, innovation, development, research expenditures, number of awarded patents and awarded research grant proposals. The Center for Measuring University Performance currently ranks ASU 31st among top U.S. public research universities.

    ASU awards bachelor’s, master’s and doctoral degrees in 16 colleges and schools on five locations: the original Tempe campus, the West campus in northwest Phoenix, the Polytechnic campus in eastern Mesa, the Downtown Phoenix campus and the Colleges at Lake Havasu City. ASU’s “Online campus” offers 41 undergraduate degrees, 37 graduate degrees and 14 graduate or undergraduate certificates, earning ASU a Top 10 rating for Best Online Programs. ASU also offers international academic program partnerships in Mexico, Europe and China. ASU is accredited as a single institution by The Higher Learning Commission.

    ASU Tempe Campus
