Here’s what’s been going on!

  • Exploring New Worlds of Reality – Part 3 February 16, 2018
    by Dot Cannon

    What if you could explore Mars–or build a spacecraft–right from your desk?

    Those are some of the capabilities virtual and augmented reality offer rocket scientists, said Jet Propulsion Laboratory’s Dr. Scott Davidoff during Caltech’s recent “Virtual and Augmented Reality for Space Science and Exploration” symposium, in Pasadena.

    Dr. Scott Davidoff addresses audience from Keck Institute podium

    “One of the things teams have never been able to do is stand around a VR model together,” said Dr. Davidoff, during his after-lunch presentation.  Virtual reality, he added, could create a barrier between “being in it and not being in it”.

    But now, he said, augmented reality allowed teams to test the various scenarios which a spacecraft could encounter.

    “(For the Mars rover), we created a tool that allowed rover drivers to…come up with the safety of different paths,” Davidoff explained.

    Scott Davidoff smiles from Keck Institute podium

    “In many ways, by having these real objects (with) a mix of real and what I call ‘magical properties’ (we can problem-solve),” he continued.

    “I think we can move earlier and earlier (from the computer-assisted design stage, to a place where the team can ask questions).”

    Going to the surface–of Mars

    “What if we could give a scientist a headset that she could put on in her office–and then walk out from behind her desk and see Mars all around her?” asked NASA JPL senior software engineer Parker Abercrombie.

    Abercrombie, who is the lead developer of science targeting software for the upcoming Mars 2020 rover mission, shared during his presentation an application that allows exactly that.

    The OnSight application, he explained, was created in 2016.  Using images the Mars rover sends back to Earth, OnSight’s technology puts scientists virtually on the planet.

    Parker Abercrombie wears a VR headset during his presentation

    “We get the downlink once or twice a day,” Abercrombie explained.  “(Then, OnSight allows for an AR experience) where everything around the user is Mars, except their computer.”

    OnSight, he added, was just a starting point.  “This past summer (using the application), we had (a virtual-reality experience) called Access Mars, developed in collaboration with Google.”

    And while Access Mars is currently available–and free–to the general public, Abercrombie said JPL wants to go much further.

    “The direction we’re interested in going, especially for Mars, is open-data standard,” he explained.

    VR and the user experience
    Dr. Ramaswamy addresses the audience from the Keck lectern

    “If it doesn’t help the user (why build it?),” commented Dr. Emine Basak Alper Ramaswamy.

    Dr. Ramaswamy, who is a data visualization developer at JPL, used her presentation to explore the directions in which virtual reality needs to become more user-friendly.

    Referencing JPL’s signature astrodynamic computing platform, she said, “In reality, MONTE (has) a very steep learning curve.  You have to do everything from scratch.

    “We are just (now) focusing on (what users need),” she continued.  “Think of the websites you saw in the early ’90s.  (They didn’t work well but showed what was possible.  Then they evolved to workability.  Now we need the same facility of use) to come to the VR domain.  We don’t have that yet.”

    Dr. Ramaswamy closeup from the lectern

    “To be able to see something in a VR space, (that enables the viewer) to be more accurate with spatial calculations…Virtual reality gives immediate visual feedback (and) allows users to understand the spatial properties of the trajectories.”

    Efficiency, she said, would be a major benefit of the use of virtual reality.

    “There’s nothing here you cannot do on your desktop.  But if you use (VR) you might do a task in two minutes, instead of ten minutes.”

    The new (virtual) campfire
    Charles White onstage from Keck podium

    “Our virtual environment started (in) caves, 40,000 years ago,” said JPL Knowledge Management Specialist Charles White.

    Storytelling around a campfire, he said, was the first immersive environment.  And storytelling was the common denominator in all immersive environments–from the days of 3D movies to the present.

    But for all the current interest in virtual reality, White said, it isn’t new to NASA.

    “We’ve been doing it since 1985 with these giant clunky headsets,” he explained, showing a slide of the 1980s technology in use.

    Then, White took the audience to some of VR’s modern uses, including astronaut training and flight safety.

    Charles White explains VR concept

    “It’s a lot safer to crash a virtual plane than a real one,” he explained, outlining the ways NASA’s Ames Research Center, in Mountain View, California, used VR in its Future Flight Central facility.

    White also shared a creative project he’d done in virtual reality.  At the time, he and his team were proposing an Earth Science Center for construction in JPL’s Building 264.  But two other groups had proposed alternative projects for that space, as well.

    So, White constructed his proposed Earth Science Center in Second Life.

    “We built three JPL rooms (virtually),” he explained.  “We had a bunch of people in suits–and then I had a unicorn come in.”

    The decision-makers, he continued, authorized construction of the real-life Earth Science Center on the spot.

    “This is our new (storytelling) campfire.  This is a shared experience,” White said.

    A panel explores “how-to”
    Dr. Scott Davidoff with four-person discussion panel at table with black Keck tablecloth banner

    So, what are the obstacles to widespread use of virtual and augmented reality in planning for space missions?

    Just before the mid-afternoon coffee break, a discussion panel on “VR/AR for Mission Design and Ops”, moderated by Dr. Davidoff, considered those issues.

    The sheer magnitude of these applications could be a problem, said Emine Ramaswamy.

    Dr. Ramaswamy speaks on a panel as Charles White listens

    “Especially the God’s-eye view, it’s an unimaginable scale for us,” she commented.  “We have to find a way to make this work with the desktops.”

    “The challenge, designing things in VR, was to be taken seriously,” contributed Charles White.  “You really have this intense identity interaction, when someone’s asked to put a gaming device in a serious application.”

    VR’s accuracy was another consideration.

    Parker Abercrombie speaks on discussion panel

    “It’s never going to be perfect,” said Parker Abercrombie.  “You can control a robot arm from your hand, but as your hand gets further from your body it’s not that precise.”

    Professor Joel W. Burdick makes a point during AR and VR discussion panel

    “I think there needs to be…more complex mapping,” suggested JPL Laboratory Research Scientist Joel W. Burdick, who is also a Caltech professor of both Mechanical Engineering and Bioengineering.

    “(You can equip that robot with a servo control, so it can scan around and find what you’re looking for).”

    Coffee was waiting in the library, along with demos of several different VR systems.  Still to come was the day’s final panel, on “A View from the Industry”.


    This is Part 3 of a 4-part series.  Here are the links to Parts 1 and 2.




