by Dot Cannon
“Are we there yet?” is a question familiar to any parent on a road trip.
It’s also one that a brand-new “Automotive and Autonomous Vehicle Sensors Workshop” considered carefully, from three different perspectives, at Wednesday’s 2017 SENSORS Expo in San Jose.
By 11:00 Wednesday morning, Panel One had explored “Processing, CPUs and Analysis.” Panel Two followed with a discussion of self-driving vehicles from the angle of “Machine Learning and Software.”
Following a break, moderator and DSP Concepts Executive Vice President Will Tu seated Panel Three for a discussion of “Sense Technologies.”
“We tackle very difficult imaging challenges,” said ON Semiconductor® Senior Director of Product Marketing Geoff Balew, introducing himself. “We have kind of this fresh cycle of new imaging technology coming through.
“As we’ve grown, the company (isn’t) just deploying technologies,” he continued. “We provide a variety of electronic devices around that image sensor. (And we can even design them.)”
In his introductory remarks, Rourke Patullo, Michigan Applied Robotics Group (MARG) co-founder and VP of UM Autonomy, addressed a problem his organization was working to correct.
“We’re facing a situation where (undergraduates aren’t being taught to make ‘smart’ vehicles),” he said. “We bring (them) up from knowing absolutely nothing, to where they can go out and compete, and sometimes beat (more experienced developers).”
“It’s very sober and refreshing to hear (these discussions),” commented Boris Shulkin, Vice President of Research and Development for the world’s largest tier 1 automotive supplier, Magna International. Boris had previously participated in Panel One.
“Real work is being done. People are focusing on solving real problems.”
Sensing technologies: what needs to happen next?
“Sensor counts and data rates are certainly growing,” Will began, addressing his first question to the panel. “What technologies can balance the desire for more data with limitations?”
Exploration was essential, Geoff responded. “When you put a rack of computers in the car, you can do things that you can’t do if you’re worried about productizing. It takes people reaching, to understand (the minimum requirements) to deliver certain capabilities.”
As with Panels One and Two, safety considerations came up almost immediately in Panel Three’s discussion.
“I think a lot of people are looking at sensors as we (navigate) Levels 3 and 4,” Will said to the panel. “Isn’t there another suite that needs to be applied?”
(For autonomous vehicles, Level 3 is the “eyes off” and “hands temporarily off” level. Level 4 is eyes off, hands off and minimal driver involvement, while Level 5, according to a “roadmap” slide Will presented, is a completely self-driving vehicle.)
Rourke referenced the first known self-driving car fatality in his reply. “The (Tesla) crash in Florida (in May 2016, which killed driver Joshua Brown) resulted when the camera ultimately failed. When does the driver regain control?”
Boris pointed out a need to look at the big picture, when developing sensing technologies.
“You may think of one sensor as delivering x, y, z. But if you know that the other sensor can deliver something different…it’s very critical to look at the entire platform as one organism (rather than one sensor at a time).”
“As we look forward to more car autonomy, we’re moving towards, ‘There’s something in front of me, I need to look at it,’” Geoff said.
Weather was another issue, Rourke said. “In Michigan, sleet…will stop the sensor…What will you encounter in the environment, in terms of ice or emergency vehicles?”
Next, Will asked the panel about LIDAR (light detection and ranging, using a pulsed laser). Panelists agreed that its cost could be prohibitive.
“We see lots of great people, lots of great technologies,” Geoff said, “but sometimes there’s a price tag on it that makes it not for us.”
“One of the biggest barriers to entry, to the teams that want to join our group, (is that) they can’t afford LIDAR,” Rourke contributed. “(The first company that gets LIDAR down to an affordable level will really advance.)”
But Geoff also pointed out LIDAR’s strong points, along with its weaknesses. “Image sensors are really nice because (their capability doubles approximately every two years, as they develop), but …they are most challenged in very dark scenes,” he said.
“Radar is very strong at telling you where an object is, and also, its velocity. Radar also bounces off things. LIDAR is kind of a combination of the two, (with) some of the strengths and weaknesses of (both image sensors and radar).”
Will followed up on Rourke’s comment by asking, “What sensors are better in weather conditions?”
“That’s where radar really shines,” Geoff said. “Radar can go through dense fogs.”
And Boris pointed out that today’s best innovations may not be tomorrow’s.
“Technology is not sitting here, still,” he said. “Things are happening in the labs (that will change capabilities). The magic is really once you understand the sensors (and can combine them).
“Sensors today at Level 1 and Level 2 can’t do anything (more).”
As with the other panels, Panel Three agreed that autonomous vehicles weren’t yet ready to be part of everyday life. More study and development were essential, the panelists said. But all of them saw the industry moving closer to a day when “smart” vehicles take to the road everywhere.
“Even in the cabin, there’s driver monitoring,” Geoff said. “Especially with more autonomous vehicles, there’ll be more sensors in the car, to provide services for the human.”
“Michigan is in a wonderful position for acting,” commented Rourke. “What Ann Arbor is exploring: what level of redundancy do you need for sensors to be successful (in autonomous vehicles)?”
“Cost effective is an absolute must,” said Boris. (“But it’s a combination of things. I don’t think you can cover it with one solution.”)
And as the workshop concluded, Will offered a look into the “fun” aspect autonomous vehicles could add to life in the future.
“There are cars that are participating in autonomous competitions,” he said. “One was this autonomous race. I would love to see more (of that) to get a sense of where this industry is going.”