by Dot Cannon
“What is a fully autonomous vehicle? What does that look like? People are questioning, is there a steering wheel?”
With these words, DSP Concepts Executive Vice President Will Tu opened Wednesday morning’s workshop on “Automotive and Autonomous Vehicle Sensors” at the 2017 SENSORS Expo and Conference.
Three panels would discuss the topic that morning at the McEnery Convention Center in San Jose, examining it from the perspectives of processing, machine learning and software, and sensing technologies.
First, Will offered a definition. “The Society of Automotive Engineers have come up with five different levels of autonomy. Cruise control is kind of like feet off, then hands off, then eyes off…and you finally get to brains off. That’s kind of scary.”
For the next fifty minutes, Will and his first panel would discuss “Processing, CPUs and Analysis.”
Panel One’s perspective
“I believe we are going through the most drastic change the automobile industry’s ever (experienced),” said Boris Shulkin, Vice-President of Research and Development for the world’s largest Tier 1 automotive supplier, Magna International.
Referencing vehicle autonomy levels, he continued, “The real breakthrough is happening between level two and level three (the boundary between monitored and non-monitored driving). I truly believe this has very disruptive potential.”
“We’re specifically focused on the sensor array,” said Wade Appelman, Vice-President of Sales and Marketing at Silicon Valley SiPM (silicon photomultiplier) industry leader SensL. “As providers of these types of sensors, we have to go to the industries and prove that the technology works.”
“We’re focused on AI, machine learning and driverless cars,” said Aniket Saha, Senior Product Manager in ARM’s CPU group. “We obviously are growing. That space is going to be seventeen billion (dollars) by 2025.”
The issues at hand
Panel One’s discussion included the issues of price, power and safety.
“Is there enough (processing) power today to get to autonomous drive?” Will asked.
“I think we’re getting into a period right now when there’s a lot of unknowns,” Boris offered. “Since the requirements aren’t known, one train of thought is to put all the power you can on this vehicle.”
“The key to success is to make it affordable. You have to design a platform in such a way that it’s scalable up but, more important, scalable down.”
“I’m hearing from a lot of customers that there’s not enough processing power,” Will commented.
“One way is with LiDAR,” Wade responded. “There’s the simple way, and (then) the more accurate way, for better performance.”
“We hear, ‘what happens if two LiDAR systems are facing each other and freaking each other out?’ Histogramming is going to be something that people are really looking for.”
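Wade’s mention of histogramming is worth a brief unpacking. In a direct time-of-flight LiDAR, the sensor fires many laser shots and accumulates photon arrival times into a histogram: a real target echoes back at the same delay on every shot and builds a sharp peak, while pulses from another, unsynchronized LiDAR arrive at effectively random times and spread thinly across the bins. The toy sketch below illustrates the general principle only; it is not SensL’s implementation, and every parameter (target range, bin width, shot count, window length) is invented for the example.

```python
import random
from collections import Counter

# Hypothetical illustration (not SensL's implementation): why histogramming
# lets a direct time-of-flight LiDAR reject pulses from a second,
# unsynchronized unit. The real echo repeats at the same delay on every
# shot and piles up in one histogram bin; interference lands at random
# times and spreads thinly across all bins.

SPEED_OF_LIGHT = 3e8      # m/s
TARGET_RANGE_M = 30.0     # invented target distance
BIN_WIDTH_NS = 1.0        # invented timing-bin width
NUM_SHOTS = 200           # laser shots accumulated per measurement
WINDOW_NS = 400.0         # listening window after each shot

true_delay_ns = 2 * TARGET_RANGE_M / SPEED_OF_LIGHT * 1e9  # ~200 ns round trip

histogram = Counter()
for _ in range(NUM_SHOTS):
    # Genuine echo: consistent delay with a little timing jitter.
    echo = random.gauss(true_delay_ns, 0.3)
    histogram[round(echo / BIN_WIDTH_NS)] += 1
    # Interfering LiDAR: uncorrelated with our shot clock, so its pulse
    # falls anywhere in the listening window with uniform probability.
    stray = random.uniform(0.0, WINDOW_NS)
    histogram[round(stray / BIN_WIDTH_NS)] += 1

# The most-populated bin corresponds to the true round-trip delay.
peak_bin, peak_hits = histogram.most_common(1)[0]
est_range_m = peak_bin * BIN_WIDTH_NS * 1e-9 * SPEED_OF_LIGHT / 2
print(f"peak bin: {peak_bin} with {peak_hits} hits")
print(f"estimated range: {est_range_m:.1f} m (true: {TARGET_RANGE_M} m)")
```

Hardware SiPM-based sensors do this accumulation on-chip with far finer timing bins, but the statistics work the same way: correlated signal adds up, uncorrelated interference averages out.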
“It’s all about the system,” Aniket said, in response to an audience member’s question on safety, both now and in five years. “We have committed to increasing the safety standards of (self-driving vehicles).”
“As we move further and further into autonomous (driving), secure connectivity becomes a very important piece,” Boris said. “If you’re driving, and your (vehicle decides it needs to update software), you can’t just stop.”
As in previous discussions on the IoT, the panelists agreed that for autonomous vehicles, “one size fits all” does not apply.
“There’s no one engine that (works best for autonomous drive),” Aniket said. “It depends on what problem you’re trying to solve.”
“The needs of different platforms are going to be different,” Boris commented. “In terms of computers, there’s no single answer there either.”
Will had said earlier that some people projected autonomous driving by the end of this year, but more conservative time estimates came up in the course of Panel One’s discussion.
“In 2020, we’ll start seeing, in small numbers, (self-driving) vehicles in fenced areas,” Boris said. “These blocks are being built today, (and) will be valid very soon. The big question is, what’s going to happen in the larger market.”
To be continued.