This year’s edition of the Consumer Electronics Show (CES) in Las Vegas again took on the trappings of a vast global automotive exposition as nearly 300,000 square feet of the record-setting 2.75 million square feet of overall display space featured a wide selection of vehicle-related rollouts, updates and product demonstrations.
CES exhibitors and attendees were connecting over connectivity, exploring numerous autonomous concept vehicles along with current and future component-based breakthroughs, and it’s quite likely that significant professional connections were made to cultivate new collaborative ventures.
Forming partnerships amid a crowded, fast-moving field of inventions and refinements remains a sound business strategy, observes Young Sohn, president and chief strategy officer of Samsung Electronics, who also serves as chairman of the board at Harman, acquired by Samsung for $8 billion in 2017. “Building an autonomous platform requires close collaboration across industry, as one company cannot deliver on this enormous opportunity alone,” Sohn says. “The challenge is simply too big and too complex.”
Bringing autonomous-related products to market can be a costly and time-consuming proposition, given that many miles of actual on-road travel may be required to assemble the necessary data and work out the bugs. Aftermarket executives have been taking notice of the sizable CES selection of applications for simulating true-to-life driving conditions, which create a more cost-effective environment for engineering and testing product development projects.
A partnership formed last year between India’s Tata Elxsi and Britain’s Spirent Communications has been instrumental in developing a V2X (“vehicle-to-everything”) system for manufacturers of autonomous equipment that provides flexible, scalable and comprehensive testing and performance benchmarking throughout the development cycle – ranging from early research to pre-production.
Using a combination of Tata Elxsi’s patent-pending V2X Emulator Software and Spirent’s advanced solutions for GNSS (Global Navigation Satellite System) and radio channel simulation, “the integrated V2X test bed offers the ability to bring real-world traffic scenarios into the lab and thereby significantly reduces cost and time associated with extensive field testing,” according to the companies.
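To make the idea concrete, here is a minimal Python sketch of how such a lab scenario might be scripted: emulated vehicles broadcast basic safety messages at 10 Hz while a crude channel model injects packet loss and latency. This is an illustration only, not Tata Elxsi’s or Spirent’s actual tooling; the message fields, rates and loss figures are simplified assumptions.

import random
from dataclasses import dataclass

# Hypothetical lab V2X scenario: emulated remote vehicles broadcast
# Basic Safety Messages (BSMs) at 10 Hz, and a simple channel model
# drops and delays packets before they reach the device under test.

@dataclass
class BasicSafetyMessage:
    vehicle_id: str
    timestamp: float   # seconds into the scenario
    position_m: float  # distance along the road, meters
    speed_mps: float   # meters per second

def emulate_scenario(duration_s=5.0, tx_rate_hz=10.0):
    """Generate BSMs for two emulated vehicles approaching an intersection."""
    messages, t, dt = [], 0.0, 1.0 / tx_rate_hz
    while t < duration_s:
        for vid, speed in (("veh_A", 14.0), ("veh_B", 9.0)):
            messages.append(BasicSafetyMessage(vid, t, speed * t, speed))
        t += dt
    return messages

def radio_channel(messages, loss_prob=0.05, max_delay_s=0.02):
    """Crude channel model: random packet loss plus bounded latency."""
    received = [(m.timestamp + random.uniform(0, max_delay_s), m)
                for m in messages if random.random() > loss_prob]
    return sorted(received, key=lambda pair: pair[0])

sent = emulate_scenario()
got = radio_channel(sent)
print(f"sent={len(sent)} received={len(got)} "
      f"loss={100 * (1 - len(got) / len(sent)):.1f}%")

Because the whole scenario lives in software, a test bench can replay the same traffic situation thousands of times under different channel conditions, something that is impractical on a real road.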
The Tata Elxsi/Spirent program has been adopted by the prestigious China Academy of Information and Communications Technology (CAICT) and approved by the OmniAir Consortium’s Connected Vehicle Certification Program.
Based in Washington, D.C., with a worldwide presence, OmniAir counts among its members public agencies, private companies, research institutions and independent laboratories involved with perfecting connected vehicles, Intelligent Transportation Systems (ITS) and road tolling technology.
“Our V2X device certification program is designed to advance safer transportation systems and connected car communications by improving the mobility, efficiency and interoperability of ground transportation networks,” explains OmniAir Executive Director Jason Conley.
“Our products help partners and customers deploy their solutions assured that they meet requirements regarding functionality, interoperability and performance as outlined by the OmniAir Consortium and in forthcoming regulations by the U.S. Department of Transportation,” says Abhitesh Kastuar, general manager of Spirent’s automotive division. “We are fully committed to supporting the evolution of these standards and specifications as industry needs evolve.”
Future-proof technology
Samsung’s DRVLINE, an open, modular and scalable hardware- and software-based platform debuting at CES, is being positioned as “the go-to partner” for OEMs and Mobility as a Service (MaaS) providers in the autonomous driving market. A $300-million Samsung Automotive Innovation Fund has been established along with a series of investments and partnerships designed to promote collaboration.
“Many hardware and software autonomous driving platforms force the end-user to adopt particular technology as an all-or-nothing black-box package,” according to the company. “The DRVLINE platform, however, has been designed so vendors can collaborate, the software can be customized or enhanced, and individual components and technologies can be swapped in and out as needed.”
This quality also helps to “future-proof” the system – an essential consideration in such a fast-changing industry: OEMs and their suppliers “can market the most advanced existing autonomous technology while incorporating new innovations as they work toward Level 5 automation.” (Level 5 is defined as “Full Automation” under SAE International standards.)
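The swap-in, swap-out idea can be pictured with a short sketch. A minimal Python illustration follows, assuming hypothetical interface and vendor names; DRVLINE’s real interfaces are not public and are certainly far richer than this.

from abc import ABC, abstractmethod

# Illustration of a modular autonomy stack in the spirit described
# above: the platform depends only on a narrow interface, so one
# vendor's module can be replaced without touching the rest.

class LaneDetector(ABC):
    @abstractmethod
    def detect(self, frame: bytes) -> list:
        """Return lane-boundary offsets in meters."""

class VendorALaneDetector(LaneDetector):
    def detect(self, frame: bytes) -> list:
        return [-1.8, 1.8]    # placeholder output

class VendorBLaneDetector(LaneDetector):
    def detect(self, frame: bytes) -> list:
        return [-1.75, 1.75]  # placeholder output

class AutonomyPlatform:
    def __init__(self, lane_detector: LaneDetector):
        self.lane_detector = lane_detector   # injected, not hard-wired

    def swap_component(self, lane_detector: LaneDetector):
        self.lane_detector = lane_detector   # the "future-proof" upgrade path

    def step(self, frame: bytes) -> list:
        return self.lane_detector.detect(frame)

platform = AutonomyPlatform(VendorALaneDetector())
print(platform.step(b"\x00"))
platform.swap_component(VendorBLaneDetector())   # swap a component in the field
print(platform.step(b"\x00"))

Because the platform programs against the interface rather than any single vendor’s implementation, a more capable detector can be dropped in later without rewriting the surrounding stack, which is the essence of the future-proofing claim.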
DRVLINE includes a new Advanced Driver Assistance System (ADAS) forward-facing camera system created by Samsung/Harman and engineered to meet New Car Assessment Program (NCAP) standards promulgated by the National Highway Traffic Safety Administration (NHTSA). Lane departure warning, forward collision warning, pedestrian detection and automatic emergency braking are additional features.
“In a car, the human brain is constantly performing incredibly complex calculations while driving,” explains John Absmeier, senior vice president of the Autonomous/ADAS Strategic Business Unit at Harman and Samsung’s vice president of smart machines: “How far is that lamppost? Is that pedestrian going to step into the street? How long until the yellow light turns red? The industry has made incredible advances in automation, yet in-car compute is still a long way from approximating the power of our brains,” he points out. “The DRVLINE platform with its open and high-level compute capability is a major first step toward building an ecosystem to support full autonomy.”
Six degrees of freedom
Mitsubishi Electric Corp. (MELCO) was advancing several ADAS initiatives at CES. “We’re showing our customers how they can make a seamless transition to autonomous and the new world of mobility by incorporating a number of new technologies into their current roadmaps,” says Mark Rakoski, vice president of engineering. “We have a culture of innovation and operational excellence that extends far beyond automotive, so we’re in a unique position to help automakers transition to lifestyle brands.”
Among Mitsubishi’s latest developments are Predictive Human Machine Interface (HMI) and hybrid haptics for providing a personalized interface without manual customization; Biometric Authentication, in which wearables can replace typical key fobs; High-precision Autonomous Mapping for centimeter-level accuracy; Mobile Payment Integration, allowing cars to order, pay for, and route their occupants to food, services or purchases while consolidating individual orders and payments; and Car-to-home Integration, which lets cars work in harmony with smart homes in a seamless and intuitive way to offer new conveniences and cost savings.
Also in the company’s lineup, and displayed at CES for the first time in North America, is the EMIRAI 4 Smart Mobility Concept Car. Other Mitsubishi features include a driver-monitoring system that uses a single wide-angle camera to simultaneously detect both driver and front passenger at a lower cost than competitive offerings, along with a new Roadway Illumination signaling system that projects onto the road surface and generates car-body displays to alert pedestrians, cyclists and other traffic when a vehicle is about to back up or a door is about to open.
The new Fingerprint Base Map introduced by Civil Maps “allows self-driving cars to precisely determine their location in six degrees of freedom (“6DoF”) while evaluating the safest route to travel,” according to the company.
“With Fingerprint Base Map, developers now have a reliable, scalable solution for self-driving localization and navigation that does not blow through AV (autonomous vehicle) operation budgets,” says CEO and co-founder Sravan Puttagunta.
“With our compact map data format, what once required weeks and months to compile can now be executed more efficiently, in-vehicle, in real time and while the car is driving,” he says. With a data footprint that is up to 10,000 times smaller than traditional base maps, the technology “enables autonomous vehicle developers to radically reduce the costs associated with data processing, computing power, data storage, bandwidth and energy consumption.”
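As a rough illustration of the concept, a 6DoF pose and a toy signature-based map lookup might look like the following Python sketch. Civil Maps’ actual map format and matching algorithm are proprietary; the four-number “fingerprint” here is invented purely to show the shape of the approach, compressing a scan into a tiny signature that can be matched against equally tiny map entries.

import math
from dataclasses import dataclass

# Toy sketch of 6DoF localization against a compact "fingerprint" map.
# The signature scheme below is hypothetical, not Civil Maps' format.

@dataclass
class Pose6DoF:
    x: float; y: float; z: float           # translation, meters
    roll: float; pitch: float; yaw: float  # rotation, radians

def signature(ranges):
    """Compress a LiDAR-like range scan into four summary numbers."""
    n = len(ranges)
    return [sum(ranges[i::4]) / (n / 4) for i in range(4)]

def localize(scan, base_map):
    """Pick the map pose whose stored signature best matches the live scan."""
    sig = signature(scan)
    _, pose = min(base_map, key=lambda entry: math.dist(sig, entry[0]))
    return pose

base_map = [
    (signature([5.0] * 16), Pose6DoF(0, 0, 0, 0, 0, 0)),
    (signature([9.0] * 16), Pose6DoF(40, 0, 0, 0, 0, 0.1)),
]
print(localize([8.8] * 16, base_map))   # matches the second map entry

The economics follow directly: if each map entry is a handful of numbers rather than a dense point cloud, storage, bandwidth and on-board compute all shrink accordingly.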
Robotic perception
A Berlin-based startup called autoaid was at CES highlighting its patent-pending Automotive Bulb Camera that can be retrofitted onto any car “to collect and analyze huge quantities of video data from real traffic events” for furthering autonomous development projects.
Fitting into conventional headlight lampholders such as H7 or H4, the “plug-and-play” unit doesn’t alter the appearance of the vehicle, yet it records all traffic events and sends the data – coupled with real-driving behavior such as steering, braking and accelerating – to autoaid’s servers for analysis as the software identifies traffic participants, traffic lights, signs and other conditions. “The result for the industry,” the company reports, “is a detailed, almost inexhaustible volume of data on the driving behavior of millions of car drivers. From the point of view of the end-customer, the solution also offers attractive retrofittable safety systems such as a lane-keeping assistant or a collision warning system.”
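The data pairing the company describes, with each captured frame tagged by the driving inputs read at the same instant, might be sketched as follows. autoaid’s real record format, field names and upload pipeline are not public; everything below is hypothetical.

import json
import time
from dataclasses import dataclass, asdict

# Hypothetical record coupling one video frame with concurrent
# driving inputs, batched for upload to the analysis servers.

@dataclass
class DrivingSample:
    timestamp: float
    frame_id: int        # reference to a stored video frame
    steering_deg: float  # steering-wheel angle
    brake_pct: float     # brake pedal position, 0-100
    throttle_pct: float  # accelerator position, 0-100

def capture_sample(frame_id, vehicle_inputs):
    """Pair one frame with the vehicle inputs read at the same instant."""
    return DrivingSample(
        timestamp=time.time(),
        frame_id=frame_id,
        steering_deg=vehicle_inputs["steering_deg"],
        brake_pct=vehicle_inputs["brake_pct"],
        throttle_pct=vehicle_inputs["throttle_pct"],
    )

def serialize_batch(samples):
    """Bundle samples into a JSON payload for upload."""
    return json.dumps([asdict(s) for s in samples])

inputs = {"steering_deg": -3.5, "brake_pct": 0.0, "throttle_pct": 22.0}
batch = [capture_sample(frame_id=i, vehicle_inputs=inputs) for i in range(3)]
print(serialize_batch(batch))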
NVIDIA’s DRIVE Platform is a computing system enabling automakers and suppliers to accelerate production of automated and autonomous vehicles. It scales from a palm-sized, energy-efficient module for AutoCruise capabilities to a powerful AI supercomputer for autonomous driving. The architecture is available in a variety of configurations.
“Artificial intelligence is the essential tool for solving the incredibly demanding challenge of autonomous driving,” notes Jensen Huang, NVIDIA’s founder and CEO.
DRIVE is able to understand in real time what is happening around the vehicle, precisely locate itself on an HD map and plan a safe path forward. Huang refers to it as “the world’s most advanced self-driving car platform – combining deep learning, sensor fusion and surround vision to change the driving experience.”
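In outline, each cycle of such a platform runs a perceive, localize and plan loop. The sketch below is a generic illustration of that loop, not NVIDIA’s DRIVE API; every function name and stub value is invented, and the planning rule is deliberately simplistic.

# Generic perceive -> localize -> plan cycle; all stubs are hypothetical.

def detect_objects(frame):
    """Stub perception: pretend we found one pedestrian 12 m ahead."""
    return [{"type": "pedestrian", "distance_m": 12.0}]

def match_to_map(scan, hd_map):
    """Stub localization: return a lane-relative pose on the HD map."""
    return {"lane": hd_map["lanes"][0], "offset_m": 0.1}

def plan_safe_path(pose, objects, speed_mps):
    """Slow to a stop if anything sits inside a crude stopping envelope."""
    clearance = min((o["distance_m"] for o in objects), default=float("inf"))
    target = 0.0 if clearance < speed_mps * 2.0 else speed_mps
    return {"lane": pose["lane"], "target_speed_mps": target}

hd_map = {"lanes": ["northbound_1"]}
objects = detect_objects(frame=None)
pose = match_to_map(scan=None, hd_map=hd_map)
print(plan_safe_path(pose, objects, speed_mps=13.0))   # pedestrian -> stop

A production system replaces each stub with deep-learning perception, map-matching localization and a real trajectory planner, but the cyclic structure is the same.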
Also at CES, AEye unveiled its AE100 robotic perception system for the autonomous vehicle, ADAS and mobility markets.
Based on the company’s iDAR (Intelligent Detection and Ranging) technology, the cost-optimized system presents a new form of data collection.
“The AE100 is a game-changer for the autonomous vehicle and ADAS markets, as it makes iDAR technology commercially available for the first time,” says AEye founder and CEO Luis Dussan.
“iDAR-based robotic perception allows sensors to mimic the visual cortex – bringing real-time intelligence to data collection,” he elaborates. “As a result, the system not only captures everything in a scene – it actually brings higher resolution to key objects and exceeds industry-required speeds and distances. By solving for the limitations of first-generation LiDAR-only solutions, AEye is enabling the safe, timely rollout of failsafe commercial autonomous vehicles.”
“In conjunction with our OEM and Tier 1 partners, we have developed a product that addresses the complex system requirements, high standards and performance demands of our customers,” reports Barry Behnken, AEye’s vice president of engineering.
“A key objective was to design a solid-state, modular platform that is software-definable in order to increase reliability and optimize cost,” he adds. “We created an easy transition from first-generation spinning LiDAR hardware that allows path-planning software teams to plug and play the AE100 as they replace their legacy systems. It requires no software changes, and enables them to bring in more advanced features over time.”
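The foveation idea behind iDAR, spending scarce scan samples where they matter most, can be pictured with a toy scheduler. AEye’s actual scan scheduling is proprietary; the grid, sample counts and region-of-interest source below are invented for illustration.

# Toy "foveated" scan scheduler: sparse background coverage, dense
# sampling on cells flagged as containing key objects. Hypothetical.

BASE_SAMPLES = 1    # sparse coverage for background cells
FOVEA_SAMPLES = 8   # extra density for cells holding key objects

def schedule_scan(grid_cells, regions_of_interest):
    """Return (cell, sample_count) pairs for one scan frame."""
    return [(cell, FOVEA_SAMPLES if cell in regions_of_interest else BASE_SAMPLES)
            for cell in grid_cells]

cells = [(r, c) for r in range(4) for c in range(4)]
roi = {(1, 2), (2, 2)}   # e.g., a pedestrian flagged by a camera cue
plan = schedule_scan(cells, roi)
print(f"samples this frame: {sum(n for _, n in plan)} "
      f"(vs {len(cells)} with uniform sparse coverage)")

The payoff over a fixed raster scan is that resolution concentrates on whatever the system currently deems safety-critical, which is the “visual cortex” analogy in practice.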