NODAR Launches Hammerhead 3D Vision Platform for Mainstream Autonomy
January 7, 2021 | PRNewswire
NODAR, Inc. has released the first demonstration of its Hammerhead™ 3D vision platform, taking an important step towards truly safe, mainstream autonomous driving. NODAR Hammerhead™ produces high-density 3D point clouds at ranges up to 1,000m with remarkable accuracy, eclipsing mono-camera and LiDAR performance and paving the way to L3 and higher autonomy. This achievement includes small-object detection at previously impossible ranges. In the demonstration released today, Hammerhead™ detects and accurately measures the distance to a 10cm brick 150m away.
Reaching this milestone is groundbreaking in two key ways: 1) detecting unknown objects at 150m provides ample time to avoid collisions safely at highway speeds (4.5 seconds at 120kph/74mph), and 2) competitive solutions based on a single camera with AI inferencing may detect known, large objects at this range, but many lethal obstructions will be unknown to those systems or too small, and will go undetected. NODAR's technology measures the physical environment in real time, providing distance data for every pixel in view, regardless of whether an object is known or unknown.
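The 4.5-second figure follows directly from the stated detection range and speed. A quick sketch of that arithmetic (illustrative only, using the figures quoted in the release):

```python
# Time available to react once an obstacle is detected at a given range.
# The 150 m range and 120 km/h speed are the figures from the release.

def time_to_obstacle(range_m: float, speed_kph: float) -> float:
    """Seconds until reaching an obstacle at range_m while travelling at speed_kph."""
    speed_ms = speed_kph / 3.6  # convert km/h to m/s
    return range_m / speed_ms

print(f"{time_to_obstacle(150, 120):.1f} s")  # 4.5 s at highway speed
```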
Beyond small-object detection, NODAR Hammerhead™ delivers a new level of safety where existing solutions fall flat. Mono-camera solutions that rely on deep learning to estimate depth are inherently limited by finite training sets, compute requirements, and known-object ambiguity (confusing an adult with a child can introduce a range error of 50%), exposing life-threatening uncertainty. Numerous fatal and highly publicized accidents in recent years underscore the shortcomings of existing systems. LiDARs rely on scanning beams and can easily miss small objects. The LiDAR scanning process takes precious time, whereas a camera-based system offers more than 20X the area coverage rate, with the reliability, robustness, and low price of high-volume solid-state cameras. Finally, NODAR produces frame-by-frame disparity maps every 33 milliseconds, while single-camera systems and LiDAR must aggregate and analyze data before producing results, causing significantly slower performance.
The significant advances in performance, accuracy, and reliability that NODAR brings will yield better handling of critical edge cases, higher levels of safety, and more lives saved, all at lower cost than current approaches.
Inspired by the hammerhead shark, which has the best depth perception in the animal kingdom thanks to the wide separation between its eyes, NODAR Hammerhead™ uses data from multiple cameras to calculate real physical measurements of distance to targets. The unique advantage of NODAR's system is the ability to mount the cameras independently in long-baseline configurations, such as in the sideview mirrors, headlamps, or on the roof. With highly accurate long-range 3D sensing and no reliance on inferred measurements, Hammerhead™ accurately captures banked roads, disabled vehicles, and road debris, edge cases that other vision systems will miss.
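The advantage of a long baseline can be seen in the textbook stereo-triangulation model, where depth is Z = f·B/d (focal length f in pixels, baseline B, disparity d) and range error grows as Z² but shrinks linearly with B. The sketch below is the generic pinhole-stereo relationship only; the focal length, matching error, and baseline values are hypothetical illustrations, not NODAR specifications.

```python
# Textbook wide-baseline stereo triangulation: Z = f * B / d.
# All numeric parameter values here are hypothetical, chosen only to
# illustrate how a wider baseline improves long-range accuracy.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in metres from disparity in pixels (pinhole stereo model)."""
    return focal_px * baseline_m / disparity_px

def range_error(focal_px: float, baseline_m: float, z_m: float, disparity_err_px: float) -> float:
    """First-order range uncertainty: dZ ~= Z^2 * e_d / (f * B)."""
    return z_m ** 2 * disparity_err_px / (focal_px * baseline_m)

f_px = 2000.0      # hypothetical focal length in pixels
match_err = 0.25   # hypothetical sub-pixel matching error

# Compare a typical short-baseline stereo module with a vehicle-width mount at 150 m:
for baseline in (0.12, 1.5):
    err = range_error(f_px, baseline, 150.0, match_err)
    print(f"baseline {baseline:.2f} m -> ~{err:.1f} m range error at 150 m")
```

Under these assumed numbers, widening the baseline from 12cm to 1.5m cuts the range error at 150m by roughly an order of magnitude, which is why independently mounted, widely separated cameras matter for long-range sensing.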
Leaf Jiang, NODAR founder and CEO, speaks to the significance of today's demonstration: "The automotive world knows that current ADAS systems must advance to ensure human safety. Today, we've taken a fundamental step towards demonstrating that higher levels of automated driving are achievable with existing sensor technology in the immediate term. At NODAR we believe autonomy should never compromise safety, and that a camera-based solution is the only way to deliver on the performance, safety, and pricing requirements of the mainstream automotive market."