Neural Propulsion Systems (NPS) launched out of stealth today with its NPS 500 platform, which it says is the first to combine LiDAR, radar, and cameras into a system that can “see” around corners and twice as far as any current system. The company built what it calls the “world’s first” all-in-one, deeply integrated multi-modal sensor system for Level 4/5 autonomy around its own solid-state MIMO LiDAR and super-resolution SWAM radar.

The sensor-fused system interconnects the MIMO LiDAR, SWAM radar, and cameras to cooperatively detect and process 360° high-resolution data, “giving vehicles the ability to prevent all accidents.” The system lets vehicles see around corners and out to more than 500 m (1,640 ft) of range, with ultra-high resolution and a highly adaptive frame rate. The company says these capabilities make the system ten times more reliable than currently announced sensor solutions.

“Our goal of preventing all transportation accidents is the holy grail for autonomous vehicles,” said Behrooz Rezvani, Ph.D., Founder and CEO of NPS. “We are the sensing system behind the Zero Accidents Platform for large-volume deployment at an affordable cost. Existing technologies are not sufficient to achieve this paradigm, so we created our own more powerful LiDAR and radar. Our AI-driven sensor-fusion system processes this ultra-high-resolution data to create the safest and most reliable solution in the market today. The NPS 500 slashes time-to-market for autonomous vehicle manufacturers while being the most cost-effective.”

Helping Rezvani lead NPS are Babak Hassibi, Ph.D., Chief Technologist; David Dunaway, Exec. Advisor; Saman Behtash, Ph.D., VP Engineering; Amin Kashi, VP Product & Customer Engineering; Alysson Do, VP Ops & Admin; and Hinrich Woebcken, Exec. Advisor.

“LiDAR, radar, and cameras will all play significant roles in creating the ideal autonomous driving platform, and there is no question that tightly connected sensors with onboard data fusion for automated driving enable more functionalities,” said Pierrick Boulay, Senior Analyst at Yole Développement. “This direction is unique and is likely to succeed in a market that could reach $25 billion in 2025 for sensing and computing in both ADAS and robot vehicles.”

The sensor system is said to address the physics-based limitations of each individual sensor by combining the strengths of LiDAR, radar, and cameras. Cameras provide high-resolution images but lack depth information and depend on lighting conditions. Radar measures velocity with great precision but has lower resolution than LiDAR and is vulnerable to interference from other radars. LiDAR provides highly precise depth information, but its performance and reliability degrade in adverse weather and light conditions, and it can be occluded fairly easily.
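
This complementary-strengths argument can be made concrete with a toy fusion step. The sketch below is illustrative only: it assumes a simple inverse-variance weighting of independent range estimates, and all sensor names and numbers are hypothetical; NPS has not published its fusion algorithm.

```python
# Toy illustration of complementary-sensor fusion (hypothetical numbers,
# not the NPS 500 algorithm): each sensor reports a range estimate and a
# variance reflecting its physics-based weakness. Inverse-variance
# weighting lets the most trustworthy sensor dominate in each condition.

def fuse(estimates):
    """Fuse (range_m, variance) pairs by inverse-variance weighting."""
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * r for w, (r, _) in zip(weights, estimates)) / sum(weights)
    fused_var = 1.0 / sum(weights)
    return fused, fused_var

# Clear daylight: LiDAR is precise, camera depth is coarse, radar in between.
day = [(101.2, 0.05), (100.0, 4.0), (100.8, 1.0)]   # (lidar, camera, radar)
# Heavy fog: LiDAR and camera degrade, radar variance barely changes.
fog = [(103.0, 5.0), (98.0, 25.0), (100.9, 1.2)]

for label, est in [("clear", day), ("fog", fog)]:
    r, v = fuse(est)
    print(f"{label}: fused range = {r:.2f} m, variance = {v:.3f}")
```

In the fog case the radar’s low variance dominates the fused estimate, which is the intuition behind combining modalities rather than perfecting any single one.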

For the NPS 500, the solid-state MIMO LiDAR architecture doubles range to more than 500 m on targets of 10% reflectivity, with super-resolution and adaptive multi-beam search. The new class of radar technology offers ten times better detection reliability and a simultaneous, multi-band 360° field of view, and is said to be 70 times more resistant to interference from other radars. The software is claimed to be the first to apply AI (artificial intelligence) fusion technology to “see around the corner.” The system offers 650-Tb/s sensor-data processing on a network of tightly connected, custom signal-processing chips.
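
As a rough intuition for why multi-band operation reduces mutual interference, consider a toy collision model. This is purely hypothetical; NPS has not disclosed how its 70x figure is derived or how the SWAM radar allocates spectrum.

```python
# Toy mutual-interference model (hypothetical; not the SWAM radar design):
# if two radars each independently pick 1 of B sub-bands per frame, they
# collide with probability 1/B, so widening the usable spectrum linearly
# reduces interference events.
import random

def collision_rate(bands: int, frames: int = 100_000) -> float:
    hits = sum(random.randrange(bands) == random.randrange(bands)
               for _ in range(frames))
    return hits / frames

for b in (1, 10, 70):
    print(f"{b:>2} sub-bands -> collision rate ~ {collision_rate(b):.4f}")
```

With a single shared band, every frame collides; spread over 70 sub-bands, roughly one frame in 70 does, which is one simple way a multiband design could yield an improvement of that order.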

Among other benefits claimed for the system are double the reaction time of currently available LiDAR and a significant increase in sensor-data reliability. It can anticipate pedestrians’ movements and detect moving objects approaching intersections well in advance. Built-in redundancy helps provide maximum reliability in harsh environments, bad driving conditions, and on rough terrain. Low capital and operating expenses are promised for customers.
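
The reaction-time claim follows directly from the range claim: at a fixed closing speed, the time budget for a response scales linearly with detection range. A quick back-of-the-envelope check, with speeds chosen purely for illustration:

```python
# Back-of-the-envelope: time budget = detection range / closing speed.
# Doubling range from 250 m to 500 m doubles the time available to react.
for speed_kmh in (100, 130):
    v = speed_kmh / 3.6                      # convert km/h to m/s
    for rng in (250, 500):
        print(f"{speed_kmh} km/h, {rng} m range: {rng / v:.1f} s to react")
```

At 100 km/h, moving the detection horizon from 250 m to 500 m stretches the time budget from 9 s to 18 s.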

For more info, pricing, and availability, customers and partners can contact info@neuralpropulsion.com.