For the second year in a row at CES, Sony Corp. showed an impressively conceived, designed, engineered, and built prototype electric car called the Vision-S. While the company has previously said it has no plans to build the car for public consumption, it announced that its development activity has reached the next stage.

With completion of the Vision-S prototype electric vehicle in December 2020, public road testing began in Austria for technical evaluation, part of an effort to continue advancing the vehicle’s development in safety and security, entertainment, and “adaptability.” Sony will continue to develop the vehicle further, and it plans to conduct driving tests in other regions going forward.

“Our Vision-S processes information from the surrounding environment in real time, utilizing Sony’s advanced-sensing technologies,” said Kenichiro Yoshida, Chairman, President and CEO, Sony Corp.

So, advancing its sensing technology is a goal of the project. In conjunction with this year’s CES reveal, Sony released a promotional video on other objectives of the car program. With the road-ready prototype, Sony says it hopes to contribute to the evolution of mobility by exploring not only how cars work and how they are made, but also their relationship with society.

“The human desire for mobility will never disappear,” said Izumi Kawanishi, Senior Vice President, AI Robotics Business Group, Sony Corp. “Believing there is a need to satisfy the human desire to move safely, when we started thinking of Sony’s first car, or mobility [in general], we saw user experience as important.”

The company continues its research and development to provide safety through imaging and sensing technologies, to create an inspiring space through the audio-visual technologies it has cultivated over the years, and to constantly update mobility through the cloud, AI (artificial intelligence), and network technologies.

“Communications, the cloud, and services beyond them are considered as parts of mobility,” said Yasufumi Ogawa, Engineering Manager & Senior System Engineer, AI Robotics Business Group, Sony Corp.

The video also highlighted some of the supplier companies helping in the process.

“When we started talking with people, it was very impressive that everybody welcomed us with expectations and curiosity about what Sony will do,” said Yuhei Yabe, General Manager, Business Strategy Dept., AI Robotics Business Group, Sony Corp.

Special thanks went to AImotive, Bosch, Continental, Elektrobit, Magna Steyr, Valeo, Vodafone, and ZF. Other supplier partners of note are AWS, B.eng, BlackBerry QNX, Brembo, Chevalier Technologies, Gentex, HERE, Nvidia, Recaro, and Qualcomm.

“Contribution to the evolution of mobility cannot be done by ourselves alone,” said Ogawa. “We can accomplish it together with our partners as a member of society.”

 

A winning design

Regardless of production intent, the prototype looks to be well-developed and well-designed. The original CES 2020 prototype, which has since been further developed, was recognized in October by the Red Dot organization with a prestigious Design Concept award, which honors design innovations, ideas, and visions that are still in development and not commercially available.

The four-seat car is 192.7 in (4895 mm) long, 57.1 in (1450 mm) tall, and 74.8 in (1900 mm) wide on a 118.1-in (3000-mm) wheelbase. It has nearly the same dimensions as the recently launched Xpeng P7 and falls between Tesla’s Model S and Model 3. However, the Vision-S weighs in at 5180 lb (2350 kg), about 600 to 1000 lb (272 to 454 kg) more than the P7.

The car’s EV platform gives the sedan a long wheelbase thanks to a compact and highly flexible powertrain layout characteristic of pure EVs, and its ultrathin battery packs are said to help provide a best-in-class cabin environment. The platform can also be used for coupes, SUVs, and MPVs. It is designed to achieve top-class scores in automotive safety tests around the world, with not only advanced active safety through sensing, but also solid passive safety features.

Two 200-kW electric motors, one front and one rear, enable all-wheel drive, a 4.8-s 0-62 mph (0-100 km/h) sprint, and a top speed of 149 mph (240 km/h). The double-wishbone front and rear suspension with air springs gives a variable ground clearance between 120 and 135 mm (4.7 and 5.3 in). Tires are 245/40R21 in front and 275/35R21 at the rear.
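
As a quick sanity check on those published figures, the short Python sketch below uses only the numbers quoted above to compute the implied average acceleration and a nominal power-to-weight ratio. It is a back-of-the-envelope illustration, not Sony data.

```python
# Back-of-the-envelope check using only the figures quoted above.
curb_mass_kg = 2350              # quoted curb weight
total_power_kw = 2 * 200         # two 200-kW motors, front and rear
sprint_time_s = 4.8              # quoted 0-62 mph (0-100 km/h) time
v_final_ms = 100 * 1000 / 3600   # 100 km/h expressed in m/s

avg_accel = v_final_ms / sprint_time_s                     # average acceleration, m/s^2
power_to_weight = total_power_kw / (curb_mass_kg / 1000)   # kW per tonne

print(f"Average acceleration: {avg_accel:.2f} m/s^2 (~{avg_accel / 9.81:.2f} g)")
print(f"Power-to-weight: {power_to_weight:.0f} kW/tonne")
```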

ZF supported Sony in developing the prototype’s network chassis within its four main technology areas of integrated safety, autonomous driving, e-mobility, and vehicle motion control. With the combination of these elements, the vehicle dynamics control balances a comfortable ride with driving safety, which becomes even more relevant when the car is driven autonomously.

“This is combining the latest trends we see in automotive—autonomous driving, e-mobility, and entertainment in the car,” said Dr. Peter Holdmann, Executive Vice President, Division Car Chassis Technology, ZF Friedrichshafen AG. “And we can contribute with our best vehicle dynamic elements in this car.”

“What I like most about the Vision-S Prototype is that it combines all the requirements of a network chassis,” said Andre Engelke, Head of System House Vehicle Motion Control, ZF Friedrichshafen AG.

 

Building the prototype

After CES 2020, Manfred Pircher, Engineering Project Leader, Magna Steyr, blogged about his company’s involvement in the program. The Vision-S project started in spring 2018 when Sony approached his company. Magna Steyr was tasked with hand-building the prototypes, and the supplier is also supporting Sony in other areas, from engineering to integration with other suppliers.

“From the start, it was clear they wanted to gain an understanding of how to develop a vehicle, including core technologies such as ADAS and electric architecture,” said Pircher. “They also wanted a realistic approach, right down to the four-door coupe shape of the prototype. They wanted to show this prototype means serious business.”

Along the way, the Magna team learned some important lessons from Sony.

“When you have new entrants into the automotive business, they ask a lot of questions,” he added. “It made us re-evaluate how we explain things to non-automotive people, what are the relevant processes and how to make the best documentation.”

With road testing underway as the project enters its third year, Sony is taking it to the next stage.

“I’m very happy to see that the Vision-S was just the starting point of our joint cooperation,” said Frank Klein, President, Magna Steyr. “We see that high-tech companies like Sony have a major impact on the mobility of the future.”

 

Safety Cocoon

The car’s Sony Safety Cocoon concept addresses safety and security, which the company says is critical for enabling advancement of autonomous driving in the near future. With sensing technology that surpasses human vision, the system provides a 360-degree view of the vehicle’s surroundings while monitoring the condition of the driver and the vehicle’s interior. Furthermore, by feeding the sensed information back to the driver in real time, it lends the mobility experience a sense of safety and comfort.

“We’ve increased the number of sensors on the vehicle to 40 and experimented to see how high we can raise their sensing capability,” said Kawanishi. “We’ve created a system that we can verify our safety and security measures as much as possible. From a user-experience point of view, we are working to pursue how much we can make a relaxing in-car space, so we would like to evolve it further including entertainment features. We also would like to plus something extra to the conventional idea and think about how vehicles evolve by themselves. So we enhanced connectivity and network features.”

Of the 40 total sensors, Sony focused on CMOS (complementary metal-oxide semiconductor) image sensors installed inside and outside the vehicle to continuously monitor safety while driving. Specifically, the sensor breakdown is 18 ToF (time-of-flight) cameras, 18 radar/ultrasonic sensors, and 4 LiDARs.
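
For illustration only, here is a minimal sketch of how that 40-sensor suite could be described as a configuration structure. The counts mirror the breakdown above; the class and field names are hypothetical and are not Sony’s software.

```python
# Hypothetical description of the 40-sensor suite reported for the Vision-S;
# the categories follow the article, the data structure is illustrative only.
from dataclasses import dataclass

@dataclass
class SensorGroup:
    kind: str    # sensor technology
    count: int   # number of units fitted
    role: str    # what the group contributes

SENSOR_SUITE = [
    SensorGroup("ToF camera", 18, "distance imaging inside and outside the cabin"),
    SensorGroup("radar/ultrasonic", 18, "all-weather ranging around the vehicle"),
    SensorGroup("lidar", 4, "high-resolution 3D mapping of the surroundings"),
]

assert sum(g.count for g in SENSOR_SUITE) == 40   # matches the quoted total
```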

The OVAL sensing system helps to reduce passenger stress as well as aid eco-friendly driving. It provides high-accuracy, Level 2+ driver assistance with adaptive cruise control, self-parking, and automatic lane changing. Through software updates, Sony engineers aim to develop the system into a Level 4-equivalent self-driving system.

The system also allows the driver to follow the vehicle ahead and pass safely through complex situations such as merging lanes. It constantly monitors its surroundings and gives the driver easy-to-understand suggestions on when it is safe to change lanes. The driver only has to use the indicator; the lane change itself is completed automatically. In road work zones, the system detects the affected lane and selects an available alternative. It also detects surrounding conditions such as pedestrians and other vehicles and, using map information, can park automatically while recognizing obstacles.
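
The flow described above, in which the system suggests a safe moment, the driver confirms with the indicator, and the maneuver is then carried out automatically, can be pictured as a small state machine. The Python below is a hypothetical sketch of that logic; the state names, inputs, and transitions are assumptions, not Sony’s or AImotive’s implementation.

```python
# Minimal sketch of the lane-change assist flow described in the article.
from enum import Enum, auto

class LaneChangeState(Enum):
    MONITORING = auto()   # watching surroundings, no suggestion yet
    SUGGESTED = auto()    # a safe gap exists; waiting for the driver's indicator
    EXECUTING = auto()    # driver confirmed; steer the lane change automatically

def step(state, gap_is_safe, indicator_on):
    """Advance the lane-change state machine by one decision cycle."""
    if state is LaneChangeState.MONITORING and gap_is_safe:
        return LaneChangeState.SUGGESTED
    if state is LaneChangeState.SUGGESTED:
        if not gap_is_safe:
            return LaneChangeState.MONITORING   # conditions changed, withdraw suggestion
        if indicator_on:
            return LaneChangeState.EXECUTING
    return state

# Example: a safe gap is detected, the driver flicks the indicator, the maneuver begins.
s = LaneChangeState.MONITORING
s = step(s, gap_is_safe=True, indicator_on=False)   # -> SUGGESTED
s = step(s, gap_is_safe=True, indicator_on=True)    # -> EXECUTING
```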

AImotive is working with Sony on the automated driving software stack for the prototype.

“This is a great collaboration for us, helping us better understand the key challenges in developing and integrating advanced ADAS technologies from well-respected industry leaders,” said László Kishonti, CEO & Founder at AImotive. “There are two current trends in the automobile industry; one is electrification and the second is adding more intelligence to the cars. Obviously, the Sony project is about both. We are building an electric car which is intelligent. I hope this will be not just a prototype, but we can follow Sony and the other companies to get it through the consumer or the mass market field.”

The supplier’s suite of technologies provides targeted L2+ ADAS capabilities, with the eventual goal of greater levels of autonomy.

“Here at AImotive, we are focusing on camera-based and automated driving features combined with radar mostly,” said Gabor Pongracz, Product Manager, AImotive. “In this project, we get the chance to use the market’s best cameras from Sony for bringing our solutions even to a more major and widely available state.”

AImotive will be announcing new and significantly enhanced versions of its production-ready aiDrive software stack for L2–L4 automated driving, as well as its popular aiSim development, simulation, and validation tools later in Q1 2021.

 

Entertaining user interface

Inside the cabin, advanced ToF-based distance imaging sensors monitor the condition of vehicle occupants. Facial expressions and gestures are used to determine the driver’s level of concentration and fatigue, and alerts are issued when necessary.

In addition, Sony engineers are researching and developing a lip-reading system that works in conjunction with the driver-monitoring camera to reliably capture the driver’s speech intentions even in noisy situations, feeding the data into content display and navigation system operation. The system also aims to adjust the temperature inside the vehicle by inferring the condition of the occupants from behavior read by the ToF sensor.
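
A minimal sketch of the idea behind that lip-reading system follows, assuming a simple confidence-weighted fusion of an audio hypothesis with a camera-based lip-reading hypothesis as cabin noise rises. The names, fields, and weighting scheme are hypothetical and serve only to illustrate the concept.

```python
# Illustrative-only fusion of an audio speech hypothesis with a lip-reading
# hypothesis so commands can still be captured in a noisy cabin.
from dataclasses import dataclass

@dataclass
class Hypothesis:
    text: str          # recognized command
    confidence: float  # recognizer confidence in [0, 1]

def fuse_command(audio: Hypothesis, lips: Hypothesis, noise_level: float) -> str:
    """Pick a command, leaning more on lip reading as cabin noise rises."""
    audio_weight = max(0.0, 1.0 - noise_level)   # noise_level in [0, 1]
    if audio.text == lips.text:
        return audio.text                        # both recognizers agree
    if audio.confidence * audio_weight >= lips.confidence:
        return audio.text
    return lips.text

# Example: loud cabin, audio misheard the command, lip reading wins.
print(fuse_command(Hypothesis("open window", 0.6),
                   Hypothesis("open navigation", 0.7),
                   noise_level=0.8))
```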

With the spread of 5G connectivity and the advancement of self-driving technology, Sony says the vehicle interior will become a place for relaxation—like a living room. In anticipation of this evolution, the Vision-S Prototype cabin is designed to be a luxurious space in which entertainment content can be enjoyed to the fullest.

With the 360 Reality Audio system, vehicle occupants can experience an immersive, three-dimensional sound field that envelops them in sound from all directions. Using Sony’s object-based spatial audio technology, sound can be placed anywhere in a 360-degree sphere, providing listeners with music as the artist intended. Speakers built into each seat enable every passenger to enjoy a personalized sound configuration.
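
Object-based spatial audio of this kind assigns each sound source a position on a sphere around the listener. The sketch below shows the generic azimuth/elevation-to-Cartesian conversion behind such placement; it is illustrative geometry only, not Sony’s 360 Reality Audio renderer.

```python
# Generic placement of a sound object on a sphere around the listener.
import math

def place_on_sphere(azimuth_deg, elevation_deg, radius=1.0):
    """Return (x, y, z) coordinates for a sound object around the listener."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = radius * math.cos(el) * math.sin(az)   # to the listener's right
    y = radius * math.cos(el) * math.cos(az)   # ahead of the listener
    z = radius * math.sin(el)                  # above the listener
    return x, y, z

# Example: a vocal placed slightly above and ahead-left of the listener.
print(place_on_sphere(azimuth_deg=-30, elevation_deg=15))
```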

The 360 Reality Audio also integrates with the car’s digital mirrors, which combine cameras and dash-edge displays. When changing lanes, the digital side mirror system responds to vehicles approaching from behind. The LCDAS (lane change decision aid system) warns the driver with a sound alert and a flashing icon, with 360 Reality Audio providing an intuitive way to gauge the distance to the approaching vehicle.

The digital mirror displays are part of a series of screens spanning the width of the car that provides access to movies, games, and other visual content inside the vehicle. The user interface is highly intuitive, with a horizontal base that takes into account the driver’s line of sight, allowing the driver and passenger to access content independently.

The center and passenger displays can also work remotely with a PlayStation at home connected via 5G, enabling passengers to play state-of-the-art games in the car. The large, dynamic screen delivers powerful graphics and can be enjoyed during time spent waiting in the car, such as while charging or parked. The rear seats feature 10.1-in screens, allowing passengers to enjoy entertainment content separately.

Up front, a quick L-shaped swipe lets a user pass content from one display to the next. The driver and front-passenger displays can be linked horizontally, just as the rear-seat screens can be easily operated from the front seats. In addition to the touchscreens, a variety of input styles are available, including jog dials and a DualShock controller, which are accessible even when the seat is reclined.
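
As a toy illustration of that L-shaped “pass it along” gesture, the sketch below flags a touch stroke that moves mostly vertically and then mostly horizontally. The thresholds and the gesture definition are assumptions, not the Vision-S HMI code.

```python
# Toy recognizer for an L-shaped swipe: a vertical leg followed by a horizontal leg.
def is_l_swipe(points, min_len=100):
    """points: list of (x, y) touch samples in screen pixels."""
    if len(points) < 3:
        return False
    x0, y0 = points[0]
    xn, yn = points[-1]
    # Look for a corner sample where movement switches from vertical to horizontal.
    for xc, yc in points[1:-1]:
        vertical_leg = abs(yc - y0) >= min_len and abs(xc - x0) < abs(yc - y0) / 2
        horizontal_leg = abs(xn - xc) >= min_len and abs(yn - yc) < abs(xn - xc) / 2
        if vertical_leg and horizontal_leg:
            return True
    return False

# Example: drag down about 200 px, then right about 200 px.
stroke = [(10, 10), (12, 110), (11, 210), (110, 212), (210, 215)]
print(is_l_swipe(stroke))   # True
```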

Elektrobit tools helped create the advanced human-machine interface (HMI) of the Vision-S.

“The Vision-S Prototype provides what you would expect from a Sony product that’s designed for mobility,” said Christian Reinhard, Executive Vice President, Head of Projects, Elektrobit Automotive GmbH. “The design is very clean and the screen content and the surfaces in the cockpit mix very well. It is fun to use and interact with the prototype.”

The supplier’s experts say that the in-vehicle user experience has never been more important, and Sony understands the opportunity.

“You clearly see that Sony has unfolded its whole creative power, used all its experience from various products and technologies, and really thought through the car from the beginning,” said Siegfried Dirr, Head of Cockpit System Solutions, Elektrobit Automotive GmbH.

Larger touch screens, advanced graphics, voice and gesture control, and augmented reality are but a few of the enhancements that are starting to make their way into more and more vehicles as the industry marches toward greater automation.

 

Adaptability

In an era where all products and data are connected, Sony wanted the Vision-S to be a part of the network. Data are synchronized in real time and the software is repeatedly updated over the network, moving away from conventional stand-alone systems to one that has the potential to keep growing in real time.

The car’s software-defined E/E architecture is designed to support constant monitoring and over-the-air (OTA) updates through connectivity enabled by 5G networks. To fully leverage the connectivity technology even when the vehicle is traveling at high speeds, Sony is working with telecom industry partners to pursue real-time capabilities and enable sophisticated autonomous driving and remote control. One of those partners is Vodafone.

“Here Sony is a fantastic partner to work together [with],” said Hannes Ametsreiter, CEO, Vodafone Germany. “Mobility is not possible without connectivity in the future, and therefore connecting cars, connecting people, and making life more secure is super important.”
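
To make that software-defined, OTA-updatable architecture more concrete, here is a minimal sketch of the kind of update check and image verification such a system implies. The manifest URL, field names, and flow are hypothetical assumptions for illustration; they are not Sony’s or Vodafone’s actual update mechanism.

```python
# Hypothetical over-the-air update check: fetch a manifest, compare versions,
# and verify a downloaded image against the hash published in the manifest.
import hashlib
import json
import urllib.request

UPDATE_MANIFEST_URL = "https://updates.example.com/vision-s/manifest.json"  # placeholder

def check_for_update(installed_version: str):
    """Fetch the update manifest and report whether a newer image is available."""
    with urllib.request.urlopen(UPDATE_MANIFEST_URL) as resp:
        manifest = json.load(resp)
    if manifest["version"] <= installed_version:
        return None
    return manifest   # e.g. {"version": ..., "image_url": ..., "sha256": ...}

def verify_image(image_bytes: bytes, expected_sha256: str) -> bool:
    """Only accept an image whose hash matches the manifest entry."""
    return hashlib.sha256(image_bytes).hexdigest() == expected_sha256
```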

The car’s Vision-S Link network connection allows for smooth integration with smart devices. Once the mobile app is triggered remotely, the car begins preparing to leave: it powers up, adjusts cabin temperature and seat position, and automatically unlocks the doors when the driver approaches.
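
The pre-departure sequence just described can be pictured as an ordered list of commands sent from a companion app. The sketch below is a hypothetical illustration of that flow; the function, profile fields, and unlock radius are assumptions, not the Vision-S Link API.

```python
# Illustrative pre-departure flow triggered from a phone app.
def pre_departure_commands(profile, driver_distance_m, unlock_radius_m=2.0):
    """Return the ordered commands a companion app might send before departure."""
    commands = [
        ("start_vehicle", None),
        ("set_cabin_temp_c", profile["cabin_temp_c"]),
        ("set_seat_position", profile["seat_position"]),
    ]
    if driver_distance_m <= unlock_radius_m:
        commands.append(("unlock_doors", None))   # unlock only once the driver is close
    return commands

# Example: precondition to 21.5 °C, restore a stored seat position, driver 1.2 m away.
print(pre_departure_commands({"cabin_temp_c": 21.5, "seat_position": "memory_1"},
                             driver_distance_m=1.2))
```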

The ToF camera and driver-monitoring cameras ensure that the environment inside the vehicle’s cabin is optimized for each occupant’s individual preferences. For instance, the navigation system’s route selection takes into account driver and passenger preferences, as well as environmental conditions such as weather and traffic congestion, and the state of the vehicle, including the remaining charge level. If the system detects a sleeping passenger in the back seat, the car will automatically control the climate around that seat to a suitable temperature. The system continues to evolve through everyday use, learning a user’s preferred temperature and music, driving preferences and routes, and actual driving data to make the space more comfortable.
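
As a toy example of the route selection described above, the sketch below scores candidate routes against traffic, distance, a learned scenic preference, and the remaining charge, ruling out routes the battery cannot cover. The weights and data fields are purely illustrative assumptions.

```python
# Toy route picker: respect remaining range, then trade off delay, distance,
# and a learned preference for scenic routes.
def pick_route(routes, remaining_range_km, prefers_scenic=False):
    """routes: list of dicts with 'name', 'distance_km', 'traffic_delay_min', 'scenic'."""
    feasible = [r for r in routes if r["distance_km"] <= remaining_range_km]
    if not feasible:
        return None   # nothing reachable on the current charge; suggest charging first

    def score(route):
        s = -route["traffic_delay_min"] - 0.5 * route["distance_km"]
        if prefers_scenic and route["scenic"]:
            s += 10.0   # small bonus for a learned scenic preference
        return s

    return max(feasible, key=score)

routes = [
    {"name": "motorway", "distance_km": 42, "traffic_delay_min": 15, "scenic": False},
    {"name": "lakeside", "distance_km": 55, "traffic_delay_min": 5, "scenic": True},
]
print(pick_route(routes, remaining_range_km=60, prefers_scenic=True)["name"])   # lakeside
```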

 

More to come?

As vehicles become more automated, drivers along with passengers will increasingly become new-mobility users thinking about how best to make use of their travel time.

“With autonomous driving, drivers are free from driving and Sony should offer the additional value to it,” said Yabe.

An inflection point came in 2018 as Sony sensed the automotive industry was undergoing a once-in-a-lifetime transformation, with the CASE (connected, autonomous, shared, electric) revolution reshaping its future.

“Sony already possessed expertise in three of the four areas that CASE represented—the C (connected), A (autonomous) and E (electric),” said Kawanishi. “In other words, mobility was fast becoming a stage where Sony could play an active role. And I thought that outsiders like us jumping into the fray could very well be part of the revolution.”

Sony is continuing to think about what mobility should be and creating new proposals.

“‘Getting closer to people’ is our corporate direction,” added Kawanishi. “I think that mobility serves as a tool to achieve it. A car brings us closer to a customer because our car could be beneficial to our customers. And because this is an EV, we would like to contribute to solve energy issues or cut CO2 emissions, and we also would like to think about our contribution to the society.”