Autonomous Driving Vision

From assisted to autonomous driving, featured at CES

The autonomous driving future is quickly becoming reality, and Elektrobit is well equipped to lead the way by enabling automobiles to function as both real-world sensors and data consumers via cloud connectivity. Car sensors already allow onboard computers to perceive many things that humans cannot, but sensors alone are not enough. To deliver value to consumers and auto manufacturers, all the relevant sensor data must be integrated, analyzed, and interpreted at any given moment, so that the automated car can act appropriately at all times.

Elektrobit, a global supplier of embedded solutions for the automotive industry whose products power over 70 million vehicles worldwide, wanted to strengthen its reputation as an innovator and visionary in automotive software. Elektrobit engaged Dupla Studios to help envision the future user scenarios its car sensor technology can enable.

We worked closely with the Elektrobit engineering and marketing teams to brainstorm which sensors can enable specific features, and how those features can combine into scenarios that improve the lives of drivers and passengers. Dupla Studios wrote up the scenarios and created storyboards and user interfaces to illustrate the experiences. We also interfaced with Run Studios, a Seattle video production company, to produce the video. The final video was shown at the Elektrobit booth at CES 2016.


Scenario creation, Storyboarding, User Experience Design, Creative Direction.


From car sensors to user scenarios

We kicked off this project by holding an in-person workshop with engineers from Elektrobit’s innovation lab and their global marketing director. During the day-long workshop, we established Elektrobit’s goals, success criteria, target audience, and the key takeaways for the video.

We facilitated a session in which members of EB’s engineering and marketing teams cataloged all available car sensors and grouped them into bite-sized real-world scenes. We then combined the scenes into scenarios, each showcasing the technology Elektrobit enables from the end user’s perspective (the automobile driver). The outcome of our discovery workshop was a prioritized list of scenarios with a write-up for each.

Go deep or go wide: profile progression vs technology progression

We discussed two possible progressions for the video. The first was to follow a human lifetime, outlining the different needs of each life stage: from a single business professional with disposable income, to an adventurous couple with no kids, to a more safety-conscious family, and finally to a senior citizen who may be impaired in one fashion or another. The second was to focus on a single persona and advance the story through the future stages of autonomous driving: from 5 years out (assisted driving), to 10 years out (automated driving), and finally to a future approximately 20 years out (on-demand mobility). During the workshop, we decided to focus on the latter progression.

Telling the story and designing the experience

We started the design phase by translating the written scenarios into storyboards. The storyboards allowed Elektrobit to better visualize the ideas and scenarios in action and gave them an easier way to provide feedback. The storyboards defined the user experience throughout the video: from the UX of the target user’s devices (wearables, phones, and tablets) as they interface with the car, to the potential interface surfaces inside the car. We also included the interactions between the vehicle in autonomous mode and the people outside it.

Since there is no driver in autonomous driving mode, we envisioned intuitive ways for the car to interact with the exterior world, ensuring peace of mind and safety for people both inside and outside the vehicle.

The booth at CES 2016

Elektrobit’s vision was brought to life at its CES booth in a video showing the future of automotive mobility in 5, 10, and 20 years. The video highlights the analytics-based deep learning that will enable cars to understand the needs, preferences, and behaviors of their drivers. This era of automobile innovation will make travel safer, more productive, and more enjoyable. Intuitive head-up displays, assisted by the Augmented Reality (AR) Creator suite built by EB, feature prominently in this future vision.

Final video

Elektrobit’s vision video also appeared in an Elektrobit & NVIDIA press announcement. The video below is narrated by Walter Sullivan, Head of Innovation Lab at Elektrobit, who was our main point of contact for this project.