Car interior of tomorrow: life inside BMW’s mind-blowing Dee concept car

Published Jan 10, 2023


Calvin Fisher recently time travelled to the not-so-distant future in a machine called Dee. This is his story.

Las Vegas - When I say Dee I am referring to the Digital Emotional Experience, BMW’s latest i Vision concept car. It’s built upon BMW’s Neue Klasse electrified platform, but that’s only telling half the story.

The Bavarian firm took the opportunity of this year’s CES expo in Las Vegas, via a keynote speech by its CEO Oliver Zipse, to unveil a concept that is leaps and bounds ahead of its time.

By now you’re aware of its standout features – the digital “paint job”, its electric powertrain and that it was unveiled alongside three sentient beings à la Arnold Schwarzenegger, Herbie and KITT. But the true magic resides inside. Here’s what happened when I climbed aboard.

BMW is evolving its operating system from BMW OS8 to OS9, with a stopgap solution in the shape of 8.5. This will roll out via over-the-air updates to existing vehicles soon enough, but with i Vision Dee, the marque is pushing even further ahead while previewing some of what’s imminent in OS9. And it’s fantastic. I’m not going to get bogged down in technobabble just yet and will, instead, focus on my tactile experience, which felt straight out of a sci-fi flick.

Stepping into the future

A friendly Dee greeted me, then popped open her cabin doors. I mean, ITS doors – so friendly and warm was the voice that ushered me inside, it’s easy to get confused. I dropped a cheek into the driver’s pew, and was instructed to start up by thumbing a glowing prompt on the dashboard. Very organic, startlingly intuitive.

What happened next was an augmented experience, as a basic HUD (head-up display) rose into view on the windscreen glass. In this minimal iteration, the information was mostly on par with what’s come before and what you’ll encounter in contemporary BMWs. Data such as speed, direction of travel… you know the drill.

We were invited to select the next level of augmentation, and this is where things got interesting: even more live information than before, encroaching further onto my peripheral view yet, somehow, in a non-invasive way. The data was depicted vividly and with great clarity as we drew content from the metaverse, in a manner that terrified the cynic in me but allowed my inner nerd to revel.

Social notifications, weather and traffic alerts, and more useful car data each had their own place, spilling over onto the side glass as well. It was less unnerving than the virtual road (projected on a wraparound screen) we were on, which didn’t require us to actually concentrate on driving despite our sharing it with virtual cyclists, pedestrians and other such obstacles. But then, upon unlocking the uppermost level of augmentation, our minds were promptly blown into the ether. We were not ready.

In this final layer, the world outside was almost completely replaced. Rebuilt with brighter brushstrokes, made hyper lucid but faithful to reality in terms of the geometry and geography. Left turns remained left, intersections were intact, a hairpin to the right with a five-degree camber and 20m of elevation would remain just so, but the environment could be changed to suit your fancy.

Feeling silly? If the software is to be believed, you could be driving on Mars, or underwater. Feeling pragmatic? Well, if you hate driving at night, you could virtually change that to a bright sunny day. Ditto, you could turn a rainy day clear. See for yourself in the video below:

In terms of uses for the metaverse, I’ll concede that I didn’t see this one coming, and nominate it as my favourite use of the technology. Oh, you’re a big motorsport fan? Well, what if your mundane drive through the suburbs took on the appearance of the Yas Marina Circuit instead, but with all the necessary hard points safely in place such as traffic lights, road signs, oh and other road users, of course. Mind, increasingly blown. But then, it happened again.

The gamification of the drive

I know, it’s not my favourite new buzzword either. Despite being an avid gamer, I don’t feel the need for my equally beloved driving experience to take on similar qualities. That is, until a kaleidoscope of butterflies surrounded us, and proceeded to fly in our direction, playfully darting across the windscreen and bonnet, spilling off our flanks and visible in the side windows as it should be.

Immediately I thought of a beautiful PlayStation title I recently played, Ghost of Tsushima. Ironically, its design philosophy included a HUD-less environment, where autumn leaves would be summoned with a mere swipe of the finger and the wind would, in effect, guide you to your next waypoint. This was that and, in practical terms, a beautiful replacement for a crude navigation arrow.

It clicked for me just then, the idea of being guided to my destination, by something more organic and less intrusive. Less flashing lights, more gentle persuasion. I’m not entirely sure why this made such an impact on me, but it did.

This cannot be just two years away?

Can it? This generation of concept cars is built remarkably close to production readiness, sacrificing none of the shock and awe. With Dee, BMW is showcasing an AI-driven reality, augmented and intuitive, built upon the open-source Android Automotive platform and incorporating an advanced new speech engine dubbed IPA Next (Intelligent Personal Assistant Next).

This is BMW OS9 and, indeed, it rolls out in just two years. How close it will be to the Dee experience, I cannot say. But I’m hopeful. To quote a true sage on artificial intelligence and androids, Sarah Connor: “The future, always so clear to me, had become like a black highway at night. We were in uncharted territory now… making up history as we went along.”

That’s us now, as we return to that old adage of “eyes on the road, hands on the wheel”, albeit with a futuristic twist.