The year that will be the end of the smartphone as we know it

Iskander Smit
Published in LABSinfonl
Jan 14, 2017 · 5 min read


Last Friday I attended a cosy new meetup, Fraiday. All attendees shared their ideas for future applications of AI in their personal lives. As I was talking about my plans, the host Jim Stolze asked how I got my insights into future developments. I had not really thought about that much, and I answered that it is mostly an attitude of keeping an open mind.

I had to think about this again last evening, when someone on Twitter shared an article on the current activities of the inventor of Android, who is working on a new kind of mobile device. That article, combined with some thoughts on the future mobile ecosystem, connected with my own experiences and observations of using the AirPods, and with a discussion we had earlier that day at Info.nl about interesting technology developments such as serverless computing. So I have to add, in answer to Jim Stolze's question, that combining impulses and finding the pattern is an important part of thinking about the future.

So let's elaborate a bit on this interesting development: will we see a shift to a new kind of mobile device, or devices? It is perhaps not a brand-new subject to think about, but without a doubt this device has changed our lives more than anything else in the last decade (we just celebrated the 10th anniversary of the iPhone last week). The decade in which we got a computer in our pocket, with all the consequent changes in behaviour: instant knowledge all the time and instant contact with our peers, who are no longer just our close friends but our strong and weak ties alike.
And instant tools for all the needs in life that can be solved by a computerised service. An all-purpose device.

The phone is the center of our personal universe, the gravitational center, the hub for a functioning life, a black hole of attention. We are glass slab zombies.
Still: however connected our phone is, with all its 4G, iCloud Continuity and (Google) Now features, the dominant form factor is a device that bundles everything. I think there are signs now that this could very well change dramatically in the coming year(s). And I think there is a growing consensus on how that change can look. We have seen it in the movie Her: a continuous connection to the smart cloud through ‘beyond-screen’ voice interactions, plus a device for supporting functions that cannot be handled by voice. This is a model that looks likely to become reality. The parts of the puzzle have been introduced in their first iterations; they only need to be integrated.

The devices in Her: a dumb screen and an earpiece as smart hub

First, the cloud as a concept for more than storage, for computing as well, is something Stephen Wolfram has been building for some time. And Amazon introduced Lambda a while ago as part of its cloud strategy.
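
To make that "cloud as computing" idea a bit more concrete, here is a minimal sketch of a serverless function in the style of an AWS Lambda handler in Python. The greeting logic and the event fields are invented for illustration; only the handler(event, context) signature follows the Lambda convention. The point is that the developer writes just the function and the cloud runs it on demand, with no server to manage.

```python
import json

# Minimal, hypothetical serverless function in the style of an AWS Lambda
# handler. The platform invokes handler(event, context) on demand; there is
# no server for the developer to provision or manage.
def handler(event, context):
    # 'name' is an example field in an imagined event payload.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": "Hello, " + name}),
    }
```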

Second, the Alexa platform has also matured quite a bit, not only in technology (Alexa as an operating system) but also in use. These shifts take time; we need to get used to talking to products to give them instructions. Nobody doubts that this will be one of the main trends in 2017, not least given all the Alexa-enabled products introduced at CES.
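
As a rough sketch of how "talking to products for instructions" comes together with that serverless cloud, an Alexa skill can be backed by the same kind of function: the speech platform turns an utterance into an intent, and the function returns the text to speak back. The intent name below is invented for illustration, and the request and response shapes only loosely follow the Alexa Skills Kit JSON format.

```python
# Hypothetical voice-skill handler: the platform sends an IntentRequest and
# the function answers with the text to be spoken back. Intent names are
# made up for this example.
def handler(event, context):
    request = event.get("request", {})
    speech = "Welcome."
    if request.get("type") == "IntentRequest":
        intent = request.get("intent", {}).get("name")
        if intent == "PlayPodcastIntent":  # invented example intent
            speech = "Resuming your podcast where you left off."
        else:
            speech = "Sorry, I did not catch that."
    # Response shape loosely follows the Alexa Skills Kit JSON format.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```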

Third, AI is of course part of this shift. The first mass iteration will mainly be dialogue intelligence. Google is showing how much it believes in this by adding "Ok Google" to everything. And the acquisition of Viv by Samsung could mean an acceleration here as well; that technology has shown impressive demos. We need an assistant that understands us to make us comfortable using voice as an interface paradigm.

Fourth, as mentioned above: the inventor of Android, Andy Rubin, seems to be working on the ultimate glass slab device. I can imagine that this could very well be the kind of information device used in Her. The company is called Essential; we will see if that is indeed the nature of the device.

Fifth, does Apple play a role here? Definitely. As always, they seem to go for the ‘above and beyond’ strategy: absorb all the knowledge that has emerged from the pioneers and come up with the ultimate execution, built not on a technology push but on a user focus. Of course they still need to prove this, but I am looking forward to the release of the next iOS that merges macOS and iOS and takes a serious step in experiencing AI as a service.
The AirPods are a very important step in this: a voice-controlled device with a mini computer in it, tackling one of the important barriers, lousy connection experiences. I feel that Apple will shift the experience of its services from the phone to the cloud, controlled by voice-activated devices like the AirPods and their next iterations. It is not so strange that, with the Beats acquisition, they can create a whole range of those devices. I expect that Apple, with its focus on human-centred design, will be able to make a voice interface workable as soon as it is socially acceptable, and that we will wear the AirPods all the time we are in transit. I already feel the urge to keep my pods in even when I am not using them for listening.

2017 will be a building year: we will see new devices appear, and we will see AI mature in the dialogue engine. Apple could very well launch an iteration of Swift as a variant of Amazon Lambda in iCloud, which could be the start for developers to seriously begin making these serverless applications. All of this is important to ramp up to the next wave of interactions: interactions that are independent of the device and more connected to the actions, where all the things we use can function as a platform for these services.

I don’t expect a revolutionary new device ecosystem this year; the phone is still too much of a cash cow. But in hindsight we might see that 2017 was the year that marked the start of the end of the smartphone as we know it.


Research & innovation director at Info.nl. Leading LABS. Co-organiser Behavior Design AMS & ThingsCon Amsterdam. Cities of Things Foundation.