I saw a tweet this morning, from Gen Ashley (@coderinheels, September 9, 2016), that caught my eye.
Having written about Conversational IoT earlier this week, I thought it was worth a quick follow-up. I said then that
The nature of the current dominant user interface – the apps – is breaking down.
The existing screen-obsessed, mobile-first model, where we're all looking down at our phones, is part of what's breaking down.
So from my perspective Google's Nearby platform is very interesting, because its design principle is interaction between a number of devices and endpoints. Development has generally been about this screen or that screen, rather than this screen and this screen. And now we're throwing voice into the mix with Amazon Echo, Siri, and Cortana.
Even watching TV isn't a one-screen activity any more – we're tweeting from the other screen at the same time. I'm surprised nobody has yet nailed mobile sports data that augments the TV experience. What's interesting to me are the new models for interaction between all the devices and services we use. Amazon's Dash buttons are another non-screen-based endpoint.
Beacons have obviously been around for a while, but Apple and Google are in unique positions to get this right. I will have to look at it more closely, but at first glance Nearby looks like a really interesting framework, in that it comes at things from the interaction rather than the screen perspective.
Nearby Messages exposes simple publish and subscribe methods that rely on proximity. Your app publishes a payload that can be received by nearby subscribers. On top of this foundation, you can build a variety of user experiences to share messages, create real-time connections between nearby devices, and receive beacon messages.
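That publish/subscribe-within-proximity idea can be modelled in a few lines. This is a toy sketch only – the `NearbyHub` class, its method names, and the coordinate-based range check are my own inventions for illustration, not the actual Nearby Messages API:

```python
import math

class NearbyHub:
    """Toy model of proximity-scoped publish/subscribe.

    Purely illustrative: the class, its methods and the distance-based
    delivery rule are inventions, not the real Nearby Messages API.
    """

    def __init__(self, range_m=30.0):
        self.range_m = range_m
        self.subscribers = {}          # device id -> (position, callback)

    def subscribe(self, device_id, position, callback):
        self.subscribers[device_id] = (position, callback)

    def publish(self, position, payload):
        """Deliver payload to every subscriber in range; return their ids."""
        delivered = []
        for device_id, (pos, callback) in self.subscribers.items():
            if math.dist(position, pos) <= self.range_m:
                callback(payload)
                delivered.append(device_id)
        return delivered
```

The real platform establishes proximity over Bluetooth, Wi-Fi and near-ultrasonic audio rather than coordinates, but the interaction model is the same: you publish a payload, and whoever happens to be nearby and subscribed receives it.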
Nearby Connections enables your app to easily discover other devices on a local network, connect, and exchange messages in real-time. Use Nearby Connections to create multiplayer and multi-screen experiences.
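The discover–connect–exchange flow can be sketched the same way. Again, `LocalNetwork`, `Endpoint` and their methods are hypothetical stand-ins I've made up to show the shape of the interaction, not the real Nearby Connections API:

```python
class Endpoint:
    """A device on the local network, with an inbox and a set of peers."""
    def __init__(self, name):
        self.name = name
        self.inbox = []
        self.peers = set()

class LocalNetwork:
    """Toy model of discover / connect / exchange.

    Illustrative only: these names are inventions, not the real
    Nearby Connections API.
    """

    def __init__(self):
        self.endpoints = {}

    def advertise(self, endpoint):
        self.endpoints[endpoint.name] = endpoint

    def discover(self, requester):
        """Return the names of every other advertised endpoint."""
        return [name for name in self.endpoints if name != requester.name]

    def connect(self, a_name, b_name):
        """Both sides accept; messages can then flow in either direction."""
        self.endpoints[a_name].peers.add(b_name)
        self.endpoints[b_name].peers.add(a_name)

    def send(self, sender, recipient, payload):
        if recipient not in self.endpoints[sender].peers:
            raise RuntimeError("not connected")
        self.endpoints[recipient].inbox.append(payload)
```

For a multiplayer game, one device would advertise as the host, the others would discover and connect to it, and game state would then be exchanged as messages between the connected peers.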
Nearby Notifications is a new feature allowing developers to tie an app or website to a BLE beacon and create contextual notifications, even with no app installed.
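The "no app installed" part works because the beacon typically broadcasts a URL as an Eddystone-URL frame, which the phone can surface as a notification. The frame squeezes the URL into a BLE advertisement by replacing common prefixes and suffixes with single-byte codes; here's a small decoder for that format (the byte codes come from the published Eddystone-URL spec, though the function name is my own):

```python
# Decode an Eddystone-URL advertisement frame into a full URL.

SCHEMES = {0x00: "http://www.", 0x01: "https://www.",
           0x02: "http://",     0x03: "https://"}

EXPANSIONS = {0x00: ".com/", 0x01: ".org/", 0x02: ".edu/", 0x03: ".net/",
              0x04: ".info/", 0x05: ".biz/", 0x06: ".gov/",
              0x07: ".com",  0x08: ".org",  0x09: ".edu",  0x0a: ".net",
              0x0b: ".info", 0x0c: ".biz",  0x0d: ".gov"}

def decode_eddystone_url(frame: bytes) -> str:
    if frame[0] != 0x10:                   # 0x10 marks a URL frame
        raise ValueError("not an Eddystone-URL frame")
    # frame[1] is calibrated TX power at 0 m (signed dBm), used for ranging
    url = SCHEMES[frame[2]]                # one-byte scheme prefix
    for b in frame[3:]:                    # remaining bytes: URL text,
        url += EXPANSIONS.get(b, chr(b))   # with one-byte suffix codes
    return url
```

So the frame `10 EB 03` followed by the ASCII text `example` and the suffix code `07` decodes to `https://example.com` – a whole contextual entry point in under twenty bytes of radio payload.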
I can’t wait for better heads-up design, interaction and programming models. The idea that we’re redesigning urban signage, for example, to accommodate our cricked necks seems crazy to me. In Augsburg they’re now embedding traffic signals into the road after a girl was hit by a tram.