What Will an AI Hardware Device Look Like?

Sam Altman, Jony Ive, and Masayoshi Son have announced a new “iPhone of Artificial Intelligence,” but what will such a device actually look like?

Late in September, breathless stories broke about a new “iPhone of Artificial Intelligence” hardware device collaboration among Sam Altman (OpenAI’s CEO), Sir Jony Ive (the guy who designed the iPhone), and Masayoshi Son (the SoftBank CEO who has invested $140 billion in AI).

But how will you take a selfie?

Altman, speaking last week at The Wall Street Journal’s Tech Live conference, was vague on details, except to say that he has no desire to compete in a crowded smartphone market. But that only means he and his collaborators want to create an AI-powered device that will eat the smartphone—the way the smartphone ate the phone, camera, calendar, flashlight, notebook, MP3 player, and many other things we used to lug around.

Let’s call this new Artificial Intelligence Device an AID. It’s a mistake, though, to think that AID will be only one piece of hardware. Instead, AID will be the hub of a collection of devices that will form a Personal Area Network (PAN), all run by an AI assistant that makes Siri and Alexa look learning disabled.

To replace smartphones, AID needs to replicate the smartphone’s input and output functions.

Inputs include: sensors that connect to the user’s body (like the Apple Watch’s health features today, which warn you if your heart rate goes awry), a camera/video camera, a microphone, GPS/maps, and a way to manipulate virtual objects (akin to how you pinch and zoom today).

Outputs include: speakers/earbuds to hear the AI; Augmented Reality (AR) smartglasses as the user’s primary display; and a portable “handover” display (perhaps the size of a compact foldable mirror) so you can show other people your pictures or that funny Instagram Reel. The compact display could also be a projector, as with Humane’s prototype “screenless” wearable AI device: a hockey-puck-sized gadget the user wears in a breast pocket. (OpenAI’s Sam Altman is an investor in Humane.)

There will also be a “buy this” gizmo that you can tap to purchase, like with Apple Pay or Google Pay today.

I’ve been thinking about PANs for years, including in my 2011 Science Fiction novel Redcrosse. The protagonist, Diana McNight, wears a pearl strand computer with a variety of sensors and input/output accessories: a floppy monitor that she crumples into her pocket, special contact lenses that project images directly on her retinas, and tiny implants in her fingertips that allow her to manipulate virtual objects. Another character, John Drakanis, has his computer in the form of a stylish wristwatch that connects to his PAN.

With AID, the biggest challenge of a PAN is battery life: the gadgets will need to stay powered up, and nobody will want to stop everything several times a day to recharge.
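To see why battery life is the hard constraint, here’s a back-of-envelope sketch of a PAN power budget. Every number below is an illustrative assumption I’ve made up for the exercise, not a specification of any real or announced device:

```python
# Back-of-envelope PAN battery budget.
# All capacities and power draws are illustrative assumptions,
# not specs for any real device.

DEVICES = {
    # name: (battery capacity in watt-hours, average draw in watts)
    "smartglasses": (1.5, 0.4),   # always-on AR display is power-hungry
    "earbuds":      (0.3, 0.05),  # audio is comparatively cheap
    "hub":          (15.0, 1.2),  # the AID itself, doing the AI heavy lifting
}

def runtime_hours(capacity_wh: float, draw_w: float) -> float:
    """Hours of use before the battery is empty."""
    return capacity_wh / draw_w

for name, (cap, draw) in DEVICES.items():
    print(f"{name}: {runtime_hours(cap, draw):.1f} h of runtime")
```

Under these made-up numbers, the smartglasses die first, after only a few hours, while the hub lasts most of a waking day. The weakest battery in the PAN sets the recharge rhythm for the whole system, which is why the accessories, not the hub, are the real engineering problem.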

Another challenge will be how the different gadgets communicate with each other. Today, my rudimentary PAN of iPhone, Apple Watch, and AirPods Pro all connect via Bluetooth. Bluetooth’s range is a healthy 10 meters, but that will become a limitation if my AID also connects to a hovering drone that follows me around to get pictures and videos from angles impossible today. I’ll also want AID to prearrange my latte at the coffee shop I’m passing or to have it delivered by drone as I’m walking… so AID will need a 5G or stronger internet connection on top of Bluetooth.

Here are two important additional questions.

Question #1: On top of replicating what smartphones do today, what other products and appliances will AID eat? I think AID will absorb a lot of the functions of today’s laptops, televisions, and tablets. Instead of a $3,500 MacBook Pro, I might just have a cheap keyboard, mouse, and monitor: one set at home and one at work. If you’ve ever cast a video from your smartphone to the big screen on the wall, then you’ve already seen the future of co-viewing. If La Profesora and I each have high-def smartglasses, then we can watch The Morning Show together on a virtual rather than a physical big screen. Virtual co-viewing (like the “watch parties” that streaming services offer but that I’ve never heard of anybody actually using) will become easy with AID.

Question #2: Who will design the accessories? Jony Ive is a legendary hardware designer, but the collection of wearables in a PAN will also be fashion items. Meta (Facebook) partnered with Ray-Ban for its primitive smartglasses, and I predict that we’ll soon see a fourth partner join the Altman/Ive/Son collective to take on transforming a bunch of nerdy gadgets into things that stylish people want to wear. We’ll have PAN accessories to wear to the gym and different ones to wear to work or to go out. I expect that companies like LVMH, Nike, and fast fashion houses will compete to be the fourth partner creating accessories that are smart in two senses of the word.

Smartphones scaled at lightning speed once manufacturers opened up their App Stores to third parties and became platforms upon which others built businesses. To scale, AID will need to do the same.

Finally, AID will change the hierarchy of senses we use with digital technology. Today’s internet is visual first (we look), tactile second (touchscreens), and auditory third (speaking and listening). AID will still be visual first, but speaking and listening will nudge ahead of touch as we chat with our digital assistants about most of our input/output needs.

This is the next phase of the digital revolution.

Note: To get articles like these—plus a whole lot more—delivered straight to your inbox, please subscribe to my free weekly newsletter!

