Apple’s software is very good, in general. Even though the company has focused on more platforms than ever – macOS and iOS and iPadOS and tvOS and watchOS, plus whatever software Apple may one day build for its car and almost certainly will build for its AR/VR headset – those platforms have continued to be excellent. It’s been a while since we’ve had an Apple Maps-style fiasco; the biggest mistakes Apple makes now are much more at the level of putting the Safari URL bar on the wrong part of the screen.
What all this success and maturity engenders, however, is a feeling that Apple’s software is…done – or at least very close. Over the past two years, the company’s software announcements at WWDC have been almost exclusively iterative and additive, with few big changes. Last year’s big iOS announcements, for example, were quality-of-life improvements to FaceTime and new types of IDs that work in Apple Wallet. Otherwise, Apple mostly rolled out new settings: new controls for notifications, Focus mode options, privacy tools – that sort of thing.
That’s not a bad thing! Nor is the fact that Apple is the best quick follower in the software business, remarkably quick to adapt and polish everyone’s new software ideas. Apple devices are as feature-rich, durable, stable, and usable as anything you’ll find anywhere. Too many companies try to reinvent everything all the time for no reason and end up creating problems where they didn’t exist. Apple is nothing if not a ruthlessly efficient machine, and that machine works hard to refine every pixel its devices create.
The best of iOS 15, in case you forgot.
But we’re at a technology inflection point that will demand more from Apple. It is now quite clear that AR and VR are Apple’s next big thing, the next supposedly huge industry after the smartphone. Apple isn’t likely to show the headset at WWDC, but as augmented and virtual reality enter our lives more and more, everything about how we experience and interact with technology is going to have to change.
Apple has been introducing AR for years, of course. But all it has shown are demos, things you can see or do from the other side of the camera. We’ve seen very little from the company about how it thinks AR devices will work and how we’ll use them. The company that loves to rave about its input devices is going to need some new ones, and a new software paradigm to match. That’s what we’re going to start to see this year at WWDC.
Remember last year, when Apple showed that you could point your iPhone’s camera at a piece of paper and it would automatically recognize and capture any text on the page? Live Text is an AR feature end-to-end: it’s a way of using your phone’s camera and AI to understand and catalog information in the real world. The whole tech industry thinks this is the future – it’s what Google is doing with Maps and Lens, and what Snapchat is doing with its lenses and filters. Apple needs a lot more where Live Text came from.
From a pure user-interface perspective, one thing AR will need is a much more efficient system for surfacing information and completing tasks. No one is going to wear AR glasses that push Apple Music ads and news notifications at them every six minutes, right? And full-screen apps that demand your singular attention are increasingly going to be a thing of the past.
We might get some clues about what that will look like: “using your phone without getting lost in your phone” appears to be a theme at this year’s WWDC. According to Bloomberg’s Mark Gurman, we could see an iOS lock screen that displays useful information without requiring you to unlock your phone. A more glanceable iPhone seems like a great idea, and a good way to stop people from opening their phones to check the weather only to find themselves at the bottom of a TikTok hole three and a half hours later. The same goes for the rumored “interactive widgets,” which would let you perform basic tasks without having to open an app. And if Focus mode gets its rumored improvements – especially if Apple can make Focus mode easier to set up and use – it could be a really useful tool on your phone and a totally essential one on your AR glasses.
I would also expect Apple to continue bringing its devices closer together in what they do and how they do it, in an effort to make its entire ecosystem more usable. With a nearly full lineup of Macs and iPads running on Apple’s M-series chips – and possibly a full lineup after WWDC, if the long-awaited Mac Pro finally appears – there’s no reason the devices shouldn’t share more DNA. Universal Control, which was probably the most exciting iOS 15 announcement even though it only shipped in February, is a good example of what it looks like for Apple to treat its many displays as parts of a single ecosystem. If iOS 16 brings true free-form multitasking to the iPad (and I hope it does), an iPad in a keyboard dock is essentially a Mac. Apple used to avoid this kind of convergence; now it seems to be embracing it. And if Apple thinks of all these devices as companions and accessories to a pair of AR glasses, it will need them to work well together.
The last time Apple – damn, the last time anybody – had a really new idea about how we use gadgets was in 2007, when the iPhone launched. Since then, the industry has been on the same track, improving and tweaking without ever really breaking the basics of multitouch. But AR will break all that. It cannot work otherwise. That’s why companies are working on neural interfaces, trying to perfect gesture controls, and trying to figure out how to display everything from translated text to maps and games on a small screen in front of your face. Meta is already shipping and selling its best ideas; Google’s are coming out in the form of Lens features and sizzle-reel videos. Now Apple needs to start showing the world how it thinks the future of AR works. Headset or no headset, this will be the story of WWDC 2022.