In 2007, Steve Jobs said, “There’s an old Wayne Gretzky quote that I love. ‘I skate to where the puck is going to be, not where it has been.’ And we’ve always tried to do that at Apple. Since the very, very beginning. And we always will.”
Apple certainly keeps its skating path close to its chest. From hiring former FBI agents to protect secrets, to its frowned-upon employee bag checks and a tightly concealed supply chain, Apple always tries to make it difficult to spot what is around the corner. However, from the knowledge currently available, deductions can be made from the following:
- 📢 Apple’s announcements
- 🤝 Apple’s acquisitions
- 💸 Apple’s Annual Shareholder meeting
1. Apple’s announcements
Every year, Apple’s Worldwide Developers Conference, better known as WWDC, presents the Cupertino giant’s goals for the near future along with important product announcements. This year was a packed event, with a record number of new launches that had me writing pages of notes whilst tuning in.
- tvOS
- watchOS 6
- iOS 13
- iPadOS
- The new Mac Pro
- Apple Pro Display XDR
- macOS Catalina
- ARKit 3
In a packed event over two hours long, it is easy to be sucked into all of the dark-mode-enabled software updates waiting to be downloaded, the beta releases to be installed, or the new declarative take on Swift to be dabbled with. However, layered within this year’s event was something more important to Apple’s future than what meets the stage light: ARKit, and what those few minutes meant for developers, users and Apple. The tech press has a pretty muted narrative towards AR announcements, and it was no different this time; the contrast with Apple’s positioning of AR as a game changer is stark.
The tech punters are not quite sipping the Kool-Aid as much.
However, a deeper dive into this part of the keynote reveals more compelling information on what could, in due time, shape the company up for its next iPhone moment. Apple’s AR strategy now largely splits into three pieces:
- ARKit
- RealityKit
- Reality Composer
From the beginning, ARKit has offered computer-vision tracking which allows modern iOS devices to track their location in space, as well as detect flat planes like the ground or a table top on which virtual objects can be placed into the scene. The launch of ARKit 3 brings with it two new, key additions to the platform:
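The tracking described above takes only a few lines to set up. This is a minimal sketch, assuming an `ARSCNView` named `sceneView` already exists in the view hierarchy and that the enclosing object is the session’s delegate:

```swift
import ARKit

// Run world tracking with plane detection: the session tracks the
// device's position in space and reports flat surfaces as anchors.
let config = ARWorldTrackingConfiguration()
config.planeDetection = [.horizontal, .vertical]   // floors, tables, walls
sceneView.session.run(config)

// ARSessionDelegate callback: each detected plane arrives as an
// ARPlaneAnchor, whose transform virtual objects can be parented to.
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for plane in anchors.compactMap({ $0 as? ARPlaneAnchor }) {
        print("Detected a plane with alignment \(plane.alignment)")
        // place virtual content relative to plane.transform here
    }
}
```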
Human Occlusion & Body Tracking
A big problem with current AR applications is their lack of understanding of the people, objects and composition of the real world. If you place a virtual chair near a wall and walk around the corner, the device has no notion of ‘occluding’ the object as the scene composition changes. To solve one part of this, ARKit 3 understands the position of people in the scene. Your device can now correctly composite virtual objects blended with people in the scene, drawing them in front of or behind the object depending on their position relative to the camera. In prior versions of ARKit, virtual objects would always show ‘on top’ of anyone in the scene, no matter how close they were to the camera, breaking the illusion of augmented reality with conflicting depth cues.
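Enabling this people occlusion is a one-line opt-in on the session configuration, gated on hardware support (it requires an A12 chip or later). A sketch, assuming an already-created `ARSession` named `session`:

```swift
import ARKit

let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    // Per-pixel person segmentation plus depth estimation, so virtual
    // content is rendered behind people standing closer to the camera.
    config.frameSemantics.insert(.personSegmentationWithDepth)
}
session.run(config)
```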
The second big part of the ARKit update is the ability to track human pose. By knowing where people are in the scene and how their bodies are moving, ARKit 3 tracks a virtual version of a person’s body which can in turn be used as input for an AR app. Body tracking could be used to translate a person’s movements into the animation of an avatar, or for interacting with objects in the scene. This has a profound impact for indie developers, for whom motion capture and other body-tracking techniques have traditionally been expensive and dependent on sophisticated hardware.
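The flow looks roughly like this: run a body-tracking configuration and read joint transforms off each body anchor as it updates. A sketch using the ARKit 3 API names, with the session and delegate wiring assumed:

```swift
import ARKit

let config = ARBodyTrackingConfiguration()
session.run(config)

// ARSessionDelegate: each tracked person surfaces as an ARBodyAnchor
// carrying a full skeleton of named joints.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for body in anchors.compactMap({ $0 as? ARBodyAnchor }) {
        let skeleton = body.skeleton
        // Model-space transform of a named joint; feed these into an
        // avatar rig, or interpret them as gestures for the app.
        if let head = skeleton.modelTransform(for: .head) {
            print("Head position: \(head.columns.3)")
        }
    }
}
```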
The technology, much like most first-generation releases, seems a bit coarse. Even with minor camera movement, the avatar’s feet don’t remain particularly still while the rest of the body is moving, and small leg motions aren’t well tracked. When the user waves, the avatar can be seen to tip forward even though the user doesn’t. In the demo footage, the user keeps their arms completely out to the sides and never moves them across their torso (which would present a more challenging motion-capture scenario).
For now this could surely be useful for something simple like an app which lets kids puppeteer characters and record a story with AR avatars. But hopefully we’ll see it improve over time and become more accurate to enable more uses. It’s likely that this was a simple ‘hello world’ sort of demo using raw tracking information; a more complex avatar rig could smartly incorporate both motion input and physics to create a more realistic, procedurally generated animation.
With ARKit 3, Apple also introduced RealityKit which is designed to make it easier for developers to build augmented reality apps on iOS.
Building an augmented reality application is rather different from building a ‘flat’ app. It requires a strong understanding of 3D concepts and the key technical elements of the paradigm. A big portion of iOS developers still work in the realm of 2D apps, and are likely not equipped to handle this transition easily. Apple is clearly trying to iron this out, with a big focus on letting developers either incorporate AR into their existing apps or launch new apps that are AR-first out of the box.
Apple’s description of RealityKit encompasses photorealistic rendering, a 3D editor, camera effects, animations, physics and more. This is pretty much the whole suite offered by game engines such as Unity and Unreal, where almost all AR content is currently being developed. RealityKit aims to be a viable alternative that makes it easier for developers to familiarize themselves with this AR shift. Where Apple brings an extra advantage is rendering, which it aims to make as convincing as possible. Building on RealityKit and the advancements in ARKit, Apple wants 3D objects to blend seamlessly with the real world, layering effects onto them such as lights, shadows and reflections as if they were captured through the camera.
“RealityKit seamlessly blends virtual content with the real world using realistic physically-based materials, environment reflections, grounding shadows, camera noise, motion blur, and more, making virtual content nearly indistinguishable from reality,” — Apple
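In code, that pitch boils down to very little ceremony. A sketch, assuming an `ARView` named `arView` already placed in the view hierarchy: a few lines anchor a physically-based cube on the nearest horizontal surface, with lighting, shadows and camera-matched rendering handled by the framework.

```swift
import RealityKit

// Anchor content to the first horizontal plane RealityKit finds.
let anchor = AnchorEntity(plane: .horizontal)

// A 10 cm metallic cube with a physically-based material.
let box = ModelEntity(
    mesh: .generateBox(size: 0.1),
    materials: [SimpleMaterial(color: .gray, isMetallic: true)]
)

anchor.addChild(box)
arView.scene.addAnchor(anchor)
```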
Just like RealityKit, Reality Composer aims to make things easier for developers not experienced with game engines, workflows, and assets. It offers a library of existing 3D models and animations which can be visually edited, allowing the creation of simple AR experiences that integrate into apps through Xcode. While Apple has picked an undoubtedly dystopian name for the editor, Reality Composer helps trickle the benefits of AR down from a select few developers to a much larger part of the development community.
Hence, Apple’s announcements around its new suite of AR products seem to be about the world ahead of us, and how Apple could involve a larger audience in both experiencing and developing content, applications and games with AR in mind. What makes the second part of Apple’s AR play even more interesting is the parallel thread of acquisitions it has embarked on, which throws up a pattern.
2. Apple’s acquisitions
While Apple can hide a great deal of the behind-the-scenes work on its new product lines, it can’t quite hide the list of acquisitions it makes with its war chest of cash and profits. As a company, Apple used to make acquisitions that had a very direct, clear and strategic value to the company over a predictable time horizon. For example, AuthenTec gave Apple Touch ID. Siri gave Apple, well, embarrassment and Siri. However, something seems to be very different in the recent past.
Since 2014, there has been a consistent theme in their grocery bill of acquisitions. Here are some clear dot connectors:
Now, combine this with a healthy number of recent patents and a big spike in AR/VR job descriptions on their careers portal, all pointing towards the same thing.
💭 Is the technology primed and ready for AR glasses? NO.
💭 Are customers ready to spend hours wearing AR glasses? NO.
💭 Is there a wildly successful comparable AR glass right now? NO.
It is interesting how you could replace ‘AR glasses’ in those three questions with ‘iPhone’, rewind back to 2007, and the answers would remain the same. The fact is we have seen this happen with the iPod and the iPhone already. Besides, it is in Apple’s own interest to ensure it happens yet again, because beyond the technology and products everybody loves, the things that dictate companies this size are quarterly earnings and stock prices. That has Apple inevitably looking at newer pastures for growth.
3. Apple’s Annual Shareholder Meetings
Perhaps Apple’s most compelling reason to crank up the pace of innovation is the very nature of technology adoption itself. Today, over 2 billion iOS devices have been sold, and the world has only a finite audience left for Apple to sell to. At the start of the year, Apple cut its revenue outlook for the first time in almost two decades as demand for iPhones weakened and the drop-off in Chinese sales was more extreme than expected.
Companies that depend on Apple are even more dramatically affected. The Austrian chipmaker and Apple supplier AMS is more exposed to Apple’s vicissitudes than most. It gets about 20 percent of its revenue from the tech giant, according to Bloomberg supply-chain analysis, but it would probably have hoped for even more given the scale of its investment. The 80 percent decline in AMS shares since their March peak makes it the worst-performing Apple supplier of the past 12 months. AMS manufacturing facilities are working well below capacity, underscoring the sense that AMS over-invested in factories while iPhone sales have plateaued. The CEO admitted as much in a conference call with analysts.
Apple’s largest share of profits has come by far from the iPhone, the world’s most successful tech product ever. Period. The iPhone changed the very fabric of society, and Apple now faces a steep challenge: repeat the feat. The only way a company so dependent on one product can survive is to replace itself, rather than wait for someone else to. Apple’s stopgap has been to raise the price of the iPhone and to stop reporting the number of devices sold each quarter, to help curtail investor skepticism. That tactic will not last indefinitely, though. Customers are quickly reaching the threshold of how much a phone should cost and how often they need to replace one. Hence, Apple’s ultimate business goal of continuing to create shareholder value, keeping the stock in the green and the stockholders happy, ultimately requires a new product line. Not the likes of the AirPods or the HomePod, which are perhaps great products but not the runaway success Apple needs as a $1 trillion market-cap company.
But why is AR the AnsweR?
Why invest so much time and money into a piece of technology that requires you to hold your phone up awkwardly, let alone your 13-inch iPad? AR apps have had only a few hits, and it has been a difficult road for the ecosystem.
Skate to where the puck is moving
Apple is betting big that this step is just an intermediary one. Almost like the notch on your phone before cameras became motorized pop-ups or moved right underneath the display. Almost like Touch ID giving way to Face ID, and possibly to ultrasonic fingerprint readers that sit right underneath the display. Apple knows better than most that the current state of AR is a stepping stone towards the larger goal of hardware/wearable-enabled access paradigms. Apple is rumored to have a secret team of hundreds of employees working on virtual and augmented reality projects.
Rumor mill summary
Apple analyst Ming-Chi Kuo believes Apple’s augmented reality smart glasses will arrive sometime between late 2019 and the second quarter of 2020 and will work as an iPhone accessory. Kuo says the glasses will take on a display role while wirelessly offloading computing, networking, and positioning to the iPhone. Designing the glasses as an iPhone companion device allows Apple to keep them slim and lightweight because the processor doesn’t need to be built in.
Bloomberg has said Apple is developing an AR-focused product with a dedicated display, a built-in processor, and a new “rOS” or reality operating system. rOS is said to be based on iOS, the operating system that runs on the iPhone. For the AR headset, Apple is developing a “system-on-a-package” chip similar to what’s in the Apple Watch. As for input methods, Apple is considering touch panels, voice activation, and head gestures, and a range of applications from mapping to texting are being prototyped. Virtual meeting rooms and 360-degree video playback are also concepts that are being explored. Bloomberg has suggested Apple is aiming to finish work on the augmented reality headset by 2019, and a finished product could be ready to ship as soon as 2020.
2020 really does not leave Apple a lot of time to launch an entirely new product category. If this smoke on the horizon is to be believed, albeit with a pinch of salt, Apple must already be well into laying the groundwork needed to shift the development and content community from a phone-first mindset to a wearable-first one. That is almost exactly what Apple has quietly pulled off while the rest of the world was lured towards the new cheese-grater Mac Pro and dark mode on iOS. Apple has foundationally laid out many of the moving pieces of the software puzzle needed to make a hardware launch happen.
Digging deeper into the announcement
There was one particular slide during Apple’s keynote about all the new features on ARKit that Apple did not have the time to talk about.
There are significant parts of this announcement that are mandated in an Apple Glass world.
Sign In with Apple
Apple announced that developers who offer third-party logins will be required to support its own sign-in feature as well. Sign In with Apple promises to protect your privacy in several ways that Facebook and Google don’t, even generating random relay email addresses so that if a company starts spamming you, you can easily turn that address off.
Smart glass relevance: Apple smart glasses would need to continuously scan your environment, much like ARKit does, and composite AR objects on top of your world. This mesh data of the real world would include deeply personal spaces such as your living room and almost every space where you wear your glasses. That poses a significant privacy threat, and Apple’s sign-in feature could be a natural extension into this space, offering a privacy-focused spatial mapping system.
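For reference, the sign-in flow itself is driven through the AuthenticationServices framework Apple shipped alongside the announcement. A sketch; the delegate and presentation-context provider are assumed to be set on the hosting view controller:

```swift
import AuthenticationServices

func startSignInWithApple() {
    // Build a request for the Apple ID credential; the email scope may
    // return a private relay address rather than the user's real one.
    let request = ASAuthorizationAppleIDProvider().createRequest()
    request.requestedScopes = [.fullName, .email]

    // The controller presents the system sign-in sheet and hands a
    // stable, app-scoped user identifier back to its delegate.
    let controller = ASAuthorizationController(authorizationRequests: [request])
    controller.performRequests()
}
```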
The concept of localizing to your environment is an important part of making AR possible. It is paired with a six-degrees-of-freedom (6DoF) tracking mechanism that lets the user move freely around an AR object. Interestingly, Apple’s developer documentation revealed more about new ways this positional tracking can be used.
Smart glass relevance: A number of people noticed a key addition in the overview: virtual reality scenarios. This can be traced back to Apple’s acquisition of Vrvana, maker of a combined VR/AR headset, which could bring this core tech to the new Apple Glass if and when it launches. A configuration offering positional tracking alone looks apt for enabling a VR-only fallback mode.
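The configuration in question is `ARPositionalTrackingConfiguration`, new in ARKit 3: it tracks only the device’s position and orientation, without the camera-image processing a full world-tracking session performs, which is exactly what a fully virtual scene would need. A sketch, assuming an existing `ARSession` named `session`:

```swift
import ARKit

// Position-only (6DoF) tracking: no plane detection, no scene
// understanding, just where the device is and how it is oriented.
if ARPositionalTrackingConfiguration.isSupported {
    let config = ARPositionalTrackingConfiguration()
    session.run(config)
}
```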
Collaborative, at scale
There were actually quite a few additions in ARKit 3 which Apple didn’t go into during the keynote. Some of these were Multiple Face Tracking, which can recognize up to three faces at once; Collaborative Sessions, ideal for multi-developer work or AR multiplayer experiences; the simultaneous use of the front and back cameras; and improved 3D-object detection.
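Collaborative Sessions boil down to opting in on the configuration and then relaying opaque collaboration data between peers over any transport (MultipeerConnectivity is the usual choice). A sketch of that plumbing, with the session, delegate, and networking layer assumed:

```swift
import ARKit

let config = ARWorldTrackingConfiguration()
config.isCollaborationEnabled = true
session.run(config)

// ARSessionDelegate: ARKit periodically emits collaboration data that
// must be forwarded to the other participants.
func session(_ session: ARSession,
             didOutputCollaborationData data: ARSession.CollaborationData) {
    let payload = try? NSKeyedArchiver.archivedData(withRootObject: data,
                                                    requiringSecureCoding: true)
    // Send `payload` to peers over your transport; on receipt, each peer
    // unarchives it and calls session.update(with:) to merge the shared map.
}
```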
Smart glass relevance: Features around collaboration and an AR cloud layer might seem an overstretch at a time when developers are failing to use even the basics to enable their apps with AR. But Apple is solving for both ends of the spectrum: on one side it is moving the industry forward with what is possible in AR, and on the other it is bringing a subset of these possibilities to many new users and developers with RealityKit and Reality Composer. This two-pronged strategy advances the technology needed for a glasses launch while also raising the lowest common denominator of people using and building this content.
Apple Pay in AR
With the introduction of AR, commerce has been one of the most direct and easy-to-comprehend use cases. With more consumers shopping online and via their phones, AR sampling has been adopted by a number of product manufacturers as a way to help shoppers feel more confident about their purchases. Apple Pay has typically made it incredibly simple for users to make purchases through their phones. However, what if Apple Pay is not limited just to phones? The company has already brought Apple Pay to the Apple Watch, and the next step seems to be the next wearable form factor there is: glasses.
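The existing Apple Pay flow is driven through PassKit, and the same request could presumably back a purchase confirmed from a wearable. A sketch; the merchant identifier, product, and amounts are purely illustrative:

```swift
import PassKit

// Describe the payment: networks, merchant capabilities, and line items.
let request = PKPaymentRequest()
request.merchantIdentifier = "merchant.com.example.arstore"   // hypothetical ID
request.supportedNetworks = [.visa, .masterCard, .amex]
request.merchantCapabilities = .capability3DS
request.countryCode = "US"
request.currencyCode = "USD"
request.paymentSummaryItems = [
    PKPaymentSummaryItem(label: "AR Floor Lamp",
                         amount: NSDecimalNumber(string: "129.00"))
]

// Present the Apple Pay sheet; the result arrives via the controller's delegate.
let controller = PKPaymentAuthorizationController(paymentRequest: request)
controller.present(completion: nil)
```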
This year’s WWDC was undoubtedly one of the most packed events, with several launches available to users either today or later this year. Layered underneath those launches remain the key building blocks of AR across apps, the web and commerce: collaborative, private and intelligent. If there ever is a company that has been successful at launching entire markets out of thin air, it is Apple. The Cupertino giant has typically dominated the market across hardware, and is now also setting its sights on services. Between the new Arcade and the original shows featuring Oprah, Apple needs to find what is going to spark the next mass-market consumer product. The best way for Apple to find out would perhaps be to invent it.
“I see AR as being profound. AR has the ability to amplify human performance instead of isolating humans. So I am a huge, huge believer in AR. We put a lot of energy on AR. We’re moving very fast. I don’t want to say what we may do, but I could not be happier with how things are going right now.”— Tim Cook
To me, that remains the biggest nugget of optimism from the conference this year: the trail of breadcrumbs and the timing of Apple’s emphasis on AR. The company tends to soft-signal its priorities well before a product makes its way to glass store shelves. For example, Tim Cook would consistently state that “the wrist is a great possibility for computing” before the Apple Watch was launched. If we go by the company’s posturing, AR well and truly seems to be its search for the next iPhone moment. An encore is no easy feat, but a necessary one.
🙂🤳 ➡ 🤯👓