Horace Dediu speculating about how the Apple Glasses might come to be:
If you were to look at glasses: What if Apple glasses looked like a regular pair of glasses? What if they had absolutely no technological appearance?
The question on the glasses, or anything worn on the face, as I said before, it's a very delicate thing. The watch, when it was introduced, didn't look that different from any watch. And so the logic, I think, for wearables at Apple has been: "Let's not ask people to accept a behavior or appearance or a burden in terms of weight or anything else that they're not already accepting."
And we go back to this: if you want to know where wearables are going, look at where jewelry has gone. That's been my thesis about wearables.
We find it acceptable to hang on our bodies a lot of metal, a lot of rocks – which could be substituted with silicon and plastic, and maybe some other metals.
The area around the ears, the neck, the eyes and potentially the top of the head – where a hat would reside – these have been conquered. [We have] over-the-ear and in-ear headphones but [nothing] yet on the face.
And this is one of the things that, with Apple Silicon, they might actually be able to deliver: a headset so small that it would be indistinguishable from a normal pair of sunglasses.
Actually, what I think they would go for in a glasses model would be the degree of customization. Everybody has a different style of glasses. I've gone into these places in the past and gone through everything available. And I was like, I didn't like any of them. You know, they did not fit me. How do you make a pair of glasses, or a lens-based system, that has to conform to so many face shapes and styles?
I would think it would be along the lines of a custom order based on a 3D face scan. The idea is that they use your phone to scan your face, which gets you a 3D map of the head. You use that scan to get, effectively, a 3D render of your head wearing the glasses. Then you say, OK, I accept, and I buy. And then they send it to you in the mail, and you get perfectly fitting glasses. This would take into account the face shape, obviously, but also your nose and your ears and all the other touch points.
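The scan-to-order flow described above could be sketched, very loosely, like this. Every type, field, and clearance value here is invented purely for illustration – this is a guess at what such a pipeline might look like, not anything Apple has described:

```python
# Hypothetical sketch of Dediu's described flow: a phone scan yields
# touch-point measurements, frame dimensions are derived to fit, and the
# customer previews and orders. All names and numbers are illustrative.
from dataclasses import dataclass


@dataclass
class FaceScan:
    """Measurements (in mm) a 3D head scan might yield."""
    face_width: float         # temple-to-temple
    nose_bridge_width: float  # across the bridge of the nose
    ear_offset: float         # front of face to top of ear


@dataclass
class FrameSpec:
    """Frame dimensions derived from the scan's touch points."""
    lens_span: float
    bridge: float
    temple_length: float


def fit_frames(scan: FaceScan) -> FrameSpec:
    # Derive each dimension from its touch point, with small
    # (made-up) clearances so the frames rest comfortably.
    return FrameSpec(
        lens_span=scan.face_width - 4.0,       # slight inset from temples
        bridge=scan.nose_bridge_width + 2.0,   # clearance over the nose
        temple_length=scan.ear_offset + 10.0,  # wrap behind the ear
    )


spec = fit_frames(FaceScan(face_width=140.0,
                           nose_bridge_width=16.0,
                           ear_offset=95.0))
print(spec)  # FrameSpec(lens_span=136.0, bridge=18.0, temple_length=105.0)
```

The point of the sketch is only that each physical touch point on the head maps to one frame parameter – which is why a generic, off-the-shelf frame so often fails to fit.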
The logic then would be that the customization is key to a face device, not only in terms of size, but in terms of style. If that’s the case, then how do you create compute and sensor hardware as the embedded piece into this otherwise physical object that has so many shapes that it can be? That would be the breakthrough that I think they would seek in a wearable for the face.
- First of all, do not force people to adopt new behaviors
- Secondly, do not force people to adopt new aesthetics
- A third would be that it would be completely shaped by the user, as opposed to "here's the choices you have"
- And then make a useful product. Make it so that it delivers a quantum leap in usability
The advantage of making a proper set of glasses (not sunglasses) is that the share of the general population that wears corrective lenses (whether contacts or glasses) is higher than 50%. If that's the case, then the market is huge. Possibly even bigger than the watch market, and yet it's more difficult, because everyone is very particular about what they choose to wear on their face, as they would be about anything they wear on their bodies as clothing. Not so much wrist-worn or even ear-worn, but the face is a very special place.
We started this discussion with silicon. The fact is, the key to embedding that technology into a very customizable outer appearance might be possible only through this particular breakthrough in their ability to integrate. In which case, that becomes the next category. If they're able to crack that nut, it becomes the next category. The implications thereafter are profound, because, again, you are dealing with a new user experience.
The watch was a potential new category for computing as well. It turns out that its glanceable nature, despite the fact that you might glance at it 100-200 times a day, is not a sufficiently immersive experience. It is secondary. It is still a wonderful place to put technology. It does seem to affect your behavior quite a bit as a passive sensor. But the leap we might need to get people to be interacting with computers more might still require a position around the ears and the eyes.
Horace Dediu on how Apple is usually not a first mover into new product categories:
Apple doesn't jump on a bandwagon. It doesn't develop a product strategy, and doesn't actually firmly launch something, until 2% (or more) of the market has already been explored through products sold. Meaning, until the adoption of that technology goes beyond 2% of the population that will ultimately adopt it.
If you think about the mobile phone: when the iPhone launched in 2007, there had already been about seven or eight years of smartphones in the market. For Americans, it was mostly the BlackBerry, maybe a bit of Windows Mobile. But for Europeans, it would have been Symbian or Nokia-based smartphones that ran some crude operating system. But it was an operating system with apps. There were app stores. There were apps, there was media, there was imaging, there was messaging with imaging, and so on. All the prototypical services we have today.
Effectively, by the time the iPhone shipped, tens of millions of smartphones had already shipped and, arguably, about 2% of the market had been developed. And that's when Apple reset the expectations with the iPhone.
If you go back to MP3 players, same thing. The iPod, when it launched, was said not to be competitive with the Nomad, which was one of the brands of the time. The iPod was lame because it didn't have as much capacity as an existing MP3 player. By 2001, when the iPod did ship, there had already been 2-3 years of MP3 players in the market. Everyone at the time was asking: why is Apple so late? Why is their product so limited?
The fact is that super early adopters are frustrated. They're banging on about something. This is true for micromobility, and it might be true for AR/VR too, where you already have tons of VR headsets that kind of came and went. The Google Glass that came and went. The HoloLens from Microsoft.
We also had, by the way, tablet computing starting in the late 90s. That’s a good decade before the iPad shipped.
The pioneering happens early, and failures happen early. And Apple kind of comes in just when… I believe it is not because Apple thinks it can get a better advantage later, but because I think they toy with it. They put things together and say, "Now, it's good enough." It's good enough for the Apple brand. And, as a result, it's good enough for the average user.
In the adoption curve, they come in after the so-called innovator phase, in the middle of the early-adopter phase. The early-adopter phase is between 2% and 13%. Above 13% is the early majority, up to about 50%. Then from 50% until about 85% is the late majority. And the last 15% are called laggards. Roughly, that's how the famous adoption curve has been segmented.
Horace Dediu on how Apple seems to beta test new technologies:
One of the curiosities that I noticed was the inclusion of a 3D LiDAR.
What is it doing in an iPad?
Well, short-distance LiDAR might become one of the core technologies in AR. So they're putting it into an iPad almost as a beta test. Let's see how it works in the real world. Let's use some of the data from it to calibrate our own other systems.
Again, the M Series, the chips that detect motion, were first shipped in an iPhone. Using an iPhone you could detect steps, but nobody's thinking of that anymore. We're using the watch for that. But I think that the development of the core silicon for the Apple Watch started with the iPhone.
The M Series was sort of planted in there for getting some early data and then using it to deploy future technologies. Because the roadmap is so far ahead, I think there’s a lot of foreshadowing going on there.
That's what I would be looking at: whether there are acquisitions in enterprise, or acquisitions in AR, or acquisitions in AI, that are leading us to kind of put together this elephant by touching a piece of it.
And sometimes it goes nowhere. There are possibly dead ends all around. That's the game.
Horace Dediu on how Apple Silicon might be the source of a competitive advantage in wearables and AR for Apple:
If Apple is no longer constrained by what foundries and what architectures are going on in other companies, we might be looking at a new era of compute.
Again, what if, instead of taking old metaphors and UXes and repackaging them, you (potentially) come up with something new? That's always the question out there. Does this silicon enable more rapid prototyping of user interactions and user interfaces, along the lines of AR?
In which case, by the way, from a competitive point of view: how hard would it be for Microsoft or Google or Huawei, without custom silicon, to implement some of these newer ideas about interaction modes?
Especially in the wearables segment, where you're pushing on miniaturization, on power, on sensing.