I'm far more interested in the OS and apps; that's the thing that's going to make this a success or a flop. Hardware (Apple will almost certainly get this right) is important, but the actual user experience and use cases that Apple presents will be the deciding factor.
Outside of gaming and industrial use, AR/VR has felt a little like a solution in search of a problem. I do believe, though, that the people who work at Apple are capable of solving that problem and finding the use cases that an everyday person wants it for.
My benchmark for this is my family. They are all different, with different levels of interest in tech. If I can see the majority of them having a use for it, and being inspired by the UX and marketing, then it will be successful.
Until now, none of them has had any sort of VR kit, but they all have iPhones and Macs.
As a slight aside though, new hardware products are very hard for Apple to keep completely under wraps: too many people in the supply chain, too many factories. Software, on the other hand, they can lock down incredibly well (pun intended). I am somewhat suspicious that they may surprise us at WWDC with their take on generative AI. Now that could be part of the headset, but it's more likely to be part of iOS/macOS and a feature that aims to help them sell more of their existing hardware ranges.
Disagreed. For VR/AR there are many issues that can make the experience uncomfortable or even sickening. The hardware and software are key to making a better device that is actually nice to use. Just putting on a Quest 2 headset is a tiring experience, assuming the battery isn't drained yet again. Even small details like font rendering can cause headaches.
But I don't think I'm being a fanboy when I say that if I trust any company to fix physical comfort/usability issues, it's Apple (with a $3,000/device budget).
Right, it's Apple, the company that makes the most unergonomic mouse in its price range, and the only wireless mouse I know of that can't be charged while in use. Also the company that once pulled an "if you can't get reception, it's because you are holding your phone wrong" followed by "have you considered a case".
We can be certain that it will look great. That's one thing Apple nails every time. For usability and comfort they have their ups and downs like anyone else, and an above average share of "it might not be very usable, but look how sleek it looks".
I like the mouse. It scrolls well, its thinness means it fits well in bag compartments, and I've never been bothered by the charging because I only have to charge it every few months and use it at a desktop, where I have a Lightning cable handy. Maybe if I gamed or something I wouldn't like it, or if I were working with spreadsheets or something where I'd have to click around a lot, but for reading and writing, it's great.
Your comment can be turned on its head. It's exactly because Apple ended up making a mouse that can't be charged during use that their contribution to VR is vital: they think different.
Their mouse has great battery life and only needs to be charged occasionally. Your Mac tells you when it needs charging.
They thought about the mouse charging problem. They arrived at different answers than the rest of the industry.
Sometimes when you "think different" you invent relativity, and sometimes you invent Time Cube.
Every mouse on the market has a battery life of months. Every mouse on the market will tell the user when the battery runs down. Apple is not special. The thing is just built wrong. And they haven't fixed it after all these years.
That is the smallest fatal flaw with that mouse. The bigger flaw is ergonomics. Selling something that is likely to cause injury should be illegal.
My Logitech MX lasts a couple of months on a charge, and Gnome warns when it is down to 20%, and it's done that for as long as I can remember.
The difference here is that sometimes I ignore the warning and then it gets too low but I can plug it in as I use it for 20 mins, whereas a Mac user can't.
Exactly. I figure they put the mouse charging on the bottom on purpose so that people would not just leave it plugged in. I end up charging my mouse about once every few months so it’s not an issue.
> I end up charging my mouse about once every few months
Wah, that is crazy. What is so special about this mouse that it needs so little power? Why isn't this reported more widely!? Other wireless mice that I have used need regular recharging or new disposable batteries. Always a sub-par experience.
Yeah that’s why the port’s on the bottom. To force a change in habit learned from subpar hardware. The thing charges enough to last a whole day in like a minute or two.
The usability issues here aren't in the realm of "does it have a polished case and nice straps", it's more "the state of the art VR tech isn't good enough for sustained work" unless your work doesn't involve reading or writing.
And, I mean... that's gotta be the bar for $3K, right? Even the most insanely passionate gamers would have a hard time justifying a $3K gaming accessory, IMO. And what else is there? The only usecase that I feel is ready is movies/TV, which... yeah is also a hard sell for $3K, especially since every family member would need one if you want to watch something together.
I think that VR business meetings would be quite valuable. But indeed, you cannot make your employees wear VR glasses for prolonged periods if it gives them severe headaches and/or nausea.
I like the Quest 2 but you’re so spot on about the battery. I have to plan a day ahead — “maybe I want to use the Quest 2 tomorrow, better plug it into the charger and make sure everything is updated!”
Although (tangent incoming!) I think omitting the OLED from the Quest 2 was a bit of a mistake. My theory is that the screen actually has more to do with why most games in the library have that same-y bright cartoonish look, rather than just the limitations of the XR2. It’s a relief the Apple headset is going OLED, if this leak is to be believed that is.
The latest Quest firmware has an “update before shut down” option that should help a bit, but it’s mostly a band-aid. The Quest Pro comes with a dock and just stays silently updated and charged, which seems the best way to go.
As far as the cartoony games: that’s actually the “fault”[1] of Unity/Unreal Engine which powers quite literally 99% of Quest titles (you can count the titles using other engines on one hand). VR studios are hyper-indie and don’t have the budget to build out games that look different, especially when that cartoon style lends itself to higher performance.
1. It’s not actually the fault of Unity/UE: Red Matter 2 is built on Unreal but looks spectacularly better than most other standalone titles.
> I'm far more interested in the OS and apps; that's the thing that's going to make this a success or a flop. Hardware (Apple will almost certainly get this right) is important, but the actual user experience and use cases that Apple presents will be the deciding factor.
> Outside of gaming and industrial use, AR/VR has felt a little like a solution in search of a problem. I do believe, though, that the people who work at Apple are capable of solving that problem and finding the use cases that an everyday person wants it for.
This is my feeling too - I just don't see an immediate usecase for such a device outside of a select few niches, none of which (even combined) move the sales needle for a company of Apple's size. What are people actually going to use this for? I just don't see it - although there could be something as the tech matures in a few years.
> This is my feeling too - I just don't see an immediate usecase for such a device outside of a select few niches…
As usual, Apple is playing the long game. We should remember that when the iPhone was announced in 2007, Steve Jobs set the goal to get 1% of the cell phone market.
In 2007, there was no App Store and the only native apps were what came installed on the device.
Apple is laying the groundwork for the future. They’ll sell enough for it to be meaningful while the rest of the ecosystem comes together.
Start with first principles in hardware. We are primarily visual creatures; secondarily, we are linguistic. If the I/O for a hardware device were ideal, would it not serve these two functions?
> But the cell phone market was a pretty well-established and sizable market. The AR/VR market is not.
I guess that's one way to look at it.
But compared to what was popular when the iPhone was released (Blackberry, Treo, Nokia), it was the iPhone that jumpstarted what became the modern smartphone market we see today.
Apple is uniquely positioned to usher in the modern AR/VR era, especially if it's true that iOS apps will run on the headset. Nobody is better than Apple at capturing the imagination of developers.
The use case is replacing laptops - calling it now that we'll be reading Bloomberg articles about the "Headset Class" in our lifetimes :). Definitely don't think apple has a shot in hell of pulling that off in 2023, but I'd love to be surprised.
I’m pretty sure Apple wants to rule a new product category with better margins and no existing users.
With laptops, and increasingly phones and tablets, most potential customers already have a thing from that category, so it’s becoming harder and harder to sell, and the profit margins and volumes are shrinking.
So Apple is desperate to create a new category.
Honestly, if they can make a device you can comfortably wear for an extended period of time, with that spec sheet, there are a lot of lifestyle-shifting things that can be done.
You could imagine locking virtual screens and interactive widgets in real space in exciting ways. You could redesign a living room if your main way to watch something is a virtual screen. Virtual cookbook apps you "leave" in the kitchen that you can interact with hand tracking even when you have raw chicken all over your hands.
The HoloLens never got there, but the use cases are there IF it actually hits the hardware mark. If it's confined to 20-minute bursts before discomfort sets in, like current hardware, then it's far more niche.
Honest question: would you use an AR headset that wasn't actually see-through but is just streaming video from cameras on the outside? I assumed the answer would be universally no for everyone, but I might be in a bubble. I haaaaate the Oculus Passthrough, personally.
Oh sure. I'm fine with passthrough but the Quest 2 barely scratches the surface and an Oculus Pro shows the potential but it's still very much under the bare minimum. I think it's a feasible path as long as the camera quality is pretty high.
Supplying content and adding the ability for long range AR (think up to the scale of a par 5 golf hole or a basketball arena) is what I work on at quintar.ai
It's a classic Apple move to leak rumors of extremely high prices and then announce only moderately high prices that make people feel like it's cheap.
The original iPad was rumored to cost $1500 to $1800. So when they finally announced it "only" cost $500 people couldn't believe how cheap it was in comparison.
I don't know why people expected the iPad to be so expensive. It's a big iPhone without cellular. Even though capacitive 9.7-inch screens weren't common back then, one wouldn't cost that much.
It's not. The iPad costs $449, or $329 for the last gen. That is their primary product, along with the Air at $599. The Pro is for the "pro" and tech enthusiasts.
I am really struggling to see how this can succeed. Maybe it will be like Apple Pay: lackluster adoption for a time and then, years down the line, it starts to gain market share. Even for me, a geek who works in the Apple ecosystem, with a relatively high-paying job, who would love to mess with this thing, it would be hard to justify spending $3k+ on a headset like this. Combine that with the fact that building compelling virtual reality experiences is just difficult: where are the apps going to come from? Maybe I will be totally wrong; I am excited to see it though.
$3k+ is exactly the right price point. A customer cannot be disappointed by a product they don't buy. And the type of enthusiast who spends $3k+ on it will be satisfied before they even wear it.
At 3k+ it will likely be the unchallenged top headset on the market, and that alone will mean that tech nerds will be raving over it.
> where are the apps going to come from?
Gaming whales exist. Enterprise customers exist. Microsoft got $2B from the DOD for building the HoloLens 2 at tech-demo levels of product quality. IMO, VR is one UX breakthrough away from finding a money-generating niche. And UX breakthrough/popularization is Apple's bread and butter.
> $3k+ is exactly the right price point. A customer cannot be disappointed by a product they don't buy. And the type of enthusiast who spends $3k+ on it will be satisfied before they even wear it.
I think you're absolutely right. I think a similar example of this is the RTX 4090. Everyone complains about the price, the power usage, how it's overkill for 99.99% of the population but it's constantly sold out. I doubt anyone who owns one is really disappointed
> I think a similar example of this is the RTX 4090.
I’m not sure. With video cards those gamer whales can crush all the lesser card owners after they buy some power via MTX. Then they can convince themselves it’s skill based, not money based. With a $3k headset in a closed ecosystem they’ll be playing amongst themselves and they won’t get the same level of reward (aka crushing “noobs”) that they’re able to buy in a more mainstream market.
I don’t know a single person, including friends of friends, that would buy any VR headset let alone one that costs $3k. The price point needs to come down a ton and typically adoption of tech like that comes from younger generations that have exactly zero money these days, so I predict massive flops from everyone because I don’t think the economics are going to work.
>Are there a lot games where a 4090 owner would have a meaningful advantage over an 3060 owner in a competitive environment?
Yes, all of them benefit from being able to push high framerates at resolutions the 3060 (and 2070 and 1080) can't get to. A high framerate is a meaningful advantage for a bunch of reasons, but so's a higher resolution- doesn't matter if an enemy's movement is smooth if your screen is too blurry to positively identify them.
Sure, the 1080/2070/3060 can run 4K at 30Hz, or 1440p at 90-100Hz (this is why that whole G-sync thing exists- it's designed to help hide framerate drops from those cards), but the 4090 is significantly more powerful than those are and can push 4K at 144Hz.
> Yes, all of them benefit from being able to push high framerates at resolutions the 3060 (and 2070 and 1080) can't get to. A high framerate is a meaningful advantage for a bunch of reasons, but so's a higher resolution- doesn't matter if an enemy's movement is smooth if your screen is too blurry to positively identify them.
Which game in particular gives a 4090 owners a meaningful advantage over 3060 owners? I understand the importance of high & stable FPS.
If there are so many, can you please name a competitive game where a 4090 gives a meaningful advantage over a 3060? And consider that 1080p at 120 FPS on low settings is essentially the max that any competitive gamer could ask for. Running games higher than 1080p is eye candy, nothing more. Anything over 120 FPS gives only a tiny, tiny, tiny advantage.
Agree, and would add that “apps” is likely the wrong mindset: This thing is going to be all about experiences. Yoga on the beach. Standing on the pitch during Ted Lasso. Watching the Apollo moon landing from the surface of Tranquility Base.
Either they have the immersion and UI model that makes this kind of content viscerally “wow”, or they don’t have a product. I’m guessing they have a product.
These "experiences" are visceral but also very short lived and weirdly frustrating for many (most ?) people.
I look at it the same way we look at paintings. Some pieces are truly mindblowing, speak to your soul, and you could look at them every day for decades to come without getting bored... or not. I think for most people even the most impactful piece becomes a wallpaper pattern after the fifth time they've stared at it for 5 minutes. They might revisit it once in a while, at most.
For a headset, that mostly means it's played with for 30 minutes the first time and stays in a closet for the rest of its life. You forget about it, and it never gets revisited.
So yes, experiences could be wonderful, but it also needs apps that bring you back to the device on a regular basis to help the ecosystem stay alive.
(The other weird part: the more realistic and visceral the experience is, the greedier you get, asking for more. Except there's nothing beyond. You'll be looking at that moon landing, wishing to move forward to look closer, except that it will be clunky, if even possible. And you're reminded that yes, it's "just" virtual, and the technology and content are not ready yet.)
I had that visceral “wow” while reading your post. I’m not saying that’s what they’ll do. But that _is_ one way of leveraging the content/IP from TV+ and could serve as an impressive demonstration which is fun to work on.
I am guessing they don't, if we use your metrics. I do get that those discussing here are mostly US fanboys, so facts are hard to actually discuss rationally, but Apple has made flops in the past, and has shipped underwhelming hardware and software it initially presented as the second coming of Jesus.
VR is hard, sorry, hard even for companies that had a decade's head start and budgets and talent similar to those available to Apple.
It may actually help them if people have more realistic expectations than those presented here, since it's not possible for it to be a perfect product from gen 1, day 1. Not even Apple has ever achieved that on any sophisticated product. Since there are so many sub-categories in VR to compete in (home cinema, overall immersion, raw resolution, couch/room experience, controllers, comfort, virtual huge desktop, sound quality, and so on), people should be realistic. And also expect to wait a few years for third-party software to fully utilize the HW.
Price point should not drive expectations; Apple often charges a markup on its products compared to the competition. We shall see soon.
Honestly, the tech has stagnated for the last few years, with VR headsets actually getting crappier (the Meta Quest Pro is slightly better than 1080p?). I hope Apple lights a fire under them; it's 2023 and the technology could be a LOT better. VR is one of those things where it either works perfectly or it's a failure: you need super high refresh rates, high resolution, and software to reach a certain level of immersion that tricks your brain into believing it's real.
The Pro is sort of an aberration. Pixels per degree are actually higher than the Quest 2, but the panel resolution itself is slightly less (and the MiniLED tech is far superior to the regular LCD panel). The big upgrade was with the pancake optics, which are actually quite a leap forward.
Meta should have put the product on ice until Qualcomm had XR2 Gen 2 SoCs available. They couldn’t bump the panel resolution because of the SoC. Hell, even the extremely overengineered Touch Pro controllers only exist as an artifact of the XR2 not supporting enough cameras to do face/eye tracking while still having cameras available for SLAM tracking.
Have you ever been to a live action theater and dinner thing? Like Tony and Tina's Wedding and The Lion King - it's corny as hell and meant to be consumed in small doses.
Apple Pay was great from day one - depending on where you lived. In the UK contactless payments are the standard and already were when Apple Pay was released meaning I could use it in 99% of retailers, even small ones.
As I understand it, the US didn’t really have contactless payment pre-apple/google pay _at all_, so it’d naturally be a harder transition than in places where it was already rolled out.
Even today, you’ll occasionally see terminals in Europe which don’t actually support Apple Pay as such; it’ll work, because it acts like a normal contactless card, but only up to 50 euro, because the terminal doesn’t know it’s authenticated.
Sounds like you're aware of why it's a uniquely American thing.
You forgot to mention companies that did accept it but stopped after they implemented their own payment system (which again only applies to American branches of their stores).
If they can execute well on a simple use case - a virtual desktop for professionals - this will be justifiable to me. I spent >1000€ on my last 5K ultrawide five years ago.
This promises to be a portable, unlimited screen space. If I can use it for 8h+ and the OS integration is there, I could see myself spending that kind of money.
Having a portable workspace anywhere with a chair and desk is big - especially when desk/chair dimensions now don't play a factor.
>This promises to be a portable unlimited screen space
Well, if the resolution is up to par and the software is set up to mitigate the problems even middling resolution would cause.
No headset, except maybe the 8K (and possibly the 4K) PiMax designs, is capable of displaying text in a way that isn't a blurry mess; the problem is both the lack of resolution and the fact that anti-aliasing can only do so much (move your head ever so slightly off axis so the text no longer lines up with the pixel boundaries, and fine details become completely distorted by insufficient resolution; software could help mitigate this if anyone bothered to write any, but that hasn't happened yet).
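To put rough numbers on it, here's a back-of-the-envelope pixels-per-degree sketch (a minimal illustration only: the per-eye resolutions and FOVs below are approximate public figures, the ~60 PPD "sharp text" bar is the usual 1-arcminute rule of thumb, and real lenses distort density across the view):

    # Back-of-the-envelope pixels-per-degree (PPD) check.
    # Per-eye resolution and FOV figures are rough assumptions; real optics
    # distort pixel density across the view, so treat this as order-of-magnitude.

    RETINAL_PPD = 60  # ~1 arcminute per pixel, the usual "text looks sharp" bar

    headsets = {
        # name: (horizontal pixels per eye, horizontal FOV in degrees)
        "Quest 2":     (1832, 97),
        "Valve Index": (1440, 108),
        "Pimax-class": (3840, 140),
    }

    for name, (px, fov) in headsets.items():
        ppd = px / fov
        print(f"{name:>12}: ~{ppd:.0f} PPD, "
              f"about {ppd / RETINAL_PPD:.0%} of the ~{RETINAL_PPD} PPD bar")

Even the densest consumer panels land at maybe a third to a half of that bar, which is why anti-aliased text still shimmers the moment your head moves it off pixel alignment.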
“Portable workspace” is an interesting concept to me. Something more than a laptop screen. It would be hard to get input devices right when you’re wearing a headset though.
Lots of VR headsets have a small webcam, and can show its viewpoint as a HUD element in the headset. That's enough to keep track of your bearings, then you just need a normal mouse and keyboard (e.g. the laptop you already own). No need to revolutionize on the input front if all you want is lots of private screen real-estate.
You can have this right now from multiple VR headsets (Lenovo comes to mind as a company that puts a lot of emphasis on this use-case). The issue is the resolution of most headsets is pretty low for this usecase, so you end up with pretty low-resolution virtual "screens".
With supported keyboards (basically Apple and Logitech’s offerings atm), they load up a tracked 3D model of the keyboard and then pull in an overlaid video feed of your real hands. With non-supported keyboards, the video feed is more of a small “window” that also captures the keyboard.
And when it works, it’s honestly very impressive (especially with color passthrough).
I fully expect Apple to have a more robust and less buggy version ready to go - and with the much higher specs they’ll probably be able to get a lot of “wows” from the demos.
I've never been convinced by an AR keyboard-like interface. You need tactile, physical feedback. I really can't imagine anything short of a direct brain interface being a good replacement for physical input devices, specifically the keyboard, though LLMs will maybe make voice interfaces tolerable for certain creative tasks.
It feels a lot like it’s been pursued all the way to a releasable product and there’s just too many wheels in motion to stop and think if it’s really a worthwhile idea.
We could be wrong and people are happy to hand over 3k for this, but it’s just hard to understand how for a whole bunch of reasons.
Are they going to get third parties to build software for this? Where is the volume to support that going to come from? Who’s going to want to go first and risk 3k on something that might flop and be on eBay for a fraction of the price in a year?
Just to point out, this sounds like the exact criticisms you might have levied at Apple for the original iPhone. Not that this will go the same way, but they've pulled it off once before, to dramatic, world-changing effect.
Did people really say that about the original iPhone? I don't remember that.
I remember the iPhone announcement as being a product that basically everybody had been asking for and immediately saw the value in. And the rumour cycle leading up to the announcement was insane - people were so hyped for apple to announce a proper smartphone, it was already a product category. Things like the Palm Treo and the Nokia N95 existed and were relatively popular - handheld computers with phone capabilities as well as full web browsers, email clients, maps, video playback, etc. The iPhone was taking a product concept that had already been proven, and hugely improving the form factor and user experience. Nobody saw the iPhone announcement and said "nah, i just don't see the value there".
Gizmodo loved it, but "The real elephant in the room is the fact that I just spent $600 on my iPhone and it can’t do some crucial functions that even $50 handsets can. I’m talking about MMS. Video recording. Custom ringtones. Mass storage. Fully functioning Bluetooth with stereo audio streaming..."
TechCrunch: "That virtual keyboard will be about as useful for tapping out emails and text messages as a rotary phone. Don’t be surprised if a sizable contingent of iPhone buyers express some remorse at ditching their BlackBerry when they spend an extra hour each day pumping out emails on the road."
A ton of people loved it, but it came under heavy criticism for the lack of physical keyboard and the AT&T dependency.
Not enough emphasis can be placed on the existence of a usable web browser on a phone. It did not exist before. Blackberry was great for emails, and Windows mobile was great for propping your door open, but when Safari on iOS came out, even with it being slow 2G and with a buggy keyboard and no copy/paste -- you could use websites on a mobile device -- anywhere. Being able to pull up amazon and check reviews for products while browsing in a BestBuy or whatever was something that just was not practical before and it was a game changer, instantly.
Lots of people criticized it for not being able to do basic tasks like copy and paste or the lack of 3G after they bought it. The iPhone didn't suffer at all from the criticism because people saw it as a work in progress -- the AT&T attachment was actually brilliant because it allowed them to make AT&T build their network and offer decent data plans. Before iPhone the data rate was something ridiculous as an add-on to a regular phone plan, and everyone using 2G data all at once brought the network to its knees in dense areas and forced upgrades.
Yeah my point wasn't that it was universally loved - there was plenty of criticism. but the criticism was because other things in the same product category were better than iPhone at some things. and that's because the product category already existed, and people were used to using those products and had already formed opinions about how they should work.
an augmented-reality headset is a fundamentally different sell, because that's not a product that currently exists in any meaningful way. Apple doesn't have to convince me that their AR headset is better than any other one that i've tried, because i've never tried one. They will have to convince me that this is a thing i need to own.
The iPhone cost roughly the same as 2007’s competing smartphones that were already selling millions of units (e.g. the Nokia N95), and it offered completely unique capabilities.
Today there is a high-volume VR headset in the market, the Quest 2, but the new Apple device is projected to be nearly ten times more expensive and it seems to be more of an incremental advancement of the state of the art.
But we’ll see… I hope Apple pulls a rabbit out of the hat.
In 2007, cellphones were mostly “free” (the cost was hidden in the service contract). Plus however much they made by putting crapware on the phones.
Similarly the Facebook devices have their cost hidden in the privacy violations that come built in. It is a “better” business model than the cellphone contracts because it is even harder to figure out how much they are making off you, but I wouldn’t be surprised if the Apple model wins again.
> The iPhone cost roughly the same as 2007’s competing smartphones that were already selling millions of units (e.g. the Nokia N95), and it offered completely unique capabilities.
At the time of launch the iPhone was one of the highest priced (including carrier subsidies) phones on the market.[1]
> Today there is a high-volume VR headset in the market, the Quest 2, but the new Apple device is projected to be nearly ten times more expensive and it seems to be more of an incremental advancement of the state of the art.
As Ballmer mentions later there were high-volume Windows Mobile phones available at that time too.[1]
What were those "completely unique capabilities" at launch? There was no App Store. It didn't record video. It didn't have 3G. It was actually a fairly ho-hum phone.
The real web available in your pocket. If you weren’t using the mobile web before then, it was a decidedly second-class experience until mobile Safari launched. It took Google 2-3 years after the iPhone to get Android to be comparable and by then the App Store was booming, which basically doomed all of the old phones. Previously selling apps required paying off each carrier and giving huge percentages of the sales, so there were almost no apps of decent quality available. At WWDC 2008, there was spontaneous applause when it launched with much better terms, hard as that may be to believe now.
It was a ho-hum phone by the existing standard of mobile phones. But it offered a full-size capacitive touchscreen and a desktop-level operating system. These were both unique in the market, and importantly they were designed together for a cohesive experience that nobody else could match. The iPhone introduced UI affordances like inertial touch scrolling which we take for granted today, and that UI enabled things like mobile web browsing that didn’t suck.
The iPod touch was an important pillar of this new platform’s introduction. It’s often forgotten today, but a lot of people around the world got introduced to iOS on the iPod rather than the iPhone.
Can you elaborate further on what you mean by a "desktop-level operating system"? By what metric(s)/feature-set? Because Windows Mobile out-did it every way in terms of multitasking, file system et al. You couldn't even multitask on an iPhone until iOS 4 in 2010.
The capacitive touchscreen wasn't unique, the LG Prada was announced prior to the original iPhone, and launched before the original iPhone too. Perhaps "unusual" would be a better term.
I had an iPhone at launch, and perhaps it's down to what websites you used (I'm a Brit), but the browsing experience was pretty terrible until people started developing responsive websites, and it really wasn't until ~2009-2010 that this took off in any appreciable or meaningful capacity in the UK, and even then, it was still common to have a separate mobile version of a website up until 2013. Prior to that, an app for larger websites became much more common, and offered a much better experience than the mobile website or responsive website ever did. Browsing the web over 2G sucked. Websites having buttons too small to press with your fingers also sucked.
Websites that were 100% Flash were super common, as were Flash elements. Even the PlayStation Portable could display Flash in the browser by 2005 after an official firmware update from Sony.
I also had Windows Mobile, BlackBerry et al in the years prior to that.
Sure, pinch-to-zoom, that was serviceable in lieu of a responsive website or mobile site, but it still sucked versus just using the app for the site when the App Store eventually launched. Furthermore, most of the software innovation was happening in Cydia via jailbreaks prior to the App Store launch. Even then, for a considerable number of years following it. Apple would frequently incorporate whatever was hot on Cydia as a concept on the next major version of iOS.
You didn't seem to like your original iPhone. That's ok. I, for one, loved my iPod Touch. It had a full-fledged iPod music app that really worked flawlessly with podcasts and my large music library. I even paid for the update that gave me calendar, mail, and contacts because I wanted to read my emails on the go. Maps was wonderful, there was great YouTube app to watch videos in bed, and Safari worked ok while we were all waiting for responsive websites.
Of course the App Store was an explosion of functionality, but there was a lot to love about iOS before third-party apps appeared.
I liked my original iPhone prior to the launch of the App Store—but it was only jailbreaking and adding a lot of features via Cydia that made it a keeper, even after the launch of the App Store.
iOS was largely lacking for many years. No copy-paste until 2009, no multi-tasking until 2010. It was rough, even as a diehard Apple fan at the time.
There were countless times that I had phone envy—paying more to get considerably less functionality didn't sit well with me at the time. It definitely felt like a toy at times. Everybody wanted to play with the gimmicks like drinking a virtual beer or swooshing a lightsabre, or playing Tap Tap Revenge or Angry Birds, but as a daily-driver phone for productivity and not screwing around, it was pretty lacking.
This was the era when you bought a "smartphone" to do productive things, not screw around. I also liked my PSP, but it wasn't a productivity device, despite having a pretty kick-ass web browser with Flash support and a decent-sized screen.
Windows Mobile in 2007 was based on Windows CE, a completely different and much more limited kernel compared to desktop Windows. Symbian and BlackBerry had the same limitations: these were operating systems designed for embedded devices, not desktop-level computing. They were memory efficient but their growth path was nonexistent.
The original iPhone OS used the underpinnings of Mac OS X: the Mach kernel, CoreGraphics, the Cocoa Foundation APIs… At the iPhone launch, Apple even called the phone operating system “OS X” to highlight that it’s the same.
The combination of screen size, resolution, and non-broken browser to render non-mobile-specific websites tolerably.
That's how it broke the “smartphones are niche because the mobile web sucks, and the mobile web sucks because there are so few users that it's not worth improving” barrier.
Don’t forget Apple also got the soft keyboard right, which was no small feat. People were dead set against it, but here we are: no physical keyboards anywhere on smartphones.
Sure. Not sure that was independently important (for adoption; it clearly had a big impact on the course of mobile interaction), but even if it wasn’t, it was essential to the web thing, because otherwise you couldn’t have a usable keyboard and a sufficiently large screen.
It had the first phone web browser, and indeed the first web browser on any device smaller than a laptop, which didn't make you want to immediately throw it out the window. That was a big deal.
The BlackBerry message thread context was terrible; the iPhone was better - come on, don't lie. The idea that you could get full-screen visual context was the kicker: BlackBerry screens were 'decent', but at ~2.5 inches of height, compared to the iPhone, they were unreadable.
The original iPhone was pretty bad, actually. It didn't have many of the unique capabilities we now take for granted.
In terms of competitive pricing, I mean, yeah, if this is just a Quest 2 with better screens it's DOA, but I would hope not, though I am keeping expectations low. My only point was that Apple has the capability to pull rabbits out of its hat, not that it will.
> We could be wrong and people are happy to hand over 3k for this
There'll be _some_; remember Google Glass? I'm not convinced there'll be enough to make it any more than a niche oddity, though (even the numbers quoted in the post imply that; Apple doesn't launch that many products where they only expect to produce 500k in six months...)
> I am really struggling to see how this can succeed.
People seem to forget this every time, but this version of the tech is aimed at developers and early adopters, just like version one of Hololens was. It's also priced like Hololens was.
The consumer focused version in a thick sunglasses form factor comes later.
Or, not at all - we’re on year, what, 8 of that? There isn’t anyone reputable who thinks there’s a short-term breakthrough coming, especially after Meta's $$$$ swings at it, and Apple failed too.
Cf. Bloomberg's recent piece on this; tl;dr: a lot of internal strife at Apple because that's what they were supposed to do and simply couldn't. Now it's limping out the door because they might as well ship something. The lead was saying as recently as a year ago that glasses were 3 years out, but they were shelved because there's no path to them.
> Or, not at all - we’re on year, what, 8 of that? There isn’t anyone reputable who thinks there’s a short-term breakthrough coming, especially after Meta's $$$$ swings at it, and Apple failed too.
Pancake optics just recently became a thing and they really are transformative compared to fresnel lenses of the past. You have edge to edge clarity instead of the massive blurry mess of earlier HMDs. They’re also typically thinner and don’t need compute power to compensate for pincushion distortion.
If this Apple headset takes off, I’ll almost feel sorry for Meta. The Quest Pro was their “business prosumer” device that needed another 6-12 months in the oven before it would be usable for that purpose. Great PCVR headset, but nobody makes money on those.
If anything, Meta has shown that VR CAN be sold to the masses as a video game console (Quest 2 sales numbers are honestly impressive in that context), but the console model doesn’t bring in the billions in revenue promised to investors. And nobody wants Metaverse garbage. It’s make or break on the prosumer market, and if Apple bombs I wouldn’t be surprised to see the Zuck pack it up soon after.
Great for VR, horrible for AR, unfortunately: the trick that makes the optics thinner is bouncing the light a ton in the lens, dimming the display.
I am hopeful that with recent increases in nits from OLED, and some luck from µLEDs / small OLEDs, we'll get "glass but 4x the size and more contrast" by 2025 from someone.
Yeah, the power consumption on panels bright enough for pancake optics aren’t good news for AR form factors.
Even in a dedicated HMD like the Pro, they had to redesign the SoC to move the memory so they could fit a chonkier active cooling system (that mysteriously is on full throttle constantly in recent firmware, which makes me wonder if they were seeing thermal failures). Bosworth admitted in one of those Facebook livestreams that the reason the Pro tops out at 90hz isn’t because the panels don’t support 120 - it’s because you’d cook your face from the heat of running those high nit panels at 120hz.
> There isn’t anyone reputable who thinks there’s a short term breakthrough coming
One big thing Apple has going for it is that they will iterate on a product year after year instead of doing one hardware update and giving up like Microsoft did.
Regardless, developers need hardware in their hands if you want third party apps for the new platform.
I've built too many experiments attempting to make this visual wearable plane worth it, and if these devices try to do anything more than hands-free reading, they are a worse way to achieve the same task.
Think about its 'most compelling' consumer use case - Lifelike Avatars in FaceTime. Which one do you think is preferred, cooking in the kitchen and wearing a headset, or propping your iPhone up and FaceTiming while cooking? Where on earth does it 'change the experience' to have a real-life like avatar? We have the 'life like' avatar right now with Zooms - and the reason Zooms suck is innately the human part, not the rendering part - most meetings could be emails or 5 min phone calls.
Everyone had spreadsheet and publishing needs (Mac), portable music players (iPods), cellular phones (iPhone), headphones (AirPods), home wireless music speakers (HomePods)... think of Fitbit / Pebble, 1980s VR, and then today's VR?
Yeah, I was there for those (edit: similar, I mean, different context) conversations with H1. The goal is better human contextual awareness on the factory floor; the number of things that need to be 'newly explained' on the floor should be 0. This negates the whole AR/VR 'support staff / training' spatial context. It's like cycling with training wheels after you've been riding for 20 years... it makes no sense.
(We found this out with GM doing a pilot.)
[Edit] to keep it going, external cameras are how you verify parts are picked correctly and put in the right place - you don't need FPV for that to be verified either.
[Edit 2] Until a flight simulator uses VR and AR headsets as primary training, this stuff just doesn’t make sense. CAE has definitely explored this, but it’s not what they reach for.
About as well as any recent Microsoft or Google product that was abandoned when it didn't prove to be an instant success instead of continuously iterating on it and improving it year after year?
For one thing, the first iPhone model wasn't missing that many elementary functions. GPS and 3G were obvious ones, and both were remedied quickly in subsequent models.
For another, the competitors were also all missing at least one vital elementary function: Internet support that didn't suck.
So I don't think there are a lot of comparisons to be drawn between a yet-to-be-released VR platform that, ultimately, nobody is really asking for, and a phone that almost everybody who wasn't named Steve Ballmer, Mike Lazaridis, or Ned Ludd desperately wanted.
(Admittedly I said the same thing about the watch -- who in the world wants a watch? -- and that was way off-base given how much demand there turned out to be.)
The iPhone's a pretty crappy apples:apples example, because a huge part of its functionality was outside Apple's control -- a usable cellular data plan.
And, from memory, only came about because Apple was Apple and could say "We don't care that you're AT&T. Create this plan for us or your customers won't be able to get an iPhone."
It was missing 3rd party apps of any form, copy/paste, video recording, any sort of office suite, email IIRC had no push support at all. Palm/Windows Mobile/Blackberry devices of the time could do a lot more.
> Palm/Windows Mobile/Blackberry devices of the time could do a lot more.
Blackberry for email, Windows Mobile for doorstops, and Palm for 'jack-of-all-trades, master of none'. None of them got mobile web right, none of them had iTunes, and none of them had Steve Jobs hyping them.
> not sure what you mean by "people seem to forget this every time"
People seem to forget that this wasn't ever intended to be the mass market consumer version of the tech.
They've been iterating internally on a consumer version for years now, and they'll keep at it until they have something for the mass market that they think is worth shipping.
A headset having 4Kx4K panels for each eye doesn't make it a substitute for a 4K monitor; the distortion through the optics means a virtual display will appear much lower resolution than that. The rule of thumb for headsets with 2Kx2K panels is that a virtual theatre covering most of your field of view has a perceived resolution of about 720p, so you can maybe expect about 1440p from the Apple headset in the best case, where the virtual display dominates your field of view, and less than that if the display is smaller.
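As a rough sketch of where that rule of thumb comes from (the panel width, FOV, and virtual-screen angular size below are assumptions for illustration; lens distortion and resampling lose more on top of the plain geometry):

    # A virtual monitor only receives (panel pixels per degree) x (degrees it
    # spans), no matter how many pixels the panel has overall; distortion and
    # resampling then cost more again. Figures are illustrative assumptions.

    def virtual_screen_width_px(panel_px, fov_deg, screen_span_deg):
        """Horizontal panel pixels covering a virtual screen of the given angular size."""
        ppd = panel_px / fov_deg
        return round(ppd * screen_span_deg)

    # ~2K-wide panel, ~100 deg FOV, virtual theatre spanning ~70 deg of view:
    print(virtual_screen_width_px(2048, 100, 70))   # ~1434 px before losses: a 720p-1080p feel

    # ~4K-wide panel, same geometry:
    print(virtual_screen_width_px(3840, 100, 70))   # ~2688 px before losses: roughly a 1440p feel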
1440p is good enough for a 27-inch monitor. If I could have an array of those, with the headset tracking my focus between them, and it were seamlessly integrated into the OS, I could see the value.
My God, me too. Never - ever - could I understand why Apple wouldn’t just sell the watch as a potentially standalone device.
I know, it’s Apple, but at the same time, you’re easily going to sell 2-3 times more of them, and, like the iPod, it can easily introduce them to the Apple ecosystem, and when it comes time to buy a new phone, they’ll maybe say ‘well, I’ve already got an Apple Watch and really like it… maybe I’ll get an iPhone this time my contract expires.’
But it’s Apple business logic. I’ve used their products pretty much exclusively for nearly 20 years (I’m an iOS/WatchOS dev and have used Logic to make music for 15 years) but I’ll never understand many of their decisions.
> it can easily introduce them to the Apple ecosystem
I argue that it would be a horrible introduction to Apple products. For input, you either talk to it or type on a tiny keyboard. There’s no web browser. The screen is so small, it’s measured in millimeters (whereas in the U. S., the phone is measured in inches).
Were I running Apple, there’s no way I’d let the watch be the introduction to Apple products without some improved means of interaction.
Yes, and not everyone has an iPhone for the first step. There are lots of iPhone users, but let's not pretend Android users are an insignificant userbase.
Where did I imply that they were insignificant? I specifically mentioned a well-known Android user. I used to be a dyed-in-the-wool Android user myself and a day-one adopter of the Moto 360.
Like, yeah, it’s not ideal but you just ask a friend with an iPhone to help out. Unless you’re in an android only circle of course.
Either way, not ideal, but not as dire as the people above make it out to be either
Well Apple Pay, running/tracking apps and listening to music via Bluetooth all work without phone. And I don’t have the model with internal sim, maybe more stuff works with that version.
I agree you need the phone too, but at least I’ve been able to track a run while listening to music on my headphones and pay for water after my flask broke all without a phone present.
I think we’re close to just the watch but I’d be damned if I want to dictate messages etc. The screen size and lack of input devices definitely holds it back from being genuinely standalone.
I am blown away by how bad using other music apps on the Apple Watch can be for listening without a phone (and no sim).
I haven’t tried in some time but about a year ago YouTube Music wouldn’t work at all and Spotify was janky. I started my run and after the buffer was exhausted my music stopped playing even though I thought I had downloaded it onto the watch.
Probably the solution is Apple Music but that is annoying.
The announcement coming from WWDC makes me think that the first wave is mostly for developers to build on, and then the cheaper/more advanced headsets coming later will be the ones marketed to regular users. I was on the fence until Palmer Luckey gave it a thumbs up, now I am cautiously optimistic.
The rumor mill has been consistent for over a year that the “mass market” iteration is coming in 12–18 months (I’d guess they’re targeting holiday season 2024).
I think people are underestimating how effective a “cost-almost-no-object” version is going to be at generating interest. There will be lines at Apple Stores as soon as it’s available to try.
The model is iPod / iPod mini|nano. Everyone wanted the former, everyone bought the latter.
> I think people are underestimating how effective a “cost-almost-no-object” version is going to be at generating interest. There will be lines at Apple Stores as soon as it’s available to try.
Not to mention all the developers who want to build something for the mass-market iteration. This is going to be like the original launch of the App Store, where even small gimmick apps get a lot of attention. If you want in on the gold rush, you’ll have to buy the first version.
I think Apple just needs to get this out. If it’s somewhat decent, it may spur development in 3D and VR/AR tooling on the Mac platform again, something that is sorely lacking after Apple’s disinterest and inability to ship good GPUs.
Having this in the hands of developers could be great, and Apple can follow up with an affordable hardware product a few years later. It’s basically Apple’s HoloLens, hopefully with a different fate.
A high-quality head-mounted screen could be a huge breakthrough for ergonomic work. Staring at a fixed plane and orienting your body with respect to that plane is pretty mechanically unnatural for the human system, and it causes a lot of problems.
I tend to agree with this take. It's interesting to think about what you could do with an "infinite" screen, especially if it's socially acceptable to wear in a cafe, on a plane, etc.
Who in their right mind would want a huge screen like a phone, without a good way to input information like a keyboard or a mouse? What would be the use case for a product like that? Watching YouTube?
My watch mostly replaced my phone. It’s a game changer. I thought they were fairly superfluous and unnecessary and expected they’d actually phase it out within 5 years.
I don't find anything compelling about it either. I don't even like my iPhone that much. Half the time I pull it out of my pocket to find that the camera is on (no way to disable that) or I've made some other gesture that has opened up a dialog with no obvious means to dismiss it.
I think the problem you experienced with Apple Pay was due to the US not being up to speed with contactless payments (or chip ones, for that matter)? In Poland you could use contactless cards for a veeeeery long time, so when mobile payments came it was just swapping the card for a phone. What's more, before Google/Apple Pay, banking apps offered mobile payments, first via a dedicated SIM card and then via an app.
People literally switched banks to get Apple Pay here.
I think the whole "chip-and-pay" thing in the US is so very laughable - everyone else has been with it for at least a decade.
It's a shame it took forcing merchants to wear the cost instead of banks making the terminals more affordable - then again, I am talking about the banks here.
I remember the original Macintosh launch, and just prior to that the failure of the Lisa; both were very expensive for their time. I’m also skeptical, but Apple has managed to pull this kind of thing off before, even if, as you point out, it might not be an instant success. The main question is whether they can take an idea that everyone has been excited about for a while but couldn’t get quite right, and hit a home run. Also excited to see what they come up with.
Your perspective here is invaluable: do you think pulling over all the other apps from Mac, iPad and iPhone is enough to make this device compelling at day 0?
There’s no huge productivity shift like there was for spreadsheets with VisiCalc that pushes our perception of what’s possible, so something has to make this device compelling in a different way.
Not really. Sales collapsed after an initial rush of excitement. By the end of 1984, Apple was selling only 10,000 Macs per month, much less than they had projected. Jobs was ousted a couple of months later.
The success of the Mac was to show what a pc could be. This headset is the same. Both Apple Watch and iPhone sucked when they came out, but they were good enough at showing what the future could be; enough for people to buy them. Both have improved dramatically around 5-10 years after release.
This headset should be thought of as a dev unit, except substitute dev with early adopters.
If it’s good then there will definitely be a value add in maybe 5-10 years. If it sucks then they’ll just have failed at creating a compelling headset experience that has broad appeal, just like everyone before them.
It’s important to realize just still how early we are in the computing revolution, and how much of the growth curve there’s left.
[This is not here for you, it's here for me as a record that I said this, and I was either very right or very wrong, but at least I said something - and put words down to clarify my thinking]
I feel like the 'how many times do we have to tell you old man' character - hearing that the iPhone and the Watch sucked coming out and both eventually became successful.
The iPhone did not suck in any way, shape, or form. It was everything anyone who had touched a PDA or BlackBerry before that wanted: text messages in a gorgeous swipe-able full-view context, visual voicemail, a phone that had contacts as a first-class citizen, a screen you could actually read without feeling claustrophobic, and a frickin' full web browser! And it was your iPod too!? Dude... we all need to sit back, stop playing revisionist history, and realize how in awe we were waiting for this stupid phone to activate - it took me over a week to activate my iPhone because we all overloaded Cingular. But the swipe gesture just to unlock it, and its beautiful screen, had me staring at the thing for the full week - so damn excited. Simple things like the highlighted 'Swipe to unlock' were a designer's wet dream for a few years - come on man! This was flippin' insane.
The same thing happened to me with mp3 players and the iPod. I won't bore you with a similar story - but come on! We all bought mp3 players, the experience sucked but the end goal was worth it - and when the iPod made the thing we wanted to do easy - it was a natural success. The same thing happened with Desktop Publishers and people who used spreadsheets on a daily basis for the GUI based computers.
To spell this out again: the world already had, and used on a daily basis, the product class Apple made great. Most people did not care for fitness trackers/watches when they came out [see Pebble's fall from grace, Fitbit's early IPO failure, etc.], nor do most care for VR and AR headsets.
Here's the kicker: I've bought almost every generation of VR and AR headset trying to make these devices compelling, but every experiment just shows that this 'take over a human field of view' thing sucks. They do nothing for us like the music player or PDA did - maybe it's an Alto issue (we need the GUI breakthrough for these) - but I've worked on this problem from every angle for 7 years and it sucks. Even the dang architects and mechanical engineers think it sucks, and they are in the spatial computing context! I was there with the DK1, met with HTC in Taiwan and other people I can't name, and did all the BS - this tech strata sucks for what it wants to achieve. It just needs to be a beautiful hands-free reader accessory - that's where it's compelling - this whole ambient computing thing will not be head-worn, it just won't.
Hey future you, I also just want to clarify that I don't really think the iPhone sucked!
Obviously it was amazing. I just meant on day 1, which many of us forget about, it wasn't there for most people. It has conceptually stayed more or less the same product but it really has gotten so much better that we take for granted how much it's improved.
In my opinion, for example, the iPhone 4s was the first that was good enough for most people, but even then for many people it wasn't until the screen became giant that it became usable.
Anyways, I was too poor to own the first iPhone, but I had the first Apple Watch. It was weird for me because I simultaneously felt like it was amazing and a huge let down. I kept telling myself, "Well, at least it's a nice watch", but the truth was that it was just too limited.
I think conceptually the watch has stayed the same but is so much better now.
I'm excited to see the non-head-worn ambient computing future, and I think Apple will be successful.
Thank you. One of the more frustrating aspects of the Apple Watch and now VR headset is the bizarre tarnishing of the launch of the iPhone and iPod. It’s a strange cognitive dissonance too, since people do still point to that Keynote as earth shattering. No one goes back and watches the Apple Watch announcement.
I used to say Apple wasn't good at games, but they have a lot of experience making content and publishing games these days. They have continued to let Mac gaming flounder, but it's a different story on mobile. This headset will almost certainly feel closer to mobile than anything else.
> I am really struggling to see how this can succeed.
It could work if they allowed pr0n on it. They already offer the privacy which makes it a much better proposition than a Quest where Zuck is always watching over your shoulder.
Nothing stops you from watching porn in the Quest web browser. It has private browsing mode like every other browser.
I don’t think this is the killer app for headsets though, unless maybe someone comes up with a volumetric production pipeline for truly immersive participatory experiences…
How do I know if activating private mode in the Quest web browser is not equivalent to sending a signal to Meta that "now extra juicy information is coming in"?
What if the headset has a 4G modem built-in and sends telemetry that way? What if the next-gen Quest has a Starlink receiver in the SOC?
Ostensibly it doesn't matter. Apple's headset won't be any different, both companies will encrypt their traffic and insist it's meaningless telemetry. If these companies want to spy on you, they will - they own your hardware more than you do. Without a verifiable, open boot process you're basically just throwing stones from a glass house. If you want paranoia-tier VR computing, get a Valve Index and an airgapped Nvidia/Linux workstation. It would probably still be cheaper than Apple's complete product (somehow).
Whatever the case, I'd bet the farm on Apple's new headset shipping with OCSP telemetry out-of-box.
By far the most expensive parts are the two 1.3-inch Sony OLED screens at 700 USD (350 each). A "special-shaped flexible OLED" external screen by LG is only listed as 30 USD, which seems quite low. Its purpose may be to show the eyes of the wearer (reverse passthrough).
This reminds me a lot of the lead-up to the iPad announcement. A lot of buzz about a super high price and limited functionality (it’s just a big iPhone!) and then the actual price was significantly lower (though this “leak” is already about half of what the other “leaks” have been estimating).
I wouldn’t be surprised if this is Apple marketing or Wall Street traders inserting information into the news cycle. For Apple, the benefit is obviously to exceed expectations when they release an $800 headset (still expensive, but so much cheaper to someone who had mentally allocated $1,600). For Wall Street, anything to get the stock price to move one way or the other is valuable.
That's wishful thinking. Given the estimated BOM of 1500 to 1600 USD, the originally leaked ~3000 USD end user price is much more plausible. The BOM doesn't include costs of development, marketing, or profit margins.
The original reported price of $3,000 is not unrealistic. If this BOM is real, a $3,000 price point would be roughly similar to the profit margins of their other hardware products. The past few months have probably seen a lot of closed-door headset demos, and I wouldn't be surprised if the price was a major point of feedback for many people.
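For a rough sanity check on that margin claim, here is a back-of-the-envelope sketch in Python. The only figures taken from this thread are the leaked ~$1,600 BOM and the rumored $3,000 price; everything else in the comments is an assumption, not a sourced number.

    # Back-of-the-envelope only; bom_cost and retail_price come from the
    # figures discussed in this thread, the commentary is assumption.
    bom_cost = 1600        # leaked BOM estimate, USD
    retail_price = 3000    # originally rumored retail price, USD

    gross_margin = (retail_price - bom_cost) / retail_price
    print(f"Implied gross margin over BOM: {gross_margin:.0%}")  # ~47%

    # The BOM excludes assembly, R&D amortization, marketing and channel
    # costs, so the real product margin would land well below this figure,
    # which is roughly where Apple's hardware margins are usually estimated.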
I doubt Apple discusses pricing with the people it gives demos to, or even with partners creating content. The only people who know the price are finance, marketing, and C-level.
To the contrary, I'd wager the price is the first thing every Apple shareholder would care about after demoing a VR product. Many of these people are not stupid investors and know the market is hot. Even if Apple doesn't explicitly tell them the price, these people know that price is a constraining factor. Especially if they're handed a piece of hardware with a battery dangling off the side like it's a 2012 Oculus demo kit.
I love this thread: everyone is sharing their expert opinion on something that is right around the corner.
It would be a great exercise to review this thread 6 months after the headset is on the market, and see how good HN really is at forecasting.
I had this tab open for a while and was just re-reading the comments after talking to a retired Intel director. It's interesting to look at other innovative Apple posts here; maybe you'd like to check them:
Something interesting is that a lot of the most popular links for these from algolia are not the announcement. The M1 in particular surprised me. The announcement discussion is a few pages behind many other posts about various M1 things.
I've long thought that there's a lot of interesting research opportunities for hn discussions (see Dropbox announcement). Maybe ChatGPT can summarize it.
I am wondering when Apple will start putting a second camera onto iPhones with proper pupil distance, so people can start taking 3D photos and videos in landscape format. That would be a real "whole family" use case...
Some years ago I made 3D photos of a wedding with a physical dual camera setup. That was quite a pain to get exposure etc. synchronized and postprocessed. Still the pictures look amazing on a quest. Nowadays this could all be super easy.
With some AI/NeRF magic, you could adjust IPD and allow limited head motion for videos and photos, even when they're shot with a sub-optimal stereo setup or even on a single camera.
Sure, there are already models that allow you to hallucinate a full 3D scene from a single image. Still, it would be pretty cheap to add that camera, I assume, and it would have the potential to quickly fix the 3D content issue that is prevalent currently without introducing hallucination artifacts.
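To make the IPD-adjustment idea above concrete, here is a minimal, assumed sketch (not any real pipeline): given a rectified stereo pair and a per-pixel disparity map, a virtual camera part-way along the baseline can be approximated by forward-warping one image with scaled disparity. Real systems would use NeRF-style or learned view synthesis and handle occlusions properly; this only illustrates the geometry.

    import numpy as np

    def shift_view(left_img, disparity, alpha):
        """Approximate a virtual camera a fraction `alpha` along the stereo
        baseline (0.0 = left view, 1.0 = roughly the right view) by shifting
        each left-image pixel by alpha * disparity. Hypothetical helper."""
        h, w = disparity.shape
        out = np.zeros_like(left_img)
        xs = np.arange(w)
        for y in range(h):
            # A point at x in the left view appears at x - alpha*d in the virtual view.
            new_x = np.clip(np.round(xs - alpha * disparity[y]).astype(int), 0, w - 1)
            out[y, new_x] = left_img[y, xs]
        return out  # occlusion holes would still need inpainting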
I had 3D photos taken of my wedding too in 2009. The wedding was in the snow, so the 3D worked out nicely.
There were several 3D Android phones that I remember in the early days with dual-lens cameras and autostereoscopic screens, but sadly they all bombed.
At some point the technology will return, I'm certain of that.
No, but these prices track for an ultra-high-end product given what I saw of the hardware industry years ago.
For example, volume cost of a basic PCB is in the region of 1 cent per square cm, but a $40 PCB in volume can easily come from many layers and a weird stackup (not to mention that this may be up to about 100 sq cm).
Apple (and most high end tech designers) use highly specialized, very $$$ PCBs for a lot of their products.
They aren't created by traditional litho + etch, but rather by a process called modified semi-additive processing (mSAP). iPhone, Apple Watch, etc. PCBs are basically CPU/IC interposers at a larger scale.
Definitely not 1 cent/sq cm cheap, but they have massive volume and high price points. It lets them get the insane part density needed these days.
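As an illustration of how those numbers compound, here's a toy calculation; the multipliers are assumptions picked only to show how a 1 cent/sq cm baseline can plausibly turn into a roughly $40 board once layer count, mSAP/HDI processing and yield are factored in:

    # All multipliers below are illustrative assumptions, not sourced figures.
    area_cm2 = 100            # large main board, per the comment above
    base_cost_per_cm2 = 0.01  # commodity 2-layer board, USD per sq cm

    layer_multiplier = 6      # 10+ layer stackup vs. a basic 2-layer board (assumed)
    process_multiplier = 5    # mSAP / fine-pitch HDI premium (assumed)
    yield_factor = 1.3        # scrap on a complex stackup (assumed)

    cost = area_cm2 * base_cost_per_cm2 * layer_multiplier * process_multiplier * yield_factor
    print(f"Estimated board cost: ${cost:.2f}")  # ~$39 with these made-up factors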
Can’t speak to the BOM, but the rumors have consistently said this will be multiple thousands at retail.
Apple's track record is refinement/perfecting for mass audiences, not blue-sky innovation. I’m looking forward to seeing their entry in this space, should it actually launch.
If you have to dissipate more heat than a lightweight, traditional copper heatpipe solution can handle, there will be problems with noise and battery life.
The mid frame and thermal module costs are unlikely, but not strictly impossible. I can’t conceive of any scenario where there would be $20 of microphone BOM in a headset though.
So the internet went from $3000 being the BOM cost and Apple not making any profits to $1.5K BOM cost? [1]
And for anyone who knows anything about CE BOM costs, that list is just complete BS. At this point it is highly likely Apple will release an AR/VR headset at $1999 and the internet will think it is a bargain, given how PR had everyone set the expectation of its cost.
I believe in the potential of this product. Having a Quest 1 that I still use to this day, I believe very strongly that Apple will make this work.
People talk about use cases, but there are quite a few. VR/AR is going to be fantastic for home fitness, and Apple already has a fitness subscription; pair that with a watch for stats, etc., and this really will be a great device for that. It's also exactly what Apple loves at the moment: it requires two products and a subscription in the Apple ecosystem, great for getting people locked in.
I think AR/VR, as long as they can get it comfortable, will replace screens and the TV. Why have a TV in the house when you can put on a headset and be at the cinema? Without it taking up the space of a TV, and as something you can take with you anywhere, like a hotel, etc. There were rumours for so long that Apple was working on an actual TV; I think what actually happened is they will release the device that replaces it.
All that needs to be done is for someone to make all this work together and you have got a killer product, so I wouldn't write Apple off just because Meta has done a bad job of trying to sell the metaverse vision to everyone. I don't think Meta were stupid for pursuing this; the current products are just half baked and badly executed, but Apple is in the position to do this right. This was the talk of the tech world not too long ago, before AI came and took over everything, but I believe Apple is going to put it firmly back on the map in a few weeks when they drop this. What Apple does, the world listens to.
This will be expensive and have a lot of flaws, but every generation-shifting product that Apple has released has been like that; just wait a few years down the line and see where it goes.
Doing any fitness activity under sweaty VR goggles is a fucking pain. None of the display tech provides a sufficient frame rate to allow any sort of jittery sports movement without making people sick.
Watching TV in VR also is highly uncomfortable because you can't relax your head due to the weight (and again it is uncomfortably hot). Unless you are on an airplane it doesn't make any sense. The only other advantage is privacy.
The only hope this has is if Apple uses insane camera tech to do video passthrough at such low latency that this device can beat HoloLens for AR applications.
You can all have your own headset and watch the same content. But I’m sure you understand how that doesn’t replace the TV experience where you watch content with your family or friends, together, in the same space.
I have thought that being able to join a live event like a baseball or basketball game would be the killer feature for VR. I’ve come around to that not being a great experience, because you’d basically be going to the game by yourself.
On the other hand, Apple’s bread and butter is one user, one device. The question is: will Apple lead the way to the “killer app” or rely on third-party developers?
I'm honestly kind of boggled by the HN comments so far; they all seem to be various forms of "I can't see how wearable displays can offer any value unless they're cheap". To me the use case is obvious: getting rid of the fucking screen. There is no law-of-physics reason I can't have a full-size desktop 10k 40" (or whatever the max is) "display" for every single system I interact with, anywhere. The current paradigm of lots of different screens for different computers (desktop/notebook/tablet/phone/watch/appliances) exists because nobody has yet been able to project photons directly into the retina via something good enough and wearable. Products like the RETISSA II demonstrate direct retinal projection working, but without enough resolution, too clunky, etc. But the fundamentals work.

The killer app IMO for a wearable display is replacing every other display. I can sit down at my desk or anywhere else and have whatever screen size/config I want, the only difference being how much local compute I have available and at what latency. No more need for "headless servers": just have the right display key pairs and you can pull up a screen/console from what you're wearing. Same with anything from network switches to printers; you could have a full-size augmented local display instead of, at best, some tiny little LED readout. On a plane or a train and want to watch a movie, or pull up SSH sessions or whatever else? Full screen, and in total privacy. NV/FLIR, or digital [LMH]VPOs, could all just feed into my worn display too. Which would be perfectly crisp even for those of us with bad eyes.
And on and on and on. It's so obvious. All the challenges involved are engineering/polish/politics (in terms of agreeing on some level of industry standards at least for a minimal level of universal interaction). Which doesn't mean it couldn't be mucked up but the promise is clear. I've spent many many thousands of dollars on displays of all sorts. When one wearable one can replace them all I'd cheerfully pay $5k let alone $3k. It'll also open up entire new realms of form factors for computers.
Apple may or may not manage it, and certainly probably not the first go around, but it's pretty clear why they'd want to be getting going on real experience with it even if it's mostly for devs and such to start. It's the definition of a looming disruption, a huge amount of their current visual product design is about "very nice screens, with stuff on them". Something that eventually obsoletes the "screens" bit would be one obvious way someone else could pull the rug out from under most of Apple's current offerings, and conversely is a big new opportunity for them too long term. It might take them 5-10 years, but if they think now is the time to get iterating I don't see why they shouldn't get going even if it's a touch aggressive.
Everyone can see the value of a Holodeck, or a magical lightweight high-resolution display that barely weighs on your head, doesn't cause eye-strain and doesn't mess up your hair.
But that hasn't yet been technically achieved.
What the comments are saying is: "I can't see how current state-of-the-art wearable displays can offer any value, given their current limitations"
And I agree. I can't imagine working 8 hours per day with any of the headsets I've tried (Rift, Rift-S, Valve Index, HoloLens, HoloLens 2). They're heavy, scratchy, and annoying to put on and take off. The Nreal Airs, mentioned by a sibling comment, have a promising form factor, but only feature a 1080p display.
At our current level of technological advancement, I doubt that Apple will overcome these limitations next week.
If there was some massive market for this, it would already exist.
As-is, desktop streaming apps do exist for VR. This emerging paradigm you're describing is yours for the taking, if you want it.
Screens are simply a better paradigm though. I say this as someone with a wireless OLED headset less than 20 feet away from me - I choose to instead respond and browse the internet and program and watch TV on my disappointing, $250 VA panel. There is never a time where I go "I'd rather experience this 2D, traditional content in VR now", even if the headset is charged and ready to go. Casting the content to my TV is much more comfortable, lets me share it with others, lets me eat my popcorn undisturbed, and works much better for binge sessions where you don't want "headset burns" on your face.
> On a plane or a train and want to watch a movie, or pull up SSH sessions or whatever else?
Yes, because this is just so difficult on my phone and laptop. If I want to watch a movie or pull up SSH sessions while I'm on a train, I would much rather do so on an interface that is not stuck to my face.
Write it on my tombstone, if it makes you happy. This headset is dead-on-arrival the moment their "Starting at" figure is revealed to be over $1,000. If that sounds like a longshot to you, then the WWDC response is going to make me look like William Tell.
I agree this is the main intended use case, but I’m not sure it’s compelling enough to offset all the downsides:
* Unless Apple has figured out true mixed reality (i.e. transparent lenses with full field-of-vision overlaid video), opaque AR goggles are a non-starter for most people. Seeing your surroundings only through a camera and screen is unacceptable. Not to mention the fact that people around you can’t see your face either.
* Unless other people have these glasses, they can’t see your screen either. Even if these glasses have no shortcomings whatsoever, I can’t imagine a future where everyone in the room needs compatible glasses to look at the same screen.
* Few people actually need enormous amounts of screen real estate, beyond what’s practical to pack in a laptop. The use cases you’ve mentioned (watching movies on the go, logging into device consoles) are fairly niche IMO. There would have to be some compelling new UI paradigm (e.g. truly 3D displays) that’s only possible at massive virtual display scale.
> Seeing your surroundings only through a camera and screen is unacceptable.
Without qualification this statement is strange. I don't care if I'm seeing through a camera and screen or through a transparent piece of plastic. What I care about is image quality, distortion, FOV, latency, etc.
So the question is surely a more nuanced one about the relative importance of each of these factors?
> Not to mention the fact that people around you can’t see your face either.
When I'm working I'm not sure this is a huge deal.
> Unless other people have these glasses, they can’t see your screen either. Even if these glasses have no shortcomings whatsoever, I can’t imagine a future where everyone in the room needs compatible glasses to look at the same screen.
Ah. I get it - you're assuming there's other people in the same room as you! Why would you make that assumption for this device? I rarely work with anyone in the room, and when I do, they tend to be working on their own stuff. My collaborations are 90% remote (and for the 10% I can take the headset off)
>I don't care if I'm seeing through a camera and screen or through a transparent piece of plastic. What I care about is image quality, distortion, FOV, latency etc.
There is currently no camera/display tech that is remotely close to delivering dynamic range, latency, refresh rate, and resolution anywhere close to what our eyes can perceive.
>>Not to mention the fact that people around you can’t see your face either.
>When I'm working I'm not sure this is a huge deal.
It is certainly a big deal for many people to be able to maintain eye contact with people they're speaking with in person.
>Ah. I get it - you're assuming there's other people in the same room as you! Why would you make that assumption for this device?
I think you are the one making several constraining assumptions here:
1. this device is just for work
2. most people work physically alone
3. in the rare instances that people work together, they almost never look at each others' screens
Re: 1, OP's vision is that AR goggles may ultimately supersede displays entirely, which would make it difficult to do things that most people commonly do today, like show their friends things on their phone.
Re: 2, this is also not true; as of February 2023, it's estimated that 30% of all office work occurred remotely [0].
Re: 3, this is also not true, especially outside of the tech industry, where people collaborate in-person on documents/spreadsheets/presentations all the time.
I see a whole host of impractical impediments: knowing Apple, only Apple glasses will be able to have a shared reality experience, so all the people in the room with non-Apple glasses will be left out. There will also be version incompatibilities as the tech evolves, leading to a whole host of awkward scenarios: "sorry friends, I can't show you this funny video, I have the iGlass 5 but you guys only have the iGlass 3, and you guys have Samsung Galaxy Glass." Furthermore, since these glasses will have to be networked, you'd likely need a separate pair for work and personal life, which creates a host of incompatibilities ("sorry coworker, I can't show you my photos from the weekend, since I took them with my personal glasses and you only have your work pair on you.")
> OP's vision is that AR goggles may ultimately supersede displays entirely, which would make it difficult to do things that most people commonly do today, like show their friends things on their phone.
This is the context I missed. I wasn't thinking "all day" usage - but then I don't see that being especially feasible even for transparent AR for quite a while (for different reasons).
Honestly, I think you're just pulling some of these user preferences out of nowhere. Or at least you're greatly underestimating what is acceptable. It's not like people don't use sunglasses and ski goggles. Eye contact is nice, but it has proven not to be necessary in many, many cases.
The reason a Quest Pro is a pain is the weight and passthrough quality. You get used to not seeing someone's eyes fairly quickly.
>3. in the rare instances that people work together, they almost never look at each others' screens
This is a huge one, but I think it's a workable problem. Hopefully Apple lets you 'cast' your screen to others for a shared experience. It makes a huge difference.
>Unless Apple has figured out true mixed reality (i.e. transparent lenses with full field-of-vision overlaid video)
This sounds like a very early adopter/developer version 1. The tech for what you describe exists, but this particular version isn't going to be it. Which I'm basing on Apple themselves: leaks say they're thinking in terms of hundreds of thousands of units in a year for this; for contrast, they sold something like 220 million iPhones just last year. The question is whether they can achieve an MVP and get iterating.
>Unless other people have these glasses, they can’t see your screen either. Even if these glasses have no shortcomings whatsoever, I can’t imagine a future where everyone in the room needs compatible glasses to look at the same screen.
I can, very, very easily. But in the long interim between now and then, I can imagine remaining legacy screens which people already have for that. Because we already have them. It's not a zero-sum game, having wearable displays too doesn't somehow make existing screens shut off.
>Few people actually need enormous amounts of screen real estate
Utterly ludicrous. The only reason people use small screens is portability, size (not everyone physically has room for a large display), and cost. If there were two 12" notebooks, one of which had a conventional matching screen like now and the other at the push of a button could pop out a 30" magic solid hologram display (resizable to whatever one wanted) at zero extra mass or stored size cost for the same price, there are zero people who'd pick the former. Everything everyone does with computers and gaming and movie watching and so on benefits from full resolution arbitrary size windows. It's just that right now there is no way to separate that from the physical size of the display hardware itself.
>watching movies/playing games/reading books/doing work is niche, and wanting to do it privately is niche
I am not aware of this technology; can you point me to it?
>>Unless other people have these glasses, they can’t see your screen either. Even if these glasses have no shortcomings whatsoever, I can’t imagine a future where everyone in the room needs compatible glasses to look at the same screen.
>I can, very, very easily.
Currently, if your own device doesn't work, it just means you can't access your own content. With connected AR devices, it would mean that you also can't access anyone else's shared reality. Here are some common ways I foresee AR devices not being easily interoperable:
* Knowing Apple, only Apple glasses will be compatible with each other, so all the people in the room with non-Apple glasses (or glasses that are too old) will be left out: "sorry friends, I can't show you this funny video, I have the iGlass 5 but you guys only have the iGlass 3, and you guys have Samsung Galaxy Glass."
* Battery life on these had better be phenomenal, since you're cut off from a large portion of reality if it dies.
* These glasses will have to be networked, with close to zero latency, which is always tricky business—even mature technologies like wifi are far from perfectly reliable today.
>But in the long interim between now and then, I can imagine remaining legacy screens which people already have for that. Because we already have them. It's not a zero-sum game, having wearable displays too doesn't somehow make existing screens shut off.
I totally agree. This is the future I think is most likely — some sort of clever hybrid between physical screens and AR screen extensions.
>>Few people actually need enormous amounts of screen real estate
>Utterly ludicrous.
You are right. I should have said, “given the tradeoffs of AR glasses, few people need that real estate.”
>The only reason people use small screens is portability, size (not everyone physically has room for a large display), and cost. If there were two 12" notebooks, one of which had a conventional matching screen like now and the other at the push of a button could pop out a 30" magic solid hologram display (resizable to whatever one wanted) at zero extra mass or stored size cost for the same price, there are zero people who'd pick the former.
I totally agree with this. If this technology were a magic hologram (i.e. zero tradeoffs), it would be totally game-changing.
> watching movies/playing games/reading books/doing work is niche, and wanting to do it privately is niche
You don't need a giant screen with tons of tradeoffs to read most books or do most work. Gaming and movies would benefit, but most people do not spend the majority of their time on a computer gaming or watching cinematic movies.
I share that vision but I don’t see the current Apple doing this.
With their performance advantage they could easily work towards something like Samsung Dex with a compute core that you plug into different terminals, or macOS on the iPad.
But they choose not to, most likely trying not to cannibalize different device categories.
I would love to be wrong but I don’t see this happening for many years until some Chinese brand offers cheap alternative glasses and everyone runs around with a battery powered Mac mini.
Besides that, we also obviously don’t know how long it will be until the battery and display tech is good enough to compete with screens.
I don't have a crystal ball either. But there is no one in tech at this point who hasn't read The Innovator's Dilemma. And Apple themselves have as good a recent history of disrupting themselves as well as others and shifting tacks, repeatedly, as any other megacorp. They are clearly awake to it. Doesn't mean they can execute, but I don't agree there is any sign of them "trying not to cannibalize" vs just not thinking the timing is there yet.
Granted the question of timing is of course always one of the huge challenges too, someone can have 100% the right idea, product segment focus, etc, but be too early (or too late) and fail. But again, Apple has to have as much insight into all of this as anyone, they've built deep, deep manufacturer relationships. As with some of their previous products, the right time to get going is probably "a bit early, but not too much". Ie, good enough to satisfy a few hundred thousand people, not just thousands/tens of thousands, but early enough that everyone else thinks it's not ready yet. Then there is enough qualities/volume for real feedback and getting the iteration ball rolling, but not so much that it means making major changes isn't doable yet. If I think back to Mac OS X (I still vividly remember both the Public Beta and 10.0), the iPod, the iPhone, the iPad, or even a lot of Mac models (Macbook Air v1 definitely had a host of "this needs more hardware" aspects), that seems to be around the point they always aim for.
Again no special insight on if they've gotten that moment right here but it certainly doesn't feel like it couldn't be given where things are at. And my confusion is people who apparently can't see any value at all, regardless of improvements. It's one thing to say "I think they're 3 years too early for MVP", it's another to say "what's the point of wearable displays ever".
It’s hilarious: I’ve got a pair of Nreal Airs and I use them all the time; the ergonomics and comfort alone are so much better than a standard monitor.
Yes, exactly. They've been foreshadowing this with developers for a while with ARKit. It will take several iterations, and likely as-yet-nonexistent new technology, to make this a true mass-market product. But it's not like it would make sense for them to spin on prototypes for a decade; at some point they have to ship and take it from there.
Nowhere in this comment do you mention Apple Music, Apple TV+ or the App Store -- these are the areas Apple is trying to drive traffic to with their hardware. I don't expect them to even bother to give lip service to the kinds of use cases you're describing.
It's some tiny Twitter (blue) user posting some nebulous list. Seems pretty BS to me, and it's surprising HN falls for this. Okay, it actually isn't surprising.
It's actually funny how confirmation bias works: in other posts people say "this tracks with other rumors..."... yeah, that's what people who make stuff up tend to do.
A VR headset is just a really big screen. Imagine how much stuff you can leave on your desktop if it’s a perfect sphere and you can sit in your office chair spinning round with it all whirling past you…
Is "rear view mirror" a joke, or maybe just a lower res cheaper camera pointing backwards, or maybe this leak isn't legit?
Either way, looking forward to seeing the headset!
This is an English translation of the original Chinese leak [1].
A lot of formal/informal colloquial terms get lost in translation. Yes, you could be right - the wearer of the headset would be aware of the environment behind their head, and this could be the camera system for that.
My favourite (possibly apocryphal) example of meanings lost in literal translation is "water sheep".
(Hydraulic ram).
Through an unrelated anecdote, in my head this is now bound to the music of "smoke on the water", with a sheep impression for the guitar (bass?) — "Sheep on the water, ba baa baaah, ba ba ba baaah"
I have a VR game I've been working on for 3 years. I planned to have it ready by whenever Apple had a headset ready, but if this thing costs $3000 Idk.
I wish they could offer a subsidized devkit or something
You should probably focus on the already existing market, anything else is a waste of time unless Apple has invited you to prepare something. Otherwise that's just a whole lot of wasted resources on an unsure userbase.
It would be shocking to me if Apple released an AR headset without at least considering Unity and/or Unreal support. Think about how long it takes to make a proper VR game, let alone a quality one.
$1600, wow! I should confess that even though I now use my Oculus Quest 2 only for Beat Saber once a month, buying it for $300 two years ago was a real bargain.
Your eyeball is a screen. And, well, screens are screens. Both work by restricting ambient light and focusing an image at a specific focal length.
Air is ... many things, and useful and all that, but it is not a screen. Fill it with particles and you can get traced patterns through it (e.g., laser through smoke or arclamps through fog), but that's not screen-like projection.
And under any circumstances in which ambient light levels are high, the darkest possible ground is ... the ambient light level. Which means either that the entire image is washed out or it's absolutely blasted with light.
... amongst what I'll term displays, which is to say, an updatable presentation medium, there are generally illuminated and ambient variants. These date back to scratching on sand or dirt, erasable wax tablets (used by Greeks and Romans for education), slates (common amongst students in the 18th/19th centuries, roughly), and of course modern variants.
I use "illuminated" to describe any display in which the information content is expressed within the light incident on or emitted from the display. This would capture all of, say, a traditional CRT video display, a projection screen (where the image is reflected off the display surface), rear-projection displays (see 1990s-style projection televisions), Nixie tubes, neon (and related) signage, and discrete-element displays such as LCD or plasma screens. What this captures is the distinction between a CRT display in which an image is painted (using an electron gun and colour mask) and an LCD display in which individual pixel elements are activated. I was unhappy with how I'd considered both as equivalent in my initial comment. I would argue that both are screens, but what's really common is that the illumination carries the signal.
Ambient displays have a content encoding mechanism distinct from any illumination, though that illumination might be integrated into the display itself, such as with backlit liquid-crystal displays or "frontlit" e-ink (electrophoretic) displays. Here the surface itself is physically altered and ambient light is used to reveal that arrangement.
Again, your eyeball is a screen (incident light carries the image, see the camera obscura for a physical analogue), air ... is not.
If you wanted to consider how you might convey information via thin air, it seems that there are a few possible options:
- One could create a temporary screen surface within air and project onto it. The water screen display used in tunnel excess-height warning systems is one such example. Note that the prominent visuals for that system show it at night or under limited ambient light. The effect under direct daylight would be far less prominent. Other possibilities include some sort of mesh or hanging thread pattern onto which an image could be projected. This is actually used to an extent in some IMAX theatres, particularly domed screens, in which the "solid" dome is actually a mesh with many holes in it, which both reduce overall mass and permit greater ventilation of the theatre space itself. At viewing distance, the holes are undetectable. All of these still require low ambient light. Some of these options might be viable where a natural framing opportunity exists, e.g., a beaded door curtain or similar) exists. But that's not a generalisable method which an individual could carry with them.
- One could arrange for some particle swarm to organise within the message space to carry the image. Illuminated drone swarms are an example used now in entertainment and advertising (same low-ambient light requirement), though a reflective pigmented alternative might be conceived of. This ... would probably be difficult to maintain, and is effectively like spraying pigment into space. Either you're going to require a lot of ink, or the pigment elements would have to be both "smart" and movement-capable such that they could arrange themselves. In practice, static displays (print on paper or similar media) or dynamic ones (e-ink, transflective LCD, or similar) involving a very much non-thin-air medium is vastly more practicable.
How would that even be conceivably possible with the technology of today?
Look at the state of the art for projectors, they still require special reflective screens to reach their full potential. You're asking for what is still very much in the realm of science fiction.
This. They have all failed at this thus far, it naturally means that anyone trying will fail next time too. For eternity. It’s a well established law of nature at this point, that if new tech isn’t a massive hit within the first few iterations, it’s a wrap.
1.6k+ computers: high, but understandable. 1.6k phones: way too high. 1.6k + margin (3k?) for an AR/VR headset seems like something most people would never buy.
I highly doubt Apple or anyone else expects most people to buy it. It's a new market with no content and no clear use cases. So starting slow and bringing the price down over time seems like the best strategy.