The potential is to replace computers. In its current version, it is basically an iPad on your face. Look a little further forward and it is a laptop on your face. Imagine that the price came down to $1,500-2,000 in a couple of years; then you could buy an Apple Vision instead of a laptop. And you wouldn't need to buy a TV either. So this does have device-consolidation potential like the iPhone did, and it can tap into an existing market the way the iPhone tapped into the phone market.
I think previous AR/VR devices didn't quite have the right sweet spot of hardware features (too low resolution, tied to one spot, extra controllers), but this one looks like it might just do it. What it doesn't have is a low enough cost, so it will be a slow start. I'm also still curious whether there will be a "killer app" that encourages people to get into it, but the long-term vision of spatial computing is itself enough of a killer feature. I just wonder how long that will take.
> Imagine that the price came down to $1,500-2,000 in a couple of years; then you could buy an Apple Vision instead of a laptop. And you wouldn't need to buy a TV either.
I can compile code on my laptop. Can I do that on a Vision?
I can plug an Xbox into my TV, or watch it with 4 people. Can I do that with a Vision?
I think this replacing TVs is a really hard sell, except for a remarkably niche audience. Sitting on the couch together playing Nintendo just can't be replaced, and Apple surely doesn't want to allow third-party inputs; they want an internal app ecosystem, which Nintendo and PlayStation won't ever join (Xbox, maybe). Laptop replacement I can buy, but only some fraction of those, nothing large, and certainly no larger than iPhone market-share percentages.
My guess is that Apple was a vehicle to further his compiler and programming language plans. Now that he has done that, the work becomes mundane maintenance and incremental changes. So it is time to move on to a new challenge.
Swift has its problems but keeps getting better. I only get to use it for my at-home projects, since at work the portion of the codebase that is for iOS is Objective-C with no plans to switch any time soon. Old Obj-C too, and started by people used to programming in VB on MS platforms. So count your blessings.
I find Swift to be getting worse each year. While, sure, I'm blessed that I don't have to write applications in assembly, I have grown jaded towards Swift. If it weren't for the fact that I am an iOS developer, I would happily not use the language, and I have been slowly positioning myself away from doing iOS development. I feel like a massive corporation like Apple could provide better tools for writing apps for their walled garden than what they are providing me.
I'm not sure I would say Swift is the kind of robust that is needed for safety-critical software, but it is a nice step forward for application code. It mostly forces you to deal with things safely while still allowing Objective-C dynamic behaviors when you need to get around the restrictions (often to interact with other Obj-C code).
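For a feel of what I mean, here is a minimal sketch (my example, assuming an Apple platform where Foundation's Key-Value Coding is available): optionals force you to deal with absence up front, while Objective-C dynamism stays reachable when you need it.

    import Foundation

    // Optionals force the "can this be nil?" question at compile time.
    let maybePort: String? = ProcessInfo.processInfo.environment["PORT"]
    let port = Int(maybePort ?? "") ?? 8080   // both failure paths handled

    // But Objective-C dynamic behavior is still reachable when needed,
    // e.g. string-based Key-Value Coding on an NSObject subclass.
    class Box: NSObject {
        @objc var label = "hello"
    }
    let box = Box()
    if let label = box.value(forKey: "label") as? String {
        print(label, port)   // "label" is looked up dynamically at runtime
    }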
So, yes, I can see why one would say Lattner is at the forefront of making more reliable compilers and pushing shifts in quality standards in fast-moving Silicon Valley. It is an awesome achievement to create a new language that improves both readability and safety, and even more awesome to get it mainstreamed so quickly.
There are a few people who I would like to trade places with. Lattner is one of them, Musk is another. They both fulfill different parts of my long-held dreams. So I consider them both to be quite awesome. It's cool that they'll be working together too, I guess.
Sure, Lattner and Musk are interesting people, but I find the level of hero worship in the tech industry to be sickening.
Having used compilers for a few newer languages (Rust, Go, Dart, Kotlin, Swift), Swift is the only one I've had any issues with, and Swift seems to be the only one to have adopted the "move fast and break things" philosophy of Silicon Valley. I dunno, I just don't see the argument.
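To be concrete about the churn: the Swift 3 source break removed C-style for loops and the ++/-- operators outright (SE-0007 and SE-0004), so even trivial code had to be rewritten.

    // Swift 2 (no longer compiles after the Swift 3 source break):
    //     for var i = 0; i < 3; i++ {
    //         print(i)
    //     }

    // Swift 3 equivalent:
    for i in 0..<3 {
        print(i)
    }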
Lattner is best known for starting the whole LLVM project. Swift is just a small side project in comparison, it just got adopted by Apple for some reason.
LLVM is one of the most influential pieces of software of the past decade. Hero worship isn't good but credit where credit is due.
You're confusing language development with autonomous vehicle development. Think of the long-term goal. It's desirable to move fast and fix things in language development in the near term, to achieve a more perfect design and accelerate its maturation at the temporary cost of more volatility. Once this process achieves a high level of maturity, those design principles may offer a safer, more reliable programming system better suited to safety-critical applications.
Additionally, I'm sure we can all agree there is no substitute for maturation through time and usage in the field, which frankly is an argument for more popular languages over obscure ones. None of the ones you mentioned are ready for safety-critical system development (including Swift 3), but which one is most likely to achieve widespread adoption and field testing in the long run?
No, I'm not confusing them. I'm responding directly to the comment that Chris Lattner represents a more measured approach to software development than is traditional in the tech industry.
I don't think Swift stands to gain widespread traction outside of Apple-oriented app development. Aside from a lack of stability, Apple is too well known for boxing out its competitors. I've used and loved their products my entire life, and I know how annoying it is to go against Apple's grain.
>I don't think Swift stands to gain widespread traction outside of Apple-oriented app development.
It already is, though; there are several Linux web frameworks, etc. It's open source and community-run, so I'm not sure how they're planning to box competitors out of it.
There are some web frameworks that are in development. That does not mean Swift has gained any traction. Also, having toyed around with one, the experience was not great.
When writing a server, I would take Go over Swift any day. It outperforms it, uses less memory, it's simpler, and, oh, it uses a "traditional" GC.
That is very much _not_ the case according to the testing I have done recently.
Swift uses a lot less memory than Go unless the program uses only trivial amounts of memory in the first place. Using interfaces in Go data structures makes the difference even more pronounced.
On top of that, all runtimes that use a tracing GC require tons of spare memory at all times unless programs are very carefully written to avoid memory allocation.
That said, Swift has a few very weak spots when it comes to memory. Most notably the String type, which is terrible on all counts, but that is a whole different story.
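To illustrate the spare-memory point, a minimal sketch (hypothetical Buffer type, roughly 1 MB of payload): under ARC an object is freed the instant its last reference dies, so there is no collector that needs heap headroom.

    // Under ARC, deallocation is deterministic: no collector, no headroom.
    final class Buffer {
        let bytes = [UInt8](repeating: 0, count: 1 << 20)  // ~1 MB payload
        deinit { print("freed immediately") }
    }

    func work() {
        let b = Buffer()
        print(b.bytes.count)
    }   // b's last reference dies here, so deinit runs right here

    work()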
> On top of that, all runtimes that use a tracing GC require tons of spare memory at all times unless programs are very carefully written to avoid memory allocation.
Only if said language doesn't allow for stack allocation or static globals.
> Doesn't that effectively amount to manual memory management?
Not really. An example in Active Oberon:
    TYPE
      point = RECORD x, y : INTEGER; END;

    VAR
      staticPoint : point;                       (* On the stack or global *)
      gcPoint : POINTER TO point;                (* GC pointer *)
      noGCPoint : POINTER(UNTRACED) TO point;    (* pointer not traced by the GC *)
In Active Oberon's case, those pointers are still safe. They can only point to valid memory regions; think of them as weak pointers that can also point to data on the stack or in global memory.
This is in safe code.
If the package imports SYSTEM, it becomes an unsafe package, and then, just like e.g. Rust's unsafe, the rules are bent a bit and usage of SYSTEM.NEW() and SYSTEM.DISPOSE() is allowed.
Just as in any safe systems programming language, it is up to the programmer to ensure this pointer doesn't escape the unsafe package.
I still don't get how you can say that this memory is not under the GC's control but it's "not really" manual memory management either. Is it reference counting then? How does that memory get released?
It doesn't get released, unless you are doing manual memory management inside an unsafe package.
In a safe package it can only point to existing data, there isn't anything to release.
If the pointee is something that lives on the heap, it is similar to a weak reference: it points to GC data but doesn't count as yet another GC root.
If the pointee is on the stack or in global memory (the data segment in C), then there is also nothing to release. Global memory only goes away when the program dies; the stack gets released on return. Memory that was allocated by the compiler due to VAR declarations is static.
Usually the idea is that you use untraced pointers to navigate statically allocated data structures; they are not to be exposed across modules.
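For anyone more at home in Swift, a rough analogue of the same ideas (my mapping, not an official equivalence): withUnsafePointer gives a non-owning view of stack data, and unowned(unsafe) is a reference the runtime neither retains nor traces.

    final class Node {
        var value = 42
    }

    // Owned, reference-counted handle.
    let owner = Node()
    // Non-owning view: does not keep the Node alive and is not traced.
    unowned(unsafe) let peek = owner
    print(peek.value)   // valid only while `owner` is still alive

    // Stack data: nothing for any collector to release.
    var point = (x: 1, y: 2)
    withUnsafePointer(to: &point) { p in
        print(p.pointee.x)   // raw view of stack memory
    }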
This also sounds similar to M (aka MUMPS); see InterSystems' Caché (commercial) or GT.M (open source). It is an old programming language with a persistent key-value store, basically a NoSQL database combined with programming logic. From what I've heard, finance and health care software often uses it. It similarly puts the business logic in the database.
They also had their currency's value cut in half in 2008, and that had a very real impact on people's purchasing power. The currency has recovered a bit, but the change is still there and it's big.
There are also currency controls for taking money out, etc. Iceland is still in severe shock.
It would be interesting if candidates were not allowed to promise what they would do, but merely campaigned on their morals and good standing in the community, so we could pick people who would make intelligent decisions but not necessarily ones who would choose a specific way on a specific issue. That's what you're asking for here.
Which is why, for the next decade, I will not vote for any incumbent, or successive seating of a political figure in any office that I vote for. No repeats, no returns. I encourage everyone to do the same: ignore party and policy; simply don't vote for anyone who's running for re-election for anything, or, for that matter, looking to seat-switch where term limits are imposed.
After a decade, the shakeout from the real source of many of the problems (follow the money) would leave things in such disarray that maybe then we could talk about fixing the system.
Democracy is a form of government that can destroy itself. The French voted to put in an emperor their first time around. So we may have a good constitution, but if the people vote to make bad laws that break it, and permit those laws to function, then it is the people's fault, not the founders' fault.
I mean, the binary-choice clusterfuck that is modern politics is certainly the fault of the founders, and is certainly partially responsible for the current problems. And because of the reverence for the founders and the constitution as it exists, self-destruction and rebirth are looking likelier than meaningful repair of the system.