I definitely get what OP was trying to say. Bret presented something that sounds a lot like the future even though it definitely isn't. Some of the listed alternatives, like direct data manipulation, visual languages, or non-text languages, have MAJOR deficiencies and stumbling blocks that prevented them from achieving dominance. Though in some cases it basically does boil down to which is cheaper and more familiar.
I think the title of Bret's presentation was meant to be ironic. I think he meant something like this.
If you want to see the future of computing just look at all the things in computing's past that we've "forgotten" or "written off." Maybe we should look at some of those ideas we've dismissed, those ideas that we've decided "have MAJOR deficiencies and stumbling blocks", and write them back in?
The times have changed. Our devices are faster, denser, and cheaper now. Maybe let's go revisit the past and see what we wrote off because our devices then were too slow, too sparse, or too expensive. We shouldn't be so arrogant as to think that we can see clearer or farther than the people who came before.
That's a theme I see in many of Bret's talks. I spend my days thinking about programming education and I can relate. The state of the art in programming education today is not even close to the ideas described in Seymour Papert's Mindstorms, which he wrote in 1980.
LOGO had its failings but at least it was visionary. What are MOOCs doing to push the state of the art, really? Not that it's their job to push the state of the art -- but somebody should be!
This is consistent with other things he's written. For example, read A Brief Rant on the Future of Interaction Design (http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesi...). Not only does he use the same word in his title ("future"), but he makes similar points and relates the future to the past in a similar way.
"And yes, the fruits of this research are still crude, rudimentary, and sometimes kind of dubious. But look —
In 1968 — three years before the invention of the microprocessor — Alan Kay stumbled across Don Bitzer's early flat-panel display. Its resolution was 16 pixels by 16 pixels — an impressive improvement over their earlier 4 pixel by 4 pixel display.
Alan saw those 256 glowing orange squares, and he went home, and he picked up a pen, and he drew a picture of a goddamn iPad.
[picture of a device sketch that looks essentially identical to an iPad]
And then he chased that carrot through decades of groundbreaking research, much of which is responsible for the hardware and software that you're currently reading this with.
That's the kind of ambitious, long-range vision I'm talking about. Pictures Under Glass is old news. Let's start using our hands."
Okay. Single fingers are an amazing input device because of their dexterity. Flat phones are amazing because they fit in my pockets. Text search is amazing because with 26 symbols I can query a very significant portion of world knowledge (I can't search for, say, a painting that looks like a Van Gogh by some other painter, so there are limits, obviously).
Maybe it is just a tone thing. Alan Kay did something notable -- he drew the iPad; he didn't run around saying "somebody should invent something based on this thing I saw".
Flat works, and so do fingers. If you are going to denigrate design based on that, well, let's see the alternative that is superior. I'd love to live through a Xerox kind of revolution again.
I've seen some of his stuff. I am reacting to a talk where all he says is "this is wrong". I've written about some of that stuff in other posts here, so I won't duplicate it. He by and large argues to throw math away, and shows toy examples where he scrubs a hard-coded constant to change program behavior. Almost nothing I do depends on a value so simple that I could scrub it to alter my algorithms.
Alan Kay is awesome. He did change things for the better; I'm sorry if you thought I meant otherwise. His iPad sketch was of something that had immediately obvious value. A scrubbing calculator? Not so much.
Hmm, didn't you completely miss his look, the projector, etc.? He wasn't pretending to stand in 2013 and talk about the future of programming. He went back in time and talked about the four major trends that existed back then.
No. I'm saying there is a reason those things haven't become reality. They have much greater hidden costs than presented. It is the equivalent of someone dressing up in Edison's 20th-century garb and crying over the cruel fate that befell DC. Much like DC, these ideas might see a comeback, but only because the context has changed. Not being aware of history is one blunder, but failing to see why those things weren't realized is another.