Cannes Interview: Christopher Nolan on a new “unrestored” 70mm print of “2001” (filmcomment.com)
132 points by prismatic on May 19, 2018 | 68 comments


The comparison to paintings in an art gallery struck me as the most compelling argument for viewing film in a theater, rather than a digital copy in a nice viewing room. That is the medium the art was made in, so there's an intrinsic value in viewing it the way the artist intended. I hadn't thought of it that way, and he's right: no one tries to pass off a print as being the same thing as the actual painting.

I also found Nolan's rejection of the term "consume" interesting. Here's a term that the industry has been really pushing, likely to keep in line with the view that watching a film is a single-use thing, and he dismisses it out of hand.


"Consume" and "Content" are the two most disgusting words I can think of in the context of art and more generally entertainment. They are words invented by advertisers to reduce artistic works into commodities and people into passive zombies. They are a misanthropic conception of the way that people interact with artistic works, and we seem to put up with it because they make great DB column names.


"Consume" for media is a wrong word. After you consume something it should be gone. You can be consumed by disease, meaning you're becoming less due to it.

Movies, books, songs are not less when you enjoy them.

Instead of saying you "consume" art, you can say you "spend time with" art.


Well, that's when you view something as art. I consume a lot of movies, but I spend a lot more time with Christopher Nolan's films, and rewatch them, in some cases multiple times.


The analogy of seeing an original painting in a gallery vs. looking at a print is appealing; however, I can't help thinking that we're comparing apples and oranges here: paintings are 3D objects and prints are 2D reproductions. All image projections are reproductions, and the colour and textural properties inherent in photochemical film processing are subtle, and trivial to reproduce digitally.

I'm fairly certain Nolan himself could be fooled in a double-blind test between 4K digital and 70mm (assuming he didn't oversee the transfers, haha).

This is more about _knowing_ how the picture was produced, and that it was reproduced faithfully, and the excitement that comes from that for those who care. The placebo effect is real :)


No, it's pretty easy to tell on a big enough screen. What's impossible to distinguish in a double blind test is FLAC vs good lossy compression or high end reel-to-reel against high end digital.


So then does the argument come down to a question of bitrate? Can we push digital projections beyond the point of human visual discernment, like we have with audio?
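
Back-of-envelope, assuming ~1 arcminute of visual acuity (20/20 vision) and a flat screen; here's a rough Python sketch, where the screen width and seat distances are made-up examples:

    import math

    def pixels_needed(screen_width_m, viewing_distance_m, acuity_arcmin=1.0):
        # horizontal angle the screen subtends at the eye, in arcminutes
        half_angle_deg = math.degrees(math.atan((screen_width_m / 2) / viewing_distance_m))
        return 2 * half_angle_deg * 60 / acuity_arcmin

    print(round(pixels_needed(20, 15)))  # ~4000 px: a mid-theater seat is roughly 4K-limited
    print(round(pixels_needed(20, 7)))   # ~6600 px: front rows can out-resolve a 4K projector

By that crude measure, 4K is already marginal for the close seats.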


Yes, but I don't see many 16K projectors around.


Cinema projection never actually reaches that level. The focal plane never aligns perfectly, the image weaves, the projector vibrates, and its lenses have edge artifacts. A 16K scan of the film would look far better in Oculus VR than reprojected in a theater.


I wouldn't use the optics of commercial VR headsets as any kind of quality benchmark.


That argument would perhaps be valid if someone went to the trouble of creating a super-hi-rez simulation of the film theatre experience. To my knowledge, no one has done that.


There are paintings that are meant to be viewed in galleries, but they are relatively recent. Historically, most paintings were commissioned by individuals and institutions, i.e. they were painted for patrons. There's about three centuries between the painting of the Mona Lisa and the opening of the Louvre. Or to put it another way, Renaissance paintings in an art museum are somewhat analogous to 2001 on an iPad, and a person could make an argument [1] that part of what makes modern art modern is that it is painted for a world in which art gets displayed in museums.

[1]: I'm not making that argument. The argument I am making is that a person can make that argument.


I get the attractiveness of seeing this particular film in a form that is as close as possible to what the director wanted the audience to see. But the analogy only goes so far. Much of chemical film photography goes through the digital domain for manipulation. Almost all magazine photography goes into a digital production process. Many artists digitize their paintings and sell "giclée" prints to hit lower price points and extract a lot more revenue from popular paintings.


After reading that, I have a new respect for Nolan as an auteur. He clearly believes in the art of film-making, and not just throwing out big-ticket movies. I've much preferred his more cerebral work, such as The Prestige and Interstellar, over big-ticket stuff like his Batman trilogy (and I'm a HUGE Batman fan). I hope he continues down this artistic path.


In case you aren't already aware of it, you should try to find a copy of his first film, "Following". It's extremely low-budget, but aside from that, showcases good direction, interesting camera work, and a fantastic, Nolan-esque plot.

Like Clerks, its creation story is super interesting, and a testament to how little it costs to actually make a film if you're super creative and really determined.


Another good Nolan film is "Memento". My favourite of his works.


Personally I found Interstellar to be more pretentious, preachy, and illogical than cerebral. I mean, I can suspend disbelief for sci-fi tropes like time travel, but don't ask me to believe that a single incurable pathogen could infect every staple food crop.

Not one of his better efforts.


There are dozens of us!

I agree, I really didn't get drawn in by Interstellar much at all. Although I only watched it on a home "theatre" and not an imax blow-your-mind "experience", so maybe I missed out on something that the bigger screen would have given me?

It's just weird that a rapidly-falling-back-to-agrarian society scrabbling for resources would also be able to secretly maintain such a high-technology base. I was waiting for a 1%-esque hidden society to be revealed leeching off the masses or something.

I'm not sure what the agrarian bit meant, I felt like it could be dropped without losing much of the film. I would have believed a dying Earth with every effort thrown into trying to colonize Mars, but Martians get cancer and the colony keeps failing or something. Then we discover this wormhole with what looks like Earth-like planets right on the other end! I could believe we'd throw a lot into a last ditch effort to find a planet we could live more easily on.

I probably over-thought it :)


> watched it on a home "theatre" and not an imax blow-your-mind "experience"

Whenever I hear that a movie needs a theatre with a screen larger than the Arecibo telescope for the public to enjoy it, I already know that's a bad movie.

All the other things you mention are spot on, too. Simply put, many science fiction movies lately have stopped making any sense, as if (as if?) the public were completely unable to tell the difference.


> don't ask me to believe that a single incurable pathogen could infect every staple food crop.

Not to mention that instead of fighting it, they want to build gravitational propulsion, all by themselves in a secret location, to move humanity somewhere else. So much for low-hanging fruit.


Also somebody told Hans Zimmer to turn it up to 11. There was no need for that, ordinary Hans Zimmer is plenty (see e.g. Crimson Tide) but Interstellar is permanently at maximum Hans Zimmer. I presume this is Nolan's fault.


Plenty of enjoyable vignettes, but a disappointment as a whole.


I would argue that if you can set aside the pop-culture legacy of the character of Batman--that feeling of "can something this popular be actually good?"--The Dark Knight is a better movie than Interstellar.

Interstellar has some beautiful shots and inspiring moments, but I think the plot and characters are not as well-realized. As much as I love the subject matter, I think Interstellar is one of Nolan's weaker films.


I would put them both more in the middle than most - I hate the magic aspect of the ending of Interstellar, but I also find the Joker's final act uncompelling, especially the boats rigged with explosives - but the aspect of both that I appreciate is a large-scale ambition that's rare these days (and that 2001 certainly also has). Technology has made it possible for low-budget ambition to produce high-quality results, which is fantastic, but there's also something about what you can do when you throw hundreds of millions at a more audacious concept. I found Dunkirk disappointing because it was (by nature) much more straightforward.


I also prefer those movies, but Nolan’s Batman films were good as well. Not on the same level, but still good!


Honestly, watching The Dark Knight for the first time as a kid, I found it highly compelling.


> He clearly believes in the art of film-making, and not just throwing out big-ticket movies.

Except that's what he does. Sorry if this sounds ad hominem, but he could spend less time expounding on the difference between 70mm and digital and instead concentrate on making movies with plots that make some sense, rather than ones intended merely to please the broadest possible audience.


“A lot of the information is thrown away when you digitize. In sound terms it’s overtones and subtones—things that you can’t consciously hear. An analog medium has all kinds of complicated cross-talk between the different frequencies of information that you’re getting, which have a particular character to them”

Similar to audiophile hokeyness. The dust and dirt on the film must add warmth and character, similar to the hiss and warble of analog tape. If that’s your thing, fine, but having grown up listening to tape and records, the first time I heard a CD, my jaw dropped. It was really that much better.

Someone told me once the best record will always be better than the average CD, but the best CD is always better than the best record.

I do concede the "loudness wars" have really screwed up digital audio, but it is fundamentally better, and I'm sure uncompressed digital video is the same.


No, he's right about this specific thing. 8K scans do not outresolve 70mm film in terms of lp/mm. 4K projection is just inadequate.
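
A crude Nyquist-style sanity check supports that; a sketch, where the frame width is the standard 5-perf 65mm camera aperture and the 80 lp/mm resolving power is only a rough assumption (stocks and lenses vary widely):

    frame_width_mm = 52.5  # ~5-perf 65mm camera aperture width
    lp_per_mm = 80         # assumed resolving power; real-world figures vary a lot
    pixels = frame_width_mm * lp_per_mm * 2  # Nyquist: 2 pixels per line pair
    print(pixels)          # 8400 px: just beyond an 8K (7680 px) scan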


I’ve watched 4k digital films at two exceptional digitally equipped theaters:

https://cinerama.com/Technology.aspx

https://www.geekwire.com/2015/photos-inside-one-of-the-world...

And I swear each time I’ve seen noticeable digital artifacts (usually where text or some long swoopy line is involved). The digital projection has never given me a “holy crap” hair standing on end visual response.

Not so with 70mm or 70mm IMAX (RIP Seattle’s equipment). I saw Interstellar in 70mm IMAX, and the space scenes really were stunning. Like, clearly amazing.

When we saw 2001 in 70mm at the Cinerama a few years back, several people in our group all remarked how the initial sunrise almost made you feel heat.

Yes, that could be my imagination. But I’ll add that I didn’t even know that I was watching The Master in 70mm a few years ago. I’d watched it with a friend, and I distinctly remember coming home and remarking to my wife how amazing the visual detail was.

I’m firmly in the “digital has yet to wow me visually the same as 70mm” camp. And I say this as someone who thinks CD was the most profound improvement in audio quality that I can remember in my life. So, I don’t think I’m just being nostalgic.


Digital (video) artefacts are definitely a thing. With the CD, we got to the point where over an hour of essentially perfect-quality stereo music fits on a cheap mass-produced object. Digital cinema is in a place where it can do something very good (much better, certainly, than what you'd see in a second-rate cinema after a picture has been out for a few months and the print has seen better days), but it's absolutely not perfect, and I can well believe it's not going to impress anybody who can see a nice new 70mm print.

Basically every frame of a digital movie is a huge JPEG. That's not quite exactly what's happening (digital cinema packages actually use JPEG 2000, compressed frame by frame), but it's close enough. So you don't have inter-frame compression artefacts, as you'd see on a DVD or YouTube video, but artefacts in the individual still images absolutely are present, albeit rarer because they're using a lot of bytes for each frame. And so yes, text would be especially likely to make this noticeable, as it is with JPEGs on web pages.
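
For scale, a back-of-envelope using the published DCI bitrate cap (treating all of it as picture data is a simplification):

    max_bitrate_bps = 250e6  # DCI maximum for a digital cinema package
    fps = 24
    bytes_per_frame = max_bitrate_bps / 8 / fps
    print(f"{bytes_per_frame / 1e6:.2f} MB per frame")  # ~1.30 MB of JPEG 2000

A raw 12-bit 4K frame is roughly 40 MB, so even at the cap that's around 30:1 compression, which leaves plenty of room for artifacts on hard edges like text.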


Have you ever heard a quality recording on a reel-to-reel tape machine at 15 or 30 ips? Digital cannot hold a candle to it.


The only compelling, objective reason to record to analog tape is for its distortion quality. This is particularly true at slower speeds. At 30 ips, I’m not sure it’s an especially useful medium.

You may also be able to make an argument for analog tape delay, though (saying this as someone who's been out of the engineering game for a bit, so take it with a grain of salt) that is digitally reproducible.

For playback, digital can hold every candle you can think of. Resolution, SNR, absence of any degradation, ease of use, etc., etc. Given sufficient bit depth and sample rate, it's as good as any medium ever needs to be (for human hearing).
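
The standard back-of-envelope for "sufficient bit depth" is the ideal-quantizer SNR formula, roughly 6 dB per bit; a minimal sketch:

    def quantization_snr_db(bits):
        # theoretical SNR of an ideal N-bit quantizer, full-scale sine input
        return 6.02 * bits + 1.76

    print(quantization_snr_db(16))  # ~98 dB: quantization noise sits below a typical room's ambient noise
    print(quantization_snr_db(24))  # ~146 dB: far beyond any analog playback chain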


Fun thing here:

There's a thriving cottage industry taking real studio masters from original tape (from the era when actual studio engineers would use tape) and making copies for "discerning" audiophiles like you to play on their reel-to-reel machines. Very authentic.

Except of course each time you make such a copy you'd damage the original irreversibly, and these are the real master tapes, when they're gone that's it.

So what they _actually_ do is make a single digital copy and then copy that back to make each of the tapes they sell. So you're getting an analogue copy of a digital copy of the original. By any reasonable thinking that's the worst of both worlds.

But don't worry, paying extra for something that's objectively worse is audiophile tradition. Keep it up.


Because digital reproduces the signal accurately while tape introduces a kind of degradation you find appealing?


Provide proof in the form of ABX logs.
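
For anyone unfamiliar: scoring an ABX log is just a one-sided binomial test against coin-flipping. A minimal sketch, with arbitrary example scores:

    from math import comb

    def abx_p_value(correct, trials):
        # probability of getting at least `correct` right by guessing (p = 0.5)
        return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

    print(abx_p_value(12, 16))  # ~0.038: conventionally significant
    print(abx_p_value(9, 16))   # ~0.40: consistent with pure guessing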


When arguing about digital vs analog, people miss the elephant in the room:

Why does listening to an actual violin being played a few feet away sound so much better than any digital or analog recording? It's not a little bit better. It's a LOT better. Same with a piano or a trumpet.


Because unless you're listening in an anechoic room - which is Not Fun - reflected ambience creates a hemispherical sound field around an acoustic source, and stereo recording only captures a small sample of that. (Enough to create a 2D spatial illusion, but no more than that.)

And also because loudspeaker technology is still pretty terrible compared to the rest of the audio chain. Converters are nearly perfect, amps can be made close to perfect at non-trivial cost, microphones are okay but usually add some colouration, but even the best studio-grade speakers have limitations and add significant distortion.

And consumer-grade speakers - even expensive ones - are nowhere close to perfection. And even if they were, you'd still have to deal with room resonances in the performance room, which would add extra colour to the sound that wasn't present in the original recorded ambience.

Basically hifi is good enough to be musically enjoyable in a near-enough not-too-distracting way, but you need research-grade technology - like complex wavefield speaker arrays and acoustic tuning of the performance space - before you can record and reproduce sound almost transparently.


> loudspeaker technology is still pretty terrible compared to the rest of the audio chain

Which also goes to my point that arguing about which is better, analog or digital, is utterly pointless.


This is absolutely true. I have had several experiences of knowing music was live at a distance of several hundred feet, through nothing but how it sounded. And like you said, it's easy to tell primarily with "analog" instruments like brass, acoustic strings, woodwinds, drums, etc. Basically anything where the sound comes from the vibration of something other than a speaker.

Come to think of it, maybe that's why? Maybe the intrinsic harmonics of speakers produce overtones different from those of the instrument's material? Or, similarly, maybe the recording microphone changes the harmonic balance compared to the human ear?


I thought I could tell too, but once I thought I heard live music coming from a long way off and it turned out to be coming from a relatively small portable speaker attached to a man. I was surprised, but now I no longer believe I can tell the difference from a distance.


The kinds of sound systems people commonly encounter aren't that good. Also, people bring their expectations with them when they listen to recorded music. With appropriate equipment and a suitable listening environment, it's possible to fool a blinded listener.


I also notice this effect at concerts. If they play the violin without amplification, it sounds much better than when they amplify it.


Some of his arguments are very fragile, but there's something about the race for technology that, I agree, distorts the art and the enjoyment.

Maybe it's a transitional period where people believe more K (4K, 8K...) makes better movies. I think not. It's like music: it's a balance. Maybe later the industry will get bored of chasing new features and moviemakers will go back to making better movies, not just better pictures.


When I was walking out of Ready Player One recently the trailer for Infinity War was showing - and it looked like exactly the same movie, with an indistinguishable CGI aesthetic, and similar action.

One thing you can say for 2001 is that it wasn't a copy of anything else. In fact it invented a lot of the visual tropes that are still used in science fiction movies.

That's not a thing that happens much now. There were some genius-level directors around in Kubrick's time, but now virtually all movies look like they live in the same CGI world, with standard stock characters, and similar directing tropes.

It would be huge fun if CGI stopped working for a few years, and directors had to go back to telling stories without it.


Part of this is also normal, in a way. There are periods when people have everything to invent, and then periods when people live within the codes established before them.

Music feels the same. After the '70s, most things had been done. The '80s brought some pop color and craziness to it. And now what is there to invent? Hard to say.

Maybe we've hit the ceiling, maybe there's still room for surprise...


Is there a list of theaters that will be getting the film? I know there's a 70mm-capable theater in my town that did the Hateful Eight when it was released, so there's a possibility...



http://www.in70mm.com has a possibly non-exhaustive list.


I don't mean to sound like an insufferable hipster, but '2001' was not supposed to be drenched in shades of orange and blue like half the movies made these days. From the screenshots I've seen, Nolan's team appears to have seriously screwed the pooch on the color timing, in an effort to force a trendy look on a timeless movie.

E.g., https://www.youtube.com/watch?v=oR_e9y-bka0


Playing in LA.

Any way to know when it is coming to Houston?


http://www.redballoon.net/h8-venues.txt

Check the two theatres listed here which allegedly have real 70mm projectors.

Also the Museum of Fine Arts Houston has a listing for it but it's not clear if they have a 70mm projector.

https://www.mfah.org/calendar/2001-space-odyssey


It's interesting how many people are stuck in the past and refuse to move to better formats. Because of that we still don't have 60fps movies :/


60fps isn't high enough for realistic motion. 60fps on a sample-and-hold display gives you obvious blur, and 60fps on a low persistence display gives you obvious flicker. It would be better to standardize on 120fps (as with Billy Lynn's Long Halftime Walk), because that gives acceptable motion quality even on a sample-and-hold display, and it's an integer multiple of common frame rates so it's backwards compatible with legacy equipment with minimum quality loss.
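
The "integer multiple" point is easy to check; a quick sketch (25 fps and 48 fps are included to show the common rates that don't divide evenly):

    for legacy_fps in (24, 25, 30, 48, 60):
        repeats = 120 / legacy_fps
        note = "" if repeats.is_integer() else "  <- needs pulldown"
        print(f"{legacy_fps:>2} fps: each frame held {repeats:g} refreshes at 120 Hz{note}")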


70mm optical prints from 65mm negatives are technically superior to any current digital projection standard; IMAX 70mm is even better.


Technically superior only for still shots. 24fps has such poor motion quality that all the advantage is lost whenever something moves or pans. Digital projectors are available for 4K 120fps, which produces more realistic video than any film format (even Showscan).


You could build a 70mm projector that runs faster, it's not like they're silent at 24fps.


Unfortunately, probably not. 70mm on a Vic8 can already be prone to tearing if not handled correctly, and I wouldn't trust tape splices at anything faster than 24fps. IMAX already uses zig-zag-shaped splices to survive the faster film speed of 8-perf 70mm.


I saw the Hobbit movies at 48fps and it was not good. There is something about our brains that makes us reject higher frame rate footage as being... subjectively unappealing.

A casual perusal of the reviews of the Hobbit HFR films reveals that most people agree with this assessment.

I don't think anyone who paid extra to go and see one of the HFR showings could be labeled as "stuck in the past", but at this point I am definitely not going to advocate for 60fps in the future. Not because I am resistant to new formats but because I like movies that I watch to be "good" and higher frame rate presentations (whether by frame interpolation or by original photography) seem to be less good.


I believe Kubrick would have loved the challenge of exploiting HFR. NASA lenses for Barry Lyndon and all that...

The Hobbit films were a terrible choice to introduce the tech to a mainstream audience for any number of reasons, the heavy dependence on makeup and other “artifice” for the production not least.


I think it's because of the motion blur. https://news.ycombinator.com/item?id=8793346#8794004 With only 24fps, we have come to depend on the motion blur in each frame to give us motion information. It can be manipulated intentionally (like reducing motion blur to make moves in a fight scene feel more abrupt), but if it doesn't match what we expect, it can feel weird.
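
The amount of blur per frame comes down to the shutter angle; a small sketch (180 degrees is the traditional default, the other values are for comparison):

    def exposure_time_s(fps, shutter_angle_deg=180):
        # fraction of each frame interval during which the shutter is open
        return (shutter_angle_deg / 360) / fps

    print(exposure_time_s(24))      # 1/48 s: the classic "filmic" motion blur
    print(exposure_time_s(48))      # 1/96 s: half the blur, part of the HFR look
    print(exposure_time_s(24, 90))  # 1/96 s: narrow shutter, the abrupt fight-scene feel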


On the other hand, I feel almost nauseous every time I see a panning shot in a standard 24fps film. They look choppy to an extreme that makes me wonder how anyone let that through production but I guess it's just standard now. I would really like to see the switch to higher frame rates made for that problem alone.


Watch all the Hobbit films in 48fps and then try to go back to 24fps, and you will see that 24fps now looks laggy and unnatural.


Missing the entire point of the interview.

Some of the older media/technology has qualities that the current tech lacks, and that makes a difference in artistic expression.

It's not just 'more is better'.

Even B&W is sometimes, objectively, a far better medium than color.


What's the argument for 60fps?

I know 24fps on a 24fps timeline mimics real life, and 30fps dropped to a 24fps timeline creates hyper-realistic movement.

60fps filmed on a 60fps timeline just looks like a first-generation digital camera. (3-CCD in the house!)

What are the benefits to it?


Panning shots and faster motion become much less choppy at 60. This is why sports are broadcast at 60fps today. Imagine if you could watch the Bourne Identity and actually tell what was going on.
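
To put numbers on the choppiness; a rough sketch, where the 12 degrees/second pan speed is an arbitrary but moderate figure:

    pan_speed_deg_per_s = 12  # assumed pan speed across the viewer's field of view
    for fps in (24, 60):
        step_deg = pan_speed_deg_per_s / fps
        print(f"{fps} fps: {step_deg:.2f} deg ({step_deg * 60:.0f} arcmin) jump per frame")
    # 24 fps: 0.50 deg (30 arcmin) jumps, ~30x the eye's ~1 arcmin acuity
    # 60 fps: 0.20 deg (12 arcmin) jumps, far less jarring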


On the other hand, this not being able to "actually tell what was going on" is a desirable feature for most films. Because what was actually going on was a bunch of actors feigning fist fights, etc. I think if 60fps is to catch on, the state of the art in film production is first going to have to improve so that it looks "realistic" even in live action. Otherwise it looks like a stage play; we accept the obvious artifice when watching stage productions, but we're used to far more realism in our films.


Digital might catch up with 70mm eventually, but it's not there now. There's a reason so many IMAX theaters still use film.



