Although I use git these days and wouldn't go back to P4 if you paid me, I still sometimes use p4merge as my difftool, and I always thought they did a great job with their client UI in Qt. So, yay for p4 and Qt.
Funny, while I love Qt and have been using it in our internal tools, there's some old lore and myth about p4win (the MFC version of the tool): it always felt snappier than its successor, p4v. And yes, p4win is snappier, but sometimes it's just the fact that you've used something for so long that it's hard to get by without it.
Just about any big game studio uses P4 these days. I haven't seen one that doesn't.
"Just about any big game studio uses P4 these days. I haven't seen one that doesn't."
That's a very interesting statement. I wonder why? Maybe it's just grandfathered into these companies. IMHO, even an enterprise software shop can handle things with git, given the proper tools and practices.
I'm currently having to use Accurev in an enterprise environment, but I haven't run into anything that Accurev offers that can't be accomplished with git in a better and faster way. But then again, that's just my opinion :-)
P4, as much as there are aspects of it I hate, handles binary files in the repo much better than most source code control systems do... especially git, which sucks at that[1]. Game companies tend to have quite large (by SCCS standards) binary assets mixed in with the code, and even when those are cleanly separated, they like the standardization of everyone using the same version control system rather than having something separate for the artists.
([1] Predictably, someone will want to respond to this with a link showing some complicated and non-intuitive way to get git to be halfway reasonable at handling binary files. That will just prove my point rather than disprove it, compared to P4, which handles binary files great out of the box with no thinking or planning required.)
Every time you change a binary file, git stores a complete new copy of it and attempts to compress it (even though it may already be compressed as part of the file format and won't compress any further). And because of git's distributed nature, every clone carries who-knows-how-many barely different versions of each binary asset, each of which can run from megabytes to hundreds of megabytes. Woe be unto anyone who has to clone the repository for the first time when coming into the project (be prepared to wait hours, maybe days).
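The copy-per-revision behavior is easy to see with git's own plumbing. A minimal sketch (repo name, file name, and the 1 MiB size are all just illustrative):

```shell
# Commit an incompressible ~1 MiB "binary asset", then change it slightly.
git init -q blob-demo && cd blob-demo
git config user.email demo@example.com
git config user.name demo
head -c 1048576 /dev/urandom > asset.bin            # random data won't compress
git add asset.bin && git commit -qm 'add asset'
printf 'x' >> asset.bin                             # tiny change to the asset
git add asset.bin && git commit -qm 'tweak asset'

# git now holds two independent full-size blobs, not a small delta:
git rev-list --objects HEAD |
  awk '{print $1}' |
  git cat-file --batch-check='%(objecttype) %(objectsize)' |
  grep '^blob'
```

The final pipeline prints one `blob` line per stored revision of `asset.bin`, each at roughly the full file size, which is exactly the duplication described above.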
There are various git methodologies and projects (e.g. git-annex) aimed at working around this, but by default git just wasn't designed to deal with big repos full of big files.
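As an example of what those workarounds look like, extensions such as Git LFS keep only small pointer files in the repo and fetch the real blobs on demand. A sketch of the `.gitattributes` it needs (the patterns are just examples, and this assumes the separate git-lfs extension is installed on every machine):

```
# .gitattributes -- route matching files through the LFS filter instead of
# storing every revision as a full blob in the repository itself
*.psd  filter=lfs diff=lfs merge=lfs -text
*.wav  filter=lfs diff=lfs merge=lfs -text
```

Which is to say: it works, but it's exactly the kind of extra planning the parent comment is talking about.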
OK, I get it. The hashing would become slow on each file (especially large ones). And since git is designed to give you the whole repo history, yes, cloning a repo with a lot of history would be slow.
It seems like two of the main reasons Perforce works better with binary files are that when you sync, you only get the version you're requesting (resulting in faster downloads), and that there's an option to disable compression.
Yes, that's exactly right. On the client side, P4 basically only cares about the current version (or whatever branched/changelisted revision you're working with), which for projects with a lot of files (especially binary files) is a vastly smaller set of data than the full history. git is fully distributed, with all of the pros and cons that come with that.
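Those two knobs look something like this on the Perforce side (the depot paths are just illustrative); a client only ever downloads the revisions it syncs, and the server typemap can mark asset types so they're stored full and uncompressed:

```shell
# Sync pulls down only the requested revision, never the file history:
p4 sync //depot/game/assets/...        # head revisions of the assets tree
p4 sync //depot/game/assets/...@1234   # or exactly the state at changelist 1234

# 'p4 typemap' opens the server typemap for editing; entries like these
# store matching files as binary, with +F meaning the server keeps the
# full file per revision with no compression:
#   TypeMap:
#     binary+F //depot/game/assets/....wav
#     binary+F //depot/game/assets/....mov
```

These are command/config sketches against a hypothetical depot, not something runnable without a Perforce server.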