
Undoubtedly? Modern video games are pretty good, and there are a lot of them. They're mostly written in mainstream programming environments. Without further explanation, I don't see how the take is undoubtedly correct in that context.


Fortunately, well-respected game programmers like Muratori and Blow, who have written games that are more highly regarded by critics than BAC Skywalk and who also have more experience than you do, have spent thousands of hours providing that explanation. If you aren't going to listen when they explain it, you aren't going to listen to me either.


Thanks for checking out my portfolio! I have to admit, it's a bit out of date, but I also don't think that my background in games matters all that much in any specifics. I can talk to principles.

To try and understand where you're coming from, I'll make a few notes and you can tell me where you agree or disagree.

I think we'll both agree that Rust is probably not suitable for most parts of game development. Rust, while enabling "bug-free" refactoring with its expressive type system, also imposes that type system when you don't want it. Oftentimes, when tweaking a game, you'd rather try things out in a state where things could break than be forced to finish the refactor every time, and Rust doesn't have particularly ergonomic escape hatches for that. If there are specific components with well-defined inputs and outputs for your use case (say, maybe, an input system, or a rollback netcode engine), then I think Rust could be a good choice, but you'd also be paying when interfacing with whatever other tools you're using.

I think we'll possibly disagree with the idea that games need to be fast. They need to be fast enough to work! In terms of player perception, a stable 60fps is pretty much table stakes nowadays, even in genres that don't benefit hugely from such low latency. But "fast enough" is a different bar in different contexts. Mario is fun, and in 2025 I would not need to put much effort in at all to make Mario fast enough in ~any context someone will play it in. On the other hand, I'd probably have to put a lot of work into a photorealistic open-world RPG, or an RTS with tens of thousands of units. Many games live in between those two, and oftentimes there's reason to optimise some parts and not others. If a game runs at 60fps on a 5-year-old Android phone, which can often be achieved with Unity (modulo garbage collection hitches), I'm not going to spend extra effort optimising further.

Where I probably disagree most is with the idea that we currently err too far on the side of correctness. One thing you didn't note (and I'm not sure whether Muratori or Blow speak to it) is the difficulty of finding bugs. Games are a massive ball of mutable state and operations on it, and they're usually massively wide artifacts, played on a huge variety of target devices, whose full state space is hard-to-impossible to cover. Bugs are often non-trivial to spot when writing, to catch when testing, or to notice when a regression exposes them. If I were to guess, I've seen more time spent on resolving bugs than on writing features, from me and the devs I've worked with across my career.

I think in the iron-triangle-esque trade between "easy to write more game", "hard to write bugs", and "game runs fast", I personally bias towards the first two. Few games _need_ the last, and fewer still need it everywhere in the game. Scripting languages and higher-level langs like C# are pretty ergonomic for games (and the latter specifically, outside the Unity context, is pretty good in terms of optimisation capabilities too).

I'm unsure what made you think that I'd be unlikely to want to listen or discuss things with you, so if you do have notes there I'd be happy to hear them too.


Interesting and thought-provoking.

I don't think Rust is mainstream enough to be what they were attacking, especially 6 years ago or whenever Blow gave that talk. Unity certainly is, and they seem to reserve special scorn for it, maybe because it's so popular.

I don't agree that it's easy to make Mario hit a stable 60fps in any popular gaming environment. In the browser, it's easy to hit 60fps but impossible to keep it stable. And, as you concede, it can be challenging with Unity (or Godot).

Latency is a separate issue from fps, even when the fps isn't janky. With your PC plugged into a Smart TV, you can hit a stable 60fps, but typically with two or even three frames of lag from the TV, which is a very noticeable downgrade from a 6502-based Nintendo connected to an RF modulator and a CRT TV from 01979. And often the OS adds more! Three 60fps frames of lag is 50ms. The one-way network latency from New York to London is 35ms. Most players won't be able to identify that there's a problem, but they will reliably perform more poorly and enjoy the game less.

I'm skeptical of the Muratori crowd's implicit assertion that this kind of responsivity is inherently something that requires the game developer to understand the whole technology stack from SPIR-V up. I think that's a design problem in current operating systems, where it definitely does exist. And, while I'm skeptical of their dismissal of the importance of bugs, I'm confident that they're representing their own needs as accurately as they can.

But probably it's better for you to engage with their explanation of their own ideas than with mine. I might be misunderstanding them, and I don't have their experience.


Do you not feel as though that's incredibly anti-social? It's a very zero-sum "I'd prefer if things were better for me in direct proportion to how much worse they'll be for _others_" attitude.


A few quick q's, since I'm working on a game with a hand-rolled rollback impl (we have state copying, so we can do some nice tricks):

- Is there anywhere we can follow you about the clock-sync trick? I'd definitely love to be notified.
- On the adaptive delay, are there gameplay or rollback engine implications to variable delays? It seems somewhat "unfair" for a player to be penalised for poor network conditions, but maybe it's better than them teleporting around everywhere.

Good luck with the project! I'll hopefully have a fiddle around with it soon :)


Sorry for the slow reply!

I think Twitter might be the best place to follow for blog updates: https://x.com/MadeWithEasel - I will definitely be posting about every blog post there once I get there.

On the adaptive delays: I have a multiplayer game which gets about 100 players a day, and it has been interesting seeing how they have all reacted to various iterations of the netcode. The overarching thing I've learned is that latency is quite psychological; it's the difference between expectation and reality that matters. In other words, high latency doesn't necessarily mean an unhappy player, if they are expecting the high latency.

First though, the amount of rollback is limited to what the player's device is able to handle (there's yet another algorithm I've made for collecting statistics and estimating this!). Some devices cannot handle any rollback at all, so unfortunately sometimes rollback netcode isn't solving anything for those players. I think these are the cases where the adaptive latency is more important.
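As a very rough sketch of the general idea (the real thing is more involved than this): measure how long resimulating a frame takes, then cap rollback to what fits inside the frame budget. Every name and constant below is just illustrative, not the actual Easel code:

    // Very rough sketch: cap rollback frames to what a device can resimulate
    // within its frame budget. Constants and names are illustrative only.
    using System;
    using System.Collections.Generic;
    using System.Linq;

    class RollbackBudget
    {
        const double FrameBudgetMs = 1000.0 / 60.0; // 60fps target
        const double HeadroomMs = 4.0;              // leave time for rendering etc.

        readonly List<double> _resimSamplesMs = new List<double>();

        public void RecordResimSample(double elapsedMs) => _resimSamplesMs.Add(elapsedMs);

        // Use a high percentile rather than the mean so occasional spikes
        // don't blow the frame budget.
        public int MaxRollbackFrames()
        {
            if (_resimSamplesMs.Count == 0) return 0;
            var sorted = _resimSamplesMs.OrderBy(x => x).ToList();
            double p95 = sorted[(int)(sorted.Count * 0.95)];
            if (p95 <= 0.0) return 0;
            return Math.Max(0, (int)((FrameBudgetMs - HeadroomMs) / p95));
        }
    }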

We sometimes would have games where we have 3 people from the US playing happily together, and then 15 minutes later 1 person from South Korea would join, and the latency would jump up dramatically for everyone. The US players would feel the difference and become unhappy. The simplest way to explain what Easel does now is that it places the server at the weighted-average midpoint between all the players. So in this case, you can imagine that the server started in the US, and then moved 1/4 of the way towards South Korea (since 1 out of 4 players is in South Korea). I have found this to be the most fair, and the key thing is the players find it to be fair. It matches how they think the latency should be apportioned and so they are okay with it.
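To make that concrete, here's a hand-wavy sketch of the midpoint idea; the types and the flat lat/lon averaging are just illustrative (a real version would weight locations properly, work on the sphere, and snap to actual server regions):

    using System.Linq;

    struct Player
    {
        public double Lat;
        public double Lon;
    }

    static class ServerPlacement
    {
        // With equal weights this is just the mean, so one distant player pulls
        // the midpoint 1/N of the way towards them (the 1/4 in the example above).
        public static (double Lat, double Lon) Midpoint(Player[] players)
        {
            return (players.Average(p => p.Lat), players.Average(p => p.Lon));
        }
    }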

Recently though I added a feature which splits the world up into regions (it's more complicated than it sounds because the regions flex around the players a bit, see https://easel.games/docs/learn/multiplayer/regions). By default, you only play with players near your region, but you can switch into Roaming Mode and play with people all over the world. The trick here is, when the player chooses Roaming Mode, they are explicitly choosing high latency, which changes their expectations. When they get high latency, they expect it, and so they are happy. The funny thing is, the algorithm used to automatically assign them the same high latency in these situations but players didn't like it because they didn't have the choice.


I need to emphasise that this is dramatically different. Comparing the first clip of OoT to [1], I see:

- Higher framerate
- Real-time lighting
- Real-time shadows
- Higher render resolution
- Water ripples/splashes (unsure about these being new)
- Widescreen, motion blur, etc.

If your contention is "they're not an improvement on the graphics, just changes", then we can agree to disagree, but they're otherwise incredibly noticeable!

[1]: https://www.youtube.com/watch?v=nySI72vRl_4


Specifically, he's not just investing in startups, but investing in ones that have a factor that is either unknown or unknowable. That's a different strategy from YC's "Invest in promising founders" or the typical VC "Invest in early stage startups with good metrics".


It must have been "promising founders working on an idea with an unknown or unknowable factor", right?

I can't imagine it working out so well without that qualifier.


Most likely, I'm just passing on what's noted in the post. There's likely a lot more nuance to the actual decision-making :)


I believe the new legislation says that if GP were punished for not being available for such a call, that would be illegal and they could launch some sort of action. This seems totally compatible with GP's policy: the boss can call them outside hours, GP can respond, and if GP doesn't respond there are no consequences.


I believe this is a feature that exists with Copilot in VSC too.

I'm also not convinced it's useful. Either the commit message contains surprising information that cannot be derived from the changes, or it doesn't, and the message can be derived in real time or by reading the changes manually. In general, I'd prefer the former state of things; the commit messages I find useful contain the "why", which often isn't clear from the code changes.


They're in the process of swapping out their Mono fork for a modern Core fork, which I believe they're also planning to upstream changes to. A few years ago, someone demonstrated a 1-week hackathon version of a game build running on Core, and recently (2023-12), someone on the forum noted that they have an internal Editor running on Core.

Most modern large games use IL2CPP, the C++ transpiler; it's a requirement on iOS, nearly a requirement on Android, and almost certainly a requirement on consoles. Perf is often nicer and it's harder to mod/hack, which some devs like.

Sibling poster notes different .NET compatibility settings. Modern Unity versions allow you to be in either .NET 4.6(ish) mode or .NET Standard 2.0/2.1 mode. The latter comports more with Core's APIs, though it is pretty old at this point.

And, since we're on similar topics, Unity currently supports C# 9.0, minus a few minor features[1]. This is a massive improvement compared to a few years ago, where we were stuck with C# 4 or 6, and means you can write pretty modern and performant C#. Especially notable is Span support, a way of representing (basically) non-owning array slices.

[1] https://docs.unity3d.com/2023.3/Documentation/Manual/CSharpC...
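For anyone who hasn't used it, here's a tiny example of what Span buys you; nothing Unity-specific, just plain C# (assuming a profile where Span is available):

    // Span<T> is a non-owning view over existing memory: slicing doesn't
    // copy or allocate, and writes go straight through to the backing array.
    using System;

    class SpanExample
    {
        static void Main()
        {
            int[] samples = { 3, 1, 4, 1, 5, 9, 2, 6 };

            // View of four elements starting at index 2, with no allocation.
            Span<int> window = samples.AsSpan(2, 4);

            window[0] = 42;                 // mutates samples[2]
            Console.WriteLine(samples[2]);  // prints 42
        }
    }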


Funny enough, moving to il2cpp made it even easier to hack/mod/access, since now all class info, offsets, and other info are exposed easily with full type info. I'm just about done with a lib that can fully read all of the il2cpp-related data externally; it's been a fun project.


Troupo is citing facts; you're citing incentives/habits. While incentives are powerful and those habits have been borne out repeatedly by Apple, one of you has the clearly more documented argument. To be convincing, you need to demonstrate that Apple chose to deprecate Flash to help establish a walled garden. You've shown that they'd probably _want_ to, but that's still one step away.


Aah.. you are asking for documented, formal evidence. Unless someone leaks the trove of internal e-mails at Apple from the years before the previous decade, one isn't going to get this. Adobe was a competitor, after all. Quite a high barrier for an argument that favours Apple.


I disagree with Harriet Johnson in at least a few cases, but the entire thrust of that piece and her argument within it is "it's insane that people make judgements on my quality of life, without asking me, and say that I should not be born". You can make that argument and also believe that abortion is an appropriate tool for some cases.

