> Systems built as Reactive Systems are more flexible, loosely-coupled and scalable.
I'm so weary of this kind of language. Literally everybody trying to sell you an architecture - good or bad - would probably describe it as "more flexible, loosely-coupled and scalable". Not to mention the open question: relative to what? Average? Every other possible approach? The approach currently seen as dominant?
I just wish people would skip the pageantry and go straight to the benefits/tradeoffs. If you elide or deny the existence of tradeoffs, that's only going to deepen my skepticism even further.
> Literally everybody trying to sell you an architecture - good or bad - would probably describe it as "more flexible, loosely-coupled and scalable". [...] If you elide or deny the existence of tradeoffs
"loosely coupled" is a tradeoff; when I hear about an architecture being "loosely coupled", I assume that it allows substituting different components easily, but I also assume that it may be harder to evolve that interface and make use of the full capabilities on both sides. Sometimes I want that (if I'm looking to replace one of those components), but sometimes I want something more tightly integrated.
Similarly, "flexible" is a tradeoff; some frameworks describe themselves as "opinionated" rather than "flexible".
It typically means there is a message bus that receives events and distributes them to various listeners, which are free to do whatever they want with them. I agree with the general premise of that, but yes, not all software or applications need such a super-generic construct.
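For concreteness, a minimal sketch of that construct, assuming a simple in-process bus with topic-keyed listeners (topic names and payloads here are made up):

```go
package main

import "fmt"

// Event is whatever payload the bus carries; these fields are illustrative.
type Event struct {
	Topic   string
	Payload string
}

// Bus fans each published event out to every listener registered for its topic.
type Bus struct {
	listeners map[string][]func(Event)
}

func NewBus() *Bus {
	return &Bus{listeners: make(map[string][]func(Event))}
}

func (b *Bus) Subscribe(topic string, fn func(Event)) {
	b.listeners[topic] = append(b.listeners[topic], fn)
}

func (b *Bus) Publish(e Event) {
	for _, fn := range b.listeners[e.Topic] {
		fn(e) // each listener is free to do what it wants with the event
	}
}

func main() {
	bus := NewBus()
	bus.Subscribe("order.created", func(e Event) { fmt.Println("billing saw:", e.Payload) })
	bus.Subscribe("order.created", func(e Event) { fmt.Println("shipping saw:", e.Payload) })
	bus.Publish(Event{Topic: "order.created", Payload: "order #42"})
}
```

The generality is exactly the tradeoff: the publisher knows nothing about who is listening, which is great for adding consumers and terrible for reasoning about what actually happens when an event fires.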
What's really funny is that the latest fad for those running microservices is to go to monorepos, to make sure that all their APIs are synchronized, which is effectively tightly-coupling them again.
Monorepos aren't a fad. They are strictly a super-set of functionality and capability over siloed repos, that also happen to simplify common development cases.
They just weren't feasible before modern tooling - and without it, their drawbacks significantly outweighed their benefits.
It's like calling electric-assisted bikes a fad. No, people always wanted motorized bicycles, but prior to advancements in battery technology supplanting ICEs, they were heavy, loud, and smelly. Nowadays, they are just heavy. Does this make them harder to use in certain contexts? Yes. Climbing stairs with one sucks. Does electric assist make them a better vehicle for getting from point A to point B in any city with hills? Absolutely.
i made this diagram in reaction to this manifesto 8 years ago but unfortunately it doesn't apply as well to version 2.0. still, i'd like to share it again - https://i.imgur.com/ll51WJ3.png
One thing that I think is missing here is to have high quality error messages, both in terms of machine and human readability. As systems grow larger, it becomes more and more infeasible to triage issues correctly without machine readability. Human readability of errors is critical for pretty much any system at any scale.
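As a rough illustration of what "machine- plus human-readable" can mean in practice (the field names and error code below are made up, not any standard schema):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// AppError carries a stable machine-readable code alongside a human-readable
// message, so alerting can key off the code while the on-call human reads the text.
type AppError struct {
	Code    string            `json:"code"`    // stable identifier for triage and dashboards
	Message string            `json:"message"` // what a human reads at 2am
	Details map[string]string `json:"details,omitempty"`
}

func (e *AppError) Error() string {
	return fmt.Sprintf("%s: %s", e.Code, e.Message)
}

func main() {
	err := &AppError{
		Code:    "PAYMENT_UPSTREAM_TIMEOUT",
		Message: "payment provider did not respond within 5s",
		Details: map[string]string{"order_id": "42"},
	}
	fmt.Println(err)          // human-readable log line
	b, _ := json.Marshal(err) // machine-readable form for aggregation/triage
	fmt.Println(string(b))
}
```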
I think there is a bit more than just one thing missing there. It looks like 3 or 4 loose thoughts poorly stitched together on a single sheet of paper as a snake-oil remedy for all your aches. You don't need to stretch your imagination too much to come up with counterexamples that would make the authors uncomfortable. E.g. some of the most performant architectures are used in financial exchanges, and they don't look anything like whatever this web page describes. Pretty much nothing there applies to people doing machine learning, working on games, internal services, prototyping and more – those systems can be and often are distributed, but not in the narrow view of the world presented there. Some insights like "message-driven as opposed to event-driven" make you question the author's knowledge of distributed systems.
Probably better to ignore it and rely on more credible sources.
Great idea. Here are some thoughts for a real Rambo coder manifesto, for the coder who never says never and always:
* writes resilient code that is simple
* embeds microservices in monolith to get the best of both worlds
* writes only distributed code with transactional consistency at optimal performance without deadlocks
* all systems are consistent, available and tolerate network partitions
* distributed messaging has one-and-only-one delivery semantics always guaranteed
* applies the same architecture and principles to the full spectrum of problems - from writing kernel code to the website for the local tennis club
* webscale
Applying those and only those principles will not limit you. It will free you from worry and improve your family life.
Any seemingly contradictory statements are a sign that you don't yet get it. You simply have to accept them as axiomatic truths and you'll experience the enlightenment that awaits. Trust me. I have a website so it must be true.
> These changes are happening because application requirements have changed dramatically in recent years. Only a few years ago a large application had tens of servers, seconds of response time, hours of offline maintenance and gigabytes of data. Today applications are deployed on everything from mobile devices to cloud-based clusters running thousands of multi-core processors. Users expect millisecond response times and 100% uptime. Data is measured in Petabytes. Today's demands are simply not met by yesterday’s software architectures.
Is this true? There are surely more companies than before, but is most of the programming done for applications like this? I always thought that at any point in time, the vast majority (~70%) of developers are working on internal applications with users ranging from a few to a few thousands. Is there any hard data on this?
Signed it too :) Can't remember a particular reason, though in our case (former cofounder) we were discovering the actor model, immutability, streams... tbh I remember a great deal of fun programming during those days, insane schedules and whatnot. Though truth be told, our most productive systems are Go these days (sold, stack still holding).
Are there any examples of this manifesto being implemented in a holistic way? I am trying to understand the bigger picture, as most of what I understand as reactive shows up as user interface libraries.
I don't think it's widely appreciated how vastly diverse and multifaceted the world of software engineering really is, and how that enormous diversity blows up any notion of one-size-fits-all. "Best principles for application design" seems almost as silly as "best principles for thought".
Good luck debugging that crap when you’re woken up at 2am by a page. Even magnificent Java stack traces won’t help. You’ll be stuck in “how the hell did we end up here”