Agreed - I’ve spent months working on a deep tech problem before I could even show a functional core, let alone put it into production. Some problems take time, experience, and foresight to solve, and the solution only dovetails at the end.
Exactly .... Agile works pretty well in the "execution" phase, when all of the requirements are known upfront. For any kind of R&D, agile is not a good fit at all; R&D fits the waterfall model better. Ideally you'd do your R&D with waterfall and then use agile for execution.
In my aforementioned example, the set of development practices most often called "agile" was neither the right approach nor what we did. Agile, as defined by the ideas in the agile manifesto, might have been what we did. It would depend on who you asked and whatever their particular axe to grind was.
So I don't know, and quite frankly we didn't really care if we were "agile." All we found was that "agile" is simply too lossy of a descriptor for meaningful conversations about development process.
All the books and blogs about what people mean when they say "agile" haven't caused the industry to coalesce on a concise definition. That horse is out of the barn, and I don't expect we're getting it back.
You can be as cynical as you like; the general idea of (very) quickly iterating to stay focused on what your customer wants is a proven and effective way to build software.
It's not as effective if you're not working in software, or at least I don't know how effective it is, but that's not what this submission is about.
This may be true if you have no aspirations for your software beyond what your customer can imagine in the present. Almost seems like a methodology that has internalized learned helplessness.
If you think the process I'm describing here is, "Ask customer what they want, do exactly what they say." then I'm not being very clear.
"Don't assume that your team should build what it's told to build. Instead assume the opposite: nobody really knows what you should build, not even the people asking for it. Your team's job is to take those ideas, test them, and learn what you should really build." [0]
That's your team's job. Not to make guesses by assuming you know what's best, not to do what they're told by customers, but to run tests and use results to inform next steps. That's the entire purpose of a development team, and you simply cannot do this without a quick iterative cycle.
Actually, "my" team's job is 1/2 asking the customer, and 1/2 not asking the customer so we can invent stuff they haven't thought of and ideally can't/won't think of.
I think the reason people aren't fully agreeing with you is that there's a lot of important stuff where a quick iterative cycle in front of a customer eliminates the possibility of high-value outcomes.
That's only true if you're interacting with your customer in some weird, subservient way that has nothing to do with anything I've said at all.
The point you're trying to make isn't the novel insight I think you're trying to present it as. Of course you don't just build what your customer asks for blindly, of course you design with the future in mind, none of that is precluded whatsoever by iterating quickly to provide value and test what your customer needs.
The point I'm trying to make is a corollary to "A committee is a life form with six or more legs and no brain." It's an acknowledgement that sometimes the customer is bikeshedding from start to finish. It's the recognition that rapid iteration can mean a short planning horizon that traps an architecture in a local maximum that won't deliver on business aspirations.
In a past life I had developed 3 generations of tooling to support a certain complex and information-dense task that was essential to the business. Generational change was not incremental and required months/years exploring the problem domain and prototyping and backtracking and intentionally not consulting anyone. If I came out of stealth too early I'd be directed back to a gen 1 approach that was functionally and architecturally exhausted and could not meet the business needs at a reasonable cost. By the time I was done I understood the problem domain better than the business did. (I will stipulate that most of the time this is a bad approach--but sometimes it's absolutely necessary.)
My (aspirational) purpose at work is to surprise my customer with valuable and insightful solutions that they could not have arrived at incrementally, or with their methods, or with their team. Surely you can see how this is in direct conflict with rapid iteration in collaboration with them?
Except you're not inventing things, you're discovering things. That's a big difference. The best way to discover whether something works is to put it into the hands of customers as fast as possible while minimizing your expense in doing so.
I didn't know I was coming across as cynical? Just saying that I have no idea if we were "agile" because the term is so damn malleable. I do know we weren't a cookie-cutter copy of the popular SCRUM-inspired workflow that's commonly just called "agile" regardless of whether the team is being "Agile."
Ah okay. SCRUM is the square to Agile's rectangle.
You're "lowercase a" agile if you operate under the prioritization outlined in the Agile Manifesto. I think most of the people who've found consistent success with agile would say that's what matters is more about that than about XP or SCRUM or Kanban or whatever.
Those systems can help, but to be agile is to focus more on individuals, working software, customer collaboration, and responsiveness and less on processes, documentation, contracts, and plans.
I hesitate even to reword the items in the manifesto, just because I know it took a lot of very smart people a good while to agree to the specific wording, and it's seemingly stood the test of time.