There's a great scene at the end of Generation Kill where the war reporter asks the commander why he didn't reprimand one of the worst leaders underneath him more harshly.
The commander essentially says that he constantly receives conflicting reports about his leaders' performance, and that if he relieved one over unproven reports, his command would disintegrate, because even the good leaders have just as many bad reports filed against them by people who don't like them.
Summed up the proper exercise of authority nicely, imho.
That's one answer. Although a manager's technical expertise also has to come with minimal ego, so you can defer to your people when they know more than you.
The alternative is developing an intuition for bullshit. It's rarer, but I've had a few non-technical PMs / managers who were excellent at their job because they could suss out whether someone wasn't being honest.
Everything has to come with minimal ego. I don't see how that has anything to do with managers having technical knowledge.
And it's great if a handful of people can sniff out BS in a field they're not familiar with, but you can't write policy on the back of a few unicorns like that. (Yes, managing ego is hard too, but I believe low ego is more common to start with and more trainable.)
There is this scene in Das Boot where one of the crew members says that they are going deeper than the sub is rated for, and the captain says, "Don't worry, German engineering."
But right now we know neither (Bob or Steve being bad), and we hide that behind some objective metric that gets diluted at every level. We hide it in Steve's reviews, in Bob's reviews, in the post-weld reports, in the weld-quality audit report, in the inspection. Heck, even NASA oversight inspectors are probably downplaying some of the severity when stating it in their findings report (saying it's partially due to the welds, or that the welds "contributed" to the quality issues, etc.) instead of just saying "these guys got an amateur to weld the exterior." So this "Steve is bad" property gets spread out across all those various things, when we could just as easily have fired Steve because Bob evaluated him. Or maybe it doesn't even come from Bob; maybe the NASA inspector tells him, "Bob, whoever you got to weld this thing is an amateur. Fire him before you get someone killed."
And sure, does that mean Bob might get it wrong? Of course; he might even be a petty little tyrant with an ego, or he might just not like Steve because Steve made a funny comment about Bob's tie on his first day. But I don't think some things can scale and be optimized away as if we're all cogs in a machine, each with our own individual little tasks and performance metrics that get aggregated into the larger whole.