My background is aerospace engineering. Specifically, I have managed technical risk on a variety of aircraft fleets including SAR, military tactical aviation, and special missions aircraft. In a nutshell my job was to look at complex systems, identify likely failure modes, and come up with engineering solutions. I'm very good at it. Perhaps unsurprisingly, it turns out that identifying failure modes in complex systems is a highly transferable skill.
When I launched my first startup shortly after COVID, I realized that skillset makes me shitty at raising capital, because I tend to only see the faults in my ideas and end up with a "nobody would pay for this" mindset. But I also tend to easily spot the ways that tech companies' products can (will, do) go sideways. I am a perennial late adopter. I don't own an Alexa or really any "smart" equivalent because I knew early on that they would become dystopian surveillance devices and/or security risks.
My personal question to you: were you able to commercialize your error-finding skills? I am an electrical engineer, and I can predict, weeks in advance, when the shit will hit the fan. Nobody listens, and most of my job nowadays is cleaning up the shit. I am sooo tired of being right. I want to find a way to use my forecasting skill instead of fixing obvious things after the fact. One manager told me it's a normal process to hit the wall at full speed, then re-scope the project and move on from there. But damn… I don't want to waste my lifetime making obvious errors and then fixing them. To be clear, errors happen and that's fine. But most of them are easily predictable.
And I am so off-the-charts tired of being right.