This post just listed traits that are very prominent in hackers - a deep fascination with technology, perfectionism, a need for deep focus and few distractions, a concern for efficiency - and wrapped the whole package under the label of "lack of judgment".
The problem I have with such generalizations is that these scenarios are always a bit caricatured and are usually presented in a way that nicely fits the argument.
The experimental developer is required to build simple stuff, but goes to extremes just because he wants to toy with new technology, while the great one is praised for her conservative approach. It also helps that she asks exactly the right questions and receives the right answers.
“How many devices do we expect to have?”
“Well, we hope to sell 500 in 12 months.”
“How often will they need to report in?”
“Roughly once an hour.”
“How reliable is the network?”
“It’ll use WiFi, so fairly reliable.”
In reality, sometimes you ask these questions, you get very accurate answers, and based on that you pick some technology that you believe will address the problem spot-on. You might even make the judgment call that you have enough wiggle room to include one or two new concepts you've been curious about that are still very relevant to the task at hand.
Then something happens mid-project and it turns out that what was originally requested wasn't actually what was needed. How many times has that happened?
There are two possible conclusions in these situations, for either developer:
- the "rockstar" either looks like a god, for having foreseen some problems, or he'll be the guy who brought a tank to a knife fight.
- whereas the "great developer" will just look incompetent, or she'll just be, well, great.
Judgment is a nice trait for a good developer, but it is subjective. There are some hits and some misses.
What I believe makes a _great_ developer is that they work to push their own boundaries, which is the reason you're interested in them in the first place, but most importantly, when they do, they stand by their work.
I can't embrace a definition of a great developer where the primary quality is to avoid causing trouble for the company, the project, the team or their boss. That has almost nothing to do with the discipline. You're describing a "great employee" or a "great team player".
The exact description of a great developer given in this article might not work at all in other environments, where developers are required to push the envelope and think outside the box. In such a context, your great developer might be thought of as mediocre at best.
the "rockstar" either looks like a god, for having foreseen some problems, or he'll be the guy who brought a tank to a knife fight.
Fight's over. The guy with the knife killed you while you were putting the treads on. Meanwhile, he discovered that his customers actually wanted something else, because they could run the first iteration of the actual product while you were still worrying about scaling to a million customers. Maybe the customer is in the steak-house business, but you just stopped reading at "customer wants knives", got a hard-on and started building tanks.
Then something happens mid-project and it turns out that what was originally requested wasn't actually what was needed.
So you advocate over-engineering a just-in-case tour-de-force, instead of agile, iterative, responsive development. How are you getting up votes????
This post just listed traits that are very prominent in hackers - a deep fascination with technology, perfectionism, a need for deep focus and few distractions, a concern for efficiency - and wrapped the whole package under the label of "lack of judgment"
Not so. The post listed traits found in very prominent hackers, but also found in ineffective, inexperienced hackers. And it advised that focusing on just those traits is wrong. The key trait that separates these two groups is judgement. A "prominent hacker" will use the right tool for the job, right now.
"So you advocate over-engineering a just-in-case tour-de-force, instead of agile, iterative, responsive development. How are you getting up votes????"
There's a time and a place for heavy engineering, and a time and place for agile, iterative development, and sometimes both have times and places in the same project.
One element of good judgement is realizing what the tradeoffs between older and trendier development methodologies are, and then navigating them intelligently.
For example, the direction LedgerSMB is moving is towards a heavily engineered, intelligent database with a well-engineered API, and a more agile application built on this framework. The database engineering (esp. in an accounting app) needs to be done right and account for a lot of "just in case" while the actual application running on the database should be able to be customized relatively easily.
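For what it's worth, here is a minimal, hypothetical sketch of that pattern, not LedgerSMB's actual API: a stored procedure owns the rules, and a thin application wrapper calls it. Ruby and the pg gem are used here only for consistency with the rest of the thread; the procedure name and columns are invented.

```ruby
require "pg"

# The "heavily engineered" side lives in the database: a stored
# procedure owns the invariants (balancing, permissions, audit),
# so every client goes through the same checked path. Assumed to
# exist already, e.g.:
#   CREATE FUNCTION post_journal_entry(description text, amount numeric)
#     RETURNS integer ...
conn = PG.connect(dbname: "ledger_demo")

# The "agile" side is a thin wrapper that can change freely,
# because it never touches the tables directly.
def post_entry(conn, description, amount)
  result = conn.exec_params(
    "SELECT post_journal_entry($1, $2) AS id",
    [description, amount]
  )
  result[0]["id"]
end

puts post_entry(conn, "Office supplies", "42.50")
```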
It's funny, because the only code I write that is 100% TDD is the stuff that has to be done right. "Heavy engineering" without agile, iterative development is actually just "heavy wishful thinking", or "heavy waterfall".
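To make that concrete, a minimal sketch of test-first Ruby with Minitest; the JournalEntry class and its balancing rule are invented, just to echo the accounting example above:

```ruby
require "minitest/autorun"

# Hypothetical posting rule: a journal entry may only be saved if
# its debits and credits balance. The rule is small, but it has to
# be right, so it gets written test-first.
class JournalEntry
  def initialize(lines)
    @lines = lines # each line: { amount: Integer, side: :debit | :credit }
  end

  def balanced?
    debits  = @lines.select { |l| l[:side] == :debit  }.sum { |l| l[:amount] }
    credits = @lines.select { |l| l[:side] == :credit }.sum { |l| l[:amount] }
    debits == credits
  end
end

class JournalEntryTest < Minitest::Test
  def test_balanced_entry_is_accepted
    entry = JournalEntry.new([{ amount: 100, side: :debit },
                              { amount: 100, side: :credit }])
    assert entry.balanced?
  end

  def test_unbalanced_entry_is_rejected
    entry = JournalEntry.new([{ amount: 100, side: :debit },
                              { amount: 90,  side: :credit }])
    refute entry.balanced?
  end
end
```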
Fight's over. The guy with the knife killed you while you were putting the treads on[...]
This could actually go both ways. I'm sure even you could come up with some concrete examples of behemoths that came into an industry and just annihilated the competition because they decided to go the extra mile when everybody else had a myopic vision. FYI, the tank at the knife fight is not an approach that I necessarily embrace or advocate. I'm merely stating here that vilifying the practice of foreseeing something bigger, as a counter-example of what constitutes a "great" developer, doesn't necessarily work. Tell me a great developer has good judgment; just don't go as far as listing a developer's fascination with technology as a pathology. If it can be good, then leave it as an inconclusive attitude whose outcome is highly dependent on the dev's judgment, or lack thereof, and her ability to carry out the execution of what she undertook.
So you advocate over-engineering a just-in-case tour-de-force, instead of agile, iterative, responsive development. How are you getting up votes????
Nobody here advocates over-engineering; the term itself is negative and the practice indefensible. My position is that "over-engineering" is a subjective thing, like judgment. You think we need a tunnel; I say we should build a bridge. We foresee different things, but the need is still to get across. Does our difference of opinion qualify you as "great"? The point of the segment you're referring to was that it is possible for something that was originally labeled "over-engineering" to become, in the right circumstances, "sound engineering".
The post listed traits found in very prominent hackers, but also found in ineffective, inexperienced hackers.
This is one problem I have with the article. Here's a specific excerpt:
[...]When given what has the opportunity to be a “fun problem,” developers without judgement tend to run to their cave to craft the most elegant solution possible. They have a natural desire to over design the solution either in terms of flexibility, speed, feature scope, or simply to get a chance to play with their new pet technology. They need to be constantly checked on to make sure they aren’t half way down a rabbit hole[...]
How many bad hackers have you encountered who were concerned with such things as flexibility, speed or feature scope? The description here is crafted in a way that superimposes hacker clichés onto bad judgment. It doesn't separate good hackers from bad ones; it grabs traits that are generally seen as beneficial and just displays them in a bad light. It doesn't say "They have a natural desire to design the solution in terms of flexibility", but rather "They have a natural desire to _over design_". That is my problem with it. You can make Dianne look good by all means, but you shouldn't have to make Jake look bad in order to do it. (Btw, Jake is supposed to be a rockstar. Last time I checked, in the tech community, that's a guy who's taken his craft to new depths. As far as "inexperienced" and "ineffective" go, this falls short.)
I agree 100% that a prominent hacker uses the right tool for the job, but that's an effect, not a cause, of being a great hacker.
I define a great programmer as someone who is prepared to take responsibility for their solutions being effective or not.
Can people maintain the system you wrote? Is the system appropriate for what was required? Did you do work that saved the company 50% of their operating costs? Or are you just having a great time farting around being useless and causing damage?
Can people maintain the system you wrote? Is the system appropriate for what was required? Did you do work that saved the company 50% of their operating costs? Or are you just having a great time farting around being useless and causing damage?
I almost wholeheartedly agree with the sentiment. I do, however, have a problem with it when people who experiment with technology are painted in a way that automatically assigns them to the latter group.
A lot of the traits listed as indicators of being a great developer are actually "effects"; they don't necessarily warrant that you qualify as "great". Whereas most "causal" traits are relegated to the label "lack of judgment". That is my problem with the essay.
I didn't mean to say that developing with what might be considered 'experimental' tech is inappropriate. I think that sometimes breaking away from 'best practices' and 'how we do things in this company' is exactly the kind of thing that can create significant savings.
I think there's a lot of entropy in tools and techniques in the tech industry generally, and more so in actual companies, where cultures don't change as rapidly.
The thing though is that a lot of this occurs within a framework of evaluating what is not working and rethinking it. On the whole though, this is what pushing the envelope is all about. Note here we are often bucking (rather than embracing) many trends, though we do look to trends to pick pieces that make sense.
When it comes to software engineering, I always think back to my first-year Calculus teacher, Mike Lavender, and his admonishment that "in mathematics, power tools are considered inelegant where hand tools will do." Words to live by in programming and engineering.
So I'd go so far as to redefine elegance in software programming and engineering as "simplicity perfected." The simple fact is that where requirements change, a well engineered simple solution will usually turn out to be more agile than a heavily engineered, complex one (whether programmed with "agile" methodologies or not).
When one takes the idea of trying to perfect simplicity in the art of programming (and software engineering), it becomes easier to recognize foot-guns and inherent problems, and tradeoffs often become more conscious and controlled.
So while my view is slightly different from the article's, the difference is perhaps one of nuance rather than substance.
I think the post could swap "developer" for "hacker" - whereas you seem to be swapping "great" for "hacker".
I think what you are saying sort of agrees with the post. A bad hacker will make a mess, a great hacker exhibits judgement - and brings in new technologies because they are the right solution.
Indeed, the post says nothing about the great developer's experience with Sinatra... perhaps it was what she always used. Or perhaps she knew a little about it, and realised it was the most effective tool to use (and therefore used it for the first time).
What the post is talking about is the bad rockstar hacker who doesn't ask the right questions (lacks judgement), tries to over-engineer the solution (lacks judgement), and uses new technologies for the sake of it (lacks judgement).
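For scale, the article's own numbers (500 devices reporting roughly once an hour) work out to well under one request per minute on average, which is why a boring tool fits. Here is a hedged sketch of such a backend in Sinatra, with an invented endpoint and schema (the post doesn't show Dianne's actual code):

```ruby
require "sinatra"
require "sqlite3"
require "time"

# 500 devices reporting about once an hour averages well under one
# request per minute, so one process and one SQLite file are plenty.
db = SQLite3::Database.new("checkins.db")
db.execute "CREATE TABLE IF NOT EXISTS checkins " \
           "(device_id TEXT, status TEXT, received_at TEXT)"

post "/checkin" do
  db.execute "INSERT INTO checkins VALUES (?, ?, ?)",
             [params["device_id"], params["status"], Time.now.utc.iso8601]
  status 204 # accepted, nothing to send back
end
```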
I can't embrace a definition of a great developer where the primary quality is to avoid causing trouble for the company, the project, the team or their boss. That has almost nothing to do with the discipline. You're describing a "great employee" or a "great team player".
The post is talking about the aspects that make a great developer... working in a company.
I think they are right; they need a developer with the judgement to find out what the company needs, build a solution with effective tools and who has the ability to iterate as the spec develops. All of that ties to having good judgement.
Not someone who gets all enthused with the latest tools and delivers a much more elaborate solution :)
From the perspective of hiring or contracting a new developer someone who doesn't cause trouble in that way is a good thing. Especially as what the post describes is someone who actively produces solutions instead!
In my experience, many excellent hackers don't even try to make an educated guess about how much change to account for. The Hacker assumes that the system must handle various far-off scenarios, which gives them an excuse to play with [insert tech here]. Or the Hacker assumes that "the business" has thought of the best strategy for dealing with various technology issues that might occur (as if non-technical types could really do that), which gives them the ability to say they were just doing what they were told.
The industry conditions Hackers to avoid thinking strategically, at least in part. The Junior Hacker is browbeaten for not thinking of every possible scenario when things go wrong, however unlikely. The Junior Hacker is rewarded for following the directions of their technical supervisors to the letter without thinking. It shouldn't surprise anyone that by the time the Hacker is promoted to Senior Hacker, they have little experience designing and developing software that will ship quickly, not be a maintenance burden, and can adapt to success if it comes.
This article is totally true, but almost impossible to act on.
It's very, very hard to differentiate people who have a passion for doing good work from people who just like to play with the latest trendy technology. Especially since the latter is often just an immature stage of the former.
You can look for someone who works with non-trendy technologies.
This is actually a big reason why non-trendy technologies persist - indeed, why they often thrive in the larger marketplace precisely at the moment that they become least sexy and fall off the geek radar. Non-trendiness is a market signal of its own. There's a lot less noise in the backwaters, and those products which survive to become boring have demonstrated staying power. People obviously aren't using them for fun, or because they're easy to promote on popular blogs, or because VCs can impress each other by dropping their names. They must have survived for another reason. Perhaps they are... pragmatic, useful, or cost-effective?
This strategy must be used in moderation, of course. The final stage of non-trendiness is obsolescence and death; you don't want to hire those who are expert only in obsolete things. (Oh, my IE6-CSS-hacking skills will soon have no market value. Thank goodness!)
Don't sell yourself short. My father bought an IBM-PC and built a record-keeping database application for a niche market that did very well for him from the mid-80s well into the 21st century. It was written in almost-pure PC-COBOL (some ASM), which was already the butt of jokes before the PC era even began.
This essay seems to beg the question. A "good" developer, by its definition, is the one who picks the most reasonable, easy-to-maintain technology, because she knows anything "much more complex would be beyond her current skill." Yet if she is lacking in skill, how can we be sure her judgment is sound on what a "reasonable" and "easy" technology is? Doesn't the basis for that judgment require superstar-like experience and curiosity?
Developers who lack skill are awful at picking technologies. I like to think of it like this: when the guys that don't know anything start picking the tools they are basically throwing darts at a board. Everything is going to seem random. Sure, some of their picks will be correct, but most are going to be awful selections that make maintaining the project a very unpleasant experience.
These kinds of people tend to choose whatever seems popular right now. If they happen to be managers then they will read Gartner reports and report on their contents as if Moses himself had presented them with edicts carved into stone tablets.
I would say that once you reach a certain level you are able to determine which technology is reasonable based on things like language familiarity, community support, and documentation.
Sure, but I wonder if two years in Ruby, or any language, is enough?
A superstar dev might go out and try a lot of the new, cool stuff, even when it isn't needed. But it seems that that developer would also have a grasp of the common characteristics of bad technology.
So I think there are, at a minimum, two good traits of a great developer: the rockstar-like quality of having the skill, curiosity, and breadth of knowledge to handle being on the edge, and the maturity and thoughtfulness to know when he/she has failed in the past.
Ruby is also a bit of a contrived example. Ruby looks a lot like you could come to it from Java and know most of it, but metaprogramming actually takes a long time and a lot of work to really figure out.
Similarly, Ruby has many different little variants -- Rails, Rake, various sorts of DSLs -- and that adds significantly to the time to learn it.
Really learning Ruby is learning several similar languages, which makes two years a much harder number to achieve.
C++ could easily have been like that if it had different constellations of features that were routinely used in different ways by the same programmers. Imagine if there were a whole "C++ tools" subculture that used templates extensively but differently, along with one or two other nontrivial features (say, multiple inheritance and exceptions) used in specific ways with specific non-compiler-mandated rules.
You'd have to learn, in effect, about another programming language's worth of material to program fluently within that subculture.
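To illustrate the metaprogramming point with a small, invented example: the following is ordinary idiomatic Ruby of the kind Rails leans on, yet none of it maps onto the Java-flavoured subset a newcomer starts with.

```ruby
# A tiny Rails-flavoured DSL: `attribute :name` defines a getter,
# a setter, and a `name_changed?` flag at class-definition time.
module Attributes
  def attribute(name)
    define_method(name) { instance_variable_get("@#{name}") }

    define_method("#{name}=") do |value|
      instance_variable_set("@#{name}_changed", true)
      instance_variable_set("@#{name}", value)
    end

    define_method("#{name}_changed?") do
      !!instance_variable_get("@#{name}_changed")
    end
  end
end

class Device
  extend Attributes # class-level, not instance-level: easy to mix up
  attribute :hostname
end

d = Device.new
d.hostname = "sensor-01"
puts d.hostname          # => sensor-01
puts d.hostname_changed? # => true
```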
If you are already an expert programmer who knows a few similar languages, then sure.
If you are already an expert programmer who knows different languages, then two years may be enough, or it may not.
If you are not already an expert programmer, then two years is unlikely to be enough. Peter Norvig suggests it may take ten years to become an expert: http://norvig.com/21-days.html
Example: I have been programming for a little over ten years (though not all of that time was professional programming); have a degree in software engineering; have worked in telecoms, web development and recently in embedded systems; have used C, C++, Java and Python professionally; I am a self-described language enthusiast who likes to tinker with, try out and apply a variety of languages (Factor, Scheme, Clojure, Prolog, OCaml, Haskell and a few more to a lesser extent). BUT I still learn new things about the languages I use. I still feel I have a lot more to learn. I first started using Python in 2001 or 2002 and I still find myself learning useful new things.
So, sure, you can learn enough to effectively use a language in a lot less than two years, but I think it will take a lot lot longer to learn "everything you ever needed to know about a language".
It depends on whom you ask. Some people have higher standards than others.
I wouldn't bet a consulting project performing integration of various systems using Ruby on a team whose programmers average 2 years of "side-project" Ruby/Rails.
I still wouldn't bet on Ruby developers with 2 years of experience for certain types of web apps either, unless I've seen their code, code style, discipline, judgement, etc.
I would trust a Ruby developer who got burned by some of the Ruby tricks though.
Case in point (or anecdote):
I recently brought in a programmer (veteran Java, 2-ish years of Ruby/Rails, and 1-ish year of JS, thanks to Node.js and Crockford's "The Good Parts") to help us work on some of the JavaScript codebase. Her Java knowledge is intermediate despite the number of years (I consider deep JVM and standard JDK knowledge around hard-to-use packages such as NIO, threading and networking to be expert-level; other than that, just intermediate). Pragmatic. Understands OOP and its gotchas.
I observed her work in Ruby and JS. On the JS side, she wrote much more code than needed, and my gut feeling tells me that she does this to increase her experience with JS. Kind of using it as a playground.
On the Ruby side, she is a little more accepting of certain bad OO practices, whereas she would be critical when it comes to bad OO practice in Java.
Hence, I prefer to observe a developer's attitude rather than their experience level, because sometimes the first 1-2 years are a "honeymoon" period where everything looks fine and dandy. As soon as the honeymoon is over and your significant other starts to show their true colours, you're back down to earth again.
Totally agree. For example, if a web dev wants to try a new language/framework, he or she can be competent in a few weeks and well-versed in 3-6 months. No?
Agreed, and of course a clear understanding of what the tradeoffs between approaches are. In the end, I think that, more than years of experience, is the best marker.
I have come into codebases written by a few developers over 20 years that made me think "why don't these people ever get any better?" And when the codebase could function as a textbook of how not to program...
What this article is saying is that you need someone with experience to make the right judgement. This means you need to find someone who has already been on the 5-8 year roller coaster that a developer goes through: from "just get something working", to astronaut over-architecture, to the practicality that comes with running many projects and being through many successes and failures.
A good developer might be at any stage in that process, and it is really a timing thing. Typically, after 10 years most developers have this, as long as they have not become too pessimistic and have experienced many platforms without being religious about any of them.
Experience, with a beginner's mind that is still willing to innovate, is the perfect balance in a 'great' developer.
A very smart developer whom I respect once said something like this: when you are young, you have lots of energy and want to learn and use the latest technology. As you get older and gain more experience, you start to see the patterns. It's not about the technology; there are repeated themes and solutions. But by the time you start realizing this and have accumulated 7 years of experience, you may be transitioning out and wanting to start a family. So the people who might be most qualified may be wanting to leave the intensity of coding for the relative stability of management.
So perhaps n years of experience is a kludgy filter for the relative graybeards who understand that technology is just a tool, not an end in itself.
I find the volume of defensive comments a little amusing, since three times a day we all vote up and universally agree ideas are cheap and execution is king. The author is drawing the same parallel to characteristics of individuals.
Unlike most of you, I read that and found myself saying "yes!" It's the tortoise and hare argument: good dev teams are disciplined, focused and stable. I suspect the backlash is because it stung a bit, since many here saw themselves in flighty Jack, versus measured Diane. I was there too, but have learned from a few failed projects and disappointed clients. I encourage you to reflect upon yourself a little more carefully; good execution is decidedly unsexy and needs more Diane than Jack.
Are you being misled by the fact that the article uses only two examples? That's a rhetorical shortcut, not a proposed dichotomy. The article does not actually claim that "those who use Sexy Tech X tend to have bad judgement" or anything like that. What it says is that folks with good judgement use appropriate tools for appropriate reasons, which may be situational.
I'm not making a rhetorical argument. You should take me seriously.
The article stereotypes "rock star" developers as being obsessed with using the latest tools and compares them to more "average developers" who (it is claimed) better know their limits and ship more maintainable products.
I believe that this is a false dichotomy.
There are plenty of "rock star" developers who ship maintainable products which use appropriate technology. There are far more "average developers" who ship awful, barely-functional code because they're only able to use a limited set of technologies.
The author's argument is that the column on the right trumps the column on the left: Judgement is important, even if the genius-with-poor-judgement is some kind of CS demigod. However, he doesn't actually say that lack-of-genius and lack-of-judgement are correlated, or that he'd turn down a wise genius if such a rare and precious being fell on him from the sky. You've gotten the unfortunate impression that he's saying that, but that's an artifact of trying to use two examples to describe a fuzzy multidimensional space. (I mean, really, even two dimensions is a painfully naive approximation, except of course to a management consultant. ;)
Of course, if the original blog post used my four examples it would be as dull as... management consulting. The moral of this story is that writing is hard. You have to leave some ambiguity, and trust the reader to fill in the gaps, and hope the gaps are not too large, and that your sly invocation of the trendy-yet-pragmatic NoSQL user in the penultimate paragraph helps people take the hint. But sometimes the whole thing just doesn't work. C'est la vie.
If clarity was one of the design goals, the author should have included the 2x2 matrix. Sometimes the best opinions aren't those that correct our wrongly held beliefs, but those that never create them, which is a much quieter virtue.
I agree with the gist of this article, but there is an opposite side to this piece as well.
We do need those developers who pick the most wild, crazy, and bleeding-edge technologies. New technologies shouldn't and certainly don't solve every problem out there, but they often tackle a couple of really interesting areas quite well. At the very least, they allow us to question and rethink our current development stacks. Also, being one of the first to adopt a technology has its benefits; not saying that it justifies the risk, but it should be considered.
Also, I think Node.js and Cassandra are pretty easily maintainable, but that's not really what this is about :)
Judgement is indeed an important part of being a great developer, but I don't think the article is doing a good job of explaining why, or what kind of judgement.
First of all, the story about Jack & Dianne doesn't illustrate much. Why did Jack end up with unmaintainable code while Dianne didn't? And what does it have to do with the tools they chose? You can write unmaintainable code in any language.
It sounds like Dianne did a better job of understanding requirements. Okay, fair enough, understanding requirements is important (though I wouldn't call it "judgement"). But requirements change. They always do, that's why software has to be maintained - to reflect changing requirements. Which means that whether the code is maintainable does not depend on the code per se - it depends on how requirements change (unless the code is so bad that it can't be changed at all).
Say, if tomorrow the new requirement is to add a new API to the backend, Dianne's solution is probably more maintainable because it's simpler. But if the new requirement is to scale to 10K users, Dianne's solution might have to be rewritten from scratch, and is thus completely unmaintainable.
So, to write maintainable code, a developer has to anticipate future changes in requirements and make tradeoffs based on those predictions. Now, that's a hard problem, and that's where we need judgement and experience. This kind of judgement makes a great developer; mere conservatism in selecting tools and technologies does not.
I agree wholeheartedly with this concept. I previously used the term "instincts", but I think "judgement" is better. A solution is more than a technical implementation: it should address the actual requirements of the product as well as the characteristics of the maintainers. A combination of familiarity + ease of development + enough scalability.
I think that for any type of problem solving there is a clear workflow you must go through in order to efficiently come to a solution:
1. Break down the problem to see what you are actually required to do.
2. Think of ways in which you have solved previous problems and check whether you can do something similar to what you have done in the past.
3. If you need to do something that you haven't done before, do some research and find the tools that you need to solve the problem; if there are multiple paths to a solution, apply Occam's Razor.
4. If step three didn't work, either you haven't looked hard enough or you have found a problem that no one has solved before; it is most likely the former.
5. Implement your solution. If it works, great: the problem has been solved. If it doesn't work, retrace your steps, then take a break and come back to the problem when you have a clearer head.
99% of your problems can be solved in this manner.
I think the truth of this article depends on the size of your team.
If you're hiring only one to a few developers, it's likely that you need your employees to self manage. They need to strategize, have realistic expectations, and fulfill them to the best of their abilities. "Dianne," the good developer, demonstrates these skills. So, judgment is important from this kind of developer.
However, I think that "Dianne" would be best suited to a more managerial position, focusing on her strategy skills and ability to relay needs to other developers. Handing off the implementation details to a team of rockstars, with deep and varied proficiencies, would make for a spectacularly complementary team.
And in the latter case, your best developers are - and should be - rock stars. You need to consider not only how you want your code to scale, but how your company will.
I think the fundamental problem exposed in the article is actually not the hiring criteria, but rather that resumes are usually screened by HR departments who have no clue about what a job entails. So the hiring managers have to provide hiring criteria that the HR resume screener can understand. The HR resume screener is not a programmer and has no way to judge the capabilities of programmers. The hiring manager may not be either.
As seen in a Slashdot sig some time ago, "Light travels faster than sound, which is why some people appear bright until you hear them speak." Unfortunately appearances win to those who cannot understand what is being said.
When I look around at other companies hiring Ruby on Rails developers, I see them focusing on three major traits: Super-smart; Large community following; Deep Ruby knowledge.
I definitely agree with the thrust of the article, but I run community sites and have run Ruby job ads for a few years and I can't recall seeing anyone looking for a hire with a "large community following."
In aggregate, across all the listings I've run, companies are most usually looking for experience, team working skills, familiarity with Agile practices, TDD experience and, sadly, the ability to commute to a certain location.
I can't recall seeing anyone looking for a hire with a "large community following."
companies are most usually looking for ... team working skills
I believe there is an operating assumption here that active membership and following in a given community is a solid indicator that the individual in question has very good team interaction skills, as well as working knowledge of the community's domain focus.
Maybe, although I think that assumption would be erroneous (not that you suggested it wasn't :-)).
I can only speak for the Ruby and Rails community but I think there are many developers who have a strong "following" on blogs, Twitter, at events, and even on GitHub but who aren't necessarily good "team players" (To be fair, I include myself in this bracket ;-)).
When I do my first interview (as the interviewer) one day, I'm going to ask "Do you read blogs by software developers?" and if they answer "Yes", I will not hire them because these blogs are a bunch of horse shit.
For it to be judgment, it must be living and active in a programmer's mind. It can't be switched on some of the time, it must be switched on all of the time, for the mundane and the monumental. It must be sharper than a double-edged sword; it must discern the thoughts and attitudes of the heart.
One fallacy in this article is the portrayal of the typical "regular programmer". The reality is that many regular programmers would pick something horrible but established in the enterprise like Java Struts or Oracle Web forms, not Rails, because that's "what they know".
I agree with this article and have a proposal for another trait of a great dev:
A great developer is not necessarily one who writes the most elegant code, but one who knows how to walk the line between elegant code and speedy dev time.
I agree with the first point. Development is usually a team effort, and effective teams generally play to their strengths. This is probably more important than having a rockstar on board.
Isn't simple code inherently more maintainable? I can see it having the problem of lacking scalability, but prematurely building in scalability is the root of all evil...
I agree that premature anything is bad, especially in its extreme forms; however, your other point needs some elaboration.
Isn't simple code inherently more maintainable?
Verbose, over-simplified code can also be unmaintainable because you lose track of the bigger picture.
The only really simple code is in small projects, and code that was just written. In the longer run, as features are added, complexity is always being added too.
In that phase it becomes important to separate concerns, prevent accidental complexity, factor out repeated code, and so on. This can mean that some parts of the code have to make use of more powerful constructs of the programming language, and thus are less trivially "simple".
Also, opinions differ greatly on what is "simple code" and what is not. After all, all code is simple once you understand it.
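As a small, made-up illustration of that trade-off in Ruby: the "powerful construct" here is a block, and the factored version is less trivially simple than copy-paste, but it is the one that stays maintainable as reports are added.

```ruby
# Hypothetical exporter: every report used to repeat the same
# header/footer ritual inline. Factoring the ritual into one
# block-taking method trades a little copy-paste "simplicity"
# for maintainability: each new report is one line, and `ensure`
# writes the footer even if a write raises.
class Exporter
  def initialize(io = $stdout)
    @io = io
  end

  def with_export(label)
    @io.puts "BEGIN #{label}"
    yield @io
  ensure
    @io.puts "END #{label}"
  end
end

exp = Exporter.new
exp.with_export("users")    { |io| io.puts "alice, bob" }
exp.with_export("invoices") { |io| io.puts "INV-001, INV-002" }
```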
That's why I have come to define "elegance" as "simplicity perfected."
The best solution is the one which makes use of a number of clear, easily understood, simple solutions in combination to solve a problem. But the wrong approach to simplicity can be as bad as complexity.
A great developer is relatively easy to identify. Three things: 1. gets shit done, 2. makes good decisions on trade-offs (cost, performance, etc.), 3. produces maintainable work.
The real fallacy in the article is an unstated assumption - that developers should be homogeneous. Why not have a "rockstar" Jack and a "sensible" Dianne, in dynamic tension? Teams of one are the exception. In multi-person teams, focusing on the same attribute not only misses an opportunity to hire developers who complement one another but often leads to outright conflict as multiple developers lay claim to the same decisions or assignments. The author's point is slightly more applicable to choosing a lead developer, where judgment is at even more of a premium and many people would make the mistake of promoting Jack over Dianne because of his higher profile, but even then I think the article does more harm than good.