It seems to me that there are two schools of thought: one school that uses debuggers, the other that uses printf statements and their equivalents. Neither school is entirely wrong.
I was once told a quote (original source unknown, so any butchering is mine), something like: "When I was a young programmer, I relied on print statements to debug. When I learned more, I used a debugger, and when I learned even more, I used print statements."
I always took that to mean dogma was no substitute for results, so I use whichever works better for the problem at hand. (Or at least I try to - often this means noticing when one approach isn't working well and being willing to switch, but that means I was making the worse choice for a while.)
Exactly! I’ve dealt with multiple real-time embedded platforms that have a log output channel that is waaay more convenient to use via printfs than hooking up a whole debug apparatus.
Though I've had the super rare occasion where the bug actually IS the code behind the print function, usually in experimental PL/OS stuff. There's a sort of turtles all the way down feeling the first time it happens.
Got bitten by that before in Ruby, which attempted to pretty-print objects by recursively enumerating their properties and printing them. One mutually recursive data structure later...
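For anyone who hasn't hit this, here's a minimal sketch of the same failure mode (in Python rather than Ruby, names illustrative): a naive recursive repr with no cycle detection blows the stack on mutually referencing objects.

    class Node:
        def __init__(self, name):
            self.name = name
            self.peer = None

        def __repr__(self):
            # naive "pretty print": recurses into peer with no cycle check
            return f"Node(name={self.name!r}, peer={self.peer!r})"

    a, b = Node("a"), Node("b")
    a.peer, b.peer = b, a
    # repr(a)  # RecursionError: maximum recursion depth exceeded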
Well, gdb is pretty hard to use. I learned how to step through programs recently but I'm still not confident using it. The fact that printing variables is still the quickest and easiest way to gain insight into a running program's state speaks volumes.
To use gdb, I have to recompile all the code with -g, run gdb, set up breakpoints and variable printing, run the program under gdb, and then slowly step through it.
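For reference, the basic loop looks roughly like this (a typical session; flags and variable names are illustrative):

    $ gcc -g -O0 -o prog prog.c    # build with debug info, optimizations off
    $ gdb ./prog
    (gdb) break main               # stop at main
    (gdb) run
    (gdb) next                     # step over one line
    (gdb) print some_var           # inspect a variable
    (gdb) continue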
There are times I use the debugger and times I just print to the console. It really, really depends on the situation. Not to mention that setting the logging level to DEBUG is basically the same as a bunch of printfs, and it's often the best way to debug issues in production.
To say printf is not a "quality debugging technique" is just false.
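As a rough sketch of the logging-level point (the function and payload are made up), Python's standard logging lets the same statements stay in the code and get switched on per environment:

    import logging

    logging.basicConfig(level=logging.DEBUG)  # e.g. logging.INFO in production

    def handle_request(payload):              # hypothetical function
        logging.debug("handling payload: %r", payload)
        return {"ok": True}

    handle_request({"id": 1})  # emits: DEBUG:root:handling payload: {'id': 1}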
"Open Source means it has fewer bugs and is more secure."
Maybe not in the absolute sense of that expression, but I find it true when you only consider the higher quality open source projects, e.g. say the top 10k open source projects. The code that most professional software developers produce at big and small companies alike leaves me in awe that their final products work as well as they do.
"Being able to program is the most important aspect of being a good software engineer."
Ehh, I would say this is true. It's certainly not the only aspect, but if you can't code, you just end up being someone who plays office politics all day trying to hide that fact.
I find it true when you only consider the higher quality open source projects, e.g. say the top 10k open source projects.
How are you defining "top" here? The best quality open source projects are not the most widely used projects -- if anything, I find that the correlation tends to be negative.
The biggest falsehood I believed is that companies care about writing quality code, and that on the job you will get to apply the rigor you learn at uni. On the whole, they don't really care, with some exceptions.
This is a great list. The arrogance of the recent grad is pretty astounding in my experience (myself included). I wish more time were spent on correcting some of these misconceptions during university.
> Command-line tools should print colorized output.
Just add a "colorless" command line argument if you need to parse things without ASCII color codes. Better yet, detect when the program is outputting to a pipe and disable colorized output, e.g. like Git https://unix.stackexchange.com/a/19320
> They will use lots of math in their career.
The validity of this is highly domain-specific.
> 'git' and 'GitHub' are synonymous.
Aside from GitHub-a-likes (e.g. GitLab, BitBucket), what does this mean? I'm assuming it's that you can use Git in its original mode (i.e. without a "single-source-of-truth-plus-issue-tracker-as-a-service" system, and more as a "true" DVCS), but very few projects seem to really use it this way. They're important projects, sure (e.g. Linux), but they are few in number.
> Sprinkling printf statements is an efficient debugging technique.
This is... opinionated. As useful as GDB can be in a pinch, I have often preferred to just output things to the console. `printf` specifically is a bad example, though, as C doesn't have reflection (out of the box, anyway), which makes debug-by-manual-print harder.
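To make the contrast concrete, here's what reflection buys you in a language that has it (a Python sketch; the class is illustrative). In C there's no equivalent of dumping an arbitrary struct without writing the format string by hand:

    class Point:
        def __init__(self, x, y):
            self.x, self.y = x, y

    p = Point(3, 4)
    print(vars(p))            # {'x': 3, 'y': 4}
    print(f"{p.x=}, {p.y=}")  # p.x=3, p.y=4 (f-string debug syntax, 3.8+)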
> Compiler warnings can be ignored, as they'd be errors otherwise.
I smell a Golang programmer...
> Using lambda in Python is a good idea because it shows others you have a CS degree and understand the "Lambda Calculus".
Kinda rolling my eyes at this one. Yes, one can "be annoying" with FP-like concepts, but lambdas can be very useful in a pinch.
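For example, a throwaway sort key is exactly the kind of place a lambda earns its keep (the data is illustrative):

    people = [("alice", 34), ("bob", 27)]
    people.sort(key=lambda person: person[1])  # sort by age
    print(people)  # [('bob', 27), ('alice', 34)]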
>Object-oriented programming is the best and most common programming paradigm.
>Using a custom written Vector class makes your program object-oriented.
I feel I should add my personal falsehood: "State objects with methods means your program is object-oriented."
Unfortunately, I genuinely know people in my class who believe that if people are afraid of getting fired and missing deadlines, they will be more productive :/
A lot of people who are interested in management (including CS people wanting to make a ton of money) have this idea that 'survival of the fittest' means everything killing everything else all the time.
The line of thinking is that the best way to be successful in the business world is to abuse and destroy everyone possible to show how tough and competitive you are. If you've ever been pushed to compete with your co-workers instead of a competing company, someone higher up is probably thinking that way.
The article talks about 'lots of comments'. The occasional comment can be helpful, but many things are just as clearly expressed in code as in comments. E.g., if you write a numerical algorithm you should probably either explain it or add a reference. But if you start to see things like
DatabaseConnection conn; // Connection to the database
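By contrast, a comment that points at the underlying algorithm earns its keep. A quick Python sketch of the "explain it or add a reference" case (the reference is the kind of pointer I mean):

    def kahan_sum(values):
        # Kahan summation: compensates for floating-point rounding error
        # when adding many values.
        # See https://en.wikipedia.org/wiki/Kahan_summation_algorithm
        total, compensation = 0.0, 0.0
        for v in values:
            y = v - compensation
            t = total + y
            compensation = (t - total) - y
            total = t
        return total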
Eeeeh, that sounds more like tribalism; there are plenty of good webapps written in Python... and Ruby, and Go, and PHP, and Perl, and any number of other languages, many of which might be unexpected but work well for the people using them.
That is correct, but some people just tend to adopt Python religiously. Those who understand that the best tool for the task at hand is whatever actually fits the task, not their favorite language, are fine.