
I had a manager who would repeatedly use this type of remark. We had this particular application that no one could touch without breaking something. It was always a mess. He would often remark, "I wrote the whole thing over a weekend, why can't you guys make a tiny little change without breaking it?" One day I finally got fed up with him and replied, "Because it's the quality of work you'd expect from an entire application written over a weekend."


On the opposite side of the coin, I was a professional developer 25 years ago (and still keep my hand in as a hobby), and I regularly struggle to understand why things take as long as they do these days. We were quoted a day for something that I said could be written in about half an hour - and their quote included nothing more than coding and unit testing it.

So I was challenged to prove it. Which I did (or rather I did it in 10 minutes). When their code came back, a day later, it didn't actually work. When we pointed that out to them, they came back another day later with code that looked almost identical to the one I'd written in 10 minutes.

tl;dr. Sometimes managers don't realise the complexity of modern software, but sometimes modern developers are actually just plain slow.


I have been developing software for 20 years and I am currently replacing some applications that were written in the last five years. They use OOP, patterns and the latest tools. Often I wade through dozens of lines of code trying to find the meat of what they are trying to do. I usually find easier and shorter ways to implement the same thing just by better design and avoiding repetition. I don't think programming is just about the tools; I think it is about structure and organization.

Every pattern, layer, feature or tool that you introduce in a project makes it more complex, so you really have to use good judgment when you decide what to add.


>Every pattern, layer, feature or tool that you introduce in a project makes it more complex,

That really shouldn't be true.

A pattern is simply a commonly used way of solving a common problem. If you're picking one that makes it more complex, then you've picked the wrong pattern. The reason that I could do the example I talked about in my original post in 10 minutes was because it was a bog-standard design pattern designed to solve exactly the issue that the software needed. Amongst the many reasons why it took the developer 2 days was the fact that they had to pretty much reinvent this pattern from first principles.

Equally, if you're introducing a tool that doesn't simplify something that you'd need to do/build manually then you shouldn't be introducing that tool.


Times have changed considerably. When I started in the industry 20+ years ago our tools were a compiler, debugger, and editor. Any library not provided by the compiler we wrote ourselves, simply because purchasing libraries was very expensive. We also weren't afraid to code a solution specific to the problem and not be overly concerned with abstracting every library for potential use with any future application. Most work went into the server code, which housed the application logic and coordinated with the database; graphical UIs were a luxury.

Nowadays a web application, for example, is tiered more. Different frameworks are encouraged for the different tiers: Bootstrap, Spring, Hibernate, etc. Each one is its own ecosystem and is built on top of other libraries. It's very common to make web service calls outside your WAN. Quickly you find out that "standards" have different interpretations by different library authors.

UIs are no longer an afterthought. They affect how successful your application is. (My observation is that a well-designed UI can cut down on user errors and training by two thirds over a merely functional UI.)

I'm keeping the example simple by not mentioning necessary middle-tier components that we didn't use 20+ years ago. We also didn't worry about clustered environments, asynchronicity, or concurrency.

Not knowing the application you needed or how the analysis was done by the coding team, it's hard to say if some of their "slowness" was getting to know the problem AND coming to understand how extensible, performant, and reliable you wanted it to be. My own approach is usually to solve the "happy path" first and then start surrounding it with "what if's" - e.g. what if a null is passed into the function, etc. Over time I refactor and build in reliability and extensibility. The coding team you referred to may have used a different approach, in which they tried abstracting use-cases and building an error handling model before solving the "happy path".
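The "happy path first" progression above can be sketched in a few lines. This is purely illustrative - parse_amount is a made-up stand-in, not anything from the thread:

```python
# Step 1: solve the happy path and nothing else.
def parse_amount(text):
    return float(text.strip())

# Step 2 (a later refactor): surround it with "what if's".
def parse_amount_hardened(text):
    if text is None:          # what if a null is passed into the function?
        raise ValueError("amount is required")
    text = text.strip()
    if not text:              # what if it's blank?
        raise ValueError("amount is empty")
    return float(text)        # the original happy path, still at the core
```

The point isn't the checks themselves; it's that the hardened version grows around working code, rather than the error model being designed up front.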

Your "tl;dr" is spot on. But I'd like to raise a cautionary flag about judging modern development through a 25-year-old lens. The game has changed.


I'm still heavily involved in IT. I know the game has changed. As well as the complexity that's come into it, there's an awful lot of complexity that's disappeared. Those necessary middle-tier components and UI frameworks have to be used, but they no longer have to be built from scratch.

>We also didn't worry about clustered environments, asynchronicity, or concurrency.

Clustered environments, probably not. But asynchronicity and concurrency were the bane of my life. Writing comms software back in the day involved hand-crafting both the interrupt-driven reading of data from the I/O port and the storage and tracking of that data in a memory-constrained queue, synchronised with displaying that data on the screen. And the windowed UI had to be hand-crafted as well. Error handling was no more of an afterthought then than it is now - and you couldn't roll out a patch for a minor defect without manually copying 500 floppy disks and posting them to clients.
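For anyone who hasn't had to hand-craft one: the memory-constrained queue in that setup is essentially a fixed-size ring buffer, with the interrupt handler as producer and the display loop as consumer. A rough sketch of the data structure (in Python for readability, where the original would have been assembly or C):

```python
class RingBuffer:
    """Fixed-size byte queue: the interrupt handler writes, the display loop reads."""

    def __init__(self, size):
        self.buf = [None] * size
        self.head = 0    # next slot to write (producer / interrupt side)
        self.tail = 0    # next slot to read (consumer / display side)
        self.count = 0

    def put(self, byte):
        if self.count == len(self.buf):
            return False               # full: drop or flow-control the line
        self.buf[self.head] = byte
        self.head = (self.head + 1) % len(self.buf)
        self.count += 1
        return True

    def get(self):
        if self.count == 0:
            return None                # nothing pending for the display loop
        byte = self.buf[self.tail]
        self.tail = (self.tail + 1) % len(self.buf)
        self.count -= 1
        return byte
```

The hard part back then wasn't this logic; it was making the producer side safe to run inside an interrupt while the consumer polled it, with no language or library support for either.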

I understand why some bits of development take a long time, but the reality is that 90+% of the development work that our place does these days is what an ex-manager used to refer to as "bricklaying" - dull and repetitive work that involves pretty much zero thought to implement. Extract file X (using the language's built-in drag-and-drop file-extract wizard), sort it by date (using the language's built-in sort module), split it into two separate files (using the language's built-in file-split module) and load it into database Y (using the language's built-in database load module).
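To make the "bricklaying" concrete, here's roughly what that extract-sort-split-load job looks like with a modern standard library. File names, the date format, and the cutoff are all invented for illustration:

```python
import csv
from datetime import datetime

def bricklay(in_path, recent_path, old_path, cutoff):
    """Extract a CSV, sort by date, split on a cutoff, write two files."""
    # Extract file X.
    with open(in_path, newline="") as f:
        rows = list(csv.DictReader(f))

    # Sort it by date, using the built-in sort.
    parse = lambda r: datetime.strptime(r["date"], "%Y-%m-%d")
    rows.sort(key=parse)

    # Split into two separate files.
    recent = [r for r in rows if parse(r) >= cutoff]
    old = [r for r in rows if parse(r) < cutoff]
    for path, subset in ((recent_path, recent), (old_path, old)):
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=rows[0].keys())
            writer.writeheader()
            writer.writerows(subset)

    # Loading into database Y would follow here, e.g. with executemany().
    return len(recent), len(old)
```

A couple of dozen lines of glue around library calls - which is the commenter's point about how little original thought the job requires.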

And even with all of these tools, it still takes 10 times longer for people to develop this kind of thing than it did when we were writing all of it from scratch. It's not because of complexity of coding, of environments, or of frameworks. The problem is that much of the IT industry has replaced skill and knowledge with process, contracts, documentation and disinterested cheap labour.


I once worked with a founder who would always pull that kind of garbage: "I wrote [simple software with no dependencies or integration requirements] in [N] days! Why is it taking you guys [M] months to write [complex software relying on several 1st and 3rd party libraries, a component that needs to work within a large, old legacy system]?"

Good on the development manager!


There was once a programmer who was attached to the court of the warlord of Wu. The warlord asked the programmer: “Which is easier to design: an accounting package or an operating system?”

“An operating system,” replied the programmer.

The warlord uttered an exclamation of disbelief.

“Surely an accounting package is trivial next to the complexity of an operating system,” he said.

“Not so,” said the programmer, “when designing an accounting package, the programmer operates as a mediator between people having different ideas: how it must operate, how its reports must appear, and how it must conform to tax laws.

By contrast, an operating system is not limited by outward appearances. When designing an operating system, the programmer seeks the simplest harmony between machine and ideas. This is why an operating system is easier to design.”

The warlord of Wu nodded and smiled. “That is all good and well,” he said, “but which is easier to debug?”

The programmer made no reply.

The Tao of Programming, Geoffrey James


I believe the developers working on Windows Vista, Copland, Taligent and GNU/Hurd would like to have a word with Mr. James.


Heh, reminds me of a boss that criticised my work a few months back:

Boss: "Hey antimagic, why haven't you finished that module yet?"

Me: Because the code it's interfacing to is a big ball of spaghetti (paraphrasing, because it was the boss in question's code, so I was much more diplomatic).

Boss: "You need to learn to be able to work with other people's code better - I can get in and modify your code easily."

Me: <stare at boss waiting for penny to drop>

... 30 seconds later, after no reaction ...

Me: Yes, I do write fairly clean code. Thank you for noticing.


I love it when someone's attempt at insulting you and complimenting themselves results in the opposite of their intent. But is it more satisfying when they realize it, or when they remain unaware?


Also, I've generally found it's easier to develop starting with a blank directory tree (no code) than to inherit a legacy codebase. With inherited code you have to make the sometimes grueling investment of time, energy, focus and trial-and-error needed to come up to speed on it, understanding-wise, at the fine-grained level of detail you need to code confidently, and then figure out how to make a positive change that doesn't make some other thing worse. (And it's harder still if there are no automated tests or documented manual test plan.) I call it OPC, for Other People's Code. One of the anti-patterns of software engineering. It's distinct from NIH (Not Invented Here), which is another anti-pattern.


Of course it's always easier. The question is, can the business afford to wait while you greenfield another app? Usually it can't. That's why refactoring.

First step is fixing the development environment / build process and getting a staging server up. It will inevitably be broken / nonexistent, with frequent edits directly to production necessary. The last guy will have internalized a great deal of operational workarounds that you'll need to rediscover then codify into the app.

Next you write tests. There will be none. Once you have a workflow that is decent, you can start to identify the worst offenders. All the while, you'll be having to change the codebase to meet project requirements; this will give you a good idea of where the really bad shit is. Unit test all of it, and if you're feeling froggy, write some integration tests. Once you get to this phase, you should be unit testing your project work.
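With a codebase like that, the usual first tests are characterization tests: pin down what the code does today, right or wrong, so later refactoring can't silently change it. A hypothetical sketch (legacy_tax_for is a stand-in for whichever "worst offender" you found, not real code from the project):

```python
import unittest

def legacy_tax_for(amount):
    # Stand-in for an untested legacy function you dare not touch yet.
    return round(amount * 0.2, 2) if amount > 0 else 0

class CharacterizationTest(unittest.TestCase):
    """Records current behavior as-is; failures later mean behavior changed."""

    def test_positive_amount(self):
        self.assertEqual(legacy_tax_for(100.0), 20.0)

    def test_zero_and_negative_return_zero(self):
        # Maybe a bug, maybe intended - either way, pin it down before refactoring.
        self.assertEqual(legacy_tax_for(0), 0)
        self.assertEqual(legacy_tax_for(-5), 0)
```

Run with `python -m unittest`. Once behavior is pinned like this, the refactoring phase described next becomes far less scary.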

Only after those two are completed can you start refactoring. Treat it like TDD. Keep an eye on larger goals like 12-factor conformance. It may look pie-in-the-sky at first, but it will give you ideas on what to focus on. The main advantage of refactoring over ground-up rewriting is that you don't have to sell it to your boss. You just do it, in and around normal project work.

The biggest hurdle is the first step. It's scary to fuck with deployment. The approach I've come up with is to fork the codebase and rebuild the tooling on top of that, deploying first to staging, then to production, alongside the current working copies. Once you're satisfied flip the switch. You may have to flip it back, but at least it will be easy and instantaneous.

These lessons are from my ongoing project to modernize an ancient Rails 2.3.5 website running on Debian Lenny. Linode doesn't even offer that OS anymore, I had to cannibalize a server with an EOL app on it for a staging environment. I can't use Vagrant because there aren't any Lenny boxes.

It's long, arduous and slow. I fucking love it.



What do you do when management won't allow Unit/Automated testing because it takes time away from writing user facing code?


Do it anyway. The thing about testing is that it's a skill you have to work on. You have to know what to test and how. Done well, testing shouldn't affect the speed at which you write code at all.

The reason your boss is saying no is because you had to ask him. The reason you had to ask him is because you know it will take more time than it will save, at least at first.

You have to learn this skill somehow, and the best way to do it is on something that matters rather than with a side project. So write tests at your job, and learn the skill of testing on your employer's time, without their knowledge. Or on off hours if that makes you squeamish. But learn the skill, it's important.

Then, when you refine your testing workflow to the point where it makes more sense to test as you write, don't bother hiding it anymore. When they ask, show them your workflow and how it's not taking up too much time, and list out the benefits of testing. If they tell you to stop anyway, take your new skills and find a new job. You're growing past the ability of your current job to challenge you.


Can you explain or give some links on "12-Factor Conformance"? I did a few quick searches and nothing popped up.


Please don't sign your comments. (http://ycombinator.com/newsguidelines.html)


A company I worked for had a .NET application that was critical to basically all of corporate ticketing and resource management. It would no longer compile as a whole except on one guy's laptop and this didn't seem to bother anyone but me. Instead each file was modified and placed on the server to be compiled at runtime (causing a massive slowdown). It was mind boggling just how little people cared and how critical it was for day to day operations.

Eventually that laptop was destroyed in a bizarre accident (dropped at the airport security checkpoint was the claim) and last I heard they were regularly backing up the directory on the web server and still dropping files in to compile at runtime.

This is what happens when someone writes something and no longer has responsibility to maintain or document it.


Odds are Gates didn't write the complete code, probably more of the blueprint. Also worth noting: the original implementation of FAT was pretty basic/fundamental. Could one even call it an application?


According to Wikipedia:

> The original FAT file system (or FAT structure, as it was called initially) was designed and coded by Marc McDonald,[9] based on a series of discussions between McDonald and Bill Gates.


Yeah, because writing a filesystem was pretty trivial back then.



