
>In general, anything added to a language that isn't a Lisp to support generics or to hide pointers and memory management from developers has produced more harm than good.

Why should I care or need to worry about pointers and memory management in every part of my code?

Yeah, as a C# developer I need to be aware of whether I'm passing a variable by value or by reference... but I don't want to, and shouldn't need to, spell this out all the time. I know that the built-in value types (int, double, etc.) are passed by value, and everything else, including string, which is an immutable reference type, is passed by reference. So it's basically a non-issue.

I grew up learning C/C++ and having to deal with all the crap. Why oh why do I want to handle all of this myself?
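
For anyone who hasn't lived it, here's a minimal C sketch of the bookkeeping being talked about; the Player struct and the function names are made up for illustration. The caller has to decide at every call site whether to pass a copy or a pointer, and has to free whatever it allocates:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    typedef struct {
        char name[32];
        int score;
    } Player;

    /* By value: the function works on a copy; the caller's struct is untouched. */
    static void bump_copy(Player p) {
        p.score += 1;  /* lost when the copy goes out of scope */
    }

    /* By pointer: the function mutates the caller's struct through the pointer. */
    static void bump_in_place(Player *p) {
        p->score += 1;
    }

    int main(void) {
        Player *p = malloc(sizeof *p);  /* the caller owns this allocation */
        if (p == NULL)
            return 1;
        strcpy(p->name, "alice");
        p->score = 0;

        bump_copy(*p);     /* p->score is still 0 */
        bump_in_place(p);  /* p->score is now 1 */
        printf("%s: %d\n", p->name, p->score);

        free(p);  /* forget this and you leak; do it twice and you corrupt the heap */
        return 0;
    }

In C# the type system makes that call for you (value types copy, reference types don't) and the GC handles the free.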

I just want to code and focus on getting things done... C#, for example, allows me to do that.

Your saying these things have "produced more harm than good" so obviously comes from a more academic or purist standpoint, in the sense of "oh my god, he doesn't even realize those 8 bytes are going to be held up until the garbage collector comes around, whereas I could deallocate that precious memory right away!" Sorry, but almost no one cares. Yeah, there are times when you need to care, but for most developers that time is... never.

The elitist attitudes on HN are astounding sometimes. Heaven forbid someone code without also managing every aspect of the underlying hardware!



How do you think the luxuries you were provided came to exist in the first place?

If you are operating in the managed usermode layer of an operating system, then what you say is perfectly valid, especially for software without hard latency requirements, such as a typical web service. You can program in a managed environment and trust your language's JIT and GC to handle the low-level optimization.

On embedded systems, in software with hard latency requirements (video games, for example), or in kernel-mode drivers and firmware, you are not always allowed that luxury. Understanding, and tightly controlling, how memory is allocated and moved around the system becomes crucial. Sometimes it's the hardware limitations of the device that force this; in the case of firmware or drivers, overhead that would be acceptable in a usermode application is felt throughout the entire environment when you're working at that level.
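
To make that concrete, here's a minimal sketch of the kind of fixed-pool allocator embedded code often uses instead of a general-purpose heap; the sizes and names are illustrative, not from any real codebase. The arena is reserved up front, so allocation is O(1), never fragments, and never waits on a collector:

    #include <stddef.h>
    #include <stdint.h>

    /* Illustrative sizes: the entire arena is reserved at build time. */
    #define BLOCK_SIZE  64
    #define BLOCK_COUNT 32

    static uint8_t arena[BLOCK_COUNT][BLOCK_SIZE];
    static void   *free_list[BLOCK_COUNT];
    static size_t  free_top;

    /* Fill the free stack with every block in the arena. */
    void pool_init(void) {
        for (size_t i = 0; i < BLOCK_COUNT; i++)
            free_list[i] = arena[i];
        free_top = BLOCK_COUNT;
    }

    /* O(1), no fragmentation, no pause: pop a block, or NULL if exhausted. */
    void *pool_alloc(void) {
        return free_top ? free_list[--free_top] : NULL;
    }

    /* O(1): push the block back. Must be a pointer returned by pool_alloc(). */
    void pool_free(void *block) {
        free_list[free_top++] = block;
    }

The code isn't clever, and that's the point: every byte's lifetime is a decision the programmer owns, which is exactly the discipline a managed usermode environment lets you skip.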

TL;DR: you and the parent post are comparing apples and oranges.



