
I'm actually on the opposite side of this. I know HN is more web-focused, so embedded isn't as well represented, but I'm scared every time I try to learn JavaScript or even Rust.

Syntactic sugar complicates code for me. It may help reveal the intent of what the code is doing, but I *have to* know how the code is doing it. Of course you can just learn what the compiler will do in each of those instances, but that's a lot to learn, and it compounds as features interact. In C, on the other hand, you have assignments, conditionals, loops and function calls - nothing else.
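To make it concrete with a quick Python sketch (my own example, nothing specific from above): the sugared form tells you what you get; the explicit loop tells you what is done:

    # Sugared: a list comprehension says what I want to *have*.
    squares = [x * x for x in range(10) if x % 2 == 0]

    # Explicit: roughly the loop the comprehension stands for,
    # spelling out what the interpreter should *do*.
    squares = []
    for x in range(10):
        if x % 2 == 0:
            squares.append(x * x)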

In the end the code will do the same thing, but I'm willing to spend a few more keystrokes and maybe an added comment just for the peace of mind of knowing what is actually happening. (One may say that the assembler output is still a black box, but the general structure of the computation remains intact.)

It may be an outdated attitude, I agree, but it's just something I can't shake off.



Idk, I understand what you're saying, but in my experience you just end up building an intuition for what the sugar is doing, and you can de-sugar it mentally to understand roughly what the code compiles down to. But I can see how, if you're working on HPC or something like that, you would want to make it completely unambiguous.

With Rust I think it can be a bit tricky, because a lot of it is context-dependent, and the compiler will often make a lot of things easier for you until it can't, at which point the error that got you into that situation may be pretty far from the code you just changed. That's why I think syntactic sugar is best when it's quite dumb and local, i.e. X is just another way of writing Y.
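A Python example of that kind of dumb, local sugar (my sketch, not from the parent comment): decorator syntax is (almost) defined to be just another way of writing a reassignment:

    def traced(f):
        def wrapper(*args, **kwargs):
            print("calling", f.__name__)
            return f(*args, **kwargs)
        return wrapper

    # The decorator syntax...
    @traced
    def add(a, b):
        return a + b

    # ...is (almost) just another way of writing:
    #   def add(a, b): return a + b
    #   add = traced(add)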


> a lot of it is context dependent

I think you nailed here what I couldn't put into words. I don't mind context-dependency when it's in the code I'm writing (call tree, variables, flags, etc.), but for some reason when it's context-dependent in the language itself, hidden from me, it feels uncomfortable, because I cannot change it; I have to learn it and then have to debug it.


I've been thinking about it a bit more and I think I've gotten to the bottom of the point I'm trying to make. But again, it's very subtle and unimportant 99% of the time; it's just the way I tend to think about code.

First example (C):

for (i = 0; i < N; i++) { t = data[i]; ... }

Second example (Python):

for t in data: ...

The difference is that in the second example I'm telling the compiler what I want to *have*. It will give me exactly that ('t' will contain consecutive values from 'data') and do its best automagically. In the first example, however, I'm telling the compiler what to *do*, and it will just follow orders (iterate 'i', copy the value from data+i to 't').

An insignificant difference most of the time, but an important one to me. And sugar makes it harder to distinguish what I told the compiler to do vs. what I wanted to have. The distinction helps me with debugging and with keeping the cycle count under control.
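To spell it out, here is roughly what the Python loop "does" under the hood - a simplified sketch of the iterator protocol (my sketch, not exact CPython behavior):

    data = [1, 2, 3]

    # The "have" version:
    for t in data:
        print(t)

    # Roughly the "do" steps hiding underneath:
    it = iter(data)          # ask the container for an iterator
    while True:
        try:
            t = next(it)     # pull the next value
        except StopIteration:
            break            # container exhausted
        print(t)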


do vs. have; Python vs. C; interpreter vs. compiler.

But then who is the doer and who is the haver? C vs. asm; compiler vs. assembler.


Any thoughts on Ruby? As someone who's worked with it a great deal, I find it extremely simple and concise to write in.


I haven't tried Ruby at all, to be honest. If I need to do some quick'n'dirty scratchpad calculations on x86 I use Python, but I find myself still using a 'C-like' approach instead of idiomatic Python. Even 'for x in X:' gives me the heebie-jeebies for some unexplained reason (though I use it). I think it's because I've been conditioned for so many years to think about data/code as just that: data in memory. So every time I need to access something, I think about *where* it is first, not about getting it - that's secondary and, paradoxically, unimportant. I'll use it in the next stage/line, but by then I already have it (I don't know if that makes any sense at all).
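To illustrate the two mindsets in Python itself (my own sketch): the first loop asks *where* the value lives; the second just asks to *have* it, with enumerate() covering the case where I still want the address:

    data = [3, 1, 4, 1, 5]

    # "Where is it?" first - the C-conditioned style:
    for i in range(len(data)):
        t = data[i]
        print(i, t)

    # "Give it to me" first - the idiomatic style; enumerate()
    # keeps the index around when I still care where it lives:
    for i, t in enumerate(data):
        print(i, t)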

It makes no sense to do this in most circumstances, and I think the downvotes I'm getting reflect that (I'm not preaching it or anything, just describing my point of view), but if you have a hard deadline of X cycles, you think about code differently.


I totally understand. Having recently transitioned to embedded C/C++, I can tell you that the way I think about memory and memory access in this world is very different from how I thought about it in Python for over a decade. I really like iterators; they help eliminate a whole class of bugs. But after the last couple of years, I'm not sure I'd be so comfortable with them if I'd been counting cycles for most of my career as well.


> the way I think about memory and memory access in this world is very different from how I thought about it in Python for over a decade

Would you be able to shed some more light on this in the opposite direction? I would appreciate it, because I'm moving into a higher role and would like to develop tools my team can use to streamline development, but I can't implement them myself. I'm very anxious about taking over the tools team, because my brain is wired in a very different way: I know what they need, but not how it should be written. And what I think is important is probably old news there, so I'd just be slowing things down unnecessarily. Any advice?


If I understand you correctly, the main difference is to stop thinking in terms of bits and bytes and think more in terms of amorphous data. Like, 'for x in X:' is really more appropriately expressed as 'for value in values:'.

'values' is just a container, and we'd like to grab what it contains, one item at a time, to do something with it (I'm assuming a native English speaker here; other languages' pluralization might not map as naturally onto this convention). It might even make sense to start with notation like 'for value in valueList:' to help internalize the idea, but that can become a crutch. Because of the duck-typed nature of Python, 'values' might be a list today, but the same code can easily handle any iterable. (I've done enough maintenance programming where it was clear some variable had never been renamed: it was once a list and was now something else. Type annotations on newer codebases help here.)

The other thing to realize is that heterogeneous collections are one of the things Python excels at, and something likely to make your head hurt in C (I'm not even sure how I'd implement mixed-type arrays in C without doing something gross with void*). I'm not saying you necessarily want to mix types, since it can often lead to a bad time, but it still helps to think about just how amorphous the data can be.
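A small sketch of both points (illustrative names, nothing from a real codebase): the same function handles any iterable, including heterogeneous ones, because it never asks what it was given:

    def total_len(values):
        # Never asks what 'values' is; anything iterable whose
        # items support len() will do - that's duck typing.
        return sum(len(v) for v in values)

    print(total_len(["abc", "de"]))             # a list of strings
    print(total_len(("abc", [1, 2], {3: 4})))   # a heterogeneous tuple
    print(total_len(s.split() for s in ["a b", "c"]))  # even a generator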

Another thing that will feel really gross: Python is going to be orders of magnitude slower than you're used to from C (with similar memory bloat). Just get over that, and know that if you absolutely need to speed up parts of the code, you can write those parts in C fairly easily.
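For instance, the standard library's ctypes can load a shared library directly; 'libhot.so' and 'sum_array' below are hypothetical names for a hot loop you've rewritten in C:

    import ctypes

    # Hypothetical: the hot loop was rewritten in C and compiled
    # to libhot.so, exporting: long sum_array(long *a, size_t n);
    lib = ctypes.CDLL("./libhot.so")
    lib.sum_array.restype = ctypes.c_long
    lib.sum_array.argtypes = [ctypes.POINTER(ctypes.c_long), ctypes.c_size_t]

    data = (ctypes.c_long * 4)(1, 2, 3, 4)   # a C array built from Python
    print(lib.sum_array(data, len(data)))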

Anyways, good luck.


If you're used to writing embedded code then I'm not surprised - you want to be able to fully introspect everything that is happening, since understanding performance is so important.

Syntactic sugar is often better in high-level languages because it provides a level of indirection and allows for more flexibility in API design.
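For example (a sketch of my own, not tied to any library): Python's for-loop sugar dispatches through __iter__, so a container can change its internal layout without breaking any call site:

    class Samples:
        def __init__(self):
            # Internal layout is hidden behind the iteration sugar;
            # switching to a flat list or a file wouldn't touch callers.
            self._chunks = [[1, 2], [3, 4]]

        def __iter__(self):
            for chunk in self._chunks:
                yield from chunk

    for s in Samples():   # callers only ever see 'for s in ...'
        print(s)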


It complicates code until it makes it easier, sometimes.



