I haven't tried Ruby at all, to be honest. If I need to do some quick'n'dirty scratchpad calculations on x86 I use Python, but I find myself still taking a 'C-like' approach instead of writing idiomatic Python. I.e. even 'for x in X:' gives me the heebie-jeebies for some unexplained reason (though I use it). I think it's because I've been conditioned for so many years to think about data/code as just that, data in memory. So every time I need to access something I'm thinking *where* it is first, not about getting it - that's secondary and, paradoxically, unimportant. I'll use it in the next stage/line, but by then I already have it (I don't know if that makes any sense at all).
It makes no sense to do it in most circumstances, and I think the downvotes I'm getting reflect that (not that I'm preaching it or anything, just describing my point of view), but if you have a hard deadline of X cycles you think about code differently.
I totally understand. Having recently transitioned to embedded C/C++, I can tell you that the way I think about memory and memory access in this world is very different than my thinking about it in python for over a decade. I really like iterators, they help eliminate a whole class of bugs, but after the last couple of years, I'm not sure I'd be so comfortable with them if I'd been counting cycles for most of my career as well.
> I can tell you that the way I think about memory and memory access in this world is very different than my thinking about it in python for over a decade
Would you be able to shed some more light on this in the opposite direction? I'd appreciate it because I'm moving into a higher role where I'd like to build tools my team can use to streamline development, but I can't implement them myself. I'm quite anxious about taking over the tools team because my brain is wired in a very different way: I know what they need but not how it should be written, and what I consider important is probably last year's snow there, so I'll just be slowing things down unnecessarily. Any advice?
If I understand you correctly, the main difference is to stop thinking in terms of bits and bytes and start thinking in terms of amorphous data. E.g. 'for x in X:' is really more appropriately expressed as 'for value in values:'.
Values is just a container, and we'd like to grab what it contains one item at a time to do something with it (I'm assuming native English speakers here; other languages' pluralization might not map as naturally onto this convention). It might even help to start with notation like 'for value in valueList:' to internalize the idea, but that can become a crutch. Because of Python's duck-typing nature, values might be a list today, but the same code can easily handle any iterable (I've done enough maintenance programming where it was clear some variable had never been renamed: it was once a list, and was now something else. Type annotations on newer codebases help here).

The other thing to realize is that heterogeneous collections are one of the things Python excels at, and something that is likely to make your head hurt in C (I'm not even sure how I'd implement mixed-type arrays in C without doing something gross with void*). I'm not saying you necessarily want to mix types, since it can often lead to a bad time, but it still helps to think about just how amorphous the data can be.
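To make that concrete, here's a tiny sketch (the function name is mine, just for illustration): the same loop handles a list, a tuple, or a generator, and the list can freely mix types.

```python
def type_names(values):
    # Works on any iterable thanks to duck typing:
    # lists, tuples, generators, dict keys, file objects...
    return [type(v).__name__ for v in values]

# A heterogeneous collection -- mixing types is natural in Python
mixed = [1, "two", 3.0, [4]]
print(type_names(mixed))                    # same code, four different types
print(type_names(x * x for x in range(3)))  # a generator works just as well
```

Nothing in type_names cares where the data lives or what container it's in; it only asks the argument to be iterable.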
Another thing that will feel really gross is that Python is going to be orders of magnitude slower than what you're used to in C (with similar memory bloat). Just get over that, and know that if you absolutely need to speed up parts of the code, you can write those parts in C fairly easily.
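As a minimal sketch of that escape hatch: the lightest-weight mechanism is ctypes, which can call into an existing shared library without writing an extension module. Here I'm just calling libm's sqrt to show the plumbing (this assumes a platform where find_library can locate the math library); for real hot spots you'd compile your own C, or reach for Cython or NumPy instead.

```python
import ctypes
import ctypes.util

# Locate and load the C math library (name varies by platform:
# libm.so.6 on Linux, libm.dylib on macOS)
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# ctypes defaults to int for arguments and return values,
# so declare the real C signature: double sqrt(double)
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(2.0))
```

The argtypes/restype declarations are the part people forget; without them ctypes silently truncates doubles to ints and you get garbage back.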