I started reading about D a couple of months ago after work. For me, it appeared easier than, for example, Scala, which I use at work. Not only is it simpler, it is faster, and I am not even talking about compile times. These features alone persuaded me to rewrite a couple of the Scala-based data processing algorithms we use for our ML models to give it a try. What I really liked is that you don't have to know everything about the language to write performant code. I still keep wondering how it gets constantly overlooked.
> I still keep wondering how it gets constantly overlooked.
Because you keep thinking in technological terms. Languages have come and gone; those that stayed are either old enough to have a midlife crisis or are backed by at least one big company. For all the merits D has, there is no big name willing to push it forward, even though its creator now works at Facebook and does some internal work with it: Facebook hasn't expressed any interest in showing this off.
It really is a shame, because D has a lot of things that might interest programmers; all it takes is a little more presence.
This seems to be a growing niche where D is gaining popularity. I see more and more people turning to D where they would normally use Python or R for ad hoc data crunching.
We are using D for high-performance computational biology. I brought a Go dev and a Python dev onboard, both with a minimal learning curve. Even the powerful template metaprogramming seemed really easy for the team to pick up. I believe this is because it really has a nice design (compared with, say, C++, which I would never, ever use in general bioinformatics unless the team had specific past expertise).
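For anyone curious what that looks like, here is a minimal sketch of D's template metaprogramming (a toy example I made up, not code from that team; the `mean` function is purely illustrative):

```d
import std.stdio;
import std.traits : isNumeric;

// A generic mean over any numeric element type. The constraint is checked
// at compile time, so misuse is rejected before the program ever runs.
double mean(T)(T[] values) if (isNumeric!T)
{
    double total = 0;
    foreach (v; values)
        total += v;
    return values.length ? total / values.length : double.nan;
}

void main()
{
    writeln(mean([1, 2, 3, 4]));     // works with ints
    writeln(mean([0.5, 1.5, 2.5]));  // and with doubles, same template
}
```

The same template instantiates for any numeric type, and the error messages for a bad instantiation point at the constraint rather than deep inside the function body, which is a big part of why it feels approachable.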
I also use D for data & ML at work (Netflix) when the standard Python workflow doesn't cut it anymore. There's only so far you can go with the typical "call a popular Python lib interfacing with a C++ backend" paradigm before hitting a brick wall.
As one of those people, I can say the reason is the C-level processing power with approachable syntactic sugar. Of all the languages gaining popularity for their ability to run natively (Rust, Go, D, etc.), I've felt that D is the most natural continuation of C/C++, at least syntactically. I haven't touched the language in about a year, but I'm personally rooting for it.
And with D as Better C (the -betterC switch), it's easy to convert your C project to D, one file at a time, without requiring the D runtime library (only the standard C library is required).
The original impetus for this was so I could incrementally convert the D compiler backend itself from "C with Classes" to D.
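For anyone curious, here is a minimal sketch of what a -betterC module can look like (a made-up hello world, not code from the compiler backend work):

```d
// Build, for example, with: dmd -betterC hello.d
// No D runtime is linked, so there is no GC and no writeln;
// only the C standard library is available.
import core.stdc.stdio : printf;

extern (C) int main()
{
    printf("hello from betterC\n");
    return 0;
}
```

Because the resulting object depends only on libc, a C codebase can be migrated one translation unit at a time without changing how the final binary is linked.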
I try to use D for that (difficult because the libraries are fairly low level, I would say sparse, but calling into C and C++ is pretty trivial) because, for the same effort I put into bullying Python into doing what I want, I can write type-safe (generic) code that is both actually readable and ready to be reused if need be. And sometimes orders of magnitude faster.
That, and ranges (D's and Andrei's preferred model for iteration) are excellent, in my view at least.
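For those who haven't seen ranges, here is a toy sketch of a lazy range pipeline (my own made-up example, not from production code):

```d
import std.stdio;
import std.algorithm : filter, map, sum;
import std.range : iota;

void main()
{
    // Sum of squares of the even numbers below 100, computed lazily:
    // nothing is materialized until sum walks the chain.
    auto result = iota(100)
        .filter!(n => n % 2 == 0)
        .map!(n => n * n)
        .sum;
    writeln(result);
}
```

Each stage is a small, composable, type-checked value rather than an eager collection, which is exactly what makes this style pleasant for ad hoc data crunching.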
eBay's tsv-utils author here. A fair bit of performance benchmarking was done on the tools with the goal of exploring this type of use (data crunching). D did really well (tsv-utils are fast!). There's more info on the benchmarks page in the GitHub repo. Perhaps the best summary is the slides from a talk I gave at DConf 2018. Links:
It doesn't even need to be an alternative to Python or R, since it's really easy to interoperate with both languages. This is in addition to tools like eBay's tsv-utils.
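As one sketch of the interop route (a made-up example; the function name, file names, and build flags are just for illustration), you can expose a plain C-ABI function from D, build it as a shared library, and call it from Python:

```d
// dot.d
// One way to build on Linux: dmd -betterC -shared -fPIC -of=libdot.so dot.d
// Using -betterC here means no D runtime needs to be initialized by the host.
extern (C) export double dot(const(double)* a, const(double)* b, size_t n)
{
    // Plain dot product over two C-style arrays of length n.
    double total = 0;
    foreach (i; 0 .. n)
        total += a[i] * b[i];
    return total;
}
```

On the Python side, `ctypes.CDLL("./libdot.so")` can load the library and call `dot` directly once the argument and return types are set; wrapper libraries offer tighter integration, but the ctypes route needs nothing beyond the standard library.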