Author here. The language is the part I dread the most; it's also the part that would be the least unique and the least interesting to re-invent. I'll look into finding an existing language that runs on bare metal and has (or supports) immutable values.
There were dataflow languages like VAL and SISAL that did exactly that. The hardware and the software on top were built in the 70s and 80s, but funding for that kind of parallelism research largely dried up after the cold war ended.
There is a resurgence of dataflow-related research, which you would probably be interested in. For example, Jack Dennis is pushing the 'Fresh Breeze' architecture: a CPU with write-once memory and in-hardware GC. This should be right up your alley:
Funny sidenote: one of the House devs (Jérémy Bobbio) gave a talk at FOSDEM 2014 about reproducible builds for Debian, similar to Nix (also immutable in spirit), I believe.
The current Haskell execution engine is very imperative indeed.
Instead of graph reduction, you could drive execution in a completely immutable way using string reduction. I don't think it has been explored much; I've only found mentions of it in 80s FP books (and in Dybvig's thesis).
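To make the contrast concrete, here's a toy sketch of string (i.e. copying) reduction on SKI combinator terms. Every step builds a brand-new term rather than updating a shared graph node in place; note how the S rule duplicates its argument instead of sharing it. The encoding (tuples, `app`, `step`, `normalize`) is my own illustration, not taken from any of those books.

```python
# Toy string reduction for SKI combinators.
# Terms are immutable tuples: ('S',), ('K',), ('I',), or ('app', f, x).

def app(f, x):
    return ('app', f, x)

S, K, I = ('S',), ('K',), ('I',)

def step(t):
    """One leftmost-outermost reduction step. Returns a freshly built
    term (no sharing, no in-place update), or None if t is in normal form."""
    if t[0] == 'app':
        _, f, x = t
        if f == I:                                      # I x -> x
            return x
        if f[0] == 'app' and f[1] == K:                 # K a b -> a
            return f[2]
        if f[0] == 'app' and f[1][0] == 'app' and f[1][1] == S:
            a, b, c = f[1][2], f[2], x                  # S a b c -> a c (b c)
            return app(app(a, c), app(b, c))            # c is copied, not shared
        r = step(f)                                     # otherwise reduce a subterm
        if r is not None:
            return app(r, x)
        r = step(x)
        if r is not None:
            return app(f, r)
    return None

def normalize(t):
    while (r := step(t)) is not None:
        t = r
    return t

# S K K behaves like the identity, so S K K I reduces to I:
print(normalize(app(app(app(S, K), K), I)))  # -> ('I',)
```

The trade-off this sketch makes visible: graph reduction shares the duplicated argument of S so it is only evaluated once, while string reduction copies it, paying in work and space for the benefit of never mutating anything.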
What about D 2.0? I haven't used it, but I think they made everything const by default. You might have to write your own collections library, though; not sure.