Yes. It will truncate the file first, then allocate new blocks for the new random data, likely leaving the old blocks lying around.
You can avoid this with the 'conv=notrunc' option of 'dd'[1]. It will overwrite existing blocks instead of truncating (and possibly reallocating):
    notrunc     Do not truncate the output file.  This will preserve any
                blocks in the output file not explicitly written by dd.
                The notrunc value is not supported for tapes.
I don't know whether a nearly-empty drive is better or worse than average. But the point is: you had a 1000-block file full of secret data, and when you do
> secrets.txt
the file is truncated, freeing blocks 1-999 (usually block 0 is zero-filled). If you proceed to write random data, it will go to newly allocated blocks. Then a raw read of the original blocks will expose your secret data.
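Concretely, the dangerous sequence looks roughly like this (commands and sizes are illustrative; a 4 KiB block size is assumed to match the 1000-block example):

> secrets.txt                                           # truncate: the original blocks are freed
dd if=/dev/urandom of=secrets.txt bs=4096 count=1000    # dd truncates again and writes to newly allocated blocks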
With dd and notrunc, the random data goes to the original blocks, overwriting your secrets.
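A sketch of the in-place version (same illustrative sizes; bs*count should be at least the file's size, and if it's larger the file simply grows rather than being truncated):

dd if=/dev/urandom of=secrets.txt bs=4096 count=1000 conv=notrunc    # overwrites the blocks the file already occupies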
[1]: https://developer.apple.com/library/mac/documentation/Darwin...