
Yes. It will truncate the file first, then allocate new blocks for the new random data, likely leaving the old blocks lying around.

You can avoid this with the 'conv=notrunc' option of 'dd'[1]. It will overwrite existing blocks instead of truncating (and possibly reallocating):

    notrunc  Do not truncate the output file.  This will preserve
             any blocks in the output file not explicitly written by
             dd.  The notrunc value is not supported for tapes.
[1]:https://developer.apple.com/library/mac/documentation/Darwin...
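To make that concrete, here is a minimal sketch (the file name and size are illustrative, not from the man page), using BSD dd's lowercase size suffix as on macOS:

    # Write 1 MB of random data over the file's existing blocks.
    # Without conv=notrunc, dd would truncate secrets.txt first and
    # the filesystem could hand back different blocks for the new data.
    dd if=/dev/urandom of=secrets.txt bs=1m count=1 conv=notrunc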


Do you mean that the system will try to compress the gigabytes of random data?


No, I mean it will store the new random data in different blocks, thus not overwriting the old blocks.


Even if the whole drive is 'empty'?


I don't know whether a nearly empty drive is better or worse than average. But the point is this: you had a 1000-block file full of secret data, and when you do

    > secrets.txt
the file is truncated, freeing blocks 1-999 (usually block 0 is zero-filled). If you proceed to write random data, it will go to newly allocated blocks, and a raw read of the original blocks will still expose your secret data.

With dd and notrunc, the random data goes to the original blocks, overwriting your secrets.
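Putting it together, a hedged sketch assuming a macOS/BSD userland (stat -f %z prints the file size in bytes; the file name is illustrative):

    # Overwrite exactly the file's current length, in place.
    size=$(stat -f %z secrets.txt)   # file size in bytes (BSD stat)
    dd if=/dev/urandom of=secrets.txt bs="$size" count=1 conv=notrunc
    # Note: a single read from /dev/urandom may come up short for very
    # large files; loop with a smaller bs if that matters.

Even then, this only reliably scrubs the original blocks on filesystems that update files in place; treat it as best effort, not a guarantee.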



