As you mention, Freebase is indeed bigger than Wikidata: its dump is 22GB compressed (250GB uncompressed), versus 5GB compressed (49GB uncompressed) for Wikidata [1].
That said, I believe the process described in the blog post does not load the whole Wikidata dump into memory, so it should work just as well for processing Freebase or even larger data dumps on a laptop.
From the post:
"How Akka Streams can be used to process the Wikidata dump in parallel and using constant memory with just your laptop."
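For context on the "constant memory" part, here is a minimal sketch of that kind of pipeline in Scala with Akka Streams. This is not the post's actual code: the file name, frame size limit, and the entity-counting logic are assumptions, but it shows why the dump size doesn't matter, since only a bounded number of chunks are in flight at any time.

```scala
import java.nio.file.Paths

import akka.actor.ActorSystem
import akka.stream.scaladsl.{Compression, FileIO, Framing, Sink}
import akka.util.ByteString

// Sketch: count entities in a gzipped, one-JSON-object-per-line dump
// while keeping memory usage constant.
object DumpLineCount extends App {
  implicit val system: ActorSystem = ActorSystem("dump-count") // Akka 2.6+ provides the materializer
  import system.dispatcher

  // Hypothetical local path to the dump file.
  val dumpPath = Paths.get("latest-all.json.gz")

  val counted = FileIO.fromPath(dumpPath)              // read the file in small chunks
    .via(Compression.gunzip())                         // decompress on the fly
    .via(Framing.delimiter(ByteString("\n"),
      maximumFrameLength = 16 * 1024 * 1024,           // a single entity can be several MB of JSON
      allowTruncation = true))
    .map(_.utf8String)
    .filter(_.startsWith("{"))                         // skip the array brackets wrapping the dump
    .runWith(Sink.fold(0L)((count, _) => count + 1))   // only the running count is kept in memory

  counted.foreach { total =>
    println(s"Entities seen: $total")
    system.terminate()
  }
}
```

Backpressure keeps the upstream file reads in step with downstream processing, and per-entity work could be parallelized with something like mapAsync without changing the memory profile.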
[1] https://developers.google.com/freebase/data and http://dumps.wikimedia.org/other/wikidata/