If you make your so-called 'unit' tests too closely coupled to the implementation, you don't actually work faster, because you constantly have to refactor them to match every internal change.
Your testing feedback might be quicker, but quickly getting useless information (i.e. "your implementation changed") is just as bad as slower tests, because you need to fix the test, recompile and rerun before you get useful information.
Also, not all 'integration tests' are slow. If you can test against an in-memory SQLite database instead of pointing at a PG instance that needs to be provisioned, for instance, you can get a lot of your end-to-end testing running really quickly.
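As a minimal sketch of that approach, using Python's standard-library sqlite3 module (the schema and queries are illustrative, not from the thread):

```python
import sqlite3

# An in-memory SQLite database: no server to provision, torn down
# automatically when the connection closes, so tests run fast.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("alice",))

rows = conn.execute("SELECT name FROM users ORDER BY name").fetchall()
assert rows == [("alice",)]
conn.close()
```

The same repository code can be pointed at `:memory:` in tests and at the real database in production, as long as you stay within the SQL both dialects share.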
I agree. Unit tests can definitely slow you down. Testing against high-level interfaces or having full integration tests is better. Then a failure is much more likely to be a genuine regression rather than a legitimate change in implementation.
If you have good logging and debugging tools, the failure can be quite quick to pin down. James Coplien suggests liberal use of assertions in code, which is a really good idea and something C/C++ programmers used to do a lot. Assertions can then be as effective as unit tests in terms of localising a failure.
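A small sketch of the idea in Python (the function and its invariants are hypothetical, chosen only to illustrate the style):

```python
def apply_discount(price: float, discount: float) -> float:
    # Liberal assertions on pre- and post-conditions localise a failure
    # to the exact call site, much like a focused unit test would.
    assert price >= 0, f"negative price: {price}"
    assert 0 <= discount <= 1, f"discount out of range: {discount}"
    result = price * (1 - discount)
    assert result <= price, "discount must not raise the price"
    return result

print(apply_discount(100.0, 0.25))  # 75.0
```

When an integration test fails, an assertion firing deep in the call stack tells you where the contract was broken, without a dedicated test per function.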
Try CREATE UNLOGGED TABLE with Postgres. The diversity of RDBMSs and SQL dialects defeats the point of integration-testing the persistence layer with SQLite.
Even this is less clear: for some databases it's normal to use a sequence for row IDs (vs. some sort of autoincrement), and I don't think SQLite offers all the same behavior there.
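The dialect gap is easy to demonstrate: SQLite has no sequence objects at all, so Postgres-style sequence DDL fails outright. A minimal check (sequence name is illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# SQLite uses rowid/AUTOINCREMENT rather than sequences, so this
# perfectly valid Postgres statement is a syntax error here.
try:
    conn.execute("CREATE SEQUENCE user_id_seq")
    failed = False
except sqlite3.OperationalError as e:
    failed = True
    print("SQLite rejects it:", e)
conn.close()
```

Any persistence code relying on `nextval()` or sequence defaults would therefore pass against Postgres and never even run against SQLite, which is the core objection to testing the persistence layer with a substitute engine.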