You're getting a lot of pushback in the comments here and I don't understand why. This is exactly right. Stay consistent with the existing way or commit to changing it all (not necessarily all at once) so it's consistent again, but better.
How fast they are depends on the difficulty of the sudoku and on your machine, but it's typically a few thousand puzzles/s on the hardest known puzzles and several hundred thousand to a million puzzles/s on very easy ones.
I'm the author of Tdoku and maintainer of those benchmarks.
If anyone has a well-optimized DLX implementation I'd love to add it to the comparison. The few I've sampled from GitHub were not competitive, but I have no idea how much effort went into them.
Seems pretty competitive to me if the third fastest in your benchmark uses dancing links.
I do think it's probably true that dancing links won't be literally the fastest, but the neat thing is that it's a generic algorithm that solves ALL exact cover problems, not just Sudoku. Of course a tuned Sudoku solver could probably beat it (optimizing around the Sudoku-specific parts), but it's nice to see that it's at least in the running.
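For anyone who hasn't seen the reduction: each possible placement (row r, column c, digit d) becomes one row of the exact cover matrix, and it covers exactly four of the 324 constraint columns. Here's a minimal sketch of that mapping (the column numbering is my own arbitrary choice for illustration, not taken from Tdoku or any particular DLX solver):

    #include <array>
    #include <cstdio>

    // Map one Sudoku choice (r, c, d), all in 0..8, to the four exact cover
    // columns it satisfies. 324 columns total: 81 cell + 81 row + 81 col + 81 box.
    std::array<int, 4> cover_columns(int r, int c, int d) {
        int box = (r / 3) * 3 + (c / 3);
        return {
            0   + r * 9 + c,    // cell (r,c) is filled
            81  + r * 9 + d,    // row r contains digit d+1
            162 + c * 9 + d,    // column c contains digit d+1
            243 + box * 9 + d,  // box contains digit d+1
        };
    }

    int main() {
        // Example: placing a 5 (d=4) at row 0, column 2 covers these columns.
        for (int col : cover_columns(0, 2, 4)) std::printf("%d ", col);
        std::printf("\n");  // prints: 2 85 184 247
    }

A DLX solver then just searches for a set of 81 such rows that covers every column exactly once; nothing in the algorithm itself knows it's solving Sudoku.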
Bear in mind that JSolve was 5x faster than the fastest DLX solver in the attractivechaos 2011 benchmarks, and today there are several solvers that are 2x faster than JSolve (https://github.com/t-dillon/tdoku/tree/master/benchmarks). If the difference is really an order of magnitude, that's pretty non-competitive.
I certainly don't mean to disparage DLX. It's a powerful and general algorithm. But if you need a fast Sudoku solver (for puzzle mining or studying Sudoku research questions) then it's not the tool I'd reach for. The backtracking solvers just get a lot of mileage out of tuning their representation to play nicely with hardware. They also blow general-purpose CDCL SAT solvers out of the water, at least for 9x9 Sudoku.
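To make "tuning the representation" concrete, here's the general flavor (a minimal sketch of the common bitmask approach, not Tdoku's or JSolve's actual code): keep a 9-bit candidate mask per row, column, and box, so finding a cell's candidates is a couple of ANDs and picking the most constrained cell is a popcount.

    #include <bit>
    #include <cstdint>

    // Bit d set means digit d+1 is still available in that unit.
    struct Grid {
        uint16_t rows[9], cols[9], boxes[9];

        Grid() {
            for (int i = 0; i < 9; ++i) rows[i] = cols[i] = boxes[i] = 0x1FF;
        }
        uint16_t candidates(int r, int c) const {
            return rows[r] & cols[c] & boxes[(r / 3) * 3 + c / 3];
        }
        void place(int r, int c, int d) {           // d in 0..8
            uint16_t bit = uint16_t(1u << d);
            rows[r] &= ~bit;
            cols[c] &= ~bit;
            boxes[(r / 3) * 3 + c / 3] &= ~bit;
        }
        int branching_factor(int r, int c) const {  // fewer set bits = better cell to branch on
            return std::popcount(candidates(r, c));
        }
    };

Everything stays in registers, branching decisions come from popcount/ctz instructions, and there's no pointer-chasing through a linked node structure like DLX does, which is a big part of why the tuned backtrackers win on real hardware.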