The latest iteration on the security front uses ECDH for key exchange (LE Secure Connections) and seems fairly robust. The legacy pairing implementation is vulnerable to MITM during the very first bonding, except where the devices use out-of-band data like NFC. Neither Android nor iOS ever opted to implement OOB, which made security more difficult. It required us to tell our customers to reduce output power during bonding, so that the devices had to be close enough together to avoid sniffing.
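The shared-secret idea behind that key exchange can be sketched in a few lines. This is a toy finite-field Diffie-Hellman using a small Mersenne prime for brevity, purely to illustrate the principle; LE Secure Connections actually runs ECDH on the NIST P-256 curve, and these parameters are not secure:

```python
# Toy Diffie-Hellman: both sides derive the same secret while only
# public values ever cross the air. Parameters are illustrative only.
import secrets

P = 2**127 - 1  # a Mersenne prime; far too small for real use
G = 3

def keypair():
    priv = secrets.randbelow(P - 2) + 1  # random private exponent
    return priv, pow(G, priv, P)         # public value = G^priv mod P

# Each device generates a keypair and transmits only its public value.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

# Both sides combine their private key with the peer's public value
# and arrive at the same shared secret.
shared_a = pow(b_pub, a_priv, P)
shared_b = pow(a_pub, b_priv, P)
assert shared_a == shared_b
```

A passive sniffer sees only `a_pub` and `b_pub`, which is why Secure Connections resists the eavesdropping attacks that break legacy pairing.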
I felt that for the Low Energy part, the security concerns in this article were quite outdated. None of the listed attacks are applicable for LE.
Other than that, I think this gave a very good introduction to the protocol on all layers. I think the future for Bluetooth will be its ability to hook up lots of cheap sensors to a hub (optionally with internet access) that can work for years without changing the battery. Unfortunately, the companies that already have a market share in, e.g., audio are trying to stall future advances on the LE front. Others are trying to basically reimplement BR/EDR in LE, thinking it will still stay "low energy".
Why hasn't Bluetooth taken off more for wireless keyboards and mice? Bluetooth has been very common for a long time as a built-in on laptops, but good luck finding any of these peripherals that don't require a dedicated little USB receiver.
Is it a licensing/certification cost, or something more nebulous?
Bluetooth keyboards/mice need drivers and don't work at the BIOS level, whereas Logitech's USB receivers, for example, can be used in the BIOS and don't need additional drivers in the OS for basic functions.
I've always wondered how Apple managed to solve this. You can use a Bluetooth keyboard to control the pre-boot options (like recovery mode, single user mode, etc.)
By owning the BIOS/EFI layer in their devices. But if I remember correctly, you can't use a Bluetooth keyboard when entering your FileVault password. One of the many reasons why I never turn off my machine as I use it clamshell 99.9% of the time.
As usual, the entrenched PC manufacturers just don't give a crap. AMI put out a press release in September 2015 saying they're finally supporting Bluetooth in UEFI [1]. The need for drivers, the apparent lack of compatibility with standard device interfaces, and generally low software quality have made Bluetooth a pain on standard PC hardware.
It doesn't have to be that way, for example Apple's systems have supported wireless keyboards and wireless mice in pre-boot forever.
I always dreaded Bluetooth. To my mind it was always overengineered, fragile, ad hoc, a moving target. Sometimes the BT stack feels the same as crapware. Very subjective remark; I may be unlucky and/or dumb.
You're not alone here. I've recently been working on Bluetooth-related tech and was really surprised by the lack of native support under Windows 10. Had to reinstall my system at some point due to a stack driver mess.
"Nobody" (not enough people) uses Bluetooth, so bugs are not found in a timely manner and no one wants to invest a lot of time making it good. Which means that Bluetooth is horribly broken across all devices, which means that "nobody" uses it...
You can certify, and use the Bluetooth mark, based on assertion and documentation.
Bluetooth SIG is also not very good about going after companies misusing the mark. This is why you can buy a million brands of $10 OBD-II dongles from Amazon that all show up with the exact same device ID - and they have the Bluetooth mark molded right into the plastic.
It's actually the same issue with low-end mice and keyboards on USB [0].
Also, I tried a wireless keyboard once and there was a perceptible delay between me pressing the key and the keypress showing up on the screen, which is really unacceptable.
That's certainly not a bluetooth problem, but some other hardware/software along the chain. I've been using a mix of bluetooth and wired peripherals for years and never been able to tell the difference.
It's very hard to find data, but it is possible. One huge problem is that most of the folk explanations or broscience-level discussions insist on irrelevancies like long-term bandwidth, refuse to debate transactional latency, and insist on not discussing the varying deviation of latency effects. Stereotypically the discussion devolves into a debate over the exact WPM it's possible to achieve on a five-minute averaged typing test being 10 to 100 times faster than the fastest human typist, even on gaming forums specializing in hand/eye-coordination-intensive games; it's like a meme strange attractor.
Here's an ancient study of overall systemic latency from hitting the switch to processing code for PS/2 vs USB.
In summary, the PS/2 latency was only single-digit ms and, critically, the deviation of the data was very low, while USB was around 20 to 30 ms and varied more. I know Bluetooth is far worse and far more variable, although I don't have a convenient link handy. It seems fairly intuitive that direct hardware I/O will be better than embedding it in a packetized LAN, which will be better than running a packetized LAN over a radio link.
The problem will never appear in bulk data entry or casual text editing, but it's an immense problem for e-sports or pretty much anything relying on hand-eye coordination (maybe CAD?). As a simplistic example of the scale of the problem, 10 ms doesn't sound like much, but a fastball travels about half a meter in 10 ms... Training yourself via hand-eye coordination to hit the space bar to swing a bat using one protocol and then switching to a different protocol would almost certainly result in a 100% failure rate, and even worse, competitors who cross protocols would almost certainly have an unfair advantage on the lower-latency, more predictable PS/2. Using the modern definition where the only true form of "gaming" is FPS sequels, I would imagine it would be nearly impossible to train on moving or leaping targets using one keyboard protocol and then switch to another. Also, I'd anticipate that some game strategies that rely on reliably consistent latency would be possible only on PS/2 and not on Bluetooth, due to inherent latency and occasional interference issues.
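The baseball figure is easy to sanity-check. A back-of-envelope sketch, assuming a hard fastball of roughly 100 mph (the exact pitch speed is my assumption, not from the original study):

```python
# How far does a ball moving at a given speed travel during input lag?
MPH_TO_MS = 0.44704  # miles per hour -> metres per second

def distance_during_latency(speed_mph: float, latency_ms: float) -> float:
    """Metres travelled at speed_mph during latency_ms of lag."""
    return speed_mph * MPH_TO_MS * latency_ms / 1000.0

# Roughly the PS/2, USB, and (plausibly) Bluetooth latency regimes.
for latency in (1, 10, 30):
    print(f"{latency:3d} ms of lag -> ball moves "
          f"{distance_during_latency(100, latency):.2f} m")
```

At 10 ms the ball has moved about 0.45 m, which matches the half-meter claim; at a 30 ms USB-ish latency it's well over a meter.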
I'd go so far as to theorize that possibly one reason console gamers are crushed when competing against PC gamers on otherwise level playing fields might have something to do with high and variable latency wireless console controllers vs very low latency predictable PC controllers.
The effect is probably not huge, but over a very large number of transactions it can eventually add up to be a real problem. It's similar to the question of whether you would accept a cryptographic random source that was 1% biased. It's possible to come up with numerous scenarios where it won't matter "on very long term average" and so on, but...
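A quick sketch of why a small bias compounds: a coin that is only 1% biased toward heads drifts measurably and systematically away from fair over enough flips (the 51/49 split and the sample size here are my illustrative choices):

```python
# Simulate a 1%-biased coin and count the excess over a fair coin's
# expectation. The drift grows with the number of trials, so on a long
# enough run the bias is plainly visible despite being tiny per flip.
import random

random.seed(42)  # fixed seed so the run is reproducible

def biased_flip() -> bool:
    return random.random() < 0.51  # 51% heads instead of 50%

n = 1_000_000
heads = sum(biased_flip() for _ in range(n))
excess = heads - n // 2
print(f"{heads} heads out of {n}: {excess} more than a fair coin expects")
```

The expected excess is about 10,000 heads, while random fluctuation for a fair coin at this sample size is only on the order of 500, so the bias dwarfs the noise.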
> Otherwise we'd all have to take a radio operator course before enabling the Wi-Fi or Bluetooth functions on our smartphones, or even to turn on our microwave ovens.
Not really. I've got a license for GMRS, but the "test" just consisted of sending the FCC sixty bucks.