Let me know when Wifi gets to the point where I don't need 4 mesh nodes in my 700sqft 180 year old house just to get 50mbps in my bedroom out of my gigabit line at the front of the house.
That is probably never going to happen. Your problem is not WiFi but the environment where you are using it.
We also have an old house, with 1m thick stone walls. The stone works as a good material for building a Faraday cage, so we have wifi only in the rooms that have an AP. But the APs themselves are not connected wirelessly, that would be futile. Where it was possible, we ran ethernet cable, and where it was not practical, powerline fills the gap. You can get pretty decent speed over powerline, I was surprised too.
I wouldn't advise it, powerline technology is very flaky and even less reliable than WiFi. I have a Devolo G.hn powerline bridge and my parents as well, the one downstairs requires power-cycling about once a week (why they don't just put a watchdog timer in the unit eludes me).
Is it only that adapter, or are you power cycling on both ends?
Too much work dragging / cable management in my current setup.
I'm looking to remote stream / remote play my desktop to the living room and bedroom TVs. From what I'm reading, it feels smoother even when there's more latency. But power cycling multiple adapters would be an issue though.
Just one end suffices. I think the software for the Broadcom PLC controller chipset is flaky and needs a swift kick in the pants every now and then, which is why a simple heartbeat protocol with a watchdog timer should suffice.
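For what it's worth, the watchdog logic is trivial to sketch. Here's a minimal, hypothetical version: count consecutive missed heartbeats and signal a power-cycle once a threshold is crossed. The actual power-cycle mechanism (smart plug, relay, whatever) is left out and would be your own choice.

```python
# Sketch of a heartbeat-plus-watchdog: count consecutive missed probes
# and fire a power-cycle once a threshold is crossed. The power-cycle
# hook (smart plug, relay, etc.) is hypothetical and not shown.
class LinkWatchdog:
    def __init__(self, max_misses=3):
        self.max_misses = max_misses
        self.misses = 0

    def heartbeat(self, reachable: bool) -> bool:
        """Feed one probe result; return True when a power-cycle is due."""
        if reachable:
            self.misses = 0
            return False
        self.misses += 1
        if self.misses >= self.max_misses:
            self.misses = 0  # reset after firing so we don't re-trigger
            return True
        return False
```

Run it from cron or a loop that pings the bridge every minute; three misses in a row and you cycle the plug.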
IEEE attempted to fix reliability, range and latency issues in WiFi 6 and 6E. Unfortunately, since the 802.11 spec's conception in the late 90s, the push has been to get throughput numbers (meaning, the marketing number for more sales for vendors) up. I believe WiFi 7 is something like 11 Gbps max theoretical throughput which is a solution in search of a problem. Absolutely nobody needs 11 Gbps, but everyone needs reliability.
wifi7 has MLO, which will contribute to devices getting satisfied with their data sooner and not asking for more timeslots. 4096 QAM will also, however marginally, contribute to this effect.
In crowded spaces speed == reliability since a ton of the unreliability is due to crowded channels. More speed == clients shut up faster == more clients are able to actually communicate reliably.
Sure, you might not need 11Gbps on your own. But you + all your neighbors might need a total of 11Gbps. The air is just one big shared wire.
Eh... OFDMA and guard interval modulation would probably serve better for crowded channels than higher throughput. Also, something like 40% of WiFi airtime is used by... beacons. Another opportunity for optimization.
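The beacon overhead is easy to ballpark, since beacons go out at a low mandatory rate on a fixed interval. A quick sketch with assumed (not measured) frame size and rate:

```python
# Rough beacon-airtime estimate, with assumed numbers: each SSID beacons
# every ~102.4 ms, and beacons are sent at a low mandatory rate (often
# 1 Mbps on 2.4 GHz), so they cost far more airtime than their size suggests.
def beacon_airtime_fraction(n_ssids, frame_bytes=300, rate_bps=1_000_000,
                            interval_s=0.1024):
    per_beacon_s = frame_bytes * 8 / rate_bps   # airtime of one beacon
    return n_ssids * per_beacon_s / interval_s

# A dense building with ~17 visible SSIDs beaconing at 1 Mbps already
# burns ~40% of 2.4 GHz airtime on beacons alone:
print(round(beacon_airtime_fraction(17), 2))  # -> 0.4
```

Bumping the beacon rate to 6 Mbps (a common AP setting) cuts that fraction by 6x, which is why it's such a popular optimization in dense deployments.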
OFDMA and adjustable guard interval modulation have been features of WiFi since WiFi 6. So it's not like they're just ignoring those things, they're already there.
Yes, but my original point was the IEEE came up with these things (OFDMA, MU-MIMO, fronthaul MIMO, GI modulation) when they were given the opportunity to focus on optimization instead of speed. I am saying it would be nice if they would continue to do this instead of QAM++.
The higher 6GHz band penetrates walls less than 2.4GHz and 5GHz, which means you will need more APs but each AP will have less interference from neighbors.
Yeah - I have had this repeatedly as I tend to live in old stone houses with meter thick walls.
I’ve found powerline networking to work surprisingly well - over a hundred megabits on a crufty 70 year old power install, and I just jam them strategically in rooms that are separated by a thick wall.
At home, I throw regular 802.11n half a kilometer through empty air with a yagi, but that’s a different story entirely.
Same, I have two powerline APs that deliver ethernet+wifi, more than enough for chromecast/netflix/etc. The only issue I have is that sometimes clients don't choose the best AP.
Counterintuitive, but try turning the transmit power down on your APs - it’ll reduce the overlap between cells without compromising signal quality where you actually want it. I used to almost reflexively turn them all up to full, but that, it turns out, is a fool’s errand.
Lath and plaster wall construction is not WiFi friendly. It was the dominant interior wall material from the Victorian era until drywall became popular in the 1930s and 40s.
I don't think that'll ever be possible if by "180 year old house" you mean "house with interior walls made of stone". You could use powerline adapters (with or without wifi extenders) to get the signal through the house without running visible cables.
For example (and I'm not endorsing it, other manufacturers are available) the TP-LINK TL-WPA4220.
I was just being facetious. I've used powerline in the past, but it's proven to be an impossibility in this house as it was wired multiple times over the years and there's like 8 different circuits between 5 rooms.
My house is larger than this, and my wifi covers my house, my yard, and reaches a few houses away. There's something different about your house, do you have metal in the walls?
The clue is “180 year old” - it’s going to be masonry. I’m willing to bet that yours is framed lumber. The former is like, well, a brick wall, to WiFi signals. The latter is barely noticeable.
Not necessarily. My 200 year old house is wood. Although the plaster still there attenuates a bit.
But I agree with your basic point. An access point in one corner of the 1800 sq. ft. house mostly covers the whole place. I was just at a friend's 2BR European apartment and one access point only covered about half the place.
Won't happen. The push is toward shorter WiFi range, not longer. A router in each room (ideally wired, but mesh is okay if that's the only connection you can get) won't interfere with your neighbors' WiFi (you won't even realize this is the cause of your problems if it is!) and allows for much faster speeds.
7 is actually a huge breakthrough imo because of multi link support. It's not just 'overkill'.
It can band the various frequencies (2.4 + 5 + 6) together transparently. This is actually a huge boon for normal users, as often devices get stuck on 2.4GHz which is congested to hell even though a 5/6GHz network is available (especially on a laptop or smartphone moving around). It should shift between them transparently.
However, Windows 11 doesn't support this yet on the release version, and I have a feeling compatibility could be really poor (let's hope not). But this feature alone is a massive breakthrough for reliability of WiFi.
Also the 320MHz bandwidth on 6GHz is hugely helpful (though in the UK/EU only 1 channel is available at that channel width, in the US there is much more spectrum allocated - I believe this should change in the future though). I struggle personally to get more than 1gig/sec out of real world usage on 160MHz channels, which is a pain because I can get faster than 1gig from my router but wifi is the limiting factor.
So I would basically expect to see real world speeds (ie: not right next to the router and through a wall or two) of around 3gbit/sec on WiFi 7, compared to 1gig/sec on WiFi 6/6E - as you will have a 320MHz 6GHz channel bonded with 160MHz 5GHz (and perhaps some 2.4, but I'm not sure if it can do 'tri band' operation, I've seen conflicting information about that).
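To sanity-check that estimate, the ideal-conditions link rates can be worked out from the standard OFDM parameters. This is a back-of-envelope sketch (PHY link rates, not real-world throughput, which lands well below these):

```python
# Back-of-envelope 802.11ax/be PHY link rates. Subcarrier counts and the
# 12.8 us symbol + 0.8 us guard interval are the standard's numbers;
# real-world throughput is typically half or less of the PHY rate.
def phy_rate_mbps(data_subcarriers, bits_per_symbol, coding, streams,
                  symbol_time_us=13.6):
    return data_subcarriers * bits_per_symbol * coding * streams / symbol_time_us

wifi7_6ghz = phy_rate_mbps(3920, 12, 5/6, 2)  # 320 MHz, 4096-QAM, 2x2
wifi6_5ghz = phy_rate_mbps(1960, 10, 5/6, 2)  # 160 MHz, 1024-QAM, 2x2
print(round(wifi7_6ghz))                # -> 5765 (Mbps)
print(round(wifi6_5ghz))                # -> 2402 (Mbps)
print(round(wifi7_6ghz + wifi6_5ghz))   # -> 8167 Mbps MLO aggregate link rate
```

Halve the ~8.2 Gbps aggregate for protocol overhead and less-than-ideal signal, and ~3-4 Gbps real-world through a wall is a plausible guess.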
Considering how pervasive Wi-Fi is in the modern world, there isn't a huge amount of spectrum allocated for it. Only 3x 2.4GHz 20MHz networks and 5x/6x 5GHz 80MHz networks are enough to congest the most accessible spectrum in high-density areas.
The addition of the 6GHz range in Wi-Fi 6E and Wi-Fi 7 is a welcome addition there, but it's also interesting to talk about the fact that newer modulation schemes also help to improve the efficiency of available spectrum. For example, Wi-Fi 7's 4096-QAM theoretically improves transmission rates by 20% compared to Wi-Fi 6's 1024-QAM without increasing the channel width.
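For the curious, the 20% figure falls straight out of the bits carried per QAM symbol:

```python
import math

# QAM order sets bits per symbol: 1024-QAM carries 10 bits, 4096-QAM
# carries 12, hence the quoted 20% rate increase with no extra channel
# width (it does demand a much cleaner signal, though).
bits_1024 = math.log2(1024)   # 10 bits/symbol (WiFi 6)
bits_4096 = math.log2(4096)   # 12 bits/symbol (WiFi 7)
print(f"{bits_4096 / bits_1024 - 1:.0%}")  # -> 20%
```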
> Considering how pervasive Wi-Fi is in the modern world, there isn't a huge amount of spectrum allocated for it. Only 3x 2.4GHz 20MHz networks and 5x/6x 5GHz 80MHz networks are enough to congest the most accessible spectrum in high-density areas.
In reality networks aren't transmitting all the time, so even though only 3-6 networks can be active at a time without interference, that's fine as long as everyone in the same apartment building aren't all trying to download a 100GB game update.
WiFi 7 can do 320MHz channels in the 6GHz band. Fortunately the FCC ignored the cellcos' lobbying and assigned nearly the entire 6GHz band to WiFi rather than selling it to the rapacious data monopolists (motto: no one should ever transfer data for free without paying us a cut).
I'm not yet making use of wifi 6 or 6E, let alone needing wifi 7, when my ISP fiber speed at home is only 300Mbps so even the old wifi 5 AC currently in use is sufficient for my household with its 866 Mbps theoretical top speed.
I could in theory install it at my parents place where they have multi gigabit fiber internet available, but there's only so fast you need to beam cat videos to your iPad before you reach diminishing returns and I feel like we crossed that point a while ago.
Wifi 7 feels just like mm-wave 5G and 8K TVs: there are legit use cases for them, but for 90% of people it's overkill compared to the previous gen they already have, and all the hype in the media is basically advertising to drive consumer interest and sales for something they most likely don't need.
5G marketing always made me smile. “It will allow doctors to operate on a patient on the other side of the planet”. How? “It will enable next gen VR content”. How?
If WiFi + fibre can’t do that, how on earth is 5G supposed to do that? Even if 5G was supposed to replace WiFi (it didn’t), you’re still stuck with the same fibre in between the two 5G masts.
Worse, ultrawideband/mmwave 5G and 8k TVs can be a nuisance. I've seen 5G phones repeatedly get "stuck" on a weak or nonexistent mmwave signal while traveling, and it can take quite some time before the phone decides to fall back to a different tower. 8k TVs are cool until you realize that a lot of your cables won't even output 4k signal to them, if the manufacturer didn't implement a decent fallback.
I'm not sure if wifi 7 has similar hangups, but considering the fact that my 10 year old Airport router still does everything I need at home, with an integrated backup for my Macs, it's a hard sell. I still get reduced bufferbloat on the Airport Extreme compared to most commercial routers -- I'm guessing that even when you just use one, modern mesh network configs have some nasty overhead.
The worst thing about TV input is usually refresh rate. Many TVs will display 120hz internally, but their HDMI port will only carry 4K@60. Because of this, the only way to see 120fps video is by playing 60fps (or less) video through the TV's internal GPU-based frame interpolation.
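Ignoring blanking overhead, the arithmetic behind that limit is simple: uncompressed video bandwidth is just pixels x frame rate x bits per pixel, and 4K@120 blows past what an HDMI 2.0 link can carry. A rough sketch (the 14.4 Gbps figure is HDMI 2.0's usable data rate, i.e. 18 Gbps raw minus 8b/10b encoding overhead):

```python
# Why many TVs' HDMI 2.0 ports top out at 4K@60: uncompressed 8-bit RGB
# bandwidth vs HDMI 2.0's ~14.4 Gbps usable data rate. Simplified -- real
# links also carry blanking intervals and audio.
def video_gbps(width, height, fps, bits_per_pixel=24):
    return width * height * fps * bits_per_pixel / 1e9

hdmi20_data_rate = 14.4  # Gbps usable
print(video_gbps(3840, 2160, 60) <= hdmi20_data_rate)    # 4K@60 fits (~11.9 Gbps)
print(video_gbps(3840, 2160, 120) <= hdmi20_data_rate)   # 4K@120 does not (~23.9 Gbps)
```

4K@120 needs HDMI 2.1's FRL signaling (or chroma subsampling / DSC compression), which is exactly the corner many TV internal panels support but their ports don't.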
If TV manufacturers would have just put in a DisplayPort port, we wouldn't be in this mess.
>If TV manufacturers would have just put in a DisplayPort port, we wouldn't be in this mess.
HDMI winning the TV market over DisplayPort wasn't a free market victory like VHS over Betamax, it was an anti-consumer corporate monopolistic grab.
The big TV manufacturers at the time (Hitachi, Panasonic, Philips, Sony and Toshiba) were also the main founders of the HDMI consortium when it was invented, meaning they had a vested interest to push their proprietary shit to get paid and make more money.[1]
HDMI was also backed by Fox, Universal, Warner Bros and Disney along with system operators DirecTV, EchoStar (Dish Network) and CableLabs.[1]
It's like politicians voting themselves bigger salaries. Open standards like DisplayPort stood no chance against so many corporate moneybags, who monopolized the TV market and pushed in the opposite direction.
Consumers were never given a choice in the matter, but the TV manufacturers, movie studios and cable operators decided for us. From the bottom of my asshole, a big "fuck you" to them and to the regulators who slept at the wheel.
I have never seen a WiFi 5 device benchmark anywhere close to 866Mbps in real world scenarios.
Most of the time, you're lucky to reach half of the "theoretical" numbers for WiFi due to the way that they're calculated. I would almost be amazed if you could consistently max out your 300Mbps internet connection over WiFi 5, but that's probably just within reach of WiFi 5, assuming there isn't much interference for WiFi 5 to handle.
EDIT: I checked on my old iPhone SE (1st gen), which is a WiFi 5 client device, and it was only able to achieve ~240Mbps down on the best run. Are you sure you're not bottlenecked by WiFi 5?
My iPhone SE 3rd gen is bottlenecked by my 500Mb fiber line in speed tests within my office room where the wifi AP is located. My AP is only a wifi 5 device.
Testing with a WiFi 5 client device (iPhone SE 1st gen), I'm seeing only 240Mbps down and 122Mbps up, which is far closer to what I would expect from WiFi 5.
I don't know how a WiFi 6 client (like your phone) might work around the limitations of a WiFi 5 access point, but I'm not seeing anywhere near 500Mbps out of WiFi 5 from a normal client device.
But in this case, I am still getting what I'd consider great performance out of my wifi 5 AP while using an almost 2 year old device. Moving to a wifi 6/6E/7 AP is not going to appreciably make my Internet experience on my phone any better or faster. I felt this was relevant because the article is about wifi 7 access points.
I agree your comment was relevant. The person I was responding to said they weren’t using WiFi 6 at all, which I interpreted to mean that all of their client devices were also still WiFi 5, but my interpretation could have been wrong.
If you can run docker somewhere in your network, you could consider running an OpenSpeedTest server and browse to that from your iPhone. This would let you remove your ISP from the equation and see just how fast your WiFi 5 connection can go.
I’m also mildly skeptical that your access point is WiFi 5, but instead might actually support WiFi 6, since you’re already close to the maximum speeds I was seeing out of a WiFi 6 access point and WiFi 6 client device before I upgraded my network. Maybe WiFi 6 clients can do some serious magic with WiFi 5 access points, but it just seems… unlikely. But if you’re sure, then the numbers are what the numbers are. It just doesn’t make much sense to me.
>but I'm not seeing anywhere near 500Mbps out of WiFi 5 from a normal client device.
Not sure what you mean by "normal client device", but I've gotten approximately 500Mb/s on my laptop under optimal conditions (ie. direct line of sight to the router and connecting to a wired device on the same LAN). The "top" speed for 2x2 80MHz under wifi 5 is 867Mb/s[1], so that roughly tracks. Under more realistic conditions (eg. a wall between router and device, or transferring files between two wireless devices), you'd expect less.
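If you want to see where that 867Mb/s figure comes from, it's the standard 802.11ac rate formula for the top MCS:

```python
# Derivation of the 802.11ac "867 Mb/s" marketing number:
# 80 MHz channel, 256-QAM (8 bits/symbol), 5/6 coding rate,
# 2 spatial streams, short guard interval.
data_subcarriers = 234      # data subcarriers in an 80 MHz VHT channel
symbol_time_us = 3.6        # 3.2 us symbol + 0.4 us short GI
rate_mbps = data_subcarriers * 8 * (5/6) * 2 / symbol_time_us
print(round(rate_mbps, 1))  # -> 866.7
```

That's the PHY link rate; after MAC overhead, ACKs, and contention, getting a bit over half of it in practice is normal.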
Are you sure that the 2016 iPhone SE can do wide channels and multiple streams? Those are wifi 5 features, but devices do not have to support them. High-end ones often do.
As mentioned, I'm just doing a speed test to show the bottleneck is my fiber line and not the wifi, hence my wifi is giving significantly better performance than was expected by the commenter I replied to.
It depends on how new/expensive your router is, how strong the signal is/close you are, and how congested the channel is (at that moment). Some Wifi 5 (802.11ac) routers are significantly faster than the average.
When I used a MacBook Pro (13", 2015) with a Turris Omnia, my jaw dropped. I was getting gigabit over wifi. No wonder that the mbp had no ethernet.
It turned out that Apple at the time used 3x3 MIMO ac, and the Turris happened to be 3x3 as well.
With a more common setup (2x2 MIMO, 80 MHz wide channel), it is not a problem to achieve near 800 Mbps with wifi 5. 300 Mbps is too low; that must be either a very noisy environment, or a misconfigured AP (no MIMO, narrow channel).
with MLO and the promised latency decreases, even at 300Mbps, it should significantly make a difference on how "snappy" everything is (and how reliable video calls are).
The fantastic article Wireless is a trap[1] is one of the best resources on this and will likely stay valid for quite a long time: yes, Wi-Fi is cool and all, but it will still lag behind wires.
The article was also discussed over and over again;
I moved mostly to wired during the pandemic, and the wi-fi is just there for the devices that need it. Recently, I saw Crosstalk Solutions' video[1] about Wi-Fi 7 with the new Unifi Access Point, and he makes some good points about Wi-Fi 7 being something almost none of us need yet -- and won't for quite a while.
And from my cursory reading, this is still not going to be easy for Indian homes (apartments), where we have pretty thick concrete walls that block the 5GHz bands very successfully.
Comcast is my only wired ISP option. They can't even deliver the speeds I pay for. I'll upgrade my router once they can actually deliver speeds these routers are made to handle. Historically I've been able to stay 1-2 generations behind and not lose any top end performance because our local Comcast node is overloaded.
do you have Comcast Business or Xfinity (consumer)?
Business is marginally higher cost, but there are no caps on business plans and you get upgraded node priority. The likelihood your neighbors are also on business is low to none, so your traffic will get prioritized before consumer lines, not to mention there are fewer shenanigans to deal with when you interact with Comcast via their business unit, usually.
It's a better deal all around. I had them for a time, and while I still loathe Comcast and I'm so happy to be on a true fiber provider[0] (Ziply Fiber FTW!), I did find Comcast Business to be much superior service-wise.
[0]: While not all of Ziply's installations are FTTH, mine is, and it's amazing, and it's only 60 dollars a month.
> do you have Comcast Business or Xfinity (consumer)?
Both! Xfinity at home, and I have an office in the same town as my home and sub to Comcast Business there. I know the download rate I pay for at my office is much, much lower than at home, but it doesn't really affect me. With two teens, gaming, and an "all streaming" house, I need the down speeds most at home.
I will consider it for "at home" when I close my office and my kids are off in the world (lol on that last one). Thanks for the tip!
I'm doing major renovations and will be wiring the whole place up. But for the few things that are gonna remain wireless I'll still have the fastest WiFi possible.
Two things I'm most excited about are building an amazing dedicated sound proofed and optimized home theater and having a real centralized server rack for everything in the house. It's gonna be great
WiFi is substantially faster than wired Ethernet which has stagnated in consumer applications at 1gbps for decades. For example the last time I did a macos migration it chose WiFi despite the wired path being available, and proceeded at greater than gigabit speeds. That's on top of all the convenience.
For me consistency and reliability are more important than (peak) throughput. I'll take 1Gbps wired over any WiFi standard, since the former is not affected by someone running the microwave or the random gremlins that tend to cause WiFi to misbehave when you need it most.
I'm not sure that's a fair blanket statement to make. I don't think I've bought any new devices with only 1gig ports in quite some time. My work laptop has 2.5GbE, my 5 year old desktop has 2.5GbE, my NAS has 10GbE, etc. 2.5GbE switches have also gotten quite cheap.
My WiFi 6 router only has gigabit LAN ports, but that's because I didn't personally need to go any higher than that. Faster options were available at the time I bought it.
My point is that consumer electronics are generally available with >1GbE if the application suits it. If you only buy the cheaper models then 1GbE is more common. There are plenty of applications where even 1GbE is overkill and spending any more money on a faster link would be throwing away money.
I think stagnation is a fitting description. Gigabit came out in the 1990s. It is still the mainstream speed. For example, a base mac mini, despite being one of the fastest machines you can buy, has a gigabit port or optional $100 10-gig port. There is no option for a 2.5 gigabit port. An iMac comes either without ethernet, or with gigabit. NUCs come with 2.5g, I grant you that, and I attribute that to Intel's push for the standard. But their 2.5g chipset notoriously doesn't work.
2.5g is on the higher end of devices a consumer would buy today, and lacks support among even slightly older devices. Wifi 7 is similar: it exists, but just barely. There is a Wifi 7 device in my hand right now, but few people own one. Under ideal conditions WiFi 7 whoops 2.5gbps ethernet, though. Even 6E which came out years ago, gives comparable throughput to 2.5g ethernet.
$100 is a little pricey I guess for an adapter but how much is it to get a wifi router that actually does 10Gbps in actual tests? For example I found this review for a “9.5Gbps” device that costs $1400 and actually hits like 1.5Gbps when measured.
The RealTek 2.5gb ethernet on my Asus motherboard hits maybe 400mbps with a tailwind, because it only supports a single receive queue. Not hitting the advertised base rates is common to wired and wireless alike.
The router Bell provides in Canada now has a 10 gig ethernet port but I'm plugging that into my dream machine which only has 1 gig.
I'll need to upgrade to the dream machine pro and then add 10 gig switches through the SFP+ port I believe. The new unifi 7 pro access points have 2.5 gig ports which will definitely be close enough to the 3 gig fiber internet.
Then I'll need to upgrade my desktop PC and nas to 10 gig ports :)
Will it change much for my daily usage? I doubt it. But it will feel good lol.
My favourite wifi vs ethernet finding is that my modern LG OLED TV is capped to 100Mbps through the ethernet port, but has no such limit on wifi and can therefore run much faster. I wonder how much LG saved by not adding gigabit ethernet ports?
100Mbps isn't even that much - I can push that if I'm streaming an uncompressed Blu Ray HD rip.
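That checks out with simple arithmetic: average bitrate is just size times 8 over duration (the numbers below are illustrative, not from a specific disc):

```python
# Average bitrate of a local media file: size * 8 / duration.
# Illustrative numbers -- a large UHD Blu-ray remux over a typical runtime.
def avg_mbps(size_gb, duration_min):
    return size_gb * 8 * 1000 / (duration_min * 60)

# A ~90 GB remux with a 2-hour runtime averages 100 Mbps, with peaks
# higher, so a 100 Mbps ethernet port genuinely can't keep up:
print(round(avg_mbps(90, 120)))  # -> 100
```

Even a more modest 40 GB HD remux averages ~44 Mbps, and bitrate peaks during action scenes can be multiples of the average, so stutter on a fast-ethernet port is entirely plausible.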
Gigabit ethernet is extremely different from fast ethernet. At this point, I would expect a fast ethernet port to cost more because of relative scarcity of suppliers, but there might have been an SoC at the time with an integrated one.
There are a lot of variables. Pier and beam or slab? It can often be easy to run wires under the house in pier and beam, not really an option for slab. Single story? It is usually pretty easy to run wires in the attic, but dropping down from the second to first story could be tricky without opening up a wall.
I paid a cabling company to wire up my old house. Cost a few grand and they figured out the path (attic, crawlspace, and in some cases the exterior of the house). I have a UniFi system so I got PoE WiFi access points and PoE 4K cameras installed at the same time.
I've got a wifi 6 era wifi bridge going for something that I can't wire.
That's working grand - 1 gig eth on both ends and link rate often ~1.8 in between so that's near enough to wired if you forgive the ms or two extra latency.
So unless my ISP decides it's time for 2.5gbe I'm not changing a thing.
I do have a 6E AP too but haven't used it yet.
7...I just don't see why I'd need it any time soon
What I'd really like is for someone to discuss and review any improvements that newer access points can have with older clients. For instance, 802.11ac brought beam forming that made a network of 802.11n clients work better. Otherwise, what's the point when the same hardware will be 1/3 the price when we finally start getting WiFi 7 clients?
Still on 802.11n in an old solid-wall house with multiple APs connected over ethernet, because it works reliably and cost under £100 to implement with second-hand kit.
I'm still on 5, but I keep meaning to get something for my shed. If I ever do, I'll buy the latest for the places I spend most of my time and move an old AP to the shed. It might be Wifi 8 by the time I get a round toit though.
Ubiquiti did come out with their U7 Pro which is only $189, which is really not bad considering their AC Pro is $149 and their U6 Pro is $159.
I ended up upgrading since I do now have a couple of Wifi 6E devices and I live in a very dense apartment building. The 6 GHz band has been crazy good! With the same laptop, with line of sight to the AP, I'll get 800 mbps on the 6 GHz band and 250 mbps on the 5 GHz.
The UniFi U7 Pro that I bought was ~$200. For that money, it upgraded my network from WiFi 6 to WiFi 7, so now the 6E devices that I have are noticeably faster (since I didn't have 6E before), and my network is future-proofed for WiFi 7. It seemed like an easy win for me, so I disagree with your assessment.
You also don't need multi-gigabit internet to have a NAS that you want to connect to over the local network.
You can get 6E router for <$100. Sure future proofing may be a valid argument in some cases, but the vast majority of users simply do not need to spend that extra money.
I wonder why many don't consider working with a lot of data off of a NAS that could benefit from the cleaner network setup. 6Ghz is also clearly helpful as an early adopter.