Not with newer standards:
> Orthogonal Frequency-Division Multiple Access (OFDMA) is a multi-user wireless transmission technology that divides a single Wi-Fi or cellular channel into smaller subcarriers called Resource Units (RUs), allowing multiple devices to transmit data simultaneously.
[…]
> Instead of one device occupying the entire channel (as in OFDM), OFDMA allows parallel transmissions. As a result, network congestion decreases significantly.
* https://www.netcomlearning.com/blog/what-is-ofdma
* https://airheads.hpe.com/blogs/antar1/2020/10/19/why-is-ofdm...
> In addition, the 802.11ax standard defines the smallest subchannel as a resource unit (RU), which includes at least 26 subcarriers and uniquely identifies a user. The resources of the entire channel are divided into small RUs with fixed sizes. In this mode, user data is carried on each RU. Therefore, on the total time-frequency resources, multiple users may simultaneously send data in each time segment, as shown in the following figure.
* https://info.support.huawei.com/info-finder/encyclopedia/en/...
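To make the RU arithmetic concrete, here's a minimal sketch of the RU layouts 802.11ax defines for a 20 MHz channel. The counts below follow the standard's 20 MHz splits as I understand them (the leftover 26-tone "center" RUs for the 52- and 106-tone splits are noted in comments):

```python
# Sketch: how a 20 MHz 802.11ax channel can be carved into Resource Units.
# Smaller RUs mean more simultaneous users, at less bandwidth per user.
ru_layouts_20mhz = {
    26: 9,    # nine 26-tone RUs  -> up to 9 simultaneous users
    52: 4,    # four 52-tone RUs  (plus one 26-tone center RU)
    106: 2,   # two 106-tone RUs  (plus one 26-tone center RU)
    242: 1,   # the whole channel -> one user, as in plain OFDM
}

for tones, count in ru_layouts_20mhz.items():
    print(f"{count} x {tones}-tone RUs -> up to {count} users per transmit opportunity")
```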
That’s not correct. You and your neighbor can use the same channel at the same time. On your network, the transmissions of the other network will appear as noise. As long as the other devices are far enough away, however, your devices will still be able to make out their own signal.
When you and your neighbour _appear_ to be transmitting at the same time, each adapter is actually spending most of its time waiting for a clear medium and for various backoff timers to expire before attempting to transmit.
"Appear as noise" is not defined for Wi-Fi adapters. There is only "I received a frame addressed to me and acknowledged it" or "I sent a frame and either did or didn't get an acknowledgement back from the receiver". Receivers do not know why they didn't receive a frame, or, if they received a corrupted frame, why it was corrupted. They just wait for a retransmit. Senders ordinarily wait a certain time to receive an acknowledgement, and if they don't, the start the transmit wait cycle again. But they often then reduce the data rate to increase the odds of a successful transmission.
I'm glossing over some complexity here, because there's a sender and receiver to consider, and each has a different view of the RF environment, but the point holds whenever all transmitters and receivers (let's say the 2 APs, each with 1 client) are in audible range of each other. And this is most of the time. Note that "audible range" (where the signal is strong enough that the adapter deems the medium busy) is much larger than the "usable range" (where data can be transmitted at reasonable speeds). So transmitters create interference over a much larger area than the one they actually operate in.
That means your neighbour transmitting at 6Mbps to his AP will indeed degrade the performance of your client who wants to transmit at 600Mbps because your client has to wait ~100 times longer for a clear medium.
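The ~100x factor is just frame airtime arithmetic. A back-of-the-envelope check (payload only, ignoring preambles, ACKs, and inter-frame spacing):

```python
# Airtime for a 1500-byte frame at the two rates mentioned above.
frame_bits = 1500 * 8

slow_us = frame_bits / 6     # 6 Mbps   -> 2000 us on the air
fast_us = frame_bits / 600   # 600 Mbps ->   20 us on the air

print(f"6 Mbps:   {slow_us:.0f} us per frame")
print(f"600 Mbps: {fast_us:.0f} us per frame")
print(f"ratio:    {slow_us / fast_us:.0f}x")   # the ~100x factor
```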
That's not correct. WiFi is "listen before talk." Radios listen to the channel, trying to decode preambles from other networks, before transmitting. In that process, they can detect other signals well below the threshold where they'll consider the medium in use (the CCA threshold). If you have an otherwise clean channel, the noise floor might be -95 dBm. Radios typically can decode the preambles 3-4 dB above the noise floor. Conventionally, the WiFi standards set the CCA threshold at -82 dBm. So the radio can "hear" a lot of signals that won't cause it to trigger collision avoidance. More recent standards allow using a CCA threshold as high as -62 dBm under certain circumstances to facilitate spatial reuse: https://arista.my.site.com/AristaCommunity/s/article/Spatial....
Also, what the WiFi standards mandate is less aggressive than what radios could do. The CCA thresholds are set to facilitate orderly use of the spectrum; they're not physical limits. To receive a transmission, you just need sufficient signal-to-noise ratio. An adjacent network transmission raises the noise floor, but if your radio is close enough to your AP, you might still have sufficient SNR.
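To put numbers on that, here's a sketch using the figures above (-95 dBm noise floor, -82 dBm CCA). The neighbour's RSSI, your AP's RSSI, and the ~25 dB "SNR needed for a high MCS" are assumptions for illustration, not figures from the standard:

```python
import math

noise_floor_dbm = -95
cca_threshold_dbm = -82
neighbour_rssi_dbm = -88   # audible, but below CCA: no deferral (assumed)
my_ap_rssi_dbm = -55       # you're close to your own AP (assumed)

def dbm_to_mw(dbm):
    return 10 ** (dbm / 10)

# The neighbour's signal adds to the noise floor (powers add in mW).
effective_noise_dbm = 10 * math.log10(
    dbm_to_mw(noise_floor_dbm) + dbm_to_mw(neighbour_rssi_dbm)
)
snr_db = my_ap_rssi_dbm - effective_noise_dbm

print(f"defers to neighbour? {neighbour_rssi_dbm >= cca_threshold_dbm}")  # False
print(f"effective noise: {effective_noise_dbm:.1f} dBm, SNR: {snr_db:.1f} dB")
print(f"link OK for a high MCS (~25 dB assumed)? {snr_db >= 25}")         # True
```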
OFDMA on WiFi 7/802.11be: https://blogs.cisco.com/networking/wi-fi-7-mru-ofdma-turning...
As a general rule of thumb, the best version of WiFi x only comes with WiFi x+1. So for all of OFDMA's problems to be solved and ironed out, it will be WiFi 8. And for all the promises of Ultra-High Reliability, it will have to be WiFi 9.
WiFi is clearly moving closer to 4G and 5G with every version. I just hope that someday it really is good enough when many people are using it at the same time.
OFDMA was first used with WiFi 6:
* https://blogs.cisco.com/networking/wi-fi-6-ofdma-resource-un...
But at a fundamental level, the channel space (~60 across all bands best case) is extremely limited but the potential growth in transmitters is unbounded. It's like a linear hack to an exponential problem. It seems to work at first, but under very high load conditions performance still degrades ever faster until it falls off a cliff. Then there's all sorts of complex dynamic behaviour like the hidden node problem to add to this, but it all boils down to needing air-time and SNR.
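A toy contention model shows the shape of that cliff. This is slotted-ALOHA-style arithmetic rather than real 802.11 DCF, but the trend is the point: even with per-station behaviour tuned ideally, aggregate efficiency plateaus while the per-station share collapses as transmitters grow:

```python
# n stations each transmit in a slot with probability p; a slot is useful
# only if exactly one station transmits.
def useful_slot_prob(n, p):
    return n * p * (1 - p) ** (n - 1)

for n in (2, 5, 10, 50, 100):
    p = 1 / n                      # best case: p tuned to the station count
    total = useful_slot_prob(n, p)
    print(f"n={n:>3}: channel efficiency {total:.2f}, per-station share {total / n:.4f}")
```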
You’re overlooking the spatial dimension: https://en.wikipedia.org/wiki/Spatial_multiplexing
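Roughly, with N_t transmit and N_r receive antennas in a rich-scattering channel, capacity scales with min(N_t, N_r) parallel streams. A crude sketch under idealized assumptions (equal power split, identical per-stream SNR; real MIMO gains depend heavily on the environment):

```python
import math

def approx_mimo_capacity_mbps(bw_mhz, snr_db, n_tx, n_rx):
    streams = min(n_tx, n_rx)                      # parallel spatial streams
    snr_per_stream = 10 ** (snr_db / 10) / n_tx    # power split across antennas
    return streams * bw_mhz * math.log2(1 + snr_per_stream)

for n in (1, 2, 4, 8):
    c = approx_mimo_capacity_mbps(80, 30, n, n)
    print(f"{n}x{n}: ~{c:.0f} Mbps over 80 MHz at 30 dB SNR")
```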
Per this May 2025 Juniper presentation, half of their deployed APs have 6 GHz enabled, and at least 20% of clients (as much as 50%, depending on the environment) have 6 GHz:
* https://www.youtube.com/watch?v=sV-3gA0OP9s
Corporate environments (where client hardware is more standardized) have higher 6 GHz adoption; BYOD environments (universities) have lower adoption.
So I'm not sure what you define "a while" as, but it's probably already the majority at most workplaces, and will be for personal stuff within a year or so.