Recently bought a couple of Meraki APs and would like to tweak the available settings; I'm still getting familiar with the Meraki Dashboard. What settings or best practices would you recommend?
Excellent question. There are a variety of topics you can find here: https://documentation.meraki.com/MR/WiFi_Basics_and_Best_Practices
For our own networks, we do a couple of things as part of our standard install (see the API sketch below):
- Raise the minimum bitrate so the lowest data rates are disabled.
- Enable adaptive 802.11r for faster roaming.
- Enable band steering to push capable clients toward 5 GHz.
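If you'd rather script those settings than click through Dashboard, here's a minimal sketch using the Meraki Python SDK. The API key, network ID, and SSID number are placeholders, so double-check the parameter names against the current Dashboard API docs before relying on it:

```python
import meraki

dashboard = meraki.DashboardAPI('YOUR_API_KEY')  # placeholder key

NET_ID = 'N_123456'  # placeholder network ID
SSID_NUMBER = 0      # SSID slot to configure (placeholder)

# Apply the standard-install settings to one SSID
dashboard.wireless.updateNetworkWirelessSsid(
    NET_ID,
    SSID_NUMBER,
    minBitrate=12,                               # disable the lowest data rates
    dot11r={'enabled': True, 'adaptive': True},  # adaptive 802.11r
    bandSelection='Dual band operation with Band Steering',
)
```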
Good luck!
Hi BHC_RESORTS,
How do you enable adaptive 802.11r?
Thanks!
~Doug
Just to add: depending on your deployment and the surrounding environment, dropping your 5 GHz channel width from 80 MHz down to 40 or even 20 MHz is worth considering in high-density areas; it improves channel re-use and reduces channel overlap and co-channel interference.
In some situations you also need to lower the power of the 2.4 GHz radios, or even disable the 2.4 GHz radios in some areas, to cut down co-channel interference and overlap. A way to script both tweaks is sketched below.
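If you manage these settings through an RF profile, a rough sketch with the Meraki Python SDK might look like this (the profile ID and power value are placeholders; verify the exact fields against the current API docs):

```python
import meraki

dashboard = meraki.DashboardAPI('YOUR_API_KEY')  # placeholder key

NET_ID = 'N_123456'     # placeholder network ID
RF_PROFILE_ID = '1234'  # placeholder RF profile ID

dashboard.wireless.updateNetworkWirelessRfProfile(
    NET_ID,
    RF_PROFILE_ID,
    fiveGhzSettings={'channelWidth': '40'},  # 40 MHz instead of 80 MHz
    twoFourGhzSettings={'maxPower': 14},     # cap 2.4 GHz radios at 14 dBm
)
```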
Best of Luck,
Chris
@ChrisR wrote: Just to add: depending on your deployment and the surrounding environment, dropping your 5 GHz channel width from 80 MHz down to 40 or even 20 MHz is worth considering in high-density areas; it improves channel re-use and reduces channel overlap and co-channel interference.
In some situations you also need to lower the power of the 2.4 GHz radios, or even disable the 2.4 GHz radios in some areas, to cut down co-channel interference and overlap.
Agreed. In high-density environments, 80 MHz (or worse, 160 MHz) will just cause too many problems. In our testing, 40 MHz is suitable for 95% of deployments and strikes a good balance between capacity and speed. At home, sure, 80 MHz is great; 160 MHz is pretty much line-of-sight anyway, so unless you are standing in front of your AP... good luck.
Automatic power reduction on 2.4 GHz is basically mandatory in any commercial deployment. You may have to turn the power down even further on some noisy APs, and in dense environments I would seriously recommend directional antennas and/or turning off the 2.4 GHz band.
In some of our properties we disable every other 2.4 GHz radio to keep the band from getting too congested; per-AP adjustments like that can be scripted, as in the sketch below. Even when the RF monitoring doesn't show it being too bad, there are so few channels that it can go from OK to terrible in a hurry.
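For per-AP tweaks like this, the Dashboard API exposes per-device radio settings. A minimal sketch, assuming the Meraki Python SDK and placeholder serial numbers (I don't believe the API disables a radio outright, so this just drops the 2.4 GHz target power to the floor; verify against the current docs):

```python
import meraki

dashboard = meraki.DashboardAPI('YOUR_API_KEY')  # placeholder key

# Placeholder serials for every other AP on a floor
NOISY_APS = ['Q2XX-XXXX-XXX1', 'Q2XX-XXXX-XXX2']

for serial in NOISY_APS:
    dashboard.wireless.updateDeviceWirelessRadioSettings(
        serial,
        twoFourGhzSettings={'targetPower': 5},  # near-minimum dBm target
    )
```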
To add a little: 802.11r is something you do want to test in your environment. Typically, 11r networks would reject non-11r-capable devices (or more specifically, non-11r devices would not connect). Because of this issue, adaptive 11r was introduced. Most newer devices and OSes do support 11r, including Windows 10 and iOS 10+, but it all comes down to the client. Generally, anything that supports 802.11-2012 should also support 11r, 11k, and 11v; you will also need to check whether the device is 802.11ac certified by the Wi-Fi Alliance. Even iOS devices have issues, maybe not as many as others, so double-check the devices being used.
Also, it's good to keep the number of SSIDs per channel to no more than three; you can audit that with a quick API call, as in the sketch below. Andrew von Nagy has a great website on capacity planning: http://www.revolutionwifi.net.
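A quick way to check how many SSIDs you're broadcasting is to pull them from the API. A minimal sketch with the Meraki Python SDK (the network ID is a placeholder):

```python
import meraki

dashboard = meraki.DashboardAPI('YOUR_API_KEY')  # placeholder key
NET_ID = 'N_123456'                              # placeholder network ID

ssids = dashboard.wireless.getNetworkWirelessSsids(NET_ID)
enabled = [s['name'] for s in ssids if s['enabled']]
print(f"{len(enabled)} SSIDs enabled: {enabled}")
# Every enabled SSID beacons on every channel your APs occupy,
# so keeping this list short directly reduces management overhead.
```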
And, to be more precise: I was under the impression that being 11ac Wi-Fi Alliance certified meant a device had everything in the 802.11-2012 amendment or later, but that's not 100% the case. It comes down to making sure your client devices support whatever services your AP offers, whether that's 11k/v/r, etc.
Everyone you ask will have different answers to this question. There are no best practices; there are best choices for the design you are building to meet the requirements of the client devices. I teach design courses and am always asked what best practices exist, and my answer is always "there are no absolutes in wireless," because what works for one site might not work for another.
Case in point: disabling lower data rates has already been mentioned in this thread. The energy still exists on the channel; it just requires clients to operate at higher data rates to associate to a BSSID, and whether it reduces airtime utilization depends a lot on the environment. Yes, clients will operate at higher data rates, but more frequent roaming events can occur, which can cause other delays. The rough math below shows the management-airtime side of that trade-off.
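To put numbers on it, here's a back-of-the-envelope calculation of how long a single beacon occupies the channel at different basic rates (the 300-byte frame size is an assumption, and PHY preamble overhead is ignored):

```python
# Rough airtime for one ~300-byte beacon at various basic rates.
BEACON_BYTES = 300  # assumed typical beacon size

for rate_mbps in (1, 6, 12, 24):
    airtime_us = BEACON_BYTES * 8 / rate_mbps
    print(f"{rate_mbps:>2} Mb/s: {airtime_us:7.1f} us per beacon")

# At 1 Mb/s a beacon costs ~2400 us; at 12 Mb/s only ~200 us.
# With several SSIDs beaconing ~10x per second per radio that adds up,
# though data-frame airtime still depends heavily on the environment.
```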
Client band steering, well, that's a tough one. On one hand, if you built a design focused on 5 GHz, why not have a 5 GHz-only SSID that your devices connect to, and a 2.4 GHz-only network for the devices that don't support 5 GHz (see the sketch below)? Is that a better route than using band steering? It all depends.
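If you go the split-SSID route on Meraki, a minimal sketch with the Python SDK (SSID slots and IDs are placeholders; I believe a 2.4 GHz-only mode is set per SSID in the RF profile rather than on the SSID itself, so verify against the current API docs):

```python
import meraki

dashboard = meraki.DashboardAPI('YOUR_API_KEY')  # placeholder key
NET_ID = 'N_123456'                              # placeholder network ID

# 5 GHz-only SSID for capable clients (SSID slot 0 is a placeholder)
dashboard.wireless.updateNetworkWirelessSsid(
    NET_ID, 0, bandSelection='5 GHz band only')

# Pin the legacy SSID (slot 1, placeholder) to 2.4 GHz via the RF profile
dashboard.wireless.updateNetworkWirelessRfProfile(
    NET_ID, '1234',  # placeholder RF profile ID
    perSsidSettings={'1': {'bandOperationMode': '2.4ghz'}})
```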
802.11r sounds great, but so many devices don't support it. So do we enable it or not? Again, it comes down to what your devices support. If you enable it, will older legacy devices stop working? The more complex we make the beacons, the more likely legacy devices are to have problems.
I would worry less about best practices and instead make sure you know what your RF design goals are and build toward them. If you can't meet your physical RF requirements, your capacity and performance targets will not be met either. The most important design practice is to reduce co-channel and adjacent-channel interference as much as possible.