Everyone you ask will have a different answer to this question. There are no universal best practices; there are best choices for the design you are building to meet the requirements of the client devices. I teach design courses and am always asked what best practices exist, and my answer is always "there are no absolutes in wireless," because what works for one site might not work for another.
Case in point: disabling lower data rates has already been mentioned in this thread. The energy still exists on the channel; it just requires clients to operate at higher data rates in order to associate to a BSSID, and whether that actually reduces airtime utilization depends a lot on the environment. Yes, clients will operate at higher data rates, but more frequent roaming events could occur, which can introduce other delays.
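To see why low data rates eat airtime in the first place, here is a back-of-the-envelope sketch in Python. The numbers are illustrative only: it counts payload bits at the PHY rate and ignores preambles, ACKs, contention, and aggregation, all of which matter in real airtime math.

```python
# Rough airtime for a single 1500-byte frame at different PHY rates.
# Illustrative only: ignores preamble, ACKs, contention, and aggregation.

FRAME_BYTES = 1500

def airtime_us(rate_mbps: float, frame_bytes: int = FRAME_BYTES) -> float:
    """Microseconds of channel time to send the payload at a given rate."""
    return frame_bytes * 8 / rate_mbps  # bits / (Mbit/s) == microseconds

for rate in (1, 6, 24, 54):
    print(f"{rate:>2} Mb/s -> {airtime_us(rate):8.1f} us")
```

A frame that takes roughly 222 µs at 54 Mb/s takes 12,000 µs at 1 Mb/s, which is why one distant legacy client can drag down a whole cell. Disabling the low rates doesn't remove that client's RF energy from the channel, though; it only changes what the AP will accept for association.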
Client band steering? That's a tough one. On one hand, if you built a design focused on 5 GHz, why not have a 5 GHz-only SSID that your devices connect to, and a 2.4 GHz-only SSID for the devices that don't support 5 GHz? Is that a better route than using band steering? It all depends.
802.11r sounds great, but so many devices don't support it. So do we enable it or not? Again, it comes down to what your devices support. If you enable it, will older legacy devices stop working? The more complex we make the beacons, the more likely legacy devices are to have problems.
I would worry less about best practices and instead make sure you know what your RF design goals are, then build toward them. If we can't meet our physical RF requirements, capacity and performance targets will not be met either. The most important design goal, and the closest thing to a real best practice, is to reduce co-channel and adjacent-channel interference as much as possible.
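As a toy illustration of that last point, here is a greedy channel-assignment sketch over the three non-overlapping 2.4 GHz channels. The AP names and the neighbour graph are invented for the example; a real survey-driven design would weight neighbours by measured signal overlap rather than a simple adjacency list.

```python
# Toy greedy channel planner: give each AP one of the non-overlapping
# 2.4 GHz channels (1, 6, 11) so neighbouring APs avoid sharing a channel
# where possible. Hypothetical AP names and neighbour graph.

CHANNELS = (1, 6, 11)

def plan(neighbours: dict[str, set[str]]) -> dict[str, int]:
    assignment: dict[str, int] = {}
    # Visit the most-constrained APs (most neighbours) first.
    for ap in sorted(neighbours, key=lambda a: -len(neighbours[a])):
        used = {assignment[n] for n in neighbours[ap] if n in assignment}
        free = [c for c in CHANNELS if c not in used]
        # If no channel is free, co-channel interference is unavoidable
        # in 2.4 GHz; fall back to the first channel.
        assignment[ap] = free[0] if free else CHANNELS[0]
    return assignment

graph = {
    "AP1": {"AP2", "AP3"},
    "AP2": {"AP1", "AP3"},
    "AP3": {"AP1", "AP2"},
    "AP4": {"AP3"},
}
print(plan(graph))
```

With only three usable channels, a fourth mutually-adjacent AP would be forced to reuse one, which is exactly why dense 2.4 GHz designs run out of room and why 5 GHz, with many more channels, gives the designer so much more to work with.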