Strictly speaking, the maximum is only a setting in the radio firmware. The practical limit depends on the quality of the connections and on the amount of control you have over the clients. The reasons:
best performance is achieved when the data rate of all users is highest (54 Mbps), but this requires strong signals. When the central distribution uses legal power into an omni-directional antenna, such strong signals can only be obtained with directional outdoor/roof antennas at the clients. With a weaker signal, and thus a poorer signal-to-noise ratio, the units will automatically switch to a lower data rate and so will need longer-duration packets for the same amount of data.
Ideally all connections have to be good; one bad connection can spoil performance for many others, as it tends to become a disturbing factor that takes a lot of time to transmit anything, time during which the channel is unavailable for the others.
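As a rough illustration of the airtime point (payload only; real frames add preamble, header and ACK overhead, which makes the slow-client effect even worse), you can sketch how long the channel is occupied per packet at different 802.11g rates:

```python
# Rough airtime comparison: time on air for the same payload at
# different 802.11g data rates. Payload bytes only; protocol
# overhead (preambles, ACKs, contention) is deliberately ignored.

def airtime_ms(payload_bytes, rate_mbps):
    """Milliseconds the channel is occupied sending payload_bytes at rate_mbps."""
    return payload_bytes * 8 / (rate_mbps * 1e6) * 1000

payload = 1500  # a typical full-size packet, in bytes

for rate in (54, 24, 6, 1):
    print(f"{rate:>2} Mbps: {airtime_ms(payload, rate):6.2f} ms per packet")

# A client that has fallen back to 1 Mbps occupies the channel about
# 54x longer per packet than one at 54 Mbps, starving the others of airtime.
```

This is why one client with a poor link hurts everybody: every packet it sends or receives blocks the shared channel for far longer than a packet on a good link.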
The level of control is important in order to be able to enable the RTS/CTS procedure, in which a client asks for and receives permission before sending data, using special short-duration packets. Without this procedure, clients that do not 'hear' each other would often send at the same time, which reduces the chance that the access point receives each packet right the first time; this phenomenon is called the 'hidden node' problem.
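A toy Monte Carlo sketch of the hidden-node effect (not a protocol implementation; the slot count and per-slot transmit probability are made-up illustrative numbers): two clients that cannot carrier-sense each other each transmit independently, so their transmissions sometimes overlap and collide at the access point.

```python
import random

def collision_fraction(slots=100_000, p_tx=0.05, seed=1):
    """Fraction of time slots in which two hidden clients, each transmitting
    independently with probability p_tx per slot, transmit simultaneously
    and therefore collide at the access point."""
    rng = random.Random(seed)
    collisions = sum(
        1 for _ in range(slots)
        if rng.random() < p_tx and rng.random() < p_tx
    )
    return collisions / slots

# Because neither client can hear the other, collisions occur with
# probability ~p_tx**2 per slot. RTS/CTS avoids this: the AP's CTS
# reserves the channel for one client at a time, so the other defers.
print(f"collision fraction: {collision_fraction():.4f}  (expected ~{0.05**2:.4f})")
```

Each collision wastes the full airtime of both packets and triggers retransmissions, which is why enforcing RTS/CTS on all clients matters once they are spread out and hidden from each other.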
So in our networks we always use a client unit that is not controlled by the user but managed by the network, so it can be guaranteed to have the proper settings and a good-quality radio link, i.e. the outdoor antenna and the client unit are mounted close together to avoid coax cable losses; a separate in-house wireless router then takes care of the in-house connectivity.
The above applies specifically to a star topology (a multitude of clients connecting to a central access point), but good radio signal quality is always an important factor.
Hope this helps a bit in tackling the analysis.
(Last edited by doddel on 18 May 2009, 19:24)