Wed Jun 25, 2003 5:02 am by Cataphract_40
5 kbit/s x 60 sec = 300 kbits in a minute. 300 kbit/min x 60 min = 18,000 kbits in an hour. 18,000 kbit/hr x 24 hours = 432,000 kbit/day. 432,000 kbit/day x 30 days = 12,960,000 kbits in a month.
That may seem like a lot of data, but think of it this way:
1 byte = 8 bits
So
12,960,000 / 8 = 1,620,000 kbytes per month. Divided by 1024 to get megabytes, you get 1,582.03.
And there you have it. One player on a server for an entire month will consume about 1,582 megabytes of bandwidth for that month. Multiply that by 32 (which seems to be the max for players) and you get 50,625 megabytes per month, or about 49.44 gigabytes per month.
So, about 50 gigs of bandwidth per month for a 32 player server. (A very rough estimate.)
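
If you want to rerun the numbers with your own figures, here's a quick Python sketch of the same arithmetic. The 5 kbit/s per-player rate, the 30-day month, and the 32-player cap are just the assumptions from above, not anything measured, so swap in whatever your game actually pushes:

KBIT_PER_SEC_PER_PLAYER = 5   # assumed per-player rate from the estimate above
PLAYERS = 32                  # assumed server cap
DAYS_PER_MONTH = 30

# seconds -> minutes -> hours -> days -> month, all in kbits
kbit_per_month = KBIT_PER_SEC_PER_PLAYER * 60 * 60 * 24 * DAYS_PER_MONTH
kbytes_per_month = kbit_per_month / 8        # 8 bits per byte
mbytes_per_month = kbytes_per_month / 1024   # kbytes -> megabytes

print(f"Per player: {mbytes_per_month:,.2f} MB/month")
print(f"{PLAYERS} players: {mbytes_per_month * PLAYERS / 1024:,.2f} GB/month")

Running that prints about 1,582.03 MB per player and 49.44 GB for a full 32-player server, matching the hand math above.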