Does anyone know what the relationship between the various time measures is, at a low level? I know the general relationship between them, but observing how they change relative to each other, especially while the server is recovering from a lag spike, is very confusing.
The measures I'm looking at are GetGameTime(), GetGameTickCount(), GetEngineTime(), and the values tickcount and cmdnum as passed to OnPlayerRunCmd[Post].
One thing I want to know specifically is whether game movement is regulated by the tick count or by "cmdnum" (i.e. PlayerRunCmd). Does the player move a distance equal to velocity divided by tickrate during each tick, or during each RunCmd/cmdnum? In general the answer is "they are the same," but once lag is thrown into the mix, that is no longer the case. Ideally, I'd like to know the best way to measure the passage of time, as far as game movement is concerned, in a way that isn't thrown off by lag. GetGameTime() is not the answer to that.
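For reference, here's a minimal SourcePawn sketch of the assumption I'm questioning: the expected displacement for one simulation step, computed from the m_vecVelocity prop and GetTickInterval(). The function name is just something I made up for illustration; whether "one step" means one tick or one cmdnum is exactly the question.

```sourcepawn
#include <sourcemod>
#include <sdktools>

// Expected displacement for a single simulation step, assuming the player
// moves velocity * tick interval (i.e. velocity / tickrate) per step.
void GetExpectedStepDisplacement(int client, float out[3])
{
    float velocity[3];
    GetEntPropVector(client, Prop_Data, "m_vecVelocity", velocity);

    float interval = GetTickInterval();
    out[0] = velocity[0] * interval;
    out[1] = velocity[1] * interval;
    out[2] = velocity[2] * interval;
}
```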
Here is some data from a test I did, plotting each of the above measures against "CMD", which represents each call to OnPlayerRunCmdPost. At around the 25 CMD mark I generated some lag by inserting 5 million values into an ArrayList. The varying recovery behavior of each time measure is baffling. If you look closely you'll notice GetGameTime() actually flows backwards, then recovers, but not completely.
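The test plugin was roughly the following (a simplified reconstruction; the sm_makelag command name and the log format are just placeholders I'm using here):

```sourcepawn
#include <sourcemod>
#include <sdktools>

int g_Cmd; // "CMD": running count of OnPlayerRunCmdPost calls

public void OnPluginStart()
{
    // Placeholder command used to induce the lag spike mid-test.
    RegAdminCmd("sm_makelag", Command_MakeLag, ADMFLAG_ROOT);
}

public Action Command_MakeLag(int client, int args)
{
    // Generate a lag spike by pushing 5 million values into an ArrayList.
    ArrayList list = new ArrayList();
    for (int i = 0; i < 5000000; i++)
    {
        list.Push(i);
    }
    delete list;
    return Plugin_Handled;
}

public void OnPlayerRunCmdPost(int client, int buttons, int impulse, const float vel[3],
        const float angles[3], int weapon, int subtype, int cmdnum, int tickcount,
        int seed, const int mouse[2])
{
    if (IsFakeClient(client))
    {
        return;
    }

    g_Cmd++;
    LogMessage("CMD %d | GetGameTime %.3f | GetGameTickCount %d | GetEngineTime %.3f | tickcount %d | cmdnum %d",
        g_Cmd, GetGameTime(), GetGameTickCount(), GetEngineTime(), tickcount, cmdnum);
}
```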
I guess I could probably learn what I need to know by doing some more testing, but I figured this was interesting enough to post regardless.
Edit: Apparently there is no direct correlation between any of the above measures and when movement is actually processed, which is pretty troubling. It seems the only way to correctly negate the effects of lag when timing players is to count calls to OnPlayerRunCmd where entity properties actually changed as expected from the previous call.
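In case it's useful to anyone, here's roughly what I mean, sketched for a single property (origin) and only checking that it changed at all rather than that it changed by the expected amount; the distance threshold is arbitrary:

```sourcepawn
#include <sourcemod>
#include <sdktools>

float g_LastOrigin[MAXPLAYERS + 1][3];
int g_MoveSteps[MAXPLAYERS + 1]; // calls where a movement step was actually applied

public Action OnPlayerRunCmd(int client, int &buttons, int &impulse, float vel[3],
        float angles[3], int &weapon, int &subtype, int &cmdnum, int &tickcount,
        int &seed, int mouse[2])
{
    if (IsFakeClient(client))
    {
        return Plugin_Continue;
    }

    float origin[3];
    GetClientAbsOrigin(client, origin);

    // Count this call only if the origin actually changed since the previous
    // call, i.e. the engine really processed a movement step. (A stationary
    // player would need a different check; this is just a sketch.)
    if (GetVectorDistance(origin, g_LastOrigin[client], true) > 0.0001)
    {
        g_MoveSteps[client]++;
    }

    g_LastOrigin[client][0] = origin[0];
    g_LastOrigin[client][1] = origin[1];
    g_LastOrigin[client][2] = origin[2];
    return Plugin_Continue;
}
```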