While working on the speedometer HUD, I came up with a question about how speed/distance works in Torque. The HUD's speedometer needle correctly sweeps through its specified angles based on the target's velocity, but I don't entirely understand what that movement actually represents. The ultimate goal for a control like this is to translate the raw game values into a "real speed" that roughly matches how the velocity actually feels to the player ... if that makes sense.
To me, the control doesn't currently accomplish this very well (nor does the original TGE version, which behaves identically). But before setting about trying to improve it, I wanted to check some basic core reasoning to make sure my foundation is correct.
So, first ("tu" is used below to indicate "Torque units"):
speed = obj.velocity
In the game context, would a proper unit of measure for velocity be tu/second? That's what I am assuming.
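For reference, here's roughly how I'm pulling the value in script (a minimal sketch; it assumes the target is a ShapeBase-derived object, and since getVelocity() returns a 3-component vector, the needle is really driven by its magnitude):

%vel = %obj.getVelocity();   // "x y z" vector, presumably in tu per second
%speed = VectorLen(%vel);    // scalar speed (tu/s) that drives the needle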
From there we would get:
tu/minute = speed * 60
tu/hour = tu/minute * 60
That gives us the number of Torque units an object would travel in a minute or an hour at its current velocity.
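Wrapped up as a couple of helpers (hypothetical names, and leaning on the 1 tu ~= 1 m assumption I get to below; the two 60s collapse into 3600 s/hr, and dividing by 1000 m/km leaves a single factor of 3.6):

function speedToKmh(%tuPerSec)
{
   // 3600 s/hr / 1000 m/km = 3.6, assuming 1 tu ~= 1 m
   return %tuPerSec * 3.6;
}

function speedToMph(%tuPerSec)
{
   // 1 mile ~= 1.609 km
   return speedToKmh(%tuPerSec) / 1.609;
}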
So given a velocity of 10, you'd get:
10 * 60 = 600 tu/min
600 * 60 = 36000 tu/hr
Assuming 1 tu ~= 1 m, that would mean a velocity of 10 is roughly 36 km/h. Right?
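Or, using the helper above:

echo(speedToKmh(10));   // prints 36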
By the time a vehicle gets up to a velocity of 20-25, things start to feel pretty damn fast. Doing the math for 25, we get:
25 * 60 = 1500 tu/min
1500 * 60 = 90000 tu/hr
Which, using the 1 tu ~= 1 m conversion, puts it at 90 km/h, or about 56 mph.
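And the same check for the faster case:

echo(speedToKmh(25));   // prints 90
echo(speedToMph(25));   // prints ~55.9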
The first thing I'm trying to figure out is whether all of that is basically correct. If so, I'm wondering whether the calculation produces a value that reads a bit slower than the velocity feels ... or whether that's about right, and attempting to drive a pretty basic 3D model across randomly generated terrain at ~56 mph is *supposed* to end up being rather difficult to control.