Juan Linietsky
2011-06-22 03:12:16 UTC
Here's a doubt I've been having for a while. When developing games based on
large worlds (like Elder Scrolls, GTA, etc.), I can imagine that physics and
rendering become more jittery the further the camera moves from the
origin (due to floating-point precision loss). Is this really a problem? If
so, how is it solved? I can imagine that increasing floating-point
precision to doubles helps enormously, but I'm not sure that's enough,
or whether it's worth the extra processing/bandwidth cost.
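For reference, the precision loss is easy to demonstrate. A minimal sketch, assuming 32-bit floats and metre-scale units (the `f32` helper just round-trips a Python double through IEEE-754 single precision):

```python
import struct

def f32(x: float) -> float:
    """Round a Python double to the nearest IEEE-754 single (float32)."""
    return struct.unpack('f', struct.pack('f', x))[0]

# A 1 mm step near the origin survives in float32...
near = f32(f32(1.0) + f32(0.001))
print(near != 1.0)   # True: the step is still representable

# ...but 100 km from the origin the float32 spacing is ~7.8 mm,
# so the same 1 mm step is absorbed entirely:
far = f32(f32(100_000.0) + f32(0.001))
print(far == f32(100_000.0))   # True: the step was lost
```

This is exactly the kind of sub-centimetre jitter that shows up in animation and physics far from the origin; doubles push the problem out by many orders of magnitude but don't remove it.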
Transforming the world to local coordinates (so the camera is always at
the origin) also seems like a solution to me, but it sounds like a lot more
work and messier code.
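That camera-at-origin idea is usually called a "floating origin" or origin rebasing. A minimal sketch of how it could look, with illustrative names and a made-up threshold (the messy part in a real engine is applying the offset to physics state, cached transforms, etc.):

```python
# Hypothetical floating-origin rebase: when the camera drifts too far
# from (0, 0, 0), shift the camera and every object back by the camera's
# position, so coordinates near the viewer stay small and precise.

REBASE_THRESHOLD = 1000.0  # world units; assumed value, tune per game

def maybe_rebase(camera_pos, object_positions):
    """Return (new_camera_pos, new_object_positions, applied_offset)."""
    dist = max(abs(c) for c in camera_pos)
    if dist < REBASE_THRESHOLD:
        # Camera is still close to the origin; nothing to do.
        return camera_pos, object_positions, (0.0, 0.0, 0.0)
    offset = camera_pos
    new_cam = (0.0, 0.0, 0.0)
    new_objs = [tuple(p - o for p, o in zip(pos, offset))
                for pos in object_positions]
    return new_cam, new_objs, offset
```

Relative positions are unchanged by the shift, which is why rendering and physics near the camera stay stable after a rebase.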
So, how is this solved in most cases?
cheers!
Juan