Hi all. I have a theoretical issue to discuss with you.
As you probably know, user (or world, page, logical, etc.) space extents are usually much
larger than device space extents and sometimes, when you have a very large entity (line,
rectangle, whatever) defined in user space and you apply a scale transformation -- the library
doesn't matter here -- to zoom in over a part of the large entity, you reach a point where the
entity disappears (can't be rendered) simply because, at some scale, the transformed vertices
of the entity generate an overflow in device space; but if you zoom in further, you might
encounter some smaller entities in the same spot! I'll try to illustrate better with an
example. Say, our user space supports 64-bit (double) coordinates and our device space is
GDI32 (2^27 units) and we have two lines (in good ol' Cartesian system):
(-100.0, -100.0, +100.0, +100.0) [entity A]
and
(-1000000000000.0, +1000000000000.0, +1000000000000.0, -1000000000000.0) [entity B]
As we can see, the two lines cross at point (0, 0), so logically, if I zoom in enough to show
line A in my viewport, I should also be able to see (to draw, a programmer may say) line B as
well; but unfortunately this doesn't happen: when line A is visible, the scaling factor is so
big that the mapped coordinates of line B exceed 27 bits. One solution would be, of course, to
calculate the intersection between line B and my viewport and draw only the visible line
segment. Fine, but what if B were a spline (NURBS), an ellipse, etc.? Moreover, periodic curves
can have more than one segment intersecting the viewport, and the formulas and equations
become quite complex. Besides, the idea of down-sampling the user-space entity enough to make
it map correctly into device space, flattening it, and finally up-sampling the resulting line
segments is not acceptable because of the loss of precision (curve smoothness) incurred.
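To make at least the straight-line case concrete, here is a rough C++ sketch of what I mean by
clipping B in user (double) space before mapping (Liang-Barsky style clipping; the viewport
and scale values are made up just for illustration, they are not from any real program):

#include <cstdio>

// Sketch only: clip a segment against a user-space viewport in double
// precision BEFORE mapping to device units, so the mapped result stays small.
struct Pt { double x, y; };

bool clipSegment(Pt& p0, Pt& p1, double xmin, double ymin, double xmax, double ymax)
{
    double dx = p1.x - p0.x, dy = p1.y - p0.y;
    double t0 = 0.0, t1 = 1.0;
    double p[4] = { -dx, dx, -dy, dy };
    double q[4] = { p0.x - xmin, xmax - p0.x, p0.y - ymin, ymax - p0.y };
    for (int i = 0; i < 4; ++i) {
        if (p[i] == 0.0) {                 // segment parallel to this edge
            if (q[i] < 0.0) return false;  // and entirely outside it
        } else {
            double t = q[i] / p[i];
            if (p[i] < 0.0) { if (t > t1) return false; if (t > t0) t0 = t; }
            else            { if (t < t0) return false; if (t < t1) t1 = t; }
        }
    }
    Pt a = { p0.x + t0 * dx, p0.y + t0 * dy };
    Pt b = { p0.x + t1 * dx, p0.y + t1 * dy };
    p0 = a; p1 = b;
    return true;
}

int main()
{
    // Viewport chosen so that line A (-100..+100) fills it: 1000 device units
    // across 200 user units gives scale = 5 device units per user unit.
    const double scale = 5.0;
    const double xmin = -100, ymin = -100, xmax = 100, ymax = 100;

    Pt b0 = { -1e12,  1e12 }, b1 = { 1e12, -1e12 };   // entity B
    // Unclipped, b0.x * scale = -5e12, way beyond GDI's ~ +/-2^27 (~1.34e8).
    if (clipSegment(b0, b1, xmin, ymin, xmax, ymax)) {
        // The clipped endpoints now map to small device coordinates.
        printf("clipped B: (%g, %g) - (%g, %g)\n", b0.x, b0.y, b1.x, b1.y);
        printf("device:    (%g, %g) - (%g, %g)\n",
               b0.x * scale, b0.y * scale, b1.x * scale, b1.y * scale);
    }
    return 0;
}

That works fine for lines; my problem is doing the equivalent for curves.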
Let me point out that when the mapped coordinates can't fit in device space, the entity cannot
be flattened (using GDI's FlattenPath()). I tried to flatten (generate) NURBS curves using a
sample from the Internet, but for some reason the output doesn't exactly match the same spline
when drawn in AutoCAD(R). Anyway, I wish I knew how to flatten all kinds of curves accurately;
my problem would be solved. I'm trying to study curve intersection now.. damn hard.
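For what it's worth, this is the kind of flattening I have in mind, sketched for a single cubic
Bezier segment with de Casteljau subdivision (a NURBS would first have to be split into
rational Bezier pieces by knot insertion, which I'm not showing, and this is certainly NOT
AutoCAD's exact algorithm; the control points and tolerance are made up):

#include <cmath>
#include <cstdio>
#include <vector>

// Sketch only: adaptive flattening of a cubic Bezier in USER space (doubles),
// so the resulting polyline can be clipped and mapped afterwards.
struct Pt { double x, y; };

static double distToLine(Pt p, Pt a, Pt b)
{
    double dx = b.x - a.x, dy = b.y - a.y;
    double len = std::sqrt(dx * dx + dy * dy);
    if (len == 0.0) return std::hypot(p.x - a.x, p.y - a.y);
    return std::fabs((p.x - a.x) * dy - (p.y - a.y) * dx) / len;
}

static void flatten(Pt p0, Pt p1, Pt p2, Pt p3, double tol,
                    std::vector<Pt>& out, int depth = 0)
{
    // Flat enough (or recursion limit hit): emit the segment end point.
    bool flat = distToLine(p1, p0, p3) <= tol && distToLine(p2, p0, p3) <= tol;
    if (flat || depth > 24) { out.push_back(p3); return; }

    // de Casteljau split at t = 0.5 into two half-curves, recurse on each.
    auto mid = [](Pt a, Pt b) { return Pt{ (a.x + b.x) * 0.5, (a.y + b.y) * 0.5 }; };
    Pt p01 = mid(p0, p1), p12 = mid(p1, p2), p23 = mid(p2, p3);
    Pt p012 = mid(p01, p12), p123 = mid(p12, p23);
    Pt m = mid(p012, p123);
    flatten(p0, p01, p012, m, tol, out, depth + 1);
    flatten(m, p123, p23, p3, tol, out, depth + 1);
}

int main()
{
    std::vector<Pt> pts;
    Pt p0{0, 0}, p1{50, 100}, p2{150, 100}, p3{200, 0};   // made-up control points
    pts.push_back(p0);
    // The tolerance is in user units; picking it from the current zoom keeps
    // the flattened polyline smooth at the current scale.
    flatten(p0, p1, p2, p3, 0.25, pts);
    printf("%zu points\n", pts.size());
    return 0;
}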
All the above applies to Win32. I know little about GDI+.. but actually, even if the device
space gets larger, the idea of drawing ONLY what's visible in the current viewport (by
generating a new -- usually smaller -- set of sub-entities) should rule IMHO, both for
performance reasons and because it simply makes more sense -- and it allows a much broader
zooming range, BTW.
So guys! If this has been done before, or if anyone has any clues, books, links.. please let
me know :-) If anyone knows how to generate (by code) an AutoCAD(R) spline accurately, please
help me out.

In another forum, I even suggested that this kind of issue (mapping, clamping, clipping..)
should be SOLVED once and for all and become an integral part of all graphics library
packages! Don't you agree?
PS. If anyone knows any other forums or places on the net where I can further discuss these
issues, I would appreciate you letting me know!
Thanks
Robert
Comments
Basically you must design the world from the very start with clipping in mind. One common strategy is to partition the world into a regular grid of cubes (or squares in 2D). Non-visible cells can then be excluded very quickly. This depends on each object within the world being smaller than a cell, which may cause difficulties for large, high-level curves. Something along these lines (a made-up minimal grid, just to illustrate the idea) is sketched below.
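#include <cmath>
#include <cstdio>
#include <map>
#include <utility>
#include <vector>

// Sketch only: bucket entity ids into fixed-size cells by their user-space
// bounding box; to redraw, walk only the cells overlapped by the viewport.
// The cell size, ids and boxes below are invented for illustration.
struct Box { double xmin, ymin, xmax, ymax; };

struct Grid {
    double cell;                                   // cell edge length in user units
    std::map<std::pair<long long, long long>, std::vector<int>> cells;

    void insert(int id, const Box& b) {
        for (long long i = (long long)std::floor(b.xmin / cell); i <= (long long)std::floor(b.xmax / cell); ++i)
            for (long long j = (long long)std::floor(b.ymin / cell); j <= (long long)std::floor(b.ymax / cell); ++j)
                cells[{i, j}].push_back(id);
    }

    std::vector<int> query(const Box& view) const {
        std::vector<int> hits;
        for (long long i = (long long)std::floor(view.xmin / cell); i <= (long long)std::floor(view.xmax / cell); ++i)
            for (long long j = (long long)std::floor(view.ymin / cell); j <= (long long)std::floor(view.ymax / cell); ++j) {
                auto it = cells.find({i, j});
                if (it != cells.end()) hits.insert(hits.end(), it->second.begin(), it->second.end());
            }
        return hits;
    }
};

int main() {
    Grid g;
    g.cell = 100.0;
    g.insert(1, {-100, -100, 100, 100});           // entity A occupies only a few cells
    // A huge entity like B would span far too many cells; it would have to be
    // stored separately or in a coarser level, which is exactly the difficulty
    // mentioned above.
    std::vector<int> visible = g.query({-50, -50, 50, 50});
    printf("%zu candidate entities\n", visible.size());
    return 0;
}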
Levels of detail are a similar problem. A lot of games get around it by excluding distant views. For instance, Doom was set in a maze so that you could never see more than about five cells away before hitting a wall.
Robert