OpenGL display info from Jeff LaSor (collected from newsgroup posts and other sources)
I am wondering how the _TestSetAALevel command is going to be integrated.
As it stands now, we have to run the command every time we open Rhino or a
new file.
We all agree this needs to be in the UI somewhere…and although it might
seem logical, I'm not sure the display modes are the right place….Here's
why….Setting the anti-alias mode (really called multi-sample mode)
consumes a lot more video memory than one might think…thus, if you put it
inside the display modes, then every view using that mode will now also be
using more video memory, especially if you work in high resolutions with
maximized views.
By having it separated from the view mode, you can have
different AA settings per view, optimizing your video memory consumption
(now, granted, you could set up custom display modes all using different AA settings, but that might get to be a little much, especially when you want
to change/adjust certain display attributes)…this lets you optimize how
much AA you want, when you want it….There are several people who want max
AA settings, but when they maximize their view, they notice that the AA
level is degraded….that's not Rhino doing that, it's the drivers…and it
happens because they're running out of video memory and the drivers are
adjusting as best they can…if the other views weren't also max'd out in
AA, then this wouldn't happen.
Our first thought was to put it in the display modes, but after thinking
about it for a bit, we're not sure that's “optimal”…it most likely will
end up there; I'm just trying to explain why it's not there now….and that
we're trying to figure out better ways to give you the best possible
performance with this type of setting.
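As a side note, if you want to check what AA level the driver actually granted a view (as opposed to the level you asked for), the OpenGL context itself can be queried. Here is a minimal C sketch, not something Rhino does for you; it assumes an OpenGL rendering context is already current for the view being checked, and uses the standard GL_SAMPLE_BUFFERS / GL_SAMPLES queries from the ARB_multisample extension:

/* Sketch: report the multisample (AA) level the driver actually granted.
 * Assumes an OpenGL rendering context is already current for the view
 * being checked. */
#include <stdio.h>
#include <windows.h>   /* needed before gl.h on Windows */
#include <GL/gl.h>

/* Tokens from the ARB_multisample extension, in case the local gl.h
 * predates OpenGL 1.3. */
#ifndef GL_SAMPLE_BUFFERS
#define GL_SAMPLE_BUFFERS 0x80A8
#endif
#ifndef GL_SAMPLES
#define GL_SAMPLES 0x80A9
#endif

void ReportGrantedAALevel(void)
{
    GLint sample_buffers = 0;
    GLint samples = 0;

    glGetIntegerv(GL_SAMPLE_BUFFERS, &sample_buffers);
    glGetIntegerv(GL_SAMPLES, &samples);

    if (sample_buffers > 0 && samples > 1)
        printf("Multisampling active: %d samples per pixel\n", (int)samples);
    else
        printf("No multisampling on this context\n");
}

If the number reported is lower than the level you requested, the driver has quietly scaled back, which is usually exactly the video memory situation described above.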
When importing a background bitmap into Rhino for reference purposes, the
image is being distorted as though some intense antialiasing filter is being
applied. As a result, a nice sharp reference blueprint is made unreadable.
We're lucky to be able to read blueprints at all, but when high-quality
2-bit scans are corrupted into unreadability, it destroys any utility Rhino
may have in producing accurate loft line tracings.
I've tried “turning off” filtering when placing a background bitmap, and have
even turned it on, turned it off, etc.
No luck. Same fuzzy garbage. I've looked for the solution, but can't find
it in the help file. Is there something I'm missing, or is this part of
a glorious application gloriously goofed up?
Without your example I can only speculate… but let me try to clear some
things up here….
It is incorrect to think that higher-resolution images will provide more
detail in Rhino's OpenGL modes…Why? Because OpenGL has 2 limitations:
1) Bitmaps must be a power of 2 in both width and height
2) Textures have limitations on what their size can be, based on your
video card and driver.
Given those 2 things, here's what Rhino MUST do if either of these (or both)
are not satisfied:
1) Rhino must resize the image (either upwards or downwards) so that the
width and height are both powers of 2 (2, 4, 8, 16, 32, 64, 128, 256, 512,
1024, 2048, 4096, etc…).
2) Rhino must resize the image (downwards) to a power of 2 if the current
image resolution is greater than that supported by the card/driver.
If this isn't done, then the image will not display at all.
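To make those two adjustments concrete, here is a rough C sketch of the kind of size correction an OpenGL application has to make before an image can be used as a texture. This is NOT Rhino's actual code, and the round-up policy is only an illustration (Rhino may round a dimension up or down):

/* Sketch of the power-of-two / maximum-size adjustment described above.
 * NOT Rhino's actual code; the round-up policy is only an illustration. */
#include <stdio.h>

/* Smallest power of two that is >= n. */
static int NextPowerOfTwo(int n)
{
    int p = 1;
    while (p < n)
        p *= 2;
    return p;
}

/* The dimension that would actually be used for the texture: rounded to
 * a power of two, and never larger than the card/driver limit. */
static int TextureDimension(int image_size, int max_size)
{
    int p = NextPowerOfTwo(image_size);
    return (p > max_size) ? max_size : p;
}

int main(void)
{
    /* A 1600 x 1200 blueprint scan on a card whose limit is 2048 ends up
     * stored as 2048 x 2048; on an older card limited to 1024 it gets
     * squeezed down to 1024 x 1024, and that resampling is where the
     * "fuzziness" comes from. */
    printf("1600 -> %d (limit 2048)\n", TextureDimension(1600, 2048));
    printf("1200 -> %d (limit 2048)\n", TextureDimension(1200, 2048));
    printf("1600 -> %d (limit 1024)\n", TextureDimension(1600, 1024));
    return 0;
}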
So, if either of these is getting done to your blueprints (especially #2),
then Rhino is altering them internally, and in the process, it's causing
things to get "distorted". Why this seems to have changed from one beta to
the next, I'm not sure…but no matter what answer or solution I come up with
here, there will be others who won't like it, and will prefer the way things
are now…I have yet to come up with something that everyone likes
in this area…
Having said that, I strongly recommend you try resizing your images manually
to match the criteria I've outlined above, so that Rhino won't touch them at
all…and theoretically, they should display exactly the way you would
expect them to. Depending on your video card, I wouldn't go any higher than
2048×2048…most mid-range and higher cards support at least that…The next
beta will have this information in the OpenGL options page so that you can
see what your card's limits are…but for now, try sizes less than or equal
to 2048 and see if things look any better.
Please remember, just because you load an image that is 1000000 x 1000000
doesn't mean that's the resolution Rhino is using (for the reasons I've
stated above)…so you might feel that 2048 is too small to get the details
you want…but again, if 2048 is your card's limit, then that's what size is
actually getting used anyway.
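Until that options page shows up, you can ask the driver for the limit yourself. A minimal sketch, assuming an OpenGL context is already current; GL_MAX_TEXTURE_SIZE is the standard query for the largest texture dimension the card/driver will accept:

/* Sketch: ask the driver for the largest texture dimension it accepts.
 * Assumes an OpenGL rendering context is already current. */
#include <stdio.h>
#include <windows.h>   /* needed before gl.h on Windows */
#include <GL/gl.h>

void ReportMaxTextureSize(void)
{
    GLint max_size = 0;
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &max_size);
    printf("Largest texture this card/driver accepts: %d x %d\n",
           (int)max_size, (int)max_size);
}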
I have just recently discovered some issues with ATI's FireGL cards and
their latest drivers…I believe I have fixed the problems on Rhino's end,
but also, I've determined that the drivers need to have their own
configuration profile for Rhino…
Note - If you currently have a FireGL card and are not experiencing any
display problems/issues, then I do NOT recommend you do any of the following
(i.e. if it ain't broke, don't fix it)…However, if you are experiencing some display grief using a FireGL card, then try the following:
1) Go into ATI's advanced drivers settings
2) Go to the Configuration tab and:
o Select “Add” and name the new profile “Rhino”
o UNCHECK the “Enable 8bit double buffered Overlay Panes” option
o CHECK the “Force copy swap” option
o Click OK to save it.
3) Go into Rhino's OpenGL settings and make sure all options are checked
EXCEPT for “Do not use OpenGL for drawing feedback items”…that one should
NOT be checked.
Unfortunately, ATI profiles don't work the way nVidia's do…they are not
automatically enabled when the program is run…so you have to manually
select the profile prior to launching Rhino.
Having done this, it seems to have cleared up ALL of the display artifacts
and window sizing issues, as well as allowed me to use the "High" setting on
TestSetAALevel.
Please try this and let me know if it helps (or makes things worse) on your
end….and again, I do NOT recommend doing any of this if you're not
experiencing any display grief.
Is there a limited number of windows you can have in OpenGL? I'd
guess around 6? If I have 6 render-preview windows open, one will
usually start to have problems displaying…
True? False? Anyone else see this?
True and False …. How's that for an answer…
The number of OpenGL “anythings” is really determined by the amount of video
memory available and how that memory is being used. If you're
running in a very high resolution and your views are maximized, then each
OpenGL context can be using quite a bit of memory….For example:
Let's say your monitor is running at 1600×1280 and, for simplicity, that your
maximized viewport is the same size…Then you have the following:
1) 1600 x 1280 x 4 = 8 MB of memory needed for the frame buffer
2) 1600 x 1280 x 4 = 8 MB of memory needed for the back buffer
3) 1600 x 1280 x 3 or 4 = 6-8 MB of memory needed for the depth buffer
So far, we're up to around 22-24 MB of memory for just one viewport that's
not displaying anything.
And if you're using AA (or multi-sample) modes, this can double or even
triple that amount.
Now, if you've got quite a few textures mapped in your scene, and those
textures are also large, then those too will take from the video memory
pool.
Multiply this all by the number of viewports you've got and you can see how
fast you can run out of 128 MB or even 256 MB of video memory.
Granted, I'm using large resolutions here, but you get the picture…
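If you want to run the numbers for your own setup, here is a back-of-the-envelope C sketch of the arithmetic above. The buffer sizes are rough approximations, the aa_factor parameter is just an illustrative knob (not anything Rhino exposes), and real drivers pad and allocate things differently:

/* Back-of-the-envelope estimate of the video memory one viewport needs,
 * following the arithmetic above.  Rough approximation only. */
#include <stdio.h>

/* aa_factor is an illustrative knob: 1.0 for no AA; per the post above,
 * multisampling can roughly double or triple the total (2.0 or 3.0). */
static double ViewportMegabytes(int width, int height, double aa_factor)
{
    double frame = (double)width * height * 4; /* front buffer, 32-bit color  */
    double back  = (double)width * height * 4; /* back buffer                 */
    double depth = (double)width * height * 4; /* depth buffer, 3-4 bytes/px  */

    return (frame + back + depth) * aa_factor / (1024.0 * 1024.0);
}

int main(void)
{
    /* The 1600 x 1280 example from above. */
    printf("1 viewport,  no AA      : %6.1f MB\n", ViewportMegabytes(1600, 1280, 1.0));
    printf("1 viewport,  AA doubled : %6.1f MB\n", ViewportMegabytes(1600, 1280, 2.0));
    printf("4 viewports, AA doubled : %6.1f MB\n", 4.0 * ViewportMegabytes(1600, 1280, 2.0));
    return 0;
}

Even with the conservative "doubling" factor, four maximized viewports at that resolution come to nearly 190 MB before a single texture is loaded, which is how a 128 MB or 256 MB card runs dry.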
In both V3 and V4 the wireframe display is set by default to Windows
(not OpenGL accelerated). Why is this?
You'd be surprised at how many educational institutions use Rhino,
and do so on sub-standard equipment…
If Rhino doesn't work for these people right out of the box,
they won't use it… Forcing Windows Wireframes as the default almost
guarantees Rhino will work out of the box on any system… So until we're
confident that even the lowest-end computers have sufficient OpenGL support,
this is how it's going to remain.
I realize that Rhino could try to "detect" the video card and driver version and
make the appropriate settings, and we probably will start trying to do that
in later SR's… just not SR0.