
ATI fixes rendering issues in their latest driver release

Posted: 2004-09-04, 23:52
by Carsten
I just got information that the latest release of the ATI Radeon drivers finally contains a fix for an earlier driver bug that caused rendering issues with Ca3DE on ATI Radeon graphics boards.

It took ATI almost a year to fix the issue, but I'm happy that they eventually moved. After all, working with their Tech Support was a positive experience, especially compared to companies of similar size and products. I was always worried that people would misattribute the rendering bugs to the Ca3D-Engine when in fact the driver was broken, but that is finally settled now. 8)

Posted: 2004-09-05, 00:56
by Dodger
Hi Carsten,

great to hear that! Of course it was not a Ca3DE error.
Shame on the people who thought that :D ;)

Dodger

It works with the new ATI drivers

Posted: 2004-09-05, 12:21
by Firefax
Hi,
I tested Ca3DE with the new ATI drivers and Ca3DE works fine. Right now, I have version 4.8 (with Catalyst Control Center) installed. My gfx card is a Sapphire Radeon 9800 Pro 128MB.
CYA
Firefax

Posted: 2004-09-05, 22:26
by Advis
I don't own an ATI card, but it is of utmost interest to me that any games we develop don't have these issues, so this is a very positive thing.

Hopefully this also brings a speed increase; is there any evidence to suggest that?

Posted: 2004-09-06, 00:04
by Carsten
Update:
I got some screenshots from Firefax and his ATI Radeon 9800. Well, maybe the report above came too early: while the situation did indeed improve, the screenshots indicated a lower than expected frame-rate, and still not the proper lighting in all situations.
I still have to run additional tests of my own before I can say more. Maybe the issue is settled, maybe it's not. I'll keep looking into it. Stay tuned.

Posted: 2004-09-06, 00:16
by Carsten
Advis wrote:Hopefully this also brings a speed increase; is there any evidence to suggest that?
No.

The problem that caused the rendering issues is indeed accompanied by low FPS. That has not improved.

If you have an NVidia board, please note that the FPS with real-time lighting is generally lower than with lightmap-based lighting; see this FAQ for more details.

Posted: 2004-09-27, 04:52
by RAZOR
I saw somewhere that with NVidia drivers, OpenGL commands return immediately, but with ATI drivers they only return once the action has been performed, meaning you get 100% CPU usage even when all the processor is doing is sending data to the graphics card. Would that account for the bad performance?

Posted: 2004-09-27, 10:50
by Carsten
RAZOR wrote:Would that account for the bad performance?
I doubt it. While I have also observed that individual OpenGL function calls take very little time on NVidia, the time is consumed later when flushing and swapping the frame buffers; that is, finishing a frame takes relatively long compared to issuing its rendering commands.

On the other hand, virtually all real-time 3D software consumes 100% CPU, and even if the OpenGL implementation behaved vastly differently on NVidia and ATI drivers, the rendering code could hardly be changed to account for this.

Posted: 2004-09-28, 15:06
by RAZOR
[EDIT: RAZOR, I'm very sorry, as I've accidentally edited your posting rather than posting a quoted reply to it. Unfortunately, I only realized this after I was unable to recover your original message.
I'm very sorry!
Carsten]

Excreta occurs*. My post was mostly about how, if the rendering ran in a separate thread, it wouldn't matter how long the OpenGL commands took to return, because the game code would still be processed; but that's a hell of a thing to do. I also said that if you issue the flush and swap commands at the start of a new frame instead of at the end of the old one, the CPU and graphics card work in parallel more. I think that was about it.

*For those who have never heard this saying before, especially non-native English speakers, it amounts to "sh** happens".

Posted: 2004-09-28, 17:09
by Carsten
RAZOR wrote:Up to you if you feel like playing around with it.
:-) Not really.
The next thing I'll do is change the Ca3D-Engine to universally employ the new Ca3DE Material System ("MatSys"). The MatSys already works in experimental mode in CaWE (looking very promising), and fully in the new ModelViewer. Thus, tomorrow I'll start making the engine use it, too, which will be a big step forward. All these items are not yet publicly available, but will be with the next release.

To get back to the original subject: I don't see much gain in having the rendering code in a separate thread from the main game loop, neither with the old rendering nor in light of the new MatSys. Threading buys no performance here, and, as you already mentioned, it introduces a lot of subtle synchronization issues that can be more expensive to resolve than anything it gains. (Ca3DE never lets the CPU idle anyway, so there is nothing to win by putting the rendering into a separate thread. ;) )

Posted: 2004-10-11, 22:43
by Shadow
Yes, I noticed that Ca3DE takes up a good deal of processor time. Is there a way we could disable unneeded functions of the engine? For example, if we have a single-player game, can we disable all the multiplayer stuff, or disable some of the advanced graphics rendering?

Posted: 2004-10-12, 00:06
by Carsten
Shadow wrote:Yes, I noticed that Ca3DE takes up a good deal of processor time. Is there a way we could disable unneeded functions of the engine? For example, if we have a single-player game, can we disable all the multiplayer stuff?
Sorry, no. Multi-player is indeed more expensive than single-player, but the real and most important bottleneck is probably the graphics.
or disable some of the advanced graphics rendering?
Yes, that is possible. Please have a look here:
http://www.ca3d-engine.de/phpBB2/viewtopic.php?t=7
Maybe "r_style 4" is what you're looking for.
The upcoming Material System will probably permit even finer control over rendering features.

Posted: 2004-10-12, 00:09
by Advis
This Material System just sounds so cool... stop it, Carsten, you are making me drool!