NVIDIA and ATI

A few days ago I bought a netbook with an ATI Radeon graphics card. Before that I had always used cards from NVIDIA and Intel. After a day of installing programming and profiling tools, I ran L.B.A. Remake as a test. And of course it didn't work. But after removing the CUDA support that I needed for my thesis, it started without problems.

But once the app started, I saw another problem: there was some weird stuff on the screen that didn't look the way it should. After two days of searching for the reason, I found that:

  • NVIDIA allows more than it should. - I had invalid code (I probably wrote it very late at night, because it was so stupid that when I saw it I wondered how I could ever have written it :] ). For each type of shader input (position, normals, texcoord0, ...) the code called glBufferData with the size of all inputs and the data pointer set to NULL. It worked fine on NVIDIA, but not on ATI. (See the first sketch after this list.)
  • NVIDIA and ATI bind inputs differently. - POSITION is always ATTR0, but the other inputs are not. In the end I decided to use defines with ATTRi in the shaders, binding one index for each input type. I think this is the safest and most efficient way. (See the second sketch after this list.)
  • Default value of input data - if an input is not bound, NVIDIA sets COLOR0 to (1,1,1,1), but ATI uses (0,0,0,0). (See the third sketch after this list.)
There are probably other differences between them, but so far I haven't run into any more in the program. If I find any, I will share them.
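
To show what the first point was about, here is a minimal sketch of the safer pattern; the names and buffer layout are my own examples, not the game's actual code. The idea is to reserve the buffer storage once and then upload each input with glBufferSubData, instead of re-specifying the buffer per input with a NULL pointer and hoping the driver keeps the old contents.

```cpp
#include <GL/glew.h>
#include <vector>

struct VertexStream {
    const void* data;  // CPU-side data for one input type (positions, normals, ...)
    GLsizeiptr  size;  // size of that data in bytes
};

GLuint CreateVertexBuffer(const std::vector<VertexStream>& streams)
{
    GLsizeiptr total = 0;
    for (const VertexStream& s : streams)
        total += s.size;

    GLuint vbo = 0;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);

    // Reserve the whole buffer once. A NULL pointer is fine here, because this
    // call only allocates storage; the contents are undefined until uploaded.
    glBufferData(GL_ARRAY_BUFFER, total, NULL, GL_STATIC_DRAW);

    // Upload each input into its own region. Calling glBufferData again per
    // input with data == NULL re-specifies the buffer with undefined contents,
    // which is exactly the kind of thing one driver tolerates and another doesn't.
    GLintptr offset = 0;
    for (const VertexStream& s : streams) {
        glBufferSubData(GL_ARRAY_BUFFER, offset, s.size, s.data);
        offset += s.size;
    }

    glBindBuffer(GL_ARRAY_BUFFER, 0);
    return vbo;
}
```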
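
For the second point, this is roughly what fixed attribute slots can look like on the C++ side, assuming Cg-style ATTRi semantics in the shaders. The enum values are my own example; only POSITION = ATTR0 is something both vendors agree on by default.

```cpp
#include <GL/glew.h>

// One agreed slot per input type, shared between the C++ code and the shader
// ATTRi semantics (e.g. the normal input would be declared with : ATTR1).
enum VertexAttrib {
    ATTR_POSITION  = 0,
    ATTR_NORMAL    = 1,
    ATTR_COLOR0    = 2,
    ATTR_TEXCOORD0 = 3,
};

// Bind one mesh's streams to those fixed slots. The offsets are byte offsets
// into the VBO created above; stride 0 means each stream is tightly packed.
void BindMeshStreams(GLuint vbo, GLintptr normalOffset, GLintptr texcoordOffset)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);

    glEnableVertexAttribArray(ATTR_POSITION);
    glVertexAttribPointer(ATTR_POSITION, 3, GL_FLOAT, GL_FALSE, 0, (const void*)0);

    glEnableVertexAttribArray(ATTR_NORMAL);
    glVertexAttribPointer(ATTR_NORMAL, 3, GL_FLOAT, GL_FALSE, 0, (const void*)normalOffset);

    glEnableVertexAttribArray(ATTR_TEXCOORD0);
    glVertexAttribPointer(ATTR_TEXCOORD0, 2, GL_FLOAT, GL_FALSE, 0, (const void*)texcoordOffset);
}
```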
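
And for the third point, a small sketch of setting the COLOR0 fallback explicitly, so the value no longer depends on the driver (the attribute index is again just an example):

```cpp
#include <GL/glew.h>

static const GLuint ATTR_COLOR0 = 2;  // must match the COLOR0 slot used above

// When a mesh has no per-vertex colors, pick the fallback value ourselves.
// A disabled attribute array falls back to the current generic attribute value
// set with glVertexAttrib4f, so both vendors see the same constant instead of
// (1,1,1,1) on NVIDIA and (0,0,0,0) on ATI.
void SetDefaultVertexColor()
{
    glDisableVertexAttribArray(ATTR_COLOR0);
    glVertexAttrib4f(ATTR_COLOR0, 1.0f, 1.0f, 1.0f, 1.0f);
}
```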

At the end of work yesterday I was happy that I had finally gotten my hands on an ATI card. Thanks to that I found some stupid code, and for now I have hope that the game will run on most graphics cards.
