My custom software can't get hardware-accelerated OpenGL, but OpenGL Extensions Viewer detects two rendering contexts on my machine: GDI Generic and the GeForce 9400M, which is the one I would like my software to choose. I have the exact same problem on several machines. I would really appreciate any feedback, alternatives, or fixes!

My OpenGL initialization is almost entirely the same as NeHe's code, and NeHe's original code fails as well. I can even run the tests in OpenGL Extensions Viewer on both renderers and clearly see that GDI Generic fails most of them, so why does it get picked when using ChoosePixelFormat and then SetPixelFormat? My PixelFormatDescriptor looks like this:

pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;

Hi again! I tried changing to a 24-bit depth buffer, but with no luck. I haven't implemented the DescribePixelFormat approach yet, but from the earlier posts I doubt it will solve my problem.

I have now tried the code from the bottom of this page ( ) to go through all the available pixel formats: none of them are marked as ICD, and all of them are marked as software modes. The same happens with DescribePixelFormat with argument 1, and the window style flags were already set. Just for information, so that I haven't left anything out: I'm running Windows 7 on a 13" MacBook Pro with a GeForce 9400M graphics card.

For reference, the error dialog the game shows reads: "InfinityClientProto requires hardware acceleration. If InfinityClientProto still doesn't run, try to change your desktop settings (16 to 32 bit colors), or disable antialiasing and other unusual settings. You can press OK to ignore the error and continue at your own risk."
From the OpenGL Extensions Viewer feature report:

OpenGL driver version check (Current: 6., Latest known: 6.): According to the database, you are running the latest display drivers for your video card.
This feature improves OpenGL performance by using video memory to cache transformed vertices.
This feature accelerates complex rendering such as lightmaps or environment mapping.
This feature provides an alternate method of coloring specular highlights on polygons.
This feature improves texture mapping performance in some applications by using lossy compression.
This feature improves performance in some applications by using AGP for dynamic vertex transformation.
This feature improves texturing quality by adding clamping control to edge texel filtering.
This feature enables a wide variety of effects via flexible vertex programming (equivalent to DX8 Vertex Shader).
This feature enables a wide variety of effects via per-pixel programming (equivalent to DX9 Pixel Shader).
This feature improves the quality of texture mapping on oblique surfaces.
This feature provides hardware-accelerated culling for objects.
This feature improves performance in some particle systems.
This feature enables high level shading language for shaders.
This feature enables render to texture functionality.

But then, here is the output from a prototype of a game I am trying to run (it also detects GDI Generic, like FurMark, and like the sample code file linked to above):

Log file of 'C:\Program Files (x86)\Infinity\ICP\InfinityClientProto.exe'
C:\Program Files (x86)\Infinity\ICP\InfinityClientProto.exe, run by Administrator.
Operating system: Windows 2003 Server ().
Features1: FPU VME DE PSE TSC MSR PAE MCE CX8 APIC MTRR PGE MCA CMOV PAT PSE36
Setting shader level: 2, real colors: 0, texture level: 2, preloading: 0
File: F:\Projects\I-Novae\Src\Engine\IOpenGLRenderer\COpenGLRenderer.cpp
The operation is not supported by your system/configuration
I've been experiencing a strange problem: OpenGL Extensions Viewer detects my renderer as my video card properly (NVIDIA GeForce Go 7950 GTX), but some other benchmarks, demos, and games do not, instead defaulting to "GDI Generic". OpenGL Extensions Viewer runs tests verifying me up to OpenGL 2.1 with good performance and 100% compatibility. FurMark, another benchmark, does not detect OpenGL 2.0 and defaults to GDI Generic. I have been told that FurMark uses the same initialization code as the following ( ) and that this detection/initialization process is quite standard. I am wondering if there is something in particular here that could be going wrong?

I have hardware acceleration set to full in my Windows preferences, and I have tried uninstalling the drivers, Driver Sweeping them, and re-installing. I have verified that I have the latest ForceWare drivers (x64 for my system). I am also wondering if this is related to my system: I am running Windows XP Pro 64-bit with two GeForce 7950s in SLI.

Here is an excerpt from OpenGL Extensions Viewer:

Renderer: GeForce Go 7950 GTX/PCI/SSE2
Shading language version: 1.20 NVIDIA via Cg compiler