SOLVED: Croquet on Debian Testing: problem with OpenGL

Jan Vrany
Hi,

I've finally solved my problem, and the Croquet SDK now
runs on my box. Thanks, Bert and Andreas!

I dug into the Squeak 3.9 VM source code and realized that the
whole problem is in the primitive

static int display_ioGLcreateRenderer(glRenderer *r, int x, int y, int w, int h, int flags)
/* located in platform/src/vm-display-X11/sqUnixX11.c */

The problem was that the routine looks for a visual with
16 bit color depth and a one bit stencil buffer, whereas my
graphics card supports only 16 or 24 bit color depth with
an 8 bit stencil buffer, so glXChooseVisual() fails. Because
no appropriate visual was found, no GLX context could
be created, so OpenGL failed to "initialize".
The color depth and stencil buffer size are hardcoded in the
C source (in my opinion this should be parametrized, for example
by a command-line option such as -display).
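
To make this easy to try outside the VM, here is a small standalone
test in the same spirit. Note that the attribute list below is only my
reconstruction of what the stock VM asks for (16 bit color, 1 bit
stencil); the exact list in sqUnixX11.c may differ:
----
/* check_visual.c - compile with: cc check_visual.c -lGL -lX11 */
#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
  /* reconstructed request: roughly a 16 bit (5-6-5) visual with a stencil buffer */
  static int attrs[] = {
    GLX_RGBA,
    GLX_DOUBLEBUFFER,
    GLX_RED_SIZE,      5,
    GLX_GREEN_SIZE,    6,
    GLX_BLUE_SIZE,     5,
    GLX_DEPTH_SIZE,   16,
    GLX_STENCIL_SIZE,  1,  /* at least one stencil bit */
    None
  };
  Display *dpy = XOpenDisplay(NULL);
  XVisualInfo *vi;

  if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }
  vi = glXChooseVisual(dpy, DefaultScreen(dpy), attrs);
  if (vi) {
    printf("got visual 0x%lx, depth %d\n", (unsigned long)vi->visualid, vi->depth);
    XFree(vi);
  } else {
    printf("glXChooseVisual failed - no matching visual\n");
  }
  XCloseDisplay(dpy);
  return 0;
}
----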

So I changed those numbers in the source code to match
my graphics card's capabilities. After recompiling the VM,
the correct visual was found.

Unfortunately, another error arose. When the new, modified
version of display_ioGLcreateRenderer() calls
glXMakeCurrent(), an X protocol error (BadMatch) occurred
(and the whole VM crashed). I have absolutely no idea why the
BadMatch error was raised; the code looks fine.
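
One thing I did not try at the time, but which should pinpoint the
offending request, is to make the Xlib connection synchronous and
install an error handler, so the BadMatch is reported at the exact
call that causes it instead of somewhere later. Just a sketch, not
part of the VM sources:
----
#include <stdio.h>
#include <X11/Xlib.h>

/* print X protocol errors instead of letting Xlib abort */
static int reportXError(Display *dpy, XErrorEvent *ev)
{
  char text[256];
  XGetErrorText(dpy, ev->error_code, text, sizeof(text));
  fprintf(stderr, "X error: %s (request code %d.%d)\n",
          text, ev->request_code, ev->minor_code);
  return 0;
}

/* somewhere during display setup, before creating the renderer:
     XSetErrorHandler(reportXError);
     XSynchronize(stDisplay, True);
*/
----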

To solve this problem, I completely rewrote the whole
display_ioGLcreateRenderer() function from scratch, using the
glxgears example from Mesa 3D as a basis. After recompiling the
VM, the Croquet window appears and everything runs fine
(and reasonably fast :-)

This is my own version of the context-creating function:
----
static int visualAttributes[]= {
  GLX_STENCIL_SIZE,     0,  /* stencil size is filled in below - must be the first pair! */
  GLX_ALPHA_SIZE,       1,  /* request at least 1 alpha bit */
  GLX_RGBA,                 /* no indexed colors */
  GLX_DOUBLEBUFFER,         /* will swap */
  GLX_LEVEL,            0,  /* frame buffer, not overlay */
  GLX_DEPTH_SIZE,       24, /* decent depth */  
  GLX_AUX_BUFFERS,      0,  /* no aux buffers */
  GLX_ACCUM_RED_SIZE,   0,  /* no accumulation */
  GLX_ACCUM_GREEN_SIZE, 0,
  GLX_ACCUM_BLUE_SIZE,  0,
  GLX_ACCUM_ALPHA_SIZE, 0,
  None
};


static int display_ioGLcreateRenderer(glRenderer *r, int x, int y, int w, int h, int flags)
{
  XVisualInfo* visinfo = 0;
  Window win;
  GLXContext ctx;
  XSetWindowAttributes attr;
  int mask = 0;

  if (flags & B3D_STENCIL_BUFFER)
    visualAttributes[1] = 8; /* ATI supports just 8 bit stencil buffers */
  else
    visualAttributes[1] = 0; /* do not use stencil buffers */
  _renderWindow(r) = 0;
  _renderContext(r) = 0;

  printf("Creating renderer\n");

  /* sanity checks */
  if (w < 0 || h < 0)
    {
      DPRINTF(1, (fp, "Negative extent (%i@%i)!\r", w, h));
      return 0;
    }

  visinfo = glXChooseVisual(stDisplay, DefaultScreen(stDisplay), visualAttributes);
  if (!visinfo) {
    printf("Error: couldn't get a visual\n");
    return 0;
  }
  printf("Visual found\n");

  /* window attributes */
  attr.background_pixel = 0;
  attr.border_pixel = 0;
  attr.colormap = XCreateColormap(stDisplay, stWindow, visinfo->visual, AllocNone);
  attr.event_mask = StructureNotifyMask | ExposureMask | KeyPressMask;
  mask = CWBackPixel | CWBorderPixel | CWColormap | CWEventMask | CWOverrideRedirect;

  win = XCreateWindow(stDisplay, stWindow, x, y, w, h,
                      0, visinfo->depth, InputOutput,
                      visinfo->visual, mask, &attr);
  printf("X Window created\n");

  /* set hints and properties */
  {
    XSizeHints sizehints;
    sizehints.x = x;
    sizehints.y = y;
    sizehints.width  = w;
    sizehints.height = h;
    sizehints.flags = USSize | USPosition;
    XSetNormalHints(stDisplay, win, &sizehints);
    XSetStandardProperties(stDisplay, win, "GLX", "GLX",
                           None, (char **)NULL, 0, &sizehints);
  }

  ctx = glXCreateContext(stDisplay, visinfo, NULL, True);
  if (!ctx) {
    printf("Error: glXCreateContext failed\n");
    return 0;
  }
  printf("GLX context created\n");

  XMapWindow(stDisplay, win);
  printf("Window mapped\n");

  glXMakeCurrent(stDisplay, win, ctx);
  printf("GLX context set\n");

  XFree(visinfo);
  r->drawable = win;
  r->context = ctx;
  return 1;
}
----
(The full source is attached.)

In fact, the original function and my version do the same thing:
create a GLX context, create and map a window, and make the GLX
context current for that window. Yet on my box my version works
fine whereas the original does not.
I have no idea why...

Jan

On Tue, 2006-04-25 at 08:46 +0200, Jan Vrany wrote:

> Hmm... I've tried to set 32 bit depth, but the
> radeon driver said that 32 bit depth is not supported.
> The maximum is 24, and Croquet doesn't run at 24 bpp either.
>
> Does this mean that my system
> (IBM T41 with ATI Mobility Radeon 7500)
> is unable to run Croquet?
>
> The previous Croquet version ran fine.
>
> Thanks. Jan
>
> On Tue, 2006-04-25 at 08:16 +0200, Bert Freudenberg wrote:
> > Set your display depth to 32.
> >
> > Croquet needs an accelerated visual with stencil buffer, which,  
> > according to your glxinfo output, is not available.
> >
> > - Bert -
> >
> > On 25.04.2006 at 07:57, Jan Vrany wrote:
> >
> > > Hi,
> > >
> > > yes, I did. The GLX and DRI stuff works fine with
> > > other apps. glxinfo says:
> >
> >     visual  x  bf lv rg d st colorbuffer ax dp st accumbuffer  ms  cav
> > id dep cl sp sz l  ci b ro  r  g  b  a bf th cl  r  g  b  a ns b eat
> > ----------------------------------------------------------------------
> > 0x23 16 tc  0 16  0 r  .  .  5  6  5  0  0 16  0  0  0  0  0  0 0 None
> > 0x24 16 tc  0 16  0 r  .  .  5  6  5  0  0 16  8  0  0  0  0  0 0 Slow
> > 0x25 16 tc  0 16  0 r  .  .  5  6  5  0  0 16  0 16 16 16  0  0 0 Slow
> > 0x26 16 tc  0 16  0 r  .  .  5  6  5  0  0 16  8 16 16 16  0  0 0 Slow
> > 0x27 16 tc  0 16  0 r  y  .  5  6  5  0  0 16  0  0  0  0  0  0 0 None
> > 0x28 16 tc  0 16  0 r  y  .  5  6  5  0  0 16  8  0  0  0  0  0 0 Slow
> > 0x29 16 tc  0 16  0 r  y  .  5  6  5  0  0 16  0 16 16 16  0  0 0 Slow
> > 0x2a 16 tc  0 16  0 r  y  .  5  6  5  0  0 16  8 16 16 16  0  0 0 Slow
> > 0x2b 16 dc  0 16  0 r  .  .  5  6  5  0  0 16  0  0  0  0  0  0 0 None
> > 0x2c 16 dc  0 16  0 r  .  .  5  6  5  0  0 16  8  0  0  0  0  0 0 Slow
> > 0x2d 16 dc  0 16  0 r  .  .  5  6  5  0  0 16  0 16 16 16  0  0 0 Slow
> > 0x2e 16 dc  0 16  0 r  .  .  5  6  5  0  0 16  8 16 16 16  0  0 0 Slow
> > 0x2f 16 dc  0 16  0 r  y  .  5  6  5  0  0 16  0  0  0  0  0  0 0 None
> > 0x30 16 dc  0 16  0 r  y  .  5  6  5  0  0 16  8  0  0  0  0  0 0 Slow
> > 0x31 16 dc  0 16  0 r  y  .  5  6  5  0  0 16  0 16 16 16  0  0 0 Slow
> > 0x32 16 dc  0 16  0 r  y  .  5  6  5  0  0 16  8 16 16 16  0  0 0 Slow
> > 0x4b 32 tc  1  0  0 c  .  .  0  0  0  0  0  0  0  0  0  0  0  0 0 None
> >
> >
>
>

Attachment: sqUnixX11.c (53K)
Re: SOLVED: Croquet on Debian Testing: problem with OpenGL

Bert Freudenberg
On 26.04.2006 at 00:54, Jan Vrany wrote:

> static int display_ioGLcreateRenderer(glRenderer *r, int x, int y, int w, int h, int flags)
> /* located in platform/src/vm-display-X11/sqUnixX11.c */

You mean platforms/unix/vm-display-X11/sqUnixX11.c, right?

> The problem was that the routine looks for a visual with
> 16 bit color depth and a one bit stencil buffer, whereas my
> graphics card supports only 16 or 24 bit color depth with
> an 8 bit stencil buffer, so glXChooseVisual() fails.

This is not correct - from the glXChooseVisual man page:

   GLX_STENCIL_SIZE
       Must be followed by a nonnegative integer that
       indicates the desired number of stencil bitplanes.
       The smallest stencil buffer of at least the specified
       size is preferred.  If the desired value is zero,
       visuals with no stencil buffer are preferred.

That means 1 is the *minimum* number of stencil bitplanes. We
would surely like to get at least 8, but even one bit is better
than none.
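
To see what the returned visual actually provides, you could query it
with glXGetConfig, something like this (untested sketch; "attrs" stands
for whatever attribute list was passed to glXChooseVisual):
----
int stencilBits = 0, depthBits = 0, alphaBits = 0;
XVisualInfo *vi = glXChooseVisual(stDisplay, DefaultScreen(stDisplay), attrs);
if (vi)
  {
    glXGetConfig(stDisplay, vi, GLX_STENCIL_SIZE, &stencilBits);
    glXGetConfig(stDisplay, vi, GLX_DEPTH_SIZE,   &depthBits);
    glXGetConfig(stDisplay, vi, GLX_ALPHA_SIZE,   &alphaBits);
    printf("visual 0x%lx: depth %d, stencil %d, alpha %d\n",
           (unsigned long)vi->visualid, depthBits, stencilBits, alphaBits);
    XFree(vi);
  }
----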

> Because
> no appropriate visual was found, no GLX context could
> be created, so OpenGL failed to "initialize".
> The color depth and stencil buffer size are hardcoded in the
> C source (in my opinion this should be parametrized, for example
> by a command-line option such as -display).

No, all of these are minimum values.

> So I changed those numbers in source code to match
> my graphics card capabilities. After recompiling VM,
> correct visual was found.

That would indicate a bug in your GLX, IMHO. Can you verify that
changing 1 to 8 really makes a difference?
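
Something along these lines (untested sketch, helper name made up)
would show whether your GLX treats a request for 1 and a request for
8 stencil bits differently:
----
/* ask for the same kind of visual twice, differing only in stencil size */
static int testStencilRequest(Display *dpy, int stencilBits)
{
  int attrs[] = { GLX_RGBA, GLX_DOUBLEBUFFER,
                  GLX_DEPTH_SIZE,   16,
                  GLX_STENCIL_SIZE,  0,  /* filled in below */
                  None };
  XVisualInfo *vi;
  int found;

  attrs[5] = stencilBits;
  vi = glXChooseVisual(dpy, DefaultScreen(dpy), attrs);
  found = (vi != 0);
  printf("stencil >= %d: %s\n", stencilBits, found ? "visual found" : "no visual");
  if (vi) XFree(vi);
  return found;
}

/* e.g.: testStencilRequest(stDisplay, 1);
         testStencilRequest(stDisplay, 8); */
----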

> Unfortunately, another error arose. When the new, modified
> version of display_ioGLcreateRenderer() calls
> glXMakeCurrent(), an X protocol error (BadMatch) occurred
> (and the whole VM crashed). I have absolutely no idea why the
> BadMatch error was raised; the code looks fine.
>
> To solve this problem, I completely rewrote the whole
> display_ioGLcreateRenderer() function from scratch, using the
> glxgears example from Mesa 3D as a basis.
> [...]
> In fact, the original function and my version do the same thing:
> create a GLX context, create and map a window, and make the GLX
> context current for that window. Yet on my box my version works
> fine whereas the original does not.
> I have no idea why...

The only functional difference I can see is that you call  
glXCreateContext after XCreateWindow. Other than that, it appears you  
just removed the error cleanup code and added size hints, which are  
useless because we do not create a top-level window.

It would be very nice if you could investigate this further.

- Bert -