hi!
does anybody have any idea if it's possible to have BitBlt use a different endianness (or byte ordering)? Right now, to copy the Display to the screen, if it's 32 bits I just need to copy the bytes, but for 16 and 8 bits I need to flip the words/bytes around. Is there a way to tell Squeak to use a different byte ordering? Actually, I'm not even sure I'm doing the byte swapping properly.

I saw that there are some references to color depths of -32, -16, -8 and so on. Any idea what these are for? I tried them, but nothing happened (like nothing at all).

thanks!
    gera
Gerardo Richarte wrote:
> I saw that there are some references to color depths of -32, -16, -8
> and so on. Any idea what these are for? I tried them, but nothing
> happened (like nothing).

That is probably because you were doing something like here:

    form := Form extent: Display extent depth: -16.
    Display displayOn: form.
    form displayOn: Display.

The reason why you don't see any difference is simply that the conversion happens *twice*: once when blitting from the display and once when blitting to it (and besides, the whole point of BitBlt is to deal with the difference automatically, so it's quite natural that you can't observe any effect as long as Display functions properly). To see the actual effect (e.g., the difference in the bits) you need to manipulate the form's depth variable, like here:

    form := Form extent: Display extent depth: -16.
    Display displayOn: form.
    form instVarNamed: 'depth' put: 16.
    form displayOn: Display.

Cheers,
  - Andreas
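To actually check that a negative depth changed anything, here is a minimal follow-up sketch in the spirit of the snippets above. It simply compares the raw bits of a straight copy and a reversed copy; whether the two arrays differ depends on the negative-depth path working in your VM's BitBlt plugin, so treat it as a probe rather than a guarantee.

    "Sketch: blit the Display into a normal-depth form and into a
     negated-depth form, then compare the raw words."
    | straight reversed |
    straight := Form extent: Display extent depth: 16.
    reversed := Form extent: Display extent depth: -16.
    Display displayOn: straight.
    Display displayOn: reversed.
    Transcript show: (straight bits = reversed bits) printString; cr.

A result of false means the reversed form really did receive swapped pixel/byte ordering in its words.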
On 16-Jun-06, at 3:07 PM, Gerardo Richarte wrote:

> hi!
>
> does anybody have any idea if it's possible to have BitBlt use a
> different endianness (or byte ordering)?
> Right now, to copy the Display to the screen, if it's 32 bits I just
> need to copy the bytes, but for 16 and 8 bits I need to flip the
> words/bytes around.

It's not byte ordering so much as *pixel* ordering that you need to work with. There is also an issue on some platforms of the internal pixel format to worry about; for example, on RISC OS memory is little-endian and pixels are BGR, not RGB. On Windows pixel formats can vary depending on depth and include what Andreas & I concluded has to be called middle-endian in some cases!

On all current platforms' VMs there are functions to handle the pixel transforming and copying transparently, without any user involvement. What are you doing that needs to handle endianness?

> Is there a way to tell Squeak to use a different byte ordering?
> Actually, I'm not even sure I'm doing the byte swapping properly.

There is (or was, at least) a byte-swapping method somewhere in BitBlt or Form, but generally you would negate the pixel depth to (hopefully) trigger the flip code in the BitBltPlugin. I tried using it for RISC OS but since it doesn't handle the pixel format conversion (unsurprisingly) it wasn't terribly helpful for me.

> I saw that there are some references to color depths of -32, -16, -8
> and so on. Any idea what these are for? I tried them, but nothing
> happened (like nothing).

Mostly you would use these for the depth of the Display rather than generic Forms. Depending on platform you might see the available display depths (world menu -> appearance -> set display depth) including some negative numbers.

tim
--
tim Rowledge; [hidden email]; http://www.rowledge.org/tim
Futuristic: It will only run on a next generation supercomputer.
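A minimal sketch of the negative-depth route described above, using only existing DisplayScreen methods (supportsDisplayDepth: and newDepth:); whether the flip code in the plugin then does something useful is platform dependent, as noted:

    "Sketch: switch the Display to the reversed-pixel-order variant of
     its current depth, if the VM claims to support it."
    (Display supportsDisplayDepth: Display depth negated)
        ifTrue: [Display newDepth: Display depth negated]
        ifFalse: [Transcript show: 'reversed pixel order not supported on this platform'; cr].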
On 17.06.2006, at 01:51, tim Rowledge wrote:
> On 16-Jun-06, at 3:07 PM, Gerardo Richarte wrote:
>
>> does anybody have any idea if it's possible to have BitBlt use a
>> different endianness (or byte ordering)?
>> Right now, to copy the Display to the screen, if it's 32 bits I just
>> need to copy the bytes, but for 16 and 8 bits I need to flip the
>> words/bytes around.
>
> What are you doing that needs to handle endianness?

If you need to swizzle color components in a Form, then BitBlt with a colormap is quite helpful. See class ColorMap.

- Bert -
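As an illustration of that suggestion, here is a hedged sketch that swaps the red and blue components of a 32-bit Form in place. It assumes the ColorMap class>>shifts:masks: constructor present in current images; each mask picks out one component and the matching shift moves it to its new position, and BitBlt ORs the shifted pieces together.

    "Sketch: swap R and B in a 32bpp ARGB form via a fixed colormap."
    | form swapRB |
    form := Form extent: 32@32 depth: 32.
    form fillColor: Color red.
    swapRB := ColorMap
        shifts: #(-16 0 16 0)
        masks:  #(16rFF0000 16rFF00 16rFF 16rFF000000).
    (BitBlt toForm: form)
        sourceForm: form;
        combinationRule: Form over;
        sourceOrigin: 0@0;
        destOrigin: 0@0;
        width: form width;
        height: form height;
        colorMap: swapRB;
        copyBits.
    (form colorAt: 1@1) inspect.   "should now be blue if the swap worked"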
tim said:
> On all current platforms' VMs there are functions to handle the pixel
> transforming and copying transparently, without any user involvement.
> What are you doing that needs to handle endianness?

I guess that's the main problem. I should have said it before: the question is related to SqueakNOS, so I guess I'm implementing a new platform.

The native side answers ioScreenDepth() with one of 32, 16 or 8. Then DisplayScreen>>startUp fixes Squeak's Display depth to match it: Display>>setExtent:depth: calls Display>>supportsDisplayDepth:, which will be true only for a single depth (the native depth). So there is one single depth across the system (the native one).

I'm developing on Intel PCs, with what I think is direct video memory access (a frame buffer?). I know all this:

If depth is 32, a simple copy from Squeak's Display to the screen is perfect, same colors as Linux Squeak.

If depth is 16, a simple copy will swap odd and even pixels, and use a "blue palette" for colors.

If depth is 16 and I turn the pixels around during the copy (exchange the 2 lower bytes with the 2 higher bytes), the shape of the screen is fine, but the colors still look weird.

If depth is 16 and I swap the bytes around (change the byte endianness of each 32-bit word), it looks a little bit better, but the colors are still weird.

The colors may be just a problem with the palette, although it could be something else (like bit ordering, which I doubt). Also, similar things happen in 8-bit mode (if straight copied, 4 pixels swapped and wrong colors; if byte swapped, just wrong colors).

I have two different questions. I know how to solve the pixel ordering (swapping); any idea what's going on with the colors? And then: I don't think the best way to solve this is to swap the bytes around when rendering, but rather to tell BitBlt to always use a different representation, so bytes are never swapped (for performance). Do you think this is true? And how do I do it? I tried setting a depth of -16 as the supported depth (changing the display primitives), but nothing is shown on screen if I do it. And if I make -16 supported by the primitives and then do

    Display newDepth: -16; restore.

the screen is never redrawn again...

I uploaded a partial screenshot to http://minnow.cc.gatech.edu/squeak/uploads/1762/SqueakNOS-16bits.png. This is the 16-bit straight-copy version. I also uploaded a zip with 5 images to http://minnow.cc.gatech.edu/squeak/uploads/1762/SqueakNOS-Display.zip... (I didn't know whether to upload it to the swiki or not, but I don't have another place and it's going to help the project, so I guess it's fine, even if it can't be deleted later... actually, this could also be useful for other people.)

Thanks all for the answers.
    gera
On 17-Jun-06, at 8:12 AM, Gerardo Richarte wrote:

> tim said:
>
>> On all current platforms' VMs there are functions to handle the pixel
>> transforming and copying transparently, without any user involvement.
>> What are you doing that needs to handle endianness?
>
> I guess that's the main problem. I should have said it before: the
> question is related to SqueakNOS, so I guess I'm implementing a new
> platform.

That does make a bit of a difference!

> The native side answers ioScreenDepth() with one of 32, 16 or 8. Then
> DisplayScreen>>startUp fixes Squeak's Display depth to match it:
> Display>>setExtent:depth: calls Display>>supportsDisplayDepth:, which
> will be true only for a single depth (the native depth). So there is
> one single depth across the system (the native one).

The supportsDisplayDepth: stuff is a way to let the image know which depths you can support; on a direct frame buffer device I'd suggest declaring that you only support the frame buffer depth. You could of course support several if your device can be tweaked. Take a look at some of the implementations on other platforms to check on it. I think Windows supports a variety of -ve depths as well as the usual 4/8/16/24/32 values. RISC OS only supports +ve values. Does the unix fb stuff not help as well?

> I'm developing on Intel PCs, with what I think is direct video memory
> access (a frame buffer?).
>
> If depth is 32, a simple copy from Squeak's Display to the screen is
> perfect, same colors as Linux Squeak.

So it seems clear that the device's 32bpp mode is known and agrees with the Squeak format.

> If depth is 16, a simple copy will swap odd and even pixels, and use
> a "blue palette" for colors.

You did set the device up to expect 16bpp first, didn't you? :-)

> If depth is 16 and I turn the pixels around during the copy (exchange
> the 2 lower bytes with the 2 higher bytes), the shape of the screen is
> fine, but the colors still look weird.
>
> If depth is 16 and I swap the bytes around (change the byte endianness
> of each 32-bit word), it looks a little bit better, but the colors are
> still weird.

You need to find documentation of the expected pixel formats.

> The colors may be just a problem with the palette, although it could
> be something else (like bit ordering, which I doubt).

If you have a settable palette in the video hardware you should be able to correct the RGB/BGR/BRG type problems with that. Just create a suitable mapping.

> Also, similar things happen in 8-bit mode (if straight copied, 4 pixels
> swapped and wrong colors; if byte swapped, just wrong colors).
>
> I have two different questions. I know how to solve the pixel ordering
> (swapping).

It's not hard for 8/16/32 bpp but gets more interesting for 4/2/1 - take a look at my code in the RISC OS tree for details of how I managed it.

> Any idea what's going on with the colors? And then: I don't think the
> best way to solve this is to swap the bytes around when rendering, but
> rather to tell BitBlt to always use a different representation, so
> bytes are never swapped (for performance). Do you think this is true?
> And how do I do it?

It ought to work just by using a -16 depth (for example), though as I've previously mentioned it never did for RISC OS. As long as you are copying from the Display to the actual display you can correct the format any way that works for you, and you never have to worry about converting back since the Squeak Display is left alone.
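To make the "correct the format during the copy" idea concrete, here is a minimal Smalltalk sketch that converts one 32-bit word of Display bits (two pixels in Squeak's big-endian pixel order, 5:5:5 color layout) into a word laid out for a little-endian 5:6:5 frame buffer. Only the Squeak-side layout (5:5:5, leftmost pixel in the high halfword) is certain here; the 5:6:5 little-endian target is an assumption about the video mode and must be checked against the hardware's mode information.

    "Sketch only: convert one 32-bit word of Squeak 16bpp Display bits
     into a word for a (assumed) little-endian 5:6:5 frame buffer."
    | pixel555To565 convertWord |
    pixel555To565 := [:p | | r g b |
        r := (p bitShift: -10) bitAnd: 16r1F.
        g := (p bitShift: -5) bitAnd: 16r1F.
        b := p bitAnd: 16r1F.
        "widen green to 6 bits by replicating its top bit"
        ((r bitShift: 11)
            bitOr: (((g bitShift: 1) bitOr: (g bitShift: -4)) bitShift: 5))
            bitOr: b].
    convertWord := [:w | | left right |
        left := (w bitShift: -16) bitAnd: 16rFFFF.   "first (leftmost) pixel in Squeak's word"
        right := w bitAnd: 16rFFFF.                  "second pixel"
        "on a little-endian frame buffer the first pixel belongs in the low halfword"
        ((pixel555To565 value: right) bitShift: 16)
            bitOr: (pixel555To565 value: left)].
    convertWord value: (Display bits at: 1).

The halfword swap handles the odd/even pixel swapping reported above; a missing 5:5:5 to 5:6:5 step is exactly the kind of thing that produces "weird colors", assuming the mode really is 5:6:5.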
If you want the Display bitmap to be the actual memory rendered by the real display hardware then things can get more complicated, and you need to look at the previously mentioned palette manipulation, or better yet more clever tricks that can be done with the video card. Or you look into the SurfacePlugin to use an external memory chunk as the Display bitmap...

... or you go completely insane and rewrite the BitBlt to work little-endian, and then rewrite a lot of image code to understand endianness and pixel format properly. I actually did a little-endian BitBlt back in 1.1x days but it would need a lot of updating to be of any use now. I think that with all the other bits of code that implicitly rely upon pixel format it would be horribly impractical to try that route, but you may have different opinions and tolerances for labour than I do.

Hmph, I've rambled a bit but it *is* early morning. Basically, you really need to find out the pixel format your video hardware wants, or better yet find a way to *set* the video hardware to accept Squeak's format. With such info we can be a lot more helpful.

tim
--
tim Rowledge; [hidden email]; http://www.rowledge.org/tim
ASCII to ASCII, DOS to DOS.
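One more hedged sketch, on the palette point raised above: for the 8-bit case Squeak draws against a fixed 256-entry palette, so a settable DAC can simply be programmed to match it. The snippet below only builds the table in the image; how the triples get written to the hardware is platform specific and left out, and a classic VGA DAC wants 6-bit components (scale by 63 instead of 255).

    "Sketch: derive 256 RGB byte triples from Squeak's fixed 8-bit palette.
     Check Color class>>colorFromPixelValue:depth: for the exact
     pixel-value-to-index convention before loading them into the DAC."
    | palette |
    palette := Color indexedColors collect: [:c |
        Array
            with: (c red * 255) rounded
            with: (c green * 255) rounded
            with: (c blue * 255) rounded].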