Scroll events really should be mouse events (*). This hasn't been too
much of a problem in Morphic where we have a focus-follows-mouse policy. But in Tweak, for example, the keyboard scroll events go to the focused pane even when I point the mouse somewhere else.

AFAIK no host platform originally delivers scroll events as keyboard events, so it appears as the Right Thing to do, rather than hacking around it on the image side. However, I'm not sure about the best way to actually hand this to the image ... Any opinions?

As a further data point, there are mice having both vertical and horizontal scroll capabilities, something that users may want to put to use. Not sure if we should support the additional buttons, too, but why not? My mouse has 8 of them in total, all of which do something reasonable - yes I'm on a Mac ;-)

[OT] Rumor has it that Apple is going to introduce a new mouse soon, which may have two buttons or an iPod-like wheel or both ...

- Bert -

(*) Yes, I'm guilty for coming up with the arrow-key hack in the first place, but it was TSTTCPW
On Thursday 17 March 2005 9:02 am, Bert Freudenberg wrote:
> Scroll events really should be mouse events (*). This hasn't been too
> much of a problem in Morphic where we have a focus-follows-mouse
> policy. But in Tweak, for example, the keyboard scroll events go to the
> focused pane even when I point the mouse somewhere else.
>
> AFAIK no host platform originally delivers scroll events as keyboard
> events, so it appears as the Right Thing to do, rather than hacking
> around it on the image side. However, I'm not sure about the best way
> to actually hand this to the image ... Any opinions?

Why not just present it as a button press?

--
Ned Konz
http://bike-nomad.com/squeak/
In reply to this post by Bert Freudenberg-3
Hmm... I really don't mind representing scroll events differently
(having added the Ctrl-Cursor hack way back) ... but how do other platforms represent scroll wheel events? On Windows, there is an explicit message WM_SCROLLWHEEL (or so - don't quite remember the name) which is independent of both keyboard and mouse events. I'd be happy to pass this up individually, but what about the other platforms?

Cheers,
  - Andreas

Bert Freudenberg wrote:
> Scroll events really should be mouse events (*). This hasn't been too
> much of a problem in Morphic where we have a focus-follows-mouse policy.
> But in Tweak, for example, the keyboard scroll events go to the focused
> pane even when I point the mouse somewhere else.
>
> AFAIK no host platform originally delivers scroll events as keyboard
> events, so it appears as the Right Thing to do, rather than hacking
> around it on the image side. However, I'm not sure about the best way to
> actually hand this to the image ... Any opinions?
>
> As a further data point, there are mice having both vertical and
> horizontal scroll capabilities, something that users may want to put to
> use. Not sure if we should support the additional buttons, too, but why
> not? My mouse has 8 of them in total, all of which do something
> reasonable - yes I'm on a Mac ;-)
>
> [OT] Rumor has it that Apple is going to introduce a new mouse soon,
> which may have two buttons or an iPod-like wheel or both ...
>
> - Bert -
>
> (*) Yes, I'm guilty for coming up with the arrow-key hack in the first
> place, but it was TSTTCPW
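Looking it up: the Windows message is WM_MOUSEWHEEL. A rough sketch of the receiving side, assuming a plain window procedure - the recordWheelEvent() hook is made up and only stands for whatever the VM would do to queue the event for the image:

#include <windows.h>
#include <windowsx.h>   /* for GET_X_LPARAM / GET_Y_LPARAM */

/* Sketch only: how a window procedure sees a wheel event.  WM_MOUSEWHEEL,
   WHEEL_DELTA and GET_WHEEL_DELTA_WPARAM are real Win32 names; the
   recordWheelEvent() call is hypothetical. */
LRESULT CALLBACK wndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_MOUSEWHEEL) {
        int delta = GET_WHEEL_DELTA_WPARAM(wParam); /* multiples of WHEEL_DELTA (120), negative = toward the user */
        int x = GET_X_LPARAM(lParam);               /* note: screen, not client, coordinates */
        int y = GET_Y_LPARAM(lParam);
        /* recordWheelEvent(delta, x, y); -- whatever the VM does with it */
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}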
Could we change the cursor key events to scrollwheel events in the
image? Although os-x has scroll wheel events, and if I recall I was passing that data up in a mouse event, which we later decided not to do because we needed the space for the windowIndex.

Also, I'm not sure I got a reply to my note on the Tweak list about passing the Mac virtual key code up along with the unicode on keyup/down, versus VK_MUMBLE.

Also didn't get an answer to:
"So how do we know that the unicode value 0xB4 coming up from the VM on a keydown is a character or a VK_LAUNCH_MAIL key request?"

On Mar 17, 2005, at 10:47 PM, Andreas Raab wrote:

> Hmm... I really don't mind representing scroll events differently
> (having added the Ctrl-Cursor hack way back) ... but how do other
> platforms represent scroll wheel events? On Windows, there is an
> explicit message WM_SCROLLWHEEL (or so - don't quite remember the
> name) which is independent of both keyboard and mouse events. I'd be
> happy to pass this up individually, but what about the other
> platforms?
>
> Cheers,
>   - Andreas
>
> Bert Freudenberg wrote:
>> Scroll events really should be mouse events (*). This hasn't been too
>> much of a problem in Morphic where we have a focus-follows-mouse
>> policy. But in Tweak, for example, the keyboard scroll events go to
>> the focused pane even when I point the mouse somewhere else.
>> AFAIK no host platform originally delivers scroll events as keyboard
>> events, so it appears as the Right Thing to do, rather than hacking
>> around it on the image side. However, I'm not sure about the best way
>> to actually hand this to the image ... Any opinions?
>> As a further data point, there are mice having both vertical and
>> horizontal scroll capabilities, something that users may want to put
>> to use. Not sure if we should support the additional buttons, too,
>> but why not? My mouse has 8 of them in total, all of which do
>> something reasonable - yes I'm on a Mac ;-)
>> [OT] Rumor has it that Apple is going to introduce a new mouse soon,
>> which may have two buttons or an iPod-like wheel or both ...
>> - Bert -
>> (*) Yes, I'm guilty for coming up with the arrow-key hack in the
>> first place, but it was TSTTCPW
>>

===========================================================================
John M. McIntosh <[hidden email]> 1-800-477-2659
Corporate Smalltalk Consulting Ltd.  http://www.smalltalkconsulting.com
===========================================================================
> Also didn't get an answer to:
> "So how do we know that the unicode value 0xB4 coming up from the VM on > a keydown is a character or a VK_LAUNCH_MAIL key request?" I didn't understand the question. First off, who is "we" (e.g., how do "we" know) in the above? If you mean the user, then the answer is that either we agree on the codes generated by the VM (such as the X11 key sym defs) or that we will translate to a common code in the image (we could translate into symbols, too, such as #LaunchMailKey). If you mean the image, then it will most likely (assuming no common keys) by a table stored with (C)Platform default translating the thing appropriately. "how do we know that a value on a keyDown is a character or a key" - is this correct english? (or Canadian ;-) I'm not sure I understand that. It is by *definition* that KeyDown events describe *keys* and not characters. Therefore there can be no question about this: keyDown events must *always* be interpreted as keys and *never* as characters. Cheers, - Andreas |
Oops resent this to the list, yes reply ALL is not my default email
preference for good reasons.

Key down events I thought we agreed upon as being Unicode values as the result of pressing a key or combination of keys to generate the desired unicode value. keychar was the historical MacRoman ascii value; mind you, on the Mac it might become a unicode value if the mapping from unicode ...

So on my mac entering 0xB4 gives me ¥ (16rB4 asCharacter). I really want that yen symbol in my text and not have Tweak launch a smalltalk email client, so how do we know the meaning if the only thing coming up is 0xB4?

On Mar 18, 2005, at 10:17 AM, Andreas Raab wrote:

>> Also didn't get an answer to:
>> "So how do we know that the unicode value 0xB4 coming up from the VM
>> on a keydown is a character or a VK_LAUNCH_MAIL key request?"
>
> I didn't understand the question. First off, who is "we" (e.g., how do
> "we" know) in the above? If you mean the user, then the answer is that
> either we agree on the codes generated by the VM (such as the X11 key
> sym defs) or that we will translate to a common code in the image (we
> could translate into symbols, too, such as #LaunchMailKey). If you
> mean the image, then it will most likely (assuming no common keys) be
> a table stored with (C)Platform default translating the thing
> appropriately.
>
> "how do we know that a value on a keyDown is a character or a key" -
> is this correct english? (or Canadian ;-) I'm not sure I understand
> that. It is by *definition* that keyDown events describe *keys* and
> not characters. Therefore there can be no question about this: keyDown
> events must *always* be interpreted as keys and *never* as characters.
>
> Cheers,
>   - Andreas
>

===========================================================================
John M. McIntosh <[hidden email]> 1-800-477-2659
Corporate Smalltalk Consulting Ltd.  http://www.smalltalkconsulting.com
===========================================================================
In reply to this post by Andreas.Raab
In message <[hidden email]>
Andreas Raab <[hidden email]> wrote:

> Hmm... I really don't mind representing scroll events differently
> (having added the Ctrl-Cursor hack way back) ... but how do other
> platforms represent scroll wheel events? On Windows, there is an
> explicit message WM_SCROLLWHEEL (or so - don't quite remember the name)
> which is independent of both keyboard and mouse events. I'd be happy to
> pass this up individually, but what about the other platforms?

To the best of my knowledge (not very strong since I don't use scrollwheels) it comes in as plain scroll events on RISC OS. So nothing special in that respect, but effectively irrelevant since such events only come from host scrollbars and we don't use those.

tim
--
Tim Rowledge, [hidden email], http://sumeru.stanford.edu/tim
ASCII to ASCII, DOS to DOS.
In reply to this post by johnmci
> Key down events I thought we agreed upon as being Unicode values as the
> result of pressing a key or combination of keys to generate the
> desired unicode value.

Nope. If you will re-read my post then I thought I made it clear that we have a distinction between keys and characters. Once more from the beginning:

When you press a *key* the key may or may not generate a *character*. If a character is indeed being generated then the character is passed to the image as EventKeyChar. For example, here is a sequence that will generate the "á" character on my keyboard:

  keyDown: ´ (accent key)
  keyUp: ´ (accent key)
  keyDown: A
  keyUp: A
  keyStroke: á (accented-a)

Note that the accent *key* did NOT generate an accent *character* - it was subsumed by the subsequent combination with the A key to generate the accented á character.

Therefore, the interpretation of the values by Squeak should always be: treat all keyDown events as *keys* and NOT as characters, and treat all keyChar/keyStroke events as *characters* and NOT as keys. Whether the *keys* need translation too depends on your platform - IIRC, on Unix platforms there is an explicit call to translate the "raw" keyboard code into something a little more abstract. However, you should *never* report a *character* in a keyDown event - this is really, really broken.

> So on my mac entering 0xB4 gives me ¥ (16rB4 asCharacter). I really
> want that yen symbol in my text and not have Tweak launch a smalltalk
> email client, so how do we know the meaning if the only thing coming
> up is 0xB4?

We know the difference easily by observing that the Yen *key* will have a different code from the launch-mail *key* and that the launch-mail *key* will in all likelihood not generate a *character*. Therefore, if we have a translation table in the image that says "0xB4 is the launch-mail *key* on the Mac" and, for example, "0xB7 is the Yen *key* on the Mac", then even though the VM will translate the 0xB7 *key* into the 0xB4 *character* there is no confusion whatsoever. We can simply observe keyDown events to watch for the 0xB4 *key* to launch the email client, and we can observe keyStroke events to watch for the 0xB4 *character* and insert the Yen sign.

Cheers,
  - Andreas
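To make the split concrete, here is a minimal VM-side sketch in C. The emitKeyboardEvent() helper and the constant names are made up - they only stand in for whatever plumbing a given VM uses to queue events for the image:

/* Sketch only: the platform code delivers raw *keys* on keyDown/keyUp
   and composed *characters* on keyChar.  The constants mirror the
   EventKeyDown/EventKeyUp/EventKeyChar distinction but are not claimed
   to be the exact values from sq.h. */
enum { KeyChar = 0, KeyDown = 1, KeyUp = 2 };

static void emitKeyboardEvent(int pressCode, int code, int modifiers)
{
    /* hypothetical: push the event onto the VM's queue for the image */
    (void)pressCode; (void)code; (void)modifiers;
}

/* called for every physical key transition, before any translation */
static void rawKeyTransition(int rawKeyCode, int isDown, int modifiers)
{
    emitKeyboardEvent(isDown ? KeyDown : KeyUp, rawKeyCode, modifiers);
}

/* called only once dead-key / IME composition has produced a character */
static void composedCharacter(int character, int modifiers)
{
    emitKeyboardEvent(KeyChar, character, modifiers);
}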
I'm sure this is a change from our original thoughts to generate
unicode values for the keydown/keyup. I'll note you could be using an input palette and not a keyboard device so you won't get any key up/down events, and you imply that you are supplying unicode values for the keystroke, versus the historical MacRoman? So how does a unicode character flow up in Windows?

For the most part I am using the unix keyboard logic for os-x to track the key up/down strokes, so given your accented character:

a) we remember the keyDown,
b) translation input services gives us the unicode for á (accented-a), which I generate as a keyDown unicode. This is after translation services has digested all the keystrokes and made a decision on what it all means.
c) I translate that into the MacRoman character for á (accented-a), which could, depending on your language choice, be different than the unicode, and which I generate as a keyChar.
d) when the key goes up I have remembered the unicode for á (accented-a), which I generate as a keyUp.

*If input services gives us a unicode, or input palette services give us a unicode for which there is no MacRoman translation, then the keyChar value is unicode. I'll note that input palette services can give us a string of unicode which is translated into synthetic keydown/keychar/keyup values.

It could be possible under os-x to generate the keydown/keyup and keychar as suggested, but I'm not sure what that would break. The only information I have is the keycodes, which are values mapped to keys and depend on the keyboard used, and would require a translation table in smalltalk.

On Mar 18, 2005, at 10:59 AM, Andreas Raab wrote:

> the image as EventKeyChar. For example, here is a sequence that will
> generate the "á" character on my keyboard:
>
>   keyDown: ´ (accent key)
>   keyUp: ´ (accent key)
>   keyDown: A
>   keyUp: A
>   keyStroke: á (accented-a)
>
> Note that the accent *key* did NOT generate an accent *character* - it
> was subsumed by the subsequent combination with the A key to generate
> the accented á character.

===========================================================================
John M. McIntosh <[hidden email]> 1-800-477-2659
Corporate Smalltalk Consulting Ltd.  http://www.smalltalkconsulting.com
===========================================================================
On Friday 18 March 2005 12:31 pm, John M McIntosh wrote:
> I'm sure this is a change from our original thoughts to generate
> unicode values for the keydown/keyup. I'll note you could be using an
> input palette and not a keyboard device so you won't get any key
> up/down events, and you imply that you are supplying unicode values for
> the keystroke, versus the historical MacRoman? So how does a unicode
> character flow up in Windows?

Why not just pass unicode characters up all the time?

There's no particular need for MacRoman keystroke events in the image (or doesn't need to be).

And I don't think that key down/up events map at all well to Unicode, which deals with characters. After all, a given character could take many key down/up events to produce, and a keydown/up may not even generate any keystroke events (like, for instance, the shift key).

The mapping between hardware events (key codes) and keyboard keys is available, at least in X (you can ask it for the name of the key). XLookupString returns what for us would be equivalent to the "keystroke".

But note that there is also the possibility to have both:

keyDown/keyUp -- physical keys, maybe with a name
keystrokeDown/keystrokeUp -- logical (translated) keys

Maybe we should distinguish these. Do other platforms allow this level of detail?

Here's some output from xev. Look at the keycode/keysym/XLookupString; you'll see that XLookupString returns the logical keystroke for both KeyPress and KeyRelease events.

<press and release the 'a' key>

KeyPress event, serial 28, synthetic NO, window 0x3800001,
    root 0xd4, subw 0x0, time 187062733, (-933,57), root:(1445,895),
    state 0x0, keycode 38 (keysym 0x61, a), same_screen YES,
    XLookupString gives 1 bytes: "a"

KeyRelease event, serial 28, synthetic NO, window 0x3800001,
    root 0xd4, subw 0x0, time 187062861, (-933,57), root:(1445,895),
    state 0x0, keycode 38 (keysym 0x61, a), same_screen YES,
    XLookupString gives 1 bytes: "a"

<do a ctrl-c>

KeyPress event, serial 28, synthetic NO, window 0x3800001,
    root 0xd4, subw 0x0, time 187073836, (-1847,-333), root:(531,505),
    state 0x0, keycode 37 (keysym 0xffe3, Control_L), same_screen YES,
    XLookupString gives 0 bytes: ""

KeyPress event, serial 28, synthetic NO, window 0x3800001,
    root 0xd4, subw 0x0, time 187073964, (-1847,-333), root:(531,505),
    state 0x4, keycode 54 (keysym 0x63, c), same_screen YES,
    XLookupString gives 1 bytes: ""

KeyRelease event, serial 28, synthetic NO, window 0x3800001,
    root 0xd4, subw 0x0, time 187074412, (-1847,-332), root:(531,506),
    state 0x4, keycode 54 (keysym 0x63, c), same_screen YES,
    XLookupString gives 1 bytes: ""

KeyRelease event, serial 28, synthetic NO, window 0x3800001,
    root 0xd4, subw 0x0, time 187074460, (-1847,-332), root:(531,506),
    state 0x4, keycode 37 (keysym 0xffe3, Control_L), same_screen YES,
    XLookupString gives 0 bytes: ""

--
Ned Konz
http://bike-nomad.com
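And here is roughly what reading both pieces of information out of one X11 key event looks like in C. XLookupString and the XKeyEvent fields are the real Xlib API; the surrounding handler is just a sketch:

#include <stdio.h>
#include <X11/Xlib.h>

/* Sketch: one KeyPress event yields both the physical key (keycode,
   keysym) and the translated "keystroke" text, as in the xev output. */
static void handleKeyPress(XKeyEvent *ev)
{
    char   buf[32];
    KeySym keysym = 0;
    int    n = XLookupString(ev, buf, sizeof(buf) - 1, &keysym, NULL);

    buf[n] = '\0';
    printf("keycode %u, keysym 0x%lx, text \"%s\"\n",
           ev->keycode,            /* hardware-dependent key number, e.g. 38   */
           (unsigned long)keysym,  /* layout-level key name, e.g. 0x61 for 'a' */
           buf);                   /* translated characters (Latin-1); empty
                                      for e.g. Shift - XmbLookupString handles
                                      other locales */
}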
On Mar 18, 2005, at 12:55 PM, Ned Konz wrote:

> On Friday 18 March 2005 12:31 pm, John M McIntosh wrote:
>> I'm sure this is a change from our original thoughts to generate
>> unicode values for the keydown/keyup. I'll note you could be using an
>> input palette and not a keyboard device so you won't get any key
>> up/down events, and you imply that you are supplying unicode values
>> for the keystroke, versus the historical MacRoman? So how does a
>> unicode character flow up in Windows?
>
> Why not just pass unicode characters up all the time?
>
> There's no particular need for MacRoman keystroke events in the image
> (or doesn't need to be).

It's called backwards compatibility. If you use the CSA french keyboard layout, *some* of the accented characters have different unicode values than MacRoman. Not passing up the MacRoman breaks 2.8 images...

--
===========================================================================
John M. McIntosh <[hidden email]> 1-800-477-2659
Corporate Smalltalk Consulting Ltd.  http://www.smalltalkconsulting.com
===========================================================================
On Friday 18 March 2005 12:58 pm, John M McIntosh wrote:
> It's called backwards compatibility. If you use CSA french keyboard
> layout *some* of the accented characters have different unicode values
> than macroman.
> Not passing up the MacRoman breaks 2.8 images...

In the Unix VM you can set an environment variable or command line argument to change the encoding of input events. This changes what gets sent to the image.

Perhaps either providing a way to pass down the encoding requirements to the VM or giving an alternate way to launch the VM for older images might be OK.

After all, I suspect that people who are trying to use a new VM with a 2.8 image would probably be somewhat tech savvy.

--
Ned Konz
http://bike-nomad.com
In reply to this post by johnmci
>> There's no particular need for MacRoman keystroke events in the image
>> (or doesn't need to be).
>
> It's called backwards compatibility. If you use CSA french keyboard
> layout *some* of the accented characters have different unicode values
> than macroman.
> Not passing up the MacRoman breaks 2.8 images...

I agree. I'd vote for EventKeyUTFChar instead. The keyValue would be UTF32. At some (unknown) point in the future we drop (MacRoman) EventKeyChar support for good.

This would also simplify the transition to m17n-aware VMs. Right now, all VMs would have to generate both events, for the image to pick.

Cheers,
  - Andreas
In reply to this post by Ned Konz
Twitch, most squeakland users are not wanting to type obscure things on
a command line, in fact they don't know a terminal exists. They also don't care about image or vm versions, they just double-click on an image or vm and expect it to work. If we are proposing to change the keyboard event model then we need a more transparent solution.

I'll suggest passing the keyboard code as another parm on the keyup/down. That I have. Tweak would have to look at both the keycode and the unicode on the mac to make a decision.

On Mar 18, 2005, at 1:06 PM, Ned Konz wrote:

> On Friday 18 March 2005 12:58 pm, John M McIntosh wrote:
>> It's called backwards compatibility. If you use CSA french keyboard
>> layout *some* of the accented characters have different unicode
>> values than macroman.
>> Not passing up the MacRoman breaks 2.8 images...
>
> In the Unix VM you can set an environment variable or command line
> argument to change the encoding of input events. This changes what
> gets sent to the image.
>
> Perhaps either providing a way to pass down the encoding requirements
> to the VM or giving an alternate way to launch the VM for older images
> might be OK.
>
> After all, I suspect that people who are trying to use a new VM with a
> 2.8 image would probably be somewhat tech savvy.
>
> --
> Ned Konz
> http://bike-nomad.com

===========================================================================
John M. McIntosh <[hidden email]> 1-800-477-2659
Corporate Smalltalk Consulting Ltd.  http://www.smalltalkconsulting.com
===========================================================================
In reply to this post by johnmci
Hi John,
John M McIntosh wrote:
> I'm sure this is a change from our original thoughts to generate
> unicode values for the keydown/keyup.

It's certainly not a change from my original thoughts (though I might have expressed them inadequately ;-) It has always been my intention to make keyDown/keyUp events so that (in theory) the image could construct the corresponding keyChar events itself. The only reason we wouldn't do this is pure laziness (and the need to know a few things from the OS in order to make the integration smooth).

> I'll note you could be using an input palette and not a keyboard
> device so you won't get any key up/down events, and you imply that you
> are supplying unicode values for the keystroke, versus the historical
> MacRoman? So how does a unicode character flow up in Windows?

Right now, no Unicode gets passed up on Windows. It really depends on what we decide to do with it (see my other post - I think having a new event type is the easiest and cleanest way to deal with the issue).

> For the most part I am using the unix keyboard logic for os-x to track
> the key up/down strokes, so given your accented character:
>
> a) we remember the keyDown,
> b) translation input services gives us the unicode for á (accented-a),
> which I generate as a keyDown unicode. This is after translation
> services has digested all the keystrokes and made a decision on what
> it all means.
> c) I translate that into the MacRoman character for á (accented-a),
> which could, depending on your language choice, be different than the
> unicode, and which I generate as a keyChar.
> d) when the key goes up I have remembered the unicode for á
> (accented-a), which I generate as a keyUp.
>
> *If input services gives us a unicode, or input palette services give
> us a unicode for which there is no MacRoman translation, then the
> keyChar value is unicode. I'll note that input palette services can
> give us a string of unicode which is translated into synthetic
> keydown/keychar/keyup values.
>
> It could be possible under os-x to generate the keydown/keyup and
> keychar as suggested, but I'm not sure what that would break. The only
> information I have is the keycodes, which are values mapped to keys
> and depend on the keyboard used, and would require a translation table
> in smalltalk.

I'd say let's just list that unicode in a new event and everything should be pretty much fine.

Cheers,
  - Andreas
In reply to this post by Ned Konz
> But note that there is also the possibility to have both:
>
> keyDown/keyUp -- physical keys, maybe with a name
>
> keystrokeDown/keystrokeUp -- logical (translated) keys
>
> Maybe we should distinguish these.
>
> Do other platforms allow this level of detail?

No. Windows only has the "translated keys", e.g., you will always see VK_A regardless of the code that the keyboard might generate.

Cheers,
  - Andreas
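Concretely, the key messages on Windows already arrive translated. A rough sketch - the handler name is made up; WM_KEYDOWN/WM_KEYUP and the lParam layout are the real Win32 bits:

#include <windows.h>

/* Sketch only: Windows delivers the *translated* virtual-key code in
   wParam; the hardware scan code is still tucked into bits 16-23 of
   lParam, but normally nobody looks at it. */
static void handleKeyMessage(UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_KEYDOWN || msg == WM_KEYUP) {
        int virtualKey = (int)wParam;             /* e.g. 'A' (the "VK_A" above), VK_SHIFT, ... */
        int scanCode   = (lParam >> 16) & 0xFF;   /* keyboard-dependent hardware code */
        (void)virtualKey;
        (void)scanCode;
    }
}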
In reply to this post by Andreas.Raab
Mmm, if you pass up the unicode in the keychar event as another field
then you'll reduce the amount of extra smalltalk code we'll need to run. Your proposal means processing 4 events on each keystroke, and perhaps remembering state for the keychar versus EventKeyUTFChar. If I have both in the same record it could simplify things; perhaps a flag to indicate the VM is passing up unicode is needed too.

I still think on the keydown/up I will need to pass up the unicode and the raw keyboard code, so that you can figure out that unicode = function key and rawkey = 90 implies F1 on the mac.

On Mar 18, 2005, at 1:36 PM, Andreas Raab wrote:

> I'd say let's just list that unicode in a new event and everything
> should be pretty much fine.
>
> Cheers,
>   - Andreas

--
===========================================================================
John M. McIntosh <[hidden email]> 1-800-477-2659
Corporate Smalltalk Consulting Ltd.  http://www.smalltalkconsulting.com
===========================================================================
Hi John,
> Mmm, if you pass up the unicode in the keychar event as another field
> then you'll reduce the amount of extra smalltalk code we'll need to run.

Yes, we can do this but if we do then we cannot easily drop the MacRoman support. But I'm fine with this solution if you prefer it.

Proposal: We add a "utf32Code" field to the sqKeyboardEvent that VMs are supposed to fill in if they support extended input services, zero if not. Everyone agree on this?

> I still think on the keydown/up I will need to pass up the unicode and
> the raw keyboard code, so that you can figure out that unicode=function
> key and rawkey = 90 implies F1 on the mac.

We really should be able to make that distinction by looking at the keyDown exclusively. What's so hard about it? It seems that both Win & *nix have trivial ways of giving us the information, so it seems to me that the Mac ought to be able to give us that information too, shouldn't it?

Cheers,
  - Andreas
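To make the proposal concrete, the keyboard event record would end up looking roughly like this. The field order and the other field names are from memory and only illustrative; utf32Code is the part actually being proposed:

/* Sketch of the proposed sqKeyboardEvent - one word of the fixed-size
   event record carries the UTF-32 value.  Treat the layout as
   illustrative; only the utf32Code field is the proposal itself. */
typedef struct sqKeyboardEvent {
    int type;              /* EventTypeKeyboard */
    unsigned int timeStamp;
    int charCode;          /* legacy (MacRoman) character, or raw key on keyDown/keyUp */
    int pressCode;         /* EventKeyChar, EventKeyDown, or EventKeyUp */
    int modifiers;
    int utf32Code;         /* proposed: precomposed UTF-32 character, zero if the VM
                              does not support extended input services */
    int reserved1;
    int windowIndex;
} sqKeyboardEvent;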
Hello,
> Proposal: We add a "utf32Code" field to the sqKeyboardEvent that VMs are > supposed to fill in if they support extended input services, zero if > not. Everyone agree on this? That would be ok. ("Probably a comment should say something like "precomposed Unicode characters in UTF-32".) And defining the byte order would be necessary. And a tangent topic... If somebody wants to write an action game, she might want to design an interface that uses the shift-key to, say, charge the energy while the key is down, and shoot the cannon ball when the key is released. Probably two-player game in which each player uses the different shift key. I wonder if a "terminal mode" or something in which all the raw keycode (keyboard keycode) are reported to the image without cooking. That would allow us to experiment other type of interaction such as chord key input. -- Yoshiki |
Hi Yoshiki,
> That would be ok. (Probably a comment should say something like
> "precomposed Unicode characters in UTF-32".) And defining the byte
> order would be necessary.

Huh? You see me confused. If UTF32 is a 32bit wide quantity then why would we have to worry about it? We use 32bit "words" all the time without worrying about byte ordering. What am I missing?

> And a tangent topic... If somebody wants to write an action game, she
> might want to design an interface that uses the shift key to, say,
> charge the energy while the key is down, and shoot the cannon ball
> when the key is released. Probably a two-player game in which each
> player uses a different shift key. I wonder if we could have a
> "terminal mode" or something in which all the raw keycodes (keyboard
> keycodes) are reported to the image without cooking. That would allow
> us to experiment with other types of interaction such as chord key
> input.

Err, Yoshiki, *please* tell me you have not been reading the two threads before and in particular the three posts where I explained what I want to be happening for keyDown vs. keyChar events. *Please*. Otherwise I must assume that I am so terribly bad at explaining what I mean that even three messages, all from different points of view, are not good enough to explain what I mean. Because... what you say in the above is what I have been advocating all along.

For a fourth time from the beginning: "keyDown" and "keyUp" events report raw (untranslated, uncooked, whatever you name it) *keys*, that is individual keys on the keyboard, such as, for example, the left-shift, or the right-shift, the F1, the IME-Mode, or whatever else key. "keyChar" (also known as: keyStroke) events report translated, cooked, whatever you name it *characters*, that is (typically) human-readable entities which are created by some means of combining the aforementioned *key* events.

Therefore, if you would like to, you could compose and interpret the keyDown events in any way you choose. So, indeed, it would be utterly trivial to deal with other forms of interactions.

Sigh. I'm exhausted.

Cheers,
  - Andreas