Next try. I may be doing something stupid, but it appears that all morphs draw on the same coordinate system. Meaning that every Morph that draws something at 0@0 will put it in the top left corner of the displayed image. I would expect that a morph that is positioned somewhere provides a canvas where 0@0 is relative to the position of the morph. If coordinates are absolute, how can I delegate work to sub components and have them calculate the offsets properly?
Norbert
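(For readers following along: a minimal sketch of the situation being described. OffsetDemoMorph is a made-up name for an otherwise empty Morph subclass; the canvas passed to #drawOn: works in World coordinates, so everything has to be offset by the receiver's own bounds origin by hand.)

    OffsetDemoMorph >> drawOn: aCanvas
        "0@0 would land in the World's top left corner no matter where this
         morph sits; adding self bounds origin keeps the rectangle inside it."
        aCanvas
            fillRectangle: (self bounds origin + (2@2) extent: 10@10)
            color: Color red

    OffsetDemoMorph new extent: 50@50; position: 100@100; openInWorld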
On 20 September 2013 12:15, Norbert Hartl <[hidden email]> wrote:
> Next try. I may be doing something stupid, but it appears that all morphs draw on the same coordinate system. Meaning that every Morph that draws something at 0@0 will put it in the top left corner of the displayed image. I would expect that a morph that is positioned somewhere provides a canvas where 0@0 is relative to the position of the morph. If coordinates are absolute, how can I delegate work to sub components and have them calculate the offsets properly?
> Norbert

Welcome to the club. To ease the pain, take a look at TransformMorph.

-- Best regards, Igor Stasenko.
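(A rough sketch of the wiring this suggests: wrap the submorph in a TransformMorph and give it a MorphicTransform, so the submorph's coordinates are mapped when it is drawn. It assumes TransformMorph>>#transform: as the setter and makes no claim about the exact offset/scale semantics; the numbers are arbitrary.)

    | wrapper inner |
    inner := Morph new color: Color red; extent: 10@10; yourself.
    wrapper := TransformMorph new.
    wrapper extent: 200@200.
    "Assumed setter; the transform maps between the wrapper's and the
     submorph's coordinate systems."
    wrapper transform: (MorphicTransform offset: 0@0 angle: 0 scale: 0.5).
    wrapper addMorph: inner.
    wrapper position: 100@100.
    wrapper openInWorld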
On 20.09.2013 at 12:24, Igor Stasenko <[hidden email]> wrote:

Thanks. I'm looking at it, but it is still hard to achieve something simple.

Norbert
On 20.09.2013 at 12:24, Igor Stasenko <[hidden email]> wrote:
I figured out how to do it. Basically it is exactly what I need: transforming geo coordinates to screen coordinates. But then LineMorph does not seem to like a borderWidth below 1. It is a drag. I think I could burn several days on this crap. Better to start doing something different.

Norbert
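(A sketch of the kind of workaround hinted at here, with made-up numbers: do the geo-to-screen scaling on the model side, so LineMorph only ever sees whole pixels and a line width of at least 1. The LineMorph class-side constructor used below is assumed from the standard Morphic image.)

    | pixelsPerUnit toScreen line |
    pixelsPerUnit := 100.0.   "assumed scale: screen pixels per geo unit"
    toScreen := [:geoPoint | (geoPoint * pixelsPerUnit) rounded].
    line := LineMorph
        from: (toScreen value: 0.12@0.30)
        to: (toScreen value: 0.85@0.64)
        color: Color black
        width: 2.
    line openInWorld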
On Fri, 20 Sep 2013, Norbert Hartl wrote:
> Next try. I may be doing something stupid, but it appears that all morphs draw on the same coordinate system. Meaning that every Morph that draws something at 0@0 will put it in the top left corner of the displayed image. I would expect that a morph that is positioned somewhere provides a canvas where 0@0 is relative to the position of the morph. If coordinates are absolute, how can I delegate work to sub components and have them calculate the offsets properly?

I had this problem too while teaching pupils (grade 10) Smalltalk as their first 'real' programming language. We were embedding EllipseMorphs into a PasteUpMorph. Embedded morphs know their owner, hence self owner position might be the desired offset.

Markus
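(A small sketch of the classroom setting Markus describes: an EllipseMorph embedded in a PasteUpMorph, placed at what is conceptually 10@10 inside its owner by adding the owner's position by hand. The variable names are made up.)

    | pasteUp dot |
    pasteUp := PasteUpMorph new.
    pasteUp extent: 300@200; position: 50@50.
    dot := EllipseMorph new extent: 20@20; yourself.
    pasteUp addMorph: dot.
    "Coordinates are global, so the desired local 10@10 needs the owner's
     position added; from inside the submorph this would read
     self owner position + (10@10)."
    dot position: pasteUp position + (10@10).
    pasteUp openInWorld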
I'd not expect, though, that it should be necessary to add offsets by hand. What is the advantage of having embedded objects use global coordinates?

Markus
On 20 September 2013 17:37, Markus Schlager <[hidden email]> wrote:
> I'd not expect, though, that it should be necessary to add offsets by hand. What is the advantage of having embedded objects use global coordinates?

I don't know.. I think the people who could answer this question are not on this list. I think it is, like many other aspects of our heritage, a result of many years of building on top of once simple and robust things, and that build-on-top-without-rethinking process led us to what we have now: a once simple and robust solution, now buried under tons of layers, some of them trying to extend the design and some trying to fix its original issues.. and all of that instead of changing the design to something more adequate and simpler.

-- Best regards, Igor Stasenko.
On 20 September 2013 19:22, Igor Stasenko <[hidden email]> wrote:
... and some of them are there to duplicate already existing functionality. I guess because people were unaware of its existence, or because coding is much more fun than reading and learning code :)
-- Best regards, Igor Stasenko.
On Sep 20, 2013, at 5:37 PM, Markus Schlager <[hidden email]> wrote:
> I'd not expect, though, that it should be necessary to add offsets by hand. What is the advantage of having embedded objects use global coordinates?
> Markus

Conceptually this is a bug.
On 20.09.2013 at 12:24, Igor Stasenko <[hidden email]> wrote:

Thanks. I'm trying to understand TransformMorph but I don't really get it. I cannot see _how_ it is transforming stuff. For testing I have three morphs: MasterMorph, TransformMorph, TestMorph. In the TransformMorph I use MorphicTransform offset: 0 angle: 0 scale: 10.0. When I implement #drawOn: in TestMorph and do

    aCanvas fillRectangle: (Rectangle origin: 10@10 extent: 10@10) color: Color red.

the result looks good, although I don't understand where the coordinates are translated. To me it appears that the way it goes is from Canvas to GrafPort, and I didn't find the location where TransformMorph comes into play. I'm trying to figure that out because I cannot use Float coordinates in my transform morph. Doing in TestMorph>>#drawOn: something like

    aCanvas fillRectangle: (Rectangle origin: 0.9@0.9 extent: 1@1) color: Color green.

I get two problems. Basically the Rectangle is truncated to Integers before the transform. Additionally, in this particular case the rectangle is not displayed at all. For 0.89@0.89 I can see it truncated at origin 0@0, but at 0.9@0.9 it just disappears. Same goes for 0.63 and other values. And I don't know how to debug Morphic, because using breakpoints is too dangerous while it is in stepping mode.

thanks,
Norbert
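(A sketch of one way around the truncation described above, assuming the canvas really does work in whole pixels: leave the scaling out of the MorphicTransform, or set it to 1, and scale the Float model coordinates inside #drawOn: yourself, rounding only at the very end. The scale factor matches the example above; this is not meant as the 'right' fix.)

    TestMorph >> drawOn: aCanvas
        | scale origin corner |
        scale := 10.0.   "done here instead of in the MorphicTransform"
        origin := self bounds origin + ((0.9@0.9) * scale) rounded.
        corner := self bounds origin + ((1.9@1.9) * scale) rounded.
        aCanvas
            fillRectangle: (origin corner: corner)
            color: Color green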