Hi All,

ideally adding the high dpi support to the VM will not break backwards compatibility. But that implies that the VM is informed, before it creates the display, whether to create a high dpi display or not. Off-list Tobias asked me where the VM sets up the display on Mac, and I was surprised by the answer.

I thought it would be as part of beDisplay. But it's actually a side effect of DisplayScreen class>>actualScreenSize, primitive 106, which calls the ioScreenSize function. It is this function's responsibility to actually create the display, deriving the size from the savedWindowSize info in the image header (which can be overridden on the VM command line, and is when -headless is supplied).

So any new primitive added to allow DisplayScreen to inform the VM whether to use high dpi or not would have to be invoked before primitive 106. One way to implement this is to modify the chain of invocations leading up to primitive 106. For this route I'd like to propose the following refactoring:

    DisplayScreen class>>actualScreenSize
        <primitive: 106>
        ^ 640@480

becomes

    DisplayScreen class>>actualScreenSize
        self primitiveUseHighDPI: self useHighDPI. "where this is a preference"
        ^self primitiveScreenSize

    primitiveScreenSize
        <primitive: 106>
        ^ 640@480

Another route is to make the useHighDPI flag part of the image header state alongside the saved window size. This would mean adding it to the flags accessed via vmParameterAt: 48. There could be a command-line argument to override it.

Finally, I note that the beDisplay primitive simply stores the display object in the specialObjectsArray and assigns the interpreter variables that track the display: displayBits, displayWidth, displayHeight & displayDepth. It then invokes ioNoteDisplayChangedwidthheightdepth, but *all* the implementations of this function are empty. I propose that we eliminate this call and its implementation. It is confusing to follow it and find it does nothing. The argument could be that a platform might require it. But if that's so we can always put it back. We have an existence proof in all our platforms that this is unlikely. Thoughts?

_,,,^..^,,,_
best, Eliot
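[For concreteness, a minimal sketch of what the image-side half of the first route could look like. The selector primitiveUseHighDPI:, the named primitive it refers to, and the useHighDPI preference are all hypothetical; they come from the proposal above, not from any existing image or VM. The interesting part is the code after the primitive declaration: on an older VM the primitive simply fails and the preference is silently ignored, which is what preserves backwards compatibility.]

    DisplayScreen class>>useHighDPI
        "Preference: should the VM open the host window as a high dpi surface?
         In a real image this would be backed by a class variable and a Preference;
         answering a constant keeps this sketch self-contained."
        ^false

    DisplayScreen class>>primitiveUseHighDPI: aBoolean
        "Tell the VM whether the display it is about to create (in primitive 106 /
         ioScreenSize) should be high dpi. Hypothetical named primitive."
        <primitive: 'primitiveUseHighDPI'>
        "The primitive failed: we are running on a VM that predates it.
         Ignore the preference rather than raising an error, so the image
         still starts up on older VMs."
        ^self

With actualScreenSize rewritten as above to send primitiveUseHighDPI: before primitiveScreenSize, the preference reaches the VM just before ioScreenSize creates the window, and an image carrying these methods would still start up unchanged on a VM that has never heard of high dpi.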
Hi Eliot,

What would be the consequences of moving the display creation logic from DisplayScreen class>>actualScreenSize to beDisplay? If beDisplay is called afterwards, it at least should be feasible, right?

Cheers,
Fabio
On Fri, Oct 9, 2020 at 12:31 AM Fabio Niephaus <[hidden email]> wrote:
> What would be the consequences of moving the display creation logic
> from DisplayScreen class>>actualScreenSize to beDisplay? If beDisplay
> is called afterwards, it at least should be feasible, right?
I don't know. There would need to be a design where the semantics of beDisplay were "make me the display but set my size to whatever you think best". But that's not how the interface is designed. The interface is designed as

1. tell Display how big the actual GUI's screen is
   => at this point the actual GUI's screen must be opened to find out how big it is
2. Display adjusts itself accordingly
3. beDisplay simply records the state that was actually started in 1.

I don't know how to change this order without breaking backward compatibility, which is where we came in. Also, a beDisplay primitive which reached in and changed the inst vars in Display would be a really bad thing (IMO).

So it seems to me that we're stuck with the actual GUI's screen being opened in DisplayScreen class>>actualScreenSize <primitive: 106>.
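[For reference, the image-side start-up sequence that establishes this order looks roughly like the following; this is paraphrased from a Squeak trunk image, and exact selectors may differ between images and versions.]

    DisplayScreen class>>startUp
        "actualScreenSize (primitive 106 -> ioScreenSize) is what actually opens
         the host window; Display then resizes its bitmap to match; beDisplay
         merely registers the resulting bits, extent and depth with the VM."
        Display setExtent: self actualScreenSize depth: Display nativeDepth.
        Display beDisplay

So any "use high dpi" hint, whether a new primitive or an image-header flag, has to take effect before the setExtent:depth: line runs, which is exactly the constraint the actualScreenSize refactoring above is meant to satisfy.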
_,,,^..^,,,_
best, Eliot
Hi Tobias,

On Thu, Oct 8, 2020 at 11:57 PM Tobias Pape <[hidden email]> wrote:
> Hi

Very nice. Let's go with this.
Yep, works for me.
+1
Not now. It used to. One would see the pixels in the display become nonsense noise as the GC moved the display underneath the screen refresh. But now in Spur the beDisplay primitive pins the display bits. It's also fairly recent.

Even if the GC does move the display, ioNoteDisplayChangedwidthheightdepth can't be called until after the GC, which means that from the time the GC moves the display to the time the GC finishes, the display image is corrupted. That's fixed in Spur, but in V3 you'll see it happen, especially if you resize the display (which allocates a new bitmap). At least on Mac I would see it regularly.
_,,,^..^,,,_
best, Eliot
On Fri, Oct 9, 2020 at 9:52 AM Eliot Miranda <[hidden email]> wrote:
>
> I don't know how to change this order without breaking backward compatibility, which is where we came in. Also, a beDisplay primitive which reached in and changed the inst vars in Display would be a really bad thing (IMO).

Interesting...in TruffleSqueak, the display is created as part of beDisplay:
https://github.com/hpi-swa/trufflesqueak/blob/5547e981b063b89d767a132862db514efdaaf171/src/de.hpi.swa.trufflesqueak/src/de/hpi/swa/trufflesqueak/nodes/primitives/impl/IOPrimitives.java#L241

Fabio