Installing high dpi support and backwards compatibility of the VM

Eliot Miranda-2
 
Hi All,

    ideally adding high dpi support to the VM will not break backwards-compatibility.  But that implies that the VM is informed, before it creates the display, whether to create a high dpi display or not.  Off-list Tobias asked me where the VM sets up the display on Mac, and I was surprised by the answer.

I thought it would be as part of beDisplay.  But it actually happens as a side-effect of DisplayScreen class>>actualScreenSize, primitive 106, which calls the ioScreenSize function.  It is this function's responsibility to actually create the display, deriving the size from the savedWindowSize info in the image header (which can be overridden on the VM command line, and is when -headless is supplied).
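
For reference, the VM side of that chain looks roughly like the following Slang (a from-memory sketch, assuming ioScreenSize's usual packing of width in the high 16 bits and height in the low 16; not the verbatim source):

StackInterpreterPrimitives >> primitiveScreenSize
	"Sketch of primitive 106.  ioScreenSize creates the host window if it
	 does not yet exist and answers its extent packed as (width << 16) | height."
	| packed |
	packed := self ioScreenSize.
	self pop: 1 thenPush:
		(self makePointwithxValue: packed >> 16 yValue: (packed bitAnd: 16rFFFF))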

So any new primitive added to allow DisplayScreen to inform the VM of whether to use high dpi would have to be invoked before primitive 106.  One way to implement this is to modify the chain of invocations leading up to primitive 106.  For this route I'd like to propose the following refactoring:

DisplayScreen class>>actualScreenSize
	<primitive: 106>
	^ 640@480

becomes

DisplayScreen class>>actualScreenSize
	self primitiveUseHighDPI: self useHighDPI. "where this is a preference"
	^self primitiveScreenSize

primitiveScreenSize
	<primitive: 106>
	^ 640@480
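
On the VM side the new primitive might look something like this (a purely hypothetical sketch: primitiveUseHighDPI and the platform hook ioSetUseHighDPI do not exist yet and are only illustrative):

StackInterpreterPrimitives >> primitiveUseHighDPI
	"Hypothetical: record the image's preference before ioScreenSize
	 creates the display.  ioSetUseHighDPI would be a new per-platform hook."
	| flag |
	flag := self stackTop.
	(flag = objectMemory trueObject or: [flag = objectMemory falseObject]) ifFalse:
		[^self primitiveFailFor: PrimErrBadArgument].
	self ioSetUseHighDPI: (flag = objectMemory trueObject).
	self pop: 1 "pop the argument, answering the receiver"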


Another route is to make the useHighDPI flag part of the image header state alongside the saved window size.  This would mean adding it to the flags accessed via vmParameterAt: 48.  There could be a command-line argument to override it.
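
From the image side that could look something like this (a sketch only; the bit used here, 16r10, is a placeholder, since no header flag bit has actually been allocated):

DisplayScreen class>>useHighDPI: aBoolean
	"Sketch only: toggle a hypothetical high dpi bit in the image header
	 flags accessed via vmParameterAt: 48.  16r10 is a placeholder bit."
	| flags |
	flags := Smalltalk vmParameterAt: 48.
	Smalltalk vmParameterAt: 48 put: (aBoolean
		ifTrue: [flags bitOr: 16r10]
		ifFalse: [flags bitClear: 16r10])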


Finally I note that the beDisplay primitive simply stores the display object in the specialObjectsArray and assigns the interpreter variables that track the display: displayBits, displayWidth, displayHeight & displayDepth.  It then invokes ioNoteDisplayChangedwidthheightdepth, but *all* the implementations of this function are empty.  I propose that we eliminate this call and its implementation.  It is confusing to follow it and find it does nothing.  The argument could be that a platform might require it.  But if that's so we can always put it back.  We have an existence proof in all our platforms that this is unlikely.  Thoughts?
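
For concreteness, the primitive's body is roughly the following (a from-memory sketch with error checking elided, not the verbatim source; the field indices follow Form's bits, width, height, depth layout):

StackInterpreterPrimitives >> primitiveBeDisplay
	"Sketch: record the receiver as the display and cache its fields in
	 the interpreter variables.  Today this is followed by the empty
	 ioNoteDisplayChangedwidthheightdepth call proposed for removal."
	| rcvr |
	rcvr := self stackTop.
	objectMemory storePointer: TheDisplay ofObject: objectMemory specialObjectsOop withValue: rcvr.
	displayBits := objectMemory firstIndexableField:
						(objectMemory fetchPointer: 0 ofObject: rcvr).
	displayWidth := objectMemory integerValueOf: (objectMemory fetchPointer: 1 ofObject: rcvr).
	displayHeight := objectMemory integerValueOf: (objectMemory fetchPointer: 2 ofObject: rcvr).
	displayDepth := objectMemory integerValueOf: (objectMemory fetchPointer: 3 ofObject: rcvr)
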
_,,,^..^,,,_
best, Eliot

Re: Installing high dpi support and backwards compatibility of the VM

fniephaus
 
Hi Eliot,

What would be the consequences of moving the display creation logic
from DisplayScreen class>>actualScreenSize to beDisplay? If beDisplay
is called afterwards, it at least should be feasible, right?

Cheers,
Fabio


Re: Installing high dpi support and backwards compatibility of the VM

Eliot Miranda-2
 


On Fri, Oct 9, 2020 at 12:31 AM Fabio Niephaus <[hidden email]> wrote:
 
> Hi Eliot,
>
> What would be the consequences of moving the display creation logic
> from DisplayScreen class>>actualScreenSize to beDisplay? If beDisplay
> is called afterwards, it at least should be feasible, right?

I don't know.  There would need to be a design where the semantics of beDisplay were "Make me the display, but set my size to whatever you think best".  But that's not how the interface is designed.  The interface is designed as:

1. tell Display how big the actual GUI's screen is 
      => at this point the actual GUI's screen must be opened to find out how big it is
2. Display adjusts itself accordingly
3. beDisplay simply records the state that was actually started in 1.
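
Concretely, that order is the current startup sequence; from memory the method reads roughly:

DisplayScreen class>>startUp  "DisplayScreen startUp"
	Display setExtent: self actualScreenSize depth: Display nativeDepth.  "steps 1 & 2"
	Display beDisplay  "step 3"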

I don't know how to change this order without breaking backward-compatibility, which is where we came in.  Also, a beDisplay primitive which reached in and changed the inst vars in Display would be a really bad thing (IMO).

So it seems to me that we're stuck with the actual GUI's screen being opened in DisplayScreen class>>actualScreenSize <primitive: 106>.




--
_,,,^..^,,,_
best, Eliot

Re: [squeak-dev] Installing high dpi support and backwards compatibility of the VM

Eliot Miranda-2
 
Hi Tobias,

On Thu, Oct 8, 2020 at 11:57 PM Tobias Pape <[hidden email]> wrote:
> Hi

> Here's another idea:
> We already have
>
> DisplayScreen class>>actualScreenScaleFactor
>         <primitive: 'primitiveScreenScaleFactor'>
>         ^ 1.0
>
> And if we change DisplayScreen class>>startUp to
>
> DisplayScreen class>>startUp  "DisplayScreen startUp"
>         Display setScaleFactor: self actualScreenScaleFactor.
>         Display setExtent: self actualScreenSize depth: Display nativeDepth.
>         Display beDisplay

Very nice.  Let's go with this.

> Then the contract could be:
>
> "Iff you call primitiveScreenScaleFactor before any call to primitive 106, then you opt in to possibly high dpi"
>
> That way, we do not have to change any image at all, because older images just don't call that primitive.

Yep, works for me.
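
On the VM side that contract might be implemented along these lines (an entirely hypothetical sketch: wantsHighDPI, displayCreated and ioScreenScaleFactor are assumed names, not existing VM state):

primitiveScreenScaleFactor
	"Hypothetical: asking for the scale factor before the display exists
	 opts the VM in to high dpi; later calls merely report the factor."
	displayCreated ifFalse: [wantsHighDPI := true].
	self pop: 1 thenPushFloat: self ioScreenScaleFactor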


>>
>> Another route is to make the useHighDPI flag part of the image header state alongside the saved window size.  This would mean it was added to the flags accessed via vmParameterAt: 48.  There could be a command-line argument to override.

> Maybe a cmd-line parameter in any case…

+1

>>
>>
>> Finally I note that the beDisplay primitive simply stores the display object in the specialObjectsArray and assigns the interpreter variables that track the display, displayBits, displayWidth, displayHeight & displayDepth.  It then invokes ioNoteDisplayChangedwidthheightdepth, but *all* the implementations of this function are empty.  I propose that we should eliminate this call and its implementation.  It is confusing to follow it and find it does nothing.  The argument could be that a platform might require it.  But if that's so we can always put it back.  We have an existence proof in all our platforms that this is unlikely.  Thoughts?


> Funny. The mac vm says "/* This is invoked when the GC moves the display bitmap.  For now do nothing. */"  Does the GC ever do that actually?

Not now.  It used to.  One would see the pixels in the display become nonsense noise as the GC moved the display underneath the screen refresh.  But now in Spur the beDisplay primitive pins the display bits.
The ioNoteDisplayChangedwidthheightdepth hook is also fairly recent:

78c402ea71ebcc9db12496f81021fdb9b57deb5f (Fri May 12 19:29:45 2017)

StackInterpreter:
    Simplify and make robust display bitmap access for display update.  The old code
    required platforms that needed to redraw at arbitrary times to have to access
    the display bits through interpreterProxy->displayObject, decoding it each time.
    There exists a small window during compaction, etc, during which such access
    will fail and cause a VM crash.  The new code provides four variables to
    reference the display, displayBits, displayWidth, displayHeight and
    displayDepth, which are assigned appropriately in the primitiveBeDisplay
    primitive.  After a GC the interpreter checks if the displayBits have changed
    location and if so calls ioNoteDisplayChanged:width:height:depth:
    (ioNoteDisplayChangedwidthheightdepth) to inform the platform of the change
    (currently all platforms implement this as a null function).


> So, old (<2017) code cannot depend on it, new code does not. If the GC issue is moot, we can ditch it.

Even if the GC does move the display, ioNoteDisplayChangedwidthheightdepth can't be called until after GC, which means that from the time the GC moves the display to the time the GC finishes, the display image is corrupted.  That's fixed in Spur but in V3 you'll see that happen, especially if you resize the display (which allocates a new bitmap).  At least on Mac I would see it regularly.
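
For reference, the Spur fix amounts to pinning the bitmap when it becomes the display, something like this inside primitiveBeDisplay (a sketch; pinObject: and isPinned: are Spur's real pinning entry points, the surrounding code is elided):

	| bitsOop |
	bitsOop := objectMemory fetchPointer: 0 ofObject: self stackTop. "the new display form"
	((objectMemory isNonImmediate: bitsOop) and: [(objectMemory isPinned: bitsOop) not]) ifTrue:
		[objectMemory pinObject: bitsOop]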


> Best regards
>         -Tobias







--
_,,,^..^,,,_
best, Eliot

Re: Installing high dpi support and backwards compatibility of the VM

fniephaus
 
On Fri, Oct 9, 2020 at 9:52 AM Eliot Miranda <[hidden email]> wrote:

> I don't know.  There would need to be a design where the semantics of beDisplay were "Make me the display but set my size to whatever you think best".  But that's not how the interface is designed.  The interface is designed as
>
> 1. tell Display how big the actual GUI's screen is
>       => at this point the actual GUI's screen must be opened to find out how big it is
> 2. Display adjusts itself accordingly
> 3. beDisplay simply records the state that was actually started in 1.
>
> I don't know how to change this order without breaking backward-compatibility, which is where we came in.  Also, a beDisplay primitive which reached in and changed the inst vars in Display would be a really bad thing (IMO).

Interesting... in TruffleSqueak, the display is created as part of beDisplay:

https://github.com/hpi-swa/trufflesqueak/blob/5547e981b063b89d767a132862db514efdaaf171/src/de.hpi.swa.trufflesqueak/src/de/hpi/swa/trufflesqueak/nodes/primitives/impl/IOPrimitives.java#L241

Fabio
