VM Maker: VMMaker.oscog-eem.2182.mcz

commits-2
 
Eliot Miranda uploaded a new version of VMMaker to project VM Maker:
http://source.squeak.org/VMMaker/VMMaker.oscog-eem.2182.mcz

==================== Summary ====================

Name: VMMaker.oscog-eem.2182
Author: eem
Time: 24 March 2017, 4:03:46.057141 pm
UUID: 541d6c6d-ade5-47f1-ad2b-f9278422e3c9
Ancestors: VMMaker.oscog-eem.2181

Fix pc mapping for backward jumps in Cogit VMs with extension bytecodes.  The old code mapped a backward branch mcpc to the unextended jump bytecode, which could turn a backward branch into a forward jump, with disastrous results.  The map searcher must offset by the number of extensions to refer to the beginning of the extended bytecode sequence for a backward jump.  The extended backward jump bytecode must compute correctly the offset of the start of the extended bytecode from the target of the jump.

Fix context printing so that the cogMethod is printed last in a stable context (I get confused otherwise).

=============== Diff against VMMaker.oscog-eem.2181 ===============

Item was changed:
  ----- Method: CoInterpreter>>ensureContextHasBytecodePC: (in category 'frame access') -----
  ensureContextHasBytecodePC: aContext
  "Make sure the context has a byetcode pc.  Can only be used on single contexts."
  | pc |
  self assert: (self isMarriedOrWidowedContext: aContext) not.
  pc := objectMemory fetchPointer: InstructionPointerIndex ofObject: aContext.
  ((objectMemory isIntegerObject: pc)
  and: [(pc := objectMemory integerValueOf: pc) < 0]) ifTrue:
  [pc := self mustMapMachineCodePC: pc context: aContext.
+ self assert: (self validBCPC: (objectMemory integerValueOf: pc) inMethod: (objectMemory fetchPointer: MethodIndex ofObject: aContext)).
  objectMemory storePointerUnchecked: InstructionPointerIndex ofObject: aContext withValue: pc]!

Item was changed:
  ----- Method: CoInterpreter>>ifBackwardsCheckForEvents: (in category 'jump bytecodes') -----
+ ifBackwardsCheckForEvents: offsetToJumpBytecode
- ifBackwardsCheckForEvents: offset
  "Backward jump means we're in a loop.
  - check for possible interrupts.
  - check for long-running loops and JIT if appropriate."
  | switched backwardJumpCountByte |
  <inline: true>
+ offsetToJumpBytecode >= 0 ifTrue:
- offset >= 0 ifTrue:
  [^self].
 
  localSP < stackLimit ifTrue:
  [self externalizeIPandSP.
  switched := self checkForEventsMayContextSwitch: true.
  self returnToExecutive: true postContextSwitch: switched.
  self browserPluginReturnIfNeeded.
  self internalizeIPandSP.
  switched ifTrue:
  [^self]].
 
  "We use the least significant byte of the flags word (which is marked as an immediate) and
  subtract two each time to avoid disturbing the least significant tag bit.  Since the byte is
  initialized to 1 (on frame build), on first decrement it will become -1.  Trip when it reaches 1 again."
  backwardJumpCountByte := self iframeBackwardBranchByte: localFP.
  (backwardJumpCountByte := backwardJumpCountByte - 2) = 1
  ifTrue:
  [(self methodWithHeaderShouldBeCogged: (objectMemory methodHeaderOf: method)) ifTrue:
  [self externalizeIPandSP.
+ self attemptToSwitchToMachineCode: (self oopForPointer: localIP) - offsetToJumpBytecode - method - objectMemory baseHeaderSize - 1
- self attemptToSwitchToMachineCode: (self oopForPointer: localIP) - offset - method - objectMemory baseHeaderSize - 1
  "If attemptToSwitchToMachineCode: returns the method could not be cogged, hence..."].
  "can't cog method; avoid asking to cog it again for the longest possible time."
  backwardJumpCountByte := 16r7F]
  ifFalse:
  [backwardJumpCountByte = -1 ifTrue: "initialize the count"
  [self assert: minBackwardJumpCountForCompile <= 128.
  backwardJumpCountByte := minBackwardJumpCountForCompile - 1 << 1 + 1]].
  self iframeBackwardBranchByte: localFP put: backwardJumpCountByte!

Item was changed:
  ----- Method: CoInterpreter>>printMethodFieldForPrintContext: (in category 'debug printing') -----
  printMethodFieldForPrintContext: aContext
  <inline: true>
  | meth |
  meth := objectMemory fetchPointer: MethodIndex ofObject: aContext.
+ (self isMarriedOrWidowedContext: aContext)
+ ifFalse:
+ [self shortPrintOop: meth.
+ (self methodHasCogMethod: meth) ifTrue:
+ [self space; printHexnp: (self cogMethodOf: meth)]]
+ ifTrue:
+ [(self methodHasCogMethod: meth) ifTrue:
+ [self printHexnp: (self cogMethodOf: meth); space].
+ self shortPrintOop: meth]!
- (self methodHasCogMethod: meth) ifTrue:
- [self printHexnp: (self cogMethodOf: meth); space].
- self shortPrintOop: meth.!

Item was added:
+ ----- Method: CogVMSimulator>>extUnconditionalJump (in category 'jump bytecodes') -----
+ extUnconditionalJump
+ "242 11110010 i i i i i i i i Jump i i i i i i i i (+ Extend B * 256, where bbbbbbbb = sddddddd, e.g. -32768 = i=0, a=0, s=1)"
+ "| byte offset |
+ byte := objectMemory byteAt: localIP + 1.
+ offset := byte + (extB << 8).
+ (offset < 0 and: [(self iframeBackwardBranchByte: localFP) - 2 = 1]) ifTrue: [self halt]."
+ ^super extUnconditionalJump!

Item was added:
+ ----- Method: CogVMSimulator>>primitiveVoidVMState (in category 'system control primitives') -----
+ primitiveVoidVMState
+ self halt.
+ ^super primitiveVoidVMState!

Item was changed:
  ----- Method: Cogit>>mapFor:bcpc:performUntil:arg: (in category 'method map') -----
  mapFor: cogMethod bcpc: startbcpc performUntil: functionSymbol arg: arg
  "Machine-code <-> bytecode pc mapping support.  Evaluate functionSymbol
  for each mcpc, bcpc pair in the map until the function returns non-zero,
  answering that result, or 0 if it fails to.  To cut down on the number of arguments,
  and to be usable for both pc-mapping and method introspection, we encode
  the annotation and the isBackwardBranch flag in the same parameter.
  Guilty as charged."
  <var: #cogMethod type: #'CogBlockMethod *'>
  <var: #functionSymbol declareC: 'sqInt (*functionSymbol)(BytecodeDescriptor *desc, sqInt annotationAndIsBackwardBranch, char *mcpc, sqInt bcpc, void *arg)'>
  <var: #arg type: #'void *'>
  <inline: true>
  | isInBlock mcpc bcpc endbcpc map mapByte homeMethod aMethodObj result
   latestContinuation byte descriptor bsOffset nExts annotation |
  <var: #descriptor type: #'BytecodeDescriptor *'>
  <var: #homeMethod type: #'CogMethod *'>
 
  self assert: cogMethod stackCheckOffset > 0.
  mcpc := cogMethod asUnsignedInteger + cogMethod stackCheckOffset.
  "The stack check maps to the start of the first bytecode,
  the first bytecode being effectively after frame build."
  result := self perform: functionSymbol
  with: nil
  with: 0 + (HasBytecodePC << 1)
  with: (self cCoerceSimple: mcpc to: #'char *')
  with: startbcpc
  with: arg.
  result ~= 0 ifTrue:
  [^result].
  bcpc := startbcpc.
  "In both CMMethod and CMBlock cases find the start of the map and
  skip forward to the bytecode pc map entry for the stack check."
  cogMethod cmType = CMMethod
  ifTrue:
  [isInBlock := cogMethod cmIsFullBlock.
  homeMethod := self cCoerceSimple: cogMethod to: #'CogMethod *'.
  self assert: startbcpc = (coInterpreter startPCOfMethodHeader: homeMethod methodHeader).
  map := self mapStartFor: homeMethod.
  annotation := (objectMemory byteAt: map) >> AnnotationShift.
  self assert: (annotation = IsAbsPCReference
  or: [annotation = IsObjectReference
  or: [annotation = IsRelativeCall
  or: [annotation = IsDisplacementX2N]]]).
  latestContinuation := startbcpc.
  aMethodObj := homeMethod methodObject.
  endbcpc := (objectMemory numBytesOf: aMethodObj) - 1.
  bsOffset := self bytecodeSetOffsetForHeader: homeMethod methodHeader.
  "If the method has a primitive, skip it and the error code store, if any;
  Logically, these come before the stack check and so must be ignored."
  bcpc := bcpc + (self deltaToSkipPrimAndErrorStoreIn: aMethodObj
  header: homeMethod methodHeader)]
  ifFalse:
  [isInBlock := true.
  self assert: bcpc = cogMethod startpc.
  homeMethod := cogMethod cmHomeMethod.
  map := self findMapLocationForMcpc: cogMethod asUnsignedInteger + (self sizeof: CogBlockMethod)
  inMethod: homeMethod.
  self assert: map ~= 0.
  annotation := (objectMemory byteAt: map) >> AnnotationShift.
  self assert: (annotation >> AnnotationShift = HasBytecodePC "fiducial"
  or: [annotation >> AnnotationShift = IsDisplacementX2N]).
  [(annotation := (objectMemory byteAt: map) >> AnnotationShift) ~= HasBytecodePC] whileTrue:
  [map := map - 1].
  map := map - 1. "skip fiducial; i.e. the map entry for the pc immediately following the method header."
  aMethodObj := homeMethod methodObject.
  bcpc := startbcpc - (self blockCreationBytecodeSizeForHeader: homeMethod methodHeader).
  bsOffset := self bytecodeSetOffsetForHeader: homeMethod methodHeader.
  byte := (objectMemory fetchByte: bcpc ofObject: aMethodObj) + bsOffset.
  descriptor := self generatorAt: byte.
  endbcpc := self nextBytecodePCFor: descriptor at: bcpc exts: -1 in: aMethodObj.
  bcpc := startbcpc].
  nExts := 0.
  self inlineCacheTagsAreIndexes ifTrue:
  [enumeratingCogMethod := homeMethod].
  "Now skip up through the bytecode pc map entry for the stack check."
  [(objectMemory byteAt: map) >> AnnotationShift ~= HasBytecodePC] whileTrue:
  [map := map - 1].
  map := map - 1.
  [(mapByte := objectMemory byteAt: map) ~= MapEnd] whileTrue: "defensive; we exit on bcpc"
  [mapByte >= FirstAnnotation
  ifTrue:
  [| nextBcpc isBackwardBranch |
  annotation := mapByte >> AnnotationShift.
  mcpc := mcpc + ((mapByte bitAnd: DisplacementMask) * backEnd codeGranularity).
  (self isPCMappedAnnotation: annotation) ifTrue:
  [(annotation = IsSendCall
   and: [(mapByte := objectMemory byteAt: map - 1) >> AnnotationShift = IsAnnotationExtension]) ifTrue:
  [annotation := annotation + (mapByte bitAnd: DisplacementMask).
  map := map - 1].
  [byte := (objectMemory fetchByte: bcpc ofObject: aMethodObj) + bsOffset.
   descriptor := self generatorAt: byte.
   isInBlock
  ifTrue: [bcpc >= endbcpc ifTrue: [^0]]
  ifFalse:
  [(descriptor isReturn and: [bcpc >= latestContinuation]) ifTrue: [^0].
  (descriptor isBranch or: [descriptor isBlockCreation]) ifTrue:
  [| targetPC |
  targetPC := self latestContinuationPCFor: descriptor at: bcpc exts: nExts in: aMethodObj.
  latestContinuation := latestContinuation max: targetPC]].
   nextBcpc := self nextBytecodePCFor: descriptor at: bcpc exts: nExts in: aMethodObj.
   descriptor isMapped
   or: [isInBlock and: [descriptor isMappedInBlock]]] whileFalse:
  [bcpc := nextBcpc.
  nExts := descriptor isExtension ifTrue: [nExts + 1] ifFalse: [0]].
  isBackwardBranch := descriptor isBranch
    and: [self isBackwardBranch: descriptor at: bcpc exts: nExts in: aMethodObj].
  result := self perform: functionSymbol
  with: descriptor
  with: (isBackwardBranch ifTrue: [annotation << 1 + 1] ifFalse: [annotation << 1])
  with: (self cCoerceSimple: mcpc to: #'char *')
+ with: (isBackwardBranch ifTrue: [bcpc - (2 * nExts)] ifFalse: [bcpc])
- with: bcpc
  with: arg.
  result ~= 0 ifTrue:
  [^result].
  bcpc := nextBcpc.
  nExts := descriptor isExtension ifTrue: [nExts + 1] ifFalse: [0]]]
  ifFalse:
  [self assert: (mapByte >> AnnotationShift = IsDisplacementX2N
  or: [mapByte >> AnnotationShift = IsAnnotationExtension]).
  mapByte < (IsAnnotationExtension << AnnotationShift) ifTrue:
  [mcpc := mcpc + ((mapByte - DisplacementX2N << AnnotationShift) * backEnd codeGranularity)]].
  map := map - 1].
  ^0!

Item was changed:
  ----- Method: StackInterpreter>>extUnconditionalJump (in category 'jump bytecodes') -----
  extUnconditionalJump
  "242 11110010 i i i i i i i i Jump i i i i i i i i (+ Extend B * 256, where bbbbbbbb = sddddddd, e.g. -32768 = i=0, a=0, s=1)"
+ | byte offset bcpcDelta |
- | byte offset |
  byte := self fetchByte.
  offset := byte + (extB << 8).
+ bcpcDelta := offset < 0 ifTrue: [numExtB * 2] ifFalse: [0].
  extB := 0.
  numExtB := 0.
  localIP := localIP + offset.
+ self ifBackwardsCheckForEvents: offset + bcpcDelta.
- self ifBackwardsCheckForEvents: offset.
  self fetchNextBytecode!

Item was added:
+ ----- Method: StackInterpreter>>validBCPC:inMethod: (in category 'debug support') -----
+ validBCPC: thePC inMethod: aMethod
+ <var: #aMethod type: #usqInt>
+ "Note that we accept anInstrPointer pointing to a callPrimitiveBytecode
+ at the start of a method that contains a primitive.  This because methods like
+ Context(Part)>>reset have to be updated to skip the callPrimtiive bytecode otherwise."
+ "-1 for pre-increment in fetchNextBytecode"
+ ^thePC >= (objectMemory lastPointerOf: aMethod)
+  and: [thePC < (objectMemory numBytesOfBytes: aMethod)]!