• Jazelle DBX and ARM926EJ-S ~~> Pantilope

    From Mild Shock@janburse@fastmail.fm to comp.lang.prolog on Sat Jun 28 01:26:40 2025
    From Newsgroup: comp.lang.prolog

    Hi,

    Back in the early days Sun was already talking
    about Java on a CPU. Interestingly, this happened:

    The most prominent use of Jazelle DBX is by
    manufacturers of mobile phones to increase the
    execution speed of Java ME games and applications.
    A Jazelle-aware Java virtual machine (JVM) will
    attempt to run Java bytecode in hardware, while
    returning to the software for more complicated,
    or lesser-used bytecode operations. ARM claims that
    approximately 95% of bytecode in typical program
    usage ends up being directly processed in the hardware.
    https://en.wikipedia.org/wiki/Jazelle
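
    By way of analogy only, a minimal Prolog sketch (not ARM's
    mechanism): the same split appears whenever an interpreter runs
    the frequent, simple operations on a direct fast path and escapes
    to a generic fallback for the rare or complex ones. The predicate
    names below are made up for illustration.

    % Fast path: common stack operations are handled directly.
    exec(push(X), Stack, [X|Stack]) :- !.
    exec(add, [Y, X|Stack], [Z|Stack]) :- !, Z is X + Y.
    % Everything else escapes to a (hypothetical) slow interpreter.
    exec(Op, Stack0, Stack) :-
        slow_interpret(Op, Stack0, Stack).

    % Hypothetical stub standing in for the software path.
    slow_interpret(nop, Stack, Stack).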

    So in the 90s we first had the internet, and then
    in the 00s we had mobile phones. The 10s had
    big data and early deep learning.

    But Python is still slow as fuck in the 20s.
    They should invent a CPU that can do Pantilope,
    i.e. direct execution of Python.

    Bye
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Mild Shock@janburse@fastmail.fm to comp.lang.prolog on Sat Jun 28 01:27:24 2025
    From Newsgroup: comp.lang.prolog

    Hi,

    Jeff Barnet might have a point:

    Fran's specialty was bringing graph theory into computer development

    although he sounds boring. Prolog is very
    weak when it comes to using graph theory for
    code generation. Even Prolog Cafe is based on the WAM, not on LLVM.

    The WAM is linear code, whereas LLVM sees code as a
    graph of basic blocks. Here is an example:

    entry:
      %cond = icmp eq i32 %x, 0
      br i1 %cond, label %if_zero, label %if_nonzero

    if_zero:
      ; do something
      br label %merge

    if_nonzero:
      ; do something else
      br label %merge

    merge:
      %val = phi i32 [0, %if_zero], [1, %if_nonzero]
      ret i32 %val

    It's not the AST of the source code but the IR,
    i.e. the intermediate representation after some
    AST processing.

    Today I wrestled quite a number of hours
    to figure out whether liveness analysis can
    be done in one pass. My Prolog system Dogelog
    Player uses two passes, which makes assertz/1
    a little slow. Maybe I will implement a fast path
    without the liveness analysis for the
    dynamic database, to speed it up.
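
    For illustration, here is a minimal sketch in plain Prolog (not
    Dogelog Player's actual code) of the classical single backward
    scan: the first time a variable is encountered while walking the
    clause body from the end is its last use.

    % Goals is a list of body goals; Marked pairs each goal with the
    % variables whose last use is in that goal.
    last_uses(Goals, Marked) :-
        reverse(Goals, Rev),
        last_uses_(Rev, [], RevMarked),
        reverse(RevMarked, Marked).

    last_uses_([], _, []).
    last_uses_([Goal|Goals], Seen, [Goal-LastVars|Marked]) :-
        term_variables(Goal, Vars),
        exclude(seen(Seen), Vars, LastVars),
        append(Vars, Seen, Seen1),
        last_uses_(Goals, Seen1, Marked).

    seen(Seen, Var) :-
        member(X, Seen),
        X == Var, !.

    For example, last_uses([foo(X,Y), bar(Y,Z), baz(Z)], M) gives
    M = [foo(X,Y)-[X], bar(Y,Z)-[Y], baz(Z)-[Z]].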

    Bye

    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Mild Shock@janburse@fastmail.fm to comp.lang.prolog on Sat Jun 28 01:27:56 2025
    From Newsgroup: comp.lang.prolog

    Hi,

    Interestingly, Prolog nearly had a "Jazelle":

    Meanwhile, in Oxford I made contact with Tim [Robinson]
    from High Level Hardware, who had developed a microcoded
    workstation called the Orion (there is a good Wikipedia
    article on this machine). Tim wanted a Prolog system for
    the Orion, so I gave him the Prolog-X reference
    implementation. He microcoded it, and we reckoned
    it would have amazing performance because of that.
    However, several simultaneous events conspired to
    halt the microcoded Prolog on the Orion.
    https://www.softwarepreservation.org/projects/prolog

    Derived from Prolog-X, so basically from ZIP.

    Bye

    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Mild Shock@janburse@fastmail.fm to comp.lang.prolog on Sun Jun 29 13:05:26 2025
    From Newsgroup: comp.lang.prolog

    SWI-Prolog has a flag (module sensitive) called var_prefix.

    Interesting backward-oriented featuritis, to support
    Prolog systems from the past, can also be rationalized
    as useful for the present. You find a few real-world
    source code listings in the Computer History Museum
    collection below: very old Prolog systems used the star (*)
    as a variable prefix, then some Prolog systems used the
    underscore (_) as a variable prefix. Lower case was not
    always supported; some Prolog systems had a ‘NOLC’
    (No Lower Case) and an ‘LC’ (Lower Case) directive
    to switch modes:

    Computer History Museum’s Software Preservation Group
    https://www.softwarepreservation.org/
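
    As a minimal sketch of what the flag buys you, assuming
    SWI-Prolog's documented var_prefix behaviour (with the flag set,
    only names starting with an underscore are read as variables, so
    uppercase-starting names become plain atoms); module and
    predicate names are made up:

    :- module(oldstyle, [circle_area/2, state/1]).
    :- set_prolog_flag(var_prefix, true).

    % Only '_'-prefixed names are variables in this module.
    circle_area(_Radius, _Area) :-
        _Area is pi * _Radius * _Radius.

    % 'Idle' is read as a plain atom here, not as a variable.
    state(Idle).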

    Edit 29.06.2025:
    If only SWI-Prolog would put the same effort into
    forward-oriented features as it does into backward-oriented
    features, i.e. features that deal with Prolog systems
    of the future. For example, supporting this trivial gadget:

    ?- X = [a,b,c].
    `abc`

    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Mild Shock@janburse@fastmail.fm to comp.lang.prolog on Sun Jun 29 13:13:01 2025
    From Newsgroup: comp.lang.prolog

    Corr.: Forgot to show the correct answer substitution:

    ?- X = [a,b,c].
    X = `abc`
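
    One way to approximate that display, as a minimal sketch against
    SWI-Prolog (assuming the default answer_write_options still
    include portray(true); back_quotes is switched to chars so that
    `abc` also reads back as [a,b,c]):

    :- set_prolog_flag(back_quotes, chars).

    :- multifile user:portray/1.

    % Print a non-empty list of one-character atoms as a back-quoted
    % string, so [a,b,c] is displayed as `abc`.
    user:portray(Chars) :-
        is_list(Chars),
        Chars = [_|_],
        maplist(single_char_atom, Chars),
        atomic_list_concat(Chars, Text),
        format('`~w`', [Text]).

    single_char_atom(A) :-
        atom(A),
        atom_length(A, 1).

    With this loaded, ?- X = [a,b,c]. indeed answers X = `abc`.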

    --- Synchronet 3.21a-Linux NewsLink 1.2