Inferencer/Interpreter hybrid and Kleene Star types.

Still a draft, but it works. Gotta clean it up and supply defs for basis
functions.
Simon Forman 2018-06-24 16:33:07 -07:00
parent 968556c7f3
commit 1aace53fa0
9 changed files with 3844 additions and 856 deletions

File diff suppressed because it is too large.

File diff suppressed because it is too large.


@ -1,7 +1,9 @@
# Type Inference
## Pöial's Rules
Cf. ["Type Inference in Stack-Based Programming Languages"](http://prl.ccs.neu.edu/blog/2017/03/10/type-inference-in-stack-based-programming-languages/) by Rob Kleffner, 2017-03-10.
## Part I: Pöial's Rules
["Typing Tools for Typeless Stack Languages" by Jaanus Pöial
](http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.212.6026)
@ -30,7 +32,7 @@ This rule deals with functions that consume items from the stack `(a --)`:
(c a -- d)
### Third Rule
The third rule is actually two rules. These two rules deal with composing functions when the second one will consume one of the items the first one produces. The two types must be *unified* or a type conflict declared.
The third rule is actually two rules. These two rules deal with composing functions when the second one will consume one of the items the first one produces. The two types must be [*unified*](https://en.wikipedia.org/wiki/Robinson's_unification_algorithm) or a type conflict declared.
(a -- b t[i])∘(c u[j] -- d) t <= u (t is subtype of u)
-------------------------------
@ -41,7 +43,6 @@ The third rule is actually two rules. These two rules deal with composing funct
-------------------------------
(a -- b )∘(c -- d) t[i] == u[k] == u[j]
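For a concrete instance: composing `dup`, whose effect is `(a1 -- a1 a1)`, with a `mul` typed as `(n1 n2 -- n3)` fires the third rule twice, each time unifying an `a` that `dup` produces with an `n` that `mul` consumes; since number is a subtype of "anything", the `a` variable gets relabeled to a number and the composite effect comes out number-to-number. Part IV below derives this mechanically and prints it as `(n0 -- n1)`.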
## Examples
Let's work through some examples by hand to develop an intuition for the algorithm.
There's a function in one of the other notebooks.
@ -267,7 +268,7 @@ def F(stack):
return (d, (c, S0)), stack
```
## Implementation
## Part II: Implementation
### Representing Stack Effect Comments in Python
@ -485,8 +486,8 @@ poswrd
### List Functions
Here's that trick to represent functions like `rest` and `cons` that manipulate lists. We use a cons-list of tuples and give the tails their own numbers. Then everything above already works.
### Stack Functions
Here's that trick to represent functions like `rest` and `cons` that manipulate stacks. We use a cons-list of tuples and give the tails their own numbers. Then everything above already works.
```python
@ -539,6 +540,7 @@ F
Compare with the stack effect comment and you can see it works fine:
([4 5 ...] 2 3 1 -- [3 2 ...])
3 4 5 1 2 0 2 1 5
### Dealing with `cons` and `uncons`
However, if we try to compose e.g. `cons` and `uncons` it won't work:
@ -567,7 +569,7 @@ The problem is that the `unify()` function as written doesn't handle the case wh
def unify(u, v, s=None):
if s is None:
s = {}
else:
elif s:
u = update(s, u)
v = update(s, v)
@ -606,7 +608,7 @@ C(cons, uncons)
## Compiling
## Part III: Compiling Stack Functions
Now consider the Python function we would like to derive:
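That is, something like the hand-derived `F()` from the examples above:

```python
def F(stack):
    (_, (d, (c, ((a, (b, S0)), stack)))) = stack
    return (d, (c, S0)), stack
```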
@ -904,7 +906,7 @@ for name, stack_effect_comment in sorted(defs().items()):
## Types and Subtypes of Arguments
## Part IV: Types and Subtypes of Arguments
So far we have dealt with types of functions, those dealing with simple stack manipulation. Let's extend our machinery to deal with types of arguments.
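For reference, the subtype ladder this part builds up is plain Python subclassing, as it lands in `joy/utils/types.py` at the bottom of this commit (the `AnyJoyType` base class is unchanged there, so it doesn't appear in that diff):

```python
class NumberJoyType(AnyJoyType): prefix = 'n'
class FloatJoyType(NumberJoyType): prefix = 'f'
class IntJoyType(FloatJoyType): prefix = 'i'
class StackJoyType(AnyJoyType): prefix = 's'
```

(The new `polytypes.py` module re-derives `IntJoyType` from `NumberJoyType` so that ints no longer satisfy float slots.)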
### "Number" Type
@ -1158,7 +1160,7 @@ delabel(foo)
def unify(u, v, s=None):
if s is None:
s = {}
else:
elif s:
u = update(s, u)
v = update(s, v)
@ -1460,7 +1462,7 @@ for name, stack_effect_comment in sorted(defs().items()):
uncons = ([a1 .1.] -- a1 [.1.])
## Functions that use the Stack
## Part V: Functions that use the Stack
Consider the `stack` function which grabs the whole stack, quotes it, and puts it on itself:
@ -1518,33 +1520,16 @@ Let's try `stack∘uncons∘uncons`:
It works.
#### `compose()` version 2
This function has to be modified to use the new data structures. It is no longer recursive; instead, recursion happens as part of unification.
This function has to be modified to use the new data structures. It is no longer recursive; instead, recursion happens as part of unification. Further, the first and second of Pöial's rules are now handled automatically by the unification algorithm.
```python
def compose(f, g):
(f_in, f_out), (g_in, g_out) = f, g
if not g_in:
fg_in, fg_out = f_in, stack_concat(g_out, f_out)
elif not f_out:
fg_in, fg_out = stack_concat(f_in, g_in), g_out
else: # Unify and update.
s = unify(g_in, f_out)
if s == False: # s can also be the empty dict, which is ok.
raise TypeError('Cannot unify %r and %r.' % (fo, gi))
fg_in, fg_out = update(s, (f_in, g_out))
return fg_in, fg_out
stack_concat = lambda q, e: (q[0], stack_concat(q[1], e)) if q else e
s = unify(g_in, f_out)
if s == False: # s can also be the empty dict, which is ok.
raise TypeError('Cannot unify %r and %r.' % (f_out, g_in))
return update(s, (f_in, g_out))
```
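The `stack_concat` helper splices two cons-list tuples end to end; here is a tiny self-contained check (Python 2 print syntax, to match the notebook):

```python
stack_concat = lambda q, e: (q[0], stack_concat(q[1], e)) if q else e

print stack_concat((1, (2, ())), (3, ()))   # -> (1, (2, (3, ())))
```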
I don't want to rewrite all the defs myself, so I'll write a little conversion function instead. This is programmer's laziness.
@ -1796,7 +1781,7 @@ C(cons, unstack)
## Multiple Stack Effects
## Part VI: Multiple Stack Effects
...
@ -1880,44 +1865,9 @@ for f in MC([dup], [mul]):
(n0 -- n1)
## `concat`
### Representing an Unbounded Sequence of Types
How to deal with `concat`?
concat ([.0.] [.1.] -- [.0. .1.])
We would like to represent this in Python somehow...
```python
concat = (S[0], S[1]), ((S[0], S[1]),)
```
But this is actually `cons` with the first argument restricted to be a stack:
([.0.] [.1.] -- [[.0.] .1.])
What we have implemented so far would actually only permit:
([.0.] [.1.] -- [.2.])
```python
concat = (S[0], S[1]), (S[2],)
```
This works but can lose information. Consider `cons concat`; this is how much information we *could* retain:
(1 [.0.] [.1.] -- [1 .0. .1.])
As opposed to just:
(1 [.0.] [.1.] -- [.2.])
### Brzo...'s Derivatives of Regular Expressions
We can invent a new type of type variable, a "sequence type" (I think this is what they mean in the literature by that term...) or "Kleene Star" type. I'm going to represent it as a type letter and the asterisk, so a sequence of zero or more `AnyJoyType` variables would be:
We can borrow a trick from [Brzozowski's Derivatives of Regular Expressions](https://en.wikipedia.org/wiki/Brzozowski_derivative) to invent a new type of type variable, a "sequence type" (I think this is what they mean in the literature by that term...) or "[Kleene Star](https://en.wikipedia.org/wiki/Kleene_star)" type. I'm going to represent it as a type letter and the asterisk, so a sequence of zero or more `AnyJoyType` variables would be:
A*
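During unification a sequence variable splits the problem into two alternate universes: in one the star matches nothing and disappears, in the other it yields a fresh `AnyJoyType` variable followed by itself again (this is spelled out in the `KleeneStar` docstring in `polytypes.py` below):

    A* -> 0
    A* -> A A*

All the substitutions from universes that don't lead to a type conflict are kept.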
@ -2049,7 +1999,7 @@ def unify(u, v, s=None):
sn.update(s)
return t
ses = unify(u[0], v[0])
ses = unify(u[0], v[0], s)
results = ()
for sn in ses:
results += unify(u[1], v[1], sn)
@ -2160,29 +2110,17 @@ for result in unify(sum_[0], f):
#### `compose()` version 3
This function has to be modified to use the new data structures. It is no longer recursive; instead, recursion happens as part of unification.
This function has to be modified to yield multiple results.
```python
def compose(f, g):
(f_in, f_out), (g_in, g_out) = f, g
if not g_in:
yield f_in, stack_concat(g_out, f_out)
elif not f_out:
yield stack_concat(f_in, g_in), g_out
else: # Unify and update.
s = unify(g_in, f_out)
if not s:
raise TypeError('Cannot unify %r and %r.' % (fo, gi))
for result in s:
yield update(result, (f_in, g_out))
s = unify(g_in, f_out)
if not s:
raise TypeError('Cannot unify %r and %r.' % (f_out, g_in))
for result in s:
yield update(result, (f_in, g_out))
```
@ -2209,8 +2147,8 @@ for f in MC([dup], muls):
print doc_from_stack_effect(*f)
```
(a0 -- f0)
(a0 -- i0)
(f0 -- f1)
(i0 -- i1)
@ -2295,6 +2233,341 @@ for result in unify(a, b):
{s2: (a1*, (a3, s1)), a2: a10004, a4: a1}
## Part VII: Typing Combinators
TBD
This is an open subject.
The obvious thing is that you now need two pairs of tuples to describe the combinators' effects, a stack effect comment and an expression effect comment:
dip (a [F] --)--(-- F a)
One thing that might help is...
Consider the type of:
[cons] dip
Obviously it would be:
(a1 [..1] a2 -- [a1 ..1] a2)
`dip` itself could have:
(a1 [..1] -- ... then what?
```python
class SymbolJoyType(AnyJoyType): prefix = 'F'
W = map(SymbolJoyType, _R)
k = S[0], ((W[1], S[2]), S[0])
Symbol('cons')
print doc_from_stack_effect(*k)
```
(-- [F1 .2.])
```python
dip_a = ((W[1], S[2]), (A[1], S[0]))
```
```python
d = relabel(S[0], dip_a)
print doc_from_stack_effect(*d)
```
(-- a1001 [F1001 .1002.])
```python
s = list(unify(d[1], k[1]))[0]
s
```
{s0: (a1001, s1000), s1002: s2, F1001: F1}
```python
j = update(s, k)
```
```python
print doc_from_stack_effect(*j)
```
(a1001 -- a1001 [F1 .2.])
```python
j
```
((a1001, s1000), ((F1, s2), (a1001, s1000)))
```python
cons
```
((s1, (a1, s23)), ((a1, s1), s23))
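Read in the comment notation, that's `(a1 [.1.] -- [a1 .1.])`, the inverse of `uncons` shown earlier.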
```python
for f in MC([k], [dup]):
print doc_from_stack_effect(*f)
```
(-- [F0 .1.] [F0 .1.])
```python
l = S[0], ((cons, S[2]), (A[1], S[0]))
```
```python
print doc_from_stack_effect(*l)
```
(-- a1 [[[[.1.] a1 .23.] [a1 .1.] .23.] .2.])
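(The nested mess is because `l` embeds the raw `cons` effect tuple directly in the type, and `doc_from_stack_effect()` renders that tuple as if it were a nested stack. The `SymbolJoyType` and `CombinatorJoyType` wrappers defined below avoid this by giving functions label objects of their own.)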
```python
def dip_t(F):
(quote, (a1, sec)) = F[1]
G = F[0], sec
P = S[3], (a1, S[3])
a = [P]
while isinstance(quote, tuple):
term, quote = quote
a.append(term)
a.append(G)
return a[::-1]
```
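In other words, `dip_t()` pops the quoted program and the value `a1` off the inferred output stack and hands back a list of effects to compose: the remaining effect `G`, the terms from the quote, and a pusher `P` that puts `a1` back on top, mimicking what `dip` does at run time.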
```python
from joy.utils.stack import iter_stack
```
```python
a, b, c = dip_t(l)
```
```python
a
```
(s0, s0)
```python
b
```
((s1, (a1, s23)), ((a1, s1), s23))
```python
c
```
(s3, (a1, s3))
```python
MC([a], [b])
```
[((s0, (a0, s1)), ((a0, s0), s1))]
```python
kjs = MC(MC([a], [b]), [c])
kjs
```
[((s0, (a0, s1)), (a1, ((a0, s0), s1)))]
```python
print doc_from_stack_effect(*kjs[0])
```
(a0 [.0.] -- [a0 .0.] a1)
(a0 [.0.] -- [a0 .0.] a1)
a0 [.0.] a1 [cons] dip
----------------------------
[a0 .0.] a1
```python
stack_concat = lambda q, e: (q[0], stack_concat(q[1], e)) if q else e
```
```python
class SymbolJoyType(AnyJoyType):
prefix = 'F'
def __init__(self, name, sec, number):
self.name = name
self.stack_effects = sec
self.number = number
class CombinatorJoyType(SymbolJoyType): prefix = 'C'
def dip_t(stack, expression):
(quote, (a1, stack)) = stack
expression = stack_concat(quote, (a1, expression))
return stack, expression
CONS = SymbolJoyType('cons', [cons], 23)
DIP = CombinatorJoyType('dip', [dip_t], 44)
def kav(F, e):
#i, stack = F
if not e:
return [(F, e)]
n, e = e
if isinstance(n, SymbolJoyType):
Fs = []
for sec in n.stack_effects:
Fs.extend(MC([F], sec))
return [kav(Fn, e) for Fn in Fs]
if isinstance(n, CombinatorJoyType):
res = []
for f in n.stack_effects:
s, e = f(F[1], e)
new_F = F[0], s
res.extend(kav(new_F, e))
return res
lit = S[0], (n, S[0])
return [kav(Fn, e) for Fn in MC([F], [lit])]
```
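(A cleaned-up version of `kav()` appears in `joy/utils/polytypes.py` below: it flattens the nested result lists and passes the whole `n.stack_effects` list to `MC()` rather than looping over it.)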
Compare, and be amazed:
```python
def dip_t(stack, expression):
(quote, (a1, stack)) = stack
expression = stack_concat(quote, (a1, expression))
return stack, expression
```
```python
def dip(stack, expression, dictionary):
(quote, (x, stack)) = stack
expression = (x, expression)
return stack, concat(quote, expression), dictionary
```
And that brings us to the current work-in-progress. I'm pretty hopeful that the mixed-mode inferencer/interpreter `kav()` function, along with the ability to specify multiple implementations for the combinators, will permit modelling of the stack effects of e.g. `ifte`. If I can keep up the pace I should be able to verify that conjecture by the end of June.
The rest of this stuff is junk and/or unfinished material.
### `concat`
How to deal with `concat`?
concat ([.0.] [.1.] -- [.0. .1.])
We would like to represent this in Python somehow...
```python
concat = (S[0], S[1]), ((S[0], S[1]),)
```
But this is actually `cons` with the first argument restricted to be a stack:
([.0.] [.1.] -- [[.0.] .1.])
What we have implemented so far would actually only permit:
([.0.] [.1.] -- [.2.])
```python
concat = (S[0], S[1]), (S[2],)
```
This works but can lose information. Consider `cons concat`; this is how much information we *could* retain:
(1 [.0.] [.1.] -- [1 .0. .1.])
As opposed to just:
(1 [.0.] [.1.] -- [.2.])
### Represent `concat`
([.0.] [.1.] -- [A*(.0.) .1.])
@ -2356,7 +2629,7 @@ concat(a, b)
## Joy in the Logical Paradigm
## Appendix: Joy in the Logical Paradigm
For this to work the type label classes have to be modified to let `T >= t` succeed, where e.g. `T` is `IntJoyType` and `t` is `int`.
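A minimal sketch of what such a modification might look like (hypothetical; the notebook doesn't settle on a final version):

```python
# Hypothetical sketch, assuming NumberJoyType from above is in scope:
# let an int type label subsume concrete Python ints as well as other
# int labels, so unify() can bind a label to a value.
class IntJoyType(NumberJoyType):
    prefix = 'i'
    def __ge__(self, other):
        return isinstance(other, (int, IntJoyType))
```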
@ -2371,85 +2644,85 @@ print doc_from_stack_effect(*F)
ValueError Traceback (most recent call last)
<ipython-input-113-4b4cb6ff86e5> in <module>()
<ipython-input-137-4b4cb6ff86e5> in <module>()
1 F = reduce(C, (pop, swap, roll_down, rest, rest, cons, cons))
2
----> 3 print doc_from_stack_effect(*F)
<ipython-input-101-ddee30dbb1a6> in C(f, g)
<ipython-input-99-ddee30dbb1a6> in C(f, g)
10 def C(f, g):
11 f, g = relabel(f, g)
---> 12 for fg in compose(f, g):
13 yield delabel(fg)
<ipython-input-100-4237a6bb159d> in compose(f, g)
<ipython-input-98-5eb7ac5ad2c2> in compose(f, g)
1 def compose(f, g):
2
----> 3 (f_in, f_out), (g_in, g_out) = f, g
4
5 if not g_in:
----> 2 (f_in, f_out), (g_in, g_out) = f, g
3 s = unify(g_in, f_out)
4 if not s:
5 raise TypeError('Cannot unify %r and %r.' % (f_out, g_in))
<ipython-input-101-ddee30dbb1a6> in C(f, g)
<ipython-input-99-ddee30dbb1a6> in C(f, g)
10 def C(f, g):
11 f, g = relabel(f, g)
---> 12 for fg in compose(f, g):
13 yield delabel(fg)
<ipython-input-100-4237a6bb159d> in compose(f, g)
<ipython-input-98-5eb7ac5ad2c2> in compose(f, g)
1 def compose(f, g):
2
----> 3 (f_in, f_out), (g_in, g_out) = f, g
4
5 if not g_in:
----> 2 (f_in, f_out), (g_in, g_out) = f, g
3 s = unify(g_in, f_out)
4 if not s:
5 raise TypeError('Cannot unify %r and %r.' % (f_out, g_in))
<ipython-input-101-ddee30dbb1a6> in C(f, g)
<ipython-input-99-ddee30dbb1a6> in C(f, g)
10 def C(f, g):
11 f, g = relabel(f, g)
---> 12 for fg in compose(f, g):
13 yield delabel(fg)
<ipython-input-100-4237a6bb159d> in compose(f, g)
<ipython-input-98-5eb7ac5ad2c2> in compose(f, g)
1 def compose(f, g):
2
----> 3 (f_in, f_out), (g_in, g_out) = f, g
4
5 if not g_in:
----> 2 (f_in, f_out), (g_in, g_out) = f, g
3 s = unify(g_in, f_out)
4 if not s:
5 raise TypeError('Cannot unify %r and %r.' % (f_out, g_in))
<ipython-input-101-ddee30dbb1a6> in C(f, g)
<ipython-input-99-ddee30dbb1a6> in C(f, g)
10 def C(f, g):
11 f, g = relabel(f, g)
---> 12 for fg in compose(f, g):
13 yield delabel(fg)
<ipython-input-100-4237a6bb159d> in compose(f, g)
<ipython-input-98-5eb7ac5ad2c2> in compose(f, g)
1 def compose(f, g):
2
----> 3 (f_in, f_out), (g_in, g_out) = f, g
4
5 if not g_in:
----> 2 (f_in, f_out), (g_in, g_out) = f, g
3 s = unify(g_in, f_out)
4 if not s:
5 raise TypeError('Cannot unify %r and %r.' % (f_out, g_in))
<ipython-input-101-ddee30dbb1a6> in C(f, g)
<ipython-input-99-ddee30dbb1a6> in C(f, g)
10 def C(f, g):
11 f, g = relabel(f, g)
---> 12 for fg in compose(f, g):
13 yield delabel(fg)
<ipython-input-100-4237a6bb159d> in compose(f, g)
<ipython-input-98-5eb7ac5ad2c2> in compose(f, g)
1 def compose(f, g):
2
----> 3 (f_in, f_out), (g_in, g_out) = f, g
4
5 if not g_in:
----> 2 (f_in, f_out), (g_in, g_out) = f, g
3 s = unify(g_in, f_out)
4 if not s:
5 raise TypeError('Cannot unify %r and %r.' % (f_out, g_in))
ValueError: need more than 1 value to unpack
@ -2487,28 +2760,7 @@ F[1][0]
s[0]
```
## Typing Combinators
## [Abstract Interpretation](https://en.wikipedia.org/wiki/Abstract_interpretation)
I *think* this might be sorta what I'm doing above with the `kav()` function...
In any event "mixed-mode" interpreters that include values and type variables and can track constraints, etc. will be, uh, super-useful. And Abstract Interpretation should be a rich source of ideas.
TBD
This is an open subject.
The obvious thing is that you now need two pairs of tuples to describe the combinators' effects, a stack effect comment and an expression effect comment:
dip (a [F] --)--(-- F a)
One thing that might help is...
## Abstract Interpretation
## Something else...
[4 5 ...] 2 1 0 pop∘swap∘roll<∘rest∘rest∘cons∘cons
[4 5 ...] 2 1 swap∘roll<∘rest∘rest∘cons∘cons
[4 5 ...] 1 2 roll<∘rest∘rest∘cons∘cons
1 2 [4 5 ...] rest∘rest∘cons∘cons
1 2 [5 ...] rest∘cons∘cons
1 2 [...] cons∘cons
1 [2 ...] cons
[1 2 ...]
Eh?


@ -2,8 +2,12 @@
Type Inference
==============
Pöial's Rules
-------------
Cf. `"Type Inference in Stack-Based Programming
Languages" <http://prl.ccs.neu.edu/blog/2017/03/10/type-inference-in-stack-based-programming-languages/>`__
by Rob Kleffner, 2017-03-10.
Part I: Pöial's Rules
---------------------
`"Typing Tools for Typeless Stack Languages" by Jaanus
Pöial <http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.212.6026>`__
@ -47,8 +51,9 @@ Third Rule
The third rule is actually two rules. These two rules deal with
composing functions when the second one will consume one of the items the
first one produces. The two types must be *unified* or a type conflict
declared.
first one produces. The two types must be
`*unified* <https://en.wikipedia.org/wiki/Robinson's_unification_algorithm>`__
or a type conflict declared.
::
@ -61,9 +66,6 @@ declared.
-------------------------------
(a -- b )∘(c -- d) t[i] == u[k] == u[j]
Examples
--------
Let's work through some examples by hand to develop an intuition for the
algorithm.
@ -387,8 +389,8 @@ following Python code:
(_, (d, (c, ((a, (b, S0)), stack)))) = stack
return (d, (c, S0)), stack
Implementation
--------------
Part II: Implementation
-----------------------
Representing Stack Effect Comments in Python
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@ -618,11 +620,11 @@ Let's try it out.
List Functions
~~~~~~~~~~~~~~
Stack Functions
~~~~~~~~~~~~~~~
Here's that trick to represent functions like ``rest`` and ``cons`` that
manipulate lists. We use a cons-list of tuples and give the tails their
manipulate stacks. We use a cons-list of tuples and give the tails their
own numbers. Then everything above already works.
.. code:: ipython2
@ -683,6 +685,7 @@ Compare with the stack effect comment and you can see it works fine:
::
([4 5 ...] 2 3 1 -- [3 2 ...])
3 4 5 1 2 0 2 1 5
Dealing with ``cons`` and ``uncons``
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@ -719,7 +722,7 @@ deal with this recursively:
def unify(u, v, s=None):
if s is None:
s = {}
else:
elif s:
u = update(s, u)
v = update(s, v)
@ -758,8 +761,8 @@ deal with this recursively:
Compiling
---------
Part III: Compiling Stack Functions
-----------------------------------
Now consider the Python function we would like to derive:
@ -1094,8 +1097,8 @@ from their stack effect comments:
Types and Subtypes of Arguments
-------------------------------
Part IV: Types and Subtypes of Arguments
----------------------------------------
So far we have dealt with types of functions, those dealing with simple
stack manipulation. Let's extend our machinery to deal with types of
@ -1396,7 +1399,7 @@ of how many labels of each domain it has "seen".
def unify(u, v, s=None):
if s is None:
s = {}
else:
elif s:
u = update(s, u)
v = update(s, v)
@ -1719,8 +1722,8 @@ comments. We can write a function to check that:
uncons = ([a1 .1.] -- a1 [.1.])
Functions that use the Stack
----------------------------
Part V: Functions that use the Stack
------------------------------------
Consider the ``stack`` function which grabs the whole stack, quotes it,
and puts it on itself:
@ -1803,32 +1806,17 @@ It works.
This function has to be modified to use the new data structures. It is
no longer recursive; instead, recursion happens as part of unification.
Further, the first and second of Pöial's rules are now handled
automatically by the unification algorithm.
.. code:: ipython2
def compose(f, g):
(f_in, f_out), (g_in, g_out) = f, g
if not g_in:
fg_in, fg_out = f_in, stack_concat(g_out, f_out)
elif not f_out:
fg_in, fg_out = stack_concat(f_in, g_in), g_out
else: # Unify and update.
s = unify(g_in, f_out)
if s == False: # s can also be the empty dict, which is ok.
raise TypeError('Cannot unify %r and %r.' % (fo, gi))
fg_in, fg_out = update(s, (f_in, g_out))
return fg_in, fg_out
stack_concat = lambda q, e: (q[0], stack_concat(q[1], e)) if q else e
s = unify(g_in, f_out)
if s == False: # s can also be the empty dict, which is ok.
raise TypeError('Cannot unify %r and %r.' % (f_out, g_in))
return update(s, (f_in, g_out))
I don't want to rewrite all the defs myself, so I'll write a little
conversion function instead. This is programmer's laziness.
@ -2100,8 +2088,8 @@ comments are now already in the form needed for the Python code:
Multiple Stack Effects
----------------------
Part VI: Multiple Stack Effects
-------------------------------
...
@ -2191,58 +2179,16 @@ Multiple Stack Effects
(n0 -- n1)
``concat``
----------
Representing an Unbounded Sequence of Types
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
How to deal with ``concat``?
::
concat ([.0.] [.1.] -- [.0. .1.])
We would like to represent this in Python somehow...
.. code:: ipython2
concat = (S[0], S[1]), ((S[0], S[1]),)
But this is actually ``cons`` with the first argument restricted to be a
stack:
::
([.0.] [.1.] -- [[.0.] .1.])
What we have implemented so far would actually only permit:
::
([.0.] [.1.] -- [.2.])
.. code:: ipython2
concat = (S[0], S[1]), (S[2],)
This works but can lose information. Consider ``cons concat``; this is
how much information we *could* retain:
::
(1 [.0.] [.1.] -- [1 .0. .1.])
As opposed to just:
::
(1 [.0.] [.1.] -- [.2.])
Brzo...'s Derivatives of Regular Expressions
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
We can invent a new type of type variable, a "sequence type" (I think
this is what they mean in the literature by that term...) or "Kleene
Star" type. I'm going to represent it as a type letter and the asterix,
so a sequence of zero or more ``AnyJoyType`` variables would be:
We can borrow a trick from `Brzozowski's Derivatives of Regular
Expressions <https://en.wikipedia.org/wiki/Brzozowski_derivative>`__ to
invent a new type of type variable, a "sequence type" (I think this is
what they mean in the literature by that term...) or "`Kleene
Star <https://en.wikipedia.org/wiki/Kleene_star>`__" type. I'm going to
represent it as a type letter and the asterisk, so a sequence of zero or
more ``AnyJoyType`` variables would be:
::
@ -2392,7 +2338,7 @@ Can now return multiple results...
sn.update(s)
return t
ses = unify(u[0], v[0])
ses = unify(u[0], v[0], s)
results = ()
for sn in ses:
results += unify(u[1], v[1], sn)
@ -2517,30 +2463,17 @@ Can now return multiple results...
``compose()`` version 3
^^^^^^^^^^^^^^^^^^^^^^^
This function has to be modified to use the new data structures. It is
no longer recursive; instead, recursion happens as part of unification.
This function has to be modified to yield multiple results.
.. code:: ipython2
def compose(f, g):
(f_in, f_out), (g_in, g_out) = f, g
if not g_in:
yield f_in, stack_concat(g_out, f_out)
elif not f_out:
yield stack_concat(f_in, g_in), g_out
else: # Unify and update.
s = unify(g_in, f_out)
if not s:
raise TypeError('Cannot unify %r and %r.' % (fo, gi))
for result in s:
yield update(result, (f_in, g_out))
s = unify(g_in, f_out)
if not s:
raise TypeError('Cannot unify %r and %r.' % (f_out, g_in))
for result in s:
yield update(result, (f_in, g_out))
.. code:: ipython2
@ -2567,8 +2500,8 @@ no longer recursive, instead recursion happens as part of unification.
.. parsed-literal::
(a0 -- f0)
(a0 -- i0)
(f0 -- f1)
(i0 -- i1)
.. code:: ipython2
@ -2665,6 +2598,377 @@ no longer recursive, instead recursion happens as part of unification.
{s2: (a1*, (a3, s1)), a2: a10004, a4: a1}
Part VII: Typing Combinators
----------------------------
TBD
This is an open subject.
The obvious thing is that you now need two pairs of tuples to describe
the combinators' effects, a stack effect comment and an expression
effect comment:
::
dip (a [F] --)--(-- F a)
One thing that might help is...
Consider the type of:
::
[cons] dip
Obviously it would be:
::
(a1 [..1] a2 -- [a1 ..1] a2)
``dip`` itself could have:
::
(a1 [..1] -- ... then what?
.. code:: ipython2
class SymbolJoyType(AnyJoyType): prefix = 'F'
W = map(SymbolJoyType, _R)
k = S[0], ((W[1], S[2]), S[0])
Symbol('cons')
print doc_from_stack_effect(*k)
.. parsed-literal::
(-- [F1 .2.])
.. code:: ipython2
dip_a = ((W[1], S[2]), (A[1], S[0]))
.. code:: ipython2
d = relabel(S[0], dip_a)
print doc_from_stack_effect(*d)
.. parsed-literal::
(-- a1001 [F1001 .1002.])
.. code:: ipython2
s = list(unify(d[1], k[1]))[0]
s
.. parsed-literal::
{s0: (a1001, s1000), s1002: s2, F1001: F1}
.. code:: ipython2
j = update(s, k)
.. code:: ipython2
print doc_from_stack_effect(*j)
.. parsed-literal::
(a1001 -- a1001 [F1 .2.])
.. code:: ipython2
j
.. parsed-literal::
((a1001, s1000), ((F1, s2), (a1001, s1000)))
.. code:: ipython2
cons
.. parsed-literal::
((s1, (a1, s23)), ((a1, s1), s23))
.. code:: ipython2
for f in MC([k], [dup]):
print doc_from_stack_effect(*f)
.. parsed-literal::
(-- [F0 .1.] [F0 .1.])
.. code:: ipython2
l = S[0], ((cons, S[2]), (A[1], S[0]))
.. code:: ipython2
print doc_from_stack_effect(*l)
.. parsed-literal::
(-- a1 [[[[.1.] a1 .23.] [a1 .1.] .23.] .2.])
.. code:: ipython2
def dip_t(F):
(quote, (a1, sec)) = F[1]
G = F[0], sec
P = S[3], (a1, S[3])
a = [P]
while isinstance(quote, tuple):
term, quote = quote
a.append(term)
a.append(G)
return a[::-1]
.. code:: ipython2
from joy.utils.stack import iter_stack
.. code:: ipython2
a, b, c = dip_t(l)
.. code:: ipython2
a
.. parsed-literal::
(s0, s0)
.. code:: ipython2
b
.. parsed-literal::
((s1, (a1, s23)), ((a1, s1), s23))
.. code:: ipython2
c
.. parsed-literal::
(s3, (a1, s3))
.. code:: ipython2
MC([a], [b])
.. parsed-literal::
[((s0, (a0, s1)), ((a0, s0), s1))]
.. code:: ipython2
kjs = MC(MC([a], [b]), [c])
kjs
.. parsed-literal::
[((s0, (a0, s1)), (a1, ((a0, s0), s1)))]
.. code:: ipython2
print doc_from_stack_effect(*kjs[0])
.. parsed-literal::
(a0 [.0.] -- [a0 .0.] a1)
::
(a0 [.0.] -- [a0 .0.] a1)
a0 [.0.] a1 [cons] dip
----------------------------
[a0 .0.] a1
.. code:: ipython2
stack_concat = lambda q, e: (q[0], stack_concat(q[1], e)) if q else e
.. code:: ipython2
class SymbolJoyType(AnyJoyType):
prefix = 'F'
def __init__(self, name, sec, number):
self.name = name
self.stack_effects = sec
self.number = number
class CombinatorJoyType(SymbolJoyType): prefix = 'C'
def dip_t(stack, expression):
(quote, (a1, stack)) = stack
expression = stack_concat(quote, (a1, expression))
return stack, expression
CONS = SymbolJoyType('cons', [cons], 23)
DIP = CombinatorJoyType('dip', [dip_t], 44)
def kav(F, e):
#i, stack = F
if not e:
return [(F, e)]
n, e = e
if isinstance(n, SymbolJoyType):
Fs = []
for sec in n.stack_effects:
Fs.extend(MC([F], sec))
return [kav(Fn, e) for Fn in Fs]
if isinstance(n, CombinatorJoyType):
res = []
for f in n.stack_effects:
s, e = f(F[1], e)
new_F = F[0], s
res.extend(kav(new_F, e))
return res
lit = S[0], (n, S[0])
return [kav(Fn, e) for Fn in MC([F], [lit])]
Compare, and be amazed:
.. code:: ipython2
def dip_t(stack, expression):
(quote, (a1, stack)) = stack
expression = stack_concat(quote, (a1, expression))
return stack, expression
.. code:: ipython2
def dip(stack, expression, dictionary):
(quote, (x, stack)) = stack
expression = (x, expression)
return stack, concat(quote, expression), dictionary
And that brings us to the current work-in-progress. I'm pretty hopeful
that the mixed-mode inferencer/interpreter ``kav()`` function, along with
the ability to specify multiple implementations for the combinators, will
permit modelling of the stack effects of e.g. ``ifte``. If I can keep up
the pace I should be able to verify that conjecture by the end of June.
The rest of this stuff is junk and/or unfinished material.
``concat``
~~~~~~~~~~
How to deal with ``concat``?
::
concat ([.0.] [.1.] -- [.0. .1.])
We would like to represent this in Python somehow...
.. code:: ipython2
concat = (S[0], S[1]), ((S[0], S[1]),)
But this is actually ``cons`` with the first argument restricted to be a
stack:
::
([.0.] [.1.] -- [[.0.] .1.])
What we have implemented so far would actually only permit:
::
([.0.] [.1.] -- [.2.])
.. code:: ipython2
concat = (S[0], S[1]), (S[2],)
This works but can lose information. Consider ``cons concat``; this is
how much information we *could* retain:
::
(1 [.0.] [.1.] -- [1 .0. .1.])
As opposed to just:
::
(1 [.0.] [.1.] -- [.2.])
Represent ``concat``
~~~~~~~~~~~~~~~~~~~~
@ -2731,8 +3035,8 @@ or...
Joy in the Logical Paradigm
---------------------------
Appendix: Joy in the Logical Paradigm
-------------------------------------
For this to work the type label classes have to be modified to let
``T >= t`` succeed, where e.g. ``T`` is ``IntJoyType`` and ``t`` is
@ -2752,85 +3056,85 @@ For this to work the type label classes have to be modified to let
ValueError Traceback (most recent call last)
<ipython-input-113-4b4cb6ff86e5> in <module>()
<ipython-input-137-4b4cb6ff86e5> in <module>()
1 F = reduce(C, (pop, swap, roll_down, rest, rest, cons, cons))
2
----> 3 print doc_from_stack_effect(*F)
<ipython-input-101-ddee30dbb1a6> in C(f, g)
<ipython-input-99-ddee30dbb1a6> in C(f, g)
10 def C(f, g):
11 f, g = relabel(f, g)
---> 12 for fg in compose(f, g):
13 yield delabel(fg)
<ipython-input-100-4237a6bb159d> in compose(f, g)
<ipython-input-98-5eb7ac5ad2c2> in compose(f, g)
1 def compose(f, g):
2
----> 3 (f_in, f_out), (g_in, g_out) = f, g
4
5 if not g_in:
----> 2 (f_in, f_out), (g_in, g_out) = f, g
3 s = unify(g_in, f_out)
4 if not s:
5 raise TypeError('Cannot unify %r and %r.' % (f_out, g_in))
<ipython-input-101-ddee30dbb1a6> in C(f, g)
<ipython-input-99-ddee30dbb1a6> in C(f, g)
10 def C(f, g):
11 f, g = relabel(f, g)
---> 12 for fg in compose(f, g):
13 yield delabel(fg)
<ipython-input-100-4237a6bb159d> in compose(f, g)
<ipython-input-98-5eb7ac5ad2c2> in compose(f, g)
1 def compose(f, g):
2
----> 3 (f_in, f_out), (g_in, g_out) = f, g
4
5 if not g_in:
----> 2 (f_in, f_out), (g_in, g_out) = f, g
3 s = unify(g_in, f_out)
4 if not s:
5 raise TypeError('Cannot unify %r and %r.' % (f_out, g_in))
<ipython-input-101-ddee30dbb1a6> in C(f, g)
<ipython-input-99-ddee30dbb1a6> in C(f, g)
10 def C(f, g):
11 f, g = relabel(f, g)
---> 12 for fg in compose(f, g):
13 yield delabel(fg)
<ipython-input-100-4237a6bb159d> in compose(f, g)
<ipython-input-98-5eb7ac5ad2c2> in compose(f, g)
1 def compose(f, g):
2
----> 3 (f_in, f_out), (g_in, g_out) = f, g
4
5 if not g_in:
----> 2 (f_in, f_out), (g_in, g_out) = f, g
3 s = unify(g_in, f_out)
4 if not s:
5 raise TypeError('Cannot unify %r and %r.' % (f_out, g_in))
<ipython-input-101-ddee30dbb1a6> in C(f, g)
<ipython-input-99-ddee30dbb1a6> in C(f, g)
10 def C(f, g):
11 f, g = relabel(f, g)
---> 12 for fg in compose(f, g):
13 yield delabel(fg)
<ipython-input-100-4237a6bb159d> in compose(f, g)
<ipython-input-98-5eb7ac5ad2c2> in compose(f, g)
1 def compose(f, g):
2
----> 3 (f_in, f_out), (g_in, g_out) = f, g
4
5 if not g_in:
----> 2 (f_in, f_out), (g_in, g_out) = f, g
3 s = unify(g_in, f_out)
4 if not s:
5 raise TypeError('Cannot unify %r and %r.' % (f_out, g_in))
<ipython-input-101-ddee30dbb1a6> in C(f, g)
<ipython-input-99-ddee30dbb1a6> in C(f, g)
10 def C(f, g):
11 f, g = relabel(f, g)
---> 12 for fg in compose(f, g):
13 yield delabel(fg)
<ipython-input-100-4237a6bb159d> in compose(f, g)
<ipython-input-98-5eb7ac5ad2c2> in compose(f, g)
1 def compose(f, g):
2
----> 3 (f_in, f_out), (g_in, g_out) = f, g
4
5 if not g_in:
----> 2 (f_in, f_out), (g_in, g_out) = f, g
3 s = unify(g_in, f_out)
4 if not s:
5 raise TypeError('Cannot unify %r and %r.' % (f_out, g_in))
ValueError: need more than 1 value to unpack
@ -2862,38 +3166,11 @@ For this to work the type label classes have to be modified to let
s[0]
Typing Combinators
------------------
`Abstract Interpretation <https://en.wikipedia.org/wiki/Abstract_interpretation>`__
-----------------------------------------------------------------------------------
TBD
This is an open subject.
The obvious thing is that you now need two pairs of tuples to describe
the combinators' effects, a stack effect comment and an expression
effect comment:
::
dip (a [F] --)--(-- F a)
One thing that might help is...
Abstract Interpretation
-----------------------
Something else...
-----------------
::
[4 5 ...] 2 1 0 pop∘swap∘roll<∘rest∘rest∘cons∘cons
[4 5 ...] 2 1 swap∘roll<∘rest∘rest∘cons∘cons
[4 5 ...] 1 2 roll<∘rest∘rest∘cons∘cons
1 2 [4 5 ...] rest∘rest∘cons∘cons
1 2 [5 ...] rest∘cons∘cons
1 2 [...] cons∘cons
1 [2 ...] cons
[1 2 ...]
Eh?
I *think* this might be sorta what I'm doing above with the ``kav()``
function... In any event "mixed-mode" interpreters that include values
and type variables and can track constraints, etc. will be, uh,
super-useful. And Abstract Interpretation should be a rich source of
ideas.


@ -119,17 +119,15 @@
</ul>
</li>
<li class="toctree-l1"><a class="reference internal" href="Types.html">Type Inference</a><ul>
<li class="toctree-l2"><a class="reference internal" href="Types.html#poial-s-rules">Pöials Rules</a></li>
<li class="toctree-l2"><a class="reference internal" href="Types.html#examples">Examples</a></li>
<li class="toctree-l2"><a class="reference internal" href="Types.html#implementation">Implementation</a></li>
<li class="toctree-l2"><a class="reference internal" href="Types.html#compiling">Compiling</a></li>
<li class="toctree-l2"><a class="reference internal" href="Types.html#types-and-subtypes-of-arguments">Types and Subtypes of Arguments</a></li>
<li class="toctree-l2"><a class="reference internal" href="Types.html#functions-that-use-the-stack">Functions that use the Stack</a></li>
<li class="toctree-l2"><a class="reference internal" href="Types.html#sets-of-stack-effects">Sets of Stack Effects</a></li>
<li class="toctree-l2"><a class="reference internal" href="Types.html#concat"><code class="docutils literal notranslate"><span class="pre">concat</span></code></a></li>
<li class="toctree-l2"><a class="reference internal" href="Types.html#typing-combinators">Typing Combinators</a></li>
<li class="toctree-l2"><a class="reference internal" href="Types.html#part-i-poial-s-rules">Part I: Pöials Rules</a></li>
<li class="toctree-l2"><a class="reference internal" href="Types.html#part-ii-implementation">Part II: Implementation</a></li>
<li class="toctree-l2"><a class="reference internal" href="Types.html#part-iii-compiling-stack-functions">Part III: Compiling Stack Functions</a></li>
<li class="toctree-l2"><a class="reference internal" href="Types.html#part-iv-types-and-subtypes-of-arguments">Part IV: Types and Subtypes of Arguments</a></li>
<li class="toctree-l2"><a class="reference internal" href="Types.html#part-v-functions-that-use-the-stack">Part V: Functions that use the Stack</a></li>
<li class="toctree-l2"><a class="reference internal" href="Types.html#part-vi-multiple-stack-effects">Part VI: Multiple Stack Effects</a></li>
<li class="toctree-l2"><a class="reference internal" href="Types.html#part-vii-typing-combinators">Part VII: Typing Combinators</a></li>
<li class="toctree-l2"><a class="reference internal" href="Types.html#appendix-joy-in-the-logical-paradigm">Appendix: Joy in the Logical Paradigm</a></li>
<li class="toctree-l2"><a class="reference internal" href="Types.html#abstract-interpretation">Abstract Interpretation</a></li>
<li class="toctree-l2"><a class="reference internal" href="Types.html#something-else">Something else…</a></li>
</ul>
</li>
<li class="toctree-l1"><a class="reference internal" href="NoUpdates.html">No Updates</a></li>

File diff suppressed because one or more lines are too long

File diff suppressed because it is too large.

joy/utils/polytypes.py (new file, 331 lines)

@ -0,0 +1,331 @@
'''
Multiple Stack Effects
By adjusting the machinery in types.py to handle lists of stack effect comments
we can capture more information about the type signatures of some functions,
and we can introduce a kind of Kleene Star or sequence type that can stand for
an unbounded sequence of other types.
'''
import sys
sys.path.append('/home/sforman/Desktop/Joypy-hg')
from itertools import product
from joy.utils.types import (
AnyJoyType, A,
C,
DEFS,
doc_from_stack_effect,
FloatJoyType,
NumberJoyType, N,
StackJoyType, S,
stacky,
_R,
relabel, delabel,
update,
)
# We no longer want FloatJoyType to accept IntJoyType.
class IntJoyType(NumberJoyType): prefix = 'i'
a0, a1, a2, a3, a4, a5, a6, a7, a8, a9 = A
n0, n1, n2, n3, n4, n5, n6, n7, n8, n9 = N
s0, s1, s2, s3, s4, s5, s6, s7, s8, s9 = S
f0, f1, f2, f3, f4, f5, f6, f7, f8, f9 = F = map(FloatJoyType, _R)
i0, i1, i2, i3, i4, i5, i6, i7, i8, i9 = I = map(IntJoyType, _R)
class KleeneStar(object):
'''
A sequence of zero or more `AnyJoyType` variables would be:
A*
The `A*` works by splitting the universe into two alternate histories:
A* -> 0
A* -> A A*
The Kleene star variable disappears in one universe, and in the other
it turns into an `AnyJoyType` variable followed by itself again.
We have to return all universes (represented by their substitution
dicts, the "unifiers") that don't lead to type conflicts.
'''
kind = AnyJoyType
def __init__(self, number):
self.number = number
self.count = 0
self.prefix = repr(self)
def __repr__(self):
return '%s%i*' % (self.kind.prefix, self.number)
def another(self):
self.count += 1
return self.kind(10000 * self.number + self.count)
def __eq__(self, other):
return (
isinstance(other, self.__class__)
and other.number == self.number
)
def __ge__(self, other):
return self.kind >= other.kind
def __add__(self, other):
return self.__class__(self.number + other)
__radd__ = __add__
def __hash__(self):
return hash(repr(self))
class AnyStarJoyType(KleeneStar): kind = AnyJoyType
class NumberStarJoyType(KleeneStar): kind = NumberJoyType
#class FloatStarJoyType(KleeneStar): kind = FloatJoyType
#class IntStarJoyType(KleeneStar): kind = IntJoyType
class StackStarJoyType(KleeneStar): kind = StackJoyType
As = map(AnyStarJoyType, _R)
Ns = map(NumberStarJoyType, _R)
Ss = map(StackStarJoyType, _R)
def _lil_uni(u, v, s):
if u >= v:
s[u] = v
return s,
if v >= u:
s[v] = u
return s,
raise TypeError('Cannot unify %r and %r.' % (u, v))
def unify(u, v, s=None):
if s is None:
s = {}
elif s:
u = update(s, u)
v = update(s, v)
if u == v:
return s,
if isinstance(u, AnyJoyType) and isinstance(v, AnyJoyType):
return _lil_uni(u, v, s)
if isinstance(u, tuple) and isinstance(v, tuple):
if len(u) != len(v) != 2:
raise TypeError(repr((u, v)))
a, b = v
if isinstance(a, KleeneStar):
if isinstance(b, KleeneStar):
return _lil_uni(a, b, s)
# Two universes, in one the Kleene star disappears and unification
# continues without it...
s0 = unify(u, b)
# In the other it spawns a new variable.
s1 = unify(u, (a.another(), v))
t = s0 + s1
for sn in t:
sn.update(s)
return t
a, b = u
if isinstance(a, KleeneStar):
s0 = unify(v, b)
s1 = unify(v, (a.another(), u))
t = s0 + s1
for sn in t:
sn.update(s)
return t
ses = unify(u[0], v[0], s)
results = ()
for sn in ses:
results += unify(u[1], v[1], sn)
return results
if isinstance(v, tuple):
if not stacky(u):
raise TypeError('Cannot unify %r and %r.' % (u, v))
s[u] = v
return s,
if isinstance(u, tuple):
if not stacky(v):
raise TypeError('Cannot unify %r and %r.' % (v, u))
s[v] = u
return s,
return ()
def compose(f, g):
(f_in, f_out), (g_in, g_out) = f, g
s = unify(g_in, f_out)
if not s:
raise TypeError('Cannot unify %r and %r.' % (f_out, g_in))
for result in s:
yield update(result, (f_in, g_out))
def C(f, g):
f, g = relabel(f, g)
for fg in compose(f, g):
yield delabel(fg)
def meta_compose(F, G):
for f, g in product(F, G):
try:
for result in C(f, g):
yield result
except TypeError:
pass
def MC(F, G):
res = sorted(set(meta_compose(F, G)))
if not res:
raise TypeError('Cannot unify %r and %r.' % (F, G))
return res
mul = [
((i2, (i1, s0)), (i3, s0)),
((f2, (i1, s0)), (f3, s0)),
((i2, (f1, s0)), (f3, s0)),
((f2, (f1, s0)), (f3, s0)),
]
dup = [DEFS['dup']]
cons = [DEFS['cons']]
sum_ = [(((Ns[1], s1), s0), (n0, s0))]
f = [(s0, ((n1, (n2, (n3, s1))), s0))]
print doc_from_stack_effect(*f[0]), doc_from_stack_effect(*sum_[0])
print '.......................'
##for result in unify(sum_[0][0], f[0][1]):
## print result, '->', update(result, sum_[0][1])
for g in MC(f, sum_):
print doc_from_stack_effect(*g)
print
print '.......................'
print
for g in MC(dup, mul):
print doc_from_stack_effect(*g)
##stack_concat = lambda q, e: (q[0], stack_concat(q[1], e)) if isinstance(q, tuple) else e
class FunctionJoyType(AnyJoyType):
def __init__(self, name, sec, number):
self.name = name
self.stack_effects = sec
self.number = number
def __add__(self, other):
return self
__radd__ = __add__
class SymbolJoyType(FunctionJoyType): prefix = 'F'
class CombinatorJoyType(FunctionJoyType): prefix = 'C'
from joy.library import dip, dipd
##def dip_t(stack, expression):
## (quote, (a1, stack)) = stack
## expression = stack_concat(quote, (a1, expression))
## return stack, expression
CONS = SymbolJoyType('cons', cons, 23)
DIP = CombinatorJoyType('dip', [dip], 44)
DIPD = CombinatorJoyType('dipd', [dipd], 45)
from itertools import chain
flatten = lambda g: list(chain.from_iterable(g))
def kav(F, e):
if not e:
return [F]
n, e = e
if isinstance(n, SymbolJoyType):
Fs = MC([F], n.stack_effects)
res = flatten(kav(Fn, e) for Fn in Fs)
elif isinstance(n, CombinatorJoyType):
res = []
for combinator in n.stack_effects:
fi, fo = F
new_fo, ee, _ = combinator(fo, e, {})
new_F = fi, new_fo
res.extend(kav(new_F, ee))
else:
lit = s9, (n, s9)
res = flatten(kav(Fn, e) for Fn in MC([F], [lit]))
return res
##l = [(s0, ((CONS, s2), (A[1], s0)))]
##
##e = (DIP, ())
##
##h = kav(l[0], e)
##
##for z in h:
## print doc_from_stack_effect(*z)
ID = s0, s0
expression = (a1, ((CONS, s0), (DIP, ())))
for sec in kav(ID, expression):
print doc_from_stack_effect(*sec)
expression = (a1, (a3, ((CONS, s0), (DIPD, ()))))
for sec in kav(ID, expression):
print doc_from_stack_effect(*sec)


@ -34,7 +34,12 @@ class NumberJoyType(AnyJoyType): prefix = 'n'
class FloatJoyType(NumberJoyType): prefix = 'f'
class IntJoyType(FloatJoyType): prefix = 'i'
class StackJoyType(AnyJoyType): prefix = 's'
class StackJoyType(AnyJoyType):
prefix = 's'
def __nonzero__(self):
# Imitate () at the end of cons list.
return False
_R = range(10)
A = a0, a1, a2, a3, a4, a5, a6, a7, a8, a9 = map(AnyJoyType, _R)
@ -69,8 +74,12 @@ def delabel(f, seen=None, c=None):
pass
if not isinstance(f, tuple):
seen[f] = f.__class__(c[f.prefix])
c[f.prefix] += 1
try:
seen[f] = f.__class__(c[f.prefix])
except TypeError: # FunctionJoyTypes break this.
seen[f] = f
else:
c[f.prefix] += 1
return seen[f]
return tuple(delabel(inner, seen, c) for inner in f)
@ -97,7 +106,7 @@ def unify(u, v, s=None):
if v >= u:
s[v] = u
return s
raise ValueError('Cannot unify %r and %r.' % (u, v))
raise TypeError('Cannot unify %r and %r.' % (u, v))
if isinstance(u, tuple) and isinstance(v, tuple):
if len(u) != len(v) != 2:
@ -110,13 +119,13 @@ def unify(u, v, s=None):
if isinstance(v, tuple):
if not stacky(u):
raise ValueError('Cannot unify %r and %r.' % (u, v))
raise TypeError('Cannot unify %r and %r.' % (u, v))
s[u] = v
return s
if isinstance(u, tuple):
if not stacky(v):
raise ValueError('Cannot unify %r and %r.' % (v, u))
raise TypeError('Cannot unify %r and %r.' % (v, u))
s[v] = u
return s
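(Unification failures are now signalled with `TypeError` throughout, matching `polytypes.py` above, where `meta_compose()` filters out incompatible pairings with a plain `except TypeError`.)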