So, Guido has brought up the idea of optional static typing again, posting his current thoughts on the idea, as well as noting what he sees as the problem areas.

His favoured syntax is:

def f(a: sometype, b: sometype) -> sometype:
  pass

Bleh. The main reason I dislike the current version of anonymous functions is that they embed a colon in the middle of an expression, so you can probably guess how I feel about Guido's reuse of the colon here. And I've always quite liked VB's approach of using 'as' to indicate the return type of a function. Anyway, syntax aside, there's a question of what the optional type declarations are actually for.

Now one point to make at the start is that optional typing (checked by the VM) and optional static typing (checked by the compiler) are different things, and it makes some sense to do the former before doing the latter. Once you have a syntax for optional typing, making it static is merely a question of figuring out how to get the compiler to do the type check, instead of the VM. This activity would then blend in with Python's general issue of "how can we move things to the compiler to save run-time activity, without losing too much dynamism?"

Setting the static idea aside for the moment, there's the fundamental question of what a type declaration actually means. Python has historically relied on an approach that says "if it defines the right methods, it's OK by me". This is great for flexibility and code reuse, but plays merry hell with type inferencing systems, and can lead to some exceedingly cryptic error messages when you pass a type that doesn't provide the correct methods (or provides methods with the right names, but the wrong signatures, etc, etc).
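To make that concrete, here's a small invented illustration of the kind of cryptic failure duck typing can produce: nothing checks the argument types up front, so the error surfaces deep inside the callee, far from the caller that supplied the bad value.

```python
# Invented example: the failure appears inside the callee,
# not at the call site where the mistake was actually made.
def total_length(items):
    # Quietly assumes every element supports len()
    return sum(len(item) for item in items)

total_length(["ab", "cde"])   # returns 5

try:
    total_length(["ab", 42])  # 42 doesn't support len()
except TypeError as ex:
    # The traceback blames the generator inside total_length,
    # not the caller that supplied the bad value
    print(ex)
```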

Stealing an example I like:

def int_divide(x as Integer, y as Integer) as Integer:
  return x / y

We don't really want x as Integer (or x: Integer in Guido's syntax) to mean isinstance(x, Integer), do we? After all, we'd like this function to work for builtin types, and Python's builtin types won't know anything about this interface we have created. It would be far nicer if the optional typing was just a way of formalising the 'duck typing' that Python currently relies on.

So let's consider something like the interfaces from PJE's PEAK, or Eiffel's idea of conformance (PEP 246, basically). In this case, we have a builtin method adapt() to adapt a given object to a given protocol. I'd suggest the meaning of the example should become:

def int_divide(x, y):
  x = adapt(x, Integer)
  y = adapt(y, Integer)
  return adapt(x / y, Integer)

Objects participate in this scheme as interfaces by defining __adapt__ special methods, and as adaptable objects by defining __conform__ special methods. That way, interfaces and types can be written in any order, and still play well together. For instance, the existing 'adaptation methods' understood by the builtin objects' constructors (i.e. __int__, __str__ and friends) could be incorporated into the system by having the __adapt__ methods of the relevant interfaces invoke the appropriate constructor - if the constructor throws an exception, the __adapt__ method converts it to the appropriate adaptation exception.
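As a rough sketch (not the exact PEP 246 semantics, which also cover registries and the precise error handling), adapt() might look something like this; the Integer interface below is invented for illustration:

```python
class AdaptationError(TypeError):
    """Raised when an object cannot be adapted to a protocol."""

def adapt(obj, protocol):
    # Simplified sketch of PEP 246 adaptation
    # 1. Exact instances need no adaptation
    if isinstance(protocol, type) and type(obj) is protocol:
        return obj
    # 2. Ask the object to conform to the protocol
    conform = getattr(type(obj), '__conform__', None)
    if conform is not None:
        result = conform(obj, protocol)
        if result is not None:
            return result
    # 3. Ask the protocol to adapt the object
    #    (which may raise AdaptationError itself)
    adapter = getattr(protocol, '__adapt__', None)
    if adapter is not None:
        return adapter(obj)
    raise AdaptationError("cannot adapt %r to %r" % (obj, protocol))

# An invented Integer interface that wraps the int() constructor,
# converting constructor failures into AdaptationError
class Integer(object):
    @staticmethod
    def __adapt__(obj):
        try:
            return int(obj)
        except (TypeError, ValueError) as ex:
            raise AdaptationError(str(ex))

adapt("42", Integer)  # returns 42
```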

As mentioned in PEP 246, it would also be possible to have an 'adaptation registry' which mapped from (type, interface) tuples to adaptation methods. While this doesn't really matter to the basic idea of adaptation, it's handy for people trying to integrate code which provides the right interface, but doesn't actually provide the relevant adaptation information (e.g. if it provides a read() method, and the function uses an interface which expects that method).
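Such a registry might be sketched as follows; all of the names here are invented, since PEP 246 only outlines the idea:

```python
# Hypothetical adaptation registry: maps (type, interface) pairs to
# adapter callables, for third-party types we can't modify.
_adapters = {}

def register_adapter(typ, interface, adapter):
    _adapters[(typ, interface)] = adapter

def adapt_via_registry(obj, interface):
    # Walk the MRO so subclasses pick up adapters registered
    # for their parent classes
    for typ in type(obj).__mro__:
        adapter = _adapters.get((typ, interface))
        if adapter is not None:
            return adapter(obj)
    raise TypeError("no registered adapter for %r -> %r"
                    % (type(obj), interface))

# Example: a legacy class provides recv() where our code wants read()
class Readable(object):
    """Marker interface for objects providing read()."""

class LegacySocket(object):
    def recv(self, n):
        return b"data"

class ReadableWrapper(object):
    # Adapts a recv()-style object to the read() interface
    def __init__(self, sock):
        self._sock = sock
    def read(self, n=-1):
        return self._sock.recv(n)

register_adapter(LegacySocket, Readable, ReadableWrapper)

reader = adapt_via_registry(LegacySocket(), Readable)
reader.read(4)  # returns b'data'
```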

For containers, it would make sense to have the interfaces be parameterisable (e.g. List(Integer), List(Number), List(int, long) or List() - that last example meaning, "allow a List with any types", since a list which allowed no types wouldn't be very useful). This suggests the concept of interface factories - classes whose instances are themselves interfaces.

For example (assume AdaptationError is a subclass of TypeError that is thrown when an adaptation fails):

class List(object):
  def __init__(self, *args):
    self._allowed_interfaces = args

  def __adapt__(self, obj):
    try:
      lst = list(obj)
    except Exception as ex:
      raise AdaptationError(str(ex))
    interfaces = self._allowed_interfaces
    if interfaces:
      for i, x in enumerate(lst):
        # for/else: the else clause only runs if no
        # interface accepted the element
        for interface in interfaces:
          try:
            lst[i] = adapt(x, interface)
          except AdaptationError:
            continue
          else:
            break
        else:
          raise AdaptationError("List element %s does not "
                 "support any allowed interface" % str(x))
    return lst

  def __eq__(self, other):
    return (isinstance(other, type(self)) and
     (self._allowed_interfaces == other._allowed_interfaces))

OK, so PEP 246 combined with syntactic support would give a cleaner mechanism for dynamic type checking. However, it would still be nice to have some sort of static checking for optimisation purposes (if the compiler knows the types at compilation time, it can do all the operator lookups and so forth then, instead of waiting to do the lookups at runtime).

Well, how about a slightly different pair of special methods: __adapt_strict__ and __conform_strict__. Strict adaptation guarantees that its result is an actual instance of the interface. If an interface defines __adapt_strict__ without defining __adapt__, then Python can be certain that adapting to that interface always yields an instance of that interface.

For example, the builtin types might provide __adapt_strict__ methods, allowing them to be used as interfaces which guaranteed that the result was an instance of the builtin type:

  isinstance(adapt(x, int), int) # Always true
  isinstance(adapt(x, Integer), Integer) # Likely false
  isinstance(adapt(x, Integer), int) # Only possibly true

This can give us static typing, as long as the compiler can check for the existence of __adapt__ and __adapt_strict__ on the supplied interface (e.g. by assuming the names of builtins actually refer to the builtins). Here are some hypothetical implementations of __adapt_strict__ for the builtins object and list:

class object:
  # The rest of object's definition is as normal
  # Naturally this would really be implemented in C. . .
  # Use a class method so any new-style class
  # can automatically be used for strict adaptation
  @classmethod
  def __adapt_strict__(cls, obj):
    if isinstance(obj, cls):
      return obj
    try:
      result = cls(obj)
    except Exception as ex:
      raise AdaptationError(str(ex))
    return result

class list(object):
  # The rest of list's definition is as normal
  # Naturally this would really be implemented in C. . .
  # Uses an instance method, so we use self
  # to store the list of allowed interfaces
  def __adapt_strict__(self, obj):
    try:
      lst = list(obj)
    except Exception as ex:
      raise AdaptationError(str(ex))
    if self:
      for i, x in enumerate(lst):
        # for/else: the else clause only runs if no
        # interface accepted the element
        for interface in self:
          try:
            lst[i] = adapt(x, interface)
          except AdaptationError:
            continue
          else:
            break
        else:
          raise AdaptationError("List element %s does not "
                 "support any allowed interface" % str(x))
    return lst

With strict adaptation available, our earlier example of non-strict list adaptation would change to be:

class List(list):
  def __adapt__(self, obj):
    return self.__adapt_strict__(obj)

The rules for adaptation would change slightly from those suggested in the PEP:

  1. If the object is an exact instance of the interface, return it
  2. Try the object's __conform_strict__ method, if it has one. If that works, return the result.
  3. If the interface allows non-strict adaptation (it defines __adapt__), then try the object's __conform__ method, if it has one. If that works, return the result.
  4. Try the interface's __adapt_strict__ method, if it has one. If that works, return the result.
  5. Try the interface's __adapt__ method, if it has one. If that works, return the result.

The new additions are steps 2 & 4, and step 3 has been modified so that it is only tried if the interface implements __adapt__. The idea of restricting step 1 to exact instances is taken from the PEP - it allows a subclass to say "I don't implement my parent's interface" by throwing an exception in its __conform__ method. The check for instances of subclasses has been moved to the implementation of object.__adapt_strict__. This allows interfaces to decide how they deal with subclasses.
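The modified lookup order could be sketched as follows, treating a failed step as "try the next one" (the exact failure-propagation rules would need to be pinned down; the StrictInt interface below is invented for illustration):

```python
class AdaptationError(TypeError):
    """Raised when adaptation fails."""

def adapt(obj, interface):
    # Sketch of the modified five-step lookup
    # 1. Exact instances are returned untouched
    if isinstance(interface, type) and type(obj) is interface:
        return obj
    # 2. Try the object's __conform_strict__ method
    conform_strict = getattr(type(obj), '__conform_strict__', None)
    if conform_strict is not None:
        try:
            return conform_strict(obj, interface)
        except AdaptationError:
            pass
    # 3. __conform__ only counts if the interface allows
    #    non-strict adaptation, i.e. it defines __adapt__
    if hasattr(interface, '__adapt__'):
        conform = getattr(type(obj), '__conform__', None)
        if conform is not None:
            try:
                return conform(obj, interface)
            except AdaptationError:
                pass
    # 4. Try the interface's __adapt_strict__ method
    adapt_strict = getattr(interface, '__adapt_strict__', None)
    if adapt_strict is not None:
        try:
            return adapt_strict(obj)
        except AdaptationError:
            pass
    # 5. Finally, try the interface's __adapt__ method
    adapt_loose = getattr(interface, '__adapt__', None)
    if adapt_loose is not None:
        return adapt_loose(obj)
    raise AdaptationError("cannot adapt %r to %r" % (obj, interface))

# An invented strict-only interface: defines __adapt_strict__
# but not __adapt__, so step 3 is skipped for it
class StrictInt(object):
    @staticmethod
    def __adapt_strict__(obj):
        try:
            return int(obj)
        except (TypeError, ValueError) as ex:
            raise AdaptationError(str(ex))

adapt("7", StrictInt)  # returns 7, via step 4
adapt(5, int)          # returns 5, via step 1
```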