Date: Mon Jan 17 16:36:32 1994
Subject: Re: Type Definition: A style question?
From: Robin Popplestone  
Volume-ID:940117.04 

leander@dos-lan.cs.up.ac.za writes:

> In another  vein,  I had  been  wondering about  the  insistence  that
> functional languages be capable  of inferring each  and every type  of
> every  subexpression.  One  gets   the  idea  that  insisting   that a
> type-system  must   allow  automatic   inference  might   place   some
> restrictions on it. Maybe, if we allow types that need user assistance
> in specification, we might get more powerful type systems. I'm no
> expert on this field, so comments would be appreciated.

Hindley-Milner inference depends on -unification- which is readily possible
on  word-algebras,  but  becomes  problematic  as  soon  as  any  laws  are
introduced, where  it is  at least  as  hard as  the -word  problem-,  i.e.
undecidable on some algebras.
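To make the unification step concrete, here is a minimal sketch in Python rather than POP-11 (the term encoding and names such as `unify` are illustrative, not from any particular implementation): terms of a free word-algebra are nested tuples, with bare strings standing for type variables.

```python
def walk(t, subst):
    """Chase variable bindings until we reach a non-variable or a free variable."""
    while isinstance(t, str) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    """Occurs check: does variable v appear inside term t?"""
    t = walk(t, subst)
    if t == v:
        return True
    if isinstance(t, tuple):
        return any(occurs(v, a, subst) for a in t[1:])
    return False

def bind(v, t, subst):
    if occurs(v, t, subst):        # reject circular types such as a = List(a)
        return None
    s = dict(subst)
    s[v] = t
    return s

def unify(t1, t2, subst=None):
    """Syntactic unification over a free algebra.
    Variables are strings; constructors are tuples (functor, *args),
    nullary ones written as 1-tuples like ('Int',).
    Returns a substitution dict, or None on failure."""
    if subst is None:
        subst = {}
    t1, t2 = walk(t1, subst), walk(t2, subst)
    if t1 == t2:
        return subst
    if isinstance(t1, str):
        return bind(t1, t2, subst)
    if isinstance(t2, str):
        return bind(t2, t1, subst)
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1) == len(t2)):
        for a, b in zip(t1[1:], t2[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None
```

As soon as the algebra carries laws (so that distinct terms may denote equal types), this purely structural recursion no longer suffices, which is the difficulty noted above.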

The most obvious circumstance  in which a  richer descriptive apparatus  is
needed is in describing -sequences-. Now we already have a well-developed
apparatus for specifying sets of sequences, namely formal grammars. Most
type theories are weak in  their characterisation of any representation  of
sequences, e.g. Hindley-Milner only supports characterising lists as  being
homogeneous,  which  is  far  weaker  even  than  regular  grammars.   This
discrepancy has  of course  occurred to  various people  - there  was  some
discussion on the compilers news-group about it. Of course, the problem  is
the complexity of any computation involving grammars. E.g. "do two grammars
specify the same language?", which in a type-theory would come down to "are
these types equal?", is decidable for regular grammars but not much else. So
mostly people have looked  at dynamic type-checking,  where parsing is  all
that has to  be done. However  we all  know the advantages  of static  type
checking, provided it does not unduly bridle our freedom of expression.
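The decidability claim for regular grammars can be shown directly once the grammars are compiled to deterministic automata: two DFAs accept the same language iff no reachable state pair of their product disagrees on acceptance. A Python sketch (the DFA encoding and the name `dfa_equivalent` are mine):

```python
from collections import deque

def dfa_equivalent(d1, d2, alphabet):
    """Decide whether two DFAs accept the same language.

    A DFA is a triple (start, accepting_states, delta), where delta is a
    total dict mapping (state, symbol) -> state. We breadth-first search
    the product automaton; the languages differ iff some reachable pair
    of states disagrees on acceptance."""
    (s1, acc1, t1), (s2, acc2, t2) = d1, d2
    seen = {(s1, s2)}
    queue = deque([(s1, s2)])
    while queue:
        p, q = queue.popleft()
        if (p in acc1) != (q in acc2):
            return False              # a distinguishing word exists
        for a in alphabet:
            nxt = (t1[(p, a)], t2[(q, a)])
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True
```

For context-free grammars no such procedure exists, which is why "are these types equal?" becomes the sticking point for richer grammar-based types.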

But sequences -are- vitally important  in Computer Science. Data is  stored
sequentially in memory, and issues forth sequentially down channels. One of
the amusements of C programmers  is effectively to reparse  data-structures
with a new type-signature. Now just because they often fall over and  graze
their knees or split their lips, doesn't mean that what they are trying  to
do is always inappropriate.
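Python's struct module gives a tame analogue of that C trick, for readers who want to see it without the grazed knees (a sketch of my own, not anything from the original post): the same bytes read back under a different signature.

```python
import struct

# Pack two IEEE-754 single-precision floats into 8 raw bytes, then
# "reparse" the same bytes under a different type-signature, much as a
# C programmer would via a pointer cast.
raw = struct.pack('<2f', 1.0, 2.0)
ints = struct.unpack('<2I', raw)      # same 8 bytes, read as unsigned ints
print([hex(i) for i in ints])         # ['0x3f800000', '0x40000000']
```

Unlike the C version, a size mismatch here raises struct.error instead of silently corrupting memory, which is roughly the safety a grammar-based type for raw data would aim to provide statically.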

I have  recently  had a  go  at defining  a  static type  system  based  on
grammars. Thus I can write:

    List({Int*String})

meaning a  list in  which  integers and  strings  alternate -  {...}  means
"Kleene  Closure".  Pragmatically,  my   type-checker  works  for   POP-11,
requiring  type-signatures  for  quite  a  proportion  of  functions,   but
inferring types of locals. Since POP is a higher-order language, the theory
is hard, and I  have made a  limited amount of  progress in showing  that a
sound type system of this kind is possible. What is clear is that the  more
descriptive the type-system, the stronger  the restrictions on updating  of
data-structures must be if a type-system  is to be sound. So that  ML-style
references are likely to be the acceptable face of mutability (to those who
can bear to look upon any aspect of this beast). This work will appear as
two Glasgow DCS reports.
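A dynamic reading of the List({Int*String}) type can be sketched in Python by tagging each element with a letter for its runtime type and parsing the tag string against the regular language (IS)* (the names here are illustrative, not from the POP-11 checker):

```python
import re

# Dynamic check for the grammar-type List({Int*String}): the sequence of
# element types must lie in (IS)*, the Kleene closure of Int followed
# by String.
TAGS = {int: 'I', str: 'S'}

def is_int_string_list(xs):
    try:
        tags = ''.join(TAGS[type(x)] for x in xs)
    except KeyError:                  # an element of some other type
        return False
    return re.fullmatch(r'(IS)*', tags) is not None
```

This is exactly the "parsing is all that has to be done" style of checking mentioned above; the static version must establish the same property without running the program.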

[Incidentally - one introduces Hd and Tl as -type functions- so that the
type-signature for the list-function hd is

   hd: All a; List(a) -> Hd(a)

these functions, and indeed the acquisition of arguments by functions,
require the concept of the quotient of languages, well known to theorists.
]

Robin Popplestone.