Date: Mon May 23 19:06:05 1997
Subject: Re: OOP and The future of POP-11
From: Jonathan Cunningham
Volume-ID: 970523.01

In article <5lkq1b$jii@percy.cs.bham.ac.uk>,
AaronSloman@nospam.com (Aaron Sloman, see text for reply address) wrote:

>[RM]
>> >(though it is well suited to AI work)
>
>I know very little about Java, but I find that a very strange
>statement. What sort of AI work? What exactly makes Java good for
>that?
>
> From the little I know about it I guess there are two possible
>answers (a) garbage collection so that people can easily build
>tangled networks without everyone having to re-invent safe
>algorithms for deciding what can be returned to free store and
>(b) incremental compilation, which can make exploratory design
>development and debugging very much faster.

IMO, the feature that makes it suitable for AI work is that it is
a general purpose programming language. The problems with
using, say, C++ for AI work are problems with that language for
any programming, IMO.

>Does it also support user definable macros which, when they are
>expanded, can use arbitrary user defined procedures in the expansion
>process (as in Lisp and Pop-11), or anything like Pop-11's syntax
>words?

Robin Popplestone makes a similar point to yours about macros.

IMO, macros are unnecessary. They have two main uses: syntax
extensions (see below) and to save typing :-). In the latter category,
I include macros which could be replaced by function calls, but where
macros have been used for efficiency.

Such macros should be replaced by function calls. We can assume
that the compiler is smart enough to expand them inline, which it
probably is; if not, then an "inline" keyword is not a big problem. I
really dislike the usual implementations of macros which change the
meaning of a program depending on which macros have been loaded.

Other "save typing" macros need not be used: instead use a smart
editor! Your code will be more readable.
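
To put that in Java terms (a sketch only - the class and method names
here are mine, not from any library): the kind of thing a C programmer
writes as a macro, e.g. #define SQUARE(x) ((x)*(x)), is just an
ordinary method in Java, and a JIT compiler is free to inline it.

```java
// Sketch: a "save typing" macro replaced by an ordinary method.
// The class name is illustrative.
final class MacroFree {
    // Does the job of #define SQUARE(x) ((x)*(x)), but with the
    // semantics of a function call: the argument is evaluated once.
    static int square(int x) {
        return x * x;
    }
}
```

Unlike the macro, MacroFree.square(i++) evaluates its argument exactly
once, so the meaning of the program no longer depends on which macros
happen to have been loaded.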

Some of this is personal preference, and I have no real objection to
macros: I'm merely arguing that they are not really necessary.

>Over and over again AI language users have found that it is useful
>to introduce powerful new syntax forms to support forms of
>programming that are appropriate to different sorts of tasks
>(thereby making development and maintenance much easier).
>
>Examples are rule-based extensions, pattern matchers, the addition
>of object-oriented programming constructs, the addition of new
>syntactic forms for natural language processing, syntax for planning
>languages, etc. Some of this can be difficult to do using simple
>forms of macro expansion - e.g. expanding code to make use of a
>closure of a previously compiled procedure.

I think these fall into two types: (a) genuinely new language constructs
reflecting genuinely new programming concepts, (b) syntactic sugar.

As with macros, I would argue that syntactic sugar is nice but
unnecessary, and may even be bad for your teeth.

Without knowing the particular extensions you have in mind, I
can't comment on all the above. However, you wouldn't need to
use closures with an OOP style of programming. Like a lot of
programming constructs, it depends what you are used to using. But
closures are (were?) a good way to combine procedures and data.
That's what objects are for - combining procedures and data. Here's
a very simple example. It doesn't cover all cases of use of closure, but
illustrates just one kind of use.

    define adder(m, n) -> sum; m + n -> sum; enddefine;
    vars plus3 = adder(% 3 %);
    plus3(2) =>        ;;; prints ** 5

The equivalent using objects (in pseudo-code - I can't remember
the poplog objectclass syntax well enough to get it right):

    class adder;
       vars n;
       method adder(i); i->n; endmethod;
       method apply(m) -> sum; m+n -> sum; endmethod;
    endclass;
    vars plus3 = adder(3);
    plus3.apply(2)=>

Ok, it's more verbose. I can live with that. It's also more general, and
doesn't require any new syntax or programming constructs. (I hope
you can guess what the pseudo-code is supposed to do.)
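For comparison, here is the same idea in Java rather than pseudo-code
(the class name and the use of "apply" are my choices, not anything
standard):

```java
// A sketch of the adder example as a Java class.  The instance
// variable plays the role of the value frozen into the closure
// by adder(% 3 %) in the Pop-11 version.
class Adder {
    private final int n;              // the "closed over" value

    Adder(int i) {                    // like the adder method above
        n = i;
    }

    int apply(int m) {                // the body of the closure
        return m + n;
    }
}
```

So new Adder(3).apply(2) returns 5, playing the role of plus3(2)
above: the object combines the procedure and the data, with no new
syntax needed.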

A lot of interesting programming ideas have come out of research into
language design conducted by AI researchers. A lot of this has been
achieved by writing interpreters for new languages. Or language
pre-processors that translate into an existing language. But how much
of it was *necessary* for the AI research? And how much was diverted
effort, away from the AI research? (I'm not suggesting that the diverted
effort was wasted - far from it.)

However, I think your point is a good one. To use Java for some AI
applications I think we would certainly want libraries that made
writing pre-processors easy. It wouldn't be difficult to write
such libraries, but we would also want support for them in the
development environment.

>(With an incremental compiler you can, I presume, partly achieve
>this by having a command to compile a file of text after passing it
>through a pre-processor, which can be defined to use arbitrary
>previously compiled procedures.)

One of the features that makes Java more suitable than C++ is the separate
compilability of "compilation units" (i.e. class definitions), which would
make the above strategy relatively easy.

>[JLC]
>> For real-time control work, I would still expect to use C, at present, not
>> C++. For a variety of reasons. However, there is no intrinsic reason why
>> garbage collection should preclude this kind of use. Asynchronous
>> incremental GC algorithms have been known for years. But I don't
>> expect them to be implemented for Java;
>
>Why not? If there's a demand, and the market is large enough, they
>will be implemented!

Something I didn't know when I wrote my original post: the GC in Java
systems runs continuously in a separate thread. So on any system with
pre-emptive multi-tasking of threads, it would never be necessary
for the system to halt and do a GC. (I.e. they already do have asynchronous
GC algorithms.)

>[RM]
>> >Garbage collection is about the only thing Java actually buys you over c++
>> >from an AI point of view and you can get GC in C++ with a library so its no
>> >big deal.
>
>What about incremental compilation?

You can buy incremental compilers for C++, if you want to spend the
money. A problem with C++, IMO, is the necessity to load *huge*
sets of header files. Pre-compiled headers are almost a necessity, and
compilation and linking are still relatively slow. This is certainly one
of the argued advantages of Java (i.e. dynamic run-time link loading of
new procedures).

>[JLC]
>> Or smart pointers (what I use). But this is a small point. The reason for
>> preferring Java is that C++ is ugly. Much more important ;-). A language
>> which lets you pass arguments as pointers *or* references? That's only for
>> backwards compatibility <Shudder>.
>
>Is this like Algol68's distinction between an INT and a REF INT ? I
>guess it's a useful facility as it makes it unnecessary to create an
>explicit REF data structure when you want to do this (e.g. to enable
>different procedure invocations to share an updatable variable).

Yes.

The Java model is the same as pop11's model. I.e. a variable which
contains an integer (or a float, or a boolean, etc.) stores the value
directly; for anything else the variable stores a pointer to the value,
except that you never need to know about the pointers.
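
Concretely, in Java (a minimal sketch, class name mine):

```java
// Primitives are copied; everything else is shared by reference.
class ValueVsReference {
    static boolean demo() {
        int a = 1;
        int b = a;        // b gets a copy of the value
        b = 2;            // a is unaffected
        int[] p = {1};
        int[] q = p;      // q gets a copy of the *reference*
        q[0] = 2;         // visible through p: one array, two names
        return a == 1 && p[0] == 2;
    }
}
```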

For example,

   define mess_with_list(mylist);
         "a" -> mylist(2);
   enddefine;
   vars list1 = [a b c];
   vars list2 = [d e f];
   list2 -> list1;
   mess_with_list(list1);
   list2 =>

What do you expect to print?

If you translated the above into Java in the most direct manner, you would
get exactly the same behaviour (as for pop11). Except that you would
probably have to write something like:

    Pair list1 = new Pair("a", new Pair("b", new Pair("c", null)));

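To make that concrete, here is a hypothetical Pair class (my names,
not anything from the Java library) that reproduces the Pop-11
behaviour above:

```java
// A hypothetical Pair class.  Indexing is 1-based, like mylist(2)
// in the Pop-11 version.
class Pair {
    Object head;
    Pair tail;

    Pair(Object h, Pair t) { head = h; tail = t; }

    void set(int i, Object v) {          // does the job of "a" -> mylist(2)
        Pair p = this;
        while (i > 1) { p = p.tail; i--; }
        p.head = v;
    }

    Object get(int i) {
        Pair p = this;
        while (i > 1) { p = p.tail; i--; }
        return p.head;
    }
}

class ListDemo {
    static void messWithList(Pair mylist) {
        mylist.set(2, "a");
    }

    public static void main(String[] args) {
        Pair list1 = new Pair("a", new Pair("b", new Pair("c", null)));
        Pair list2 = new Pair("d", new Pair("e", new Pair("f", null)));
        list1 = list2;            // copies the reference, not the list
        messWithList(list1);
        // Both variables name the same chain of pairs, so list2
        // is now (d a f), exactly as in the Pop-11 version.
        System.out.println(list2.get(2));   // prints a
    }
}
```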
HERE FOLLOWS A SERIOUS SUGGESTION: I think poplog should implement
a superset of Java as one of its language compilers. The superset should
provide syntactic sugar for things like lists, but should compile standard
Java programs correctly. I.e. a normal Java program should compile and
run correctly without changes.

>[JLC]
>> Oh, I don't believe in the NC,
>
>Why not? My guess is that until NC's (glorified X terminals

Just a hunch. I'd be happy to expound at length over a beer.

>[JLC]
>> Aaargh!! No wonder you said programming was not being taught properly: if
>> you aren't teaching OOP from day one, you are making life *much* harder.
>> Do you also teach algebra using Roman numerals before introducing
>> the decimal system?
>
>Here I agree more with Richard.
>
>Maybe Jonathan has forgotten what it is like to teach people with
>very little mathematical or scientific expertise, with NO
>programming experience, who are taking an AI course out of
>intellectual interest (it's the best form of liberal education for
>the next millennium, if done properly), and many of whom will never
>be programmers.

Um. You may be right. I reckon I *could* teach C++ as a first language,
but I wouldn't like to do it! I wouldn't mind teaching Smalltalk as
a first language (if I knew it). Or Common Lisp. Or pop11 with objectclass.

I would certainly start by introducing objects as the first concept
though. ("To program a computer, we have to write something called
a program. What kind of thing is a program? ..." Notice the use of
the words "kind" and "thing"? It's a small step to use the words
"class" and "object".)

>Of course, when students do their first mini-project, whether they
>are using an object oriented style or not, I try to get them to
>think very clearly about the ontology they are working with. I
>sometimes think that some of the enthusiasm for OOP is misplaced
>because it is really enthusiasm for Ontology-oriented design, which
>is important whether or not you make use of the powerful features of
>OOP (e.g. inheritance, polymorphism and methods).

That's an important part of it. There's a certain chicken-and-egg
nature to learning OOD though. You can't do it well until you can
program well, but to program well, you need to do it well. Programming
well requires a lot of practice.

>[JLC]

>> I'm not going to argue this point: if you don't agree that OOP is
>> simpler than procedural, then I won't be able to convince you.

>You are wrong because you need procedural programming techniques,
>including such things as conditionals, loops, variables, in order to
>do OOP, not the other way. I.e. learning OOP involves learning more.
>                                                                QED.

The fallacy here is the assumption that "less" implies "simpler". Sometimes
learning is easier when the material to be learnt has a more coherent
structure.

>[JLC]
>> What a strange thing to say. I wouldn't dream of writing *any* program
>> other
>> than as an OOP.

When I first encountered OOP, I couldn't see the point of it. I could
do anything I wanted procedurally. (I could see the point of logic
programming, which seemed genuinely different.)

Perhaps an analogy will help: I no longer use "goto" statements in
my programs. I use "while" loops, "for" loops etc. I'm aware that
exactly the same effect could be achieved using "goto" statements.
But I don't *think* in terms of "goto" statements. Should an occasion
arise when I need a "goto" statement, I'll use one. But it hasn't happened
for a long time.

This is what I mean by OOP mindset. I could easily write code
as a collection of procedures passing extra arguments around (or, worse,
operating on global variables), but I don't. Any more than I consciously
decide to replace "goto" statements with loop constructs.

>What a strange thing to say. OOP is just one of a number of
>programming paradigms, and different paradigms are useful for
>different purposes.

I don't think it is a particularly different programming paradigm. The
analogy with structured programming is best, IMO. It's a more
structured way to organise your program. In many programs it might
make sense to have an "Application" class, with a single instance
representing the running program. This is where you'd put what
could be globals in a procedural program. There's no obvious advantage
to doing it that way, rather than having top-level procedures. But there's
no real disadvantage either (and I think there are some non-obvious
advantages).

On a different point ...

>However, I suspect that in general it can be very difficult to
>determine which is the best global optimisation, even at run time,
>since the very same sequence may be capable of being used in
>different ways and I suspect that working out whether a commitment
>to a particular representation is right, even at run time, is as
>difficult as solving the halting problem.

If you can decide, at run time, which is the best representation of
a sequence, maybe the system could alter the representation at
run time? (Especially on a multi-threaded, multi-processor
implementation - start off with both representations, and every so
often whichever is losing "cheats" by copying the answer-so-far
from the one which is ahead ;-).)

Wow! What a long post. And I had a lot of work to do this afternoon as well
:-(.

Jonathan