Date:Mon Apr 24 10:29:57 1996 
Subject:Re: Java (slightly off topic) 
From:Jonathan Cunningham 
Volume-ID:960427.01 

In article <4lgfjm$sme@infa.central.susx.ac.uk>,
richardm@cogs.susx.ac.uk (Richard Matthias) wrote:

>Jonathan Cunningham (jlc@sofluc.demon.co.uk) wrote:
>: Has anyone thought yet about implementing Java within Poplog?
>
(snip)
>said :). Going in the other direction is currently very difficult because the
>Java VM has no decent way of creating closures as the Lisp people have found
>out. There was talk of persuading Sun to incorporate such a facility in
(snip)
>
>: I've just started looking at the language. I have a strong hunch that
>: it may turn out to become a very important general purpose language
>: in about, say, two years time.
>
>Good call. I think the current fuss is premature too. Current problems with it

Maybe closures will come in time, but I'm not really predicting converts
from Lisp (or Pop11), and C++ programmers won't know what they are missing,
so this won't be a disadvantage for them.

P.S. You don't need closures. It is not The OO Way :-). I've just
realised why I don't feel the lack of closures in C++. In fact (since
this is comp.lang.pop) I could argue that you shouldn't use closures in
pop11 either (although for efficiency, I might allow the use of partial
application). Instead, create a class called "action" (or "closure" or
"foo") which has one method, "apply", then subclass it as needed. (You
might need - yukk - several applies: apply1, apply2 etc., depending on
how many arguments they require. In C++ or Java - but not pop11, where
you only need one - they could all have the same name.) I actually do
this a lot when writing HCI code in C++.
    And don't point out the difficulties: they are the same as we had using
partial application instead of true closures in the early-ish days of
poplog pop11. You get used to using references instead of local variables.
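To make the idea concrete, here is a minimal Java sketch of the "action
class" pattern (all the names are invented for illustration):

```java
// A hand-rolled substitute for a closure: an abstract "action" with a
// single apply method. Captured state lives in fields of the subclass,
// playing the role of the frozen arguments of a partial application.
abstract class Action {
    abstract int apply(int x);
}

// One subclass per behaviour.
class Adder extends Action {
    private final int n;            // the "captured" value
    Adder(int n) { this.n = n; }
    int apply(int x) { return x + n; }
}

public class ActionDemo {
    // Code that would take a closure takes an Action instead.
    static int twice(Action a, int x) { return a.apply(a.apply(x)); }

    public static void main(String[] args) {
        Action add3 = new Adder(3);
        System.out.println(twice(add3, 10));   // prints 16
    }
}
```

The references-instead-of-local-variables point above shows up here as
the field n: anything the "closure" needs must be stored in the object.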

>: I'm sure variables get used more than they get assigned to
>
>Yup. The incremental garbage-collectors used in JavaVMs rely on this fact to
>stay efficient. A value may be read with impunity, but writes are trapped and
>invoke the marking phase of the gc.

Interesting. But with multi-threading becoming more common, I would have
thought that an asynchronous incremental GC running in a separate thread
would be the way to go: all ready for dual-processor/multi-processor
architectures. (This in a non-toy language, and slightly further ahead
than two years - but perhaps not so far. Maybe five years?) You could
adjust the priority of the gc thread according to the free/total memory
ratio.
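The write barrier Richard describes can be sketched in a few lines of
Java (a toy illustration only - a real incremental collector is far more
involved, and all the names here are invented):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Toy write barrier: reads go straight through, but every write of a
// reference field first pushes the old target onto the collector's
// "grey" stack, so incremental marking cannot lose a reachable object.
class Cell {
    private Cell next;                       // one reference field

    static final Deque<Cell> greyStack = new ArrayDeque<Cell>();

    Cell getNext() {                         // read with impunity
        return next;
    }

    void setNext(Cell c) {                   // write is trapped
        if (next != null) {
            greyStack.push(next);            // re-mark the old target
        }
        next = c;
    }
}
```

The point is the asymmetry: getNext costs nothing extra, while setNext
does a little marking work on the collector's behalf.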

Not that I would want to invent a correct asynchronous incremental GC
myself, but such things exist in the literature.
>
>: I haven't got far with my reading yet, but so far the Java core looks
>: quite small: most of the power will come (as in poplog) from all the
>: associated libraries that will be developed.
>
>: Historically, big languages have always lost out (PL/1 and Algol-68 spring
>: to mind). C++ may be the next to go: it has too much baggage (no
(snip)
>
>Talk of C++'s death is premature. References are neater than pointers, but

If Java becomes useful in, say, 2 years, then it would be 4 to 6 years
before it had an impact on C++, and another 2 or 3 years before widespread
switching from C++ to Java-2003. I'm not willing to make strong
predictions on that timescale, but I _would_ be willing to bet _against_
C++.

>they don't cut ice when it comes to system programming. Java by its very
>design requires a support system even when compiled to native code. In short
>you couldn't write your operating-system in Java and more importantly you
>couldn't write your JavaVM in Java. This lack of symmetry puts people off.
>
I'm not convinced. References are implemented as pointers, so the only
thing you would need (which would also break the "safety" of Java for
internet purposes) would be one kludge to allow C-like casting (type
coercion). After all, OS data structures are just data structures. Where
this became really inconvenient (not impossible), I could foresee the
use of C (not C++). (At present, real-time embedded systems may be written
in a mixture of C and C++. It is not too hard to imagine a foreign-function
interface being added to a future Java.)

But more likely, if Java did become popular enough, people would start
agitating for more features to be added. And by then I will be speculating
on what language will replace it. :-).

>Also the things that were left out of Java to make it neater (other than
>pointers) are very useful. They wouldn't have made it into C++ if they weren't
>useful. Take multiple-inheritance for example. Someone described OO languages
>that don't support MI as 'Death by sub-classing'.

Java has an interesting twist: it has multiple inheritance of "interface"
classes. I don't yet have a feel for how well this compensates for the
lack of real multiple inheritance. I very seldom use multiple inheritance
in C++, and don't miss it (although I used the "mixin" style of
programming a lot when I used lisp). {For those who don't know what I'm
talking about: an interface is a class-like type with no instance
variables and no method bodies - it just declares the methods that an
implementing class must define.}
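For concreteness, a small Java sketch of the interface mechanism (the
class and interface names are mine):

```java
// Two "interface classes": method signatures only, no instance state.
interface Ordered {
    int compareTo(Object other);
}

interface Describable {
    String describe();
}

// A class may implement any number of interfaces - this is the only
// multiple inheritance Java permits.
class Length implements Ordered, Describable {
    private final int metres;
    Length(int metres) { this.metres = metres; }

    public int compareTo(Object other) {
        return metres - ((Length) other).metres;
    }
    public String describe() {
        return metres + "m";
    }
}
```

A Length can then be passed to any code expecting an Ordered or a
Describable, which is where the mixin flavour comes from.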

Lack of multi-methods is a more serious criticism. I still use
multi-methods (i.e. run-time dispatch on the types of a combination of
args, not just one) a lot when programming in lisp, and it is a pain not
to have them in C++. But even this is alleviated by overloading of
function definitions: if the compiler can figure out at compile time
which function to call, then you don't need "true" multi-methods.
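A small Java example of that last point (invented names): overload
resolution happens at compile time on the static types, so it only
stands in for multi-methods when those types are known.

```java
class Shape {}
class Circle extends Shape {}
class Square extends Shape {}

class Collide {
    // Overloads dispatch on the compile-time types of both arguments.
    static String hit(Circle a, Circle b) { return "circle/circle"; }
    static String hit(Circle a, Square b) { return "circle/square"; }
    static String hit(Shape a, Shape b)   { return "shape/shape"; }
}
```

Calling Collide.hit(new Circle(), new Square()) selects "circle/square",
but if the same objects are held in variables of static type Shape, the
call falls back to the Shape/Shape overload - a true multi-method would
still pick the specific one at run time.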

>: Everyone and their cat are implementing Java interpreters at the moment.
>
>Yes, there are lots of VMs being developed as I mention above. It would be an
>interesting exercise, but I don't see how poplog could "cash-in on the rush".

A very strong argument against implementing a poplog Java (except for
fun): there would be no money in it - unless, that is, the poplog
environment made it attractive enough. But I think you are right. A pity.

--jlc