In article <8ctg9n$uuf$1@soapbox.cs.bham.ac.uk>,
pop@roo.cs.umass.edu writes:
>
> Chris Dollin said:
>
>> OO-style inheritance doesn't cut it. C++-style templates are a bad
>> approximation to ML-style polymorphism. Java-style distinctions between
>> "primitive types" and "class types" just serve to complicate that
>> which was already simple.
>
> Actually, I'd say that Java makes a sensible decision to distinguish
> statically between what's necessarily a pointer and what's definitely not.
> This provides information necessary for garbage-collection while preserving
> the natural operations of the machine on non-pointer entities. Given that
> machine architectures don't support tagging, I'd say that Java's made the
> right decision.
Having written rather more Java code than I'd really like to recently,
I must respectfully disagree. The most painful consequence is that you
can't write uniform code for collections: in Java, you can't use `int`
values as either the keys or the results from hash tables, nor as the
values stored in extensible vectors. (Nor `char` nor `boolean` nor
`long` nor `float` nor `double`.) You either have to write your own
collection code for each of these types, or you have to wrap the values
in Integer, Float, Double etc *objects*, using additional store (and
cycles) and complicating the code.
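For example, counting occurrences with a Hashtable forces a box and an
unbox on every update (a minimal sketch; the class and variable names
are my own):

    import java.util.Hashtable;

    public class Count {
        public static void main(String[] args) {
            Hashtable counts = new Hashtable();
            String[] words = { "foo", "bar", "foo" };
            for (int i = 0; i < words.length; i++) {
                // An `int` can't go straight into the table: wrap it
                // in an Integer on every put, unwrap it on every get.
                Integer old = (Integer) counts.get(words[i]);
                int n = (old == null) ? 0 : old.intValue();
                counts.put(words[i], new Integer(n + 1));
            }
            System.out.println(counts.get("foo"));  // prints 2
        }
    }

Every one of those Integers is a fresh heap allocation, made just to
carry an int across the collection boundary.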
I wouldn't mind if the language had a universal type *and* ways of
saying "... but it's always an int" or "... but it's always a
reference-to-Foo", or if it had ML-style polymorphism. But it has
*neither*, and the more Java I write the less I like it.
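The nearest Java offers is Object-plus-cast, which re-checks at run
time what a "... but it's always a ..." declaration could establish at
compile time (again a sketch of my own):

    import java.util.Vector;

    public class Casts {
        public static void main(String[] args) {
            Vector v = new Vector();
            v.addElement("hello");
            // Nothing lets me declare "v holds only Strings", so
            // every retrieval needs a run-time checked cast.
            String s = (String) v.elementAt(0);
            System.out.println(s.length());  // prints 5
        }
    }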
As for the machine architecture question, well, I think that the
decision that the Java arithmetic operations were to wrap, not fail,
on overflow was what we English call "regrettable". With type declarations
available, a compiler can avoid tagging and untagging intermediate values
(or even final values). There are so *many* ways in which a better job
could be done.
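To see what I mean by "wrap, not fail" (a minimal sketch; the class
name is my own):

    public class Wrap {
        public static void main(String[] args) {
            int big = Integer.MAX_VALUE;  // 2147483647
            // Overflow neither traps nor throws: the result silently
            // wraps round to Integer.MIN_VALUE.
            System.out.println(big + 1);  // prints -2147483648
        }
    }

No exception, no status flag; the overflow simply vanishes.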
--
Chris "Pepper -> C runs 'fast enough', for example ..." Dollin
C FAQs at: http://www.faqs.org/faqs/by-newsgroup/comp/comp.lang.c.html