Warning: Flame Bait 
 Warning: Flame Bait


Quote:

>...The fact of the matter is that Ada is more portable,
>more scalable, and has a far more complete, tested and mature
>multithreading model than Java, including complete thread-safe programming
>support with full guarding capabilities. ...

I'm not a big fan of Java, but I don't see how anyone can say that Ada
is more portable than Java.  The Java language definition nails down all
kinds of things that are "implementation defined" or "unspecified" in
Ada.

- Bob



Tue, 18 May 1999 03:00:00 GMT  
 Warning: Flame Bait

Bob Duff says

"I'm not a big fan of Java, but I don't see how anyone can say that Ada
is more portable than Java.  The Java language definition nails down all
kinds of things that are "implementation defined" or "unspecified" in
Ada."

Portability is MUCH more than just a question of the language definition;
it is about being able in practice to port a wide range of applications
from one machine to another, over a wide range of machines. Java does not
yet begin to meet either of these criteria. We will have to wait and see
whether it can in the future, but right now Java is quite limited in its
reasonable application domain in practice, and I don't see too many Java
compilers for the 1750A, let alone the 8051 :-)



Tue, 18 May 1999 03:00:00 GMT  
 Warning: Flame Bait

Quote:

>Portability is MUCH more than just a question of the language definition;
>it is about being able in practice to port a wide range of applications
>from one machine to another, over a wide range of machines. Java does not
>yet begin to meet either of these criteria. We will have to wait and see
>whether it can in the future, but right now Java is quite limited in its
>reasonable application domain in practice, and I don't see too many Java
>compilers for the 1750A, let alone the 8051 :-)

Quite true.  Portability in practice is quite different from portability
in theory.  Java has the "theory" part done quite nicely -- a Java
program that works properly on one machine will work the same on any
other machine that correctly supports the Java language (except for
issues involving shared variable updates by multiple threads).  You
can't say that about Ada, since Ada doesn't define lots of things (order
of evaluation of arguments, pass-by-reference vs. pass-by-value,
semantics of arithmetic overflow, etc -- I could name several tens of
cases).  On the other hand, if a Java compiler doesn't exist on your
machine, you're out of luck.  Likewise, if your machine doesn't have a
word size that is a power of 2, Java can't be compiled efficiently for
it.  The designers of Java clearly valued portability above all else,
including efficiency, and there are many situations where Java simply is
inappropriate.
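
To illustrate the word-size point: Java pins int down to exactly 32 bits
with wraparound on overflow, so a compiler for a machine with some other
natural word size has to mask after every operation. A minimal C sketch
of that emulation, assuming a C99 compiler with <stdint.h>:

#include <stdint.h>
#include <stdio.h>

/* Java defines int as exactly 32 bits with wraparound on overflow.
   On hardware whose natural word size is not 32 bits, a compiler
   must emulate that: do the arithmetic in unsigned (well defined
   in C) and truncate back to 32 bits after every operation.  The
   conversion back to int32_t is implementation-defined in C, but
   wraps on two's-complement targets. */
static int32_t java_int_add(int32_t a, int32_t b)
{
    return (int32_t)((uint32_t)a + (uint32_t)b);
}

int main(void)
{
    /* In Java, 2147483647 + 1 is defined to be -2147483648. */
    printf("%ld\n", (long)java_int_add(INT32_MAX, 1));
    return 0;
}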

We'll see what happens.  Java may well be the final nail in the coffin
for weird machines.  Hmm.

Anyway, it's hard to reason about portability "in practice".  Think
about all the C programmers who write code assuming sizeof(int) =
sizeof(char*), and write code that is quite portable -- in practice.  My
taste wants some more basis than "I ported it to 3 machines, and it
seemed to work."  I have this itchy feeling -- I want to *know* that it
will port.
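
To make that concrete, here is a minimal C sketch (hypothetical code, but
representative of the idiom): it works wherever int and pointers are the
same size, and silently truncates addresses on a 64-bit (LP64) machine.

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char *p = malloc(16);

    /* Classic non-portable idiom: stashing a pointer in an int.
       Works wherever sizeof(int) == sizeof(char *) (ILP32), and
       silently truncates the address on LP64 machines, where
       sizeof(int) == 4 but sizeof(char *) == 8.  A modern compiler
       will warn on these casts -- that is the bug being shown. */
    int cookie = (int)p;
    char *q = (char *)cookie;   /* may no longer equal p */

    printf("sizeof(int) = %u, sizeof(char *) = %u, %s\n",
           (unsigned)sizeof(int), (unsigned)sizeof(char *),
           p == q ? "survived" : "truncated");
    return 0;
}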

- Bob



Wed, 19 May 1999 03:00:00 GMT  
 Warning: Flame Bait


Quote:
> We'll see what happens.  Java may well be the final nail in the coffin
> for weird machines.  Hmm.

> Anyway, it's hard to reason about portability "in practice".  Think
> about all the C programmers who write code assuming sizeof(int) =
> sizeof(char*), and write code that is quite portable -- in practice.

Assumptions of correspondence between "int" and pointer sizes
are scheduled to be discovered as manufacturers start actually
using the 64-bit addressing capabilities of the machines
they have already sold people.

Those who program for DEC Unix, then VMS, and in the future
Windows NT will be able to gradually discover through program execution
all such (or is it really _all_? is there someplace we have not tested?)
bon mots.  So unlike the Java case, the hardware does not have to be
"weird"; it just has to be "different".  I believe IBM, HP and Sun
all have hardware with 64-bit addressing now, so all those C programmers
out there may be kept quite busy, with half of them engaged in eliminating
this error and the other half engaged in making that same old assumption
in new code.

Larry Kilgallen



Wed, 19 May 1999 03:00:00 GMT  
 Warning: Flame Bait

Bob says

"Quite true.  Portability in practice is quite different from portability
in theory.  Java has the "theory" part done quite nicely -- a Java
program that works properly on one machine will work the same on any
other machine that correctly supports the Java language (except for
issues involving shared variable updates by multiple threads).  You
can't say that about Ada, since Ada doesn't define lots of things (order
of evaluation of arguments, pass-by-reference vs. pass-by-value,
semantics of arithmetic overflow, etc -- I could name several tens of
cases).  On the other hand, if a Java compiler doesn't exist on your
machine, you're out of luck.  Likewise, if your machine doesn't have a
word size that is a power of 2, Java can't be compiled efficiently for
it.  The designers of Java clearly valued portability above all else,
including efficiency, and there are many situations where Java simply is
inappropriate."

Well, you can always make a language portable on paper by specifying
everything in great detail; that would, for example, have been trivial
to do in Ada 95. BUT, and this is a *huge* but, the consequence of
doing so is that efficient implementations become impossible on many
machines.

Now of course, so far we have only (by conventional compiler standards)
ludicrously inefficient interpreted implementations of Java, where such
details are largely buried by the interpretive overhead. But when people
start writing, or rather trying to write, efficient Java compilers, the
problems will be greater than expected.

Bob wonders if Java will be the final nail in the coffin for "weird"
machines?

The trouble is that when you pin down the semantics as far as Java has,
then a lot of machines become weird.

For example, all DEC Alphas and the high-end MIPS chip (R10000) are
weird by this definition, because they do not quite implement the whole of
the IEEE floating-point standard. If you are serious about Java requiring
strict adherence to the IEEE standard, then it will be impossible in either
of these cases to provide this strict adherence without a huge loss of
efficiency (just try running your favorite Fortran codes on a DEC Alpha
in strict IEEE mode, and you will see what I mean).
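
To see concretely what "not quite implementing the whole standard" means
here: these chips flush denormal results to zero in hardware unless they
trap to (slow) software assistance. A small C probe, assuming an
IEEE-format double, that distinguishes the two behaviors:

#include <float.h>
#include <stdio.h>

int main(void)
{
    /* DBL_MIN is the smallest *normalized* double.  Halving it
       lands in the denormal (gradual underflow) range.  A strict
       IEEE 754 implementation yields a nonzero denormal; hardware
       that flushes to zero yields 0.0 instead.  The volatiles keep
       the compiler from folding the test away. */
    volatile double tiny = DBL_MIN;
    volatile double denorm = tiny / 2.0;

    puts(denorm != 0.0 ? "gradual underflow (strict IEEE)"
                       : "flushed to zero");
    return 0;
}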

Now, it is certainly trivially easy when designing a language to make a
statement that FPT will be exactly IEEE 754, and then congratulate
yourself for doing such a splendid job of portability design, but if
what you have achieved is a design that does not run correctly on the
machines that people think of as being appropriate for high end fpt
calculations (Sun is not a big player in this market), then perhaps you
have not done such a great job after all.

You have to know a LOT to avoid such mistakes. My guess is that the folks
at Sun who specified IEEE floating-point were simply unaware of the
consequences (it is possible that this was a subtle way of designing a
language more amenable to Sun than to SGI or DEC, but I doubt it was
well enough informed to have been so clever :-)

In the Ada world, a similar though smaller scale glitch happened with
floating-point. In Ada 83, division of floating-point values must be
exact if the result is a model number. Sounds reasonable, BUT, it means
that you cannot implement Ada 83 efficiently on a Cray (or for that
matter on an Intel i860). In the Ada 95 standard, there is a new
"feature" (RM G.2.1(16))

                         Implementation Permissions

16   If the underlying floating point hardware implements division as
multiplication by a reciprocal, the result interval for division (and
exponentiation by a negative exponent) is implementation defined.

Now from one point of view, this means that Ada is getting worse from
a portability point of view, but in pragmatic terms it makes Ada more
portable, because it makes it practical to implement Ada efficiently
on such machines. Sure, there are very unusual cases of code that depend
on the old rule, but it is better to have a very minor portability
glitch of this kind than a situation where NO Ada code runs
acceptably efficiently on a whole class of machines!
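
The effect of the permission is easy to demonstrate in C: forming 1/b and
then multiplying rounds twice, so the result can differ from the correctly
rounded a/b in the last bit. A small sketch that counts such cases:

#include <stdio.h>

int main(void)
{
    /* Division computed as multiplication by a reciprocal rounds
       twice (once forming the reciprocal, once multiplying), so it
       may be off by one unit in the last place from the correctly
       rounded quotient.  Scan some cases and report mismatches. */
    int mismatches = 0;
    for (int a = 1; a < 1000; a++) {
        double x = (double)a;
        double direct = x / 3.0;           /* one rounding */
        double recip  = x * (1.0 / 3.0);   /* two roundings */
        if (direct != recip)
            mismatches++;
    }
    printf("%d of 999 quotients differ in the last bit\n", mismatches);
    return 0;
}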



Wed, 19 May 1999 03:00:00 GMT  
 Warning: Flame Bait

Quote:

>For example, all DEC Alphas and the high-end MIPS chip (R10000) are
>weird by this definition, because they do not quite implement the whole of
>the IEEE floating-point standard. If you are serious about Java requiring
>strict adherence to the IEEE standard, then it will be impossible in either
>of these cases to provide this strict adherence without a huge loss of
>efficiency (just try running your favorite Fortran codes on a DEC Alpha
>in strict IEEE mode, and you will see what I mean).

Interesting.  I guess it remains to be seen how seriously Java compiler
writers take the standard.  Ada 83 compiler writers have a history of
obeying the letter of the law (especially when pushed by the ACVC).
Sometimes even in those rare circumstances where the letter of the law
damages their customers.  I'll bet that a Java compiler writer could get
away with using the machine arithmetic, if it's "pretty close" to strict
IEEE.  (Of course, if the machine isn't IEEE at all, Java is stuck with
software floating point emulation.  I guess that's what you want for
applets distributed over the internet, but not for high performance
numerics stuff.)

Note also that many of the issues we're talking about have nothing to do
with particular machines.  For example, order of parameter evaluation.
Ada says "arbitrary order", in the hopes that compilers can generate
more efficient code.  Java says "left to right", putting portability
ahead of efficiency.  Maybe the Java designers would argue that the
efficiency hit is small, or maybe they would argue that portability is
essential, and efficiency less important.

(By the way, there are cases in Ada 95 (not Ada 83) where the parameters
can NOT be evaluated left to right!)
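
C happens to share Ada's "arbitrary order" rule for arguments, so the
issue can be shown without an Ada compiler. A minimal C sketch whose
output is legitimately compiler-dependent under that rule, and would be
pinned to "first=1 second=2" under Java's left-to-right rule:

#include <stdio.h>

static int counter = 0;

/* Each call returns a distinct sequence number, so the argument
   values reveal the order in which the compiler chose to evaluate
   them.  Under C's (and Ada's) unspecified-order rule either output
   is legitimate; Java's left-to-right rule pins it down. */
static int next(void) { return ++counter; }

static void show(int first, int second)
{
    printf("first=%d second=%d\n", first, second);
}

int main(void)
{
    show(next(), next());
    return 0;
}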

Quote:
>You have to know a LOT to avoid such mistakes. My guess is that the folks
>at Sun who specified IEEE floating-point were simply unaware of the
>consequences (it is possible that this was a subtle way of designing a
>language more amenable to Sun than to SGI or DEC, but I doubt it was
>well enough informed to have been so clever :-)

Really?  I suspect they knew what they were doing.

- Bob



Thu, 20 May 1999 03:00:00 GMT  
 Warning: Flame Bait

Bob Duff says

"Note also that many of the issues we're talking about have nothing to do
with particular machines.  For example, order of parameter evaluation.
Ada says "arbitrary order", in the hopes that compilers can generate
more efficient code.  Java says "left to right", putting portability
ahead of efficiency.  Maybe the Java designers would argue that the
efficiency hit is small, or maybe they would argue that portability is
essential, and efficiency less important."

And of course the argument on the other side is that, especially in Ada,
which is not full of side-effect operators like ++, only very
peculiar code would ever notice the difference, and it seems a shame
to take an efficiency hit to promote portability of code that should
never be written in the first place!

In fact the Ada rule encourages maintainable code by clearly declaring
that weird dependence on side effects is unacceptable. Making badly
written non-maintainable code portable is NOT an Ada design priority.



Thu, 20 May 1999 03:00:00 GMT  
 Warning: Flame Bait

Quote:

> Bob Duff says

> "I'm not a big fan of Java, but I don't see how anyone can say that Ada
> is more portable than Java.  The Java language definition nails down all
> kinds of things that are "implementation defined" or "unspecified" in
> Ada."

> Portability is MUCH more than just a question of the language definition;
> it is about being able in practice to port a wide range of applications
> from one machine to another, over a wide range of machines. Java does not
> yet begin to meet either of these criteria. We will have to wait and see
> whether it can in the future, but right now Java is quite limited in its
> reasonable application domain in practice, and I don't see too many Java
> compilers for the 1750A, let alone the 8051 :-)

I know virtually nothing about the Java language or the JVM.  But in
the early 80's I worked for a company called SofTech Microsystems, which
had acquired the rights to a system called UCSD Pascal.  Everything I
have read about the JVM leads me to believe that the JVM has its roots in
the old "p-Machine", which was the virtual machine that UCSD Pascal
was compiled for.

UCSD Pascal was very portable.  We had it on a wide variety of
platforms and processors: Z80, 8086, 68k, TI9900, HP-87, etc. etc.
etc.  You could compile on one platform and run the object code
on another platform (sound familiar?).  Towards the end we had even
defined a local area network system which allowed one system to
run the software resident on another system's disk.

There were some questionable business decisions that killed the
product (one was the pricing, which allowed Borland Pascal to
become the dominant Pascal product).  But there were also some basic
technology problems:

  o Speed - No matter what you do, an interpreted system will never
            execute code as fast as a direct object code execution
            system (see the sketch after this list).  This removes the
            system from the class of applications that requires speed.
            The p-System attempted to address this by introducing
            native code generators.  However, even the code generated
            by these tools required some of the VM codes to remain.
            And of course, once you go native you are no longer
            portable.

  o Architecture Limitations -

            The p-System was limited by the very thing that made it
            incredibly portable: the basic architecture of the
            p-Machine.  It was defined as a 16-bit architecture.  When
            the first set of 32-bit machines hit the market, the
            virtual machine definition kept programs from taking
            advantage of the underlying hardware.  We were addressing
            this with a 32-bit definition, but we did not give it
            enough priority.
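
The sketch promised under Speed above: a toy stack-machine interpreter in
C, in the spirit of (but vastly simpler than) the p-Machine, with
hypothetical opcodes. The fetch/decode/dispatch work done on every
virtual instruction is the overhead that native object code avoids:

#include <stdio.h>

/* A toy stack-machine interpreter in the spirit of the UCSD
   p-Machine (hypothetical opcodes, vastly simplified). */
enum { OP_PUSH, OP_ADD, OP_MUL, OP_PRINT, OP_HALT };

static void run(const int *code)
{
    int stack[64];
    int sp = 0;      /* next free stack slot */
    int pc = 0;      /* index of next instruction */

    for (;;) {
        switch (code[pc++]) {            /* fetch + decode, every time */
        case OP_PUSH:  stack[sp++] = code[pc++];            break;
        case OP_ADD:   sp--; stack[sp-1] += stack[sp];      break;
        case OP_MUL:   sp--; stack[sp-1] *= stack[sp];      break;
        case OP_PRINT: printf("%d\n", stack[--sp]);         break;
        case OP_HALT:  return;
        }
    }
}

int main(void)
{
    /* (2 + 3) * 4, as p-code; prints 20. */
    const int program[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD,
                            OP_PUSH, 4, OP_MUL, OP_PRINT, OP_HALT };
    run(program);
    return 0;
}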

These two issues became critical with the emergence of the IBM PC
and the standard PC architecture.  All of a sudden portability was
much less important than performance on the IBM PC.  The fact that
the p-System could make only limited use of memory beyond 64K was
critical.

Anyway, my two cents.  It would be interesting for someone who knows both
the Java technology and something about what happened to UCSD Pascal
to compare the two, both from a technology and a business standpoint.
Since Java is targeting a different application set, the things that
caused UCSD Pascal to disappear might be irrelevant.  It could also be
that the limitations of using an interpreted virtual machine are
irrelevant to Java.  But it does seem like it would make an interesting
study (if one has not already been done).

Tom Robinson



Thu, 20 May 1999 03:00:00 GMT  
 Warning: Flame Bait


[snip]

Quote:
> Bob wonders if Java will be the final nail in the coffin for "weird"
> machines?
> The trouble is that when you pin down the semantics as far as Java has,
> then a lot of machines become weird.
> For example, all DEC Alphas and the high-end MIPS chip (R10000) are
> weird by this definition, because they do not quite implement the whole of
> the IEEE floating-point standard. If you are serious about Java requiring
> strict adherence to the IEEE standard, then it will be impossible in either
> of these cases to provide this strict adherence without a huge loss of
> efficiency (just try running your favorite Fortran codes on a DEC Alpha
> in strict IEEE mode, and you will see what I mean).
> Now, it is certainly trivially easy when designing a language to make a
> statement that FPT will be exactly IEEE 754, and then congratulate
> yourself for doing such a splendid job of portability design, but if
> what you have achieved is a design that does not run correctly on the
> machines that people think of as being appropriate for high end fpt
> calculations (Sun is not a big player in this market), then perhaps you
> have not done such a great job after all.
> You have to know a LOT to avoid such mistakes. My guess is that the folks
> at Sun who specified IEEE floating-point were simply unaware of the
> consequences (it is possible that this was a subtle way of designing a
> language more amenable to Sun than to SGI or DEC, but I doubt it was
> well enough informed to have been so clever :-)

Wasn't there a long cross-posted thread back in winter '93-'94 about an
ambiguity in the IEEE fp standard that left parts of the standard
implementation-defined re: rounding intermediate results? I don't recall
the details, perhaps someone else reading here does. Was that ambiguity
a factor in the Alpha and Mips fp designs?

[etc]




Fri, 21 May 1999 03:00:00 GMT  
 Warning: Flame Bait

Clayton said

"Wasn't there a long cross-posted thread back in winter '93-'94 about an
ambiguity in the IEEE fp standard that left parts of the standard
implementation-defined re: rounding intermediate results? I don't recall
the details, perhaps someone else reading here does. Was that ambiguity
a factor in the Alpha and Mips fp designs?"

That involves a very subtle point regarding rounding in some unusual cases,
where indeed there is a one-bit ambiguity depending on where rounding is
done in the denormal case. However, this has nothing whatsoever to do with
DEC's and SGI's decision not to properly support denormals *at all*.



Fri, 21 May 1999 03:00:00 GMT  
 Warning: Flame Bait

Quote:

> Now, it is certainly trivially easy when designing a language to make a
> statement that FPT will be exactly IEEE 754, and then congratulate
> yourself for doing such a splendid job of portability design, but if
> what you have achieved is a design that does not run correctly on the
> machines that people think of as being appropriate for high end fpt
> calculations (Sun is not a big player in this market), then perhaps you
> have not done such a great job after all.

Java implementations that do not conform to the Java language
specification are already starting to appear.  One was announced
recently in comp.compilers... among other reasons for its non-
conformance is the fact that it doesn't handle IEEE infinities or NaNs.
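
For reference, the behavior being dropped is fully pinned down by IEEE 754
and by the Java specification; on an IEEE host it can be probed from C99
(a minimal sketch using <math.h>):

#include <math.h>
#include <stdio.h>

int main(void)
{
    volatile double zero = 0.0, one = 1.0;

    /* IEEE 754 (and the Java spec) define these results exactly.
       An implementation with no infinities or NaNs cannot deliver
       them, so it fails on any program that relies on them. */
    double inf  = one / zero;    /* +infinity, not a run-time error */
    double qnan = zero / zero;   /* quiet NaN */

    printf("1.0/0.0 is infinite: %s\n", isinf(inf) ? "yes" : "no");
    printf("0.0/0.0 is NaN:      %s\n", isnan(qnan) ? "yes" : "no");
    printf("NaN == NaN:          %s\n", (qnan == qnan) ? "yes" : "no");
    return 0;
}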

It will be interesting to see how much the market values conformance to
spec versus how much it values ease of implementation (which translates
into lower cost) or efficiency.

The Ada market has placed a very high value on conformance to the
standard (and in particular on validation).  Whether the Java market
will do so remains to be seen.

--

WWW: <http://www.cs.mu.oz.au/~fjh>   |  "...the pursuit of excellence is a lethal habit"



Fri, 28 May 1999 03:00:00 GMT  
 