multi-language vs mono-language, 2/2 (was: Me and FP &c) 

(continued from pt. 1)

    [snip a lot, see pt 1]

> > So, there is no intrinsic reason why a language good at dynamically
> > typed programming, and able to optimize better via type annotation,
> > should be necessarily preferred over a couple of languages.

> But there are plenty of reasons, else everybody would be writing in
> dozens of languages.

That "dozens" of languages may be sub-optimal for use in one given
project (by a single person) affords no inferences about "intrinsic
reasons" why ONE should be the optimal number.

Nobody eats any given meal using dozens of different pieces of
cutlery -- yet using _a few_ variously appropriate ones is quite
common, reasonable, and effective.  The total cost, mostly in
terms of confusion, of using N different tools may diverge faster
than O(N) _for N large enough_ -- I think it might risk being
quadratic, at least, if each tool can interact with every other.  But
that carries no implication on what happens for small values of N.

If I had arrayed in front of me a dozen forks of different kinds,
a dozen knives of very diverse capability, half a dozen spoons,
and several hybrids as well (a spoon with fork-ish prongs on
the end, another with a cutting edge, etc), my eating efficiency
would probably plunge; using each one appropriately would
require a decision and search process, even when knowing exactly
which one I want I might have a hard time finding it among the
others and disentangling it from them, etc.

But that is no reason to try to limit myself to, say, just a
knife, period.  Soup would for example become quite a problem --
even if the knife were cleverly 'hollowed' to be vaguely usable as
a spoon, I'd risk cutting my mouth at every sip.  One knife, one
fork, and one spoon, seems a minimal efficient array for many
kinds of meals.  Two spoons would be better if I have both
soup and ice-cream -- it would be inconvenient to rinse off the
spoon in the middle of the meal.  Two knives would be better
if I have to both spread butter _and_ effectively cut/carve meat.
Etc, etc.  The optimal number will differ with kinds of meals and
to some extent personal preferences.  For a meal based on a
sandwich, zero pieces of cutlery might be best; for another
more complicated meal, half a dozen.

But as you can see, the fact that using "several dozens" of different
tools simultaneously is unreasonable does not imply that the optimal
number of tools is ONE.

> First of all, mastering a language requires a lot of mental effort.
> Learning the concepts isn't that difficult (after the first couple of
> languages, all further languages are easy), but it's damn hard to
> remember all the fiddly syntax details: where to put semicolons, how

For a language you practice regularly, it's not a problem, in
practice, to juggle all of these things, particularly if the languages
are different enough on the surface.  For dozens of languages, it
would be a big problem; for half a dozen, it's a piece of cake.

> Second, if "the right tool for the right task" worked, we'd be using
> different languages for every module of a project. We'd do the FFT in C,
> business logic in an FPL, and the GUI in (why not?) Visual Basic. But
> this is not done,

But, it *IS* done!  We do it regularly.  Some of our GUIs are in
HTML (mostly dynamic) and Javascript, some in a dedicated
proprietary language of our own (the engine is mostly, but not
only, in C++), most of the computationally intensive stuff is
in C++, the DB accesses are in SQL, the scripting is quite varied,
etc. And we're anything but unique in this -- it's the common
practice today.

> because inter-language interfaces are inherently
> unsafe. The semantics of different languages are incompatible, and the
> programmer has to provide a translation of concepts at the interface.

Components' interfaces are specified in a dedicated language, called
IDL, which gets compiled down to easily-machine-readable metadata.
It's the job of each language's tools to take the metadata and map
it to the stubs/proxies/whatever that the particular language requires.

This is imperfect today, i.e. with COM: there is no clear demarcation
of what each language *must* implement/translate to interoperate
fully, no clear, machine-enforced "programming-as-contracting" spec
at the application-logic level (requires, ensures, invariants), etc.  It's
still quite workable -- not 'theoretically', maybe, but quite so in
practice.  The programmer has no peculiar translation job to do if
he or she is working in a reasonably high-level language with decent
COM support -- Java (of the MS variety, soon due for extinction),
Python, Visual Basic, Delphi, or (haven't checked this out, but this
is what the respective tools claim) Dylan (FunDev with the COM
package), OCaml (with CamlIDL), Haskell (with HaskellScript).  If
working in C++, one gets the choice of several levels of approach
(as in most other things) -- high-level as above, bit-twiddling very
low level, or intermediate.  But except for bit-twiddlers by choice,
it's still the tools' job, NOT the programmer's, to translate and afford
reasonable access to COM's fundamental semantics rules (who owns
what resources in cases of [in], [out], and [in, out] parameters of
various kinds, etc).  Good tools seem to be doing a good job at this.

It will be even better (if it matures properly...) with .NET, with the
rules much clearer and pervasive, some explicit notion of
contracting, very rich metadata, and fuller interoperability.  Several
functional programming languages appear to be readying very
good implementations in a .NET environment, a development
I'm keeping an eye on with much interest -- it's much more
interesting to me than the little extras and quirks MS has decided
to put on its almost-Java-but-can't-call-it-that C#, the gyrations
it's going through to make C++ work in such a "managed"
environment, or Fujitsu's work on .NET-enabled Cobol...

> What makes matters worse is that this translation (in itself a tricky
> and complex process) is never checked by a tool, because that would
> require an Nx(N-1)/2 number of interface adapters for N languages. No
> wonder that most projects restrict themselves to two or three languages!

Apparently, this happens only on platforms which have no good
componentization infrastructure.  A "star" pattern has a cost of
O(N) rather than O(N squared), in terms of numbers of different
interfaces to be considered -- pretty obvious, isn't it?  The 'hub'
is IDL, and the implied semantics of the object-model, for COM
and Corba; in the Java-&-friends world, the JVM specs play a
role that has certain similarities (and the beans and EJB rules
complement it).  I don't know much about commercial use of
the many languages with compile-to-JVM implementations (I
do know that JPython is reasonably popular, but nowhere as
much as the base python implementation); the Java culture may
be strongly leaning towards monolinguism -- the COM, Corba
and .NET ones most definitely aren't.
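The arithmetic behind the "star" argument is easy to check: with N
languages, pairwise adapters grow quadratically, while a single hub
(IDL for COM and Corba, the JVM specs in the Java world) needs just
one adapter per language.

```python
def pairwise_adapters(n):
    """One adapter per unordered pair of languages: N*(N-1)/2."""
    return n * (n - 1) // 2

def hub_adapters(n):
    """One adapter per language, each targeting the common hub."""
    return n

for n in (3, 6, 12):
    print(n, pairwise_adapters(n), hub_adapters(n))
# 12 languages: 66 pairwise adapters, but only 12 hub adapters.
```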

> > Incidentally,
> > Microsoft .NET may soon play a similar "triage" role -- I know that,
> > e.g., Mercury, Python, and Haskell will all be there "in force", as
> > well as many others; so, any investments I make in those will be
> > no waste if .NET catches on; what about Dylan -- any news about
> > it and .NET, either?

> Ah yes, .net. I'm curious whether it can really make different languages
> interoperate - in that case, an important obstacle to multi-language
> projects would vanish.

So far, it's hard to do more than just play with it -- it's in pre-beta
stages, as yet.  And some impedance mismatches are clear: if a
C++ programmer wants the full freedom to deallocate resources at
any time, which his language's semantics grant him, he must do
so in "unmanaged" (aka "unsafe") code portions, which cannot play
fully with the critical common-language-runtime rules (if the pieces
they deallocate are exposed on the component's interface, that is).

Multiple inheritance of implementation is not supported across
different languages' boundaries (only single inheritance of
implementation, and multiple inheritance of interfaces), there
is no cross-language notion of "templates" ("compile-time" generic,
a.k.a. "polymorphic" in Haskell terms, interfaces); notions of
precisely-timed, guaranteed finalization (as supported in languages
with semantics given in reference-count terms, such as Python)
are not yet fully accommodated (the GC abstract semantics are of the
mark-and-sweep kind); etc, etc.  No doubt many other language-unique
issues will have a similarly hard time; I don't think, for example,
that "uniqueness" a la Clean gets any support; in some cases, though,
things have gone better than I would have thought possible (e.g.,
the Eiffel version on .NET seems to have implemented its contracting
notions well, and exposed it as a general interface also usable from
other languages; it can't _mandate_ any other language to use it,
of course, but that's no different, conceptually, from a piece of Eiffel
code built in no-contract-checking mode, as already possible today).
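The finalization mismatch mentioned above is easy to demonstrate.
Under CPython's reference-counting semantics, a finalizer runs the
instant the last reference disappears -- exactly the "precisely-timed,
guaranteed finalization" that a mark-and-sweep runtime such as .NET's
cannot promise (there, the finalizer runs whenever the collector gets
around to it).

```python
closed = []

class Resource:
    def __del__(self):
        # Runs as soon as the refcount hits zero -- on CPython.
        closed.append("released")

r = Resource()
del r           # last reference gone: finalizer fires immediately
print(closed)   # ['released'] right away, under refcounting semantics
```

Note this prompt behavior is a property of CPython's implementation;
on a tracing-GC runtime the same program would see `closed` still
empty at the `print`, which is precisely the semantic gap at issue.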

Maybe we could get some information from the FP guys who have
been working on the environment, such as the Mercury group...?
I've seen some notes from such people on various sites &c, and
the general drift seemed good, but, that was months ago...


Wed, 05 Mar 2003 03:00:00 GMT  