multi-language vs mono-language, 1/2 (was: Me and FP &c) 


Quote:

> > [...] A recon-cum-bomber, for
> > example, is not very viable (among fixed-wing aircraft, I mean).

> Yes, but this sort of trade-off is commonplace in the material world.

And in the intellectual one as well.  Do you not deem concerts
and plays worthy of any attention, since we have operas that cover
both theatrical aspects AND musical ones?  Nah!  Opera has its
place, but it doesn't *displace* either specialist form -- there
are stories that are best told without music, and music that is
best played without pretending it's telling a story, or fitting
a libretto to it.

It's not _always_ so... using two different natural languages
and/or ways of writing, one for "noble" purposes and the
other for everyday necessities, has been common in many
places throughout the world, but nowadays most places seem
to vastly prefer to use a single natural language for all
purposes (the roles of English in non-native English cultures
stem from a rather different root than the "noble vs
everyday" distinction, although it could also provide another
viable analogy).  This does mean that the sole language
becomes richer and more complex, while in two-language
cultures the "vulgar" (everyday) language may be simpler (it
needs no terms for complex, subtle philosophical issues),
and so, maybe surprisingly, may the noble/holy/superior
ones (they may lack any meaningful way to talk about market
fluctuations in pickled herrings' prices, for example).  But
the "sole" (richer, more complex) language, in practice, gets
fragmented in a variety of idiolects that congregate around
at least two poles -- often, grammatically too, to some extent,
although it is lexically that the split is most evident.

The asserted (and perhaps practiced) "preferences" for a
"single" language, therefore, need not reflect "efficiencies";
they might, but an efficiency-indifferent drive for a symbol
of national and cultural unity and strength, for example,
is also a very plausible reason.  Italian is exceptionally well
traceable, historically, from this point of view; it was
"designed" (back in the 13th/14th centuries) in a very
deliberate way as a bridge between Church Latin and
(basically Tuscan) dialects, and the prominent figure in
this movement was Dante Alighieri, the well-known poet,
a life-long opponent of the cultural and spiritual monopoly
of the Church; it only became a really living popular
language much later, in the 19th century, again in a clearly
political maneuver (by the government of newly united
Italy, again with a strong undercurrent of anti-Church
policy).

Back from natural languages to programming ones, I
suspect the roots of the widespread yearning for a single
language "good for everything" may be similar -- less
overtly cultural and national, here, but with a psychological
matrix that has much in common.  And, to be fair, my own
bias towards "letting a thousand flowers bloom", i.e. my
preference for multi-language systems (with the various
languages interoperating smoothly and productively), may
well be tied to my general propensity in favour of cultural
and lifestyle diversity ("rainbow coalition", Bach and
Offenbach and Nyman, Blake and Leopardi and Rilke...:-).

Quote:
> Plus, there's also the constraint that the language must be efficiently
> translatable into reasonably efficient code... but I've still got that

Yes, and the level of "reasonably" differs depending on how
wide the target of applicability is.  If a language is to be able
to implement well *every* functionality an application may
need in its components, it had better have a *deucedly good*
code generator -- ability for _high_ optimization indeed!  If
the target is "80%-90% of components" (that vast majority
that does NOT really need to-the-metal performance), then
the meaning of "reasonably" is suddenly sharply different...

Quote:
> nagging feeling that it *should* be possible to have a language that's a
> jack-of-all-trades.

"Everything is possible, but not everything is convenient", is one
way to translate a famous utterance of St. Paul to the Corinthians
(you may be more familiar with it, with the words "permissible"
and "beneficial" instead, or other sort-of-synonyms yet).  Many
programmers believe that Java, or C++, or Visual Basic, ARE that
"Jack of All Trades" right here and now -- they do all their work
in that one language, and purport to be happy with it.  Others,
no doubt, believe the same of less-widespread languages, be
they Eiffel, Dylan, some ML variant, Python, Perl, Haskell, Clean,
Erlang, Scheme (or some other Lispish one), C#, or whatever
(apologies to any One-True-Way'ist I've omitted!-).  I think that,
in a world where different languages interoperate smoothly, such
one-language-for-all-components solutions are probably not
optimal for any application complex enough to comprise, say,
several dozen components (where a component is typically about
100 to 1000 function points' worth of functional richness).

Quote:
> I see four large application areas where a language
> must be good in to be that:
> 1. Low-level stuff. Twiddling hardware registers etc. (for device
> drivers and such).
> 2. Algorithmically complex stuff that still must be efficient.
> Raytracers, FFT, sparse matrixes etc.
> 3. Glueing components together (scripting, if you want).

Many components will be mainly orchestrating the work of
other components, as well as, possibly, adding in some
piece of their own.  You could call that "scripting", but it's
a bit too wide an acceptation of "scripting" for my taste.
Sure, there is a need to script/glue things purely and
simply too, which is "scripting"'s main meaning.

Quote:
> 4. Real-time programming.

Sure, there is that.  In both hard and soft varieties.

We have _not_ exhausted the important fields with these
four points, of course.  Database access is noticeably out
of the list, as is the implementation of most layers in a
communication protocol stack, lexing/parsing/string&text
processing, logical/inferential stuff, etc.

Quote:
> Now when I look at these application domains, I don't see any
> conflicting requirements (with the exception of RT, which is often seen
> to conflict with glueing - RT proponents want strict control over every
> processing step to keep their deadlines while scripters specifically do
> *not* want to worry about the internals of the modules).

I get the impression (from a very superficial look, so far, at Erlang)
that it IS good at both RT (of the soft variety) AND 'scripting', at
least in the wide sense you seemed to be using it in [3].

If there is conflict there, there should also be conflict between [1] (a field
which may want strict control over every bit going back and forth)
and the others (which neither need nor want it).  Automatic and
implicit memory management, for example, seems potentially bad for [1] and
[4] -- is surely good for [3] in every sense -- re [2],
it depends a lot (what is its cost, what are the alternatives, ...),
but for very efficiency-critical tasks fine-grained control of memory
strategies may need to be with the programmer.

Quote:
> So where is the problem that has prevented a 1-3 language?

"Jack of all trades, Master of none" -- it's an old English saying,
you know.  You want the J-o-a-t.  Why are you surprised that
you may also at the same time be getting the m-o-n...?

(continued in pt 2...)

Alex



Wed, 05 Mar 2003 03:00:00 GMT  

Quote:

> > So where is the problem that has prevented a 1-3 language?

> "Jack of all trades, Master of none" -- it's an old English saying,
> you know.  You want the J-o-a-t.  Why are you surprised that
> you may also at the same time be getting the m-o-n...?

Perhaps because he's talking about programming languages and not
people.  The reason human beings can't master many different trades is
obvious - everyone only gets 24 hours in a day, so even a very good
generalist can't compete with a specialist in his own field.  He just
can't put enuff time into each speciality.

But there's no similar restriction on programming languages.  To a
degree, it works the opposite way: Each new language must spend a lot
of time re-inventing the wheel, while mini-languages built on top of
other languages (eg, Schelog) can simply re-use what's there.  I'm
primarily talking about the sort of library functionality that a
serious language needs: I/O, type hierarchy, module handling,
exception handling, etc.

ISTM the expressiveness issues really indicate the need for strong
metaprogramming capability and not multiple languages.

--
Tom Breton, http://world.std.com/~tob
Not using "gh" 1997-2000. http://world.std.com/~tob/ugh-free.html



Thu, 06 Mar 2003 03:00:00 GMT  

    [snip]

Quote:
> people.  The reason human beings can't master many different trades is
> obvious - everyone only gets 24 hours in a day, so even a very good
> generalist can't compete with a specialist in his own field.  He just
> can't put enuff time into each speciality.

> But there's no similar restriction on programming languages.  To a

Except that, if their complexity is not constrained, it becomes a huge
defect.  Many languages have hit constraints of excessive complexity.
They become too hard to use most effectively for most human beings.

Quote:
> degree, it works the opposite way: Each new language must spend a lot
> of time re-inventing the wheel, while mini-languages built on top of
> other languages (eg, Schelog) can simply re-use what's there.  I'm
> primarily talking about the sort of library functionality that a
> serious language needs: I/O, type hierarchy, module handling,
> exception handling, etc.

...which is just the kind of thing that the Common Language Runtime
and class libraries of Microsoft's .NET supply to all languages that
compile down to MSIL and respect the needed rules.

Quote:
> ISTM the expressiveness issues really indicate the need for strong
> metaprogramming capability and not multiple languages.

We clearly have a huge split in worldview.  Fortunately for me, I
don't think the One True Wayers' dream of a mono-linguistic world
is really feasible, while barriers to language interoperability are in
fact crumbling even as we speak, and making the "let a thousand
flowers bloom" dreamworld of multi-language enthusiasts closer and
closer to reality...:-).

Alex



Fri, 07 Mar 2003 03:00:00 GMT  
Alex Martelli

Quote:
> barriers to language interoperability are in
> fact crumbling even as we speak, and making the "let a thousand
> flowers bloom" dreamworld of multi-language enthusiasts closer and
> closer to reality...:-).

These barriers to language compatibility are crumbling only if you
are prepared to kiss goodbye to performance. Yeah, we could compile
all languages to Java or .NET, but performance will be awful.

graham



Fri, 07 Mar 2003 03:00:00 GMT  

Quote:
> Alex Martelli
> > barriers to language interoperability are in
> > fact crumbling even as we speak, and making the "let a thousand
> > flowers bloom" dreamworld of multi-language enthusiasts closer and
> > closer to reality...:-).

> These barriers to language compatibility are crumbling only if you
> are prepared to kiss goodbye to performance. Yeah, we could compile
> all languages to Java or .NET, but performance will be awful.

Performance on COM is anything _but_ awful; COM's overhead for
in-process, same-thread method calls is comparable to that of a C++
virtual function call (plus a couple of machine cycles because the
'this' pointer is specified to be passed on the stack rather than in a register).
This is, in fact, one of its great strengths.

It's too early to benchmark .NET in earnest, but I've looked deeply
into the architecture and see no reason the overhead should be
substantially different from COM's.  Key performance impact is going
to be in mark-n-sweep GC rather than reference counting, and that
can be optimized (in theory, it should be _faster_ with decent
algorithms).  MSIL can and does accommodate optimizations not
particularly needed for imperative languages but important for functional
ones; at least, this is what one hears from the Mercury guys.

Alex



Sat, 08 Mar 2003 03:00:00 GMT  
Alex Martelli

Quote:

>> These barriers to language compatibility are crumbling only if you
>> are prepared to kiss goodbye to performance. Yeah, we could compile
>> all languages to Java or .NET, but performance will be awful.

> Performance on COM is anything _but_ awful; COM's overhead for
> in-process, same-thread method calls is comparable to that of a C++
> virtual function call (plus a couple of machine cycles because the
> 'this' pointer is specified to be passed on the stack rather than in a register).
> This is, in fact, one of its great strengths.

That's a pretty high cost for some of us.

graham



Sat, 08 Mar 2003 03:00:00 GMT  
Uh, now I've got something to answer to... well, I set the bait, so I
guess I get what I deserve :)

Quote:



> > > [...] A recon-cum-bomber, for
> > > example, is not very viable (among fixed-wing aircraft, I mean).

> > Yes, but this sort of trade-off is commonplace in the material
> > world.

> And in the intellectual one as well.  Do you not deem concerts
> and plays worthy of any attention, since we have operas that cover
> both theatrical aspects AND musical ones?  Nah!  Opera has its
> place, but it doesn't *displace* either specialist form -- there
> are stories that are best told without music, and music that is
> best played without pretending it's telling a story, or fitting
> a libretto to it.

I agree.
What I intended to say was that analogies like aircraft design and music
will break if pushed too far. I think that relying on analogies is
dangerous - programming language design is a new field of thinking, and
we don't know at what points the analogies are likely to fail, so
reasoning by analogy is always to be taken with a grain of salt.
This doesn't mean that I consider analogies useless; exploring where an
analogy holds and where it breaks can give valuable insights into the
thing itself.

Quote:
> [Natural languages analogy]
> ... This does mean that the sole language
> becomes richer and more complex,

Here's the analogy breaker: natural languages almost instantaneously
adapt to fit any new type of thing that needs to be expressed; this
adaptation also happens by instinct and with input from the users
themselves; in fact there is no natural language designer sitting around
and looking for potential improvements.
For computer languages, these aspects are all different: adaptation is
slow if it happens at all, and the result is often ungainly (look at the
supposed OO evolution of C into C++, and you'll see what I mean).
Adaptation happens by design and requires individuals with a knack for
computer language design (and these are exceedingly rare).

Quote:
> while in two-language
> cultures the "vulgar" (everyday) language may be simpler (it
> needs no terms for complex, subtle philosophical issues),
> and so, maybe surprisingly, may the noble/holy/superior
> ones (they may lack any meaningful way to talk about market
> fluctuations in pickled herrings' prices, for example).  But
> the "sole" (richer, more complex) language, in practice, gets
> fragmented in a variety of idiolects that congregate around
> at least two poles -- often, grammatically too, to some extent,
> although it is lexically that the split is most evident.

This is another difference. You can freely mix and match elements from
different natural languages. In fact that's how some languages have
started, as a mixture of words and syntax from two or more languages in
places where people from different cultures had to interact (like at
economic barter centers, where people haggle with hands and feet, then words
from both languages, then syntax from both languages).
Computer languages, on the other hand, don't mix and match. You simply
cannot write a Pascal-style declaration of a local variable in the
middle of a Haskell program.

The human mind is much more flexible when it comes to understanding
language, and that's the reason why it's not the programmers who decide
where to put semicolons :)

Quote:
> The asserted (and perhaps practiced) "preferences" for a
> "single" language, therefore, need not reflect "efficiencies";
> they might, but an efficiency-indifferent drive for a symbol
> of national and cultural unity and strength, for example,
> is also a very plausible reason.

Not for programming languages. There may be social factors at work, but
these are IMHO secondary if they play a role at all.
For example, even if Italian is a "designed" language, it was not
designed from the ground up, it was a merge between existing languages.
The only fully artificial languages that I know of are Esperanto (which
is marginally successful) and Volapük (which is obscure even in Germany,
where it was invented in the pre-War time, AFAIK).

Programming languages are, as a rule, designed from the ground up. You
can reuse general style (as in Java, which borrowed lots of its style
from C), but that's almost all - C++ semantics is radically different
from C semantics, Java is radically different from C and C++ or any
other language that is listed among its roots.

Quote:
> Back from natural languages to programming ones, I
> suspect the roots of the widespread yearning for a single
> language "good for everything" may be similar -- less
> overtly cultural and national, here, but with a psychological
> matrix that has much in common.  And, to be fair, my own
> bias towards "letting a thousand flowers bloom", i.e. my
> preference for multi-language systems (with the various
> languages interoperating smoothly and productively), may
> well be tied to my general propensity in favour of cultural
> and lifestyle diversity ("rainbow coalition", Bach and
> Offenbach and Nyman, Blake and Leopardi and Rilke...:-).

Hey, I like cultural diversity myself, but not in programming languages.
Too much effort to learn and distinguish a dozen slightly incompatible
semantics and ways of doing things.
Actually I felt much more at ease learning PHP - this language is so
limited that I just code ahead and to hell with imponderabilia like
reusability and such.

Quote:
> > Plus, there's also the constraint that the language must be
> > efficiently translatable into reasonably efficient code...

> Yes, and the level of "reasonably" differs depending on how
> wide the target of applicability is.  If a language is to be able
> to implement well *every* functionality an application may
> need in its components, it had better have a *deucedly good*
> code generator -- ability for _high_ optimization indeed!  If
> the target is "80%-90% of components" (that vast majority
> that does NOT really need to-the-metal performance), then
> the meaning of "reasonably" is suddenly sharply different...

Fully agreed.

Quote:
> > nagging feeling that it *should* be possible to have a language
> > that's a jack-of-all-trades.

> "Everything is possible, but not everything is convenient", is one
> way to translate a famous utterance of St. Paul to the Corinthians
> (you may be more familiar with it, with the words "permissible"
> and "beneficial" instead, or other sort-of-synonyms yet).

It's different words in Germany anyway :)

Quote:
> Many
> programmers believe that Java, or C++, or Visual Basic, ARE that
> "Jack of All Trades" right here and now -- they do all their work
> in that one language, and purport to be happy with it.  Others,
> no doubt, believe the same of less-widespread languages, be
> they Eiffel, Dylan, some ML variant, Python, Perl, Haskell, Clean,
> Erlang, Scheme (or some other Lispish one), C#, or whatever
> (apologies to any One-True-Way'ist I've omitted!-).  I think that,
> in a world where different languages interoperate smoothly, such
> one-language-for-all-components solutions are probably not
> optimal for any application complex enough to comprise, say,
> several dozen components (where a component is typically about
> 100 to 1000 function points' worth of functional richness).

The problem is interfacing. Different languages put different things
into interfaces. How do you translate the in, out, and inout parameters
of Ada to C? C doesn't have that concept, so you're simply out of luck;
you run into problems when calling Ada from C and vice versa, because
that concept is untranslatable.
You can resort to assumptions like "all value parameters of C are 'in',
and all pointer parameters are 'inout'", but this will prevent many
optimizations and (occasionally) require additional thinking on the side
of the programmer.
The mapping of primitive types is even more intricate. Many languages
have only signed integers; how do you interface a C function that may
return any unsigned integer, including those that are out of range for
the signed integer?

These problems all vanish if you stay within a single language. You can
minimize the amount of data that needs to be passed between different
languages in an application, but that's a restriction that prevents the
easy mix-and-match that "the right tool for the right job" advocates
strive for.

Quote:
> > I see four large application areas where a language
> > must be good in to be that:
> > 1. Low-level stuff. Twiddling hardware registers etc. (for device
> > drivers and such).
> > 2. Algorithmically complex stuff that still must be efficient.
> > Raytracers, FFT, sparse matrixes etc.
> > 3. Glueing components together (scripting, if you want).

> Many components will be mainly orchestrating the work of
> other components, as well as, possibly, adding in some
> piece of their own.

That's what I meant.
Maybe "pasting" is a better word?

Quote:
> You could call that "scripting", but it's
> a bit too wide an acceptation of "scripting" for my taste.
> Sure, there is a need to script/glue things purely and
> simply too, which is "scripting"'s main meaning.

> > 4. Real-time programming.

> Sure, there is that.  In both hard and soft varieties.

> We have _not_ exhausted the important fields with these
> four points, of course.  Database access is noticeably out
> of the list, as is the implementation of most layers in a
> communication protocol stack, lexing/parsing/string&text
> processing, logical/inferential stuff, etc.

These are all in one or the other category, or a mixture of the above
categories.
Umm, lemme see:
Database access - pasting, straight and pure.
Communication protocol - low-level, algorithms, and some degree of
pasting (this mixture may be the reason why communication drivers are
generally considered "hard").
Lexing/parsing/string&text - algorithmic stuff.
Logical/inferential stuff - algorithmic as well.


Quote:
> > Now when I look at

...




Sat, 08 Mar 2003 03:00:00 GMT  

Quote:

> ...
>The problem is interfacing. Different languages put different things
>into interfaces. How do you translate the in, out, and inout parameters
>of Ada to C? C doesn't have that concept, so you're simply out of luck;
>you run into problems when calling Ada from C and vice versa, because
>that concept is untranslatable.
>You can resort to assumptions like "all value parameters of C are 'in',
>and all pointer parameters are 'inout'", but this will prevent many
>optimizations and (occasionally) require additional thinking on the side
>of the programmer.

Well, I don't know Ada, but aren't inout parameters more like passing
by value/result, in which case passing by reference would only work
with extra action by the callee (which could easily be generated by
a compiler)?

Quote:
> ...

hs


Sat, 08 Mar 2003 03:00:00 GMT  

Quote:


> > ...
> >The problem is interfacing. Different languages put different things
> >into interfaces. How do you translate the in, out, and inout
> >parameters of Ada to C? C doesn't have that concept, so you're simply
> >out of luck; you run into problems when calling Ada from C and vice
> >versa, because that concept is untranslatable.
> >You can resort to assumptions like "all value parameters of C are
> >'in', and all pointer parameters are 'inout'", but this will prevent
> >many optimizations and (occasionally) require additional thinking on
> >the side of the programmer.

> Well, i don't know ada, but aren't inout parameters more like passing
> by value / result, in which case passing by reference would only work
> with extra action by the callee (which could easily be generated by
> a compiler)?

Well, yes, but it still prevents an Ada-calling-C wrapper generator from
automatically generating reliable bindings.
For example, a C routine may have a char* parameter - is that an out
parameter for a char? Or should it be regarded as an inout parameter? Or
even a pointer to an array of characters, some of them in, others out,
yet others inout?

And this is just when interfacing Ada and C. Other combinations present
similar problems:
* Strings: heap-managed in Basic, character arrays in most other
languages. (And thanks to Billyboy, Basic strings are still important :-( .)
* Arrays: C arrays start indexing at zero, Pascal arrays at any index,
Basic arrays at zero or one (depending on a global switch - well,
depending on Basic dialect), most 4GLs start at one.
* Parameter passing conventions. Most languages have a subset of:
call-by-name, call-by-reference, call-by-value-return, and call-by-need
(almost no language has all four). Making sure that the right
translation is chosen often requires an inspection of the semantics of
the called routine, not something an automated tool can do.
* Calling conventions in general. Try to interface Prolog's backtracking
with the subroutine calls of nearly all other languages. Or lazy
evaluation with any imperative language.
* Try to map enums (as in Pascal or C++) to the concepts of any other
language that doesn't have enums.
* Try to map objects (unions of function and data) to any non-OO
language. (It's not a nice sight.)

You can work around most of these problems, but they create seams in
your application. You cannot easily reverse the decision about what's written
in which language; you not only have to rewrite the code (which would be
a minor problem - if you decide to switch to another language you have
already seen that this will be a major win), you'll have to undo all the
interface-adaptation stuff at the old seam and introduce it at the new
seam.
Besides, you lose information (mostly type information) at the seam,
creating holes in the type system. I don't have to tell you that this is
a potential source for serious trouble. (In an untyped language, the
situation is even worse: types are present but implicit, and the
information is lost anyway...)

One concrete example. At work, I'm currently wrapping C libraries (more
precisely: aspects of the Windows API) in Eiffel. Eiffel doesn't have
enums, but the Windows API abounds with definitions like
  #define WM_CREATE                       0x0001
  #define WM_DESTROY                      0x0002
  #define WM_MOVE                         0x0003
  #define WM_SIZE                         0x0005
  #define WM_ACTIVATE                     0x0006
I can easily wrap each of these symbols using an 'external' clause, but
the Eiffel compiler no longer knows that these are simply integer
constants, so I can't use them as literal values in an 'inspect'
statement (this would be a 'case' in Pascal, a 'switch' in C).
Annoying, yes, and I could always write a series of if...elsif...
statements, but this is not the way that Eiffel was designed to be used,
weakening the power of the language for me.

Similar problems arise whenever you combine different languages. Calling
is easy enough (unless you interface with Prolog or a lazy language),
but passing data between worlds is awkward and cumbersome (and, if you
don't have tools, dangerous: neither compiler will be able to warn you
if you get something wrong).

I've seen IDL announced as the solution to such problems. I doubt that
IDL is universal enough to cover call-by-need, objects, and lazy
evaluation, but I'll let anybody with a better knowledge of IDL convince
me.
(At least the Microsoftish variant of IDL has 156 keywords - this sounds
like a repetition of the "heat death" syndrome that plagues the
Universal VM approaches. It's a similar problem: find an intermediate
format that allows expressing the semantics of all programming
languages, be it as a VM or as a decorated tree. The last serious
attempt at that was TenDRA, and it has failed, like so many others
before. Look for "UNCOL" in the comp.compilers archive at
http://compilers.iecc.com/comparch/compsearch.)

Regards,
Joachim
--
This is not an official statement from my employer or from NICE.



Sun, 09 Mar 2003 03:00:00 GMT  
 