algorithmic efficiency and lazy languages

I have always been under the impression that lazy evaluation
makes it difficult to determine the time and space efficiency
of programs.

If this is true, then it is a strong practical argument against using
lazy evaluation, since asymptotic efficiency is quite important in
practice.   An O(n^2) algorithm may be unusable in practice, while
an O(n log n) algorithm may do quite nicely.
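To make the practical stakes concrete, here is a hedged sketch in Haskell (my own illustration, not part of the original post; the names isort and msort are mine): insertion sort does O(n^2) comparisons while merge sort does O(n log n), so at n = 10^6 that is roughly 10^12 steps against a few times 10^7.

```haskell
-- Illustrative only: an O(n^2) sort next to an O(n log n) sort.

-- Insertion sort: each insertion walks O(n) cells, O(n^2) overall.
isort :: Ord a => [a] -> [a]
isort = foldr ins []
  where
    ins x []                 = [x]
    ins x (y:ys) | x <= y    = x : y : ys
                 | otherwise = y : ins x ys

-- Merge sort: halve, sort recursively, merge; O(n log n) overall.
msort :: Ord a => [a] -> [a]
msort []  = []
msort [x] = [x]
msort xs  = merge (msort front) (msort back)
  where
    (front, back) = splitAt (length xs `div` 2) xs
    merge [] bs = bs
    merge as [] = as
    merge (a:as) (b:bs)
      | a <= b    = a : merge as (b:bs)
      | otherwise = b : merge (a:as) bs
```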

As a side note, it is sufficient to base the argument on asymptotic
space complexity alone, since space complexity also affects the
asymptotic time complexity of garbage collection.

It seems hard to argue that referential transparency is more
important to language design than the ability to predict the
algorithmic behavior of one's programs, given the fact that
a vast section of computer science is devoted to the study
of algorithms.

-- David Tarditi

Sat, 07 Jan 1995 15:35:49 GMT
algorithmic efficiency and lazy languages

Quote:

>I have always been under the impression that lazy evaluation
>makes it difficult to determine the time and space efficiency
>of programs.

>If this is true, then it is a strong practical argument against using
>lazy evaluation, since asymptotic efficiency is quite important in
>practice.   An O(n^2) algorithm may be unusable in practice, while
>an O(n log n) algorithm may do quite nicely.

I think it is only half true.  There is no special difficulty in
reasoning about the *time* complexity of programs written in a
non-strict functional language.  One can derive from the program
equations for functions fi, say, similar equations for Tfi, the
time-to-execute of function fi.  There is quite a nice discussion of
this, with some examples, in chapter 6 of Bird and Wadler's
"Introduction to Functional Programming".
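To make this concrete, here is a hedged sketch in Haskell (the names rev, tApp, tRev are mine, not Turner's or B+W's; the derivation style follows B+W's chapter 6): each defining equation of naive reverse yields a companion time equation, charging one step per equation applied and n+1 steps for xs ++ ys when length xs = n.

```haskell
-- Naive reverse, the standard B+W example.
rev :: [a] -> [a]
rev []     = []
rev (x:xs) = rev xs ++ [x]

-- Companion time equations, assuming the result is fully evaluated.
tApp :: Int -> Int          -- cost of xs ++ ys, with n = length xs
tApp n = n + 1

tRev :: Int -> Int          -- cost of rev xs, with n = length xs
tRev 0 = 1
tRev n = tRev (n - 1) + tApp (n - 1) + 1

-- Solving the recurrence gives tRev n = (n^2 + 3n + 2) / 2, i.e. O(n^2).
```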

On the other hand, reasoning about the *space* complexity of such
programs seems to be much more difficult.

David Turner

Mon, 09 Jan 1995 23:36:00 GMT
algorithmic efficiency and lazy languages

Quote:

> There is no special difficulty in
>reasoning about the *time* complexity of programs written in a
>non-strict functional language.  One can derive from the program
>equations for functions fi, say, similar equations for Tfi, the
>time-to-execute of function fi.

It seems to me that this is insufficient.  It basically tells us that it's
possible to write accurate profilers.  The real question is:  Are the Tfi
sufficiently simple for real programs that I can understand what's going
on, and use them to drive my algorithm design?

Quote:
>There is quite a nice discussion of
>this, with some examples, in chapter 6 of Bird and Wadler's
>"Introduction to Functional Programming".

>On the other hand reasoning about the *space* complexity of such
>programs seems to be much more difficult.
>David Turner

Hans-J. Boehm

Tue, 10 Jan 1995 01:23:29 GMT
algorithmic efficiency and lazy languages

Quote:

> >I have always been under the impression that lazy evaluation
> >makes it difficult to determine the time and space efficiency
> >of programs.
> >If this is true, then it is a strong practical argument against using
> >lazy evaluation, since asymptotic efficiency is quite important in
> >practice.  An O(n^2) algorithm may be unusable in practice, while
> >an O(n log n) algorithm may do quite nicely.
> I think it is only half true.  There is no special difficulty in
> reasoning about the *time* complexity of programs written in a
> non-strict functional language.  One can derive from the program
> equations for functions fi, say, similar equations for Tfi, the
> time-to-execute of function fi.  There is quite a nice discussion of
> this, with some examples, in chapter 6 of Bird and Wadler's
> "Introduction to Functional Programming".
> On the other hand reasoning about the *space* complexity of such
> programs seems to be much more difficult.

If it is difficult to determine the space complexity of these programs, I
wonder why it is easy to determine their time complexity.  Allocating and
initializing space takes time proportional to the space used (in the best
case -- assuming the space is never reused).  Explicit freeing
takes time proportional to the space freed (again in the best case -- not
much compaction needed), and garbage collection needs time proportional to
the amount of live data (best case -- mark and sweep needs time proportional
to all the space available to the program).

anurag

Tue, 10 Jan 1995 03:52:06 GMT
algorithmic efficiency and lazy languages

Quote:

>... garbage collection needs time proportional to the
>amount of live data (best case -- mark and sweep needs time proportional
>to all the space available to the program).

(I hate to start this thread again, especially since it's irrelevant to
the previous discussion, but ...)
As far as I know, the above statement about mark-sweep collection is
only meaningful and correct if:

1. Allocation and initialization time is not considered,
2. Concurrent or incremental collection is not considered.  (Otherwise it
becomes hard and pointless to distinguish the collector and the allocator.)
3. The M/S collector is not allowed to defer sweeping to allocation
time, e.g. by using a bit map representation of the free list.
4. The collector does not attempt to maintain the heap size as a fixed
multiple of the amount of live data (otherwise there's no asymptotic
difference between the two).

(1) seems rather arbitrary.  There are great arguments for violating (2)
through (4), and I don't know of any good arguments for not implementing
at least (3) (and (4) under a real OS).

Thus the only correct statement one can make about the asymptotic
complexity of mark/sweep collection is:

It's possible to implement mark-sweep collection so badly that garbage
collector pause time is proportional to all space available to the
program.

Of course, there are other arguments why mark-sweep collection might be
preferable to e.g. copying collection under the right circumstances.  But I
think the asymptotic complexity argument is 95% bogus.

Hans-J. Boehm

Standard disclaimer ...

Tue, 10 Jan 1995 05:03:13 GMT
algorithmic efficiency and lazy languages

Quote:

> There is no special difficulty in
>reasoning about the *time* complexity of programs written in a
>non-strict functional language.  One can derive from the program
>equations for functions fi, say, similar equations for Tfi, the
>time-to-execute of function fi.  There is quite a nice discussion of
>this, with some examples, in chapter 6 of Bird and Wadler's
>"Introduction to Functional Programming".

I think there is some "special difficulty" in reasoning about the time
complexity of non-strict programs, and that B+W have chosen to
de-emphasise these problems.  The "similar equations for Tfi" given by B+W are
entirely context sensitive.  For example, the statement (Ch 6, p138) that
T_reverse = O(n^2)
(the time to compute the naive reversal of a list is order n-squared)
depends entirely on the assumption that the entire list is computed.
For example, given a definition of the form
f3 x = f1 (f2 x)
with your/B&W's "similar equations" for Tf1 and Tf2, it is *not* the case that
Tf3 x = (Tf2 x) + (Tf1 (f2 x))
because of
(i) lazy evaluation, which makes the cost of evaluating f2 dependent
on its consumer f1, and
(ii) higher order functions, which can make the time to evaluate f1
dependent on more than just the (extensional) value of (f2 x), in the case
that (f2 x) is a function (and similarly for f2 if x is of function type).

As a simple example (of (i)), take f2 = reverse and f1 = head. It should not be
too hard to see that the time to evaluate f3 is linear in the length
of its argument.
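A complementary hedged illustration of point (i), in Haskell (my example, not from the post): take f2 = map (+1) and f1 = head.  The consumer head forces only the first cons cell, so the producer's cost collapses to O(1), and even an undefined tail is never touched.

```haskell
-- The consumer determines how much of the producer's work is done:
-- head forces only the first cons cell, so map applies (+1) once
-- and the undefined tail of the input list is never evaluated.
firstMapped :: Int
firstMapped = head (map (+ 1) (1 : undefined))
-- Under strict evaluation this would diverge; under lazy evaluation
-- it costs O(1), so no context-free compositional rule of the form
--   Tf3 x = Tf2 x + Tf1 (f2 x)
-- can be sound.
```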

Nonetheless, I still agree with the second point:

Quote:
>On the other  hand  reasoning  about  the  *space*  complexity  of  such
>programs seems to be much more difficult.

Dave.

----------------------------------------------------------

For anyone who might be interested, issues (i) and (ii) are considered
in some detail in my thesis. Here are some other related references,
some of which were mentioned recently in this group.

(My stuff is available by anonymous ftp from directory papers/Sands
at theory.doc.ic.ac.uk)

@phdthesis{bjerner89,
  author = "B. Bjerner",
  title  = "Time Complexity of Programs in Type Theory",
  school = "Chalmers University of Technology",
  year   = 1989
}

@inproceedings{bjerner-holmstrom89,
  author    = {B. Bjerner and S. Holmstr\"om},
  title     = "A compositional approach to time analysis of first order lazy
               functional programs",
  booktitle = "Functional {P}rogramming {L}anguages and {C}omputer
               {A}rchitecture, conference proceedings",
  publisher = {ACM Press},
  year      = 1989
}

@inproceedings{lemetayer88,
  author    = "{LeM\'etayer}, D.",
  title     = "Analysis of Functional Programs by Program Transformation",
  booktitle = "Second France--Japan Artificial Intelligence and Computer
               Science Symposium",
  publisher = "North-Holland",
  year      = 1988
}

@inproceedings{sands89,
  author    = "D. Sands",
  title     = "Complexity Analysis for a Lazy Higher-Order Language",
  booktitle = {Proceedings of the Glasgow Workshop on Functional Programming},
  publisher = {Springer-Verlag},
  series    = {Workshop Series},
  month     = {August},
  year      = 1989
}

@phdthesis{sands90,
  author = "D. Sands",
  title  = "Calculi for Time Analysis of Functional Programs",
  school = "Imperial College",
  month  = "September",
  year   = 1990
}

@inproceedings{sands91,
  author    = "D. Sands",
  title     = "Time Analysis, Cost Equivalence and Program Refinement",
  booktitle = "Proceedings of the Eleventh Conference on Foundations
               of Software Technology and Theoretical Computer Science",
  series    = "{LNCS}",
  number    = "560",
  publisher = "Springer-Verlag",
  pages     = "25--39",
  month     = "December",
  year      = 1991
}

@inproceedings{wadler88,
  author    = "P. Wadler",
  title     = "Strictness Analysis Aids Time Analysis",
  booktitle = "15th ACM Symposium on Principles of Programming Languages",
  month     = "January",
  year      = 1988
}

--

Tue, 10 Jan 1995 19:00:12 GMT
algorithmic efficiency and lazy languages

Quote:
> There is no special difficulty in
>reasoning about the *time* complexity of programs written in a
>non-strict functional language.  One can derive from the program
>equations for functions fi, say, similar equations for Tfi, the
>time-to-execute of function fi.

Quote:
>It seems to me that this is insufficient.  It basically tells us that it's
>possible to write accurate profilers.

Sorry, I should have been more explicit.

I meant that you could often *solve* the equations (making enough
simplifying assumptions, as one always does in asymptotic complexity
analysis) to arrive at a formula for the asymptotic complexity of each
function of interest.  As mentioned, see chapter 6 of B+W for some
examples.

-------------------

To add a further note on why space complexity is much less tractable:
it is RESIDENCY we are typically interested in, rather than just the
number of times `cons' gets called; the latter is clearly related to the
time analysis, as has already been commented.
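To illustrate the residency point with a hedged example (the function names are mine, not Turner's): both folds below perform the same number of additions over the whole run, yet the lazy foldl builds an O(n) chain of suspended additions before anything is demanded, while the strict foldl' keeps residency O(1); a count of allocations alone cannot see the difference.

```haskell
import Data.List (foldl')

-- Lazy left fold: the accumulator is a growing chain of thunks
-- (((0 + 1) + 2) + ...), so peak residency is O(n).
lazySum :: [Int] -> Int
lazySum = foldl (+) 0

-- Strict left fold: the accumulator is forced at each step,
-- so peak residency is O(1).
strictSum :: [Int] -> Int
strictSum = foldl' (+) 0

-- Both return the same value; only the space behaviour
-- (residency) differs.
```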

David Turner

Tue, 10 Jan 1995 22:36:27 GMT
