comments on comments 
 comments on comments

Quote:


>>As for bottom-up vs. top-down, Meyer explains this very well in his
>>book, and it's certainly applicable to the recent discussion.

>Could you elaborate on this, does he advocate top-down, bottom-up, ???
>I haven't read Meyer's book.  

Meyer advocates neither, although his strategy (which he terms simply
"object-oriented design") is closer to bottom-up than to top-down.  His
claim is that the best way to obtain reusability, maintainability, and
reliability (which should all result from good software engineering
technique) is to focus design on the data objects that the program is
manipulating.  The resulting design can often be very
obvious, as it isn't too hard to find the objects in most applications.
Object-oriented languages then package the objects and their internal
representation together with the routines that operate on them, and
thus separate interface from implementation.  A good object-oriented
system quite often will consist largely of software components that
have been taken from older projects, which is not really a possibility
in either the top-down or the bottom-up model.  If the map of a top-
down system is a tree, with the principal function at the top, then
an object-oriented system is a net, with each object providing
services to other objects, with no clear hierarchical arrangement;
this enables the system to be locally modified without global effects.
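For concreteness, here is a tiny sketch of that packaging idea -- my own
example in Ada rather than Meyer's Eiffel, since this thread also touches
comp.lang.ada, and with names I just made up.  The spec is all a client
ever sees; the representation and the bodies stay hidden:

-- Sketch only; an invented example, not from Meyer's book.
package Accounts is
   type Account is private;                        -- clients see the type...
   procedure Deposit (A : in out Account; Amount : in Float);
   function  Balance (A : Account) return Float;   -- ...and its operations
private
   type Account is record
      Total : Float := 0.0;                        -- representation, hidden
   end record;
end Accounts;

package body Accounts is                           -- implementation, hidden
   procedure Deposit (A : in out Account; Amount : in Float) is
   begin
      A.Total := A.Total + Amount;
   end Deposit;

   function Balance (A : Account) return Float is
   begin
      return A.Total;
   end Balance;
end Accounts;

Ada packages give you this interface/implementation separation; what they
don't give you is the inheritance Meyer also asks for, which is what much
of the rest of this thread is about.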

That's a very brief summary of Meyer's arguments.  He goes into more
detail, with examples, in the book.

To respond briefly to Bill Wolfe's comments about Eiffel being flaky:
I can't say how good a language Eiffel is in the real world.  However,
Meyer uses it very effectively as an exemplar of a good object-oriented
programming language, and at every step throughout the book he discusses
the design decisions he made in creating Eiffel, and what the
implications are for object-oriented programs.  He discusses the pros and
cons of Ada, Smalltalk, etc., etc. at length, and he talks about the
essential features of any "true" (in his definition) object-oriented
language: multiple inheritance, genericity, polymorphism, dynamic
binding, etc.  Even if you don't agree with his decisions, his
exposition is very clear and (in my opinion) well-written.

(You can probably tell I like the book!)

Quote:
>Bob Hathaway


Rob Jellinghaus                | "Next time you see a lie being spread or a


{everyone}!decvax!yale!robertj |     -- K. Eric Drexler, _Engines of Creation_


Mon, 20 Jul 1992 04:55:00 GMT  
 comments on comments

Quote:

>As for bottom-up vs. top-down, Meyer explains this very well in his
>book, and it's certainly applicable to the recent discussion.


Quote:
> Could you elaborate on this, does he advocate top-down, bottom-up, ???
> I haven't read Meyer's book.  

The book is a GOOD read, and my summary can't hope to do it justice, but:
Meyer makes a good case that your current programming assignment is NOT
the "real" objective, since the goals and requirements of that program
will change, and something else (the same but with *one more feature*; or
a different program that deals with many of the same data) will be needed
"soon."  So: don't start by modeling "function" at all, start with modeling
the objects in the problem domain.  The objective isn't "new payroll
system," it's "understand your enterprise," with a *current* emphasis
on modeling employees, benefits, salaries, bonuses, tax situations, etc.
Meyer recommends deferring the functional objective as long as possible.

What emerges is more like bottom-up than top-down, but with important
differences; to me, bottom-up always meant "build the utility libraries
first."  Once again, emphasis on function.  Modeling "employee" requires
deciding what data and operations capture the notion of employee.  And you
can defer things, much as you do in -- hmm -- top down design!  Example:
all employees get paid, but the algorithm is different for hourly v. salaried
employees.  OK, declare something you do to an employee -- "Pay" -- and defer
the definition.  Special cases of employee will instantiate the deferred
method in different ways.
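Meyer writes that in Eiffel with a deferred routine, and I won't try to
reproduce his code from memory.  Just to make the hourly-vs.-salaried split
concrete, here is (with invented names) the sort of thing you write *without*
deferred routines -- a variant record plus a central case statement, which is
exactly the dispatching that deferred routines let the special cases absorb
instead:

-- Sketch only, names invented; NOT Meyer's deferred-routine version.
package Employees is
   type Pay_Scheme is (Hourly, Salaried);

   type Employee (Scheme : Pay_Scheme := Hourly) is record
      case Scheme is
         when Hourly   => Rate   : Float;
                          Hours  : Float;
         when Salaried => Salary : Float;        -- annual salary
      end case;
   end record;

   -- "Something you do to an employee":
   function Pay (E : Employee) return Float;
end Employees;

package body Employees is
   function Pay (E : Employee) return Float is
   begin
      case E.Scheme is               -- every new kind of employee means
         when Hourly   =>            -- coming back and editing this case
            return E.Rate * E.Hours;
         when Salaried =>
            return E.Salary / 12.0;
      end case;
   end Pay;
end Employees;

With a deferred Pay, the case statement goes away: each special case of
employee supplies its own version, and adding a new kind of employee touches
no existing code.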

The presumed benefits -- and again, I haven't made the case, but Meyer is
quite convincing -- are that you emerge with a model of relevant entities in
your environment, not "merely" a program.  Thus the next program is already
well on the way to being done: add more classes as needed, more methods as
needed, and there it is.  Here again, Meyer makes a strong case that already-
defined classes should be both *open* and *closed* with respect to
modification.  "Open" is accomplished by letting older classes be members of
or ancestors of new classes; in particular, inheritance allows you to redefine
just those features of the new class that you need to change.  "Closed" is
just that a well-tested class can be cast in concrete: you needn't mess with
it, because you can modify by inheriting and overriding.

He also makes a rather strong case for multiple inheritance, which is
lacking in most O-O languages (but is in Eiffel and C++ 2.0).  I don't have
the slightest intention of starting a flame war by trying to summarize
that issue here: but go read Meyer's treatment.

Enough.  Go get the book, it may change your life.

=Ned Horvath=



Mon, 20 Jul 1992 06:28:00 GMT  
 comments on comments

Quote:


>> What I like to stress is: Ada is *not* an object-oriented language.

>    I quote from ACM SIGADA Ada Letters, March/April 1989, Volume IX,
>    Number 2, Page 10:

>      Software Productivity Solutions, Inc., has announced a new product,
>      Classic-Ada (tm), which allows Ada software developers to use
>      inheritance and dynamic binding in object-oriented Ada designs.

You mean that they sell a *super-set* of ADA?

Mild sarcasm on:
This must be what we all have been waiting for !
Mild sarcasm off.

ADA is too much of an elephant already. I totally agree that
inheritance is good but ADA would have to be pruned a lot
before something new should be added to it.

  mikael
--

.........
You are in error. 2+2=5
Thank you for your cooperation. The Computer.



Mon, 20 Jul 1992 11:23:00 GMT  
 comments on comments
Bill,


Quote:
>    I stand by my statement regarding Eiffel's flakiness.  Obviously
>    it's just my opinion, and others may differ, but I think it's a
>    very well-founded opinion.

Can you please elaborate on that? I'm sure there are quite a few people
around who want to know in what ways Eiffel is flaky.

--
Robert Claeson, ERBE DATA AB, P.O. Box 77, S-175 22 Jarfalla, Sweden
Tel: +46 (0)758-202 50  Fax: +46 (0)758-197 20




Mon, 20 Jul 1992 12:22:00 GMT  
 Good Design Strategies <Was comments on comments>

Quote:

> We agree the components are desirable but a modular design should get you
> there quickly too.

I'm not willing to give over the term "modular design" to the piecewise
refinement school. I design modules (or "classes", or "Adts", or
"components" or whatever).   Sometimes, but seldom, I actually do use
piecewise refinement as a design or implementation technique.
The modules are the "what". The piecewise refinement is the "how".
If you get good modules from piecewise refinement, good for you! I've
seldom seen anybody else pull it off, but that's no reason to presume
that it doesn't work for you.  My assertion -- must I repeat it again? --
is that good modules are more likely to come from a bottom-up approach.
Designing the top levels first is just too darn risky for my tastes.

Quote:
> By breaking the system into smaller pieces with each
> piece providing some well defined and more manageable part of the system,
> the components should emerge.

Components might emerge. But how they emerge may be dictated by the
top level.  That's what I don't want.  The top level might have to change.
Why not just begin by designing components?  That way, they don't have
to "emerge".  They are part and parcel of the design itself.

Quote:
> Not only for the objects, but for the software tools as well.

I've never seen it happen that way, but then I've only been doing this
stuff for eighteen years.  Maybe next week. :-)

...

I'm going to get out of the discussion for a while.  I've been spending
too much time on this as it is.  Let me just say one thing: I've known
some people -- and be assured that I do not intend to suggest that
Mr. Hathaway is in this camp -- who have preached piecewise refinement
as a dogma for so long that it would take something akin to a religious
conversion to get them even to consider other techniques.  I am trying
to point out that it is just that: a technique.  The product is the thing.
We are sometimes free to choose the best methods on a product-by-product
basis. Which method is best depends on the job at hand. Top-down design
and piecewise refinement is, in my opinion, seldom the best technique to
use.



Mon, 20 Jul 1992 22:13:00 GMT  
 Good Design Strategies <Was comments on comments>

Quote:
> A good object-oriented
> system quite often will consist largely of software components that
> have been taken from older projects, which is not really a possibility
> in either the top-down or the bottom-up model.  If the map of a top-
> down system is a tree, with the principal function at the top, then
> an object-oriented system is a net, with each object providing
> services to other objects, with no clear hierarchical arrangement;
> this enables the system to be locally modified without global effects.

Point of semantics well taken.  Perhaps I should not be using the term
"bottom-up".  I guess if you get into the topology of it, inheritance
of structure (not reference) yields a directed acyclic graph which
may have lots of local bottoms and tops.  But if I said, "infimum-supremum"
programming, who would know what I was talking about?

                              Dave J.



Mon, 20 Jul 1992 22:35:00 GMT  
 Good Design Strategies <Was comments on comments>

   [Due to the Followup-To, this article didn't make it to
    comp.lang.ada, although it obviously should have... Bill]


Quote:

>>      Software Productivity Solutions, Inc., has announced a new product,
>>      Classic-Ada (tm), which allows Ada software developers to use
>>      inheritance and dynamic binding in object-oriented Ada designs.

>You mean that they sell a *super-set* of ADA?

>Mild sarcasm on:
>This must be what we all have been waiting for !
>Mild sarcasm off.

>ADA is too much of an elephant already. I totally agree that
>inheritance is good but ADA would have to be pruned a lot
>before something new should be added to it.

Classic-Ada is *NOT* a super-set of Ada.  Harris has also developed a
tool, called InnovAda, for true object-oriented programming in Ada
(multiple-inheritance, dynamic binding, run-time message evaluation,
and so on).  Such tools generate compilable Ada code; if you wanted to,
you could have created the same code generated by the tool.  InnovAda
and similar tools are preprocessors which enhance productivity by giving
the designer/programmer a powerful development and maintenance environment.
There are graphic design tools on the market which do some amount of
code generation; would you consider those to be supersets of Ada?

InnovAda and other object-oriented preprocessors still provide one with
all the nice features of Ada, plus they supplement the language with the
advantages of true object-oriented design.  I have found these capabilities
to be indispensable for several problem domains, particularly when one
tries to implement Artificial Intelligence in Ada.

Richard P. Simonian     407/984-6006
Natural Language Processing Group
Harris GISD, Melbourne, FL  32902            



Mon, 17 Aug 1992 22:25:00 GMT  
 Good Design Strategies <Was comments on comments>

Quote:

>...
>InnovAda and other object-oriented preprocessors still provide one with
>all the nice features of Ada, plus they supplement the language with the
>advantages of true object-oriented design. ...

>Richard P. Simonian     407/984-6006

Lex and Yacc generate wonderful C code as well -- but the implementation
has to be judged by where it is maintained.

Classic Ada and InnovAda, by that measure, ARE languages in their own
right, and ARE supersets of Ada (legal Ada is legal Classic Ada).

On the other hand, C originally did not include its current
preprocessor, so the language definition changed to include it.

Whatever language you use, it should be fun and useful.
--
Michael Schwartz
ncar!dinl!schwartz



Mon, 17 Aug 1992 22:14:00 GMT  
 Good Design Strategies <Was comments on comments>

Quote:

>...My assertion -- must I repeat it again? --
>is that good modules are more likely to come from a bottom-up approach.

Ok, I'll admit you've made a good point although I still think starting at
the top is the natural way to design systems (and algorithms).

Quote:
>> By breaking the system into smaller pieces with each
>> piece providing some well defined and more manageable part of the system,
>> the components should emerge.

>Components might emerge. But how they emerge may be dictated by the
>...

Ok, how about a quick example starting at the top level.  When designing
a "Classical" compiler/interpreter front-end, I know the structure will
consist of a scanner to read input and a parser to recognize the grammar.
I'll assume the parser functions return an intermediate code Adt for
simplicity.  Now, I know I'll need a symbol table for identifiers and
reserved words and a token type to return to the parser, and we have:

|-------|       |------|     |------------|
|scanner|------>|parser|---->|intermediate|
|-------|       |------|     |    code    |
      \          /           |------------|
       \        /
        |-------|
        |symbol |
        |table  |
        |-------|

There are four top-level modules immediately identified.
1. Symbol Table
    Provides a symbol_table_entry Adt (encapsulates a type).
    <Description>
    Operations
        a. lookup/insert
           <Description>
        ...
2. Scanner
    Provides TokenType Adt (encapsulates a type).
    <Description>
    Operations
        a. Next_Token
           <Description>
        b. Match_Token
           <Description>
        ...
3. Intermediate Code
    Provides Intermediate_Code Adt (encapsulates a type)
    <Description>
    Operations
        a. Make_Node
           <Description>
        b. Make_Child
           <Description>
        ...
4. Parser
    Parses input (encapsulates functionally related subprograms).
    <Description>
    Operations
        a. Statement
                Returns an intermediate code representation of the
                parsed statement...
        b. Parse
                Go for the whole thing:-)
        ...
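To show the modules really are direct to write down, here is one possible
Ada spec for module 1 -- purely illustrative, the operation names and
parameter profiles are my guesses, not part of the design:

-- One possible spec for the Symbol Table module; profiles are guesses.
package Symbol_Table is
   type Entry_Kind is (Identifier, Reserved_Word);
   type Table_Entry is private;            -- the symbol_table_entry Adt

   procedure Insert (Name   : in  String;
                     Kind   : in  Entry_Kind;
                     Result : out Table_Entry);
   function Lookup  (Name : String) return Table_Entry;
   function Kind_Of (E : Table_Entry) return Entry_Kind;
private
   type Table_Entry is record              -- internal design comes later;
      Kind : Entry_Kind := Identifier;     -- this representation is only a
      Slot : Natural    := 0;              -- placeholder (index into a table)
   end record;
end Symbol_Table;

The body comes later, when we turn to the internal design; the rest of the
system can be compiled against this spec in the meantime.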

Thus, a modular design with Adts.  The design can proceed by identifying
the return values and parameters to operations and possibly some high level
pseudo code to ensure the problem is well understood and all parts of the
system will work together as expected.  To begin the implementation we
analyze the dependencies; almost always the Adts come last.
Dependencies
  parser:             scanner, symbol table, intermediate code
  scanner:            symbol table
  symbol table:       nothing
  intermediate code:  nothing
We can begin implementation with the intermediate code and symbol table
because we know what they need to provide and how they fit into the system
and because they don't depend on unwritten code.  Yes, the token,
symbol table, and intermediate code Adts won't change much and
are good candidates for reuse; I wouldn't expect to add more than one
or two operations and change more than a few parameters for each Adt
above after the initial design is complete.  Each module provides a
well defined interface and encapsulates either a type (Adt) or a
set of functionally related subprograms (the parser).  After the top level
design is complete we can turn to the internal design of the Adts.
For example, the intermediate code Adt may be complex, requiring recursive
application of the above technique, i.e. top-down.  So a generic tree
Adt from a library can be used, and so on; nothing special purpose.
The parser functions can be designed top-down and implemented bottom-up,
so we'll know what grammatical constructs are necessary from the top and
get there fastest from the bottom (the preferred implementation technique
for parsers).  Efficiency considerations can come later *if* the system
doesn't meet its timing constraints; good design comes first.  We wouldn't
know where to optimize until profiling anyway.
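The "generic tree Adt from a library" above would be an ordinary Ada generic
along these lines -- a sketch with invented names, not an actual library
unit:

-- Sketch of a reusable library generic; names invented for illustration.
generic
   type Element is private;
package Generic_Tree is
   type Tree is private;

   function Empty return Tree;
   function Is_Empty  (T : Tree) return Boolean;
   function Make_Node (Item : Element; Left, Right : Tree) return Tree;
   function Item_Of   (T : Tree) return Element;
private
   type Node;                              -- completed in the package body
   type Tree is access Node;
end Generic_Tree;

package body Generic_Tree is
   type Node is record
      Item        : Element;
      Left, Right : Tree;
   end record;

   function Empty return Tree is
   begin
      return null;
   end Empty;

   function Is_Empty (T : Tree) return Boolean is
   begin
      return T = null;
   end Is_Empty;

   function Make_Node (Item : Element; Left, Right : Tree) return Tree is
   begin
      return new Node'(Item, Left, Right);
   end Make_Node;

   function Item_Of (T : Tree) return Element is
   begin
      return T.Item;
   end Item_Of;
end Generic_Tree;

-- Inside the Intermediate Code body one would instantiate it, for example
--   package Code_Trees is new Generic_Tree (Element => Node_Info);
-- where Node_Info (another invented name) is whatever the intermediate code
-- stores per node.

Nothing special purpose, as I said; the same generic serves any project that
needs a tree.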

Classification:
Object Oriented (design & implementation)
    Yes: objects identified, 2nd level design and coding begins with
         objects (data structures).
    No : no inheritance.
Top-Down (design)
    Yes: Top-down design is used within modules, especially in the parser.
    No : Structure charts were possible, but modules provided the best high
         level view of the system.
Bottom-up (implementation)
    Yes: started with the lowest level Adt.  Generic linked lists, tree Adts,
         etc., would be used to implement the Adts from the bottom up.
What I call this mess:
    Modular design with abstract data types, top-down design within modules.
Reuse:
    All software tools and components are conceivably replaceable
    and reusable.

Quote:
>... Which method is best depends on the job at hand. ...

I agree; the above technique was chosen after carefully considering the
problem, and I have found it suitable for most, if not all, applications.
Direct encoding of the design and implementation is possible in Ada
and to my understanding this is the preferred application of Ada's
methodology.  Flamage and comments are welcome.  In view of the recent
object-oriented discussions, can anyone comment on how inheritance,
dynamic binding, or any other object oriented techniques fit into or
could improve the above scheme?  I'll be disappointed if the modular
technique with Adts is the only one provided:-)

Bob Hathaway



Mon, 17 Aug 1992 05:45:00 GMT  
 Good Design Strategies <Was comments on comments>

        Please excuse the following flame on whether a given "tool" is
        a compiler and whether its input space defines a language.
        Prompted by the posting excerpted below:

Quote:

>/* ---------- "Re: Good Design Strategies <Was com" ---------- */
>Classic-Ada is *NOT* a super-set of Ada.  Harris has also developed a
>tool, called InnovAda, for true object-oriented programming in Ada
>(multiple-inheritance, dynamic binding, run-time message evaluation,
>and so on).  Such tools generate compilable Ada code; if you wanted to,
>you could have created the same code generated by the tool.  InnovAda
>and similar tools are preprocessors which enhance productivity by giving
>the designer/programmer a powerful development and maintenance environment.
>There are graphic design tools on the market which do some amount of
>code generation; would you consider those to be supersets of Ada?

Is "Classic Ada" (or InnovAda) an Ada superset?

a. Does using it result in SOURCE programs that aren't accepted by other
Ada compilation systems?

b. Does anyone reading the SOURCE code have to know more than Ada to
understand it?

c. Does it get confused when I put some reusable Ada components into the
SOURCE code (e.g. by complaining that "Classic-Ada" keywords are being used as
variable names)?

In the above, "SOURCE code" is defined as that which the programmer writes and
the maintainer maintains, and which the design documentation describes in
detail.  It is the SOURCE code which contains comments that map accurately
to block and variable names, types, etc.  I believe that for "Classic-Ada",
SOURCE code means the "Ada with extensions".  

Therefore, my opinion is that "Classic-Ada" is a programming language which
has all the drawbacks of any other Ada superset.  I'm sure it has
advantages as well.  Whether "Classic-Ada" is a language is not dependent
on whether it is compiled in a single step, or in two steps with Ada
generated in between.

Current Syntax Editors and Graphic Design tools are not Ada supersets, because
they are used to initially generate the Ada, which is then maintained directly.
These tools help *generate* the SOURCE code which is still pure Ada.  

If the graphics form of a program was used to describe the entire program, and
the comments and documentation refer to the graphics form, and maintenance is
done by modifying the graphics and automatically re-generating the Ada, then
graphics would be the SOURCE code, and such a CASE tool *would* be a programming
language different from Ada.  I have not heard of any such tools yet.

Mike Ryer
Employed by Intermetrics, Inc., but speaking for myself



Mon, 17 Aug 1992 17:26:00 GMT  
 Good Design Strategies <Was comments on comments>

Quote:


>>       Software Productivity Solutions, Inc., has announced a new product,
>>       Classic-Ada (tm), which allows Ada software developers to use
>>       inheritance and dynamic binding in object-oriented Ada designs.

I believe there is also a proposal for the 199x revision to Ada to
introduce package types. The package type is analogous to the C++
class. As far as I know there has been no proposal to include constructor
and destructor operators, but this certainly should be considered.


Mon, 17 Aug 1992 01:25:00 GMT  
 