C by Abstraction 
 C by Abstraction

I've been thinking about how people represent (internally) their understanding
of languages.  In particular, I've been trying to figure out how it is that
people who are "good" at a given language treat it differently than people who
are "bad" at it.

(This is sparked mostly by the difficulty of communicating to a certain
nameless individual why 'unsigned int foo = -1;' does what it does.)

Here is my current working theory:  You are best at a language when your
understanding of what will happen is derived from principles, not from
experiments.  You can learn to produce code good enough to get paid just by
imitating other code, and using trial and error to produce output, but good
code comes from using knowledge of what *must* happen in general, to derive
what will happen in a specific case.

This explains a lot about code portability.  Experimenters will produce code
by tweaking and testing until it works.  The result, unfortunately, is code
which works, not by design, but by accident.

For instance, consider the statement
        unsigned int foo = -1;

I read this as "take the value -1, convert to unsigned int, and put the result
in foo".  Now, as it happens, I don't believe I've ever owned a machine on
which the conversion had any physical form or effect.  However, knowing that
the conversion "is there" is an important part of my understanding.  Consider
the more complicated form:
        int minusone = -1;
        unsigned int foo = (unsigned int) *(&minusone);
        unsigned int bar = *((unsigned int *)&minusone);

Someone who has learned in terms of a 2's complement representation is likely
to believe these two statements equivalent, and the cast irrelevant.  Despite
my never having used a non-2's-complement system, I consider these statements
significantly different.  (And the 2nd a bad idea.)

I'm wondering how other programmers have viewed the language as they developed
experience with it, and with multiple platforms.  I still remember spending at
least an hour trying to write code to build the command "mv file.Z file" for
the benefit of system(), back when I wasn't sure what the difference between
strcat() and strcpy() was.  :)

How many of you, when you see "-1", think of this as "unary minus, with
operand 1", rather than "minus 1"?  (I don't, although I'm aware of it enough
that "-32768" and friends look suspicious.)

-s
--
Copyright 1997 Peter Seebach - seebs at solon.com - C/Unix Wizard

The *other* C FAQ, the hacker FAQ, et al.   http://www.*-*-*.com/ ~seebs
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.



Fri, 09 Jul 1999 03:00:00 GMT  
 C by Abstraction


comp.lang.c.moderated:

=> How many of you, when you see "-1", think of this as "unary minus, with
=> operand 1", rather than "minus 1"?  (I don't, although I'm aware of it enough
=> that "-32768" and friends look suspicious.)

I think of it similarly to your first choice, but that's because I've worked
with too many systems where -1 didn't work. To get it, I would have to write
0-1. Now I ALWAYS think of -1 as 0-1, even when I code it as -1.

<><><><><><><><><><><><><><><><><><><><>

Wizard's First Rule:
    People are stupid.
            - Terry Goodkind

<><><><><><><><><><><><><><><><><><><><>



Fri, 09 Jul 1999 03:00:00 GMT  
 C by Abstraction

Quote:

>I've been thinking about how people represent (internally) their understanding
>of languages.  In particular, I've been trying to figure out how it is that
>people who are "good" at a given language treat it differently than people who
>are "bad" at it.
>(This is sparked mostly by the difficulty of communicating to a certain
>nameless individual why 'unsigned int foo = -1;' does what it does.)
>Here is my current working theory:  You are best at a language when your
>understanding of what will happen is derived from principles, not from
>experiments.  You can learn to produce code good enough to get paid just by

     Partial disagreement.
     You are best at a language when your understanding of what will
happen is derived from principles AND experiments, when you have read
the theory and have worked with enough examples that you know with
certainty what will happen.  Being able to regurgitate the rules isn't
enough.

Quote:
>imitating other code, and using trial and error to produce output, but good
>code comes from using knowledge of what *must* happen in general, to derive
>what will happen in a specific case.

     Yup!

Quote:
>This explains a lot about code portability.  Experimenters will produce code

                                              ^^^^^^^^^^^^^
     You misspelled "Guessers".

Quote:
>by tweaking and testing until it works.  The result, unfortunately, is code
>which works, not by design, but by accident.

     That approach is fine as long as you then work back to the
principle.  If you don't, I hope you like Italian (spaghetti) because
guesser code is spaghetti even if it is supposedly structured.

[snip]

Sincerely,

Gene Wirchenko

C Pronunciation Guide:
     y=x++;     "wye equals ex plus plus semicolon"
     x=x++;     "ex equals ex doublecross semicolon"



Mon, 12 Jul 1999 03:00:00 GMT  
 C by Abstraction



...

Quote:
>Here is my current working theory:  You are best at a language when your
>understanding of what will happen is derived from principles, not from
>experiments.  You can learn to produce code good enough to get paid just by
>imitating other code, and using trial and error to produce output, but good
>code comes from using knowledge of what *must* happen in general, to derive
>what will happen in a specific case.

The problem is that to work from principles you have to know what those
principles are. You also have to have confidence in your knowledge. The
best way to get knowledge and confidence is to go to the source which
for C is the standard. Books like K&R are certainly good but I know my
understanding improved greatly when I made the standard itself my primary
reference for C.

Quote:
>This explains a lot about code portability.  Experimenters will produce code
>by tweaking and testing until it works.  The result, unfortunately, is code
>which works, not by design, but by accident.
>For instance, consider the statement
>        unsigned int foo = -1;

>I read this as "take the value -1, convert to unsigned int, and put the result
>in foo".  Now, as it happens, I don't believe I've ever owned a machine on
>which the conversion had any physical form or effect.  However, knowing that
>the conversion "is there" is an important part of my understanding.  Consider
>the more complicated form:

You learn about these conversions when your compiler insists on warning about
converting a negative number to unsigned! :-)

Quote:
>        int minusone = -1;
>        unsigned int foo = (unsigned int) *(&minusone);
>        unsigned int bar = *((unsigned int *)&minusone);

>Someone who has learned in terms of a 2's complement representation is likely
>to believe these two statements equivalent, and the cast irrelevant.  Despite
>my never having used a non-2's-complement system, I consider these statements
>significantly different.  (And the 2nd a bad idea.)

Well, the results are potentially different, and in any particular set of
circumstances one is right and the other wrong. The first is almost always
the right one, but only almost.

Quote:
>I'm wondering how other programmers have viewed the language as they developed
>experience with it, and with multiple platforms.  I still remember spending at
>least an hour trying to write code to build the command "mv file.Z file" for
>the benefit of system(), back when I wasn't sure what the difference between
>strcat() and strcpy() was.  :)

I remember it was quite a while before I "discovered" sprintf, I must have
just missed that section of K&R1. Maybe they shouldn't have put it after
the section on arrays and pointers! :-)

There is still the odd question on how to build command lines for system()
in comp.lang.c.

Quote:
>How many of you, when you see "-1", think of this as "unary minus, with
>operand 1", rather than "minus 1"?  (I don't, although I'm aware of it enough
>that "-32768" and friends look suspicious.)

Same here. In most circumstances it doesn't make any difference. A trained
eye should be able to spot immediately things like -32768 and

#define VALUE -1

as potential sources of error. For things like x = -1; it simply doesn't
matter. The important thing is to know the nature of the beast when the
situation calls for it.

--
-----------------------------------------


-----------------------------------------



Tue, 13 Jul 1999 03:00:00 GMT  
 