Use of enums for user defined types 
 Use of enums for user defined types

Hello all,

Today I seem to be getting a lot of doubts.. Maybe I'm thinking too
much.. Anyway, enums are considered a better way of representing
user defined types, say

enum Months {Jan=1,Feb,March,....};

can also be written as

# define JAN 1
# define FEB 2
.....

or

if the programmer insists upon the type of the variable, let him
declare it as

const int JAN = 1;
const int FEB = 2;

My personal opinion is that enum looks more elegant.. You are classifying
the variables into a set. I understand that..

Other than that, does enum give any significant advantages in

1. Compilation time
2. efficiency
3. Safe type checking
4. Readability
5. Memory

or any other parameter that I can't think of..

Please do pour in your opinions..

RR



Mon, 18 Jul 2005 20:25:25 GMT  
 Use of enums for user defined types


Quote:
> Hello all,

> Today I seem to be getting a lot of doubts..

Read: "having a lot of questions"

Quote:
> enum Months {Jan=1,Feb,March,....};

> can also be written as

> # define JAN 1
> # define FEB 2
> .....

> or

> if the programmer insists upon the type of the variable, let him
> declare it as

> const int JAN = 1;
> const int FEB = 2;

A const int object can take up space in the image whether or not it is
actually referenced. Enums and #defines take no space if never referenced.

Quote:
> My personal opinion is that enum looks more elegant.. You are classifying
> the variables into a set. I understand that..

They sure do unless you need to pass such a constant to a function as an
address (const int wins here) or need to represent things like Pi
(#defines win here).
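To make the address point concrete, a minimal sketch (the JAN/Jan names
are just for illustration):

#include <stdio.h>

#define JAN_DEF 1
enum Months { Jan = 1, Feb };
const int JAN = 1;

static void show(const int *p) { printf("%d\n", *p); }

int main(void)
{
    show(&JAN);        /* fine: a const int object has an address */
    /* show(&Jan);     error: enum constants are values, not objects */
    /* show(&JAN_DEF); error: expands to &1, which is not an lvalue */
    return 0;
}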

Quote:
> Other than that, does enum give any significant advantages in

> 1. Compilation time

No. Implementation dependent.

Quote:
> 2. efficiency

No. Implementation dependent.

Quote:
> 3. Safe type checking

Yes vs. #defines.
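One concrete form this checking takes (a sketch; gcc and clang, for
instance, can warn here via -Wswitch, which -Wall enables):

enum Color { Red, Green, Blue };

const char *name(enum Color c)
{
    switch (c) {        /* the compiler can warn if an enumerator
                           is missing from the cases */
    case Red:   return "red";
    case Green: return "green";
    case Blue:  return "blue";
    }
    return "?";
}

With #defined constants the switch is over a plain int and no such
check is possible.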

Quote:
> 4. Readability

No different from #defines or const I would think.

Quote:
> 5. Memory

Better than const variables if not referenced. Same as #define.

Quote:
> or any other parameter that I can't think of..

Enums can be scoped more tightly than #defines.
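For example (a small sketch):

void f(void)
{
    enum State { Idle, Busy };   /* visible only inside f() */
    enum State s = Idle;
    (void)s;
}

/* Outside f(), 'Idle' and 'enum State' are undeclared, whereas a
   #define made at the same spot would still be in effect here and
   everywhere below it, until an explicit #undef. */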


Mon, 18 Jul 2005 21:50:15 GMT  
 Use of enums for user defined types


Quote:
> Hello all,

> Today I seem to be getting a lot of doubts.. Maybe I'm thinking too
> much.. Anyway, enums are considered a better way of representing
> user defined types, say

> enum Months {Jan=1,Feb,March,....};

> can also be written as

> # define JAN 1
> # define FEB 2
> .....

> or

> if the programmer insists upon the type of the variable, let him
> declare it as

> const int JAN = 1;
> const int FEB = 2;

> My personal opinion is that enum looks more elegant.. You are classifying
> the variables into a set. I understand that..

> Other than that, does enum give any significant advantages in

> 1. Compilation time
> 2. efficiency
> 3. Safe type checking
> 4. Readability
> 5. Memory

> or any other parameter that I can't think of..

enum something { this = 1, that, /* ... */ someotherthing /* = 2789 */ };
The specs change: a new item 'whataboutme' must be = 1, and everything
else is pushed back automatically...
Significant? You bet your ( | ) ;)
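In other words (a small sketch of the same point):

enum something {
    whataboutme = 1,   /* the new item from the changed spec */
    this,              /* was 1, now 2: no edit needed */
    that,              /* was 2, now 3 */
    /* ... */
    someotherthing     /* and so on, up through the last item */
};

The matching #define list would need every following line renumbered
by hand.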


Tue, 19 Jul 2005 02:29:10 GMT  
 Use of enums for user defined types
I'm not sure about the memory used. I believe an enum uses the same size
as an int. But with #defines, you can use bytes to store the values.
Quote:

>>5. Memory

> Better than const variables if not referenced. Same as #define.



Tue, 19 Jul 2005 23:44:12 GMT  
 Use of enums for user defined types

Quote:

> enum Months {Jan=1,Feb,March,....};

> can also be written as

> # define JAN 1
> # define FEB 2
> .....
> Please do pour in your opinions..

In addition to what others have said:-

For:
1) A compiler can check in some circumstances that you're using the
right thing, e.g. if you have Func(enum Months m), it can warn you if
you give it an enum Days (sketched after this list).
2) Some debuggers show you strings JAN, FEB, etc. instead of the
numbers. This is useful when you want to see the meaning but a pain
when you need to know the number (and the debugger doesn't have a
"show me").
3) The assigned numbers autoincrement by default, so if you want to
add something in the middle, you don't have to edit a zillion
following #defines. However, I notice that some (paranoid?) people
never use the autoincrementing, and would do Jan=1,Feb=2,...
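A sketch of point 1 (whether you actually get a diagnostic is
compiler-dependent; newer gcc and clang offer -Wenum-conversion
for this):

enum Months { Jan = 1, Feb, March };
enum Days   { Mon = 1, Tue, Wed };

void Func(enum Months m);

void caller(void)
{
    enum Days d = Tue;
    Func(d);   /* legal C either way, but the mismatched enum
                  type can draw a warning */
}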

Against:
1) Unlike #defines, enums can't be used with preprocessor directives
like #if (sketched after this list) and in some macro usages (the
popular STRINGISE, for instance).
2) They aren't user-expandable without changing your header file, e.g.
you reserve 0-100 for yourself, and allow other people to define 101
up (as in system/user error codes).
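Point 1 above, sketched:

#define NUM_MODES 4
enum { NumColors = 4 };

#if NUM_MODES > 3        /* works: the preprocessor sees macros */
int mode_table[NUM_MODES];
#endif

/* Non-macro identifiers evaluate to 0 inside #if, so this test
   is silently 0 > 3, i.e. false: */
#if NumColors > 3
int color_table[NumColors];
#endif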

Chris



Thu, 21 Jul 2005 09:03:17 GMT  
 Use of enums for user defined types
On Fri, 31 Jan 2003 09:44:12 -0600, Raghavan Subramaniyan wrote:

Do not top post in technical groups, it makes discussion threads
difficult or impossible to follow.  Material you add belongs after
quoted material you are replying to.  I have reformatted your message
properly and added my comment at the end.

Quote:

> >>5. Memory

> > Better than const variables if not referenced. Same as #define.

> I'm not sure about the memory used. I believe an enum uses the same size
> as an int. But with #defines, you can use bytes to store the values.

The type of an identifier defined in an enumeration is signed int, and
it can hold any value of that type.  When you define a value of that
enumeration type the compiler is free to select any integer type that
can hold all the values of the enumerated set.  It is not required to
use the type signed int.

In addition the constants themselves have no type, they are just
integer constant expressions of type signed int.  It is perfectly
legal to assign an enumeration constant value to a signed or unsigned
char object, providing that the value of the enumeration constant is
in the proper range.
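For instance (a sketch of that last point):

enum Months { Jan = 1, Feb, March };

unsigned char m = Feb;        /* fine: the value 2 is in range */
unsigned char months[12];     /* byte-sized storage works with
                                 enum constants too */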

--
Jack Klein
Home: http://JK-Technology.Com
FAQs for
comp.lang.c http://www.eskimo.com/~scs/C-faq/top.html
comp.lang.c++ http://www.parashift.com/c++-faq-lite/
alt.comp.lang.learn.c-c++ ftp://snurse-l.org/pub/acllc-c++/faq



Thu, 21 Jul 2005 11:48:24 GMT  
 Use of enums for user defined types

[snip]

Quote:
> Against [enum]:

[snip]

Quote:
> 2) They aren't user-expandable without changing your header file, e.g.
> you reserve 0-100 for yourself, and allow other people to define 101
> up (as in system/user error codes).

Sure they are, at least to the extent that #defines are. In your
example, enumerate all your 0-100 members of the enum type and then
add the member YourEnumType_otherIntMax = INT_MAX to ensure that the
enum type gets big enough to also hold other externally defined
values:

#include <limits.h>  /* for INT_MAX */

typedef enum
{
    EnumType_your0,
    EnumType_your1,
    .
    .
    .
    EnumType_your100,

    // External members:
    EnumType_other,
    // EnumType_other1,
    // ...
    EnumType_otherIntMax = INT_MAX
} EnumType;

Then the external users can use "EnumType_other + n", where n is the
nth member in their enumeration.
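For instance (a sketch continuing the hypothetical names above):

#define MyCode_diskError (EnumType_other + 0)
#define MyCode_netError  (EnumType_other + 1)

EnumType e = MyCode_netError;   /* representable, since the type was
                                   forced to span up to INT_MAX */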

The one thing I miss in the enum construction in C is a way to define
the enum type explicitly. Why wasn't that added to the language? Is
there anything against adding it in a future C version?

BTW, gcc will (at least on my Linux PC) define the enum type to be
signed if the enumeration contains any negative constants, and
unsigned otherwise, which seems sensible.
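One quick way to see what a given compiler picked (a sketch; the
answer is implementation-specific, as noted):

#include <stdio.h>

enum Pos { A = 1, B = 2 };
enum Neg { C = -1, D = 2 };

int main(void)
{
    /* casting -1 to the enum type yields a huge positive value
       if the underlying type is unsigned, -1 if it is signed */
    printf("enum Pos: %zu bytes, %s\n", sizeof(enum Pos),
           (enum Pos)-1 > 0 ? "unsigned" : "signed");
    printf("enum Neg: %zu bytes, %s\n", sizeof(enum Neg),
           (enum Neg)-1 > 0 ? "unsigned" : "signed");
    return 0;
}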

Daniel Vallstrom

Quote:

> Chris



Thu, 21 Jul 2005 20:35:03 GMT  
 Use of enums for user defined types

[snip]

Quote:
> My personal opinion is that enum looks more elegant.. You are classifying
> the variables into a set. I understand that..

> Other than that, does enum give any significant advantages [over #defines] in

> 1. Compilation time

no

Quote:
> 2. efficiency

no

Quote:
> 3. Safe type checking

yes, unless you use anonymous enums

Quote:
> 4. Readability

no, unless you are looking at a debugger or preprocessed code.

Quote:
> 5. Memory

unknown

        david

--
Andre, a simple peasant, had only one thing on his mind as he crept
along the East wall: 'Andre, creep... Andre, creep... Andre, creep.'
                                -- unknown



Sun, 24 Jul 2005 10:00:44 GMT  
 Use of enums for user defined types

Quote:
> On Fri, 31 Jan 2003 09:44:12 -0600, Raghavan Subramaniyan wrote:

...
> > I'm not sure about the memory used. I believe an enum uses the same size
> > as an int. But with #defines, you can use bytes to store the values.

> The type of an identifier defined in an enumeration is signed int, and
> it can hold any value of that type.  When you define a value of that
> enumeration type the compiler is free to select any integer type that
> can hold all the values of the enumerated set.  It is not required to
> use the type signed int.

It's not clear if you mean the enum tag or the constant(s),
or a typedef name for the type, or a variable of the type;
all are identifiers definable in an enumerated-type declaration.

An enum type, defined by a full enum-specifier (that is, 'enum',
optional tag, left brace, list of enumerators, right brace),
and referenced by 'enum' and a tag or by a typedef name,
is effectively the same as some integer type provided by the
implementation, and (as you say) spanning the specified range.
(Formally it is a distinct type but compatible, meaning they
can be interchanged essentially everywhere; in a few cases
like *printf the wording isn't airtight but works in practice.)
You are right it doesn't have to be signed int, although many
compilers do make all enum types signed int for convenience.
It does have to be the same integer type for *all* uses, and
particularly all variables or other objects, of that enum type;
otherwise pointers to (and arrays of) enums wouldn't work.

Quote:
> In addition the constants themselves have no type, they are just
> integer constant expressions of type signed int.  It is perfectly
> legal to assign an enumeration constant value to a signed or unsigned
> char object, providing that the value of the enumeration constant is
> in the proper range.

Formally, enum constants *do* have type signed int,
which is the same as an unsuffixed in-range integer literal.
But it doesn't really make a difference whether you view
the identifier as having a type and a value of that type,
or just a value of the type; you get the same result.

Concur it is *correct* to assign any integer value, including
an enum constant, if in range, to any integer object, including
but not limited to signed or unsigned char.  It is also *legal*
to assign an out-of-range value to an unsigned integer object,
but produces results that are often (usually?) undesired.

--
- David.Thompson 1 now at worldnet.att.net



Fri, 29 Jul 2005 08:42:22 GMT  
 