short/long/long long Formatting questions 
 short/long/long long Formatting questions

A couple of questions relating to the following code
which was run on Solaris 2.8:

#include <stdlib.h>
#include <stdio.h>
#include <unistd.h>

int main(void){

char      cv;
short int sv;
      int lv;
long long llv;

  (void) printf("data sizes c=%d, s=%d, i=%d, ll=%d\n",
    (int)sizeof(char),
    (int)sizeof(short int),
    (int)sizeof(int),
    (int)sizeof(long long));

  cv=1;
  sv=1;
  lv=1;
  llv=1;
  (void) printf("a char      %2.2x\n",cv);
  (void) printf("a short int %4.4x\n",sv);
  (void) printf("a long  int %8.8x\n",lv);
  (void) printf("a long long %16.16llx\n",llv);
  cv=-1;
  sv=-1;
  lv=-1;
  llv=-1;
  (void) printf("a char      %2.2x\n",cv);
  (void) printf("a short int %4.4x\n",sv);
  (void) printf("a long  int %8.8x\n",lv);
  (void) printf("a long long %16.16llx\n",llv);
  exit(EXIT_SUCCESS);
}

gcc -o test test.c
./test
data sizes c=1, s=2, i=4, ll=8
a char      01
a short int 0001
a long  int 00000001
a long long 0000000000000001
a char      ffffffff
a short int ffffffff
a long  int ffffffff
a long long ffffffffffffffff

Change cv or sv to unsigned and they become 'ff' and 'ffff' respectively.

Question 1:  Apparently this is what's going on during code generation:

  int itmp=sv;      (or for unsigned: unsigned int itmp=sv;)
  (void) real_printf("a short int %4.4x\n",itmp);

Why, historically or operationally, did they do it that way, instead of,
for instance, passing a parameter giving the data type of the variable
so that "x" could be used for all integer types?  Something like this:

  (void) real_printf("a short int %4.4x\n",sizeof(itmp),itmp);

Question 2:  Because of the effect in Question 1, a further kludge was
required for long long.  Since a long long can't be stored in an int, the
ll{d,u,o,x} format specifiers had to be created.  But what happens on a
system where the standard "int" is "long long"?  Or is that forbidden by
the standard?

Question 3:  If we ever have long long long (16-byte integers), is that
going to require lll{d,u,o,x}, or has ll{d,u,o,x} already been coded to
handle larger integer types too?  And if it has, does that same method
work with smaller integers as well?

In other words, is there ever going to be a C syntax like:

  (void) printf("print any size int in hex correctly with this specifier %4.4X\n", intval);

where intval could be declared with 1, 2, 4, 8, 16, etc. bytes and the new
specifier "X" handles them all correctly without sign-extending to match
an implicit intermediate integer value.

Thanks and regards,

David Mathog



Mon, 11 Oct 2004 00:10:09 GMT  
 short/long/long long Formatting questions

Quote:

> data sizes c=1, s=2, i=4, ll=8
> a char      01
> a short int 0001
> a long  int 00000001
> a long long 0000000000000001
> a char      ffffffff
> a short int ffffffff
> a long  int ffffffff
> a long long ffffffffffffffff

> Change cv or sv to unsigned and they become 'ff' and 'ffff' respectively.

First, you should know that character and short types in a variadic
function call are promoted to int or unsigned int.

Second, you should know that the precision field in "%?.?x" gives the
minimum number of digits to appear after the conversion.

Third, you should know that when you assign -1 to an unsigned integer
type, the result is reduced modulo 2^N, where N is the number of value
bits in the type (i.e., modulo one more than the type's maximum value).
For example, "unsigned char cv = -1;" gives you the value 255 if
CHAR_BIT == 8.

This explains what you have observed.

NB. Your code contains undefined behavior:
    the specifier 'x' expects an argument of type unsigned int.

paiyi



Mon, 11 Oct 2004 00:44:06 GMT  
 short/long/long long Formatting questions

Quote:
> Question 1:  Apparently this is what's going on during code generation:

>   int itmp=sv;      (or for unsigned: unsigned int itmp=sv;)
>   (void) real_printf("a short int %4.4x\n",itmp);

> Why, historically or operationally, did they do it that way?  

The standard requires so.  It says that when there is no prototype for a
function, or for a function with a variadic parameter list, the arguments
passed in the ellipsis (that is, after the last named parameter) are
subject to the default argument promotions.  That is, a char is passed
as int; an unsigned char is passed as int, or as unsigned int if int
cannot represent all values of the unsigned type.  The same happens
with short.  float is converted to double.  All other types are not
changed.

But note that with such a function it is important to pass the arguments
as the expected type.  That is, if you tell printf (as you do with %x)
that you are passing an argument of type unsigned int but pass it an
int, printf will access the passed object through an incompatible type,
which results in undefined behavior.  So you have to explicitly cast
your arguments or use the correct format specifier.

Quote:
> Instead of,
> for instance, by passing a parameter giving the data type of the variable
> so that "x" could be used for all integer types?  Something like this:

>   (void) real_printf("a short int %4.4x\n",sizeof(itmp),itmp);

> Question 2:  Because of the effect in 1 a further kludge was required for long
> long.
> Since long long can't be stored into an int the ll(d,u,o,x) format specifiers
> had to be created.  But what happens on a system where the standard
> "int" is "long long"?  Or is that forbidden by the standard?  

The two types would still be incompatible, so what I said above still
applies.

Quote:

> Question 3.  If we ever have long long long (16 byte integers) is that going to
> require lll(d,u,o,x) or has ll(d,u,o,x) already been coded to handle larger
> integer structures
> too?  And if it has does that same method work with smaller integers as well?

The real size of a long long object is up to the implementation.  That
is, the implementation must make long long able to hold values from
-9223372036854775807 to +9223372036854775807, which requires at least 64
bits (but not necessarily 8 bytes, as a byte can have more than 8 bits).
If your system has a 128-bit long long, that's fine.

Quote:
> In other words, is there ever going to be a C syntax like:

>   (void) printf("print any size int in hex correctly with this specifier
> %4.4X\n",intval);

Never say never, but I doubt there ever will be.

Quote:

> where intval could be declared with 1,2,4,8,16 etc bytes and the new specifier
> "X" handles
> them all correctly without sign extending to match an implicit intermediate
> integer value.

--

"LISP  is worth learning for  the profound enlightenment  experience
you will have when you finally get it; that experience will make you
a better programmer for the rest of your days."   -- Eric S. Raymond


Mon, 11 Oct 2004 01:12:49 GMT  