question about size of ints in Xcode

Hello. I have a few quick questions about declaring and defining variables and about their size and so forth. The book I'm reading at the moment, "Learn C on the Mac", says the following in reference to the difference between declaring and defining a variable:

A variable declaration is any statement that specifies a variable's name and type. The line *int myInt;* certainly does that. A variable definition is a declaration that causes memory to be allocated for the variable. Since the previous statement does cause memory to be allocated for myInt, it does qualify as a definition.


I always thought a definition of a variable was a statement that assigned a value to a variable. If a basic declaration like "int myInt;" does allocate memory for the variable and therefore is a definition, can anyone give me an example of a declaration that does not allocate memory for the variable and therefore is not a definition?

The book goes on, a page or so later, to say this:

Since myInt was declared to be of type int, and since Xcode is currently set to use 4-byte ints, 4 bytes of memory were reserved for myInt. Since we haven't placed a value in those 4 bytes yet, they could contain any value at all. Some compilers place a value of 0 in a newly allocated variable, but others do not. The key is not to depend on a variable being preset to some specific value. If you want a variable to contain a specific value, assign the value to the variable yourself.


First, I know that an int can be different sizes (either 4 bytes or 8 bytes, I think), but what does this depend on? I thought it depended on the compiler, but the above quote makes it sound like it depends on the IDE, Xcode. Which is it?

Second, it said that Xcode is currently set to use 4-byte ints. Does this mean that there is a setting that the user can change to make ints a different size (like 8 bytes), or does it mean that the creators of Xcode currently have it set to use 4-byte ints?

Third, for the part about some compilers giving a newly allocated variable a value of 0, does this apply to Xcode or any of its compilers? I assume not, but I wanted to check.

Thanks for all the help, and have a great weekend!

MacBook, Mac OS X (10.5.8)

Posted on Dec 18, 2009 5:21 AM

11 replies

Dec 18, 2009 6:26 AM in response to Tron55555

Tron55555 wrote:
Second, it said that Xcode is currently set to use 4-byte ints. Does this mean that there is a setting that the user can change to make ints a different size (like 8 bytes), or does it mean that the creators of Xcode currently have it set to use 4-byte ints?

GCC, which is used by Xcode, defaults to 4-byte ints on both the 32-bit and 64-bit architectures; it is longs and pointers that grow to 8 bytes under the 64-bit (LP64) model.
You can add your own compilation options, but I don't think there is an option for changing this behavior.

Dec 18, 2009 9:32 AM in response to Tron55555

Section 6.7 "Declarations" of the C standard (ISO/IEC 9899:1999) has the answer to your first question. In short, a definition is indeed a statement (well, not really a statement...) where you both specify the type of a variable and allocate memory for it. A declaration specifies the type of a variable, but does not necessarily allocate memory. An (easy) example of a declaration that is not a definition is a parameter in a function prototype. (I know, having to write "a declaration specifies" is confusing, but "a declaration declares" is just as confusing. 😉)

Regarding your second question: the size of an integer depends on the choices the compiler writer makes, but many compilers allow the user to select the size of types at compile time. Typically the size of an integer matches the size of the processor's registers as that ensures highest run-time performance.

Of course a good programmer never makes assumptions about the sizes of data types, but uses sizeof() to determine them, or uses the fixed-width int8_t / int16_t / int32_t / int64_t types provided by the C standard (in <stdint.h>). These are typically only needed when manipulating (binary) data from an external source (file, network connection, etc.) or doing direct hardware access.
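
For example, a minimal sketch of that (plain standard C, nothing Xcode-specific; the numbers printed depend on the target you build for):

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* sizeof() reports the size of each type for the current target */
    printf("int     : %zu bytes\n", sizeof(int));
    printf("long    : %zu bytes\n", sizeof(long));
    printf("int32_t : %zu bytes\n", sizeof(int32_t));  /* always 4 bytes */
    printf("int64_t : %zu bytes\n", sizeof(int64_t));  /* always 8 bytes */
    return 0;
}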


Regards,
Hans

Dec 18, 2009 11:21 AM in response to Tron55555

Tron55555 wrote:
I always thought a definition of a variable was a statement that assigned a value to a variable. If a basic declaration like "int myInt;" does allocate memory for the variable and therefore is a definition, can anyone give me an example of a declaration that does not allocate memory for the variable and therefore is not a definition?


I always like to think of a "declaration" as something that makes no changes to the actual code, but just provides visibility so that compilation and/or linking will succeed. The "definition" is what allocates the space.

You can declare a function to establish it in the namespace for the compiler to find, but the linker needs an actual definition somewhere to link against. With a variable, you could also declare it as "extern int myvar;". The actual definition "int myvar;" would be somewhere else.

According to that book, both "extern int myvar;" and "int myvar;" are declarations, but only the latter is a definition. That is a valid way to look at it. Both statements 'declare' something to the compiler, but only the second one 'defines' some actual data.
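
A minimal sketch of how that split usually looks across files (the file names here are just for illustration):

/* globals.h -- declaration only: no storage is allocated here */
extern int myvar;

/* globals.c -- definition: this is where the int actually lives */
int myvar = 42;

/* main.c -- any file that includes globals.h can use myvar */
#include <stdio.h>
#include "globals.h"

int main(void)
{
    printf("%d\n", myvar);
    return 0;
}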

First, I know that an int can be different sizes (either 4 bytes or 8 bytes, I think), but what does this depend on? I thought it depended on the compiler, but the above quote makes it sound like it depends on the IDE, Xcode. Which is it?


An "int" is supposed to be a processor's "native" size and the most efficient data type to use. A compiler may or may not be able to change that, depending on the target and the compiler. If a compiler supports that option and Xcode supports that compiler and that option, then Xcode can control it, via the compiler.

Second, it said that Xcode is currently set to use 4-byte ints. Does this mean that there is a setting that the user can change to make ints a different size (like 8 bytes), or does it mean that the creators of Xcode currently have it set to use 4-byte ints?


I think that "setting" is just not specifying any option to explicitly set the size. You can use "-m32" or "-m64" to control this, but I wouldn't recommend it. Let Xcode handle those low-level details.
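
If you are curious, a quick way to see the effect outside of Xcode (assuming the command-line tools are installed; -m32 and -m64 are standard GCC flags):

/* size.c -- build with "gcc -m32 size.c" or "gcc -m64 size.c" */
#include <stdio.h>

int main(void)
{
    /* On Mac OS X, int stays at 4 bytes either way; it is long and
       pointer sizes that change between the 32- and 64-bit targets. */
    printf("int:  %zu bytes\n", sizeof(int));
    printf("long: %zu bytes\n", sizeof(long));
    return 0;
}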

Third, for the part about some compilers giving a newly allocated variable a value of 0, does this apply to Xcode or any of its compilers? I assume not, but I wanted to check.


I don't know for sure. Why would you ask? Are you thinking of including 45 lines of macro declarations 3 levels deep to initialize values based on whether or not a particular compiler/target supports automatic initialization? Xcode currently supports GCC 3.3, GCC 4.0, GCC 4.2, LLVM GCC, Clang, and Intel's compiler for building PPC, i386, and x86_64 code in both debug and release, with a large number of optimization options. It doesn't matter what compiler you use or what its behavior is - initialize your variables in C.

Dec 18, 2009 1:45 PM in response to Tron55555

Tron55555 wrote:
I always thought a definition of a variable was a statement that assigned a value to a variable. If a basic declaration like "int myInt;" does allocate memory for the variable and therefore is a definition, can anyone give me an example of a declaration that does not allocate memory for the variable and therefore is not a definition?


Plain C and C++ allow this:

typedef struct {
    unsigned char ch;
} anOctet;


There you go: a declaration (of a type, not of a variable) that doesn't allocate any storage. Perhaps an enumeration

enum Test { One, Two, Three };


could also count as 'defining without actually storing'. (The enum values do get stored in your program, but only if they are actually used.)

Tron55555 wrote:
Third, for the part about some compilers giving a newly allocated variable a value of 0, does this apply to Xcode or any of its compilers? I assume not, but I wanted to check.


In the wonderful World of Windows, that's a common gotcha, as the Debug build does clear newly allocated variables and the Release build does not. If I remember correctly, that's because it makes your program easier to debug, but it takes time to 'store' a zero, so that code gets removed for the Release builds. Everyone occasionally falls for this one, and after much debugging (of, you guessed right, the Debug build that does work correctly) one learns not to depend on this automatic behavior.
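
A minimal sketch of the trap being described (purely illustrative):

#include <stdio.h>

int main(void)
{
    int count;              /* defined, but never given a value */
    printf("%d\n", count);  /* undefined behavior: a debug build may
                               happen to print 0, a release build may
                               print garbage -- never rely on either */
    return 0;
}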


Dec 19, 2009 2:10 AM in response to etresoft

Thanks to all you guys -- I read all your posts, and they were all very helpful. I'd say the issue is cleared up, minus one little thing I want to confirm below, so thanks again!

etresoft wrote:
I don't know for sure. Why would you ask? Are you thinking of including 45 lines of macro declarations 3 levels deep to initialize values based on whether or not a particular compiler/target supports automatic initialization? Xcode currently supports GCC 3.3, GCC 4.0, GCC 4.2, LLVM GCC, Clang, and Intel's compiler for building PPC, i386, and x86_64 code in both debug and release, with a large number of optimization options. It doesn't matter what compiler you use or what its behavior is - initialize your variables in C.


I guess I was just curious, but point taken. Just to be clear, though, your advice is to always initialize a variable, no matter what the situation is (in C, at least) -- is this right? So every time I declare a variable, regardless of circumstances, if I have no other value to give it based on the context of its use, then I should assign it a null value like 0 -- correct?

Dec 19, 2009 1:45 PM in response to Tron55555

Tron55555 wrote:
I guess I was just curious, but point taken. Just to be clear, though, your advice is to always initialize a variable, no matter what the situation is (in C, at least) -- is this right?


That is the problem with C. There are no absolutes. If you find yourself in situations without crystal clear 100% accurate guidelines, does that mean there are no guidelines and anything goes? One could then easily ask the question, "What if I do declare and initialize my variable, but earlier logic causes me to jump to a later point in the code via 'goto', thus skipping the variable declaration and initialization. I can then use that variable, right?"

So every time I declare a variable, regardless of circumstances, if I have no other value to give it based on the context of its use, then I should assign it a null value like 0 -- correct?


If you have no other value to give a variable in the context of where you have declared it, perhaps you have declared it at the wrong point.

You don't want to ever access a variable that has not been initialized or assigned to some value. The danger is that the program could crash or, worse, have unpredictable results. If you crash, you know what's happened. You know your data may be bad. You know you have a bad result. If you access a variable with no value, you may be saving and using random, junk data and not realize it for years down the road. Do you want to be the climate researcher who goes to the UN 20 years down the road and has to explain how you used an uninitialized variable that, while not crashing, corrupted 100 years' worth of data measurements?

Initializing a variable just to stick some value in it is not the worst case, but it isn't the best case either. The best case is not to declare a variable until you have some valid data to give it. Of course, that isn't always possible, but it is always possible to try.

Writing good code in C takes discipline and few people have that. Higher level languages can help out. In Perl, for example, all variables start out as 'undef' and you can use that fact if you need it. In C++ and Objective-C (and Perl) you have exceptions. If you know you have an error condition, stop working right away to avoid the risk of recording bogus results. C++ has stack based objects that are extraordinarily useful. Traditional C didn't allow the programmer to declare variables in the correct context. They always had to go at the start of a block. You don't have to do that anymore. You can include variable declarations and initializations in the flow and logic of your code. That allows you to write code that is simpler, with fewer branches.
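
A small sketch of that last point (assuming C99 or later, which Xcode's compilers support; computeTotal() is just a hypothetical stand-in):

#include <stdio.h>

static int computeTotal(void) { return 42; }  /* hypothetical helper */

static void oldStyle(void)
{
    int total;               /* declared at the top of the block, no value yet */
    /* ... other work ... */
    total = computeTotal();  /* only assigned later */
    printf("%d\n", total);
}

static void betterStyle(void)
{
    /* ... other work ... */
    int total = computeTotal();  /* declared in the flow, with real data */
    printf("%d\n", total);
}

int main(void)
{
    oldStyle();
    betterStyle();
    return 0;
}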

Dec 19, 2009 5:17 PM in response to etresoft

That is the problem with C. There are no absolutes. If you find yourself in situations without crystal clear 100% accurate guidelines, does that mean there are no guidelines and anything goes? One could then easily ask the question, "What if I do declare and initialize my variable, but earlier logic causes me to jump to a later point in the code via 'goto', thus skipping the variable declaration and initialization. I can then use that variable, right?"


Okay, but in general, if I have, for example:


int myInt;


It would generally be "better" to write this instead:


int myInt = 0;


Is that right?

You don't want to ever access a variable that has not been initialized or assigned to some value. The danger is that the program could crash or, worse, have unpredictable results.


Yeah we went over that in another thread recently. It was helpful to me.

Writing good code in C takes discipline and few people have that. Higher level languages can help out. In Perl, for example, all variables start out as 'undef' and you can use that fact if you need it. In C++ and Objective-C (and Perl) you have exceptions. If you know you have an error condition, stop working right away to avoid the risk of recording bogus results. C++ has stack based objects that are extraordinarily useful. Traditional C didn't allow the programmer to declare variables in the correct context. They always had to go at the start of a block. You don't have to do that anymore. You can include variable declarations and initializations in the flow and logic of your code. That allows you to write code that is simpler, with fewer branches.


Off the subject, Perl is a scripting language, if I'm not mistaken, correct? Yet it is still used to write standard applications, like other, non-scripting languages, right? The reason I ask is that I find myself associating scripting languages with web development, like JavaScript and VBScript and PHP. What makes a scripting language a scripting language? If I remember correctly, Perl is interpreted and not compiled, right? Does this have anything to do with it? What does it mean for a language to be interpreted? Is it just another way of getting the code into object code or machine language or whatever? Sorry, off topic I know -- just curious...

Dec 19, 2009 6:35 PM in response to Tron55555

Tron55555 wrote:
It would generally be "better" to write this instead:


int myInt = 0;


Is that right?


Yes, but only slightly better. It would be much better to have:


int myInt = getIntValue();


You don't really want to be coding for fallbacks. You want to be in control of your code. You know what value you want to give that int, so do it. If you wind up with something like:


int myInt = isBigValue() ? getBigInt() : getSmallInt();


You are doing it right.

Off the subject, Perl is a scripting language, if I'm not mistaken, correct? Yet it is still used to write standard applications, like other, non-scripting languages, right? The reason I ask is that I find myself associating scripting languages with web development, like JavaScript and VBScript and PHP. What makes a scripting language a scripting language?


Your perception of it. You can also write web server modules in C, C++, or Objective-C if you want.

What does it mean for a language to be interpreted?


There used to be a wide gulf between "interpreted" languages like Perl and "compiled" languages like C. That is a tiny crack these days. Now, it is more of a difference in the size of the runtime library. Perl has a very large runtime. C has a very small runtime. Objective-C is medium sized. C# is a bit larger.

In C, and similar languages, you have to do a lot of work trimming strings, tracking null terminators, allocating and releasing memory, etc. Eventually, it can get hard to see what you are really trying to accomplish with the code required just to get C to work. In Perl, very little of that code is required and you can code at a much higher level and model your business logic much more closely.

Of course, human nature gets involved again and, as I said before, humans have very little discipline. You "can" do all that in Perl, but many people "choose" not to. They seem to think it is some kind of badge of honor to write the entire program in 18 cryptic lines. Great languages like Perl are wasted on such people.

Dec 20, 2009 3:40 AM in response to etresoft

You don't really want to be coding for fallbacks. You want to be in control of your code. You know what value you want to give that int, so do it. If you wind up with something like (etc.) you are doing it right.


Okay, that makes sense. Thank you.

Your perception of it. You can also write web server modules in C, C++, or Objective-C if you want.


Yeah, that's interesting. So scripting languages are just languages that were created with the intent of making writing scripts more practical and easier. So then the only thing that makes a scripting language a scripting language is how it is used, not anything "physically" different about the language. Do I have the right idea?

In C, and similar languages, you have to do a lot of work trimming strings, tracking null terminators, allocating and releasing memory, etc. Eventually, it can get hard to see what you are really trying to accomplish with the code required just to get C to work. In Perl, very little of that code is required and you can code at a much higher level and model your business logic much more closely.


Yeah I know C is a much lower level language and I know Perl is a higher level language -- I just don't know what the difference is mechanically between compilation and interpretation. But, then again, I don't think it's something I really need to know either -- at least not at this point.

Of course, human nature gets involved again and, as I said before, humans have very little discipline.


Sounds like we have a similar outlook on human nature, at least in that regard. Anyways, Perl sounds like a fun language. I've been working with C so long that I've forgotten what a high level language is like, and my only experience with high level languages before that was with VB and C#, which I'm not terribly fond of. I also had some experience with C++ -- is that considered high level? I mean, I know it's higher level than C, but probably not as much so as a language like Perl, yes?

Anyways, as much as I love conversing about anything related to programming, I think I'm dragging us off topic here so I'll leave it at that.

Until next time...

Dec 20, 2009 7:00 AM in response to Tron55555

Tron55555 wrote:
the only thing that makes a scripting language a scripting language is how it is used, not anything "physically" different about the language. Do I have the right idea?


You have the correct interpretation of my idea, which most programmers would probably disagree with. Perl and C are pretty radically different. In Perl, there is an interpreter running that does all the work of executing the Perl code. In C, the code is compiled down to machine code and the CPU executes it directly. C# and Objective-C are more "in-between". But in a modern OS, there is nothing to stop C code from calling and executing Perl code and vice versa.

Yeah I know C is a much lower level language and I know Perl is a higher level language -- I just don't know what the difference is mechanically between compilation and interpretation.


The basic difference is that compiled code is executed on hardware, by the CPU whereas interpreted code is executed in software, by the interpreter. That interpreter could be executing on the CPU, or via yet another interpreter.
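
A toy illustration of that difference (not how Perl actually works, just the general shape of an interpreter's dispatch loop, itself written in C):

#include <stdio.h>

enum Op { OP_ADD, OP_PRINT, OP_HALT };

int main(void)
{
    /* The "program" is just data; the loop below decides, in software,
       what each instruction means. A compiled program skips this layer:
       the CPU executes its instructions directly. */
    int program[] = { OP_ADD, 2, 3, OP_PRINT, OP_HALT };
    int acc = 0;

    for (int pc = 0; ; ) {
        switch (program[pc]) {
        case OP_ADD:   acc = program[pc + 1] + program[pc + 2]; pc += 3; break;
        case OP_PRINT: printf("%d\n", acc); pc += 1; break;
        case OP_HALT:  return 0;
        }
    }
}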

I also had some experience with C++ -- is that considered high level?


C++ is more like an alternate reality than a higher level. It has a few tricks that C can't accomplish, but shares all of C's downsides. Objective-C has a few (but not all) of those new tricks. Plus it has Cocoa, which integrates it very easily into the Mac OS X user interface. Objective-C + Cocoa is almost as good as Perl. If Perl could integrate with the user interface, things could be very interesting. I would like to redo the defunct CamelBones one day when I get some "free time".
