(unsigned int) problem
Hi there,
Not sure if this is the right place to ask this kind of question, but in any case, some help would be greatly appreciated.
The problem is that I can't see what is wrong with this code:
#import <Foundation/Foundation.h>

int main(void)
{
    @autoreleasepool {
        unsigned int i, j;   // the problem goes away if this line is replaced by:
                             // int i, j;
        unsigned int m = 4, n = 3;
        double matrix[m][n];

        for (i = 0; i < m; i++)
            for (j = 0; j < n; j++)
                matrix[i][j] = 5*i - 2*j;

        printf("Matrix:\n");
        for (i = 0; i < m; i++)
            for (j = 0; j < n; j++)
                printf(" matrix[%u][%u] = %.3f\n", i, j, matrix[i][j]);
    }
    return 0;
}
When I run this code, I get these odd results:
matrix[0][1] = 4294967294.000
matrix[0][2] = 4294967292.000
when they should be:
matrix[0][1] = -2.000
matrix[0][2] = -4.000
Any idea?
Thank you!
P.S. The problem goes away if both i and j are declared as plain int. Why is this?
Xcode 4.2.1-OTHER, Mac OS X (10.7.3)