While debugging a program, I got confused by the sizeof keyword.
Consider the following example:
int a[5];
Everyone understands that this defines an array containing five int values, and that we can use a[0], a[1], and so on to access each element of the array.
When we define the array a, the compiler allocates a block of memory whose size is (element size * number of elements), based on the specified element type and element count, and binds the name a to that memory. Once bound, the association between the name a and that memory cannot change. a[0], a[1], and so on are elements of a, but they are not element names: the individual elements of the array have no names of their own. Now let's look at several questions about the sizeof keyword:
The value of sizeof(a) is sizeof(int) * 5, which is 20 on a 32-bit system.
The value of sizeof(a[0]) is sizeof(int), which is 4 on a 32-bit system.
The value of sizeof(a[5]) is 4 on a 32-bit system, and no error occurs here. sizeof is evaluated at compile time: although the element a[5] does not exist, the expression never actually accesses a[5]; the result is determined purely from the type of the array element. Therefore writing a[5] inside sizeof causes no error.
In other words, sizeof(a) is the size of all five elements together, 20 on a 32-bit system; it is easy to understand as the sum of the sizeof values of all the elements.
The value of sizeof(&a[0]) is 4 on a 32-bit system. This is easy to understand: &a[0] is the address of the first element a[0].
But on a 32-bit system, VC++ 6.0 gives 20 for sizeof(&a), which I find hard to understand. Isn't &a the first address of the array? If &a is the first address of a, shouldn't sizeof(&a) be 4? If my reasoning is correct, is the compiler wrong? If my reasoning is wrong, then is &a not the first address after all? It seems contradictory!
I hope someone can give me some advice.