It's the high season for job hunting, and a few days ago I heard that someone was asked this interview question: what is the difference between #define A 10 and const int a = 10? Enough chit-chat; let me explain:
The #define directive defines a symbolic constant.
const defines a constant variable (a variable whose value cannot be changed).
A symbolic constant is pure text replacement: the preprocessor substitutes the replacement text for the symbol before compilation even starts. It has no type, and there is no memory cell in the program carrying the symbolic name.
A const variable, by contrast, has a type, and there is a memory unit named after it, so you can measure its size with sizeof.
So in the example above, the first one, A, has no type; it is merely a symbolic constant.
The second one, a, has type int: it is a real variable whose value simply cannot be changed.
OK, does that make sense now?