Early computer operating systems were 32-bit, and time was also represented as a 32-bit value.
System.out.println(Integer.MAX_VALUE);
2147483647
Integer is a 32-bit type, so the largest value a signed 32-bit integer can hold is 2,147,483,647. Meanwhile, a year of 365 days contains 365 × 24 × 60 × 60 = 31,536,000 seconds, so:
2147483647 / 31536000 ≈ 68.1
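The same division can be checked in Java (a one-line sketch; the 31,536,000 figure assumes a 365-day year):

System.out.println(Integer.MAX_VALUE / 31536000.0); // prints roughly 68.1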
In other words, the largest span of time that 32 bits can represent is about 68 years. Concretely, at 03:14:07 on January 19, 2038, the maximum value will be reached. After that moment, the time value on every 32-bit operating system wraps around to
10000000 00000000 00000000 00000000
which, read as a signed integer, corresponds to 20:45:52 on December 13, 1901. Time jumps backwards, and a great deal of software will run abnormally.
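Both boundary instants can be reproduced in Java. A minimal sketch (the class name Year2038Demo is just for illustration):

import java.time.Instant;

public class Year2038Demo {
    public static void main(String[] args) {
        // The last second a signed 32-bit counter can hold:
        System.out.println(Instant.ofEpochSecond(Integer.MAX_VALUE)); // 2038-01-19T03:14:07Z
        // One second later the counter wraps to Integer.MIN_VALUE,
        // i.e. the bit pattern 10000000 00000000 00000000 00000000:
        System.out.println(Instant.ofEpochSecond(Integer.MIN_VALUE)); // 1901-12-13T20:45:52Z
    }
}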
At this point, I think the answer to the question emerges:
because the maximum interval of time that 32 bits can represent is about 68 years, UNIX, weighing the era in which computers appeared against the expected lifetime of its applications, chose January 1, 1970 as the epoch (start time) of UNIX TIME, and Java naturally follows this convention.
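Java reflects this convention directly: epoch second 0 is exactly that start time.

System.out.println(java.time.Instant.ofEpochSecond(0));
1970-01-01T00:00:00Z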
As for the phenomenon of time regression, it should gradually disappear as 64-bit operating systems take over.
A 64-bit system can represent time up to 15:30:08 on December 4 of the year 292,277,026,596. Neither our generation nor countless generations of our descendants
need to worry about reaching that day; the Earth will have been destroyed long before, since that moment lies hundreds of billions of years in the future.
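A rough sanity check of that figure in Java, assuming an average Gregorian year of 365.2425 days (31,556,952 seconds):

System.out.println(Long.MAX_VALUE / 31556952L + 1970); // ≈ 292,277,026,596, the year quoted above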