As we all know, in PHP a date can be converted directly to a timestamp, and front-end JavaScript offers the same capability through the Date object's valueOf() method. One difference worth noting: PHP's time() returns seconds, while JavaScript's valueOf() returns milliseconds.
For example, we can use this code to output the timestamp of the current moment:
<script type="text/javascript">
document.write(new Date().valueOf());
</script>
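As a quick sketch (runnable in Node.js; the variable names here are my own), the snippet below shows that valueOf(), getTime(), and Date.now() all yield the same millisecond count, and how dividing by 1000 produces a PHP-style timestamp in seconds:

```javascript
// Three equivalent ways to get the current timestamp in milliseconds.
const d = new Date();
const ms1 = d.valueOf();  // primitive value of the Date: ms since the epoch
const ms2 = d.getTime();  // identical to valueOf() for Date objects
const ms3 = Date.now();   // current time, without constructing a Date

// JavaScript counts milliseconds; PHP's time() counts seconds.
// Dividing by 1000 and flooring gives a PHP-style timestamp.
const seconds = Math.floor(ms1 / 1000);

console.log(ms1, seconds);
```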
So what exactly is a timestamp?
A timestamp is the number of seconds elapsed from 00:00:00 UTC on January 1, 1970 to the current moment. That instant, January 1, 1970, is the so-called "epoch" in IT, and you have probably seen it in many articles: for example, why Oracle date handling and Unix/Linux time calculations count from 1970 rather than 1900. Let's explain its origin.
1. When 32-bit computers were on the rise, the largest value a signed integer type (for example, Java's Integer) could express was:
System.out.println(Integer.MAX_VALUE);
2147483647
2. A 365-day year contains 365 × 24 × 3600 = 31,536,000 seconds.
3. Dividing the two gives 2147483647 / 31536000 ≈ 68.1.
4. So on a 32-bit platform, the maximum time span the second counter can express is about 68 years.
5. Counting from 1970, the counter reaches its maximum at 03:14:07 UTC on January 19, 2038.
6. One second after that point, the counter on a 32-bit system overflows to 10000000 00000000 00000000 00000000,
which, read as a signed value, is -2147483648, i.e., 20:45:52 UTC on December 13, 1901. Time wraps backwards, and software that does not expect this may fail in serious ways.
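The two boundary moments in the list above can be verified directly in JavaScript (a minimal sketch; keep in mind that the Date constructor takes milliseconds, so the second counts are multiplied by 1000):

```javascript
// Largest value of a signed 32-bit second counter: 2^31 - 1.
const MAX_INT32 = 2147483647;
// One step past the maximum wraps a signed 32-bit value around to -2^31.
const MIN_INT32 = -2147483648;

// JavaScript Dates take milliseconds, so multiply the seconds by 1000.
const lastSecond = new Date(MAX_INT32 * 1000);
const wrapAround = new Date(MIN_INT32 * 1000);

console.log(lastSecond.toISOString()); // 2038-01-19T03:14:07.000Z
console.log(wrapAround.toISOString()); // 1901-12-13T20:45:52.000Z
```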
To sum up, the people who designed this scheme, the first real Unix hackers, knew that 68 years was not a long time. They simply started counting from 1970 and hoped that, within those 68 years, a better mechanism would be developed to lift the restriction. And indeed, with the rise of 64-bit systems, a signed 64-bit counter of seconds pushes the overflow date from January 19, 2038 all the way to December 4 of the year 292,277,026,596.
That number is big enough: by then, the "end of the world" of computer time might as well be the end of the world itself.
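To see why the 64-bit counter is "big enough", here is a rough back-of-the-envelope calculation in JavaScript using BigInt (31,556,952 is the average number of seconds in a Gregorian year of 365.2425 days; the exact overflow date depends on calendar details):

```javascript
// Maximum value of a signed 64-bit second counter: 2^63 - 1.
const MAX_INT64 = 2n ** 63n - 1n;

// Average seconds in a Gregorian year (365.2425 days * 86400 seconds).
const SECONDS_PER_YEAR = 31556952n;

// Rough number of years the counter can cover, starting from 1970.
const years = MAX_INT64 / SECONDS_PER_YEAR;
const overflowYear = 1970n + years;

console.log(overflowYear); // roughly 292 billion years in the future
```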
Code changes the world: my source code, my world!