Programming languages are shaped by the computer systems they run on, and modern computer systems are heavily influenced by Unix. For Unix systems, midnight on January 1, 1970 (UTC) is the starting point of time, known as the epoch. Every timestamp is therefore expressed as the amount of time elapsed since that moment.
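As a minimal sketch of this convention, the following Python snippet (names are illustrative) shows that a Unix timestamp is just the elapsed seconds since the epoch, and that `time.time()` reports the same quantity directly:

```python
import datetime
import time

# A Unix timestamp is the number of seconds elapsed since
# 1970-01-01 00:00:00 UTC (the epoch).
epoch = datetime.datetime(1970, 1, 1, tzinfo=datetime.timezone.utc)
now = datetime.datetime.now(datetime.timezone.utc)

seconds_since_epoch = (now - epoch).total_seconds()

# time.time() reports the same count of seconds since the epoch.
print(seconds_since_epoch)
print(time.time())
```

The two printed values agree (up to the instant each call is made), because both measure from the same 1970-01-01 origin.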
A computer needs a reliable external clock source to synchronize against. Early Unix used a 32-bit word to represent time, counted in ticks of 1/60 of a second, i.e., a 60 Hz interval synchronized with the external time source (this was not unrelated to the fact that the US power grid runs at 60 Hz). As a result, a 32-bit counter at that rate could only span about 829 days (roughly 2.5 years), which was clearly not enough, so an epoch, a starting point for the count, had to be chosen. Unix originated in the late 1960s; its first official version ran on a PDP-11 in 1970, and the first Unix Programmer's Manual was published in November 1971. That manual defined the starting time as January 1, 1971, and acknowledged that the epoch would have to be redefined roughly every 2.5 years.
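The ~829-day figure follows directly from the counter width and tick rate, and can be checked with a few lines of arithmetic:

```python
# A 32-bit counter incremented 60 times per second overflows after
# 2**32 ticks. Convert that span into days and years.
TICKS = 2 ** 32
HZ = 60

seconds = TICKS / HZ
days = seconds / 86400          # 86400 seconds per day
years = days / 365.25

print(f"{days:.0f} days (~{years:.2f} years)")  # about 829 days, ~2.27 years
```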
The timekeeping unit was then revised to 1 second, so that 32 bits could represent a span of about 136 years, and the starting time was revised to January 1, 1970 (the Unix developers felt that rounding the earlier 1971-01-01 epoch back to the start of the decade made for a cleaner date than the somewhat arbitrary 1971). Since then, Unix has counted time from 1970-01-01, programs have used this convention accordingly, and subsequent operating systems influenced by Unix, such as OS/2, Windows, Macintosh, and Linux, have all followed this de facto standard.
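The ~136-year span claimed above is again simple arithmetic on the counter width, now with a 1-second tick:

```python
# With a 1-second tick, a 32-bit counter spans 2**32 seconds.
SECONDS = 2 ** 32
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # average Julian year

years = SECONDS / SECONDS_PER_YEAR
print(f"{years:.1f} years")  # about 136 years
```

Note this assumes an unsigned 32-bit counter; a signed 32-bit `time_t` halves the usable range to about 68 years past 1970.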
That is why the time in so many programming languages is calculated from midnight on January 1, 1970.