What is a UNIX timestamp?
A Unix timestamp (also known as Unix epoch, Unix time, POSIX time, or Unix timestamp)
is the number of seconds that have elapsed since midnight UTC/GMT on January 1, 1970, not counting leap seconds.
Note:
JavaScript: Math.round(new Date().getTime() / 1000), where getTime() returns a value in milliseconds
C#: (DateTime.Now.ToUniversalTime().Ticks - 621355968000000000) / 10000000, where Ticks counts 100-nanosecond intervals since January 1, 0001
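The same conversion can be sketched in Python (an illustrative addition; the article's own snippets are in JavaScript and C#):

```python
import time
from datetime import datetime, timezone

# Current Unix timestamp in whole seconds; time.time() returns a float
# of seconds since the epoch, analogous to getTime() / 1000 above.
ts = int(time.time())

# Timestamp 0 maps back to midnight UTC, January 1, 1970.
epoch = datetime.fromtimestamp(0, tz=timezone.utc)
print(epoch)  # 1970-01-01 00:00:00+00:00
```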
Why does computer time start from January 1, 1970?
The earliest Unix operating systems were 32-bit and stored time in a signed 32-bit integer, whose maximum value is 2147483647. One 365-day year contains 31536000 seconds, and 2147483647 / 31536000 ≈ 68.1, so 32 bits can represent a span of at most about 68 years. The counter will actually reach its maximum at 03:14:07 on January 19, 2038. One second past that point, every 32-bit time value wraps to the bit pattern 10000000 00000000 00000000 00000000, which represents 20:45:52 on December 13, 1901. Time will appear to jump backward, and much software will malfunction.
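The 2038 rollover described above can be verified with a few lines of Python (an illustrative sketch; the article itself gives no code here):

```python
from datetime import datetime, timedelta, timezone

INT32_MAX = 2**31 - 1  # 2147483647, the largest signed 32-bit value
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)

# The last second a signed 32-bit counter can represent:
last = epoch + timedelta(seconds=INT32_MAX)
print(last)  # 2038-01-19 03:14:07+00:00

# One second later the counter wraps around to -2**31,
# i.e. the bit pattern 10000000 00000000 00000000 00000000:
wrapped = epoch + timedelta(seconds=-(2**31))
print(wrapped)  # 1901-12-13 20:45:52+00:00
```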
Because 32 bits can span only about 68 years, and the earliest Unix system weighed both the age of computing and the expected lifetime of its applications, January 1, 1970 was chosen as the Unix epoch (the start time). As for the rollover problem, it will gradually be resolved by the move to 64-bit operating systems: a 64-bit counter can represent times up to 15:30:08 on December 4 of the year 292,277,026,596. That is billions of years away, long past even the likely end of the Earth, so neither our generation nor many generations after us need worry about it.
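A quick arithmetic check of the 64-bit figure (a sketch; the average-year constant is an assumption used only for this estimate):

```python
INT64_MAX = 2**63 - 1          # largest signed 64-bit value
SECONDS_PER_YEAR = 31_556_952  # average Gregorian year of 365.2425 days

# How many years past 1970 a signed 64-bit seconds counter can reach:
years_after_1970 = INT64_MAX // SECONDS_PER_YEAR
print(years_after_1970)  # roughly 292 billion, consistent with the year 292,277,026,596
```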
Deep reading:
Unix timestamp: http://baike.baidu.com/view/821460.htm
DateTime.Ticks in C#: https://msdn.microsoft.com/zh-cn/library/system.datetime.ticks.aspx
The Unix epoch:
http://www.cnblogs.com/haitao-fan/archive/2013/01/09/2853740.html
http://lixiaoyu080.blog.163.com/blog/static/4342673820104711362145