Why do programming languages and databases start counting time from January 1, 1970?

Source: Internet
Author: User
Tags: epoch time
This came up today while using datetime in SQL Server: inserting DateTime.MinValue raised an error, because the database's minimum datetime and .NET's minimum datetime are inconsistent. I checked some material online and found the following:
In the .NET Framework:
DateTime.MinValue => 0001/01/01 00:00:00
SqlDateTime.MinValue.Value => 1753/01/01 00:00:00
DateTime.MaxValue => 9999/12/31 23:59:59.999
SqlDateTime.MaxValue.Value => 9999/12/31 23:59:59.997

In SQL Server 2005:
datetime minimum value => 1753/01/01 00:00:00
smalldatetime minimum value => 1900/01/01 00:00:00
datetime maximum value => 9999/12/31 23:59:59.997
smalldatetime maximum value => 2079/06/06

Therefore, when you need to insert the minimum time into the database, you cannot use DateTime.MinValue; use SqlDateTime.MinValue.Value instead.
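The same pitfall exists in any client language: the language's minimum datetime value may fall below the range a SQL Server datetime column accepts. A minimal sketch of the idea in Python (the clamp helper and its name are my own illustration, not a library API):

```python
from datetime import datetime

# Range of SQL Server's datetime type (matches SqlDateTime.MinValue /
# SqlDateTime.MaxValue in .NET): 1753-01-01 to 9999-12-31 23:59:59.997
SQL_DATETIME_MIN = datetime(1753, 1, 1)
SQL_DATETIME_MAX = datetime(9999, 12, 31, 23, 59, 59, 997000)

def clamp_to_sql_datetime(value: datetime) -> datetime:
    """Clamp a datetime into the range SQL Server's datetime type accepts."""
    if value < SQL_DATETIME_MIN:
        return SQL_DATETIME_MIN
    if value > SQL_DATETIME_MAX:
        return SQL_DATETIME_MAX
    return value

# datetime.min (0001-01-01) would be rejected by a datetime column,
# just like DateTime.MinValue in .NET; clamp it before inserting.
print(clamp_to_sql_datetime(datetime.min))  # 1753-01-01 00:00:00
```

Clamping (or substituting SqlDateTime.MinValue.Value directly) avoids the out-of-range error described above.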

Now that the SQL Server datetime problem was solved, I suddenly thought of the system time of January 1, 1970. Where did that time come from? Out of curiosity, I read the following explanations on the Internet:

1. The UNIX system simply regards 00:00:00 on January 1, 1970 as the epoch of time. That is why we say a Unix timestamp counts seconds from January 1, 1970. This explanation is the lazy person's favorite ^_^

 

2. At first, computer operating systems were 32-bit, and time was also represented with 32 bits. The maximum value of a signed 32-bit integer is 2,147,483,647. A year has 365 × 24 × 3600 = 31,536,000 seconds, and 2,147,483,647 / 31,536,000 ≈ 68.1, which is to say a signed 32-bit counter can represent at most about 68 years. In fact, the maximum is reached at 03:14:07 on January 19, 2038. One second after that point, the counter on 32-bit systems wraps to 10000000 00000000 00000000 00000000 in binary, i.e. -2,147,483,648, which corresponds to 20:45:52 on December 13, 1901. Time would roll backward, and much software would behave abnormally.
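The overflow described above can be reproduced directly. A short Python sketch that models a signed 32-bit time counter (the wrap-around is simulated with struct, since Python integers do not overflow on their own):

```python
import struct
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
MAX_INT32 = 2**31 - 1  # 2,147,483,647

# Last moment a signed 32-bit time_t can represent
last = EPOCH + timedelta(seconds=MAX_INT32)
print(last)  # 2038-01-19 03:14:07+00:00

# One second later, the 32-bit counter wraps to its minimum value:
# reinterpret the unsigned bit pattern 0x80000000 as a signed integer.
wrapped = struct.unpack("<i", struct.pack("<I", MAX_INT32 + 1))[0]
print(wrapped)  # -2147483648

# ...which maps back to a date before the epoch
first = EPOCH + timedelta(seconds=wrapped)
print(first)  # 1901-12-13 20:45:52+00:00
```

This is exactly the "time regression" the explanation warns about: one tick past 2038-01-19 03:14:07 UTC, a 32-bit clock jumps back to 1901.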

Here, I think the answer to the question has emerged: because 32 bits can express a time span of at most about 68 years, the earliest UNIX designers, taking into account the era in which computers and their applications appeared, chose January 1, 1970 as the epoch (start time) of UNIX time. As for the time-regression problem, it will gradually be solved with the emergence of 64-bit operating systems, because a signed 64-bit counter can represent times up to about the year 292,277,026,596. Even if the Earth is destroyed, we need not worry that the counter will run out, because that year is hundreds of billions of years away.
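The 64-bit figure is easy to check with back-of-the-envelope arithmetic. A quick sketch, using the average Gregorian year of 31,556,952 seconds (365.2425 days) as an assumption:

```python
# Rough last year representable by a signed 64-bit seconds counter
# starting at the 1970 epoch, assuming an average Gregorian year.
MAX_INT64 = 2**63 - 1          # 9,223,372,036,854,775,807 seconds
SECONDS_PER_YEAR = 31_556_952  # 365.2425 days

last_year = 1970 + MAX_INT64 // SECONDS_PER_YEAR
print(last_year)  # 292277026596
```

So a 64-bit time counter lasts until roughly the year 292,277,026,596, which matches the figure quoted above.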
