A few days ago we planned to add a daily-task feature to the task system. The requirement is simple: daily tasks are refreshed at a fixed time every day. When I thought about how to implement it, my first idea was to fetch three times: the time the task was last completed (pre), the current time (now), and the day's refresh time (refresh), and then run some fairly convoluted logic (xx words omitted here). Since that scheme needs date and time operations in both Lua and C++, and date arithmetic is troublesome in both, Shui Gou offered a cleaner solution: first subtract the day's refresh time from each timestamp (for example, with a daily 15:00 refresh, subtract the 15:00 offset), which aligns each refresh moment to the previous day's 0:00; then divide both the current time and the last completion time by the number of seconds in a day (24*60*60) to get a day number for each. If the last completion time's day number is less than the current time's, the daily task should be refreshed. The Lua code is roughly as follows:
```lua
local now = os.time()
local pre = task.completed_time
local refresh = (15 - 8) * (60 * 60)
-- subtract 8 because os.time() returns UTC-based time, while Beijing time is UTC+8
-- floor to whole days; with plain division any now > pre would look like a new day
local pre_days = math.floor((pre - refresh) / (24 * 60 * 60))
local now_days = math.floor((now - refresh) / (24 * 60 * 60))
if now_days > pre_days then
    -- refresh the task
end
```
Very concise: the entire algorithm needs only one time function, os.time().
I thought that was the end of it, but after repeated tests a tragedy emerged: the server time was correct, while the client's deviated by about one minute. The same library and the same script were producing different times. After digging three feet down, I found that on the client, Lua's double-precision numbers were coming out of additions and subtractions as single-precision values; the lost precision produced the time deviation. A tragedy in broad daylight.
The first thing that came to mind was a problem in the Lua library, since it was inside the Lua script that the precision changed. But a simple test I wrote directly in C++ showed the precision changing there as well. Discussing the issue with a few friends again, Shui Gu mentioned that DX changes the floating-point precision. Indeed, I remembered seeing it in the D3D SDK: under CreateDevice, among the D3DCREATE flags, there is this option:

D3DCREATE_FPU_PRESERVE: "Set the precision for Direct3D floating-point calculations to the precision used by the calling thread. If you do not specify this flag, Direct3D defaults to single-precision round-to-nearest mode for two reasons:
- Double-precision mode will reduce Direct3D performance.
- Portions of Direct3D assume floating-point unit exceptions are masked; unmasking these exceptions may result in undefined behavior."
Obviously, when this flag is not specified, D3D sets the FPU to single-precision mode, because single-precision arithmetic performs better than double-precision. And because D3D modifies the FPU by default, it affected Lua's precision (Lua's number type is nothing but a double).
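Assuming a standard D3D9 setup, the rendering-side fix would be to pass that flag at device creation. This fragment is illustrative only (the variable names are my own, and it obviously builds only in a Windows/D3D9 environment):

```cpp
// Illustrative fragment: keep the calling thread's FPU precision by
// passing D3DCREATE_FPU_PRESERVE when creating the device.
IDirect3DDevice9 *device = NULL;
HRESULT hr = d3d->CreateDevice(
    D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
    D3DCREATE_HARDWARE_VERTEXPROCESSING | D3DCREATE_FPU_PRESERVE,
    &present_params, &device);
```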
As for the performance problem, here is my take: for a single floating-point operation, a modern CPU computes double and float at very close speeds; the bigger wins are more likely in CPU cache hit rate and memory bandwidth, since a double carries twice the data of a float, and a 3D environment performs an enormous number of floating-point operations.
In the end, since the problem only affected an intermediate result, we switched the client over to using the server's time, so the client's FPU settings did not need to be touched: it still runs with the FPU in the default 24-bit (single) precision mode. How to get both performance and accuracy at once is something I did not dig into further.