[shell] Iteratively reading Redis to solve data-instability problems

Problem background: PHP reads data from Redis online, but the connection is often unstable and some requests simply get no response, so some keys come back without a value.
Workaround: read repeatedly; each round re-reads only the keys that were still missing after the previous round, until everything has been fetched.
This article implements that repeated reading with shell scripts. Every round fetches whatever it can (in our example a line counts as valid if the key actually got its value back from Redis, and invalid otherwise), then feeds the invalid keys into the next round, iterating until all of the Redis data has been retrieved. PS: the Redis data itself can be read with PHP or C; the interface is very simple (see phpredis). Because that part may involve confidential details, this article does not show its implementation.
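Purely as a sketch of what getredis.php is assumed to do (host, port and output format here are assumptions, not taken from the article), the same read can be expressed with redis-cli: for every key in an input file it prints "key value", or just the key when Redis returns nothing, which is exactly what the later scripts treat as an invalid line.

    # sketch only -- assumed behaviour of getredis.php, expressed with redis-cli
    # usage: sh getredis_sketch.sh source/xaa > result/source/xaa
    while read key
    do
        value=$(redis-cli -h 127.0.0.1 -p 6379 GET "$key")
        echo "$key $value"      # an empty $value marks the key as not fetched yet
    done < "$1"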
Method Overview:
- Split the keys in source/ into n files and fetch their values from Redis in parallel (a split sketch follows this list).
- Watch the n files' output in real time: count the total number of lines fetched so far and how many of them are valid.
- When all n files have finished, merge their results into result/steps/stepX.res.
- Delete the current round's parallel source files and their result files.
- Put the keys that did not get a value into source/step2/contsigns as the next round's input, split them into n files again and rerun (a contsign is simply the Redis key here).
- Finally, write every round's results into final.res: cat result/steps/step*.res >> final.res
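A minimal sketch of that first split (the chunk size and the name of the full key file under source/all/ are assumptions; the article only says the keys are divided into n files):

    # cut the full key list into n chunks named xaa, xab, ...; 10000 keys per chunk is illustrative
    cd source && split -l 10000 all/contsigns && cd ..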
Project structure:
- getredis.php: fetches the values for the given keys from Redis;
- all.sh: the main program; runs getredis.php on the n source files in parallel;
- analyze_result.sh: real-time analysis of the fetch progress (step 2 above); when called with an argument it also performs steps 3-5 above (see the note below);
- source/: the input data; all/ holds the complete set of Redis keys, step2/ holds the keys not fetched in the current round (the next round's input), and the remaining files (such as xaa) are the n key chunks of the current round;
- result/: the results; source/ holds the output of the current round's n source files, and steps/ holds the merged results of each round.
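To make the layout concrete, the working tree is assumed to look roughly like this after a couple of rounds (only contsigns and step*.res are named in the article; the other names are illustrative):

    source/
        all/                   # full key list, input of the very first round
        xaa xab xac ...        # the n key chunks of the current round (split output)
        step2/contsigns        # keys still without a value; input of the next round
    result/
        source/xaa xab ...     # per-chunk output of getredis.php for the current round
        steps/step1.res ...    # merged valid key-value lines of each round
    final.res                  # cat result/steps/step*.res >> final.res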
Specific implementation:
all.sh:
    #Author: Rachel Zhang
    #Email: [email protected]

    for file in source/*
    do
    {
        echo "$file ..."
        if test -f $file
        then
            php getredis.php $file > result/$file
        fi
        echo "$file done"
    }&
    done
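Note that all.sh returns as soon as the background jobs are launched; analyze_result.sh is then used to watch their progress. If you would rather have all.sh block until every chunk is done, one option (an addition, not part of the original script) is a trailing wait:

    # assumed addition, appended after the final "done" of all.sh:
    wait                                # block until every background {...}& job has finished
    echo "all source chunks processed"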
analyze_result.sh:
    #Author: Rachel Zhang
    #Email: [email protected]

    filefolder=result/source/*
    #filefolder=source/*

    echo "##################################"
    echo "in folder $filefolder"
    echo "##################################"

    nl=0
    hv=0
    for file in $filefolder
    do
        if test -f $file
        then
            fline=`wc -l $file | awk '{print $1}'`
            nl=$(($nl+$fline))
            fvalue=`cat $file | awk 'BEGIN{x=0;} {if($2) x=x+1;} END{print x;}'`
            hv=$(($hv+$fvalue))
        fi
    done
    echo "Totally $nl lines"
    echo "$hv lines have valid value"

    ################################### combine results into one file
    if [ "$#" -gt 0 ]
    then
        if test -e "result/all.result"
        then
            mv result/all.result result/all.result.bak
        fi
        for file in $filefolder
        do
            if test -f $file
            then
                cat $file >> result/all.result
            fi
        done
        echo "All the output is written in result/all.result"

        # put null-value keys into source/step2/contsigns
        if test ! -e source/step2
        then
            mkdir source/step2
        fi
        cat result/all.result | awk '{if($2==null) print $1}' > source/step2/contsigns
        nnull_value=`wc -l source/step2/contsigns | awk '{print $1}'`
        echo "Remaining ${nnull_value} keys to be processed"

        # put has-value lines into result/steps/step${step_id}.res
        step_id=1
        while test -e result/steps/step${step_id}.res
        do
            step_id=$(($step_id+1))
        done
        cat result/all.result | awk '{if($2) print}' > result/steps/step${step_id}.res
        echo "Current valid results are written in result/steps/step${step_id}.res"

        # remove the current source files, generate new source files and re-run all.sh
        echo "Remove current source and result files? (y/n)"
        read answer
        if [ $answer == "y" ]; then
            if [ $nnull_value -gt 10 ]; then
                rm source/*
                rm result/source/*
                # 10000 lines per chunk is a placeholder; adjust to control the number of parallel files
                cd source && split -l 10000 step2/contsigns && cd ../
                echo "Now re-run sh ./all.sh? (y/n)"
                read answer
                if [ $answer == "y" ]; then
                    sh all.sh
                fi
            fi
        fi
    fi
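Typical usage, assuming both scripts sit in the project root: with no argument analyze_result.sh only prints the progress counters (step 2); with any argument it also runs the merge and re-split logic (steps 3-5).

    sh analyze_result.sh      # step 2: lines fetched so far and how many already have a value
    sh analyze_result.sh 1    # steps 3-5: merge results, collect missing keys, optionally clean up and re-run all.sh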
PS: the interactive parts of analyze_result.sh (the read prompts) can also be removed so that it needs no user input, and the script can then be scheduled with crontab for fully automated execution.
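For example, if the prompts are stripped and the cleanup/re-split made unconditional in a copy named analyze_result_auto.sh (a hypothetical name; the path and interval below are illustrative too), a crontab entry along these lines would drive the iteration every 10 minutes:

    # added via crontab -e (illustrative schedule and path)
    */10 * * * * cd /path/to/project && sh analyze_result_auto.sh 1 >> cron.log 2>&1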
Copyright notice: this is an original post by the blog author; please do not reproduce it without the author's permission.