We occasionally find keys in Redis whose values are dozens of MB in size, which is abnormal data. Is there a way to list all keys larger than 10 MB, sorted by value size?
Reply content:
I found a tool for you and verified that it works: https://github.com/sripathikrishnan/redis-rdb-tools#generate-memory-report
Installation and usage are documented in the README. If the `rdb` command cannot be found after installation, you can run the script directly from the installation directory:
rdbtools/cli/rdb.py -c memory /path/to/your/dump.rdb > result.csv
| Database | Type | Key | Size_in_bytes | Encoding | Num_elements | Len_largest_element |
|----------|------|-----|---------------|----------|--------------|---------------------|
| 0 | string | "cccc" | 98 | string | 4 | 4 |
| 0 | string | "bbb" | 96 | string | 3 | 3 |
| 0 | hash | "user" | 102 | ziplist | 1 | 6 |
| 0 | string | "aa" | 94 | string | 2 | 2 |
The `size_in_bytes` column in the result is the size you want. After exporting, you can sort the rows in descending order by that column. Also, be sure to replace the path with the location of your own dump.rdb file. I have not tested this with a very large dataset; the analysis will probably be slow in that case.
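To answer the original question (keys over 10 MB, sorted by size), the exported CSV can be filtered and sorted with a few lines of Python. This is a minimal sketch, not part of the tool itself; the `big_keys` function name is my own, and it assumes the CSV header produced by the memory report includes a `size_in_bytes` column as shown in the table above.

```python
import csv

# Hypothetical helper: read the rdb-tools memory report CSV and return the
# rows whose value size exceeds a threshold, largest first.
def big_keys(report_path, threshold=10 * 1024 * 1024):
    with open(report_path, newline="") as f:
        rows = list(csv.DictReader(f))
    # Keep only keys above the threshold (10 MB by default).
    big = [r for r in rows if int(r["size_in_bytes"]) > threshold]
    # Sort by size, descending.
    return sorted(big, key=lambda r: int(r["size_in_bytes"]), reverse=True)
```

For example, `big_keys("result.csv")` would return the offending keys in order, with the largest one first.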