Asking for a user-statistics solution for a large data volume.
For example, an application has a million users.
500 txt files are generated every day; each file corresponds to one user's operation records. The file content includes the user name, IP address, accessed page, MAC address, access duration, and other fields.
Now we need to perform statistics and filtering on the contents of these files. Is there any good solution?
What I can think of now is to import the data into a MySQL database and then do the statistics and filtering there. However, the data volume is too large and I don't know whether that is feasible. Importing into the database quickly is also a problem.
------ Solution --------------------
I have never dealt with a user base of this magnitude.