About the details of processing 50MB+ of text in Android

Source: Internet
Author: User

I haven't written a technical article in a long time, and I have to get up at 4 a.m. tomorrow to catch a flight, but these pitfalls feel worth recording, if only to comfort my fragile soul. I spent Monday and Tuesday solving this problem and fell into plenty of pits along the way. It's late at night, so I'll write it down bit by bit.

Scenario: one business module in the project has a table with about 600,000 rows, of which the terminal has to pull down roughly 200,000+ records. The data forms a tree of unknown depth; to find the depth you have to recurse, as in the sketch below.
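A minimal sketch of that recursion, assuming each record carries an id and a parent id (the field positions and names here are assumptions, not from the original post):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class TreeDepth {

    /** Index the rows by parent id: row[0] = id, row[1] = parentId (assumed layout). */
    public static Map<String, List<String>> index(List<String[]> rows) {
        Map<String, List<String>> children = new HashMap<>();
        for (String[] row : rows) {
            children.computeIfAbsent(row[1], k -> new ArrayList<>()).add(row[0]);
        }
        return children;
    }

    /** Depth of the subtree rooted at nodeId, counting the node itself. */
    public static int depth(Map<String, List<String>> children, String nodeId) {
        int max = 0;
        for (String child : children.getOrDefault(nodeId, Collections.<String>emptyList())) {
            max = Math.max(max, depth(children, child));
        }
        return max + 1;
    }
}
```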

First version: all the data was pulled down through a single interface (a SOAP web service). As the data gradually grew, it became clear that one interface simply could not pull that much data in a single call. Back to the drawing board...

Second version: one interface pulls down the top-level parent nodes, which are few, only ten or so. A second interface is then called with each record's primary key to fetch that node's children. However, some of these calls return so much data that encapsulating the JSON easily causes a memory overflow and crashes the app. It needed another round of improvement...

Third version: building on the second version, the interface that returns a node's children is changed to return the path of a TXT file instead, and the app downloads the text file from that path. The program reads a file faster, Java provides good file read/write APIs, and downloading a file is faster than hitting the interface repeatedly. So that became the approach.

However, reading a larger file in one pass and wrapping everything into objects still easily causes a memory overflow, and writing to the database was still slow even with bulk writes inside a transaction. After several rounds of coding and testing, the following turned out to be the best solution.

First, the batch-insert problem. Anyone who learned JDBC will remember PreparedStatement, the pre-compiled, batch-capable class used in the DAO layer. SQLite has a very similar class called SQLiteStatement; you can fill in the rest yourselves. That takes care of batch inserts, but what about the data in the big file? Read it in chunks of bytes, x*1024*1024 per read (x < 6), roughly in that range. The exact value depends on the phone's performance (memory); on an old device it will die, so the chunk size needs to be set dynamically. Bind each chunk of records to a SQLiteStatement and execute the bulk operation. A sketch of the whole pipeline follows. That's the idea.
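A minimal sketch of that pipeline, under assumptions not stated in the original post: the downloaded TXT file is UTF-8 with one record per line and '|'-separated fields, and the target table is a hypothetical node(id, parent_id, name). The post reads raw byte chunks of x*1024*1024; this sketch approximates that with a similarly sized character buffer so that multi-byte characters are never split across reads.

```java
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteStatement;

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.Reader;
import java.nio.charset.StandardCharsets;

public class ChunkedImporter {

    // The post suggests x * 1024 * 1024 with x < 6; in real code derive this
    // from the device's available memory instead of hard-coding it.
    private static final int CHUNK_CHARS = 4 * 1024 * 1024;

    public void importFile(SQLiteDatabase db, File file) throws IOException {
        char[] buffer = new char[CHUNK_CHARS];
        StringBuilder carry = new StringBuilder(); // partial line left over from the previous chunk

        try (Reader in = new InputStreamReader(new FileInputStream(file), StandardCharsets.UTF_8)) {
            int read;
            while ((read = in.read(buffer)) != -1) {
                carry.append(buffer, 0, read);
                int lastNewline = carry.lastIndexOf("\n");
                if (lastNewline < 0) {
                    continue; // no complete record yet, keep accumulating
                }
                String complete = carry.substring(0, lastNewline);
                carry.delete(0, lastNewline + 1);
                insertBatch(db, complete.split("\n"));
            }
            if (carry.length() > 0) {
                insertBatch(db, carry.toString().split("\n")); // flush the tail of the file
            }
        }
    }

    // Pre-compiled statement + one transaction per chunk, the SQLiteStatement
    // analogue of JDBC's PreparedStatement batching described in the post.
    private void insertBatch(SQLiteDatabase db, String[] lines) {
        SQLiteStatement stmt =
                db.compileStatement("INSERT INTO node (id, parent_id, name) VALUES (?, ?, ?)");
        db.beginTransaction();
        try {
            for (String line : lines) {
                String[] fields = line.trim().split("\\|"); // assumed record layout
                if (fields.length < 3) continue;            // skip malformed records
                stmt.clearBindings();
                stmt.bindString(1, fields[0]);
                stmt.bindString(2, fields[1]);
                stmt.bindString(3, fields[2]);
                stmt.executeInsert();
            }
            db.setTransactionSuccessful();
        } finally {
            db.endTransaction();
            stmt.close();
        }
    }
}
```

With the records never all held in memory at once and every chunk written inside a single transaction through a pre-compiled statement, both the memory overflow and the slow inserts described above are addressed.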

2017-04-19 00:07
