Storing physical files in MongoDB (GridFS) and accelerating delivery with Squid


While reading the MongoDB GridFS documentation, I learned how it stores physical files, including large ones. To deepen the impression, I wrote an example that uploads files to MongoDB. Since the files live in a document database, you cannot access them in the usual way, so a dedicated aspx page was written to read them back (for example image or MP3 files). After downloading the example you will therefore see two pages: uploadfile.aspx (upload) and getfile.aspx (reads the file from MongoDB as a stream). For access speed, Squid is introduced to accelerate file delivery. By default Squid only caches static files, so we also need to set cache headers on the ASPX page output; all of this is covered in this article.

First, the development environment: I use VS2008 + SP1, and for the MongoDB client the latest version of the samus MongoDB-CSharp driver.
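The upload and download code below both rely on a `DB` field and an `Init()` helper that the article never shows. As a rough sketch of what they might look like with the samus MongoDB-CSharp driver (the connection string, host/port, and database name are my assumptions, not from the original):

```csharp
using MongoDB;

public partial class UploadFile : System.Web.UI.Page
{
    // Hypothetical reconstruction: the original post never shows Init() or DB.
    private Mongo mongo;
    protected Database DB;

    protected void Init()
    {
        // Host, port, and database name are assumptions; adjust to your setup.
        // Older samus driver versions instead use new Mongo() with defaults.
        mongo = new Mongo("Server=127.0.0.1:27017");
        mongo.Connect();
        DB = mongo["mongodbsample"];
    }
}
```

Remember to call `mongo.Disconnect()` (for example in `Page_Unload` or `Dispose`) when the page is done with the connection.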

Add references to the following namespaces:

  using MongoDB;
  using MongoDB.GridFS;

The following is the main code for uploading a file:

uploadfile.aspx.cs

  HttpPostedFile myFile = FileUpload.PostedFile;
  int nFileLen = myFile.ContentLength;

  // read the uploaded file into memory
  byte[] myData = new byte[nFileLen];
  myFile.InputStream.Read(myData, 0, nFileLen);

  // "filesystem" is the GridFS bucket name; the download page uses "gfstream"
  GridFile fs = new GridFile(DB, filesystem);

  // give the uploaded file a random name, keeping its original extension
  Random random = new Random(unchecked((int)DateTime.Now.Ticks));
  string newfilename = string.Format("{0}{1}{2}",
      random.Next(1000, 99999),
      random.Next(1000, 99999),
      System.IO.Path.GetExtension(myFile.FileName));

  GridFileStream gfs = fs.Create(newfilename);
  gfs.Write(myData, 0, nFileLen);
  gfs.Close();

Here we simply give the uploaded file a random name. If everything works, you can then find the file in the database (the original post includes a screenshot at this point).

Next, let's look at how getfile.aspx.cs serves the uploaded physical file by taking a filename query-string parameter (the original post shows the corresponding document structure in MongoDB here):

 
 
  protected void Page_Load(object sender, EventArgs e)
  {
      if (!String.IsNullOrEmpty(Request.QueryString["filename"]))
      {
          string filename = Request.QueryString["filename"];
          Init();
          string filesystem = "gfstream";

          GridFile fs = new GridFile(DB, filesystem);
          GridFileStream gfs = fs.OpenRead(filename);

          byte[] buffer = new byte[10000];
          HttpContext.Current.Response.AddHeader("Expires", DateTime.Now.AddDays(20).ToString("r"));
          HttpContext.Current.Response.AddHeader("Cache-Control", "public");

          // the amount of data left to read
          long dataToRead = gfs.Length;
          int length;
          while (dataToRead > 0)
          {
              // check whether the client is still connected
              if (HttpContext.Current.Response.IsClientConnected)
              {
                  length = gfs.Read(buffer, 0, 10000);
                  HttpContext.Current.Response.OutputStream.Write(buffer, 0, length);
                  HttpContext.Current.Response.Flush();
                  dataToRead -= length;
              }
              else
              {
                  // stop looping if the connection is no longer available
                  dataToRead = -1;
              }
          }
          gfs.Dispose();
          HttpContext.Current.Response.End();
      }
  }

The original post then shows the final result: a page listing the files retrieved from MongoDB.

Although MongoDB has good concurrency performance, every retrieval from MongoDB still carries a cost, especially for physical files that rarely change. Squid is therefore used as a file cache. By default Squid only caches static files, so the stream output of the ASPX page in this example has to be made cacheable.

First, if squid.conf contains the following lines, comment them out with #; they prevent Squid from caching anything that looks dynamic (cgi-bin paths or URLs containing "?"):

 
 
  hierarchy_stoplist cgi-bin ? \.php \.html
  acl QUERY urlpath_regex cgi-bin \? \.php \.html
  cache deny QUERY

Next, modify the corresponding .aspx page and add the following headers to its response:

 
 
  HttpContext.Current.Response.AddHeader("Expires", DateTime.Now.AddDays(20).ToString("r"));
  HttpContext.Current.Response.AddHeader("Cache-Control", "public");

In this way, SQUID will faithfully cache the corresponding file based on the header information.

Of course, you can also use the following rules to have Squid cache only the specified aspx files:

 
 
  acl CACHABLE_PAGES urlpath_regex getfile\.aspx
  # allow caching of this aspx page
  no_cache allow CACHABLE_PAGES

The following acl (commented out here) matches all dynamic pages and would disable caching of every aspx page:

 
 
  # acl NONE_CACHABLE_PAGES urlpath_regex \? \.aspx
  # disable caching of other aspx pages
  # no_cache deny NONE_CACHABLE_PAGES
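If you enable both sets of rules, order matters: Squid evaluates `no_cache` rules in sequence, so the allow rule must come before the deny, otherwise the deny would also swallow getfile.aspx. A combined fragment (ACL names follow the article; the regexes are the corrected forms) might look like:

```
# cache getfile.aspx, refuse all other dynamic aspx pages;
# the allow rule must precede the deny
acl CACHABLE_PAGES urlpath_regex getfile\.aspx
acl NONE_CACHABLE_PAGES urlpath_regex \? \.aspx
no_cache allow CACHABLE_PAGES
no_cache deny NONE_CACHABLE_PAGES
```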

The following lines set how long the page stays fresh in the cache; the times are in minutes, so the first line caches for one day (1440 minutes) and the second for two minutes (only one of them should be active at a time):

 
 
  refresh_pattern ^http://10.0.4.114:1100/mongodbsample/getfile.aspx 1440 0% 1440 ignore-reload
  refresh_pattern ^http://10.0.4.114:1100/mongodbsample/getfile.aspx 2 0% 2 ignore-reload

With Squid configured correctly, you only need to access the site through Squid (here http://10.0.4.85:8989/mongodbsample/uploadfile.aspx). Squid fetches the page from the origin server at http://10.0.4.114:1100/mongodbsample/uploadfile.aspx, and the getfile.aspx responses linked from that page are all cached.
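The article never shows the part of squid.conf that places Squid in front of the origin server. As a hedged sketch, assuming Squid 2.6+ accelerator-mode syntax and the hosts/ports used above, it might look like:

```
# listen on 8989 in accelerator (reverse-proxy) mode
http_port 8989 accel defaultsite=10.0.4.114
# forward cache misses to the origin ASP.NET server on port 1100
cache_peer 10.0.4.114 parent 1100 0 no-query originserver
```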

That's all for today. The sample source code and Squid configuration file can be downloaded here: /Files/daizhj/mongodbsample.rar

Original article title: Using MongoDB to store uploaded physical files and accelerating them with Squid (based on aspx pages)

Link: http://www.cnblogs.com/daizhj/archive/2010/08/19/1803454.html
