I migrated a Rails application from Linux to Windows 2003 and switched the proxy server from lighttpd to IIS. After a quick trial, everything basically worked.
But in the admin backend, uploading an image raised the error "Size is not included in the list".
I looked into it at noon. At first I assumed the file size had exceeded the limit, but it had not. Adding some debug output showed that self.size was 0, yet going back and uploading again succeeded, so the problem seemed to be in how the file was being stored.
Googling around, I found someone who had "solved" it by calling sleep 5 before reading size. That solution is far too sloppy.
Another blog post has a much better fix, so I am reposting it here. I hate the GFW.
Fixing attachment_fu on Windows
Like many others, I've encountered issues when developing Rails applications that use attachment_fu on Windows. After doing some research, I've come up with the following solution to the problem.
The problem has two parts:
- the "Size is not included in the list" error message,
- a Timeout error when uploading to S3.
Fixing the "Size is not included in the list" error message
Some people have reported a timing issue with Tempfile on Windows when trying to get the file size. It seems that the size of the file is not properly reported by Windows right after writing data to it. Proposed solutions for this problem include:
- Sleeping in a loop as long as the file size is 0,
- Reading back the entire file in memory.
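For illustration, the sleep-in-a-loop workaround can be sketched like this (the helper name and polling interval are my own invention; this is the approach I'm arguing against):

```ruby
require 'tempfile'

# Hypothetical sketch of the sleep-based workaround: poll until the OS
# reports a non-zero size. Wasteful, and it needs an escape hatch so it
# cannot spin forever on a file that really is empty.
def wait_for_nonzero_size(tmpfile, max_tries = 50)
  tries = 0
  while File.stat(tmpfile.path).size == 0 && tries < max_tries
    sleep 0.1
    tries += 1
  end
  File.stat(tmpfile.path).size
end

tmp = Tempfile.new('upload')
tmp.write('some image data')
tmp.flush # without this, the loop may spin while bytes sit in Ruby's buffer
size = wait_for_nonzero_size(tmp)
```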
I think I found a better and less patchy solution for this issue: forcing the OS to flush the file to disk before reading its size.
Here is the code to do it:

    require 'tempfile'

    class Tempfile
      def size
        if @tmpfile
          @tmpfile.fsync # added this line
          @tmpfile.flush
          @tmpfile.stat.size
        else
          0
        end
      end
    end
Doing a flush is not enough: flush empties Ruby's buffer, but the OS may not immediately write the file to disk. Calling fsync ensures that the OS has written the file to disk before we continue. After that, Windows properly reports the actual file size.
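To make the flush/fsync distinction concrete, here is a minimal sketch using plain IO calls (independent of the Tempfile patch above): flush pushes Ruby's userspace buffer to the OS, and fsync asks the OS to commit the data to disk before stat is consulted.

```ruby
require 'tempfile'

tmp = Tempfile.new('demo')
tmp.write('hello world')        # sits in Ruby's userspace IO buffer
tmp.flush                       # hand the buffered bytes to the OS
tmp.fsync                       # force the OS to write them to disk
size = File.stat(tmp.path).size # stat now reports the real size, 11 bytes
```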
Fixing the Timeout error when uploading to S3
This issue is related to opening files for reading on Windows: you have to open the file in binary mode. Patching attachment_fu is simple:
    require 'technoweenie/attachment_fu/backends/s3_backend'

    module Technoweenie
      module AttachmentFu
        module Backends
          module S3Backend
            protected

            def save_to_storage
              if save_attachment?
                S3Object.store(
                  full_filename,
                  (temp_path ? File.open(temp_path, "rb") : temp_data), # added "rb"
                  bucket_name,
                  :content_type => content_type,
                  :access => attachment_options[:s3_access]
                )
              end
              @old_filename = nil
              true
            end
          end
        end
      end
    end

I've also included a fix from someone else (which was not enough in itself to solve my S3 upload problem):

    module Technoweenie
      module AttachmentFu
        # Gets the data from the latest temp file. This will read the file into memory.
        def temp_data
          if save_attachment?
            f = File.new(temp_path)
            f.binmode
            return f.read
          else
            return nil
          end
        end
      end
    end
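The reason binary mode matters: in text mode on Windows, Ruby translates "\r\n" to "\n" on read, which corrupts binary payloads such as images (the PNG signature even contains a "\r\n" specifically to catch this kind of mangling). A small sketch, runnable on any OS:

```ruby
require 'tempfile'

tmp = Tempfile.new('binary-demo')
tmp.binmode                    # write the bytes verbatim
tmp.write("\x89PNG\r\n\x1A\n") # the 8-byte PNG signature
tmp.flush

# "rb" (or File#binmode) disables newline translation on Windows;
# on other OSes it is a harmless no-op.
data = File.open(tmp.path, 'rb') { |f| f.read }
```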
Wrapping it up
So I put all this code in lib/attachment_fu_patch.rb and required it in environment.rb.
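For completeness, the wiring is just a one-liner (paths as in my setup; in Rails of that era, lib/ is on the load path automatically):

```ruby
# config/environment.rb
# lib/attachment_fu_patch.rb contains both patches above
require 'attachment_fu_patch'
```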
Problem fixed!
Note: I did not test these fixes on other OSes, but they should not have any adverse effects.