First, let's take a look at New Relic's report.
 
Average response time of the last 24h
 
 
The pages with the highest traffic (by Action)
 
 
Response times for a few individual Actions:
 
TopicsController#show
 
 
UsersController#show (much worse, mainly because the GitHub API request is slow)
 
 
PS: I made a small change before publishing this article: the GitHub request is now handled in a background queue. The new result looks like this:
 
 
TopicsController#index
 
 
HomeController#index
 
 
From the reports above, the response time of the Ruby backend, excluding the user profile page (which is slowed by the GitHub API), is under 100ms, and often even lower.
 
How did we do that?
 
Markdown Cache
Fragment Cache
Data caching
ETag
Static resource caching (JS, CSS, images)
Markdown Cache
 
Whenever the content is modified, the Markdown is rendered right away and the result is saved to the database, so it does not have to be recomputed on every page view.
 
Note that this result is deliberately stored in the database rather than in the cache:
 
It persists, so a Memcached restart does not wipe out a large amount of rendered content;
It avoids consuming too much cache memory.
 
 
  
  
class Topic
  field :body       # stores the original content, used for editing
  field :body_html  # stores the rendered result, used for display
  before_save :markdown_body

  def markdown_body
    self.body_html = MarkdownTopicConverter.format(self.body) if self.body_changed?
  end
end
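In the view, the pre-rendered body_html is then output directly, so no Markdown work happens at request time. A minimal sketch (the sanitize helper here is an assumption; the actual helper used may differ):

app/views/topics/show.html.erb
<div class="body">
  <%# body_html was rendered once at save time %>
  <%= sanitize @topic.body_html %>
</div>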
Fragment Cache
 
   
  
This is the most heavily used caching scheme on Ruby China, and the main reason for the speed improvement.
 
 
  
  
app/views/topics/_topic.html.erb
<% cache([topic, suggest]) do %>
  <div class="topic topic_line topic_<%= topic.id %>">
    <%= link_to(topic.replies_count, "#{topic_path(topic)}#reply#{topic.replies_count}",
                :class => "count state_false") %>
    ... content section omitted ...
  </div>
<% end %>
 
   
  
The fragment's cache key is composed as views/topics/#{id}-#{updated_at timestamp}/#{suggest argument}/#{template content MD5}, for example: views/topics/19105-20140508153844/false/bc178d556ecaee49971b0e80b3566f12
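Roughly how that key is composed (illustrative only, based on the standard Rails/Mongoid cache_key behavior):

# topic.cache_key combines the model name, id and updated_at timestamp:
topic.cache_key
# => "topics/19105-20140508153844"
# cache([topic, suggest]) prefixes "views/", appends the extra key parts and the
# template digest, giving:
# => "views/topics/19105-20140508153844/false/bc178d556ecaee49971b0e80b3566f12"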
For parts whose display state depends on the current user, the full HTML is rendered into the cached fragment and the state is then toggled with JavaScript, such as the current "read"/"like" indicators:
 
 
  
  
<script type= "Text/javascript" >
 var readed_topic_ids = <%= current_user.filter_readed_topics (@topics)% >;
 for (var i = 0; i < readed_topic_ids.length i++) {
  topic_id = readed_topic_ids[i];
  $ (". Topic_" + topic_id + ". Right_info. Count"). AddClass ("State_true");
</script>
 
   
  
Another example:
 
 
  
app/views/topics/_reply.html.erb
<% cache([reply, "raw:#{@show_raw}"]) do %>
  <div class="reply">
    <div class="pull-left face"><%= user_avatar_tag(reply.user, :normal) %></div>
    <div class="infos">
      <div class="info">
        <span class="name"><%= user_name_tag(reply.user) %></span>
        <span class="opts">
          <%= likeable_tag(reply, :cache => true) %>
          <%= link_to("", edit_topic_reply_path(@topic, reply), :class => "edit icon small_edit",
                      'data-uid' => reply.user_id, :title => "Edit reply") %>
          <%= link_to("", "#", 'data-floor' => floor, 'data-login' => reply.user_login,
                      :title => t("topics.reply_this_floor"), :class => "icon small_reply") %>
        </span>
      </div>
      <div class="body"><%= sanitize_reply reply.body_html %></div>
    </div>
  </div>
<% end %>
   
  
The reply's cache_key works the same way: views/replies/202695-20140508081517/raw:false/d91dddbcb269f3e0172bf5d0d27e9088
 
The more complex user-permission controls are likewise implemented in JavaScript:
 
 
  
  
<script type= "Text/javascript" >
 $ (document). Ready (function () {
  <% if admin?%>
   $ ("#replies. Reply A.edit "). CSS (' Display ', ' inline-block ');
  <% elsif current_user%>
   $ ("#replies. Reply a.edit[data-uid= ' <%= current_user.id ']"). CSS (' Display ' , ' Inline-block ');
  <% End%>
  <% if Current_User &&! @user_liked_reply_ids. Blank?%>
   Topics.checkreplieslikestatus ([<%= @user_liked_reply_ids. Join (",")%>];
  <% end%>
 })
</script>
 
   
  
Data caching
 
In fact, most of Ruby China's model queries are not cached at all, because in practice MongoDB's query response times are very fast, mostly within 5ms or even lower.
 
We only cache a few relatively expensive data queries, such as fetching a user's GitHub repos:
 
 
  
  
def github_repos(user_id)
  cache_key = "user:#{user_id}:github_repos"
  items = Rails.cache.read(cache_key)
  if items.blank?
    items = real_fetch_from_github()
    Rails.cache.write(cache_key, items, expires_in: 15.days)
  end
  return items
end
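The same read-through pattern can also be written with Rails.cache.fetch; a minimal sketch (note that fetch only treats a nil/missing entry as a cache miss, whereas the version above treats any blank value as one):

def github_repos(user_id)
  Rails.cache.fetch("user:#{user_id}:github_repos", expires_in: 15.days) do
    real_fetch_from_github()
  end
end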
ETag
 
   
  
ETag is an HTTP response header that lets the browser ask, on its next request, whether the content has changed; if it has not, the server replies 304 Not Modified and skips the transfer, reducing network overhead.
 
The process goes roughly like this:
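A rough illustration of the exchange (the path and ETag value here are made up for the example):

# 1st request : GET /topics/19105
#               -> 200 OK, full body rendered, response carries ETag: "bc178d55..."
# 2nd request : GET /topics/19105 with If-None-Match: "bc178d55..."
#               content unchanged -> 304 Not Modified, empty body, browser reuses its copy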
 
 
Rails' fresh_when method can generate the ETag for your response content:
 
 
  
  
def show
  @topic = Topic.find(params[:id])
  fresh_when(etag: [@topic])
end
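For reference, the explicit counterpart is stale?, which returns false when the client's ETag still matches (Rails has already prepared the 304 response at that point), so the action can skip rendering. A minimal sketch:

def show
  @topic = Topic.find(params[:id])
  if stale?(etag: [@topic])
    render :show
  end
end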
 
   
  
Static Resource Caching
 
Do not underestimate this: no matter how fast the backend is written, the page can still be dragged down by these things (browser-side performance)!
 
1. Make proper use of the Rails Asset Pipeline, and be sure to enable it!
 
 
  
  
# config/environments/production.rb
config.assets.digest = true
 
   
  
2. In Nginx, set the cache expiry for CSS, JS, and images to max (safe because the digested file names from step 1 change whenever the content changes);
 
 
  
  
location ~ (/assets|/favicon.ico|/*.txt) {
  access_log off;
  expires max;
  gzip_static on;
}
 
   
  
3. Reduce the number of JS, CSS, and image files on each page as much as possible; the simplest way is to merge them, which cuts down HTTP request overhead.
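With the asset pipeline from step 1, this mostly comes for free: a single include tag serves one merged, fingerprinted file per type. A sketch, assuming the default application manifests:

<%= stylesheet_link_tag "application" %>
<%= javascript_include_tag "application" %>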
 
 
Some Tips
 
Look at your traffic statistics and logs, and prioritize the high-traffic pages;
updated_at is extremely helpful for expiring caches, use it! Don't forget to bump it when you modify data (see the sketch after this list);
Keep an eye on the query times in your Rails log; a page response time around 100ms is a good target, and above 200ms users will start to feel the sluggishness.
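Since the fragment cache keys above include updated_at, bumping it is what invalidates the cached HTML. One common way to do that (a sketch in ActiveRecord/Mongoid style, assuming replies should expire their topic's cached fragments) is to touch the parent record:

class Reply
  # saving a reply bumps topic.updated_at, which changes the topic's
  # cache_key and therefore expires its cached fragments
  belongs_to :topic, touch: true
end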