How to use Ruby and Nokogiri to write a crawler that exports an RSS feed

The script below fetches rdoc.info's list of recently updated projects, reads each project's README header in a separate thread, filters out Rails-related entries, and writes the collected summaries to RubyFeeds.rss as an RSS 2.0 feed.

# encoding: utf-8
require 'thread'
require 'nokogiri'
require 'open-uri'
require 'rss/maker'

$result = Queue.new

# Fetch a project's page, follow its README frame, and queue
# [index, name, url, last-modified date, description] for the feed.
def extract_readme_header(no, name, url)
  frame = Nokogiri::HTML(open(url))
  return unless frame
  readme = $url + frame.css('frame')[1]['src']
  return unless readme
  open(readme) do |f|
    doc = Nokogiri::HTML(f.read)
    text = doc.css("p#content p#filecontents p")[0..4].map { |c| c.content }.join(" ").strip
    return if text.length == 0
    if text !~ /(rails)|(activ_)/i   # skip Rails-related projects
      puts "========= #{no} #{name} : #{text[0..50]}"
      date = f.last_modified
      $result << [no, name, readme, date, text]
    end
  end
rescue
  puts $!.to_s
end

# Build an RSS 2.0 document from the collected items.
def make_rss(items)
  RSS::Maker.make("2.0") do |m|
    m.channel.title = "GitHub recently updated projects"
    m.channel.link = "http://localhost"
    m.channel.description = "GitHub recently updated projects"
    m.items.do_sort = true
    items.each do |no, name, url, date, descr|
      i = m.items.new_item
      i.title = name
      i.link = url
      i.description = descr
      i.date = date
    end
  end
end

############################## M A I N ########################

############# Scan the list of recent projects, one thread per project
lth = []
$url = "http://rdoc.info"
puts "get url #{$url}..."
doc = Nokogiri::HTML(open($url))
doc.css('ul.libraries')[1].css('li').each_with_index do |li, i|
  aname = li.css('a').first
  name = aname.content
  purl = $url + aname['href']
  lth << Thread.new(i, name, purl) { |j, n, u| extract_readme_header(j, n, u) }
end

################ wait until every README has been read
lth.each { |th| th.join }

################ dequeue results and restore the list order
################ (the site lists the most recently updated projects first)
result = []
result << $result.shift while $result.size > 0
result.sort! { |a, b| a[0] <=> b[0] }

################ write the results out as RSS
File.open("RubyFeeds.rss", "w") do |file|
  file.write make_rss(result)
end

Note: on Ruby 3.0 and later, open-uri no longer extends Kernel#open, so the open(url) calls above must be written as URI.open(url).
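Once RubyFeeds.rss has been written, you can sanity-check it with Ruby's standard RSS parser. The following is a minimal sketch, assuming the script above has already run in the current directory; the file name RubyFeeds.rss comes from the script, everything else is the standard library.

require 'rss'

# Parse the feed produced by the script above.
# The second argument disables strict validation; output from
# RSS::Maker should normally validate, so true also works.
feed = RSS::Parser.parse(File.read("RubyFeeds.rss"), false)

puts "#{feed.channel.title} - #{feed.items.size} items"
feed.items.each do |item|
  puts "#{item.date}  #{item.title}"
  puts "  #{item.link}"
end

This prints the channel title followed by each item's date, title, and link, which is a quick way to confirm that the crawler collected what you expect before pointing a feed reader at the file.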

The above is a complete example of using Ruby and Nokogiri to crawl a site and export the results as an RSS feed.
