In the Django advanced foundation articles, some of the operations created database connections by hand in a non-mainstream way. That is rather low-level, but of course it is also a way to become familiar with the framework. In practice, Django ships with its own mechanism for connecting to databases and creating apps, along with a more sophisticated URL routing mechanism. Now that the basics are understood, let's talk about the mainstream approach, starting with a fresh look at the web framework.
Reply content: I don't know what your specific needs are, but if you're just getting started, my experience may be useful to you.
I taught myself, just a few days ago.
I spent 7 days learning Python; the textbook was http://learnpythonthehardway.org/book/ (Chinese version: http://readthedocs.org/docs/learn-python-the-hard-way-zh_cn-translation/en/latest/index.html).
Then I spent 6 days teaching myself Django.
There are several popular frameworks for current Python web development, but this article describes a more lightweight web framework: the Bottle framework. The theory is skipped; we go straight to example code.
1. Description of the problem: Recently I was doing back-end development for a system, using Python with the Bottle framework.
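The excerpt cuts off before the example code; as a rough illustration, here is a minimal Bottle backend (my own sketch, not the article's code; the route name and port are assumptions):

from bottle import Bottle, run, request

app = Bottle()

@app.route('/hello')
def hello():
    # read an optional query parameter, e.g. /hello?name=world
    name = request.query.get('name', 'world')
    return 'Hello, %s!' % name

if __name__ == '__main__':
    run(app, host='0.0.0.0', port=8080)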
... object is used as the input.
Standardized content extraction: uses standard XSLT templates to extract web page content.
Standardized output: outputs the content extracted from the web page in standard XML format.
Explicit extractor plug-in interface: the extractor is a clearly defined class that interacts with the crawler engine module through class methods.
3. Extractor code
Pluggable extractors are the core component of the instant web crawler project.
Recently the company has been giving us training, mainly on web automation testing. My daily work right now is testing an app; I'm just getting started, and when I read the bugs other people file, I don't feel I could find bugs that well myself. The first two weeks were spent setting up the automated testing environment. Because there is no time to practice during the day at work, I can only study on my own after hours, which is tough. Every day ...
Essentially, every Web application is a socket server, and the user's browser is a socket client. WSGI (Web Server Gateway Interface) is a specification that defines the interface between Web applications and Web servers written in Python, enabling decoupling between the two.
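To make the contract concrete, here is a minimal WSGI application (my own sketch, not from the excerpt); because the application is just a callable taking environ and start_response, the same code can be hosted by any WSGI-compliant server:

from wsgiref.simple_server import make_server

def application(environ, start_response):
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return ['Hello, WSGI!']   # Python 2 style; on Python 3 return bytes, e.g. [b'Hello, WSGI!']

if __name__ == '__main__':
    # wsgiref is the standard-library reference server; in production the same
    # callable could be served by gunicorn, uWSGI, mod_wsgi, and so on
    make_server('', 8000, application).serve_forever()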
... small, so communication is fast.
2. Flexible: HTTP allows any type of data object to be transferred; the type being transmitted is indicated by Content-Type.
3. Connectionless: connectionless means that each connection handles only one request. Once the server has processed the client's request and received the client's reply, the connection is closed. This saves transmission time.
4. Stateless: ...
... seconds; pypy+gevent and Node.js have similar performance, with pypy slightly better, while pypy+tornado is slightly slower than Node.js.
2. Frameworks:
As the example above shows, writing pure code is still fairly cumbersome. In practice we usually use a framework to write web applications; I chose a few lightweight frameworks to output HelloWorld:
Go+martini
package main

import "github.com/codegangsta/martini"
# (the start of the image-extraction function, which builds imglist, is truncated in this excerpt)
    print imglist
    cnt = 1
    for imgurl in imglist:
        urllib.urlretrieve(imgurl, '%s.jpg' % cnt)   # save each image as 1.jpg, 2.jpg, ...
        cnt += 1

if __name__ == '__main__':
    html = gethtml('http://www.baidu.com')
    getimg(html)
Following the method above, we can crawl a page and then extract the data we need.
In fact, using the urllib module for web crawling is extremely inefficient, so let us introduce the Tornado web server. Tornado ...
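The excerpt ends before any Tornado code appears. As a rough illustration of its non-blocking style, here is a minimal fetch with Tornado's HTTP client (my own sketch, using the older callback-style API to match the Python 2 era code on this page; the URL is just an example):

from tornado import httpclient, ioloop

def handle_response(response):
    if response.error:
        print "Error:", response.error
    else:
        print "fetched %d bytes" % len(response.body)
    ioloop.IOLoop.instance().stop()   # stop the loop once the page arrives

client = httpclient.AsyncHTTPClient()
client.fetch("http://www.baidu.com", handle_response)
ioloop.IOLoop.instance().start()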
1. Introduction: Python has many web frameworks. I looked into several of them, including Django, Pylons, Tornado, Bottle, and Flask; among these, Django has the largest number of users, and I learned Django because the Django framework is used in OpenStack. Django is an open-source web application framework, written in Python, that uses the MVC software design pattern.
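As a rough illustration of Django's take on MVC (my own sketch, not from the excerpt; names and the URLconf style are assumptions for an older, Python 2 era Django release): a view function receives the request and returns a response, and a URLconf maps URLs to views.

# myapp/views.py
from django.http import HttpResponse

def index(request):
    return HttpResponse("Hello from Django")

# myproject/urls.py
from django.conf.urls import url
from myapp.views import index

urlpatterns = [
    url(r'^$', index),   # route the site root to the index view
]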
Python supports multithreading, mainly through the thread and threading modules. This article mainly shares how to implement a multithreaded web crawler in Python. There are two ways to use a Thread: one is to create a function to be executed and pass it into a Thread object; the other is to subclass Thread and override its run() method.
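A minimal sketch of the first approach (my own example, not the article's code; Python 2 style urllib2 to match the rest of this page, and the URLs are placeholders):

import threading
import urllib2

def fetch(url):
    # each worker thread downloads one page
    body = urllib2.urlopen(url, timeout=10).read()
    print "%s -> %d bytes" % (url, len(body))

urls = ['http://www.baidu.com', 'http://www.example.com']
threads = [threading.Thread(target=fetch, args=(u,)) for u in urls]
for t in threads:
    t.start()
for t in threads:
    t.join()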
Use Python to write a script that monitors MySQL, then add the template in the Zabbix web UI. First, the MySQLdb interface is used to connect to the database:

# cat check_mysql_custom.py
#!/usr/local/bin/python
'''Author: chenmingle'''
'''Description: get mysql status'''
import os
import sys
try:
    import MySQLdb as mysql
except Exception, e:
    # bail out and hint at the missing dependency
    print e
    print "pip install mysql-python"
    sys.exit(1)
I have wanted to learn web crawlers for a very long time, but I never studied them carefully and was too lazy to get started. Recently, with the project nearly finished, I've used the spare time to learn this new language and pick up the new technology. (PS: apologies for the really ugly formatting.) The idiot-proof descriptions above are not meant to mock you, the reader; they are a note to myself, in case I forget one day how to do this, haha.
1. The procedure a browser follows to request a dynamic page. 2. WSGI: the Python Web Server Gateway Interface (or simply WSGI, pronounced "wizgy"). WSGI allows the developer to separate the choice of web framework from the choice of web server; you can mix and match web servers and web frameworks to select a suitable pairing.
This article shares with everyone the details of running and deploying the Python web framework Tornado, for your reference. The specifics are as follows.
1. Running and deployment
Because Tornado has its own built-in httpserver, running and deploying it is not the same as with other Python web frameworks.
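A minimal sketch of what this looks like in practice (my own illustration, not the article's deployment code; the handler name and port are assumptions): the framework ships its own HTTPServer and IOLoop, so the application is started directly instead of being handed to an external WSGI server.

import tornado.ioloop
import tornado.web

class MainHandler(tornado.web.RequestHandler):
    def get(self):
        self.write("Hello, Tornado")

application = tornado.web.Application([
    (r"/", MainHandler),
])

if __name__ == "__main__":
    application.listen(8888)                   # built-in HTTPServer under the hood
    tornado.ioloop.IOLoop.instance().start()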
1. Installing Python packages with pip
Most Python packages are installed with the pip utility, and a virtual environment created with pyvenv installs pip automatically.
1. Use pip to install Flask (or other Python packages):
pip install flask
2. A simple demo:
from flask import Flask
from flask import abort
from flask import redirect

app = Flask(__name__)

@app.rou...
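The demo is cut off at the route decorator; a guess at how it might continue (a minimal sketch that exercises the imported abort and redirect, not the author's original code):

from flask import Flask, abort, redirect

app = Flask(__name__)

@app.route('/')
def index():
    return 'Hello, Flask!'

@app.route('/old')
def old():
    return redirect('/')   # send the old URL to the index page

@app.route('/secret')
def secret():
    abort(404)             # respond with 404 Not Found

if __name__ == '__main__':
    app.run()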
1. Use the Python library urllib2, with its urlopen and Request methods.
2. The urlopen prototype:
urllib2.urlopen(url[, data][, timeout])
url is the address of the target web page; it can be a string or a Request object.
data is the parameters submitted to the target server by POST.
timeout specifies a time-out setting (see the usage example after the parameter descriptions).
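A minimal usage sketch of urlopen with these arguments (my own example, not the article's; the URL and form fields are placeholders):

import urllib
import urllib2

url = 'http://www.example.com/login'
data = urllib.urlencode({'user': 'test', 'passwd': 'test'})   # POST body
try:
    response = urllib2.urlopen(url, data, 10)   # 10-second timeout
    print response.read()
except urllib2.URLError, e:
    print 'request failed:', e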
Document directory
1. Capture simple web pages
2. Download an object
3. Basic use of urllib
4. Basic use of urllib2
I recently picked up Python again. Unfortunately, I don't get to use it at work, so I can only play with it in my spare time.
1. Capture simple web pages
# coding=utf-8
import urllib2
response = urllib2.
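A likely completion of the truncated statement (an assumption, not the author's exact code; baidu.com is simply the URL used elsewhere on this page):

# coding=utf-8
import urllib2
response = urllib2.urlopen('http://www.baidu.com')
print response.read()   # dump the raw HTML of the page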
... extracted from a web page in standard XML format.
Explicit extractor plug-in interface: the extractor is a well-defined class that interacts with the crawler engine module through class methods.
3. Extractor code
The pluggable extractor is the core component of the instant web crawler project, defined as a class: GsExtractor. The Python 2.7 source code file and its doc...
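The GsExtractor source itself is not reproduced in this excerpt. As a rough illustration only, here is what a pluggable, XSLT-driven extractor interface can look like (my own sketch using lxml; class and method names are assumptions, not the real GsExtractor API):

# -*- coding: utf-8 -*-
from lxml import etree

class SimpleExtractor(object):
    '''Illustrative pluggable extractor: the XSLT template decides WHAT to
    extract, so the class itself stays generic and reusable.'''

    def __init__(self, xslt_text):
        # compile the XSLT template once, reuse it for every page
        self.transform = etree.XSLT(etree.XML(xslt_text))

    def extract(self, html_text):
        # parse the page, apply the template, return standard XML text
        doc = etree.HTML(html_text)
        return etree.tostring(self.transform(doc), pretty_print=True)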