After Scrapy crawls an item, we can use pipelines.py to insert the item into a MySQL database.
In pipelines.py, define a class that does the database insert, and remember to add that class's path to ITEM_PIPELINES = {...} in settings.py. The database connection details (host IP, port, username, password, database name) can then go into settings.py in several ways: define each one separately, e.g. MYSQL_USER = 'root', or pack them all into a single connection-URL string and parse it in pipelines.py.
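Here is a minimal sketch of such a pipeline, not the post author's exact code. Assumptions of mine: the pymysql driver, a connection URL stored in settings.py under the name MYSQL_URL, and a hypothetical table articles(title, body).

```python
# Sketch of a MySQL item pipeline for Scrapy. MYSQL_URL and the
# articles(title, body) table are assumptions, not from the original post.
from urllib.parse import urlparse

def parse_mysql_url(url):
    """Split a mysql://user:password@host:port/dbname URL into parts."""
    p = urlparse(url)
    return {
        "host": p.hostname,
        "port": p.port or 3306,
        "user": p.username,
        "password": p.password,
        "db": p.path.lstrip("/"),
    }

def build_insert_sql(table, item):
    """Build a parameterized INSERT statement from an item dict."""
    columns = sorted(item)  # deterministic column order
    placeholders = ", ".join(["%s"] * len(columns))
    sql = "INSERT INTO {} ({}) VALUES ({})".format(
        table, ", ".join(columns), placeholders)
    return sql, [item[c] for c in columns]

class MySQLPipeline:
    def open_spider(self, spider):
        # Imported here so this module still loads without pymysql installed.
        import pymysql
        cfg = parse_mysql_url(spider.settings.get("MYSQL_URL"))
        # charset="utf8mb4" avoids the garbled-Chinese problem up front.
        self.conn = pymysql.connect(charset="utf8mb4", **cfg)

    def process_item(self, item, spider):
        sql, params = build_insert_sql("articles", dict(item))
        with self.conn.cursor() as cursor:
            cursor.execute(sql, params)
        self.conn.commit()
        return item

    def close_spider(self, spider):
        self.conn.close()
```

To enable it, settings.py would need something like ITEM_PIPELINES = {"myproject.pipelines.MySQLPipeline": 300} (the project name is a placeholder).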
It worked, but then I found that Chinese text inserted into the database came out garbled.
A Baidu search turned up this blog post: 61619656/
So we can set MySQL's character set to utf8 by modifying /etc/mysql/my.cnf:
[mysqld]
character-set-server=utf8

[client]
default-character-set=utf8

[mysql]
default-character-set=utf8
Then restart MySQL: $ sudo /etc/init.d/mysql restart
Then run mysql> show variables like '%char%'; and you can see that the settings have changed to utf8.
Also, when creating a table in MySQL, we can specify its character set as utf8 directly; Navicat has an option for this, which is very convenient.
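The same per-table fix can be done in plain SQL; here is an illustrative DDL statement (the table and columns are made up, not from the original post). Note that MySQL's legacy utf8 stores at most 3 bytes per character, so utf8mb4 is the safer choice for full Unicode, including emoji:

```python
# Hypothetical DDL for a per-table charset fix; the table and columns are
# illustrative. MySQL's legacy "utf8" charset stores at most 3 bytes per
# character, so "utf8mb4" is the safer modern choice.
CREATE_ARTICLES = """
CREATE TABLE IF NOT EXISTS articles (
    id    INT AUTO_INCREMENT PRIMARY KEY,
    title VARCHAR(255),
    body  TEXT
) DEFAULT CHARSET=utf8mb4;
"""
```

This statement could be executed once at startup (e.g. in open_spider) so the table always exists with the right charset.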
Well, it couldn't be more convenient.
Scrapy crawler writing Chinese to MySQL: a fix for garbled text