MediaWiki import database download (Wikipedia:Database download): http://zh.wikipedia.org/wiki/Wikipedia:%E6%95%B0%E6%8D%AE%E5%BA%93%E4%B8%8B%E8%BD%BD
MediaWiki data import methods
Use the PHP maintenance scripts that come with MediaWiki, or mwdumper. See Manual:Importing XML dumps, reproduced below.
This page describes methods to import XML dumps.
Contents
- 1 How to import?
- 1.1 Using Special:Import
- 1.1.1 Changing permissions
- 1.1.2 Possible problems
- 1.2 Using importDump.php, if you have shell access
- 1.2.1 FAQ
- 1.2.2 Error messages
- 1.3 Using mwdumper
- 1.4 Using xml2sql
- 1.5 Using pywikipediabot, pagefromfile.py and Nokogiri
- 2 What to import?
- 3 Troubleshooting
- 3.1 Merging histories, revision conflict, edit summaries, and other complications
- 3.2 Interwikis
- 4 See also
The Special:Export page of any MediaWiki site, including any Wikimedia site and Wikipedia, creates an XML file (content dump). See meta:Data dumps and Manual:DumpBackup.php. XML files are explained in more detail on meta:Help:Export.
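If you have shell access to your own wiki, the same kind of dump can be generated server-side with dumpBackup.php. A minimal sketch, assuming it is run from the MediaWiki root directory:
php maintenance/dumpBackup.php --current > dump.xml
Here --current exports only the latest revision of each page; --full exports the complete revision history.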
There are several methods for importing these XML dumps:
How to import?
Using Special:Import
Special:Import can be used by wiki users with the import permission (by default this means users in the sysop group) to import a small number of pages (about 100 should be safe). Trying to import large dumps this way may result in timeouts or connection failures. See meta:Help:Import for a detailed description.
See Manual:XML Import file manipulation in CSharp for a C# code sample that manipulates an XML import file.
Changing permissions
See Manual:User_rights.
To allow all registered editors to import (not recommended), the lines added to LocalSettings.php would be:
$wgGroupPermissions['user']['import'] = true;
$wgGroupPermissions['user']['importupload'] = true;
Possible problems
To use transwiki import, PHP safe_mode must be off and open_basedir must be empty; otherwise the import fails.
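You can check both settings from the shell before attempting the import; a quick sketch, assuming the php CLI binary is on your PATH:
php -r 'var_dump(ini_get("safe_mode"), ini_get("open_basedir"));'
Both values should come back empty or false.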
Using importDump.php, if you have shell access
Recommended method for general use, but slow for very big data sets. For very large amounts of data, such as a dump of a big Wikipedia, use mwdumper, and import the links tables as separate SQL dumps.
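For reference, a typical mwdumper invocation converts the XML dump to SQL and pipes it straight into MySQL. A sketch, assuming mwdumper.jar sits in the current directory and the database is named wikidb (both are assumptions; adjust to your setup):
java -jar mwdumper.jar --format=sql:1.5 dumpfile.xml.bz2 | mysql -u wikiuser -p wikidb
The sql:1.5 format targets the MediaWiki 1.5+ schema; check the mwdumper documentation for the format matching your installation.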
importDump.php is a command-line script located in the maintenance folder of your MediaWiki installation. If you have shell access, you can call importDump.php like this (add paths as necessary):
php importDump.php --conf LocalSettings.php dumpfile.xml.gz wikidb
Or this:
php importDump.php < dumpfile.xml
where dumpfile.xml is the name of the XML dump file. If the file is compressed and has a .gz or .bz2 file extension, it is decompressed automatically.
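You can also decompress explicitly and feed the XML in on standard input, which helps when the dump has a nonstandard extension. A sketch for a bzip2-compressed dump, assuming it is run from the MediaWiki root:
bzcat dumpfile.xml.bz2 | php maintenance/importDump.php
Use zcat instead of bzcat for .gz files.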
Afterwards use importImages.php to import the images:
php importImages.php ../path_to/images
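On some MediaWiki versions importImages.php also expects the list of allowed file extensions as extra arguments; the extension list below is only an example, adjust it to your files:
php importImages.php ../path_to/images jpg png gif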
Note: If you are using a WAMP installation, you may have trouble with the import due to InnoDB settings (by default this engine is disabled in my.ini, so to avoid trouble, use the MyISAM engine).
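To see which storage engines your MySQL server actually has enabled, you can ask it from the shell (assuming the mysql client is installed and you have credentials):
mysql -u root -p -e "SHOW ENGINES;"
If InnoDB shows up as DISABLED, either enable it in my.ini or create the wiki tables with MyISAM.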
Note: For MediaWiki versions older than 1.16, to run importDump.php (or any other tool from the maintenance directory), you need to set up your AdminSettings.php file.
Note: Running importDump.php can take quite a long time. For a large Wikipedia dump with millions of pages, it may take days, even on a fast server. Also note that the information in meta:Help:Import about merging histories, etc. applies here as well.
After running importDump.php, you may want to run rebuildrecentchanges.php in order to update the content of your Special:RecentChanges page.
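A minimal invocation, assuming it is run from the MediaWiki root:
php maintenance/rebuildrecentchanges.php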