The most common way to move data in DB2 is with the import and export utilities. The Import and Export commands look simple, but they hide many subtleties, and a careless invocation can fail in dozens of ways. Here I summarize the commands we use most often at work and share them with you. Before that, I think it is worth reviewing some basic Import/Export knowledge.

File formats:

DEL: a delimited ASCII file. Rows are separated by a row delimiter and columns by a column delimiter.
ASC: a fixed-length (non-delimited) ASCII file. Rows are separated by a row delimiter; each column has a fixed length.
PC/IXF: can only be used to move data between DB2 databases. Numeric values are packed as decimal or binary depending on their type; character values are stored as ASCII, and only the used length of variable-length fields is saved. The file contains both the table definition and the table data.
WSF: used to exchange data with spreadsheets (worksheets). This format is rarely used.

DB2 supports different file types for the different data-movement utilities. I personally think the support matrix is worth noting:

File type            import    export    load
---------------------------------------------
DEL (delimited)      yes       yes       yes
ASC (non-delimited)  yes       no        yes
IXF                  yes       yes       yes
WSF (worksheet)      yes       yes       no

A brief introduction to the three utilities:

export: exports data. IXF, DEL, and WSF are supported.
import: imports data into tables. All four file types above are supported.
load: imports data. Functionally it is much the same as import; the file types shown for it above are supported.

About Export

Export is relatively simple and there is not much to say about it. The general command is:

export to filename of filetype select x from xx where ...;

and that will do. Two points are worth noting here:

1. Exporting with a different character set (code page): modified by codepage=<n>. Example:

export to filename.del of del modified by codepage=1386 select ... from ...
where ...;

Here the data is converted to the specified code page as it is dumped from the database.

2. Exporting timestamps in a custom format: modified by timestampformat="yyyy-mm-dd hh:mm:ss tt". Example:

export to filename.del of del modified by timestampformat="yyyy-mm-dd hh:mm:ss tt" select ... from ... where ...;

About Import

1. Import modes: CREATE / INSERT / INSERT_UPDATE / REPLACE / REPLACE_CREATE

CREATE: first creates the target table and its indexes, then imports the data into the new table. The only supported file format is PC/IXF. You can also specify the tablespace in which the new table is created.
INSERT: inserts the imported data into the table. The target table must already exist.
INSERT_UPDATE: inserts data into the table, or updates rows that match on the primary key. The target table must already exist and have a primary key defined.
REPLACE: deletes all existing data and inserts the imported data into an existing target table.
REPLACE_CREATE: if the target table already exists, the import utility deletes the existing data and inserts the new data, just like REPLACE. If the target table is not defined, it first creates the table and its related indexes, then imports the data. As you might guess, the input file must be in PC/IXF format, because that format contains a structured description of the exported table. If the target table is a parent table referenced by a foreign key, REPLACE_CREATE cannot be used.

2. Batch commits: COMMITCOUNT commits the inserted data after every COMMITCOUNT rows. This is a good approach when importing large amounts of data from a file. Example:

import from filename of del commitcount 50000 insert into tabname;

3. Batched inserts: modified by compound=x sends x rows from the file to the server as one group. Used together with the batch commits above, this is ideal. Example:

import from filename of del modified by compound=50 insert into tabname;
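The two batching options above can be used together, as the text suggests. A minimal sketch, assuming a hypothetical file /tmp/customers.del and target table myschema.customers:

```sql
-- Hypothetical names; send rows to the server in compound groups of 50
-- and commit after every 50000 rows.
import from /tmp/customers.del of del
  modified by compound=50
  commitcount 50000
  insert into myschema.customers;
```

Issued from the DB2 command line processor, this reduces both network round-trips (compound) and log pressure from one giant transaction (commitcount).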
4. ROWCOUNT: imports only the first ROWCOUNT rows. Sometimes the business logic only needs part of the data, so ROWCOUNT looks like a good choice; in my own tests, however, ROWCOUNT never had any effect. Example:

import from filename of del rowcount 10000 insert into tabname;

5. Import starting point: RESTARTCOUNT starts the import from record RESTARTCOUNT of the input file. Example:

import from filename of del restartcount 55 rowcount 10000 insert into tabname; -- starting from record 55, import 10000 rows

6. Limiting the number of warnings: WARNINGCOUNT. When the imported data produces warnings or errors (such as type mismatches or column mismatches) and their number exceeds WARNINGCOUNT, the import is stopped. Example:

import from filename of del warningcount 10 insert into tabname;

7. Disabling per-row warnings: modified by norowwarnings. Example:

import from filename of del modified by norowwarnings warningcount 10 insert into tabname;

8. LOB files: lobs from specifies the path where the LOB files are stored. Example:

import from filename of del lobs from '/home' modified by norowwarnings warningcount 10 insert into tabname;

9. We recommend against using import on tables with an identity column defined as GENERATED ALWAYS, because import cannot preserve the original sequence values. The modifiers identityignore and identitymissing both change the original identity values, so if other tables are related to the exported table through those identity values, the meaning of the data itself is lost. Therefore, use import on identity tables as little as possible. What should you do instead? Use load in place of import; we will come back to this under Load.

About Load

1. String delimiter, column delimiter, and decimal point: CHARDEL / COLDEL / DECPT. The example in the original was garbled; a typical form is:

load client from 'f:\s1.del' of del modified by chardel"" coldel; decpt, insert into DB2ADMIN.ZXTABLES
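To make the delimiter modifiers above concrete, here is a hypothetical illustration: suppose s1.del quotes strings with double quotes, separates columns with semicolons, and uses a comma as the decimal point (European style):

```sql
-- Sample row in f:\s1.del (hypothetical):  "Zhang";"Widget";12,5
-- chardel""  : strings are enclosed in double quotes
-- coldel;    : columns are separated by semicolons
-- decpt,     : a comma marks the decimal point
load client from 'f:\s1.del' of del
  modified by chardel"" coldel; decpt,
  insert into DB2ADMIN.ZXTABLES
```

Note that each modifier takes its delimiter character immediately after the keyword, with no equals sign.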
2. Line breaks inside records: modified by delprioritychar. If a database record contains a line break, the data cannot be loaded with the defaults. DB2's default delimiter priority for load is: record delimiter, character delimiter, column delimiter. Because the record delimiter has the highest priority, a newline inside a field in the source file makes load treat what follows as a new record. If user data can contain embedded newlines (for example, forum posts, where they cannot simply be removed), you must use delprioritychar to change the default priority so that everything between the character delimiters ("") is treated as one record, whether or not it contains a line break. Example:

load client from 'f:\s1.del' of del modified by delprioritychar insert into DB2ADMIN.ZXTABLES

3. Handling the tablespace pending state after load: COPY YES / NONRECOVERABLE. In a recoverable database (archive logging enabled), load defaults to COPY NO. In this mode, after the load completes the tablespace is placed in backup-pending state: the data in the table can be queried, but the tablespace must be backed up once before the table accepts update, insert, and other operations. The two options above avoid this. With COPY YES, a backup is performed automatically after the load completes. NONRECOVERABLE marks the load as not recoverable: it neither puts the tablespace into the pending state nor backs it up automatically. Its disadvantage is in the name: the loaded data cannot be recovered when the database is rolled forward, so the risk is higher. Personally, however, I find NONRECOVERABLE more practical. Examples:

load client from 'f:\s1.del' of del insert into DB2ADMIN.ZXTABLES nonrecoverable
load client from 'f:\s1.del' of del insert into DB2ADMIN.ZXTABLES copy yes
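If you do run a default COPY NO load and the tablespace lands in backup-pending state, a sketch of the usual way out follows (the database name mydb, tablespace userspace1, and backup path are assumptions for illustration):

```sql
-- Check tablespace states; a backup-pending tablespace shows it in the
-- State field of the detailed output.
list tablespaces show detail

-- A backup of the affected tablespace releases the backup-pending state
-- and makes the table fully writable again.
backup database mydb tablespace (userspace1) online to /backup
```

This is why COPY YES or NONRECOVERABLE is usually chosen up front: it saves an unplanned backup in the middle of a maintenance window.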
4. Loading IXF files into a multi-partition database: PARTITIONED DB CONFIG MODE LOAD_ONLY_VERIFY_PART PART_FILE_LOCATION. When data is moved between two databases with different numbers of partitions and you still want to use load for the IXF data, things get tricky. At the time I searched IBM's official documentation and found nothing. Just as I was getting frustrated, wolf appeared and offered me a trick, which I share with you here.

Prepared by starbhhc
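For reference, a sketch of a load using the options named above (file name, table name, and path are placeholders; check the LOAD command reference for your DB2 version before relying on this):

```sql
-- LOAD_ONLY_VERIFY_PART assumes the input data is already split per
-- partition; the loading agents verify rather than redistribute it.
-- PART_FILE_LOCATION points at the directory holding the per-partition files.
load from export.ixf of ixf
  insert into myschema.mytable
  partitioned db config mode load_only_verify_part
  part_file_location /db2/splitfiles
```

The PARTITIONED DB CONFIG clause and its options go at the end of the LOAD command.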