Today, while importing previously backed-up data into Oracle, I hit the error ORA-12899: value too large for column. The data was too large for the target columns; let me share how I worked through it.
The following errors occurred while importing data exported from production into the test database:

ORA-02374: conversion error loading table "SWTONLINE"."TBL_TRANS_LOG_HISTORY_B"
ORA-12899: value too large for column MER_ADDR_NAME (actual: 54, maximum: 40)
The preliminary diagnosis is a character-set difference: a Chinese character occupies 3 bytes in UTF-8 but only 2 bytes in ZHS16GBK.
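The arithmetic can be checked outside the database. A minimal Python sketch (assuming ZHS16GBK behaves like Python's gbk codec for common Chinese characters, which holds for the BMP characters typically stored in an address column):

```python
# 18 Chinese characters fit in a VARCHAR2(40) column under ZHS16GBK
# (2 bytes each = 36 bytes), but need 54 bytes once converted to UTF-8
# (3 bytes each) -- matching the "actual: 54, maximum: 40" in the error.
addr = "中" * 18  # hypothetical 18-character merchant address

print(len(addr.encode("gbk")))    # bytes in GBK (stand-in for ZHS16GBK)
print(len(addr.encode("utf-8")))  # bytes in UTF-8
```

This is why a row that fits comfortably in the ZHS16GBK production database overflows the same column definition in the AL32UTF8 test database.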
Check the character set of the test database:
sys@PROD> select * from v$nls_parameters;

PARAMETER                      VALUE
------------------------------ ------------------------------
NLS_LANGUAGE                   SIMPLIFIED CHINESE
NLS_TERRITORY                  CHINA
NLS_CURRENCY                   $
NLS_ISO_CURRENCY               CHINA
NLS_NUMERIC_CHARACTERS         .,
NLS_CALENDAR                   GREGORIAN
NLS_DATE_FORMAT                DD-MON-RR
NLS_DATE_LANGUAGE              SIMPLIFIED CHINESE
NLS_CHARACTERSET               AL32UTF8
NLS_SORT                       BINARY
NLS_TIME_FORMAT                HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT           DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT             HH.MI.SSXFF AM TZR
NLS_TIMESTAMP_TZ_FORMAT        DD-MON-RR HH.MI.SSXFF AM TZR
NLS_DUAL_CURRENCY              $
NLS_NCHAR_CHARACTERSET         AL16UTF16
NLS_COMP                       BINARY
NLS_LENGTH_SEMANTICS           BYTE
NLS_NCHAR_CONV_EXCP            FALSE

19 rows selected.

Elapsed: 00:00:00.00
Check the character set of the production database:
sys@ORADB> select * from v$nls_parameters;

PARAMETER                      VALUE
------------------------------ ------------------------------
NLS_LANGUAGE                   AMERICAN
NLS_TERRITORY                  AMERICA
NLS_CURRENCY                   $
NLS_ISO_CURRENCY               AMERICA
NLS_NUMERIC_CHARACTERS         .,
NLS_CALENDAR                   GREGORIAN
NLS_DATE_FORMAT                DD-MON-RR
NLS_DATE_LANGUAGE              AMERICAN
NLS_CHARACTERSET               ZHS16GBK
NLS_SORT                       BINARY
NLS_TIME_FORMAT                HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT           DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT             HH.MI.SSXFF AM TZR
NLS_TIMESTAMP_TZ_FORMAT        DD-MON-RR HH.MI.SSXFF AM TZR
NLS_DUAL_CURRENCY              $
NLS_NCHAR_CHARACTERSET         AL16UTF16
NLS_COMP                       BINARY
NLS_LENGTH_SEMANTICS           BYTE
NLS_NCHAR_CONV_EXCP            FALSE

19 rows selected.

Elapsed: 00:00:00.00
Change the character set of the test database to match the production database:
sys@PROD> shutdown immediate;
Database closed.
Database dismounted.
ORACLE instance shut down.

sys@PROD> startup mount
ORACLE instance started.

Total System Global Area  939495424 bytes
Fixed Size                  2233960 bytes
Variable Size             671091096 bytes
Database Buffers          260046848 bytes
Redo Buffers                6123520 bytes
Database mounted.

sys@PROD> alter system enable restricted session;
System altered.
Elapsed: 00:00:02.19

sys@PROD> alter system set JOB_QUEUE_PROCESSES=0;
System altered.
Elapsed: 00:00:00.02

sys@PROD> alter system set AQ_TM_PROCESSES=0;
System altered.
Elapsed: 00:00:00.01

sys@PROD> alter database open;
Database altered.
Elapsed: 00:00:05.09

sys@PROD> alter database character set ZHS16GBK;
alter database character set ZHS16GBK
*
ERROR at line 1:
ORA-12712: new character set must be a superset of old character set
Elapsed: 00:00:00.00
The error shows that the character sets are incompatible: ZHS16GBK is not a superset of AL32UTF8, so the normal ALTER DATABASE CHARACTER SET statement refuses the change. The undocumented INTERNAL_USE keyword bypasses the superset check (use it with caution, as it can corrupt existing data):
sys@PROD> alter database character set INTERNAL_USE ZHS16GBK;

Database altered.
Elapsed: 00:00:53.35
Restart the database and run the import again:
[oracle@idata ~]$ impdp \"/ as sysdba\" directory=backup dumpfile=cil_20130702.dmp logfile=cil_20130702.log table_exists_action=replace schemas=EPAYMENT,LOGSDB,PREPAID,RISK,SWTSETTLE,SWTONLINE
Appendix: Oracle character encoding
In general, check NLS_CHARACTERSET (the database character set) and NLS_NCHAR_CHARACTERSET (the national character set); a database that must store multiple languages should use a UTF-8 character set.
SELECT * FROM v$nls_parameters WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');
The result is UTF-8, as expected.
In the working database, the column is defined as FieldA VARCHAR2(10 CHAR).
In the problematic database, it is defined as FieldA VARCHAR2(10).
(Before this problem occurred, I had never paid attention to the difference between the two definitions...)
A Chinese character takes 3 bytes in UTF-8, so as the definitions above show, whether the column length is declared in characters (CHAR) or bytes (BYTE) makes a large difference in how much data the column can hold.
The NLS_LENGTH_SEMANTICS parameter controls whether the length of a CHAR or VARCHAR2 column created without an explicit qualifier is measured in bytes (BYTE) or characters (CHAR).
NCHAR, NVARCHAR2, CLOB, and NCLOB columns are always character-based.
NLS_LENGTH_SEMANTICS does not affect tables owned by SYS and SYSTEM; data dictionary definitions always use byte semantics.
You can also specify the semantics explicitly when defining a column:

CHAR(10 BYTE) -- byte semantics, regardless of the NLS_LENGTH_SEMANTICS setting.
CHAR(10 CHAR) -- character semantics, regardless of the NLS_LENGTH_SEMANTICS setting.
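The practical difference between the two semantics can be sketched in Python. This is an illustration only: the `fits` helper is hypothetical, and the gbk codec stands in for ZHS16GBK; it is not an Oracle API.

```python
def fits(value: str, limit: int, semantics: str, encoding: str) -> bool:
    """Model whether `value` fits a VARCHAR2(limit) column.

    semantics: "BYTE" or "CHAR"; encoding models the database character set.
    """
    if semantics == "CHAR":
        # CHAR semantics: the limit counts characters, independent of encoding.
        return len(value) <= limit
    # BYTE semantics: the limit counts encoded bytes, so it depends on encoding.
    return len(value.encode(encoding)) <= limit

s = "中" * 10  # 10 Chinese characters

print(fits(s, 10, "CHAR", "utf-8"))  # 10 characters fit in VARCHAR2(10 CHAR)
print(fits(s, 10, "BYTE", "utf-8"))  # 30 bytes overflow VARCHAR2(10) on UTF-8
print(fits(s, 10, "BYTE", "gbk"))    # 20 bytes overflow VARCHAR2(10) on GBK too
```

This is exactly why the same data fits a VARCHAR2(10 CHAR) column but overflows a VARCHAR2(10) column under byte semantics.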