Background:
The project uses an Oracle database and tests normally in the development environment. After the project was deployed on the customer's server, the system reported an error when inserting data: the message indicated that the value "exceeds the maximum length". However, according to the database design, the inserted data should be within the allowed column length. So what is the cause? Since the Oracle database was supplied by the customer, it was suspected that the Oracle parameter settings in the deployment environment differed from those in the development environment.
Process: Query the relevant Oracle parameters:
select * from nls_database_parameters;
Comparing the query results from the development database and the deployment database shows that the value of the NLS_CHARACTERSET parameter differs between the two libraries. How many bytes do a Chinese character and an English letter occupy in the development and deployment libraries? We can check with SQL:
select lengthb('金') from dual;
select lengthb('A') from dual;
Development Library | Deployment Library
lengthb('金') = 2 | lengthb('金') = 3
lengthb('A') = 1 | lengthb('A') = 1
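These byte counts can be reproduced outside Oracle: ZHS16GBK and AL32UTF8 correspond to the GBK and UTF-8 encodings, so encoding the same characters in Python (an illustrative sketch, not part of the original article) shows the same lengths that lengthb reports.

```python
# Sketch: reproduce Oracle's lengthb() results using Python's codecs.
# ZHS16GBK corresponds to the GBK encoding; AL32UTF8 corresponds to UTF-8.

def byte_length(text: str, encoding: str) -> int:
    """Number of bytes the string occupies in the given encoding,
    analogous to Oracle's lengthb()."""
    return len(text.encode(encoding))

if __name__ == "__main__":
    for ch in ("金", "A"):
        print(ch, "GBK:", byte_length(ch, "gbk"),
              "UTF-8:", byte_length(ch, "utf-8"))
    # 金 occupies 2 bytes in GBK and 3 in UTF-8; 'A' occupies 1 byte in both.
```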
The query shows that a Chinese character occupies 2 bytes in the development library and 3 bytes in the deployment library. Let's take a further look at how the Chinese character "金" (gold) is actually stored in the development and deployment libraries. Statement:
select dump('金', 1016) from dual;
Development Library | Deployment Library
Typ=96 Len=2 CharacterSet=ZHS16GBK: bd,f0 | Typ=96 Len=3 CharacterSet=AL32UTF8: e9,87,91
We can see that the Chinese character "金" is stored in the development library as bd f0, occupying 2 bytes, while in the deployment library it is stored as e9 87 91, occupying 3 bytes.
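The dump() output can likewise be checked from Python (again an illustrative sketch): the hex bytes of "金" under GBK and UTF-8 match what Oracle stores in each library.

```python
# Sketch: show the raw bytes of '金' in the two character sets,
# matching the byte listing from Oracle's dump('金', 1016).

def hex_bytes(text: str, encoding: str) -> str:
    """Comma-separated lowercase hex bytes, similar to dump()'s listing."""
    return ",".join(f"{b:02x}" for b in text.encode(encoding))

if __name__ == "__main__":
    print("ZHS16GBK :", hex_bytes("金", "gbk"))    # bd,f0
    print("AL32UTF8 :", hex_bytes("金", "utf-8"))  # e9,87,91
```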
Conclusion: Based on the above information, we can conclude that when NLS_CHARACTERSET = ZHS16GBK, a Chinese character occupies 2 bytes in Oracle, and when NLS_CHARACTERSET = AL32UTF8, a Chinese character occupies 3 bytes.
Solution: The cause has basically been found, so how can we solve it? This is the theme I want to express today: "risk avoidance". I found that many people have encountered this problem, and most write-ups on the Internet describe modifying the NLS_CHARACTERSET and NLS_LENGTH_SEMANTICS parameters to solve it. However, modifying these parameters may cause serious consequences; the risk factor is relatively high. The other approach avoids modifying parameters, and I chose the latter: expand the length of the VARCHAR2 columns in the database to sidestep the problem. Although this method also has drawbacks (it may leave issues behind when the project is extended), at present it is a good low-risk method. The SQL statement used to generate the expansion script is as follows:
select 'alter table ' || table_name || ' modify ' || column_name || ' varchar2(' || data_length * 2 || ');'
from cols
where data_type = 'VARCHAR2'
and table_name in (select table_name from tabs where status = 'VALID');
The generated ALTER statements can then be executed against the deployment database.
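The same script generation can be sketched in Python against an exported column list. The sample rows and the 4000-byte VARCHAR2 cap used here are illustrative assumptions, not from the original article; in practice the rows would come from querying user_tab_columns (cols).

```python
# Sketch: generate ALTER TABLE statements that double each VARCHAR2
# column's byte length, mirroring the SQL query in the article.
# The sample rows below are hypothetical placeholders.

VARCHAR2_MAX = 4000  # standard VARCHAR2 byte limit (assumes no extended data types)

def widen_statements(columns):
    """columns: iterable of (table_name, column_name, data_length) tuples."""
    stmts = []
    for table, column, length in columns:
        new_len = min(length * 2, VARCHAR2_MAX)  # never exceed the cap
        stmts.append(f"alter table {table} modify {column} varchar2({new_len});")
    return stmts

if __name__ == "__main__":
    sample = [("EMP", "ENAME", 30), ("DEPT", "DNAME", 2500)]
    for stmt in widen_statements(sample):
        print(stmt)
```

Capping at 4000 bytes matters because naively doubling a 2500-byte column would produce an invalid column definition.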
Parameter information:
http://www.itpub.net/thread-838447-1-1.html
http://blog.csdn.net/lanyunit/article/details/5768581