A project in our company uses SQL Server 2000 as a staging server: data is pushed from the source database server (SQL Server 2000) to the target database server (Oracle 10g) through a linked server, with the logic implemented in stored procedures. After the stored procedures had run as scheduled jobs for years, one job suddenly stopped executing normally while the other kept running fine. Inexplicable....
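For reference, the setup on the SQL Server 2000 side would look roughly like the sketch below; the linked server name ORA10G, the TNS alias ORATNS, and the credentials are all made-up placeholders, not values from the actual project.

-- Register an Oracle linked server on SQL Server 2000 using the MSDAORA
-- provider (all names and credentials below are hypothetical).
EXEC sp_addlinkedserver
    @server     = 'ORA10G',
    @srvproduct = 'Oracle',
    @provider   = 'MSDAORA',
    @datasrc    = 'ORATNS'   -- TNS alias of the Oracle 10g instance

EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = 'ORA10G',
    @useself     = 'false',
    @rmtuser     = 'SCOTT',
    @rmtpassword = 'tiger'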
Error description: execution sometimes succeeds and sometimes fails; typically the error appears after several runs.
The most common error messages are:
Executed as user: sa. OLE DB provider 'SQLOLEDB' reported an error. The provider ran out of memory. [SQLSTATE 42000] (Error 7399) OLE DB error trace [OLE/DB Provider 'SQLOLEDB' IRowset::GetNextRows returned 0x8007000e: The provider ran out of memory.]. [SQLSTATE 01000] (Error 7300). The step failed.
Executed as user: sa. OLE DB provider 'MSDAORA' reported an error. [SQLSTATE 42000] (Error 7399) [SQLSTATE 01000] (Error 7312) OLE DB error trace [OLE/DB Provider 'MSDAORA' IDBInitialize::Initialize returned 0x80004005: ]. [SQLSTATE 01000] (Error 7300). The step failed.
I searched the Internet for a long time, but found mostly unanswered questions and only the occasional reply; perhaps few people use this approach, and no ready-made solution turned up.
Since no relevant documentation could be found, I tried one method after another over a long period, and they all failed in turn.
These methods included modifying the registry to use a different DLL, rewriting the query statements to optimize data access, creating views, and adding primary keys to the tables.
Finally, by watching the stored procedures run and comparing how the two jobs were implemented, I found the difference:
the job that runs normally has to perform updates while inserting data, so it writes into a temporary table combined with a trigger.
The other job inserts directly into the target table over the linked server, because its logic is simpler; a sketch of that pattern follows.
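A minimal sketch of that direct pattern, reusing the hypothetical names from above (ORA10G, SCOTT.TARGET_TAB) plus an assumed local source table dbo.SRC_TAB:

-- SQL Server 2000 side: set-based insert straight into the huge Oracle
-- target table through the linked server (the pattern that kept failing).
INSERT INTO ORA10G..SCOTT.TARGET_TAB (ID, VAL)
SELECT ID, VAL
FROM dbo.SRC_TAB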
Yet it was precisely this simplicity that produced the inexplicable error.
The huge volume of data in the target table is the culprit. Whether the set-based insert statement is also an accomplice I did not test; however, even when the insert contributes no rows, a set-based insert through the linked server still makes SQL Server load a massive amount of remote data.
Solution: create a temporary table on the Oracle side, use a trigger to move the data into the real target table, and then clear the temporary table. SQL Server then no longer loads the massive data that was eating up its memory.
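A minimal sketch of this arrangement, again with hypothetical names (SCOTT.STG_TAB is the assumed staging table). On the Oracle 10g side, a small staging table plus a row-level trigger:

-- Oracle side: small staging table; the trigger copies each inserted row
-- into the real (huge) target table.
CREATE TABLE SCOTT.STG_TAB (
    ID  NUMBER,
    VAL VARCHAR2(100)
);

CREATE OR REPLACE TRIGGER SCOTT.TRG_STG_TAB
AFTER INSERT ON SCOTT.STG_TAB
FOR EACH ROW
BEGIN
    INSERT INTO SCOTT.TARGET_TAB (ID, VAL)
    VALUES (:NEW.ID, :NEW.VAL);
END;
/

On the SQL Server 2000 side, the job now touches only the staging table, so the provider never has to open a rowset on the huge target table:

-- Insert into the small staging table instead of the big target,
-- then clear the staging table for the next run (the trigger has
-- already copied the rows onward).
INSERT INTO ORA10G..SCOTT.STG_TAB (ID, VAL)
SELECT ID, VAL
FROM dbo.SRC_TAB

DELETE FROM ORA10G..SCOTT.STG_TAB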