The rest of the afternoon went to a single task: importing street data from across the country into a street table. But the SQL script I was handed had been downloaded from the web, and its schema was completely different from our database design.
The idea was to first build the table locally from the script I was given, then SELECT just the columns we need and bulk-insert them into the table on the development database server.
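The original script never shows the target table itself, so here is a minimal sketch of what dbo.test_street might look like. The column names come from the INSERT statement below; every data type is an assumption on my part:

```sql
-- Hypothetical definition of the target table. Column names match the
-- article's INSERT list; all data types here are guesses, not the real schema.
CREATE TABLE dbo.test_street (
    code        UNIQUEIDENTIFIER NOT NULL,  -- filled by NEWID()
    countycode  INT              NULL,      -- taken from the source table's id
    cnname      NVARCHAR(100)    NULL,      -- street name
    versionno   INT              NULL,
    creator     NVARCHAR(50)     NULL,
    createtime  DATETIME         NULL,      -- filled by GETDATE()
    validstatus INT              NULL
);
```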
So I ran a quick test of this approach:
INSERT INTO dbo.test_street (code, countycode, cnname, versionno, creator, createtime, validstatus) /* target table */
SELECT NEWID()   AS code,
       id        AS countycode,
       name      AS cnname,
       1         AS versionno,
       'LHC'     AS creator,
       GETDATE() AS createtime,
       1         AS validstatus
FROM pre_common_district /* source table */
With that, the scrappy script successfully imported the 40,000-odd rows locally; the next step was to get the table from the local database into the development database.
Here we use the bcp command to export the data, then load it into the development database with BULK INSERT:
Use the following command:
bcp "Kfglxt.dbo.test_Street" out f:\test.txt -c -q -S "zb-liuhch-pc\sqlexpress" -T
bcp "Kfglxt.dbo.test_Street" out f:\test.dat -c -q -S "zb-liuhch-pc\sqlexpress" -T (a .dat file works just as well; other formats are supported too)
(Here -c exports in character format, -q uses quoted identifiers, -S names the server instance, and -T uses a trusted Windows-authenticated connection.)
Even with 40,000+ rows, the export takes no time at all; pretty fast.
Besides running it from CMD, you can also execute it directly from Query Analyzer:
EXEC master.dbo.xp_cmdshell '/* fill in the bcp command here */'
(Note that xp_cmdshell is a system extended stored procedure that lives in the master database.)
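One pitfall worth mentioning: on later SQL Server versions xp_cmdshell is disabled by default, so the call above may fail with an error saying the component is turned off. Assuming you have sysadmin rights, the standard sp_configure sequence turns it on (this step is not in the original walkthrough):

```sql
-- Enable xp_cmdshell (requires sysadmin; it is disabled by default for security)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;
```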
Next, we import the file we just exported into the development database:
USE KFGLXT
BULK INSERT dbo.test_street FROM 'F:\test.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')
And that's it. Still felt like there were quite a few pitfalls along the way...
SQL Server big data import/export: importing data from one table into another table