SQL Server BI Step by Step 5: Transactions, Error Output, Event Handling, and Logging


 

SQL Server BI Step by Step SSIS 5 (final part)
--- Transactions, Error Output, Event Handlers, and Logging


 

Like any other program, an SSIS package needs to run robustly and reliably, and to scale. SSIS provides the following support:

1. Transactions: you can enlist a package, or individual tasks and containers, in one or more transactions, and a single transaction can even span two packages. To guarantee data consistency, SSIS can use DTC (distributed) transactions or native SQL Server engine transactions.
2. Checkpoints: record the execution status of tasks so that, when a package fails and is restarted, execution resumes from the task that failed instead of starting over.
3. Error output: even a well-written package will meet bad data, especially in data flow tasks, where rows can fail because of format or type problems. SSIS lets you route such rows to an error output, where they can be fixed, logged, or simply ignored.
4. Precedence constraints: in the control flow, a downstream task can be made to run on the success, failure, or completion of a previous task, and an expression can also be used as the condition.
5. Event handlers: event handlers sit alongside the control flow and data flow in an SSIS package. You can handle different events of the package, a task, or a container; events can also be used to set breakpoints and drive logging. The OnError event, for example, can be used to capture errors.
6. Logging: you can record specified event information at run time to a local text or XML file, to a database, to the Windows Event Log, or to a SQL Server Profiler trace file; you can even write your own log provider.

Requirements:
To integrate with another system, product prices in the AdventureWorks database must be updated in near real time. The package needs to download a compressed file from a specified URL, decompress it, and then parse and import the data file. Because the downloaded file is maintained by hand rather than by a program, human error means some rows may be in the wrong format; such rows must be ignored. In addition, if processing fails the historical files must be kept, an error e-mail must be sent, and a program log must be kept so that the cause of the error can be traced.
Steps:


------------------------------------------------------------Download and decompress the file--------------------------------------

1. Download and decompress the file
1. Create a new ProductPrice folder (for example C:\SSIS_Example\ProductPrice) to store the compressed and data files, and a Bak directory to store backup files.
2. Create an SSIS package and create the following variables:

 

  • How to set variable properties:
    Reference: http://msdn.microsoft.com/zh-cn/library/ms141663.aspx
    Select the variable to set, then set its properties in the Properties window.

3. Create an OLE DB connection to our AdventureWorks database.
4. Create an HTTP connection and enter the location of the remote compressed file as the URL (a variable can of course also be used here).
 

Credentials or a certificate may of course also be needed here, depending on the server.

 

5. Add a Script Task to the control flow to download the file, and add the variable User::zippath to its ReadOnlyVariables.
 
Edit script:

Script

public void Main()
{
    // TODO: Add your code here
    object nativeObject = Dts.Connections[0].AcquireConnection(null);

    HttpClientConnection connection = new HttpClientConnection(nativeObject);

    string filename = Dts.Variables["zippath"].Value.ToString();

    connection.DownloadFile(filename, true);

    Dts.TaskResult = (int)ScriptResults.Success;
}

Run the package and you will find that the RAR file has been downloaded to the local folder.

----------------------------------------------------Extract-------------------------------------------------------

 

2. Decompress the file
1. To decompress the package, first confirm that the WinRAR application is installed locally.
Add an Execute Process Task named "Decompress the file". Set its WindowStyle property to Hidden, then set its expressions and the remaining properties as follows:


When the package is executed, two files appear in the ProductPrice directory on the C drive: the downloaded compressed file and the extracted productlist.txt.
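
For reference, the same decompression could also be done from a Script Task instead of an Execute Process Task. A minimal sketch, assuming WinRAR is installed at its default path and reusing the User::zippath and User::datapath variables from this package (add both to the task's ReadOnlyVariables):

Script

public void Main()
{
    // Assumed default WinRAR location; adjust if it is installed elsewhere.
    string winRar = @"C:\Program Files\WinRAR\WinRAR.exe";

    string zipPath = Dts.Variables["zippath"].Value.ToString();
    string targetDir = System.IO.Path.GetDirectoryName(
        Dts.Variables["datapath"].Value.ToString());

    // e = extract ignoring folder structure, -o+ = overwrite existing files.
    var psi = new System.Diagnostics.ProcessStartInfo(
        winRar, string.Format("e -o+ \"{0}\"", zipPath));
    psi.WorkingDirectory = targetDir;   // WinRAR extracts into the current directory
    psi.WindowStyle = System.Diagnostics.ProcessWindowStyle.Hidden;

    using (var process = System.Diagnostics.Process.Start(psi))
    {
        process.WaitForExit();
        Dts.TaskResult = process.ExitCode == 0
            ? (int)ScriptResults.Success
            : (int)ScriptResults.Failure;
    }
}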

 

 

-----------------------------------Analyze the data in the extracted files and update the database ----------------------------

 

3. Analyze the data in the extracted file and update the database
1. Create a Flat File Connection Manager named productlist, pointing to our c:\ssis_example\productprice\productlist.txt file.

Edit the Flat File Connection Manager.

Switch to the Advanced tab and set the data type of the columns to Unicode string [DT_WSTR].
However, our file path is held in a variable and cannot be hard-coded here. Therefore, in the connection manager's properties, add an expression that sets the ConnectionString value to @[User::datapath].

 

2. Add a Data Flow Task named "Analyze data and update price". Edit the component to enter the data flow.
1) First add a Flat File Source component that uses the newly created productlist connection. The file has only two columns, the product number and the product price, both of type DT_WSTR.
 

Rename the output columns on the Columns tab.

2) Add a Data Conversion component and convert the product price to the decimal type:

  • Configure the error output
    When the package runs, an error can occur inside the Data Conversion component: an empty string, for example, cannot be converted to a number. What then? Click Configure Error Output in the lower-left corner. There we can choose, at row level, how error rows are handled:
    1. Fail component: the whole task fails.
    2. Ignore failure: the error row is ignored; for this task, the converted price of the ignored row becomes NULL.
    3. Redirect row: failed rows are sent to the error output, where they can be handled separately.
    Here we choose Redirect row.

Similarly, we can configure the same kind of error output on the Flat File Source above, to guard against bad data while the file is being read.
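
Conceptually, the Redirect row option does what the following sketch does by hand: try the conversion, keep the good rows, and set the bad ones aside instead of failing everything. The sample rows below are made up purely for illustration:

using System;
using System.Collections.Generic;

class RedirectRowSketch
{
    static void Main()
    {
        // Sample rows as they might come out of the flat file (hypothetical values).
        string[][] rows =
        {
            new[] { "BK-M68B-38", "2384.07" },
            new[] { "BK-M68B-42", "" }          // bad row: empty price
        };

        var goodRows  = new List<string[]>();   // the normal (green) path
        var errorRows = new List<string[]>();   // the error (red) path

        foreach (var row in rows)
        {
            decimal price;
            if (decimal.TryParse(row[1], out price))
                goodRows.Add(row);
            else
                errorRows.Add(row);             // redirected instead of failing the component
        }

        Console.WriteLine("good rows: {0}, redirected rows: {1}", goodRows.Count, errorRows.Count);
    }
}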

 

3) Add an OLE DB Command to update the data. Its SqlCommand is:

Script

UPDATE Production.Product SET ListPrice = ? WHERE ProductNumber = ?

 
On the Column Mappings tab, map the two columns to the two parameters. Because the parameters are positional, pay attention to their order.
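
The mapping behaves the same way positional parameters do in ADO.NET with the OLE DB provider: each ? is bound by position, not by name. A minimal sketch of the equivalent update in C# (the connection string is a placeholder):

using System.Data.OleDb;

class UpdatePriceSketch
{
    static void UpdatePrice(string productNumber, decimal listPrice)
    {
        // Placeholder connection string; point it at your AdventureWorks instance.
        const string connStr =
            "Provider=SQLNCLI10;Data Source=.;Initial Catalog=AdventureWorks;Integrated Security=SSPI;";

        using (var conn = new OleDbConnection(connStr))
        using (var cmd = new OleDbCommand(
            "UPDATE Production.Product SET ListPrice = ? WHERE ProductNumber = ?", conn))
        {
            // Bound strictly by position: first ? = ListPrice, second ? = ProductNumber.
            cmd.Parameters.AddWithValue("@p1", listPrice);
            cmd.Parameters.AddWithValue("@p2", productNumber);

            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}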
 

4) Add a Row Count task. Note: drag the red (error) output of the Data Conversion component to the Row Count task, so that the error rows flow into it. Set the Row Count task's variable to User::errorcount. Run the package again: the error is gone, and three rows go down the Data Conversion component's normal output while two rows go down its error output.

Run:


 

Database before running:


 
Database after update

 

-----------------------------------------Operations on compressed packages, backup, and deletion of data files------------------------------------

4. Operations on the compressed package: backup and deletion of the files

1. Add the file handling. Add three File System Tasks to the control flow; as their names say, they delete the compressed file, back up the compressed file, and delete the data file. Note that variables are used everywhere (for details you can refer to the downloadable source files).

  • "Delete compressed file" task settings

  • Set the control flow
    We want the "Back up compressed file" task to run only when the data flow fails, so a precedence constraint is needed. By default the connector is green (run on success); after wiring the tasks up, right-click the connector and select Failure.


 

  • "Back up compressed file" task settings
    Here the backpath variable is an expression-based (calculated) variable that builds the current date into its value dynamically. For how to set it, refer to "How to set variable properties" at the beginning.


So far, the control flow that deletes and backs up the compressed file looks like this:


  • "Delete data file" task settings
    Whether the run succeeds or fails, we always want to delete the data file at the end, so both File System Tasks point to the "Delete data file" task and the constraints between them should fire on completion. As shown above, right-click each connector and choose Edit to open the Precedence Constraint Editor, then select Logical OR, so that the task runs as soon as either of the two upstream tasks finishes.


 

-------------------------------------------------Set checkpoints---------------------------------------------------

5. Set checkpoints


As the number of products grows, the downloaded compressed package becomes large. When a run fails, we would like the restart to go straight to the failed task instead of downloading and decompressing all over again.
Setting checkpoints accomplishes this.

1) First, set three properties of the package on the Control Flow tab: SaveCheckpoints to True, CheckpointUsage to IfExists, and CheckpointFileName to a text file (create the .txt file in the directory first).

2) Note that for a task to act as a checkpoint, its FailPackageOnFailure property must be True. We set FailPackageOnFailure to True on the "Analyze data and update price" data flow. (This actually conflicts with the precedence constraints above, because now a failure of that task immediately fails the whole package and the backup and delete tasks after it never run. Here we only want to demonstrate the different settings one at a time.)
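
For reference, the same properties can also be set programmatically through the SSIS runtime API (reference Microsoft.SQLServer.ManagedDTS.dll). A minimal sketch; the package path is hypothetical and the task is looked up by the name used in this walk-through:

using Microsoft.SqlServer.Dts.Runtime;

class CheckpointSketch
{
    static void Main()
    {
        var app = new Application();

        // Hypothetical package path; use your own .dtsx file.
        Package pkg = app.LoadPackage(@"C:\SSIS_Example\ProductPrice\Step5.dtsx", null);

        // Package-level checkpoint settings, as described above.
        pkg.SaveCheckpoints = true;
        pkg.CheckpointUsage = DTSCheckpointUsage.IfExists;
        pkg.CheckpointFileName = @"C:\SSIS_Example\ProductPrice\checkpoint.txt";

        // The data flow must fail the package for the checkpoint to restart from it.
        var dataFlow = (DtsContainer)pkg.Executables["Analyze data and update price"];
        dataFlow.FailPackageOnFailure = true;

        app.SaveToXml(@"C:\SSIS_Example\ProductPrice\Step5.dtsx", pkg, null);
    }
}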

 

Run it: first set the Data Conversion component back to Fail component so that the whole data flow fails. Run the package, and you will see that the checkpoint file you just chose now contains quite a lot of data. Then change the Data Conversion component back to Redirect row and run the package again: the tasks that download and decompress the product price file are not run, and execution starts directly from "Analyze data and update price".
 

In practice, checkpoints are well worth using: when a package is re-run, we do not have to repeat the tasks that did not fail, which saves a lot of resources and time.

 

--------------------------------------------------Transactions---------------------------------------------------------

6. Set transactions

What happens if the file handling below the data flow fails? Will our price update still be committed?


To find out, we change the path used by the "Delete compressed file" task to User::filepath so that it fails. Comparing the results, we find that even though the delete task failed, the "Analyze data and update price" task above it had already committed its changes and the product prices were updated successfully. This is not what we want: the files under ProductPrice should stay consistent with our database update, so when the file handling below fails, the database update above should not be committed either.
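
Conceptually this is the usual all-or-nothing requirement. Outside SSIS it would be expressed with something like System.Transactions.TransactionScope; the sketch below only illustrates the semantics we want the package to give us (the connection string and file path are placeholders, and System.Transactions.dll must be referenced):

using System.Data.SqlClient;
using System.IO;
using System.Transactions;

class AllOrNothingSketch
{
    static void Main()
    {
        using (var scope = new TransactionScope())
        {
            using (var conn = new SqlConnection(
                "Data Source=.;Initial Catalog=AdventureWorks;Integrated Security=SSPI;"))
            {
                conn.Open();   // enlists in the ambient transaction
                using (var cmd = new SqlCommand(
                    "UPDATE Production.Product SET ListPrice = @price WHERE ProductNumber = @number", conn))
                {
                    cmd.Parameters.AddWithValue("@price", 99.99m);          // sample values
                    cmd.Parameters.AddWithValue("@number", "BK-M68B-38");
                    cmd.ExecuteNonQuery();
                }
            }

            // If the file handling throws here, Complete() is never reached and the
            // price update above is rolled back (the file operation itself is not
            // transactional, but its failure aborts the database change).
            File.Delete(@"C:\SSIS_Example\ProductPrice\productlist.zip");

            scope.Complete();
        }
    }
}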

This is what transactions are for. You need to set the transaction property, TransactionOption, on the package and on the tasks in the control flow. Its possible values are:

Supported: if a transaction already exists in the parent container, the task joins it; it never starts one itself.
NotSupported: the task does not join a transaction even if one exists.
Required: a transaction is required; if one already exists the task joins it, otherwise a new transaction is started.
We can see that the TransactionOption of our current tasks and of the package is Supported, so no transaction has been started. Change the package's property to Required and an error appears: this combination of settings is not allowed. The prompt says:
The current package settings are not supported. Change the SaveCheckpoints property or the TransactionOption property.

 

In fact, checkpoints and transactions conflict with each other. A transactional container is the smallest unit a restart can resume from: either none of the transaction executed or all of it did, which is exactly what a transaction guarantees.
To reduce the conflict between the two, SSIS does not save checkpoint information for a container (a package is also a container) while that container is inside a transaction. Checkpoints have a similar limitation with the Foreach Loop container, because the information inside the container cannot be saved. (There is a detailed description here: http://technet.microsoft.com/en-us/library/ms140226.aspx)
Again, to demonstrate the effect, set SaveCheckpoints to False and TransactionOption to Required, then run the package and compare with the earlier run. When the "Delete compressed file" task fails, the price update is rolled back and the prices are not changed: a distributed transaction has now been started for the package.
As newcomers we may still wonder whether checkpoints and transactions can be used in the same package at the same time. I am not going to verify that right away; it is left for later, or you can look for a solution yourself.

 

---------------------------------------------------------Set event handlers-------------------------------------

7. Set the event handler

Is the whole package now immune to errors?
Of course not. If the remote server is unreachable or the connection times out, if local disk space runs out, or if an error occurs while updating the database, the package can still fail. For such errors we want to be notified proactively and told the specific error message.
Switch to the Event Handlers tab. The two drop-down lists let you choose the executable and the event handler. Here we respond only to the package's OnError event; click the link below to create the handler. Inside an event handler we can use the same task types as in the control flow.

 

To send an e-mail we use the same approach as last time: sending it from a script, which needs only a single Script Task. For details, refer to the earlier post in this series.


 

I am not very familiar with that script, so the e-mail is not actually sent here.
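
For anyone who wants to try it, here is a minimal sketch of what such a Script Task inside the OnError handler might look like, using System.Net.Mail. The SMTP host and the addresses are placeholders, and System::SourceName and System::ErrorDescription must be added to the task's ReadOnlyVariables:

Script

public void Main()
{
    // System variables available inside an OnError event handler.
    string source = Dts.Variables["System::SourceName"].Value.ToString();
    string error  = Dts.Variables["System::ErrorDescription"].Value.ToString();

    var message = new System.Net.Mail.MailMessage(
        "ssis@example.com",                  // placeholder sender
        "dba@example.com",                   // placeholder recipient
        "SSIS package error in " + source,   // subject
        error);                              // body: the specific error text

    // Placeholder SMTP server; replace with your own.
    var client = new System.Net.Mail.SmtpClient("smtp.example.com");
    client.Send(message);

    Dts.TaskResult = (int)ScriptResults.Success;
}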

 

-------------------------------------------Set Logging------------------------------------------------------

8. Set Logging

Finally, let's set up logging. Open the SSIS menu at the top and choose Logging. Here you can select the containers to log and the log provider type; switching to the Details tab, you can select the events to record:

Create a new log.txt text file in the directory and point the log provider's file connection at it during configuration:


 

Detailed error information can then be found in the log file. We could also extend the provider types here, for example with one that sends e-mail, achieving the same effect as the event handler but in a more reusable way; we will not go further into that here.
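
If you pick the SQL Server log provider instead of (or alongside) the text file, the same events land in the dbo.sysssislog table of the database the log connection points to, where they are easy to query afterwards. A small sketch (the connection string is a placeholder):

using System;
using System.Data.SqlClient;

class ReadSsisLogSketch
{
    static void Main()
    {
        // Placeholder connection string; point it at the database the log provider writes to.
        using (var conn = new SqlConnection(
            "Data Source=.;Initial Catalog=AdventureWorks;Integrated Security=SSPI;"))
        using (var cmd = new SqlCommand(
            "SELECT source, starttime, message FROM dbo.sysssislog " +
            "WHERE event = 'OnError' ORDER BY starttime DESC", conn))
        {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}  {1}  {2}",
                        reader.GetDateTime(1), reader.GetString(0), reader.GetString(2));
                }
            }
        }
    }
}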


------------------------------------------------Summary-----------------------------------------------------------

9. Summary
We have demonstrated a series of operations: downloading remote data, decompressing it, analyzing it, and updating the database, and along the way we have practised transactions, checkpoints, event handlers, logging, and error outputs. I am a beginner too and am not yet familiar with everything used here, but studying SSIS has shown me how strong its data import, export, and analysis capabilities are. Used well together, SQL and SSIS should be able to link data effectively and mine it to meet business needs.
P.S.: The AdventureWorks database is well worth studying for newcomers. Why? Working through it, you will find that its naming, foreign keys, primary keys, table classification, and data structures are all excellently designed.

http://technet.microsoft.com/zh-cn/library/ms140185.aspx

 

 

 

Source code for step 5: the project targets SQL Server 2008, and WinRAR must be installed before running it.

/Files/cocole/step5sql08.zip

 

Author: Wukong's Sky (tianma xingkong)
Source: http://www.cnblogs.com/cocole/
The copyright of this article is shared by the author and the blog site. You are welcome to repost it, but you must retain this statement and give a clear link to the original article on the page; otherwise the author reserves the right to pursue legal liability.

 

 
