How to Implement a 100% Dynamic Data Pipeline (III)

Source: Internet
Author: User
Here's how to build the pipeline syntax from the metadata in the middle tier:


1. First, create a DataWindow object: d_vdtcolumns


The SQL syntax is:


SELECT vdt_columns.utid,
       vdt_columns.uid,
       vdt_columns.upkey,
       vdt_columns.udmid,
       vdt_columns.udmname,
       vdt_columns.unulls,
       vdt_columns.uwidth,
       vdt_columns.uscale,
       vdt_columns.uname,
       vdt_columns.udefault,
       vdt_columns.ucheck,
       vdt_columns.utname,
       vdt_columns.uidentity
  FROM vdt_columns
 WHERE utname = :as_tname


2. With that groundwork done, the main work begins: building the data pipeline.


Since one pipeline object may carry out several transfer tasks, create an object, nvo_pipetransattrib, to hold the settings each transfer requires. It contains these instance variables:


String is_objectname                              // table name
String is_syntax                                  // pipeline syntax
String is_sconnect = 'ZW', is_dconnect = 'daixf'  // source and destination database connections
String is_ptype, is_pcommit, is_errors            // pipeline type/commit/errors properties
String is_sname, is_dname                         // source and destination table names
String is_sqlsyntax                               // pipeline SQL (SELECT) syntax
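To make the role of each attribute concrete, here is a minimal Python sketch of the same transfer-attribute record. The field names mirror the instance variables above; the type/commit/errors defaults are illustrative assumptions, not values from the article:

```python
from dataclasses import dataclass

@dataclass
class PipeTransAttrib:
    """Mirror of nvo_pipetransattrib: one record per table to transfer."""
    is_objectname: str = ""      # table name
    is_syntax: str = ""          # generated pipeline syntax
    is_sconnect: str = "ZW"      # source database connection
    is_dconnect: str = "daixf"   # destination database connection
    is_ptype: str = "replace"    # pipeline type (assumed default)
    is_pcommit: str = "100"      # commit interval (assumed default)
    is_errors: str = "100"       # max errors allowed (assumed default)
    is_sname: str = ""           # source table name
    is_dname: str = ""           # destination table name
    is_sqlsyntax: str = ""       # SELECT statement for the pipeline

attrib = PipeTransAttrib(is_sname="bar", is_dname="bar_copy",
                         is_sqlsyntax="SELECT * FROM bar")
```

One such record per task lets a single pipeline object run any number of transfers back to back.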



Next, create an object that inherits from the Pipeline object, and write a function, of_constrpipesyntax(), that constructs the syntax:
nvo_pipetransattrib inv_attrib[]

string ls_syntax, ls_sourcesyntax, ls_destsyntax
int li, lj, li_ind, li_find, li_rows, li_identity
string ls_tablename, ls_default, ls_defaultvalue, ls_pbdttype
boolean lb_find
dec ld_uwidth, ld_prec, ld_uscale
string ls_types, ls_dbtype, ls_prikey, ls_name, ls_nulls, ls_msg, ls_title = 'of_constrpipesyntax()'
nvo_string lnv_string
nvo_datastore lds_vdtcolumns
boolean lb_key

lds_vdtcolumns = create nvo_datastore
lds_vdtcolumns.dataobject = 'd_vdtcolumns'
lds_vdtcolumns.settransobject(SRCSQLCA)
li = 1

of_input(inv_attrib[li])
li_find = pos(inv_attrib[li].is_sqlsyntax, '*', 1)
if li_find > 0 then
    lds_vdtcolumns.retrieve(as_tablename)
    of_filterimg(lds_vdtcolumns)
    li_rows = lds_vdtcolumns.rowcount()
    for lj = 1 to li_rows
        ls_name = lds_vdtcolumns.getitemstring(lj, 'uname')
        ls_types = lds_vdtcolumns.getitemstring(lj, 'udmname')
        li_identity = lds_vdtcolumns.getitemnumber(lj, 'uidentity')
        ls_types = of_getpipedbtype(is_s_dbtype, ls_types)
        ls_pbdttype = of_getpbdttype(is_s_dbtype, ls_types)
        choose case ls_types
            case 'char', 'varchar', 'nchar', 'nvarchar', 'long varchar'
                if ls_types = 'long varchar' then ls_types = 'varchar'
                ld_uwidth = lds_vdtcolumns.getitemnumber(lj, 'uwidth')
                ls_dbtype = ls_types + '(' + string(int(ld_uwidth)) + ')'
            case 'decimal', 'numeric'
                ld_uwidth = lds_vdtcolumns.getitemnumber(lj, 'uwidth')
                ld_uscale = lds_vdtcolumns.getitemnumber(lj, 'uscale')
                if li_identity = 1 then
                    ls_dbtype = 'identity(' + string(int(ld_uwidth)) + ',' + string(int(ld_uscale)) + ')'
                else
                    ls_dbtype = ls_types + '(' + string(int(ld_uwidth)) + ',' + string(int(ld_uscale)) + ')'
                end if
            case else
                ls_dbtype = ls_types
        end choose

        ls_prikey = lds_vdtcolumns.getitemstring(lj, 'upkey')
        if ls_prikey = 'Y' then
            lb_key = true
            ls_prikey = 'key=yes, '
        else
            ls_prikey = ''
        end if

        ls_nulls = lds_vdtcolumns.getitemstring(lj, 'unulls')
        if ls_nulls = 'Y' then
            ls_nulls = 'yes'
        else
            ls_nulls = 'no'
        end if

        ls_default = lds_vdtcolumns.getitemstring(lj, 'udefault')
        if isnull(ls_default) then ls_default = ''

        ls_sourcesyntax += 'COLUMN(type=' + ls_pbdttype + ', name="' + ls_name + '", dbtype="' + ls_dbtype + '", ' + ls_prikey + 'nulls_allowed=' + ls_nulls + ')~r~n'
        if ls_default = '' then
            if li_identity = 1 then
                ls_destsyntax += 'COLUMN(type=' + ls_pbdttype + ', name="' + ls_name + '", dbtype="' + ls_dbtype + '", ' + ls_prikey + 'nulls_allowed=' + ls_nulls + ', initial_value="exclude")~r~n'
            else
                ls_destsyntax += 'COLUMN(type=' + ls_pbdttype + ', name="' + ls_name + '", dbtype="' + ls_dbtype + '", ' + ls_prikey + 'nulls_allowed=' + ls_nulls + ')~r~n'
            end if
        else
            if li_identity = 1 then
                ls_destsyntax += 'COLUMN(type=' + ls_pbdttype + ', name="' + ls_name + '", dbtype="' + ls_dbtype + '", ' + ls_prikey + 'nulls_allowed=' + ls_nulls + ', default_value="' + ls_default + '", initial_value="exclude")~r~n'
            else
                ls_destsyntax += 'COLUMN(type=' + ls_pbdttype + ', name="' + ls_name + '", dbtype="' + ls_dbtype + '", ' + ls_prikey + 'nulls_allowed=' + ls_nulls + ', default_value="' + ls_default + '")~r~n'
            end if
        end if
    next
else
    return ''
end if
ls_sourcesyntax += ')'
ls_destsyntax += ')'

Generate the PIPELINE clause. For example:
PIPELINE(source_connect=csfdata, destination_connect=csfdata, type=replace, commit=100, errors=100, keyname="bar_x")

if lb_key then
    ls_syntax += 'PIPELINE(source_connect=' + inv_attrib[li].is_sconnect + ', destination_connect=' + inv_attrib[li].is_dconnect + ', type=' + inv_attrib[li].is_ptype + ', commit=' + inv_attrib[li].is_pcommit + ', errors=' + inv_attrib[li].is_errors + ', keyname="' + as_tablename + '_x")~r~n'
else
    ls_syntax += 'PIPELINE(source_connect=' + inv_attrib[li].is_sconnect + ', destination_connect=' + inv_attrib[li].is_dconnect + ', type=' + inv_attrib[li].is_ptype + ', commit=' + inv_attrib[li].is_pcommit + ', errors=' + inv_attrib[li].is_errors + ')~r~n'
end if

Generate the SOURCE clause. For example:
SOURCE(name="bar", COLUMN(type=char, name="customcode", dbtype="char(8)", key=yes, nulls_allowed=no) ... )

ls_syntax += 'SOURCE(name="' + inv_attrib[li].is_sname + '", ~r~n'
ls_syntax += ls_sourcesyntax


Generate the RETRIEVE clause. For example:
RETRIEVE(statement="SELECT bar.customcode, bar.barcode, bar.itemcode, bar.metering, bar.packsize, bar.length, bar.width, bar.high, bar.vol, bar.weight, bar.newpackflag FROM bar")

ls_syntax += 'RETRIEVE(statement="' + inv_attrib[li].is_sqlsyntax + '")~r~n'


Generate the DESTINATION clause. For example:
DESTINATION(name="bar_copy", COLUMN(type=char, name="customcode", dbtype="char(8)", key=yes, nulls_allowed=no, initial_value="spaces") ... )

ls_syntax += 'DESTINATION(name="' + inv_attrib[li].is_dname + '", ~r~n'
ls_syntax += ls_destsyntax

return ls_syntax

The return value of this function is the constructed pipeline syntax.
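To see what the column loop produces, the per-column assembly can be sketched in Python. This is an illustration of the logic, not the article's code: the dictionary keys mirror the d_vdtcolumns columns, and `pbtype` stands in for the of_getpbdttype result. PowerBuilder's `~"` escaping is not needed here, so plain quotes are used:

```python
def build_source_column(col):
    """Assemble one COLUMN(...) entry of the pipeline SOURCE syntax.

    `col` carries the fields the d_vdtcolumns DataWindow returns:
    uname, udmname (type), uwidth, uscale, upkey, unulls.
    """
    dbtype = col["udmname"]
    if dbtype in ("char", "varchar", "nchar", "nvarchar"):
        dbtype = f'{dbtype}({col["uwidth"]})'            # width for character types
    elif dbtype in ("decimal", "numeric"):
        dbtype = f'{dbtype}({col["uwidth"]},{col["uscale"]})'  # precision and scale
    key = "key=yes, " if col["upkey"] == "Y" else ""
    nulls = "yes" if col["unulls"] == "Y" else "no"
    return (f'COLUMN(type={col["pbtype"]}, name="{col["uname"]}", '
            f'dbtype="{dbtype}", {key}nulls_allowed={nulls})')

line = build_source_column({"uname": "customcode", "udmname": "char",
                            "uwidth": 8, "uscale": 0, "upkey": "Y",
                            "unulls": "N", "pbtype": "char"})
# → COLUMN(type=char, name="customcode", dbtype="char(8)", key=yes, nulls_allowed=no)
```

The destination side follows the same pattern, adding default_value and, for identity columns, initial_value="exclude".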



Of_input(inv_attrib[li]):


This initialization function fills inv_attrib with the conditions the user supplies, such as the pipeline type, commit interval, error limit, and SELECT statement.

A few helper functions that handle special cases deserve explanation.


Of_filterimg(lds_vdtcolumns):


Filters out image columns, because the pipeline does not support transferring image data.
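The article does not list of_filterimg itself; in Python terms, with a hypothetical list-of-dicts column set, the idea reduces to:

```python
# Types the pipeline cannot transfer; 'image' is the one the article names.
UNSUPPORTED_TYPES = {"image"}

def filter_image_columns(columns):
    """Return the column list with image-typed columns removed."""
    return [c for c in columns if c["udmname"] not in UNSUPPORTED_TYPES]

cols = [{"uname": "photo", "udmname": "image"},
        {"uname": "code",  "udmname": "char"}]
filtered = filter_image_columns(cols)   # only the 'code' column remains
```

In the DataStore itself this would typically be a SetFilter()/Filter() pair on the udmname column.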


Of_getpipedbtype(is_s_dbtype, ls_types):


Maps a table column's type to the dbtype used in the pipeline syntax, because the two do not always correspond. This can be implemented with an external DataWindow object that holds the mapping as its initial data.


Of_getpbdttype(is_s_dbtype, ls_types):
Maps a table column's type to the PowerBuilder data type used in the pipeline syntax, for the same reason, and can be implemented the same way.
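Both helpers are lookups from the table's native type to another type name. A minimal Python sketch, assuming a hand-filled mapping (the concrete pairs below are common examples, not the article's data):

```python
# Table column type -> pipeline dbtype, and pipeline dbtype -> PowerBuilder type.
# In the article these pairs live as the initial data of an external
# DataWindow object and are looked up at run time.
PIPE_DBTYPE = {
    "int": "int",
    "datetime": "datetime",
    "text": "varchar",      # e.g. transfer text as varchar
    "numeric": "numeric",
}
PB_DTTYPE = {
    "int": "long",
    "datetime": "datetime",
    "varchar": "char",
    "numeric": "decimal",
}

def get_pipe_dbtype(col_type):
    """Fall back to the native type when no mapping entry exists."""
    return PIPE_DBTYPE.get(col_type, col_type)

def get_pb_dttype(col_type):
    """Default to char when no mapping entry exists (assumed fallback)."""
    return PB_DTTYPE.get(col_type, "char")
```

Storing the pairs as data rather than code is what keeps the pipeline 100% dynamic: supporting a new source DBMS only means adding rows to the mapping.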



Once the pipeline syntax has been built, you can perform the pipeline transfer:



this.syntax = ...  // assign the constructed pipeline syntax

li_rc = this.start(SRCSQLCA, DESTSQLCA, idw_errors)

if li_rc <> 1 then
    if not ib_silence then msg(ls_title, 'Object transfer failed: ' + string(li_rc))
    of_addtransmsg('Object <' + is_currentobj + '> transfer failed: ' + string(li_rc))
    rollback;
    return li_rc
else
    commit;
end if


