The setup tested over the past two days is about to go live as the production environment. The environment and requirements are as follows:
A. Data source: SSP library, table ssp.m_system_user; Oracle DB 12.1.0.2.0; OGG version 12.2.0.1.1 OGGCORE_12.2.0.1.0_PLATFORMS_151211.1401_FBO
B. Data target: MySQL DLS library, table dls_system_user
C. Kafka cluster: 10.1.1.247; OGG version 12.3.0.1.0 OGGCORE_OGGADP.12.3.0.1.0GA_PLATFORMS_170828.1608
Since Oracle 12c uses the multitenant architecture, there are a few things to keep in mind when using OGG for synchronization:
A CDB contains multiple PDBs;
Extract can only run in integrated capture mode; classic capture is not supported;
Because integrated extract needs the log mining server, which can only be accessed from cdb$root, the source side must connect as a common user (such as c##ogg) that can read the redo log and reach all PDBs of the database;
In GGSCI or in a parameter file, pdb.schema.table addresses a specific table or sequence;
Alternatively, the SOURCECATALOG parameter in a parameter file selects a PDB, after which the following parameters only need schema.table;
On an Oracle target, each PDB gets its own replicat process; that is, a replicat can deliver into a single PDB only, never into several;
The source-side OGG user must be granted privileges: dbms_goldengate_auth.grant_admin_privilege('C##GGADMIN', container=>'all'); it is also recommended to grant the OGG user DBA rights: grant dba to c##ogg container=all;
On the source DB, in addition to enabling archive log mode, force logging and minimal supplemental logging as before, one more switch may need to be turned on: alter system set enable_goldengate_replication=true; (the SQL sketch below pulls these prerequisites together)
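A minimal SQL sketch of that source-side preparation, assuming the common user c##ggadmin with password ggadmin as used later in this article (run as SYSDBA in cdb$root; archive log mode is assumed to be enabled already):

-- run as SYSDBA in cdb$root
alter database force logging;
alter database add supplemental log data;
alter system set enable_goldengate_replication=true;
-- create the common OGG user first if it does not exist yet (name/password per this environment)
create user c##ggadmin identified by ggadmin container=all;
grant dba to c##ggadmin container=all;
exec dbms_goldengate_auth.grant_admin_privilege('C##GGADMIN', container=>'all')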
The specific implementation steps follow, first on the source side.
1. Add supplemental logging for the tables to be synchronized
dblogin userid [email protected], password ogg_prod
add trandata ssp.m_system_user
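To confirm that supplemental logging really is enabled for the table, a quick check can be run in the same GGSCI session; this verification step is not in the original, but info trandata is the standard command for it:

info trandata ssp.m_system_user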
2. Add the extract process
add extract ext_kaf4, integrated tranlog, begin now    -- 12c difference: integrated capture
add exttrail ./dirdat/k4, extract ext_kaf4, megabytes 200
GGSCI (salesdb as [email protected]/salespdb) 17> dblogin useridalias ggroot
Successfully logged into database CDB$ROOT.
-- the extract process must be registered in the CDB
register extract ext_kaf4 database container (salespdb)
edit params ext_kaf4

extract ext_kaf4
userid c##ggadmin, password ggadmin
logallsupcols
updaterecordformat compact
exttrail ./dirdat/k4, format release 12.1
sourcecatalog salespdb
table ssp.m_system_user;
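Before moving on, the registration and configuration can be sanity-checked with the usual GGSCI status commands (a suggested check, not part of the original steps):

info extract ext_kaf4, detail
info all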
3. Add the delivery (data pump) process:
add extract pmp_kaf4, exttrailsource ./dirdat/k4
add rmttrail ./dirdat/b4, extract pmp_kaf4, megabytes 200
edit params pmp_kaf4
extract pmp_kaf4
userid c##ggadmin, password ggadmin
passthru
rmthost 10.1.1.247, mgrport 9178
rmttrail ./dirdat/b4, format release 12.1
sourcecatalog salespdb
table ssp.m_system_user;
4. Add the initial-load process
add extract ek_04, sourceistable    -- added on the source side
extract ek_04
userid c##ggadmin, password ggadmin
rmthost 10.1.1.247, mgrport 9178
rmtfile ./dirdat/b5, maxfiles 999, megabytes 500, format release 12.1
sourcecatalog salespdb
table ssp.m_system_user;
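The initial-load extract reads the table directly and writes the remote file ./dirdat/b5 on the Kafka side. Once the target-side initialization replicat (added below) is in place, it can be started and watched with the standard GGSCI commands, roughly like this:

start extract ek_04
view report ek_04    -- check progress and row counts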
5. Generate the DEF file:
edit params salesdb4

userid c##ggadmin, password ggadmin
defsfile /home/oracle/ogg/ggs12/dirdef/salesdb4.def, format release 12.1
sourcecatalog salespdb
table ssp.m_system_user;

Under $OGG_HOME, execute the following command to generate the def file:
defgen paramfile ./dirprm/salesdb4.prm
Upload the generated def file to the target Kafka side, into $OGG_HOME/dirdef.
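For example with scp, assuming the target OGG home is /home/appgroup/ogg/ggs12 (the dirprm path used later in this article) and an oracle OS user on the Kafka host:

scp /home/oracle/ogg/ggs12/dirdef/salesdb4.def oracle@10.1.1.247:/home/appgroup/ogg/ggs12/dirdef/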
Target side configuration.
MySQL database address: 10.1.11.24
Kafka brokers: 10.1.1.246:0000, 10.1.1.247:0000; topic: dls_merchant
1. Add the initialization process (parameter files live under dirprm):
add replicat rp_06, specialrun
edit params rp_06

specialrun
end runtime
setenv (NLS_LANG="AMERICAN_AMERICA.ZHS16GBK")
targetdb libfile libggjava.so set property=./dirprm/kafka_k05.props
sourcedefs ./dirdef/salesdb4.def
extfile ./dirdat/b5
reportcount every 1 minutes, rate
grouptransops 10000
map salespdb.ssp.m_system_user, target dls.dls_system_user;
2. Add the replication process:
add replicat rep_04, exttrail ./dirdat/b4
edit params rep_04

replicat rep_04
setenv (NLS_LANG="AMERICAN_AMERICA.ZHS16GBK")
handlecollisions    -- only needed while the initial load overlaps; see the note below
targetdb libfile libggjava.so set property=./dirprm/kafka_k05.props
sourcedefs ./dirdef/salesdb4.def
reportcount every 1 minutes, rate
grouptransops 10000
map salespdb.ssp.m_system_user, target dls.dls_system_user;
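HANDLECOLLISIONS exists only to absorb the duplicates that occur while the initial load and the ongoing replication overlap. Once rep_04 has caught up past the initial-load point, it can be switched off at runtime with the standard GGSCI command below (and then removed from the parameter file so it stays off after a restart); this is a suggested follow-up, not part of the original steps:

send replicat rep_04, nohandlecollisions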
3. Parameter configuration:
cd /home/appgroup/ogg/ggs12/dirprm
Two files are involved here: custom_kafka_producer.properties, which holds the native Kafka producer settings (a sample sketch follows), and the handler properties file kafka_k05.props.
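The original does not show the contents of custom_kafka_producer.properties; a minimal sketch, assuming the broker list quoted above (ports as given in this article) and the byte-array serializers used in Oracle's shipped example:

bootstrap.servers=10.1.1.246:0000,10.1.1.247:0000
acks=1
compression.type=gzip
key.serializer=org.apache.kafka.common.serialization.ByteArraySerializer
value.serializer=org.apache.kafka.common.serialization.ByteArraySerializer
# batching/retry tuning, as in the shipped sample
reconnect.backoff.ms=1000
batch.size=102400
linger.ms=10000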
vi kafka_k05.props
gg.handlerlist=kafkahandler
gg.handler.kafkahandler.type=kafka
gg.handler.kafkahandler.KafkaProducerConfigFile=custom_kafka_producer.properties
#The following resolves the topic name using the short table name
gg.handler.kafkahandler.topicMappingTemplate=dls_merchant
#The following selects the message key using the concatenated primary keys
#gg.handler.kafkahandler.keyMappingTemplate=
#gg.handler.kafkahandler.format=avro_op
gg.handler.kafkahandler.format=json
gg.handler.kafkahandler.format.insertOpKey=I
gg.handler.kafkahandler.format.updateOpKey=U
gg.handler.kafkahandler.format.deleteOpKey=D
gg.handler.kafkahandler.format.truncateOpKey=T
gg.handler.kafkahandler.format.prettyPrint=false
gg.handler.kafkahandler.format.jsonDelimiter=CDATA[]
gg.handler.kafkahandler.format.includePrimaryKeys=true
gg.handler.kafkahandler.SchemaTopicName=dls_merchant
gg.handler.kafkahandler.BlockingSend=false
gg.handler.kafkahandler.includeTokens=false
gg.handler.kafkahandler.mode=op

goldengate.userexit.timestamp=utc
goldengate.userexit.writers=javawriter
javawriter.stats.display=TRUE
javawriter.stats.full=TRUE

gg.log=log4j
gg.log.level=INFO

gg.report.time=30sec

#Sample gg.classpath for Apache Kafka
gg.classpath=dirprm/:/opt/cloudera/parcels/KAFKA/lib/kafka/libs/*
#Sample gg.classpath for HDP
#gg.classpath=/etc/kafka/conf:/usr/hdp/current/kafka-broker/libs/*

javawriter.bootoptions=-Xmx512m -Xms32m -Djava.class.path=ggjava/ggjava.jar
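With gg.handler.kafkahandler.format=json and mode=op, each captured operation becomes one JSON message on the dls_merchant topic. An insert would produce a record roughly of the following shape (field values and column names here are illustrative, not taken from the original):

{"table":"DLS.DLS_SYSTEM_USER","op_type":"I","op_ts":"2018-05-31 12:00:00.000000","current_ts":"2018-05-31T12:00:00.100000","pos":"00000000010000001234","primary_keys":["USER_ID"],"after":{"USER_ID":"1001","USER_NAME":"demo"}}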
Finally, start each process.
1. On the source side, start the extract, delivery and initial-load processes.
2. On the target side, start the initialization process to run the initial load, then start the replication process.
Run rp_06 (as a specialrun replicat it is executed manually from $OGG_HOME rather than started in GGSCI):
./replicat paramfile ./dirprm/rp_06.prm reportfile ./dirrpt/rp_06.rpt -p INITIALDATALOAD
Then start rep_04:
start replicat rep_04
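Pulling the start sequence together: the source-side processes are started with the usual GGSCI commands, and message flow can then be checked from the Kafka side with the stock console consumer (broker address and option spelling assume a reasonably recent Kafka CLI):

-- source side, in GGSCI
start extract ext_kaf4
start extract pmp_kaf4
start extract ek_04
info all
-- on a Kafka host, verify that records arrive on the topic
kafka-console-consumer.sh --bootstrap-server 10.1.1.246:0000 --topic dls_merchant --from-beginning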