Pentaho BI Server Community Edition 6.1 ships with a Kettle component that can run Kettle scripts. However, since Kettle does not publish directly to the BI Server CE service, the ETL scripts (.ktr, .kjb) developed locally (in a Windows environment) through the graphical interface need to be uploaded to the BI Server separately.
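For reference, one way to get a locally developed .ktr into the BI Server repository is its HTTP import service. The sketch below is not the article's method; the endpoint, form-field names, credentials, and paths are all assumptions to verify against the Pentaho 6.1 documentation.

    # Hedged sketch: push a local transformation into BI Server CE's repository.
    # Endpoint, form fields, credentials, and paths are assumptions to verify.
    curl -u admin:password \
         -F "fileUpload=@/c/etl/my_transform.ktr" \
         -F "importDir=/public/etl" \
         -F "overwriteFile=true" \
         http://localhost:8080/pentaho/api/repo/files/import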
ETL Tool Pentaho Kettle: Transformation and Job Integration
1. Kettle
1.1. Introduction
Kettle is an open-source ETL tool written in pure Java. It extracts data efficiently and stably, and is commonly used as a data migration tool.
Website link: http://wiki.pentaho.com/display/EAI/Pan+User+Documentation. Pan is a program that can execute a transformation edited with Spoon. Unzipping the PDI software .zip yields pan.bat, which executes transformations from the command line. The official site mainly covers the commands on the Linux platform; the focus here is on the Windows platform. Options use the format /option:"value"; parameters use the format "-param:name=value"; repository selection…
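To make the Windows syntax concrete, here is a minimal sketch of running a Spoon-edited transformation with pan.bat; the file path and parameter name are illustrative examples, not from the original article.

    REM Run a transformation from the Windows command line (path and parameter are examples)
    cd data-integration
    pan.bat /file:"C:\etl\my_transform.ktr" /level:Basic "-param:INPUT_DATE=2016-01-01"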
Author: Syn良子; source: http://www.cnblogs.com/cssdongl; reprints welcome. After writing Hadoop MapReduce programs recently, I noticed that much of the logic is basically the same, and realized that an ETL tool could be used to configure that logic so the MapReduce code is generated and executed automatically, simplifying both the existing work and what comes later. Pentaho Kettle, which is easy to get started with…
I. Background
When the company uses Kettle for data ETL, every job or transformation released to production needs to be executed immediately to check the resulting data. Each time, someone has to log in to the server, open Kettle, find the corresponding file, and click execute. The whole process is inefficient and ties up operations staff…
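One common way around this (a general Kettle practice, not necessarily what this author settled on) is to run released jobs headlessly with Kitchen instead of opening Spoon on the server; the paths below are illustrative.

    # Execute a released job from the shell; can be wrapped in cron or a deploy hook
    cd /opt/data-integration
    ./kitchen.sh -file=/opt/etl/jobs/daily_sync.kjb -level=Basic \
                 -logfile=/var/log/kettle/daily_sync.log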
I. Data integration business scenario
1.1 Background
Because a GA system was adjusted, the data resources originally obtained from that system's backup database can no longer be retrieved properly, and the subsequent data unification…
Launch the graphical interface from Xshell with Xmanager: sh spoon.sh. Create a new job.
1. Write data into HDFS
1) Kettle writes data to HDFS on Linux: double-click "Hadoop Copy Files", run the job, and view the data.
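After the job finishes, the write can be verified from the shell; the target directory below is a hypothetical example.

    # Confirm the files landed in HDFS (directory is an assumed example)
    hadoop fs -ls /user/kettle/output
    hadoop fs -cat /user/kettle/output/* | head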
Objective: With the growing variety of databases and the increasingly complex formats of database backups, data formatting has always been a commonplace issue. Backup file formats are numerous: SQL, BAK, TXT, and so on. And there are many kinds of databases: MySQL, Oracle, SQL Server, and so on. How do you manage all of these databases? Yesterday, a leaked Access file…
…the target table in DB2. In fact, Kettle can also create the table during execution, but as a Kettle beginner I chose a relatively simple way to operate; after all, the focus is on data migration.
2. Installing the JDK
Because Kettle is written in pure Java, it depends on the JDK. As for…
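A quick sanity check before launching Kettle, with an assumed install path for JAVA_HOME:

    # Verify a JDK is on the PATH; the JAVA_HOME below is a hypothetical example
    java -version
    export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
    export PATH="$JAVA_HOME/bin:$PATH"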
First, download the Kettle installation package from the official website.
Kettle download URL: https://sourceforge.net/projects/pentaho/files/
Kettle version: pdi-ce-5.0.1.a-stable.zip
Second, upload the Kettle installation package to the Linux server…
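The upload-and-unpack steps look roughly like this; the host name and directories are assumptions, not from the original post.

    # Copy the package up, unpack it, and make the launch scripts executable
    scp pdi-ce-5.0.1.a-stable.zip user@etl-server:/opt/
    ssh user@etl-server "cd /opt && unzip pdi-ce-5.0.1.a-stable.zip"
    ssh user@etl-server "chmod +x /opt/data-integration/*.sh"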
A transformation has lower granularity than a job. We divide tasks into jobs, then split each job into one or more transformations, and each transformation completes only part of the work.
Kettle basic example
Kettle's error handling requires error logging in many scenarios. For example, if a migration reports data problems, primary/foreign key errors, or constraint violations, the offending rows in the current scenario should be captured in an error log rather than failing the whole run.
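Step-level error handling is configured in Spoon (right-click a step and define its error handling), but when a migration runs from the command line the errors can at least be captured in a log file for later analysis; the paths and log level below are illustrative.

    # Capture errors from a command-line run to a log file (paths are examples)
    ./pan.sh -file=/opt/etl/migrate_customers.ktr -level=Error \
             -logfile=/var/log/kettle/migrate_customers.err.log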
Kettle download address: community.pentaho.com/projects/data-integration/
I. Preparation work
1. Unzip the downloaded package, e.g. pdi-ce-6.1.0.1-196.zip, and then open Spoon.bat.
2. Create a transformation. (When you connect to the database, you may hit problems with the database driver package; download the corresponding driver package into the data-integration/lib directory.)
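If Spoon cannot find a database driver, the usual fix is to drop the matching JDBC jar into Kettle's lib folder and restart Spoon; the jar name and version below are only an example.

    REM Copy the JDBC driver into Kettle's lib directory (jar name is an example)
    copy mysql-connector-java-5.1.46-bin.jar data-integration\lib\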