Spring Batch: Manipulating CSV Files


I. Requirements analysis

Read and write a CSV file using Spring Batch: read a CSV file with four fields (id, name, age, score), do some simple processing of each record, and then write the result out to another CSV file.

II. Code implementation

1. Code structure:

JobLaunch: starts the job

CsvItemProcessor: processes the data read by the reader

Student: entity object

input.csv: input data file

output.csv: output data file

2. applicationContext.xml

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:context="http://www.springframework.org/schema/context"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
           http://www.springframework.org/schema/context
           http://www.springframework.org/schema/context/spring-context-2.5.xsd"
       default-autowire="byName">

    <context:annotation-config/>
    <context:component-scan base-package="com.zdp.springbatch"/>

    <bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
        <property name="jobRepository" ref="jobRepository"/>
    </bean>

    <bean id="jobRepository" class="org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean"/>

    <bean id="transactionManager" class="org.springframework.batch.support.transaction.ResourcelessTransactionManager"/>
</beans>

3. springbatch.xml

<?xml version="1.0" encoding="UTF-8"?>
<bean:beans xmlns="http://www.springframework.org/schema/batch"
            xmlns:bean="http://www.springframework.org/schema/beans"
            xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xmlns:context="http://www.springframework.org/schema/context"
            xsi:schemaLocation="http://www.springframework.org/schema/beans
                http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
                http://www.springframework.org/schema/context
                http://www.springframework.org/schema/context/spring-context-2.5.xsd
                http://www.springframework.org/schema/batch
                http://www.springframework.org/schema/batch/spring-batch-2.1.xsd">

    <!-- load the Spring core configuration -->
    <bean:import resource="applicationContext.xml"/>

    <bean:bean id="student" class="com.zdp.springbatch.Student"></bean:bean>

    <job id="csvJob">
        <step id="csvStep">
            <tasklet transaction-manager="transactionManager">
                <chunk reader="csvItemReader" writer="csvItemWriter"
                       processor="csvItemProcessor" commit-interval="1"/>
            </tasklet>
        </step>
    </job>

    <!-- read the CSV file -->
    <bean:bean id="csvItemReader" class="org.springframework.batch.item.file.FlatFileItemReader" scope="step">
        <bean:property name="resource" value="classpath:input.csv"/>
        <bean:property name="lineMapper">
            <bean:bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
                <bean:property name="lineTokenizer" ref="lineTokenizer"/>
                <bean:property name="fieldSetMapper">
                    <bean:bean class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
                        <bean:property name="prototypeBeanName" value="student"></bean:property>
                    </bean:bean>
                </bean:property>
            </bean:bean>
        </bean:property>
    </bean:bean>

    <!-- lineTokenizer -->
    <bean:bean id="lineTokenizer" class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
        <bean:property name="delimiter" value=","/>
        <bean:property name="names">
            <bean:list>
                <bean:value>id</bean:value>
                <bean:value>name</bean:value>
                <bean:value>age</bean:value>
                <bean:value>score</bean:value>
            </bean:list>
        </bean:property>
    </bean:bean>

    <!-- write the CSV file -->
    <bean:bean id="csvItemWriter" class="org.springframework.batch.item.file.FlatFileItemWriter" scope="step">
        <bean:property name="resource" value="file:src/output.csv"/>
        <bean:property name="lineAggregator">
            <bean:bean class="org.springframework.batch.item.file.transform.DelimitedLineAggregator">
                <bean:property name="delimiter" value=","></bean:property>
                <bean:property name="fieldExtractor">
                    <bean:bean class="org.springframework.batch.item.file.transform.BeanWrapperFieldExtractor">
                        <bean:property name="names" value="name,age,score"></bean:property>
                    </bean:bean>
                </bean:property>
            </bean:bean>
        </bean:property>
    </bean:bean>
</bean:beans>

This file configures the job executed in this example, csvJob. The job contains a single step that performs the complete CSV read-and-write function.

The csvItemReader performs the CSV read, the csvItemProcessor processes the data that was read, and the csvItemWriter performs the CSV write.

csvItemReader is an instance of the FlatFileItemReader class provided by Spring Batch. This class is used for reading flat files and has two required properties: resource and lineMapper. The former specifies the location of the file to read; the latter maps each line of the file to a POJO object. The lineMapper in turn has two important properties, lineTokenizer and fieldSetMapper: the lineTokenizer splits a line of the file into a FieldSet, and the fieldSetMapper then maps that FieldSet onto a POJO object. This is similar to reading from a database: the lineMapper is like a ResultSet, a line of the file is like a record in a table and is wrapped as a FieldSet, and the fieldSetMapper plays the role of a RowMapper.

The actual splitting of a record is done by DelimitedLineTokenizer, an implementation of LineTokenizer. Its delimiter property determines which character a line of the file is split on; the default is ",". The names property assigns a name to each field produced by the split, so that when the FieldSet is passed to the fieldSetMapper (a BeanWrapperFieldSetMapper in this example) each value can be looked up by name. The fieldSetMapper's prototypeBeanName property is the bean name of the target POJO class. When this property is set, the framework maps the FieldSet produced by the lineTokenizer onto a POJO object, matching by name (the names declared on the lineTokenizer must correspond to the field names of the POJO).

In summary, FlatFileItemReader reads one record in four steps: 1) read one record from the file specified by resource; 2) the lineTokenizer splits the record into a FieldSet according to the delimiter, with each field named via the names property; 3) the FieldSet is passed to the fieldSetMapper, which maps it to a POJO object by name; 4) FlatFileItemReader returns the mapped POJO object, and the framework passes it on to the processor.
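The four steps above can be sketched in plain Java. This is an illustrative stand-in for the Spring Batch classes, not the framework's actual code; the method and field names here are invented for the sketch, and only the id/name/age/score fields come from the example:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ReadSketch {

    // step 2: split one record into fields by the delimiter,
    // as DelimitedLineTokenizer does
    static String[] tokenize(String line, String delimiter) {
        return line.split(delimiter, -1);
    }

    // step 3 (first half): pair each token with its configured name,
    // producing something like a FieldSet
    static Map<String, String> toFieldSet(String[] names, String[] tokens) {
        Map<String, String> fieldSet = new LinkedHashMap<>();
        for (int i = 0; i < names.length; i++) {
            fieldSet.put(names[i], tokens[i]);
        }
        return fieldSet;
    }

    public static void main(String[] args) {
        // step 1: one record read from the resource file
        String line = "001,Tom,18,88";
        String[] names = {"id", "name", "age", "score"};
        Map<String, String> fieldSet = toFieldSet(names, tokenize(line, ","));
        // steps 3-4: a FieldSetMapper would now copy these values onto a
        // Student by matching names, and the POJO would go to the processor
        System.out.println(fieldSet);
    }
}
```

Running it prints the named fields of one record, which is essentially what the BeanWrapperFieldSetMapper sees before it populates the Student bean.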

csvItemWriter is an instance of the FlatFileItemWriter class. This class mirrors FlatFileItemReader and has two important properties: resource and lineAggregator.

The former is the path of the file to write; the latter is the counterpart of the lineTokenizer. The lineAggregator (a DelimitedLineAggregator in this example) also has two important properties: delimiter and fieldExtractor.

The delimiter specifies the character used to separate the output fields, and the fieldExtractor turns the POJO object into an array of its field values, which are then assembled into a single string.

Likewise, FlatFileItemWriter writes one record in four steps: 1) the processor passes an object to the lineAggregator; 2) the lineAggregator's fieldExtractor property turns the object into an array of field values; 3) the lineAggregator joins the array into a string separated by the delimiter; 4) the string is written to the output file.
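These write steps can likewise be sketched in plain Java. Again this is only a conceptual stand-in for DelimitedLineAggregator and BeanWrapperFieldExtractor, with invented names, not the framework API:

```java
import java.util.Arrays;
import java.util.stream.Collectors;

public class WriteSketch {

    // steps 2-3: given the field values a FieldExtractor pulled off the POJO,
    // join them with the delimiter, as DelimitedLineAggregator does
    static String aggregate(Object[] fields, String delimiter) {
        return Arrays.stream(fields)
                     .map(String::valueOf)
                     .collect(Collectors.joining(delimiter));
    }

    public static void main(String[] args) {
        // step 1: the processor hands over a Student; here the extracted
        // name/age/score values are shown directly instead of a bean
        Object[] fields = {"001--Tom", 20, 98.0f};
        // step 4: FlatFileItemWriter would write this string to output.csv
        System.out.println(aggregate(fields, ","));
    }
}
```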

4. CsvItemProcessor

import org.springframework.batch.item.ItemProcessor;
import org.springframework.stereotype.Component;

/**
 * ItemProcessor implementation.
 */
@Component("csvItemProcessor")
public class CsvItemProcessor implements ItemProcessor<Student, Student> {

    /**
     * Does some simple processing of the data read in.
     *
     * @param student the data before processing
     * @return the processed data
     * @throws Exception if any error occurs during processing
     */
    @Override
    public Student process(Student student) throws Exception {
        // merge id and name
        student.setName(student.getId() + "--" + student.getName());
        // add 2 to age
        student.setAge(student.getAge() + 2);
        // add to score (the increment was lost in the source text; 10 is assumed here)
        student.setScore(student.getScore() + 10);
        // pass the processed result to the writer
        return student;
    }
}

CsvItemProcessor implements the ItemProcessor interface. It receives the POJO object that the reader mapped, applies the relevant business logic to it, and returns it; the framework then passes the returned result to the writer for the write operation.
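The effect of the processor can be seen with a minimal stand-alone version of the same transformation, without any Spring dependencies. The score increment is assumed to be 10, since the value is garbled in the source:

```java
public class ProcessorSketch {

    // the same transformation as CsvItemProcessor.process(), inlined
    // over plain values instead of a Student bean
    static String[] process(String id, String name, int age, float score) {
        String mergedName = id + "--" + name;   // merge id and name
        int newAge = age + 2;                   // age plus 2
        float newScore = score + 10;            // score increment assumed to be 10
        return new String[]{mergedName, String.valueOf(newAge), String.valueOf(newScore)};
    }

    public static void main(String[] args) {
        String[] out = process("001", "Tom", 18, 88f);
        // prints the record as the writer would see it
        System.out.println(String.join(",", out));
    }
}
```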

5. Student

/**
 * POJO class: Student
 */
public class Student {
    private String id;
    private String name;
    private int age;
    private float score;

    public String getId() { return id; }
    public void setId(String id) { this.id = id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
    public float getScore() { return score; }
    public void setScore(float score) { this.score = score; }
}
6. JobLaunch

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

/**
 * Test client
 */
public class JobLaunch {
    public static void main(String[] args) {
        try {
            ApplicationContext context = new ClassPathXmlApplicationContext("springbatch.xml");
            JobLauncher jobLauncher = (JobLauncher) context.getBean("jobLauncher");
            Job job = (Job) context.getBean("csvJob");
            // use the JobLauncher to start the job
            JobExecution result = jobLauncher.run(job, new JobParameters());
            // when processing finishes, print the result to the console
            System.out.println(result.toString());
        } catch (Exception e) {
            throw new RuntimeException("Error happens...", e);
        }
    }
}

7. Input and output



Transferred from: http://www.cnblogs.com/gulvzhe

Copyright notice: this is an original blog post and may not be reproduced without the blogger's consent.
