Elasticsearch Java API (V): Bulk Index

Source: Internet
Author: User

This post describes an easy way to index multiple documents into Elasticsearch. The Bulk API lets you perform batch additions, deletions, updates, and so on in a single request. The Bulk API endpoint shown here runs over HTTP; older Elasticsearch releases also offered a bulk UDP variant, but UDP cannot guarantee that data is not lost on the way to the Elasticsearch server, and that variant has since been removed.

1. Bulk API

When using the bulk command, the REST endpoint ends with _bulk, and the batch operations are written in a JSON file. The official site gives the following syntax format:

action_and_meta_data\n
optional_source\n
action_and_meta_data\n
optional_source\n
...
action_and_meta_data\n
optional_source\n

That is, each operation takes two lines of data, and every line must end with a newline character. The first line describes the action and its metadata, and the second line carries the (optional) document source. For example, to insert two documents and delete one, create a file named bulkAdd.json and write the following:

{"create": {"_index": "blog", "_type": "article", "_id": "3"}}
{"title": "Title1", "posttime": "2016-07-02", "content": "Contents One"}
{"create": {"_index": "blog", "_type": "article", "_id": "4"}}
{"title": "Title2", "posttime": "2016-07-03", "content": "Contents 2"}
{"delete": {"_index": "blog", "_type": "article", "_id": "1"}}

Execute:

$ curl -XPOST "http://localhost:9200/_bulk?pretty" --data-binary @bulkAdd.json
{
  "took": 1,
  "errors": false,
  "items": [ {
    "create": {
      "_index": "blog",
      "_type": "article",
      "_id": "3",
      "_version": 1,
      "_shards": {
        "total": 1,
        "successful": 1,
        "failed": 0
      },
      "status": 201
    }
  } ]
}

Note: every line, including the last one, must end with a newline character; otherwise the body is not recognized as bulk commands and an error occurs:

$ curl -XPOST "http://localhost:9200/_bulk?pretty" --data-binary @bulkAdd.json
{
  "error": {
    "root_cause": [{
      "type": "action_request_validation_exception",
      "reason": "Validation Failed: 1: no requests added;"
    }],
    "type": "action_request_validation_exception",
    "reason": "Validation Failed: 1: no requests added;"
  },
  "status": 400
}
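To sidestep the missing-newline pitfall, the bulk body can also be assembled in code. The sketch below is plain Java with no Elasticsearch dependency (the class and method names are made up for illustration); it builds the same two-create/one-delete body and terminates every line, including the last, with a newline:

```java
public class BulkBodyDemo {

    // Assemble an NDJSON bulk body; every action and source line,
    // including the final one, must end with '\n'.
    public static String buildBody() {
        StringBuilder sb = new StringBuilder();
        sb.append("{\"create\": {\"_index\": \"blog\", \"_type\": \"article\", \"_id\": \"3\"}}\n");
        sb.append("{\"title\": \"Title1\", \"posttime\": \"2016-07-02\", \"content\": \"Contents One\"}\n");
        sb.append("{\"create\": {\"_index\": \"blog\", \"_type\": \"article\", \"_id\": \"4\"}}\n");
        sb.append("{\"title\": \"Title2\", \"posttime\": \"2016-07-03\", \"content\": \"Contents 2\"}\n");
        sb.append("{\"delete\": {\"_index\": \"blog\", \"_type\": \"article\", \"_id\": \"1\"}}\n");
        return sb.toString();
    }

    public static void main(String[] args) {
        // The trailing newline is what the server requires.
        System.out.print(buildBody());
    }
}
```

The resulting string can be sent as the request body (for example with curl's --data-binary, which preserves newlines, unlike -d).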
2. Batch Export

The following example bulk-exports the documents in an index to a file in JSON format. The cluster name is "bropen", the index is "blog", and the type is "article"; a new file files/bulk.txt is created under the project root directory, and the indexed content is written to bulk.txt:

import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.net.InetAddress;
import java.net.UnknownHostException;

import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.client.Client;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.search.SearchHits;

public class ElasticSearchBulkOut {

    public static void main(String[] args) {
        try {
            // cluster.name is configured in elasticsearch.yml
            Settings settings = Settings.settingsBuilder()
                    .put("cluster.name", "bropen").build();
            Client client = TransportClient.builder().settings(settings).build()
                    .addTransportAddress(new InetSocketTransportAddress(
                            InetAddress.getByName("127.0.0.1"), 9300));

            SearchResponse response = client.prepareSearch("blog")
                    .setTypes("article")
                    .setQuery(QueryBuilders.matchAllQuery())
                    .execute().actionGet();

            SearchHits resultHits = response.getHits();
            File article = new File("files/bulk.txt");
            FileWriter fw = new FileWriter(article);
            BufferedWriter bfw = new BufferedWriter(fw);

            if (resultHits.getHits().length == 0) {
                System.out.println("Found 0 documents!");
            } else {
                for (int i = 0; i < resultHits.getHits().length; i++) {
                    // one JSON document per line
                    String jsonStr = resultHits.getHits()[i].getSourceAsString();
                    System.out.println(jsonStr);
                    bfw.write(jsonStr);
                    bfw.write("\n");
                }
            }
            bfw.close();
            fw.close();
        } catch (UnknownHostException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
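The import step below reads bulk.txt one line per document, so each source must stay on a single line. getSourceAsString() already returns compact JSON, but as a guard a helper like the following (plain Java, hypothetical name, no Elasticsearch dependency) could normalize a pretty-printed source before writing:

```java
public class JsonLineDemo {

    // bulk.txt must hold exactly one JSON document per line; a source
    // string with embedded newlines would corrupt the file, so collapse
    // any newline (and surrounding indentation) into a single space.
    public static String toSingleLine(String json) {
        return json.replaceAll("\\s*\\n\\s*", " ").trim();
    }
}
```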

3. Batch Import

Read the bulk.txt file just exported line by line, then import in bulk. First, instantiate a BulkRequestBuilder by calling client.prepareBulk(), then call the BulkRequestBuilder's add method to append requests. Implementation code:

import java.io.BufferedReader;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import java.net.InetAddress;
import java.net.UnknownHostException;

import org.elasticsearch.action.bulk.BulkRequestBuilder;
import org.elasticsearch.client.Client;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;

public class ElasticSearchBulkIn {

    public static void main(String[] args) {
        try {
            // cluster.name is configured in elasticsearch.yml
            Settings settings = Settings.settingsBuilder()
                    .put("cluster.name", "bropen").build();
            Client client = TransportClient.builder().settings(settings).build()
                    .addTransportAddress(new InetSocketTransportAddress(
                            InetAddress.getByName("127.0.0.1"), 9300));

            File article = new File("files/bulk.txt");
            FileReader fr = new FileReader(article);
            BufferedReader bfr = new BufferedReader(fr);
            String line = null;
            BulkRequestBuilder bulkRequest = client.prepareBulk();
            int count = 0;
            while ((line = bfr.readLine()) != null) {
                bulkRequest.add(client.prepareIndex("test", "article").setSource(line));
                count++;
                if (count % 10 == 0) {
                    bulkRequest.execute().actionGet();
                    // start a fresh builder; otherwise the requests already
                    // executed would be sent again on the next flush
                    bulkRequest = client.prepareBulk();
                }
                System.out.println(line);
            }
            // flush whatever is left over from the last partial batch
            if (bulkRequest.numberOfActions() > 0) {
                bulkRequest.execute().actionGet();
            }
            bfr.close();
            fr.close();
        } catch (UnknownHostException e) {
            e.printStackTrace();
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
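The import above flushes the accumulated requests every ten lines so that a huge file does not build one enormous bulk request in memory. The batching idea can be sketched in isolation (plain Java, no Elasticsearch dependency; names are illustrative, with each batch standing in for one bulk request):

```java
import java.util.ArrayList;
import java.util.List;

public class BatchDemo {

    // Split lines into batches of at most batchSize; in the real code each
    // batch corresponds to one client.prepareBulk() that gets executed.
    public static List<List<String>> partition(List<String> lines, int batchSize) {
        List<List<String>> batches = new ArrayList<>();
        List<String> current = new ArrayList<>();
        for (String line : lines) {
            current.add(line);
            if (current.size() == batchSize) {
                batches.add(current);        // flush: execute this bulk request
                current = new ArrayList<>(); // start a fresh builder
            }
        }
        if (!current.isEmpty()) {
            batches.add(current);            // final partial batch
        }
        return batches;
    }
}
```

For 25 input lines and a batch size of 10, this yields three batches of 10, 10, and 5 requests.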

Reference: Elasticsearch Reference [2.3] » Document APIs » Bulk API
