While building an Elasticsearch database, I first tried the officially recommended Logstash tool for importing data, but found it awkward to use, so I decided to rely on Perl's excellent regular expressions to filter and classify the data before importing it into Elasticsearch. A search on CPAN turned up the Search::Elasticsearch module.
The module's documentation on CPAN is fairly terse, so here is a summary of my experience using it.
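All of the snippets below start by constructing the same client object. One aside that is not in the original snippets but that the module documents: new() accepts a trace_to parameter that logs every request and response, which is handy when a call fails and the terse docs leave you guessing.

use Search::Elasticsearch;

# Same client as in the examples below, but with request/response
# tracing sent to STDERR ('Stderr' is a documented trace_to target).
my $e = Search::Elasticsearch->new(
    nodes    => ['localhost:9200'],
    trace_to => 'Stderr',
);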
1. Writing a single document:
use Search::Elasticsearch;

my $e = Search::Elasticsearch->new( nodes => ['localhost:9200'] );

# Index one document under an explicit id
$e->index(
    index => $index_name,
    type  => $type_name,
    id    => $id_name,
    body  => {
        title => $data_name,
        data  => $data,
    },
);
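If the id does not matter, it can be left out: as far as I know, calling index() without an id lets Elasticsearch generate one, and the generated id comes back in the response. A minimal sketch, assuming the same $e and variables as above:

# Let Elasticsearch assign the document id itself
my $res = $e->index(
    index => $index_name,
    type  => $type_name,
    body  => { title => $data_name, data => $data },
);
print "indexed as $res->{_id}\n";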
2. Bulk writes:
use Search::Elasticsearch;

my $e    = Search::Elasticsearch->new( nodes => ['localhost:9200'] );
my $bulk = $e->bulk_helper( index => $index_name, type => $type_name );

my $i = 0;
while (...) {
    # do something
    $bulk->add_action(
        index => {
            id     => $id_name,
            source => { title => $data_name, data => $data },
        },
    );
    if ( $i > 999 ) {
        $bulk->flush;
        $i = 0;
    }
    $i++;
}
$bulk->flush;    # don't forget the final partial batch
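Counting actions by hand works, but the helper can do its own batching: bulk_helper() accepts a max_count option (among others, such as max_size) and flushes automatically once that many actions are buffered. A sketch of the same loop, assuming a hypothetical @records array of pre-built documents:

use Search::Elasticsearch;

my $e    = Search::Elasticsearch->new( nodes => ['localhost:9200'] );
my $bulk = $e->bulk_helper(
    index     => $index_name,
    type      => $type_name,
    max_count => 1000,    # auto-flush every 1000 buffered actions
);

for my $record (@records) {    # @records is hypothetical
    $bulk->add_action(
        index => { id => $record->{id}, source => $record->{source} },
    );
}
$bulk->flush;    # flush whatever is left in the buffer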
3. Reading a single record:
use Search::Elasticsearch;

my $e = Search::Elasticsearch->new( nodes => ['localhost:9200'] );

my $doc = $e->get(
    index => $index_name,
    type  => $type_name,
    id    => $id_name,
);
my $data = $doc->{_source}{data};
# do something
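A caveat from use: get() throws an exception rather than returning undef when the id does not exist. Wrapping the call in eval (or Try::Tiny) works; a simpler sketch is to guard with the client's exists() call first:

# Only fetch the document if it is actually there
if ( $e->exists( index => $index_name, type => $type_name, id => $id_name ) ) {
    my $doc = $e->get( index => $index_name, type => $type_name, id => $id_name );
    print $doc->{_source}{data}, "\n";
}
else {
    warn "document $id_name not found\n";
}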
4. Reading all records sequentially:
use Search::Elasticsearch;

my $e      = Search::Elasticsearch->new( nodes => ['localhost:9200'] );
my $scroll = $e->scroll_helper(
    index => $index_name,
    type  => $type_name,
    body  => {
        query => { match_all => {} },
        size  => 5000,    # documents per scroll batch
    },
);

while ( my $doc = $scroll->next ) {
    my $id   = $doc->{_id};
    my $data = $doc->{_source}{data};
    # do something
}
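The helper accepts any query DSL, not just match_all, so filtering can be pushed to the server instead of being done inside the loop. A sketch with a hypothetical match query on the title field; calling finish() releases the server-side scroll context if you stop iterating early:

my $scroll = $e->scroll_helper(
    index => $index_name,
    type  => $type_name,
    body  => {
        query => { match => { title => 'some keyword' } },    # hypothetical query
        size  => 5000,
    },
);

while ( my $doc = $scroll->next ) {
    # process only the matching documents
}
$scroll->finish;    # free the scroll context on the server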
This article is from the "Ruyi Spirit Pro" blog; please keep this source when reposting: http://417722381.blog.51cto.com/7856838/1880794