No existing tool or project was found that exports Elasticsearch data directly to a MySQL database.
Current approach: use the elasticdump tool to export the ES data to a JSON file, then use a script to parse the JSON file and import it into MySQL.
The details are as follows:
Installing elasticdump
1. npm install elasticdump -g (global installation)
2. Or download the matching version from GitHub, e.g. 2.4.*
For details see https://www.npmjs.com/package/elasticdump
GitHub address: https://github.com/taskrabbit/elasticsearch-dump
Once the installation is complete, you can use commands like the examples below (for instance, importing the data from index a into index b):
elasticdump usage explanation
./elasticdump --input=http://127.0.0.1:9200/domain6 --output=query.json --debug=true --limit=1 --offset=0 --searchBody='{"query":{"range":{"id":{"lte":"+"}}}}' --sourceOnly=true
Get only the field values that you want:
./elasticdump --input=http://127.0.0.1:9200/domain6 --output=query2.json --limit=100 --offset=0 --searchBody='{"query":{"range":{"id":{"lte":"+"}}},"_source":["id","cname"]}' --sourceOnly=true
A way to do something like ES's size (capping the total number of documents exported) has not been found for this approach yet.
./elasticdump --input=http://127.0.0.1:9200/domain6 --output=query3.json --limit=100 --offset=0 --searchBody='{"query":{"range":{"id":{"lte":"+"}}},"_source":["id","cname"]}' --sourceOnly=true
--input: the source of the data (ES address or file)
--output: where the data is exported to (file or ES address)
--limit: the number of documents transferred per batch on each request (not the total number exported; this differs from ES's size)
--offset: equivalent to from in ES
--searchBody: the query statement
--sourceOnly: export only each document's _source; by default (false) the exported format includes _index, _type, _id and _source
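Putting these options together, a single export to a file might look like the sketch below; the index name your_index is a placeholder, and the field list just reuses the id and cname fields from the example above:
# Sketch only: replace your_index, the query and the _source field list with your own values
elasticdump --input=http://127.0.0.1:9200/your_index --output=query.json --limit=100 --offset=0 --searchBody='{"query":{"match_all":{}},"_source":["id","cname"]}' --sourceOnly=true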
elasticdump --input=http://localhost:9200/a --output=http://localhost:9200/b --type=data
elasticdump --input=http://localhost:9200/domain_v12 --output=d:/ziliao/elasticsearch-2.3.3/dd --type=data
elasticdump --input=http://localhost:9200/domain_v12 --limit=10 --output=d:/ziliao/elasticsearch-2.3.3/dd4.json --type=data
Perform data migration
Export mapping information
elasticdump --ignore-errors=true --scrollTime=120m --bulk=true --input=http://10.10.20.164:9200/xmonitor-2015.04.29 --output=http://192.168.100.72:9200/xmonitor-prd-2015.04.29 --type=mapping
Exporting data
elasticdump --ignore-errors=true --scrollTime=120m --bulk=true --input=http://10.10.20.164:9200/xmonitor-2015.04.28 --output=/usr/local/esdump/node-v0.12.2-linux-x64/data/xmonitor-prd-2015.04.28.json --type=data
Export data to a local cluster
elasticdump --ignore-errors=true --scrollTime=120m --bulk=true --input=http://10.10.20.164:9200/xmonitor-2015.04.29 --output=http://192.168.100.72:9200/xmonitor-prd-2015.04.29 --type=data
// elasticdump's default option values (for reference)
var defaults = {
  limit: 10,
  offset: 0,
  debug: false,
  type: 'data',
  delete: false,
  maxSockets: null,
  input: null,
  'input-index': null,
  output: null,
  'output-index': null,
  inputTransport: null,
  outputTransport: null,
  searchBody: null,
  sourceOnly: false,
  jsonLines: false,
  format: '',
  'ignore-errors': false,
  scrollTime: '10m',
  timeout: null,
  toLog: null,
  quiet: false,
  awsAccessKeyId: null,
  awsSecretAccessKey: null,
};
Installing a specific version:
npm install elasticdump@2.1.0 -g
elasticdump
npm install elasticdump@2.4.*
Importing the JSON file into MySQL
<?php
// Read the exported JSON file (path to the JSON file)
$json_data = file_get_contents('D:/ziliao/node_modules/elasticdump/bin/query1223.json');
$len = strlen($json_data);
$begin = 0;
$end = 0;
$data = [];
// Split the contents on "}" and decode each JSON document
for ($i = 0; $i < $len; $i++) {
    if ($json_data[$i] == "}") {
        $end = $i;
        $lens = $end - $begin + 1;
        $data[] = json_decode(substr($json_data, $begin, $lens), true);
        $begin = $end + 1;
    }
}
print_r($data);
// MySQL database connection configuration
$con = mysql_connect("localhost", "root", "");
if (!$con) {
    die('Could not connect: ' . mysql_error());
}
mysql_select_db("adbug", $con);
// Insert one row per exported document
foreach ($data as $key => $value) {
    $sql = "INSERT INTO test2 (name) VALUES ('" . $value['host'] . "')";
    mysql_query($sql);
}
mysql_close($con);
?>
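Note that the mysql_* functions used above were removed in PHP 7, and splitting the file on "}" only works for flat documents. As a rough alternative sketch (not from the original write-up), the same import could use mysqli with a prepared statement and read the file line by line, since elasticdump writes one JSON object per line; the file path, database name adbug, table test2 and field host below simply reuse the values from the example above and would need to be adjusted.
<?php
// Sketch only: assumes the same query1223.json file and a test2(name) table as above.
$file = fopen('D:/ziliao/node_modules/elasticdump/bin/query1223.json', 'r');
if (!$file) {
    die('Could not open JSON file');
}

// Connect with mysqli instead of the removed mysql_* extension.
$db = new mysqli('localhost', 'root', '', 'adbug');
if ($db->connect_error) {
    die('Could not connect: ' . $db->connect_error);
}

// Prepared statement; $name is bound by reference and set inside the loop.
$stmt = $db->prepare('INSERT INTO test2 (name) VALUES (?)');
$name = '';
$stmt->bind_param('s', $name);

// elasticdump output is one JSON document per line, so decode line by line.
while (($line = fgets($file)) !== false) {
    $doc = json_decode(trim($line), true);
    if ($doc === null || !isset($doc['host'])) {
        continue; // skip blank or malformed lines
    }
    $name = $doc['host'];
    $stmt->execute();
}

$stmt->close();
$db->close();
fclose($file);
?>
The prepared statement also avoids the quoting problems that come from concatenating values directly into the INSERT string.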