Backing up ES data to HDFS

1. Prepare HDFS (here I am testing on my local machine)

 
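As a quick sanity check, here is a minimal sketch for bringing up and verifying a local single-node HDFS, assuming a Hadoop distribution is already installed and its scripts are on the PATH (adjust paths and config to your own setup):

# First run only: format the NameNode metadata directory
hdfs namenode -format
# Start the NameNode and DataNode daemons
start-dfs.sh
# Confirm HDFS is up and at least one DataNode is reporting
hdfs dfsadmin -report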

2. Install the repository-hdfs plugin on ES

(If ES is a multi-node cluster, the plugin must be installed on every node.)

 

 

elasticsearch-plugin install repository-hdfs
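After installation, you can confirm the plugin is present by listing installed plugins (run from the ES installation's bin directory, or wherever elasticsearch-plugin is on the PATH):

# Should print repository-hdfs among the installed plugins
elasticsearch-plugin list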

3. Restart ES

 
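How to restart depends on how ES was installed; as a sketch, assuming ES runs as a systemd service and listens on localhost:9200, restarting and verifying that the cluster comes back up might look like:

# Restart the service (adjust for tarball or Docker installations)
sudo systemctl restart elasticsearch
# Check that the cluster reports green/yellow status
curl -X GET "localhost:9200/_cluster/health?pretty"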

4. Create a snapshot repository

PUT /_snapshot/backup_hdfs

{
  "type": "hdfs",
  "settings": {
    "uri": "hdfs://localhost:8020/",
    "path": "elasticsearch/respositories/my_hdfs_repository",
    "conf.dfs.client.read.shortcircuit": "true"
  }
}
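The same repository registration can be sent with curl instead of the Kibana console (assuming ES listens on localhost:9200):

curl -X PUT "localhost:9200/_snapshot/backup_hdfs" \
  -H 'Content-Type: application/json' -d'
{
  "type": "hdfs",
  "settings": {
    "uri": "hdfs://localhost:8020/",
    "path": "elasticsearch/respositories/my_hdfs_repository",
    "conf.dfs.client.read.shortcircuit": "true"
  }
}'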

5. View snapshot repositories

GET /_snapshot/_all
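Via curl you can list every registered repository, and also the snapshots already stored in backup_hdfs (again assuming localhost:9200):

# All registered snapshot repositories
curl -X GET "localhost:9200/_snapshot/_all?pretty"
# All snapshots inside the backup_hdfs repository
curl -X GET "localhost:9200/_snapshot/backup_hdfs/_all?pretty"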

6. Create a snapshot

PUT /_snapshot/backup_hdfs/snapshot_1

{
  "indices": "logstash-2018-08-08,index_2",//注意不设置这个属性,默认是备份全部
  "ignore_unavailable": true,
  "include_global_state": false
}
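A curl version of the same call is sketched below; adding wait_for_completion=true makes the request block until the snapshot finishes, and the _status endpoint shows its progress:

# Create the snapshot and wait for it to complete
curl -X PUT "localhost:9200/_snapshot/backup_hdfs/snapshot_1?wait_for_completion=true" \
  -H 'Content-Type: application/json' -d'
{
  "indices": "logstash-2018-08-08,index_2",
  "ignore_unavailable": true,
  "include_global_state": false
}'
# Check snapshot state and per-shard progress
curl -X GET "localhost:9200/_snapshot/backup_hdfs/snapshot_1/_status?pretty"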

7. Restore a snapshot

POST /_snapshot/backup_hdfs/snapshot_1/_restore

{
  "indices": "zhangmingli", //指定索引恢复,不指定就是全部
  "ignore_unavailable": true,//忽略恢复时异常索引
  "include_global_state": false//是否存储全局转态信息,fasle表明有一个或几个失败,不会致使整个任务失败
}
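One caveat: an index that already exists and is open cannot be restored over; it has to be closed (or deleted) first. A curl sketch, using the same index name as above:

# Close the existing index so the restore can overwrite it
curl -X POST "localhost:9200/zhangmingli/_close"
# Run the restore
curl -X POST "localhost:9200/_snapshot/backup_hdfs/snapshot_1/_restore" \
  -H 'Content-Type: application/json' -d'
{
  "indices": "zhangmingli",
  "ignore_unavailable": true,
  "include_global_state": false
}'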

8. Delete a snapshot

DELETE /_snapshot/backup_hdfs/snapshot_1
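The curl equivalent is below. Deleting the snapshot removes its data from the HDFS repository, while unregistering the repository afterwards (optional) only removes the reference inside ES, not the files already written to HDFS:

# Delete the snapshot (removes its data from the repository)
curl -X DELETE "localhost:9200/_snapshot/backup_hdfs/snapshot_1"
# Optionally unregister the repository itself
curl -X DELETE "localhost:9200/_snapshot/backup_hdfs"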