CRUD Operations with Elasticsearch

Elasticsearch is usually called through RESTful-style requests. This post records some commonly used Java API snippets and Postman URLs.

Calling ES via the Java API

1. Query the total document count

    @Test
    void getAllCount() {
        RestHighLevelClient client = new RestHighLevelClient(
                RestClient.builder(new HttpHost("192.168.32.1", 9200, "http")));
        SearchRequest searchRequest = new SearchRequest();
        SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder();
        searchSourceBuilder.query(QueryBuilders.matchAllQuery());
        // Without this, ES 7+ caps hits.total at 10,000
        searchSourceBuilder.trackTotalHits(true);
        searchRequest.indices("_all");
        searchRequest.source(searchSourceBuilder);
        try {
            SearchResponse searchResponse = client.search(searchRequest, RequestOptions.DEFAULT);
            long count = searchResponse.getHits().getTotalHits().value;
            System.out.println("Total document count on 192.168.32.1: " + count);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

2. Bulk-insert 1,000,000 documents

    @Test
    public void setESBatchData() throws IOException {
        RestHighLevelClient client = new RestHighLevelClient(
                RestClient.builder(new HttpHost("192.168.32.2", 9200, "http")));
        BulkRequest request = new BulkRequest();
        request.timeout("10m");
        long startTime = System.currentTimeMillis();
        // Use <= so the final batch (i = 1,000,000) is also flushed
        for (int i = 1; i <= 1000000; i++) {
            IndexRequest indexRequest = new IndexRequest("my_index").source("name", "test" + i, "age", i);
            request.add(indexRequest);
            // Flush every 10,000 documents
            if (i % 10000 == 0) {
                BulkResponse response = client.bulk(request, RequestOptions.DEFAULT);
                request = new BulkRequest();
                if (response.hasFailures()) {
                    // Handle failed documents
                    System.out.println("Bulk insert batch " + i / 10000 + " failed");
                } else {
                    System.out.println("Bulk insert batch " + i / 10000 + " succeeded, " + i + " documents inserted");
                }
            }
        }
        long endTime = System.currentTimeMillis();
        // Divide by 1000.0, not 1000, to keep the fractional seconds
        double timeDifference = (endTime - startTime) / 1000.0;
        System.out.println("Elapsed time: " + timeDifference + " seconds");
        // For very large responses, the buffer limit and timeouts can be raised via
        // RequestOptions.toBuilder().setHttpAsyncResponseConsumerFactory(...) and a custom RequestConfig.
        client.close();
    }
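The flush arithmetic above is worth sanity-checking: flushing at `i % 10000 == 0` only covers every document if the loop bound is inclusive. This self-contained sketch (plain Java, no ES dependency; the batch size and totals are illustrative) computes how many bulk flushes occur and how many documents would be left unflushed for a given last loop index.

```java
// Sketch: count bulk flushes and leftover (unflushed) docs for a loop
// that flushes every `batchSize` docs. Illustrative only, no ES involved.
public class BulkMath {
    // Number of flushes when looping i = 1..lastI inclusive, flushing at i % batchSize == 0
    static int flushes(int lastI, int batchSize) {
        return lastI / batchSize;
    }

    // Docs added to the BulkRequest but never flushed when the loop ends
    static int leftover(int lastI, int batchSize) {
        return lastI % batchSize;
    }

    public static void main(String[] args) {
        // With i <= 1_000_000 and batches of 10_000: 100 flushes, nothing left over
        System.out.println(flushes(1_000_000, 10_000));   // 100
        System.out.println(leftover(1_000_000, 10_000));  // 0
        // With i < 1_000_000 (last i is 999_999): the final 9_999 docs are never sent
        System.out.println(leftover(999_999, 10_000));    // 9999
    }
}
```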

3. Close a specified index

    @Test
    public void closeESIndex() throws IOException {
        // Create the ES client and open the connection
        RestHighLevelClient client = new RestHighLevelClient(
                RestClient.builder(new HttpHost("192.168.32.1", 9200, "http")));
        String indexName = ".ds-.logs-deprecation.elasticsearch-default-2023.06.29-000001";
        // Build the request for this operation type
        CloseIndexRequest request = new CloseIndexRequest(indexName);
        // Execute the request and read the result
        AcknowledgedResponse response = client.indices().close(request, RequestOptions.DEFAULT);
        if (response.isAcknowledged()) {
            System.out.println("Index closed successfully");
        } else {
            System.out.println("Failed to close index");
        }
        // Release the connection
        client.close();
    }

4. Add a document

    @Test
    public void addES() throws IOException {
        RestHighLevelClient client = new RestHighLevelClient(
                RestClient.builder(new HttpHost("192.168.32.3", 9200, "http")));
        String jsonDocument = "{ \"name\": \"test14\", \"age\": \"140\" }";
        IndexRequest request = new IndexRequest("my_index").id("14").source(jsonDocument, XContentType.JSON);
        IndexResponse response = client.index(request, RequestOptions.DEFAULT);
        if (response.getResult() == DocWriteResponse.Result.CREATED) {
            System.out.println("Document created");
        } else if (response.getResult() == DocWriteResponse.Result.UPDATED) {
            System.out.println("Document updated");
        }
        client.close();
    }

5. Update a document

    @Test
    public void updateES() throws IOException {
        RestHighLevelClient client = new RestHighLevelClient(
                RestClient.builder(new HttpHost("192.168.32.3", 9200, "http")));
        // The two-argument constructor avoids the deprecated "_doc" type parameter
        UpdateRequest request = new UpdateRequest("my_index", "14");
        request.doc("name", "test14_update");
        UpdateResponse response = client.update(request, RequestOptions.DEFAULT);
        if (response.getResult() == DocWriteResponse.Result.UPDATED) {
            System.out.println("Document updated");
        }
        client.close();
    }

6. Delete a document

    @Test
    public void deleteES() throws IOException {
        RestHighLevelClient client = new RestHighLevelClient(
                RestClient.builder(new HttpHost("192.168.32.3", 9200, "http")));
        DeleteRequest request = new DeleteRequest("my_index", "14");
        DeleteResponse response = client.delete(request, RequestOptions.DEFAULT);
        if (response.getResult() == DocWriteResponse.Result.DELETED) {
            System.out.println("Document deleted");
        }
        client.close();
    }

Postman Requests (DSL)

1. Query all documents

POST http://192.168.32.2:9201/_search?pretty
{
    "size": 1000,
    "query": {
        "match_all": {}
    }
}

2. Delete all documents (keep the indices)

POST http://192.168.32.2:9201/_all/_delete_by_query
{
    "query": {
        "match_all": {}
    }
}

3. Add a document

PUT http://192.168.32.2:9201/my_index/_doc/1
{
    "name": "test1",
    "age": 1
}

4. View X-Pack license information

GET http://192.168.32.2:9201/_xpack?pretty

5. List cluster nodes

GET http://192.168.32.2:9200/_cat/nodes?v

6. View node information

GET http://192.168.32.1:9201

7. Query the total document count

GET http://192.168.32.2:9201/_search?pretty
{
    "size": 0,
    "aggs": {
        "total_documents": {
            "value_count": {
                "field": "_id"
            }
        }
    }
}
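Note: a value_count aggregation on _id may be rejected on ES 7.6 and later, where fielddata on _id is disabled by default. The _count API is a simpler alternative, and it is not subject to the 10,000-hit cap on hits.total:

GET http://192.168.32.2:9201/_count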

8. Create an index

PUT http://192.168.32.2:9200/test_index
{
    "mappings": {
        "properties": {
            "name": {
                "type": "text"
            },
            "age": {
                "type": "long"
            }
        }
    }
}

9. Delete an index (including its data)

DELETE http://192.168.32.2:9201/my_index

10. List cluster indices

GET http://192.168.32.2:9201/_cat/indices?v

11. View the mapping of a specified index

GET http://192.168.32.2:9200/my_index/_mapping

12. Open a specified index

POST http://192.168.32.2:9201/my_index/_open

13. Close a specified index

POST http://192.168.32.2:9201/my_index/_close

14. Define a remote cluster

PUT http://192.168.32.2:9201/_cluster/settings
{
  "persistent": {
    "cluster": {
      "remote": {
        "es-source": {
          "seeds": ["192.168.32.1:9300"]
        }
      }
    }
  }
}

(Replace 192.168.32.1:9300 with the transport address of a node in the remote cluster.)

15. View remote clusters

GET http://192.168.32.2:9201/_remote/info

16. Remove a remote cluster

PUT http://192.168.32.2:9201/_cluster/settings
{
  "persistent": {
    "cluster": {
      "remote": {
        "es-source": {
          "seeds": null 
        }
      }
    }
  }
}

17. Enable CCR (for a specified index)

PUT http://192.168.32.2:9201/my_index/_ccr/follow
{
    "remote_cluster": "source",
    "leader_index": "my_index"
}

18. Pause CCR

POST http://192.168.32.2:9201/my_index/_ccr/pause_follow

19. Remove CCR (unfollow)

POST http://192.168.32.2:9201/my_index/_ccr/unfollow

Deleting the index also removes its CCR follower configuration.

20. View CCR information

GET http://192.168.32.2:9201/my_index/_ccr/info

21. Auto-follow for incremental index replication (auto follow)

PUT http://192.168.32.2:9201/_ccr/auto_follow/beats
{
    "remote_cluster": "source",
    "leader_index_patterns": [
        "*"
    ]
}
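To verify that the pattern was registered, the auto-follow configuration can be read back ("beats" is the pattern name created above):

GET http://192.168.32.2:9201/_ccr/auto_follow/beats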