
= Elasticsearch

Elasticsearch is a distributed search and analytics engine, scalable data store and vector database optimized for speed and relevance on production-scale workloads. Elasticsearch is the foundation of Elastic's open Stack platform. Search in near real-time over massive datasets, perform vector searches, integrate with generative AI applications, and much more.

Use cases enabled by Elasticsearch include:

* https://www.elastic.co/search-labs/blog/articles/retrieval-augmented-generation-rag[Retrieval Augmented Generation (RAG)]
* https://www.elastic.co/search-labs/blog/categories/vector-search[Vector search]
* Full-text search
* Logs
* Metrics
* Application performance monitoring (APM)
* Security logs

\... and more!

To learn more about Elasticsearch's features and capabilities, see our
https://www.elastic.co/products/elasticsearch[product page].

For the latest on https://www.elastic.co/search-labs/blog/categories/ml-research[machine learning innovations] and https://www.elastic.co/search-labs/blog/categories/lucene[Lucene contributions from Elastic], visit https://www.elastic.co/search-labs[Search Labs].

[[get-started]]
== Get started

The simplest way to set up Elasticsearch is to create a managed deployment with
https://www.elastic.co/cloud/as-a-service[Elasticsearch Service on Elastic
Cloud].

If you prefer to install and manage Elasticsearch yourself, you can download
the latest version from
https://www.elastic.co/downloads/elasticsearch[elastic.co/downloads/elasticsearch].

=== Run Elasticsearch locally

////
IMPORTANT: This content is replicated in the Elasticsearch repo. See `run-elasticsearch-locally.asciidoc`.
Ensure both files are in sync.

https://github.com/elastic/start-local is the source of truth.
////

[WARNING]
====
DO NOT USE THESE INSTRUCTIONS FOR PRODUCTION DEPLOYMENTS.

This setup is intended for local development and testing only.
====

Quickly set up Elasticsearch and Kibana in Docker for local development or testing, using the https://github.com/elastic/start-local?tab=readme-ov-file#-try-elasticsearch-and-kibana-locally[`start-local` script].

For more detailed information about the `start-local` setup, refer to the https://github.com/elastic/start-local[README on GitHub].

==== Prerequisites

- If you don't have Docker installed, https://www.docker.com/products/docker-desktop[download and install Docker Desktop] for your operating system.
- If you're using Microsoft Windows, then install https://learn.microsoft.com/en-us/windows/wsl/install[Windows Subsystem for Linux (WSL)].

==== Trial license
This setup comes with a one-month trial license that includes all Elastic features.

After the trial period, the license reverts to *Free and open - Basic*.
Refer to https://www.elastic.co/subscriptions[Elastic subscriptions] for more information.

==== Run `start-local`

To set up Elasticsearch and Kibana locally, run the `start-local` script:

[source,sh]
----
curl -fsSL https://elastic.co/start-local | sh
----
// NOTCONSOLE

This script creates an `elastic-start-local` folder containing configuration files and starts both Elasticsearch and Kibana using Docker.

After running the script, you can access Elastic services at the following endpoints:

* *Elasticsearch*: http://localhost:9200
* *Kibana*: http://localhost:5601

The script generates a random password for the `elastic` user, which is displayed at the end of the installation and stored in the `.env` file.

[CAUTION]
====
This setup is for local testing only. HTTPS is disabled, and Basic authentication is used for Elasticsearch. For security, Elasticsearch and Kibana are accessible only through `localhost`.
====

==== API access

An API key for Elasticsearch is generated and stored in the `.env` file as `ES_LOCAL_API_KEY`.
Use this key to connect to Elasticsearch with a https://www.elastic.co/guide/en/elasticsearch/client/index.html[programming language client] or the https://www.elastic.co/guide/en/elasticsearch/reference/current/rest-apis.html[REST API].

From the `elastic-start-local` folder, check the connection to Elasticsearch using `curl`:

[source,sh]
----
source .env
curl $ES_LOCAL_URL -H "Authorization: ApiKey ${ES_LOCAL_API_KEY}"
----
// NOTCONSOLE
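The same check can be sketched in Python by reading the credentials out of `.env` yourself. This is a minimal sketch, not part of `start-local`: `parse_env` is a hypothetical helper, and the sample values below are placeholders, not real credentials. The resulting header can be passed to any HTTP client, or you can hand the key to the official Elasticsearch client instead.

[source,python]
----
def parse_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines, the format start-local writes to .env."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env

# Placeholder .env content for illustration only
sample = 'ES_LOCAL_URL=http://localhost:9200\nES_LOCAL_API_KEY="abc123"\n'
env = parse_env(sample)

# Elasticsearch expects API keys in an "ApiKey" Authorization header
headers = {"Authorization": f"ApiKey {env['ES_LOCAL_API_KEY']}"}
----

In a real script you would read the file with `open(".env").read()` from the `elastic-start-local` folder instead of using an inline sample.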

=== Send requests to Elasticsearch

You send data and other requests to Elasticsearch through REST APIs.
You can interact with Elasticsearch using any client that sends HTTP requests,
such as the https://www.elastic.co/guide/en/elasticsearch/client/index.html[Elasticsearch
language clients] and https://curl.se[curl].

==== Using curl

Here's an example curl command to create a new Elasticsearch index, using basic auth:

[source,sh]
----
curl -u elastic:$ELASTIC_PASSWORD \
  -X PUT \
  http://localhost:9200/my-new-index \
  -H 'Content-Type: application/json'
----
// NOTCONSOLE

==== Using a language client

To connect to your local dev Elasticsearch cluster with a language client, you can use basic authentication with the `elastic` username and the password you set in the environment variable.

You'll use the following connection details:

* **Elasticsearch endpoint**: `http://localhost:9200`
* **Username**: `elastic`
* **Password**: `$ELASTIC_PASSWORD` (Value you set in the environment variable)

For example, to connect with the Python `elasticsearch` client:

[source,python]
----
import os
from elasticsearch import Elasticsearch

username = 'elastic'
password = os.getenv('ELASTIC_PASSWORD') # Value you set in the environment variable

client = Elasticsearch(
    "http://localhost:9200",
    basic_auth=(username, password)
)

print(client.info())
----

==== Using the Dev Tools Console

Kibana's developer console provides an easy way to experiment and test requests.
To access the console, open Kibana, then go to **Management** > **Dev Tools**.

**Add data**

You index data into Elasticsearch by sending JSON objects (documents) through the REST APIs.
Whether you have structured or unstructured text, numerical data, or geospatial data,
Elasticsearch efficiently stores and indexes it in a way that supports fast searches.

For timestamped data such as logs and metrics, you typically add documents to a
data stream made up of multiple auto-generated backing indices.

To add a single document to an index, submit an HTTP POST request that targets the index.

----
POST /customer/_doc/1
{
  "firstname": "Jennifer",
  "lastname": "Walters"
}
----

This request automatically creates the `customer` index if it doesn't exist,
adds a new document that has an ID of 1, and
stores and indexes the `firstname` and `lastname` fields.

The new document is available immediately from any node in the cluster.
You can retrieve it with a GET request that specifies its document ID:

----
GET /customer/_doc/1
----

To add multiple documents in one request, use the `_bulk` API.
Bulk data must be newline-delimited JSON (NDJSON).
Each line must end in a newline character (`\n`), including the last line.

----
PUT customer/_bulk
{ "create": { } }
{ "firstname": "Monica","lastname":"Rambeau"}
{ "create": { } }
{ "firstname": "Carol","lastname":"Danvers"}
{ "create": { } }
{ "firstname": "Wanda","lastname":"Maximoff"}
{ "create": { } }
{ "firstname": "Jennifer","lastname":"Takeda"}
----
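The NDJSON rules above can be made explicit with a short Python sketch that assembles a bulk body by hand. `build_bulk_body` is a hypothetical helper for illustration; in practice the Python client's `bulk` helpers do this for you.

[source,python]
----
import json

def build_bulk_body(docs: list[dict]) -> str:
    """Serialize documents as NDJSON for the _bulk API: an action line
    precedes each document, and the body must end with a newline."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"create": {}}))  # action line
        lines.append(json.dumps(doc))             # document line
    # The _bulk API rejects bodies whose last line lacks a trailing newline
    return "\n".join(lines) + "\n"

docs = [
    {"firstname": "Monica", "lastname": "Rambeau"},
    {"firstname": "Carol", "lastname": "Danvers"},
]
body = build_bulk_body(docs)
----

The body would then be sent as the payload of a `PUT customer/_bulk` request with a `Content-Type: application/x-ndjson` header.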

**Search**

Indexed documents are available for search in near real-time.
The following search matches all customers with a first name of _Jennifer_
in the `customer` index.

----
GET customer/_search
{
  "query" : {
    "match" : { "firstname": "Jennifer" }
  }
}
----
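The same search can be issued from Python. The sketch below only builds the request body; `match_query` is a hypothetical helper, and running the search assumes a client configured as in the earlier connection example.

[source,python]
----
def match_query(field: str, value: str) -> dict:
    """Build the body for a match search on a single field."""
    return {"query": {"match": {field: value}}}

search_body = match_query("firstname", "Jennifer")

# With the Python client from the earlier example, you could then run:
# resp = client.search(index="customer", body=search_body)
# and read the matches from resp["hits"]["hits"].
----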

**Explore**

You can use Discover in Kibana to interactively search and filter your data.
From there, you can start creating visualizations and building and sharing dashboards.

To get started, create a _data view_ that connects to one or more Elasticsearch indices,
data streams, or index aliases.

. Go to **Management > Stack Management > Kibana > Data Views**.
. Select **Create data view**.
. Enter a name for the data view and a pattern that matches one or more indices,
such as _customer_.
. Select **Save data view to Kibana**.

To start exploring, go to **Analytics > Discover**.

[[upgrade]]
== Upgrade

To upgrade from an earlier version of Elasticsearch, see the
https://www.elastic.co/guide/en/elasticsearch/reference/current/setup-upgrade.html[Elasticsearch upgrade
documentation].

[[build-source]]
== Build from source

Elasticsearch uses https://gradle.org[Gradle] for its build system.

To build a distribution for your local OS and print its output location upon
completion, run:
----
./gradlew localDistro
----

To build a distribution for another platform, run the related command:
----
./gradlew :distribution:archives:linux-tar:assemble
./gradlew :distribution:archives:darwin-tar:assemble
./gradlew :distribution:archives:windows-zip:assemble
----

Distributions are output to `distribution/archives`.

To run the test suite, see xref:TESTING.asciidoc[TESTING].

[[docs]]
== Documentation

For the complete Elasticsearch documentation, visit
https://www.elastic.co/guide/en/elasticsearch/reference/current/index.html[elastic.co].

For information about our documentation processes, see the
xref:docs/README.asciidoc[docs README].

[[examples]]
== Examples and guides

The https://github.com/elastic/elasticsearch-labs[`elasticsearch-labs`] repo contains executable Python notebooks, sample apps, and resources to test out Elasticsearch for vector search, hybrid search and generative AI use cases.


[[contribute]]
== Contribute

For contribution guidelines, see xref:CONTRIBUTING.md[CONTRIBUTING].

[[questions]]
== Questions? Problems? Suggestions?

* To report a bug or request a feature, create a
https://github.com/elastic/elasticsearch/issues/new/choose[GitHub Issue]. Please
ensure someone else hasn't created an issue for the same topic.

* Need help using Elasticsearch? Reach out on the
https://discuss.elastic.co[Elastic Forum] or https://ela.st/slack[Slack]. A
fellow community member or Elastic engineer will be happy to help you out.