As per the new licensing change for Elasticsearch and Kibana, this commit
moves existing Apache 2.0 licensed source code to the new dual license
(SSPL + Elastic License 2.0). In addition, existing x-pack code now uses
the new version 2.0 of the Elastic license. Full changes include:
- Update LICENSE and NOTICE files throughout the code base, as well
as those packaged in our published artifacts
- Update IDE integration to use the new license header on newly
created source files
- Remove references to the "OSS" distribution from our documentation
- Update build-time verification checks to no longer allow the Apache 2.0
license header in Elasticsearch source code
- Replace all existing Apache 2.0 license headers for non-xpack code
with the updated header (vendored code with Apache 2.0 headers obviously
remains the same).
- Replace all Elastic license 1.0 headers with new 2.0 header in xpack.
Upgrade JMH to the latest version (1.26) to pick up its async profiler integration
and update the documentation to include instructions for running the
async profiler and making pretty flame graphs.
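As a quick illustration of the intended workflow (the benchmark below is a made-up example, and the exact profiler options depend on the local async-profiler install):

```java
import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.BenchmarkMode;
import org.openjdk.jmh.annotations.Mode;

// A made-up benchmark, only here to show the shape of a JMH method.
public class ExampleBenchmark {

    @Benchmark
    @BenchmarkMode(Mode.Throughput)
    public long sumOfSquares() {
        long total = 0;
        for (long i = 0; i < 1_000; i++) {
            total += i * i;
        }
        return total;
    }
}
// Once the benchmarks jar is built, something along the lines of
//   java -jar benchmarks.jar ExampleBenchmark -prof async:output=flamegraph
// should produce a flame graph, assuming async-profiler's native library is
// discoverable (otherwise a libPath option has to be passed to the profiler).
```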
This commit cleans up some of the REST resources handling to honor best practices.
These changes include:
* remove SourceSet from copy api/test tasks and move the logic to the plugin.
* this allows the tasks to work in simpler terms of Files and FileCollections.
* prefer constructor injection over method injection for Gradle components
(see the sketch after this list)
* rename tasks to better match usage:
* copyRestApiCompatSpecsTask -> copyRestCompatApiTask
* copyRestApiCompatTestTask -> copyRestCompatTestTask
* remove TODOs that are no longer valid
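As a rough illustration of the constructor-injection style preferred here, a minimal sketch of a copy task (the class name, fields and wiring below are illustrative, not the actual tasks touched by this commit):

```java
import java.io.File;
import javax.inject.Inject;
import org.gradle.api.DefaultTask;
import org.gradle.api.file.FileCollection;
import org.gradle.api.file.FileSystemOperations;
import org.gradle.api.tasks.InputFiles;
import org.gradle.api.tasks.OutputDirectory;
import org.gradle.api.tasks.TaskAction;

// Illustrative task: services arrive through the constructor rather than
// @Inject-annotated getter methods, and the task deals only in Files and
// FileCollections rather than SourceSets.
public class ExampleCopyRestSpecsTask extends DefaultTask {

    private final FileSystemOperations fileSystemOperations;
    private FileCollection specs;
    private File outputDir;

    @Inject
    public ExampleCopyRestSpecsTask(FileSystemOperations fileSystemOperations) {
        this.fileSystemOperations = fileSystemOperations;
    }

    @InputFiles
    public FileCollection getSpecs() {
        return specs;
    }

    public void setSpecs(FileCollection specs) {
        this.specs = specs;
    }

    @OutputDirectory
    public File getOutputDir() {
        return outputDir;
    }

    public void setOutputDir(File outputDir) {
        this.outputDir = outputDir;
    }

    @TaskAction
    public void copy() {
        fileSystemOperations.copy(spec -> {
            spec.from(specs);
            spec.into(outputDir);
        });
    }
}
```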
Introduce RestCompatTestTransformTask to execute the transformations
needed for REST compatibility testing.
This new task is part of the execution graph for yamlRestCompatTest,
such that when that task is run the bwc:minor rest
tests are first copied to an intermediate location (e.g. build/resources/yamlRestCompatTest/v7restTests/rest-api-spec/test).
Then this new task will read from that location, transform the tests as
necessary and output them to the location that the test runner expects
(e.g. build/resources/yamlRestCompatTest/rest-api-spec/test)
Currently only the compatibility headers are injected via the
transformation(s), but additional future transformations are possible; this
commit contains only the Gradle hooks needed to ensure that all transforms are
executed and that their output lands in the correct location on disk for the test runner.
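A rough sketch of the shape of such a task (the class name is illustrative and the transformation logic is elided; only the input/output wiring is shown):

```java
import java.io.File;
import org.gradle.api.DefaultTask;
import org.gradle.api.file.DirectoryProperty;
import org.gradle.api.tasks.InputDirectory;
import org.gradle.api.tasks.OutputDirectory;
import org.gradle.api.tasks.TaskAction;

// Illustrative sketch: read the previously copied v7 REST tests from the
// intermediate directory, apply the configured transformations and write the
// results where the yaml test runner expects them.
public abstract class ExampleRestCompatTransformTask extends DefaultTask {

    // e.g. build/resources/yamlRestCompatTest/v7restTests/rest-api-spec/test
    @InputDirectory
    public abstract DirectoryProperty getSourceDirectory();

    // e.g. build/resources/yamlRestCompatTest/rest-api-spec/test
    @OutputDirectory
    public abstract DirectoryProperty getOutputDirectory();

    @TaskAction
    public void transform() {
        File source = getSourceDirectory().get().getAsFile();
        File output = getOutputDirectory().get().getAsFile();
        // For each YAML test under `source`: parse it, apply the registered
        // transformations (currently header injection only) and write the
        // transformed test to the matching path under `output`.
        getLogger().lifecycle("transforming REST tests from {} into {}", source, output);
    }
}
```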
* Fix JDK download plugin to handle extracting Azul JDKs
The unpacking transformation of Azul-packaged JDKs has been broken due to
a different packaging structure that was not handled correctly by the JdkDownloadPlugin.
This fixes that, and also fixes the packaging of aarch64 JDKs into the Elasticsearch
distributions that bundle a JDK.
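As a simplified illustration of the underlying problem, the vendor-specific top-level directory inside the archive has to be stripped regardless of how it is named; the real plugin does this inside its unpacking transform, but a copy-based sketch (with hypothetical names) shows the idea:

```java
import java.io.File;
import java.util.Arrays;
import org.gradle.api.Project;
import org.gradle.api.file.RelativePath;

// Illustrative helper: when extracting a downloaded JDK archive, drop the
// archive's top-level directory regardless of the vendor-specific way it is
// named, so that all JDK layouts end up in the same destination structure.
public final class ExampleJdkExtraction {

    private ExampleJdkExtraction() {}

    public static void extract(Project project, File archive, File destination) {
        project.copy(spec -> {
            spec.from(project.tarTree(archive));
            spec.into(destination);
            spec.eachFile(details -> {
                String[] segments = details.getRelativePath().getSegments();
                if (segments.length > 1) {
                    // drop the first segment, i.e. the vendor-specific root folder
                    details.setRelativePath(
                        new RelativePath(true, Arrays.copyOfRange(segments, 1, segments.length))
                    );
                }
            });
            spec.setIncludeEmptyDirs(false);
        });
    }
}
```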
We have an in-house rule to compare explicitly against `false` instead
of using the logical not operator (`!`). However, this hasn't
historically been enforced, meaning that there are many violations in
the source at present.
We now have a Checkstyle rule that can detect these cases, but before we
can turn it on, we need to fix the existing violations. This is being
done over a series of PRs, since there are a lot to fix.
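For reference, the convention the new rule enforces (an illustrative snippet, not code from this change):

```java
public class BooleanComparisonExample {

    // Preferred in this code base: comparing against `false` makes the
    // negation hard to miss when reading.
    static void preferred(boolean valid) {
        if (valid == false) {
            throw new IllegalArgumentException("invalid");
        }
    }

    // Discouraged: the logical not operator is easy to overlook.
    static void discouraged(boolean valid) {
        if (!valid) {
            throw new IllegalArgumentException("invalid");
        }
    }
}
```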
This commit moves the yamlRestCompatTest plugin to an internal package,
since this plugin is used for the internal build only (i.e. not by plugin developers).
Since there was considerable direct use of package-private class members,
getters/setters have been added and the class members moved to private scope.
This commit also adds a lifecycle task to help abstract the concept of
REST compatibility testing from the current (and only) task that implements it.
This allows the lifecycle task to be added to CI (if needed) and other
tasks to be added behind it (if needed) without needing to change the CI setup.
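A rough sketch of that kind of lifecycle wiring (the plugin and task names here are illustrative, not necessarily the ones used in the build):

```java
import org.gradle.api.Plugin;
import org.gradle.api.Project;

// Illustrative sketch: a lifecycle task fronts the concrete compatibility
// test task, so CI can invoke a stable name while the concrete tasks behind
// it can change without touching the CI setup.
public class ExampleRestCompatLifecyclePlugin implements Plugin<Project> {

    @Override
    public void apply(Project project) {
        project.getTasks().register("restCompatCheck", task -> {
            task.setDescription("Runs all REST compatibility checks");
            task.setGroup("verification");
            task.dependsOn("yamlRestCompatTest");
        });
    }
}
```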
Now that Checkstyle can be made useful in an IDE, add extra rules only when
checking in the IDE so that a contributor is given extra help when editing
files, without having the Checkstyle task spam the console when running Gradle.
Apart from the `BooleanNegation` rule below, the new rules have the `warning`
severity level. The new Javadoc rules reflect the guidelines in
`CONTRIBUTING.md` that we've had for some time.
* I upgraded Checkstyle, so now it interprets the config file in the same way
as the IntelliJ plugin. That means that I could move the `LineLength` rule
up a level and remove its special handling in the `:configureIdeCheckstyle`
task
* I added the `SuppressWarningsFilter` and `SuppressWarningsHolder` rules so
that the `@SuppressWarnings` annotation can be used to silence Checkstyle
checks (see the example after the list of rules below). We shouldn't typically
need this, but it seemed like a useful thing to configure. In contrast to the
suppressions file, this approach makes the suppression obvious.
* I changed the `:configureIdeCheckstyle` task to pull in rules from
`checkstyle_ide_fragment.xml` and merged them into the generated config.
The rules are as follows:
* `BooleanNegation` - a check defined using `DescendantToken` that detects
cases of `!` being used to negate a boolean expression. This ought to be in
the main config, but at present we have a number of violations in the
source
* `MissingJavadocMethod` - requires that public methods are documented. I
set `minLineCount` to 2 so that the rule doesn't trigger on simple
methods.
* `MissingJavadocPackage` - require Javadoc in `package-info.java`
* `MissingJavadocType` - require types to be documented
* `JavadocMethod` - require params and return type to be documented
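To show how the `SuppressWarningsFilter`/`SuppressWarningsHolder` pair is intended to be used, a small illustrative example (the class and check name are arbitrary):

```java
public class SuppressionExample {

    // Silences only the named Checkstyle check, and only for this method.
    // Unlike an entry in the suppressions file, the suppression sits right
    // next to the code it affects.
    @SuppressWarnings("checkstyle:MissingJavadocMethod")
    public void intentionallyUndocumented() {
        // ...
    }
}
```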
This commit introduces 3 new Gradle plugin functional tests.
These tests make use of dummy subprojects to help satisfy the
interdependencies of the projects in use. Helper methods
have been added to a new shared common parent.
This commit introduces the ability to programmatically transform YAML
based REST tests. Specifically this initial commit introduces the
ability to inject HTTP headers into the YAML REST tests. Additional
capabilities (transforms) will be introduced in future commits.
Transforming REST tests is a key component of testing REST API compatibility.
Eventually (not included here) the ability to transform tests will be
integrated with Gradle such that the tests are transformed prior
to being executed. This commit does not have any REST API compatibility
specific changes, and only introduces the code that will be used for that
testing.
Transforming REST tests at the YAML level was chosen over updating the
test runners to ensure that these tests may (at a future time) be used
by the various clients that re-use the YAML based REST tests. By
transforming the tests themselves, there is a consistent view across all
runners for exactly how and what to test when running N-1 tests against
an N cluster with REST compatibility.
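As a rough sketch of the header-injection idea using Jackson's YAML support (the class and method names below are illustrative, not the real transform classes introduced here):

```java
import java.io.IOException;
import java.util.Map;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;
import com.fasterxml.jackson.dataformat.yaml.YAMLFactory;

// Illustrative sketch: parse a YAML REST test and add compatibility headers
// to a `do` section's request, creating the `headers` object if needed.
public final class ExampleHeaderInjection {

    private static final ObjectMapper YAML_MAPPER = new ObjectMapper(new YAMLFactory());

    private ExampleHeaderInjection() {}

    public static ObjectNode parse(String yaml) throws IOException {
        return (ObjectNode) YAML_MAPPER.readTree(yaml);
    }

    public static void injectHeaders(ObjectNode doSection, Map<String, String> headers) {
        ObjectNode headersNode = doSection.has("headers")
            ? (ObjectNode) doSection.get("headers")
            : doSection.putObject("headers");
        headers.forEach(headersNode::put);
    }
}
```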
This change upgrades to the latest Lucene 8.8.0 snapshot.
It also restores the compression on binary doc values that was lost in the last snapshot upgrade.
The compression is now configurable on binary doc values, but we don't expose this functionality yet, so this commit ensures that we pick the same compression mode as previous releases (BEST_COMPRESSION).
The find file structure functionality is now its own plugin, separated from the ML plugin.
This commit updates the REST high-level client to reflect this.
Additionally, this adjusts the internal and client object names from `FileStructure` to the more general `TextStructure`.
- Updates our build to use the latest Gradle 6.8 release, which is the last release
before the major 7.0 release.
- Resolve available Gradle versions using built-in toolchain support
- Fixes deprecated usage of JavaInstallationRegistry
- We replace JDK handling in our build to rely on the JVM detection provided by the Gradle build tool itself.
As we rely on environment variables pointing to JDKs, we wire these into the Gradle JDK detection mechanism
This PR is a first attempt to get the build to run on an Apple M1 (ARM 64 / aarch64) machine.
I think the changes are mostly reasonable, apart from some hard-coding to use the Azul JVM,
which at the time of writing seems to be the only available JVM. I'll follow up when our preferred
JVM is available.
A new version of this test dependency is finally available, enabling us
to remove a hack from production code we've long carried because of a
bug in that test dependency. This commit upgrades our tests to use
jimfs-1.2.
This PR conditionally allows SunRsaSign to be used as a security provider when
the runtime Java is an Oracle implementation. It is necessary because:
* Oracle JVM mandates Security Provider verification
* The verification class (javax.crypto.JarVerifier) uses hardcoded certificates
with md5WithRsa signature.
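Because of the above, the provider only needs to be permitted on Oracle runtimes. A minimal sketch of that kind of vendor check (how the real change performs the check is not shown here; the system property and provider names are standard, the rest is assumed):

```java
import java.security.Provider;
import java.security.Security;
import java.util.Locale;

// Illustrative sketch: only treat SunRsaSign as an acceptable security
// provider when the runtime JVM is an Oracle implementation, which needs it
// for its signed-JAR verification of security providers.
public final class ExampleSunRsaSignCheck {

    private ExampleSunRsaSignCheck() {}

    public static boolean allowSunRsaSign() {
        String vendor = System.getProperty("java.vendor", "").toLowerCase(Locale.ROOT);
        return vendor.contains("oracle");
    }

    public static boolean sunRsaSignAvailable() {
        Provider provider = Security.getProvider("SunRsaSign");
        return provider != null;
    }
}
```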
We were depending on BouncyCastle FIPS's own mechanics to set
itself in approved only mode, since we run with the Security
Manager enabled. The check during startup seems to happen before we
set our restrictive SecurityManager though, in
org.elasticsearch.bootstrap.Elasticsearch, and this means that
BCFIPS would not be in approved only mode unless explicitly
configured so.
This commit sets the appropriate JVM property to explicitly set
BCFIPS in approved only mode in CI and adds tests to ensure that we
will be running with BCFIPS in approved only mode when we expect to.
It also sets xpack.security.fips_mode.enabled to true for all test clusters
used in fips mode and sets the distribution to the default one. It adds a
password to the elasticsearch keystore for all test clusters that run in fips
mode.
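For reference, the JVM property BCFIPS provides for this is `org.bouncycastle.fips.approved_only`, and a minimal sketch of the kind of assertion such a test can make (the class name is illustrative):

```java
import org.bouncycastle.crypto.CryptoServicesRegistrar;

// Illustrative check: when the JVM is started with
// -Dorg.bouncycastle.fips.approved_only=true, BCFIPS should report that it is
// running in approved only mode.
public class ExampleApprovedOnlyCheck {

    public static void main(String[] args) {
        if (CryptoServicesRegistrar.isInApprovedOnlyMode() == false) {
            throw new IllegalStateException("expected BCFIPS to be in approved only mode");
        }
        System.out.println("BCFIPS is running in approved only mode");
    }
}
```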
Moreover, it changes a few unit tests where we would use bcrypt even in
FIPS 140 mode. These would still pass since we are bundling our own
bcrypt implementation, but are now changed to use FIPS 140 approved
algorithms instead for better coverage.
It also addresses a number of tests that would fail in approved only mode.
Mainly:
Tests that use PBKDF2 with a password less than 112 bits (14 characters). We
elected to change the passwords used everywhere to be at least 14
characters long instead of mandating
the use of pbkdf2_stretch because both pbkdf2 and
pbkdf2_stretch are supported and allowed in fips mode and it makes sense
to test with both. We could possibly figure out the password algorithm used
for each test and adjust password length accordingly only for pbkdf2 but
there is little value in that. It's good practice to use strong passwords so if
our docs and tests use longer passwords, then it's for the best. The approach
is brittle as there is no guarantee that the next test that will be added won't
use a short password, so we add some testing documentation too.
This leaves us with a possible coverage gap, since we do support passwords
as short as 6 characters but we only test with more than 14 characters, and the
validation itself was not tested even before. Tests can be added in a follow-up,
outside of a FIPS-related context.
Tests that use a PKCS12 keystore and were not already muted.
Tests that depend on running test clusters with a basic license or
using the OSS distribution, as FIPS 140 support is not available in
either of these.
Finally, it adds some information around FIPS 140 testing in our testing
documentation reference so that developers can hopefully keep in
mind fips 140 related intricacies when writing/changing docs.
Certain BWC tests rely on elasticsearch distributions built from source.
So far this was done by building an ES archive in a nested build ( e.g. see
https://gradle-enterprise.elastic.co/s/e7jb5w2dyi7oy/timeline?details=potxx3gikoxci )
and then the consuming build extracted that archive immediately before running tests
against that distribution.
This change closes the loop on consuming the extract-only configurations that were introduced in
an earlier PR (#63599), which saves us from building
the elasticsearch tars and zips in the nested builds and reduces the overhead drastically.
New nested build without archive building:
https://gradle-enterprise.elastic.co/s/xa56zkpa6awhw/timeline?details=qjvevbdhsjooy
Most of the changed files are integTest-related, to properly reflect how we handle
older versions that do not support directly accessing the extracted assembly.
Fixes #62115
Running multiple HDFS fixtures in parallel for integration tests requires
dynamic port assignment in order to avoid port clashes. This introduces
the ability to assign port ranges to Gradle projects, which can be used
to dynamically allocate the ports used by these projects.
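A rough sketch of the idea of reserving a per-project block of ports (the names, base port and block size below are illustrative assumptions, not the values used in the build):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

// Illustrative sketch: hand each project a fixed-size block of ports based on
// the order in which projects request one, so fixtures started in parallel by
// different projects never clash on ports.
public final class ExamplePortRangeAllocator {

    private static final int BASE_PORT = 10_000; // assumed starting point
    private static final int BLOCK_SIZE = 10;    // assumed ports per project

    private static final AtomicInteger NEXT_BLOCK = new AtomicInteger();
    private static final Map<String, Integer> ASSIGNED = new ConcurrentHashMap<>();

    private ExamplePortRangeAllocator() {}

    /** Returns the first port of the block reserved for the given project path. */
    public static int basePortFor(String projectPath) {
        return ASSIGNED.computeIfAbsent(
            projectPath,
            path -> BASE_PORT + NEXT_BLOCK.getAndIncrement() * BLOCK_SIZE
        );
    }
}
```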
We apply this dynamic port setup only for the HDFS fixtures used in
:x-pack:plugin:searchable-snapshots:qa at the moment, as the
test sources (REST tests) in :plugins:repository-hdfs still rely on
hard-coded ports.
This is a simplified version of fixtures I created before on the Gradle codebase
to deal with similar issues.
Fixes #66377
Closes #62758.
Include the Stack log4j config in the Docker image, in order to
make it possible to write logs in a container environment in the
same way as for an archive or package deployment. This is useful
in situations where the user is bind-mounting the logs directory
and has their own arrangements for log shipping.
To use stack logging, set the environment variable `ES_LOG_STYLE`
to `file`. It can also be set to `console`, which is the same as
not specifying it at all.
The Docker logging config is now auto-generated at image build time,
by running the default config through a transformer program when
preparing the distribution in an image builder step.
Also, in the docker distribution `build.gradle`, I changed a helper
closure into a class with a static method in order to fix an
issue where the Docker image was always being rebuilt, even when
there were no changes.
This tweaks the AntFixture handling to make it compliant with the task avoidance API.
Tasks of type StandaloneRestTestTask are now generally finalised using the typed Ant stop task,
which allows us to remove error-prone dependsOn overrides in StandaloneRestTestTask. As a result
we also ported more task definitions in the build to the task avoidance API.
The next work item regarding AntFixture handling is porting AntFixture to a plain Gradle task and removing
Groovy AntBuilder, which will allow us to port more build logic from Groovy to Java, but that is out of
the scope of this PR.
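A rough sketch of the pattern of lazy registration plus finalisation (the task names here are illustrative):

```java
import org.gradle.api.Plugin;
import org.gradle.api.Project;
import org.gradle.api.Task;
import org.gradle.api.tasks.TaskProvider;

// Illustrative sketch: tasks are registered lazily (task avoidance API) and
// the fixture's stop task runs via finalizedBy, instead of explicit dependsOn
// bookkeeping inside the test task itself.
public class ExampleFixturePlugin implements Plugin<Project> {

    @Override
    public void apply(Project project) {
        TaskProvider<Task> startFixture = project.getTasks().register("startExampleFixture");
        TaskProvider<Task> stopFixture = project.getTasks().register("stopExampleFixture");
        project.getTasks().register("exampleRestTest", task -> {
            task.dependsOn(startFixture);
            task.finalizedBy(stopFixture);
        });
    }
}
```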