AbstractGradleVsMavenPerformanceTest.groovy

Simplify performance measurements

The many measurements that we injected into the build under test were skewing our results to the point of making them unreliable or unrealistic. We now only measure end-to-end build time. Here's a breakdown with the rationale for removing each of the other measurements:

- Configuration time: can be covered by a dedicated `gradle help` scenario instead (see the sketch after this list)

- Execution time: the user does not care whether a long build is stuck in execution or in configuration

- Setup/teardown: was ill-defined anyway, basically total minus configuration minus execution

- JIT compile time: not something we can influence, so measuring it is pointless

- Memory usage: was only measured at one point in the build, which tells us nothing about problems at any other point in the build

- GC CPU time: if this increases, we'd see it in the total execution time
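The `gradle help` point can be pictured as a scenario that runs only the `help` task, so the measured wall-clock time is dominated by configuration. A minimal sketch in Spock/Groovy, with the runner property and assertion names assumed rather than taken from the actual test infrastructure:

```
// Illustrative only: runner API names are assumed. Running just `help`
// means the measured end-to-end time is dominated by configuration.
class ConfigurationTimePerformanceTest extends AbstractCrossVersionPerformanceTest {
    def "configuration time of a large build"() {
        given:
        runner.testProject = "largeJavaMultiProject"
        runner.tasksToRun = ['help']

        when:
        def result = runner.run()

        then:
        result.assertCurrentVersionHasNotRegressed()
    }
}
```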

Generally, looking at the graphs has never pointed us directly at the problem; we always need to profile anyway. So instead of skewing our measurements with lots of profiling code, we should use a dedicated profiling job when we actually see a regression.

Memory usage can be tested indirectly by giving each scenario a reasonable amount of memory. If memory usage rises above that reasonable limit, we'd see execution time rise, telling us about the regression. Generally, we do not optimize for the smallest memory usage, but for the fastest execution with reasonable memory overhead.
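One way to apply the "reasonable amount of memory" idea is to pin each scenario to a fixed heap, so that a memory regression surfaces as GC pressure and therefore as a slower build. A minimal sketch, assuming the runner lets you pass JVM arguments (the property name is illustrative):

```
// Illustrative: give the build a fixed, modest heap. If a change starts
// needing more memory than this, GC pressure slows the build down and the
// end-to-end measurement flags it.
runner.gradleOpts = ['-Xms256m', '-Xmx256m']
```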

This change also removes all the JVM tweaking and wait periods which we had introduced in an attempt to make tests more predictable and stable. These tweaks have not really helped us achieve more stable tests and have often done the opposite. They also add a lot of complexity and make our tests less realistic: a real user will not add all these JVM options to Gradle.

    • -2
    • +2
    ./AbstractGradleVsMavenPerformanceTest.groovy
  1. … 59 more files in changeset.
Wire integration test build context instance

- enables using a performance-test-specific build context when an instance is properly wired

    • -1
    • +4
    ./AbstractGradleVsMavenPerformanceTest.groovy
  1. … 48 more files in changeset.
Allow specifying the list of baselines from command line

This commit introduces the ability to set the list of baseline versions for performance tests through a command-line flag (`--baselines`) or a system property (`org.gradle.performance.baselines`). The list must be specified as a comma-separated or semicolon-separated list of versions.

Two versions are handled specially:

* `last` corresponds to the last release of Gradle

* `nightly` corresponds to the latest build of Gradle (`master` branch)

This commit also removes the "ad-hoc" mode for executing tests. The idea is to replace it with this flag, which can be set to `nightly`. However, the "ad-hoc" mode skipped writing results to the database, in order to avoid polluting the performance results DB with test runs. If you want the same behaviour, you need to point `org.gradle.performance.db.url` to a local, temporary database:

```
./gradlew cleanSmallOldJava smallOldJava cleanPerformanceTest performanceTest --scenarios 'clean Java build smallOldJava (daemon)' -x prepareSamples --baselines nightly -Porg.gradle.performance.db.url=jdbc:h2:./build/database
```
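The accepted separators can be illustrated with a small parsing sketch; this is not the actual implementation, just a model of the documented behaviour (comma- or semicolon-separated values, with `last` and `nightly` passed through as symbolic versions):

```
// Sketch of the documented syntax only, not the real parser.
static List<String> parseBaselines(String value) {
    value.split('[,;]')*.trim().findAll { it }  // split on ',' or ';', drop blanks
}

assert parseBaselines('3.5, last; nightly') == ['3.5', 'last', 'nightly']
```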

    • -2
    • +1
    ./AbstractGradleVsMavenPerformanceTest.groovy
  1. … 10 more files in changeset.
Exclude Gradle vs. Maven tests from regression suite

Any regressions will already be detected by our cross-version tests. These comparisons only have informational value now and only slow down our regression test suite. Moving them to the experiment category.
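In JUnit terms, moving a test into an experiment category typically looks like the following; the category class and test class names here are illustrative rather than copied from the build:

```
import org.junit.experimental.categories.Category

// Hedged sketch: class names are illustrative. The annotation is what moves
// the test out of the default regression suite and into the experiment runs.
@Category(PerformanceExperiment)
class SomeGradleVsMavenPerformanceTest extends AbstractGradleVsMavenPerformanceTest {
    // scenarios unchanged; only the categorisation changes
}
```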

    • -3
    • +7
    ./AbstractGradleVsMavenPerformanceTest.groovy
Fix GC logging location in Maven performance tests

    • -1
    • +1
    ./AbstractGradleVsMavenPerformanceTest.groovy
  1. … 4 more files in changeset.
Execute performance test scenarios on a fresh working copy

Reusing whatever state the last test left behind can make performance seem better (because of preexisting caches) or worse (because of lots of output). This makes the results dependent on the order in which the tests are executed. It also prevented us from using incremental build for the project templates. We now create a fresh copy of the template project for each test run, fixing both of these problems at once.
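A fresh working copy per run can be produced by copying the template into a brand-new directory before each invocation, along these lines (a sketch only, not the actual fixture code):

```
// Illustrative sketch: copy the project template into a new, uniquely named
// directory so no caches or build output from a previous run can leak in.
File freshCopyOf(File templateDir, File workDir) {
    File copy = new File(workDir, "${templateDir.name}-${UUID.randomUUID()}")
    new AntBuilder().copy(todir: copy) {
        fileset(dir: templateDir)
    }
    return copy
}
```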

    • -0
    • +1
    ./AbstractGradleVsMavenPerformanceTest.groovy
  1. … 14 more files in changeset.
Move all result-related classes to org.gradle.performance.results

    • -0
    • +2
    ./AbstractGradleVsMavenPerformanceTest.groovy
  1. … 55 more files in changeset.
Extract performance test fixtures to separate project

    • -0
    • +67
    ./AbstractGradleVsMavenPerformanceTest.groovy
  1. … 248 more files in changeset.