CrossVersionPerformanceTestRunnerTest.groovy

Ignored a couple of tests.

    • -0
    • +3
    ./CrossVersionPerformanceTestRunnerTest.groovy
Make it possible to generate flame graphs during performance tests

This commit improves the integration of Honest Profiler by making it possible to post-process the log files and generate a flame graph.

For this to be possible, the `org.gradle.performance.honestprofiler` system property must be set to a directory where the flame graphs will be exported.

The collector expects to find two environment variables:

- `HP_HOME_DIR` must point to a valid Honest Profiler installation (https://github.com/RichardWarburton/honest-profiler)

- `FG_HOME_DIR` must point to a valid FlameGraph installation (https://github.com/brendangregg/FlameGraph)

If they are not set, the collector will look for the tools in `~/tools/honest-profiler` and `~/tools/FlameGraph`, respectively.

The graphs are *not* integrated into the performance reports yet.

    • -0
    • +2
    ./CrossVersionPerformanceTestRunnerTest.groovy
  1. … 3 more files in changeset.
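A hedged sketch of how a run with flame-graph generation might be launched, assuming the property is forwarded like the project properties in the `--baselines` example further down; the task name, output directory, and install paths are placeholders:

```
# Optional: point the collector at explicit installations (placeholder paths);
# if unset, it falls back to ~/tools/honest-profiler and ~/tools/FlameGraph.
export HP_HOME_DIR=/opt/honest-profiler
export FG_HOME_DIR=/opt/FlameGraph

# Export the generated flame graphs to build/flamegraphs (placeholder directory);
# whether the property is passed with -P or -D depends on the build setup.
./gradlew performanceTest -Porg.gradle.performance.honestprofiler=build/flamegraphs
```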
Polish CrossVersionPerformanceTestRunner

    • -8
    • +12
    ./CrossVersionPerformanceTestRunnerTest.groovy
  1. … 1 more file in changeset.
Add support for --baselines none

    • -0
    • +32
    ./CrossVersionPerformanceTestRunnerTest.groovy
  1. … 1 more file in changeset.
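A minimal illustration of the new value (interpretation assumed: no baseline versions are compared against, only the version under test); the task name mirrors the `--baselines` example further down:

```
# Run the scenario without any baseline comparisons (assumed meaning of 'none').
./gradlew performanceTest --baselines none
```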
Fix tests

    • -14
    • +14
    ./CrossVersionPerformanceTestRunnerTest.groovy
Allow leaving targetVersions unspecified

- it's faster to do profiling when running only the current version

    • -0
    • +18
    ./CrossVersionPerformanceTestRunnerTest.groovy
  1. … 1 more file in changeset.
Split test so that each perf run has a different directory

    • -13
    • +49
    ./CrossVersionPerformanceTestRunnerTest.groovy
Reintroduce gradle/gradle@784d747

- There was previously a concern that this would break the BuildScansPerformanceTest, but that test doesn't actually use the BaselineVersion class at all.

    • -4
    • +0
    ./CrossVersionPerformanceTestRunnerTest.groovy
  1. … 31 more files in changeset.
Revert "Make strict performance testing default."

This reverts commit 784d7476556e63c8d7f15919ae9875c3b7181aa3.

    • -0
    • +4
    ./CrossVersionPerformanceTestRunnerTest.groovy
  1. … 31 more files in changeset.
Make strict performance testing default.

- This makes it impossible to set maximum regression limits manually and defaults to using a statistically derived allowable amount of variance.

    • -4
    • +0
    ./CrossVersionPerformanceTestRunnerTest.groovy
  1. … 31 more files in changeset.
Allow usage of snapshot versions in performance test baselines

    • -0
    • +14
    ./CrossVersionPerformanceTestRunnerTest.groovy
  1. … 2 more files in changeset.
Allow specifying which regressions to look for from the command line

This commit introduces a new flag (`--checks`), also available as a system property (`org.gradle.performance.execution.checks`), that controls which regressions a performance test looks for. By default, we look for regressions in both memory and speed. With this flag it is possible to control precisely what to check:

- `all` is the default and looks for both memory and speed regressions

- `memory` only checks for memory regressions

- `speed` only checks for speed regressions

- `none` does not check for regressions at all

The `memory` and `speed` values are useful when bisecting a regression: they let us ignore potential statistical artifacts in the category we are not interested in.

The `none` value is useful for historical builds, where we want to collect all measurements but not fail the build if a regression is encountered.

    • -0
    • +103
    ./CrossVersionPerformanceTestRunnerTest.groovy
  1. … 2 more files in changeset.
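A hedged usage sketch for the `--checks` flag and system property described above; the `performanceTest` task name is illustrative, and the `-D` form is an assumption about how the property reaches the test:

```
# Only check for speed regressions, e.g. while bisecting a speed issue.
./gradlew performanceTest --checks speed

# Collect all measurements but never fail the build on a regression
# (system-property form; -D shown as an assumption).
./gradlew performanceTest -Dorg.gradle.performance.execution.checks=none
```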
Fix CodeNarc

    • -17
    • +17
    ./CrossVersionPerformanceTestRunnerTest.groovy
Support `defaults` in `--baselines` parameter

If `defaults` is found in the list of versions passed to `--baselines`, it is automatically expanded to the original list of baseline versions.

This is done so that we can always pass the `--baselines` parameter from TeamCity, whether we want to override the original list of versions or not. If we don't, we can set the value to `defaults` and still run the build.

    • -10
    • +39
    ./CrossVersionPerformanceTestRunnerTest.groovy
  1. … 1 more file in changeset.
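A short illustration of the `defaults` expansion described above, in the style of the `--baselines` example below; the task name and the extra `3.1` version are placeholders:

```
# Keep the test's original baseline list and compare against one extra version on top.
./gradlew performanceTest --baselines 'defaults,3.1'

# Passed unconditionally from TeamCity: expands back to the original baseline list.
./gradlew performanceTest --baselines defaults
```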
Allow specifying the list of baselines from the command line

This commit introduces the ability to set the list of baseline versions for performance tests through a command-line flag (`--baselines`) or a system property (`org.gradle.performance.baselines`). The list must be specified as a comma-separated or semicolon-separated list of versions.

Two version values are handled specially:

* `last` corresponds to the last release of Gradle

* `nightly` corresponds to the latest build of Gradle (`master` branch)

This commit also removes the "ad-hoc" mode for executing tests. The idea is to replace it with this flag, which can be set to `nightly`. However, the "ad-hoc" mode skipped writing results to the database, in order to avoid polluting the performance results DB with test runs. If you still want that behavior, you need to set `org.gradle.performance.db.url` to a local, temporary database:

```
./gradlew cleanSmallOldJava smallOldJava cleanPerformanceTest performanceTest --scenarios 'clean Java build smallOldJava (daemon)' -x prepareSamples --baselines nightly -Porg.gradle.performance.db.url=jdbc:h2:./build/database
```

    • -1
    • +6
    ./CrossVersionPerformanceTestRunnerTest.groovy
  1. … 10 more files in changeset.
Don't add MaxPermSize param on Java 8+ for running perf tests

    • -1
    • +1
    ./CrossVersionPerformanceTestRunnerTest.groovy
  1. … 19 more files in changeset.
Execute performance test scenarios on a fresh working copy

Reusing whatever state the last test left behind can make performance seem better (because of preexisting caches) or worse (because of lots of output). This makes the results dependent on the order in which the tests are executed. It also prevented us from using incremental build for the project templates.

We now create a fresh copy of the template project for each test run, fixing both of these problems at once.

    • -0
    • +5
    ./CrossVersionPerformanceTestRunnerTest.groovy
  1. … 14 more files in changeset.
Fix unit test

    • -1
    • +4
    ./CrossVersionPerformanceTestRunnerTest.groovy
Move all result-related classes to org.gradle.performance.results

    • -0
    • +2
    ./CrossVersionPerformanceTestRunnerTest.groovy
  1. … 55 more files in changeset.
Extract performance test fixtures to separate project

    • -0
    • +179
    ./CrossVersionPerformanceTestRunnerTest.groovy
  1. … 248 more files in changeset.