author    GitLab Bot <gitlab-bot@gitlab.com>  2022-05-31 12:09:12 +0000
committer GitLab Bot <gitlab-bot@gitlab.com>  2022-05-31 12:09:12 +0000
commit    8243505178033432b7fc6834eef425c9dcdfd7bc (patch)
tree      0497dc786de7ae7563141055bc67d43645489f69 /doc/ci/testing
parent    dcbe65b8b6d3be931c10a823d1271318f70b1507 (diff)
Add latest changes from gitlab-org/gitlab@master
Diffstat (limited to 'doc/ci/testing')
 doc/ci/testing/img/junit_test_report.png                               | Bin 0 -> 28718 bytes
 doc/ci/testing/img/pipelines_junit_test_report_v13_10.png              | Bin 0 -> 16710 bytes
 doc/ci/testing/img/pipelines_junit_test_report_with_errors_v13_10.png  | Bin 0 -> 18358 bytes
 doc/ci/testing/index.md                                                |  35
 doc/ci/testing/unit_test_report_examples.md                            | 266
 doc/ci/testing/unit_test_reports.md                                    | 160
 6 files changed, 461 insertions(+), 0 deletions(-)
diff --git a/doc/ci/testing/img/junit_test_report.png b/doc/ci/testing/img/junit_test_report.png
new file mode 100644
index 00000000000..a4b98c8b910
--- /dev/null
+++ b/doc/ci/testing/img/junit_test_report.png
Binary files differ

diff --git a/doc/ci/testing/img/pipelines_junit_test_report_v13_10.png b/doc/ci/testing/img/pipelines_junit_test_report_v13_10.png
new file mode 100644
index 00000000000..ef79a2547af
--- /dev/null
+++ b/doc/ci/testing/img/pipelines_junit_test_report_v13_10.png
Binary files differ

diff --git a/doc/ci/testing/img/pipelines_junit_test_report_with_errors_v13_10.png b/doc/ci/testing/img/pipelines_junit_test_report_with_errors_v13_10.png
new file mode 100644
index 00000000000..cfcf3bec76c
--- /dev/null
+++ b/doc/ci/testing/img/pipelines_junit_test_report_with_errors_v13_10.png
Binary files differ

diff --git a/doc/ci/testing/index.md b/doc/ci/testing/index.md
new file mode 100644
index 00000000000..52af329873f
--- /dev/null
+++ b/doc/ci/testing/index.md
@@ -0,0 +1,35 @@
---
stage: Verify
group: Pipeline Insights
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
---

# Test with GitLab CI/CD and generate reports in merge requests **(FREE)**

Use GitLab CI/CD to test the changes included in a feature branch. You can also
display reports or link to important information directly from [merge requests](../../user/project/merge_requests/index.md).

| Feature | Description |
|---------|-------------|
| [Accessibility Testing](../../user/project/merge_requests/accessibility_testing.md) | Automatically report A11y violations for changed pages in merge requests. |
| [Browser Performance Testing](../../user/project/merge_requests/browser_performance_testing.md) | Quickly determine the browser performance impact of pending code changes. |
| [Load Performance Testing](../../user/project/merge_requests/load_performance_testing.md) | Quickly determine the server performance impact of pending code changes. |
| [Code Quality](../../user/project/merge_requests/code_quality.md) | Analyze your source code quality using the [Code Climate](https://codeclimate.com/) analyzer and show the Code Climate report right in the merge request widget area. |
| [Display arbitrary job artifacts](../yaml/index.md#artifactsexpose_as) | Configure CI pipelines with the `artifacts:expose_as` parameter to link directly to selected [artifacts](../pipelines/job_artifacts.md) in merge requests (see the sketch after this table). |
| [Unit test reports](unit_test_reports.md) | Configure your CI jobs to use Unit test reports, and let GitLab display a report on the merge request so that it's easier and faster to identify the failure without having to check the entire job log. |
| [License Compliance](../../user/compliance/license_compliance/index.md) | Manage the licenses of your dependencies. |
| [Metrics Reports](../metrics_reports.md) | Display the Metrics Report on the merge request so that it's faster and easier to identify changes to important metrics. |
| [Test Coverage visualization](../../user/project/merge_requests/test_coverage_visualization.md) | See test coverage results for merge requests, in the file diff. |
| [Fail fast testing](../../user/project/merge_requests/fail_fast_testing.md#fail-fast-testing) | Run a subset of your RSpec test suite, so failed tests stop the pipeline before the full suite of tests runs, saving resources. |
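As an illustration of the `artifacts:expose_as` entry in the table above, a minimal sketch of a job that exposes an artifact in the merge request might look like the following. The job name, script, and `coverage/` path are placeholders, not part of the linked documentation:

```yaml
# Hypothetical job: expose a generated report directly in the merge request widget.
coverage:
  stage: test
  script:
    - ./generate-coverage-report.sh   # placeholder: writes files into coverage/
  artifacts:
    expose_as: 'Coverage report'      # link text shown in the merge request
    paths:
      - coverage/
```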
## Security Reports **(ULTIMATE)**

In addition to the reports listed above, GitLab can generate many types of [Security reports](../../user/application_security/index.md)
by scanning your project and reporting any vulnerabilities found:

| Feature | Description |
|---------|-------------|
| [Container Scanning](../../user/application_security/container_scanning/index.md) | Analyze your Docker images for known vulnerabilities. |
| [Dynamic Application Security Testing (DAST)](../../user/application_security/dast/index.md) | Analyze your running web applications for known vulnerabilities. |
| [Dependency Scanning](../../user/application_security/dependency_scanning/index.md) | Analyze your dependencies for known vulnerabilities. |
| [Static Application Security Testing (SAST)](../../user/application_security/sast/index.md) | Analyze your source code for known vulnerabilities. |

diff --git a/doc/ci/testing/unit_test_report_examples.md b/doc/ci/testing/unit_test_report_examples.md
new file mode 100644
index 00000000000..a54deb254b7
--- /dev/null
+++ b/doc/ci/testing/unit_test_report_examples.md
@@ -0,0 +1,266 @@
---
stage: Verify
group: Pipeline Insights
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
---

# Unit test report examples **(FREE)**

[Unit test reports](unit_test_reports.md) can be generated for many languages and packages.
Use these examples as guidelines for configuring your pipeline to generate unit test reports
for the listed languages and packages. You might need to edit the examples to match
the version of the language or package you are using.

## Ruby

Use the following job in `.gitlab-ci.yml`. This example includes the `artifacts:paths` keyword to provide a link to the Unit test report output file.

```yaml
## Use https://github.com/sj26/rspec_junit_formatter to generate a JUnit report format XML file with rspec
ruby:
  stage: test
  script:
    - bundle install
    - bundle exec rspec --format progress --format RspecJunitFormatter --out rspec.xml
  artifacts:
    when: always
    paths:
      - rspec.xml
    reports:
      junit: rspec.xml
```

## Go

Use the following job in `.gitlab-ci.yml`:

```yaml
## Use https://github.com/gotestyourself/gotestsum to generate a JUnit report format XML file with go
golang:
  stage: test
  script:
    - go get gotest.tools/gotestsum
    - gotestsum --junitfile report.xml --format testname
  artifacts:
    when: always
    reports:
      junit: report.xml
```

## Java

There are a few tools that can produce JUnit report format XML files in Java.

### Gradle

In the following example, `gradle` is used to generate the test reports.
If there are multiple test tasks defined, `gradle` generates multiple
directories under `build/test-results/`.
In that case, you can leverage glob
matching by defining the following path: `build/test-results/test/**/TEST-*.xml`:

```yaml
java:
  stage: test
  script:
    - gradle test
  artifacts:
    when: always
    reports:
      junit: build/test-results/test/**/TEST-*.xml
```

In [GitLab Runner 13.0](https://gitlab.com/gitlab-org/gitlab-runner/-/issues/2620)
and later, you can use `**`.

### Maven

For parsing [Surefire](https://maven.apache.org/surefire/maven-surefire-plugin/)
and [Failsafe](https://maven.apache.org/surefire/maven-failsafe-plugin/) test
reports, use the following job in `.gitlab-ci.yml`:

```yaml
java:
  stage: test
  script:
    - mvn verify
  artifacts:
    when: always
    reports:
      junit:
        - target/surefire-reports/TEST-*.xml
        - target/failsafe-reports/TEST-*.xml
```

## Python

This example uses pytest with the `--junitxml=report.xml` flag to format the output
into the JUnit report XML format:

```yaml
pytest:
  stage: test
  script:
    - pytest --junitxml=report.xml
  artifacts:
    when: always
    reports:
      junit: report.xml
```

## C/C++

There are a few tools that can produce JUnit report format XML files in C/C++.

### GoogleTest

In the following example, `gtest` is used to generate the test reports.
If there are multiple `gtest` executables created for different architectures (`x86`, `x64`, or `arm`),
you must run each test executable with a unique output filename. The results
are then aggregated together.

```yaml
cpp:
  stage: test
  script:
    - gtest.exe --gtest_output="xml:report.xml"
  artifacts:
    when: always
    reports:
      junit: report.xml
```
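For the multiple-architecture case described above, a minimal sketch might look like the following. The executable names `gtest_x86.exe` and `gtest_x64.exe` and the report filenames are hypothetical placeholders for your own builds:

```yaml
# Hypothetical sketch: run two gtest builds and give each report a unique filename,
# then list both files so GitLab aggregates them into one report.
cpp:
  stage: test
  script:
    - gtest_x86.exe --gtest_output="xml:report_x86.xml"
    - gtest_x64.exe --gtest_output="xml:report_x64.xml"
  artifacts:
    when: always
    reports:
      junit:
        - report_x86.xml
        - report_x64.xml
```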
### CUnit

[CUnit](https://cunity.gitlab.io/cunit/) can be made to produce [JUnit report format XML files](https://cunity.gitlab.io/cunit/group__CI.html)
automatically when run using its `CUnitCI.h` macros:

```yaml
cunit:
  stage: test
  script:
    - ./my-cunit-test
  artifacts:
    when: always
    reports:
      junit: ./my-cunit-test.xml
```

## .NET

The [JunitXML.TestLogger](https://www.nuget.org/packages/JunitXml.TestLogger/) NuGet
package can generate test reports for .NET Framework and .NET Core applications. The following
example expects a solution in the root folder of the repository, with one or more
project files in sub-folders. One result file is produced per test project, and each file
is placed in the artifacts folder. This example includes optional formatting arguments, which
improve the readability of test data in the test widget. A full .NET Core
[example is available](https://gitlab.com/Siphonophora/dot-net-cicd-test-logging-demo).

```yaml
## Source code and documentation are here: https://github.com/spekt/junit.testlogger/

Test:
  stage: test
  script:
    - 'dotnet test --test-adapter-path:. --logger:"junit;LogFilePath=..\artifacts\{assembly}-test-result.xml;MethodFormat=Class;FailureBodyFormat=Verbose"'
  artifacts:
    when: always
    paths:
      - ./**/*test-result.xml
    reports:
      junit:
        - ./**/*test-result.xml
```

## JavaScript

There are a few tools that can produce JUnit report format XML files in JavaScript.

### Jest

The [jest-junit](https://github.com/jest-community/jest-junit) npm package can generate
test reports for JavaScript applications. In the following `.gitlab-ci.yml` example,
the `javascript` job uses Jest to generate the test reports:

```yaml
javascript:
  stage: test
  script:
    - 'jest --ci --reporters=default --reporters=jest-junit'
  artifacts:
    when: always
    reports:
      junit:
        - junit.xml
```

### Karma

The [karma-junit-reporter](https://github.com/karma-runner/karma-junit-reporter)
npm package can generate test reports for JavaScript applications. In the following
`.gitlab-ci.yml` example, the `javascript` job uses Karma to generate the test reports:

```yaml
javascript:
  stage: test
  script:
    - karma start --reporters junit
  artifacts:
    when: always
    reports:
      junit:
        - junit.xml
```

### Mocha

The [JUnit Reporter for Mocha](https://github.com/michaelleeallen/mocha-junit-reporter)
npm package can generate test reports for JavaScript applications. In the following
`.gitlab-ci.yml` example, the `javascript` job uses Mocha to generate the test reports:

```yaml
javascript:
  stage: test
  script:
    - mocha --reporter mocha-junit-reporter --reporter-options mochaFile=junit.xml
  artifacts:
    when: always
    reports:
      junit:
        - junit.xml
```

## Flutter or Dart

This example `.gitlab-ci.yml` file uses the [JUnit Report](https://pub.dev/packages/junitreport)
package to convert the `flutter test` output into JUnit report XML format:

```yaml
test:
  stage: test
  script:
    - flutter test --machine | tojunit -o report.xml
  artifacts:
    when: always
    reports:
      junit:
        - report.xml
```

## PHP

This example uses [PHPUnit](https://phpunit.de/) with the `--log-junit` flag.
You can also add this option using
[XML](https://phpunit.readthedocs.io/en/stable/configuration.html#the-junit-element)
in the `phpunit.xml` configuration file.

```yaml
phpunit:
  stage: test
  script:
    - composer install
    - vendor/bin/phpunit --log-junit report.xml
  artifacts:
    when: always
    reports:
      junit: report.xml
```

diff --git a/doc/ci/testing/unit_test_reports.md b/doc/ci/testing/unit_test_reports.md
new file mode 100644
index 00000000000..e9c9410b16d
--- /dev/null
+++ b/doc/ci/testing/unit_test_reports.md
@@ -0,0 +1,160 @@
---
stage: Verify
group: Pipeline Insights
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
---

# Unit test reports **(FREE)**

> - [Introduced](https://gitlab.com/gitlab-org/gitlab-foss/-/issues/45318) in GitLab 11.2. Requires GitLab Runner 11.2 and above.
> - [Renamed](https://gitlab.com/gitlab-org/gitlab/-/merge_requests/39737) from JUnit test reports to Unit test reports in GitLab 13.4.

It is very common for a [CI/CD pipeline](../pipelines/index.md) to contain a
test job that verifies your code.
If the tests fail, the pipeline fails and users are notified. The person who
works on the merge request then has to check the job logs to see where the
tests failed so that they can fix them.

You can configure your job to use Unit test reports, and GitLab displays a
report on the merge request so that it's easier and faster to identify the
failure without having to check the entire log. Unit test reports currently
only support test reports in the JUnit report format.

If you don't use merge requests but still want to see the unit test report
output without searching through job logs, the full
[Unit test reports](#view-unit-test-reports-on-gitlab) are available
in the pipeline detail view.
Consider the following workflow:

1. Your default branch is rock solid, your project is using GitLab CI/CD, and
   your pipelines indicate that there isn't anything broken.
1. Someone from your team submits a merge request, a test fails, and the pipeline
   gets the known red icon. To investigate further, you have to go through the job
   logs to figure out the cause of the failed test, and those logs usually contain
   thousands of lines.
1. You configure Unit test reports, and GitLab immediately collects and
   exposes them in the merge request. No more searching in the job logs.
1. Your development and debugging workflow becomes easier, faster, and more efficient.

## How it works

First, GitLab Runner uploads all [JUnit report format XML files](https://www.ibm.com/docs/en/adfz/developer-for-zos/14.1.0?topic=formats-junit-xml-format)
as [artifacts](../yaml/artifacts_reports.md#artifactsreportsjunit) to GitLab. Then, when you visit a merge request, GitLab starts
comparing the head and base branch's JUnit report format XML files, where:

- The base branch is the target branch (usually the default branch).
- The head branch is the source branch (the latest pipeline in each merge request).

The reports panel has a summary showing how many tests failed, how many had errors,
and how many were fixed. If no comparison can be done because data for the base branch
is not available, the panel just shows the list of failed tests for head.

There are four types of results:

1. **Newly failed tests:** Test cases which passed on the base branch and failed on the head branch
1. **Newly encountered errors:** Test cases which passed on the base branch and failed due to a
   test error on the head branch
1. **Existing failures:** Test cases which failed on the base branch and failed on the head branch
1. **Resolved failures:** Test cases which failed on the base branch and passed on the head branch

Each entry in the panel shows the test name and its type from the list
above. Selecting the test name opens a modal window with details of its
execution time and the error output.
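For reference, a minimal, hypothetical JUnit report format XML file might look like the following. The suite, class, and test names are illustrative placeholders; the `<failure>` and `<error>` elements correspond to the failed and errored result types described above:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical report: names and messages are placeholders, not real output. -->
<testsuite name="example-suite" tests="3" failures="1" errors="1" time="0.42">
  <testcase classname="ExampleClass" name="passes" time="0.11"/>
  <testcase classname="ExampleClass" name="fails an assertion" time="0.15">
    <failure message="expected true, got false">assertion details go here</failure>
  </testcase>
  <testcase classname="ExampleClass" name="raises an error" time="0.16">
    <error message="connection refused">stack trace goes here</error>
  </testcase>
</testsuite>
```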
### Number of recent failures

> - [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/241759) in merge requests in GitLab 13.7.
> - [Feature flag removed](https://gitlab.com/gitlab-org/gitlab/-/issues/268249) in GitLab 13.8.
> - [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/235525) in Test Reports in GitLab 13.9.

If a test failed in the project's default branch in the last 14 days, a message like
`Failed {n} time(s) in {default_branch} in the last 14 days` is displayed for that test.

## How to set it up

To enable the Unit test reports in merge requests, you must add
[`artifacts:reports:junit`](../yaml/artifacts_reports.md#artifactsreportsjunit)
in `.gitlab-ci.yml`, and specify the paths of the generated test reports.
The reports must be `.xml` files, otherwise [GitLab returns an Error 500](https://gitlab.com/gitlab-org/gitlab/-/issues/216575).

In the following example for Ruby, the job in the `test` stage runs and GitLab
collects the unit test report from the job. After the job is executed, the
XML report is stored in GitLab as an artifact, and the results are shown in the
merge request widget.

```yaml
## Use https://github.com/sj26/rspec_junit_formatter to generate a JUnit report format XML file with rspec
ruby:
  stage: test
  script:
    - bundle install
    - bundle exec rspec --format progress --format RspecJunitFormatter --out rspec.xml
  artifacts:
    when: always
    paths:
      - rspec.xml
    reports:
      junit: rspec.xml
```

To make the Unit test report output files browsable, include them with the
[`artifacts:paths`](../yaml/index.md#artifactspaths) keyword as well, as shown in the example.
To upload the report even if the job fails (for example, if the tests do not pass),
use the [`artifacts:when:always`](../yaml/index.md#artifactswhen) keyword.

You cannot have multiple tests with the same name and class in your JUnit report format XML file.

## View Unit test reports on GitLab

> - [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/24792) in GitLab 12.5 behind a feature flag (`junit_pipeline_view`), disabled by default.
> - [Feature flag removed](https://gitlab.com/gitlab-org/gitlab/-/issues/216478) in GitLab 13.3.

If JUnit report format XML files are generated and uploaded as part of a pipeline, these reports
can be viewed inside the pipeline details page. The **Tests** tab on this page
displays a list of test suites and cases reported from the XML file.

You can view all the known test suites and select each of these to see further
details, including the cases that make up the suite.

You can also retrieve the reports via the [GitLab API](../../api/pipelines.md#get-a-pipelines-test-report).

### Unit test reports parsing errors

> [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/263457) in GitLab 13.10.

If parsing JUnit report XML results in an error, an indicator is shown next to the job name. Hovering over the icon shows the parser error in a tooltip. If multiple parsing errors come from [grouped jobs](../jobs/index.md#group-jobs-in-a-pipeline), GitLab shows only the first error from the group.

For test case parsing limits, see [Max test cases per unit test report](../../user/gitlab_com/#gitlab-cicd).

GitLab does not parse very [large nodes](https://nokogiri.org/tutorials/parsing_an_html_xml_document.html#parse-options) of JUnit reports. There is [an issue](https://gitlab.com/gitlab-org/gitlab/-/issues/268035) open to make this optional.

## View JUnit screenshots on GitLab

> - [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/202114) in GitLab 13.0 behind the `:junit_pipeline_screenshots_view` feature flag, disabled by default.
> - [Feature flag removed](https://gitlab.com/gitlab-org/gitlab/-/issues/216979) in GitLab 13.12.

Upload your screenshots as [artifacts](../yaml/artifacts_reports.md#artifactsreportsjunit) to GitLab. If JUnit
report format XML files contain an `attachment` tag, GitLab parses the attachment. Note that:

- The `attachment` tag **must** contain the relative path to `$CI_PROJECT_DIR` of the screenshots you uploaded. For
  example:

  ```xml
  <testcase time="1.00" name="Test">
    <system-out>[[ATTACHMENT|/path/to/some/file]]</system-out>
  </testcase>
  ```

- You should set the job that uploads the screenshot to
  [`artifacts:when: always`](../yaml/index.md#artifactswhen) so that it still uploads a screenshot
  when a test fails.

A link to the test case attachment appears in the test case details in
[the pipeline test report](#view-unit-test-reports-on-gitlab).
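Putting those two points together, a minimal sketch of a job that uploads both the JUnit report and the screenshots it references might look like the following. The `./run-ui-tests.sh` script and the `junit.xml` and `screenshots/` paths are hypothetical placeholders for your own test setup:

```yaml
# Hypothetical job: keep both the JUnit XML and the referenced screenshots,
# even when a test fails, so the attachments can be shown in the test report.
ui-tests:
  stage: test
  script:
    - ./run-ui-tests.sh   # placeholder: writes junit.xml and screenshots/ under $CI_PROJECT_DIR
  artifacts:
    when: always
    paths:
      - screenshots/
    reports:
      junit: junit.xml
```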