Diffstat (limited to 'doc/development')
-rw-r--r--  doc/development/README.md  |  12
-rw-r--r--  doc/development/architecture.md  |  24
-rw-r--r--  doc/development/automatic_ce_ee_merge.md  |  15
-rw-r--r--  doc/development/chaos_endpoints.md  |  37
-rw-r--r--  doc/development/code_review.md  |  61
-rw-r--r--  doc/development/contributing/community_roles.md  |  2
-rw-r--r--  doc/development/contributing/issue_workflow.md  |  117
-rw-r--r--  doc/development/contributing/merge_request_workflow.md  |  34
-rw-r--r--  doc/development/contributing/style_guides.md  |  1
-rw-r--r--  doc/development/database_debugging.md  |  34
-rw-r--r--  doc/development/database_review.md  |  134
-rw-r--r--  doc/development/diffs.md  |  2
-rw-r--r--  doc/development/documentation/feature-change-workflow.md  |  30
-rw-r--r--  doc/development/documentation/improvement-workflow.md  |  6
-rw-r--r--  doc/development/documentation/index.md  |  146
-rw-r--r--  doc/development/documentation/styleguide.md  |  258
-rw-r--r--  doc/development/ee_features.md  |  46
-rw-r--r--  doc/development/elasticsearch.md  |  30
-rw-r--r--  doc/development/fe_guide/architecture.md  |  2
-rw-r--r--  doc/development/fe_guide/axios.md  |  5
-rw-r--r--  doc/development/fe_guide/components.md  |  45
-rw-r--r--  doc/development/fe_guide/event_tracking.md  |  97
-rw-r--r--  doc/development/fe_guide/frontend_faq.md  |  6
-rw-r--r--  doc/development/fe_guide/graphql.md  |  3
-rw-r--r--  doc/development/fe_guide/icons.md  |  16
-rw-r--r--  doc/development/fe_guide/index.md  |  2
-rw-r--r--  doc/development/fe_guide/performance.md  |  40
-rw-r--r--  doc/development/fe_guide/style_guide_js.md  |  863
-rw-r--r--  doc/development/fe_guide/style_guide_scss.md  |  3
-rw-r--r--  doc/development/fe_guide/vuex.md  |  44
-rw-r--r--  doc/development/file_storage.md  |  4
-rw-r--r--  doc/development/filtering_by_label.md  |  166
-rw-r--r--  doc/development/geo.md  |  26
-rw-r--r--  doc/development/git_object_deduplication.md  |  110
-rw-r--r--  doc/development/go_guide/index.md  |  102
-rw-r--r--  doc/development/gotchas.md  |  2
-rw-r--r--  doc/development/hash_indexes.md  |  2
-rw-r--r--  doc/development/i18n/externalization.md  |  221
-rw-r--r--  doc/development/i18n/proofreader.md  |  1
-rw-r--r--  doc/development/img/architecture_simplified.png  |  bin 61590 -> 34330 bytes
-rw-r--r--  doc/development/img/distributed_tracing_jaeger_ui.png  |  bin 1032713 -> 546331 bytes
-rw-r--r--  doc/development/img/distributed_tracing_performance_bar.png  |  bin 108809 -> 34370 bytes
-rw-r--r--  doc/development/import_export.md  |  5
-rw-r--r--  doc/development/instrumentation.md  |  2
-rw-r--r--  doc/development/interacting_components.md  |  29
-rw-r--r--  doc/development/licensed_feature_availability.md  |  6
-rw-r--r--  doc/development/migration_style_guide.md  |  11
-rw-r--r--  doc/development/module_with_instance_variables.md  |  20
-rw-r--r--  doc/development/new_fe_guide/dependencies.md  |  12
-rw-r--r--  doc/development/new_fe_guide/development/accessibility.md  |  5
-rw-r--r--  doc/development/new_fe_guide/development/performance.md  |  2
-rw-r--r--  doc/development/new_fe_guide/development/testing.md  |  363
-rw-r--r--  doc/development/new_fe_guide/modules/dirty_submit.md  |  9
-rw-r--r--  doc/development/new_fe_guide/style/index.md  |  2
-rw-r--r--  doc/development/new_fe_guide/style/javascript.md  |  7
-rw-r--r--  doc/development/performance.md  |  8
-rw-r--r--  doc/development/policies.md  |  4
-rw-r--r--  doc/development/query_recorder.md  |  7
-rw-r--r--  doc/development/rake_tasks.md  |  3
-rw-r--r--  doc/development/reusing_abstractions.md  |  11
-rw-r--r--  doc/development/sha1_as_binary.md  |  2
-rw-r--r--  doc/development/shell_scripting_guide/index.md  |  118
-rw-r--r--  doc/development/sql.md  |  30
-rw-r--r--  doc/development/testing_guide/best_practices.md  |  24
-rw-r--r--  doc/development/testing_guide/ci.md  |  1
-rw-r--r--  doc/development/testing_guide/end_to_end/index.md  |  23
-rw-r--r--  doc/development/testing_guide/end_to_end/page_objects.md  |  7
-rw-r--r--  doc/development/testing_guide/end_to_end/quick_start_guide.md  |  42
-rw-r--r--  doc/development/testing_guide/end_to_end/style_guide.md  |  48
-rw-r--r--  doc/development/testing_guide/flaky_tests.md  |  4
-rw-r--r--  doc/development/testing_guide/frontend_testing.md  |  540
-rw-r--r--  doc/development/testing_guide/img/qa_on_merge_requests_cicd_architecture.png  |  bin 64862 -> 0 bytes
-rw-r--r--  doc/development/testing_guide/img/review_apps_cicd_architecture.png  |  bin 136431 -> 0 bytes
-rw-r--r--  doc/development/testing_guide/review_apps.md  |  66
-rw-r--r--  doc/development/testing_guide/testing_levels.md  |  3
-rw-r--r--  doc/development/understanding_explain_plans.md  |  4
-rw-r--r--  doc/development/utilities.md  |  128
-rw-r--r--  doc/development/ux_guide/animation.md  |  4
-rw-r--r--  doc/development/ux_guide/illustrations.md  |  4
-rw-r--r--  doc/development/verifying_database_capabilities.md  |  10
-rw-r--r--  doc/development/what_requires_downtime.md  |  27
81 files changed, 2630 insertions, 1710 deletions
diff --git a/doc/development/README.md b/doc/development/README.md
index a74770ae383..6281bb809ff 100644
--- a/doc/development/README.md
+++ b/doc/development/README.md
@@ -3,7 +3,7 @@ comments: false
description: 'Learn how to contribute to GitLab.'
---
-# GitLab development guides
+# Contributor and Development Docs
## Get started!
@@ -17,6 +17,7 @@ description: 'Learn how to contribute to GitLab.'
- [GitLab core team & GitLab Inc. contribution process](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/PROCESS.md)
- [Generate a changelog entry with `bin/changelog`](changelog.md)
- [Code review guidelines](code_review.md) for reviewing code and having code reviewed
+- [Database review guidelines](database_review.md) for reviewing database-related changes and complex SQL queries
- [Automatic CE->EE merge](automatic_ce_ee_merge.md)
- [Guidelines for implementing Enterprise Edition features](ee_features.md)
- [Security process for developers](https://gitlab.com/gitlab-org/release/docs/blob/master/general/security/developer.md#security-releases-critical-non-critical-as-a-developer)
@@ -63,6 +64,7 @@ description: 'Learn how to contribute to GitLab.'
- [Routing](routing.md)
- [Repository mirroring](repository_mirroring.md)
- [Git LFS](lfs.md)
+- [Developing against interacting components or features](interacting_components.md)
## Performance guides
@@ -111,6 +113,10 @@ description: 'Learn how to contribute to GitLab.'
- [Database helper modules](database_helpers.md)
- [Code comments](code_comments.md)
+## Case studies
+
+- [Database case study: Filtering by label](filtering_by_label.md)
+
## Integration guides
- [Jira Connect app](integrations/jira_connect.md)
@@ -144,6 +150,10 @@ description: 'Learn how to contribute to GitLab.'
- [Go Guidelines](go_guide/index.md)
+## Shell Scripting guides
+
+- [Shell scripting standards and style guidelines](shell_scripting_guide/index.md)
+
## Other GitLab Development Kit (GDK) guides
- [Run full Auto DevOps cycle in a GDK instance](https://gitlab.com/gitlab-org/gitlab-development-kit/blob/master/doc/howto/auto_devops.md)
diff --git a/doc/development/architecture.md b/doc/development/architecture.md
index 091edb6ac40..5cb2ddf6e52 100644
--- a/doc/development/architecture.md
+++ b/doc/development/architecture.md
@@ -20,7 +20,7 @@ A typical install of GitLab will be on GNU/Linux. It uses Nginx or Apache as a w
We also support deploying GitLab on Kubernetes using our [gitlab Helm chart](https://docs.gitlab.com/charts/).
-The GitLab web app uses MySQL or PostgreSQL for persistent database information (e.g. users, permissions, issues, other meta data). GitLab stores the bare git repositories it serves in `/home/git/repositories` by default. It also keeps default branch and hook information with the bare repository.
+The GitLab web app uses PostgreSQL for persistent database information (e.g. users, permissions, issues, other meta data). GitLab stores the bare git repositories it serves in `/home/git/repositories` by default. It also keeps default branch and hook information with the bare repository.
When serving repositories over HTTP/HTTPS GitLab utilizes the GitLab API to resolve authorization and access as well as serving git objects.
@@ -511,7 +511,15 @@ To summarize here's the [directory structure of the `git` user home directory](.
ps aux | grep '^git'
```
-GitLab has several components to operate. As a system user (i.e. any user that is not the `git` user) it requires a persistent database (MySQL/PostreSQL) and redis database. It also uses Apache httpd or Nginx to proxypass Unicorn. As the `git` user it starts Sidekiq and Unicorn (a simple ruby HTTP server running on port `8080` by default). Under the GitLab user there are normally 4 processes: `unicorn_rails master` (1 process), `unicorn_rails worker` (2 processes), `sidekiq` (1 process).
+GitLab has several components to operate. It requires a persistent database
+(PostgreSQL) and redis database, and uses Apache httpd or Nginx to proxypass
+Unicorn. All these components should run as different system users from GitLab
+(e.g., `postgres`, `redis` and `www-data`, instead of `git`).
+
+As the `git` user it starts Sidekiq and Unicorn (a simple ruby HTTP server
+running on port `8080` by default). Under the GitLab user there are normally 4
+processes: `unicorn_rails master` (1 process), `unicorn_rails worker`
+(2 processes), `sidekiq` (1 process).
### Repository access
@@ -554,12 +562,9 @@ $ /etc/init.d/nginx
Usage: nginx {start|stop|restart|reload|force-reload|status|configtest}
```
-Persistent database (one of the following)
+Persistent database
```
-/etc/init.d/mysqld
-Usage: /etc/init.d/mysqld {start|stop|status|restart|condrestart|try-restart|reload|force-reload}
-
$ /etc/init.d/postgresql
Usage: /etc/init.d/postgresql {start|stop|restart|reload|force-reload|status} [version ..]
```
@@ -568,7 +573,7 @@ Usage: /etc/init.d/postgresql {start|stop|restart|reload|force-reload|status} [v
gitlabhq (includes Unicorn and Sidekiq logs)
-- `/home/git/gitlab/log/` contains `application.log`, `production.log`, `sidekiq.log`, `unicorn.stdout.log`, `githost.log` and `unicorn.stderr.log` normally.
+- `/home/git/gitlab/log/` contains `application.log`, `production.log`, `sidekiq.log`, `unicorn.stdout.log`, `git_json.log` and `unicorn.stderr.log` normally.
gitlab-shell
@@ -597,11 +602,6 @@ PostgreSQL
- `/var/log/postgresql/*`
-MySQL
-
-- `/var/log/mysql/*`
-- `/var/log/mysql.*`
-
### GitLab specific config files
GitLab has configuration files located in `/home/git/gitlab/config/*`. Commonly referenced config files include:
diff --git a/doc/development/automatic_ce_ee_merge.md b/doc/development/automatic_ce_ee_merge.md
index 423b35a9e3a..001a92790e1 100644
--- a/doc/development/automatic_ce_ee_merge.md
+++ b/doc/development/automatic_ce_ee_merge.md
@@ -171,6 +171,19 @@ Now, every time you create an MR for CE and EE:
job failed, you are required to submit the EE MR so that you can fix the conflicts in EE
before merging your changes into CE.
+## How we run the Automatic CE->EE merge at GitLab
+
+At GitLab, we use the [Merge Train](https://gitlab.com/gitlab-org/merge-train)
+project to keep our [gitlab-ee](https://gitlab.com/gitlab-org/gitlab-ee)
+repository updated with commits from
+[gitlab-ce](https://gitlab.com/gitlab-org/gitlab-ce).
+
+We have a mirror of the [Merge Train](https://gitlab.com/gitlab-org/merge-train)
+project [configured](https://ops.gitlab.net/gitlab-org/merge-train) to run an
+automatic CE->EE merge job every twenty minutes as a scheduled CI job. The
+[configured](https://ops.gitlab.net/gitlab-org/merge-train) Merge Train project
+is only accessible to authorized GitLab staff.
+
## FAQ
### How does automatic merging work?
@@ -187,7 +200,7 @@ code.
### Why merge automatically?
As we work towards continuous deployments and a single repository for both CE
-and EE, we need to first make sure that all CE changes make their way into CE as
+and EE, we need to first make sure that all CE changes make their way into EE as
fast as possible. Past experiences and data have shown that periodic CE to EE
merge requests do not scale, and often take a very long time to complete. For
example, [in this
diff --git a/doc/development/chaos_endpoints.md b/doc/development/chaos_endpoints.md
index b3406275937..961520db7d8 100644
--- a/doc/development/chaos_endpoints.md
+++ b/doc/development/chaos_endpoints.md
@@ -36,6 +36,10 @@ Replace `secret` with your own secret token.
Once you have enabled the chaos endpoints and restarted the application, you can start testing using the endpoints.
+By default, when invoking a chaos endpoint, the web worker process which receives the request will handle it. This means, for example, that if the Kill
+operation is invoked, the Puma or Unicorn worker process handling the request will be killed. To test these operations in Sidekiq, the `async` parameter on
+each endpoint can be set to `true`. This will run the chaos process in a Sidekiq worker.
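+
+For example, to run the kill operation in a Sidekiq worker rather than in the web worker
+handling the request (this assumes the same local URL and secret token used in the
+examples below):
+
+```bash
+curl "http://localhost:3000/-/chaos/kill?async=true" --header 'X-Chaos-Secret: secret'
+```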
+
## Memory leaks
To simulate a memory leak in your application, use the `/-/chaos/leakmem` endpoint.
@@ -47,12 +51,14 @@ The memory is not retained after the request finishes. Once the request has comp
GET /-/chaos/leakmem
GET /-/chaos/leakmem?memory_mb=1024
GET /-/chaos/leakmem?memory_mb=1024&duration_s=50
+GET /-/chaos/leakmem?memory_mb=1024&duration_s=50&async=true
```
-| Attribute | Type | Required | Description |
-| ------------ | ------- | -------- | ---------------------------------------------------------------------------------- |
-| `memory_mb` | integer | no | How much memory, in MB, should be leaked. Defaults to 100MB. |
+| Attribute | Type | Required | Description |
+| ------------ | ------- | -------- | ------------------------------------------------------------------------------------ |
+| `memory_mb` | integer | no | How much memory, in MB, should be leaked. Defaults to 100MB. |
| `duration_s` | integer | no | Minimum duration_s, in seconds, that the memory should be retained. Defaults to 30s. |
+| `async` | boolean | no | Set to true to leak memory in a Sidekiq background worker process |
```bash
curl http://localhost:3000/-/chaos/leakmem?memory_mb=1024&duration_s=10 --header 'X-Chaos-Secret: secret'
@@ -63,17 +69,19 @@ curl http://localhost:3000/-/chaos/leakmem?memory_mb=1024&duration_s=10&token=se
This endpoint attempts to fully utilise a single core, at 100%, for the given period.
-Depending on your rack server setup, your request may timeout after a predermined period (normally 60 seconds).
+Depending on your rack server setup, your request may timeout after a predetermined period (normally 60 seconds).
If you're using Unicorn, this is done by killing the worker process.
```
GET /-/chaos/cpu_spin
GET /-/chaos/cpu_spin?duration_s=50
+GET /-/chaos/cpu_spin?duration_s=50&async=true
```
| Attribute | Type | Required | Description |
| ------------ | ------- | -------- | --------------------------------------------------------------------- |
-| `duration_s` | integer | no | Duration, in seconds, that the core will be utilised. Defaults to 30s |
+| `duration_s` | integer | no | Duration, in seconds, that the core will be utilized. Defaults to 30s |
+| `async` | boolean | no | Set to true to consume CPU in a Sidekiq background worker process |
```bash
curl http://localhost:3000/-/chaos/cpu_spin?duration_s=60 --header 'X-Chaos-Secret: secret'
@@ -85,18 +93,20 @@ curl http://localhost:3000/-/chaos/cpu_spin?duration_s=60&token=secret
This endpoint attempts to fully utilise a single core, and interleave it with DB request, for the given period.
This endpoint can be used to model yielding execution to another threads when running concurrently.
-Depending on your rack server setup, your request may timeout after a predermined period (normally 60 seconds).
+Depending on your rack server setup, your request may timeout after a predetermined period (normally 60 seconds).
If you're using Unicorn, this is done by killing the worker process.
```
GET /-/chaos/db_spin
GET /-/chaos/db_spin?duration_s=50
+GET /-/chaos/db_spin?duration_s=50&async=true
```
-| Attribute | Type | Required | Description |
-| ------------ | ------- | -------- | --------------------------------------------------------------------- |
-| `interval_s` | float | no | Interval, in seconds, for every DB request. Defaults to 1s |
-| `duration_s` | integer | no | Duration, in seconds, that the core will be utilised. Defaults to 30s |
+| Attribute | Type | Required | Description |
+| ------------ | ------- | -------- | --------------------------------------------------------------------------- |
+| `interval_s` | float | no | Interval, in seconds, for every DB request. Defaults to 1s |
+| `duration_s` | integer | no | Duration, in seconds, that the core will be utilized. Defaults to 30s |
+| `async` | boolean | no | Set to true to perform the operation in a Sidekiq background worker process |
```bash
curl http://localhost:3000/-/chaos/db_spin?interval_s=1&duration_s=60 --header 'X-Chaos-Secret: secret'
@@ -112,11 +122,13 @@ As with the CPU Spin endpoint, this may lead to your request timing out if durat
```
GET /-/chaos/sleep
GET /-/chaos/sleep?duration_s=50
+GET /-/chaos/sleep?duration_s=50&async=true
```
| Attribute | Type | Required | Description |
| ------------ | ------- | -------- | ---------------------------------------------------------------------- |
| `duration_s` | integer | no | Duration, in seconds, that the request will sleep for. Defaults to 30s |
+| `async` | boolean | no | Set to true to sleep in a Sidekiq background worker process |
```bash
curl http://localhost:3000/-/chaos/sleep?duration_s=60 --header 'X-Chaos-Secret: secret'
@@ -132,8 +144,13 @@ Since this endpoint uses the `KILL` signal, the worker is not given a chance to
```
GET /-/chaos/kill
+GET /-/chaos/kill?async=true
```
+| Attribute | Type | Required | Description |
+| ------------ | ------- | -------- | ---------------------------------------------------------------------- |
+| `async` | boolean | no | Set to true to kill a Sidekiq background worker process |
+
```bash
curl http://localhost:3000/-/chaos/kill --header 'X-Chaos-Secret: secret'
curl http://localhost:3000/-/chaos/kill?token=secret
diff --git a/doc/development/code_review.md b/doc/development/code_review.md
index 35ce6b8cbe4..b7d74b17eb3 100644
--- a/doc/development/code_review.md
+++ b/doc/development/code_review.md
@@ -45,9 +45,9 @@ page, with these behaviours:
1. It will not pick people whose [GitLab status](../user/profile/index.md#current-status)
contains the string 'OOO'.
-2. [Trainee maintainers](https://about.gitlab.com/handbook/engineering/workflow/code-review/#trainee-maintainer)
+1. [Trainee maintainers](https://about.gitlab.com/handbook/engineering/workflow/code-review/#trainee-maintainer)
are three times as likely to be picked as other reviewers.
-3. It always picks the same reviewers and maintainers for the same
+1. It always picks the same reviewers and maintainers for the same
branch name (unless their OOO status changes, as in point 1). It
removes leading `ce-` and `ee-`, and trailing `-ce` and `-ee`, so
that it can be stable for backport branches.
@@ -58,20 +58,21 @@ As described in the section on the responsibility of the maintainer below, you
are recommended to get your merge request approved and merged by maintainer(s)
from teams other than your own.
- 1. If your merge request includes backend changes [^1], it must be
- **approved by a [backend maintainer](https://about.gitlab.com/handbook/engineering/projects/#gitlab-ce_maintainers_backend)**.
- 1. If your merge request includes database migrations or changes to expensive queries [^2], it must be
- **approved by a [database maintainer](https://about.gitlab.com/handbook/engineering/projects/#gitlab-ce_maintainers_database)**.
- 1. If your merge request includes frontend changes [^1], it must be
- **approved by a [frontend maintainer](https://about.gitlab.com/handbook/engineering/projects/#gitlab-ce_maintainers_frontend)**.
- 1. If your merge request includes UX changes [^1], it must be
- **approved by a [UX team member][team]**.
- 1. If your merge request includes adding a new JavaScript library [^1], it must be
- **approved by a [frontend lead][team]**.
- 1. If your merge request includes adding a new UI/UX paradigm [^1], it must be
- **approved by a [UX lead][team]**.
- 1. If your merge request includes a new dependency or a filesystem change, it must be
- **approved by a [Distribution team member][team]**. See how to work with the [Distribution team](https://about.gitlab.com/handbook/engineering/dev-backend/distribution/) for more details.
+1. If your merge request includes backend changes [^1], it must be
+ **approved by a [backend maintainer](https://about.gitlab.com/handbook/engineering/projects/#gitlab-ce_maintainers_backend)**.
+1. If your merge request includes database migrations or changes to expensive queries [^2], it must be
+ **approved by a [database maintainer](https://about.gitlab.com/handbook/engineering/projects/#gitlab-ce_maintainers_database)**.
+ Read the [database review guidelines](database_review.md) for more details.
+1. If your merge request includes frontend changes [^1], it must be
+ **approved by a [frontend maintainer](https://about.gitlab.com/handbook/engineering/projects/#gitlab-ce_maintainers_frontend)**.
+1. If your merge request includes UX changes [^1], it must be
+ **approved by a [UX team member][team]**.
+1. If your merge request includes adding a new JavaScript library [^1], it must be
+ **approved by a [frontend lead][team]**.
+1. If your merge request includes adding a new UI/UX paradigm [^1], it must be
+ **approved by a [UX lead][team]**.
+1. If your merge request includes a new dependency or a filesystem change, it must be
+ **approved by a [Distribution team member][team]**. See how to work with the [Distribution team](https://about.gitlab.com/handbook/engineering/dev-backend/distribution/) for more details.
#### Security requirements
@@ -341,8 +342,7 @@ Enterprise Edition instance. This has some implications:
- [Background migrations](background_migrations.md) run in Sidekiq, and
should only be done for migrations that would take an extreme amount of
time at GitLab.com scale.
-1. **Sidekiq workers**
- [cannot change in a backwards-incompatible way](sidekiq_style_guide.md#removing-or-renaming-queues):
+1. **Sidekiq workers** [cannot change in a backwards-incompatible way](sidekiq_style_guide.md#removing-or-renaming-queues):
1. Sidekiq queues are not drained before a deploy happens, so there will be
workers in the queue from the previous version of GitLab.
1. If you need to change a method signature, try to do so across two releases,
@@ -365,6 +365,31 @@ Enterprise Edition instance. This has some implications:
1. **Filesystem access** can be slow, so try to avoid
[shared files](shared_files.md) when an alternative solution is available.
+## Examples
+
+How code reviews are conducted can surprise new contributors. Here are some examples of code reviews that should help to orient you as to what to expect.
+
+**["Modify `DiffNote` to reuse it for Designs"](https://gitlab.com/gitlab-org/gitlab-ee/merge_requests/13703):**
+It contained everything from nitpicks around newlines to reasoning
+about what versions of designs are, and how we should compare them
+if there was no previous version of a certain file (parent vs.
+blank `sha` vs empty tree).
+
+**["Support multi-line suggestions"](https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/25211)**:
+The MR itself consists of a collaboration between FE and BE,
+and documenting comments from the author for the reviewer.
+There are some nitpicks, some questions for information, and
+towards the end, a security vulnerability.
+
+**["Allow multiple repositories per project"](https://gitlab.com/gitlab-org/gitlab-ee/merge_requests/10251)**:
+ZJ referred to the other projects (workhorse) this might impact, and
+suggested some improvements for consistency. James' comments helped us
+with overall code quality (using delegation, `&.`, and those types of
+things), and with making the code more robust.
+
+**["Support multiple assignees for merge requests"](https://gitlab.com/gitlab-org/gitlab-ee/merge_requests/10161)**:
+A good example of collaboration on an MR touching multiple parts of the codebase. Nick pointed out interesting edge cases, and James Lopes also joined in, raising concerns about the import/export feature.
+
### Credits
Largely based on the [thoughtbot code review guide].
diff --git a/doc/development/contributing/community_roles.md b/doc/development/contributing/community_roles.md
index 3296cb173d7..7d2d1b77a0e 100644
--- a/doc/development/contributing/community_roles.md
+++ b/doc/development/contributing/community_roles.md
@@ -1,4 +1,4 @@
-### Community members & roles
+# Community members & roles
GitLab community members and their privileges/responsibilities.
diff --git a/doc/development/contributing/issue_workflow.md b/doc/development/contributing/issue_workflow.md
index abe11b8d1a8..b2e3ef7bf63 100644
--- a/doc/development/contributing/issue_workflow.md
+++ b/doc/development/contributing/issue_workflow.md
@@ -60,40 +60,18 @@ Subject labels are always all-lowercase.
## Team labels
-**Important**: Most of the team labels will be soon deprecated in favor of [Group labels](#group-labels).
+**Important**: Most of the historical team labels (e.g. Manage, Plan) are
+now deprecated in favor of [Group labels](#group-labels) and [Stage labels](#stage-labels).
Team labels specify what team is responsible for this issue.
Assigning a team label makes sure issues get the attention of the appropriate
people.
-The team labels planned for deprecation are:
-
-- ~Configure
-- ~Create
-- ~Defend
-- ~Distribution
-- ~Ecosystem
-- ~Geo
-- ~Gitaly
-- ~Growth
-- ~Manage
-- ~Memory
-- ~Monitor
-- ~Plan
-- ~Release
-- ~Secure
-- ~Verify
-
-The following team labels are **true** teams per our [organization structure](https://about.gitlab.com/company/team/structure/#organizational-structure) which will remain post deprecation.
+The current team labels are:
- ~Delivery
- ~Documentation
-
-The descriptions on the [labels page](https://gitlab.com/gitlab-org/gitlab-ce/-/labels) explain what falls under the
-responsibility of each team.
-
-Within those team labels, we also have the ~backend and ~frontend labels to
-indicate if an issue needs backend work, frontend work, or both.
+- ~Quality
Team labels are always capitalized so that they show up as the first label for
any issue.
@@ -102,33 +80,11 @@ any issue.
Stage labels specify which [DevOps stage][devops-stages] the issue belongs to.
-The current stage labels are:
-
-- ~"devops::manage"
-- ~"devops::plan"
-- ~"devops::create"
-- ~"devops::verify"
-- ~"devops::package"
-- ~"devops::release"
-- ~"devops::configure"
-- ~"devops::monitor"
-- ~"devops::secure"
-- ~"devops::defend"
-- ~"devops::growth"
-- ~"devops::enablement"
+The current stage labels can be found by [searching the labels list for `devops::`](https://gitlab.com/groups/gitlab-org/-/labels?search=devops%3A%3A).
These labels are [scoped labels](../../user/project/labels.md#scoped-labels-premium)
and thus are mutually exclusive.
-They differ from the [Team labels](#team-labels) because teams may work on
-issues outside their stage.
-
-Normally there is a 1:1 relationship between Stage labels and Team labels, but
-any issue can be picked up by any team, depending on current priorities.
-So, an issue labeled ~"devops:create" may be scheduled by the ~Plan team, for
-example. In such cases, it's usual to include both team labels so each team can
-be aware of the progress.
-
The Stage labels are used to generate the [direction pages][direction-pages] automatically.
[devops-stages]: https://about.gitlab.com/direction/#devops-stages
@@ -138,50 +94,21 @@ The Stage labels are used to generate the [direction pages][direction-pages] aut
Group labels specify which [groups][structure-groups] the issue belongs to.
-The current group labels are:
-
-- ~"group::access"
-- ~"group::measure"
-- ~"group::source code"
-- ~"group::knowledge"
-- ~"group::editor"
-- ~"group::gitaly"
-- ~"group::gitter"
-- ~"group::team planning"
-- ~"group::enterprise planning"
-- ~"group::certify"
-- ~"group::ci and runner"
-- ~"group::testing"
-- ~"group::package"
-- ~"group::progressive delivery"
-- ~"group::release management"
-- ~"group::autodevops and kubernetes"
-- ~"group::serverless and paas"
-- ~"group::apm"
-- ~"group::health"
-- ~"group::static analysis"
-- ~"group::dynamic analysis"
-- ~"group::software composition analysis"
-- ~"group::runtime application security"
-- ~"group::threat management"
-- ~"group::application infrastructure security"
-- ~"group::activation"
-- ~"group::adoption"
-- ~"group::upsell"
-- ~"group::retention"
-- ~"group::fulfillment"
-- ~"group::telemetry"
-- ~"group::distribution"
-- ~"group::geo"
-- ~"group::memory"
-- ~"group::ecosystem"
-
+The current group labels can be found by [searching the labels list for `group::`](https://gitlab.com/groups/gitlab-org/-/labels?search=group%3A%3A).
+
These labels are [scoped labels](../../user/project/labels.md#scoped-labels-premium)
and thus are mutually exclusive.
-Groups are nested beneath a particular stage, so only one stage label and one group label
-can be applied to a single issue. You can find the groups listed in the
-[Product Categories pages][product-categories].
+You can find the groups listed in the [Product Stages, Groups, and Categories][product-categories] page.
+
+We use the term group to map down product requirements from our product stages.
+Because a team needs a way to collect the work its members plan to take on, we use the `~group::` labels to do so.
+
+Normally there is a 1:1 relationship between Stage labels and Group labels. In the spirit of "Everyone can contribute",
+any issue can be picked up by any group, depending on current priorities. For example, an issue labeled ~"devops::create" may be picked up by the ~"group::access" group.
+
+We also use stage and group labels to help quantify our [throughput](https://about.gitlab.com/handbook/engineering/management/throughput).
+Please read [Stage and Group labels in Throughput](https://about.gitlab.com/handbook/engineering/management/throughput/#stage-and-group-labels-in-throughput) for more information on how the labels are used in this context.
[structure-groups]: https://about.gitlab.com/company/team/structure/#groups
[product-categories]: https://about.gitlab.com/handbook/product/categories/
@@ -248,7 +175,7 @@ If a bug seems to fall between two severity labels, assign it to the higher-seve
- Example(s) of ~S1
- Data corruption/loss.
- Security breach.
- - Unable to create an issue or merge request.
+ - Unable to create an issue or merge request.
- Unable to add a comment or thread to the issue or merge request.
- Example(s) of ~S2
- Cannot submit changes through the web IDE but the commandline works.
@@ -259,7 +186,7 @@ If a bug seems to fall between two severity labels, assign it to the higher-seve
- Example(s) of ~S4
- Label colors are incorrect.
- UI elements are not fully aligned.
-
+
## Label for community contributors
Issues that are beneficial to our users, 'nice to haves', that we currently do
@@ -294,7 +221,7 @@ know how difficult the issue is. Additionally:
as suitable for people that have never contributed to GitLab before on the
[Up For Grabs campaign](http://up-for-grabs.net)
- We encourage people that have never contributed to any open source project to
- look for [`Accepting merge requests` issues with a weight of 1][firt-timers]
+ look for [`Accepting merge requests` issues with a weight of 1][first-timers]
If you've decided that you would like to work on an issue, please @-mention
the [appropriate product manager](https://about.gitlab.com/handbook/product/#who-to-talk-to-for-what)
@@ -307,8 +234,8 @@ GitLab team members who apply the ~"Accepting merge requests" label to an issue
should update the issue description with a responsible product manager, inviting
any potential community contributor to @-mention per above.
-[up-for-grabs]: https://gitlab.com/groups/gitlab-org/-/issues?state=opened&label_name[]=Accepting+merge+requests&assignee_id=0&sort=weight
-[firt-timers]: https://gitlab.com/groups/gitlab-org/-/issues?state=opened&label_name[]=Accepting+merge+requests&assignee_id=0&sort=weight&weight=1
+[up-for-grabs]: https://gitlab.com/groups/gitlab-org/-/issues?state=opened&label_name[]=Accepting+merge+requests&assignee_id=None&sort=weight
+[first-timers]: https://gitlab.com/groups/gitlab-org/-/issues?state=opened&label_name[]=Accepting+merge+requests&assignee_id=None&sort=weight&weight=1
## Issue triaging
diff --git a/doc/development/contributing/merge_request_workflow.md b/doc/development/contributing/merge_request_workflow.md
index 3325d3e074e..4e9c5c81379 100644
--- a/doc/development/contributing/merge_request_workflow.md
+++ b/doc/development/contributing/merge_request_workflow.md
@@ -103,7 +103,8 @@ If you would like quick feedback on your merge request feel free to mention some
from the [core team](https://about.gitlab.com/community/core-team/) or one of the
[merge request coaches](https://about.gitlab.com/team/). When having your code reviewed
and when reviewing merge requests, please keep the [code review guidelines](../code_review.md)
-in mind.
+in mind. And if your code also makes changes to the database, or does expensive queries,
+check the [database review guidelines](../database_review.md).
### Keep it simple
@@ -142,6 +143,37 @@ If the guidelines are not met, the MR will not pass the
[Danger checks](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/danger/commit_messages/Dangerfile).
For more information see [How to Write a Git Commit Message](https://chris.beams.io/posts/git-commit/).
+Here is an example commit message template that embodies the above and can be used on your machine (see [how to apply a template](https://codeinthehole.com/tips/a-useful-template-for-commit-messages/)):
+
+```text
+# (If applied, this commit will...) <subject> (Max 50 char)
+# |<---- Using a Maximum Of 50 Characters ---->|
+
+
+# Explain why this change is being made
+# |<---- Try To Limit Each Line to a Maximum Of 72 Characters ---->|
+
+# Provide links or keys to any relevant tickets, articles or other resources
+# Use issues and merge requests' full URLs instead of short references,
+# as they are displayed as plain text outside of GitLab
+
+# --- COMMIT END ---
+# --------------------
+# Remember to
+# Capitalize the subject line
+# Use the imperative mood in the subject line
+# Do not end the subject line with a period
+# Subject must contain at least 3 words
+# Separate subject from body with a blank line
+# Commits that change 30 or more lines across at least 3 files must
+# describe these changes in the commit body
+# Do not use Emojis
+# Use the body to explain what and why vs. how
+# Can use multiple lines with "-" for bullet points in body
+# For more information: https://chris.beams.io/posts/git-commit/
+# --------------------
+```
+
## Contribution acceptance criteria
To make sure that your merge request can be approved, please ensure that it meets
diff --git a/doc/development/contributing/style_guides.md b/doc/development/contributing/style_guides.md
index 5c6ea1f469d..7832850a9f0 100644
--- a/doc/development/contributing/style_guides.md
+++ b/doc/development/contributing/style_guides.md
@@ -23,6 +23,7 @@
1. Code should be written in [US English][us-english]
1. [Go](../go_guide/index.md)
1. [Python](../python_guide/index.md)
+1. [Shell scripting](../shell_scripting_guide/index.md)
This is also the style used by linting tools such as
[RuboCop](https://github.com/rubocop-hq/rubocop) and [Hound CI](https://houndci.com).
diff --git a/doc/development/database_debugging.md b/doc/development/database_debugging.md
index 0311eda1ff1..eb3b227473b 100644
--- a/doc/development/database_debugging.md
+++ b/doc/development/database_debugging.md
@@ -9,31 +9,31 @@ An easy first step is to search for your error in Slack or google "GitLab (my er
Available `RAILS_ENV`
- - `production` (generally not for your main GDK db, but you may need this for e.g. omnibus)
- - `development` (this is your main GDK db)
- - `test` (used for tests like rspec)
+- `production` (generally not for your main GDK db, but you may need this for e.g. omnibus)
+- `development` (this is your main GDK db)
+- `test` (used for tests like rspec)
## Nuke everything and start over
If you just want to delete everything and start over with an empty DB (~1 minute):
- - `bundle exec rake db:reset RAILS_ENV=development`
+- `bundle exec rake db:reset RAILS_ENV=development`
If you just want to delete everything and start over with dummy data (~40 minutes). This also does `db:reset` and runs DB-specific migrations:
- - `bundle exec rake dev:setup RAILS_ENV=development`
+- `bundle exec rake dev:setup RAILS_ENV=development`
If your test DB is giving you problems, it is safe to nuke it because it doesn't contain important data:
- - `bundle exec rake db:reset RAILS_ENV=test`
+- `bundle exec rake db:reset RAILS_ENV=test`
## Migration wrangling
- - `bundle exec rake db:migrate RAILS_ENV=development`: Execute any pending migrations that you may have picked up from a MR
- - `bundle exec rake db:migrate:status RAILS_ENV=development`: Check if all migrations are `up` or `down`
- - `bundle exec rake db:migrate:down VERSION=20170926203418 RAILS_ENV=development`: Tear down a migration
- - `bundle exec rake db:migrate:up VERSION=20170926203418 RAILS_ENV=development`: Set up a migration
- - `bundle exec rake db:migrate:redo VERSION=20170926203418 RAILS_ENV=development`: Re-run a specific migration
+- `bundle exec rake db:migrate RAILS_ENV=development`: Execute any pending migrations that you may have picked up from a MR
+- `bundle exec rake db:migrate:status RAILS_ENV=development`: Check if all migrations are `up` or `down`
+- `bundle exec rake db:migrate:down VERSION=20170926203418 RAILS_ENV=development`: Tear down a migration
+- `bundle exec rake db:migrate:up VERSION=20170926203418 RAILS_ENV=development`: Set up a migration
+- `bundle exec rake db:migrate:redo VERSION=20170926203418 RAILS_ENV=development`: Re-run a specific migration
## Manually access the database
@@ -45,12 +45,12 @@ bundle exec rails dbconsole RAILS_ENV=development
bundle exec rails db RAILS_ENV=development
```
- - `\q`: Quit/exit
- - `\dt`: List all tables
- - `\d+ issues`: List columns for `issues` table
- - `CREATE TABLE board_labels();`: Create a table called `board_labels`
- - `SELECT * FROM schema_migrations WHERE version = '20170926203418';`: Check if a migration was run
- - `DELETE FROM schema_migrations WHERE version = '20170926203418';`: Manually remove a migration
+- `\q`: Quit/exit
+- `\dt`: List all tables
+- `\d+ issues`: List columns for `issues` table
+- `CREATE TABLE board_labels();`: Create a table called `board_labels`
+- `SELECT * FROM schema_migrations WHERE version = '20170926203418';`: Check if a migration was run
+- `DELETE FROM schema_migrations WHERE version = '20170926203418';`: Manually remove a migration
## FAQ
diff --git a/doc/development/database_review.md b/doc/development/database_review.md
new file mode 100644
index 00000000000..3f1b359cb0b
--- /dev/null
+++ b/doc/development/database_review.md
@@ -0,0 +1,134 @@
+# Database Review Guidelines
+
+This page is specific to database reviews. Please refer to our
+[code review guide](code_review.md) for broader advice and best
+practices for code review in general.
+
+## General process
+
+A database review is required for:
+
+- Changes that touch the database schema or perform data migrations,
+ including files in:
+ - `db/`
+ - `lib/gitlab/background_migration/`
+- Changes to the database tooling, e.g.:
+ - migration or ActiveRecord helpers in `lib/gitlab/database/`
+ - load balancing
+- Changes that produce SQL queries that are beyond the obvious. It is
+ generally up to the author of a merge request to decide whether or
+ not complex queries are being introduced and if they require a
+ database review.
+
+A database reviewer is expected to look out for obviously complex
+queries in the change and review them more closely. If the author does not
+point out specific queries for review and there are no obviously
+complex queries, it is enough to concentrate on reviewing the
+migration only.
+
+It is preferable to review queries in SQL form, and it is generally
+acceptable to ask the author to translate any ActiveRecord queries into
+SQL for review.
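+
+For example, an author can print the SQL for an ActiveRecord relation with `#to_sql`
+(a minimal sketch; the relation shown is only an illustration, not a query from a real MR):
+
+```bash
+# Print the SQL that an ActiveRecord relation would execute
+bundle exec rails runner "puts Issue.where(project_id: 13083).to_sql"
+```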
+
+### Roles and process
+
+A Merge Request author's role is to:
+
+- Decide whether a database review is needed.
+- If database review is needed, add the ~database label.
+- Use the [database changes](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/.gitlab/merge_request_templates/Database%20changes.md)
+ merge request template, or include the appropriate items in the MR description.
+
+A database **reviewer**'s role is to:
+
+- Perform a first-pass review on the MR and suggest improvements to the author.
+- Once satisfied, relabel the MR with ~"database::reviewed", approve it, and
+ reassign the MR to the database **maintainer** suggested by Reviewer
+ Roulette.
+
+A database **maintainer**'s role is to:
+
+- Perform the final database review on the MR.
+- Discuss further improvements or other relevant changes with the
+ database reviewer and the MR author.
+- Finally, approve the MR and relabel it with ~"database::approved".
+- Merge the MR if no other approvals are pending, or pass it on to
+ other maintainers as required (frontend, backend, docs).
+
+### Distributing review workload
+
+Review workload is distributed using [reviewer roulette](code_review.md#reviewer-roulette)
+([example](https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/25181#note_147551725)).
+The MR author should then co-assign the suggested database
+**reviewer**. When they give their sign-off, they will hand over to
+the suggested database **maintainer**.
+
+If reviewer roulette didn't suggest a database reviewer & maintainer,
+make sure you have applied the ~database label and rerun the
+`danger-review` CI job, or pick someone from the
+[`@gl-database` team](https://gitlab.com/groups/gl-database/-/group_members).
+
+### How to prepare for speedy database reviews
+
+In order to make reviewing easier and therefore faster, please consider preparing a comment
+and details for a database reviewer:
+
+- Provide queries in SQL form rather than ActiveRecord.
+- Format any queries with a SQL query formatter, for example with [sqlformat.darold.net](http://sqlformat.darold.net).
+- Consider providing query plans via a link to [explain.depesz.com](https://explain.depesz.com) or another tool instead of textual form.
+- For query changes, it is best to provide the SQL query along with a plan *before* and *after* the change. This helps to spot differences quickly.
+- When providing query plans, make sure to use good parameter values, so that the query executed is a good example and also hits enough data. Usually, the `gitlab-org` namespace (`namespace_id = 9970`) and the `gitlab-org/gitlab-ce` project (`project_id = 13083`) provide enough data to serve as a good example (see the sketch below).
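+
+A minimal sketch for producing such a plan from the command line (the query is only an
+illustration; for realistic timings, prefer production-scale data, for example via
+[chatops](understanding_explain_plans.md#chatops)):
+
+```bash
+# Produce a textual query plan that can be pasted into explain.depesz.com
+bundle exec rails runner "
+  result = ActiveRecord::Base.connection.execute(
+    'EXPLAIN (ANALYZE, BUFFERS) SELECT COUNT(*) FROM issues WHERE project_id = 13083'
+  )
+  puts result.values.flatten
+"
+```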
+
+### How to review for database
+
+- Check migrations
+ - Review relational modeling and design choices
+ - Review migrations follow [database migration style guide](migration_style_guide.md),
+ for example
+ - [Check ordering of columns](ordering_table_columns.md)
+ - [Check indexes are present for foreign keys](migration_style_guide.md#adding-foreign-key-constraints)
+ - Ensure that migrations execute in a transaction or only contain
+ concurrent index/foreign key helpers (with transactions disabled)
+ - Check consistency with `db/schema.rb` and that migrations are [reversible](migration_style_guide.md#reversibility) (see the sketch after this list)
+ - Check query timing (if any): queries executed in a migration
+ need to fit comfortably within `15s` - preferably much less than that - on GitLab.com.
+- Check [background migrations](background_migrations.md):
+ - For data migrations, establish a time estimate for execution
+ - They should only be used when migrating data in larger tables.
+ - If a single `update` takes less than `1s`, the query can be placed
+ directly in a regular migration (inside `db/migrate`).
+ - Review queries (for example, make sure batch sizes are fine)
+ - Establish a time estimate for execution
+ - Because execution time can be longer than for a regular migration,
+ it's suggested to treat background migrations as post migrations:
+ place them in `db/post_migrate` instead of `db/migrate`. Keep in mind
+ that post migrations are executed post-deployment in production.
+- Check [timing guidelines for migrations](#timing-guidelines-for-migrations)
+- Query performance
+ - Check for any obviously complex queries and queries the author specifically
+ points out for review (if any)
+ - If not present yet, ask the author to provide SQL queries and query plans
+ (e.g. by using [chatops](understanding_explain_plans.md#chatops) or direct
+ database access)
+ - For given queries, review parameters regarding data distribution
+ - [Check query plans](understanding_explain_plans.md) and suggest improvements
+ to queries (changing the query, schema or adding indexes and similar)
+ - General guideline is for queries to come in below 100ms execution time
+ - If queries rely on prior migrations that are not present yet on production
+ (e.g. indexes, columns), you can use a [one-off instance from the restore
+ pipeline](https://ops.gitlab.net/gitlab-com/gl-infra/gitlab-restore/postgres-gprd)
+ in order to establish a proper testing environment.
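+
+As a quick local sanity check for some of the items above (for example, reversibility
+and `db/schema.rb` consistency), the migration can be rolled forward and back again;
+this is only a sketch and assumes the version number of the migration under review:
+
+```bash
+# Run pending migrations, roll the newest one back and forward again,
+# then confirm the committed db/schema.rb is unchanged
+bundle exec rake db:migrate RAILS_ENV=development
+bundle exec rake db:migrate:redo VERSION=20170926203418 RAILS_ENV=development
+git diff db/schema.rb
+```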
+
+### Timing guidelines for migrations
+
+In general, migrations for a single deploy shouldn't take longer than
+1 hour for GitLab.com. The following guidelines are not hard rules; they were
+estimated to keep migration timing to a minimum.
+
+NOTE: **Note:** Keep in mind that all runtimes should be measured against GitLab.com.
+
+| Migration Type | Recommended Execution Time | Notes |
+|----|----|---|
+| Regular migrations on `db/migrate` | `3 minutes` | A valid exception is index creation, as this can take a long time. |
+| Post migrations on `db/post_migrate` | `10 minutes` | |
+| Background migrations | --- | Since these are suitable for larger tables, it's not possible to set a precise timing guideline; however, any query must stay well below `10s` of execution time. |
diff --git a/doc/development/diffs.md b/doc/development/diffs.md
index 5655398c886..ac0b8555360 100644
--- a/doc/development/diffs.md
+++ b/doc/development/diffs.md
@@ -133,4 +133,4 @@ File diff will be suppressed (technically different from collapsed, but behaves
Diff Viewers, which can be found on `models/diff_viewer/*` are classes used to map metadata about each type of Diff File. It has information
whether it's a binary, which partial should be used to render it or which File extensions this class accounts for.
-`DiffViewer::Base` validates _blobs_ (old and new versions) content, extension and file type in order to check if it can be rendered. \ No newline at end of file
+`DiffViewer::Base` validates _blobs_ (old and new versions) content, extension and file type in order to check if it can be rendered.
diff --git a/doc/development/documentation/feature-change-workflow.md b/doc/development/documentation/feature-change-workflow.md
index ca29353ecbe..ac93ada5a4b 100644
--- a/doc/development/documentation/feature-change-workflow.md
+++ b/doc/development/documentation/feature-change-workflow.md
@@ -121,27 +121,27 @@ All reviewers can help ensure accuracy, clarity, completeness, and adherence to
- **Prior to merging**, documentation changes committed by the developer must be reviewed by:
- 1. **The code reviewer** for the MR, to confirm accuracy, clarity, and completeness.
- 1. Optionally: Others involved in the work, such as other devs or the PM.
- 1. Optionally: The technical writer for the DevOps stage. If not prior to merging, the technical writer will review after the merge.
- This helps us ensure that the developer has time to merge good content by the freeze, and that it can be further refined by the release, if needed.
- - To decide whether to request this review before the merge, consider the amount of time left before the code freeze, the size of the change,
- and your degree of confidence in having users of an RC use your docs as written.
- - Pre-merge tech writer reviews should be most common when the code is complete well in advance of the freeze and/or for larger documentation changes.
- - You can request a review and if there is not sufficient time to complete it prior to the freeze,
- the maintainer can merge the current doc changes (if complete) and create a follow-up doc review issue.
- - The technical writer can also help decide what docs to merge before the freeze and whether to work on further changes in a follow up MR.
- - **To request a pre-merge technical writer review**, assign the writer listed for the applicable [DevOps stage](https://about.gitlab.com/handbook/product/categories/#devops-stages).
- - **To request a post-merge technical writer review**, [create an issue for one using the Doc Review template](https://gitlab.com/gitlab-org/gitlab-ce/issues/new?issuable_template=Doc%20Review) and link it from the MR that makes the doc change.
- 1. **The maintainer** who is assigned to merge the MR, to verify clarity, completeness, and quality, to the best of their ability.
+ 1. **The code reviewer** for the MR, to confirm accuracy, clarity, and completeness.
+ 1. Optionally: Others involved in the work, such as other devs or the PM.
+ 1. Optionally: The technical writer for the DevOps stage. If not prior to merging, the technical writer will review after the merge.
+ This helps us ensure that the developer has time to merge good content by the freeze, and that it can be further refined by the release, if needed.
+ - To decide whether to request this review before the merge, consider the amount of time left before the code freeze, the size of the change,
+ and your degree of confidence in having users of an RC use your docs as written.
+ - Pre-merge tech writer reviews should be most common when the code is complete well in advance of the freeze and/or for larger documentation changes.
+ - You can request a review and if there is not sufficient time to complete it prior to the freeze,
+ the maintainer can merge the current doc changes (if complete) and create a follow-up doc review issue.
+ - The technical writer can also help decide what docs to merge before the freeze and whether to work on further changes in a follow up MR.
+ - **To request a pre-merge technical writer review**, assign the writer listed for the applicable [DevOps stage](https://about.gitlab.com/handbook/product/categories/#devops-stages).
+ - **To request a post-merge technical writer review**, [create an issue for one using the Doc Review template](https://gitlab.com/gitlab-org/gitlab-ce/issues/new?issuable_template=Doc%20Review) and link it from the MR that makes the doc change.
+ 1. **The maintainer** who is assigned to merge the MR, to verify clarity, completeness, and quality, to the best of their ability.
- Upon merging, if a technical writer review has not been performed and there is not yet a linked issue for a follow-up review, the maintainer should [create an issue using the Doc Review template](https://gitlab.com/gitlab-org/gitlab-ce/issues/new?issuable_template=Doc%20Review), link it from the MR, and
mention the original MR author in the new issue. Alternatively, the maintainer can ask the MR author to create and link this issue before the MR is merged.
- After merging, documentation changes are reviewed by:
- 1. The technical writer--**if** their review was not performed prior to the merge.
- 2. Optionally: by the PM (for accuracy and to ensure it's consistent with the vision for how the product will be used).
+ 1. The technical writer -- **if** their review was not performed prior to the merge.
+ 1. Optionally: by the PM (for accuracy and to ensure it's consistent with the vision for how the product will be used).
Any party can raise the item to the PM for review at any point: the dev, the technical writer, or the PM, who can request/plan a review at the outset.
### Technical Writer role
diff --git a/doc/development/documentation/improvement-workflow.md b/doc/development/documentation/improvement-workflow.md
index a12c3d5ea7b..80fbd4b6427 100644
--- a/doc/development/documentation/improvement-workflow.md
+++ b/doc/development/documentation/improvement-workflow.md
@@ -52,9 +52,9 @@ To request a post-merge review, [create an issue for one using the Doc Review te
**3. Maintainer**
1. Review by assigned maintainer, who can always request/require the above reviews. Maintainer review can occur before or after a technical writer review.
-2. Ensure a release milestone of the format XX.Y is set. If the freeze for that release has begun, add the label `pick into <XX.Y>` unless this change is not required for the release. In that case, simply change the milestone.
-3. If EE and CE MRs exist, merge the EE MR first, then the CE MR.
-4. After merging, if there has not been a technical writer review and an issue for a follow-up review was not already created and linked from the MR, [create the issue using the Doc Review template](https://gitlab.com/gitlab-org/gitlab-ce/issues/new?issuable_template=Doc%20Review) and link it from the MR.
+1. Ensure a release milestone of the format XX.Y is set. If the freeze for that release has begun, add the label `pick into <XX.Y>` unless this change is not required for the release. In that case, simply change the milestone.
+1. If EE and CE MRs exist, merge the EE MR first, then the CE MR.
+1. After merging, if there has not been a technical writer review and an issue for a follow-up review was not already created and linked from the MR, [create the issue using the Doc Review template](https://gitlab.com/gitlab-org/gitlab-ce/issues/new?issuable_template=Doc%20Review) and link it from the MR.
## Other ways to help
diff --git a/doc/development/documentation/index.md b/doc/development/documentation/index.md
index 36a2f47a55b..c9ae00d148a 100644
--- a/doc/development/documentation/index.md
+++ b/doc/development/documentation/index.md
@@ -57,35 +57,22 @@ See the [Structure](styleguide.md#structure) section of the [Documentation Style
We currently maintain two sets of docs: one in the
[gitlab-ce](https://gitlab.com/gitlab-org/gitlab-ce/tree/master/doc) repo and
one in [gitlab-ee](https://gitlab.com/gitlab-org/gitlab-ee/tree/master/doc).
-They are similar, and most pages are identical, but they are different repositories.
-With the single codebase effort, we want to make those two sets identical, so when the
-time comes to have only one codebase, we'll be ready.
-
-Here are some links to get you up to speed with the current effort:
-
-- [CE/EE codebases blueprint](https://about.gitlab.com/handbook/engineering/infrastructure/blueprint/ce-ee-codebases/)
-- [CE/EE codebases merge design](https://about.gitlab.com/handbook/engineering/infrastructure/design/merge-ce-ee-codebases/)
-- [Single docs codebase epic](https://gitlab.com/groups/gitlab-org/-/epics/199)
-- [Issue board of related issues](https://gitlab.com/groups/gitlab-org/-/boards/981090?&label_name[]=Documentation&label_name[]=single%20codebase)
-- [Related merge requests](https://gitlab.com/groups/gitlab-org/-/merge_requests?scope=all&utf8=%E2%9C%93&state=all&label_name[]=Documentation&label_name[]=single%20codebase)
-- [Visualize the existing diffs](https://leipert-projects.gitlab.io/is-gitlab-pretty-yet/diff/?search=%5Edoc)
+Their content is identical, but they are different repositories. When the
+time comes to have only one codebase for the GitLab project, we'll be ready.
### CE first
-After a given documentation path is aligned across CE and EE, all merge requests
-affecting that path must be submitted to CE, regardless of the content it has.
-This means that:
+All merge requests for documentation must be submitted to CE, regardless of their
+content. This means that:
-- For **EE-only docs changes**, you only have to submit a CE MR.
+- For **EE-only docs changes**, you only have to submit an MR in the CE project.
- For **EE-only features** that touch both the code and the docs, you have to submit
- an EE MR containing all changes, and a CE MR containing only the docs changes
+ an EE MR containing all code changes, and a CE MR containing only the docs changes
and without a changelog entry.
This might seem like a duplicate effort, but it's only for the short term.
-A list of the already aligned docs can be found in
-[the epic description](https://gitlab.com/groups/gitlab-org/-/epics/199#ee-specific-lines-check).
-Since the docs will be combined, it's crucial to add the relevant
+Since the CE and EE docs are combined, it's crucial to add the relevant
[product badges](styleguide.md#product-badges) for all EE documentation, so that
we can discern which features belong to which tier.
@@ -94,19 +81,9 @@ we can discern which features belong to which tier.
There's a special test in place
([`ee_specific_check.rb`](https://gitlab.com/gitlab-org/gitlab-ee/blob/master/scripts/ee_specific_check/ee_specific_check.rb)),
which, among others, checks and prevents creating/editing new files and directories
-in EE under `doc/`.
-
-We have a long list of documentation paths that are either whitelisted or not.
-Paths in the whitelist (not commented out) will not be subject to the test,
-which means you are allowed to create/change docs content in EE for the time
-being. The goal is to not have any doc whitelisted.
-
-At the time of this writing, the only items left to be aligned are the API docs:
-
-- `doc/api/*` ([issue](https://gitlab.com/gitlab-org/gitlab-ce/issues/60045) / [merge request](https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/27491))
-
-Eventually, once all docs are aligned, we'll remove any doc reference from that
-script, so it catches everything.
+in EE under `doc/`. This test fails when changes to anything under `doc/` are submitted
+in an EE MR. To pass the test, remove the docs changes from the EE MR, and
+[submit them in CE](#ce-first).
## Changing document location
@@ -134,18 +111,18 @@ For example, if you were to move `doc/workflow/lfs/lfs_administration.md` to
1. Copy `doc/workflow/lfs/lfs_administration.md` to `doc/administration/lfs.md`
1. Replace the contents of `doc/workflow/lfs/lfs_administration.md` with:
- ```md
- This document was moved to [another location](../../administration/lfs.md).
- ```
+ ```md
+ This document was moved to [another location](../../administration/lfs.md).
+ ```
1. Find and replace any occurrences of the old location with the new one.
A quick way to find them is to use `git grep`. First go to the root directory
where you cloned the `gitlab-ce` repository and then do:
- ```sh
- git grep -n "workflow/lfs/lfs_administration"
- git grep -n "lfs/lfs_administration"
- ```
+ ```sh
+ git grep -n "workflow/lfs/lfs_administration"
+ git grep -n "lfs/lfs_administration"
+ ```
NOTE: **Note:**
If the document being moved has any Disqus comments on it, there are extra steps
@@ -193,7 +170,7 @@ Disqus uses an identifier per page, and for docs.gitlab.com, the page identifier
is configured to be the page URL. Therefore, when we change the document location,
we need to preserve the old URL as the same Disqus identifier.
-To do that, add to the frontmatter the variable `redirect_from`,
+To do that, add to the frontmatter the variable `disqus_identifier`,
using the old URL as value. For example, let's say I moved the document
available under `https://docs.gitlab.com/my-old-location/README.html` to a new location,
`https://docs.gitlab.com/my-new-location/index.html`.
@@ -202,11 +179,11 @@ Into the **new document** frontmatter add the following:
```yaml
---
-redirect_from: 'https://docs.gitlab.com/my-old-location/README.html'
+disqus_identifier: 'https://docs.gitlab.com/my-old-location/README.html'
---
```
-Note: it is necessary to include the file name in the `redirect_from` URL,
+Note: it is necessary to include the file name in the `disqus_identifier` URL,
even if it's `index.html` or `README.html`.
## Branch naming
@@ -319,45 +296,45 @@ You can combine one or more of the following:
1. **Linking to an anchor link.** Use `anchor` as part of the `help_page_path`
method:
- ```haml
- = link_to 'Help page', help_page_path('user/permissions', anchor: 'anchor-link')
- ```
+ ```haml
+ = link_to 'Help page', help_page_path('user/permissions', anchor: 'anchor-link')
+ ```
1. **Opening links in a new tab.** This should be the default behavior:
- ```haml
- = link_to 'Help page', help_page_path('user/permissions'), target: '_blank'
- ```
+ ```haml
+ = link_to 'Help page', help_page_path('user/permissions'), target: '_blank'
+ ```
1. **Linking to a circle icon.** Usually used in settings where a long
description cannot be used, like near checkboxes. You can basically use
any font awesome icon, but prefer the `question-circle`:
- ```haml
- = link_to icon('question-circle'), help_page_path('user/permissions')
- ```
+ ```haml
+ = link_to icon('question-circle'), help_page_path('user/permissions')
+ ```
1. **Using a button link.** Useful in places where text would be out of context
with the rest of the page layout:
- ```haml
- = link_to 'Help page', help_page_path('user/permissions'), class: 'btn btn-info'
- ```
+ ```haml
+ = link_to 'Help page', help_page_path('user/permissions'), class: 'btn btn-info'
+ ```
1. **Using links inline of some text.**
- ```haml
- Description to #{link_to 'Help page', help_page_path('user/permissions')}.
- ```
+ ```haml
+ Description to #{link_to 'Help page', help_page_path('user/permissions')}.
+ ```
1. **Adding a period at the end of the sentence.** Useful when you don't want
the period to be part of the link:
- ```haml
- = succeed '.' do
- Learn more in the
- = link_to 'Help page', help_page_path('user/permissions')
- ```
+ ```haml
+ = succeed '.' do
+ Learn more in the
+ = link_to 'Help page', help_page_path('user/permissions')
+ ```
### GitLab `/help` tests
@@ -488,15 +465,17 @@ Currently, the following tests are in place:
that all cURL examples in API docs use the full switches. It's recommended
to [check locally](#previewing-the-changes-live) before pushing to GitLab by executing the command
`bundle exec nanoc check internal_links` on your local
- [`gitlab-docs`](https://gitlab.com/gitlab-org/gitlab-docs) directory.
+ [`gitlab-docs`](https://gitlab.com/gitlab-org/gitlab-docs) directory. In addition,
+ `docs-lint` also runs [markdownlint](styleguide.md#markdown-rules) to ensure the
+ markdown is consistent across all documentation.
1. [`ee_compat_check`](../automatic_ce_ee_merge.md#avoiding-ce-ee-merge-conflicts-beforehand) (runs on CE only):
- When you submit a merge request to GitLab Community Edition (CE),
- there is this additional job that runs against Enterprise Edition (EE)
- and checks if your changes can apply cleanly to the EE codebase.
- If that job fails, read the instructions in the job log for what to do next.
- As CE is merged into EE once a day, it's important to avoid merge conflicts.
- Submitting an EE-equivalent merge request cherry-picking all commits from CE to EE is
- essential to avoid them.
+ When you submit a merge request to GitLab Community Edition (CE),
+ there is this additional job that runs against Enterprise Edition (EE)
+ and checks if your changes can apply cleanly to the EE codebase.
+ If that job fails, read the instructions in the job log for what to do next.
+ As CE is merged into EE once a day, it's important to avoid merge conflicts.
+ Submitting an EE-equivalent merge request cherry-picking all commits from CE to EE is
+ essential to avoid them.
1. [`ee-files-location-check`/`ee-specific-lines-check`](#ee-specific-lines-check) (runs on EE only):
This test ensures that no new files/directories are created/changed in EE.
All docs should be submitted in CE instead, regardless the tier they are on.
@@ -559,15 +538,16 @@ A file with `proselint` configuration must be placed in a
#### `markdownlint`
`markdownlint` checks that certain rules ([example](https://github.com/DavidAnson/markdownlint/blob/master/README.md#rules--aliases))
- are followed for Markdown syntax.
- Our [Documentation Style Guide](styleguide.md) and [Markdown Guide](https://about.gitlab.com/handbook/product/technical-writing/markdown-guide/)
- elaborate on which choices must be made when selecting Markdown syntax for
- GitLab documentation. This tool helps catch deviations from those guidelines.
+are followed for Markdown syntax. Our [Documentation Style Guide](styleguide.md) and
+[Markdown Guide](https://about.gitlab.com/handbook/product/technical-writing/markdown-guide/)
+elaborate on which choices must be made when selecting Markdown syntax for GitLab
+documentation. This tool helps catch deviations from those guidelines, and matches the
+tests run on the documentation by [`docs-lint`](#testing).
`markdownlint` can be used [on the command line](https://github.com/igorshubovych/markdownlint-cli#markdownlint-cli--),
- either on a single Markdown file or on all Markdown files in a project. For example, to run
- `markdownlint` on all documentation in the [`gitlab-ce` project](https://gitlab.com/gitlab-org/gitlab-ce),
- run the following commands from within the `gitlab-ce` project:
+either on a single Markdown file or on all Markdown files in a project. For example, to run
+`markdownlint` on all documentation in the [`gitlab-ce` project](https://gitlab.com/gitlab-org/gitlab-ce),
+run the following commands from within the `gitlab-ce` project:
```sh
cd doc
@@ -597,7 +577,7 @@ The following sample `markdownlint` configuration modifies the available default
"line-length": false,
"no-trailing-punctuation": false,
"ol-prefix": { "style": "one" },
- "blanks-around-fences": false,
+ "blanks-around-fences": true,
"no-inline-html": {
"allowed_elements": [
"table",
@@ -612,11 +592,15 @@ The following sample `markdownlint` configuration modifies the available default
"a",
"strong",
"i",
- "div"
+ "div",
+ "b"
]
},
"hr-style": { "style": "---" },
- "fenced-code-language": false
+ "code-block-style": { "style": "fenced" },
+ "fenced-code-language": false,
+ "no-duplicate-header": { "allow_different_nesting": true },
+ "commands-show-output": false
}
```
diff --git a/doc/development/documentation/styleguide.md b/doc/development/documentation/styleguide.md
index 1ca965f8ae9..c1e3eb9680b 100644
--- a/doc/development/documentation/styleguide.md
+++ b/doc/development/documentation/styleguide.md
@@ -36,17 +36,17 @@ For the Troubleshooting sections, people in GitLab Support can merge additions t
Include any media types/sources if the content is relevant to readers. You can freely include or link presentations, diagrams, videos, etc.; no matter who it was originally composed for, if it is helpful to any of our audiences, we can include it.
- - If you use an image that has a separate source file (for example, a vector or diagram format), link the image to the source file so that it may be reused or updated by anyone.
- - Do not copy and paste content from other sources unless it is a limited quotation with the source cited. Typically it is better to either rephrase relevant information in your own words or link out to the other source.
+- If you use an image that has a separate source file (for example, a vector or diagram format), link the image to the source file so that it may be reused or updated by anyone.
+- Do not copy and paste content from other sources unless it is a limited quotation with the source cited. Typically it is better to either rephrase relevant information in your own words or link out to the other source.
### No special types
-In the software industry, it is a best practice to organize documentatioin in different types. For example, [Divio recommends](https://www.divio.com/blog/documentation/):
+In the software industry, it is a best practice to organize documentation in different types. For example, [Divio recommends](https://www.divio.com/blog/documentation/):
1. Tutorials
-2. How-to guides
-3. Explanation
-4. Reference (for example, a glossary)
+1. How-to guides
+1. Explanation
+1. Reference (for example, a glossary)
At GitLab, we have so many product changes in our monthly releases that we can't afford to continually update multiple types of information.
If we have multiple types, the information will become outdated. Therefore, we have a [single template](structure.md) for documentation.
@@ -107,6 +107,22 @@ Hard-coded HTML is valid, although it's discouraged to be used while we have `/h
- Special styling is required.
- Reviewed and approved by a technical writer.
+### Markdown Rules
+
+GitLab ensures that the Markdown used across all documentation is consistent, as
+well as easy to review and maintain, by testing documentation changes with
+[Markdownlint (mdl)](https://github.com/markdownlint/markdownlint). This lint test
+checks many common problems with Markdown, and fails when any document has an issue
+with Markdown formatting that may cause the page to render incorrectly within GitLab.
+It will also fail when a document is using non-standard Markdown (which may render
+correctly, but is not the current standard in GitLab documentation).
+
+Each formatting issue that mdl checks has an associated [rule](https://github.com/markdownlint/markdownlint/blob/master/docs/RULES.md),
+and the rules that are currently enabled for GitLab documentation are listed in the
+[`.mdlrc.style`](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/.mdlrc.style) file.
+Configuration options are set in the [`.mdlrc`](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/.mdlrc)
+file.
+
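+For illustration only (the rule and option shown are assumptions, not the actual
+GitLab configuration), a style file for mdl is a small Ruby DSL that enables and
+tunes individual rules:
+
+```ruby
+# Hypothetical excerpt of an mdl style file such as `.mdlrc.style`.
+all                           # Start from the complete rule set...
+exclude_rule 'MD013'          # ...then disable specific rules (here: line length).
+rule 'MD004', style: :dash    # Configure a rule (here: dashes for unordered lists).
+```
+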
## Structure
### Organize by topic, not by type
@@ -142,9 +158,9 @@ The table below shows what kind of documentation goes where.
### Working with directories and files
1. When you create a new directory, always start with an `index.md` file.
- Do not use another file name and **do not** create `README.md` files.
+ Do not use another file name and **do not** create `README.md` files.
1. **Do not** use special characters and spaces, or capital letters in file names,
- directory names, branch names, and anything that generates a path.
+ directory names, branch names, and anything that generates a path.
1. When creating a new document and it has more than one word in its name,
make sure to use underscores instead of spaces or dashes (`-`). For example,
a proper naming would be `import_projects_from_github.md`. The same rule
@@ -221,14 +237,14 @@ Do not include the same information in multiple places. [Link to a SSOT instead.
- Use sentence case for titles, headings, labels, menu items, and buttons.
- Insert an empty line between different markups (e.g., after every paragraph, header, list, etc). Example:
- ```md
- ## Header
+ ```md
+ ## Header
- Paragraph.
+ Paragraph.
- - List item 1
- - List item 2
- ```
+ - List item 1
+ - List item 2
+ ```
### Tables overlapping the TOC
@@ -267,32 +283,52 @@ Check specific punctuation rules for [list items](#list-items) below.
## List items
-- Always start list items with a capital letter.
+- Always start list items with a capital letter, unless they are parameters or commands
+ that are in backticks, or similar.
- Always leave a blank line before and after a list.
-- Begin a line with spaces (not tabs) to denote a subitem.
-- To nest subitems, indent them with two spaces.
-- To nest code blocks, indent them with four spaces.
-- Only use ordered lists when their items describe a sequence of steps to follow.
+- Begin a line with spaces (not tabs) to denote a [nested subitem](#nesting-inside-a-list-item).
+- Only use ordered lists when their items describe a sequence of steps to follow:
+
+ Do:
+
+ These are the steps to do something:
+
+ 1. First, do step 1
+ 1. Then, do step 2
+ 1. Finally, do step 3
+
+ Don't:
+
+ This is a list of different features:
+
+ 1. Feature 1
+ 1. Feature 2
+ 1. Feature 3
**Markup:**
- Use dashes (`-`) for unordered lists instead of asterisks (`*`).
-- Use the number one (`1`) for each item in an ordered list.
- When rendered, the list items will appear with sequential numbering.
+- Prefix `1.` to each item in an ordered list.
+ When rendered, the list items will appear with sequential numbering automatically.
**Punctuation:**
-- Do not add commas (`,`) or semicolons (`;`) to the end of a list item.
-- Only add periods to the end of a list item if the item consists of a complete sentence. The [definition of full sentence](https://www2.le.ac.uk/offices/ld/resources/writing/grammar/grammar-guides/sentence) is: _"a complete sentence always contains a verb, expresses a complete idea, and makes sense standing alone"_.
-- Be consistent throughout the list: if the majority of the items do not end in a period, do not end any of the items in a period, even if they consist of a complete sentence. The opposite is also valid: if the majority of the items end with a period, end all with a period.
+- Do not add commas (`,`) or semicolons (`;`) to the end of list items.
+- Only add periods to the end of a list item if the item consists of a complete sentence.
+ The [definition of full sentence](https://www2.le.ac.uk/offices/ld/resources/writing/grammar/grammar-guides/sentence)
+ is: _"a complete sentence always contains a verb, expresses a complete idea, and makes sense standing alone"_.
+- Be consistent throughout the list: if the majority of the items do not end in a period,
+ do not end any of the items in a period, even if they consist of a complete sentence.
+ The opposite is also valid: if the majority of the items end with a period, end
+ all with a period.
- Separate list items from explanatory text with a colon (`:`). For example:
- ```md
- The list is as follows:
+ ```md
+ The list is as follows:
- - First item: this explains the first item.
- - Second item: this explains the second item.
- ```
+ - First item: this explains the first item.
+ - Second item: this explains the second item.
+ ```
**Examples:**
@@ -314,12 +350,86 @@ Do:
- Let's say this is also a complete sentence.
- Not a complete sentence.
-Don't:
+Don't (third item should have a `.` to match the first and second items):
- Let's say this is a complete sentence.
- Let's say this is also a complete sentence.
- Not a complete sentence
+### Nesting inside a list item
+
+It is possible to nest items under a list item, so that they render with the same indentation
+as the list item. This can be done with:
+
+- [Code blocks](#code-blocks)
+- [Blockquotes](#blockquotes)
+- [Alert boxes](#alert-boxes)
+- [Images](#images)
+
+Items nested in lists should always align with the first character of the list item.
+In unordered lists (using `-`), this means two spaces for each level of indentation:
+
+~~~md
+- Unordered list item 1
+
+ A line nested using 2 spaces to align with the `U` above.
+
+- Unordered list item 2
+
+ > A quote block that will nest
+ > inside list item 2.
+
+- Unordered list item 3
+
+ ```text
+  a code block that will nest inside list item 3
+ ```
+
+- Unordered list item 4
+
+ ![an image that will nest inside list item 4](image.png)
+~~~
+
+For ordered lists, use three spaces for each level of indentation:
+
+~~~md
+1. Ordered list item 1
+
+ A line nested using 3 spaces to align with the `O` above.
+
+1. Ordered list item 2
+
+ > A quote block that will nest
+ > inside list item 2.
+
+1. Ordered list item 3
+
+ ```text
+   a code block that will nest inside list item 3
+ ```
+
+1. Ordered list item 4
+
+ ![an image that will nest inside list item 4](image.png)
+~~~
+
+You can nest full lists inside other lists using the same rules as above. If you wish
+to mix types, that is also possible, as long as you don't mix items at the same level:
+
+```md
+1. Ordered list item one.
+1. Ordered list item two.
+ - Nested unordered list item one.
+ - Nested unordered list item two.
+1. Ordered list item three.
+
+- Unordered list item one.
+- Unordered list item two.
+ 1. Nested ordered list item one.
+ 1. Nested ordered list item two.
+- Unordered list item three.
+```
+
## Quotes
Valid for markdown content only, not for frontmatter entries:
@@ -504,16 +614,16 @@ To embed a video, follow the instructions below and make sure
you have your MR reviewed and approved by a technical writer.
1. Copy the code below and paste it into your markdown file.
- Leave a blank line above and below it. Do NOT edit the code
- (don't remove or add any spaces, etc).
+ Leave a blank line above and below it. Do NOT edit the code
+ (don't remove or add any spaces, etc).
1. On YouTube, visit the video URL you want to display. Copy
- the regular URL from your browser (`https://www.youtube.com/watch?v=VIDEO-ID`)
- and replace the video title and link in the line under `<div class="video-fallback">`.
+ the regular URL from your browser (`https://www.youtube.com/watch?v=VIDEO-ID`)
+ and replace the video title and link in the line under `<div class="video-fallback">`.
1. On YouTube, click **Share**, then **Embed**.
1. Copy the `<iframe>` source (`src`) **URL only**
- (`https://www.youtube.com/embed/VIDEO-ID`),
- and paste it, replacing the content of the `src` field in the
- `iframe` tag.
+ (`https://www.youtube.com/embed/VIDEO-ID`),
+ and paste it, replacing the content of the `src` field in the
+ `iframe` tag.
```html
leave a blank line here
@@ -545,15 +655,16 @@ nicely on different mobile devices.
## Code blocks
-- Always wrap code added to a sentence in inline code blocks (``` ` ```).
+- Always wrap code added to a sentence in inline code blocks (`` ` ``).
E.g., `.gitlab-ci.yml`, `git add .`, `CODEOWNERS`, `only: master`.
File names, commands, entries, and anything that refers to code should be added to code blocks.
To make things easier for the user, always add a full code block for things that can be
useful to copy and paste, as they can easily do it with the button on code blocks.
+- Add a blank line above and below code blocks.
- For regular code blocks, always use a highlighting class corresponding to the
language for better readability. Examples:
- ````md
+ ~~~md
```ruby
Ruby code
```
@@ -563,16 +674,17 @@ nicely on different mobile devices.
```
```md
- Markdown code
+ [Markdown code example](example.md)
```
```text
- Code for which no specific highlighting class is available.
+ Code or text for which no specific highlighting class is available.
```
- ````
+ ~~~
-- To display raw markdown instead of rendered markdown, use four backticks on their own lines around the
- markdown to display. See [example](https://gitlab.com/gitlab-org/gitlab-ce/blob/8c1991b9bb7e3b8d606481fdea316d633cfa5eb7/doc/development/documentation/styleguide.md#L275-287).
+- To display raw markdown instead of rendered markdown, you can use triple backticks
+ with `md`, like the `Markdown code` example above, unless you want to include triple
+ backticks in the code block as well. In that case, use triple tildes (`~~~`) instead.
- For a complete reference on code blocks, check the [Kramdown guide](https://about.gitlab.com/handbook/product/technical-writing/markdown-guide/#code-blocks).
## Alert boxes
@@ -595,7 +707,7 @@ In most cases, content considered for a note should be included:
#### When to use
Use a note when there is a reason that most or all readers who browse the
-section should see the content. That is, if missed, it’s likely to cause
+section should see the content. That is, if missed, it’s likely to cause
major trouble for a minority of users or significant trouble for a majority
of users.
@@ -731,24 +843,24 @@ a helpful link back to how the feature was developed.
- For features that need to declare the GitLab version that the feature was introduced. Text similar
to the following should be added immediately below the heading as a blockquote:
- ```md
- > Introduced in GitLab 11.3.
- ```
+ ```md
+ > Introduced in GitLab 11.3.
+ ```
- Whenever possible, version text should have a link to the issue, merge request, or epic that introduced the feature.
An issue is preferred over a merge request, and a merge request is preferred over an epic. For example:
- ```md
- > [Introduced](<link-to-issue>) in GitLab 11.3.
- ```
+ ```md
+ > [Introduced](<link-to-issue>) in GitLab 11.3.
+ ```
- If the feature is only available in GitLab Enterprise Edition, mention
the [paid tier](https://about.gitlab.com/handbook/marketing/product-marketing/#tiers)
the feature is available in:
- ```md
- > [Introduced](<link-to-issue>) in [GitLab Starter](https://about.gitlab.com/pricing/) 11.3.
- ```
+ ```md
+ > [Introduced](<link-to-issue>) in [GitLab Starter](https://about.gitlab.com/pricing/) 11.3.
+ ```
### Removing version text
@@ -767,10 +879,10 @@ Other text includes deprecation notices and version-specific how-to information.
When a feature is available in EE-only tiers, add the corresponding tier according to the
feature availability:
+- For GitLab Core and GitLab.com Free: `**(CORE)**`.
- For GitLab Starter and GitLab.com Bronze: `**(STARTER)**`.
- For GitLab Premium and GitLab.com Silver: `**(PREMIUM)**`.
- For GitLab Ultimate and GitLab.com Gold: `**(ULTIMATE)**`.
-- For GitLab Core and GitLab.com Free: `**(CORE)**`.
To exclude GitLab.com tiers (when the feature is not available in GitLab.com), add the
keyword "only":
@@ -782,6 +894,7 @@ keyword "only":
For GitLab.com only tiers (when the feature is not available for self-hosted instances):
+- For GitLab Free and higher tiers: `**(FREE ONLY)**`.
- For GitLab Bronze and higher tiers: `**(BRONZE ONLY)**`.
- For GitLab Silver and higher tiers: `**(SILVER ONLY)**`.
- For GitLab Gold: `**(GOLD ONLY)**`.
@@ -855,14 +968,14 @@ When there is a list of steps to perform, usually that entails editing the
configuration file and reconfiguring/restarting GitLab. In such case, follow
the style below as a guide:
-```md
+````md
**For Omnibus installations**
1. Edit `/etc/gitlab/gitlab.rb`:
- ```ruby
- external_url "https://gitlab.example.com"
- ```
+ ```ruby
+ external_url "https://gitlab.example.com"
+ ```
1. Save the file and [reconfigure] GitLab for the changes to take effect.
@@ -872,17 +985,16 @@ the style below as a guide:
1. Edit `config/gitlab.yml`:
- ```yaml
- gitlab:
- host: "gitlab.example.com"
- ```
+ ```yaml
+ gitlab:
+ host: "gitlab.example.com"
+ ```
1. Save the file and [restart] GitLab for the changes to take effect.
-
[reconfigure]: path/to/administration/restart_gitlab.md#omnibus-gitlab-reconfigure
[restart]: path/to/administration/restart_gitlab.md#installations-from-source
-```
+````
In this case:
@@ -901,9 +1013,9 @@ on this document. Further explanation is given below.
- Every method must have the REST API request. For example:
- ```
- GET /projects/:id/repository/branches
- ```
+ ```
+ GET /projects/:id/repository/branches
+ ```
- Every method must have a detailed
[description of the parameters](#method-description).
@@ -914,7 +1026,7 @@ on this document. Further explanation is given below.
The following can be used as a template to get started:
-````md
+~~~md
## Descriptive title
One or two sentence description of what endpoint does.
@@ -942,7 +1054,7 @@ Example response:
}
]
```
-````
+~~~
### Fake tokens
@@ -955,7 +1067,7 @@ You can use the following fake tokens as examples.
| Token type | Token value |
|:----------------------|:-------------------------------------------------------------------|
-| Private user token | `<your_access_token>` |
+| Private user token | `<your_access_token>` |
| Personal access token | `n671WNGecHugsdEDPsyo` |
| Application ID | `2fcb195768c39e9a94cec2c2e32c59c0aad7a3365c10892e8116b5d83d4096b6` |
| Application secret | `04f294d1eaca42b8692017b426d53bbc8fe75f827734f0260710b83a556082df` |
@@ -970,7 +1082,7 @@ You can use the following fake tokens as examples.
### Method description
Use the following table headers to describe the methods. Attributes should
-always be in code blocks using backticks (``` ` ```).
+always be in code blocks using backticks (`` ` ``).
```md
| Attribute | Type | Required | Description |
@@ -1076,6 +1188,6 @@ curl --request PUT --header "PRIVATE-TOKEN: <your_access_token>" --data "domain_
[cURL]: http://curl.haxx.se/ "cURL website"
[single spaces]: http://www.slate.com/articles/technology/technology/2011/01/space_invaders.html
-[gfm]: https://docs.gitlab.com/ce/user/markdown.html#newlines "GitLab flavored markdown documentation"
+[gfm]: ../../user/markdown.md#newlines "GitLab flavored markdown documentation"
[ce-1242]: https://gitlab.com/gitlab-org/gitlab-ce/issues/1242
[doc-restart]: ../../administration/restart_gitlab.md "GitLab restart documentation"
diff --git a/doc/development/ee_features.md b/doc/development/ee_features.md
index 7131b717353..2217dedccd3 100644
--- a/doc/development/ee_features.md
+++ b/doc/development/ee_features.md
@@ -125,20 +125,24 @@ This also applies to views.
### EE features based on CE features
For features that build on existing CE features, write a module in the `EE`
-namespace and `prepend` it in the CE class, on the last line of the file that
-the class resides in. This makes conflicts less likely to happen during CE to EE
-merges because only one line is added to the CE class - the `prepend` line. For
-example, to prepend a module into the `User` class you would use the following
-approach:
+namespace and inject it in the CE class, on the last line of the file that the
+class resides in. This makes conflicts less likely to happen during CE to EE
+merges because only one line is added to the CE class - the line that injects
+the module. For example, to prepend a module into the `User` class you would use
+the following approach:
```ruby
class User < ActiveRecord::Base
# ... lots of code here ...
end
-User.prepend(EE::User)
+User.prepend_if_ee('EE::User')
```
+Do not use methods such as `prepend`, `extend`, and `include`. Instead, use
+`prepend_if_ee`, `extend_if_ee`, or `include_if_ee`. These methods take a
+_String_ containing the full module name as the argument, not the module itself.
+
Since the module would require an `EE` namespace, the file should also be
put in an `ee/` sub-directory. For example, we want to extend the user model
in EE, so we have a module called `::EE::User` put inside
@@ -255,7 +259,7 @@ class ApplicationController < ActionController::Base
# ...
end
-ApplicationController.prepend(EE::ApplicationController)
+ApplicationController.prepend_if_ee('EE::ApplicationController')
```
And create a new file in the `ee/` sub-directory with the altered
@@ -504,9 +508,9 @@ EE-specific LDAP classes in `ee/lib/ee/gitlab/ldap`.
### Code in `lib/api/`
-It can be very tricky to extend EE features by a single line of `prepend`,
-and for each different [Grape](https://github.com/ruby-grape/grape) feature,
-we might need different strategies to extend it. To apply different strategies
+It can be very tricky to extend EE features by a single line of `prepend_if_ee`,
+and for each different [Grape](https://github.com/ruby-grape/grape) feature, we
+might need different strategies to extend it. To apply different strategies
easily, we would use `extend ActiveSupport::Concern` in the EE module.
Put the EE module files following
@@ -543,12 +547,12 @@ constants.
We can define `params` and utilize `use` in another `params` definition to
include params defined in EE. However, we need to define the "interface" first
in CE in order for EE to override it. We don't have to do this in other places
-due to `prepend`, but Grape is complex internally and we couldn't easily do
-that, so we'll follow regular object-oriented practices that we define the
+due to `prepend_if_ee`, but Grape is complex internally and we couldn't easily
+do that, so we follow the regular object-oriented practice of defining the
interface first here.
For example, suppose we have a few more optional params for EE. We can move the
-params out of the `Grape::API` class to a helper module, so we can `prepend` it
+params out of the `Grape::API` class to a helper module, so we can inject it
before it would be used in the class.
```ruby
@@ -583,7 +587,7 @@ module API
end
end
-API::Helpers::ProjectsHelpers.prepend(EE::API::Helpers::ProjectsHelpers)
+API::Helpers::ProjectsHelpers.prepend_if_ee('EE::API::Helpers::ProjectsHelpers')
```
We could override it in EE module:
@@ -624,7 +628,7 @@ module API
end
end
-API::JobArtifacts.prepend(EE::API::JobArtifacts)
+API::JobArtifacts.prepend_if_ee('EE::API::JobArtifacts')
```
And then we can follow regular object-oriented practices to override it:
@@ -677,7 +681,7 @@ module API
end
end
-API::MergeRequests.prepend(EE::API::MergeRequests)
+API::MergeRequests.prepend_if_ee('EE::API::MergeRequests')
```
Note that `update_merge_request_ee` doesn't do anything in CE, but
@@ -717,8 +721,8 @@ Sometimes we need to use different arguments for a particular API route, and we
can't easily extend it with an EE module because Grape has different context in
different blocks. In order to overcome this, we need to move the data to a class
method that resides in a separate module or class. This allows us to extend that
-module or class before its data is used, without having to place a `prepend` in
-the middle of CE code.
+module or class before its data is used, without having to place a
+`prepend_if_ee` in the middle of CE code.
For example, in one place we need to pass an extra argument to
`at_least_one_of` so that the API could consider an EE-only argument as the
@@ -739,7 +743,7 @@ module API
end
end
-API::MergeRequests::Parameters.prepend(EE::API::MergeRequests::Parameters)
+API::MergeRequests::Parameters.prepend_if_ee('EE::API::MergeRequests::Parameters')
# api/merge_requests.rb
module API
@@ -789,7 +793,7 @@ class Identity < ActiveRecord::Base
[:provider]
end
- prepend EE::Identity
+ prepend_if_ee('EE::Identity')
validates :extern_uid,
allow_blank: true,
@@ -841,7 +845,7 @@ class Identity < ActiveRecord::Base
end
end
-Identity::UniquenessScopes.prepend(EE::Identity::UniquenessScopes)
+Identity::UniquenessScopes.prepend_if_ee('EE::Identity::UniquenessScopes')
# app/models/identity.rb
class Identity < ActiveRecord::Base
diff --git a/doc/development/elasticsearch.md b/doc/development/elasticsearch.md
index 0965db29557..635895051bc 100644
--- a/doc/development/elasticsearch.md
+++ b/doc/development/elasticsearch.md
@@ -148,6 +148,36 @@ Uses an [Edge NGram token filter](https://www.elastic.co/guide/en/elasticsearch/
- Searches can have their own analyzers. Remember to check when editing analyzers
- `Character` filters (as opposed to token filters) always replace the original character, so they're not a good choice as they can hinder exact searches
+## Architecture
+
+GitLab uses `elasticsearch-rails` for handling communication with the Elasticsearch server. However, in order to achieve zero-downtime deployment during schema changes, an extra abstraction layer is built to allow:
+
+- Indexing (writes) to multiple indexes, with different mappings
+- Switching to a different index for searches (reads) on the fly
+
+Currently we are in the process of migrating models to this new design (e.g. `Snippet`), and it is hardwired to work with a single version for now.
+
+Traditionally, `elasticsearch-rails` provides class and instance level `__elasticsearch__` proxy methods. If you call `Issue.__elasticsearch__`, you will get an instance of `Elasticsearch::Model::Proxy::ClassMethodsProxy`, and if you call `Issue.first.__elasticsearch__`, you will get an instance of `Elasticsearch::Model::Proxy::InstanceMethodsProxy`. These proxy objects talk to the Elasticsearch server directly.
+
+In the new design, `__elasticsearch__` instead represents an extra layer of proxy. It keeps multiple versions of the actual proxy objects, and forwards read and write calls to the proxy of the intended version.
+
+The `elasticsearch-rails` way of specifying each model's mappings and other settings is to create a module for the model to include. In the new design, however, each model has its own corresponding subclassed proxy object, in which the settings reside. For example, the snippet-related settings used to reside in the `SnippetsSearch` module, but in the new design they reside in `SnippetClassProxy` (which is a subclass of `Elasticsearch::Model::Proxy::ClassMethodsProxy`). This reduces namespace pollution in model classes.
+
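+A minimal, hypothetical sketch of what such a subclassed proxy could look like (the
+method and its body are illustrative assumptions, not the actual GitLab settings):
+
+```ruby
+require 'elasticsearch/model'
+
+# Hypothetical sketch: the class-level proxy carries the snippet mappings and
+# settings, instead of a `SnippetsSearch`-style module included into the model.
+class SnippetClassProxy < Elasticsearch::Model::Proxy::ClassMethodsProxy
+  # Illustrative placeholder for version-specific index settings.
+  def index_settings
+    { number_of_shards: 1 }
+  end
+end
+```
+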
+The global configurations per version are now in the `Elastic::(Version)::Config` class. You can change mappings there.
+
+### Creating new version of schema
+
+Currently GitLab still works with a single version of the settings. Once multi-version support is implemented, multiple versions of the settings can exist in different folders (e.g. `ee/lib/elastic/v12p1` and `ee/lib/elastic/v12p3`). To keep a continuous Git history, the latest version lives under the `latest` folder, but is aliased as the latest versioned namespace.
+
+If the current version is `v12p1`, and we need to create a new version for `v12p3`, the steps are as follows:
+
+1. Copy the entire folder of `v12p1` as `v12p3`
+1. Change the namespace for files under the `v12p3` folder from `V12p1` to `V12p3` (these are still aliased to `Latest`; see the sketch after this list)
+1. Delete `v12p1` folder
+1. Copy the entire folder of `latest` as `v12p1`
+1. Change the namespace for files under `v12p1` folder from `Latest` to `V12p1`
+1. Make changes to `Latest` as needed
+
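+A hypothetical sketch of the aliasing described above, once the steps are done
+(file paths and class bodies are assumptions for illustration only):
+
+```ruby
+# ee/lib/elastic/latest/config.rb (hypothetical)
+module Elastic
+  module Latest
+    class Config
+      # Mappings and settings for the newest schema version live here.
+    end
+  end
+end
+
+# ee/lib/elastic/v12p3/config.rb (hypothetical)
+module Elastic
+  module V12p3
+    # The newest versioned namespace is an alias for the `Latest` classes.
+    Config = Elastic::Latest::Config
+  end
+end
+
+# ee/lib/elastic/v12p1/config.rb (hypothetical)
+module Elastic
+  module V12p1
+    class Config
+      # Frozen copy of the previous `Latest` implementation.
+    end
+  end
+end
+```
+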
## Troubleshooting
### Getting `flood stage disk watermark [95%] exceeded`
diff --git a/doc/development/fe_guide/architecture.md b/doc/development/fe_guide/architecture.md
index 49b74b5ebcf..3d27f67a8a6 100644
--- a/doc/development/fe_guide/architecture.md
+++ b/doc/development/fe_guide/architecture.md
@@ -11,7 +11,7 @@ Architectural decisions should be accessible to everyone, so please document
them in the relevant Merge Request discussion or by updating our documentation
when appropriate.
-You can find the Frontend Architecture experts on the [team page](https://about.gitlab.com/company/team).
+You can find the Frontend Architecture experts on the [team page](https://about.gitlab.com/company/team/).
## Examples
diff --git a/doc/development/fe_guide/axios.md b/doc/development/fe_guide/axios.md
index 0d9397c3bd5..6e7cf523f36 100644
--- a/doc/development/fe_guide/axios.md
+++ b/doc/development/fe_guide/axios.md
@@ -1,15 +1,18 @@
# Axios
+
We use [axios][axios] to communicate with the server in Vue applications and most new code.
In order to guarantee all defaults are set you *should not use `axios` directly*, you should import `axios` from `axios_utils`.
## CSRF token
+
All our requests require a CSRF token.
To guarantee this token is set, we are importing [axios][axios], setting the token, and exporting `axios`.
This exported module should be used instead of directly using `axios` to ensure the token is set.
## Usage
+
```javascript
import axios from './lib/utils/axios_utils';
@@ -35,7 +38,7 @@ Advantages over [`spyOn()`]:
- no need to create response objects
- does not allow call through (which we want to avoid)
-- simple API to test error cases
+- simple API to test error cases
- provides `replyOnce()` to allow for different responses
We have also decided against using [axios interceptors] because they are not suitable for mocking.
diff --git a/doc/development/fe_guide/components.md b/doc/development/fe_guide/components.md
index 52462a4bec9..096ce8ca25a 100644
--- a/doc/development/fe_guide/components.md
+++ b/doc/development/fe_guide/components.md
@@ -14,28 +14,29 @@ See also the [corresponding UX guide](https://design.gitlab.com/#/components/dro
1. Use the HTML structure provided by the [docs][bootstrap-dropdowns]
1. Add a specific class to the top level `.dropdown` element
- ```Haml
- .dropdown.my-dropdown
- %button{ type: 'button', data: { toggle: 'dropdown' }, 'aria-haspopup': true, 'aria-expanded': false }
- %span.dropdown-toggle-text
- Toggle Dropdown
- = icon('chevron-down')
-
- %ul.dropdown-menu
- %li
- %a
- item!
- ```
-
- Or use the helpers
- ```Haml
- .dropdown.my-dropdown
- = dropdown_toggle('Toogle!', { toggle: 'dropdown' })
- = dropdown_content
- %li
- %a
- item!
- ```
+ ```Haml
+ .dropdown.my-dropdown
+ %button{ type: 'button', data: { toggle: 'dropdown' }, 'aria-haspopup': true, 'aria-expanded': false }
+ %span.dropdown-toggle-text
+ Toggle Dropdown
+ = icon('chevron-down')
+
+ %ul.dropdown-menu
+ %li
+ %a
+ item!
+ ```
+
+ Or use the helpers
+
+ ```Haml
+ .dropdown.my-dropdown
+  = dropdown_toggle('Toggle!', { toggle: 'dropdown' })
+ = dropdown_content
+ %li
+ %a
+ item!
+ ```
[bootstrap-dropdowns]: https://getbootstrap.com/docs/3.3/javascript/#dropdowns
diff --git a/doc/development/fe_guide/event_tracking.md b/doc/development/fe_guide/event_tracking.md
index 716f6ad7f92..1b417d4c8c2 100644
--- a/doc/development/fe_guide/event_tracking.md
+++ b/doc/development/fe_guide/event_tracking.md
@@ -1,79 +1,76 @@
-# Event Tracking
+# Event tracking
-We use [Snowplow](https://github.com/snowplow/snowplow) for tracking custom events (available in GitLab [Enterprise Edition](https://about.gitlab.com/pricing/) only).
+GitLab provides `Tracking`, an interface that wraps
+[Snowplow](https://github.com/snowplow/snowplow) for tracking custom events.
+It uses Snowplow's custom event tracking functions.
-## Generic tracking function
-
-In addition to Snowplow's built-in method for tracking page views, we use a generic tracking function which enables us to selectively apply listeners to events.
-
-The generic tracking function can be imported in EE-specific JS files as follows:
+The tracking interface can be imported in JS files as follows:
```javascript
-import { trackEvent } from `ee/stats`;
+import Tracking from '~/tracking';
```
-This gives the user access to the `trackEvent` method, which takes the following parameters:
+## Tracking in HAML or Vue templates
-| parameter | type | description | required |
-| ---------------- | ------ | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -------- |
-| `category` | string | Describes the page that you're capturing click events on. Unless infeasible, please use the Rails page attribute `document.body.dataset.page` by default. | true |
-| `eventName` | string | Describes the action the user is taking. The first word should always describe the action. For example, clicks should be `click` and activations should be `activate`. Use underscores to describe what was acted on. For example, activating a form field would be `activate_form_input`. Clicking on a dropdown is `click_dropdown`. | true |
-| `additionalData` | object | Additional data such as `label`, `property`, and `value` as described [in our Feature Instrumentation taxonomy](https://about.gitlab.com/handbook/product/feature-instrumentation/#taxonomy). | false |
+To avoid having to create a bunch of custom JavaScript event handlers, when working within HAML or Vue templates, we can add `data-track-*` attributes to elements of interest. This way, all elements that have a `data-track-event` attribute automatically have event tracking bound.
-Read more about instrumentation and the taxonomy in the [Product Handbook](https://about.gitlab.com/handbook/product/feature-instrumentation).
-
-### Tracking in `.js` and `.vue` files
+Below is an example of `data-track-*` attributes assigned to a button in HAML:
-The most simple use case is to add tracking programmatically to an event of interest in Javascript.
+```haml
+%button.btn{ data: { track_event: "click_button", track_label: "template_preview", track_property: "my-template", track_value: "" } }
+```
-The following example demonstrates how to track a click on a button in Javascript by calling the `trackEvent` method explicitly:
+We can then set up tracking for large sections of a page, or an entire page, by telling the Tracking interface to bind to it.
```javascript
-import { trackEvent } from `ee/stats`;
+import Tracking from '~/tracking';
-trackEvent('dashboard:projects:index', 'click_button', {
- label: 'create_from_template',
- property: 'template_preview',
- value: 'rails',
+// for the entire document
+new Tracking().bind();
+
+// for a container element
+document.addEventListener('DOMContentLoaded', () => {
+ new Tracking('my_category').bind(document.getElementById('my-container'));
});
```
-### Tracking in HAML templates
+When you instantiate a Tracking instance you can provide a category. If none is provided, `document.body.dataset.page` will be used. When you bind the Tracking instance you can provide an element. If no element is provided to bind to, the `document` is assumed.
-Sometimes we want to track clicks for multiple elements on a page. Creating event handlers for all elements could soon turn into a tedious task.
+Below is a list of supported `data-track-*` attributes:
-There's a more convenient solution to this problem. When working with HAML templates, we can add `data-track-*` attributes to elements of interest. This way, all elements that have both `data-track-label` and `data-track-event` attributes assigned get marked for event tracking. All we have to do is call the `bindTrackableContainer` method on a container which allows for better scoping.
+| attribute | required | description |
+|:----------------------|:---------|:------------|
+| `data-track-event` | true | Action the user is taking. Clicks should be `click` and activations should be `activate`, so for example, focusing a form field would be `activate_form_input`, and clicking a button would be `click_button`. |
+| `data-track-label` | false | The `label` as described [in our Feature Instrumentation taxonomy](https://about.gitlab.com/handbook/product/feature-instrumentation/#taxonomy). |
+| `data-track-property` | false | The `property` as described [in our Feature Instrumentation taxonomy](https://about.gitlab.com/handbook/product/feature-instrumentation/#taxonomy). |
+| `data-track-value` | false | The `value` as described [in our Feature Instrumentation taxonomy](https://about.gitlab.com/handbook/product/feature-instrumentation/#taxonomy). If omitted, this will be the element's `value` property or an empty string. For checkboxes, the default value will be the element's checked attribute or `false` when unchecked. |
-Below is an example of `data-track-*` attributes assigned to a button in HAML:
+## Tracking in raw JavaScript
-```ruby
-%button.btn{ data: { track_label: "template_preview", track_property: "my-template", track_event: "click_button", track_value: "" } }
-```
+Custom events can be tracked by directly calling the `Tracking.event` static function, which accepts the following arguments:
+
+| argument | type | default value | description |
+|:-----------|:-------|:---------------------------|:------------|
+| `category` | string | document.body.dataset.page | Page or subsection of a page that events are being captured within. |
+| `event` | string | 'generic' | Action the user is taking. Clicks should be `click` and activations should be `activate`, so for example, focusing a form field would be `activate_form_input`, and clicking a button would be `click_button`. |
+| `data` | object | {} | Additional data such as `label`, `property`, and `value` as described [in our Feature Instrumentation taxonomy](https://about.gitlab.com/handbook/product/feature-instrumentation/#taxonomy). These will be set as empty strings if you don't provide them. |
-By calling `bindTrackableContainer('.my-container')`, click handlers get bound to all elements located in `.my-container` provided that they have the necessary `data-track-*` attributes assigned to them.
+Tracking can be programmatically added to an event of interest in JavaScript, and the following example demonstrates tracking a click on a button by calling `Tracking.event` manually.
```javascript
-import Stats from 'ee/stats';
+import Tracking from '~/tracking';
-document.addEventListener('DOMContentLoaded', () => {
- Stats.bindTrackableContainer('.my-container', 'category');
-});
+document.getElementById('my_button').addEventListener('click', () => {
+ Tracking.event('dashboard:projects:index', 'click_button', {
+ label: 'create_from_template',
+ property: 'template_preview',
+ value: 'rails',
+ });
+});
```
-The second parameter in `bindTrackableContainer` is optional. If omitted, the value of `document.body.dataset.page` will be used as category instead.
-
-Below is a list of supported `data-track-*` attributes:
-
-| attribute | description | required |
-| --------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -------- |
-| `data-track-label` | The `label` in `trackEvent` | true |
-| `data-track-event` | The `eventName` in `trackEvent` | true |
-| `data-track-property` | The `property` in `trackEvent`. If omitted, an empty string will be used as a default value. | false |
-| `data-track-value` | The `value` in `trackEvent`. If omitted, this will be `target.value` or empty string. For checkboxes, the default value being tracked will be the element's checked attribute if `data-track-value` is omitted. | false |
-
-Since Snowplow is an Enterprise Edition feature, it's necessary to create a CE backport when adding `data-track-*` attributes to HAML templates in most cases.
-
-## Testing
+## Toggling tracking on or off
Snowplow can be enabled by navigating to:
diff --git a/doc/development/fe_guide/frontend_faq.md b/doc/development/fe_guide/frontend_faq.md
index e4225f2bc39..0d2aeffeac0 100644
--- a/doc/development/fe_guide/frontend_faq.md
+++ b/doc/development/fe_guide/frontend_faq.md
@@ -5,12 +5,12 @@
1. **You talk about Frontend FAQ.**
Please share links to it whenever applicable, so more eyes catch when content
gets outdated.
-2. **Keep it short and simple.**
+1. **Keep it short and simple.**
Whenever an answer needs more than two sentences it does not belong here.
-3. **Provide background when possible.**
+1. **Provide background when possible.**
Linking to relevant source code, issue / epic, or other documentation helps
to understand the answer.
-4. **If you see something, do something.**
+1. **If you see something, do something.**
Please remove or update any content that is outdated as soon as you see it.
## FAQ
diff --git a/doc/development/fe_guide/graphql.md b/doc/development/fe_guide/graphql.md
index 55b719227e5..4fc5dfc8c3d 100644
--- a/doc/development/fe_guide/graphql.md
+++ b/doc/development/fe_guide/graphql.md
@@ -47,7 +47,7 @@ new Vue({
});
```
-Read more about [Vue Apollo][vue-apollo] in the [Vue Apollo documentation][vue-apollo-docs].
+Read more about [Vue Apollo][vue-apollo] in the [Vue Apollo documentation](https://vue-apollo.netlify.com/guide/).
### Local state with Apollo
@@ -118,7 +118,6 @@ Read more about the [Apollo] client in the [Apollo documentation](https://www.ap
[Apollo]: https://www.apollographql.com/
[vue-apollo]: https://github.com/Akryum/vue-apollo/
-[vue-apollo-docs]: https://akryum.github.io/vue-apollo/
[feature-flags]: ../feature_flags.md
[default-client]: https://gitlab.com/gitlab-org/gitlab-ce/blob/master/app/assets/javascripts/lib/graphql.js
[vue-test-utils]: https://vue-test-utils.vuejs.org/
diff --git a/doc/development/fe_guide/icons.md b/doc/development/fe_guide/icons.md
index 533e2001300..4f687d8642e 100644
--- a/doc/development/fe_guide/icons.md
+++ b/doc/development/fe_guide/icons.md
@@ -21,10 +21,10 @@ To use a sprite Icon in HAML or Rails we use a specific helper function :
sprite_icon(icon_name, size: nil, css_class: '')
```
-- **icon_name** Use the icon_name that you can find in the SVG Sprite
- ([Overview is available here][svg-preview]).
-- **size (optional)** Use one of the following sizes : 16, 24, 32, 48, 72 (this will be translated into a `s16` class)
-- **css_class (optional)** If you want to add additional css classes
+- **icon_name** Use the icon_name that you can find in the SVG Sprite
+ ([Overview is available here][svg-preview]).
+- **size (optional)** Use one of the following sizes : 16, 24, 32, 48, 72 (this will be translated into a `s16` class)
+- **css_class (optional)** If you want to add additional css classes
**Example**
@@ -65,10 +65,10 @@ export default {
</template>
```
-- **name** Name of the Icon in the SVG Sprite ([Overview is available here][svg-preview]).
-- **size (optional)** Number value for the size which is then mapped to a specific CSS class
- (Available Sizes: 8, 12, 16, 18, 24, 32, 48, 72 are mapped to `sXX` css classes)
-- **css-classes (optional)** Additional CSS Classes to add to the svg tag.
+- **name** Name of the Icon in the SVG Sprite ([Overview is available here][svg-preview]).
+- **size (optional)** Number value for the size which is then mapped to a specific CSS class
+ (Available Sizes: 8, 12, 16, 18, 24, 32, 48, 72 are mapped to `sXX` css classes)
+- **css-classes (optional)** Additional CSS Classes to add to the svg tag.
### Usage in HTML/JS
diff --git a/doc/development/fe_guide/index.md b/doc/development/fe_guide/index.md
index 6bf6113dd81..deaef8e768b 100644
--- a/doc/development/fe_guide/index.md
+++ b/doc/development/fe_guide/index.md
@@ -17,6 +17,8 @@ Working with our frontend assets requires Node (v8.10.0 or greater) and Yarn
For our currently-supported browsers, see our [requirements][requirements].
+Use [BrowserStack](https://www.browserstack.com/) to test with our supported browsers. Log in to BrowserStack with the credentials saved in GitLab's [shared 1Password account](https://about.gitlab.com/handbook/security/#1password-for-teams).
+
## Initiatives
Current high-level frontend goals are listed on [Frontend Epics](https://gitlab.com/groups/gitlab-org/-/epics?label_name%5B%5D=frontend).
diff --git a/doc/development/fe_guide/performance.md b/doc/development/fe_guide/performance.md
index 2628e95dbc1..676bce32998 100644
--- a/doc/development/fe_guide/performance.md
+++ b/doc/development/fe_guide/performance.md
@@ -30,8 +30,8 @@ To improve the time to first render we are using lazy loading for images. This w
the actual image source on the `data-src` attribute. After the HTML is rendered and JavaScript is loaded,
the value of `data-src` will be moved to `src` automatically if the image is in the current viewport.
-- Prepare images in HTML for lazy loading by renaming the `src` attribute to `data-src` AND adding the class `lazy`.
-- If you are using the Rails `image_tag` helper, all images will be lazy-loaded by default unless `lazy: false` is provided.
+- Prepare images in HTML for lazy loading by renaming the `src` attribute to `data-src` AND adding the class `lazy`.
+- If you are using the Rails `image_tag` helper, all images will be lazy-loaded by default unless `lazy: false` is provided.
If you are asynchronously adding content which contains lazy images then you need to call the function
`gl.lazyLoader.searchLazyImages()` which will search for lazy images and load them if needed.
@@ -96,26 +96,26 @@ bundle and included on the page.
DOM has loaded, you should attach an event handler to the `DOMContentLoaded`
event with:
- ```javascript
- import initMyWidget from './my_widget';
+ ```javascript
+ import initMyWidget from './my_widget';
- document.addEventListener('DOMContentLoaded', () => {
- initMyWidget();
- });
- ```
+ document.addEventListener('DOMContentLoaded', () => {
+ initMyWidget();
+ });
+ ```
- **Supporting Module Placement:**
- - If a class or a module is _specific to a particular route_, try to locate
- it close to the entry point it will be used. For instance, if
- `my_widget.js` is only imported within `pages/widget/show/index.js`, you
- should place the module at `pages/widget/show/my_widget.js` and import it
- with a relative path (e.g. `import initMyWidget from './my_widget';`).
- - If a class or module is _used by multiple routes_, place it within a
- shared directory at the closest common parent directory for the entry
- points that import it. For example, if `my_widget.js` is imported within
- both `pages/widget/show/index.js` and `pages/widget/run/index.js`, then
- place the module at `pages/widget/shared/my_widget.js` and import it with
- a relative path if possible (e.g. `../shared/my_widget`).
+ - If a class or a module is _specific to a particular route_, try to locate
+ it close to the entry point it will be used. For instance, if
+ `my_widget.js` is only imported within `pages/widget/show/index.js`, you
+ should place the module at `pages/widget/show/my_widget.js` and import it
+ with a relative path (e.g. `import initMyWidget from './my_widget';`).
+ - If a class or module is _used by multiple routes_, place it within a
+ shared directory at the closest common parent directory for the entry
+ points that import it. For example, if `my_widget.js` is imported within
+ both `pages/widget/show/index.js` and `pages/widget/run/index.js`, then
+ place the module at `pages/widget/shared/my_widget.js` and import it with
+ a relative path if possible (e.g. `../shared/my_widget`).
- **Enterprise Edition Caveats:**
For GitLab Enterprise Edition, page-specific entry points will override their
@@ -161,7 +161,7 @@ General tips:
- Use code-splitting dynamic imports wherever possible to lazy-load code that is not needed initially.
- [High Performance Animations][high-perf-animations]
--------
+---
## Additional Resources
diff --git a/doc/development/fe_guide/style_guide_js.md b/doc/development/fe_guide/style_guide_js.md
index b50159c2b75..125b11afcd0 100644
--- a/doc/development/fe_guide/style_guide_js.md
+++ b/doc/development/fe_guide/style_guide_js.md
@@ -20,148 +20,148 @@ See [our current .eslintrc](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/
1. **Never Ever EVER** disable eslint globally for a file
- ```javascript
- // bad
- /* eslint-disable */
+ ```javascript
+ // bad
+ /* eslint-disable */
- // better
- /* eslint-disable some-rule, some-other-rule */
+ // better
+ /* eslint-disable some-rule, some-other-rule */
- // best
- // nothing :)
- ```
+ // best
+ // nothing :)
+ ```
1. If you do need to disable a rule for a single violation, try to do it as locally as possible
- ```javascript
- // bad
- /* eslint-disable no-new */
+ ```javascript
+ // bad
+ /* eslint-disable no-new */
- import Foo from 'foo';
+ import Foo from 'foo';
- new Foo();
+ new Foo();
- // better
- import Foo from 'foo';
+ // better
+ import Foo from 'foo';
- // eslint-disable-next-line no-new
- new Foo();
- ```
+ // eslint-disable-next-line no-new
+ new Foo();
+ ```
1. There are a few rules that we need to disable due to technical debt, which are:
- 1. [no-new][eslint-new]
- 1. [class-methods-use-this][eslint-this]
+ 1. [no-new](https://eslint.org/docs/rules/no-new)
+ 1. [class-methods-use-this](https://eslint.org/docs/rules/class-methods-use-this)
1. When they are needed, _always_ place ESLint directive comment blocks on the first line of a script,
followed by any global declarations, then a blank line prior to any imports or code.
- ```javascript
- // bad
- /* global Foo */
- /* eslint-disable no-new */
- import Bar from './bar';
+ ```javascript
+ // bad
+ /* global Foo */
+ /* eslint-disable no-new */
+ import Bar from './bar';
- // good
- /* eslint-disable no-new */
- /* global Foo */
+ // good
+ /* eslint-disable no-new */
+ /* global Foo */
- import Bar from './bar';
- ```
+ import Bar from './bar';
+ ```
1. **Never** disable the `no-undef` rule. Declare globals with `/* global Foo */` instead.
1. When declaring multiple globals, always use one `/* global [name] */` line per variable.
- ```javascript
- // bad
- /* globals Flash, Cookies, jQuery */
+ ```javascript
+ // bad
+ /* globals Flash, Cookies, jQuery */
- // good
- /* global Flash */
- /* global Cookies */
- /* global jQuery */
- ```
+ // good
+ /* global Flash */
+ /* global Cookies */
+ /* global jQuery */
+ ```
1. Use up to 3 parameters for a function or class. If you need more, accept an Object instead.
- ```javascript
- // bad
- fn(p1, p2, p3, p4) {}
+ ```javascript
+ // bad
+ fn(p1, p2, p3, p4) {}
- // good
- fn(options) {}
- ```
+ // good
+ fn(options) {}
+ ```
#### Modules, Imports, and Exports
1. Use ES module syntax to import modules
- ```javascript
- // bad
- const SomeClass = require('some_class');
+ ```javascript
+ // bad
+ const SomeClass = require('some_class');
- // good
- import SomeClass from 'some_class';
+ // good
+ import SomeClass from 'some_class';
- // bad
- module.exports = SomeClass;
+ // bad
+ module.exports = SomeClass;
- // good
- export default SomeClass;
- ```
+ // good
+ export default SomeClass;
+ ```
- Import statements are following usual naming guidelines, for example object literals use camel case:
+ Import statements follow the usual naming guidelines; for example, object literals use camelCase:
- ```javascript
- // some_object file
- export default {
- key: 'value',
- };
+ ```javascript
+ // some_object file
+ export default {
+ key: 'value',
+ };
- // bad
- import ObjectLiteral from 'some_object';
+ // bad
+ import ObjectLiteral from 'some_object';
- // good
- import objectLiteral from 'some_object';
- ```
+ // good
+ import objectLiteral from 'some_object';
+ ```
1. Relative paths: when importing a module in the same directory, a child
directory, or an immediate parent directory, prefer relative paths. When
importing a module that is two or more levels up, prefer either `~/` or `ee/`.
- In **app/assets/javascripts/my-feature/subdir**:
+ In **app/assets/javascripts/my-feature/subdir**:
- ```javascript
- // bad
- import Foo from '~/my-feature/foo';
- import Bar from '~/my-feature/subdir/bar';
- import Bin from '~/my-feature/subdir/lib/bin';
+ ```javascript
+ // bad
+ import Foo from '~/my-feature/foo';
+ import Bar from '~/my-feature/subdir/bar';
+ import Bin from '~/my-feature/subdir/lib/bin';
- // good
- import Foo from '../foo';
- import Bar from './bar';
- import Bin from './lib/bin';
- ```
+ // good
+ import Foo from '../foo';
+ import Bar from './bar';
+ import Bin from './lib/bin';
+ ```
- In **spec/javascripts**:
+ In **spec/javascripts**:
- ```javascript
- // bad
- import Foo from '../../app/assets/javascripts/my-feature/foo';
+ ```javascript
+ // bad
+ import Foo from '../../app/assets/javascripts/my-feature/foo';
- // good
- import Foo from '~/my-feature/foo';
- ```
+ // good
+ import Foo from '~/my-feature/foo';
+ ```
- When referencing an **EE component**:
+ When referencing an **EE component**:
- ```javascript
- // bad
- import Foo from '../../../../../ee/app/assets/javascripts/my-feature/ee-foo';
+ ```javascript
+ // bad
+ import Foo from '../../../../../ee/app/assets/javascripts/my-feature/ee-foo';
- // good
- import Foo from 'ee/my-feature/foo';
- ```
+ // good
+ import Foo from 'ee/my-feature/foo';
+ ```
1. Avoid using IIFE. Although we have a lot of examples of files which wrap their
contents in IIFEs (immediately-invoked function expressions),
@@ -170,136 +170,136 @@ See [our current .eslintrc](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/
1. Avoid adding to the global namespace.
- ```javascript
- // bad
- window.MyClass = class { /* ... */ };
+ ```javascript
+ // bad
+ window.MyClass = class { /* ... */ };
- // good
- export default class MyClass { /* ... */ }
- ```
+ // good
+ export default class MyClass { /* ... */ }
+ ```
1. Side effects are forbidden in any script which contains an export
- ```javascript
- // bad
- export default class MyClass { /* ... */ }
+ ```javascript
+ // bad
+ export default class MyClass { /* ... */ }
- document.addEventListener("DOMContentLoaded", function(event) {
- new MyClass();
- }
- ```
+ document.addEventListener("DOMContentLoaded", function(event) {
+ new MyClass();
+ }
+ ```
#### Data Mutation and Pure functions
1. Strive to write many small pure functions, and minimize where mutations occur.
- ```javascript
- // bad
- const values = {foo: 1};
+ ```javascript
+ // bad
+ const values = {foo: 1};
- function impureFunction(items) {
- const bar = 1;
+ function impureFunction(items) {
+ const bar = 1;
- items.foo = items.a * bar + 2;
+ items.foo = items.a * bar + 2;
- return items.a;
- }
+ return items.a;
+ }
- const c = impureFunction(values);
+ const c = impureFunction(values);
- // good
- var values = {foo: 1};
+ // good
+ var values = {foo: 1};
- function pureFunction (foo) {
- var bar = 1;
+ function pureFunction (foo) {
+ var bar = 1;
- foo = foo * bar + 2;
+ foo = foo * bar + 2;
- return foo;
- }
+ return foo;
+ }
- var c = pureFunction(values.foo);
+ var c = pureFunction(values.foo);
```
1. Avoid constructors with side-effects.
Although we aim for code without side-effects, we need some side-effects for our code to run.
- If the class won't do anything if we only instantiate it, it's ok to add side effects into the constructor (_Note:_ The following is just an example. If the only purpose of the class is to add an event listener and handle the callback a function will be more suitable.)
-
- ```javascript
- // Bad
- export class Foo {
- constructor() {
- this.init();
- }
- init() {
- document.addEventListener('click', this.handleCallback)
- },
- handleCallback() {
-
- }
- }
-
- // Good
- export class Foo {
- constructor() {
- document.addEventListener()
- }
- handleCallback() {
- }
- }
- ```
-
- On the other hand, if a class only needs to extend a third party/add event listeners in some specific cases, they should be initialized outside of the constructor.
+ If the class won't do anything if we only instantiate it, it's ok to add side effects into the constructor (_Note:_ The following is just an example. If the only purpose of the class is to add an event listener and handle the callback a function will be more suitable.)
+
+ ```javascript
+ // Bad
+ export class Foo {
+ constructor() {
+ this.init();
+ }
+ init() {
+ document.addEventListener('click', this.handleCallback)
+ },
+ handleCallback() {
+
+ }
+ }
+
+ // Good
+ export class Foo {
+ constructor() {
+ document.addEventListener()
+ }
+ handleCallback() {
+ }
+ }
+ ```
+
+ On the other hand, if a class only needs to extend a third party/add event listeners in some specific cases, they should be initialized outside of the constructor.
1. Prefer `.map`, `.reduce` or `.filter` over `.forEach`
A `forEach` will most likely cause side effects, mutating the array being iterated. Prefer `.map`,
`.reduce` or `.filter` instead.
- ```javascript
- const users = [ { name: 'Foo' }, { name: 'Bar' } ];
+ ```javascript
+ const users = [ { name: 'Foo' }, { name: 'Bar' } ];
- // bad
- users.forEach((user, index) => {
- user.id = index;
- });
+ // bad
+ users.forEach((user, index) => {
+ user.id = index;
+ });
- // good
- const usersWithId = users.map((user, index) => {
- return Object.assign({}, user, { id: index });
- });
- ```
+ // good
+ const usersWithId = users.map((user, index) => {
+ return Object.assign({}, user, { id: index });
+ });
+ ```
#### Parse Strings into Numbers
1. `parseInt()` is preferable over `Number()` or `+`
- ```javascript
- // bad
- +'10' // 10
+ ```javascript
+ // bad
+ +'10' // 10
- // good
- Number('10') // 10
+ // good
+ Number('10') // 10
- // better
- parseInt('10', 10);
- ```
+ // better
+ parseInt('10', 10);
+ ```
#### CSS classes used for JavaScript
1. If the class is being used in JavaScript, it needs to be prepended with `js-`
- ```html
- // bad
- <button class="add-user">
- Add User
- </button>
+ ```html
+ // bad
+ <button class="add-user">
+ Add User
+ </button>
- // good
- <button class="js-add-user">
- Add User
- </button>
- ```
+ // good
+ <button class="js-add-user">
+ Add User
+ </button>
+ ```
### Vue.js
@@ -314,43 +314,44 @@ Please check this [rules][eslint-plugin-vue-rules] for more documentation.
1. The store has its own file
1. Use a function in the bundle file to instantiate the Vue component:
- ```javascript
- // bad
- class {
- init() {
- new Component({})
- }
- }
-
- // good
- document.addEventListener('DOMContentLoaded', () => new Vue({
- el: '#element',
- components: {
- componentName
- },
- render: createElement => createElement('component-name'),
- }));
- ```
+ ```javascript
+ // bad
+ class {
+ init() {
+ new Component({})
+ }
+ }
+
+ // good
+ document.addEventListener('DOMContentLoaded', () => new Vue({
+ el: '#element',
+ components: {
+ componentName
+ },
+ render: createElement => createElement('component-name'),
+ }));
+ ```
1. Do not use a singleton for the service or the store
- ```javascript
- // bad
- class Store {
- constructor() {
- if (!this.prototype.singleton) {
- // do something
- }
- }
- }
+ ```javascript
+ // bad
+ class Store {
+ constructor() {
+ if (!this.prototype.singleton) {
+ // do something
+ }
+ }
+ }
+
+ // good
+ class Store {
+ constructor() {
+ // do something
+ }
+ }
+ ```
- // good
- class Store {
- constructor() {
- // do something
- }
- }
- ```
1. Use `.vue` for Vue templates. Do not use `%template` in HAML.
#### Naming
@@ -358,38 +359,38 @@ Please check this [rules][eslint-plugin-vue-rules] for more documentation.
1. **Extensions**: Use `.vue` extension for Vue components. Do not use `.js` as file extension ([#34371]).
1. **Reference Naming**: Use PascalCase for their instances:
- ```javascript
- // bad
- import cardBoard from 'cardBoard.vue'
+ ```javascript
+ // bad
+ import cardBoard from 'cardBoard.vue'
- components: {
- cardBoard,
- };
+ components: {
+ cardBoard,
+ };
- // good
- import CardBoard from 'cardBoard.vue'
+ // good
+ import CardBoard from 'cardBoard.vue'
- components: {
- CardBoard,
- };
- ```
+ components: {
+ CardBoard,
+ };
+ ```
1. **Props Naming:** Avoid using DOM component prop names.
1. **Props Naming:** Use kebab-case instead of camelCase to provide props in templates.
- ```javascript
- // bad
- <component class="btn">
+ ```javascript
+ // bad
+ <component class="btn">
- // good
- <component css-class="btn">
+ // good
+ <component css-class="btn">
- // bad
- <component myProp="prop" />
+ // bad
+ <component myProp="prop" />
- // good
- <component my-prop="prop" />
- ```
+ // good
+ <component my-prop="prop" />
+ ```
[#34371]: https://gitlab.com/gitlab-org/gitlab-ce/issues/34371
@@ -399,205 +400,205 @@ Please check this [rules][eslint-plugin-vue-rules] for more documentation.
1. With more than one attribute, all attributes should be on a new line:
- ```javascript
- // bad
- <component v-if="bar"
- param="baz" />
+ ```javascript
+ // bad
+ <component v-if="bar"
+ param="baz" />
- <button class="btn">Click me</button>
+ <button class="btn">Click me</button>
- // good
- <component
- v-if="bar"
- param="baz"
- />
+ // good
+ <component
+ v-if="bar"
+ param="baz"
+ />
- <button class="btn">
- Click me
- </button>
- ```
+ <button class="btn">
+ Click me
+ </button>
+ ```
1. The tag can be inline if there is only one attribute:
- ```javascript
- // good
- <component bar="bar" />
+ ```javascript
+ // good
+ <component bar="bar" />
- // good
- <component
- bar="bar"
- />
+ // good
+ <component
+ bar="bar"
+ />
- // bad
- <component
- bar="bar" />
- ```
+ // bad
+ <component
+ bar="bar" />
+ ```
#### Quotes
1. Always use double quotes `"` inside templates and single quotes `'` for all other JS.
- ```javascript
- // bad
- template: `
- <button :class='style'>Button</button>
- `
+ ```javascript
+ // bad
+ template: `
+ <button :class='style'>Button</button>
+ `
- // good
- template: `
- <button :class="style">Button</button>
- `
- ```
+ // good
+ template: `
+ <button :class="style">Button</button>
+ `
+ ```
#### Props
1. Props should be declared as an object
- ```javascript
- // bad
- props: ['foo']
+ ```javascript
+ // bad
+ props: ['foo']
- // good
- props: {
- foo: {
- type: String,
- required: false,
- default: 'bar'
- }
- }
- ```
+ // good
+ props: {
+ foo: {
+ type: String,
+ required: false,
+ default: 'bar'
+ }
+ }
+ ```
1. Required key should always be provided when declaring a prop
- ```javascript
- // bad
- props: {
- foo: {
- type: String,
- }
- }
-
- // good
- props: {
- foo: {
- type: String,
- required: false,
- default: 'bar'
- }
- }
- ```
+ ```javascript
+ // bad
+ props: {
+ foo: {
+ type: String,
+ }
+ }
+
+ // good
+ props: {
+ foo: {
+ type: String,
+ required: false,
+ default: 'bar'
+ }
+ }
+ ```
1. Default key should be provided if the prop is not required.
_Note:_ There are some scenarios where we need to check for the existence of the property.
In those cases, a default key should not be provided.
- ```javascript
- // good
- props: {
- foo: {
- type: String,
- required: false,
- }
- }
-
- // good
- props: {
- foo: {
- type: String,
- required: false,
- default: 'bar'
- }
- }
-
- // good
- props: {
- foo: {
- type: String,
- required: true
- }
- }
- ```
+ ```javascript
+ // good
+ props: {
+ foo: {
+ type: String,
+ required: false,
+ }
+ }
+
+ // good
+ props: {
+ foo: {
+ type: String,
+ required: false,
+ default: 'bar'
+ }
+ }
+
+ // good
+ props: {
+ foo: {
+ type: String,
+ required: true
+ }
+ }
+ ```
#### Data
1. `data` method should always be a function
- ```javascript
- // bad
- data: {
- foo: 'foo'
- }
+ ```javascript
+ // bad
+ data: {
+ foo: 'foo'
+ }
- // good
- data() {
- return {
- foo: 'foo'
- };
- }
- ```
+ // good
+ data() {
+ return {
+ foo: 'foo'
+ };
+ }
+ ```
#### Directives
1. Shorthand `@` is preferable over `v-on`
- ```javascript
- // bad
- <component v-on:click="eventHandler"/>
+ ```javascript
+ // bad
+ <component v-on:click="eventHandler"/>
- // good
- <component @click="eventHandler"/>
- ```
+ // good
+ <component @click="eventHandler"/>
+ ```
-2. Shorthand `:` is preferable over `v-bind`
+1. Shorthand `:` is preferable over `v-bind`
- ```javascript
- // bad
- <component v-bind:class="btn"/>
+ ```javascript
+ // bad
+ <component v-bind:class="btn"/>
- // good
- <component :class="btn"/>
- ```
+ // good
+ <component :class="btn"/>
+ ```
-3. Shorthand `#` is preferable over `v-slot`
+1. Shorthand `#` is preferable over `v-slot`
- ```javascript
- // bad
- <template v-slot:header></template>
+ ```javascript
+ // bad
+ <template v-slot:header></template>
- // good
- <template #header></template>
- ```
+ // good
+ <template #header></template>
+ ```
#### Closing tags
1. Prefer self closing component tags
- ```javascript
- // bad
- <component></component>
+ ```javascript
+ // bad
+ <component></component>
- // good
- <component />
- ```
+ // good
+ <component />
+ ```
#### Ordering
1. Tag order in `.vue` file
- ```
- <script>
- // ...
- </script>
-
- <template>
- // ...
- </template>
-
- // We don't use scoped styles but there are few instances of this
- <style>
- // ...
- </style>
- ```
+ ```
+ <script>
+ // ...
+ </script>
+
+ <template>
+ // ...
+ </template>
+
+ // We don't use scoped styles but there are few instances of this
+ <style>
+ // ...
+ </style>
+ ```
1. Properties in a Vue Component:
Check [order of properties in components rule][vue-order].
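
   For illustration only (the authoritative ordering is the linked rule; the component, its child component, and its members are hypothetical), a component following a compatible ordering could look like:

   ```javascript
   import UserAvatar from './user_avatar.vue'; // hypothetical child component

   export default {
     name: 'UserList',
     components: { UserAvatar },
     props: {
       users: { type: Array, required: true },
     },
     data() {
       return { selectedId: null };
     },
     computed: { /* ... */ },
     watch: { /* ... */ },
     created() { /* ... */ },
     methods: { /* ... */ },
   };
   ```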
@@ -608,50 +609,50 @@ When using `v-for` you need to provide a *unique* `:key` attribute for each item
1. If the elements of the array being iterated have a unique `id`, it is advised to use it:
- ```html
- <div
- v-for="item in items"
- :key="item.id"
- >
- <!-- content -->
- </div>
- ```
+ ```html
+ <div
+ v-for="item in items"
+ :key="item.id"
+ >
+ <!-- content -->
+ </div>
+ ```
1. When the elements being iterated don't have a unique id, you can use the array index as the `:key` attribute
- ```html
- <div
- v-for="(item, index) in items"
- :key="index"
- >
- <!-- content -->
- </div>
- ```
+ ```html
+ <div
+ v-for="(item, index) in items"
+ :key="index"
+ >
+ <!-- content -->
+ </div>
+ ```
1. When using `v-for` with `template` and there is more than one child element, the `:key` values must be unique. It's advised to use `kebab-case` namespaces.
- ```html
- <template v-for="(item, index) in items">
- <span :key="`span-${index}`"></span>
- <button :key="`button-${index}`"></button>
- </template>
- ```
+ ```html
+ <template v-for="(item, index) in items">
+ <span :key="`span-${index}`"></span>
+ <button :key="`button-${index}`"></button>
+ </template>
+ ```
1. When dealing with nested `v-for` use the same guidelines as above.
- ```html
- <div
- v-for="item in items"
- :key="item.id"
- >
- <span
- v-for="element in array"
- :key="element.id"
- >
- <!-- content -->
- </span>
- </div>
- ```
+ ```html
+ <div
+ v-for="item in items"
+ :key="item.id"
+ >
+ <span
+ v-for="element in array"
+ :key="element.id"
+ >
+ <!-- content -->
+ </span>
+ </div>
+ ```
Useful links:
@@ -662,35 +663,35 @@ Useful links:
1. Tooltips: Do not rely on `has-tooltip` class name for Vue components
- ```javascript
- // bad
- <span
- class="has-tooltip"
- title="Some tooltip text">
- Text
- </span>
-
- // good
- <span
- v-tooltip
- title="Some tooltip text">
- Text
- </span>
- ```
+ ```javascript
+ // bad
+ <span
+ class="has-tooltip"
+ title="Some tooltip text">
+ Text
+ </span>
+
+ // good
+ <span
+ v-tooltip
+ title="Some tooltip text">
+ Text
+ </span>
+ ```
1. Tooltips: When using a tooltip, include the tooltip directive, `./app/assets/javascripts/vue_shared/directives/tooltip.js` (see the sketch after this list)
1. Don't change `data-original-title`.
- ```javascript
- // bad
- <span data-original-title="tooltip text">Foo</span>
+ ```javascript
+ // bad
+ <span data-original-title="tooltip text">Foo</span>
- // good
- <span title="tooltip text">Foo</span>
+ // good
+ <span title="tooltip text">Foo</span>
- $('span').tooltip('_fixTitle');
- ```
+ $('span').tooltip('_fixTitle');
+ ```
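+
+A minimal sketch of registering the directive in a component (the component name is hypothetical; `~/` resolves to `app/assets/javascripts`, so the path matches the directive referenced above):
+
+```javascript
+import tooltip from '~/vue_shared/directives/tooltip';
+
+export default {
+  name: 'MyTooltipExample', // hypothetical component
+  directives: {
+    tooltip,
+  },
+  // the `v-tooltip` + `title` usage shown above then works in this
+  // component's template
+};
+```
+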
### The Javascript/Vue Accord
@@ -713,8 +714,6 @@ The goal of this accord is to make sure we are all on the same page.
[airbnb-js-style-guide]: https://github.com/airbnb/javascript
[eslintrc]: https://gitlab.com/gitlab-org/gitlab-ce/blob/master/.eslintrc
-[eslint-this]: http://eslint.org/docs/rules/class-methods-use-this
-[eslint-new]: http://eslint.org/docs/rules/no-new
[eslint-plugin-vue]: https://github.com/vuejs/eslint-plugin-vue
[eslint-plugin-vue-rules]: https://github.com/vuejs/eslint-plugin-vue#bulb-rules
[vue-order]: https://github.com/vuejs/eslint-plugin-vue/blob/master/docs/rules/order-in-components.md
diff --git a/doc/development/fe_guide/style_guide_scss.md b/doc/development/fe_guide/style_guide_scss.md
index 5220c9eeea3..95c4a094c04 100644
--- a/doc/development/fe_guide/style_guide_scss.md
+++ b/doc/development/fe_guide/style_guide_scss.md
@@ -35,7 +35,7 @@ New utility classes should be added to [`utilities.scss`](https://gitlab.com/git
We recommend a "utility-first" approach.
1. Start with utility classes.
-2. If composing utility classes into a component class removes code duplication and encapsulates a clear responsibility, do it.
+1. If composing utility classes into a component class removes code duplication and encapsulates a clear responsibility, do it.
This encourages an organic growth of component classes and prevents the creation of one-off unreusable classes. Also, the kind of classes that emerge from "utility-first" tend to be design-centered (e.g. `.button`, `.alert`, `.card`) rather than domain-centered (e.g. `.security-report-widget`, `.commit-header-icon`).
@@ -212,6 +212,7 @@ selectors are intended for use only with JavaScript to allow for removal or
renaming without breaking styling.
### IDs
+
Don't use ID selectors in CSS.
```scss
diff --git a/doc/development/fe_guide/vuex.md b/doc/development/fe_guide/vuex.md
index bf248b7f8af..557d3132d71 100644
--- a/doc/development/fe_guide/vuex.md
+++ b/doc/development/fe_guide/vuex.md
@@ -1,15 +1,18 @@
# Vuex
+
To manage the state of an application you should use [Vuex][vuex-docs].
_Note:_ All of the below is explained in more detail in the official [Vuex documentation][vuex-docs].
## Separation of concerns
+
Vuex is composed of State, Getters, Mutations, Actions and Modules.
When a user clicks on an action, we need to `dispatch` it. This action will `commit` a mutation that will change the state.
_Note:_ The action itself will not update the state, only a mutation should update the state.
## File structure
+
When using Vuex at GitLab, separate these concerns into different files to improve readability:
```
@@ -21,10 +24,12 @@ When using Vuex at GitLab, separate this concerns into different files to improv
├── state.js # state
└── mutation_types.js # mutation types
```
+
The following example shows an application that lists and adds users to the state.
(For a more complex example implementation take a look at the security applications store in [here](https://gitlab.com/gitlab-org/gitlab-ee/tree/master/ee/app/assets/javascripts/vue_shared/security_reports/store))
### `index.js`
+
This is the entry point for our store. You can use the following as a guide:
```javascript
@@ -47,6 +52,7 @@ export default createStore();
```
### `state.js`
+
The first thing you should do before writing any code is to design the state.
Often we need to provide data from haml to our Vue application. Let's store it in the state for better access.
@@ -66,9 +72,11 @@ Often we need to provide data from haml to our Vue application. Let's store it i
```
#### Access `state` properties
+
You can use `mapState` to access state properties in the components.
### `actions.js`
+
An action is a payload of information to send data from our application to our store.
An action is usually composed of a `type` and a `payload`, and they describe what happened.
@@ -110,6 +118,7 @@ In this file, we will write the actions that will call the respective mutations:
```
#### Actions Pattern: `request` and `receive` namespaces
+
When a request is made we often want to show a loading state to the user.
Instead of creating an action to toggle the loading state and dispatch it in the component,
@@ -136,6 +145,7 @@ By following this pattern we guarantee:
1. Actions are simple and straightforward
#### Dispatching actions
+
To dispatch an action from a component, use the `mapActions` helper:
```javascript
@@ -154,6 +164,7 @@ import { mapActions } from 'vuex';
```
### `mutations.js`
+
The mutations specify how the application state changes in response to actions sent to the store.
The only way to change state in a Vuex store should be by committing a mutation.
@@ -193,6 +204,7 @@ Remember that actions only describe that something happened, they don't describe
```
### `getters.js`
+
Sometimes we may need to get derived state based on store state, like filtering for a specific prop.
Using a getter will also cache the result based on dependencies due to [how computed props work](https://vuejs.org/v2/guide/computed.html#Computed-Caching-vs-Methods)
This can be done through the `getters`:
@@ -219,6 +231,7 @@ import { mapGetters } from 'vuex';
```
### `mutation_types.js`
+
From [vuex mutations docs][vuex-mutations]:
> It is a commonly seen pattern to use constants for mutation types in various Flux implementations. This allows the code to take advantage of tooling like linters, and putting all constants in a single file allows your collaborators to get an at-a-glance view of what mutations are possible in the entire application.
@@ -227,6 +240,7 @@ export const ADD_USER = 'ADD_USER';
```
### How to include the store in your application
+
The store should be included in the main component of your application:
```javascript
@@ -241,6 +255,7 @@ The store should be included in the main component of your application:
```
### Communicating with the Store
+
```javascript
<script>
import { mapActions, mapState, mapGetters } from 'vuex';
@@ -298,29 +313,33 @@ export default {
1. Do not call a mutation directly. Always use an action to commit a mutation. Doing so will keep consistency throughout the application. From Vuex docs:
- > why don't we just call store.commit('action') directly? Well, remember that mutations must be synchronous? Actions aren't. We can perform asynchronous operations inside an action.
+ > Why don't we just call store.commit('action') directly? Well, remember that mutations must be synchronous? Actions aren't. We can perform asynchronous operations inside an action.
- ```javascript
- // component.vue
+ ```javascript
+ // component.vue
- // bad
- created() {
- this.$store.commit('mutation');
- }
+ // bad
+ created() {
+ this.$store.commit('mutation');
+ }
+
+ // good
+ created() {
+ this.$store.dispatch('action');
+ }
+ ```
- // good
- created() {
- this.$store.dispatch('action');
- }
- ```
1. Use mutation types instead of hardcoding strings. It will be less error prone (see the sketch after this list).
1. The state will be accessible in all components descending from the component where the store is instantiated.
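
A sketch of the mutation-types pattern, building on the `ADD_USER` type shown earlier:

```javascript
// mutation_types.js
export const ADD_USER = 'ADD_USER';

// mutations.js
import * as types from './mutation_types';

export default {
  [types.ADD_USER](state, user) {
    state.users.push(user);
  },
};

// actions.js
import * as types from './mutation_types';

export const addUser = ({ commit }, user) => commit(types.ADD_USER, user);
```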
### Testing Vuex
+
#### Testing Vuex concerns
+
Refer to [vuex docs][vuex-testing] regarding testing Actions, Getters and Mutations.
#### Testing components that need a store
+
Smaller components might use `store` properties to access the data.
In order to write unit tests for those components, we need to include the store and provide the correct state:
@@ -363,6 +382,7 @@ describe('component', () => {
```
#### Testing Vuex actions and getters
+
Because we're currently using [`babel-plugin-rewire`](https://github.com/speedskater/babel-plugin-rewire), you may encounter the following error when testing your Vuex actions and getters:
`[vuex] actions should be function or object with "handler" function`
diff --git a/doc/development/file_storage.md b/doc/development/file_storage.md
index 02874d18a30..475d1c1611e 100644
--- a/doc/development/file_storage.md
+++ b/doc/development/file_storage.md
@@ -92,8 +92,8 @@ in your uploader, you need to either 1) include `RecordsUpload::Concern` and pre
The `CarrierWave::Uploader#store_dir` is overridden to
- - `GitlabUploader.base_dir` + `GitlabUploader.dynamic_segment` when the store is LOCAL
- - `GitlabUploader.dynamic_segment` when the store is REMOTE (the bucket name is used to namespace)
+- `GitlabUploader.base_dir` + `GitlabUploader.dynamic_segment` when the store is LOCAL
+- `GitlabUploader.dynamic_segment` when the store is REMOTE (the bucket name is used to namespace)
### Using `ObjectStorage::Extension::RecordsUploads`
diff --git a/doc/development/filtering_by_label.md b/doc/development/filtering_by_label.md
new file mode 100644
index 00000000000..5e7376db725
--- /dev/null
+++ b/doc/development/filtering_by_label.md
@@ -0,0 +1,166 @@
+# Filtering by label
+
+## Introduction
+
+GitLab has [labels](../user/project/labels.md) that can be assigned to issues,
+merge requests, and epics. Labels on those objects are a many-to-many relation
+through the polymorphic `label_links` table.
+
+To filter these objects by multiple labels - for instance, 'all open
+issues with the label ~Plan and the label ~backend' - we generate a
+query containing a `GROUP BY` clause. In a simple form, this looks like:
+
+```sql
+SELECT
+ issues.*
+FROM
+ issues
+ INNER JOIN label_links ON label_links.target_id = issues.id
+ AND label_links.target_type = 'Issue'
+ INNER JOIN labels ON labels.id = label_links.label_id
+WHERE
+ issues.project_id = 13083
+ AND (issues.state IN ('opened'))
+ AND labels.title IN ('Plan',
+ 'backend')
+GROUP BY
+ issues.id
+HAVING (COUNT(DISTINCT labels.title) = 2)
+ORDER BY
+ issues.updated_at DESC,
+ issues.id DESC
+LIMIT 20 OFFSET 0
+```
+
+In particular, note that:
+
+1. We `GROUP BY issues.id` so that we can ...
+1. Use the `HAVING (COUNT(DISTINCT labels.title) = 2)` condition to ensure that
+ all matched issues have both labels.
+
+This is more complicated than is ideal. It makes the query construction more
+prone to errors (such as
+[gitlab-org/gitlab-ce#15557](https://gitlab.com/gitlab-org/gitlab-ce/issues/15557)).
+
+## Attempt A: WHERE EXISTS
+
+### Attempt A1: use multiple subqueries with WHERE EXISTS
+
+In
+[gitlab-org/gitlab-ce#37137](https://gitlab.com/gitlab-org/gitlab-ce/issues/37137)
+and its associated merge request
+[gitlab-org/gitlab-ce!14022](https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/14022),
+we tried to replace the `GROUP BY` with multiple uses of `WHERE EXISTS`. For the
+example above, this would give:
+
+```sql
+WHERE (EXISTS (
+ SELECT
+ TRUE
+ FROM
+ label_links
+ INNER JOIN labels ON labels.id = label_links.label_id
+ WHERE
+ labels.title = 'Plan'
+ AND target_type = 'Issue'
+ AND target_id = issues.id))
+AND (EXISTS (
+ SELECT
+ TRUE
+ FROM
+ label_links
+ INNER JOIN labels ON labels.id = label_links.label_id
+ WHERE
+ labels.title = 'backend'
+ AND target_type = 'Issue'
+ AND target_id = issues.id))
+```
+
+While this worked without schema changes, and did improve readability somewhat,
+it did not improve query performance.
+
+## Attempt B: Denormalize using an array column
+
+Having [removed MySQL support in GitLab
+12.1](https://about.gitlab.com/2019/06/27/removing-mysql-support/), using
+[Postgres's arrays](https://www.postgresql.org/docs/9.6/arrays.html) became more
+tractable as we didn't have to support two databases. We discussed denormalizing
+the `label_links` table for querying in
+[gitlab-org/gitlab-ce#49651](https://gitlab.com/gitlab-org/gitlab-ce/issues/49651),
+with two options: label IDs and titles.
+
+We can think of both of those as array columns on `issues`, `merge_requests`,
+and `epics`: `issues.label_ids` would be an array column of label IDs, and
+`issues.label_titles` would be an array of label titles.
+
+These array columns can be complemented with [GIN
+indexes](https://www.postgresql.org/docs/9.6/gin-intro.html) to improve
+matching.
+
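+For illustration, with a hypothetical `label_titles` array column on `issues`,
+the index and a matching query could look like:
+
+```sql
+-- Hypothetical schema: issues.label_titles text[]
+CREATE INDEX index_issues_on_label_titles ON issues USING GIN (label_titles);
+
+-- The @> (contains) operator can then use the GIN index
+SELECT issues.*
+FROM issues
+WHERE issues.label_titles @> ARRAY['Plan', 'backend'];
+```
+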
+### Attempt B1: store label IDs for each object
+
+This has some strong advantages over titles:
+
+1. Unless a label is deleted, or a project is moved, we never need to
+ bulk-update the denormalized column.
+1. It uses less storage than the titles.
+
+Unfortunately, our application design makes this hard. If we were able to query
+just by label ID easily, we wouldn't need the `INNER JOIN labels` in the initial
+query at the start of this document. GitLab allows users to filter by label
+title across projects and even across groups, so a filter by the label ~Plan may
+include labels with multiple distinct IDs.
+
+We do not want users to have to know about the different IDs, which means that
+given this data set:
+
+| Project | ~Plan label ID | ~backend label ID |
+| ------- | -------------- | ----------------- |
+| A | 11 | 12 |
+| B | 21 | 22 |
+| C | 31 | 32 |
+
+We would need something like:
+
+```sql
+WHERE
+ label_ids @> ARRAY[11, 12]
+ OR label_ids @> ARRAY[21, 22]
+ OR label_ids @> ARRAY[31, 32]
+```
+
+This can get even more complicated when we consider that in some cases, there
+might be two ~backend labels - with different IDs - that could apply to the same
+object, so the number of combinations would balloon further.
+
+### Attempt B2: store label titles for each object
+
+From the perspective of updating the labelable object, this is the worst
+option. We have to bulk update the objects when:
+
+1. The objects are moved from one project to another.
+1. The project is moved from one group to another.
+1. The label is renamed.
+1. The label is deleted.
+
+It also uses much more storage. Querying is simple, though:
+
+```sql
+WHERE
+ label_titles @> ARRAY['Plan', 'backend']
+```
+
+And our [tests in
+gitlab-org/gitlab-ce#49651](https://gitlab.com/gitlab-org/gitlab-ce/issues/49651#note_188777346)
+showed that this could be fast.
+
+However, at present, the disadvantages outweigh the advantages.
+
+## Conclusion
+
+We have yet to find a method that is demonstrably better than the current
+method, when considering:
+
+1. Query performance.
+1. Readability.
+1. Ease of maintaining schema consistency.
diff --git a/doc/development/geo.md b/doc/development/geo.md
index 685d4e44ad3..24f16eae9fa 100644
--- a/doc/development/geo.md
+++ b/doc/development/geo.md
@@ -341,9 +341,9 @@ not used, so sessions etc. aren't shared between nodes.
GitLab can optionally use Object Storage to store data it would
otherwise store on disk. These things can be:
- - LFS Objects
- - CI Job Artifacts
- - Uploads
+- LFS Objects
+- CI Job Artifacts
+- Uploads
Objects that are stored in object storage are not handled by Geo. Geo
ignores items in object storage. Either:
@@ -412,15 +412,15 @@ The Geo **primary** stores events in the `geo_event_log` table. Each
entry in the log contains a specific type of event. These type of
events include:
- - Repository Deleted event
- - Repository Renamed event
- - Repositories Changed event
- - Repository Created event
- - Hashed Storage Migrated event
- - Lfs Object Deleted event
- - Hashed Storage Attachments event
- - Job Artifact Deleted event
- - Upload Deleted event
+- Repository Deleted event
+- Repository Renamed event
+- Repositories Changed event
+- Repository Created event
+- Hashed Storage Migrated event
+- Lfs Object Deleted event
+- Hashed Storage Attachments event
+- Job Artifact Deleted event
+- Upload Deleted event
### Geo Log Cursor
@@ -526,4 +526,4 @@ old method:
- Replication is synchronous and we preserve the order of events.
- Replication of the events happens at the same time as the changes in the
- database.
+ database.
diff --git a/doc/development/git_object_deduplication.md b/doc/development/git_object_deduplication.md
index c103a4527ff..5ce59891afa 100644
--- a/doc/development/git_object_deduplication.md
+++ b/doc/development/git_object_deduplication.md
@@ -79,11 +79,11 @@ at the Rails application level in SQL.
In conclusion, we need three things for effective object deduplication
across a collection of GitLab project repositories at the Git level:
-1. A pool repository must exist.
-2. The participating project repositories must be linked to the pool
- repository via their respective `objects/info/alternates` files.
-3. The pool repository must contain Git object data common to the
- participating project repositories.
+1. A pool repository must exist.
+1. The participating project repositories must be linked to the pool
+ repository via their respective `objects/info/alternates` files.
+1. The pool repository must contain Git object data common to the
+ participating project repositories.
### Deduplication factor
@@ -105,71 +105,71 @@ With pool repositories we made a fresh start. These live in their own
`pool_repositories` SQL table. The relations between these two tables
are as follows:
-- a `Project` belongs to at most one `PoolRepository`
- (`project.pool_repository`)
-- as an automatic consequence of the above, a `PoolRepository` has
- many `Project`s
-- a `PoolRepository` has exactly one "source `Project`"
- (`pool.source_project`)
+- a `Project` belongs to at most one `PoolRepository`
+ (`project.pool_repository`)
+- as an automatic consequence of the above, a `PoolRepository` has
+ many `Project`s
+- a `PoolRepository` has exactly one "source `Project`"
+ (`pool.source_project`)
> TODO Fix invalid SQL data for pools created prior to GitLab 11.11
> <https://gitlab.com/gitlab-org/gitaly/issues/1653>.
### Assumptions
-- All repositories in a pool must use [hashed
- storage](../administration/repository_storage_types.md). This is so
- that we don't have to ever worry about updating paths in
- `object/info/alternates` files.
-- All repositories in a pool must be on the same Gitaly storage shard.
- The Git alternates mechanism relies on direct disk access across
- multiple repositories, and we can only assume direct disk access to
- be possible within a Gitaly storage shard.
-- The only two ways to remove a member project from a pool are (1) to
- delete the project or (2) to move the project to another Gitaly
- storage shard.
+- All repositories in a pool must use [hashed
+ storage](../administration/repository_storage_types.md). This is so
+ that we don't have to ever worry about updating paths in
+ `object/info/alternates` files.
+- All repositories in a pool must be on the same Gitaly storage shard.
+ The Git alternates mechanism relies on direct disk access across
+ multiple repositories, and we can only assume direct disk access to
+ be possible within a Gitaly storage shard.
+- The only two ways to remove a member project from a pool are (1) to
+ delete the project or (2) to move the project to another Gitaly
+ storage shard.
### Creating pools and pool memberships
-- When a pool gets created, it must have a source project. The initial
- contents of the pool repository are a Git clone of the source
- project repository.
-- The occasion for creating a pool is when an existing eligible
- (public, hashed storage, non-forked) GitLab project gets forked and
- this project does not belong to a pool repository yet. The fork
- parent project becomes the source project of the new pool, and both
- the fork parent and the fork child project become members of the new
- pool.
-- Once project A has become the source project of a pool, all future
- eligible forks of A will become pool members.
-- If the fork source is itself a fork, the resulting repository will
- neither join the repository nor will a new pool repository be
- seeded.
-
- eg:
-
- Suppose fork A is part of a pool repository, any forks created off
- of fork A *will not* be a part of the pool repository that fork A is
- a part of.
-
- Suppose B is a fork of A, and A does not belong to an object pool.
- Now C gets created as a fork of B. C will not be part of a pool
- repository.
+- When a pool gets created, it must have a source project. The initial
+ contents of the pool repository are a Git clone of the source
+ project repository.
+- The occasion for creating a pool is when an existing eligible
+ (public, hashed storage, non-forked) GitLab project gets forked and
+ this project does not belong to a pool repository yet. The fork
+ parent project becomes the source project of the new pool, and both
+ the fork parent and the fork child project become members of the new
+ pool.
+- Once project A has become the source project of a pool, all future
+ eligible forks of A will become pool members.
+- If the fork source is itself a fork, the resulting repository will
+ neither join the pool repository nor will a new pool repository be
+ seeded.
+
+ eg:
+
+ Suppose fork A is part of a pool repository; any forks created off
+ of fork A *will not* be a part of the pool repository that fork A is
+ a part of.
+
+ Suppose B is a fork of A, and A does not belong to an object pool.
+ Now C gets created as a fork of B. C will not be part of a pool
+ repository.
> TODO should forks of forks be deduplicated?
> <https://gitlab.com/gitlab-org/gitaly/issues/1532>
### Consequences
-- If a normal Project participating in a pool gets moved to another
- Gitaly storage shard, its "belongs to PoolRepository" relation will
- be broken. Because of the way moving repositories between shard is
- implemented, we will automatically get a fresh self-contained copy
- of the project's repository on the new storage shard.
-- If the source project of a pool gets moved to another Gitaly storage
- shard or is deleted the "source project" relation is not broken.
- However, as of GitLab 12.0 a pool will not fetch from a source
- unless the source is on the same Gitaly shard.
+- If a normal Project participating in a pool gets moved to another
+ Gitaly storage shard, its "belongs to PoolRepository" relation will
+ be broken. Because of the way moving repositories between shards is
+ implemented, we will automatically get a fresh self-contained copy
+ of the project's repository on the new storage shard.
+- If the source project of a pool gets moved to another Gitaly storage
+ shard or is deleted the "source project" relation is not broken.
+ However, as of GitLab 12.0 a pool will not fetch from a source
+ unless the source is on the same Gitaly shard.
## Consistency between the SQL pool relation and Gitaly
diff --git a/doc/development/go_guide/index.md b/doc/development/go_guide/index.md
index f09339eb3a4..83444093f9c 100644
--- a/doc/development/go_guide/index.md
+++ b/doc/development/go_guide/index.md
@@ -107,6 +107,32 @@ Modules](https://github.com/golang/go/wiki/Modules). It provides a way to
define and lock dependencies for reproducible builds. It should be used
whenever possible.
+When Go Modules are in use, there should not be a `vendor/` directory. Instead,
+Go will automatically download dependencies when they are needed to build the
+project. This is in line with how dependencies are handled with Bundler in Ruby
+projects, and makes merge requests easier to review.
+
+In some cases, such as building a Go project so that it can act as a dependency of a
+CI run for another project, removing the `vendor/` directory means the code must
+be downloaded repeatedly, which can lead to intermittent problems due to rate
+limiting or network failures. In these circumstances, you should cache the
+downloaded code between runs with a `.gitlab-ci.yml` snippet like this:
+
+```yaml
+.go-cache:
+ variables:
+ GOPATH: $CI_PROJECT_DIR/.go
+ before_script:
+ - mkdir -p .go
+ cache:
+ paths:
+ - .go/pkg/mod/
+
+test:
+ extends: .go-cache
+ # ...
+```
+
There was a [bug on modules
checksums](https://github.com/golang/go/issues/29278) in Go < v1.11.4, so make
sure to use at least this version to avoid `checksum mismatch` errors.
@@ -129,17 +155,89 @@ deploy a new pod, migrating the data automatically.
## Testing
+### Testing frameworks
+
We should not use any specific library or framework for testing, as the
[standard library](https://golang.org/pkg/) provides everything we need to get
-started. For example, some external dependencies might be worth considering in
-case we decide to use a specific library or framework:
+started. If there is a need for more sophisticated testing tools, the following
+external dependencies might be worth considering:
- [Testify](https://github.com/stretchr/testify)
- [httpexpect](https://github.com/gavv/httpexpect)
+### Subtests
+
Use [subtests](https://blog.golang.org/subtests) whenever possible to improve
code readability and test output.
+### Better output in tests
+
+When comparing expected and actual values in tests, use
+[testify/require.Equal](https://godoc.org/github.com/stretchr/testify/require#Equal),
+[testify/require.EqualError](https://godoc.org/github.com/stretchr/testify/require#EqualError),
+[testify/require.EqualValues](https://godoc.org/github.com/stretchr/testify/require#EqualValues),
+and others to improve readability when comparing structs, errors,
+large portions of text, or JSON documents:
+
+```go
+type TestData struct {
+ // ...
+}
+
+func FuncUnderTest() TestData {
+ // ...
+}
+
+func Test(t *testing.T) {
+ t.Run("FuncUnderTest", func(t *testing.T) {
+ want := TestData{}
+ got := FuncUnderTest()
+
+ require.Equal(t, want, got) // note that expected value comes first, then comes the actual one ("diff" semantics)
+ })
+}
+```
+
+### Table-Driven Tests
+
+Using [Table-Driven Tests](https://github.com/golang/go/wiki/TableDrivenTests)
+is generally good practice when you have multiple entries of
+inputs/outputs for the same function. Below are some guidelines one can
+follow when writing table-driven tests. These guidelines are mostly
+extracted from the Go standard library source code. Keep in mind it's OK not
+to follow these guidelines when it makes sense.
+
+#### Defining test cases
+
+Each table entry is a complete test case with inputs and expected
+results, and sometimes with additional information such as a test name
+to make the test output easily readable.
+
+- [Define a slice of anonymous struct](https://github.com/golang/go/blob/50bd1c4d4eb4fac8ddeb5f063c099daccfb71b26/src/encoding/csv/reader_test.go#L16)
+ inside of the test.
+- [Define a slice of anonymous struct](https://github.com/golang/go/blob/55d31e16c12c38d36811bdee65ac1f7772148250/src/cmd/go/internal/module/module_test.go#L9-L66)
+ outside of the test.
+- [Named structs](https://github.com/golang/go/blob/2e0cd2aef5924e48e1ceb74e3d52e76c56dd34cc/src/cmd/go/internal/modfetch/coderepo_test.go#L54-L69)
+ for code reuse.
+- [Using `map[string]struct{}`](https://github.com/golang/go/blob/6d5caf38e37bf9aeba3291f1f0b0081f934b1187/src/cmd/trace/annotations_test.go#L180-L235).
+
+#### Contents of the test case
+
+- Ideally, each test case should have a field with a unique identifier
+ to use for naming subtests. In the Go standard library, this is commonly the
+ `name string` field.
+- Use `want`/`expect`/`actual` when you are specifying something in the
+ test case that will be used for assertion.
+
+#### Variable names
+
+- Each table-driven test map/slice of struct can be named `tests`.
+- When looping through `tests` the anonymous struct can be referred
+ to as `tt` or `tc`.
+- The description of the test can be referred to as
+ `name`/`testName`/`tn`.
+
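+Putting these guidelines together, a minimal sketch (the function under test is
+hypothetical; `require` is `github.com/stretchr/testify/require`, as used above):
+
+```go
+func TestCountRunes(t *testing.T) {
+    tests := []struct {
+        name  string
+        input string
+        want  int
+    }{
+        {name: "empty string", input: "", want: 0},
+        {name: "ascii", input: "abc", want: 3},
+    }
+
+    for _, tc := range tests {
+        t.Run(tc.name, func(t *testing.T) {
+            got := countRunes(tc.input) // hypothetical function under test
+
+            require.Equal(t, tc.want, got)
+        })
+    }
+}
+```
+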
### Benchmarks
Programs handling a lot of IO or complex operations should always include
diff --git a/doc/development/gotchas.md b/doc/development/gotchas.md
index 13dda17bb7d..941eea2609e 100644
--- a/doc/development/gotchas.md
+++ b/doc/development/gotchas.md
@@ -53,7 +53,7 @@ When run, this spec doesn't do what we might expect:
(compared using ==)
```
-That's because FactoryBot sequences are not reseted for each example.
+This is because FactoryBot sequences are not reset for each example.
Please remember that sequence-generated values exist only to avoid having to
explicitly set attributes that have a uniqueness constraint when using a factory.
diff --git a/doc/development/hash_indexes.md b/doc/development/hash_indexes.md
index e6c1b3590b1..417ea18e22f 100644
--- a/doc/development/hash_indexes.md
+++ b/doc/development/hash_indexes.md
@@ -1,6 +1,6 @@
# Hash Indexes
-Both PostgreSQL and MySQL support hash indexes besides the regular btree
+PostgreSQL supports hash indexes in addition to the regular btree
indexes. Hash indexes however are to be avoided at all costs. While they may
_sometimes_ provide better performance the cost of rehashing can be very high.
More importantly: at least until PostgreSQL 10.0 hash indexes are not
diff --git a/doc/development/i18n/externalization.md b/doc/development/i18n/externalization.md
index 17462887162..141f5a8d6d9 100644
--- a/doc/development/i18n/externalization.md
+++ b/doc/development/i18n/externalization.md
@@ -21,18 +21,18 @@ The following tools are used:
1. [`gettext_i18n_rails`](https://github.com/grosser/gettext_i18n_rails): this
gem allows us to translate content from models, views and controllers. Also
it gives us access to the following raketasks:
- - `rake gettext:find`: Parses almost all the files from the
- Rails application looking for content that has been marked for
- translation. Finally, it updates the PO files with the new content that
- it has found.
- - `rake gettext:pack`: Processes the PO files and generates the
- MO files that are binary and are finally used by the application.
+ - `rake gettext:find`: Parses almost all the files from the
+ Rails application looking for content that has been marked for
+ translation. Finally, it updates the PO files with the new content that
+ it has found.
+ - `rake gettext:pack`: Processes the PO files and generates the
+ MO files that are binary and are finally used by the application.
1. [`gettext_i18n_rails_js`](https://github.com/webhippie/gettext_i18n_rails_js):
this gem is useful to make the translations available in JavaScript. It
provides the following raketask:
- - `rake gettext:po_to_json`: Reads the contents from the PO files and
- generates JSON files containing all the available translations.
+ - `rake gettext:po_to_json`: Reads the contents from the PO files and
+ generates JSON files containing all the available translations.
1. PO editor: there are multiple applications that can help us to work with PO
files; a good option is [Poedit](https://poedit.net/download), which is
@@ -139,60 +139,61 @@ For example use `%{created_at}` in Ruby but `%{createdAt}` in JavaScript. Make s
- In Ruby/HAML:
- ```ruby
- _("Hello %{name}") % { name: 'Joe' } => 'Hello Joe'
- ```
+ ```ruby
+ _("Hello %{name}") % { name: 'Joe' } => 'Hello Joe'
+ ```
- In JavaScript:
- ```js
- import { __, sprintf } from '~/locale';
+ ```js
+ import { __, sprintf } from '~/locale';
- sprintf(__('Hello %{username}'), { username: 'Joe' }); // => 'Hello Joe'
- ```
+ sprintf(__('Hello %{username}'), { username: 'Joe' }); // => 'Hello Joe'
+ ```
- By default, `sprintf` escapes the placeholder values.
- If you want to take care of that yourself, you can pass `false` as third argument.
+ By default, `sprintf` escapes the placeholder values.
+ If you want to take care of that yourself, you can pass `false` as third argument.
- ```js
- import { __, sprintf } from '~/locale';
+ ```js
+ import { __, sprintf } from '~/locale';
- sprintf(__('This is %{value}'), { value: '<strong>bold</strong>' }); // => 'This is &lt;strong&gt;bold&lt;/strong&gt;'
- sprintf(__('This is %{value}'), { value: '<strong>bold</strong>' }, false); // => 'This is <strong>bold</strong>'
- ```
+ sprintf(__('This is %{value}'), { value: '<strong>bold</strong>' }); // => 'This is &lt;strong&gt;bold&lt;/strong&gt;'
+ sprintf(__('This is %{value}'), { value: '<strong>bold</strong>' }, false); // => 'This is <strong>bold</strong>'
+ ```
### Plurals
- In Ruby/HAML:
- ```ruby
- n_('Apple', 'Apples', 3)
- # => 'Apples'
- ```
+ ```ruby
+ n_('Apple', 'Apples', 3)
+ # => 'Apples'
+ ```
- Using interpolation:
- ```ruby
- n_("There is a mouse.", "There are %d mice.", size) % size
- # => When size == 1: 'There is a mouse.'
- # => When size == 2: 'There are 2 mice.'
- ```
+ Using interpolation:
- Avoid using `%d` or count variables in singular strings. This allows more natural translation in some languages.
+ ```ruby
+ n_("There is a mouse.", "There are %d mice.", size) % size
+ # => When size == 1: 'There is a mouse.'
+ # => When size == 2: 'There are 2 mice.'
+ ```
+
+ Avoid using `%d` or count variables in singular strings. This allows more natural translation in some languages.
- In JavaScript:
- ```js
- n__('Apple', 'Apples', 3)
- // => 'Apples'
- ```
+ ```js
+ n__('Apple', 'Apples', 3)
+ // => 'Apples'
+ ```
- Using interpolation:
+ Using interpolation:
- ```js
- n__('Last day', 'Last %d days', x)
- // => When x == 1: 'Last day'
- // => When x == 2: 'Last 2 days'
- ```
+ ```js
+ n__('Last day', 'Last %d days', x)
+ // => When x == 1: 'Last day'
+ // => When x == 2: 'Last 2 days'
+ ```
### Namespaces
@@ -202,17 +203,17 @@ Namespaces should be PascalCase.
- In Ruby/HAML:
- ```ruby
- s_('OpenedNDaysAgo|Opened')
- ```
+ ```ruby
+ s_('OpenedNDaysAgo|Opened')
+ ```
- In case the translation is not found it will return `Opened`.
+ In case the translation is not found it will return `Opened`.
- In JavaScript:
- ```js
- s__('OpenedNDaysAgo|Opened')
- ```
+ ```js
+ s__('OpenedNDaysAgo|Opened')
+ ```
Note: The namespace should be removed from the translation. See the [translation
guidelines for more details](translation.md#namespaced-strings).
@@ -235,12 +236,12 @@ This makes use of [`Intl.DateTimeFormat`].
- In Ruby/HAML, we have two ways of adding format to dates and times:
1. **Through the `l` helper**, i.e. `l(active_session.created_at, format: :short)`. We have some predefined formats for
- [dates](https://gitlab.com/gitlab-org/gitlab-ce/blob/v11.7.0/config/locales/en.yml#L54) and [times](https://gitlab.com/gitlab-org/gitlab-ce/blob/v11.7.0/config/locales/en.yml#L261).
- If you need to add a new format, because other parts of the code could benefit from it,
- you'll need to add it to [en.yml](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/config/locales/en.yml) file.
+ [dates](https://gitlab.com/gitlab-org/gitlab-ce/blob/v11.7.0/config/locales/en.yml#L54) and [times](https://gitlab.com/gitlab-org/gitlab-ce/blob/v11.7.0/config/locales/en.yml#L261).
+ If you need to add a new format, because other parts of the code could benefit from it,
+ you'll need to add it to [en.yml](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/config/locales/en.yml) file.
1. **Through `strftime`**, i.e. `milestone.start_date.strftime('%b %-d')`. We use `strftime` in case none of the formats
- defined on [en.yml](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/config/locales/en.yml) matches the date/time
- specifications we need, and if there is no need to add it as a new format because is very particular (i.e. it's only used in a single view).
+ defined on [en.yml](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/config/locales/en.yml) matches the date/time
+ specifications we need, and there is no need to add it as a new format because it is very particular (i.e. it's only used in a single view).
## Best practices
@@ -268,40 +269,40 @@ should be externalized as follows:
This also applies when using links in between translated sentences; otherwise, these texts are not translatable in certain languages.
- In Ruby/HAML, instead of:
-
- ```haml
- - zones_link = link_to(s_('ClusterIntegration|zones'), 'https://cloud.google.com/compute/docs/regions-zones/regions-zones', target: '_blank', rel: 'noopener noreferrer')
- = s_('ClusterIntegration|Learn more about %{zones_link}').html_safe % { zones_link: zones_link }
- ```
-
- Set the link starting and ending HTML fragments as variables like so:
-
- ```haml
- - zones_link_url = 'https://cloud.google.com/compute/docs/regions-zones/regions-zones'
- - zones_link_start = '<a href="%{url}" target="_blank" rel="noopener noreferrer">'.html_safe % { url: zones_link_url }
- = s_('ClusterIntegration|Learn more about %{zones_link_start}zones%{zones_link_end}').html_safe % { zones_link_start: zones_link_start, zones_link_end: '</a>'.html_safe }
- ```
+
+ ```haml
+ - zones_link = link_to(s_('ClusterIntegration|zones'), 'https://cloud.google.com/compute/docs/regions-zones/regions-zones', target: '_blank', rel: 'noopener noreferrer')
+ = s_('ClusterIntegration|Learn more about %{zones_link}').html_safe % { zones_link: zones_link }
+ ```
+
+ Set the link starting and ending HTML fragments as variables like so:
+
+ ```haml
+ - zones_link_url = 'https://cloud.google.com/compute/docs/regions-zones/regions-zones'
+ - zones_link_start = '<a href="%{url}" target="_blank" rel="noopener noreferrer">'.html_safe % { url: zones_link_url }
+ = s_('ClusterIntegration|Learn more about %{zones_link_start}zones%{zones_link_end}').html_safe % { zones_link_start: zones_link_start, zones_link_end: '</a>'.html_safe }
+ ```
- In JavaScript, instead of:
- ```js
- {{
- sprintf(s__("ClusterIntegration|Learn more about %{link}"), {
- link: '<a href="https://cloud.google.com/compute/docs/regions-zones/regions-zones" target="_blank" rel="noopener noreferrer">zones</a>'
- })
- }}
- ```
-
- Set the link starting and ending HTML fragments as variables like so:
-
- ```js
- {{
- sprintf(s__("ClusterIntegration|Learn more about %{linkStart}zones%{linkEnd}"), {
- linkStart: '<a href="https://cloud.google.com/compute/docs/regions-zones/regions-zones" target="_blank" rel="noopener noreferrer">'
- linkEnd: '</a>',
- })
- }}
- ```
+ ```js
+ {{
+ sprintf(s__("ClusterIntegration|Learn more about %{link}"), {
+ link: '<a href="https://cloud.google.com/compute/docs/regions-zones/regions-zones" target="_blank" rel="noopener noreferrer">zones</a>'
+ })
+ }}
+ ```
+
+ Set the link starting and ending HTML fragments as variables like so:
+
+ ```js
+ {{
+ sprintf(s__("ClusterIntegration|Learn more about %{linkStart}zones%{linkEnd}"), {
+ linkStart: '<a href="https://cloud.google.com/compute/docs/regions-zones/regions-zones" target="_blank" rel="noopener noreferrer">'
+ linkEnd: '</a>',
+ })
+ }}
+ ```
The reasoning behind this is that in some languages words change depending on context. For example in Japanese は is added to the subject of a sentence and を to the object. This is impossible to translate correctly if we extract individual words from the sentence.
@@ -374,29 +375,29 @@ Let's suppose you want to add translations for a new language, let's say French.
1. The first step is to register the new language in `lib/gitlab/i18n.rb`:
- ```ruby
- ...
- AVAILABLE_LANGUAGES = {
- ...,
- 'fr' => 'Français'
- }.freeze
- ...
- ```
+ ```ruby
+ ...
+ AVAILABLE_LANGUAGES = {
+ ...,
+ 'fr' => 'Français'
+ }.freeze
+ ...
+ ```
1. Next, you need to add the language:
- ```sh
- bin/rake gettext:add_language[fr]
- ```
+ ```sh
+ bin/rake gettext:add_language[fr]
+ ```
- If you want to add a new language for a specific region, the command is similar,
- you just need to separate the region with an underscore (`_`). For example:
+ If you want to add a new language for a specific region, the command is similar,
+ you just need to separate the region with an underscore (`_`). For example:
- ```sh
- bin/rake gettext:add_language[en_GB]
- ```
+ ```sh
+ bin/rake gettext:add_language[en_GB]
+ ```
- Please note that you need to specify the region part in capitals.
+ Please note that you need to specify the region part in capitals.
1. Now that the language is added, a new directory has been created under the
path: `locale/fr/`. You can now start using your PO editor to edit the PO file
@@ -406,9 +407,9 @@ Let's suppose you want to add translations for a new language, let's say French.
in order to generate the binary MO files and finally update the JSON files
containing the translations:
- ```sh
- bin/rake gettext:compile
- ```
+ ```sh
+ bin/rake gettext:compile
+ ```
1. In order to see the translated content we need to change our preferred language
which can be found under the user's **Settings** (`/profile`).
@@ -416,7 +417,7 @@ Let's suppose you want to add translations for a new language, let's say French.
1. After checking that the changes are ok, you can proceed to commit the new files.
For example:
- ```sh
- git add locale/fr/ app/assets/javascripts/locale/fr/
- git commit -m "Add French translations for Cycle Analytics page"
- ```
+ ```sh
+ git add locale/fr/ app/assets/javascripts/locale/fr/
+ git commit -m "Add French translations for Cycle Analytics page"
+ ```
diff --git a/doc/development/i18n/proofreader.md b/doc/development/i18n/proofreader.md
index 910d7296057..492e3d48164 100644
--- a/doc/development/i18n/proofreader.md
+++ b/doc/development/i18n/proofreader.md
@@ -80,6 +80,7 @@ are very appreciative of the work done by translators and proofreaders!
- Russian
- Nikita Grylov - [GitLab](https://gitlab.com/nixel2007), [Crowdin](https://crowdin.com/profile/nixel2007)
- Alexy Lustin - [GitLab](https://gitlab.com/allustin), [Crowdin](https://crowdin.com/profile/lustin)
+ - Mark Minakou - [GitLab](https://gitlab.com/sandzhaj), [Crowdin](https://crowdin.com/profile/sandzhaj)
- NickVolynkin - [Crowdin](https://crowdin.com/profile/NickVolynkin)
- Serbian (Cyrillic)
- Proofreaders needed.
diff --git a/doc/development/img/architecture_simplified.png b/doc/development/img/architecture_simplified.png
index 1698c167c5e..1ad57b65468 100644
--- a/doc/development/img/architecture_simplified.png
+++ b/doc/development/img/architecture_simplified.png
Binary files differ
diff --git a/doc/development/img/distributed_tracing_jaeger_ui.png b/doc/development/img/distributed_tracing_jaeger_ui.png
index 57517dacced..dcd18b1ec9f 100644
--- a/doc/development/img/distributed_tracing_jaeger_ui.png
+++ b/doc/development/img/distributed_tracing_jaeger_ui.png
Binary files differ
diff --git a/doc/development/img/distributed_tracing_performance_bar.png b/doc/development/img/distributed_tracing_performance_bar.png
index c9998cedd2d..8c819045104 100644
--- a/doc/development/img/distributed_tracing_performance_bar.png
+++ b/doc/development/img/distributed_tracing_performance_bar.png
Binary files differ
diff --git a/doc/development/import_export.md b/doc/development/import_export.md
index 64c91f151c5..0c343bc22e4 100644
--- a/doc/development/import_export.md
+++ b/doc/development/import_export.md
@@ -19,6 +19,7 @@ Project.find_by_full_path('group/project').import_state.slice(:jid, :status, :la
grep JID /var/log/gitlab/sidekiq/current
grep "Import/Export error" /var/log/gitlab/sidekiq/current
grep "Import/Export backtrace" /var/log/gitlab/sidekiq/current
+tail /var/log/gitlab/gitlab-rails/importer.log
```
## Troubleshooting performance issues
@@ -229,8 +230,8 @@ meaning that we want to bump the version up in the next version (or patch releas
For example:
1. Add rename to `RelationRenameService` in X.Y
-2. Remove it from `RelationRenameService` in X.Y + 1
-3. Bump Import/Export version in X.Y + 1
+1. Remove it from `RelationRenameService` in X.Y + 1
+1. Bump Import/Export version in X.Y + 1
```ruby
module Gitlab
diff --git a/doc/development/instrumentation.md b/doc/development/instrumentation.md
index 5f95cf3707c..777d372ec60 100644
--- a/doc/development/instrumentation.md
+++ b/doc/development/instrumentation.md
@@ -1,6 +1,6 @@
# Instrumenting Ruby Code
-GitLab Performance Monitoring allows instrumenting of both methods and custom
+[GitLab Performance Monitoring](../administration/monitoring/performance/index.md) allows instrumenting of both methods and custom
blocks of Ruby code. Method instrumentation is the primary form of
instrumentation with block-based instrumentation only being used when we want to
drill down to specific regions of code within a method.
diff --git a/doc/development/interacting_components.md b/doc/development/interacting_components.md
new file mode 100644
index 00000000000..5e6dc8d460a
--- /dev/null
+++ b/doc/development/interacting_components.md
@@ -0,0 +1,29 @@
+# Developing against interacting components or features
+
+It's not uncommon for a single code change to interact with multiple parts of the GitLab
+codebase. Furthermore, an existing feature might have an underlying integration or behavior that
+goes unnoticed even by reviewers and maintainers.
+
+The goal of this section is to briefly list interacting pieces to think about
+when making _backend_ changes that might involve multiple features or [components](architecture.md#components).
+
+## Uploads
+
+GitLab supports uploads to [object storage]. That means every feature and
+change that affects uploads should also be tested against [object storage],
+which is _not_ enabled by default in [GDK](https://gitlab.com/gitlab-org/gitlab-development-kit).
+
+When working on a related feature, make sure to enable and test it
+against [Minio](https://gitlab.com/gitlab-org/gitlab-development-kit/blob/master/doc/howto/object_storage.md).
+
+See also [File Storage in GitLab](file_storage.md).
+
+## Merge requests
+
+### Forks
+
+GitLab supports a large number of features for [merge requests](../user/project/merge_requests/index.md). One
+of them is the ability to create merge requests from and to [forks](../gitlab-basics/fork-project.md),
+which should also be considered and tested during the development phase.
+
+[object storage]: https://docs.gitlab.com/charts/advanced/external-object-storage/
diff --git a/doc/development/licensed_feature_availability.md b/doc/development/licensed_feature_availability.md
index 80ec7b8c0cf..29e4ace157b 100644
--- a/doc/development/licensed_feature_availability.md
+++ b/doc/development/licensed_feature_availability.md
@@ -14,7 +14,7 @@ it should be restricted on namespace scope.
1. Add the feature symbol on `EES_FEATURES`, `EEP_FEATURES` or `EEU_FEATURES` constants in
`ee/app/models/license.rb`. Note on `ee/app/models/ee/namespace.rb` that _Bronze_ GitLab.com
features map to on-premise _EES_, _Silver_ to _EEP_ and _Gold_ to _EEU_.
-2. Check using:
+1. Check using:
```ruby
project.feature_available?(:feature_symbol)
@@ -29,8 +29,8 @@ the instance license.
1. Add the feature symbol on `EES_FEATURES`, `EEP_FEATURES` or `EEU_FEATURES` constants in
`ee/app/models/license.rb`.
-2. Add the same feature symbol to `GLOBAL_FEATURES`
-3. Check using:
+1. Add the same feature symbol to `GLOBAL_FEATURES`
+1. Check using:
```ruby
License.feature_available?(:feature_symbol)
diff --git a/doc/development/migration_style_guide.md b/doc/development/migration_style_guide.md
index 0c7601b415e..3181b3a88cc 100644
--- a/doc/development/migration_style_guide.md
+++ b/doc/development/migration_style_guide.md
@@ -10,9 +10,7 @@ migrations are written carefully, can be applied online and adhere to the style
guide below.
Migrations are **not** allowed to require GitLab installations to be taken
-offline unless _absolutely necessary_. Downtime assumptions should be based on
-the behaviour of a migration when performed using PostgreSQL, as various
-operations in MySQL may require downtime without there being alternatives.
+offline unless _absolutely necessary_.
When downtime is necessary the migration has to be approved by:
@@ -343,10 +341,7 @@ class AddOptionsToBuildMetadata < ActiveRecord::Migration[5.0]
end
```
-On MySQL the `JSON` and `JSONB` is translated to `TEXT 1MB`, as `JSONB` is PostgreSQL only feature.
-
-For above reason you have to use a serializer to provide a translation layer
-in order to support PostgreSQL and MySQL seamlessly:
+You have to use a serializer to provide a translation layer:
```ruby
class BuildMetadata
@@ -356,7 +351,7 @@ end
## Testing
-Make sure that your migration works with MySQL and PostgreSQL with data. An
+Make sure that your migration works with a database that contains data. An
empty database does not guarantee that your migration is correct.
Make sure your migration can be reversed.
diff --git a/doc/development/module_with_instance_variables.md b/doc/development/module_with_instance_variables.md
index 7bdfa04fc57..443eee0b62c 100644
--- a/doc/development/module_with_instance_variables.md
+++ b/doc/development/module_with_instance_variables.md
@@ -1,6 +1,6 @@
-## Modules with instance variables could be considered harmful
+# Modules with instance variables could be considered harmful
-### Background
+## Background
Rails somehow encourages people using modules and instance variables
everywhere. For example, using instance variables in the controllers,
@@ -9,7 +9,7 @@ helpers, and views. They're also encouraging the use of
saving everything in a giant, single object, and people could access
everything in that one giant object.
-### The problems
+## The problems
Of course this is convenient to develop, because we just have everything
within reach. However this has a number of downsides when that chosen object
@@ -24,7 +24,7 @@ manipulated from 3 different modules. It's hard to track when those variables
start giving us troubles. We don't know which module would suddenly change
one of the variables. Everything could touch anything.
-### Similar concerns
+## Similar concerns
People are saying multiple inheritance is bad. Mixing multiple modules with
multiple instance variables scattering everywhere suffer from the same issue.
@@ -40,7 +40,7 @@ Note that `included` doesn't solve the whole issue. They define the
dependencies, but they still allow each modules to talk implicitly via the
instance variables in the final giant object, and that's where the problem is.
-### Solutions
+## Solutions
We should split the giant object into multiple objects, and they communicate
with each other with the API, i.e. public methods. In short, composition over
@@ -53,7 +53,7 @@ With clearly defined API, this would make things less coupled and much easier
to debug and track, and much more extensible for other objects to use, because
they communicate in a clear way, rather than implicit dependencies.
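
For illustration, a minimal sketch of the composition approach (the class and collaborator names are hypothetical):

```ruby
# Pass collaborators in explicitly instead of sharing instance variables via mixins.
class CommitNotifier
  def initialize(commit, mailer:)
    @commit = commit
    @mailer = mailer
  end

  def notify
    @mailer.deliver(subject: @commit.title)
  end
end
```
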
-### Acceptable use
+## Acceptable use
However, it's not always bad to use instance variables in a module,
as long as it's contained in the same module; that is, no other modules or
@@ -74,7 +74,7 @@ Unfortunately it's not easy to code more complex rules into the cop, so
we rely on people's best judgement. If we could find another good pattern
we could easily add to the cop, we should do it.
-### How to rewrite and avoid disabling this cop
+## How to rewrite and avoid disabling this cop
Even if we could just disable the cop, we should avoid doing so. Some code
could be easily rewritten in simple form. Consider this acceptable method:
@@ -181,7 +181,7 @@ rather than whatever includes the module, and those modules which were also
included, making it much easier to track down any issues,
and reducing the chance of having name conflicts.
-### How to disable this cop
+## How to disable this cop
Put the disabling comment right after your code in the same line:
@@ -210,14 +210,14 @@ end
Note that you need to enable it at some point, otherwise everything below
won't be checked.
-### Things we might need to ignore right now
+## Things we might need to ignore right now
Because of the way Rails helpers and mailers work, we might not be able to
avoid the use of instance variables there. For those cases, we could ignore
them at the moment. At least we're not going to share those modules with
other random objects, so they're still somewhat isolated.
-### Instance variables in views
+## Instance variables in views
They're bad because we can't easily tell who's using the instance variables
(from controller's point of view) and where we set them up (from partials'
diff --git a/doc/development/new_fe_guide/dependencies.md b/doc/development/new_fe_guide/dependencies.md
index 12a4f089d41..8a6930acd37 100644
--- a/doc/development/new_fe_guide/dependencies.md
+++ b/doc/development/new_fe_guide/dependencies.md
@@ -15,6 +15,18 @@ Exceptions are made for some tools that we require in the
`gitlab:assets:compile` CI job such as `webpack-bundle-analyzer` to analyze our
production assets post-compile.
+To add or upgrade a dependency, run:
+
+```sh
+yarn add <your dependency here>
+```
+
+This may introduce duplicate dependencies. To de-duplicate `yarn.lock`, run:
+
+```sh
+node_modules/.bin/yarn-deduplicate --list --strategy fewer yarn.lock && yarn install
+```
+
---
> TODO: Add Dependencies
diff --git a/doc/development/new_fe_guide/development/accessibility.md b/doc/development/new_fe_guide/development/accessibility.md
index 81a29170129..ae5c4c6a6cc 100644
--- a/doc/development/new_fe_guide/development/accessibility.md
+++ b/doc/development/new_fe_guide/development/accessibility.md
@@ -1,17 +1,21 @@
# Accessibility
+
Using semantic HTML plays a key role when it comes to accessibility.
## Accessible Rich Internet Applications - ARIA
+
WAI-ARIA, the Accessible Rich Internet Applications specification, defines a way to make Web content and Web applications more accessible to people with disabilities.
> Note: It is [recommended][using-aria] to use semantic elements as the primary method to achieve accessibility rather than adding aria attributes. Adding aria attributes should be seen as a secondary method for creating accessible elements.
### Role
+
The `role` attribute describes the role the element plays in the context of the document.
Check the list of WAI-ARIA roles [here][roles]
## Icons
+
When using icons or images that aren't absolutely needed to understand the context, we should use `aria-hidden="true"`.
On the other hand, if an icon is crucial to understand the context we should do one of the following:
@@ -20,6 +24,7 @@ On the other hand, if an icon is crucial to understand the context we should do
1. Use `aria-labelledby` to point to an element that contains the explanation for that icon
## Form inputs
+
In forms we should use the `for` attribute in the label statement:
```
diff --git a/doc/development/new_fe_guide/development/performance.md b/doc/development/new_fe_guide/development/performance.md
index c54b8305991..d41239693bf 100644
--- a/doc/development/new_fe_guide/development/performance.md
+++ b/doc/development/new_fe_guide/development/performance.md
@@ -5,7 +5,7 @@
We have a performance dashboard available in one of our [grafana instances](https://dashboards.gitlab.net/d/1EBTz3Dmz/sitespeed-page-summary?orgId=1). This dashboard automatically aggregates metric data from [sitespeed.io](https://www.sitespeed.io/) every 6 hours. These changes are displayed after a set number of pages are aggregated.
These pages can be found inside a text file in the gitlab-build-images [repository](https://gitlab.com/gitlab-org/gitlab-build-images) called [gitlab.txt](https://gitlab.com/gitlab-org/gitlab-build-images/blob/master/scripts/gitlab.txt)
-Any frontend engineer can contribute to this dashboard. They can contribute by adding or removing urls of pages from this text file. Please have a [frontend monitoring expert](https://about.gitlab.com/company/team) review your changes before assigning to a maintainer of the `gitlab-build-images` project. The changes will go live on the next scheduled run after the changes are merged into `master`.
+Any frontend engineer can contribute to this dashboard by adding or removing URLs of pages from this text file. Please have a [frontend monitoring expert](https://about.gitlab.com/company/team/) review your changes before assigning them to a maintainer of the `gitlab-build-images` project. The changes will go live on the next scheduled run after they are merged into `master`.
There are 3 recommended high impact metrics to review on each page:
diff --git a/doc/development/new_fe_guide/development/testing.md b/doc/development/new_fe_guide/development/testing.md
index 2b62c2a41fe..e0d413b748b 100644
--- a/doc/development/new_fe_guide/development/testing.md
+++ b/doc/development/new_fe_guide/development/testing.md
@@ -1,361 +1,6 @@
-# Overview of Frontend Testing
+---
+redirect_to: '../../testing_guide/frontend_testing.md'
+---
-Tests relevant for frontend development can be found at the following places:
+This document was moved to [another location](../../testing_guide/frontend_testing.md).
-- `spec/javascripts/` which are run by Karma (command: `yarn karma`) and contain
- - [frontend unit tests](#frontend-unit-tests)
- - [frontend component tests](#frontend-component-tests)
- - [frontend integration tests](#frontend-integration-tests)
-- `spec/frontend/` which are run by Jest (command: `yarn jest`) and contain
- - [frontend unit tests](#frontend-unit-tests)
- - [frontend component tests](#frontend-component-tests)
- - [frontend integration tests](#frontend-integration-tests)
-- `spec/features/` which are run by RSpec and contain
- - [feature tests](#feature-tests)
-
-All tests in `spec/javascripts/` will eventually be migrated to `spec/frontend/` (see also [#52483](https://gitlab.com/gitlab-org/gitlab-ce/issues/52483)).
-
-In addition there were feature tests in `features/` run by Spinach in the past.
-These have been removed from our codebase in May 2018 ([#23036](https://gitlab.com/gitlab-org/gitlab-ce/issues/23036)).
-
-See also:
-
-- [Old testing guide](../../testing_guide/frontend_testing.html).
-- [Notes on testing Vue components](../../fe_guide/vue.html#testing-vue-components).
-
-## Frontend unit tests
-
-Unit tests are on the lowest abstraction level and typically test functionality that is not directly perceivable by a user.
-
-### When to use unit tests
-
-<details>
- <summary>exported functions and classes</summary>
- Anything that is exported can be reused at various places in a way you have no control over.
- Therefore it is necessary to document the expected behavior of the public interface with tests.
-</details>
-
-<details>
- <summary>Vuex actions</summary>
- Any Vuex action needs to work in a consistent way independent of the component it is triggered from.
-</details>
-
-<details>
- <summary>Vuex mutations</summary>
- For complex Vuex mutations it helps to identify the source of a problem by separating the tests from other parts of the Vuex store.
-</details>
-
-### When *not* to use unit tests
-
-<details>
- <summary>non-exported functions or classes</summary>
- Anything that is not exported from a module can be considered private or an implementation detail and doesn't need to be tested.
-</details>
-
-<details>
- <summary>constants</summary>
- Testing the value of a constant would mean to copy it.
- This results in extra effort without additional confidence that the value is correct.
-</details>
-
-<details>
- <summary>Vue components</summary>
- Computed properties, methods, and lifecycle hooks can be considered an implementation detail of components and don't need to be tested.
- They are implicitly covered by component tests.
- The <a href="https://vue-test-utils.vuejs.org/guides/#getting-started">official Vue guidelines</a> suggest the same.
-</details>
-
-### What to mock in unit tests
-
-<details>
- <summary>state of the class under test</summary>
- Modifying the state of the class under test directly rather than using methods of the class avoids side-effects in test setup.
-</details>
-
-<details>
- <summary>other exported classes</summary>
- Every class needs to be tested in isolation to prevent test scenarios from growing exponentially.
-</details>
-
-<details>
- <summary>single DOM elements if passed as parameters</summary>
- For tests that only operate on single DOM elements rather than a whole page, creating these elements is cheaper than loading a whole HTML fixture.
-</details>
-
-<details>
- <summary>all server requests</summary>
- When running frontend unit tests, the backend may not be reachable.
- Therefore all outgoing requests need to be mocked.
-</details>
-
-<details>
- <summary>asynchronous background operations</summary>
- Background operations cannot be stopped or waited on, so they will continue running in the following tests and cause side effects.
-</details>
-
-### What *not* to mock in unit tests
-
-<details>
- <summary>non-exported functions or classes</summary>
- Everything that is not exported can be considered private to the module and will be implicitly tested via the exported classes / functions.
-</details>
-
-<details>
- <summary>methods of the class under test</summary>
- By mocking methods of the class under test, the mocks will be tested and not the real methods.
-</details>
-
-<details>
- <summary>utility functions (pure functions, or those that only modify parameters)</summary>
- If a function has no side effects because it has no state, it is safe to not mock it in tests.
-</details>
-
-<details>
- <summary>full HTML pages</summary>
- Loading the HTML of a full page slows down tests, so it should be avoided in unit tests.
-</details>
-
-## Frontend component tests
-
-Component tests cover the state of a single component that is perceivable by a user depending on external signals such as user input, events fired from other components, or application state.
-
-### When to use component tests
-
-- Vue components
-
-### When *not* to use component tests
-
-<details>
- <summary>Vue applications</summary>
- Vue applications may contain many components.
- Testing them on a component level requires too much effort.
- Therefore they are tested on frontend integration level.
-</details>
-
-<details>
- <summary>HAML templates</summary>
- HAML templates contain only Markup and no frontend-side logic.
- Therefore they are not complete components.
-</details>
-
-### What to mock in component tests
-
-<details>
- <summary>DOM</summary>
- Operating on the real DOM is significantly slower than on the virtual DOM.
-</details>
-
-<details>
- <summary>properties and state of the component under test</summary>
- Similarly to testing classes, modifying the properties directly (rather than relying on methods of the component) avoids side-effects.
-</details>
-
-<details>
- <summary>Vuex store</summary>
- To avoid side effects and keep component tests simple, Vuex stores are replaced with mocks.
-</details>
-
-<details>
- <summary>all server requests</summary>
- Similar to unit tests, when running component tests, the backend may not be reachable.
- Therefore all outgoing requests need to be mocked.
-</details>
-
-<details>
- <summary>asynchronous background operations</summary>
- Similar to unit tests, background operations cannot be stopped or waited on, so they will continue running in the following tests and cause side effects.
-</details>
-
-<details>
- <summary>child components</summary>
- Every component is tested individually, so child components are mocked.
- See also <a href="https://vue-test-utils.vuejs.org/api/#shallowmount">shallowMount()</a>
-</details>
-
-### What *not* to mock in component tests
-
-<details>
- <summary>methods or computed properties of the component under test</summary>
- By mocking part of the component under test, the mocks will be tested and not the real component.
-</details>
-
-<details>
- <summary>functions and classes independent from Vue</summary>
- All plain JavaScript code is already covered by unit tests and needs not to be mocked in component tests.
-</details>
-
-## Frontend integration tests
-
-Integration tests cover the interaction between all components on a single page.
-Their abstraction level is comparable to how a user would interact with the UI.
-
-### When to use integration tests
-
-<details>
- <summary>page bundles (<code>index.js</code> files in <code>app/assets/javascripts/pages/</code>)</summary>
- Testing the page bundles ensures the corresponding frontend components integrate well.
-</details>
-
-<details>
- <summary>Vue applications outside of page bundles</summary>
- Testing Vue applications as a whole ensures the corresponding frontend components integrate well.
-</details>
-
-### What to mock in integration tests
-
-<details>
- <summary>HAML views (use fixtures instead)</summary>
- Rendering HAML views requires a Rails environment including a running database which we cannot rely on in frontend tests.
-</details>
-
-<details>
- <summary>all server requests</summary>
- Similar to unit and component tests, when running component tests, the backend may not be reachable.
- Therefore all outgoing requests need to be mocked.
-</details>
-
-<details>
- <summary>asynchronous background operations that are not perceivable on the page</summary>
- Background operations that affect the page need to be tested on this level.
- All other background operations cannot be stopped or waited on, so they will continue running in the following tests and cause side effects.
-</details>
-
-### What *not* to mock in integration tests
-
-<details>
- <summary>DOM</summary>
- Testing on the real DOM ensures our components work in the environment they are meant for.
- Part of this will be delegated to <a href="https://gitlab.com/gitlab-org/quality/team-tasks/issues/45">cross-browser testing</a>.
-</details>
-
-<details>
- <summary>properties or state of components</summary>
- On this level, all tests can only perform actions a user would do.
- For example to change the state of a component, a click event would be fired.
-</details>
-
-<details>
- <summary>Vuex stores</summary>
- When testing the frontend code of a page as a whole, the interaction between Vue components and Vuex stores is covered as well.
-</details>
-
-## Feature tests
-
-In contrast to [frontend integration tests](#frontend-integration-tests), feature tests make requests against the real backend instead of using fixtures.
-This also implies that database queries are executed which makes this category significantly slower.
-
-See also the [RSpec testing guidelines](../../testing_guide/best_practices.md#rspec).
-
-### When to use feature tests
-
-- use cases that require a backend and cannot be tested using fixtures
-- behavior that is not part of a page bundle but defined globally
-
-### Relevant notes
-
-A `:js` flag is added to the test to make sure the full environment is loaded.
-
-```
-scenario 'successfully', :js do
- sign_in(create(:admin))
-end
-```
-
-The steps of each test are written using capybara methods ([documentation](https://www.rubydoc.info/gems/capybara/2.15.1)).
-
-Bear in mind <abbr title="XMLHttpRequest">XHR</abbr> calls might require you to use `wait_for_requests` in between steps, like so:
-
-```rspec
-find('.form-control').native.send_keys(:enter)
-
-wait_for_requests
-
-expect(page).not_to have_selector('.card')
-```
-
-## Test helpers
-
-### Vuex Helper: `testAction`
-
-We have a helper available to make testing actions easier, as per [official documentation](https://vuex.vuejs.org/guide/testing.html):
-
-```
-testAction(
- actions.actionName, // action
- { }, // params to be passed to action
- state, // state
- [
- { type: types.MUTATION},
- { type: types.MUTATION_1, payload: {}},
- ], // mutations committed
- [
- { type: 'actionName', payload: {}},
- { type: 'actionName1', payload: {}},
- ] // actions dispatched
- done,
-);
-```
-
-Check an example in [spec/javascripts/ide/stores/actions_spec.jsspec/javascripts/ide/stores/actions_spec.js](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/spec/javascripts/ide/stores/actions_spec.js).
-
-### Vue Helper: `mountComponent`
-
-To make mounting a Vue component easier and more readable, we have a few helpers available in `spec/helpers/vue_mount_component_helper`.
-
-- `createComponentWithStore`
-- `mountComponentWithStore`
-
-Examples of usage:
-
-```
-beforeEach(() => {
- vm = createComponentWithStore(Component, store);
-
- vm.$store.state.currentBranchId = 'master';
-
- vm.$mount();
-},
-```
-
-```
-beforeEach(() => {
- vm = mountComponentWithStore(Component, {
- el: '#dummy-element',
- store,
- props: { badge },
- });
-},
-```
-
-Don't forget to clean up:
-
-```
-afterEach(() => {
- vm.$destroy();
-});
-```
-
-## Testing with older browsers
-
-Some regressions only affect a specific browser version. We can install and test in particular browsers with either Firefox or Browserstack using the following steps:
-
-### Browserstack
-
-[Browserstack](https://www.browserstack.com/) allows you to test more than 1200 mobile devices and browsers.
-You can use it directly through the [live app](https://www.browserstack.com/live) or you can install the [chrome extension](https://chrome.google.com/webstore/detail/browserstack/nkihdmlheodkdfojglpcjjmioefjahjb) for easy access.
-You can find the credentials on 1Password, under `frontendteam@gitlab.com`.
-
-### Firefox
-
-#### macOS
-
-You can download any older version of Firefox from the releases FTP server, <https://ftp.mozilla.org/pub/firefox/releases/>
-
-1. From the website, select a version, in this case `50.0.1`.
-1. Go to the mac folder.
-1. Select your preferred language, you will find the dmg package inside, download it.
-1. Drag and drop the application to any other folder but the `Applications` folder.
-1. Rename the application to something like `Firefox_Old`.
-1. Move the application to the `Applications` folder.
-1. Open up a terminal and run `/Applications/Firefox_Old.app/Contents/MacOS/firefox-bin -profilemanager` to create a new profile specific to that Firefox version.
-1. Once the profile has been created, quit the app, and run it again like normal. You now have a working older Firefox version.
diff --git a/doc/development/new_fe_guide/modules/dirty_submit.md b/doc/development/new_fe_guide/modules/dirty_submit.md
index 6c03958b463..217743ea395 100644
--- a/doc/development/new_fe_guide/modules/dirty_submit.md
+++ b/doc/development/new_fe_guide/modules/dirty_submit.md
@@ -1,7 +1,6 @@
# Dirty Submit
-> [Introduced][ce-21115] in GitLab 11.3.
-> [dirty_submit][dirty-submit]
+> [Introduced](https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/21115) in GitLab 11.3.
## Summary
@@ -9,6 +8,9 @@ Prevent submitting forms with no changes.
Currently handles `input`, `textarea` and `select` elements.
+Also, see [the code](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/app/assets/javascripts/dirty_submit/)
+within the GitLab project.
+
## Usage
```js
@@ -18,6 +20,3 @@ new DirtySubmitForm(document.querySelector('form'));
// or
new DirtySubmitForm(document.querySelectorAll('form'));
```
-
-[ce-21115]: https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/21115
-[dirty-submit]: https://gitlab.com/gitlab-org/gitlab-ce/blob/master/app/assets/javascripts/dirty_submit/ \ No newline at end of file
diff --git a/doc/development/new_fe_guide/style/index.md b/doc/development/new_fe_guide/style/index.md
index 335d9e66240..f073dc56f1f 100644
--- a/doc/development/new_fe_guide/style/index.md
+++ b/doc/development/new_fe_guide/style/index.md
@@ -8,7 +8,7 @@
## [Vue style guide](vue.md)
-# Tooling
+## Tooling
## [Prettier](prettier.md)
diff --git a/doc/development/new_fe_guide/style/javascript.md b/doc/development/new_fe_guide/style/javascript.md
index 3019eaa089c..b742d567f41 100644
--- a/doc/development/new_fe_guide/style/javascript.md
+++ b/doc/development/new_fe_guide/style/javascript.md
@@ -71,7 +71,6 @@ class myClass {
}
const instance = new myClass();
instance.makeRequest();
-
```
## Avoid classes to handle DOM events
@@ -189,8 +188,8 @@ disabled due to legacy compatibility reasons but they are in the process of bein
Do not disable specific ESLint rules. Due to technical debt, you may disable the following
rules only if you are invoking/instantiating existing code modules.
- - [no-new](http://eslint.org/docs/rules/no-new)
- - [class-method-use-this](http://eslint.org/docs/rules/class-methods-use-this)
+- [no-new](http://eslint.org/docs/rules/no-new)
+- [class-methods-use-this](http://eslint.org/docs/rules/class-methods-use-this)
> Note: Disable these rules on a per line basis. This makes it easier to refactor
- in the future. E.g. use `eslint-disable-next-line` or `eslint-disable-line`.
+> in the future. E.g. use `eslint-disable-next-line` or `eslint-disable-line`.
diff --git a/doc/development/performance.md b/doc/development/performance.md
index 8b569a677b6..14b3f8204d2 100644
--- a/doc/development/performance.md
+++ b/doc/development/performance.md
@@ -267,7 +267,7 @@ piece of code is worth optimizing. The only two things you can do are:
1. Think about what the code does, how it's used, how many times it's called and
how much time is spent in it relative to the total execution time (e.g., the
total time spent in a web request).
-2. Ask others (preferably in the form of an issue).
+1. Ask others (preferably in the form of an issue).
Some examples of changes that aren't really important/worth the effort:
@@ -285,10 +285,10 @@ directly in a web request as much as possible. This has numerous benefits such
as:
1. An error won't prevent the request from completing.
-2. The process being slow won't affect the loading time of a page.
-3. In case of a failure it's easy to re-try the process (Sidekiq takes care of
+1. The process being slow won't affect the loading time of a page.
+1. In case of a failure it's easy to re-try the process (Sidekiq takes care of
this automatically).
-4. By isolating the code from a web request it will hopefully be easier to test
+1. By isolating the code from a web request it will hopefully be easier to test
and maintain.
It's especially important to use Sidekiq as much as possible when dealing with
diff --git a/doc/development/policies.md b/doc/development/policies.md
index c4ac42bb40a..833b0acb13e 100644
--- a/doc/development/policies.md
+++ b/doc/development/policies.md
@@ -89,8 +89,8 @@ Each line represents a rule that was evaluated. There are a few things to note:
1. The `-` or `+` symbol indicates whether the rule block was evaluated to be
`false` or `true`, respectively.
-2. The number inside the brackets indicates the score.
-3. The last part of the line (e.g. `@john : Issue/1`) shows the username
+1. The number inside the brackets indicates the score.
+1. The last part of the line (e.g. `@john : Issue/1`) shows the username
and subject for that rule.
Here you can see that the first four rules were evaluated `false` for
diff --git a/doc/development/query_recorder.md b/doc/development/query_recorder.md
index a6b60149ea4..3787e2ef187 100644
--- a/doc/development/query_recorder.md
+++ b/doc/development/query_recorder.md
@@ -36,6 +36,13 @@ it "avoids N+1 database queries" do
end
```
+## Use request specs instead of controller specs
+
+Use a [request spec](https://gitlab.com/gitlab-org/gitlab-ce/tree/master/spec/requests) when writing an N+1 test at the controller level.
+
+Controller specs should not be used to write N+1 tests as the controller is only initialized once per example.
+This could lead to false successes where subsequent "requests" could have queries reduced (e.g. because of memoization).
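
For example, a sketch of a request spec using the `ActiveRecord::QueryRecorder` and `exceed_query_limit` pattern from the example above (the `/projects` path and the `:project` factory are placeholders):

```ruby
describe 'GET /projects' do
  it 'avoids N+1 database queries' do
    # Each `get` goes through the full request stack, so controller-level
    # memoization cannot hide additional queries.
    control_count = ActiveRecord::QueryRecorder.new { get '/projects' }.count

    create_list(:project, 5)

    expect { get '/projects' }.not_to exceed_query_limit(control_count)
  end
end
```
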
+
## Finding the source of the query
It may be useful to identify the source of the queries by looking at the call backtrace.
diff --git a/doc/development/rake_tasks.md b/doc/development/rake_tasks.md
index c97e179910b..fc96820c555 100644
--- a/doc/development/rake_tasks.md
+++ b/doc/development/rake_tasks.md
@@ -9,7 +9,7 @@ bundle exec rake setup
```
The `setup` task is an alias for `gitlab:setup`.
-This tasks calls `db:reset` to create the database, calls `add_limits_mysql` that adds limits to the database schema in case of a MySQL database and finally it calls `db:seed_fu` to seed the database.
+This task calls `db:reset` to create the database, and calls `db:seed_fu` to seed the database.
Note: `db:setup` calls `db:seed` but this does nothing.
### Seeding issues for all or a given project
@@ -216,3 +216,4 @@ bundle exec rake routes
Since these take some time to create, it's often helpful to save the output to
a file for quick reference.
+
diff --git a/doc/development/reusing_abstractions.md b/doc/development/reusing_abstractions.md
index 59da02ed6fd..fce144f8dc2 100644
--- a/doc/development/reusing_abstractions.md
+++ b/doc/development/reusing_abstractions.md
@@ -14,13 +14,13 @@ on maintainability, the ability to easily debug problems, or even performance.
An example would be to use `ProjectsFinder` in `IssuesFinder` to limit issues to
those belonging to a set of projects. While initially this may seem like a good
idea, both classes provide a very high level interface with very little control.
-This means that `IssuesFinder` may not be able to produce a better optimised
+This means that `IssuesFinder` may not be able to produce a better optimized
database query, as a large portion of the query is controlled by the internals
of `ProjectsFinder`.
To work around this problem, you would use the same code used by
`ProjectsFinder`, instead of using `ProjectsFinder` itself directly. This allows
-you to compose your behaviour better, giving you more control over the behaviour
+you to compose your behavior better, giving you more control over the behavior
of the code.
To illustrate, consider the following code from `IssuableFinder#projects`:
@@ -52,7 +52,7 @@ functionality is added to this (high level) interface. Instead of _only_
affecting the cases where this is necessary, it may also start affecting
`IssuableFinder` in a negative way. For example, the query produced by
`GroupProjectsFinder` may include unnecessary conditions. Since we're using a
-finder here, we can't easily opt-out of that behaviour. We could add options to
+finder here, we can't easily opt-out of that behavior. We could add options to
do so, but then we'd need as many options as we have features. Every option adds
two code paths, which means that for four features we have to cover 8 different
code paths.
@@ -213,6 +213,5 @@ The API provided by Active Record itself, such as the `where` method, `save`,
Everything in `app/workers`.
-The scheduling of Sidekiq jobs using `SomeWorker.perform_async`, `perform_in`,
-etc. Directly invoking a worker using `SomeWorker.new.perform` should be avoided
-at all times in application code, though this is fine to use in tests.
+Use `SomeWorker.perform_async` or `SomeWorker.perform_in` to schedule Sidekiq
+jobs. Never directly invoke a worker using `SomeWorker.new.perform`.
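
For illustration (`SomeWorker` and the argument are placeholders):

```ruby
# Good: enqueue the work through Sidekiq.
SomeWorker.perform_async(project.id)
SomeWorker.perform_in(5.minutes, project.id)

# Bad: runs the job synchronously and bypasses Sidekiq (acceptable only in tests).
SomeWorker.new.perform(project.id)
```
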
diff --git a/doc/development/sha1_as_binary.md b/doc/development/sha1_as_binary.md
index 3151cc29bbc..6c4252ec634 100644
--- a/doc/development/sha1_as_binary.md
+++ b/doc/development/sha1_as_binary.md
@@ -2,7 +2,7 @@
Storing SHA1 hashes as strings is not very space efficient. A SHA1 as a string
requires at least 40 bytes, an additional byte to store the encoding, and
-perhaps more space depending on the internals of PostgreSQL and MySQL.
+perhaps more space depending on the internals of PostgreSQL.
On the other hand, if one were to store a SHA1 as binary one would only need 20
bytes for the actual SHA1, and 1 or 4 bytes of additional space (again depending
diff --git a/doc/development/shell_scripting_guide/index.md b/doc/development/shell_scripting_guide/index.md
new file mode 100644
index 00000000000..0809f8b1a0a
--- /dev/null
+++ b/doc/development/shell_scripting_guide/index.md
@@ -0,0 +1,118 @@
+# Shell scripting standards and style guidelines
+
+## Overview
+
+GitLab consists of various services and sub-projects. The majority of
+their backend code is written in [Ruby](https://www.ruby-lang.org) and
+[Go](https://golang.org). However, some of them use shell scripts to
+automate routine system administration tasks like deployment and
+installation. This is done either for historical reasons or to minimize
+dependencies, for instance in Docker images.
+
+This page aims to define and organize our shell scripting guidelines,
+based on our various experiences. All shell scripts across the GitLab project
+should eventually be harmonized with this guide. Any per-project
+deviations from this guide should be described in the
+`README.md` or `PROCESS.md` file of that project.
+
+### Avoid using shell scripts
+
+CAUTION: **Caution:**
+This is a must-read section.
+
+Having said all of the above, we recommend staying away from shell scripts
+as much as possible. A language like Ruby or Python (if required for
+consistency with codebases that we leverage) is almost always a better choice.
+High-level interpreted languages have more readable syntax and offer much more
+mature capabilities for unit testing, linting, and error reporting.
+
+Use shell scripts only if there's a strong restriction on the project's
+dependency size or some other requirement that is more important
+in a particular case.
+
+## Scope of this guide
+
+According to the [GitLab installation requirements](../../install/requirements.md),
+this guide covers only those shells that are used by
+[supported Linux distributions](../../install/requirements.md#supported-linux-distributions),
+that is:
+
+- [POSIX Shell](https://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3_chap02.html)
+- [Bash](https://www.gnu.org/software/bash/)
+
+## Shell language choice
+
+- When you need to reduce the dependencies list, use what's provided by the environment. For example, for Docker images it's `sh` from `alpine`, which is the base image for most of our tool images.
+- Everywhere else, use `bash` if possible. It's more powerful than `sh` but still a widespread shell (see the example below).
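
For illustration, a minimal Bash script skeleton might look like this (the `set -e` line is a common hardening step, not a requirement of this guide):

```bash
#!/usr/bin/env bash
# Default choice outside of minimal Docker images.
set -e

echo "Running with Bash ${BASH_VERSION}"
```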
+
+## Code style and format
+
+This section describes the tools that should be made a mandatory part of
+a project's CI pipeline if it contains shell scripts. These tools
+automate shell code formatting, checking for errors or vulnerabilities, etc.
+
+### Linting
+
+We're using the [ShellCheck](https://www.shellcheck.net/) utility in its default configuration to lint our
+shell scripts.
+
+All projects with shell scripts should use this GitLab CI/CD job:
+
+```yaml
+shell check:
+ image: koalaman/shellcheck-alpine
+ stage: test
+ before_script:
+ - shellcheck --version
+ script:
+ - shellcheck scripts/**/*.sh # path to your shell scripts
+```
+
+TIP: **Tip:**
+By default, ShellCheck uses [shell detection](https://github.com/koalaman/shellcheck/wiki/SC2148#rationale)
+to determine the shell dialect in use. If the shell file is out of your control and ShellCheck cannot
+detect the dialect, use the `-s` flag to specify it: `-s sh` or `-s bash`.
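
For example (the script name is illustrative of a file without a `.sh` extension):

```bash
shellcheck -s bash scripts/release-helper
```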
+
+### Formatting
+
+It's recommended to use the [shfmt](https://github.com/mvdan/sh#shfmt) tool to maintain consistent formatting.
+We format shell scripts according to the [Google Shell Style Guide](https://google.github.io/styleguide/shell.xml),
+so the following `shfmt` invocation should be applied to the project's script files:
+
+```bash
+shfmt -i 2 -ci scripts/**/*.sh
+```
+
+TIP: **Tip:**
+By default, shfmt uses [shell detection](https://github.com/mvdan/sh#shfmt) similar to that of ShellCheck
+and ignores files starting with a period. To override this, use the `-ln` flag to specify the shell dialect:
+`-ln posix` or `-ln bash`.
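
For example, adding the flag to the invocation above to force the POSIX dialect:

```bash
shfmt -ln posix -i 2 -ci scripts/**/*.sh
```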
+
+NOTE: **Note:**
+Currently, the `shfmt` tool [is not shipped](https://github.com/mvdan/sh/issues/68) as a Docker image containing
+a Linux shell. This makes it impossible to use the [official Docker image](https://hub.docker.com/r/mvdan/shfmt)
+in GitLab Runner. This [may change](https://github.com/mvdan/sh/issues/68#issuecomment-507721371) in the future.
+
+## Testing
+
+NOTE: **Note:**
+This is a work in progress.
+
+It is an [ongoing effort](https://gitlab.com/gitlab-org/gitlab-ce/issues/64016) to evaluate different tools for the
+automated testing of shell scripts (like [BATS](https://github.com/sstephenson/bats)).
+
+## Code Review
+
+The code review should be performed according to:
+
+- [ShellCheck Checks list](https://github.com/koalaman/shellcheck/wiki/Checks)
+- [Google Shell Style Guide](https://google.github.io/styleguide/shell.xml)
+- [Shfmt formatting caveats](https://github.com/mvdan/sh#caveats)
+
+However, the recommended course of action is to use the aforementioned
+tools and address reported offenses. This should eliminate the need
+for code review.
+
+---
+
+[Return to Development documentation](../README.md).
diff --git a/doc/development/sql.md b/doc/development/sql.md
index a256fd46c09..2584dcfb4ca 100644
--- a/doc/development/sql.md
+++ b/doc/development/sql.md
@@ -15,14 +15,11 @@ FROM issues
WHERE title LIKE 'WIP:%';
```
-On PostgreSQL the `LIKE` statement is case-sensitive. On MySQL this depends on
-the case-sensitivity of the collation, which is usually case-insensitive. To
-perform a case-insensitive `LIKE` on PostgreSQL you have to use `ILIKE` instead.
-This statement in turn isn't supported on MySQL.
+On PostgreSQL the `LIKE` statement is case-sensitive. To perform a case-insensitive
+`LIKE` you have to use `ILIKE` instead.
-To work around this problem you should write `LIKE` queries using Arel instead
-of raw SQL fragments as Arel automatically uses `ILIKE` on PostgreSQL and `LIKE`
-on MySQL. This means that instead of this:
+To handle this automatically you should write `LIKE` queries using Arel instead
+of raw SQL fragments, as Arel automatically uses `ILIKE` on PostgreSQL.
```ruby
Issue.where('title LIKE ?', 'WIP:%')
@@ -45,7 +42,7 @@ table = Issue.arel_table
Issue.where(table[:title].matches('WIP:%').or(table[:foo].matches('WIP:%')))
```
-For PostgreSQL this produces:
+On PostgreSQL, this produces:
```sql
SELECT *
@@ -53,18 +50,10 @@ FROM issues
WHERE (title ILIKE 'WIP:%' OR foo ILIKE 'WIP:%')
```
-In turn for MySQL this produces:
-
-```sql
-SELECT *
-FROM issues
-WHERE (title LIKE 'WIP:%' OR foo LIKE 'WIP:%')
-```
-
## LIKE & Indexes
-Neither PostgreSQL nor MySQL use any indexes when using `LIKE` / `ILIKE` with a
-wildcard at the start. For example, this will not use any indexes:
+PostgreSQL won't use any indexes when using `LIKE` / `ILIKE` with a wildcard at
+the start. For example, this will not use any indexes:
```sql
SELECT *
@@ -75,9 +64,8 @@ WHERE title ILIKE '%WIP:%';
Because the value for `ILIKE` starts with a wildcard the database is not able to
use an index as it doesn't know where to start scanning the indexes.
-MySQL provides no known solution to this problem. Luckily PostgreSQL _does_
-provide a solution: trigram GIN indexes. These indexes can be created as
-follows:
+Luckily, PostgreSQL _does_ provide a solution: trigram GIN indexes. These
+indexes can be created as follows:
```sql
CREATE INDEX [CONCURRENTLY] index_name_here
diff --git a/doc/development/testing_guide/best_practices.md b/doc/development/testing_guide/best_practices.md
index 448d9fd01c4..f30a83a4c71 100644
--- a/doc/development/testing_guide/best_practices.md
+++ b/doc/development/testing_guide/best_practices.md
@@ -15,16 +15,6 @@ manifest themselves within our code. When designing our tests, take time to revi
our test design. We can find some helpful heuristics documented in the Handbook in the
[Test Design](https://about.gitlab.com/handbook/engineering/quality/guidelines/test-engineering/test-design/) section.
-## Run tests against MySQL
-
-By default, tests are only run against PostgreSQL, but you can run them on
-demand against MySQL by following one of the following conventions:
-
-| Convention | Valid example |
-|:----------------------|:-----------------------------|
-| Include `mysql` in your branch name | `enhance-mysql-support` |
-| Include `[run mysql]` in your commit message | `Fix MySQL support<br><br>[run mysql]` |
-
## Test speed
GitLab has a massive test suite that, without [parallelization], can take hours
@@ -70,6 +60,7 @@ bundle exec rspec spec/[path]/[to]/[spec].rb
- On `before` and `after` hooks, prefer it scoped to `:context` over `:all`
- When using `evaluate_script("$('.js-foo').testSomething()")` (or `execute_script`) which acts on a given element,
  use a Capybara matcher beforehand (e.g. `find('.js-foo')`) to ensure the element actually exists.
+- Use `focus: true` to isolate parts of the specs you want to run (see the example below).
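
For example, tagging a single example so that only it runs while the `:focus` filter is active (the example body is illustrative):

```ruby
it 'adds the numbers', focus: true do
  expect(1 + 1).to eq(2)
end
```
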
[four-phase-test]: https://robots.thoughtbot.com/four-phase-test
@@ -454,6 +445,19 @@ complexity of RSpec expectations.They should be placed under
a certain type of specs only (e.g. features, requests etc.) but shouldn't be if
they apply to multiple type of specs.
+#### `be_like_time`
+
+Time returned from a database can differ in precision from time objects
+in Ruby, so we need flexible tolerances when comparing in specs. We can
+use `be_like_time` to check that times are within one second of each
+other.
+
+Example:
+
+```ruby
+expect(metrics.merged_at).to be_like_time(time)
+```
+
#### `have_gitlab_http_status`
Prefer `have_gitlab_http_status` over `have_http_status` because the former
diff --git a/doc/development/testing_guide/ci.md b/doc/development/testing_guide/ci.md
index 87d48726268..d9f66a827de 100644
--- a/doc/development/testing_guide/ci.md
+++ b/doc/development/testing_guide/ci.md
@@ -39,7 +39,6 @@ slowest test files and try to improve them.
## CI setup
-- On CE and EE, the test suite runs both PostgreSQL and MySQL.
- Rails logging to `log/test.log` is disabled by default in CI [for
performance reasons][logging]. To override this setting, provide the
`RAILS_ENABLE_TEST_LOG` environment variable.
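
  For example, to run a spec locally with test logging enabled (the spec path is illustrative; any non-empty value is assumed to enable the log):

  ```sh
  RAILS_ENABLE_TEST_LOG=1 bundle exec rspec spec/models/user_spec.rb
  ```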
diff --git a/doc/development/testing_guide/end_to_end/index.md b/doc/development/testing_guide/end_to_end/index.md
index 2dc06ba10a5..d6b944a3e74 100644
--- a/doc/development/testing_guide/end_to_end/index.md
+++ b/doc/development/testing_guide/end_to_end/index.md
@@ -65,28 +65,23 @@ Below you can read more about how to use it and how does it work.
Currently, we are using _multi-project pipeline_-like approach to run QA
pipelines.
-![QA on merge requests CI/CD architecture](../img/qa_on_merge_requests_cicd_architecture.png)
-
-<details>
-<summary>Show mermaid source</summary>
-<pre>
+```mermaid
graph LR
A1 -.->|1. Triggers an omnibus-gitlab pipeline and wait for it to be done| A2
- B2[<b>`Trigger-qa` stage</b><br />`Trigger:qa-test` job] -.->|2. Triggers a gitlab-qa pipeline and wait for it to be done| A3
+ B2[`Trigger-qa` stage<br>`Trigger:qa-test` job] -.->|2. Triggers a gitlab-qa pipeline and wait for it to be done| A3
-subgraph gitlab-ce/ee pipeline
- A1[<b>`test` stage</b><br />`package-and-qa` job]
+subgraph "gitlab-ce/ee pipeline"
+ A1[`test` stage<br>`package-and-qa` job]
end
-subgraph omnibus-gitlab pipeline
- A2[<b>`Trigger-docker` stage</b><br />`Trigger:gitlab-docker` job] -->|once done| B2
+subgraph "omnibus-gitlab pipeline"
+ A2[`Trigger-docker` stage<br>`Trigger:gitlab-docker` job] -->|once done| B2
end
-subgraph gitlab-qa pipeline
- A3>QA jobs run] -.->|3. Reports back the pipeline result to the `package-and-qa` job<br />and post the result on the original commit tested| A1
+subgraph "gitlab-qa pipeline"
+ A3>QA jobs run] -.->|3. Reports back the pipeline result to the `package-and-qa` job<br>and post the result on the original commit tested| A1
end
-</pre>
-</details>
+```
1. Developer triggers a manual action, that can be found in CE / EE merge
requests. This starts a chain of pipelines in multiple projects.
diff --git a/doc/development/testing_guide/end_to_end/page_objects.md b/doc/development/testing_guide/end_to_end/page_objects.md
index 29ad49403fe..47e58a425fd 100644
--- a/doc/development/testing_guide/end_to_end/page_objects.md
+++ b/doc/development/testing_guide/end_to_end/page_objects.md
@@ -27,7 +27,7 @@ When someone later changes `t.text_field :login` in the view associated with
this page to `t.text_field :username` it will generate a different field
identifier, which would effectively break all tests.
-Because we are using `Page::Main::Login.act { sign_in_using_credentials }`
+Because we are using `Page::Main::Login.perform(&:sign_in_using_credentials)`
everywhere, when we want to sign into GitLab, the page object is the single
source of truth, and we will need to update `fill_in :user_login`
to `fill_in :user_username` in only one place.
@@ -105,7 +105,7 @@ code but **this is deprecated** in favor of the above method for two reasons:
view 'app/views/my/view.html.haml' do
### Good ###
-
+
# Implicitly require the CSS selector `[data-qa-selector="logout_button"]` to be present in the view
element :logout_button
@@ -152,10 +152,9 @@ Things to note:
- The name of the element and the qa_selector must match and be snake_cased
- If the element appears on the page unconditionally, add `required: true` to the element. See
[Dynamic element validation](dynamic_element_validation.md)
-- You may see `.qa-selector` classes in existing Page Objects. We should prefer the [`data-qa-selector`](#data-qa-selector-vs-qa-selector)
+- You may see `.qa-selector` classes in existing Page Objects. We should prefer the [`data-qa-selector`](#data-qa-selector-vs-qa-selector)
method of definition over the `.qa-selector` CSS class
-
### `data-qa-selector` vs `.qa-selector`
> Introduced in GitLab 12.1
diff --git a/doc/development/testing_guide/end_to_end/quick_start_guide.md b/doc/development/testing_guide/end_to_end/quick_start_guide.md
index 3bbf8feab39..e1df8be8b6f 100644
--- a/doc/development/testing_guide/end_to_end/quick_start_guide.md
+++ b/doc/development/testing_guide/end_to_end/quick_start_guide.md
@@ -110,15 +110,15 @@ end
```
> Notice that the test itself is simple. The most challenging part is the creation of the application state, which will be covered later.
-
+>
> The exemplified test case's MVC is not enough for the change to be merged, but it helps to build up the test logic. The reason is that we do not want to use locators directly in the tests, and tests **must** use [Page Objects] before they can be merged. This way we better separate the responsibilities, where the Page Objects encapsulate elements and methods that allow us to interact with pages, while the spec files describe the test cases in more business-related language.
Below are the steps that the test covers:
1. The test finds the 'Edit' link for the labels and clicks on it.
-2. Then it fills in the 'Assign labels' input field with the value 'animal::dolphin' and press enters.
-3. Then it clicks in the content body to apply the label and refreshes the page.
-4. Finally, the expectations check that the previous scoped label was removed and that the new one was added.
+1. Then it fills in the 'Assign labels' input field with the value 'animal::dolphin' and presses enter.
+1. Then it clicks in the content body to apply the label and refreshes the page.
+1. Finally, the expectations check that the previous scoped label was removed and that the new one was added.
Let's now see how the second test case would look.
@@ -144,9 +144,9 @@ end
Below are the steps that the test covers:
1. The test finds the 'Edit' link for the labels and clicks on it.
-2. Then it fills in the 'Assign labels' input field with the value 'plant::orchid' and press enters.
-3. Then it clicks in the content body to apply the label and refreshes the page.
-4. Finally, the expectations check that both scoped labels are present.
+1. Then it fills in the 'Assign labels' input field with the value 'plant::orchid' and presses enter.
+1. Then it clicks in the content body to apply the label and refreshes the page.
+1. Finally, the expectations check that both scoped labels are present.
> Similar to the previous test, this one is also very straightforward, but there is some code duplication. Let's address it.
@@ -211,7 +211,7 @@ A pre-condition for the entire test suite is defined in the `before :context` bl
> For our test suite, due to the need for the tests to be completely independent of each other, we won't use the `before :context` block. The `before :context` block would make the tests dependent on each other because the first test changes the label of the issue, and the second one depends on the `'animal::fox'` label being set.
-> **Tip:** In case of a test suite with only one `it` block it's ok to use only the `before` block (see below) with all the test's pre-conditions.
+TIP: **Tip:** In case of a test suite with only one `it` block it's ok to use only the `before` block (see below) with all the test's pre-conditions.
#### `before`
@@ -274,11 +274,11 @@ end
In the `before` block we create all the application state needed for the tests to run. We do that by using the `Runtime::Browser.visit` method to go to the login page, by performing a `sign_in_using_credentials` from the `Login` Page Object, by fabricating resources via APIs (`issue` and `Resource::Label`), and by using `issue.visit!` to visit the issue page.
> A project is created in the background by creating the `issue` resource.
-
+>
> When creating the [Resources], notice that when calling the `fabricate_via_api` method, we pass some attribute:values, like `title`, and `labels` for the `issue` resource; and `project` and `title` for the `label` resource.
-
+>
> What's important to understand here is that by creating the application state mostly using the public APIs we save a lot of time in the test suite setup stage.
-
+>
> Soon we will cover the use of the already existing resources' methods and the creation of your own `fabricate_via_api` methods for resources where this is still not available, but first, let's optimize our implementation.
### 6. Optimization
@@ -290,7 +290,7 @@ As already mentioned in the [best practices](best_practices.md) document, end-to
Some improvements that we could make in our test suite to optimize its time to run are:
1. Having a single test case (an `it` block) that exercises both scenarios to avoid "wasting" time in the tests' pre-conditions, instead of having two different test cases.
-2. Making the selection of labels more performant by allowing for the selection of more than one label in the same reusable method.
+1. Making the selection of labels more performant by allowing for the selection of more than one label in the same reusable method.
Let's look at a suggestion that addresses the above points, one by one:
@@ -332,8 +332,8 @@ To address point 1, we changed the test implementation from two `it` blocks into
> Notice that the implementation of the new and unique `it` block had to change a little bit. Below we describe in detail what it does.
1. It selects two scoped labels simultaneously, one from the same scope as the one already applied to the issue during the setup phase (in the `before` block), and another one from a different scope.
-2. It asserts that the correct labels are visible in the `labels_block`, and that the labels were correctly added and removed;
-3. Finally, the `select_label_and_refresh` method is changed to `select_labels_and_refresh`, which accepts an array of labels instead of a single label, and it iterates on them for faster label selection (this is what is used in step 1 explained above.)
+1. It asserts that the correct labels are visible in the `labels_block`, and that the labels were correctly added and removed;
+1. Finally, the `select_label_and_refresh` method is changed to `select_labels_and_refresh`, which accepts an array of labels instead of a single label, and it iterates over them for faster label selection (this is what is used in step 1 above).
### 7. Resources
@@ -362,7 +362,7 @@ First, in the [issue resource](https://gitlab.com/gitlab-org/gitlab-ee/blob/d358
Add the following `attribute :id` and `attribute :labels` right above the [`attribute :title`](https://gitlab.com/gitlab-org/gitlab-ee/blob/d3584e80b4236acdf393d815d604801573af72cc/qa/qa/resource/issue.rb#L15).
> This line is needed to allow for the issue fabrication, and for labels to be automatically added to the issue when fabricating it via API.
-
+>
> We add the attributes above the existing attribute to keep them alphabetically organized.
Then, let's initialize an instance variable for labels to allow an empty array as the default value when such information is not passed during the resource fabrication, since this is optional. [Between the attributes and the `fabricate!` method](https://gitlab.com/gitlab-org/gitlab-ee/blob/1a1f1408728f19b2aa15887cd20bddab7e70c8bd/qa/qa/resource/issue.rb#L18), add the following:
@@ -437,7 +437,7 @@ By defining the `resource_web_url(resource)` method, we override the one from th
By defining the `api_get_path` method, we **would** allow for the [`ApiFabricator`](https://gitlab.com/gitlab-org/gitlab-ee/blob/master/qa/qa/resource/api_fabricator.rb) module to know which path to use to get a single label, but since there's no path available for that in the public API, we raise a `NotImplementedError` instead.
-By defining the `api_post_path` method, we allow for the [`ApiFabricator `](https://gitlab.com/gitlab-org/gitlab-ee/blob/master/qa/qa/resource/api_fabricator.rb) module to know which path to use to create a new label in a specific project.
+By defining the `api_post_path` method, we allow for the [`ApiFabricator`](https://gitlab.com/gitlab-org/gitlab-ee/blob/master/qa/qa/resource/api_fabricator.rb) module to know which path to use to create a new label in a specific project.
By defining the `api_post_body` method, we allow for the [`ApiFabricator.api_post`](https://gitlab.com/gitlab-org/gitlab-ee/blob/a9177ca1812bac57e2b2fa4560e1d5dd8ffac38b/qa/qa/resource/api_fabricator.rb#L68) method to know which data to send when making the `POST` request.
@@ -542,9 +542,9 @@ end
Notice that we have not only moved the `select_labels_and_refresh` method, but we have also changed its implementation to:
1. Click the `:edit_link_labels` element previously defined, instead of using `find('.block.labels .edit-link').click`
-2. Use `within_element(:dropdown_menu_labels, text: label)`, and inside of it, we call `send_keys_to_element(:dropdown_input_field, [label, :enter])`, which is a method that we will implement in the `QA::Page::Base` class to replace `find('.dropdown-menu-labels .dropdown-input-field').send_keys [label, :enter]`
-3. Use `click_body` after iterating on each label, instead of using `find('#content-body').click`
-4. Iterate on every label again, and then we use `has_element?(:labels_block, text: label)` after clicking the page body (which applies the labels), and before refreshing the page, to avoid test flakiness due to refreshing too fast.
+1. Use `within_element(:dropdown_menu_labels, text: label)`, and inside of it, we call `send_keys_to_element(:dropdown_input_field, [label, :enter])`, which is a method that we will implement in the `QA::Page::Base` class to replace `find('.dropdown-menu-labels .dropdown-input-field').send_keys [label, :enter]`
+1. Use `click_body` after iterating on each label, instead of using `find('#content-body').click`
+1. Iterate on every label again, and then we use `has_element?(:labels_block, text: label)` after clicking the page body (which applies the labels), and before refreshing the page, to avoid test flakiness due to refreshing too fast.
##### Details of `text_of_labels_block`
@@ -580,7 +580,7 @@ filter_output = search_field_tag search_id, nil, class: "dropdown-input-field",
> `data-qa-*` data attributes and CSS classes starting with `qa-` are used solely for the purpose of QA and testing.
> By defining these, we add **testability** to the application.
-
+>
> When defining a data attribute like: `qa_selector: 'labels_block'`, it should match the element definition: `element :labels_block`. We use a [sanity test](https://gitlab.com/gitlab-org/gitlab-ce/tree/master/qa/qa/page#how-did-we-solve-fragile-tests-problem) to check that defined elements have their respective selectors in the specified views.
#### Updates in the `QA::Page::Base` class
@@ -599,8 +599,6 @@ This method receives an element (`name`) and the `keys` that it will send to tha
As you might remember, in the Issue Page Object we call this method like this: `send_keys_to_element(:dropdown_input_field, [label, :enter])`.
-___
-
With that, you should be able to start writing end-to-end tests yourself. *Congratulations!*
[Page Objects]: page_objects.md
diff --git a/doc/development/testing_guide/end_to_end/style_guide.md b/doc/development/testing_guide/end_to_end/style_guide.md
index 6a888142575..97560e616a1 100644
--- a/doc/development/testing_guide/end_to_end/style_guide.md
+++ b/doc/development/testing_guide/end_to_end/style_guide.md
@@ -45,7 +45,7 @@ Notice that in the above example, before clicking the `:operations_environments_
> We can create these methods as helpers to abstract multi-step navigation.
-### Element naming convention
+## Element naming convention
When adding new elements to a page, it's important that we have a uniform element naming convention.
@@ -67,7 +67,7 @@ We follow a simple formula roughly based on hungarian notation.
*Note: This list is a work in progress; it will eventually be the complete enumeration of all available types.
That is, any element that does not end with something in this list is bad form.*
-#### Examples
+### Examples
**Good**
@@ -98,3 +98,47 @@ view '...' do
element :ssh_clone_url
end
```
+
+## Block argument naming
+
+To have a standard for how we name the block argument when using the `.perform` method, we use the name of the page object being called, all lowercased and separated by underscores if needed (see the good and bad examples below). This also applies to resources. We chose not to simply use `page` because that would shadow the Capybara DSL, potentially leading to confusion and bugs.
+
+### Examples
+
+**Good**
+
+```ruby
+# qa/specs/features/browser_ui/1_manage/project/add_project_member_spec.rb
+
+Page::Project::Settings::Members.perform do |members|
+ members.do_something
+end
+```
+
+```ruby
+# qa/specs/features/ee/browser_ui/3_create/merge_request/add_batch_comments_in_merge_request_spec.rb
+
+Resource::MergeRequest.fabricate! do |merge_request|
+ merge_request.do_something_else
+end
+```
+
+**Bad**
+
+```ruby
+# qa/specs/features/browser_ui/1_manage/project/add_project_member_spec.rb
+
+Page::Project::Settings::Members.perform do |project_settings_members_page|
+ project_settings_members_page.do_something
+end
+```
+
+```ruby
+# qa/specs/features/ee/browser_ui/3_create/merge_request/add_batch_comments_in_merge_request_spec.rb
+
+Resource::MergeRequest.fabricate! do |merge_request_page|
+ merge_request_page.do_something_else
+end
+```
+
+> Besides the advantage of having a standard in place, following it also results in shorter lines of code.
\ No newline at end of file
diff --git a/doc/development/testing_guide/flaky_tests.md b/doc/development/testing_guide/flaky_tests.md
index 931cbc51cae..eb0bf6fc563 100644
--- a/doc/development/testing_guide/flaky_tests.md
+++ b/doc/development/testing_guide/flaky_tests.md
@@ -35,8 +35,8 @@ Once a test is in quarantine, there are 3 choices:
Quarantined tests are run on the CI in dedicated jobs that are allowed to fail:
-- `rspec-pg-quarantine` and `rspec-mysql-quarantine` (CE & EE)
-- `rspec-pg-quarantine-ee` and `rspec-mysql-quarantine-ee` (EE only)
+- `rspec-pg-quarantine` (CE & EE)
+- `rspec-pg-quarantine-ee` (EE only)
## Automatic retries and flaky tests detection
diff --git a/doc/development/testing_guide/frontend_testing.md b/doc/development/testing_guide/frontend_testing.md
index c909745b1ab..7dc89a3fcdb 100644
--- a/doc/development/testing_guide/frontend_testing.md
+++ b/doc/development/testing_guide/frontend_testing.md
@@ -80,18 +80,20 @@ describe('Component', () => {
Remember that the performance of each test depends on the environment.
### Manual module mocks
+
Jest supports [manual module mocks](https://jestjs.io/docs/en/manual-mocks) by placing a mock in a `__mocks__/` directory next to the source module. **Don't do this.** We want to keep all of our test-related code in one place (the `spec/` folder), and the logic that Jest uses to apply mocks from `__mocks__/` is rather inconsistent.
Instead, our test runner detects manual mocks from `spec/frontend/mocks/`. Any mock placed here is automatically picked up and injected whenever you import its source module.
- Files in `spec/frontend/mocks/ce` will mock the corresponding CE module from `app/assets/javascripts`, mirroring the source module's path.
- - Example: `spec/frontend/mocks/ce/lib/utils/axios_utils` will mock the module `~/lib/utils/axios_utils`.
+ - Example: `spec/frontend/mocks/ce/lib/utils/axios_utils` will mock the module `~/lib/utils/axios_utils`.
- Files in `spec/frontend/mocks/node` will mock NPM packages of the same name or path.
- We don't support mocking EE modules yet.
If a mock is found for which a source module doesn't exist, the test suite will fail. 'Virtual' mocks, or mocks that don't have a 1-to-1 association with a source module, are not supported yet.
#### Writing a mock
+
Create a JS module in the appropriate place in `spec/frontend/mocks/`. That's it. It will automatically mock its source package in all tests.
Make sure that your mock's export has the same format as the mocked module. So, if you're mocking a CommonJS module, you'll need to use `module.exports` instead of the ES6 `export`.
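+
+For illustration only, a manual mock for `~/lib/utils/axios_utils` could look roughly like the sketch below. The exact contents are an assumption, but it shows the path convention, the `isMock` flag mentioned below, and one way to make unexpected HTTP requests fail loudly.
+
+```javascript
+// spec/frontend/mocks/ce/lib/utils/axios_utils.js (illustrative sketch, not the real mock)
+import axios from 'axios';
+
+// Lets tests assert that the mock is in place without triggering any logic.
+axios.isMock = true;
+
+// Fail loudly if a test attempts a real HTTP request instead of mocking it.
+axios.defaults.adapter = config => {
+  throw new Error(`Unexpected HTTP request: ${config.method} ${config.url}`);
+};
+
+export default axios;
+```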
@@ -99,14 +101,15 @@ Make sure that your mock's export has the same format as the mocked module. So,
It might be useful for a mock to expose a property that indicates if the mock was loaded. This way, tests can assert the presence of a mock without calling any logic and causing side-effects. The `~/lib/utils/axios_utils` module mock has such a property, `isMock`, that is `true` in the mock and undefined in the original module. Jest's mock functions also have a `mock` property that you can test.
#### Bypassing mocks
+
If you ever need to import the original module in your tests, use [`jest.requireActual()`](https://jestjs.io/docs/en/jest-object#jestrequireactualmodulename) (or `jest.requireActual().default` for the default export). Calls to `jest.mock()` and `jest.unmock()` won't have an effect on modules that have a manual mock, because mocks are imported and cached before any tests are run.
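+
+For example, a test could opt out of the manual mock like this (a hypothetical spec; `isMock` being undefined on the real module is described above):
+
+```javascript
+it('can use the real axios_utils despite the manual mock', () => {
+  // `jest.requireActual` bypasses the manual mock; note `.default` for the default export.
+  const realAxios = jest.requireActual('~/lib/utils/axios_utils').default;
+
+  expect(realAxios.isMock).toBeUndefined();
+});
+```
+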
#### Keep mocks light
+
Global mocks introduce magic and can affect how modules are imported in your tests. Try to keep them as light as possible and dependency-free. A global mock should be useful for any unit test. For example, the `axios_utils` and `jquery` module mocks throw an error when an HTTP request is attempted, since this is useful behaviour in >99% of tests.
When in doubt, construct mocks in your test file using [`jest.mock()`](https://jestjs.io/docs/en/jest-object#jestmockmodulename-factory-options), [`jest.spyOn()`](https://jestjs.io/docs/en/jest-object#jestspyonobject-methodname), etc.
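+
+As a rough sketch of such a local, per-file mock (the expectation is illustrative; `~/my_module` and `~/lib/utils/url_utility` are borrowed from the `spyOnDependency` example below):
+
+```javascript
+// my_module_spec.js: mock a single dependency for this file only, instead of adding a global mock.
+import doSomething from '~/my_module';
+import { visitUrl } from '~/lib/utils/url_utility';
+
+jest.mock('~/lib/utils/url_utility'); // hoisted by Jest, so `~/my_module` also receives the mock
+
+it('delegates navigation to visitUrl', () => {
+  // A spy on an existing object is another lightweight alternative to a global mock.
+  jest.spyOn(window, 'scrollTo').mockImplementation(() => {});
+
+  doSomething();
+
+  expect(visitUrl).toHaveBeenCalled();
+});
+```
+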
-
## Karma test suite
GitLab uses the [Karma][karma] test runner with [Jasmine] as its test
@@ -229,7 +232,7 @@ module. GitLab has a custom `spyOnDependency` method which utilizes
[babel-plugin-rewire](https://github.com/speedskater/babel-plugin-rewire) to
achieve this. It can be used like so:
-```js
+```javascript
// my_module.js
import { visitUrl } from '~/lib/utils/url_utility';
@@ -238,7 +241,7 @@ export default function doSomething() {
}
```
-```js
+```javascript
// my_module_spec.js
import doSomething from '~/my_module';
@@ -462,7 +465,7 @@ See this [section][vue-test].
For running the frontend tests, you need the following commands:
-- `rake karma:fixtures` (re-)generates [fixtures](#frontend-test-fixtures).
+- `rake frontend:fixtures` (re-)generates [fixtures](#frontend-test-fixtures).
- `yarn test` executes the tests.
As long as the fixtures don't change, `yarn test` is sufficient (and saves you some time).
@@ -515,8 +518,8 @@ Information on setting up and running RSpec integration tests with
Code that is added to HAML templates (in `app/views/`) or makes Ajax requests to the backend has tests that require HTML or JSON from the backend.
Fixtures for these tests are located at:
-- `spec/javascripts/fixtures/`, for running tests in CE.
-- `ee/spec/javascripts/fixtures/`, for running tests in EE.
+- `spec/frontend/fixtures/`, for running tests in CE.
+- `ee/spec/frontend/fixtures/`, for running tests in EE.
Fixture files in:
@@ -527,7 +530,7 @@ The following are examples of tests that work for both Karma and Jest:
```javascript
it('makes a request', () => {
- const responseBody = getJSONFixture('some/fixture.json'); // loads spec/javascripts/fixtures/some/fixture.json
+ const responseBody = getJSONFixture('some/fixture.json'); // loads spec/frontend/fixtures/some/fixture.json
axiosMock.onGet(endpoint).reply(200, responseBody);
myButton.click();
@@ -536,7 +539,7 @@ it('makes a request', () => {
});
it('uses some HTML element', () => {
- loadFixtures('some/page.html'); // loads spec/javascripts/fixtures/some/page.html and adds it to the DOM
+ loadFixtures('some/page.html'); // loads spec/frontend/fixtures/some/page.html and adds it to the DOM
  const element = document.getElementById('my-id');
@@ -544,12 +547,12 @@ it('uses some HTML element', () => {
});
```
-HTML and JSON fixtures are generated from backend views and controllers using RSpec (see `spec/javascripts/fixtures/*.rb`).
+HTML and JSON fixtures are generated from backend views and controllers using RSpec (see `spec/frontend/fixtures/*.rb`).
For each fixture, the content of the `response` variable is stored in the output file.
This variable gets automagically set if the test is marked as `type: :request` or `type: :controller`.
-Fixtures are regenerated using the `bin/rake karma:fixtures` command but you can also generate them individually,
-for example `bin/rspec spec/javascripts/fixtures/merge_requests.rb`.
+Fixtures are regenerated using the `bin/rake frontend:fixtures` command but you can also generate them individually,
+for example `bin/rspec spec/frontend/fixtures/merge_requests.rb`.
When creating a new fixture, it often makes sense to take a look at the corresponding tests for the endpoint in `(ee/)spec/controllers/` or `(ee/)spec/requests/`.
## Gotchas
@@ -585,11 +588,522 @@ end
[jasmine-focus]: https://jasmine.github.io/2.5/focused_specs.html
[karma]: http://karma-runner.github.io/
-[vue-test]: https://docs.gitlab.com/ce/development/fe_guide/vue.html#testing-vue-components
+[vue-test]: ../fe_guide/vue.md#testing-vue-components
[rspec]: https://github.com/rspec/rspec-rails#feature-specs
[capybara]: https://github.com/teamcapybara/capybara
[jasmine]: https://jasmine.github.io/
+## Overview of Frontend Testing Levels
+
+Tests relevant for frontend development can be found at the following places:
+
+- `spec/javascripts/` which are run by Karma (command: `yarn karma`) and contain
+ - [frontend unit tests](#frontend-unit-tests)
+ - [frontend component tests](#frontend-component-tests)
+ - [frontend integration tests](#frontend-integration-tests)
+- `spec/frontend/` which are run by Jest (command: `yarn jest`) and contain
+ - [frontend unit tests](#frontend-unit-tests)
+ - [frontend component tests](#frontend-component-tests)
+ - [frontend integration tests](#frontend-integration-tests)
+- `spec/features/` which are run by RSpec and contain
+ - [feature tests](#feature-tests)
+
+All tests in `spec/javascripts/` will eventually be migrated to `spec/frontend/` (see also [#52483](https://gitlab.com/gitlab-org/gitlab-ce/issues/52483)).
+
+In addition, there used to be feature tests in `features/`, run by Spinach.
+These were removed from the codebase in May 2018 ([#23036](https://gitlab.com/gitlab-org/gitlab-ce/issues/23036)).
+
+See also [Notes on testing Vue components](../fe_guide/vue.md#testing-vue-components).
+
+### Frontend unit tests
+
+Unit tests are on the lowest abstraction level and typically test functionality that is not directly perceivable by a user.
+
+```mermaid
+graph RL
+ plain[Plain JavaScript];
+ Vue[Vue Components];
+ feature-flags[Feature Flags];
+ license-checks[License Checks];
+
+ plain---Vuex;
+ plain---GraphQL;
+ Vue---plain;
+ Vue---Vuex;
+ Vue---GraphQL;
+ browser---plain;
+ browser---Vue;
+ plain---backend;
+ Vuex---backend;
+ GraphQL---backend;
+ Vue---backend;
+ backend---database;
+ backend---feature-flags;
+ backend---license-checks;
+
+ class plain tested;
+ class Vuex tested;
+
+ classDef node color:#909090,fill:#f0f0f0,stroke-width:2px,stroke:#909090
+ classDef label stroke-width:0;
+ classDef tested color:#000000,fill:#a0c0ff,stroke:#6666ff,stroke-width:2px,stroke-dasharray: 5, 5;
+
+ subgraph " "
+ tested;
+ mocked;
+ class tested tested;
+ end
+```
+
+#### When to use unit tests
+
+<details>
+ <summary>exported functions and classes</summary>
+ Anything that is exported can be reused at various places in a way you have no control over.
+ Therefore it is necessary to document the expected behavior of the public interface with tests.
+</details>
+
+<details>
+ <summary>Vuex actions</summary>
+ Any Vuex action needs to work in a consistent way independent of the component it is triggered from.
+</details>
+
+<details>
+ <summary>Vuex mutations</summary>
+ For complex Vuex mutations it helps to identify the source of a problem by separating the tests from other parts of the Vuex store.
+</details>
+
+#### When *not* to use unit tests
+
+<details>
+ <summary>non-exported functions or classes</summary>
+ Anything that is not exported from a module can be considered private or an implementation detail and doesn't need to be tested.
+</details>
+
+<details>
+ <summary>constants</summary>
+  Testing the value of a constant would mean copying it.
+ This results in extra effort without additional confidence that the value is correct.
+</details>
+
+<details>
+ <summary>Vue components</summary>
+ Computed properties, methods, and lifecycle hooks can be considered an implementation detail of components and don't need to be tested.
+ They are implicitly covered by component tests.
+ The <a href="https://vue-test-utils.vuejs.org/guides/#getting-started">official Vue guidelines</a> suggest the same.
+</details>
+
+#### What to mock in unit tests
+
+<details>
+ <summary>state of the class under test</summary>
+ Modifying the state of the class under test directly rather than using methods of the class avoids side-effects in test setup.
+</details>
+
+<details>
+ <summary>other exported classes</summary>
+ Every class needs to be tested in isolation to prevent test scenarios from growing exponentially.
+</details>
+
+<details>
+ <summary>single DOM elements if passed as parameters</summary>
+ For tests that only operate on single DOM elements rather than a whole page, creating these elements is cheaper than loading a whole HTML fixture.
+</details>
+
+<details>
+ <summary>all server requests</summary>
+ When running frontend unit tests, the backend may not be reachable.
+  Therefore all outgoing requests need to be mocked (see the sketch after this list).
+</details>
+
+<details>
+ <summary>asynchronous background operations</summary>
+ Background operations cannot be stopped or waited on, so they will continue running in the following tests and cause side effects.
+</details>
+
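+In practice, mocking all server requests in a unit test often looks like the sketch below. This is only an illustration: the module under test and the endpoint are made up, the mock adapter usage mirrors the `axiosMock` fixture example earlier in this guide, and it assumes the default export of `~/lib/utils/axios_utils` is the shared Axios instance.
+
+```javascript
+import MockAdapter from 'axios-mock-adapter';
+import axios from '~/lib/utils/axios_utils';
+import { fetchUserName } from '~/illustrative/api'; // hypothetical module under test
+
+describe('fetchUserName', () => {
+  let mock;
+
+  beforeEach(() => {
+    // Intercept every request made through our axios instance.
+    mock = new MockAdapter(axios);
+  });
+
+  afterEach(() => {
+    mock.restore();
+  });
+
+  it('returns the name from the API', () => {
+    mock.onGet('/api/v4/user').reply(200, { name: 'Administrator' });
+
+    return fetchUserName().then(name => {
+      expect(name).toBe('Administrator');
+    });
+  });
+});
+```
+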
+#### What *not* to mock in unit tests
+
+<details>
+ <summary>non-exported functions or classes</summary>
+ Everything that is not exported can be considered private to the module and will be implicitly tested via the exported classes / functions.
+</details>
+
+<details>
+ <summary>methods of the class under test</summary>
+ By mocking methods of the class under test, the mocks will be tested and not the real methods.
+</details>
+
+<details>
+ <summary>utility functions (pure functions, or those that only modify parameters)</summary>
+ If a function has no side effects because it has no state, it is safe to not mock it in tests.
+</details>
+
+<details>
+ <summary>full HTML pages</summary>
+ Loading the HTML of a full page slows down tests, so it should be avoided in unit tests.
+</details>
+
+### Frontend component tests
+
+Component tests cover the state of a single component that is perceivable by a user depending on external signals such as user input, events fired from other components, or application state.
+
+```mermaid
+graph RL
+ plain[Plain JavaScript];
+ Vue[Vue Components];
+ feature-flags[Feature Flags];
+ license-checks[License Checks];
+
+ plain---Vuex;
+ plain---GraphQL;
+ Vue---plain;
+ Vue---Vuex;
+ Vue---GraphQL;
+ browser---plain;
+ browser---Vue;
+ plain---backend;
+ Vuex---backend;
+ GraphQL---backend;
+ Vue---backend;
+ backend---database;
+ backend---feature-flags;
+ backend---license-checks;
+
+ class Vue tested;
+
+ classDef node color:#909090,fill:#f0f0f0,stroke-width:2px,stroke:#909090
+ classDef label stroke-width:0;
+ classDef tested color:#000000,fill:#a0c0ff,stroke:#6666ff,stroke-width:2px,stroke-dasharray: 5, 5;
+
+ subgraph " "
+ tested;
+ mocked;
+ class tested tested;
+ end
+```
+
+#### When to use component tests
+
+- Vue components
+
+#### When *not* to use component tests
+
+<details>
+ <summary>Vue applications</summary>
+ Vue applications may contain many components.
+  Testing them at the component level requires too much effort.
+  Therefore they are tested at the frontend integration level.
+</details>
+
+<details>
+ <summary>HAML templates</summary>
+  HAML templates contain only markup and no frontend logic.
+ Therefore they are not complete components.
+</details>
+
+#### What to mock in component tests
+
+<details>
+ <summary>DOM</summary>
+ Operating on the real DOM is significantly slower than on the virtual DOM.
+</details>
+
+<details>
+ <summary>properties and state of the component under test</summary>
+ Similarly to testing classes, modifying the properties directly (rather than relying on methods of the component) avoids side-effects.
+</details>
+
+<details>
+ <summary>Vuex store</summary>
+ To avoid side effects and keep component tests simple, Vuex stores are replaced with mocks.
+</details>
+
+<details>
+ <summary>all server requests</summary>
+ Similar to unit tests, when running component tests, the backend may not be reachable.
+ Therefore all outgoing requests need to be mocked.
+</details>
+
+<details>
+ <summary>asynchronous background operations</summary>
+ Similar to unit tests, background operations cannot be stopped or waited on, so they will continue running in the following tests and cause side effects.
+</details>
+
+<details>
+ <summary>child components</summary>
+  Every component is tested individually, so child components are mocked (see the sketch after this list).
+  See also <a href="https://vue-test-utils.vuejs.org/api/#shallowmount">shallowMount()</a>.
+</details>
+
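+A minimal sketch of a component test that follows these guidelines (the component path, the store shape, and the `.empty-state` selector are assumptions for illustration; `shallowMount` stubs child components automatically):
+
+```javascript
+import { shallowMount, createLocalVue } from '@vue/test-utils';
+import Vuex from 'vuex';
+import MyWidget from '~/my_widget/components/my_widget.vue'; // hypothetical component
+
+const localVue = createLocalVue();
+localVue.use(Vuex);
+
+describe('MyWidget', () => {
+  let wrapper;
+
+  const createStore = () =>
+    new Vuex.Store({
+      state: { items: [] },
+      actions: { fetchItems: jest.fn() }, // mocked action, so no real request is made
+    });
+
+  beforeEach(() => {
+    wrapper = shallowMount(MyWidget, { localVue, store: createStore() });
+  });
+
+  afterEach(() => {
+    wrapper.destroy();
+  });
+
+  it('renders an empty state when there are no items', () => {
+    expect(wrapper.find('.empty-state').exists()).toBe(true);
+  });
+});
+```
+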
+#### What *not* to mock in component tests
+
+<details>
+ <summary>methods or computed properties of the component under test</summary>
+ By mocking part of the component under test, the mocks will be tested and not the real component.
+</details>
+
+<details>
+ <summary>functions and classes independent from Vue</summary>
+  All plain JavaScript code is already covered by unit tests and does not need to be mocked in component tests.
+</details>
+
+### Frontend integration tests
+
+Integration tests cover the interaction between all components on a single page.
+Their abstraction level is comparable to how a user would interact with the UI.
+
+```mermaid
+graph RL
+ plain[Plain JavaScript];
+ Vue[Vue Components];
+ feature-flags[Feature Flags];
+ license-checks[License Checks];
+
+ plain---Vuex;
+ plain---GraphQL;
+ Vue---plain;
+ Vue---Vuex;
+ Vue---GraphQL;
+ browser---plain;
+ browser---Vue;
+ plain---backend;
+ Vuex---backend;
+ GraphQL---backend;
+ Vue---backend;
+ backend---database;
+ backend---feature-flags;
+ backend---license-checks;
+
+ class plain tested;
+ class Vue tested;
+ class Vuex tested;
+ class GraphQL tested;
+ class browser tested;
+ linkStyle 0,1,2,3,4,5,6 stroke:#6666ff,stroke-width:2px,stroke-dasharray: 5, 5;
+
+ classDef node color:#909090,fill:#f0f0f0,stroke-width:2px,stroke:#909090
+ classDef label stroke-width:0;
+ classDef tested color:#000000,fill:#a0c0ff,stroke:#6666ff,stroke-width:2px,stroke-dasharray: 5, 5;
+
+ subgraph " "
+ tested;
+ mocked;
+ class tested tested;
+ end
+```
+
+#### When to use integration tests
+
+<details>
+ <summary>page bundles (<code>index.js</code> files in <code>app/assets/javascripts/pages/</code>)</summary>
+ Testing the page bundles ensures the corresponding frontend components integrate well.
+</details>
+
+<details>
+ <summary>Vue applications outside of page bundles</summary>
+ Testing Vue applications as a whole ensures the corresponding frontend components integrate well.
+</details>
+
+#### What to mock in integration tests
+
+<details>
+ <summary>HAML views (use fixtures instead)</summary>
+  Rendering HAML views requires a Rails environment including a running database, which we cannot rely on in frontend tests.
+</details>
+
+<details>
+ <summary>all server requests</summary>
+  Similar to unit and component tests, when running integration tests, the backend may not be reachable.
+ Therefore all outgoing requests need to be mocked.
+</details>
+
+<details>
+ <summary>asynchronous background operations that are not perceivable on the page</summary>
+ Background operations that affect the page need to be tested on this level.
+ All other background operations cannot be stopped or waited on, so they will continue running in the following tests and cause side effects.
+</details>
+
+#### What *not* to mock in integration tests
+
+<details>
+ <summary>DOM</summary>
+ Testing on the real DOM ensures our components work in the environment they are meant for.
+ Part of this will be delegated to <a href="https://gitlab.com/gitlab-org/quality/team-tasks/issues/45">cross-browser testing</a>.
+</details>
+
+<details>
+ <summary>properties or state of components</summary>
+ On this level, all tests can only perform actions a user would do.
+  For example, to change the state of a component, a click event would be fired.
+</details>
+
+<details>
+ <summary>Vuex stores</summary>
+ When testing the frontend code of a page as a whole, the interaction between Vue components and Vuex stores is covered as well.
+</details>
+
+### Feature tests
+
+In contrast to [frontend integration tests](#frontend-integration-tests), feature tests make requests against the real backend instead of using fixtures.
+This also implies that database queries are executed, which makes this category significantly slower.
+
+See also the [RSpec testing guidelines](../testing_guide/best_practices.md#rspec).
+
+```mermaid
+graph RL
+ plain[Plain JavaScript];
+ Vue[Vue Components];
+ feature-flags[Feature Flags];
+ license-checks[License Checks];
+
+ plain---Vuex;
+ plain---GraphQL;
+ Vue---plain;
+ Vue---Vuex;
+ Vue---GraphQL;
+ browser---plain;
+ browser---Vue;
+ plain---backend;
+ Vuex---backend;
+ GraphQL---backend;
+ Vue---backend;
+ backend---database;
+ backend---feature-flags;
+ backend---license-checks;
+
+ class backend tested;
+ class plain tested;
+ class Vue tested;
+ class Vuex tested;
+ class GraphQL tested;
+ class browser tested;
+ linkStyle 0,1,2,3,4,5,6,7,8,9,10 stroke:#6666ff,stroke-width:2px,stroke-dasharray: 5, 5;
+
+ classDef node color:#909090,fill:#f0f0f0,stroke-width:2px,stroke:#909090
+ classDef label stroke-width:0;
+ classDef tested color:#000000,fill:#a0c0ff,stroke:#6666ff,stroke-width:2px,stroke-dasharray: 5, 5;
+
+ subgraph " "
+ tested;
+ mocked;
+ class tested tested;
+ end
+```
+
+#### When to use feature tests
+
+- Use cases that require a backend and cannot be tested using fixtures.
+- Behavior that is not part of a page bundle but defined globally.
+
+#### Relevant notes
+
+A `:js` flag is added to the test to make sure the full environment is loaded.
+
+```ruby
+scenario 'successfully', :js do
+ sign_in(create(:admin))
+end
+```
+
+The steps of each test are written using Capybara methods ([documentation](https://www.rubydoc.info/gems/capybara)).
+
+Bear in mind <abbr title="XMLHttpRequest">XHR</abbr> calls might require you to use `wait_for_requests` in between steps, like so:
+
+```ruby
+find('.form-control').native.send_keys(:enter)
+
+wait_for_requests
+
+expect(page).not_to have_selector('.card')
+```
+
+## Test helpers
+
+### Vuex Helper: `testAction`
+
+We have a helper available to make testing actions easier, as per the [official documentation](https://vuex.vuejs.org/guide/testing.html):
+
+```javascript
+testAction(
+ actions.actionName, // action
+ { }, // params to be passed to action
+ state, // state
+ [
+ { type: types.MUTATION},
+ { type: types.MUTATION_1, payload: {}},
+ ], // mutations committed
+ [
+ { type: 'actionName', payload: {}},
+ { type: 'actionName1', payload: {}},
+  ], // actions dispatched
+ done,
+);
+```
+
+Check an example in [spec/javascripts/ide/stores/actions_spec.js](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/spec/javascripts/ide/stores/actions_spec.js).
+
+### Vue Helper: `mountComponent`
+
+To make mounting a Vue component easier and more readable, we have a few helpers available in `spec/helpers/vue_mount_component_helper`:
+
+- `createComponentWithStore`
+- `mountComponentWithStore`
+
+Examples of usage:
+
+```javascript
+beforeEach(() => {
+ vm = createComponentWithStore(Component, store);
+
+ vm.$store.state.currentBranchId = 'master';
+
+ vm.$mount();
+});
+```
+
+```javascript
+beforeEach(() => {
+ vm = mountComponentWithStore(Component, {
+ el: '#dummy-element',
+ store,
+ props: { badge },
+ });
+});
+```
+
+Don't forget to clean up:
+
+```javascript
+afterEach(() => {
+ vm.$destroy();
+});
+```
+
+## Testing with older browsers
+
+Some regressions only affect a specific browser version. We can install and test specific browser versions either locally with Firefox or remotely with BrowserStack, using the following steps:
+
+### Browserstack
+
+[BrowserStack](https://www.browserstack.com/) allows you to test more than 1200 mobile devices and browsers.
+You can use it directly through the [live app](https://www.browserstack.com/live) or you can install the [Chrome extension](https://chrome.google.com/webstore/detail/browserstack/nkihdmlheodkdfojglpcjjmioefjahjb) for easy access.
+You can find the credentials on 1Password, under `frontendteam@gitlab.com`.
+
+### Firefox
+
+#### macOS
+
+You can download any older version of Firefox from the releases FTP server, <https://ftp.mozilla.org/pub/firefox/releases/>:
+
+1. From the website, select a version, in this case `50.0.1`.
+1. Go to the `mac` folder.
+1. Select your preferred language; you will find the DMG package inside. Download it.
+1. Drag and drop the application to any folder other than the `Applications` folder.
+1. Rename the application to something like `Firefox_Old`.
+1. Move the application to the `Applications` folder.
+1. Open up a terminal and run `/Applications/Firefox_Old.app/Contents/MacOS/firefox-bin -profilemanager` to create a new profile specific to that Firefox version.
+1. Once the profile has been created, quit the app, and run it again like normal. You now have a working older Firefox version.
+
---
[Return to Testing documentation](index.md)
diff --git a/doc/development/testing_guide/img/qa_on_merge_requests_cicd_architecture.png b/doc/development/testing_guide/img/qa_on_merge_requests_cicd_architecture.png
deleted file mode 100644
index 5b93a05db96..00000000000
--- a/doc/development/testing_guide/img/qa_on_merge_requests_cicd_architecture.png
+++ /dev/null
Binary files differ
diff --git a/doc/development/testing_guide/img/review_apps_cicd_architecture.png b/doc/development/testing_guide/img/review_apps_cicd_architecture.png
deleted file mode 100644
index 1ee28d3db91..00000000000
--- a/doc/development/testing_guide/img/review_apps_cicd_architecture.png
+++ /dev/null
Binary files differ
diff --git a/doc/development/testing_guide/review_apps.md b/doc/development/testing_guide/review_apps.md
index ae40d628717..11449712a04 100644
--- a/doc/development/testing_guide/review_apps.md
+++ b/doc/development/testing_guide/review_apps.md
@@ -8,38 +8,33 @@ Review Apps are automatically deployed by each pipeline, both in
### CI/CD architecture diagram
-![Review Apps CI/CD architecture](img/review_apps_cicd_architecture.png)
-
-<details>
-<summary>Show mermaid source</summary>
-<pre>
+```mermaid
graph TD
build-qa-image -.->|once the `prepare` stage is done| gitlab:assets:compile
review-build-cng -->|triggers a CNG-mirror pipeline and waits for it to be done| CNG-mirror
review-build-cng -.->|once the `test` stage is done| review-deploy
review-deploy -.->|once the `review` stage is done| review-qa-smoke
-subgraph 1. gitlab-ce/ee `prepare` stage
+subgraph "1. gitlab-ce/ee `prepare` stage"
build-qa-image
end
-subgraph 2. gitlab-ce/ee `test` stage
+subgraph "2. gitlab-ce/ee `test` stage"
gitlab:assets:compile -->|plays dependent job once done| review-build-cng
end
-subgraph 3. gitlab-ce/ee `review` stage
- review-deploy["review-deploy<br /><br />Helm deploys the Review App using the Cloud<br/>Native images built by the CNG-mirror pipeline.<br /><br />Cloud Native images are deployed to the `review-apps-ce` or `review-apps-ee`<br />Kubernetes (GKE) cluster, in the GCP `gitlab-review-apps` project."]
+subgraph "3. gitlab-ce/ee `review` stage"
+ review-deploy["review-deploy<br><br>Helm deploys the Review App using the Cloud<br/>Native images built by the CNG-mirror pipeline.<br><br>Cloud Native images are deployed to the `review-apps-ce` or `review-apps-ee`<br>Kubernetes (GKE) cluster, in the GCP `gitlab-review-apps` project."]
end
-subgraph 4. gitlab-ce/ee `qa` stage
- review-qa-smoke[review-qa-smoke<br /><br />gitlab-qa runs the smoke suite against the Review App.]
+subgraph "4. gitlab-ce/ee `qa` stage"
+ review-qa-smoke[review-qa-smoke<br><br>gitlab-qa runs the smoke suite against the Review App.]
end
-subgraph CNG-mirror pipeline
+subgraph "CNG-mirror pipeline"
CNG-mirror>Cloud Native images are built];
end
-</pre>
-</details>
+```
### Detailed explanation
@@ -115,6 +110,28 @@ On every [pipeline][gitlab-pipeline] in the `qa` stage, the
browser performance testing using a
[Sitespeed.io Container](../../user/project/merge_requests/browser_performance_testing.md).
+## Cluster configuration
+
+### Node pools
+
+Both `review-apps-ce` and `review-apps-ee` clusters are currently set up with
+two node pools:
+
+- a node pool with a single non-preemptible `n1-standard-2` (2 vCPU, 7.5 GB memory) node
+  dedicated to the `tiller` deployment (see below).
+- a node pool of preemptible `n1-standard-2` (2 vCPU, 7.5 GB memory) nodes,
+ with a minimum of 1 node and a maximum of 250 nodes.
+
+### Helm/Tiller
+
+The `tiller` deployment (the Helm server) is deployed to a dedicated node pool
+that has the `app=helm` label and a specific
+[taint](https://kubernetes.io/docs/concepts/configuration/taint-and-toleration/)
+to prevent other pods from being scheduled on this node pool.
+
+This is to ensure Tiller isn't affected by "noisy" neighbors that could put
+their node under pressure.
+
## How to:
### Log into my Review App
@@ -137,8 +154,8 @@ secure note named **gitlab-{ce,ee} Review App's root password**.
### Run a Rails console
-1. [Filter Workloads by your Review App slug](https://console.cloud.google.com/kubernetes/workload?project=gitlab-review-apps)
- , e.g. `review-qa-raise-e-12chm0`.
+1. [Filter Workloads by your Review App slug](https://console.cloud.google.com/kubernetes/workload?project=gitlab-review-apps),
+ e.g. `review-qa-raise-e-12chm0`.
1. Find and open the `task-runner` Deployment, e.g. `review-qa-raise-e-12chm0-task-runner`.
1. Click on the Pod in the "Managed pods" section, e.g. `review-qa-raise-e-12chm0-task-runner-d5455cc8-2lsvz`.
1. Click on the `KUBECTL` dropdown, then `Exec` -> `task-runner`.
@@ -196,7 +213,7 @@ For the record, the debugging steps to find out this issue were:
1. `kubectl describe pod <pod name>` & confirm exact error message
1. Web search for exact error message, following rabbit hole to [a relevant kubernetes bug report](https://github.com/kubernetes/kubernetes/issues/57345)
1. Access the node over SSH via the GCP console (**Compute Engine > VM
- instances** then click the "SSH" button for the node where the `dns-gitlab-review-app-external-dns` pod runs)
+ instances** then click the "SSH" button for the node where the `dns-gitlab-review-app-external-dns` pod runs)
1. In the node: `systemctl --version` => systemd 232
1. Gather some more information:
- `mount | grep kube | wc -l` => e.g. 290
@@ -211,7 +228,7 @@ For the record, the debugging steps to find out this issue were:
To resolve the problem, we needed to (forcibly) drain some nodes:
1. Try a normal drain on the node where the `dns-gitlab-review-app-external-dns`
- pod runs so that Kubernetes automatically move it to another node: `kubectl drain NODE_NAME`
+  pod runs so that Kubernetes automatically moves it to another node: `kubectl drain NODE_NAME`
1. If that doesn't work, you can also forcibly "drain" the node by removing all pods: `kubectl delete pods --field-selector=spec.nodeName=NODE_NAME`
1. In the node:
- Perform `systemctl daemon-reload` to remove the dead/inactive units
@@ -238,17 +255,8 @@ that a machine will hit the "too many mount points" problem in the future.
thousands of unused Docker images.**
> We have to start somewhere and improve later. Also, we're using the
- CNG-mirror project to store these Docker images so that we can just wipe out
- the registry at some point, and use a new fresh, empty one.
-
-**How big are the Kubernetes clusters (`review-apps-ce` and `review-apps-ee`)?**
-
- > The clusters are currently set up with a single pool of preemptible nodes,
- with a minimum of 1 node and a maximum of 500 nodes.
-
-**What are the machine running on the cluster?**
-
- > We're currently using `n1-standard-1` (1 vCPU, 3.75 GB memory) machines.
+ > CNG-mirror project to store these Docker images so that we can just wipe out
+ > the registry at some point, and use a new fresh, empty one.
**How do we secure this from abuse? Apps are open to the world so we need to
find a way to limit it to only us.**
diff --git a/doc/development/testing_guide/testing_levels.md b/doc/development/testing_guide/testing_levels.md
index e1ce4d3b7d1..0090c84cbf0 100644
--- a/doc/development/testing_guide/testing_levels.md
+++ b/doc/development/testing_guide/testing_levels.md
@@ -63,10 +63,9 @@ They're useful to test permissions, redirections, what view is rendered etc.
| Code path | Tests path | Testing engine | Notes |
| --------- | ---------- | -------------- | ----- |
-| `app/controllers/` | `spec/controllers/` | RSpec | |
+| `app/controllers/` | `spec/controllers/` | RSpec | For N+1 tests, use [request specs](../query_recorder.md#use-request-specs-instead-of-controller-specs) |
| `app/mailers/` | `spec/mailers/` | RSpec | |
| `lib/api/` | `spec/requests/api/` | RSpec | |
-| `lib/ci/api/` | `spec/requests/ci/api/` | RSpec | |
| `app/assets/javascripts/` | `spec/javascripts/`, `spec/frontend/` | Karma & Jest | More details in the [Frontend Testing guide](frontend_testing.md) section. |
### About controller tests
diff --git a/doc/development/understanding_explain_plans.md b/doc/development/understanding_explain_plans.md
index 11aafd7b639..7c926c83a36 100644
--- a/doc/development/understanding_explain_plans.md
+++ b/doc/development/understanding_explain_plans.md
@@ -199,7 +199,7 @@ more common ones here.
A full list of all the available nodes and their descriptions can be found in
the [PostgreSQL source file
-"plannodes.h"](https://github.com/postgres/postgres/blob/master/src/include/nodes/plannodes.h)
+"plannodes.h"](https://gitlab.com/postgres/postgres/blob/master/src/include/nodes/plannodes.h)
### Seq Scan
@@ -224,7 +224,7 @@ used when we would read too much data from an index scan, but too little to
perform a sequential scan. A bitmap scan uses what is known as a [bitmap
index](https://en.wikipedia.org/wiki/Bitmap_index) to perform its work.
-The [source code of PostgreSQL](https://github.com/postgres/postgres/blob/1c2cb2744bf3d8ad751cd5cf3b347f10f48492b3/src/include/nodes/plannodes.h#L446-L457)
+The [source code of PostgreSQL](https://gitlab.com/postgres/postgres/blob/REL_11_STABLE/src/include/nodes/plannodes.h#L441)
states the following on bitmap scans:
> Bitmap Index Scan delivers a bitmap of potential tuple locations; it does not
diff --git a/doc/development/utilities.md b/doc/development/utilities.md
index 0e396baccff..4021756343c 100644
--- a/doc/development/utilities.md
+++ b/doc/development/utilities.md
@@ -6,44 +6,44 @@ We developed a number of utilities to ease development.
- Deep merges an array of hashes:
- ``` ruby
- Gitlab::Utils::MergeHash.merge(
- [{ hello: ["world"] },
- { hello: "Everyone" },
- { hello: { greetings: ['Bonjour', 'Hello', 'Hallo', 'Dzien dobry'] } },
- "Goodbye", "Hallo"]
- )
- ```
-
- Gives:
-
- ``` ruby
- [
- {
- hello:
- [
- "world",
- "Everyone",
- { greetings: ['Bonjour', 'Hello', 'Hallo', 'Dzien dobry'] }
- ]
- },
- "Goodbye"
- ]
- ```
+ ``` ruby
+ Gitlab::Utils::MergeHash.merge(
+ [{ hello: ["world"] },
+ { hello: "Everyone" },
+ { hello: { greetings: ['Bonjour', 'Hello', 'Hallo', 'Dzien dobry'] } },
+ "Goodbye", "Hallo"]
+ )
+ ```
+
+ Gives:
+
+ ``` ruby
+ [
+ {
+ hello:
+ [
+ "world",
+ "Everyone",
+ { greetings: ['Bonjour', 'Hello', 'Hallo', 'Dzien dobry'] }
+ ]
+ },
+ "Goodbye"
+ ]
+ ```
- Extracts all keys and values from a hash into an array:
- ``` ruby
- Gitlab::Utils::MergeHash.crush(
- { hello: "world", this: { crushes: ["an entire", "hash"] } }
- )
- ```
+ ``` ruby
+ Gitlab::Utils::MergeHash.crush(
+ { hello: "world", this: { crushes: ["an entire", "hash"] } }
+ )
+ ```
- Gives:
+ Gives:
- ``` ruby
- [:hello, "world", :this, :crushes, "an entire", "hash"]
- ```
+ ``` ruby
+ [:hello, "world", :this, :crushes, "an entire", "hash"]
+ ```
## [`Override`](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/lib/gitlab/utils/override.rb)
@@ -53,9 +53,9 @@ We developed a number of utilities to ease development.
`ENV['STATIC_VERIFICATION']` is set to avoid production runtime overhead.
This is useful to check:
- - If we have typos in overriding methods.
- - If we renamed the overridden methods, making original overriding methods
- overrides nothing.
+ - If we have typos in overriding methods.
+  - If we renamed the overridden methods, so that the original overriding
+    methods no longer override anything.
Here's a simple example:
@@ -94,47 +94,47 @@ We developed a number of utilities to ease development.
- Memoize the value even if it is `nil` or `false`.
- We often do `@value ||= compute`, however this doesn't work well if
- `compute` might eventually give `nil` and we don't want to compute again.
- Instead we could use `defined?` to check if the value is set or not.
- However it's tedious to write such pattern, and `StrongMemoize` would
- help us use such pattern.
+ We often do `@value ||= compute`, however this doesn't work well if
+ `compute` might eventually give `nil` and we don't want to compute again.
+ Instead we could use `defined?` to check if the value is set or not.
+  However, it's tedious to write such a pattern, and `StrongMemoize` helps
+  us apply it.
- Instead of writing patterns like this:
+ Instead of writing patterns like this:
- ``` ruby
- class Find
- def result
- return @result if defined?(@result)
+ ``` ruby
+ class Find
+ def result
+ return @result if defined?(@result)
- @result = search
- end
+ @result = search
end
- ```
+ end
+ ```
- We could write it like:
+ We could write it like:
- ``` ruby
- class Find
- include Gitlab::Utils::StrongMemoize
+ ``` ruby
+ class Find
+ include Gitlab::Utils::StrongMemoize
- def result
- strong_memoize(:result) do
- search
- end
+ def result
+ strong_memoize(:result) do
+ search
end
end
- ```
+ end
+ ```
- Clear memoization
- ``` ruby
- class Find
- include Gitlab::Utils::StrongMemoize
- end
+ ``` ruby
+ class Find
+ include Gitlab::Utils::StrongMemoize
+ end
- Find.new.clear_memoization(:result)
- ```
+ Find.new.clear_memoization(:result)
+ ```
## [`RequestCache`](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/lib/gitlab/cache/request_cache.rb)
diff --git a/doc/development/ux_guide/animation.md b/doc/development/ux_guide/animation.md
index 583ff19bc69..a998ab74a96 100644
--- a/doc/development/ux_guide/animation.md
+++ b/doc/development/ux_guide/animation.md
@@ -1,5 +1,5 @@
---
-redirect_to: 'https://design.gitlab.com/foundations/motion'
+redirect_to: 'https://design.gitlab.com/product-foundations/motion'
---
-The content of this document was moved into the [GitLab Design System](https://design.gitlab.com).
+The content of this document was moved into the [GitLab Design System](https://design.gitlab.com/product-foundations/motion).
diff --git a/doc/development/ux_guide/illustrations.md b/doc/development/ux_guide/illustrations.md
index ed072b6515f..3592d25c95d 100644
--- a/doc/development/ux_guide/illustrations.md
+++ b/doc/development/ux_guide/illustrations.md
@@ -1,5 +1,5 @@
---
-redirect_to: 'https://design.gitlab.com/foundations/illustration/'
+redirect_to: 'https://design.gitlab.com/product-foundations/illustration'
---
-The content of this document was moved into the [GitLab Design System](https://design.gitlab.com/).
+The content of this document was moved into the [GitLab Design System](https://design.gitlab.com/product-foundations/illustration).
diff --git a/doc/development/verifying_database_capabilities.md b/doc/development/verifying_database_capabilities.md
index 661ab9cef1a..6b4995aebe2 100644
--- a/doc/development/verifying_database_capabilities.md
+++ b/doc/development/verifying_database_capabilities.md
@@ -1,15 +1,15 @@
# Verifying Database Capabilities
-Sometimes certain bits of code may only work on a certain database and/or
+Sometimes certain bits of code may only work on a certain database
version. While we try to avoid such code as much as possible, sometimes it is
necessary to add database (version) specific behaviour.
To facilitate this we have the following methods that you can use:
-- `Gitlab::Database.postgresql?`: returns `true` if PostgreSQL is being used
-- `Gitlab::Database.mysql?`: returns `true` if MySQL is being used
+- `Gitlab::Database.postgresql?`: returns `true` if PostgreSQL is being used.
+ You can normally just assume this is the case.
- `Gitlab::Database.version`: returns the PostgreSQL version number as a string
- in the format `X.Y.Z`. This method does not work for MySQL
+ in the format `X.Y.Z`.
This allows you to write code such as:
@@ -25,7 +25,7 @@ else
end
```
-# Read-only database
+## Read-only database
The database can be used in read-only mode. In this case we have to
make sure all GET requests don't attempt any write operations to the
diff --git a/doc/development/what_requires_downtime.md b/doc/development/what_requires_downtime.md
index 24edd05da2f..944bf5900c5 100644
--- a/doc/development/what_requires_downtime.md
+++ b/doc/development/what_requires_downtime.md
@@ -7,9 +7,8 @@ downtime.
## Adding Columns
-On PostgreSQL you can safely add a new column to an existing table as long as it
-does **not** have a default value. For example, this query would not require
-downtime:
+You can safely add a new column to an existing table as long as it does **not**
+have a default value. For example, this query would not require downtime:
```sql
ALTER TABLE projects ADD COLUMN random_value int;
@@ -27,11 +26,6 @@ This requires updating every single row in the `projects` table so that
indexes in a table. This in turn acquires enough locks on the table for it to
effectively block any other queries.
-As of MySQL 5.6 adding a column to a table is still quite an expensive
-operation, even when using `ALGORITHM=INPLACE` and `LOCK=NONE`. This means
-downtime _may_ be required when modifying large tables as otherwise the
-operation could potentially take hours to complete.
-
Adding a column with a default value _can_ be done without requiring downtime
when using the migration helper method
`Gitlab::Database::MigrationHelpers#add_column_with_default`. This method works
@@ -140,7 +134,7 @@ done without requiring downtime. However, this does require that any application
changes are deployed _first_. Thus, changing the constraints of a column should
happen in a post-deployment migration.
NOTE: Avoid using `change_column` as it produces an inefficient query because it re-defines
-the whole column type. For example, to add a NOT NULL constraint, prefer `change_column_null `
+the whole column type. For example, to add a NOT NULL constraint, prefer `change_column_null`
## Changing Column Types
@@ -311,8 +305,7 @@ migrations](background_migrations.md#cleaning-up).
## Adding Indexes
Adding indexes is an expensive process that blocks INSERT and UPDATE queries for
-the duration. When using PostgreSQL one can work around this by using the
-`CONCURRENTLY` option:
+the duration. You can work around this by using the `CONCURRENTLY` option:
```sql
CREATE INDEX CONCURRENTLY index_name ON projects (column_name);
@@ -336,17 +329,9 @@ end
Note that `add_concurrent_index` cannot be reversed automatically, thus you
need to manually define `up` and `down`.
-When running this on PostgreSQL the `CONCURRENTLY` option mentioned above is
-used. On MySQL this method produces a regular `CREATE INDEX` query.
-
-MySQL doesn't really have a workaround for this. Supposedly it _can_ create
-indexes without the need for downtime but only for variable width columns. The
-details on this are a bit sketchy. Since it's better to be safe than sorry one
-should assume that adding indexes requires downtime on MySQL.
-
## Dropping Indexes
-Dropping an index does not require downtime on both PostgreSQL and MySQL.
+Dropping an index does not require downtime.
## Adding Tables
@@ -370,7 +355,7 @@ transaction this means this approach would require downtime.
GitLab allows you to work around this by using
`Gitlab::Database::MigrationHelpers#add_concurrent_foreign_key`. This method
-ensures that when PostgreSQL is used no downtime is needed.
+ensures that no downtime is needed.
## Removing Foreign Keys