_widget_to_work_item_types_spec.rb`. Add the shared example that uses the constants from `described_class`.
-
+
```ruby
# frozen_string_literal: true
```
diff --git a/doc/update/deprecations.md b/doc/update/deprecations.md
index 8949cbffb26..838254c95e0 100644
--- a/doc/update/deprecations.md
+++ b/doc/update/deprecations.md
@@ -872,7 +872,7 @@ We also plan to make this easier to manage by adding an option to control this f
-- Announced in GitLab 17.4
+- Announced in GitLab 17.9
- Removal in GitLab 18.0 ([breaking change](https://docs.gitlab.com/ee/update/terminology.html#breaking-change))
- To discuss this change or learn more, see the [deprecation issue](https://gitlab.com/gitlab-org/gitlab/-/issues/426659).
diff --git a/doc/user/packages/container_registry/protected_container_tags.md b/doc/user/packages/container_registry/protected_container_tags.md
new file mode 100644
index 00000000000..122c46b9995
--- /dev/null
+++ b/doc/user/packages/container_registry/protected_container_tags.md
@@ -0,0 +1,108 @@
+---
+stage: Package
+group: Container Registry
+info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://handbook.gitlab.com/handbook/product/ux/technical-writing/#assignments
+---
+
+# Protected container tags
+
+DETAILS:
+**Tier:** Free, Premium, Ultimate
+**Offering:** GitLab.com, GitLab Self-Managed
+**Status:** Experiment
+
+> - [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/505455) as an [experiment](../../../policy/development_stages_support.md) in GitLab 17.9 [with a flag](../../../administration/feature_flags.md) named `container_registry_protected_tags`. Disabled by default.
+
+FLAG:
+The availability of this feature is controlled by a feature flag.
+For more information, see the history.
+This feature is available for testing, but not ready for production use.
+
+Control who can push and delete container tags in your project.
+
+By default, users with the Developer role or higher can push and delete image tags in all project container repositories.
+With tag protection rules, you can:
+
+- Restrict pushing and deleting tags to specific user roles.
+- Create up to five protection rules per project.
+- Apply these rules across all container repositories in your project.
+
+A tag is protected when at least one protection rule matches its name. If multiple rules match, the most restrictive rule applies.
+
+Protected tags cannot be deleted by [cleanup policies](reduce_container_registry_storage.md#cleanup-policy).
+
+## Prerequisites
+
+Before you can use protected container tags:
+
+- You must use the new container registry version:
+ - GitLab.com: Enabled by default
+ - GitLab Self-Managed: [Enable the metadata database](../../../administration/packages/container_registry_metadata_database.md)
+
+## Create a protection rule
+
+Prerequisites:
+
+- You must have at least the Maintainer role.
+
+To create a protection rule:
+
+1. On the left sidebar, select **Search or go to** and find your project.
+1. Select **Settings > Packages and registries**.
+1. Expand **Container registry**.
+1. Under **Protected container tags**, select **Add protection rule**.
+1. Complete the fields:
+ - **Protect container tags matching**: Enter a regex pattern using [RE2 syntax](https://github.com/google/re2/wiki/Syntax). Patterns must not exceed 100 characters. See [regex pattern examples](#regex-pattern-examples).
+ - **Minimum role allowed to push**: Select Maintainer, Owner, or Administrator.
+ - **Minimum role allowed to delete**: Select Maintainer, Owner, or Administrator.
+1. Select **Add rule**.
+
+The protection rule is created and matching tags are protected.
+
+## Regex pattern examples
+
+Example patterns you can use to protect container tags:
+
+| Pattern | Description |
+| ----------------- | ------------------------------------------------------------------------ |
+| `.*` | Protects all tags |
+| `^v.*` | Protects tags that start with "v" (like `v1.0.0`, `v2.1.0-rc1`) |
+| `\d+\.\d+\.\d+` | Protects semantic version tags (like `1.0.0`, `2.1.0`) |
+| `^latest$` | Protects the `latest` tag |
+| `.*-stable$` | Protects tags that end with "-stable" (like `1.0-stable`, `main-stable`) |
+| `stable\|release` | Protects tags that contain "stable" or "release" (like `1.0-stable`) |
+
+## Delete a protection rule
+
+Prerequisites:
+
+- You must have at least the Maintainer role.
+
+To delete a protection rule:
+
+1. On the left sidebar, select **Search or go to** and find your project.
+1. Select **Settings > Packages and registries**.
+1. Expand **Container registry**.
+1. Under **Protected container tags**, next to the protection rule you want to delete, select **Delete** (**{remove}**).
+1. When prompted for confirmation, select **Delete**.
+
+The protection rule is deleted and matching tags are no longer protected.
+
+## Propagation delay
+
+Rule changes rely on JWT tokens to propagate between services. As a result, changes to protection rules and user access roles might take effect only after current JWT tokens expire. The delay equals the [configured token duration](../../../administration/packages/container_registry.md#increase-token-duration):
+
+- Default: 5 minutes
+- GitLab.com: [15 minutes](../../gitlab_com/index.md#gitlab-container-registry)
+
+Most container registry clients (including Docker, the GitLab UI, and the API) request a new token for each operation, but custom clients might retain a token for its full validity period.
+
+## Image manifest deletions
+
+The GitLab UI and API do not support deleting image manifests directly.
+When a manifest is deleted through a direct container registry API call, all of its associated tags are deleted with it.
+
+To ensure tag protection, direct manifest deletion requests are allowed only when one of the following is true:
+
+- Tag protection is disabled.
+- The user has permission to delete all protected tags associated with the manifest.
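The regex patterns in the table above can be sanity-checked locally. A minimal sketch (GitLab evaluates protection patterns with RE2, but for the simple anchors and character classes shown in the table, Ruby's built-in `Regexp` behaves identically):

```ruby
# Illustrative check of the documented example patterns against sample tags.
# GitLab evaluates protection patterns with RE2; for the simple anchors and
# character classes used here, Ruby's Regexp matches the same way.
PATTERNS = {
  v_prefixed:        /^v.*/,
  semver:            /\d+\.\d+\.\d+/,
  latest_only:       /^latest$/,
  stable_suffix:     /.*-stable$/,
  stable_or_release: /stable|release/
}.freeze

def protected?(tag, pattern)
  pattern.match?(tag)
end

puts protected?('v2.1.0-rc1', PATTERNS[:v_prefixed])     # true
puts protected?('2.1.0', PATTERNS[:v_prefixed])          # false
puts protected?('latest', PATTERNS[:latest_only])        # true
puts protected?('main-stable', PATTERNS[:stable_suffix]) # true
```

Remember that a tag only needs to match one rule to be protected, so broad patterns like `.*` subsume the others.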
diff --git a/doc/user/packages/container_registry/reduce_container_registry_storage.md b/doc/user/packages/container_registry/reduce_container_registry_storage.md
index 5dc29355c86..76a8f630eab 100644
--- a/doc/user/packages/container_registry/reduce_container_registry_storage.md
+++ b/doc/user/packages/container_registry/reduce_container_registry_storage.md
@@ -116,6 +116,7 @@ The cleanup policy:
1. Orders the remaining tags by `created_date`.
1. Excludes the N tags based on the `keep_n` value (Number of tags to retain).
1. Excludes the tags more recent than the `older_than` value (Expiration interval).
+1. Excludes [protected tags](protected_container_tags.md).
1. Deletes the remaining tags in the list from the container registry.
WARNING:
diff --git a/doc/user/project/pages/index.md b/doc/user/project/pages/index.md
index 82b109620c7..f6c057bed92 100644
--- a/doc/user/project/pages/index.md
+++ b/doc/user/project/pages/index.md
@@ -348,7 +348,7 @@ deploy-pages:
script:
- echo "Pages accessible through ${CI_PAGES_URL}"
variables:
- PAGES_PREFIX: "" # No prefix by default (master)
+ PAGES_PREFIX: "" # No prefix by default (main)
pages: # specifies that this is a Pages job
path_prefix: "$PAGES_PREFIX"
artifacts:
@@ -356,7 +356,7 @@ deploy-pages:
- public
rules:
- if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH # Run on default branch (with default PAGES_PREFIX)
- - if: $CI_COMMIT_BRANCH == "staging" # Run on master (with default PAGES_PREFIX)
+    - if: $CI_COMMIT_BRANCH == "staging" # Run on the staging branch (PAGES_PREFIX is overridden below)
variables:
PAGES_PREFIX: '_stg' # Prefix with _stg for the staging branch
- if: $CI_PIPELINE_SOURCE == "merge_request_event" # Conditionally change the prefix for Merge Requests
@@ -461,7 +461,7 @@ deploy-pages-review-app:
script:
- npm run build
pages: # specifies that this is a Pages job
- path_prefix: '/_staging'
+ path_prefix: '_staging'
artifacts:
paths:
- public
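The second hunk in this file drops the leading slash from `path_prefix`. The prefix is joined into the generated Pages URL, so a minimal sketch of the corrected job configuration looks like this (job name and branch are illustrative):

```yaml
deploy-pages-review-app:
  pages:
    path_prefix: '_staging'   # no leading slash; the prefix is joined into the Pages URL
  artifacts:
    paths:
      - public
```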
diff --git a/lib/api/helpers/personal_access_tokens_helpers.rb b/lib/api/helpers/personal_access_tokens_helpers.rb
index 250192aedf8..f82e3da4da9 100644
--- a/lib/api/helpers/personal_access_tokens_helpers.rb
+++ b/lib/api/helpers/personal_access_tokens_helpers.rb
@@ -6,7 +6,17 @@ module API
def finder_params(current_user)
user_param =
if current_user.can_admin_all_resources?
- { user: user(params[:user_id]) }
+ if params[:user_id].present?
+ user = user(params[:user_id])
+
+ not_found! if user.nil?
+
+ { user: user }
+ else
+ not_found! if params.key?(:user_id)
+
+ {}
+ end
else
{ user: current_user, impersonation: false }
end
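The admin branch added above distinguishes three cases: a present `user_id` must resolve to a real user, a supplied-but-blank `user_id` is a 404, and an omitted `user_id` means no user filter. A standalone sketch of that logic (with hypothetical `users_by_id` and `NotFoundError` standing in for the API helpers):

```ruby
# Sketch of the admin branch of finder_params. Names are illustrative
# stand-ins for the real API helpers (`user`, `not_found!`).
class NotFoundError < StandardError; end

def admin_finder_params(params, users_by_id)
  user_id = params[:user_id]
  if user_id && !user_id.to_s.empty?             # user_id supplied and non-blank
    user = users_by_id[user_id]
    raise NotFoundError if user.nil?             # unknown user -> 404
    { user: user }
  else
    raise NotFoundError if params.key?(:user_id) # supplied but blank -> 404
    {}                                           # omitted -> no user filter
  end
end
```

The `params.key?(:user_id)` check is what separates "filter explicitly requested but empty" (an error) from "no filter requested at all".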
diff --git a/lib/gitlab/database/migration_helpers/work_items/widgets.rb b/lib/gitlab/database/migration_helpers/work_items/widgets.rb
index a330bc17925..05409695819 100644
--- a/lib/gitlab/database/migration_helpers/work_items/widgets.rb
+++ b/lib/gitlab/database/migration_helpers/work_items/widgets.rb
@@ -11,7 +11,10 @@ module Gitlab
# include Gitlab::Database::MigrationHelpers::WorkItems::Widgets
#
# Define the following constants in the migration class
- # WORK_ITEM_TYPE_ENUM_VALUE = 8
+ #
+  # Use a one-element array, for example [8], for a single type
+  # WORK_ITEM_TYPE_ENUM_VALUES = [8, 9]
+ #
# Use only one array item to add a single widget
# WIDGETS = [
# {
@@ -31,11 +34,11 @@ module Gitlab
#
# Then define the #up and down methods like this:
# def up
- # add_widget_definitions(type_enum_value: WORK_ITEM_TYPE_ENUM_VALUE, widgets: WIDGETS)
+ # add_widget_definitions(type_enum_values: WORK_ITEM_TYPE_ENUM_VALUES, widgets: WIDGETS)
# end
#
# def down
- # remove_widget_definitions(type_enum_value: WORK_ITEM_TYPE_ENUM_VALUE, widgets: WIDGETS)
+ # remove_widget_definitions(type_enum_values: WORK_ITEM_TYPE_ENUM_VALUES, widgets: WIDGETS)
# end
#
# Run a migration test for migrations that use this helper with:
@@ -45,30 +48,39 @@ module Gitlab
# RSpec.describe AddDesignsAndDevelopmentWidgetsToTicketWorkItemType, :migration do
# it_behaves_like 'migration that adds widgets to a work item type'
# end
- def add_widget_definitions(type_enum_value:, widgets:)
- work_item_type = migration_work_item_type.find_by(base_type: type_enum_value)
+ def add_widget_definitions(widgets:, type_enum_value: nil, type_enum_values: [])
+ enum_values = Array(type_enum_values) + [type_enum_value].compact
- # Work item type should exist in production applications, checking here to avoid failures
+ work_item_types = migration_work_item_type.where(base_type: enum_values)
+
+ # Work item types should exist in production applications, checking here to avoid failures
# if inconsistent data is present.
- return say(type_missing_message(type_enum_value)) unless work_item_type
+ validate_work_item_types(enum_values, work_item_types)
- widget_definitions = widgets.map do |w|
- { work_item_type_id: work_item_type.id, widget_options: nil }.merge(w)
+ widget_definitions = work_item_types.flat_map do |work_item_type|
+ widgets.map do |w|
+ { work_item_type_id: work_item_type.id, widget_options: nil }.merge(w)
+ end
end
+ return if widget_definitions.empty?
+
migration_widget_definition.upsert_all(
widget_definitions,
on_duplicate: :skip
)
end
- def remove_widget_definitions(type_enum_value:, widgets:)
- work_item_type = migration_work_item_type.find_by(base_type: type_enum_value)
+ def remove_widget_definitions(widgets:, type_enum_value: nil, type_enum_values: [])
+ enum_values = Array(type_enum_values) + [type_enum_value].compact
- return say(type_missing_message(type_enum_value)) unless work_item_type
+ work_item_types = migration_work_item_type.where(base_type: enum_values)
+
+ validate_work_item_types(enum_values, work_item_types)
+ return if work_item_types.empty?
migration_widget_definition.where(
- work_item_type_id: work_item_type.id,
+ work_item_type_id: work_item_types.pluck(:id),
widget_type: widgets.pluck(:widget_type)
).delete_all
end
@@ -89,6 +101,12 @@ module Gitlab
skipping widget processing.
MESSAGE
end
+
+ def validate_work_item_types(enum_values, work_item_types)
+ found_types = work_item_types&.pluck(:base_type) || []
+ missing_types = enum_values - found_types
+ missing_types.each { |type| say(type_missing_message(type)) }
+ end
end
end
end
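The backward-compatible parameter handling in both helper methods reduces to one expression; a minimal sketch of its behavior:

```ruby
# How the helper merges the legacy singular keyword argument with the new
# plural one: Array() wraps a bare value in an array, and compact drops the
# nil left behind when only one of the two arguments is supplied.
def merge_enum_values(type_enum_value: nil, type_enum_values: [])
  Array(type_enum_values) + [type_enum_value].compact
end

merge_enum_values(type_enum_value: 8)        # => [8]     (legacy call)
merge_enum_values(type_enum_values: [8, 9])  # => [8, 9]  (new call)
merge_enum_values                            # => []
```

This is why existing migrations that still pass `type_enum_value:` keep working unchanged.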
diff --git a/locale/gitlab.pot b/locale/gitlab.pot
index 14f7c5df97f..1948ad51756 100644
--- a/locale/gitlab.pot
+++ b/locale/gitlab.pot
@@ -15266,6 +15266,9 @@ msgstr ""
msgid "Confirm password"
msgstr ""
+msgid "Confirm project name"
+msgstr ""
+
msgid "Confirm this email address within %{cut_off_days} days, otherwise the email address is removed."
msgstr ""
@@ -30360,6 +30363,9 @@ msgstr ""
msgid "Integrations|Add exclusions"
msgstr ""
+msgid "Integrations|Add integration"
+msgstr ""
+
msgid "Integrations|All details"
msgstr ""
diff --git a/qa/qa/specs/features/browser_ui/4_verify/ci_components_catalog/release_with_glab_spec.rb b/qa/qa/specs/features/browser_ui/4_verify/ci_components_catalog/release_with_glab_spec.rb
new file mode 100644
index 00000000000..c4d2008c785
--- /dev/null
+++ b/qa/qa/specs/features/browser_ui/4_verify/ci_components_catalog/release_with_glab_spec.rb
@@ -0,0 +1,226 @@
+# frozen_string_literal: true
+
+module QA
+ RSpec.describe 'Verify', product_group: :pipeline_authoring, feature_flag: {
+ name: :ci_release_cli_catalog_publish_option
+ } do
+ describe 'CI catalog release with glab', :skip_live_env do
+ let(:executor) { "qa-runner-#{Faker::Alphanumeric.alphanumeric(number: 8)}" }
+
+ let!(:project) do
+ create(:project, :with_readme, name: 'component-project', description: 'This is a project with CI component.')
+ end
+
+ let!(:milestone1) { create(:project_milestone, project: project, title: 'v1.0') }
+ let!(:milestone2) { create(:project_milestone, project: project, title: 'v2.0') }
+
+ let!(:runner) { create(:project_runner, project: project, name: executor, tags: [executor], executor: :docker) }
+
+ before do
+ Runtime::Feature.enable(:ci_release_cli_catalog_publish_option)
+
+ Flow::Login.sign_in
+
+ Flow::Project.enable_catalog_resource_feature(project)
+ end
+
+ after do
+ runner.remove_via_api!
+ end
+
+ it 'creates a release with existing tag',
+ testcase: 'https://gitlab.com/gitlab-org/gitlab/-/quality/test_cases/514352' do
+ setup_component(project, gitlab_ci_yaml_for_create_release_with_existing_tag)
+ project.create_repository_tag('1.0.0')
+
+ Flow::Pipeline.wait_for_pipeline_creation_via_api(project: project)
+ project.visit_job('create-release-with-existing-tag')
+
+ Page::Project::Job::Show.perform do |show|
+ Support::Waiter.wait_until { show.has_passed? }
+
+ aggregate_failures 'Job has expected contents' do
+ expect(show.output).to have_content('Release created:')
+ expect(show.output).to have_content('Publishing release tag=1.0.0 to the GitLab CI/CD catalog')
+ expect(show.output).to have_content('Release published:')
+ end
+ end
+
+ visit_catalog_resource_show_page
+
+ Page::Explore::CiCdCatalog::Show.perform do |show|
+ aggregate_failures 'Catalog resource has expected contents' do
+ expect(show).to have_version_badge('1.0.0')
+ expect(show).to have_component_name('new_component')
+ expect(show).to have_input(
+ name: 'scanner-output',
+ required: 'false',
+ type: 'string',
+ description: '',
+ default: 'json'
+ )
+ end
+
+ show.click_latest_version_badge
+ end
+
+ Page::Project::Tag::Show.perform do |show|
+ aggregate_failures 'Tag has expected contents' do
+ expect(show).to have_tag_name('1.0.0')
+ expect(show).to have_no_tag_message
+ end
+
+ show.click_release_link
+ end
+
+ Page::Project::Release::Show.perform do |show|
+ aggregate_failures 'Release has expected contents' do
+ expect(show).to have_release_name('1.0.0')
+ expect(show).to have_release_description('A long description of the release')
+ end
+ end
+ end
+
+ it 'creates a release with new tag filled with information',
+ testcase: 'https://gitlab.com/gitlab-org/gitlab/-/quality/test_cases/514353' do
+ setup_component(project, gitlab_ci_yaml_for_create_release_with_new_tag_filled_with_information)
+ project.create_repository_tag('1.0.0')
+
+ Flow::Pipeline.wait_for_pipeline_creation_via_api(project: project)
+ project.visit_job('create-release-with-new-tag-filled-with-information')
+
+ Page::Project::Job::Show.perform do |show|
+ Support::Waiter.wait_until { show.has_passed? }
+
+ aggregate_failures 'Job has expected contents' do
+ expect(show.output).to have_content('Release created:')
+ expect(show.output).to have_content('Publishing release tag=v9.0.2 to the GitLab CI/CD catalog')
+ expect(show.output).to have_content('Release published:')
+ end
+ end
+
+ visit_catalog_resource_show_page
+
+ Page::Explore::CiCdCatalog::Show.perform do |show|
+ aggregate_failures 'Catalog resource has expected contents' do
+ expect(show).to have_version_badge('9.0.2')
+ expect(show).to have_component_name('new_component')
+ expect(show).to have_input(
+ name: 'scanner-output',
+ required: 'false',
+ type: 'string',
+ description: '',
+ default: 'json'
+ )
+ end
+
+ show.click_latest_version_badge
+ end
+
+ Page::Project::Tag::Show.perform do |show|
+ aggregate_failures 'Tag has expected contents' do
+ expect(show).to have_tag_name('v9.0.2')
+ expect(show).to have_tag_message('a new tag')
+ end
+
+ show.click_release_link
+ end
+
+ Page::Project::Release::Show.perform do |show|
+ aggregate_failures 'Release has expected contents' do
+ expect(show).to have_release_name('new release v9.0.2')
+ expect(show).to have_release_description('A long description of the release')
+ expect(show).to have_milestone_title('v1.0')
+ expect(show).to have_milestone_title('v2.0')
+ expect(show).to have_asset_link('Download link', '/binaries/gitlab-runner-linux-amd64')
+ end
+ end
+ end
+
+ private
+
+ def setup_component(project, ci_yaml)
+ create(:commit, project: project, commit_message: 'Add .gitlab-ci.yml and component', actions: [
+ {
+ action: 'create',
+ file_path: '.gitlab-ci.yml',
+ content: ci_yaml
+ },
+ {
+ action: 'create',
+ file_path: 'templates/new_component.yml',
+ content: <<~YAML
+ spec:
+ inputs:
+ scanner-output:
+ default: json
+ ---
+ my-scanner:
+ script: my-scan --output $[[ inputs.scanner-output ]]
+ YAML
+ }
+ ])
+ end
+
+ def gitlab_ci_yaml_for_create_release_with_existing_tag
+ <<~YAML
+ default:
+ tags: ["#{executor}"]
+
+ create-release-with-existing-tag:
+ image: registry.gitlab.com/gitlab-org/cli:latest
+ script:
+ - echo "Creating release $CI_COMMIT_TAG"
+ rules:
+ - if: $CI_COMMIT_TAG
+ release:
+ tag_name: $CI_COMMIT_TAG
+ description: "A long description of the release"
+ YAML
+ end
+
+ def gitlab_ci_yaml_for_create_release_with_new_tag_filled_with_information
+ <<~YAML
+ default:
+ tags: ["#{executor}"]
+
+ workflow:
+ rules:
+ - if: $CI_COMMIT_TAG != "v9.0.2" # to prevent creating a new pipeline because of the tag created in the test
+
+ create-release-with-new-tag-filled-with-information:
+ image: registry.gitlab.com/gitlab-org/cli:latest
+ script:
+ - echo "Creating release $CI_COMMIT_TAG"
+ rules:
+ - if: $CI_COMMIT_TAG
+ release:
+ name: "new release v9.0.2"
+ description: "A long description of the release"
+ tag_name: v9.0.2
+ tag_message: a new tag
+ ref: $CI_COMMIT_TAG
+ milestones: ["v1.0", "v2.0"]
+ released_at: "2026-01-01T00:00:00Z"
+ assets:
+ links:
+ - name: "Download link"
+ url: "https://gitlab-runner-downloads.s3.amazonaws.com/v16.9.0-rc2/binaries/gitlab-runner-linux-amd64"
+ filepath: "/binaries/gitlab-runner-linux-amd64"
+ link_type: "other"
+ YAML
+ end
+
+ def visit_catalog_resource_show_page
+ Page::Main::Menu.perform do |main|
+ main.go_to_explore
+ main.go_to_ci_cd_catalog
+ end
+
+ Page::Explore::CiCdCatalog.perform do |catalog|
+ catalog.click_resource_link(project.name)
+ end
+ end
+ end
+ end
+end
diff --git a/spec/controllers/groups_controller_spec.rb b/spec/controllers/groups_controller_spec.rb
index 32c89d88946..87a722228c1 100644
--- a/spec/controllers/groups_controller_spec.rb
+++ b/spec/controllers/groups_controller_spec.rb
@@ -589,15 +589,27 @@ RSpec.describe GroupsController, :with_current_organization, factory_default: :k
sign_in(user)
end
- context 'sorting by votes' do
- it 'sorts most popular merge requests' do
- get :merge_requests, params: { id: group.to_param, sort: 'upvotes_desc' }
- expect(assigns(:merge_requests)).to eq [merge_request_2, merge_request_1]
- end
+ it 'renders merge requests index template' do
+ get :merge_requests, params: { id: group.to_param }
- it 'sorts least popular merge requests' do
- get :merge_requests, params: { id: group.to_param, sort: 'downvotes_desc' }
- expect(assigns(:merge_requests)).to eq [merge_request_2, merge_request_1]
+ expect(response).to render_template('groups/merge_requests')
+ end
+
+ context 'sorting by votes' do
+ context 'when vue_merge_request_list is disabled' do
+ before do
+ stub_feature_flags(vue_merge_request_list: false)
+ end
+
+ it 'sorts most popular merge requests' do
+ get :merge_requests, params: { id: group.to_param, sort: 'upvotes_desc' }
+ expect(assigns(:merge_requests)).to eq [merge_request_2, merge_request_1]
+ end
+
+ it 'sorts least popular merge requests' do
+ get :merge_requests, params: { id: group.to_param, sort: 'downvotes_desc' }
+ expect(assigns(:merge_requests)).to eq [merge_request_2, merge_request_1]
+ end
end
end
diff --git a/spec/controllers/projects/merge_requests_controller_spec.rb b/spec/controllers/projects/merge_requests_controller_spec.rb
index 3a24dda6daa..0d648c42cdd 100644
--- a/spec/controllers/projects/merge_requests_controller_spec.rb
+++ b/spec/controllers/projects/merge_requests_controller_spec.rb
@@ -363,67 +363,79 @@ RSpec.describe Projects::MergeRequestsController, feature_category: :code_review
}
end
- context 'when the test is flaky', quarantine: 'https://gitlab.com/gitlab-org/gitlab/-/issues/450217' do
- it_behaves_like "issuables list meta-data", :merge_request
+ it 'renders merge requests index template' do
+ get_merge_requests
+
+ expect(response).to render_template('projects/merge_requests/index')
end
- it_behaves_like 'set sort order from user preference' do
- let(:sorting_param) { 'updated_asc' }
- end
-
- context 'when page param' do
- let(:last_page) { project.merge_requests.page.total_pages }
- let!(:merge_request) { create(:merge_request_with_diffs, target_project: project, source_project: project) }
-
- it 'redirects to last_page if page number is larger than number of pages' do
- get_merge_requests(last_page + 1)
-
- expect(response).to redirect_to(project_merge_requests_path(project, page: last_page, state: controller.params[:state], scope: controller.params[:scope]))
+ context 'when vue_merge_request_list is disabled' do
+ before do
+ stub_feature_flags(vue_merge_request_list: false)
end
- it 'redirects to specified page' do
- get_merge_requests(last_page)
-
- expect(assigns(:merge_requests).current_page).to eq(last_page)
- expect(response).to have_gitlab_http_status(:ok)
+ context 'when the test is flaky', quarantine: 'https://gitlab.com/gitlab-org/gitlab/-/issues/450217' do
+ it_behaves_like "issuables list meta-data", :merge_request
end
- it 'does not redirect to external sites when provided a host field' do
- external_host = "www.example.com"
- get :index,
- params: {
- namespace_id: project.namespace.to_param,
- project_id: project,
- state: 'opened',
- page: (last_page + 1).to_param,
- host: external_host
- }
-
- expect(response).to redirect_to(project_merge_requests_path(project, page: last_page, state: controller.params[:state], scope: controller.params[:scope]))
+ it_behaves_like 'set sort order from user preference' do
+ let(:sorting_param) { 'updated_asc' }
end
- end
- context 'when filtering by opened state' do
- context 'with opened merge requests' do
- it 'lists those merge requests' do
- expect(merge_request).to be_persisted
+ context 'when page param' do
+ let(:last_page) { project.merge_requests.page.total_pages }
+ let!(:merge_request) { create(:merge_request_with_diffs, target_project: project, source_project: project) }
- get_merge_requests
+ it 'redirects to last_page if page number is larger than number of pages' do
+ get_merge_requests(last_page + 1)
- expect(assigns(:merge_requests)).to include(merge_request)
+ expect(response).to redirect_to(project_merge_requests_path(project, page: last_page, state: controller.params[:state], scope: controller.params[:scope]))
+ end
+
+ it 'redirects to specified page' do
+ get_merge_requests(last_page)
+
+ expect(assigns(:merge_requests).current_page).to eq(last_page)
+ expect(response).to have_gitlab_http_status(:ok)
+ end
+
+ it 'does not redirect to external sites when provided a host field' do
+ external_host = "www.example.com"
+ get :index,
+ params: {
+ namespace_id: project.namespace.to_param,
+ project_id: project,
+ state: 'opened',
+ page: (last_page + 1).to_param,
+ host: external_host
+ }
+
+ expect(response).to redirect_to(project_merge_requests_path(project, page: last_page, state: controller.params[:state], scope: controller.params[:scope]))
end
end
- context 'with reopened merge requests' do
- before do
- merge_request.close!
- merge_request.reopen!
+ context 'when filtering by opened state' do
+ context 'with opened merge requests' do
+ it 'lists those merge requests' do
+ expect(merge_request).to be_persisted
+
+ get_merge_requests
+
+ expect(assigns(:merge_requests)).to include(merge_request)
+ end
end
- it 'lists those merge requests' do
- get_merge_requests
+ context 'with reopened merge requests' do
+ before do
+ merge_request.close!
+ merge_request.reopen!
+ end
- expect(assigns(:merge_requests)).to include(merge_request)
+ it 'lists those merge requests' do
+ get_merge_requests
+
+ expect(assigns(:merge_requests)).to include(merge_request)
+ end
end
end
end
diff --git a/spec/finders/todos_finder_spec.rb b/spec/finders/todos_finder_spec.rb
index b00685973a7..0040e2a00b3 100644
--- a/spec/finders/todos_finder_spec.rb
+++ b/spec/finders/todos_finder_spec.rb
@@ -289,27 +289,51 @@ RSpec.describe TodosFinder, feature_category: :notifications do
describe '#sort' do
context 'by date' do
let!(:todo1) { create(:todo, user: user, project: project) }
- let!(:todo2) { create(:todo, user: user, project: project) }
- # Todos are created sequentially, so id is a reasonable proxy for created_at
- # We use this fact to optimize the performance of the finder
- # In this test we are purposefully setting the created_at to be before the todo1
- # in order to show that we are actually sorting by id
- let!(:todo3) { create(:todo, user: user, project: project, created_at: 3.hours.ago) }
+ let!(:todo2) { create(:todo, user: user, project: project, created_at: 3.hours.ago) }
+ let!(:todo3) { create(:todo, user: user, project: project, snoozed_until: 1.hour.ago) }
- it 'sorts with oldest created first' do
- todos = finder.new(user, { sort: :created_asc }).execute
+ context 'when sorting by ascending date' do
+ subject { finder.new(user, { sort: :created_asc }).execute }
- expect(todos.first).to eq(todo1)
- expect(todos.second).to eq(todo2)
- expect(todos.third).to eq(todo3)
+ it { is_expected.to eq([todo2, todo3, todo1]) }
end
- it 'sorts with newest created first' do
- todos = finder.new(user, { sort: :created_desc }).execute
+ context 'when sorting by descending date' do
+ subject { finder.new(user, { sort: :created_desc }).execute }
- expect(todos.first).to eq(todo3)
- expect(todos.second).to eq(todo2)
- expect(todos.third).to eq(todo1)
+ it { is_expected.to eq([todo1, todo3, todo2]) }
+ end
+
+ context 'when not querying pending to-dos only' do
+ context 'when sorting by ascending date' do
+ subject { finder.new(user, { sort: :created_asc, state: [:done, :pending] }).execute }
+
+ it { is_expected.to eq([todo1, todo2, todo3]) }
+ end
+
+ context 'when sorting by descending date' do
+ subject { finder.new(user, { sort: :created_desc, state: [:done, :pending] }).execute }
+
+ it { is_expected.to eq([todo3, todo2, todo1]) }
+ end
+ end
+
+ context 'when the snoozed_todos_sort_order feature flag is disabled' do
+ before do
+ stub_feature_flags(snoozed_todos_sort_order: false)
+ end
+
+ context 'when sorting by ascending date' do
+ subject { finder.new(user, { sort: :created_asc }).execute }
+
+ it { is_expected.to eq([todo1, todo2, todo3]) }
+ end
+
+ context 'when sorting by descending date' do
+ subject { finder.new(user, { sort: :created_desc }).execute }
+
+ it { is_expected.to eq([todo3, todo2, todo1]) }
+ end
end
end
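Judging from the new expectations, pending to-dos now sort by `snoozed_until` when it is set, falling back to `created_at` otherwise. A sketch reproducing the spec's ordering (timestamps mirror the factories above):

```ruby
require 'time'

# Reproduce the spec's expected ordering: the effective sort key for pending
# to-dos appears to be snoozed_until when present, otherwise created_at.
Todo = Struct.new(:name, :created_at, :snoozed_until)

now = Time.parse('2025-01-01 12:00:00')
todo1 = Todo.new(:todo1, now, nil)
todo2 = Todo.new(:todo2, now - 3 * 3600, nil) # created 3 hours ago
todo3 = Todo.new(:todo3, now, now - 3600)     # snoozed until 1 hour ago

sort_key = ->(t) { t.snoozed_until || t.created_at }
ascending = [todo1, todo2, todo3].sort_by(&sort_key).map(&:name)
puts ascending.inspect # [:todo2, :todo3, :todo1], matching the created_asc expectation
```

With the `snoozed_todos_sort_order` flag disabled, or when done to-dos are included, the spec shows the old `created_at`-only ordering still applies.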
diff --git a/spec/frontend/integrations/index/components/integrations_table_spec.js b/spec/frontend/integrations/index/components/integrations_table_spec.js
index 2f0c44d27ed..707890b5146 100644
--- a/spec/frontend/integrations/index/components/integrations_table_spec.js
+++ b/spec/frontend/integrations/index/components/integrations_table_spec.js
@@ -1,4 +1,4 @@
-import { GlTable, GlIcon, GlLink } from '@gitlab/ui';
+import { GlTable, GlLink, GlButton } from '@gitlab/ui';
import { mount } from '@vue/test-utils';
import IntegrationsTable from '~/integrations/index/components/integrations_table.vue';
import TimeAgoTooltip from '~/vue_shared/components/time_ago_tooltip.vue';
@@ -37,16 +37,18 @@ describe('IntegrationsTable', () => {
});
describe.each`
- scenario | integrations | shouldRenderActiveIcon
+ scenario | integrations | expectActiveIcon
${'when integration is active'} | ${[mockActiveIntegrations[0]]} | ${true}
${'when integration is inactive'} | ${[mockInactiveIntegrations[0]]} | ${false}
- `('$scenario', ({ shouldRenderActiveIcon, integrations }) => {
+ `('$scenario', ({ expectActiveIcon, integrations }) => {
beforeEach(() => {
createComponent({ integrations });
});
- it(`${shouldRenderActiveIcon ? 'renders' : 'does not render'} icon in first column`, () => {
- expect(findTable().findComponent(GlIcon).exists()).toBe(shouldRenderActiveIcon);
+ it(`${expectActiveIcon ? 'renders' : 'does not render'} icon in first column`, () => {
+ expect(findTable().find('[data-testid="integration-active-icon"]').exists()).toBe(
+ expectActiveIcon,
+ );
});
});
@@ -74,6 +76,8 @@ describe('IntegrationsTable', () => {
});
describe.each([true, false])('when integrations inactive property is %p', (inactive) => {
+ const findEditButton = () => findTable().findComponent(GlButton);
+
beforeEach(() => {
createComponent({ integrations: mockInactiveIntegrations, inactive });
});
@@ -81,5 +85,16 @@ describe('IntegrationsTable', () => {
it(`${inactive ? 'does not render' : 'render'} updated_at field`, () => {
expect(findTable().find('[aria-label="Updated At"]').exists()).toBe(!inactive);
});
+
+ if (inactive) {
+ it('renders Edit button as "Add integration"', () => {
+ expect(findEditButton().props('icon')).toBe('plus');
+ expect(findEditButton().text()).toBe('Add');
+ });
+ } else {
+ it('renders Edit button as "Configure"', () => {
+ expect(findEditButton().props('icon')).toBe('settings');
+ });
+ }
});
});
diff --git a/spec/frontend/work_items/components/notes/work_item_comment_form_spec.js b/spec/frontend/work_items/components/notes/work_item_comment_form_spec.js
index 9ef47d2679c..606f32964c7 100644
--- a/spec/frontend/work_items/components/notes/work_item_comment_form_spec.js
+++ b/spec/frontend/work_items/components/notes/work_item_comment_form_spec.js
@@ -72,11 +72,13 @@ describe('Work item comment form component', () => {
isDiscussionResolvable = false,
hasEmailParticipantsWidget = false,
canMarkNoteAsInternal = true,
+ canUpdate = true,
emailParticipantsResponseHandler = emailParticipantsSuccessHandler,
parentId = null,
} = {}) => {
workItemResponse = workItemByIidResponseFactory({
canMarkNoteAsInternal,
+ canUpdate,
});
workItemResponseHandler = jest.fn().mockResolvedValue(workItemResponse);
@@ -170,6 +172,24 @@ describe('Work item comment form component', () => {
expect(findMarkdownEditor().props('value')).toBe('');
});
+ describe('state toggle button', () => {
+ it('renders state toggle button', async () => {
+ createComponent({ isNewDiscussion: true });
+
+ await waitForPromises();
+
+ expect(findWorkItemToggleStateButton().exists()).toBe(true);
+ });
+
+ it('does not render state toggle button when canUpdate is false', async () => {
+ createComponent({ isNewDiscussion: true, canUpdate: false });
+
+ await waitForPromises();
+
+ expect(findWorkItemToggleStateButton().exists()).toBe(false);
+ });
+ });
+
describe('email participants', () => {
it('skips calling the email participants query', async () => {
await createComponent();
@@ -315,6 +335,8 @@ describe('Work item comment form component', () => {
createComponent({
isNewDiscussion: true,
});
+ await waitForPromises();
+
findWorkItemToggleStateButton().vm.$emit(
'error',
'Something went wrong while updating the task. Please try again.',
@@ -331,6 +353,7 @@ describe('Work item comment form component', () => {
createComponent({
isNewDiscussion: true,
});
+ await waitForPromises();
findWorkItemToggleStateButton().vm.$emit('submit-comment');
@@ -440,8 +463,9 @@ describe('Work item comment form component', () => {
});
});
- it('passes the `parentId` prop down to the `WorkItemStateToggle` component', () => {
+ it('passes the `parentId` prop down to the `WorkItemStateToggle` component', async () => {
createComponent({ isNewDiscussion: true, parentId: 'example-id' });
+ await waitForPromises();
expect(findWorkItemToggleStateButton().props('parentId')).toBe('example-id');
});
diff --git a/spec/graphql/resolvers/todos_resolver_spec.rb b/spec/graphql/resolvers/todos_resolver_spec.rb
index 659a7e151cf..d61cecb3b3a 100644
--- a/spec/graphql/resolvers/todos_resolver_spec.rb
+++ b/spec/graphql/resolvers/todos_resolver_spec.rb
@@ -131,7 +131,7 @@ RSpec.describe Resolvers::TodosResolver, feature_category: :notifications do
it 'only returns snoozed todos' do
todos = resolve_todos(args: { is_snoozed: true, sort: 'CREATED_ASC' }, context: { current_user: new_user })
- expect(todos.items).to eq([todo2, todo3])
+ expect(todos.items).to eq([todo3, todo2])
end
end
end
diff --git a/spec/graphql/types/ci/pipeline_creation/request_type_spec.rb b/spec/graphql/types/ci/pipeline_creation/request_type_spec.rb
new file mode 100644
index 00000000000..3dff9be9e9d
--- /dev/null
+++ b/spec/graphql/types/ci/pipeline_creation/request_type_spec.rb
@@ -0,0 +1,9 @@
+# frozen_string_literal: true
+
+require 'spec_helper'
+
+RSpec.describe GitlabSchema.types['CiPipelineCreationRequest'], feature_category: :pipeline_composition do
+ it 'has the expected fields' do
+ expect(described_class).to have_graphql_fields(:error, :pipeline_id, :status)
+ end
+end
diff --git a/spec/graphql/types/ci/pipeline_creation/status_enum_spec.rb b/spec/graphql/types/ci/pipeline_creation/status_enum_spec.rb
new file mode 100644
index 00000000000..50f6abf606b
--- /dev/null
+++ b/spec/graphql/types/ci/pipeline_creation/status_enum_spec.rb
@@ -0,0 +1,9 @@
+# frozen_string_literal: true
+
+require 'spec_helper'
+
+RSpec.describe GitlabSchema.types['CiPipelineCreationStatus'], feature_category: :fleet_visibility do
+ it 'exposes all pipeline creation statuses' do
+ expect(described_class.values.keys).to match_array(%w[FAILED IN_PROGRESS SUCCEEDED])
+ end
+end
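As a quick illustration of what the enum spec above is checking (assuming the underlying Ruby statuses are `failed`, `in_progress`, and `succeeded` — the implementation is not shown here), GraphQL enum names are conventionally the upcased status names:

```ruby
# Hypothetical sketch: the GraphQL enum exposes the upcased versions
# of the underlying Ruby status names.
statuses = %w[failed in_progress succeeded]
graphql_values = statuses.map(&:upcase)
# graphql_values == ["FAILED", "IN_PROGRESS", "SUCCEEDED"]
```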
diff --git a/spec/graphql/types/project_type_spec.rb b/spec/graphql/types/project_type_spec.rb
index 37452c54638..ab5e28cffe4 100644
--- a/spec/graphql/types/project_type_spec.rb
+++ b/spec/graphql/types/project_type_spec.rb
@@ -47,7 +47,7 @@ RSpec.describe GitlabSchema.types['Project'], feature_category: :groups_and_proj
allows_multiple_merge_request_assignees allows_multiple_merge_request_reviewers is_forked
protectable_branches available_deploy_keys explore_catalog_path
container_protection_tag_rules allowed_custom_statuses
- pages_force_https pages_use_unique_domain
+ pages_force_https pages_use_unique_domain ci_pipeline_creation_request
]
expect(described_class).to include_graphql_fields(*expected_fields)
diff --git a/spec/lib/gitlab/database/migration_helpers/work_items/widgets_spec.rb b/spec/lib/gitlab/database/migration_helpers/work_items/widgets_spec.rb
index 51277089785..9957184f443 100644
--- a/spec/lib/gitlab/database/migration_helpers/work_items/widgets_spec.rb
+++ b/spec/lib/gitlab/database/migration_helpers/work_items/widgets_spec.rb
@@ -7,7 +7,8 @@ RSpec.describe Gitlab::Database::MigrationHelpers::WorkItems::Widgets, feature_c
ActiveRecord::Migration.new.extend(described_class)
end
- let(:type_enum_value) { 8 }
+ let(:type_enum_value) { nil }
+ let(:type_enum_values) { [8, 9] }
let(:single_widget) do
[
{
@@ -38,57 +39,87 @@ RSpec.describe Gitlab::Database::MigrationHelpers::WorkItems::Widgets, feature_c
let(:work_item_type_migration_model) { double('migration_work_item_type') } # rubocop:disable RSpec/VerifiedDoubles -- stub only
let(:widget_definition_migration_model) { double('migration_widget_definition') } # rubocop:disable RSpec/VerifiedDoubles -- mock only
- let(:work_item_type) { double('work_item_type', id: 1) } # rubocop:disable RSpec/VerifiedDoubles -- stub only
+ let(:work_item_type) { Struct.new(:id, :base_type).new(1, 8) }
+ let(:another_work_item_type) { Struct.new(:id, :base_type).new(2, 9) }
+ let(:work_item_types_relation) { [work_item_type, another_work_item_type] }
before do
allow(migration).to receive_messages(
migration_work_item_type: work_item_type_migration_model,
migration_widget_definition: widget_definition_migration_model
)
+ allow(work_item_type_migration_model).to receive(:where)
+ .with(base_type: type_enum_values)
+ .and_return(work_item_types_relation)
+ allow(work_item_type_migration_model).to receive(:where)
+ .with(base_type: type_enum_values + [type_enum_value].compact)
+ .and_return(work_item_types_relation)
+ allow(work_item_type_migration_model).to receive(:where)
+ .with(base_type: [])
+ .and_return([])
end
describe '#add_widget_definitions' do
shared_examples 'properly executed up migration' do
it 'upserts widget definitions' do
- expected_widgets = widgets.map { |w| { work_item_type_id: work_item_type.id, widget_options: nil }.merge(w) }
+ expected_widgets = work_item_types_relation.flat_map do |type|
+ widgets.map { |w| { work_item_type_id: type.id, widget_options: nil }.merge(w) }
+ end
+
+ expect(migration).not_to receive(:say)
expect(widget_definition_migration_model).to receive(:upsert_all)
.with(expected_widgets, on_duplicate: :skip)
- migration.add_widget_definitions(type_enum_value: type_enum_value, widgets: widgets)
+ migration.add_widget_definitions(
+ type_enum_value: type_enum_value,
+ type_enum_values: type_enum_values,
+ widgets: widgets
+ )
end
end
- context 'when work item type exists' do
- before do
- allow(work_item_type_migration_model).to receive(:find_by)
- .with(base_type: type_enum_value)
- .and_return(work_item_type)
- end
+ it_behaves_like 'properly executed up migration'
+
+ context 'when there is more than one widget' do
+ let(:widgets) { multiple_widgets }
it_behaves_like 'properly executed up migration'
+ end
- context 'when there is more than one widget' do
- let(:widgets) { multiple_widgets }
+ context 'when work item types do not exist' do
+ let(:work_item_types_relation) { [] }
- it_behaves_like 'properly executed up migration'
+ it 'logs a message for all missing types and does not upsert' do
+ type_enum_values.each do |type_enum_value|
+ expect(migration).to receive(:say).with(/Work item type with enum value #{type_enum_value} does not exist/)
+ end
+
+ expect(widget_definition_migration_model).not_to receive(:upsert_all)
+
+ migration.add_widget_definitions(type_enum_values: type_enum_values, widgets: widgets)
end
end
- context 'when work item type does not exist' do
- before do
- allow(work_item_type_migration_model).to receive(:find_by)
- .with(base_type: type_enum_value)
- .and_return(nil)
- end
+ context 'when two types are passed but only one is found' do
+ let(:work_item_types_relation) { [work_item_type] }
- it 'logs a message and does not upsert' do
- expect(migration).to receive(:say).with(/Work item type with enum value 8 does not exist/)
- expect(widget_definition_migration_model).not_to receive(:upsert_all)
+ it 'logs a message for the missing type and upserts for the found type' do
+ expect(migration).to receive(:say).with(/Work item type with enum value 9 does not exist/)
+ expect(widget_definition_migration_model).to receive(:upsert_all)
+ .with([{ work_item_type_id: work_item_type.id, widget_options: nil }
+ .merge(single_widget.first)], on_duplicate: :skip)
- migration.add_widget_definitions(type_enum_value: type_enum_value, widgets: widgets)
+ migration.add_widget_definitions(type_enum_values: type_enum_values, widgets: widgets)
end
end
+
+ context 'when only type_enum_value is provided' do
+ let(:type_enum_value) { 8 }
+ let(:type_enum_values) { [] }
+
+ it_behaves_like 'properly executed up migration'
+ end
end
describe '#remove_widget_definitions' do
@@ -96,44 +127,63 @@ RSpec.describe Gitlab::Database::MigrationHelpers::WorkItems::Widgets, feature_c
it 'deletes widget definitions' do
widget_definition_relation = double('widget_definition_relation') # rubocop:disable RSpec/VerifiedDoubles -- mock only
expect(widget_definition_migration_model).to receive(:where)
- .with(work_item_type_id: work_item_type.id, widget_type: widgets.pluck(:widget_type))
+ .with(work_item_type_id: work_item_types_relation.pluck(:id), widget_type: widgets.pluck(:widget_type))
.and_return(widget_definition_relation)
expect(widget_definition_relation).to receive(:delete_all)
- migration.remove_widget_definitions(type_enum_value: type_enum_value, widgets: widgets)
+ migration.remove_widget_definitions(
+ type_enum_value: type_enum_value,
+ type_enum_values: type_enum_values,
+ widgets: widgets
+ )
end
end
- context 'when work item type exists' do
- before do
- allow(work_item_type_migration_model).to receive(:find_by)
- .with(base_type: type_enum_value)
- .and_return(work_item_type)
- end
+ it_behaves_like 'properly executed down migration'
+
+ context 'when there is more than one widget' do
+ let(:widgets) { multiple_widgets }
it_behaves_like 'properly executed down migration'
-
- context 'when there is more than one widget' do
- let(:widgets) { multiple_widgets }
-
- it_behaves_like 'properly executed down migration'
- end
end
context 'when work item type does not exist' do
- before do
- allow(work_item_type_migration_model).to receive(:find_by)
- .with(base_type: type_enum_value)
- .and_return(nil)
- end
+ let(:work_item_types_relation) { [] }
- it 'logs a message and does not delete' do
- expect(migration).to receive(:say).with(/Work item type with enum value 8 does not exist/)
- expect(widget_definition_migration_model).not_to receive(:upsert_all)
+ it 'logs a message for all missing types and does not delete' do
+ type_enum_values.each do |type_enum_value|
+ expect(migration).to receive(:say).with(/Work item type with enum value #{type_enum_value} does not exist/)
+ end
- migration.remove_widget_definitions(type_enum_value: type_enum_value, widgets: widgets)
+ expect(widget_definition_migration_model).not_to receive(:where)
+
+ migration.remove_widget_definitions(type_enum_values: type_enum_values, widgets: widgets)
end
end
+
+ context 'when two types are passed but only one is found' do
+ let(:work_item_types_relation) { [work_item_type] }
+
+ it 'logs a message for the missing type and deletes widgets for the found type' do
+ expect(migration).to receive(:say).with(/Work item type with enum value 9 does not exist/)
+
+ widget_definition_relation = double('widget_definition_relation') # rubocop:disable RSpec/VerifiedDoubles -- mock only
+ expect(widget_definition_migration_model).to receive(:where)
+ .with(work_item_type_id: [work_item_type.id], widget_type: widgets.pluck(:widget_type))
+ .and_return(widget_definition_relation)
+
+ expect(widget_definition_relation).to receive(:delete_all)
+
+ migration.remove_widget_definitions(type_enum_values: type_enum_values, widgets: widgets)
+ end
+ end
+
+ context 'when only type_enum_value is provided' do
+ let(:type_enum_value) { 8 }
+ let(:type_enum_values) { [] }
+
+ it_behaves_like 'properly executed down migration'
+ end
end
end
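The stubs in the spec above assume the migration helper merges the legacy scalar `type_enum_value` keyword with the newer `type_enum_values` array (note the stubbed `where(base_type: type_enum_values + [type_enum_value].compact)` call). A minimal plain-Ruby sketch of that merging — the helper name here is hypothetical:

```ruby
# Hypothetical sketch of the backward-compatible argument handling the
# stubbed `where(base_type: ...)` calls rely on: the helper accepts the
# legacy scalar keyword, the new array keyword, or both, and merges them
# into one list of enum values. `compact` drops the nil legacy default.
def effective_enum_values(type_enum_value: nil, type_enum_values: [])
  type_enum_values + [type_enum_value].compact
end

effective_enum_values(type_enum_values: [8, 9]) # => [8, 9]
effective_enum_values(type_enum_value: 8)       # => [8]
effective_enum_values                           # => []
```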
diff --git a/spec/models/todo_spec.rb b/spec/models/todo_spec.rb
index ec7b4daf325..9de1cdb51bd 100644
--- a/spec/models/todo_spec.rb
+++ b/spec/models/todo_spec.rb
@@ -665,6 +665,24 @@ RSpec.describe Todo, feature_category: :notifications do
end
end
+ describe '.sort_by_snoozed_and_creation_dates' do
+ let_it_be(:todo1) { create(:todo) }
+ let_it_be(:todo2) { create(:todo, created_at: 3.hours.ago) }
+ let_it_be(:todo3) { create(:todo, snoozed_until: 1.hour.ago) }
+
+ context 'when sorting by ascending date' do
+ subject { described_class.sort_by_snoozed_and_creation_dates(direction: :asc) }
+
+ it { is_expected.to eq([todo2, todo3, todo1]) }
+ end
+
+ context 'when sorting by descending date' do
+ subject { described_class.sort_by_snoozed_and_creation_dates }
+
+ it { is_expected.to eq([todo1, todo3, todo2]) }
+ end
+ end
+
describe '.distinct_user_ids' do
subject { described_class.distinct_user_ids }
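The new `.sort_by_snoozed_and_creation_dates` examples above imply ordering by `snoozed_until` when present, falling back to `created_at` — something like `COALESCE(snoozed_until, created_at)` in SQL, though the scope itself is not shown in this diff. A plain-Ruby sketch of that ordering, mirroring the spec's fixtures:

```ruby
# Hypothetical stand-in for the Todo model, just enough to show the ordering.
TodoStub = Struct.new(:name, :created_at, :snoozed_until)

now = Time.now
todo1 = TodoStub.new('todo1', now, nil)            # created now, not snoozed
todo2 = TodoStub.new('todo2', now - 3 * 3600, nil) # created 3 hours ago
todo3 = TodoStub.new('todo3', now, now - 3600)     # snoozed until 1 hour ago

# The sort key falls back to created_at when snoozed_until is nil,
# matching a COALESCE(snoozed_until, created_at) ordering.
sort_key = ->(todo) { todo.snoozed_until || todo.created_at }

asc = [todo1, todo2, todo3].sort_by(&sort_key).map(&:name)
# asc == ["todo2", "todo3", "todo1"], matching the :asc expectation
desc = asc.reverse
# desc == ["todo1", "todo3", "todo2"], matching the descending expectation
```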
diff --git a/spec/requests/api/graphql/project/ci/pipeline_creation/request_spec.rb b/spec/requests/api/graphql/project/ci/pipeline_creation/request_spec.rb
new file mode 100644
index 00000000000..a979746c7b1
--- /dev/null
+++ b/spec/requests/api/graphql/project/ci/pipeline_creation/request_spec.rb
@@ -0,0 +1,56 @@
+# frozen_string_literal: true
+
+require 'spec_helper'
+
+RSpec.describe 'Query.project.ciPipelineCreationRequest', :clean_gitlab_redis_shared_state, feature_category: :pipeline_composition do
+ include GraphqlHelpers
+
+ let_it_be(:project) { create(:project) }
+ let_it_be(:user) { create(:user) }
+
+ let(:creation_request) { ::Ci::PipelineCreation::Requests.start_for_project(project) }
+
+ let(:query) do
+ <<~GQL
+ query {
+ project(fullPath: "#{project.full_path}") {
+ ciPipelineCreationRequest(requestId: "#{creation_request['id']}") {
+ error
+ pipelineId
+ status
+ }
+ }
+ }
+ GQL
+ end
+
+ context 'when the current user can create pipelines on the project' do
+ before_all do
+ project.add_developer(user)
+ end
+
+ it 'returns information about the pipeline creation request' do
+ post_graphql(query, current_user: user)
+
+ expect(graphql_data['project']).to eq({
+ 'ciPipelineCreationRequest' => {
+ 'error' => nil,
+ 'pipelineId' => nil,
+ 'status' => 'IN_PROGRESS'
+ }
+ })
+ end
+ end
+
+ context 'when the current user cannot create pipelines on the project' do
+ before_all do
+ project.add_guest(user)
+ end
+
+ it 'returns nil' do
+ post_graphql(query, current_user: user)
+
+ expect(graphql_data['project']).to eq({ 'ciPipelineCreationRequest' => nil })
+ end
+ end
+end
diff --git a/spec/requests/api/personal_access_tokens_spec.rb b/spec/requests/api/personal_access_tokens_spec.rb
index c5b2bb1f3c6..674415f0b20 100644
--- a/spec/requests/api/personal_access_tokens_spec.rb
+++ b/spec/requests/api/personal_access_tokens_spec.rb
@@ -57,6 +57,26 @@ RSpec.describe API::PersonalAccessTokens, :aggregate_failures, feature_category:
expect(json_response.first['user_id']).to eq(token.user.id)
expect(json_response.last['id']).to eq(token_impersonated.id)
end
+
+ context 'validations for user_id parameter' do
+ let_it_be(:user) { create(:user) }
+ let_it_be(:admin_token) { create(:personal_access_token, :admin_mode, user: current_user) }
+ let_it_be(:user_token) { create(:personal_access_token, user: user) }
+
+ it 'returns 404 if user_id is provided but does not exist' do
+ get api(path, current_user, admin_mode: true), params: { user_id: non_existing_record_id }
+
+ expect(response).to have_gitlab_http_status(:not_found)
+ expect(json_response['message']).to eq("404 Not Found")
+ end
+
+ it 'returns 404 if user_id is explicitly blank' do
+ get api(path, current_user, admin_mode: true), params: { user_id: '' }
+
+ expect(response).to have_gitlab_http_status(:not_found)
+ expect(json_response['message']).to eq("404 Not Found")
+ end
+ end
end
context 'filter with revoked parameter' do
diff --git a/spec/support/shared_examples/migrations/add_work_item_widget_shared_examples.rb b/spec/support/shared_examples/migrations/add_work_item_widget_shared_examples.rb
index 6ee3254d28c..97574187ed4 100644
--- a/spec/support/shared_examples/migrations/add_work_item_widget_shared_examples.rb
+++ b/spec/support/shared_examples/migrations/add_work_item_widget_shared_examples.rb
@@ -51,16 +51,23 @@ end
# Shared examples for testing migration that adds widgets to a work item type
#
# It expects that the following constants are available in the migration
-# - `WORK_ITEM_TYPE_ENUM_VALUE`: Int, enum value for the work item type
+# - `WORK_ITEM_TYPE_ENUM_VALUES`: Array of Integers, enum values for the work item types
# - `WIDGET`: Hash, widget definitions (name:, widget_type:)
# - (Old) `WIDGET_ENUM_VALUE`: Int, enum value for the widget type
# - (Old) `WIDGET_NAME`: String, name of the widget
#
-# You can override `target_type_enum_value` to explicitly define the work item type enum value
+# You can override `target_type_enum_values` to explicitly define the work item type enum values
RSpec.shared_examples 'migration that adds widgets to a work item type' do
let(:work_item_types) { table(:work_item_types) }
let(:work_item_widget_definitions) { table(:work_item_widget_definitions) }
- let(:target_type_enum_value) { described_class::WORK_ITEM_TYPE_ENUM_VALUE }
+ let(:target_type_enum_values) do
+ if defined?(described_class::WORK_ITEM_TYPE_ENUM_VALUES)
+ Array(described_class::WORK_ITEM_TYPE_ENUM_VALUES)
+ else
+ Array(described_class::WORK_ITEM_TYPE_ENUM_VALUE)
+ end
+ end
+
let(:widgets) do
if defined?(described_class::WIDGETS)
described_class::WIDGETS
@@ -78,29 +85,34 @@ RSpec.shared_examples 'migration that adds widgets to a work item type' do
it "adds widgets to work item type", :aggregate_failures do
expect do
migrate!
- end.to change { work_item_widget_definitions.count }.by(widgets.size)
+ end.to change { work_item_widget_definitions.count }.by(widgets.size * target_type_enum_values.size)
+ work_item_types_with_widgets = target_type_enum_values.map do |enum_value|
+ work_item_types.find_by(base_type: enum_value)
+ end
- work_item_type = work_item_types.find_by(base_type: target_type_enum_value)
- created_widgets = work_item_widget_definitions.last(widgets.size)
+ created_widgets = work_item_widget_definitions.where(
+ work_item_type_id: work_item_types_with_widgets.map(&:id)
+ )
+ work_item_types_with_widgets.each do |work_item_type|
+ widgets.each do |widget|
+ expected_attributes = {
+ work_item_type_id: work_item_type.id,
+ widget_type: widget[:widget_type],
+ name: widget[:name],
+ # Hashes parsed from the DB's JSON have string keys
+ widget_options: widget[:widget_options] ? widget[:widget_options].stringify_keys : nil
+ }
- widgets.each do |widget|
- expected_attributes = {
- work_item_type_id: work_item_type.id,
- widget_type: widget[:widget_type],
- name: widget[:name],
- # Hashes from json from DB have string keys
- widget_options: widget[:widget_options] ? widget[:widget_options].stringify_keys : nil
- }
-
- expect(created_widgets).to include(
- have_attributes(expected_attributes)
- )
+ expect(created_widgets).to include(
+ have_attributes(expected_attributes)
+ )
+ end
end
end
context 'when type does not exist' do
it 'skips creating the new widget definitions' do
- work_item_types.where(base_type: target_type_enum_value).delete_all
+ work_item_types.where(base_type: target_type_enum_values).delete_all
expect do
migrate!
@@ -113,7 +125,9 @@ RSpec.shared_examples 'migration that adds widgets to a work item type' do
it "removes widgets from work item type" do
migrate!
- expect { schema_migrate_down! }.to change { work_item_widget_definitions.count }.by(-widgets.size)
+ expect { schema_migrate_down! }.to change { work_item_widget_definitions.count }.by(
+ -(widgets.size * target_type_enum_values.size)
+ )
end
end
end
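The constant fallback in the shared example above — prefer the new plural constant, fall back to the legacy singular one, and normalize either shape with `Array()` — can be sketched in plain Ruby (the helper and class names here are hypothetical):

```ruby
# Hypothetical sketch of the shared example's constant fallback: prefer the
# new plural constant, fall back to the legacy singular one, and normalize
# either shape to an array with Kernel#Array.
def target_type_enum_values(migration_class)
  if migration_class.const_defined?(:WORK_ITEM_TYPE_ENUM_VALUES)
    Array(migration_class::WORK_ITEM_TYPE_ENUM_VALUES)
  else
    Array(migration_class::WORK_ITEM_TYPE_ENUM_VALUE)
  end
end

# Illustrative stand-ins for old- and new-style migrations.
legacy_migration = Class.new { const_set(:WORK_ITEM_TYPE_ENUM_VALUE, 8) }
newer_migration  = Class.new { const_set(:WORK_ITEM_TYPE_ENUM_VALUES, [8, 9]) }

target_type_enum_values(legacy_migration) # => [8]
target_type_enum_values(newer_migration)  # => [8, 9]
```

Either constant shape keeps older migrations working while new ones can target several work item types at once.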